Sample records for small response probability

  1. Examining Spillovers between Long and Short Repeated Prisoner's Dilemma Games Played in the Laboratory.

    PubMed

    Arechar, Antonio A; Kouchaki, Maryam; Rand, David G

    2018-03-01

    We had participants play two sets of repeated Prisoner's Dilemma (RPD) games, one with a large continuation probability and the other with a small continuation probability, as well as Dictator Games (DGs) before and after the RPDs. We find that, regardless of which RPD set is played first, participants typically cooperate when the continuation probability is large and defect when the continuation probability is small. However, there is an asymmetry in behavior when transitioning from one continuation probability to the other. When switching from large to small, transient higher levels of cooperation are observed in the early games of the small continuation set. Conversely, when switching from small to large, cooperation is immediately high in the first game of the large continuation set. We also observe that response times increase when transitioning between sets of RPDs, except for altruistic participants transitioning into the set of RPDs with long continuation probabilities. These asymmetries suggest a bias in favor of cooperation. Finally, we examine the link between altruism and RPD play. We find that small continuation probability RPD play is correlated with giving in DGs played before and after the RPDs, whereas high continuation probability RPD play is not.
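The continuation probability directly sets the expected length of a repeated game: with continuation probability δ, the number of rounds is geometric with mean 1/(1 − δ). A minimal sketch (the δ values below are illustrative, not the study's actual parameters):

```python
def expected_rounds(delta):
    """Expected number of rounds of a repeated game that continues
    with probability delta after each round (geometric distribution)."""
    return 1.0 / (1.0 - delta)

# Illustrative continuation probabilities (hypothetical, not from the study):
short_game = expected_rounds(0.1)   # small continuation probability: ~1.1 rounds on average
long_game = expected_rounds(0.9)    # large continuation probability: 10 rounds on average
```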

  2. Statistical context shapes stimulus-specific adaptation in human auditory cortex

    PubMed Central

    Herrmann, Björn; Henry, Molly J.; Fromboluti, Elisa Kim; McAuley, J. Devin; Obleser, Jonas

    2015-01-01

    Stimulus-specific adaptation is the phenomenon whereby neural response magnitude decreases with repeated stimulation. Inconsistencies between recent nonhuman animal recordings and computational modeling suggest dynamic influences on stimulus-specific adaptation. The present human electroencephalography (EEG) study investigates the potential role of statistical context in dynamically modulating stimulus-specific adaptation by examining the auditory cortex-generated N1 and P2 components. As in previous studies of stimulus-specific adaptation, listeners were presented with oddball sequences in which the presentation of a repeated tone was infrequently interrupted by rare spectral changes taking on three different magnitudes. Critically, the statistical context varied with respect to the probability of small versus large spectral changes within oddball sequences (half of the time a small change was most probable; in the other half a large change was most probable). We observed larger N1 and P2 amplitudes (i.e., release from adaptation) for all spectral changes in the small-change compared with the large-change statistical context. The increase in response magnitude also held for responses to tones presented with high probability, indicating that statistical adaptation can overrule stimulus probability per se in its influence on neural responses. Computational modeling showed that the degree of coadaptation in auditory cortex changed depending on the statistical context, which in turn affected stimulus-specific adaptation. Thus the present data demonstrate that stimulus-specific adaptation in human auditory cortex critically depends on statistical context. Finally, the present results challenge the implicit assumption of stationarity of neural response magnitudes that governs the practice of isolating established deviant-detection responses such as the mismatch negativity. PMID:25652920

  3. Examining Spillovers between Long and Short Repeated Prisoner’s Dilemma Games Played in the Laboratory

    PubMed Central

    Arechar, Antonio A.; Kouchaki, Maryam; Rand, David G.

    2018-01-01

    We had participants play two sets of repeated Prisoner’s Dilemma (RPD) games, one with a large continuation probability and the other with a small continuation probability, as well as Dictator Games (DGs) before and after the RPDs. We find that, regardless of which RPD set is played first, participants typically cooperate when the continuation probability is large and defect when the continuation probability is small. However, there is an asymmetry in behavior when transitioning from one continuation probability to the other. When switching from large to small, transient higher levels of cooperation are observed in the early games of the small continuation set. Conversely, when switching from small to large, cooperation is immediately high in the first game of the large continuation set. We also observe that response times increase when transitioning between sets of RPDs, except for altruistic participants transitioning into the set of RPDs with long continuation probabilities. These asymmetries suggest a bias in favor of cooperation. Finally, we examine the link between altruism and RPD play. We find that small continuation probability RPD play is correlated with giving in DGs played before and after the RPDs, whereas high continuation probability RPD play is not. PMID:29809199

  4. Statistical context shapes stimulus-specific adaptation in human auditory cortex.

    PubMed

    Herrmann, Björn; Henry, Molly J; Fromboluti, Elisa Kim; McAuley, J Devin; Obleser, Jonas

    2015-04-01

    Stimulus-specific adaptation is the phenomenon whereby neural response magnitude decreases with repeated stimulation. Inconsistencies between recent nonhuman animal recordings and computational modeling suggest dynamic influences on stimulus-specific adaptation. The present human electroencephalography (EEG) study investigates the potential role of statistical context in dynamically modulating stimulus-specific adaptation by examining the auditory cortex-generated N1 and P2 components. As in previous studies of stimulus-specific adaptation, listeners were presented with oddball sequences in which the presentation of a repeated tone was infrequently interrupted by rare spectral changes taking on three different magnitudes. Critically, the statistical context varied with respect to the probability of small versus large spectral changes within oddball sequences (half of the time a small change was most probable; in the other half a large change was most probable). We observed larger N1 and P2 amplitudes (i.e., release from adaptation) for all spectral changes in the small-change compared with the large-change statistical context. The increase in response magnitude also held for responses to tones presented with high probability, indicating that statistical adaptation can overrule stimulus probability per se in its influence on neural responses. Computational modeling showed that the degree of coadaptation in auditory cortex changed depending on the statistical context, which in turn affected stimulus-specific adaptation. Thus the present data demonstrate that stimulus-specific adaptation in human auditory cortex critically depends on statistical context. Finally, the present results challenge the implicit assumption of stationarity of neural response magnitudes that governs the practice of isolating established deviant-detection responses such as the mismatch negativity. Copyright © 2015 the American Physiological Society.

  5. Introducing a Method for Calculating the Allocation of Attention in a Cognitive “Two-Armed Bandit” Procedure: Probability Matching Gives Way to Maximizing

    PubMed Central

    Heyman, Gene M.; Grisanzio, Katherine A.; Liang, Victor

    2016-01-01

    We tested whether principles that describe the allocation of overt behavior, as in choice experiments, also describe the allocation of cognition, as in attention experiments. Our procedure is a cognitive version of the “two-armed bandit choice procedure.” The two-armed bandit procedure has been of interest to psychologists and economists because it tends to support patterns of responding that are suboptimal. Each of two alternatives provides rewards according to fixed probabilities. The optimal solution is to choose the alternative with the higher probability of reward on each trial. However, subjects often allocate responses so that the probability of a response approximates its probability of reward. Although it is this result that has attracted the most interest, probability matching is not always observed. As a function of monetary incentives, practice, and individual differences, subjects tend to deviate from probability matching toward exclusive preference, as predicted by maximizing. In our version of the two-armed bandit procedure, the monitor briefly displayed two small, adjacent stimuli that predicted correct responses according to fixed probabilities, as in a two-armed bandit procedure. We show that in this setting, a simple linear equation describes the relationship between attention and correct responses, and that the equation’s solution is the allocation of attention between the two stimuli. The calculations showed that attention allocation varied as a function of the degree to which the stimuli predicted correct responses. Linear regression revealed a strong correlation (r = 0.99) between the predictiveness of a stimulus and the probability of attending to it. Nevertheless, there were deviations from probability matching; although small, they were systematic and statistically significant. As in choice studies, attention allocation deviated toward maximizing as a function of practice, feedback, and incentives. Our approach also predicts the frequency of correct guesses and the relationship between attention allocation and response latencies. The results were consistent with these two predictions, the assumptions of the equations used to calculate attention allocation, and recent studies which show that predictiveness and reward are important determinants of attention. PMID:27014109
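The gap between probability matching and maximizing can be made concrete by comparing expected accuracies: under matching, responses are allocated in proportion to reward probability; under maximizing, the richer alternative is always chosen. A sketch (the 0.7/0.3 reward probabilities are illustrative, not the study's values):

```python
def matching_accuracy(p_rich):
    """Expected accuracy under probability matching: the rich side is
    chosen with probability p_rich and pays off with probability p_rich."""
    return p_rich * p_rich + (1.0 - p_rich) * (1.0 - p_rich)

def maximizing_accuracy(p_rich):
    """Expected accuracy when always choosing the richer alternative."""
    return max(p_rich, 1.0 - p_rich)

# With reward probabilities 0.7 / 0.3: matching yields 0.58, maximizing 0.70,
# which is why matching is described as suboptimal.
```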

  6. Value of in vivo electrophysiological measurements to evaluate canine small bowel autotransplants.

    PubMed Central

    Meijssen, M A; Heineman, E; de Bruin, R W; Veeze, H J; Bijman, J; de Jonge, H R; ten Kate, F J; Marquet, R L; Molenaar, J C

    1991-01-01

    This study aimed to develop a non-invasive method for in vivo measurement of the transepithelial potential difference in the canine small bowel and to evaluate this parameter in small bowel autotransplants. In group 0 (control group, n = 4), two intestinal loops were created without disturbing their vascular, neural, and lymphatic supplies. In group I (successful autotransplants, n = 11), two heterotopic small bowel loops were constructed. Long term functional sequelae of vascular, neural, and lymphatic division were studied. Group II (n = 6) consisted of dogs with unsuccessful autotransplants suffering thrombosis of the vascular anastomosis, which resulted in ischaemic small bowel autografts. In group I, values of spontaneous transepithelial potential difference, an index of base line active electrolyte transport, were significantly lower compared with group 0 (p less than 0.05), probably as a result of denervation of the autotransplants. Both theophylline and glucose stimulated potential difference responses, measuring cyclic adenosine monophosphate mediated chloride secretion and sodium coupled glucose absorption respectively, showed negative luminal values in group I at all time points after transplantation. These transepithelial potential difference responses diminished progressively with time. From day 21 onwards both theophylline and glucose stimulated potential difference responses were significantly less than the corresponding responses at day seven (p less than 0.05). Morphometric analysis showed that the reduction of transepithelial potential difference responses preceded degenerative mucosal changes in the heterotopic small bowel autografts. In group II, potential difference responses to theophylline and glucose showed positive luminal values (p<0.01 v group I), probably as a result of passive potassium effusion from necrotic enterocytes. PMID:1752464

  7. On the extinction probability in models of within-host infection: the role of latency and immunity.

    PubMed

    Yan, Ada W C; Cao, Pengxing; McCaw, James M

    2016-10-01

    Not every exposure to virus establishes infection in the host; instead, the small amount of initial virus could become extinct due to stochastic events. Different diseases and routes of transmission have a different average number of exposures required to establish an infection. Furthermore, the host immune response and antiviral treatment affect not only the time course of the viral load provided infection occurs, but can prevent infection altogether by increasing the extinction probability. We show that the extinction probability when there is a time-dependent immune response depends on the chosen form of the model-specifically, on the presence or absence of a delay between infection of a cell and production of virus, and the distribution of latent and infectious periods of an infected cell. We hypothesise that experimentally measuring the extinction probability when the virus is introduced at different stages of the immune response, alongside the viral load which is usually measured, will improve parameter estimates and determine the most suitable mathematical form of the model.
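For the simplest within-host case, with no latency or immunity, the extinction probability has a closed form: in a linear birth-death model each initial virion's lineage dies out independently, so extinction starting from n virions is q^n with q = min(1, death rate / birth rate). A sketch of this baseline (parameter values are illustrative); the paper's point is that latency and a time-dependent immune response change this calculation:

```python
def lineage_extinction_prob(birth, death):
    """Extinction probability of a single virion's lineage in a linear
    birth-death model: death/birth if birth > death, otherwise certain."""
    return min(1.0, death / birth)

def extinction_prob(n_virions, birth, death):
    """Lineages die out independently, so all n initial virions must."""
    return lineage_extinction_prob(birth, death) ** n_virions
```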

  8. Noise deconvolution based on the L1-metric and decomposition of discrete distributions of postsynaptic responses.

    PubMed

    Astrelin, A V; Sokolov, M V; Behnisch, T; Reymann, K G; Voronin, L L

    1997-04-25

    A statistical approach to analysis of amplitude fluctuations of postsynaptic responses is described. This includes (1) using an L1-metric in the space of distribution functions for minimisation with application of linear programming methods to decompose amplitude distributions into a convolution of Gaussian and discrete distributions; (2) deconvolution of the resulting discrete distribution with determination of the release probabilities and the quantal amplitude for cases with a small number (< 5) of discrete components. The methods were tested against simulated data over a range of sample sizes and signal-to-noise ratios which mimicked those observed in physiological experiments. In computer simulation experiments, comparisons were made with other methods of 'unconstrained' (generalized) and constrained reconstruction of discrete components from convolutions. The simulation results provided additional criteria for improving the solutions to overcome 'over-fitting phenomena' and to constrain the number of components with small probabilities. Application of the programme to recordings from hippocampal neurones demonstrated its usefulness for the analysis of amplitude distributions of postsynaptic responses.
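The decomposition described, a discrete distribution convolved with Gaussian noise, corresponds to the standard quantal model: with n release sites each releasing with probability p, an amplitude of k quanta occurs with binomial weight and is smeared by recording noise. A sketch of evaluating such a mixture density (parameter values are hypothetical, and this is the forward model, not the paper's L1-based fitting procedure):

```python
import math

def binom_pmf(k, n, p):
    """Binomial probability of k releases from n sites with release prob p."""
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def quantal_density(x, n_sites, p_release, quantal_amp, noise_sd):
    """Density of a response amplitude x: discrete quantal components
    (means k * quantal_amp) convolved with zero-mean Gaussian noise."""
    total = 0.0
    for k in range(n_sites + 1):
        mean = k * quantal_amp
        gauss = math.exp(-(x - mean) ** 2 / (2 * noise_sd**2)) / (noise_sd * math.sqrt(2 * math.pi))
        total += binom_pmf(k, n_sites, p_release) * gauss
    return total
```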

  9. The effects of flow on schooling Devario aequipinnatus: school structure, startle response and information transmission

    PubMed Central

    Chicoli, A.; Butail, S.; Lun, Y.; Bak-Coleman, J.; Coombs, S.; Paley, D.A.

    2014-01-01

    To assess how flow affects school structure and threat detection, startle response rates to visual looming stimuli were compared for solitary giant danio Devario aequipinnatus and small groups in flow and no-flow conditions. The instantaneous position and heading of each D. aequipinnatus were extracted from high-speed videos. Behavioural results indicate that (1) school structure is altered in flow such that D. aequipinnatus orient upstream while spanning out in a crosswise direction, (2) the probability of at least one D. aequipinnatus detecting the visual looming stimulus is higher in flow than no flow for both solitary D. aequipinnatus and groups of eight D. aequipinnatus; however, (3) the probability of three or more individuals responding is higher in no flow than flow. Taken together, these results indicate a higher probability of stimulus detection in flow but a higher probability of internal transmission of information in no flow. Finally, results were well predicted by a computational model of collective fright response that included the probability of direct detection (based on signal detection theory) and indirect detection (i.e. via interactions between group members) of threatening stimuli. This model provides a new theoretical framework for analysing the collective transfer of information among groups of fishes and other organisms. PMID:24773538
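The group-detection comparison has a simple baseline under independent detectors: the probability that at least one of N fish detects the stimulus is 1 − (1 − p)^N, which grows with group size. A sketch (the individual detection probability is illustrative; the paper's model also includes indirect detection via neighbours, which this omits):

```python
def group_detection_prob(p_individual, n_fish):
    """Probability that at least one of n independent fish detects
    the looming stimulus."""
    return 1.0 - (1.0 - p_individual) ** n_fish
```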

  10. Species' traits help predict small mammal responses to habitat homogenization by an invasive grass.

    PubMed

    Ceradini, Joseph P; Chalfoun, Anna D

    2017-07-01

    Invasive plants can negatively affect native species; however, the strength, direction, and shape of responses may vary depending on the type of habitat alteration and the natural history of native species. To prioritize conservation of vulnerable species, it is therefore critical to effectively predict species' responses to invasive plants, which may be facilitated by a framework based on species' traits. We studied the population and community responses of small mammals and changes in habitat heterogeneity across a gradient of cheatgrass (Bromus tectorum) cover, a widespread invasive plant in North America. We live-trapped small mammals over two summers and assessed the effect of cheatgrass on native small mammal abundance, richness, and species-specific and trait-based occupancy, while accounting for detection probability and other key habitat elements. Abundance was only estimated for the most common species, deer mice (Peromyscus maniculatus). All species were pooled for the trait-based occupancy analysis to quantify the ability of small mammal traits (habitat association, mode of locomotion, and diet) to predict responses to cheatgrass invasion. Habitat heterogeneity decreased with cheatgrass cover. Deer mouse abundance increased marginally with cheatgrass. Species richness did not vary with cheatgrass; however, pocket mouse (Perognathus spp.) and harvest mouse (Reithrodontomys spp.) occupancy tended to decrease and increase, respectively, with cheatgrass cover, suggesting a shift in community composition. Cheatgrass had little effect on occupancy for deer mice, 13-lined ground squirrels (Spermophilus tridecemlineatus), and Ord's kangaroo rat (Dipodomys ordii). Species' responses to cheatgrass primarily corresponded with our a priori predictions based on species' traits. The probability of occupancy varied significantly with a species' habitat association but not with diet or mode of locomotion. When considered within the context of a rapid habitat change, such as that caused by invasive plants, relevant species' traits may provide a useful framework for predicting species' responses to a variety of habitat disturbances. Understanding which species are likely to be most affected by exotic plant invasion will help facilitate more efficient, targeted management and conservation of native species and habitats. © 2017 by the Ecological Society of America.

  11. Species’ traits help predict small mammal responses to habitat homogenization by an invasive grass

    USGS Publications Warehouse

    Ceradini, Joseph P.; Chalfoun, Anna D.

    2017-01-01

    Invasive plants can negatively affect native species; however, the strength, direction, and shape of responses may vary depending on the type of habitat alteration and the natural history of native species. To prioritize conservation of vulnerable species, it is therefore critical to effectively predict species’ responses to invasive plants, which may be facilitated by a framework based on species’ traits. We studied the population and community responses of small mammals and changes in habitat heterogeneity across a gradient of cheatgrass (Bromus tectorum) cover, a widespread invasive plant in North America. We live-trapped small mammals over two summers and assessed the effect of cheatgrass on native small mammal abundance, richness, and species-specific and trait-based occupancy, while accounting for detection probability and other key habitat elements. Abundance was only estimated for the most common species, deer mice (Peromyscus maniculatus). All species were pooled for the trait-based occupancy analysis to quantify the ability of small mammal traits (habitat association, mode of locomotion, and diet) to predict responses to cheatgrass invasion. Habitat heterogeneity decreased with cheatgrass cover. Deer mouse abundance increased marginally with cheatgrass. Species richness did not vary with cheatgrass; however, pocket mouse (Perognathus spp.) and harvest mouse (Reithrodontomys spp.) occupancy tended to decrease and increase, respectively, with cheatgrass cover, suggesting a shift in community composition. Cheatgrass had little effect on occupancy for deer mice, 13-lined ground squirrels (Spermophilus tridecemlineatus), and Ord's kangaroo rat (Dipodomys ordii). Species’ responses to cheatgrass primarily corresponded with our a priori predictions based on species’ traits. The probability of occupancy varied significantly with a species’ habitat association but not with diet or mode of locomotion. When considered within the context of a rapid habitat change, such as that caused by invasive plants, relevant species’ traits may provide a useful framework for predicting species’ responses to a variety of habitat disturbances. Understanding which species are likely to be most affected by exotic plant invasion will help facilitate more efficient, targeted management and conservation of native species and habitats.

  12. Exposure-response evaluations of venetoclax efficacy and safety in patients with non-Hodgkin lymphoma.

    PubMed

    Parikh, Apurvasena; Gopalakrishnan, Sathej; Freise, Kevin J; Verdugo, Maria E; Menon, Rajeev M; Mensing, Sven; Salem, Ahmed Hamed

    2018-04-01

    Exposure-response analyses were performed for a venetoclax monotherapy study in 106 patients with varying subtypes of non-Hodgkin lymphoma (NHL) (NCT01328626). Logistic regression, time-to-event, and progression-free survival (PFS) analyses were used to evaluate the relationship between venetoclax exposure, NHL subtype and response, PFS, or occurrence of serious adverse events. Trends for small increases in the probability of response with increasing venetoclax exposures were identified, and became more evident when assessed by NHL subtype. Trends in exposure-PFS were shown for the mantle cell lymphoma (MCL) subtype, but not other subtypes. There was no increase in the probability of experiencing a serious adverse event with increasing exposure. Overall, the results indicate that venetoclax doses of 800-1200 mg as a single agent may be appropriate to maximize efficacy in MCL, follicular lymphoma, and diffuse large B-cell lymphoma subtypes with no expected negative impact on safety.
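Exposure-response trends of this kind are typically summarized with logistic regression, in which the log-odds of response are linear in exposure. A generic sketch (the coefficients and exposure values are hypothetical, not the study's estimates):

```python
import math

def response_prob(exposure, intercept, slope):
    """Logistic exposure-response model: log-odds of response
    are linear in drug exposure."""
    return 1.0 / (1.0 + math.exp(-(intercept + slope * exposure)))

# With a positive slope, higher exposure gives a (here, small) increase
# in the probability of response, mirroring the trends described above.
```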

  13. Stochastic response and bifurcation of periodically driven nonlinear oscillators by the generalized cell mapping method

    NASA Astrophysics Data System (ADS)

    Han, Qun; Xu, Wei; Sun, Jian-Qiao

    2016-09-01

    The stochastic response of nonlinear oscillators under periodic and Gaussian white noise excitations is studied with the generalized cell mapping based on short-time Gaussian approximation (GCM/STGA) method. The solutions of the transition probability density functions over a small fraction of the period are constructed by the STGA scheme in order to construct the GCM over one complete period. Both the transient and steady-state probability density functions (PDFs) of a smooth and discontinuous (SD) oscillator are computed to illustrate the application of the method. The accuracy of the results is verified by direct Monte Carlo simulations. The transient responses show the evolution of the PDFs from being Gaussian to non-Gaussian. The effect of a chaotic saddle on the stochastic response is also studied. The stochastic P-bifurcation in terms of the steady-state PDFs occurs with the decrease of the smoothness parameter, which corresponds to the deterministic pitchfork bifurcation.
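The core of the generalized cell mapping idea can be sketched in 1D: discretize the state space into cells, fill a transition matrix with Gaussian short-time transition probabilities, and iterate the cell probability vector to evolve the PDF. The sketch below substitutes an Ornstein-Uhlenbeck process (whose one-step transition is exactly Gaussian) for the SD oscillator, so the stationary PDF can be checked against the known variance σ²/2; it is an illustration of the mapping machinery, not the paper's GCM/STGA implementation:

```python
import math

def build_gcm(n_cells, xmax, tau, sigma):
    """One-step transition matrix for dx = -x dt + sigma dW on [-xmax, xmax],
    using its Gaussian transition density over a time step tau."""
    width = 2.0 * xmax / n_cells
    centers = [-xmax + (i + 0.5) * width for i in range(n_cells)]

    def cdf(z):  # standard normal CDF
        return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

    P = []
    for x in centers:
        mean = x * math.exp(-tau)
        sd = sigma * math.sqrt((1.0 - math.exp(-2.0 * tau)) / 2.0)
        # Probability of landing in each target cell from cell center x:
        P.append([cdf((-xmax + (j + 1) * width - mean) / sd)
                  - cdf((-xmax + j * width - mean) / sd)
                  for j in range(n_cells)])
    return centers, P

def evolve(p, P, steps):
    """Map a cell probability vector forward 'steps' times."""
    n = len(p)
    for _ in range(steps):
        p = [sum(p[i] * P[i][j] for i in range(n)) for j in range(n)]
    return p
```

For a periodically driven system, the same construction over one full forcing period gives the one-period map whose iteration yields the transient and steady-state PDFs.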

  14. Estimating trends in alligator populations from nightlight survey data

    USGS Publications Warehouse

    Fujisaki, Ikuko; Mazzotti, Frank J.; Dorazio, Robert M.; Rice, Kenneth G.; Cherkiss, Michael; Jeffery, Brian

    2011-01-01

    Nightlight surveys are commonly used to evaluate status and trends of crocodilian populations, but imperfect detection caused by survey- and location-specific factors makes it difficult to draw population inferences accurately from uncorrected data. We used a two-stage hierarchical model comprising population abundance and detection probability to examine recent abundance trends of American alligators (Alligator mississippiensis) in subareas of Everglades wetlands in Florida using nightlight survey data. During 2001–2008, there were declining trends in abundance of small- and/or medium-sized animals in a majority of subareas, whereas abundance of large-sized animals showed either an increasing or an unclear trend. For small and large size classes, estimated detection probability declined as water depth increased. Detection probability of small animals was much lower than for larger size classes. The declining trend of smaller alligators may reflect a natural population response to the fluctuating environment of Everglades wetlands under modified hydrology. It may have negative implications for the future of alligator populations in this region, particularly if habitat conditions do not favor recruitment of offspring in the near term. Our study provides a foundation to improve inferences made from nightlight surveys of other crocodilian populations.
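The two-stage structure can be sketched generically: abundance determines how many animals are present, and a detection model (here logistic in water depth, with a negative depth coefficient to mirror the reported decline) determines how many are counted. All coefficients below are hypothetical, not the study's estimates:

```python
import math

def detection_prob(depth, b0, b1):
    """Logistic detection model; b1 < 0 encodes detection probability
    declining as water depth increases."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * depth)))

def expected_count(abundance, depth, b0, b1):
    """Two-stage model: count ~ Binomial(N, p), so E[count] = N * p."""
    return abundance * detection_prob(depth, b0, b1)
```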

  15. Estimating trends in alligator populations from nightlight survey data

    USGS Publications Warehouse

    Fujisaki, Ikuko; Mazzotti, F.J.; Dorazio, R.M.; Rice, K.G.; Cherkiss, M.; Jeffery, B.

    2011-01-01

    Nightlight surveys are commonly used to evaluate status and trends of crocodilian populations, but imperfect detection caused by survey- and location-specific factors makes it difficult to draw population inferences accurately from uncorrected data. We used a two-stage hierarchical model comprising population abundance and detection probability to examine recent abundance trends of American alligators (Alligator mississippiensis) in subareas of Everglades wetlands in Florida using nightlight survey data. During 2001-2008, there were declining trends in abundance of small- and/or medium-sized animals in a majority of subareas, whereas abundance of large-sized animals showed either an increasing or an unclear trend. For small and large size classes, estimated detection probability declined as water depth increased. Detection probability of small animals was much lower than for larger size classes. The declining trend of smaller alligators may reflect a natural population response to the fluctuating environment of Everglades wetlands under modified hydrology. It may have negative implications for the future of alligator populations in this region, particularly if habitat conditions do not favor recruitment of offspring in the near term. Our study provides a foundation to improve inferences made from nightlight surveys of other crocodilian populations. © 2011 US Government.

  16. Heterogeneous detection probabilities for imperiled Missouri River fishes: implications for large-river monitoring programs

    USGS Publications Warehouse

    Schloesser, J.T.; Paukert, Craig P.; Doyle, W.J.; Hill, Tracy D.; Steffensen, K.D.; Travnichek, Vincent H.

    2012-01-01

    Occupancy modeling was used to determine (1) if detection probabilities (p) for 7 regionally imperiled Missouri River fishes (Scaphirhynchus albus, Scaphirhynchus platorynchus, Cycleptus elongatus, Sander canadensis, Macrhybopsis aestivalis, Macrhybopsis gelida, and Macrhybopsis meeki) differed among gear types (i.e. stationary gill nets, drifted trammel nets, and otter trawls), and (2) how detection probabilities were affected by habitat (i.e. pool, bar, and open water), longitudinal position (five 189 to 367 rkm long segments), sampling year (2003 to 2006), and season (July 1 to October 30 and October 31 to June 30). Adult, large-bodied fishes were best detected with gill nets (p: 0.02–0.74), but most juvenile large-bodied and all small-bodied species were best detected with otter trawls (p: 0.02–0.58). Trammel nets may be a redundant sampling gear for imperiled fishes in the lower Missouri River because most species had greater detection probabilities with gill nets or otter trawls. Detection probabilities varied with river segment for S. platorynchus, C. elongatus, and all small-bodied fishes, suggesting that changes in habitat influenced gear efficiency or abundance changes among river segments. Detection probabilities varied by habitat for adult S. albus and S. canadensis, year for juvenile S. albus, C. elongatus, and S. canadensis, and season for adult S. albus. Concentrating sampling effort on gears with the greatest detection probabilities may increase species detections to better monitor a population's response to environmental change and the effects of management actions on large-river fishes.
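In a basic single-season occupancy model, the likelihood of a site's detection history combines an occupancy probability ψ with a per-visit detection probability p; an all-zero history is ambiguous between "occupied but missed" and "truly unoccupied". A minimal sketch of that likelihood (values in the checks are illustrative; the paper's models additionally let p vary by gear, habitat, segment, year, and season):

```python
def occupancy_likelihood(history, psi, p):
    """Likelihood of one site's detection history (list of 0/1 visits)
    under a basic occupancy model: occupied with probability psi,
    detected on each visit with probability p if occupied."""
    occupied = psi
    for y in history:
        occupied *= p if y else (1.0 - p)
    if any(history):
        return occupied
    # All-zero history: occupied-but-missed, or truly unoccupied.
    return occupied + (1.0 - psi)
```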

  17. Is the Production of Embryos in Small-Scale Farming an Economically Feasible Enterprise?

    PubMed

    Sánchez, Z; Lammoglia, M A; Alarcón, M A; Romero, J J; Galina, C S

    2015-08-01

    This study evaluates the feasibility of using embryo transfer among small-scale community farmers, both through an in vivo study and by modelling the results obtained. Of the 59 donor cows, 62.7% responded to treatment, with a significant difference (p = 0.002) in response rate between breeds: 90.5% (19/21) in Holstein and 47.4% (18/38) in Brahman. A total of 283 embryos were graded as transferable and 141 as non-transferable, with no difference in the percentage of transferable embryos by breed (p = 0.18). The mean number of transferable embryos graded as class I and II did not differ between Holstein and Brahman (p = 0.96 and p = 0.92, respectively), and no differences were observed in the other (non-transferable) grades. The largest cost differences, regardless of embryo quality or breed, were seen at the lower levels of probable fertility of the transferred embryo, reaching several hundred dollars. When modelling the expected costs per embryo produced and transferred, values can reach nearly $2000.00 when the probable fertility is only 10%. However, when the probable fertility was 60%, embryo cost was close to $300.00. This technology seems to be viable in medium- or large-scale systems with a superovulatory response between 60 and 80% and 4-6 transferable embryos. Yet, in small-scale farming, due to the reduced number of donors and/or recipients, the costs surpass the economical feasibility of the technique. © 2015 Blackwell Verlag GmbH.
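The reported cost figures are consistent with a simple model in which the expected number of transfers per pregnancy is 1/(probable fertility), so expected cost scales inversely with fertility. A sketch, assuming a fixed per-transfer cost of $190 (a hypothetical value chosen only because it roughly reproduces the abstract's figures, not a number from the study):

```python
def expected_cost_per_pregnancy(cost_per_transfer, probable_fertility):
    """Expected transfers needed per pregnancy is 1/fertility (geometric),
    so expected cost is cost_per_transfer / probable_fertility."""
    return cost_per_transfer / probable_fertility

# Hypothetical $190 per transfer:
# at 10% fertility the expected cost is $1900 ("nearly $2000.00");
# at 60% fertility it is about $317 ("close to $300.00").
```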

  18. Accounting for randomness in measurement and sampling in studying cancer cell population dynamics.

    PubMed

    Ghavami, Siavash; Wolkenhauer, Olaf; Lahouti, Farshad; Ullah, Mukhtar; Linnebacher, Michael

    2014-10-01

    Knowing the expected temporal evolution of the proportion of different cell types in sample tissues gives an indication about the progression of the disease and its possible response to drugs. Such systems have been modelled using Markov processes. We here consider an experimentally realistic scenario in which transition probabilities are estimated from noisy cell population size measurements. Using aggregated data of FACS measurements, we develop MMSE and ML estimators and formulate two problems to find the minimum number of required samples and measurements to guarantee the accuracy of predicted population sizes. Our numerical results show that the convergence mechanism of transition probabilities and steady states differ widely from the real values if one uses the standard deterministic approach for noisy measurements. This provides support for our argument that for the analysis of FACS data one should consider the observed state as a random variable. The second problem we address is about the consequences of estimating the probability of a cell being in a particular state from measurements of small population of cells. We show how the uncertainty arising from small sample sizes can be captured by a distribution for the state probability.
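The small-sample problem in the final sentence is, at its simplest, binomial: estimating a state probability from k of n sampled cells carries a standard error of sqrt(p̂(1 − p̂)/n), which shrinks only as 1/√n. A sketch of this elementary bound (the paper's distributional treatment of the state probability is richer than this):

```python
import math

def state_prob_estimate(k, n):
    """Estimate the probability of a cell being in a given state from
    k of n sampled cells, with the binomial standard error of the estimate."""
    p_hat = k / n
    se = math.sqrt(p_hat * (1.0 - p_hat) / n)
    return p_hat, se
```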

  19. Review of probabilistic analysis of dynamic response of systems with random parameters

    NASA Technical Reports Server (NTRS)

    Kozin, F.; Klosner, J. M.

    1989-01-01

    The various methods that have been studied in the past for the probabilistic analysis of dynamic response of systems with random parameters are reviewed. Dynamic response could be obtained deterministically if the variations about the nominal values were small; however, for space structures that require precise pointing, the variations about the nominal values of the structural details and of the environmental conditions are too large to be considered negligible. These uncertainties are accounted for in terms of probability distributions about the nominal values. The quantities of concern for describing the response of the structure include displacements, velocities, and the distributions of natural frequencies. The exact statistical characterization of the response would yield joint probability distributions for the response variables. Since the random quantities appear as coefficients, determining the exact distributions is difficult at best, and certain approximations have to be made. A number of available techniques are discussed, including the nonlinear case: (1) Liouville's equation; (2) perturbation methods; (3) mean-square approximate systems; and (4) approximation of nonlinear systems by linear systems.

  20. Functional response and population dynamics for fighting predator, based on activity distribution.

    PubMed

    Garay, József; Varga, Zoltán; Gámez, Manuel; Cabello, Tomás

    2015-03-07

    The classical Holling type II functional response, describing the per capita predation as a function of prey density, was modified by Beddington and de Angelis to include interference of predators, which increases with predator density and decreases the number of killed prey. In the present paper we further generalize the Beddington-de Angelis functional response, considering that all predator activities (searching for and handling prey, fighting, and recovery) take time, and that the probabilities of predator activities depend on the encounter probabilities, and hence on prey and predator abundance, too. Under these conditions, the aim of the study is to introduce a functional response for the fighting predator and to analyse the corresponding dynamics when predator-predator-prey encounters also occur. From this general approach, the Holling type functional responses can also be obtained as particular cases. In terms of the activity distribution, we give biologically interpretable sufficient conditions for stable coexistence. We consider two-individual (predator-prey) and three-individual (predator-predator-prey) encounters. In the three-individual encounter model there is a relatively higher fighting rate and a lower killing rate. Using numerical simulation, we surprisingly found that when the intrinsic prey growth rate and the conversion rate are small enough, the equilibrium predator abundance is higher in the three-individual encounter case. This means that, when the equilibrium abundance of the predator is small, coexistence appears first in the three-individual encounter model. Copyright © 2014 Elsevier Ltd. All rights reserved.

  1. Low-dose radiotherapy as a chemo-potentiator of a chemotherapy regimen with pemetrexed for recurrent non-small-cell lung cancer: a prospective phase II study.

    PubMed

    Mantini, Giovanna; Valentini, Vincenzo; Meduri, Bruno; Margaritora, Stefano; Balducci, Mario; Micciché, Francesco; Nardone, Luigia; De Rose, Fiorenza; Cesario, Alfredo; Larici, Anna Rita; Maggi, Fabio; Calcagni, Maria Lucia; Granone, Pierluigi

    2012-11-01

    Low-dose radiotherapy (LDR) (<50 cGy) induces enhanced cell killing in vitro via the hyper-radiation sensitivity phenomenon. The aim of this study was to evaluate the safety and efficacy of a palliative regimen combining pemetrexed and LDR (as a chemopotentiator) in patients with recurrent non-small-cell lung cancer (NSCLC). Eligible patients had an ECOG performance status ≤2, one prior chemotherapy regimen for advanced NSCLC, adequate organ function, and measurable lesions. Patients received pemetrexed (500 mg/m(2) IV) and concurrent LDR (40 cGy bid on days 1 and 2) delivered to target pulmonary or metastatic disease. This cycle was repeated four times, every 21 days. The accrual was determined by a single-proportion power analysis (α=0.05, power=0.8) with H0 ("bad" response probability, 9% according to the literature) and H1 ("good" response probability, 35% for the ongoing study); the required number was 19. Nineteen patients with stage III and IV disease were enrolled. Only one patient experienced grade 4 neutropenia. All patients were evaluable for clinical response of the irradiated lesion: the overall response rate was 42%. Low-dose radiotherapy combined with pemetrexed has a toxicity profile similar to that of chemotherapy alone. The response rate of this novel approach is encouraging, being higher than that reported for pemetrexed alone (42% versus 9.1%). Additional scientific investigation of this new treatment paradigm is warranted. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
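The accrual statement (H0 = 9%, H1 = 35%, α = 0.05, power = 0.8, n = 19) can be made concrete with an exact single-stage binomial design sketch in the style of A'Hern. This is an illustration under stated assumptions (exact one-sided binomial test), not necessarily the procedure the authors used, so the n it returns may differ slightly from the 19 reported depending on the design variant.

```python
from math import comb

def binom_tail(n: int, p: float, r: int) -> float:
    """P(X >= r) for X ~ Binomial(n, p)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(r, n + 1))

def single_stage_design(p0: float, p1: float, alpha: float = 0.05,
                        power: float = 0.80, n_max: int = 200):
    """Smallest n, with rejection threshold r ("reject H0 if >= r responses"),
    such that the type I error under p0 is <= alpha and the power under p1
    is >= power (exact single-stage binomial design)."""
    for n in range(1, n_max + 1):
        for r in range(n + 1):
            if binom_tail(n, p0, r) <= alpha and binom_tail(n, p1, r) >= power:
                return n, r
    return None

# "Bad" response probability 9%, "good" response probability 35%:
print(single_stage_design(p0=0.09, p1=0.35))
```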

  2. More heads choose better than one: Group decision making can eliminate probability matching.

    PubMed

    Schulze, Christin; Newell, Ben R

    2016-06-01

    Probability matching is a robust and common failure to adhere to normative predictions in sequential decision making. We show that this choice anomaly is nearly eradicated by gathering individual decision makers into small groups and asking the groups to decide. The group choice advantage emerged both when participants generated responses for an entire sequence of choices without outcome feedback (Exp. 1a) and when participants made trial-by-trial predictions with outcome feedback after each decision (Exp. 1b). We show that the dramatic improvement observed in group settings stands in stark contrast to a complete lack of effective solitary deliberation. These findings suggest a crucial role of group discussion in alleviating the impact of hasty intuitive responses in tasks better suited to careful deliberation.

  3. Design and Weighting Methods for a Nationally Representative Sample of HIV-infected Adults Receiving Medical Care in the United States-Medical Monitoring Project

    PubMed Central

    Iachan, Ronaldo; Johnson, Christopher H.; Harding, Richard L.; Kyle, Tonja; Saavedra, Pedro; Frazier, Emma L.; Beer, Linda; Mattson, Christine L.; Skarbinski, Jacek

    2016-01-01

    Background: Health surveys of the general US population are inadequate for monitoring human immunodeficiency virus (HIV) infection because the relatively low prevalence of the disease (<0.5%) leads to small subpopulation sample sizes. Objective: To collect a nationally and locally representative probability sample of HIV-infected adults receiving medical care to monitor clinical and behavioral outcomes, supplementing the data in the National HIV Surveillance System. This paper describes the sample design and weighting methods for the Medical Monitoring Project (MMP) and provides estimates of the size and characteristics of this population. Methods: To develop a method for obtaining valid, representative estimates of the in-care population, we implemented a cross-sectional, three-stage design that sampled 23 jurisdictions, then 691 facilities, then 9,344 HIV patients receiving medical care, using probability-proportional-to-size methods. The data weighting process followed standard methods, accounting for the probabilities of selection at each stage and adjusting for nonresponse and multiplicity. Nonresponse adjustments accounted for differing response at both the facility and patient levels. Multiplicity adjustments accounted for visits to more than one HIV care facility. Results: MMP used a multistage stratified probability sampling design that was approximately self-weighting in each of the 23 project areas and nationally. The probability sample represents the estimated 421,186 HIV-infected adults receiving medical care during January through April 2009. The methods were efficient (i.e., they induced small unequal-weighting effects and small standard errors for a range of weighted estimates). Conclusion: The information collected through MMP allows monitoring of trends in clinical and behavioral outcomes and informs resource allocation for treatment and prevention activities. PMID:27651851

  4. Probability and volume of potential postwildfire debris flows in the 2010 Fourmile burn area, Boulder County, Colorado

    USGS Publications Warehouse

    Ruddy, Barbara C.; Stevens, Michael R.; Verdin, Kristine

    2010-01-01

    This report presents a preliminary emergency assessment of the debris-flow hazards from drainage basins burned by the Fourmile Creek fire in Boulder County, Colorado, in 2010. Empirical models derived from statistical evaluation of data collected from recently burned basins throughout the intermountain western United States were used to estimate the probability of debris-flow occurrence and the volumes of debris flows for selected drainage basins. Data for the models include burn severity, rainfall total and intensity for a 25-year-recurrence, 1-hour-duration rainstorm, and topographic and soil property characteristics. Several of the selected drainage basins in Fourmile Creek and Gold Run were identified as having probabilities of debris-flow occurrence greater than 60 percent, and many more with probabilities greater than 45 percent, in response to the 25-year recurrence, 1-hour rainfall. None of the Fourmile Canyon Creek drainage basins selected had probabilities greater than 45 percent. Throughout the Gold Run area and the Fourmile Creek area upstream from Gold Run, the higher probabilities tend to be in the basins with southerly aspects (southeast, south, and southwest slopes). Many basins along the perimeter of the fire area were identified as having a low probability of debris-flow occurrence. Volumes of debris flows predicted from drainage basins with probabilities of occurrence greater than 60 percent ranged from 1,200 to 9,400 m3. The moderately high probabilities and some of the larger volumes predicted for the modeled storm indicate a potential for substantial debris-flow effects on buildings, roads, bridges, culverts, and reservoirs located both within these drainages and immediately downstream from the burned area. However, even small debris flows that affect structures at the basin outlets could cause considerable damage.

  5. Cutaneous reflexes in small muscles of the hand

    PubMed Central

    Caccia, M. R.; McComas, A. J.; Upton, A. R. M.; Blogg, T.

    1973-01-01

    A study has been made of the responses of motoneurones innervating small muscles of the hand to electrical and mechanical stimulation of the skin. Both excitatory and inhibitory effects could be observed in the same muscle after a single stimulus to a given area of skin. The earliest excitatory and inhibitory responses are probably mediated by group III and the smaller group II afferent nerve fibres. A later inhibition results from activity in the larger group II fibres which are connected to cutaneous mechanoreceptors, especially those in the tips of the fingers and thumb. This late inhibitory reflex may operate through the fusimotor system. The possible roles of these reflexes are discussed in relation to previous investigations in man and the cat. PMID:4272546

  6. Experimental pancreatic hyperplasia and neoplasia: effects of dietary and surgical manipulation.

    PubMed Central

    Watanapa, P.; Williamson, R. C.

    1993-01-01

    Several studies carried out during the past two decades have investigated the effect of dietary and surgical manipulation on pancreatic growth and carcinogenesis. Diets high in trypsin inhibitor stimulate pancreatic growth and increase the formation of preneoplastic lesions and carcinomas in the rat pancreas. Cholecystokinin (CCK) is the key intermediary in this response, since both natural and synthetic trypsin inhibitors increase circulating levels of the hormone and CCK antagonists largely prevent these changes. Fatty acids enhance pancreatic carcinogenesis in both rats and hamsters, whereas protein appears to have a protective role in the rat, but to increase tumour yields in the hamster. Several surgical operations affect the pancreas. Pancreatobiliary diversion and partial gastrectomy stimulate pancreatic growth and enhance carcinogenesis, probably by means of increased CCK release. Complete duodenogastric reflux has similar effects on the pancreas, but the gut peptide involved is gastrin. Although massive small bowel resection increases pancreatic growth, the marked reduction in caloric absorption probably explains its failure to enhance carcinogenesis. CCK and enteroglucagon might work in concert to modulate the trophic response of the pancreas to small bowel resection. In the pancreas, as in the large intestine, hyperplasia appears to precede and predispose to neoplasia. PMID:8494719

  7. Import risk assessment incorporating a dose-response model: introduction of highly pathogenic porcine reproductive and respiratory syndrome into Australia via illegally imported raw pork.

    PubMed

    Brookes, V J; Hernández-Jover, M; Holyoake, P; Ward, M P

    2014-03-01

    Highly pathogenic porcine reproductive and respiratory syndrome (PRRS) has spread through parts of south-east Asia, posing a risk to Australia. The objective of this study was to assess the probability of infection of a feral or domestic pig in Australia with highly pathogenic PRRS following ingestion of illegally imported raw pork. A conservative scenario was considered in which 500 g of raw pork was imported from the Philippines into Australia without being detected by border security, then discarded from a household and potentially accessed by a pig. Monte Carlo simulation of a two-dimensional, stochastic model was used to estimate the probability of entry and exposure, and the probability of infection was assessed by incorporating a virus-decay and mechanistic dose-response model. Results indicated that the probability of infection of a feral pig after ingestion of raw meat was higher than the probability of infection of a domestic pig. Sensitivity analysis was used to assess the influence of input parameters on model output probability estimates, and extension of the virus-decay and dose-response model was used to explore the impact of different temperatures and times from slaughter to ingestion of the meat, different weights of meat, and the level of viraemia at slaughter on the infectivity of meat. The parameters with the highest influence on the model output were the level of viraemia of a pig prior to slaughter and the probability of access by a feral pig to food-waste discarded on property surrounding a household. Extension of the decay and dose-response model showed that small pieces of meat (10 g) from a highly pathogenic PRRS viraemic pig could contain enough virus to have a high probability of infecting a pig, and that routes to Australia by sea or air from all highly pathogenic PRRS virus endemic countries were of concern, depending on the temperature of the raw meat during transport. This study highlighted the importance of mitigation strategies such as disposal of food-waste from international traffic as quarantine waste, and the need for further research into the probability of access to food-waste on properties by feral pigs. Copyright © 2014 Elsevier B.V. All rights reserved.

  8. Epidemics in interconnected small-world networks.

    PubMed

    Liu, Meng; Li, Daqing; Qin, Pengju; Liu, Chaoran; Wang, Huijuan; Wang, Feilong

    2015-01-01

    Networks can be used to describe the interconnections among individuals, which play an important role in the spread of disease. Although the small-world effect has been found to have a significant impact on epidemics in single networks, the small-world effect on epidemics in interconnected networks has rarely been considered. Here, we study the susceptible-infected-susceptible (SIS) model of epidemic spreading in a system comprising two interconnected small-world networks. We find that the epidemic threshold in such networks decreases when the rewiring probability of the component small-world networks increases. When the infection rate is low, the rewiring probability affects the global steady-state infection density, whereas when the infection rate is high, the infection density is insensitive to the rewiring probability. Moreover, epidemics in interconnected small-world networks are found to spread at different velocities that depend on the rewiring probability.
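The setup can be sketched with a toy discrete-time simulation; note this simplified version uses a single Watts-Strogatz small-world network, whereas the paper couples two interconnected networks, and the parameter names below are our own.

```python
import random

def watts_strogatz(n, k, p, rng):
    """Ring of n nodes, each linked to its k nearest neighbours per side;
    each edge is rewired to a random target with probability p."""
    edges = {(i, (i + j) % n) for i in range(n) for j in range(1, k + 1)}
    rewired = set()
    for (u, v) in edges:
        if rng.random() < p:
            w = rng.randrange(n)
            while w == u or (u, w) in rewired or (w, u) in rewired:
                w = rng.randrange(n)
            rewired.add((u, w))
        else:
            rewired.add((u, v))
    adj = {i: set() for i in range(n)}
    for (u, v) in rewired:
        adj[u].add(v)
        adj[v].add(u)
    return adj

def sis_density(adj, beta, gamma, steps, rng, seed_frac=0.1):
    """Discrete-time SIS dynamics: each step, an infected node infects each
    susceptible neighbour with probability beta and recovers with
    probability gamma. Returns the final infected fraction."""
    n = len(adj)
    infected = {i for i in range(n) if rng.random() < seed_frac}
    for _ in range(steps):
        new_inf = set()
        for i in infected:
            if rng.random() >= gamma:      # fails to recover this step
                new_inf.add(i)
            for j in adj[i]:               # attempts to infect neighbours
                if j not in infected and rng.random() < beta:
                    new_inf.add(j)
        infected = new_inf
    return len(infected) / n

rng = random.Random(42)
net = watts_strogatz(n=500, k=2, p=0.1, rng=rng)
print(sis_density(net, beta=0.2, gamma=0.5, steps=100, rng=rng))
```

Sweeping the rewiring probability p while holding beta and gamma fixed gives a rough picture of how rewiring shifts the epidemic threshold and the steady-state infection density described in the abstract.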

  9. Population variability complicates the accurate detection of climate change responses.

    PubMed

    McCain, Christy; Szewczyk, Tim; Bracy Knight, Kevin

    2016-06-01

    The rush to assess species' responses to anthropogenic climate change (CC) has underestimated the importance of interannual population variability (PV). Researchers assume sampling rigor alone will lead to an accurate detection of response regardless of the underlying population fluctuations of the species under consideration. Using population simulations across a realistic, empirically based gradient in PV, we show that moderate to high PV can lead to opposite and biased conclusions about CC responses. Between pre- and post-CC sampling bouts of modeled populations as in resurvey studies, there is: (i) A 50% probability of erroneously detecting the opposite trend in population abundance change and nearly zero probability of detecting no change. (ii) Across multiple years of sampling, it is nearly impossible to accurately detect any directional shift in population sizes with even moderate PV. (iii) There is up to 50% probability of detecting a population extirpation when the species is present, but in very low natural abundances. (iv) Under scenarios of moderate to high PV across a species' range or at the range edges, there is a bias toward erroneous detection of range shifts or contractions. Essentially, the frequency and magnitude of population peaks and troughs greatly impact the accuracy of our CC response measurements. Species with moderate to high PV (many small vertebrates, invertebrates, and annual plants) may be inaccurate 'canaries in the coal mine' for CC without pertinent demographic analyses and additional repeat sampling. Variation in PV may explain some idiosyncrasies in CC responses detected so far and urgently needs more careful consideration in design and analysis of CC responses. © 2016 John Wiley & Sons Ltd.
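The roughly 50% figure in point (i) follows from symmetry and can be reproduced with a minimal simulation (a toy model added here for illustration, not the authors' code): a perfectly stable population with lognormal interannual variability, sampled once before and once after, looks like a decline about half the time.

```python
import math
import random

def false_decline_prob(cv: float, trials: int = 10000, seed: int = 1) -> float:
    """Probability that a naive pre/post resurvey of a STABLE population
    (no real trend; lognormal interannual variability with coefficient of
    variation cv) reports a decline purely due to population fluctuation."""
    rng = random.Random(seed)
    sigma = math.sqrt(math.log(1.0 + cv**2))  # lognormal shape from the cv
    mu = -sigma**2 / 2.0                      # keeps the mean abundance at 1.0
    declines = 0
    for _ in range(trials):
        pre = rng.lognormvariate(mu, sigma)
        post = rng.lognormvariate(mu, sigma)
        if post < pre:
            declines += 1
    return declines / trials

print(false_decline_prob(cv=0.5))  # close to 0.5 by symmetry
```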

  10. Women doctors in Norway: the challenging balance between career and family life.

    PubMed

    Gjerberg, Elisabeth

    2003-10-01

    In most Western countries, women doctors are still underrepresented in the higher positions of the medical hierarchy and in the most prestigious specialities. A crucial question is whether family responsibilities affect female and male careers differently. The article examines how Norwegian physicians balance their work and family responsibilities and demonstrates differences between women and men in the way doctors combine work and family obligations. Among women doctors, the probability of becoming a specialist decreased with an increasing number of children. Moreover, postponing the birth of the first child increased the probability of completing hospital specialities. Although more women than men work part-time, this was the case only for a small proportion of women doctors. Transition from full-time to part-time work is primarily a strategy for accommodating family responsibilities, though one strongly influenced by variations in the opportunity structure of different specialities. The findings further demonstrate that being married to another doctor had a positive impact on the career, especially for women doctors.

  11. [Scleroderma cluster among type-setters].

    PubMed

    Magnavita, N

    2007-01-01

    The etiology of systemic sclerosis, probably multifactorial, is not yet well defined. Among the many endogenous and exogenous factors probably involved, occupational elements may play an essential role. Here we report a cluster of local scleroderma and systemic sclerosis, which occurred in a small group of typography workers exposed to polyvinyl-acetate glues, containing up to 1% of vinyl-acetate. Vinyl acetate exposure has been associated with acidification of the intracellular environment, which is thought to produce cytotoxic and/or mitogenic responses that are the sentinel pharmacodynamic steps toward cancer. Autoantibody production in systemic sclerosis depends upon intracellular acidification. More studies are needed to clarify the relationship between vinyl acetate exposure and scleroderma.

  12. Variability in Cell Response of Cronobacter sakazakii after Mild-Heat Treatments and Its Impact on Food Safety

    PubMed Central

    Parra-Flores, Julio; Juneja, Vijay; Garcia de Fernando, Gonzalo; Aguirre, Juan

    2016-01-01

    Cronobacter spp. have been responsible for severe infections in infants associated with consumption of powdered infant formula and follow-up formulae. Despite several risk assessments described in published studies, few approaches have considered the tremendous variability in the response that small micropopulations or single cells can show in infant formula during storage, preparation, or post-process handling before the feeding of infants. Stochastic approaches describe microbial single-cell responses better than deterministic models, as we show in this study. A large variability of lag phase was observed in single cells and in micropopulations of ≤50 cells. This variability increased as the heat shock increased and the growth temperature decreased. The growth variability of individual Cronobacter sakazakii cells is clearly affected by inoculum size and growth temperature, and the probability that cells are able to grow under the imposed experimental conditions should be taken into account, especially when errors in bottle-preparation practices, such as improper holding temperatures or manipulation, may lead to growth of the pathogen to a critical cell level. The mean probability of illness from an initial inoculum of 1 cell was below 0.2 in all cases, whereas for an inoculum of 50 cells it was, in most cases, above 0.7. PMID:27148223

  13. Local tumor control probability modeling of primary and secondary lung tumors in stereotactic body radiotherapy.

    PubMed

    Guckenberger, Matthias; Klement, Rainer J; Allgäuer, Michael; Andratschke, Nicolaus; Blanck, Oliver; Boda-Heggemann, Judit; Dieckmann, Karin; Duma, Marciana; Ernst, Iris; Ganswindt, Ute; Hass, Peter; Henkenberens, Christoph; Holy, Richard; Imhoff, Detlef; Kahl, Henning K; Krempien, Robert; Lohaus, Fabian; Nestle, Ursula; Nevinny-Stickel, Meinhard; Petersen, Cordula; Semrau, Sabine; Streblow, Jan; Wendt, Thomas G; Wittig, Andrea; Flentje, Michael; Sterzing, Florian

    2016-03-01

    To evaluate whether local tumor control probability (TCP) in stereotactic body radiotherapy (SBRT) varies between lung metastases of different primary cancer sites and between primary non-small cell lung cancer (NSCLC) and secondary lung tumors. A retrospective multi-institutional (n=22) database of 399 patients with stage I NSCLC and 397 patients with 525 lung metastases was analyzed. Irradiation doses were converted to biologically effective doses (BED). Logistic regression was used for local tumor control probability (TCP) modeling and the second-order bias corrected Akaike Information Criterion was used for model comparison. After median follow-up of 19 months and 16 months (n.s.), local tumor control was observed in 87.7% and 86.7% of the primary and secondary lung tumors (n.s.), respectively. A strong dose-response relationship was observed in the primary NSCLC and metastatic cohort but dose-response relationships were not significantly different: the TCD90 (dose to achieve 90% TCP; BED of maximum planning target volume dose) estimates were 176 Gy (151-223) and 160 Gy (123-237) (n.s.), respectively. The dose-response relationship was not influenced by the primary cancer site within the metastatic cohort. Dose-response relationships for local tumor control in SBRT were not different between lung metastases of various primary cancer sites and between primary NSCLC and lung metastases. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
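The dose figures above combine two ingredients named in the abstract: LQ-based conversion of physical dose to BED and a logistic dose-response (TCP) curve. The sketch below shows the standard forms; the tcd50 and k values are illustrative placeholders, not fitted parameters from this study.

```python
import math

def bed(n_fx: int, d_per_fx: float, alpha_beta: float = 10.0) -> float:
    """Biologically effective dose for n fractions of d Gy (LQ model):
    BED = n * d * (1 + d / (alpha/beta))."""
    return n_fx * d_per_fx * (1 + d_per_fx / alpha_beta)

def tcp_logistic(bed_gy: float, tcd50: float, k: float) -> float:
    """Logistic TCP: tcd50 is the BED giving 50% control, k sets the slope."""
    return 1.0 / (1.0 + math.exp(-(bed_gy - tcd50) / k))

def tcdx(x: float, tcd50: float, k: float) -> float:
    """BED needed for control probability x (inverse of tcp_logistic)."""
    return tcd50 + k * math.log(x / (1.0 - x))

# Illustrative parameters only (not the values fitted in the study):
tcd50, k = 100.0, 25.0
print(f"3 x 18 Gy -> BED = {bed(3, 18.0):.1f} Gy")
print(f"TCD90 = {tcdx(0.90, tcd50, k):.1f} Gy BED")
```

Inverting the logistic curve gives TCDx directly, which is how a TCD90 such as the 160-176 Gy BED estimates above is read off a fitted model.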

  14. Mechanisms of small intestinal adaptation.

    PubMed

    Jenkins, A P; Thompson, R P

    1994-01-01

    Luminal nutrition, hormonal factors and pancreaticobiliary secretions are probably the major mediators of small intestinal adaptation. Their actions, as discussed in this paper, are likely to be interrelated. Direct local enterotrophic effects cannot account for all the actions of luminal nutrients. Additionally, hormonal factors have been shown to contribute to indirect effects of luminal nutrients and enteroglucagon is a likely mediator of adaptive responses. Furthermore, epidermal growth factor is a peptide for which there is convincing evidence of an enterotrophic action. Attention is drawn to the fact that pancreaticobiliary secretions may have a physiological role in stimulating small intestinal mucosal proliferation. Other factors may also influence small intestinal mucosal proliferation (e.g. prostaglandins, neurovascular mechanisms, bacteria). Additionally, polyamines are crucial in initiating cell division in the small intestine, but the detailed mechanisms of their action require further clarification. Finally, a number of therapeutic applications of small intestinal epithelial cell proliferation are discussed.

  15. A General Method for Assessing the Origin of Interstellar Small Bodies: The Case of 1I/2017 U1 (‘Oumuamua)

    NASA Astrophysics Data System (ADS)

    Zuluaga, Jorge I.; Sánchez-Hernández, Oscar; Sucerquia, Mario; Ferrín, Ignacio

    2018-06-01

    With the advent of more and deeper sky surveys, the discovery of interstellar small objects entering the solar system has finally become possible. On 2017 October 19, using observations from the Pan-STARRS survey, a fast-moving object, now officially named 1I/2017 U1 (‘Oumuamua), was discovered on a heliocentric unbound trajectory, suggesting an interstellar origin. Assessing the provenance of interstellar small objects is key for understanding their distribution, spatial density, and the processes responsible for their ejection from planetary systems. However, their peculiar trajectories place a limit on the number of observations available to determine a precise orbit. As a result, when its position is propagated ∼10^5–10^6 years backward in time, small errors in orbital elements become large uncertainties in position in interstellar space. In this paper we present a general method for assigning probabilities to nearby stars of being the parent system of an observed interstellar object. We describe the method in detail and apply it to assessing the origin of ‘Oumuamua. A preliminary list of potential progenitors and their corresponding probabilities is provided. In the future, when further information about the object and/or the nearby stars is refined, the probabilities computed with our method can be updated. We provide all the data and codes we developed for this purpose in the form of an open source C/C++/Python package, iWander, which is publicly available at http://github.com/seap-udea/iWander.

  16. The importance of duodenal diverticula in the elderly.

    PubMed Central

    Pearce, V. R.

    1980-01-01

    All barium meal examinations performed in patients aged over 65 years during one year in one Health District are reviewed. There were 39 cases of duodenal diverticula. One case of osteomalacia and folate deficiency was discovered; this patient had evidence of small bowel bacterial overgrowth. In the remaining cases showing evidence of nutritional deficiency, other factors were probably responsible. The evidence for an association between deficiencies and duodenal diverticula is discussed, and it is concluded that these structures are rarely responsible for nutritional deficiencies in the elderly. PMID:6791147

  17. The effects of small field dosimetry on the biological models used in evaluating IMRT dose distributions

    NASA Astrophysics Data System (ADS)

    Cardarelli, Gene A.

    The primary goal in radiation oncology is to deliver lethal radiation doses to tumors while minimizing dose to normal tissue. IMRT has the capability to increase the dose to targets and decrease the dose to normal tissue, increasing local control, decreasing toxicity, and allowing effective dose escalation. This advanced technology, however, produces complex dose distributions that are not easily verified. Furthermore, the dose inhomogeneity of the non-uniform dose distributions seen in IMRT treatments has driven the development of biological models that attempt to characterize the dose-volume effect in the response of organized tissues to radiation. Dosimetry of small fields can be quite challenging when measuring dose distributions for the high-energy X-ray beams used in IMRT, and proper modeling of these small-field distributions is essential for reproducing accurate dose in IMRT. This evaluation was conducted to quantify the effects of small-field dosimetry on IMRT plan dose distributions and on four biological model parameters. The four biological models evaluated were: (1) the generalized Equivalent Uniform Dose (gEUD), (2) the Tumor Control Probability (TCP), (3) the Normal Tissue Complication Probability (NTCP), and (4) the Probability of uncomplicated Tumor Control (P+). These models are used to estimate local control, survival, complications, and uncomplicated tumor control. This investigation compares three distinct small-field dose algorithms, created using film, a small ion chamber, and a combination of ion-chamber measurements and small-field fitting parameters. Because of the uncertainties inherent in small-field dosimetry and the dependence of biological models on dose-volume information, this examination quantifies the effects of small-field dosimetry techniques on radiobiological models and recommends pathways to reduce the errors in using these models to evaluate IMRT dose distributions. 
This study demonstrates the importance of valid physical dose modeling prior to the use of biological modeling. The success of using biological function data, such as hypoxia, in clinical IMRT planning will greatly benefit from the results of this study.
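Of the four models listed, the gEUD is the simplest to state. A minimal sketch of the standard (Niemierko-style) form follows, using an illustrative dose-volume histogram rather than data from this work:

```python
def geud(dvh, a):
    """Generalized equivalent uniform dose from a differential DVH:
    dvh = [(dose_Gy, fractional_volume), ...], volumes summing to 1.
    gEUD = (sum_i v_i * D_i**a) ** (1/a); a < 0 for targets (cold spots
    dominate), a >> 1 for serial normal tissues (hot spots dominate)."""
    return sum(v * d ** a for d, v in dvh) ** (1.0 / a)

# Uniform dose: gEUD equals that dose for any a.
uniform = [(60.0, 1.0)]
# Non-uniform plan: 90% of the volume at 60 Gy, a 10% cold spot at 40 Gy.
plan = [(60.0, 0.9), (40.0, 0.1)]

print(geud(uniform, a=-10))   # 60.0
print(geud(plan, a=-10))      # pulled toward the 40 Gy cold spot
print(geud(plan, a=1))        # a=1 is just the mean dose: 58.0
```

Because small-field dosimetry errors shift the dose values entering the DVH, the sensitivity of gEUD (and of TCP/NTCP models built on it) to those shifts is exactly what this study quantifies.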

  18. Spatial segregation of adaptation and predictive sensitization in retinal ganglion cells

    PubMed Central

    Kastner, David B.; Baccus, Stephen A.

    2014-01-01

    Sensory systems change their sensitivity based upon recent stimuli to adjust their response range to the range of inputs, and to predict future sensory input. Here we report the presence of retinal ganglion cells that have antagonistic plasticity, showing central adaptation and peripheral sensitization. Ganglion cell responses were captured by a spatiotemporal model with independently adapting excitatory and inhibitory subunits, and sensitization requires GABAergic inhibition. Using a simple theory of signal detection we show that the sensitizing surround conforms to an optimal inference model that continually updates the prior signal probability. This indicates that small receptive field regions have dual functionality—to adapt to the local range of signals, but sensitize based upon the probability of the presence of that signal. Within this framework, we show that sensitization predicts the location of a nearby object, revealing prediction as a new functional role for adapting inhibition in the nervous system. PMID:23932000

  19. Empirical estimation of the conditional probability of natech events within the United States.

    PubMed

    Santella, Nicholas; Steinberg, Laura J; Aguirra, Gloria Andrea

    2011-06-01

    Natural disasters are the cause of a sizeable number of hazmat releases, referred to as "natechs." An enhanced understanding of natech probability, allowing for predictions of natech occurrence, is an important step in determining how industry and government should mitigate natech risk. This study quantifies the conditional probabilities of natechs at TRI/RMP and SIC 1311 facilities given the occurrence of hurricanes, earthquakes, tornadoes, and floods. During hurricanes, a higher probability of releases was observed due to storm surge (7.3 releases per 100 TRI/RMP facilities exposed vs. 6.2 for SIC 1311) compared to category 1-2 hurricane winds (5.6 TRI, 2.6 SIC 1311). Logistic regression confirms the statistical significance of the greater propensity for releases at RMP/TRI facilities, and during some hurricanes, when controlling for hazard zone. The probability of natechs at TRI/RMP facilities during earthquakes increased from 0.1 releases per 100 facilities at MMI V to 21.4 at MMI IX. The probability of a natech at TRI/RMP facilities within 25 miles of a tornado was small (∼0.025 per 100 facilities), reflecting the limited area directly affected by tornadoes. Areas inundated during flood events had a probability of 1.1 releases per 100 facilities but demonstrated widely varying natech occurrence during individual events, indicating that factors not quantified in this study such as flood depth and speed are important for predicting flood natechs. These results can inform natech risk analysis, aid government agencies responsible for planning response and remediation after natural disasters, and should be useful in raising awareness of natech risk within industry. © 2011 Society for Risk Analysis.

  20. On the quantification and efficient propagation of imprecise probabilities resulting from small datasets

    NASA Astrophysics Data System (ADS)

    Zhang, Jiaxin; Shields, Michael D.

    2018-01-01

    This paper addresses the problem of uncertainty quantification and propagation when data for characterizing probability distributions are scarce. We propose a methodology wherein the full uncertainty associated with probability model form and parameter estimation is retained and efficiently propagated. This is achieved by applying the information-theoretic multimodel inference method to identify plausible candidate probability densities and the associated probabilities that each model is the best in the Kullback-Leibler sense. The joint parameter densities for each plausible model are then estimated using Bayes' rule. We then propagate this full set of probability models by estimating an optimal importance sampling density that is representative of all plausible models, propagating this density, and reweighting the samples according to each of the candidate probability models. This is in contrast with conventional methods that try to identify a single probability model that encapsulates the full uncertainty caused by lack of data and consequently underestimate uncertainty. The result is a complete probabilistic description of both aleatory and epistemic uncertainty achieved with several orders of magnitude reduction in computational cost. It is shown how the model can be updated to adaptively accommodate added data and added candidate probability models. The method is applied for uncertainty analysis of plate buckling strength where it is demonstrated how dataset size affects the confidence (or lack thereof) we can place in statistical estimates of response when data are lacking.
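The reweighting step can be sketched as follows: draw one set of samples from a mixture of the plausible models, then reweight the same samples under each candidate density (self-normalized importance sampling). The two Gaussian "models" and their model probabilities below are purely illustrative, not from the paper:

```python
import math
import random

def normal_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

# Hypothetical plausible candidate models and their model probabilities.
models = [(0.0, 1.0), (0.5, 1.2)]   # (mu, sigma) per candidate
model_probs = [0.6, 0.4]

# Importance density: the probability-weighted mixture of the candidates.
def q(x):
    return sum(p * normal_pdf(x, mu, s) for p, (mu, s) in zip(model_probs, models))

random.seed(0)
# Sample once from the mixture...
samples = []
for _ in range(20000):
    mu, s = models[0] if random.random() < model_probs[0] else models[1]
    samples.append(random.gauss(mu, s))

# ...then reweight the same samples under each candidate model.
estimates = []
for mu, s in models:
    w = [normal_pdf(x, mu, s) / q(x) for x in samples]
    estimates.append(sum(wi * xi for wi, xi in zip(w, samples)) / sum(w))

print([round(e, 2) for e in estimates])   # each estimate lands near its model's mu
```

The point of the construction is that the (expensive) response need only be evaluated once per sample; each candidate model's statistics are then recovered by reweighting rather than resampling.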

  1. The Immunological and Migratory Properties of the Lymphocytes Recirculating Through the Rat Spleen

    PubMed Central

    Ford, W. L.

    1969-01-01

    The great majority of the cells released by the isolated, perfused rat spleen were lymphocytes of which about 95 per cent were small lymphocytes. The rate of release of small lymphocytes from the spleen was independent of the prevailing concentration in the perfusate. The cells released by the spleen were immunologically competent in respect of a graft-versus-host reaction and a primary antibody response. They could also transfer secondary responsiveness to a viral antigen. Several spleen donors were given a continuous intravenous infusion of tritiated thymidine prior to spleen perfusion and the proportion of labelled small lymphocytes among the population released by the spleen was compared to the proportion in populations of small lymphocytes from other sources. The small lymphocytes released by the spleen recirculated from the blood to thoracic duct lymph after injection into a syngeneic recipient. Conversely the perfused spleen released thoracic duct small lymphocytes which had been given to the spleen donor 24 hr previously. It is therefore probable that a single population of recirculating small lymphocytes exists which migrates from the blood into both lymph-nodes and spleen. The release of small lymphocytes from the spleen was mostly at the expense of the lymphocyte content of the periarteriolar lymphoid sheaths. PMID:5792901

  2. Brief communication: Drought likelihood for East Africa

    NASA Astrophysics Data System (ADS)

    Yang, Hui; Huntingford, Chris

    2018-02-01

    The East Africa drought in the autumn of 2016 caused malnutrition, illness and death. Close to 16 million people across Somalia, Ethiopia and Kenya needed food, water and medical assistance. Many factors influence drought stress and response. However, inevitably the following question is asked: are elevated greenhouse gas concentrations altering extreme rainfall deficit frequency? We investigate this with general circulation models (GCMs). After GCM bias correction to match the climatological mean of the CHIRPS data-based rainfall product, the climate models project a small decrease, by the end of the 21st century relative to present-day probabilities, in the probability of a drought of the same (or worse) severity as the 2016 ASO (August to October) East African event. However, when the climatological variability of the GCMs is additionally bias-corrected to match the CHIRPS data (correcting the variance as well as the mean), the probability of drought occurrence increases slightly over the same period.
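The two bias-correction variants the abstract contrasts can be shown schematically: a mean-only shift of the modeled rainfall series toward the observed climatology versus a joint mean-and-variance scaling. The numbers below are hypothetical, not CHIRPS or GCM values:

```python
import statistics

def bias_correct(model_series, obs_mean, obs_sd, correct_variance=True):
    """Adjust a modeled rainfall series toward observed climatology.
    Mean-only correction shifts the series; with correct_variance=True the
    spread is also rescaled to the observed standard deviation (schematic
    of the two GCM adjustments compared in the study)."""
    m_mean = statistics.mean(model_series)
    m_sd = statistics.pstdev(model_series)
    if correct_variance:
        return [obs_mean + (x - m_mean) * obs_sd / m_sd for x in model_series]
    return [x - m_mean + obs_mean for x in model_series]

gcm = [2.0, 4.0, 6.0, 8.0]   # hypothetical seasonal rainfall values
corrected = bias_correct(gcm, obs_mean=5.0, obs_sd=1.0)
print(round(statistics.mean(corrected), 6), round(statistics.pstdev(corrected), 6))  # 5.0 1.0
```

Because variance correction reshapes the tails of the distribution, it can reverse the sign of a projected change in extreme-deficit probability even when the corrected means agree, which is the effect the abstract reports.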

  3. Photoconductivity response time in amorphous semiconductors

    NASA Astrophysics Data System (ADS)

    Adriaenssens, G. J.; Baranovskii, S. D.; Fuhs, W.; Jansen, J.; Öktü, Ö.

    1995-04-01

    The photoconductivity response time of amorphous semiconductors is examined theoretically on the basis of standard definitions for free- and trapped-carrier lifetimes, and experimentally for a series of a-Si1-xCx:H alloys with x<0.1. Particular attention is paid to its dependence on carrier generation rate and temperature. As no satisfactory agreement between models and experiments emerges, a simple theory is developed that can account for the experimental observations on the basis of the usual multiple-trapping ideas, provided a small probability of direct free-carrier recombination is included. The theory leads to a stretched-exponential photocurrent decay.

  4. SAR/multispectral image fusion for the detection of environmental hazards with a GIS

    NASA Astrophysics Data System (ADS)

    Errico, Angela; Angelino, Cesario Vincenzo; Cicala, Luca; Podobinski, Dominik P.; Persechino, Giuseppe; Ferrara, Claudia; Lega, Massimiliano; Vallario, Andrea; Parente, Claudio; Masi, Giuseppe; Gaetano, Raffaele; Scarpa, Giuseppe; Amitrano, Donato; Ruello, Giuseppe; Verdoliva, Luisa; Poggi, Giovanni

    2014-10-01

    In this paper we propose a GIS-based methodology, using optical and SAR remote sensing data together with more conventional sources, for the detection of small cattle breeding areas potentially responsible for hazardous littering. This specific environmental problem is very relevant for the Caserta area, in southern Italy, where many small buffalo breeding farms exist that are not even known to the productive-activity register and are not easily monitored or surveyed. Experiments on a test area, with available specific ground truth, prove that the proposed system achieves a very high detection probability and a negligible false alarm rate.

  5. On the definition of a Monte Carlo model for binary crystal growth.

    PubMed

    Los, J H; van Enckevort, W J P; Meekes, H; Vlieg, E

    2007-02-01

    We show that consistency of the transition probabilities in a lattice Monte Carlo (MC) model for binary crystal growth with the thermodynamic properties of a system does not guarantee the MC simulations near equilibrium to be in agreement with the thermodynamic equilibrium phase diagram for that system. The deviations remain small for systems with small bond energies, but they can increase significantly for systems with large melting entropy, typical for molecular systems. These deviations are attributed to the surface kinetics, which is responsible for a metastable zone below the liquidus line where no growth occurs, even in the absence of a 2D nucleation barrier. Here we propose an extension of the MC model that introduces a freedom of choice in the transition probabilities while staying within the thermodynamic constraints. This freedom can be used to eliminate the discrepancy between the MC simulations and the thermodynamic equilibrium phase diagram. Agreement is achieved for that choice of the transition probabilities yielding the fastest decrease of the free energy (i.e., largest growth rate) of the system at a temperature slightly below the equilibrium temperature. An analytical model is developed, which reproduces quite well the MC results, enabling a straightforward determination of the optimal set of transition probabilities. Application of both the MC and analytical model to conditions well away from equilibrium, giving rise to kinetic phase diagrams, shows that the effect of kinetics on segregation is even stronger than that predicted by previous models.

  6. Statistical tests for whether a given set of independent, identically distributed draws comes from a specified probability density.

    PubMed

    Tygert, Mark

    2010-09-21

    We discuss several tests for determining whether a given set of independent and identically distributed (i.i.d.) draws does not come from a specified probability density function. The most commonly used are Kolmogorov-Smirnov tests, particularly Kuiper's variant, which focus on discrepancies between the cumulative distribution function for the specified probability density and the empirical cumulative distribution function for the given set of i.i.d. draws. Unfortunately, variations in the probability density function often get smoothed over in the cumulative distribution function, making it difficult to detect discrepancies in regions where the probability density is small in comparison with its values in surrounding regions. We discuss tests without this deficiency, complementing the classical methods. The tests of the present paper are based on the plain fact that it is unlikely to draw a random number whose probability is small, provided that the draw is taken from the same distribution used in calculating the probability (thus, if we draw a random number whose probability is small, then we can be confident that we did not draw the number from the same distribution used in calculating the probability).
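For a unimodal symmetric density such as the standard normal, the probability of drawing a value whose density is at most f(x) has a closed form, which makes the idea easy to sketch. This one-dimensional Gaussian example is illustrative only; the paper's tests are more general:

```python
import math

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def density_tail_pvalue(x):
    """P( f(X) <= f(x) ) for X ~ N(0, 1). For a unimodal symmetric density,
    f(X) <= f(x) exactly when |X| >= |x|, so this is 2 * (1 - Phi(|x|)).
    A small value flags a draw that is unlikely under the assumed density."""
    return 2.0 * (1.0 - phi(abs(x)))

print(round(density_tail_pvalue(0.0), 4))   # 1.0 (the mode is never surprising)
print(round(density_tail_pvalue(3.0), 4))   # 0.0027 (a 3-sigma draw is unlikely)
```

Unlike a CDF-based Kolmogorov-Smirnov or Kuiper statistic, this statistic reacts directly to draws landing in low-density regions, which is the deficiency of the classical tests that the paper targets.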

  7. Volatiles Inventory to the Inner Planets Due to Small Bodies Migration

    NASA Technical Reports Server (NTRS)

    Marov, M. Y.; Ipatov, S. I.

    2003-01-01

    Concurrent processes of endogenous and exogenous origin are assumed to be responsible for the volatile reserves in the terrestrial planets. Volatiles inventory through collisions is rooted in the orbital dynamics of small bodies including near-Earth objects (NEOs), short- and long-period comets, and trans-Neptunian objects (TNOs), the latter probably supplying a large number of Jupiter-crossing objects (JCOs). Our model indicates that even a relatively small fraction (approx. 0.001) of JCOs which transit to orbits with aphelia inside Jupiter's orbit (Q<4.7 AU) and reside in such orbits for more than 1 Myr may contribute significantly to collisions with the terrestrial planets. The total mass of volatiles delivered to the Earth from the feeding zone of the giant planets could be greater than the mass of the Earth's oceans.

  8. Methodological approach for substantiating disease freedom in a heterogeneous small population. Application to ovine scrapie, a disease with a strong genetic susceptibility.

    PubMed

    Martinez, Marie-José; Durand, Benoit; Calavas, Didier; Ducrot, Christian

    2010-06-01

    Demonstrating disease freedom is becoming important in different fields including animal disease control. Most methods consider sampling only from a homogeneous population in which each animal has the same probability of becoming infected. In this paper, we propose a new methodology to calculate the probability of detecting the disease if it is present in a heterogeneous population of small size with potentially different risk groups, differences in risk being defined using relative risks. To calculate this probability, for each possible arrangement of the infected animals in the different groups, the probability that all the animals tested are test-negative given this arrangement is multiplied by the probability that this arrangement occurs. The probability formula is developed using the assumption of a perfect test and hypergeometric sampling for finite small size populations. The methodology is applied to scrapie, a disease affecting small ruminants and characterized in sheep by a strong genetic susceptibility defining different risk groups. It illustrates that the genotypes of the tested animals heavily influence the confidence level of detecting scrapie. The results present the statistical power for substantiating disease freedom in a small heterogeneous population as a function of the design prevalence, the structure of the sample tested, the structure of the herd and the associated relative risks. (c) 2010 Elsevier B.V. All rights reserved.
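The homogeneous baseline case that the paper generalizes can be written down directly: with a perfect test and hypergeometric sampling, the chance that n tested animals are all negative when d of N are infected is C(N-d, n)/C(N, n), so the detection probability is one minus that. A minimal sketch (the paper's contribution is summing this over arrangements across risk groups, which is omitted here):

```python
from math import comb

def detection_probability(N, d, n):
    """Probability that testing n of N animals finds at least one of the
    d infected animals, assuming a perfect test and hypergeometric sampling
    in a homogeneous population (the baseline the paper extends)."""
    if d > N - n:   # not enough healthy animals to fill the sample
        return 1.0
    return 1.0 - comb(N - d, n) / comb(N, n)

# Herd of 100 at a design prevalence of 5% (d = 5), testing 30 animals:
print(round(detection_probability(100, 5, 30), 3))   # 0.839
```

In the heterogeneous setting, the genotype composition of the tested sample changes the effective d per risk group, which is why the abstract stresses that genotypes of tested animals drive the achievable confidence level.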

  9. Prevalence and co-occurrence of addictive behaviors among former alternative high school youth: A longitudinal follow-up study.

    PubMed

    Sussman, Steve; Pokhrel, Pallav; Sun, Ping; Rohrbach, Louise A; Spruijt-Metz, Donna

    2015-09-01

    Recent work has studied addictions using a matrix measure, which taps multiple addictions through single responses for each type. This is the first longitudinal study using a matrix measure. We investigated the use of this approach among former alternative high school youth (average age = 19.8 years at baseline; longitudinal n = 538) at risk for addictions. Lifetime and last 30-day prevalence of one or more of 11 addictions reviewed in other work was the primary focus (i.e., cigarettes, alcohol, hard drugs, shopping, gambling, Internet, love, sex, eating, work, and exercise). These were examined at two time-points one year apart. Latent class and latent transition analyses (LCA and LTA) were conducted in Mplus. Prevalence rates were stable across the two time-points. As in the cross-sectional baseline analysis, the 2-class model (addiction class, non-addiction class) fit the data better at follow-up than models with more classes. Item-response or conditional probabilities for each addiction type did not differ between time-points. As a result, the LTA model utilized constrained the conditional probabilities to be equal across the two time-points. In the addiction class, larger conditional probabilities (i.e., 0.40-0.49) were found for love, sex, exercise, and work addictions; medium conditional probabilities (i.e., 0.17-0.27) were found for cigarette, alcohol, other drugs, eating, Internet and shopping addiction; and a small conditional probability (0.06) was found for gambling. Persons in an addiction class tend to remain in this addiction class over a one-year period.

  10. Small area estimation for semicontinuous data.

    PubMed

    Chandra, Hukum; Chambers, Ray

    2016-03-01

    Survey data often contain measurements for variables that are semicontinuous in nature, i.e. they either take a single fixed value (we assume this is zero) or they have a continuous, often skewed, distribution on the positive real line. Standard methods for small area estimation (SAE) based on the use of linear mixed models can be inefficient for such variables. We discuss SAE techniques for semicontinuous variables under a two part random effects model that allows for the presence of excess zeros as well as the skewed nature of the nonzero values of the response variable. In particular, we first model the excess zeros via a generalized linear mixed model fitted to the probability of a nonzero, i.e. strictly positive, value being observed, and then model the response, given that it is strictly positive, using a linear mixed model fitted on the logarithmic scale. Empirical results suggest that the proposed method leads to efficient small area estimates for semicontinuous data of this type. We also propose a parametric bootstrap method to estimate the MSE of the proposed small area estimator. These bootstrap estimates of the MSE are compared to the true MSE in a simulation study. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
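The two-part decomposition can be sketched without the mixed-model machinery: estimate P(y > 0), fit the positive part on the logarithmic scale, and combine the two for a predicted mean. The area-level random effects of the actual small area method are deliberately omitted in this illustration:

```python
import math
import statistics

def two_part_mean(y):
    """Naive two-part estimate of E[y] for semicontinuous data:
    P(y > 0) times the lognormal mean of the positive part. The paper's
    SAE version replaces both parts with mixed models (logistic GLMM for
    the zero part, LMM on the log scale for the positive part); this
    sketch shows only the decomposition itself."""
    positives = [v for v in y if v > 0]
    p = len(positives) / len(y)                 # part 1: probability of a nonzero
    logs = [math.log(v) for v in positives]     # part 2: positive values, log scale
    mu = statistics.mean(logs)
    sigma2 = statistics.pvariance(logs)
    return p * math.exp(mu + 0.5 * sigma2)      # lognormal back-transform

y = [0.0, 0.0, 0.0, 1.0, 2.0, 4.0, 8.0]   # excess zeros plus skewed positives
print(round(two_part_mean(y), 3))
```

Splitting the model this way is what lets each part be fitted with an appropriate link and error structure instead of forcing a single linear mixed model onto a zero-inflated, skewed response.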

  11. Modeling the probability distribution of peak discharge for infiltrating hillslopes

    NASA Astrophysics Data System (ADS)

    Baiamonte, Giorgio; Singh, Vijay P.

    2017-07-01

    Hillslope response plays a fundamental role in the prediction of peak discharge at the basin outlet. The peak discharge for the critical duration of rainfall and its probability distribution are needed for designing urban infrastructure facilities. This study derives the probability distribution, denoted as the GABS model, by coupling three models: (1) the Green-Ampt model for computing infiltration, (2) the kinematic wave model for computing the discharge hydrograph from the hillslope, and (3) the intensity-duration-frequency (IDF) model for computing design rainfall intensity. The Hortonian mechanism for runoff generation is employed for computing the surface runoff hydrograph. Since the antecedent soil moisture condition (ASMC) significantly affects the rate of infiltration, its effect on the probability distribution of peak discharge is investigated. Application to a watershed in Sicily, Italy, shows that as the probability increases, the expected effect of ASMC on increasing the maximum discharge diminishes. Only for low values of probability is the critical duration of rainfall influenced by ASMC, whereas its effect on peak discharge appears small at any probability. For a set of parameters, the derived probability distribution of peak discharge seems to be well fitted by the gamma distribution. Finally, the GABS model was applied to a small watershed to test the possibility of preparing rational runoff coefficient tables in advance for use with the rational method, and peak discharges obtained by the GABS model were compared with those measured in an experimental flume for a loamy-sand soil.
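The Green-Ampt component of the coupling solves an implicit equation for cumulative infiltration, F = K t + ψΔθ ln(1 + F/(ψΔθ)). A minimal fixed-point sketch with illustrative soil parameters (not taken from the Sicily application):

```python
import math

def green_ampt_F(t, K, psi_dtheta, tol=1e-10):
    """Cumulative infiltration F(t) from the implicit Green-Ampt equation
    F = K*t + psi_dtheta * ln(1 + F / psi_dtheta), solved by fixed-point
    iteration. K: saturated hydraulic conductivity (cm/h); psi_dtheta:
    wetting-front suction head times moisture deficit (cm), the quantity
    through which antecedent soil moisture (ASMC) enters."""
    F = max(K * t, tol)
    for _ in range(200):
        F_new = K * t + psi_dtheta * math.log(1.0 + F / psi_dtheta)
        if abs(F_new - F) < tol:
            break
        F = F_new
    return F

# Illustrative parameters: K = 0.65 cm/h, psi*dtheta = 5 cm, t = 2 h
F = green_ampt_F(t=2.0, K=0.65, psi_dtheta=5.0)
f = 0.65 * (1.0 + 5.0 / F)   # infiltration rate f = K * (1 + psi_dtheta / F)
print(round(F, 3), round(f, 3))
```

A wetter antecedent state shrinks the moisture deficit Δθ, reducing ψΔθ and hence early infiltration, which is the mechanism by which ASMC shifts the derived peak-discharge distribution.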

  12. Evaluation and statistical judgement of neural responses to sinusoidal stimulation in cases with superimposed drift and noise.

    PubMed

    Jastreboff, P W

    1979-06-01

    Time histograms of neural responses evoked by sinusoidal stimulation often contain a slow drift and irregular noise which disturb Fourier analysis of these responses. Section 2 of this paper evaluates the extent to which a linear drift influences the Fourier analysis, and develops a combined Fourier and linear regression analysis for detecting and correcting for such a drift. The usefulness of this correction method is demonstrated for time histograms of actual eye movements and Purkinje cell discharges evoked by sinusoidal rotation of rabbits in the horizontal plane. In Sect. 3, the analysis of variance is adopted for estimating the probability of the random occurrence of the response curve extracted by Fourier analysis from noise. This method proved useful for avoiding false judgements as to whether the response curve was meaningful, particularly when the response was small relative to the contaminating noise.
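A combined Fourier and linear regression analysis of this kind amounts to fitting the drift and the sinusoid jointly by least squares, rather than Fourier-analyzing a drift-contaminated histogram. A sketch on synthetic noiseless data (the model form y = a + b·t + c·cos(ωt) + d·sin(ωt) is an assumption here, chosen to match a linear drift plus a response at the stimulation frequency):

```python
import numpy as np

def fit_sine_with_drift(t, y, omega):
    """Joint least-squares fit of y ~ a + b*t + c*cos(wt) + d*sin(wt).
    Fitting drift and sinusoid together removes the bias a linear drift
    would otherwise induce in the Fourier components."""
    X = np.column_stack([np.ones_like(t), t, np.cos(omega * t), np.sin(omega * t)])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    a, b, c, d = coef
    return a, b, np.hypot(c, d), np.arctan2(d, c)   # offset, slope, amplitude, phase

# Synthetic histogram: known drift (slope 0.5) plus response (amplitude 2)
t = np.linspace(0.0, 10.0, 500)
y = 3.0 + 0.5 * t + 2.0 * np.sin(2 * np.pi * 0.5 * t)
a, b, amp, ph = fit_sine_with_drift(t, y, omega=2 * np.pi * 0.5)
print(round(float(b), 3), round(float(amp), 3))   # 0.5 2.0
```

Because the drift column and the sinusoid columns are fitted simultaneously, the recovered amplitude and phase are unbiased by the drift, which is the property the paper demonstrates on eye-movement and Purkinje cell histograms.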

  13. Probability theory for 3-layer remote sensing in ideal gas law environment.

    PubMed

    Ben-David, Avishai; Davidson, Charles E

    2013-08-26

    We extend the probability model for 3-layer radiative transfer [Opt. Express 20, 10004 (2012)] to ideal gas conditions where a correlation exists between transmission and temperature of each of the 3 layers. The effect on the probability density function for the at-sensor radiances is surprisingly small, and thus the added complexity of addressing the correlation can be avoided. The small overall effect is due to (a) small perturbations by the correlation on variance population parameters and (b) cancellation of perturbation terms that appear with opposite signs in the model moment expressions.

  14. Water resources and potential effects of surface coal mining in the area of the Woodson Preference Right Lease Application, Montana

    USGS Publications Warehouse

    Cannon, M.R.

    1987-01-01

    Federal coal lands of the Woodson Preference Right Lease Application are located in Dawson and Richland Counties, northeastern Montana. A probable mine area, comprising the lease area and adjacent coal lands, contains about 220 million tons of recoverable lignite coal in the 12-37 ft thick Pust coal bed. A hydrologic study has been conducted in the area to describe the water resources and to evaluate potential effects of coal mining on the water resources. Geohydrologic data collected from wells and springs indicate that several aquifers exist in the area. Sandstone beds in the Tongue River Member of the Fort Union Formation (Paleocene age) are the most common aquifers and probably underlie the entire area. The Pust coal bed in the Tongue River Member is water saturated in parts of the probable mine area and dry in others. Other aquifers, located mostly outside of the probable mine area, exist in gravel of the Flaxville Formation (Miocene or Pliocene age) and valley alluvium (Pleistocene and Holocene age). Chemical analyses of groundwater indicate a range in dissolved solids concentration of 240-2,280 mg/L. Surface water resources are limited. Most streams in the area are ephemeral and flow only in response to rainfall or snowmelt. Small reaches of the North and Middle Forks of Burns Creek have intermittent flow. Water sampled from a small perennial reach of the Middle Fork had a dissolved solids concentration of 700 mg/L. Mining of the Pust coal bed would destroy one spring and four stock wells, dewater areas of the Pust coal and sandstone aquifers, and probably lower water levels in seven stock and domestic wells. Mining in the valley of Middle Fork Burns Creek would intercept streamflow and alter flow characteristics of a small perennial reach of stream. Leaching of soluble minerals from mine spoils may cause a long-term degradation of the quality of water in the spoils and in aquifers downgradient from the spoils.
Some of the effects on local water supplies could be mitigated by development of new wells in deeper sandstones of the Tongue River Member. Effects of mining on water resources would be minimized if only areas of dry coal were mined. (Author's abstract)

  15. Web-Based Real Time Earthquake Forecasting and Personal Risk Management

    NASA Astrophysics Data System (ADS)

    Rundle, J. B.; Holliday, J. R.; Graves, W. R.; Turcotte, D. L.; Donnellan, A.

    2012-12-01

    Earthquake forecasts have been computed by a variety of countries and economies world-wide for over two decades. For the most part, forecasts have been computed for insurance, reinsurance and underwriters of catastrophe bonds. One example is the Working Group on California Earthquake Probabilities that has been responsible for the official California earthquake forecast since 1988. However, in a time of increasingly severe global financial constraints, we are now moving inexorably towards personal risk management, wherein mitigating risk is becoming the responsibility of individual members of the public. Under these circumstances, open access to a variety of web-based tools, utilities and information is a necessity. Here we describe a web-based system that has been operational since 2009 at www.openhazards.com and www.quakesim.org. Models for earthquake physics and forecasting require input data, along with model parameters. The models we consider are the Natural Time Weibull (NTW) model for regional earthquake forecasting, together with models for activation and quiescence. These models use small earthquakes ("seismicity-based models") to forecast the occurrence of large earthquakes, either through varying rates of small earthquake activity, or via an accumulation of this activity over time. These approaches use data-mining algorithms combined with the ANSS earthquake catalog. The basic idea is to compute large earthquake probabilities using the number of small earthquakes that have occurred in a region since the last large earthquake. Each of these approaches has computational challenges associated with computing forecast information in real time. Using 25 years of data from the ANSS California-Nevada catalog of earthquakes, we show that real-time forecasting is possible at a grid scale of 0.1°. We have analyzed the performance of these models using Reliability/Attributes and standard Receiver Operating Characteristic (ROC) tests.
We show how the Reliability and ROC tests allow us to judge data completeness and estimate error. It is clear from much of the analysis that data quality is a major limitation on the accurate computation of earthquake probabilities. We discuss the challenges and pitfalls in serving up these datasets over the web.

  16. Small and large wetland fragments are equally suited breeding sites for a ground-nesting passerine.

    PubMed

    Pasinelli, Gilberto; Mayer, Christian; Gouskov, Alexandre; Schiegg, Karin

    2008-06-01

    Large habitat fragments are generally thought to host more species and to offer more diverse and/or better quality habitats than small fragments. However, the importance of small fragments for population dynamics in general and for reproductive performance in particular is highly controversial. Using an information-theoretic approach, we examined reproductive performance and probability of local recruitment of color-banded reed buntings Emberiza schoeniclus in relation to the size of 18 wetland fragments in northeastern Switzerland over 4 years. We also investigated if reproductive performance and recruitment probability were density-dependent. None of the four measures of reproductive performance (laying date, nest failure probability, fledgling production per territory, fledgling condition) nor recruitment probability were found to be related to wetland fragment size. In terms of fledgling production, however, fragment size interacted with year, indicating that small fragments were better reproductive grounds in some years than large fragments. Reproductive performance and recruitment probability were not density-dependent. Our results suggest that small fragments are equally suited as breeding grounds for the reed bunting as large fragments and should therefore be managed to provide a habitat for this and other specialists occurring in the same habitat. Moreover, large fragments may represent sinks in specific years because a substantial percentage of all breeding pairs in our study area breed in large fragments, and reproductive failure in these fragments due to the regularly occurring floods may have a much stronger impact on regional population dynamics than comparable events in small fragments.

  17. Prevalence and co-occurrence of addictive behaviors among former alternative high school youth: A longitudinal follow-up study

    PubMed Central

    Sussman, Steve; Pokhrel, Pallav; Sun, Ping; Rohrbach, Louise A.; Spruijt-Metz, Donna

    2015-01-01

    Background and Aims Recent work has studied addictions using a matrix measure, which taps multiple addictions through single responses for each type. This is the first longitudinal study using a matrix measure. Methods We investigated the use of this approach among former alternative high school youth (average age = 19.8 years at baseline; longitudinal n = 538) at risk for addictions. Lifetime and last 30-day prevalence of one or more of 11 addictions reviewed in other work was the primary focus (i.e., cigarettes, alcohol, hard drugs, shopping, gambling, Internet, love, sex, eating, work, and exercise). These were examined at two time-points one year apart. Latent class and latent transition analyses (LCA and LTA) were conducted in Mplus. Results Prevalence rates were stable across the two time-points. As in the cross-sectional baseline analysis, the 2-class model (addiction class, non-addiction class) fit the data better at follow-up than models with more classes. Item-response or conditional probabilities for each addiction type did not differ between time-points. As a result, the LTA model utilized constrained the conditional probabilities to be equal across the two time-points. In the addiction class, larger conditional probabilities (i.e., 0.40−0.49) were found for love, sex, exercise, and work addictions; medium conditional probabilities (i.e., 0.17−0.27) were found for cigarette, alcohol, other drugs, eating, Internet and shopping addiction; and a small conditional probability (0.06) was found for gambling. Discussion and Conclusions Persons in an addiction class tend to remain in this addiction class over a one-year period. PMID:26551909

  18. Low-energy isovector and isoscalar dipole response in neutron-rich nuclei

    NASA Astrophysics Data System (ADS)

    Vretenar, D.; Niu, Y. F.; Paar, N.; Meng, J.

    2012-04-01

    The self-consistent random-phase approximation, based on the framework of relativistic energy density functionals, is employed in the study of isovector and isoscalar dipole response in 68Ni,132Sn, and 208Pb. The evolution of pygmy dipole states (PDSs) in the region of low excitation energies is analyzed as a function of the density dependence of the symmetry energy for a set of relativistic effective interactions. The occurrence of PDSs is predicted in the response to both the isovector and the isoscalar dipole operators, and its strength is enhanced with the increase in the symmetry energy at saturation and the slope of the symmetry energy. In both channels, the PDS exhausts a relatively small fraction of the energy-weighted sum rule but a much larger percentage of the inverse energy-weighted sum rule. For the isovector dipole operator, the reduced transition probability B(E1) of the PDSs is generally small because of pronounced cancellation of neutron and proton partial contributions. The isoscalar-reduced transition amplitude is predominantly determined by neutron particle-hole configurations, most of which add coherently, and this results in a collective response of the PDSs to the isoscalar dipole operator.

  19. Comparisons of estimates of annual exceedance-probability discharges for small drainage basins in Iowa, based on data through water year 2013 : [summary].

    DOT National Transportation Integrated Search

    2015-01-01

    Traditionally, the Iowa DOT has used the Iowa Runoff Chart and single-variable regional regression equations (RREs) from a USGS report : (published in 1987) as the primary methods to estimate annual exceedance-probability discharge : (AEPD) for small...

  20. Dealing with non-unique and non-monotonic response in particle sizing instruments

    NASA Astrophysics Data System (ADS)

    Rosenberg, Phil

    2017-04-01

    A number of instruments used as de-facto standards for measuring particle size distributions are actually incapable of uniquely determining the size of an individual particle. This is due to non-unique or non-monotonic response functions. Optical particle counters have non-monotonic response due to oscillations in the Mie response curves, especially for large aerosol and small cloud droplets. Scanning mobility particle sizers respond identically to two particles where the ratio of particle size to particle charge is approximately the same. Images of two differently sized cloud or precipitation particles taken by an optical array probe can have similar dimensions or shadowed areas depending upon where they are in the imaging plane. A number of methods exist to deal with these issues, including assuming that positive and negative errors cancel, smoothing response curves, integrating regions in measurement space before conversion to size space, and matrix inversion. Matrix inversion (also called kernel inversion) has the advantage that it determines the size distribution which best matches the observations, given specific information about the instrument (a matrix which specifies the probability that a particle of a given size will be measured in a given instrument size bin). In this way it maximises use of the information in the measurements. However, this technique can be confused by poor counting statistics, which can cause erroneous results and negative concentrations. Also, an effective method for propagating uncertainties is yet to be published or routinely implemented. Here we present a new alternative which overcomes these issues. We use Bayesian methods to determine the probability that a given size distribution is correct given a set of instrument data, and then we use Markov chain Monte Carlo methods to sample this many-dimensional probability distribution function to determine the expectation and (co)variances, hence providing a best guess and an uncertainty for the size distribution that includes contributions from the non-unique response curve and counting statistics, and can propagate calibration uncertainties.
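The kernel-inversion-by-MCMC idea can be illustrated with a toy Metropolis sampler. Everything numeric here is invented for illustration: the 3x3 kernel matrix, the observed counts, the step size, and the flat prior are all hypothetical, and a real retrieval would tune the proposal and check convergence.

```python
import math
import random

# Hypothetical 3-bin instrument: K[i][j] = probability that a particle of
# true size class j is registered in instrument bin i (non-unique response).
K = [[0.7, 0.2, 0.0],
     [0.3, 0.6, 0.3],
     [0.0, 0.2, 0.7]]
counts = [35, 55, 30]  # observed counts per instrument bin (made up)

def log_post(n):
    """Poisson log-likelihood of the observed counts given true per-class
    counts n, with a flat prior restricted to n >= 0."""
    if any(x < 0 for x in n):
        return -math.inf
    lp = 0.0
    for i, c in enumerate(counts):
        lam = sum(K[i][j] * n[j] for j in range(3)) + 1e-9
        lp += c * math.log(lam) - lam  # Poisson log-pmf up to a constant
    return lp

def metropolis(steps=20000, step=5.0, seed=1):
    rng = random.Random(seed)
    n = [40.0, 40.0, 40.0]
    lp = log_post(n)
    kept = []
    for t in range(steps):
        prop = [x + rng.gauss(0.0, step) for x in n]
        lp_prop = log_post(prop)
        # standard Metropolis accept/reject on the log-posterior ratio
        if rng.random() < math.exp(min(0.0, lp_prop - lp)):
            n, lp = prop, lp_prop
        if t >= steps // 2:  # discard the first half as burn-in
            kept.append(list(n))
    means = [sum(s[j] for s in kept) / len(kept) for j in range(3)]
    return means, kept
```

The per-coordinate means give the best-guess size distribution; the spread of `kept` along and between coordinates gives the variances and covariances, i.e. the uncertainty the abstract describes.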

  1. Nutritional supplementation in community-dwelling elderly people.

    PubMed

    Mucci, Elena; Jackson, S H D

    2008-01-01

    There is a large evidence base for nutritional intervention in acutely ill and post-operative hospitalised patients, but the evidence base for nursing home (NH) residents is small. The prevalence of poor nutrition in NHs is high, and baseline nutrition appears to be an important determinant of response to nutritional intervention. Residents with mini nutritional assessment (MNA) scores above 23.5 tend to show less response than those with lower scores. This relates in part to failure to increase intake in the better nourished as well as to actual response to increased intake. At the low end of the MNA spectrum, the increasing prevalence of multiple pathologies tends to result in a reduced response, but randomised controlled studies in this group are probably not ethical. Most studies have tended to investigate the intermediate group with MNA scores of 17-23.5 or equivalent using other scales. Interventions have usually resulted in increased intake of calories and micronutrients. Other end points have variously shown responses including weight, immunological measures, infection rates, decubitus ulcers, falls and fracture rates. Many studies have been too small to demonstrate benefit and some are likely to have suffered from type I errors - showing benefit by chance. Poorly quantifiable variables likely to be of importance include the local environment and catering as well as pathophysiological variability. Copyright 2008 S. Karger AG, Basel.

  2. Estimating the probability that the sample mean is within a desired fraction of the standard deviation of the true mean.

    PubMed

    Schillaci, Michael A; Schillaci, Mario E

    2009-02-01

    The use of small sample sizes in human and primate evolutionary research is commonplace. Estimating how well small samples represent the underlying population, however, is not commonplace. Because the accuracy of determinations of taxonomy, phylogeny, and evolutionary process are dependent upon how well the study sample represents the population of interest, characterizing the uncertainty, or potential error, associated with analyses of small sample sizes is essential. We present a method for estimating the probability that the sample mean is within a desired fraction of the standard deviation of the true mean using small (n < 10) or very small (n ≤ 5) sample sizes. This method can be used by researchers to determine post hoc the probability that their sample is a meaningful approximation of the population parameter. We tested the method using a large craniometric data set commonly used by researchers in the field. Given our results, we suggest that sample estimates of the population mean can be reasonable and meaningful even when based on small, and perhaps even very small, sample sizes.
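For the special case of a normally distributed sample mean with known sigma, the probability in question has a closed form, P = 2*Phi(f*sqrt(n)) - 1 (a simplifying sketch; the paper's method must also handle sigma estimated from the sample, which calls for the t-distribution instead):

```python
import math

def prob_within(f, n):
    """P(|sample mean - true mean| < f * sigma), assuming the sample mean
    is normal with standard error sigma / sqrt(n) and sigma is known.
    Equals 2 * Phi(f * sqrt(n)) - 1, written via the error function."""
    return math.erf(f * math.sqrt(n) / math.sqrt(2))
```

With f = 0.5, the probability rises from roughly 0.74 at n = 5 to roughly 0.89 at n = 10, which illustrates why even very small samples can be meaningful approximations of the population mean.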

  3. Ultrastructural and functional characterization of circulating hemocytes from the freshwater crayfish Astacus leptodactylus: cell types and their role after in vivo artificial non-self challenge.

    PubMed

    Giulianini, Piero Giulio; Bierti, Manuel; Lorenzon, Simonetta; Battistella, Silvia; Ferrero, Enrico Antonio

    2007-01-01

    The freshwater crayfish Astacus leptodactylus (Eschscholtz, 1823) is an important aquacultured decapod species as well as an invasive species in some European countries. In the current investigation we characterized the different classes of circulating blood cells in A. leptodactylus by means of light and electron microscopy analysis, and we explored their reaction to different latex bead particles in vivo by total and differential cell counts at 0.5, 1, 2 and 4 h after injections. We identified hemocytes by granule size morphometry as hyaline hemocytes with no or rare tiny granules, small granule hemocytes, unimodal medium diameter granule hemocytes, and hemocytes containing both small and large granules. The latter granular hemocytes showed the strongest phenoloxidase l-DOPA reactivity both in granules and cytoplasm. A. leptodactylus responds to foreign particles with strong cellular immune responses. All treatments elicited a total hemocyte increase with a conspicuous recruitment of large granule containing hemocytes. All hemocyte types mounted some phagocytic response, but the small granule hemocytes were the only ones involved in phagocytic response to all foreign particles, with the highest percentages. These results (1) depict the variability in decapod hemocyte functional morphology; (2) identify the small granule hemocyte as the major phagocytic cell; (3) suggest that the rather rapid recruitment of large granule hemocytes in all treatments indicates a relevant role for this hemocyte type in defense against foreign particles, probably in nodule formation.

  4. Use of a Latent Topic Model for Characteristic Extraction from Health Checkup Questionnaire Data.

    PubMed

    Hatakeyama, Y; Miyano, I; Kataoka, H; Nakajima, N; Watabe, T; Yasuda, N; Okuhara, Y

    2015-01-01

    When patients complete questionnaires during health checkups, many of their responses are subjective, making topic extraction difficult. Therefore, the purpose of this study was to develop a model capable of extracting appropriate topics from subjective data in questionnaires conducted during health checkups. We employed a latent topic model to group the lifestyle habits of the study participants and represented their responses to items on health checkup questionnaires as a probability model. For the probability model, we used latent Dirichlet allocation to extract 30 topics from the questionnaires. According to the model parameters, a total of 4381 study participants were then divided into groups based on these topics. Results from laboratory tests, including blood glucose level, triglycerides, and estimated glomerular filtration rate, were compared between each group, and these results were then compared with those obtained by hierarchical clustering. If a significant (p < 0.05) difference was observed in any of the laboratory measurements between groups, it was considered to indicate a questionnaire response pattern corresponding to the value of the test result. A comparison between the latent topic model and hierarchical clustering grouping revealed that, in the latent topic model method, a small group of participants who reported having subjective signs of urinary disorder were allocated to a single group. The latent topic model is useful for extracting characteristics from a small number of groups from questionnaires with a large number of items. These results show that, in addition to chief complaints and history of past illness, questionnaire data obtained during medical checkups can serve as useful judgment criteria for assessing the conditions of patients.

  5. A rational decision rule with extreme events.

    PubMed

    Basili, Marcello

    2006-12-01

    Risks induced by extreme events are characterized by small or ambiguous probabilities, catastrophic losses, or windfall gains. Through a new functional that mimics the restricted Bayes-Hurwicz criterion within the Choquet expected utility approach, it is possible to represent the decision-maker's behavior when facing both risky (large and reliable probability) and extreme (small or ambiguous probability) events. A new formalization of the precautionary principle (PP) is shown, and a new functional, which encompasses both extreme outcomes and the expectation of all the possible results for every act, is claimed.

  6. [Analysis of clinical phenotype and GCH1 gene mutations in a family affected with dopa-responsive dystonia].

    PubMed

    Yan, Yaping; Chen, Xiaohong; Luo, Wei

    2017-04-10

    To explore the genetic mutations and clinical features of a pedigree affected with dopa-responsive dystonia. PCR and Sanger sequencing were applied to detect mutations of the GCH1 gene among 7 members from the pedigree. A known heterozygous mutation of the GCH1 gene (c.550C>T) was detected in the family. Among the 7 members from the pedigree, the age of onset ranged from 13 to 60 years. The mother of the proband carried the same mutation but was still healthy at 80. The symptoms of the other three patients were in slow progression, with diurnal fluctuation which could be improved by sleep, dystonia of the lower limbs, and tremor of both hands. Treatment with a small dose of levodopa resulted in significant improvement of clinical symptoms. By database analysis, the c.550C>T mutation was predicted to be probably pathogenic. The c.550C>T mutation probably underlies the disease in this pedigree. The clinical phenotypes of family members may vary with their ages of onset. Some may even be symptom free.

  7. Probability Theory Plus Noise: Descriptive Estimation and Inferential Judgment.

    PubMed

    Costello, Fintan; Watts, Paul

    2018-01-01

    We describe a computational model of two central aspects of people's probabilistic reasoning: descriptive probability estimation and inferential probability judgment. This model assumes that people's reasoning follows standard frequentist probability theory, but it is subject to random noise. This random noise has a regressive effect in descriptive probability estimation, moving probability estimates away from normative probabilities and toward the center of the probability scale. This random noise has an anti-regressive effect in inferential judgment, however. These regressive and anti-regressive effects explain various reliable and systematic biases seen in people's descriptive probability estimation and inferential probability judgment. This model predicts that these contrary effects will tend to cancel out in tasks that involve both descriptive estimation and inferential judgment, leading to unbiased responses in those tasks. We test this model by applying it to one such task, described by Gallistel et al. Participants' median responses in this task were unbiased, agreeing with normative probability theory over the full range of responses. Our model captures the pattern of unbiased responses in this task, while simultaneously explaining systematic biases away from normatively correct probabilities seen in other tasks. Copyright © 2018 Cognitive Science Society, Inc.
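The regressive, descriptive-estimation half of such a noise model can be simulated directly: each sampled event is misread with some probability d, so mean estimates shrink toward 0.5 as (1 - 2d)p + d. This is a sketch with illustrative values of p and d; the anti-regressive inferential side of the model is not shown.

```python
import random

def noisy_estimate(p, d, n_samples=100_000, seed=0):
    """Simulate descriptive probability estimation under a
    probability-theory-plus-noise model: each sampled event is
    misread (flipped) with probability d."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_samples):
        event = rng.random() < p       # true occurrence of the event
        if rng.random() < d:           # random read-out noise flips it
            event = not event
        hits += event
    return hits / n_samples

def expected_estimate(p, d):
    # analytic mean of the noisy estimate: (1 - 2d) * p + d,
    # which regresses toward 0.5 for any d in (0, 0.5)
    return (1 - 2 * d) * p + d
```

With d = 0.25, a true probability of 0.1 is estimated near 0.3 and a true probability of 0.9 near 0.7: small probabilities are overestimated and large ones underestimated, as the abstract describes.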

  8. Water level dynamics in wetlands and nesting success of Black Terns in Maine

    USGS Publications Warehouse

    Gilbert, A.T.; Servello, F.A.

    2005-01-01

    The Black Tern (Chlidonias niger) nests in freshwater wetlands that are prone to water level fluctuations, and nest losses to flooding are common. We examined temporal patterns in water levels at six sites with Black Tern colonies in Maine and determined probabilities of flood events and associated nest loss at Douglas Pond, the location of the largest breeding colony. Daily precipitation data from weather stations and water flow data from a flow gauge below Douglas Pond were obtained for 1960-1999. Information on nest losses from three floods at Douglas Pond in 1997-1999 was used to characterize small (6% nest loss), medium (56% nest loss) and large (94% nest loss) flood events, and we calculated probabilities of these three levels of flooding occurring at Douglas Pond using historic water-level data. Water levels generally decreased gradually during the nesting season at colony sites, except at Douglas Pond where water levels fluctuated substantially in response to rain events. Annual probabilities of small, medium, and large flood events were 68%, 35%, and 13% for nests initiated during 23 May-12 July, with similar probabilities for early (23 May-12 June) and late (13 June-12 July) periods. An index of potential nest loss indicated that medium floods at Douglas Pond had the greatest potential effect on nest success because they occurred relatively frequently and inundated large proportions of nests. Nest losses at other colonies were estimated to be approximately 30% of those at Douglas Pond. Nest losses to flooding appear to be common for the Black Tern in Maine and related to spring precipitation patterns, but ultimate effects on breeding productivity are uncertain.

  9. Optimal Information Processing in Biochemical Networks

    NASA Astrophysics Data System (ADS)

    Wiggins, Chris

    2012-02-01

    A variety of experimental results over the past decades provide examples of near-optimal information processing in biological networks, including in biochemical and transcriptional regulatory networks. Computing information-theoretic quantities requires first choosing or computing the joint probability distribution describing multiple nodes in such a network --- for example, representing the probability distribution of finding an integer copy number of each of two interacting reactants or gene products while respecting the `intrinsic' small copy number noise constraining information transmission at the scale of the cell. I'll give an overview of some recent analytic and numerical work facilitating calculation of such joint distributions and the associated information, which in turn makes possible numerical optimization of information flow in models of noisy regulatory and biochemical networks. Illustrating cases include quantification of form-function relations, ideal design of regulatory cascades, and response to oscillatory driving.

  10. Stochastic resonance enhancement of small-world neural networks by hybrid synapses and time delay

    NASA Astrophysics Data System (ADS)

    Yu, Haitao; Guo, Xinmeng; Wang, Jiang

    2017-01-01

    The synergistic effect of hybrid electrical-chemical synapses and information transmission delay on the stochastic response behavior in small-world neuronal networks is investigated. Numerical results show that the stochastic response behavior can be regulated by moderate noise intensity to track the rhythm of a subthreshold pacemaker, indicating the occurrence of stochastic resonance (SR) in the considered neural system. Inheriting the characteristics of the two types of synapses, electrical and chemical, neural networks with hybrid electrical-chemical synapses show greatly improved neuron communication. Particularly, chemical synapses are conducive to increasing the network detectability by lowering the resonance noise intensity, while information is better transmitted through the networks via electrical coupling. Moreover, time delay is able to enhance or destroy the periodic stochastic response behavior intermittently. In the time-delayed small-world neuronal networks, the introduction of electrical synapses can significantly improve the signal detection capability by widening the range of optimal noise intensity for the subthreshold signal, and the efficiency of SR is largely amplified in the case of pure chemical couplings. In addition, the stochastic response behavior is also profoundly influenced by the network topology. Increasing the rewiring probability in pure chemically coupled networks can always enhance the effect of SR, which is slightly influenced by information transmission delay. On the other hand, the capacity of information communication is robust to the network topology within the time-delayed neuronal systems including electrical couplings.

  11. Uncertainty plus Prior Equals Rational Bias: An Intuitive Bayesian Probability Weighting Function

    ERIC Educational Resources Information Center

    Fennell, John; Baddeley, Roland

    2012-01-01

    Empirical research has shown that when making choices based on probabilistic options, people behave as if they overestimate small probabilities, underestimate large probabilities, and treat positive and negative outcomes differently. These distortions have been modeled using a nonlinear probability weighting function, which is found in several…
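The nonlinear weighting function commonly used to model these distortions is Tversky and Kahneman's one-parameter form. A minimal sketch follows; gamma = 0.61 is their median estimate for gains, used here purely as an illustrative default.

```python
def weight(p, gamma=0.61):
    """Tversky-Kahneman probability weighting function:
    w(p) = p^g / (p^g + (1-p)^g)^(1/g).
    For gamma < 1 it overweights small probabilities and
    underweights large ones; gamma = 1 gives the identity."""
    num = p ** gamma
    return num / (num + (1 - p) ** gamma) ** (1 / gamma)
```

For example, weight(0.01) is about 0.055 (overweighted) while weight(0.99) is about 0.91 (underweighted), reproducing the inverse-S distortion the abstract describes.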

  12. Probabilistic Design of a Wind Tunnel Model to Match the Response of a Full-Scale Aircraft

    NASA Technical Reports Server (NTRS)

    Mason, Brian H.; Stroud, W. Jefferson; Krishnamurthy, T.; Spain, Charles V.; Naser, Ahmad S.

    2005-01-01

    An approach is presented for carrying out the reliability-based design of a plate-like wing that is part of a wind tunnel model. The goal is to design the wind tunnel model to match the stiffness characteristics of the wing box of a flight vehicle while satisfying strength-based risk/reliability requirements that prevent damage to the wind tunnel model and fixtures. The flight vehicle is a modified F/A-18 aircraft. The design problem is solved using reliability-based optimization techniques. The objective function to be minimized is the difference between the displacements of the wind tunnel model and the corresponding displacements of the flight vehicle. The design variables control the thickness distribution of the wind tunnel model. Displacements of the wind tunnel model change with the thickness distribution, while displacements of the flight vehicle are a set of fixed data. The only constraint imposed is that the probability of failure is less than a specified value. Failure is assumed to occur if the stress caused by aerodynamic pressure loading is greater than the specified strength allowable. Two uncertain quantities are considered: the allowable stress and the thickness distribution of the wind tunnel model. Reliability is calculated using Monte Carlo simulation with response surfaces that provide approximate values of stresses. The response surface equations are, in turn, computed from finite element analyses of the wind tunnel model at specified design points. Because the response surface approximations were fit over a small region centered about the current design, the response surfaces were refit periodically as the design variables changed. Coarse-grained parallelism was used to simultaneously perform multiple finite element analyses. 
Studies carried out in this paper demonstrate that this scheme of using moving response surfaces and coarse-grained computational parallelism reduces the execution time of the Monte Carlo simulation enough to make the design problem tractable. The results of the reliability-based designs performed in this paper show that large decreases in the probability of stress-based failure can be realized with only small sacrifices in the ability of the wind tunnel model to represent the displacements of the full-scale vehicle.
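The response-surface Monte Carlo idea can be sketched in miniature: replace an expensive stress analysis with a local quadratic fit around the current design, then run Monte Carlo over the uncertain thickness and allowable stress using only the cheap surrogate. Every number below (the stand-in stress model, means, and standard deviations) is invented for illustration and is not from the NASA study.

```python
import random

def stress_analysis(t):
    """Stand-in for a finite element analysis: plate bending stress
    falling off with thickness squared (illustrative model only)."""
    return 240.0 / t ** 2  # MPa, for thickness t in mm

def quad_surface(pts):
    """Exact quadratic through three (t, stress) design points (Lagrange
    form), mimicking a response surface fit over a small local region."""
    (x0, y0), (x1, y1), (x2, y2) = pts
    def s(t):
        return (y0 * (t - x1) * (t - x2) / ((x0 - x1) * (x0 - x2))
              + y1 * (t - x0) * (t - x2) / ((x1 - x0) * (x1 - x2))
              + y2 * (t - x0) * (t - x1) / ((x2 - x0) * (x2 - x1)))
    return s

def failure_probability(t_mean=2.0, t_sd=0.05, allow_mean=80.0,
                        allow_sd=5.0, n=100_000, seed=7):
    # Three "FEA" runs centered on the current design build the surface;
    # the Monte Carlo loop then reuses it instead of re-running analyses.
    pts = [(t, stress_analysis(t)) for t in (1.9, 2.0, 2.1)]
    surface = quad_surface(pts)
    rng = random.Random(seed)
    fails = 0
    for _ in range(n):
        t = rng.gauss(t_mean, t_sd)               # uncertain thickness
        allowable = rng.gauss(allow_mean, allow_sd)  # uncertain allowable
        fails += surface(t) > allowable           # failure: stress exceeds allowable
    return fails / n
```

In a moving-surface scheme, the three design points would be re-chosen and the surface refit whenever the optimizer shifts the design, which is what keeps the surrogate accurate near the current design.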

  13. The effect of juvenile hormone on Polistes wasp fertility varies with cooperative behavior.

    PubMed

    Tibbetts, Elizabeth A; Sheehan, Michael J

    2012-04-01

    Social insects provide good models for studying how and why the mechanisms that underlie reproduction vary, as there is dramatic reproductive plasticity within and between species. Here, we test how the effect of juvenile hormone (JH) on fertility covaries with cooperative behavior in workers and nest-founding queens in the primitively eusocial wasp Polistes metricus. P. metricus foundresses and workers appear morphologically similar and both are capable of reproduction, though there is variation in the extent of social cooperation and the probability of reproduction across castes. Do the endocrine mechanisms that mediate reproduction co-vary with cooperative behavior? We found dramatic differences in the effect of JH on fertility across castes. In non-cooperative nest-founding queens, all individuals responded to JH by increasing their fertility. However, in cooperative workers, the effect of JH on fertility varies with body weight; large workers increase their fertility in response to JH while small workers do not. The variation in JH response may be an adaptation to facilitate resource allocation based on the probability of independent reproduction. This work contrasts with previous studies in closely related Polistes dominulus paper wasps, in which both foundresses and workers form cooperative associations and both castes show similar, condition-dependent JH response. The variation in JH responsiveness within and between species suggests that endocrine responsiveness and the factors influencing caste differentiation are surprisingly evolutionarily labile. Copyright © 2012 Elsevier Inc. All rights reserved.

  14. Training in Small Business Retailing: Testing Human Capital Theory.

    ERIC Educational Resources Information Center

    Barcala, Marta Fernandez; Perez, Maria Jose Sanzo; Gutierrez, Juan Antonio Trespalacios

    1999-01-01

    Looks at four models of training demand: (1) probability of attending training in the near future; (2) probability of having attended training in the past; (3) probability of being willing to follow multimedia and correspondence courses; and (4) probability of repeating the experience of attending another training course in the near future.…

  15. Daily rhythmicity of body temperature in the dog.

    PubMed

    Refinetti, R; Piccione, G

    2003-08-01

    Research over the past 50 years has demonstrated the existence of circadian or daily rhythmicity in the body core temperature of a large number of mammalian species. However, previous studies have failed to identify daily rhythmicity of body temperature in dogs. We report here the successful recording of daily rhythms of rectal temperature in female Beagle dogs. The low robustness of the rhythms (41% of maximal robustness) and the small range of excursion (0.5 degrees C) are probably responsible for previous failures in detecting rhythmicity in dogs.

  16. Introns in Cryptococcus.

    PubMed

    Janbon, Guilhem

    2018-01-01

    In Cryptococcus neoformans, nearly all genes are interrupted by small introns. In recent years, genome annotation and genetic analysis have illuminated the major roles these introns play in the biology of this pathogenic yeast. Introns are necessary for gene expression and alternative splicing can regulate gene expression in response to environmental cues. In addition, recent studies have revealed that C. neoformans introns help to prevent transposon dissemination and protect genome integrity. These characteristics of cryptococcal introns are probably not unique to Cryptococcus, and this yeast likely can be considered as a model for intron-related studies in fungi.

  17. Knowing where is different from knowing what: Distinct response time profiles and accuracy effects for target location, orientation, and color probability.

    PubMed

    Jabar, Syaheed B; Filipowicz, Alex; Anderson, Britt

    2017-11-01

    When a location is cued, targets appearing at that location are detected more quickly. When a target feature is cued, targets bearing that feature are detected more quickly. These attentional cueing effects are only superficially similar. More detailed analyses find distinct temporal and accuracy profiles for the two different types of cues. This pattern parallels work with probability manipulations, where both feature and spatial probability are known to affect detection accuracy and reaction times. However, little has been done by way of comparing these effects. Are probability manipulations on space and features distinct? In a series of five experiments, we systematically varied spatial probability and feature probability along two dimensions (orientation or color). In addition, we decomposed response times into initiation and movement components. Targets appearing at the probable location were reported more quickly and more accurately regardless of whether the report was based on orientation or color. On the other hand, when either color probability or orientation probability was manipulated, response time and accuracy improvements were specific for that probable feature dimension. Decomposition of the response time benefits demonstrated that spatial probability only affected initiation times, whereas manipulations of feature probability affected both initiation and movement times. As detection was made more difficult, the two effects further diverged, with spatial probability disproportionally affecting initiation times and feature probability disproportionately affecting accuracy. In conclusion, all manipulations of probability, whether spatial or featural, affect detection. However, only feature probability affects perceptual precision, and precision effects are specific to the probable attribute.

  18. Comet and asteroid hazard to the terrestrial planets

    NASA Astrophysics Data System (ADS)

    Ipatov, S. I.; Mather, J. C.

    2004-01-01

    We estimated the rate of comet and asteroid collisions with the terrestrial planets by calculating the orbits of 13,000 Jupiter-crossing objects (JCOs) and 1300 resonant asteroids and computing the probabilities of collisions based on random-phase approximations and the orbital elements sampled with a 500-year step. The Bulirsch-Stoer and a symplectic orbit integrator gave similar results for orbital evolution, but may give different collision probabilities with the Sun. A small fraction of former JCOs reached orbits with aphelia inside Jupiter's orbit, and some reached Apollo orbits with semi-major axes less than 2 AU, Aten orbits, and inner-Earth orbits (with aphelia less than 0.983 AU) and remained there for millions of years. Though less than 0.1% of the total, these objects were responsible for most of the collision probability of former JCOs with Earth and Venus. We conclude that a significant fraction of near-Earth objects could be extinct comets that came from the trans-Neptunian region, or that most of such comets disintegrated during their motion in near-Earth object orbits.

  19. Role of bioinformatics in establishing microRNAs as modulators of abiotic stress responses: the new revolution

    PubMed Central

    Tripathi, Anita; Goswami, Kavita; Sanan-Mishra, Neeti

    2015-01-01

    microRNAs (miRs) are a class of 21–24 nucleotide long non-coding RNAs responsible for regulating the expression of associated genes, mainly by cleavage or translational inhibition of the target transcripts. With this silencing capability, miRs act as an important component in the regulation of plant responses to various stress conditions. In recent years, with drastic changes in environmental and soil conditions, different types of stresses have emerged as a major challenge for plant growth and productivity. The identification and profiling of miRs has itself been a challenge for researchers, given their small size and the large number of probable sequences in the genome. Application of computational approaches has expedited the process of identification of miRs and their expression profiling in different conditions. The development of High-Throughput Sequencing (HTS) techniques has facilitated access to the global profiles of miRs for understanding their mode of action in plants. The introduction of various bioinformatics databases and tools has revolutionized the study of miRs and other small RNAs. This review focuses on the role of bioinformatics approaches in the identification and study of the regulatory roles of plant miRs in the adaptive response to stresses. PMID:26578966

  20. Genetic erosion impedes adaptive responses to stressful environments

    PubMed Central

    Bijlsma, R; Loeschcke, Volker

    2012-01-01

    Biodiversity is increasingly subjected to human-induced changes of the environment. To persist, populations continually have to adapt to these often stressful changes including pollution and climate change. Genetic erosion in small populations, owing to fragmentation of natural habitats, is expected to obstruct such adaptive responses: (i) genetic drift will cause a decrease in the level of adaptive genetic variation, thereby limiting evolutionary responses; (ii) inbreeding and the concomitant inbreeding depression will reduce individual fitness and, consequently, the tolerance of populations to environmental stress. Importantly, inbreeding generally increases the sensitivity of a population to stress, thereby increasing the amount of inbreeding depression. As adaptation to stress is most often accompanied by increased mortality (cost of selection), the increase in the ‘cost of inbreeding’ under stress is expected to severely hamper evolutionary adaptive processes. Inbreeding thus plays a pivotal role in this process and is expected to limit the probability of genetically eroded populations to successfully adapt to stressful environmental conditions. Consequently, the dynamics of small fragmented populations may differ considerably from large nonfragmented populations. The resilience of fragmented populations to changing and deteriorating environments is expected to be greatly decreased. Alleviating inbreeding depression, therefore, is crucial to ensure population persistence. PMID:25568035

  1. Promise and problems in using stress triggering models for time-dependent earthquake hazard assessment

    NASA Astrophysics Data System (ADS)

    Cocco, M.

    2001-12-01

    Earthquake stress changes can promote failures on favorably oriented faults and modify the seismicity pattern over broad regions around the causative faults. Because the induced stress perturbations modify the rate of production of earthquakes, they alter the probability of seismic events in a specified time window. Comparing the Coulomb stress changes with the seismicity rate changes and aftershock patterns can statistically test the role of stress transfer in earthquake occurrence. The interaction probability may represent a further tool to test the stress trigger or shadow model. The probability model, which incorporates stress transfer, has the main advantage of including the contributions of the induced stress perturbation (a static step in its present formulation), the loading rate, and the fault constitutive properties. Because the mechanical conditions of the secondary faults at the time of application of the induced load are largely unknown, stress triggering can only be tested on fault populations and not on single earthquake pairs with a specified time delay. The interaction probability can represent the most suitable tool to test the interaction between large magnitude earthquakes. Despite these important implications and the stimulating perspectives, there exist problems in understanding earthquake interaction that should motivate future research but at the same time limit its immediate social applications. One major limitation is that we are unable to predict whether and how the induced stress perturbations modify the ratio of small to large magnitude earthquakes. In other words, we cannot distinguish between a change in this ratio in favor of small events or of large magnitude earthquakes, because the interaction probability is independent of magnitude. Another problem concerns the reconstruction of the stressing history. 
The interaction probability model is based on the response to a static stress step; however, we know that other processes contribute to the stressing history perturbing the faults, such as dynamic stress changes and post-seismic stress changes caused by viscoelastic relaxation or fluid flow. If, for instance, we believe that dynamic stress changes can trigger aftershocks or earthquakes years after the seismic waves have passed through the fault, the prospect of calculating interaction probability is untenable. It is therefore clear that we have learned a great deal about earthquake interaction by incorporating fault constitutive properties, which has helped resolve existing controversies while leaving open questions for future research.

  2. Predicting above-ground density and distribution of small mammal prey species at large spatial scales

    PubMed Central

    2017-01-01

    Grassland and shrub-steppe ecosystems are increasingly threatened by anthropogenic activities. Loss of native habitats may negatively impact important small mammal prey species. Little information, however, is available on the impact of habitat variability on density of small mammal prey species at broad spatial scales. We examined the relationship between small mammal density and remotely-sensed environmental covariates in shrub-steppe and grassland ecosystems in Wyoming, USA. We sampled four sciurid and leporid species groups using line transect methods, and used hierarchical distance-sampling to model density in response to variation in vegetation, climate, topographic, and anthropogenic variables, while accounting for variation in detection probability. We created spatial predictions of each species’ density and distribution. Sciurid and leporid species exhibited mixed responses to vegetation, such that changes to native habitat will likely affect prey species differently. Density of white-tailed prairie dogs (Cynomys leucurus), Wyoming ground squirrels (Urocitellus elegans), and leporids correlated negatively with proportion of shrub or sagebrush cover and positively with herbaceous cover or bare ground, whereas least chipmunks showed a positive correlation with shrub cover and a negative correlation with herbaceous cover. Spatial predictions from our models provide a landscape-scale metric of above-ground prey density, which will facilitate the development of conservation plans for these taxa and their predators at spatial scales relevant to management. PMID:28520757
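
    The density estimation described above rests on conventional distance-sampling theory. As a minimal, self-contained sketch (not the authors' hierarchical model, which additionally lets detection and density vary with covariates), the effective strip width for the common half-normal detection function, and the resulting line-transect density estimator, can be written as:

```python
import math

def effective_strip_width(sigma, w):
    """Effective strip half-width (ESW) for a half-normal detection
    function g(d) = exp(-d^2 / (2 sigma^2)) truncated at distance w."""
    return sigma * math.sqrt(math.pi / 2.0) * math.erf(w / (sigma * math.sqrt(2.0)))

def density_estimate(n_detections, transect_length, sigma, w):
    """Line-transect density: detections divided by the effectively
    surveyed area, 2 * ESW * total transect length."""
    return n_detections / (2.0 * effective_strip_width(sigma, w) * transect_length)

# illustrative numbers only: 50 detections over 1000 m of transect,
# sigma = 10 m, truncation distance 100 m
d_hat = density_estimate(50, 1000.0, 10.0, 100.0)
```

The `sigma` and truncation distance `w` here are invented parameters; in practice `sigma` is fitted to the observed perpendicular distances.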

  3. Random function representation of stationary stochastic vector processes for probability density evolution analysis of wind-induced structures

    NASA Astrophysics Data System (ADS)

    Liu, Zhangjun; Liu, Zenghui

    2018-06-01

    This paper develops a hybrid approach of spectral representation and random functions for simulating stationary stochastic vector processes. In the proposed approach, the high-dimensional random variables included in the original spectral representation (OSR) formula are effectively reduced to only two elementary random variables by introducing random functions that serve as random constraints. On this basis, satisfactory simulation accuracy can be guaranteed by selecting a small representative point set of the elementary random variables. The probability information of the stochastic excitations can be fully captured with just several hundred sample functions generated by the proposed approach. Therefore, combined with the probability density evolution method (PDEM), the approach enables dynamic response analysis and reliability assessment of engineering structures. For illustrative purposes, a stochastic turbulent wind velocity field acting on a frame-shear-wall structure is simulated by constructing three types of random functions to demonstrate the accuracy and efficiency of the proposed approach. Careful and in-depth studies of the probability density evolution analysis of the wind-induced structure have been conducted to better illustrate the application prospects of the proposed approach. Numerical examples also show that the proposed approach possesses good robustness.
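
    The OSR formula that the paper reduces can be sketched in its classic Shinozuka form. The snippet below is a generic illustration only (not the authors' two-random-variable reduction), with a toy one-sided spectrum chosen so the target variance is 1:

```python
import math, random

def spectral_simulate(S, omega_max, N, t, rng):
    """Classic spectral-representation (Shinozuka) simulation of a zero-mean
    stationary process with one-sided power spectral density S(omega)."""
    d_omega = omega_max / N
    harmonics = []
    for k in range(N):
        omega_k = (k + 0.5) * d_omega                  # midpoint frequency
        amp_k = math.sqrt(2.0 * S(omega_k) * d_omega)  # harmonic amplitude
        phi_k = rng.uniform(0.0, 2.0 * math.pi)        # independent random phase
        harmonics.append((omega_k, amp_k, phi_k))
    return [sum(a * math.cos(w * tj + p) for w, a, p in harmonics) for tj in t]

rng = random.Random(0)
S = lambda w: math.exp(-w)                 # toy PSD: integral (variance) = 1
t = [1.0 * j for j in range(2001)]         # samples on [0, 2000]
x = spectral_simulate(S, omega_max=20.0, N=256, t=t, rng=rng)
mean = sum(x) / len(x)
var = sum((v - mean) ** 2 for v in x) / len(x)
```

The reduction studied in the paper replaces the N independent random phases with functions of two elementary random variables; here all N phases remain independent.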

  4. Preliminary performance assessment of biotoxin detection for UWS applications using a MicroChemLab device.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    VanderNoot, Victoria A.; Haroldsen, Brent L.; Renzi, Ronald F.

    2010-03-01

    In a multiyear research agreement with Tenix Investments Pty. Ltd., Sandia has been developing field-deployable technologies for detection of biotoxins in water supply systems. The unattended water sensor (UWS) employs microfluidic chip-based gel electrophoresis for monitoring biological analytes in a small integrated sensor platform. The instrument collects, prepares, and analyzes water samples in an automated manner. Sample analysis is done using the µChemLab™ analysis module. This report uses analysis results from two datasets collected with the UWS to estimate performance of the device. The first dataset consists of samples containing ricin at varying concentrations and is used to assess instrument response and detection probability. The second dataset comprises analyses of water samples collected at a water utility, which are used to assess the false-positive probability. The analyses of the two sets are used to estimate the receiver operating characteristic (ROC) curves for the device at one set of operational and detection-algorithm parameters. For these parameters, and based on a statistical estimate, the ricin probability of detection is about 0.9 at a concentration of 5 nM for a false-positive probability of 1 × 10⁻⁶.
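
    The quoted operating point (detection probability ≈ 0.9 at a false-positive probability of 1 × 10⁻⁶) can be illustrated with a textbook equal-variance Gaussian ROC model. The signal-to-noise separation below is purely illustrative, not a measured property of the µChemLab device:

```python
from statistics import NormalDist

def detection_probability(d_prime, false_positive_rate):
    """Equal-variance Gaussian ROC model: noise scores ~ N(0,1), signal
    scores ~ N(d_prime, 1); the decision threshold is set so the
    noise-only exceedance probability equals false_positive_rate."""
    z = NormalDist()
    threshold = z.inv_cdf(1.0 - false_positive_rate)   # ≈ 4.75 for 1e-6
    return 1.0 - z.cdf(threshold - d_prime)

pd = detection_probability(6.0, 1e-6)
```

Under these assumed score distributions, a separation of about six standard deviations between noise and signal scores is needed to reach roughly 0.9 detection at that false-positive rate.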

  5. MicroRNA regulated defense responses in Triticum aestivum L. during Puccinia graminis f.sp. tritici infection.

    PubMed

    Gupta, Om Prakash; Permar, Vipin; Koundal, Vikas; Singh, Uday Dhari; Praveen, Shelly

    2012-02-01

    Plants have evolved diverse mechanisms to recognize pathogen attack and trigger defense responses. These defense responses alter host cellular functions regulated by endogenous small non-coding miRNAs. To understand how miRNAs regulate cellular functions during stem rust infection in wheat, we investigated eight different miRNAs, viz. miR159, miR164, miR167, miR171, miR444, miR408, miR1129 and miR1138, involved in three independent cellular defense responses to infection. The investigation reveals that at the initiation of disease, accumulation of miRNAs might play a key role in the host hypersensitive response (HR), which diminishes at the maturation stage. This suggests a possible host-fungal synergistic relationship leading to susceptibility. Differential expression of these miRNAs in the presence and absence of the R gene provides a probable explanation of miRNA-regulated, R gene-mediated independent pathways.

  6. Mixed analytical-stochastic simulation method for the recovery of a Brownian gradient source from probability fluxes to small windows.

    PubMed

    Dobramysl, U; Holcman, D

    2018-02-15

    Is it possible to recover the position of a source from the steady-state fluxes of Brownian particles to small absorbing windows located on the boundary of a domain? To address this question, we develop a numerical procedure to avoid tracking Brownian trajectories in the entire infinite space. Instead, we generate particles near the absorbing windows, computed from the analytical expression of the exit probability. When the Brownian particles are generated by a steady-state gradient at a single point, we compute asymptotically the fluxes to small absorbing holes distributed on the boundary of half-space and on a disk in two dimensions, which agree with stochastic simulations. We also derive an expression for the splitting probability between small windows using the matched asymptotic method. Finally, when there are more than two small absorbing windows, we show how to reconstruct the position of the source from the diffusion fluxes. The present approach provides a computational first principle for the mechanism of sensing a gradient of diffusing particles, a ubiquitous problem in cell biology.
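
    As a one-dimensional toy version of the splitting-probability idea (far simpler than the paper's matched asymptotics for small windows in higher dimensions), the probability that a Brownian particle started at x0 on [0, L] is absorbed at 0 before L is 1 − x0/L, which a direct simulation recovers:

```python
import random

def splitting_probability(x0, L, n_walks, dt=1e-3, seed=11):
    """Fraction of discretized Brownian walks started at x0 that are
    absorbed at 0 before reaching L (exact answer: 1 - x0 / L)."""
    rng = random.Random(seed)
    sigma = dt ** 0.5                 # Gaussian step std for time increment dt
    hits_zero = 0
    for _ in range(n_walks):
        x = x0
        while 0.0 < x < L:
            x += rng.gauss(0.0, sigma)
        if x <= 0.0:
            hits_zero += 1
    return hits_zero / n_walks

p = splitting_probability(0.3, 1.0, 2000)   # exact value: 0.7
```

The function names and parameters here are invented for illustration; the paper's contribution is precisely to avoid this kind of trajectory tracking.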

  7. Probability of coincidental similarity among the orbits of small bodies - I. Pairing

    NASA Astrophysics Data System (ADS)

    Jopek, Tadeusz Jan; Bronikowska, Małgorzata

    2017-09-01

    The probability of coincidental clustering among the orbits of comets, asteroids and meteoroids depends on many factors, such as the size of the orbital sample searched for clusters and the size of the identified group; it differs for groups of 2, 3, 4, … members. Because the probability of coincidental clustering is assessed by numerical simulation, it also depends on the method used to generate the synthetic orbits. We have tested the impact of some of these factors. For a given size of the orbital sample, we have assessed the probability of random pairing among several orbital populations of different sizes, and found how these probabilities vary with the size of the orbital samples. Finally, keeping the size of the orbital sample fixed, we have shown that the probability of random pairing can differ significantly between orbital samples obtained by different observation techniques. For the user's convenience, we have also obtained several formulae that, for a given size of the orbital sample, can be used to calculate the similarity threshold corresponding to a small probability of coincidental similarity between two orbits.

  8. Review of the use of pretest probability for molecular testing in non-small cell lung cancer and overview of new mutations that may affect clinical practice.

    PubMed

    Martin, Petra; Leighl, Natasha B

    2017-06-01

    This article considers the use of pretest probability in non-small cell lung cancer (NSCLC) and how its use in EGFR testing has helped establish clinical guidelines on selecting patients for EGFR testing. With an ever-increasing number of molecular abnormalities being identified and often limited tissue available for testing, the use of pretest probability will need to be increasingly considered in the future for selecting investigations and treatments in patients. In addition we review new mutations that have the potential to affect clinical practice.
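
    Pretest probability enters testing decisions through Bayes' rule in odds form. The sketch below uses hypothetical sensitivity and specificity figures, not values for any specific EGFR assay:

```python
def posttest_probability(pretest_p, sensitivity, specificity):
    """Update a pretest probability after a positive test via the
    likelihood ratio: posttest odds = pretest odds * LR+,
    with LR+ = sensitivity / (1 - specificity)."""
    lr_positive = sensitivity / (1.0 - specificity)
    pretest_odds = pretest_p / (1.0 - pretest_p)
    posttest_odds = pretest_odds * lr_positive
    return posttest_odds / (1.0 + posttest_odds)

# 10% pretest probability, then a positive result on a hypothetical
# 99%-sensitive, 99%-specific test:
print(round(posttest_probability(0.10, 0.99, 0.99), 3))   # → 0.917
```

The same arithmetic shows why testing low-pretest-probability patients is wasteful: with a 0.1% pretest probability, even this test yields a posttest probability below 10%.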

  9. Musculoskeletal anatomy of the Eurasian lynx, Lynx lynx (Carnivora: Felidae) forelimb: Adaptations to capture large prey?

    PubMed

    Viranta, Suvi; Lommi, Hanna; Holmala, Katja; Laakkonen, Juha

    2016-06-01

    Mammalian carnivores adhere to two different feeding strategies relative to their body masses. Large carnivores prey on animals the same size as or larger than themselves, whereas small carnivores prey on smaller vertebrates and invertebrates. The Eurasian lynx (Lynx lynx) falls between these two categories. Lynx descend from larger forms that were probably large-prey specialists but became predators of small prey during the Pleistocene. The modern Eurasian lynx may represent an evolutionary reversal toward specializing in large prey again. We hypothesized that the musculoskeletal anatomy of the lynx should show traits for catching large prey. To test our hypothesis, we dissected the forelimb muscles of six Eurasian lynx individuals and compared our findings to results published for other felids. We measured the bones and compared their dimensions to the published material. Our material displayed a well-developed pectoral girdle musculature with some uniquely extensive muscle attachments. The upper arm musculature resembled that of the pantherine felids and probably the extinct sabertooths, and the muscles responsible for supination and pronation were also similar to those in large cats. The muscles controlling the pollex were well developed. However, skeletal indices were similar to those of small-prey predators. Our findings show that lynx possess a topographic pattern of muscle origin and insertion like that in large felids. J. Morphol. 277:753-765, 2016. © 2016 Wiley Periodicals, Inc.

  10. "I Don't Really Understand Probability at All": Final Year Pre-Service Teachers' Understanding of Probability

    ERIC Educational Resources Information Center

    Maher, Nicole; Muir, Tracey

    2014-01-01

    This paper reports on one aspect of a wider study that investigated a selection of final year pre-service primary teachers' responses to four probability tasks. The tasks focused on foundational ideas of probability including sample space, independence, variation and expectation. Responses suggested that strongly held intuitions appeared to…

  11. Dynamical Response of Networks Under External Perturbations: Exact Results

    NASA Astrophysics Data System (ADS)

    Chinellato, David D.; Epstein, Irving R.; Braha, Dan; Bar-Yam, Yaneer; de Aguiar, Marcus A. M.

    2015-04-01

    We give exact statistical distributions for the dynamic response of influence networks subjected to external perturbations. We consider networks whose nodes have two internal states, labeled 0 and 1. A number of nodes are frozen in state 0, a number in state 1, and the remaining nodes change by adopting the state of a connected node with a fixed probability per time step. The frozen nodes can be interpreted as external perturbations to the subnetwork of free nodes. Analytically extending the numbers of frozen nodes to values smaller than 1 enables modeling the case of weak coupling. We solve the dynamical equations exactly for fully connected networks, obtaining the equilibrium distribution, the transition probabilities between any two states, and the characteristic time to equilibration. Our exact results are excellent approximations for other topologies, including random, regular-lattice, scale-free and small-world networks, when the numbers of fixed nodes are adjusted to account for the effect of topology on coupling to the environment. This model can describe a variety of complex systems, from magnetic spins to social networks to population genetics, and was recently applied as a framework for early warning signals of real-world self-organized economic market crises.
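
    A minimal simulation of the fully connected case (with illustrative node counts, not taken from the paper) shows the equilibrium fraction of free nodes in state 1 approaching n1/(n0 + n1):

```python
import random

def frozen_node_equilibrium(n_free, n0, n1, steps, seed=1):
    """Fully connected two-state influence network: each step, a random free
    node adopts the state of another uniformly chosen node.  n0 nodes are
    frozen in state 0 and n1 in state 1 (the external perturbation).
    Returns the time-averaged fraction of free nodes in state 1."""
    rng = random.Random(seed)
    free = [0] * n_free
    frozen = [0] * n0 + [1] * n1
    n_total = n_free + n0 + n1
    ones, acc = 0, 0.0
    burn_in = steps // 2
    for step in range(steps):
        i = rng.randrange(n_free)
        j = rng.randrange(n_total)
        while j == i:                       # copy from a *different* node
            j = rng.randrange(n_total)
        new = free[j] if j < n_free else frozen[j - n_free]
        ones += new - free[i]
        free[i] = new
        if step >= burn_in:
            acc += ones / n_free
    return acc / (steps - burn_in)

# the equilibrium mean fraction of 1s should approach n1 / (n0 + n1) = 0.75
frac = frozen_node_equilibrium(n_free=50, n0=5, n1=15, steps=200000)
```

The paper derives this equilibrium distribution exactly; the simulation merely illustrates the mechanism.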

  12. A random walk rule for phase I clinical trials.

    PubMed

    Durham, S D; Flournoy, N; Rosenberger, W F

    1997-06-01

    We describe a family of random walk rules for the sequential allocation of dose levels to patients in a dose-response study, or phase I clinical trial. Patients are sequentially assigned the next higher, same, or next lower dose level according to some probability distribution, which may be determined by ethical considerations as well as the patient's response. It is shown that one can choose these probabilities in order to center dose level assignments unimodally around any target quantile of interest. Estimation of the quantile is discussed; the maximum likelihood estimator and its variance are derived under a two-parameter logistic distribution, and the maximum likelihood estimator is compared with other nonparametric estimators. Random walk rules have clear advantages: they are simple to implement, and finite and asymptotic distribution theory is completely worked out. For a specific random walk rule, we compute finite and asymptotic properties and give examples of its use in planning studies. Having the finite distribution theory available and tractable obviates the need for elaborate simulation studies to analyze the properties of the design. The small sample properties of our rule, as determined by exact theory, compare favorably to those of the continual reassessment method, determined by simulation.
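
    One member of this family is the Durham-Flournoy biased-coin up-and-down rule. The sketch below, with a hypothetical dose-toxicity curve, shows dose assignments centering near the dose whose toxicity probability equals the target quantile:

```python
import random

def biased_coin_walk(doses, tox_prob, target, n_patients, seed=7):
    """Biased-coin up-and-down rule: after a toxicity, step one dose level
    down; after no toxicity, step up with probability b = target / (1 - target),
    otherwise repeat the same level.  Assignments then center unimodally
    around the dose whose toxicity probability equals `target`."""
    rng = random.Random(seed)
    b = target / (1.0 - target)
    level = 0
    visits = [0] * len(doses)
    for _ in range(n_patients):
        visits[level] += 1
        if rng.random() < tox_prob[level]:          # toxicity observed
            level = max(level - 1, 0)
        elif rng.random() < b:                      # biased coin: escalate
            level = min(level + 1, len(doses) - 1)
    return visits

doses = [1, 2, 3, 4, 5, 6]                          # hypothetical dose levels
tox = [0.05, 0.10, 0.20, 0.33, 0.50, 0.70]          # hypothetical toxicity curve
visits = biased_coin_walk(doses, tox, target=0.33, n_patients=20000)
mean_dose = sum(d * v for d, v in zip(doses, visits)) / sum(visits)
```

With target 0.33, assignments concentrate around dose 4, where the assumed toxicity probability equals the target.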

  13. Mineral resource potential map of the Raywood Flat Roadless Areas, Riverside and San Bernardino counties, California

    USGS Publications Warehouse

    Matti, Jonathan C.; Cox, Brett F.; Iverson, Stephen R.

    1983-01-01

    The area having moderate potential for base-metal resources forms a small zone in the eastern part of the recommended wilderness (A5-187). Within this zone, evidence provided by stream-sediment geochemistry suggests that crystalline bedrocks in several drainages contain concentrations of metallic elements. Because the terrain is inaccessible and covered with dense brush, most of the bedrock in the specific drainages containing the geochemical anomalies could not be examined. Thus, although we infer that mineral occurrences exist in the drainage basins, we have little data on which to base an estimate of their extent and quality. Locally, the crystalline rocks probably contain hydrothermal veins or disseminated occurrences where lead, copper, molybdenum, tin, cobalt, bismuth, and arsenic have been concentrated. However, the geochemical anomalies for these metals are small, and the stream drainages also are relatively small. Therefore, the inferred occurrences of metallic minerals probably are small scale, scattered, and low grade. There is only low probability that the inferred mineral occurrences are large scale.

  14. Shifting elasmobranch community assemblage at Cocos Island--an isolated marine protected area.

    PubMed

    White, Easton R; Myers, Mark C; Flemming, Joanna Mills; Baum, Julia K

    2015-08-01

    Fishing pressure has increased the extinction risk of many elasmobranch (shark and ray) species. Although many countries have established no-take marine reserves, a paucity of monitoring data means it is still unclear if reserves are effectively protecting these species. We examined data collected by a small group of divers over the past 21 years at one of the world's oldest marine protected areas (MPAs), Cocos Island National Park, Costa Rica. We used mixed effects models to determine trends in relative abundance, or probability of occurrence, of 12 monitored elasmobranch species while accounting for variation among observers and from abiotic factors. Eight of 12 species declined significantly over the past 2 decades. We documented decreases in relative abundance for 6 species, including the iconic scalloped hammerhead shark (Sphyrna lewini) (-45%), whitetip reef shark (Triaenodon obesus) (-77%), mobula ray (Mobula spp.) (-78%), and manta ray (Manta birostris) (-89%), and decreases in the probability of occurrence for 2 other species. Several of these species have small home ranges and should be better protected by an MPA, which underscores the notion that declines of marine megafauna will continue unabated in MPAs unless there is adequate enforcement effort to control fishing. In addition, probability of occurrence at Cocos Island of tiger (Galeocerdo cuvier), Galapagos (Carcharhinus galapagensis), blacktip (Carcharhinus limbatus), and whale (Rhincodon typus) sharks increased significantly. The effectiveness of MPAs cannot be evaluated by examining single species because population responses can vary depending on life history traits and vulnerability to fishing pressure. © 2015 Society for Conservation Biology.

  15. The Context Matters: Outcome Probability and Expectation Mismatch Modulate the Feedback Negativity When Self-Evaluation of Response Correctness Is Possible

    PubMed Central

    Leue, Anja; Cano Rodilla, Carmen; Beauducel, André

    2015-01-01

    Individuals typically evaluate whether their performance and the obtained feedback match. Previous research has shown that feedback negativity (FN) depends on outcome probability and feedback valence. It is, however, less clear to what extent previous effects of outcome probability on FN depend on self-evaluations of response correctness. Therefore, we investigated the effects of outcome probability on FN amplitude in a simple go/no-go task that allowed for the self-evaluation of response correctness. We also investigated effects of performance incompatibility and feedback valence. In a sample of N = 22 participants, outcome probability was manipulated by means of precues, feedback valence by means of monetary feedback, and performance incompatibility by means of feedback that induced a match versus mismatch with individuals' performance. We found that the 100% outcome probability condition induced a more negative FN following no-loss than the 50% outcome probability condition. The FN following loss was more negative in the 50% compared to the 100% outcome probability condition. Performance-incompatible loss resulted in a more negative FN than performance-compatible loss. Our results indicate that the self-evaluation of the correctness of responses should be taken into account when the effects of outcome probability and expectation mismatch on FN are investigated. PMID:26783525

  16. The Context Matters: Outcome Probability and Expectation Mismatch Modulate the Feedback Negativity When Self-Evaluation of Response Correctness Is Possible.

    PubMed

    Leue, Anja; Cano Rodilla, Carmen; Beauducel, André

    2015-01-01

    Individuals typically evaluate whether their performance and the obtained feedback match. Previous research has shown that feedback negativity (FN) depends on outcome probability and feedback valence. It is, however, less clear to what extent previous effects of outcome probability on FN depend on self-evaluations of response correctness. Therefore, we investigated the effects of outcome probability on FN amplitude in a simple go/no-go task that allowed for the self-evaluation of response correctness. We also investigated effects of performance incompatibility and feedback valence. In a sample of N = 22 participants, outcome probability was manipulated by means of precues, feedback valence by means of monetary feedback, and performance incompatibility by means of feedback that induced a match versus mismatch with individuals' performance. We found that the 100% outcome probability condition induced a more negative FN following no-loss than the 50% outcome probability condition. The FN following loss was more negative in the 50% compared to the 100% outcome probability condition. Performance-incompatible loss resulted in a more negative FN than performance-compatible loss. Our results indicate that the self-evaluation of the correctness of responses should be taken into account when the effects of outcome probability and expectation mismatch on FN are investigated.

  17. Rare event simulation in radiation transport

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kollman, Craig

    1993-10-01

    This dissertation studies methods for estimating extremely small probabilities by Monte Carlo simulation. Problems in radiation transport typically involve estimating very rare events or the expected value of a random variable which is, with overwhelming probability, equal to zero. These problems often have high-dimensional state spaces and irregular geometries, so analytic solutions are not possible. Monte Carlo simulation must be used to estimate the radiation dosage being transported to a particular location. If the area is well shielded, the probability of any one particular particle getting through is very small. Because of the large number of particles involved, even a tiny fraction penetrating the shield may represent an unacceptable level of radiation. It therefore becomes critical to be able to accurately estimate this extremely small probability. Importance sampling is a well-known technique for improving the efficiency of rare-event calculations. Here, a new set of probabilities is used in the simulation runs, and the results are multiplied by the likelihood ratio between the true and simulated probabilities so as to keep the estimator unbiased. The variance of the resulting estimator is very sensitive to which new set of transition probabilities is chosen. It is shown that a zero-variance estimator does exist, but that its computation requires exact knowledge of the solution. A simple random walk with an associated killing model for the scatter of neutrons is introduced. Large-deviation results for optimal importance sampling in random walks are extended to the case where killing is present. An adaptive "learning" algorithm for implementing importance sampling is given for more general Markov chain models of neutron scatter. For finite state spaces this algorithm is shown to give, with probability one, a sequence of estimates converging exponentially fast to the true solution.
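
    The likelihood-ratio reweighting described above can be illustrated on a toy rare event, P(Z > 4) for a standard normal, using an exponentially tilted proposal (a generic example, unrelated to the dissertation's neutron-transport models):

```python
import math, random

def importance_sample_tail(c, n, seed=42):
    """Estimate P(Z > c), Z ~ N(0,1), by drawing from the tilted proposal
    N(c, 1) and reweighting each hit with the likelihood ratio
    phi(x) / phi(x - c) = exp(-c*x + c*c/2)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(c, 1.0)             # proposal centred on the rare region
        if x > c:
            total += math.exp(-c * x + 0.5 * c * c)
    return total / n

est = importance_sample_tail(4.0, 100000)
exact = 0.5 * math.erfc(4.0 / math.sqrt(2.0))   # P(Z > 4) ≈ 3.17e-5
```

A naive estimator with the same budget would see only a handful of hits (about three in 100,000 draws), whereas nearly half of the tilted samples land in the rare region, drastically reducing the variance.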

  18. Do Financial Incentives Influence GPs' Decisions to Do After-hours Work? A Discrete Choice Labour Supply Model.

    PubMed

    Broadway, Barbara; Kalb, Guyonne; Li, Jinhu; Scott, Anthony

    2017-12-01

    This paper analyses doctors' supply of after-hours care (AHC), and how it is affected by personal and family circumstances as well as the earnings structure. We use detailed survey data from a large sample of Australian General Practitioners (GPs) to estimate a structural, discrete choice model of labour supply and AHC. This allows us to jointly model GPs' decisions on the number of daytime-weekday working hours and the probability of providing AHC. We simulate GPs' labour supply responses to an increase in hourly earnings, both in a daytime-weekday setting and for AHC. GPs increase their daytime-weekday working hours if their hourly earnings in this setting increase, but only to a very small extent. GPs are somewhat more likely to provide AHC if their hourly earnings in that setting increase, but again, the effect is very small and only evident in some subgroups. Moreover, higher earnings in weekday-daytime practice reduce the probability of providing AHC, particularly for men. Increasing GPs' earnings appears to be at best relatively ineffective in encouraging increased provision of AHC and may even prove harmful if incentives are not well targeted. Copyright © 2017 John Wiley & Sons, Ltd.

  19. Feedback produces divergence from prospect theory in descriptive choice.

    PubMed

    Jessup, Ryan K; Bishara, Anthony J; Busemeyer, Jerome R

    2008-10-01

    A recent study demonstrated that individuals making experience-based choices underweight small probabilities, in contrast to the overweighting observed in a typical descriptive paradigm. We tested whether trial-by-trial feedback in a repeated descriptive paradigm would engender choices more consistent with experiential or descriptive paradigms. The results of a repeated gambling task indicated that individuals receiving feedback underweighted small probabilities relative to their no-feedback counterparts. These results implicate feedback as a critical component of the decision-making process, even in the presence of fully specified descriptive information. A model comparison at the individual-subject level suggested that feedback drove individuals' decision weights toward objective probability weighting.
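
    The over/underweighting contrast is conventionally captured by an inverse-S-shaped weighting function. Below is the Tversky-Kahneman (1992) form with their estimated gamma of 0.61 for gains; the present study compared models of this kind but the parameter here is only illustrative:

```python
def tk_weight(p, gamma=0.61):
    """Tversky-Kahneman (1992) probability weighting function; gamma < 1
    gives an inverse-S shape: small probabilities are overweighted and
    large probabilities underweighted."""
    return p ** gamma / (p ** gamma + (1.0 - p) ** gamma) ** (1.0 / gamma)

small_overweighted = tk_weight(0.01) > 0.01   # True: w(0.01) ≈ 0.055
large_underweighted = tk_weight(0.90) < 0.90  # True
```

Underweighting of small probabilities, as observed in the feedback condition, corresponds to decision weights below the identity line, i.e., the opposite of what this descriptive weighting function predicts for small p.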

  20. Review of the use of pretest probability for molecular testing in non-small cell lung cancer and overview of new mutations that may affect clinical practice

    PubMed Central

    Martin, Petra; Leighl, Natasha B.

    2017-01-01

    This article considers the use of pretest probability in non-small cell lung cancer (NSCLC) and how its use in EGFR testing has helped establish clinical guidelines on selecting patients for EGFR testing. With an ever-increasing number of molecular abnormalities being identified and often limited tissue available for testing, the use of pretest probability will need to be increasingly considered in the future for selecting investigations and treatments in patients. In addition we review new mutations that have the potential to affect clinical practice. PMID:28607579

  1. Small-Scale Spatio-Temporal Distribution of Bactrocera minax (Enderlein) (Diptera: Tephritidae) Using Probability Kriging.

    PubMed

    Wang, S Q; Zhang, H Y; Li, Z L

    2016-10-01

    Understanding the spatio-temporal distribution of a pest in orchards can provide important information for designing monitoring schemes and establishing better means of pest control. In this study, the spatial and temporal distribution of Bactrocera minax (Enderlein) (Diptera: Tephritidae) was assessed, and activity trends were evaluated, by using probability kriging. Adults of B. minax were captured in two successive occurrences in a small-scale citrus orchard using food-bait traps placed both inside and outside the orchard. The weekly spatial distribution of B. minax within the orchard and adjacent woods was examined using semivariogram parameters. Edge concentration was observed during most weeks of adult occurrence, and the adult population aggregated with high probability within a band less than 100 m wide on both sides of the boundary between the orchard and the woods. The sequential probability-kriged maps showed that the adults were estimated with higher probability in the marginal zone, especially in the early and peak stages. The feeding, ovipositing, and mating behaviors of B. minax are possible explanations for these spatio-temporal patterns. Therefore, the spatial arrangement of traps or spraying spots, and their distance to the forest edge, should be considered to enhance control of B. minax in small-scale orchards.

  2. The gravitational law of social interaction

    NASA Astrophysics Data System (ADS)

    Levy, Moshe; Goldenberg, Jacob

    2014-01-01

    While a great deal is known about the topology of social networks, there is much less agreement about the geographical structure of these networks. The fundamental question in this context is: how does the probability of a social link between two individuals depend on the physical distance between them? While it is clear that the probability decreases with the distance, various studies have found different functional forms for this dependence. The exact form of the distance dependence has crucial implications for network searchability and dynamics: Kleinberg (2000) [15] shows that the small-world property holds if the probability of a social link is a power-law function of the distance with power -2, but not with any other power. We investigate the distance dependence of link probability empirically by analyzing four very different sets of data: Facebook links, data from the electronic version of the Small-World experiment, email messages, and data from detailed personal interviews. All four datasets reveal the same empirical regularity: the probability of a social link is proportional to the inverse of the square of the distance between the two individuals, analogously to the distance dependence of the gravitational force. Thus, it seems that social networks spontaneously converge to the exact unique distance dependence that ensures the Small-World property.
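
    The inverse-square regularity can be checked with a toy simulation (not the paper's data analysis): pairs at uniformly drawn distances are linked with probability proportional to d⁻², so realized link distances inherit a d⁻² density, and the [1, 2) band should hold about twice as many links as the [2, 4) band:

```python
import random

def simulate_links(n_pairs, d_min, d_max, alpha=2.0, seed=3):
    """Draw pair distances uniformly on [d_min, d_max] and realise a link
    with probability (d_min / d) ** alpha, i.e. proportional to d**-alpha,
    normalised so the maximum probability is 1.  Returns link distances."""
    rng = random.Random(seed)
    links = []
    for _ in range(n_pairs):
        d = rng.uniform(d_min, d_max)
        if rng.random() < (d_min / d) ** alpha:
            links.append(d)
    return links

links = simulate_links(300000, 1.0, 10.0)
near = sum(1 for d in links if d < 2.0)         # expected share ~ int_1^2 d^-2
far = sum(1 for d in links if 2.0 <= d < 4.0)   # expected share ~ int_2^4 d^-2
```

In the empirical datasets the inference runs the other way: observed link frequencies are divided by the number of available pairs at each distance to recover the d⁻² probability.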

  3. The hydrological response of a small catchment after the abandonment of terrace cultivation. A study case in northwestern Spain

    NASA Astrophysics Data System (ADS)

    Llorente-Adán, Jose A.; Lana-Renault, Noemí; Galilea, Ianire; Ruiz-Flaño, Purificacion

    2015-04-01

    The construction of terraces for cultivation completely transforms hillslopes into a series of flat sectors and almost vertical steps. This strategy, which involves a redistribution of soils and a re-organization of the drainage network, provides fertile soil over steep slopes, improves infiltration, and controls overland flow during intense rainstorms. In Camero Viejo (north-western Iberian Range) most of the hillslopes are occupied by terraced fields. During the 20th century, the rural population declined and agricultural practices were abandoned. In this area, a small catchment (1.9 km2) was monitored in 2012 to study how the abandonment of agricultural terraces affects water and sediment transfer from the hillslopes to the channels. Terraces occupy 40% of the catchment and are covered by sparse grass and shrubs. The equipment installed in the catchment continuously registers meteorological data, discharge, and water table fluctuations. Data on suspended sediment transport are obtained by means of a rising-stage sampler. Here we present the hydrological results for the years 2012-13 and 2013-14. The hydrological response of the catchment was moderate (annual runoff coefficient < 0.20), which could be partly explained by the high evapotranspiration rates reported in the area. Low flows were recorded in summer and autumn, when the water reserves of the catchment were depleted, and high flows occurred from January onward, when the catchment became wetter. The shape of the hydrographs, with slow response times, moderate peak flows and long recession limbs, suggests a large contribution of subsurface flow, probably favored by deep and well-structured soils in the bench terraces. Soil saturation areas were not observed during the study period, suggesting that soil infiltration processes and subsurface flow are important and that the drainage system of the terraces is probably well maintained. 
No suspended sediment has been collected so far, supporting the hypothesis that subsurface flow might be a dominant runoff generation process.

  4. Small white matter lesion detection in cerebral small vessel disease

    NASA Astrophysics Data System (ADS)

    Ghafoorian, Mohsen; Karssemeijer, Nico; van Uden, Inge; de Leeuw, Frank E.; Heskes, Tom; Marchiori, Elena; Platel, Bram

    2015-03-01

    Cerebral small vessel disease (SVD) is a common finding on magnetic resonance images of elderly people. White matter lesions (WML) are important markers not only for small vessel disease, but also for neurodegenerative diseases including multiple sclerosis, Alzheimer's disease and vascular dementia. Volumetric measurements such as the "total lesion load" have been studied and related to these diseases. With respect to SVD, we conjecture that small lesions are important, as they have been observed to grow over time and they form the majority of lesions in number. To study these small lesions they need to be annotated, which is a complex and time-consuming task. Existing (semi-)automatic methods have been aimed at volumetric measurements and large lesions, and are not suitable for the detection of small lesions. In this research we established a supervised voxel classification CAD system, optimized and trained to exclusively detect small WMLs. To achieve this, several preprocessing steps were taken, including a robust standardization of subject intensities to reduce inter-subject intensity variability as much as possible. A number of features found to identify small lesions well were calculated, including multimodal intensities, tissue probabilities, several features for accurate location description, a number of second-order derivative features, as well as a multi-scale annular filter for blobness detection. Only small lesions were used to learn the target concept via AdaBoost with random forests as its base classifiers. Finally, the results were evaluated using free-response receiver operating characteristic (FROC) analysis.

  5. The influence of intestinal infusion of fats on small intestinal motility and digesta transit in pigs.

    PubMed Central

    Gregory, P C; Rayner, V; Wenham, G

    1986-01-01

    The influence of duodenal and ileal infusion of nutrients on small intestinal transit of digesta, measured by the passage of phenol red marker, was studied in twelve pigs fitted with duodenal and ileal catheters, and a terminal ileal cannula. Changes in gastrointestinal motility were observed by electromyography and by use of an X-ray image intensifier in four of the pigs fitted additionally with nichrome wire electrodes in the gut wall and in seven pigs fitted only with a gastric catheter. Small intestinal transit time was unaffected by intestinal catheterization per se, or by duodenal or ileal infusion of glucose or peptone. It was reduced by duodenal infusion of fat or of some of the products of fat digestion including oleic acid and a monoglyceride containing unsaturated fatty acids (monoglyceride LS) but was not affected by infusion of glycerol, stearic acid or a monoglyceride containing saturated fatty acids (monoglyceride P). Ileal transit time was greatly reduced by ileal infusion of soya bean oil mixed with bile salts and lipase and by monoglyceride LS but not by soya bean oil alone. Total small intestinal transit time was reduced to a lesser degree by ileal infusion of soya bean oil mixed with bile salts and lipase and by monoglyceride LS and was unaffected by soya bean oil alone. The level of irregular spiking activity of the small intestine was greatly reduced by both duodenal and ileal infusion of fat, but rapidly propagated spike bursts were initiated from the point of infusion (identified radiologically as peristaltic rushes) many of which travelled right through to the ileo-caecal junction. 
It is concluded that intestinal infusion of fat accelerates small intestinal transit in pigs by induction of peristaltic rushes; that since the ileal transit times were more severely reduced than total small intestinal transit times by ileal infusion of fat, the response is probably only seen over those areas of intestine in direct contact with the fat; and that the effect depends upon the presence of fat digestion products, i.e. the fatty acid and the monoglyceride, although probably only those containing unsaturated fatty acids. PMID:3559994

  6. Computing Earthquake Probabilities on Global Scales

    NASA Astrophysics Data System (ADS)

    Holliday, James R.; Graves, William R.; Rundle, John B.; Turcotte, Donald L.

    2016-03-01

    Large events in systems such as earthquakes, typhoons, market crashes, electricity grid blackouts, floods, droughts, wars and conflicts, and landslides can be both unexpected and devastating. Events in many of these systems display frequency-size statistics that are power laws. Previously, we presented a new method for calculating probabilities for large events in systems such as these. This method counts the number of small events since the last large event and then converts this count into a probability by using a Weibull probability law. We applied this method to the calculation of large earthquake probabilities in California-Nevada, USA. In that study, we considered a fixed geographic region and assumed that all earthquakes within that region, large magnitudes as well as small, were perfectly correlated. In the present article, we extend this model to systems in which the events have a finite correlation length. We modify our previous results by employing the correlation function for near-mean-field systems having long-range interactions, an example being earthquake fault systems with their long-range elastic interactions. We then construct an application of the method and show examples of computed earthquake probabilities.
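The count-to-probability conversion described in this abstract can be sketched as follows. This is a minimal illustration of the idea only: the Weibull shape and scale parameters here are hypothetical stand-ins for values that would be fitted to a regional earthquake catalog, not the authors' fitted values.

```python
import math

def large_event_probability(n_small, scale, shape):
    """Convert the count of small events since the last large event
    into a conditional probability of a large event, using a Weibull
    cumulative distribution: F(n) = 1 - exp(-(n/scale)^shape).
    scale and shape are illustrative, not fitted, parameters."""
    return 1.0 - math.exp(-((n_small / scale) ** shape))

# Example: 120 small events observed since the last large one,
# with hypothetical scale=100 and shape=1.5
p = large_event_probability(120, scale=100.0, shape=1.5)
```

The count acts as a "natural time" clock: the probability rises monotonically from 0 toward 1 as small events accumulate without a large one occurring.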

  7. The Probability of Exceedance as a Nonparametric Person-Fit Statistic for Tests of Moderate Length

    ERIC Educational Resources Information Center

    Tendeiro, Jorge N.; Meijer, Rob R.

    2013-01-01

    To classify an item score pattern as not fitting a nonparametric item response theory (NIRT) model, the probability of exceedance (PE) of an observed response vector x can be determined as the sum of the probabilities of all response vectors that are, at most, as likely as x, conditional on the test's total score. Vector x is to be considered…

  8. Effects of partial time delays on phase synchronization in Watts-Strogatz small-world neuronal networks

    NASA Astrophysics Data System (ADS)

    Sun, Xiaojuan; Perc, Matjaž; Kurths, Jürgen

    2017-05-01

    In this paper, we study effects of partial time delays on phase synchronization in Watts-Strogatz small-world neuronal networks. Our focus is on the impact of two parameters, namely the time delay τ and the probability of partial time delay pdelay, whereby the latter determines the probability with which a connection between two neurons is delayed. Our research reveals that partial time delays significantly affect phase synchronization in this system. In particular, partial time delays can either enhance or decrease phase synchronization and induce synchronization transitions with changes in the mean firing rate of neurons, as well as induce switching between synchronized neurons with period-1 firing and synchronized neurons with period-2 firing. Moreover, in comparison to a neuronal network where all connections are delayed, we show that small partial time delay probabilities influence phase synchronization in markedly different ways.
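The network setup described above, a Watts-Strogatz topology in which each connection independently carries a delay τ with probability pdelay, can be sketched with the standard library alone. This is an illustrative construction, not the authors' simulation code, and the parameter values are arbitrary.

```python
import random

def watts_strogatz(n, k, p_rewire, seed=0):
    """Ring lattice of n nodes, each linked to its k nearest
    neighbours, with each edge rewired with probability p_rewire."""
    rng = random.Random(seed)
    edges = set()
    for i in range(n):
        for j in range(1, k // 2 + 1):
            edges.add((i, (i + j) % n))
    rewired = set()
    for (u, v) in edges:
        if rng.random() < p_rewire:
            w = rng.randrange(n)
            while w == u or (u, w) in rewired or (w, u) in rewired:
                w = rng.randrange(n)
            rewired.add((u, w))
        else:
            rewired.add((u, v))
    return rewired

def assign_partial_delays(edges, tau, p_delay, seed=1):
    """Each connection is delayed by tau with probability p_delay,
    otherwise it transmits instantaneously (delay 0)."""
    rng = random.Random(seed)
    return {e: (tau if rng.random() < p_delay else 0.0) for e in edges}

net = watts_strogatz(n=100, k=4, p_rewire=0.1)
delays = assign_partial_delays(net, tau=5.0, p_delay=0.3)
```

Setting p_delay = 1 recovers the fully delayed network used as the comparison case in the abstract; p_delay = 0 gives an undelayed network.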

  9. Effects of partial time delays on phase synchronization in Watts-Strogatz small-world neuronal networks.

    PubMed

    Sun, Xiaojuan; Perc, Matjaž; Kurths, Jürgen

    2017-05-01

    In this paper, we study effects of partial time delays on phase synchronization in Watts-Strogatz small-world neuronal networks. Our focus is on the impact of two parameters, namely the time delay τ and the probability of partial time delay pdelay, whereby the latter determines the probability with which a connection between two neurons is delayed. Our research reveals that partial time delays significantly affect phase synchronization in this system. In particular, partial time delays can either enhance or decrease phase synchronization and induce synchronization transitions with changes in the mean firing rate of neurons, as well as induce switching between synchronized neurons with period-1 firing and synchronized neurons with period-2 firing. Moreover, in comparison to a neuronal network where all connections are delayed, we show that small partial time delay probabilities influence phase synchronization in markedly different ways.

  10. Size determines antennal sensitivity and behavioral threshold to odors in bumblebee workers

    NASA Astrophysics Data System (ADS)

    Spaethe, Johannes; Brockmann, Axel; Halbig, Christine; Tautz, Jürgen

    2007-09-01

    The eusocial bumblebees exhibit pronounced size variation among workers of the same colony. Differently sized workers engage in different tasks (alloethism); large individuals have a higher probability of leaving the colony to search for food, whereas small workers tend to stay inside the nest and attend to nest duties. We investigated the effect of size variation on the morphology and physiology of the peripheral olfactory system and the behavioral response thresholds to odors in workers of Bombus terrestris. The number and density of olfactory sensilla on the antennae correlate significantly with worker size. Consistent with these morphological changes, we found that antennal sensitivity to odors increases with body size. Antennae of large individuals show higher electroantennogram responses to a given odor concentration than those of smaller nestmates. This finding indicates that large antennae have an increased capability to catch odor molecules and thus are more sensitive to odors than small antennae. We confirmed this prediction in a dual-choice behavioral experiment showing that large workers are indeed able to respond correctly to much lower odor concentrations than small workers. Learning performance in these experiments did not differ between small and large bumblebees. Our results clearly show that, in the social bumblebees, variation in olfactory sensilla number due to size differences among workers strongly affects individual odor sensitivity. We speculate that the superior odor sensitivity of large workers has favored size-related division of labor in bumblebee colonies.

  11. Assessing the Probability that a Finding Is Genuine for Large-Scale Genetic Association Studies

    PubMed Central

    Kuo, Chia-Ling; Vsevolozhskaya, Olga A.; Zaykin, Dmitri V.

    2015-01-01

    Genetic association studies routinely involve massive numbers of statistical tests accompanied by P-values. Whole genome sequencing technologies have increased the potential number of tested variants to tens of millions. The more tests are performed, the smaller the P-value required to deem a result significant. However, a small P-value is not equivalent to a small chance of a spurious finding, and significance thresholds may fail to serve as efficient filters against false results. While the Bayesian approach can provide a direct assessment of the probability that a finding is spurious, its adoption in association studies has been slow, due in part to the ubiquity of P-values and the automated way they are, as a rule, produced by software packages. Attempts to design simple ways to convert an association P-value into the probability that a finding is spurious have been met with difficulties. The False Positive Report Probability (FPRP) method has gained increasing popularity. However, FPRP is not designed to estimate the probability for a particular finding, because it is defined for an entire region of hypothetical findings with P-values at least as small as the one observed for that finding. Here we propose a method that lets researchers extract the probability that a finding is spurious directly from a P-value. Considering the counterpart of that probability, we term this method POFIG: the Probability that a Finding is Genuine. Our approach shares FPRP's simplicity, but gives a valid probability that a finding is spurious given a P-value. In addition to its straightforward interpretation, POFIG has desirable statistical properties. The POFIG average across a set of tentative associations provides an estimated proportion of false discoveries in that set. POFIGs are easily combined across studies and are immune to multiple testing and selection bias. We illustrate an application of the POFIG method via an analysis of GWAS associations with Crohn's disease. PMID:25955023
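For context on the FPRP baseline that this abstract critiques: FPRP combines a significance level, study power, and a prior probability that the association is genuine. A minimal sketch of that standard calculation follows (this is the FPRP idea in the style of Wacholder et al., not the POFIG method proposed in the abstract, and the numbers are purely illustrative).

```python
def fprp(alpha, power, prior):
    """False Positive Report Probability: the probability that a
    'significant' finding is spurious, given significance level
    alpha, study power, and the prior probability that the tested
    association is genuine."""
    spurious = alpha * (1.0 - prior)   # rate of false positives
    genuine = power * prior            # rate of true positives
    return spurious / (spurious + genuine)

# Illustrative numbers: alpha = 5e-8 (a common GWAS threshold),
# 80% power, and a 1-in-100,000 prior on any given variant.
print(fprp(alpha=5e-8, power=0.8, prior=1e-5))
```

The dependence on a region of hypothetical P-values ("at least as small as the one observed") enters through alpha, which is exactly the property the abstract argues makes FPRP unsuitable for assessing one particular finding.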

  12. Assessing the Probability that a Finding Is Genuine for Large-Scale Genetic Association Studies.

    PubMed

    Kuo, Chia-Ling; Vsevolozhskaya, Olga A; Zaykin, Dmitri V

    2015-01-01

    Genetic association studies routinely involve massive numbers of statistical tests accompanied by P-values. Whole genome sequencing technologies have increased the potential number of tested variants to tens of millions. The more tests are performed, the smaller the P-value required to deem a result significant. However, a small P-value is not equivalent to a small chance of a spurious finding, and significance thresholds may fail to serve as efficient filters against false results. While the Bayesian approach can provide a direct assessment of the probability that a finding is spurious, its adoption in association studies has been slow, due in part to the ubiquity of P-values and the automated way they are, as a rule, produced by software packages. Attempts to design simple ways to convert an association P-value into the probability that a finding is spurious have been met with difficulties. The False Positive Report Probability (FPRP) method has gained increasing popularity. However, FPRP is not designed to estimate the probability for a particular finding, because it is defined for an entire region of hypothetical findings with P-values at least as small as the one observed for that finding. Here we propose a method that lets researchers extract the probability that a finding is spurious directly from a P-value. Considering the counterpart of that probability, we term this method POFIG: the Probability that a Finding is Genuine. Our approach shares FPRP's simplicity, but gives a valid probability that a finding is spurious given a P-value. In addition to its straightforward interpretation, POFIG has desirable statistical properties. The POFIG average across a set of tentative associations provides an estimated proportion of false discoveries in that set. POFIGs are easily combined across studies and are immune to multiple testing and selection bias. We illustrate an application of the POFIG method via an analysis of GWAS associations with Crohn's disease.

  13. A risk assessment method for multi-site damage

    NASA Astrophysics Data System (ADS)

    Millwater, Harry Russell, Jr.

    This research focused on developing probabilistic methods suitable for computing small probabilities of failure, e.g., 10^-6, of structures subject to multi-site damage (MSD). MSD is defined as the simultaneous development of fatigue cracks at multiple sites in the same structural element such that the fatigue cracks may coalesce to form one large crack. MSD is modeled as an array of collinear cracks with random initial crack lengths with the centers of the initial cracks spaced uniformly apart. The data used was chosen to be representative of aluminum structures. The structure is considered failed whenever any two adjacent cracks link up. A fatigue computer model is developed that can accurately and efficiently grow a collinear array of arbitrary length cracks from initial size until failure. An algorithm is developed to compute the stress intensity factors of all cracks considering all interaction effects. The probability of failure of two to 100 cracks is studied. Lower bounds on the probability of failure are developed based upon the probability of the largest crack exceeding a critical crack size. The critical crack size is based on the initial crack size that will grow across the ligament when the neighboring crack has zero length. The probability is evaluated using extreme value theory. An upper bound is based on the probability of the maximum sum of initial cracks being greater than a critical crack size. A weakest link sampling approach is developed that can accurately and efficiently compute small probabilities of failure. This methodology is based on predicting the weakest link, i.e., the two cracks to link up first, for a realization of initial crack sizes, and computing the cycles-to-failure using these two cracks. Criteria to determine the weakest link are discussed. Probability results using the weakest link sampling method are compared to Monte Carlo-based benchmark results. 
The results indicate that very small probabilities can be computed accurately in a few minutes using a Hewlett-Packard workstation.
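The extreme-value lower bound described above, the probability that the largest of n i.i.d. initial cracks exceeds a critical size, has the closed form P = 1 - F(c)^n. A minimal sketch follows, checked against Monte Carlo; the lognormal crack-size distribution and its parameters are illustrative assumptions, not the study's aluminum data.

```python
import math
import random

def lower_bound_failure_prob(n_cracks, crit, mu, sigma):
    """Lower bound on failure probability: the chance that the largest
    of n i.i.d. lognormal initial crack sizes exceeds the critical
    size crit, i.e. P = 1 - F(crit)^n (illustrative crack-size model)."""
    # Lognormal CDF via the error function
    F = 0.5 * (1.0 + math.erf((math.log(crit) - mu) / (sigma * math.sqrt(2.0))))
    return 1.0 - F ** n_cracks

def monte_carlo_failure_prob(n_cracks, crit, mu, sigma, trials=20000, seed=0):
    """Brute-force check: sample n crack sizes per trial and count
    how often the largest exceeds the critical size."""
    rng = random.Random(seed)
    fails = 0
    for _ in range(trials):
        largest = max(rng.lognormvariate(mu, sigma) for _ in range(n_cracks))
        if largest > crit:
            fails += 1
    return fails / trials

analytic = lower_bound_failure_prob(50, crit=0.25, mu=-3.0, sigma=0.5)
simulated = monte_carlo_failure_prob(50, crit=0.25, mu=-3.0, sigma=0.5)
```

For genuinely small probabilities such as 10^-6, plain Monte Carlo would need far more trials than this, which is precisely why the study develops the analytic bounds and the weakest-link sampling approach.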

  14. Destroying a Craton by Plate Subduction, Small-scale Convection, and Mantle Plume: Comparison of the Wyoming Craton and the North China Craton

    NASA Astrophysics Data System (ADS)

    Li, A.; Dave, R.

    2016-12-01

    A typical craton has a thick, strong, and neutrally buoyant lithosphere that protects it from being destroyed by mantle convection. The Wyoming craton and the North China craton are two rare representatives, where the thick Archean lithosphere has been significantly thinned and partially removed, as revealed in seismic tomography models. The Wyoming craton in the west-central US experienced pervasive deformation 80-55 Ma during the Laramide orogeny. It has been subsequently encroached upon by the Yellowstone hotspot since 2.0 Ma. Recent seismic models agree that the northern cratonic root in eastern Montana has been broadly removed, while the thick root is still present in Wyoming. Our radial anisotropy model images a VSV>VSH anomaly associated with the deep fast anomaly in central Wyoming, indicating mantle downwelling. Continuous low velocities are observed beneath the Yellowstone hotspot and the Cheyenne belt at the craton's southern margin, suggesting mantle upwelling in the sub-lithospheric mantle. These observations provide evidence for small-scale mantle convection beneath the south-central Wyoming craton, which probably has been actively eroding the cratonic lithosphere. Small-scale mantle convection is probably also responsible for the observed, localized lithosphere delamination beneath the eastern North China craton. In addition, a plume-like, low-velocity feature is imaged beneath the central block of the North China craton and is suggested as the driving force for destroying the cratonic root. Like the Wyoming craton, beneath which the Farallon plate subducted during the Laramide orogeny, the North China craton was underlain by the ancient Pacific plate before the root destruction in the Late Jurassic. In both cases, the subducted slab helped to hydrate and weaken the cratonic lithosphere above it, initiate local metasomatism and partial melting, and promote small-scale convection. The craton's interaction with a mantle plume could further strengthen the small-scale convection and lead to massive destruction of the craton.

  15. Detection of white matter lesions in cerebral small vessel disease

    NASA Astrophysics Data System (ADS)

    Riad, Medhat M.; Platel, Bram; de Leeuw, Frank-Erik; Karssemeijer, Nico

    2013-02-01

    White matter lesions (WML) are diffuse white matter abnormalities commonly found in older subjects and are important indicators of stroke, multiple sclerosis, dementia and other disorders. We present an automated WML detection method and evaluate it on a dataset of small vessel disease (SVD) patients. In early SVD, small WMLs are expected to be of importance for the prediction of disease progression. Commonly used WML segmentation methods tend to ignore small WMLs and are mostly validated on the basis of total lesion load or a Dice coefficient for all detected WMLs. Therefore, in this paper, we present a method that is designed to detect individual lesions, large or small, and we validate the detection performance of our system with FROC (free-response ROC) analysis. For the automated detection, we use supervised classification making use of multimodal voxel based features from different magnetic resonance imaging (MRI) sequences, including intensities, tissue probabilities, voxel locations and distances, neighborhood textures and others. After preprocessing, including co-registration, brain extraction, bias correction, intensity normalization, and nonlinear registration, ventricle segmentation is performed and features are calculated for each brain voxel. A gentle-boost classifier is trained using these features from 50 manually annotated subjects to give each voxel a probability of being a lesion voxel. We perform ROC analysis to illustrate the benefits of using additional features to the commonly used voxel intensities; significantly increasing the area under the curve (Az) from 0.81 to 0.96 (p<0.05). We perform the FROC analysis by testing our classifier on 50 previously unseen subjects and compare the results with manual annotations performed by two experts. 
Using the first annotator results as our reference, the second annotator performs at a sensitivity of 0.90 with an average of 41 false positives per subject while our automated method reached the same level of sensitivity at approximately 180 false positives per subject.

  16. Neural Encoding and Integration of Learned Probabilistic Sequences in Avian Sensory-Motor Circuitry

    PubMed Central

    Brainard, Michael S.

    2013-01-01

    Many complex behaviors, such as human speech and birdsong, reflect a set of categorical actions that can be flexibly organized into variable sequences. However, little is known about how the brain encodes the probabilities of such sequences. Behavioral sequences are typically characterized by the probability of transitioning from a given action to any subsequent action (which we term “divergence probability”). In contrast, we hypothesized that neural circuits might encode the probability of transitioning to a given action from any preceding action (which we term “convergence probability”). The convergence probability of repeatedly experienced sequences could naturally become encoded by Hebbian plasticity operating on the patterns of neural activity associated with those sequences. To determine whether convergence probability is encoded in the nervous system, we investigated how auditory-motor neurons in vocal premotor nucleus HVC of songbirds encode different probabilistic characterizations of produced syllable sequences. We recorded responses to auditory playback of pseudorandomly sequenced syllables from the bird's repertoire, and found that variations in responses to a given syllable could be explained by a positive linear dependence on the convergence probability of preceding sequences. Furthermore, convergence probability accounted for more response variation than other probabilistic characterizations, including divergence probability. Finally, we found that responses integrated over >7–10 syllables (∼700–1000 ms) with the sign, gain, and temporal extent of integration depending on convergence probability. Our results demonstrate that convergence probability is encoded in sensory-motor circuitry of the song-system, and suggest that encoding of convergence probability is a general feature of sensory-motor circuits. PMID:24198363
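The two probabilistic characterizations defined in this abstract, divergence probability P(next action | current action) and convergence probability P(current action | next action), can both be computed from transition counts over an observed sequence. A minimal sketch (the toy syllable sequence is invented for illustration):

```python
from collections import Counter

def transition_probabilities(sequence):
    """Return (divergence, convergence) probability dicts keyed by
    (a, b) pairs. Divergence: P(b | a), normalizing each pair count
    by the total transitions *out of* a. Convergence: P(a | b),
    normalizing by the total transitions *into* b."""
    pair_counts = Counter(zip(sequence, sequence[1:]))
    from_totals = Counter(a for a, _ in pair_counts.elements())
    to_totals = Counter(b for _, b in pair_counts.elements())
    divergence = {(a, b): c / from_totals[a] for (a, b), c in pair_counts.items()}
    convergence = {(a, b): c / to_totals[b] for (a, b), c in pair_counts.items()}
    return divergence, convergence

# Toy syllable sequence: 'b' always follows from 'a', but 'a' is
# followed by 'b' only half the time.
div, conv = transition_probabilities(list("abacabad"))
```

On this toy sequence the distinction is visible directly: div[("a","b")] = 0.5 (half of the transitions out of "a" go to "b"), while conv[("a","b")] = 1.0 (every arrival at "b" came from "a"), matching the abstract's point that the two measures can dissociate.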

  17. Cost effectiveness of alternative imaging strategies for the diagnosis of small-bowel Crohn's disease.

    PubMed

    Levesque, Barrett G; Cipriano, Lauren E; Chang, Steven L; Lee, Keane K; Owens, Douglas K; Garber, Alan M

    2010-03-01

    The cost effectiveness of alternative approaches to the diagnosis of small-bowel Crohn's disease is unknown. This study evaluates whether computed tomographic enterography (CTE) is a cost-effective alternative to small-bowel follow-through (SBFT) and whether capsule endoscopy is a cost-effective third test in patients in whom a high suspicion of disease remains after 2 previous negative tests. A decision-analytic model was developed to compare the lifetime costs and benefits of each diagnostic strategy. Patients with low (20%) and high (75%) pretest probability of small-bowel Crohn's disease were considered. Effectiveness was measured in quality-adjusted life-years (QALYs) gained. Parameter assumptions were tested with sensitivity analyses. With a moderate to high pretest probability of small-bowel Crohn's disease, and a higher likelihood of isolated jejunal disease, follow-up evaluation with CTE has an incremental cost-effectiveness ratio of less than $54,000 per QALY gained compared with SBFT. The addition of capsule endoscopy after ileocolonoscopy and negative CTE or SBFT costs greater than $500,000 per QALY gained in all scenarios. Results were not sensitive to the costs of tests or complications but were sensitive to test accuracies. The cost effectiveness of the strategies depends critically on the pretest probability of Crohn's disease and on whether the terminal ileum is examined at ileocolonoscopy. CTE is a cost-effective alternative to SBFT in patients with moderate to high suspicion of small-bowel Crohn's disease. The addition of capsule endoscopy as a third test is not cost-effective, even in patients with high pretest probability of disease. Copyright 2010 AGA Institute. Published by Elsevier Inc. All rights reserved.
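The dollars-per-QALY figures quoted above are incremental cost-effectiveness ratios (ICERs): the extra lifetime cost of a strategy divided by the extra QALYs it yields over the comparator. A minimal sketch with entirely hypothetical numbers (not values from the study's model):

```python
def icer(cost_new, qaly_new, cost_old, qaly_old):
    """Incremental cost-effectiveness ratio: extra dollars spent per
    extra quality-adjusted life-year gained by the new strategy."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Hypothetical illustration: a strategy costing $2,000 more that
# gains 0.04 QALYs works out to $50,000 per QALY gained.
ratio = icer(cost_new=12000.0, qaly_new=20.10, cost_old=10000.0, qaly_old=20.06)
```

A strategy is then judged cost-effective if its ICER falls below a chosen willingness-to-pay threshold, which is how thresholds like $54,000/QALY and $500,000/QALY in the abstract are interpreted.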

  18. Use of high-granularity CdZnTe pixelated detectors to correct response non-uniformities caused by defects in crystals

    DOE PAGES

    Bolotnikov, A. E.; Camarda, G. S.; Cui, Y.; ...

    2015-09-06

    Following our successful demonstration of the position-sensitive virtual Frisch-grid detectors, we investigated the feasibility of using high-granularity position sensing to correct response non-uniformities caused by crystal defects in CdZnTe (CZT) pixelated detectors. The development of high-granularity detectors able to correct response non-uniformities on a scale comparable to the size of electron clouds opens the opportunity of using unselected off-the-shelf CZT material, whilst still assuring high spectral resolution for the majority of the detectors fabricated from an ingot. Here, we present the results from testing 3D position-sensitive 15×15×10 mm³ pixelated detectors, fabricated with conventional pixel patterns with progressively smaller pixel sizes: 1.4, 0.8, and 0.5 mm. We employed a readout system based on the H3D front-end multi-channel ASIC developed by BNL's Instrumentation Division in collaboration with the University of Michigan. We use the sharing of electron clouds among several adjacent pixels to measure the locations of interaction points with sub-pixel resolution. By using the detectors with small pixel sizes and a high probability of charge-sharing events, we were able to improve their spectral resolutions in comparison to the baseline levels measured for the 1.4-mm pixel size detectors with small fractions of charge-sharing events. These results demonstrate that further enhancement of the performance of CZT pixelated detectors and reduction of costs are possible by using high spatial-resolution position information of interaction points to correct the small-scale response non-uniformities caused by crystal defects present in most devices.

  19. Applicability of the linear-quadratic formalism for modeling local tumor control probability in high dose per fraction stereotactic body radiotherapy for early stage non-small cell lung cancer.

    PubMed

    Guckenberger, Matthias; Klement, Rainer Johannes; Allgäuer, Michael; Appold, Steffen; Dieckmann, Karin; Ernst, Iris; Ganswindt, Ute; Holy, Richard; Nestle, Ursula; Nevinny-Stickel, Meinhard; Semrau, Sabine; Sterzing, Florian; Wittig, Andrea; Andratschke, Nicolaus; Flentje, Michael

    2013-10-01

    To compare the linear-quadratic (LQ) and the LQ-L formalism (linear cell survival curve beyond a threshold dose dT) for modeling local tumor control probability (TCP) in stereotactic body radiotherapy (SBRT) for stage I non-small cell lung cancer (NSCLC). This study is based on 395 patients from 13 German and Austrian centers treated with SBRT for stage I NSCLC. The median number of SBRT fractions was 3 (range 1-8) and median single fraction dose was 12.5 Gy (2.9-33 Gy); dose was prescribed to the median 65% PTV encompassing isodose (60-100%). Assuming an α/β-value of 10 Gy, we modeled TCP as a sigmoid-shaped function of the biologically effective dose (BED). Models were compared using maximum likelihood ratio tests as well as Bayes factors (BFs). There was strong evidence for a dose-response relationship in the total patient cohort (BFs>20), which was lacking in single-fraction SBRT (BFs<3). Using the PTV encompassing dose or maximum (isocentric) dose, our data indicated a LQ-L transition dose (dT) at 11 Gy (68% CI 8-14 Gy) or 22 Gy (14-42 Gy), respectively. However, the fit of the LQ-L models was not significantly better than a fit without the dT parameter (p=0.07, BF=2.1 and p=0.86, BF=0.8, respectively). Generally, isocentric doses resulted in much better dose-response relationships than PTV encompassing doses (BFs>20). Our data suggest accurate modeling of local tumor control in fractionated SBRT for stage I NSCLC with the traditional LQ formalism. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
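The biologically effective dose used in the modeling above follows the standard LQ formula, BED = n·d·(1 + d/(α/β)), with α/β = 10 Gy as in the study. A minimal sketch, where the sigmoid TCP parameters (tcd50 and k) are hypothetical illustration values, not the study's fitted dose-response parameters:

```python
import math

def bed(n_fractions, dose_per_fraction, alpha_beta=10.0):
    """Biologically effective dose under the LQ model:
    BED = n * d * (1 + d / (alpha/beta))."""
    return n_fractions * dose_per_fraction * (1.0 + dose_per_fraction / alpha_beta)

def tcp(bed_value, tcd50=80.0, k=10.0):
    """Sigmoid-shaped tumor control probability as a function of BED.
    tcd50 (BED giving 50% control) and k (slope) are hypothetical
    illustration values."""
    return 1.0 / (1.0 + math.exp(-(bed_value - tcd50) / k))

# Median schedule from the study: 3 fractions of 12.5 Gy, alpha/beta = 10 Gy
print(bed(3, 12.5))  # 3 * 12.5 * (1 + 12.5/10) = 84.375 Gy
```

The LQ-L variant discussed in the abstract would replace the quadratic term with a linear tail beyond a threshold dose dT; the study's finding is that this extra parameter did not significantly improve the fit.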

  20. Twelve- to 14-Month-Old Infants Can Predict Single-Event Probability with Large Set Sizes

    ERIC Educational Resources Information Center

    Denison, Stephanie; Xu, Fei

    2010-01-01

    Previous research has revealed that infants can reason correctly about single-event probabilities with small but not large set sizes (Bonatti, 2008; Teglas et al., 2007). The current study asks whether infants can make predictions regarding single-event probability with large set sizes using a novel procedure. Infants completed two trials: A…

  1. [Diagnostic work-up of pulmonary nodules : Management of pulmonary nodules detected with low‑dose CT screening].

    PubMed

    Wormanns, D

    2016-09-01

    Pulmonary nodules are the most frequent pathological finding in low-dose computed tomography (CT) screening for the early detection of lung cancer. Early stages of lung cancer often manifest as pulmonary nodules; however, the very commonly occurring small nodules are predominantly benign. These benign nodules are responsible for the high percentage of false positive test results in screening studies. Appropriate diagnostic algorithms are necessary to reduce false positive screening results and to improve the specificity of lung cancer screening. Such algorithms are based on the basic principles comprehensively described in this article. Firstly, the diameter of a nodule allows a differentiation between large (>8 mm), probably malignant and small (<8 mm), probably benign nodules. Secondly, some morphological features of pulmonary nodules in CT can prove their benign nature. Thirdly, growth of small nodules is the best non-invasive predictor of malignancy and is used as a trigger for further diagnostic work-up. Non-invasive testing using positron emission tomography (PET) and contrast enhancement, as well as invasive diagnostic tests (e.g. various procedures for cytological and histological diagnostics), are briefly described in this article. Different nodule morphologies on CT (e.g. solid and semisolid nodules) are associated with different biological behavior, and different follow-up algorithms are required. Currently, no obligatory algorithm reflecting the current state of knowledge is available in German-speaking countries for the management of pulmonary nodules. The main features of some international and American recommendations are briefly presented in this article, from which conclusions for daily clinical use are derived.

  2. The island rule: made to be broken?

    PubMed Central

    Meiri, Shai; Cooper, Natalie; Purvis, Andy

    2007-01-01

    The island rule is a hypothesis whereby small mammals evolve larger size on islands while large insular mammals dwarf. The rule is believed to emanate from small mammals growing larger to control more resources and enhance metabolic efficiency, while large mammals evolve smaller size to reduce resource requirements and increase reproductive output. We show that there is no evidence for the existence of the island rule when phylogenetic comparative methods are applied to a large, high-quality dataset. Rather, there are just a few clade-specific patterns: carnivores; heteromyid rodents; and artiodactyls typically evolve smaller size on islands whereas murid rodents usually grow larger. The island rule is probably an artefact of comparing distantly related groups showing clade-specific responses to insularity. Instead of a rule, size evolution on islands is likely to be governed by the biotic and abiotic characteristics of different islands, the biology of the species in question and contingency. PMID:17986433

  3. Using radiology reports to encourage evidence-based practice in the evaluation of small, incidentally detected pulmonary nodules. A preliminary study.

    PubMed

    Woloshin, Steven; Schwartz, Lisa M; Dann, Elizabeth; Black, William C

    2014-02-01

    Standard radiology report forms do not guide ordering clinicians toward evidence-based practice. To test an enhanced radiology report that estimates the probability that a pulmonary nodule is malignant and provides explicit, professional guideline recommendations. Anonymous, institutional review board-approved, internet-based survey of all clinicians with privileges at the Dartmouth-Hitchcock Medical Center comparing a standard versus an enhanced chest computed tomography report for a 65-year-old former smoker with an incidentally detected 7-mm pulmonary nodule. A total of 43% (n = 447) of 1045 eligible clinicians answered patient management questions after reading a standard and then an enhanced radiology report (which included the probability of malignancy and Fleischner Society guideline recommendations). With the enhanced report, more clinicians chose the correct management strategy (72% with enhanced versus 32% with standard report [40% difference; 95% confidence interval (CI) = 35-45%]), appropriately made fewer referrals to pulmonary for opinions or biopsy (21 vs. 41% [-20% difference; 95% CI = -25 to -16%]), ordered fewer positron emission tomography scans (3 versus 13%; -10% difference; 95% CI = -13 to -7%), and fewer computed tomography scans outside the recommended time interval (2 versus 7%; -5% difference; 95% CI = -7 to -2%). Most clinicians preferred or strongly preferred the enhanced report, and thought they had a better understanding of the nodule's significance and management. An enhanced radiology report with probability estimates for malignancy and management recommendations was associated with improved clinicians' response to incidentally detected small pulmonary nodules in an internet-based survey of clinicians at one academic medical center, and was strongly preferred. The utility of this approach should be tested next in clinical practice.

  4. Weathering the storm: hurricanes and birth outcomes.

    PubMed

    Currie, Janet; Rossin-Slater, Maya

    2013-05-01

    A growing literature suggests that stressful events in pregnancy can have negative effects on birth outcomes. Some of the estimates in this literature may be affected by small samples, omitted variables, endogenous mobility in response to disasters, and errors in the measurement of gestation, as well as by a mechanical correlation between longer gestation and the probability of having been exposed. We use millions of individual birth records to examine the effects of exposure to hurricanes during pregnancy, and the sensitivity of the estimates to these econometric problems. We find that exposure to a hurricane during pregnancy increases the probability of abnormal conditions of the newborn such as being on a ventilator more than 30 min and meconium aspiration syndrome (MAS). Although we are able to reproduce previous estimates of effects on birth weight and gestation, our results suggest that measured effects of stressful events on these outcomes are sensitive to specification and it is preferable to use more sensitive indicators of newborn health. Copyright © 2013 Elsevier B.V. All rights reserved.

  5. Risky Business: Do Native Rodents Use Habitat and Odor Cues to Manage Predation Risk in Australian Deserts?

    PubMed Central

    Spencer, Emma E.; Crowther, Mathew S.; Dickman, Christopher R.

    2014-01-01

    In open, arid environments with limited shelter there may be strong selection on small prey species to develop behaviors that facilitate predator avoidance. Here, we predicted that rodents should avoid predator odor and open habitats to reduce their probability of encounter with potential predators, and tested our predictions using a native Australian desert rodent, the spinifex hopping-mouse (Notomys alexis). We tested the foraging and movement responses of N. alexis to non-native predator (fox and cat) odor, in sheltered and open macro- and microhabitats. Rodents did not respond to predator odor, perhaps reflecting the inconsistent selection pressure that is imposed on prey species in the desert environment due to the transience of predator-presence. However, they foraged primarily in the open and moved preferentially across open sand. The results suggest that N. alexis relies on escape rather than avoidance behavior when managing predation risk, with its bipedal movement probably allowing it to exploit open environments most effectively. PMID:24587396

  6. Risky business: do native rodents use habitat and odor cues to manage predation risk in Australian deserts?

    PubMed

    Spencer, Emma E; Crowther, Mathew S; Dickman, Christopher R

    2014-01-01

    In open, arid environments with limited shelter there may be strong selection on small prey species to develop behaviors that facilitate predator avoidance. Here, we predicted that rodents should avoid predator odor and open habitats to reduce their probability of encounter with potential predators, and tested our predictions using a native Australian desert rodent, the spinifex hopping-mouse (Notomys alexis). We tested the foraging and movement responses of N. alexis to non-native predator (fox and cat) odor, in sheltered and open macro- and microhabitats. Rodents did not respond to predator odor, perhaps reflecting the inconsistent selection pressure that is imposed on prey species in the desert environment due to the transience of predator-presence. However, they foraged primarily in the open and moved preferentially across open sand. The results suggest that N. alexis relies on escape rather than avoidance behavior when managing predation risk, with its bipedal movement probably allowing it to exploit open environments most effectively.

  7. Thermodynamics and signatures of criticality in a network of neurons.

    PubMed

    Tkačik, Gašper; Mora, Thierry; Marre, Olivier; Amodei, Dario; Palmer, Stephanie E; Berry, Michael J; Bialek, William

    2015-09-15

    The activity of a neural network is defined by patterns of spiking and silence from the individual neurons. Because spikes are (relatively) sparse, patterns of activity with increasing numbers of spikes are less probable, but, with more spikes, the number of possible patterns increases. This tradeoff between probability and numerosity is mathematically equivalent to the relationship between entropy and energy in statistical physics. We construct this relationship for populations of up to N = 160 neurons in a small patch of the vertebrate retina, using a combination of direct and model-based analyses of experiments on the response of this network to naturalistic movies. We see signs of a thermodynamic limit, where the entropy per neuron approaches a smooth function of the energy per neuron as N increases. The form of this function corresponds to the distribution of activity being poised near an unusual kind of critical point. We suggest further tests of criticality, and give a brief discussion of its functional significance.
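
    The probability/numerosity tradeoff described in this abstract can be made concrete with a toy independent-neuron model (a deliberate simplification with illustrative parameters; the study's retinal data are not reproduced here): each spike pattern's energy is its negative log probability, and patterns with more spikes are individually less probable but collectively more numerous.

```python
import math

# Toy sketch (hypothetical parameters): for N independent neurons that each
# spike with probability p per time bin, any particular pattern with k spikes
# has probability p**k * (1-p)**(N-k), so its "energy" E = -ln P rises with k,
# while the number of distinct k-spike patterns, C(N, k), grows as well.
N, p = 10, 0.1  # small toy network; the study analyzed up to N = 160 neurons

rows = []
for k in range(N + 1):
    prob = p**k * (1 - p)**(N - k)   # probability of one particular k-spike pattern
    energy = -math.log(prob)         # energy of that pattern
    count = math.comb(N, k)          # numerosity: how many such patterns exist
    rows.append((k, energy, count))
```

    In this toy model the energy of a pattern increases linearly with its spike count while the pattern count grows combinatorially, which is the tradeoff the abstract maps onto entropy versus energy.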

  8. Weathering the Storm: Hurricanes and Birth Outcomes

    PubMed Central

    Currie, Janet

    2013-01-01

    A growing literature suggests that stressful events in pregnancy can have negative effects on birth outcomes. Some of the estimates in this literature may be affected by small samples, omitted variables, endogenous mobility in response to disasters, and errors in the measurement of gestation, as well as by a mechanical correlation between longer gestation and the probability of having been exposed. We use millions of individual birth records to examine the effects of exposure to hurricanes during pregnancy, and the sensitivity of the estimates to these econometric problems. We find that exposure to a hurricane during pregnancy increases the probability of abnormal conditions of the newborn such as being on a ventilator more than 30 minutes and meconium aspiration syndrome (MAS). Although we are able to reproduce previous estimates of effects on birth weight and gestation, our results suggest that measured effects of stressful events on these outcomes are sensitive to specification and it is preferable to use more sensitive indicators of newborn health. PMID:23500506

  9. Modeling the frequency-dependent detective quantum efficiency of photon-counting x-ray detectors.

    PubMed

    Stierstorfer, Karl

    2018-01-01

    To find a simple model for the frequency-dependent detective quantum efficiency (DQE) of photon-counting detectors in the low flux limit. Formulas for the spatial cross-talk, the noise power spectrum, and the DQE of a photon-counting detector working at a given threshold are derived. The parameters are probabilities for event types such as single counts in the central pixel, double counts in the central pixel and a neighboring pixel, or a single count in a neighboring pixel only. These probabilities can be derived in a simple model by extensive use of Monte Carlo techniques: the Monte Carlo x-ray propagation program MOCASSIM is used to simulate the energy deposition from the x-rays in the detector material. A simple charge cloud model using Gaussian clouds of fixed width is used for the propagation of the electric charge generated by the primary interactions. Both stages are combined in a Monte Carlo simulation randomizing the location of impact, which finally produces the required probabilities. The parameters of the charge cloud model are fitted to the spectral response to a polychromatic spectrum measured with our prototype detector. Based on the Monte Carlo model, the DQE of photon-counting detectors as a function of spatial frequency is calculated for various pixel sizes, photon energies, and thresholds. The frequency-dependent DQE of a photon-counting detector in the low flux limit can be described with an equation containing only a small set of probabilities as input. Estimates for the probabilities can be derived from a simple model of the detector physics. © 2017 American Association of Physicists in Medicine.
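
    The event-type probabilities that feed such a DQE model can be illustrated with a one-dimensional Monte Carlo sketch (the pixel width, cloud width, and threshold below are illustrative assumptions, not the paper's MOCASSIM model):

```python
import math, random

# Hedged 1-D sketch: a photon lands at position x inside a pixel of width w;
# its charge cloud is a Gaussian of standard deviation sigma. The fraction of
# charge spilling past a pixel border at distance d is Phi(-d/sigma), with Phi
# the standard normal CDF. An event is counted in a pixel when that pixel
# collects more than `threshold` of the total charge.
def event_probabilities(w=0.5, sigma=0.05, threshold=0.3, n=100_000, seed=1):
    rng = random.Random(seed)
    phi = lambda z: 0.5 * (1 + math.erf(z / math.sqrt(2)))  # standard normal CDF
    single = double = neighbor_only = 0
    for _ in range(n):
        x = rng.uniform(0, w)                 # impact position within the pixel
        right = phi(-(w - x) / sigma)         # charge share in the right neighbor
        left = phi(-x / sigma)                # charge share in the left neighbor
        center = 1 - left - right
        hits = [f > threshold for f in (center, left, right)]
        if hits[0] and not (hits[1] or hits[2]):
            single += 1                       # count only in the central pixel
        elif hits[0]:
            double += 1                       # central pixel plus a neighbor
        elif hits[1] or hits[2]:
            neighbor_only += 1                # count in a neighbor only
    return single / n, double / n, neighbor_only / n
```

    Shrinking the pixel width toward the cloud width raises the charge-sharing probabilities, mirroring the abstract's comparison of 1.4, 0.8, and 0.5 mm pixels.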

  10. Phase synchronization of bursting neurons in clustered small-world networks

    NASA Astrophysics Data System (ADS)

    Batista, C. A. S.; Lameu, E. L.; Batista, A. M.; Lopes, S. R.; Pereira, T.; Zamora-López, G.; Kurths, J.; Viana, R. L.

    2012-07-01

    We investigate the collective dynamics of bursting neurons on clustered networks. The clustered network model is composed of subnetworks, each of them presenting the so-called small-world property. This model can also be regarded as a network of networks. In each subnetwork a neuron is connected to other ones with regular as well as random connections, the latter with a given intracluster probability. Moreover, in a given subnetwork each neuron has an intercluster probability to be connected to the other subnetworks. The local neuron dynamics has two time scales (fast and slow) and is modeled by a two-dimensional map. In such a small-world network the neuron parameters are chosen to be slightly different such that, if the coupling strength is large enough, there may be synchronization of the bursting (slow) activity. We give bounds for the critical coupling strength to obtain global burst synchronization in terms of the network structure, that is, the probabilities of intracluster and intercluster connections. We find that, as the heterogeneity in the network is reduced, the network global synchronizability is improved. We show that the transitions to global synchrony may be abrupt or smooth depending on the intercluster probability.
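
    The abstract does not name its two-dimensional fast-slow map; the Rulkov map is a standard choice for bursting neurons in this literature and is used here as an assumed stand-in, with illustrative parameters in the bursting regime:

```python
# Hedged sketch of a single uncoupled bursting unit (assumed Rulkov map, not
# necessarily the paper's exact map): x is the fast, voltage-like variable and
# y the slow variable; mu << 1 separates the two time scales.
def rulkov(alpha=4.1, mu=0.001, x0=-1.0, y0=-2.9, steps=20_000):
    """Iterate x_{n+1} = alpha/(1 + x_n^2) + y_n, y_{n+1} = y_n - mu*(x_n + 1)."""
    x, y = x0, y0
    xs = []
    for _ in range(steps):
        x, y = alpha / (1 + x * x) + y, y - mu * (x + 1)
        xs.append(x)
    return xs
```

    The trajectory alternates between fast spiking (x rising above 0) and quiescence (x resting near -2), the slow bursting activity whose synchronization the study analyzes across clusters.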

  11. Dynamic Response of an Optomechanical System to a Stationary Random Excitation in the Time Domain

    DOE PAGES

    Palmer, Jeremy A.; Paez, Thomas L.

    2011-01-01

    Modern electro-optical instruments are typically designed with assemblies of optomechanical members that support optics such that alignment is maintained in service environments that include random vibration loads. This paper presents a nonlinear numerical analysis that calculates statistics for the peak lateral response of optics in an optomechanical sub-assembly subject to random excitation of the housing. The work is unique in that the prior art does not address the peak response probability distribution for stationary random vibration in the time domain for a common lens-retainer-housing system with Coulomb damping. Analytical results are validated by using displacement response data from random vibration testing of representative prototype sub-assemblies. A comparison of predictions to experimental results yields reasonable agreement. The Type I Asymptotic form provides the cumulative distribution function for peak response probabilities. Probabilities are calculated for actual lens centration tolerances. The probability that peak response will not exceed the centration tolerance is greater than 80% for prototype configurations where the tolerance is high (on the order of 30 micrometers). Conversely, the probability is low for those where the tolerance is less than 20 micrometers. The analysis suggests a design paradigm based on the influence of lateral stiffness on the magnitude of the response.
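
    The Type I Asymptotic (Gumbel) cumulative distribution function has a closed form, so the probability that the peak response stays within a centration tolerance is direct to evaluate; the location and scale parameters below are illustrative guesses, not the paper's fitted values:

```python
import math

# Gumbel (Type I Asymptotic) CDF: F(x) = exp(-exp(-(x - mu)/beta)).
# mu (location) and beta (scale) are illustrative, in micrometers.
def gumbel_cdf(x, mu, beta):
    return math.exp(-math.exp(-(x - mu) / beta))

# Probability that the peak lateral response does not exceed a tolerance:
p_loose = gumbel_cdf(30.0, mu=20.0, beta=4.0)  # generous 30 um tolerance
p_tight = gumbel_cdf(15.0, mu=20.0, beta=4.0)  # tight 15 um tolerance
```

    With these illustrative parameters the generous tolerance is met with high probability while the tight one is not, the same qualitative split the abstract reports around the 20-30 micrometer range.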

  12. Probability-based hazard avoidance guidance for planetary landing

    NASA Astrophysics Data System (ADS)

    Yuan, Xu; Yu, Zhengshi; Cui, Pingyuan; Xu, Rui; Zhu, Shengying; Cao, Menglong; Luan, Enjie

    2018-03-01

    Future landing and sample return missions on planets and small bodies will seek landing sites with high scientific value, which may be located in hazardous terrains. Autonomous landing in such hazardous terrains and highly uncertain planetary environments is particularly challenging. Onboard hazard avoidance ability is indispensable, and the algorithms must be robust to uncertainties. In this paper, a novel probability-based hazard avoidance guidance method is developed for landing in hazardous terrains on planets or small bodies. By regarding the lander state as probabilistic, the proposed guidance algorithm exploits information on the uncertainty of lander position and calculates the probability of collision with each hazard. The collision probability serves as an accurate safety index, which quantifies the impact of uncertainties on the lander safety. Based on the collision probability evaluation, the state uncertainty of the lander is explicitly taken into account in the derivation of the hazard avoidance guidance law, which contributes to enhancing the robustness to the uncertain dynamics of planetary landing. The proposed probability-based method derives fully analytic expressions and does not require off-line trajectory generation. Therefore, it is appropriate for real-time implementation. The performance of the probability-based guidance law is investigated via a set of simulations, and the effectiveness and robustness under uncertainties are demonstrated.
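
    The collision-probability safety index can be sketched minimally by assuming a 2-D Gaussian position uncertainty and a disc-shaped hazard (the paper derives fully analytic expressions; this illustrative sketch uses Monte Carlo instead):

```python
import math, random

# Hedged sketch: probability that the lander's horizontal position (2-D
# Gaussian with standard deviation sigma per axis) falls inside a hazard disc,
# estimated by Monte Carlo sampling.
def collision_probability(mean, sigma, hazard_center, hazard_radius,
                          n=200_000, seed=7):
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        x = rng.gauss(mean[0], sigma)
        y = rng.gauss(mean[1], sigma)
        if math.hypot(x - hazard_center[0], y - hazard_center[1]) <= hazard_radius:
            hits += 1
    return hits / n
```

    A guidance law of the kind described would steer the nominal trajectory so that this index stays below a chosen threshold for every mapped hazard.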

  13. Renal Effects of Long Term Administration of Triamcinolone Acetonide in Normal Dogs

    PubMed Central

    Osbaldiston, G. W.

    1971-01-01

    Triamcinolone acetonide was administered in excessive dosage to dogs to study the renal mechanism responsible for polyuria which is a clinically undesirable side effect of long term glucocorticoid therapy. Polyuria occurred coincident with a significant increase in urinary solute output. Although continuous administration of triamcinolone acetonide at 0.1 or 0.2 mg/lb/day caused a small but significant increase in creatinine output, the primary mechanism for the polyuria was increased solute excretion. Associated with the polyuria was pronounced hyperphagia and polydipsia. The cause of the hyperphagia was not established. The increase in electrolyte excretion caused by this synthetic steroid was probably compensated for by the hyperphagia. Because all the dogs showed muscle weakness and loss of body condition, it is likely that alteration in protein and amino acid metabolism was responsible for the hyperphagia. PMID:4251411

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bolotnikov, A. E.; Camarda, G. S.; Cui, Y.

    Following our successful demonstration of the position-sensitive virtual Frisch-grid detectors, we investigated the feasibility of using high-granularity position sensing to correct response non-uniformities caused by the crystal defects in CdZnTe (CZT) pixelated detectors. The development of high-granularity detectors able to correct response non-uniformities on a scale comparable to the size of electron clouds opens the opportunity of using unselected off-the-shelf CZT material, whilst still assuring high spectral resolution for the majority of the detectors fabricated from an ingot. Here, we present the results from testing 3D position-sensitive 15×15×10 mm³ pixelated detectors, fabricated with conventional pixel patterns with progressively smaller pixel sizes: 1.4, 0.8, and 0.5 mm. We employed the readout system based on the H3D front-end multi-channel ASIC developed by BNL's Instrumentation Division in collaboration with the University of Michigan. We use the sharing of electron clouds among several adjacent pixels to measure locations of interaction points with sub-pixel resolution. By using the detectors with small pixel sizes and a high probability of charge-sharing events, we were able to improve their spectral resolutions in comparison to the baseline levels, measured for the 1.4-mm pixel size detectors with small fractions of charge-sharing events. These results demonstrate that further enhancement of the performance of CZT pixelated detectors and reduction of costs are possible by using high spatial-resolution position information of interaction points to correct the small-scale response non-uniformities caused by crystal defects present in most devices.

  15. Molecular analyses of the principal components of response strength.

    PubMed Central

    Killeen, Peter R; Hall, Scott S; Reilly, Mark P; Kettle, Lauren C

    2002-01-01

    Killeen and Hall (2001) showed that a common factor called strength underlies the key dependent variables of response probability, latency, and rate, and that overall response rate is a good predictor of strength. In a search for the mechanisms that underlie those correlations, this article shows that (a) the probability of responding on a trial is a two-state Markov process; (b) latency and rate of responding can be described in terms of the probability and period of stochastic machines called clocked Bernoulli modules, and (c) one such machine, the refractory Poisson process, provides a functional relation between the probability of observing a response during any epoch and the rate of responding. This relation is one of proportionality at low rates and curvilinearity at higher rates. PMID:12216975
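
    The low-rate proportionality and high-rate curvilinearity of this relation can be seen already in a plain Poisson approximation (the article's refractory Poisson process adds a dead time, omitted here for brevity):

```python
import math

# Sketch of the qualitative relation only: for a Poisson responder with rate b,
# the probability of at least one response in an epoch of length t is
# p = 1 - exp(-b*t), approximately proportional to b at low rates and bending
# over (curvilinear) at higher rates.
def p_response(rate, epoch):
    return 1 - math.exp(-rate * epoch)

low = p_response(0.1, 1.0)   # close to rate * epoch
high = p_response(5.0, 1.0)  # far below rate * epoch, near ceiling
```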

  16. Electrophoresis and spectrometric analyses of adaptation-related proteins in thermally stressed Chromobacterium violaceum.

    PubMed

    Cordeiro, I B; Castro, D P; Nogueira, P P O; Angelo, P C S; Nogueira, P A; Gonçalves, J F C; Pereira, A M R F; Garcia, J S; Souza, G H M F; Arruda, M A Z; Eberlin, M N; Astolfi-Filho, S; Andrade, E V; López-Lozano, J L

    2013-10-29

    Chromobacterium violaceum is a Gram-negative proteobacterium found in water and soil; it is widely distributed in tropical and subtropical regions, such as the Amazon rainforest. We examined protein expression changes that occur in C. violaceum at different growth temperatures using electrophoresis and mass spectrometry. The total number of spots detected was 1985; the number ranged from 99 to 380 in each assay. The proteins that were identified spectrometrically were categorized as chaperones, proteins expressed exclusively under heat stress, enzymes involved in the respiratory and fermentation cycles, ribosomal proteins, and proteins related to transport and secretion. Controlling inverted repeat of chaperone expression (CIRCE) sequences and inverted repeat DNA-binding sequences, as well as regions recognized by sigma factor 32, all elements involved in the genetic regulation of the bacterial stress response, were identified in the promoter regions of several genes coding for proteins involved in the C. violaceum stress response. We found that 30 °C is the optimal growth temperature for C. violaceum, whereas 25, 35, and 40 °C are stressful temperatures that trigger the expression of chaperones, superoxide dismutase, a probable small heat shock protein, a probable phasin, a ferrichrome-iron receptor protein, elongation factor P, and a catabolic ornithine carbamoyltransferase. This information improves our comprehension of the mechanisms involved in stress adaptation by C. violaceum.

  17. Allometric growth in juvenile marine turtles: possible role as an antipredator adaptation.

    PubMed

    Salmon, Michael; Scholl, Joshua

    2014-04-01

    Female marine turtles produce hundreds of offspring during their lifetime but few survive because small turtles have limited defenses and are vulnerable to many predators. Little is known about how small turtles improve their survival probabilities with growth though it is assumed that they do. We reared green turtles (Chelonia mydas) and loggerheads (Caretta caretta) from hatchlings to 13 weeks of age and documented that they grew wider faster than they grew longer. This pattern of allometric growth might enable small turtles to more quickly achieve protection from gape-limited predators, such as the dolphinfish (Coryphaena hippurus). As a test of that hypothesis, we measured how dolphinfish gape increased with length, reviewed the literature to determine how dolphinfish populations were size/age structured in nearby waters, and then determined the probability that a small turtle would encounter a fish large enough to consume it if it grew by allometry vs. by isometry (in which case it retained its hatchling proportions). Allometric growth more quickly reduced the probability of a lethal encounter than did isometric growth. On that basis, we suggest that allometry during early ontogeny may have evolved because it provides a survival benefit for small turtles. Copyright © 2014 Elsevier GmbH. All rights reserved.
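
    The encounter-probability argument can be sketched with hypothetical numbers: the lethal-encounter probability for a turtle of a given shell width is the fraction of the predator population whose gape exceeds that width, so faster growth in width (allometry) shrinks that probability sooner than proportional (isometric) growth.

```python
# Illustrative sketch (hypothetical gape data, arbitrary units): the lethal
# fraction is the share of dolphinfish whose gape is wider than the turtle.
def lethal_encounter_prob(turtle_width, gape_sizes):
    lethal = sum(1 for g in gape_sizes if g > turtle_width)
    return lethal / len(gape_sizes)

gapes = [3.0, 4.0, 5.0, 6.0, 7.0, 8.0]     # hypothetical population of gapes
p_narrow = lethal_encounter_prob(5.0, gapes)  # narrower, younger turtle
p_wide = lethal_encounter_prob(7.5, gapes)    # same turtle after width growth
```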

  18. Toxicological and epidemiological evidence for health risks from inhaled engine emissions.

    PubMed Central

    Mauderly, J L

    1994-01-01

    Information from toxicological and epidemiological studies of the cancer and noncancer health risks from inhaled diesel engine exhaust (DE) and gasoline engine exhaust (GE) was reviewed. The toxicological database is more extensive for DE than for GE. Animal studies have shown that heavy, chronic exposures to both DE and GE can cause lung pathology and associated physiological effects. Inhaled GE has not been shown to be carcinogenic in animals. Chronically inhaled DE at high concentrations is a pulmonary carcinogen in rats, but the response is questionable in mice and negative in Syrian hamsters. The response in rats is probably not attributable to the DE soot-associated organic compounds, as previously assumed, and the usefulness of the rat data for predicting risk in humans is uncertain. Experimental human exposures to DE show that lung inflammatory and other cellular effects can occur after single exposures, and sparse data suggest that occupational exposures might affect respiratory function and symptoms. Epidemiology suggests that heavy occupational exposures to exhaust probably increase the risks for mortality from both lung cancer and noncancer pulmonary disease. The small magnitudes of the increases in these risks make the studies very sensitive to confounding factors and uncertainties of exposure; thus, it may not be possible to resolve exposure-response relationships conclusively by epidemiology. Our present knowledge suggests that heavy occupational exposures to DE and GE are hazardous but does not allow quantitative estimates of risk with a high degree of certainty. PMID:7529701

  19. Radiobiological modeling of two stereotactic body radiotherapy schedules in patients with stage I peripheral non-small cell lung cancer.

    PubMed

    Huang, Bao-Tian; Lin, Zhu; Lin, Pei-Xian; Lu, Jia-Yang; Chen, Chuang-Zhen

    2016-06-28

    This study aims to compare the radiobiological response of two stereotactic body radiotherapy (SBRT) schedules for patients with stage I peripheral non-small cell lung cancer (NSCLC) using radiobiological modeling methods. Volumetric modulated arc therapy (VMAT)-based SBRT plans were designed using two dose schedules of 1 × 34 Gy (34 Gy in 1 fraction) and 4 × 12 Gy (48 Gy in 4 fractions) for 19 patients diagnosed with primary stage I NSCLC. Dose to the gross target volume (GTV), planning target volume (PTV), lung and chest wall (CW) were converted to biologically equivalent dose in 2 Gy fraction (EQD2) for comparison. Five different radiobiological models were employed to predict the tumor control probability (TCP) value. Three additional models were utilized to estimate the normal tissue complication probability (NTCP) value for the lung and the modified equivalent uniform dose (mEUD) value to the CW. Our result indicates that the 1 × 34 Gy dose schedule provided a higher EQD2 dose to the tumor, lung and CW. Radiobiological modeling revealed that the TCP value for the tumor, NTCP value for the lung and mEUD value for the CW were 7.4% (in absolute value), 7.2% (in absolute value) and 71.8% (in relative value) higher on average, respectively, using the 1 × 34 Gy dose schedule.
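
    The EQD2 conversion used for such comparisons follows the standard linear-quadratic formula EQD2 = D(d + α/β)/(2 + α/β), where D is total dose and d the dose per fraction; the α/β value below (10 Gy for tumor) is a conventional illustrative choice, not necessarily the study's:

```python
# Linear-quadratic EQD2: biologically equivalent dose delivered in 2 Gy
# fractions. alpha_beta is tissue-specific (10 Gy is conventional for tumor).
def eqd2(dose_per_fraction, n_fractions, alpha_beta):
    total = dose_per_fraction * n_fractions
    return total * (dose_per_fraction + alpha_beta) / (2 + alpha_beta)

tumor_1x34 = eqd2(34, 1, 10.0)  # 34 Gy in a single fraction
tumor_4x12 = eqd2(12, 4, 10.0)  # 48 Gy in 4 fractions
```

    With this conventional α/β the single-fraction schedule yields roughly 125 Gy versus 88 Gy EQD2, consistent with the abstract's finding that 1 × 34 Gy delivers the higher equivalent dose.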

  20. An information diffusion technique to assess integrated hazard risks.

    PubMed

    Huang, Chongfu; Huang, Yundong

    2018-02-01

    An integrated risk is a scene in the future associated with some adverse incident caused by multiple hazards; an integrated probability risk is the expected value of the disaster. Because assessing an integrated probability risk from a small sample is difficult, weighting methods and copulas are commonly employed to sidestep this obstacle. In this paper, we instead develop the information diffusion technique to construct a joint probability distribution and a vulnerability surface, so that an integrated risk can be assessed directly by using a small sample. A case of an integrated risk caused by flood and earthquake is given to show how the suggested technique is used to assess the integrated risk of annual property loss. Copyright © 2017 Elsevier Inc. All rights reserved.
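
    A minimal one-dimensional sketch of normal information diffusion (the Gaussian kernel bandwidth here is a plain assumption, not Huang's empirical coefficient): each observation spreads one unit of "information" over a grid of monitoring points, and averaging the normalized shares yields a probability distribution even from a very small sample.

```python
import math

# Hedged sketch: diffuse each observation over the grid with a Gaussian kernel,
# normalize each observation's shares to sum to 1, and average over the sample.
def diffuse(sample, grid, h):
    n = len(sample)
    dist = [0.0] * len(grid)
    for x in sample:
        weights = [math.exp(-((x - u) ** 2) / (2 * h * h)) for u in grid]
        total = sum(weights)
        for j, w in enumerate(weights):
            dist[j] += w / (total * n)  # each observation contributes mass 1/n
    return dist                         # sums to 1 over the grid
```

    A joint (two-hazard) version would diffuse each paired observation over a two-dimensional grid in the same way, giving the joint probability distribution the abstract combines with a vulnerability surface.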

  1. Rats’ preferences for high fructose corn syrup vs. sucrose and sugar mixtures

    PubMed Central

    Ackroff, Karen; Sclafani, Anthony

    2011-01-01

    High fructose corn syrup (HFCS) has replaced sucrose in many food products, which has prompted research comparing these two sweeteners in rodents. The present study examined the relative palatability of HFCS and sucrose for rats, offering 11% carbohydrate solutions to match the content of common beverages for human consumption. The animals initially preferred HFCS to sucrose but after separate experience with each solution they switched to sucrose preference. Approximating the composition of HFCS with a mixture of fructose and glucose (55:45) yielded a solution that was less attractive than sucrose or HFCS. However, HFCS contains a small amount of glucose polymers, which are very attractive to rats. A 55:42:3 mixture of fructose, glucose and glucose polymers (Polycose) was equally preferred to HFCS and was treated similarly to HFCS in comparisons vs. sucrose. Post-oral effects of sucrose, which is 50% fructose and 50% glucose, may be responsible for the shift in preference with experience. This shift, and the relatively small magnitude of differences in preference for HFCS and sucrose, suggest that palatability factors probably do not contribute to any possible difference in weight gain responses to these sweeteners. PMID:21236278

  2. Stretch-induced contraction in pulmonary arteries.

    PubMed

    Kulik, T J; Evans, J N; Gamble, W J

    1988-12-01

    Stretch stimulates contraction of systemic blood vessels, but the response has not been described in pulmonary vessels. To determine whether pulmonary arteries contract when stretched, isolated cylindrical segments of pulmonary arteries were suspended between two parallel wires and stretched, and the active force generated in response to stretch was measured. Eighty-nine percent of segments from small (in situ diameter less than 1,000 microns) feline pulmonary arteries contracted when stretched, and in 65% of these the magnitude of stretch was related to the magnitude of contraction. Large (in situ diameter greater than or equal to 1,000 microns) feline pulmonary arteries did not contract with stretch. Multiple, rapidly repeated stretches resulted in a diminution of active force development. Stretch-induced contraction required external Ca2+ and was abolished by diltiazem (10 μM), but it was not affected by phenoxybenzamine, phentolamine, diethylcarbamazine, or mechanical removal of endothelium. Indomethacin blunted but did not abolish stretch-induced contraction, an effect that may have been nonspecific. This study suggests that stretch can act, probably directly, on smooth muscle in small feline pulmonary arteries to elicit contraction and that it may be a determinant of pulmonary vascular tone. In addition, feline pulmonary arteries are suitable for the in vitro study of stretch-induced contraction.

  3. [PD-L1 expression: An emerging biomarker in non-small cell lung cancer].

    PubMed

    Adam, Julien; Planchard, David; Marabelle, Aurélien; Soria, Jean-Charles; Scoazec, Jean-Yves; Lantuéjoul, Sylvie

    2016-01-01

    Therapies targeting immune checkpoints, in particular programmed death 1 (PD-1) and its ligand programmed death ligand 1 (PD-L1), are major new strategies for the treatment of several malignancies including metastatic non-small cell lung cancer (NSCLC). The identification of predictive biomarkers of response is required, considering efficacy, cost and potential adverse events. Expression of PD-L1 by immunohistochemistry has been associated with higher response rate and overall survival in several clinical trials evaluating anti-PD-1 and anti-PD-L1 monoclonal antibodies. Thus, PD-L1 immunohistochemical companion assays could be required for treatment with some of these therapies in NSCLC. However, heterogeneity in methodologies of PD-L1 assays in terms of primary antibodies and scoring algorithms, and tumor heterogeneity of PD-L1 expression are important issues to be considered. More studies are required to compare the different assays, ensure their harmonization and standardization and identify the optimal conditions for testing. PD-L1 expression is likely an imperfect predictive biomarker for patient selection and association with other markers of the tumor immune microenvironment will be probably necessary in the future. Copyright © 2015 Elsevier Masson SAS. All rights reserved.

  4. Rats' preferences for high fructose corn syrup vs. sucrose and sugar mixtures.

    PubMed

    Ackroff, Karen; Sclafani, Anthony

    2011-03-28

    High fructose corn syrup (HFCS) has replaced sucrose in many food products, which has prompted research comparing these two sweeteners in rodents. The present study examined the relative palatability of HFCS and sucrose for rats, offering 11% carbohydrate solutions to match the content of common beverages for human consumption. The animals initially preferred HFCS to sucrose but after separate experience with each solution they switched to sucrose preference. Approximating the composition of HFCS with a mixture of fructose and glucose (55:45) yielded a solution that was less attractive than sucrose or HFCS. However, HFCS contains a small amount of glucose polymers, which are very attractive to rats. A 55:42:3 mixture of fructose, glucose and glucose polymers (Polycose) was equally preferred to HFCS and was treated similarly to HFCS in comparisons vs. sucrose. Post-oral effects of sucrose, which is 50% fructose and 50% glucose, may be responsible for the shift in preference with experience. This shift, and the relatively small magnitude of differences in preference for HFCS and sucrose, suggest that palatability factors probably do not contribute to any possible difference in weight gain responses to these sweeteners. Copyright © 2011 Elsevier Inc. All rights reserved.

  5. Stimulus probability effects in absolute identification.

    PubMed

    Kent, Christopher; Lamberts, Koen

    2016-05-01

    This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of presentation probability on both proportion correct and response times. The effects were moderated by the ubiquitous stimulus position effect. The accuracy and response time data were predicted by an exemplar-based model of perceptual cognition (Kent & Lamberts, 2005). The bow in discriminability was also attenuated when presentation probability for middle items was relatively high, an effect that will constrain future model development. The study provides evidence for item-specific learning in absolute identification. Implications for other theories of absolute identification are discussed. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  6. Stochastic resonance on a modular neuronal network of small-world subnetworks with a subthreshold pacemaker

    NASA Astrophysics Data System (ADS)

    Yu, Haitao; Wang, Jiang; Liu, Chen; Deng, Bin; Wei, Xile

    2011-12-01

    We study the phenomenon of stochastic resonance on a modular neuronal network consisting of several small-world subnetworks with a subthreshold periodic pacemaker. Numerical results show that the correlation between the pacemaker frequency and the dynamical response of the network is resonantly dependent on the intensity of additive spatiotemporal noise. This effect of pacemaker-driven stochastic resonance depends extensively on the local and global network structure, such as the intra- and inter-coupling strengths, the rewiring probability of the individual small-world subnetworks, the number of links between different subnetworks, and the number of subnetworks. All these parameters play a key role in determining the ability of the network to enhance the noise-induced outreach of the localized subthreshold pacemaker, and only when they are confined to a rather sharp interval of values does the pronounced stochastic resonance phenomenon emerge. Considering the rather important role of pacemakers in real life, the presented results could have important implications for many biological processes that rely on an effective pacemaker for their proper functioning.

  7. Development of STS/Centaur failure probabilities liftoff to Centaur separation

    NASA Technical Reports Server (NTRS)

    Hudson, J. M.

    1982-01-01

    The results of an analysis to determine STS/Centaur catastrophic vehicle response probabilities for the phases of vehicle flight from STS liftoff to Centaur separation from the Orbiter are presented. The analysis considers only category one component failure modes as contributors to the vehicle response mode probabilities. The relevant component failure modes are grouped into one of fourteen categories of potential vehicle behavior. By assigning failure rates to each component, for each of its failure modes, the STS/Centaur vehicle response probabilities in each phase of flight can be calculated. The results of this study will be used in a DOE analysis to ascertain the hazard from carrying a nuclear payload on the STS.

  8. Prediction of the comparative reinforcement values of running and drinking.

    PubMed

    PREMACK, D

    1963-03-15

    The probability of free drinking and running in rats was controlled by sucrose concentration and force requirements on an activity wheel. Drinking and running were then made contingent on pressing a bar. Barpressing increased monotonically with the associated response probability, and equally for drinking and running. The results support the assumption that different responses of equal probability have equal reinforcement value.

  9. Use of the negative binomial-truncated Poisson distribution in thunderstorm prediction

    NASA Technical Reports Server (NTRS)

    Cohen, A. C.

    1971-01-01

    A probability model is presented for the distribution of thunderstorms over a small area given that thunderstorm events (1 or more thunderstorms) are occurring over a larger area. The model incorporates the negative binomial and truncated Poisson distributions. Probability tables for Cape Kennedy for spring, summer, and fall months and seasons are presented. The computer program used to compute these probabilities is appended.
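    The two distributional building blocks of such a model can be sketched directly; the parameter values below are illustrative, not Cohen's fitted values for Cape Kennedy:

```python
import math

def zero_truncated_poisson_pmf(k, lam):
    """P(X = k | X >= 1) for X ~ Poisson(lam): the count of thunderstorms
    over the small area, given that at least one event is occurring."""
    if k < 1:
        return 0.0
    return math.exp(-lam) * lam**k / (math.factorial(k) * (1.0 - math.exp(-lam)))

def negative_binomial_pmf(k, r, p):
    """P(K = k) for a negative binomial with parameters r and success
    probability p, a common model for over-dispersed event counts."""
    return math.comb(k + r - 1, k) * (1.0 - p)**k * p**r
```

In the paper's construction the negative binomial describes thunderstorm events over the larger area and the truncated Poisson the counts over the small area; here the two pmfs are shown separately as a sanity-checkable sketch.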

  10. Causal inference in survival analysis using pseudo-observations.

    PubMed

    Andersen, Per K; Syriopoulou, Elisavet; Parner, Erik T

    2017-07-30

    Causal inference for non-censored response variables, such as binary or quantitative outcomes, is often based on either (1) direct standardization ('G-formula') or (2) inverse probability of treatment assignment weights ('propensity score'). To do causal inference in survival analysis, one needs to address right-censoring, and often, special techniques are required for that purpose. We will show how censoring can be dealt with 'once and for all' by means of so-called pseudo-observations when doing causal inference in survival analysis. The pseudo-observations can be used as a replacement of the outcomes without censoring when applying 'standard' causal inference methods, such as (1) or (2) earlier. We study this idea for estimating the average causal effect of a binary treatment on the survival probability, the restricted mean lifetime, and the cumulative incidence in a competing risks situation. The methods will be illustrated in a small simulation study and via a study of patients with acute myeloid leukemia who received either myeloablative or non-myeloablative conditioning before allogeneic hematopoietic cell transplantation. We will estimate the average causal effect of the conditioning regime on outcomes such as the 3-year overall survival probability and the 3-year risk of chronic graft-versus-host disease. Copyright © 2017 John Wiley & Sons, Ltd.
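    The core of the pseudo-observation approach is a leave-one-out (jackknife) transform of the Kaplan-Meier estimate: pseudo_i = n*S(t) - (n-1)*S_(-i)(t). A minimal sketch with illustrative data (tied times are processed sequentially; this is not the authors' full implementation):

```python
def km_survival(times, events, t):
    """Kaplan-Meier estimate of S(t). events[i] = 1 means death, 0 censored."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    s, at_risk = 1.0, len(times)
    for i in order:
        if times[i] > t:
            break
        if events[i] == 1:
            s *= 1.0 - 1.0 / at_risk  # multiply in the conditional survival
        at_risk -= 1                  # remove this subject from the risk set
    return s

def pseudo_observations(times, events, t):
    """Jackknife pseudo-observations for the survival probability at time t."""
    n = len(times)
    full = km_survival(times, events, t)
    pseudo = []
    for i in range(n):
        loo_t = times[:i] + times[i + 1:]
        loo_e = events[:i] + events[i + 1:]
        pseudo.append(n * full - (n - 1) * km_survival(loo_t, loo_e, t))
    return pseudo
```

With no censoring, the pseudo-observation for subject i reduces to the indicator 1{T_i > t}, which is why the values can stand in for uncensored outcomes in standard regression machinery.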

  11. Survival of Pseudomonas aeruginosa exposed to sunlight resembles the phenomenon of persistence.

    PubMed

    Forte Giacobone, Ana F; Oppezzo, Oscar J

    2015-01-01

    During exposure of Pseudomonas aeruginosa stationary phase cells to natural solar radiation, a reduction in the rate of loss of bacterial viability was observed when survival fractions were lower than 1/10,000. This reduction was independent of the growth medium used and of the initial bacterial concentration, and was also observed when irradiation was performed with artificial UVA radiation (365 nm, 47 W m(-2)). These results indicate the presence of a small bacterial subpopulation with increased tolerance to radiation. Such a tolerance is non-heritable, since survival curves comparable to those of the parental strain were obtained from survivors to long-term exposure to radiation. The radiation response described here resembles the phenomenon called persistence, which consists of the presence of a small subpopulation of slow-growing cells which are able to survive antibiotic treatment within a susceptible bacterial population. The condition of persister cells is acquired via a reversible switch and involves active defense systems towards oxidative stress. Persistence is probably responsible for biphasic responses of bacteria to several stress conditions, one of which may be exposure to sunlight. The models currently used to analyze the lethal action of sunlight overestimate the effect of high-dose irradiation. These models could be improved by including the potential formation of persister cells. Copyright © 2014 Elsevier B.V. All rights reserved.

  12. A stochastic model for the probability of malaria extinction by mass drug administration.

    PubMed

    Pemberton-Ross, Peter; Chitnis, Nakul; Pothin, Emilie; Smith, Thomas A

    2017-09-18

    Mass drug administration (MDA) has been proposed as an intervention to achieve local extinction of malaria. Although its effect on the reproduction number is short lived, extinction may subsequently occur in a small population due to stochastic fluctuations. This paper examines how the probability of stochastic extinction depends on population size, MDA coverage and the reproduction number under control, Rc. A simple compartmental model is developed and used to compute the probability of extinction using probability generating functions. The expected time to extinction after MDA is calculated analytically for various scenarios in this model. The results indicate two requirements for mass drug administration to achieve local extinction. Firstly, Rc must be sustained at Rc < 1.2 to avoid the rapid re-establishment of infections in the population. Secondly, the MDA must produce effective cure rates of >95% to have a non-negligible probability of successful elimination. Stochastic fluctuations only significantly affect the probability of extinction in populations of about 1000 individuals or less. The expected time to extinction via stochastic fluctuation is less than 10 years only in populations less than about 150 individuals. Clustering of secondary infections and of MDA distribution both contribute positively to the potential probability of success, indicating that MDA would most effectively be administered at the household level. There are very limited circumstances in which MDA will lead to local malaria elimination with a substantial probability.
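    The probability-generating-function calculation behind models like this has a standard core: the extinction probability of a branching process is the smallest fixed point of the offspring PGF. A sketch assuming Poisson-distributed secondary infections with mean Rc (the paper's actual model is compartmental and richer):

```python
import math

def extinction_probability(r_c, tol=1e-12):
    """Smallest root of q = exp(r_c * (q - 1)): the extinction probability of a
    branching process whose offspring (secondary infections) are Poisson(r_c)."""
    q = 0.0
    for _ in range(10000):
        q_new = math.exp(r_c * (q - 1.0))  # apply the offspring PGF
        if abs(q_new - q) < tol:
            return q_new
        q = q_new
    return q
```

For Rc <= 1 the fixed point is 1 (extinction is certain); for Rc above 1 a root below 1 appears, which is why sustaining a low Rc after MDA matters so much.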

  13. Tracking the Sensory Environment: An ERP Study of Probability and Context Updating in ASD

    PubMed Central

    Westerfield, Marissa A.; Zinni, Marla; Vo, Khang; Townsend, Jeanne

    2014-01-01

    We recorded visual event-related brain potentials (ERPs) from 32 adult male participants (16 high-functioning participants diagnosed with Autism Spectrum Disorder (ASD) and 16 typically developing (TD) control participants, ranging in age from 18–53 yrs) during a three-stimulus oddball paradigm. Target and non-target stimulus probability was varied across three probability conditions, whereas the probability of a third non-target stimulus was held constant in all conditions. P3 amplitude to target stimuli was more sensitive to probability in ASD than in TD participants, whereas P3 amplitude to non-target stimuli was less responsive to probability in ASD participants. This suggests that neural responses to changes in event probability are attention-dependent in high-functioning ASD. The implications of these findings for higher-level behaviors such as prediction and planning are discussed. PMID:24488156

  14. Sampling techniques for burbot in a western non-wadeable river

    USGS Publications Warehouse

    Klein, Z. B.; Quist, Michael C.; Rhea, D.T.; Senecal, A. C.

    2015-01-01

    Burbot, Lota lota (L.), populations are declining throughout much of their native distribution. Although numerous aspects of burbot ecology are well understood, less is known about effective sampling techniques for burbot in lotic systems. Occupancy models were used to estimate the probability of detection (p) for three gears (6.4- and 19-mm bar mesh hoop nets, night electric fishing), within the context of various habitat characteristics. During the summer, night electric fishing had the highest estimated detection probability for both juvenile (0.35, 95% C.I. 0.26–0.46) and adult (0.30, 95% C.I. 0.20–0.41) burbot. However, small-mesh hoop nets (6.4-mm bar mesh) had similar detection probabilities to night electric fishing for both juvenile (0.26, 0.17–0.36) and adult (0.27, 0.18–0.39) burbot during the summer. In autumn, a similar overlap between detection probabilities was observed for juvenile and adult burbot. Small-mesh hoop nets had the highest estimated probability of detection for both juvenile and adult burbot (0.46, 0.33–0.59), whereas night electric fishing had a detection probability of 0.39 (0.28–0.52) for juvenile and adult burbot. By using detection probabilities to compare gears, the most effective sampling technique can be identified, leading to increased species detections and more effective management of burbot.
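    Per-survey detection probabilities like these translate directly into survey effort: with independent visits, the chance of at least one detection in k surveys is 1 - (1 - p)^k. A small sketch using the summer night-electric-fishing estimate (the independence assumption is ours, not the authors'):

```python
def cumulative_detection(p, k):
    """Probability of at least one detection in k independent surveys,
    given a per-survey detection probability p."""
    return 1.0 - (1.0 - p) ** k

# e.g. p ~ 0.35 for juvenile burbot via night electric fishing in summer:
# three visits give 1 - 0.65**3, roughly a 0.73 chance of detecting the species.
```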

  15. A Comparative Study of Automated Infrasound Detectors - PMCC and AFD with Analyst Review.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Junghyun; Hayward, Chris; Zeiler, Cleat

    Automated detections calculated by the progressive multi-channel correlation (PMCC) method (Cansi, 1995) and the adaptive F detector (AFD) (Arrowsmith et al., 2009) are compared to the signals identified by five independent analysts. Each detector was applied to a four-hour time sequence recorded by the Korean infrasound array CHNAR. This array was used because it is composed of both small (<100 m) and large (~1000 m) aperture element spacing. The four-hour time sequence contained a number of easily identified signals under noise conditions with average RMS amplitudes varying from 1.2 to 4.5 mPa (1 to 5 Hz), estimated with a running five-minute window. The effectiveness of the detectors was estimated for the small aperture, large aperture, small aperture combined with the large aperture, and full array. The full and combined arrays performed the best for AFD under all noise conditions, while the large aperture array had the poorest performance for both detectors. PMCC produced similar results to AFD under the lower noise conditions, but did not produce as dramatic an increase in detections using the full and combined arrays. Both automated detectors and the analysts produced fewer detections under the higher noise conditions. Comparing the detection probabilities with Estimated Receiver Operating Characteristic (EROC) curves, we found that the smaller value of consistency for PMCC and the larger p-value for AFD had the highest detection probability. These parameters produced greater changes in detection probability than estimates of the false alarm rate. The detection probability was impacted the most by noise level, with low noise (average RMS amplitude of 1.7 mPa) having an average detection probability of ~40% and high noise (average RMS amplitude of 2.9 mPa) an average detection probability of ~23%.

  16. Total coliform and E. coli in public water systems using undisinfected ground water in the United States.

    PubMed

    Messner, Michael J; Berger, Philip; Javier, Julie

    2017-06-01

    Public water systems (PWSs) in the United States generate total coliform (TC) and Escherichia coli (EC) monitoring data, as required by the Total Coliform Rule (TCR). We analyzed data generated in 2011 by approximately 38,000 small (serving fewer than 4101 individuals) undisinfected public water systems (PWSs). We used statistical modeling to characterize a distribution of TC detection probabilities for each of nine groupings of PWSs based on system type (community, non-transient non-community, and transient non-community) and population served (less than 101, 101-1000 and 1001-4100 people). We found that among PWS types sampled in 2011, on average, undisinfected transient PWSs test positive for TC 4.3% of the time as compared with 3% for undisinfected non-transient PWSs and 2.5% for undisinfected community PWSs. Within each type of PWS, the smaller systems have higher median TC detection than the larger systems. All TC-positive samples were assayed for EC. Among TC-positive samples from small undisinfected PWSs, EC is detected in about 5% of samples, regardless of PWS type or size. We evaluated the upper tail of the TC detection probability distributions and found that significant percentages of some system types have high TC detection probabilities. For example, assuming the systems providing data are nationally-representative, then 5.0% of the ∼50,000 small undisinfected transient PWSs in the U.S. have TC detection probabilities of 20% or more. Communities with such high TC detection probabilities may have elevated risk of acute gastrointestinal (AGI) illness - perhaps as great or greater than the attributable risk to drinking water (6-22%) calculated for 14 Wisconsin community PWSs with much lower TC detection probabilities (about 2.3%, Borchardt et al., 2012). Published by Elsevier GmbH.

  17. Geomorphological and sedimentary evidence of probable glaciation in the Jizerské hory Mountains, Central Europe

    NASA Astrophysics Data System (ADS)

    Engel, Zbyněk; Křížek, Marek; Kasprzak, Marek; Traczyk, Andrzej; Hložek, Martin; Krbcová, Klára

    2017-03-01

    The Jizerské hory Mountains in the Czech Republic have traditionally been considered to be a highland that lay beyond the limits of Quaternary glaciations. Recent work on cirque-like valley heads in the central part of the range has shown that niche glaciers could form during the Quaternary. Here we report geomorphological and sedimentary evidence for a small glacier in the Pytlácká jáma Hollow that represents one of the most-enclosed valley heads within the range. Shape and size characteristics of this landform indicate that the hollow is a glacial cirque at a degraded stage of development. Boulder accumulations at the downslope side of the hollow probably represent a relic of terminal moraines, and the grain size distribution of clasts together with micromorphology of quartz grains from the hollow indicate the glacial environment of a small glacier. This glacier represents the lowermost located such system in central Europe and provides evidence for the presence of niche or small cirque glaciers probably during pre-Weichselian glacial periods. The glaciation limit (1000 m asl) and paleo-ELA (900 m asl) proposed for the Jizerské hory Mountains implies that central European ranges lower than 1100 m asl were probably glaciated during the Quaternary.

  18. Evaluation of trap capture in a geographically closed population of brown treesnakes on Guam

    USGS Publications Warehouse

    Tyrrell, C.L.; Christy, M.T.; Rodda, G.H.; Yackel Adams, A.A.; Ellingson, A.R.; Savidge, J.A.; Dean-Bradley, K.; Bischof, R.

    2009-01-01

    1. Open population mark-recapture analysis of unbounded populations accommodates some types of closure violations (e.g. emigration, immigration). In contrast, closed population analysis of such populations readily allows estimation of capture heterogeneity and behavioural response, but requires crucial assumptions about closure (e.g. no permanent emigration) that are suspect and rarely tested empirically. 2. In 2003, we erected a double-sided barrier to prevent movement of snakes in or out of a 5-ha semi-forested study site in northern Guam. This geographically closed population of >100 snakes was monitored using a series of transects for visual searches and a 13 × 13 trapping array, with the aim of marking all snakes within the site. Forty-five marked snakes were also supplemented into the resident population to quantify the efficacy of our sampling methods. We used the program MARK to analyse trap captures (101 occasions), referenced to census data from visual surveys, and quantified heterogeneity, behavioural response, and size bias in trappability. Analytical inclusion of untrapped individuals greatly improved precision in the estimation of some covariate effects. 3. A novel discovery was that trap captures for individual snakes consisted of asynchronous bouts of high capture probability lasting about 7 days (ephemeral behavioural effect). There was modest behavioural response (trap happiness) and significant latent (unexplained) heterogeneity, with small influences on capture success of date, gender, residency status (translocated or not), and body condition. 4. Trapping was shown to be an effective tool for eradicating large brown treesnakes Boiga irregularis (>900 mm snout-vent length, SVL). 5. Synthesis and applications. Mark-recapture modelling is commonly used by ecological managers to estimate populations. However, existing models involve making assumptions about either closure violations or response to capture. Physical closure of our population on a landscape scale allowed us to determine the relative importance of covariates influencing capture probability (body size, trappability periods, and latent heterogeneity). This information was used to develop models in which different segments of the population could be assigned different probabilities of capture, and suggests that modelling of open populations should incorporate easily measured, but potentially overlooked, parameters such as body size or condition. © 2008 The Authors.

  19. Combination of a Stressor-Response Model with a Conditional Probability Analysis Approach to Develop Candidate Criteria from Empirical Data

    EPA Science Inventory

    We show that a conditional probability analysis that utilizes a stressor-response model based on a logistic regression provides a useful approach for developing candidate water quality criteria from empirical data. The critical step in this approach is transforming the response ...
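    The two ingredients named in this record can be sketched in a few lines: a logistic stressor-response curve and an empirical conditional probability of impairment above a stressor threshold. All coefficients and data below are illustrative placeholders, not EPA values:

```python
import math

def logistic_response(x, b0, b1):
    """Modeled probability of an adverse response at stressor level x,
    from a fitted logistic regression with coefficients b0, b1."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))

def conditional_exceedance(stressor, impaired, threshold):
    """Empirical P(impaired | stressor >= threshold) from paired observations;
    impaired entries are 0/1 indicators."""
    selected = [y for s, y in zip(stressor, impaired) if s >= threshold]
    if not selected:
        return float('nan')
    return sum(selected) / len(selected)
```

Scanning `conditional_exceedance` over a grid of thresholds is one way to locate a candidate criterion where the conditional probability of impairment rises sharply.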

  20. Population size influences amphibian detection probability: implications for biodiversity monitoring programs.

    PubMed

    Tanadini, Lorenzo G; Schmidt, Benedikt R

    2011-01-01

    Monitoring is an integral part of species conservation. Monitoring programs must take imperfect detection of species into account in order to be reliable. Theory suggests that detection probability may be determined by population size, but this relationship has not yet been assessed empirically. Population size is particularly important because it may induce heterogeneity in detection probability and thereby cause bias in estimates of biodiversity. We used a site occupancy model to analyse data from a volunteer-based amphibian monitoring program to assess how well different variables explain variation in detection probability. An index to population size best explained detection probabilities for four out of six species (to avoid circular reasoning, we used the count of individuals at a previous site visit as an index to current population size). The relationship between the population index and detection probability was positive. Commonly used weather variables best explained detection probabilities for two out of six species. Estimates of site occupancy probabilities differed depending on whether the population index was or was not used to model detection probability. The relationship between the population index and detectability has implications for the design of monitoring and species conservation. Most importantly, because many small populations are likely to be missed, monitoring programs should be designed to minimize the chance of overlooking them. The results also imply that methods cannot be standardized in such a way that detection probabilities are constant. As we have shown here, one can easily account for variation in population size in the analysis of data from long-term monitoring programs by using counts of individuals from surveys at the same site in previous years. Accounting for variation in population size is important because it can affect the results of long-term monitoring programs and ultimately the conservation of imperiled species.

  1. SELWAY-BITTERROOT WILDERNESS, IDAHO AND MONTANA.

    USGS Publications Warehouse

    Toth, Margo I.; Zilka, Nicholas T.

    1984-01-01

    Mineral-resource studies of the Selway-Bitterroot Wilderness in Idaho County, Idaho, and Missoula and Ravalli Counties, Montana, were carried out. Four areas with probable and one small area of substantiated mineral-resource potential were recognized. The areas of the Running Creek, Painted Rocks, and Whistling Pig plutons of Tertiary age have probable resource potential for molybdenum, although detailed geochemical sampling and surface investigations failed to recognize mineralized systems at the surface. Randomly distributed breccia zones along a fault in the vicinity of the Cliff mine have a substantiated potential for small silver-copper-lead resources.

  2. Blackmail propagation on small-world networks

    NASA Astrophysics Data System (ADS)

    Shao, Zhi-Gang; Jian-Ping Sang; Zou, Xian-Wu; Tan, Zhi-Jie; Jin, Zhun-Zhi

    2005-06-01

    The dynamics of the blackmail propagation model based on small-world networks is investigated. It is found that for a given transmitting probability λ, the dynamical behavior of blackmail propagation transits from linear growth to logistic growth as the network randomness p increases. The transition takes place at the critical network randomness pc = 1/N, where N is the total number of nodes in the network. For a given network randomness p, the dynamical behavior of blackmail propagation transits from exponential decrease to logistic growth as the transmitting probability λ increases. The transition occurs at the critical transmitting probability λc = 1/⟨k⟩, where ⟨k⟩ is the average number of nearest neighbors. The present work will be useful for understanding computer virus epidemics and other spreading phenomena on communication and social networks.
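    The linear-growth regime at low randomness is easy to see directly: on a regular ring (p = 0) with deterministic transmission, the infected front advances a fixed number of nodes per step. A sketch of that limiting case only (the paper's full model adds rewiring and a transmitting probability λ):

```python
def si_spread_on_ring(n, k, steps):
    """Deterministic SI spreading (transmit probability 1) on a ring of n nodes,
    each linked to its k nearest neighbours on either side; returns the
    infected count after each step, starting from a single seed node."""
    infected = {0}
    counts = [1]
    for _ in range(steps):
        new = set()
        for node in infected:
            for d in range(1, k + 1):
                new.add((node + d) % n)  # infect neighbours clockwise
                new.add((node - d) % n)  # ... and counter-clockwise
        infected |= new
        counts.append(len(infected))
    return counts

# On the regular ring the front advances k nodes in each direction per step,
# so the infected count grows linearly: 2*k new infections per step until
# the ring saturates.
```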

  3. Comparison of rate one-half, equivalent constraint length 24, binary convolutional codes for use with sequential decoding on the deep-space channel

    NASA Technical Reports Server (NTRS)

    Massey, J. L.

    1976-01-01

    Virtually all previously-suggested rate 1/2 binary convolutional codes with KE = 24 are compared. Their distance properties are given; and their performance, both in computation and in error probability, with sequential decoding on the deep-space channel is determined by simulation. Recommendations are made both for the choice of a specific KE = 24 code as well as for codes to be included in future coding standards for the deep-space channel. A new result given in this report is a method for determining the statistical significance of error probability data when the error probability is so small that it is not feasible to perform enough decoding simulations to obtain more than a very small number of decoding errors.

  4. Crash probability estimation via quantifying driver hazard perception.

    PubMed

    Li, Yang; Zheng, Yang; Wang, Jianqiang; Kodaka, Kenji; Li, Keqiang

    2018-07-01

    Crash probability estimation is an important method to predict the potential reduction of crash probability contributed by forward collision avoidance technologies (FCATs). In this study, we propose a practical approach to estimate crash probability, which combines a field operational test and numerical simulations of a typical rear-end crash model. To consider driver hazard perception characteristics, we define a novel hazard perception measure, called driver risk response time, by considering both time-to-collision (TTC) and driver braking response to impending collision risk in a near-crash scenario. We also establish a driving database under mixed Chinese traffic conditions based on a CMBS (Collision Mitigation Braking Systems)-equipped vehicle. Applying the crash probability estimation to this database, we estimate the potential decrease in crash probability owing to use of CMBS. A comparison of the results with CMBS on and off shows a 13.7% reduction of crash probability in a typical rear-end near-crash scenario with a one-second delay of the driver's braking response. These results indicate that CMBS has a positive effect on collision prevention, especially in the case of inattentive or older drivers. The proposed crash probability estimation offers a practical way to evaluate the safety benefits in the design and testing of FCATs. Copyright © 2017 Elsevier Ltd. All rights reserved.
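    Time-to-collision, the kinematic quantity underlying the hazard measure, is simply the current gap divided by the closing speed. The composite "driver risk response time" below is a hypothetical illustration of combining TTC with braking delay; the record does not give the authors' exact definition:

```python
def time_to_collision(gap_m, closing_speed_mps):
    """TTC in seconds: gap divided by closing speed. Defined only when the
    vehicles are actually closing (closing_speed > 0)."""
    if closing_speed_mps <= 0:
        return float('inf')
    return gap_m / closing_speed_mps

def driver_risk_response_time(ttc_at_hazard_s, brake_onset_delay_s):
    """Hypothetical composite measure (illustrative, not the paper's formula):
    the TTC budget remaining once the driver finally begins braking."""
    return ttc_at_hazard_s - brake_onset_delay_s
```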

  5. A Comparison of EPI Sampling, Probability Sampling, and Compact Segment Sampling Methods for Micro and Small Enterprises

    PubMed Central

    Chao, Li-Wei; Szrek, Helena; Peltzer, Karl; Ramlagan, Shandir; Fleming, Peter; Leite, Rui; Magerman, Jesswill; Ngwenya, Godfrey B.; Pereira, Nuno Sousa; Behrman, Jere

    2011-01-01

    Finding an efficient method for sampling micro- and small-enterprises (MSEs) for research and statistical reporting purposes is a challenge in developing countries, where registries of MSEs are often nonexistent or outdated. This lack of a sampling frame creates an obstacle in finding a representative sample of MSEs. This study uses computer simulations to draw samples from a census of businesses and non-businesses in the Tshwane Municipality of South Africa, using three different sampling methods: the traditional probability sampling method, the compact segment sampling method, and the World Health Organization’s Expanded Programme on Immunization (EPI) sampling method. Three mechanisms by which the methods could differ are tested, the proximity selection of respondents, the at-home selection of respondents, and the use of inaccurate probability weights. The results highlight the importance of revisits and accurate probability weights, but the lesser effect of proximity selection on the samples’ statistical properties. PMID:22582004

  6. Time delay and long-range connection induced synchronization transitions in Newman-Watts small-world neuronal networks.

    PubMed

    Qian, Yu

    2014-01-01

    The synchronization transitions in Newman-Watts small-world neuronal networks (SWNNs) induced by time delay τ and long-range connection (LRC) probability P have been investigated by synchronization parameter and space-time plots. Four distinct parameter regions, that is, asynchronous region, transition region, synchronous region, and oscillatory region have been discovered at certain LRC probability P = 1.0 as time delay is increased. Interestingly, desynchronization is observed in oscillatory region. More importantly, we consider the spatiotemporal patterns obtained in delayed Newman-Watts SWNNs are the competition results between long-range drivings (LRDs) and neighboring interactions. In addition, for moderate time delay, the synchronization of neuronal network can be enhanced remarkably by increasing LRC probability. Furthermore, lag synchronization has been found between weak synchronization and complete synchronization as LRC probability P is a little less than 1.0. Finally, the two necessary conditions, moderate time delay and large numbers of LRCs, are exposed explicitly for synchronization in delayed Newman-Watts SWNNs.

  7. Time Delay and Long-Range Connection Induced Synchronization Transitions in Newman-Watts Small-World Neuronal Networks

    PubMed Central

    Qian, Yu

    2014-01-01

    The synchronization transitions in Newman-Watts small-world neuronal networks (SWNNs) induced by time delay and long-range connection (LRC) probability have been investigated using a synchronization parameter and space-time plots. Four distinct parameter regions, namely the asynchronous, transition, synchronous, and oscillatory regions, are found at a certain LRC probability as the time delay is increased. Interestingly, desynchronization is observed in the oscillatory region. More importantly, we argue that the spatiotemporal patterns obtained in delayed Newman-Watts SWNNs result from the competition between long-range drivings (LRDs) and neighboring interactions. In addition, for moderate time delay, the synchronization of the neuronal network can be enhanced remarkably by increasing the LRC probability. Furthermore, lag synchronization is found between weak synchronization and complete synchronization when the LRC probability is slightly less than 1.0. Finally, two necessary conditions for synchronization in delayed Newman-Watts SWNNs are made explicit: moderate time delay and a large number of LRCs. PMID:24810595

  8. Response of the San Andreas fault to the 1983 Coalinga-Nuñez earthquakes: an application of interaction-based probabilities for Parkfield

    USGS Publications Warehouse

    Toda, Shinji; Stein, Ross S.

    2002-01-01

    The Parkfield-Cholame section of the San Andreas fault, site of an unfulfilled earthquake forecast in 1985, is the best monitored section of the world's most closely watched fault. In 1983, the M = 6.5 Coalinga and M = 6.0 Nuñez events struck 25 km northeast of Parkfield. Seismicity rates climbed for 18 months along the creeping section of the San Andreas north of Parkfield and dropped for 6 years along the locked section to the south. Right-lateral creep also slowed or reversed from Parkfield south. Here we calculate that the Coalinga sequence increased the shear and Coulomb stress on the creeping section, causing the rate of small shocks to rise until the added stress was shed by additional slip. However, the 1983 events decreased the shear and Coulomb stress on the Parkfield segment, causing surface creep and seismicity rates to drop. We use these observations to cast the likelihood of a Parkfield earthquake into an interaction-based probability, which includes both the renewal of stress following the 1966 Parkfield earthquake and the stress transfer from the 1983 Coalinga events. We calculate that the 1983 shocks dropped the 10-year probability of a M ∼ 6 Parkfield earthquake by 22% (from 54 ± 22% to 42 ± 23%) and that the probability did not recover until about 1991, when seismicity and creep resumed. Our analysis may thus explain why the Parkfield earthquake did not strike in the 1980s, but not why it was absent in the 1990s. We calculate a 58 ± 17% probability of a M ∼ 6 Parkfield earthquake during 2001–2011.

  9. Armc5 deletion causes developmental defects and compromises T-cell immune responses

    PubMed Central

    Hu, Yan; Lao, Linjiang; Mao, Jianning; Jin, Wei; Luo, Hongyu; Charpentier, Tania; Qi, Shijie; Peng, Junzheng; Hu, Bing; Marcinkiewicz, Mieczyslaw Martin; Lamarre, Alain; Wu, Jiangping

    2017-01-01

    Armadillo repeat containing 5 (ARMC5) is a cytosolic protein with no enzymatic activities. Little is known about its function and mechanisms of action, except that gene mutations are associated with risks of primary macronodular adrenal gland hyperplasia. Here we map Armc5 expression by in situ hybridization, and generate Armc5 knockout mice, which are small in body size. Armc5 knockout mice have compromised T-cell proliferation and differentiation into Th1 and Th17 cells, increased T-cell apoptosis, reduced severity of experimental autoimmune encephalitis, and defective immune responses to lymphocytic choriomeningitis virus infection. These mice also develop adrenal gland hyperplasia in old age. Yeast 2-hybrid assays identify 16 ARMC5-binding partners. Together these data indicate that ARMC5 is crucial in fetal development, T-cell function and adrenal gland growth homeostasis, and that the functions of ARMC5 probably depend on interaction with multiple signalling pathways. PMID:28169274

  10. Trophic compensation reinforces resistance: herbivory absorbs the increasing effects of multiple disturbances.

    PubMed

    Ghedini, Giulia; Russell, Bayden D; Connell, Sean D

    2015-02-01

    Disturbance often results in small changes in community structure, but the probability of transitioning to contrasting states increases when multiple disturbances combine. Nevertheless, we have limited insights into the mechanisms that stabilise communities, particularly how perturbations can be absorbed without restructuring (i.e. resistance). Here, we expand the concept of compensatory dynamics to include countervailing mechanisms that absorb disturbances through trophic interactions. By definition, 'compensation' occurs if a specific disturbance stimulates a proportional countervailing response that eliminates its otherwise unchecked effect. We show that the compounding effects of disturbances from local to global scales (i.e. local canopy-loss, eutrophication, ocean acidification) increasingly promote the expansion of weedy species, but that this response is countered by a proportional increase in grazing. Finally, we explore the relatively unrecognised role of compensatory effects, which are likely to maintain the resistance of communities to disturbance more deeply than current thinking allows. © 2015 John Wiley & Sons Ltd/CNRS.

  11. Education, Training and Employment in Small-Scale Enterprises: Three Industries in Sao Paulo, Brazil. IIEP Research Report No. 63.

    ERIC Educational Resources Information Center

    Leite, Elenice M.; Caillods, Francoise

    Despite the prophecies forecasting their probable disappearance or annihilation, small-scale enterprises have persisted in the Brazilian industrial structure since 1950. To account for the survival of small firms in Brazil, specifically in the state of Sao Paulo, a study examined 100 small firms in three industrial sectors: clothing, mechanical…

  12. Application of a multistate model to estimate culvert effects on movement of small fishes

    USGS Publications Warehouse

    Norman, J.R.; Hagler, M.M.; Freeman, Mary C.; Freeman, B.J.

    2009-01-01

    While it is widely acknowledged that culverted road-stream crossings may impede fish passage, effects of culverts on movement of nongame and small-bodied fishes have not been extensively studied and studies generally have not accounted for spatial variation in capture probabilities. We estimated probabilities for upstream and downstream movement of small (30-120 mm standard length) benthic and water column fishes across stream reaches with and without culverts at four road-stream crossings over a 4-6-week period. Movement and reach-specific capture probabilities were estimated using multistate capture-recapture models. Although none of the culverts were complete barriers to passage, only a bottomless-box culvert appeared to permit unrestricted upstream and downstream movements by benthic fishes based on model estimates of movement probabilities. At two box culverts that were perched above the water surface at base flow, observed movements were limited to water column fishes and to intervals when runoff from storm events raised water levels above the perched level. Only a single fish was observed to move through a partially embedded pipe culvert. Estimates for probabilities of movement over distances equal to at least the length of one culvert were low (e.g., generally ≤0.03, estimated for 1-2-week intervals) and had wide 95% confidence intervals as a consequence of few observed movements to nonadjacent reaches. Estimates of capture probabilities varied among reaches by a factor of 2 to over 10, illustrating the importance of accounting for spatially variable capture rates when estimating movement probabilities with capture-recapture data. Longer-term studies are needed to evaluate temporal variability in stream fish passage at culverts (e.g., in relation to streamflow variability) and to thereby better quantify the degree of population fragmentation caused by road-stream crossings with culverts. © American Fisheries Society 2009.
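
    The abstract's central methodological point, that spatially variable capture rates bias naive movement estimates, can be illustrated with a toy simulation (all numbers hypothetical; this is a sketch of the bias, not the authors' multistate model):

    ```python
    import random

    random.seed(1)

    # Hypothetical setup: fish move from reach 0 into reach 1 (through a culvert)
    # with true probability PSI; capture probability differs between reaches.
    PSI = 0.10
    P_CAPTURE = {0: 0.8, 1: 0.2}   # destination reach is much harder to sample

    caught_stayed = caught_moved = 0
    for _ in range(100_000):
        moved = random.random() < PSI
        reach = 1 if moved else 0
        if random.random() < P_CAPTURE[reach]:   # was the fish detected on recapture?
            if moved:
                caught_moved += 1
            else:
                caught_stayed += 1

    # Naive estimate ignoring capture probability: strongly biased low
    naive_psi = caught_moved / (caught_moved + caught_stayed)

    # Weighting each count by its reach's capture probability recovers PSI,
    # which is essentially what a multistate model estimates jointly
    corrected_psi = (caught_moved / P_CAPTURE[1]) / (
        caught_moved / P_CAPTURE[1] + caught_stayed / P_CAPTURE[0]
    )
    ```

    With these numbers the naive estimate lands near 0.03 even though the true movement probability is 0.10, mirroring why the study estimates movement and capture probabilities jointly.
    
    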

  13. Time Dependence of Collision Probabilities During Satellite Conjunctions

    NASA Technical Reports Server (NTRS)

    Hall, Doyle T.; Hejduk, Matthew D.; Johnson, Lauren C.

    2017-01-01

    The NASA Conjunction Assessment Risk Analysis (CARA) team has recently implemented updated software to calculate the probability of collision (Pc) for Earth-orbiting satellites. The algorithm can employ complex dynamical models for orbital motion, and account for the effects of non-linear trajectories as well as both position and velocity uncertainties. This “3D Pc” method entails computing a 3-dimensional numerical integral for each estimated probability. Our analysis indicates that the 3D method provides several new insights over the traditional “2D Pc” method, even when approximating the orbital motion using the relatively simple Keplerian two-body dynamical model. First, the formulation provides the means to estimate variations in the time derivative of the collision probability, or the probability rate, Rc. For close-proximity satellites, such as those orbiting in formations or clusters, Rc variations can show multiple peaks that repeat or blend with one another, providing insight into the ongoing temporal distribution of risk. For single, isolated conjunctions, Rc analysis provides the means to identify and bound the times of peak collision risk. Additionally, analysis of multiple actual archived conjunctions demonstrates that the commonly used “2D Pc” approximation can occasionally provide inaccurate estimates. These include cases in which the 2D method yields negligibly small probabilities (e.g., Pc < 10^-10), but the 3D estimates are sufficiently large to prompt increased monitoring or collision mitigation (e.g., Pc ≥ 10^-5). Finally, the archive analysis indicates that a relatively efficient calculation can be used to identify which conjunctions will have negligibly small probabilities. This small-Pc screening test can significantly speed the overall risk analysis computation for large numbers of conjunctions.
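
    For context, the traditional 2D Pc reduces to integrating a bivariate Gaussian (the combined position uncertainty projected into the conjunction plane) over the combined hard-body circle. A minimal Monte Carlo sketch of that integral follows; the numbers and function name are illustrative, and this is not CARA's implementation:

    ```python
    import numpy as np

    def pc_2d_monte_carlo(miss, cov, hbr, n=400_000, seed=0):
        """Monte Carlo estimate of the 2D collision probability: the probability
        mass of a bivariate Gaussian inside the combined hard-body circle.

        miss : 2-vector, projected miss distance in the conjunction plane [m]
        cov  : 2x2 combined position covariance in the conjunction plane [m^2]
        hbr  : combined hard-body radius [m]
        """
        rng = np.random.default_rng(seed)
        # Sample relative positions from the Gaussian centered on the miss vector
        pts = rng.multivariate_normal(miss, cov, size=n)
        # Fraction of samples falling inside the hard-body circle
        return float(np.mean(np.hypot(pts[:, 0], pts[:, 1]) < hbr))
    ```

    As a sanity check, for a zero miss distance, identity covariance, and hbr = 3 the exact answer is 1 − exp(−hbr²/2) ≈ 0.9889. The "3D Pc" method described in the abstract replaces this snapshot with a 3-dimensional integral whose time derivative gives the probability rate Rc.
    
    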

  14. Fractional poisson--a simple dose-response model for human norovirus.

    PubMed

    Messner, Michael J; Berger, Philip; Nappier, Sharon P

    2014-10-01

    This study utilizes old and new Norovirus (NoV) human challenge data to model the dose-response relationship for human NoV infection. The combined data set is used to update estimates from a previously published beta-Poisson dose-response model that includes parameters for virus aggregation and for a beta-distribution that describes variable susceptibility among hosts. The quality of the beta-Poisson model is examined and a simpler model is proposed. The new model (fractional Poisson) characterizes hosts as either perfectly susceptible or perfectly immune, requiring a single parameter (the fraction of perfectly susceptible hosts) in place of the two-parameter beta-distribution. A second parameter is included to account for virus aggregation in the same fashion as it is added to the beta-Poisson model. Infection probability is simply the product of the probability of nonzero exposure (at least one virus or aggregate is ingested) and the fraction of susceptible hosts. The model is computationally simple and appears to be well suited to the data from the NoV human challenge studies. The model's deviance is similar to that of the beta-Poisson, but with one parameter, rather than two. As a result, the Akaike information criterion favors the fractional Poisson over the beta-Poisson model. At low, environmentally relevant exposure levels (<100), estimation error is small for the fractional Poisson model; however, caution is advised because no subjects were challenged at such a low dose. New low-dose data would be of great value to further clarify the NoV dose-response relationship and to support improved risk assessment for environmentally relevant exposures. © 2014 Society for Risk Analysis Published 2014. This article is a U.S. Government work and is in the public domain for the U.S.A.
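
    The fractional Poisson model described above is simple enough to state in a few lines. The sketch below follows the abstract's structure (infection probability = probability of nonzero exposure × fraction of susceptible hosts); the parameter names and the use of a mean aggregate size to model aggregation are my labels, not necessarily the paper's notation:

    ```python
    import math

    def fractional_poisson(dose, f_susceptible, mu=1.0):
        """Fractional Poisson dose-response sketch.

        dose          : mean number of virus particles ingested
        f_susceptible : fraction of perfectly susceptible hosts (the model's
                        single dose-response parameter)
        mu            : mean aggregate size (mu = 1 -> fully disaggregated virus)

        Hosts are either perfectly susceptible or perfectly immune, so
        P(infection) = P(at least one virus or aggregate ingested)
                       * P(host is susceptible).
        """
        p_nonzero_exposure = 1.0 - math.exp(-dose / mu)
        return f_susceptible * p_nonzero_exposure
    ```

    The curve rises approximately linearly at low dose (≈ f·dose/mu) and saturates at f_susceptible for high doses, which is why the single parameter is interpretable directly as the susceptible fraction.
    
    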

  15. The statistical significance of error probability as determined from decoding simulations for long codes

    NASA Technical Reports Server (NTRS)

    Massey, J. L.

    1976-01-01

    The very low error probability obtained with long error-correcting codes results in a very small number of observed errors in simulation studies of practical size and renders the usual confidence interval techniques inapplicable to the observed error probability. A natural extension of the notion of a 'confidence interval' is made and applied to such determinations of error probability by simulation. An example is included to show the surprisingly great significance of as few as two decoding errors in a very large number of decoding trials.
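
    One standard way to quantify what a handful of observed errors implies is the exact (Clopper-Pearson) upper confidence bound on the error probability; the sketch below illustrates the point of the abstract, though it is the textbook binomial bound rather than necessarily Massey's own construction:

    ```python
    import math

    def binom_cdf(k, n, p):
        """P(X <= k) for X ~ Binomial(n, p); cheap for small k even when n is huge."""
        return sum(math.comb(n, i) * p**i * (1.0 - p)**(n - i) for i in range(k + 1))

    def upper_bound(k, n, conf=0.95):
        """Clopper-Pearson upper confidence limit: the smallest p for which
        P(X <= k; n, p) <= 1 - conf, found by bisection (the CDF is
        decreasing in p)."""
        lo, hi = 0.0, 1.0
        for _ in range(80):
            mid = 0.5 * (lo + hi)
            if binom_cdf(k, n, mid) > 1.0 - conf:
                lo = mid
            else:
                hi = mid
        return hi
    ```

    For example, observing just two decoding errors in one million trials caps the true error probability at roughly 6.3e-6 with 95% confidence, which is why so few errors can still be statistically significant.
    
    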

  16. Joint probabilities and quantum cognition

    NASA Astrophysics Data System (ADS)

    de Barros, J. Acacio

    2012-12-01

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  17. A Systemic Small RNA Signaling System in Plants

    PubMed Central

    Yoo, Byung-Chun; Kragler, Friedrich; Varkonyi-Gasic, Erika; Haywood, Valerie; Archer-Evans, Sarah; Lee, Young Moo; Lough, Tony J.; Lucas, William J.

    2004-01-01

    Systemic translocation of RNA exerts non-cell-autonomous control over plant development and defense. Long-distance delivery of mRNA has been proven, but transport of small interfering RNA and microRNA remains to be demonstrated. Analyses performed on phloem sap collected from a range of plants identified populations of small RNA species. The dynamic nature of this population was reflected in its response to growth conditions and viral infection. The authenticity of these phloem small RNA molecules was confirmed by bioinformatic analysis; potential targets for a set of phloem small RNA species were identified. Heterografting studies, using spontaneously silencing coat protein (CP) plant lines, also established that transgene-derived siRNA move in the long-distance phloem and initiate CP gene silencing in the scion. Biochemical analysis of pumpkin (Cucurbita maxima) phloem sap led to the characterization of C. maxima Phloem SMALL RNA BINDING PROTEIN1 (CmPSRP1), a unique component of the protein machinery probably involved in small RNA trafficking. Equivalently sized small RNA binding proteins were detected in phloem sap from cucumber (Cucumis sativus) and lupin (Lupinus albus). PSRP1 binds selectively to 25-nucleotide single-stranded RNA species. Microinjection studies provided direct evidence that PSRP1 could mediate the cell-to-cell trafficking of 25-nucleotide single-stranded, but not double-stranded, RNA molecules. The potential role played by PSRP1 in long-distance transmission of silencing signals is discussed with respect to the pathways and mechanisms used by plants to exert systemic control over developmental and physiological processes. PMID:15258266

  18. Early warning of climate tipping points

    NASA Astrophysics Data System (ADS)

    Lenton, Timothy M.

    2011-07-01

    A climate 'tipping point' occurs when a small change in forcing triggers a strongly nonlinear response in the internal dynamics of part of the climate system, qualitatively changing its future state. Human-induced climate change could push several large-scale 'tipping elements' past a tipping point. Candidates include irreversible melt of the Greenland ice sheet, dieback of the Amazon rainforest and shift of the West African monsoon. Recent assessments give an increased probability of future tipping events, and the corresponding impacts are estimated to be large, making them significant risks. Recent work shows that early warning of an approaching climate tipping point is possible in principle, and could have considerable value in reducing the risk that they pose.

  19. Isotopic response with small scintillator based gamma-ray spectrometers

    DOEpatents

    Madden, Norman W [Sparks, NV; Goulding, Frederick S [Lafayette, CA; Asztalos, Stephen J [Oakland, CA

    2012-01-24

    The intrinsic background of a gamma-ray spectrometer is significantly reduced by surrounding the scintillator with a second scintillator. This second (external) scintillator surrounds the first and has an opening, in the forward direction, of approximately the same diameter as the smaller central scintillator. The second scintillator is selected to have a higher atomic number, and thus a larger probability for a Compton scattering interaction than the inner region. Scattering events that are, from an electronics perspective, essentially simultaneous (in coincidence) in the first and second scintillators are excluded electronically from the data stream. Thus, only gamma-rays that are wholly contained in the smaller central scintillator are used for analytic purposes.
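
    The anticoincidence logic described above can be sketched in software terms: reject any inner-scintillator event that has an outer-scintillator event within a short coincidence window (the function name, timestamps, and window width here are hypothetical illustrations, not the patent's electronics):

    ```python
    import bisect

    def anticoincidence_filter(inner_times, outer_times, window=100e-9):
        """Keep only inner-scintillator events with no outer-scintillator event
        within +/- window seconds; coincident pairs indicate a Compton scatter
        escaping the central crystal and are excluded from the spectrum."""
        outer = sorted(outer_times)
        kept = []
        for t in inner_times:
            # Find the first outer event at or after t - window
            i = bisect.bisect_left(outer, t - window)
            coincident = i < len(outer) and outer[i] <= t + window
            if not coincident:
                kept.append(t)   # fully contained event: keep for analysis
        return kept
    ```

    For instance, with inner events at 1.0 s and 2.0 s and an outer event 50 ns after the second one, only the event at 1.0 s survives the veto.
    
    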

  20. Wind speed affects prey-catching behaviour in an orb web spider.

    PubMed

    Turner, Joe; Vollrath, Fritz; Hesselberg, Thomas

    2011-12-01

    Wind has previously been shown to influence the location and orientation of spider web sites and also the geometry and material composition of constructed orb webs. We now show that wind also influences components of prey-catching behaviour within the web. A small wind tunnel was used to generate different wind speeds. Araneus diadematus ran more slowly towards entangled Drosophila melanogaster in windy conditions, which took less time to escape the web. This indicates a lower capture probability and a diminished overall predation efficiency for spiders at higher wind speeds. We conclude that spiders' behaviour of taking down their webs as wind speed increases may therefore not be a response only to possible web damage.

  1. Wind speed affects prey-catching behaviour in an orb web spider

    NASA Astrophysics Data System (ADS)

    Turner, Joe; Vollrath, Fritz; Hesselberg, Thomas

    2011-12-01

    Wind has previously been shown to influence the location and orientation of spider web sites and also the geometry and material composition of constructed orb webs. We now show that wind also influences components of prey-catching behaviour within the web. A small wind tunnel was used to generate different wind speeds. Araneus diadematus ran more slowly towards entangled Drosophila melanogaster in windy conditions, which took less time to escape the web. This indicates a lower capture probability and a diminished overall predation efficiency for spiders at higher wind speeds. We conclude that spiders' behaviour of taking down their webs as wind speed increases may therefore not be a response only to possible web damage.

  2. A role for neurotransmission and neurodevelopment in attention-deficit/hyperactivity disorder

    PubMed Central

    2009-01-01

    Attention-deficit/hyperactivity disorder (ADHD) has a moderate to high genetic component, probably due to many genes with small effects. Several susceptibility genes have been suggested on the basis of hypotheses that catecholaminergic pathways in the brain are responsible for ADHD. However, many negative association findings have been reported, indicating a limited success for investigations using this approach. The results from genome-wide association studies have suggested that genes related to general brain functions rather than specific aspects of the disorder may contribute to its development. Plausible biological hypotheses linked to neurotransmission and neurodevelopment in general and common to different psychiatric conditions need to be considered when defining candidate genes for ADHD association studies. PMID:19930624

  3. Method of determining the orbits of the small bodies in the solar system based on an exhaustive search of orbital planes

    NASA Astrophysics Data System (ADS)

    Bondarenko, Yu. S.; Vavilov, D. E.; Medvedev, Yu. D.

    2014-05-01

    A universal method of determining the orbits of newly discovered small bodies in the Solar System from their positional observations has been developed. The proposed method determines the geocentric distances of a small body by means of an exhaustive search over heliocentric orbital planes and subsequent determination of the distance between the observer and the points at which the chosen plane intersects the vectors pointing to the object. The remaining orbital elements are then determined using the classical Gauss method, after eliminating those heliocentric distances that a priori have low probabilities. The obtained sets of elements are used to compute the rms deviation between the observed and calculated positions, and the sets of elements with the smallest rms are considered the most probable for the newly discovered small bodies. Afterwards, these elements are improved using the differential method.
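
    The geometric core of the plane-search step is a one-line intersection: for a trial heliocentric orbital plane through the Sun with unit normal n, the distance ρ along the line of sight u from the observer at heliocentric position R solves n·(R + ρu) = 0. A sketch of that single step (function and variable names are mine, units assumed AU):

    ```python
    import numpy as np

    def topocentric_distance(R, u, n):
        """Distance rho at which the line of sight crosses the trial
        heliocentric orbital plane {r : n . r = 0} (plane contains the Sun).

        R : observer's heliocentric position (e.g., AU)
        u : unit vector from observer toward the object
        n : unit normal of the trial orbital plane
        Returns None when the line of sight is (nearly) parallel to the plane
        or the intersection lies behind the observer.
        """
        denom = float(n @ u)
        if abs(denom) < 1e-12:
            return None
        rho = -float(n @ R) / denom
        return rho if rho > 0 else None
    ```

    For example, an observer at R = (1, 0, 0.5) looking along u = (0, 0, −1) crosses the plane with normal n = (0, 0, 1) at ρ = 0.5. Repeating this for a grid of candidate planes yields the candidate geocentric distances that feed the Gauss method.
    
    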

  4. Competency criteria and the class inclusion task: modeling judgments and justifications.

    PubMed

    Thomas, H; Horton, J J

    1997-11-01

    Preschool age children's class inclusion task responses were modeled as mixtures of different probability distributions. The main idea: different response strategies are equivalent to different probability distributions. A child is assigned cognitive strategy s if the posterior probability p(s) = P(child uses strategy s | the child's observed score X = x) is the largest among the candidate strategies. The general approach is widely applicable to many settings. Both judgment and justification questions were asked. The judgment response strategies identified were subclass comparison, guessing, and inclusion logic. Children's justifications lagged their judgments in development. Although justification responses may be useful, C. J. Brainerd was largely correct: if a single response variable is to be selected, a judgments variable is likely the preferable one. But the process must be modeled to identify cognitive strategies, as B. Hodkin has demonstrated.
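
    The classification rule can be sketched as a small finite mixture: each strategy carries a prior and implies a response distribution over scores, and a child is assigned the strategy with the largest posterior. The priors and per-question success rates below are illustrative inventions, not the paper's fitted values:

    ```python
    import math

    # Hypothetical mixture: each strategy implies a different success probability
    # on n class-inclusion questions
    STRATEGIES = {
        "subclass":  (0.3, 0.10),   # (prior p(s), per-question success prob)
        "guessing":  (0.4, 0.50),
        "inclusion": (0.3, 0.90),
    }

    def classify(x, n, strategies=STRATEGIES):
        """Assign the strategy maximizing the posterior P(s | X = x),
        where X | s ~ Binomial(n, q_s); the shared normalizing constant
        cancels, so unnormalized posteriors suffice."""
        posterior = {
            s: prior * math.comb(n, x) * q**x * (1 - q)**(n - x)
            for s, (prior, q) in strategies.items()
        }
        return max(posterior, key=posterior.get)
    ```

    Under these numbers a child scoring 9 of 10 is classified as using inclusion logic, 5 of 10 as guessing, and 1 of 10 as subclass comparison.
    
    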

  5. Unbiased multi-fidelity estimate of failure probability of a free plane jet

    NASA Astrophysics Data System (ADS)

    Marques, Alexandre; Kramer, Boris; Willcox, Karen; Peherstorfer, Benjamin

    2017-11-01

    Estimating failure probability related to fluid flows is a challenge because it requires a large number of evaluations of expensive models. We address this challenge by leveraging multiple low fidelity models of the flow dynamics to create an optimal unbiased estimator. In particular, we investigate the effects of uncertain inlet conditions on the width of a free plane jet. We classify a condition as failure when the corresponding jet width is below a small threshold, such that failure is a rare event (failure probability is smaller than 0.001). We estimate failure probability by combining the frameworks of multi-fidelity importance sampling and optimal fusion of estimators. Multi-fidelity importance sampling uses a low fidelity model to explore the parameter space and create a biasing distribution. An unbiased estimate is then computed with a relatively small number of evaluations of the high fidelity model. In the presence of multiple low fidelity models, this framework offers multiple competing estimators. Optimal fusion combines all competing estimators into a single estimator with minimal variance. We show that this combined framework can significantly reduce the cost of estimating failure probabilities, and thus can have a large impact in fluid flow applications. This work was funded by DARPA.
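
    The importance-sampling idea underlying the framework, sample from a biasing distribution concentrated near failure and reweight by the likelihood ratio to stay unbiased, can be shown in a minimal single-fidelity toy (estimating P(X > 4) for a standard normal; the specific problem and numbers are illustrative, not the jet-flow application):

    ```python
    import math
    import random

    random.seed(0)

    T = 4.0        # failure threshold; true P(X > 4) for N(0,1) is ~3.17e-5
    N = 100_000

    # Biasing density: shift the sampling mean to the threshold so "failures"
    # become common, then undo the bias with the likelihood ratio
    #   phi(x) / phi(x - T) = exp(-T*x + T^2 / 2).
    total = 0.0
    for _ in range(N):
        x = random.gauss(T, 1.0)               # draw from the biasing density
        if x > T:                              # failure indicator
            total += math.exp(-T * x + 0.5 * T * T)

    estimate = total / N
    ```

    Plain Monte Carlo with the same budget would observe only a few exceedances of the threshold, whereas the reweighted estimator pins the probability down to a few percent relative error. The multi-fidelity variant in the abstract builds the biasing distribution with cheap models and spends the expensive evaluations only on the reweighted estimate.
    
    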

  6. Rare Event Simulation in Radiation Transport

    NASA Astrophysics Data System (ADS)

    Kollman, Craig

    This dissertation studies methods for estimating extremely small probabilities by Monte Carlo simulation. Problems in radiation transport typically involve estimating very rare events or the expected value of a random variable which is with overwhelming probability equal to zero. These problems often have high dimensional state spaces and irregular geometries so that analytic solutions are not possible. Monte Carlo simulation must be used to estimate the radiation dosage being transported to a particular location. If the area is well shielded the probability of any one particular particle getting through is very small. Because of the large number of particles involved, even a tiny fraction penetrating the shield may represent an unacceptable level of radiation. It therefore becomes critical to be able to accurately estimate this extremely small probability. Importance sampling is a well known technique for improving the efficiency of rare event calculations. Here, a new set of probabilities is used in the simulation runs. The results are multiplied by the likelihood ratio between the true and simulated probabilities so as to keep our estimator unbiased. The variance of the resulting estimator is very sensitive to which new set of transition probabilities is chosen. It is shown that a zero variance estimator does exist, but that its computation requires exact knowledge of the solution. A simple random walk with an associated killing model for the scatter of neutrons is introduced. Large deviation results for optimal importance sampling in random walks are extended to the case where killing is present. An adaptive "learning" algorithm for implementing importance sampling is given for more general Markov chain models of neutron scatter. For finite state spaces this algorithm is shown to give, with probability one, a sequence of estimates converging exponentially fast to the true solution. In the final chapter, an attempt to generalize this algorithm to a continuous state space is made. This involves partitioning the space into a finite number of cells. There is a tradeoff between additional computation per iteration and variance reduction per iteration that arises in determining the optimal grid size. All versions of this algorithm can be thought of as a compromise between deterministic and Monte Carlo methods, capturing advantages of both techniques.

  7. Asymptotics of small deviations of the Bogoliubov processes with respect to a quadratic norm

    NASA Astrophysics Data System (ADS)

    Pusev, R. S.

    2010-10-01

    We obtain results on small deviations of Bogoliubov’s Gaussian measure, which occurs in the theory of the statistical equilibrium of quantum systems. For some random processes related to Bogoliubov processes, we find the exact asymptotics of the probabilities of their small deviations with respect to a Hilbert norm.

  8. No Bridge Too High: Infants Decide Whether to Cross Based on the Probability of Falling not the Severity of the Potential Fall

    ERIC Educational Resources Information Center

    Kretch, Kari S.; Adolph, Karen E.

    2013-01-01

    Do infants, like adults, consider both the probability of falling and the severity of a potential fall when deciding whether to cross a bridge? Crawling and walking infants were encouraged to cross bridges varying in width over a small drop-off, a large drop-off, or no drop-off. Bridge width affects the probability of falling, whereas drop-off…

  9. Portfolio effects, climate change, and the persistence of small populations: analyses on the rare plant Saussurea weberi.

    PubMed

    Abbott, Ronald E; Doak, Daniel F; Peterson, Megan L

    2017-04-01

    The mechanisms that stabilize small populations in the face of environmental variation are crucial to their long-term persistence. Building from diversity-stability concepts in community ecology, within-population diversity is gaining attention as an important component of population stability. Genetic and microhabitat variation within populations can generate diverse responses to common environmental fluctuations, dampening temporal variability across the population as a whole through portfolio effects. Yet, the potential for portfolio effects to operate at small scales within populations or to change with systematic environmental shifts, such as climate change, remain largely unexplored. We tracked the abundance of a rare alpine perennial plant, Saussurea weberi, in 49 1-m² plots within a single population over 20 yr. We estimated among-plot correlations in log annual growth rate to test for population-level synchrony and quantify portfolio effects across the 20-yr study period and also in 5-yr subsets based on June temperature quartiles. Asynchrony among plots, due to different plot-level responses to June temperature, reduced overall fluctuations in abundance and the probability of decline in population models, even when accounting for the effects of density dependence on dynamics. However, plots became more synchronous and portfolio effects decreased during the warmest years of the study, suggesting that future climate warming may erode stabilizing mechanisms in populations of this rare plant. © 2017 by the Ecological Society of America.
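
    The variance-dampening idea behind portfolio effects can be quantified with a simple index: compare the coefficient of variation (CV) of total abundance with the CV expected if all plots fluctuated in perfect synchrony. This is an illustrative measure, not necessarily the authors' exact estimator:

    ```python
    import numpy as np

    def portfolio_strength(plots):
        """Ratio of the observed CV of total abundance to the CV expected under
        perfect synchrony. Values < 1 indicate a portfolio effect: asynchrony
        among plots damps population-level variability.

        plots : array of shape (n_plots, n_years)
        """
        total = plots.sum(axis=0)
        cv_observed = total.std() / total.mean()
        # Under perfect synchrony, plot standard deviations add linearly
        cv_synchronous = plots.std(axis=1).sum() / total.mean()
        return cv_observed / cv_synchronous

    # Two plots fluctuating in antiphase: total abundance is constant
    anti = np.array([[10.0, 20, 10, 20],
                     [20.0, 10, 20, 10]])
    # The same plots in phase: no damping at all
    in_phase = np.array([[10.0, 20, 10, 20],
                         [10.0, 20, 10, 20]])
    ```

    The antiphase pair yields an index of 0 (the whole-population total never varies), while the in-phase pair yields 1; the study's finding is, in these terms, that warm years pushed the population toward the in-phase end of this spectrum.
    
    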

  10. Disentangle the Causes of the Road Barrier Effect in Small Mammals through Genetic Patterns.

    PubMed

    Ascensão, Fernando; Mata, Cristina; Malo, Juan E; Ruiz-Capillas, Pablo; Silva, Catarina; Silva, André P; Santos-Reis, Margarida; Fernandes, Carlos

    2016-01-01

    Road barrier effect is among the foremost negative impacts of roads on wildlife. Knowledge of the factors responsible for the road barrier effect is crucial to understand and predict species' responses to roads, and to improve mitigation measures in the context of management and conservation. We built a set of hypotheses aiming to infer the most probable cause of the road barrier effect (traffic effect or road surface avoidance), while controlling for the potentially confounding effects of road width, traffic volume and road age. The wood mouse Apodemus sylvaticus was used as a model species of small and forest-dwelling mammals, which are more likely to be affected by gaps in cover such as those resulting from road construction. We compared genetic patterns from opposite and same roadsides using samples from three highways, and used computer simulations to infer migration rates between opposite roadsides. Genetic patterns from 302 samples (ca. 100 per highway) suggest that the highway barrier effect for the wood mouse is due to road surface avoidance. However, from the simulations we estimated a migration rate of about 5% between opposite roadsides, indicating that some limited gene flow across highways does occur. To reduce highway impact on population genetic diversity and structure, possible mitigation measures could include retrofitting of culverts and underpasses to increase their attractiveness and facilitate their use by wood mice and other species, and setting aside roadside strips without vegetation removal to facilitate establishment and dispersal of small mammals.

  11. Disentangle the Causes of the Road Barrier Effect in Small Mammals through Genetic Patterns

    PubMed Central

    Ascensão, Fernando; Mata, Cristina; Malo, Juan E.; Ruiz-Capillas, Pablo; Silva, Catarina; Silva, André P.; Santos-Reis, Margarida; Fernandes, Carlos

    2016-01-01

    Road barrier effect is among the foremost negative impacts of roads on wildlife. Knowledge of the factors responsible for the road barrier effect is crucial to understand and predict species’ responses to roads, and to improve mitigation measures in the context of management and conservation. We built a set of hypotheses aiming to infer the most probable cause of the road barrier effect (traffic effect or road surface avoidance), while controlling for the potentially confounding effects of road width, traffic volume and road age. The wood mouse Apodemus sylvaticus was used as a model species of small and forest-dwelling mammals, which are more likely to be affected by gaps in cover such as those resulting from road construction. We compared genetic patterns from opposite and same roadsides using samples from three highways, and used computer simulations to infer migration rates between opposite roadsides. Genetic patterns from 302 samples (ca. 100 per highway) suggest that the highway barrier effect for the wood mouse is due to road surface avoidance. However, from the simulations we estimated a migration rate of about 5% between opposite roadsides, indicating that some limited gene flow across highways does occur. To reduce highway impact on population genetic diversity and structure, possible mitigation measures could include retrofitting of culverts and underpasses to increase their attractiveness and facilitate their use by wood mice and other species, and setting aside roadside strips without vegetation removal to facilitate establishment and dispersal of small mammals. PMID:26978779

  12. Small mammal-heavy metal concentrations from mined and control sites

    USGS Publications Warehouse

    Smith, G.J.; Rongstad, O.J.

    1982-01-01

    Total body concentrations of zinc, copper, cadmium, lead, nickel, mercury and arsenic were determined for Peromyscus maniculatus and Microtus pennsylvanicus from an active zinc-copper mine near Timmins, Ontario, Canada, and a proposed zinc-copper mine near Crandon, Wisconsin, USA. Metal concentrations were evaluated with respect to area, species, sex and age groups. Metal concentrations in Peromyscus from the proposed mine site were not different from those collected in a third area where no mine or deposit exists. This is probably due to the 30 m of glacial material over the proposed mine site deposit. A statistical interaction between area, species, sex and age was observed for zinc and copper concentrations in the small mammals we examined. Peromyscus from the mine site had consistently higher metal concentrations than Peromyscus from the control site. Greater total body cadmium and lead concentrations in adult, compared with juvenile, Peromyscus collected at the mine site suggest age-dependent accumulation of these toxic metals. Microtus did not exhibit this age-related response, and responded to other environmental metals more erratically and to a lesser degree. Differences in the response of these two species to environmental metal exposure may be due to differences in food habits. Nickel, mercury and arsenic concentrations in small mammals from the mine site were not different from controls. Heavy metal concentrations are also presented for Sorex cinereus, Blarina brevicauda and Zapus hudsonius without respect to age and sex cohorts. Peromyscus may be a potentially important species for the monitoring of heavy metal pollution.

  13. Critical behavior of the XY-rotor model on regular and small-world networks

    NASA Astrophysics Data System (ADS)

    De Nigris, Sarah; Leoncini, Xavier

    2013-07-01

    We study the XY-rotor model on small networks whose number of links scales with the system size as N_links ~ N^γ, where 1 ≤ γ ≤ 2. We first focus on regular one-dimensional rings in the microcanonical ensemble. For γ < 1.5 the model behaves like a short-range one and no phase transition occurs. For γ > 1.5, the system equilibrium properties are found to be identical to the mean field, which displays a second-order phase transition at a critical energy density ε = E/N of ε_c = 0.75. Moreover, for γ_c ≃ 1.5 we find that a nontrivial state emerges, characterized by an infinite susceptibility. We then consider small-world networks, using the Watts-Strogatz mechanism on the regular networks parametrized by γ. We first analyze the topology and find that the small-world regime appears for rewiring probabilities which scale as p_SW ∝ 1/N^γ. Then, considering the XY-rotor model on these networks, we find that a second-order phase transition occurs at a critical energy ε_c which depends logarithmically on the topological parameters p and γ. We also define a critical probability p_MF, corresponding to the probability beyond which the mean field is quantitatively recovered, and we analyze its dependence on γ.
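
    The Watts-Strogatz rewiring mechanism invoked in this abstract can be sketched in a few lines of Python. This is a minimal stdlib illustration of the plain mechanism only; the paper additionally scales the link count as N^γ and locates the small-world regime at rewiring probabilities p_SW ∝ 1/N^γ, and the values of n, k and p below are purely illustrative.

```python
import random

def watts_strogatz(n, k, p, seed=0):
    """Ring lattice of n nodes, each tied to its k nearest neighbours
    (k even), with every edge rewired to a random new endpoint with
    probability p; the total number of links n*k/2 is preserved."""
    rng = random.Random(seed)
    adj = {i: set() for i in range(n)}
    edges = []
    for i in range(n):
        for j in range(1, k // 2 + 1):
            b = (i + j) % n
            adj[i].add(b)
            adj[b].add(i)
            edges.append((i, b))
    for idx, (a, b) in enumerate(edges):
        if rng.random() < p:
            c = rng.randrange(n)
            while c == a or c in adj[a]:   # no self-loops or duplicate links
                c = rng.randrange(n)
            adj[a].discard(b)
            adj[b].discard(a)
            adj[a].add(c)
            adj[c].add(a)
            edges[idx] = (a, c)
    return adj, edges

adj, edges = watts_strogatz(100, 4, 0.1)
print(len(edges))   # link count preserved: 100 * 4 / 2 = 200
```

With p = 0 the regular ring is recovered; increasing p interpolates toward a random graph, which is exactly the knob the study turns on top of the γ-parametrized lattices.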

  14. Characterizing risk of Ebola transmission based on frequency and type of case–contact exposures

    PubMed Central

    Fallah, Mosoka P.; Gaffney, Stephen G.; Yaari, Rami; Yamin, Dan; Huppert, Amit; Bawo, Luke; Nyenswah, Tolbert; Galvani, Alison P.

    2017-01-01

    During the initial months of the 2013–2016 Ebola epidemic, rapid geographical dissemination and intense transmission challenged response efforts across West Africa. Contextual behaviours associated with increased risk of exposure included travel to high-transmission settings, caring for the sick and preparing the deceased for traditional funerals. Although such behaviours are widespread in West Africa, high-transmission pockets were observed. Superspreading and clustering are typical phenomena in infectious disease outbreaks, as a relatively small number of transmission chains are often responsible for the majority of events. Determining the characteristics of contacts at greatest risk of developing disease and of cases with greatest transmission potential could therefore help curb propagation of infection. Our analysis of contact tracing data from Montserrado County, Liberia, suggested that the probability of transmission was 4.5 times higher for individuals who were reported as having contact with multiple cases. The probability of individuals developing disease was not significantly associated with the age or sex of their source case but was higher when they were in the same household as the infectious case. Surveillance efforts for rapidly identifying symptomatic individuals, and effectively messaged campaigns encouraging household members to bring the sick to designated treatment centres without administration of home care, could mitigate transmission. This article is part of the themed issue ‘The 2013–2016 West African Ebola epidemic: data, decision-making and disease control’. PMID:28396472

  15. Non-additive effects of pollen limitation and self-incompatibility reduce plant reproductive success and population viability

    PubMed Central

    Young, Andrew G.; Broadhurst, Linda M.; Thrall, Peter H.

    2012-01-01

    Background and Aims Mating system is a primary determinant of the ecological and evolutionary dynamics of wild plant populations. Pollen limitation and loss of self-incompatibility genotypes can both act independently to reduce seed set and these effects are commonly observed in fragmented landscapes. This study used a simulation modelling approach to assess the interacting effects of these two processes on plant reproductive performance and population viability for a range of pollination likelihood, self-incompatibility systems and S-allele richness conditions. Methods A spatially explicit, individual-based, genetic and demographic simulation model parameterized to represent a generic self-incompatible, short-lived perennial herb was used to conduct simulation experiments in which pollination probability, self-incompatibility type (gametophytic and sporophytic) and S-allele richness were systematically varied in combination to assess their independent and interacting effects on the demographic response variables of mate availability, seed set, population size and population persistence. Key Results Joint effects of reduced pollination probability and low S-allele richness were greater than independent effects for all demographic response variables except population persistence under high pollinator service (>50 %). At intermediate values of 15–25 % pollination probability, non-linear interactions with S-allele richness generated significant reductions in population performance beyond those expected by the simple additive effect of each independently. This was due to the impacts of reduced effective population size on the ability of populations to retain S alleles and maintain mate availability. 
Across a limited set of pollination and S-allele conditions (P = 0.15 and S = 20) populations with gametophytic SI showed reduced S-allele erosion relative to those with sporophytic SI, but this had limited effects on individual fecundity and translated into only modest increases in population persistence. Conclusions Interactions between pollen limitation and loss of S alleles have the potential to significantly reduce the viability of populations of a few hundred plants. Population decline may occur more rapidly than expected when pollination probabilities drop below 25 % and S alleles number fewer than 20, due to non-additive interactions. These are likely to be common conditions experienced by plants in small populations in fragmented landscapes, and are also those under which differences in response between gametophytic and sporophytic systems are observed. PMID:22184620
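
    The dependence of mate availability on S-allele richness under gametophytic self-incompatibility can be illustrated with a toy simulation. This is a hedged stdlib-Python sketch, not the authors' spatially explicit demographic-genetic model: a haploid pollen grain carrying allele S_i is rejected whenever the maternal genotype also carries S_i, so fewer S alleles in the population mean fewer compatible matings. The population size, trial count and random-mating assumption are all illustrative.

```python
import random

def mate_availability(n_alleles, pop_size=200, trials=2000, seed=1):
    """Fraction of pollination attempts accepted under gametophytic SI."""
    rng = random.Random(seed)

    def genotype():
        # SI enforces heterozygosity at the S locus
        a = rng.randrange(n_alleles)
        b = rng.randrange(n_alleles)
        while b == a:
            b = rng.randrange(n_alleles)
        return (a, b)

    plants = [genotype() for _ in range(pop_size)]
    accepted = 0
    for _ in range(trials):
        mother = rng.choice(plants)
        father = rng.choice(plants)
        pollen = rng.choice(father)   # gametophytic: pollen expresses its own haploid allele
        if pollen not in mother:      # rejected if the mother shares the allele
            accepted += 1
    return accepted / trials

for s in (3, 5, 20):
    print(s, round(mate_availability(s), 2))
```

With only three alleles roughly one pollination in three succeeds, while with twenty alleles most do, which is the qualitative mechanism behind the non-additive interaction with pollen limitation reported above.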

  16. Non-additive effects of pollen limitation and self-incompatibility reduce plant reproductive success and population viability.

    PubMed

    Young, Andrew G; Broadhurst, Linda M; Thrall, Peter H

    2012-02-01

    Mating system is a primary determinant of the ecological and evolutionary dynamics of wild plant populations. Pollen limitation and loss of self-incompatibility genotypes can both act independently to reduce seed set and these effects are commonly observed in fragmented landscapes. This study used a simulation modelling approach to assess the interacting effects of these two processes on plant reproductive performance and population viability for a range of pollination likelihood, self-incompatibility systems and S-allele richness conditions. A spatially explicit, individual-based, genetic and demographic simulation model parameterized to represent a generic self-incompatible, short-lived perennial herb was used to conduct simulation experiments in which pollination probability, self-incompatibility type (gametophytic and sporophytic) and S-allele richness were systematically varied in combination to assess their independent and interacting effects on the demographic response variables of mate availability, seed set, population size and population persistence. Joint effects of reduced pollination probability and low S-allele richness were greater than independent effects for all demographic response variables except population persistence under high pollinator service (>50 %). At intermediate values of 15-25 % pollination probability, non-linear interactions with S-allele richness generated significant reductions in population performance beyond those expected by the simple additive effect of each independently. This was due to the impacts of reduced effective population size on the ability of populations to retain S alleles and maintain mate availability. Across a limited set of pollination and S-allele conditions (P = 0·15 and S = 20) populations with gametophytic SI showed reduced S-allele erosion relative to those with sporophytic SI, but this had limited effects on individual fecundity and translated into only modest increases in population persistence. 
Interactions between pollen limitation and loss of S alleles have the potential to significantly reduce the viability of populations of a few hundred plants. Population decline may occur more rapidly than expected when pollination probabilities drop below 25 % and S alleles number fewer than 20, due to non-additive interactions. These are likely to be common conditions experienced by plants in small populations in fragmented landscapes, and are also those under which differences in response between gametophytic and sporophytic systems are observed.

  17. National surveillance for radiological exposures and intentional potassium iodide and iodine product ingestions in the United States associated with the 2011 Japan radiological incident

    PubMed Central

    LAW, ROYAL K.; SCHIER, JOSH G.; MARTIN, COLLEEN A.; OLIVARES, DAGNY E.; THOMAS, RICHARD G.; BRONSTEIN, ALVIN C.; CHANG, ARTHUR S.

    2015-01-01

    Background In March of 2011, an earthquake struck Japan causing a tsunami that resulted in a radiological release from the damaged Fukushima Daiichi nuclear power plant. Surveillance for potential radiological and any iodine/iodide product exposures was initiated on the National Poison Data System (NPDS) to target public health messaging needs within the United States (US). Our objectives are to describe self-reported exposures to radiation, potassium iodide (KI) and other iodine/iodide products which occurred during the US federal response and discuss its public health impact. Methods All calls to poison centers associated with the Japan incident were identified from March 11, 2011 to April 18, 2011 in NPDS. Exposure, demographic and health outcome information were collected. Calls about reported radiation exposures and KI or other iodine/iodide product ingestions were then categorized with regard to exposure likelihood based on follow-up information obtained from the PC where each call originated. Reported exposures were subsequently classified as probable exposures (high likelihood of exposure), probable non-exposures (low likelihood of exposure), and suspect exposure (unknown likelihood of exposure). Results We identified 400 calls to PCs associated with the incident, with 340 information requests (no exposure reported) and 60 reported exposures. The majority (n = 194; 57%) of the information requests mentioned one or more substances. Radiation was inquired about most frequently (n = 88; 45%), followed by KI (n = 86; 44%) and other iodine/iodide products (n = 47; 24%). Of the 60 reported exposures, KI was reported most frequently (n = 25; 42%), followed by radiation (n = 22; 37%) and other iodine/iodide products (n = 13; 22%). Among reported KI exposures, most were classified as probable exposures (n = 24; 96%); one was a probable non-exposure. 
Among reported other iodine/iodide product exposures, most were probable exposures (n = 10; 77%) and the rest were suspect exposures (n = 3; 23%). The reported radiation exposures were classified as suspect exposures (n = 16; 73%) or probable non-exposures (n = 6; 27%). No radiation exposures were classified as probable exposures. A small number of the probable exposures to KI and other iodide/iodine products reported adverse signs or symptoms (n = 9; 26%). The majority of probable exposures had no adverse outcomes (n = 28; 82%). These data identified a potential public health information gap regarding KI and other iodine/iodide products which was then addressed through public health messaging activities. Conclusion During the Japan incident response, surveillance activities using NPDS identified KI and other iodine/iodide products as potential public health concerns within the US, which guided CDC’s public health messaging and communication activities. Regional PCs can provide timely and additional information during a public health emergency to enhance data collected from surveillance activities, which in turn can be used to inform public health decision-making. PMID:23043524

  18. National surveillance for radiological exposures and intentional potassium iodide and iodine product ingestions in the United States associated with the 2011 Japan radiological incident.

    PubMed

    Law, Royal K; Schier, Josh G; Martin, Colleen A; Olivares, Dagny E; Thomas, Richard G; Bronstein, Alvin C; Chang, Arthur S

    2013-01-01

    In March of 2011, an earthquake struck Japan causing a tsunami that resulted in a radiological release from the damaged Fukushima Daiichi nuclear power plant. Surveillance for potential radiological and any iodine/iodide product exposures was initiated on the National Poison Data System (NPDS) to target public health messaging needs within the United States (US). Our objectives are to describe self-reported exposures to radiation, potassium iodide (KI) and other iodine/iodide products which occurred during the US federal response and discuss its public health impact. All calls to poison centers associated with the Japan incident were identified from March 11, 2011 to April 18, 2011 in NPDS. Exposure, demographic and health outcome information were collected. Calls about reported radiation exposures and KI or other iodine/iodide product ingestions were then categorized with regard to exposure likelihood based on follow-up information obtained from the PC where each call originated. Reported exposures were subsequently classified as probable exposures (high likelihood of exposure), probable non-exposures (low likelihood of exposure), and suspect exposure (unknown likelihood of exposure). We identified 400 calls to PCs associated with the incident, with 340 information requests (no exposure reported) and 60 reported exposures. The majority (n = 194; 57%) of the information requests mentioned one or more substances. Radiation was inquired about most frequently (n = 88; 45%), followed by KI (n = 86; 44%) and other iodine/iodide products (n = 47; 24%). Of the 60 reported exposures, KI was reported most frequently (n = 25; 42%), followed by radiation (n = 22; 37%) and other iodine/iodide products (n = 13; 22%). Among reported KI exposures, most were classified as probable exposures (n = 24; 96%); one was a probable non-exposure. 
Among reported other iodine/iodide product exposures, most were probable exposures (n = 10; 77%) and the rest were suspect exposures (n = 3; 23%). The reported radiation exposures were classified as suspect exposures (n = 16; 73%) or probable non-exposures (n = 6; 27%). No radiation exposures were classified as probable exposures. A small number of the probable exposures to KI and other iodide/iodine products reported adverse signs or symptoms (n = 9; 26%). The majority of probable exposures had no adverse outcomes (n = 28; 82%). These data identified a potential public health information gap regarding KI and other iodine/iodide products which was then addressed through public health messaging activities. During the Japan incident response, surveillance activities using NPDS identified KI and other iodine/iodide products as potential public health concerns within the US, which guided CDC's public health messaging and communication activities. Regional PCs can provide timely and additional information during a public health emergency to enhance data collected from surveillance activities, which in turn can be used to inform public health decision-making.

  19. Small-area estimation of the probability of toxocariasis in New York City based on sociodemographic neighborhood composition.

    PubMed

    Walsh, Michael G; Haseeb, M A

    2014-01-01

    Toxocariasis is increasingly recognized as an important neglected infection of poverty (NIP) in developed countries, and may constitute the most important NIP in the United States (US) given its association with chronic sequelae such as asthma and poor cognitive development. Its potential public health burden notwithstanding, toxocariasis surveillance is minimal throughout the US and so the true burden of disease remains uncertain in many areas. The Third National Health and Nutrition Examination Survey conducted a representative serologic survey of toxocariasis to estimate the prevalence of infection in diverse US subpopulations across different regions of the country. Using the NHANES III surveillance data, the current study applied the predicted probabilities of toxocariasis to the sociodemographic composition of New York census tracts to estimate the local probability of infection across the city. The predicted probability of toxocariasis ranged from 6% among US-born Latino women with a university education to 57% among immigrant men with less than a high school education. The predicted probability of toxocariasis exhibited marked spatial variation across the city, with particularly high infection probabilities in large sections of Queens, and smaller, more concentrated areas of Brooklyn and northern Manhattan. This investigation is the first attempt at small-area estimation of the probability surface of toxocariasis in a major US city. While this study does not define toxocariasis risk directly, it does provide a much needed tool to aid the development of toxocariasis surveillance in New York City.
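
    The core small-area estimation step described above, applying stratum-specific predicted probabilities to the sociodemographic composition of each census tract, reduces to a population-weighted average. The sketch below uses hypothetical strata and tract shares; only the 6% and 57% endpoints come from the abstract.

```python
def tract_probability(composition, stratum_prob):
    """Population-weighted probability of toxocariasis for one tract.
    composition: {stratum: share of tract population}, shares summing to 1.
    stratum_prob: {stratum: predicted probability from the survey model}."""
    return sum(share * stratum_prob[s] for s, share in composition.items())

# hypothetical strata; the 0.57 and 0.06 endpoints are from the abstract
probs = {"immigrant_men_low_edu": 0.57,
         "usborn_latina_women_univ": 0.06,
         "other": 0.20}
tract = {"immigrant_men_low_edu": 0.30,
         "usborn_latina_women_univ": 0.20,
         "other": 0.50}
print(round(tract_probability(tract, probs), 3))   # 0.3*0.57 + 0.2*0.06 + 0.5*0.20
```

Repeating this for every tract in the city yields the kind of probability surface the study maps across Queens, Brooklyn and northern Manhattan.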

  20. Small-Area Estimation of the Probability of Toxocariasis in New York City Based on Sociodemographic Neighborhood Composition

    PubMed Central

    Walsh, Michael G.; Haseeb, M. A.

    2014-01-01

    Toxocariasis is increasingly recognized as an important neglected infection of poverty (NIP) in developed countries, and may constitute the most important NIP in the United States (US) given its association with chronic sequelae such as asthma and poor cognitive development. Its potential public health burden notwithstanding, toxocariasis surveillance is minimal throughout the US and so the true burden of disease remains uncertain in many areas. The Third National Health and Nutrition Examination Survey conducted a representative serologic survey of toxocariasis to estimate the prevalence of infection in diverse US subpopulations across different regions of the country. Using the NHANES III surveillance data, the current study applied the predicted probabilities of toxocariasis to the sociodemographic composition of New York census tracts to estimate the local probability of infection across the city. The predicted probability of toxocariasis ranged from 6% among US-born Latino women with a university education to 57% among immigrant men with less than a high school education. The predicted probability of toxocariasis exhibited marked spatial variation across the city, with particularly high infection probabilities in large sections of Queens, and smaller, more concentrated areas of Brooklyn and northern Manhattan. This investigation is the first attempt at small-area estimation of the probability surface of toxocariasis in a major US city. While this study does not define toxocariasis risk directly, it does provide a much needed tool to aid the development of toxocariasis surveillance in New York City. PMID:24918785

  1. Reliability of Radioisotope Stirling Convertor Linear Alternator

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin; Korovaichuk, Igor; Geng, Steven M.; Schreiber, Jeffrey G.

    2006-01-01

    Onboard radioisotope power systems being developed and planned for NASA's deep-space missions would require reliable design lifetimes of up to 14 years. Critical components and materials of Stirling convertors have been undergoing extensive testing and evaluation in support of reliable performance over the specified life span. Of significant importance to the successful development of the Stirling convertor is the design of a lightweight and highly efficient linear alternator. Alternator performance could vary due to small deviations in the permanent magnet properties, operating temperature, and component geometries. Durability prediction and reliability of the alternator may be affected by these deviations from nominal design conditions. Therefore, it is important to evaluate the effect of these uncertainties in predicting the reliability of the linear alternator performance. This paper presents a study in which a reliability-based methodology is used to assess alternator performance. The response surface characterizing the induced open-circuit voltage performance is constructed using 3-D finite element magnetic analysis. The fast probability integration method is used to determine the probability of the desired performance and its sensitivity to the alternator design parameters.
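
    The reliability assessment described above can be approximated in spirit by sampling a response surface. The paper builds its response surface from 3-D finite element magnetic analysis and evaluates it with the fast probability integration method; the stand-in below is a toy closed-form voltage surface with purely illustrative coefficients and tolerances, sampled by plain Monte Carlo instead.

```python
import random

def voltage(magnet, temp, gap):
    """Toy response surface for open-circuit voltage (illustrative
    coefficients, standing in for the 3-D finite element model)."""
    return 100.0 * magnet * (1.0 - 0.002 * (temp - 300.0)) / (1.0 + 5.0 * gap)

def reliability(v_min=90.0, n=50_000, seed=0):
    """Probability that the voltage meets the spec v_min under small
    random deviations of magnet strength, temperature and air gap."""
    rng = random.Random(seed)
    ok = 0
    for _ in range(n):
        magnet = rng.gauss(1.00, 0.02)    # per-unit magnet remanence
        temp = rng.gauss(300.0, 5.0)      # operating temperature, K
        gap = rng.gauss(0.020, 0.001)     # air gap, arbitrary units
        if voltage(magnet, temp, gap) >= v_min:
            ok += 1
    return ok / n

print(reliability())
```

Re-running with different input standard deviations gives the sensitivity of the performance probability to each design parameter, which is the question the paper's methodology answers more efficiently.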

  2. Entanglement-Assisted Weak Value Amplification

    NASA Astrophysics Data System (ADS)

    Pang, Shengshi; Dressel, Justin; Brun, Todd A.

    2014-07-01

    Large weak values have been used to amplify the sensitivity of a linear response signal for detecting changes in a small parameter, which has also enabled a simple method for precise parameter estimation. However, producing a large weak value requires a low postselection probability for an ancilla degree of freedom, which limits the utility of the technique. We propose an improvement to this method that uses entanglement to increase the efficiency. We show that by entangling and postselecting n ancillas, the postselection probability can be increased by a factor of n while keeping the weak value fixed (compared to n uncorrelated attempts with one ancilla), which is the optimal scaling with n that is expected from quantum metrology. Furthermore, we show the surprising result that the quantum Fisher information about the detected parameter can be almost entirely preserved in the postselected state, which allows the sensitive estimation to approximately saturate the relevant quantum Cramér-Rao bound. To illustrate this protocol we provide simple quantum circuits that can be implemented using current experimental realizations of three entangled qubits.

  3. The Transcriptional Response of Lactobacillus sanfranciscensis DSM 20451T and Its tcyB Mutant Lacking a Functional Cystine Transporter to Diamide Stress

    PubMed Central

    Stetina, Mandy; Behr, Jürgen

    2014-01-01

    As a result of its strong adaptation to wheat and rye sourdoughs, Lactobacillus sanfranciscensis has the smallest genome within the genus Lactobacillus. The concomitant absence of some important antioxidative enzymes and the inability to synthesize glutathione suggest a role of cystine transport in maintenance of an intracellular thiol balance. Diamide [synonym 1,1′-azobis(N,N-dimethylformamide)] disturbs intracellular and membrane thiol levels in oxidizing protein thiols depending on its initial concentration. In this study, RNA sequencing was used to reveal the transcriptional response of L. sanfranciscensis DSM 20451T (wild type [WT]) and its ΔtcyB mutant with a nonfunctional cystine transporter after thiol stress caused by diamide. Along with the different expression of genes involved in amino acid starvation, pyrimidine synthesis, and energy production, our results show that thiol stress in the wild type can be compensated through activation of diverse chaperones and proteases whereas the ΔtcyB mutant shifts its metabolism in the direction of survival. Only a small set of genes are significantly differentially expressed between the wild type and the mutant. In the WT, mainly genes which are associated with a heat shock response are upregulated whereas glutamine import and synthesis genes are downregulated. In the ΔtcyB mutant, the whole opp operon was more highly expressed, as well as a protein which probably includes enzymes for methionine transport. The two proteins encoded by spxA and nrdH, which are involved in direct or indirect oxidative stress responses, are also upregulated in the mutant. This work emphasizes that even in the absence of definitive antioxidative enzymes, bacteria with a small genome and a high frequency of gene inactivation and elimination use small molecules such as the cysteine/cystine couple to overcome potential cell damage resulting from oxidative stress. PMID:24795368

  4. The transcriptional response of Lactobacillus sanfranciscensis DSM 20451T and its tcyB mutant lacking a functional cystine transporter to diamide stress.

    PubMed

    Stetina, Mandy; Behr, Jürgen; Vogel, Rudi F

    2014-07-01

    As a result of its strong adaptation to wheat and rye sourdoughs, Lactobacillus sanfranciscensis has the smallest genome within the genus Lactobacillus. The concomitant absence of some important antioxidative enzymes and the inability to synthesize glutathione suggest a role of cystine transport in maintenance of an intracellular thiol balance. Diamide [synonym 1,1'-azobis(N,N-dimethylformamide)] disturbs intracellular and membrane thiol levels in oxidizing protein thiols depending on its initial concentration. In this study, RNA sequencing was used to reveal the transcriptional response of L. sanfranciscensis DSM 20451(T) (wild type [WT]) and its ΔtcyB mutant with a nonfunctional cystine transporter after thiol stress caused by diamide. Along with the different expression of genes involved in amino acid starvation, pyrimidine synthesis, and energy production, our results show that thiol stress in the wild type can be compensated through activation of diverse chaperones and proteases whereas the ΔtcyB mutant shifts its metabolism in the direction of survival. Only a small set of genes are significantly differentially expressed between the wild type and the mutant. In the WT, mainly genes which are associated with a heat shock response are upregulated whereas glutamine import and synthesis genes are downregulated. In the ΔtcyB mutant, the whole opp operon was more highly expressed, as well as a protein which probably includes enzymes for methionine transport. The two proteins encoded by spxA and nrdH, which are involved in direct or indirect oxidative stress responses, are also upregulated in the mutant. This work emphasizes that even in the absence of definitive antioxidative enzymes, bacteria with a small genome and a high frequency of gene inactivation and elimination use small molecules such as the cysteine/cystine couple to overcome potential cell damage resulting from oxidative stress.

  5. Informing the network: Improving communication with interface communities during wildland fire

    USGS Publications Warehouse

    Taylor, J.G.; Gillette, S.C.; Hodgson, R.W.; Downing, J.L.; Burns, M.R.; Chavez, D.J.; Hogan, J.T.

    2007-01-01

    An interagency research team studied fire communications that took place during different stages of two wildfires in southern California: one small fire of short duration and one large fire of long duration. This "quick-response" research showed that pre-fire communication planning was particularly effective for smaller fire events, and parts of that planning proved invaluable for the large fire event as well. Information seeking by the affected public relied on locally convenient sources during the small fire. During the large fire, widespread evacuations disrupted many of the local informal communication networks. Residents' needs were for "real-time," place-specific information: precise location, severity, size, and direction of spread of the fires. Fire management agencies must contribute real-time, place-specific fire information when it is most needed by the affected public, as they try to make sense out of the chaos of a wildland fire. Disseminating fire information as broadly as possible through multiple pathways will maximize the probability of the public finding the information they need. © Society for Human Ecology.

  6. Effects of the March 1964 Alaska earthquake on glaciers: Chapter D in The Alaska earthquake, March 27, 1964: effects on hydrologic regimen

    USGS Publications Warehouse

    Post, Austin

    1967-01-01

    The 1964 Alaska earthquake occurred in a region where there are many hundreds of glaciers, large and small. Aerial photographic investigations indicate that no snow and ice avalanches of large size occurred on glaciers despite the violent shaking. Rockslide avalanches extended onto the glaciers in many localities, seven very large ones occurring in the Copper River region 160 kilometers east of the epicenter. Some of these avalanches traveled several kilometers at low gradients; compressed air may have provided a lubricating layer. If long-term changes in glaciers due to tectonic changes in altitude and slope occur, they will probably be very small. No evidence of large-scale dynamic response of any glacier to earthquake shaking or avalanche loading was found in either the Chugach or Kenai Mountains 16 months after the 1964 earthquake, nor was there any evidence of surges (rapid advances) as postulated by the Earthquake-Advance Theory of Tarr and Martin.

  7. Selection of forage-fish schools by Murrelets and Tufted Puffins in Prince William Sound, Alaska

    USGS Publications Warehouse

    Ostrand, William D.; Coyle, Kenneth O.; Drew, Gary S.; Maniscalco, John M.; Irons, David B.

    1998-01-01

    We collected hydroacoustic and bird-observation data simultaneously along transects in three areas in Prince William Sound, Alaska, 21 July-11 August 1995. The probability of the association of fish schools with Marbled Murrelets (Brachyramphus marmoratus) and Tufted Puffins (Fratercula cirrhata) was determined through the use of resource selection functions based on logistic regression. Mean (± SD) group sizes were small for both species, 1.7 ± 1.1 and 1.2 ± 0.7 for Marbled Murrelets and Tufted Puffins, respectively. Oceanographically, all study areas were stratified with synchronous thermo- and pycnoclines (water layers of increasing temperature and density, respectively, with increasing depth). Our analysis indicated that Tufted Puffins selected fish schools near their colony, whereas Marbled Murrelets selected smaller, denser fish schools in shallower habitats. We suggest that murrelets selected shallower habitats in response to lower maximum diving depths than puffins. Small feeding-group size is discussed in terms of foraging theory and as a consequence of dispersed, low-density food resources.
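
    Resource selection functions of the kind used here are typically fit by logistic regression on used-versus-available units. Below is a hedged stdlib-Python sketch on fabricated data (the real analysis used hydroacoustically measured school covariates): schools "used" by birds are simulated as shallower and denser, and the fitted coefficient signs recover that selection pattern. All data and settings are illustrative.

```python
import math
import random

def fit_rsf(X, y, lr=0.5, steps=1500):
    """Logistic-regression resource selection function, fit by gradient
    ascent on the log-likelihood; columns are standardized first so the
    coefficient signs are directly interpretable."""
    d = len(X[0])
    mu = [sum(r[j] for r in X) / len(X) for j in range(d)]
    sd = [max(1e-9, (sum((r[j] - mu[j]) ** 2 for r in X) / len(X)) ** 0.5)
          for j in range(d)]
    Z = [[(r[j] - mu[j]) / sd[j] for j in range(d)] for r in X]
    beta = [0.0] * (d + 1)
    for _ in range(steps):
        grad = [0.0] * (d + 1)
        for zi, yi in zip(Z, y):
            s = beta[0] + sum(b * x for b, x in zip(beta[1:], zi))
            p = 1.0 / (1.0 + math.exp(-s))
            err = yi - p
            grad[0] += err
            for j, x in enumerate(zi):
                grad[j + 1] += err * x
        beta = [b + lr * g / len(Z) for b, g in zip(beta, grad)]
    return beta

# fabricated data: "used" schools are shallower and denser than available ones
rng = random.Random(0)
X, y = [], []
for _ in range(200):
    used = rng.random() < 0.5
    X.append([rng.gauss(20 if used else 40, 8),   # depth (m)
              rng.gauss(5 if used else 3, 1)])    # relative density
    y.append(1 if used else 0)
b0, b_depth, b_dens = fit_rsf(X, y)
print(b_depth < 0, b_dens > 0)   # selection against depth, for density
```

The sign pattern (negative for depth, positive for density) mirrors the murrelet result reported above: selection for smaller, denser schools in shallower habitat.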

  8. Generalized skew-symmetric interfacial probability distribution in reflectivity and small-angle scattering analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiang, Zhang; Chen, Wei

    Generalized skew-symmetric probability density functions are proposed to model asymmetric interfacial density distributions for the parameterization of any arbitrary density profiles in the `effective-density model'. The penetration of the densities into adjacent layers can be selectively controlled and parameterized. A continuous density profile is generated and discretized into many independent slices of very thin thickness with constant density values and sharp interfaces. The discretized profile can be used to calculate reflectivities via Parratt's recursive formula, or small-angle scattering via the concentric onion model that is also developed in this work.

  9. Generalized skew-symmetric interfacial probability distribution in reflectivity and small-angle scattering analysis

    DOE PAGES

    Jiang, Zhang; Chen, Wei

    2017-11-03

    Generalized skew-symmetric probability density functions are proposed to model asymmetric interfacial density distributions for the parameterization of any arbitrary density profiles in the `effective-density model'. The penetration of the densities into adjacent layers can be selectively controlled and parameterized. A continuous density profile is generated and discretized into many independent slices of very thin thickness with constant density values and sharp interfaces. The discretized profile can be used to calculate reflectivities via Parratt's recursive formula, or small-angle scattering via the concentric onion model that is also developed in this work.
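The slicing scheme described in this record can be illustrated with a short numerical sketch. The code below is not the authors' implementation; it simply models an asymmetric interface with a skew-normal density (the alpha parameter controls the skew) whose cumulative integral interpolates between two layer densities, then discretizes the profile into thin constant-density slices of the kind fed to Parratt's recursion. All names and parameter values are illustrative.

```python
import math

def skew_normal_cdf(x, alpha, n=2000, lo=-8.0):
    """Trapezoidal integral of the skew-normal PDF 2*phi(t)*Phi(alpha*t) from lo to x."""
    if x <= lo:
        return 0.0
    h = (x - lo) / n
    total = 0.0
    for i in range(n + 1):
        t = lo + i * h
        pdf = (2.0 * math.exp(-0.5 * t * t) / math.sqrt(2.0 * math.pi)
               * 0.5 * (1.0 + math.erf(alpha * t / math.sqrt(2.0))))
        total += (0.5 if i in (0, n) else 1.0) * pdf
    return total * h

def discretize_profile(rho_top, rho_bottom, alpha, z_min=-4.0, z_max=4.0, n_slices=80):
    """Slice an asymmetric interface profile into thin, constant-density layers."""
    dz = (z_max - z_min) / n_slices
    slices = []
    for i in range(n_slices):
        z = z_min + (i + 0.5) * dz
        f = skew_normal_cdf(z, alpha)  # runs from 0 to 1 across the interface
        slices.append(rho_top + (rho_bottom - rho_top) * f)
    return slices

layers = discretize_profile(0.0, 1.0, alpha=4.0)
```

The resulting `layers` list is a monotone ramp from the top-layer density to the bottom-layer density; the skew parameter controls how far the profile penetrates into each adjacent layer, which is the asymmetry the record's generalized skew-symmetric functions parameterize.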

  10. Warship Combat System Selection Methodology Based on Discrete Event Simulation

    DTIC Science & Technology

    2010-09-01

    PD, damage probability; PHit, hit probability; PKill, kill probability; RSM, response surface model; SAM, surface-to-air missile. ...such a large target allows the assumption that the probability of a hit (PHit) is one. This structure can be considered as a bridge; therefore, the

  11. Green tea extract induces protective autophagy in A549 non-small lung cancer cell line.

    PubMed

    Izdebska, Magdalena; Klimaszewska-Wiśniewska, Anna; Hałas, Marta; Gagat, Maciej; Grzanka, Alina

    2015-12-31

    For many decades, polyphenols, including green tea extract catechins, have been reported to exert multiple anti-tumor activities. However, to date the mechanisms of their action have not been completely elucidated. Thus, the aim of this study was to assess the effect of green tea extract (GTE) on non-small cell lung cancer A549 cells. A549 cells following treatment with GTE were analyzed using inverted light and fluorescence microscopes. In order to evaluate cell sensitivity and cell death, the MTT assay and a Tali image-based cytometer were used, respectively. Ultrastructural alterations were assessed using a transmission electron microscope. The obtained data suggested that GTE, even at the highest dose employed (150 μM), was not toxic to A549 cells. Likewise, treatment with GTE resulted in only a very small dose-dependent increase in the population of apoptotic cells. However, enhanced accumulation of vacuole-like structures in response to GTE was seen at the light and electron microscopic level. Furthermore, an increase in acidic vesicular organelles and LC3-II puncta formation was observed under the fluorescence microscope following GTE treatment. The analysis of the functional status of autophagy revealed that GTE-induced autophagy may provide self-protection against its own cytotoxicity, since we observed that blockage of autophagy by bafilomycin A1 decreased the viability of A549 cells and potentiated necrotic cell death induction in response to GTE treatment. Collectively, our results revealed that A549 cells are insensitive to both low and high concentrations of green tea extract, probably due to the induction of cytoprotective autophagy. These data suggest that a potential utility of GTE in lung cancer therapy may lie in its synergistic combinations with drugs or small molecules that target autophagy, rather than in monotherapy.

  12. Score distributions of gapped multiple sequence alignments down to the low-probability tail

    NASA Astrophysics Data System (ADS)

    Fieth, Pascal; Hartmann, Alexander K.

    2016-08-01

    Assessing the significance of alignment scores of optimally aligned DNA or amino acid sequences can be achieved via knowledge of the score distribution of random sequences. But this requires obtaining the distribution in the biologically relevant high-scoring region, where the probabilities are exponentially small. For gapless local alignments of infinitely long sequences this distribution is known analytically to follow a Gumbel distribution. Distributions for gapped local alignments and global alignments of finite lengths can only be obtained numerically. To obtain results for the small-probability region, specific statistical mechanics-based rare-event algorithms can be applied. In previous studies, this was achieved for pairwise alignments. They showed that, contrary to results from previous simple sampling studies, strong deviations from the Gumbel distribution occur in the case of finite sequence lengths. Here we extend the studies to multiple sequence alignments with gaps, which are much more relevant for practical applications in molecular biology. We study the distributions of scores over a large range of the support, reaching probabilities as small as 10^-160, for global and local (sum-of-pair scores) multiple alignments. We find that even after suitable rescaling, eliminating the sequence-length dependence, the distributions for multiple alignments differ from the pairwise alignment case. Furthermore, we also show that the previously discussed Gaussian correction to the Gumbel distribution needs to be refined, also for the case of pairwise alignments.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Jiangjiang; Li, Weixuan; Lin, Guang

    In decision-making for groundwater management and contamination remediation, it is important to accurately evaluate the probability of the occurrence of a failure event. For small failure probability analysis, a large number of model evaluations are needed in the Monte Carlo (MC) simulation, which is impractical for CPU-demanding models. One approach to alleviate the computational cost caused by the model evaluations is to construct a computationally inexpensive surrogate model instead. However, using a surrogate approximation can cause an extra error in the failure probability analysis. Moreover, constructing accurate surrogates is challenging for high-dimensional models, i.e., models containing many uncertain input parameters. To address these issues, we propose an efficient two-stage MC approach for small failure probability analysis in high-dimensional groundwater contaminant transport modeling. In the first stage, a low-dimensional representation of the original high-dimensional model is sought with Karhunen–Loève expansion and sliced inverse regression jointly, which allows for the easy construction of a surrogate with polynomial chaos expansion. Then a surrogate-based MC simulation is implemented. In the second stage, the small number of samples that are close to the failure boundary are re-evaluated with the original model, which corrects the bias introduced by the surrogate approximation. The proposed approach is tested with a numerical case study and is shown to be 100 times faster than the traditional MC approach in achieving the same level of estimation accuracy.
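The two-stage idea generalizes readily beyond groundwater modeling. The toy sketch below is not the authors' Karhunen–Loève/sliced-inverse-regression/polynomial-chaos pipeline; the model, threshold, and band width are all illustrative. It evaluates a cheap surrogate everywhere (stage 1) and re-evaluates only the samples whose surrogate output lands near the failure threshold with the "expensive" model (stage 2).

```python
import math
import random

random.seed(0)

def true_model(x):
    """Stand-in for a CPU-expensive simulator."""
    return math.exp(x)

def surrogate(x):
    """Cheap approximation with a small, bounded relative error (illustrative)."""
    return math.exp(x) * (1.0 + 0.05 * math.sin(5.0 * x))

THRESHOLD = 6.0   # failure event: model output exceeds this value
BAND = 0.5        # stage-2 band around the failure boundary

samples = [random.gauss(0.0, 1.0) for _ in range(100_000)]

failures = 0
expensive_calls = 0
for x in samples:
    y = surrogate(x)                   # stage 1: surrogate everywhere
    if abs(y - THRESHOLD) <= BAND:     # stage 2: re-check only near the boundary
        expensive_calls += 1
        y = true_model(x)
    failures += y > THRESHOLD

p_fail = failures / len(samples)
```

Because the surrogate's error is smaller than the re-evaluation band, misclassification can only happen inside the band, where the true model is consulted; the failure probability estimate is therefore unbiased while the expensive model is called for only a small fraction of the samples.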

  14. Effects of track structure and cell inactivation on the calculation of heavy ion mutation rates in mammalian cells

    NASA Technical Reports Server (NTRS)

    Cucinotta, F. A.; Wilson, J. W.; Shavers, M. R.; Katz, R.

    1996-01-01

    It has long been suggested that inactivation severely affects the probability of mutation by heavy ions in mammalian cells. Heavy ions have observed cross sections of inactivation that approach and sometimes exceed the geometric size of the cell nucleus in mammalian cells. In the track structure model of Katz the inactivation cross section is found by summing an inactivation probability over all impact parameters from the ion to the sensitive sites within the cell nucleus. The inactivation probability is evaluated using the dose-response of the system to gamma-rays and the radial dose of the ions, and may be equal to unity at small impact parameters for some ions. We show how the effects of inactivation may be taken into account in the evaluation of the mutation cross sections for heavy ions in the track structure model through correlation of sites for gene mutation and cell inactivation. The model is fit to available data for HPRT mutations in Chinese hamster cells and good agreement is found. The resulting calculations qualitatively show that mutation cross sections for heavy ions display minima at velocities where inactivation cross sections display maxima. Also, calculations show the high probability of mutation by relativistic heavy ions, due to the radial extension of the ion's track from delta-rays, in agreement with the microlesion concept. The effects of inactivation on mutation rates make it very unlikely that a single parameter such as LET or Z*^2/β^2 can be used to specify radiation quality for heavy ion bombardment.

  15. Array coding for large data memories

    NASA Technical Reports Server (NTRS)

    Tranter, W. H.

    1982-01-01

    It is pointed out that an array code is a convenient method for storing large quantities of data. In a typical application, the array consists of N data words having M symbols in each word. The probability of undetected error is considered, taking into account the three symbol error probabilities of interest, and a formula for determining the probability of undetected error is given. Attention is given to the possibility of reading data into the array using a digital communication system with symbol error probability p. Two different schemes are found to be of interest. The conducted analysis of array coding shows that the probability of undetected error is very small, even for relatively large arrays.
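As a concrete, hypothetical instance of such an array code (not the scheme of the report itself), the sketch below appends a parity symbol to each row and column of a binary N × M array and estimates the undetected-error probability by Monte Carlo. An error pattern goes undetected only if every row and column parity still checks, which requires at least four suitably placed errors, so undetected errors stay rare even when detected errors are common.

```python
import random

random.seed(1)

N, M = 8, 8      # data words x symbols per word (bits here, for simplicity)
P_ERR = 0.05     # per-symbol channel error probability

def encode(data):
    """Append a parity bit to each row and each column (plus the corner bit)."""
    rows = [row + [sum(row) % 2] for row in data]
    col_par = [sum(rows[i][j] for i in range(N)) % 2 for j in range(M + 1)]
    return rows + [col_par]

def parity_ok(block):
    """True when every row parity and every column parity is even."""
    return (all(sum(r) % 2 == 0 for r in block) and
            all(sum(block[i][j] for i in range(N + 1)) % 2 == 0
                for j in range(M + 1)))

detected = undetected = 0
for _ in range(20_000):
    data = [[random.randint(0, 1) for _ in range(M)] for _ in range(N)]
    block = encode(data)
    flips = 0
    for i in range(N + 1):        # pass the array through a noisy channel
        for j in range(M + 1):
            if random.random() < P_ERR:
                block[i][j] ^= 1
                flips += 1
    if flips and parity_ok(block):
        undetected += 1
    elif flips:
        detected += 1
```

With these illustrative parameters nearly every trial contains channel errors, almost all of which are caught by the row/column checks, mirroring the abstract's conclusion that the undetected-error probability is very small even for sizable arrays.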

  16. Vertex evoked potentials in a rating-scale detection task: Relation to signal probability

    NASA Technical Reports Server (NTRS)

    Squires, K. C.; Squires, N. K.; Hillyard, S. A.

    1974-01-01

    Vertex evoked potentials were recorded from human subjects performing in an auditory detection task with rating-scale responses. Three values of a priori probability of signal presentation were tested. The amplitudes of the N1 and P3 components of the vertex potential associated with correct detections of the signal were found to be systematically related to the strictness of the response criterion and independent of variations in a priori signal probability. No similar evoked potential components were found to be associated with signal-absent judgements (misses and correct rejections), regardless of the confidence level of the judgement or signal probability. These results strongly support the contention that the form of the vertex evoked response is closely correlated with the subject's psychophysical decision regarding the presence or absence of a threshold-level signal.

  17. Adenosine uptake is the major effector of extracellular ATP toxicity in human cervical cancer cells

    PubMed Central

    Mello, Paola de Andrade; Filippi-Chiela, Eduardo Cremonese; Nascimento, Jéssica; Beckenkamp, Aline; Santana, Danielle Bertodo; Kipper, Franciele; Casali, Emerson André; Nejar Bruno, Alessandra; Paccez, Juliano Domiraci; Zerbini, Luiz Fernando; Wink, Marcia Rosângela; Lenz, Guido; Buffon, Andréia

    2014-01-01

    In cervical cancer, HPV infection and disruption of mechanisms involving cell growth, differentiation, and apoptosis are strictly linked with tumor progression and invasion. Tumor microenvironment is ATP and adenosine rich, suggesting a role for purinergic signaling in cancer cell growth and death. Here we investigate the effect of extracellular ATP on human cervical cancer cells. We find that extracellular ATP itself has a small cytotoxic effect, whereas adenosine formed from ATP degradation by ectonucleotidases is the main factor responsible for apoptosis induction. The level of P2X7 receptor seemed to define the main cytotoxic mechanism triggered by ATP, since ATP itself eliminated a small subpopulation of cells that express high P2X7 levels, probably through its activation. Corroborating these data, blockage or knockdown of P2X7 only slightly reduced ATP cytotoxicity. On the other hand, cell viability was almost totally recovered with dipyridamole, an adenosine transporter inhibitor. Moreover, ATP-induced apoptosis and signaling—p53 increase, AMPK activation, and PARP cleavage—as well as autophagy induction were also inhibited by dipyridamole. In addition, inhibition of adenosine conversion into AMP also blocked cell death, indicating that metabolization of intracellular adenosine originating from extracellular ATP is responsible for the main effects of the latter in human cervical cancer cells. PMID:25103241

  18. An inflammation-related nomogram for predicting the survival of patients with non-small cell lung cancer after pulmonary lobectomy.

    PubMed

    Wang, Ying; Qu, Xiao; Kam, Ngar-Woon; Wang, Kai; Shen, Hongchang; Liu, Qi; Du, Jiajun

    2018-06-26

    Emerging inflammatory response biomarkers have been developed to predict the survival of patients with cancer; the aim of our study was to establish an inflammation-related nomogram, based on classical predictive biomarkers, to predict the survival of patients with non-small cell lung cancer (NSCLC). Nine hundred and fifty-two NSCLC patients who underwent lung cancer surgery were enrolled in this study. The cutoffs of the inflammatory response biomarkers were determined by receiver operating characteristic (ROC) curves. Univariate and multivariate analyses were conducted to select independent prognostic factors for developing the nomogram. The median follow-up time was 40.0 months (range, 1 to 92 months). The neutrophil to lymphocyte ratio (cut-off: 3.10; HR: 1.648; P = 0.045) was selected to establish the nomogram, which could predict the 5-year OS probability. The C-index of the nomogram was 0.72, and the 5-year OS calibration curve displayed optimal agreement between the actual observed outcomes and the predicted results. The neutrophil to lymphocyte ratio was shown to be a valuable biomarker for predicting the survival of patients with NSCLC. The addition of the neutrophil to lymphocyte ratio could improve the accuracy and predictability of the nomogram, providing a reference for clinicians when assessing patient outcomes.

  19. Distributed Storage Algorithm for Geospatial Image Data Based on Data Access Patterns.

    PubMed

    Pan, Shaoming; Li, Yongkai; Xu, Zhengquan; Chong, Yanwen

    2015-01-01

    Declustering techniques are widely used in distributed environments to reduce query response time through parallel I/O by splitting large files into several small blocks and then distributing those blocks among multiple storage nodes. Unfortunately, however, many small geospatial image data files cannot be further split for distributed storage. In this paper, we propose a complete theoretical system for the distributed storage of small geospatial image data files based on mining the access patterns of geospatial image data from their historical access log information. First, an algorithm is developed to construct an access correlation matrix based on an analysis of the log information, which reveals the patterns of access to the geospatial image data. Then, a practical heuristic algorithm is developed to determine a reasonable solution based on the access correlation matrix. Finally, a number of comparative experiments are presented, demonstrating that our algorithm achieves a total parallel access probability approximately 10-15% higher than those of other algorithms, and that performance can be further improved by more than 20% by simultaneously applying a copy storage strategy. These experiments show that the algorithm can be applied in distributed environments to help realize parallel I/O and thereby improve system performance.
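A minimal sketch of the two steps described above, with an invented toy access log (the paper's algorithms operate on large historical logs, and its heuristic is more elaborate): build a co-access correlation matrix, then greedily place each tile on the node holding the least correlated data, so tiles that are usually requested together land on different nodes and can be read in parallel.

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical access log: each entry is the set of tiles fetched by one query.
log = [
    {"t1", "t2"}, {"t1", "t2"}, {"t1", "t2", "t3"},
    {"t3", "t4"}, {"t3", "t4"}, {"t2", "t3"},
]

# Step 1: access correlation matrix -- how often two tiles co-occur in a query.
corr = defaultdict(int)
for session in log:
    for a, b in combinations(sorted(session), 2):
        corr[(a, b)] += 1

# Step 2: greedy heuristic -- assign each tile to the node whose current
# contents are least correlated with it, enabling parallel fetches.
NODES = 2
tiles = sorted({t for s in log for t in s})
placement = {}
for t in tiles:
    load = [0] * NODES
    for u, node in placement.items():
        load[node] += corr.get((min(t, u), max(t, u)), 0)
    placement[t] = load.index(min(load))
```

On this toy log the frequently co-accessed pairs (t1, t2) and (t3, t4) end up on different nodes, which is exactly the property that raises the total parallel access probability.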

  20. Concurrent progressive ratio schedules: Effects of reinforcer probability on breakpoint and response allocation.

    PubMed

    Jarmolowicz, David P; Sofis, Michael J; Darden, Alexandria C

    2016-07-01

    Although progressive ratio (PR) schedules have been used to explore the effects of a range of reinforcer parameters (e.g., magnitude, delay), the effects of reinforcer probability remain underexplored. The present project used independently progressing concurrent PR PR schedules to examine the effects of reinforcer probability on PR breakpoint (the highest ratio completed prior to a session-terminating 300-s pause) and response allocation. The probability of reinforcement on one lever remained at 100% across all conditions, while the probability of reinforcement on the other lever was systematically manipulated (i.e., 100%, 50%, 25%, 12.5%, and a replication of 25%). Breakpoints on the manipulated lever systematically decreased with decreasing reinforcer probability, while breakpoints on the control lever remained unchanged. Patterns of switching between the two levers were well described by a choice-by-choice unit price model that accounted for the hyperbolic discounting of the value of probabilistic reinforcers. Copyright © 2016 Elsevier B.V. All rights reserved.

  1. Ordinal probability effect measures for group comparisons in multinomial cumulative link models.

    PubMed

    Agresti, Alan; Kateri, Maria

    2017-03-01

    We consider simple ordinal model-based probability effect measures for comparing distributions of two groups, adjusted for explanatory variables. An "ordinal superiority" measure summarizes the probability that an observation from one distribution falls above an independent observation from the other distribution, adjusted for explanatory variables in a model. The measure applies directly to normal linear models and to a normal latent variable model for ordinal response variables. It equals Φ(β/√2) for the corresponding ordinal model that applies a probit link function to cumulative multinomial probabilities, for standard normal cdf Φ and effect β that is the coefficient of the group indicator variable. For the more general latent variable model for ordinal responses that corresponds to a linear model with other possible error distributions and corresponding link functions for cumulative multinomial probabilities, the ordinal superiority measure equals exp(β)/[1+exp(β)] with the log-log link and equals approximately exp(β/√2)/[1+exp(β/√2)] with the logit link, where β is the group effect. Another ordinal superiority measure generalizes the difference of proportions from binary to ordinal responses. We also present related measures directly for ordinal models for the observed response that need not assume corresponding latent response models. We present confidence intervals for the measures and illustrate with an example. © 2016, The International Biometric Society.
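For the probit case the measure has a simple closed form. The sketch below evaluates Φ(β/√2) with the standard-normal cdf written via `math.erf`, together with the logistic-link approximation quoted in the abstract; the function names are illustrative, not from the paper.

```python
import math

def phi(x):
    """Standard normal cdf via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def superiority_probit(beta):
    """Ordinal superiority under a probit cumulative link: Phi(beta / sqrt(2))."""
    return phi(beta / math.sqrt(2.0))

def superiority_logit(beta):
    """Approximate measure under a logit link: exp(b)/(1+exp(b)) with b = beta/sqrt(2)."""
    b = beta / math.sqrt(2.0)
    return math.exp(b) / (1.0 + math.exp(b))
```

For β = 0 (no group effect) both expressions equal 0.5, meaning neither group tends to score higher, and both increase toward 1 as the group effect grows.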

  2. Calculation of Heavy Ion Inactivation and Mutation Rates in Radial Dose Model of Track Structure

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.; Wilson, John W.; Shavers, Mark R.; Katz, Robert

    1997-01-01

    In the track structure model, the inactivation cross section is found by summing an inactivation probability over all impact parameters from the ion to the sensitive sites within the cell nucleus. The inactivation probability is evaluated by using the dose response of the system to gamma rays and the radial dose of the ions, and may be equal to unity at small impact parameters. We apply the track structure model to recent data with heavy ion beams irradiating biological samples of E. coli, B. subtilis spores, and Chinese hamster (V79) cells. Heavy ions have observed cross sections for inactivation that approach and sometimes exceed the geometric size of the cell nucleus. We show how the effects of inactivation may be taken into account in the evaluation of the mutation cross sections in the track structure model through correlation of sites for gene mutation and cell inactivation. The model is fit to available data for HPRT (hypoxanthine guanine phosphoribosyl transferase) mutations in V79 cells, and good agreement is found. Calculations show the high probability of mutation by relativistic ions due to the radial extension of the ion's track from delta rays. The effects of inactivation on mutation rates make it very unlikely that a single parameter such as LET (linear energy transfer) can be used to specify radiation quality for heavy ion bombardment.

  3. Comparison of Motor Inhibition in Variants of the Instructed-Delay Choice Reaction Time Task

    PubMed Central

    Quoilin, Caroline; Lambert, Julien; Jacob, Benvenuto; Klein, Pierre-Alexandre; Duque, Julie

    2016-01-01

    Using instructed-delay choice reaction time (RT) paradigms, many previous studies have shown that the motor system is transiently inhibited during response preparation: motor-evoked potentials (MEPs) elicited by transcranial magnetic stimulation (TMS) over the primary motor cortex are typically suppressed during the delay period. This effect has been observed in both selected and non-selected effectors, although MEP changes in selected effectors have been more inconsistent across task versions. Here, we compared changes in MEP amplitudes in three different variants of an instructed-delay choice RT task. All variants required participants to choose between left and right index finger movements but the responses were either provided “in the air” (Variant 1), on a regular keyboard (Variant 2), or on a response device designed to control for premature responses (Variant 3). The task variants also differed according to the visual layout (more concrete in Variant 3) and depending on whether participants received feedback on their performance (absent in Variant 1). Behavior was globally comparable between the three variants of the task although the propensity to respond prematurely was highest in Variant 2 and lowest in Variant 3. MEPs elicited in a non-selected hand were similarly suppressed in the three variants of the task. However, significant differences emerged when considering MEPs elicited in the selected hand: these MEPs were suppressed in Variants 1 and 3 whereas they were often facilitated in Variant 2, especially in the right dominant hand. In conclusion, MEPs elicited in selected muscles seem to be more sensitive to small variations to the task design than those recorded in non-selected effectors, probably because they reflect a complex combination of inhibitory and facilitatory influences on the motor output system.
Finally, the use of a standard keyboard seems to be particularly inappropriate because it encourages participants to respond promptly with no means to control for premature responses, probably increasing the relative amount of facilitatory influences at the time motor inhibition is probed. PMID:27579905

  4. Comparison of Motor Inhibition in Variants of the Instructed-Delay Choice Reaction Time Task.

    PubMed

    Quoilin, Caroline; Lambert, Julien; Jacob, Benvenuto; Klein, Pierre-Alexandre; Duque, Julie

    2016-01-01

    Using instructed-delay choice reaction time (RT) paradigms, many previous studies have shown that the motor system is transiently inhibited during response preparation: motor-evoked potentials (MEPs) elicited by transcranial magnetic stimulation (TMS) over the primary motor cortex are typically suppressed during the delay period. This effect has been observed in both selected and non-selected effectors, although MEP changes in selected effectors have been more inconsistent across task versions. Here, we compared changes in MEP amplitudes in three different variants of an instructed-delay choice RT task. All variants required participants to choose between left and right index finger movements but the responses were either provided "in the air" (Variant 1), on a regular keyboard (Variant 2), or on a response device designed to control for premature responses (Variant 3). The task variants also differed according to the visual layout (more concrete in Variant 3) and depending on whether participants received feedback on their performance (absent in Variant 1). Behavior was globally comparable between the three variants of the task although the propensity to respond prematurely was highest in Variant 2 and lowest in Variant 3. MEPs elicited in a non-selected hand were similarly suppressed in the three variants of the task. However, significant differences emerged when considering MEPs elicited in the selected hand: these MEPs were suppressed in Variants 1 and 3 whereas they were often facilitated in Variant 2, especially in the right dominant hand. In conclusion, MEPs elicited in selected muscles seem to be more sensitive to small variations to the task design than those recorded in non-selected effectors, probably because they reflect a complex combination of inhibitory and facilitatory influences on the motor output system.
Finally, the use of a standard keyboard seems to be particularly inappropriate because it encourages participants to respond promptly with no means to control for premature responses, probably increasing the relative amount of facilitatory influences at the time motor inhibition is probed.

  5. Secondary School Students' Reasoning about Conditional Probability, Samples, and Sampling Procedures

    ERIC Educational Resources Information Center

    Prodromou, Theodosia

    2016-01-01

    In the Australian mathematics curriculum, Year 12 students (aged 16-17) are asked to solve conditional probability problems that involve the representation of the problem situation with two-way tables or three-dimensional diagrams and consider sampling procedures that result in different correct answers. In a small exploratory study, we…

  6. Tropical Cyclone Wind Probability Forecasting (WINDP).

    DTIC Science & Technology

    1981-04-01

    The accuracy of small probabilities (below 10%) is limited by the number of significant digits given; therefore, they should be regarded as being...

  7. Kolmogorov complexity, statistical regularization of inverse problems, and Birkhoff's formalization of beauty

    NASA Astrophysics Data System (ADS)

    Kreinovich, Vladik; Longpre, Luc; Koshelev, Misha

    1998-09-01

    Most practical applications of statistical methods are based on the implicit assumption that if an event has a very small probability, then it cannot occur. For example, the probability that a kettle placed on a cold stove would start boiling by itself is not 0, it is positive, but it is so small that physicists conclude that such an event is simply impossible. This assumption is difficult to formalize in traditional probability theory, because this theory only describes measures on sets and does not allow us to divide functions into 'random' and non-random ones. This distinction was made possible by the idea of algorithmic randomness, introduced by Kolmogorov and his student Martin-Löf in the 1960s. We show that this idea can also be used for inverse problems. In particular, we prove that for every probability measure, the corresponding set of random functions is compact, and, therefore, the corresponding restricted inverse problem is well-defined. The resulting technique turns out to be interestingly related to the qualitative esthetic measure introduced by G. Birkhoff as order/complexity.

  8. Randomness in Competitions

    NASA Astrophysics Data System (ADS)

    Ben-Naim, E.; Hengartner, N. W.; Redner, S.; Vazquez, F.

    2013-05-01

    We study the effects of randomness on competitions based on an elementary random process in which there is a finite probability that a weaker team upsets a stronger team. We apply this model to sports leagues and sports tournaments, and compare the theoretical results with empirical data. Our model shows that single-elimination tournaments are efficient but unfair: the number of games is proportional to the number of teams N, but the probability that the weakest team wins decays only algebraically with N. In contrast, leagues, where every team plays every other team, are fair but inefficient: the top √N teams remain in contention for the championship, while the probability that the weakest team becomes champion is exponentially small. We also propose a gradual elimination schedule that consists of a preliminary round and a championship round. Initially, teams play a small number of preliminary games, and subsequently, a few teams qualify for the championship round. This algorithm is fair and efficient: the best team wins with a high probability and the number of games scales as N^(9/5), whereas traditional leagues require N^3 games to fairly determine a champion.
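The upset process described above is straightforward to simulate. The sketch below (parameters illustrative) runs single-elimination brackets in which the weaker team wins each game with probability q, showing that the best team usually prevails while the weakest team's title probability is small but nonzero:

```python
import random

random.seed(2)

def play(team_a, team_b, q):
    """Lower index = stronger team; the weaker team upsets with probability q."""
    favorite, underdog = min(team_a, team_b), max(team_a, team_b)
    return underdog if random.random() < q else favorite

def single_elimination(n_teams, q):
    """One bracket: pairs play, winners advance, until a champion remains."""
    teams = list(range(n_teams))
    while len(teams) > 1:
        teams = [play(teams[i], teams[i + 1], q) for i in range(0, len(teams), 2)]
    return teams[0]

n, q, trials = 16, 0.25, 20_000
wins_best = sum(single_elimination(n, q) == 0 for _ in range(trials))
wins_worst = sum(single_elimination(n, q) == n - 1 for _ in range(trials))
```

With 16 teams the best team must win log2(16) = 4 games, so it takes the title with probability (1-q)^4 ≈ 0.32 here, while the weakest team needs four straight upsets, q^4 ≈ 0.004: small, but decaying only as a power of the bracket depth rather than exponentially in N.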

  9. The contribution of threat probability estimates to reexperiencing symptoms: a prospective analog study.

    PubMed

    Regambal, Marci J; Alden, Lynn E

    2012-09-01

    Individuals with posttraumatic stress disorder (PTSD) are hypothesized to have a "sense of current threat." Perceived threat from the environment (i.e., external threat), can lead to overestimating the probability of the traumatic event reoccurring (Ehlers & Clark, 2000). However, it is unclear if external threat judgments are a pre-existing vulnerability for PTSD or a consequence of trauma exposure. We used trauma analog methodology to prospectively measure probability estimates of a traumatic event, and investigate how these estimates were related to cognitive processes implicated in PTSD development. 151 participants estimated the probability of being in car-accident related situations, watched a movie of a car accident victim, and then completed a measure of data-driven processing during the movie. One week later, participants re-estimated the probabilities, and completed measures of reexperiencing symptoms and symptom appraisals/reactions. Path analysis revealed that higher pre-existing probability estimates predicted greater data-driven processing which was associated with negative appraisals and responses to intrusions. Furthermore, lower pre-existing probability estimates and negative responses to intrusions were both associated with a greater change in probability estimates. Reexperiencing symptoms were predicted by negative responses to intrusions and, to a lesser degree, by greater changes in probability estimates. The undergraduate student sample may not be representative of the general public. The reexperiencing symptoms are less severe than what would be found in a trauma sample. Threat estimates present both a vulnerability and a consequence of exposure to a distressing event. Furthermore, changes in these estimates are associated with cognitive processes implicated in PTSD. Copyright © 2012 Elsevier Ltd. All rights reserved.

  10. 75 FR 11729 - Secretary's Order 2-2010

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-11

    ... Order 13515, ``Increasing Participation of Asian Americans and Pacific Islanders in Federal Programs... increase the probability of participation by small businesses as prime contractors, or to facilitate small... Orders 12432 and 12928, including efforts to increase the involvement of minority businesses in the...

  11. Qualitative fusion technique based on information poor system and its application to factor analysis for vibration of rolling bearings

    NASA Astrophysics Data System (ADS)

    Xia, Xintao; Wang, Zhongyu

    2008-10-01

    For methods of system stability analysis based on statistics, it is difficult to resolve the problems of unknown probability distributions and small samples. Therefore, a novel method is proposed in this paper to resolve these problems. The method is independent of the probability distribution and is useful for small-sample systems. After rearrangement of the original data series, the order difference and two polynomial membership functions are introduced to estimate the true value, the lower bound, and the upper bound of the system using fuzzy-set theory. The empirical distribution function is then investigated to ensure a confidence level above 95%, and a degree of similarity is presented to evaluate the stability of the system. Computer simulation cases investigate stable systems with various probability distributions, unstable systems with linear and periodic systematic errors, and some mixed systems. The proposed method of systematic stability analysis is thereby validated.

  12. Combination of a Stressor-Response Model with a Conditional Probability Analysis Approach for Developing Candidate Criteria from MBSS

    EPA Science Inventory

    I show that a conditional probability analysis using a stressor-response model based on a logistic regression provides a useful approach for developing candidate water quality criteria from empirical data, such as the Maryland Biological Streams Survey (MBSS) data.
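
    The two-step recipe the abstract describes—fit a logistic stressor-response model, then condition on sites exceeding a candidate criterion—can be sketched as follows. The coefficients and stressor values are illustrative placeholders, not MBSS estimates:

    ```python
    import math

    # Hypothetical fitted logistic stressor-response model:
    # P(impaired | x) = 1 / (1 + exp(-(b0 + b1 * x)))
    b0, b1 = -3.0, 0.08

    def p_impaired(x):
        """P(biological impairment | stressor level x) under the model."""
        return 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))

    # Synthetic stressor observations at surveyed sites.
    stressors = [20, 35, 50, 60, 75, 90, 110, 130, 160, 200]

    def conditional_prob(threshold):
        """Average P(impaired) over sites whose stressor level meets or
        exceeds a candidate criterion (the conditional-probability step)."""
        exceed = [p_impaired(x) for x in stressors if x >= threshold]
        return sum(exceed) / len(exceed) if exceed else None

    for t in (50, 100, 150):
        print(t, round(conditional_prob(t), 3))
    ```

    Scanning candidate thresholds and inspecting how the conditional probability of impairment rises is one way such curves can inform a criterion choice.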

  13. Investigating the probability of detection of typical cavity shapes through modelling and comparison of geophysical techniques

    NASA Astrophysics Data System (ADS)

    James, P.

    2011-12-01

    With a growing need for housing in the U.K., the government has proposed increased development of brownfield sites. However, old mine workings and natural cavities represent a potential hazard before, during and after construction on such sites, and add further complication to subsurface parameters. Cavities are hence a limitation to certain redevelopment, and their detection is an ever more important consideration. The current standard technique for cavity detection is a borehole grid, which is intrusive, non-continuous, slow and expensive. A new robust investigation standard for the detection of cavities is sought, and geophysical techniques offer an attractive alternative. Geophysical techniques have previously been utilised successfully in the detection of cavities in various geologies, but geophysics still has an uncertain reputation in the engineering industry. Engineers are unsure of the techniques and are inclined to rely on well-known methods rather than utilise new technologies. Bad experiences with geophysics are commonly due to the indiscriminate choice of particular techniques. It is imperative that a geophysical survey is designed with the specific site and target in mind at all times, with the ability and judgement to rule out some, or all, techniques. To this author's knowledge no comparative software exists to aid technique choice. Also, previous modelling software limits the shapes of bodies, and hence typical cavity shapes are not represented. Here, we introduce 3D modelling software (Matlab) which computes and compares the response to various cavity targets from a range of techniques (gravity, gravity gradient, magnetic, magnetic gradient and GPR). Typical near-surface cavity shapes are modelled, including shafts, bellpits, various lining and capping materials, and migrating voids. The probability of cavity detection is assessed in typical subsurface and noise conditions across a range of survey parameters. Techniques can be compared and the limits of detection distance assessed. The density of survey points required to achieve a required probability of detection can be calculated. The software aids discriminating choice of technique, improves survey design, and increases the likelihood of survey success; all factors sought in the engineering industry. As a simple example, the response from magnetometry, gravimetry, and gravity gradient techniques above an example 3 m deep, 1 m cube air cavity in limestone across a 15 m grid was calculated. The maximum responses above the cavity are small (amplitudes of 0.018 nT, 0.0013 mGal, and 8.3 eotvos respectively), but at typical site noise levels the detection reliability is over 50% for the gradient gravity method on a single survey line. Increasing the number of survey points across the site increases the reliability of detection of the anomaly by the addition of probabilities. We can calculate the probability of detection at different profile spacings to assess the best possible survey design. At 1 m spacing the overall probability of detection by the gradient gravity method is over 90%, and over 60% for magnetometry (at 3 m spacing the probability drops to 32%). The use of modelling in near-surface surveys is a useful tool to assess the feasibility of a range of techniques to detect subtle signals. Future work will integrate this work with borehole-measured parameters.
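
    The "addition of probabilities" across survey points is, under an independence assumption, the probability that at least one point detects the anomaly: P = 1 - ∏(1 - pᵢ). A minimal sketch, with an illustrative per-point detection probability rather than a value from the modelled survey:

    ```python
    def overall_detection_prob(per_point_probs):
        """Chance that at least one survey point registers the anomaly,
        assuming each point's detection is independent."""
        miss = 1.0
        for p in per_point_probs:
            miss *= 1.0 - p
        return 1.0 - miss

    # Illustrative: five survey points that each alone detect the cavity
    # with probability 0.37 (hypothetical, not the paper's figure).
    print(round(overall_detection_prob([0.37] * 5), 2))  # → 0.9
    ```

    Halving the profile spacing adds points and so raises the combined probability, which is how a spacing requirement can be backed out from a target detection confidence.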

  14. A Hybrid Demand Response Simulator Version 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2012-05-02

    A hybrid demand response simulator (HDRS) is developed to test different control algorithms for centralized and distributed demand response (DR) programs in a small distribution power grid. The HDRS is designed to model a wide variety of DR services such as peak shaving, load shifting, arbitrage, spinning reserves, load following, regulation, emergency load shedding, etc. The HDRS does not model the dynamic behaviors of the loads; rather, it simulates the load scheduling and dispatch process. The load models include TCAs (thermostatically controlled appliances: water heaters, air conditioners, refrigerators, freezers, etc.) and non-TCAs (lighting, washers, dishwashers, etc.). Ambient temperature changes, thermal resistance, capacitance, and the unit control logics can be modeled for TCA loads. The use patterns of non-TCAs can be modeled by probability of use and probabilistic durations. Some communication network characteristics, such as delays and errors, can also be modeled. Most importantly, because the simulator is modular and greatly simplifies the thermal models for TCA loads, it can be used quickly and easily to test and validate different control algorithms in a simulated environment.
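
    The non-TCA scheduling idea (a probability of use plus a probabilistic duration) can be sketched as follows; the appliance, probabilities, and duration range are hypothetical illustrations, not HDRS defaults:

    ```python
    import random

    random.seed(42)  # reproducible illustration

    def schedule_non_tca(use_prob, duration_range, hours=24):
        """Hourly on/off profile for a non-TCA load (e.g., a dishwasher):
        the appliance runs with probability use_prob on a given day, at a
        uniformly chosen start hour, for a uniformly chosen duration."""
        profile = [0] * hours
        if random.random() < use_prob:
            duration = random.randint(*duration_range)
            start = random.randrange(hours)
            for h in range(start, min(start + duration, hours)):
                profile[h] = 1
        return profile

    profile = schedule_non_tca(use_prob=0.7, duration_range=(1, 2))
    print(sum(profile))  # total on-hours simulated for this day
    ```

    A dispatch algorithm under test would then shift or curtail such profiles rather than integrate any thermal dynamics, matching the scheduling-level abstraction the abstract describes.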

  15. Risk, Reward, and Decision-Making in a Rodent Model of Cognitive Aging

    PubMed Central

    Gilbert, Ryan J.; Mitchell, Marci R.; Simon, Nicholas W.; Bañuelos, Cristina; Setlow, Barry; Bizon, Jennifer L.

    2011-01-01

    Impaired decision-making in aging can directly impact factors (financial security, health care) that are critical to maintaining quality of life and independence at advanced ages. Naturalistic rodent models mimic human aging in other cognitive domains, and afford the opportunity to parse the effects of age on discrete aspects of decision-making in a manner relatively uncontaminated by experiential factors. Young adult (5–7 months) and aged (23–25 months) male F344 rats were trained on a probability discounting task in which they made discrete-trial choices between a small certain reward (one food pellet) and a large but uncertain reward (two food pellets with varying probabilities of delivery ranging from 100 to 0%). Young rats chose the large reward when it was associated with a high probability of delivery and shifted to the small but certain reward as probability of the large reward decreased. As a group, aged rats performed comparably to young, but there was significantly greater variance among aged rats. One subgroup of aged rats showed strong preference for the small certain reward. This preference was maintained under conditions in which large reward delivery was also certain, suggesting decreased sensitivity to reward magnitude. In contrast, another subgroup of aged rats showed strong preference for the large reward at low probabilities of delivery. Interestingly, this subgroup also showed elevated preference for probabilistic rewards when reward magnitudes were equalized. Previous findings using this same aged study population described strongly attenuated discounting of delayed rewards with age, together suggesting that a subgroup of aged rats may have deficits associated with accounting for reward costs (i.e., delay or probability). 
These deficits in cost-accounting were dissociable from the age-related differences in sensitivity to reward magnitude, suggesting that aging influences multiple, distinct mechanisms that can impact cost–benefit decision-making. PMID:22319463

  16. Risk, reward, and decision-making in a rodent model of cognitive aging.

    PubMed

    Gilbert, Ryan J; Mitchell, Marci R; Simon, Nicholas W; Bañuelos, Cristina; Setlow, Barry; Bizon, Jennifer L

    2011-01-01

    Impaired decision-making in aging can directly impact factors (financial security, health care) that are critical to maintaining quality of life and independence at advanced ages. Naturalistic rodent models mimic human aging in other cognitive domains, and afford the opportunity to parse the effects of age on discrete aspects of decision-making in a manner relatively uncontaminated by experiential factors. Young adult (5-7 months) and aged (23-25 months) male F344 rats were trained on a probability discounting task in which they made discrete-trial choices between a small certain reward (one food pellet) and a large but uncertain reward (two food pellets with varying probabilities of delivery ranging from 100 to 0%). Young rats chose the large reward when it was associated with a high probability of delivery and shifted to the small but certain reward as probability of the large reward decreased. As a group, aged rats performed comparably to young, but there was significantly greater variance among aged rats. One subgroup of aged rats showed strong preference for the small certain reward. This preference was maintained under conditions in which large reward delivery was also certain, suggesting decreased sensitivity to reward magnitude. In contrast, another subgroup of aged rats showed strong preference for the large reward at low probabilities of delivery. Interestingly, this subgroup also showed elevated preference for probabilistic rewards when reward magnitudes were equalized. Previous findings using this same aged study population described strongly attenuated discounting of delayed rewards with age, together suggesting that a subgroup of aged rats may have deficits associated with accounting for reward costs (i.e., delay or probability). 
These deficits in cost-accounting were dissociable from the age-related differences in sensitivity to reward magnitude, suggesting that aging influences multiple, distinct mechanisms that can impact cost-benefit decision-making.

  17. Managing Climate Change Refugia for Biodiversity ...

    EPA Pesticide Factsheets

    Climate change threatens to create fundamental shifts in the distributions and abundances of species. Given projected losses, increased emphasis on management for ecosystem resilience to help buffer fish and wildlife populations against climate change is emerging. Such efforts stake a claim for an adaptive, anticipatory planning response to the climate change threat. To be effective, approaches will need to address critical uncertainties in both the physical basis for projected landscape changes, as well as the biological responses of organisms. Recent efforts define future potential climate refugia based on air temperatures and associated microclimatic changes. These efforts reflect the relatively strong conceptual foundation for linkages between regional climate change and local responses and thermal dynamics. Yet important questions remain. Drawing on case studies, we illustrate some key uncertainties in the responses of species and their habitats to altered hydro-climatic regimes currently not well addressed by physical or ecological models. These uncertainties need not delay anticipatory planning, but rather highlight the need for identification and communication of actions with high probabilities of success, and targeted research within an adaptive management framework. In this workshop, we will showcase the latest science on climate refugia and participants will interact through small group discussions, relevant examples, and facilitated dialogue to i

  18. Host defences against Giardia lamblia.

    PubMed

    Lopez-Romero, G; Quintero, J; Astiazarán-García, H; Velazquez, C

    2015-08-01

    Giardia spp. is a protozoan parasite that inhabits the upper small intestine of mammals and other species and is the aetiological agent of giardiasis. It has been demonstrated that nitric oxide, mast cells and dendritic cells are the first line of defence against Giardia. IL-6 and IL-17 play an important role during infection. Several cytokines possess overlapping functions in regulating innate and adaptive immune responses. IgA and CD4(+) T cells are fundamental to the process of Giardia clearance. It has been suggested that CD4(+) T cells play a double role during the anti-Giardia immune response. First, they activate and stimulate the differentiation of B cells to generate Giardia-specific antibodies. Second, they act through a B-cell-independent mechanism that is probably mediated by Th17 cells. Several Giardia proteins that stimulate humoral and cellular immune responses have been described. Variant surface proteins, α-1 giardin, and cyst wall protein 2 can induce host protective responses to future Giardia challenges. The characterization and evaluation of the protective potential of the immunogenic proteins that are associated with Giardia will offer new insights into host-parasite interactions and may aid in the development of an effective vaccine against the parasite. © 2015 John Wiley & Sons Ltd.

  19. Heterogeneity of the neuropeptide Y (NPY) contractile and relaxing receptors in horse penile small arteries.

    PubMed

    Prieto, Dolores; Arcos, Luis Rivera de Los; Martínez, Pilar; Benedito, Sara; García-Sacristán, Albino; Hernández, Medardo

    2004-12-01

    The distribution of neuropeptide Y (NPY)-immunoreactive nerves and the receptors involved in the effects of NPY upon electrical field stimulation (EFS)- and noradrenaline (NA)-elicited contractions were investigated in horse penile small arteries. NPY-immunoreactive nerves were widely distributed in the erectile tissues with a particularly high density around penile intracavernous small arteries. In small arteries isolated from the proximal part of the corpora cavernosa, NPY (30 nM) produced a variable modest enhancement of the contractions elicited by both EFS and NA. At the same concentration, the NPY Y(1) receptor agonist, [Leu(31), Pro(34)]NPY, markedly potentiated responses to EFS and NA, whereas the NPY Y(2) receptor agonist, NPY(13-36), enhanced exogenous NA-induced contractions. In arteries precontracted with NA, NPY, peptide YY (PYY), [Leu(31), Pro(34)]NPY and the NPY Y(2) receptor agonists, N-acetyl[Leu(28,31)]NPY (24-36) and NPY(13-36), elicited concentration-dependent contractile responses. Human pancreatic polypeptide (hPP) evoked a biphasic response consisting of a relaxation followed by contraction. NPY(3-36), the compound 1229U91 (Ile-Glu-Pro-Dapa-Tyr-Arg-Leu-Arg-Tyr-NH2, cyclic(2,4')diamide) and eventually NPY(13-36) relaxed penile small arteries. The selective NPY Y(1) receptor antagonist BIBP3226 ((R)-N(2)-(diphenacetyl)-N-[(4-hydroxyphenyl)methyl]D-arginineamide) (0.3 microM) shifted to the right the concentration-response curves to both NPY and [Leu(31), Pro(34)]NPY and inhibited the contractions induced by the highest concentrations of hPP but not the relaxations observed at lower doses. 
In the presence of the selective NPY Y(2) receptor antagonist BIIE0246 ((S)-N2-[[1-[2-[4-[(R,S)-5,11-dihydro-6(6h)-oxodibenz[b,e]azepin-11-y1]-1-piperazinyl]-2-oxoethyl]cyclo-pentyl-N-[2-[1,2-dihydro-3,5 (4H)-dioxo-1,2-diphenyl-3H-1,2, 4-triazol-4-yl]ethyl]-argininamide) (0.3 microM), the Y(2) receptor agonists NPY(13-36) and N-acetyl[Leu(28,31)]NPY (24-36) evoked potent slow relaxations in NA-precontracted arteries, under conditions of nitric oxide (NO) synthase blockade. Mechanical removal of the endothelium markedly enhanced contractions of NPY on NA-precontracted arteries, whereas blockade of the neuronal voltage-dependent Ca(2+) channels did not alter NPY responses. These results demonstrate that NPY can elicit dual contractile/relaxing responses in penile small arteries through a heterogeneous population of postjunctional NPY receptors. Potentiation of the contractions evoked by NA involve both NPY Y(1) and NPY Y(2) receptors. An NO-independent relaxation probably mediated by an atypical endothelial NPY receptor is also shown and unmasked in the presence of selective antagonists of the NPY contractile receptors.

  20. Probability effects on stimulus evaluation and response processes

    NASA Technical Reports Server (NTRS)

    Gehring, W. J.; Gratton, G.; Coles, M. G.; Donchin, E.

    1992-01-01

    This study investigated the effects of probability information on response preparation and stimulus evaluation. Eight subjects responded with one hand to the target letter H and with the other to the target letter S. The target letter was surrounded by noise letters that were either the same as or different from the target letter. In 2 conditions, the targets were preceded by a warning stimulus unrelated to the target letter. In 2 other conditions, a warning letter predicted that the same letter or the opposite letter would appear as the imperative stimulus with .80 probability. Correct reaction times were faster and error rates were lower when imperative stimuli confirmed the predictions of the warning stimulus. Probability information affected (a) the preparation of motor responses during the foreperiod, (b) the development of expectancies for a particular target letter, and (c) a process sensitive to the identities of letter stimuli but not to their locations.

  1. Probability and the changing shape of response distributions for orientation.

    PubMed

    Anderson, Britt

    2014-11-18

    Spatial attention and feature-based attention are regarded as two independent mechanisms for biasing the processing of sensory stimuli. Feature attention is held to be a spatially invariant mechanism that advantages a single feature per sensory dimension. In contrast to the prediction of location independence, I found that participants were able to report the orientation of a briefly presented visual grating better for targets defined by high probability conjunctions of features and locations even when orientations and locations were individually uniform. The advantage for high-probability conjunctions was accompanied by changes in the shape of the response distributions. High-probability conjunctions had error distributions that were not normally distributed but demonstrated increased kurtosis. The increase in kurtosis could be explained as a change in the variances of the component tuning functions that comprise a population mixture. By changing the mixture distribution of orientation-tuned neurons, it is possible to change the shape of the discrimination function. This prompts the suggestion that attention may not "increase" the quality of perceptual processing in an absolute sense but rather prioritizes some stimuli over others. This results in an increased number of highly accurate responses to probable targets and, simultaneously, an increase in the number of very inaccurate responses. © 2014 ARVO.

  2. Effects of Hurricanes Katrina and Rita on the chemistry of bottom sediments in Lake Pontchartrain, La.: Chapter 7F in Science and the storms-the USGS response to the hurricanes of 2005

    USGS Publications Warehouse

    Van Metre, Peter C.; Horowitz, Arthur J.; Mahler, Barbara J.; Foreman, William T.; Fuller, Christopher C.; Burkhardt, Mark R.; Elrick, Kent A.; Furlong, Edward T.; Skrobialowski, Stanley C.; Smith, James J.; Wilson, Jennifer T.; Zaugg, Stephen D.

    2007-01-01

    Concerns about the effect of pumping contaminated flood waters into Lake Pontchartrain following the hurricanes of 2005 prompted the U.S. Geological Survey (USGS) to sample street mud, canal-suspended sediment, and bottom sediment in Lake Pontchartrain. The samples were analyzed for a wide variety of potential inorganic and organic contaminants. Results indicate that contamination of lake sediment relative to other urban lakes and to accepted sediment-quality guidelines was limited to a relatively small area offshore from the Metairie Outfall Canal (popularly known as the 17th Street Canal) and that this contamination is probably transient.

  3. Comparison of the Effects of Iodine and Iodide on Thyroid Function in Humans

    NASA Technical Reports Server (NTRS)

    Robison, Linda M.; Bull, Richard J.; Sylvester, Paul W.; Birkenfeld, Paul; Lang, Jerome

    1995-01-01

    The present experiment in humans failed to confirm the differential effect of I₂ on maintenance of serum T₄ concentrations relative to the effects of I⁻ that was observed in prior experiments in rats. The reaction of I₂ with metabolites of thyroid hormones in the intestine that appears responsible for this effect in rats probably also exists at some level in humans. The present results suggest that the concentrations of such metabolites in the human intestinal tract are too small to significantly affect the circulating concentration of T₄. However, based on the elevations in TSH, there should be some concern over the potential impacts of chronic consumption of iodine in drinking water.

  4. Did the Crab Pulsar Undergo a Small Glitch in 2006 Late March/Early April?

    NASA Astrophysics Data System (ADS)

    Vivekanand, M.

    2016-08-01

    On 2006 August 23 the Crab Pulsar underwent a glitch, which was reported by the Jodrell Bank and the Xinjiang radio observatories. Neither dataset is available to the public. However, the Jodrell group publishes monthly arrival times of the Crab Pulsar pulse (their actual observations are done daily), and using these, it is shown that about 5 months earlier the Crab Pulsar probably underwent a small glitch, which has not been reported before. Neither observatory discusses the detailed analysis of data from 2006 March to August; either they may not have detected this small glitch, or they may have attributed it to timing noise in the Crab Pulsar. The above result is verified using X-ray data from RXTE. If this is indeed true, this is probably the smallest glitch observed in the Crab Pulsar so far, whose implications are discussed. This work addresses the confusion possible between small-magnitude glitches and timing noise in pulsars.

  5. A study of small impact parameter ion channeling effects in thin crystals

    NASA Astrophysics Data System (ADS)

    Motapothula, Mallikarjuna Rao; Breese, Mark B. H.

    2018-03-01

    We have recorded channeling patterns produced by 1-2 MeV protons aligned with ⟨1 1 1⟩ axes in 55 nm thick silicon crystals which exhibit characteristic angular structure for deflection angles up to and beyond the axial critical angle, ψₐ. Such large angular deflections are produced by ions incident on atomic strings with small impact parameters, resulting in trajectories which pass through several radial rings of atomic strings before exiting the thin crystal. Each ring may focus, steer or scatter the channeled ions in the transverse direction and the resulting characteristic angular structure beyond 0.6ψₐ at different depths can be related to peaks and troughs in the nuclear encounter probability. Such "radial focusing" underlies other axial channeling phenomena in thin crystals including planar channeling of small impact parameter trajectories, peaks around the azimuthal distribution at small tilts and large shoulders in the nuclear encounter probability at tilts beyond ψₐ.

  6. DID THE CRAB PULSAR UNDERGO A SMALL GLITCH IN 2006 LATE MARCH/EARLY APRIL?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vivekanand, M., E-mail: viv.maddali@gmail.com

    2016-08-01

    On 2006 August 23 the Crab Pulsar underwent a glitch, which was reported by the Jodrell Bank and the Xinjiang radio observatories. Neither dataset is available to the public. However, the Jodrell group publishes monthly arrival times of the Crab Pulsar pulse (their actual observations are done daily), and using these, it is shown that about 5 months earlier the Crab Pulsar probably underwent a small glitch, which has not been reported before. Neither observatory discusses the detailed analysis of data from 2006 March to August; either they may not have detected this small glitch, or they may have attributed it to timing noise in the Crab Pulsar. The above result is verified using X-ray data from RXTE. If this is indeed true, this is probably the smallest glitch observed in the Crab Pulsar so far, whose implications are discussed. This work addresses the confusion possible between small-magnitude glitches and timing noise in pulsars.

  7. Stream response to repeated coseismic folding, Tiptonville dome, New Madrid seismic zone

    NASA Astrophysics Data System (ADS)

    Guccione, M. J.; Mueller, K.; Champion, J.; Shepherd, S.; Carlson, S. D.; Odhiambo, B.; Tate, A.

    2002-03-01

    Fluvial response to tectonic deformation is dependent on the amount and style of surface deformation and the relative size of the stream. Active folding in the New Madrid seismic zone (NMSZ) forms the Tiptonville dome, a 15-km long and 5-km wide surface fold with up to 11 m of late Holocene structural relief. The fold is crossed by streams of varying size, from the Mississippi River to small flood-plain streams. Fluvial response of these streams to repeated coseismic folding has only been preserved for the past 2.3 ka, since the Tiptonville meander of the Mississippi River migrated across the area forming the present flood plain. This surface comprises a sandy point-bar deposit locally overlain by clayey overbank and silty sand crevasse-splay deposits, an abandoned chute channel infilled with laminated sandy silt and silty clay, and an abandoned neck cutoff filled with a sandy cutoff bar and silty clay oxbow lake deposits. Dating various stream responses to coseismic folding has more tightly constrained the timing of earthquake events in the central NMSZ and provides a means of partitioning the deformation amount into individual seismic events. Three earthquakes have been dated in the Reelfoot Lake area, ca. A.D. 900, 1470, and 1812. The latter two earthquakes had large local coseismic deformation. Both of these events were responsible for numerous stream responses such as shifting depocenters, modification of Mississippi River channel geometry, and derangement of small streams. Overbank sedimentation ceased on the dome as it was uplifted above the normal flood stage, and sedimentation of crevasse-splay deposits from the Mississippi River, colluvium from the scarp, and lacustrine sediment accumulated in the adjacent Reelfoot basin. 
The much larger Mississippi River channel responded to uplift by increasing its sinuosity across the uplift relative to both upstream and downstream, increasing its width/depth ratio across and downstream of the uplift, and decreasing the width/depth ratio upstream of the uplift. Despite the size of the Mississippi River, it has not yet attained equilibrium since the latest uplift 190 years ago. Small channels that could not downcut through the uplift were filled, locally reversed flow direction, or formed a lake where they were dammed. Uplift and stream response to folding along the Tiptonville dome were less dramatic between 2.3 and 0.53 ka. During this interval, abandoned channel fill and overbank deposition across the dome suggest that it was not a high-relief feature. One earthquake event occurred during this interval (ca. A.D. 900), but coseismic stream response was probably limited to a slight aggradation of a small flood-plain stream.

  8. Factors Influencing Ball-Player Impact Probability in Youth Baseball

    PubMed Central

    Matta, Philip A.; Myers, Joseph B.; Sawicki, Gregory S.

    2015-01-01

    Background: Altering the weight of baseballs for youth play has been studied out of concern for player safety. Research has shown that decreasing the weight of baseballs may limit the severity of both chronic arm and collision injuries. Unfortunately, reducing the weight of the ball also increases its exit velocity, leaving pitchers and nonpitchers with less time to defend themselves. The purpose of this study was to examine impact probability for pitchers and nonpitchers. Hypothesis: Reducing the available time to respond by 10% (expected from reducing ball weight from 142 g to 113 g) would increase impact probability for pitchers and nonpitchers, and players’ mean simple response time would be a primary predictor of impact probability for all participants. Study Design: Nineteen subjects between the ages of 9 and 13 years performed 3 experiments in a controlled laboratory setting: a simple response time test, an avoidance response time test, and a pitching response time test. Methods: Each subject performed these tests in order. The simple response time test measured the subjects’ mean simple response time, the avoidance response time test measured the subjects’ ability to avoid a simulated batted ball as a fielder, and the pitching response time test measured the subjects’ ability to avoid a simulated batted ball as a pitcher. Results: Reducing the weight of a standard baseball from 142 g to 113 g led to a less than 5% increase in impact probability for nonpitchers. However, the results indicate that the impact probability for pitchers could increase by more than 25%. Conclusion: Pitching may greatly increase the amount of time needed to react and defend oneself from a batted ball. Clinical Relevance: Impact injuries to youth baseball players may increase if a 113-g ball is used. PMID:25984261

  9. Urban sprawl and delayed ambulance arrival in the U.S.

    PubMed

    Trowbridge, Matthew J; Gurka, Matthew J; O'Connor, Robert E

    2009-11-01

    Minimizing emergency medical service (EMS) response time is a central objective of prehospital care, yet the potential influence of built environment features such as urban sprawl on EMS system performance is often not considered. This study measures the association between urban sprawl and EMS response time to test the hypothesis that features of sprawling development increase the probability of delayed ambulance arrival. In 2008, EMS response times for 43,424 motor-vehicle crashes were obtained from the Fatal Analysis Reporting System, a national census of crashes involving ≥1 fatality. Sprawl at each crash location was measured using a continuous county-level index previously developed by Ewing et al. The association between sprawl and the probability of a delayed ambulance arrival (≥8 minutes) was then measured using generalized linear mixed modeling to account for correlation among crashes from the same county. Urban sprawl is significantly associated with increased EMS response time and a higher probability of delayed ambulance arrival (p=0.03). This probability increases quadratically as the severity of sprawl increases while controlling for nighttime crash occurrence, road conditions, and presence of construction. For example, in sprawling counties (e.g., Fayette County GA), the probability of a delayed ambulance arrival for daytime crashes in dry conditions without construction was 69% (95% CI=66%, 72%) compared with 31% (95% CI=28%, 35%) in counties with prominent smart-growth characteristics (e.g., Delaware County PA). Urban sprawl is significantly associated with increased EMS response time and a higher probability of delayed ambulance arrival following motor-vehicle crashes in the U.S. The results of this study suggest that promotion of community design and development that follows smart-growth principles and regulates urban sprawl may improve EMS performance and reliability.

  10. The exact probability distribution of the rank product statistics for replicated experiments.

    PubMed

    Eisinga, Rob; Breitling, Rainer; Heskes, Tom

    2013-03-18

    The rank product method is a widely accepted technique for detecting differentially regulated genes in replicated microarray experiments. To approximate the sampling distribution of the rank product statistic, the original publication proposed a permutation approach, whereas recently an alternative approximation based on the continuous gamma distribution was suggested. However, both approximations are imperfect for estimating small tail probabilities. In this paper we relate the rank product statistic to number theory and provide a derivation of its exact probability distribution and the true tail probabilities. Copyright © 2013 Federation of European Biochemical Societies. Published by Elsevier B.V. All rights reserved.
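
    The tail probability in question can be illustrated by brute force for toy sizes: under the null, each of a gene's k replicate ranks is independently uniform on 1..n, and the tail probability of the rank product is the fraction of rank tuples whose product falls at or below the observed value. A minimal sketch (function name and sizes are illustrative; the paper's contribution is an exact formula that avoids this enumeration entirely):

```python
from itertools import product

def exact_rank_product_tail(n, k, rho):
    """P(R1 * ... * Rk <= rho) when each rank Ri is independently
    uniform on {1, ..., n}. Enumeration is feasible only for tiny
    n and k; the exact distribution makes realistic sizes tractable."""
    hits = 0
    for ranks in product(range(1, n + 1), repeat=k):
        prod = 1
        for r in ranks:
            prod *= r
        if prod <= rho:
            hits += 1
    return hits / n ** k
```

    For example, with n = 2 genes and k = 2 replicates, only the tuple (1, 1) has product ≤ 1, so the tail probability of the smallest possible rank product is 1/4.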

  11. Application of binomial and multinomial probability statistics to the sampling design process of a global grain tracing and recall system

    USDA-ARS's Scientific Manuscript database

    Small, coded, pill-sized tracers embedded in grain are proposed as a method for grain traceability. A sampling process for a grain traceability system was designed and investigated by applying probability statistics using a science-based sampling approach to collect an adequate number of tracers fo...
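
    Although the abstract is truncated, the binomial arithmetic such a sampling design rests on is standard. A hedged sketch (the tracer proportion and confidence level below are illustrative, not values from the manuscript): the chance that a sample of n kernels contains at least one tracer present at proportion p is 1 - (1 - p)^n, which can be inverted for the required sample size.

```python
import math

def p_at_least_one(n, p):
    """P(X >= 1) for X ~ Binomial(n, p): chance a sample of n kernels
    contains at least one tracer present at proportion p."""
    return 1.0 - (1.0 - p) ** n

def sample_size_for_confidence(p, conf):
    """Smallest n with P(at least one tracer) >= conf.
    From (1 - p)**n <= 1 - conf:  n >= log(1 - conf) / log(1 - p)."""
    return math.ceil(math.log(1.0 - conf) / math.log(1.0 - p))
```

    For instance, detecting a tracer mixed in at 1 per 1000 kernels with 95% confidence requires a sample of roughly 3000 kernels.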

  12. Attention as Inference: Selection Is Probabilistic; Responses Are All-or-None Samples

    ERIC Educational Resources Information Center

    Vul, Edward; Hanus, Deborah; Kanwisher, Nancy

    2009-01-01

    Theories of probabilistic cognition postulate that internal representations are made up of multiple simultaneously held hypotheses, each with its own probability of being correct (henceforth, "probability distributions"). However, subjects make discrete responses and report the phenomenal contents of their mind to be all-or-none states rather than…

  13. The Sequential Probability Ratio Test and Binary Item Response Models

    ERIC Educational Resources Information Center

    Nydick, Steven W.

    2014-01-01

    The sequential probability ratio test (SPRT) is a common method for terminating item response theory (IRT)-based adaptive classification tests. To decide whether a classification test should stop, the SPRT compares a simple log-likelihood ratio, based on the classification bound separating two categories, to prespecified critical values. As has…
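
    The decision rule the abstract describes can be sketched as follows, here with a two-parameter logistic (2PL) IRT model; the item parameters, cut points, and error rates are illustrative, not taken from the article:

```python
import math

def p_correct(theta, a, b):
    """2PL item response function: probability of a correct response."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def sprt_classify(responses, items, theta_lo, theta_hi, alpha=0.05, beta=0.05):
    """Wald's SPRT for a binary classification bound.
    responses: list of 0/1 item scores; items: list of (a, b) parameters.
    Compares the log-likelihood ratio of theta_hi vs theta_lo to the
    critical values and returns 'above', 'below', or 'continue'."""
    upper = math.log((1.0 - beta) / alpha)
    lower = math.log(beta / (1.0 - alpha))
    llr = 0.0
    for x, (a, b) in zip(responses, items):
        p_hi = p_correct(theta_hi, a, b)
        p_lo = p_correct(theta_lo, a, b)
        llr += math.log(p_hi / p_lo) if x else math.log((1 - p_hi) / (1 - p_lo))
    if llr >= upper:
        return "above"
    if llr <= lower:
        return "below"
    return "continue"
```

    A run of correct responses pushes the ratio past the upper critical value (classify above the bound); mixed evidence keeps the test running.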

  14. Risk preferences, probability weighting, and strategy tradeoffs in wildfire management

    Treesearch

    Michael S. Hand; Matthew J. Wibbenmeyer; Dave Calkin; Matthew P. Thompson

    2015-01-01

    Wildfires present a complex applied risk management environment, but relatively little attention has been paid to behavioral and cognitive responses to risk among public agency wildfire managers. This study investigates responses to risk, including probability weighting and risk aversion, in a wildfire management context using a survey-based experiment administered to...

  15. Sampling little fish in big rivers: Larval fish detection probabilities in two Lake Erie tributaries and implications for sampling effort and abundance indices

    USGS Publications Warehouse

    Pritt, Jeremy J.; DuFour, Mark R.; Mayer, Christine M.; Roseman, Edward F.; DeBruyne, Robin L.

    2014-01-01

    Larval fish are frequently sampled in coastal tributaries to determine factors affecting recruitment, evaluate spawning success, and estimate production from spawning habitats. Imperfect detection of larvae is common, because larval fish are small and unevenly distributed in space and time, and coastal tributaries are often large and heterogeneous. We estimated detection probabilities of larval fish from several taxa in the Maumee and Detroit rivers, the two largest tributaries of Lake Erie. We then demonstrated how accounting for imperfect detection influenced (1) the probability of observing taxa as present relative to sampling effort and (2) abundance indices for larval fish of two Detroit River species. We found that detection probabilities ranged from 0.09 to 0.91 but were always less than 1.0, indicating that imperfect detection is common among taxa and between systems. In general, taxa with high fecundities, small larval length at hatching, and no nesting behaviors had the highest detection probabilities. Also, detection probabilities were higher in the Maumee River than in the Detroit River. Accounting for imperfect detection produced up to fourfold increases in abundance indices for Lake Whitefish Coregonus clupeaformis and Gizzard Shad Dorosoma cepedianum. The effect of accounting for imperfect detection in abundance indices was greatest during periods of low abundance for both species. Detection information can be used to determine the appropriate level of sampling effort for larval fishes and may improve management and conservation decisions based on larval fish data.
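
    The two adjustments described above are simple to state. A minimal sketch (the detection probabilities and counts below are illustrative values within the 0.09-0.91 range the study reports):

```python
def cumulative_detection(p, n):
    """Probability a taxon present at a site is detected in at least
    one of n independent samples, each with detection probability p."""
    return 1.0 - (1.0 - p) ** n

def corrected_index(raw_count, p):
    """Detection-corrected abundance index: scale the raw catch
    by the estimated detection probability."""
    return raw_count / p
```

    At the low end of detection (p = 0.25), the corrected index is four times the raw count, the magnitude of adjustment noted for Lake Whitefish and Gizzard Shad; and five samples at p = 0.3 still miss a present taxon about 17% of the time, which is why detection information matters when setting sampling effort.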

  16. Small-Noise Analysis and Symmetrization of Implicit Monte Carlo Samplers

    DOE PAGES

    Goodman, Jonathan; Lin, Kevin K.; Morzfeld, Matthias

    2015-07-06

    Implicit samplers are algorithms for producing independent, weighted samples from multivariate probability distributions. These are often applied in Bayesian data assimilation algorithms. We use Laplace asymptotic expansions to analyze two implicit samplers in the small noise regime. Our analysis suggests a symmetrization of the algorithms that leads to improved implicit sampling schemes at a relatively small additional cost. Computational experiments confirm the theory and show that symmetrization is effective for small noise sampling problems.

  17. Behavioural and physiological responses of brook trout Salvelinus fontinalis to midwinter flow reduction in a small ice-free mountain stream.

    PubMed

    Krimmer, A N; Paul, A J; Hontela, A; Rasmussen, J B

    2011-09-01

    This study presents an experimental analysis of the effects of midwinter flow reduction (50-75% reduction in discharge, in 4 h daily pulses) on the physical habitat and on behaviour and physiology of overwintering brook trout Salvelinus fontinalis in a small mountain stream. Flow reduction did not result in significant lowering of temperature or formation of surface or subsurface ice. The main findings were (1) daily movement by S. fontinalis increased (c. 2·5-fold) during flow reduction, but was limited to small-scale relocations (<10 m). (2) Undercut banks were the preferred habitat and availability of these habitats was reduced during flow reduction. (3) Although both experimental and reference fish did lose mass and condition during the experiment, no effects of flow reduction on stress indicators (blood cortisol or glucose) or bioenergetics (total body fat, water content or mass loss) were detected, probably because access to the preferred type of cover remained available. Like other salmonids, S. fontinalis moves little and seeks physical cover during winter. Unlike many of the more studied salmonids, however, this species overwinters successfully in small groundwater-rich streams that often remain ice-free, and this study identifies undercut banks as the critical winter habitat rather than substratum cover. © 2011 The Authors. Journal of Fish Biology © 2011 The Fisheries Society of the British Isles.

  18. Probabilistic Evaluation of Blade Impact Damage

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Abumeri, G. H.

    2003-01-01

    The response to high velocity impact of a composite blade is probabilistically evaluated. The evaluation is focused on quantifying probabilistically the effects of uncertainties (scatter) in the variables that describe the impact, the blade make-up (geometry and material), the blade response (displacements, strains, stresses, frequencies), the blade residual strength after impact, and the blade damage tolerance. The results of the probabilistic evaluation are in terms of probability cumulative distribution functions and probabilistic sensitivities. Results show that the blade has relatively low damage tolerance at 0.999 probability of structural failure and substantial damage tolerance at 0.01 probability.

  19. Impact of Venetoclax Exposure on Clinical Efficacy and Safety in Patients with Relapsed or Refractory Chronic Lymphocytic Leukemia.

    PubMed

    Freise, Kevin J; Jones, Aksana K; Eckert, Doerthe; Mensing, Sven; Wong, Shekman L; Humerickhouse, Rod A; Awni, Walid M; Salem, Ahmed Hamed

    2017-05-01

    Venetoclax is a selective, potent, first-in-class B-cell lymphoma-2 inhibitor that restores apoptosis in cancer cells and has demonstrated efficacy in a variety of hematological malignancies. The objective of this research was to characterize the relationship between venetoclax exposures and efficacy and safety in patients with relapsed or refractory (R/R) chronic lymphocytic leukemia (CLL)/small lymphocytic lymphoma (SLL). A total of 272 and 338 patients from four clinical studies were pooled for the exposure-efficacy and exposure-safety analyses, respectively. Demographics, baseline disease characteristics, and select co-medications were evaluated for their impact on efficacy (lymphocytes, tumor size, objective response [OR]) and safety (neutropenia and infection). Higher venetoclax concentrations led to a more rapid decrease in lymphocyte counts and tumor size, which translated into patients more rapidly achieving OR. In none of the analyses was the 17p deletion somatic mutation found to affect the responsiveness of patients to venetoclax. Model-based simulations of lymphocyte counts and tumor size estimated an OR rate (ORR) of 84.8 % (95 % confidence interval 81.5-88.0 %) at a venetoclax dosage of 400 mg daily, with minimal increase in ORR at higher doses. The safety analyses of the adverse events (grade 3 or higher) of neutropenia and infection indicated that higher average venetoclax concentrations were not associated with an increase in adverse events. The exposure-response analyses indicated that a venetoclax dosage regimen of 400 mg daily results in a high (>80 %) probability of achieving OR in R/R CLL/SLL patients, with minimal probability of increasing neutropenia or infection with higher exposures.

  20. Stochastic optimization of intensity modulated radiotherapy to account for uncertainties in patient sensitivity

    NASA Astrophysics Data System (ADS)

    Kåver, Gereon; Lind, Bengt K.; Löf, Johan; Liander, Anders; Brahme, Anders

    1999-12-01

    The aim of the present work is to better account for the known uncertainties in radiobiological response parameters when optimizing radiation therapy. The radiation sensitivity of a specific patient is usually unknown beyond the expectation value and possibly the standard deviation that may be derived from studies on groups of patients. Instead of trying to find the treatment with the highest possible probability of a desirable outcome for a patient of average sensitivity, it is more desirable to maximize the expectation value of the probability for the desirable outcome over the possible range of variation of the radiation sensitivity of the patient. Such a stochastic optimization will also have to consider the distribution function of the radiation sensitivity and the larger steepness of the response for the individual patient. The results of stochastic optimization are also compared with simpler methods such as using biological response 'margins' to account for the range of sensitivity variation. By using stochastic optimization, the absolute gain will typically be of the order of a few per cent and the relative improvement compared with non-stochastic optimization is generally less than about 10 per cent. The extent of this gain varies with the level of interpatient variability as well as with the difficulty and complexity of the case studied. Although the dose changes are rather small (<5 Gy) there is a strong desire to make treatment plans more robust, and tolerant of the likely range of variation of the radiation sensitivity of each individual patient. When more accurate predictive assays of the radiation sensitivity for each patient become available, the need to consider the range of variations can be reduced considerably.
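
    The core computation, taking the expectation of the outcome probability over a distribution of patient sensitivity rather than evaluating only the average patient, can be sketched with a logistic dose-response and a Gaussian spread of the patient's D50; all parameter values below are illustrative, not the paper's:

```python
import math

def response_prob(dose, d50, gamma):
    """Individual-patient sigmoidal dose-response: probability of a
    desirable outcome at the given dose for a patient whose D50 is
    d50 (logistic surrogate; gamma sets the steepness)."""
    return 1.0 / (1.0 + math.exp(-4.0 * gamma * (dose / d50 - 1.0)))

def expected_response(dose, d50_mean, d50_sd, gamma, n_grid=41):
    """Expectation of the individual response probability over a
    Gaussian spread of patient sensitivity (trapezoid rule, +/- 4 SD)."""
    lo, hi = d50_mean - 4 * d50_sd, d50_mean + 4 * d50_sd
    h = (hi - lo) / (n_grid - 1)
    total = 0.0
    for i in range(n_grid):
        d50 = lo + i * h
        pdf = math.exp(-0.5 * ((d50 - d50_mean) / d50_sd) ** 2) \
            / (d50_sd * math.sqrt(2.0 * math.pi))
        w = 0.5 if i in (0, n_grid - 1) else 1.0
        total += w * response_prob(dose, d50, gamma) * pdf
    return total * h
```

    Because the expectation averages sigmoids centred at different D50 values, the population-level curve is shallower than any individual curve — the "larger steepness of the response for the individual patient" that the abstract says a stochastic optimization must account for.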

  1. Does prescribed fire promote resistance to drought in low elevation forests of the Sierra Nevada, California, USA?

    USGS Publications Warehouse

    van Mantgem, Phillip J.; Caprio, Anthony C.; Stephenson, Nathan L.; Das, Adrian J.

    2016-01-01

    Prescribed fire is a primary tool used to restore western forests following more than a century of fire exclusion, reducing fire hazard by removing dead and live fuels (small trees and shrubs).  It is commonly assumed that the reduced forest density following prescribed fire also reduces competition for resources among the remaining trees, so that the remaining trees are more resistant (more likely to survive) in the face of additional stressors, such as drought.  Yet this proposition remains largely untested, so that managers do not have the basic information to evaluate whether prescribed fire may help forests adapt to a future of more frequent and severe drought. During the third year of drought, in 2014, we surveyed 9950 trees in 38 burned and 18 unburned mixed conifer forest plots at low elevation (<2100 m a.s.l.) in Kings Canyon, Sequoia, and Yosemite national parks in California, USA.  Fire had occurred in the burned plots from 6 yr to 28 yr before our survey.  After accounting for differences in individual tree diameter, common conifer species found in the burned plots had significantly reduced probability of mortality compared to unburned plots during the drought.  Stand density (stems ha⁻¹) was significantly lower in burned versus unburned sites, supporting the idea that reduced competition may be responsible for the differential drought mortality response.  At the time of writing, we are not sure if burned stands will maintain lower tree mortality probabilities in the face of the continued, severe drought of 2015.  Future work should aim to better identify drought response mechanisms and how these may vary across other forest types and regions, particularly in other areas experiencing severe drought in the Sierra Nevada and on the Colorado Plateau.

  2. Neural response to reward anticipation under risk is nonlinear in probabilities.

    PubMed

    Hsu, Ming; Krajbich, Ian; Zhao, Chen; Camerer, Colin F

    2009-02-18

    A widely observed phenomenon in decision making under risk is the apparent overweighting of unlikely events and the underweighting of nearly certain events. This violates standard assumptions in expected utility theory, which requires that expected utility be linear (objective) in probabilities. Models such as prospect theory have relaxed this assumption and introduced the notion of a "probability weighting function," which captures the key properties found in experimental data. This study reports functional magnetic resonance imaging (fMRI) data showing that neural response to expected reward is nonlinear in probabilities. Specifically, we found that activity in the striatum during valuation of monetary gambles is nonlinear in probabilities in the pattern predicted by prospect theory, suggesting that probability distortion is reflected at the level of the reward encoding process. The degree of nonlinearity reflected in individual subjects' decisions is also correlated with striatal activity across subjects. Our results shed light on the neural mechanisms of reward processing, and have implications for future neuroscientific studies of decision making involving extreme tails of the distribution, where probability weighting provides an explanation for commonly observed behavioral anomalies.
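
    The inverse-S-shaped distortion the striatal activity tracked is commonly parameterized with the one-parameter Tversky-Kahneman weighting function; the γ value below is a commonly cited estimate for gains, used here purely as an illustration rather than a value from this study:

```python
def tk_weight(p, gamma=0.61):
    """Tversky-Kahneman probability weighting function:
    w(p) = p**g / (p**g + (1 - p)**g) ** (1 / g).
    For gamma < 1 it overweights small p and underweights large p."""
    num = p ** gamma
    den = (p ** gamma + (1.0 - p) ** gamma) ** (1.0 / gamma)
    return num / den
```

    With γ = 0.61, a 1% chance is weighted as if it were about 6%, and a 99% chance as if it were about 91% — the overweighting of unlikely events and underweighting of near-certain ones described above.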

  3. Variables associated with detection probability, detection latency, and behavioral responses of Golden-winged Warblers (Vermivora chrysoptera)

    USGS Publications Warehouse

    Aldinger, Kyle R.; Wood, Petra B.

    2015-01-01

    Detection probability during point counts and its associated variables are important considerations for bird population monitoring and have implications for conservation planning by influencing population estimates. During 2008–2009, we evaluated variables hypothesized to be associated with detection probability, detection latency, and behavioral responses of male Golden-winged Warblers in pastures in the Monongahela National Forest, West Virginia, USA. This is the first study of male Golden-winged Warbler detection probability, detection latency, or behavioral response based on point-count sampling with known territory locations and identities for all males. During 3-min passive point counts, detection probability decreased as distance to a male's territory and time since sunrise increased. During 3-min point counts with playback, detection probability decreased as distance to a male's territory increased, but remained constant as time since sunrise increased. Detection probability was greater when point counts included type 2 compared with type 1 song playback, particularly during the first 2 min of type 2 song playback. Golden-winged Warblers primarily use type 1 songs (often zee bee bee bee with a higher-pitched first note) in intersexual contexts and type 2 songs (strident, rapid stutter ending with a lower-pitched buzzy note) in intrasexual contexts. Distance to a male's territory, ordinal date, and song playback type were associated with the type of behavioral response to song playback. Overall, ~2 min of type 2 song playback may increase the efficacy of point counts for monitoring populations of Golden-winged Warblers by increasing the conspicuousness of males for visual identification and offsetting the consequences of surveying later in the morning. Because playback may interfere with the ability to detect distant males, it is important to follow playback with a period of passive listening. Our results indicate that even in relatively open pasture vegetation, detection probability of male Golden-winged Warblers is imperfect and highly variable.

  4. Spider movement, UV reflectance and size, but not spider crypsis, affect the response of honeybees to Australian crab spiders.

    PubMed

    Llandres, Ana L; Rodríguez-Gironés, Miguel A

    2011-02-16

    According to the crypsis hypothesis, the ability of female crab spiders to change body colour and match the colour of flowers has been selected because flower visitors are less likely to detect spiders that match the colour of the flowers used as hunting platform. However, recent findings suggest that spider crypsis plays a minor role in predator detection and some studies even showed that pollinators can become attracted to flowers harbouring Australian crab spider when the UV contrast between spider and flower increases. Here we studied the response of Apis mellifera honeybees to the presence of white or yellow Thomisus spectabilis Australian crab spiders sitting on Bidens alba inflorescences and also the response of honeybees to crab spiders that we made easily detectable by painting their forelimbs or abdomen blue. To account for the visual systems of crab spiders' prey, we measured the reflectance properties of the spiders and inflorescences used for the experiments. We found that honeybees did not respond to the degree of matching between spiders and inflorescences (either chromatic or achromatic contrast): they responded similarly to white and yellow spiders, to control and painted spiders. However spider UV reflection, spider size and spider movement determined honeybee behaviour: the probability that honeybees landed on spider-harbouring inflorescences was greatest when the spiders were large and had high UV reflectance or when spiders were small and reflected little UV, and honeybees were more likely to reject inflorescences if spiders moved as the bee approached the inflorescence. Our study suggests that only the large, but not the small Australian crab spiders deceive their prey by reflecting UV light, and highlights the importance of other cues that elicited an anti-predator response in honeybees.

  5. Spider Movement, UV Reflectance and Size, but Not Spider Crypsis, Affect the Response of Honeybees to Australian Crab Spiders

    PubMed Central

    Llandres, Ana L.; Rodríguez-Gironés, Miguel A.

    2011-01-01

    According to the crypsis hypothesis, the ability of female crab spiders to change body colour and match the colour of flowers has been selected because flower visitors are less likely to detect spiders that match the colour of the flowers used as hunting platform. However, recent findings suggest that spider crypsis plays a minor role in predator detection and some studies even showed that pollinators can become attracted to flowers harbouring Australian crab spider when the UV contrast between spider and flower increases. Here we studied the response of Apis mellifera honeybees to the presence of white or yellow Thomisus spectabilis Australian crab spiders sitting on Bidens alba inflorescences and also the response of honeybees to crab spiders that we made easily detectable by painting their forelimbs or abdomen blue. To account for the visual systems of crab spiders' prey, we measured the reflectance properties of the spiders and inflorescences used for the experiments. We found that honeybees did not respond to the degree of matching between spiders and inflorescences (either chromatic or achromatic contrast): they responded similarly to white and yellow spiders, to control and painted spiders. However spider UV reflection, spider size and spider movement determined honeybee behaviour: the probability that honeybees landed on spider-harbouring inflorescences was greatest when the spiders were large and had high UV reflectance or when spiders were small and reflected little UV, and honeybees were more likely to reject inflorescences if spiders moved as the bee approached the inflorescence. Our study suggests that only the large, but not the small Australian crab spiders deceive their prey by reflecting UV light, and highlights the importance of other cues that elicited an anti-predator response in honeybees. PMID:21359183

  6. Activating mutations in the tyrosine kinase domain of the epidermal growth factor receptor are associated with improved survival in gefitinib-treated chemorefractory lung adenocarcinomas.

    PubMed

    Taron, Miguel; Ichinose, Yukito; Rosell, Rafael; Mok, Tony; Massuti, Bartomeu; Zamora, Lurdes; Mate, Jose Luis; Manegold, Christian; Ono, Mayumi; Queralt, Cristina; Jahan, Thierry; Sanchez, Jose Javier; Sanchez-Ronco, Maria; Hsue, Victor; Jablons, David; Sanchez, Jose Miguel; Moran, Teresa

    2005-08-15

    Activating mutations in the tyrosine kinase domain of the epidermal growth factor receptor (EGFR) confer a strong sensitivity to gefitinib, a selective tyrosine kinase inhibitor of EGFR. We examined EGFR mutations at exons 18, 19, and 21 in tumor tissue from 68 gefitinib-treated, chemorefractory, advanced non-small cell lung cancer patients from the United States, Europe, and Asia and in a highly gefitinib-sensitive non-small cell lung cancer cell line and correlated their presence with response and survival. In addition, in a subgroup of 28 patients for whom the remaining tumor tissue was available, we examined the relationship among EGFR mutations, CA repeats in intron 1 of EGFR, EGFR and caveolin-1 mRNA levels, and increased EGFR gene copy numbers. Seventeen patients had EGFR mutations, all of which were in lung adenocarcinomas. Radiographic response was observed in 16 of 17 (94.1%) patients harboring EGFR mutations, in contrast with 6 of 51 (12.6%) with wild-type EGFR (P < 0.0001). Probability of response increased significantly in never smokers, patients receiving a greater number of prior chemotherapy regimens, Asians, and younger patients. Median survival was not reached for patients with EGFR mutations and was 9.9 months for those with wild-type EGFR (P = 0.001). EGFR mutations tended to be associated with increased numbers of CA repeats and increased EGFR gene copy numbers but not with EGFR and caveolin-1 mRNA overexpression (P = not significant). The presence of EGFR mutations is a major determinant of gefitinib response, and targeting EGFR should be considered in preference to chemotherapy as first-line treatment in lung adenocarcinomas that have demonstrable EGFR mutations.

  7. Stimulus Probability Effects in Absolute Identification

    ERIC Educational Resources Information Center

    Kent, Christopher; Lamberts, Koen

    2016-01-01

    This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of…

  8. Is an immune reaction required for malignant transformation and cancer growth?

    PubMed

    Prehn, Richmond T; Prehn, Liisa M

    2012-07-01

    Increasing evidence has shown that probably all malignant mouse cells, even those of spontaneous sporadic cancers, are endowed with tumor-specific antigens. Stimulation of cancer growth, rather than inhibition by the immune reaction, is seemingly the prevalent effect in the animal of origin (the autochthonous animal). Small initial dosages of even strong tumor antigens tend to produce stimulatory immune reactions rather than tumor inhibition in any animal. Thus, an immune response at a low level may be an essential growth-driving feature of nascent cancers, and this may be why all cancers apparently have tumor-specific antigens. Inasmuch as a low level of immunity is stimulatory to tumor growth while larger dosages are inhibitory, immuno-selection via this low response may tend to keep the antitumor immune reaction weak and at a nearly maximal stimulatory level throughout most of a tumor's existence. These facts suggest that suppression of tumor immunity and a heightened immune reaction, although very contrasting modalities, might each be therapeutic.

  9. Bidirectional interactions between indomethacin and the murine intestinal microbiota

    PubMed Central

    Liang, Xue; Bittinger, Kyle; Li, Xuanwen; Abernethy, Darrell R; Bushman, Frederic D; FitzGerald, Garret A

    2015-01-01

    The vertebrate gut microbiota have been implicated in the metabolism of xenobiotic compounds, motivating studies of microbe-driven metabolism of clinically important drugs. Here, we studied interactions between the microbiota and indomethacin, a nonsteroidal anti-inflammatory drug (NSAID) that inhibits cyclooxygenases (COX) -1 and -2. Indomethacin was tested in both acute and chronic exposure models in mice at clinically relevant doses, which suppressed production of COX-1- and COX-2-derived prostaglandins and caused small intestinal (SI) damage. Deep sequencing analysis showed that indomethacin exposure was associated with alterations in the structure of the intestinal microbiota in both dosing models. Perturbation of the intestinal microbiome by antibiotic treatment altered indomethacin pharmacokinetics and pharmacodynamics, which is probably the result of reduced bacterial β-glucuronidase activity. Humans show considerable inter-individual differences in their microbiota and their responses to indomethacin — thus, the drug-microbe interactions described here provide candidate mediators of individualized drug responses. DOI: http://dx.doi.org/10.7554/eLife.08973.001 PMID:26701907

  10. An efficient reliability algorithm for locating design point using the combination of importance sampling concepts and response surface method

    NASA Astrophysics Data System (ADS)

    Shayanfar, Mohsen Ali; Barkhordari, Mohammad Ali; Roudak, Mohammad Amin

    2017-06-01

    Monte Carlo simulation (MCS) is a useful tool for computation of probability of failure in reliability analysis. However, the large number of required random samples makes it time-consuming. Response surface method (RSM) is another common method in reliability analysis. Although RSM is widely used for its simplicity, it cannot be trusted in highly nonlinear problems due to its linear nature. In this paper, a new efficient algorithm, employing the combination of importance sampling, as a class of MCS, and RSM is proposed. In the proposed algorithm, analysis starts with importance sampling concepts, using a proposed two-step rule for updating the design point. This part finishes after a small number of samples are generated. Then RSM starts to work using the Bucher experimental design, with the last design point as the center point and a proposed effective length as the radius of Bucher's approach. Through illustrative numerical examples, the simplicity and efficiency of the proposed algorithm and the effectiveness of the proposed rules are shown.
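
    The importance-sampling half of such an algorithm can be sketched in isolation: rather than sampling u ~ N(0, I) and waiting for rare failures, sample around a candidate design point and reweight each sample by the density ratio. The limit-state function and design point below are hypothetical; the paper's contributions (the design-point updating rule and the hand-off to the response surface) are omitted from this sketch.

```python
import math
import random

def std_normal_pdf(x):
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def importance_sampling_pf(g, design_point, n=100_000, seed=0):
    """Estimate P(g(u) <= 0) for u ~ N(0, I) by sampling
    u ~ N(design_point, I) and reweighting each failure sample by the
    density ratio phi(u) / phi(u - design_point), per coordinate."""
    rng = random.Random(seed)
    acc = 0.0
    for _ in range(n):
        u = [rng.gauss(mu, 1.0) for mu in design_point]
        if g(u) <= 0.0:  # failure domain
            w = 1.0
            for ui, mu in zip(u, design_point):
                w *= std_normal_pdf(ui) / std_normal_pdf(ui - mu)
            acc += w
    return acc / n
```

    For the hypothetical linear limit state g(u) = 3 - u1, the exact failure probability is Φ(-3) ≈ 1.35 × 10⁻³; centering the sampling density at the design point u = 3 makes roughly half the samples land in the failure domain, so the reweighted estimate converges with far fewer samples than crude Monte Carlo.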

  11. Modulation of fluid absorption and the secretory response of rat jejunum to cholera toxin by dietary fat.

    PubMed Central

    Sagher, F A; Dodge, J A; Moore, R; McMaster, C; McCaughey, G

    1990-01-01

    To study the effects of dietary fat on jejunal water and ion absorption and on cholera toxin-induced secretion, 3 week old Sprague Dawley rats were fed isocaloric diets. Forty per cent of the total calories were given as fat, as butter (high saturated fat), olive oil (high monounsaturated fat), or corn oil (high polyunsaturated fat), with one group on low fat (10% of calories) standard laboratory diet as controls. During in vivo jejunal perfusion studies we found that (i) a polyunsaturated fat (corn oil) supplemented diet improves jejunal absorption of water and electrolytes and these changes are independent of the observed concentrations of luminal prostaglandins; (ii) high dietary fat appreciably reduced the secretory response to cholera toxin, probably without fundamentally changing the mechanism by which cholera toxin induces secretion. We conclude that dietary fat composition altered the permeability and transport characteristics of the small intestine. This observation might have relevance to some human diarrhoeal disorders. PMID:2253909

  12. Landslide susceptibility in the Tully Valley area, Finger Lakes region, New York

    USGS Publications Warehouse

    Jager, Stefan; Wieczorek, Gerald E.

    1994-01-01

    As a consequence of a large landslide in the Tully Valley, Onondaga County, New York, an investigation was undertaken to determine the factors responsible for the landslide in order to develop a model for regional landslide susceptibility. The April 27, 1993 Tully Valley landslide occurred within glacial lake clays overlain by till and colluvium on gentle slopes of 9-12 degrees. The landslide was triggered by extreme climatic events of prolonged heavy rainfall combined with rapid melting of a winter snowpack. A photoinventory and field checking of landslides within a 415 km2 study area, including the Tully Valley, revealed small recently-active landslides and other large dormant prehistoric landslides, probably Pleistocene in age. Similar to the larger Tully Valley landslide, the smaller recently-active landslides occurred in red glacial lake clays and were very likely triggered by seasonal rainfall. The large dormant landslides have been stable for long periods as evidenced by slope denudational processes that have modified the landslides. These old and ancient landslides correspond with proglacial lake levels during the Pleistocene, suggesting that either inundation or rapid drainage was responsible for triggering these landslides. A logistic regression analysis was performed within a Geographic Information System (GIS) environment to develop a model of landslide susceptibility for the Tully Valley study area. Presence of glacial clays, slope angle, and glacial lake levels were used as explanatory variables for landslide incidence. The spatial probability of landsliding, categorized as low, moderate and high, is portrayed within 90-m square cells on the susceptibility map.
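
    The cell-by-cell susceptibility computation can be sketched generically; the coefficients and category thresholds below are hypothetical stand-ins, since the fitted values are not given in this summary:

```python
import math

def landslide_probability(slope_deg, clay_present, lake_level, coef):
    """Logistic-regression spatial probability of landsliding for one
    grid cell: logit(p) = b0 + b1*slope + b2*clay + b3*lake_level."""
    b0, b1, b2, b3 = coef
    z = b0 + b1 * slope_deg + b2 * clay_present + b3 * lake_level
    return 1.0 / (1.0 + math.exp(-z))

def susceptibility_class(p, moderate=0.05, high=0.25):
    """Map a cell's probability onto low/moderate/high map categories."""
    if p >= high:
        return "high"
    if p >= moderate:
        return "moderate"
    return "low"
```

    Evaluated over every 90-m cell, this yields the three-category susceptibility surface described in the abstract; cells with glacial clay present (clay_present = 1) receive a higher probability than otherwise identical cells without it, for any positive clay coefficient.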

  13. More Intense Experiences, Less Intense Forecasts: Why People Overweight Probability Specifications in Affective Forecasts

    PubMed Central

    Buechel, Eva C.; Zhang, Jiao; Morewedge, Carey K.; Vosgerau, Joachim

    2014-01-01

    We propose that affective forecasters overestimate the extent to which experienced hedonic responses to an outcome are influenced by the probability of its occurrence. The experience of an outcome (e.g., winning a gamble) is typically more affectively intense than the simulation of that outcome (e.g., imagining winning a gamble) upon which the affective forecast for it is based. We suggest that, as a result, experiencers allocate a larger share of their attention toward the outcome (e.g., winning the gamble) and less to its probability specifications than do affective forecasters. Consequently, hedonic responses to an outcome are less sensitive to its probability specifications than are affective forecasts for that outcome. The results of 6 experiments provide support for our theory. Affective forecasters overestimated how sensitive experiencers would be to the probability of positive and negative outcomes (Experiments 1 and 2). Consistent with our attentional account, differences in sensitivity to probability specifications disappeared when the attention of forecasters was diverted from probability specifications (Experiment 3) or when the attention of experiencers was drawn toward probability specifications (Experiment 4). Finally, differences in sensitivity to probability specifications between forecasters and experiencers were diminished when the forecasted outcome was more affectively intense (Experiments 5 and 6). PMID:24128184

  14. More intense experiences, less intense forecasts: why people overweight probability specifications in affective forecasts.

    PubMed

    Buechel, Eva C; Zhang, Jiao; Morewedge, Carey K; Vosgerau, Joachim

    2014-01-01

    We propose that affective forecasters overestimate the extent to which experienced hedonic responses to an outcome are influenced by the probability of its occurrence. The experience of an outcome (e.g., winning a gamble) is typically more affectively intense than the simulation of that outcome (e.g., imagining winning a gamble) upon which the affective forecast for it is based. We suggest that, as a result, experiencers allocate a larger share of their attention toward the outcome (e.g., winning the gamble) and less to its probability specifications than do affective forecasters. Consequently, hedonic responses to an outcome are less sensitive to its probability specifications than are affective forecasts for that outcome. The results of 6 experiments provide support for our theory. Affective forecasters overestimated how sensitive experiencers would be to the probability of positive and negative outcomes (Experiments 1 and 2). Consistent with our attentional account, differences in sensitivity to probability specifications disappeared when the attention of forecasters was diverted from probability specifications (Experiment 3) or when the attention of experiencers was drawn toward probability specifications (Experiment 4). Finally, differences in sensitivity to probability specifications between forecasters and experiencers were diminished when the forecasted outcome was more affectively intense (Experiments 5 and 6).

  15. Community-specific hydraulic conductance potential of soil water decomposed for two Alpine grasslands by small-scale lysimetry

    NASA Astrophysics Data System (ADS)

    Frenck, Georg; Leitinger, Georg; Obojes, Nikolaus; Hofmann, Magdalena; Newesely, Christian; Deutschmann, Mario; Tappeiner, Ulrike; Tasser, Erich

    2018-02-01

    For central Europe, an increasing variability in precipitation is predicted in addition to rising temperatures. This will increase the probability of drought periods in the Alps, where water supply has so far been sufficient in most areas. For the same reason, community-specific imprints on the drought responses of Alpine grasslands have been poorly analyzed to date. In a replicated mesocosm experiment using high-precision small lysimeters, we compared evapotranspiration (ET) and biomass productivity of two differently drought-adapted Alpine grassland communities during two artificial drought periods separated by extreme precipitation events. The drought-adapted vegetation type showed a high potential to utilize even scarce water resources, combined with a lower potential to translate atmospheric deficits into higher water conductance, and a lower biomass production than those measured for the non-drought-adapted type. The non-drought-adapted type, in contrast, showed a high water conductance potential and a strong increase in ET rates when environmental conditions became less constraining. With high rates even under dry conditions, this community does not appear to be optimized to save water and might experience drought effects earlier and probably more strongly. As a result, the water use efficiency of the drought-adapted plant community, at 2.6 gDW kg-1 of water, is much higher than that of the non-drought-adapted plant community (0.16 gDW kg-1). In summary, the vegetation's reaction to two covarying gradients of potential evapotranspiration and soil water content revealed a clear difference in vegetation development and between water-saving and water-spending strategies regarding evapotranspiration.

  16. Applying the shelterwood system

    Treesearch

    Richard M. Godman

    1992-01-01

    The 2-cut shelterwood silvicultural system is the most reliable method we have for regenerating even-aged hardwoods. Unlike other systems it can be used for all hardwood species, both small- and large-seeded, but will probably be used most often for the small-seeded ones such as yellow birch, paper birch, and hemlock.

  17. The Precise Time Course of Lexical Activation: MEG Measurements of the Effects of Frequency, Probability, and Density in Lexical Decision

    ERIC Educational Resources Information Center

    Stockall, Linnaea; Stringfellow, Andrew; Marantz, Alec

    2004-01-01

    Visually presented letter strings consistently yield three MEG response components: the M170, associated with letter-string processing (Tarkiainen, Helenius, Hansen, Cornelissen, & Salmelin, 1999); the M250, affected by phonotactic probability, (Pylkkanen, Stringfellow, & Marantz, 2002); and the M350, responsive to lexical frequency (Embick,…

  18. On the dynamic nature of response criterion in recognition memory: effects of base rate, awareness, and feedback.

    PubMed

    Rhodes, Matthew G; Jacoby, Larry L

    2007-03-01

    The authors examined whether participants can shift their criterion for recognition decisions in response to the probability that an item was previously studied. Participants in 3 experiments were given recognition tests in which the probability that an item was studied was correlated with its location during the test. Results from all 3 experiments indicated that participants' response criteria were sensitive to the probability that an item was previously studied and that shifts in criterion were robust. In addition, awareness of the bases for criterion shifts and feedback on performance were key factors contributing to the observed shifts in decision criteria. These data suggest that decision processes can operate in a dynamic fashion, shifting from item to item.

  19. Accelerating rejection-based simulation of biochemical reactions with bounded acceptance probability

    NASA Astrophysics Data System (ADS)

    Thanh, Vo Hong; Priami, Corrado; Zunino, Roberto

    2016-06-01

    Stochastic simulation of large biochemical reaction networks is often computationally expensive due to disparate reaction rates and high variability in the populations of chemical species. One approach to accelerating the simulation is to allow multiple reaction firings before performing an update, by assuming that reaction propensities change by a negligible amount during a time interval. Species with small populations involved in the firings of fast reactions significantly affect both the performance and accuracy of this simulation approach, and the problem is even worse when these small-population species are involved in a large number of reactions. We present in this paper a new approximate algorithm to cope with this problem. It is based on bounding the acceptance probability of a reaction selected by the exact rejection-based simulation algorithm, which employs propensity bounds of reactions and a rejection-based mechanism to select the next reaction firings. The reaction is guaranteed to be selected to fire with an acceptance rate greater than a predefined probability; the selection becomes exact if this probability is set to one. Our new algorithm improves the computational cost of selecting the next reaction firing and reduces the cost of updating reaction propensities.
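
    The rejection-based selection that this algorithm builds on can be sketched in Python (a minimal illustration under stated assumptions, not the authors' implementation; the bounds and propensities are made up). Candidates are drawn in proportion to precomputed propensity upper bounds b_j and accepted with probability a_j/b_j, which reproduces the exact SSA firing distribution while the exact propensity a_j is evaluated only for the candidate.

```python
import random

def select_next_reaction(prop_bounds, true_propensity, rng):
    """Rejection-based selection of the next reaction firing.

    prop_bounds[j] is an upper bound b_j >= a_j on reaction j's propensity;
    true_propensity(j) lazily evaluates the exact propensity a_j.
    The accepted reaction j fires with probability a_j / sum(a), exactly
    as in the standard SSA.
    """
    total = sum(prop_bounds)
    while True:
        # draw candidate j with probability b_j / sum(b)
        u = rng.random() * total
        j, acc = 0, prop_bounds[0]
        while u > acc:
            j += 1
            acc += prop_bounds[j]
        # accept candidate with probability a_j / b_j
        if rng.random() < true_propensity(j) / prop_bounds[j]:
            return j

rng = random.Random(1)
bounds = [2.0, 1.0, 4.0]   # upper bounds b_j on the propensities
actual = [1.5, 1.0, 2.0]   # current exact propensities a_j
counts = [0, 0, 0]
for _ in range(30000):
    counts[select_next_reaction(bounds, lambda j: actual[j], rng)] += 1
# empirical firing frequencies track a_j / sum(a) = [1/3, 2/9, 4/9]
print([round(c / 30000, 3) for c in counts])
```

    The paper's contribution is to bound the acceptance rate a_j/b_j from below by a predefined probability, trading a controlled amount of exactness for fewer rejections and cheaper propensity updates.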

  20. Accelerating rejection-based simulation of biochemical reactions with bounded acceptance probability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thanh, Vo Hong, E-mail: vo@cosbi.eu; Priami, Corrado, E-mail: priami@cosbi.eu; Department of Mathematics, University of Trento, Trento

    Stochastic simulation of large biochemical reaction networks is often computationally expensive due to disparate reaction rates and high variability in the populations of chemical species. One approach to accelerating the simulation is to allow multiple reaction firings before performing an update, by assuming that reaction propensities change by a negligible amount during a time interval. Species with small populations involved in the firings of fast reactions significantly affect both the performance and accuracy of this simulation approach, and the problem is even worse when these small-population species are involved in a large number of reactions. We present in this paper a new approximate algorithm to cope with this problem. It is based on bounding the acceptance probability of a reaction selected by the exact rejection-based simulation algorithm, which employs propensity bounds of reactions and a rejection-based mechanism to select the next reaction firings. The reaction is guaranteed to be selected to fire with an acceptance rate greater than a predefined probability; the selection becomes exact if this probability is set to one. Our new algorithm improves the computational cost of selecting the next reaction firing and reduces the cost of updating reaction propensities.

  1. Fragment size distribution in viscous bag breakup of a drop

    NASA Astrophysics Data System (ADS)

    Kulkarni, Varun; Bulusu, Kartik V.; Plesniak, Michael W.; Sojka, Paul E.

    2015-11-01

    In this study we examine the drop size distribution resulting from the fragmentation of a single drop in the presence of a continuous air jet. Specifically, we study the effect of the Weber number, We, and the Ohnesorge number, Oh, on the disintegration process. The regime of breakup considered is observed between 12 <= We <= 16 for Oh <= 0.1. Experiments are conducted using phase Doppler anemometry. Both the number and volume fragment size probability distributions are plotted. The volume probability distribution revealed a bi-modal behavior with two distinct peaks: one corresponding to the rim fragments and the other to the bag fragments. This behavior was suppressed in the number probability distribution. Additionally, we employ an in-house particle detection code to isolate the rim fragment size distribution from the total probability distributions. Our experiments showed that the bag fragments are smaller in diameter and larger in number, while the rim fragments are larger in diameter and smaller in number. Furthermore, with increasing We for a given Oh we observe a large number of small-diameter drops and a small number of large-diameter drops. On the other hand, with increasing Oh for a fixed We the opposite is seen.

  2. Critical behavior of the contact process on small-world networks

    NASA Astrophysics Data System (ADS)

    Ferreira, Ronan S.; Ferreira, Silvio C.

    2013-11-01

    We investigate the role of clustering on the critical behavior of the contact process (CP) on small-world networks using the Watts-Strogatz (WS) network model with an edge rewiring probability p. The critical point is well predicted by a homogeneous cluster approximation in the limit of vanishing clustering (p → 1). The critical exponents and dimensionless moment ratios of the CP are in agreement with those predicted by mean-field theory for any p > 0. This independence from network clustering shows that the small-world property is a sufficient condition for mean-field theory to correctly predict the universality of the model. Moreover, we compare the CP dynamics on WS networks with rewiring probability p = 1 and on random regular networks, and show that the weak heterogeneity of the WS network slightly changes the critical point but does not alter other critical quantities of the model.

  3. Effect of drain current on appearance probability and amplitude of random telegraph noise in low-noise CMOS image sensors

    NASA Astrophysics Data System (ADS)

    Ichino, Shinya; Mawaki, Takezo; Teramoto, Akinobu; Kuroda, Rihito; Park, Hyeonwoo; Wakashima, Shunichi; Goto, Tetsuya; Suwa, Tomoyuki; Sugawa, Shigetoshi

    2018-04-01

    Random telegraph noise (RTN), which occurs in in-pixel source follower (SF) transistors, has become one of the most critical problems in high-sensitivity CMOS image sensors (CIS) because it is a limiting factor of dark random noise. In this paper, the behavior of RTN in response to changes in SF drain current conditions was analyzed using a low-noise array test circuit measurement system with a floor noise of 35 µV rms. In addition to statistical analysis across a large number of transistors (18,048), we also analyzed the behavior of RTN parameters such as amplitude and time constants in individual transistors. It is demonstrated that the appearance probability of RTN becomes small under small drain current conditions, although large-amplitude RTN tends to appear in a very small number of cells.

  4. Fermi's paradox, extraterrestrial life and the future of humanity: a Bayesian analysis

    NASA Astrophysics Data System (ADS)

    Verendel, Vilhelm; Häggström, Olle

    2017-01-01

    The Great Filter interpretation of Fermi's great silence asserts that Npq is not a very large number, where N is the number of potentially life-supporting planets in the observable universe, p is the probability that a randomly chosen such planet develops intelligent life to the level of present-day human civilization, and q is the conditional probability that it then goes on to develop a technological supercivilization visible all over the observable universe. Evidence suggests that N is huge, which implies that pq is very small. Hanson (1998) and Bostrom (2008) have argued that the discovery of extraterrestrial life would point towards p not being small and therefore a very small q, which can be seen as bad news for humanity's prospects of colonizing the universe. Here we investigate whether a Bayesian analysis supports their argument, and the answer turns out to depend critically on the choice of prior distribution.
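
    The core arithmetic of the Great Filter argument is that with N planets and per-planet probability pq of producing a visible supercivilization, the chance of total silence is (1 - pq)^N ≈ exp(-N·pq), so a huge N forces pq to be tiny. A quick numerical illustration (the values of N and pq below are illustrative, not the paper's):

```python
import math

# Illustrative numbers only: N potentially life-supporting planets, and a
# candidate pq = P(a given planet yields a visible supercivilization).
N = 1e22
for pq in (1e-10, 1e-21, 1e-24):
    # P(no visible supercivilization anywhere) = (1 - pq)^N ≈ exp(-N * pq)
    p_silence = math.exp(-N * pq)
    print(f"pq = {pq:.0e}:  P(silence) ≈ {p_silence:.3g}")
```

    Only when N·pq is of order one or smaller is the observed silence unsurprising, which is why evidence that p is not small pushes the burden onto a very small q.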

  5. Preservice Elementary Teachers and the Fundamentals of Probability

    ERIC Educational Resources Information Center

    Dollard, Clark

    2011-01-01

    This study examined how preservice elementary teachers think about situations involving probability. Twenty-four preservice elementary teachers who had not yet studied probability as part of their preservice elementary mathematics coursework were interviewed using a task-based interview. The participants' responses showed a wide variety of…

  6. Seasonal And Intra-seasonal Hydrological Responses To Change In Climate Pattern And Small Dams of the Faga Watershed In Burkina-Faso

    NASA Astrophysics Data System (ADS)

    Mamounata, K.

    2015-12-01

    In response to the increasing demand for food linked to the substantial population growth in Burkina Faso, irrigation has been widely used by the farming community to support agricultural production. A promising option for water resources development in this context is to increase the number of small dams, and it is assumed that the great number of small dams may affect the hydrological dynamics of sub-basins. This study aims to assess seasonal and intra-seasonal change in river basin hydrology using the Water Simulation Model (WaSiM), with the Faga River sub-basin in Burkina Faso, West Africa, as a case study. For this watershed the number of small dams is quite large (more than 60), and their impact on watershed runoff was estimated together with the change in climate pattern. The coefficient of variation for rainfall in this sub-basin from 1982 to 2010 is 0.097, and the stream flow shows a seasonal average of 25.58 km3 per month for the same period. The intra-seasonal climate variation for the same period is estimated at 0.087 in the scenario in which no dams were considered. Results based on simulations including the five most important dams of the sub-basin show that the overall effect of small dams is, on average, a 20.76% reduction in runoff. Projections using the Representative Concentration Pathways (RCP) 4.5 and 8.5 climate scenarios, with a 25% increase in the number of dams, show a probable decrease of about 29.54% and 35.25%, respectively, in average runoff over the next fifty years. The study findings show that small dams significantly reduce runoff from their watershed, and uncertainties related to the sustainability of the resource seem to be increasing over the same period. Therefore, despite the very large number of water storage infrastructures, reservoir operating strategies need to be developed for water sustainability within the Faga sub-basin.

  7. Oil spill contamination probability in the southeastern Levantine basin.

    PubMed

    Goldman, Ron; Biton, Eli; Brokovich, Eran; Kark, Salit; Levin, Noam

    2015-02-15

    Recent gas discoveries in the eastern Mediterranean Sea led to multiple operations with substantial economic interest, and with them there is a risk of oil spills and their potential environmental impacts. To examine the potential spatial distribution of this threat, we created seasonal maps of the probability of oil spill pollution reaching an area in the Israeli coastal and exclusive economic zones, given knowledge of its initial sources. We performed simulations of virtual oil spills using realistic atmospheric and oceanic conditions. The resulting maps show dominance of the alongshore northerly current, which causes the high probability areas to be stretched parallel to the coast, increasing contamination probability downstream of source points. The seasonal westerly wind forcing determines how wide the high probability areas are, and may also restrict these to a small coastal region near source points. Seasonal variability in probability distribution, oil state, and pollution time is also discussed. Copyright © 2014 Elsevier Ltd. All rights reserved.

  8. Optimizing Probability of Detection Point Estimate Demonstration

    NASA Technical Reports Server (NTRS)

    Koshti, Ajay M.

    2017-01-01

    Probability of detection (POD) analysis is used in assessing the reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. These NDE methods are intended to reliably detect real flaws such as cracks and crack-like flaws; a reliably detectable crack size is required for safe-life analysis of fracture-critical parts. The paper discusses optimizing probability of detection (POD) demonstration experiments using the point estimate method, which NASA uses for qualifying special NDE procedures. The point estimate method uses the binomial distribution for the probability density. Normally, a set of 29 flaws of the same size, within some tolerance, is used in the demonstration. The optimization is performed to provide an acceptable value for the probability of passing the demonstration (PPD) and an acceptable value for the probability of false calls (POF) while keeping the flaw sizes in the set as small as possible.
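
    The binomial arithmetic behind the 29-flaw point-estimate demonstration can be shown in a few lines. The function below (name and zero-miss default are illustrative, not from the paper) computes the probability of passing a demonstration given a system's true per-flaw POD:

```python
from math import comb

def prob_pass_demo(pod, n=29, max_misses=0):
    """Probability of passing a binomial POD demonstration:
    P(at most max_misses misses in n independent trials) for a system
    whose true per-flaw detection probability is pod."""
    return sum(comb(n, k) * (1 - pod) ** k * pod ** (n - k)
               for k in range(max_misses + 1))

# A zero-miss, 29-flaw demonstration:
print(round(prob_pass_demo(0.99), 3))  # very good system still fails sometimes
print(round(prob_pass_demo(0.90), 3))  # 0.9**29 ~ 0.047 < 0.05: a system with
# only 90% POD passes less than 5% of the time, which is the basis of the
# common "90/95" point-estimate criterion.
```

    Optimizing the demonstration then means choosing n, the allowed misses, and the flaw size so that PPD is acceptable for a good system while POF and the demonstrated flaw size stay small.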

  9. WHAMII - An enumeration and insertion procedure with binomial bounds for the stochastic time-constrained traveling salesman problem

    NASA Technical Reports Server (NTRS)

    Dahl, Roy W.; Keating, Karen; Salamone, Daryl J.; Levy, Laurence; Nag, Barindra; Sanborn, Joan A.

    1987-01-01

    This paper presents an algorithm (WHAMII) designed to solve the Artificial Intelligence Design Challenge at the 1987 AIAA Guidance, Navigation and Control Conference. The problem under consideration is a stochastic generalization of the traveling salesman problem in which travel costs can incur a penalty with a given probability. The variability in travel costs leads to a probability constraint with respect to violating the budget allocation. Given the small size of the problem (eleven cities), an approach is considered that combines partial tour enumeration with a heuristic city insertion procedure. For computational efficiency during both the enumeration and insertion procedures, precalculated binomial probabilities are used to determine an upper bound on the actual probability of violating the budget constraint for each tour. The actual probability is calculated for the final best tour, and additional insertions are attempted until the actual probability exceeds the bound.

  10. The precise time course of lexical activation: MEG measurements of the effects of frequency, probability, and density in lexical decision.

    PubMed

    Stockall, Linnaea; Stringfellow, Andrew; Marantz, Alec

    2004-01-01

    Visually presented letter strings consistently yield three MEG response components: the M170, associated with letter-string processing (Tarkiainen, Helenius, Hansen, Cornelissen, & Salmelin, 1999); the M250, affected by phonotactic probability, (Pylkkänen, Stringfellow, & Marantz, 2002); and the M350, responsive to lexical frequency (Embick, Hackl, Schaeffer, Kelepir, & Marantz, 2001). Pylkkänen et al. found evidence that the M350 reflects lexical activation prior to competition among phonologically similar words. We investigate the effects of lexical and sublexical frequency and neighborhood density on the M250 and M350 through orthogonal manipulation of phonotactic probability, density, and frequency. The results confirm that probability but not density affects the latency of the M250 and M350; however, an interaction between probability and density on M350 latencies suggests an earlier influence of neighborhoods than previously reported.

  11. Neuroimaging Characteristics of Small-Vessel Disease in Older Adults with Normal Cognition, Mild Cognitive Impairment, and Alzheimer Disease.

    PubMed

    Mimenza-Alvarado, Alberto; Aguilar-Navarro, Sara G; Yeverino-Castro, Sara; Mendoza-Franco, César; Ávila-Funes, José Alberto; Román, Gustavo C

    2018-01-01

    Cerebral small-vessel disease (SVD) represents the most frequent type of vascular brain lesions, often coexisting with Alzheimer disease (AD). By quantifying white matter hyperintensities (WMH) and hippocampal and parietal atrophy, we aimed to describe the prevalence and severity of SVD among older adults with normal cognition (NC), mild cognitive impairment (MCI), and probable AD and to describe associated risk factors. This study included 105 older adults evaluated with magnetic resonance imaging and clinical and neuropsychological tests. We used the Fazekas scale (FS) for quantification of WMH, the Scheltens scale (SS) for hippocampal atrophy, and the Koedam scale (KS) for parietal atrophy. Logistic regression models were performed to determine the association between FS, SS, and KS scores and the presence of NC, MCI, or probable AD. Compared to NC subjects, SVD was more prevalent in MCI and probable AD subjects. After adjusting for confounding factors, logistic regression showed a positive association between higher scores on the FS and probable AD (OR = 7.6, 95% CI 2.7-20, p < 0.001). With the use of the SS and KS (OR = 4.5, 95% CI 3.5-58, p = 0.003 and OR = 8.9, 95% CI 1-72, p = 0.04, respectively), the risk also remained significant for probable AD. These results suggest an association between severity of vascular brain lesions and neurodegeneration.

  12. Cost-effectiveness of first-line erlotinib in patients with advanced non-small-cell lung cancer unsuitable for chemotherapy

    PubMed Central

    Khan, Iftekhar; Morris, Stephen; Hackshaw, Allan; Lee, Siow-Ming

    2015-01-01

    Objective To assess the cost-effectiveness of erlotinib versus supportive care (placebo) overall and within a predefined rash subgroup in elderly patients with advanced non-small-cell lung cancer who are unfit for chemotherapy and receive only active supportive care due to their poor performance status or presence of comorbidities. Setting Between 2005 and 2009, a total of 670 patients with non-small cell lung cancer (NSCLC) were randomised across 78 hospital sites (centres) in the UK. Participants 670 patients with pathologically confirmed stage IIIb-IV NSCLC, unfit for chemotherapy, predominantly poor performance status (>2 on Eastern Cooperative Oncology Group, ECOG) and estimated life expectancy of at least 8 weeks. Patients were followed until disease progression or death, including a subgroup of patients who developed first cycle rash. Interventions Patients were randomised (1:1) to receive best supportive care plus oral placebo or erlotinib (150 mg/day) until disease progression, toxicity or death. Primary outcome Overall survival (OS). Secondary outcomes Progression-free survival (PFS), tumour response and quality adjusted life years (QALY), including within prespecified subgroups. Results The mean incremental cost per QALY in all patients was £202 571/QALY. The probability of cost-effectiveness of erlotinib in all patients was <10% at thresholds up to £100 000. However, within the rash subgroup, the incremental cost/QALY was £56 770/QALY with a probability of cost-effectiveness of about 80% for cost-effectiveness thresholds between £50 000 to £60 000. Conclusions Erlotinib has about 80% chance of being cost-effective at thresholds between £50 000–£60 000 in a subset of elderly poor performance patients with NSCLC unfit for chemotherapy who develop first cycle (28 days) rash. Erlotinib is potentially cost-effective for this population, for which few treatment options apart from best supportive care are available. 
Trial registration number (ISRCTN): 77383050. PMID:26137881

  13. Exploration of multiphoton entangled states by using weak nonlinearities

    PubMed Central

    He, Ying-Qiu; Ding, Dong; Yan, Feng-Li; Gao, Ting

    2016-01-01

    We propose a fruitful scheme for exploring multiphoton entangled states based on linear optics and weak nonlinearities. Compared with the previous schemes the present method is more feasible because there are only small phase shifts instead of a series of related functions of photon numbers in the process of interaction with Kerr nonlinearities. In the absence of decoherence we analyze the error probabilities induced by homodyne measurement and show that the maximal error probability can be made small enough even when the number of photons is large. This implies that the present scheme is quite tractable and it is possible to produce entangled states involving a large number of photons. PMID:26751044

  14. Oscillation characteristics of neutrino in the model with three sterile neutrinos for analysis of the anomalies on small distances

    NASA Astrophysics Data System (ADS)

    Khruschov, V. V.; Fomichev, S. V.

    2017-11-01

    In the framework of a model with three sterile neutrinos, the transition probabilities between different neutrino flavours are calculated. Graphical dependencies are obtained, in particular, for the appearance probability of electron neutrinos and antineutrinos in muon neutrino and antineutrino beams as a function of distance and of the other model parameters at their acceptable values, for neutrino energies below 50 MeV, as well as a function of the ratio of distance to neutrino energy. The theoretical results obtained can be used for analysis of neutrino data related to the short-distance anomalies.

  15. Fixation probability on clique-based graphs

    NASA Astrophysics Data System (ADS)

    Choi, Jeong-Ok; Yu, Unjong

    2018-02-01

    The fixation probability of a mutant in the evolutionary dynamics of the Moran process is calculated by the Monte Carlo method on a few families of clique-based graphs. It is shown that complete suppression of fixation can be realized with the generalized clique-wheel graph in the limit of small wheel-clique ratio and infinite size. The family of clique-stars is an amplifier, and the clique-arms graph changes from amplifier to suppressor as the fitness of the mutant increases. We demonstrate that the overall structure of a graph can be more important in determining the fixation probability than the degree or the heat heterogeneity. The dependence of the fixation probability on the position of the first mutant is discussed.
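
    The Monte Carlo estimation of fixation probability used in studies like this can be sketched on the simplest graph family, the complete graph, where a closed form exists for comparison. This is an illustrative sketch of the method, not the paper's code; the clique-based graphs it studies would replace the complete-graph update rule.

```python
import random

def fixation_probability_mc(n, r, trials, rng):
    """Monte Carlo fixation probability of one fitness-r mutant in a
    Moran process on the complete graph with n nodes."""
    fixed = 0
    for _ in range(trials):
        mutants = 1
        while 0 < mutants < n:
            # reproducer chosen proportional to fitness; its offspring
            # replaces a uniformly random other individual
            if rng.random() < mutants * r / (mutants * r + (n - mutants)):
                if rng.random() < (n - mutants) / (n - 1):
                    mutants += 1       # mutant offspring replaces a resident
            else:
                if rng.random() < mutants / (n - 1):
                    mutants -= 1       # resident offspring replaces a mutant
        fixed += mutants == n
    return fixed / trials

def fixation_probability_exact(n, r):
    """Closed form for the complete graph: (1 - 1/r) / (1 - 1/r**n)."""
    return (1 - 1 / r) / (1 - 1 / r ** n)

rng = random.Random(7)
est = fixation_probability_mc(10, 2.0, 4000, rng)
print(round(est, 3), round(fixation_probability_exact(10, 2.0), 3))
```

    Amplifiers and suppressors are then diagnosed by whether the graph's estimated fixation probability lies above or below this complete-graph baseline for r > 1.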

  16. A Bayesian predictive two-stage design for phase II clinical trials.

    PubMed

    Sambucini, Valeria

    2008-04-15

    In this paper, we propose a Bayesian two-stage design for phase II clinical trials, which represents a predictive version of the single threshold design (STD) recently introduced by Tan and Machin. The STD two-stage sample sizes are determined specifying a minimum threshold for the posterior probability that the true response rate exceeds a pre-specified target value and assuming that the observed response rate is slightly higher than the target. Unlike the STD, we do not refer to a fixed experimental outcome, but take into account the uncertainty about future data. In both stages, the design aims to control the probability of getting a large posterior probability that the true response rate exceeds the target value. Such a probability is expressed in terms of prior predictive distributions of the data. The performance of the design is based on the distinction between analysis and design priors, recently introduced in the literature. The properties of the method are studied when all the design parameters vary.
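
    The posterior criterion both stages control can be illustrated with a beta-binomial calculation: under a Beta(a, b) prior, observing x responses in n patients gives a Beta(a + x, b + n - x) posterior, whose tail probability above the target response rate is the quantity being thresholded. This sketch shows only that criterion (the function name and numerical-integration approach are illustrative); the paper's design additionally averages it over the prior predictive distribution of future data and distinguishes analysis from design priors.

```python
import math

def posterior_prob_exceeds(target, successes, n, a=1.0, b=1.0, grid=20000):
    """P(true response rate > target | data) under a Beta(a, b) prior,
    computed by midpoint-rule integration of the Beta posterior density."""
    a_post = a + successes
    b_post = b + n - successes
    log_norm = (math.lgamma(a_post + b_post)
                - math.lgamma(a_post) - math.lgamma(b_post))
    h = (1.0 - target) / grid
    total = 0.0
    for i in range(grid):
        p = target + (i + 0.5) * h
        total += math.exp(log_norm + (a_post - 1) * math.log(p)
                          + (b_post - 1) * math.log(1 - p))
    return total * h

# With a uniform prior and 16 responses in 25 patients, how sure are we
# that the true response rate exceeds a 50% target?
print(round(posterior_prob_exceeds(0.5, 16, 25), 3))
```

    A trial would proceed to the next stage only if this posterior probability exceeds the chosen threshold.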

  17. A Connection Admission Control Method for Web Server Systems

    NASA Astrophysics Data System (ADS)

    Satake, Shinsuke; Inai, Hiroshi; Saito, Tomoya; Arai, Tsuyoshi

    Most browsers establish multiple connections and download files in parallel to reduce the response time. On the other hand, a web server limits the total number of connections to prevent itself from being overloaded. That could decrease the response time, but would increase the loss probability, i.e., the probability that a newly arriving client is rejected. This paper proposes a connection admission control method that accepts only one connection from a newly arriving client when the number of connections exceeds a threshold, but accepts multiple new connections when the number of connections is below the threshold. Our method aims to reduce the response time by allowing as many clients as possible to establish multiple connections, while also reducing the loss probability. To reduce the time web server administrators spend determining an adequate threshold, we introduce a procedure that approximately calculates the loss probability for a given threshold. Via simulation, we validate the approximation and show the effectiveness of the admission control.
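
    The admission rule itself is a simple threshold policy. The sketch below is an illustrative rendering of the decision described in the abstract (function name, the explicit server-wide cap parameter, and the return convention are assumptions, not the paper's notation):

```python
def admit(current_connections, threshold, requested, max_connections):
    """Connection admission decision for a newly arriving client:
    below the threshold, grant the requested parallel connections;
    at or above it, grant a single connection; at the server-wide
    cap, reject the client (a loss)."""
    if current_connections >= max_connections:
        return 0                      # reject: contributes to loss probability
    if current_connections < threshold:
        # full parallel download, capped by remaining capacity
        return min(requested, max_connections - current_connections)
    return 1                          # degraded mode: single connection only

assert admit(10, 50, 4, 100) == 4     # light load: parallel connections
assert admit(60, 50, 4, 100) == 1     # above threshold: one connection
assert admit(100, 50, 4, 100) == 0    # server full: client rejected
print("ok")
```

    The paper's approximation then estimates the loss probability of this policy as a function of the threshold, so an administrator can pick the threshold without trial-and-error.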

  18. Risk Preferences, Probability Weighting, and Strategy Tradeoffs in Wildfire Management.

    PubMed

    Hand, Michael S; Wibbenmeyer, Matthew J; Calkin, David E; Thompson, Matthew P

    2015-10-01

    Wildfires present a complex applied risk management environment, but relatively little attention has been paid to behavioral and cognitive responses to risk among public agency wildfire managers. This study investigates responses to risk, including probability weighting and risk aversion, in a wildfire management context using a survey-based experiment administered to federal wildfire managers. Respondents were presented with a multiattribute lottery-choice experiment where each lottery is defined by three outcome attributes: expenditures for fire suppression, damage to private property, and exposure of firefighters to the risk of aviation-related fatalities. Respondents choose one of two strategies, each of which includes "good" (low cost/low damage) and "bad" (high cost/high damage) outcomes that occur with varying probabilities. The choice task also incorporates an information framing experiment to test whether information about fatality risk to firefighters alters managers' responses to risk. Results suggest that managers exhibit risk aversion and nonlinear probability weighting, which can result in choices that do not minimize expected expenditures, property damage, or firefighter exposure. Information framing tends to result in choices that reduce the risk of aviation fatalities, but exacerbates nonlinear probability weighting. © 2015 Society for Risk Analysis.

  19. A Comparative Analysis of Two Full-Scale MD-500 Helicopter Crash Tests

    NASA Technical Reports Server (NTRS)

    Littell, Justin D.

    2011-01-01

    Two full scale crash tests were conducted on a small MD-500 helicopter at NASA Langley Research Center's Landing and Impact Research Facility. One of the objectives of this test series was to compare airframe impact response and occupant injury data between a test in which the airframe was outfitted with an external composite passive energy-absorbing honeycomb and a test with no energy-absorbing features. In both tests, the nominal impact velocity conditions were 7.92 m/sec (26 ft/sec) vertical and 12.2 m/sec (40 ft/sec) horizontal, and the test article weighed approximately 1315 kg (2900 lbs). Airframe instrumentation included accelerometers and strain gages. Four Anthropomorphic Test Devices were also onboard: three were standard Hybrid II and III devices, while the fourth was a specialized torso. The test with the energy-absorbing honeycomb showed vertical impact acceleration loads of approximately 15 g, low probability of occupant injury, and minimal airframe damage. These results were contrasted with the test conducted without the energy-absorbing honeycomb, which showed airframe accelerations of approximately 40 g in the vertical direction, high probability of occupant injury, and substantial airframe damage.

  20. Synthetic membrane-targeted antibiotics.

    PubMed

    Vooturi, S K; Firestine, S M

    2010-01-01

    Antimicrobial resistance continues to evolve and presents serious challenges in the therapy of both nosocomial and community-acquired infections. The rise of resistant strains like methicillin-resistant Staphylococcus aureus (MRSA), vancomycin-resistant Staphylococcus aureus (VRSA) and vancomycin-resistant enterococci (VRE) suggests that antimicrobial resistance is an inevitable evolutionary response to antimicrobial use. This highlights the tremendous need for antibiotics against new bacterial targets. Agents that target the integrity of the bacterial membrane are relatively novel in the clinical armamentarium. Daptomycin, a lipopeptide, is a classic example of a membrane-targeted antibiotic. Nature has also utilized this tactic: antimicrobial peptides (AMPs), which are found in all kingdoms, function primarily by permeabilizing the bacterial membrane. AMPs have several advantages over existing antibiotics, including a broad spectrum of activity, rapid bactericidal activity, no cross-resistance with existing antibiotics and a low probability of developing resistance. Currently, a small number of peptides have been developed for clinical use, but therapeutic applications are limited by poor bioavailability and high manufacturing cost. However, their broad specificity, potent activity and lower probability of eliciting resistance have spurred the search for synthetic mimetics of antimicrobial peptides as membrane-active antibiotics. In this review, we will discuss the different classes of synthetic membrane-targeted antibiotics published since 2004.

  1. The functional significance of the spinose keel structure of benthic foraminifera: inferences from Miliolina cristata Millett, 1898 (Miliolida) from northeast Romania

    NASA Astrophysics Data System (ADS)

    Dumitriţa Dumitriu, Simina; Dubicka, Zofia; Ionesi, Viorel

    2018-01-01

    The paper presents Miocene (lower Sarmatian) benthic foraminifera from the FH3P1 Rădăuţi Core section from the northwestern part of the Moldavian Platform, Romania. Based on the foraminiferal assemblages we infer that the sediments were deposited in shallow-water, including marine-marginal, environments of varying salinities, from brackish to normal marine, with some short and rather small sea-level changes. Moreover, we describe for the first time in the Moldavian Platform a very rare species, Miliolina cristata Millett, which presents a characteristic spinose keel. Based on a detailed study of the test morphology and its variability, observed in picked material as well as in thin sections, we discuss some palaeoecological aspects of these foraminifera. M. cristata probably does not constitute a distinct species; it is more probable that some miliolid taxa developed such an exoskeletal feature in response to new environmental conditions, such as more turbulent water. Accordingly, our study supports the thesis that one of the functions of benthic foraminiferal spines is to stabilize foraminiferal tests in sandy substrates from high-energy environments.

  2. Chronic fatigue of the small enterprise workers participating in an occupational health checkup center in southern Taiwan.

    PubMed

    Wang, Fu-Wei; Chiu, Yu-Wen; Tu, Ming-Shium; Chou, Ming-Yueh; Wang, Chao-Ling; Chuang, Hung-Yi

    2009-07-01

    There has been increasing interest in the occupational health of workers in small enterprises, especially in developing countries. This study examines the association between psychosocial job characteristics and fatigue, and attempts to identify risk factors for fatigue among workers of small enterprises in southern Taiwan. A structured questionnaire was administered to workers receiving regular health examinations between August 2005 and January 2006. The questionnaire collected demographic information and data on working conditions, personal health status and lifestyles. It also collected information on psychosocial job characteristics, fatigue and psychological distress using three instruments. A total of 647 workers, with a mean age of 43.7 years, completed the questionnaire. Probable fatigue was found in 34.6% of the sample. By multiple logistic regression, fatigue was associated with lack of exercise, shift work, depression score and lack of social support at the workplace. This study found associations between lifestyle, psychosocial job characteristics and fatigue. Given the high prevalence of probable fatigue in such small enterprises, the authors suggest that a short interview with some quick questionnaires during health checkups for small enterprise workers would help detect psychosocial and fatigue problems early.

  3. The Influence of Phonotactic Probability on Word Recognition in Toddlers

    ERIC Educational Resources Information Center

    MacRoy-Higgins, Michelle; Shafer, Valerie L.; Schwartz, Richard G.; Marton, Klara

    2014-01-01

    This study examined the influence of phonotactic probability on word recognition in English-speaking toddlers. Typically developing toddlers completed a preferential looking paradigm using familiar words, which consisted of either high or low phonotactic probability sound sequences. The participants' looking behavior was recorded in response to…

  4. Potential impacts of offshore oil spills on polar bears in the Chukchi Sea.

    PubMed

    Wilson, Ryan R; Perham, Craig; French-McCay, Deborah P; Balouskus, Richard

    2018-04-01

    Sea ice decline is anticipated to increase human access to the Arctic Ocean allowing for offshore oil and gas development in once inaccessible areas. Given the potential negative consequences of an oil spill on marine wildlife populations in the Arctic, it is important to understand the magnitude of impact a large spill could have on wildlife to inform response planning efforts. In this study we simulated oil spills that released 25,000 barrels of oil for 30 days in autumn originating from two sites in the Chukchi Sea (one in Russia and one in the U.S.) and tracked the distribution of oil for 76 days. We then determined the potential impact such a spill might have on polar bears (Ursus maritimus) and their habitat by overlapping spills with maps of polar bear habitat and movement trajectories. Only a small proportion (1-10%) of high-value polar bear sea ice habitat was directly affected by oil sufficient to impact bears. However, 27-38% of polar bears in the region were potentially exposed to oil. Oil consistently had the highest probability of reaching Wrangel and Herald islands, important areas of denning and summer terrestrial habitat. Oil did not reach polar bears until approximately 3 weeks after the spills. Our study found the potential for significant impacts to polar bears under a worst case discharge scenario, but suggests that there is a window of time where effective containment efforts could minimize exposure to bears. Our study provides a framework for wildlife managers and planners to assess the level of response that would be required to treat exposed wildlife and where spill response equipment might be best stationed. While the size of spill we simulated has a low probability of occurring, it provides an upper limit for planners to consider when crafting response plans. Published by Elsevier Ltd.

  5. Concurrent performance in a three-alternative choice situation: response allocation in a Rock/Paper/Scissors game.

    PubMed

    Kangas, Brian D; Berry, Meredith S; Cassidy, Rachel N; Dallery, Jesse; Vaidya, Manish; Hackenberg, Timothy D

    2009-10-01

    Adult human subjects engaged in a simulated Rock/Paper/Scissors game against a computer opponent. The computer opponent's responses were determined by programmed probabilities that differed across 10 blocks of 100 trials each. Response allocation in Experiment 1 was well described by a modified version of the generalized matching equation, with undermatching observed in all subjects. To assess the effects of instructions on response allocation, accurate probability-related information on how the computer was programmed to respond was provided to subjects in Experiment 2. Five of 6 subjects played the counter response of the computer's dominant programmed response near-exclusively (e.g., subjects played paper almost exclusively if the probability of rock was high), resulting in minor overmatching, and higher reinforcement rates relative to Experiment 1. On the whole, the study shows that the generalized matching law provides a good description of complex human choice in a gaming context, and illustrates a promising set of laboratory methods and analytic techniques that capture important features of human choice outside the laboratory.
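    The generalized matching equation referred to above, log(B1/B2) = s·log(R1/R2) + log c, is a straight line in log-ratio space, so it can be fit by ordinary least squares; a slope s < 1 corresponds to the undermatching observed in Experiment 1 and s > 1 to the overmatching in Experiment 2. A minimal sketch with synthetic data (not the study's):

```python
from math import log10

def fit_generalized_matching(behavior_ratios, reinforcer_ratios):
    """Ordinary least-squares fit of log(B1/B2) = s*log(R1/R2) + log c.

    Returns (s, log_c): s is sensitivity (s < 1 -> undermatching,
    s > 1 -> overmatching) and log_c is response bias.
    """
    xs = [log10(r) for r in reinforcer_ratios]
    ys = [log10(b) for b in behavior_ratios]
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mean_x) ** 2 for x in xs)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    s = sxy / sxx
    return s, mean_y - s * mean_x

# Synthetic session: response ratios generated with sensitivity 0.8, no bias.
reinforcers = [0.25, 0.5, 1.0, 2.0, 4.0]
behavior = [r ** 0.8 for r in reinforcers]
```

    Fitting these synthetic data recovers s = 0.8 and log c = 0 exactly, since they were generated without noise.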

  6. Male-induced short oestrous and ovarian cycles in sheep and goats: a working hypothesis.

    PubMed

    Chemineau, Philippe; Pellicer-Rubio, Maria-Theresa; Lassoued, Narjess; Khaldi, Gley; Monniaux, Danielle

    2006-01-01

    The existence of short ovulatory cycles (5-day duration) after the first male-induced ovulations in anovulatory ewes and goats, associated or not with the appearance of oestrous behaviour, is the origin of the two-peak abnormal distribution of parturitions after the "male effect". We propose here a working hypothesis to explain the presence of these short cycles. The male effect is efficient during anoestrus, when follicles contain granulosa cells of lower quality than during the breeding season. These follicles generate corpora lutea (CL) with a lower proportion of large luteal cells relative to small cells, which therefore secrete less progesterone than in a breeding-season cycle. This is probably not sufficient to block prostaglandin synthesis in the endometrial cells of the uterus at the time when the responsiveness of the newly formed CL to prostaglandins is initiated, nor, in parallel, to centrally reduce LH pulsatility. This LH pulsatility stimulates a new wave of follicles secreting oestradiol which, in turn, stimulates prostaglandin synthesis and provokes luteolysis and new ovulation(s). The occurrence of a new follicular wave on days 3-4 of the first male-induced cycle and the initiation of CL responsiveness to prostaglandins from day 3 of the oestrous cycle are probably the key elements that ensure such regularity in the duration of the short cycles. Exogenous progesterone injection suppresses short cycles, probably not by delaying ovulation time, but rather by blocking prostaglandin synthesis, thus impairing luteolysis. Whether oestrous behaviour is associated with these ovulatory events varies mainly with species: ewes, compared to does, require more intense endogenous progesterone priming, and only ovulations preceded by normal cycles are associated with oestrous behaviour.
Thus, the precise and delicate mechanism underlying the existence of short ovulatory and oestrous cycles induced by the male effect appears to be dependent on the various levels of the hypothalamo-pituitary-ovario-uterine axis.

  7. Deviation from Power Law Behavior in Landslide Phenomenon

    NASA Astrophysics Data System (ADS)

    Li, L.; Lan, H.; Wu, Y.

    2013-12-01

    Power law distributions of magnitude are widely observed in many natural hazards (e.g., earthquakes, floods, tornadoes, and forest fires). Landslides are unique in that their size distribution is characterized by a power law decrease with a rollover at the small size end. Yet the emergence of the rollover, i.e., the deviation from power law behavior for small landslides, remains a mystery. In this contribution, we grouped the forces applied to a landslide body into two categories: 1) forces proportional to the volume of the failure mass (gravity and friction), and 2) forces proportional to the area of the failure surface (cohesion). Failure occurs when the forces proportional to volume exceed the forces proportional to surface area. As such, for a given mechanical configuration, the ratio of failure volume to failure surface area must exceed a corresponding threshold for failure to occur. If all landslides shared a uniform shape, so that the volume to surface area ratio increased regularly with landslide volume, a simple cutoff of the landslide volume distribution at the small size end could be defined. In realistic landslide phenomena, however, where landslide shape and mechanical configuration are heterogeneous, no such simple cutoff exists. The stochasticity of landslide shape instead introduces a probability distribution of the volume to surface area ratio as a function of landslide volume, from which the probability that the ratio exceeds the threshold can be estimated for each value of landslide volume. An experiment based on empirical data showed that this probability can make the power law distribution of landslide volume roll over at the small size end. 
We therefore propose that the constraints on the failure volume to failure surface area ratio, together with the heterogeneity of landslide geometry and mechanical configuration, account for the deviation from power law behavior in the landslide phenomenon. The figure shows that a rollover of the landslide size distribution at the small size end is produced when the probability that V/S (the ratio of failure volume to failure surface area) exceeds the mechanical threshold is applied to the power law distribution of landslide volume.
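    The proposed mechanism can be illustrated with a small Monte-Carlo sketch: candidate volumes follow a power law, the failure surface scales roughly as V^(2/3) times a random shape factor standing in for geometric heterogeneity, and only candidates satisfying the mechanical condition V/S > threshold become landslides. All distributions and parameter values below are illustrative, not taken from the study:

```python
import random

def surviving_volumes(n, alpha=1.6, ratio_threshold=1.5, seed=1):
    """Monte-Carlo sketch of the rollover mechanism (illustrative parameters).

    Candidate volumes V follow a power law (Pareto tail, exponent alpha).
    The failure surface is S = shape * V**(2/3), with a lognormal shape
    factor modeling geometric heterogeneity. Keeping only candidates with
    V / S > ratio_threshold preferentially removes small volumes, which
    produces a rollover at the small-size end of the kept distribution
    while the large-size tail retains its power-law form.
    """
    rng = random.Random(seed)
    kept = []
    for _ in range(n):
        v = rng.paretovariate(alpha)            # power-law candidate volume
        shape = rng.lognormvariate(0.0, 0.4)    # random shape factor
        s = shape * v ** (2.0 / 3.0)            # surface area ~ V^(2/3)
        if v / s > ratio_threshold:             # mechanical failure condition
            kept.append(v)
    return kept
```

    Comparing a histogram of the kept volumes against one of all candidates shows the depletion at small V: the acceptance probability grows with V because V/S scales as V^(1/3) divided by the random shape factor.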

  8. Diversity of multilayer networks and its impact on collaborating epidemics

    NASA Astrophysics Data System (ADS)

    Min, Yong; Hu, Jiaren; Wang, Weihong; Ge, Ying; Chang, Jie; Jin, Xiaogang

    2014-12-01

    Interacting epidemics on diverse multilayer networks are increasingly important in modeling and analyzing the diffusion processes of real complex systems. A viral agent spreading on one layer of a multilayer network can interact with its counterparts by promoting (cooperative interaction), suppressing (competitive interaction), or inducing (collaborating interaction) its diffusion on other layers. Collaborating interaction displays different patterns: (i) random collaboration, where intralayer or interlayer induction has the same probability; (ii) concentrating collaboration, where consecutive intralayer induction is guaranteed with a probability of 1; and (iii) cascading collaboration, where consecutive intralayer induction is banned with a probability of 0. In this paper, we develop a top-bottom framework that uses only two distributions, the overlaid degree distribution and the edge-type distribution, to model collaborating epidemics on multilayer networks. We then characterize the response of the three collaborating patterns to structural diversity (evenness and difference of network layers). For viral agents with small transmissibility, we find that random collaboration is more effective in networks with higher diversity (high evenness and difference), while the concentrating pattern is more suitable in uneven networks. Interestingly, the cascading pattern requires a network with moderate difference and high evenness, and the moderately uneven coupling of multiple network layers can effectively increase robustness against cascading failure. With large transmissibility, however, we find that all collaborating patterns are more effective in high-diversity networks. Our work provides a systematic analysis of collaborating epidemics on multilayer networks. The results enhance our understanding of biological and informational diffusion through multiple vectors.

  9. SANTA LUCIA WILDERNESS, AND GARCIA MOUNTAIN, BLACK MOUNTAIN, LA PANZA, MACHESNA MOUNTAIN, LOS MACHOS HILLS, BIG ROCKS, AND STANLEY MOUNTAIN ROADLESS AREAS, CALIFORNIA.

    USGS Publications Warehouse

    Frizzell, Virgil A.; Kuizon, Lucia

    1984-01-01

    The Santa Lucia Wilderness Area and Garcia Mountain, Black Mountain, La Panza, Machesna Mountain, Los Machos Hills, Big Rocks, and Stanley Mountain Roadless Areas together occupy an area of about 218 sq mi in the Los Padres National Forest, California. On the basis of a mineral-resource evaluation a small area in the Black Mountain Roadless Area has a probable mineral-resource potential for uranium, and a small area in the Stanley Mountain Roadless Area has probable potential for low-grade mercury resources. Although petroleum resources occur in rocks similar to those found in the study area, no potential for petroleum resources was identified in the wilderness or any of the roadless areas. No resource potential for other mineral resources was identified in any of the areas. Detailed geologic mapping and geochemical sampling probably would increase knowledge about distribution and modes of occurrence of uranium and cinnabar in those areas, respectively.

  10. 2012 Workplace and Gender Relations Survey of Reserve Component Members: Statistical Methodology Report

    DTIC Science & Technology

    2012-09-01

    A logistic regression model was used to predict the probability of eligibility for the survey (known eligibility vs. unknown eligibility). A second logistic regression model was used to predict the probability of response among eligible sample members (complete response vs. non-response). CHAID (Chi-squared Automatic Interaction Detection)

  11. A comparison of survey methods for documenting presence of Myotis leibii (Eastern Small-Footed Bats) at roosting areas in Western Virginia

    USGS Publications Warehouse

    Huth, John K.; Silvis, Alexander; Moosman, Paul R.; Ford, W. Mark; Sweeten, Sara E.

    2015-01-01

    Many aspects of the foraging and roosting habitat of Myotis leibii (Eastern Small-Footed Bat), an emergent-rock roosting obligate, are poorly described. Previous comparisons of the effectiveness of acoustic sampling and mist-net captures have not included the Eastern Small-Footed Bat. Habitat requirements of this species differ from congeners in the region, and it is unclear whether survey protocols developed for other species are applicable. Using data from three overlapping studies at two sampling sites in western Virginia's central Appalachian Mountains, detection probabilities were examined for three survey methods (acoustic surveys with automated identification of calls, visual searches of rock crevices, and mist-netting) for use in the development of "best practices" for future surveys and monitoring. Observer effects were investigated using an expanded version of the visual search data. Results suggested that acoustic surveys with automated call identification are not effective for documenting presence of Eastern Small-Footed Bats on talus slopes (basal detection rate of 0%), even when the species is known to be present. The broadband, high-frequency echolocation calls emitted by this species are prone to attenuation, and this factor, along with signal reflection, lower echolocation rates, and possible misidentification as other bat species over talus slopes, may all have contributed to poor acoustic survey success. Visual searches and mist-netting of emergent rock had basal detection probabilities of 91% and 75%, respectively. Success of visual searches varied among observers, but detection probability improved with practice. Additionally, visual searches were considerably more economical than mist-netting.

  12. Rapid spread and association of Schmallenberg virus with ruminant abortions and foetal death in Austria in 2012/2013.

    PubMed

    Steinrigl, Adolf; Schiefer, Peter; Schleicher, Corina; Peinhopf, Walter; Wodak, Eveline; Bagó, Zoltán; Schmoll, Friedrich

    2014-10-15

    Schmallenberg virus (SBV) emerged in summer-autumn 2011 in north-western Europe. Since then, SBV has been continuously spreading over Europe, including Austria, where antibodies to SBV, as well as SBV genome, were first detected in autumn 2012. This study was performed to demonstrate the dynamics of SBV spread within Austria after its probable first introduction in summer 2012. True seroprevalence estimates for cattle and small ruminants were calculated to demonstrate temporal and regional differences of infection. Furthermore, the probability of SBV genome detection in foetal tissues of aborted or stillborn cattle and small ruminants, as well as in allantoic fluid samples from cows with early foetal losses, was retrospectively assessed. SBV most likely first reached Austria in July-August 2012, as indicated by retrospective detection of SBV antibodies and SBV genome in archived samples. From August to October 2012, a rapid increase in seroprevalence to over 98% in cattle and a contemporaneous peak in the detection of SBV genome in foetal tissues and allantoic fluid samples were noted, indicating widespread acute infections. Notably, foetal malformations were absent in RT-qPCR-positive foetuses at this time of the epidemic. SBV spread within Austrian cattle reached a plateau phase as early as October 2012, without significant regional differences in SBV seroprevalence (98.4-100%). Estimated true seroprevalences among small ruminants were comparatively lower than in cattle and regionally different (58.3-95.6% in October 2012), potentially indicating an eastward spread of the infection, as well as different infection dynamics between cattle and small ruminants. Additionally, the probability of SBV genome detection over time differed significantly between small ruminant and cattle samples subjected to RT-qPCR testing. Copyright © 2014 Elsevier B.V. All rights reserved.

  13. Reweighting Data in the Spirit of Tukey: Using Bayesian Posterior Probabilities as Rasch Residuals for Studying Misfit

    ERIC Educational Resources Information Center

    Dardick, William R.; Mislevy, Robert J.

    2016-01-01

    A new variant of the iterative "data = fit + residual" data-analytical approach described by Mosteller and Tukey is proposed and implemented in the context of item response theory psychometric models. Posterior probabilities from a Bayesian mixture model of a Rasch item response theory model and an unscalable latent class are expressed…

  14. On the Dynamic Nature of Response Criterion in Recognition Memory: Effects of Base Rate, Awareness, and Feedback

    ERIC Educational Resources Information Center

    Rhodes, Matthew G.; Jacoby, Larry L.

    2007-01-01

    The authors examined whether participants can shift their criterion for recognition decisions in response to the probability that an item was previously studied. Participants in 3 experiments were given recognition tests in which the probability that an item was studied was correlated with its location during the test. Results from all 3…

  15. Mutational analysis of S12 protein and implications for the accuracy of decoding by the ribosome.

    PubMed

    Sharma, Divya; Cukras, Anthony R; Rogers, Elizabeth J; Southworth, Daniel R; Green, Rachel

    2007-12-07

    The fidelity of aminoacyl-tRNA selection by the ribosome depends on a conformational switch in the decoding center of the small ribosomal subunit induced by cognate but not by near-cognate aminoacyl-tRNA. The aminoglycosides paromomycin and streptomycin bind to the decoding center and induce related structural rearrangements that explain their observed effects on miscoding. Structural and biochemical studies have identified ribosomal protein S12 (as well as specific nucleotides in 16S ribosomal RNA) as a critical molecular contributor in distinguishing between cognate and near-cognate tRNA species as well as in promoting more global rearrangements in the small subunit, referred to as "closure." Here we use a mutational approach to define contributions made by two highly conserved loops in S12 to the process of tRNA selection. Most S12 variant ribosomes tested display increased levels of fidelity (a "restrictive" phenotype). Interestingly, several variants, K42A and R53A, were substantially resistant to the miscoding effects of paromomycin. Further characterization of the compromised paromomycin response identified a probable second, fidelity-modulating binding site for paromomycin in the 16S ribosomal RNA that facilitates closure of the small subunit and compensates for defects associated with the S12 mutations.

  16. Identification of differentially expressed small non-coding RNAs in the legume endosymbiont Sinorhizobium meliloti by comparative genomics

    PubMed Central

    del Val, Coral; Rivas, Elena; Torres-Quesada, Omar; Toro, Nicolás; Jiménez-Zurdo, José I

    2007-01-01

    Bacterial small non-coding RNAs (sRNAs) are being recognized as novel widespread regulators of gene expression in response to environmental signals. Here, we present the first search for sRNA-encoding genes in the nitrogen-fixing endosymbiont Sinorhizobium meliloti, performed by a genome-wide computational analysis of its intergenic regions. Comparative sequence data from eight related α-proteobacteria were obtained, and the interspecies pairwise alignments were scored with the programs eQRNA and RNAz as complementary predictive tools to identify conserved and stable secondary structures corresponding to putative non-coding RNAs. Northern experiments confirmed that eight of the predicted loci, selected among the original 32 candidates as most probable sRNA genes, expressed small transcripts. This result supports the combined use of eQRNA and RNAz as a robust strategy to identify novel sRNAs in bacteria. Furthermore, seven of the transcripts accumulated differentially in free-living and symbiotic conditions. Experimental mapping of the 5′-ends of the detected transcripts revealed that their encoding genes are organized in autonomous transcription units with recognizable promoter and, in most cases, termination signatures. These findings suggest novel regulatory functions for sRNAs related to the interactions of α-proteobacteria with their eukaryotic hosts. PMID:17971083

  17. The effects of neighborhood views containing multiple environmental features on road traffic noise perception at dwellings.

    PubMed

    Leung, T M; Xu, J M; Chau, C K; Tang, S K; Pun-Cheng, L S C

    2017-04-01

    The importance of non-acoustical factors, including the type of visual environment, in human noise perception is increasingly recognized. To reveal the relationships between long-term noise annoyance and different types of neighborhood views, 2033 questionnaire responses were collected to study how perceiving different combinations of views of the sea, urban rivers, greenery, and/or noise barriers affects the annoyance responses of residents living in high-rise apartments in Hong Kong. The collected responses were used to formulate a multivariate model predicting the probability of a high annoyance response from residents. Results showed that views of the sea, urban rivers, or greenery lowered this probability, while views of noise barriers increased it. Views of greenery had a stronger noise-moderating capability than views of the sea or urban rivers. An interaction effect between views of water and views of noise barriers exerted a negative influence on the noise annoyance moderation capability. The probability of high annoyance under an environment containing views of both noise barriers and urban rivers would be even higher than under an environment containing views of noise barriers alone.

  18. Applications of the Galton Watson process to human DNA evolution and demography

    NASA Astrophysics Data System (ADS)

    Neves, Armando G. M.; Moreira, Carlos H. C.

    2006-08-01

    We show that the problem of existence of a mitochondrial Eve can be understood as an application of the Galton-Watson process and presents interesting analogies with critical phenomena in Statistical Mechanics. In the approximation of small survival probability, and assuming limited progeny, we are able to find for a genealogic tree the maximum and minimum survival probabilities over all probability distributions for the number of children per woman constrained to a given mean. As a consequence, we can relate existence of a mitochondrial Eve to quantitative demographic data of early mankind. In particular, we show that a mitochondrial Eve may exist even in an exponentially growing population, provided that the mean number of children per woman Nbar is constrained to a small range depending on the probability p that a child is a female. Assuming that the value p≈0.488 valid nowadays has remained fixed for thousands of generations, the range where a mitochondrial Eve occurs with sizeable probability is 2.0492
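    The link between the mean number of children per woman and matriline survival can be illustrated with a standard Galton-Watson fixed-point computation. Purely for illustration we assume a Poisson number of children per woman (the abstract's bounds are taken over all limited-progeny distributions with a given mean), so the daughters-only process is Poisson with mean m = p · N̄ and its extinction probability q is the smallest root of q = exp(m(q − 1)):

```python
from math import exp

def matriline_survival(mean_children, p_female=0.488, iters=5000):
    """Survival probability of one woman's matriline (female-only lineage).

    Illustrative assumption: children per woman are Poisson with mean
    `mean_children`, so daughters are Poisson with mean
    m = p_female * mean_children. The extinction probability q is the
    smallest root of q = exp(m * (q - 1)), found by fixed-point iteration
    from q = 0. The process is supercritical (survival > 0) only when
    m > 1, i.e. mean_children > 1 / p_female.
    """
    m = p_female * mean_children
    q = 0.0
    for _ in range(iters):
        q = exp(m * (q - 1.0))
    return 1.0 - q
```

    Under this assumption the critical mean is 1/p ≈ 1/0.488 ≈ 2.0492, consistent with the lower bound quoted in the abstract: below it a matriline dies out with probability 1, while above it survival becomes sizeable.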

  19. Post-fire recovery of torpor and activity patterns of a small mammal.

    PubMed

    Stawski, Clare; Hume, Taylor; Körtner, Gerhard; Currie, Shannon E; Nowack, Julia; Geiser, Fritz

    2017-05-01

    To cope with the post-fire challenges of decreased availability of food and shelter, brown antechinus (Antechinus stuartii), a small marsupial mammal, increase the use of energy-conserving torpor and reduce activity. However, it is not known how long it takes for animals to resume pre-fire torpor and activity patterns during the recovery of burnt habitat. Therefore, we tested the hypothesis that antechinus will adjust torpor use and activity after a fire depending on vegetation recovery. We simultaneously quantified torpor and activity patterns for female antechinus from three adjacent areas: (i) the area of a management burn 1 year post-fire, (ii) an area that was burned 2 years prior, and (iii) a control area. In comparison to shortly after the management burn, antechinus in all three groups displayed less frequent and less pronounced torpor while being more active. We provide the first evidence that only 1 year post-fire antechinus resume pre-fire torpor and activity patterns, probably in response to the return of herbaceous ground cover and foraging opportunities. © 2017 The Author(s).

  20. Transfer of Minibeam Radiation Therapy into a cost-effective equipment for radiobiological studies: a proof of concept.

    PubMed

    Prezado, Y; Dos Santos, M; Gonzalez, W; Jouvion, G; Guardiola, C; Heinrich, S; Labiod, D; Juchaux, M; Jourdain, L; Sebrie, C; Pouzoulet, F

    2017-12-11

    Minibeam radiation therapy (MBRT) is an innovative synchrotron radiotherapy technique able to shift the normal tissue complication probability curves to significantly higher doses. However, its exploration was hindered due to the limited and expensive beamtime at synchrotrons. The aim of this work was to develop a cost-effective equipment to perform systematic radiobiological studies in view of MBRT. Tumor control for various tumor entities will be addressable as well as studies to unravel the distinct biological mechanisms involved in normal and tumor tissues responses when applying MBRT. With that aim, a series of modifications of a small animal irradiator were performed to make it suitable for MBRT experiments. In addition, the brains of two groups of rats were irradiated. Half of the animals received a standard irradiation, the other half, MBRT. The animals were followed-up for 6.5 months. Substantial brain damage was observed in the group receiving standard RT, in contrast to the MBRT group, where no significant lesions were observed. This work proves the feasibility of the transfer of MBRT outside synchrotron sources towards a small animal irradiator.

  1. Train stimulation of parallel fibre to Purkinje cell inputs reveals two populations of synaptic responses with different receptor signatures

    PubMed Central

    Devi, Suma Priya Sudarsana; Howe, James R.

    2016-01-01

    Key points: Purkinje cells of the cerebellum receive ∼180,000 parallel fibre synapses, which have often been viewed as a homogeneous synaptic population and studied using single action potentials. Many parallel fibre synapses might be silent, however, and granule cells in vivo fire in bursts. Here, we used trains of stimuli to study parallel fibre inputs to Purkinje cells in rat cerebellar slices. Analysis of train EPSCs revealed two synaptic components, phase 1 and 2. Phase 1 is initially large and saturates rapidly, whereas phase 2 is initially small and facilitates throughout the train. The two components have a heterogeneous distribution at dendritic sites and different pharmacological profiles. The differential sensitivity of phase 1 and phase 2 to inhibition by pentobarbital and NBQX mirrors the differential sensitivity of AMPA receptors associated with the transmembrane AMPA receptor regulatory protein, γ‐2, gating in the low‐ and high‐open probability modes, respectively. Abstract: Cerebellar granule cells fire in bursts, and their parallel fibre axons (PFs) form ∼180,000 excitatory synapses onto the dendritic tree of a Purkinje cell. As many as 85% of these synapses have been proposed to be silent, but most are labelled for AMPA receptors. Here, we studied PF to Purkinje cell synapses using trains of 100 Hz stimulation in rat cerebellar slices. The PF train EPSC consisted of two components that were present in variable proportions at different dendritic sites: one, with large initial EPSC amplitude, saturated after three stimuli and dominated the early phase of the train EPSC; and the other, with small initial amplitude, increased steadily throughout the train of 10 stimuli and dominated the late phase of the train EPSC. The two phases also displayed different pharmacological profiles. Phase 2 was less sensitive to inhibition by NBQX but more sensitive to block by pentobarbital than phase 1. 
Comparison of synaptic results with fast glutamate applications to recombinant receptors suggests that the high‐open‐probability gating mode of AMPA receptors containing the auxiliary subunit transmembrane AMPA receptor regulatory protein γ‐2 makes a substantial contribution to phase 2. We argue that the two synaptic components arise from AMPA receptors with different functional signatures and synaptic distributions. Comparisons of voltage‐ and current‐clamp responses obtained from the same Purkinje cells indicate that phase 1 of the EPSC arises from synapses ideally suited to transmit short bursts of action potentials, whereas phase 2 is likely to arise from low‐release‐probability or ‘silent’ synapses that are recruited during longer bursts. PMID:27094216

  2. An operational system of fire danger rating over Mediterranean Europe

    NASA Astrophysics Data System (ADS)

    Pinto, Miguel M.; DaCamara, Carlos C.; Trigo, Isabel F.; Trigo, Ricardo M.

    2017-04-01

    A methodology is presented to assess fire danger based on the probability of exceedance of prescribed thresholds of daily released energy. The procedure is developed and tested over Mediterranean Europe, defined by the latitude circles of 35 and 45°N and the meridians of 10°W and 27.5°E, for the period 2010-2016. The procedure involves estimating so-called static and daily probabilities of exceedance. For a given point, the static probability is estimated by the ratio of the number of daily fire occurrences releasing energy above a given threshold to the total number of occurrences inside a cell centred at the point. The daily probability of exceedance, which takes into account meteorological factors by means of the Canadian Fire Weather Index (FWI), is in turn estimated based on a Generalized Pareto distribution with static probability and FWI as covariates of the scale parameter. The rationale of the procedure is that small fires, assessed by the static probability, have a weak dependence on weather, whereas larger fires strongly depend on concurrent meteorological conditions. It is shown that observed frequencies of exceedance over the study area for the period 2010-2016 match the estimated values of probability based on the developed models for static and daily probabilities of exceedance. Some small variability is, however, found between years, suggesting that refinements can be made in future work by using a larger sample to further increase the robustness of the method. The developed methodology has the advantage of evaluating fire danger with the same criteria over the whole study area, making it a good basis for harmonizing fire danger forecasts and forest management studies. Research was performed within the framework of the EUMETSAT Satellite Application Facility for Land Surface Analysis (LSA SAF). 
Some of the methods developed and results obtained form the basis of a platform, supported by The Navigator Company, that currently provides fire weather danger information for Portugal to a wide range of users.
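A minimal sketch of the two-layer construction described above: an empirical static probability of exceedance per cell, and a daily probability from a Generalized Pareto tail whose scale parameter depends on FWI and the static probability. The shape parameter and the log-linear link coefficients below are assumptions for illustration, not the paper's fitted values.

```python
import math

def static_probability(energies, threshold):
    """Empirical static probability: fraction of fire occurrences in a
    cell whose daily released energy exceeded the threshold."""
    return sum(1 for e in energies if e > threshold) / len(energies)

def gpd_exceedance(x, shape, scale):
    """P(X > x) for a Generalized Pareto tail with shape xi, scale sigma."""
    if abs(shape) < 1e-12:
        return math.exp(-x / scale)          # exponential limit, xi -> 0
    return max(0.0, 1.0 + shape * x / scale) ** (-1.0 / shape)

def daily_exceedance(x, fwi, p_static, shape=0.2, a=1.0, b=0.05, c=2.0):
    """Daily probability of exceedance, with the GPD scale modelled as a
    log-linear function of FWI and the static probability (the link and
    all numeric parameters here are assumptions, not fitted values)."""
    scale = math.exp(a + b * fwi + c * p_static)
    return gpd_exceedance(x, shape, scale)

# hypothetical cell: 200 fire occurrences, 30 of them above the threshold
p_s = static_probability([1.0] * 170 + [9.0] * 30, threshold=5.0)
print(p_s, daily_exceedance(50.0, fwi=30.0, p_static=p_s))
```

Higher FWI inflates the scale parameter and thus the daily probability of a large energy release, matching the rationale that large fires depend strongly on concurrent weather.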

  3. 75 FR 67998 - Notice of Inventory Completion: Western Michigan University, Anthropology Department, Kalamazoo, MI

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-04

    ... long bone shaft, 1 possible black bear phalanx, 1 possible crane carpometacarpus, 1 raptor carpometacarpus, 1 possible small bird long bone, 1 unidentified non-human cranium fragment, 2 bird or small mammal long bones and 2 probable bird phalanxes. In 1972, Middle Woodland period ceramic sherds were...

  4. Energy Distributions in Small Populations: Pascal versus Boltzmann

    ERIC Educational Resources Information Center

    Kugel, Roger W.; Weiner, Paul A.

    2010-01-01

    The theoretical distributions of a limited amount of energy among small numbers of particles with discrete, evenly-spaced quantum levels are examined systematically. The average populations of energy states reveal the pattern of Pascal's triangle. An exact formula for the probability that a particle will be in any given energy state is derived.…
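The counting behind the Pascal-triangle pattern can be reproduced directly: with q indistinguishable quanta shared among n particles and all microstates equally likely, the probability that a given particle holds i quanta follows from stars-and-bars binomials. A sketch (the article's exact derivation and notation may differ):

```python
from math import comb

def state_probability(q, n, i):
    """Probability that one particular particle holds exactly i quanta when
    q indistinguishable quanta are shared by n >= 2 particles and every
    microstate is equally likely (stars-and-bars counting)."""
    total = comb(q + n - 1, n - 1)      # all distributions of q quanta
    ways = comb(q - i + n - 2, n - 2)   # distributions with i quanta here
    return ways / total

# for small systems the state counts line up with rows of Pascal's triangle
q, n = 3, 3
dist = [state_probability(q, n, i) for i in range(q + 1)]
print(dist)  # → [0.4, 0.3, 0.2, 0.1]
```

As q and n grow, this exact distribution approaches the exponential Boltzmann form, which is the comparison the title draws.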

  5. Technology Tips: Sample Too Small? Probably Not!

    ERIC Educational Resources Information Center

    Strayer, Jeremy F.

    2013-01-01

    Statistical studies are referenced in the news every day, so frequently that people are sometimes skeptical of reported results. Often, no matter how large a sample size researchers use in their studies, people believe that the sample size is too small to make broad generalizations. The tasks presented in this article use simulations of repeated…

  6. Stability Criteria Analysis for Landing Craft Utility

    DTIC Science & Technology

    2017-12-01

    Symbol list (excerpt): … Square meter; m/s Meters per Second; m/s2 Meters per Second Squared; n Vertical Displacement of Sea Water Free Surface; n3 Ship's Heave … Displacement; n5 Ship's Pitch Angle; p(ξ) Rayleigh Distribution Probability Function; POSSE Program of Ship Salvage Engineering; pk …; … Spectrum Constant; γ JONSWAP Wave Spectrum Peak Factor; Γ(λ) Gamma Probability Function; Δ Ship's Displacement; Δω Small Frequency

  7. Statistical evaluation of vibration analysis techniques

    NASA Technical Reports Server (NTRS)

    Milner, G. Martin; Miller, Patrice S.

    1987-01-01

    An evaluation methodology is presented for a selection of candidate vibration analysis techniques applicable to machinery representative of the environmental control and life support system of advanced spacecraft; illustrative results are given. Attention is given to the statistical analysis of small sample experiments, the quantification of detection performance for diverse techniques through the computation of probability of detection versus probability of false alarm, and the quantification of diagnostic performance.
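The detection-versus-false-alarm quantification mentioned above amounts to tracing an empirical ROC curve; a minimal sketch, assuming scalar detector scores for nominal and faulty machinery runs:

```python
def roc_points(healthy_scores, fault_scores):
    """Empirical (P_FA, P_D) pairs: sweep a decision threshold over all
    observed detector scores; a detection is declared when score >= t."""
    thresholds = sorted(set(healthy_scores) | set(fault_scores))
    points = []
    for t in thresholds:
        p_fa = sum(s >= t for s in healthy_scores) / len(healthy_scores)
        p_d = sum(s >= t for s in fault_scores) / len(fault_scores)
        points.append((p_fa, p_d))
    return points

# hypothetical vibration-statistic scores for nominal vs seeded-fault runs
print(roc_points([0.1, 0.2, 0.3], [0.8, 0.9]))
```

Comparing curves from different analysis techniques on the same data quantifies which technique buys more detection probability at a fixed false-alarm rate.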

  8. Fine flow structures in the transition region small-scale loops

    NASA Astrophysics Data System (ADS)

    Yan, L.; Peter, H.; He, J.; Wei, Y.

    2016-12-01

    Observations and models have suggested that the transition region EUV emission from the quiet Sun is contributed by very small-scale loops that had not been resolved. Recently, observations from IRIS have revealed this kind of small-scale loop. Based on the high-resolution spectral and imaging observations from IRIS, much more detailed work is needed to reveal the fine flow features in these loops and help us understand loop heating. Here, we present a detailed statistical study of the spatial and temporal evolution of Si IV line profiles of small-scale loops and report the spectral features: there is a transition from blue (red) wing enhancement dominant to red (blue) wing enhancement dominant along the cross-section of the loop, which is independent of time. This feature emerges as the loop appears and disappears once the loop is no longer visible. This is probably the signature of helical flow along the loop. The result suggests that the brightening of this kind of loop is probably due to current dissipation heating in a twisted magnetic flux tube.

  9. Wormlike Chain Theory and Bending of Short DNA

    NASA Astrophysics Data System (ADS)

    Mazur, Alexey K.

    2007-05-01

    The probability distributions for bending angles in double helical DNA obtained in all-atom molecular dynamics simulations are compared with theoretical predictions. The computed distributions remarkably agree with the wormlike chain theory and qualitatively differ from predictions of the subelastic chain model. The computed data exhibit only small anomalies in the apparent flexibility of short DNA and cannot account for the recently reported AFM data. It is possible that the current atomistic DNA models miss some essential mechanisms of DNA bending on intermediate length scales. Analysis of bent DNA structures reveals, however, that the bending motion is structurally heterogeneous and directionally anisotropic on the length scales where the experimental anomalies were detected. These effects are essential for the interpretation of the experimental data, and they can also be responsible for the apparent discrepancy.

  10. Genetic Algorithms and Classification Trees in Feature Discovery: Diabetes and the NHANES database

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heredia-Langner, Alejandro; Jarman, Kristin H.; Amidan, Brett G.

    2013-09-01

    This paper presents a feature selection methodology that can be applied to datasets containing a mixture of continuous and categorical variables. Using a Genetic Algorithm (GA), this method explores a dataset and selects a small set of features relevant for the prediction of a binary (1/0) response. Binary classification trees and an objective function based on conditional probabilities are used to measure the fitness of a given subset of features. The method is applied to health data in order to find factors useful for the prediction of diabetes. Results show that our algorithm is capable of narrowing down the set of predictors to around 8 factors that can be validated using reputable medical and public health resources.
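A toy version of a GA-based feature search of this kind (binary masks, tournament selection, one-point crossover, bit-flip mutation) can be sketched as follows; the fitness function here is a stand-in, not the paper's conditional-probability objective built on classification trees:

```python
import random

def ga_feature_select(n_features, fitness, pop_size=30, generations=60,
                      p_mut=0.05, seed=0):
    """Toy genetic algorithm over binary feature masks: tournament
    selection, one-point crossover, bit-flip mutation, 2-mask elitism.
    `fitness` maps a 0/1 tuple to a score to be maximized."""
    rng = random.Random(seed)
    pop = [tuple(rng.randint(0, 1) for _ in range(n_features))
           for _ in range(pop_size)]
    for _ in range(generations):
        next_pop = sorted(pop, key=fitness, reverse=True)[:2]  # elitism
        while len(next_pop) < pop_size:
            a = max(rng.sample(pop, 3), key=fitness)   # tournament picks
            b = max(rng.sample(pop, 3), key=fitness)
            cut = rng.randrange(1, n_features)         # one-point crossover
            child = tuple(bit ^ (rng.random() < p_mut)  # bit-flip mutation
                          for bit in a[:cut] + b[cut:])
            next_pop.append(child)
        pop = next_pop
    return max(pop, key=fitness)

# stand-in fitness: reward picking features {0, 3, 5}, penalize extras,
# mimicking a parsimony-seeking objective
target = {0, 3, 5}
def fitness(mask):
    chosen = {i for i, b in enumerate(mask) if b}
    return len(chosen & target) - 0.3 * len(chosen - target)

best = ga_feature_select(10, fitness)
```

In the paper's setting, `fitness` would instead train a binary classification tree on the masked features and score it by conditional probabilities of correct prediction.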

  11. The potential benefits of a new poliovirus vaccine for long-term poliovirus risk management.

    PubMed

    Duintjer Tebbens, Radboud J; Thompson, Kimberly M

    2016-12-01

    The aim was to estimate the incremental net benefits (INBs) of a hypothetical ideal vaccine, one with all of the advantages and none of the disadvantages of existing oral and inactivated poliovirus vaccines, compared with current vaccines available for future outbreak response. INBs were estimated from the expected costs and polio cases of an existing global model of long-term poliovirus risk management. Excluding development costs, an ideal poliovirus vaccine could offer expected INBs of US$1.6 billion. The ideal vaccine yields small benefits in most realizations of long-term risks, but great benefits in low-probability, high-consequence realizations. New poliovirus vaccines may offer valuable insurance against long-term poliovirus risks, and new vaccine development efforts should continue as the world gathers more evidence about polio endgame risks.

  12. Exposure to an extremely low-frequency electromagnetic field only slightly modifies the proteome of Chromobacterium violaceumATCC 12472.

    PubMed

    Baraúna, Rafael A; Santos, Agenor V; Graças, Diego A; Santos, Daniel M; Ghilardi, Rubens; Pimenta, Adriano M C; Carepo, Marta S P; Schneider, Maria P C; Silva, Artur

    2015-05-01

    Several studies of the physiological responses of different organisms exposed to extremely low-frequency electromagnetic fields (ELF-EMF) have been described. In this work, we report the minimal effects of in situ exposure to ELF-EMF on the global protein expression of Chromobacterium violaceum using a gel-based proteomic approach. The protein expression profile was only slightly altered, with five differentially expressed proteins detected in the exposed cultures; two of these proteins (DNA-binding stress protein, Dps, and alcohol dehydrogenase) were identified by MS/MS. The enhanced expression of Dps possibly helped to prevent physical damage to DNA. Although small, the changes in protein expression observed here were probably beneficial in helping the bacteria to adapt to the stress generated by the electromagnetic field.

  13. Metallothionein--a promising tool for cancer diagnostics.

    PubMed

    Krizkova, S; Fabrik, I; Adam, V; Hrabeta, J; Eckschlager, T; Kizek, R

    2009-01-01

    The latest research outcomes indicate that metallothionein (MT) levels in peripheral blood and serum from cancer patients can provide much useful information about the type or clinical stage of the disease, or the response to therapy. MT plays a key role in the transport of essential heavy metals, detoxification of toxic metals and protection of cells against oxidative stress. Serum MT levels of cancer patients are three times higher than those of control patients (0.5 microM). The elevated MT levels in cancer cells are probably related to their increased proliferation and protection against apoptosis. Automated electrochemical detection of MT allows its serial analysis in a very small volume with excellent sensitivity, reliability and reproducibility, and therefore it can be considered a new tool for cancer diagnosis (Fig. 4, Ref. 55).

  14. Factors affecting breeding dispersal of European ducks on Engure Marsh, Latvia

    USGS Publications Warehouse

    Blums, P.; Nichols, J.D.; Lindberg, M.S.; Hines, J.E.; Mednis, A.

    2003-01-01

    1. We used up to 35 years of capture-recapture data from nearly 3300 individual female ducks nesting on Engure Marsh, Latvia, and multistate modelling to test predictions about the influence of environmental, habitat and management factors on breeding dispersal probability within the marsh. 2. Analyses based on observed dispersal distances of common pochards and tufted ducks provided no evidence that breeding success in year t influenced dispersal distance between t and t + 1. 3. Breeding dispersal distances (year t to t + 1) of pochards and tufted ducks were associated with a delay in relative nest initiation dates in year t + 1. The delay was greater for pochards (c. 4 days) than for tufted ducks (c. 2 days) when females dispersed > 0.8 km. 4. Northern shovelers and tufted ducks moved from a large island to small islands at low water levels and from small islands to the large island at high water levels before the construction of elevated small islands (1960-82). Following this habitat management (1983-94), breeding fidelity was extremely high and not influenced by water level in the marsh for either species. 5. Because pochard nesting habitats in black-headed gull colonies were saturated during the entire study period, nesting females moved into and out of colonies with similar probabilities. Local survival probabilities and incubation body masses were higher for both yearlings (SY) and adults (ASY) nesting within gull colonies, suggesting that these females were of better quality than females nesting outside of the colonies. 6. Tufted ducks showed higher probabilities of moving from islands to emergent marshes when water levels were higher both before and after habitat management. However, rates of movement for a given water level were higher during the period before management than after. 7. 
Both pochards and tufted ducks exhibited asymmetric movement with respect to proximity to water, with higher movement probabilities to near-water nesting locations than away from these locations. 8. Multistate capture-recapture models provided analyses that were useful in investigating sources of variation in breeding dispersal probabilities.

  15. Subtle variation in shade avoidance responses may have profound consequences for plant competitiveness.

    PubMed

    Bongers, Franca J; Pierik, Ronald; Anten, Niels P R; Evers, Jochem B

    2017-12-21

    Although phenotypic plasticity has been shown to be beneficial for plant competitiveness for light, there is limited knowledge on how variation in these plastic responses plays a role in determining competitiveness. A combination of detailed plant experiments and functional-structural plant (FSP) modelling was used that captures the complex dynamic feedback between the changing plant phenotype and the within-canopy light environment in time and 3-D space. Leaf angle increase (hyponasty) and changes in petiole elongation rates in response to changes in the ratio between red and far-red light, two important shade avoidance responses in Arabidopsis thaliana growing in dense population stands, were chosen as a case study for plant plasticity. Measuring and implementing these responses into an FSP model allowed simulation of plant phenotype as an emergent property of the underlying growth and response mechanisms. Both the experimental and model results showed that substantial differences in competitiveness may arise between genotypes with only marginally different hyponasty or petiole elongation responses, due to the amplification of plant growth differences by small changes in plant phenotype. In addition, this study illustrated that strong competitive responses do not necessarily have to result in a tragedy of the commons; success in competition at the expense of community performance. Together, these findings indicate that selection pressure could probably have played a role in fine-tuning the sensitive shade avoidance responses found in plants. The model approach presented here provides a novel tool to analyse further how natural selection could have acted on the evolution of plastic responses.

  16. Pasture size effects on the ability of off-stream water or restricted stream access to alter the spatial/temporal distribution of grazing beef cows.

    PubMed

    Bisinger, J J; Russell, J R; Morrical, D G; Isenhart, T M

    2014-08-01

    For 2 grazing seasons, effects of pasture size, stream access, and off-stream water on cow distribution relative to a stream were evaluated in six 12.1-ha cool-season grass pastures. Two pasture sizes (small [4.0 ha] and large [12.1 ha]) with 3 management treatments (unrestricted stream access without off-stream water [U], unrestricted stream access with off-stream water [UW], and stream access restricted to a stabilized stream crossing [R]) were alternated between pasture sizes every 2 wk for 5 consecutive 4-wk intervals in each grazing season. Small and large pastures were stocked with 5 and 15 August-calving cows from mid May through mid October. At 10-min intervals, cow location was determined with Global Positioning System collars fitted on 2 to 3 cows in each pasture and identified when observed in the stream (0-10 m from the stream) or riparian (0-33 m from the stream) zones and ambient temperature was recorded with on-site weather stations. Over all intervals, cows were observed more (P ≤ 0.01) frequently in the stream and riparian zones of small than large pastures regardless of management treatment. Cows in R pastures had 24 and 8% less (P < 0.01) observations in the stream and riparian zones than U or UW pastures regardless of pasture size. Off-stream water had little effect on the presence of cows in or near pasture streams regardless of pasture size. In 2011, the probability of cow presence in the stream and riparian zones increased at greater (P < 0.04) rates as ambient temperature increased in U and UW pastures than in 2010. As ambient temperature increased, the probability of cow presence in the stream and riparian zones increased at greater (P < 0.01) rates in small than large pastures. Across pasture sizes, the probability of cow presence in the stream and riparian zone increased less (P < 0.01) with increasing ambient temperatures in R than U and UW pastures. 
Rates of increase in the probability of cow presence in shade (within 10 m of tree drip lines) in the total pasture with increasing temperatures did not differ between treatments. However, probability of cow presence in riparian shade increased at greater (P < 0.01) rates in small than large pastures. Pasture size was a major factor affecting congregation of cows in or near pasture streams with unrestricted access.

  17. On the use of secondary capture-recapture samples to estimate temporary emigration and breeding proportions

    USGS Publications Warehouse

    Kendall, W.L.; Nichols, J.D.; North, P.M.; Nichols, J.D.

    1995-01-01

    The use of the Cormack- Jolly-Seber model under a standard sampling scheme of one sample per time period, when the Jolly-Seber assumption that all emigration is permanent does not hold, leads to the confounding of temporary emigration probabilities with capture probabilities. This biases the estimates of capture probability when temporary emigration is a completely random process, and both capture and survival probabilities when there is a temporary trap response in temporary emigration, or it is Markovian. The use of secondary capture samples over a shorter interval within each period, during which the population is assumed to be closed (Pollock's robust design), provides a second source of information on capture probabilities. This solves the confounding problem, and thus temporary emigration probabilities can be estimated. This process can be accomplished in an ad hoc fashion for completely random temporary emigration and to some extent in the temporary trap response case, but modelling the complete sampling process provides more flexibility and permits direct estimation of variances. For the case of Markovian temporary emigration, a full likelihood is required.
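For the completely random case, the ad hoc estimator described above can be written in one line: the CJS analysis estimates the product of availability and true capture probability, while the closed-population model fitted to the secondary samples estimates capture probability alone. A sketch with hypothetical estimates:

```python
def temporary_emigration(p_open, p_closed):
    """Ad hoc estimator for completely random temporary emigration:
    the open-population (CJS) estimate confounds availability with
    capture, p_open = (1 - gamma) * p_closed, so gamma = 1 - ratio."""
    return 1.0 - p_open / p_closed

# hypothetical estimates: CJS gives 0.42, the closed-population model
# fitted to Pollock's robust-design secondary samples gives 0.60
gamma_hat = temporary_emigration(0.42, 0.60)
print(gamma_hat)  # ≈ 0.3: about 30% of animals temporarily off the area
```

As the record notes, this simple ratio suffices only for completely random emigration; trap-response and Markovian emigration require modelling the full sampling process (a full likelihood).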

  18. Investigating flight response of Pacific brant to helicopters at Izembek Lagoon, Alaska by using logistic regression

    USGS Publications Warehouse

    Erickson, Wallace P.; Nick, Todd G.; Ward, David H.; Peck, Roxy; Haugh, Larry D.; Goodman, Arnold

    1998-01-01

    Izembek Lagoon, an estuary in Alaska, is a very important staging area for Pacific brant, a small migratory goose. Each fall, nearly the entire Pacific Flyway population of 130,000 brant flies to Izembek Lagoon and feeds on eelgrass to accumulate fat reserves for nonstop transoceanic migration to wintering areas as distant as Mexico. In the past 10 years, offshore drilling activities in this area have increased, and, as a result, the air traffic in and out of the nearby Cold Bay airport has also increased. There has been a concern that this increased air traffic could affect the brant by disturbing them from their feeding and resting activities, which in turn could result in reduced energy intake and buildup. This may increase the mortality rates during their migratory journey. Because of these concerns, a study was conducted to investigate the flight response of brant to overflights of large helicopters. Response was measured on flocks during experimental overflights of large helicopters flown at varying altitudes and lateral (perpendicular) distances from the flocks. Logistic regression models were developed for predicting probability of flight response as a function of these distance variables. Results of this study may be used in the development of new FAA guidelines for aircraft near Izembek Lagoon.
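A minimal sketch of the kind of logistic regression model described above, fitted here by plain stochastic gradient ascent on made-up flock data (the study's actual data and covariate coding are not reproduced):

```python
import math

def predict(w, x):
    """P(flight) = sigmoid(b0 + b1*altitude + b2*lateral_distance)."""
    z = w[0] + sum(wi * xi for wi, xi in zip(w[1:], x))
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(xs, ys, lr=0.1, epochs=2000):
    """Logistic regression fitted by stochastic gradient ascent on the
    log-likelihood; fine for a tiny illustrative dataset."""
    w = [0.0] * (len(xs[0]) + 1)
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            err = y - predict(w, x)      # gradient of the log-likelihood
            w[0] += lr * err
            for j, xj in enumerate(x):
                w[j + 1] += lr * err * xj
    return w

# made-up flocks: (altitude_km, lateral_km) -> 1 if the flock took flight
data = [((0.1, 0.2), 1), ((0.2, 0.1), 1), ((0.3, 0.5), 1),
        ((1.5, 2.0), 0), ((1.2, 1.8), 0), ((2.0, 1.0), 0)]
xs, ys = zip(*data)
w = fit_logistic(xs, ys)
print(predict(w, (0.2, 0.2)), predict(w, (1.8, 1.8)))
```

The fitted curve gives the predicted probability of a flight response as a function of altitude and lateral distance, which is exactly the form of output that guideline-setting would draw on.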

  19. Albumin elicits calcium signals from astrocytes in brain slices from neonatal rat cortex

    PubMed Central

    Nadal, Angel; Sul, Jai-Yoon; Valdeolmillos, Miguel; McNaughton, Peter A

    1998-01-01

    Albumin causes calcium signals and mitosis in cultured astrocytes, but it has not been established whether astrocytes in intact brain also respond to albumin. The effect of albumin on intracellular calcium concentration ([Ca2+]i) in single cells was therefore studied in acutely isolated cortical brain slices from the neonatal rat. Physiological concentrations of albumin from plasma and from serum produced an increase in [Ca2+]i in a subpopulation of cortical cells. Trains of transient elevations in [Ca2+]i (Ca2+ spikes) were seen in 41% of these cells. The cells responding to albumin are identified as astrocytes because the neurone-specific agonist NMDA caused much smaller and slower responses in these cells. On the other hand, NMDA-responsive cells, which are probably neurones, exhibited only small and slow responses to albumin. The residual responses of astrocytes to NMDA and of neurones to albumin are likely to be due to crosstalk with adjacent neurones and astrocytes, respectively. Methanol extraction of albumin removes a polar lipid and abolishes the ability of albumin to increase intracellular calcium. Astrocyte calcium signalling caused by albumin may have important physiological consequences when the blood-brain barrier breaks down and allows albumin to enter the CNS. PMID:9596793

  20. Effects of preparation time and trial type probability on performance of anti- and pro-saccades.

    PubMed

    Pierce, Jordan E; McDowell, Jennifer E

    2016-02-01

    Cognitive control optimizes responses to relevant task conditions by balancing bottom-up stimulus processing with top-down goal pursuit. It can be investigated using the ocular motor system by contrasting basic prosaccades (look toward a stimulus) with complex antisaccades (look away from a stimulus). Furthermore, the amount of time allotted between trials, the need to switch task sets, and the time allowed to prepare for an upcoming saccade all impact performance. In this study the relative probabilities of anti- and pro-saccades were manipulated across five blocks of interleaved trials, while the inter-trial interval and trial type cue duration were varied across subjects. Results indicated that inter-trial interval had no significant effect on error rates or reaction times (RTs), while a shorter trial type cue led to more antisaccade errors and faster overall RTs. Responses following a shorter cue duration also showed a stronger effect of trial type probability, with more antisaccade errors in blocks with a low antisaccade probability and slower RTs for each saccade task when its trial type was unlikely. A longer cue duration yielded fewer errors and slower RTs, with a larger switch cost for errors compared to a short cue duration. Findings demonstrated that when the trial type cue duration was shorter, visual motor responsiveness was faster and subjects relied upon the implicit trial probability context to improve performance. When the cue duration was longer, increased fixation-related activity may have delayed saccade motor preparation and slowed responses, guiding subjects to respond in a controlled manner regardless of trial type probability. Copyright © 2016 Elsevier B.V. All rights reserved.

  1. Diagnostic Performance of (18)F-Fluorodeoxyglucose in 162 Small Pulmonary Nodules Incidentally Detected in Subjects Without a History of Malignancy.

    PubMed

    Calcagni, Maria Lucia; Taralli, Silvia; Cardillo, Giuseppe; Graziano, Paolo; Ialongo, Pasquale; Mattoli, Maria Vittoria; Di Franco, Davide; Caldarella, Carmelo; Carleo, Francesco; Indovina, Luca; Giordano, Alessandro

    2016-04-01

    The solitary pulmonary nodule (SPN) still represents a diagnostic challenge. The aim of our study was to evaluate the diagnostic performance of (18)F-fluorodeoxyglucose positron emission tomography-computed tomography in one of the largest samples of small SPNs, incidentally detected in subjects without a history of malignancy (nonscreening population) and undetermined at computed tomography. One hundred and sixty-two small (>0.8 to 1.5 cm) and, for comparison, 206 large nodules (>1.5 to 3 cm) were retrospectively evaluated. Diagnostic performance of (18)F-fluorodeoxyglucose visual analysis, receiver-operating characteristic (ROC) analysis for maximum standardized uptake value (SUVmax), and Bayesian analysis were assessed using histology or radiological follow-up as the gold standard. In 162 small nodules, (18)F-fluorodeoxyglucose visual and ROC analyses (SUVmax = 1.3) provided 72.6% and 77.4% sensitivity and 88.0% and 82.0% specificity, respectively. The prevalence of malignancy was 38%; Bayesian analysis provided 78.8% positive and 16.0% negative posttest probabilities of malignancy. In 206 large nodules, (18)F-fluorodeoxyglucose visual and ROC analyses (SUVmax = 1.9) provided 89.5% and 85.1% sensitivity and 70.8% and 79.2% specificity, respectively. The prevalence of malignancy was 65%; Bayesian analysis provided 85.0% positive and 21.6% negative posttest probabilities of malignancy. In both groups, malignant nodules had a significantly higher SUVmax (p < 0.0001) than benign nodules. Only in the small group were malignant nodules significantly larger (p = 0.0054) than benign ones. (18)F-fluorodeoxyglucose can be clinically relevant to rule in and rule out malignancy in undetermined small SPNs, incidentally detected in a nonscreening population with intermediate pretest probability of malignancy, as well as in larger ones. 
Visual analysis can be considered an optimal diagnostic criterion, adequately detecting a wide range of malignant nodules with different metabolic activity. Copyright © 2016 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.
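
    The posttest probabilities quoted above follow from Bayes' rule applied to the reported visual-analysis sensitivity, specificity, and prevalence. A minimal check (the function name is illustrative, not from the paper):

```python
def posttest_probabilities(sensitivity, specificity, prevalence):
    """Probability of malignancy after a positive / negative test (Bayes' rule)."""
    p_pos = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)
    post_pos = sensitivity * prevalence / p_pos              # P(malignant | test+)
    p_neg = (1 - sensitivity) * prevalence + specificity * (1 - prevalence)
    post_neg = (1 - sensitivity) * prevalence / p_neg        # P(malignant | test-)
    return post_pos, post_neg

# Small nodules: visual analysis, prevalence of malignancy 38%
pos, neg = posttest_probabilities(0.726, 0.880, 0.38)
print(f"positive posttest: {pos:.1%}, negative posttest: {neg:.1%}")
# → positive posttest: 78.8%, negative posttest: 16.0%

# Large nodules: visual analysis, prevalence 65% (≈ the reported 85.0% and 21.6%)
pos_l, neg_l = posttest_probabilities(0.895, 0.708, 0.65)
```

    The small-nodule figures reproduce the abstract's 78.8%/16.0% exactly; the large-nodule figures agree to within rounding.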

  2. The price of complexity in financial networks

    NASA Astrophysics Data System (ADS)

    Battiston, Stefano; Caldarelli, Guido; May, Robert M.; Roukny, Tarik; Stiglitz, Joseph E.

    2016-09-01

    Financial institutions form multilayer networks by engaging in contracts with each other and by holding exposures to common assets. As a result, the default probability of one institution depends on the default probability of all of the other institutions in the network. Here, we show how small errors in the knowledge of the network of contracts can lead to large errors in the probability of systemic defaults. From the point of view of financial regulators, our findings show that the complexity of financial networks may decrease the ability to mitigate systemic risk, and thus it may increase the social cost of financial crises.

  3. Small violations of Bell inequalities for multipartite pure random states

    NASA Astrophysics Data System (ADS)

    Drumond, Raphael C.; Duarte, Cristhiano; Oliveira, Roberto I.

    2018-05-01

    For any finite number of parts, measurements, and outcomes in a Bell scenario, we estimate the probability of random N-qudit pure states to substantially violate any Bell inequality with uniformly bounded coefficients. We prove that under some conditions on the local dimension, the probability to find any significant amount of violation goes to zero exponentially fast as the number of parts goes to infinity. In addition, we also prove that if the number of parts is at least 3, this probability also goes to zero as the local Hilbert space dimension goes to infinity.

  4. The price of complexity in financial networks.

    PubMed

    Battiston, Stefano; Caldarelli, Guido; May, Robert M; Roukny, Tarik; Stiglitz, Joseph E

    2016-09-06

    Financial institutions form multilayer networks by engaging in contracts with each other and by holding exposures to common assets. As a result, the default probability of one institution depends on the default probability of all of the other institutions in the network. Here, we show how small errors in the knowledge of the network of contracts can lead to large errors in the probability of systemic defaults. From the point of view of financial regulators, our findings show that the complexity of financial networks may decrease the ability to mitigate systemic risk, and thus it may increase the social cost of financial crises.

  5. Structure and function of small heat shock/alpha-crystallin proteins: established concepts and emerging ideas.

    PubMed

    MacRae, T H

    2000-06-01

    Small heat shock/alpha-crystallin proteins are defined by conserved sequence of approximately 90 amino acid residues, termed the alpha-crystallin domain, which is bounded by variable amino- and carboxy-terminal extensions. These proteins form oligomers, most of uncertain quaternary structure, and oligomerization is prerequisite to their function as molecular chaperones. Sequence modelling and physical analyses show that the secondary structure of small heat shock/alpha-crystallin proteins is predominately beta-pleated sheet. Crystallography, site-directed spin-labelling and yeast two-hybrid selection demonstrate regions of secondary structure within the alpha-crystallin domain that interact during oligomer assembly, a process also dependent on the amino terminus. Oligomers are dynamic, exhibiting subunit exchange and organizational plasticity, perhaps leading to functional diversity. Exposure of hydrophobic residues by structural modification facilitates chaperoning where denaturing proteins in the molten globule state associate with oligomers. The flexible carboxy-terminal extension contributes to chaperone activity by enhancing the solubility of small heat shock/alpha-crystallin proteins. Site-directed mutagenesis has yielded proteins where the effect of the change on structure and function depends upon the residue modified, the organism under study and the analytical techniques used. Most revealing, substitution of a conserved arginine residue within the alpha-crystallin domain has a major impact on quaternary structure and chaperone action probably through realignment of beta-sheets. These mutations are linked to inherited diseases. Oligomer size is regulated by a stress-responsive cascade including MAPKAP kinase 2/3 and p38. Phosphorylation of small heat shock/alpha-crystallin proteins has important consequences within stressed cells, especially for microfilaments.

  6. Food stress causes sex-specific maternal effects in mites

    PubMed Central

    Walzer, Andreas; Schausberger, Peter

    2015-01-01

    Life history theory predicts that females should produce few large eggs under food stress and many small eggs when food is abundant. We tested this prediction in three female-biased size-dimorphic predatory mites feeding on herbivorous spider mite prey: Phytoseiulus persimilis, a specialized spider mite predator; Neoseiulus californicus, a generalist preferring spider mites; and Amblyseius andersoni, a broad-diet generalist. Irrespective of predator species and offspring sex, most females laid only one small egg under severe food stress. Irrespective of predator species, the number of female but not male eggs decreased with increasing maternal food stress. This sex-specific effect was probably due to the higher production costs of large female eggs relative to small male eggs. The complexity of the response to the varying availability of spider mite prey correlated with the predators' degree of adaptation to this prey. Most A. andersoni females did not oviposit under severe food stress, whereas N. californicus and P. persimilis did. Under moderate food stress, only P. persimilis increased its investment per offspring, at the expense of egg number, and produced few large female eggs. When prey was abundant, P. persimilis decreased female egg size at the expense of increased egg numbers, resulting in a sex-specific egg size/number trade-off. Maternal effects manifested only in N. californicus and P. persimilis. Small egg size correlated with the body size of daughters but not sons. Overall, our study provides a key example of sex-specific maternal effects: food stress during egg production more strongly affects the larger (female) than the smaller (male) offspring. PMID:26089530

  7. Dysphagia in supratentorial recent small subcortical infarcts results from bilateral pyramidal tract damage.

    PubMed

    Fandler, Simon; Gattringer, Thomas; Pinter, Daniela; Pirpamer, Lukas; Borsodi, Florian; Eppinger, Sebastian; Niederkorn, Kurt; Enzinger, Christian; Fazekas, Franz

    2018-01-01

    Background: Dysphagia occurs in up to 20% of patients with a recent small subcortical infarct, even when excluding brainstem infarcts. Aim: To examine the impact of lesion topography and concomitant cerebrovascular lesions on the occurrence of dysphagia in patients with a single supratentorial recent small subcortical infarct. Methods: We retrospectively identified all inpatients with magnetic resonance imaging-confirmed supratentorial recent small subcortical infarcts over a five-year period. Dysphagia was determined by speech-language therapists. Recent small subcortical infarcts were compiled into a standard brain model and compared using lesion probability maps. Furthermore, magnetic resonance imaging scans were reviewed for the combination of both acute and old cerebrovascular lesions. Results: A total of 243 patients with a recent small subcortical infarct were identified (mean age 67.9 ± 12.2 years). Of those, 29 had mild and 18 moderate-to-severe dysphagia. Lesion probability maps suggested no recent small subcortical infarct location favoring the occurrence of moderate-to-severe dysphagia. However, patients with moderate-to-severe dysphagia more frequently showed combined damage to both pyramidal tracts by the recent small subcortical infarct and a contralateral old lesion (lacune: 77.8% vs. 19.9%, p < 0.001; lacune or confluent white matter hyperintensities: 100% vs. 57.7%, p < 0.001) than patients without swallowing dysfunction. Comparable results were obtained when analyzing patients with any degree of dysphagia. Conclusions: Preexisting contralateral vascular pyramidal tract lesions are closely related to the occurrence of moderate-to-severe dysphagia in patients with supratentorial recent small subcortical infarcts.

  8. Suboptimal Decision Criteria Are Predicted by Subjectively Weighted Probabilities and Rewards

    PubMed Central

    Ackermann, John F.; Landy, Michael S.

    2014-01-01

    Subjects performed a visual detection task in which the probability of target occurrence at each of the two possible locations, and the rewards for correct responses for each, were varied across conditions. To maximize monetary gain, observers should bias their responses, choosing one location more often than the other in line with the varied probabilities and rewards. Typically, and in our task, observers do not bias their responses to the extent they should, and instead distribute their responses more evenly across locations, a phenomenon referred to as ‘conservatism.’ We investigated several hypotheses regarding the source of the conservatism. We measured utility and probability weighting functions under Prospect Theory for each subject in an independent economic choice task and used the weighting-function parameters to calculate each subject’s subjective utility (SU(c)) as a function of the criterion c, and the corresponding weighted optimal criteria (wcopt). Subjects’ criteria were not close to optimal relative to wcopt. The slope of SU(c) and of expected gain EG(c) at the neutral criterion corresponding to β = 1 were both predictive of subjects’ criteria. The slope of SU(c) was a better predictor of observers’ decision criteria overall. Thus, rather than behaving optimally, subjects move their criterion away from the neutral criterion by estimating how much they stand to gain by such a change based on the slope of subjective gain as a function of criterion, using inherently distorted probabilities and values. PMID:25366822

  9. Suboptimal decision criteria are predicted by subjectively weighted probabilities and rewards.

    PubMed

    Ackermann, John F; Landy, Michael S

    2015-02-01

    Subjects performed a visual detection task in which the probability of target occurrence at each of the two possible locations, and the rewards for correct responses for each, were varied across conditions. To maximize monetary gain, observers should bias their responses, choosing one location more often than the other in line with the varied probabilities and rewards. Typically, and in our task, observers do not bias their responses to the extent they should, and instead distribute their responses more evenly across locations, a phenomenon referred to as 'conservatism.' We investigated several hypotheses regarding the source of the conservatism. We measured utility and probability weighting functions under Prospect Theory for each subject in an independent economic choice task and used the weighting-function parameters to calculate each subject's subjective utility (SU(c)) as a function of the criterion c, and the corresponding weighted optimal criteria (wcopt). Subjects' criteria were not close to optimal relative to wcopt. The slope of SU(c) and of expected gain EG(c) at the neutral criterion corresponding to β = 1 were both predictive of the subjects' criteria. The slope of SU(c) was a better predictor of observers' decision criteria overall. Thus, rather than behaving optimally, subjects move their criterion away from the neutral criterion by estimating how much they stand to gain by such a change based on the slope of subjective gain as a function of criterion, using inherently distorted probabilities and values.
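
    The optimum the subjects fall short of can be sketched in a standard equal-variance Gaussian signal-detection model. This is not the authors' Prospect Theory analysis, just the textbook expected-gain optimum, with all numbers (d' = 1.5, a 3:1 probability manipulation) invented for illustration:

```python
import math

def optimal_criterion(d_prime, p_target, reward_target, reward_other):
    """Criterion (relative to the midpoint between the two distributions) that
    maximizes expected gain for equal-variance Gaussians:
    c_opt = ln(beta) / d', with beta = ((1 - p) * r_other) / (p * r_target)."""
    beta = ((1 - p_target) * reward_other) / (p_target * reward_target)
    return math.log(beta) / d_prime

neutral = optimal_criterion(1.5, 0.5, 1.0, 1.0)   # equal priors/rewards -> 0.0
shifted = optimal_criterion(1.5, 0.75, 1.0, 1.0)  # target 3x more probable
```

    'Conservatism' is the empirical finding that observers place their criterion between `neutral` and the optimum `shifted`, rather than at the optimum.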

  10. Performance analysis of the word synchronization properties of the outer code in a TDRSS decoder

    NASA Technical Reports Server (NTRS)

    Costello, D. J., Jr.; Lin, S.

    1984-01-01

    A self-synchronizing coding scheme for NASA's TDRSS satellite system is a concatenation of a (2,1,7) inner convolutional code with a (255,223) Reed-Solomon outer code. Both symbol and word synchronization are achieved without requiring that any additional symbols be transmitted. An important parameter which determines the performance of the word sync procedure is the ratio of the decoding failure probability to the undetected error probability. Ideally, the former should be as small as possible compared to the latter when the error correcting capability of the code is exceeded. A computer simulation of a (255,223) Reed-Solomon code was carried out. Results for decoding failure probability and for undetected error probability are tabulated and compared.
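
    The failure-to-undetected ratio can be roughly estimated for a bounded-distance decoder presented with a random (hopelessly corrupted) received word, using only the sphere-packing count for an MDS code. This back-of-the-envelope estimate is a sketch under those assumptions, not the simulation the paper reports:

```python
import math

q, n, k, t = 256, 255, 223, 16   # GF(256), (255,223) Reed-Solomon, corrects t = 16

# Volume of a Hamming sphere of radius t in GF(q)^n
sphere = sum(math.comb(n, i) * (q - 1) ** i for i in range(t + 1))

# Fraction of the q^n possible received words lying within distance t of some
# codeword (q^k codewords; spheres are disjoint since 2t + 1 = n - k + 1)
p_decode = sphere / q ** (n - k)   # = q^k * sphere / q^n

# For a random word, an undetected error occurs with probability ~p_decode;
# otherwise the decoder declares failure
ratio = (1 - p_decode) / p_decode
```

    Under these assumptions a random word almost always produces a detectable decoding failure rather than an undetected error, which is what makes decoding failures usable for word synchronization.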

  11. Proton cellular influx as a probable mechanism of variation potential influence on photosynthesis in pea.

    PubMed

    Sukhov, Vladimir; Sherstneva, Oksana; Surova, Lyubov; Katicheva, Lyubov; Vodeneev, Vladimir

    2014-11-01

    Electrical signals (action potential and variation potential, VP) caused by environmental stimuli are known to induce various physiological responses in plants, including changes in photosynthesis; however, their functional mechanisms remain unclear. In this study, the influence of VP on photosynthesis in pea (Pisum sativum L.) was investigated and the participation of protons in this process analysed. VP, induced by local heating, inactivated photosynthesis and activated respiration, with the initiation of the photosynthetic response being connected with inactivation of the dark stage of photosynthesis; however, a direct VP influence on the light stage was also probable. VP generation was accompanied by pH increases in the apoplast (0.17-0.30 pH unit) and decreases in the cytoplasm (0.18-0.60 pH unit), which probably reflected H(+)-ATPase inactivation and H(+) influx during this electrical event. Imitation of H(+) influx using the protonophore carbonyl cyanide m-chlorophenylhydrazone (CCCP) induced a photosynthetic response that was similar to the VP-induced response. Experiments on chloroplast suspensions showed that decreased external pH also induced an analogous response and that its magnitude depended on the magnitude of the pH change. Thus, the present results showed that proton influx into the cell was the probable mechanism of VP's influence on photosynthesis in pea. Potential modes of action for this influence are discussed. © 2014 John Wiley & Sons Ltd.

  12. Effects of stop-signal probability in the stop-signal paradigm: the N2/P3 complex further validated.

    PubMed

    Ramautar, J R; Kok, A; Ridderinkhof, K R

    2004-11-01

    The aim of this study was to examine the effects of frequency of occurrence of stop signals in the stop-signal paradigm. Presenting stop signals less frequently resulted in faster reaction times to the go stimulus and a lower probability of inhibition. Also, go stimuli elicited larger and somewhat earlier P3 responses when stop signals occurred less frequently. Since the amplitude effect was more pronounced on trials in which go signals were followed by fast rather than slow reactions, it probably reflected a stronger set to produce fast responses. N2 and P3 components to stop signals were observed to be larger and of longer latency when stop signals occurred less frequently. The amplitude enhancement of these N2 and P3 components was more pronounced for unsuccessful than for successful stop-signal trials. Moreover, the successfully inhibited stop trials elicited a frontocentral P3 whereas unsuccessfully inhibited stop trials elicited a more posterior P3 that resembled the classical P3b. P3 amplitude in the unsuccessfully inhibited condition also differed between waveforms synchronized with the stop signal and waveforms synchronized with response onset, whereas N2 amplitude did not. Taken together these findings suggest that N2 reflected a greater significance of failed inhibitions after low probability stop signals, while P3 reflected continued processing of the erroneous response after response execution.

  13. Sensitivity of the two-dimensional shearless mixing layer to the initial turbulent kinetic energy and integral length scale

    NASA Astrophysics Data System (ADS)

    Fathali, M.; Deshiri, M. Khoshnami

    2016-04-01

    The shearless mixing layer is generated from the interaction of two homogeneous isotropic turbulence (HIT) fields with different integral scales ℓ1 and ℓ2 and different turbulent kinetic energies E1 and E2. In this study, the sensitivity of temporal evolutions of two-dimensional, incompressible shearless mixing layers to the parametric variations of ℓ1/ℓ2 and E1/E2 is investigated. The sensitivity methodology is based on the nonintrusive approach, using direct numerical simulation and generalized polynomial chaos expansion. The analysis is carried out at Re_ℓ1 = 90 for the high-energy HIT region and different integral length scale ratios 1/4 ≤ ℓ1/ℓ2 ≤ 4 and turbulent kinetic energy ratios 1 ≤ E1/E2 ≤ 30. It is found that the most influential parameter on the variability of the mixing layer evolution is the turbulent kinetic energy, while variations of the integral length scale show a negligible influence on the flow field variability. A significant level of anisotropy and intermittency is observed in both large and small scales. In particular, it is found that large scales have higher levels of intermittency and sensitivity to the variations of ℓ1/ℓ2 and E1/E2 compared to the small scales. Reconstructed response surfaces of the flow field intermittency and the turbulent penetration depth show monotonic dependence on ℓ1/ℓ2 and E1/E2. The mixing layer growth rate and the mixing efficiency both show sensitive dependence on the initial condition parameters. However, the probability density function of these quantities shows relatively small solution variations in response to the variations of the initial condition parameters.

  14. Small scale photo probability sampling and vegetation classification in southeast Arizona as an ecological base for resource inventory. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Johnson, J. R. (Principal Investigator)

    1974-01-01

    The author has identified the following significant results. The broad scale vegetation classification was developed for a 3,200 sq mile area in southeastern Arizona. The 31 vegetation types were derived from association tables which contained information taken at about 500 ground sites. The classification provided an information base that was suitable for use with small scale photography. A procedure was developed and tested for objectively comparing photo images. The procedure consisted of two parts, image groupability testing and image complexity testing. The Apollo and ERTS photos were compared for relative suitability as first stage stratification bases in two stage proportional probability sampling. High altitude photography was used in common at the second stage.

  15. Striatal activity is modulated by target probability.

    PubMed

    Hon, Nicholas

    2017-06-14

    Target probability has well-known neural effects. In the brain, target probability is known to affect frontal activity, with lower probability targets producing more prefrontal activation than those that occur with higher probability. Although the effect of target probability on cortical activity is well specified, its effect on subcortical structures such as the striatum is less well understood. Here, I examined this issue and found that the striatum was highly responsive to target probability. This is consistent with its hypothesized role in the gating of salient information into higher-order task representations. The current data are interpreted in light of the fact that different components of the striatum are sensitive to different types of task-relevant information.

  16. Response preparation and intra-individual reaction time variability in schizophrenia.

    PubMed

    Dankinas, Denisas; Mėlynytė, Sigita; Šiurkutė, Aldona; Dapšys, Kastytis

    2016-01-01

    Background. It is important to prepare a response in advance to increase the efficiency of its execution. The process of response preparation is usually studied using the precueing paradigm, in which subjects have to employ preceding information about a further imperative stimulus to prepare their response, which shortens the reaction time of the subsequent response execution. Previous studies detected impairment of response preparation in schizophrenia only with electroencephalographic parameters, not by assessing reaction time. Therefore, in this study we attempted to find a behavioural parameter that could detect impaired response preparation in schizophrenia patients. It was recently found that appropriate response preparation not only shortens the reaction time but also increases its stability, as measured by intra-individual reaction time variability. It was also revealed that, in some studies of schizophrenia, response stability could detect cognitive dysfunction better than classical behavioural parameters. Hence, the main goal of this study was to verify whether intra-individual reaction time variability could detect impaired response preparation in schizophrenia patients. Materials and methods. To achieve this, we carried out a study with 14 schizophrenia patients and 14 control subjects, using a precueing paradigm in which participants had to employ information about stimulus probability for proper response preparation. Results. Our main result showed that, although the responses of schizophrenia patients were faster to the high-probability stimulus than to the low-probability one (F(1, 13) = 30.9, p < 0.001), intra-individual reaction time variability did not differ in this group between responses to more and less probable stimuli (F(1, 13) = 0.64, p = 0.44). Conclusions. Results of the study suggest that people with schizophrenia were able to use precueing probabilistic information only to shorten their reaction time, not to increase response stability. Thus, the intra-individual reaction time variability parameter can detect response preparation impairment in schizophrenia and could be used for clinical purposes.

  17. Small-world behaviour in a system of mobile elements

    NASA Astrophysics Data System (ADS)

    Manrubia, S. C.; Delgado, J.; Luque, B.

    2001-03-01

    We analyze the propagation of activity in a system of mobile automata. A number ρL^d of elements move as random walkers on a lattice of dimension d, while with a small probability p they can jump to any empty site in the system. We show that this system behaves as a Dynamic Small World (DSW) and present analytic and numerical results for several quantities. Our analysis shows that the persistence time T* (equivalent to the persistence size L* of small-world networks) scales as T* ~ (ρp)^(-τ), with τ = 1/(d + 1).
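
    The reported scaling law is simple enough to state as code; a sketch with illustrative parameter values (the prefactor is arbitrary and not given in the abstract):

```python
def persistence_time(rho, p, d, prefactor=1.0):
    """Predicted persistence time: T* ~ (rho * p)^(-tau), with tau = 1/(d + 1)."""
    tau = 1.0 / (d + 1)
    return prefactor * (rho * p) ** -tau

# in d = 2 (tau = 1/3), halving the jump probability p stretches T* by 2^(1/3)
t1 = persistence_time(rho=0.1, p=0.02, d=2)
t2 = persistence_time(rho=0.1, p=0.01, d=2)
```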

  18. Neural coding of repetitive clicks in the medial geniculate body of cat.

    PubMed

    Rouiller, E; de Ribaupierre, Y; Toros-Morel, A; de Ribaupierre, F

    1981-09-01

    The activity of 418 medial geniculate body (MGB) units was studied in response to repetitive acoustic pulses in 35 nitrous oxide anaesthetized cats. The proportion of MGB neurons insensitive to repetitive clicks was close to 30%. On the basis of their pattern of discharge, the responsive units were divided into three categories. The majority of them (71%), classified as "lockers", showed discharges precisely time-locked to the individual clicks of the train. A few units (8%), called "groupers", had discharges loosely synchronized to low-rate repetitive clicks. When the spikes were not synchronized, the cell had transient or sustained responses for a limited frequency range and was classified as a "special responder" (21%). Responses of "lockers" were time-locked up to a limiting rate, which varied between 10 and 800 Hz; half of the "lockers" had a limiting rate of locking equal to or higher than 100 Hz. The degree of entrainment, defined as the probability that each click evokes at least one spike, regularly decreases for increasing rates; on the other hand, the precision of locking increases with frequency. The time jitter observed at 100 Hz might be as small as 0.2 ms and was 1.2 ms on average. The population of "lockers" can mark with precision the transients of complex sounds and has response properties still compatible with a temporal coding of the fundamental frequency of most animal vocalizations.

  19. Guinea Pig Oxygen-Sensing and Carotid Body Functional Properties

    PubMed Central

    Gonzalez-Obeso, Elvira; Docio, Inmaculada; Olea, Elena; Cogolludo, Angel; Obeso, Ana; Rocher, Asuncion; Gomez-Niño, Angela

    2017-01-01

    Mammals have developed different mechanisms to maintain oxygen supply to cells in response to hypoxia. One of those mechanisms, the carotid body (CB) chemoreceptors, is able to detect physiological hypoxia and generate homeostatic reflex responses, mainly ventilatory and cardiovascular. It has been reported that guinea pigs, originally from the Andes, have a reduced ventilatory response to hypoxia compared to other mammals, implying that CBs are not completely functional, which has been related to a genetically/epigenetically determined poor hypoxia-driven CB reflex. This study was performed to check the guinea pig CB response to hypoxia against the well-known rat hypoxic response. These experiments explored ventilatory parameters while breathing different gas mixtures, cardiovascular responses to acute hypoxia, the in vitro CB response to hypoxia and other stimuli, and the properties of isolated guinea pig chemoreceptor cells. Our findings show that guinea pigs are hypotensive and have lower arterial pO2 than rats, probably related to a low sympathetic tone and high hemoglobin affinity. Those characteristics could represent a higher tolerance to a hypoxic environment than in other rodents. We also find that, although CBs are hypo-functional and do not show chronic hypoxia sensitization, a small percentage of isolated carotid body chemoreceptor cells contain the tyrosine hydroxylase enzyme and voltage-dependent K+ currents and therefore can be depolarized. However, hypoxia does not modify intracellular Ca2+ levels or catecholamine secretion. Guinea pigs are able to hyperventilate only in response to an intense acute hypoxic stimulus, but the hypercapnic response is similar to that of rats. Whether other brain areas are also activated by hypoxia in guinea pigs remains to be studied. PMID:28533756

  20. Guinea Pig Oxygen-Sensing and Carotid Body Functional Properties.

    PubMed

    Gonzalez-Obeso, Elvira; Docio, Inmaculada; Olea, Elena; Cogolludo, Angel; Obeso, Ana; Rocher, Asuncion; Gomez-Niño, Angela

    2017-01-01

    Mammals have developed different mechanisms to maintain oxygen supply to cells in response to hypoxia. One of those mechanisms, the carotid body (CB) chemoreceptors, is able to detect physiological hypoxia and generate homeostatic reflex responses, mainly ventilatory and cardiovascular. It has been reported that guinea pigs, originally from the Andes, have a reduced ventilatory response to hypoxia compared to other mammals, implying that CBs are not completely functional, which has been related to a genetically/epigenetically determined poor hypoxia-driven CB reflex. This study was performed to check the guinea pig CB response to hypoxia against the well-known rat hypoxic response. These experiments explored ventilatory parameters while breathing different gas mixtures, cardiovascular responses to acute hypoxia, the in vitro CB response to hypoxia and other stimuli, and the properties of isolated guinea pig chemoreceptor cells. Our findings show that guinea pigs are hypotensive and have lower arterial pO2 than rats, probably related to a low sympathetic tone and high hemoglobin affinity. Those characteristics could represent a higher tolerance to a hypoxic environment than in other rodents. We also find that, although CBs are hypo-functional and do not show chronic hypoxia sensitization, a small percentage of isolated carotid body chemoreceptor cells contain the tyrosine hydroxylase enzyme and voltage-dependent K+ currents and therefore can be depolarized. However, hypoxia does not modify intracellular Ca2+ levels or catecholamine secretion. Guinea pigs are able to hyperventilate only in response to an intense acute hypoxic stimulus, but the hypercapnic response is similar to that of rats. Whether other brain areas are also activated by hypoxia in guinea pigs remains to be studied.

  1. Use and interpretation of logistic regression in habitat-selection studies

    USGS Publications Warehouse

    Keating, Kim A.; Cherry, Steve

    2004-01-01

     Logistic regression is an important tool for wildlife habitat-selection studies, but the method frequently has been misapplied due to an inadequate understanding of the logistic model, its interpretation, and the influence of sampling design. To promote better use of this method, we review its application and interpretation under 3 sampling designs: random, case-control, and use-availability. Logistic regression is appropriate for habitat use-nonuse studies employing random sampling and can be used to directly model the conditional probability of use in such cases. Logistic regression also is appropriate for studies employing case-control sampling designs, but careful attention is required to interpret results correctly. Unless bias can be estimated or probability of use is small for all habitats, results of case-control studies should be interpreted as odds ratios, rather than probability of use or relative probability of use. When data are gathered under a use-availability design, logistic regression can be used to estimate approximate odds ratios if probability of use is small, at least on average. More generally, however, logistic regression is inappropriate for modeling habitat selection in use-availability studies. In particular, using logistic regression to fit the exponential model of Manly et al. (2002:100) does not guarantee maximum-likelihood estimates, valid probabilities, or valid likelihoods. We show that the resource selection function (RSF) commonly used for the exponential model is proportional to a logistic discriminant function. Thus, it may be used to rank habitats with respect to probability of use and to identify important habitat characteristics or their surrogates, but it is not guaranteed to be proportional to probability of use. Other problems associated with the exponential model also are discussed. 
We describe an alternative model based on Lancaster and Imbens (1996) that offers a method for estimating conditional probability of use in use-availability studies. Although promising, this model fails to converge to a unique solution in some important situations. Further work is needed to obtain a robust method that is broadly applicable to use-availability studies.
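
    The caution about interpreting results as odds ratios rather than probabilities can be made concrete with two hypothetical habitats whose true relative probability of use is exactly 2: the odds ratio approximates that ratio only when use probabilities are small.

```python
def odds_ratio(p1, p0):
    """Odds ratio between two use probabilities."""
    return (p1 / (1 - p1)) / (p0 / (1 - p0))

# rare use: OR ~ relative probability of use (habitat 1 used twice as often)
small = odds_ratio(0.02, 0.01)   # close to the true ratio, 2.0
# common use: the same 2-fold difference yields a misleading OR of 3.5
large = odds_ratio(0.60, 0.30)
```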

  2. Testing anthropic reasoning for the cosmological constant with a realistic galaxy formation model

    NASA Astrophysics Data System (ADS)

    Sudoh, Takahiro; Totani, Tomonori; Makiya, Ryu; Nagashima, Masahiro

    2017-01-01

    The anthropic principle is one of the possible explanations for the cosmological constant (Λ) problem. In previous studies, a dark halo mass threshold comparable with our Galaxy must be assumed in galaxy formation to get a reasonably large probability of finding the observed small value, P(<Λobs), though stars are found in much smaller galaxies as well. Here we examine the anthropic argument by using a semi-analytic model of cosmological galaxy formation, which can reproduce many observations such as galaxy luminosity functions. We calculate the probability distribution of Λ by running the model code for a wide range of Λ, while other cosmological parameters and model parameters for baryonic processes of galaxy formation are kept constant. Assuming that the prior probability distribution is flat per unit Λ, and that the number of observers is proportional to stellar mass, we find P(<Λobs) = 6.7 per cent without introducing any galaxy mass threshold. We also investigate the effect of metallicity; we find P(<Λobs) = 9.0 per cent if observers exist only in galaxies whose metallicity is higher than the solar abundance. If the number of observers is proportional to metallicity, we find P(<Λobs) = 9.7 per cent. Since these probabilities are not extremely small, we conclude that the anthropic argument is a viable explanation, if the value of Λ observed in our Universe is determined by a probability distribution.

  3. Reliable evaluation of the quantal determinants of synaptic efficacy using Bayesian analysis

    PubMed Central

    Beato, M.

    2013-01-01

Communication between neurones in the central nervous system depends on synaptic transmission. The efficacy of synapses is determined by pre- and postsynaptic factors that can be characterized using quantal parameters such as the probability of neurotransmitter release, number of release sites, and quantal size. Existing methods of estimating the quantal parameters based on multiple probability fluctuation analysis (MPFA) are limited by their requirement for long recordings to acquire substantial data sets. We therefore devised an algorithm, termed Bayesian Quantal Analysis (BQA), that can yield accurate estimates of the quantal parameters from data sets as small as 60 observations for each of only two conditions of release probability. Computer simulations are used to compare its accuracy with that of MPFA while varying the number of observations and the simulated range in release probability. We challenge BQA with realistic complexities characteristic of complex synapses, such as increases in the intra- or intersite variances, and heterogeneity in release probabilities. Finally, we validate the method using experimental data obtained from electrophysiological recordings to show that the effect of an antagonist on postsynaptic receptors is correctly characterized by BQA by a specific reduction in the estimates of quantal size. Since BQA routinely yields reliable estimates of the quantal parameters from small data sets, it is ideally suited to identify the locus of synaptic plasticity for experiments in which repeated manipulations of the recording environment are unfeasible. PMID:23076101

  4. Stochastic and deterministic model of microbial heat inactivation.

    PubMed

    Corradini, Maria G; Normand, Mark D; Peleg, Micha

    2010-03-01

    Microbial inactivation is described by a model based on the changing survival probabilities of individual cells or spores. It is presented in a stochastic and discrete form for small groups, and as a continuous deterministic model for larger populations. If the underlying mortality probability function remains constant throughout the treatment, the model generates first-order ("log-linear") inactivation kinetics. Otherwise, it produces survival patterns that include Weibullian ("power-law") with upward or downward concavity, tailing with a residual survival level, complete elimination, flat "shoulder" with linear or curvilinear continuation, and sigmoid curves. In both forms, the same algorithm or model equation applies to isothermal and dynamic heat treatments alike. Constructing the model does not require assuming a kinetic order or knowledge of the inactivation mechanism. The general features of its underlying mortality probability function can be deduced from the experimental survival curve's shape. Once identified, the function's coefficients, the survival parameters, can be estimated directly from the experimental survival ratios by regression. The model is testable in principle but matching the estimated mortality or inactivation probabilities with those of the actual cells or spores can be a technical challenge. The model is not intended to replace current models to calculate sterility. Its main value, apart from connecting the various inactivation patterns to underlying probabilities at the cellular level, might be in simulating the irregular survival patterns of small groups of cells and spores. In principle, it can also be used for nonthermal methods of microbial inactivation and their combination with heat.
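The discrete stochastic form of the model is simple to sketch: each survivor dies in time interval t with probability p(t), and the shape of p(t) dictates the survival pattern. A minimal illustration (function names and parameter values are ours, not the paper's):

```python
import random

def simulate_survivors(n0, p_of_t, steps, rng):
    """Discrete stochastic inactivation: each survivor dies in interval t
    with probability p_of_t(t); returns survivor counts over time."""
    counts = [n0]
    n = n0
    for t in range(1, steps + 1):
        p = p_of_t(t)
        n = sum(1 for _ in range(n) if rng.random() > p)
        counts.append(n)
    return counts

rng = random.Random(42)

# Constant mortality probability -> first-order ("log-linear") kinetics:
# E[N(t)] = N0 * (1 - p)**t, so log-survival falls linearly with t.
const = simulate_survivors(100_000, lambda t: 0.05, 30, rng)

# Mortality probability rising with time -> downward-concave
# (Weibullian-like) survival curve instead of a straight line.
rising = simulate_survivors(100_000, lambda t: min(1.0, 0.01 * t), 30, rng)
```

With a constant p the expected count follows N0·(1−p)^t, i.e. first-order kinetics; a mortality probability that rises with time produces the downward-concave, Weibullian-like curves the authors describe.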

  5. Disease-emergence dynamics and control in a socially-structured wildlife species

    NASA Astrophysics Data System (ADS)

    Pepin, Kim M.; Vercauteren, Kurt C.

    2016-04-01

Once a pathogen is introduced in a population, key factors governing rate of spread include contact structure, supply of susceptible individuals and pathogen life-history. We examined the interplay of these factors on emergence dynamics and efficacy of disease prevention and response. We contrasted transmission dynamics of livestock viruses with different life-histories in hypothetical populations of feral swine with different contact structures (homogeneous, metapopulation, spatial and network). Persistence probability was near 0 for the FMDV-like (foot-and-mouth disease virus) case under a wide range of parameter values and contact structures, while persistence was probable for the CSFV-like (classical swine fever virus) case. There were no sets of conditions where the FMDV-like pathogen persisted in every stochastic simulation. Even when population growth rates were up to 300% annually, the FMDV-like pathogen persisted in <25% of simulations regardless of transmission probabilities and contact structure. For networks and spatial contact structure, persistence probability of the FMDV-like pathogen was always <10%. Because of its low persistence probability, even very early response to the FMDV-like pathogen in feral swine was unwarranted while response to the CSFV-like pathogen was generally effective. When pre-emergence culling of feral swine caused population declines, it was effective at decreasing outbreak size of both diseases by ≥80%.

  6. Immunomodulation of Tumor Growth

    PubMed Central

    Prehn, Richmond T.

    1974-01-01

    Most and perhaps all neoplasms arouse an immune response in their hosts. Unfortunately, this response is seldom effective in limiting tumor growth. Immunologic surveillance, as originally conceived, probably does not exist. The early weak response to nascent tumors stimulates rather than inhibits their growth. A truly tumor-limiting reaction occurs only in exceptional tumor systems, and then it is relatively late and ineffectual. Immunity may be of great importance in limiting the activity of oncogenic viruses, but is probably seldom the determiner of whether or not an already transformed cell gives rise to a lethal cancer. PMID:4548632

  7. Constituent loads in small streams: the process and problems of estimating sediment flux

    Treesearch

    R. B. Thomas

    1989-01-01

    Constituent loads in small streams are often estimated poorly. This is especially true for discharge-related constituents like sediment, since their flux is highly variable and mainly occurs during infrequent high-flow events. One reason for low-quality estimates is that most prevailing data collection methods ignore sampling probabilities and only partly account for...

  8. Bayesian Estimation of Small Effects in Exercise and Sports Science.

    PubMed

    Mengersen, Kerrie L; Drovandi, Christopher C; Robert, Christian P; Pyne, David B; Gore, Christopher J

    2016-01-01

The aim of this paper is to provide a Bayesian formulation of the so-called magnitude-based inference approach to quantifying and interpreting effects, and in a case study example provide accurate probabilistic statements that correspond to the intended magnitude-based inferences. The model is described in the context of a published small-scale athlete study which employed a magnitude-based inference approach to compare the effect of two altitude training regimens (live high-train low (LHTL), and intermittent hypoxic exposure (IHE)) on running performance and blood measurements of elite triathletes. The posterior distributions, and corresponding point and interval estimates, for the parameters and associated effects and comparisons of interest, were estimated using Markov chain Monte Carlo simulations. The Bayesian analysis was shown to provide more direct probabilistic comparisons of treatments and to identify small effects of interest. The approach avoided asymptotic assumptions and overcame issues such as multiple testing. Bayesian analysis of unscaled effects showed a probability of 0.96 that LHTL yields a substantially greater increase in hemoglobin mass than IHE, a 0.93 probability of a substantially greater improvement in running economy and a greater than 0.96 probability that both IHE and LHTL yield a substantially greater improvement in maximum blood lactate concentration compared to a placebo. The conclusions are consistent with those obtained using a 'magnitude-based inference' approach that has been promoted in the field. The paper demonstrates that a fully Bayesian analysis is a simple and effective way of analysing small effects, providing a rich set of results that are straightforward to interpret in terms of probabilistic statements.

  9. Pollutant Concentrations and Toxic Effects on the Red Alga Ceramium tenuicorne of Sediments from Natural Harbors and Small Boat Harbors on the West Coast of Sweden.

    PubMed

    Eklund, Britta; Hansson, Tomas; Bengtsson, Henrik; Eriksson Wiklund, Ann-Kristin

    2016-04-01

This investigation set out to analyze the toxicity of surface sediments in a number of natural harbors and small boat harbors on the west coast of Sweden. This was done with the growth inhibition method with Ceramium tenuicorne. Also, concentrations of copper (Cu), lead (Pb), zinc (Zn), irgarol, organotin compounds, and polycyclic aromatic hydrocarbons (PAHs) in the sediments were analyzed. The small boat harbors were heavily polluted by Cu, Zn, butyltins, and PAHs, and to a lesser extent by Pb. The Cu, Pb, Zn, and butyltins probably originated from their past and/or present use in antifouling paints, whereas the PAHs probably had multiple sources, including boat motor exhausts. The measured toxicity of the sediments was generally related to their Cu, Zn, and butyltin content, although toxic substances other than those analyzed here probably contributed to the toxicity in some of the harbors. The natural harbor sediments contained fewer pollutants and were less toxic than the small boat harbor sediments. Nevertheless, our data indicate that the boating pressure today may be high enough to produce toxic effects even in natural harbors in pristine areas. The strongest relationship between toxicity and the major pollutants was obtained when the sediment toxicity was expressed as gram wet weight per liter compared with gram dry weight per liter and gram total organic carbon per liter. Hence, for pollutants that can be elutriated with natural sea water, sediment toxicity expressed as gram wet weight per liter appears preferable.

  10. Uncertainty quantification for complex systems with very high dimensional response using Grassmann manifold variations

    NASA Astrophysics Data System (ADS)

    Giovanis, D. G.; Shields, M. D.

    2018-07-01

    This paper addresses uncertainty quantification (UQ) for problems where scalar (or low-dimensional vector) response quantities are insufficient and, instead, full-field (very high-dimensional) responses are of interest. To do so, an adaptive stochastic simulation-based methodology is introduced that refines the probability space based on Grassmann manifold variations. The proposed method has a multi-element character discretizing the probability space into simplex elements using a Delaunay triangulation. For every simplex, the high-dimensional solutions corresponding to its vertices (sample points) are projected onto the Grassmann manifold. The pairwise distances between these points are calculated using appropriately defined metrics and the elements with large total distance are sub-sampled and refined. As a result, regions of the probability space that produce significant changes in the full-field solution are accurately resolved. An added benefit is that an approximation of the solution within each element can be obtained by interpolation on the Grassmann manifold. The method is applied to study the probability of shear band formation in a bulk metallic glass using the shear transformation zone theory.

  11. Are stress-induced cortisol changes during pregnancy associated with postpartum depressive symptoms?

    PubMed

    Nierop, Ada; Bratsikas, Aliki; Zimmermann, Roland; Ehlert, Ulrike

    2006-01-01

The purpose of this study was to examine the association between psychobiological stress reactivity during healthy pregnancy and depressive symptoms in the early puerperium. A sample of healthy nulliparous pregnant women (N = 57) between the ages of 21 and 35 years underwent a standardized psychosocial stress test during pregnancy. Within an average of 13 days after delivery, postpartum depressive symptoms were assessed using the German version of the Edinburgh postnatal depression scale (EPDS). The sample was divided into a group with probable cases (EPDS score >9, N = 16) and a group with probable noncases (EPDS score ≤9, N = 41). The probable case group showed significantly higher cortisol responses to the stress test compared with the probable noncase group, whereas baseline levels did not differ. Additionally, women in the probable case group showed significantly higher state anxiety and lower mood state throughout the experiment. Furthermore, the probable case group showed higher stress susceptibility, higher trait anxiety, and higher levels in the Symptom Checklist. No differences were found for prior episodes of psychiatric disorders, obstetrical complications, birth weight, or mode of delivery. Our data provide evidence that healthy pregnant women developing postpartum depressive symptoms might already be identified during pregnancy by means of their higher cortisol reactivity and their higher psychological reactivity in response to psychosocial stress. Further investigations are required to explore whether higher psychobiological stress responses not only precede depressive symptoms within 2 weeks after birth, but might also predict postpartum major depression.

  12. Approximation of Failure Probability Using Conditional Sampling

    NASA Technical Reports Server (NTRS)

Giesy, Daniel P.; Crespo, Luis G.; Kenney, Sean P.

    2008-01-01

    In analyzing systems which depend on uncertain parameters, one technique is to partition the uncertain parameter domain into a failure set and its complement, and judge the quality of the system by estimating the probability of failure. If this is done by a sampling technique such as Monte Carlo and the probability of failure is small, accurate approximation can require so many sample points that the computational expense is prohibitive. Previous work of the authors has shown how to bound the failure event by sets of such simple geometry that their probabilities can be calculated analytically. In this paper, it is shown how to make use of these failure bounding sets and conditional sampling within them to substantially reduce the computational burden of approximating failure probability. It is also shown how the use of these sampling techniques improves the confidence intervals for the failure probability estimate for a given number of sample points and how they reduce the number of sample point analyses needed to achieve a given level of confidence.
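The gain from conditional sampling can be seen in a one-dimensional toy problem: if the failure set F is contained in a bounding set B whose probability is known analytically, then P(F) = P(B)·P(F|B), and only P(F|B), which is not small, needs to be estimated by sampling. The standard-normal example below is ours; the paper's bounding sets are geometric constructions in the uncertain-parameter domain.

```python
import random
from statistics import NormalDist

nd = NormalDist()          # standard normal uncertain parameter
rng = random.Random(0)

# Failure event: x > 3.5 (a small-probability tail event).
# Bounding set B: x > 3.0, whose probability is known in closed form.
p_B = 1.0 - nd.cdf(3.0)

# Conditional sampling: draw only from within B (truncated normal via
# inverse-CDF), then estimate P(failure | B) from the samples.
n = 20_000
hits = 0
for _ in range(n):
    u = rng.uniform(nd.cdf(3.0), 1.0)
    x = nd.inv_cdf(u)
    if x > 3.5:
        hits += 1

p_fail = p_B * hits / n    # P(F) = P(B) * P(F | B)
```

Every sample lands in B, so the estimate of the conditional probability converges quickly, whereas a plain Monte Carlo run of the same size would see only a handful of failures.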

  13. Fixation Probability in a Haploid-Diploid Population

    PubMed Central

    Bessho, Kazuhiro; Otto, Sarah P.

    2017-01-01

    Classical population genetic theory generally assumes either a fully haploid or fully diploid life cycle. However, many organisms exhibit more complex life cycles, with both free-living haploid and diploid stages. Here we ask what the probability of fixation is for selected alleles in organisms with haploid-diploid life cycles. We develop a genetic model that considers the population dynamics using both the Moran model and Wright–Fisher model. Applying a branching process approximation, we obtain an accurate fixation probability assuming that the population is large and the net effect of the mutation is beneficial. We also find the diffusion approximation for the fixation probability, which is accurate even in small populations and for deleterious alleles, as long as selection is weak. These fixation probabilities from branching process and diffusion approximations are similar when selection is weak for beneficial mutations that are not fully recessive. In many cases, particularly when one phase predominates, the fixation probability differs substantially for haploid-diploid organisms compared to either fully haploid or diploid species. PMID:27866168
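For reference, the two classical approximations that the haploid-diploid results generalize take simple closed forms in a purely haploid Wright–Fisher population (these are the textbook results, not the expressions derived in the paper):

```python
import math

def fixation_prob_diffusion(N, s, p0=None):
    """Kimura's diffusion approximation for a haploid Wright–Fisher
    population of size N: pi(p0) = (1 - exp(-2*N*s*p0)) / (1 - exp(-2*N*s))."""
    if p0 is None:
        p0 = 1.0 / N          # a single new mutant copy
    return (1 - math.exp(-2 * N * s * p0)) / (1 - math.exp(-2 * N * s))

def fixation_prob_branching(s):
    """Haldane's branching-process approximation for a beneficial
    mutant in a large population: pi ≈ 2s (weak selection)."""
    return 2 * s

# For weak positive selection in a large population the two agree...
pi_d = fixation_prob_diffusion(N=10_000, s=0.01)
pi_b = fixation_prob_branching(0.01)

# ...while for a deleterious allele only the diffusion result applies,
# giving a fixation probability below the neutral value 1/N.
pi_del = fixation_prob_diffusion(N=100, s=-0.005)
neutral = 1 / 100
```

This mirrors the pattern reported in the abstract: the branching and diffusion approximations coincide for weakly beneficial mutations, while the diffusion form remains usable in small populations and for deleterious alleles.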

  14. Characterization of nonGaussian atmospheric turbulence for prediction of aircraft response statistics

    NASA Technical Reports Server (NTRS)

    Mark, W. D.

    1977-01-01

Mathematical expressions were derived for the exceedance rates and probability density functions of aircraft response variables using a turbulence model that consists of a low frequency component plus a variance modulated Gaussian turbulence component. The functional form of experimentally observed concave exceedance curves was predicted theoretically, the strength of the concave contribution being governed by the coefficient of variation of the time fluctuating variance of the turbulence. Differences in the functional forms of response exceedance curves and probability densities also were shown to depend primarily on this same coefficient of variation. Criteria were established for the validity of the local stationarity assumption that is required in the derivations of the exceedance curves and probability density functions. These criteria are shown to depend on the relative time scale of the fluctuations in the variance, the fluctuations in the turbulence itself, and on the nominal duration of the relevant aircraft impulse response function. Metrics that can be generated from turbulence recordings for testing the validity of the local stationarity assumption were developed.

  15. Dopamine neurons learn relative chosen value from probabilistic rewards

    PubMed Central

    Lak, Armin; Stauffer, William R; Schultz, Wolfram

    2016-01-01

Economic theories posit reward probability as one of the factors defining reward value. Individuals learn the value of cues that predict probabilistic rewards from experienced reward frequencies. Building on the notion that responses of dopamine neurons increase with reward probability and expected value, we asked how dopamine neurons in monkeys acquire this value signal that may represent an economic decision variable. We found in a Pavlovian learning task that reward probability-dependent value signals arose from experienced reward frequencies. We then assessed neuronal response acquisition during choices among probabilistic rewards. Here, dopamine responses became sensitive to the value of both chosen and unchosen options. Both experiments also showed novelty responses of dopamine neurons that decreased as learning advanced. These results show that dopamine neurons acquire predictive value signals from the frequency of experienced rewards. This flexible and fast signal reflects a specific decision variable and could update neuronal decision mechanisms. DOI: http://dx.doi.org/10.7554/eLife.18044.001 PMID:27787196

  16. Practice Parameter update: Management issues for women with epilepsy—Focus on pregnancy (an evidence-based review): Teratogenesis and perinatal outcomes

    PubMed Central

    Harden, C. L.; Meador, K. J.; Pennell, P. B.; Hauser, W. A.; Gronseth, G. S.; French, J. A.; Wiebe, S.; Thurman, D.; Koppel, B. S.; Kaplan, P. W.; Robinson, J. N.; Hopp, J.; Ting, T. Y.; Gidal, B.; Hovinga, C. A.; Wilner, A. N.; Vazquez, B.; Holmes, L.; Krumholz, A.; Finnell, R.; Hirtz, D.; Le Guen, C.

    2009-01-01

Objective: To reassess the evidence for management issues related to the care of women with epilepsy (WWE) during pregnancy. Methods: Systematic review of relevant articles published between January 1985 and June 2007. Results: It is highly probable that intrauterine first-trimester valproate (VPA) exposure carries a higher risk of major congenital malformations (MCMs) compared to carbamazepine, and possibly compared to phenytoin or lamotrigine. Compared to untreated WWE, it is probable that VPA as part of polytherapy and possible that VPA as monotherapy contribute to the development of MCMs. It is probable that antiepileptic drug (AED) polytherapy as compared to monotherapy regimens contributes to the development of MCMs and to reduced cognitive outcomes. For monotherapy, intrauterine exposure to VPA probably reduces cognitive outcomes. Further, monotherapy exposure to phenytoin or phenobarbital possibly reduces cognitive outcomes. Neonates of WWE taking AEDs probably have an increased risk of being small for gestational age and possibly have an increased risk of a 1-minute Apgar score of <7. Recommendations: If possible, avoidance of valproate (VPA) and antiepileptic drug (AED) polytherapy during the first trimester of pregnancy should be considered to decrease the risk of major congenital malformations (Level B). If possible, avoidance of VPA and AED polytherapy throughout pregnancy should be considered to prevent reduced cognitive outcomes (Level B). If possible, avoidance of phenytoin and phenobarbital during pregnancy may be considered to prevent reduced cognitive outcomes (Level C). Pregnancy risk stratification should reflect that the offspring of women with epilepsy taking AEDs are probably at increased risk for being small for gestational age (Level B) and possibly at increased risk of 1-minute Apgar scores of <7 (Level C). PMID:19398681

  17. Exit probability of the one-dimensional q-voter model: Analytical results and simulations for large networks

    NASA Astrophysics Data System (ADS)

    Timpanaro, André M.; Prado, Carmen P. C.

    2014-05-01

We discuss the exit probability of the one-dimensional q-voter model and present tools to obtain estimates about this probability, both through simulations in large networks (around 10^7 sites) and analytically in the limit where the network is infinitely large. We argue that the result E(ρ) = ρ^q / [ρ^q + (1-ρ)^q], that was found in three previous works [F. Slanina, K. Sznajd-Weron, and P. Przybyła, Europhys. Lett. 82, 18006 (2008), 10.1209/0295-5075/82/18006; R. Lambiotte and S. Redner, Europhys. Lett. 82, 18007 (2008), 10.1209/0295-5075/82/18007, for the case q = 2; and P. Przybyła, K. Sznajd-Weron, and M. Tabiszewski, Phys. Rev. E 84, 031117 (2011), 10.1103/PhysRevE.84.031117, for q > 2] using small networks (around 10^3 sites), is a good approximation, but there are noticeable deviations that appear even for small systems and that do not disappear when the system size is increased (with the notable exception of the case q = 2). We also show that, under some simple and intuitive hypotheses, the exit probability must obey the inequality ρ^q / [ρ^q + (1-ρ)] ≤ E(ρ) ≤ ρ / [ρ + (1-ρ)^q] in the infinite size limit. We believe this settles in the negative the suggestion made [S. Galam and A. C. R. Martins, Europhys. Lett. 95, 48005 (2011), 10.1209/0295-5075/95/48005] that this result would be a finite size effect, with the exit probability actually being a step function. We also show how the result that the exit probability cannot be a step function can be reconciled with the Galam unified frame, which was also a source of controversy.
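The approximate exit probability E(ρ) = ρ^q / [ρ^q + (1-ρ)^q] and the inequality bounds stated in the abstract are straightforward to check numerically (helper names below are ours):

```python
def exit_probability(rho, q):
    """Approximate exit probability E(rho) = rho**q / (rho**q + (1-rho)**q)."""
    return rho**q / (rho**q + (1 - rho)**q)

def lower_bound(rho, q):
    """Lower bound: rho**q / (rho**q + (1 - rho))."""
    return rho**q / (rho**q + (1 - rho))

def upper_bound(rho, q):
    """Upper bound: rho / (rho + (1 - rho)**q)."""
    return rho / (rho + (1 - rho)**q)

# The approximation respects the bounds for q >= 1 on the whole interval,
# and is symmetric: E(rho) + E(1 - rho) = 1.
rhos = [i / 20 for i in range(1, 20)]
ok = all(lower_bound(r, 3) <= exit_probability(r, 3) <= upper_bound(r, 3)
         for r in rhos)
```

That the approximation sits inside the bounds follows from (1-ρ)^q ≤ (1-ρ) and ρ^q ≤ ρ for q ≥ 1 and ρ in (0, 1).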

  18. Fitness prospects: effects of age, sex and recruitment age on reproductive value in a long-lived seabird.

    PubMed

    Zhang, He; Rebke, Maren; Becker, Peter H; Bouwhuis, Sandra

    2015-01-01

    Reproductive value is an integrated measure of survival and reproduction fundamental to understanding life-history evolution and population dynamics, but little is known about intraspecific variation in reproductive value and factors explaining such variation, if any. By applying generalized additive mixed models to longitudinal individual-based data of the common tern Sterna hirundo, we estimated age-specific annual survival probability, breeding probability and reproductive performance, based on which we calculated age-specific reproductive values. We investigated effects of sex and recruitment age (RA) on each trait. We found age effects on all traits, with survival and breeding probability declining with age, while reproductive performance first improved with age before levelling off. We only found a very small, marginally significant, sex effect on survival probability, but evidence for decreasing age-specific breeding probability and reproductive performance with RA. As a result, males had slightly lower age-specific reproductive values than females, while birds of both sexes that recruited at the earliest ages of 2 and 3 years (i.e. 54% of the tern population) had somewhat higher fitness prospects than birds recruiting at later ages. While the RA effects on breeding probability and reproductive performance were statistically significant, these effects were not large enough to translate to significant effects on reproductive value. Age-specific reproductive values provided evidence for senescence, which came with fitness costs in a range of 17-21% for the sex-RA groups. Our study suggests that intraspecific variation in reproductive value may exist, but that, in the common tern, the differences are small. © 2014 The Authors. Journal of Animal Ecology © 2014 British Ecological Society.

  19. Estimation of the lower and upper bounds on the probability of failure using subset simulation and random set theory

    NASA Astrophysics Data System (ADS)

    Alvarez, Diego A.; Uribe, Felipe; Hurtado, Jorge E.

    2018-02-01

    Random set theory is a general framework which comprises uncertainty in the form of probability boxes, possibility distributions, cumulative distribution functions, Dempster-Shafer structures or intervals; in addition, the dependence between the input variables can be expressed using copulas. In this paper, the lower and upper bounds on the probability of failure are calculated by means of random set theory. In order to accelerate the calculation, a well-known and efficient probability-based reliability method known as subset simulation is employed. This method is especially useful for finding small failure probabilities in both low- and high-dimensional spaces, disjoint failure domains and nonlinear limit state functions. The proposed methodology represents a drastic reduction of the computational labor implied by plain Monte Carlo simulation for problems defined with a mixture of representations for the input variables, while delivering similar results. Numerical examples illustrate the efficiency of the proposed approach.
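The effect of an imprecise (interval-valued) input on the failure probability can be illustrated with plain Monte Carlo: each sample yields an interval of limit-state values, and counting certain versus possible failures gives the lower and upper failure probabilities. This sketch uses a single fixed interval rather than a full Dempster-Shafer structure, and plain sampling rather than subset simulation; all names and numbers are illustrative.

```python
import random

rng = random.Random(3)

def g(x1, x2, threshold=5.0):
    """Limit state function: failure when g < 0."""
    return threshold - x1 - x2

n = 50_000
lower_hits = 0   # failure certain over the whole focal interval (belief)
upper_hits = 0   # failure possible somewhere in the interval (plausibility)
for _ in range(n):
    x1 = rng.gauss(2.0, 1.0)          # precise probabilistic input
    x2 = (1.0, 2.0)                   # interval-valued (imprecise) input
    g_min = g(x1, x2[1])              # g is decreasing in x2
    g_max = g(x1, x2[0])
    if g_max < 0:                     # fails even at the most favorable x2
        lower_hits += 1
    if g_min < 0:                     # fails at the least favorable x2
        upper_hits += 1

p_lower = lower_hits / n
p_upper = upper_hits / n
```

The pair (p_lower, p_upper) brackets the true failure probability for any distribution of x2 supported on the interval, which is the kind of bound the paper computes, there with subset simulation to handle much smaller probabilities efficiently.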

  20. Probability machines: consistent probability estimation using nonparametric learning machines.

    PubMed

    Malley, J D; Kruppa, J; Dasgupta, A; Malley, K G; Ziegler, A

    2012-01-01

    Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications.
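A nearest-neighbour probability machine, one of the two estimator classes described, reduces to averaging the binary responses of the k nearest training points. A from-scratch sketch on synthetic one-dimensional data (the paper's implementations are the cited R packages):

```python
import random

def knn_probability(x, data, k=25):
    """Nearest-neighbour probability machine: estimate P(Y=1 | X=x)
    as the mean response among the k nearest training points."""
    neighbours = sorted(data, key=lambda d: abs(d[0] - x))[:k]
    return sum(y for _, y in neighbours) / k

rng = random.Random(1)

# Synthetic binary data with known true probability P(Y=1 | X=x) = x.
train = [(x, 1 if rng.random() < x else 0)
         for x in (rng.random() for _ in range(5000))]

p_hat = knn_probability(0.8, train, k=101)   # should be near 0.8
```

Consistency here is the usual nonparametric-regression argument: as the sample grows (with k growing more slowly), the neighbourhood shrinks and the local average converges to the conditional probability.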

  1. Responsiveness-informed multiple imputation and inverse probability-weighting in cohort studies with missing data that are non-monotone or not missing at random.

    PubMed

    Doidge, James C

    2018-02-01

    Population-based cohort studies are invaluable to health research because of the breadth of data collection over time, and the representativeness of their samples. However, they are especially prone to missing data, which can compromise the validity of analyses when data are not missing at random. Having many waves of data collection presents opportunity for participants' responsiveness to be observed over time, which may be informative about missing data mechanisms and thus useful as an auxiliary variable. Modern approaches to handling missing data such as multiple imputation and maximum likelihood can be difficult to implement with the large numbers of auxiliary variables and large amounts of non-monotone missing data that occur in cohort studies. Inverse probability-weighting can be easier to implement but conventional wisdom has stated that it cannot be applied to non-monotone missing data. This paper describes two methods of applying inverse probability-weighting to non-monotone missing data, and explores the potential value of including measures of responsiveness in either inverse probability-weighting or multiple imputation. Simulation studies are used to compare methods and demonstrate that responsiveness in longitudinal studies can be used to mitigate bias induced by missing data, even when data are not missing at random.
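The mechanics of inverse probability-weighting can be shown on simulated data in which responsiveness depends on a covariate: the complete-case mean is then biased, while reweighting each observed outcome by the inverse of its response probability recovers the population mean. A sketch with the response probability treated as known (in practice it would be modelled, for example from the responsiveness measures the paper proposes):

```python
import random

rng = random.Random(7)

# Outcome Y depends on X; the probability of observing Y also depends
# on X, so respondents are not representative of the population.
population = []
for _ in range(50_000):
    x = rng.random()
    y = 2.0 * x + rng.gauss(0.0, 0.1)
    p_obs = 0.2 + 0.6 * x          # response probability, known here
    observed = rng.random() < p_obs
    population.append((y, p_obs, observed))

obs = [(y, p) for y, p, seen in population if seen]

# Naive complete-case mean: biased upward, since high-x units respond more.
naive = sum(y for y, _ in obs) / len(obs)

# Inverse probability-weighted (Hajek-style) mean: approximately unbiased.
ipw = sum(y / p for y, p in obs) / sum(1 / p for _, p in obs)
```

With these parameters the population mean of Y is 1.0, the complete-case mean converges to about 1.2, and the weighted estimator removes the bias.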

  2. Congenital heart disease in Liverpool: 1960--69.

    PubMed

    Kenna, A P; Smithells, R W; Fielding, D W

    1975-01-01

The incidence of congenital heart disease (C.H.D.) in Liverpool from 1960 to 1969 inclusive has been determined from the Liverpool Congenital Abnormalities Registry with a follow-up period of 3 to 12 years. The incidence is 6.6 per 1000 total births and this probably represents a very small degree of under-reporting. There is no consistent seasonal variation in the incidence of any of the main congenital heart lesions. In general, infants with C.H.D. tend to be of lower birth weight and born after shorter gestation than controls. This is most conspicuous with patent ductus arteriosus (P.D.A.). Females preponderate in P.D.A. and males in transposition. There is probably also a male preponderance in coarctation and aortic stenosis. Fallot's tetralogy is associated with increased maternal age and parity. Pregnancies leading to the birth of a baby with C.H.D. are complicated by threatened abortion more frequently than are controls. The concordance rate for C.H.D. in twins is low. Monozygotic twins are more liable to C.H.D. than are dizygotic twins. The incidence of C.H.D. in the siblings of affected propositi is 2-3 times that expected. Affected sibs often have the same lesion. About 20 per cent of infants with C.H.D. have associated major defects notably mongolism and defects of the alimentary, skeletal, genito-urinary and nervous systems. These are responsible for the early death of about one quarter of all infants born with C.H.D. The data presented here suggest that environmental rather than genetic factors are predominantly responsible for congenital heart disease.

  3. Sustained Attention in Mild Alzheimer’s Disease

    PubMed Central

    Berardi, Anna Maria; Parasuraman, Raja; Haxby, James V.

    2008-01-01

    The vigilance decrement in perceptual sensitivity was examined in 10 patients with mild Alzheimer’s disease (AD) and 20 age-matched controls. A visual high-event rate digit-discrimination task lasting 7.2 min. (six 1.2 min blocks) was presented at different levels of stimulus degradation. Previous studies have shown that sensitivity decrements (d′) over time at high-stimulus degradation result from demands on effortful processing. For all degradation levels, the overall level of vigilance (d′) was lower in AD patients than in controls. All participants showed sensitivity decrement over blocks, with greater decrement at higher degradation levels. AD patients exhibited greater sensitivity decrement over time at the highest degradation level they all could perform relative to control participants. There were no concomitant changes in either response bias (C) or response times. The results indicate that mild AD patients have overall lower levels of vigilance under conditions that require both automatic and effortful processing. Mild AD patients also exhibit a deficit in the maintenance of vigilance over time under effortful processing conditions. Although the sample of AD patients was small, results further suggest that both possible and probable AD patients had greater sensitivity decrement over time at the highest degradation level than did control participants, but only probable AD patients had lower overall levels of vigilance. In the possible AD patients as a group, the decrement in vigilance occurred in the absence of concurrent deficits on standard attentional tasks, such as the Stroop and Trail Making tests, suggesting that deficits in vigilance over time may appear earlier than deficits in selective attention. PMID:15992254

  4. Using Crowdsourcing to Examine Relations Between Delay and Probability Discounting

    PubMed Central

    Jarmolowicz, David P.; Bickel, Warren K.; Carter, Anne E.; Franck, Christopher T.; Mueller, E. Terry

    2016-01-01

    Although the extensive lines of research on delay and/or probability discounting have greatly expanded our understanding of human decision-making processes, the relation between these two phenomena remains unclear. For example, some studies have reported robust associations between delay and probability discounting, whereas others have failed to demonstrate a consistent relation between the two. The current study sought to clarify this relation by examining the relation between delay and probability discounting in a large sample of internet users (n= 904) using the Amazon Mechanical Turk (AMT) crowdsourcing service. Because AMT is a novel data collection platform, the findings were validated through the replication of a number of previously established relations (e.g., relations between delay discounting and cigarette smoking status). A small but highly significant positive correlation between delay and probability discounting rates was obtained, and principal component analysis suggested that two (rather than one) components were preferable to account for the variance in both delay and probability discounting. Taken together, these findings suggest that delay and probability discounting may be related, but are not manifestations of a single component (e.g., impulsivity). PMID:22982370

  5. Magnetic particle-scanning for ultrasensitive immunodetection on-chip.

    PubMed

    Cornaglia, Matteo; Trouillon, Raphaël; Tekin, H Cumhur; Lehnert, Thomas; Gijs, Martin A M

    2014-08-19

    We describe the concept of magnetic particle-scanning for on-chip detection of biomolecules: a magnetic particle, carrying a low number of antigens (Ag's) (down to a single molecule), is transported by hydrodynamic forces and is subjected to successive stochastic reorientations in an engineered magnetic energy landscape. The latter consists of a pattern of substrate-bound small magnetic particles that are functionalized with antibodies (Ab's). Subsequent counting of the captured Ag-carrying particles provides the detection signal. The magnetic particle-scanning principle is investigated in a custom-built magneto-microfluidic chip and theoretically described by a random walk-based model, in which the trajectory of the contact point between an Ag-carrying particle and the small magnetic particle pattern is described by stochastic moves over the surface of the mobile particle, until this point coincides with the position of an Ag, resulting in the binding of the particle. This model explains the particular behavior of previously reported experimental dose-response curves obtained for two different ligand-receptor systems (biotin/streptavidin and TNF-α) over a wide range of concentrations. Our model shows that magnetic particle-scanning results in a very high probability of immunocomplex formation for very low Ag concentrations, leading to an extremely low limit of detection, down to the single molecule-per-particle level. When compared to other types of magnetic particle-based surface coverage assays, our strategy was found to offer a wider dynamic range (>8 orders of magnitude), as the system does not saturate for concentrations as high as 10^11 Ag molecules in a 5 μL drop. Furthermore, by emphasizing the importance of maximizing the encounter probability between the Ag and the Ab to improve sensitivity, our model also contributes to explaining the behavior of other particle-based heterogeneous immunoassays.
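
    The random-walk picture above can be caricatured in a few lines of code: treat the contact point as a walker on a small ring of surface sites, one of which carries the antigen, and count how often the walker hits it within a given number of reorientations. This is a deliberately minimal sketch (1-D ring, single target, unit steps), not the authors' model; all parameter values are illustrative.

```python
import random

def capture_probability(n_sites, n_steps, trials=10000, seed=0):
    """Fraction of trials in which a random walk on a ring of
    contact-point sites hits the single antigen-bearing site
    within n_steps moves (a toy stand-in for immunocomplex
    formation in the particle-scanning model)."""
    rng = random.Random(seed)
    captured = 0
    for _ in range(trials):
        pos = rng.randrange(n_sites)  # random initial contact point
        for _ in range(n_steps):
            if pos == 0:              # site 0 holds the antigen
                captured += 1
                break
            pos = (pos + rng.choice((-1, 1))) % n_sites
    return captured / trials
```

    Allowing more stochastic reorientations drives the capture probability toward one, which is the qualitative mechanism behind the very low limit of detection reported above.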

  6. Portals for Real-Time Earthquake Data and Forecasting: Challenge and Promise (Invited)

    NASA Astrophysics Data System (ADS)

    Rundle, J. B.; Holliday, J. R.; Graves, W. R.; Feltstykket, R.; Donnellan, A.; Glasscoe, M. T.

    2013-12-01

    Earthquake forecasts have been computed by a variety of countries world-wide for over two decades. For the most part, forecasts have been computed for insurance, reinsurance and underwriters of catastrophe bonds. However, recent events clearly demonstrate that mitigating personal risk is becoming the responsibility of individual members of the public. Open access to a variety of web-based forecasts, tools, utilities and information is therefore required. Portals for data and forecasts present particular challenges, and require the development of both apps and the client/server architecture to deliver the basic information in real time. The basic forecast model we consider is the Natural Time Weibull (NTW) method (JBR et al., Phys. Rev. E, 86, 021106, 2012). This model uses small earthquakes ('seismicity-based models') to forecast the occurrence of large earthquakes, via data-mining algorithms combined with the ANSS earthquake catalog. This method computes large earthquake probabilities using the number of small earthquakes that have occurred in a region since the last large earthquake. Localizing these forecasts in space so that global forecasts can be computed in real time presents special algorithmic challenges, which we describe in this talk. Using 25 years of data from the ANSS California-Nevada catalog of earthquakes, we compute real-time global forecasts at a grid scale of 0.1°. We analyze and monitor the performance of these models using the standard tests, which include the Reliability/Attributes and Receiver Operating Characteristic (ROC) tests. It is clear from much of the analysis that data quality is a major limitation on the accurate computation of earthquake probabilities. We discuss the challenges of serving up these datasets over the web on web-based platforms such as those at www.quakesim.org, www.e-decider.org, and www.openhazards.com.
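
    The core of a seismicity-based forecast of this kind can be sketched with a Weibull hazard in "natural time", i.e. the count of small earthquakes since the last large one. The scale and shape parameters below are illustrative placeholders, not the values fitted in the cited NTW model.

```python
import math

def ntw_probability(n_small, tau=150.0, beta=1.4):
    """Probability that the next large earthquake has occurred by the
    time n_small small events have accumulated since the last large
    one, under a Weibull distribution in natural time.  tau (scale,
    in event counts) and beta (shape) are illustrative values, not
    the parameters of the cited model."""
    return 1.0 - math.exp(-((n_small / tau) ** beta))
```

    The probability starts at zero immediately after a large event and rises monotonically as small earthquakes accumulate, which is the intended forecasting behaviour.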

  7. Cache-enabled small cell networks: modeling and tradeoffs.

    PubMed

    Baştuǧ, Ejder; Bennis, Mehdi; Kountouris, Marios; Debbah, Mérouane

    We consider a network model where small base stations (SBSs) have caching capabilities as a means to alleviate the backhaul load and satisfy users' demand. The SBSs are stochastically distributed over the plane according to a Poisson point process (PPP) and serve their users either (i) by bringing the content from the Internet through a finite rate backhaul or (ii) by serving them from the local caches. We derive closed-form expressions for the outage probability and the average delivery rate as a function of the signal-to-interference-plus-noise ratio (SINR), SBS density, target file bitrate, storage size, file length, and file popularity. We then analyze the impact of key operating parameters on the system performance. It is shown that a certain outage probability can be achieved either by increasing the number of base stations or the total storage size. Our results and analysis provide key insights into the deployment of cache-enabled small cell networks (SCNs), which are seen as a promising solution for future heterogeneous cellular networks.
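
    A quick way to reproduce the qualitative SINR behaviour described above is Monte Carlo simulation: drop SBSs as a Poisson point process in a disk around a user, apply power-law path loss with Rayleigh fading, and count the trials whose SINR falls below the target. This is a hedged sketch of the setting, not the paper's closed-form analysis; the density, path-loss exponent, noise power and association rule (strongest SBS) are illustrative choices.

```python
import math
import random

def _poisson(rng, lam):
    """Knuth's Poisson sampler (adequate for moderate lam)."""
    L = math.exp(-lam)
    k, p = 0, 1.0
    while p > L:
        k += 1
        p *= rng.random()
    return k - 1

def outage_probability(density, sinr_threshold_db, alpha=4.0,
                       noise=1e-9, radius=50.0, trials=2000, seed=1):
    """Monte Carlo outage probability for a user at the origin of a
    PPP of small base stations with unit transmit power, path-loss
    exponent alpha and Rayleigh fading.  All parameter values are
    illustrative, not those of the cited analysis."""
    rng = random.Random(seed)
    thr = 10 ** (sinr_threshold_db / 10)
    outages = 0
    for _ in range(trials):
        n = _poisson(rng, density * math.pi * radius ** 2)
        if n == 0:
            outages += 1            # no SBS in range at all
            continue
        powers = []
        for _ in range(n):
            r = radius * math.sqrt(rng.random())   # uniform in disk
            fading = rng.expovariate(1.0)          # |h|^2 ~ Exp(1)
            powers.append(fading * max(r, 1.0) ** (-alpha))
        s = max(powers)                            # serve from strongest SBS
        interference = sum(powers) - s
        if s / (interference + noise) < thr:
            outages += 1
    return outages / trials
```

    Raising the SINR target raises the outage probability, matching the qualitative trade-off between base-station density, storage and outage analysed in the abstract.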

  8. Reinforcement Probability Modulates Temporal Memory Selection and Integration Processes

    PubMed Central

    Matell, Matthew S.; Kurti, Allison N.

    2013-01-01

    We have previously shown that rats trained in a mixed-interval peak procedure (tone = 4s, light = 12s) respond in a scalar manner at a time in between the trained peak times when presented with the stimulus compound (Swanton & Matell, 2011). In our previous work, the two component cues were reinforced with different probabilities (short = 20%, long = 80%) to equate response rates, and we found that the compound peak time was biased toward the cue with the higher reinforcement probability. Here, we examined the influence that different reinforcement probabilities have on the temporal location and shape of the compound response function. We found that the time of peak responding shifted as a function of the relative reinforcement probability of the component cues, becoming earlier as the relative likelihood of reinforcement associated with the short cue increased. However, as the relative probabilities of the component cues grew dissimilar, the compound peak became non-scalar, suggesting that the temporal control of behavior shifted from a process of integration to one of selection. As our previous work has utilized durations and reinforcement probabilities more discrepant than those used here, these data suggest that the processes underlying the integration/selection decision for time are based on cue value. PMID:23896560

  9. High But Not Low Probability of Gain Elicits a Positive Feeling Leading to the Framing Effect.

    PubMed

    Gosling, Corentin J; Moutier, Sylvain

    2017-01-01

    Human risky decision-making is known to be highly susceptible to profit-motivated responses elicited by the way in which options are framed. In fact, studies investigating the framing effect have shown that the choice between sure and risky options depends on how these options are presented. Interestingly, the probability of gain of the risky option has been highlighted as one of the main factors causing variations in susceptibility to the framing effect. However, while it has been shown that high probabilities of gain of the risky option systematically lead to framing bias, questions remain about the influence of low probabilities of gain. Therefore, the first aim of this paper was to clarify the respective roles of high and low probabilities of gain in the framing effect. Due to the difference between studies using a within- or between-subjects design, we conducted a first study investigating the respective roles of these designs. For both designs, we showed that trials with a high probability of gain led to the framing effect whereas those with a low probability did not. Second, as emotions are known to play a key role in the framing effect, we sought to determine whether they are responsible for such a debiasing effect of the low probability of gain. Our second study thus investigated the relationship between emotion and the framing effect depending on high and low probabilities. Our results revealed that positive emotion was related to risk-seeking in the loss frame, but only for trials with a high probability of gain. Taken together, these results support the interpretation that low probabilities of gain suppress the framing effect because they prevent the positive emotion of gain anticipation.

  10. High But Not Low Probability of Gain Elicits a Positive Feeling Leading to the Framing Effect

    PubMed Central

    Gosling, Corentin J.; Moutier, Sylvain

    2017-01-01

    Human risky decision-making is known to be highly susceptible to profit-motivated responses elicited by the way in which options are framed. In fact, studies investigating the framing effect have shown that the choice between sure and risky options depends on how these options are presented. Interestingly, the probability of gain of the risky option has been highlighted as one of the main factors causing variations in susceptibility to the framing effect. However, while it has been shown that high probabilities of gain of the risky option systematically lead to framing bias, questions remain about the influence of low probabilities of gain. Therefore, the first aim of this paper was to clarify the respective roles of high and low probabilities of gain in the framing effect. Due to the difference between studies using a within- or between-subjects design, we conducted a first study investigating the respective roles of these designs. For both designs, we showed that trials with a high probability of gain led to the framing effect whereas those with a low probability did not. Second, as emotions are known to play a key role in the framing effect, we sought to determine whether they are responsible for such a debiasing effect of the low probability of gain. Our second study thus investigated the relationship between emotion and the framing effect depending on high and low probabilities. Our results revealed that positive emotion was related to risk-seeking in the loss frame, but only for trials with a high probability of gain. Taken together, these results support the interpretation that low probabilities of gain suppress the framing effect because they prevent the positive emotion of gain anticipation. PMID:28232808

  11. Neural substrates of reward magnitude, probability, and risk during a wheel of fortune decision-making task.

    PubMed

    Smith, Bruce W; Mitchell, Derek G V; Hardin, Michael G; Jazbec, Sandra; Fridberg, Daniel; Blair, R James R; Ernst, Monique

    2009-01-15

    Economic decision-making involves the weighting of magnitude and probability of potential gains/losses. While previous work has examined the neural systems involved in decision-making, there is a need to understand how the parameters associated with decision-making (e.g., magnitude of expected reward, probability of expected reward and risk) modulate activation within these neural systems. In the current fMRI study, we modified the monetary wheel of fortune (WOF) task [Ernst, M., Nelson, E.E., McClure, E.B., Monk, C.S., Munson, S., Eshel, N., et al. (2004). Choice selection and reward anticipation: an fMRI study. Neuropsychologia 42(12), 1585-1597.] to examine in 25 healthy young adults the neural responses to selections of different reward magnitudes, probabilities, or risks. Selection of high, relative to low, reward magnitude increased activity in insula, amygdala, middle and posterior cingulate cortex, and basal ganglia. Selection of low-probability, as opposed to high-probability reward, increased activity in anterior cingulate cortex, as did selection of risky, relative to safe reward. In summary, decision-making that did not involve conflict, as in the magnitude contrast, recruited structures known to support the coding of reward values, and those that integrate motivational and perceptual information for behavioral responses. In contrast, decision-making under conflict, as in the probability and risk contrasts, engaged the dorsal anterior cingulate cortex whose role in conflict monitoring is well established. However, decision-making under conflict failed to activate the structures that track reward values per se. Thus, the presence of conflict in decision-making seemed to significantly alter the pattern of neural responses to simple rewards. In addition, this paradigm further clarifies the functional specialization of the cingulate cortex in processes of decision-making.

  12. The Statistical Value of Raw Fluorescence Signal in Luminex xMAP Based Multiplex Immunoassays

    PubMed Central

    Breen, Edmond J.; Tan, Woei; Khan, Alamgir

    2016-01-01

    Tissue samples (plasma, saliva, serum or urine) from 169 patients classified as either normal or having one of seven possible diseases are analysed across three 96-well plates for the presence of 37 analytes using cytokine inflammation multiplexed immunoassay panels. Censoring of concentration data caused problems for analysis of the low-abundance analytes. Using fluorescence analysis rather than concentration-based analysis allowed analysis of these low-abundance analytes. Mixed-effects analysis on the resulting fluorescence and concentration responses reveals that a combination of censoring and mapping the fluorescence responses to concentration values through a 5PL curve changed the observed analyte concentrations. Simulation verifies this by showing that the observed analyte concentration levels depend on the mean fluorescence response and its distribution. Differences from normality in the fluorescence responses can lead to differences in concentration estimates and unreliable probabilities for treatment effects. It is seen that when fluorescence responses are normally distributed, probabilities of treatment effects from fluorescence-based t-tests have greater statistical power than the same probabilities from concentration-based t-tests. We add evidence that the fluorescence response, unlike concentration values, doesn’t require censoring, and we show with respect to differential analysis on the fluorescence responses that background correction is not required. PMID:27243383
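
    The 5PL (five-parameter logistic) calibration curve mentioned above, which maps concentration to fluorescence, can be written down directly; its inverse also shows why censoring arises, since fluorescence values outside the asymptotes have no concentration image. Parameter names follow common immunoassay conventions, and the values used in testing are purely illustrative.

```python
def five_pl(x, a, b, c, d, g):
    """5-parameter logistic response at concentration x:
    a = response at zero concentration, d = asymptotic response at
    infinite concentration, c = inflection concentration, b = slope,
    g = asymmetry."""
    return d + (a - d) / (1.0 + (x / c) ** b) ** g

def five_pl_inverse(y, a, b, c, d, g):
    """Concentration recovered from an observed fluorescence y.
    Undefined (i.e. censored) when y falls outside the open interval
    between the asymptotes a and d -- the problem the abstract
    reports for low-abundance analytes."""
    return c * (((a - d) / (y - d)) ** (1.0 / g) - 1.0) ** (1.0 / b)
```

    Working directly on the fluorescence scale, as the authors advocate, sidesteps this inversion and hence the censoring of responses near the asymptotes.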

  13. PRODIGEN: visualizing the probability landscape of stochastic gene regulatory networks in state and time space.

    PubMed

    Ma, Chihua; Luciani, Timothy; Terebus, Anna; Liang, Jie; Marai, G Elisabeta

    2017-02-15

    Visualizing the complex probability landscape of stochastic gene regulatory networks can further biologists' understanding of phenotypic behavior associated with specific genes. We present PRODIGEN (PRObability DIstribution of GEne Networks), a web-based visual analysis tool for the systematic exploration of probability distributions over simulation time and state space in such networks. PRODIGEN was designed in collaboration with bioinformaticians who research stochastic gene networks. The analysis tool combines in a novel way existing, expanded, and new visual encodings to capture the time-varying characteristics of probability distributions: spaghetti plots over one dimensional projection, heatmaps of distributions over 2D projections, enhanced with overlaid time curves to display temporal changes, and novel individual glyphs of state information corresponding to particular peaks. We demonstrate the effectiveness of the tool through two case studies on the computed probabilistic landscape of a gene regulatory network and of a toggle-switch network. Domain expert feedback indicates that our visual approach can help biologists: 1) visualize probabilities of stable states, 2) explore the temporal probability distributions, and 3) discover small peaks in the probability landscape that have potential relation to specific diseases.

  14. The Animism Controversy Revisited: A Probability Analysis

    ERIC Educational Resources Information Center

    Smeets, Paul M.

    1973-01-01

    Considers methodological issues surrounding the Piaget-Huang controversy. A probability model, based on the difference between the expected and observed animistic and deanimistic responses is applied as an improved technique for the assessment of animism. (DP)

  15. Increased whole-body auditory startle reflex and autonomic reactivity in children with anxiety disorders

    PubMed Central

    Bakker, Mirte J.; Tijssen, Marina A.J.; van der Meer, Johan N.; Koelman, Johannes H.T.M.; Boer, Frits

    2009-01-01

    Background Young patients with anxiety disorders are thought to have a hypersensitive fear system, including alterations of the early sensorimotor processing of threatening information. However, there is equivocal support in auditory blink response studies for an enlarged auditory startle reflex (ASR) in such patients. We sought to investigate the ASR measured over multiple muscles (whole-body) in children and adolescents with anxiety disorders. Methods Between August and December 2006, we assessed ASRs (elicited by 8 consecutive tones of 104 dB, interstimulus interval of about 2 min) in 25 patients and 25 matched controls using a case–control design and in 9 nonaffected siblings. We recorded the electromyographic activity of 6 muscles and the sympathetic skin response. We investigated response occurrence (probability %) and response magnitude (area under the curve in μV × ms) of the combined response of 6 muscles and of the single blink response. Results In patients (17 girls, mean age 12 years; 13 social phobia, 9 generalized anxiety, 3 other), the combined response probability (p = 0.027) of all muscles, the combined area under the curve of all muscles (p = 0.011) and the sympathetic skin response (p = 0.006) were enlarged compared with matched controls. The response probability (p = 0.48) and area under the curve (p = 0.07) of the blink response were normal in patients compared with controls. The ASR pattern was normal with normal latencies in patients compared with controls. In nonaffected siblings, the sympathetic skin response (p = 0.038), but not the combined response probability of all muscles (p = 0.15), was enlarged compared with controls. Limitations Limitations are the sample size and restricted comparison to the psychophysiological ASR paradigm. Conclusion The results point toward a hypersensitive central nervous system (fear system), including early sensorimotor processing alterations and autonomic hyperreactivity. 
The multiple muscle (whole-body) ASR is suggested to be a better tool to detect ASR abnormalities in patients with anxiety disorders than the blink response alone. Abnormalities in ASR serve as a candidate endophenotype of anxiety disorders. PMID:19568483

  16. Conditional survival in patients with chronic myeloid leukemia in chronic phase in the era of tyrosine kinase inhibitors.

    PubMed

    Sasaki, Koji; Kantarjian, Hagop M; Jain, Preetesh; Jabbour, Elias J; Ravandi, Farhad; Konopleva, Marina; Borthakur, Gautam; Takahashi, Koichi; Pemmaraju, Naveen; Daver, Naval; Pierce, Sherry A; O'Brien, Susan M; Cortes, Jorge E

    2016-01-15

    Tyrosine kinase inhibitors (TKIs) significantly improve survival in patients with chronic myeloid leukemia in chronic phase (CML-CP). Conditional probability provides survival information in patients who have already survived for a specific period of time after treatment. Cumulative response and survival data from 6 consecutive frontline TKI clinical trials were analyzed. Conditional probability was calculated for failure-free survival (FFS), transformation-free survival (TFS), event-free survival (EFS), and overall survival (OS) according to depth of response within 1 year of the initiation of TKIs, including complete cytogenetic response, major molecular response, and molecular response with a 4-log or 4.5-log reduction. A total of 483 patients with a median follow-up of 99.4 months from the initiation of treatment with TKIs were analyzed. Conditional probabilities of FFS, TFS, EFS, and OS for 1 additional year for patients alive after 12 months of therapy ranged from 92.0% to 99.1%, 98.5% to 100%, 96.2% to 99.6%, and 96.8% to 99.7%, respectively. Conditional FFS for 1 additional year did not improve with a deeper response each year. Conditional probabilities of TFS, EFS, and OS for 1 additional year were maintained at >95% during the period. In the era of TKIs, patients with chronic myeloid leukemia in chronic phase who survived for a certain number of years maintained excellent clinical outcomes in each age group. Cancer 2016;122:238-248. © 2015 American Cancer Society.
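
    Conditional survival as used above reduces to a ratio of survival probabilities: the chance of surviving s additional years, given survival to year t, is S(t+s)/S(t). A minimal sketch, with a made-up survival table rather than the trial's data:

```python
def conditional_survival(surv, t, s):
    """P(alive at year t+s | alive at year t), computed from a
    tabulated survival function surv mapping year -> S(year)."""
    return surv[t + s] / surv[t]

# Illustrative (fabricated) overall-survival table, year -> S(year).
example_surv = {0: 1.00, 1: 0.97, 2: 0.95, 3: 0.94}
```

    Because S(t) <= 1, conditioning on survival to year t can only raise the estimate relative to the unconditional probability, which is why the conditional figures reported above stay so high year after year.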

  17. Setting Time Limits on Tests

    ERIC Educational Resources Information Center

    van der Linden, Wim J.

    2011-01-01

    It is shown how the time limit on a test can be set to control the probability of a test taker running out of time before completing it. The probability is derived from the item parameters in the lognormal model for response times. Examples of curves representing the probability of running out of time on a test with given parameters as a function…
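
    Under the lognormal response-time model, the probability of running out of time follows from the distribution of the sum of the item times, which is easy to estimate by Monte Carlo. The parameterization below (log time on item i normal with mean beta_i − tau and standard deviation 1/alpha_i, where tau is the test taker's speed) follows the lognormal model; the item parameters used in testing are illustrative, not calibrated values.

```python
import math
import random

def prob_run_out(alphas, betas, speed, limit, trials=20000, seed=0):
    """Monte Carlo estimate of the probability that a test taker of
    the given speed (tau) exceeds the time limit.  Under the
    lognormal response-time model, log time on item i is normal with
    mean betas[i] - speed and standard deviation 1/alphas[i]."""
    rng = random.Random(seed)
    over = 0
    for _ in range(trials):
        total = sum(math.exp(rng.gauss(b - speed, 1.0 / a))
                    for a, b in zip(alphas, betas))
        if total > limit:
            over += 1
    return over / trials
```

    Inverting this relationship, i.e. searching for the smallest limit at which the estimated probability drops below a tolerance, gives the time limit the abstract describes.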

  18. Scattering of ultrashort electromagnetic pulses on metal clusters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Astapenko, V. A., E-mail: astval@mail.ru; Sakhno, S. V.

    We have calculated and analyzed the probability of ultrashort electromagnetic pulse (USP) scattering on small metal clusters in the frequency range of plasmon resonances during the field action. The main attention is devoted to dependence of the probability of scattering on the pulse duration for various detunings of the USP carrier frequency from the plasmon resonance frequency. Peculiarities of the USP scattering from plasmon resonances with various figures of merit are revealed.

  19. Scattering of ultrashort electromagnetic pulses on metal clusters

    NASA Astrophysics Data System (ADS)

    Astapenko, V. A.; Sakhno, S. V.

    2016-12-01

    We have calculated and analyzed the probability of ultrashort electromagnetic pulse (USP) scattering on small metal clusters in the frequency range of plasmon resonances during the field action. The main attention is devoted to dependence of the probability of scattering on the pulse duration for various detunings of the USP carrier frequency from the plasmon resonance frequency. Peculiarities of the USP scattering from plasmon resonances with various figures of merit are revealed.

  20. Reconstruction of Porous Media with Multiple Solid Phases

    PubMed

    Losic; Thovert; Adler

    1997-02-15

    A process is proposed to generate three-dimensional multiphase porous media with fixed phase probabilities and an overall correlation function. By varying the parameters, a specific phase can be located either at the interface between two phases or within a single phase. When the interfacial phase has a relatively small probability, its shape can be chosen as granular or lamellar. The influence of a third phase on the macroscopic conductivity of a medium is illustrated.
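
    One common way to realize a multiphase medium with fixed phase probabilities, sketched here with standard-library tools only, is to threshold a spatially correlated Gaussian field at the quantiles that yield the requested phase fractions. This is a generic stand-in for the construction in the article, not its actual algorithm; grid size, smoothing radius and phase probabilities are illustrative.

```python
import random
import statistics

def three_phase_medium(n, p=(0.5, 0.3, 0.2), smooth=2, seed=0):
    """Generate an n x n three-phase medium (cells labelled 0, 1, 2)
    by thresholding a locally averaged Gaussian field at the
    quantiles giving the requested phase probabilities p."""
    rng = random.Random(seed)
    noise = [[rng.gauss(0, 1) for _ in range(n)] for _ in range(n)]
    # Moving-average smoothing imposes a short-range correlation.
    field = [[statistics.fmean(
                  noise[(i + di) % n][(j + dj) % n]
                  for di in range(-smooth, smooth + 1)
                  for dj in range(-smooth, smooth + 1))
              for j in range(n)] for i in range(n)]
    flat = sorted(v for row in field for v in row)
    t1 = flat[int(p[0] * n * n)]            # phase-0 / phase-1 cut
    t2 = flat[int((p[0] + p[1]) * n * n)]   # phase-1 / phase-2 cut
    return [[0 if v <= t1 else 1 if v <= t2 else 2 for v in row]
            for row in field]
```

    Varying the smoothing radius changes the correlation length, and hence whether the low-probability interfacial phase appears granular or lamellar, in the spirit of the parameter variation the abstract describes.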

  1. DINKEY LAKES ROADLESS AREA, CALIFORNIA.

    USGS Publications Warehouse

    Dodge, F.C.W.; Federspiel, F.E.

    1984-01-01

    The Dinkey Lakes Roadless Area occupies an area of about 184 sq mi on the western slope of the Sierra Nevada, California. The results of a mineral survey show that parts of the area have substantiated resource potential for tungsten and marble and probable resource potential for quartz crystal gemstones. A probable resource potential for geothermal energy exists in one small area. No potential for other metallic mineral or energy resources was identified in this study.

  2. The price of complexity in financial networks

    PubMed Central

    May, Robert M.; Roukny, Tarik; Stiglitz, Joseph E.

    2016-01-01

    Financial institutions form multilayer networks by engaging in contracts with each other and by holding exposures to common assets. As a result, the default probability of one institution depends on the default probability of all of the other institutions in the network. Here, we show how small errors on the knowledge of the network of contracts can lead to large errors in the probability of systemic defaults. From the point of view of financial regulators, our findings show that the complexity of financial networks may decrease the ability to mitigate systemic risk, and thus it may increase the social cost of financial crises. PMID:27555583

  3. Counterfactuality of ‘counterfactual’ communication

    NASA Astrophysics Data System (ADS)

    Vaidman, L.

    2015-11-01

    The counterfactuality of the recently proposed protocols for direct quantum communication is analyzed. It is argued that the protocols can be counterfactual only for one value of the transmitted bit. The protocols achieve a reduced probability of detection of the particle in the transmission channel by increasing the number of paths in the channel. However, this probability is not lower than the probability of detecting a particle actually passing through such a multi-path channel, which was found to be surprisingly small. The relation between security and counterfactuality of the protocols is discussed. An analysis of counterfactuality of the protocols in the framework of the Bohmian interpretation is performed.

  4. Kullback-Leibler information function and the sequential selection of experiments to discriminate among several linear models

    NASA Technical Reports Server (NTRS)

    Sidik, S. M.

    1972-01-01

    The error variance of the process, the prior multivariate normal distributions of the model parameters, and the prior probabilities that each model is correct are assumed to be specified. A rule for termination of sampling is proposed. Upon termination, the model with the largest posterior probability is chosen as correct. If sampling is not terminated, posterior probabilities of the models and posterior distributions of the parameters are computed. An experiment is then chosen to maximize the expected Kullback-Leibler information function. Monte Carlo simulation experiments were performed to investigate large and small sample behavior of the sequential adaptive procedure.
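
    The two ingredients of the procedure, the Bayes update of model probabilities after each observation and the Kullback-Leibler information used to score candidate experiments, can be sketched directly. This is a generic formulation, not the paper's exact notation.

```python
import math

def posterior_model_probs(priors, likelihoods):
    """Bayes update of model probabilities after an observation y:
    p(M_i | y) is proportional to p(y | M_i) * p(M_i)."""
    joint = [p * l for p, l in zip(priors, likelihoods)]
    z = sum(joint)
    return [j / z for j in joint]

def kl_divergence(p, q):
    """Kullback-Leibler information of distribution p relative to q,
    the quantity maximized (in expectation) when selecting the next
    experiment in the sequential procedure."""
    return sum(pi * math.log(pi / qi)
               for pi, qi in zip(p, q) if pi > 0)
```

    Sampling terminates once one model's posterior probability clears a chosen threshold; until then, the next experiment is the one whose expected KL information between the competing predictive distributions is largest.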

  5. A cat's tale: the impact of genetic restoration on Florida panther population dynamics and persistence.

    PubMed

    Hostetler, Jeffrey A; Onorato, David P; Jansen, Deborah; Oli, Madan K

    2013-05-01

    1. Genetic restoration has been suggested as a management tool for mitigating detrimental effects of inbreeding depression in small, inbred populations, but the demographic mechanisms underlying population-level responses to genetic restoration remain poorly understood. 2. We studied the dynamics and persistence of the endangered Florida panther Puma concolor coryi population and evaluated the potential influence of genetic restoration on population growth and persistence parameters. As part of the genetic restoration programme, eight female Texas pumas P. c. stanleyana were released into Florida panther habitat in southern Florida in 1995. 3. The overall asymptotic population growth rate (λ) was 1.04 (5th and 95th percentiles: 0.95-1.14), suggesting an increase in the panther population of approximately 4% per year. Considering the effects of environmental and demographic stochasticities and density-dependence, the probability that the population will fall below 10 panthers within 100 years was 0.072 (0-0.606). 4. Our results suggest that the population would have declined at 5% per year (λ = 0.95; 0.83-1.08) in the absence of genetic restoration. Retrospective life table response experiment analysis revealed that the positive effect of genetic restoration on survival of kittens was primarily responsible for the substantial growth of the panther population that would otherwise have been declining. 5. For comparative purposes, we also estimated probability of quasi-extinction under two scenarios - implementation of genetic restoration and no genetic restoration initiative - using the estimated abundance of panthers in 1995, the year genetic restoration was initiated. 
Assuming no density-dependence, the probability that the panther population would fall below 10 panthers by 2010 was 0.098 (0.002-0.332) for the restoration scenario and 0.445 (0.032-0.944) for the no restoration scenario, providing further evidence that the panther population would have faced a substantially higher risk of extinction if the genetic restoration initiative had not been implemented. 6. Our results, along with those reporting increases in population size and improvements in biomedical correlates of inbreeding depression, provide strong evidence that genetic restoration substantially contributed to the observed increases in the Florida panther population. © 2012 The Authors. Journal of Animal Ecology © 2012 British Ecological Society.

  6. Interference of plant volatiles on pheromone receptor neurons of male Grapholita molesta (Lepidoptera: Tortricidae).

    PubMed

    Ammagarahalli, Byrappa; Gemeno, César

    2015-10-01

    In moths, sex pheromone components are detected by pheromone-specific olfactory receptor neurons (ph-ORNs) housed in sensilla trichodea in the male antennae. In Grapholita molesta, ph-ORNs are highly sensitive and specific to the individual sex pheromone components, and thus help in the detection and discrimination of the unique conspecific pheromone blend. Plant odors interspersed with a sub-optimal pheromone dose are reported to increase male moth attraction. To determine if the behavioral synergism of pheromone and plant odors starts at the ph-ORN level, single sensillum recordings were performed on Z8-12:Ac and E8-12:Ac ph-ORNs (Z-ORNs and E-ORNs, respectively) stimulated with pheromone-plant volatile mixtures. First, biologically meaningful plant-volatile doses were determined by recording the response of plant-specific ORNs housed in sensilla auricillica and trichodea to several plant odorants. This exploration provided a first glance at plant ORNs in this species. Then, using these plant volatile doses, we found that the spontaneous activity of ph-ORNs was not affected by the stimulation with plant volatiles, but that a binary mixture of sex pheromone and plant odorants resulted in a small (about 15%), dose-independent, but statistically significant, reduction in the spike frequency of Z-ORNs with respect to stimulation with Z8-12:Ac alone. The response of E-ORNs to a combination of E8-12:Ac and plant volatiles was not different from E8-12:Ac alone. We argue that the small inhibition of Z-ORNs caused by physiologically realistic plant volatile doses is probably not fully responsible for the observed behavioral synergism of pheromone and plant odors. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. Contrasting support for alternative models of genomic variation based on microhabitat preference: species-specific effects of climate change in alpine sedges.

    PubMed

    Massatti, Rob; Knowles, L Lacey

    2016-08-01

    Deterministic processes may uniquely affect codistributed species' phylogeographic patterns such that discordant genetic variation among taxa is predicted. Yet, explicitly testing expectations of genomic discordance in a statistical framework remains challenging. Here, we construct spatially and temporally dynamic models to investigate the hypothesized effect of microhabitat preferences on the permeability of glaciated regions to gene flow in two closely related montane species. Utilizing environmental niche models from the Last Glacial Maximum and the present to inform demographic models of changes in habitat suitability over time, we evaluate the relative probabilities of two alternative models using approximate Bayesian computation (ABC) in which glaciated regions are either (i) permeable or (ii) a barrier to gene flow. Results based on the fit of the empirical data to data sets simulated using a spatially explicit coalescent under alternative models indicate that genomic data are consistent with predictions about the hypothesized role of microhabitat in generating discordant patterns of genetic variation among the taxa. Specifically, a model in which glaciated areas acted as a barrier was much more probable based on patterns of genomic variation in Carex nova, a wet-adapted species. However, in the dry-adapted Carex chalciolepis, the permeable model was more probable, although the difference in the support of the models was small. This work highlights how statistical inferences can be used to distinguish deterministic processes that are expected to result in discordant genomic patterns among species, including species-specific responses to climate change. © 2016 John Wiley & Sons Ltd.
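    The ABC model comparison described above can be sketched in its simplest rejection form. This is a generic illustration, not the study's spatially explicit coalescent pipeline; the two toy "models" and their summary-statistic distributions are entirely hypothetical:

```python
import random

def abc_model_choice(observed, simulators, n_sims=20000, tol=0.5, seed=7):
    """Rejection-ABC model comparison: simulate from each model under equal
    prior weight, keep draws whose summary statistic falls within `tol` of
    the observed value, and use acceptance counts as relative model support."""
    rng = random.Random(seed)
    names = list(simulators)
    accepted = {name: 0 for name in names}
    for _ in range(n_sims):
        name = rng.choice(names)
        if abs(simulators[name](rng) - observed) < tol:
            accepted[name] += 1
    total = sum(accepted.values()) or 1
    return {name: count / total for name, count in accepted.items()}

# Hypothetical summary statistic: a "barrier" model predicts strong genetic
# differentiation (values near 3), a "permeable" model weak (values near 1).
models = {
    "barrier": lambda rng: rng.gauss(3.0, 1.0),
    "permeable": lambda rng: rng.gauss(1.0, 1.0),
}
support = abc_model_choice(observed=2.8, simulators=models)
```

    The relative acceptance rates approximate the posterior model probabilities, which is the quantity the study compares between the barrier and permeable scenarios.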

  8. A study on the sensitivity of self-powered neutron detectors (SPNDs)

    NASA Astrophysics Data System (ADS)

    Lee, Wanno; Cho, Gyuseong; Kim, Kwanghyun; Kim, Hee Joon; Choi, Yuseon; Park, Moon Chu; Kim, Soongpyung

    2001-08-01

    Self-powered neutron detectors (SPNDs) are widely used in reactors to monitor neutron flux. Although they offer several advantages, such as small size and the relatively simple electronics required for their operation, they also have intrinsic problems, namely a low output current, a slow response time, and a rapid change of sensitivity, that make long-term use difficult. Monte Carlo simulation was used to calculate the escape probability as a function of the birth position of the emitted beta particle for the geometry of rhodium-based SPNDs. A simple numerical method was used to calculate the initial generation rate of beta particles and the change in that rate due to rhodium burnup. Combining the simulation results with this numerical method, the burnup profile of the rhodium number density and the neutron sensitivity were calculated as functions of burnup time in the reactor. The method was verified by comparison with other published results and with data on the initial sensitivity from YGN 3,4 (Young Gwang Nuclear plant units 3 and 4). In addition, to improve some properties of the rhodium-based SPNDs currently in use, a modified geometry is proposed. The proposed tube-type geometry increases the initial sensitivity by increasing the escape probability. The escape probability was calculated for a range of insulator thicknesses, and the solid-type and tube-type designs were compared at each thickness. The method used here can be applied to the analysis and design of other types of SPNDs.
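    The escape-probability calculation above can be illustrated with a deliberately simplified Monte Carlo. This sketch reduces the emitter to a 2D disc cross-section and replaces beta transport with a single exponential attenuation length; all dimensions are hypothetical, not rhodium SPND values:

```python
import math
import random

def escape_probability(radius_cm, atten_cm, trials=100000, seed=3):
    """Toy Monte Carlo for the chance that a beta particle born inside a
    solid cylindrical emitter (2D cross-section) escapes: sample a uniform
    birth point and an isotropic direction, find the straight-line distance
    to the surface, and apply exponential attenuation over that distance."""
    rng = random.Random(seed)
    escapes = 0
    for _ in range(trials):
        r = radius_cm * math.sqrt(rng.random())   # uniform point in the disc
        theta = rng.uniform(0.0, 2.0 * math.pi)   # isotropic emission direction
        # distance from (r, 0) along (cos t, sin t) to the circle of radius R
        b = r * math.cos(theta)
        dist = -b + math.sqrt(b * b - r * r + radius_cm ** 2)
        if rng.random() < math.exp(-dist / atten_cm):
            escapes += 1
    return escapes / trials

# Hypothetical dimensions: a thinner emitter loses fewer betas in transit.
p_thin = escape_probability(radius_cm=0.02, atten_cm=0.05)
p_thick = escape_probability(radius_cm=0.10, atten_cm=0.05)
```

    The same logic explains the proposed tube-type geometry: removing emitter material from the interior shortens typical path lengths to a surface and so raises the escape probability.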

  9. The proton and helium anomalies in the light of the Myriad model

    NASA Astrophysics Data System (ADS)

    Salati, Pierre; Génolini, Yoann; Serpico, Pasquale; Taillet, Richard

    2017-03-01

    A hardening of the proton and helium fluxes is observed above a few hundreds of GeV/nuc. The distribution of local sources of primary cosmic rays has been suggested as a potential solution to this puzzling behavior. Some authors even claim that a single source is responsible for the observed anomalies. But how probable are these explanations? To answer that question, our current description of cosmic ray Galactic propagation needs to be replaced by the Myriad model. In the former approach, sources of protons and helium nuclei are treated as a jelly continuously spread over space and time. A more accurate description is provided by the Myriad model, where sources are considered as point-like events. This leads to a probabilistic derivation of the fluxes of primary species and opens the possibility that larger-than-average values may be observed at the Earth. For a long time, though, a major obstacle has been the infinite variance associated with the probability distribution function that the fluxes follow. Several suggestions have been made to cure this problem, but none is entirely satisfactory. We go a step further here and solve the infinite variance problem of the Myriad model by making use of the generalized central limit theorem. We find that primary fluxes are distributed according to a stable law with a heavy tail, well known to financial analysts. The probability that the proton and helium anomalies are sourced by local SNRs can then be calculated. The p-values associated with the CREAM measurements turn out to be small, unless somewhat unrealistic propagation parameters are assumed.

  10. Probabilistic treatment of the uncertainty from the finite size of weighted Monte Carlo data

    NASA Astrophysics Data System (ADS)

    Glüsenkamp, Thorsten

    2018-06-01

    Parameter estimation in HEP experiments often involves Monte Carlo simulation to model the experimental response function. A typical application are forward-folding likelihood analyses with re-weighting, or time-consuming minimization schemes with a new simulation set for each parameter value. Problematically, the finite size of such Monte Carlo samples carries intrinsic uncertainty that can lead to a substantial bias in parameter estimation if it is neglected and the sample size is small. We introduce a probabilistic treatment of this problem by replacing the usual likelihood functions with novel generalized probability distributions that incorporate the finite statistics via suitable marginalization. These new PDFs are analytic, and can be used to replace the Poisson, multinomial, and sample-based unbinned likelihoods, which covers many use cases in high-energy physics. In the limit of infinite statistics, they reduce to the respective standard probability distributions. In the general case of arbitrary Monte Carlo weights, the expressions involve the fourth Lauricella function FD, for which we find a new finite-sum representation in a certain parameter setting. The result also represents an exact form for Carlson's Dirichlet average Rn with n > 0, and thereby an efficient way to calculate the probability generating function of the Dirichlet-multinomial distribution, the extended divided difference of a monomial, or arbitrary moments of univariate B-splines. We demonstrate the bias reduction of our approach with a typical toy Monte Carlo problem, estimating the normalization of a peak in a falling energy spectrum, and compare the results with previously published methods from the literature.

  11. Soil texture and granulometry at the surface of Mars

    NASA Technical Reports Server (NTRS)

    Dollfus, Audouin; Deschamps, Marc; Zimbelman, James R.

    1993-01-01

    Attention is given to a characterization of the physical behavior of the Martian upper surface in its first few decimeters on the basis of mutual relationships between three parameters: the linear polarization of the reflected light, the visual albedo, and the thermal inertia. Polarimetric scans raked a strip covering two contrasting regions, the dark-hued Mare Erythraeum, and the light-hued Thaumasia. Erythraeum is characterized everywhere by a uniform polarization response, despite the large geomorphological diversity of the surface. A ubiquitous coating or mantling with small dark grains of albedo 12.7 percent, with a radius of 10 to 20 microns, is indicated. Thaumasia exhibits a large variety of soil properties. A typical location with albedo of 16.3 percent has a surface covered with orange grains, probably very dispersed in size, for which the largest grains are 20 to 40 microns.

  12. Exposure to an extremely low-frequency electromagnetic field only slightly modifies the proteome of Chromobacterium violaceumATCC 12472

    PubMed Central

    Baraúna, Rafael A.; Santos, Agenor V.; Graças, Diego A.; Santos, Daniel M.; Ghilardi, Rubens; Pimenta, Adriano M. C.; Carepo, Marta S. P.; Schneider, Maria P.C.; Silva, Artur

    2015-01-01

    Several studies of the physiological responses of different organisms exposed to extremely low-frequency electromagnetic fields (ELF-EMF) have been described. In this work, we report the minimal effects of in situ exposure to ELF-EMF on the global protein expression of Chromobacterium violaceum using a gel-based proteomic approach. The protein expression profile was only slightly altered, with five differentially expressed proteins detected in the exposed cultures; two of these proteins (DNA-binding stress protein, Dps, and alcohol dehydrogenase) were identified by MS/MS. The enhanced expression of Dps possibly helped to prevent physical damage to DNA. Although small, the changes in protein expression observed here were probably beneficial in helping the bacteria to adapt to the stress generated by the electromagnetic field. PMID:26273227

  13. A search for diffuse band profile variations in the rho Ophiuchi cloud

    NASA Technical Reports Server (NTRS)

    Snow, T. P.; Timothy, J. G.; Sear, S.

    1982-01-01

    High signal-to-noise profiles of the broad diffuse interstellar band at 4430 A were obtained on the 2.2-m telescope at the Mauna Kea Observatory, using the newly-developed pulse-counting multi-anode microchannel array detector system in an effort to determine whether the band profile varies with mean grain size as expected if the band is produced by absorbers embedded in grain lattices. The lack of profile variability over several lines of sight where independent evidence indicates that the mean grain size varies shows that lambda 4430 is probably not formed by the same grains that are responsible for interstellar extinction at visible wavelengths. The possibility that this band is created by a population of very small ( approximately 100 A) grains is still viable, as is the hypothesis that it has a molecular origin.

  14. The Pitman-Yor Process and an Empirical Study of Choice Behavior

    NASA Astrophysics Data System (ADS)

    Hisakado, Masato; Sano, Fumiaki; Mori, Shintaro

    2018-02-01

    This study discusses choice behavior using a voting model in which voters can obtain information from a finite number of previous r voters. Voters vote for a candidate with a probability proportional to the previous vote ratio, which is visible to the voters. We obtain the Pitman sampling formula as the equilibrium distribution of r votes. We present the model as a process of posting on a bulletin board system, 2ch.net, where users can choose one of many threads to create a post. We explore how this choice depends on the last r posts and the distribution of these last r posts across threads. We conclude that the posting process is described by our voting model with analog herders for a small r, which might correspond to the time horizon of users' responses.

  15. Pure perceptual-based learning of second-, third-, and fourth-order sequential probabilities.

    PubMed

    Remillard, Gilbert

    2011-07-01

    There is evidence that sequence learning in the traditional serial reaction time task (SRTT), where target location is the response dimension, and sequence learning in the perceptual SRTT, where target location is not the response dimension, are handled by different mechanisms. The ability of the latter mechanism to learn sequential contingencies that can be learned by the former mechanism was examined. Prior research has established that people can learn second-, third-, and fourth-order probabilities in the traditional SRTT. The present study reveals that people can learn such probabilities in the perceptual SRTT. This suggests that the two mechanisms may have similar architectures. A possible neural basis of the two mechanisms is discussed.

  16. Simple artificial neural networks that match probability and exploit and explore when confronting a multiarmed bandit.

    PubMed

    Dawson, Michael R W; Dupuis, Brian; Spetch, Marcia L; Kelly, Debbie M

    2009-08-01

    The matching law (Herrnstein 1961) states that response rates become proportional to reinforcement rates; this is related to the empirical phenomenon called probability matching (Vulkan 2000). Here, we show that a simple artificial neural network generates responses consistent with probability matching. This behavior was then used to create an operant procedure for network learning. We use the multiarmed bandit (Gittins 1989), a classic problem of choice behavior, to illustrate that operant training balances exploiting the bandit arm expected to pay off most frequently with exploring other arms. Perceptrons provide a medium for relating results from neural networks, genetic algorithms, animal learning, contingency theory, reinforcement learning, and theories of choice.

  17. New insights in dehydration stress behavior of two maize hybrids using advanced distributed reactivity model (DRM). Responses to the impact of 24-epibrassinolide

    PubMed Central

    Janković, Bojan; Janković, Marija; Nikolić, Bogdan; Dimkić, Ivica; Lalević, Blažo; Raičević, Vera

    2017-01-01

    Proposed distributed reactivity model of dehydration for seedling parts of two various maize hybrids (ZP434, ZP704) was established. Dehydration stresses were induced thermally, which is also accompanied by response of hybrids to heat stress. It was found that an increased value of activation energy counterparts within radicle dehydration of ZP434, with a high concentration of 24-epibrassinolide (24-EBL) at elevated operating temperatures, probably causes activation of diffusion mechanisms in cutin network and may increases likelihood of formation of free volumes, large enough to accommodate diffusing molecule. Many small random effects were detected and can be correlated with micro-disturbing in a space filled with water caused by thermal gradients, increasing capillary phenomena, and which can induce thermo-capillary migration. The influence of seedling content of various sugars and minerals on dehydration was also examined. Estimated distributed reactivity models indicate a dependence of reactivity on structural arrangements, due to present interactions between water molecules and chemical species within the plant. PMID:28644899

  18. New insights in dehydration stress behavior of two maize hybrids using advanced distributed reactivity model (DRM). Responses to the impact of 24-epibrassinolide.

    PubMed

    Waisi, Hadi; Janković, Bojan; Janković, Marija; Nikolić, Bogdan; Dimkić, Ivica; Lalević, Blažo; Raičević, Vera

    2017-01-01

    Proposed distributed reactivity model of dehydration for seedling parts of two various maize hybrids (ZP434, ZP704) was established. Dehydration stresses were induced thermally, which is also accompanied by response of hybrids to heat stress. It was found that an increased value of activation energy counterparts within radicle dehydration of ZP434, with a high concentration of 24-epibrassinolide (24-EBL) at elevated operating temperatures, probably causes activation of diffusion mechanisms in cutin network and may increases likelihood of formation of free volumes, large enough to accommodate diffusing molecule. Many small random effects were detected and can be correlated with micro-disturbing in a space filled with water caused by thermal gradients, increasing capillary phenomena, and which can induce thermo-capillary migration. The influence of seedling content of various sugars and minerals on dehydration was also examined. Estimated distributed reactivity models indicate a dependence of reactivity on structural arrangements, due to present interactions between water molecules and chemical species within the plant.

  19. Suboptimal Antituberculosis Drug Concentrations and Outcomes in Small and HIV-Coinfected Children in India: Recommendations for Dose Modifications.

    PubMed

    Guiastrennec, Benjamin; Ramachandran, Geetha; Karlsson, Mats O; Kumar, A K Hemanth; Bhavani, Perumal Kannabiran; Gangadevi, N Poorana; Swaminathan, Soumya; Gupta, Amita; Dooley, Kelly E; Savic, Radojka M

    2017-12-16

    This work aimed to evaluate the once-daily antituberculosis treatment as recommended by the new Indian pediatric guidelines. Isoniazid, rifampin, and pyrazinamide concentration-time profiles and treatment outcome were obtained from 161 Indian children with drug-sensitive tuberculosis undergoing thrice-weekly dosing as per previous Indian pediatric guidelines. The exposure-response relationships were established using a population pharmacokinetic-pharmacodynamic approach. Rifampin exposure was identified as the unique predictor of treatment outcome. Consequently, children with low body weight (4-7 kg) and/or HIV infection, who displayed the lowest rifampin exposure, were associated with the highest probability of unfavorable treatment (therapy failure, death) outcome (P unfavorable ). Model-based simulation of optimized (P unfavorable ≤ 5%) rifampin once-daily doses were suggested per treatment weight band and HIV coinfection status (33% and 190% dose increase, respectively, from the new Indian guidelines). The established dose-exposure-response relationship could be pivotal in the development of future pediatric tuberculosis treatment guidelines. © 2017, The Authors Clinical Pharmacology & Therapeutics published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.

  20. New drugs and new toxicities: pembrolizumab-induced myocarditis.

    PubMed

    Inayat, Faisal; Masab, Muhammad; Gupta, Sorab; Ullah, Waqas

    2018-01-23

    Pembrolizumab is an immune checkpoint inhibitor that significantly improves clinical outcomes in numerous solid organ malignancies. Despite successful therapeutic responses, this new drug comes with a constellation of adverse reactions. Herein, we chronicle the case of a patient with metastatic non-small-cell lung cancer treated with pembrolizumab. After two cycles, he developed new-onset dyspnoea on exertion. Electrocardiogram showed idioventricular rhythm with diffuse ST-segment elevations. Echocardiography revealed severe biventricular cardiac dysfunction. Based on diagnostic workup and exclusion of probable aetiologies, the patient was diagnosed with pembrolizumab-induced myocarditis. The treatment was initiated with corticosteroids and guideline-conform heart failure therapy. He demonstrated a marked clinical response with resolution of congestive heart failure symptoms. This article summarises the clinical evidence regarding the epidemiology, pathophysiology, clinical features, diagnostic modalities and management of patients with pembrolizumab-associated myocarditis. In addition, it highlights that programmed death receptor-1 inhibition can cause a spectrum of autoimmune adverse events requiring clinical monitoring and periodic screenings. © BMJ Publishing Group Ltd (unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  1. Gamasoidosis caused by the special lineage L1 of Dermanyssus gallinae (Acarina: Dermanyssidae): A case of heavy infestation in a public place in Italy.

    PubMed

    Pezzi, Marco; Leis, Marilena; Chicca, Milvia; Roy, Lise

    2017-10-01

    Among Gamasina (Acari: Mesostigmata) mites, some dermanyssoid species are known to cause gamasoidosis, a human dermatitis characterized by papulosquamous eruptions and urticarian lesions. We describe a case of mite infestation which occurred in public conference halls in Ferrara (Italy), affecting four people who attended the place and showed signs of gamasoidosis. The mites were collected and characterized using scanning electron microscopy, light microscopy and mitochondrial DNA sequencing (Cytochrome c oxidase subunit I partial CDS). Based on morphological and molecular data, the species responsible for the infestation was identified as the special lineage L1 of the poultry red mite, Dermanyssus gallinae (De Geer) (Acarina: Dermanissydae), a cryptic species known to be associated with pigeons. Rock doves, Columba livia Gmelin (Columbiformes: Columbidae) were roosting on the top of the public building, thus the mites probably gained access to the halls through small window openings. The present case report is the first one providing morpho-molecular identification of a D. gallinae cryptic species responsible of gamasoidosis in Italy. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. Does dietary insect meal affect the fish immune system? The case of mealworm, Tenebrio molitor on European sea bass, Dicentrarchus labrax.

    PubMed

    Henry, M A; Gasco, L; Chatzifotis, S; Piccolo, G

    2018-04-01

    Feeding small European sea bass, Dicentrarchus labrax, for 6 weeks with Tenebrio molitor larval meal showed significant anti-inflammatory responses (ceruloplasmin, myeloperoxidase and nitric oxide). Serum bacteriolytic activity against a Gram negative bacterium was not significantly affected by dietary Tenebrio, while both lysozyme antibacterial activity and serum trypsin inhibition usually linked to the anti-parasite activity of the fish, were significantly enhanced. The latter may be due to the similarities in the composition of the exoskeleton of parasites and insects that may therefore act as an immunostimulant potentially increasing the anti-parasitic activity. The addition of exogenous proteases significantly decreased both trypsin-inhibition and serum bacteriolytic activity probably through direct inhibition of the proteins responsible for these immune functions. Further investigation involving bacterial or parasitic challenges will be necessary to assess if the effects of dietary mealworm meal on the immune system observed in the present study are translated into an improved resistance to diseases. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. The method of UCN "small heating" measurement in the big gravitational spectrometer (BGS) and studies of this effect on Fomblin oil Y-HVAC 18/8

    NASA Astrophysics Data System (ADS)

    Nesvizhevsky, V. V.; Voronin, A. Yu.; Lambrecht, A.; Reynaud, S.; Lychagin, E. V.; Muzychka, A. Yu.; Nekhaev, G. V.; Strelkov, A. V.

    2018-02-01

    The Big Gravitational Spectrometer (BGS) takes advantage of the strong influence of the Earth's gravity on the motion of ultracold neutrons (UCNs) that makes it possible to shape and measure UCN spectra. We optimized the BGS to investigate the "small heating" of UCNs, that is, the inelastic reflection of UCNs from a surface accompanied by an energy change comparable with the initial UCN energy. UCNs whose energy increases are referred to as "Vaporized UCNs" (VUCNs). The BGS provides the narrowest UCN spectra of a few cm and the broadest "visible" VUCN energy range of up to ˜150 cm (UCN energy is given in units of its maximum height in the Earth's gravitational field, where 1.00 cm ≈ 1.02 neV). The dead-zone between the UCN and VUCN spectra is the narrowest ever achieved (a few cm). We performed measurements with and without samples without breaking vacuum. BGS provides the broadest range of temperatures (77-600 K) and the highest sensitivity to the small heating effect, up to ˜10-8 per bounce, i.e., two orders of magnitude higher than the sensitivity of alternative methods. We describe the method to measure the probability of UCN "small heating" using the BGS and illustrate it with a study of samples of the hydrogen-free oil Fomblin Y-HVAC 18/8. The data obtained are well reproducible, do not depend on sample thickness, and do not evolve over time. The measured model-independent probability P+ of UCN small heating from an energy "mono-line" 30.2 ± 2.5 cm to the energy range 35-140 cm is in the range (1.05 ±0.02s t a t )×1 0-5-(1.31 ±0.24s t a t )×1 0-5 at a temperature of 24 °C. The associated systematic uncertainty would disappear if a VUCN spectrum shape were known, for instance, from a particular model of small heating. This experiment provides the most precise and reliable value of small heating probability on Fomblin measured so far. 
These results are of importance for studies of UCN small heating as well as for analyzing and designing neutron lifetime experiments.

  4. The method of UCN "small heating" measurement in the big gravitational spectrometer (BGS) and studies of this effect on Fomblin oil Y-HVAC 18/8.

    PubMed

    Nesvizhevsky, V V; Voronin, A Yu; Lambrecht, A; Reynaud, S; Lychagin, E V; Muzychka, A Yu; Nekhaev, G V; Strelkov, A V

    2018-02-01

    The Big Gravitational Spectrometer (BGS) takes advantage of the strong influence of the Earth's gravity on the motion of ultracold neutrons (UCNs) that makes it possible to shape and measure UCN spectra. We optimized the BGS to investigate the "small heating" of UCNs, that is, the inelastic reflection of UCNs from a surface accompanied by an energy change comparable with the initial UCN energy. UCNs whose energy increases are referred to as "Vaporized UCNs" (VUCNs). The BGS provides the narrowest UCN spectra of a few cm and the broadest "visible" VUCN energy range of up to ∼150 cm (UCN energy is given in units of its maximum height in the Earth's gravitational field, where 1.00 cm ≈ 1.02 neV). The dead-zone between the UCN and VUCN spectra is the narrowest ever achieved (a few cm). We performed measurements with and without samples without breaking vacuum. BGS provides the broadest range of temperatures (77-600 K) and the highest sensitivity to the small heating effect, up to ∼10 -8 per bounce, i.e., two orders of magnitude higher than the sensitivity of alternative methods. We describe the method to measure the probability of UCN "small heating" using the BGS and illustrate it with a study of samples of the hydrogen-free oil Fomblin Y-HVAC 18/8. The data obtained are well reproducible, do not depend on sample thickness, and do not evolve over time. The measured model-independent probability P + of UCN small heating from an energy "mono-line" 30.2 ± 2.5 cm to the energy range 35-140 cm is in the range 1.05±0.02 stat ×10 -5 -1.31±0.24 stat ×10 -5 at a temperature of 24 °C. The associated systematic uncertainty would disappear if a VUCN spectrum shape were known, for instance, from a particular model of small heating. This experiment provides the most precise and reliable value of small heating probability on Fomblin measured so far. 
These results are of importance for studies of UCN small heating as well as for analyzing and designing neutron lifetime experiments.

  5. Prospect evaluation as a function of numeracy and probability denominator.

    PubMed

    Millroth, Philip; Juslin, Peter

    2015-05-01

    This study examines how numeracy and probability denominator (a direct-ratio probability, a relative frequency with denominator 100, a relative frequency with denominator 10,000) affect the evaluation of prospects in an expected-value based pricing task. We expected that numeracy would affect the results due to differences in the linearity of number perception and the susceptibility to denominator neglect with different probability formats. An analysis with functional measurement verified that participants integrated value and probability into an expected value. However, a significant interaction between numeracy and probability format and subsequent analyses of the parameters of cumulative prospect theory showed that the manipulation of probability denominator changed participants' psychophysical response to probability and value. Standard methods in decision research may thus confound people's genuine risk attitude with their numerical capacities and the probability format used. Copyright © 2015 Elsevier B.V. All rights reserved.

  6. Exposure-Response Modeling to Characterize the Relationship Between Ixekizumab Serum Drug Concentrations and Efficacy Responses at Week 12 in Patients With Moderate to Severe Plaque Psoriasis.

    PubMed

    Chigutsa, Emmanuel; de Mendizabal, Nieves Velez; Chua, Laiyi; Heathman, Michael; Friedrich, Stuart; Jackson, Kimberley; Reich, Kristian

    2018-06-07

    Ixekizumab, a high-affinity monoclonal antibody, selectively targets interleukin-17A and has been shown to be efficacious in the treatment of moderate to severe psoriasis. The objective was to describe the relationship between ixekizumab concentrations and efficacy response (static Physician Global Assessment [sPGA] and the Psoriasis Activity and Severity Index [PASI) scores] after 12 weeks of ixekizumab treatment in psoriasis patients from 3 phase 3 studies. Data from 2888 psoriasis patients randomized to receive placebo or 80 mg ixekizumab every 2 weeks or every 4 weeks were analyzed. Separate logistic regression models describing the relationship between ixekizumab concentrations and sPGA or PASI scores at week 12 were used to determine the probability of patients achieving a response and to investigate the impact of various patient factors other than drug concentrations on response rates. Both dosing regimens were efficacious, with higher rates of response achieved with the higher range of observed ixekizumab concentrations after every-2-week dosing. Although higher bodyweight, palmoplantar involvement, lower baseline disease state, or high baseline C-reactive protein were associated with slightly lower response rates, the magnitude of effect of these factors on sPGA(0,1) response was small, with all subgroups able to achieve high levels of response. Other factors tested had no effect including age, sex, and antidrug antibody status. Logistic regression modeling of ixekizumab concentration and efficacy data accurately identified the proportion of responders using sPGA or PASI end points. The higher concentration ranges achieved with 80 mg every 2 weeks versus every 4 weeks were associated with higher response levels. © 2018, The American College of Clinical Pharmacology.

  7. Units of analysis and kinetic structure of behavioral repertoires

    PubMed Central

    Thompson, Travis; Lubinski, David

    1986-01-01

    It is suggested that molar streams of behavior are constructed of various arrangements of three elementary constituents (elicited, evoked, and emitted response classes). An eight-cell taxonomy is elaborated as a framework for analyzing and synthesizing complex behavioral repertoires based on these functional units. It is proposed that the local force binding functional units into a smoothly articulated kinetic sequence arises from temporally arranged relative response probability relationships. Behavioral integration is thought to reflect the joint influence of the organism's hierarchy of relative response probabilities, fluctuating biological states, and the arrangement of environmental and behavioral events in time. PMID:16812461

  8. A framework for conducting mechanistic based reliability assessments of components operating in complex systems

    NASA Astrophysics Data System (ADS)

    Wallace, Jon Michael

    2003-10-01

    Reliability prediction of components operating in complex systems has historically been conducted in a statistically isolated manner. Current physics-based, i.e. mechanistic, component reliability approaches focus more on component-specific attributes and mathematical algorithms and not enough on the influence of the system. The result is that significant error can be introduced into the component reliability assessment process. The objective of this study is the development of a framework that infuses the needs and influence of the system into the process of conducting mechanistic-based component reliability assessments. The formulated framework consists of six primary steps. The first three steps, identification, decomposition, and synthesis, are primarily qualitative in nature and employ system reliability and safety engineering principles to construct an appropriate starting point for the component reliability assessment. The following two steps are the most unique. They involve a step to efficiently characterize and quantify the system-driven local parameter space and a subsequent step using this information to guide the reduction of the component parameter space. The local statistical space quantification step is accomplished using two proposed multivariate probability models: Multi-Response First Order Second Moment and Taylor-Based Inverse Transformation. Where existing joint probability models require preliminary distribution and correlation information of the responses, these models combine statistical information of the input parameters with an efficient sampling of the response analyses to produce the multi-response joint probability distribution. Parameter space reduction is accomplished using Approximate Canonical Correlation Analysis (ACCA) employed as a multi-response screening technique. 
The novelty of this approach is that each individual local parameter and even subsets of parameters representing entire contributing analyses can now be rank ordered with respect to their contribution to not just one response, but the entire vector of component responses simultaneously. The final step of the framework is the actual probabilistic assessment of the component. Although the same multivariate probability tools employed in the characterization step can be used for the component probability assessment, variations of this final step are given to allow for the utilization of existing probabilistic methods such as response surface Monte Carlo and Fast Probability Integration. The overall framework developed in this study is implemented to assess the finite-element based reliability prediction of a gas turbine airfoil involving several failure responses. Results of this implementation are compared to results generated using the conventional 'isolated' approach as well as a validation approach conducted through large sample Monte Carlo simulations. The framework resulted in a considerable improvement to the accuracy of the part reliability assessment and an improved understanding of the component failure behavior. Considerable statistical complexity in the form of joint non-normal behavior was found and accounted for using the framework. Future applications of the framework elements are discussed.
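
The parameter-ranking idea behind this multi-response screening step can be sketched in miniature. The code below is an illustrative stand-in, not the thesis' ACCA formulation: it scores each input parameter by how much the total canonical association with the whole vector of responses drops when that parameter is removed. All names and the synthetic data are assumptions for illustration only.

```python
import numpy as np

def total_canonical_assoc(X, Y, eps=1e-9):
    """Sum of squared canonical correlations between the columns of X
    (input parameters) and Y (component responses)."""
    Xc = X - X.mean(0)
    Yc = Y - Y.mean(0)
    Sxx = Xc.T @ Xc + eps * np.eye(X.shape[1])
    Syy = Yc.T @ Yc + eps * np.eye(Y.shape[1])
    # Whiten both blocks; singular values are the canonical correlations.
    Kx = np.linalg.inv(np.linalg.cholesky(Sxx))
    Ky = np.linalg.inv(np.linalg.cholesky(Syy))
    s = np.linalg.svd(Kx @ (Xc.T @ Yc) @ Ky.T, compute_uv=False)
    return float(np.sum(np.minimum(s, 1.0) ** 2))

def rank_parameters(X, Y):
    """Rank inputs by the association lost when each one is removed,
    i.e. by contribution to the entire response vector at once."""
    base = total_canonical_assoc(X, Y)
    drop = [base - total_canonical_assoc(np.delete(X, j, 1), Y)
            for j in range(X.shape[1])]
    return sorted(range(X.shape[1]), key=lambda j: drop[j], reverse=True)

# Synthetic check: both responses are driven by parameters 0 and 1 only.
rng = np.random.default_rng(42)
X = rng.normal(size=(500, 5))
Y = np.column_stack([2 * X[:, 0] + rng.normal(size=500),
                     X[:, 1] + rng.normal(size=500)])
print(rank_parameters(X, Y))  # parameters 0 and 1 should rank first
```

The leave-one-out ranking is a blunt proxy for the canonical-correlation screening described above, but it shows the key property: a parameter is scored against all responses simultaneously, not one response at a time.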

  9. ERP Correlates of Verbal and Numerical Probabilities in Risky Choices: A Two-Stage Probability Processing View

    PubMed Central

    Li, Shu; Du, Xue-Lei; Li, Qi; Xuan, Yan-Hua; Wang, Yun; Rao, Li-Lin

    2016-01-01

Two kinds of probability expressions, verbal and numerical, have been used to characterize the uncertainty that people face. However, the question of whether verbal and numerical probabilities are cognitively processed in a similar manner remains unresolved. From a levels-of-processing perspective, verbal and numerical probabilities may be processed differently during early sensory processing but similarly in later semantic-associated operations. This event-related potential (ERP) study investigated the neural processing of verbal and numerical probabilities in risky choices. The results showed that verbal probability and numerical probability elicited different N1 amplitudes but that verbal and numerical probabilities elicited similar N2 and P3 waveforms in response to different levels of probability (high to low). These results were consistent with a levels-of-processing framework and suggest some internal consistency between the cognitive processing of verbal and numerical probabilities in risky choices. Our findings shed light on possible mechanisms underlying probability expression and may provide neural evidence to support the translation of verbal to numerical probabilities (or vice versa). PMID:26834612

  10. Genetics Home Reference: frontotemporal dementia with parkinsonism-17

    MedlinePlus

    ... more common than this estimate. FTDP-17 probably accounts for a small percentage of all cases of frontotemporal dementia. Related Information What information about a genetic condition can statistics ...

  11. Dynamic Encoding of Speech Sequence Probability in Human Temporal Cortex

    PubMed Central

    Leonard, Matthew K.; Bouchard, Kristofer E.; Tang, Claire

    2015-01-01

    Sensory processing involves identification of stimulus features, but also integration with the surrounding sensory and cognitive context. Previous work in animals and humans has shown fine-scale sensitivity to context in the form of learned knowledge about the statistics of the sensory environment, including relative probabilities of discrete units in a stream of sequential auditory input. These statistics are a defining characteristic of one of the most important sequential signals humans encounter: speech. For speech, extensive exposure to a language tunes listeners to the statistics of sound sequences. To address how speech sequence statistics are neurally encoded, we used high-resolution direct cortical recordings from human lateral superior temporal cortex as subjects listened to words and nonwords with varying transition probabilities between sound segments. In addition to their sensitivity to acoustic features (including contextual features, such as coarticulation), we found that neural responses dynamically encoded the language-level probability of both preceding and upcoming speech sounds. Transition probability first negatively modulated neural responses, followed by positive modulation of neural responses, consistent with coordinated predictive and retrospective recognition processes, respectively. Furthermore, transition probability encoding was different for real English words compared with nonwords, providing evidence for online interactions with high-order linguistic knowledge. These results demonstrate that sensory processing of deeply learned stimuli involves integrating physical stimulus features with their contextual sequential structure. Despite not being consciously aware of phoneme sequence statistics, listeners use this information to process spoken input and to link low-level acoustic representations with linguistic information about word identity and meaning. PMID:25948269

  12. Intrinsically shunted Josephson junctions for electronics applications

    NASA Astrophysics Data System (ADS)

    Belogolovskii, M.; Zhitlukhina, E.; Lacquaniti, V.; De Leo, N.; Fretto, M.; Sosso, A.

    2017-07-01

Conventional Josephson metal-insulator-metal devices are inherently underdamped and exhibit hysteretic current-voltage response due to a very high subgap resistance compared to that in the normal state. At the same time, overdamped junctions with single-valued characteristics are needed for most superconducting digital applications. The usual way to overcome the hysteretic behavior is to place an external low-resistance normal-metal shunt in parallel with each junction. Unfortunately, such a solution considerably complicates the circuit design and introduces parasitic inductance across the junction. This paper provides a concise overview of some generic approaches that have been proposed in order to realize internal shunting in Josephson heterostructures with a barrier that itself contains the desired resistive component. The main attention is paid to self-shunted devices whose local weak-link transmission probabilities are so strongly disordered in the interface plane that transmission is tiny across most of the transition region between the two superconducting electrodes, while a small part of the interface is highly transparent. We discuss the possibility of realizing a universal bimodal distribution function and emphasize the advantages of such junctions, which can be considered a new class of self-shunted Josephson devices promising for practical applications in superconducting electronics operating at 4.2 K.
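
Why a bimodal transmission distribution helps can be illustrated numerically. The sketch below is a toy model under stated assumptions, not the paper's analysis: each conduction channel contributes its transmission T to the normal-state conductance and T^2/(2-T)^2 to the zero-bias subgap (Andreev) conductance, the standard single-channel result. An ensemble of mostly opaque channels plus a small fraction of near-open ones then gives a subgap conductance comparable to the normal-state one, which is the self-shunting effect described above; the distribution parameters are invented for illustration.

```python
import numpy as np

def conductance_ratio(transmissions):
    """Ratio of zero-bias subgap (Andreev) to normal-state conductance:
    G_S / G_N = 2 * sum(T^2 / (2 - T)^2) / sum(T), summed over
    independent channels (single-channel zero-bias result)."""
    T = np.asarray(transmissions, dtype=float)
    return 2.0 * np.sum(T**2 / (2.0 - T)**2) / np.sum(T)

rng = np.random.default_rng(0)
n = 10_000
# Assumed bimodal distribution: 99% nearly opaque, 1% nearly open.
T = np.where(rng.random(n) < 0.99,
             rng.uniform(1e-4, 1e-3, n),
             rng.uniform(0.95, 1.0, n))

print(f"uniform tunnel barrier: {conductance_ratio(np.full(n, 5e-4)):.5f}")
print(f"bimodal barrier       : {conductance_ratio(T):.3f}")
```

For a uniform tunnel barrier the ratio is of order T/2, i.e. a huge subgap resistance and hence hysteresis; for the bimodal ensemble the few open channels pull the ratio to order unity, so the subgap resistance is comparable to the normal-state resistance without any external shunt.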

  13. Stochastic charging of dust grains in planetary rings: Diffusion rates and their effects on Lorentz resonances

    NASA Technical Reports Server (NTRS)

    Schaffer, L.; Burns, J. A.

    1995-01-01

Dust grains in planetary rings acquire stochastically fluctuating electric charges as they orbit through any corotating magnetospheric plasma. Here we investigate the nature of this stochastic charging and calculate its effect on the Lorentz resonance (LR). First we model grain charging as a Markov process, where the transition probabilities are identified as the ensemble-averaged charging fluxes due to plasma pickup and photoemission. We determine the distribution function P(t; N), giving the probability that a grain has N excess charges at time t. The autocorrelation time tau_q for the stochastic charge process can be approximated by a Fokker-Planck treatment of the evolution equations for P(t; N). We calculate the mean square response to the stochastic fluctuations in the Lorentz force. We find that transport in phase space is very small compared to the resonant increase in amplitudes due to the mean charge, over the timescale that the oscillator is resonantly pumped up. Therefore the stochastic charge variations cannot break the resonant interaction; locally, the Lorentz resonance is a robust mechanism for the shaping of ethereal dust ring systems. Slightly stronger bounds on plasma parameters are required when we consider the longer transit times between Lorentz resonances.
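
The Markov charging picture can be sketched with a toy birth-death (Gillespie) simulation. The rate law below is an illustrative assumption, not the paper's plasma model: electron and ion collection rates are linearized about an equilibrium charge N_eq, so the excess charge performs a mean-reverting random walk whose small stationary spread mirrors the paper's conclusion that fluctuations are minor compared to the mean charge.

```python
import random

def simulate_charge(n_eq=-500, rate=50.0, relax=0.1,
                    t_end=2000.0, seed=1):
    """Gillespie simulation of a grain's excess charge N.
    Up/down transition rates are linearized about n_eq (assumed model):
      up   = rate + relax * max(0, n_eq - N)
      down = rate + relax * max(0, N - n_eq)
    Returns the time-averaged mean and variance of N."""
    rng = random.Random(seed)
    t, n = 0.0, n_eq
    s = s2 = tw = 0.0
    while t < t_end:
        up = rate + relax * max(0.0, n_eq - n)
        down = rate + relax * max(0.0, n - n_eq)
        total = up + down
        dt = rng.expovariate(total)      # waiting time to next event
        s += n * dt; s2 += n * n * dt; tw += dt
        t += dt
        n += 1 if rng.random() < up / total else -1
    mean = s / tw
    var = s2 / tw - mean * mean
    return mean, var

mean, var = simulate_charge()
print(f"mean charge ~ {mean:.1f}, rms fluctuation ~ {var**0.5:.1f}")
```

For these (assumed) parameters the rms fluctuation is roughly sqrt(rate/relax), a few percent of the mean charge, which is the regime in which the resonance survives the charge noise.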

  14. Optimization of armored spherical tanks for storage on the lunar surface

    NASA Technical Reports Server (NTRS)

    Bents, D. J.; Knight, D. A.

    1992-01-01

A redundancy strategy for reducing micrometeoroid armoring mass is investigated, with application to cryogenic reactant storage for a regenerative fuel cell (RFC) on the lunar surface. In that micrometeoroid environment, the cryogenic fuel must be protected from loss due to tank puncture. The tankage must have a sufficiently high probability of survival over the length of the mission so that the probability of system failure due to tank puncture is low compared to the other mission risk factors. Assuming that a single meteoroid penetration can cause a storage tank to lose its contents, two means are available to raise the probability of surviving micrometeoroid attack to the desired level: one can armor the tanks to a thickness sufficient to reduce the probability of penetration of any tank to the desired level, or add spare tanks so that a given number out of the ensemble survives at the desired level. A combination of these strategies (armoring and redundancy) is investigated. The objective is to find the optimum combination which yields the lowest shielding mass per cubic meter of surviving fuel out of the original ensemble. The investigation found that, for the volumes of fuel associated with multikilowatt-class cryo-storage RFCs, and the armoring methodology and meteoroid models used, storage should be fragmented into small individual tanks. Larger installations (more fuel) pay less of a shielding penalty than small installations. For the same survival probability over the same time period, larger volumes will require less armoring mass per unit volume protected.
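
The armoring-versus-redundancy trade can be sketched numerically. The penetration model and all parameter values below are illustrative assumptions, not those of the study: per-tank puncture probability is taken to fall exponentially with armor thickness, tanks fail independently, and the ensemble succeeds if at least k of n tanks survive.

```python
from math import comb, exp

def tank_survival(p_tank, n, k):
    """Probability that at least k of n identical tanks survive,
    each independently surviving with probability p_tank."""
    return sum(comb(n, j) * p_tank**j * (1 - p_tank)**(n - j)
               for j in range(k, n + 1))

def puncture_prob(thickness_cm, scale=0.5):
    """Illustrative (assumed) exponential penetration model."""
    return exp(-thickness_cm / scale)

# Compare a single heavily armored tank with a lightly armored,
# redundant ensemble (all numbers are illustrative).
single = tank_survival(1 - puncture_prob(3.0), n=1, k=1)
redundant = tank_survival(1 - puncture_prob(1.5), n=6, k=5)
print(f"single armored tank : {single:.4f}")
print(f"5-of-6 small tanks  : {redundant:.4f}")
```

A full optimization along the lines of the abstract would scan (n, thickness) pairs and pick the combination with the lowest shield mass per unit of surviving fuel, subject to the survival probability meeting the mission target.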

  15. Description of the lower jaws of Baculites from the Upper Cretaceous U.S. Western Interior

    NASA Astrophysics Data System (ADS)

    Larson, Neal L.; Landman, Neil H.

    2017-03-01

We report the discovery of lower jaws of Baculites (Ammonoidea) from the Upper Cretaceous U.S. Western Interior. In the lower Campanian Smoky Hill Chalk Member of the Niobrara Chalk of Kansas, most of the jaws occur as isolated elements. Based on their age, they probably belong to Baculites sp. (smooth). They conform to the description of rugaptychus, and are ornamented with coarse rugae on their ventral side. One specimen is preserved inside a small fecal pellet that was probably produced by a fish. Another specimen occurs inside a crushed body chamber near the aperture and is probably in situ. Three small structures are present immediately behind the jaw and may represent the remains of the gills. In the lower Maastrichtian Pierre Shale of Wyoming, two specimens of Baculites grandis contain lower jaws inside their body chambers, and are probably in situ. In both specimens, the jaws are oriented at an acute angle to the long axis of the shell, with their anterior ends pointing toward the dorsum. One of the jaws is folded into a U-shape, which probably approximates the shape of the jaw during life. Based on the measurements of the jaws and the shape of the shell, the jaws could not have touched the sides of the shell even if they were splayed out, implying that they could not have effectively served as opercula. Instead, in combination with the upper jaws and radula, they constituted the buccal apparatus that collected and conveyed food to the esophagus.

  16. Accurate and efficient modeling of the detector response in small animal multi-head PET systems.

    PubMed

    Cecchetti, Matteo; Moehrs, Sascha; Belcari, Nicola; Del Guerra, Alberto

    2013-10-07

    In fully three-dimensional PET imaging, iterative image reconstruction techniques usually outperform analytical algorithms in terms of image quality provided that an appropriate system model is used. In this study we concentrate on the calculation of an accurate system model for the YAP-(S)PET II small animal scanner, with the aim to obtain fully resolution- and contrast-recovered images at low levels of image roughness. For this purpose we calculate the system model by decomposing it into a product of five matrices: (1) a detector response component obtained via Monte Carlo simulations, (2) a geometric component which describes the scanner geometry and which is calculated via a multi-ray method, (3) a detector normalization component derived from the acquisition of a planar source, (4) a photon attenuation component calculated from x-ray computed tomography data, and finally, (5) a positron range component is formally included. This system model factorization allows the optimization of each component in terms of computation time, storage requirements and accuracy. The main contribution of this work is a new, efficient way to calculate the detector response component for rotating, planar detectors, that consists of a GEANT4 based simulation of a subset of lines of flight (LOFs) for a single detector head whereas the missing LOFs are obtained by using intrinsic detector symmetries. Additionally, we introduce and analyze a probability threshold for matrix elements of the detector component to optimize the trade-off between the matrix size in terms of non-zero elements and the resulting quality of the reconstructed images. 
In order to evaluate our proposed system model we reconstructed various images of objects, acquired according to the NEMA NU 4-2008 standard, and we compared them to the images reconstructed with two other system models: a model that does not include any detector response component and a model that approximates analytically the depth of interaction as detector response component. The comparisons confirm previous research results, showing that the usage of an accurate system model with a realistic detector response leads to reconstructed images with better resolution and contrast recovery at low levels of image roughness.
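
The probability-threshold trade-off can be illustrated with a toy system matrix (synthetic values, not the YAP-(S)PET II model): elements below the threshold are dropped, and each voxel column is renormalized so its total detection probability is preserved where possible, trading the number of non-zero elements against fidelity.

```python
import numpy as np

def threshold_system_matrix(A, thresh):
    """Zero out detector-response elements below `thresh`, then rescale
    each column so its detection probability sums to the original total.
    A[i, j] ~ P(detection along line i | emission in voxel j)."""
    col_sums = A.sum(axis=0)
    B = np.where(A >= thresh, A, 0.0)
    new_sums = B.sum(axis=0)
    scale = np.divide(col_sums, new_sums,
                      out=np.ones_like(new_sums), where=new_sums > 0)
    return B * scale

rng = np.random.default_rng(0)
A = rng.exponential(0.01, size=(200, 50))   # synthetic response matrix
for t in (0.0, 0.01, 0.05):
    B = threshold_system_matrix(A, t)
    print(f"thresh={t:.2f}  nnz={np.count_nonzero(B)}  "
          f"col-sum error={np.abs(B.sum(0) - A.sum(0)).max():.2e}")
```

Raising the threshold shrinks the stored matrix rapidly, but past some point whole columns lose all their entries and the normalization can no longer compensate, which is the degradation in reconstructed image quality that the threshold analysis in the paper quantifies.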

  17. Accurate and efficient modeling of the detector response in small animal multi-head PET systems

    NASA Astrophysics Data System (ADS)

    Cecchetti, Matteo; Moehrs, Sascha; Belcari, Nicola; Del Guerra, Alberto

    2013-10-01

    In fully three-dimensional PET imaging, iterative image reconstruction techniques usually outperform analytical algorithms in terms of image quality provided that an appropriate system model is used. In this study we concentrate on the calculation of an accurate system model for the YAP-(S)PET II small animal scanner, with the aim to obtain fully resolution- and contrast-recovered images at low levels of image roughness. For this purpose we calculate the system model by decomposing it into a product of five matrices: (1) a detector response component obtained via Monte Carlo simulations, (2) a geometric component which describes the scanner geometry and which is calculated via a multi-ray method, (3) a detector normalization component derived from the acquisition of a planar source, (4) a photon attenuation component calculated from x-ray computed tomography data, and finally, (5) a positron range component is formally included. This system model factorization allows the optimization of each component in terms of computation time, storage requirements and accuracy. The main contribution of this work is a new, efficient way to calculate the detector response component for rotating, planar detectors, that consists of a GEANT4 based simulation of a subset of lines of flight (LOFs) for a single detector head whereas the missing LOFs are obtained by using intrinsic detector symmetries. Additionally, we introduce and analyze a probability threshold for matrix elements of the detector component to optimize the trade-off between the matrix size in terms of non-zero elements and the resulting quality of the reconstructed images. 
In order to evaluate our proposed system model we reconstructed various images of objects, acquired according to the NEMA NU 4-2008 standard, and we compared them to the images reconstructed with two other system models: a model that does not include any detector response component and a model that approximates analytically the depth of interaction as detector response component. The comparisons confirm previous research results, showing that the usage of an accurate system model with a realistic detector response leads to reconstructed images with better resolution and contrast recovery at low levels of image roughness.

  18. Taking the easy way out? Increasing implementation effort reduces probability maximizing under cognitive load.

    PubMed

    Schulze, Christin; Newell, Ben R

    2016-07-01

    Cognitive load has previously been found to have a positive effect on strategy selection in repeated risky choice. Specifically, whereas inferior probability matching often prevails under single-task conditions, optimal probability maximizing sometimes dominates when a concurrent task competes for cognitive resources. We examined the extent to which this seemingly beneficial effect of increased task demands hinges on the effort required to implement each of the choice strategies. Probability maximizing typically involves a simple repeated response to a single option, whereas probability matching requires choice proportions to be tracked carefully throughout a sequential choice task. Here, we flipped this pattern by introducing a manipulation that made the implementation of maximizing more taxing and, at the same time, allowed decision makers to probability match via a simple repeated response to a single option. The results from two experiments showed that increasing the implementation effort of probability maximizing resulted in decreased adoption rates of this strategy. This was the case both when decision makers simultaneously learned about the outcome probabilities and responded to a dual task (Exp. 1) and when these two aspects were procedurally separated in two distinct stages (Exp. 2). We conclude that the effort involved in implementing a choice strategy is a key factor in shaping repeated choice under uncertainty. Moreover, highlighting the importance of implementation effort casts new light on the sometimes surprising and inconsistent effects of cognitive load that have previously been reported in the literature.
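
The differing payoffs of the two strategies are easy to make explicit. For a binary outcome whose more likely event occurs with probability p, always predicting that event (maximizing) is correct at rate p, while matching choice proportions to outcome proportions is correct at rate p^2 + (1 - p)^2; the probability values below are illustrative.

```python
def accuracy_maximizing(p):
    """Always predict the more likely of the two events."""
    return max(p, 1 - p)

def accuracy_matching(p):
    """Predict each event in proportion to its outcome probability."""
    return p * p + (1 - p) * (1 - p)

for p in (0.5, 0.6, 0.75, 0.9):
    print(f"p={p:.2f}  maximize={accuracy_maximizing(p):.3f}  "
          f"match={accuracy_matching(p):.3f}")
```

Maximizing dominates matching for every p other than 0.5, which is why the abstract describes matching as inferior and maximizing as optimal.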

  19. A Lyme borreliosis diagnosis probability score - no relation with antibiotic treatment response.

    PubMed

    Briciu, Violeta T; Flonta, Mirela; Leucuţa, Daniel; Cârstina, Dumitru; Ţăţulescu, Doina F; Lupşe, Mihaela

    2017-05-01

(1) To describe epidemiological and clinical data of patients who present with the suspicion of Lyme borreliosis (LB); (2) to evaluate a previously published score that classifies patients by the probability of having LB, following up patients' clinical outcomes after antibiotherapy. Inclusion criteria: patients with clinical manifestations compatible with LB and Borrelia (B.) burgdorferi positive serology, hospitalized in a Romanian hospital between January 2011 and October 2012. Erythema migrans (EM) or suspicion of Lyme neuroborreliosis (LNB) with lumbar puncture performed for diagnosis. A questionnaire was completed for each patient regarding associated diseases, tick bites or EM history, and clinical signs/symptoms at admission, at the end of treatment and 3 months later. Two-tier testing (TTT) used an ELISA followed by a Western Blot kit. The patients were classified into groups using the LB probability score and were evaluated by a multidisciplinary team. Antibiotherapy followed guidelines' recommendations. Sixty-four patients were included, presenting diverse associated comorbidities. Fifty-seven patients had a positive TTT; seven had a positive result on only one test (either ELISA or Western Blot). No differences in outcome were found between the groups of patients classified as very probable, probable and little probable LB. Instead, a better post-treatment outcome was described in patients with positive TTT. The patients investigated for the suspicion of LB present diverse clinical manifestations and comorbidities that complicate differential diagnosis. The LB diagnosis probability score used in our patients did not correlate with the antibiotic treatment response, suggesting that the probability score does not bring any benefit in diagnosis.

  20. Quantum probability, choice in large worlds, and the statistical structure of reality.

    PubMed

    Ross, Don; Ladyman, James

    2013-06-01

    Classical probability models of incentive response are inadequate in "large worlds," where the dimensions of relative risk and the dimensions of similarity in outcome comparisons typically differ. Quantum probability models for choice in large worlds may be motivated pragmatically - there is no third theory - or metaphysically: statistical processing in the brain adapts to the true scale-relative structure of the universe.

  1. Full Ionisation In Binary-Binary Encounters With Small Positive Energies

    NASA Astrophysics Data System (ADS)

    Sweatman, W. L.

    2006-08-01

Interactions between binary stars and single stars, and between binary stars and other binary stars, play a key role in the dynamics of a dense stellar system. Energy can be transferred between the internal dynamics of a binary and the larger-scale dynamics of the interacting objects. Binaries can be destroyed and created by the interaction. In a binary-binary encounter, full ionisation occurs when both of the binary stars are destroyed in the interaction to create four single stars. This is only possible when the total energy of the system is positive. For very small energies the probability of this occurring is very low, and it tends towards zero as the total energy tends towards zero. Here the case is considered in which all the stars have equal masses. An asymptotic power law is predicted relating the probability of full ionisation to the total energy when the latter quantity is small. The exponent, which is approximately 2.31, is compared with the results from numerical scattering experiments. The theoretical approach taken is similar to one used previously in the three-body problem. It makes use of the fact that the most dramatic changes in scale and energies of a few-body system occur when its components pass near to a central configuration. The position and number of these configurations are not known for the general four-body problem; however, with equal masses there are known to be exactly five different cases. Separate consideration and comparison of the properties of orbits close to each of these five central configurations enables the prediction of the form of the cross-section for full ionisation for the case of small positive total energy. This is the relation between total energy and the probability of full ionisation described above.

  2. Setting Climate Mitigation Targets in the Face of Uncertainty

    NASA Astrophysics Data System (ADS)

    Raupach, M. R.

    2012-12-01

Uncertainty in climate science is well known, and at least some of it may be irreducible. However, the presence of uncertainty increases the urgency of action rather than reducing it. For the purpose of setting climate targets, earth system models can be construed as giant transfer functions mapping anthropogenic forcings to climate responses. Recent work [Allen et al. 2009, Nature 458, and related papers] has shown that the broad structure of this mapping is captured by a near-linear relationship T = αQ between cumulative CO2 emissions (Q) and global warming (T), both measured from the start of the industrial era. The slope α is about 1.8 K/EgC (67% probability range 1.2 to 2.7), or 1.8 degrees per trillion tonnes of carbon, both for the recent past and also for future projections. Near-linearity occurs because of compensating interactions between CO2 emissions trajectories, emissions of non-CO2 gases, and nonlinear carbon-climate dynamics. The implication is that an all-time quota of about 1100 PgC of carbon can be emitted before T = 2 K warming is exceeded, with median (50%) probability. Half of this quota (550 PgC) has been emitted already through the industrial era (since 1750). Accounting for the need to turn around the present growth in emissions, the eventual decline in emissions to meet a target T = 2 K has to be more than 5% per year if mitigation starts immediately [Raupach et al. 2011, Tellus B 63]. With delay, the required rate of decline rises rapidly. A 50% chance of meeting a target T = 2 K is inadequate, because of paleoclimatic evidence for destabilising climate feedbacks in response to small changes in forcing. This evidence calls for either or both of two responses: a tougher target such as T = 1 K, or a higher chance of meeting the target.
It is shown here that (1) the cumulative probability distribution of T at given Q is approximately log-normal; and (2) consequently, the cumulative emission Q needed to stay below a warming T with probability P is Q = (T/α)[1 - sqrt(2π)(ln r)(P - ½)], where r is a spread parameter such that 2/3 of the probability mass lies within a factor (1/r, r) of the median. Current uncertainty estimates for the equilibrium climate sensitivity (a best estimate of 3 K per CO2 doubling, with a 2/3 probability range of 2 to 4.5 K) are consistent with r = 1.5. With this uncertainty, if P is increased from 50% (median) to 80%, then the all-time Q falls from 1100 to 770 PgC, or 220 PgC from 2012 onward (20 years' worth of emissions at current rates). The implications are: (1) as the uncertainty (r) increases with all else fixed, the quota falls; (2) the combination of a warming target T < 2 K with a high chance of success is now unreachable. At least for a minimal climate target like T = 2 K with 50% chance of success, the mitigation challenge is still technically possible. However, climate futures will be shaped not only by technology but also by inner human narratives, mental maps and aspirations, including attitudes to risk. The transformation that is needed to meet the climate challenge depends not only on technologies but also on the evolution of self-sustaining narratives.
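
The quoted quota formula can be checked directly against the numbers in the abstract (α = 1.8 K/EgC, r = 1.5); the only added assumption is the unit conversion 1 EgC = 1000 PgC.

```python
from math import sqrt, pi, log

def quota_PgC(T, alpha=1.8, r=1.5, P=0.5):
    """Cumulative emission quota Q (in PgC) to stay below warming T (K)
    with probability P, for an approximately log-normal response:
      Q = (T / alpha) * [1 - sqrt(2*pi) * ln(r) * (P - 1/2)]
    with alpha in K/EgC and the result converted to PgC."""
    return (T / alpha) * (1 - sqrt(2 * pi) * log(r) * (P - 0.5)) * 1000

print(quota_PgC(2.0, P=0.5))  # ~1100 PgC (median case)
print(quota_PgC(2.0, P=0.8))  # ~770 PgC
```

Both abstract figures are reproduced: about 1100 PgC at median probability, falling to about 770 PgC when the required chance of success rises to 80%.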

  3. Satisfaction with the humanitarian response to the 2010 Pakistan floods: a call for increased accountability to beneficiaries.

    PubMed

    Kirsch, Thomas; Siddiqui, Muhammad Ahmed; Perrin, Paul Clayton; Robinson, W Courtland; Sauer, Lauren M; Doocy, Shannon

    2013-07-01

Ascertain recipients' level of satisfaction with humanitarian response efforts. A multi-stage, 80×20 cluster sample randomized survey (1800 households) with probability proportional to size of households affected by the 2010 Indus river floods in Pakistan. The floods affected over 18 million households and led to more than 8 billion USD in response funding. Less than 20% of respondents reported being satisfied with the response, though a small increase in satisfaction levels was observed over the three time periods of interest. Within the first month, receipt of hygiene items, food and household items was most strongly predictive of overall satisfaction. At 6 months, positive receipt of medicines was also highly predictive of satisfaction. The proportion of households reporting unmet needs remained elevated throughout the 6-month period following the floods and varied from 50% to 80%. Needs were best met between 1 and 3 months postflood, when response was at its peak. Unmet needs were greatest at 6 months, when response was being phased down. Access-limiting issues were rarely captured during routine monitoring and evaluation efforts and seem to be a significant predictor of dissatisfaction with relief efforts, at least in the case of Pakistan; this is another argument in favor of independent, population-based surveys of this kind. There is also a need to better identify and serve those not residing in camps. Direct surveys of the affected population can be used operationally to assess ongoing needs, more appropriately redirect humanitarian resources, and ultimately, judge the overall quality of a humanitarian response.

  4. Long-term data from a small mammal community reveal loss of diversity and potential effects of local climate change.

    PubMed

    Santoro, Simone; Sanchez-Suarez, Cristina; Rouco, Carlos; Palomo, L Javier; Fernández, M Carmen; Kufner, Maura B; Moreno, Sacramento

    2017-10-01

Climate change affects the distribution and persistence of species. However, forecasting species' responses to these changes requires long-term data series that are often lacking in ecological studies. We used 15 years of small mammal trapping data collected between 1978 and 2015 in 3 areas at Doñana National Park (southwest Spain) to (i) describe changes in species composition and (ii) test the association between local climate conditions and the size of small mammal populations. Overall, 5 species were captured: wood mouse Apodemus sylvaticus, Algerian mouse Mus spretus, greater white-toothed shrew Crocidura russula, garden dormouse Eliomys quercinus, and black rat Rattus rattus. The temporal pattern in the proportion of captures of each species suggests that small mammal diversity declined over time. Although the larger species (e.g., E. quercinus), better adapted to a colder climate, have disappeared from our trapping records, M. spretus, a small species inhabiting southwest Europe and the Mediterranean coast of Africa, is currently almost the only species trapped. We used 2-level hierarchical models to separate changes in abundance from changes in probability of capture, using records of A. sylvaticus in all 3 areas and of M. spretus in 1. We found that heavy rainfall and low temperatures were positively related to the abundance of A. sylvaticus, and that the number of extremely hot days was negatively related to the abundance of M. spretus. Although other mechanisms are likely to be involved, our findings support the importance of climate for the distribution and persistence of these species and raise conservation concerns about potential cascading effects in the Doñana ecosystem.

  5. Food stress causes sex-specific maternal effects in mites.

    PubMed

    Walzer, Andreas; Schausberger, Peter

    2015-08-01

    Life history theory predicts that females should produce few large eggs under food stress and many small eggs when food is abundant. We tested this prediction in three female-biased size-dimorphic predatory mites feeding on herbivorous spider mite prey: Phytoseiulus persimilis, a specialized spider mite predator; Neoseiulus californicus, a generalist preferring spider mites; Amblyseius andersoni, a broad diet generalist. Irrespective of predator species and offspring sex, most females laid only one small egg under severe food stress. Irrespective of predator species, the number of female but not male eggs decreased with increasing maternal food stress. This sex-specific effect was probably due to the higher production costs of large female than small male eggs. The complexity of the response to the varying availability of spider mite prey correlated with the predators' degree of adaptation to this prey. Most A. andersoni females did not oviposit under severe food stress, whereas N. californicus and P. persimilis did oviposit. Under moderate food stress, only P. persimilis increased its investment per offspring, at the expense of egg number, and produced few large female eggs. When prey was abundant, P. persimilis decreased the female egg sizes at the expense of increased egg numbers, resulting in a sex-specific egg size/number trade-off. Maternal effects manifested only in N. californicus and P. persimilis. Small egg size correlated with the body size of daughters but not sons. Overall, our study provides a key example of sex-specific maternal effects, i.e. food stress during egg production more strongly affects the sex of the large than the small offspring. © 2015. Published by The Company of Biologists Ltd.

  6. Gravity and decoherence: the double slit experiment revisited

    NASA Astrophysics Data System (ADS)

    Samuel, Joseph

    2018-02-01

    The double slit experiment is iconic and widely used in classrooms to demonstrate the fundamental mystery of quantum physics. The puzzling feature is that the probability of an electron arriving at the detector when both slits are open is not the sum of the probabilities when the slits are open separately. The superposition principle of quantum mechanics tells us to add amplitudes rather than probabilities and this results in interference. This experiment defies our classical intuition that the probabilities of exclusive events add. In understanding the emergence of the classical world from the quantum one, there have been suggestions by Feynman, Diosi and Penrose that gravity is responsible for suppressing interference. This idea has been pursued in many different forms ever since, predominantly within Newtonian approaches to gravity. In this paper, we propose and theoretically analyse two ‘gedanken’ or thought experiments which lend strong support to the idea that gravity is responsible for decoherence. The first makes the point that thermal radiation can suppress interference. The second shows that in an accelerating frame, Unruh radiation does the same. Invoking the Einstein equivalence principle to relate acceleration to gravity, we support the view that gravity is responsible for decoherence.
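The central point of the abstract, that amplitudes rather than probabilities add, can be made concrete numerically. In the toy sketch below (Gaussian-envelope amplitudes with purely illustrative slit geometry and wavenumber, not a physical simulation), the two-slit intensity differs from the sum of single-slit intensities by the interference term 2 Re(ψ1 ψ2*).

```python
import numpy as np

# Hypothetical single-slit amplitude at screen position x: a phase factor
# from the slit-to-screen path length times a Gaussian envelope.
def slit_amplitude(x, slit_pos, k=20.0, screen_dist=5.0, width=1.0):
    r = np.hypot(x - slit_pos, screen_dist)      # slit -> screen distance
    return np.exp(1j * k * r) * np.exp(-(x - slit_pos) ** 2 / (2 * width ** 2))

x = np.linspace(-3, 3, 601)
psi1 = slit_amplitude(x, -0.5)    # only slit 1 open
psi2 = slit_amplitude(x, +0.5)    # only slit 2 open

p_classical = np.abs(psi1) ** 2 + np.abs(psi2) ** 2   # probabilities added
p_quantum = np.abs(psi1 + psi2) ** 2                  # amplitudes added, then squared
interference = p_quantum - p_classical                # the term decoherence suppresses
```

Decoherence, gravitational or otherwise, drives `interference` toward zero, recovering the classical sum of probabilities.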

  7. Small Versus Large-Sized Drug-Eluting Beads (DEBIRI) for the Treatment of Hepatic Colorectal Metastases: A Propensity Score Matching Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Akinwande, Olaguoke K., E-mail: gokeakin@gmail.com; Philips, Prejesh, E-mail: prejesh.philips@louisville.edu; Duras, Petr, E-mail: durasp@seznam.cz

    2015-04-15

    Purpose: To compare the feasibility, safety, and efficacy of small versus large irinotecan drug-eluting beads (DEBIRI) for treating hepatic colorectal metastases. Methods: Using our prospectively maintained, multi-center, intra-arterial therapy registry, we identified 196 patients treated with a combination of large beads (100–300 to 500–700 μm) and 30 patients treated with a combination of small beads (70–150 to 100–300 μm). To minimize selection bias, a propensity score analysis was performed to compare the two groups. Results: The unadjusted analysis consisted of 196 and 30 patients treated with large and small beads, respectively; the adjusted analysis consisted of 19 patients in each group. The unadjusted analysis showed decreased all-grade (p < 0.001) and high-grade adverse effects (p = 0.02) in the small-bead group, with a persisting trend toward decreased overall side effects in the adjusted analysis favoring small beads (p = 0.09). The adjusted analysis showed that the percentage dose delivered (delivered dose/intended dose) was significantly greater in the small-bead group than in the large-bead group (96 vs 79 %; p = 0.005). A lower percentage of treatments terminated in complete stasis in the adjusted analysis (p = 0.0035). The adjusted analysis also showed an increased objective response rate (ORR) at 12 months (p = 0.04), with a corresponding trend in the unadjusted analysis (p = 0.09). Conclusion: Smaller beads result in increased dose delivery, probably owing to a lower propensity to reach complete stasis, and may lead to more durable long-term efficacy. Smaller beads also demonstrate similarly low toxicity compared with large-sized beads, with a trend toward less toxicity.
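The propensity score analysis used to reduce selection bias can be sketched in miniature. The scores and caliper below are hypothetical placeholders (real analyses estimate scores with, e.g., logistic regression on baseline covariates); the sketch shows only the greedy 1:1 nearest-neighbour matching step.

```python
# Minimal sketch of greedy 1:1 nearest-neighbour propensity-score matching.
# Scores and caliper are hypothetical, purely for illustration.
def match_nearest(treated_scores, control_scores, caliper=0.1):
    """Pair each treated unit with the closest unused control within the caliper."""
    available = dict(enumerate(control_scores))
    pairs = []
    for t_idx, t in enumerate(treated_scores):
        if not available:
            break
        c_idx = min(available, key=lambda i: abs(available[i] - t))
        if abs(available[c_idx] - t) <= caliper:
            pairs.append((t_idx, c_idx))
            del available[c_idx]        # each control used at most once
    return pairs

treated = [0.62, 0.55, 0.71]                 # hypothetical propensity scores
controls = [0.10, 0.58, 0.60, 0.90, 0.70]
pairs = match_nearest(treated, controls)     # matched (treated, control) index pairs
```

Outcomes are then compared only within the matched pairs, which is how the abstract's "adjusted analysis" of 19 patients per group arises.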

  8. Unsolved Problems in Evolutionary Theory

    DTIC Science & Technology

    1967-01-01

    finding the probability of survival of a single new mutant). Most natural populations probably satisfy these conditions , as is illustrated by the...Ykl) of small quantities adding to zero. Then under suitable conditions on the function f(x), (3) xi + Yi,t+i = fi(x) + YE yjfi(tf) + O(y yt...It is clear that a sufficient condition for the point x to be locally stable is that all the roots of the matrix, (4) (a j) = ____ should have moduli

  9. A Multi-Armed Bandit Approach to Following a Markov Chain

    DTIC Science & Technology

    2017-06-01

    focus on the House to Café transition (p1,4). We develop a Multi-Armed Bandit approach for efficiently following this target, where each state takes the...and longitude (each state corresponding to a physical location and a small set of activities). The searcher would then apply our approach on this...the target’s transition probability and the true probability over time. Further, we seek to provide upper bounds (i.e., worst case bounds) on the
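The underlying estimation problem, tracking the gap between an empirical transition probability and the true one over time, can be sketched with a toy two-state chain. This is not the paper's bandit algorithm; the states, true probability, and dynamics below are hypothetical, illustrating only how repeated observations of a "House to Café" style transition yield a converging estimate.

```python
import random

random.seed(0)

P_TRUE = 0.3   # hypothetical true House -> Cafe transition probability

def step(state):
    if state == "House":
        return "Cafe" if random.random() < P_TRUE else "House"
    return "House"   # hypothetical: from Cafe the target always returns home

state, from_house, to_cafe = "House", 0, 0
for _ in range(5000):
    nxt = step(state)
    if state == "House":
        from_house += 1
        to_cafe += nxt == "Cafe"
    state = nxt

p_hat = to_cafe / from_house   # empirical estimate of P_TRUE
```

A bandit formulation additionally decides *which* state to watch at each step, trading off exploration against exploitation; the worst-case bounds mentioned above bound how slowly `p_hat` can converge under that constraint.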

  10. Future southcentral US wildfire probability due to climate change

    USGS Publications Warehouse

    Stambaugh, Michael C.; Guyette, Richard P.; Stroh, Esther D.; Struckhoff, Matthew A.; Whittier, Joanna B.

    2018-01-01

    Globally, changing fire regimes due to climate are one of the greatest threats to ecosystems and society. In this paper, we present projections of future fire probability for the southcentral USA using downscaled climate projections and the Physical Chemistry Fire Frequency Model (PC2FM). Future fire probability is projected to both increase and decrease across the study region of Oklahoma, New Mexico, and Texas. Among all end-of-century projections, changes in fire probability (CFPs) range from −51 to +240%. Greatest absolute increases in fire probability are shown for areas within the range of approximately 75 to 160 cm mean annual precipitation (MAP), regardless of climate model. Although fire is likely to become more frequent across the southcentral USA, spatial patterns may remain similar unless significant increases in precipitation occur, whereby more extensive areas with increased fire probability are predicted. Perhaps the most important result is the identification of climate conditions at which the direction of the fire probability response (+, −) may shift (i.e., tipping points). Fire regimes of southcentral US ecosystems occur in a geographic transition zone from reactant- to reaction-limited conditions, potentially making them uniquely responsive to different scenarios of temperature and precipitation changes. Identification and description of these conditions may help anticipate fire regime changes that will affect human health, agriculture, species conservation, and nutrient and water cycling.

  11. Numerical Study on the Partitioning of the Molecular Polarizability into Fluctuating Charge and Induced Atomic Dipole Contributions

    PubMed Central

    Mei, Ye; Simmonett, Andrew C.; Pickard, Frank C.; DiStasio, Robert A.; Brooks, Bernard R.; Shao, Yihan

    2015-01-01

    In order to carry out a detailed analysis of the molecular static polarizability, which is the response of the molecule to a uniform external electric field, the molecular polarizability was computed using the finite-difference method for 21 small molecules, using density functional theory. Among nine charge population schemes in common use (Löwdin, Mulliken, Becke, Hirshfeld, CM5, Hirshfeld-I, NPA, CHELPG, MK-ESP), the charge fluctuation contribution is found to dominate the molecular polarizability, with its ratio ranging from 59.9% with the Hirshfeld or CM5 scheme to 96.2% with the Mulliken scheme. The Hirshfeld-I scheme is also used to compute the other contribution to the molecular polarizability, coming from the induced atomic dipoles, and the atomic polarizabilities in 8 small molecules and water pentamer are found to be highly anisotropic for most atoms. Overall, the results suggest that (a) more emphasis probably should be placed on the charge fluctuation terms in future polarizable force field development; (b) an anisotropic polarizability might be more suitable than an isotropic one in polarizable force fields based entirely or partially on the induced atomic dipoles. PMID:25945749
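The finite-difference method mentioned above extracts the polarizability as the derivative of the dipole moment with respect to the applied field. A one-dimensional toy sketch (the cubic dipole response and its coefficients are hypothetical, standing in for the quantum-chemistry calculation) shows the central-difference estimate recovering the linear coefficient with O(h²) error:

```python
# Toy dipole response mu(E) = alpha*E + beta*E**3 (coefficients hypothetical).
# The central difference at E = 0 recovers alpha = d(mu)/dE up to O(h^2).
ALPHA, BETA = 9.5, 0.4

def dipole(E):
    return ALPHA * E + BETA * E ** 3

h = 1e-3                                   # small applied field strength
alpha_fd = (dipole(h) - dipole(-h)) / (2 * h)   # finite-difference polarizability
```

In the actual study, `dipole(E)` is replaced by a DFT calculation of the molecular dipole under a uniform field, and the tensor components of the polarizability are obtained by applying fields along each axis.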

  12. Livestock First Reached Southern Africa in Two Separate Events.

    PubMed

    Sadr, Karim

    2015-01-01

    After several decades of research on the subject, we now know when the first livestock reached southern Africa but the question of how they got there remains a contentious topic. Debate centres on whether they were brought with a large migration of Khoe-speakers who originated from East Africa; or whether the livestock were traded down-the-line among hunter-gatherer communities; or indeed whether there was a long history of diverse small scale population movements in this part of the world, one or more of which 'infiltrated' livestock into southern Africa. A new analysis of the distribution of stone toolkits from a sizeable sample of sub-equatorial African Later Stone Age sites, coupled with existing knowledge of the distribution of the earliest livestock remains and ceramic vessels, has allowed us to isolate two separate infiltration events that brought the first livestock into southern Africa just over 2000 years ago; one infiltration was along the Atlantic seaboard and another entered the middle reaches of the Limpopo River Basin. These findings agree well with the latest results of genetic research which together indicate that multiple, small-scale infiltrations probably were responsible for bringing the first livestock into southern Africa.

  13. Public reporting and market area exit decisions by home health agencies.

    PubMed

    Jung, Kyoungrae; Feldman, Roger

    2012-01-01

    To examine whether home health agencies selectively discontinue services to areas with socio-economically disadvantaged people after the introduction of Home Health Compare (HHC), a public reporting program initiated by Medicare in 2003. We focused on agencies' initial responses to HHC and examined selective market-area exits by agencies between 2002 and 2004. We measured HHC effects by the percentage of quality indicators reported in public HHC data in 2003. Socio-economic status was measured by per capita income and percent college-educated at the market-area level. 2002 and 2004 Outcome and Assessment Information Set (OASIS); 2000 US Census file; 2004 Area Resource File; and 2002 Provider of Service File. We found a small and weak effect of public reporting on selective exits: a 10-percent increase in reporting (reporting one more indicator) increased the probability of leaving an area with less-educated people by 0.3 percentage points, compared with leaving an area with high education. The small level of market-area exits under public reporting is unlikely to be practically meaningful, suggesting that HHC did not lead to a disruption in access to home health care through selective exits during the initial year of the program.

  14. The Impact of Natural Hazards such as Turbulent Wind Gusts on the Wind Energy Conversion Process

    NASA Astrophysics Data System (ADS)

    Wächter, M.; Hölling, M.; Milan, P.; Morales, A.; Peinke, J.

    2012-12-01

    Wind turbines operate in the atmospheric boundary layer, where they are exposed to wind gusts and other types of natural hazards. As the response time of wind turbines is typically in the range of seconds, they are affected by the small scale intermittent properties of the turbulent wind. We show evidence that basic features which are known for small-scale homogeneous isotropic turbulence, and in particular the well-known intermittency problem, have an important impact on the wind energy conversion process. Intermittent statistics include high probabilities of extreme events which can be related to wind gusts and other types of natural hazards. As a summarizing result we find that atmospheric turbulence imposes its intermittent features on the complete wind energy conversion process. Intermittent turbulence features are not only present in atmospheric wind, but are also dominant in the loads on the turbine, i.e. rotor torque and thrust, and in the electrical power output signal. We conclude that profound knowledge of turbulent statistics and the application of suitable numerical as well as experimental methods are necessary to grasp these unique features and quantify their effects on all stages of wind energy conversion.

  15. Numerical study on the partitioning of the molecular polarizability into fluctuating charge and induced atomic dipole contributions

    DOE PAGES

    Mei, Ye; Simmonett, Andrew C.; Pickard, IV, Frank C.; ...

    2015-05-06

    In order to carry out a detailed analysis of the molecular static polarizability, which is the response of the molecule to a uniform external electric field, the molecular polarizability was computed in this study using the finite-difference method for 21 small molecules, using density functional theory. Among nine charge population schemes in common use (Lowdin, Mulliken, Becke, Hirshfeld, CM5, Hirshfeld-I, NPA, CHELPG, MK-ESP), the charge fluctuation contribution is found to dominate the molecular polarizability, with its ratio ranging from 59.9% with the Hirshfeld or CM5 scheme to 96.2% with the Mulliken scheme. The Hirshfeld-I scheme is also used to compute the other contribution to the molecular polarizability coming from the induced atomic dipoles, and the atomic polarizabilities in eight small molecules and water pentamer are found to be highly anisotropic for most atoms. In conclusion, the overall results suggest that (a) more emphasis probably should be placed on the charge fluctuation terms in future polarizable force field development and (b) an anisotropic polarizability might be more suitable than an isotropic one in polarizable force fields based entirely or partially on the induced atomic dipoles.

  16. Postwildfire debris flows hazard assessment for the area burned by the 2011 Track Fire, northeastern New Mexico and southeastern Colorado

    USGS Publications Warehouse

    Tillery, Anne C.; Darr, Michael J.; Cannon, Susan H.; Michael, John A.

    2011-01-01

    In June 2011, the Track Fire burned 113 square kilometers in Colfax County, northeastern New Mexico, and Las Animas County, southeastern Colorado, including the upper watersheds of Chicorica and Raton Creeks. The burned landscape is now at risk of damage from postwildfire erosion, such as that caused by debris flows and flash floods. This report presents a preliminary hazard assessment of the debris-flow potential from basins burned by the Track Fire. A pair of empirical hazard-assessment models developed using data from recently burned basins throughout the intermountain western United States were used to estimate the probability of debris-flow occurrence and volume of debris flows at the outlets of selected drainage basins within the burned area. The models incorporate measures of burn severity, topography, soils, and storm rainfall to estimate the probability and volume of post-fire debris flows following the fire. In response to a design storm of 38 millimeters of rain in 30 minutes (10-year recurrence-interval), the probability of debris flow estimated for basins burned by the Track fire ranged between 2 and 97 percent, with probabilities greater than 80 percent identified for the majority of the tributary basins to Raton Creek in Railroad Canyon; six basins that flow into Lake Maloya, including the Segerstrom Creek and Swachheim Creek basins; two tributary basins to Sugarite Canyon, and an unnamed basin on the eastern flank of the burned area. Estimated debris-flow volumes ranged from 30 cubic meters to greater than 100,000 cubic meters. The largest volumes (greater than 100,000 cubic meters) were estimated for Segerstrom Creek and Swachheim Creek basins, which drain into Lake Maloya. The Combined Relative Debris-Flow Hazard Ranking identifies the Segerstrom Creek and Swachheim Creek basins as having the highest probability of producing the largest debris flows. 
This finding indicates the greatest post-fire debris-flow impacts may be expected at Lake Maloya. In addition, Interstate Highway 25, Raton Creek and the rail line in Railroad Canyon, County road A-27, and State Highway 526 in Sugarite Canyon may also be affected where they cross drainages downstream from recently burned basins. Although this assessment indicates that a rather large debris flow (approximately 42,000 cubic meters) may be generated from the basin above the City of Raton (basin 9) in response to the design storm, the probability of such an event is relatively low (approximately 10 percent). Additional assessment is necessary to determine if the estimated volume of material is sufficient to travel into the City of Raton. In addition, even small debris flows may affect structures at or downstream from basin outlets and increase the threat of flooding downstream by damaging or blocking flood mitigation structures. The maps presented here may be used to prioritize areas where erosion mitigation or other protective measures may be necessary within a 2- to 3-year window of vulnerability following the Track Fire.
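Empirical post-fire debris-flow probability models of this kind are typically expressed through a logistic link, P = e^x / (1 + e^x), where x is a linear combination of burn severity, topography, soils, and storm rainfall. The coefficients and predictor values in the sketch below are hypothetical placeholders, not the fitted USGS model; the sketch only shows the shape of the calculation.

```python
import math

# Logistic link used by empirical debris-flow probability models.
def logistic_probability(x):
    return math.exp(x) / (1.0 + math.exp(x))

# Hypothetical linear predictor: intercept + burn-severity term (% of basin
# burned at moderate/high severity) + storm-rainfall term (mm in 30 min).
x = -0.7 + 0.03 * 65 + 0.04 * 38
p = logistic_probability(x)      # basin debris-flow probability, between 0 and 1
```

Evaluating such a model basin by basin for a design storm yields the 2 to 97 percent range of probabilities reported above.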

  17. A comparative analysis of hazard models for predicting debris flows in Madison County, VA

    USGS Publications Warehouse

    Morrissey, Meghan M.; Wieczorek, Gerald F.; Morgan, Benjamin A.

    2001-01-01

    During the rainstorm of June 27, 1995, roughly 330-750 mm of rain fell within a sixteen-hour period, initiating floods and over 600 debris flows in a small area (130 km2) of Madison County, Virginia. Field studies showed that the majority (70%) of these debris flows initiated with a thickness of 0.5 to 3.0 m in colluvium on slopes from 17° to 41° (Wieczorek et al., 2000). This paper evaluated and compared the approaches of SINMAP, LISA, and Iverson's (2000) transient response model for slope stability analysis by applying each model to the landslide data from Madison County. Of these three stability models, only Iverson's transient response model evaluated stability conditions as a function of time and depth. Iverson's model would be the preferred method of the three models to evaluate landslide hazards on a regional scale in areas prone to rain-induced landslides as it considers both the transient and spatial response of pore pressure in its calculation of slope stability. The stability calculation used in SINMAP and LISA is similar and utilizes probability distribution functions for certain parameters. Unlike SINMAP that only considers soil cohesion, internal friction angle and rainfall-rate distributions, LISA allows the use of distributed data for all parameters, so it is the preferred model to evaluate slope stability over SINMAP. Results from all three models suggested similar soil and hydrologic properties for triggering the landslides that occurred during the 1995 storm in Madison County, Virginia. The colluvium probably had cohesion of less than 2 kPa. The root-soil system is above the failure plane and consequently root strength and tree surcharge had negligible effect on slope stability. The result that the final location of the water table was near the ground surface is supported by the water budget analysis of the rainstorm conducted by Smith et al. (1996).
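The stability calculation in models of this family is built on the infinite-slope factor of safety, FS = [c + (γ − m·γw)·z·cos²θ·tanφ] / (γ·z·sinθ·cosθ), where m is the fraction of the soil column that is saturated. The sketch below uses hypothetical parameter values, loosely guided by the abstract (low cohesion, water table rising to the surface), to show how saturation alone can drive FS below 1:

```python
import math

# Infinite-slope limit-equilibrium factor of safety.
# c: cohesion (kPa), phi: friction angle, z: slab thickness (m),
# m: saturated fraction of the column, gamma/gamma_w: unit weights (kN/m^3).
def factor_of_safety(c, phi_deg, slope_deg, z, m, gamma=18.0, gamma_w=9.81):
    theta = math.radians(slope_deg)
    phi = math.radians(phi_deg)
    resisting = c + (gamma - m * gamma_w) * z * math.cos(theta) ** 2 * math.tan(phi)
    driving = gamma * z * math.sin(theta) * math.cos(theta)
    return resisting / driving

# Hypothetical colluvial slab: c = 2 kPa, phi = 30 deg, 35 deg slope, 1 m thick.
fs_dry = factor_of_safety(c=2.0, phi_deg=30, slope_deg=35, z=1.0, m=0.0)
fs_wet = factor_of_safety(c=2.0, phi_deg=30, slope_deg=35, z=1.0, m=1.0)  # saturated
```

SINMAP and LISA wrap this calculation in probability distributions over the parameters; Iverson's model instead propagates the transient pore-pressure term m(z, t) through time and depth.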

  18. Failure-probability driven dose painting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vogelius, Ivan R.; Håkansson, Katrin; Due, Anne K.

    Purpose: To demonstrate a data-driven dose-painting strategy based on the spatial distribution of recurrences in previously treated patients. The result is a quantitative way to define a dose prescription function, optimizing the predicted local control at constant treatment intensity. A dose planning study using the optimized dose prescription in 20 patients is performed. Methods: Patients treated at our center have five tumor subvolumes delineated, from the center of the tumor (the PET-positive volume) outward. The spatial distribution of 48 failures in patients with complete clinical response after (chemo)radiation is used to derive a model for tumor control probability (TCP). The total TCP is fixed to the clinically observed 70% actuarial TCP at five years. Additionally, the authors match the distribution of failures between the five subvolumes to the observed distribution. The steepness of the dose–response is extracted from the literature and the authors assume 30% and 20% risk of subclinical involvement in the elective volumes. The result is a five-compartment dose response model matching the observed distribution of failures. The model is used to optimize the distribution of dose in individual patients, while keeping the treatment intensity constant and the maximum prescribed dose below 85 Gy. Results: The vast majority of failures occur centrally despite the small volumes of the central regions. Thus, optimizing the dose prescription yields higher doses to the central target volumes and lower doses to the elective volumes. The dose planning study shows that the modified prescription is clinically feasible. The optimized TCP is 89% (range: 82%–91%) as compared to the observed TCP of 70%. Conclusions: The observed distribution of locoregional failures was used to derive an objective, data-driven dose prescription function. The optimized dose is predicted to result in a substantial increase in local control without increasing the predicted risk of toxicity.
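Multi-compartment TCP models of the kind described are commonly built on a Poisson form, TCP_i = exp(−N_i · exp(−α·D_i)) per subvolume, with the total TCP as the product over subvolumes. The clonogen numbers, radiosensitivity α, and doses below are hypothetical illustrations, not the paper's fitted values; the sketch shows only why redistributing dose toward the failure-dense center raises the product at similar overall intensity.

```python
import math

ALPHA = 0.25   # per Gy, hypothetical radiosensitivity

def compartment_tcp(n_clonogens, dose):
    # Poisson TCP: probability that zero clonogens survive the dose.
    return math.exp(-n_clonogens * math.exp(-ALPHA * dose))

def total_tcp(clonogens, doses):
    tcp = 1.0
    for n, d in zip(clonogens, doses):
        tcp *= compartment_tcp(n, d)
    return tcp

clonogens = [1e7, 1e5, 1e3]                      # center-heavy tumor burden
uniform = total_tcp(clonogens, [68, 68, 68])     # flat prescription
painted = total_tcp(clonogens, [80, 66, 60])     # dose shifted centrally
```

Because failures concentrate where the clonogen burden is highest, the painted plan gains far more control centrally than it loses peripherally.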

  19. Saccharomyces boulardii modifies Salmonella typhimurium traffic and host immune responses along the intestinal tract.

    PubMed

    Pontier-Bres, Rodolphe; Munro, Patrick; Boyer, Laurent; Anty, Rodolphe; Imbert, Véronique; Terciolo, Chloé; André, Fréderic; Rampal, Patrick; Lemichez, Emmanuel; Peyron, Jean-François; Czerucka, Dorota

    2014-01-01

    Salmonella enterica serovar Typhimurium (ST) is an enteropathogenic Gram-negative bacterium that causes infection following oral ingestion. ST spreads rapidly along the gastrointestinal tract (GIT) and invades the intestinal epithelium to ultimately reach internal body organs. The probiotic yeast Saccharomyces boulardii BIOCODEX (S.b-B) is prescribed for prophylaxis of diarrheal infectious diseases. We previously showed that S.b-B prevents weight loss in ST-infected mice and significantly decreases bacterial translocation to the spleen and liver. This study was designed to investigate the effect of S.b-B on ST migration along the GIT and the impact of the yeast on the host's early innate immune responses. Bioluminescent imaging (BLI) was used to evaluate the effect of S.b-B on the progression of luminescent Salmonella Typhimurium (ST-lux) in the GIT of mice pretreated with streptomycin. Photonic emission (PE) was measured in GIT extracts (stomach, small intestine, cecum and colon) at various time periods post-infection (PI). PE analysis revealed that, 45 min PI, ST-lux had migrated slightly faster in the mice treated with S.b-B than in the untreated infected animals. At 90 min PI, ST-lux had reached the cecum in both groups of mice. Adhesion of ST to S.b-B was visualized in the intestines of the mice and probably accounts for (1) the faster elimination of ST-lux in the feces, and (2) reduced translocation of ST to the spleen and liver. In the early phase of infection, S.b-B also modifies the host's immune responses by (1) increasing IFN-γ gene expression and decreasing IL-10 gene expression in the small intestine, and (2) elevating both IFN-γ, and IL-10 mRNA levels in the cecum. BLI revealed that S.b-B modifies ST migration and the host immune response along the GIT. Study findings shed new light on the protective mechanisms of S.b-B during the early phase of Salmonella pathogenesis.

  20. Experimental tree removal in tallgrass prairie: variable responses of flora and fauna along a woody cover gradient.

    PubMed

    Alford, Aaron L; Hellgren, Eric C; Limb, Ryan; Engle, David M

    2012-04-01

    Woody plant encroachment is a worldwide phenomenon in grassland and savanna systems whose consequence is often the development of an alternate woodland state. Theoretically, an alternate state may be associated with changes in system state variables (e.g., species composition) or abiotic parameter shifts (e.g., nutrient availability). When state-variable changes are cumulative, such as in woody plant encroachment, the probability of parameter shifts increases as system feedbacks intensify over time. Using a Before-After Control-Impact (BACI) design, we studied eight pairs of grassland sites undergoing various levels of eastern redcedar (Juniperus virginiana) encroachment to determine whether responses of flora and fauna to experimental redcedar removal differed according to the level of pretreatment redcedar cover. In the first year after removal, herbaceous plant species diversity and evenness, woody plant evenness, and invertebrate family richness increased linearly with pretreatment redcedar cover, whereas increases in small-mammal diversity and evenness were described by logarithmic trends. In contrast, increases in woody plant diversity and total biomass of terrestrial invertebrates were accentuated at levels of higher pretreatment cover. Tree removal also shifted small-mammal species composition toward a more grassland-associated assemblage. During the second year postremoval, increases in herbaceous plant diversity followed a polynomial trend, but increases in most other metrics did not vary along the pretreatment cover gradient. These changes were accompanied by extremely high growing-season precipitation, which may have homogenized floral and faunal responses to removal. Our results demonstrate that tree removal increases important community metrics among grassland flora and fauna within two years, with some responses to removal being strongly influenced by the stage of initial encroachment and modulated by climatic variability. 
Our results underscore the importance of decisive management for reversing the effects of woody plant encroachment in imperiled grassland ecosystems.

  1. Lowland biotic attrition revisited: body size and variation among climate change ‘winners’ and ‘losers’

    PubMed Central

    Strimas-Mackey, Matthew; Mohd-Azlan, Jayasilan; Granados, Alys; Bernard, Henry; Giordano, Anthony J.; Helmy, Olga E.

    2017-01-01

    The responses of lowland tropical communities to climate change will critically influence global biodiversity but remain poorly understood. If species in these systems are unable to tolerate warming, the communities—currently the most diverse on Earth—may become depauperate (‘biotic attrition’). In response to temperature changes, animals can adjust their distribution in space or their activity in time, but these two components of the niche are seldom considered together. We assessed the spatio-temporal niches of rainforest mammal species in Borneo across gradients in elevation and temperature. Most species are not predicted to experience changes in spatio-temporal niche availability, even under pessimistic warming scenarios. Responses to temperature are not predictable by phylogeny but do appear to be trait-based, being much more variable in smaller-bodied taxa. General circulation models and weather station data suggest unprecedentedly high midday temperatures later in the century; predicted responses to this warming among small-bodied species range from 9% losses to 6% gains in spatio-temporal niche availability, while larger species have close to 0% predicted change. Body mass may therefore be a key ecological trait influencing the identity of climate change winners and losers. Mammal species composition will probably change in some areas as temperatures rise, but full-scale biotic attrition this century appears unlikely. PMID:28100818

  2. Lowland biotic attrition revisited: body size and variation among climate change 'winners' and 'losers'.

    PubMed

    Brodie, Jedediah F; Strimas-Mackey, Matthew; Mohd-Azlan, Jayasilan; Granados, Alys; Bernard, Henry; Giordano, Anthony J; Helmy, Olga E

    2017-01-25

    The responses of lowland tropical communities to climate change will critically influence global biodiversity but remain poorly understood. If species in these systems are unable to tolerate warming, the communities-currently the most diverse on Earth-may become depauperate ('biotic attrition'). In response to temperature changes, animals can adjust their distribution in space or their activity in time, but these two components of the niche are seldom considered together. We assessed the spatio-temporal niches of rainforest mammal species in Borneo across gradients in elevation and temperature. Most species are not predicted to experience changes in spatio-temporal niche availability, even under pessimistic warming scenarios. Responses to temperature are not predictable by phylogeny but do appear to be trait-based, being much more variable in smaller-bodied taxa. General circulation models and weather station data suggest unprecedentedly high midday temperatures later in the century; predicted responses to this warming among small-bodied species range from 9% losses to 6% gains in spatio-temporal niche availability, while larger species have close to 0% predicted change. Body mass may therefore be a key ecological trait influencing the identity of climate change winners and losers. Mammal species composition will probably change in some areas as temperatures rise, but full-scale biotic attrition this century appears unlikely. © 2017 The Author(s).

  3. Scenarios for Evolving Seismic Crises: Possible Communication Strategies

    NASA Astrophysics Data System (ADS)

    Steacy, S.

    2015-12-01

    Recent advances in operational earthquake forecasting mean that we are very close to being able to confidently compute changes in earthquake probability as seismic crises develop. For instance, we now have statistical models such as ETAS and STEP which demonstrate considerable skill in forecasting earthquake rates, and recent advances in Coulomb-based models are also showing much promise. Communicating changes in earthquake probability is likely to be very difficult, however, as the absolute probability of a damaging event is likely to remain quite small despite a significant increase in the relative value. Here, we use a hybrid Coulomb/statistical model to compute probability changes for a series of earthquake scenarios in New Zealand. We discuss the strengths and limitations of the forecasts and suggest a number of possible mechanisms that might be used to communicate results in an actual developing seismic crisis.
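The communication problem can be made concrete with toy numbers (hypothetical, not an actual forecast): a hundred-fold relative increase in a very small baseline probability still leaves a small absolute probability.

```python
# Hypothetical illustration of relative versus absolute probability change.
baseline_prob = 1e-4        # assumed weekly chance of a damaging event
gain = 100.0                # assumed relative increase during a seismic crisis
raised_prob = baseline_prob * gain

relative_change = raised_prob / baseline_prob   # a 100-fold increase...
# ...yet raised_prob is still only about 1%, so "100x more likely" and
# "99% likely not to happen" describe the same forecast.
```

This tension, a dramatic relative change alongside a small absolute one, is what makes wording the public message so delicate.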

  4. Tree attenuation at 20 GHz: Foliage effects

    NASA Technical Reports Server (NTRS)

    Vogel, Wolfhard J.; Goldhirsh, Julius

    1993-01-01

    Static tree attenuation measurements at 20 GHz (K-Band) on a 30 deg slant path through a mature Pecan tree with and without leaves showed median fades exceeding approximately 23 dB and 7 dB, respectively. The corresponding 1% probability fades were 43 dB and 25 dB. Previous 1.6 GHz (L-Band) measurements for the bare tree case showed fades larger than those at K-Band by 3.4 dB for the median and smaller by approximately 7 dB at the 1% probability. While the presence of foliage had only a small effect on fading at L-Band (approximately 1 dB additional for the median to 1% probability range), the attenuation increase was significant at K-Band, where it increased by about 17 dB over the same probability range.

  5. Tree attenuation at 20 GHz: Foliage effects

    NASA Astrophysics Data System (ADS)

    Vogel, Wolfhard J.; Goldhirsh, Julius

    1993-08-01

    Static tree attenuation measurements at 20 GHz (K-Band) on a 30 deg slant path through a mature Pecan tree with and without leaves showed median fades exceeding approximately 23 dB and 7 dB, respectively. The corresponding 1% probability fades were 43 dB and 25 dB. Previous 1.6 GHz (L-Band) measurements for the bare tree case showed fades larger than those at K-Band by 3.4 dB for the median and smaller by approximately 7 dB at the 1% probability. While the presence of foliage had only a small effect on fading at L-Band (approximately 1 dB additional for the median to 1% probability range), the attenuation increase was significant at K-Band, where it increased by about 17 dB over the same probability range.

  6. GRANITE FIORDS WILDERNESS STUDY AREA, ALASKA.

    USGS Publications Warehouse

    Berg, Henry C.; Pittman, Tom L.

    1984-01-01

    Mineral surveys of the Granite Fiords Wilderness study area revealed areas with probable and substantiated mineral-resource potential. In the northeastern sector, areas of probable and substantiated resource potential for gold, silver, and base metals in small, locally high-grade vein and disseminated deposits occur in recrystallized Mesozoic volcanic, sedimentary, and intrusive rocks. In the central part, areas of probable resource potential for gold, silver, copper, and zinc in disseminated and locally massive sulfide deposits occur in undated pelitic paragneiss roof pendants. A molybdenite-bearing quartz vein has been prospected in western Granite Fiords, and molybdenum also occurs along with other metals in veins in the northeastern sector and in geochemical samples collected from areas where there is probable resource potential for low-grade porphyry molybdenum deposits in several Cenozoic plutons. No energy resource potential was identified in the course of this study.

  7. Fixation Probability in a Haploid-Diploid Population.

    PubMed

    Bessho, Kazuhiro; Otto, Sarah P

    2017-01-01

    Classical population genetic theory generally assumes either a fully haploid or fully diploid life cycle. However, many organisms exhibit more complex life cycles, with both free-living haploid and diploid stages. Here we ask what the probability of fixation is for selected alleles in organisms with haploid-diploid life cycles. We develop a genetic model that considers the population dynamics using both the Moran model and Wright-Fisher model. Applying a branching process approximation, we obtain an accurate fixation probability assuming that the population is large and the net effect of the mutation is beneficial. We also find the diffusion approximation for the fixation probability, which is accurate even in small populations and for deleterious alleles, as long as selection is weak. These fixation probabilities from branching process and diffusion approximations are similar when selection is weak for beneficial mutations that are not fully recessive. In many cases, particularly when one phase predominates, the fixation probability differs substantially for haploid-diploid organisms compared to either fully haploid or diploid species. Copyright © 2017 by the Genetics Society of America.
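
    For intuition, the classic single-phase approximations the authors compare against can be sketched as follows. These are Kimura's diffusion formula and Haldane's branching-process result for a simple haploid population, not the paper's haploid-diploid model:

```python
import math

def fixation_prob_diffusion(N, s, p0):
    """Kimura's diffusion approximation for the fixation probability of an
    allele at initial frequency p0 with selection coefficient s in a
    haploid population of size N."""
    if s == 0:
        return p0  # neutral case: fixation probability equals initial frequency
    return (1 - math.exp(-2 * N * s * p0)) / (1 - math.exp(-2 * N * s))

def fixation_prob_branching(s):
    """Haldane's branching-process approximation, ~2s, for a single new
    beneficial mutation in a large population."""
    return 2 * s

# The two approximations agree when selection is weak but Ns is large:
N, s = 10_000, 0.01
p_diff = fixation_prob_diffusion(N, s, 1 / N)
p_branch = fixation_prob_branching(s)
```

    As the abstract notes, the branching-process form requires a large population and a net-beneficial mutation, while the diffusion form also covers small populations and deleterious alleles under weak selection.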

  8. Six-dimensional quantum dynamics study for the dissociative adsorption of DCl on Au(111) surface

    NASA Astrophysics Data System (ADS)

    Liu, Tianhui; Fu, Bina; Zhang, Dong H.

    2014-04-01

    We carried out six-dimensional quantum dynamics calculations for the dissociative adsorption of deuterium chloride (DCl) on the Au(111) surface using the initial state-selected time-dependent wave packet approach. The four-dimensional dissociation probabilities are also obtained with the center of mass of DCl fixed at various sites. These calculations were all performed based on an accurate potential energy surface recently constructed by neural network fitting to density functional theory energy points. The origin of the extremely small dissociation probability for DCl/HCl (v = 0, j = 0) fixed at the top site compared to other fixed sites is elucidated in this study. The influence of vibrational excitation and rotational orientation of DCl on the reactivity was investigated by calculating six-dimensional dissociation probabilities. The vibrational excitation of DCl enhances the reactivity substantially and the helicopter orientation yields higher dissociation probability than the cartwheel orientation. The site-averaged dissociation probability over 25 fixed sites obtained from four-dimensional quantum dynamics calculations can accurately reproduce the six-dimensional dissociation probability.

  9. Six-dimensional quantum dynamics study for the dissociative adsorption of DCl on Au(111) surface

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Tianhui; Fu, Bina, E-mail: bina@dicp.ac.cn; Zhang, Dong H., E-mail: zhangdh@dicp.ac.cn

    We carried out six-dimensional quantum dynamics calculations for the dissociative adsorption of deuterium chloride (DCl) on the Au(111) surface using the initial state-selected time-dependent wave packet approach. The four-dimensional dissociation probabilities are also obtained with the center of mass of DCl fixed at various sites. These calculations were all performed based on an accurate potential energy surface recently constructed by neural network fitting to density functional theory energy points. The origin of the extremely small dissociation probability for DCl/HCl (v = 0, j = 0) fixed at the top site compared to other fixed sites is elucidated in this study. The influence of vibrational excitation and rotational orientation of DCl on the reactivity was investigated by calculating six-dimensional dissociation probabilities. The vibrational excitation of DCl enhances the reactivity substantially and the helicopter orientation yields higher dissociation probability than the cartwheel orientation. The site-averaged dissociation probability over 25 fixed sites obtained from four-dimensional quantum dynamics calculations can accurately reproduce the six-dimensional dissociation probability.

  10. Using multilevel spatial models to understand salamander site occupancy patterns after wildfire

    USGS Publications Warehouse

    Chelgren, Nathan; Adams, Michael J.; Bailey, Larissa L.; Bury, R. Bruce

    2011-01-01

    Studies of the distribution of elusive forest wildlife have suffered from the confounding of true presence with the uncertainty of detection. Occupancy modeling, which incorporates probabilities of species detection conditional on presence, is an emerging approach for reducing observation bias. However, the current likelihood modeling framework is restrictive for handling unexplained sources of variation in the response that may occur when there are dependence structures such as smaller sampling units that are nested within larger sampling units. We used multilevel Bayesian occupancy modeling to handle dependence structures and to partition sources of variation in occupancy of sites by terrestrial salamanders (family Plethodontidae) within and surrounding an earlier wildfire in western Oregon, USA. Comparison of model fit favored a spatial N-mixture model that accounted for variation in salamander abundance over models that were based on binary detection/non-detection data. Though catch per unit effort was higher in burned areas than unburned, there was strong support that this pattern was due to a higher probability of capture for individuals in burned plots. Within the burn, the odds of capturing an individual given it was present were 2.06 times the odds outside the burn, reflecting reduced complexity of ground cover in the burn. There was weak support that true occupancy was lower within the burned area. While the odds of occupancy in the burn were 0.49 times the odds outside the burn among the five species, the magnitude of variation attributed to the burn was small in comparison to variation attributed to other landscape variables and to unexplained, spatially autocorrelated random variation. 
While ordinary occupancy models may separate the biological pattern of interest from variation in detection probability when all sources of variation are known, the addition of random effects structures for unexplained sources of variation in occupancy and detection probability may often more appropriately represent levels of uncertainty. © 2011 by the Ecological Society of America.
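
    The core of the single-site occupancy likelihood that underlies such models can be sketched directly; the occupancy probability psi, detection probability p, and the detection history are illustrative numbers, and the article's multilevel Bayesian model adds random effects and spatial structure on top of this:

```python
def history_prob(history, psi, p):
    """Probability of a detection history (list of 0/1 visits) at one site,
    given occupancy probability psi and per-visit detection probability p."""
    if any(history):
        # At least one detection: the site must be occupied
        pr = psi
        for d in history:
            pr *= p if d else (1 - p)
        return pr
    # All-zero history: either occupied but never detected, or truly unoccupied
    return psi * (1 - p) ** len(history) + (1 - psi)

lik_detected = history_prob([0, 1, 0], psi=0.6, p=0.3)
lik_all_zero = history_prob([0, 0, 0], psi=0.6, p=0.3)
```

    The all-zero branch is what distinguishes true absence from non-detection, which is exactly the confound the abstract describes.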

  11. A global analysis of traits predicting species sensitivity to habitat fragmentation

    USGS Publications Warehouse

    Keinath, Douglas; Doak, Daniel F.; Hodges, Karen E.; Prugh, Laura R.; Fagan, William F.; Sekercioglu, Cagan H.; Buchart, Stuart H. M.; Kauffman, Matthew J.

    2017-01-01

    Aim: Elucidating patterns in species responses to habitat fragmentation is an important focus of ecology and conservation, but studies are often geographically restricted, taxonomically narrow or use indirect measures of species vulnerability. We investigated predictors of species presence after fragmentation using data from studies around the world that included all four terrestrial vertebrate classes, thus allowing direct inter-taxonomic comparison. Location: World-wide. Methods: We used generalized linear mixed-effect models in an information theoretic framework to assess the factors that explained species presence in remnant habitat patches (3342 patches; 1559 species, mostly birds; and 65,695 records of patch-specific presence–absence). We developed a novel metric of fragmentation sensitivity, defined as the maximum rate of change in probability of presence with changing patch size ('Peak Change'), to distinguish between general rarity on the landscape and sensitivity to fragmentation per se. Results: Size of remnant habitat patches was the most important driver of species presence. Across all classes, habitat specialists, carnivores and larger species had a lower probability of presence, and those effects were substantially modified by interactions. Sensitivity to fragmentation (measured by Peak Change) was influenced primarily by habitat type and specialization, but also by fecundity, life span and body mass. Reptiles were more sensitive than other classes. Grassland species had a lower probability of presence, though sample size was relatively small, but forest and shrubland species were more sensitive. Main conclusions: Habitat relationships were more important than life-history characteristics in predicting the effects of fragmentation. Habitat specialization increased sensitivity to fragmentation and interacted with class and habitat type; forest specialists and habitat-specific reptiles were particularly sensitive to fragmentation.
    Our results suggest that when conservationists are faced with disturbances that could fragment habitat they should pay particular attention to specialists, particularly reptiles. Further, our results highlight that the probability of presence in fragmented landscapes and true sensitivity to fragmentation are predicted by different factors.

  12. A new methodology to derive settleable particulate matter guidelines to assist policy-makers on reducing public nuisance

    NASA Astrophysics Data System (ADS)

    Machado, Milena; Santos, Jane Meri; Reisen, Valdério Anselmo; Reis, Neyval Costa; Mavroidis, Ilias; Lima, Ana T.

    2018-06-01

    Air quality standards for settleable particulate matter (SPM) are found in many countries around the world. As is well known, annoyance caused by SPM can be considered a community problem even if only a small proportion of the population is bothered on rather infrequent occasions. Many authors have shown that SPM causes soiling in residential and urban environments and degradation of materials (e.g., objects and surface painting) that can impair the use and enjoyment of property and alter the normal activities of society. In this context, the main contribution of this paper is to propose guidance for establishing air quality standards for annoyance caused by SPM in metropolitan industrial areas. To attain this objective, a new methodology is proposed which is based on the nonlinear correlation between perceived annoyance (a qualitative variable) and the particle deposition rate (a quantitative variable). Since the response variable is binary (annoyed and not annoyed), a logistic regression model is used to estimate the probability of people being annoyed at different levels of the particle deposition rate and to compute the odds ratio function, which gives, for a specific level of the particle deposition rate, the estimated expected value of the population's perceived annoyance. The proposed methodology is verified on a data set measured in the metropolitan area of Greater Vitória, Espírito Santo, Brazil. As a general conclusion, the estimated probability function of perceived annoyance as a function of SPM showed that 17% of inhabitants report annoyance at a very low particle deposition level of 5 g/(m²·30 days). In addition, for an increase of 1 g/(m²·30 days) of SPM, the smallest estimated odds ratio of perceived annoyance is 1.5, implying that the probability of occurrence of annoyance is almost 2 times as large as the probability of no occurrence of annoyance.
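
    The logistic-regression machinery described here is easy to sketch. The coefficients below are hypothetical (the abstract does not report the fitted values); the slope is chosen so the per-unit odds ratio is 1.5 and the intercept so that roughly 17% are annoyed at 5 g/(m²·30 days), matching the abstract's figures:

```python
import math

# Hypothetical coefficients for illustration only (not the paper's fit):
b1 = math.log(1.5)   # slope: odds ratio per 1 g/(m2*30 days) is exp(b1) = 1.5
b0 = -3.61           # intercept: gives P(annoyed) ~ 0.17 at a rate of 5

def p_annoyed(spm):
    """P(annoyed | deposition rate) under the logistic model."""
    return 1 / (1 + math.exp(-(b0 + b1 * spm)))

def odds(p):
    return p / (1 - p)

# The odds ratio for a one-unit increase is exp(b1), regardless of the level:
or_per_unit = odds(p_annoyed(6)) / odds(p_annoyed(5))
```

    Note the key property exploited in the paper: in a logistic model the odds ratio per unit increase is constant (exp(b1)) even though the probability itself changes nonlinearly with the deposition rate.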

  13. Chance-Constrained Guidance With Non-Convex Constraints

    NASA Technical Reports Server (NTRS)

    Ono, Masahiro

    2011-01-01

    Missions to small bodies, such as comets or asteroids, require autonomous guidance for descent to these small bodies. Such guidance is made challenging by uncertainty in the position and velocity of the spacecraft, as well as the uncertainty in the gravitational field around the small body. In addition, the requirement to avoid collision with the asteroid represents a non-convex constraint that means finding the optimal guidance trajectory, in general, is intractable. In this innovation, a new approach is proposed for chance-constrained optimal guidance with non-convex constraints. Chance-constrained guidance takes into account uncertainty so that the probability of collision is below a specified threshold. In this approach, a new bounding method has been developed to obtain a set of decomposed chance constraints that is a sufficient condition of the original chance constraint. The decomposition of the chance constraint enables its efficient evaluation, as well as the application of the branch and bound method. Branch and bound enables non-convex problems to be solved efficiently to global optimality. Considering the problem of finite-horizon robust optimal control of dynamic systems under Gaussian-distributed stochastic uncertainty, with state and control constraints, a discrete-time, continuous-state linear dynamics model is assumed. Gaussian-distributed stochastic uncertainty is a more natural model for exogenous disturbances such as wind gusts and turbulence than the previously studied set-bounded models. However, with stochastic uncertainty, it is often impossible to guarantee that state constraints are satisfied, because there is typically a non-zero probability of having a disturbance that is large enough to push the state out of the feasible region. An effective framework to address robustness with stochastic uncertainty is optimization with chance constraints. 
These require that the probability of violating the state constraints (i.e., the probability of failure) is below a user-specified bound known as the risk bound. An example problem is to drive a car to a destination as fast as possible while limiting the probability of an accident to 10^-7. This framework allows users to trade conservatism against performance by choosing the risk bound. The more risk the user accepts, the better performance they can expect.
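
    For a single linear constraint under Gaussian uncertainty, a chance constraint can be converted into a deterministic, tightened constraint via the inverse normal CDF. This is a one-dimensional sketch with illustrative numbers, not the article's decomposed multi-constraint formulation:

```python
from statistics import NormalDist

# Chance constraint: P(x + w <= b) >= 1 - eps, with disturbance w ~ N(0, sigma^2).
# This is equivalent to the deterministic constraint x <= b - sigma * z_{1-eps}.
eps = 0.05     # risk bound (illustrative)
sigma = 1.0    # disturbance standard deviation (illustrative)
b = 10.0       # constraint boundary (illustrative)

z = NormalDist().inv_cdf(1 - eps)   # standard-normal quantile z_{1-eps}
x_max = b - sigma * z               # most aggressive x satisfying the chance constraint

# Check: at x = x_max the violation probability equals the risk bound eps
viol = 1 - NormalDist(mu=x_max, sigma=sigma).cdf(b)
```

    The margin sigma * z_{1-eps} makes the conservatism/performance trade-off explicit: a smaller risk bound eps yields a larger quantile z and therefore a more conservative feasible set.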

  14. Capturing the complexity of uncertainty language to maximise its use.

    NASA Astrophysics Data System (ADS)

    Juanchich, Marie; Sirota, Miroslav

    2016-04-01

    Uncertainty is often communicated verbally, using uncertainty phrases such as 'there is a small risk of earthquake', 'flooding is possible' or 'it is very likely the sea level will rise'. Prior research has only examined a limited number of properties of uncertainty phrases: mainly the probability conveyed (e.g., 'a small chance' conveys a small probability whereas 'it is likely' conveys a high probability). We propose a new analytical framework that captures more of the complexity of uncertainty phrases by studying their semantic, pragmatic and syntactic properties. Further, we argue that the complexity of uncertainty phrases is functional and can be leveraged to best describe uncertain outcomes and achieve the goals of speakers. We will present findings from a corpus study and an experiment where we assessed the following properties of uncertainty phrases: probability conveyed, subjectivity, valence, nature of the subject, grammatical category of the uncertainty quantifier and whether the quantifier elicits a positive or a negative framing. Natural language processing techniques applied to corpus data showed that people use a very large variety of uncertainty phrases representing different configurations of the properties of uncertainty phrases (e.g., phrases that convey different levels of subjectivity, phrases with different grammatical construction). In addition, the corpus analysis uncovered that uncertainty phrases commonly studied in psychology are not the most commonly used in real life. In the experiment we manipulated the amount of evidence indicating that a fact was true and whether the participant was required to prove the fact was true or that it was false. Participants produced a phrase to communicate the likelihood that the fact was true (e.g., 'it is not sure…', 'I am convinced that…').
The analyses of the uncertainty phrases produced showed that participants leveraged the properties of uncertainty phrases to reflect the strength of evidence but also to achieve their personal goals. For example, participants aiming to prove that the fact was true chose words that conveyed a more positive polarity and a higher probability than participants aiming to prove that the fact was false. We discuss the utility of the framework for harnessing the properties of uncertainty phrases in geosciences.

  15. Shoot Development and Extension of Quercus serrata Saplings in Response to Insect Damage and Nutrient Conditions

    PubMed Central

    MIZUMACHI, ERI; MORI, AKIRA; OSAWA, NAOYA; AKIYAMA, REIKO; TOKUCHI, NAOKO

    2006-01-01

    • Background and Aims Plants have the ability to compensate for damage caused by herbivores. This is important to plant growth, because a plant cannot always avoid damage, even if it has developed defence mechanisms against herbivores. In previous work, we elucidated the herbivory-induced compensatory response of Quercus (at both the individual shoot and whole sapling levels) in both low- and high-nutrient conditions throughout one growing season. In this study, we determine how the compensatory growth of Quercus serrata saplings is achieved at different nutrient levels. • Methods Quercus serrata saplings were grown under controlled conditions. Length, number of leaves and percentage of leaf area lost on all extension units (EUs) were measured. • Key Results Both the probability of flushing and the length of subsequent EUs significantly increased with an increase in the length of the parent EU. The probability of flushing increased with an increase in leaf damage of the parent EU, but the length of subsequent EUs decreased. This indicates that EU growth is fundamentally regulated at the individual EU level. The probabilities of a second and third flush were significantly higher in plants in high-nutrient soil than those in low-nutrient soil. The subsequent EUs of damaged saplings were also significantly longer at high-nutrient conditions. • Conclusions An increase in the probability of flushes in response to herbivore damage is important for damaged saplings to produce new EUs; further, shortening the length of EUs helps to effectively reproduce foliage lost by herbivory. The probability of flushing also varied according to soil nutrient levels, suggesting that the compensatory growth of individual EUs in response to local damage levels is affected by the nutrients available to the whole sapling. PMID:16709576

  16. Early physiological response to intensive care as a clinically relevant approach to predicting the outcome in severe acute pancreatitis.

    PubMed

    Flint, Richard; Windsor, John A

    2004-04-01

    Hypothesis: The physiological response to treatment is a better predictor of outcome in acute pancreatitis than are traditional static measures. Design: Retrospective diagnostic test study. The criterion standard was Organ Failure Score (OFS) and Acute Physiology and Chronic Health Evaluation II (APACHE II) score at the time of hospital admission. Setting: Intensive care unit of a tertiary referral center, Auckland City Hospital, Auckland, New Zealand. Patients: Consecutive sample of 92 patients (60 male, 32 female; median age, 61 years; range, 24-79 years) with severe acute pancreatitis. Twenty patients were not included because of incomplete data. The cause of pancreatitis was gallstones (42%), alcohol use (27%), or other (31%). At hospital admission, the mean +/- SD OFS was 8.1 +/- 6.1, and the mean +/- SD APACHE II score was 19.9 +/- 8.2. Interventions: All cases were managed according to a standardized protocol. There was no randomization or testing of any individual interventions. Main outcome measures: Survival and death. Results: There were 32 deaths (pretest probability of dying was 35%). The physiological response to treatment was more accurate in predicting the outcome than was OFS or APACHE II score at hospital admission. For example, 17 patients had an initial OFS of 7-8 (posttest probability of dying was 58%); after 48 hours, 7 had responded to treatment (posttest probability of dying was 28%), and 10 did not respond (posttest probability of dying was 82%). The effect of the change in OFS and APACHE II score was graphically depicted by using a series of logistic regression equations. The resultant sigmoid curve suggests that there is a midrange of scores (the steep portion of the graph) within which the probability of death is most affected by the response to intensive care treatment. Conclusions: Measuring the initial severity of pancreatitis combined with the physiological response to intensive care treatment is a practical and clinically relevant approach to predicting death in patients with severe acute pancreatitis.

  17. Optimized Vertex Method and Hybrid Reliability

    NASA Technical Reports Server (NTRS)

    Smith, Steven A.; Krishnamurthy, T.; Mason, B. H.

    2002-01-01

    A method of calculating the fuzzy response of a system is presented. This method, called the Optimized Vertex Method (OVM), is based upon the vertex method but requires considerably fewer function evaluations. The method is demonstrated by calculating the response membership function of strain-energy release rate for a bonded joint with a crack. The possibility of failure of the bonded joint was determined over a range of loads. After completing the possibilistic analysis, the possibilistic (fuzzy) membership functions were transformed to probability density functions and the probability of failure of the bonded joint was calculated. This approach is called a possibility-based hybrid reliability assessment. The possibility and probability of failure are presented and compared to a Monte Carlo Simulation (MCS) of the bonded joint.
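
    The idea behind the underlying vertex method can be sketched for a single alpha-cut of the fuzzy inputs. The response function below is a toy stand-in for the strain-energy-release-rate model (for a function monotonic over the input box, the extremes of the output interval occur at the vertices; the OVM's savings come from reducing the number of these vertex evaluations):

```python
from itertools import product

def vertex_method(f, intervals):
    """Plain vertex method for one alpha-cut: evaluate the response f at
    every vertex of the box formed by the input intervals and return the
    resulting output interval (valid when f is monotonic over the box)."""
    values = [f(*vertex) for vertex in product(*intervals)]
    return min(values), max(values)

# Toy response function and input intervals (illustrative only):
g = lambda x, y: x * y + x
lo, hi = vertex_method(g, [(1, 2), (3, 4)])
```

    The plain method costs 2^n function evaluations for n fuzzy inputs at each alpha-cut, which is the expense the OVM is designed to avoid.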

  18. A meta-analysis of response-time tests of the sequential two-systems model of moral judgment.

    PubMed

    Baron, Jonathan; Gürçay, Burcu

    2017-05-01

    The (generalized) sequential two-system ("default interventionist") model of utilitarian moral judgment predicts that utilitarian responses often arise from a system-two correction of system-one deontological intuitions. Response-time (RT) results that seem to support this model are usually explained by the fact that low-probability responses have longer RTs. Following earlier results, we predicted response probability from each subject's tendency to make utilitarian responses (A, "Ability") and each dilemma's tendency to elicit deontological responses (D, "Difficulty"), estimated from a Rasch model. At the point where A = D, the two responses are equally likely, so probability effects cannot account for any RT differences between them. The sequential two-system model still predicts that many of the utilitarian responses made at this point will result from system-two corrections of system-one intuitions, hence should take longer. However, when A = D, RT for the two responses was the same, contradicting the sequential model. Here we report a meta-analysis of 26 data sets, which replicated the earlier results of no RT difference overall at the point where A = D. The data sets used three different kinds of moral judgment items, and the RT equality at the point where A = D held for all three. In addition, we found that RT increased with A-D. This result holds for subjects (characterized by Ability) but not for items (characterized by Difficulty). We explain the main features of this unanticipated effect, and of the main results, with a drift-diffusion model.
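
    A Rasch-type model of the kind used to estimate A and D treats the response probability as a logistic function of the difference A − D; the functional form and numbers here are an illustrative sketch, not the authors' exact parameterization:

```python
import math

def p_utilitarian(A, D):
    """Rasch-style model: probability of a utilitarian response given a
    subject's Ability A and a dilemma's Difficulty D (illustrative form)."""
    return 1 / (1 + math.exp(-(A - D)))

# At the point A = D the two responses are equally likely, so response
# probability cannot account for any RT difference there:
p_equal = p_utilitarian(0.7, 0.7)
p_easy = p_utilitarian(1.5, 0.0)   # high Ability, easy dilemma
```

    This is why the A = D point is the critical test: any RT asymmetry predicted by the sequential two-system model would have to survive there, where the usual low-probability/long-RT confound vanishes.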

  19. Recent ecological responses to climate change support predictions of high extinction risk

    PubMed Central

    Maclean, Ilya M. D.; Wilson, Robert J.

    2011-01-01

    Predicted effects of climate change include high extinction risk for many species, but confidence in these predictions is undermined by a perceived lack of empirical support. Many studies have now documented ecological responses to recent climate change, providing the opportunity to test whether the magnitude and nature of recent responses match predictions. Here, we perform a global and multi-taxon meta-analysis to show that empirical evidence for the realized effects of climate change supports predictions of future extinction risk. We use International Union for Conservation of Nature (IUCN) Red List criteria as a common scale to estimate extinction risks from a wide range of climate impacts, ecological responses, and methods of analysis, and we compare predictions with observations. Mean extinction probability across studies making predictions of the future effects of climate change was 7% by 2100 compared with 15% based on observed responses. After taking account of possible bias in the type of climate change impact analyzed and the parts of the world and taxa studied, there was less discrepancy between the two approaches: predictions suggested a mean extinction probability of 10% across taxa and regions, whereas empirical evidence gave a mean probability of 14%. As well as mean overall extinction probability, observations also supported predictions in terms of variability in extinction risk and the relative risk associated with broad taxonomic groups and geographic regions. These results suggest that predictions are robust to methodological assumptions and provide strong empirical support for the assertion that anthropogenic climate change is now a major threat to global biodiversity. PMID:21746924

  20. Recent ecological responses to climate change support predictions of high extinction risk.

    PubMed

    Maclean, Ilya M D; Wilson, Robert J

    2011-07-26

    Predicted effects of climate change include high extinction risk for many species, but confidence in these predictions is undermined by a perceived lack of empirical support. Many studies have now documented ecological responses to recent climate change, providing the opportunity to test whether the magnitude and nature of recent responses match predictions. Here, we perform a global and multi-taxon meta-analysis to show that empirical evidence for the realized effects of climate change supports predictions of future extinction risk. We use International Union for Conservation of Nature (IUCN) Red List criteria as a common scale to estimate extinction risks from a wide range of climate impacts, ecological responses, and methods of analysis, and we compare predictions with observations. Mean extinction probability across studies making predictions of the future effects of climate change was 7% by 2100 compared with 15% based on observed responses. After taking account of possible bias in the type of climate change impact analyzed and the parts of the world and taxa studied, there was less discrepancy between the two approaches: predictions suggested a mean extinction probability of 10% across taxa and regions, whereas empirical evidence gave a mean probability of 14%. As well as mean overall extinction probability, observations also supported predictions in terms of variability in extinction risk and the relative risk associated with broad taxonomic groups and geographic regions. These results suggest that predictions are robust to methodological assumptions and provide strong empirical support for the assertion that anthropogenic climate change is now a major threat to global biodiversity.

  1. Central circuitry in the jellyfish Aglantha. II: The ring giant and carrier systems

    PubMed

    Mackie; Meech

    1995-01-01

    1. The ring giant axon in the outer nerve ring of the jellyfish Aglantha digitale is a multinucleate syncytium 85% of which is occupied by an electron-dense fluid-filled vacuole apparently in a Gibbs-Donnan equilibrium with the surrounding band of cytoplasmic cortex. Micropipette recordings show small (-15 to -25 mV) and large (-62 to -66 mV) resting potentials. Low values, obtained with a high proportion of the micropipette penetrations, are assumed to be from the central vacuole; high values from the cytoplasmic cortex. Background electrical activity includes rhythmic oscillations and synaptic potentials representing hair cell input caused by vibration. 2. After the ring giant axon has been cut, propagating action potentials evoked by stimulation are conducted past the cut and re-enter the axon on the far side. The system responsible (the carrier system) through-conducts at a velocity approximately 25% of that of the ring giant axon and is probably composed of small neurones running in parallel with it. Numerous small neurones are seen by electron microscopy, some making one-way and some two-way synapses with the ring giant. 3. Despite their different conduction velocities, the two systems normally appear to fire in synchrony and at the velocity of the ring giant axon. We suggest that, once initiated, ring giant spikes propagate rapidly around the margin, firing the carrier neurones through serial synapses and giving them, in effect, the same high conduction velocity. Initiation of ring giant spikes can, however, require input from the carrier system. The spikes are frequently seen to be mounted on slow positive potentials representing summed carrier postsynaptic potentials. 4. The carrier system fires one-for-one with the giant axons of the tentacles and may mediate impulse traffic between the latter and the ring giant axon.
We suggest that the carrier system may also provide the pathways from the ring giant to the motor giant axons used in escape swimming. 5. The findings show that the ring giant axon functions in close collaboration with the carrier system, increasing the latter's effective conduction velocity, and that interactions with other neuronal sub-systems are probably mediated exclusively by the carrier system.

  2. Turbulent aerosol fluxes over the Arctic Ocean: 2. Wind-driven sources from the sea

    NASA Astrophysics Data System (ADS)

    Nilsson, E. D.; Rannik, Ü.; Swietlicki, E.; Leck, C.; Aalto, P. P.; Zhou, J.; Norman, M.

    2001-12-01

    An eddy-covariance flux system was successfully applied over open sea, leads and ice floes during the Arctic Ocean Expedition in July-August 1996. Wind-driven upward aerosol number fluxes were observed over open sea and leads in the pack ice. These particles must originate from droplets ejected into the air at the bursting of small air bubbles at the water surface. The source flux F (in 10^6 m^-2 s^-1) had a strong dependency on wind speed, log(F) = 0.20U - 1.71 and log(F) = 0.11U - 1.93, over the open sea and leads, respectively (where U is the local wind speed at about 10 m height). Over the open sea the wind-driven aerosol source flux consisted of a film drop mode centered at ~100 nm diameter and a jet drop mode centered at ~1 μm diameter. Over the leads in the pack ice, a jet drop mode at ~2 μm diameter dominated. The jet drop mode consisted of sea-salt, but oxalate indicated an organic contribution, and bacteria and other biogenic particles were identified by single particle analysis. Particles with diameters less than ~100 nm appear to have contributed to the flux, but their chemical composition is unknown. Whitecaps were probably the bubble source at open sea and on the leads at high wind speed, but a different bubble source is needed in the leads owing to their small fetch. Melting of ice in the leads is probably the best candidate. The flux over the open sea was of such a magnitude that it could give a significant contribution to the cloud condensation nuclei (CCN) population. Although the flux from the leads was roughly an order of magnitude smaller and the leads cover only a small fraction of the pack ice, the local source may still be important for the CCN population in Arctic fogs. The primary marine aerosol source will increase both with increased wind speed and with decreased ice fraction and extent. The local CCN production may therefore increase and influence cloud or fog albedo and lifetime in response to greenhouse warming in the Arctic Ocean region.
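
    Taking the fitted relations at face value (and assuming log denotes log10, as is usual for such regression fits), the source flux at a given wind speed follows directly:

```python
def aerosol_flux(U, a, b):
    """Wind-driven aerosol number source flux, in units of 1e6 m^-2 s^-1,
    from the fitted relation log10(F) = a*U + b, where U is the wind
    speed at ~10 m height."""
    return 10 ** (a * U + b)

# Coefficients from the abstract, evaluated at U = 10 m/s:
F_open = aerosol_flux(10.0, 0.20, -1.71)   # open sea
F_lead = aerosol_flux(10.0, 0.11, -1.93)   # leads in the pack ice
```

    At 10 m/s this gives roughly an order-of-magnitude larger flux over the open sea than over the leads, consistent with the abstract.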

  3. An efficient algorithm to compute marginal posterior genotype probabilities for every member of a pedigree with loops

    PubMed Central

    2009-01-01

    Background Marginal posterior genotype probabilities need to be computed for genetic analyses such as genetic counseling in humans and selective breeding in animal and plant species. Methods In this paper, we describe a peeling-based, deterministic, exact algorithm to efficiently compute genotype probabilities for every member of a pedigree with loops without recourse to junction-tree methods from graph theory. The efficiency in computing the likelihood by peeling comes from storing intermediate results in multidimensional tables called cutsets. Computing marginal genotype probabilities for individual i requires recomputing the likelihood for each of the possible genotypes of individual i. This can be done efficiently by storing intermediate results in two types of cutsets called anterior and posterior cutsets and reusing these intermediate results to compute the likelihood. Examples A small example is used to illustrate the theoretical concepts discussed in this paper, and marginal genotype probabilities are computed at a monogenic disease locus for every member in a real cattle pedigree. PMID:19958551
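
    Peeling gains its efficiency by reusing anterior and posterior cutsets, but the quantity it computes can be illustrated by brute-force marginalization over a father-mother-child trio. The allele frequency and the fully recessive, fully penetrant disease model below are illustrative assumptions, not values from the paper:

```python
from itertools import product

# Brute-force marginalization at a biallelic disease locus. Peeling
# computes the same marginals without enumerating the joint genotype
# space, which is what makes large looped pedigrees tractable.
GENOS = (0, 1, 2)                         # copies of the disease allele
q = 0.01                                  # disease-allele frequency (assumed)
HW = {0: (1 - q) ** 2, 1: 2 * q * (1 - q), 2: q ** 2}
PEN = {0: 0.0, 1: 0.0, 2: 1.0}            # P(affected | genotype), recessive

def transmit(parent: int) -> float:
    """Probability that the parent transmits the disease allele."""
    return {0: 0.0, 1: 0.5, 2: 1.0}[parent]

def child_prob(f: int, m: int, c: int) -> float:
    """Mendelian P(child genotype c | parental genotypes f, m)."""
    pf, pm = transmit(f), transmit(m)
    return {0: (1 - pf) * (1 - pm),
            1: pf * (1 - pm) + (1 - pf) * pm,
            2: pf * pm}[c]

def marginals(child_affected: bool = True):
    """Posterior genotype marginals for each member, given the child's phenotype."""
    post = {"father": [0.0] * 3, "mother": [0.0] * 3, "child": [0.0] * 3}
    for f, m, c in product(GENOS, repeat=3):
        like = PEN[c] if child_affected else 1 - PEN[c]
        w = HW[f] * HW[m] * child_prob(f, m, c) * like
        post["father"][f] += w
        post["mother"][m] += w
        post["child"][c] += w
    z = sum(post["child"])                # likelihood of the observed phenotype
    return {k: [p / z for p in v] for k, v in post.items()}

print(marginals())  # an affected child forces each parent to carry the allele
```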

  4. Intervals for posttest probabilities: a comparison of 5 methods.

    PubMed

    Mossman, D; Berger, J O

    2001-01-01

    Several medical articles discuss methods of constructing confidence intervals for single proportions and the likelihood ratio, but scant attention has been given to the systematic study of intervals for the posterior odds, or the positive predictive value, of a test. The authors describe 5 methods of constructing confidence intervals for posttest probabilities when estimates of sensitivity, specificity, and the pretest probability of a disorder are derived from empirical data. They then evaluate each method to determine how well the intervals' coverage properties correspond to their nominal value. When the estimates of pretest probabilities, sensitivity, and specificity are derived from more than 80 subjects and are not close to 0 or 1, all methods generate intervals with appropriate coverage properties. When these conditions are not met, however, the best-performing method is an objective Bayesian approach implemented by a simple simulation using a spreadsheet. Physicians and investigators can generate accurate confidence intervals for posttest probabilities in small-sample situations using the objective Bayesian approach.
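
    A minimal sketch of the kind of simulation the authors describe, assuming independent Beta posteriors with Jeffreys priors for sensitivity, specificity, and pretest probability (the paper's exact prior and spreadsheet implementation may differ):

```python
import random

def ppv_interval(tp, fn, fp, tn, d, nd, n_sim=20000, alpha=0.05, seed=1):
    """Percentile interval for the positive predictive value, by simulating
    from independent Beta posteriors (Jeffreys priors assumed here).

    tp/fn: diseased subjects testing positive/negative (sensitivity data)
    fp/tn: non-diseased subjects testing positive/negative (specificity data)
    d/nd:  diseased/non-diseased counts estimating the pretest probability
    """
    rng = random.Random(seed)
    draws = []
    for _ in range(n_sim):
        sens = rng.betavariate(tp + 0.5, fn + 0.5)
        spec = rng.betavariate(tn + 0.5, fp + 0.5)
        prev = rng.betavariate(d + 0.5, nd + 0.5)
        draws.append(sens * prev / (sens * prev + (1 - spec) * (1 - prev)))
    draws.sort()
    lo = draws[int(alpha / 2 * n_sim)]
    hi = draws[int((1 - alpha / 2) * n_sim) - 1]
    return lo, hi

lo, hi = ppv_interval(tp=45, fn=5, fp=10, tn=90, d=30, nd=120)  # made-up counts
print(f"95% interval for posttest probability: ({lo:.3f}, {hi:.3f})")
```

    The interval is simply a percentile interval of the simulated posttest-probability draws, which is what makes the approach workable in a spreadsheet.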

  5. Robust Estimation of Latent Ability in Item Response Models

    ERIC Educational Resources Information Center

    Schuster, Christof; Yuan, Ke-Hai

    2011-01-01

    Because of response disturbances such as guessing, cheating, or carelessness, item response models often can only approximate the "true" individual response probabilities. As a consequence, maximum-likelihood estimates of ability will be biased. Typically, the nature and extent to which response disturbances are present is unknown, and, therefore,…

  6. Why anthropic reasoning cannot predict Lambda.

    PubMed

    Starkman, Glenn D; Trotta, Roberto

    2006-11-17

    We revisit anthropic arguments purporting to explain the measured value of the cosmological constant. We argue that different ways of assigning probabilities to candidate universes lead to totally different anthropic predictions. As an explicit example, we show that weighting different universes by the total number of possible observations leads to an extremely small probability for observing a value of Lambda equal to or greater than what we now measure. We conclude that anthropic reasoning within the framework of probability as frequency is ill-defined and that in the absence of a fundamental motivation for selecting one weighting scheme over another the anthropic principle cannot be used to explain the value of Lambda, nor, likely, any other physical parameters.

  7. A simple model for DSS-14 outage times

    NASA Technical Reports Server (NTRS)

    Rumsey, H. C.; Stevens, R.; Posner, E. C.

    1989-01-01

    A model is proposed to describe DSS-14 outage times. Discrepancy Reporting System outage data for the period from January 1986 through September 1988 are used to estimate the parameters of the model. The model provides a probability distribution for the duration of outages, which agrees well with observed data. The model depends only on a small number of parameters, and has some heuristic justification. This shows that the Discrepancy Reporting System in the Deep Space Network (DSN) can be used to estimate the probability of extended outages in spite of the discrepancy reports ending when the pass ends. The probability of an outage extending beyond the end of a pass is estimated as around 5 percent.

  8. Confidence intervals for the population mean tailored to small sample sizes, with applications to survey sampling.

    PubMed

    Rosenblum, Michael A; Laan, Mark J van der

    2009-01-07

    The validity of standard confidence intervals constructed in survey sampling is based on the central limit theorem. For small sample sizes, the central limit theorem may give a poor approximation, resulting in confidence intervals that are misleading. We discuss this issue and propose methods for constructing confidence intervals for the population mean tailored to small sample sizes. We present a simple approach for constructing confidence intervals for the population mean based on tail bounds for the sample mean that are correct for all sample sizes. Bernstein's inequality provides one such tail bound. The resulting confidence intervals have guaranteed coverage probability under much weaker assumptions than are required for standard methods. A drawback of this approach, as we show, is that these confidence intervals are often quite wide. In response to this, we present a method for constructing much narrower confidence intervals, which are better suited for practical applications, and that are still more robust than confidence intervals based on standard methods, when dealing with small sample sizes. We show how to extend our approaches to much more general estimation problems than estimating the sample mean. We describe how these methods can be used to obtain more reliable confidence intervals in survey sampling. As a concrete example, we construct confidence intervals using our methods for the number of violent deaths between March 2003 and July 2006 in Iraq, based on data from the study "Mortality after the 2003 invasion of Iraq: A cross sectional cluster sample survey," by Burnham et al. (2006).
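
    A sketch of a tail-bound interval in the spirit described above, using the empirical Bernstein bound of Maurer and Pontil as a stand-in for the paper's construction (which may differ in detail); it is valid for [0, 1]-valued data at every sample size, with no normality assumption:

```python
import math

def empirical_bernstein_ci(xs, delta=0.05):
    """Two-sided confidence interval for the mean of [0, 1]-valued data,
    from the empirical Bernstein bound (Maurer & Pontil, 2009)."""
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs) / (n - 1)   # sample variance
    log_term = math.log(4 / delta)      # log(2/(delta/2)), one per side
    slack = math.sqrt(2 * var * log_term / n) + 7 * log_term / (3 * (n - 1))
    return max(0.0, mean - slack), min(1.0, mean + slack)

data = [0.2, 0.4, 0.1, 0.5, 0.3, 0.2, 0.6, 0.1, 0.3, 0.4]
lo, hi = empirical_bernstein_ci(data)
print(f"mean = {sum(data)/len(data):.2f}, 95% CI = ({lo:.3f}, {hi:.3f})")
```

    With n = 10 the slack term dominates and the interval is essentially (0, 1), which illustrates the paper's observation that these guaranteed-coverage intervals can be quite wide, motivating its narrower alternative construction.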

  9. Day-to-day variations in the amplitude of the soil temperature cycle and impact on adult eclosion timing of the onion fly.

    PubMed

    Tanaka, Kazuhiro; Watari, Yasuhiko

    2017-06-01

    The onion fly Delia antiqua advances its eclosion timing with decreasing temperature amplitude to compensate for a depth-dependent phase delay of the zeitgeber. To elucidate whether or not naturally occurring day-to-day variations in the amplitude of soil temperature cycle disturb this compensatory response, we monitored daily variations in the temperature amplitude in natural soils and evaluated the impact on adult eclosion timing. Our results indicated that both median and variance of the soil temperature amplitude become smaller as depth increases. Insertion of a larger temperature fluctuation into the thermoperiod with smaller temperature amplitude induced a stronger phase delay, while insertion of a smaller temperature fluctuation into the thermoperiod with larger temperature amplitude had a weaker phase-advancing effect. It is therefore expected that larger diurnal temperature fluctuations disturb the compensatory response, particularly if they occur at deeper locations, while smaller temperature fluctuations do so only at shallower locations. Under natural conditions, however, the probability of occurrence of smaller or larger temperature fluctuations in shallower or deeper soils, respectively, is relatively small. Thus, naturally occurring day-to-day variations in the temperature amplitude rarely disturb the compensatory response, thereby having a subtle or negligible impact on adult eclosion timing.

  10. Sequential treatment of icotinib after first-line pemetrexed in advanced lung adenocarcinoma with unknown EGFR gene status.

    PubMed

    Zheng, Yulong; Fang, Weijia; Deng, Jing; Zhao, Peng; Xu, Nong; Zhou, Jianying

    2014-07-01

    In non-small cell lung cancer (NSCLC), the epidermal growth factor receptor (EGFR) is a well-established therapeutic target. Activating EGFR mutations have proved strongly predictive of response to EGFR tyrosine kinase inhibitors (TKIs) in NSCLC. However, both in daily clinical practice and in clinical trials, patients with unknown EGFR gene status (UN-EGFR-GS) are very common. In this study, we assessed the efficacy and tolerability of sequential treatment with first-line pemetrexed followed by icotinib in Chinese patients with advanced lung adenocarcinoma and UN-EGFR-GS. We analyzed 38 patients with advanced lung adenocarcinoma and UN-EGFR-GS treated with first-line pemetrexed-based chemotherapy followed by icotinib as maintenance or second-line therapy. The response rates to pemetrexed and icotinib were 21.1% and 42.1%, respectively. The median overall survival was 27.0 months (95% CI, 19.7-34.2 months). The 12-month overall survival probability was 68.4%. The most common toxicities observed during the icotinib phase were rash, diarrhea, and elevated aminotransferases. Subgroup analysis indicated that overall survival was correlated with response to icotinib. The sequence of first-line pemetrexed-based chemotherapy followed by icotinib is a promising option for advanced lung adenocarcinoma with UN-EGFR-GS in China.

  11. Qualitative Amino Acid Analysis of Small Peptides by GC/MS.

    ERIC Educational Resources Information Center

    Mabbott, Gary A.

    1990-01-01

    Experiments designed to help undergraduate students gain experience operating instruments and interpreting gas chromatography and mass spectrometry data are presented. Experimental reagents, procedures, analysis, and probable results are discussed. (CW)

  12. How Stitches Help Kids Heal

    MedlinePlus

    ... cuts is a small sticky strip called a butterfly bandage. It keeps the edges of a shallow ... help. Different kinds of materials — sutures, glue, and butterflies — need different kinds of care. The doctor probably ...

  13. Mammal Inventory of the Mojave Network Parks-Death Valley and Joshua Tree National Parks, Lake Mead National Recreation Area, Manzanar National Historic Site, and Mojave National Preserve

    USGS Publications Warehouse

    Drost, Charles A.; Hart, Jan

    2008-01-01

    This report describes the results of a mammal inventory study of National Park Service units in the Mojave Desert Network, including Death Valley National Park, Joshua Tree National Park, Lake Mead National Recreation Area, Manzanar National Historic Site, and Mojave National Preserve. Fieldwork for the inventory focused on small mammals, primarily rodents and bats. Fieldwork for terrestrial small mammals used trapping with Sherman and Tomahawk small- and medium-sized mammal traps, along with visual surveys for diurnal species. The majority of sampling for terrestrial small mammals was carried out in 2002 and 2003. Methods used in field surveys for bats included mist-netting at tanks and other water bodies, along with acoustic surveys using Anabat. Most of the bat survey work was conducted in 2003. Because of extremely dry conditions in the first two survey years (and associated low mammal numbers), we extended field sampling into 2004, following a relatively wet winter. In addition to field sampling, we also reviewed, evaluated, and summarized museum and literature records of mammal species for all of the Park units. We documented a total of 59 mammal species as present at Death Valley National Park, with an additional five species that we consider of probable occurrence. At Joshua Tree, we also documented 50 species, and an additional four 'probable' species. At Lake Mead National Recreation Area, 57 mammal species have been positively documented, with 10 additional probable species. Manzanar National Historic Site had not been previously surveyed. We documented 19 mammal species at Manzanar, with an additional 11 probable species. Mojave National Preserve had not had a comprehensive list previously, either. There are now a total of 50 mammal species documented at Mojave, with three additional probable species. Of these totals, 23 occurrences are new at individual park units (positively documented for the first time), with most of these being at Manzanar. 
Noteworthy additions include western mastiff bat at Joshua Tree, house mouse at a number of wildland sites at Lake Mead, and San Diego pocket mouse at Mojave National Preserve. There are also species that have been lost from the Mojave Network parks. We discuss remaining questions, including the possible occurrence of additional species at each park area (most of these are marginal species whose distributional range may or may not edge into the boundaries of the area). Taxonomic changes are also discussed, along with potential erroneous species records.

  14. Concurrent variation of response bias and sensitivity in an operant-psychophysical test.

    NASA Technical Reports Server (NTRS)

    Terman, M.; Terman, J. S.

    1972-01-01

    The yes-no signal detection procedure was applied to a single-response operant paradigm in which rats discriminated between a standard auditory intensity and attenuated comparison values. The payoff matrix was symmetrical (with reinforcing brain stimulation for correct detections and brief time-out for errors), but signal probability and intensity differences were varied to generate a family of isobias and isosensitivity functions. The d' parameter remained fairly constant across a wide range of bias levels. Isobias functions deviated from a strict matching strategy as discrimination difficulty increased, although an orderly relation was maintained between signal probability value and the degree and direction of response bias.
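
    The separation of sensitivity from bias that this paradigm exploits can be computed from hit and false-alarm rates under the standard equal-variance Gaussian model; the rates below are illustrative, not data from the study:

```python
from statistics import NormalDist

Z = NormalDist().inv_cdf    # probit transform

def detection_indices(hit_rate: float, fa_rate: float):
    """Sensitivity d' and response bias (criterion c) from yes-no data."""
    d_prime = Z(hit_rate) - Z(fa_rate)
    criterion = -0.5 * (Z(hit_rate) + Z(fa_rate))
    return d_prime, criterion

# Shifting signal probability moves the criterion while d' stays constant:
print(detection_indices(0.84, 0.16))   # d' near 2, criterion near 0 (unbiased)
print(detection_indices(0.93, 0.31))   # similar d', liberal criterion (c < 0)
```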

  15. Research on response spectrum of dam based on scenario earthquake

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaoliang; Zhang, Yushan

    2017-10-01

    Taking a large hydropower station as an example, the response spectrum based on a scenario earthquake is determined. Firstly, the potential seismic source zone making the greatest contribution to the site hazard is identified from the results of probabilistic seismic hazard analysis (PSHA). Secondly, the magnitude and epicentral distance of the scenario earthquake are calculated from the main faults and the historical earthquakes of that source zone. Finally, the response spectrum of the scenario earthquake is calculated using the Next Generation Attenuation (NGA) relations. The response spectrum based on the scenario earthquake is lower than the probability-consistent response spectrum obtained by the PSHA method. The empirical analysis shows that the scenario-earthquake response spectrum accounts for both the probability level and structural factors, combining the advantages of the deterministic and probabilistic seismic hazard analysis methods; it is easier to accept in practice and provides a basis for the seismic design of hydraulic engineering works.

  16. Statistical learning of an auditory sequence and reorganization of acquired knowledge: A time course of word segmentation and ordering.

    PubMed

    Daikoku, Tatsuya; Yatomi, Yutaka; Yumoto, Masato

    2017-01-27

    Previous neural studies have supported the hypothesis that statistical learning mechanisms are used broadly across different domains such as language and music. However, these studies have only investigated a single aspect of statistical learning at a time, such as recognizing word boundaries or learning word order patterns. In this study, we neurally investigated how the two levels of statistical learning for recognizing word boundaries and word ordering could be reflected in neuromagnetic responses and how acquired statistical knowledge is reorganised when the syntactic rules are revised. Neuromagnetic responses to the Japanese-vowel sequence (a, e, i, o, and u), presented every 0.45 s, were recorded from 14 right-handed Japanese participants. The vowel order was constrained by a Markov stochastic model such that five nonsense words (aue, eao, iea, oiu, and uoi) were chained with an either-or rule: the probability of the forthcoming word was statistically defined (80% for one word; 20% for the other word) by the most recent two words. All of the word transition probabilities (80% and 20%) were switched in the middle of the sequence. In the first and second quarters of the sequence, the neuromagnetic responses to the words that appeared with higher transitional probability were significantly reduced compared with those that appeared with a lower transitional probability. After switching the word transition probabilities, the response reduction was replicated in the last quarter of the sequence. The responses to the final vowels in the words were significantly reduced compared with those to the initial vowels in the last quarter of the sequence. The results suggest that both within-word and between-word statistical learning are reflected in neural responses. The present study supports the hypothesis that listeners learn larger structures such as phrases first, and they subsequently extract smaller structures, such as words, from the learned phrases. 
The present study provides the first neurophysiological evidence that the correction of statistical knowledge requires more time than the acquisition of new statistical knowledge. Copyright © 2016 Elsevier Ltd. All rights reserved.
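
    The stimulus statistics can be sketched as a Markov chain over the five nonsense words. For brevity this sketch conditions on only the most recent word (the study conditions on the two most recent words), and the specific high/low successor pairings are placeholders, not the experiment's actual rule:

```python
import random

WORDS = ["aue", "eao", "iea", "oiu", "uoi"]

# Either-or rule: each context maps to a high-probability successor and
# one alternative. The actual pairings are not given in the abstract.
RULE = {"aue": ("eao", "iea"), "eao": ("iea", "oiu"), "iea": ("oiu", "uoi"),
        "oiu": ("uoi", "aue"), "uoi": ("aue", "eao")}

def generate(n_words: int, p_high: float = 0.8, switch_at=None, seed: int = 0):
    """Chain words with P(high successor) = p_high; optionally swap the
    80%/20% assignment mid-sequence, as in the study."""
    rng = random.Random(seed)
    seq = [rng.choice(WORDS)]
    for i in range(1, n_words):
        high, low = RULE[seq[-1]]
        if switch_at is not None and i >= switch_at:
            high, low = low, high          # transition probabilities switched
        seq.append(high if rng.random() < p_high else low)
    return seq

seq = generate(400, switch_at=200)         # probabilities switched at midpoint
```

    Response reductions to high-probability words, and their recovery after the switch, would then be analyzed against the first and second halves of such a sequence.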

  17. Impacts of forest fragmentation on species richness: a hierarchical approach to community modelling

    USGS Publications Warehouse

    Zipkin, Elise F.; DeWan, Amielle; Royle, J. Andrew

    2009-01-01

    1. Species richness is often used as a tool for prioritizing conservation action. One method for predicting richness and other summaries of community structure is to develop species-specific models of occurrence probability based on habitat or landscape characteristics. However, this approach can be challenging for rare or elusive species for which survey data are often sparse. 2. Recent developments have allowed for improved inference about community structure based on species-specific models of occurrence probability, integrated within a hierarchical modelling framework. This framework offers advantages to inference about species richness over typical approaches by accounting for both species-level effects and the aggregated effects of landscape composition on a community as a whole, thus leading to increased precision in estimates of species richness by improving occupancy estimates for all species, including those that were observed infrequently. 3. We developed a hierarchical model to assess the community response of breeding birds in the Hudson River Valley, New York, to habitat fragmentation and analysed the model using a Bayesian approach. 4. The model was designed to estimate species-specific occurrence and the effects of fragment area and edge (as measured through the perimeter and the perimeter/area ratio, P/A), while accounting for imperfect detection of species. 5. We used the fitted model to make predictions of species richness within forest fragments of variable morphology. The model revealed that species richness of the observed bird community was maximized in small forest fragments with a high P/A. However, the number of forest interior species, a subset of the community with high conservation value, was maximized in large fragments with low P/A. 6. Synthesis and applications. 
    Our results demonstrate the importance of understanding the responses of both individual species and groups of species to environmental heterogeneity, while illustrating the utility of hierarchical models for inference about species richness for conservation. This framework can be used to investigate the impacts of land-use change and fragmentation on species or assemblage richness, and to further understand trade-offs in species-specific occupancy probabilities associated with landscape variability.
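
    The way species-specific occupancy models aggregate into a richness prediction can be sketched as follows; the logistic coefficients are hypothetical stand-ins, not the fitted Hudson River Valley values:

```python
import math
import random

def occ_prob(beta0: float, b_area: float, b_pa: float,
             area: float, pa: float) -> float:
    """Occupancy probability for one species, logistic in area and P/A."""
    return 1 / (1 + math.exp(-(beta0 + b_area * area + b_pa * pa)))

def expected_richness(community, area, pa):
    """Expected richness = sum of species-specific occupancy probabilities."""
    return sum(occ_prob(b0, ba, bp, area, pa) for b0, ba, bp in community)

# 30 hypothetical species with random species-level coefficients.
rng = random.Random(7)
community = [(rng.gauss(-0.5, 1.0), rng.gauss(0.2, 0.3), rng.gauss(0.1, 0.3))
             for _ in range(30)]

for area, pa in [(0.5, 3.0), (4.0, 0.5)]:
    print(f"area={area}, P/A={pa}: E[richness] = "
          f"{expected_richness(community, area, pa):.1f}")
```

    In the hierarchical version, the species-level coefficients are drawn from shared community distributions, which is what lets rarely observed species borrow strength from the rest of the assemblage.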

  18. Channel Width Change as a Potential Sediment Source, Minnesota River Basin

    NASA Astrophysics Data System (ADS)

    Lauer, J. W.; Echterling, C.; Lenhart, C. F.; Rausch, R.; Belmont, P.

    2017-12-01

    Turbidity and suspended sediment are important management considerations along the Minnesota River. The system has experienced large and relatively consistent increases in both discharge and channel width over the past century. Here we consider the potential role of channel cross-section enlargement as a sediment source. Reach-average channel width was digitized from aerial images dated between 1937 and 2015 along multiple sub-reaches of the Minnesota River and its major tributaries. Many of the sub-reaches include several actively migrating bends. The analysis shows relatively consistent increases in width over time, with average increase rates of 0.4 percent per year. Extrapolation to the river network using a regional relationship for cross-sectional area vs. drainage area indicates that large tributaries and main-stem reaches account for most of the bankfull cross-sectional volume in the basin. Larger tributaries and the main stem thus appear more important for widening-related sediment production than small tributaries. On a basin-wide basis, widening could be responsible for a gross supply of more sediment than has been gaged at several main-stem sites, indicating that there may be important sinks for both sand and silt/clay size material distributed throughout the system. Sediment storage is probably largest along the lowest-slope reaches of the main stem. While channel width appears to have adjusted relatively quickly in response to discharge and other hydraulic modifications, net storage of sediment in floodplains probably occurs sufficiently slowly that depth adjustment will lag width adjustment significantly. Analysis of the lower Minnesota River using a river-segmenting approach allows a more detailed assessment of reach-scale processes. Away from channel cutoffs, elongation of the channel at eroding bends is consistent with rates observed on other actively migrating rivers. 
However, the sinuosity increase has been more than compensated by several natural and engineered cutoffs. The sinuosity change away from cutoffs probably plays a relatively modest role in the reach's sediment budget. However, point bars and abandoned oxbow lakes are important zones of sediment storage that may be large enough to account for much of the widening-related production of sand in the reach.

  19. Multiple interactions and rapidity gap survival

    NASA Astrophysics Data System (ADS)

    Khoze, V. A.; Martin, A. D.; Ryskin, M. G.

    2018-05-01

    Observations of rare processes containing large rapidity gaps at high energy colliders may be exceptionally informative. However the cross sections of these events are small in comparison with that for the inclusive processes since there is a large probability that the gaps may be filled by secondary particles arising from additional soft interactions or from gluon radiation. Here we review the calculations of the probability that the gaps survive population by particles from these effects for a wide range of different processes.

  20. SIERRA ANCHA WILDERNESS, ARIZONA.

    USGS Publications Warehouse

    Wrucke, Chester T.; Light, Thomas D.

    1984-01-01

    Mineral surveys show that the Sierra Ancha Wilderness in Arizona has demonstrated resources of uranium, asbestos, and iron; probable and substantiated resource potential for uranium, asbestos, and iron; and a probable resource potential for fluorspar. Uranium resources occur in vein and strata-bound deposits in siltstone that underlies much of the wilderness. Deposits of long-staple chrysotile asbestos are likely in parts of the wilderness adjacent to known areas of asbestos production. Magnetite deposits in the wilderness form a small iron resource. No fossil fuel resources were identified in this study.
