Sample records for population-based threshold model

  1. Should we expect population thresholds for wildlife disease?

    USGS Publications Warehouse

    Lloyd-Smith, James O.; Cross, P.C.; Briggs, C.J.; Daugherty, M.; Getz, W.M.; Latto, J.; Sanchez, M.; Smith, A.; Swei, A.

    2005-01-01

    Host population thresholds for invasion or persistence of infectious disease are core concepts of disease ecology, and underlie ongoing and controversial disease control policies based on culling and vaccination. Empirical evidence for these thresholds in wildlife populations has been sparse, though recent studies have narrowed this gap. Here we review the theoretical bases for population thresholds for disease, revealing why they are difficult to measure and sometimes are not even expected, and identifying important facets of wildlife ecology left out of current theories. We discuss strengths and weaknesses of selected empirical studies that have reported disease thresholds for wildlife, identify recurring obstacles, and discuss implications of our imperfect understanding of wildlife thresholds for disease control policy.

  2. Critical thresholds for eventual extinction in randomly disturbed population growth models.

    PubMed

    Peckham, Scott D; Waymire, Edward C; De Leenheer, Patrick

    2018-02-16

    This paper considers several single species growth models featuring a carrying capacity, which are subject to random disturbances that lead to instantaneous population reduction at the disturbance times. This is motivated in part by growing concerns about the impacts of climate change. Our main goal is to understand whether or not the species can persist in the long run. We consider the discrete-time stochastic process obtained by sampling the system immediately after the disturbances, and find various thresholds for several modes of convergence of this discrete process, including thresholds for the absence or existence of a positively supported invariant distribution. These thresholds are given explicitly in terms of the intensity and frequency of the disturbances on the one hand, and the population's growth characteristics on the other. We also perform a similar threshold analysis for the original continuous-time stochastic process, and obtain a formula that allows us to express the invariant distribution for this continuous-time process in terms of the invariant distribution of the discrete-time process, and vice versa. Examples illustrate that these distributions can differ, and this sends a cautionary message to practitioners who wish to parameterize these and related models using field data. Our analysis relies heavily on a particular feature shared by all the deterministic growth models considered here, namely that their solutions exhibit an exponentially weighted averaging property between a function of the initial condition, and the same function applied to the carrying capacity. This property is due to the fact that these systems can be transformed into affine systems.
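
    A minimal simulation sketch of the discrete-time process described above (assumed parameter names and a logistic growth law; not the authors' exact formulation): grow logistically between Poisson-timed disturbances, cut the population by a fixed fraction at each disturbance, and sample immediately afterwards. The balance between the growth accrued per inter-disturbance interval and the per-event reduction then plays the role of the persistence threshold.

      import numpy as np

      def sample_after_disturbances(r=1.0, K=1.0, freq=0.5, intensity=0.4,
                                    x0=0.5, n_events=2000, seed=0):
          """Logistic growth between Poisson-timed disturbances; each event
          multiplies the population by (1 - intensity). Returns the population
          sampled immediately after each disturbance (the discrete-time process).
          Parameter names and values are illustrative assumptions."""
          rng = np.random.default_rng(seed)
          x, samples = x0, []
          for _ in range(n_events):
              dt = rng.exponential(1.0 / freq)         # waiting time to next disturbance
              # closed-form logistic solution over the undisturbed interval
              x = K * x * np.exp(r * dt) / (K + x * (np.exp(r * dt) - 1.0))
              x *= (1.0 - intensity)                    # instantaneous reduction
              samples.append(x)
          return np.array(samples)

      # crude persistence heuristic: expected log-growth per event, r/freq + log(1 - intensity)
      for intensity in (0.2, 0.6, 0.9):
          s = sample_after_disturbances(intensity=intensity)
          print(intensity, s[-200:].mean())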

  3. Climate change, population immunity, and hyperendemicity in the transmission threshold of dengue.

    PubMed

    Oki, Mika; Yamamoto, Taro

    2012-01-01

    It has been suggested that the probability of dengue epidemics could increase because of climate change. The probability of epidemics is most commonly evaluated by the basic reproductive number (R(0)), and in mosquito-borne diseases, mosquito density (the number of female mosquitoes per person [MPP]) is the critical determinant of the R(0) value. In dengue-endemic areas, 4 different serotypes of dengue virus coexist (a state known as hyperendemicity), and a certain proportion of the population is immune to one or more of these serotypes. Nevertheless, these factors are not included in the calculation of R(0). We aimed to investigate the effects of temperature change, population immunity, and hyperendemicity on the threshold MPP that triggers an epidemic. We designed a mathematical model of dengue transmission dynamics. An epidemic was defined as a 10% increase in seroprevalence in a year, and the MPP that triggered an epidemic was defined as the threshold MPP. Simulations were conducted in Singapore based on the recorded temperatures from 1980 to 2009. The threshold MPP was estimated with the effect of (1) temperature only; (2) temperature and fluctuation of population immunity; and (3) temperature, fluctuation of immunity, and hyperendemicity. When only the effect of temperature was considered, the threshold MPP was estimated to be 0.53 in the 1980s and 0.46 in the 2000s, a decrease of 13.2%. When the fluctuation of population immunity and hyperendemicity were considered in the model, the threshold MPP decreased by 38.7%, from 0.93 to 0.57, from the 1980s to the 2000s. The threshold MPP was underestimated if population immunity was not considered and overestimated if hyperendemicity was not included in the simulations. In addition to temperature, these factors are particularly important when quantifying the threshold MPP for the purpose of setting goals for vector control in dengue-endemic areas.
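
    For orientation, the classical Ross-Macdonald expression relates R(0) for a mosquito-borne infection to the mosquito density per person m; setting R(0) = 1 and solving for m gives a threshold MPP. The sketch below uses that textbook formula with hypothetical parameter values only; the paper's model additionally accounts for temperature, waning population immunity, and the four co-circulating serotypes.

      import numpy as np

      def r0_ross_macdonald(m, a, b, c, mu, r, eip):
          """Textbook Ross-Macdonald R0 for a mosquito-borne infection.
          m: mosquitoes per person; a: bites per mosquito per day; b, c:
          transmission probabilities; mu: mosquito death rate; r: human
          recovery rate; eip: extrinsic incubation period in days.
          Values below are placeholders, not the paper's calibrated inputs."""
          return (m * a**2 * b * c * np.exp(-mu * eip)) / (r * mu)

      def threshold_mpp(a, b, c, mu, r, eip):
          """Mosquito density per person at which R0 equals 1."""
          return (r * mu) / (a**2 * b * c * np.exp(-mu * eip))

      m_star = threshold_mpp(a=0.3, b=0.5, c=0.5, mu=0.1, r=1/7, eip=10.0)
      print(m_star, r0_ross_macdonald(m_star, 0.3, 0.5, 0.5, 0.1, 1/7, 10.0))  # R0 = 1 at m_star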

  4. Climate Change, Population Immunity, and Hyperendemicity in the Transmission Threshold of Dengue

    PubMed Central

    Oki, Mika; Yamamoto, Taro

    2012-01-01

    Background It has been suggested that the probability of dengue epidemics could increase because of climate change. The probability of epidemics is most commonly evaluated by the basic reproductive number (R0), and in mosquito-borne diseases, mosquito density (the number of female mosquitoes per person [MPP]) is the critical determinant of the R0 value. In dengue-endemic areas, 4 different serotypes of dengue virus coexist (a state known as hyperendemicity), and a certain proportion of the population is immune to one or more of these serotypes. Nevertheless, these factors are not included in the calculation of R0. We aimed to investigate the effects of temperature change, population immunity, and hyperendemicity on the threshold MPP that triggers an epidemic. Methods and Findings We designed a mathematical model of dengue transmission dynamics. An epidemic was defined as a 10% increase in seroprevalence in a year, and the MPP that triggered an epidemic was defined as the threshold MPP. Simulations were conducted in Singapore based on the recorded temperatures from 1980 to 2009. The threshold MPP was estimated with the effect of (1) temperature only; (2) temperature and fluctuation of population immunity; and (3) temperature, fluctuation of immunity, and hyperendemicity. When only the effect of temperature was considered, the threshold MPP was estimated to be 0.53 in the 1980s and 0.46 in the 2000s, a decrease of 13.2%. When the fluctuation of population immunity and hyperendemicity were considered in the model, the threshold MPP decreased by 38.7%, from 0.93 to 0.57, from the 1980s to the 2000s. Conclusions The threshold MPP was underestimated if population immunity was not considered and overestimated if hyperendemicity was not included in the simulations. In addition to temperature, these factors are particularly important when quantifying the threshold MPP for the purpose of setting goals for vector control in dengue-endemic areas. PMID:23144746

  5. Threshold Dynamics of a Temperature-Dependent Stage-Structured Mosquito Population Model with Nested Delays.

    PubMed

    Wang, Xiunan; Zou, Xingfu

    2018-05-21

    Mosquito-borne diseases remain a significant threat to public health and economies. Since mosquitoes are quite sensitive to temperature, global warming may not only worsen disease transmission in current endemic areas but also help mosquito populations, together with the pathogens they carry, become established in new regions. Therefore, understanding mosquito population dynamics under the impact of temperature is important for making disease control policies. In this paper, we develop a stage-structured mosquito population model in the environment of a temperature-controlled experiment. The model turns out to be a system of periodic delay differential equations with periodic delays. We show that the basic reproduction number is a threshold parameter which determines whether the mosquito population goes extinct or remains persistent. We then estimate the parameter values for Aedes aegypti, the mosquito that transmits dengue virus. We verify the analytic result by numerical simulations with the temperature data of Colombo, Sri Lanka, where a dengue outbreak occurred in 2017.

  6. A study of life prediction differences for a nickel-base Alloy 690 using a threshold and a non-threshold model

    NASA Astrophysics Data System (ADS)

    Young, B. A.; Gao, Xiaosheng; Srivatsan, T. S.

    2009-10-01

    In this paper we compare and contrast the crack growth rate of a nickel-base superalloy (Alloy 690) in the Pressurized Water Reactor (PWR) environment. Over the last few years, a preponderance of test data has been gathered on both Alloy 690 thick plate and Alloy 690 tubing. The original model, essentially based on a small data set for thick plate, compensated for temperature, load ratio and stress-intensity range but did not compensate for the fatigue threshold of the material. As additional test data on both plate and tube product became available, the model was gradually revised to account for threshold properties. Both the original and revised models generated acceptable results for data that were above 1 × 10^-11 m/s. However, the test data at the lower growth rates were over-predicted by the non-threshold model. Since the original model did not take the fatigue threshold into account, it predicted no operating stress below which the material would be effectively free of fatigue crack growth. Because of an over-prediction of the growth rate below 1 × 10^-11 m/s, due to a combination of low stress, small crack size and long rise-time, the model in general leads to an under-prediction of the total available life of the components.
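
    The practical difference between the two model classes can be illustrated with a generic Paris-law form (a sketch with placeholder constants, not the Alloy 690 model itself): a non-threshold law predicts finite growth at every stress-intensity range, while a threshold-modified law drives da/dN to zero as the range approaches the threshold value.

      import numpy as np

      def dadn_no_threshold(delta_k, C=1e-12, m=3.0):
          """Plain Paris law: finite growth predicted at every stress-intensity range."""
          return C * np.asarray(delta_k, dtype=float)**m

      def dadn_with_threshold(delta_k, C=1e-12, m=3.0, dk_th=8.0, p=0.5):
          """A common threshold-modified form, da/dN = C * dK^m * (1 - dK_th/dK)^p,
          in which growth vanishes as delta_k approaches dk_th. Constants are
          placeholders, not fitted Alloy 690 values."""
          dk = np.asarray(delta_k, dtype=float)
          factor = np.clip(1.0 - dk_th / dk, 0.0, None)
          return C * dk**m * factor**p

      for dk in (9.0, 12.0, 20.0):                      # MPa*sqrt(m), illustrative
          print(dk, dadn_no_threshold(dk), dadn_with_threshold(dk))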

  7. From individual to population level effects of toxicants in the tubicifid Branchiura sowerbyi using threshold effect models in a Bayesian framework.

    PubMed

    Ducrot, Virginie; Billoir, Elise; Péry, Alexandre R R; Garric, Jeanne; Charles, Sandrine

    2010-05-01

    Effects of zinc were studied in the freshwater worm Branchiura sowerbyi using partial and full life-cycle tests. Only newborn and juveniles were sensitive to zinc, displaying effects on survival, growth, and age at first brood at environmentally relevant concentrations. Threshold effect models were proposed to assess toxic effects on individuals. They were fitted to life-cycle test data using Bayesian inference and adequately described life-history trait data in exposed organisms. The daily asymptotic growth rate of theoretical populations was then simulated with a matrix population model, based upon individual-level outputs. Population-level outputs were in accordance with existing literature for controls. Working in a Bayesian framework allowed incorporating parameter uncertainty in the simulation of the population-level response to zinc exposure, thus increasing the relevance of test results in the context of ecological risk assessment.

  8. Odor Detection Thresholds in a Population of Older Adults

    PubMed Central

    Schubert, Carla R.; Fischer, Mary E.; Pinto, A. Alex; Klein, Barbara E.K.; Klein, Ronald; Cruickshanks, Karen J.

    2016-01-01

    OBJECTIVE To measure odor detection thresholds and associated nasal and behavioral factors in an older adult population. STUDY DESIGN Cross-sectional cohort study. METHODS Odor detection thresholds were obtained using an automated olfactometer on 832 participants, aged 68–99 (mean age 77) years in the 21-year (2013–2016) follow-up visit of the Epidemiology of Hearing Loss Study. RESULTS The mean odor detection threshold (ODT) score was 8.2 (range: 1–13; standard deviation = 2.54), corresponding to an n-butanol concentration of slightly less than 0.03%. Older participants were significantly more likely to have lower (worse) ODT scores than younger participants (p<0.001). There were no significant differences in mean ODT scores between men and women. Older age was significantly associated with worse performance in multivariable regression models, and exercising at least once a week was associated with reduced odds of having a low (≤5) ODT score. Cognitive impairment was also associated with poor performance, while a history of allergies or a deviated septum was associated with better performance. CONCLUSION Odor detection threshold scores were worse in older age groups but similar between men and women in this large population of older adults. Regular exercise was associated with better odor detection thresholds, adding to the evidence that decline in olfactory function with age may be partly preventable. PMID:28000220

  9. A geographic analysis of population density thresholds in the influenza pandemic of 1918-19.

    PubMed

    Chandra, Siddharth; Kassens-Noor, Eva; Kuljanin, Goran; Vertalka, Joshua

    2013-02-20

    Geographic variables play an important role in the study of epidemics. The role of one such variable, population density, in the spread of influenza is controversial. Prior studies have tested for such a role using arbitrary thresholds for population density above or below which places are hypothesized to have higher or lower mortality. The results of such studies are mixed. The objective of this study is to estimate, rather than assume, a threshold level of population density that separates low-density regions from high-density regions on the basis of population loss during an influenza pandemic. We study the case of the influenza pandemic of 1918-19 in India, where over 15 million people died in the short span of less than one year. Using data from six censuses for 199 districts of India (n=1194), the country with the largest number of deaths from the influenza of 1918-19, we use a sample-splitting method embedded within a population growth model that explicitly quantifies population loss from the pandemic to estimate a threshold level of population density that separates low-density districts from high-density districts. The results demonstrate a threshold level of population density of 175 people per square mile. A concurrent finding is that districts on the low side of the threshold experienced rates of population loss (3.72%) that were lower than districts on the high side of the threshold (4.69%). This paper introduces a useful analytic tool to the health geographic literature. It illustrates an application of the tool to demonstrate that it can be useful for pandemic awareness and preparedness efforts. Specifically, it estimates a level of population density above which policies to socially distance, redistribute or quarantine populations are likely to be more effective than they are for areas with population densities that lie below the threshold.
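
    The spirit of the sample-splitting estimator can be conveyed by a grid search (a simplified stand-in, with simulated data, for the growth-model-embedded estimator used in the paper): for each candidate density threshold, fit the population loss separately below and above the split and keep the threshold that minimizes the combined residual sum of squares.

      import numpy as np

      def split_threshold(density, loss, candidates):
          """Grid-search a density threshold: fit a separate mean loss below and
          above each candidate split and return the split minimizing total SSE
          (a simplified stand-in for the paper's model-embedded estimator)."""
          best_t, best_sse = None, np.inf
          for t in candidates:
              lo, hi = loss[density <= t], loss[density > t]
              if len(lo) < 5 or len(hi) < 5:            # require both regimes populated
                  continue
              sse = ((lo - lo.mean())**2).sum() + ((hi - hi.mean())**2).sum()
              if sse < best_sse:
                  best_t, best_sse = t, sse
          return best_t

      rng = np.random.default_rng(1)
      density = rng.uniform(20, 600, 1000)              # people per square mile (simulated)
      loss = np.where(density > 175, 4.69, 3.72) + rng.normal(0, 0.5, 1000)
      print(split_threshold(density, loss, np.arange(50, 500, 5)))   # near 175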

  10. Patterns of threshold evolution in polyphenic insects under different developmental models.

    PubMed

    Tomkins, Joseph L; Moczek, Armin P

    2009-02-01

    Two hypotheses address the evolution of polyphenic traits in insects. Under the developmental reprogramming model, individuals exceeding a threshold follow a different developmental pathway from individuals below the threshold. This decoupling is thought to free selection to independently hone alternative morphologies, increasing phenotypic plasticity and morphological diversity. Under the alternative model, extreme positive allometry explains the existence of alternative phenotypes, and divergent phenotypes are developmentally coupled by a continuous reaction norm, such that selection on either morph acts on both. We test the hypothesis that continuous reaction norm polyphenisms evolve through changes in the allometric parameters of even the smallest males with minimal trait expression, whereas threshold polyphenisms evolve independently of the allometric parameters of individuals below the threshold. We compare two polyphenic species: the dung beetle Onthophagus taurus, whose allometry has been modeled both as a threshold polyphenism and as a continuous reaction norm, and the earwig Forficula auricularia, whose allometry is best modeled with a discontinuous threshold. We find that across populations of both species, variation in forceps or horn allometry among minor males is correlated with the population's threshold. These findings suggest that regardless of developmental mode, alternative morphs do not evolve independently of one another.

  11. Time Poverty Thresholds and Rates for the US Population

    ERIC Educational Resources Information Center

    Kalenkoski, Charlene M.; Hamrick, Karen S.; Andrews, Margaret

    2011-01-01

    Time constraints, like money constraints, affect Americans' well-being. This paper defines what it means to be time poor based on the concepts of necessary and committed time and presents time poverty thresholds and rates for the US population and certain subgroups. Multivariate regression techniques are used to identify the key variables…

  12. Sampling Based Influence Maximization on Linear Threshold Model

    NASA Astrophysics Data System (ADS)

    Jia, Su; Chen, Ling

    2018-04-01

    A sampling-based influence maximization method for the linear threshold (LT) model is presented. The method samples the routes in the possible worlds of the social network, and uses a Chernoff bound to estimate the number of samples so that the error can be constrained within a given bound. Then the activation probabilities of the routes in the possible worlds are calculated and used to compute the influence spread of each node in the network. Our experimental results show that our method can effectively select an appropriate seed node set that spreads larger influence than other similar methods.
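
    A minimal sketch of the two ingredients named above, under assumed details not spelled out in the abstract: Monte Carlo sampling of Linear Threshold diffusion for a given seed set, and a Hoeffding/Chernoff-style bound for choosing the number of samples so the spread estimate stays within a given error with high probability.

      import math
      import random

      def lt_spread_once(graph, weights, rng, seeds):
          """One sampled world of Linear Threshold diffusion. graph maps each node
          to its in-neighbors; weights maps (u, v) to an influence weight, with
          incoming weights per node summing to at most 1."""
          theta = {v: rng.random() for v in graph}       # node thresholds ~ U(0, 1)
          active, changed = set(seeds), True
          while changed:
              changed = False
              for v in graph:
                  if v in active:
                      continue
                  pressure = sum(weights.get((u, v), 0.0) for u in graph[v] if u in active)
                  if pressure >= theta[v]:
                      active.add(v)
                      changed = True
          return len(active)

      def n_samples(eps, delta):
          """Hoeffding-type bound: samples needed so the spread estimate, scaled to
          [0, 1] by the node count, is within eps of its mean with prob. >= 1 - delta."""
          return math.ceil(math.log(2.0 / delta) / (2.0 * eps**2))

      def estimate_spread(graph, weights, seeds, eps=0.05, delta=0.05, seed=0):
          rng = random.Random(seed)
          n = n_samples(eps, delta)
          return sum(lt_spread_once(graph, weights, rng, seeds) for _ in range(n)) / n

      # tiny toy network with hypothetical edges and weights
      graph = {1: [], 2: [1], 3: [1, 2], 4: [2, 3]}
      weights = {(1, 2): 0.6, (1, 3): 0.4, (2, 3): 0.4, (2, 4): 0.5, (3, 4): 0.5}
      print(estimate_spread(graph, weights, seeds={1}))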

  13. Diversity Outbred Mice Identify Population-Based Exposure Thresholds and Genetic Factors that Influence Benzene-Induced Genotoxicity

    PubMed Central

    Gatti, Daniel M.; Morgan, Daniel L.; Kissling, Grace E.; Shockley, Keith R.; Knudsen, Gabriel A.; Shepard, Kim G.; Price, Herman C.; King, Deborah; Witt, Kristine L.; Pedersen, Lars C.; Munger, Steven C.; Svenson, Karen L.; Churchill, Gary A.

    2014-01-01

    Background Inhalation of benzene at levels below the current exposure limit values leads to hematotoxicity in occupationally exposed workers. Objective We sought to evaluate Diversity Outbred (DO) mice as a tool for exposure threshold assessment and to identify genetic factors that influence benzene-induced genotoxicity. Methods We exposed male DO mice to benzene (0, 1, 10, or 100 ppm; 75 mice/exposure group) via inhalation for 28 days (6 hr/day for 5 days/week). The study was repeated using two independent cohorts of 300 animals each. We measured micronuclei frequency in reticulocytes from peripheral blood and bone marrow and applied benchmark concentration modeling to estimate exposure thresholds. We genotyped the mice and performed linkage analysis. Results We observed a dose-dependent increase in benzene-induced chromosomal damage and estimated a benchmark concentration limit of 0.205 ppm benzene using DO mice. This estimate is an order of magnitude below the value estimated using B6C3F1 mice. We identified a locus on Chr 10 (31.87 Mb) that contained a pair of overexpressed sulfotransferases that were inversely correlated with genotoxicity. Conclusions The genetically diverse DO mice provided a reproducible response to benzene exposure. The DO mice display interindividual variation in toxicity response and, as such, may more accurately reflect the range of response that is observed in human populations. Studies using DO mice can localize genetic associations with high precision. The identification of sulfotransferases as candidate genes suggests that DO mice may provide additional insight into benzene-induced genotoxicity. Citation French JE, Gatti DM, Morgan DL, Kissling GE, Shockley KR, Knudsen GA, Shepard KG, Price HC, King D, Witt KL, Pedersen LC, Munger SC, Svenson KL, Churchill GA. 2015. Diversity Outbred mice identify population-based exposure thresholds and genetic factors that influence benzene-induced genotoxicity. Environ Health Perspect 123:237

  14. Resonances and thresholds in the Rydberg-level population of multiply charged ions at solid surfaces

    NASA Astrophysics Data System (ADS)

    Nedeljković, Lj. D.; Nedeljković, N. N.

    1998-12-01

    We present a theoretical study of resonances and thresholds, two specific features of Rydberg-state formation of multiply charged ions (Z = 6, 7, and 8) escaping a solid surface at intermediate velocities (v ~ 1 a.u.) in the normal emergence geometry. The resonances are recognized in pronounced maxima of the experimentally observed population curves of Ar VIII ions for resonant values of the principal quantum number n = n_res = 11 and for the angular momentum quantum numbers l = 1 and 2. Absence of optical signals in detectors of beam-foil experiments for n > n_thr of S VI and Cl VII ions (with l = 0, 1, and 2) and Ar VIII for l = 0 is interpreted as a threshold phenomenon. An interplay between resonance and threshold effects is established within the framework of quantum dynamics of the low angular momentum Rydberg-state formation, based on a generalization of Demkov-Ostrovskii's charge-exchange model. In the model proposed, the Ar VIII resonances appear as a consequence of electron tunneling in the very vicinity of the ion-surface potential barrier top and at some critical ion-surface distances R_c. The observed thresholds are explained by means of a decay mechanism of ionic Rydberg states formed dominantly above the Fermi level E_F of a solid conduction band. The theoretically predicted resonant and threshold values, n_res and n_thr, of the principal quantum number n, as well as the obtained population probabilities P_nl = P_nl(v, Z), are in sufficiently good agreement with all available experimental findings.

  15. Identifying optimal threshold statistics for elimination of hookworm using a stochastic simulation model.

    PubMed

    Truscott, James E; Werkman, Marleen; Wright, James E; Farrell, Sam H; Sarkar, Rajiv; Ásbjörnsdóttir, Kristjana; Anderson, Roy M

    2017-06-30

    There is an increased focus on whether mass drug administration (MDA) programmes alone can interrupt the transmission of soil-transmitted helminths (STH). Mathematical models can be used to model these interventions and are increasingly being implemented to inform investigators about expected trial outcome and the choice of optimum study design. One key factor is the choice of threshold for detecting elimination. However, there are currently no thresholds defined for STH regarding breaking transmission. We develop a simulation of an elimination study, based on the DeWorm3 project, using an individual-based stochastic disease transmission model in conjunction with models of MDA, sampling, diagnostics and the construction of study clusters. The simulation is then used to analyse the relationship between the study end-point elimination threshold and whether elimination is achieved in the long term within the model. We analyse the quality of a range of statistics in terms of the positive predictive values (PPV) and how they depend on a range of covariates, including threshold values, baseline prevalence, measurement time point and how clusters are constructed. End-point infection prevalence performs well in discriminating between villages that achieve interruption of transmission and those that do not, although the quality of the threshold is sensitive to baseline prevalence and threshold value. Optimal post-treatment prevalence threshold value for determining elimination is in the range 2% or less when the baseline prevalence range is broad. For multiple clusters of communities, both the probability of elimination and the ability of thresholds to detect it are strongly dependent on the size of the cluster and the size distribution of the constituent communities. Number of communities in a cluster is a key indicator of probability of elimination and PPV. Extending the time, post-study endpoint, at which the threshold statistic is measured improves PPV value in
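
    The core of the threshold-quality assessment is a positive predictive value calculation: among simulated clusters whose end-point prevalence falls below a candidate threshold, what fraction truly reach elimination in the long run? A hedged sketch, with invented arrays standing in for the stochastic model's outputs:

      import numpy as np

      def ppv_of_threshold(endpoint_prev, eliminated, threshold):
          """Positive predictive value of 'end-point prevalence <= threshold' as a
          predictor of true long-term elimination, computed over simulated clusters."""
          flagged = endpoint_prev <= threshold
          if flagged.sum() == 0:
              return np.nan
          return (eliminated & flagged).sum() / flagged.sum()

      rng = np.random.default_rng(3)
      prev = rng.uniform(0.0, 0.10, 500)                             # end-of-study prevalence
      elim = rng.random(500) < np.clip(1.0 - 12.0 * prev, 0.0, 1.0)  # toy "truth"
      for t in (0.01, 0.02, 0.05):
          print(t, round(ppv_of_threshold(prev, elim, t), 3))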

  16. Optimal control of population recovery--the role of economic restoration threshold.

    PubMed

    Lampert, Adam; Hastings, Alan

    2014-01-01

    A variety of ecological systems around the world have been damaged in recent years, either by natural factors such as invasive species, storms and global change or by direct human activities such as overfishing and water pollution. Restoration of these systems to provide ecosystem services entails significant economic benefits. Thus, choosing how and when to restore in an optimal fashion is important, but has not been well studied. Here we examine a general model where population growth can be induced or accelerated by investing in active restoration. We show that the most cost-effective method to restore an ecosystem dictates investment until the population approaches an 'economic restoration threshold', a density above which the ecosystem should be left to recover naturally. Therefore, determining this threshold is a key general approach for guiding efficient restoration management, and we demonstrate how to calculate this threshold for both deterministic and stochastic ecosystems. © 2013 John Wiley & Sons Ltd/CNRS.

  17. A threshold-based weather model for predicting stripe rust infection in winter wheat

    USDA-ARS?s Scientific Manuscript database

    Wheat stripe rust (WSR) (caused by Puccinia striiformis sp. tritici) is a major threat in most wheat growing regions worldwide, with potential to inflict regular yield losses when environmental conditions are favorable. We propose a threshold-based disease-forecasting model using a stepwise modeling...

  18. Sign language spotting with a threshold model based on conditional random fields.

    PubMed

    Yang, Hee-Deok; Sclaroff, Stan; Lee, Seong-Whan

    2009-07-01

    Sign language spotting is the task of detecting and recognizing signs in a signed utterance, in a set vocabulary. The difficulty of sign language spotting is that instances of signs vary in both motion and appearance. Moreover, signs appear within a continuous gesture stream, interspersed with transitional movements between signs in a vocabulary and nonsign patterns (which include out-of-vocabulary signs, epentheses, and other movements that do not correspond to signs). In this paper, a novel method for designing threshold models in a conditional random field (CRF) model is proposed which performs an adaptive threshold for distinguishing between signs in a vocabulary and nonsign patterns. A short-sign detector, a hand appearance-based sign verification method, and a subsign reasoning method are included to further improve sign language spotting accuracy. Experiments demonstrate that our system can spot signs from continuous data with an 87.0 percent spotting rate and can recognize signs from isolated data with a 93.5 percent recognition rate versus 73.5 percent and 85.4 percent, respectively, for CRFs without a threshold model, short-sign detection, subsign reasoning, and hand appearance-based sign verification. Our system can also achieve a 15.0 percent sign error rate (SER) from continuous data and a 6.4 percent SER from isolated data versus 76.2 percent and 14.5 percent, respectively, for conventional CRFs.

  19. A geographic analysis of population density thresholds in the influenza pandemic of 1918–19

    PubMed Central

    2013-01-01

    Background Geographic variables play an important role in the study of epidemics. The role of one such variable, population density, in the spread of influenza is controversial. Prior studies have tested for such a role using arbitrary thresholds for population density above or below which places are hypothesized to have higher or lower mortality. The results of such studies are mixed. The objective of this study is to estimate, rather than assume, a threshold level of population density that separates low-density regions from high-density regions on the basis of population loss during an influenza pandemic. We study the case of the influenza pandemic of 1918–19 in India, where over 15 million people died in the short span of less than one year. Methods Using data from six censuses for 199 districts of India (n=1194), the country with the largest number of deaths from the influenza of 1918–19, we use a sample-splitting method embedded within a population growth model that explicitly quantifies population loss from the pandemic to estimate a threshold level of population density that separates low-density districts from high-density districts. Results The results demonstrate a threshold level of population density of 175 people per square mile. A concurrent finding is that districts on the low side of the threshold experienced rates of population loss (3.72%) that were lower than districts on the high side of the threshold (4.69%). Conclusions This paper introduces a useful analytic tool to the health geographic literature. It illustrates an application of the tool to demonstrate that it can be useful for pandemic awareness and preparedness efforts. Specifically, it estimates a level of population density above which policies to socially distance, redistribute or quarantine populations are likely to be more effective than they are for areas with population densities that lie below the threshold. PMID:23425498

  20. Between-airport heterogeneity in air toxics emissions associated with individual cancer risk thresholds and population risks

    PubMed Central

    2009-01-01

    Background Airports represent a complex source type of increasing importance contributing to air toxics risks. Comprehensive atmospheric dispersion models are beyond the scope of many applications, so it would be valuable to rapidly but accurately characterize the risk-relevant exposure implications of emissions at an airport. Methods In this study, we apply a high resolution atmospheric dispersion model (AERMOD) to 32 airports across the United States, focusing on benzene, 1,3-butadiene, and benzo[a]pyrene. We estimate the emission rates required at these airports to exceed a 10^-6 lifetime cancer risk for the maximally exposed individual (emission thresholds) and estimate the total population risk at these emission rates. Results The emission thresholds vary by two orders of magnitude across airports, with variability predicted by proximity of populations to the airport and mixing height (R^2 = 0.74–0.75 across pollutants). At these emission thresholds, the population risk within 50 km of the airport varies by two orders of magnitude across airports, driven by substantial heterogeneity in total population exposure per unit emissions that is related to population density and uncorrelated with emission thresholds. Conclusion Our findings indicate that site characteristics can be used to accurately predict maximum individual risk and total population risk at a given level of emissions, but that optimizing on one endpoint will be non-optimal for the other. PMID:19426510

  1. Estimating population extinction thresholds with categorical classification trees for Louisiana black bears

    USGS Publications Warehouse

    Laufenberg, Jared S.; Clark, Joseph D.; Chandler, Richard B.

    2018-01-01

    Monitoring vulnerable species is critical for their conservation. Thresholds or tipping points are commonly used to indicate when populations become vulnerable to extinction and to trigger changes in conservation actions. However, quantitative methods to determine such thresholds have not been well explored. The Louisiana black bear (Ursus americanus luteolus) was removed from the list of threatened and endangered species under the U.S. Endangered Species Act in 2016, and our objectives were to determine the most appropriate parameters and thresholds for monitoring and management action. Capture-mark-recapture (CMR) data from 2006 to 2012 were used to estimate population parameters and variances. We used stochastic population simulations and conditional classification trees to identify demographic rates for monitoring that would be most indicative of heightened extinction risk. We then identified thresholds that would be reliable predictors of population viability. Conditional classification trees indicated that the annual apparent survival rate for adult females averaged over 5 years was the best predictor of population persistence. Specifically, population persistence was estimated to be ≥95% over 100 years when this rate was at or above the identified threshold value, suggesting that this statistic can be used as a threshold to trigger management intervention. Our evaluation produced monitoring protocols that reliably predicted population persistence and were cost-effective. We conclude that population projections and conditional classification trees can be valuable tools for identifying extinction thresholds used in monitoring programs.

  2. Estimating population extinction thresholds with categorical classification trees for Louisiana black bears.

    PubMed

    Laufenberg, Jared S; Clark, Joseph D; Chandler, Richard B

    2018-01-01

    Monitoring vulnerable species is critical for their conservation. Thresholds or tipping points are commonly used to indicate when populations become vulnerable to extinction and to trigger changes in conservation actions. However, quantitative methods to determine such thresholds have not been well explored. The Louisiana black bear (Ursus americanus luteolus) was removed from the list of threatened and endangered species under the U.S. Endangered Species Act in 2016, and our objectives were to determine the most appropriate parameters and thresholds for monitoring and management action. Capture-mark-recapture (CMR) data from 2006 to 2012 were used to estimate population parameters and variances. We used stochastic population simulations and conditional classification trees to identify demographic rates for monitoring that would be most indicative of heightened extinction risk. We then identified thresholds that would be reliable predictors of population viability. Conditional classification trees indicated that the annual apparent survival rate for adult females averaged over 5 years ([Formula: see text]) was the best predictor of population persistence. Specifically, population persistence was estimated to be ≥95% over 100 years when [Formula: see text], suggesting that this statistic can be used as a threshold to trigger management intervention. Our evaluation produced monitoring protocols that reliably predicted population persistence and were cost-effective. We conclude that population projections and conditional classification trees can be valuable tools for identifying extinction thresholds used in monitoring programs.

  3. Consequences of the genetic threshold model for observing partial migration under climate change scenarios.

    PubMed

    Cobben, Marleen M P; van Noordwijk, Arie J

    2017-10-01

    Migration is a widespread phenomenon across the animal kingdom as a response to seasonality in environmental conditions. Partially migratory populations are populations that consist of both migratory and residential individuals. Such populations are very common, yet their stability has long been debated. The inheritance of migratory activity is currently best described by the threshold model of quantitative genetics. The inclusion of such a genetic threshold model for migratory behavior leads to a stable zone in time and space of partially migratory populations under a wide range of demographic parameter values, when assuming stable environmental conditions and unlimited genetic diversity. Migratory species are expected to be particularly sensitive to global warming, as arrival at the breeding grounds might be increasingly mistimed as a result of the uncoupling of long-used cues and actual environmental conditions, with decreasing reproduction as a consequence. Here, we investigate the consequences for migratory behavior and the stability of partially migratory populations under five climate change scenarios and the assumption of a genetic threshold value for migratory behavior in an individual-based model. The results show a spatially and temporally stable zone of partially migratory populations after different lengths of time in all scenarios. In the scenarios in which the species expands its range from a particular set of starting populations, the genetic diversity and location at initialization determine the species' colonization speed across the zone of partial migration and therefore across the entire landscape. Abruptly changing environmental conditions after model initialization never caused a qualitative change in phenotype distributions, or complete extinction. This suggests that climate change-induced shifts in species' ranges as well as changes in survival probabilities and reproductive success can be met with flexibility in migratory behavior at the

  4. Speech Recognition Thresholds for Multilingual Populations.

    ERIC Educational Resources Information Center

    Ramkissoon, Ishara

    2001-01-01

    This article traces the development of speech audiometry in the United States and reports on the current status, focusing on the needs of a multilingual population in terms of measuring speech recognition threshold (SRT). It also discusses sociolinguistic considerations, alternative SRT stimuli for second language learners, and research on using…

  5. Sustainable thresholds for cooperative epidemiological models.

    PubMed

    Barrios, Edwin; Gajardo, Pedro; Vasilieva, Olga

    2018-05-22

    In this paper, we introduce a method for computing sustainable thresholds for controlled cooperative models described by a system of ordinary differential equations, a property shared by a wide class of compartmental models in epidemiology. The set of sustainable thresholds refers to constraints (e.g., maximal "allowable" number of human infections; maximal "affordable" budget for disease prevention, diagnosis and treatments; etc.), parameterized by thresholds, that can be sustained by applying an admissible control strategy starting at the given initial state and lasting the whole period of the control intervention. This set, determined by the initial state of the dynamical system, virtually provides useful information for more efficient (or cost-effective) decision-making by exhibiting the trade-offs between different types of constraints and allowing the user to assess future outcomes of control measures on transient behavior of the dynamical system. In order to accentuate the originality of our approach and to reveal its potential significance in real-life applications, we present an example relying on the 2013 dengue outbreak in Cali, Colombia, where we compute the set of sustainable thresholds (in terms of the maximal "affordable" budget and the maximal "allowable" levels of active infections among human and vector populations) that could be sustained during the epidemic outbreak. Copyright © 2018 Elsevier Inc. All rights reserved.

  6. Locating helicopter emergency medical service bases to optimise population coverage versus average response time.

    PubMed

    Garner, Alan A; van den Berg, Pieter L

    2017-10-16

    New South Wales (NSW), Australia has a network of multirole retrieval physician-staffed helicopter emergency medical services (HEMS) with seven bases servicing a jurisdiction whose population is concentrated along the eastern seaboard. The aim of this study was to estimate optimal HEMS base locations within NSW using advanced mathematical modelling techniques. We used high-resolution census population data for NSW from 2011, which divides the state into areas containing 200-800 people. Optimal HEMS base locations were estimated using the maximal covering location problem facility location optimization model and the average response time model, exploring the number of bases needed to cover various fractions of the population for a 45 min response time threshold or minimizing the overall average response time to all persons, both in greenfield scenarios and conditioning on the current base structure. We also developed a hybrid mathematical model where average response time was optimised based on minimum population coverage thresholds. Seven bases could cover 98% of the population within 45 min when optimised for coverage, or reach the entire population of the state within an average of 21 min if optimised for response time. Given the existing bases, adding two bases could either increase the 45 min coverage from 91% to 97% or decrease the average response time from 21 min to 19 min. Adding a single specialist prehospital rapid response HEMS to the area of greatest population concentration decreased the average statewide response time by 4 min. The optimum seven-base hybrid model, which covered 97.75% of the population within 45 min and reached all of the population within an average response time of 18 min, included the rapid response HEMS base. HEMS base locations can be optimised based on either the percentage of the population covered or the average response time to the entire population. We have also demonstrated a hybrid technique that optimizes response time for a given
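
    The maximal covering location problem mentioned above has a standard greedy approximation that conveys the idea (a toy sketch with made-up coordinates; the study solved the optimization exactly on census data): repeatedly add the candidate base that newly covers the most population within the response-time radius.

      import numpy as np

      def greedy_mclp(pop_xy, pop_size, cand_xy, radius, n_bases):
          """Greedy maximal-covering heuristic: pick the base adding the most newly
          covered population within `radius` (a distance standing in for the 45 min
          response-time threshold)."""
          dist = np.linalg.norm(pop_xy[:, None, :] - cand_xy[None, :, :], axis=2)
          within = dist <= radius                        # population point x candidate base
          covered = np.zeros(len(pop_xy), dtype=bool)
          chosen = []
          for _ in range(n_bases):
              gains = [pop_size[~covered & within[:, j]].sum() for j in range(len(cand_xy))]
              j = int(np.argmax(gains))
              chosen.append(j)
              covered |= within[:, j]
              print(f"base {j}: coverage now {pop_size[covered].sum() / pop_size.sum():.1%}")
          return chosen

      rng = np.random.default_rng(7)
      pop_xy = rng.uniform(0, 100, (2000, 2))            # hypothetical settlement coordinates
      pop_size = rng.integers(200, 800, 2000)            # people per census area
      cand_xy = rng.uniform(0, 100, (30, 2))             # hypothetical candidate base sites
      greedy_mclp(pop_xy, pop_size, cand_xy, radius=20.0, n_bases=3)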

  7. A probabilistic Poisson-based model accounts for an extensive set of absolute auditory threshold measurements.

    PubMed

    Heil, Peter; Matysiak, Artur; Neubauer, Heinrich

    2017-09-01

    Thresholds for detecting sounds in quiet decrease with increasing sound duration in every species studied. The neural mechanisms underlying this trade-off, often referred to as temporal integration, are not fully understood. Here, we probe the human auditory system with a large set of tone stimuli differing in duration, shape of the temporal amplitude envelope, duration of silent gaps between bursts, and frequency. Duration was varied by varying the plateau duration of plateau-burst (PB) stimuli, the duration of the onsets and offsets of onset-offset (OO) stimuli, and the number of identical bursts of multiple-burst (MB) stimuli. Absolute thresholds for a large number of ears (>230) were measured using a 3-interval-3-alternative forced choice (3I-3AFC) procedure. Thresholds decreased with increasing sound duration in a manner that depended on the temporal envelope. Most commonly, thresholds for MB stimuli were highest followed by thresholds for OO and PB stimuli of corresponding durations. Differences in the thresholds for MB and OO stimuli and in the thresholds for MB and PB stimuli, however, varied widely across ears, were negative in some ears, and were tightly correlated. We show that the variation and correlation of MB-OO and MB-PB threshold differences are linked to threshold microstructure, which affects the relative detectability of the sidebands of the MB stimuli and affects estimates of the bandwidth of auditory filters. We also found that thresholds for MB stimuli increased with increasing duration of the silent gaps between bursts. We propose a new model and show that it accurately accounts for our results and does so considerably better than a leaky-integrator-of-intensity model and a probabilistic model proposed by others. Our model is based on the assumption that sensory events are generated by a Poisson point process with a low rate in the absence of stimulation and higher, time-varying rates in the presence of stimulation. A subject in a 3I-3AFC
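
    The Poisson assumption can be made concrete with a tiny 3I-3AFC simulation (hypothetical rates; the paper's model ties the stimulus-driven rate to the temporal envelope of the sound): events accrue at a low spontaneous rate in every interval and at a higher rate in the signal interval, and a trial is scored correct when the signal interval yields the largest count.

      import numpy as np

      def prop_correct_3afc(signal_rate, spont_rate=5.0, duration=0.3,
                            n_trials=20000, seed=0):
          """Proportion correct in a 3-interval 3AFC task when sensory events are
          Poisson: a spontaneous rate in every interval plus a stimulus-driven rate
          in the signal interval; ties at the maximum are split evenly. Rates are
          placeholders, not the paper's fitted values."""
          rng = np.random.default_rng(seed)
          counts = rng.poisson(spont_rate * duration, size=(n_trials, 3))
          counts[:, 0] += rng.poisson(signal_rate * duration, size=n_trials)
          signal = counts[:, 0][:, None]
          strict = (signal > counts[:, 1:]).all(axis=1)
          tied_at_max = (signal >= counts[:, 1:]).all(axis=1) & ~strict
          n_tied = 1 + (counts[:, 1:] == signal).sum(axis=1)
          return (strict + tied_at_max / n_tied).mean()

      for rate in (0.0, 5.0, 15.0, 40.0):                # events per second (illustrative)
          print(rate, round(prop_correct_3afc(rate), 3))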

  8. Mathematical Model of Naive T Cell Division and Survival IL-7 Thresholds.

    PubMed

    Reynolds, Joseph; Coles, Mark; Lythe, Grant; Molina-París, Carmen

    2013-01-01

    We develop a mathematical model of the peripheral naive T cell population to study the change in human naive T cell numbers from birth to adulthood, incorporating thymic output and the availability of interleukin-7 (IL-7). The model is formulated as three ordinary differential equations: two describe T cell numbers, in a resting state and progressing through the cell cycle. The third is introduced to describe changes in IL-7 availability. Thymic output is a decreasing function of time, representative of the thymic atrophy observed in aging humans. Each T cell is assumed to possess two interleukin-7 receptor (IL-7R) signaling thresholds: a survival threshold and a second, higher, proliferation threshold. If the IL-7R signaling strength is below its survival threshold, a cell may undergo apoptosis. When the signaling strength is above the survival threshold, but below the proliferation threshold, the cell survives but does not divide. Signaling strength above the proliferation threshold enables entry into cell cycle. Assuming that individual cell thresholds are log-normally distributed, we derive population-average rates for apoptosis and entry into cell cycle. We have analyzed the adiabatic change in homeostasis as thymic output decreases. With a parameter set representative of a healthy individual, the model predicts a unique equilibrium number of T cells. In a parameter range representative of persistent viral or bacterial infection, where naive T cell cycle progression is impaired, a decrease in thymic output may result in the collapse of the naive T cell repertoire.
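
    The log-normal threshold assumption translates directly into population-average rates: the fraction of cells whose survival (or proliferation) threshold lies below the current IL-7 signaling strength is a log-normal CDF evaluated at that strength. A brief sketch of that step with illustrative, not fitted, parameter values:

      from scipy.stats import lognorm

      def fraction_with_threshold_below(signal, median_threshold, sigma):
          """Fraction of cells whose log-normally distributed IL-7R signaling
          threshold lies below `signal`, i.e. the fraction that survives (survival
          threshold) or may enter the cell cycle (proliferation threshold).
          scipy's lognorm uses shape = sigma and scale = distribution median."""
          return lognorm.cdf(signal, s=sigma, scale=median_threshold)

      signal = 1.0                                        # arbitrary signaling-strength units
      surviving = fraction_with_threshold_below(signal, median_threshold=0.5, sigma=0.6)
      cycling = fraction_with_threshold_below(signal, median_threshold=2.0, sigma=0.6)
      print(f"surviving: {surviving:.2f}, eligible for cell cycle: {cycling:.2f}")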

  9. Percolation threshold determines the optimal population density for public cooperation

    NASA Astrophysics Data System (ADS)

    Wang, Zhen; Szolnoki, Attila; Perc, Matjaž

    2012-03-01

    While worldwide census data provide statistical evidence that firmly link the population density with several indicators of social welfare, the precise mechanisms underlying these observations are largely unknown. Here we study the impact of population density on the evolution of public cooperation in structured populations and find that the optimal density is uniquely related to the percolation threshold of the host graph irrespective of its topological details. We explain our observations by showing that spatial reciprocity peaks in the vicinity of the percolation threshold, when the emergence of a giant cooperative cluster is hindered neither by vacancy nor by invading defectors, thus discovering an intuitive yet universal law that links the population density with social prosperity.

  10. Setting nutrient thresholds to support an ecological assessment based on nutrient enrichment, potential primary production and undesirable disturbance.

    PubMed

    Devlin, Michelle; Painting, Suzanne; Best, Mike

    2007-01-01

    The EU Water Framework Directive recognises that ecological status is supported by the prevailing physico-chemical conditions in each water body. This paper describes an approach to providing guidance on setting thresholds for nutrients taking account of the biological response to nutrient enrichment evident in different types of water. Indices of pressure, state and impact are used to achieve a robust nutrient (nitrogen) threshold by considering each individual index relative to a defined standard, scale or threshold. These indices include winter nitrogen concentrations relative to a predetermined reference value; the potential of the waterbody to support phytoplankton growth (estimated as primary production); and detection of an undesirable disturbance (measured as dissolved oxygen). Proposed reference values are based on a combination of historical records, offshore (limited human influence) nutrient concentrations, literature values and modelled data. Statistical confidence is based on a number of attributes, including distance of confidence limits away from a reference threshold and how well the model is populated with real data. This evidence based approach ensures that nutrient thresholds are based on knowledge of real and measurable biological responses in transitional and coastal waters.

  11. Octave-Band Thresholds for Modeled Reverberant Fields

    NASA Technical Reports Server (NTRS)

    Begault, Durand R.; Wenzel, Elizabeth M.; Tran, Laura L.; Anderson, Mark R.; Trejo, Leonard J. (Technical Monitor)

    1998-01-01

    Auditory thresholds for 10 subjects were obtained for speech stimuli in reverberation. The reverberation was produced and manipulated by 3-D audio modeling based on an actual room. The independent variables were octave-band filtering (bypassed, 0.25-2.0 kHz Fc) and reverberation time (0.2-1.1 sec). An ANOVA revealed significant effects (threshold range: -19 to -35 dB re 60 dB SRL).

  12. Modeling spatially-varying landscape change points in species occurrence thresholds

    USGS Publications Warehouse

    Wagner, Tyler; Midway, Stephen R.

    2014-01-01

    Predicting species distributions at scales of regions to continents is often necessary, as large-scale phenomena influence the distributions of spatially structured populations. Land use and land cover are important large-scale drivers of species distributions, and landscapes are known to create species occurrence thresholds, where small changes in a landscape characteristic result in abrupt changes in occurrence. The value of the landscape characteristic at which this change occurs is referred to as a change point. We present a hierarchical Bayesian threshold model (HBTM) that allows for estimating spatially varying parameters, including change points. Our model also allows for modeling estimated parameters in an effort to understand large-scale drivers of variability in land use and land cover on species occurrence thresholds. We use range-wide detection/nondetection data for the eastern brook trout (Salvelinus fontinalis), a stream-dwelling salmonid, to illustrate our HBTM for estimating and modeling spatially varying threshold parameters in species occurrence. We parameterized the model for investigating thresholds in landscape predictor variables that are measured as proportions, and which are therefore restricted to values between 0 and 1. Our HBTM estimated spatially varying thresholds in brook trout occurrence for both the proportions of agricultural and urban land use. There was relatively little spatial variation in change point estimates, although there was spatial variability in the overall shape of the threshold response and associated uncertainty. In addition, regional mean stream water temperature was correlated with the change point parameters for the proportion of urban land use, with the change point value increasing with increasing mean stream water temperature. We present a framework for quantifying macrosystem variability in spatially varying threshold model parameters in relation to important large-scale drivers such as land use and land cover.

  13. Cross-matching: a modified cross-correlation underlying threshold energy model and match-based depth perception

    PubMed Central

    Doi, Takahiro; Fujita, Ichiro

    2014-01-01

    Three-dimensional visual perception requires correct matching of images projected to the left and right eyes. The matching process is faced with an ambiguity: part of one eye's image can be matched to multiple parts of the other eye's image. This stereo correspondence problem is complicated for random-dot stereograms (RDSs), because dots with an identical appearance produce numerous potential matches. Despite such complexity, human subjects can perceive a coherent depth structure. A coherent solution to the correspondence problem does not exist for anticorrelated RDSs (aRDSs), in which luminance contrast is reversed in one eye. Neurons in the visual cortex reduce disparity selectivity for aRDSs progressively along the visual processing hierarchy. A disparity-energy model followed by threshold nonlinearity (threshold energy model) can account for this reduction, providing a possible mechanism for the neural matching process. However, the essential computation underlying the threshold energy model is not clear. Here, we propose that a nonlinear modification of cross-correlation, which we term “cross-matching,” represents the essence of the threshold energy model. We placed half-wave rectification within the cross-correlation of the left-eye and right-eye images. The disparity tuning derived from cross-matching was attenuated for aRDSs. We simulated a psychometric curve as a function of graded anticorrelation (graded mixture of aRDS and normal RDS); this simulated curve reproduced the match-based psychometric function observed in human near/far discrimination. The dot density was 25% for both simulation and observation. We predicted that as the dot density increased, the performance for aRDSs should decrease below chance (i.e., reversed depth), and the level of anticorrelation that nullifies depth perception should also decrease. We suggest that cross-matching serves as a simple computation underlying the match-based disparity signals in stereoscopic depth
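
    A minimal numerical reading of "half-wave rectification within the cross-correlation" (the exact placement of the rectifier here is an assumption, offered only to make the effect concrete): rectify both eyes' contrast signals before taking the product. For a correlated random-dot pattern the true disparity still produces a clear peak; contrast-reversing one eye inverts the plain cross-correlation peak but collapses the cross-matching response at the true disparity.

      import numpy as np

      def cross_correlation(left, right, disparities):
          return np.array([np.mean(left * np.roll(right, d)) for d in disparities])

      def cross_matching(left, right, disparities):
          """Cross-correlation with half-wave rectification applied to both inputs
          before the product (one plausible formalization of 'cross-matching')."""
          lp, rp = np.maximum(left, 0.0), np.maximum(right, 0.0)
          return np.array([np.mean(lp * np.roll(rp, d)) for d in disparities])

      rng = np.random.default_rng(0)
      left = rng.choice([-1.0, 1.0], size=4000)          # random-dot contrast signal
      right = np.roll(left, -5)                          # true disparity of 5 samples
      disp = np.arange(-10, 11)

      for name, r in (("correlated", right), ("anticorrelated", -right)):
          cc = cross_correlation(left, r, disp)
          cm = cross_matching(left, r, disp)
          print(name, "corr at true disparity:", round(cc[disp == 5][0], 2),
                " match at true disparity:", round(cm[disp == 5][0], 2))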

  14. Evaluation of a threshold-based model of fatigue in gamma titanium aluminide following impact damage

    NASA Astrophysics Data System (ADS)

    Harding, Trevor Scott

    2000-10-01

    Recent interest in gamma titanium aluminide (gamma-TiAl) for use in gas turbine engine applications has centered on the low density and good elevated temperature strength retention of gamma-TiAl compared to current materials. However, the relatively low ductility and fracture toughness of gamma-TiAl leads to serious concerns regarding its ability to resist impact damage. Furthermore, the limited fatigue crack growth resistance of gamma-TiAl means that the potential for fatigue failures resulting from impact damage is real if a damage tolerant design approach is used. A threshold-based design approach may be required if fatigue crack growth from potential impact sites is to be avoided. The objective of the present research is to examine the feasibility of a threshold-based approach for the design of a gamma-TiAl low-pressure turbine blade subjected to both assembly-related impact damage and foreign object damage. Specimens of three different gamma-TiAl alloys were damaged in such a way as to simulate anticipated impact damage for a turbine blade. Step-loading fatigue tests were conducted at both room temperature and 600°C. In terms of the assembly-related impact damage, the results indicate that there is reasonably good agreement between the threshold-based predictions of the fatigue strength of damaged specimens and the measured data. However, some discrepancies do exist. In the case of very lightly damaged specimens, prediction of the resulting fatigue strength requires that a very conservative small-crack fatigue threshold be used. Consequently, the allowable design conditions are significantly reduced. For severely damaged specimens, an analytical approach found that the potential effects of residual stresses may be related to the discrepancies observed between the threshold-based model and measured fatigue strength data. In the case of foreign object damage, a good correlation was observed between impacts resulting in large cracks and a long-crack threshold-based

  15. Threshold-driven optimization for reference-based auto-planning

    NASA Astrophysics Data System (ADS)

    Long, Troy; Chen, Mingli; Jiang, Steve; Lu, Weiguo

    2018-02-01

    We study threshold-driven optimization methodology for automatically generating a treatment plan that is motivated by a reference DVH for IMRT treatment planning. We present a framework for threshold-driven optimization for reference-based auto-planning (TORA). Commonly used voxel-based quadratic penalties have two components for penalizing under- and over-dosing of voxels: a reference dose threshold and associated penalty weight. Conventional manual- and auto-planning using such a function involves iteratively updating the preference weights while keeping the thresholds constant, an unintuitive and often inconsistent method for planning toward some reference DVH. However, driving a dose distribution by threshold values instead of preference weights can achieve similar plans with less computational effort. The proposed methodology spatially assigns reference DVH information to threshold values, and iteratively improves the quality of that assignment. The methodology effectively handles both sub-optimal and infeasible DVHs. TORA was applied to a prostate case and a liver case as a proof-of-concept. Reference DVHs were generated using a conventional voxel-based objective, then altered to be either infeasible or easy-to-achieve. TORA was able to closely recreate reference DVHs in 5-15 iterations of solving a simple convex sub-problem. TORA has the potential to be effective for auto-planning based on reference DVHs. As dose prediction and knowledge-based planning becomes more prevalent in the clinical setting, incorporating such data into the treatment planning model in a clear, efficient way will be crucial for automated planning. A threshold-focused objective tuning should be explored over conventional methods of updating preference weights for DVH-guided treatment planning.

  16. A Continuous Threshold Expectile Model.

    PubMed

    Zhang, Feipeng; Li, Qunhua

    2017-12-01

    Expectile regression is a useful tool for exploring the relation between the response and the explanatory variables beyond the conditional mean. A continuous threshold expectile regression is developed for modeling data in which the effect of a covariate on the response variable is linear but varies below and above an unknown threshold in a continuous way. The estimators for the threshold and the regression coefficients are obtained using a grid search approach. The asymptotic properties for all the estimators are derived, and the estimator for the threshold is shown to achieve root-n consistency. A weighted CUSUM-type test statistic is proposed for the existence of a threshold at a given expectile, and its asymptotic properties are derived under both the null and the local alternative models. This test only requires fitting the model under the null hypothesis in the absence of a threshold, so it is computationally more efficient than likelihood-ratio type tests. Simulation studies show that the proposed estimators and test have desirable finite sample performance in both homoscedastic and heteroscedastic cases. The application of the proposed method to a Dutch growth dataset and a baseball pitcher salary dataset reveals interesting insights. The proposed method is implemented in the R package cthreshER.
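
    A minimal sketch of the estimation idea (grid search over the threshold combined with asymmetric least squares), written in Python rather than the authors' R package cthreshER; the data-generating values are invented for illustration.

      import numpy as np
      from scipy.optimize import minimize

      def expectile_loss(u, tau=0.5):
          """Asymmetric squared loss defining the tau-th expectile."""
          w = np.where(u < 0, 1.0 - tau, tau)
          return np.sum(w * u**2)

      def fit_continuous_threshold(x, y, tau=0.5, grid=None):
          """Grid search over the threshold t; the bent-line term (x - t)_+ keeps
          the fitted line continuous at the threshold."""
          grid = np.quantile(x, np.linspace(0.1, 0.9, 41)) if grid is None else grid
          best = None
          for t in grid:
              X = np.column_stack([np.ones_like(x), x, np.clip(x - t, 0.0, None)])
              res = minimize(lambda b: expectile_loss(y - X @ b, tau),
                             x0=np.zeros(3), method="Nelder-Mead")
              if best is None or res.fun < best[0]:
                  best = (res.fun, t, res.x)
          return best  # (loss, threshold estimate, coefficients)

      rng = np.random.default_rng(0)
      x = rng.uniform(0.0, 10.0, 200)
      y = 1.0 + 0.5 * x + 1.5 * np.clip(x - 6.0, 0.0, None) + rng.normal(0.0, 0.5, 200)
      print(fit_continuous_threshold(x, y)[1])   # threshold estimate, close to the true kink at 6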

  17. AN INDIVIDUAL-BASED MODEL OF COTTUS POPULATION DYNAMICS

    EPA Science Inventory

    We explored population dynamics of a southern Appalachian population of Cottus bairdi using a spatially-explicit, individual-based model. The model follows daily growth, mortality, and spawning of individuals as a function of flow and temperature. We modeled movement of juveniles...

  18. IBSEM: An Individual-Based Atlantic Salmon Population Model

    PubMed Central

    Castellani, Marco; Heino, Mikko; Gilbey, John; Araki, Hitoshi; Svåsand, Terje; Glover, Kevin A.

    2015-01-01

    Ecology and genetics can influence the fate of individuals and populations in multiple ways. However, to date, few studies consider them when modelling the evolutionary trajectory of populations faced with admixture with non-local populations. For the Atlantic salmon, a model incorporating these elements is urgently needed because many populations are challenged with gene-flow from non-local and domesticated conspecifics. We developed an Individual-Based Salmon Eco-genetic Model (IBSEM) to simulate the demographic and population genetic change of an Atlantic salmon population through its entire life-cycle. Processes such as growth, mortality, and maturation are simulated through stochastic procedures, which take into account environmental variables as well as the genotype of the individuals. IBSEM is based upon detailed empirical data from salmon biology, and parameterized to reproduce the environmental conditions and the characteristics of a wild population inhabiting a Norwegian river. Simulations demonstrated that the model consistently and reliably reproduces the characteristics of the population. Moreover, in absence of farmed escapees, the modelled populations reach an evolutionary equilibrium that is similar to our definition of a ‘wild’ genotype. We assessed the sensitivity of the model in the face of assumptions made on the fitness differences between farm and wild salmon, and evaluated the role of straying as a buffering mechanism against the intrusion of farm genes into wild populations. These results demonstrate that IBSEM is able to capture the evolutionary forces shaping the life history of wild salmon and is therefore able to model the response of populations under environmental and genetic stressors. PMID:26383256

  19. IBSEM: An Individual-Based Atlantic Salmon Population Model.

    PubMed

    Castellani, Marco; Heino, Mikko; Gilbey, John; Araki, Hitoshi; Svåsand, Terje; Glover, Kevin A

    2015-01-01

    Ecology and genetics can influence the fate of individuals and populations in multiple ways. However, to date, few studies consider them when modelling the evolutionary trajectory of populations faced with admixture with non-local populations. For the Atlantic salmon, a model incorporating these elements is urgently needed because many populations are challenged with gene-flow from non-local and domesticated conspecifics. We developed an Individual-Based Salmon Eco-genetic Model (IBSEM) to simulate the demographic and population genetic change of an Atlantic salmon population through its entire life-cycle. Processes such as growth, mortality, and maturation are simulated through stochastic procedures, which take into account environmental variables as well as the genotype of the individuals. IBSEM is based upon detailed empirical data from salmon biology, and parameterized to reproduce the environmental conditions and the characteristics of a wild population inhabiting a Norwegian river. Simulations demonstrated that the model consistently and reliably reproduces the characteristics of the population. Moreover, in absence of farmed escapees, the modelled populations reach an evolutionary equilibrium that is similar to our definition of a 'wild' genotype. We assessed the sensitivity of the model in the face of assumptions made on the fitness differences between farm and wild salmon, and evaluated the role of straying as a buffering mechanism against the intrusion of farm genes into wild populations. These results demonstrate that IBSEM is able to capture the evolutionary forces shaping the life history of wild salmon and is therefore able to model the response of populations under environmental and genetic stressors.

  20. Identifying Thresholds for Ecosystem-Based Management

    PubMed Central

    Samhouri, Jameal F.; Levin, Phillip S.; Ainsworth, Cameron H.

    2010-01-01

    Background One of the greatest obstacles to moving ecosystem-based management (EBM) from concept to practice is the lack of a systematic approach to defining ecosystem-level decision criteria, or reference points that trigger management action. Methodology/Principal Findings To assist resource managers and policymakers in developing EBM decision criteria, we introduce a quantitative, transferable method for identifying utility thresholds. A utility threshold is the level of human-induced pressure (e.g., pollution) at which small changes produce substantial improvements toward the EBM goal of protecting an ecosystem's structural (e.g., diversity) and functional (e.g., resilience) attributes. The analytical approach is based on the detection of nonlinearities in relationships between ecosystem attributes and pressures. We illustrate the method with a hypothetical case study of (1) fishing and (2) nearshore habitat pressure using an empirically-validated marine ecosystem model for British Columbia, Canada, and derive numerical threshold values in terms of the density of two empirically-tractable indicator groups, sablefish and jellyfish. We also describe how to incorporate uncertainty into the estimation of utility thresholds and highlight their value in the context of understanding EBM trade-offs. Conclusions/Significance For any policy scenario, an understanding of utility thresholds provides insight into the amount and type of management intervention required to make significant progress toward improved ecosystem structure and function. The approach outlined in this paper can be applied in the context of single or multiple human-induced pressures, to any marine, freshwater, or terrestrial ecosystem, and should facilitate more effective management. PMID:20126647

  1. [FRAX® thresholds to identify people with high or low risk of osteoporotic fracture in Spanish female population].

    PubMed

    Azagra, Rafael; Roca, Genís; Martín-Sánchez, Juan Carlos; Casado, Enrique; Encabo, Gloria; Zwart, Marta; Aguyé, Amada; Díez-Pérez, Adolf

    2015-01-06

    The aim was to detect FRAX(®) threshold levels that identify groups at high or low risk of osteoporotic fracture in the Spanish female population using a cost-effective assessment. This is a cohort study. Eight hundred and sixteen women aged 40-90 years were selected from the FRIDEX cohort with densitometry and fracture risk factors at baseline; they received no treatment for osteoporosis during the 10-year follow-up period and were stratified into 3 groups/levels of fracture risk (low <10%, intermediate 10-20% and high >20%) according to the observed fracture incidence. The baseline FRAX(®) thresholds for major osteoporotic fracture were: low risk <5; intermediate ≥5 to <7.5; and high ≥7.5. The incidence of fracture at these values was: low risk (3.6%; 95% CI 2.2-5.9), intermediate risk (13.7%; 95% CI 7.1-24.2) and high risk (21.4%; 95% CI 12.9-33.2). The most cost-effective option was to refer women with FRAX(®) ≥5 (intermediate and high risk) for dual-energy X-ray absorptiometry (DXA scan) and to reclassify them as high or low risk by FRAX(®) with DXA. These thresholds select 17.5% of women for DXA scanning and 10% for treatment. Compared with the strategy of opportunistic case finding based on isolated risk factors, these FRAX(®) thresholds would improve the predictive parameters and reduce DXA scans by 82.5%, osteoporosis prescriptions by 35.4% and costs by 28.7% while detecting the same number of women who suffer fractures. The use of the FRAX(®) thresholds identified for high/low risk of osteoporotic fracture in this calibration (FRIDEX model) improves the predictive parameters in Spanish women and is more cost-effective than the traditional model based on a DXA T-score ≤ -2.5. Copyright © 2013 Elsevier España, S.L.U. All rights reserved.

  2. Intra-population level variation in thresholds for physical dormancy-breaking temperature

    PubMed Central

    Liyanage, Ganesha S.; Ooi, Mark K. J.

    2015-01-01

    Background and Aims Intra-population variation in seed dormancy is an advantage for population persistence in unpredictable environments. The important role played by physically dormant species in these habitats makes understanding the level of variation in their dormancy a key ecological question. Heat produced in the soil is the major dormancy-breaking stimulus and, in fire prone ecosystems, soil temperatures generated by fire may vary spatially and over time. While many studies have investigated variation in initial dormancy, a measure that is of little value in fire-prone ecosystems, where initial dormancy levels are uniformly high, intra-population variation in dormancy-breaking temperature thresholds has never been quantified. This study predicted that species would display variation in dormancy-breaking temperature thresholds within populations, and investigated whether this variation occurred between individual plants from the same maternal environment. Methods The intra-population variation in dormancy-breaking thresholds of five common physically dormant shrub species (family Fabaceae) from fire-prone vegetation in south-eastern Australia was assessed using heat treatments and germination trials. Replicate batches of seeds from each of four maternal plants of Dillwynia floribunda, Viminaria juncea, Bossiaea heterophylla, Aotus ericoides and Acacia linifolia were treated at 40, 60, 80, 100 and 120 °C. Key Results Dormancy-breaking response to heat treatments varied significantly among individual plants for all species, with some individuals able to germinate after heating at low temperatures and others restricting germination to temperatures that only occur as a result of high-severity fires. Germination rate (T50) varied among individuals of three species. Conclusions Variation detected among individuals that were in close proximity to each other indicates that strong differences in dormancy-breaking temperature thresholds occur throughout the broader

  3. Development of a population-based threshold model of conidial germination for analysing the effects of physiological manipulation on the stress tolerance and infectivity of insect pathogenic fungi.

    PubMed

    Andersen, M; Magan, N; Mead, A; Chandler, D

    2006-09-01

    Entomopathogenic fungi are being used as biocontrol agents of insect pests, but their efficacy can be poor in environments where water availability is reduced. In this study, the potential to improve biocontrol by physiologically manipulating fungal inoculum was investigated. Cultures of Beauveria bassiana, Lecanicillium muscarium, Lecanicillium longisporum, Metarhizium anisopliae and Paecilomyces fumosoroseus were manipulated by growing them under conditions of water stress, which produced conidia with increased concentrations of erythritol. The time-course of germination of conidia at different water activities (water activity, aw) was described using a generalized linear model, and in most cases reducing the water activity of the germination medium delayed the onset of germination without affecting the distribution of germination times. The germination of M. anisopliae, L. muscarium, L. longisporum and P. fumosoroseus was accelerated over a range of aw levels as a result of physiological manipulation. However, the relationship between the effect of physiological manipulation on germination and the osmolyte content of conidia varied according to fungal species. There was a linear relationship between germination rate, expressed as the reciprocal of germination time, and aw of the germination medium, but there was no significant effect of fungal species or physiological manipulation on the aw threshold for germination. In bioassays with M. anisopliae, physiologically manipulated conidia germinated more rapidly on the surface of an insect host, the melon cotton aphid Aphis gossypii, and fungal virulence was increased even when relative humidity was reduced after an initial high period. It is concluded that physiological manipulation may lead to improvements in biocontrol in the field, but choice of fungal species/isolate will be critical. In addition, the population-based threshold model used in this study, which considered germination in terms of physiological
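
    The linear relationship between germination rate and water activity reported here suggests a simple way to estimate an aw threshold: fit the rate (reciprocal of germination time) against aw and extrapolate to zero rate. The sketch below does exactly that with invented germination times; it illustrates the idea, and is not the authors' generalized linear model.

      import numpy as np

      # Hypothetical germination times (h) for a given conidial fraction at several water activities.
      aw = np.array([0.90, 0.93, 0.96, 0.99])
      t_germ = np.array([60.0, 30.0, 20.0, 15.0])

      rate = 1.0 / t_germ                          # germination rate = reciprocal of germination time
      slope, intercept = np.polyfit(aw, rate, 1)   # rate assumed linear in aw
      aw_threshold = -intercept / slope            # water activity at which the rate extrapolates to zero
      print(f"estimated aw threshold for germination: {aw_threshold:.3f}")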

  4. An Active Contour Model Based on Adaptive Threshold for Extraction of Cerebral Vascular Structures.

    PubMed

    Wang, Jiaxin; Zhao, Shifeng; Liu, Zifeng; Tian, Yun; Duan, Fuqing; Pan, Yutong

    2016-01-01

    Cerebral vessel segmentation is essential and helpful for clinical diagnosis and related research. However, automatic segmentation of brain vessels remains challenging because of the variable vessel shape and the high complexity of vessel geometry. This study proposes a new active contour model (ACM) implemented by the level-set method for segmenting vessels from TOF-MRA data. The energy function of the new model, combining both region intensity and boundary information, is composed of two region terms, one boundary term and one penalty term. The global threshold representing the lower gray boundary of the target object, obtained by maximum intensity projection (MIP), is defined in the first region term, and it is used to guide the segmentation of the thick vessels. In the second region term, a dynamic intensity threshold is employed to extract the tiny vessels. The boundary term is used to drive the contours to evolve towards boundaries with high gradients. The penalty term is used to avoid reinitialization of the level-set function. Experimental results on 10 clinical brain data sets demonstrate that our method is not only able to achieve a better Dice Similarity Coefficient than the global-threshold-based method and the localized hybrid level-set method, but is also able to extract whole cerebral vessel trees, including the thin vessels.

  5. Population sensitivities of animals to chronic ionizing radiation-model predictions from mice to elephant.

    PubMed

    Sazykina, Tatiana G

    2018-02-01

    Model predictions of population response to chronic ionizing radiation (endpoint 'morbidity') were made for 11 species of warm-blooded animals, differing in body mass and lifespan - from mice to elephant. Predictions were also made for 3 bird species (duck, pigeon, and house sparrow). Calculations were based on analytical solutions of the mathematical model, simulating a population response to low-LET ionizing radiation in an ecosystem with a limiting resource (Sazykina, Kryshev, 2016). Model parameters for different species were taken from biological and radioecological databases; allometric relationships were employed for estimating some parameter values. As a threshold of decreased health status in exposed populations ('health threshold'), a 10% reduction in the self-repairing capacity of organisms was suggested, associated with a decline in the ability to sustain environmental stresses. Results of the modeling demonstrate a general increase of population vulnerability to ionizing radiation in animal species of larger size and longevity. Populations of small widespread species (mice, house sparrow; body mass 20-50 g), which are characterized by intensive metabolism and short lifespan, have calculated 'health thresholds' at dose rates of about 6.5-7.5 mGy day^-1. Widespread animals with body mass 200-500 g (rat, common pigeon) demonstrate 'health threshold' values at 4-5 mGy day^-1. For populations of animals with body mass 2-5 kg (rabbit, fox, raccoon), the indicators of a 10% health decrease are in the range 2-3.4 mGy day^-1. For animals with body mass 40-100 kg (wolf, sheep, wild boar), thresholds are within 0.5-0.8 mGy day^-1; for herbivorous animals with body mass 200-300 kg (deer, horse), 0.5-0.6 mGy day^-1. The lowest health threshold was estimated for elephant (body mass around 5000 kg): 0.1 mGy day^-1. According to the model results, the differences in population sensitivities of warm-blooded animal species to ionizing radiation are generally

  6. Development of a paediatric population-based model of the pharmacokinetics of rivaroxaban.

    PubMed

    Willmann, Stefan; Becker, Corina; Burghaus, Rolf; Coboeken, Katrin; Edginton, Andrea; Lippert, Jörg; Siegmund, Hans-Ulrich; Thelen, Kirstin; Mück, Wolfgang

    2014-01-01

    , pharmacokinetic values in infants and preschool children (body weight <40 kg) were lower than the 90 % confidence interval threshold of the adult reference model and, therefore, indicated that doses in these groups may need to be increased to achieve the same plasma levels as in adults. For children with body weight between 40 and 70 kg, simulated plasma pharmacokinetic parameters (Cmax, C24h and AUC) overlapped with the values obtained in the corresponding adult reference simulation, indicating that body weight-related exposure was similar between these children and adults. In adolescents of >70 kg body weight, the simulated 90 % prediction interval values of AUC and C24h were much higher than the 90 % confidence interval of the adult reference population, owing to the weight-based simulation approach, but for these patients rivaroxaban would be administered at adult fixed doses of 10 and 20 mg. The paediatric PBPK model developed here allowed an exploratory analysis of the pharmacokinetics of rivaroxaban in children to inform the dosing regimen for a clinical study in paediatric patients.

  7. An avoidance behavior model for migrating whale populations

    NASA Astrophysics Data System (ADS)

    Buck, John R.; Tyack, Peter L.

    2003-04-01

    A new model is presented for the avoidance behavior of migrating marine mammals in the presence of a noise stimulus. This model assumes that each whale will adjust its movement pattern near a sound source to maintain its exposure below its own individually specific maximum received sound-pressure level, called its avoidance threshold. The probability distribution function (PDF) of this avoidance threshold across individuals characterizes the migrating population. The avoidance threshold PDF may be estimated by comparing the distribution of migrating whales during playback and control conditions at their closest point of approach to the sound source. The proposed model was applied to the January 1998 experiment, which placed a single acoustic source from the U.S. Navy SURTASS-LFA system in the migration corridor of grey whales off the California coast. This analysis found that the median avoidance threshold for this migrating grey whale population was 135 dB, with 90% confidence that the median threshold was within +/-3 dB of this value. This value is less than the 141 dB value for 50% avoidance obtained when Malme et al.'s 1984 ``Probability of Avoidance'' model was applied to the same data. [Work supported by ONR.]

  8. The threshold of a stochastic avian-human influenza epidemic model with psychological effect

    NASA Astrophysics Data System (ADS)

    Zhang, Fengrong; Zhang, Xinhong

    2018-02-01

    In this paper, a stochastic avian-human influenza epidemic model with psychological effect in human population and saturation effect within avian population is investigated. This model describes the transmission of avian influenza among avian population and human population in random environments. For stochastic avian-only system, persistence in the mean and extinction of the infected avian population are studied. For the avian-human influenza epidemic system, sufficient conditions for the existence of an ergodic stationary distribution are obtained. Furthermore, a threshold of this stochastic model which determines the outcome of the disease is obtained. Finally, numerical simulations are given to support the theoretical results.

  9. A quantitative model of honey bee colony population dynamics.

    PubMed

    Khoury, David S; Myerscough, Mary R; Barron, Andrew B

    2011-04-18

    Since 2006 the rate of honey bee colony failure has increased significantly. As an aid to testing hypotheses for the causes of colony failure we have developed a compartment model of honey bee colony population dynamics to explore the impact of different death rates of forager bees on colony growth and development. The model predicts a critical threshold forager death rate beneath which colonies regulate a stable population size. If death rates are sustained higher than this threshold rapid population decline is predicted and colony failure is inevitable. The model also predicts that high forager death rates draw hive bees into the foraging population at much younger ages than normal, which acts to accelerate colony failure. The model suggests that colony failure can be understood in terms of observed principles of honey bee population dynamics, and provides a theoretical framework for experimental investigation of the problem.
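
    A minimal sketch of the kind of two-compartment model described here, assuming illustrative functional forms and parameter values (not necessarily those of the published model): hive bees are recruited to foraging, recruitment is inhibited by the existing forager force, and foragers die at rate m. Comparing a low and a high forager death rate illustrates the threshold behaviour.

      import numpy as np
      from scipy.integrate import solve_ivp

      def colony(t, y, m):
          """Hive bees (H) and foragers (F); m is the forager death rate (per day)."""
          H, F = y
          N = max(H + F, 1e-9)
          L, w, alpha, sigma = 2000.0, 27000.0, 0.25, 0.75   # illustrative values
          eclosion = L * N / (w + N)                  # brood successfully raised to adulthood
          recruitment = H * (alpha - sigma * F / N)   # recruitment slowed by social inhibition
          return [eclosion - recruitment, recruitment - m * F]

      for m in (0.24, 0.60):                          # low vs. high forager death rate
          sol = solve_ivp(colony, (0.0, 500.0), [16000.0, 8000.0], args=(m,), rtol=1e-8)
          print(f"m = {m}: colony size after 500 days ~ {sol.y[:, -1].sum():.0f}")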

  10. A Gompertz population model with Allee effect and fuzzy initial values

    NASA Astrophysics Data System (ADS)

    Amarti, Zenia; Nurkholipah, Nenden Siti; Anggriani, Nursanti; Supriatna, Asep K.

    2018-03-01

    Growth and population dynamics models are important tools for preparing good management for society and for predicting the future of a population or species. This is commonly done by developing a mathematical model that describes population growth. Such models are usually formulated as differential equations or systems of differential equations, depending on the complexity of the underlying properties of the population. One example of biological complexity is the Allee effect, a phenomenon in which mean individual fitness is strongly reduced at very small population sizes. In this paper the population growth model used is the Gompertz equation, modified to include the Allee effect on the population. We explore the properties of the solution to the model numerically using the Runge-Kutta method. Further exploration is done via a fuzzy theoretical approach to accommodate uncertainty in the initial values of the model. An initial value greater than the Allee threshold causes the solution to rise asymptotically towards the carrying capacity, whereas an initial value smaller than the Allee threshold causes the solution to decrease asymptotically towards zero, meaning that the population eventually becomes extinct. Numerical solutions show that an uncertain initial value near the critical point A (the Allee threshold), in contrast to a crisp initial value, can lead to extinction of the population with a certain possibilistic degree, depending on the predetermined membership function of the initial value.
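
    One illustrative way to combine Gompertz growth with an Allee threshold A (an assumed functional form, not necessarily the authors' equation) is dN/dt = r N ln(K/N) (N/A - 1), which is negative below A and positive between A and K. The sketch below integrates it with a classical fourth-order Runge-Kutta scheme and shows the two asymptotic outcomes mentioned in the abstract.

      import numpy as np

      def gompertz_allee(N, r=0.5, K=100.0, A=20.0):
          """Gompertz growth multiplied by an Allee factor: decline below A, growth between A and K."""
          if N <= 0.0:
              return 0.0
          return r * N * np.log(K / N) * (N / A - 1.0)

      def rk4(f, N0, dt=0.01, steps=5000):
          """Classical fourth-order Runge-Kutta integration of dN/dt = f(N)."""
          N = N0
          for _ in range(steps):
              k1 = f(N)
              k2 = f(N + 0.5 * dt * k1)
              k3 = f(N + 0.5 * dt * k2)
              k4 = f(N + dt * k3)
              N += dt * (k1 + 2.0 * k2 + 2.0 * k3 + k4) / 6.0
          return N

      print(rk4(gompertz_allee, 25.0))   # start above the threshold: approaches K
      print(rk4(gompertz_allee, 15.0))   # start below the threshold: decays toward extinction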

  11. Modelling the regulatory system for diabetes mellitus with a threshold window

    NASA Astrophysics Data System (ADS)

    Yang, Jin; Tang, Sanyi; Cheke, Robert A.

    2015-05-01

    Piecewise (or non-smooth) glucose-insulin models with threshold windows for type 1 and type 2 diabetes mellitus are proposed and analyzed with a view to improving understanding of the glucose-insulin regulatory system. For glucose-insulin models with a single threshold, the existence and stability of regular, virtual, pseudo-equilibria and tangent points are addressed. Then the relations between regular equilibria and a pseudo-equilibrium are studied. Furthermore, the sufficient and necessary conditions for the global stability of regular equilibria and the pseudo-equilibrium are provided by using qualitative analysis techniques of non-smooth Filippov dynamic systems. Sliding bifurcations related to boundary node bifurcations were investigated with theoretical and numerical techniques, and insulin clinical therapies are discussed. For glucose-insulin models with a threshold window, the effects of glucose thresholds or the widths of threshold windows on the durations of insulin therapy and glucose infusion were addressed. The duration of the effects of an insulin injection is sensitive to the variation of thresholds. Our results indicate that blood glucose level can be maintained within a normal range using piecewise glucose-insulin models with a single threshold or a threshold window. Moreover, our findings suggest that it is critical to individualise insulin therapy for each patient separately, based on initial blood glucose levels.

  12. Uncertainties in the Modelled CO2 Threshold for Antarctic Glaciation

    NASA Technical Reports Server (NTRS)

    Gasson, E.; Lunt, D. J.; DeConto, R.; Goldner, A.; Heinemann, M.; Huber, M.; LeGrande, A. N.; Pollard, D.; Sagoo, N.; Siddall, M.

    2014-01-01

    A frequently cited atmospheric CO2 threshold for the onset of Antarctic glaciation of approximately 780 parts per million by volume is based on the study of DeConto and Pollard (2003) using an ice sheet model and the GENESIS climate model. Proxy records suggest that atmospheric CO2 concentrations passed through this threshold across the Eocene-Oligocene transition approximately 34 million years ago. However, atmospheric CO2 concentrations may have been close to this threshold earlier than this transition, which is used by some to suggest the possibility of Antarctic ice sheets during the Eocene. Here we investigate the climate model dependency of the threshold for Antarctic glaciation by performing offline ice sheet model simulations using the climate from 7 different climate models with Eocene boundary conditions (HadCM3L, CCSM3, CESM1.0, GENESIS, FAMOUS, ECHAM5 and GISS_ER). These climate simulations are sourced from a number of independent studies, and as such the boundary conditions, which are poorly constrained during the Eocene, are not identical between simulations. The results of this study suggest that the atmospheric CO2 threshold for Antarctic glaciation is highly dependent on the climate model used and the climate model configuration. A large discrepancy between the climate model and ice sheet model grids for some simulations leads to a strong sensitivity to the lapse rate parameter.

  13. Application of Johnson et al.'s speciation threshold model to apparent colonization times of island biotas.

    PubMed

    Ricklefs, Robert E; Bermingham, Eldredge

    2004-08-01

    Understanding patterns of diversity can be furthered by analysis of the dynamics of colonization, speciation, and extinction on islands using historical information provided by molecular phylogeography. The land birds of the Lesser Antilles are one of the most thoroughly described regional faunas in this context. In an analysis of colonization times, Ricklefs and Bermingham (2001) found that the cumulative distribution of lineages with respect to increasing time since colonization exhibits a striking change in slope at a genetic distance of about 2% mitochondrial DNA sequence divergence (about one million years). They further showed how this heterogeneity could be explained by either an abrupt increase in colonization rates or a mass extinction event. Cherry et al. (2002), referring to a model developed by Johnson et al. (2000), argued instead that the pattern resulted from a speciation threshold for reproductive isolation of island populations from their continental source populations. Prior to this threshold, genetic divergence is slowed by migration from the source, and species of varying age accumulate at a low genetic distance. After the threshold is reached, source and island populations diverge more rapidly, creating heterogeneity in the distribution of apparent ages of island taxa. We simulated Johnson et al.'s speciation-threshold model, incorporating genetic divergence at rate k and fixation at rate M of genes that have migrated between the source and the island population. Fixation resets the divergence clock to zero. The speciation-threshold model fits the distribution of divergence times of Lesser Antillean birds well, with biologically plausible parameter estimates. Application of the model to the Hawaiian avifauna, which does not exhibit marked heterogeneity of genetic divergence, and the West Indian herpetofauna, which does, required unreasonably high migration-fixation rates, several orders of magnitude greater than the colonization rate. However

  14. Entrainment and Control of Bacterial Populations: An in Silico Study over a Spatially Extended Agent Based Model.

    PubMed

    Mina, Petros; Tsaneva-Atanasova, Krasimira; Bernardo, Mario di

    2016-07-15

    We extend a spatially explicit agent-based model (ABM) developed previously to investigate entrainment and control of the emergent behavior of a population of synchronized oscillating cells in a microfluidic chamber. Unlike most of the work in models of control of cellular systems, which focuses on temporal changes, we model individual cells with spatial dependencies which may contribute to certain behavioral responses. We use the model to investigate the response of both open-loop and closed-loop strategies, such as proportional control (P-control), proportional-integral control (PI-control) and proportional-integral-derivative control (PID-control), to heterogeneities and growth in the cell population, variations of the control parameters and spatial effects such as diffusion in the spatially explicit setting of a microfluidic chamber setup. We show that, as expected from the theory of phase locking in dynamical systems, open-loop control can only entrain the cell population in a subset of forcing periods, with a wide variety of dynamical behaviors obtained outside these regions of entrainment. Closed-loop control is shown instead to guarantee entrainment in a much wider region of control parameter space, although presenting limitations when the population size increases over a certain threshold. In silico tracking experiments are also performed to validate the ability of classical control approaches to achieve other reference behaviors such as a desired constant output or a linearly varying one. All simulations are carried out in BSim, an advanced agent-based simulator of microbial populations, which is here extended ad hoc to include the effects of control strategies acting on the population.

  15. Analysis of Critical Mass in Threshold Model of Diffusion

    NASA Astrophysics Data System (ADS)

    Kim, Jeehong; Hur, Wonchang; Kang, Suk-Ho

    2012-04-01

    Why does diffusion sometimes show cascade phenomena but at other times is impeded? In addressing this question, we considered a threshold model of diffusion, focusing on the formation of a critical mass, which enables diffusion to be self-sustaining. Performing an agent-based simulation, we found that the diffusion model produces only two outcomes: Almost perfect adoption or relatively few adoptions. In order to explain the difference, we considered the various properties of network structures and found that the manner in which thresholds are arrayed over a network is the most critical factor determining the size of a cascade. On the basis of the results, we derived a threshold arrangement method effective for generation of a critical mass and calculated the size required for perfect adoption.
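
    A minimal agent-based sketch of the threshold diffusion rule studied here, using networkx and invented parameters: a node adopts once the fraction of adopting neighbours reaches its threshold, and the final cascade size is compared for different numbers of initial adopters.

      import random
      import networkx as nx

      def threshold_cascade(G, thresholds, seeds):
          """Run deterministic threshold diffusion to its fixed point; return the number of adopters."""
          adopted = set(seeds)
          changed = True
          while changed:
              changed = False
              for v in G:
                  if v in adopted:
                      continue
                  nbrs = list(G[v])
                  if nbrs and sum(n in adopted for n in nbrs) / len(nbrs) >= thresholds[v]:
                      adopted.add(v)
                      changed = True
          return len(adopted)

      random.seed(1)
      G = nx.erdos_renyi_graph(1000, 0.01, seed=1)
      thresholds = {v: random.uniform(0.05, 0.4) for v in G}   # how thresholds are arrayed over the network matters
      for n_seeds in (5, 20, 60):
          seeds = random.sample(list(G), n_seeds)
          print(n_seeds, threshold_cascade(G, thresholds, seeds))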

  16. History-Based Response Threshold Model for Division of Labor in Multi-Agent Systems

    PubMed Central

    Lee, Wonki; Kim, DaeEun

    2017-01-01

    Dynamic task allocation is a necessity in a group of robots. Each member should decide its own task such that it is most commensurate with its current state in the overall system. In this work, the response threshold model is applied to a dynamic foraging task. Each robot employs a task switching function based on the local task demand obtained from the surrounding environment, and no communication occurs between the robots. Each individual member has a constant-sized task demand history that reflects the global demand. In addition, it has response threshold values for all of the tasks and manages the task switching process depending on the stimuli of the task demands. The robot then determines the task to be executed to regulate the overall division of labor. This task selection induces a specialized tendency for performing a specific task and regulates the division of labor. In particular, maintaining a history of the task demands is very effective for the dynamic foraging task. Various experiments are performed using a simulation with multiple robots, and the results show that the proposed algorithm is more effective as compared to the conventional model. PMID:28555031
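
    A minimal sketch of a history-based response-threshold rule in the spirit described here (the sigmoid response function and the history length are assumptions, not the authors' exact formulation): each robot keeps a short history of local task demands and engages the task whose smoothed stimulus most exceeds its own threshold.

      from collections import deque

      def response_probability(stimulus, threshold, n=2):
          """Classic response-threshold rule: engagement probability rises steeply once
          the stimulus exceeds the agent's threshold."""
          return stimulus**n / (stimulus**n + threshold**n)

      class Robot:
          def __init__(self, thresholds, history_len=10):
              self.thresholds = thresholds                               # one threshold per task
              self.history = {task: deque(maxlen=history_len) for task in thresholds}

          def observe(self, task, local_demand):
              self.history[task].append(local_demand)                   # bounded task-demand history

          def choose_task(self):
              probs = {}
              for task, th in self.thresholds.items():
                  h = self.history[task]
                  avg_demand = sum(h) / len(h) if h else 0.0             # history smooths the local signal
                  probs[task] = response_probability(avg_demand, th)
              return max(probs, key=probs.get)

      r = Robot({"forage": 0.5, "rest": 0.3})
      for demand in (0.2, 0.4, 0.8):
          r.observe("forage", demand)
          r.observe("rest", 0.1)
      print(r.choose_task())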

  17. Cascades in the Threshold Model for varying system sizes

    NASA Astrophysics Data System (ADS)

    Karampourniotis, Panagiotis; Sreenivasan, Sameet; Szymanski, Boleslaw; Korniss, Gyorgy

    2015-03-01

    A classical model in opinion dynamics is the Threshold Model (TM), which aims to model the spread of a new opinion based on the social drive of peer pressure. Under the TM a node adopts a new opinion only when the fraction of its first neighbors possessing that opinion exceeds a pre-assigned threshold. Cascades in the TM depend on multiple parameters, such as the number and selection strategy of the initially active nodes (initiators), and the threshold distribution of the nodes. For a uniform threshold in the network there is a critical fraction of initiators for which a transition from small to large cascades occurs, which for ER graphs is largely independent of the system size. Here, we study the spread contribution of each newly assigned initiator under the TM for different initiator selection strategies for synthetic graphs of various sizes. We observe that for ER graphs when large cascades occur, the spread contribution of the added initiator at the transition point is independent of the system size, while the contribution of the rest of the initiators converges to zero at infinite system size. This property is used for the identification of large transitions for various threshold distributions. Supported in part by ARL NS-CTA, ARO, ONR, and DARPA.

  18. Analytical connection between thresholds and immunization strategies of SIS model in random networks

    NASA Astrophysics Data System (ADS)

    Zhou, Ming-Yang; Xiong, Wen-Man; Liao, Hao; Wang, Tong; Wei, Zong-Wen; Fu, Zhong-Qian

    2018-05-01

    Devising effective strategies for hindering the propagation of viruses and protecting the population against epidemics is critical for public security and health. Despite a number of studies based on the susceptible-infected-susceptible (SIS) model devoted to this topic, we still lack a general framework to compare different immunization strategies in completely random networks. Here, we address this problem by suggesting a novel method based on heterogeneous mean-field theory for the SIS model. Our method builds the relationship between the thresholds and different immunization strategies in completely random networks. Besides, we provide an analytical argument that the targeted large-degree strategy achieves the best performance in random networks with arbitrary degree distribution. Moreover, the experimental results demonstrate the effectiveness of the proposed method in both artificial and real-world networks.
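
    Under heterogeneous mean-field theory the SIS epidemic threshold is lambda_c = <k>/<k^2>, so immunization strategies can be compared by how they change the degree moments. The sketch below is a crude numeric illustration with an invented degree sequence; it ignores the edge ends lost to immunized nodes, which the full calculation would account for.

      import numpy as np

      def hmf_sis_threshold(degrees):
          """Heterogeneous mean-field SIS threshold: lambda_c = <k> / <k^2>."""
          k = np.asarray(degrees, dtype=float)
          return k.mean() / np.mean(k**2)

      rng = np.random.default_rng(0)
      degrees = np.clip(rng.zipf(2.5, size=20000), 1, 500)   # toy heavy-tailed degree sequence
      g = 0.05                                               # immunized fraction

      lam0 = hmf_sis_threshold(degrees)
      lam_random = lam0 / (1.0 - g)                          # random immunization rescales the spreading rate
      kept = np.sort(degrees)[: int((1.0 - g) * degrees.size)]
      lam_targeted = hmf_sis_threshold(kept)                 # targeted: drop the largest-degree nodes

      print("no immunization :", lam0)
      print("random          :", lam_random)
      print("targeted (hubs) :", lam_targeted)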

  19. Quantifying population-level risks using an individual-based model: sea otters, Harlequin Ducks, and the Exxon Valdez oil spill.

    PubMed

    Harwell, Mark A; Gentile, John H; Parker, Keith R

    2012-07-01

    Ecological risk assessments need to advance beyond evaluating risks to individuals that are largely based on toxicity studies conducted on a few species under laboratory conditions, to assessing population-level risks to the environment, including considerations of variability and uncertainty. Two individual-based models (IBMs), recently developed to assess current risks to sea otters and seaducks in Prince William Sound more than 2 decades after the Exxon Valdez oil spill (EVOS), are used to explore population-level risks. In each case, the models had previously shown that there were essentially no remaining risks to individuals from polycyclic aromatic hydrocarbons (PAHs) derived from the EVOS. New sensitivity analyses are reported here in which hypothetical environmental exposures to PAHs were heuristically increased until assimilated doses reached toxicity reference values (TRVs) derived at the no-observed-adverse-effects and lowest-observed-adverse-effects levels (NOAEL and LOAEL, respectively). For the sea otters, this was accomplished by artificially increasing the number of sea otter pits that would intersect remaining patches of subsurface oil residues by orders of magnitude over actual estimated rates. Similarly, in the seaduck assessment, the PAH concentrations in the constituents of diet, sediments, and seawater were increased in proportion to their relative contributions to the assimilated doses by orders of magnitude over measured environmental concentrations, to reach the NOAEL and LOAEL thresholds. The stochastic IBMs simulated millions of individuals. From these outputs, frequency distributions were derived of assimilated doses for populations of 500,000 sea otters or seaducks in each of 7 or 8 classes, respectively. Doses to several selected quantiles were analyzed, ranging from the 1-in-1000th most-exposed individuals (99.9% quantile) to the median-exposed individuals (50% quantile). The resulting families of quantile curves provide the basis for

  20. Experimental evidence of a pathogen invasion threshold

    PubMed Central

    Krkošek, Martin

    2018-01-01

    Host density thresholds to pathogen invasion separate regions of parameter space corresponding to endemic and disease-free states. The host density threshold is a central concept in theoretical epidemiology and a common target of human and wildlife disease control programmes, but there is mixed evidence supporting the existence of thresholds, especially in wildlife populations or for pathogens with complex transmission modes (e.g. environmental transmission). Here, we demonstrate the existence of a host density threshold for an environmentally transmitted pathogen by combining an epidemiological model with a microcosm experiment. Experimental epidemics consisted of replicate populations of naive crustacean zooplankton (Daphnia dentifera) hosts across a range of host densities (20–640 hosts l^-1) that were exposed to an environmentally transmitted fungal pathogen (Metschnikowia bicuspidata). Epidemiological model simulations, parametrized independently of the experiment, qualitatively predicted experimental pathogen invasion thresholds. Variability in parameter estimates did not strongly influence outcomes, though systematic changes to key parameters have the potential to shift pathogen invasion thresholds. In summary, we provide one of the first clear experimental demonstrations of pathogen invasion thresholds in a replicated experimental system, and provide evidence that such thresholds may be predictable using independently constructed epidemiological models. PMID:29410876

  1. A model-based 'varimax' sampling strategy for a heterogeneous population.

    PubMed

    Akram, Nuzhat A; Farooqi, Shakeel R

    2014-01-01

    Sampling strategies are planned to enhance the homogeneity of a sample and hence to minimize confounding errors. A sampling strategy was developed to minimize the variation within population groups. Karachi, the largest urban agglomeration in Pakistan, was used as a model population. Blood groups ABO and Rh factor were determined for 3000 unrelated individuals selected through simple random sampling. Among them, five population groups based on paternal ethnicity, namely Balochi, Muhajir, Pathan, Punjabi and Sindhi, were identified. An index was designed to measure the proportion of admixture at the parental and grandparental levels. Population models based on the index score were proposed. For validation, 175 individuals selected through stratified random sampling were genotyped for the three STR loci CSF1PO, TPOX and TH01. ANOVA showed significant differences across the population groups for blood groups and STR loci distribution. Gene diversity was higher across the sub-population model than in the agglomerated population. At the parental level, gene diversities were significantly higher for no-admixture models than for admixture models; at the grandparental level, the difference was not significant. A sub-population model with no admixture at the parental level was justified for sampling the heterogeneous population of Karachi.

  2. Position Estimation for Switched Reluctance Motor Based on the Single Threshold Angle

    NASA Astrophysics Data System (ADS)

    Zhang, Lei; Li, Pang; Yu, Yue

    2017-05-01

    This paper presents a position estimation model for a switched reluctance motor based on a single threshold angle. In view of the relationship between inductance and rotor position, the position is estimated by comparing the real-time dynamic flux linkage with the flux linkage at the threshold angle position (7.5° threshold angle, 12/8 SRM). The sensorless model is built in Matlab/Simulink; simulations are carried out under both steady-state and transient conditions, verifying the validity and feasibility of the method.
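
    A minimal sketch of the flux-linkage comparison idea: integrate the winding voltage equation to obtain the real-time flux linkage and detect when it reaches the stored flux-linkage-versus-current curve taken at the threshold angle. The lookup curve, resistance and waveforms below are all invented for illustration.

      import numpy as np

      def estimate_flux(v, i, R, dt):
          """Flux linkage from the voltage equation: psi(t) = integral of (v - R*i) dt."""
          return np.cumsum((v - R * i) * dt)

      def crossing_index(psi, i, psi_at_threshold_angle):
          """First sample at which the estimated flux linkage reaches the stored
          flux-linkage-vs-current curve for the threshold rotor angle."""
          psi_ref = psi_at_threshold_angle(i)
          hits = np.nonzero(psi >= psi_ref)[0]
          return int(hits[0]) if hits.size else -1

      psi_curve = lambda i: 0.02 * i / (1.0 + 0.05 * i)      # hypothetical saturation curve at the threshold angle
      t = np.linspace(0.0, 0.01, 1000)
      dt = t[1] - t[0]
      v = np.full_like(t, 150.0)                             # assumed applied phase voltage
      i = 8.0 * (1.0 - np.exp(-t / 0.002))                   # assumed phase current waveform
      print(crossing_index(estimate_flux(v, i, R=0.6, dt=dt), i, psi_curve))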

  3. Threshold flux-controlled memristor model and its equivalent circuit implementation

    NASA Astrophysics Data System (ADS)

    Wu, Hua-Gan; Bao, Bo-Cheng; Chen, Mo

    2014-11-01

    Modeling a memristor is an effective way to explore its properties, because memristor devices are still not readily available commercially to most researchers. In this paper, a physical memristive device is assumed to exist whose ionic drift direction is perpendicular to the direction of the applied voltage; building on this assumption, and in analogy with the HP charge-controlled memristor model, a novel threshold flux-controlled memristor model with a window function is proposed. The fingerprints of the proposed model are analyzed. In particular, a practical equivalent circuit of the proposed model is realized, from which the corresponding experimental fingerprints are captured. The equivalent circuit of the threshold memristor model is suitable for various memristor-based breadboard experiments.

  4. Stylized facts from a threshold-based heterogeneous agent model

    NASA Astrophysics Data System (ADS)

    Cross, R.; Grinfeld, M.; Lamba, H.; Seaman, T.

    2007-05-01

    A class of heterogeneous agent models is investigated where investors switch trading position whenever their motivation to do so exceeds some critical threshold. These motivations can be psychological in nature or reflect behaviour suggested by the efficient market hypothesis (EMH). By introducing different propensities into a baseline model that displays EMH behaviour, one can attempt to isolate their effects upon the market dynamics. The simulation results indicate that the introduction of a herding propensity results in excess kurtosis and power-law decay consistent with those observed in actual return distributions, but not in significant long-term volatility correlations. Possible alternatives for introducing such long-term volatility correlations are then identified and discussed.

  5. Identifying the most appropriate age threshold for TNM stage grouping of well-differentiated thyroid cancer.

    PubMed

    Hendrickson-Rebizant, J; Sigvaldason, H; Nason, R W; Pathak, K A

    2015-08-01

    Age is integrated into most risk stratification systems for well-differentiated thyroid cancer (WDTC). The most appropriate age threshold for stage grouping of WDTC is debatable. The objective of this study was to evaluate the best age threshold for stage grouping by comparing multivariable models designed to evaluate the independent impact of various prognostic factors, including age-based stage grouping, on the disease-specific survival (DSS) of our population-based cohort. Data from a population-based thyroid cancer cohort of 2125 consecutive WDTC cases, diagnosed during 1970-2010, with a median follow-up of 11.5 years, were used to calculate DSS using the Kaplan-Meier method. Multivariable analysis with a Cox proportional hazards model was used to assess the independent impact of different prognostic factors on DSS. The Akaike information criterion (AIC), a measure of statistical model fit, was used to identify the most appropriate age threshold model. Delta AIC, Akaike weight, and evidence ratios were calculated to compare the relative strength of different models. The mean age of the patients was 47.3 years. DSS of the cohort was 95.6% and 92.8% at 10 and 20 years, respectively. A threshold of 55 years, with the lowest AIC, was identified as the best model. The Akaike weight indicated an 85% chance that this age threshold is the best among the compared models, and that it is 16.8 times more likely to be the best model than a threshold of 45 years. The age threshold of 55 years was found to be the best for TNM stage grouping. Copyright © 2015 Elsevier Ltd. All rights reserved.
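
    The AIC-based comparison described here uses standard quantities: delta-AIC relative to the best model, Akaike weights w_i = exp(-delta_i/2) / sum_j exp(-delta_j/2), and evidence ratios w_best / w_i. The sketch below computes them for invented AIC values, purely to show the bookkeeping.

      import numpy as np

      def akaike_weights(aic_values):
          """Delta-AIC, Akaike weights and evidence ratios relative to the best model."""
          aic = np.asarray(aic_values, dtype=float)
          delta = aic - aic.min()
          w = np.exp(-delta / 2.0)
          w /= w.sum()
          return delta, w, w.max() / w

      thresholds = [40, 45, 50, 55, 60]
      aics = [512.3, 510.9, 508.4, 505.2, 509.8]   # invented AIC values, for illustration only
      for t, d, wi, er in zip(thresholds, *akaike_weights(aics)):
          print(f"age {t}: dAIC = {d:.1f}, weight = {wi:.2f}, evidence ratio = {er:.1f}")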

  6. A threshold model of investor psychology

    NASA Astrophysics Data System (ADS)

    Cross, Rod; Grinfeld, Michael; Lamba, Harbir; Seaman, Tim

    2005-08-01

    We introduce a class of agent-based market models founded upon simple descriptions of investor psychology. Agents are subject to various psychological tensions induced by market conditions and endowed with a minimal ‘personality’. This personality consists of a threshold level for each of the tensions being modeled, and the agent reacts whenever a tension threshold is reached. This paper considers an elementary model including just two such tensions. The first is ‘cowardice’, which is the stress caused by remaining in a minority position with respect to overall market sentiment and leads to herding-type behavior. The second is ‘inaction’, which is the increasing desire to act or re-evaluate one's investment position. There is no inductive learning by agents and they are only coupled via the global market price and overall market sentiment. Even incorporating just these two psychological tensions, important stylized facts of real market data, including fat-tails, excess kurtosis, uncorrelated price returns and clustered volatility over the timescale of a few days are reproduced. By then introducing an additional parameter that amplifies the effect of externally generated market noise during times of extreme market sentiment, long-time volatility correlations can also be recovered.

  7. Predictors of the nicotine reinforcement threshold, compensation, and elasticity of demand in a rodent model of nicotine reduction policy.

    PubMed

    Grebenstein, Patricia E; Burroughs, Danielle; Roiko, Samuel A; Pentel, Paul R; LeSage, Mark G

    2015-06-01

    The FDA is considering reducing the nicotine content in tobacco products as a population-based strategy to reduce tobacco addiction. Research is needed to determine the threshold level of nicotine needed to maintain smoking and the extent of compensatory smoking that could occur during nicotine reduction. Sources of variability in these measures across sub-populations also need to be identified so that policies can take into account the risks and benefits of nicotine reduction in vulnerable populations. The present study examined these issues in a rodent nicotine self-administration model of nicotine reduction policy to characterize individual differences in nicotine reinforcement thresholds, degree of compensation, and elasticity of demand during progressive reduction of the unit nicotine dose. The ability of individual differences in baseline nicotine intake and nicotine pharmacokinetics to predict responses to dose reduction was also examined. Considerable variability in the reinforcement threshold, compensation, and elasticity of demand was evident. High baseline nicotine intake was not correlated with the reinforcement threshold, but predicted less compensation and less elastic demand. Higher nicotine clearance predicted low reinforcement thresholds, greater compensation, and less elastic demand. Less elastic demand also predicted lower reinforcement thresholds. These findings suggest that baseline nicotine intake, nicotine clearance, and the essential value of nicotine (i.e. elasticity of demand) moderate the effects of progressive nicotine reduction in rats and warrant further study in humans. They also suggest that smokers with fast nicotine metabolism may be more vulnerable to the risks of nicotine reduction. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  8. Predictors of the nicotine reinforcement threshold, compensation, and elasticity of demand in a rodent model of nicotine reduction policy*

    PubMed Central

    Grebenstein, Patricia E.; Burroughs, Danielle; Roiko, Samuel A.; Pentel, Paul R.; LeSage, Mark G.

    2015-01-01

    Background The FDA is considering reducing the nicotine content in tobacco products as a population-based strategy to reduce tobacco addiction. Research is needed to determine the threshold level of nicotine needed to maintain smoking and the extent of compensatory smoking that could occur during nicotine reduction. Sources of variability in these measures across sub-populations also need to be identified so that policies can take into account the risks and benefits of nicotine reduction in vulnerable populations. Methods The present study examined these issues in a rodent nicotine self- administration model of nicotine reduction policy to characterize individual differences in nicotine reinforcement thresholds, degree of compensation, and elasticity of demand during progressive reduction of the unit nicotine dose. The ability of individual differences in baseline nicotine intake and nicotine pharmacokinetics to predict responses to dose reduction was also examined. Results Considerable variability in the reinforcement threshold, compensation, and elasticity of demand was evident. High baseline nicotine intake was not correlated with the reinforcement threshold, but predicted less compensation and less elastic demand. Higher nicotine clearance predicted low reinforcement thresholds, greater compensation, and less elastic demand. Less elastic demand also predicted lower reinforcement thresholds. Conclusions These findings suggest that baseline nicotine intake, nicotine clearance, and the essential value of nicotine (i.e. elasticity of demand) moderate the effects of progressive nicotine reduction in rats and warrant further study in humans. They also suggest that smokers with fast nicotine metabolism may be more vulnerable to the risks of nicotine reduction. PMID:25891231

  9. Incorporating uncertainty of management costs in sensitivity analyses of matrix population models.

    PubMed

    Salomon, Yacov; McCarthy, Michael A; Taylor, Peter; Wintle, Brendan A

    2013-02-01

    The importance of accounting for economic costs when making environmental-management decisions subject to resource constraints has been increasingly recognized in recent years. In contrast, uncertainty associated with such costs has often been ignored. We developed a method, on the basis of economic theory, that accounts for the uncertainty in population-management decisions. We considered the case where, rather than taking fixed values, model parameters are random variables that represent the situation when parameters are not precisely known. Hence, the outcome is not precisely known either. Instead of maximizing the expected outcome, we maximized the probability of obtaining an outcome above a threshold of acceptability. We derived explicit analytical expressions for the optimal allocation and its associated probability, as a function of the threshold of acceptability, where the model parameters were distributed according to normal and uniform distributions. To illustrate our approach we revisited a previous study that incorporated cost-efficiency analyses in management decisions that were based on perturbation analyses of matrix population models. Incorporating derivations from this study into our framework, we extended the model to address potential uncertainties. We then applied these results to 2 case studies: management of a Koala (Phascolarctos cinereus) population and conservation of an olive ridley sea turtle (Lepidochelys olivacea) population. For low aspirations, that is, when the threshold of acceptability is relatively low, the optimal strategy was obtained by diversifying the allocation of funds. Conversely, for high aspirations, the budget was directed toward management actions with the highest potential effect on the population. The exact optimal allocation was sensitive to the choice of uncertainty model. Our results highlight the importance of accounting for uncertainty when making decisions and suggest that more effort should be placed on
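
    The decision rule described here (maximize the probability of exceeding a threshold of acceptability rather than the expected outcome) is easy to illustrate when the outcome is approximately normal: P(outcome > T) = 1 - Phi((T - mu)/sigma). The allocations and numbers below are invented; they simply show how the preferred strategy can flip from diversified to concentrated as the aspiration level rises.

      from scipy.stats import norm

      def prob_above_threshold(mu, sigma, T):
          """P(outcome > T) for a normally distributed management outcome."""
          return 1.0 - norm.cdf((T - mu) / sigma)

      # Two hypothetical fund allocations with uncertain population outcomes.
      allocations = {"diversified": (100.0, 10.0), "concentrated": (110.0, 30.0)}
      for T in (80.0, 130.0):   # low vs. high threshold of acceptability
          best = max(allocations, key=lambda a: prob_above_threshold(*allocations[a], T))
          print(f"threshold {T}: prefer the {best} allocation")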

  10. Comparing population and incident data for optimal air ambulance base locations in Norway.

    PubMed

    Røislien, Jo; van den Berg, Pieter L; Lindner, Thomas; Zakariassen, Erik; Uleberg, Oddvar; Aardal, Karen; van Essen, J Theresia

    2018-05-24

    Helicopter emergency medical services are important in many health care systems. Norway has a nationwide physician manned air ambulance service servicing a country with large geographical variations in population density and incident frequencies. The aim of the study was to compare optimal air ambulance base locations using both population and incident data. We used municipality population and incident data for Norway from 2015. The 428 municipalities had a median (5-95 percentile) of 4675 (940-36,264) inhabitants and 10 (2-38) incidents. Optimal helicopter base locations were estimated using the Maximal Covering Location Problem (MCLP) optimization model, exploring the number and location of bases needed to cover various fractions of the population for time thresholds 30 and 45 min, in green field scenarios and conditioned on the existing base structure. The existing bases covered 96.90% of the population and 91.86% of the incidents for time threshold 45 min. Correlation between municipality population and incident frequencies was -0.0027, and optimal base locations varied markedly between the two data types, particularly when lowering the target time. The optimal solution using population density data put focus on the greater Oslo area, where one third of Norwegians live, while using incident data put focus on low population high incident areas, such as northern Norway and winter sport resorts. Using population density data as a proxy for incident frequency is not recommended, as the two data types lead to different optimal base locations. Lowering the target time increases the sensitivity to choice of data.
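
    The study solves the Maximal Covering Location Problem exactly; the sketch below is only a greedy heuristic for the same covering objective, with a tiny invented data set, to make the structure of the problem concrete (demand weights per municipality, candidate sites, and a coverage set per site defined by the response-time threshold).

      def greedy_mclp(demand, covers, p):
          """Pick p base sites so that the covered demand is (greedily) as large as possible.

          demand : dict municipality -> demand weight (population or incident count)
          covers : dict candidate site -> set of municipalities reachable within the time threshold
          p      : number of bases to open
          """
          open_sites, covered = [], set()
          for _ in range(p):
              best = max(covers, key=lambda s: sum(demand[m] for m in covers[s] - covered))
              open_sites.append(best)
              covered |= covers[best]
          return open_sites, sum(demand[m] for m in covered) / sum(demand.values())

      demand = {"A": 500, "B": 120, "C": 80, "D": 300, "E": 60, "F": 40}
      covers = {1: {"A", "B"}, 2: {"B", "C", "E"}, 3: {"D", "F"}, 4: {"A", "D"}}
      print(greedy_mclp(demand, covers, p=2))   # chosen sites and fraction of demand covered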

  11. A New Approach to Threshold Attribute Based Signatures

    DTIC Science & Technology

    2011-01-01

    Inspired by developments in attribute based encryption and signatures, there has recently been a spurt of progress in the direction of threshold attribute based signatures (t-ABS). In this work we propose a novel approach to construct threshold attribute based signatures inspired by ring signatures. Threshold attribute based signatures, defined by a (t, n) threshold predicate, ensure that the signer holds at least t out of a specified set of n attributes

  12. Threshold and Beyond: Modeling The Intensity Dependence of Auditory Responses

    PubMed Central

    2007-01-01

    In many studies of auditory-evoked responses to low-intensity sounds, the response amplitude appears to increase roughly linearly with the sound level in decibels (dB), corresponding to a logarithmic intensity dependence. But the auditory system is assumed to be linear in the low-intensity limit. The goal of this study was to resolve the seeming contradiction. Based on assumptions about the rate-intensity functions of single auditory-nerve fibers and the pattern of cochlear excitation caused by a tone, a model for the gross response of the population of auditory nerve fibers was developed. In accordance with signal detection theory, the model denies the existence of a threshold. This implies that regarding the detection of a significant stimulus-related effect, a reduction in sound intensity can always be compensated for by increasing the measurement time, at least in theory. The model suggests that the gross response is proportional to intensity when the latter is low (range I), and a linear function of sound level at higher intensities (range III). For intensities in between, it is concluded that noisy experimental data may provide seemingly irrefutable evidence of a linear dependence on sound pressure (range II). In view of the small response amplitudes that are to be expected for intensity range I, direct observation of the predicted proportionality with intensity will generally be a challenging task for an experimenter. Although the model was developed for the auditory nerve, the basic conclusions are probably valid for higher levels of the auditory system, too, and might help to improve models for loudness at threshold. PMID:18008105
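
    The qualitative behaviour described above can be illustrated with a toy population sum of saturating rate-intensity functions (this is an illustration of the stated conclusions, not the published model; all parameters are arbitrary): with fibre sensitivities spread over many decades, the summed response grows in proportion to intensity at very low levels and roughly linearly with level in dB at high levels.

    ```python
    import numpy as np

    half_sat = 10 ** (np.linspace(0, 6, 200) / 10)   # fibre half-saturation intensities
    for level_db in range(-40, 81, 10):
        intensity = 10 ** (level_db / 10)            # relative intensity
        # each fibre contributes a saturating response I / (I + half-saturation)
        gross = float(np.sum(intensity / (intensity + half_sat)))
        print(f"{level_db:4d} dB  summed response {gross:8.3f}")
    ```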

  13. Contributions of adaptation currents to dynamic spike threshold on slow timescales: Biophysical insights from conductance-based models

    NASA Astrophysics Data System (ADS)

    Yi, Guosheng; Wang, Jiang; Wei, Xile; Deng, Bin; Li, Huiyan; Che, Yanqiu

    2017-06-01

    Spike-frequency adaptation (SFA) mediated by various adaptation currents, such as voltage-gated K+ current (IM), Ca2+-gated K+ current (IAHP), or Na+-activated K+ current (IKNa), exists in many types of neurons and has been shown to effectively shape their information transmission properties on slow timescales. Here we use conductance-based models to investigate how the activation of three adaptation currents regulates the threshold voltage for action potential (AP) initiation during the course of SFA. We observe that the spike threshold becomes depolarized and the rate of membrane depolarization (dV/dt) preceding the AP is reduced as adaptation currents reduce the firing rate. This indicates that the presence of inhibitory adaptation currents enables the neuron to generate a dynamic threshold, inversely correlated with the preceding dV/dt, on slower timescales than the fast dynamics of AP generation. By analyzing the interactions of ionic currents at subthreshold potentials, we find that the activation of adaptation currents increases the outward level of net membrane current prior to AP initiation, which antagonizes the inward Na+ current and results in a depolarized threshold and lower dV/dt from one AP to the next. Our simulations demonstrate that the threshold dynamics on slow timescales are a secondary effect caused by the activation of adaptation currents. These findings provide a biophysical interpretation of the relationship between adaptation currents and spike threshold.

  14. Poverty dynamics, poverty thresholds and mortality: An age-stage Markovian model

    PubMed Central

    Rehkopf, David; Tuljapurkar, Shripad; Horvitz, Carol C.

    2018-01-01

    Recent studies have examined the risk of poverty throughout the life course, but few have considered how transitioning in and out of poverty shapes the dynamic heterogeneity and mortality disparities of a cohort at each age. Here we use state-by-age modeling to capture individual heterogeneity in crossing one of three different poverty thresholds (defined as 1×, 2× or 3× the “official” poverty threshold) at each age. We examine age-specific state structure, the remaining life expectancy, its variance, and cohort simulations for those above and below each threshold. Survival and transitioning probabilities are statistically estimated by regression analyses of data from the Health and Retirement Survey RAND data set and the National Longitudinal Survey of Youth. Using the results of these regression analyses, we parameterize discrete-state, discrete-age matrix models. We found that individuals above all three thresholds have higher annual survival than those in poverty, especially from mid-ages to about age 80. The advantage is greatest when we classify individuals based on 1× the “official” poverty threshold. The greatest discrepancy in average remaining life expectancy and its variance between those above and in poverty occurs at mid-ages for all three thresholds. Fewer individuals are in poverty between ages 40 and 60 for all three thresholds. Our findings are consistent with results based on other data sets, but also suggest that dynamic heterogeneity in poverty and the transience of the poverty state is associated with income-related mortality disparities (less transience, especially of those above poverty, more disparities). This paper applies the approach of age-by-stage matrix models to human demography and individual poverty dynamics. In so doing we extend the literature on individual poverty dynamics across the life course. PMID:29768416

  15. Quantifying Population-Level Risks Using an Individual-Based Model: Sea Otters, Harlequin Ducks, and the Exxon Valdez Oil Spill

    PubMed Central

    Harwell, Mark A; Gentile, John H; Parker, Keith R

    2012-01-01

    Ecological risk assessments need to advance beyond evaluating risks to individuals that are largely based on toxicity studies conducted on a few species under laboratory conditions, to assessing population-level risks to the environment, including considerations of variability and uncertainty. Two individual-based models (IBMs), recently developed to assess current risks to sea otters and seaducks in Prince William Sound more than 2 decades after the Exxon Valdez oil spill (EVOS), are used to explore population-level risks. In each case, the models had previously shown that there were essentially no remaining risks to individuals from polycyclic aromatic hydrocarbons (PAHs) derived from the EVOS. New sensitivity analyses are reported here in which hypothetical environmental exposures to PAHs were heuristically increased until assimilated doses reached toxicity reference values (TRVs) derived at the no-observed-adverse-effects and lowest-observed-adverse-effects levels (NOAEL and LOAEL, respectively). For the sea otters, this was accomplished by artificially increasing the number of sea otter pits that would intersect remaining patches of subsurface oil residues by orders of magnitude over actual estimated rates. Similarly, in the seaduck assessment, the PAH concentrations in the constituents of diet, sediments, and seawater were increased in proportion to their relative contributions to the assimilated doses by orders of magnitude over measured environmental concentrations, to reach the NOAEL and LOAEL thresholds. The stochastic IBMs simulated millions of individuals. From these outputs, frequency distributions were derived of assimilated doses for populations of 500 000 sea otters or seaducks in each of 7 or 8 classes, respectively. Doses to several selected quantiles were analyzed, ranging from the 1-in-1000th most-exposed individuals (99.9% quantile) to the median-exposed individuals (50% quantile). The resulting families of quantile curves provide the basis for

  16. An experimental operative system for shallow landslide and flash flood warning based on rainfall thresholds and soil moisture modelling

    NASA Astrophysics Data System (ADS)

    Brigandı, G.; Aronica, G. T.; Basile, G.; Pasotti, L.; Panebianco, M.

    2012-04-01

    In November 2011 an almost exceptional thunderstorm struck the north-eastern part of the Sicily Region (Italy), producing local heavy rainfall, mud-debris flows and flash flooding. The storm was concentrated on the Tyrrhenian sea coast near the city of Barcellona within the Longano catchment. The main focus of the paper is to present an experimental operative system for alerting extreme hydrometeorological events using a methodology based on the combined use of rainfall thresholds, soil moisture indexes and quantitative precipitation forecasting. Shallow landslide and flash flood warning is a key element in improving Civil Protection efforts to mitigate damage and safeguard people's safety. It is a rather complicated task, particularly in catchments with a flashy response, where even short lead times are important and welcome. It is well known that the triggering of shallow landslides is strongly influenced by the initial soil moisture conditions of catchments. Therefore, the early warning system applied here is based on the combined use of rainfall thresholds, derived both for flash floods and for landslides, and soil moisture conditions; the system is composed of several basic components related to antecedent soil moisture conditions, real-time rainfall monitoring and antecedent rainfall. Soil moisture conditions were estimated using an Antecedent Precipitation Index (API), similar to the Antecedent Moisture Condition (AMC) index widely used for defining soil moisture conditions. Rainfall thresholds for landslides were derived using historical and statistical analysis. Finally, rainfall thresholds for flash flooding were derived using an Instantaneous Unit Hydrograph based lumped rainfall-runoff model with the SCS-CN routine for net rainfall. After the implementation and calibration of the model, a testing phase was carried out using real data collected for the November 2011 event in the Longano catchment. Moreover, in
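
    The API referred to above is typically computed as an exponentially decaying running sum of past rainfall; the sketch below shows that recursion with an illustrative decay constant and rainfall series (both assumptions, not values from the paper).

    ```python
    def antecedent_precipitation_index(daily_rain_mm, k=0.9, api0=0.0):
        """Recursive API: yesterday's wetness decays by k, today's rain is added."""
        api, series = api0, []
        for p in daily_rain_mm:
            api = k * api + p
            series.append(round(api, 2))
        return series

    print(antecedent_precipitation_index([0.0, 12.5, 30.0, 0.0, 0.0, 5.0]))
    ```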

  17. Scalable Entity-Based Modeling of Population-Based Systems, Final LDRD Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cleary, A J; Smith, S G; Vassilevska, T K

    2005-01-27

    The goal of this project has been to develop tools, capabilities and expertise in the modeling of complex population-based systems via scalable entity-based modeling (EBM). Our initial focal application domain has been the dynamics of large populations exposed to disease-causing agents, a topic of interest to the Department of Homeland Security in the context of bioterrorism. In the academic community, discrete simulation technology based on individual entities has shown initial success, but the technology has not been scaled to the problem sizes or computational resources of LLNL. Our developmental emphasis has been on the extension of this technology to parallel computers and maturation of the technology from an academic to a lab setting.

  18. Threshold q-voter model

    NASA Astrophysics Data System (ADS)

    Vieira, Allan R.; Anteneodo, Celia

    2018-05-01

    We introduce the threshold q-voter opinion dynamics where an agent, facing a binary choice, can change its mind when at least q0 among q neighbors share the opposite opinion. Otherwise, the agent can still change its mind with a certain probability ε. This threshold dynamics contemplates the possibility of persuasion by an influence group even when there is not full agreement among its members. In fact, individuals can follow their peers not only when there is unanimity (q0 = q) in the lobby group, as assumed in the q-voter model, but also, depending on the circumstances, when there is a simple majority (q0 > q/2), Byzantine consensus (q0 > 2q/3), or any minimal number q0 among q. This realistic threshold gives rise to emerging collective states and phase transitions which are not observed in the standard q-voter model. The threshold q0, together with the stochasticity introduced by ε, yields a phenomenology that mimics as particular cases the q-voter model with stochastic drivings such as nonconformity and independence. In particular, nonconsensus majority states are possible, as well as mixed phases. Continuous and discontinuous phase transitions can occur, but also transitions from fluctuating phases into absorbing states.
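
    A direct Monte Carlo sketch of the update rule described above (mean-field, i.e. neighbours drawn at random from the whole population; all parameter values are illustrative only):

    ```python
    import random

    def step(opinions, q, q0, eps):
        n = len(opinions)
        i = random.randrange(n)
        lobby = random.sample(range(n), q)   # q random agents (chance of drawing i itself is negligible)
        opposite = sum(1 for j in lobby if opinions[j] != opinions[i])
        # flip if at least q0 of the q neighbours disagree, otherwise flip with probability eps
        if opposite >= q0 or random.random() < eps:
            opinions[i] *= -1

    random.seed(1)
    opinions = [1 if random.random() < 0.6 else -1 for _ in range(1000)]
    for _ in range(200_000):
        step(opinions, q=4, q0=3, eps=0.01)
    print("final magnetisation:", sum(opinions) / len(opinions))
    ```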

  19. Threshold model of cascades in empirical temporal networks

    NASA Astrophysics Data System (ADS)

    Karimi, Fariba; Holme, Petter

    2013-08-01

    Threshold models try to explain the consequences of social influence like the spread of fads and opinions. Along with models of epidemics, they constitute a major theoretical framework of social spreading processes. In threshold models on static networks, an individual changes her state if a certain fraction of her neighbors has done the same. When there are strong correlations in the temporal aspects of contact patterns, it is useful to represent the system as a temporal network. In such a system, not only contacts but also the time of the contacts are represented explicitly. In many cases, bursty temporal patterns slow down disease spreading. However, as we will see, this is not a universal truth for threshold models. In this work we propose an extension of Watts’s classic threshold model to temporal networks. We do this by assuming that an agent is influenced by contacts which lie a certain time into the past, i.e., the individuals are affected by contacts within a time window. In addition to thresholds in the fraction of contacts, we also investigate the number of contacts within the time window as a basis for influence. To elucidate the model’s behavior, we run the model on real and randomized empirical contact datasets.
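
    A sketch of the time-window rule described above (an illustration of the mechanism, not the authors' implementation): on each contact the receiving agent looks back over its contacts within the last `window` time units and adopts if the fraction coming from adopters reaches its threshold. The event list, threshold and window length are toy assumptions.

    ```python
    from collections import defaultdict, deque

    def run_cascade(events, seeds, threshold=0.3, window=5.0):
        """events: time-ordered (t, source, ego) contacts; returns the set of adopters."""
        adopted = set(seeds)
        recent = defaultdict(deque)                 # ego -> deque of (t, source)
        for t, source, ego in events:
            dq = recent[ego]
            dq.append((t, source))
            while dq and dq[0][0] < t - window:     # forget contacts outside the window
                dq.popleft()
            if ego not in adopted:
                frac = sum(1 for _, s in dq if s in adopted) / len(dq)
                if frac >= threshold:
                    adopted.add(ego)
        return adopted

    events = [(0, "a", "b"), (1, "c", "b"), (2, "a", "c"), (7, "b", "c"), (8, "b", "d")]
    print(run_cascade(events, seeds={"a"}))
    ```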

  20. Use of Threshold of Toxicological Concern (TTC) with High Throughput Exposure Predictions as a Risk-Based Screening Approach to Prioritize More Than Seven Thousand Chemicals (ASCCT)

    EPA Science Inventory

    Here, we present results of an approach for risk-based prioritization using the Threshold of Toxicological Concern (TTC) combined with high-throughput exposure (HTE) modelling. We started with 7968 chemicals with calculated population median oral daily intakes characterized by an...

  1. Variable-Threshold Threshold Elements,

    DTIC Science & Technology

    A threshold element is a mathematical model of certain types of logic gates and of a biological neuron. Much work has been done on the subject of threshold elements with fixed thresholds; this study concerns itself with elements in which the threshold may be varied: variable-threshold threshold elements. Physical realizations include resistor-transistor elements, in which the threshold is simply a voltage. Variation of the threshold causes the

  2. Drought survival is a threshold function of habitat size and population density in a fish metapopulation.

    PubMed

    White, Richard S A; McHugh, Peter A; McIntosh, Angus R

    2016-10-01

    Because smaller habitats dry more frequently and severely during droughts, habitat size is likely a key driver of survival in populations during climate change and associated increased extreme drought frequency. Here, we show that survival in populations during droughts is a threshold function of habitat size driven by an interaction with population density in metapopulations of the forest pool dwelling fish, Neochanna apoda. A mark-recapture study involving 830 N. apoda individuals during a one-in-seventy-year extreme drought revealed that survival during droughts was high for populations occupying pools deeper than 139 mm, but declined steeply in shallower pools. This threshold was caused by an interaction between increasing population density and drought magnitude associated with decreasing habitat size, which acted synergistically to increase physiological stress and mortality. This confirmed two long-held hypotheses, firstly concerning the interactive role of population density and physiological stress, herein driven by habitat size, and secondly, the occurrence of drought survival thresholds. Our results demonstrate how survival in populations during droughts will depend strongly on habitat size and highlight that minimum habitat size thresholds will likely be required to maximize survival as the frequency and intensity of droughts are projected to increase as a result of global climate change. © 2016 John Wiley & Sons Ltd.

  3. A generalized methodology for identification of threshold for HRU delineation in SWAT model

    NASA Astrophysics Data System (ADS)

    M, J.; Sudheer, K.; Chaubey, I.; Raj, C.

    2016-12-01

    The Soil and Water Assessment Tool (SWAT) is a comprehensive distributed hydrological model widely used to support various decisions. The simulation accuracy of a distributed hydrological model depends on the mechanism used to subdivide the watershed. SWAT subdivides the watershed into sub-basins and further into small computing units known as hydrologic response units (HRUs). HRUs are delineated based on unique combinations of land use, soil type, and slope within the sub-watersheds, and are not spatially defined. The computations in SWAT are done at the HRU level and are then aggregated up to the sub-basin outlet, which is routed through the stream system. Generally, HRUs are delineated using threshold percentages of land use, soil and slope that are specified by the modeler to decrease the computation time of the model; the thresholds constrain the minimum area for constructing an HRU. In the current HRU delineation practice in SWAT, any land use, soil or slope class within a sub-basin that covers less than the predefined threshold is absorbed into the dominant land use, soil and slope classes, which introduces some ambiguity into the process simulations through inappropriate representation of the area. The loss of information due to variation in the threshold values, however, depends highly on the purpose of the study. This research therefore studies the effects of HRU delineation threshold values on SWAT sediment simulations and suggests guidelines for selecting appropriate threshold values with respect to sediment simulation accuracy. A preliminary study was carried out on the Illinois watershed by assigning different thresholds for land use and soil. A general methodology was proposed for identifying an appropriate threshold for HRU delineation in the SWAT model that considered computational time and accuracy of the simulation

  4. An agent-based computational model for tuberculosis spreading on age-structured populations

    NASA Astrophysics Data System (ADS)

    Graciani Rodrigues, C. C.; Espíndola, Aquino L.; Penna, T. J. P.

    2015-06-01

    In this work we present an agent-based computational model to study the spreading of the tuberculosis (TB) disease in age-structured populations. The proposed model is a merger of two previous models: an agent-based computational model for the spreading of tuberculosis and a bit-string model for biological aging. The combination of TB with population aging reproduces the coexistence of health states seen in real populations. In addition, the universal exponential behavior of mortality curves is still preserved. Finally, the population distribution as a function of age shows the prevalence of TB mostly in elders, for high-efficacy treatments.

  5. Porcine skin visible lesion thresholds for near-infrared lasers including modeling at two pulse durations and spot sizes.

    PubMed

    Cain, C P; Polhamus, G D; Roach, W P; Stolarski, D J; Schuster, K J; Stockton, K L; Rockwell, B A; Chen, Bo; Welch, A J

    2006-01-01

    With the advent of such systems as the airborne laser and advanced tactical laser, high-energy lasers that use 1315-nm wavelengths in the near-infrared band will soon present a new laser safety challenge to armed forces and civilian populations. Experiments in nonhuman primates using this wavelength have demonstrated a range of ocular injuries, including corneal, lenticular, and retinal lesions as a function of pulse duration. American National Standards Institute (ANSI) laser safety standards have traditionally been based on experimental data, and there is scant data for this wavelength. We are reporting minimum visible lesion (MVL) threshold measurements using a porcine skin model for two different pulse durations and spot sizes for this wavelength. We also compare our measurements to results from our model based on the heat transfer equation and rate process equation, together with actual temperature measurements on the skin surface using a high-speed infrared camera. Our MVL-ED50 thresholds for long pulses (350 μs) at 24-h postexposure are measured to be 99 and 83 J cm⁻² for spot sizes of 0.7 and 1.3 mm diameter, respectively. Q-switched laser pulses of 50 ns have a lower threshold of 11 J cm⁻² for a 5-mm-diameter top-hat laser pulse.

  6. Nonlinear Dynamic Modeling of Neuron Action Potential Threshold During Synaptically Driven Broadband Intracellular Activity

    PubMed Central

    Roach, Shane M.; Song, Dong; Berger, Theodore W.

    2012-01-01

    Activity-dependent variation of neuronal thresholds for action potential (AP) generation is one of the key determinants of spike-train temporal-pattern transformations from presynaptic to postsynaptic spike trains. In this study, we model the nonlinear dynamics of the threshold variation during synaptically driven broadband intracellular activity. First, membrane potentials of single CA1 pyramidal cells were recorded under physiologically plausible broadband stimulation conditions. Second, a method was developed to measure AP thresholds from the continuous recordings of membrane potentials. It involves measuring the turning points of APs by analyzing the third-order derivatives of the membrane potentials. Four stimulation paradigms with different temporal patterns were applied to validate this method by comparing the measured AP turning points and the actual AP thresholds estimated with varying stimulation intensities. Results show that the AP turning points provide consistent measurement of the AP thresholds, except for a constant offset. It indicates that 1) the variation of AP turning points represents the nonlinearities of threshold dynamics; and 2) an optimization of the constant offset is required to achieve accurate spike prediction. Third, a nonlinear dynamical third-order Volterra model was built to describe the relations between the threshold dynamics and the AP activities. Results show that the model can predict threshold accurately based on the preceding APs. Finally, the dynamic threshold model was integrated into a previously developed single neuron model and resulted in a 33% improvement in spike prediction. PMID:22156947
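
    A toy illustration of the turning-point measurement described above, using a synthetic membrane-potential trace and finite-difference derivatives (the trace, spike shape and search window are invented for illustration, not the recorded data): the threshold estimate is taken as the point where the third derivative of Vm peaks just before the spike.

    ```python
    import numpy as np

    dt = 0.01                                        # ms
    t = np.arange(0.0, 20.0, dt)
    vm = -65 + 0.5 * np.sin(2 * np.pi * t / 7)       # slow subthreshold fluctuation (mV)
    vm = vm + 100 * np.exp(-0.5 * ((t - 12) / 0.15) ** 2)   # one stylised spike near 12 ms

    d3 = np.gradient(np.gradient(np.gradient(vm, dt), dt), dt)  # third-derivative estimate
    rising = (t > 10.0) & (t < 12.0)                 # window on the rising phase only
    k = np.argmax(d3[rising])
    print(f"turning point at t = {t[rising][k]:.2f} ms, Vm = {vm[rising][k]:.1f} mV")
    ```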

  7. Understanding Past Population Dynamics: Bayesian Coalescent-Based Modeling with Covariates

    PubMed Central

    Gill, Mandev S.; Lemey, Philippe; Bennett, Shannon N.; Biek, Roman; Suchard, Marc A.

    2016-01-01

    Effective population size characterizes the genetic variability in a population and is a parameter of paramount importance in population genetics and evolutionary biology. Kingman’s coalescent process enables inference of past population dynamics directly from molecular sequence data, and researchers have developed a number of flexible coalescent-based models for Bayesian nonparametric estimation of the effective population size as a function of time. Major goals of demographic reconstruction include identifying driving factors of effective population size, and understanding the association between the effective population size and such factors. Building upon Bayesian nonparametric coalescent-based approaches, we introduce a flexible framework that incorporates time-varying covariates that exploit Gaussian Markov random fields to achieve temporal smoothing of effective population size trajectories. To approximate the posterior distribution, we adapt efficient Markov chain Monte Carlo algorithms designed for highly structured Gaussian models. Incorporating covariates into the demographic inference framework enables the modeling of associations between the effective population size and covariates while accounting for uncertainty in population histories. Furthermore, it can lead to more precise estimates of population dynamics. We apply our model to four examples. We reconstruct the demographic history of raccoon rabies in North America and find a significant association with the spatiotemporal spread of the outbreak. Next, we examine the effective population size trajectory of the DENV-4 virus in Puerto Rico along with viral isolate count data and find similar cyclic patterns. We compare the population history of the HIV-1 CRF02_AG clade in Cameroon with HIV incidence and prevalence data and find that the effective population size is more reflective of incidence rate. Finally, we explore the hypothesis that the population dynamics of musk ox during the Late

  8. Unipolar Terminal-Attractor Based Neural Associative Memory with Adaptive Threshold

    NASA Technical Reports Server (NTRS)

    Liu, Hua-Kuang (Inventor); Barhen, Jacob (Inventor); Farhat, Nabil H. (Inventor); Wu, Chwan-Hwa (Inventor)

    1996-01-01

    A unipolar terminal-attractor based neural associative memory (TABAM) system with adaptive threshold for perfect convergence is presented. By adaptively setting the threshold values for the dynamic iteration for the unipolar binary neuron states with terminal-attractors for the purpose of reducing the spurious states in a Hopfield neural network for associative memory and using the inner-product approach, perfect convergence and correct retrieval is achieved. Simulation is completed with a small number of stored states (M) and a small number of neurons (N) but a large M/N ratio. An experiment with optical exclusive-OR logic operation using LCTV SLMs shows the feasibility of optoelectronic implementation of the models. A complete inner-product TABAM is implemented using a PC for calculation of adaptive threshold values to achieve a unipolar TABAM (UIT) in the case where there is no crosstalk, and a crosstalk model (CRIT) in the case where crosstalk corrupts the desired state.

  9. Unipolar terminal-attractor based neural associative memory with adaptive threshold

    NASA Technical Reports Server (NTRS)

    Liu, Hua-Kuang (Inventor); Barhen, Jacob (Inventor); Farhat, Nabil H. (Inventor); Wu, Chwan-Hwa (Inventor)

    1993-01-01

    A unipolar terminal-attractor based neural associative memory (TABAM) system with adaptive threshold for perfect convergence is presented. By adaptively setting the threshold values for the dynamic iteration for the unipolar binary neuron states with terminal-attractors for the purpose of reducing the spurious states in a Hopfield neural network for associative memory and using the inner product approach, perfect convergence and correct retrieval is achieved. Simulation is completed with a small number of stored states (M) and a small number of neurons (N) but a large M/N ratio. An experiment with optical exclusive-OR logic operation using LCTV SLMs shows the feasibility of optoelectronic implementation of the models. A complete inner-product TABAM is implemented using a PC for calculation of adaptive threshold values to achieve a unipolar TABAM (UIT) in the case where there is no crosstalk, and a crosstalk model (CRIT) in the case where crosstalk corrupts the desired state.

  10. Estimation of debris flow critical rainfall thresholds by a physically-based model

    NASA Astrophysics Data System (ADS)

    Papa, M. N.; Medina, V.; Ciervo, F.; Bateman, A.

    2012-11-01

    Real-time assessment of debris flow hazard is fundamental for setting up warning systems that can mitigate its risk. A convenient method to assess the possible occurrence of a debris flow is the comparison of measured and forecasted rainfall with rainfall threshold curves (RTC). Empirical derivation of the RTC from the analysis of rainfall characteristics of past events is not possible when the database of observed debris flows is poor or when the environment changes with time. For landslide-triggered debris flows, the above limitations may be overcome through the methodology presented here, based on the derivation of the RTC from a physically based model. The critical RTC are derived from mathematical and numerical simulations based on the infinite-slope stability model, in which land instability is governed by the increase in groundwater pressure due to rainfall. The effect of rainfall infiltration on landslide occurrence is modelled through a reduced form of the Richards equation. The simulations are performed in a virtual basin, representative of the studied basin, taking into account the uncertainties linked with the definition of the characteristics of the soil. A large number of calculations are performed combining different values of the rainfall characteristics (intensity and duration of event rainfall and intensity of antecedent rainfall). For each combination of rainfall characteristics, the percentage of the basin that is unstable is computed. The resulting database is then processed to derive the RTC. The methodology is implemented and tested on a small basin of the Amalfi Coast (South Italy).
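
    The infinite-slope stability model mentioned above reduces to a factor-of-safety expression in which rising pore pressure lowers stability; a generic textbook sketch is given below (slope, soil and saturation values are placeholder assumptions, not parameters of the Amalfi Coast application).

    ```python
    import math

    def factor_of_safety(slope_deg, depth_m, cohesion_kpa, phi_deg,
                         gamma_soil=19.0, gamma_w=9.81, saturation_ratio=0.0):
        """FS = [c' + (sigma - u) tan(phi')] / (gamma z sin(beta) cos(beta))."""
        beta, phi = math.radians(slope_deg), math.radians(phi_deg)
        sigma = gamma_soil * depth_m * math.cos(beta) ** 2               # normal stress, kPa
        u = gamma_w * saturation_ratio * depth_m * math.cos(beta) ** 2   # pore pressure, kPa
        tau = gamma_soil * depth_m * math.sin(beta) * math.cos(beta)     # driving stress, kPa
        return (cohesion_kpa + (sigma - u) * math.tan(phi)) / tau

    for m in (0.0, 0.5, 1.0):   # fraction of the soil column that is saturated
        fs = factor_of_safety(35, 1.5, 5.0, 32, saturation_ratio=m)
        print(f"saturation {m:.1f}: FS = {fs:.2f}")
    ```

    With these generic values the slope is stable when dry (FS > 1) and fails when fully saturated (FS < 1), which is the mechanism the rainfall thresholds are meant to capture.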

  11. Using thresholds based on risk of cardiovascular disease to target treatment for hypertension: modelling events averted and number treated

    PubMed Central

    Baker, Simon; Priest, Patricia; Jackson, Rod

    2000-01-01

    Objective To estimate the impact of using thresholds based on absolute risk of cardiovascular disease to target drug treatment to lower blood pressure in the community. Design Modelling of three thresholds of treatment for hypertension based on the absolute risk of cardiovascular disease. 5 year risk of disease was estimated for each participant using an equation to predict risk. Net predicted impact of the thresholds on the number of people treated and the number of disease events averted over 5 years was calculated assuming a relative treatment benefit of one quarter. Setting Auckland, New Zealand. Participants 2158 men and women aged 35-79 years randomly sampled from the general electoral rolls. Main outcome measures Predicted 5 year risk of cardiovascular disease event, estimated number of people for whom treatment would be recommended, and disease events averted over 5 years at different treatment thresholds. Results 46 374 (12%) Auckland residents aged 35-79 receive drug treatment to lower their blood pressure, averting an estimated 1689 disease events over 5 years. Restricting treatment to individuals with blood pressure ⩾170/100 mm Hg and those with blood pressure between 150/90-169/99 mm Hg who have a predicted 5 year risk of disease ⩾10% would increase the net number for whom treatment would be recommended by 19 401. This 42% relative increase is predicted to avert 1139/1689 (68%) additional disease events overall over 5 years compared with current treatment. If the threshold for 5 year risk of disease is set at 15% the number recommended for treatment increases by <10% but about 620/1689 (37%) additional events can be averted. A 20% threshold decreases the net number of patients recommended for treatment by about 10% but averts 204/1689 (12%) more disease events than current treatment. Conclusions Implementing treatment guidelines that use treatment thresholds based on absolute risk could significantly improve the efficiency of drug treatment to
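
    A toy version of the threshold comparison described above (the risk distribution and cohort size are invented, not the Auckland data): apply a relative treatment benefit of one quarter to each person's predicted 5-year risk and tally the number treated and the expected events averted for several absolute-risk thresholds.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    risk_5yr = rng.beta(1.2, 12, size=10_000)   # hypothetical predicted 5-year CVD risks
    relative_benefit = 0.25                     # treatment assumed to avert one quarter of events

    for threshold in (0.05, 0.10, 0.15, 0.20):
        treated = risk_5yr >= threshold
        averted = relative_benefit * risk_5yr[treated].sum()
        print(f"treat if 5-year risk >= {threshold:.0%}: "
              f"{int(treated.sum()):5d} treated, {averted:6.1f} events averted")
    ```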

  12. Thresholding functional connectomes by means of mixture modeling.

    PubMed

    Bielczyk, Natalia Z; Walocha, Fabian; Ebel, Patrick W; Haak, Koen V; Llera, Alberto; Buitelaar, Jan K; Glennon, Jeffrey C; Beckmann, Christian F

    2018-05-01

    Functional connectivity has been shown to be a very promising tool for studying the large-scale functional architecture of the human brain. In network research in fMRI, functional connectivity is considered as a set of pair-wise interactions between the nodes of the network. These interactions are typically operationalized through the full or partial correlation between all pairs of regional time series. Estimating the structure of the latent underlying functional connectome from the set of pair-wise partial correlations remains an open research problem, though. Typically, this thresholding problem is approached by proportional thresholding, or by means of parametric or non-parametric permutation testing across a cohort of subjects at each possible connection. As an alternative, we propose a data-driven thresholding approach for network matrices on the basis of mixture modeling. This approach allows for creating subject-specific sparse connectomes by modeling the full set of partial correlations as a mixture of low correlation values associated with weak or unreliable edges in the connectome and a sparse set of reliable connections. Consequently, we propose an alternative thresholding strategy based on the model fit, using pseudo-false discovery rates derived from the empirical null estimated as part of the mixture distribution. We evaluate the method on synthetic benchmark fMRI datasets where the underlying network structure is known, and demonstrate that it gives improved performance with respect to the alternative methods for thresholding connectomes, given the canonical thresholding levels. We also demonstrate that mixture modeling gives highly reproducible results when applied to the functional connectomes of the visual system derived from the n-back Working Memory task in the Human Connectome Project. The sparse connectomes obtained from mixture modeling are further discussed in the light of the previous knowledge of the functional architecture
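
    A simplified sketch of the mixture idea described above (synthetic correlations, a plain two-component Gaussian mixture, and a fixed posterior cut-off standing in for the pseudo-FDR rule; this is not the authors' implementation):

    ```python
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(3)
    null_edges = rng.normal(0.0, 0.05, size=900)     # weak / unreliable edges
    real_edges = rng.normal(0.35, 0.08, size=100)    # sparse reliable connections
    partial_corr = np.concatenate([null_edges, real_edges]).reshape(-1, 1)

    gmm = GaussianMixture(n_components=2, random_state=0).fit(partial_corr)
    signal = int(np.argmax(gmm.means_))              # component with the larger mean
    post = gmm.predict_proba(partial_corr)[:, signal]
    keep = post > 0.9                                # crude stand-in for a pseudo-FDR threshold
    print(f"{keep.sum()} of {len(partial_corr)} edges retained")
    ```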

  13. Simulation of Healing Threshold in Strain-Induced Inflammation Through a Discrete Informatics Model.

    PubMed

    Ibrahim, Israr Bin M; Sarma O V, Sanjay; Pidaparti, Ramana M

    2018-05-01

    Respiratory diseases such as asthma and acute respiratory distress syndrome, as well as acute lung injury, involve inflammation at the cellular level. The inflammation process is very complex and is characterized by the emergence of cytokines along with other changes in cellular processes. Due to the complexity of the various constituents that make up the inflammation dynamics, it is necessary to develop models that can complement experiments to fully understand inflammatory diseases. In this study, we developed a discrete informatics model based on a cellular automata (CA) approach to investigate the influence of the elastic field (stretch/strain) on the dynamics of inflammation and to account for probabilistic adaptation based on statistical interpretation of existing experimental data. Our simulation model investigated the effects of low, medium, and high strain conditions on inflammation dynamics. Results suggest that the model is able to indicate the threshold of innate healing of tissue as a response to the strain experienced by the tissue. When strain is under the threshold, the tissue is still capable of adapting its structure to heal the damaged part. However, there exists a strain threshold where the healing capability breaks down. The results obtained demonstrate that the developed discrete informatics-based CA model is capable of modeling and giving insights into inflammation dynamics parameters under various mechanical strain/stretch environments.

  14. Novel threshold pressure sensors based on nonlinear dynamics of MEMS resonators

    NASA Astrophysics Data System (ADS)

    Hasan, Mohammad H.; Alsaleem, Fadi M.; Ouakad, Hassen M.

    2018-06-01

    Triggering an alarm in a car for low air-pressure in the tire or tripping an HVAC compressor if the refrigerant pressure is lower than a threshold value are examples for applications where measuring the amount of pressure is not as important as determining if the pressure has exceeded a threshold value for an action to occur. Unfortunately, current technology still relies on analog pressure sensors to perform this functionality by adding a complex interface (extra circuitry, controllers, and/or decision units). In this paper, we demonstrate two new smart tunable-threshold pressure switch concepts that can reduce the complexity of a threshold pressure sensor. The first concept is based on the nonlinear subharmonic resonance of a straight double cantilever microbeam with a proof mass and the other concept is based on the snap-through bi-stability of a clamped-clamped MEMS shallow arch. In both designs, the sensor operation concept is simple. Any actuation performed at a certain pressure lower than a threshold value will activate a nonlinear dynamic behavior (subharmonic resonance or snap-through bi-stability) yielding a large output that would be interpreted as a logic value of ONE, or ON. Once the pressure exceeds the threshold value, the nonlinear response ceases to exist, yielding a small output that would be interpreted as a logic value of ZERO, or OFF. A lumped, single degree of freedom model for the double cantilever beam, that is validated using experimental data, and a continuous beam model for the arch beam, are used to simulate the operation range of the proposed sensors by identifying the relationship between the excitation signal and the critical cut-off pressure.

  15. Climate-based models for pulsed resources improve predictability of consumer population dynamics: outbreaks of house mice in forest ecosystems.

    PubMed

    Holland, E Penelope; James, Alex; Ruscoe, Wendy A; Pech, Roger P; Byrom, Andrea E

    2015-01-01

    Accurate predictions of the timing and magnitude of consumer responses to episodic seeding events (masts) are important for understanding ecosystem dynamics and for managing outbreaks of invasive species generated by masts. While models relating consumer populations to resource fluctuations have been developed successfully for a range of natural and modified ecosystems, a critical gap that needs addressing is better prediction of resource pulses. A recent model used change in summer temperature from one year to the next (ΔT) for predicting masts for forest and grassland plants in New Zealand. We extend this climate-based method in the framework of a model for consumer-resource dynamics to predict invasive house mouse (Mus musculus) outbreaks in forest ecosystems. Compared with previous mast models based on absolute temperature, the ΔT method for predicting masts resulted in an improved model for mouse population dynamics. There was also a threshold effect of ΔT on the likelihood of an outbreak occurring. The improved climate-based method for predicting resource pulses and consumer responses provides a straightforward rule of thumb for determining, with one year's advance warning, whether management intervention might be required in invaded ecosystems. The approach could be applied to consumer-resource systems worldwide where climatic variables are used to model the size and duration of resource pulses, and may have particular relevance for ecosystems where global change scenarios predict increased variability in climatic events.

  16. Density thresholds for Mopeia virus invasion and persistence in its host Mastomys natalensis.

    PubMed

    Goyens, J; Reijniers, J; Borremans, B; Leirs, H

    2013-01-21

    Well-established theoretical models predict host density thresholds for invasion and persistence of parasites with density-dependent transmission. Studying such thresholds in reality, however, is not straightforward because it requires long-term data for several fluctuating populations of different sizes. We developed a spatially explicit and individual-based SEIR model of Mopeia virus in multimammate mice Mastomys natalensis. This is an interesting model system for studying abundance thresholds because the host is the most common African rodent, populations fluctuate considerably, and the virus is closely related to Lassa virus but non-pathogenic to humans, so it can be studied safely in the field. The simulations show that, while host density clearly is important, sharp thresholds are only to be expected for persistence (and not for invasion), since over short time-spans (as during invasion) stochasticity is decisive. Besides host density, the spatial extent of the host population is also important. We observe the repeated local occurrence of herd immunity, leading to a decrease in transmission of the virus, while even a limited amount of dispersal can have a strong influence in spreading and re-igniting transmission. The model is most sensitive to the duration of the infectious stage, the size of the home range and the transmission coefficient, so these are important factors to determine experimentally in the future. Copyright © 2012 Elsevier Ltd. All rights reserved.
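
    For the textbook density-dependent case, the invasion part of this threshold reduces to R0 = beta * N / gamma > 1; a minimal back-of-the-envelope sketch with invented parameter values (not the spatially explicit individual-based model of the study) is given below.

    ```python
    beta = 0.002    # assumed transmission coefficient (per infectious host per day)
    gamma = 1 / 20  # recovery rate, i.e. an infectious period of 20 days

    n_threshold = gamma / beta
    for n_hosts in (10, 25, 50, 100):
        r0 = beta * n_hosts / gamma
        verdict = "invasion possible" if r0 > 1 else "no invasion"
        print(f"N = {n_hosts:3d}: R0 = {r0:4.1f} -> {verdict} (threshold N = {n_threshold:.0f})")
    ```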

  17. Extinction-effective population index: incorporating life-history variations in population viability analysis.

    PubMed

    Fujiwara, Masami

    2007-09-01

    Viability status of populations is a commonly used measure for decision-making in the management of populations. One of the challenges faced by managers is the need to consistently allocate management effort among populations. This allocation should in part be based on comparison of extinction risks among populations. Unfortunately, common criteria that use minimum viable population size or count-based population viability analysis (PVA) often do not provide results that are comparable among populations, primarily because they lack consistency in determining population size measures and threshold levels of population size (e.g., minimum viable population size and quasi-extinction threshold). Here I introduce a new index called the "extinction-effective population index," which accounts for differential effects of demographic stochasticity among organisms with different life-history strategies and among individuals in different life stages. This index is expected to become a new way of determining minimum viable population size criteria and also complement the count-based PVA. The index accounts for the difference in life-history strategies of organisms, which are modeled using matrix population models. The extinction-effective population index, sensitivity, and elasticity are demonstrated in three species of Pacific salmonids. The interpretation of the index is also provided by comparing them with existing demographic indices. Finally, a measure of life-history-specific effect of demographic stochasticity is derived.

  18. Genetic variance of tolerance and the toxicant threshold model.

    PubMed

    Tanaka, Yoshinari; Mano, Hiroyuki; Tatsuta, Haruki

    2012-04-01

    A statistical genetics method is presented for estimating the genetic variance (heritability) of tolerance to pollutants on the basis of a standard acute toxicity test conducted on several isofemale lines of cladoceran species. To analyze the genetic variance of tolerance in the case when the response is measured as a few discrete states (quantal endpoints), the authors attempted to apply the threshold character model in quantitative genetics to the threshold model separately developed in ecotoxicology. The integrated threshold model (toxicant threshold model) assumes that the response of a particular individual occurs at a threshold toxicant concentration and that the individual tolerance characterized by the individual's threshold value is determined by genetic and environmental factors. As a case study, the heritability of tolerance to p-nonylphenol in the cladoceran species Daphnia galeata was estimated by using the maximum likelihood method and nested analysis of variance (ANOVA). Broad-sense heritability was estimated to be 0.199 ± 0.112 by the maximum likelihood method and 0.184 ± 0.089 by ANOVA; both results implied that the species examined had the potential to acquire tolerance to this substance by evolutionary change. Copyright © 2012 SETAC.

  19. Modelling interactions of toxicants and density dependence in wildlife populations

    USGS Publications Warehouse

    Schipper, Aafke M.; Hendriks, Harrie W.M.; Kauffman, Matthew J.; Hendriks, A. Jan; Huijbregts, Mark A.J.

    2013-01-01

    1. A major challenge in the conservation of threatened and endangered species is to predict population decline and design appropriate recovery measures. However, anthropogenic impacts on wildlife populations are notoriously difficult to predict due to potentially nonlinear responses and interactions with natural ecological processes like density dependence. 2. Here, we incorporated both density dependence and anthropogenic stressors in a stage-based matrix population model and parameterized it for a density-dependent population of peregrine falcons Falco peregrinus exposed to two anthropogenic toxicants [dichlorodiphenyldichloroethylene (DDE) and polybrominated diphenyl ethers (PBDEs)]. Log-logistic exposure–response relationships were used to translate toxicant concentrations in peregrine falcon eggs to effects on fecundity. Density dependence was modelled as the probability of a nonbreeding bird acquiring a breeding territory as a function of the current number of breeders. 3. The equilibrium size of the population, as represented by the number of breeders, responded nonlinearly to increasing toxicant concentrations, showing a gradual decrease followed by a relatively steep decline. Initially, toxicant-induced reductions in population size were mitigated by an alleviation of the density limitation, that is, an increasing probability of territory acquisition. Once population density was no longer limiting, the toxicant impacts were no longer buffered by an increasing proportion of nonbreeders shifting to the breeding stage, resulting in a strong decrease in the equilibrium number of breeders. 4. Median critical exposure concentrations, that is, median toxicant concentrations in eggs corresponding with an equilibrium population size of zero, were 33 and 46 μg g−1 fresh weight for DDE and PBDEs, respectively. 5. Synthesis and applications. Our modelling results showed that particular life stages of a density-limited population may be relatively insensitive to
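
    The log-logistic exposure-response link mentioned above can be written as a simple function of egg concentration; the sketch below uses placeholder EC50 and slope values, not the fitted DDE/PBDE parameters of the study.

    ```python
    def fecundity(conc_ug_g, baseline=1.2, ec50=30.0, slope=2.5):
        """Log-logistic decline: fecundity is half of baseline when conc equals ec50."""
        return baseline / (1.0 + (conc_ug_g / ec50) ** slope)

    for c in (0, 10, 30, 60):
        print(f"egg concentration {c:3d} ug/g -> fecundity {fecundity(c):.2f}")
    ```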

  20. Ambient high temperature and mortality in Jinan, China: A study of heat thresholds and vulnerable populations.

    PubMed

    Li, Jing; Xu, Xin; Yang, Jun; Liu, Zhidong; Xu, Lei; Gao, Jinghong; Liu, Xiaobo; Wu, Haixia; Wang, Jun; Yu, Jieqiong; Jiang, Baofa; Liu, Qiyong

    2017-07-01

    Understanding the health consequences of continuously rising temperatures, as is projected for China, is important for developing heat-health adaptation and intervention programs. This study aimed to examine the association between mortality and daily maximum (Tmax), mean (Tmean), and minimum (Tmin) temperatures in warmer months; to explore threshold temperatures; and to identify optimal heat indicators and vulnerable populations. Daily data on temperature and mortality were obtained for the period 2007-2013. Heat thresholds for condition-specific mortality were estimated using an observed/expected analysis. We used a generalised additive model with a quasi-Poisson distribution to examine the association between mortality and Tmax/Tmin/Tmean values higher than the threshold values, after adjustment for covariates. Tmax/Tmean/Tmin thresholds were 32/28/24°C for non-accidental deaths; 32/28/24°C for cardiovascular deaths; 35/31/26°C for respiratory deaths; and 34/31/28°C for diabetes-related deaths. For each 1°C increase in Tmax/Tmean/Tmin above the threshold, the mortality risk of non-accidental, cardiovascular, respiratory, and diabetes-related death increased by 2.8/5.3/4.8%, 4.1/7.2/6.6%, 6.6/25.3/14.7%, and 13.3/30.5/47.6%, respectively. Thresholds for mortality differed according to health condition when stratified by sex, age, and education level. For non-accidental deaths, effects were significant in individuals aged ≥65 years (relative risk=1.038, 95% confidence interval: 1.026-1.050), but not for those ≤64 years. For most outcomes, women and people ≥65 years were more vulnerable. High temperature significantly increases the risk of mortality in the population of Jinan, China, and climate change with rising temperatures may make the situation worse. Public health programs should be improved and implemented to prevent and reduce health risks during hot days, especially for the identified vulnerable groups. Copyright

  1. Selection Strategies for Social Influence in the Threshold Model

    NASA Astrophysics Data System (ADS)

    Karampourniotis, Panagiotis; Szymanski, Boleslaw; Korniss, Gyorgy

    The ubiquity of online social networks makes the study of social influence extremely significant for its applications to marketing, politics and security. Maximizing the spread of influence by strategically selecting nodes as initiators of a new opinion or trend is a challenging problem. We study the performance of various strategies for selection of large fractions of initiators on a classical social influence model, the Threshold model (TM). Under the TM, a node adopts a new opinion only when the fraction of its first neighbors possessing that opinion exceeds a pre-assigned threshold. The strategies we study are of two kinds: strategies based solely on the initial network structure (Degree-rank, Dominating Sets, PageRank etc.) and strategies that take into account the change of the states of the nodes during the evolution of the cascade, e.g. the greedy algorithm. We find that the performance of these strategies depends largely on both the network structure properties, e.g. the assortativity, and the distribution of the thresholds assigned to the nodes. We conclude that the optimal strategy needs to combine the network specifics and the model specific parameters to identify the most influential spreaders. Supported in part by ARL NS-CTA, ARO, and ONR.
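
    A minimal sketch of one of the structure-based strategies mentioned above, degree-rank seeding of a Watts-type threshold cascade (the random graph, uniform thresholds and seed fraction are illustrative assumptions, not the settings of the study):

    ```python
    import networkx as nx

    def threshold_cascade(g, seeds, thresholds):
        adopted, changed = set(seeds), True
        while changed:
            changed = False
            for node in g:
                if node in adopted:
                    continue
                neigh = list(g.neighbors(node))
                if neigh and sum(v in adopted for v in neigh) / len(neigh) >= thresholds[node]:
                    adopted.add(node)
                    changed = True
        return adopted

    g = nx.erdos_renyi_graph(500, 0.02, seed=4)
    thresholds = {v: 0.25 for v in g}                     # uniform thresholds for simplicity
    by_degree = sorted(g, key=g.degree, reverse=True)     # degree-rank strategy
    seeds = by_degree[: int(0.05 * g.number_of_nodes())]  # top 5% of nodes as initiators
    print("cascade size:", len(threshold_cascade(g, seeds, thresholds)))
    ```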

  2. Public sector low threshold office-based buprenorphine treatment: outcomes at year 7.

    PubMed

    Bhatraju, Elenore Patterson; Grossman, Ellie; Tofighi, Babak; McNeely, Jennifer; DiRocco, Danae; Flannery, Mara; Garment, Ann; Goldfeld, Keith; Gourevitch, Marc N; Lee, Joshua D

    2017-02-28

    Buprenorphine maintenance for opioid dependence remains of limited availability among underserved populations, despite increases in US opioid misuse and overdose deaths. Low threshold primary care treatment models including the use of unobserved, "home," buprenorphine induction may simplify initiation of care and improve access. Unobserved induction and long-term treatment outcomes have not been reported recently among large, naturalistic cohorts treated in low threshold safety net primary care settings. This prospective clinical registry cohort design estimated rates of induction-related adverse events, treatment retention, and urine opioid results for opioid dependent adults offered buprenorphine maintenance in a New York City public hospital primary care office-based practice from 2006 to 2013. This clinic relied on typical ambulatory care individual provider-patient visits, prescribed unobserved induction exclusively, saw patients no more than weekly, and did not require additional psychosocial treatment. Unobserved induction consisted of an in-person screening and diagnostic visit followed by a 1-week buprenorphine written prescription, with pamphlet, and telephone support. Primary outcomes analyzed were rates of induction-related adverse events (AE), week 1 drop-out, and long-term treatment retention. Factors associated with treatment retention were examined using a Cox proportional hazard model among inductions and all patients. Secondary outcomes included overall clinic retention, buprenorphine dosages, and urine sample results. Of the 485 total patients in our registry, 306 were inducted, and 179 were transfers already on buprenorphine. Post-induction (n = 306), week 1 drop-out was 17%. Rates of any induction-related AE were 12%; serious adverse events, 0%; precipitated withdrawal, 3%; prolonged withdrawal, 4%. Treatment retention was a median 38 weeks (range 0-320) for inductions, compared to 110 (0-354) weeks for transfers and 57 for the entire clinic

  3. Hybrid Artificial Root Foraging Optimizer Based Multilevel Threshold for Image Segmentation

    PubMed Central

    Liu, Yang; Liu, Junfei

    2016-01-01

    This paper proposes a new plant-inspired optimization algorithm for multilevel threshold image segmentation, namely, the hybrid artificial root foraging optimizer (HARFO), which essentially mimics iterative root foraging behaviors. In this algorithm the new growth operators of branching, regrowing, and shrinkage are initially designed to optimize continuous space search by combining root-to-root communication and a coevolution mechanism. With the auxin-regulated scheme, the various root growth operators are guided systematically. With root-to-root communication, individuals exchange information over different efficient topologies, which essentially improves the exploration ability. With the coevolution mechanism, a hierarchical spatial population driven by the evolutionary pressure of multiple subpopulations is structured, which ensures that the diversity of the root population is well maintained. Comparative results on a suite of benchmarks show the superiority of the proposed algorithm. Finally, the proposed HARFO algorithm is applied to handle the complex image segmentation problem based on multilevel thresholding. Computational results of this approach on a set of test images show the superior performance of the proposed algorithm in terms of both optimization accuracy and computation efficiency. PMID:27725826

  4. Hybrid Artificial Root Foraging Optimizer Based Multilevel Threshold for Image Segmentation.

    PubMed

    Liu, Yang; Liu, Junfei; Tian, Liwei; Ma, Lianbo

    2016-01-01

    This paper proposes a new plant-inspired optimization algorithm for multilevel threshold image segmentation, namely, the hybrid artificial root foraging optimizer (HARFO), which essentially mimics iterative root foraging behaviors. In this algorithm the new growth operators of branching, regrowing, and shrinkage are initially designed to optimize continuous space search by combining root-to-root communication and a coevolution mechanism. With the auxin-regulated scheme, the various root growth operators are guided systematically. With root-to-root communication, individuals exchange information over different efficient topologies, which essentially improves the exploration ability. With the coevolution mechanism, a hierarchical spatial population driven by the evolutionary pressure of multiple subpopulations is structured, which ensures that the diversity of the root population is well maintained. Comparative results on a suite of benchmarks show the superiority of the proposed algorithm. Finally, the proposed HARFO algorithm is applied to handle the complex image segmentation problem based on multilevel thresholding. Computational results of this approach on a set of test images show the superior performance of the proposed algorithm in terms of both optimization accuracy and computation efficiency.

  5. Sri Lankan FRAX model and country-specific intervention thresholds.

    PubMed

    Lekamwasam, Sarath

    2013-01-01

    There is wide variation in the fracture probabilities estimated by Asian FRAX models, although the outputs of the South Asian models are concordant. Clinicians can choose either fixed or age-specific intervention thresholds when making treatment decisions in postmenopausal women; the cost-effectiveness of such an approach, however, needs to be addressed. This study examined suitable fracture probability intervention thresholds (ITs) for Sri Lanka, based on the Sri Lankan FRAX model. Fracture probabilities were estimated and compared using all Asian FRAX models for a postmenopausal woman with a BMI of 25 kg/m² and no clinical risk factors apart from a fragility fracture. Age-specific ITs were estimated based on the Sri Lankan FRAX model using the method followed by the National Osteoporosis Guideline Group in the UK. Using the age-specific ITs as the reference standard, suitable fixed ITs were also estimated. Fracture probabilities estimated by the different Asian FRAX models varied widely: the Japanese and Taiwanese models showed higher fracture probabilities, while the Chinese, Philippine, and Indonesian models gave lower fracture probabilities; outputs of the remaining FRAX models were generally similar. Age-specific ITs of major osteoporotic fracture probabilities (MOFP) based on the Sri Lankan FRAX model varied from 2.6 to 18% between 50 and 90 years. ITs of hip fracture probabilities (HFP) varied from 0.4 to 6.5% between 50 and 90 years. In finding fixed ITs, a MOFP of 11% and an HFP of 3.5% gave the lowest misclassification and highest agreement. The Sri Lankan FRAX model behaves similarly to other Asian FRAX models such as the Indian, Singapore-Indian, Thai, and South Korean models. Clinicians may use either the fixed or the age-specific ITs in making therapeutic decisions in postmenopausal women. The economic aspects of such decisions, however, need to be considered.

  6. The Threshold Bias Model: A Mathematical Model for the Nomothetic Approach of Suicide

    PubMed Central

    Folly, Walter Sydney Dutra

    2011-01-01

    Background Comparative and predictive analyses of suicide data from different countries are difficult to perform because of varying approaches and the lack of comparative parameters. Methodology/Principal Findings A simple model (the Threshold Bias Model) was tested for comparative and predictive analyses of suicide rates by age. The model comprises a six-parameter distribution that was applied to US suicide rates by age for the years 2001 and 2002. Linear extrapolations of the parameter values obtained for these years were then performed to estimate the values corresponding to the year 2003. The calculated distributions agreed reasonably well with the aggregate data. The model was also used to determine the age above which suicide rates become statistically observable in the USA, Brazil, and Sri Lanka. Conclusions/Significance The Threshold Bias Model has considerable potential applications in demographic studies of suicide. Moreover, since the model can be used to predict the evolution of suicide rates from information extracted from past data, it will be of great interest to suicidologists and other researchers in the field of mental health. PMID:21909431

  7. The threshold bias model: a mathematical model for the nomothetic approach of suicide.

    PubMed

    Folly, Walter Sydney Dutra

    2011-01-01

    Comparative and predictive analyses of suicide data from different countries are difficult to perform because of varying approaches and the lack of comparative parameters. A simple model (the Threshold Bias Model) was tested for comparative and predictive analyses of suicide rates by age. The model comprises a six-parameter distribution that was applied to US suicide rates by age for the years 2001 and 2002. Linear extrapolations of the parameter values obtained for these years were then performed to estimate the values corresponding to the year 2003. The calculated distributions agreed reasonably well with the aggregate data. The model was also used to determine the age above which suicide rates become statistically observable in the USA, Brazil, and Sri Lanka. The Threshold Bias Model has considerable potential applications in demographic studies of suicide. Moreover, since the model can be used to predict the evolution of suicide rates from information extracted from past data, it will be of great interest to suicidologists and other researchers in the field of mental health.

  8. The mutation-drift balance in spatially structured populations.

    PubMed

    Schneider, David M; Martins, Ayana B; de Aguiar, Marcus A M

    2016-08-07

    In finite populations the action of neutral mutations is balanced by genetic drift, leading to a stationary distribution of alleles that displays a transition between two different behaviors. For small mutation rates most individuals will carry the same allele at equilibrium, whereas for high mutation rates the alleles will be randomly distributed, with frequencies close to one half for a biallelic gene. For well-mixed haploid populations the mutation threshold is μ_c = 1/(2N), where N is the population size. In this paper we study how spatial structure affects this mutation threshold. Specifically, we study the stationary allele distribution for populations placed on regular networks where connected nodes represent potential mating partners. We show that the mutation threshold is sensitive to spatial structure only if the number of potential mates is very small. In this limit, the mutation threshold decreases substantially, increasing the diversity of the population at comparatively low mutation rates. Defining k_c as the degree of the network at which the mutation threshold drops to half of its value in well-mixed populations, we show that k_c grows slowly as a function of the population size, following a power law. Our calculations and simulations are based on the Moran model and on a mapping between the Moran model with mutations and the voter model with opinion makers. Copyright © 2016 Elsevier Ltd. All rights reserved.
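
    As a rough illustration of the well-mixed baseline described in this abstract, the sketch below simulates a neutral biallelic Moran model with symmetric mutation; its stationary allele-frequency distribution changes character as the mutation rate crosses roughly μ_c = 1/(2N). This is an illustrative reconstruction (parameter values and function names are ours), not the authors' code, and it omits the network structure that is the paper's focus.

```python
import numpy as np

def moran_with_mutation(N=100, mu=0.001, steps=200_000, rng=None):
    """Simulate a neutral biallelic Moran model with symmetric mutation.

    Returns the trajectory of the allele-A count.  Well-mixed case only;
    the network-structured version would restrict mating partners to
    graph neighbours.
    """
    rng = rng or np.random.default_rng(0)
    k = N // 2                          # current number of A alleles
    traj = np.empty(steps, dtype=int)
    for t in range(steps):
        # one individual reproduces and one dies, both chosen uniformly
        offspring_is_A = rng.random() < k / N
        if rng.random() < mu:           # offspring may mutate
            offspring_is_A = not offspring_is_A
        dying_is_A = rng.random() < k / N
        k += int(offspring_is_A) - int(dying_is_A)
        traj[t] = k
    return traj

if __name__ == "__main__":
    N = 100
    for mu in (0.1 / (2 * N), 10 / (2 * N)):   # below and above mu_c = 1/(2N)
        freqs = moran_with_mutation(N=N, mu=mu)[50_000:] / N
        print(f"mu={mu:.4f}  mean freq={freqs.mean():.2f}  sd={freqs.std():.2f}")
```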

  9. Extension of landscape-based population viability models to ecoregional scales for conservation planning

    Treesearch

    Thomas W. Bonnot; Frank R. III Thompson; Joshua Millspaugh

    2011-01-01

    Landscape-based population models are potentially valuable tools in facilitating conservation planning and actions at large scales. However, such models have rarely been applied at ecoregional scales. We extended landscape-based population models to ecoregional scales for three species of concern in the Central Hardwoods Bird Conservation Region and compared model...

  10. Recognition ROCS Are Curvilinear--Or Are They? On Premature Arguments against the Two-High-Threshold Model of Recognition

    ERIC Educational Resources Information Center

    Broder, Arndt; Schutz, Julia

    2009-01-01

    Recent reviews of recognition receiver operating characteristics (ROCs) claim that their curvilinear shape rules out threshold models of recognition. However, the shape of ROCs based on confidence ratings is not diagnostic to refute threshold models, whereas ROCs based on experimental bias manipulations are. Also, fitting predicted frequencies to…

  11. Modeling habitat split: landscape and life history traits determine amphibian extinction thresholds.

    PubMed

    Fonseca, Carlos Roberto; Coutinho, Renato M; Azevedo, Franciane; Berbert, Juliana M; Corso, Gilberto; Kraenkel, Roberto A

    2013-01-01

    Habitat split is a major force behind the worldwide decline of amphibian populations, causing changes in community richness and species composition. In fragmented landscapes, natural remnants, the terrestrial habitat of the adults, are frequently separated from streams, the aquatic habitat of the larvae. An important question is how this landscape configuration affects population levels and whether it can drive species to local extinction. Here, we put forward the first theoretical model of habitat split, focused on how split distance - the distance between the two required habitats - affects population size and persistence in isolated fragments. Our diffusive model shows that habitat split alone is able to generate extinction thresholds. Fragments occurring between the aquatic habitat and a given critical split distance are expected to hold viable populations, while fragments located farther away are expected to be unoccupied. Species with higher reproductive success and a higher diffusion rate of post-metamorphic young are expected to have larger critical split distances. Furthermore, the model indicates that the negative effects of habitat split are poorly compensated by the positive effects of fragment size. The habitat split model improves our understanding of spatially structured populations and has relevant implications for landscape design for conservation. It puts the relation between habitat split and the decline of amphibian populations on a firm theoretical basis.

  12. Otoacoustic emissions in the general adult population of Nord-Trøndelag, Norway: III. Relationships with pure-tone hearing thresholds.

    PubMed

    Engdahl, Bo; Tambs, Kristian; Borchgrevink, Hans M; Hoffman, Howard J

    2005-01-01

    This study aims to describe the association between otoacoustic emissions (OAEs) and pure-tone hearing thresholds (PTTs) in an unscreened adult population (N = 6415), to determine the efficiency with which transient-evoked (TEOAE) and distortion-product (DPOAE) otoacoustic emissions can identify ears with elevated PTTs, and to investigate whether a combination of DPOAE and TEOAE responses improves this performance. Associations were examined by linear regression analysis and ANOVA. Test performance was assessed by receiver operating characteristic (ROC) curves. The relation between OAEs and PTTs appeared curvilinear with a moderate degree of non-linearity. Combining DPOAEs and TEOAEs improved performance. Test performance depended on the cut-off thresholds defining elevated PTTs, with optimal values between 25 and 45 dB HL depending on frequency and type of OAE measure. The unique constitution of the present large sample, which reflects the general adult population, makes these results applicable to population-based studies and screening programs.

  13. Determination of prospective displacement-based gate threshold for respiratory-gated radiation delivery from retrospective phase-based gate threshold selected at 4D CT simulation.

    PubMed

    Vedam, S; Archambault, L; Starkschall, G; Mohan, R; Beddar, S

    2007-11-01

    Four-dimensional (4D) computed tomography (CT) imaging has found increasing importance in the localization of tumor and surrounding normal structures throughout the respiratory cycle. Based on such tumor motion information, it is possible to identify the appropriate phase interval for respiratory gated treatment planning and delivery. Such a gating phase interval is determined retrospectively based on tumor motion from internal tumor displacement. However, respiratory-gated treatment is delivered prospectively based on motion determined predominantly from an external monitor. Therefore, the simulation gate threshold determined from the retrospective phase interval selected for gating at 4D CT simulation may not correspond to the delivery gate threshold that is determined from the prospective external monitor displacement at treatment delivery. The purpose of the present work is to establish a relationship between the thresholds for respiratory gating determined at CT simulation and treatment delivery, respectively. One hundred fifty external respiratory motion traces, from 90 patients, with and without audio-visual biofeedback, are analyzed. Two respiratory phase intervals, 40%-60% and 30%-70%, are chosen for respiratory gating from the 4D CT-derived tumor motion trajectory. From residual tumor displacements within each such gating phase interval, a simulation gate threshold is defined based on (a) the average and (b) the maximum respiratory displacement within the phase interval. The duty cycle for prospective gated delivery is estimated from the proportion of external monitor displacement data points within both the selected phase interval and the simulation gate threshold. The delivery gate threshold is then determined iteratively to match the above determined duty cycle. The magnitude of the difference between such gate thresholds determined at simulation and treatment delivery is quantified in each case. Phantom motion tests yielded coincidence of simulation
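
    The duty-cycle matching step described above can be sketched as follows; this is an illustrative reconstruction (the toy trace, phase definition, and function name are ours, and the matching is done directly via a quantile rather than iteratively), not the authors' implementation.

```python
import numpy as np

def delivery_threshold_matching_duty_cycle(displacement, phase,
                                           phase_lo=0.4, phase_hi=0.6,
                                           sim_gate=None):
    """Find a displacement-based delivery gate threshold whose duty cycle
    matches the duty cycle implied by a phase-based simulation gate.

    displacement : external-monitor displacement samples (a.u.)
    phase        : respiratory phase of each sample, in [0, 1)
    sim_gate     : simulation gate threshold (e.g. the mean residual
                   displacement inside the phase interval); if None, the
                   mean displacement inside the interval is used.
    """
    in_phase = (phase >= phase_lo) & (phase <= phase_hi)
    if sim_gate is None:
        sim_gate = displacement[in_phase].mean()
    # duty cycle from points inside BOTH the phase interval and the gate
    target_duty = np.mean(in_phase & (displacement <= sim_gate))
    # delivery gating uses displacement alone: the matching threshold is
    # the corresponding quantile of the displacement trace
    delivery_gate = np.quantile(displacement, target_duty)
    return sim_gate, delivery_gate, target_duty

if __name__ == "__main__":
    t = np.linspace(0, 60, 6000)                   # 60 s toy trace
    disp = 5 + 5 * np.cos(2 * np.pi * t / 4)       # ~4 s breathing period
    ph = (t / 4) % 1.0                             # toy linear phase
    print(delivery_threshold_matching_duty_cycle(disp, ph))
```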

  14. Individual-based modelling of population growth and diffusion in discrete time.

    PubMed

    Tkachenko, Natalie; Weissmann, John D; Petersen, Wesley P; Lake, George; Zollikofer, Christoph P E; Callegari, Simone

    2017-01-01

    Individual-based models (IBMs) of human populations capture spatio-temporal dynamics using rules that govern the birth, behavior, and death of individuals. We explore a stochastic IBM of logistic growth-diffusion with constant time steps and independent, simultaneous actions of birth, death, and movement that approaches the Fisher-Kolmogorov model in the continuum limit. This model is well suited to parallelization on high-performance computers. We explore its emergent properties with analytical approximations and numerical simulations in parameter ranges relevant to human population dynamics and ecology, and reproduce continuous-time results in the limit of small transition probabilities. Our model predicts that population density and dispersal speed are affected by fluctuations in the number of individuals. The discrete-time model displays novel properties owing to the binomial character of the fluctuations: in certain regimes of the growth model, a decrease in time-step size drives the system away from the continuum limit. These effects are especially important at local population sizes of <50 individuals, which largely correspond to group sizes of hunter-gatherers. As an application scenario, we model the late Pleistocene dispersal of Homo sapiens into the Americas and discuss how well model-based estimates of first-arrival dates agree with archaeological dates, depending on the IBM parameter settings.

  15. Lane change warning threshold based on driver perception characteristics.

    PubMed

    Wang, Chang; Sun, Qinyu; Fu, Rui; Li, Zhen; Zhang, Qiong

    2018-08-01

    Lane change warning (LCW) systems are designed to alleviate driver workload and improve the safety of lane changes. Based on a safety threshold, a lane change warning system issues cautions to drivers. Although such systems offer substantial benefits, they may disturb the driver's normal operation and affect driver judgment if the warning threshold does not conform to the driver's perception of safety. Therefore, it is essential to establish an appropriate warning threshold to enhance the accuracy and acceptability of the lane change warning system. This research aims to identify the threshold that conforms to the driver's perceived ability to safely change lanes while a rear vehicle is approaching quickly. We propose a theoretical lane change warning model based on a safe minimum distance and the deceleration of the rear vehicle. To capture different safety levels of lane changes, 30 licensed drivers were recruited, and the extreme moments representing driver perception characteristics were obtained from a Front Extremity Test and a Rear Extremity Test conducted on a freeway. The required deceleration of the rear vehicle corresponding to the extreme moment was calculated according to the proposed model. Based on the differences in deceleration between these extremity experiments, we define two levels of a hierarchical warning system: the primary warning reminds drivers of the existence of potentially dangerous vehicles, and the second warning instructs the driver to stop changing lanes immediately. Signal detection theory was used to analyze the data. Ultimately, we confirm that the first deceleration threshold is 1.5 m/s² and the second deceleration threshold is 2.7 m/s². The findings provide a basis for the algorithm design of LCW systems and enhance the acceptability of the intelligent system. Copyright © 2018 Elsevier Ltd. All rights reserved.
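
    A minimal constant-deceleration kinematic sketch of a "required deceleration of the rear vehicle" quantity is given below. It is only meant to make the idea concrete; the published warning model and its 1.5 and 2.7 m/s² levels come from the authors' driving experiments, not from this formula, and the gap, closing speed, and lane-change-time parameters are hypothetical.

```python
def required_rear_deceleration(gap_m, closing_speed_mps, lane_change_time_s,
                               min_safe_gap_m=2.0):
    """Constant-deceleration estimate of the rear vehicle's required
    deceleration (m/s^2) so that the gap does not fall below a safe
    minimum by the end of the lane change.  Illustrative kinematics only.
    """
    # distance the gap would shrink minus the distance it can afford to lose
    shortfall = closing_speed_mps * lane_change_time_s - (gap_m - min_safe_gap_m)
    if shortfall <= 0:
        return 0.0                         # no braking needed
    return 2.0 * shortfall / lane_change_time_s ** 2

# example: 20 m gap, rear car approaching 6 m/s faster, 5 s lane change
print(required_rear_deceleration(20.0, 6.0, 5.0))   # ~0.96 m/s^2
```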

  16. Empirical estimation of genome-wide significance thresholds based on the 1000 Genomes Project data set.

    PubMed

    Kanai, Masahiro; Tanaka, Toshihiro; Okada, Yukinori

    2016-10-01

    To assess the statistical significance of associations between variants and traits, genome-wide association studies (GWAS) should employ an appropriate threshold that accounts for the massive burden of multiple testing in the study. Although most studies in the current literature commonly set a genome-wide significance threshold at the level of P = 5.0 × 10⁻⁸, the adequacy of this value for respective populations has not been fully investigated. To empirically estimate thresholds for different ancestral populations, we conducted GWAS simulations using the 1000 Genomes Phase 3 data set for Africans (AFR), Europeans (EUR), Admixed Americans (AMR), East Asians (EAS) and South Asians (SAS). The estimated empirical genome-wide significance thresholds were P_sig = 3.24 × 10⁻⁸ (AFR), 9.26 × 10⁻⁸ (EUR), 1.83 × 10⁻⁷ (AMR), 1.61 × 10⁻⁷ (EAS) and 9.46 × 10⁻⁸ (SAS). We additionally conducted trans-ethnic meta-analyses across all populations (ALL) and all populations except for AFR (ΔAFR), which yielded P_sig = 3.25 × 10⁻⁸ (ALL) and 4.20 × 10⁻⁸ (ΔAFR). Our results indicate that the current threshold (P = 5.0 × 10⁻⁸) is overly stringent for all ancestral populations except for Africans; however, we should employ a more stringent threshold when conducting a meta-analysis, regardless of the presence of African samples.
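
    The empirical-threshold idea can be illustrated with a small permutation/null-simulation sketch: simulate null phenotypes against a fixed genotype matrix, record the minimum p-value per replicate, and take the 5th percentile of those minima as the genome-wide threshold. Everything below (matrix sizes, replicate counts) is illustrative and far smaller than a real 1000 Genomes-based simulation.

```python
import numpy as np
from scipy import stats

def empirical_genomewide_threshold(genotypes, n_replicates=1000, alpha=0.05,
                                   rng=None):
    """Estimate an empirical genome-wide significance threshold from the
    distribution of per-replicate minimum p-values under the null.

    genotypes : (n_samples, n_variants) dosage matrix.  With real data,
    correlation (LD) between variants is what makes this threshold less
    stringent than a Bonferroni correction.
    """
    rng = rng or np.random.default_rng(0)
    n, m = genotypes.shape
    g = (genotypes - genotypes.mean(0)) / genotypes.std(0)
    min_p = np.empty(n_replicates)
    for r in range(n_replicates):
        y = rng.standard_normal(n)                 # null phenotype
        y = (y - y.mean()) / y.std()
        corr = g.T @ y / n                         # per-variant correlation
        t = corr * np.sqrt((n - 2) / (1 - corr ** 2))
        p = 2 * stats.t.sf(np.abs(t), df=n - 2)
        min_p[r] = p.min()
    return np.quantile(min_p, alpha)               # family-wise threshold

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    geno = rng.binomial(2, 0.3, size=(500, 2000)).astype(float)
    print(empirical_genomewide_threshold(geno, n_replicates=200))
```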

  17. Comparison of Intrinsic Rate of Different House Fly Densities in a Simulated Condition: A Prediction for House Fly Population and Control Threshold.

    PubMed

    Ong, Song-Quan; Ahmad, Hamdan; Jaal, Zairi; Rus, Adanan; Fadzlah, Fadhlina Hazwani Mohd

    2017-01-01

    Determining the control threshold for a pest is common prior to initiating a pest control program; however, previous studies of house fly control thresholds for poultry farms are insufficient for determining such a threshold. This study aimed to predict changes in house fly populations by comparing the intrinsic rate of increase (r_m) of different house fly densities in a simulated system. The study first defined the knee points of a known population growth curve as a control threshold by comparing the r_m of five densities of house flies under simulated conditions. Then, to understand the interactions between the larval and adult populations, the correlation between larval and adult capacity rates (r_c) was studied. The r_m values at densities of 300 and 500 flies were significantly higher than the r_m values at densities of 50 and 100 flies, indicating these densities as candidates for a control threshold. The r_c of larval and adult populations were negatively correlated at densities below 300 flies; this implies that adult populations below 300 flies were declining while the larval population was growing, so control approaches should focus on the immature stages. The results of the present study suggest a control threshold for house fly populations. Future work should focus on calibrating the threshold indices under field conditions. © The Authors 2016. Published by Oxford University Press on behalf of the Entomological Society of America. All rights reserved. For permissions, please email: journals.permissions@oup.com.
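
    For reference, the intrinsic rate of increase r_m is conventionally obtained from a life table by solving the Euler-Lotka equation, as in the generic sketch below; the life-table values shown are made up for illustration and are not the study's data.

```python
import numpy as np
from scipy.optimize import brentq

def intrinsic_rate_of_increase(age_days, survivorship, fecundity):
    """Solve the Euler-Lotka equation  sum_x l_x * m_x * exp(-r * x) = 1
    for the intrinsic rate of increase r_m, given a life table
    (l_x = survivorship to age x, m_x = female offspring at age x).
    """
    f = lambda r: np.sum(survivorship * fecundity * np.exp(-r * age_days)) - 1.0
    return brentq(f, -1.0, 1.0)        # r_m lies between these brackets here

if __name__ == "__main__":
    x = np.array([12, 14, 16, 18, 20, 22], dtype=float)   # age (days)
    lx = np.array([0.8, 0.75, 0.7, 0.6, 0.5, 0.35])       # survivorship
    mx = np.array([5, 12, 15, 12, 8, 4], dtype=float)     # daughters/female/day
    print("r_m =", round(intrinsic_rate_of_increase(x, lx, mx), 3))
```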

  18. The Alcohol Use Disorders Identification Test (AUDIT): Exploring the Factor Structure and Cutoff Thresholds in a Representative Post-Conflict Population in Northern Uganda.

    PubMed

    Blair, Alden Hooper; Pearce, Margo Ellen; Katamba, Achilles; Malamba, Samuel S; Muyinda, Herbert; Schechter, Martin T; Spittal, Patricia M

    2017-05-01

    Despite increased use of the Alcohol Use Disorders Identification Test (AUDIT) in sub-Saharan Africa, few studies have assessed its underlying conceptual framework, and none have done so in post-conflict settings. Further, significant inconsistencies exist between the definitions used for problematic consumption. Such is the case in Uganda, which faces one of the highest per-capita alcohol consumption levels in the region; this consumption is thought to be hindering rebuilding in the North after two decades of civil war. This study explores the impact of varying designation cutoff thresholds in the AUDIT as well as its conceptual factor structure in a representative sample of the population. In all, 1720 Cango Lyec Project participants completed socio-economic and mental health questionnaires, provided blood samples and took the AUDIT. Participant characteristics and consumption designations were compared at AUDIT summary score thresholds of ≥3, ≥5 and ≥8. Confirmatory factor analyses (CFA) explored one-, two-, and three-factor models overall and by sex, using relative and absolute fit indicators. There were no significant differences in participant demographic characteristics between thresholds. At higher cutoffs, the test increased in specificity to identify those with hazardous drinking, disordered drinking, and alcohol-related harms. All conceptual models indicated good fit, with three-factor models superior overall and within both sexes. In Northern Uganda, a three-factor AUDIT model best explores alcohol use in the population and is appropriate for use in both sexes. Lower cutoff thresholds are recommended to identify those with potentially disordered drinking to best plan effective interventions and treatments. A CFA of the AUDIT showed good fit for one-, two-, and three-factor models overall and by sex in a representative sample in post-conflict Northern Uganda. A three-plus total AUDIT cutoff score is suggested to screen for hazardous drinking in this or

  19. Differential equation models for sharp threshold dynamics.

    PubMed

    Schramm, Harrison C; Dimitrov, Nedialko B

    2014-01-01

    We develop an extension to differential equation models of dynamical systems to allow us to analyze probabilistic threshold dynamics that fundamentally and globally change system behavior. We apply our novel modeling approach to two cases of interest: a model of infectious disease modified for malware where a detection event drastically changes dynamics by introducing a new class in competition with the original infection; and the Lanchester model of armed conflict, where the loss of a key capability drastically changes the effectiveness of one of the sides. We derive and demonstrate a step-by-step, repeatable method for applying our novel modeling approach to an arbitrary system, and we compare the resulting differential equations to simulations of the system's random progression. Our work leads to a simple and easily implemented method for analyzing probabilistic threshold dynamics using differential equations. Published by Elsevier Inc.
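
    A minimal sketch of the kind of system described here is given below: an SIS-style malware model in which a detection event, triggered when cumulative infections cross a (random) threshold, introduces a competing "cleaner" class and changes the dynamics. The model equations, rates, and the averaging over a random threshold are illustrative assumptions, not the authors' formulation.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative malware model: susceptible S, infected I, and, after a
# detection event, a "cleaner" class C competing with the infection.
beta, gamma, delta = 0.4, 0.05, 0.6

def pre_detection(t, y):
    S, I, C, cumI = y
    new_inf = beta * S * I
    return [-new_inf, new_inf - gamma * I, 0.0, new_inf]

def post_detection(t, y):
    S, I, C, cumI = y
    new_inf = beta * S * I
    return [-new_inf - delta * S * C,
            new_inf - gamma * I - delta * I * C,    # cleaner removes infections
            delta * (S + I) * C,                    # cleaner propagates
            new_inf]

def run(threshold, t_end=100.0, y0=(0.99, 0.01, 0.0, 0.0)):
    # detection fires when cumulative infections cross the (random) threshold
    hit = lambda t, y: y[3] - threshold
    hit.terminal, hit.direction = True, 1
    a = solve_ivp(pre_detection, (0, t_end), y0, events=hit, max_step=0.1)
    if a.t_events[0].size == 0:
        return a.t, a.y
    y_mid = a.y[:, -1].copy()
    y_mid[2] = 0.01                                 # seed the competing class
    b = solve_ivp(post_detection, (a.t[-1], t_end), y_mid, max_step=0.1)
    return np.concatenate([a.t, b.t]), np.concatenate([a.y, b.y], axis=1)

# average over a random detection threshold, mimicking a probabilistic threshold
rng = np.random.default_rng(0)
finals = [run(th)[1][1, -1] for th in rng.uniform(0.05, 0.5, size=20)]
print("mean final infected fraction:", np.mean(finals))
```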

  20. Linear No-Threshold Model VS. Radiation Hormesis

    PubMed Central

    Doss, Mohan

    2013-01-01

    The atomic bomb survivor cancer mortality data have been used in the past to justify the use of the linear no-threshold (LNT) model for estimating the carcinogenic effects of low dose radiation. An analysis of the recently updated atomic bomb survivor cancer mortality dose-response data shows that the data no longer support the LNT model but are consistent with a radiation hormesis model when a correction is applied for a likely bias in the baseline cancer mortality rate. If the validity of the phenomenon of radiation hormesis is confirmed in prospective human pilot studies, and is applied to the wider population, it could result in a considerable reduction in cancers. The idea of using radiation hormesis to prevent cancers was proposed more than three decades ago, but was never investigated in humans to determine its validity because of the dominance of the LNT model and the consequent carcinogenic concerns regarding low dose radiation. Since cancer continues to be a major health problem and the age-adjusted cancer mortality rates have declined by only ∼10% in the past 45 years, it may be prudent to investigate radiation hormesis as an alternative approach to reduce cancers. Prompt action is urged. PMID:24298226

  1. Wavelet-based adaptive thresholding method for image segmentation

    NASA Astrophysics Data System (ADS)

    Chen, Zikuan; Tao, Yang; Chen, Xin; Griffis, Carl

    2001-05-01

    A nonuniform background distribution may cause a global thresholding method to fail to segment objects. One solution is using a local thresholding method that adapts to local surroundings. In this paper, we propose a novel local thresholding method for image segmentation, using multiscale threshold functions obtained by wavelet synthesis with weighted detail coefficients. In particular, the coarse-to-fine synthesis with attenuated detail coefficients produces a threshold function corresponding to a high-frequency-reduced signal. This wavelet-based local thresholding method adapts to both local size and local surroundings, and its implementation can take advantage of the fast wavelet algorithm. We applied this technique to physical contaminant detection for poultry meat inspection using x-ray imaging. Experiments showed that inclusion objects in deboned poultry could be extracted at multiple resolutions despite their irregular sizes and uneven backgrounds.
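
    A rough sketch of the weighted-detail-synthesis idea, using the third-party PyWavelets package (pywt): attenuate the detail coefficients, reconstruct to obtain a smooth, spatially varying threshold surface, and compare the image against it. The wavelet choice, decomposition level, and attenuation factor are illustrative, not the parameters of the paper.

```python
import numpy as np
import pywt

def wavelet_local_threshold(image, wavelet="db2", level=3, attenuation=0.3):
    """Build a spatially varying threshold surface by attenuating the
    wavelet detail coefficients and reconstructing, then segment by
    comparing the image with that surface.
    """
    coeffs = pywt.wavedec2(image.astype(float), wavelet, level=level)
    # keep the approximation, shrink all detail coefficients
    damped = [coeffs[0]] + [tuple(attenuation * d for d in details)
                            for details in coeffs[1:]]
    threshold_surface = pywt.waverec2(damped, wavelet)
    # waverec2 may pad by a pixel; crop back to the original shape
    threshold_surface = threshold_surface[:image.shape[0], :image.shape[1]]
    return image > threshold_surface, threshold_surface

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    background = np.linspace(0, 60, 256)[None, :] * np.ones((256, 1))  # uneven background
    img = background + rng.normal(0, 2, (256, 256))
    img[100:120, 100:120] += 40                                        # small bright object
    mask, _ = wavelet_local_threshold(img)
    print("object pixels flagged:", int(mask[100:120, 100:120].sum()))
```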

  2. Determination of prospective displacement-based gate threshold for respiratory-gated radiation delivery from retrospective phase-based gate threshold selected at 4D CT simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vedam, S.; Archambault, L.; Starkschall, G.

    2007-11-15

    Four-dimensional (4D) computed tomography (CT) imaging has found increasing importance in the localization of tumor and surrounding normal structures throughout the respiratory cycle. Based on such tumor motion information, it is possible to identify the appropriate phase interval for respiratory gated treatment planning and delivery. Such a gating phase interval is determined retrospectively based on tumor motion from internal tumor displacement. However, respiratory-gated treatment is delivered prospectively based on motion determined predominantly from an external monitor. Therefore, the simulation gate threshold determined from the retrospective phase interval selected for gating at 4D CT simulation may not correspond to the delivery gate threshold that is determined from the prospective external monitor displacement at treatment delivery. The purpose of the present work is to establish a relationship between the thresholds for respiratory gating determined at CT simulation and treatment delivery, respectively. One hundred fifty external respiratory motion traces, from 90 patients, with and without audio-visual biofeedback, are analyzed. Two respiratory phase intervals, 40%-60% and 30%-70%, are chosen for respiratory gating from the 4D CT-derived tumor motion trajectory. From residual tumor displacements within each such gating phase interval, a simulation gate threshold is defined based on (a) the average and (b) the maximum respiratory displacement within the phase interval. The duty cycle for prospective gated delivery is estimated from the proportion of external monitor displacement data points within both the selected phase interval and the simulation gate threshold. The delivery gate threshold is then determined iteratively to match the above determined duty cycle. The magnitude of the difference between such gate thresholds determined at simulation and treatment delivery is quantified in each case. Phantom motion tests yielded coincidence of

  3. Ignition threshold of aluminized HMX-based PBXs

    NASA Astrophysics Data System (ADS)

    Miller, Christopher; Zhou, Min

    2017-06-01

    We report the results of micromechanical simulations of the ignition of aluminized HMX-based PBX under loading due to impact by thin flyers. The conditions analyzed concern loading pulses on the order of 20 nanoseconds to 0.8 microseconds in duration and impact piston velocities on the order of 300-1000 m/s. The samples consist of a stochastically similar bimodal distribution of HMX grains, an Estane binder, and 50 μm aluminum particles. The computational model accounts for constituent elasto-viscoplasticity, viscoelasticity, bulk compressibility, fracture, interfacial debonding, internal contact, bulk and frictional heating, and heat conduction. The analysis focuses on the development of hotspots under different material settings and loading conditions. In particular, the ignition threshold in the form of the James relation and the corresponding ignition probability are calculated for PBXs containing 0%, 6%, 10%, and 18% aluminum by volume. It is found that the addition of aluminum increases the ignition threshold, making the materials less sensitive. The dissipation and heating mechanism changes responsible for this trend are delineated. Support by the DOE NNSA SSGF is gratefully acknowledged.

  4. Phonatory threshold pressure in a healthy population before and after aerosol treatment, a preliminary study.

    PubMed

    Grini-Grandval, M N; Bingenheimer, S; Maunsell, R; Ouaknine, M; Giovanni, A

    2002-01-01

    The viscosity of the surface mucus of the vocal cords is one of the important elements for good laryngeal functioning. It has been demonstrated that inhalation of hydrated air lowers the phonatory threshold pressure by decreasing the viscosity of the mucus (1), leading to a more regular vibration that can be appreciated by jitter (2). In an attempt to relate the concepts of tissue viscosity and surface mucus within the theoretical model of vibration, we measured the phonatory threshold pressure in 6 healthy female subjects before and after aerosol treatment. We were able to demonstrate that the pressure threshold was lower after aerosol treatment (3.15 hPa) than before (3.79 hPa), and this difference was statistically significant (p = 0.041). The discussion relates this decrease in mucus viscosity to the physiological concepts necessary to understand glottic vibration.

  5. Association of daily asthma emergency department visits and hospital admissions with ambient air pollutants among the pediatric Medicaid population in Detroit: time-series and time-stratified case-crossover analyses with threshold effects.

    PubMed

    Li, Shi; Batterman, Stuart; Wasilevich, Elizabeth; Wahl, Robert; Wirth, Julie; Su, Feng-Chiao; Mukherjee, Bhramar

    2011-11-01

    Asthma morbidity has been associated with ambient air pollutants in time-series and case-crossover studies. In such study designs, threshold effects of air pollutants on asthma outcomes have been relatively unexplored, although they are of potential interest for characterizing concentration-response relationships. This study analyzes daily data on the asthma morbidity experienced by the pediatric Medicaid population (ages 2-18 years) of Detroit, Michigan, and concentrations of the pollutants fine particulate matter (PM2.5), CO, NO2 and SO2 for the 2004-2006 period, using both time-series and case-crossover designs. We use a simple, testable and readily implementable profile likelihood-based approach to estimate threshold parameters in both designs. Evidence of significant increases in daily acute asthma events was found for SO2 and PM2.5, and a significant threshold effect was estimated for PM2.5 at 13 and 11 μg/m³ using generalized additive models and conditional logistic regression models, respectively. Stronger effect sizes above the threshold were typically noted compared with the standard linear relationship; for example, in the time-series analysis, an interquartile range increase (9.2 μg/m³) in PM2.5 (5-day moving average) had a risk ratio of 1.030 (95% CI: 1.001, 1.061) in the generalized additive models and 1.066 (95% CI: 1.031, 1.102) in the threshold generalized additive models. The corresponding estimates for the case-crossover design were 1.039 (95% CI: 1.013, 1.066) in the conditional logistic regression and 1.054 (95% CI: 1.023, 1.086) in the threshold conditional logistic regression. This study indicates that the associations of SO2 and PM2.5 concentrations with asthma emergency department visits and hospitalizations, as well as the estimated PM2.5 threshold, were fairly consistent across time-series and case-crossover analyses, and suggests that effect estimates based on linear models (without thresholds) may underestimate the true risk. Copyright © 2011 Elsevier Inc.
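
    The profile-likelihood threshold estimation can be sketched as below: scan a grid of candidate thresholds, fit a Poisson log-linear model with a hockey-stick (excess-over-threshold) term at each, and keep the threshold with the highest log-likelihood. The sketch (using statsmodels) omits the weather and temporal confounders adjusted for in the published analysis, and all simulated values are illustrative.

```python
import numpy as np
import statsmodels.api as sm

def profile_threshold_poisson(counts, exposure_metric, covariates=None,
                              grid=None):
    """Profile-likelihood estimate of a threshold in a Poisson (log-linear)
    model: the pollutant only has a linear effect above the threshold.
    """
    n = len(counts)
    if grid is None:
        grid = np.quantile(exposure_metric, np.linspace(0.05, 0.95, 30))
    base = np.ones((n, 1)) if covariates is None else np.column_stack(
        [np.ones(n), covariates])
    best = (None, -np.inf)
    for tau in grid:
        excess = np.clip(exposure_metric - tau, 0, None)   # threshold term
        X = np.column_stack([base, excess])
        fit = sm.GLM(counts, X, family=sm.families.Poisson()).fit()
        if fit.llf > best[1]:
            best = (tau, fit.llf)
    return best     # (threshold maximizing the profile likelihood, log-lik)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pm25 = rng.gamma(4, 4, size=1000)                        # toy daily PM2.5
    mu = np.exp(1.0 + 0.05 * np.clip(pm25 - 12.0, 0, None))  # true threshold 12
    y = rng.poisson(mu)
    print(profile_threshold_poisson(y, pm25))
```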

  6. A Methodology for Phased Array Radar Threshold Modeling Using the Advanced Propagation Model (APM)

    DTIC Science & Technology

    2017-10-01

    TECHNICAL REPORT 3079, October 2017. This report summarizes the methodology developed to improve phased array radar threshold modeling using the Advanced Propagation Model (APM), covering the phased array radar configuration considered and the modeling methodology.

  7. Establishing a beachhead: A stochastic population model with an Allee effect applied to species invasion

    USGS Publications Warehouse

    Ackleh, A.S.; Allen, L.J.S.; Carter, J.

    2007-01-01

    We formulated a spatially explicit stochastic population model with an Allee effect in order to explore how invasive species may become established. In our model, we varied the degree of migration between local populations and used an Allee effect with variable birth and death rates. Because of the stochastic component, population sizes below the Allee effect threshold may still have a positive probability for successful invasion. The larger the network of populations, the greater the probability of an invasion occurring when initial population sizes are close to or above the Allee threshold. Furthermore, if migration rates are low, one or more than one patch may be successfully invaded, while if migration rates are high all patches are invaded. © 2007 Elsevier Inc. All rights reserved.
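
    A minimal stochastic birth-death sketch of a single patch with an Allee effect is given below; the rate functions, parameter values, and the establishment criterion are illustrative assumptions, not the model used in the paper, but they show how initial sizes near the Allee threshold still establish with positive probability.

```python
import numpy as np

def establishment_probability(n0, allee=50, K=200, b0=1.0, d0=0.2,
                              n_goal=100, n_runs=500, t_max=200.0, rng=None):
    """Gillespie-style birth-death simulation of one patch with an Allee
    effect: the per-capita birth rate is depressed below the Allee
    parameter, so small populations tend to die out while larger ones grow
    toward carrying capacity.  Returns the fraction of runs that reach
    n_goal individuals rather than going extinct.
    """
    rng = rng or np.random.default_rng(0)
    established = 0
    for _ in range(n_runs):
        n, t = n0, 0.0
        while 0 < n < n_goal and t < t_max:
            birth = b0 * n * n / (n + allee)          # Allee-depressed births
            death = d0 * n + (b0 - d0) * n * n / K    # crowding raises deaths
            total = birth + death
            t += rng.exponential(1.0 / total)
            n += 1 if rng.random() < birth / total else -1
        established += (n >= n_goal)
    return established / n_runs

if __name__ == "__main__":
    # initial sizes straddling the deterministic Allee threshold (about 19 here)
    for n0 in (5, 15, 25, 40):
        print(n0, establishment_probability(n0))
```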

  8. Threshold Effects Beyond the Standard Model

    NASA Astrophysics Data System (ADS)

    Taylor, T. R.

    In this contribution to the Festschrift celebrating Gabriele Veneziano on his 65th birthday, I discuss the threshold effects of extra dimensions and their applications to physics beyond the standard model, focusing on superstring theory.

  9. Chaotic Signal Denoising Based on Hierarchical Threshold Synchrosqueezed Wavelet Transform

    NASA Astrophysics Data System (ADS)

    Wang, Wen-Bo; Jing, Yun-yu; Zhao, Yan-chao; Zhang, Lian-Hua; Wang, Xiang-Li

    2017-12-01

    To overcome the shortcomings of the single-threshold synchrosqueezed wavelet transform (SWT) denoising method, an adaptive hierarchical-threshold SWT chaotic signal denoising method is proposed. First, a new SWT threshold function is constructed based on Stein's unbiased risk estimate; this function is twice continuously differentiable. Then, using the new threshold function, a thresholding process based on the minimum mean square error is implemented, and the optimal estimate of each layer's threshold in SWT chaotic denoising is obtained. Experimental results on a simulated chaotic signal and measured sunspot signals show that the proposed method filters the noise of the chaotic signal well and recovers the intrinsic chaotic characteristics of the original signal. Compared with the EEMD denoising method and the single-threshold SWT denoising method, the proposed method obtains better denoising results for chaotic signals.

  10. 40 CFR Table Jj-1 to Subpart Jj of... - Animal Population Threshold Level Below Which Facilities Are Not Required To Report Emissions...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    Table JJ-1 to Subpart JJ of Part 98 (40 CFR, Protection of Environment, 2012-07-01) lists the animal population threshold levels below which facilities are not required to report emissions, including: Swine 34,100; Poultry: Layers 723,600; Broilers 38,160,000; Turkeys 7,710,000 (thresholds expressed in head of animals).

  11. Absolute auditory threshold: testing the absolute.

    PubMed

    Heil, Peter; Matysiak, Artur

    2017-11-02

    The mechanisms underlying the detection of sounds in quiet, one of the simplest tasks for auditory systems, are debated. Several models proposed to explain the threshold for sounds in quiet and its dependence on sound parameters include a minimum sound intensity ('hard threshold'), below which sound has no effect on the ear. Also, many models are based on the assumption that threshold is mediated by integration of a neural response proportional to sound intensity. Here, we test these ideas. Using an adaptive forced choice procedure, we obtained thresholds of 95 normal-hearing human ears for 18 tones (3.125 kHz carrier) in quiet, each with a different temporal amplitude envelope. Grand-mean thresholds and standard deviations were well described by a probabilistic model according to which sensory events are generated by a Poisson point process with a low rate in the absence, and higher, time-varying rates in the presence, of stimulation. The subject actively evaluates the process and bases the decision on the number of events observed. The sound-driven rate of events is proportional to the temporal amplitude envelope of the bandpass-filtered sound raised to an exponent. We find no evidence for a hard threshold: When the model is extended to include such a threshold, the fit does not improve. Furthermore, we find an exponent of 3, consistent with our previous studies and further challenging models that are based on the assumption of the integration of a neural response that, at threshold sound levels, is directly proportional to sound amplitude or intensity. © 2017 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.
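
    The counting model described above can be illustrated with a small Monte-Carlo sketch: sensory events follow a Poisson process whose rate is a spontaneous rate plus a term proportional to the amplitude envelope raised to the third power, and detection requires a minimum number of events. All parameter values below are illustrative, not the fitted values from the study.

```python
import numpy as np

def detection_probability(envelope, dt, spont_rate=5.0, gain=2.0,
                          exponent=3, criterion=3, n_trials=5000, rng=None):
    """Probability of reporting 'tone present' when Poisson sensory events
    arrive at rate spont_rate + gain * envelope**exponent (events/s) and at
    least `criterion` events must occur in the observation window.
    """
    rng = rng or np.random.default_rng(0)
    rate = spont_rate + gain * envelope ** exponent
    expected = rate.sum() * dt                      # integrated event rate
    counts = rng.poisson(expected, size=n_trials)
    return np.mean(counts >= criterion)

if __name__ == "__main__":
    dt = 1e-3
    t = np.arange(0, 0.3, dt)
    ramp = np.minimum(t / 0.05, 1.0) * np.minimum((0.3 - t) / 0.05, 1.0)
    for level_db in (-5, 0, 5, 10):                 # relative levels
        amp = 10 ** (level_db / 20.0)
        print(level_db, round(detection_probability(amp * ramp, dt), 3))
```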

  12. Rethinking the Clinically Based Thresholds of TransCelerate BioPharma for Risk-Based Monitoring.

    PubMed

    Zink, Richard C; Dmitrienko, Anastasia; Dmitrienko, Alex

    2018-01-01

    The quality of data from clinical trials has received a great deal of attention in recent years. Of central importance is the need to protect the well-being of study participants and maintain the integrity of final analysis results. However, traditional approaches to assess data quality have come under increased scrutiny as providing little benefit for the substantial cost. Numerous regulatory guidance documents and industry position papers have described risk-based approaches to identify quality and safety issues. In particular, the position paper of TransCelerate BioPharma recommends defining risk thresholds to assess safety and quality risks based on past clinical experience. This exercise can be extremely time-consuming, and the resulting thresholds may only be relevant to a particular therapeutic area, patient or clinical site population. In addition, predefined thresholds cannot account for safety or quality issues where the underlying rate of observing a particular problem may change over the course of a clinical trial, and often do not consider varying patient exposure. In this manuscript, we appropriate rules commonly utilized for funnel plots to define a traffic-light system for risk indicators based on statistical criteria that consider the duration of patient follow-up. Further, we describe how these methods can be adapted to assess changing risk over time. Finally, we illustrate numerous graphical approaches to summarize and communicate risk, and discuss hybrid clinical-statistical approaches to allow for the assessment of risk at sites with low patient enrollment. We illustrate the aforementioned methodologies for a clinical trial in patients with schizophrenia. Funnel plots are a flexible graphical technique that can form the basis for a risk-based strategy to assess data integrity, while considering site sample size, patient exposure, and changing risk across time.
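
    The funnel-plot rules referred to above can be sketched with exact binomial limits: given each site's exposure and an overall reference rate, event counts outside the limits are flagged. The event type, rates, and flagging rule below are illustrative; the paper's proposal additionally handles risk that changes over time and hybrid clinical-statistical rules for small sites.

```python
import numpy as np
from scipy import stats

def funnel_limits(overall_rate, exposure, alpha=0.05):
    """Exact binomial funnel-plot limits: for a site following up
    `exposure` patients, event counts outside these limits flag the site.
    """
    lower = stats.binom.ppf(alpha / 2, exposure, overall_rate)
    upper = stats.binom.ppf(1 - alpha / 2, exposure, overall_rate)
    return lower, upper

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n_per_site = rng.integers(5, 120, size=20)          # patients per site
    events = rng.binomial(n_per_site, 0.10)             # e.g. AE rate ~10%
    overall = events.sum() / n_per_site.sum()
    lo, hi = funnel_limits(overall, n_per_site)
    flags = (events < lo) | (events > hi)               # traffic-light style
    for n, e, f in zip(n_per_site, events, flags):
        print(f"n={n:3d}  events={e:3d}  flag={'YES' if f else 'no'}")
```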

  13. An individual-based model of zebrafish population dynamics accounting for energy dynamics.

    PubMed

    Beaudouin, Rémy; Goussen, Benoit; Piccini, Benjamin; Augustine, Starrlight; Devillers, James; Brion, François; Péry, Alexandre R R

    2015-01-01

    Developing population dynamics models for zebrafish is crucial in order to extrapolate from toxicity data measured at the organism level to biological levels relevant to support and enhance ecological risk assessment. To achieve this, a dynamic energy budget for individual zebrafish (DEB model) was coupled to an individual based model of zebrafish population dynamics (IBM model). Next, we fitted the DEB model to new experimental data on zebrafish growth and reproduction thus improving existing models. We further analysed the DEB-model and DEB-IBM using a sensitivity analysis. Finally, the predictions of the DEB-IBM were compared to existing observations on natural zebrafish populations and the predicted population dynamics are realistic. While our zebrafish DEB-IBM model can still be improved by acquiring new experimental data on the most uncertain processes (e.g. survival or feeding), it can already serve to predict the impact of compounds at the population level.

  14. Genetic parameters for direct and maternal calving ease in Walloon dairy cattle based on linear and threshold models.

    PubMed

    Vanderick, S; Troch, T; Gillon, A; Glorieux, G; Gengler, N

    2014-12-01

    Calving ease scores from Holstein dairy cattle in the Walloon Region of Belgium were analysed using univariate linear and threshold animal models. Variance components and derived genetic parameters were estimated from a data set including 33,155 calving records. The models included season, herd, and the sex of calf × age of dam classes × group of calvings interaction as fixed effects, and herd × year of calving, maternal permanent environment, and animal direct and maternal additive genetic effects as random effects. Models were fitted with the genetic correlation between direct and maternal additive genetic effects either estimated or constrained to zero. Direct heritability for calving ease was approximately 8% with linear models and approximately 12% with threshold models; maternal heritabilities were approximately 2 and 4%, respectively. The genetic correlation between direct and maternal additive effects was not significantly different from zero. Models were compared in terms of goodness of fit and predictive ability, using criteria such as the mean squared error and the correlations between observed and predicted calving ease scores and between estimated breeding values, estimated from 85,118 calving records. The results showed few differences between linear and threshold models, even though, for sires with progeny in the data subsets, correlations between estimated breeding values were 17 and 23% greater for direct and maternal genetic effects, respectively, under the linear model than under the threshold model. For the purpose of genetic evaluation for calving ease in Walloon Holstein dairy cattle, the linear animal model without a covariance between direct and maternal additive effects was found to be the best choice. © 2014 Blackwell Verlag GmbH.

  15. Estimating and modeling the cure fraction in population-based cancer survival analysis.

    PubMed

    Lambert, Paul C; Thompson, John R; Weston, Claire L; Dickman, Paul W

    2007-07-01

    In population-based cancer studies, cure is said to occur when the mortality (hazard) rate in the diseased group of individuals returns to the same level as that expected in the general population. The cure fraction (the proportion of patients cured of disease) is of interest to patients and is a useful measure to monitor trends in survival of curable disease. There are 2 main types of cure fraction model, the mixture cure fraction model and the non-mixture cure fraction model, with most previous work concentrating on the mixture cure fraction model. In this paper, we extend the parametric non-mixture cure fraction model to incorporate background mortality, thus providing estimates of the cure fraction in population-based cancer studies. We compare the estimates of relative survival and the cure fraction between the 2 types of model and also investigate the importance of modeling the ancillary parameters in the selected parametric distribution for both types of model.
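
    For reference, the two families of cure model named above can be written compactly in the population-based (background-mortality) setting; these are the standard formulations, not this paper's specific parameterization. With π the cure fraction, S*(t) the expected survival of a comparable disease-free population, and S_u(t), F_u(t) the survival and distribution functions of the uncured:

```latex
% mixture cure fraction model with background mortality
S(t) = S^{*}(t)\,\bigl[\pi + (1 - \pi)\, S_{u}(t)\bigr]
% non-mixture (bounded cumulative hazard) cure fraction model
S(t) = S^{*}(t)\, \pi^{F_{u}(t)}
% relative survival tends to the cure fraction
R(t) = S(t)/S^{*}(t) \longrightarrow \pi \quad \text{as } t \to \infty
```

    In both cases the modeled relative survival levels off at π, which is what "cure" means in the population-based sense described above.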

  16. Methodological issues when comparing hearing thresholds of a group with population standards: the case of the ferry engineers.

    PubMed

    Dobie, Robert A

    2006-10-01

    To discuss appropriate and inappropriate methods for comparing distributions of hearing thresholds of a study group with distributions in population standards, and to determine whether the thresholds of Washington State Ferries engineers differ from those of men in the general population, using both frequency-by-frequency comparisons and analysis of audiometric shape. The most recent hearing conservation program audiograms of 321 noise-exposed engineers, ages 35 to 64, were compared with the predictions of Annexes A, B, and C of ANSI S3.44. There was no screening by history or otoscopy; all audiograms were included. 95% confidence intervals (95% CIs) were calculated for the engineers' median thresholds for each ear, for the better ear (defined two ways), and for the binaural average. For Annex B, where 95% CIs are also available, it was possible to calculate z scores for the differences between Annex B and the engineers' better ears. Bulge depth, an audiometric shape statistic, measured curvature between 1 and 6 kHz. Engineers' better-ear median thresholds were worse than those in Annex A but (except at 1 kHz) were as good as or better than those in Annexes B and C, which are more appropriate for comparison with an unscreened noise-exposed group like the engineers. Average bulge depth for the engineers was similar to that of the Annex B standard (no added occupational noise) and was much less than that of audiograms created by using the standard with added occupational noise between 90 and 100 dBA. Audiograms from groups that have been selected for a particular exposure, but without regard to severity, can appropriately be compared with population standards if certain pitfalls are avoided. For unscreened study groups with large age-sex subgroups, a simple method to assess statistical significance, taking into consideration uncertainties in both the study group and the comparison standard, is the calculation of z scores for the proportion of better

  17. Two-threshold model for scaling laws of noninteracting snow avalanches

    USGS Publications Warehouse

    Faillettaz, J.; Louchet, F.; Grasso, J.-R.

    2004-01-01

    A two-threshold model was proposed for scaling laws of noninteracting snow avalanches. It was found that the sizes of the largest avalanches just preceding the lattice system were power-law distributed. The proposed model reproduced the range of power-law exponents observed for land, rock, or snow avalanches by tuning the maximum value of the ratio of the two failure thresholds. A two-threshold 2D cellular automaton was introduced to study the scaling of gravity-driven systems.

  18. Algorithmic detectability threshold of the stochastic block model

    NASA Astrophysics Data System (ADS)

    Kawamoto, Tatsuro

    2018-03-01

    The assumption that the values of model parameters are known or correctly learned, i.e., the Nishimori condition, is one of the requirements for the detectability analysis of the stochastic block model in statistical inference. In practice, however, there is no example demonstrating that we can know the model parameters beforehand, and there is no guarantee that the model parameters can be learned accurately. In this study, we consider the expectation-maximization (EM) algorithm with belief propagation (BP) and derive its algorithmic detectability threshold. Our analysis is not restricted to the community structure but includes general modular structures. Because the algorithm cannot always learn the planted model parameters correctly, the algorithmic detectability threshold is qualitatively different from the one with the Nishimori condition.

  19. Numeric, Agent-based or System Dynamics Model? Which Modeling Approach is the Best for Vast Population Simulation?

    PubMed

    Cimler, Richard; Tomaskova, Hana; Kuhnova, Jitka; Dolezal, Ondrej; Pscheidl, Pavel; Kuca, Kamil

    2018-01-01

    Alzheimer's disease is one of the most common mental illnesses. It is posited that more than 25% of the population is affected by some mental disease during their lifetime. Treatment of each patient draws resources from the economy concerned; therefore, it is important to quantify the potential economic impact. Agent-based, system dynamics, and numerical approaches to dynamic modeling of the population of the European Union and its patients with Alzheimer's disease are presented in this article. Simulations, their characteristics, and the results from different modeling tools are compared, and the results of these approaches are checked against EU population growth predictions from Eurostat, the statistical office of the EU. The methodology for creating the models is described, all three modeling approaches are compared, and the suitability of each approach for population modeling is discussed. In this case study, all three approaches gave results corresponding with the EU population prediction. Moreover, we were able to predict the number of patients with AD and, depending on the modeling method, to monitor different characteristics of the population. Copyright © Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  20. Landscape-based population viability models demonstrate importance of strategic conservation planning for birds

    Treesearch

    Thomas W. Bonnot; Frank R. Thompson; Joshua J. Millspaugh; D. Todd Jones-Farland

    2013-01-01

    Efforts to conserve regional biodiversity in the face of global climate change, habitat loss and fragmentation will depend on approaches that consider population processes at multiple scales. By combining habitat and demographic modeling, landscape-based population viability models effectively relate small-scale habitat and landscape patterns to regional population...

  1. An Individual-Based Model of Zebrafish Population Dynamics Accounting for Energy Dynamics

    PubMed Central

    Beaudouin, Rémy; Goussen, Benoit; Piccini, Benjamin; Augustine, Starrlight; Devillers, James; Brion, François; Péry, Alexandre R. R.

    2015-01-01

    Developing population dynamics models for zebrafish is crucial in order to extrapolate from toxicity data measured at the organism level to biological levels relevant to support and enhance ecological risk assessment. To achieve this, a dynamic energy budget for individual zebrafish (DEB model) was coupled to an individual based model of zebrafish population dynamics (IBM model). Next, we fitted the DEB model to new experimental data on zebrafish growth and reproduction thus improving existing models. We further analysed the DEB-model and DEB-IBM using a sensitivity analysis. Finally, the predictions of the DEB-IBM were compared to existing observations on natural zebrafish populations and the predicted population dynamics are realistic. While our zebrafish DEB-IBM model can still be improved by acquiring new experimental data on the most uncertain processes (e.g. survival or feeding), it can already serve to predict the impact of compounds at the population level. PMID:25938409

  2. Integrating physiological threshold experiments with climate modeling to project mangrove range limits

    NASA Astrophysics Data System (ADS)

    Cavanaugh, K. C.; Kellner, J.; Cook-Patton, S.; Williams, P.; Feller, I. C.; Parker, J.

    2014-12-01

    Due to limitations of purely correlative species distribution models, there is a need for more integration of experimental approaches when studying impacts of climate change on species distributions. Here we used controlled experiments to identify physiological thresholds that control poleward range limits of three species of mangroves found in North America. We found that all three species exhibited a threshold response to extreme cold, but freeze tolerance thresholds varied among species. From these experiments we developed a climate metric, freeze degree days (FDD), which incorporates both the intensity and frequency of freezes. When included in distribution models, FDD was a better predictor of mangrove presence/absence than other temperature-based metrics. Using 27 years of satellite imagery, we linked FDD to past changes in mangrove abundance in Florida, further supporting the relevance of FDD. We then used downscaled climate projections of FDD to project poleward migration of these range limits over the next 50 years.
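
    The freeze degree days (FDD) idea, accumulating both how often and how far nightly temperatures drop below freezing, can be sketched generically as below; the base temperature and the exact accumulation window used by the authors may differ.

```python
import numpy as np

def freeze_degree_days(daily_min_temp_c, base_c=0.0):
    """Sum of degrees below the freezing base across all freezing days,
    so FDD grows with both the frequency and the intensity of freezes.
    """
    deficits = np.clip(base_c - np.asarray(daily_min_temp_c), 0.0, None)
    return deficits.sum()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # toy winter: mostly mild nights versus the same nights plus three hard freezes
    mild = rng.normal(8, 4, size=90)
    cold = mild.copy(); cold[[10, 11, 40]] -= 15.0
    print("mild winter FDD:", round(freeze_degree_days(mild), 1))
    print("freeze-event winter FDD:", round(freeze_degree_days(cold), 1))
```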

  3. Comparison between intensity- duration thresholds and cumulative rainfall thresholds for the forecasting of landslide

    NASA Astrophysics Data System (ADS)

    Lagomarsino, Daniela; Rosi, Ascanio; Rossi, Guglielmo; Segoni, Samuele; Catani, Filippo

    2014-05-01

    This work makes a quantitative comparison between the results of landslide forecasting obtained using two different rainfall threshold models, one using intensity-duration thresholds and the other based on cumulative rainfall thresholds, in a 116 km² area of northern Tuscany. The first methodology identifies rainfall intensity-duration thresholds by means of software called MaCumBA (Massive CUMulative Brisk Analyzer) that analyzes rain-gauge records, extracts the intensities (I) and durations (D) of the rainstorms associated with the initiation of landslides, plots these values on a diagram, and identifies thresholds that define the lower bounds of the I-D values. A back analysis using data from past events can be used to identify the threshold conditions associated with the smallest number of false alarms. The second method (SIGMA) is based on the hypothesis that anomalous or extreme values of rainfall are responsible for landslide triggering: the statistical distribution of the rainfall series is analyzed, and multiples of the standard deviation (σ) are used as thresholds to discriminate between ordinary and extraordinary rainfall events. The name of the model, SIGMA, reflects the central role of the standard deviation in the proposed methodology. The definition of intensity-duration rainfall thresholds requires the combined use of rainfall measurements and an inventory of dated landslides, whereas the SIGMA model can be implemented using only rainfall data. The two methodologies were applied in a 116 km² area for which a database of 1200 landslides was available for the period 2000-2012, and the results obtained are compared and discussed. Although several examples of visual comparisons between different intensity-duration rainfall thresholds are reported in the international literature, a quantitative comparison between thresholds obtained in the same area using different techniques and approaches is a relatively undebated research topic.
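
    A generic sketch of how an intensity-duration lower-envelope threshold I = α D^(-β) can be fitted to triggering rainstorms is shown below (a log-log regression shifted down to a low residual quantile); it illustrates the I-D concept only and is not the MaCumBA procedure, and the synthetic storm data are made up.

```python
import numpy as np

def intensity_duration_threshold(durations_h, intensities_mmh, quantile=0.05):
    """Fit a power-law lower envelope I = alpha * D**(-beta) to the
    intensity-duration pairs of landslide-triggering rainstorms: a least
    squares fit in log-log space, shifted down so that only `quantile` of
    the triggering events fall below it.
    """
    logD = np.log10(durations_h)
    logI = np.log10(intensities_mmh)
    slope, intercept = np.polyfit(logD, logI, 1)
    residuals = logI - (slope * logD + intercept)
    shifted_intercept = intercept + np.quantile(residuals, quantile)
    alpha, beta = 10 ** shifted_intercept, -slope
    return alpha, beta

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    D = 10 ** rng.uniform(0, 2, size=120)                    # 1-100 h storms
    I = 20 * D ** -0.5 * 10 ** rng.normal(0.15, 0.1, 120)    # scatter above a floor
    a, b = intensity_duration_threshold(D, I)
    print(f"threshold: I = {a:.1f} * D^(-{b:.2f})  [mm/h, h]")
```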

  4. Relation of the runaway avalanche threshold to momentum space topology

    NASA Astrophysics Data System (ADS)

    McDevitt, Christopher J.; Guo, Zehua; Tang, Xian-Zhu

    2018-02-01

    The underlying physics responsible for the formation of an avalanche instability due to the generation of secondary electrons is studied. A careful examination of the momentum space topology of the runaway electron population is carried out with an eye toward identifying how qualitative changes in the momentum space of the runaway electrons are correlated with the avalanche threshold. It is found that the avalanche threshold is tied to the merger of an O and X point in the momentum space of the primary runaway electron population. Such a change of the momentum space topology is shown to be accurately described by a simple analytic model, thus providing a powerful means of determining the avalanche threshold for a range of model assumptions.

  5. Relation of the runaway avalanche threshold to momentum space topology

    DOE PAGES

    McDevitt, Christopher J.; Guo, Zehua; Tang, Xian -Zhu

    2018-01-05

    Here, the underlying physics responsible for the formation of an avalanche instability due to the generation of secondary electrons is studied. A careful examination of the momentum space topology of the runaway electron population is carried out with an eye toward identifying how qualitative changes in the momentum space of the runaway electrons are correlated with the avalanche threshold. It is found that the avalanche threshold is tied to the merger of an O and X point in the momentum space of the primary runaway electron population. Such a change of the momentum space topology is shown to be accurately described by a simple analytic model, thus providing a powerful means of determining the avalanche threshold for a range of model assumptions.

  6. Relation of the runaway avalanche threshold to momentum space topology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McDevitt, Christopher J.; Guo, Zehua; Tang, Xian -Zhu

    Here, the underlying physics responsible for the formation of an avalanche instability due to the generation of secondary electrons is studied. A careful examination of the momentum space topology of the runaway electron population is carried out with an eye toward identifying how qualitative changes in the momentum space of the runaway electrons are correlated with the avalanche threshold. It is found that the avalanche threshold is tied to the merger of an O and X point in the momentum space of the primary runaway electron population. Such a change of the momentum space topology is shown to be accurately described by a simple analytic model, thus providing a powerful means of determining the avalanche threshold for a range of model assumptions.

  7. Discontinuous non-equilibrium phase transition in a threshold Schloegl model for autocatalysis: Generic two-phase coexistence and metastability

    NASA Astrophysics Data System (ADS)

    Wang, Chi-Jen; Liu, Da-Jiang; Evans, James W.

    2015-04-01

    Threshold versions of Schloegl's model on a lattice, which involve autocatalytic creation and spontaneous annihilation of particles, can provide a simple prototype for discontinuous non-equilibrium phase transitions. These models are equivalent to so-called threshold contact processes. A discontinuous transition between populated and vacuum states can occur when a threshold of N ≥ 2 is selected for the minimum number, N, of neighboring particles required to enable autocatalytic creation at an empty site. Fundamental open questions remain given the lack of a thermodynamic framework for analysis. For a square lattice with N = 2, we show that phase coexistence occurs not at a unique value but for a finite range of particle annihilation rate (the natural control parameter). This generic two-phase coexistence also persists when perturbing the model to allow spontaneous particle creation. Such behavior contrasts with both the Gibbs phase rule for thermodynamic systems and previous analysis of this model. We find metastability near the transition corresponding to a non-zero effective line tension, also contrasting with previously suggested critical behavior. Mean-field type analysis, extended to treat spatially heterogeneous states, further elucidates model behavior.
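
    The threshold contact process described above is straightforward to prototype. The following random-sequential-update Monte Carlo sketch uses the N ≥ 2 creation rule on a periodic square lattice; the lattice size, annihilation probabilities and initial coverage are assumptions made for illustration, and this is not the authors' simulation code:

      # Random-sequential-update sketch of a threshold contact process on a periodic
      # square lattice (illustrative parameters; not the authors' simulation code).
      import numpy as np

      def particle_density(L=40, p_annihilate=0.1, N=2, sweeps=150, seed=1):
          rng = np.random.default_rng(seed)
          lattice = (rng.random((L, L)) < 0.8).astype(np.int8)   # 1 = particle, start mostly populated
          for _ in range(sweeps * L * L):                        # one sweep = L*L attempted updates
              i, j = rng.integers(L, size=2)
              if lattice[i, j]:
                  if rng.random() < p_annihilate:                # spontaneous annihilation
                      lattice[i, j] = 0
              else:
                  occupied = (lattice[(i - 1) % L, j] + lattice[(i + 1) % L, j]
                              + lattice[i, (j - 1) % L] + lattice[i, (j + 1) % L])
                  if occupied >= N:                              # creation needs >= N occupied neighbours
                      lattice[i, j] = 1
          return lattice.mean()

      # Density drops abruptly as the annihilation rate crosses the transition region.
      for p in (0.05, 0.15, 0.30):
          print(p, particle_density(p_annihilate=p))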

  8. A threshold selection method based on edge preserving

    NASA Astrophysics Data System (ADS)

    Lou, Liantang; Dan, Wei; Chen, Jiaqi

    2015-12-01

    A method of automatic threshold selection for image segmentation is presented. An optimal threshold is selected so as to preserve image edges as well as possible in the segmentation. The shortcoming of Otsu's method based on gray-level histograms is analyzed. The edge energy function of a bivariate continuous function is expressed as a line integral, while the edge energy function of an image is approximated by discretizing the integral. An optimal threshold selection method based on maximizing this edge energy function is given. Several experimental results are also presented for comparison with Otsu's method.
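
    As a loose illustration of selecting a threshold by maximizing an edge-based criterion (this is a crude gradient-on-boundary proxy evaluated on synthetic data, not the paper's line-integral formulation):

      # Crude gradient-on-boundary proxy for edge-aware threshold selection
      # (synthetic image; not the paper's line-integral formulation).
      import numpy as np

      def best_threshold(image):
          img = image.astype(float)
          gy, gx = np.gradient(img)
          grad = np.hypot(gx, gy)

          def edge_energy(t):
              mask = img >= t
              boundary = np.zeros_like(mask)          # pixels differing from a 4-neighbour
              boundary[:-1, :] |= mask[:-1, :] != mask[1:, :]
              boundary[1:, :] |= mask[1:, :] != mask[:-1, :]
              boundary[:, :-1] |= mask[:, :-1] != mask[:, 1:]
              boundary[:, 1:] |= mask[:, 1:] != mask[:, :-1]
              return grad[boundary].sum()

          candidates = range(int(img.min()) + 1, int(img.max()))
          return max(candidates, key=edge_energy)

      rng = np.random.default_rng(0)
      img = rng.normal(50, 5, (64, 64))
      img[16:48, 16:48] += 100                        # bright square on noisy background
      print(best_threshold(img))                      # expected to land between the two modes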

  9. A Mathematical Model of Anthrax Transmission in Animal Populations.

    PubMed

    Saad-Roy, C M; van den Driessche, P; Yakubu, Abdul-Aziz

    2017-02-01

    A general mathematical model of anthrax (caused by Bacillus anthracis) transmission is formulated that includes live animals, infected carcasses and spores in the environment. The basic reproduction number R0 is calculated, and existence of a unique endemic equilibrium is established for R0 above the threshold value 1. Using data from the literature, elasticity indices for R0 and type reproduction numbers are computed to quantify anthrax control measures. Including only herbivorous animals, anthrax is eradicated if R0 < 1. For these animals, oscillatory solutions arising from Hopf bifurcations are numerically shown to exist for certain parameter values with R0 > 1 and to have periodicity as observed from anthrax data. Including carnivores and assuming no disease-related death, anthrax again goes extinct below the threshold. Local stability of the endemic equilibrium is established above the threshold; thus, periodic solutions are not possible for these populations. It is shown numerically that oscillations in spore growth may drive oscillations in animal populations; however, the total number of infected animals remains about the same as with constant spore growth.

  10. Species extinction thresholds in the face of spatially correlated periodic disturbance.

    PubMed

    Liao, Jinbao; Ying, Zhixia; Hiebeler, David E; Wang, Yeqiao; Takada, Takenori; Nijs, Ivan

    2015-10-20

    The spatial correlation of disturbance is gaining attention in landscape ecology, but knowledge is still lacking on how species traits determine extinction thresholds under spatially correlated disturbance regimes. Here we develop a pair approximation model to explore species extinction risk in a lattice-structured landscape subject to aggregated periodic disturbance. Increasing disturbance extent and frequency accelerated population extinction irrespective of whether dispersal was local or global. Spatial correlation of disturbance likewise increased species extinction risk, but only for local dispersers. This indicates that models based on randomly simulated disturbances (e.g., mean-field or non-spatial models) may underestimate real extinction rates. Compared to local dispersal, species with global dispersal tolerated more severe disturbance, suggesting that the spatial correlation of disturbance favors long-range dispersal from an evolutionary perspective. Following disturbance, intraspecific competition greatly enhanced the extinction risk of distance-limited dispersers, while it surprisingly did not influence the extinction thresholds of global dispersers, apart from decreasing population density to some degree. As species respond differently to disturbance regimes with different spatiotemporal properties, different regimes may accommodate different species.

  11. Global gray-level thresholding based on object size.

    PubMed

    Ranefall, Petter; Wählby, Carolina

    2016-04-01

    In this article, we propose a fast and robust global gray-level thresholding method based on object size, where the selection of threshold level is based on recall and maximum precision with regard to objects within a given size interval. The method relies on the component tree representation, which can be computed in quasi-linear time. Feature-based segmentation is especially suitable for biomedical microscopy applications where objects often vary in number, but have limited variation in size. We show that for real images of cell nuclei and synthetic data sets mimicking fluorescent spots the proposed method is more robust than all standard global thresholding methods available for microscopy applications in ImageJ and CellProfiler. The proposed method, provided as ImageJ and CellProfiler plugins, is simple to use and the only required input is an interval of the expected object sizes. © 2016 International Society for Advancement of Cytometry.
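
    A brute-force sketch of the size-interval idea (the published method uses a component tree for quasi-linear time and explicit recall/precision criteria; here candidate thresholds are simply scanned, and the synthetic blob image and scoring rule are assumptions):

      # Scan thresholds and score each by the fraction of connected components
      # whose size falls inside a given interval (illustration only).
      import numpy as np
      from scipy import ndimage

      def size_based_threshold(image, size_min, size_max, n_candidates=50):
          best_t, best_score = None, -1.0
          for t in np.linspace(image.min(), image.max(), n_candidates)[1:-1]:
              labels, n_objects = ndimage.label(image >= t)
              if n_objects == 0:
                  continue
              sizes = np.bincount(labels.ravel())[1:]           # pixels per component
              in_range = np.count_nonzero((sizes >= size_min) & (sizes <= size_max))
              score = in_range / n_objects                      # precision-like score
              if score > best_score:
                  best_t, best_score = t, score
          return best_t

      rng = np.random.default_rng(1)
      img = rng.normal(10, 2, (128, 128))                       # background noise
      yy, xx = np.ogrid[:128, :128]
      for cy, cx in [(30, 30), (60, 90), (100, 50)]:            # three "nuclei" of radius 6
          img[(yy - cy) ** 2 + (xx - cx) ** 2 < 36] += 40
      print(size_based_threshold(img, size_min=80, size_max=160))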

  12. Evaluation on the cost-effective threshold of osteoporosis treatment on elderly women in China using discrete event simulation model.

    PubMed

    Ni, W; Jiang, Y

    2017-02-01

    This study used a simulation model to determine the cost-effective threshold of fracture risk for treating osteoporosis among elderly Chinese women. Osteoporosis treatment is cost-effective among average-risk women who are at least 75 years old and above-average-risk women who are younger than 75 years old. Aging of the Chinese population is imposing an increasing economic burden of osteoporosis. This study evaluated the cost-effectiveness of osteoporosis treatment among senior Chinese women. A discrete event simulation model using age-specific probabilities of hip fracture, clinical vertebral fracture, wrist fracture, humerus fracture, and other fracture; costs (2015 US dollars); and quality-adjusted life years (QALYs) was used to assess the cost-effectiveness of osteoporosis treatment. The incremental cost-effectiveness ratio (ICER) was calculated. The willingness to pay (WTP) for a QALY in China was compared with the calculated ICER to decide cost-effectiveness. To determine the absolute 10-year hip fracture probability at which osteoporosis treatment became cost-effective, average age-specific probabilities for all fractures were multiplied by a relative risk (RR) that was systematically varied from 0 to 10 until the WTP threshold was observed for treatment relative to no intervention. Sensitivity analyses were also performed to evaluate the impacts of WTP and annual treatment costs. In the baseline analysis, simulated ICERs were higher than the WTP threshold among Chinese women younger than 75, but much lower than the WTP among the older population. Sensitivity analyses indicated that cost-effectiveness could vary with a higher WTP threshold or a lower annual treatment cost. A 30 % increase in WTP or a 30 % reduction in annual treatment costs will make osteoporosis treatment cost-effective for Chinese women aged 55 to 85. The current study provides evidence that osteoporosis treatment is cost-effective among a subpopulation of
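
    The ICER-versus-WTP logic can be illustrated with a toy calculation (all costs, QALYs and the WTP value below are hypothetical placeholders; the study generated these quantities with a full discrete event simulation):

      # Toy ICER / willingness-to-pay scan over relative fracture risk.
      def icer(cost_treat, qaly_treat, cost_none, qaly_none):
          return (cost_treat - cost_none) / (qaly_treat - qaly_none)

      def threshold_relative_risk(wtp=20000.0):
          """Lowest relative risk RR at which treatment becomes cost-effective (ICER <= WTP)."""
          for tenths in range(1, 101):                  # RR from 0.1 to 10.0 in steps of 0.1
              rr = tenths / 10.0
              # Hypothetical model outputs: higher fracture risk makes "no treatment"
              # costlier and less healthy, so treatment gains value as RR rises.
              cost_none, qaly_none = 1000.0 + 3000.0 * rr, 10.0 - 0.30 * rr
              cost_treat, qaly_treat = 3500.0 + 1200.0 * rr, 10.0 - 0.12 * rr
              if icer(cost_treat, qaly_treat, cost_none, qaly_none) <= wtp:
                  return rr
          return None

      print(threshold_relative_risk())                  # -> 0.5 with these toy numbers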

  13. Reconstruction of Sensory Stimuli Encoded with Integrate-and-Fire Neurons with Random Thresholds

    PubMed Central

    Lazar, Aurel A.; Pnevmatikakis, Eftychios A.

    2013-01-01

    We present a general approach to the reconstruction of sensory stimuli encoded with leaky integrate-and-fire neurons with random thresholds. The stimuli are modeled as elements of a Reproducing Kernel Hilbert Space. The reconstruction is based on finding a stimulus that minimizes a regularized quadratic optimality criterion. We discuss in detail the reconstruction of sensory stimuli modeled as absolutely continuous functions as well as stimuli with absolutely continuous first-order derivatives. Reconstruction results are presented for stimuli encoded with a single neuron as well as with a population of neurons. Examples are given that demonstrate the performance of the reconstruction algorithms as a function of threshold variability. PMID:24077610

  14. Establishing a rainfall threshold for flash flood warnings in China's mountainous areas based on a distributed hydrological model

    NASA Astrophysics Data System (ADS)

    Miao, Qinghua; Yang, Dawen; Yang, Hanbo; Li, Zhe

    2016-10-01

    Flash flooding is one of the most common natural hazards in China, particularly in mountainous areas, and usually causes heavy damage and casualties. However, the forecasting of flash flooding in mountainous regions remains challenging because of the short response time and limited monitoring capacity. This paper aims to establish a strategy for flash flood warnings in mountainous ungauged catchments across humid, semi-humid and semi-arid regions of China. First, we implement a geomorphology-based hydrological model (GBHM) in four mountainous catchments with drainage areas that range from 493 to 1601 km2. The results show that the GBHM can simulate flash floods appropriately in these four study catchments. We propose a method to determine the rainfall threshold for flood warning by using frequency analysis and binary classification based on long-term GBHM simulations forced by historical rainfall data, creating a practical and straightforward approach for flash flood forecasting in ungauged mountainous catchments with drainage areas from tens to hundreds of square kilometers. The results show that the rainfall threshold value decreases significantly with increasing antecedent soil moisture in humid regions, while this value decreases slightly with increasing soil moisture in semi-humid and semi-arid regions. We also find that accumulated rainfall over a certain time span (or rainfall over a long time span) is an appropriate threshold for flash flood warnings in humid regions because the runoff is dominated by saturation excess. However, the rainfall intensity (or rainfall over a short time span) is more suitable in semi-humid and semi-arid regions because infiltration excess dominates the runoff in these regions. We conduct a comprehensive evaluation of the rainfall threshold and find that the proposed method produces reasonably accurate flash flood warnings in the study catchments. An evaluation of the performance at uncalibrated interior points
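
    The threshold-selection step by binary classification can be sketched as a scan over candidate rainfall thresholds that maximizes a skill score such as the critical success index (the data below are synthetic and the choice of score is an assumption, not the paper's exact procedure):

      # Pick the rainfall warning threshold that maximizes the critical success index.
      import numpy as np

      def csi(rain, flooded, threshold):
          warn = rain >= threshold
          hits = np.sum(warn & flooded)
          misses = np.sum(~warn & flooded)
          false_alarms = np.sum(warn & ~flooded)
          denominator = hits + misses + false_alarms
          return hits / denominator if denominator else 0.0

      def best_rain_threshold(rain, flooded, candidates):
          return max(candidates, key=lambda t: csi(rain, flooded, t))

      rng = np.random.default_rng(2)
      rain = rng.gamma(2.0, 20.0, 500)                    # synthetic event rainfall (mm)
      flooded = rain + rng.normal(0, 15, 500) > 80        # synthetic "flood occurred" labels
      print(best_rain_threshold(rain, flooded, np.arange(20, 150, 5)))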

  15. Metro passengers’ route choice model and its application considering perceived transfer threshold

    PubMed Central

    Jin, Fanglei; Zhang, Yongsheng; Liu, Shasha

    2017-01-01

    With the rapid development of the Metro network in China, the greatly increased route alternatives make passengers’ route choice behavior and passenger flow assignment more complicated, which presents challenges to the operation management. In this paper, a path sized logit model is adopted to analyze passengers’ route choice preferences considering such parameters as in-vehicle time, number of transfers, and transfer time. Moreover, the “perceived transfer threshold” is defined and included in the utility function to reflect the penalty difference caused by transfer time on passengers’ perceived utility under various numbers of transfers. Next, based on the revealed preference data collected in the Guangzhou Metro, the proposed model is calibrated. The appropriate perceived transfer threshold value and the route choice preferences are analyzed. Finally, the model is applied to a personalized route planning case to demonstrate the engineering practicability of route choice behavior analysis. The results show that the introduction of the perceived transfer threshold is helpful to improve the model’s explanatory abilities. In addition, personalized route planning based on route choice preferences can meet passengers’ diversified travel demands. PMID:28957376
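
    A hedged sketch of how a transfer-time threshold might enter a route utility and a logit choice (the coefficients, the 5-minute threshold and the piecewise penalty form are invented for illustration; the paper calibrates a path-size logit on Guangzhou Metro revealed-preference data):

      # Illustrative utility with a transfer-time threshold penalty and a logit choice.
      import numpy as np

      def route_utility(in_vehicle_min, n_transfers, transfer_min, threshold_min=5.0,
                        b_ivt=-0.10, b_transfer=-0.60, b_tt=-0.05, b_excess=-0.15):
          allowed = threshold_min * n_transfers            # total "tolerated" transfer time
          excess = max(transfer_min - allowed, 0.0)        # time beyond the perceived threshold
          return (b_ivt * in_vehicle_min + b_transfer * n_transfers
                  + b_tt * min(transfer_min, allowed) + b_excess * excess)

      def choice_probabilities(utilities):
          u = np.asarray(utilities, dtype=float)
          e = np.exp(u - u.max())                          # numerically stable softmax
          return e / e.sum()

      routes = [(32, 1, 4), (28, 2, 9), (40, 0, 0)]        # (in-vehicle min, transfers, transfer min)
      print(choice_probabilities([route_utility(*r) for r in routes]))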

  16. Representing the acquisition and use of energy by individuals in agent-based models of animal populations

    USGS Publications Warehouse

    Sibly, Richard M.; Grimm, Volker; Martin, Benjamin T.; Johnston, Alice S.A.; Kulakowska, Katarzyna; Topping, Christopher J.; Calow, Peter; Nabe-Nielsen, Jacob; Thorbek, Pernille; DeAngelis, Donald L.

    2013-01-01

    1. Agent-based models (ABMs) are widely used to predict how populations respond to changing environments. As the availability of food varies in space and time, individuals should have their own energy budgets, but there is no consensus as to how these should be modelled. Here, we use knowledge of physiological ecology to identify major issues confronting the modeller and to make recommendations about how energy budgets for use in ABMs should be constructed. 2. Our proposal is that modelled animals forage as necessary to supply their energy needs for maintenance, growth and reproduction. If there is sufficient energy intake, an animal allocates the energy obtained in the order: maintenance, growth, reproduction, energy storage, until its energy stores reach an optimal level. If there is a shortfall, the priorities for maintenance and growth/reproduction remain the same until reserves fall to a critical threshold below which all are allocated to maintenance. Rates of ingestion and allocation depend on body mass and temperature. We make suggestions for how each of these processes should be modelled mathematically. 3. Mortality rates vary with body mass and temperature according to known relationships, and these can be used to obtain estimates of background mortality rate. 4. If parameter values cannot be obtained directly, then values may provisionally be obtained by parameter borrowing, pattern-oriented modelling, artificial evolution or from allometric equations. 5. The development of ABMs incorporating individual energy budgets is essential for realistic modelling of populations affected by food availability. Such ABMs are already being used to guide conservation planning of nature reserves and shell fisheries, to assess environmental impacts of building proposals including wind farms and highways and to assess the effects on nontarget organisms of chemicals for the control of agricultural pests.
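
    A minimal sketch of the priority-ordered allocation rule described in point 2 (the demand values, the optimal reserve level and the critical threshold are placeholders, not parameter values recommended by the authors):

      # Priority-ordered energy allocation: maintenance, growth, reproduction, storage.
      def allocate_energy(intake, reserves, demands,
                          reserve_optimum=100.0, reserve_critical=20.0):
          """Shortfalls draw on reserves down to the critical threshold, below which
          everything goes to maintenance."""
          order = ["maintenance", "growth", "reproduction"]
          if reserves <= reserve_critical:
              order = ["maintenance"]
          allocation = {item: 0.0 for item in order + ["storage"]}
          available = intake + max(reserves - reserve_critical, 0.0)
          for item in order:
              spend = min(demands.get(item, 0.0), available)
              allocation[item] = spend
              available -= spend
          spent = sum(allocation[item] for item in order)
          leftover_intake = max(intake - spent, 0.0)
          allocation["storage"] = min(leftover_intake, max(reserve_optimum - reserves, 0.0))
          new_reserves = reserves - max(spent - intake, 0.0) + allocation["storage"]
          return allocation, new_reserves

      print(allocate_energy(intake=30.0, reserves=50.0,
                            demands={"maintenance": 20.0, "growth": 15.0, "reproduction": 10.0}))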

  17. Exact Hybrid Particle/Population Simulation of Rule-Based Models of Biochemical Systems

    PubMed Central

    Stover, Lori J.; Nair, Niketh S.; Faeder, James R.

    2014-01-01

    Detailed modeling and simulation of biochemical systems is complicated by the problem of combinatorial complexity, an explosion in the number of species and reactions due to myriad protein-protein interactions and post-translational modifications. Rule-based modeling overcomes this problem by representing molecules as structured objects and encoding their interactions as pattern-based rules. This greatly simplifies the process of model specification, avoiding the tedious and error prone task of manually enumerating all species and reactions that can potentially exist in a system. From a simulation perspective, rule-based models can be expanded algorithmically into fully-enumerated reaction networks and simulated using a variety of network-based simulation methods, such as ordinary differential equations or Gillespie's algorithm, provided that the network is not exceedingly large. Alternatively, rule-based models can be simulated directly using particle-based kinetic Monte Carlo methods. This “network-free” approach produces exact stochastic trajectories with a computational cost that is independent of network size. However, memory and run time costs increase with the number of particles, limiting the size of system that can be feasibly simulated. Here, we present a hybrid particle/population simulation method that combines the best attributes of both the network-based and network-free approaches. The method takes as input a rule-based model and a user-specified subset of species to treat as population variables rather than as particles. The model is then transformed by a process of “partial network expansion” into a dynamically equivalent form that can be simulated using a population-adapted network-free simulator. The transformation method has been implemented within the open-source rule-based modeling platform BioNetGen, and resulting hybrid models can be simulated using the particle-based simulator NFsim. Performance tests show that significant memory

  18. Exact hybrid particle/population simulation of rule-based models of biochemical systems.

    PubMed

    Hogg, Justin S; Harris, Leonard A; Stover, Lori J; Nair, Niketh S; Faeder, James R

    2014-04-01

    Detailed modeling and simulation of biochemical systems is complicated by the problem of combinatorial complexity, an explosion in the number of species and reactions due to myriad protein-protein interactions and post-translational modifications. Rule-based modeling overcomes this problem by representing molecules as structured objects and encoding their interactions as pattern-based rules. This greatly simplifies the process of model specification, avoiding the tedious and error prone task of manually enumerating all species and reactions that can potentially exist in a system. From a simulation perspective, rule-based models can be expanded algorithmically into fully-enumerated reaction networks and simulated using a variety of network-based simulation methods, such as ordinary differential equations or Gillespie's algorithm, provided that the network is not exceedingly large. Alternatively, rule-based models can be simulated directly using particle-based kinetic Monte Carlo methods. This "network-free" approach produces exact stochastic trajectories with a computational cost that is independent of network size. However, memory and run time costs increase with the number of particles, limiting the size of system that can be feasibly simulated. Here, we present a hybrid particle/population simulation method that combines the best attributes of both the network-based and network-free approaches. The method takes as input a rule-based model and a user-specified subset of species to treat as population variables rather than as particles. The model is then transformed by a process of "partial network expansion" into a dynamically equivalent form that can be simulated using a population-adapted network-free simulator. The transformation method has been implemented within the open-source rule-based modeling platform BioNetGen, and resulting hybrid models can be simulated using the particle-based simulator NFsim. Performance tests show that significant memory savings

  19. Setting conservation management thresholds using a novel participatory modeling approach.

    PubMed

    Addison, P F E; de Bie, K; Rumpff, L

    2015-10-01

    We devised a participatory modeling approach for setting management thresholds that show when management intervention is required to address undesirable ecosystem changes. This approach was designed to be used when management thresholds: must be set for environmental indicators in the face of multiple competing objectives; need to incorporate scientific understanding and value judgments; and will be set by participants with limited modeling experience. We applied our approach to a case study where management thresholds were set for a mat-forming brown alga, Hormosira banksii, in a protected area management context. Participants, including management staff and scientists, were involved in a workshop to test the approach, and set management thresholds to address the threat of trampling by visitors to an intertidal rocky reef. The approach involved trading off the environmental objective, to maintain the condition of intertidal reef communities, with social and economic objectives to ensure management intervention was cost-effective. Ecological scenarios, developed using scenario planning, were a key feature that provided the foundation for where to set management thresholds. The scenarios developed represented declines in percent cover of H. banksii that may occur under increased threatening processes. Participants defined 4 discrete management alternatives to address the threat of trampling and estimated the effect of these alternatives on the objectives under each ecological scenario. A weighted additive model was used to aggregate participants' consequence estimates. Model outputs (decision scores) clearly expressed uncertainty, which can be considered by decision makers and used to inform where to set management thresholds. This approach encourages a proactive form of conservation, where management thresholds and associated actions are defined a priori for ecological indicators, rather than reacting to unexpected ecosystem changes in the future.
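
    A weighted additive aggregation of consequence estimates can be illustrated in a few lines (the alternatives, scores and weights below are invented placeholders, not the values elicited in the workshop):

      # Weighted additive aggregation of consequence scores across objectives.
      import numpy as np

      consequences = np.array([          # rows: management alternatives, cols: objectives
          [0.9, 0.2, 0.1],               # close the site to visitors
          [0.7, 0.6, 0.4],               # restrict access at low tide
          [0.5, 0.8, 0.7],               # signage and education only
          [0.2, 1.0, 1.0],               # no intervention
      ])                                 # cols: ecological condition, visitor access, cost-effectiveness
      weights = np.array([0.5, 0.3, 0.2])          # objective weights summing to 1

      decision_scores = consequences @ weights     # weighted additive value per alternative
      print(decision_scores, "-> preferred alternative:", int(np.argmax(decision_scores)))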

  20. Modelling population distribution using remote sensing imagery and location-based data

    NASA Astrophysics Data System (ADS)

    Song, J.; Prishchepov, A. V.

    2017-12-01

    Detailed spatial distribution of population density is essential for city studies such as urban planning, environmental pollution and city emergency management, and even for estimating pressure on the environment and human exposure and risks to health. However, most studies have used census data, as detailed dynamic population distributions are difficult to acquire, especially in microscale research. This research describes a method using remote sensing imagery and location-based data to model population distribution at the functional zone level. Firstly, urban functional zones within a city were mapped by high-resolution remote sensing images and POIs. The workflow of functional zone extraction includes five parts: (1) urban land use classification; (2) segmenting images in the built-up area; (3) identification of functional segments by POIs; (4) identification of functional blocks by functional segmentation and weight coefficients; (5) assessing accuracy by validation points. The result is shown in Fig. 1. Secondly, we applied ordinary least squares (OLS) and geographically weighted regression (GWR) to assess the spatially nonstationary relationship between light digital number (DN) and population density at sampling points. The two methods were employed to predict the population distribution over the research area. The R² of the GWR model was on the order of 0.7 and typically captured significant variations over the region that the traditional OLS model did not. The result is shown in Fig. 2. Validation with sampling points of population density demonstrated that the result predicted by the GWR model correlated well with light value. The result is shown in Fig. 3. Results showed: (1) population density is not linearly correlated with light brightness using a global model; (2) VIIRS night-time light data can estimate population density when integrating functional zones at the city level; (3) GWR is a robust model for mapping population distribution, as the adjusted R² of the corresponding GWR models was higher than that of the optimal OLS models

  1. Codimension-1 Sliding Bifurcations of a Filippov Pest Growth Model with Threshold Policy

    NASA Astrophysics Data System (ADS)

    Tang, Sanyi; Tang, Guangyao; Qin, Wenjie

    A Filippov system is proposed to describe stage-structured nonsmooth pest growth with threshold policy control (TPC). The TPC measure is represented by the total density of both juveniles and adults being chosen as an index for decisions on when to implement chemical control strategies. The proposed Filippov system can have three pieces of sliding segments and three pseudo-equilibria, which result in rich sliding mode bifurcations and local sliding bifurcations including boundary node (boundary focus, or boundary saddle) and tangency bifurcations. As the threshold density varies, the model sequentially exhibits interesting global sliding bifurcations: touching → buckling → crossing → sliding homoclinic orbit to a pseudo-saddle → crossing → touching bifurcations. In particular, bifurcation of a homoclinic orbit to a pseudo-saddle with a figure-of-eight shape, to a pseudo-saddle-node or to a standard saddle-node has been observed for some parameter sets. This implies that control outcomes are sensitive to the threshold level, and hence it is crucial to choose the threshold level at which the control strategy is initiated. One more sliding segment (or pseudo-equilibrium) is induced by a switching policy guided by the total density of the population, compared with a policy guided only by the juvenile density, implying that this control policy is more effective in terms of preventing multiple pest outbreaks or causing the density of pests to stabilize at a desired level such as an economic threshold.

  2. Integrating physiological threshold experiments with climate modeling to project mangrove species' range expansion.

    PubMed

    Cavanaugh, Kyle C; Parker, John D; Cook-Patton, Susan C; Feller, Ilka C; Williams, A Park; Kellner, James R

    2015-05-01

    Predictions of climate-related shifts in species ranges have largely been based on correlative models. Due to limitations of these models, there is a need for more integration of experimental approaches when studying impacts of climate change on species distributions. Here, we used controlled experiments to identify physiological thresholds that control poleward range limits of three species of mangroves found in North America. We found that all three species exhibited a threshold response to extreme cold, but freeze tolerance thresholds varied among species. From these experiments, we developed a climate metric, freeze degree days (FDD), which incorporates both the intensity and the frequency of freezes. When included in distribution models, FDD accurately predicted mangrove presence/absence. Using 28 years of satellite imagery, we linked FDD to observed changes in mangrove abundance in Florida, further exemplifying the importance of extreme cold. We then used downscaled climate projections of FDD to project that these range limits will move northward by 2.2-3.2 km per year over the next 50 years. © 2014 John Wiley & Sons Ltd.
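
    One plausible way to compute a freeze-degree-days style metric from a daily minimum temperature record (the paper's exact FDD definition is not reproduced here; this sketch simply sums degrees below 0 °C across freezing days, which combines intensity and frequency):

      # Plausible freeze-degree-days computation (assumed formulation).
      import numpy as np

      def freeze_degree_days(daily_tmin_celsius):
          t = np.asarray(daily_tmin_celsius, dtype=float)
          degrees_below = np.clip(-t, a_min=0.0, a_max=None)   # 0 on non-freezing days
          return degrees_below.sum()

      print(freeze_degree_days([3.0, -1.5, -4.0, 0.5, -0.5]))  # -> 6.0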

  3. Modeling the Population Dynamics of Antibiotic-Resistant Bacteria: An Agent-Based Approach

    NASA Astrophysics Data System (ADS)

    Murphy, James T.; Walshe, Ray; Devocelle, Marc

    The response of bacterial populations to antibiotic treatment is often a function of a diverse range of interacting factors. In order to develop strategies to minimize the spread of antibiotic resistance in pathogenic bacteria, a sound theoretical understanding of the systems of interactions taking place within a colony must be developed. The agent-based approach to modeling bacterial populations is a useful tool for relating data obtained at the molecular and cellular level with the overall population dynamics. Here we demonstrate an agent-based model, called Micro-Gen, which has been developed to simulate the growth and development of bacterial colonies in culture. The model also incorporates biochemical rules and parameters describing the kinetic interactions of bacterial cells with antibiotic molecules. Simulations were carried out to replicate the development of methicillin-resistant S. aureus (MRSA) colonies growing in the presence of antibiotics. The model was explored to see how the properties of the system emerge from the interactions of the individual bacterial agents in order to achieve a better mechanistic understanding of the population dynamics taking place. Micro-Gen provides a good theoretical framework for investigating the effects of local environmental conditions and cellular properties on the response of bacterial populations to antibiotic exposure in the context of a simulated environment.

  4. Model-Based Individualized Treatment of Chemotherapeutics: Bayesian Population Modeling and Dose Optimization

    PubMed Central

    Jayachandran, Devaraj; Laínez-Aguirre, José; Rundell, Ann; Vik, Terry; Hannemann, Robert; Reklaitis, Gintaras; Ramkrishna, Doraiswami

    2015-01-01

    6-Mercaptopurine (6-MP) is one of the key drugs in the treatment of many pediatric cancers, autoimmune diseases and inflammatory bowel disease. 6-MP is a prodrug, converted to an active metabolite 6-thioguanine nucleotide (6-TGN) through an enzymatic reaction involving thiopurine methyltransferase (TPMT). Pharmacogenomic variation observed in the TPMT enzyme produces a significant variation in drug response among the patient population. Despite 6-MP's widespread use and observed variation in treatment response, efforts at quantitative optimization of dose regimens for individual patients are limited. In addition, research efforts devoted to pharmacogenomics to predict clinical responses are proving far from ideal. In this work, we present a Bayesian population modeling approach to develop a pharmacological model for 6-MP metabolism in humans. In the face of scarcity of data in clinical settings, a model reduction approach based on global sensitivity analysis is used to minimize the parameter space. For accurate estimation of sensitive parameters, robust optimal experimental design based on D-optimality criteria was exploited. With the patient-specific model, a model predictive control algorithm is used to optimize the dose scheduling with the objective of maintaining the 6-TGN concentration within its therapeutic window. More importantly, for the first time, we show how the incorporation of information from different levels of the biological chain of response (i.e. gene expression, enzyme phenotype, drug phenotype) plays a critical role in determining the uncertainty in predicting the therapeutic target. The model and the control approach can be utilized in the clinical setting to individualize 6-MP dosing based on the patient's ability to metabolize the drug instead of the traditional standard-dose-for-all approach. PMID:26226448

  5. Transient dynamics of NbOx threshold switches explained by Poole-Frenkel based thermal feedback mechanism

    NASA Astrophysics Data System (ADS)

    Wang, Ziwen; Kumar, Suhas; Nishi, Yoshio; Wong, H.-S. Philip

    2018-05-01

    Niobium oxide (NbOx) two-terminal threshold switches are potential candidates as selector devices in crossbar memory arrays and as building blocks for neuromorphic systems. However, the physical mechanism of NbOx threshold switches is still under debate. In this paper, we show that a thermal feedback mechanism based on Poole-Frenkel conduction can explain both the quasi-static and the transient electrical characteristics that are experimentally observed for NbOx threshold switches, providing strong support for the validity of this mechanism. Furthermore, a clear picture of the transient dynamics during the thermal-feedback-induced threshold switching is presented, providing useful insights required to model nonlinear devices where thermal feedback is important.

  6. Threshold Evaluation of Emergency Risk Communication for Health Risks Related to Hazardous Ambient Temperature.

    PubMed

    Liu, Yang; Hoppe, Brenda O; Convertino, Matteo

    2018-04-10

    Emergency risk communication (ERC) programs that activate when the ambient temperature is expected to cross certain extreme thresholds are widely used to manage relevant public health risks. In practice, however, the effectiveness of these thresholds has rarely been examined. The goal of this study is to test if the activation criteria based on extreme temperature thresholds, both cold and heat, capture elevated health risks for all-cause and cause-specific mortality and morbidity in the Minneapolis-St. Paul Metropolitan Area. A distributed lag nonlinear model (DLNM) combined with a quasi-Poisson generalized linear model is used to derive the exposure-response functions between daily maximum heat index and mortality (1998-2014) and morbidity (emergency department visits; 2007-2014). Specific causes considered include cardiovascular, respiratory, renal diseases, and diabetes. Six extreme temperature thresholds, corresponding to 1st-3rd and 97th-99th percentiles of local exposure history, are examined. All six extreme temperature thresholds capture significantly increased relative risks for all-cause mortality and morbidity. However, the cause-specific analyses reveal heterogeneity. Extreme cold thresholds capture increased mortality and morbidity risks for cardiovascular and respiratory diseases and extreme heat thresholds for renal disease. Percentile-based extreme temperature thresholds are appropriate for initiating ERC targeting the general population. Tailoring ERC by specific causes may protect some but not all individuals with health conditions exacerbated by hazardous ambient temperature exposure. © 2018 Society for Risk Analysis.

  7. An Individual-Based Model of the Evolution of Pesticide Resistance in Heterogeneous Environments: Control of Meligethes aeneus Population in Oilseed Rape Crops

    PubMed Central

    Stratonovitch, Pierre; Elias, Jan; Denholm, Ian; Slater, Russell; Semenov, Mikhail A.

    2014-01-01

    Preventing a pest population from damaging an agricultural crop and, at the same time, preventing the development of pesticide resistance is a major challenge in crop protection. Understanding how farming practices and environmental factors interact with pest characteristics to influence the spread of resistance is a difficult and complex task. It is extremely challenging to investigate such interactions experimentally at realistic spatial and temporal scales. Mathematical modelling and computer simulation have, therefore, been used to analyse resistance evolution and to evaluate potential resistance management tactics. Of the many modelling approaches available, individual-based modelling of a pest population offers most flexibility to include and analyse numerous factors and their interactions. Here, a pollen beetle (Meligethes aeneus) population was modelled as an aggregate of individual insects inhabiting a spatially heterogeneous landscape. The development of the pest and host crop (oilseed rape) was driven by climatic variables. The agricultural land of the landscape was managed by farmers applying a specific rotation and crop protection strategy. The evolution of a single resistance allele to the pyrethroid lambda cyhalothrin was analysed for different combinations of crop management practices and for a recessive, intermediate and dominant resistance allele. While the spread of a recessive resistance allele was severely constrained, intermediate or dominant resistance alleles showed a similar response to the management regime imposed. Calendar treatments applied irrespective of pest density accelerated the development of resistance compared to ones applied in response to prescribed pest density thresholds. A greater proportion of spring-sown oilseed rape was also found to increase the speed of resistance as it increased the period of insecticide exposure. Our study demonstrates the flexibility and power of an individual-based model to simulate how farming

  8. An individual-based model of the evolution of pesticide resistance in heterogeneous environments: control of Meligethes aeneus population in oilseed rape crops.

    PubMed

    Stratonovitch, Pierre; Elias, Jan; Denholm, Ian; Slater, Russell; Semenov, Mikhail A

    2014-01-01

    Preventing a pest population from damaging an agricultural crop and, at the same time, preventing the development of pesticide resistance is a major challenge in crop protection. Understanding how farming practices and environmental factors interact with pest characteristics to influence the spread of resistance is a difficult and complex task. It is extremely challenging to investigate such interactions experimentally at realistic spatial and temporal scales. Mathematical modelling and computer simulation have, therefore, been used to analyse resistance evolution and to evaluate potential resistance management tactics. Of the many modelling approaches available, individual-based modelling of a pest population offers most flexibility to include and analyse numerous factors and their interactions. Here, a pollen beetle (Meligethes aeneus) population was modelled as an aggregate of individual insects inhabiting a spatially heterogeneous landscape. The development of the pest and host crop (oilseed rape) was driven by climatic variables. The agricultural land of the landscape was managed by farmers applying a specific rotation and crop protection strategy. The evolution of a single resistance allele to the pyrethroid lambda cyhalothrin was analysed for different combinations of crop management practices and for a recessive, intermediate and dominant resistance allele. While the spread of a recessive resistance allele was severely constrained, intermediate or dominant resistance alleles showed a similar response to the management regime imposed. Calendar treatments applied irrespective of pest density accelerated the development of resistance compared to ones applied in response to prescribed pest density thresholds. A greater proportion of spring-sown oilseed rape was also found to increase the speed of resistance as it increased the period of insecticide exposure. Our study demonstrates the flexibility and power of an individual-based model to simulate how farming

  9. Flood and landslide warning based on rainfall thresholds and soil moisture indexes: the HEWS (Hydrohazards Early Warning System) for Sicily

    NASA Astrophysics Data System (ADS)

    Brigandì, Giuseppina; Tito Aronica, Giuseppe; Bonaccorso, Brunella; Gueli, Roberto; Basile, Giuseppe

    2017-09-01

    The main focus of the paper is to present a flood and landslide early warning system, named HEWS (Hydrohazards Early Warning System), specifically developed for the Civil Protection Department of Sicily and based on the combined use of rainfall thresholds, soil moisture modelling and quantitative precipitation forecasts (QPF). The warning system covers 9 different Alert Zones into which Sicily has been divided and is based on a threshold system of three increasing critical levels: ordinary, moderate and high. In this system, for early flood warning, a Soil Moisture Accounting (SMA) model provides daily soil moisture conditions, which allow a specific set of three rainfall thresholds to be selected, one for each critical level considered, to be used for issuing the alert bulletin. Wetness indexes, representative of the soil moisture conditions of a catchment, are calculated using a simple, spatially lumped rainfall-streamflow model, based on the SCS-CN method and on the unit hydrograph approach, that requires daily observed and/or predicted rainfall and temperature data as input. For the calibration of this model, daily continuous time series of rainfall, streamflow and air temperature data are used. An event-based lumped rainfall-runoff model has instead been used for the derivation of the rainfall thresholds for each catchment in Sicily characterised by an area larger than 50 km2. In particular, a Kinematic Instantaneous Unit Hydrograph based lumped rainfall-runoff model with the SCS-CN routine for net rainfall was developed for this purpose. For rainfall-induced shallow landslide warning, empirical rainfall thresholds provided by Gariano et al. (2015) have been included in the system. They were derived on an empirical basis starting from a catalogue of 265 shallow landslides in Sicily in the period 2002-2012. Finally, the Delft-FEWS operational forecasting platform has been applied to link input data, the SMA model and the rainfall threshold models to produce
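
    The SCS-CN relation at the core of the SMA and rainfall-runoff components is a standard formula and can be written compactly (the curve number and rainfall values below are illustrative; the HEWS calibration and unit hydrograph routing are not reproduced):

      # Standard SCS Curve Number runoff relation.
      def scs_cn_runoff(rain_mm, curve_number, ia_ratio=0.2):
          """Event runoff depth (mm) from event rainfall depth via the SCS-CN method."""
          s = 25400.0 / curve_number - 254.0      # potential maximum retention (mm)
          ia = ia_ratio * s                       # initial abstraction (mm)
          if rain_mm <= ia:
              return 0.0
          return (rain_mm - ia) ** 2 / (rain_mm - ia + s)

      print(scs_cn_runoff(rain_mm=80.0, curve_number=75))   # ~27 mm of runoff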

  10. Impact of external sources of infection on the dynamics of bovine tuberculosis in modelled badger populations.

    PubMed

    Hardstaff, Joanne L; Bulling, Mark T; Marion, Glenn; Hutchings, Michael R; White, Piran C L

    2012-06-27

    The persistence of bovine TB (bTB) in various countries throughout the world is enhanced by the existence of wildlife hosts for the infection. In Britain and Ireland, the principal wildlife host for bTB is the badger (Meles meles). The objective of our study was to examine the dynamics of bTB in badgers in relation to both badger-derived infection from within the population and externally-derived, trickle-type, infection, such as could occur from other species or environmental sources, using a spatial stochastic simulation model. The presence of external sources of infection can increase mean prevalence and reduce the threshold group size for disease persistence. Above the threshold equilibrium group size of 6-8 individuals predicted by the model for bTB persistence in badgers based on internal infection alone, external sources of infection have relatively little impact on the persistence or level of disease. However, within a critical range of group sizes just below this threshold level, external infection becomes much more important in determining disease dynamics. Within this critical range, external infection increases the ratio of intra- to inter-group infections due to the greater probability of external infections entering fully-susceptible groups. The effect is to enable bTB persistence and increase bTB prevalence in badger populations which would not be able to maintain bTB based on internal infection alone. External sources of bTB infection can contribute to the persistence of bTB in badger populations. In high-density badger populations, internal badger-derived infections occur at a sufficient rate that the additional effect of external sources in exacerbating disease is minimal. However, in lower-density populations, external sources of infection are much more important in enhancing bTB prevalence and persistence. In such circumstances, it is particularly important that control strategies to reduce bTB in badgers include efforts to minimise such

  11. Impact of external sources of infection on the dynamics of bovine tuberculosis in modelled badger populations

    PubMed Central

    2012-01-01

    Background The persistence of bovine TB (bTB) in various countries throughout the world is enhanced by the existence of wildlife hosts for the infection. In Britain and Ireland, the principal wildlife host for bTB is the badger (Meles meles). The objective of our study was to examine the dynamics of bTB in badgers in relation to both badger-derived infection from within the population and externally-derived, trickle-type, infection, such as could occur from other species or environmental sources, using a spatial stochastic simulation model. Results The presence of external sources of infection can increase mean prevalence and reduce the threshold group size for disease persistence. Above the threshold equilibrium group size of 6–8 individuals predicted by the model for bTB persistence in badgers based on internal infection alone, external sources of infection have relatively little impact on the persistence or level of disease. However, within a critical range of group sizes just below this threshold level, external infection becomes much more important in determining disease dynamics. Within this critical range, external infection increases the ratio of intra- to inter-group infections due to the greater probability of external infections entering fully-susceptible groups. The effect is to enable bTB persistence and increase bTB prevalence in badger populations which would not be able to maintain bTB based on internal infection alone. Conclusions External sources of bTB infection can contribute to the persistence of bTB in badger populations. In high-density badger populations, internal badger-derived infections occur at a sufficient rate that the additional effect of external sources in exacerbating disease is minimal. However, in lower-density populations, external sources of infection are much more important in enhancing bTB prevalence and persistence. In such circumstances, it is particularly important that control strategies to reduce bTB in badgers include

  12. Demand for Colonoscopy in Colorectal Cancer Screening Using a Quantitative Fecal Immunochemical Test and Age/Sex-Specific Thresholds for Test Positivity.

    PubMed

    Chen, Sam Li-Sheng; Hsu, Chen-Yang; Yen, Amy Ming-Fang; Young, Graeme P; Chiu, Sherry Yueh-Hsia; Fann, Jean Ching-Yuan; Lee, Yi-Chia; Chiu, Han-Mo; Chiou, Shu-Ti; Chen, Hsiu-Hsi

    2018-06-01

    Background: Despite age and sex differences in fecal hemoglobin (f-Hb) concentrations, most fecal immunochemical test (FIT) screening programs use population-average cut-points for test positivity. The impact of age/sex-specific thresholds on FIT accuracy and colonoscopy demand for colorectal cancer screening is unknown. Methods: Using data from 723,113 participants enrolled in a Taiwanese population-based colorectal cancer screening with a single FIT between 2004 and 2009, sensitivity and specificity were estimated for various f-Hb thresholds for test positivity. This included estimates based on a "universal" threshold, a receiver-operating-characteristic-curve-derived threshold, targeted sensitivity, targeted false-positive rate, and a colonoscopy-capacity-adjusted method integrating colonoscopy workload with and without age/sex adjustments. Results: Optimal age/sex-specific thresholds were found to be equal to or lower than the universal 20 μg Hb/g threshold. For older males, a higher threshold (24 μg Hb/g) was identified using a 5% false-positive rate. Importantly, a nonlinear relationship was observed between sensitivity and colonoscopy workload, with workload rising disproportionately to sensitivity at 16 μg Hb/g. At this "colonoscopy-capacity-adjusted" threshold, the test positivity (colonoscopy workload) was 4.67% and sensitivity was 79.5%, compared with a lower 4.0% workload and a lower 78.7% sensitivity using 20 μg Hb/g. When constrained on capacity, age/sex-adjusted estimates were generally lower. However, optimizing age/sex-adjusted thresholds increased colonoscopy demand across models by 17% or greater compared with a universal threshold. Conclusions: Age/sex-specific thresholds improve FIT accuracy with modest increases in colonoscopy demand. Impact: Colonoscopy-capacity-adjusted and age/sex-specific f-Hb thresholds may be useful in optimizing individual screening programs based on detection accuracy, population characteristics, and clinical capacity
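
    The sensitivity-versus-workload trade-off behind threshold selection can be sketched by scanning candidate f-Hb cut-points (the f-Hb distributions and prevalence below are synthetic assumptions, not the Taiwanese cohort data):

      # Sensitivity and colonoscopy workload for candidate f-Hb thresholds.
      import numpy as np

      def sensitivity_and_workload(f_hb, has_crc, threshold):
          positive = f_hb >= threshold
          sensitivity = np.sum(positive & has_crc) / max(np.sum(has_crc), 1)
          workload = positive.mean()              # fraction of screenees referred to colonoscopy
          return sensitivity, workload

      rng = np.random.default_rng(3)
      n = 100_000
      has_crc = rng.random(n) < 0.005
      f_hb = np.where(has_crc, rng.lognormal(3.5, 1.0, n), rng.lognormal(0.5, 1.2, n))
      for t in (10, 16, 20, 24):                  # candidate thresholds (ug Hb/g)
          print(t, sensitivity_and_workload(f_hb, has_crc, t))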

  13. Discontinuous non-equilibrium phase transition in a threshold Schloegl model for autocatalysis: Generic two-phase coexistence and metastability

    DOE PAGES

    Wang, Chi-Jen; Liu, Da-Jiang; Evans, James W.

    2015-04-28

    Threshold versions of Schloegl’s model on a lattice, which involve autocatalytic creation and spontaneous annihilation of particles, can provide a simple prototype for discontinuous non-equilibrium phase transitions. These models are equivalent to so-called threshold contact processes. A discontinuous transition between populated and vacuum states can occur when a threshold of N ≥ 2 is selected for the minimum number, N, of neighboring particles required to enable autocatalytic creation at an empty site. Fundamental open questions remain given the lack of a thermodynamic framework for analysis. For a square lattice with N = 2, we show that phase coexistence occurs not at a unique value but for a finite range of particle annihilation rate (the natural control parameter). This generic two-phase coexistence also persists when perturbing the model to allow spontaneous particle creation. Such behavior contrasts with both the Gibbs phase rule for thermodynamic systems and previous analysis of this model. We find metastability near the transition corresponding to a non-zero effective line tension, also contrasting with previously suggested critical behavior. As a result, mean-field type analysis, extended to treat spatially heterogeneous states, further elucidates model behavior.

  14. Critical thresholds in species' responses to landscape structure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    With, K.A.; Crist, T.O.

    1995-12-01

    Critical thresholds are transition ranges across which small changes in spatial pattern produce abrupt shifts in ecological responses. Habitat fragmentation provides a familiar example of a critical threshold. As the landscape becomes dissected into smaller parcels of habitat, landscape connectivity (the functional linkage among habitat patches) may suddenly become disrupted, which may have important consequences for the distribution and persistence of populations. Landscape connectivity depends not only on the abundance and spatial patterning of habitat, but also on the habitat specificity and dispersal abilities of species. Habitat specialists with limited dispersal capabilities presumably have a much lower threshold to habitat fragmentation than highly vagile species, which may perceive the landscape as functionally connected across a greater range of fragmentation severity. To determine where threshold effects in species' responses to landscape structure are likely to occur, a simulation model modified from percolation theory was developed. Our simulations predicted the distributional patterns of populations in different landscape mosaics, which we tested empirically using two grasshopper species (Orthoptera: Acrididae) that occur in the shortgrass prairie of north-central Colorado. The distribution of these two species in this grassland mosaic matched the predictions from our simulations. By providing quantitative predictions of threshold effects, this modelling approach may prove useful in the formulation of conservation strategies and the assessment of land-use changes on species' distributional patterns and persistence.
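
    The percolation-theoretic idea of an abrupt connectivity threshold is easy to reproduce numerically (lattice size, replicate count and the spanning-cluster criterion below are choices made for this sketch, not the authors' model):

      # Probability that random habitat spans the landscape, as a function of cover p;
      # it collapses near the site-percolation threshold (~0.593 for 4-neighbour lattices).
      import numpy as np
      from scipy import ndimage

      def spans(habitat):
          labels, _ = ndimage.label(habitat)                  # 4-neighbour clusters
          top, bottom = set(labels[0, :]) - {0}, set(labels[-1, :]) - {0}
          return bool(top & bottom)                           # some cluster touches both edges

      def connectivity_probability(p, size=100, reps=50, seed=4):
          rng = np.random.default_rng(seed)
          return sum(spans(rng.random((size, size)) < p) for _ in range(reps)) / reps

      for p in (0.50, 0.55, 0.60, 0.65, 0.70):
          print(p, connectivity_probability(p))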

  15. Determination of simple thresholds for accelerometry-based parameters for fall detection.

    PubMed

    Kangas, Maarit; Konttila, Antti; Winblad, Ilkka; Jämsä, Timo

    2007-01-01

    The growing population of elderly people mainly lives in a home-dwelling environment and needs applications to support their independence and safety. Falls are one of the major health risks that affect the quality of life among older adults. Body-attached accelerometers have been used to detect falls. The placement of the accelerometric sensor as well as the fall detection algorithms are still under investigation. The aim of the present pilot study was to determine acceleration thresholds for fall detection, using triaxial accelerometric measurements at the waist, wrist, and head. Intentional falls (forward, backward, and lateral) and activities of daily living (ADL) were performed by two volunteer subjects. The results showed that measurements from the waist and head have the potential to distinguish between falls and ADL. In particular, when the simple threshold-based detection was combined with posture detection after the fall, the sensitivity and specificity of fall detection were up to 100%. In contrast, the wrist did not appear to be an optimal site for fall detection.
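
    A simplified threshold-plus-posture detector of the kind described can be sketched as follows (the 3 g impact threshold, the lying-posture criterion and the synthetic signal are placeholders, not the thresholds determined in the study):

      # Threshold-plus-posture fall detector sketch (placeholder thresholds).
      import numpy as np

      G = 9.81

      def detect_fall(acc_xyz, fs=50, impact_threshold=3.0 * G,
                      lying_window_s=2.0, lying_threshold=0.4 * G):
          """acc_xyz: (n, 3) waist accelerometer samples in m/s^2, z nominally vertical."""
          acc = np.asarray(acc_xyz, dtype=float)
          magnitude = np.linalg.norm(acc, axis=1)
          win = int(lying_window_s * fs)
          for i in np.flatnonzero(magnitude > impact_threshold):
              after = acc[i + win: i + 2 * win]               # posture a short while after impact
              if len(after) and np.abs(after[:, 2]).mean() < lying_threshold:
                  return True                                 # impact followed by lying posture
          return False

      standing = np.tile([0.0, 0.0, G], (100, 1))
      impact = np.tile([0.0, 0.0, 5 * G], (5, 1))
      lying = np.tile([G, 0.0, 0.0], (300, 1))
      print(detect_fall(np.vstack([standing, impact, lying])))   # -> True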

  16. Landscape composition creates a threshold influencing Lesser Prairie-Chicken population resilience to extreme drought

    USGS Publications Warehouse

    Ross, Beth E.; Haukos, David A.; Hagen, Christian A.; Pitman, James C.

    2016-01-01

    Habitat loss and degradation compound the effects of climate change on wildlife, yet responses to climate and land cover change are often quantified independently. The interaction between climate and land cover change could be intensified in the Great Plains region where grasslands are being converted to row-crop agriculture concurrent with increased frequency of extreme drought events. We quantified the combined effects of land cover and climate change on a species of conservation concern in the Great Plains, the Lesser Prairie-Chicken (Tympanuchus pallidicinctus). We combined extreme drought events and land cover change with lek count surveys in a Bayesian hierarchical model to quantify changes in abundance of male Lesser Prairie-Chickens from 1978 to 2014 in Kansas, the core of their species range. Our estimates of abundance indicate a gradually decreasing population through 2010 corresponding to drought events and reduced grassland areas. Decreases in Lesser Prairie-Chicken abundance were greatest in areas with increasing row-crop to grassland land cover ratio during extreme drought events, and decreased grassland reduces the resilience of Lesser Prairie-Chicken populations to extreme drought events. A threshold exists for Lesser Prairie-Chickens in response to the gradient of cropland:grassland land cover. When moving across the gradient of grassland to cropland, abundance initially increased in response to more cropland on the landscape, but declined in response to more cropland after the threshold (δ=0.096, or 9.6% cropland). Preservation of intact grasslands and continued implementation of initiatives to revert cropland to grassland should increase Lesser Prairie-Chicken resilience to extreme drought events due to climate change.

  17. Lowered threshold energy for femtosecond laser induced optical breakdown in a water based eye model by aberration correction with adaptive optics.

    PubMed

    Hansen, Anja; Géneaux, Romain; Günther, Axel; Krüger, Alexander; Ripken, Tammo

    2013-06-01

    In femtosecond laser ophthalmic surgery tissue dissection is achieved by photodisruption based on laser induced optical breakdown. In order to minimize collateral damage to the eye laser surgery systems should be optimized towards the lowest possible energy threshold for photodisruption. However, optical aberrations of the eye and the laser system distort the irradiance distribution from an ideal profile which causes a rise in breakdown threshold energy even if great care is taken to minimize the aberrations of the system during design and alignment. In this study we used a water chamber with an achromatic focusing lens and a scattering sample as eye model and determined breakdown threshold in single pulse plasma transmission loss measurements. Due to aberrations, the precise lower limit for breakdown threshold irradiance in water is still unknown. Here we show that the threshold energy can be substantially reduced when using adaptive optics to improve the irradiance distribution by spatial beam shaping. We found that for initial aberrations with a root-mean-square wave front error of only one third of the wavelength the threshold energy can still be reduced by a factor of three if the aberrations are corrected to the diffraction limit by adaptive optics. The transmitted pulse energy is reduced by 17% at twice the threshold. Furthermore, the gas bubble motions after breakdown for pulse trains at 5 kilohertz repetition rate show a more transverse direction in the corrected case compared to the more spherical distribution without correction. Our results demonstrate how both applied and transmitted pulse energy could be reduced during ophthalmic surgery when correcting for aberrations. As a consequence, the risk of retinal damage by transmitted energy and the extent of collateral damage to the focal volume could be minimized accordingly when using adaptive optics in fs-laser surgery.

  18. Lowered threshold energy for femtosecond laser induced optical breakdown in a water based eye model by aberration correction with adaptive optics

    PubMed Central

    Hansen, Anja; Géneaux, Romain; Günther, Axel; Krüger, Alexander; Ripken, Tammo

    2013-01-01

    In femtosecond laser ophthalmic surgery tissue dissection is achieved by photodisruption based on laser induced optical breakdown. In order to minimize collateral damage to the eye laser surgery systems should be optimized towards the lowest possible energy threshold for photodisruption. However, optical aberrations of the eye and the laser system distort the irradiance distribution from an ideal profile which causes a rise in breakdown threshold energy even if great care is taken to minimize the aberrations of the system during design and alignment. In this study we used a water chamber with an achromatic focusing lens and a scattering sample as eye model and determined breakdown threshold in single pulse plasma transmission loss measurements. Due to aberrations, the precise lower limit for breakdown threshold irradiance in water is still unknown. Here we show that the threshold energy can be substantially reduced when using adaptive optics to improve the irradiance distribution by spatial beam shaping. We found that for initial aberrations with a root-mean-square wave front error of only one third of the wavelength the threshold energy can still be reduced by a factor of three if the aberrations are corrected to the diffraction limit by adaptive optics. The transmitted pulse energy is reduced by 17% at twice the threshold. Furthermore, the gas bubble motions after breakdown for pulse trains at 5 kilohertz repetition rate show a more transverse direction in the corrected case compared to the more spherical distribution without correction. Our results demonstrate how both applied and transmitted pulse energy could be reduced during ophthalmic surgery when correcting for aberrations. As a consequence, the risk of retinal damage by transmitted energy and the extent of collateral damage to the focal volume could be minimized accordingly when using adaptive optics in fs-laser surgery. PMID:23761849

  19. Constructing financial network based on PMFG and threshold method

    NASA Astrophysics Data System (ADS)

    Nie, Chun-Xiao; Song, Fu-Tie

    2018-04-01

    Based on planar maximally filtered graph (PMFG) and threshold method, we introduced a correlation-based network named PMFG-based threshold network (PTN). We studied the community structure of PTN and applied the ISOMAP algorithm to represent PTN in low-dimensional Euclidean space. The results show that the community corresponds well to the cluster in the Euclidean space. Further, we studied the dynamics of the community structure and constructed the normalized mutual information (NMI) matrix. Based on real market data, we found that the volatility of the market can lead to dramatic changes in the community structure, and that the structure is more stable during the financial crisis.
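
    As a rough illustration of the threshold step in such correlation-based networks, the sketch below builds a network that keeps only strongly correlated pairs; the PMFG filtering itself is omitted, and the returns, the 0.5 cut-off and all other values are illustrative assumptions rather than details from the paper.

      # Hedged sketch: a threshold-filtered correlation network, one ingredient of the
      # PTN construction described above (the PMFG step itself is omitted). The data,
      # the 0.5 cut-off and the network size are illustrative assumptions.
      import numpy as np
      import networkx as nx

      rng = np.random.default_rng(0)
      returns = rng.normal(size=(250, 20))          # 250 days x 20 hypothetical assets
      corr = np.corrcoef(returns, rowvar=False)     # Pearson correlation matrix

      threshold = 0.5                               # illustrative threshold
      G = nx.Graph()
      G.add_nodes_from(range(corr.shape[0]))
      for i in range(corr.shape[0]):
          for j in range(i + 1, corr.shape[0]):
              if abs(corr[i, j]) >= threshold:      # keep only strongly correlated pairs
                  G.add_edge(i, j, weight=corr[i, j])

      print(G.number_of_nodes(), "nodes,", G.number_of_edges(), "edges")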

  20. The threshold of a stochastic delayed SIR epidemic model with vaccination

    NASA Astrophysics Data System (ADS)

    Liu, Qun; Jiang, Daqing

    2016-11-01

    In this paper, we study the threshold dynamics of a stochastic delayed SIR epidemic model with vaccination. We obtain sufficient conditions for extinction and persistence in the mean of the epidemic. The threshold between persistence in the mean and extinction of the stochastic system is also obtained. Compared with the corresponding deterministic model, the threshold affected by the white noise is smaller than the basic reproduction number R̄0 of the deterministic system. Results show that time delay has important effects on the persistence and extinction of the epidemic.
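
    A minimal simulation sketch of the general idea, that noise on the transmission term can push an epidemic that would persist deterministically towards extinction, is given below; it uses a plain Euler-Maruyama scheme and omits the delay and vaccination components of the model studied in the paper, with all parameter values assumed for illustration.

      # Euler-Maruyama sketch of an SIR model with white noise on the transmission
      # rate; the delay and vaccination terms of the paper's model are omitted, and
      # all parameter values are illustrative assumptions.
      import numpy as np

      beta, gamma, sigma = 0.3, 0.2, 0.15   # transmission, recovery, noise intensity
      dt, steps = 0.01, 100_000
      S, I, R = 0.99, 0.01, 0.0

      rng = np.random.default_rng(1)
      for _ in range(steps):
          dW = rng.normal(0.0, np.sqrt(dt))
          new_inf = beta * S * I * dt + sigma * S * I * dW   # perturbed incidence
          rec = gamma * I * dt
          S -= new_inf
          I += new_inf - rec
          R += rec
          I = max(I, 0.0)                                    # keep the state meaningful

      print(f"final infected fraction: {I:.6f}")             # near zero suggests extinction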

  1. Galactic dual population models of gamma-ray bursts

    NASA Technical Reports Server (NTRS)

    Higdon, J. C.; Lingenfelter, R. E.

    1994-01-01

    We investigate in more detail the properties of two-population models for gamma-ray bursts in the galactic disk and halo. We calculate the gamma-ray burst statistical properties ⟨V/V_max⟩, ⟨cos Θ⟩, and ⟨sin² b⟩ as functions of the detection flux threshold for bursts coming from both Galactic disk and massive halo populations. We consider halo models inferred from the observational constraints on the large-scale Galactic structure and we compare the expected values of ⟨V/V_max⟩, ⟨cos Θ⟩, and ⟨sin² b⟩ with those measured by the Burst and Transient Source Experiment (BATSE) and other detectors. We find that the measured values are consistent with solely Galactic populations having a range of halo distributions, mixed with local disk distributions, which can account for as much as approximately 25% of the observed BATSE bursts. M31 does not contribute to these modeled bursts. We also demonstrate, contrary to recent arguments, that the size-frequency distributions of dual population models are quite consistent with the BATSE observations.

  2. A single-index threshold Cox proportional hazard model for identifying a treatment-sensitive subset based on multiple biomarkers.

    PubMed

    He, Ye; Lin, Huazhen; Tu, Dongsheng

    2018-06-04

    In this paper, we introduce a single-index threshold Cox proportional hazard model to select and combine biomarkers to identify patients who may be sensitive to a specific treatment. A penalized smoothed partial likelihood is proposed to estimate the parameters in the model. A simple, efficient, and unified algorithm is presented to maximize this likelihood function. The estimators based on this likelihood function are shown to be consistent and asymptotically normal. Under mild conditions, the proposed estimators also achieve the oracle property. The proposed approach is evaluated through simulation analyses and application to the analysis of data from two clinical trials, one involving patients with locally advanced or metastatic pancreatic cancer and one involving patients with resectable lung cancer. Copyright © 2018 John Wiley & Sons, Ltd.

  3. Threshold secret sharing scheme based on phase-shifting interferometry.

    PubMed

    Deng, Xiaopeng; Shi, Zhengang; Wen, Wei

    2016-11-01

    We propose a new method for secret image sharing with the (3,N) threshold scheme based on phase-shifting interferometry. The secret image, which is multiplied with an encryption key in advance, is first encrypted by using Fourier transformation. Then, the encoded image is shared into N shadow images based on the recording principle of phase-shifting interferometry. Based on the reconstruction principle of phase-shifting interferometry, any three or more shadow images can retrieve the secret image, while any two or fewer shadow images cannot obtain any information of the secret image. Thus, a (3,N) threshold secret sharing scheme can be implemented. Compared with our previously reported method, the algorithm of this paper is suited for not only a binary image but also a gray-scale image. Moreover, the proposed algorithm can obtain a larger threshold value t. Simulation results are presented to demonstrate the feasibility of the proposed method.

  4. An example of population-level risk assessments for small mammals using individual-based population models.

    PubMed

    Schmitt, Walter; Auteri, Domenica; Bastiansen, Finn; Ebeling, Markus; Liu, Chun; Luttik, Robert; Mastitsky, Sergey; Nacci, Diane; Topping, Chris; Wang, Magnus

    2016-01-01

    This article presents a case study demonstrating the application of 3 individual-based, spatially explicit population models (IBMs, also known as agent-based models) in ecological risk assessments to predict long-term effects of a pesticide to populations of small mammals. The 3 IBMs each used a hypothetical fungicide (FungicideX) in different scenarios: spraying in cereals (common vole, Microtus arvalis), spraying in orchards (field vole, Microtus agrestis), and cereal seed treatment (wood mouse, Apodemus sylvaticus). Each scenario used existing model landscapes, which differed greatly in size and structural complexity. The toxicological profile of FungicideX was defined so that the deterministic long-term first tier risk assessment would result in high risk to small mammals, thus providing the opportunity to use the IBMs for risk assessment refinement (i.e., higher tier risk assessment). Despite differing internal model design and scenarios, results indicated in all 3 cases low population sensitivity unless FungicideX was applied at very high (×10) rates. Recovery from local population impacts was generally fast. Only when patch extinctions occurred in simulations of intentionally high acute toxic effects were recovery periods, then determined by recolonization, of any concern. Conclusions include recommendations for the most important input considerations, including the selection of exposure levels, duration of simulations, statistically robust number of replicates, and endpoints to report. However, further investigation and agreement are needed to develop recommendations for landscape attributes such as size, structure, and crop rotation to define appropriate regulatory risk assessment scenarios. Overall, the application of IBMs provides multiple advantages to higher tier ecological risk assessments for small mammals, including consistent and transparent direct links to specific protection goals, and the consideration of more realistic scenarios. © 2015 SETAC.

  5. When is rational to order a diagnostic test, or prescribe treatment: the threshold model as an explanation of practice variation.

    PubMed

    Djulbegovic, Benjamin; van den Ende, Jef; Hamm, Robert M; Mayrhofer, Thomas; Hozo, Iztok; Pauker, Stephen G

    2015-05-01

    The threshold model represents an important advance in the field of medical decision-making. It is a linchpin between evidence (which exists on the continuum of credibility) and decision-making (which is a categorical exercise - we decide to act or not act). The threshold concept is closely related to the question of rational decision-making. When should the physician act, that is order a diagnostic test, or prescribe treatment? The threshold model embodies the decision theoretic rationality that says the most rational decision is to prescribe treatment when the expected treatment benefit outweighs its expected harms. However, the well-documented large variation in the way physicians order diagnostic tests or decide to administer treatments is consistent with a notion that physicians' individual action thresholds vary. We present a narrative review summarizing the existing literature on physicians' use of a threshold strategy for decision-making. We found that the observed variation in decision action thresholds is partially due to the way people integrate benefits and harms. That is, explanation of variation in clinical practice can be reduced to a consideration of thresholds. Limited evidence suggests that non-expected utility threshold (non-EUT) models, such as regret-based and dual-processing models, may explain current medical practice better. However, inclusion of costs and recognition of risk attitudes towards uncertain treatment effects and comorbidities may improve the explanatory and predictive value of the EUT-based threshold models. The decision when to act is closely related to the question of rational choice. We conclude that the medical community has not yet fully defined criteria for rational clinical decision-making. The traditional notion of rationality rooted in EUT may need to be supplemented by reflective rationality, which strives to integrate all aspects of medical practice - medical, humanistic and socio-economic - within a coherent
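
    The expected-utility rationale summarized above is usually written as the classical treatment-threshold formula of Pauker and Kassirer; the notation below is the standard textbook form and is added here for clarity, not quoted from this abstract.

      % Classical treatment threshold from expected-utility reasoning:
      %   B = net benefit of treating a diseased patient,
      %   H = net harm of treating a non-diseased patient,
      %   p = probability of disease.
      \[
        p^{*} = \frac{H}{B + H}, \qquad \text{treat if } p > p^{*}.
      \]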

  6. Phase-change memory: A continuous multilevel compact model of subthreshold conduction and threshold switching

    NASA Astrophysics Data System (ADS)

    Pigot, Corentin; Gilibert, Fabien; Reyboz, Marina; Bocquet, Marc; Zuliani, Paola; Portal, Jean-Michel

    2018-04-01

    Phase-change memory (PCM) compact modeling of the threshold switching based on a thermal runaway in Poole–Frenkel conduction is proposed. Although this approach is often used in physical models, this is the first time it is implemented in a compact model. The model accuracy is validated by a good correlation between simulations and experimental data collected on a PCM cell embedded in a 90 nm technology. A wide range of intermediate states is measured and accurately modeled with a single set of parameters, allowing multilevel programming. A good convergence is exhibited even in snapback simulation owing to this fully continuous approach. Moreover, threshold properties extraction indicates a thermally enhanced switching, which validates the basic hypothesis of the model. Finally, it is shown that this model is compliant with a new drift-resilient cell-state metric. Once enriched with a phase transition module, this compact model is ready to be implemented in circuit simulators.
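
    For orientation, the sketch below evaluates a textbook Poole-Frenkel conduction law of the kind named above; it does not reproduce the paper's compact model or its thermal-runaway switching criterion, and the material parameters are assumptions.

      # Textbook Poole-Frenkel conduction sketch (field-assisted barrier lowering);
      # the compact model in the paper couples this to a thermal-runaway criterion,
      # which is not reproduced here. All values are illustrative assumptions.
      import numpy as np

      q = 1.602e-19            # elementary charge (C)
      k = 1.381e-23            # Boltzmann constant (J/K)
      eps = 16 * 8.854e-12     # assumed permittivity of the amorphous phase (F/m)
      phi_B = 0.3 * q          # assumed trap barrier (J)
      T = 300.0                # temperature (K)

      def pf_current_density(E, prefactor=1e-4):
          """Poole-Frenkel emission current (arbitrary prefactor)."""
          lowering = np.sqrt(q**3 * E / (np.pi * eps))   # field-assisted barrier lowering (J)
          return prefactor * E * np.exp(-(phi_B - lowering) / (k * T))

      for E in (1e6, 5e6, 1e7):                          # electric field in V/m
          print(f"E = {E:.0e} V/m -> J = {pf_current_density(E):.3e} (arb. units)")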

  7. An isochrone data base and a rapid model for stellar population synthesis

    NASA Astrophysics Data System (ADS)

    Li, Zhongmu; Han, Zhanwen

    2008-06-01

    We first presented an isochrone data base that can be widely used for stellar population synthesis studies and colour-magnitude diagram (CMD) fitting. The data base consists of the isochrones of both single-star and binary-star simple stellar populations (ss-SSPs and bs-SSPs). The ranges for the age and metallicity of populations are 0-15 Gyr and 0.0001-0.03, respectively. All data are available for populations with two widely used initial mass functions (IMFs), that is, Salpeter IMF and Chabrier IMF. The uncertainty caused by the data base (about 0.81 per cent) is designed to be smaller than those caused by the Hurley code and widely used stellar spectra libraries (e.g. BaSeL 3.1) when it is used for stellar population synthesis. Based on the isochrone data base, we then built a rapid stellar population synthesis (RPS) model and calculated the high-resolution (0.3-Å) integrated spectral energy distributions, Lick indices and colour indices for bs-SSPs and ss-SSPs. In particular, we calculated the UBVRIJHKLM colours, ugriz colours and some composite colours that consist of magnitudes on different systems. These colours are useful for disentangling the well-known stellar age-metallicity degeneracy according to our previous work. As an example for applying the isochrone data base for CMD fitting, we fitted the CMDs of two star clusters (M67 and NGC1868) and obtained their distance moduli, colour excesses, stellar metallicities and ages. The results showed that the isochrones of bs-SSPs are closer to those of real star clusters. It suggests that we should take the effects of binary interactions into account in stellar population synthesis. We also discussed on the limitations of the application of the isochrone data base and the results of the RPS model. All the data are available at the CDS or on request to the authors. E-mail: zhongmu.li@gmail.com

  8. Simulation models in population breast cancer screening: A systematic review.

    PubMed

    Koleva-Kolarova, Rositsa G; Zhan, Zhuozhao; Greuter, Marcel J W; Feenstra, Talitha L; De Bock, Geertruida H

    2015-08-01

    The aim of this review was to critically evaluate published simulation models for breast cancer screening of the general population and provide a direction for future modeling. A systematic literature search was performed to identify simulation models with more than one application. A framework for qualitative assessment which incorporated model type; input parameters; modeling approach, transparency of input data sources/assumptions, sensitivity analyses and risk of bias; validation, and outcomes was developed. Predicted mortality reduction (MR) and cost-effectiveness (CE) were compared to estimates from meta-analyses of randomized control trials (RCTs) and acceptability thresholds. Seven original simulation models were distinguished, all sharing common input parameters. The modeling approach was based on tumor progression (except one model) with internal and cross validation of the resulting models, but without any external validation. Differences in lead times for invasive or non-invasive tumors, and the option for cancers not to progress, were not explicitly modeled. The models tended to overestimate the MR due to screening (11-24%) as compared with the 10% (95% CI −2% to 21%) MR from optimal RCTs. Only recently, potential harms due to regular breast cancer screening were reported. Most scenarios resulted in acceptable cost-effectiveness estimates given current thresholds. The selected models have been repeatedly applied in various settings to inform decision making and the critical analysis revealed high risk of bias in their outcomes. Given the importance of the models, there is a need for externally validated models which use systematical evidence for input data to allow for more critical evaluation of breast cancer screening. Copyright © 2015 Elsevier Ltd. All rights reserved.

  9. Evaluating an Action Threshold-Based Insecticide Program on Onion Cultivars Varying in Resistance to Onion Thrips (Thysanoptera: Thripidae).

    PubMed

    Nault, Brian A; Huseth, Anders S

    2016-08-01

    Onion thrips, Thrips tabaci Lindeman (Thysanoptera: Thripidae), is a highly destructive pest of onion, Allium cepa L., and its management relies on multiple applications of foliar insecticides. Development of insecticide resistance is common in T. tabaci populations, and new strategies are needed to relax existing levels of insecticide use, but still provide protection against T. tabaci without compromising marketable onion yield. An action threshold-based insecticide program combined with or without a thrips-resistant onion cultivar was investigated as an improved approach for managing T. tabaci infestations in commercial onion fields. Regardless of cultivar type, the average number of insecticide applications needed to manage T. tabaci infestations in the action-threshold based program was 4.3, while the average number of sprays in the standard weekly program was 7.2 (a 40% reduction). The mean percent reduction in numbers of applications following the action threshold treatment in the thrips-resistant onion cultivar, 'Advantage', was 46.7% (range 40-50%) compared with the standard program, whereas the percentage reduction in applications in action threshold treatments in the thrips-susceptible onion cultivar, 'Santana', was 34.3% (range 13-50%) compared with the standard program, suggesting a benefit of the thrips-resistant cultivar. Marketable bulb yields for both 'Advantage' and 'Santana' in the action threshold-based program were nearly identical to those in the standard program, indicating that commercially acceptable bulb yields will be generated with fewer insecticide sprays following an action threshold-based program, saving money, time and benefiting the environment. © The Authors 2016. Published by Oxford University Press on behalf of Entomological Society of America. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  10. Dynamic Multiple-Threshold Call Admission Control Based on Optimized Genetic Algorithm in Wireless/Mobile Networks

    NASA Astrophysics Data System (ADS)

    Wang, Shengling; Cui, Yong; Koodli, Rajeev; Hou, Yibin; Huang, Zhangqin

    Due to the dynamics of topology and resources, Call Admission Control (CAC) plays a significant role for increasing resource utilization ratio and guaranteeing users' QoS requirements in wireless/mobile networks. In this paper, a dynamic multi-threshold CAC scheme is proposed to serve multi-class service in a wireless/mobile network. The thresholds are renewed at the beginning of each time interval to react to the changing mobility rate and network load. To find suitable thresholds, a reward-penalty model is designed, which provides different priorities between different service classes and call types through different reward/penalty policies according to network load and average call arrival rate. To speed up the running time of CAC, an Optimized Genetic Algorithm (OGA) is presented, whose components (encoding, population initialization, fitness function, mutation, etc.) are all optimized in terms of the traits of the CAC problem. The simulation demonstrates that the proposed CAC scheme outperforms similar schemes, indicating that the optimization is effective. Finally, the simulation shows the efficiency of OGA.
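
    A minimal sketch of the genetic-algorithm ingredients named above (encoding, population initialization, fitness function, mutation) is shown below; the fitness function is a toy stand-in rather than the reward-penalty CAC model of the paper, and all numbers are assumptions.

      # Minimal genetic algorithm over a vector of admission thresholds; the fitness
      # function is a toy objective, not the reward-penalty CAC model from the paper.
      import numpy as np

      rng = np.random.default_rng(2)
      n_thresholds, pop_size, generations = 3, 30, 50

      def fitness(thresholds):
          # Toy objective: prefer thresholds close to an assumed target profile,
          # with a penalty whenever the thresholds are not in increasing order.
          target = np.array([0.3, 0.6, 0.9])
          disorder = np.sum(np.maximum(0.0, thresholds[:-1] - thresholds[1:]))
          return -np.sum((thresholds - target) ** 2) - disorder

      population = rng.uniform(0.0, 1.0, size=(pop_size, n_thresholds))   # initialization
      for _ in range(generations):
          scores = np.array([fitness(ind) for ind in population])
          parents = population[np.argsort(scores)[-pop_size // 2:]]        # selection
          children = parents[rng.integers(0, len(parents), pop_size - len(parents))].copy()
          children += rng.normal(0.0, 0.05, children.shape)                # mutation
          population = np.vstack([parents, np.clip(children, 0.0, 1.0)])

      best = population[np.argmax([fitness(ind) for ind in population])]
      print("best threshold vector:", np.round(np.sort(best), 3))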

  11. A threshold method for immunological correlates of protection

    PubMed Central

    2013-01-01

    Background Immunological correlates of protection are biological markers such as disease-specific antibodies which correlate with protection against disease and which are measurable with immunological assays. It is common in vaccine research and in setting immunization policy to rely on threshold values for the correlate where the accepted threshold differentiates between individuals who are considered to be protected against disease and those who are susceptible. Examples where thresholds are used include development of a new generation 13-valent pneumococcal conjugate vaccine which was required in clinical trials to meet accepted thresholds for the older 7-valent vaccine, and public health decision making on vaccination policy based on long-term maintenance of protective thresholds for Hepatitis A, rubella, measles, Japanese encephalitis and others. Despite widespread use of such thresholds in vaccine policy and research, few statistical approaches have been formally developed which specifically incorporate a threshold parameter in order to estimate the value of the protective threshold from data. Methods We propose a 3-parameter statistical model called the a:b model which incorporates parameters for a threshold and constant but different infection probabilities below and above the threshold estimated using profile likelihood or least squares methods. Evaluation of the estimated threshold can be performed by a significance test for the existence of a threshold using a modified likelihood ratio test which follows a chi-squared distribution with 3 degrees of freedom, and confidence intervals for the threshold can be obtained by bootstrapping. The model also permits assessment of relative risk of infection in patients achieving the threshold or not. Goodness-of-fit of the a:b model may be assessed using the Hosmer-Lemeshow approach. The model is applied to 15 datasets from published clinical trials on pertussis, respiratory syncytial virus and varicella. Results
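
    The profile-likelihood idea behind the a:b model can be sketched as a grid search over candidate thresholds, with the two group-wise infection probabilities estimated by their maximum-likelihood values; the data below are simulated for illustration, and the published estimator may differ in detail.

      # Profile-likelihood sketch of a threshold model with constant but different
      # infection probabilities below and above the threshold (simulated data).
      import numpy as np

      rng = np.random.default_rng(3)
      titre = rng.lognormal(mean=1.0, sigma=0.8, size=400)        # assay values
      true_thr = 3.0
      p_inf = np.where(titre < true_thr, 0.4, 0.05)               # a:b structure
      infected = rng.random(400) < p_inf

      def profile_loglik(thr):
          below = titre < thr
          ll = 0.0
          for mask in (below, ~below):
              n, k = mask.sum(), infected[mask].sum()
              if n == 0:
                  continue
              p = min(max(k / n, 1e-9), 1 - 1e-9)                 # MLE of the group probability
              ll += k * np.log(p) + (n - k) * np.log(1 - p)       # binomial log-likelihood
          return ll

      grid = np.linspace(np.percentile(titre, 5), np.percentile(titre, 95), 200)
      best_thr = grid[np.argmax([profile_loglik(t) for t in grid])]
      print(f"estimated threshold: {best_thr:.2f} (true {true_thr})")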

  12. POPULATION-BASED EXPOSURE MODELING FOR AIR POLLUTANTS AT EPA'S NATIONAL EXPOSURE RESEARCH LABORATORY

    EPA Science Inventory

    The US EPA's National Exposure Research Laboratory (NERL) has been developing, applying, and evaluating population-based exposure models to improve our understanding of the variability in personal exposure to air pollutants. Estimates of population variability are needed for E...

  13. High-resolution tide projections reveal extinction threshold in response to sea-level rise.

    PubMed

    Field, Christopher R; Bayard, Trina S; Gjerdrum, Carina; Hill, Jason M; Meiman, Susan; Elphick, Chris S

    2017-05-01

    Sea-level rise will affect coastal species worldwide, but models that aim to predict these effects are typically based on simple measures of sea level that do not capture its inherent complexity, especially variation over timescales shorter than 1 year. Coastal species might be most affected, however, by floods that exceed a critical threshold. The frequency and duration of such floods may be more important to population dynamics than mean measures of sea level. In particular, the potential for changes in the frequency and duration of flooding events to result in nonlinear population responses or biological thresholds merits further research, but may require that models incorporate greater resolution in sea level than is typically used. We created population simulations for a threatened songbird, the saltmarsh sparrow (Ammodramus caudacutus), in a region where sea level is predictable with high accuracy and precision. We show that incorporating the timing of semidiurnal high tide events throughout the breeding season, including how this timing is affected by mean sea-level rise, predicts a reproductive threshold that is likely to cause a rapid demographic shift. This shift is likely to threaten the persistence of saltmarsh sparrows beyond 2060 and could cause extinction as soon as 2035. Neither extinction date nor the population trajectory was sensitive to the emissions scenarios underlying sea-level projections, as most of the population decline occurred before scenarios diverge. Our results suggest that the variation and complexity of climate-driven variables could be important for understanding the potential responses of coastal species to sea-level rise, especially for species that rely on coastal areas for reproduction. © 2016 John Wiley & Sons Ltd.

  14. Methods for estimating population density in data-limited areas: evaluating regression and tree-based models in Peru.

    PubMed

    Anderson, Weston; Guikema, Seth; Zaitchik, Ben; Pan, William

    2014-01-01

    Obtaining accurate small area estimates of population is essential for policy and health planning but is often difficult in countries with limited data. In lieu of available population data, small area estimate models draw information from previous time periods or from similar areas. This study focuses on model-based methods for estimating population when no direct samples are available in the area of interest. To explore the efficacy of tree-based models for estimating population density, we compare six different model structures including Random Forest and Bayesian Additive Regression Trees. Results demonstrate that without information from prior time periods, non-parametric tree-based models produced more accurate predictions than did conventional regression methods. Improving estimates of population density in non-sampled areas is important for regions with incomplete census data and has implications for economic, health and development policies.
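
    As an illustration of one of the tree-based structures compared in the study, the sketch below fits a scikit-learn Random Forest to synthetic covariates; the variables and data are invented stand-ins, not the Peruvian census inputs.

      # Random Forest regression for population density on synthetic covariates;
      # the covariates and the "true" density are stand-ins, not the study's data.
      import numpy as np
      from sklearn.ensemble import RandomForestRegressor
      from sklearn.model_selection import train_test_split
      from sklearn.metrics import mean_absolute_error

      rng = np.random.default_rng(4)
      n = 1000
      X = np.column_stack([
          rng.uniform(0, 5000, n),     # assumed covariate: elevation (m)
          rng.uniform(0, 1, n),        # assumed covariate: fraction of built-up land
          rng.uniform(0, 200, n),      # assumed covariate: distance to nearest road (km)
      ])
      density = 50 + 400 * X[:, 1] - 0.2 * X[:, 2] + rng.normal(0, 10, n)   # synthetic truth

      X_tr, X_te, y_tr, y_te = train_test_split(X, density, random_state=0)
      model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
      print("MAE:", round(mean_absolute_error(y_te, model.predict(X_te)), 1))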

  15. Methods for Estimating Population Density in Data-Limited Areas: Evaluating Regression and Tree-Based Models in Peru

    PubMed Central

    Anderson, Weston; Guikema, Seth; Zaitchik, Ben; Pan, William

    2014-01-01

    Obtaining accurate small area estimates of population is essential for policy and health planning but is often difficult in countries with limited data. In lieu of available population data, small area estimate models draw information from previous time periods or from similar areas. This study focuses on model-based methods for estimating population when no direct samples are available in the area of interest. To explore the efficacy of tree-based models for estimating population density, we compare six different model structures including Random Forest and Bayesian Additive Regression Trees. Results demonstrate that without information from prior time periods, non-parametric tree-based models produced more accurate predictions than did conventional regression methods. Improving estimates of population density in non-sampled areas is important for regions with incomplete census data and has implications for economic, health and development policies. PMID:24992657

  16. Dual-threshold segmentation using Arimoto entropy based on chaotic bee colony optimization

    NASA Astrophysics Data System (ADS)

    Li, Li

    2018-03-01

    In order to extract the target from a complex background more quickly and accurately, and to further improve the detection of defects, a method of dual-threshold segmentation using Arimoto entropy based on chaotic bee colony optimization was proposed. Firstly, the method of single-threshold selection based on Arimoto entropy was extended to dual-threshold selection in order to separate the target from the background more accurately. Then the intermediate variables in the formulae for Arimoto entropy dual-threshold selection were calculated by recursion to eliminate redundant computation and reduce the amount of calculation. Finally, the local search phase of the artificial bee colony algorithm was improved with a chaotic sequence based on tent mapping. The fast search for the two optimal thresholds was achieved using the improved bee colony optimization algorithm, which noticeably accelerates the search. A large number of experimental results show that, compared with existing segmentation methods such as multi-threshold segmentation using maximum Shannon entropy, two-dimensional Shannon entropy segmentation, two-dimensional Tsallis gray entropy segmentation and multi-threshold segmentation using reciprocal gray entropy, the proposed method can segment the target more quickly and accurately, with a superior segmentation effect. It proves to be a fast and effective method for image segmentation.
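
    A brute-force sketch of dual-threshold selection by maximizing a histogram-entropy criterion is given below; Shannon entropy is used as a stand-in for the Arimoto entropy of the paper, and the recursion and chaotic bee-colony acceleration are omitted.

      # Exhaustive dual-threshold selection maximizing the sum of class-wise Shannon
      # entropies (a stand-in for the Arimoto entropy criterion of the paper).
      import numpy as np

      rng = np.random.default_rng(5)
      image = np.concatenate([rng.normal(60, 10, 3000),
                              rng.normal(130, 12, 3000),
                              rng.normal(200, 8, 3000)]).clip(0, 255).astype(np.uint8)
      hist = np.bincount(image.astype(int), minlength=256).astype(float)
      p = hist / hist.sum()

      def class_entropy(prob):
          mass = prob.sum()
          if mass <= 0:
              return 0.0
          q = prob[prob > 0] / mass
          return -np.sum(q * np.log(q))

      best = (None, -np.inf)
      for t1 in range(1, 254):
          for t2 in range(t1 + 1, 255):
              h = class_entropy(p[:t1]) + class_entropy(p[t1:t2]) + class_entropy(p[t2:])
              if h > best[1]:
                  best = ((t1, t2), h)
      print("selected thresholds:", best[0])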

  17. Generalizing a complex model for gully threshold identification in the Mediterranean environment

    NASA Astrophysics Data System (ADS)

    Torri, D.; Borselli, L.; Iaquinta, P.; Iovine, G.; Poesen, J.; Terranova, O.

    2012-04-01

    Among the physical processes leading to land degradation, soil erosion by water is the most important, and gully erosion may contribute, at places, to 70% of the total soil loss. Nevertheless, gully erosion has often been neglected in water soil erosion modeling, whilst more prominence has been given to rill and interrill erosion. Both to facilitate the processing by agricultural machinery and to take advantage of all the arable land, gullies are commonly removed at each crop cycle, with significant soil losses due to the repeated excavation of the channel by successive rainstorms. When the erosive forces of overland flow exceed the strength of the soil particles to detachment and displacement, water erosion occurs and usually a channel is formed. As runoff is proportional to the local catchment area, a relationship between local slope, S, and contributing area, A, is supposed to exist. A "geomorphologic threshold" scheme is therefore suitable to interpret the physical process of gully initiation: accordingly, a gully is formed when a hydraulic threshold for incision exceeds the resistance of the soil particles to detachment and transport. Similarly, it appears reasonable that a gully ends when there is a reduction of slope, or the concentrated flow meets more resistant soil-vegetation complexes. This study aims to predict the location of the beginning of gullies in the Mediterranean environment, based on an evaluation of S and A by means of a mathematical model. For the identification of the areas prone to gully erosion, the model employs two empirical thresholds relevant to the head (Thead) and to the end (Tend) of the gullies, of the type S·A^b > Thead and S·A^b < Tend. These thresholds represent the resistance of the environment to gully erosion, depending on stoniness, vegetation cover, propensity to tunneling erosion due to soil dispersibility in water, the intrinsic characteristics of the eroded material, and the erosivity of the rainfall event. Such
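
    The threshold test itself can be sketched as below, flagging cells where the slope-area index exceeds a head threshold; the exponent b and the threshold values are illustrative assumptions, not the calibrated Mediterranean parameters.

      # Slope-area threshold test: flag cells where S * A**b exceeds the head
      # threshold Thead, or falls below the end threshold Tend. The exponent and
      # threshold values are illustrative assumptions, not calibrated values.
      import numpy as np

      rng = np.random.default_rng(6)
      slope = rng.uniform(0.02, 0.45, size=(100, 100))             # local slope S (m/m)
      area = rng.lognormal(mean=6.0, sigma=1.0, size=(100, 100))   # contributing area A (m^2)

      b, T_head, T_end = 0.4, 8.0, 4.0                             # assumed parameters
      index = slope * area**b
      gully_head = index > T_head                                  # incision threshold exceeded
      gully_end = index < T_end                                    # flow no longer erosive

      print("candidate head cells:", int(gully_head.sum()))
      print("cells below the end threshold:", int(gully_end.sum()))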

  18. A decision model to estimate a risk threshold for venous thromboembolism prophylaxis in hospitalized medical patients.

    PubMed

    Le, P; Martinez, K A; Pappas, M A; Rothberg, M B

    2017-06-01

    Essentials: Low-risk patients don't require venous thromboembolism (VTE) prophylaxis; low risk is unquantified. We used a Markov model to estimate the risk threshold for VTE prophylaxis in medical inpatients. Prophylaxis was cost-effective for an average medical patient with a VTE risk of ≥ 1.0%. VTE prophylaxis can be personalized based on patient risk and age/life expectancy. Background Venous thromboembolism (VTE) is a common preventable condition in medical inpatients. Thromboprophylaxis is recommended for inpatients who are not at low risk of VTE, but no specific risk threshold for prophylaxis has been defined. Objective To determine a threshold for prophylaxis based on risk of VTE. Patients/Methods We constructed a decision model with a decision-tree following patients for 3 months after hospitalization, and a lifetime Markov model with 3-month cycles. The model tracked symptomatic deep vein thromboses and pulmonary emboli, bleeding events and heparin-induced thrombocytopenia. Long-term complications included recurrent VTE, post-thrombotic syndrome and pulmonary hypertension. For the base case, we considered medical inpatients aged 66 years, having a life expectancy of 13.5 years, VTE risk of 1.4% and bleeding risk of 2.7%. Patients received enoxaparin 40 mg day⁻¹ for prophylaxis. Results Assuming a willingness-to-pay (WTP) threshold of $100,000 per quality-adjusted life-year (QALY), prophylaxis was indicated for an average medical inpatient with a VTE risk of ≥ 1.0% up to 3 months after hospitalization. For the average patient, prophylaxis was not indicated when the bleeding risk was > 8.1%, the patient's age was > 73.4 years or the cost of enoxaparin exceeded $60/dose. If VTE risk was < 0.26% or bleeding risk was > 19%, the risks of prophylaxis outweighed benefits. The prophylaxis threshold was relatively insensitive to low-molecular-weight heparin cost and bleeding risk, but very sensitive to patient age and life expectancy. Conclusions The decision to

  19. Population response to climate change: linear vs. non-linear modeling approaches.

    PubMed

    Ellis, Alicia M; Post, Eric

    2004-03-31

    Research on the ecological consequences of global climate change has elicited a growing interest in the use of time series analysis to investigate population dynamics in a changing climate. Here, we compare linear and non-linear models describing the contribution of climate to the density fluctuations of the population of wolves on Isle Royale, Michigan from 1959 to 1999. The non-linear self excitatory threshold autoregressive (SETAR) model revealed that, due to differences in the strength and nature of density dependence, relatively small and large populations may be differentially affected by future changes in climate. Both linear and non-linear models predict a decrease in the population of wolves with predicted changes in climate. Because specific predictions differed between linear and non-linear models, our study highlights the importance of using non-linear methods that allow the detection of non-linearity in the strength and nature of density dependence. Failure to adopt a non-linear approach to modelling population response to climate change, either exclusively or in addition to linear approaches, may compromise efforts to quantify ecological consequences of future warming.
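
    A minimal two-regime SETAR(1) sketch is shown below: the lagged value selects one of two autoregressive regimes, and the threshold is chosen by grid search on the residual sum of squares. The series is simulated for illustration, not the Isle Royale wolf counts.

      # Two-regime SETAR(1): fit separate AR(1) models above and below a threshold
      # on the lagged value, choosing the threshold by grid search on combined SSE.
      import numpy as np

      rng = np.random.default_rng(7)
      n, thr_true = 400, 0.0
      x = np.zeros(n)
      for t in range(1, n):
          if x[t - 1] <= thr_true:
              x[t] = 0.5 + 0.8 * x[t - 1] + rng.normal(0, 0.3)    # lower regime
          else:
              x[t] = -0.2 + 0.3 * x[t - 1] + rng.normal(0, 0.3)   # upper regime

      y, lag = x[1:], x[:-1]

      def sse_for_threshold(thr):
          total = 0.0
          for mask in (lag <= thr, lag > thr):
              if mask.sum() < 10:
                  return np.inf                                   # avoid tiny regimes
              X = np.column_stack([np.ones(mask.sum()), lag[mask]])
              beta = np.linalg.lstsq(X, y[mask], rcond=None)[0]
              total += np.sum((y[mask] - X @ beta) ** 2)
          return total

      grid = np.quantile(lag, np.linspace(0.15, 0.85, 71))
      best_thr = grid[np.argmin([sse_for_threshold(t) for t in grid])]
      print(f"estimated threshold: {best_thr:.2f} (true {thr_true})")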

  20. The dynamic influence of human resources on evidence-based intervention sustainability and population outcomes: an agent-based modeling approach.

    PubMed

    McKay, Virginia R; Hoffer, Lee D; Combs, Todd B; Margaret Dolcini, M

    2018-06-05

    Sustaining evidence-based interventions (EBIs) is an ongoing challenge for dissemination and implementation science in public health and social services. Characterizing the relationship among human resource capacity within an agency and subsequent population outcomes is an important step to improving our understanding of how EBIs are sustained. Although human resource capacity and population outcomes are theoretically related, examining them over time within real-world experiments is difficult. Simulation approaches, especially agent-based models, offer advantages that complement existing methods. We used an agent-based model to examine the relationships among human resources, EBI delivery, and population outcomes by simulating provision of an EBI through a hypothetical agency and its staff. We used data from existing studies examining a widely implemented HIV prevention intervention to inform simulation design, calibration, and validity. Once we developed a baseline model, we used the model as a simulated laboratory by systematically varying three human resource variables: the number of staff positions, the staff turnover rate, and timing in training. We tracked the subsequent influence on EBI delivery and the level of population risk over time to describe the overall and dynamic relationships among these variables. Higher overall levels of human resource capacity at an agency (more positions) led to more extensive EBI delivery over time and lowered population risk earlier in time. In simulations representing the typical human resource investments, substantial influences on population risk were visible after approximately 2 years and peaked around 4 years. Human resources, especially staff positions, have an important impact on EBI sustainability and ultimately population health. A minimum level of human resources based on the context (e.g., size of the initial population and characteristics of the EBI) is likely needed for an EBI to have a meaningful impact on

  1. [Prediction of schistosomiasis infection rates of population based on ARIMA-NARNN model].

    PubMed

    Ke-Wei, Wang; Yu, Wu; Jin-Ping, Li; Yu-Yu, Jiang

    2016-07-12

    To explore the performance of the hybrid autoregressive integrated moving average-nonlinear autoregressive neural network (ARIMA-NARNN) model in predicting schistosomiasis infection rates in the population. The ARIMA model, NARNN model and ARIMA-NARNN model were established based on monthly schistosomiasis infection rates from January 2005 to February 2015 in Jiangsu Province, China. The fitting and prediction performances of the three models were compared. Compared to the ARIMA model and NARNN model, the mean square error (MSE), mean absolute error (MAE) and mean absolute percentage error (MAPE) of the ARIMA-NARNN model were the lowest, with values of 0.0111, 0.0900 and 0.2824, respectively. The ARIMA-NARNN model could effectively fit and predict schistosomiasis infection rates in the population, which might have great application value for the prevention and control of schistosomiasis.
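
    The hybrid idea can be sketched as an ARIMA fit for the linear structure plus a small neural network trained on lagged ARIMA residuals as a stand-in for the NARNN component; the monthly series below is simulated, not the Jiangsu surveillance data.

      # ARIMA + neural-network hybrid sketch: ARIMA captures the linear part and an
      # MLP on lagged ARIMA residuals stands in for the NARNN component. The series
      # is simulated and the model orders are illustrative assumptions.
      import numpy as np
      from statsmodels.tsa.arima.model import ARIMA
      from sklearn.neural_network import MLPRegressor

      rng = np.random.default_rng(8)
      t = np.arange(120)
      series = 2.0 + 0.5 * np.sin(2 * np.pi * t / 12) + 0.01 * t + rng.normal(0, 0.1, 120)

      arima = ARIMA(series, order=(1, 1, 1)).fit()
      resid = arima.resid

      lags = 3
      X = np.column_stack([resid[i:len(resid) - lags + i] for i in range(lags)])
      y = resid[lags:]
      nn = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0).fit(X, y)

      linear_part = arima.forecast(steps=1)[0]                  # one-step ARIMA forecast
      nonlinear_part = nn.predict(resid[-lags:].reshape(1, -1))[0]
      print(f"hybrid one-step forecast: {linear_part + nonlinear_part:.3f}")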

  2. Color difference threshold determination for acrylic denture base resins.

    PubMed

    Ren, Jiabao; Lin, Hong; Huang, Qingmei; Liang, Qifan; Zheng, Gang

    2015-01-01

    This study aimed to set evaluation indicators, i.e., perceptibility and acceptability color difference thresholds, of color stability for acrylic denture base resins for a spectrophotometric assessing method, which offered an alternative to the visual method described in ISO 20795-1:2013. A total of 291 disk specimens 50±1 mm in diameter and 0.5±0.1 mm thick were prepared (ISO 20795-1:2013) and processed through radiation tests in an accelerated aging chamber (ISO 7491:2000) for increasing times of 0 to 42 hours. Color alterations were measured with a spectrophotometer and evaluated using the CIE L*a*b* colorimetric system. Color differences were calculated through the CIEDE2000 color difference formula. Thirty-two dental professionals without color vision deficiencies completed perceptibility and acceptability assessments under controlled conditions in vitro. An S-curve fitting procedure was used to analyze the 50:50% perceptibility and acceptability thresholds. Furthermore, perceptibility and acceptability against the differences of the three color attributes, lightness, chroma, and hue, were also investigated. According to the S-curve fitting procedure, the 50:50% perceptibility threshold was 1.71 ΔE00 (r² = 0.88) and the 50:50% acceptability threshold was 4.00 ΔE00 (r² = 0.89). Within the limitations of this study, 1.71/4.00 ΔE00 could be used as perceptibility/acceptability thresholds for acrylic denture base resins.

  3. Meaningful use stage 2 e-prescribing threshold and adverse drug events in the Medicare Part D population with diabetes

    PubMed Central

    Gabriel, Meghan Hufstader; Encinosa, William; Mostashari, Farzad; Bynum, Julie

    2015-01-01

    Evidence supports the potential for e-prescribing to reduce the incidence of adverse drug events (ADEs) in hospital-based studies, but studies in the ambulatory setting have not used occurrence of ADE as their outcome. Using the “prescription origin code” in 2011 Medicare Part D prescription drug events files, the authors investigate whether physicians who meet the meaningful use stage 2 threshold for e-prescribing (≥50% of prescriptions e-prescribed) have lower rates of ADEs among their diabetic patients. Risk of any patient with diabetes in the provider’s panel having an ADE from anti-diabetic medications was modeled adjusted for prescriber and patient panel characteristics. Physician e-prescribing to Medicare beneficiaries was associated with reduced risk of ADEs among their diabetes patients (Odds Ratio: 0.95; 95% CI, 0.94-0.96), as were several prescriber and panel characteristics. However, these physicians treated fewer patients from disadvantaged populations. PMID:25948698

  4. Bayesian estimation of dose thresholds

    NASA Technical Reports Server (NTRS)

    Groer, P. G.; Carnes, B. A.

    2003-01-01

    An example is described of Bayesian estimation of radiation absorbed dose thresholds (subsequently simply referred to as dose thresholds) using a specific parametric model applied to a data set on mice exposed to 60Co gamma rays and fission neutrons. A Weibull based relative risk model with a dose threshold parameter was used to analyse, as an example, lung cancer mortality and determine the posterior density for the threshold dose after single exposures to 60Co gamma rays or fission neutrons from the JANUS reactor at Argonne National Laboratory. The data consisted of survival, censoring times and cause of death information for male B6CF1 unexposed and exposed mice. The 60Co gamma whole-body doses for the two exposed groups were 0.86 and 1.37 Gy. The neutron whole-body doses were 0.19 and 0.38 Gy. Marginal posterior densities for the dose thresholds for neutron and gamma radiation were calculated with numerical integration and found to have quite different shapes. The density of the threshold for 60Co is unimodal with a mode at about 0.50 Gy. The threshold density for fission neutrons declines monotonically from a maximum value at zero with increasing doses. The posterior densities for all other parameters were similar for the two radiation types.

  5. Body mass index and type 2 diabetes in Thai adults: defining risk thresholds and population impacts.

    PubMed

    Papier, Keren; D'Este, Catherine; Bain, Chris; Banwell, Cathy; Seubsman, Sam-Ang; Sleigh, Adrian; Jordan, Susan

    2017-09-15

    Body mass index (BMI) cut-off values (>25 and >30) that predict diabetes risk have been well validated in Caucasian populations but less so in Asian populations. We aimed to determine the BMI threshold associated with increased type 2 diabetes (T2DM) risk and to calculate the proportion of T2DM cases attributable to overweight and obesity in the Thai population. Participants were those from the Thai Cohort Study who were diabetes-free in 2005 and were followed-up in 2009 and 2013 (n = 39,021). We used multivariable logistic regression to estimate odds ratios (ORs) and 95% confidence intervals (CIs) for the BMI-T2DM association. We modelled non-linear associations using restricted cubic splines. We estimated population attributable fractions (PAF) and the number of T2DM incident cases attributed to overweight and obesity. We also calculated the impact of reducing the prevalence of overweight and obesity on T2DM incidence in the Thai population. Non-linear modelling indicated that the points of inflection where the BMI-T2DM association became statistically significant compared to a reference of 20.00 kg/m² were 21.60 (OR = 1.27, 95% CI 1.00-1.61) and 20.03 (OR = 1.02, 95% CI 1.02-1.03) for men and women, respectively. Approximately two-thirds of T2DM cases in Thai adults could be attributed to overweight and obesity. Annually, if prevalent obesity was 5% lower, ~13,000 cases of T2DM might be prevented in the Thai population. A BMI cut-point of 22 kg/m², one point lower than the current 23 kg/m², would be justified for defining T2DM risk in Thai adults. Lowering obesity prevalence would greatly reduce T2DM incidence.

  6. Policy evaluation in diabetes prevention and treatment using a population-based macro simulation model: the MICADO model.

    PubMed

    van der Heijden, A A W A; Feenstra, T L; Hoogenveen, R T; Niessen, L W; de Bruijne, M C; Dekker, J M; Baan, C A; Nijpels, G

    2015-12-01

    To test a simulation model, the MICADO model, for estimating the long-term effects of interventions in people with and without diabetes. The MICADO model includes micro- and macrovascular diseases in relation to their risk factors. The strengths of this model are its population scope and the possibility to assess parameter uncertainty using probabilistic sensitivity analyses. Outcomes include incidence and prevalence of complications, quality of life, costs and cost-effectiveness. We externally validated MICADO's estimates of micro- and macrovascular complications in a Dutch cohort with diabetes (n = 498,400) by comparing these estimates with national and international empirical data. For the annual number of people undergoing amputations, MICADO's estimate was 592 (95% interquantile range 291-842), which compared well with the registered number of people with diabetes-related amputations in the Netherlands (728). The incidence of end-stage renal disease estimated using the MICADO model was 247 people (95% interquantile range 120-363), which was also similar to the registered incidence in the Netherlands (277 people). MICADO performed well in the validation of macrovascular outcomes of population-based cohorts, while it had more difficulty in reflecting a highly selected trial population. Validation by comparison with independent empirical data showed that the MICADO model simulates the natural course of diabetes and its micro- and macrovascular complications well. As a population-based model, MICADO can be applied for projections as well as scenario analyses to evaluate the long-term (cost-)effectiveness of population-level interventions targeting diabetes and its complications in the Netherlands or similar countries. © 2015 The Authors. Diabetic Medicine © 2015 Diabetes UK.

  7. Cost-effectiveness of Population Screening for BRCA Mutations in Ashkenazi Jewish Women Compared With Family History–Based Testing

    PubMed Central

    Manchanda, Ranjit; Legood, Rosa; Burnell, Matthew; McGuire, Alistair; Raikou, Maria; Loggenberg, Kelly; Wardle, Jane; Sanderson, Saskia; Gessler, Sue; Side, Lucy; Balogun, Nyala; Desai, Rakshit; Kumar, Ajith; Dorkins, Huw; Wallis, Yvonne; Chapman, Cyril; Taylor, Rohan; Jacobs, Chris; Tomlinson, Ian; Beller, Uziel; Menon, Usha

    2015-01-01

    Background: Population-based testing for BRCA1/2 mutations detects the high proportion of carriers not identified by cancer family history (FH)–based testing. We compared the cost-effectiveness of population-based BRCA testing with the standard FH-based approach in Ashkenazi Jewish (AJ) women. Methods: A decision-analytic model was developed to compare lifetime costs and effects amongst AJ women in the UK of BRCA founder-mutation testing amongst: 1) all women in the population age 30 years or older and 2) just those with a strong FH (≥10% mutation risk). The model assumes that BRCA carriers are offered risk-reducing salpingo-oophorectomy and annual MRI/mammography screening or risk-reducing mastectomy. Model probabilities utilize the Genetic Cancer Prediction through Population Screening trial/published literature to estimate total costs, effects in terms of quality-adjusted life-years (QALYs), cancer incidence, incremental cost-effectiveness ratio (ICER), and population impact. Costs are reported at 2010 prices. Costs/outcomes were discounted at 3.5%. We used deterministic/probabilistic sensitivity analysis (PSA) to evaluate model uncertainty. Results: Compared with FH-based testing, population-screening saved 0.090 more life-years and 0.101 more QALYs resulting in 33 days’ gain in life expectancy. Population screening was found to be cost saving with a baseline-discounted ICER of -£2079/QALY. Population-based screening lowered ovarian and breast cancer incidence by 0.34% and 0.62%. Assuming 71% testing uptake, this leads to 276 fewer ovarian and 508 fewer breast cancer cases. Overall, reduction in treatment costs led to a discounted cost savings of £3.7 million. Deterministic sensitivity analysis and 94% of simulations on PSA (threshold £20000) indicated that population screening is cost-effective, compared with current NHS policy. Conclusion: Population-based screening for BRCA mutations is highly cost-effective compared with an FH-based approach in AJ
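
    For reference, the incremental cost-effectiveness ratio (ICER) quoted above is the standard ratio of incremental cost to incremental effect between the two strategies; the general form is given below, with the subscripts chosen here for illustration rather than taken from the paper.

      % ICER comparing population-based testing with family-history (FH) based testing;
      % a negative ICER together with a QALY gain indicates a dominant (cost-saving) strategy.
      \[
        \mathrm{ICER} = \frac{C_{\text{population}} - C_{\text{FH}}}
                             {\mathrm{QALY}_{\text{population}} - \mathrm{QALY}_{\text{FH}}}
      \]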

  8. Clinical multiple sclerosis occurs at one end of a spectrum of CNS pathology: a modified threshold liability model leads to new ways of thinking about the cause of clinical multiple sclerosis.

    PubMed

    Haegert, David G

    2005-01-01

    Multiple sclerosis (MS) is a complex trait, the causes of which are elusive. A threshold liability model influences thinking about the causes of this disorder. According to this model, a population has a normal distribution of genetic liability to MS. In addition, a threshold exists, so that MS begins when an individual's liability exceeds the MS threshold; environmental and other causative factors may increase or decrease an individual's MS liability. It is argued here, however, that this model is misleading, as it is based on the incorrect assumption that MS is a disorder that one either has or does not have. This paper hypothesizes, instead, that patients with a diagnosis of MS share identical CNS pathology, termed MS pathology, with some individuals who have a diagnosis of possible MS and with some apparently healthy individuals, who may never have a diagnosis of MS. In order to accommodate this hypothesis, the current threshold liability model is modified as follows. (1) In addition to a normal distribution of MS liability within a population, a spectrum of MS pathology occurs in some who have a high MS liability. (2) A clinical MS threshold exists at a point on this liability distribution, where the burden and distribution of MS pathology permits a diagnosis of clinical MS. (3) Additional thresholds exist that correspond to a lower MS liability and a lesser burden of MS pathology than occur at the clinical MS threshold. This modified threshold model leads to the postulate that causes act at various time points to increase MS liability and induce MS pathology. The accumulation of MS pathology sometimes leads to a diagnosis of clinical MS. One implication of this model is that the MS pathology in clinical MS and in some with possible MS differs only in the extent but not in the type of CNS injury. Thus, it may be possible to obtain insight into the causative environmental factors that increase MS liability and induce MS pathology by focusing on patients who

  9. Pressure and cold pain threshold reference values in a large, young adult, pain-free population.

    PubMed

    Waller, Robert; Smith, Anne Julia; O'Sullivan, Peter Bruce; Slater, Helen; Sterling, Michele; McVeigh, Joanne Alexandra; Straker, Leon Melville

    2016-10-01

    Currently there is a lack of large population studies that have investigated pain sensitivity distributions in healthy pain free people. The aims of this study were: (1) to provide sex-specific reference values of pressure and cold pain thresholds in young pain-free adults; (2) to examine the association of potential correlates of pain sensitivity with pain threshold values. This study investigated sex specific pressure and cold pain threshold estimates for young pain free adults aged 21-24 years. A cross-sectional design was utilised using participants (n=617) from the Western Australian Pregnancy Cohort (Raine) Study at the 22-year follow-up. The association of site, sex, height, weight, smoking, health related quality of life, psychological measures and activity with pain threshold values was examined. Pressure pain threshold (lumbar spine, tibialis anterior, neck and dorsal wrist) and cold pain threshold (dorsal wrist) were assessed using standardised quantitative sensory testing protocols. Reference values for pressure pain threshold (four body sites) stratified by sex and site, and cold pain threshold (dorsal wrist) stratified by sex are provided. Statistically significant, independent correlates of increased pressure pain sensitivity measures were site (neck, dorsal wrist), sex (female), higher waist-hip ratio and poorer mental health. Statistically significant, independent correlates of increased cold pain sensitivity measures were, sex (female), poorer mental health and smoking. These data provide the most comprehensive and robust sex specific reference values for pressure pain threshold specific to four body sites and cold pain threshold at the dorsal wrist for young adults aged 21-24 years. Establishing normative values in this young age group is important given that the transition from adolescence to adulthood is a critical temporal period during which trajectories for persistent pain can be established. These data will provide an important research

  10. Critical Mutation Rate Has an Exponential Dependence on Population Size in Haploid and Diploid Populations

    PubMed Central

    Aston, Elizabeth; Channon, Alastair; Day, Charles; Knight, Christopher G.

    2013-01-01

    Understanding the effect of population size on the key parameters of evolution is particularly important for populations nearing extinction. There are evolutionary pressures to evolve sequences that are both fit and robust. At high mutation rates, individuals with greater mutational robustness can outcompete those with higher fitness. This is survival-of-the-flattest, and has been observed in digital organisms, theoretically, in simulated RNA evolution, and in RNA viruses. We introduce an algorithmic method capable of determining the relationship between population size, the critical mutation rate at which individuals with greater robustness to mutation are favoured over individuals with greater fitness, and the error threshold. Verification for this method is provided against analytical models for the error threshold. We show that the critical mutation rate for increasing haploid population sizes can be approximated by an exponential function, with much lower mutation rates tolerated by small populations. This is in contrast to previous studies which identified that critical mutation rate was independent of population size. The algorithm is extended to diploid populations in a system modelled on the biological process of meiosis. The results confirm that the relationship remains exponential, but show that both the critical mutation rate and error threshold are lower for diploids, rather than higher as might have been expected. Analyzing the transition from critical mutation rate to error threshold provides an improved definition of critical mutation rate. Natural populations with their numbers in decline can be expected to lose genetic material in line with the exponential model, accelerating and potentially irreversibly advancing their decline, and this could potentially affect extinction, recovery and population management strategy. The effect of population size is particularly strong in small populations with 100 individuals or less; the exponential model has

  11. Modeling the Interactions Between Multiple Crack Closure Mechanisms at Threshold

    NASA Technical Reports Server (NTRS)

    Newman, John A.; Riddell, William T.; Piascik, Robert S.

    2003-01-01

    A fatigue crack closure model is developed that includes interactions between the three closure mechanisms most likely to occur at threshold: plasticity, roughness, and oxide. This model, herein referred to as the CROP model (for Closure, Roughness, Oxide, and Plasticity), also includes the effects of out-of-plane cracking and multi-axial loading. These features make the CROP closure model uniquely suited for, but not limited to, threshold applications. Rough cracks are idealized here as two-dimensional sawtooths, whose geometry induces mixed-mode crack-tip stresses. Continuum mechanics and crack-tip dislocation concepts are combined to relate crack face displacements to crack-tip loads. Geometric criteria are used to determine closure loads from crack-face displacements. Finite element results, used to verify model predictions, provide critical information about the locations where crack closure occurs.

  12. Cost-effectiveness of population based BRCA testing with varying Ashkenazi Jewish ancestry.

    PubMed

    Manchanda, Ranjit; Patel, Shreeya; Antoniou, Antonis C; Levy-Lahad, Ephrat; Turnbull, Clare; Evans, D Gareth; Hopper, John L; Macinnis, Robert J; Menon, Usha; Jacobs, Ian; Legood, Rosa

    2017-11-01

    Population-based BRCA1/BRCA2 testing has been found to be cost-effective compared with family history-based testing in Ashkenazi-Jewish women who were >30 years old with 4 Ashkenazi-Jewish grandparents. However, individuals may have 1, 2, or 3 Ashkenazi-Jewish grandparents, and cost-effectiveness data are lacking at these lower BRCA prevalence estimates. We present an updated cost-effectiveness analysis of population BRCA1/BRCA2 testing for women with 1, 2, and 3 Ashkenazi-Jewish grandparents. Decision analysis model. Lifetime costs and effects of population and family history-based testing were compared with the use of a decision analysis model. 56% of BRCA carriers are missed by family history criteria alone. Analyses were conducted for United Kingdom and United States populations. Model parameters were obtained from the Genetic Cancer Prediction through Population Screening trial and published literature. Model parameters and BRCA population prevalence for individuals with 3, 2, or 1 Ashkenazi-Jewish grandparent were adjusted for the relative frequency of BRCA mutations in the Ashkenazi-Jewish and general populations. Incremental cost-effectiveness ratios were calculated for all Ashkenazi-Jewish grandparent scenarios. Costs, along with outcomes, were discounted at 3.5%. The time horizon of the analysis is "life-time," and the perspective is "payer." Probabilistic sensitivity analysis evaluated model uncertainty. Population testing for BRCA mutations is cost-saving in Ashkenazi-Jewish women with 2, 3, or 4 grandparents (22-33 days of life gained) in the United Kingdom and 1, 2, 3, or 4 grandparents (12-26 days of life gained) in the United States populations, respectively. It is also extremely cost-effective in women in the United Kingdom with just 1 Ashkenazi-Jewish grandparent, with an incremental cost-effectiveness ratio of £863 per quality-adjusted life-year and 15 days of life gained. Results show that population testing remains cost-effective at the £20,000-30,000 per quality

  13. Development of a landslide EWS based on rainfall thresholds for Tuscany Region, Italy

    NASA Astrophysics Data System (ADS)

    Rosi, Ascanio; Segoni, Samuele; Battistini, Alessandro; Rossi, Guglielmo; Catani, Filippo; Casagli, Nicola

    2017-04-01

    We present the set-up of a landslide EWS based on rainfall thresholds for the Tuscany region (central Italy), which shows a heterogeneous distribution of reliefs and precipitation. The work started with the definition of a single set of thresholds for the whole region, but this proved unsuitable for EWS purposes because of the heterogeneity of the Tuscan territory and the non-repeatability of the analyses, which were affected by a high degree of subjectivity. To overcome this problem, the work proceeded with the implementation of software capable of objectively defining the rainfall thresholds, since two of the main issues with such thresholds are the subjectivity of the analysis and the resulting non-repeatability. This software, named MaCumBA, is largely automated and can analyze a high number of rainfall events in a short time to define several parameters of the threshold, such as the intensity (I) and duration (D) of the rainfall event, the no-rain time gap (NRG: how many hours without rain are needed to consider two events as separate) and the equation describing the threshold. The possibility of quickly performing several analyses led to the decision to divide the territory into 25 homogeneous areas (named alert zones, AZ), so that a single threshold could be defined for each AZ. For the definition of the thresholds, two independent datasets of joint rainfall-landslide occurrences were used: a calibration dataset (2000-2007) and a validation dataset (2008-2009). Once the thresholds were defined, a WebGIS-based EWS was implemented. In this system it is possible to focus both on monitoring of real-time data and on forecasting at different lead times up to 48 h; forecasting data are collected from LAMI (Limited Area Model Italy) rainfall forecasts. The EWS works on the basis of the threshold parameters defined by MaCumBA (I, D, NRG). An important feature of the warning system is that the visualization of the thresholds in the Web
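
    MaCumBA's threshold equations are not given in the abstract; rainfall thresholds of this kind are often expressed as an intensity-duration power law, I = alpha * D^beta. The sketch below assumes that form, with invented alpha, beta and NRG values, to show how events could be separated with a no-rain gap and then checked against a threshold.

```python
# Illustrative sketch only: the power-law form and all parameter values are assumptions,
# not the alert-zone thresholds actually produced by MaCumBA.

def exceeds_id_threshold(intensity_mm_h, duration_h, alpha=8.0, beta=-0.55):
    """Return True if a rainfall event lies above the I-D threshold curve."""
    return intensity_mm_h >= alpha * duration_h ** beta

def split_events(hourly_rain_mm, nrg_hours=12):
    """Group wet hours into events; dry gaps of >= nrg_hours separate events.
    Simplified: internal dry hours are dropped, so durations count wet hours only."""
    events, current, dry = [], [], 0
    for r in hourly_rain_mm:
        if r > 0:
            if dry >= nrg_hours and current:   # a long dry gap closes the previous event
                events.append(current)
                current = []
            current.append(r)
            dry = 0
        else:
            dry += 1
    if current:
        events.append(current)
    return events

rain = [0, 2, 5, 8, 3, 0, 0, 1, 0] + [0] * 12 + [4, 6, 2]
for event in split_events(rain, nrg_hours=12):
    duration = len(event)                      # hours
    intensity = sum(event) / duration          # mm/h
    print(duration, round(intensity, 2), exceeds_id_threshold(intensity, duration))
```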

  14. An energy-based body temperature threshold between torpor and normothermia for small mammals.

    PubMed

    Willis, Craig K R

    2007-01-01

    Field studies of use of torpor by heterothermic endotherms suffer from the lack of a standardized threshold differentiating torpid body temperatures (T(b)) from normothermic T(b)'s. This threshold can be more readily observed if metabolic rate (MR) is measured in the laboratory. I digitized figures from the literature that depicted simultaneous traces of MR and T(b) from 32 respirometry runs for 14 mammal species. For each graph, I quantified the T(b) measured when MR first began to drop at the onset of torpor (T(b-onset)). I used a general linear model to quantify the effect of ambient temperature (T(a)) and body mass (BM) on T(b-onset). For species lighter than 70 g, the model was highly significant and was described by the equation T(b-onset) = (0.055 ± 0.014)BM + (0.071 ± 0.031)T(a) + (31.823 ± 0.740). To be conservative, I recommend use of these model parameters minus 1 standard error, which modifies the equation to T(b-onset) - 1 SE = 0.041BM + 0.040T(a) + 31.083. This approach provides a standardized threshold for differentiating torpor from normothermia that is based on use of energy, the actual currency of interest for studies of torpor in the wild. Few laboratory studies have presented the time-course data required to quantify T(b-onset), so more data are needed to validate this relationship.
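
    The conservative threshold equation quoted above translates directly into a small helper function, sketched below; the function names and the example values are illustrative, and the relationship is intended for species lighter than about 70 g.

```python
def torpor_onset_threshold(body_mass_g, ambient_temp_c):
    """Conservative torpor-onset body temperature threshold (deg C), from the
    equation in the abstract: T(b-onset) - 1 SE = 0.041*BM + 0.040*Ta + 31.083.
    Intended for species lighter than ~70 g."""
    return 0.041 * body_mass_g + 0.040 * ambient_temp_c + 31.083

def is_torpid(body_temp_c, body_mass_g, ambient_temp_c):
    """Classify a body temperature reading as torpid if it falls below the threshold."""
    return body_temp_c < torpor_onset_threshold(body_mass_g, ambient_temp_c)

# Illustrative example: a 25 g animal at 10 deg C ambient with T(b) = 28 deg C
print(round(torpor_onset_threshold(25, 10), 2))  # ~32.51 deg C
print(is_torpid(28, 25, 10))                     # True
```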

  15. Thresholds and the rising pion inclusive cross section

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, S.T.

    In the context of the hypothesis of the Pomeron-f identity, it is shown that the rising pion inclusive cross section can be explained over a wide range of energies as a series of threshold effects. Low-mass thresholds are seen to be important. In order to understand the contributions of high-mass thresholds (flavoring), a simple two-channel multiperipheral model is examined. The analysis sheds light on the relation between thresholds and Mueller-Regge couplings. In particular, it is seen that inclusive- and total-cross-section threshold mechanisms may differ. A quantitative model based on this idea and utilizing previous total-cross-section fits is seen to agree well with experiment.

  16. The benefits and tradeoffs for varied high-severity injury risk thresholds for advanced automatic crash notification systems.

    PubMed

    Bahouth, George; Graygo, Jill; Digges, Kennerly; Schulman, Carl; Baur, Peter

    2014-01-01

    The objectives of this study are to (1) characterize the population of crashes meeting the Centers for Disease Control and Prevention (CDC)-recommended 20% risk of Injury Severity Score (ISS)>15 injury and (2) explore the positive and negative effects of an advanced automatic crash notification (AACN) system whose threshold for high-risk indications is 10% versus 20%. Binary logistic regression analysis was performed to predict the occurrence of motor vehicle crash injuries at both the ISS>15 and Maximum Abbreviated Injury Scale (MAIS) 3+ level. Models were trained using crash characteristics recommended by the CDC Committee on Advanced Automatic Collision Notification and Triage of the Injured Patient. Each model was used to assign the probability of severe injury (defined as MAIS 3+ or ISS>15 injury) to a subset of NASS-CDS cases based on crash attributes. Subsequently, actual AIS and ISS levels were compared with the predicted probability of injury to determine the extent to which the seriously injured had corresponding probabilities exceeding the 10% and 20% risk thresholds. Models were developed using an 80% sample of NASS-CDS data from 2002 to 2012 and evaluations were performed using the remaining 20% of cases from the same period. Within the population of seriously injured (i.e., those having one or more AIS 3 or higher injuries), the number of occupants whose injury risk did not exceed the 10% and 20% thresholds was estimated to be 11,700 and 18,600, respectively, each year using the MAIS 3+ injury model. For the ISS>15 model, 8,100 and 11,000 occupants sustained ISS>15 injuries yet their injury probability did not reach the 10% and 20% probability for severe injury, respectively. Conversely, model predictions suggested that, at the 10% and 20% thresholds, 207,700 and 55,400 drivers respectively would be incorrectly flagged as injured when their injuries had not reached the AIS 3 level. For the ISS>15 model, 87,300 and 41,900 drivers would be incorrectly
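
    The tradeoff the study quantifies, fewer missed severe injuries but more false alarms at a 10% threshold than at 20%, can be sketched with predicted probabilities from any injury-risk model. The probabilities and outcomes below are simulated purely for illustration and bear no relation to the NASS-CDS estimates.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical predicted probabilities of severe injury (e.g., ISS>15) for 10,000
# occupants, plus outcomes drawn consistently with those risks; invented numbers only.
p_severe = rng.beta(1, 30, size=10_000)          # risk skewed toward low values
truly_severe = rng.random(10_000) < p_severe

for threshold in (0.10, 0.20):
    flagged = p_severe >= threshold
    missed = np.sum(truly_severe & ~flagged)     # severe but not flagged
    false_alarms = np.sum(~truly_severe & flagged)
    print(f"threshold {threshold:.0%}: flagged={flagged.sum()}, "
          f"missed severe={missed}, false alarms={false_alarms}")
```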

  17. Injury risk functions based on population-based finite element model responses: Application to femurs under dynamic three-point bending.

    PubMed

    Park, Gwansik; Forman, Jason; Kim, Taewung; Panzer, Matthew B; Crandall, Jeff R

    2018-02-28

    The goal of this study was to explore a framework for developing injury risk functions (IRFs) in a bottom-up approach based on responses of parametrically variable finite element (FE) models representing exemplar populations. First, a parametric femur modeling tool was developed and validated using a subject-specific (SS)-FE modeling approach. Second, principal component analysis and regression were used to identify parametric geometric descriptors of the human femur and the distribution of those factors for 3 target occupant sizes (5th, 50th, and 95th percentile males). Third, distributions of material parameters of cortical bone were obtained from the literature for 3 target occupant ages (25, 50, and 75 years) using regression analysis. A Monte Carlo method was then implemented to generate populations of FE models of the femur for target occupants, using a parametric femur modeling tool. Simulations were conducted with each of these models under 3-point dynamic bending. Finally, model-based IRFs were developed using logistic regression analysis, based on the moment at fracture observed in the FE simulation. In total, 100 femur FE models incorporating the variation in the population of interest were generated, and 500,000 moments at fracture were observed (applying 5,000 ultimate strain values to each of the 100 synthesized femur FE models) for each set of target occupant characteristics. Using the framework proposed in this study, model-based IRFs were developed for 3 target male occupant sizes (5th, 50th, and 95th percentiles) and ages (25, 50, and 75 years). The model-based IRF was located in the 95% confidence interval of the test-based IRF for the range of 15 to 70% injury risks. The 95% confidence interval of the developed IRF was almost in line with the mean curve due to a large number of data points. The framework proposed in this study would be beneficial for developing the IRFs in a bottom-up manner, whose range of variabilities is informed by the population-based
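
    The final step, fitting a logistic-regression IRF to the fracture moments, can be sketched as follows; the moments and outcomes are simulated for illustration and are not the study's FE results.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

# Hypothetical femur bending moments (Nm) and binary fracture outcomes, for illustration only.
moments = rng.normal(350, 80, size=500)
p_true = 1.0 / (1.0 + np.exp(-(moments - 380) / 40.0))
fractured = (rng.random(500) < p_true).astype(int)

irf = LogisticRegression().fit(moments.reshape(-1, 1), fractured)

def injury_risk(moment_nm):
    """Predicted probability of femur fracture at a given bending moment."""
    return irf.predict_proba(np.array([[moment_nm]]))[0, 1]

print(f"risk at 300 Nm: {injury_risk(300):.2f}, at 450 Nm: {injury_risk(450):.2f}")
```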

  18. Smeared spectrum jamming suppression based on generalized S transform and threshold segmentation

    NASA Astrophysics Data System (ADS)

    Li, Xin; Wang, Chunyang; Tan, Ming; Fu, Xiaolong

    2018-04-01

    Smeared Spectrum (SMSP) jamming is an effective form of jamming against linear frequency modulation (LFM) radar. Based on the difference between the time-frequency distributions of the jamming and the echo, a jamming suppression method based on the Generalized S transform (GST) and threshold segmentation is proposed. The sub-pulse period is first estimated from the autocorrelation function. Secondly, the time-frequency image and the related gray-scale image are obtained with the GST. Finally, the Tsallis cross entropy is used to compute the optimized segmentation threshold, and the jamming suppression filter is then constructed from that threshold. The simulation results show that the proposed method performs well in suppressing the false targets produced by SMSP.

  19. The Random-Threshold Generalized Unfolding Model and Its Application of Computerized Adaptive Testing

    ERIC Educational Resources Information Center

    Wang, Wen-Chung; Liu, Chen-Wei; Wu, Shiu-Lien

    2013-01-01

    The random-threshold generalized unfolding model (RTGUM) was developed by treating the thresholds in the generalized unfolding model as random effects rather than fixed effects to account for the subjective nature of the selection of categories in Likert items. The parameters of the new model can be estimated with the JAGS (Just Another Gibbs…

  20. Modelling single shot damage thresholds of multilayer optics for high-intensity short-wavelength radiation sources.

    PubMed

    Loch, R A; Sobierajski, R; Louis, E; Bosgra, J; Bijkerk, F

    2012-12-17

    The single shot damage thresholds of multilayer optics for high-intensity short-wavelength radiation sources are theoretically investigated, using a model developed on the basis of experimental data obtained at the FLASH and LCLS free electron lasers. We compare the radiation hardness of commonly used multilayer optics and propose new material combinations selected for a high damage threshold. Our study demonstrates that the damage thresholds of multilayer optics can vary over a large range of incidence fluences and can be as high as several hundreds of mJ/cm(2). This strongly suggests that multilayer mirrors are serious candidates for damage resistant optics. Especially, multilayer optics based on Li(2)O spacers are very promising for use in current and future short-wavelength radiation sources.

  1. A genetic algorithm based global search strategy for population pharmacokinetic/pharmacodynamic model selection

    PubMed Central

    Sale, Mark; Sherer, Eric A

    2015-01-01

    The current algorithm for selecting a population pharmacokinetic/pharmacodynamic model is based on the well-established forward addition/backward elimination method. A central strength of this approach is the opportunity for a modeller to continuously examine the data and postulate new hypotheses to explain observed biases. This algorithm has served the modelling community well, but the model selection process has essentially remained unchanged for the last 30 years. During this time, more robust approaches to model selection have been made feasible by new technology and dramatic increases in computation speed. We review these methods, with emphasis on genetic algorithm approaches and discuss the role these methods may play in population pharmacokinetic/pharmacodynamic model selection. PMID:23772792

  2. The threshold of a stochastic SIQS epidemic model

    NASA Astrophysics Data System (ADS)

    Zhang, Xiao-Bing; Huo, Hai-Feng; Xiang, Hong; Shi, Qihong; Li, Dungang

    2017-09-01

    In this paper, we present the threshold of a stochastic SIQS epidemic model which determines the extinction and persistence of the disease. Furthermore, we find that noise can suppress the disease outbreak. Numerical simulations are also carried out to confirm the analytical results.

  3. Information transmission and detection thresholds in the vestibular nuclei: single neurons vs. population encoding

    PubMed Central

    Massot, Corentin; Chacron, Maurice J.

    2011-01-01

    Understanding how sensory neurons transmit information about relevant stimuli remains a major goal in neuroscience. Of particular relevance are the roles of neural variability and spike timing in neural coding. Peripheral vestibular afferents display differential variability that is correlated with the importance of spike timing; regular afferents display little variability and use a timing code to transmit information about sensory input. Irregular afferents, conversely, display greater variability and instead use a rate code. We studied how central neurons within the vestibular nuclei integrate information from both afferent classes by recording from a group of neurons termed vestibular only (VO) that are known to make contributions to vestibulospinal reflexes and project to higher-order centers. We found that, although individual central neurons had sensitivities that were greater than or equal to those of individual afferents, they transmitted less information. In addition, their velocity detection thresholds were significantly greater than those of individual afferents. This is because VO neurons display greater variability, which is detrimental to information transmission and signal detection. Combining activities from multiple VO neurons increased information transmission. However, the information rates were still much lower than those of equivalent afferent populations. Furthermore, combining responses from multiple VO neurons led to lower velocity detection threshold values approaching those measured from behavior (∼2.5 vs. 0.5–1°/s). Our results suggest that the detailed time course of vestibular stimuli encoded by afferents is not transmitted by VO neurons. Instead, they suggest that higher vestibular pathways must integrate information from central vestibular neuron populations to give rise to behaviorally observed detection thresholds. PMID:21307329

  4. Threshold models for genome-enabled prediction of ordinal categorical traits in plant breeding.

    PubMed

    Montesinos-López, Osval A; Montesinos-López, Abelardo; Pérez-Rodríguez, Paulino; de Los Campos, Gustavo; Eskridge, Kent; Crossa, José

    2014-12-23

    Categorical scores for disease susceptibility or resistance often are recorded in plant breeding. The aim of this study was to introduce genomic models for analyzing ordinal characters and to assess the predictive ability of genomic predictions for ordered categorical phenotypes using a threshold model counterpart of the Genomic Best Linear Unbiased Predictor (i.e., TGBLUP). The threshold model was used to relate a hypothetical underlying scale to the outward categorical response. We present an empirical application where a total of nine models, five without interaction and four with genomic × environment interaction (G×E) and genomic additive × additive × environment interaction (G×G×E), were used. We assessed the proposed models using data consisting of 278 maize lines genotyped with 46,347 single-nucleotide polymorphisms and evaluated for disease resistance [with ordinal scores from 1 (no disease) to 5 (complete infection)] in three environments (Colombia, Zimbabwe, and Mexico). Models with G×E captured a sizeable proportion of the total variability, which indicates the importance of introducing interaction to improve prediction accuracy. Relative to models based on main effects only, the models that included G×E achieved 9-14% gains in prediction accuracy; adding additive × additive interactions did not increase prediction accuracy consistently across locations. Copyright © 2015 Montesinos-López et al.
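
    The threshold model underlying TGBLUP maps an unobserved continuous liability to the ordinal disease score through a set of cutpoints. A minimal sketch of that mapping is shown below; the cutpoints and liability values are illustrative, not estimates from the study.

```python
import numpy as np

# Four cutpoints on the latent liability scale define five ordinal categories
# (1 = no disease ... 5 = complete infection); values here are illustrative only.
cutpoints = np.array([-1.5, -0.5, 0.5, 1.5])

def liability_to_score(liability):
    """Ordinal disease score implied by the latent liability under the threshold model."""
    return int(np.searchsorted(cutpoints, liability) + 1)

for liability in (-2.0, -0.2, 0.9, 2.3):
    print(liability, "->", liability_to_score(liability))
```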

  5. Electrocardiogram signal denoising based on a new improved wavelet thresholding

    NASA Astrophysics Data System (ADS)

    Han, Guoqiang; Xu, Zhijun

    2016-08-01

    Good-quality electrocardiogram (ECG) recordings are used by physicians for the interpretation and identification of physiological and pathological phenomena. In general, ECG signals may be mixed with various noises, such as baseline wander, power line interference, and electromagnetic interference, during the gathering and recording process. As ECG signals are non-stationary physiological signals, the wavelet transform has proven to be an effective tool for removing noise from corrupted signals. A new compromise threshold function, a sigmoid-based thresholding scheme, is adopted for processing ECG signals. Compared with other methods such as hard/soft thresholding or other existing thresholding functions, the new algorithm has many advantages in the noise reduction of ECG signals. It overcomes the discontinuity at ±T of hard thresholding and reduces the fixed deviation of soft thresholding. The improved wavelet-thresholding denoising is shown to be more efficient than existing algorithms in ECG signal denoising. The signal-to-noise ratio, mean square error, and percent root-mean-square difference are calculated as quantitative tools to verify the denoising performance. The experimental results reveal that, with the new proposed method, the P, Q, R, and S waves of the denoised ECG signals coincide with those of the original ECG signals.
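
    The abstract does not give the exact sigmoid-based function, so the sketch below shows one plausible compromise form: a smooth sigmoid weight that is continuous at ±T (unlike hard thresholding) and leaves large coefficients nearly unchanged (unlike soft thresholding). In practice it would be applied to wavelet detail coefficients; only the threshold functions themselves are shown here.

```python
import numpy as np

def hard_threshold(w, t):
    return np.where(np.abs(w) >= t, w, 0.0)

def soft_threshold(w, t):
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

def sigmoid_threshold(w, t, k=10.0):
    """One plausible sigmoid-based compromise (an assumption, not the paper's exact
    function): a smooth 0-1 weight suppresses coefficients below threshold t while
    large coefficients pass almost unchanged, avoiding soft thresholding's fixed bias."""
    weight = 1.0 / (1.0 + np.exp(-k * (np.abs(w) - t)))
    return weight * w

w = np.array([-3.0, -1.2, -0.4, 0.1, 0.8, 2.5])
t = 1.0
print(hard_threshold(w, t))
print(soft_threshold(w, t))
print(np.round(sigmoid_threshold(w, t), 3))
```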

  6. Model-based prediction of nephropathia epidemica outbreaks based on climatological and vegetation data and bank vole population dynamics.

    PubMed

    Haredasht, S Amirpour; Taylor, C J; Maes, P; Verstraeten, W W; Clement, J; Barrios, M; Lagrou, K; Van Ranst, M; Coppin, P; Berckmans, D; Aerts, J-M

    2013-11-01

    Wildlife-originated zoonotic diseases in general are a major contributor to emerging infectious diseases. Hantaviruses more specifically cause thousands of human disease cases annually worldwide, while understanding and predicting human hantavirus epidemics pose numerous unsolved challenges. Nephropathia epidemica (NE) is a human infection caused by Puumala virus, which is naturally carried and shed by bank voles (Myodes glareolus). The objective of this study was to develop a method that allows model-based prediction of NE epidemics 3 months ahead of their occurrence. Two data sets were utilized to develop and test the models. These data sets were concerned with NE cases in Finland and Belgium. In this study, we selected the most relevant inputs from all the available data for use in a dynamic linear regression (DLR) model. The number of NE cases in Finland was modelled using data from 1996 to 2008. The NE cases were predicted based on the time series data of average monthly air temperature (°C) and the bank voles' trapping index using a DLR model. The bank voles' trapping index data were interpolated using a related dynamic harmonic regression model (DHR). Here, the DLR and DHR models used time-varying parameters. Both the DHR and DLR models were based on a unified state-space estimation framework. For the Belgian case, no time series of the bank voles' population dynamics was available. Several studies, however, have suggested that the population of bank voles is related to the variation in seed production of beech and oak trees in Northern Europe. Therefore, the NE occurrence pattern in Belgium was predicted based on a DLR model by using remotely sensed phenology parameters of broad-leaved forests, together with the oak and beech seed categories and average monthly air temperature (°C), using data from 2001 to 2009. Our results suggest that even without any knowledge about hantavirus dynamics in the host population, the time variation in NE outbreaks in Finland

  7. Epidemic threshold of the susceptible-infected-susceptible model on complex networks

    NASA Astrophysics Data System (ADS)

    Lee, Hyun Keun; Shim, Pyoung-Seop; Noh, Jae Dong

    2013-06-01

    We demonstrate that the susceptible-infected-susceptible (SIS) model on complex networks can have an inactive Griffiths phase characterized by a slow relaxation dynamics. It contrasts with the mean-field theoretical prediction that the SIS model on complex networks is active at any nonzero infection rate. The dynamic fluctuation of infected nodes, ignored in the mean field approach, is responsible for the inactive phase. It is proposed that the question whether the epidemic threshold of the SIS model on complex networks is zero or not can be resolved by the percolation threshold in a model where nodes are occupied in degree-descending order. Our arguments are supported by the numerical studies on scale-free network models.

  8. A threshold-based fixed predictor for JPEG-LS image compression

    NASA Astrophysics Data System (ADS)

    Deng, Lihua; Huang, Zhenghua; Yao, Shoukui

    2018-03-01

    In JPEG-LS, the fixed predictor based on the median edge detector (MED) detects only horizontal and vertical edges, and thus produces large prediction errors near diagonal edges. In this paper, we propose a threshold-based edge detection scheme for the fixed predictor. The proposed scheme can detect not only horizontal and vertical edges but also diagonal edges. For certain thresholds, the proposed scheme reduces to other existing schemes, so it can also be regarded as an integration of these schemes. For a suitable threshold, the accuracy of horizontal and vertical edge detection is higher than that of the existing median edge detector in JPEG-LS. Thus, the proposed fixed predictor outperforms the existing JPEG-LS predictors for all images tested, while the complexity of the overall algorithm is maintained at a similar level.
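
    For reference, the standard JPEG-LS MED fixed predictor that the proposed scheme extends is sketched below; the threshold-based extension itself is not specified in the abstract and is therefore omitted.

```python
def med_predict(a, b, c):
    """Standard JPEG-LS median edge detector (MED) fixed predictor.
    a = left neighbour, b = above neighbour, c = upper-left neighbour."""
    if c >= max(a, b):
        return min(a, b)      # edge detected: take the smaller neighbour
    if c <= min(a, b):
        return max(a, b)      # edge in the other direction: take the larger neighbour
    return a + b - c          # smooth region: planar (gradient) prediction

print(med_predict(100, 120, 130))  # edge case -> 100
print(med_predict(100, 120, 90))   # edge case -> 120
print(med_predict(100, 120, 110))  # smooth region -> 110
```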

  9. Modeling Evolution of the Chandeleur Barrier Islands, Southeastern Louisiana: Initial Exploration of a Possible Threshold Crossing

    NASA Astrophysics Data System (ADS)

    Moore, L. J.; List, J. H.; Williams, S. J.

    2007-12-01

    Airborne photographic and lidar observations of the 72 km-long Chandeleur Island arc in southeastern Louisiana since August 2005 indicate that large volumes of sediment were removed from the islands during and following Hurricane Katrina and suggest that a return to pre-storm island configuration may be unlikely. Others have suggested, based on recent field observations, that the southern portion of the Chandeleur Islands may be showing signs of becoming an inner shelf shoal. In contrast to these observations, plentiful sand has been observed in the nearshore farther to the north; based on this finding it has been suggested that at least the northern portion of the Chandeleur Islands may be poised for recovery. Given the range of observations, it is unclear if Hurricane Katrina initiated a threshold crossing in the Chandeleurs causing the subaerial, landward-migrating barrier islands to begin evolving as submerged sand shoals. If a threshold crossing has not yet occurred and the Chandeleurs do recover from the impact of Hurricane Katrina, it remains uncertain how imminent a threshold crossing may be. To better understand the potential future evolution of the Chandeleur Islands and to assess the combination of factors that are likely to cause a threshold crossing in this environment, a series of initial model experiments are being conducted using the morphological-behavior model GEOMBEST. This model simulates the evolution of coastal morphology and stratigraphy resulting from changes in relative sea level and sediment supply, and provides insight into how barriers evolve over time scales ranging from decades to millennia. Vibracore logs, geophysical records, bathymetric surveys, and lidar surveys provide data necessary to design the model domain, while sediment budget studies, estimates of sea-level rise rates, and measurements of shoreline change rates provide input and calibration parameters. Late Holocene model runs simulate the evolution of 42 km-long North

  10. Nut crop yield records show that budbreak-based chilling requirements may not reflect yield decline chill thresholds

    NASA Astrophysics Data System (ADS)

    Pope, Katherine S.; Dose, Volker; Da Silva, David; Brown, Patrick H.; DeJong, Theodore M.

    2015-06-01

    Warming winters due to climate change may critically affect temperate tree species. Insufficiently cold winters are thought to result in fewer viable flower buds and the subsequent development of fewer fruits or nuts, decreasing the yield of an orchard or fecundity of a species. The best existing approximation for a threshold of sufficient cold accumulation, the "chilling requirement" of a species or variety, has been quantified by manipulating or modeling the conditions that result in dormant bud breaking. However, the physiological processes that affect budbreak are not the same as those that determine yield. This study sought to test whether budbreak-based chilling thresholds can reasonably approximate the thresholds that affect yield, particularly regarding the potential impacts of climate change on temperate tree crop yields. County-wide yield records for almond (Prunus dulcis), pistachio (Pistacia vera), and walnut (Juglans regia) in the Central Valley of California were compared with 50 years of weather records. Bayesian nonparametric function estimation was used to model yield potentials at varying amounts of chill accumulation. In almonds, average yields occurred when chill accumulation was close to the budbreak-based chilling requirement. However, in the other two crops, pistachios and walnuts, the best previous estimate of the budbreak-based chilling requirements was 19-32% higher than the chilling accumulations associated with average or above average yields. This research indicates that physiological processes beyond requirements for budbreak should be considered when estimating chill accumulation thresholds of yield decline and potential impacts of climate change.

  11. Computational Modeling of Interventions and Protective Thresholds to Prevent Disease Transmission in Deploying Populations

    PubMed Central

    2014-01-01

    Military personnel are deployed abroad for missions ranging from humanitarian relief efforts to combat actions; delay or interruption in these activities due to disease transmission can cause operational disruptions, significant economic loss, and stressed or exceeded military medical resources. Deployed troops function in environments favorable to the rapid and efficient transmission of many viruses particularly when levels of protection are suboptimal. When immunity among deployed military populations is low, the risk of vaccine-preventable disease outbreaks increases, impacting troop readiness and achievement of mission objectives. However, targeted vaccination and the optimization of preexisting immunity among deployed populations can decrease the threat of outbreaks among deployed troops. Here we describe methods for the computational modeling of disease transmission to explore how preexisting immunity compares with vaccination at the time of deployment as a means of preventing outbreaks and protecting troops and mission objectives during extended military deployment actions. These methods are illustrated with five modeling case studies for separate diseases common in many parts of the world, to show different approaches required in varying epidemiological settings. PMID:25009579

  12. Computational modeling of interventions and protective thresholds to prevent disease transmission in deploying populations.

    PubMed

    Burgess, Colleen; Peace, Angela; Everett, Rebecca; Allegri, Buena; Garman, Patrick

    2014-01-01

    Military personnel are deployed abroad for missions ranging from humanitarian relief efforts to combat actions; delay or interruption in these activities due to disease transmission can cause operational disruptions, significant economic loss, and stressed or exceeded military medical resources. Deployed troops function in environments favorable to the rapid and efficient transmission of many viruses particularly when levels of protection are suboptimal. When immunity among deployed military populations is low, the risk of vaccine-preventable disease outbreaks increases, impacting troop readiness and achievement of mission objectives. However, targeted vaccination and the optimization of preexisting immunity among deployed populations can decrease the threat of outbreaks among deployed troops. Here we describe methods for the computational modeling of disease transmission to explore how preexisting immunity compares with vaccination at the time of deployment as a means of preventing outbreaks and protecting troops and mission objectives during extended military deployment actions. These methods are illustrated with five modeling case studies for separate diseases common in many parts of the world, to show different approaches required in varying epidemiological settings.

  13. Rainfall-triggered shallow landslides at catchment scale: Threshold mechanics-based modeling for abruptness and localization

    NASA Astrophysics Data System (ADS)

    von Ruette, J.; Lehmann, P.; Or, D.

    2013-10-01

    Rainfall-induced shallow landslides may occur abruptly without distinct precursors and could span a wide range of soil mass released during a triggering event. We present a rainfall-induced landslide-triggering model for steep catchments with surfaces represented as an assembly of hydrologically and mechanically interconnected soil columns. The abruptness of failure was captured by defining local strength thresholds for mechanical bonds linking soil and bedrock and adjacent columns, whereby a failure of a single bond may initiate a chain reaction of subsequent failures, culminating in local mass release (a landslide). The catchment-scale hydromechanical landslide-triggering model (CHLT) was applied to results from two event-based landslide inventories triggered by two rainfall events in 2002 and 2005 in two nearby catchments located in the Prealps in Switzerland. Rainfall radar data, surface elevation and vegetation maps, and a soil production model for soil depth distribution were used for hydromechanical modeling of failure patterns for the two rainfall events at spatial and temporal resolutions of 2.5 m and 0.02 h, respectively. The CHLT model enabled systematic evaluation of the effects of soil type, mechanical reinforcement (soil cohesion and lateral root strength), and initial soil water content on landslide characteristics. We compared various landslide metrics and spatial distribution of simulated landslides in subcatchments with observed inventory data. Model parameters were optimized for the short but intense rainfall event in 2002, and the calibrated model was then applied for the 2005 rainfall, yielding reasonable predictions of landslide events and volumes and statistically reproducing localized landslide patterns similar to inventory data. The model provides a means for identifying local hot spots and offers insights into the dynamics of locally resolved landslide hazards in mountainous regions.

  14. Towards a threshold climate for emergency lower respiratory hospital admissions.

    PubMed

    Islam, Muhammad Saiful; Chaussalet, Thierry J; Koizumi, Naoru

    2017-02-01

    Identification of 'cut-points' or thresholds of climate factors would play a crucial role in alerting to the risks of climate change and providing guidance to policymakers. This study investigated a 'Climate Threshold' for emergency hospital admissions of chronic lower respiratory diseases by using a distributed lag non-linear model (DLNM). We analysed a unique longitudinal dataset (10 years, 2000-2009) on emergency hospital admissions, climate, and pollution factors for Greater London. Our study extends existing work on this topic by considering non-linearity and lag effects between climate factors and disease exposure within the DLNM model, using B-splines as the smoothing technique. The final model also considered natural cubic splines of time since exposure and 'day of the week' as confounding factors. The results of the DLNM indicated a significant improvement in model fitting compared to a typical GLM model. The final model identified the thresholds of several climate factors, including high temperature (≥27°C), low relative humidity (≤40%), high PM10 level (≥70 µg/m³), low wind speed (≤2 knots) and high rainfall (≥30 mm). Beyond the threshold values, a significantly higher number of emergency admissions due to lower respiratory problems would be expected within the following 2-3 days after the climate shift in Greater London. The approach will be useful to initiate 'region and disease specific' climate mitigation plans. It will help identify spatial hot spots and the most sensitive areas and populations under climate change, and will eventually lead towards a diversified health warning system tailored to specific climate zones and populations. Copyright © 2016 Elsevier Inc. All rights reserved.

  15. Towards A Complete Model Of Photopic Visual Threshold Performance

    NASA Astrophysics Data System (ADS)

    Overington, I.

    1982-02-01

    Based on a wide variety of fragmentary evidence taken from psycho-physics, neurophysiology and electron microscopy, it has been possible to put together a very widely applicable conceptual model of photopic visual threshold performance. Such a model is so complex that a single comprehensive mathematical version is excessively cumbersome. It is, however, possible to set up a suite of related mathematical models, each of limited application but strictly known envelope of usage. Such models may be used for assessment of a variety of facets of visual performance when using display imagery, including effects and interactions of image quality, random and discrete display noise, viewing distance, image motion, etc., both for foveal interrogation tasks and for visual search tasks. The specific model may be selected from the suite according to the assessment task in hand. The paper discusses in some depth the major facets of preperceptual visual processing and their interaction with instrumental image quality and noise. It then highlights the statistical nature of visual performance before going on to consider a number of specific mathematical models of partial visual function. Where appropriate, these are compared with widely popular empirical models of visual function.

  16. A flash flood early warning system based on rainfall thresholds and daily soil moisture indexes

    NASA Astrophysics Data System (ADS)

    Brigandì, Giuseppina; Tito Aronica, Giuseppe

    2015-04-01

    The main focus of the paper is to present a flash flood early warning system, developed for the Civil Protection Agency of the Sicily Region, for alerting on extreme hydrometeorological events using a methodology based on the combined use of rainfall thresholds and soil moisture indexes. Flash flood warning is a key element in improving Civil Protection efforts to mitigate damage and safeguard people's safety. It is a rather complicated task, particularly in catchments with a flashy response, where even short lead times are important and welcome. In this context, hydrological precursors can be considered to improve the effectiveness of emergency actions (i.e. early flood warning). It is well known that soil moisture is an important factor in flood formation, because runoff generation is strongly influenced by the antecedent soil moisture conditions of the catchment. The basic idea of the work presented here is to use soil moisture indexes derived in a continuous form to define a first alert phase in a flash flood forecasting chain, and then to define a unique rainfall threshold for a given day for activating the subsequent alarm phases, derived as a function of the soil moisture conditions at the beginning of the day. Daily soil moisture indexes, representative of the moisture condition of the catchment, were derived using a parsimonious and simple-to-use approach based on the IHACRES model applied in a modified form developed by the authors. It is a simple, spatially-lumped rainfall-streamflow model, based on the SCS-CN method and on the unit hydrograph approach, that requires only rainfall, streamflow and air temperature data. It consists of two modules. In the first, a non-linear loss model, based on the SCS-CN method, was used to transform total rainfall into effective rainfall. In the second, a linear convolution of effective rainfall was performed using a total unit hydrograph with a configuration of
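
    The first module rests on the standard SCS-CN loss equation, which can be sketched as follows; the curve number used here is illustrative, and the authors' modified IHACRES formulation may differ in detail (for instance in how it accounts for antecedent soil moisture).

```python
def scs_cn_effective_rainfall(p_mm, cn, ia_ratio=0.2):
    """Effective (runoff-producing) rainfall from total event rainfall via the
    standard SCS-CN method: S = 25400/CN - 254 (mm), Ia = ia_ratio * S,
    Q = (P - Ia)^2 / (P - Ia + S) for P > Ia, otherwise 0."""
    s = 25400.0 / cn - 254.0
    ia = ia_ratio * s
    if p_mm <= ia:
        return 0.0
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

# Illustrative curve number; not a value from the paper
for p in (10.0, 30.0, 60.0):
    print(p, "mm total ->", round(scs_cn_effective_rainfall(p, cn=75), 2), "mm effective")
```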

  17. Bayesian population analysis of a washin-washout physiologically based pharmacokinetic model for acetone

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moerk, Anna-Karin, E-mail: anna-karin.mork@ki.s; Jonsson, Fredrik; Pharsight, a Certara company, St. Louis, MO

    2009-11-01

    The aim of this study was to derive improved estimates of population variability and uncertainty of physiologically based pharmacokinetic (PBPK) model parameters, especially of those related to the washin-washout behavior of polar volatile substances. This was done by optimizing a previously published washin-washout PBPK model for acetone in a Bayesian framework using Markov chain Monte Carlo simulation. The sensitivity of the model parameters was investigated by creating four different prior sets, where the uncertainty surrounding the population variability of the physiological model parameters was given values corresponding to coefficients of variation of 1%, 25%, 50%, and 100%, respectively. The PBPK model was calibrated to toxicokinetic data from 2 previous studies where 18 volunteers were exposed to 250-550 ppm of acetone at various levels of workload. The updated PBPK model provided a good description of the concentrations in arterial, venous, and exhaled air. The precision of most of the model parameter estimates was improved. New information was particularly gained on the population distribution of the parameters governing the washin-washout effect. The results presented herein provide a good starting point to estimate the target dose of acetone in the working and general populations for risk assessment purposes.

  18. Agent-Based Phytoplankton Models of Cellular and Population Processes: Fostering Individual-Based Learning in Undergraduate Research

    NASA Astrophysics Data System (ADS)

    Berges, J. A.; Raphael, T.; Rafa Todd, C. S.; Bate, T. C.; Hellweger, F. L.

    2016-02-01

    Engaging undergraduate students in research projects that require expertise in multiple disciplines (e.g. cell biology, population ecology, and mathematical modeling) can be challenging because they have often not developed the expertise that allows them to participate at a satisfying level. Use of agent-based modeling can allow exploration of concepts at more intuitive levels, and encourage experimentation that emphasizes processes over computational skills. Over the past several years, we have involved undergraduate students in projects examining both ecological and cell biological aspects of aquatic microbial biology, using the freely-downloadable, agent-based modeling environment NetLogo (https://ccl.northwestern.edu/netlogo/). In Netlogo, actions of large numbers of individuals can be simulated, leading to complex systems with emergent behavior. The interface features appealing graphics, monitors, and control structures. In one example, a group of sophomores in a BioMathematics program developed an agent-based model of phytoplankton population dynamics in a pond ecosystem, motivated by observed macroscopic changes in cell numbers (due to growth and death), and driven by responses to irradiance, temperature and a limiting nutrient. In a second example, junior and senior undergraduates conducting Independent Studies created a model of the intracellular processes governing stress and cell death for individual phytoplankton cells (based on parameters derived from experiments using single-cell culturing and flow cytometry), and then this model was embedded in the agents in the pond ecosystem model. In our experience, students with a range of mathematical abilities learned to code quickly and could use the software with varying degrees of sophistication, for example, creation of spatially-explicit two and three-dimensional models. Skills developed quickly and transferred readily to other platforms (e.g. Matlab).

  19. Sustainability in single-species population models.

    PubMed

    Quinn, Terrance J; Collie, Jeremy S

    2005-01-29

    In this paper, we review the concept of sustainability with regard to a single-species, age-structured fish population with density dependence at some stage of its life history. We trace the development of the view of sustainability through four periods. The classical view of sustainability, prevalent in the 1970s and earlier, developed from deterministic production models, in which equilibrium abundance or biomass is derived as a function of fishing mortality. When there is no fishing mortality, the population equilibrates about its carrying capacity. We show that carrying capacity is the result of reproductive and mortality processes and is not a fixed constant unless these processes are constant. There is usually a fishing mortality, F(MSY), which results in MSY, and a higher value, F(ext), for which the population is eventually driven to extinction. For each F between 0 and F(ext), there is a corresponding sustainable population. From this viewpoint, the primary tool for achieving sustainability is the control of fishing mortality. The neoclassical view of sustainability, developed in the 1980s, involved population models with depensation and stochasticity. This viewpoint is in accord with the perception that a population at a low level is susceptible to collapse or to a lack of rebuilding regardless of fishing. Sustainability occurs in a more restricted range from that in the classical view and includes an abundance threshold. A variety of studies has suggested that fishing mortality should not let a population drop below a threshold at 10-20% of carrying capacity. The modern view of sustainability in the 1990s moves further in the direction of precaution. The fishing mortality limit is the former target of F(MSY) (or some proxy), and the target fishing mortality is set lower. This viewpoint further reduces the range of permissible fishing mortalities and resultant desired population sizes. The objective has shifted from optimizing long-term catch to

  20. Sustainability in single-species population models

    PubMed Central

    Quinn, Terrance J.; Collie, Jeremy S.

    2005-01-01

    In this paper, we review the concept of sustainability with regard to a single-species, age-structured fish population with density dependence at some stage of its life history. We trace the development of the view of sustainability through four periods. The classical view of sustainability, prevalent in the 1970s and earlier, developed from deterministic production models, in which equilibrium abundance or biomass is derived as a function of fishing mortality. When there is no fishing mortality, the population equilibrates about its carrying capacity. We show that carrying capacity is the result of reproductive and mortality processes and is not a fixed constant unless these processes are constant. There is usually a fishing mortality, FMSY, which results in MSY, and a higher value, Fext, for which the population is eventually driven to extinction. For each F between 0 and Fext, there is a corresponding sustainable population. From this viewpoint, the primary tool for achieving sustainability is the control of fishing mortality. The neoclassical view of sustainability, developed in the 1980s, involved population models with depensation and stochasticity. This viewpoint is in accord with the perception that a population at a low level is susceptible to collapse or to a lack of rebuilding regardless of fishing. Sustainability occurs in a more restricted range from that in the classical view and includes an abundance threshold. A variety of studies has suggested that fishing mortality should not let a population drop below a threshold at 10–20% of carrying capacity. The modern view of sustainability in the 1990s moves further in the direction of precaution. The fishing mortality limit is the former target of FMSY (or some proxy), and the target fishing mortality is set lower. This viewpoint further reduces the range of permissible fishing mortalities and resultant desired population sizes. The objective has shifted from optimizing long-term catch to preserving

  1. Re-visiting Trichuris trichiura intensity thresholds based on anemia during pregnancy.

    PubMed

    Gyorkos, Theresa W; Gilbert, Nicolas L; Larocque, Renée; Casapía, Martín; Montresor, Antonio

    2012-01-01

    The intensity categories, or thresholds, currently used for Trichuris trichiura (i.e., intensities of 1-999 epg (light), 1,000-9,999 epg (moderate), and ≥ 10,000 epg (heavy)) were developed in the 1980s, when there were few epidemiological data available on dose-response relationships. This study was undertaken to determine a threshold for T. trichiura-associated anemia in pregnant women and to describe the implications of this threshold in terms of the need for primary prevention and chemotherapeutic interventions. In Iquitos, Peru, 935 pregnant women were tested for T. trichiura infection in their second trimester of pregnancy; were given daily iron supplements throughout their pregnancy; and had their blood hemoglobin levels measured in their third trimester of pregnancy. Women in the highest two T. trichiura intensity quintiles (601-1632 epg and ≥ 1633 epg) had significantly lower mean hemoglobin concentrations than the lowest quintile (0-24 epg). They also had a statistically significantly higher risk of anemia, with adjusted odds ratios of 1.67 (95% CI: 1.02, 2.62) and 1.73 (95% CI: 1.09, 2.74), respectively. This analysis provides support for categorizing a T. trichiura infection ≥ 1,000 epg as 'moderate', as currently defined by the World Health Organization. Because this 'moderate' level of T. trichiura infection was found to be a significant risk factor for anemia in pregnant women, the intensity of Trichuris infection deemed to cause or aggravate anemia should no longer be restricted to the 'heavy' intensity category. It should now include both 'heavy' and 'moderate' intensities of Trichuris infection. Evidence-based deworming strategies targeting pregnant women or populations where anemia is of concern should be updated accordingly.

  2. Determination and validation of soil thresholds for cadmium based on food quality standard and health risk assessment.

    PubMed

    Ding, Changfeng; Ma, Yibing; Li, Xiaogang; Zhang, Taolin; Wang, Xingxiang

    2018-04-01

    Cadmium (Cd) is an environmental toxicant with high rates of soil-plant transfer. It is essential to establish an accurate soil threshold for the implementation of soil management practices. This study takes root vegetables as an example to derive soil thresholds for Cd based on the food quality standard as well as health risk assessment using species sensitivity distribution (SSD). A soil type-specific bioconcentration factor (BCF, ratio of Cd concentration in plant to that in soil) generated from soil with a proper Cd concentration gradient was calculated and applied in the derivation of soil thresholds instead of a generic BCF value to minimize the uncertainty. The sensitivity variations of twelve root vegetable cultivars for accumulating soil Cd and the empirical soil-plant transfer model were investigated and developed in greenhouse experiments. After normalization, the hazardous concentrations from the fifth percentile of the distribution based on added Cd (HC5add) were calculated from the SSD curves fitted by the Burr Type III distribution. The derived soil thresholds were presented as continuous or scenario criteria depending on the combination of soil pH and organic carbon content. The soil thresholds based on the food quality standard were on average 0.7-fold of those based on health risk assessment, and were further validated to be reliable using independent data from field survey and published articles. The results suggested that deriving soil thresholds for Cd using the SSD method is robust and also applicable to other crops as well as other trace elements that have the potential to cause health risk issues. Copyright © 2017 Elsevier B.V. All rights reserved.
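
    A minimal sketch of the SSD step, fitting a Burr Type III distribution and reading off the fifth-percentile hazardous concentration (HC5), is shown below using SciPy's `burr` distribution (Burr Type III); the cultivar sensitivity values are invented for illustration and are not the study's data.

```python
import numpy as np
from scipy.stats import burr  # Burr Type III in SciPy

rng = np.random.default_rng(2)

# Hypothetical normalized sensitivity values for twelve cultivars (e.g., the added-Cd
# level at which each cultivar exceeds the food quality standard); invented numbers.
sensitivities = rng.lognormal(mean=0.0, sigma=0.4, size=12)

# Fit the SSD and take its 5th percentile as the hazardous concentration HC5
c, d, loc, scale = burr.fit(sensitivities, floc=0)
hc5 = burr.ppf(0.05, c, d, loc=loc, scale=scale)
print(f"HC5 (added Cd, illustrative units): {hc5:.3f}")
```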

  3. Modeling Source Water Threshold Exceedances with Extreme Value Theory

    NASA Astrophysics Data System (ADS)

    Rajagopalan, B.; Samson, C.; Summers, R. S.

    2016-12-01

    Variability in surface water quality, influenced by seasonal and long-term climate changes, can impact drinking water quality and treatment. In particular, temperature and precipitation can impact surface water quality directly or through their influence on streamflow and dilution capacity. Furthermore, they also impact land surface factors, such as soil moisture and vegetation, which can in turn affect surface water quality, in particular levels of organic matter in surface waters, which are of concern. All of these effects will be exacerbated by anthropogenic climate change. While some source water quality parameters, particularly Total Organic Carbon (TOC) and bromide concentrations, are not directly regulated for drinking water, these parameters are precursors to the formation of disinfection byproducts (DBPs), which are regulated in drinking water distribution systems. These DBPs form when a disinfectant, added to the water to protect public health against microbial pathogens, most commonly chlorine, reacts with dissolved organic matter (DOM), measured as TOC or dissolved organic carbon (DOC), and inorganic precursor materials, such as bromide. Therefore, understanding and modeling the extremes of TOC and bromide concentrations is of critical interest for drinking water utilities. In this study we develop nonstationary extreme value analysis models for threshold exceedances of source water quality parameters, specifically TOC and bromide concentrations. In this framework, the threshold exceedances are modeled with a Generalized Pareto Distribution (GPD) whose parameters vary as a function of climate and land surface variables, thus capturing the temporal nonstationarity. We apply this approach to model threshold exceedances of source water TOC and bromide concentrations at two locations with different climates and find very good performance.
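
    A stationary version of the exceedance model can be sketched with SciPy's generalized Pareto distribution; the TOC series and the fixed threshold below are invented for illustration, and the covariate-dependent (nonstationary) parameters used in the study are omitted.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(3)

# Hypothetical daily TOC concentrations (mg/L); values are invented for illustration.
toc = rng.gamma(shape=4.0, scale=1.0, size=3000)
threshold = np.quantile(toc, 0.95)            # a high fixed threshold
exceedances = toc[toc > threshold] - threshold

# Stationary GPD fit to the excesses above the threshold
shape, loc, scale = genpareto.fit(exceedances, floc=0)

# Probability that an exceedance is more than 2 mg/L above the threshold
p_tail = genpareto.sf(2.0, shape, loc=loc, scale=scale)
print(f"threshold={threshold:.2f} mg/L, xi={shape:.3f}, sigma={scale:.3f}, P(excess>2)={p_tail:.3f}")
```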

  4. The impact of manual threshold selection in medical additive manufacturing.

    PubMed

    van Eijnatten, Maureen; Koivisto, Juha; Karhu, Kalle; Forouzanfar, Tymour; Wolff, Jan

    2017-04-01

    Medical additive manufacturing requires standard tessellation language (STL) models. Such models are commonly derived from computed tomography (CT) images using thresholding. Threshold selection can be performed manually or automatically. The aim of this study was to assess the impact of manual and default threshold selection on the reliability and accuracy of skull STL models using different CT technologies. One female and one male human cadaver head were imaged using multi-detector row CT, dual-energy CT, and two cone-beam CT scanners. Four medical engineers manually thresholded the bony structures on all CT images. The lowest and highest selected mean threshold values and the default threshold value were used to generate skull STL models. Geometric variations between all manually thresholded STL models were calculated. Furthermore, in order to calculate the accuracy of the manually and default thresholded STL models, all STL models were superimposed on an optical scan of the dry female and male skulls ("gold standard"). The intra- and inter-observer variability of the manual threshold selection was good (intra-class correlation coefficients >0.9). All engineers selected grey values closer to soft tissue to compensate for bone voids. Geometric variations between the manually thresholded STL models were 0.13 mm (multi-detector row CT), 0.59 mm (dual-energy CT), and 0.55 mm (cone-beam CT). All STL models demonstrated inaccuracies ranging from -0.8 to +1.1 mm (multi-detector row CT), -0.7 to +2.0 mm (dual-energy CT), and -2.3 to +4.8 mm (cone-beam CT). This study demonstrates that manual threshold selection results in better STL models than default thresholding. The use of dual-energy CT and cone-beam CT technology in its present form does not deliver reliable or accurate STL models for medical additive manufacturing. New approaches are required that are based on pattern recognition and machine learning algorithms.

  5. Effects of pump recycling technique on stimulated Brillouin scattering threshold: a theoretical model.

    PubMed

    Al-Asadi, H A; Al-Mansoori, M H; Ajiya, M; Hitam, S; Saripan, M I; Mahdi, M A

    2010-10-11

    We develop a theoretical model that can be used to predict stimulated Brillouin scattering (SBS) threshold in optical fibers that arises through the effect of Brillouin pump recycling technique. Obtained simulation results from our model are in close agreement with our experimental results. The developed model utilizes single mode optical fiber of different lengths as the Brillouin gain media. For 5-km long single mode fiber, the calculated threshold power for SBS is about 16 mW for conventional technique. This value is reduced to about 8 mW when the residual Brillouin pump is recycled at the end of the fiber. The decrement of SBS threshold is due to longer interaction lengths between Brillouin pump and Stokes wave.

  6. Development and validation of a new population-based simulation model of osteoarthritis in New Zealand.

    PubMed

    Wilson, R; Abbott, J H

    2018-04-01

    To describe the construction and preliminary validation of a new population-based microsimulation model developed to analyse the health and economic burden and cost-effectiveness of treatments for knee osteoarthritis (OA) in New Zealand (NZ). We developed the New Zealand Management of Osteoarthritis (NZ-MOA) model, a discrete-time state-transition microsimulation model of the natural history of radiographic knee OA. In this article, we report on the model structure, derivation of input data, validation of baseline model parameters against external data sources, and validation of model outputs by comparison of the predicted population health loss with previous estimates. The NZ-MOA model simulates both the structural progression of radiographic knee OA and the stochastic development of multiple disease symptoms. Input parameters were sourced from NZ population-based data where possible, and from international sources where NZ-specific data were not available. The predicted distributions of structural OA severity and health utility detriments associated with OA were externally validated against other sources of evidence, and uncertainty resulting from key input parameters was quantified. The resulting lifetime and current population health-loss burden was consistent with estimates of previous studies. The new NZ-MOA model provides reliable estimates of the health loss associated with knee OA in the NZ population. The model structure is suitable for analysis of the effects of a range of potential treatments, and will be used in future work to evaluate the cost-effectiveness of recommended interventions within the NZ healthcare system. Copyright © 2018 Osteoarthritis Research Society International. Published by Elsevier Ltd. All rights reserved.

  7. Normal Threshold Size of Stimuli in Children Using a Game-Based Visual Field Test.

    PubMed

    Wang, Yanfang; Ali, Zaria; Subramani, Siddharth; Biswas, Susmito; Fenerty, Cecilia; Henson, David B; Aslam, Tariq

    2017-06-01

    The aim of this study was to demonstrate and explore the ability of novel game-based perimetry to establish normal visual field thresholds in children. One hundred and eighteen children (aged 8.0 ± 2.8 years) with no history of visual field loss or significant medical history were recruited. Each child had one eye tested using a game-based visual field test 'Caspar's Castle' at four retinal locations 12.7° from fixation (N = 118). Thresholds were established repeatedly using up/down staircase algorithms with stimuli of varying diameter (luminance 20 cd/m², duration 200 ms, background luminance 10 cd/m²). Relationships between threshold and age were determined along with measures of intra- and intersubject variability. The game-based visual field test was able to establish threshold estimates in the full range of children tested. Threshold size decreased with increasing age. Intrasubject and intersubject variability were inversely related to age. Normal visual field thresholds were established for specific locations in children using a novel game-based visual field test. These could be used as a foundation for developing a game-based perimetry screening test for children.
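
    The up/down staircase procedure mentioned above can be illustrated with a minimal sketch: a simulated observer responds "seen/missed" according to a logistic psychometric function, and a 1-up/1-down rule tracks the stimulus size. The observer model, step sizes, and all parameter values are illustrative assumptions, not the Caspar's Castle implementation.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def simulated_observer(stimulus_size, true_threshold, slope=4.0):
        """Probability of 'seen' rises with stimulus size (logistic psychometric function)."""
        p_seen = 1.0 / (1.0 + np.exp(-slope * (stimulus_size - true_threshold)))
        return rng.random() < p_seen

    def staircase_threshold(true_threshold, start=2.0, step=0.1, n_reversals=8):
        """1-up/1-down staircase: decrease size after 'seen', increase after 'missed'.

        The threshold estimate is the mean stimulus size over the recorded reversals."""
        size = start
        last_direction = 0
        reversals = []
        while len(reversals) < n_reversals:
            seen = simulated_observer(size, true_threshold)
            direction = -1 if seen else +1
            if last_direction != 0 and direction != last_direction:
                reversals.append(size)
            last_direction = direction
            size = max(step, size + direction * step)
        return float(np.mean(reversals))

    print(f"estimated threshold ~ {staircase_threshold(true_threshold=0.8):.2f}")
    ```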

  8. The threshold of a stochastic delayed SIR epidemic model with temporary immunity

    NASA Astrophysics Data System (ADS)

    Liu, Qun; Chen, Qingmei; Jiang, Daqing

    2016-05-01

    This paper is concerned with the asymptotic properties of a stochastic delayed SIR epidemic model with temporary immunity. Sufficient conditions for extinction and persistence in the mean of the epidemic are established. The threshold between persistence in the mean and extinction of the epidemic is obtained. Compared with the corresponding deterministic model, the threshold, which is affected by the white noise, is smaller than the basic reproduction number R0 of the deterministic system.
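
    A minimal Euler–Maruyama sketch (without the delay or temporary immunity of the cited model) illustrates the general setting: white noise perturbs the transmission term, and its intensity lowers the effective persistence threshold relative to the deterministic R0. All parameter values are illustrative assumptions.

    ```python
    import numpy as np

    def simulate_sir(beta=0.4, gamma=0.2, mu=0.02, sigma=0.1,
                     s0=0.99, i0=0.01, t_max=200.0, dt=0.01, seed=1):
        """Euler-Maruyama simulation of an SIR model with white noise on transmission.

        dS = (mu - beta*S*I - mu*S) dt - sigma*S*I dB
        dI = (beta*S*I - (gamma + mu)*I) dt + sigma*S*I dB
        Population is scaled to 1; no delay or temporary immunity is included here."""
        rng = np.random.default_rng(seed)
        n = int(t_max / dt)
        s, i = s0, i0
        i_path = np.empty(n)
        for k in range(n):
            db = rng.normal(0.0, np.sqrt(dt))
            ds = (mu - beta * s * i - mu * s) * dt - sigma * s * i * db
            di = (beta * s * i - (gamma + mu) * i) * dt + sigma * s * i * db
            s, i = max(s + ds, 0.0), max(i + di, 0.0)
            i_path[k] = i
        return i_path

    r0 = 0.4 / (0.2 + 0.02)          # deterministic basic reproduction number
    mean_i = simulate_sir().mean()
    print(f"R0 = {r0:.2f}, long-run mean infected fraction ~ {mean_i:.4f}")
    ```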

  9. Effect of threshold disorder on the quorum percolation model

    NASA Astrophysics Data System (ADS)

    Monceau, Pascal; Renault, Renaud; Métens, Stéphane; Bottani, Samuel

    2016-07-01

    We study the modifications induced in the behavior of the quorum percolation model on neural networks with Gaussian in-degree by taking into account uncorrelated Gaussian variability of the thresholds. We derive a mean-field approach and show its relevance by carrying out explicit Monte Carlo simulations. It turns out that such a disorder shifts the position of the percolation transition, impacts the size of the giant cluster, and can even destroy the transition. Moreover, we highlight the occurrence of disorder-independent fixed points above the quorum critical value. The mean-field approach enables us to interpret these effects in terms of activation probability. A finite-size analysis enables us to show that the order parameter is weakly self-averaging, with an exponent independent of the threshold disorder. Last, we show that the effects of threshold disorder and connectivity disorder cannot easily be discriminated from the measured averaged physical quantities.
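
    A small Monte Carlo sketch of quorum percolation with Gaussian in-degree and Gaussian threshold disorder illustrates the kind of simulation described above. Network size, degree and threshold parameters are illustrative, and the update rule is a simplified synchronous cascade rather than the authors' exact protocol.

    ```python
    import numpy as np

    def quorum_percolation(n=2000, k_mean=30, k_std=5, theta_mean=15, theta_std=0.0,
                           f_seed=0.3, seed=0):
        """Monte Carlo sketch of quorum percolation with Gaussian in-degree and
        (optionally disordered) Gaussian activation thresholds."""
        rng = np.random.default_rng(seed)
        # Gaussian in-degree: node i receives k_i links from uniformly chosen sources.
        k_in = np.clip(rng.normal(k_mean, k_std, n).round().astype(int), 1, n - 1)
        inputs = [rng.choice(n, size=k, replace=False) for k in k_in]
        # Node thresholds: disorder strength controlled by theta_std.
        theta = np.clip(rng.normal(theta_mean, theta_std, n), 1, None)
        active = rng.random(n) < f_seed          # initially active seed fraction
        changed = True
        while changed:
            new_active = active | np.array(
                [active[inputs[i]].sum() >= theta[i] for i in range(n)])
            changed = not np.array_equal(new_active, active)
            active = new_active
        return active.mean()

    for std in (0.0, 3.0, 6.0):
        print(f"threshold std = {std:.0f}: final active fraction = "
              f"{quorum_percolation(theta_std=std):.2f}")
    ```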

  10. Predicting the threshold of pulse-train electrical stimuli using a stochastic auditory nerve model: the effects of stimulus noise.

    PubMed

    Xu, Yifang; Collins, Leslie M

    2004-04-01

    The incorporation of low levels of noise into an electrical stimulus has been shown to improve auditory thresholds in some human subjects (Zeng et al., 2000). In this paper, thresholds for noise-modulated pulse-train stimuli are predicted utilizing a stochastic neural-behavioral model of ensemble fiber responses to bi-phasic stimuli. The neural refractory effect is described using a Markov model for a noise-free pulse-train stimulus and a closed-form solution for the steady-state neural response is provided. For noise-modulated pulse-train stimuli, a recursive method using the conditional probability is utilized to track the neural responses to each successive pulse. A neural spike count rule has been presented for both threshold and intensity discrimination under the assumption that auditory perception occurs via integration over a relatively long time period (Bruce et al., 1999). An alternative approach originates from the hypothesis of the multilook model (Viemeister and Wakefield, 1991), which argues that auditory perception is based on several shorter time integrations and may suggest an NofM model for prediction of pulse-train threshold. This motivates analyzing the neural response to each individual pulse within a pulse train, which is considered to be the brief look. A logarithmic rule is hypothesized for pulse-train threshold. Predictions from the multilook model are shown to match trends in psychophysical data for noise-free stimuli that are not always matched by the long-time integration rule. Theoretical predictions indicate that threshold decreases as noise variance increases. Theoretical models of the neural response to pulse-train stimuli not only reduce calculational overhead but also facilitate utilization of signal detection theory and are easily extended to multichannel psychophysical tasks.

  11. Global and local threshold in a metapopulational SEIR model with quarantine

    NASA Astrophysics Data System (ADS)

    Gomes, Marcelo F. C.; Rossi, Luca; Pastore Y Piontti, Ana; Vespignani, Alessandro

    2013-03-01

    Diseases which have the possibility of transmission before the onset of symptoms pose a challenging threat to healthcare, since it is hard to track spreaders and implement quarantine measures. More precisely, one of the main concerns regarding the pandemic spread of diseases is the prediction, and eventually the control, of local outbreaks that would trigger a global invasion of a particular disease. We present a metapopulation disease spreading model with transmission from both symptomatic and asymptomatic agents and analyze the role of quarantine measures and mobility processes between subpopulations. We show that, depending on the disease parameters, it is possible to separate the local and global thresholds in the parameter space and to study the system behavior as a function of the fraction of asymptomatic transmissions. This means that it is possible to have a range of parameter values where, although we do not achieve local control of the outbreak, it is possible to control the global spread of the disease. We validate the analytic picture with a data-driven model that integrates commuting, air traffic flows, and detailed information about population size and structure worldwide.

  12. Analytical expression for Risken-Nummedal-Graham-Haken instability threshold in quantum cascade lasers.

    PubMed

    Vukovic, N; Radovanovic, J; Milanovic, V; Boiko, D L

    2016-11-14

    We have obtained a closed-form expression for the threshold of Risken-Nummedal-Graham-Haken (RNGH) multimode instability in a Fabry-Pérot (FP) cavity quantum cascade laser (QCL). This simple analytical expression is a versatile tool that can easily be applied in practical situations which require analysis of QCL dynamic behavior and estimation of its RNGH multimode instability threshold. Our model for a FP cavity laser accounts for the carrier coherence grating and carrier population grating as well as their relaxation due to carrier diffusion. In the model, the RNGH instability threshold is analyzed using a second-order bi-orthogonal perturbation theory and we confirm our analytical solution by a comparison with the numerical simulations. In particular, the model predicts a low RNGH instability threshold in QCLs. This agrees very well with experimental data available in the literature.

  13. Increasing accuracy of dispersal kernels in grid-based population models

    USGS Publications Warehouse

    Slone, D.H.

    2011-01-01

    Dispersal kernels in grid-based population models specify the proportion, distance and direction of movements within the model landscape. Spatial errors in dispersal kernels can have large compounding effects on model accuracy. Circular Gaussian and Laplacian dispersal kernels at a range of spatial resolutions were investigated, and methods for minimizing errors caused by the discretizing process were explored. Kernels of progressively smaller sizes relative to the landscape grid size were calculated using cell-integration and cell-center methods. These kernels were convolved repeatedly, and the final distribution was compared with a reference analytical solution. For large Gaussian kernels (σ > 10 cells), the total kernel error was <10⁻¹¹ compared to analytical results. Using an invasion model that tracked the time a population took to reach a defined goal, the discrete model results were comparable to the analytical reference. With Gaussian kernels that had σ ≤ 0.12 using the cell-integration method, or σ ≤ 0.22 using the cell-center method, the kernel error was greater than 10%, which resulted in invasion times that were orders of magnitude different from theoretical results. A goal-seeking routine was developed to adjust the kernels to minimize overall error. With this, corrections for small kernels were found that decreased overall kernel error to <10⁻¹¹ and invasion time error to <5%.
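
    The cell-center versus cell-integration discretization, and the way kernel error compounds under repeated convolution, can be sketched in one dimension as below. Grid size, σ, and the number of steps are illustrative assumptions, and the published analysis concerns two-dimensional circular kernels.

    ```python
    import numpy as np
    from scipy.stats import norm

    def gaussian_kernel_1d(sigma, radius, method="center"):
        """Discretize a 1-D Gaussian dispersal kernel on unit cells.

        'center'      evaluates the density at each cell centre;
        'integration' integrates the density over each cell (CDF differences)."""
        x = np.arange(-radius, radius + 1)
        if method == "center":
            k = norm.pdf(x, scale=sigma)
        else:
            k = norm.cdf(x + 0.5, scale=sigma) - norm.cdf(x - 0.5, scale=sigma)
        return k / k.sum()

    # Repeated convolution of a point population; the continuous solution would
    # have variance n_steps * sigma^2, so the gap shows compounding kernel error.
    sigma, n_steps = 0.5, 20
    pop = np.zeros(401)
    pop[200] = 1.0
    kernel = gaussian_kernel_1d(sigma, radius=5, method="integration")
    for _ in range(n_steps):
        pop = np.convolve(pop, kernel, mode="same")
    x = np.arange(pop.size) - 200
    empirical_var = np.sum(pop * x**2)
    print(f"empirical variance {empirical_var:.3f} vs analytical {n_steps * sigma**2:.3f}")
    ```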

  14. Identification of a barrier height threshold where brook trout population genetic diversity, differentiation, and relatedness are affected

    Treesearch

    Anne Timm; Eric Hallerman; Andy Dolloff; Mark Hudy; Randall Kolka

    2016-01-01

    The overall goal of the study was to evaluate the effects of landscape features (barriers) on Brook Trout Salvelinus fontinalis population genetics and to identify a potential barrier height threshold where genetic diversity was reduced upstream of the barrier and differentiation and relatedness increased. We screened variation at eight...

  15. Threshold multi-secret sharing scheme based on phase-shifting interferometry

    NASA Astrophysics Data System (ADS)

    Deng, Xiaopeng; Wen, Wei; Shi, Zhengang

    2017-03-01

    A threshold multi-secret sharing scheme is proposed based on phase-shifting interferometry. The K secret images to be shared are first encoded using the Fourier transform. Then, these encoded images are shared into many shadow images based on the recording principle of phase-shifting interferometry. In the recovery stage, the secret images can be restored by combining any 2K + 1 or more shadow images, while any 2K or fewer shadow images reveal no information about the secret images. As a result, a (2K + 1, N) threshold multi-secret sharing scheme can be implemented. Simulation results are presented to demonstrate the feasibility of the proposed method.

  16. Study on the threshold of a stochastic SIR epidemic model and its extensions

    NASA Astrophysics Data System (ADS)

    Zhao, Dianli

    2016-09-01

    This paper provides a simple but effective method for estimating the threshold of a class of stochastic epidemic models by use of the nonnegative semimartingale convergence theorem. First, the threshold R0^SIR is obtained for the stochastic SIR model with a saturated incidence rate; whether its value is below or above 1 completely determines whether the disease goes extinct or prevails, for any intensity of the white noise. In addition, when R0^SIR > 1, the system is proved to be convergent in time mean. Then, the thresholds of the stochastic SIVS models with or without a saturated incidence rate are also established by the same method. Compared with the previously known literature, the related results are improved, and the method is simpler than before.

  17. Threshold and subthreshold Generalized Anxiety Disorder (GAD) and suicide ideation.

    PubMed

    Gilmour, Heather

    2016-11-16

    Subthreshold Generalized Anxiety Disorder (GAD) has been reported to be at least as prevalent as threshold GAD and of comparable clinical significance. It is not clear if GAD is uniquely associated with the risk of suicide, or if psychiatric comorbidity drives the association. Data from the 2012 Canadian Community Health Survey-Mental Health were used to estimate the prevalence of threshold and subthreshold GAD in the household population aged 15 or older. As well, the relationship between GAD and suicide ideation was studied. Multivariate logistic regression was used in a sample of 24,785 people to identify significant associations, while adjusting for the confounding effects of sociodemographic factors and other mental disorders. In 2012, an estimated 722,000 Canadians aged 15 or older (2.6%) met the criteria for threshold GAD; an additional 2.3% (655,000) had subthreshold GAD. For people with threshold GAD, past 12-month suicide ideation was more prevalent among men than women (32.0% versus 21.2% respectively). In multivariate models that controlled sociodemographic factors, the odds of past 12-month suicide ideation among people with either past 12-month threshold or subthreshold GAD were significantly higher than the odds for those without GAD. When psychiatric comorbidity was also controlled, associations between threshold and subthreshold GAD and suicidal ideation were attenuated, but remained significant. Threshold and subthreshold GAD affect similar percentages of the Canadian household population. This study adds to the literature that has identified an independent association between threshold GAD and suicide ideation, and demonstrates that an association is also apparent for subthreshold GAD.

  18. "Now I see it, now I don't": Determining Threshold Levels of Facial Emotion Recognition for Use in Patient Populations.

    PubMed

    Chiu, Isabelle; Gfrörer, Regina I; Piguet, Olivier; Berres, Manfred; Monsch, Andreas U; Sollberger, Marc

    2015-08-01

    The importance of including measures of emotion processing, such as tests of facial emotion recognition (FER), as part of a comprehensive neuropsychological assessment is being increasingly recognized. In clinical settings, FER tests need to be sensitive, short, and easy to administer, given the limited time available and patient limitations. Current tests, however, commonly use stimuli that either display prototypical emotions, bearing the risk of ceiling effects and unequal task difficulty, or are cognitively too demanding and time-consuming. To overcome these limitations in FER testing in patient populations, we aimed to define FER threshold levels for the six basic emotions in healthy individuals. Forty-nine healthy individuals between 52 and 79 years of age were asked to identify the six basic emotions at different intensity levels (25%, 50%, 75%, 100%, and 125% of the prototypical emotion). Analyses uncovered differing threshold levels across emotions and sex of facial stimuli, ranging from 50% up to 100% intensities. Using these findings as "healthy population benchmarks", we propose to apply these threshold levels to clinical populations either as facial emotion recognition or intensity rating tasks. As part of any comprehensive social cognition test battery, this approach should allow for a rapid and sensitive assessment of potential FER deficits.

  19. Re-Visiting Trichuris trichiura Intensity Thresholds Based on Anemia during Pregnancy

    PubMed Central

    Gyorkos, Theresa W.; Gilbert, Nicolas L.; Larocque, Renée; Casapía, Martín; Montresor, Antonio

    2012-01-01

    Background The intensity categories, or thresholds, currently used for Trichuris trichiura (i.e., intensities of 1–999 epg (light), 1,000–9,999 epg (moderate), and ≥10,000 epg (heavy)) were developed in the 1980s, when there were few epidemiological data available on dose-response relationships. This study was undertaken to determine a threshold for T. trichiura-associated anemia in pregnant women and to describe the implications of this threshold in terms of the need for primary prevention and chemotherapeutic interventions. Methodology/Principal Findings In Iquitos, Peru, 935 pregnant women were tested for T. trichiura infection in their second trimester of pregnancy; were given daily iron supplements throughout their pregnancy; and had their blood hemoglobin levels measured in their third trimester of pregnancy. Women in the highest two T. trichiura intensity quintiles (601–1632 epg and ≥1633 epg) had significantly lower mean hemoglobin concentrations than the lowest quintile (0–24 epg). They also had a statistically significantly higher risk of anemia, with adjusted odds ratios of 1.67 (95% CI: 1.02, 2.62) and 1.73 (95% CI: 1.09, 2.74), respectively. Conclusions/Significance This analysis provides support for categorizing a T. trichiura infection ≥1,000 epg as ‘moderate’, as currently defined by the World Health Organization. Because this ‘moderate’ level of T. trichiura infection was found to be a significant risk factor for anemia in pregnant women, the intensity of Trichuris infection deemed to cause or aggravate anemia should no longer be restricted to the ‘heavy’ intensity category. It should now include both ‘heavy’ and ‘moderate’ intensities of Trichuris infection. Evidence-based deworming strategies targeting pregnant women or populations where anemia is of concern should be updated accordingly. PMID:23029572

  20. A de-noising method using the improved wavelet threshold function based on noise variance estimation

    NASA Astrophysics Data System (ADS)

    Liu, Hui; Wang, Weida; Xiang, Changle; Han, Lijin; Nie, Haizhao

    2018-01-01

    Precise and efficient noise variance estimation is very important for the processing of all kinds of signals when using the wavelet transform to analyze signals and extract signal features. In view of the problem that the accuracy of traditional noise variance estimation is greatly affected by the fluctuation of noise values, this study puts forward the strategy of using a two-state Gaussian mixture model to classify the high-frequency wavelet coefficients at the minimum scale, which takes both efficiency and accuracy into account. Based on the noise variance estimation, a novel improved wavelet threshold function is proposed by combining the advantages of the hard and soft threshold functions, and on the basis of the noise variance estimation algorithm and the improved wavelet threshold function, the research puts forth a novel wavelet threshold de-noising method. The method is tested and validated using random signals and bench test data of an electro-mechanical transmission system. The test results indicate that the wavelet threshold de-noising method based on the noise variance estimation shows preferable performance in processing the testing signals of the electro-mechanical transmission system: it can effectively eliminate the interference of transient signals including voltage, current, and oil pressure and maintain the dynamic characteristics of the signals favorably.
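
    A minimal sketch of wavelet threshold de-noising follows. It uses the standard median-absolute-deviation noise estimator and universal threshold in place of the paper's two-state Gaussian-mixture variance estimate, and a generic hard/soft compromise threshold function; the wavelet choice and all parameters are assumptions.

    ```python
    import numpy as np
    import pywt

    def improved_threshold(w, t, alpha=0.5):
        """Compromise between hard (alpha=0) and soft (alpha=1) thresholding:
        coefficients above the threshold are shrunk by alpha*t instead of t."""
        out = np.zeros_like(w)
        keep = np.abs(w) > t
        out[keep] = np.sign(w[keep]) * (np.abs(w[keep]) - alpha * t)
        return out

    def denoise(signal, wavelet="db4", level=4, alpha=0.5):
        coeffs = pywt.wavedec(signal, wavelet, level=level)
        # Noise std estimated from the finest-scale detail coefficients (MAD estimator).
        sigma = np.median(np.abs(coeffs[-1])) / 0.6745
        t = sigma * np.sqrt(2 * np.log(len(signal)))        # universal threshold
        coeffs = [coeffs[0]] + [improved_threshold(c, t, alpha) for c in coeffs[1:]]
        return pywt.waverec(coeffs, wavelet)[:len(signal)]

    # Example: noisy sine wave
    t_axis = np.linspace(0, 1, 1024)
    noisy = np.sin(2 * np.pi * 5 * t_axis) + 0.3 * np.random.default_rng(0).normal(size=1024)
    clean = denoise(noisy)
    print(f"residual RMS after de-noising: {np.std(clean - np.sin(2*np.pi*5*t_axis)):.3f}")
    ```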

  1. Modeling wildlife populations with HexSim

    EPA Science Inventory

    HexSim is a framework for constructing spatially-explicit, individual-based computer models designed for simulating terrestrial wildlife population dynamics and interactions. HexSim is useful for a broad set of modeling applications including population viability analysis for on...

  2. Maintenance of algal endosymbionts in Paramecium bursaria: a simple model based on population dynamics.

    PubMed

    Iwai, Sosuke; Fujiwara, Kenji; Tamura, Takuro

    2016-09-01

    Algal endosymbiosis is widely distributed in eukaryotes including many protists and metazoans, and plays important roles in aquatic ecosystems, combining phagotrophy and phototrophy. To maintain a stable symbiotic relationship, endosymbiont population size in the host must be properly regulated and maintained at a constant level; however, the mechanisms underlying the maintenance of algal endosymbionts are still largely unknown. Here we investigate the population dynamics of the unicellular ciliate Paramecium bursaria and its Chlorella-like algal endosymbiont under various experimental conditions in a simple culture system. Our results suggest that endosymbiont population size in P. bursaria was not regulated by active processes such as cell division coupling between the two organisms, or partitioning of the endosymbionts at host cell division. Regardless, endosymbiont population size was eventually adjusted to a nearly constant level once cells were grown with light and nutrients. To explain this apparent regulation of population size, we propose a simple mechanism based on the different growth properties (specifically the nutrient requirements) of the two organisms, and from this develop a mathematical model to describe the population dynamics of host and endosymbiont. The proposed mechanism and model may provide a basis for understanding the maintenance of algal endosymbionts. © 2015 Society for Applied Microbiology and John Wiley & Sons Ltd.

  3. Critical thresholds in sea lice epidemics: evidence, sensitivity and subcritical estimation

    PubMed Central

    Frazer, L. Neil; Morton, Alexandra; Krkošek, Martin

    2012-01-01

    Host density thresholds are a fundamental component of the population dynamics of pathogens, but empirical evidence and estimates are lacking. We studied host density thresholds in the dynamics of ectoparasitic sea lice (Lepeophtheirus salmonis) on salmon farms. Empirical examples include a 1994 epidemic in Atlantic Canada and a 2001 epidemic in Pacific Canada. A mathematical model suggests dynamics of lice are governed by a stable endemic equilibrium until the critical host density threshold drops owing to environmental change, or is exceeded by stocking, causing epidemics that require rapid harvest or treatment. Sensitivity analysis of the critical threshold suggests variation in dependence on biotic parameters and high sensitivity to temperature and salinity. We provide a method for estimating the critical threshold from parasite abundances at subcritical host densities and estimate the critical threshold and transmission coefficient for the two epidemics. Host density thresholds may be a fundamental component of disease dynamics in coastal seas where salmon farming occurs. PMID:22217721

  4. Visual Basic, Excel-based fish population modeling tool - The pallid sturgeon example

    USGS Publications Warehouse

    Moran, Edward H.; Wildhaber, Mark L.; Green, Nicholas S.; Albers, Janice L.

    2016-02-10

    The model presented in this report is a spreadsheet-based model using Visual Basic for Applications within Microsoft Excel (http://dx.doi.org/10.5066/F7057D0Z) prepared in cooperation with the U.S. Army Corps of Engineers and U.S. Fish and Wildlife Service. It uses the same model structure and, initially, parameters as used by Wildhaber and others (2015) for pallid sturgeon. The difference between the model structure used for this report and that used by Wildhaber and others (2015) is that variance is not partitioned. For the model of this report, all variance is applied at the iteration and time-step levels of the model. Wildhaber and others (2015) partition variance into parameter variance (uncertainty about the value of a parameter itself) applied at the iteration level and temporal variance (uncertainty caused by random environmental fluctuations with time) applied at the time-step level. They included implicit individual variance (uncertainty caused by differences between individuals) within the time-step level. The interface developed for the model of this report is designed to allow the user the flexibility to change population model structure and parameter values and uncertainty separately for every component of the model. This flexibility makes the modeling tool potentially applicable to any fish species; however, the flexibility inherent in this modeling tool makes it possible for the user to obtain spurious outputs. The value and reliability of the model outputs are only as good as the model inputs. Using this modeling tool with improper or inaccurate parameter values, or for species for which the structure of the model is inappropriate, could lead to untenable management decisions. By facilitating fish population modeling, this modeling tool allows the user to evaluate a range of management options and implications. The goal is to provide a user-friendly tool for developing fish population models useful to natural resource

  5. Designing a Community-Based Population Health Model.

    PubMed

    Durovich, Christopher J; Roberts, Peter W

    2018-02-01

    The pace of change from volume-based to value-based payment in health care varies dramatically among markets. Regardless of the ultimate disposition of the Affordable Care Act, employers and public-private payers will continue to increase pressure on health care providers to assume financial risk for populations in the form of shared savings, bundled payments, downside risk, or even capitation. This article outlines a suggested road map and practical considerations for health systems that are building or planning to build population health capabilities to meet the needs of their local markets. The authors review the traditional core capabilities needed to address the medical determinants of health for a population. They also share an innovative approach to community service integration to address the social determinants of health and the engagement of families to improve their own health and well-being. The foundational approach is to connect insurance products, the health care delivery system, and community service agencies around the family's well-being goals using human-centered design strategy.

  6. Modeling the population-level effects of hypoxia on a coastal fish: implications of a spatially-explicit individual-based model

    NASA Astrophysics Data System (ADS)

    Rose, K.; Creekmore, S.; Thomas, P.; Craig, K.; Neilan, R.; Rahman, S.; Wang, L.; Justic, D.

    2016-02-01

    The northwestern Gulf of Mexico (USA) currently experiences a large hypoxic area ("dead zone") during the summer. The population-level effects of hypoxia on coastal fish are largely unknown. We developed a spatially-explicit, individual-based model to analyze how hypoxia effects on reproduction, growth, and mortality of individual Atlantic croaker could lead to population-level responses. The model follows the hourly growth, mortality, reproduction, and movement of individuals on a 300 x 800 spatial grid of 1 km² cells for 140 years. Chlorophyll-a concentration and water temperature were specified daily for each grid cell. Dissolved oxygen (DO) was obtained from a 3-D water quality model for four years that differed in their severity of hypoxia. A bioenergetics model was used to represent growth, mortality was assumed stage- and age-dependent, and movement behavior was based on temperature preferences and avoidance of low DO. Hypoxia effects were imposed using exposure-effects sub-models that converted time-varying exposure to DO into reductions in growth and fecundity, and increases in mortality. Using sequences of mild, intermediate, and severe hypoxia years, the model predicted a 20% decrease in population abundance. Additional simulations were performed under the assumption that river-based nutrient loadings that lead to more hypoxia also lead to higher primary production and more food for croaker. Twenty-five percent and 50% nutrient reduction scenarios were simulated by adjusting the chlorophyll-a concentrations used as a food proxy for the croaker. We then incrementally increased the DO concentrations to determine how much hypoxia would need to be reduced to offset the lower food production resulting from reduced nutrients. We discuss the generality of our results, the hidden effects of hypoxia on fish, and our overall strategy of combining laboratory and field studies with modeling to produce robust predictions of population responses to stressors under

  7. Population-expression models of immune response

    NASA Astrophysics Data System (ADS)

    Stromberg, Sean P.; Antia, Rustom; Nemenman, Ilya

    2013-06-01

    The immune response to a pathogen has two basic features. The first is the expansion of a few pathogen-specific cells to form a population large enough to control the pathogen. The second is the process of differentiation of cells from an initial naive phenotype to an effector phenotype which controls the pathogen, and subsequently to a memory phenotype that is maintained and responsible for long-term protection. The expansion and the differentiation have been considered largely independently. Changes in cell populations are typically described using ecologically based ordinary differential equation models. In contrast, differentiation of single cells is studied within systems biology and is frequently modeled by considering changes in gene and protein expression in individual cells. Recent advances in experimental systems biology make available for the first time data to allow the coupling of population and high dimensional expression data of immune cells during infections. Here we describe and develop population-expression models which integrate these two processes into systems biology on the multicellular level. When translated into mathematical equations, these models result in non-conservative, non-local advection-diffusion equations. We describe situations where the population-expression approach can make correct inference from data while previous modeling approaches based on common simplifying assumptions would fail. We also explore how model reduction techniques can be used to build population-expression models, minimizing the complexity of the model while keeping the essential features of the system. While we consider problems in immunology in this paper, we expect population-expression models to be more broadly applicable.

  8. Threshold-dependent sample sizes for selenium assessment with stream fish tissue

    USGS Publications Warehouse

    Hitt, Nathaniel P.; Smith, David R.

    2015-01-01

    Natural resource managers are developing assessments of selenium (Se) contamination in freshwater ecosystems based on fish tissue concentrations. We evaluated the effects of sample size (i.e., number of fish per site) on the probability of correctly detecting mean whole-body Se values above a range of potential management thresholds. We modeled Se concentrations as gamma distributions with shape and scale parameters fitting an empirical mean-to-variance relationship in data from southwestern West Virginia, USA (63 collections, 382 individuals). We used parametric bootstrapping techniques to calculate statistical power as the probability of detecting true mean concentrations up to 3 mg Se/kg above management thresholds ranging from 4 to 8 mg Se/kg. Sample sizes required to achieve 80% power varied as a function of management thresholds and Type I error tolerance (α). Higher thresholds required more samples than lower thresholds because populations were more heterogeneous at higher mean Se levels. For instance, to assess a management threshold of 4 mg Se/kg, a sample of eight fish could detect an increase of approximately 1 mg Se/kg with 80% power (given α = 0.05), but this sample size would be unable to detect such an increase from a management threshold of 8 mg Se/kg with more than a coin-flip probability. Increasing α decreased sample size requirements to detect above-threshold mean Se concentrations with 80% power. For instance, at an α-level of 0.05, an 8-fish sample could detect an increase of approximately 2 units above a threshold of 8 mg Se/kg with 80% power, but when α was relaxed to 0.2, this sample size was more sensitive to increasing mean Se concentrations, allowing detection of an increase of approximately 1.2 units with equivalent power. Combining individuals into 2- and 4-fish composite samples for laboratory analysis did not decrease power because the reduced number of laboratory samples was compensated for by increased
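
    The parametric power calculation described above can be sketched as follows: samples are drawn from a gamma distribution whose true mean sits a fixed increment above the management threshold, and a one-sided test determines whether the exceedance is detected. The coefficient of variation and other settings are illustrative, not the fitted West Virginia mean-to-variance relationship.

    ```python
    import numpy as np
    from scipy import stats

    def power_above_threshold(n_fish, threshold, delta, cv=0.5,
                              alpha=0.05, n_sim=5000, seed=0):
        """Simulation-based power: probability that a one-sided t-test detects a true
        mean of (threshold + delta) mg Se/kg as being above the management threshold.

        Tissue concentrations are modelled as gamma with coefficient of variation cv."""
        rng = np.random.default_rng(seed)
        mean = threshold + delta
        shape = 1.0 / cv**2
        scale = mean / shape
        rejections = 0
        for _ in range(n_sim):
            sample = rng.gamma(shape, scale, size=n_fish)
            res = stats.ttest_1samp(sample, popmean=threshold, alternative="greater")
            rejections += res.pvalue < alpha
        return rejections / n_sim

    for thr in (4, 8):
        print(f"threshold {thr} mg/kg, n=8, delta=1: power = "
              f"{power_above_threshold(8, thr, 1.0):.2f}")
    ```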

  9. What is the most cost-effective population-based cancer screening program for Chinese women?

    PubMed

    Woo, Pauline P S; Kim, Jane J; Leung, Gabriel M

    2007-02-20

    To develop a policy-relevant generalized cost-effectiveness (CE) model of population-based cancer screening for Chinese women. Disability-adjusted life-years (DALYs) averted and associated screening and treatment costs under population-based screening using cervical cytology (cervical cancer), mammography (breast cancer), and fecal occult blood testing (FOBT), sigmoidoscopy, FOBT plus sigmoidoscopy, or colonoscopy (colorectal cancer) were estimated, from which average and incremental CE ratios were generated. Probabilistic sensitivity analysis was undertaken to assess stochasticity, parameter uncertainty, and model assumptions. Cervical, breast, and colorectal cancers were together responsible for 13,556 DALYs (in a 1:4:3 ratio, respectively) in Hong Kong's 3.4 million female population annually. All status quo strategies were dominated, thus confirming the suboptimal efficiency of opportunistic screening. Current patterns of screening averted 471 DALYs every year, which could potentially be more than doubled to 1,161 DALYs under the same screening and treatment budgetary threshold of US $50 million with 100% Pap coverage every 4 years and 30% coverage of colonoscopy every 10 years. With higher budgetary caps, biennial mammographic screening starting at age 50 years can be introduced. Our findings have informed how best to achieve allocative efficiency in deploying scarce cancer care dollars but must be coupled with better integrated care planning, improved intersectoral coordination, increased resources, and stronger political will to realize the potential health and economic gains as demonstrated.

  10. Estimating economic thresholds for pest control: an alternative procedure.

    PubMed

    Ramirez, O A; Saunders, J L

    1999-04-01

    An alternative methodology to determine profit-maximizing economic thresholds is developed and illustrated. An optimization problem based on the main biological and economic relations involved in determining a profit-maximizing economic threshold is first advanced. From it, a more manageable model of 2 nonsimultaneous reduced-form equations is derived, which represents a simpler but conceptually and statistically sound alternative. The model recognizes that yields and pest control costs are a function of the economic threshold used. Higher (less strict) economic thresholds can result in lower yields and, therefore, a lower gross income from the sale of the product, but could also be less costly to maintain. The highest possible profits will be obtained by using the economic threshold that results in a maximum difference between the gross income and pest control cost functions.

  11. THRIVE: threshold homomorphic encryption based secure and privacy preserving biometric verification system

    NASA Astrophysics Data System (ADS)

    Karabat, Cagatay; Kiraz, Mehmet Sabir; Erdogan, Hakan; Savas, Erkay

    2015-12-01

    In this paper, we introduce a new biometric verification and template protection system which we call THRIVE. The system includes novel enrollment and authentication protocols based on threshold homomorphic encryption where a private key is shared between a user and a verifier. In the THRIVE system, only encrypted binary biometric templates are stored in a database and verification is performed via homomorphically randomized templates; thus, the original templates are never revealed during authentication. Due to the underlying threshold homomorphic encryption scheme, a malicious database owner cannot perform full decryption on encrypted templates of the users in the database. In addition, security of the THRIVE system is enhanced using a two-factor authentication scheme involving the user's private key and biometric data. Using simulation-based techniques, the proposed system is proven secure in the malicious model. The proposed system is suitable for applications where the user does not want to reveal her biometrics to the verifier in plain form, but needs to prove her identity by using biometrics. The system can be used with any biometric modality where a feature extraction method yields a fixed-size binary template and a query template is verified when its Hamming distance to the database template is less than a threshold. The overall connection time for the proposed THRIVE system is estimated to be 336 ms on average for 256-bit biometric templates on a desktop PC with quad-core 3.2 GHz CPUs and a 10 Mbit/s up/down link connection speed. Consequently, the proposed system can be efficiently used in real-life applications.

  12. A flexible cure rate model with dependent censoring and a known cure threshold.

    PubMed

    Bernhardt, Paul W

    2016-11-10

    We propose a flexible cure rate model that accommodates different censoring distributions for the cured and uncured groups and also allows for some individuals to be observed as cured when their survival time exceeds a known threshold. We model the survival times for the uncured group using an accelerated failure time model with errors distributed according to the seminonparametric distribution, potentially truncated at a known threshold. We suggest a straightforward extension of the usual expectation-maximization algorithm approach for obtaining estimates in cure rate models to accommodate the cure threshold and dependent censoring. We additionally suggest a likelihood ratio test for testing for the presence of dependent censoring in the proposed cure rate model. We show through numerical studies that our model has desirable properties and leads to approximately unbiased parameter estimates in a variety of scenarios. To demonstrate how our method performs in practice, we analyze data from a bone marrow transplantation study and a liver transplant study. Copyright © 2016 John Wiley & Sons, Ltd.

  13. Organism and population-level ecological models for ...

    EPA Pesticide Factsheets

    Ecological risk assessment typically focuses on animal populations as endpoints for regulatory ecotoxicology. Scientists at USEPA are developing models for animal populations exposed to a wide range of chemicals from pesticides to emerging contaminants. Modeled taxa include aquatic and terrestrial invertebrates, fish, amphibians, and birds, and the models employ a wide range of methods, from matrix-based projection models to mechanistic bioenergetics models and spatially explicit population models.

  14. A parallel implementation of an off-lattice individual-based model of multicellular populations

    NASA Astrophysics Data System (ADS)

    Harvey, Daniel G.; Fletcher, Alexander G.; Osborne, James M.; Pitt-Francis, Joe

    2015-07-01

    As computational models of multicellular populations include ever more detailed descriptions of biophysical and biochemical processes, the computational cost of simulating such models limits their ability to generate novel scientific hypotheses and testable predictions. While developments in microchip technology continue to increase the power of individual processors, parallel computing offers an immediate increase in available processing power. To make full use of parallel computing technology, it is necessary to develop specialised algorithms. To this end, we present a parallel algorithm for a class of off-lattice individual-based models of multicellular populations. The algorithm divides the spatial domain between computing processes and comprises communication routines that ensure the model is correctly simulated on multiple processors. The parallel algorithm is shown to accurately reproduce the results of a deterministic simulation performed using a pre-existing serial implementation. We test the scaling of computation time, memory use and load balancing as more processes are used to simulate a cell population of fixed size. We find approximate linear scaling of both speed-up and memory consumption on up to 32 processor cores. Dynamic load balancing is shown to provide speed-up for non-regular spatial distributions of cells in the case of a growing population.

  15. The scaling of contact rates with population density for the infectious disease models.

    PubMed

    Hu, Hao; Nigmatulina, Karima; Eckhoff, Philip

    2013-08-01

    Contact rates and patterns among individuals in a geographic area drive transmission of directly-transmitted pathogens, making it essential to understand and estimate contacts for simulation of disease dynamics. Under the uniform mixing assumption, one of two mechanisms is typically used to describe the relation between contact rate and population density: density-dependent or frequency-dependent. Based on existing evidence of population thresholds and human mobility patterns, we formulated a spatial contact model to describe the appropriate form of transmission, with initial growth at low density and saturation at higher density. We show that the two mechanisms are extreme cases that do not capture real population movement across all scales. Empirical data on human and wildlife diseases indicate that a nonlinear function may work better when looking at the full spectrum of densities. This estimation can be applied to large areas with population mixing in general activities. For crowds with unusually large densities (e.g., transportation terminals, stadiums, or mass gatherings), the lack of an organized social contact structure shifts the physical contacts towards a special case of the spatial contact model: the dynamics of kinetic gas molecule collisions. In this case, an ideal gas model with a van der Waals correction fits existing movement observation data well, and the contact rate between individuals is estimated using kinetic theory. A complete picture of contact rate scaling with population density may help clarify the definition of transmission rates in heterogeneous, large-scale spatial systems. Copyright © 2013 The Authors. Published by Elsevier Inc. All rights reserved.
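
    One common way to encode "initial growth at low density, saturation at higher density" is a Michaelis–Menten-type contact function, sketched below; the functional form and constants are illustrative placeholders rather than the fitted model of the paper.

    ```python
    import numpy as np

    def contact_rate(density, c_max=20.0, half_saturation=50.0):
        """Nonlinear contact-rate scaling: approximately linear in density when the
        population is sparse (density-dependent limit) and saturating at c_max when
        it is crowded (frequency-dependent limit)."""
        return c_max * density / (half_saturation + density)

    for d in (1, 10, 100, 1000, 10000):
        print(f"density {d:>6} per km^2 -> contacts/day ~ {contact_rate(d):.1f}")
    ```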

  16. Predictive thresholds for plague in Kazakhstan.

    PubMed

    Davis, Stephen; Begon, Mike; De Bruyn, Luc; Ageyev, Vladimir S; Klassovskiy, Nikolay L; Pole, Sergey B; Viljugrein, Hildegunn; Stenseth, Nils Chr; Leirs, Herwig

    2004-04-30

    In Kazakhstan and elsewhere in central Asia, the bacterium Yersinia pestis circulates in natural populations of gerbils, which are the source of human cases of bubonic plague. Our analysis of field data collected between 1955 and 1996 shows that plague invades, fades out, and reinvades in response to fluctuations in the abundance of its main reservoir host, the great gerbil (Rhombomys opimus). This is a rare empirical example of the two types of abundance thresholds for infectious disease (invasion and persistence) operating in a single wildlife population. We parameterized predictive models that should reduce the costs of plague surveillance in central Asia and thereby encourage its continuance.

  17. Probabilistic estimation of residential air exchange rates for population-based human exposure modeling

    EPA Science Inventory

    Residential air exchange rates (AERs) are a key determinant in the infiltration of ambient air pollution indoors. Population-based human exposure models using probabilistic approaches to estimate personal exposure to air pollutants have relied on input distributions from AER meas...

  18. The interplay between cooperativity and diversity in model threshold ensembles

    PubMed Central

    Cervera, Javier; Manzanares, José A.; Mafe, Salvador

    2014-01-01

    The interplay between cooperativity and diversity is crucial for biological ensembles because single molecule experiments show a significant degree of heterogeneity and also for artificial nanostructures because of the high individual variability characteristic of nanoscale units. We study the cross-effects between cooperativity and diversity in model threshold ensembles composed of individually different units that show a cooperative behaviour. The units are modelled as statistical distributions of parameters (the individual threshold potentials here) characterized by central and width distribution values. The simulations show that the interplay between cooperativity and diversity results in ensemble-averaged responses of interest for the understanding of electrical transduction in cell membranes, the experimental characterization of heterogeneous groups of biomolecules and the development of biologically inspired engineering designs with individually different building blocks. PMID:25142516

  19. Luminance-model-based DCT quantization for color image compression

    NASA Technical Reports Server (NTRS)

    Ahumada, Albert J., Jr.; Peterson, Heidi A.

    1992-01-01

    A model is developed to approximate visibility thresholds for discrete cosine transform (DCT) coefficient quantization error based on the peak-to-peak luminance of the error image. Experimentally measured visibility thresholds for R, G, and B DCT basis functions can be predicted by a simple luminance-based detection model. This model allows DCT coefficient quantization matrices to be designed for display conditions other than those of the experimental measurements: other display luminances, other veiling luminances, and other spatial frequencies (different pixel spacings, viewing distances, and aspect ratios).
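
    The logic of deriving a quantization matrix from a luminance-based visibility model can be sketched as follows. The parabolic-in-log-frequency threshold curve and all constants are simplified placeholders rather than the published model parameters, and display calibration details are omitted.

    ```python
    import numpy as np

    def dct_frequencies(n=8, pixels_per_degree=32.0):
        """Spatial frequency (cycles/degree) of each 2-D DCT basis function."""
        u = np.arange(n)
        fx = u * pixels_per_degree / (2 * n)          # horizontal frequency of index u
        return np.sqrt(fx[:, None] ** 2 + fx[None, :] ** 2)

    def luminance_threshold(freq, t_min=0.01, f_peak=4.0, k=1.5):
        """Parabolic (in log frequency) threshold model: contrast threshold is lowest
        near f_peak and rises at low and high frequencies. Constants are illustrative."""
        f = np.maximum(freq, 1e-3)
        return t_min * 10 ** (k * (np.log10(f) - np.log10(f_peak)) ** 2)

    def quantization_matrix(l_max=100.0, n=8, pixels_per_degree=32.0):
        """Quantization step sized so the worst-case error of each coefficient stays
        near twice the luminance threshold for that basis function (simplified)."""
        f = dct_frequencies(n, pixels_per_degree)
        f[0, 0] = f[0, 1]                 # treat the DC term like the lowest AC frequency
        t = luminance_threshold(f)
        return np.round(2 * t * l_max).clip(1, 255).astype(int)

    print(quantization_matrix())
    ```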

  20. Optical Associative Memory Model With Threshold Modification Using Complementary Vector

    NASA Astrophysics Data System (ADS)

    Bian, Shaoping; Xu, Kebin; Hong, Jing

    1989-02-01

    A new criterion to evaluate the similarity between two vectors in associative memory is presented. Based on this criterion, an experimental study of an optical associative memory model with threshold modification using a complementary vector is carried out. This model is capable of eliminating the possibility of erroneous recall, thereby improving the accuracy of readout.

  1. Auditory-nerve single-neuron thresholds to electrical stimulation from scala tympani electrodes.

    PubMed

    Parkins, C W; Colombo, J

    1987-12-31

    Single auditory-nerve neuron thresholds were studied in sensory-deafened squirrel monkeys to determine the effects of electrical stimulus shape and frequency on single-neuron thresholds. Frequency was separated into its components, pulse width and pulse rate, which were analyzed separately. Square and sinusoidal pulse shapes were compared. There were no, or only questionably significant, threshold differences in charge per phase between sinusoidal and square pulses of the same pulse width. There was a small (less than 0.5 dB) but significant threshold advantage for 200 microseconds/phase pulses delivered at low pulse rates (156 pps) compared to higher pulse rates (625 pps and 2500 pps). Pulse width was demonstrated to be the prime determinant of single-neuron threshold, resulting in strength-duration curves similar to those of other mammalian myelinated neurons, but with longer chronaxies. The most efficient electrical stimulus pulse width to use for cochlear implant stimulation was determined to be 100 microseconds/phase. This pulse width delivers the lowest charge/phase at threshold. The single-neuron strength-duration curves were compared to strength-duration curves of a computer model based on the specific anatomy of auditory-nerve neurons. The membrane capacitance and resulting chronaxie of the model can be varied by altering the length of the unmyelinated termination of the neuron, representing the unmyelinated portion of the neuron between the habenula perforata and the hair cell. This unmyelinated segment of the auditory-nerve neuron may be subject to aminoglycoside damage. Simulating a 10 micron unmyelinated termination for this model neuron produces a strength-duration curve that closely fits the single-neuron data obtained from aminoglycoside-deafened animals. Both the model and the single-neuron strength-duration curves differ significantly from behavioral threshold data obtained from monkeys and humans with cochlear implants. This discrepancy can best be explained by
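
    For context, the classical Weiss/Lapicque strength-duration relation against which such single-neuron data are usually compared can be written in a few lines; the rheobase and chronaxie values below are placeholders, not the squirrel-monkey fits, and under this idealized law charge per phase decreases monotonically with pulse width rather than showing the empirical optimum near 100 microseconds.

    ```python
    def threshold_current_ua(pulse_width_us, rheobase_ua=50.0, chronaxie_us=300.0):
        """Weiss/Lapicque strength-duration relation: I_th = I_rheobase * (1 + chronaxie/PW)."""
        return rheobase_ua * (1.0 + chronaxie_us / pulse_width_us)

    for pw in (50, 100, 200, 400, 800):
        i_th = threshold_current_ua(pw)
        charge_nc = i_th * pw * 1e-3          # uA * us = pC; * 1e-3 -> nC per phase
        print(f"PW {pw:>4} us: threshold {i_th:7.1f} uA, charge/phase {charge_nc:6.1f} nC")
    ```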

  2. A Vulnerability-Based, Bottom-up Assessment of Future Riverine Flood Risk Using a Modified Peaks-Over-Threshold Approach and a Physically Based Hydrologic Model

    NASA Astrophysics Data System (ADS)

    Knighton, James; Steinschneider, Scott; Walter, M. Todd

    2017-12-01

    There is a chronic disconnection among purely probabilistic flood frequency analysis of flood hazards, flood risks, and hydrological flood mechanisms, which hampers our ability to assess future flood impacts. We present a vulnerability-based approach to estimating riverine flood risk that accommodates a more direct linkage between decision-relevant metrics of risk and the dominant mechanisms that cause riverine flooding. We adapt the conventional peaks-over-threshold (POT) framework to be used with extreme precipitation from different climate processes and rainfall-runoff-based model output. We quantify the probability that at least one adverse hydrologic threshold, potentially defined by stakeholders, will be exceeded within the next N years. This approach allows us to consider flood risk as the summation of risk from separate atmospheric mechanisms, and supports a more direct mapping between hazards and societal outcomes. We perform this analysis within a bottom-up framework to consider the relevance and consequences of information, with varying levels of credibility, on changes to atmospheric patterns driving extreme precipitation events. We demonstrate our proposed approach using a case study for Fall Creek in Ithaca, NY, USA, where we estimate the risk of stakeholder-defined flood metrics from three dominant mechanisms: summer convection, tropical cyclones, and spring rain and snowmelt. Using downscaled climate projections, we determine how flood risk associated with a subset of mechanisms may change in the future, and the resultant shift in annual flood risk. The flood risk approach we propose can provide powerful new insights into future flood threats.
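
    The core risk quantity, the probability of at least one adverse threshold exceedance within the next N years, follows directly from per-mechanism exceedance rates if each mechanism is treated as an independent Poisson process. The sketch below uses made-up peak flows and an arbitrary threshold, not Fall Creek data.

    ```python
    import numpy as np

    def annual_exceedance_rate(peak_flows, threshold, years_of_record):
        """Annual rate of peaks-over-threshold events for one flood-generating mechanism."""
        return np.sum(np.asarray(peak_flows) > threshold) / years_of_record

    def prob_at_least_one(rates, n_years):
        """P(at least one exceedance of the adverse threshold within n_years), treating
        each mechanism as an independent Poisson process and summing their rates."""
        return 1.0 - np.exp(-sum(rates) * n_years)

    # Illustrative peak flows (m^3/s) over a 25-year record, split by mechanism.
    rng = np.random.default_rng(2)
    peaks = {"convective": rng.gamma(2.0, 20.0, 60),
             "tropical":   rng.gamma(2.0, 35.0, 10),
             "snowmelt":   rng.gamma(2.0, 25.0, 25)}
    threshold = 120.0                                   # stakeholder-defined adverse flow
    rates = [annual_exceedance_rate(p, threshold, 25) for p in peaks.values()]
    print(f"P(at least one exceedance in 30 years) = {prob_at_least_one(rates, 30):.2f}")
    ```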

  3. Comparison of individual-based modeling and population approaches for prediction of foodborne pathogens growth.

    PubMed

    Augustin, Jean-Christophe; Ferrier, Rachel; Hezard, Bernard; Lintz, Adrienne; Stahl, Valérie

    2015-02-01

    The individual-based modeling (IBM) approach, combined with microenvironment modeling of vacuum-packed cold-smoked salmon, was more effective in describing the variability of the growth of a few Listeria monocytogenes cells contaminating irradiated salmon slices than the traditional population models. The IBM approach was particularly relevant for predicting the absence of growth in 25% (5 among 20) of artificially contaminated cold-smoked salmon samples stored at 8 °C. These results confirmed similar observations obtained with smear soft cheese (Ferrier et al., 2013). These two different food models were used to compare the IBM/microscale and population/macroscale modeling approaches in more global exposure and risk assessment frameworks taking into account the variability and/or the uncertainty of the factors influencing the growth of L. monocytogenes. We observed that the traditional population models significantly overestimate exposure and risk estimates in comparison to the IBM approach when contamination of foods occurs with a low number of cells (<100 per serving). Moreover, the exposure estimates obtained with the population model were characterized by a great uncertainty. The overestimation was mainly linked to the ability of the IBM approach to predict no-growth situations rather than to the consideration of the microscale environment. On the other hand, when the aim of quantitative risk assessment studies is only to assess the relative impact of changes in control measures affecting the growth of foodborne bacteria, the two modeling approaches gave similar results and the simpler population approach was suitable. Copyright © 2014 Elsevier Ltd. All rights reserved.

  4. Population-level analysis and validation of an individual-based cutthroat trout model

    Treesearch

    Steven F. Railsback; Bret C. Harvey; Roland H. Lamberson; Derek E. Lee; Claasen Nathan J.; Shuzo Yoshihara

    2002-01-01

    An individual-based model of stream trout is analyzed by testing its ability to reproduce patterns of population-level behavior observed in real trout: (1) "self-thinning," a negative power relation between weight and abundance; (2) a "critical period" of density-dependent mortality in young-of-the-year; (3) high and age-speci...

  5. Hypothesis testing in functional linear regression models with Neyman's truncation and wavelet thresholding for longitudinal data.

    PubMed

    Yang, Xiaowei; Nie, Kun

    2008-03-15

    Longitudinal data sets in biomedical research often consist of large numbers of repeated measures. In many cases, the trajectories do not look globally linear or polynomial, making it difficult to summarize the data or test hypotheses using standard longitudinal data analysis based on various linear models. An alternative approach is to apply the approaches of functional data analysis, which directly target the continuous nonlinear curves underlying discretely sampled repeated measures. For the purposes of data exploration, many functional data analysis strategies have been developed based on various schemes of smoothing, but fewer options are available for making causal inferences regarding predictor-outcome relationships, a common task seen in hypothesis-driven medical studies. To compare groups of curves, two testing strategies with good power have been proposed for high-dimensional analysis of variance: the Fourier-based adaptive Neyman test and the wavelet-based thresholding test. Using a smoking cessation clinical trial data set, this paper demonstrates how to extend the strategies for hypothesis testing into the framework of functional linear regression models (FLRMs) with continuous functional responses and categorical or continuous scalar predictors. The analysis procedure consists of three steps: first, apply the Fourier or wavelet transform to the original repeated measures; then fit a multivariate linear model in the transformed domain; and finally, test the regression coefficients using either adaptive Neyman or thresholding statistics. Since a FLRM can be viewed as a natural extension of the traditional multiple linear regression model, the development of this model and computational tools should enhance the capacity of medical statistics for longitudinal data.

  6. Macroscopic neural mass model constructed from a current-based network model of spiking neurons.

    PubMed

    Umehara, Hiroaki; Okada, Masato; Teramae, Jun-Nosuke; Naruse, Yasushi

    2017-02-01

    Neural mass models (NMMs) are efficient frameworks for describing macroscopic cortical dynamics including electroencephalogram and magnetoencephalogram signals. Originally, these models were formulated on an empirical basis of synaptic dynamics with relatively long time constants. By clarifying the relations between NMMs and the dynamics of microscopic structures such as neurons and synapses, we can better understand cortical and neural mechanisms from a multi-scale perspective. In a previous study, the NMMs were analytically derived by averaging the equations of synaptic dynamics over the neurons in the population and further averaging the equations of the membrane-potential dynamics. However, the averaging of synaptic current assumes that the neuron membrane potentials are nearly time invariant and that they remain at sub-threshold levels to retain the conductance-based model. This approximation limits the NMM to the non-firing state. In the present study, we newly propose a derivation of a NMM by alternatively approximating the synaptic current which is assumed to be independent of the membrane potential, thus adopting a current-based model. Our proposed model releases the constraint of the nearly constant membrane potential. We confirm that the obtained model is reducible to the previous model in the non-firing situation and that it reproduces the temporal mean values and relative power spectrum densities of the average membrane potentials for the spiking neurons. It is further ensured that the existing NMM properly models the averaged dynamics over individual neurons even if they are spiking in the populations.

  7. Modeling jointly low, moderate, and heavy rainfall intensities without a threshold selection

    NASA Astrophysics Data System (ADS)

    Naveau, Philippe; Huser, Raphael; Ribereau, Pierre; Hannart, Alexis

    2016-04-01

    In statistics, extreme events are often defined as excesses above a given large threshold. This definition allows hydrologists and flood planners to apply Extreme-Value Theory (EVT) to their time series of interest. Even in the stationary univariate context, this approach has at least two main drawbacks. First, working with excesses implies that a lot of observations (those below the chosen threshold) are completely disregarded. The range of precipitation is artificially chopped into two pieces, namely large intensities and the rest, which necessarily imposes different statistical models for each piece. Second, this strategy raises a nontrivial and very practical difficulty: how to choose the optimal threshold which correctly discriminates between low and heavy rainfall intensities. To address these issues, we propose a statistical model in which EVT results apply not only to heavy, but also to low precipitation amounts (zeros excluded). Our model is in compliance with EVT on both ends of the spectrum and allows a smooth transition between the two tails, while keeping a low number of parameters. In terms of inference, we have implemented and tested two classical methods of estimation: likelihood maximization and probability weighted moments. Last but not least, there is no need to choose a threshold to define low and high excesses. The performance and flexibility of this approach are illustrated on simulated data and on hourly precipitation recorded in Lyon, France.
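
    One simple way to write down a distribution that covers the full range of positive intensities while keeping a generalized-Pareto upper tail is sketched below; this is shown only to make the idea concrete, and the authors' exact parametric family may differ.

```latex
% H_xi is the standard generalized Pareto cdf; kappa > 0 controls the behaviour
% of the lower tail (small rainfall), while xi keeps an EVT-compatible upper tail.
F(x) = \Bigl[ H_{\xi}\!\bigl(x/\sigma\bigr) \Bigr]^{\kappa},
\qquad
H_{\xi}(z) = 1 - \bigl(1 + \xi z\bigr)_{+}^{-1/\xi},
\qquad x > 0
```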

  8. Flood Extent Delineation by Thresholding Sentinel-1 SAR Imagery Based on Ancillary Land Cover Information

    NASA Astrophysics Data System (ADS)

    Liang, J.; Liu, D.

    2017-12-01

    Emergency responses to floods require timely information on water extents that can be produced by satellite-based remote sensing. As SAR images can be acquired in adverse illumination and weather conditions, they are particularly suitable for delineating water extent during a flood event. Thresholding SAR imagery is one of the most widely used approaches to delineate water extent. However, most studies apply only one threshold to separate water and dry land without considering the complexity and variability of different dry land surface types in an image. This paper proposes a new thresholding method for SAR imagery to delineate water from different land cover types. A probability distribution of SAR backscatter intensity is fitted for each land cover type, including water, before a flood event, and the intersection between two distributions is regarded as the threshold to classify the two. To extract water, a set of thresholds is applied to several pairs of land cover types, such as water and urban or water and forest. The subsets are merged to form the water distribution for the SAR image during or after the flooding. Experiments show that this land cover based thresholding approach outperformed the traditional single thresholding by about 5% to 15%. This method has great application potential given the broad acceptance of thresholding-based methods and the availability of land cover data, especially for heterogeneous regions.
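
    A minimal sketch of the per-class thresholding idea, assuming Gaussian fits to backscatter intensity (in dB) for water and one dry land cover class; the class names, parameter values and synthetic data are made up for illustration.

```python
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(1)
water = rng.normal(-18.0, 1.5, 5000)    # pre-flood backscatter samples (dB) over water
forest = rng.normal(-9.0, 2.0, 5000)    # pre-flood backscatter samples (dB) over forest

# Fit one probability distribution per land cover type.
mu_w, sd_w = stats.norm.fit(water)
mu_f, sd_f = stats.norm.fit(forest)

# The class-pair threshold is where the two fitted densities intersect.
f = lambda x: stats.norm.pdf(x, mu_w, sd_w) - stats.norm.pdf(x, mu_f, sd_f)
threshold = optimize.brentq(f, mu_w, mu_f)

# During/after the flood, pixels mapped as forest before the event are labelled
# water below the forest-specific threshold; repeat per class and merge the subsets.
flood_pixels_over_forest = rng.normal(-14.0, 3.0, 1000)
is_water = flood_pixels_over_forest < threshold
print(round(threshold, 2), is_water.mean())
```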

  9. Economic evaluation of a group-based exercise program for falls prevention among the older community-dwelling population.

    PubMed

    McLean, Kendra; Day, Lesley; Dalton, Andrew

    2015-03-26

    Falls among older people are of growing concern globally. Implementing cost-effective strategies for their prevention is of utmost importance given the ageing population and associated potential for increased costs of fall-related injury over the next decades. The purpose of this study was to undertake a cost-utility analysis and secondary cost-effectiveness analysis from a healthcare system perspective, of a group-based exercise program compared to routine care for falls prevention in an older community-dwelling population. A decision analysis using a decision tree model was based on the results of a previously published randomised controlled trial with a community-dwelling population aged over 70. Measures of falls, fall-related injuries and resource use were directly obtained from trial data and supplemented by literature-based utility measures. A sub-group analysis was performed of women only. Cost estimates are reported in 2010 British Pound Sterling (GBP). The ICER of GBP£51,483 per QALY for the base case analysis was well above the accepted cost-effectiveness threshold of GBP£20,000 to £30,000 per QALY, but in a sensitivity analysis with minimised program implementation the incremental cost reached GBP£25,678 per QALY. The ICER value at 95% confidence in the base case analysis was GBP£99,664 per QALY and GBP£50,549 per QALY in the lower cost analysis. Males had a 44% lower injury rate if they fell, compared to females resulting in a more favourable ICER for the women only analysis. For women only the ICER was GBP£22,986 per QALY in the base case and was below the cost-effectiveness threshold for all other variations of program implementation. The ICER value at 95% confidence was GBP£48,212 in the women only base case analysis and GBP£23,645 in the lower cost analysis. The base case incremental cost per fall averted was GBP£652 (GBP£616 for women only). A threshold analysis indicates that this exercise program cannot realistically break even. The
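
    For readers unfamiliar with the metric, the incremental cost-effectiveness ratio reported above is the ratio of incremental cost to incremental health effect; the symbols below are generic and are not taken from the trial data.

```latex
\mathrm{ICER}
= \frac{C_{\mathrm{exercise}} - C_{\mathrm{routine\ care}}}
       {E_{\mathrm{exercise}} - E_{\mathrm{routine\ care}}}
\quad \text{(cost per QALY gained, judged against the GBP 20{,}000--30{,}000 per QALY threshold)}
```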

  10. An individual-based model for population viability analysis of humpback chub in Grand Canyon

    USGS Publications Warehouse

    Pine, William Pine; Healy, Brian; Smith, Emily Omana; Trammell, Melissa; Speas, Dave; Valdez, Rich; Yard, Mike; Walters, Carl; Ahrens, Rob; Vanhaverbeke, Randy; Stone, Dennis; Wilson, Wade

    2013-01-01

    We developed an individual-based population viability analysis model (females only) for evaluating risk to populations from catastrophic events or conservation and research actions. This model tracks attributes (size, weight, viability, etc.) for individual fish through time and then compiles this information to assess the extinction risk of the population across large numbers of simulation trials. Using a case history for the Little Colorado River population of Humpback Chub Gila cypha in Grand Canyon, Arizona, we assessed extinction risk and resiliency to a catastrophic event for this population and then assessed a series of conservation actions related to removing specific numbers of Humpback Chub at different sizes for conservation purposes, such as translocating individuals to establish other spawning populations or hatchery refuge development. Our results suggested that the Little Colorado River population is generally resilient to a single catastrophic event and also to removals of larvae and juveniles for conservation purposes, including translocations to establish new populations. Our results also suggested that translocation success is dependent on similar survival rates in receiving and donor streams and low emigration rates from recipient streams. In addition, translocating either large numbers of larvae or small numbers of large juveniles has generally an equal likelihood of successful population establishment at similar extinction risk levels to the Little Colorado River donor population. Our model created a transparent platform to consider extinction risk to populations from catastrophe or conservation actions and should prove useful to managers assessing these risks for endangered species such as Humpback Chub.
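
    A minimal sketch of an individual-based viability simulation of the kind described: track females individually, apply annual survival and recruitment, and estimate extinction risk as the fraction of simulation trials that reach zero. The rates, maturity age and horizon are placeholders, not values from the Humpback Chub model.

```python
import numpy as np

rng = np.random.default_rng(42)

def one_trial(n0=500, years=50, survival=0.85, recruits_per_female=0.4, age_mature=3):
    """Female-only individual-based projection; returns True if the population
    hits zero within the simulation horizon."""
    ages = np.zeros(n0, dtype=int)                 # one entry per living female
    for _ in range(years):
        alive = rng.random(ages.size) < survival   # individual survival draws
        ages = ages[alive] + 1
        n_recruits = rng.poisson(recruits_per_female * (ages >= age_mature).sum())
        ages = np.concatenate([ages, np.zeros(n_recruits, dtype=int)])
        if ages.size == 0:
            return True
    return False

trials = 1000
extinction_risk = np.mean([one_trial() for _ in range(trials)])
print(f"estimated extinction risk over 50 years: {extinction_risk:.3f}")
```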

  11. Discrimination thresholds of normal and anomalous trichromats: Model of senescent changes in ocular media density on the Cambridge Colour Test

    PubMed Central

    Shinomori, Keizo; Panorgias, Athanasios; Werner, John S.

    2017-01-01

    Age-related changes in chromatic discrimination along dichromatic confusion lines were measured with the Cambridge Colour Test (CCT). One hundred and sixty-two individuals (16 to 88 years old) with normal Rayleigh matches were the major focus of this paper. An additional 32 anomalous trichromats classified by their Rayleigh matches were also tested. All subjects were screened to rule out abnormalities of the anterior and posterior segments. Thresholds on all three chromatic vectors measured with the CCT showed age-related increases. Protan and deutan vector thresholds increased linearly with age while the tritan vector threshold was described with a bilinear model. Analysis and modeling demonstrated that the nominal vectors of the CCT are shifted by senescent changes in ocular media density, and a method for correcting the CCT vectors is demonstrated. A correction for these shifts indicates that classification among individuals of different ages is unaffected. New vector thresholds for elderly observers and for all age groups are suggested based on calculated tolerance limits. PMID:26974943

  12. Length-Based Assessment of Coral Reef Fish Populations in the Main and Northwestern Hawaiian Islands

    PubMed Central

    Nadon, Marc O.; Ault, Jerald S.; Williams, Ivor D.; Smith, Steven G.; DiNardo, Gerard T.

    2015-01-01

    The coral reef fish community of Hawaii is composed of hundreds of species, supports a multimillion dollar fishing and tourism industry, and is of great cultural importance to the local population. However, a major stock assessment of Hawaiian coral reef fish populations has not yet been conducted. Here we used the robust indicator variable “average length in the exploited phase of the population (L¯)”, estimated from size composition data from commercial fisheries trip reports and fishery-independent diver surveys, to evaluate exploitation rates for 19 Hawaiian reef fishes. By and large, the average lengths obtained from diver surveys agreed well with those from commercial data. We used the estimated exploitation rates coupled with life history parameters synthesized from the literature to parameterize a numerical population model and generate stock sustainability metrics such as spawning potential ratios (SPR). We found good agreement between predicted average lengths in an unfished population (from our population model) and those observed from diver surveys in the largely unexploited Northwestern Hawaiian Islands. Of 19 exploited reef fish species assessed in the main Hawaiian Islands, 9 had SPRs close to or below the 30% overfishing threshold. In general, longer-lived species such as surgeonfishes, the redlip parrotfish (Scarus rubroviolaceus), and the gray snapper (Aprion virescens) had the lowest SPRs, while short-lived species such as goatfishes and jacks, as well as two invasive species (Lutjanus kasmira and Cephalopholis argus), had SPRs above the 30% threshold. PMID:26267473
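
    The indicator L-bar is typically turned into a mortality (and hence exploitation) estimate with a Beverton-Holt-type length-based estimator; the standard form is shown below to make the logic concrete, and is not necessarily the exact estimator used by the authors.

```latex
\hat{Z} = \frac{K\,\bigl(L_{\infty} - \bar{L}\bigr)}{\bar{L} - L_{c}},
\qquad
\hat{F} = \hat{Z} - M
```

    where K and L∞ are von Bertalanffy growth parameters, L_c is the minimum length of full exploitation, and M is natural mortality; the resulting fishing mortality then feeds the population model used to compute the spawning potential ratio.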

  13. Beneficial laggards: multilevel selection, cooperative polymorphism and division of labour in threshold public good games

    PubMed Central

    2010-01-01

    Background The origin and stability of cooperation is a hot topic in the social and behavioural sciences. A complicated conundrum exists: defectors have an advantage over cooperators whenever cooperation is costly, so not cooperating pays off. In addition, the discovery that humans and some animal populations, such as lions, are polymorphic, with cooperators and defectors living together stably while defectors go unpunished, is even more puzzling. Here we offer a novel explanation based on a Threshold Public Good Game (PGG) that includes the interaction of individual and group level selection, where individuals can contribute to multiple collective actions, in our model group hunting and group defense. Results Our results show that there are polymorphic equilibria in Threshold PGGs, and that multi-level selection does not select for the most cooperators per group but selects those close to the optimum number of cooperators (in terms of the Threshold PGG). In particular, for medium cost values division of labour evolves within the group with regard to the two types of cooperative actions (hunting vs. defense). Moreover, we show evidence that spatial population structure promotes cooperation in multiple PGGs. We also demonstrate that these results apply for a wide range of non-linear benefit function types. Conclusions We demonstrate that cooperation can be stable in Threshold PGGs, even when the proportion of so-called free riders is high in the population. A fundamentally new mechanism is proposed by which laggards, individuals with a high tendency to defect during one specific group action, can actually contribute to the fitness of the group by playing a part in an optimal resource allocation in Threshold Public Good Games. In general, our results show that acknowledging a multilevel selection process opens up novel explanations for collective actions. PMID:21044340

  14. A low-threshold high-index-contrast grating (HCG)-based organic VCSEL

    NASA Astrophysics Data System (ADS)

    Shayesteh, Mohammad Reza; Darvish, Ghafar; Ahmadi, Vahid

    2015-12-01

    We propose a low-threshold high-index-contrast grating (HCG)-based organic vertical-cavity surface-emitting laser (OVCSEL). The device can be driven by both electrical and optical excitation. The microcavity of the laser is a hybrid photonic crystal (HPC) in which the top distributed Bragg reflector (DBR) is replaced by a sub-wavelength high-contrast-grating layer, and it provides a high quality factor. The simulated quality factor of the microcavity is shown to be as high as 282,000. We also investigate the threshold behavior and the dynamics of the OVCSEL optically pumped with sub-picosecond pulses. Results from numerical simulation show that the lasing threshold is 75 nJ/cm2.

  15. Rainfall thresholds for the initiation of debris flows at La Honda, California

    USGS Publications Warehouse

    Wilson, R.C.; Wieczorek, G.F.

    1995-01-01

    A simple numerical model, based on the physical analogy of a leaky barrel, can simulate significant features of the interaction between rainfall and shallow-hillslope pore pressures. The leaky-barrel-model threshold is consistent with, but slightly higher than, an earlier, purely empirical, threshold. The number of debris flows triggered by a storm can be related to the time and amount by which the leaky-barrel-model response exceeded the threshold during the storm. -from Authors
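
    A minimal sketch of the leaky-barrel analogy, assuming a linear leak: hourly rainfall fills the barrel, a leak proportional to storage drains it, and debris-flow initiation is flagged when the simulated level exceeds a threshold. The parameter values and synthetic storm are invented for illustration.

```python
import numpy as np

def leaky_barrel(rain_mm_per_hr, leak_rate=0.05, dt=1.0):
    """Integrate dS/dt = R(t) - leak_rate * S with a simple explicit Euler step."""
    storage = np.zeros(len(rain_mm_per_hr) + 1)
    for i, r in enumerate(rain_mm_per_hr):
        storage[i + 1] = storage[i] + dt * (r - leak_rate * storage[i])
    return storage[1:]

rng = np.random.default_rng(7)
rain = rng.gamma(0.3, 4.0, size=72)          # a synthetic 72-hour storm (mm/h)
level = leaky_barrel(rain)
threshold = 60.0                             # placeholder threshold level (mm)
above = level > threshold
print("hours above threshold:", int(above.sum()),
      "peak exceedance:", round(float((level - threshold).max()), 1))
```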

  16. A Threshold Model of Social Support, Adjustment, and Distress after Breast Cancer Treatment

    ERIC Educational Resources Information Center

    Mallinckrodt, Brent; Armer, Jane M.; Heppner, P. Paul

    2012-01-01

    This study examined a threshold model that proposes that social support exhibits a curvilinear association with adjustment and distress, such that support in excess of a critical threshold level has decreasing incremental benefits. Women diagnosed with a first occurrence of breast cancer (N = 154) completed survey measures of perceived support…

  17. Modelling food and population dynamics in honey bee colonies.

    PubMed

    Khoury, David S; Barron, Andrew B; Myerscough, Mary R

    2013-01-01

    Honey bees (Apis mellifera) are increasingly in demand as pollinators for various key agricultural food crops, but globally honey bee populations are in decline, and honey bee colony failure rates have increased. This scenario highlights a need to understand the conditions in which colonies flourish and in which colonies fail. To aid this investigation, we present a compartment model of bee population dynamics to explore how food availability and bee death rates interact to determine colony growth and development. Our model uses simple differential equations to represent the transitions of eggs laid by the queen to brood, then hive bees and finally forager bees, and the process of social inhibition that regulates the rate at which hive bees begin to forage. We assume that food availability can influence both the number of brood successfully reared to adulthood and the rate at which bees transition from hive duties to foraging. The model predicts complex interactions between food availability and forager death rates in shaping colony fate. Low death rates and high food availability result in stable bee populations at equilibrium (with population size strongly determined by forager death rate) but consistently increasing food reserves. At higher death rates, food stores in a colony settle at a finite equilibrium reflecting the balance of food collection and food use. When forager death rates exceed a critical threshold, the colony fails but residual food remains. Our model presents a simple mathematical framework for exploring the interactions of food and forager mortality on colony fate, and provides the mathematical basis for more involved simulation models of hive performance.
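
    A minimal sketch of a compartment model in the spirit described above (brood, hive bees, foragers and stored food), with social inhibition slowing recruitment to foraging and food shortage reducing brood survival; the functional forms and parameter values are simplified placeholders, not the authors' equations.

```python
import numpy as np
from scipy.integrate import solve_ivp

L, phi = 1500.0, 1.0 / 9.0        # egg laying rate, brood maturation rate (1/day)
alpha, sigma = 0.25, 0.75         # max recruitment rate, strength of social inhibition
m, c, gamma = 0.15, 0.1, 0.007    # forager mortality, food gathered per forager, food use per bee

def hive(t, y):
    B, H, F, food = y
    survive = food / (food + 500.0)                           # brood survival drops when food is short
    recruit = max(alpha - sigma * F / max(H + F, 1.0), 0.0)   # social inhibition by foragers
    dB = L * survive - phi * B
    dH = phi * B - recruit * H
    dF = recruit * H - m * F
    dfood = c * F - gamma * (B + H + F)
    return [dB, dH, dF, dfood]

sol = solve_ivp(hive, (0.0, 300.0), [0.0, 10000.0, 5000.0, 2000.0])
print("final state (brood, hive bees, foragers, food):", np.round(sol.y[:, -1], 1))
```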

  18. The interplay between cooperativity and diversity in model threshold ensembles.

    PubMed

    Cervera, Javier; Manzanares, José A; Mafe, Salvador

    2014-10-06

    The interplay between cooperativity and diversity is crucial for biological ensembles because single molecule experiments show a significant degree of heterogeneity and also for artificial nanostructures because of the high individual variability characteristic of nanoscale units. We study the cross-effects between cooperativity and diversity in model threshold ensembles composed of individually different units that show a cooperative behaviour. The units are modelled as statistical distributions of parameters (the individual threshold potentials here) characterized by central and width distribution values. The simulations show that the interplay between cooperativity and diversity results in ensemble-averaged responses of interest for the understanding of electrical transduction in cell membranes, the experimental characterization of heterogeneous groups of biomolecules and the development of biologically inspired engineering designs with individually different building blocks. © 2014 The Author(s) Published by the Royal Society. All rights reserved.
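
    A minimal sketch of an ensemble-averaged response for units with individually different threshold potentials; the sigmoidal single-unit response and the way cooperativity enters (a shared shift proportional to the mean ensemble output) are illustrative assumptions, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(3)
n_units = 1000
thresholds = rng.normal(loc=0.0, scale=0.3, size=n_units)   # diversity: individual thresholds

def ensemble_response(v, coupling=0.5, beta=10.0, n_iter=50):
    """Self-consistent ensemble average: each unit responds sigmoidally around its
    own threshold, and cooperativity shifts every unit by coupling * mean output."""
    mean_out = 0.0
    for _ in range(n_iter):
        x = beta * (v + coupling * mean_out - thresholds)
        mean_out = float(np.mean(1.0 / (1.0 + np.exp(-x))))
    return mean_out

for v in (-0.5, -0.2, 0.0, 0.2, 0.5):
    print(v, round(ensemble_response(v), 3))
```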

  19. Population-based prevention of eating disorders: an application of the Rose prevention model.

    PubMed

    Austin, S B

    2001-03-01

    Several decades of concerted research on eating disorders have generated a broad range of proposed causal influences, but much of this etiologic research does not elucidate practical avenues for preventive interventions. Translating etiologic theory into community health interventions depends on the identification of key leverage points, factors that are amenable to public health intervention and provide an opportunity to maximize impact on the outcome of interest. Population-based preventive strategies, elaborated by epidemiologist Geoffrey Rose, can maximize the impact of public health interventions. In the case of eating disorders, Rose's model is instructive: Dieting stands out as risk behavior that may both fit Rose's model well and be a key leverage point for preventive intervention. Grounded in Rose's work, this article lodges a theoretical argument for the population-based prevention of eating disorders. In the introductory section, existing research on the epidemiology of dieting is reviewed, showing that it is extremely common among adolescent girls and women and that the behavior has been implicated as a causal factor for disordered eating. Next, new evidence is offered to build a case for how a population-wide reduction in dieting may be an effective strategy for prevention of eating pathology. Finally Rose's prevention framework is used to introduce a unique and provocative perspective on the prevention of eating disorders. Dieting is a normative behavior in our culture with psychological and physiological effects in the causal chain leading to eating pathology. This behavior may represent an ideal target for population-based prevention. Theoretical and empirical evidence suggests that a population-wide reduction in dieting may be a justifiable and effective strategy for prevention of eating pathology. Copyright 2001 American Health Foundation and Academic Press.

  20. Modeling of ablation threshold dependence on pulse duration for dielectrics with ultrashort pulsed laser

    NASA Astrophysics Data System (ADS)

    Sun, Mingying; Zhu, Jianqiang; Lin, Zunqi

    2017-01-01

    We present a numerical model of plasma formation in ultrafast laser ablation on dielectric surfaces. Ablation threshold dependence on pulse duration is predicted with the model, and the numerical results for water agree well with the experimental data for pulse durations from 140 fs to 10 ps. Influences of parameters and approximations of photo- and avalanche-ionization on the ablation threshold prediction are analyzed in detail for various pulse lengths. The calculated ablation threshold is strongly dependent on electron collision time for all the pulse durations. The complete photoionization model is preferred for pulses shorter than 1 ps rather than the multiphoton ionization approximations. The transition time of inverse bremsstrahlung absorption needs to be considered when pulses are shorter than 5 ps, and it also keeps the avalanche ionization (AI) coefficient consistent with that in multiple rate equations (MREs) for pulses shorter than 300 fs. The threshold electron density for AI is only crucial for longer pulses. It is reasonable to ignore the recombination loss for pulses shorter than 100 fs. In addition to thermal transport and hydrodynamics, neglecting the threshold density for AI and recombination could also contribute to the disagreements between the numerical and the experimental results for longer pulses.
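
    The core of such plasma-formation models is a rate equation for the conduction-band electron density driven by photoionization and avalanche ionization; the generic form below is shown to fix ideas and is not the paper's exact formulation.

```latex
\frac{d\rho(t)}{dt}
= W_{\mathrm{PI}}\!\bigl(I(t)\bigr)
+ \eta\!\bigl(I(t)\bigr)\,\rho(t)
- \frac{\rho(t)}{\tau_{\mathrm{rec}}}
```

    where W_PI is the photoionization rate (multiphoton or full Keldysh), η is the avalanche-ionization coefficient, and τ_rec is the recombination time; the ablation threshold is then the fluence at which ρ reaches a critical density before the pulse ends.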

  1. Epidemic Threshold in Structured Scale-Free Networks

    NASA Astrophysics Data System (ADS)

    Eguíluz, Víctor M.; Klemm, Konstantin

    2002-08-01

    We analyze the spreading of viruses in scale-free networks with high clustering and degree correlations, as found in the Internet graph. For the susceptible-infected-susceptible model of epidemics the prevalence undergoes a phase transition at a finite threshold of the transmission probability. Compared with the absence of a finite threshold in networks with purely random wiring, our result suggests that high clustering (modularity) and degree correlations protect scale-free networks against the spreading of viruses. We introduce and verify a quantitative description of the epidemic threshold based on the connectivity of the neighborhoods of the hubs.
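
    For context, the heterogeneous mean-field result for uncorrelated networks that this work contrasts with is shown below: in scale-free networks with a diverging second moment the threshold vanishes, whereas the clustered, correlated topology studied here restores a finite one.

```latex
\lambda_{c} = \frac{\langle k \rangle}{\langle k^{2} \rangle},
\qquad
\langle k^{2} \rangle \to \infty \ \ \text{for } P(k)\sim k^{-\gamma},\ 2<\gamma\le 3,
\ \ \text{so } \lambda_{c}\to 0 \ \text{as } N\to\infty
```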

  2. A threshold model of content knowledge transfer for socioscientific argumentation

    NASA Astrophysics Data System (ADS)

    Sadler, Troy D.; Fowler, Samantha R.

    2006-11-01

    This study explores how individuals make use of scientific content knowledge for socioscientific argumentation. More specifically, this mixed-methods study investigates how learners apply genetics content knowledge as they justify claims relative to genetic engineering. Interviews are conducted with 45 participants, representing three distinct groups: high school students with variable genetics knowledge, college nonscience majors with little genetics knowledge, and college science majors with advanced genetics knowledge. During the interviews, participants advance positions concerning three scenarios dealing with gene therapy and cloning. Arguments are assessed in terms of the number of justifications offered as well as justification quality, based on a five-point rubric. Multivariate analysis of variance results indicate that college science majors outperformed the other groups in terms of justification quality and frequency. Argumentation does not differ among nonscience majors or high school students. Follow-up qualitative analyses of interview responses suggest that all three groups tend to focus on similar, sociomoral themes as they negotiate socially complex, genetic engineering issues, but that the science majors frequently reference specific science content knowledge in the justification of their claims. Results support the Threshold Model of Content Knowledge Transfer, which proposes two knowledge thresholds around which argumentation quality can reasonably be expected to increase. Research and educational implications of these findings are discussed.

  3. Assessing three fish species ecological status in Colorado River, Grand Canyon based on physical habitat and population models.

    PubMed

    Yao, Weiwei; Chen, Yuansheng

    2018-04-01

    The Colorado River is a unique ecosystem and provides important ecological services such as habitat for fish species as well as water power energy supplies. River management for this ecosystem requires assessment and decision support tools for fish, which involves protecting, restoring, and forecasting future conditions. In this paper, a habitat and population model was developed and used to determine the levels of fish habitat suitability and population density in the Colorado River between Lees Ferry and Lake Mead. Short-term target fish populations are also predicted based on the native fish recovery strategy. The model was developed by combining hydrodynamic, heat transfer and sediment transport models with a habitat suitability index model, and then coupling the habitat model with a life-stage population model. The fish were divided into four life stages according to fish length. The three most abundant and typical native and non-native fish were selected as target species: rainbow trout (Oncorhynchus mykiss), brown trout (Salmo trutta) and flannelmouth sucker (Catostomus latipinnis). Flow velocity, water depth, water temperature and substrate were used as the suitability indicators in the habitat model, and the overall suitability index (OSI) as well as the weighted usable area (WUA) were used as indicators in the population model. A comparison was made between simulated fish population alteration and surveyed fish number fluctuation from 2000 to 2009. The application of this habitat and population model indicates that it can accurately represent the habitat conditions and target fish population dynamics in the study areas. The analysis also indicates that the flannelmouth sucker population will steadily increase while the rainbow trout will decrease under the native fish recovery scheme. Copyright © 2018. Published by Elsevier Inc.
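
    The two indicators named above are commonly computed cell by cell from the hydraulic model output; the expressions below follow the usual habitat-modelling convention (the geometric-mean combination is an assumption, since authors differ in how the individual suitability indices are aggregated).

```latex
\mathrm{OSI}_{i} = \bigl( SI_{v,i}\, SI_{d,i}\, SI_{T,i}\, SI_{s,i} \bigr)^{1/4},
\qquad
\mathrm{WUA} = \sum_{i} \mathrm{OSI}_{i}\, A_{i}
```

    where SI_v, SI_d, SI_T and SI_s are the velocity, depth, temperature and substrate suitability indices of computational cell i, and A_i is the cell area.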

  4. Using perceptual cues for brake response to a lead vehicle: Comparing threshold and accumulator models of visual looming.

    PubMed

    Xue, Qingwan; Markkula, Gustav; Yan, Xuedong; Merat, Natasha

    2018-06-18

    Previous studies have shown the effect of a lead vehicle's speed, deceleration rate and headway distance on drivers' brake response times. However, how drivers perceive this information and use it to determine when to apply braking is still not quite clear. To better understand the underlying mechanisms, a driving simulator experiment was performed where each participant experienced nine deceleration scenarios. Previously reported effects of the lead vehicle's speed, deceleration rate and headway distance on brake response time were first verified in this paper, using a multilevel model. Then, as an alternative to measures of speed, deceleration rate and distance, two visual looming-based metrics (angular expansion rate θ˙ of the lead vehicle on the driver's retina, and inverse tau τ⁻¹, the ratio between θ˙ and the optical size θ), considered to be more in line with typical human psycho-perceptual responses, were adopted to quantify situation urgency. These metrics were used in two previously proposed mechanistic models predicting brake onset: either when looming surpasses a threshold, or when the accumulated evidence (looming and other cues) reaches a threshold. Results showed that the looming threshold model did not capture the distribution of brake response time. However, regardless of looming metric, the accumulator models fitted the distribution of brake response times better than the pure threshold models. Accumulator models that included brake lights provided a better model fit than looming-only versions. For all versions of the mechanistic models, models using τ⁻¹ as the measure of looming fitted better than those using θ˙, indicating that the visual cues drivers use during rear-end collision avoidance may be closer to τ⁻¹. Copyright © 2018 Elsevier Ltd. All rights reserved.
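
    A minimal sketch contrasting the two mechanisms, assuming a simple car-following geometry in which looming is computed from lead-vehicle width and gap; the accumulator here integrates supra-threshold inverse tau, which is a simplification of the model class tested rather than the fitted model from the paper.

```python
import numpy as np

def looming(width, gap, closing_speed):
    """Optical size, its expansion rate, and inverse tau for a lead vehicle."""
    theta = 2.0 * np.arctan(width / (2.0 * gap))
    theta_dot = width * closing_speed / (gap**2 + width**2 / 4.0)
    return theta, theta_dot, theta_dot / theta          # inverse tau = theta_dot / theta

def brake_onset(times, inv_tau, threshold=0.2, accumulate=True, gain=5.0):
    """Threshold model: brake when inverse tau first exceeds the threshold.
    Accumulator model: brake when integrated supra-threshold evidence reaches 1."""
    if not accumulate:
        idx = int(np.argmax(inv_tau > threshold))
        return times[idx] if inv_tau[idx] > threshold else None
    dt = np.diff(times, prepend=times[0])
    evidence = np.cumsum(np.maximum(inv_tau - threshold, 0.0) * gain * dt)
    idx = int(np.argmax(evidence > 1.0))
    return times[idx] if evidence[idx] > 1.0 else None

t = np.arange(0.0, 6.0, 0.01)
gap = np.maximum(40.0 - 1.5 * t**2, 1.0)     # lead vehicle decelerating at 3 m/s^2
speed = np.minimum(3.0 * t, 40.0)            # closing speed builds up over time
_, _, inv_tau = looming(1.8, gap, speed)
print("threshold model brake time:", brake_onset(t, inv_tau, accumulate=False))
print("accumulator model brake time:", brake_onset(t, inv_tau, accumulate=True))
```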

  5. A modeling approach to establish environmental flow threshold in ungauged semidiurnal tidal river

    NASA Astrophysics Data System (ADS)

    Akter, A.; Tanim, A. H.

    2018-03-01

    Due to a shortage of flow monitoring data in ungauged semidiurnal rivers, 'environmental flow' (EF) determination based on its key component, 'minimum low flow', is always difficult. For EF assessment this study selected a reach immediately after the Halda-Karnafuli confluence, a unique breeding ground for Indian Carp fishes of Bangladesh. In an ungauged tidal river, EF threshold establishment faces the challenge of ecological paradigms that change with the periodic tides and with hydrologic alterations. This study describes a novel approach through a modeling framework comprising hydrological, hydrodynamic and habitat simulation models. The EF establishment was conceptualized according to the hydrologic process of an ungauged semi-diurnal tidal regime in four steps. Initially, a hydrologic model was coupled with a hydrodynamic model to simulate flow, considering the effect of land use changes on streamflow, channel seepage loss, friction-dominated tidal decay, and the lack of long-term flow characteristics. Secondly, to define hydraulic habitat features, a statistical analysis of the derived flow data was performed to identify 'habitat suitability'. Thirdly, to observe the ecological habitat behavior based on the identified hydrologic alteration, hydraulic habitat features were investigated. Finally, based on the combined habitat suitability index, a relationship between flow alteration and ecological response was established. The obtained EF provides a set of low flow indices for the desired regime, and the discharge corresponding to the maximum Weighted Usable Area (WUA) was defined as the EF threshold for the selected reach. A suitable EF regime condition was obtained within the flow range 25-30.1 m3/s, i.e., around 10-12% of the mean annual runoff of 245 m3/s, and these findings are within researchers' recommendations for minimum flow requirements. Additionally, it was observed that tidal characteristics are the dominant process in the semi-diurnal regime. However, during the study period (2010-2015) the

  6. An adaptive design for updating the threshold value of a continuous biomarker

    PubMed Central

    Spencer, Amy V.; Harbron, Chris; Mander, Adrian; Wason, James; Peers, Ian

    2017-01-01

    Potential predictive biomarkers are often measured on a continuous scale, but in practice, a threshold value to divide the patient population into biomarker ‘positive’ and ‘negative’ is desirable. Early phase clinical trials are increasingly using biomarkers for patient selection, but at this stage, it is likely that little will be known about the relationship between the biomarker and the treatment outcome. We describe a single-arm trial design with adaptive enrichment, which can increase power to demonstrate efficacy within a patient subpopulation, the parameters of which are also estimated. Our design enables us to learn about the biomarker and optimally adjust the threshold during the study, using a combination of generalised linear modelling and Bayesian prediction. At the final analysis, a binomial exact test is carried out, allowing the hypothesis that ‘no population subset exists in which the novel treatment has a desirable response rate’ to be tested. Through extensive simulations, we are able to show increased power over fixed threshold methods in many situations without increasing the type-I error rate. We also show that estimates of the threshold, which defines the population subset, are unbiased and often more precise than those from fixed threshold studies. We provide an example of the method applied (retrospectively) to publically available data from a study of the use of tamoxifen after mastectomy by the German Breast Study Group, where progesterone receptor is the biomarker of interest. PMID:27417407

  7. Pool desiccation and developmental thresholds in the common frog, Rana temporaria.

    PubMed

    Lind, Martin I; Persbo, Frida; Johansson, Frank

    2008-05-07

    The developmental threshold is the minimum size or condition that a developing organism must have reached in order for a life-history transition to occur. Although developmental thresholds have been observed for many organisms, inter-population variation among natural populations has not been examined. Since isolated populations can be subjected to strong divergent selection, population divergence in developmental thresholds can be predicted if environmental conditions favour fast or slow developmental time in different populations. Amphibian metamorphosis is a well-studied life-history transition, and using a common garden approach we compared the development time and the developmental threshold of metamorphosis in four island populations of the common frog Rana temporaria: two populations originating from islands with only temporary breeding pools and two from islands with permanent pools. As predicted, tadpoles from time-constrained temporary pools had a genetically shorter development time than those from permanent pools. Furthermore, the variation in development time among females from temporary pools was low, consistent with the action of selection on rapid development in this environment. However, there were no clear differences in the developmental thresholds between the populations, indicating that the main response to life in a temporary pool is to shorten the development time.

  8. Determination of the threshold dose distribution in photodynamic action from in vitro experiments.

    PubMed

    de Faria, Clara Maria Gonçalves; Inada, Natalia Mayumi; Kurachi, Cristina; Bagnato, Vanderlei Salvador

    2016-09-01

    The concept of threshold in photodynamic action on cells or microorganisms is well observed in experiments but not fully explored in in vitro experiments. The intercomparison of the light dose and photosensitizer used across experiments is also poorly evaluated. In this report, we present an analytical model that allows extracting, from survival rate experiments, the threshold dose distribution, i.e., the distribution of energies and photosensitizer concentrations necessary to produce cell death. Then, we use this model to investigate photodynamic therapy (PDT) data previously published in the literature. The concept of a threshold dose distribution, instead of a "single value of threshold", is a rich one for comparing photodynamic action in different situations, allowing analysis of its efficiency as well as determination of optimized conditions for PDT. We observed that, in general, as it becomes more difficult to kill a population, the distribution tends to broaden, which means it presents a large spectrum of threshold values within the same cell type population. From the distribution parameters (center peak and full width), we also observed a clear distinction among cell types regarding their response to PDT that can be quantified. Comparing data obtained from the same cell line and photosensitizer (PS), where the only distinct condition was the light source's wavelength, we found that the differences in the distribution parameters were comparable to the differences in PS absorption. Finally, we observed evidence that the threshold dose distribution matches the curve of apoptotic activity for some PSs. Copyright © 2016 Elsevier B.V. All rights reserved.
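
    The link between the measured survival curve and the threshold dose distribution can be made explicit: if each cell dies once the delivered dose exceeds its own threshold, the survival fraction is one minus the cumulative threshold distribution, so the distribution is recovered by differentiating the fitted survival curve. This is the generic relationship written with assumed notation, not the authors' symbols.

```latex
S(D) = 1 - \int_{0}^{D} f_{\mathrm{th}}(u)\,du
\qquad\Longrightarrow\qquad
f_{\mathrm{th}}(D) = -\,\frac{dS(D)}{dD}
```

    so the peak position and full width of f_th summarize, respectively, the typical threshold dose of the cell population and its heterogeneity.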

  9. Detection and Modeling of High-Dimensional Thresholds for Fault Detection and Diagnosis

    NASA Technical Reports Server (NTRS)

    He, Yuning

    2015-01-01

    Many Fault Detection and Diagnosis (FDD) systems use discrete models for detection and reasoning. To obtain categorical values like "oil pressure too high", analog sensor values need to be discretized using a suitable threshold. Time series of analog and discrete sensor readings are processed and discretized as they come in. This task is usually performed by the "wrapper code" of the FDD system, together with signal preprocessing and filtering. In practice, selecting the right threshold is very difficult, because it heavily influences the quality of diagnosis. If a threshold causes the alarm to trigger even in nominal situations, false alarms will be the consequence. On the other hand, if the threshold setting does not trigger in the case of an off-nominal condition, important alarms might be missed, potentially causing hazardous situations. In this paper, we describe in detail the underlying statistical modeling techniques and algorithm, as well as the Bayesian method for selecting the most likely shape and its parameters. Our approach is illustrated by several examples from the aerospace domain.

  10. Cost–effectiveness thresholds: pros and cons

    PubMed Central

    Lauer, Jeremy A; De Joncheere, Kees; Edejer, Tessa; Hutubessy, Raymond; Kieny, Marie-Paule; Hill, Suzanne R

    2016-01-01

    Abstract Cost–effectiveness analysis is used to compare the costs and outcomes of alternative policy options. Each resulting cost–effectiveness ratio represents the magnitude of additional health gained per additional unit of resources spent. Cost–effectiveness thresholds allow cost–effectiveness ratios that represent good or very good value for money to be identified. In 2001, the World Health Organization’s Commission on Macroeconomics in Health suggested cost–effectiveness thresholds based on multiples of a country’s per-capita gross domestic product (GDP). In some contexts, in choosing which health interventions to fund and which not to fund, these thresholds have been used as decision rules. However, experience with the use of such GDP-based thresholds in decision-making processes at country level shows them to lack country specificity and this – in addition to uncertainty in the modelled cost–effectiveness ratios – can lead to the wrong decision on how to spend health-care resources. Cost–effectiveness information should be used alongside other considerations – e.g. budget impact and feasibility considerations – in a transparent decision-making process, rather than in isolation based on a single threshold value. Although cost–effectiveness ratios are undoubtedly informative in assessing value for money, countries should be encouraged to develop a context-specific process for decision-making that is supported by legislation, has stakeholder buy-in, for example the involvement of civil society organizations and patient groups, and is transparent, consistent and fair. PMID:27994285

  11. Cost-effectiveness thresholds: pros and cons.

    PubMed

    Bertram, Melanie Y; Lauer, Jeremy A; De Joncheere, Kees; Edejer, Tessa; Hutubessy, Raymond; Kieny, Marie-Paule; Hill, Suzanne R

    2016-12-01

    Cost-effectiveness analysis is used to compare the costs and outcomes of alternative policy options. Each resulting cost-effectiveness ratio represents the magnitude of additional health gained per additional unit of resources spent. Cost-effectiveness thresholds allow cost-effectiveness ratios that represent good or very good value for money to be identified. In 2001, the World Health Organization's Commission on Macroeconomics in Health suggested cost-effectiveness thresholds based on multiples of a country's per-capita gross domestic product (GDP). In some contexts, in choosing which health interventions to fund and which not to fund, these thresholds have been used as decision rules. However, experience with the use of such GDP-based thresholds in decision-making processes at country level shows them to lack country specificity and this - in addition to uncertainty in the modelled cost-effectiveness ratios - can lead to the wrong decision on how to spend health-care resources. Cost-effectiveness information should be used alongside other considerations - e.g. budget impact and feasibility considerations - in a transparent decision-making process, rather than in isolation based on a single threshold value. Although cost-effectiveness ratios are undoubtedly informative in assessing value for money, countries should be encouraged to develop a context-specific process for decision-making that is supported by legislation, has stakeholder buy-in, for example the involvement of civil society organizations and patient groups, and is transparent, consistent and fair.

  12. Perceptual color difference metric including a CSF based on the perception threshold

    NASA Astrophysics Data System (ADS)

    Rosselli, Vincent; Larabi, Mohamed-Chaker; Fernandez-Maloigne, Christine

    2008-01-01

    Study of the Human Visual System (HVS) is valuable for quantifying the quality of a picture, predicting which information will be perceived in it, and applying suitably adapted tools. The Contrast Sensitivity Function (CSF) is one of the main ways to integrate HVS properties into an imaging system. It characterizes the sensitivity of the visual system to spatial and temporal frequencies and predicts the behavior of the three channels. The CSF has commonly been constructed by estimating the detection threshold beyond which a stimulus can be perceived. In this work, we developed a novel approach for spatio-chromatic CSF construction based on matching experiments to estimate the perception threshold. It consists of matching the contrast of a test stimulus with that of a reference one. The results are quite different from those of the standard approaches, as the chromatic CSFs show band-pass rather than low-pass behavior. The obtained model has been integrated into a perceptual color difference metric inspired by s-CIELAB. The metric is then evaluated with both objective and subjective procedures.

  13. Survival models for harvest management of mourning dove populations

    USGS Publications Warehouse

    Otis, D.L.

    2002-01-01

    Quantitative models of the relationship between annual survival and harvest rate of migratory game-bird populations are essential to science-based harvest management strategies. I used the best available band-recovery and harvest data for mourning doves (Zenaida macroura) to build a set of models based on different assumptions about compensatory harvest mortality. Although these models suffer from lack of contemporary data, they can be used in development of an initial set of population models that synthesize existing demographic data on a management-unit scale, and serve as a tool for prioritization of population demographic information needs. Credible harvest management plans for mourning dove populations will require a long-term commitment to population monitoring and iterative population analysis.

  14. Model-based estimators of density and connectivity to inform conservation of spatially structured populations

    USGS Publications Warehouse

    Morin, Dana J.; Fuller, Angela K.; Royle, J. Andrew; Sutherland, Chris

    2017-01-01

    Conservation and management of spatially structured populations is challenging because solutions must consider where individuals are located, but also differential individual space use as a result of landscape heterogeneity. A recent extension of spatial capture–recapture (SCR) models, the ecological distance model, uses spatial encounter histories of individuals (e.g., a record of where individuals are detected across space, often sequenced over multiple sampling occasions), to estimate the relationship between space use and characteristics of a landscape, allowing simultaneous estimation of both local densities of individuals across space and connectivity at the scale of individual movement. We developed two model-based estimators derived from the SCR ecological distance model to quantify connectivity over a continuous surface: (1) potential connectivity—a metric of the connectivity of areas based on resistance to individual movement; and (2) density-weighted connectivity (DWC)—potential connectivity weighted by estimated density. Estimates of potential connectivity and DWC can provide spatial representations of areas that are most important for the conservation of threatened species, or management of abundant populations (i.e., areas with high density and landscape connectivity), and thus generate predictions that have great potential to inform conservation and management actions. We used a simulation study with a stationary trap design across a range of landscape resistance scenarios to evaluate how well our model estimates resistance, potential connectivity, and DWC. Correlation between true and estimated potential connectivity was high, and there was positive correlation and high spatial accuracy between estimated DWC and true DWC. We applied our approach to data collected from a population of black bears in New York, and found that forested areas represented low levels of resistance for black bears. We demonstrate that formal inference about measures

  15. On the thresholds in modeling of high flows via artificial neural networks - A bootstrapping analysis

    NASA Astrophysics Data System (ADS)

    Panagoulia, D.; Trichakis, I.

    2012-04-01

    Considering the growing interest in simulating hydrological phenomena with artificial neural networks (ANNs), it is useful to figure out the potential and limits of these models. In this study, the main objective is to examine how to improve the ability of an ANN model to simulate extreme values of flow utilizing a priori knowledge of threshold values. A three-layer feedforward ANN was trained by using the back propagation algorithm and the logistic function as the activation function. By using the thresholds, the flow was partitioned into low (x < μ), medium (μ ≤ x ≤ μ + 2σ) and high (x > μ + 2σ) values. The employed ANN model was trained on the high flow partition and on all flow data. The developed methodology was implemented over a mountainous river catchment (the Mesochora catchment in northwestern Greece). The ANN model received as inputs pseudo-precipitation (rain plus melt) and previously observed flow data. After the training was completed, the bootstrapping methodology was applied to calculate the ANN confidence intervals (CIs) for a 95% nominal coverage. The calculated CIs included only the uncertainty that comes from the calibration procedure. The results showed that an ANN model trained specifically for high flows, with a priori knowledge of the thresholds, can simulate these extreme values much better (RMSE is 31.4% less) than an ANN model trained with all data of the available time series and using a posteriori threshold values. On the other hand, the width of the CIs increases by 54.9% with a simultaneous increase by 64.4% of the actual coverage for the high flows (a priori partition). The narrower CIs of the high flows trained with all data may be attributed to the smoothing effect produced from the use of the full data sets. Overall, the results suggest that an ANN model trained with a priori knowledge of the threshold values has an increased ability in simulating extreme values compared with an ANN model trained with all the data and a posteriori threshold values.
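
    A minimal sketch of the a priori partitioning step and high-flow-specific training described above, using scikit-learn's MLPRegressor in place of the authors' back-propagation network; the synthetic data, feature construction and bootstrap loop are simplified placeholders.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(5)
pseudo_precip = rng.gamma(2.0, 3.0, 2000)                      # rain plus melt (synthetic)
flow = 5.0 + 0.6 * pseudo_precip + rng.normal(0.0, 1.0, 2000)  # synthetic flow series

mu, sigma = flow.mean(), flow.std()
high = flow > mu + 2.0 * sigma               # a priori high-flow partition

X = np.column_stack([pseudo_precip[:-1], flow[:-1]])   # inputs: forcing and previous flow
y = flow[1:]
mask = high[1:]                                         # train only on high-flow targets

boot_preds = []
for _ in range(50):                                     # bootstrap for confidence intervals
    idx = rng.choice(np.where(mask)[0], size=int(mask.sum()), replace=True)
    model = MLPRegressor(hidden_layer_sizes=(10,), activation="logistic",
                         max_iter=2000, random_state=0).fit(X[idx], y[idx])
    boot_preds.append(model.predict(X[mask]))

boot_preds = np.array(boot_preds)
ci_low, ci_high = np.percentile(boot_preds, [2.5, 97.5], axis=0)
print("mean 95% CI width for high flows:", round(float(np.mean(ci_high - ci_low)), 2))
```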

  16. Genetic variation in threshold reaction norms for alternative reproductive tactics in male Atlantic salmon, Salmo salar.

    PubMed

    Piché, Jacinthe; Hutchings, Jeffrey A; Blanchard, Wade

    2008-07-07

    Alternative reproductive tactics may be a product of adaptive phenotypic plasticity, such that discontinuous variation in life history depends on both the genotype and the environment. Phenotypes that fall below a genetically determined threshold adopt one tactic, while those exceeding the threshold adopt the alternative tactic. We report evidence of genetic variability in maturation thresholds for male Atlantic salmon (Salmo salar) that mature either as large (more than 1 kg) anadromous males or as small (10-150 g) parr. Using a common-garden experimental protocol, we find that the growth rate at which the sneaker parr phenotype is expressed differs among pure- and mixed-population crosses. Maturation thresholds of hybrids were intermediate to those of pure crosses, consistent with the hypothesis that the life-history switch points are heritable. Our work provides evidence, for a vertebrate, that thresholds for alternative reproductive tactics differ genetically among populations and can be modelled as discontinuous reaction norms for age and size at maturity.

  17. Error threshold for color codes and random three-body Ising models.

    PubMed

    Katzgraber, Helmut G; Bombin, H; Martin-Delgado, M A

    2009-08-28

    We study the error threshold of color codes, a class of topological quantum codes that allow a direct implementation of quantum Clifford gates suitable for entanglement distillation, teleportation, and fault-tolerant quantum computation. We map the error-correction process onto a statistical mechanical random three-body Ising model and study its phase diagram via Monte Carlo simulations. The obtained error threshold of p(c) = 0.109(2) is very close to that of Kitaev's toric code, showing that enhanced computational capabilities do not necessarily imply lower resistance to noise.

  18. Kinetic Model of Growth of Arthropoda Populations

    NASA Astrophysics Data System (ADS)

    Ershov, Yu. A.; Kuznetsov, M. A.

    2018-05-01

    Kinetic equations were derived for calculating the growth of crustacean populations (Crustacea) based on the biological growth model suggested earlier using shrimp (Caridea) populations as an example. The development cycle of successive stages for populations can be represented in the form of quasi-chemical equations. The kinetic equations that describe the development cycle of crustaceans allow quantitative prediction of the development of populations depending on conditions. In contrast to extrapolation-simulation models, in the developed kinetic model of biological growth the kinetic parameters are the experimental characteristics of population growth. Verification and parametric identification of the developed model on the basis of the experimental data showed agreement with experiment within the error of the measurement technique.

  19. Modeling of high composition AlGaN channel high electron mobility transistors with large threshold voltage

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bajaj, Sanyam, E-mail: bajaj.10@osu.edu; Hung, Ting-Hsiang; Akyol, Fatih

    2014-12-29

    We report on the potential of high electron mobility transistors (HEMTs) consisting of high composition AlGaN channel and barrier layers for power switching applications. Detailed two-dimensional (2D) simulations show that threshold voltages in excess of 3 V can be achieved through the use of AlGaN channel layers. We also calculate the 2D electron gas mobility in AlGaN channel HEMTs and evaluate their power figures of merit as a function of device operating temperature and Al mole fraction in the channel. Our models show that power switching transistors with AlGaN channels would have comparable on-resistance to GaN-channel based transistors for the same operation voltage. The modeling in this paper shows the potential of high composition AlGaN as a channel material for future high threshold enhancement mode transistors.

  20. Modeling Honey Bee Populations.

    PubMed

    Torres, David J; Ricoy, Ulises M; Roybal, Shanae

    2015-01-01

    Eusocial honey bee populations (Apis mellifera) employ an age stratification organization of egg, larvae, pupae, hive bees and foraging bees. Understanding the recent decline in honey bee colonies hinges on understanding the factors that impact each of these different age castes. We first perform an analysis of steady state bee populations given mortality rates within each bee caste and find that the honey bee colony is highly susceptible to hive and pupae mortality rates. Subsequently, we study transient bee population dynamics by building upon the modeling foundation established by Schmickl and Crailsheim and Khoury et al. Our transient model based on differential equations accounts for the effects of pheromones in slowing the maturation of hive bees to foraging bees, the increased mortality of larvae in the absence of sufficient hive bees, and the effects of food scarcity. We also conduct sensitivity studies and show the effects of parameter variations on the colony population.

  1. Modeling Honey Bee Populations

    PubMed Central

    Torres, David J.; Ricoy, Ulises M.; Roybal, Shanae

    2015-01-01

    Eusocial honey bee populations (Apis mellifera) employ an age stratification organization of egg, larvae, pupae, hive bees and foraging bees. Understanding the recent decline in honey bee colonies hinges on understanding the factors that impact each of these different age castes. We first perform an analysis of steady state bee populations given mortality rates within each bee caste and find that the honey bee colony is highly susceptible to hive and pupae mortality rates. Subsequently, we study transient bee population dynamics by building upon the modeling foundation established by Schmickl and Crailsheim and Khoury et al. Our transient model based on differential equations accounts for the effects of pheromones in slowing the maturation of hive bees to foraging bees, the increased mortality of larvae in the absence of sufficient hive bees, and the effects of food scarcity. We also conduct sensitivity studies and show the effects of parameter variations on the colony population. PMID:26148010

  2. Population-based contracting (population health): part II.

    PubMed

    Jacofsky, D J

    2017-11-01

    Modern healthcare contracting is shifting the responsibility for improving quality, enhancing community health and controlling the total cost of care for patient populations from payers to providers. Population-based contracting involves capitated risk taken across an entire population, such that any included services within the contract are paid for by the risk-bearing entity throughout the term of the agreement. Under such contracts, a risk-bearing entity, which may be a provider group, a hospital or another payer, administers the contract and assumes risk for contractually defined services. These contracts can be structured in various ways, from professional fee capitation to full global per member per month diagnosis-based risk. The entity contracting with the payer must have downstream network contracts to provide the care and facilities that it has agreed to provide. Population health is a very powerful model to reduce waste and costs. It requires a deep understanding of the nuances of such contracting and the appropriate infrastructure to manage both networks and risk. Cite this article: Bone Joint J 2017;99-B:1431-4. ©2017 The British Editorial Society of Bone & Joint Surgery.

  3. Cost-Effectiveness of Orthogeriatric and Fracture Liaison Service Models of Care for Hip Fracture Patients: A Population-Based Study.

    PubMed

    Leal, Jose; Gray, Alastair M; Hawley, Samuel; Prieto-Alhambra, Daniel; Delmestri, Antonella; Arden, Nigel K; Cooper, Cyrus; Javaid, M Kassim; Judge, Andrew

    2017-02-01

    Fracture liaison services are recommended as a model of best practice for organizing patient care and secondary fracture prevention for hip fracture patients, although variation exists in how such services are structured. There is considerable uncertainty as to which model is most cost-effective and should therefore be mandated. This study evaluated the cost-effectiveness of orthogeriatric (OG)- and nurse-led fracture liaison service (FLS) models of post-hip fracture care compared with usual care. Analyses were conducted from a health care and personal social services payer perspective, using a Markov model to estimate the lifetime impact of the models of care. The base-case population consisted of men and women aged 83 years with a hip fracture. The risk and costs of hip and non-hip fractures were derived from large primary and hospital care data sets in the UK. Utilities were informed by a meta-regression of 32 studies. In the base-case analysis, the orthogeriatric-led service was the most effective and cost-effective model of care at a threshold of £30,000 per quality-adjusted life year (QALY) gained. For women aged 83 years, the OG-led service was the most cost-effective at £22,709/QALY. If only health care costs are considered, the OG-led service was cost-effective at £12,860/QALY and £14,525/QALY for women and men aged 83 years, respectively. Irrespective of how patients were stratified in terms of their age, sex, and Charlson comorbidity score at index hip fracture, our results suggest that introducing an orthogeriatrician-led or a nurse-led FLS is cost-effective when compared with usual care. Although considerable uncertainty remains concerning which of the models of care should be preferred, introducing an orthogeriatrician-led service seems to be the most cost-effective service to pursue. © 2016 American Society for Bone and Mineral Research.

  4. A cognitive-consistency based model of population wide attitude change.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lakkaraju, Kiran; Speed, Ann Elizabeth

    Attitudes play a significant role in determining how individuals process information and behave. In this paper we have developed a new computational model of population-wide attitude change that captures the social level, how individuals interact and communicate information, and the cognitive level, how attitudes and concepts interact with each other. The model captures the cognitive aspect by representing each individual as a parallel constraint satisfaction network. The dynamics of this model are explored through a simple attitude change experiment where we vary the social network and the distribution of attitudes in a population.
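
    A minimal sketch of a parallel constraint satisfaction update for one agent's attitude network; the node set, weights, step size, and update rule are illustrative assumptions, not the authors' implementation:

      import numpy as np

      # Illustrative parallel constraint satisfaction network for a single agent.
      # Nodes hold activations in [-1, 1]; signed weights encode (in)consistency links
      # between attitudes and concepts.
      W = np.array([[ 0.0,  0.6, -0.4],
                    [ 0.6,  0.0,  0.5],
                    [-0.4,  0.5,  0.0]])   # symmetric constraint weights (assumed)
      a = np.array([0.1, -0.2, 0.8])        # initial activations of three concepts

      for _ in range(50):                   # iterate until activations settle
          a = np.clip(a + 0.1 * (W @ a), -1.0, 1.0)

      print(np.round(a, 2))                 # settled attitude/concept activations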

  5. A robust threshold-based cloud mask for the HRV channel of MSG SEVIRI

    NASA Astrophysics Data System (ADS)

    Bley, S.; Deneke, H.

    2013-03-01

    A robust threshold-based cloud mask for the high-resolution visible (HRV) channel (1 × 1 km2) of the METEOSAT SEVIRI instrument is introduced and evaluated. It is based on the operational EUMETSAT cloud mask for the low-resolution channels of SEVIRI (3 × 3 km2), which is used for the selection of suitable thresholds to ensure consistency with its results. The aim of using the HRV channel is to resolve small-scale cloud structures which cannot be detected by the low-resolution channels. We find that it is advantageous to apply thresholds relative to clear-sky reflectance composites and to adapt the thresholds regionally. Furthermore, the accuracy of the different spectral channels for thresholding and the suitability of the HRV channel for cloud detection are investigated. The case studies show different situations to demonstrate the behaviour for various surface and cloud conditions. Overall, between 4 and 24% of cloudy low-resolution SEVIRI pixels are found to contain broken clouds in our test dataset, depending on the considered region. Most of these broken pixels are classified as cloudy by EUMETSAT's cloud mask, which will likely result in an overestimate if the mask is used as an estimate of cloud fraction.
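
    A minimal sketch of the composite-relative thresholding idea described above; the array names, reflectance values, and threshold are assumptions for illustration, not the operational algorithm:

      import numpy as np

      def hrv_cloud_mask(hrv_reflectance, clear_sky_composite, threshold=0.08):
          """Flag a pixel as cloudy when its HRV reflectance exceeds the clear-sky
          composite by more than a (regionally adapted) threshold."""
          return (hrv_reflectance - clear_sky_composite) > threshold

      # Hypothetical 1 km HRV scene and matching clear-sky composite (reflectance 0..1).
      rng = np.random.default_rng(0)
      scene = rng.random((100, 100)) * 0.5
      composite = np.full((100, 100), 0.15)

      mask = hrv_cloud_mask(scene, composite)
      print(f"cloud fraction: {mask.mean():.2f}")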

  6. Evaluating effects of Everglades restoration on American crocodile populations in south Florida using a spatially-explicit, stage-based population model

    USGS Publications Warehouse

    Green, Timothy W.; Slone, Daniel H.; Swain, Eric D.; Cherkiss, Michael S.; Lohmann, Melinda; Mazzotti, Frank J.; Rice, Kenneth G.

    2014-01-01

    The distribution and abundance of the American crocodile (Crocodylus acutus) in the Florida Everglades is dependent on the timing, amount, and location of freshwater flow. One of the goals of the Comprehensive Everglades Restoration Plan (CERP) is to restore historic freshwater flows to American crocodile habitat throughout the Everglades. To predict the impacts on the crocodile population from planned restoration activities, we created a stage-based spatially explicit crocodile population model that incorporated regional hydrology models and American crocodile research and monitoring data. Growth and survival were influenced by salinity, water depth, and density-dependent interactions. A stage-structured spatial model was used with discrete spatial convolution to direct crocodiles toward attractive sources where conditions were favorable. The model predicted that CERP would have both positive and negative impacts on American crocodile growth, survival, and distribution. Overall, crocodile populations across south Florida were predicted to decrease approximately 3 % with the implementation of CERP compared to future conditions without restoration, but local increases up to 30 % occurred in the Joe Bay area near Taylor Slough, and local decreases up to 30 % occurred in the vicinity of Buttonwood Canal due to changes in salinity and freshwater flows.

  7. A critique of the use of indicator-species scores for identifying thresholds in species responses

    USGS Publications Warehouse

    Cuffney, Thomas F.; Qian, Song S.

    2013-01-01

    Identification of ecological thresholds is important for both theoretical and applied ecology. Recently, Baker and King (2010; King and Baker 2010) proposed a method, threshold indicator analysis (TITAN), to calculate species and community thresholds based on indicator species scores adapted from Dufrêne and Legendre (1997). We tested the ability of TITAN to detect thresholds using models with (broken-stick, disjointed broken-stick, dose-response, step-function, Gaussian) and without (linear) definitive thresholds. TITAN accurately and consistently detected thresholds in step-function models, but not in models characterized by abrupt changes in response slopes or response direction. Threshold detection in TITAN was very sensitive to the distribution of 0 values, which caused TITAN to identify thresholds associated with relatively small differences in the distribution of 0 values while ignoring thresholds associated with large changes in abundance. Threshold identification and tests of statistical significance were based on the same data permutations, resulting in inflated estimates of statistical significance. Application of bootstrapping to the split-point problem that underlies TITAN led to underestimates of the confidence intervals of thresholds. Bias in the derivation of the z-scores used to identify TITAN thresholds, and skewness in the distribution of data along the gradient, produced TITAN thresholds that were much more similar to one another than the actual thresholds were. This tendency may account for the synchronicity of thresholds reported in TITAN analyses. The thresholds identified by TITAN represented disparate characteristics of species responses; this, coupled with the inability of TITAN to identify thresholds accurately and consistently, does not support the aggregation of individual species thresholds into a community threshold.

  8. Assessment and Management of North American horseshoe crab populations, with emphasis on a multispecies framework for Delaware Bay, U.S.A. populations: Chapter 24

    USGS Publications Warehouse

    Millard, Michael J.; Sweka, John A.; McGowan, Conor P.; Smith, David R.

    2015-01-01

    The horseshoe crab fishery on the US Atlantic coast represents a compelling fishery management story for many reasons, including ecological complexity, health and human safety ramifications, and socio-economic conflicts. Knowledge of stock status and assessment and monitoring capabilities for the species have increased greatly in the last 15 years and permitted managers to make more informed harvest recommendations. Incorporating the bioenergetics needs of migratory shorebirds, which feed on horseshoe crab eggs, into the management framework for horseshoe crabs was identified as a goal, particularly in the Delaware Bay region where the birds and horseshoe crabs exhibit an important ecological interaction. In response, significant effort was invested in studying the population dynamics, migration ecology, and the ecologic relationship of a key migratory shorebird, the Red Knot, to horseshoe crabs. A suite of models was developed that linked Red Knot populations to horseshoe crab populations through a mass gain function where female spawning crab abundance determined what proportion of the migrating Red Knot population reached a critical body mass threshold. These models were incorporated in an adaptive management framework wherein optimal harvest decisions for horseshoe crab are recommended based on several resource-based and value-based variables and thresholds. The current adaptive framework represents a true multispecies management effort where additional data over time are employed to improve the predictive models and reduce parametric uncertainty. The possibility of increasing phenologic asynchrony between the two taxa in response to climate change presents a potential challenge to their ecologic interaction in Delaware Bay.

  9. Agile Model Driven Development of Electronic Health Record-Based Specialty Population Registries.

    PubMed

    Kannan, Vaishnavi; Fish, Jason C; Willett, DuWayne L

    2016-02-01

    The transformation of the American healthcare payment system from fee-for-service to value-based care increasingly makes it valuable to develop patient registries for specialized populations, to better assess healthcare quality and costs. Recent widespread adoption of Electronic Health Records (EHRs) in the U.S. now makes possible construction of EHR-based specialty registry data collection tools and reports, previously unfeasible using manual chart abstraction. But the complexities of specialty registry EHR tools and measures, along with the variety of stakeholders involved, can result in misunderstood requirements and frequent product change requests, as users first experience the tools in their actual clinical workflows. Such requirements churn could easily stall progress in specialty registry rollout. Modeling a system's requirements and solution design can be a powerful way to remove ambiguities, facilitate shared understanding, and help evolve a design to meet newly-discovered needs. "Agile Modeling" retains these values while avoiding excessive unused up-front modeling in favor of iterative incremental modeling. Using Agile Modeling principles and practices, in calendar year 2015 one institution developed 58 EHR-based specialty registries, with 111 new data collection tools, supporting 134 clinical process and outcome measures, and enrolling over 16,000 patients. The subset of UML and non-UML models found most consistently useful in designing, building, and iteratively evolving EHR-based specialty registries included User Stories, Domain Models, Use Case Diagrams, Decision Trees, Graphical User Interface Storyboards, Use Case text descriptions, and Solution Class Diagrams.

  10. Stochastic population dynamic models as probability networks

    Treesearch

    Borsuk, M.E.; Lee, D.C.

    2009-01-01

    The dynamics of a population and its response to environmental change depend on the balance of birth, death and age-at-maturity, and there have been many attempts to mathematically model populations based on these characteristics. Historically, most of these models were deterministic, meaning that the results were strictly determined by the equations of the model and...

  11. Genetic evaluation of calf and heifer survival in Iranian Holstein cattle using linear and threshold models.

    PubMed

    Forutan, M; Ansari Mahyari, S; Sargolzaei, M

    2015-02-01

    Calf and heifer survival are important traits in dairy cattle affecting profitability. This study was carried out to estimate genetic parameters of survival traits in female calves at different age periods, until nearly the first calving. Records of 49,583 female calves born between 1998 and 2009 were considered in five age periods as days 1-30, 31-180, 181-365, 366-760 and full period (day 1-760). Genetic components were estimated based on linear and threshold sire models and linear animal models. The models included both fixed effects (month of birth, dam's parity number, calving ease and twin/single) and random effects (herd-year, genetic effect of sire or animal and residual). Rates of death were 2.21, 3.37, 1.97, 4.14 and 12.4% for the above periods, respectively. Heritability estimates were very low, ranging from 0.48 to 3.04, 0.62 to 3.51 and 0.50 to 4.24% for the linear sire model, animal model and threshold sire model, respectively. Rank correlations between random effects of sires obtained with linear and threshold sire models and with linear animal and sire models were 0.82-0.95 and 0.61-0.83, respectively. The estimated genetic correlations between the five different periods were moderate and only significant for 31-180 and 181-365 (r(g) = 0.59), 31-180 and 366-760 (r(g) = 0.52), and 181-365 and 366-760 (r(g) = 0.42). The low genetic correlations in the current study would suggest that survival at different periods may be affected by the same genes with different expression or by different genes. Even though the additive genetic variations of survival traits were small, it might be possible to improve these traits by traditional or genomic selection. © 2014 Blackwell Verlag GmbH.

  12. High-resolution modeling of thermal thresholds and environmental influences on coral bleaching for local and regional reef management.

    PubMed

    Kumagai, Naoki H; Yamano, Hiroya

    2018-01-01

    Coral reefs are one of the world's most threatened ecosystems, with global and local stressors contributing to their decline. Excessive sea-surface temperatures (SSTs) can cause coral bleaching, resulting in coral death and decreases in coral cover. An SST threshold of 1 °C over the climatological maximum is widely used to predict coral bleaching. In this study, we refined thermal indices predicting coral bleaching at high spatial resolution (1 km) by statistically optimizing thermal thresholds, as well as considering other environmental influences on bleaching such as ultraviolet (UV) radiation, water turbidity, and cooling effects. We used a coral bleaching dataset derived from the web-based monitoring system Sango Map Project, at scales appropriate for the local and regional conservation of Japanese coral reefs. We recorded coral bleaching events in the years 2004-2016 in Japan. We revealed the influence of multiple factors on the ability to predict coral bleaching, including selection of thermal indices, statistical optimization of thermal thresholds, quantification of multiple environmental influences, and use of multiple modeling methods (generalized linear models and random forests). After optimization, differences in predictive ability among thermal indices were negligible. Thermal index, UV radiation, water turbidity, and cooling effects were important predictors of the occurrence of coral bleaching. Predictions based on the best model revealed that coral reefs in Japan have experienced recent and widespread bleaching. A practical method to reduce bleaching frequency by screening UV radiation was also demonstrated in this paper.

  13. High-resolution modeling of thermal thresholds and environmental influences on coral bleaching for local and regional reef management

    PubMed Central

    Yamano, Hiroya

    2018-01-01

    Coral reefs are one of the world’s most threatened ecosystems, with global and local stressors contributing to their decline. Excessive sea-surface temperatures (SSTs) can cause coral bleaching, resulting in coral death and decreases in coral cover. An SST threshold of 1 °C over the climatological maximum is widely used to predict coral bleaching. In this study, we refined thermal indices predicting coral bleaching at high spatial resolution (1 km) by statistically optimizing thermal thresholds, as well as considering other environmental influences on bleaching such as ultraviolet (UV) radiation, water turbidity, and cooling effects. We used a coral bleaching dataset derived from the web-based monitoring system Sango Map Project, at scales appropriate for the local and regional conservation of Japanese coral reefs. We recorded coral bleaching events in the years 2004–2016 in Japan. We revealed the influence of multiple factors on the ability to predict coral bleaching, including selection of thermal indices, statistical optimization of thermal thresholds, quantification of multiple environmental influences, and use of multiple modeling methods (generalized linear models and random forests). After optimization, differences in predictive ability among thermal indices were negligible. Thermal index, UV radiation, water turbidity, and cooling effects were important predictors of the occurrence of coral bleaching. Predictions based on the best model revealed that coral reefs in Japan have experienced recent and widespread bleaching. A practical method to reduce bleaching frequency by screening UV radiation was also demonstrated in this paper. PMID:29473007

  14. Application of the predicted heat strain model in development of localized, threshold-based heat stress management guidelines for the construction industry.

    PubMed

    Rowlinson, Steve; Jia, Yunyan Andrea

    2014-04-01

    Existing heat stress risk management guidelines recommended by international standards are not practical for the construction industry, which needs site supervision staff to make instant managerial decisions to mitigate heat risks. The ability of the predicted heat strain (PHS) model [ISO 7933 (2004). Ergonomics of the thermal environment - Analytical determination and interpretation of heat stress using calculation of the predicted heat strain. Geneva: International Organization for Standardization] to predict maximum allowable exposure time (Dlim) has now enabled development of localized, action-triggering and threshold-based guidelines for implementation by lay frontline staff on construction sites. This article presents a protocol for development of two heat stress management tools by applying the PHS model to its full potential. One of the tools is developed to facilitate managerial decisions on an optimized work-rest regimen for paced work. The other tool is developed to enable workers' self-regulation during self-paced work.

  15. The effect of seasonal harvesting on stage-structured population models.

    PubMed

    Tang, Sanyi; Chen, Lansun

    2004-04-01

    In most models of population dynamics, increases in population due to birth are assumed to be time-independent, but many species reproduce only during a single period of the year. We propose an exploited single-species model with stage structure for the dynamics in a fish population for which births occur in a single pulse once per time period. Since birth pulse populations are often characterized with a discrete-time dynamical system determined by its Poincaré map, we explore the consequences of harvest timing for equilibrium population sizes under seasonal dependence and obtain threshold conditions for their stability, and show that the timing of harvesting has a strong impact on the persistence of the fish population, on the volume of mature fish stock and on the maximum annual-sustainable yield. Moreover, our results imply that the population can sustain much higher harvest rates if the mature fish is removed as early in the season (after the birth pulse) as possible. Further, the effects of harvesting effort and harvest timing on the dynamical complexity are also investigated. Bifurcation diagrams are constructed with the birth rate (or harvesting effort or harvest timing) as the bifurcation parameter, and these are observed to display rich structure, including chaotic bands with periodic windows, pitchfork and tangent bifurcations, non-unique dynamics (meaning that several attractors coexist) and attractor crises. This suggests that the birth pulse, in effect, provides a natural period or cyclicity that makes the dynamical behavior more complex.
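
    A minimal sketch of a once-per-year map of this kind, censused just after the birth pulse, with a proportional harvest applied at a chosen time within the year; the functional forms and parameter values are illustrative assumptions, not the paper's stage-structured model:

      import numpy as np

      def yearly_map(n, r=2.5, s=0.7, E=0.3, tau=0.2):
          """One year of an illustrative birth-pulse population model.
          n: density just after the birth pulse; s: annual survival fraction;
          E: harvest fraction applied at time tau (0..1) after the pulse;
          reproduction occurs as a single Ricker-type pulse at the end of the year."""
          n_mid = n * s**tau                 # survive until the harvest moment
          n_mid *= (1.0 - E)                 # instantaneous proportional harvest
          n_pre = n_mid * s**(1.0 - tau)     # survive the rest of the year
          return n_pre * np.exp(r * (1.0 - n_pre))   # birth pulse (Ricker form)

      n = 0.5
      for _ in range(30):                    # iterate the annual map
          n = yearly_map(n)
      print(f"post-pulse density after 30 years: {n:.3f}")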

  16. When does ecological sustainability ensure economic sustainability? An integrated analysis of thresholds in semi-arid western rangelands

    NASA Astrophysics Data System (ADS)

    Cobourn, K. M.; Peckham, S. D.

    2011-12-01

    The vulnerability of agri-environmental systems to ecological threshold events depends on the combined influence of economic factors and natural drivers, such as climate and disturbance. This analysis builds an integrated ecologic-economic model to evaluate the behavioral response of agricultural producers to changing and uncertain natural conditions. The model explicitly reflects the effect of producer behavior on the likelihood of a threshold event that threatens the ecological and/or economic sustainability of the agri-environmental system. The foundation of the analysis is a threshold indicator that incorporates the population dynamics of a species that supports economic production and an episodic disturbance regime, in this case rangeland grass that is grazed by livestock and is subject to wildfire. This ecological indicator is integrated into an economic model in which producers choose grazing intensity given the state of the grass population and a set of economic parameters. We examine two model variants that characterize differing economic circumstances. The first characterizes the optimal grazing regime assuming that the system is managed by a single planner whose objective is to maximize the aggregate long-run returns of producers in the system. The second examines the case in which individual producers choose their own stocking rates in order to maximize their private economic benefit. The results from the first model variant illustrate the difference between an ecologic and an economic threshold. Failure to cross an ecological threshold does not necessarily ensure that the system remains economically viable: economic sustainability, defined as the ability of the system to support optimal production into the infinite future, requires that the net growth rate of the supporting population exceeds the level required for ecological sustainability by an amount that depends on the market price of livestock and grazing efficiency. The results from the second

  17. Comparison of singlet oxygen threshold dose for PDT.

    PubMed

    Zhu, Timothy C; Liu, Baochang; Kim, Michele M; McMillan, Dayton; Liang, Xing; Finlay, Jarod C; Busch, Theresa M

    2014-02-01

    Macroscopic modeling of singlet oxygen (1O2) is of particular interest because it is the major cytotoxic agent causing biological effects for type II photosensitizers during PDT. We have developed a macroscopic model to calculate the reacted singlet oxygen concentration ([1O2]rx) for PDT. An in-vivo RIF tumor mouse model is used to correlate the necrosis depth to the calculation based on explicit PDT dosimetry of light fluence distribution, tissue optical properties, and photosensitizer concentrations. Inputs to the model include 4 photosensitizer-specific photochemical parameters along with the apparent singlet oxygen threshold concentration. Photosensitizer-specific model parameters are determined for several type II photosensitizers (Photofrin, BPD, and HPPH). The singlet oxygen threshold concentration is approximately 0.41-0.56 mM for all three photosensitizers studied, assuming that the fraction of singlet oxygen generated that interacts with the cell is f = 1. In comparison, the value derived from other in-vivo mouse studies is 0.4 mM for mTHPC. However, the singlet oxygen threshold doses were reported to be 7.9 and 12.1 mM for a multicell in-vitro EMT6/Ro spheroid model for mTHPC and Photofrin PDT, respectively. The sensitivity of the threshold singlet oxygen dose for our experiment is examined. The possible influence of vascular vs. apoptotic cell killing mechanisms on the singlet oxygen threshold dose is discussed using BPD with different drug-light intervals of 3 hrs vs. 15 min. The observed discrepancies between different experiments warrant further investigation to explain the cause of the difference.

  18. Comparison of singlet oxygen threshold dose for PDT

    PubMed Central

    Zhu, Timothy C; Liu, Baochang; Kim, Michele M.; McMillan, Dayton; Liang, Xing; Finlay, Jarod C.; Busch, Theresa M.

    2015-01-01

    Macroscopic modeling of singlet oxygen (1O2) is of particular interest because it is the major cytotoxic agent causing biological effects for type II photosensitizers during PDT. We have developed a macroscopic model to calculate the reacted singlet oxygen concentration ([1O2]rx) for PDT. An in-vivo RIF tumor mouse model is used to correlate the necrosis depth to the calculation based on explicit PDT dosimetry of light fluence distribution, tissue optical properties, and photosensitizer concentrations. Inputs to the model include 4 photosensitizer-specific photochemical parameters along with the apparent singlet oxygen threshold concentration. Photosensitizer-specific model parameters are determined for several type II photosensitizers (Photofrin, BPD, and HPPH). The singlet oxygen threshold concentration is approximately 0.41–0.56 mM for all three photosensitizers studied, assuming that the fraction of singlet oxygen generated that interacts with the cell is f = 1. In comparison, the value derived from other in-vivo mouse studies is 0.4 mM for mTHPC. However, the singlet oxygen threshold doses were reported to be 7.9 and 12.1 mM for a multicell in-vitro EMT6/Ro spheroid model for mTHPC and Photofrin PDT, respectively. The sensitivity of the threshold singlet oxygen dose for our experiment is examined. The possible influence of vascular vs. apoptotic cell killing mechanisms on the singlet oxygen threshold dose is discussed using BPD with different drug-light intervals of 3 hrs vs. 15 min. The observed discrepancies between different experiments warrant further investigation to explain the cause of the difference. PMID:25999651

  19. An adaptive design for updating the threshold value of a continuous biomarker.

    PubMed

    Spencer, Amy V; Harbron, Chris; Mander, Adrian; Wason, James; Peers, Ian

    2016-11-30

    Potential predictive biomarkers are often measured on a continuous scale, but in practice, a threshold value to divide the patient population into biomarker 'positive' and 'negative' is desirable. Early phase clinical trials are increasingly using biomarkers for patient selection, but at this stage, it is likely that little will be known about the relationship between the biomarker and the treatment outcome. We describe a single-arm trial design with adaptive enrichment, which can increase power to demonstrate efficacy within a patient subpopulation, the parameters of which are also estimated. Our design enables us to learn about the biomarker and optimally adjust the threshold during the study, using a combination of generalised linear modelling and Bayesian prediction. At the final analysis, a binomial exact test is carried out, allowing the hypothesis that 'no population subset exists in which the novel treatment has a desirable response rate' to be tested. Through extensive simulations, we are able to show increased power over fixed threshold methods in many situations without increasing the type-I error rate. We also show that estimates of the threshold, which defines the population subset, are unbiased and often more precise than those from fixed threshold studies. We provide an example of the method applied (retrospectively) to publicly available data from a study of the use of tamoxifen after mastectomy by the German Breast Study Group, where progesterone receptor is the biomarker of interest. © 2016 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd.
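
    At the final analysis, the design's decision reduces to an exact binomial test on the observed responses within the chosen biomarker-positive subset. A minimal sketch using scipy.stats.binomtest (SciPy >= 1.7); the response counts and null response rate are made up for illustration:

      from scipy.stats import binomtest

      # Hypothetical final-analysis data for the biomarker-'positive' subset.
      responders, n_subset = 14, 30
      null_response_rate = 0.20   # response rate considered undesirable under the null

      result = binomtest(responders, n_subset, null_response_rate, alternative="greater")
      print(f"observed rate = {responders / n_subset:.2f}, exact p-value = {result.pvalue:.4f}")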

  20. Fatigue threshold studies in Fe, Fe-Si, and HSLA steel: Part II. thermally activated behavior of the effective stress intensity at threshold

    NASA Astrophysics Data System (ADS)

    Yu, W.; Esaklul, K.; Gerberich, W. W.

    1984-05-01

    It is shown that closure mechanisms alone cannot fully explain increasing fatigue thresholds with decreasing test temperature for a sequence of Fe-Si binary alloys and an HSLA steel. The implication is that fatigue crack propagation near threshold is a thermally activated process. The effective threshold stress intensity, which was obtained by subtracting the closure portion from the fatigue threshold, was examined. This effective stress intensity was found to correlate very well with the thermal component of the flow stress. A detailed fractographic study of the fatigue surface was performed. Water vapor in the room air was found to promote the formation of oxide and intergranular crack growth. At lower temperature, a brittle-type cyclic cleavage fatigue surface was observed, but the ductile process persisted even at 123 K. Arrest marks were found on all three modes of fatigue crack growth. The regular spacings between these lines and dislocation modeling suggested that fatigue crack growth was controlled by the subcell structure near threshold. A model based on the slip-off of dislocations was examined. From this, it is shown that the effective fatigue threshold may be related to the square root of (one plus the strain rate sensitivity).
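
    The closing proportionality can be written compactly; a hedged rendering in LaTeX, with the effective threshold stress intensity range and the strain-rate sensitivity m (the symbols are assumed, not the paper's notation):

      \Delta K_{\mathrm{eff,th}} \;\propto\; \sqrt{1 + m}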

  1. Population trends for North American winter birds based on hierarchical models

    USGS Publications Warehouse

    Soykan, Candan U.; Sauer, John; Schuetz, Justin G.; LeBaron, Geoffrey S.; Dale, Kathy; Langham, Gary M.

    2016-01-01

    Managing widespread and persistent threats to birds requires knowledge of population dynamics at large spatial and temporal scales. For over 100 yrs, the Audubon Christmas Bird Count (CBC) has enlisted volunteers in bird monitoring efforts that span the Americas, especially southern Canada and the United States. We employed a Bayesian hierarchical model to control for variation in survey effort among CBC circles and, using CBC data from 1966 to 2013, generated early-winter population trend estimates for 551 species of birds. Selecting a subset of species that do not frequent bird feeders and have ≥25% range overlap with the distribution of CBC circles (228 species) we further estimated aggregate (i.e., across species) trends for the entire study region and at the level of states/provinces, Bird Conservation Regions, and Landscape Conservation Cooperatives. Moreover, we examined the relationship between ten biological traits—range size, population size, migratory strategy, habitat affiliation, body size, diet, number of eggs per clutch, age at sexual maturity, lifespan, and tolerance of urban/suburban settings—and CBC trend estimates. Our results indicate that 68% of the 551 species had increasing trends within the study area over the interval 1966–2013. When trends were examined across the subset of 228 species, the median population trend for the group was 0.9% per year at the continental level. At the regional level, aggregate trends were positive in all but a few areas. Negative population trends were evident in lower latitudes, whereas the largest increases were at higher latitudes, a pattern consistent with range shifts due to climate change. Nine of 10 biological traits were significantly associated with median population trend; however, none of the traits explained >34% of the deviance in the data, reflecting the indirect relationships between population trend estimates and species traits. Trend estimates based on the CBC are broadly congruent with

  2. Updated ultrasound criteria for polycystic ovary syndrome: reliable thresholds for elevated follicle population and ovarian volume.

    PubMed

    Lujan, Marla E; Jarrett, Brittany Y; Brooks, Eric D; Reines, Jonathan K; Peppin, Andrew K; Muhn, Narry; Haider, Ehsan; Pierson, Roger A; Chizen, Donna R

    2013-05-01

    Do the ultrasonographic criteria for polycystic ovaries supported by the 2003 Rotterdam consensus adequately discriminate between the normal and polycystic ovary syndrome (PCOS) condition in light of recent advancements in imaging technology and reliable methods for estimating follicle populations in PCOS? Using newer ultrasound technology and a reliable grid system approach to count follicles, we concluded that a substantially higher threshold of follicle counts throughout the entire ovary (FNPO), 26 versus 12 follicles, is required to distinguish between women with PCOS and healthy women from the general population. The Rotterdam consensus defined the polycystic ovary as having 12 or more follicles, measuring between 2 and 9 mm (FNPO), and/or an ovarian volume (OV) >10 cm(3). Since their initial proposal in 2003, a heightened prevalence of polycystic ovaries has been described in healthy women with regular menstrual cycles, which has questioned the accuracy of these criteria and marginalized the specificity of polycystic ovaries as a diagnostic criterion for PCOS. A diagnostic test study was performed using cross-sectional data, collected from 2006 to 2011, from 168 women prospectively evaluated by transvaginal ultrasonography. Receiver operating characteristic (ROC) curve analyses were performed to determine the appropriate diagnostic thresholds for: (i) FNPO, (ii) follicle counts in a single cross section (FNPS) and (iii) OV. The levels of intra- and inter-observer reliability when five observers used the proposed criteria on 100 ultrasound cases were also determined. Participants were 98 women diagnosed with PCOS by the National Institutes of Health criteria as having both oligo-amenorrhea and hyperandrogenism, and 70 healthy female volunteers recruited from the general population. Participants were evaluated by transvaginal ultrasonography at the Royal University Hospital within the Department of Obstetrics, Gynecology and Reproductive Sciences, University of Saskatchewan
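
    A minimal sketch of the ROC-based threshold selection step, using scikit-learn and Youden's J statistic as the optimality criterion; the simulated counts and the choice of criterion are assumptions for illustration, not necessarily the study's exact rule:

      import numpy as np
      from sklearn.metrics import roc_curve, roc_auc_score

      rng = np.random.default_rng(0)
      # Hypothetical follicle counts per ovary (FNPO): 70 controls vs 98 PCOS cases.
      fnpo = np.r_[rng.poisson(12, 70), rng.poisson(30, 98)]
      label = np.r_[np.zeros(70), np.ones(98)]       # 1 = PCOS

      fpr, tpr, thresholds = roc_curve(label, fnpo)
      best = np.argmax(tpr - fpr)                    # Youden's J = sensitivity + specificity - 1
      print(f"AUC = {roc_auc_score(label, fnpo):.2f}, FNPO threshold ~ {thresholds[best]:.0f}")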

  3. Do we need a dynamic snow depth threshold when comparing hydrological models with remote sensing products in mountain catchments?

    NASA Astrophysics Data System (ADS)

    Engel, Michael; Bertoldi, Giacomo; Notarnicola, Claudia; Comiti, Francesco

    2017-04-01

    To assess the performance of the simulated snow cover of hydrological models, it is common practice to compare simulated data with observations derived from satellite images such as MODIS. However, technical and methodological limitations, such as the availability of MODIS products, their spatial resolution, or difficulties in finding appropriate parameterisations of the model, need to be addressed first. Another important assumption usually made concerns the threshold of minimum simulated snow depth, generally set to 10 mm, chosen to respect the MODIS detection thresholds for snow cover. But is such a constant threshold appropriate for complex alpine terrain? How important is the impact of different snow depth thresholds on the spatial and temporal distribution of the pixel-based overall accuracy (OA)? To address this aspect, we compared the snow covered area (SCA) simulated by the GEOtop 2.0 snow model to the daily composite 250 m EURAC MODIS SCA in the upper Saldur basin (61 km2, Eastern Italian Alps) during the period October 2011 - October 2013. Initially, we calibrated the snow model against snow depths and snow water equivalents at the point scale, taken from measurements at different meteorological stations. We applied different snow depth thresholds (0 mm, 10 mm, 50 mm, and 100 mm) to obtain the simulated snow cover and assessed the changes in OA both in time (during the entire evaluation period, accumulation and melting season) and space (entire catchment and specific areas of topographic characteristics such as elevation, slope, aspect, landcover, and roughness). Results show remarkable spatial and temporal differences in OA with respect to different snow depth thresholds. Disagreements between simulated and observed SCA during the accumulation season (September to November 2012) were located in areas with north-west aspect, slopes of 30°, or little elevation difference at the sub-pixel scale (-0.25 to 0 m). We obtained best agreements with MODIS SCA for a snow depth
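
    A minimal sketch of the overall-accuracy comparison for different minimum snow-depth thresholds; the simulated depth field and the MODIS snow map below are synthetic placeholders:

      import numpy as np

      def overall_accuracy(sim_depth_mm, modis_snow, threshold_mm):
          """Fraction of pixels where the thresholded model SCA agrees with MODIS SCA."""
          return np.mean((sim_depth_mm > threshold_mm) == modis_snow)

      # Hypothetical catchment maps: simulated snow depth (mm) and MODIS binary snow cover.
      rng = np.random.default_rng(1)
      depth = rng.gamma(shape=1.5, scale=40.0, size=(200, 200))
      modis = (depth + rng.normal(0.0, 20.0, depth.shape)) > 30.0

      for thr in (0, 10, 50, 100):   # thresholds tested in the study
          print(f"threshold {thr:3d} mm -> OA = {overall_accuracy(depth, modis, thr):.3f}")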

  4. Representation of Vegetation and Other Nonerodible Elements in Aeolian Shear Stress Partitioning Models for Predicting Transport Threshold

    NASA Technical Reports Server (NTRS)

    King, James; Nickling, William G.; Gillies, John A.

    2005-01-01

    The presence of nonerodible elements is well understood to be a reducing factor for soil erosion by wind, but the limits of their protection of the surface and erosion threshold prediction are complicated by the varying geometry, spatial organization, and density of the elements. The predictive capabilities of the most recent models for estimating wind-driven particle fluxes are reduced by the poor representation of the effectiveness of vegetation in reducing wind erosion. Two approaches have been taken to account for roughness effects on sediment transport thresholds. Marticorena and Bergametti (1995), in their dust emission model, parameterize the effect of roughness on threshold with the assumption that there is a relationship between roughness density and the aerodynamic roughness length of a surface. Raupach et al. (1993) offer a different approach based on physical modeling of wake development behind individual roughness elements and the partition of the surface stress and the total stress over a roughened surface. A comparison between the models shows the partitioning approach to be a good framework to explain the effect of roughness on entrainment of sediment by wind. Both models provided very good agreement for wind tunnel experiments using solid objects on a nonerodible surface. However, the Marticorena and Bergametti (1995) approach displays a scaling dependency when the difference between the roughness length of the surface and the overall roughness length is too great, while the Raupach et al. (1993) model's predictions perform better owing to the incorporation of the roughness geometry and the alterations to the flow they can cause.
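
    For reference, the Raupach et al. (1993) partitioning approach is usually quoted as a threshold-friction-velocity ratio in terms of the roughness (frontal-area) density lambda; a hedged rendering of its commonly cited form, where beta is the ratio of element to surface drag coefficients, sigma the basal-to-frontal area ratio of the elements, and m an empirical constant (notation assumed):

      R_t \;=\; \frac{u_{*t\mathrm{S}}}{u_{*t\mathrm{R}}}
          \;=\; \left[\,\bigl(1 - m\,\sigma\,\lambda\bigr)\,\bigl(1 + m\,\beta\,\lambda\bigr)\,\right]^{-1/2}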

  5. Exploiting Sub-threshold and above-threshold characteristics in a silver-enhanced gold nanoparticle based biochip.

    PubMed

    Liu, Yang; Alocilja, Evangelyn; Chakrabartty, Shantanu

    2009-01-01

    Silver-enhanced labeling is a technique used in immunochromatographic assays for improving the sensitivity of pathogen detection. In this paper, we employ the silver enhancement approach for constructing a biomolecular transistor that uses a high-density interdigitated electrode to detect rabbit IgG. We show that the response of the biomolecular transistor comprises: (a) a sub-threshold region where the conductance change is an exponential function of the enhancement time and (b) an above-threshold region where the conductance change is a linear function of the enhancement time. By exploiting both these regions of operation, it is shown that the silver enhancing time is a reliable indicator of the IgG concentration. The method provides a relatively straightforward alternative to biomolecular signal amplification techniques. The measured results using a biochip prototype fabricated in silicon show that 240 pg/mL rabbit IgG can be detected at a silver enhancing time of 42 min. Also, the biomolecular transistor is compatible with silicon-based processing, making it ideal for designing integrated CMOS biosensors.

  6. 3D SAPIV particle field reconstruction method based on adaptive threshold.

    PubMed

    Qu, Xiangju; Song, Yang; Jin, Ying; Li, Zhenhua; Wang, Xuezhen; Guo, ZhenYan; Ji, Yunjing; He, Anzhi

    2018-03-01

    Particle image velocimetry (PIV) is an essential flow field diagnostic technique that provides instantaneous velocimetry information non-intrusively. Three-dimensional (3D) PIV methods can supply a full understanding of a 3D structure, the complete stress tensor, and the vorticity vector in complex flows. In synthetic aperture particle image velocimetry (SAPIV), the flow field can be measured with large particle intensities from the same direction by different cameras. During SAPIV particle reconstruction, particles are commonly reconstructed by manually setting a threshold to filter out unfocused particles in the refocused images. In this paper, the particle intensity distribution in refocused images is analyzed, and a SAPIV particle field reconstruction method based on an adaptive threshold is presented. By using the adaptive threshold to filter the 3D measurement volume integrally, the three-dimensional location information of the focused particles can be reconstructed. The cross correlations between the images captured by the cameras and the images projected from the reconstructed particle field are calculated for different threshold values. The optimal threshold is determined by cubic curve fitting and is defined as the threshold value that causes the correlation coefficient to reach its maximum. A numerical simulation of a 16-camera array and a particle field at two adjacent time events quantitatively evaluates the performance of the proposed method. An experimental system consisting of an array of 16 cameras was used to reconstruct four adjacent frames in a vortex flow field. The results show that the proposed reconstruction method can effectively reconstruct the 3D particle fields.
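
    A minimal sketch of the threshold-selection idea: score each candidate threshold by the correlation between the recorded camera images and reprojections of the field reconstructed at that threshold, fit a cubic to the scores, and take the threshold at the fitted maximum. The scoring function below is a synthetic stand-in, not the authors' reprojection code:

      import numpy as np

      def reprojection_correlation(threshold):
          """Placeholder for the mean cross-correlation between camera images and
          images reprojected from the particle field reconstructed at this threshold."""
          return 0.9 - 0.5 * (threshold - 0.35) ** 2   # synthetic score, peaks near 0.35

      candidates = np.linspace(0.1, 0.8, 15)
      scores = np.array([reprojection_correlation(t) for t in candidates])

      coeffs = np.polyfit(candidates, scores, deg=3)      # cubic curve fit
      grid = np.linspace(candidates.min(), candidates.max(), 1000)
      optimal = grid[np.argmax(np.polyval(coeffs, grid))]
      print(f"optimal reconstruction threshold ~ {optimal:.3f}")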

  7. Threshold Values for Identification of Contamination Predicted by Reduced-Order Models

    DOE PAGES

    Last, George V.; Murray, Christopher J.; Bott, Yi-Ju; ...

    2014-12-31

    The U.S. Department of Energy’s (DOE’s) National Risk Assessment Partnership (NRAP) Project is developing reduced-order models to evaluate potential impacts on underground sources of drinking water (USDWs) if CO2 or brine leaks from deep CO2 storage reservoirs. Threshold values, below which there would be no predicted impacts, were determined for portions of two aquifer systems. These threshold values were calculated using an interwell approach for determining background groundwater concentrations that is an adaptation of methods described in the U.S. Environmental Protection Agency’s Unified Guidance for Statistical Analysis of Groundwater Monitoring Data at RCRA Facilities.

  8. Agile Model Driven Development of Electronic Health Record-Based Specialty Population Registries

    PubMed Central

    Kannan, Vaishnavi; Fish, Jason C.; Willett, DuWayne L.

    2018-01-01

    The transformation of the American healthcare payment system from fee-for-service to value-based care increasingly makes it valuable to develop patient registries for specialized populations, to better assess healthcare quality and costs. Recent widespread adoption of Electronic Health Records (EHRs) in the U.S. now makes possible construction of EHR-based specialty registry data collection tools and reports, previously unfeasible using manual chart abstraction. But the complexities of specialty registry EHR tools and measures, along with the variety of stakeholders involved, can result in misunderstood requirements and frequent product change requests, as users first experience the tools in their actual clinical workflows. Such requirements churn could easily stall progress in specialty registry rollout. Modeling a system’s requirements and solution design can be a powerful way to remove ambiguities, facilitate shared understanding, and help evolve a design to meet newly-discovered needs. “Agile Modeling” retains these values while avoiding excessive unused up-front modeling in favor of iterative incremental modeling. Using Agile Modeling principles and practices, in calendar year 2015 one institution developed 58 EHR-based specialty registries, with 111 new data collection tools, supporting 134 clinical process and outcome measures, and enrolling over 16,000 patients. The subset of UML and non-UML models found most consistently useful in designing, building, and iteratively evolving EHR-based specialty registries included User Stories, Domain Models, Use Case Diagrams, Decision Trees, Graphical User Interface Storyboards, Use Case text descriptions, and Solution Class Diagrams. PMID:29750222

  9. Cost effectiveness of population based BRCA1 founder mutation testing in Sephardi Jewish women.

    PubMed

    Patel, Shreeya; Legood, Rosa; Evans, D Gareth; Turnbull, Clare; Antoniou, Antonis C; Menon, Usha; Jacobs, Ian; Manchanda, Ranjit

    2018-04-01

    Population-based BRCA1/BRCA2 founder-mutation testing has been demonstrated as cost effective compared with family history based testing in Ashkenazi Jewish women. However, only 1 of the 3 Ashkenazi Jewish BRCA1/BRCA2 founder mutations (185delAG [c.68_69delAG], 5382insC [c.5266dupC], and 6174delT [c.5946delT]) is found in the Sephardi Jewish population (185delAG [c.68_69delAG]), and the overall prevalence of BRCA mutations in the Sephardi Jewish population is accordingly lower (0.7% compared with 2.5% in the Ashkenazi Jewish population). Cost-effectiveness analyses of BRCA testing have not previously been performed at these lower BRCA prevalence levels seen in the Sephardi Jewish population. Here we present a cost-effectiveness analysis for UK and US populations comparing population testing with clinical criteria/family history-based testing in Sephardi Jewish women. A Markov model was built comparing the lifetime costs and effects of population-based BRCA1 testing with testing using family history-based clinical criteria in Sephardi Jewish women aged ≥30 years. BRCA1 carriers identified were offered magnetic resonance imaging/mammograms and risk-reducing surgery. Costs are reported at 2015 prices. Outcomes include breast cancer, ovarian cancer, and excess deaths from heart disease. All costs and outcomes are discounted at 3.5%. The time horizon is lifetime, and the perspective is payer. The incremental cost-effectiveness ratio per quality-adjusted life-year was calculated. Parameter uncertainty was evaluated through 1-way and probabilistic sensitivity analysis. Population testing resulted in a gain in life expectancy of 12 months (quality-adjusted life-year = 1.00). The baseline discounted incremental cost-effectiveness ratio for UK population-based testing was £67.04/quality-adjusted life-year and for the US population was $308.42/quality-adjusted life-year. Results were robust in the 1-way sensitivity analysis. The probabilistic sensitivity analysis showed 100% of

  10. Epidemic thresholds for bipartite networks

    NASA Astrophysics Data System (ADS)

    Hernández, D. G.; Risau-Gusman, S.

    2013-11-01

    It is well known that sexually transmitted diseases (STD) spread across a network of human sexual contacts. This network is most often bipartite, as most STD are transmitted between men and women. Even though network models in epidemiology have quite a long history now, there are few general results about bipartite networks. One of them is the simple dependence, predicted using the mean field approximation, between the epidemic threshold and the average and variance of the degree distribution of the network. Here we show that going beyond this approximation can lead to qualitatively different results that are supported by numerical simulations. One of the new features, that can be relevant for applications, is the existence of a critical value for the infectivity of each population, below which no epidemics can arise, regardless of the value of the infectivity of the other population.

  11. 3.5 GHz Environmental Sensing Capability Detection Thresholds and Deployment

    PubMed Central

    Nguyen, Thao T.; Souryal, Michael R.; Sahoo, Anirudha; Hall, Timothy A.

    2017-01-01

    Spectrum sharing in the 3.5 GHz band between commercial and government users along U.S. coastal areas depends on an environmental sensing capability (ESC)—that is, a network of radio frequency sensors and a decision system—to detect the presence of incumbent shipborne radar systems and trigger protective measures, as needed. It is well known that the sensitivity of these sensors depends on the aggregate interference generated by commercial systems to the incumbent radar receivers, but to date no comprehensive study has been made of the aggregate interference in realistic scenarios and its impact on the requirement for detection of the radar signal. This paper presents systematic methods for determining the placement of ESC sensors and their detection thresholds to adequately protect incumbent shipborne radar systems from harmful interference. Using terrain-based propagation models and a population-based deployment model, the analysis finds the offshore distances at which protection must be triggered and relates these to the detection levels of coastline sensors. We further show that sensor placement is a form of the well-known set cover problem, which has been shown to be NP-complete, and demonstrate practical solutions achieved with a greedy algorithm. Results show detection thresholds to be as much as 22 dB lower than required by current industry standards. The methodology and results presented in this paper can be used by ESC operators for planning and deployment of sensors and by regulators for testing sensor performance. PMID:29303162
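
    A minimal sketch of the greedy set-cover heuristic referenced above: repeatedly choose the candidate sensor site that covers the largest number of still-uncovered protection points. The site names and coverage sets are made up for illustration:

      def greedy_set_cover(universe, candidate_sets):
          """Greedy approximation to set cover: keep picking the set that covers the
          most uncovered elements until everything is covered or no progress is possible."""
          uncovered, chosen = set(universe), []
          while uncovered:
              best = max(candidate_sets, key=lambda s: len(candidate_sets[s] & uncovered))
              if not candidate_sets[best] & uncovered:
                  break                         # remaining points cannot be covered
              chosen.append(best)
              uncovered -= candidate_sets[best]
          return chosen

      # Hypothetical offshore protection points and per-site coverage footprints.
      points = range(1, 9)
      sites = {"A": {1, 2, 3}, "B": {3, 4, 5, 6}, "C": {6, 7}, "D": {7, 8}, "E": {1, 5, 8}}
      print(greedy_set_cover(points, sites))    # e.g. ['B', 'A', 'D']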

  12. Towards thresholds of disaster management performance under demographic change: exploring functional relationships using agent-based modeling

    NASA Astrophysics Data System (ADS)

    Dressler, Gunnar; Müller, Birgit; Frank, Karin; Kuhlicke, Christian

    2016-10-01

    Effective disaster management is a core feature for the protection of communities against natural disasters such as floods. Disaster management organizations (DMOs) are expected to contribute to ensuring this protection. However, what happens when their resources to cope with a flood are at stake, or the intensity and frequency of the event exceeds their capacities? Many cities in the Free State of Saxony, Germany, were strongly hit by several floods in recent years and are additionally challenged by demographic change, with an ageing society and out-migration leading to population shrinkage in many parts of Saxony. Disaster management, which is mostly volunteer-based in Germany, is particularly affected by this change, leading to a loss of members. We propose an agent-based simulation model that acts as a "virtual lab" to explore the impact of various changes on disaster management performance. Using different scenarios, we examine the impact of changes in the personal resources of DMOs, their access to operation-relevant information, flood characteristics, as well as differences between geographic regions. A loss of DMOs and associated manpower caused by demographic change has the most profound impact on performance. Especially in rural, upstream regions, population decline in combination with very short lead times can put disaster management performance at risk.

  13. Hierarchical modeling of population stability and species group attributes from survey data

    USGS Publications Warehouse

    Sauer, J.R.; Link, W.A.

    2002-01-01

    Many ecological studies require analysis of collections of estimates. For example, population change is routinely estimated for many species from surveys such as the North American Breeding Bird Survey (BBS), and the species are grouped and used in comparative analyses. We developed a hierarchical model for estimation of group attributes from a collection of estimates of population trend. The model uses information from predefined groups of species to provide a context and to supplement data for individual species; summaries of group attributes are improved by statistical methods that simultaneously analyze collections of trend estimates. The model is Bayesian; trends are treated as random variables rather than fixed parameters. We use Markov Chain Monte Carlo (MCMC) methods to fit the model. Standard assessments of population stability cannot distinguish magnitude of trend and statistical significance of trend estimates, but the hierarchical model allows us to legitimately describe the probability that a trend is within given bounds. Thus we define population stability in terms of the probability that the magnitude of population change for a species is less than or equal to a predefined threshold. We applied the model to estimates of trend for 399 species from the BBS to estimate the proportion of species with increasing populations and to identify species with unstable populations. Analyses are presented for the collection of all species and for 12 species groups commonly used in BBS summaries. Overall, we estimated that 49% of species in the BBS have positive trends and 33 species have unstable populations. However, the proportion of species with increasing trends differs among habitat groups, with grassland birds having only 19% of species with positive trend estimates and wetland birds having 68% of species with positive trend estimates.
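
    A minimal sketch of the stability summary described above: with MCMC draws of a species' trend in hand, the probability that the magnitude of population change stays within a threshold is simply the fraction of posterior samples satisfying the bound. The sample values and threshold below are illustrative:

      import numpy as np

      rng = np.random.default_rng(42)
      # Hypothetical posterior samples of annual % change for one species (MCMC output).
      trend_samples = rng.normal(loc=-0.4, scale=1.2, size=5000)

      stability_threshold = 1.0   # % per year; call the population "stable" if |trend| <= threshold
      p_stable = np.mean(np.abs(trend_samples) <= stability_threshold)
      print(f"P(|trend| <= {stability_threshold}%/yr) = {p_stable:.2f}")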

  14. Extinction dynamics of a discrete population in an oasis.

    PubMed

    Berti, Stefano; Cencini, Massimo; Vergni, Davide; Vulpiani, Angelo

    2015-07-01

    Understanding the conditions ensuring the persistence of a population is an issue of primary importance in population biology. The first theoretical approach to the problem dates back to the 1950s with the Kierstead, Slobodkin, and Skellam (KiSS) model, namely a continuous reaction-diffusion equation for a population growing on a patch of finite size L surrounded by a deadly environment with infinite mortality, i.e., an oasis in a desert. The main outcome of the model is that only patches above a critical size allow for population persistence. Here we introduce an individual-based analog of the KiSS model to investigate the effects of discreteness and demographic stochasticity. In particular, we study the average time to extinction both above and below the critical patch size of the continuous model and investigate the quasistationary distribution of the number of individuals for patch sizes above the critical threshold.
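
    For context, the classical KiSS result for a population with diffusivity D and local growth rate r on a one-dimensional patch with lethal (absorbing) boundaries gives a critical patch length below which no positive steady state can persist; a standard rendering (notation assumed):

      L_c \;=\; \pi \sqrt{\frac{D}{r}}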

  15. Mirror instability near the threshold: Hybrid simulations

    NASA Astrophysics Data System (ADS)

    Hellinger, P.; Trávníček, P.; Passot, T.; Sulem, P.; Kuznetsov, E. A.; Califano, F.

    2007-12-01

    Nonlinear behavior of the mirror instability near the threshold is investigated using 1-D hybrid simulations. The simulations demonstrate the presence of an early phase where quasi-linear effects dominate [Shapiro and Shevchenko, 1964]. The quasi-linear diffusion is however not the main saturation mechanism. A second phase is observed where the mirror mode is linearly stable (the stability is evaluated using the instantaneous ion distribution function) but where the instability nevertheless continues to develop, leading to nonlinear coherent structures in the form of magnetic humps. This regime is well modeled by a nonlinear equation for the magnetic field evolution, derived from a reductive perturbative expansion of the Vlasov-Maxwell equations [Kuznetsov et al., 2007], with a phenomenological term which represents local variations of the ion Larmor radius. In contrast with previous models where saturation is due to the cooling of a population of trapped particles, the resulting equation correctly reproduces the development of magnetic humps from an initial noise. References: Kuznetsov, E., T. Passot and P. L. Sulem (2007), Dynamical model for nonlinear mirror modes near threshold, Phys. Rev. Lett., 98, 235003. Shapiro, V. D., and V. I. Shevchenko (1964), Sov. JETP, 18, 1109.

  16. Threshold-adaptive canny operator based on cross-zero points

    NASA Astrophysics Data System (ADS)

    Liu, Boqi; Zhang, Xiuhua; Hong, Hanyu

    2018-03-01

    Canny edge detection [1] is a technique to extract useful structural information from different vision objects while dramatically reducing the amount of data to be processed. It has been widely applied in various computer vision systems. Two thresholds have to be set before edges can be separated from the background. Usually, two static values are chosen as the thresholds based on developer experience [2]. In this paper, a novel automatic thresholding method is proposed. The relation between the thresholds and cross-zero points is analyzed, and an interpolation function is deduced to determine the thresholds. Comprehensive experimental results demonstrate the effectiveness of the proposed method and its advantage for stable edge detection under changing illumination.

  17. Towards a unifying basis of auditory thresholds: binaural summation.

    PubMed

    Heil, Peter

    2014-04-01

    Absolute auditory threshold decreases with increasing sound duration, a phenomenon explainable by the assumptions that the sound evokes neural events whose probabilities of occurrence are proportional to the sound's amplitude raised to an exponent of about 3 and that a constant number of events are required for threshold (Heil and Neubauer, Proc Natl Acad Sci USA 100:6151-6156, 2003). Based on this probabilistic model and on the assumption of perfect binaural summation, an equation is derived here that provides an explicit expression of the binaural threshold as a function of the two monaural thresholds, irrespective of whether they are equal or unequal, and of the exponent in the model. For exponents >0, the predicted binaural advantage is largest when the two monaural thresholds are equal and decreases towards zero as the monaural threshold difference increases. This equation is tested and the exponent derived by comparing binaural thresholds with those predicted on the basis of the two monaural thresholds for different values of the exponent. The thresholds, measured in a large sample of human subjects with equal and unequal monaural thresholds and for stimuli with different temporal envelopes, are compatible only with an exponent close to 3. An exponent of 3 predicts a binaural advantage of 2 dB when the two ears are equally sensitive. Thus, listening with two (equally sensitive) ears rather than one has the same effect on absolute threshold as doubling duration. The data suggest that perfect binaural summation occurs at threshold and that peripheral neural signals are governed by an exponent close to 3. They might also shed new light on mechanisms underlying binaural summation of loudness.
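
    The binaural prediction can be reconstructed from the model's assumptions: if neural events occur at a rate proportional to amplitude raised to an exponent k, and perfect binaural summation simply adds the event counts from the two ears, then the binaural threshold amplitude follows from the monaural threshold amplitudes A_L and A_R as below (a hedged derivation in assumed notation, consistent with the 2 dB advantage quoted for k = 3 and equally sensitive ears):

      A_{\mathrm{bin}} \;=\; \left(A_L^{-k} + A_R^{-k}\right)^{-1/k},
      \qquad
      A_L = A_R \;\Rightarrow\; 20\log_{10}\!\frac{A_L}{A_{\mathrm{bin}}} \;=\; \frac{20}{k}\log_{10} 2 \;\approx\; 2\ \mathrm{dB}\ \ (k = 3)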

  18. Network analysis of a financial market based on genuine correlation and threshold method

    NASA Astrophysics Data System (ADS)

    Namaki, A.; Shirazi, A. H.; Raei, R.; Jafari, G. R.

    2011-10-01

    A financial market is an example of an adaptive complex network consisting of many interacting units, and this network reflects the market's behavior. In this paper, we use the Random Matrix Theory (RMT) notion of the largest eigenvector of the correlation matrix as the market mode of the stock network. For better risk management, we clean the correlation matrix by removing the market mode from the data and then reconstruct the matrix from the residuals. We show that this technique has an important effect on the correlation coefficient distribution by applying it to the Dow Jones Industrial Average (DJIA). To study the topological structure of a network, we apply the market-mode removal technique and the threshold method to the Tehran Stock Exchange (TSE) as an example. We show that this network follows a power-law model in certain intervals. We also show the behavior of the clustering coefficients and the number of components of this network for different thresholds. These outputs are useful for both theoretical and practical purposes such as asset allocation and risk management.
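
    The following sketch illustrates the two steps described above on hypothetical data: removing the market mode (here by a simple projection onto the largest eigenvector, standing in for the paper's cleaning procedure) and thresholding the residual correlations into an adjacency matrix. The data layout and the threshold value are assumptions.

```python
# Rough sketch of the two steps described above. Assumptions: `returns` is a
# (days x stocks) array of standardized log-returns, and a simple projection
# onto the largest eigenvector stands in for the paper's market-mode cleaning.
import numpy as np

def residual_correlation(returns):
    C = np.corrcoef(returns, rowvar=False)
    _, V = np.linalg.eigh(C)                   # eigenvalues/eigenvectors, ascending order
    market = V[:, -1]                          # largest eigenvector ~ market mode
    exposure = returns @ market                # daily projection onto the mode
    resid = returns - np.outer(exposure, market)
    return np.corrcoef(resid, rowvar=False)    # correlations of the residuals

def threshold_network(C, theta=0.3):
    A = (np.abs(C) >= theta).astype(int)       # keep links above the threshold
    np.fill_diagonal(A, 0)
    return A                                   # adjacency matrix of the stock network
```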

  19. Aeroelastic Model of Vocal-Fold Vibrating Element for Studying the Phonation Threshold

    NASA Astrophysics Data System (ADS)

    Horáček, J.; Švec, J. G.

    2002-10-01

    An original theoretical model for vibration onset of the vocal folds in the air-flow coming from the human subglottal tract is designed, which allows studying the influence of the physical properties of the vocal folds (e.g., geometrical shape, mass, viscosity) on their vibration characteristics (such as the natural frequencies, mode shapes of vibration and the thresholds of instability). The mathematical model of the vocal fold is designed as a simplified dynamic system of two degrees of freedom (rotation and translation) vibrating on an elastic foundation in the wall of a channel conveying air. An approximate unsteady one-dimensional flow theory for the inviscid incompressible fluid is presented for the phonatory air-flow. A generally defined shape of the vocal-fold surface is considered for expressing the unsteady aerodynamic forces in the glottis. The parameters of the mechanical part of the model, i.e., the mass, stiffness and damping matrices, are related to the geometry and material density of the vocal folds as well as to the fundamental natural frequency and damping known from experiments. The coupled numerical solution yields the vibration characteristics (natural frequencies, damping and mode shapes of vibration), including the instability thresholds of the aeroelastic system. The vibration characteristics obtained from the coupled numerical solution of the system appear to be in reasonable qualitative agreement with the physiological data and clinical observations. The model is particularly suitable for studying the phonation threshold, i.e., the onset of vibration of the vocal folds.
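
    As a generic illustration of how such an instability threshold is extracted, the sketch below tracks the eigenvalues of a two-degree-of-freedom system as a flow-speed parameter increases; the matrices and their flow-speed scaling are placeholders, not the model of the paper.

```python
# Generic sketch of reading an instability (phonation onset) threshold off a
# linear two-degree-of-freedom model: eigenvalues of the first-order system
# are tracked as flow speed increases until one crosses into the right
# half-plane. Matrices and scalings are placeholders, not the paper's model.
import numpy as np

def eigvals(M, C, K):
    n = M.shape[0]
    A = np.block([[np.zeros((n, n)), np.eye(n)],
                  [-np.linalg.solve(M, K), -np.linalg.solve(M, C)]])
    return np.linalg.eigvals(A)

def onset_speed(M, C0, K0, aero_K, speeds):
    for U in speeds:
        if np.max(eigvals(M, C0, K0 + U**2 * aero_K).real) > 0:
            return U                            # first speed with a growing mode
    return None

M = np.eye(2)
C0 = 0.02 * np.eye(2)
K0 = np.diag([1.0, 4.0])
aero_K = np.array([[0.0, 0.3], [-0.3, 0.0]])    # circulatory (non-symmetric) coupling
print(onset_speed(M, C0, K0, aero_K, np.linspace(0.0, 5.0, 500)))
```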

  20. Cost-effectiveness and budget impact analysis of a population-based screening program for colorectal cancer.

    PubMed

    Pil, L; Fobelets, M; Putman, K; Trybou, J; Annemans, L

    2016-07-01

    Colorectal cancer (CRC) is one of the leading causes of cancer mortality in Belgium. In Flanders (Belgium), a population-based screening program with a biennial immunochemical faecal occult blood test (iFOBT) in women and men aged 56-74 has been organised since 2013. This study assessed the cost-effectiveness and budget impact of the colorectal population-based screening program in Flanders (Belgium). A health economic model was developed, consisting of a decision tree simulating the screening process and a Markov model, with a time horizon of 20 years, simulating natural progression. Predicted mortality and incidence, total costs, and quality-adjusted life-years (QALYs) with and without the screening program were calculated in order to determine the incremental cost-effectiveness ratio of CRC screening. Deterministic and probabilistic sensitivity analyses were conducted, taking into account uncertainty of the model parameters. Mortality and incidence were predicted to decrease over 20 years. The colorectal screening program in Flanders is found to be cost-effective, with an ICER of €1,681/QALY (95% CI -1,317 to 6,601) in males and €4,484/QALY (95% CI -3,254 to 18,163) in females. The probability of being cost-effective given a threshold of €35,000/QALY was 100% and 97.3%, respectively. The budget impact analysis showed the extra cost for the health care payer to be limited. This health economic analysis has shown that despite the possible adverse effects of screening and the extra costs for the health care payer and the patient, the population-based screening program for CRC in Flanders is cost-effective and should therefore be maintained. Copyright © 2016 European Federation of Internal Medicine. Published by Elsevier B.V. All rights reserved.
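
    For readers unfamiliar with the decision rule, the toy calculation below shows how an incremental cost-effectiveness ratio is formed and compared with the €35,000/QALY threshold; all numbers are placeholders, not the study's inputs.

```python
# Toy illustration of the decision rule described above: the incremental
# cost-effectiveness ratio (ICER) compared against a willingness-to-pay
# threshold. All numbers are placeholders, not the study's inputs.
def icer(cost_new, qaly_new, cost_old, qaly_old):
    return (cost_new - cost_old) / (qaly_new - qaly_old)   # cost per QALY gained

ratio = icer(cost_new=1200.0, qaly_new=12.30, cost_old=1000.0, qaly_old=12.25)
print(ratio, ratio < 35000)   # cost-effective if below the €35,000/QALY threshold
```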

  1. On the renewal risk model under a threshold strategy

    NASA Astrophysics Data System (ADS)

    Dong, Yinghui; Wang, Guojing; Yuen, Kam C.

    2009-08-01

    In this paper, we consider the renewal risk process under a threshold dividend payment strategy. For this model, the expected discounted dividend payments and the Gerber-Shiu expected discounted penalty function are investigated. Integral equations, integro-differential equations and some closed form expressions for them are derived. When the claims are exponentially distributed, it is verified that the expected penalty of the deficit at ruin is proportional to the ruin probability.

  2. A Threshold-Free Filtering Algorithm for Airborne LIDAR Point Clouds Based on Expectation-Maximization

    NASA Astrophysics Data System (ADS)

    Hui, Z.; Cheng, P.; Ziggah, Y. Y.; Nie, Y.

    2018-04-01

    Filtering is a key step for most applications of airborne LiDAR point clouds. Although many filtering algorithms have been put forward in recent years, most of them require parameter setting or threshold adjustment, which is time-consuming and reduces the degree of automation of the algorithm. To overcome this problem, this paper proposes a threshold-free filtering algorithm based on expectation-maximization. The proposed algorithm is developed based on the assumption that the point cloud can be treated as a mixture of Gaussian models. The separation of ground points and non-ground points can then be cast as separating the components of a Gaussian mixture model. Expectation-maximization (EM) is applied to realize the separation: EM is used to calculate maximum likelihood estimates of the mixture parameters, and with the estimated parameters the likelihood of each point belonging to ground or object can be computed. After several iterations, each point is labelled with the component having the larger likelihood. Furthermore, intensity information is utilized to refine the filtering results acquired with the EM method. The proposed algorithm was tested using two different datasets used in practice. Experimental results showed that the proposed method can filter non-ground points effectively. To quantitatively evaluate the proposed method, this paper adopted the dataset provided by the ISPRS for the test. The proposed algorithm obtained a 4.48% total error, which is much lower than most of the eight classical filtering algorithms reported by the ISPRS.
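
    A hedged sketch of the core idea, assuming the heights (or height residuals) of the points are the feature being modelled and using a generic mixture-model library; this is not the authors' implementation.

```python
# Hedged sketch of the core idea: model point heights (e.g., residuals from a
# coarse ground surface) as a two-component Gaussian mixture and let EM assign
# each point to the more likely component. Feature choice and preprocessing
# are assumptions, not the paper's exact pipeline.
import numpy as np
from sklearn.mixture import GaussianMixture

def em_ground_mask(heights):
    X = np.asarray(heights, dtype=float).reshape(-1, 1)
    gm = GaussianMixture(n_components=2, random_state=0).fit(X)   # EM fit
    labels = gm.predict(X)                                        # most likely component per point
    ground = int(np.argmin(gm.means_.ravel()))                    # lower-mean component ~ ground
    return labels == ground                                       # True for ground points

# mask = em_ground_mask(points[:, 2])   # points: (N, 3) array of x, y, z
```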

  3. Thresholds, injury, and loss relationships for thrips in Phleum pratense (Poales: Poaceae).

    PubMed

    Reisig, Dominic D; Godfrey, Larry D; Marcum, Daniel B

    2009-12-01

    Timothy (Phleum pratense L.) is an important forage crop in many Western U.S. states. Marketing of timothy hay is primarily based on esthetics, and green color is an important attribute. The objective of these studies was to determine a relationship between arthropod populations, yield, and esthetic injury in timothy. Economic injury levels (EILs) and economic thresholds were calculated based on these relationships. Thrips (Thripidae) numbers were manipulated with insecticides in small plot studies in 2006, 2007, and 2008, although tetranychid mite levels were incidentally flared by cyfluthrin in some experiments. Arthropod population densities were determined weekly, and yield and esthetic injury were measured at each harvest. Effects of arthropods on timothy were assessed using multilinear regression. Producers were also surveyed to relate economic loss from leaf color to the injury ratings for use in establishing EILs. Thrips population levels were significantly related to yield loss in only one of nine experiments. Thrips population levels were significantly related to injury once before the first annual harvest and twice before the second. Thrips were the most important pest in these experiments, and they were more often related to esthetic injury rather than yield loss. EILs and economic thresholds for thrips population levels were established using esthetic injury data. These results document the first example of a significant relationship between arthropod pest population levels and economic yield and quality losses in timothy.

  4. A mathematical model for malaria transmission with asymptomatic carriers and two age groups in the human population.

    PubMed

    Beretta, Edoardo; Capasso, Vincenzo; Garao, Dario G

    2018-06-01

    In this paper, a conceptual mathematical model of malaria transmission proposed in a previous paper is analyzed in deeper detail. Among the key epidemiological features of the model are two age classes (child and adult) and asymptomatic carriers. The extra mortality of mosquitoes due to the use of long-lasting treated mosquito nets (LLINs) and indoor residual spraying (IRS) is also included. By taking advantage of the natural double time scale of the parasite and human populations, it has been possible to provide interesting threshold results. In particular, it is shown that key parameters can be identified such that below a threshold level built on these parameters the epidemic tends to extinction, while above another threshold level it tends to a nontrivial endemic state, for which an interval estimate is provided. Numerical simulations confirm the analytical results. Copyright © 2018 Elsevier Inc. All rights reserved.

  5. Evolutionary dynamics of general group interactions in structured populations

    NASA Astrophysics Data System (ADS)

    Li, Aming; Broom, Mark; Du, Jinming; Wang, Long

    2016-02-01

    The evolution of populations is influenced by many factors, and the simple classical models have been developed in a number of important ways. Both population structure and multiplayer interactions have been shown to significantly affect the evolution of important properties, such as the level of cooperation or of aggressive behavior. Here we combine these two key factors and develop the evolutionary dynamics of general group interactions in structured populations represented by regular graphs. The traditional linear and threshold public goods games are adopted as models to address the dynamics. We show that for linear group interactions, population structure can favor the evolution of cooperation compared to the well-mixed case, and we see that the more neighbors there are, the harder it is for cooperators to persist in structured populations. We further show that threshold group interactions could lead to the emergence of cooperation even in well-mixed populations. Here population structure sometimes inhibits cooperation for the threshold public goods game, where depending on the benefit to cost ratio, the outcomes are bistability or a monomorphic population of defectors or cooperators. Our results suggest, counterintuitively, that structured populations are not always beneficial for the evolution of cooperation for nonlinear group interactions.

  6. Verification of the tumor volume delineation method using a fixed threshold of peak standardized uptake value.

    PubMed

    Koyama, Kazuya; Mitsumoto, Takuya; Shiraishi, Takahiro; Tsuda, Keisuke; Nishiyama, Atsushi; Inoue, Kazumasa; Yoshikawa, Kyosan; Hatano, Kazuo; Kubota, Kazuo; Fukushi, Masahiro

    2017-09-01

    We aimed to determine the difference in tumor volume associated with the reconstruction model in positron-emission tomography (PET). To reduce the influence of the reconstruction model, we suggested a method to measure the tumor volume using the relative threshold method with a fixed threshold based on the peak standardized uptake value (SUVpeak). The efficacy of our method was verified using 18F-2-fluoro-2-deoxy-D-glucose PET/computed tomography images of 20 patients with lung cancer. The tumor volume was determined using the relative threshold method with a fixed threshold based on the SUVpeak. The PET data were reconstructed using the ordered-subset expectation maximization (OSEM) model, the OSEM + time-of-flight (TOF) model, and the OSEM + TOF + point-spread function (PSF) model. The volume differences associated with the reconstruction algorithm (%VD) were compared. For comparison, the tumor volume was measured using the relative threshold method based on the maximum SUV (SUVmax). For the OSEM and TOF models, the mean %VD values were -0.06 ± 8.07 and -2.04 ± 4.23% for the fixed 40% threshold according to the SUVmax and the SUVpeak, respectively. The effect of our method in this case seemed to be minor. For the OSEM and PSF models, the mean %VD values were -20.41 ± 14.47 and -13.87 ± 6.59% for the fixed 40% threshold according to the SUVmax and SUVpeak, respectively. Our new method enabled the measurement of tumor volume with a fixed threshold and reduced the influence of the changes in tumor volume associated with the reconstruction model.
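
    The sketch below illustrates the fixed-fraction delineation and the %VD comparison in the simplest possible form, assuming SUVpeak has already been computed and that the SUV volume is available as an array; it is not the authors' code.

```python
# Illustrative sketch (not the authors' code): delineate the tumour as the
# set of voxels at or above 40% of SUVpeak and compare the volumes obtained
# from two reconstructions of the same scan. SUVpeak is assumed to be given.
import numpy as np

def tumor_volume_ml(suv, voxel_volume_ml, suv_peak, frac=0.40):
    return np.count_nonzero(suv >= frac * suv_peak) * voxel_volume_ml

def percent_vd(volume_test, volume_ref):
    return 100.0 * (volume_test - volume_ref) / volume_ref   # %VD between reconstructions
```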

  7. Bayesian Threshold Estimation

    ERIC Educational Resources Information Center

    Gustafson, S. C.; Costello, C. S.; Like, E. C.; Pierce, S. J.; Shenoy, K. N.

    2009-01-01

    Bayesian estimation of a threshold time (hereafter simply threshold) for the receipt of impulse signals is accomplished given the following: 1) data, consisting of the number of impulses received in a time interval from zero to one and the time of the largest time impulse; 2) a model, consisting of a uniform probability density of impulse time…

  8. Electrical percolation threshold of cementitious composites possessing self-sensing functionality incorporating different carbon-based materials

    NASA Astrophysics Data System (ADS)

    Al-Dahawi, Ali; Haroon Sarwary, Mohammad; Öztürk, Oğuzhan; Yıldırım, Gürkan; Akın, Arife; Şahmaran, Mustafa; Lachemi, Mohamed

    2016-10-01

    An experimental study was carried out to understand the electrical percolation thresholds of different carbon-based nano- and micro-scale materials in cementitious composites. Multi-walled carbon nanotubes (CNTs), graphene nanoplatelets (GNPs) and carbon black (CB) were selected as the nano-scale materials, while 6 and 12 mm long carbon fibers (CF6 and CF12) were used as the micro-scale carbon-based materials. After determining the percolation thresholds of different electrical conductive materials, mechanical properties and piezoresistive properties of specimens produced with the abovementioned conductive materials at percolation threshold were investigated under uniaxial compressive loading. Results demonstrate that regardless of initial curing age, the percolation thresholds of CNT, GNP, CB and CFs in ECC mortar specimens were around 0.55%, 2.00%, 2.00% and 1.00%, respectively. Including different carbon-based conductive materials did not harm compressive strength results; on the contrary, it improved overall values. All cementitious composites produced with carbon-based materials, with the exception of the control mixtures, exhibited piezoresistive behavior under compression, which is crucial for sensing capability. It is believed that incorporating the sensing attribute into cementitious composites will enhance benefits for sustainable civil infrastructures.

  9. multi-dice: r package for comparative population genomic inference under hierarchical co-demographic models of independent single-population size changes.

    PubMed

    Xue, Alexander T; Hickerson, Michael J

    2017-11-01

    Population genetic data from multiple taxa can address comparative phylogeographic questions about community-scale response to environmental shifts, and a useful strategy to this end is to employ hierarchical co-demographic models that directly test multi-taxa hypotheses within a single, unified analysis. This approach has been applied to classical phylogeographic data sets such as mitochondrial barcodes as well as reduced-genome polymorphism data sets that can yield 10,000s of SNPs, produced by emergent technologies such as RAD-seq and GBS. A strategy for the latter had been accomplished by adapting the site frequency spectrum to a novel summarization of population genomic data across multiple taxa called the aggregate site frequency spectrum (aSFS), which potentially can be deployed under various inferential frameworks including approximate Bayesian computation, random forest and composite likelihood optimization. Here, we introduce the r package multi-dice, a wrapper program that exploits existing simulation software for flexible execution of hierarchical model-based inference using the aSFS, which is derived from reduced genome data, as well as mitochondrial data. We validate several novel software features such as applying alternative inferential frameworks, enforcing a minimal threshold of time surrounding co-demographic pulses and specifying flexible hyperprior distributions. In sum, multi-dice provides comparative analysis within the familiar R environment while allowing a high degree of user customization, and will thus serve as a tool for comparative phylogeography and population genomics. © 2017 The Authors. Molecular Ecology Resources Published by John Wiley & Sons Ltd.

  10. Holes in the Bathtub: Water Table Dependent Services and Threshold Behavior in an Economic Model of Groundwater Extraction

    NASA Astrophysics Data System (ADS)

    Kirk-lawlor, N. E.; Edwards, E. C.

    2012-12-01

    In many groundwater systems, the height of the water table must be above certain thresholds for some types of surface flow to exist. Examples of flows that depend on water table elevation include groundwater baseflow to river systems, groundwater flow to wetland systems, and flow to springs. Meeting many of the goals of sustainable water resource management requires maintaining these flows at certain rates. Water resource management decisions invariably involve weighing tradeoffs between different possible usage regimes and the economic consequences of potential management choices are an important factor in these tradeoffs. Policies based on sustainability may have a social cost from forgoing present income. This loss of income may be worth bearing, but should be well understood and carefully considered. Traditionally, the economic theory of groundwater exploitation has relied on the assumption of a single-cell or "bathtub" aquifer model, which offers a simple means to examine complex interactions between water user and hydrologic system behavior. However, such a model assumes a closed system and does not allow for the simulation of groundwater outflows that depend on water table elevation (e.g. baseflow, springs, wetlands), even though those outflows have value. We modify the traditional single-cell aquifer model by allowing for outflows when the water table is above certain threshold elevations. These thresholds behave similarly to holes in a bathtub, where the outflow is a positive function of the height of the water table above the threshold and the outflow is lost when the water table drops below the threshold. We find important economic consequences to this representation of the groundwater system. The economic value of services provided by threshold-dependent outflows (including non-market value), such as ecosystem services, can be incorporated. The value of services provided by these flows may warrant maintaining the water table at higher levels than would
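
    A minimal sketch of the threshold-outflow idea described above, assuming a single-cell water balance with one "hole": the outflow term is active only while the head exceeds its threshold elevation, and all forms and values are illustrative.

```python
# Minimal sketch of the threshold-outflow ("hole in the bathtub") idea:
# a single-cell aquifer whose spring/baseflow outflow switches on only when
# the head exceeds a threshold elevation. Forms and values are illustrative.
def step_head(h, recharge, pumping, storage=1.0, h_thresh=10.0, k_out=0.2, dt=1.0):
    outflow = k_out * max(h - h_thresh, 0.0)     # flow through the "hole", lost below the threshold
    return h + (recharge - pumping - outflow) * dt / storage

h = 12.0
for year in range(20):
    h = step_head(h, recharge=1.0, pumping=1.3)  # pumping exceeds recharge
print(round(h, 2))   # head falls; once below h_thresh the spring flow is gone
```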

  11. Global epidemic invasion thresholds in directed cattle subpopulation networks having source, sink, and transit nodes.

    PubMed

    Schumm, Phillip; Scoglio, Caterina; Zhang, Qian; Balcan, Duygu

    2015-02-21

    Through the characterization of a metapopulation cattle disease model on a directed network having source, transit, and sink nodes, we derive two global epidemic invasion thresholds. The first threshold defines the conditions necessary for an epidemic to successfully spread at the global scale. The second threshold defines the criteria that permit an epidemic to move out of the giant strongly connected component and to invade the populations of the sink nodes. As each sink node represents a final waypoint for cattle before slaughter, the existence of an epidemic among the sink nodes is a serious threat to food security. We find that the relationship between these two thresholds depends on the relative proportions of transit and sink nodes in the system and the distributions of the in-degrees of both node types. These analytic results are verified through numerical realizations of the metapopulation cattle model. Published by Elsevier Ltd.

  12. Effect of thermal insulation on the electrical characteristics of NbOx threshold switches

    NASA Astrophysics Data System (ADS)

    Wang, Ziwen; Kumar, Suhas; Wong, H.-S. Philip; Nishi, Yoshio

    2018-02-01

    Threshold switches based on niobium oxide (NbOx) are promising candidates as bidirectional selector devices in crossbar memory arrays and building blocks for neuromorphic computing. Here, it is experimentally demonstrated that the electrical characteristics of NbOx threshold switches can be tuned by engineering the thermal insulation. Increasing the thermal insulation by ~10× is shown to produce ~7× reduction in threshold current and ~45% reduction in threshold voltage. The reduced threshold voltage leads to ~5× reduction in half-selection leakage, which highlights the effectiveness of reducing half-selection leakage of NbOx selectors by engineering the thermal insulation. A thermal feedback model based on Poole-Frenkel conduction in NbOx can explain the experimental results very well, which also serves as a piece of strong evidence supporting the validity of the Poole-Frenkel based mechanism in NbOx threshold switches.

  13. A systems approach to healthcare: agent-based modeling, community mental health, and population well-being.

    PubMed

    Silverman, Barry G; Hanrahan, Nancy; Bharathy, Gnana; Gordon, Kim; Johnson, Dan

    2015-02-01

    Explore whether agent-based modeling and simulation can help healthcare administrators discover interventions that increase population wellness and quality of care while, simultaneously, decreasing costs. Since important dynamics often lie in the social determinants outside the health facilities that provide services, this study models the problem at three levels (individuals, organizations, and society). The study explores the utility of translating existing (prize-winning) software for modeling complex societal systems and agents' daily life activities (a Sim City style of software) into the desired decision support system. A case study tests whether the three-level system modeling approach is feasible, valid, and useful. The case study involves an urban population with serious mental illness, Philadelphia's Medicaid population (n=527,056) in particular. Section 3 explains the models using data from the case study and thereby establishes feasibility of the approach for modeling a real system. The models were trained and tuned using national epidemiologic datasets and various domain expert inputs. To avoid co-mingling of training and testing data, the simulations were then run and compared (Section 4.1) to an analysis of 250,000 Philadelphia patient hospital admissions for the year 2010 in terms of re-hospitalization rate, number of doctor visits, and days in hospital. Based on the Student t-test, deviations between simulated vs. real world outcomes are not statistically significant. Validity is thus established for the 2008-2010 timeframe. We computed models of various types of interventions that were ineffective as well as 4 categories of interventions (e.g., reduced per-nurse caseload, increased check-ins and stays, etc.) that result in improvement in well-being and cost. The three-level approach appears to be useful to help health administrators sort through system complexities to find effective interventions at lower costs. Copyright © 2014 Elsevier B

  14. Low dimensional model of heart rhythm dynamics as a tool for diagnosing the anaerobic threshold

    NASA Astrophysics Data System (ADS)

    Anosov, O. L.; Butkovskii, O. Ya.; Kadtke, J.; Kravtsov, Yu. A.; Protopopescu, V.

    1997-05-01

    We report preliminary results on describing the dependence of heart rhythm variability on the stress level by using qualitative, low-dimensional models. The reconstruction of macroscopic heart models yielding cardio-cycle (RR-interval) durations was based on actual clinical data. Our results show that the coefficients of the low-dimensional models are sensitive to metabolic changes. In particular, at the transition between aerobic and aerobic-anaerobic metabolism, there are pronounced extrema in the functional dependence of the coefficients on the stress level. This strong sensitivity can be used to design a simple indirect method for determining the anaerobic threshold. This method could replace costly and invasive traditional methods such as gas analysis and blood tests.

  15. Methods for automatic trigger threshold adjustment

    DOEpatents

    Welch, Benjamin J; Partridge, Michael E

    2014-03-18

    Methods are presented for adjusting trigger threshold values to compensate for drift in the quiescent level of a signal monitored for initiating a data recording event, thereby avoiding false triggering conditions. Initial threshold values are periodically adjusted by re-measuring the quiescent signal level, and adjusting the threshold values by an offset computation based upon the measured quiescent signal level drift. Re-computation of the trigger threshold values can be implemented on time based or counter based criteria. Additionally, a qualification width counter can be utilized to implement a requirement that a trigger threshold criterion be met a given number of times prior to initiating a data recording event, further reducing the possibility of a false triggering situation.
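
    A pared-down sketch of the scheme summarized above; the class name, the mean-based estimate of the quiescent level, and the default parameter values are illustrative assumptions, not the patented implementation.

```python
# Pared-down sketch of the scheme summarized above: thresholds are offsets
# from a periodically re-measured quiescent level, and a qualification-width
# counter requires several consecutive exceedances before triggering.
class AdaptiveTrigger:
    def __init__(self, upper_offset, lower_offset, qual_width=3):
        self.upper_offset = upper_offset
        self.lower_offset = lower_offset
        self.qual_width = qual_width      # consecutive hits required to trigger
        self.baseline = 0.0
        self.hits = 0

    def recalibrate(self, quiescent_samples):
        # Periodically re-measure the quiescent level, shifting both thresholds.
        self.baseline = sum(quiescent_samples) / len(quiescent_samples)

    def update(self, sample):
        outside = (sample > self.baseline + self.upper_offset or
                   sample < self.baseline - self.lower_offset)
        self.hits = self.hits + 1 if outside else 0
        return self.hits >= self.qual_width   # True -> start recording an event
```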

  16. Bioclimatic Thresholds, Thermal Constants and Survival of Mealybug, Phenacoccus solenopsis (Hemiptera: Pseudococcidae) in Response to Constant Temperatures on Hibiscus

    PubMed Central

    Sreedevi, Gudapati; Prasad, Yenumula Gerard; Prabhakar, Mathyam; Rao, Gubbala Ramachandra; Vennila, Sengottaiyan; Venkateswarlu, Bandi

    2013-01-01

    Temperature-driven development and survival rates of the mealybug, Phenacoccus solenopsis Tinsley (Hemiptera: Pseudococcidae) were examined at nine constant temperatures (15, 20, 25, 27, 30, 32, 35 and 40°C) on hibiscus (Hibiscus rosa-sinensis L.). Crawlers successfully completed development to adult stage between 15 and 35°C, although their survival was affected at low temperatures. Two linear and four nonlinear models were fitted to describe developmental rates of P. solenopsis as a function of temperature, and for estimating thermal constants and bioclimatic thresholds (lower, optimum and upper temperature thresholds for development: Tmin, Topt and Tmax, respectively). Estimated thresholds between the two linear models were statistically similar. Ikemoto and Takai's linear model permitted testing the equivalence of lower developmental thresholds for life stages of P. solenopsis reared on two hosts, hibiscus and cotton. Thermal constants required for completion of cumulative development of female and male nymphs and for the whole generation were significantly lower on hibiscus (222.2, 237.0, 308.6 degree-days, respectively) compared to cotton. Three nonlinear models performed better in describing the developmental rate for immature instars and cumulative life stages of female and male and for generation based on goodness-of-fit criteria. The simplified β type distribution function estimated Topt values closer to the observed maximum rates. The thermodynamic SSI model indicated no significant differences in the intrinsic optimum temperature estimates for different geographical populations of P. solenopsis. The estimated bioclimatic thresholds and the observed survival rates of P. solenopsis indicate the species to be high-temperature adaptive, and explained the field abundance of P. solenopsis on its host plants. PMID:24086597
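
    For orientation, the sketch below shows the standard linear degree-day estimates that such studies report (lower threshold from the zero-rate intercept, thermal constant from the reciprocal slope), computed on made-up numbers rather than the study's data.

```python
# Sketch of the standard linear degree-day estimates referred to above: the
# lower developmental threshold is where the fitted line crosses zero rate
# and the thermal constant is the reciprocal slope. The temperatures and
# rates are made up, not the study's data.
import numpy as np

temps = np.array([20.0, 25.0, 30.0, 32.0])      # rearing temperatures, deg C
rates = np.array([0.020, 0.035, 0.050, 0.056])  # 1 / development time, per day

slope, intercept = np.polyfit(temps, rates, 1)
t_min = -intercept / slope        # lower developmental threshold, deg C
K = 1.0 / slope                   # thermal constant, degree-days
print(round(t_min, 1), round(K, 1))
```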

  17. [Threshold value for reimbursement of costs of new drugs: cost-effectiveness research and modelling are essential links].

    PubMed

    Frederix, Geert W J; Hövels, Anke M; Severens, Johan L; Raaijmakers, Jan A M; Schellens, Jan H M

    2015-01-01

    There is increasing discussion in the Netherlands about the introduction of a threshold value for the costs per extra year of life when reimbursing costs of new drugs. The Medicines Committee ('Commissie Geneesmiddelen'), a division of the Netherlands National Healthcare Institute ('Zorginstituut Nederland'), advises on reimbursement of costs of new drugs. This advice is based upon the determination of therapeutic value of the drug and the results of economic evaluations. Mathematical models that predict future costs and effectiveness are often used in economic evaluations; these models can vary greatly in transparency and quality due to author assumptions. Standardisation of cost-effectiveness models is one solution to overcome the unwanted variation in quality. Discussions about the introduction of a threshold value can only be meaningful if all involved are adequately informed, and by high quality in cost-effectiveness research and, particularly, economic evaluations. Collaboration and discussion between medical specialists, patients or patient organisations, health economists and policy makers, both in development of methods and in standardisation, are essential to improve the quality of decision making.

  18. Evidence Accumulator or Decision Threshold – Which Cortical Mechanism are We Observing?

    PubMed Central

    Simen, Patrick

    2012-01-01

    Most psychological models of perceptual decision making are of the accumulation-to-threshold variety. The neural basis of accumulation in parietal and prefrontal cortex is therefore a topic of great interest in neuroscience. In contrast, threshold mechanisms have received less attention, and their neural basis has usually been sought in subcortical structures. Here I analyze a model of a decision threshold that can be implemented in the same cortical areas as evidence accumulators, and whose behavior bears on two open questions in decision neuroscience: (1) When ramping activity is observed in a brain region during decision making, does it reflect evidence accumulation? (2) Are changes in speed-accuracy tradeoffs and response biases more likely to be achieved by changes in thresholds, or in accumulation rates and starting points? The analysis suggests that task-modulated ramping activity, by itself, is weak evidence that a brain area mediates evidence accumulation as opposed to threshold readout; and that signs of modulated accumulation are as likely to indicate threshold adaptation as adaptation of starting points and accumulation rates. These conclusions imply that how thresholds are modeled can dramatically impact accumulator-based interpretations of this data. PMID:22737136

  19. A statistical framework for the validation of a population exposure model based on personal exposure data

    NASA Astrophysics Data System (ADS)

    Rodriguez, Delphy; Valari, Myrto; Markakis, Konstantinos; Payan, Sébastien

    2016-04-01

    Currently, ambient pollutant concentrations at monitoring sites are routinely measured by local networks, such as AIRPARIF in Paris, France. Pollutant concentration fields are also simulated with regional-scale chemistry transport models such as CHIMERE (http://www.lmd.polytechnique.fr/chimere) under air-quality forecasting platforms (e.g. Prev'Air http://www.prevair.org) or research projects. These data may be combined with more or less sophisticated techniques to provide a fairly good representation of pollutant concentration spatial gradients over urban areas. Here we focus on human exposure to atmospheric contaminants. Based on census data on population dynamics and demographics, modeled outdoor concentrations and infiltration of outdoor air pollution indoors, we have developed a population exposure model for ozone and PM2.5. A critical challenge in the field of population exposure modeling is model validation, since personal exposure data are expensive and therefore rare. However, recent research has made low-cost mobile sensors fairly common, and personal exposure data should therefore become more and more accessible. In view of planned cohort field campaigns where such data will be available over the Paris region, we propose in the present study a statistical framework that makes the comparison between modeled and measured exposures meaningful. Our ultimate goal is to evaluate the exposure model by comparing modeled exposures to monitor data. The scientific question we address here is how to downscale modeled data, estimated at the county population scale, to the individual scale appropriate to the available measurements. To address this question we developed a Bayesian hierarchical framework that assimilates actual individual data into population statistics and updates the probability estimate.

  20. Development of a dynamic framework to explain population patterns of leisure-time physical activity through agent-based modeling.

    PubMed

    Garcia, Leandro M T; Diez Roux, Ana V; Martins, André C R; Yang, Yong; Florindo, Alex A

    2017-08-22

    Despite the increasing body of evidence on the factors influencing leisure-time physical activity, our understanding of the mechanisms and interactions that lead to the formation and evolution of population patterns is still limited. Moreover, most frameworks in this field fail to capture dynamic processes. Our aim was to create a dynamic conceptual model depicting the interaction between key psychological attributes of individuals and main aspects of the built and social environments in which they live. This conceptual model will inform and support the development of an agent-based model aimed at exploring how population patterns of LTPA in adults may emerge from the dynamic interplay between psychological traits and built and social environments. We integrated existing theories and models as well as available empirical data (both from literature reviews), and expert opinions (based on a systematic expert assessment of an intermediary version of the model). The model explicitly presents intention as the proximal determinant of leisure-time physical activity, a relationship dynamically moderated by the built environment (access, quality, and available activities), with the strength of the moderation varying as a function of the person's intention, and influenced both by the social environment (proximal network's and community's behavior) and the person's own behavior. Our conceptual model is well supported by evidence and experts' opinions and will inform the design of our agent-based model, as well as data collection and analysis of future investigations on population patterns of leisure-time physical activity among adults.

  1. Threshold effect of habitat loss on bat richness in cerrado-forest landscapes.

    PubMed

    Muylaert, Renata L; Stevens, Richard D; Ribeiro, Milton C

    2016-09-01

    Understanding how animal groups respond to contemporary habitat loss and fragmentation is essential for development of strategies for species conservation. Until now, there has been no consensus about how landscape degradation affects the diversity and distribution of Neotropical bats. Some studies demonstrate population declines and species loss in impacted areas, although the magnitude and generality of these effects on bat community structure are unclear. Empirical fragmentation thresholds predict an accentuated drop in biodiversity, and species richness in particular, when less than 30% of the original amount of habitat in the landscape remains. In this study, we tested whether bat species richness demonstrates this threshold response, based on 48 sites distributed across 12 landscapes with 9-88% remaining forest in Brazilian cerrado-forest formations. We also examined the degree to which abundance was similarly affected within four different feeding guilds. The threshold value for richness, below which bat diversity declines precipitously, was estimated at 47% of remaining forest. To verify if the response of bat abundance to habitat loss differed among feeding guilds, we used a model selection approach based on Akaike's information criterion. Models accounted for the amount of riparian forest, semideciduous forest, cerrado, tree plantations, secondary forest, and the total amount of forest in the landscape. We demonstrate a nonlinear effect of the contribution of tree plantations to frugivores, and a positive effect of the amount of cerrado to nectarivores and animalivores, the groups that responded most to decreases in amount of forest. We suggest that bat assemblages in interior Atlantic Forest and cerrado regions of southeastern Brazil are impoverished, since we found lower richness and abundance of different groups in landscapes with lower amounts of forest. The relatively higher threshold value of 47% suggests that bat communities have a relatively lower

  2. Development of an anaerobic threshold (HRLT, HRVT) estimation equation using the heart rate threshold (HRT) during the treadmill incremental exercise test

    PubMed Central

    Ham, Joo-ho; Park, Hun-Young; Kim, Youn-ho; Bae, Sang-kon; Ko, Byung-hoon

    2017-01-01

    [Purpose] The purpose of this study was to develop a regression model to estimate the heart rate at the lactate threshold (HRLT) and the heart rate at the ventilatory threshold (HRVT) using the heart rate threshold (HRT), and to test the validity of the regression model. [Methods] We performed a graded exercise test with a treadmill in 220 normal individuals (men: 112, women: 108) aged 20–59 years. HRT, HRLT, and HRVT were measured in all subjects. A regression model was developed to estimate HRLT and HRVT using HRT with 70% of the data (men: 79, women: 76) through randomization (7:3), with the Bernoulli trial. The validity of the regression model developed with the remaining 30% of the data (men: 33, women: 32) was also examined. [Results] Based on the regression coefficient, we found that the independent variable HRT was a significant variable in all regression models. The adjusted R2 of the developed regression models averaged about 70%, and the standard error of estimation of the validity test results was 11 bpm, which is similar to that of the developed model. [Conclusion] These results suggest that HRT is a useful parameter for predicting HRLT and HRVT. PMID:29036765

  3. Development of an anaerobic threshold (HRLT, HRVT) estimation equation using the heart rate threshold (HRT) during the treadmill incremental exercise test.

    PubMed

    Ham, Joo-Ho; Park, Hun-Young; Kim, Youn-Ho; Bae, Sang-Kon; Ko, Byung-Hoon; Nam, Sang-Seok

    2017-09-30

    The purpose of this study was to develop a regression model to estimate the heart rate at the lactate threshold (HRLT) and the heart rate at the ventilatory threshold (HRVT) using the heart rate threshold (HRT), and to test the validity of the regression model. We performed a graded exercise test with a treadmill in 220 normal individuals (men: 112, women: 108) aged 20-59 years. HRT, HRLT, and HRVT were measured in all subjects. A regression model was developed to estimate HRLT and HRVT using HRT with 70% of the data (men: 79, women: 76) through randomization (7:3), with the Bernoulli trial. The validity of the regression model developed with the remaining 30% of the data (men: 33, women: 32) was also examined. Based on the regression coefficient, we found that the independent variable HRT was a significant variable in all regression models. The adjusted R2 of the developed regression models averaged about 70%, and the standard error of estimation of the validity test results was 11 bpm, which is similar to that of the developed model. These results suggest that HRT is a useful parameter for predicting HRLT and HRVT. ©2017 The Korean Society for Exercise Nutrition

  4. Population pharmacokinetic-pharmacodynamic modeling and model-based prediction of docetaxel-induced neutropenia in Japanese patients with non-small cell lung cancer.

    PubMed

    Fukae, Masato; Shiraishi, Yoshimasa; Hirota, Takeshi; Sasaki, Yuka; Yamahashi, Mika; Takayama, Koichi; Nakanishi, Yoichi; Ieiri, Ichiro

    2016-11-01

    Docetaxel is used to treat many cancers, and neutropenia is the dose-limiting factor for its clinical use. A population pharmacokinetic-pharmacodynamic (PK-PD) model was introduced to predict the development of docetaxel-induced neutropenia in Japanese patients with non-small cell lung cancer (NSCLC). Forty-seven advanced or recurrent Japanese patients with NSCLC were enrolled. Patients received 50 or 60 mg/m² docetaxel as monotherapy, and blood samples for a PK analysis were collected up to 24 h after its infusion. Laboratory tests including absolute neutrophil count data and demographic information were used in population PK-PD modeling. The model was built by NONMEM 7.2 with a first-order conditional estimation using an interaction method. Based on the final model, a Monte Carlo simulation was performed to assess the impact of covariates on and the predictability of neutropenia. A three-compartment model was employed to describe PK data, and the PK model adequately described the docetaxel concentrations observed. Serum albumin (ALB) was detected as a covariate of clearance (CL): CL (L/h) = 32.5 × (ALB/3.6)^0.965 × (WGHT/70)^(3/4). In population PK-PD modeling, a modified semi-mechanistic myelosuppression model was applied, and characterization of the time course of neutrophil counts was adequate. The covariate selection indicated that α1-acid glycoprotein (AAG) was a predictor of neutropenia. The model-based simulation also showed that ALB and AAG negatively correlated with the development of neutropenia and that the time course of neutrophil counts was predictable. The developed model may facilitate the prediction and care of docetaxel-induced neutropenia.

  5. Dynamic energy budget as a basis to model population-level effects of zinc-spiked sediments in the gastropod Valvata piscinalis.

    PubMed

    Ducrot, Virginie; Péry, Alexandre R R; Mons, Raphaël; Quéau, Hervé; Charles, Sandrine; Garric, Jeanne

    2007-08-01

    This paper presents original toxicity test designs and mathematical models that may be used to assess the deleterious effects of toxicants on Valvata piscinalis (Mollusca, Gastropoda). Results obtained for zinc, used as a reference toxicant, are presented. The feeding behavior, juvenile survival, growth, age at puberty, onset of reproduction, number of breedings during the life cycle, and fecundity were significantly altered when the snails were exposed to zinc-spiked sediments. Dynamic energy budget models (DEBtox) adequately predicted the effects of zinc on the V. piscinalis life cycle. They also provided estimates for lifecycle parameters that were used to parameterize a demographic model, based on a Z-transformed life-cycle graph. The effect threshold for the population growth rate (lambda) was estimated at 259 mg/kg dry sediment of zinc, showing that significant changes in abundance may occur at environmental concentrations. Significant effects occurring just above this threshold value were mainly caused by the severe impairment of reproductive endpoints. Sensitivity analysis showed that the value of lambda depended mainly on the juvenile survival rate. The impairment of this latter parameter may result in extinction of V. piscinalis. Finally, the present study highlights advantages of the proposed modeling approach in V. piscinalis and possible transfer to other test species and contaminants.

  6. Landslide triggering thresholds for Switzerland based on a new gridded precipitation dataset

    NASA Astrophysics Data System (ADS)

    Leonarduzzi, Elena; Molnar, Peter; McArdell, Brian W.

    2017-04-01

    In Switzerland floods are responsible for most of the damage caused by rainfall-triggered natural hazards (89%), followed by landslides (6%, ca. 520 M Euros) as reported in Hilker et al. (2009) for the period 1972-2007. The prediction of landslide occurrence is particularly challenging because of their wide distribution in space and the complex interdependence of predisposing and triggering factors. The overall goal of our research is to develop an Early Warning System for landsliding in Switzerland based on hydrological modelling and rainfall forecasts. In order to achieve this, we first analyzed rainfall triggering thresholds for landslides from a new gridded daily precipitation dataset (RhiresD, MeteoSwiss) for Switzerland combined with landslide events recorded in the Swiss Damage Database (Hilker et al.,2009). The high-resolution gridded precipitation dataset allows us to collocate rainfall and landslides accurately in space, which is an advantage over many previous studies. Each of the 2272 landslides in the database in the period 1972-2012 was assigned to the corresponding 2x2 km precipitation cell. For each of these cells, precipitation events were defined as series of consecutive rainy days and the following event parameters were computed: duration (day), maximum and mean daily intensity (mm/day), total rainfall depth (mm) and maximum daily intensity divided by Mean Daily Precipitation (MDP). The events were classified as triggering or non-triggering depending on whether a landslide was recorded in the cell during the event. This classification of observations was compared to predictions based on a threshold for each of the parameters. The predictive power of each parameter and the best threshold value were quantified by ROC analysis and statistics such as AUC and the True Skill Statistic (TSS). Event parameters based on rainfall intensity were found to have similarly high predictive power (TSS=0.54-0.59, AUC=0.85-0.86), while rainfall duration had a
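
    The sketch below shows, on placeholder data, how a single-parameter threshold can be selected by maximizing the True Skill Statistic over candidate values, in the spirit of the ROC analysis described above; it is not the study's code or dataset.

```python
# Minimal sketch of choosing a rainfall threshold by maximizing the True
# Skill Statistic (hit rate minus false-alarm rate); the event values and
# labels below are placeholders, not the Swiss dataset.
import numpy as np

def best_threshold(values, triggered):
    values = np.asarray(values, dtype=float)
    triggered = np.asarray(triggered, dtype=bool)
    best = (None, -np.inf)
    for thr in np.unique(values):
        pred = values >= thr
        tp = np.sum(pred & triggered);   fn = np.sum(~pred & triggered)
        fp = np.sum(pred & ~triggered);  tn = np.sum(~pred & ~triggered)
        tss = tp / (tp + fn) - fp / (fp + tn)
        if tss > best[1]:
            best = (thr, tss)
    return best   # (threshold, TSS at that threshold)

print(best_threshold([12, 30, 55, 8, 70, 22, 41], [0, 0, 1, 0, 1, 0, 1]))
```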

  7. Conception, fabrication and characterization of a silicon based MEMS inertial switch with a threshold value of 5 g

    NASA Astrophysics Data System (ADS)

    Zhang, Fengtian; Wang, Chao; Yuan, Mingquan; Tang, Bin; Xiong, Zhuang

    2017-12-01

    Most of the MEMS inertial switches developed in recent years are intended for shock and impact sensing with a threshold value above 50 g. To meet the requirement of detecting linear acceleration signals at the low-g level, a silicon-based MEMS inertial switch with a threshold value of 5 g was designed, fabricated and characterized. The switch consists of a large proof mass supported by circular spiral springs. An analytical model of the structure stiffness of the proposed switch was derived and verified by finite-element simulation. The structure was fabricated on a customized double-buried-layer silicon-on-insulator wafer and encapsulated by glass wafers. Centrifugal and nanoindentation experiments were performed to measure the threshold value as well as the structure stiffness. The actual threshold values were measured to be 0.1-0.3 g lower than the pre-designed value of 5 g due to the dimension loss during non-contact lithography processing. Concerning the reliability assessment, a series of environmental experiments were conducted and the switches remained operational without excessive errors. However, both the random vibration and shock tests indicate that metal particles generated during collision of the contact parts might affect the contact reliability and long-term stability. Based on these findings, a more detailed study of switch contact behavior should be included in future research.

  8. Estimating and modelling cure in population-based cancer studies within the framework of flexible parametric survival models

    PubMed Central

    2011-01-01

    Background When the mortality among a cancer patient group returns to the same level as in the general population, that is, the patients no longer experience excess mortality, the patients still alive are considered "statistically cured". Cure models can be used to estimate the cure proportion as well as the survival function of the "uncured". One limitation of parametric cure models is that the functional form of the survival of the "uncured" has to be specified. It can sometimes be hard to find a survival function flexible enough to fit the observed data, for example, when there is high excess hazard within a few months from diagnosis, which is common among older age groups. This has led to the exclusion of older age groups in population-based cancer studies using cure models. Methods Here we have extended the flexible parametric survival model to incorporate cure as a special case to estimate the cure proportion and the survival of the "uncured". Flexible parametric survival models use splines to model the underlying hazard function, and therefore no parametric distribution has to be specified. Results We have compared the fit from standard cure models to our flexible cure model, using data on colon cancer patients in Finland. This new method gives similar results to a standard cure model, when it is reliable, and better fit when the standard cure model gives biased estimates. Conclusions Cure models within the framework of flexible parametric models enables cure modelling when standard models give biased estimates. These flexible cure models enable inclusion of older age groups and can give stage-specific estimates, which is not always possible from parametric cure models. PMID:21696598

  9. A score-statistic approach for determining threshold values in QTL mapping.

    PubMed

    Kao, Chen-Hung; Ho, Hsiang-An

    2012-06-01

    Issues in determining threshold values in QTL mapping have so far been investigated mainly for backcross and F2 populations, which have relatively simple genome structures. Investigations of these issues in progeny populations beyond F2 (advanced populations), which have more complicated genomes, are generally inadequate. As these advanced populations are widely used in QTL mapping, it is important to address these issues for them in more detail. Owing to the increasing number of meiosis cycles, the genomes of advanced populations can be very different from the backcross and F2 genomes. Therefore, special devices that consider the specific genome structures present in the advanced populations are required to resolve these issues. By considering the differences in genome structure between populations, we formulate more general score test statistics and Gaussian processes to evaluate their threshold values. In general, we found that, given a significance level and a genome size, threshold values for QTL detection are higher for denser marker maps and for more advanced populations. Simulations were performed to validate our approach.

  10. Fine-touch pressure thresholds in the adult penis.

    PubMed

    Sorrells, Morris L; Snyder, James L; Reiss, Mark D; Eden, Christopher; Milos, Marilyn F; Wilcox, Norma; Van Howe, Robert S

    2007-04-01

    To map the fine-touch pressure thresholds of the adult penis in circumcised and uncircumcised men, and to compare the two populations. Adult male volunteers with no history of penile pathology or diabetes were evaluated with a Semmes-Weinstein monofilament touch-test to map the fine-touch pressure thresholds of the penis. Circumcised and uncircumcised men were compared using mixed models for repeated data, controlling for age, type of underwear worn, time since last ejaculation, ethnicity, country of birth, and level of education. The glans of the uncircumcised men had significantly lower mean (sem) pressure thresholds than that of the circumcised men, at 0.161 (0.078) g (P = 0.040) when controlled for age, location of measurement, type of underwear worn, and ethnicity. There were significant differences in pressure thresholds by location on the penis (P < 0.001). The most sensitive location on the circumcised penis was the circumcision scar on the ventral surface. Five locations on the uncircumcised penis that are routinely removed at circumcision had lower pressure thresholds than the ventral scar of the circumcised penis. The glans of the circumcised penis is less sensitive to fine touch than the glans of the uncircumcised penis. The transitional region from the external to the internal prepuce is the most sensitive region of the uncircumcised penis and more sensitive than the most sensitive region of the circumcised penis. Circumcision ablates the most sensitive parts of the penis.

  11. High prices for rare species can drive large populations extinct: the anthropogenic Allee effect revisited.

    PubMed

    Holden, Matthew H; McDonald-Madden, Eve

    2017-09-21

    Consumer demand for plant and animal products threatens many populations with extinction. The anthropogenic Allee effect (AAE) proposes that such extinctions can be caused by prices for wildlife products increasing with species rarity. This price-rarity relationship creates financial incentives to extract the last remaining individuals of a population, despite higher search and harvest costs. The AAE has become a standard approach for conceptualizing the threat of economic markets on endangered species. Despite its potential importance for conservation, AAE theory is based on a simple graphical model with limited analysis of possible population trajectories. By specifying a general class of functions for price-rarity relationships, we show that the classic theory can understate the risk of species extinction. AAE theory proposes that only populations below a critical Allee threshold will go extinct due to increasing price-rarity relationships. Our analysis shows that this threshold can be much higher than the original theory suggests, depending on initial harvest effort. More alarmingly, even species with population sizes above this Allee threshold, for which AAE predicts persistence, can be destined to extinction. Introducing even a minimum price for harvested individuals, close to zero, can cause large populations to cross the classic anthropogenic Allee threshold on a trajectory towards extinction. These results suggest that traditional AAE theory may give a false sense of security when managing large harvested populations. Copyright © 2017 Elsevier Ltd. All rights reserved.
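
    A toy open-access harvest model in the spirit of the discussion above can make the point concrete: with a price that rises as the stock becomes rare plus a small price floor, harvesting can remain profitable all the way to extinction. The functional forms and parameter values below are assumptions for illustration only.

```python
# Toy open-access harvest model in the spirit of the AAE discussion above:
# the unit price rises as the stock gets rarer and includes a small floor.
# Functional forms and all parameter values are illustrative assumptions.
def simulate(N0, E0, r=0.5, K=1.0, q=1.0, c=0.1, a=0.2, p0=0.05,
             mu=0.5, dt=0.01, steps=20000):
    N, E = N0, E0
    for _ in range(steps):
        price = p0 + a / max(N, 1e-9)             # price-rarity relationship
        profit = (price * q * N - c) * E          # revenue minus cost of effort
        N += (r * N * (1.0 - N / K) - q * E * N) * dt
        E += mu * profit * dt                     # effort follows profit (open access)
        N, E = max(N, 0.0), max(E, 0.0)
    return N, E

# With these numbers revenue per unit effort tends to a*q = 0.2 > c as N -> 0,
# so harvesting stays profitable, effort keeps growing and the stock collapses
# even from a large starting population.
print(simulate(N0=0.8, E0=0.1))
```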

  12. Population viability analysis for endangered Roanoke logperch

    USGS Publications Warehouse

    Roberts, James H.; Angermeier, Paul; Anderson, Gregory B.

    2016-01-01

    A common strategy for recovering endangered species is ensuring that populations exceed the minimum viable population size (MVP), a demographic benchmark that theoretically ensures low long-term extinction risk. One method of establishing MVP is population viability analysis, a modeling technique that simulates population trajectories and forecasts extinction risk based on a series of biological, environmental, and management assumptions. Such models also help identify key uncertainties that have a large influence on extinction risk. We used stochastic count-based simulation models to explore extinction risk, MVP, and the possible benefits of alternative management strategies in populations of Roanoke logperch Percina rex, an endangered stream fish. Estimates of extinction risk were sensitive to the assumed population growth rate and model type, carrying capacity, and catastrophe regime (frequency and severity of anthropogenic fish kills), whereas demographic augmentation did little to reduce extinction risk. Under density-dependent growth, the estimated MVP for Roanoke logperch ranged from 200 to 4200 individuals, depending on the assumed severity of catastrophes. Thus, depending on the MVP threshold, anywhere from two to all five of the logperch populations we assessed were projected to be viable. Despite this uncertainty, these results help identify populations with the greatest relative extinction risk, as well as management strategies that might reduce this risk the most, such as increasing carrying capacity and reducing fish kills. Better estimates of population growth parameters and catastrophe regimes would facilitate the refinement of MVP and extinction-risk estimates, and they should be a high priority for future research on Roanoke logperch and other imperiled stream-fish species.
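
    The sketch below is a generic count-based PVA of the kind referred to above: stochastic log-normal growth with a ceiling carrying capacity and random catastrophes, with extinction risk estimated by simulation; all parameter values are illustrative assumptions.

```python
# Generic count-based PVA sketch: stochastic log-normal growth with a ceiling
# carrying capacity and random fish-kill catastrophes. All parameter values
# are illustrative assumptions, not estimates for Roanoke logperch.
import numpy as np

def extinction_risk(n0, mu=0.01, sigma=0.15, K=2000, p_cat=0.05,
                    severity=0.5, years=100, reps=2000, quasi_ext=50, seed=1):
    rng = np.random.default_rng(seed)
    extinct = 0
    for _ in range(reps):
        n = float(n0)
        for _ in range(years):
            n = min(n * np.exp(rng.normal(mu, sigma)), K)   # stochastic growth, capped at K
            if rng.random() < p_cat:                        # catastrophe (fish kill)
                n *= 1.0 - severity
            if n < quasi_ext:                               # quasi-extinction threshold
                extinct += 1
                break
    return extinct / reps

print(extinction_risk(500))   # estimated probability of quasi-extinction within 100 years
```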

  13. Thresholds of probable problematic gambling involvement for the German population: Results of the Pathological Gambling and Epidemiology (PAGE) Study.

    PubMed

    Brosowski, Tim; Hayer, Tobias; Meyer, Gerhard; Rumpf, Hans-Jürgen; John, Ulrich; Bischof, Anja; Meyer, Christian

    2015-09-01

    Consumption measures in gambling research may help to establish thresholds of low-risk gambling as 1 part of evidence-based responsible gambling strategies. The aim of this study is to replicate existing Canadian thresholds of probable low-risk gambling (Currie et al., 2006) in a representative dataset of German gambling behavior (Pathological Gambling and Epidemiology [PAGE]; N = 15,023). Receiver-operating characteristic curves applied in a training dataset (60%) extracted robust thresholds of low-risk gambling across 4 nonexclusive definitions of gambling problems (1+ to 4+ Diagnostic and Statistical Manual for Mental Disorders-Fifth Edition [DSM-5] Composite International Diagnostic Interview [CIDI] symptoms), different indicators of gambling involvement (across all game types; form-specific) and different timeframes (lifetime; last year). Logistic regressions applied in a test dataset (40%) to cross-validate the heuristics of probable low-risk gambling incorporated confounding covariates (age, gender, education, migration, and unemployment) and confirmed the strong concurrent validity of the thresholds. Moreover, it was possible to establish robust form-specific thresholds of low-risk gambling (only for gaming machines and poker). Possible implications for early detection of problem gamblers in offline or online environments are discussed. Results substantiate international knowledge about problem gambling prevention and contribute to a German discussion about empirically based guidelines of low-risk gambling. (c) 2015 APA, all rights reserved.
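    The threshold-extraction step can be illustrated with a generic ROC analysis on simulated data; the involvement measure, the simulated sample, and the use of Youden's J as the selection criterion are assumptions for this sketch and do not reproduce the PAGE methodology.

```python
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(0)

# Hypothetical data: a gambling-involvement measure (e.g., days gambled per month)
# and a binary indicator of gambling problems; values are simulated, not PAGE data.
involvement = np.concatenate([rng.poisson(3, 900), rng.poisson(10, 100)])
problem = np.concatenate([np.zeros(900, dtype=int), np.ones(100, dtype=int)])

fpr, tpr, thresholds = roc_curve(problem, involvement)
j = tpr - fpr                              # Youden's J statistic
best = int(np.argmax(j))
print(f"candidate low-risk threshold: {thresholds[best]:.1f} days/month "
      f"(sensitivity={tpr[best]:.2f}, specificity={1 - fpr[best]:.2f})")
```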

  14. Network-level reproduction number and extinction threshold for vector-borne diseases.

    PubMed

    Xue, Ling; Scoglio, Caterina

    2015-06-01

    The basic reproduction number of deterministic models is an essential quantity to predict whether an epidemic will spread or not. Thresholds for disease extinction contribute crucial knowledge of disease control, elimination, and mitigation of infectious diseases. Relationships between basic reproduction numbers of two deterministic network-based ordinary differential equation vector-host models, and extinction thresholds of corresponding stochastic continuous-time Markov chain models are derived under some assumptions. Numerical simulation results for malaria and Rift Valley fever transmission on heterogeneous networks are in agreement with analytical results without any assumptions, reinforcing that the relationships may always exist and proposing a mathematical problem for proving their existence in general. Moreover, numerical simulations show that the basic reproduction number does not monotonically increase or decrease with the extinction threshold. Consistent trends of extinction probability observed through numerical simulations provide novel insights into mitigation strategies to increase the disease extinction probability. Research findings may improve understanding of thresholds for disease persistence in order to control vector-borne diseases.
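    The qualitative link between a reproduction number and a stochastic extinction threshold can be illustrated with the textbook single-type branching process, where the extinction probability is the smallest fixed point of the offspring generating function; this generic sketch (Poisson offspring assumed) does not implement the network vector-host models of the study.

```python
import numpy as np

def extinction_probability(R0, tol=1e-12, max_iter=10_000):
    """Smallest fixed point of q = exp(R0*(q - 1)) for Poisson-offspring branching."""
    q = 0.0
    for _ in range(max_iter):
        q_next = np.exp(R0 * (q - 1.0))
        if abs(q_next - q) < tol:
            break
        q = q_next
    return q

if __name__ == "__main__":
    for R0 in (0.8, 1.0, 1.5, 2.0, 4.0):
        print(f"R0 = {R0:.1f} -> P(extinction from one case) = {extinction_probability(R0):.3f}")
```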

  15. Hydro-mechanical mechanism and thresholds of rainfall-induced unsaturated landslides

    NASA Astrophysics Data System (ADS)

    Yang, Zongji; Lei, Xiaoqin; Huang, Dong; Qiao, Jianping

    2017-04-01

    The devastating Ms 8 Wenchuan earthquake in 2008 created the greatest number of co-seismic mountain hazards ever recorded in China. However, the dynamics of rainfall-induced mass remobilization and the transport of deposits after a giant earthquake are not fully understood. Moreover, rainfall intensity-duration (I-D) methods are the predominant early-warning indicators of rainfall-induced landslides in post-earthquake regions, as they are a convenient and straightforward way to predict the hazards. However, such rainfall-based criteria and thresholds are generally empirical and based on statistical analysis; consequently, they ignore the failure mechanisms of the landslides. This study examines the mechanism, hydro-mechanical behavior, and thresholds of these unsaturated deposits under the influence of rainfall. To accomplish this, in situ experiments were performed in an instrumented landslide deposit. The field experimental tests were conducted on a natural co-seismic fractured slope to 1) simulate rainfall-induced shallow failures in the depression channels of a debris flow catchment in an earthquake-affected region, 2) explore the mechanisms and transient processes associated with hydro-mechanical parameter variations in response to the infiltration of rainfall, and 3) identify the hydrologic parameter thresholds and critical criteria of gravitational erosion in areas prone to mass remobilization as a source of debris flows. These experiments provided instrumental evidence and directly proved that post-earthquake rainfall-induced mass remobilization occurred under unsaturated conditions in response to transient rainfall infiltration, and revealed the presence of transient processes and the dominance of preferential flow paths during rainfall infiltration. A hydro-mechanical method was adopted for the transient hydrologic process modelling and unsaturated slope stability analysis, and the slope failures observed during the experimental tests were reproduced by the model.

  16. Population Dynamics of Belonolaimus longicaudatus in a Cotton Production System

    PubMed Central

    Crow, W. T.; Weingartner, D. P.; McSorley, R.; Dickson, D. W.

    2000-01-01

    Belonolaimus longicaudatus is a recognized pathogen of cotton (Gossypium hirsutum), but insufficient information is available on the population dynamics and economic thresholds of B. longicaudatus in cotton production. In this study, data collected from a field in Florida were used to develop models predicting population increases of B. longicaudatus on cotton and population declines under clean fallow. Population densities of B. longicaudatus increased on cotton, reaching a carrying capacity of 139 nematodes/130 cm³ of soil, but decreased exponentially during periods of bare fallow. The model indicated that population densities should decrease each year of monocropped cotton, if an alternate host is not present between sequential cotton crops. Economic thresholds derived from published damage functions and current prices for cotton and nematicides varied from 2 to 5 B. longicaudatus/130 cm³ of soil, depending on the nematicide used. PMID:19270968
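    The two phases described (logistic increase on cotton toward the reported carrying capacity, exponential decline under clean fallow) can be sketched as below; only the carrying capacity of 139 nematodes/130 cm³ is taken from the abstract, while the growth and decline rates and season lengths are illustrative assumptions.

```python
import numpy as np

K = 139.0          # carrying capacity from the abstract (nematodes per 130 cm^3 of soil)
r_crop = 0.05      # assumed per-day logistic growth rate on cotton
d_fallow = 0.02    # assumed per-day exponential decline rate under clean fallow

def season(P0, crop_days=150, fallow_days=215):
    P = P0
    for _ in range(crop_days):                 # increase during the cotton crop
        P += r_crop * P * (1 - P / K)
    return P * np.exp(-d_fallow * fallow_days) # decline during bare fallow

if __name__ == "__main__":
    P = 20.0
    for year in range(1, 6):
        P = season(P)
        print(f"after year {year}: {P:.1f} nematodes/130 cm^3")
```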

  17. Prediction of spatially explicit rainfall intensity-duration thresholds for post-fire debris-flow generation in the western United States

    NASA Astrophysics Data System (ADS)

    Staley, Dennis; Negri, Jacquelyn; Kean, Jason

    2016-04-01

    Population expansion into fire-prone steeplands has resulted in an increase in post-fire debris-flow risk in the western United States. Logistic regression methods for determining debris-flow likelihood and the calculation of empirical rainfall intensity-duration thresholds for debris-flow initiation represent two common approaches for characterizing hazard and reducing risk. Logistic regression models are currently being used to rapidly assess debris-flow hazard in response to design storms of known intensities (e.g. a 10-year recurrence interval rainstorm). Empirical rainfall intensity-duration thresholds comprise a major component of the United States Geological Survey (USGS) and the National Weather Service (NWS) debris-flow early warning system at a regional scale in southern California. However, these two modeling approaches remain independent, and each has limitations that do not allow for synergistic local-scale (e.g. drainage-basin scale) characterization of debris-flow hazard during intense rainfall. The current logistic regression equations treat rainfall as a single independent variable, which prevents the direct calculation of the relation between rainfall intensity and debris-flow likelihood. Regional (e.g. mountain range or physiographic province scale) rainfall intensity-duration thresholds fail to provide insight into the basin-scale variability of post-fire debris-flow hazard and require an extensive database of historical debris-flow occurrence and rainfall characteristics. Here, we present a new approach that combines traditional logistic regression and intensity-duration threshold methodologies. This method allows for local characterization of the likelihood that a debris flow will occur at a given rainfall intensity, direct calculation of the rainfall rates that will result in a given likelihood, and calculation of spatially explicit rainfall intensity-duration thresholds for debris-flow generation in recently

  18. Single bumps in a 2-population homogenized neuronal network model

    NASA Astrophysics Data System (ADS)

    Kolodina, Karina; Oleynik, Anna; Wyller, John

    2018-05-01

    We investigate existence and stability of single bumps in a homogenized 2-population neural field model, when the firing rate functions are given by the Heaviside function. The model is derived by means of the two-scale convergence technique of Nguetseng in the case of periodic microvariation in the connectivity functions. The connectivity functions are periodically modulated in both the synaptic footprint and in the spatial scale. The bump solutions are constructed by using a pinning function technique for the case where the solutions are independent of the local variable. In the weakly modulated case the generic picture consists of two bumps (one narrow and one broad bump) for each admissible set of threshold values for firing. In addition, a new threshold value regime for existence of bumps is detected. Beyond the weakly modulated regime the number of bumps depends sensitively on the degree of heterogeneity. For the latter case we present a configuration consisting of three coexisting bumps. The linear stability of the bumps is studied by means of the spectral properties of a Fredholm integral operator, block diagonalization of this operator and the Fourier decomposition method. In the weakly modulated regime, one of the bumps is unstable for all relative inhibition times, while the other one is stable for small and moderate values of this parameter. The latter bump becomes unstable as the relative inhibition time exceeds a certain threshold. In the case of the three coexisting bumps detected in the regime of finite degree of heterogeneity, we have at least one stable bump (and maximum two stable bumps) for small and moderate values of the relative inhibition time.

  19. Assessing regional and interspecific variation in threshold responses of forest breeding birds through broad scale analyses.

    PubMed

    van der Hoek, Yntze; Renfrew, Rosalind; Manne, Lisa L

    2013-01-01

    Identifying persistence and extinction thresholds in species-habitat relationships is a major focal point of ecological research and conservation. However, one major concern regarding the incorporation of threshold analyses in conservation is the lack of knowledge on the generality and transferability of results across species and regions. We present a multi-region, multi-species approach of modeling threshold responses, which we use to investigate whether threshold effects are similar across species and regions. We modeled local persistence and extinction dynamics of 25 forest-associated breeding birds based on detection/non-detection data, which were derived from repeated breeding bird atlases for the state of Vermont. We did not find threshold responses to be particularly well-supported, with 9 species supporting extinction thresholds and 5 supporting persistence thresholds. This contrasts with a previous study based on breeding bird atlas data from adjacent New York State, which showed that most species support persistence and extinction threshold models (15 and 22 of 25 study species respectively). In addition, species that supported a threshold model in both states had associated average threshold estimates of 61.41% (SE = 6.11, persistence) and 66.45% (SE = 9.15, extinction) in New York, compared to 51.08% (SE = 10.60, persistence) and 73.67% (SE = 5.70, extinction) in Vermont. Across species, thresholds were found at 19.45-87.96% forest cover for persistence and 50.82-91.02% for extinction dynamics. Through an approach that allows for broad-scale comparisons of threshold responses, we show that species vary in their threshold responses with regard to habitat amount, and that differences between even nearby regions can be pronounced. We present both ecological and methodological factors that may contribute to the different model results, but propose that regardless of the reasons behind these differences, our results merit a warning that

  20. Assessing Regional and Interspecific Variation in Threshold Responses of Forest Breeding Birds through Broad Scale Analyses

    PubMed Central

    van der Hoek, Yntze; Renfrew, Rosalind; Manne, Lisa L.

    2013-01-01

    Background Identifying persistence and extinction thresholds in species-habitat relationships is a major focal point of ecological research and conservation. However, one major concern regarding the incorporation of threshold analyses in conservation is the lack of knowledge on the generality and transferability of results across species and regions. We present a multi-region, multi-species approach of modeling threshold responses, which we use to investigate whether threshold effects are similar across species and regions. Methodology/Principal Findings We modeled local persistence and extinction dynamics of 25 forest-associated breeding birds based on detection/non-detection data, which were derived from repeated breeding bird atlases for the state of Vermont. We did not find threshold responses to be particularly well-supported, with 9 species supporting extinction thresholds and 5 supporting persistence thresholds. This contrasts with a previous study based on breeding bird atlas data from adjacent New York State, which showed that most species support persistence and extinction threshold models (15 and 22 of 25 study species respectively). In addition, species that supported a threshold model in both states had associated average threshold estimates of 61.41% (SE = 6.11, persistence) and 66.45% (SE = 9.15, extinction) in New York, compared to 51.08% (SE = 10.60, persistence) and 73.67% (SE = 5.70, extinction) in Vermont. Across species, thresholds were found at 19.45–87.96% forest cover for persistence and 50.82–91.02% for extinction dynamics. Conclusions/Significance Through an approach that allows for broad-scale comparisons of threshold responses, we show that species vary in their threshold responses with regard to habitat amount, and that differences between even nearby regions can be pronounced. We present both ecological and methodological factors that may contribute to the different model results, but propose that regardless of the

  1. A derivation of the stable cavitation threshold accounting for bubble-bubble interactions.

    PubMed

    Guédra, Matthieu; Cornu, Corentin; Inserra, Claude

    2017-09-01

    The subharmonic emission of sound coming from the nonlinear response of a bubble population is the most used indicator for stable cavitation. When driven at twice their resonance frequency, bubbles can exhibit subharmonic spherical oscillations if the acoustic pressure amplitude exceeds a threshold value. Although various theoretical derivations exist for the subharmonic emission by free or coated bubbles, they all rest on the single bubble model. In this paper, we propose an analytical expression of the subharmonic threshold for interacting bubbles in a homogeneous, monodisperse cloud. This theory predicts a shift of the subharmonic resonance frequency and a decrease of the corresponding pressure threshold due to the interactions. For a given sonication frequency, these results show that an optimal value of the interaction strength (i.e. the number density of bubbles) can be found for which the subharmonic threshold is minimum, which is consistent with recently published experiments conducted on ultrasound contrast agents. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. Hydrodynamics of sediment threshold

    NASA Astrophysics Data System (ADS)

    Ali, Sk Zeeshan; Dey, Subhasish

    2016-07-01

    A novel hydrodynamic model for the threshold of cohesionless sediment particle motion under a steady unidirectional streamflow is presented. The hydrodynamic forces (drag and lift) acting on a solitary sediment particle resting over a closely packed bed formed by the identical sediment particles are the primary motivating forces. The drag force comprises the form drag and the form-induced drag. The lift force includes the Saffman lift, Magnus lift, centrifugal lift, and turbulent lift. The points of action of the force system are appropriately obtained, for the first time, from the basics of micro-mechanics. The sediment threshold is envisioned as the rolling mode, which is the plausible mode to initiate a particle motion on the bed. The moment balance of the force system on the solitary particle about the pivoting point of rolling yields the governing equation. The conditions of sediment threshold under the hydraulically smooth, transitional, and rough flow regimes are examined. The effects of velocity fluctuations are addressed by applying the statistical theory of turbulence. This study shows that for a hindrance coefficient of 0.3, the threshold curve (threshold Shields parameter versus shear Reynolds number) has an excellent agreement with the experimental data of uniform sediments. However, most of the experimental data are bounded by the upper and lower limiting threshold curves, corresponding to the hindrance coefficients of 0.2 and 0.4, respectively. The threshold curve of this study is compared with those of previous researchers. The present model also agrees satisfactorily with the experimental data of nonuniform sediments.

  3. Point estimation following two-stage adaptive threshold enrichment clinical trials.

    PubMed

    Kimani, Peter K; Todd, Susan; Renfro, Lindsay A; Stallard, Nigel

    2018-05-31

    Recently, several study designs incorporating treatment effect assessment in biomarker-based subpopulations have been proposed. Most statistical methodologies for such designs focus on the control of type I error rate and power. In this paper, we have developed point estimators for clinical trials that use the two-stage adaptive enrichment threshold design. The design consists of two stages, where in stage 1, patients are recruited in the full population. Stage 1 outcome data are then used to perform interim analysis to decide whether the trial continues to stage 2 with the full population or a subpopulation. The subpopulation is defined based on one of the candidate threshold values of a numerical predictive biomarker. To estimate treatment effect in the selected subpopulation, we have derived unbiased estimators, shrinkage estimators, and estimators that estimate bias and subtract it from the naive estimate. We have recommended one of the unbiased estimators. However, since none of the estimators dominated in all simulation scenarios based on both bias and mean squared error, an alternative strategy would be to use a hybrid estimator where the estimator used depends on the subpopulation selected. This would require a simulation study of plausible scenarios before the trial. © 2018 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.

  4. Synergistic effects in threshold models on networks.

    PubMed

    Juul, Jonas S; Porter, Mason A

    2018-01-01

    Network structure can have a significant impact on the propagation of diseases, memes, and information on social networks. Different types of spreading processes (and other dynamical processes) are affected by network architecture in different ways, and it is important to develop tractable models of spreading processes on networks to explore such issues. In this paper, we incorporate the idea of synergy into a two-state ("active" or "passive") threshold model of social influence on networks. Our model's update rule is deterministic, and the influence of each meme-carrying (i.e., active) neighbor can, depending on a parameter, either be enhanced or inhibited by an amount that depends on the number of active neighbors of a node. Such a synergistic system models social behavior in which the willingness to adopt either accelerates or saturates in a way that depends on the number of neighbors who have adopted that behavior. We illustrate that our model's synergy parameter has a crucial effect on system dynamics, as it determines whether degree-k nodes are possible or impossible to activate. We simulate synergistic meme spreading on both random-graph models and networks constructed from empirical data. Using a heterogeneous mean-field approximation, which we derive under the assumption that a network is locally tree-like, we are able to determine which synergy-parameter values allow degree-k nodes to be activated for many networks and for a broad family of synergistic models.

  5. Synergistic effects in threshold models on networks

    NASA Astrophysics Data System (ADS)

    Juul, Jonas S.; Porter, Mason A.

    2018-01-01

    Network structure can have a significant impact on the propagation of diseases, memes, and information on social networks. Different types of spreading processes (and other dynamical processes) are affected by network architecture in different ways, and it is important to develop tractable models of spreading processes on networks to explore such issues. In this paper, we incorporate the idea of synergy into a two-state ("active" or "passive") threshold model of social influence on networks. Our model's update rule is deterministic, and the influence of each meme-carrying (i.e., active) neighbor can—depending on a parameter—either be enhanced or inhibited by an amount that depends on the number of active neighbors of a node. Such a synergistic system models social behavior in which the willingness to adopt either accelerates or saturates in a way that depends on the number of neighbors who have adopted that behavior. We illustrate that our model's synergy parameter has a crucial effect on system dynamics, as it determines whether degree-k nodes are possible or impossible to activate. We simulate synergistic meme spreading on both random-graph models and networks constructed from empirical data. Using a heterogeneous mean-field approximation, which we derive under the assumption that a network is locally tree-like, we are able to determine which synergy-parameter values allow degree-k nodes to be activated for many networks and for a broad family of synergistic models.
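    A minimal sketch of a deterministic two-state threshold update with a synergy term is given below; the specific influence function, the parameter values, and the random-graph test case are assumptions for illustration and are not the exact rule analyzed in the paper.

```python
import networkx as nx

def synergistic_cascade(G, seeds, threshold=0.2, beta=0.2, max_steps=100):
    """Deterministic two-state threshold dynamics with a synergy term.

    A passive node with degree k and m active neighbours becomes active when
    m * (1 + beta * (m - 1)) / k >= threshold.  This influence function is an
    illustrative assumption, not the rule of the cited model.
    """
    active = set(seeds)
    for _ in range(max_steps):
        newly = set()
        for node in G:
            if node in active:
                continue
            k = G.degree(node)
            if k == 0:
                continue
            m = sum(1 for nb in G.neighbors(node) if nb in active)
            if m * (1 + beta * (m - 1)) / k >= threshold:
                newly.add(node)
        if not newly:
            break
        active |= newly
    return active

if __name__ == "__main__":
    G = nx.erdos_renyi_graph(1000, 0.01, seed=1)
    final = synergistic_cascade(G, seeds=range(50))
    print(f"final active fraction: {len(final) / G.number_of_nodes():.2f}")
```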

  6. Thresholding Based on Maximum Weighted Object Correlation for Rail Defect Detection

    NASA Astrophysics Data System (ADS)

    Li, Qingyong; Huang, Yaping; Liang, Zhengping; Luo, Siwei

    Automatic thresholding is an important technique for rail defect detection, but traditional methods are not competent enough to fit the characteristics of this application. This paper proposes the Maximum Weighted Object Correlation (MWOC) thresholding method, fitting the features that rail images are unimodal and defect proportion is small. MWOC selects a threshold by optimizing the product of object correlation and the weight term that expresses the proportion of thresholded defects. Our experimental results demonstrate that MWOC achieves misclassification error of 0.85%, and outperforms the other well-established thresholding methods, including Otsu, maximum correlation thresholding, maximum entropy thresholding and valley-emphasis method, for the application of rail defect detection.
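    The overall structure of such threshold selection (an exhaustive search over gray levels for the value maximizing a product-form objective) can be sketched as follows; the toy criterion used here is an Otsu-style stand-in weighted toward a small defect class, not the actual MWOC object-correlation formula.

```python
import numpy as np

def select_threshold(image, criterion):
    """Exhaustive search over gray levels; returns the level maximizing `criterion`."""
    levels = np.arange(1, 255)
    scores = [criterion(image, t) for t in levels]
    return levels[int(np.argmax(scores))]

def toy_criterion(image, t):
    # Illustrative objective: between-class variance down-weighted by the object
    # proportion, so a small dark "defect" class is favoured (NOT the MWOC formula).
    obj = image[image < t]
    bg = image[image >= t]
    if obj.size == 0 or bg.size == 0:
        return -np.inf
    w0 = obj.size / image.size
    return w0 * (1 - w0) ** 2 * (bg.mean() - obj.mean()) ** 2

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    img = rng.normal(180, 12, (64, 64))     # bright, unimodal "rail" background
    img[20:24, 10:40] -= 80                 # small dark defect region
    img = np.clip(img, 0, 255)
    print("selected threshold:", select_threshold(img, toy_criterion))
```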

  7. Impact of different policies on unhealthy dietary behaviors in an urban adult population: an agent-based simulation model.

    PubMed

    Zhang, Donglan; Giabbanelli, Philippe J; Arah, Onyebuchi A; Zimmerman, Frederick J

    2014-07-01

    Unhealthy eating is a complex-system problem. We used agent-based modeling to examine the effects of different policies on unhealthy eating behaviors. We developed an agent-based simulation model to represent a synthetic population of adults in Pasadena, CA, and how they make dietary decisions. Data from the 2007 Food Attitudes and Behaviors Survey and other empirical studies were used to calibrate the parameters of the model. Simulations were performed to contrast the potential effects of various policies on the evolution of dietary decisions. Our model showed that a 20% increase in taxes on fast foods would lower the probability of fast-food consumption by 3 percentage points, whereas improving the visibility of positive social norms by 10%, either through community-based or mass-media campaigns, could improve the consumption of fruits and vegetables by 7 percentage points and lower fast-food consumption by 6 percentage points. Zoning policies had no significant impact. Interventions emphasizing healthy eating norms may be more effective than directly targeting food prices or regulating local food outlets. Agent-based modeling may be a useful tool for testing the population-level effects of various policies within complex systems.

  8. Impact of Different Policies on Unhealthy Dietary Behaviors in an Urban Adult Population: An Agent-Based Simulation Model

    PubMed Central

    Giabbanelli, Philippe J.; Arah, Onyebuchi A.; Zimmerman, Frederick J.

    2014-01-01

    Objectives. Unhealthy eating is a complex-system problem. We used agent-based modeling to examine the effects of different policies on unhealthy eating behaviors. Methods. We developed an agent-based simulation model to represent a synthetic population of adults in Pasadena, CA, and how they make dietary decisions. Data from the 2007 Food Attitudes and Behaviors Survey and other empirical studies were used to calibrate the parameters of the model. Simulations were performed to contrast the potential effects of various policies on the evolution of dietary decisions. Results. Our model showed that a 20% increase in taxes on fast foods would lower the probability of fast-food consumption by 3 percentage points, whereas improving the visibility of positive social norms by 10%, either through community-based or mass-media campaigns, could improve the consumption of fruits and vegetables by 7 percentage points and lower fast-food consumption by 6 percentage points. Zoning policies had no significant impact. Conclusions. Interventions emphasizing healthy eating norms may be more effective than directly targeting food prices or regulating local food outlets. Agent-based modeling may be a useful tool for testing the population-level effects of various policies within complex systems. PMID:24832414

  9. Modelling effects of diquat under realistic exposure patterns in genetically differentiated populations of the gastropod Lymnaea stagnalis

    PubMed Central

    Ducrot, Virginie; Péry, Alexandre R. R.; Lagadic, Laurent

    2010-01-01

    Pesticide use leads to complex exposure and response patterns in non-target aquatic species, so that the analysis of data from standard toxicity tests may result in unrealistic risk forecasts. Developing models that are able to capture such complexity from toxicity test data is thus a crucial issue for pesticide risk assessment. In this study, freshwater snails from two genetically differentiated populations of Lymnaea stagnalis were exposed to repeated acute applications of environmentally realistic concentrations of the herbicide diquat, from the embryo to the adult stage. Hatching rate, embryonic development duration, juvenile mortality, feeding rate and age at first spawning were investigated during both exposure and recovery periods. Effects of diquat on mortality were analysed using a threshold hazard model accounting for time-varying herbicide concentrations. All endpoints were significantly impaired at diquat environmental concentrations in both populations. Snail evolutionary history had no significant impact on their sensitivity and responsiveness to diquat, whereas food acted as a modulating factor of toxicant-induced mortality. The time course of effects was adequately described by the model, which thus appears suitable to analyse long-term effects of complex exposure patterns based upon full life cycle experiment data. Obtained model outputs (e.g. no-effect concentrations) could be directly used for chemical risk assessment. PMID:20921047

  10. Modelling effects of diquat under realistic exposure patterns in genetically differentiated populations of the gastropod Lymnaea stagnalis.

    PubMed

    Ducrot, Virginie; Péry, Alexandre R R; Lagadic, Laurent

    2010-11-12

    Pesticide use leads to complex exposure and response patterns in non-target aquatic species, so that the analysis of data from standard toxicity tests may result in unrealistic risk forecasts. Developing models that are able to capture such complexity from toxicity test data is thus a crucial issue for pesticide risk assessment. In this study, freshwater snails from two genetically differentiated populations of Lymnaea stagnalis were exposed to repeated acute applications of environmentally realistic concentrations of the herbicide diquat, from the embryo to the adult stage. Hatching rate, embryonic development duration, juvenile mortality, feeding rate and age at first spawning were investigated during both exposure and recovery periods. Effects of diquat on mortality were analysed using a threshold hazard model accounting for time-varying herbicide concentrations. All endpoints were significantly impaired at diquat environmental concentrations in both populations. Snail evolutionary history had no significant impact on their sensitivity and responsiveness to diquat, whereas food acted as a modulating factor of toxicant-induced mortality. The time course of effects was adequately described by the model, which thus appears suitable to analyse long-term effects of complex exposure patterns based upon full life cycle experiment data. Obtained model outputs (e.g. no-effect concentrations) could be directly used for chemical risk assessment.
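    The mortality analysis described uses a threshold hazard model with time-varying exposure; a generic sketch of that idea (hazard equal to a background rate plus a term proportional to the exceedance of the concentration over a no-effect threshold) is given below, with all parameter values and the pulsed exposure profile chosen purely for illustration.

```python
import numpy as np

def survival_under_exposure(conc, dt=1.0, threshold=20.0, kill_rate=0.005,
                            background=0.001):
    """Survival probability over time for a generic threshold hazard model.

    hazard(t) = background + kill_rate * max(0, C(t) - threshold)
    S(t) = exp(-integral of hazard).  Units and values are illustrative only.
    """
    hazard = background + kill_rate * np.maximum(0.0, np.asarray(conc) - threshold)
    return np.exp(-np.cumsum(hazard) * dt)

if __name__ == "__main__":
    conc = np.zeros(120)
    for start in (10, 50, 90):             # repeated acute exposure pulses
        conc[start:start + 3] = 60.0
    S = survival_under_exposure(conc)
    print(f"survival after 120 days: {S[-1]:.3f}")
```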

  11. Invited perspectives: Hydrological perspectives on precipitation intensity-duration thresholds for landslide initiation: proposing hydro-meteorological thresholds

    NASA Astrophysics Data System (ADS)

    Bogaard, Thom; Greco, Roberto

    2018-01-01

    Many shallow landslides and debris flows are precipitation-initiated. Therefore, regional landslide hazard assessment is often based on empirically derived precipitation intensity-duration (ID) thresholds and landslide inventories. Generally, two features of precipitation events are plotted and labeled with (shallow) landslide occurrence or non-occurrence. Hereafter, a separation line or zone is drawn, mostly in logarithmic space. The practical background of ID is that often only meteorological information is available when analyzing (non-)occurrence of shallow landslides and, at the same time, it could be that precipitation information is a good proxy for both meteorological trigger and hydrological cause. Although applied in many case studies, this approach suffers from many false positives as well as limited physical process understanding. Some first steps towards a more hydrologically based approach have been proposed in the past, but these efforts received limited follow-up. Therefore, the objective of our paper is to (a) critically analyze the concept of precipitation ID thresholds for shallow landslides and debris flows from a hydro-meteorological point of view and (b) propose a trigger-cause conceptual framework for lumped regional hydro-meteorological hazard assessment based on published examples and associated discussion. We discuss the ID thresholds in relation to return periods of precipitation, soil physics, and slope and catchment water balance. With this paper, we aim to contribute to the development of a stronger conceptual model for regional landslide hazard assessment based on physical process understanding and empirical data.

  12. Predictors of first episode unipolar major depression in individuals with and without sub-threshold depressive symptoms: A prospective, population-based study

    PubMed Central

    Peters, Amy T.; Shankman, Stewart A.; Deckersbach, Thilo; West, Amy E.

    2015-01-01

    Background The aim of this study is to assess predictors of first-episode major depression in a community-based sample of adults with and without sub-threshold depression. Method Data were from Waves 1 & 2 of the National Epidemiological Survey on Alcohol and Related Conditions (NESARC). Participants meeting criteria for a sub-threshold depressive episode (sMDE; n = 3,901) reported lifetime depressed mood/loss of interest lasting at least two weeks and at least two of the seven other DSM-IV symptoms of MDD. Predictors of MDE 3 years later were compared in those with and without (n = 31,022) sMDE. Results Being female, history of alcohol or substance use, and child abuse increased the odds of developing MDD to a greater degree in individuals without sMDE relative to those with sMDE. Among those with sMDE and additional risk factors (low education, substance use), younger age was associated with marginally increased risk of MDD. Conclusion Several demographic risk factors may help identify individuals at risk for developing MDD in individuals who have not experienced an sMDE who may be candidates for early intervention. Future work should assess whether preventative interventions targeting substance/alcohol use and child abuse could reduce the risk of depression. PMID:26343831

  13. Deactivating stimulation sites based on low-rate thresholds improves spectral ripple and speech reception thresholds in cochlear implant users.

    PubMed

    Zhou, Ning

    2017-03-01

    The study examined whether the benefit of deactivating stimulation sites estimated to have broad neural excitation was attributed to improved spectral resolution in cochlear implant users. The subjects' spatial neural excitation pattern was estimated by measuring low-rate detection thresholds across the array [see Zhou (2016). PLoS One 11, e0165476]. Spectral resolution, as assessed by spectral-ripple discrimination thresholds, significantly improved after deactivation of five high-threshold sites. The magnitude of improvement in spectral-ripple discrimination thresholds predicted the magnitude of improvement in speech reception thresholds after deactivation. Results suggested that a smaller number of relatively independent channels provide a better outcome than using all channels that might interact.

  14. Estimating and modelling cure in population-based cancer studies within the framework of flexible parametric survival models.

    PubMed

    Andersson, Therese M L; Dickman, Paul W; Eloranta, Sandra; Lambert, Paul C

    2011-06-22

    When the mortality among a cancer patient group returns to the same level as in the general population, that is, the patients no longer experience excess mortality, the patients still alive are considered "statistically cured". Cure models can be used to estimate the cure proportion as well as the survival function of the "uncured". One limitation of parametric cure models is that the functional form of the survival of the "uncured" has to be specified. It can sometimes be hard to find a survival function flexible enough to fit the observed data, for example, when there is high excess hazard within a few months from diagnosis, which is common among older age groups. This has led to the exclusion of older age groups in population-based cancer studies using cure models. Here we have extended the flexible parametric survival model to incorporate cure as a special case to estimate the cure proportion and the survival of the "uncured". Flexible parametric survival models use splines to model the underlying hazard function, and therefore no parametric distribution has to be specified. We have compared the fit from standard cure models to our flexible cure model, using data on colon cancer patients in Finland. This new method gives similar results to a standard cure model, when it is reliable, and better fit when the standard cure model gives biased estimates. Cure models within the framework of flexible parametric models enables cure modelling when standard models give biased estimates. These flexible cure models enable inclusion of older age groups and can give stage-specific estimates, which is not always possible from parametric cure models. © 2011 Andersson et al; licensee BioMed Central Ltd.

  15. Modeling the population dynamics of Culex quinquefasciatus (Diptera: Culicidae) along an elevational gradient in Hawaii

    USGS Publications Warehouse

    Ahumada, Jorge A.; LaPointe, Dennis; Samuel, Michael D.

    2004-01-01

    We present a population model to understand the effects of temperature and rainfall on the population dynamics of the southern house mosquito, Culex quinquefasciatus Say, along an elevational gradient in Hawaii. We use a novel approach to model the effects of temperature on population growth by dynamically incorporating developmental rate into the transition matrix, by using physiological ages of immatures instead of chronological age or stages. We also model the effects of rainfall on survival of immatures as the cumulative number of days below a certain rain threshold. Finally, we incorporate density dependence into the model as competition between immatures within breeding sites. Our model predicts the upper altitudinal distributions of Cx. quinquefasciatus on the Big Island of Hawaii for self-sustaining mosquito and migrating summer sink populations at 1,475 and 1,715 m above sea level, respectively. Our model predicts that mosquitoes at lower elevations can grow under a broader range of rainfall parameters than middle and high elevation populations. Density dependence in conjunction with the seasonal forcing imposed by temperature and rain creates cycles in the dynamics of the population that peak in the summer and early fall. The model provides a reasonable fit to the available data on mosquito abundance for the east side of Mauna Loa, Hawaii. The predictions of our model indicate the importance of abiotic conditions on mosquito dynamics and have important implications for the management of diseases transmitted by Cx. quinquefasciatus in Hawaii and elsewhere.

  16. Stable Extraction of Threshold Voltage Using Transconductance Change Method for CMOS Modeling, Simulation and Characterization

    NASA Astrophysics Data System (ADS)

    Choi, Woo Young; Woo, Dong-Soo; Choi, Byung Yong; Lee, Jong Duk; Park, Byung-Gook

    2004-04-01

    We proposed a stable extraction algorithm for threshold voltage using the transconductance change method by optimizing the node interval. With the algorithm, noise-free gm2 (=dgm/dVGS) profiles can be extracted within one-percent error, which leads to more physically-meaningful threshold voltage calculation by the transconductance change method. The extracted threshold voltage predicts the gate-to-source voltage at which the surface potential is within kT/q of φs=2φf+VSB. Our algorithm makes the transconductance change method more practical by overcoming the noise problem. This threshold voltage extraction algorithm yields the threshold roll-off behavior of nanoscale metal oxide semiconductor field effect transistors (MOSFETs) accurately and makes it possible to calculate the surface potential φs at any other point on the drain-to-source current (IDS) versus gate-to-source voltage (VGS) curve. It will provide us with a useful analysis tool in the field of device modeling, simulation and characterization.
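    The transconductance change method itself is straightforward to sketch: numerically differentiate the I_D-V_GS sweep to get g_m, differentiate again to get g_m2 = dg_m/dV_GS, and take V_T at the g_m2 peak. The simple smoothing step below stands in for the paper's node-interval optimization, and the synthetic device curve and noise level are assumptions.

```python
import numpy as np

def threshold_voltage(vgs, ids, window=9):
    """Estimate V_T as the gate voltage where d(gm)/dVGS peaks (generic sketch)."""
    kernel = np.ones(window) / window
    def smooth(y):
        return np.convolve(y, kernel, mode="same")   # crude noise suppression
    gm = np.gradient(smooth(ids), vgs)               # transconductance dI_D/dV_GS
    gm2 = np.gradient(smooth(gm), vgs)                # transconductance change dgm/dV_GS
    return vgs[int(np.argmax(gm2))]

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    vgs = np.linspace(0.0, 1.2, 241)
    vt_true, a, k = 0.6, 0.05, 1e-3
    # Synthetic low-drain-bias I-V: exponential subthreshold, ~linear above V_T.
    ids = k * a * np.log1p(np.exp((vgs - vt_true) / a))
    ids += rng.normal(0, 2e-8, ids.size)              # measurement noise
    print(f"extracted V_T = {threshold_voltage(vgs, ids):.3f} V "
          f"(synthetic curve built with V_T = 0.600 V)")
```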

  17. Velocity-based movement modeling for individual and population level inference

    USGS Publications Warehouse

    Hanks, Ephraim M.; Hooten, Mevin B.; Johnson, Devin S.; Sterling, Jeremy T.

    2011-01-01

    Understanding animal movement and resource selection provides important information about the ecology of the animal, but an animal's movement and behavior are not typically constant in time. We present a velocity-based approach for modeling animal movement in space and time that allows for temporal heterogeneity in an animal's response to the environment, allows for temporal irregularity in telemetry data, and accounts for the uncertainty in the location information. Population-level inference on movement patterns and resource selection can then be made through cluster analysis of the parameters related to movement and behavior. We illustrate this approach through a study of northern fur seal (Callorhinus ursinus) movement in the Bering Sea, Alaska, USA. Results show sex differentiation, with female northern fur seals exhibiting stronger response to environmental variables.

  18. Velocity-Based Movement Modeling for Individual and Population Level Inference

    PubMed Central

    Hanks, Ephraim M.; Hooten, Mevin B.; Johnson, Devin S.; Sterling, Jeremy T.

    2011-01-01

    Understanding animal movement and resource selection provides important information about the ecology of the animal, but an animal's movement and behavior are not typically constant in time. We present a velocity-based approach for modeling animal movement in space and time that allows for temporal heterogeneity in an animal's response to the environment, allows for temporal irregularity in telemetry data, and accounts for the uncertainty in the location information. Population-level inference on movement patterns and resource selection can then be made through cluster analysis of the parameters related to movement and behavior. We illustrate this approach through a study of northern fur seal (Callorhinus ursinus) movement in the Bering Sea, Alaska, USA. Results show sex differentiation, with female northern fur seals exhibiting stronger response to environmental variables. PMID:21931584

  19. Multi-host model and threshold of intermediate host Oncomelania snail density for eliminating schistosomiasis transmission in China.

    PubMed

    Zhou, Yi-Biao; Chen, Yue; Liang, Song; Song, Xiu-Xia; Chen, Geng-Xin; He, Zhong; Cai, Bin; Yihuo, Wu-Li; He, Zong-Gui; Jiang, Qing-Wu

    2016-08-18

    Schistosomiasis remains a serious public health issue in many tropical countries, with more than 700 million people at risk of infection. In China, a national integrated control strategy, aiming at blocking its transmission, has been carried out throughout endemic areas since 2005. A longitudinal study was conducted to determine the effects of different intervention measures on the transmission dynamics of S. japonicum in three study areas and the data were analyzed using a multi-host model. The multi-host model was also used to estimate the threshold of Oncomelania snail density for interrupting schistosomiasis transmission based on the longitudinal data as well as data from the national surveillance system for schistosomiasis. The data showed a continuous decline in the risk of human infection and the multi-host model fit the data well. The 25th, 50th and 75th percentiles, and the mean of estimated thresholds of Oncomelania snail density below which the schistosomiasis transmission cannot be sustained were 0.006, 0.009, 0.028 and 0.020 snails/0.11 m², respectively. The study results could help develop specific strategies of schistosomiasis control and elimination tailored to the local situation for each endemic area.

  20. Bivalves: From individual to population modelling

    NASA Astrophysics Data System (ADS)

    Saraiva, S.; van der Meer, J.; Kooijman, S. A. L. M.; Ruardij, P.

    2014-11-01

    An individual based population model for bivalves was designed, built and tested in a 0D approach, to simulate the population dynamics of a mussel bed located in an intertidal area. The processes at the individual level were simulated following the dynamic energy budget theory, whereas initial egg mortality, background mortality, food competition, and predation (including cannibalism) were additional population processes. Model properties were studied through the analysis of theoretical scenarios and by simulation of different mortality parameter combinations in a realistic setup, imposing environmental measurements. Realistic criteria were applied to narrow down the possible combination of parameter values. Field observations obtained in the long-term and multi-station monitoring program were compared with the model scenarios. The realistically selected modeling scenarios were able to reproduce reasonably the timing of some peaks in the individual abundances in the mussel bed and its size distribution but the number of individuals was not well predicted. The results suggest that the mortality in the early life stages (egg and larvae) plays an important role in population dynamics, either by initial egg mortality, larvae dispersion, settlement failure or shrimp predation. Future steps include the coupling of the population model with a hydrodynamic and biogeochemical model to improve the simulation of egg/larvae dispersion, settlement probability, food transport and also to simulate the feedback of the organisms' activity on the water column properties, which will result in an improvement of the food quantity and quality characterization.

  1. Novel Threshold Changeable Secret Sharing Schemes Based on Polynomial Interpolation.

    PubMed

    Yuan, Lifeng; Li, Mingchu; Guo, Cheng; Choo, Kim-Kwang Raymond; Ren, Yizhi

    2016-01-01

    After any distribution of secret sharing shadows in a threshold changeable secret sharing scheme, the threshold may need to be adjusted to deal with changes in the security policy and adversary structure. For example, when employees leave the organization, it is not realistic to expect departing employees to ensure the security of their secret shadows. Therefore, in 2012, Zhang et al. proposed (t → t', n) and ({t1, t2,⋯, tN}, n) threshold changeable secret sharing schemes. However, their schemes suffer from a number of limitations such as strict limit on the threshold values, large storage space requirement for secret shadows, and significant computation for constructing and recovering polynomials. To address these limitations, we propose two improved dealer-free threshold changeable secret sharing schemes. In our schemes, we construct polynomials to update secret shadows, and use two-variable one-way function to resist collusion attacks and secure the information stored by the combiner. We then demonstrate our schemes can adjust the threshold safely.
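    The polynomial-interpolation machinery underlying such schemes is easiest to see in the basic Shamir (t, n) construction, sketched below over a prime field; the threshold-changing mechanism and the two-variable one-way function of the proposed schemes are not reproduced here.

```python
import random

PRIME = 2_147_483_647          # a Mersenne prime large enough for a toy secret

def make_shares(secret, t, n, prime=PRIME):
    """Shamir (t, n) sharing: a random degree t-1 polynomial with f(0) = secret."""
    coeffs = [secret] + [random.randrange(prime) for _ in range(t - 1)]
    def f(x):
        return sum(c * pow(x, i, prime) for i, c in enumerate(coeffs)) % prime
    return [(x, f(x)) for x in range(1, n + 1)]

def recover(shares, prime=PRIME):
    """Lagrange interpolation at x = 0 using any t shares."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % prime
                den = (den * (xi - xj)) % prime
        secret = (secret + yi * num * pow(den, -1, prime)) % prime
    return secret

if __name__ == "__main__":
    shares = make_shares(secret=123456789, t=3, n=5)
    print("recovered from 3 of 5 shares:", recover(shares[:3]))
```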

  2. Statistical Analysis of SSMIS Sea Ice Concentration Threshold at the Arctic Sea Ice Edge during Summer Based on MODIS and Ship-Based Observational Data.

    PubMed

    Ji, Qing; Li, Fei; Pang, Xiaoping; Luo, Cong

    2018-04-05

    The threshold of sea ice concentration (SIC) is the basis for accurately calculating sea ice extent based on passive microwave (PM) remote sensing data. However, the PM SIC threshold at the sea ice edge used in previous studies and released sea ice products has not always been consistent. To explore the representable value of the PM SIC threshold corresponding on average to the position of the Arctic sea ice edge during summer in recent years, we extracted sea ice edge boundaries from the Moderate-resolution Imaging Spectroradiometer (MODIS) sea ice product (MOD29 with a spatial resolution of 1 km), MODIS images (250 m), and sea ice ship-based observation points (1 km) during the fifth (CHINARE-2012) and sixth (CHINARE-2014) Chinese National Arctic Research Expeditions, and made an overlay and comparison analysis with PM SIC derived from Special Sensor Microwave Imager Sounder (SSMIS, with a spatial resolution of 25 km) in the summer of 2012 and 2014. Results showed that the average SSMIS SIC threshold at the Arctic sea ice edge based on ice-water boundary lines extracted from MOD29 was 33%, which was higher than that of the commonly used 15% discriminant threshold. The average SIC threshold at sea ice edge based on ice-water boundary lines extracted by visual interpretation from four scenes of the MODIS image was 35% when compared to the average value of 36% from the MOD29 extracted ice edge pixels for the same days. The average SIC of 31% at the sea ice edge points extracted from ship-based observations also confirmed that choosing around 30% as the SIC threshold during summer is recommended for sea ice extent calculations based on SSMIS PM data. These results can provide a reference for further studying the variation of sea ice under the rapidly changing Arctic.

  3. [Research on the threshold of Chl-a in Lake Taihu based on microcystins].

    PubMed

    Wei, Dai-chun; Su, Jing; Ji, Dan-feng; Fu, Xiao-yong; Wang, Ji; Huo, Shou-liang; Cui, Chi-fei; Tang, Jun; Xi, Bei-dou

    2014-12-01

    Water samples were collected in Lake Taihu from June to October in 2013 in order to investigate the threshold of chlorophyll a (Chl-a). The concentrations of three microcystins isomers (MC-LR, MC-RR, MC-YR) were detected by means of solid phase extraction and high performance liquid chromatography-tandem mass spectrometry. The correlations between various MCs and eutrophication factors, for instance of total nitrogen (TN), total phosphorus (TP), chlorophyll a, permanganate index etc were analyzed. The threshold of Chl-a was studied based on the relationships between MC-LR, MCs and Chl-a. The results showed that Lake Taihu was severely polluted by MCs and its spatial distribution could be described as follows: the concentration in Meiliang Bay was the highest, followed by Gonghu Bay and Western Lake, and Lake Center; the least polluted areas were in Lake Xuhu and Southern Lake. The concentration of MC-LR was the highest among the 3 MCs. The correlation analysis indicated that MC-LR, MC-RR, MC-YR and MCs had very positive correlation with permanganate index, TN, TP and Chl-a (P < 0.01). The threshold value of Chl-a was 12.26 mg x m(-3) according to the standard thresholds of MC-LR and MCs in drinking water. The threshold value of Chl-a in Lake Taihu was very close to the standard in the State of North Carolina, which demonstrated that the threshold value provided in this study was reasonable.

  4. Threshold-based insulin-pump interruption for reduction of hypoglycemia.

    PubMed

    Bergenstal, Richard M; Klonoff, David C; Garg, Satish K; Bode, Bruce W; Meredith, Melissa; Slover, Robert H; Ahmann, Andrew J; Welsh, John B; Lee, Scott W; Kaufman, Francine R

    2013-07-18

    The threshold-suspend feature of sensor-augmented insulin pumps is designed to minimize the risk of hypoglycemia by interrupting insulin delivery at a preset sensor glucose value. We evaluated sensor-augmented insulin-pump therapy with and without the threshold-suspend feature in patients with nocturnal hypoglycemia. We randomly assigned patients with type 1 diabetes and documented nocturnal hypoglycemia to receive sensor-augmented insulin-pump therapy with or without the threshold-suspend feature for 3 months. The primary safety outcome was the change in the glycated hemoglobin level. The primary efficacy outcome was the area under the curve (AUC) for nocturnal hypoglycemic events. Two-hour threshold-suspend events were analyzed with respect to subsequent sensor glucose values. A total of 247 patients were randomly assigned to receive sensor-augmented insulin-pump therapy with the threshold-suspend feature (threshold-suspend group, 121 patients) or standard sensor-augmented insulin-pump therapy (control group, 126 patients). The changes in glycated hemoglobin values were similar in the two groups. The mean AUC for nocturnal hypoglycemic events was 37.5% lower in the threshold-suspend group than in the control group (980 ± 1200 mg per deciliter [54.4 ± 66.6 mmol per liter] × minutes vs. 1568 ± 1995 mg per deciliter [87.0 ± 110.7 mmol per liter] × minutes, P<0.001). Nocturnal hypoglycemic events occurred 31.8% less frequently in the threshold-suspend group than in the control group (1.5 ± 1.0 vs. 2.2 ± 1.3 per patient-week, P<0.001). The percentages of nocturnal sensor glucose values of less than 50 mg per deciliter (2.8 mmol per liter), 50 to less than 60 mg per deciliter (3.3 mmol per liter), and 60 to less than 70 mg per deciliter (3.9 mmol per liter) were significantly reduced in the threshold-suspend group (P<0.001 for each range). After 1438 instances at night in which the pump was stopped for 2 hours, the mean sensor glucose value was 92.6 ± 40.7 mg

  5. Modeling of Beams’ Multiple-Contact Mode with an Application in the Design of a High-g Threshold Microaccelerometer

    PubMed Central

    Li, Kai; Chen, Wenyuan; Zhang, Weiping

    2011-01-01

    Beam’s multiple-contact mode, characterized by multiple and discrete contact regions, non-uniform stoppers’ heights, irregular contact sequence, seesaw-like effect, indirect interaction between different stoppers, and complex coupling relationship between loads and deformation is studied. A novel analysis method and a novel high speed calculation model are developed for multiple-contact mode under mechanical load and electrostatic load, without limitations on stopper height and distribution, providing the beam has stepped or curved shape. Accurate values of deflection, contact load, contact region and so on are obtained directly, with a subsequent validation by CoventorWare. A new concept design of high-g threshold microaccelerometer based on multiple-contact mode is presented, featuring multiple acceleration thresholds of one sensitive component and consequently small sensor size. PMID:22163897

  6. Prediction of pKa Values for Neutral and Basic Drugs based on Hybrid Artificial Intelligence Methods.

    PubMed

    Li, Mengshan; Zhang, Huaijing; Chen, Bingsheng; Wu, Yan; Guan, Lixin

    2018-03-05

    The pKa value of drugs is an important parameter in drug design and pharmacology. In this paper, an improved particle swarm optimization (PSO) algorithm was proposed based on population entropy as a diversity measure. In the improved algorithm, when the population entropy was higher than the set maximum threshold, the convergence strategy was adopted; when the population entropy was lower than the set minimum threshold, the divergence strategy was adopted; when the population entropy was between the maximum and minimum threshold, the self-adaptive adjustment strategy was maintained. The improved PSO algorithm was applied in the training of a radial basis function artificial neural network (RBF ANN) model and the selection of molecular descriptors. A quantitative structure-activity relationship model based on RBF ANN trained by the improved PSO algorithm was proposed to predict the pKa values of 74 kinds of neutral and basic drugs and then validated by another database containing 20 molecules. The validation results showed that the model had a good prediction performance. The absolute average relative error, root mean square error, and squared correlation coefficient were 0.3105, 0.0411, and 0.9685, respectively. The model can be used as a reference for exploring other quantitative structure-activity relationships.
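    The switching logic described (a convergence strategy when population entropy is high, a divergence strategy when it is low, self-adaptation in between) can be sketched around a standard PSO loop; the entropy estimator, threshold values, inertia-weight strategies, and toy objective below are illustrative assumptions rather than the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(0)

def sphere(x):                      # toy objective; the paper trains an RBF ANN instead
    return float(np.sum(x ** 2))

def population_entropy(X, bins=10):
    """Shannon entropy of particle positions, averaged over dimensions (assumed estimator)."""
    H = 0.0
    for d in range(X.shape[1]):
        counts, _ = np.histogram(X[:, d], bins=bins)
        p = counts[counts > 0] / counts.sum()
        H += -(p * np.log(p)).sum()
    return H / X.shape[1]

def entropy_pso(f=sphere, dim=5, n=30, iters=200, h_low=0.8, h_high=1.8):
    X = rng.uniform(-5, 5, (n, dim))
    V = np.zeros((n, dim))
    pbest, pbest_val = X.copy(), np.array([f(x) for x in X])
    g = pbest[np.argmin(pbest_val)].copy()
    for _ in range(iters):
        H = population_entropy(X)
        if H > h_high:        # high diversity -> convergence strategy (small inertia)
            w = 0.3
        elif H < h_low:       # low diversity -> divergence strategy (large inertia)
            w = 0.9
        else:                 # in between -> self-adaptive interpolation
            w = 0.9 - 0.6 * (H - h_low) / (h_high - h_low)
        r1, r2 = rng.random((n, dim)), rng.random((n, dim))
        V = w * V + 1.5 * r1 * (pbest - X) + 1.5 * r2 * (g - X)
        X = X + V
        vals = np.array([f(x) for x in X])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = X[improved], vals[improved]
        g = pbest[np.argmin(pbest_val)].copy()
    return pbest_val.min()

if __name__ == "__main__":
    print(f"best objective value found: {entropy_pso():.2e}")
```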

  7. Automatic threshold selection for multi-class open set recognition

    NASA Astrophysics Data System (ADS)

    Scherreik, Matthew; Rigling, Brian

    2017-05-01

    Multi-class open set recognition is the problem of supervised classification with additional unknown classes encountered after a model has been trained. An open set classifier often has two core components. The first component is a base classifier which estimates the most likely class of a given example. The second component consists of open set logic which estimates whether the example is truly a member of the candidate class. Such a system is operated in a feed-forward fashion. That is, a candidate label is first estimated by the base classifier, and the true membership of the example to the candidate class is estimated afterward. Previous works have developed an iterative threshold selection algorithm for rejecting examples from classes which were not present at training time. In those studies, a Platt-calibrated SVM was used as the base classifier, and the thresholds were applied to class posterior probabilities for rejection. In this work, we investigate the effectiveness of other base classifiers when paired with the threshold selection algorithm and compare their performance with the original SVM solution.
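    The feed-forward structure described (a base classifier proposes a candidate label, then a threshold on the candidate-class posterior accepts or rejects it) can be sketched with any probabilistic base classifier; the simulated data, the logistic-regression base classifier, and the fixed per-class thresholds below are illustrative and do not implement the iterative threshold-selection algorithm.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Two known training classes; a third, unseen class appears only at test time.
X_train = np.vstack([rng.normal(0, 1, (200, 2)), rng.normal(5, 1, (200, 2))])
y_train = np.array([0] * 200 + [1] * 200)
X_test = np.vstack([rng.normal(0, 1, (50, 2)),
                    rng.normal(5, 1, (50, 2)),
                    rng.normal((2.5, 8.0), 1, (50, 2))])   # unknown class

base = LogisticRegression().fit(X_train, y_train)          # base classifier
thresholds = {0: 0.90, 1: 0.90}                             # per-class acceptance thresholds (assumed)

posteriors = base.predict_proba(X_test)
candidates = posteriors.argmax(axis=1)                      # step 1: candidate label
accepted = posteriors.max(axis=1) >= np.array([thresholds[c] for c in candidates])
labels = np.where(accepted, candidates, -1)                 # step 2: -1 marks "unknown"

print("fraction of test points rejected as unknown:", float(np.mean(labels == -1)))
```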

  8. Modeling the effect of surgical sterilization on owned dog population size in Villa de Tezontepec, Hidalgo, Mexico, using an individual-based computer simulation model

    PubMed Central

    Kisiel, Luz Maria; Jones-Bitton, Andria; Sargeant, Jan M.; Coe, Jason B.; Flockhart, D. T. Tyler; Canales Vargas, Erick J.

    2018-01-01

    Surgical sterilization programs for dogs have been proposed as interventions to control dog population size. Models can be used to help identify the long-term impact of reproduction control interventions for dogs. The objective of this study was to determine the projected impact of surgical sterilization interventions on the owned dog population size in Villa de Tezontepec, Hidalgo, Mexico. A stochastic, individual-based simulation model was constructed and parameterized using a combination of empirical data collected on the demographics of owned dogs in Villa de Tezontepec and data available from the peer-reviewed literature. Model outcomes were assessed using a 20-year time horizon. The model was used to examine the effect of surgical sterilization strategies focused on: 1) dogs of any age and sex, 2) female dogs of any age, 3) young dogs (i.e., not yet reached sexual maturity) of any sex, and 4) young, female dogs. Model outcomes suggested that as surgical capacity increases from 21 to 84 surgeries/month (8.6% to 34.5% annual sterilization) for dogs of any age, the mean dog population size after 20 years was reduced by between 14% and 79% compared to the base case scenario (i.e., in the absence of intervention). Surgical sterilization interventions focused only on young dogs of any sex yielded greater reductions (81% - 90%) in the mean population size, depending on the level of surgical capacity. More focused sterilization targeted at female dogs of any age resulted in reductions that were similar to mixed-sex sterilization of only young dogs (82% - 92%). The greatest mean reduction in population size (90% - 91%) was associated with sterilization of only young, female dogs. Our model suggests that targeting sterilization to young females could enhance the efficacy of existing surgical dog population control interventions in this location, without investing extra resources. PMID:29856830

  9. Pathways to extinction: beyond the error threshold.

    PubMed

    Manrubia, Susanna C; Domingo, Esteban; Lázaro, Ester

    2010-06-27

    Since the introduction of the quasispecies and the error catastrophe concepts for molecular evolution by Eigen and their subsequent application to viral populations, increased mutagenesis has become a common strategy to cause the extinction of viral infectivity. Nevertheless, the high complexity of virus populations has shown that viral extinction can occur through several other pathways apart from crossing an error threshold. Increases in the mutation rate enhance the appearance of defective forms and promote the selection of mechanisms that are able to counteract the accelerated appearance of mutations. Current models of viral evolution take into account more realistic scenarios that consider compensatory and lethal mutations, a highly redundant genotype-to-phenotype map, rough fitness landscapes relating phenotype and fitness, and where phenotype is described as a set of interdependent traits. Further, viral populations cannot be understood without specifying the characteristics of the environment where they evolve and adapt. Altogether, it turns out that the pathways through which viral quasispecies go extinct are multiple and diverse.

  10. Incorporating evolutionary processes into population viability models.

    PubMed

    Pierson, Jennifer C; Beissinger, Steven R; Bragg, Jason G; Coates, David J; Oostermeijer, J Gerard B; Sunnucks, Paul; Schumaker, Nathan H; Trotter, Meredith V; Young, Andrew G

    2015-06-01

    We examined how ecological and evolutionary (eco-evo) processes in population dynamics could be better integrated into population viability analysis (PVA). Complementary advances in computation and population genomics can be combined into an eco-evo PVA to offer powerful new approaches to understand the influence of evolutionary processes on population persistence. We developed the mechanistic basis of an eco-evo PVA using individual-based models with individual-level genotype tracking and dynamic genotype-phenotype mapping to model emergent population-level effects, such as local adaptation and genetic rescue. We then outline how genomics can allow or improve parameter estimation for PVA models by providing genotypic information at large numbers of loci for neutral and functional genome regions. As climate change and other threatening processes increase in rate and scale, eco-evo PVAs will become essential research tools to evaluate the effects of adaptive potential, evolutionary rescue, and locally adapted traits on persistence. © 2014 Society for Conservation Biology.

  11. Impact of in-Sewer Degradation of Pharmaceutical and Personal Care Products (PPCPs) Population Markers on a Population Model.

    PubMed

    O'Brien, Jake William; Banks, Andrew Phillip William; Novic, Andrew Joseph; Mueller, Jochen F; Jiang, Guangming; Ort, Christoph; Eaglesham, Geoff; Yuan, Zhiguo; Thai, Phong K

    2017-04-04

    A key uncertainty of wastewater-based epidemiology is the size of the population which contributed to a given wastewater sample. We previously developed and validated a Bayesian inference model to estimate population size based on 14 population markers which: (1) are easily measured and (2) have mass loads which correlate with population size. However, the potential uncertainty of the model prediction due to in-sewer degradation of these markers was not evaluated. In this study, we addressed this gap by testing their stability under sewer conditions and assessed whether degradation impacts the model estimates. Five markers, which formed the core of our model, were stable in the sewers while the others were not. Our evaluation showed that the presence of unstable population markers in the model did not decrease the precision of the population estimates provided that stable markers such as acesulfame remained in the model. However, to achieve the minimum uncertainty in population estimates, we propose that the core markers to be included in population models for other sites should meet two additional criteria: (3) negligible degradation in wastewater, to ensure the stability of chemicals during collection; and (4) less than 10% in-sewer degradation during the mean residence time of the sewer network.

  12. Definition of temperature thresholds: the example of the French heat wave warning system.

    PubMed

    Pascal, Mathilde; Wagner, Vérène; Le Tertre, Alain; Laaidi, Karine; Honoré, Cyrille; Bénichou, Françoise; Beaudeau, Pascal

    2013-01-01

    Heat-related deaths should be somewhat preventable. In France, some prevention measures are activated when minimum and maximum temperatures averaged over three days reach city-specific thresholds. The current thresholds were computed based on a descriptive analysis of past heat waves and on local expert judgement. We tested whether a different method would confirm these thresholds. The study was set in the six cities of Paris, Lyon, Marseille, Nantes, Strasbourg and Limoges between 1973 and 2003. For each city, we estimated the excess mortality associated with different temperature thresholds, using a generalised additive model controlling for long-term trends, seasons and days of the week. These models were used to compute the mortality predicted by different percentiles of temperatures. The thresholds were chosen as the percentiles associated with a significant excess mortality. In all cities, there was a good correlation between the current thresholds and the thresholds derived from the models, with 0°C to 3°C differences for averaged maximum temperatures. Both sets of thresholds were able to anticipate the main periods of excess mortality during the summers of 1973 to 2003. A simple method relying on descriptive analysis and expert judgement is sufficient to define protective temperature thresholds and to prevent heat wave mortality. As temperatures increase with climate change and adaptation is ongoing, more research is required to understand if and when thresholds should be modified.
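    A simplified stand-in for the percentile-based procedure is sketched below: fit a mortality regression, predict mortality at increasing temperature percentiles, and take the first percentile whose predicted mortality exceeds the baseline by some margin. The Poisson GLM (in place of the study's generalised additive model), the covariates, the 10% excess rule, and the column names are all assumptions for illustration.

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm
        import statsmodels.formula.api as smf

        def threshold_from_percentiles(df, percentiles=range(90, 100)):
            # df columns assumed: 'deaths', 'tmax3' (3-day mean of max temperature),
            # 'year' (long-term trend proxy), 'dow' (day of week)
            fit = smf.glm("deaths ~ tmax3 + year + C(dow)",
                          data=df, family=sm.families.Poisson()).fit()
            baseline = df["deaths"].mean()
            for p in percentiles:
                t = np.percentile(df["tmax3"], p)
                new = pd.DataFrame({"tmax3": [t], "year": [df["year"].median()],
                                    "dow": [df["dow"].iloc[0]]})
                if float(np.asarray(fit.predict(new))[0]) > 1.1 * baseline:
                    return t          # candidate warning threshold (degrees C)
            return None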

  13. Uncovering contrast categories in categorization with a probabilistic threshold model.

    PubMed

    Verheyen, Steven; De Deyne, Simon; Dry, Matthew J; Storms, Gert

    2011-11-01

    A contrast category effect on categorization occurs when the decision to apply a category term to an entity not only involves a comparison between the entity and the target category but is also influenced by a comparison of the entity with 1 or more alternative categories from the same domain as the target. Establishing a contrast category effect on categorization in natural language categories has proven to be laborious, especially when the categories concerned are natural kinds situated at the superordinate level of abstraction. We conducted 3 studies with these categories to look for an influence on categorization of both similarity to the target category and similarity to a contrast category. The results are analyzed with a probabilistic threshold model that assumes categorization decisions arise from the placement of threshold criteria by individual categorizers along a single scale that holds the experimental stimuli. The stimuli's positions along the scale are shown to be influenced by similarity to both target and contrast. These findings suggest that the prevalence of contrast category effects on categorization might have been underestimated. Additional analyses demonstrate how the proposed model can be employed in future studies to systematically investigate the origins of contrast category effects on categorization.
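    One schematic way to formalize the model described above, offered only as an illustration and not necessarily the published parameterization, is

        P(y_{ij} = 1) = F(\theta_i - \tau_j), \qquad
        \theta_i = \beta_T \, s_{\mathrm{target}}(i) - \beta_C \, s_{\mathrm{contrast}}(i),

    where y_{ij} indicates that categorizer j applies the category term to exemplar i, \theta_i is the exemplar's position on the single latent scale, \tau_j is the categorizer's threshold criterion, F is a cumulative response function (e.g., logistic), and the two similarity terms carry the target and contrast category influences.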

  14. Low dimensional model of heart rhythm dynamics as a tool for diagnosing the anaerobic threshold

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anosov, O.L.; Butkovskii, O.Y.; Kadtke, J.

    We report preliminary results on describing the dependence of heart rhythm variability on the stress level by using qualitative, low dimensional models. The reconstruction of macroscopic heart models yielding the duration of cardio cycles (RR intervals) was based on actual clinical data. Our results show that the coefficients of the low dimensional models are sensitive to metabolic changes. In particular, at the transition between aerobic and aerobic-anaerobic metabolism, there are pronounced extrema in the functional dependence of the coefficients on the stress level. This strong sensitivity can be used to design an easy indirect method for determining the anaerobic threshold. This method could replace costly and invasive traditional methods such as gas analysis and blood tests. © 1997 American Institute of Physics.

  15. Probability of an Abnormal Screening PSA Result Based on Age, Race, and PSA Threshold

    PubMed Central

    Espaldon, Roxanne; Kirby, Katharine A.; Fung, Kathy Z.; Hoffman, Richard M.; Powell, Adam A.; Freedland, Stephen J.; Walter, Louise C.

    2014-01-01

    Objective To determine the distribution of screening PSA values in older men and how different PSA thresholds affect the proportion of white, black, and Latino men who would have an abnormal screening result across advancing age groups. Methods We used linked national VA and Medicare data to determine the value of the first screening PSA test (ng/mL) of 327,284 men age 65+ who underwent PSA screening in the VA healthcare system in 2003. We calculated the proportion of men with an abnormal PSA result based on age, race, and common PSA thresholds. Results Among men age 65+, 8.4% had a PSA >4.0 ng/mL. The percentage of men with a PSA >4.0 ng/mL increased with age and was highest in black men (13.8%) versus white (8.0%) or Latino men (10.0%) (P<0.001). Combining age and race, the probability of having a PSA >4.0 ng/mL ranged from 5.1% of Latino men age 65–69 to 27.4% of black men age 85+. Raising the PSA threshold from >4.0 ng/mL to >10.0 ng/mL reclassified the greatest percentage of black men age 85+ (18.3% absolute change) and the lowest percentage of Latino men age 65–69 (4.8% absolute change) as being under the biopsy threshold (P<0.001). Conclusions Age, race, and PSA threshold together affect the pre-test probability of an abnormal screening PSA result. Based on screening PSA distributions, stopping screening among men whose PSA is <3 ng/mL means over 80% of white and Latino men age 70+ would stop further screening, and increasing the biopsy threshold to >10 ng/mL has the greatest effect on reducing the number of older black men who will face biopsy decisions after screening. PMID:24439009

  16. Structured decision making as a conceptual framework to identify thresholds for conservation and management

    USGS Publications Warehouse

    Martin, J.; Runge, M.C.; Nichols, J.D.; Lubow, B.C.; Kendall, W.L.

    2009-01-01

    Thresholds and their relevance to conservation have become a major topic of discussion in the ecological literature. Unfortunately, in many cases the lack of a clear conceptual framework for thinking about thresholds may have led to confusion in attempts to apply the concept of thresholds to conservation decisions. Here, we advocate a framework for thinking about thresholds in terms of a structured decision making process. The purpose of this framework is to promote a logical and transparent process for making informed decisions for conservation. Specification of such a framework leads naturally to consideration of definitions and roles of different kinds of thresholds in the process. We distinguish among three categories of thresholds. Ecological thresholds are values of system state variables at which small changes bring about substantial changes in system dynamics. Utility thresholds are components of management objectives (determined by human values) and are values of state or performance variables at which small changes yield substantial changes in the value of the management outcome. Decision thresholds are values of system state variables at which small changes prompt changes in management actions in order to reach specified management objectives. The approach that we present focuses directly on the objectives of management, with an aim to providing decisions that are optimal with respect to those objectives. This approach clearly distinguishes the components of the decision process that are inherently subjective (management objectives, potential management actions) from those that are more objective (system models, estimates of system state). Optimization based on these components then leads to decision matrices specifying optimal actions to be taken at various values of system state variables. Values of state variables separating different actions in such matrices are viewed as decision thresholds. Utility thresholds are included in the objectives

  17. Population balance modeling: current status and future prospects.

    PubMed

    Ramkrishna, Doraiswami; Singh, Meenesh R

    2014-01-01

    Population balance modeling is undergoing phenomenal growth in its applications, and this growth is accompanied by multifarious reviews. This review aims to fortify the model's fundamental base, as well as point to a variety of new applications, including modeling of crystal morphology, cell growth and differentiation, gene regulatory processes, and transfer of drug resistance. This is accomplished by presenting the many faces of population balance equations that arise in the foregoing applications.
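    For reference, the generic one-dimensional population balance equation underlying these applications can be written in the standard textbook form

        \frac{\partial n(x,t)}{\partial t} + \frac{\partial}{\partial x}\big[ G(x,t)\, n(x,t) \big] = B(x,t) - D(x,t),

    where n(x,t) is the number density over an internal coordinate x (particle size, cell mass, etc.), G is the rate of continuous change along x, and B and D are birth and death terms arising from, e.g., nucleation, aggregation, breakage, or cell division. The specific source terms depend on the application.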

  18. Connecting micro dynamics and population distributions in system dynamics models

    PubMed Central

    Rahmandad, Hazhir; Chen, Hsin-Jen; Xue, Hong; Wang, Youfa

    2014-01-01

    Researchers use system dynamics models to capture the mean behavior of groups of indistinguishable population elements (e.g., people) aggregated in stock variables. Yet, many modeling problems require capturing the heterogeneity across elements with respect to some attribute(s) (e.g., body weight). This paper presents a new method to connect the micro-level dynamics associated with elements in a population with the macro-level population distribution along an attribute of interest without the need to explicitly model every element. We apply the proposed method to model the distribution of Body Mass Index and its changes over time in a sample population of American women obtained from the U.S. National Health and Nutrition Examination Survey. Comparing the results with those obtained from an individual-based model that captures the same phenomena shows that our proposed method delivers accurate results with less computation than the individual-based model. PMID:25620842

  19. The asymmetric reactions of mean and volatility of stock returns to domestic and international information based on a four-regime double-threshold GARCH model

    NASA Astrophysics Data System (ADS)

    Chen, Cathy W. S.; Yang, Ming Jing; Gerlach, Richard; Jim Lo, H.

    2006-07-01

    In this paper, we investigate the asymmetric reactions of mean and volatility of stock returns in five major markets to their own local news and the US information via linear and nonlinear models. We introduce a four-regime Double-Threshold GARCH (DTGARCH) model, which allows asymmetry in both the conditional mean and variance equations simultaneously by employing two threshold variables, to analyze the stock markets’ reactions to different types of information (good/bad news) generated from the domestic markets and the US stock market. By applying the four-regime DTGARCH model, this study finds that the interaction between the information of domestic and US stock markets leads to the asymmetric reactions of stock returns and their variability. In addition, this research also finds that the positive autocorrelation reported in the previous studies of financial markets may in fact be mis-specified, and actually due to the local market's positive response to the US stock market.
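    A schematic four-regime double-threshold GARCH(1,1) specification consistent with this description (lag orders, the innovation distribution, and the notation are illustrative; the paper's exact model may differ) is

        r_t = \phi_0^{(j)} + \sum_{i=1}^{p_j} \phi_i^{(j)} r_{t-i} + a_t, \qquad
        a_t = \sqrt{h_t}\,\varepsilon_t, \quad \varepsilon_t \sim \mathrm{iid}(0,1),

        h_t = \alpha_0^{(j)} + \alpha_1^{(j)} a_{t-1}^2 + \beta_1^{(j)} h_{t-1},

    where the regime j = 1, ..., 4 is selected according to whether the domestic-news threshold variable and the US-news threshold variable lie below or above their respective threshold values, giving the four good/bad news combinations; all mean and variance coefficients are regime-specific.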

  20. A new ODE tumor growth modeling based on tumor population dynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oroji, Amin; Omar, Mohd bin; Yarahmadian, Shantia

    2015-10-22

    In this paper, a new mathematical model for the population of tumor cells treated by radiation is proposed. The population dynamics of cells in each state and the dynamics of the whole tumor population are studied. Furthermore, a new definition of tumor lifespan is presented. Finally, the effects of two main parameters, the treatment parameter (q) and the repair mechanism parameter (r), on tumor lifespan are probed, and it is shown that a change in the treatment parameter (q) strongly affects the tumor lifespan.

  1. Anaerobic Threshold by Mathematical Model in Healthy and Post-Myocardial Infarction Men.

    PubMed

    Novais, L D; Silva, E; Simões, R P; Sakabe, D I; Martins, L E B; Oliveira, L; Diniz, C A R; Gallo, L; Catai, A M

    2016-02-01

    The aim of this study was to determine the anaerobic threshold (AT) in a population of healthy and post-myocardial infarction men by applying Hinkley's mathematical method and comparing its performance to the ventilatory visual method. This mathematical model, in lieu of observer-dependent visual determination, can produce more reliable results due to the uniformity of the procedure. 17 middle-aged men (55±3 years) were studied in 2 groups: 9 healthy men (54±2 years) and 8 men with previous myocardial infarction (57±3 years). All subjects underwent an incremental ramp exercise test until physical exhaustion. Breath-by-breath ventilatory variables, heart rate (HR), and the vastus lateralis surface electromyography (sEMG) signal were collected throughout the test. Carbon dioxide output (V˙CO2), HR, and sEMG were studied, and the AT determination methods were compared using correlation coefficients and Bland-Altman plots. Parametric statistical tests were applied with the significance level set at 5%. No significant differences were found between methods in the HR, sEMG, and ventilatory variables at AT, nor in the intensity of effort relative to AT. Moreover, important concordance and significant correlations were observed between the methods. We concluded that the mathematical model was suitable for detecting the AT in both healthy and myocardial infarction subjects. © Georg Thieme Verlag KG Stuttgart · New York.
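    The breakpoint idea can be conveyed by a generic two-segment piecewise-linear fit that locates the change point minimizing the total squared error, shown below. This is a simplified stand-in for the purpose of illustration, not Hinkley's actual method or the authors' implementation.

        import numpy as np

        def breakpoint_fit(t, y):
            # t, y: 1-D arrays, e.g., time and V'CO2; returns the time of the
            # best-fitting breakpoint under a two-segment linear model.
            n = len(t)
            best_sse, best_t = np.inf, None
            for k in range(2, n - 2):                    # candidate change points
                sse = 0.0
                for i, j in ((0, k), (k, n)):
                    A = np.column_stack([t[i:j], np.ones(j - i)])
                    coef, *_ = np.linalg.lstsq(A, y[i:j], rcond=None)
                    sse += np.sum((A @ coef - y[i:j]) ** 2)
                if sse < best_sse:
                    best_sse, best_t = sse, t[k]
            return best_t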

  2. Differential Equation Models for Sharp Threshold Dynamics

    DTIC Science & Technology

    2012-08-01

    [Recovered fragments of the DTIC report form] The report concerns modeling sharp threshold dynamics using differential equations, with examples including an S-I-R epidemic in which a detection event drastically changes dynamics, and the Lanchester model of armed conflict, in which the loss of a key capability drastically changes dynamics. Subject terms: Differential Equations, Markov Population Process, S-I-R Epidemic, Lanchester Model.
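    To illustrate the kind of sharp-threshold dynamics the report describes, the sketch below runs an S-I-R model whose transmission rate drops once the detected (here, recovered) fraction crosses a threshold, using a smooth logistic step. The parameter values, the use of the recovered compartment as a detection proxy, and the step form are assumptions for illustration only, not the report's derivation.

        import numpy as np
        from scipy.integrate import solve_ivp

        def sir_with_detection(t, x, beta=0.4, gamma=0.1, thresh=0.05, k=200.0):
            S, I, R = x
            step = 1.0 / (1.0 + np.exp(-k * (R - thresh)))  # ~0 below, ~1 above threshold
            beta_eff = beta * (1.0 - 0.8 * step)            # control kicks in after detection
            return [-beta_eff * S * I, beta_eff * S * I - gamma * I, gamma * I]

        sol = solve_ivp(sir_with_detection, (0, 160), [0.99, 0.01, 0.0], max_step=0.5)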

  3. Novel Threshold Changeable Secret Sharing Schemes Based on Polynomial Interpolation

    PubMed Central

    Li, Mingchu; Guo, Cheng; Choo, Kim-Kwang Raymond; Ren, Yizhi

    2016-01-01

    After any distribution of secret sharing shadows in a threshold changeable secret sharing scheme, the threshold may need to be adjusted to deal with changes in the security policy and adversary structure. For example, when employees leave the organization, it is not realistic to expect departing employees to ensure the security of their secret shadows. Therefore, in 2012, Zhang et al. proposed (t → t′, n) and ({t1, t2,⋯, tN}, n) threshold changeable secret sharing schemes. However, their schemes suffer from a number of limitations, such as a strict limit on the threshold values, a large storage space requirement for secret shadows, and significant computation for constructing and recovering polynomials. To address these limitations, we propose two improved dealer-free threshold changeable secret sharing schemes. In our schemes, we construct polynomials to update secret shadows, and use a two-variable one-way function to resist collusion attacks and secure the information stored by the combiner. We then demonstrate that our schemes can adjust the threshold safely. PMID:27792784
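    For background, the polynomial-interpolation sharing that this family of schemes builds on is standard Shamir (t, n) secret sharing, sketched below over a prime field. This is only the classical construction, not the authors' dealer-free threshold-changeable scheme.

        import random

        P = 2**127 - 1                                  # a large prime modulus

        def make_shares(secret, t, n):
            # Degree-(t-1) polynomial with the secret as constant term; share i is f(i).
            coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
            return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
                    for x in range(1, n + 1)]

        def recover(shares):
            # Lagrange interpolation at x = 0 recovers the constant term (the secret).
            secret = 0
            for j, (xj, yj) in enumerate(shares):
                num, den = 1, 1
                for m, (xm, _) in enumerate(shares):
                    if m != j:
                        num = (num * -xm) % P
                        den = (den * (xj - xm)) % P
                secret = (secret + yj * num * pow(den, P - 2, P)) % P
            return secret

        # recover(make_shares(1234, t=3, n=5)[:3]) == 1234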

  4. Fractal Approach to Erosion Threshold of Bentonites

    NASA Astrophysics Data System (ADS)

    Xu, Y. F.; Li, X. Y.

    Bentonite has been considered as a candidate buffer material for the disposal of high-level radioactive waste (HLW) because of its low permeability, high sorption capacity, self-sealing characteristics and durability in a natural environment. Bentonite erosion caused by groundwater flow may take place at the interface between the compacted bentonite and the fractured granite. Surface erosion of bentonite flocs is typically represented by an erosion threshold. Predicting the erosion threshold of bentonite flocs requires taking into account cohesion, which results from interactions between clay particles. Beyond the usual dependence on grain size, a significant correlation between erosion threshold and porosity measurements is confirmed for bentonite flocs. A fractal model for the erosion threshold of bentonite flocs is proposed. Cohesive forces, namely the long-range van der Waals interactions between clay particles, are taken as the source of the erosion threshold. Model verification is conducted by comparison with experiments published in the literature. The results show that the proposed model for the erosion threshold is in good agreement with the experimental data.

  5. Thresholds in forest bird occurrence as a function of the amount of early-seral broadleaf forest at landscape scales

    USGS Publications Warehouse

    Betts, M.G.; Hagar, J.C.; Rivers, J.W.; Alexander, J.D.; McGarigal, K.; McComb, B.C.

    2010-01-01

    Recent declines in broadleaf-dominated, early-seral forest globally as a function of intensive forest management and/or fire suppression have raised concern about the viability of populations dependent on such forest types. However, quantitative information about the strength and direction of species associations with broadleaf cover at landscape scales is rare. Uncovering such habitat relationships is essential for understanding the demography of species and for developing sound conservation strategies. It is particularly important to detect points in habitat reduction where rates of population decline may accelerate or the likelihood of species occurrence drops rapidly (i.e., thresholds). Here, we use a large avian point-count data set (N = 4375) from southwestern and northwestern Oregon along with segmented logistic regression to test for thresholds in forest bird occurrence as a function of broadleaf forest and early-seral broadleaf forest at local (150-m radius) and landscape (500–2000-m radius) scales. All 12 bird species examined showed positive responses to broadleaf forest in general and/or early-seral broadleaf forest. However, regional variation in species response to these conditions was high. We found considerable evidence for landscape thresholds in bird species occurrence as a function of broadleaf cover; threshold models received substantially greater support than linear models for eight of 12 species. Landscape thresholds in broadleaf forest ranged broadly from 1.35% to 24.55% mean canopy cover. Early-seral broadleaf thresholds tended to be much lower (0.22–1.87%). We found a strong negative relationship between the strength of species association with early-seral broadleaf forest and 42-year bird population trends; species most associated with this forest type have declined at the greatest rates. Taken together, these results provide the first support for the hypothesis that reductions in broadleaf-dominated early-seral forest due to

  6. An Improved Compressive Sensing and Received Signal Strength-Based Target Localization Algorithm with Unknown Target Population for Wireless Local Area Networks.

    PubMed

    Yan, Jun; Yu, Kegen; Chen, Ruizhi; Chen, Liang

    2017-05-30

    In this paper a two-phase compressive sensing (CS) and received signal strength (RSS)-based target localization approach is proposed to improve position accuracy by dealing with the unknown target population and the effect of grid dimensions on position error. In the coarse localization phase, by formulating target localization as a sparse signal recovery problem, grids with recovery vector components greater than a threshold are chosen as the candidate target grids. In the fine localization phase, by partitioning each candidate grid, the target position in a grid is iteratively refined by using the minimum residual error rule and the least-squares technique. When all the candidate target grids are iteratively partitioned and the measurement matrix is updated, the recovery vector is re-estimated. Threshold-based detection is employed again to determine the target grids and hence the target population. As a consequence, both the target population and the position estimation accuracy can be significantly improved. Simulation results demonstrate that the proposed approach achieves the best accuracy among all the algorithms compared.
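    The coarse phase can be sketched as follows: model the RSS measurements as a sparse combination of per-grid path-loss signatures, run a sparse recovery step, and keep the grids whose recovered coefficients exceed a threshold. The log-distance path-loss parameters, the grid layout, and the use of orthogonal matching pursuit as the recovery solver are assumptions for illustration; the fine (grid-partitioning) phase is omitted.

        import numpy as np
        from sklearn.linear_model import OrthogonalMatchingPursuit

        def coarse_localize(rss, sensors, grid, p0=-40.0, n_exp=3.0, rel_thresh=0.5):
            # rss: (M,) measurements; sensors: (M, 2) positions; grid: (G, 2) grid centers.
            d = np.linalg.norm(sensors[:, None, :] - grid[None, :, :], axis=2)
            d = np.maximum(d, 1e-3)                       # avoid log of zero distance
            Phi = p0 - 10.0 * n_exp * np.log10(d)         # dictionary of RSS signatures
            omp = OrthogonalMatchingPursuit(n_nonzero_coefs=5).fit(Phi, rss)
            theta = np.abs(omp.coef_)
            return np.where(theta >= rel_thresh * theta.max())[0]  # candidate grid indices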

  7. Computational Model of Population Dynamics Based on the Cell Cycle and Local Interactions

    NASA Astrophysics Data System (ADS)

    Oprisan, Sorinel Adrian; Oprisan, Ana

    2005-03-01

    Our study bridges cellular (mesoscopic) level interactions and global population (macroscopic) dynamics of carcinoma. The morphological differences and transitions between well-defined, smooth benign tumors and tentacular malignant tumors suggest a theoretical analysis of tumor invasion based on the development of mathematical models exhibiting bifurcations of spatial patterns in the density of tumor cells. Our computational model views the most representative and clinically relevant features of oncogenesis as a fight between two distinct sub-systems: the immune system of the host and the neoplastic system. We implemented the neoplastic sub-system using a three-stage cell cycle: active, dormant, and necrosis. The second sub-system consists of cytotoxic active (effector) cells — EC, with a very broad phenotype ranging from NK cells to CTL cells, macrophages, etc. Based on extensive numerical simulations, we correlated the fractal dimensions for carcinoma, which could be obtained from tumor imaging, with the malignant stage. Our computational model was also able to simulate the effects of surgical, chemotherapeutic, and radiotherapeutic treatments.

  8. An energy budget agent-based model of earthworm populations and its application to study the effects of pesticides

    PubMed Central

    Johnston, A.S.A.; Hodson, M.E.; Thorbek, P.; Alvarez, T.; Sibly, R.M.

    2014-01-01

    Earthworms are important organisms in soil communities and so are used as model organisms in environmental risk assessments of chemicals. However current risk assessments of soil invertebrates are based on short-term laboratory studies, of limited ecological relevance, supplemented if necessary by site-specific field trials, which sometimes are challenging to apply across the whole agricultural landscape. Here, we investigate whether population responses to environmental stressors and pesticide exposure can be accurately predicted by combining energy budget and agent-based models (ABMs), based on knowledge of how individuals respond to their local circumstances. A simple energy budget model was implemented within each earthworm Eisenia fetida in the ABM, based on a priori parameter estimates. From broadly accepted physiological principles, simple algorithms specify how energy acquisition and expenditure drive life cycle processes. Each individual allocates energy between maintenance, growth and/or reproduction under varying conditions of food density, soil temperature and soil moisture. When simulating published experiments, good model fits were obtained to experimental data on individual growth, reproduction and starvation. Using the energy budget model as a platform we developed methods to identify which of the physiological parameters in the energy budget model (rates of ingestion, maintenance, growth or reproduction) are primarily affected by pesticide applications, producing four hypotheses about how toxicity acts. We tested these hypotheses by comparing model outputs with published toxicity data on the effects of copper oxychloride and chlorpyrifos on E. fetida. Both growth and reproduction were directly affected in experiments in which sufficient food was provided, whilst maintenance was targeted under food limitation. Although we only incorporate toxic effects at the individual level we show how ABMs can readily extrapolate to larger scales by providing

  9. Terrestrial Microgravity Model and Threshold Gravity Simulation using Magnetic Levitation

    NASA Technical Reports Server (NTRS)

    Ramachandran, N.

    2005-01-01

    What is the threshold gravity (minimum gravity level) required for the nominal functioning of the human system? What dosage is required? Do human cell lines behave differently in microgravity in response to an external stimulus? The critical need for such a gravity simulator is emphasized by recent experiments on human epithelial cells and lymphocytes on the Space Shuttle clearly showing that cell growth and function are markedly different from those observed terrestrially. Those differences are also dramatic between cells grown in space and those in Rotating Wall Vessels (RWV), or NASA bioreactor often used to simulate microgravity, indicating that although morphological growth patterns (three dimensional growth) can be successfully simulated using RWVs, cell function performance is not reproduced - a critical difference. If cell function is dramatically affected by gravity off-loading, then cell response to stimuli such as radiation, stress, etc. can be very different from terrestrial cell lines. Yet, we have no good gravity simulator for use in study of these phenomena. This represents a profound shortcoming for countermeasures research. We postulate that we can use magnetic levitation of cells and tissue, through the use of strong magnetic fields and field gradients, as a terrestrial microgravity model to study human cells. Specific objectives of the research are: 1. To develop a tried, tested and benchmarked terrestrial microgravity model for cell culture studies; 2. Gravity threshold determination; 3. Dosage (magnitude and duration) of g-level required for nominal functioning of cells; 4. Comparisons of magnetic levitation model to other models such as RWV, hind limb suspension, etc. and 5. Cellular response to reduced gravity levels of Moon and Mars. The paper will discuss experiments and modeling work to date in support of this project.

  10. Double Photoionization Near Threshold

    NASA Technical Reports Server (NTRS)

    Wehlitz, Ralf

    2007-01-01

    The threshold region of the double-photoionization cross section is of particular interest because both ejected electrons move slowly in the Coulomb field of the residual ion. Near threshold both electrons have time to interact with each other and with the residual ion. Also, different theoretical models compete to describe the double-photoionization cross section in the threshold region. We have investigated that cross section for lithium and beryllium and have analyzed our data with respect to the latest results in the Coulomb-dipole theory. We find that our data support the idea of a Coulomb-dipole interaction.

  11. A climate-driven mechanistic population model of Aedes albopictus with diapause.

    PubMed

    Jia, Pengfei; Lu, Liang; Chen, Xiang; Chen, Jin; Guo, Li; Yu, Xiao; Liu, Qiyong

    2016-03-24

    The mosquito Aedes albopictus is a competent vector for the transmission of many blood-borne pathogens. An important factor that affects the mosquitoes' development and spreading is climate, including temperature, precipitation and photoperiod. Existing climate-driven mechanistic models overlook the seasonal pattern of diapause, referred to as the survival strategy of mosquito eggs being dormant and unable to hatch under extreme weather. With respect to diapause, several issues remain unaddressed, including identifying the time when diapause eggs are laid and hatched under different climatic conditions, demarcating the thresholds of diapause and non-diapause periods, and considering the mortality rate of diapause eggs. Here we propose a generic climate-driven mechanistic population model of Ae. albopictus applicable to most Ae. albopictus-colonized areas. The new model improves on previous work by incorporating diapause behaviors, with many modifications to the stage-specific mechanism of the mosquitoes' life-cycle. The monthly Container Index (CI) of Ae. albopictus collected in two Chinese cities, Guangzhou and Shanghai, is used for model validation. The simulation results of the proposed model are validated against entomological field data by the Pearson correlation coefficient r² in Guangzhou (r² = 0.84) and in Shanghai (r² = 0.90). In addition, by consolidating the effect of diapause-related adjustments and temperature-related parameters in the model, the improvement over the basic model is significant. The model highlights the importance of considering diapause in simulating the Ae. albopictus population. It also corroborates that temperature and photoperiod are significant in affecting the population dynamics of the mosquito. By refining the relationship between the Ae. albopictus population and climatic factors, the model serves to establish a mechanistic relation to the growth and decline of the species. Understanding this relationship in a better way

  12. Wafer plane inspection with soft resist thresholding

    NASA Astrophysics Data System (ADS)

    Hess, Carl; Shi, Rui-fang; Wihl, Mark; Xiong, Yalin; Pang, Song

    2008-10-01

    Wafer Plane Inspection (WPI) is an inspection mode on the KLA-Tencor TeraScan™ platform that uses the high signal-to-noise ratio images from the high numerical aperture microscope, and then models the entire lithographic process to enable defect detection on the wafer plane[1]. This technology meets the needs of some advanced mask manufacturers to identify the lithographically-significant defects while ignoring the other non-lithographically-significant defects. WPI accomplishes this goal by performing defect detection based on a modeled image of how the mask features would actually print in the photoresist. There are several advantages to this approach: (1) the high fidelity of the images provides a sensitivity advantage over competing approaches; (2) the ability to perform defect detection on the wafer plane allows one to see only those defects that have a printing impact on the wafer; (3) the use of modeling for the lithographic portion of the flow enables unprecedented flexibility to support arbitrary illumination profiles, process-window inspection in unit time, and combination modes to find both printing and non-printing defects. WPI is proving to be a valuable addition to the KLA-Tencor detection algorithm suite. The modeling portion of WPI uses a single resist threshold as the final step in the processing. This has been shown to be adequate on several advanced customer layers, but is not ideal for all layers. Actual resist chemistry involves complicated processes, including acid and base diffusion and quench, that are not consistently well-modeled with a single resist threshold. We have considered the use of an advanced resist model for WPI, but rejected it because the burdensome calibration requirements of the model were not practical for reticle inspection. This paper describes an alternative approach that allows a "soft" resist threshold to be applied, providing a more robust solution for the most challenging processes. This approach is just

  13. Comparative analysis of risk-based cleanup levels and associated remediation costs using linearized multistage model (cancer slope factor) vs. threshold approach (reference dose) for three chlorinated alkenes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lawton, L.J.; Mihalich, J.P.

    1995-12-31

    The chlorinated alkenes 1,1-dichloroethene (1,1-DCE), tetrachloroethene (PCE), and trichloroethene (TCE) are common environmental contaminants found in soil and groundwater at hazardous waste sites. Recent assessment of data from epidemiology and mechanistic studies indicates that although exposure to 1,1-DCE, PCE, and TCE causes tumor formation in rodents, it is unlikely that these chemicals are carcinogenic to humans. Nevertheless, many state and federal agencies continue to regulate these compounds as carcinogens through the use of the linearized multistage model and resulting cancer slope factor (CSF). The available data indicate that 1,1-DCE, PCE, and TCE should be assessed using a threshold (i.e., reference dose [RfD]) approach rather than a CSF. This paper summarizes the available metabolic, toxicologic, and epidemiologic data that question the use of the linear multistage model (and CSF) for extrapolation from rodents to humans. A comparative analysis of potential risk-based cleanup goals (RBGs) for these three compounds in soil is presented for a hazardous waste site. Goals were calculated using the USEPA CSFs and using a threshold (i.e., RfD) approach. Costs associated with remediation activities required to meet each set of these cleanup goals are presented and compared.

  14. Stochastic Threshold Microdose Model for Cell Killing by Insoluble Metallic Nanomaterial Particles

    PubMed Central

    Scott, Bobby R.

    2010-01-01

    This paper introduces a novel microdosimetric model for metallic nanomaterial-particles (MENAP)-induced cytotoxicity. The focus is on the engineered insoluble MENAP which represent a significant breakthrough in the design and development of new products for consumers, industry, and medicine. Increased production is rapidly occurring and may cause currently unrecognized health effects (e.g., nervous system dysfunction, heart disease, cancer); thus, dose-response models for MENAP-induced biological effects are needed to facilitate health risk assessment. The stochastic threshold microdose (STM) model presented introduces novel stochastic microdose metrics for use in constructing dose-response relationships for the frequency of specific cellular (e.g., cell killing, mutations, neoplastic transformation) or subcellular (e.g., mitochondria dysfunction) effects. A key metric is the exposure-time-dependent, specific burden (MENAP count) for a given critical target (e.g., mitochondria, nucleus). Exceeding a stochastic threshold specific burden triggers cell death. For critical targets in the cytoplasm, the autophagic mode of death is triggered. For the nuclear target, the apoptotic mode of death is triggered. Overall cell survival is evaluated for the indicated competing modes of death when both apply. The STM model can be applied to cytotoxicity data using Bayesian methods implemented via Markov chain Monte Carlo. PMID:21191483
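    The core stochastic-threshold idea lends itself to a short Monte Carlo sketch: each cell draws a particle burden for its critical target and its own threshold, and dies when the burden exceeds the threshold. The Poisson and normal distributions and the parameter values below are assumptions for illustration, not the published model fit.

        import numpy as np

        def surviving_fraction(mean_burden, thresh_mean=20.0, thresh_sd=5.0,
                               n_cells=100_000, seed=0):
            rng = np.random.default_rng(seed)
            burden = rng.poisson(mean_burden, n_cells)               # specific burden per cell
            threshold = rng.normal(thresh_mean, thresh_sd, n_cells)  # stochastic threshold
            return np.mean(burden <= threshold)

        # Dose-response sketch: [surviving_fraction(m) for m in range(0, 60, 5)]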

  15. Terrestrial Microgravity Model and Threshold Gravity Simulation using Magnetic Levitation

    NASA Technical Reports Server (NTRS)

    Ramachandran, N.

    2005-01-01

    What is the threshold gravity (minimum gravity level) required for the nominal functioning of the human system? What dosage is required? Do human cell lines behave differently in microgravity in response to an external stimulus? The critical need for such a gravity simulator is emphasized by recent experiments on human epithelial cells and lymphocytes on the Space Shuttle clearly showing that cell growth and function are markedly different from those observed terrestrially. Those differences are also dramatic between cells grown in space and those in Rotating Wall Vessels (RWV), or NASA bioreactor often used to simulate microgravity, indicating that although morphological growth patterns (three dimensional growth) can be successfully simulated using RWVs, cell function performance is not reproduced - a critical difference. If cell function is dramatically affected by gravity off-loading, then cell response to stimuli such as radiation, stress, etc. can be very different from terrestrial cell lines. Yet, we have no good gravity simulator for use in study of these phenomena. This represents a profound shortcoming for countermeasures research. We postulate that we can use magnetic levitation of cells and tissue, through the use of strong magnetic fields and field gradients, as a terrestrial microgravity model to study human cells. Specific objectives of the research are: 1. To develop a tried, tested and benchmarked terrestrial microgravity model for cell culture studies; 2. Gravity threshold determination; 3. Dosage (magnitude and duration) of g-level required for nominal functioning of cells; 4. Comparisons of magnetic levitation model to other models such as RWV, hind limb suspension, etc. and 5. Cellular response to reduced gravity levels of Moon and Mars.

  16. The critical domain size of stochastic population models.

    PubMed

    Reimer, Jody R; Bonsall, Michael B; Maini, Philip K

    2017-02-01

    Identifying the critical domain size necessary for a population to persist is an important question in ecology. Both demographic and environmental stochasticity impact a population's ability to persist. Here we explore ways of including this variability. We study populations with distinct dispersal and sedentary stages, which have traditionally been modelled using a deterministic integrodifference equation (IDE) framework. Individual-based models (IBMs) are the most intuitive stochastic analogues to IDEs but yield few analytic insights. We explore two alternate approaches; one is a scaling up to the population level using the Central Limit Theorem, and the other a variation on both Galton-Watson branching processes and branching processes in random environments. These branching process models closely approximate the IBM and yield insight into the factors determining the critical domain size for a given population subject to stochasticity.
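    A toy version of the branching-process approach to the critical domain size is sketched below: each individual leaves a Poisson number of offspring that disperse by a Gaussian kernel and survive only if they settle inside the domain [0, L], and persistence is estimated by simulation. All parameter values and kernels are illustrative assumptions, not those of the paper.

        import numpy as np

        def persistence_prob(L, R0=1.5, disp_sd=1.0, gens=50, reps=200, seed=1):
            rng = np.random.default_rng(seed)
            persists = 0
            for _ in range(reps):
                x = np.array([L / 2.0])                     # start with one individual
                for _ in range(gens):
                    if x.size == 0 or x.size > 10_000:      # extinct, or clearly persisting
                        break
                    kids = rng.poisson(R0, x.size)
                    x = np.repeat(x, kids) + rng.normal(0.0, disp_sd, kids.sum())
                    x = x[(x >= 0.0) & (x <= L)]            # offspring outside the domain die
                persists += x.size > 0
            return persists / reps

        # Sweeping L (e.g., [persistence_prob(L) for L in range(1, 20)]) traces how
        # persistence probability rises with domain size.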

  17. The Population Tracking Model: A Simple, Scalable Statistical Model for Neural Population Data

    PubMed Central

    O'Donnell, Cian; Gonçalves, J. Tiago; Whiteley, Nick; Portera-Cailliau, Carlos; Sejnowski, Terrence J.

    2017-01-01

    Our understanding of neural population coding has been limited by a lack of analysis methods to characterize spiking data from large populations. The biggest challenge comes from the fact that the number of possible network activity patterns scales exponentially with the number of neurons recorded (∼2^N for N neurons). Here we introduce a new statistical method for characterizing neural population activity that requires semi-independent fitting of only as many parameters as the square of the number of neurons, requiring drastically smaller data sets and minimal computation time. The model works by matching the population rate (the number of neurons synchronously active) and the probability that each individual neuron fires given the population rate. We found that this model can accurately fit synthetic data from up to 1000 neurons. We also found that the model could rapidly decode visual stimuli from neural population data from macaque primary visual cortex about 65 ms after stimulus onset. Finally, we used the model to estimate the entropy of neural population activity in developing mouse somatosensory cortex and, surprisingly, found that it first increases, and then decreases during development. This statistical model opens new options for interrogating neural population data and can bolster the use of modern large-scale in vivo Ca2+ and voltage imaging tools. PMID:27870612
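    The two ingredients the model fits can be estimated directly from binned spike data, as in the sketch below: the distribution of the population rate and each neuron's firing probability conditioned on that rate. Normalisation of the conditional pattern probabilities and the decoding machinery are omitted, so this is a simplified stand-in rather than the published fitting procedure.

        import numpy as np

        def fit_population_tracking(spikes):
            # spikes: (time_bins, neurons) binary array
            T, N = spikes.shape
            k = spikes.sum(axis=1).astype(int)            # population rate per bin
            p_k = np.bincount(k, minlength=N + 1) / T     # P(K = k)
            p_fire = np.zeros((N + 1, N))                 # P(neuron i fires | K = k)
            for kk in range(N + 1):
                rows = spikes[k == kk]
                if len(rows):
                    p_fire[kk] = rows.mean(axis=0)
            return p_k, p_fire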

  18. Population-production-pollution nexus based air pollution management model for alleviating the atmospheric crisis in Beijing, China.

    PubMed

    Zeng, X T; Tong, Y F; Cui, L; Kong, X M; Sheng, Y N; Chen, L; Li, Y P

    2017-07-15

    In recent years, increasing emissions in the city of Beijing due to an expanded population, accelerated industrialization and inter-regional pollutant transportation have led to hazardous atmospheric pollution issues. Although a number of anthropogenic control measures have been put into use, frequent and severe haze events have still challenged regional governments. In this study, a hybrid population-production-pollution nexus model (PPP) is proposed for air pollution management and air quality planning (AMP) with the aim of coordinating human activities and environmental protection. A fuzzy-stochastic mixed quadratic programming method (FSQ) is developed and introduced into the PPP for tackling atmospheric pollution issues with uncertainties. Based on the contribution of an index of population-production-pollution, a hybrid PPP-based AMP model that considers employment structure, industrial layout pattern, production mode, pollutant purification efficiency and a pollution mitigation scheme has been applied in Beijing. Results of the adjustment of employment structure, the pollution mitigation scheme, and green gross domestic product under various environmental regulation scenarios are obtained and analyzed. This study can facilitate the identification of optimized policies for alleviating the population-production-emission conflict in the study region, as well as ameliorating the hazardous air pollution crisis at an urban level. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. Initial analyses of the relationship between 'Thresholds' of toxicity for individual chemicals and 'Interaction Thresholds' for chemical mixtures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Raymond S.H.; Dennison, James E.

    2007-09-01

    The inter-relationship of 'Thresholds' between chemical mixtures and their respective component single chemicals was studied using three sets of data and two types of analyses. Two in vitro data sets involve cytotoxicity in human keratinocytes from treatment of metals and a metal mixture [Bae, D.S., Gennings, C., Carter, Jr., W.H., Yang, R.S.H., Campain, J.A., 2001. Toxicological interactions among arsenic, cadmium, chromium, and lead in human keratinocytes. Toxicol. Sci. 63, 132-142; Gennings, C., Carter, Jr., W.H., Campain, J.A., Bae, D.S., Yang, R.S.H., 2002. Statistical analysis of interactive cytotoxicity in human epidermal keratinocytes following exposure to a mixture of four metals. J. Agric. Biol. Environ. Stat. 7, 58-73], and induction of estrogen receptor alpha (ER-α) reporter gene in MCF-7 human breast cancer cells by estrogenic xenobiotics [Gennings, C., Carter, Jr., W.H., Carney, E.W., Charles, G.D., Gollapudi, B.B., Carchman, R.A., 2004. A novel flexible approach for evaluating fixed ratio mixtures of full and partial agonists. Toxicol. Sci. 80, 134-150]. The third data set came from PBPK modeling of gasoline and its components in the human. For in vitro cellular responses, we employed Benchmark Dose Software (BMDS) to obtain BMD01, BMD05, and BMD10. We then plotted these BMDs against exposure concentrations for the chemical mixture and its components to assess the ranges and slopes of these BMD-concentration lines. In doing so, we consider certain BMDs to be 'Interaction Thresholds' or 'Thresholds' for mixtures and their component single chemicals, and the slope of the line must be a reflection of the potency of the biological effects. For in vivo PBPK modeling, we used 0.1x TLVs, TLVs, and 10x TLVs for gasoline and six component markers as input dosing for PBPK modeling. In this case, the venous blood levels under the hypothetical exposure conditions become our designated 'Interaction Thresholds' or 'Thresholds' for

  20. Using stylized agent-based models for population-environment research: A case study from the Galápagos Islands

    PubMed Central

    Miller, Brian W.; Breckheimer, Ian; McCleary, Amy L.; Guzmán-Ramirez, Liza; Caplow, Susan C.; Jones-Smith, Jessica C.; Walsh, Stephen J.

    2010-01-01

    Agent Based Models (ABMs) are powerful tools for population-environment research but are subject to trade-offs between model complexity and abstraction. This study strikes a compromise between abstract and highly specified ABMs by designing a spatially explicit, stylized ABM and using it to explore policy scenarios in a setting that is facing substantial conservation and development challenges. Specifically, we present an ABM that reflects key Land Use / Land Cover (LULC) dynamics and livelihood decisions on Isabela Island in the Galápagos Archipelago of Ecuador. We implement the model using the NetLogo software platform, a free program that requires relatively little programming experience. The landscape is composed of a satellite-derived distribution of a problematic invasive species (common guava) and a stylized representation of the Galápagos National Park, the community of Puerto Villamil, the agricultural zone, and the marine area. The agent module is based on publicly available data and household interviews, and represents the primary livelihoods of the population in the Galápagos Islands – tourism, fisheries, and agriculture. We use the model to enact hypothetical agricultural subsidy scenarios aimed at controlling invasive guava and assess the resulting population and land cover dynamics. Findings suggest that spatially explicit, stylized ABMs have considerable utility, particularly during preliminary stages of research, as platforms for (1) sharpening conceptualizations of population-environment systems, (2) testing alternative scenarios, and (3) uncovering critical data gaps. PMID:20539752

  1. Using stylized agent-based models for population-environment research: A case study from the Galápagos Islands.

    PubMed

    Miller, Brian W; Breckheimer, Ian; McCleary, Amy L; Guzmán-Ramirez, Liza; Caplow, Susan C; Jones-Smith, Jessica C; Walsh, Stephen J

    2010-05-01

    Agent Based Models (ABMs) are powerful tools for population-environment research but are subject to trade-offs between model complexity and abstraction. This study strikes a compromise between abstract and highly specified ABMs by designing a spatially explicit, stylized ABM and using it to explore policy scenarios in a setting that is facing substantial conservation and development challenges. Specifically, we present an ABM that reflects key Land Use / Land Cover (LULC) dynamics and livelihood decisions on Isabela Island in the Galápagos Archipelago of Ecuador. We implement the model using the NetLogo software platform, a free program that requires relatively little programming experience. The landscape is composed of a satellite-derived distribution of a problematic invasive species (common guava) and a stylized representation of the Galápagos National Park, the community of Puerto Villamil, the agricultural zone, and the marine area. The agent module is based on publicly available data and household interviews, and represents the primary livelihoods of the population in the Galápagos Islands - tourism, fisheries, and agriculture. We use the model to enact hypothetical agricultural subsidy scenarios aimed at controlling invasive guava and assess the resulting population and land cover dynamics. Findings suggest that spatially explicit, stylized ABMs have considerable utility, particularly during preliminary stages of research, as platforms for (1) sharpening conceptualizations of population-environment systems, (2) testing alternative scenarios, and (3) uncovering critical data gaps.

  2. Raising threshold for diagnosis of polycystic ovary syndrome excludes population of patients with metabolic risk.

    PubMed

    Quinn, Molly M; Kao, Chia-Ning; Ahmad, Asima; Lenhart, Nikolaus; Shinkai, Kanade; Cedars, Marcelle I; Huddleston, Heather G

    2016-10-01

    Objective To characterize the population of patients excluded from a diagnosis of polycystic ovary syndrome (PCOS) when follicle number criteria are increased to 25 per ovary as suggested by the Androgen Excess and Polycystic Ovary Syndrome Society's recent task force. Design Cross-sectional study. Setting Tertiary academic center. Patients A total of 259 women with PCOS according to the Rotterdam criteria who were systematically examined from 2007 to 2015, with 1,100 ovulatory women participating in the Ovarian Aging (OVA) Study as controls. Interventions Anthropometric measurements, serum testing, ultrasonic imaging, and comprehensive dermatologic exams. Main Outcome Measures Body mass index (BMI), waist to hip ratio (WHR), serum cholesterol, fasting glucose and insulin, follicle count per ovary, biochemical hyperandrogenemia, and hirsutism. Results Forty-seven of 259 women meeting the Rotterdam criteria (18.1%) were excluded from a diagnosis of PCOS when the follicle number criterion was increased to 25. These women had clinical evidence of hyperandrogenism (68.1%) and biochemical hyperandrogenemia (44.7%), although fewer reported oligoanovulation (26.8%). The excluded women had elevated total cholesterol, fasting insulin, and homeostatic model assessment of insulin resistance (HOMA-IR) when compared with controls, despite controlling for age and BMI. Conclusions The women excluded from the PCOS diagnosis by raising the threshold of follicle number per ovary to ≥25 continue to show evidence of metabolic risk. Copyright © 2016 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.

  3. Peaks Over Threshold (POT): A methodology for automatic threshold estimation using goodness of fit p-value

    NASA Astrophysics Data System (ADS)

    Solari, Sebastián.; Egüen, Marta; Polo, María. José; Losada, Miguel A.

    2017-04-01

    Threshold estimation in the Peaks Over Threshold (POT) method and the impact of the estimation method on the calculation of high return period quantiles and their uncertainty (or confidence intervals) are issues that are still unresolved. In the past, methods based on goodness of fit tests and EDF-statistics have yielded satisfactory results, but their use has not yet been systematized. This paper proposes a methodology for automatic threshold estimation, based on the Anderson-Darling EDF-statistic and goodness of fit test. When combined with bootstrapping techniques, this methodology can be used to quantify both the uncertainty of threshold estimation and its impact on the uncertainty of high return period quantiles. This methodology was applied to several simulated series and to four precipitation/river flow data series. The results obtained confirmed its robustness. For the measured series, the estimated thresholds corresponded to those obtained by nonautomatic methods. Moreover, even though the uncertainty of the threshold estimation was high, this did not have a significant effect on the width of the confidence intervals of high return period quantiles.
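    The flavour of the procedure can be sketched as follows: for each candidate threshold, fit a Generalized Pareto Distribution to the exceedances and score the fit with the Anderson-Darling statistic, keeping the lowest threshold whose fit is acceptable. The published method works with the goodness-of-fit p-value and bootstraps the uncertainty; here a fixed cut on the statistic stands in for that step, and all parameter choices are illustrative.

        import numpy as np
        from scipy.stats import genpareto

        def anderson_darling(x, dist):
            x = np.sort(x)
            n = len(x)
            F = np.clip(dist.cdf(x), 1e-12, 1 - 1e-12)
            i = np.arange(1, n + 1)
            return -n - np.mean((2 * i - 1) * (np.log(F) + np.log(1 - F[::-1])))

        def select_threshold(series, candidates, ad_max=1.0, min_exceed=30):
            for u in np.sort(candidates):
                exc = series[series > u] - u
                if len(exc) < min_exceed:
                    continue
                c, loc, scale = genpareto.fit(exc, floc=0.0)
                a2 = anderson_darling(exc, genpareto(c, loc=0.0, scale=scale))
                if a2 < ad_max:
                    return u, a2
            return None, None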

  4. Modeling the population dynamics of Pacific yew.

    Treesearch

    Richard T. Busing; Thomas A. Spies

    1995-01-01

    A study of Pacific yew (Taxus brevifolia Nutt.) population dynamics in the mountains of western Oregon and Washington was based on a combination of long-term population data and computer modeling. Rates of growth and mortality were low in mature and old-growth forest stands. Diameter growth at breast height ranged from 0 to 3 centimeters per decade...

  5. An analysis of population-based prenatal screening for overt hypothyroidism.

    PubMed

    Bryant, Stefanie N; Nelson, David B; McIntire, Donald D; Casey, Brian M; Cunningham, F Gary

    2015-10-01

    The purpose of the study was to evaluate pregnancy outcomes of hypothyroidism that were identified in a population-based prenatal screening program. This is a secondary analysis of a prospective prenatal population-based study in which serum thyroid analytes were obtained from November 2000 to April 2003. Initial screening thresholds were intentionally inclusive (thyroid-stimulating hormone [TSH], >3.0 mU/L; free thyroxine, <0.9 ng/dL); those who screened positive were referred for confirmatory testing in a hospital-based laboratory. Hypothyroidism was identified and treated if TSH level was >4.5 mU/L and if fT4 level was <0.76 ng/dL. Perinatal outcomes in these women and those who screened positive but unconfirmed to have hypothyroidism were compared with women with euthyroidism. Outcomes were then analyzed according to initial TSH levels. A total of 26,518 women completed initial screening: 24,584 women (93%) were euthyroid, and 284 women (1%) had abnormal initial values that suggested hypothyroidism. Of those referred, 232 women (82%) underwent repeat testing, and 47 women (0.2% initially screened) were confirmed to have hypothyroidism. Perinatal outcomes of women with treated overt hypothyroidism were similar to women with euthyroidism. Higher rates of pregnancy-related hypertension were identified in the 182 women with unconfirmed hypothyroidism when compared with women with euthyroidism (P < .001); however, this association was seen only in women with initial TSH >4.5 mU/L (adjusted odds ratio, 2.53; 95% confidence interval, 1.4-4.5). The identification and treatment of overt hypothyroidism results in pregnancy outcomes similar to women with euthyroidism. Unconfirmed screening results suggestive of hypothyroidism portend pregnancy risks similar to women with subclinical hypothyroidism, specifically preeclampsia; however, this increased risk was seen only in women with initial TSH levels of >4.5 mU/L and suggests that this is a more clinically relevant

  6. Testing a Threshold-Based Bed Bug Management Approach in Apartment Buildings.

    PubMed

    Singh, Narinderpal; Wang, Changlu; Zha, Chen; Cooper, Richard; Robson, Mark

    2017-07-26

    We tested a threshold-based bed bug ( Cimex lectularius L.) management approach with the goal of achieving elimination with minimal or no insecticide application. Thirty-two bed bug infested apartments were identified. These apartments were divided into four treatment groups based on apartment size and initial bed bug count, obtained through a combination of visual inspection and bed bug monitors: I- Non-chemical only in apartments with 1-12 bed bug count, II- Chemical control only in apartments with 1-12 bed bug count, III- Non-chemical and chemical control in apartments with >12 bed bug count, and IV- Chemical control only in apartments with ≥11 bed bug count. All apartments were monitored or treated once every two weeks for a maximum of 28 wk. Treatment I eliminated bed bugs in a similar amount of time to treatment II. Time to eliminate bed bugs was similar between treatment III and IV but required significantly less insecticide spray in treatment III than that in treatment IV. A threshold-based management approach (non-chemical only or non-chemical and chemical) can eliminate bed bugs in a similar amount of time, using little to no pesticide compared to a chemical only approach.

  7. Testing a Threshold-Based Bed Bug Management Approach in Apartment Buildings

    PubMed Central

    Singh, Narinderpal; Zha, Chen; Cooper, Richard; Robson, Mark

    2017-01-01

    We tested a threshold-based bed bug (Cimex lectularius L.) management approach with the goal of achieving elimination with minimal or no insecticide application. Thirty-two bed bug infested apartments were identified. These apartments were divided into four treatment groups based on apartment size and initial bed bug count, obtained through a combination of visual inspection and bed bug monitors: I- Non-chemical only in apartments with 1–12 bed bug count, II- Chemical control only in apartments with 1–12 bed bug count, III- Non-chemical and chemical control in apartments with >12 bed bug count, and IV- Chemical control only in apartments with ≥11 bed bug count. All apartments were monitored or treated once every two weeks for a maximum of 28 wk. Treatment I eliminated bed bugs in a similar amount of time to treatment II. Time to eliminate bed bugs was similar between treatment III and IV but required significantly less insecticide spray in treatment III than that in treatment IV. A threshold-based management approach (non-chemical only or non-chemical and chemical) can eliminate bed bugs in a similar amount of time, using little to no pesticide compared to a chemical only approach. PMID:28933720

  8. Automatic threshold optimization in nonlinear energy operator based spike detection.

    PubMed

    Malik, Muhammad H; Saeed, Maryam; Kamboh, Awais M

    2016-08-01

    In neural spike sorting systems, the performance of the spike detector has to be maximized because it affects the performance of all subsequent blocks. The non-linear energy operator (NEO) is a popular spike detector due to its detection accuracy and its hardware-friendly architecture. However, it involves a thresholding stage, whose value is usually approximated and is thus not optimal. This approximation degrades performance in real-time systems where signal-to-noise ratio (SNR) estimation is a challenge, especially at lower SNRs. In this paper, we propose an automatic and robust threshold calculation method using an empirical gradient technique. The method is tested on two different datasets. The results show that our optimized threshold improves the detection accuracy in both high SNR and low SNR signals. Boxplots are presented that provide a statistical analysis of improvements in accuracy; for instance, the 75th percentile was at 98.7% and 93.5% for the optimized NEO threshold and the traditional NEO threshold, respectively.
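    For reference, the conventional NEO detector that the paper improves upon can be sketched as follows; the scaling constant and refractory period are the usual heuristics and are assumptions here, and the paper's empirical-gradient threshold optimization is not reproduced:

        import numpy as np

        def neo(x):
            """Nonlinear energy operator: psi[n] = x[n]^2 - x[n-1]*x[n+1]."""
            psi = np.zeros_like(x, dtype=float)
            psi[1:-1] = x[1:-1] ** 2 - x[:-2] * x[2:]
            return psi

        def detect_spikes(x, c=8.0, refractory=30):
            """Threshold the NEO output at c * mean(psi) (a common heuristic, not the
            paper's optimized threshold) and enforce a refractory period in samples."""
            psi = neo(np.asarray(x, dtype=float))
            threshold = c * psi.mean()
            crossings = np.flatnonzero(psi > threshold)
            spikes, last = [], -refractory
            for idx in crossings:
                if idx - last >= refractory:
                    spikes.append(idx)
                    last = idx
            return np.array(spikes), threshold

        # Toy example: Gaussian noise with a few embedded spikes
        rng = np.random.default_rng(0)
        signal = rng.normal(0, 1, 10_000)
        signal[[2000, 5000, 8000]] += 15
        print(detect_spikes(signal)[0])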

  9. 40 CFR Table Jj-1 to Subpart Jj of... - Animal Population Threshold Level Below Which Facilities Are Not Required To Report Emissions...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    40 CFR Part 98, Table JJ-1 to Subpart JJ (Protection of Environment, vol. 21, revised 2011-07-01): Animal Population Threshold Level Below Which Facilities Are Not Required To Report Emissions Under Subpart JJ. Environmental Protection Agency (Continued), Air Programs (Continued), Mandatory Greenhouse Gas Reporting...

  10. A general modeling framework for describing spatially structured population dynamics

    USGS Publications Warehouse

    Sample, Christine; Fryxell, John; Bieri, Joanna; Federico, Paula; Earl, Julia; Wiederholt, Ruscena; Mattsson, Brady; Flockhart, Tyler; Nicol, Sam; Diffendorfer, James E.; Thogmartin, Wayne E.; Erickson, Richard A.; Norris, D. Ryan

    2017-01-01

    Variation in movement across time and space fundamentally shapes the abundance and distribution of populations. Although a variety of approaches model structured population dynamics, they are limited to specific types of spatially structured populations and lack a unifying framework. Here, we propose a unified network-based framework sufficiently novel in its flexibility to capture a wide variety of spatiotemporal processes including metapopulations and a range of migratory patterns. It can accommodate different kinds of age structures, forms of population growth, dispersal, nomadism and migration, and alternative life-history strategies. Our objective was to link three general elements common to all spatially structured populations (space, time and movement) under a single mathematical framework. To do this, we adopt a network modeling approach. The spatial structure of a population is represented by a weighted and directed network. Each node and each edge has a set of attributes which vary through time. The dynamics of our network-based population is modeled with discrete time steps. Using both theoretical and real-world examples, we show how common elements recur across species with disparate movement strategies and how they can be combined under a unified mathematical framework. We illustrate how metapopulations, various migratory patterns, and nomadism can be represented with this modeling approach. We also apply our network-based framework to four organisms spanning a wide range of life histories, movement patterns, and carrying capacities. General computer code to implement our framework is provided, which can be applied to almost any spatially structured population. This framework contributes to our theoretical understanding of population dynamics and has practical management applications, including understanding the impact of perturbations on population size, distribution, and movement patterns. By working within a common framework, there is less chance
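    A toy sketch of the core idea, assuming a made-up three-node network, growth rates and movement weights (this is not the general computer code released with the paper): each node carries an abundance, growth acts within nodes, and a weighted, directed movement matrix redistributes individuals at each discrete time step.

        import numpy as np

        # Hypothetical 3-node network: rows = destination, columns = origin.
        # Column j gives the fraction of node j's population moving to each node.
        movement = np.array([
            [0.7, 0.1, 0.0],
            [0.2, 0.8, 0.3],
            [0.1, 0.1, 0.7],
        ])
        growth = np.array([1.10, 0.95, 1.05])     # per-step finite growth rate per node
        capacity = np.array([500.0, 800.0, 300.0])
        n = np.array([100.0, 50.0, 10.0])         # initial abundances

        for t in range(50):
            # Within-node growth (here a simple discrete logistic form)
            n = n * (1 + (growth - 1) * (1 - n / capacity))
            # Movement along the weighted, directed edges
            n = movement @ n

        print(np.round(n, 1))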

  11. Identity-by-Descent-Based Phasing and Imputation in Founder Populations Using Graphical Models

    PubMed Central

    Palin, Kimmo; Campbell, Harry; Wright, Alan F; Wilson, James F; Durbin, Richard

    2011-01-01

    Accurate knowledge of haplotypes, the combination of alleles co-residing on a single copy of a chromosome, enables powerful gene mapping and sequence imputation methods. Since humans are diploid, haplotypes must be derived from genotypes by a phasing process. In this study, we present a new computational model for haplotype phasing based on pairwise sharing of haplotypes inferred to be Identical-By-Descent (IBD). We apply the Bayesian network-based model in a new phasing algorithm, called systematic long-range phasing (SLRP), that can capitalize on the close genetic relationships in isolated founder populations, and show with simulated and real genome-wide genotype data that SLRP substantially reduces the rate of phasing errors compared to previous phasing algorithms. Furthermore, the method accurately identifies regions of IBD, enabling linkage-like studies without pedigrees, and can be used to impute most genotypes with very low error rate. Genet. Epidemiol. 35:853-860, 2011. © 2011 Wiley Periodicals, Inc. PMID:22006673

  12. Mitochondrial threshold effects.

    PubMed Central

    Rossignol, Rodrigue; Faustin, Benjamin; Rocher, Christophe; Malgat, Monique; Mazat, Jean-Pierre; Letellier, Thierry

    2003-01-01

    The study of mitochondrial diseases has revealed dramatic variability in the phenotypic presentation of mitochondrial genetic defects. To attempt to understand this variability, different authors have studied energy metabolism in transmitochondrial cell lines carrying different proportions of various pathogenic mutations in their mitochondrial DNA. The same kinds of experiments have been performed on isolated mitochondria and on tissue biopsies taken from patients with mitochondrial diseases. The results have shown that, in most cases, phenotypic manifestation of the genetic defect occurs only when a threshold level is exceeded, and this phenomenon has been named the 'phenotypic threshold effect'. Subsequently, several authors showed that it was possible to inhibit considerably the activity of a respiratory chain complex, up to a critical value, without affecting the rate of mitochondrial respiration or ATP synthesis. This phenomenon was called the 'biochemical threshold effect'. More recently, quantitative analysis of the effects of various mutations in mitochondrial DNA on the rate of mitochondrial protein synthesis has revealed the existence of a 'translational threshold effect'. In this review these different mitochondrial threshold effects are discussed, along with their molecular bases and the roles that they play in the presentation of mitochondrial diseases. PMID:12467494

  13. Cortical surface-based threshold-free cluster enhancement and cortexwise mediation.

    PubMed

    Lett, Tristram A; Waller, Lea; Tost, Heike; Veer, Ilya M; Nazeri, Arash; Erk, Susanne; Brandl, Eva J; Charlet, Katrin; Beck, Anne; Vollstädt-Klein, Sabine; Jorde, Anne; Kiefer, Falk; Heinz, Andreas; Meyer-Lindenberg, Andreas; Chakravarty, M Mallar; Walter, Henrik

    2017-06-01

    Threshold-free cluster enhancement (TFCE) is a sensitive means to incorporate spatial neighborhood information in neuroimaging studies without using arbitrary thresholds. The majority of methods have applied TFCE to voxelwise data. The need to understand the relationship among multiple variables and imaging modalities has become critical. We propose a new method of applying TFCE to vertexwise statistical images as well as cortexwise (either voxel- or vertexwise) mediation analysis. Here we present TFCE_mediation, a toolbox that can be used for cortexwise multiple regression analysis with TFCE, and additionally cortexwise mediation using TFCE. The toolbox is open source and publicly available (https://github.com/trislett/TFCE_mediation). We validated TFCE_mediation in healthy controls from two independent multimodal neuroimaging samples (N = 199 and N = 183). We found a consistent structure-function relationship between surface area and the first independent component (IC1) of the N-back task, that white matter fractional anisotropy is strongly associated with IC1 N-back, and that our voxel-based results are essentially identical to FSL randomise using TFCE (all P_FWE < 0.05). Using cortexwise mediation, we showed that the relationship between white matter FA and IC1 N-back is mediated by surface area in the right superior frontal cortex (P_FWE < 0.05). We also demonstrated that the same mediation model is present using vertexwise mediation (P_FWE < 0.05). In conclusion, cortexwise analysis with TFCE provides an effective analysis of multimodal neuroimaging data. Furthermore, cortexwise mediation analysis may identify or explain a mechanism that underlies an observed relationship among a predictor, intermediary, and dependent variables in which one of these variables is assessed at a whole-brain scale. Hum Brain Mapp 38:2795-2807, 2017. © 2017 Wiley Periodicals, Inc.
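    TFCE itself replaces a single cluster-forming threshold with an integral over all thresholds: the enhanced score at a point is the sum, over heights h up to that point's statistic value, of (cluster extent at h)^E * h^H * dh. A minimal one-dimensional sketch of that integral follows (this is not the TFCE_mediation toolbox; the exponents E = 0.5 and H = 2 and the step size are common defaults used here only for illustration):

        import numpy as np

        def tfce_1d(stat, dh=0.1, e_power=0.5, h_power=2.0):
            """TFCE(p) = sum over heights h of extent(h)^E * h^H * dh, where extent(h)
            is the size of the supra-threshold cluster containing point p."""
            stat = np.asarray(stat, dtype=float)
            out = np.zeros_like(stat)
            for h in np.arange(dh, stat.max() + dh, dh):
                above = stat >= h
                # label contiguous supra-threshold runs in 1D
                starts = np.diff(np.concatenate(([0], above.astype(int)))) == 1
                labels = np.cumsum(starts) * above
                for lab in np.unique(labels[labels > 0]):
                    members = labels == lab
                    out[members] += members.sum() ** e_power * h ** h_power * dh
            return out

        scores = np.array([0.1, 0.5, 2.0, 2.5, 2.2, 0.3, 0.2, 1.8, 1.9, 0.1])
        print(np.round(tfce_1d(scores), 2))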

  14. Thermalization threshold in models of 1D fermions

    NASA Astrophysics Data System (ADS)

    Mukerjee, Subroto; Modak, Ranjan; Ramswamy, Sriram

    2013-03-01

    The question of how isolated quantum systems thermalize is an interesting and open one. In this study we equate thermalization with non-integrability to try to answer this question. In particular, we study the effect of system size on the integrability of 1D systems of interacting fermions on a lattice. We find that for a finite-sized system, a non-zero value of an integrability breaking parameter is required to make an integrable system appear non-integrable. Using exact diagonalization and diagnostics such as energy level statistics and the Drude weight, we find that the threshold value of the integrability breaking parameter scales to zero as a power law with system size. We find the exponent to be the same for different models with its value depending on the random matrix ensemble describing the non-integrable system. We also study a simple analytical model of a non-integrable system with an integrable limit to better understand how a power law emerges.

  15. Effective temperature of an ultracold electron source based on near-threshold photoionization.

    PubMed

    Engelen, W J; Smakman, E P; Bakker, D J; Luiten, O J; Vredenbregt, E J D

    2014-01-01

    We present a detailed description of measurements of the effective temperature of a pulsed electron source, based on near-threshold photoionization of laser-cooled atoms. The temperature is determined by electron beam waist scans, source size measurements with ion beams, and analysis with an accurate beam line model. Experimental data is presented for the source temperature as a function of the wavelength of the photoionization laser, for both nanosecond and femtosecond ionization pulses. For the nanosecond laser, temperatures as low as 14 ± 3 K were found; for femtosecond photoionization, 30 ± 5 K is possible. With a typical source size of 25 μm, this results in electron bunches with a relative transverse coherence length in the 10⁻⁴ range and an emittance of a few nm rad. © 2013 Elsevier B.V. All rights reserved.

  16. Overcoming the effects of false positives and threshold bias in graph theoretical analyses of neuroimaging data.

    PubMed

    Drakesmith, M; Caeyenberghs, K; Dutt, A; Lewis, G; David, A S; Jones, D K

    2015-09-01

    Graph theory (GT) is a powerful framework for quantifying topological features of neuroimaging-derived functional and structural networks. However, false positive (FP) connections arise frequently and influence the inferred topology of networks. Thresholding is often used to overcome this problem, but an appropriate threshold often relies on a priori assumptions, which will alter inferred network topologies. Four common network metrics (global efficiency, mean clustering coefficient, mean betweenness and smallworldness) were tested using a model tractography dataset. It was found that all four network metrics were significantly affected even by just one FP. Results also show that thresholding effectively dampens the impact of FPs, but at the expense of adding significant bias to network metrics. In a larger number (n=248) of tractography datasets, statistics were computed across random group permutations for a range of thresholds, revealing that statistics for network metrics varied significantly more than for non-network metrics (i.e., number of streamlines and number of edges). Varying degrees of network atrophy were introduced artificially to half the datasets, to test sensitivity to genuine group differences. For some network metrics, this atrophy was detected as significant (p<0.05, determined using permutation testing) only across a limited range of thresholds. We propose a multi-threshold permutation correction (MTPC) method, based on the cluster-enhanced permutation correction approach, to identify sustained significant effects across clusters of thresholds. This approach minimises requirements to determine a single threshold a priori. We demonstrate improved sensitivity of MTPC-corrected metrics to genuine group effects compared to an existing approach and demonstrate the use of MTPC on a previously published network analysis of tractography data derived from a clinical population. In conclusion, we show that there are large biases and instability induced

  17. Persistent oscillations and backward bifurcation in a malaria model with varying human and mosquito populations: implications for control.

    PubMed

    Ngonghala, Calistus N; Teboh-Ewungkem, Miranda I; Ngwa, Gideon A

    2015-06-01

    We derive and study a deterministic compartmental model for malaria transmission with varying human and mosquito populations. Our model considers disease-related deaths, asymptomatic immune humans who are also infectious, as well as mosquito demography, reproduction and feeding habits. Analysis of the model reveals the existence of a backward bifurcation and persistent limit cycles whose period and size is determined by two threshold parameters: the vectorial basic reproduction number Rm, and the disease basic reproduction number R0, whose size can be reduced by reducing Rm. We conclude that malaria dynamics are indeed oscillatory when the methodology of explicitly incorporating the mosquito's demography, feeding and reproductive patterns is considered in modeling the mosquito population dynamics. A sensitivity analysis reveals important control parameters that can affect the magnitudes of Rm and R0, threshold quantities to be taken into consideration when designing control strategies. Both Rm and the intrinsic period of oscillation are shown to be highly sensitive to the mosquito's birth constant λm and the mosquito's feeding success probability pw. Control of λm can be achieved by spraying, eliminating breeding sites or moving them away from human habitats, while pw can be controlled via the use of mosquito repellant and insecticide-treated bed-nets. The disease threshold parameter R0 is shown to be highly sensitive to pw, and the intrinsic period of oscillation is also sensitive to the rate at which reproducing mosquitoes return to breeding sites. A global sensitivity and uncertainty analysis reveals that the ability of the mosquito to reproduce and uncertainties in the estimations of the rates at which exposed humans become infectious and infectious humans recover from malaria are critical in generating uncertainties in the disease classes.

  18. Do Optimal Prognostic Thresholds in Continuous Physiological Variables Really Exist? Analysis of Origin of Apparent Thresholds, with Systematic Review for Peak Oxygen Consumption, Ejection Fraction and BNP

    PubMed Central

    Leong, Tora; Rehman, Michaela B.; Pastormerlo, Luigi Emilio; Harrell, Frank E.; Coats, Andrew J. S.; Francis, Darrel P.

    2014-01-01

    Background: Clinicians are sometimes advised to make decisions using thresholds in measured variables, derived from prognostic studies. Objectives: We studied why there are conflicting apparently-optimal prognostic thresholds, for example in exercise peak oxygen uptake (pVO2), ejection fraction (EF), and Brain Natriuretic Peptide (BNP) in heart failure (HF). Data Sources and Eligibility Criteria: Studies testing pVO2, EF or BNP prognostic thresholds in heart failure, published between 1990 and 2010, listed on Pubmed. Methods: First, we examined studies testing pVO2, EF or BNP prognostic thresholds. Second, we created repeated simulations of 1500 patients to identify whether an apparently-optimal prognostic threshold indicates step change in risk. Results: 33 studies (8946 patients) tested a pVO2 threshold. 18 found it prognostically significant: the actual reported threshold ranged widely (10–18 ml/kg/min) but was overwhelmingly controlled by the individual study population's mean pVO2 (r = 0.86, p<0.00001). In contrast, the 15 negative publications were testing thresholds 199% further from their means (p = 0.0001). Likewise, of 35 EF studies (10220 patients), the thresholds in the 22 positive reports were strongly determined by study means (r = 0.90, p<0.0001). Similarly, in the 19 positives of 20 BNP studies (9725 patients): r = 0.86 (p<0.0001). Second, survival simulations always discovered a "most significant" threshold, even when there was definitely no step change in mortality. With linear increase in risk, the apparently-optimal threshold was always near the sample mean (r = 0.99, p<0.001). Limitations: This study cannot report the best threshold for any of these variables; instead it explains how common clinical research procedures routinely produce false thresholds. Key Findings: First, shifting (and/or disappearance) of an apparently-optimal prognostic threshold is strongly determined by studies' average pVO2, EF or BNP. Second
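    The simulation part of the argument can be reproduced in spirit with a few lines: generate a cohort whose event risk rises linearly with the variable (so there is genuinely no step change), scan every candidate cut-point for the most significant two-group split, and observe that the winning threshold repeatedly lands near the sample mean. The sample size, risk slope and chi-square test below are illustrative choices, not the authors' exact survival simulation.

        import numpy as np
        from scipy.stats import chi2_contingency

        rng = np.random.default_rng(7)

        def best_threshold(n=1500):
            x = rng.normal(15, 4, n)                               # a pVO2-like variable
            p_event = np.clip(0.5 - 0.02 * (x - 15), 0.05, 0.95)   # linear risk, no step
            event = rng.random(n) < p_event
            best_t, best_p = None, 1.0
            for t in np.quantile(x, np.linspace(0.1, 0.9, 81)):
                lo, hi = event[x < t], event[x >= t]
                table = [[lo.sum(), (~lo).sum()], [hi.sum(), (~hi).sum()]]
                p = chi2_contingency(table)[1]
                if p < best_p:
                    best_t, best_p = t, p
            return best_t

        thresholds = [best_threshold() for _ in range(20)]
        print(f"mean 'optimal' threshold: {np.mean(thresholds):.2f} (sample mean of x is 15)")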

  19. Linking population viability, habitat suitability, and landscape simulation models for conservation planning

    Treesearch

    Michael A. Larson; Frank R. Thompson III; Joshua J. Millspaugh; William D. Dijak; Stephen R. Shifley

    2004-01-01

    Methods for habitat modeling based on landscape simulations and population viability modeling based on habitat quality are well developed, but no published study of which we are aware has effectively joined them in a single, comprehensive analysis. We demonstrate the application of a population viability model for ovenbirds (Seiurus aurocapillus)...

  20. Uncertainty Estimates of Psychoacoustic Thresholds Obtained from Group Tests

    NASA Technical Reports Server (NTRS)

    Rathsam, Jonathan; Christian, Andrew

    2016-01-01

    Adaptive psychoacoustic test methods, in which the next signal level depends on the response to the previous signal, are the most efficient for determining psychoacoustic thresholds of individual subjects. In many tests conducted in the NASA psychoacoustic labs, the goal is to determine thresholds representative of the general population. To do this economically, non-adaptive testing methods are used in which three or four subjects are tested at the same time with predetermined signal levels. This approach requires us to identify techniques for assessing the uncertainty in resulting group-average psychoacoustic thresholds. In this presentation we examine the Delta Method of frequentist statistics, the Generalized Linear Model (GLM), the Nonparametric Bootstrap (a frequentist method), and Markov Chain Monte Carlo Posterior Estimation (a Bayesian approach). Each technique is exercised on a manufactured, theoretical dataset and then on datasets from two psychoacoustics facilities at NASA. The Delta Method is the simplest to implement and accurate for the cases studied. The GLM is found to be the least robust, and the Bootstrap takes the longest to calculate. The Bayesian Posterior Estimate is the most versatile technique examined because it allows the inclusion of prior information.

  1. Population Pharmacokinetic and Pharmacodynamic Model-Based Comparability Assessment of a Recombinant Human Epoetin Alfa and the Biosimilar HX575

    PubMed Central

    Yan, Xiaoyu; Lowe, Philip J.; Fink, Martin; Berghout, Alexander; Balser, Sigrid; Krzyzanski, Wojciech

    2012-01-01

    The aim of this study was to develop an integrated pharmacokinetic and pharmacodynamic (PK/PD) model and assess the comparability between epoetin alfa HEXAL/Binocrit (HX575) and a comparator epoetin alfa by a model-based approach. PK/PD data—including serum drug concentrations, reticulocyte counts, red blood cells, and hemoglobin levels—were obtained from 2 clinical studies. In sum, 149 healthy men received multiple intravenous or subcutaneous doses of HX575 (100 IU/kg) and the comparator 3 times a week for 4 weeks. A population model based on pharmacodynamics-mediated drug disposition and cell maturation processes was used to characterize the PK/PD data for the 2 drugs. Simulations showed that due to target amount changes, total clearance may increase up to 2.4-fold as compared with the baseline. Further simulations suggested that once-weekly and thrice-weekly subcutaneous dosing regimens would result in similar efficacy. The findings from the model-based analysis were consistent with previous results using the standard noncompartmental approach demonstrating PK/PD comparability between HX575 and comparator. However, due to complexity of the PK/PD model, control of random effects was not straightforward. Whereas population PK/PD model-based analyses are suited for studying complex biological systems, such models have their limitations (statistical), and their comparability results should be interpreted carefully. PMID:22162538

  2. Depletion with Cyclodextrin Reveals Two Populations of Cholesterol in Model Lipid Membranes

    PubMed Central

    Litz, Jonathan P.; Thakkar, Niket; Portet, Thomas; Keller, Sarah L.

    2016-01-01

    Recent results provide evidence that cholesterol is highly accessible for removal from both cell and model membranes above a threshold concentration that varies with membrane composition. Here we measured the rate at which methyl-β-cyclodextrin depletes cholesterol from a supported lipid bilayer as a function of cholesterol mole fraction. We formed supported bilayers from two-component mixtures of cholesterol and a PC (phosphatidylcholine) lipid, and we directly visualized the rate of decrease in area of the bilayers with fluorescence microscopy. Our technique yields the accessibility of cholesterol over a wide range of concentrations (30–66 mol %) for many individual bilayers, enabling fast acquisition of replicate data. We found that the bilayers contain two populations of cholesterol, one with low surface accessibility and the other with high accessibility. A larger fraction of the total membrane cholesterol appears in the more accessible population when the acyl chains of the PC-lipid tails are more unsaturated. Our findings are most consistent with the predictions of the condensed-complex and cholesterol bilayer domain models of cholesterol-phospholipid interactions in lipid membranes. PMID:26840728

  3. Spike-Threshold Variability Originated from Separatrix-Crossing in Neuronal Dynamics.

    PubMed

    Wang, Longfei; Wang, Hengtong; Yu, Lianchun; Chen, Yong

    2016-08-22

    The threshold voltage for action potential generation is a key regulator of neuronal signal processing, yet the mechanism of its dynamic variation is still not well described. In this paper, we propose that threshold phenomena can be classified as parameter thresholds and state thresholds. Voltage thresholds, which belong to the class of state thresholds, are determined by the 'general separatrix' in state space. We demonstrate that the separatrix generally exists in the state space of neuron models. The general form of the separatrix is assumed to be a function of both the states and the stimuli, and the previously assumed equation for the evolution of the threshold over time is naturally deduced from the separatrix. In terms of neuronal dynamics, the threshold voltage variation, which is affected by different stimuli, is determined by crossing the separatrix at different points in state space. We suggest that the separatrix-crossing mechanism in state space is the intrinsic dynamic mechanism for threshold voltages and post-stimulus threshold phenomena. These proposals are also systematically verified in example models, three of which have analytic separatrices and one of which is the classic Hodgkin-Huxley model. The separatrix-crossing framework provides an overview of the neuronal threshold and will facilitate understanding of the nature of threshold variability.

  4. Experimental and environmental factors affect spurious detection of ecological thresholds

    USGS Publications Warehouse

    Daily, Jonathan P.; Hitt, Nathaniel P.; Smith, David; Snyder, Craig D.

    2012-01-01

    Threshold detection methods are increasingly popular for assessing nonlinear responses to environmental change, but their statistical performance remains poorly understood. We simulated linear change in stream benthic macroinvertebrate communities and evaluated the performance of commonly used threshold detection methods based on model fitting (piecewise quantile regression [PQR]), data partitioning (nonparametric change point analysis [NCPA]), and a hybrid approach (significant zero crossings [SiZer]). We demonstrated that false detection of ecological thresholds (type I errors) and inferences on threshold locations are influenced by sample size, rate of linear change, and frequency of observations across the environmental gradient (i.e., sample-environment distribution, SED). However, the relative importance of these factors varied among statistical methods and between inference types. False detection rates were influenced primarily by user-selected parameters for PQR (τ) and SiZer (bandwidth) and secondarily by sample size (for PQR) and SED (for SiZer). In contrast, the location of reported thresholds was influenced primarily by SED. Bootstrapped confidence intervals for NCPA threshold locations revealed strong correspondence to SED. We conclude that the choice of statistical methods for threshold detection should be matched to experimental and environmental constraints to minimize false detection rates and avoid spurious inferences regarding threshold location.

  5. A Micro-Level Data-Calibrated Agent-Based Model: The Synergy between Microsimulation and Agent-Based Modeling.

    PubMed

    Singh, Karandeep; Ahn, Chang-Won; Paik, Euihyun; Bae, Jang Won; Lee, Chun-Hee

    2018-01-01

    Artificial life (ALife) examines systems related to natural life, its processes, and its evolution, using simulations with computer models, robotics, and biochemistry. In this article, we focus on the computer modeling, or "soft," aspects of ALife and prepare a framework for scientists and modelers to be able to support such experiments. The framework is designed and built to be a parallel as well as distributed agent-based modeling environment, and does not require end users to have expertise in parallel or distributed computing. Furthermore, we use this framework to implement a hybrid model using microsimulation and agent-based modeling techniques to generate an artificial society. We leverage this artificial society to simulate and analyze population dynamics using Korean population census data. The agents in this model derive their decisional behaviors from real data (microsimulation feature) and interact among themselves (agent-based modeling feature) to proceed in the simulation. The behaviors, interactions, and social scenarios of the agents are varied to perform an analysis of population dynamics. We also estimate the future cost of pension policies based on the future population structure of the artificial society. The proposed framework and model demonstrates how ALife techniques can be used by researchers in relation to social issues and policies.

  6. Derivation of debris flow critical rainfall thresholds from land stability modeling

    NASA Astrophysics Data System (ADS)

    Papa, M. N.; Medina, V.; Bateman, A.; Ciervo, F.

    2012-04-01

    The aim of the work is to develop a system capable of providing debris flow warnings in areas where historical events data are not available as well as in the case of changing environments and climate. For these reasons, critical rainfall threshold curves are derived from mathematical and numerical simulations rather than the classical derivation from empirical rainfall data. The operational use of distributed model, based on the stability analysis for each grid cell of the basin, is not feasible in the case of warnings due to the long running time required for this kind of model as well as the lack of detailed information on the spatial distribution of the properties of the material in many practical cases. Moreover, with the aim of giving debris flow warnings, it is not necessary to know the distribution of instable elements along the basin but only if a debris flow may affect the vulnerable areas in the valley. The capability of a debris flow of reaching the downstream areas depends on many factors linked with the topography, the solid concentration, the rheological properties of the debris mixture and the flow discharge as well as the occurrence of liquefaction of the sliding mass. In relation to a specific basin, many of these factors may be considered as not time dependent. The most rainfall dependent factors are flow discharge and correlated total debris volume. In the present study, the total volume that is instable, and therefore available for the flow, is considered as the governing factor from which it is possible to assess whether a debris flow will affect the downstream areas or not. The possible triggering debris flow is simulated, in a generic element of the basin, by an infinite slope stability analysis. The groundwater pressure is calculated by the superposition of the effect of an "antecedent" rainfall and an "event" rainfall. The groundwater pressure response to antecedent rainfall is used as the initial condition for the time
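    The per-cell stability check described above is the classical infinite-slope analysis; a compact version is sketched below with illustrative soil parameters (not values calibrated for any real basin). Linking the wetness term to the antecedent-plus-event rainfall is what ultimately turns this kind of calculation into a critical rainfall threshold curve.

        import numpy as np

        GAMMA_W = 9.81           # unit weight of water, kN/m^3

        def factor_of_safety(slope_deg, depth, cohesion, phi_deg, wetness,
                             gamma_soil=19.0):
            """Infinite-slope factor of safety (FS < 1 indicates instability).
            slope_deg : slope angle (degrees)
            depth     : soil depth (m)
            cohesion  : effective cohesion c' (kPa)
            phi_deg   : effective friction angle (degrees)
            wetness   : water table height / soil depth (0 = dry, 1 = saturated)
            """
            beta = np.radians(slope_deg)
            phi = np.radians(phi_deg)
            resisting = cohesion + (gamma_soil - wetness * GAMMA_W) * depth \
                * np.cos(beta) ** 2 * np.tan(phi)
            driving = gamma_soil * depth * np.sin(beta) * np.cos(beta)
            return resisting / driving

        # Illustrative case: 1.5 m of soil on a 35 degree slope, progressively wetter
        for m in (0.0, 0.5, 1.0):
            fs = factor_of_safety(slope_deg=35, depth=1.5, cohesion=5.0,
                                  phi_deg=32, wetness=m)
            print(f"wetness {m:.1f} -> FS = {fs:.2f}")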

  7. A systematic review of intervention thresholds based on FRAX : A report prepared for the National Osteoporosis Guideline Group and the International Osteoporosis Foundation

    PubMed Central

    Kanis, John A; Harvey, Nicholas C; Cooper, Cyrus; Johansson, Helena; Odén, Anders; McCloskey, Eugene V

    2016-01-01

    probability but in the absence of a previous fracture (i.e. at the 'fracture threshold') should also be eligible. Under current NOGG guidelines, based on age-dependent probability thresholds, inequalities in access to therapy arise especially at older ages (≥ 70 years) depending on the presence or absence of a prior fracture. An alternative threshold using a hybrid model reduces this disparity. The use of FRAX (fixed or age-dependent thresholds) as the gateway to assessment identifies individuals at high risk more effectively than the use of BMD. However, the setting of intervention thresholds needs to be country-specific. PMID:27465509

  8. A simple plug-in bagging ensemble based on threshold-moving for classifying binary and multiclass imbalanced data.

    PubMed

    Collell, Guillem; Prelec, Drazen; Patil, Kaustubh R

    2018-01-31

    Class imbalance presents a major hurdle in the application of classification methods. A commonly taken approach is to learn ensembles of classifiers using rebalanced data. Examples include bootstrap aggregating (bagging) combined with either undersampling or oversampling of the minority class examples. However, rebalancing methods entail asymmetric changes to the examples of different classes, which in turn can introduce their own biases. Furthermore, these methods often require specifying the performance measure of interest a priori, i.e., before learning. An alternative is to employ the threshold-moving technique, which applies a threshold to the continuous output of a model, offering the possibility to adapt to a performance measure a posteriori, i.e., a plug-in method. Surprisingly, little attention has been paid to this combination of a bagging ensemble and threshold-moving. In this paper, we study this combination and demonstrate its competitiveness. In contrast to the other resampling methods, we preserve the natural class distribution of the data, resulting in well-calibrated posterior probabilities. Additionally, we extend the proposed method to handle multiclass data. We validated our method on binary and multiclass benchmark data sets by using both decision trees and neural networks as base classifiers. We perform analyses that provide insights into the proposed method.
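    A compact sketch of the combination studied here, assembled from scikit-learn building blocks (the base estimator, threshold grid and F1 target are illustrative assumptions, and in practice the cut-off would be tuned on a held-out validation split rather than the test set): learn a bagging ensemble on the data with its natural class distribution, then move the decision threshold on the averaged class probabilities a posteriori.

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.ensemble import BaggingClassifier
        from sklearn.metrics import f1_score
        from sklearn.model_selection import train_test_split
        from sklearn.tree import DecisionTreeClassifier

        # Imbalanced toy data (about 5% positives)
        X, y = make_classification(n_samples=5000, weights=[0.95], flip_y=0.01,
                                   random_state=0)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

        # Bagging on the natural class distribution keeps probabilities well calibrated
        clf = BaggingClassifier(DecisionTreeClassifier(), n_estimators=100,
                                random_state=0).fit(X_tr, y_tr)
        proba = clf.predict_proba(X_te)[:, 1]

        # Threshold-moving: choose the cut-off a posteriori for the measure of interest
        # (tuned on the test split here for brevity; use a validation split in practice)
        thresholds = np.linspace(0.01, 0.99, 99)
        scores = [f1_score(y_te, proba >= t) for t in thresholds]
        best = thresholds[int(np.argmax(scores))]
        print(f"best threshold {best:.2f}, F1 = {max(scores):.3f} "
              f"(default 0.5 gives F1 = {f1_score(y_te, proba >= 0.5):.3f})")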

  9. Heritability of Autism Spectrum Disorder in a UK Population-Based Twin Sample

    PubMed Central

    Colvert, Emma; Tick, Beata; McEwen, Fiona; Stewart, Catherine; Curran, Sarah R.; Woodhouse, Emma; Gillan, Nicola; Hallett, Victoria; Lietz, Stephanie; Garnett, Tracy; Ronald, Angelica; Plomin, Robert; Rijsdijk, Frühling; Happé, Francesca; Bolton, Patrick

    2016-01-01

    IMPORTANCE Most evidence to date highlights the importance of genetic influences on the liability to autism and related traits. However, most of these findings are derived from clinically ascertained samples, possibly missing individuals with subtler manifestations, and obtained estimates may not be representative of the population. OBJECTIVES To establish the relative contributions of genetic and environmental factors in liability to autism spectrum disorder (ASD) and a broader autism phenotype in a large population-based twin sample and to ascertain the genetic/environmental relationship between dimensional trait measures and categorical diagnostic constructs of ASD. DESIGN, SETTING, AND PARTICIPANTS We used data from the population-based cohort Twins Early Development Study, which included all twin pairs born in England and Wales from January 1, 1994, through December 31, 1996. We performed joint continuous-ordinal liability threshold model fitting using the full information maximum likelihood method to estimate genetic and environmental parameters of covariance. Twin pairs underwent the following assessments: the Childhood Autism Spectrum Test (CAST) (6423 pairs; mean age, 7.9 years), the Development and Well-being Assessment (DAWBA) (359 pairs; mean age, 10.3 years), the Autism Diagnostic Observation Schedule (ADOS) (203 pairs; mean age, 13.2 years), the Autism Diagnostic Interview–Revised (ADI-R) (205 pairs; mean age, 13.2 years), and a best-estimate diagnosis (207 pairs). MAIN OUTCOMES AND MEASURES Participants underwent screening using a population-based measure of autistic traits (CAST assessment), structured diagnostic assessments (DAWBA, ADI-R, and ADOS), and a best-estimate diagnosis. RESULTS On all ASD measures, correlations among monozygotic twins (range, 0.77-0.99) were significantly higher than those for dizygotic twins (range, 0.22-0.65), giving heritability estimates of 56% to 95%. The covariance of CAST and ASD diagnostic status (DAWBA, ADOS

  10. The threshold hypothesis: solving the equation of nurture vs nature in type 1 diabetes.

    PubMed

    Wasserfall, C; Nead, K; Mathews, C; Atkinson, M A

    2011-09-01

    For more than 40 years, the contributions of nurture (i.e. the environment) and nature (i.e. genetics) have been touted for their aetiological importance in type 1 diabetes. Disappointingly, knowledge gains in these areas, while individually successful, have to a large extent occurred in isolation from each other. One reason underlying this divide is the lack of a testable model that simultaneously considers the contributions of genetic and environmental determinants in the formation of this and potentially other disorders that are subject to these variables. To address this void, we have designed a model based on the hypothesis that the aetiological influences of genetics and environment, when evaluated as intersecting and reciprocal trend lines based on odds ratios, result in a method of concurrently evaluating both facets and defining the attributable risk of clinical onset of type 1 diabetes. The model, which we have elected to term the 'threshold hypothesis', also provides a novel means of conceptualising the complex interactions of nurture with nature in type 1 diabetes across various geographical populations.

  11. Time series sightability modeling of animal populations.

    PubMed

    ArchMiller, Althea A; Dorazio, Robert M; St Clair, Katherine; Fieberg, John R

    2018-01-01

    Logistic regression models, or "sightability models", fit to detection/non-detection data from marked individuals are often used to adjust for visibility bias in later detection-only surveys, with population abundance estimated using a modified Horvitz-Thompson (mHT) estimator. More recently, a model-based alternative for analyzing combined detection/non-detection and detection-only data was developed. This approach seemed promising, since it resulted in estimates similar to the mHT when applied to data from moose (Alces alces) surveys in Minnesota. More importantly, it provided a framework for developing flexible models for analyzing multiyear detection-only survey data in combination with detection/non-detection data. During initial attempts to extend the model-based approach to multiple years of detection-only data, we found that estimates of detection probabilities and population abundance were sensitive to the amount of detection-only data included in the combined (detection/non-detection and detection-only) analysis. Subsequently, we developed a robust hierarchical modeling approach where sightability model parameters are informed only by the detection/non-detection data, and we used this approach to fit a fixed-effects model (FE model) with year-specific parameters and a temporally-smoothed model (TS model) that shares information across years via random effects and a temporal spline. The abundance estimates from the TS model were more precise, with decreased interannual variability relative to the FE model and mHT abundance estimates, illustrating the potential benefits of model-based approaches that allow information to be shared across years.
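    A bare-bones version of the classical first step (the covariate, model form and numbers are invented, and the hierarchical FE/TS models developed in the paper are not reproduced): fit a logistic sightability model to detection/non-detection data from marked animals, then inflate each group seen in a detection-only survey by the inverse of its predicted detection probability, in the spirit of the modified Horvitz-Thompson estimator.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(3)

        # Calibration data from marked animals: detection ~ visual obstruction cover
        n_marked = 300
        cover = rng.uniform(0, 1, n_marked)
        p_detect = 1 / (1 + np.exp(-(2.0 - 4.0 * cover)))       # true (unknown) model
        detected = rng.random(n_marked) < p_detect

        sight = LogisticRegression().fit(cover.reshape(-1, 1), detected)

        # Operational detection-only survey: groups that were seen, with their covariates
        seen_cover = rng.uniform(0, 0.8, 120)
        group_size = rng.integers(1, 5, 120)

        # Horvitz-Thompson-style correction: inflate each seen group by 1 / p_hat
        p_hat = sight.predict_proba(seen_cover.reshape(-1, 1))[:, 1]
        abundance = np.sum(group_size / p_hat)
        print(f"estimated abundance in surveyed area: {abundance:.0f}")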

  12. Discriminating the precipitation phase based on different temperature thresholds in the Songhua River Basin, China

    NASA Astrophysics Data System (ADS)

    Zhong, Keyuan; Zheng, Fenli; Xu, Ximeng; Qin, Chao

    2018-06-01

    Different precipitation phases (rain, snow or sleet) differ greatly in their hydrological and erosional processes. Therefore, accurate discrimination of the precipitation phase is highly important when researching hydrologic processes and climate change at high latitudes and mountainous regions. The objective of this study was to identify suitable temperature thresholds for discriminating the precipitation phase in the Songhua River Basin (SRB) based on 20-year daily precipitation collected from 60 meteorological stations located in and around the basin. Two methods, the air temperature method (AT method) and the wet bulb temperature method (WBT method), were used to discriminate the precipitation phase. Thirteen temperature thresholds were used to discriminate snowfall in the SRB. These thresholds included air temperatures from 0 to 5.5 °C at intervals of 0.5 °C and the wet bulb temperature (WBT). Three evaluation indices, the error percentage of discriminated snowfall days (Ep), the relative error of discriminated snowfall (Re) and the determination coefficient (R2), were applied to assess the discrimination accuracy. The results showed that 2.5 °C was the optimum threshold temperature for discriminating snowfall at the scale of the entire basin. Due to differences in the landscape conditions at the different stations, the optimum threshold varied by station. The optimal threshold ranged 1.5-4.0 °C, and 19 stations, 17 stations and 18 stations had optimal thresholds of 2.5 °C, 3.0 °C, and 3.5 °C respectively, occupying 90% of all stations. Compared with using a single suitable temperature threshold to discriminate snowfall throughout the basin, it was more accurate to use the optimum threshold at each station to estimate snowfall in the basin. In addition, snowfall was underestimated when the temperature threshold was the WBT and when the temperature threshold was below 2.5 °C, whereas snowfall was overestimated when the temperature threshold exceeded 4
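    The threshold comparison itself is simple to emulate. The sketch below classifies a day as snowfall when the air temperature is at or below a candidate threshold and scores each candidate against synthetic observations; the synthetic data and the plain misclassification rate stand in for the station records and the three indices (Ep, Re, R2) used in the study.

        import numpy as np

        rng = np.random.default_rng(5)

        # Synthetic daily records: air temperature (deg C) and observed phase (True = snow)
        temp = rng.normal(1.0, 4.0, 5000)
        p_snow = 1 / (1 + np.exp((temp - 2.0) / 1.0))   # snow probability falls with temp
        obs_snow = rng.random(5000) < p_snow

        for t in np.arange(0.0, 5.6, 0.5):
            pred_snow = temp <= t
            err = np.mean(pred_snow != obs_snow) * 100   # misclassified days (%)
            print(f"threshold {t:3.1f} C: error {err:5.1f} %")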

  13. Mosquito population dynamics from cellular automata-based simulation

    NASA Astrophysics Data System (ADS)

    Syafarina, Inna; Sadikin, Rifki; Nuraini, Nuning

    2016-02-01

    In this paper we present an innovative model for simulating mosquito-vector population dynamics. The simulation consists of two stages: demography and dispersal dynamics. For the demography stage, we follow an existing model of the mosquito life cycle. For dispersal, we use a cellular automata-based model to simulate movement of the vector. In the simulation, each individual vector is able to move to other grid cells based on a random walk. Our model is also capable of representing an immunity factor for each grid cell. We simulate the model to evaluate its correctness. Based on the simulations, we conclude that our model is correct. However, the model needs to be improved with realistic parameters to match real data.

  14. Rainfall thresholds for possible landslide occurrence in Italy

    NASA Astrophysics Data System (ADS)

    Peruccacci, Silvia; Brunetti, Maria Teresa; Gariano, Stefano Luigi; Melillo, Massimo; Rossi, Mauro; Guzzetti, Fausto

    2017-08-01

    The large physiographic variability and the abundance of landslide and rainfall data make Italy an ideal site to investigate variations in the rainfall conditions that can result in rainfall-induced landslides. We used landslide information obtained from multiple sources and rainfall data captured by 2228 rain gauges to build a catalogue of 2309 rainfall events with - mostly shallow - landslides in Italy between January 1996 and February 2014. For each rainfall event with landslides, we reconstructed the rainfall history that presumably caused the slope failure, and we determined the corresponding rainfall duration D (in hours) and cumulated event rainfall E (in mm). Adopting a power law threshold model, we determined cumulated event rainfall-rainfall duration (ED) thresholds, at 5% exceedance probability, and their uncertainty. We defined a new national threshold for Italy, and 26 regional thresholds for environmental subdivisions based on topography, lithology, land-use, land cover, climate, and meteorology, and we used the thresholds to study the variations of the rainfall conditions that can result in landslides in different environments, in Italy. We found that the national and the environmental thresholds cover a small part of the possible DE domain. The finding supports the use of empirical rainfall thresholds for landslide forecasting in Italy, but poses an empirical limitation to the possibility of defining thresholds for small geographical areas. We observed differences between some of the thresholds. With increasing mean annual precipitation (MAP), the thresholds become higher and steeper, indicating that more rainfall is needed to trigger landslides where the MAP is high than where it is low. This suggests that the landscape adjusts to the regional meteorological conditions. We also observed that the thresholds are higher for stronger rocks, and that forested areas require more rainfall than agricultural areas to initiate landslides. Finally, we
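    The ED threshold construction can be sketched in a few lines under simplifying assumptions (synthetic rainfall events, an ordinary least-squares fit in log-log space, and an empirical residual quantile in place of the authors' frequentist fitting and uncertainty procedure): fit E = alpha * D^gamma to the triggering events and lower the intercept so that only 5% of them fall below the curve.

        import numpy as np

        rng = np.random.default_rng(11)

        # Synthetic rainfall events with landslides: duration D (h), cumulated rainfall E (mm)
        D = 10 ** rng.uniform(0.3, 2.7, 800)
        E = 7.0 * D ** 0.4 * 10 ** rng.normal(0.25, 0.2, 800)

        logD, logE = np.log10(D), np.log10(E)
        gamma, intercept = np.polyfit(logD, logE, 1)             # central (50%) fit
        residuals = logE - (intercept + gamma * logD)
        intercept_5 = intercept + np.quantile(residuals, 0.05)   # 5% exceedance curve
        alpha_5 = 10 ** intercept_5

        print(f"threshold: E = {alpha_5:.1f} * D^{gamma:.2f}  (5% exceedance)")

    Bootstrapping the event catalogue and refitting would then give the threshold uncertainty that the record reports for the national and environmental subdivisions.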

  15. Laser damage threshold measurements of microstructure-based high reflectors

    NASA Astrophysics Data System (ADS)

    Hobbs, Douglas S.

    2008-10-01

    In 2007, the pulsed laser-induced damage threshold (LIDT) of anti-reflecting (AR) microstructures built in fused silica and glass was shown to be up to three times greater than the LIDT of single-layer thin-film AR coatings, and at least five times greater than that of multiple-layer thin-film AR coatings. This result suggested that microstructure-based wavelength-selective mirrors might also exhibit high LIDT. Efficient light reflection over a narrow spectral range can be produced by an array of sub-wavelength-sized surface relief microstructures built in a waveguide configuration. Such surface structure resonant (SSR) filters typically achieve a reflectivity exceeding 99% over a 1-10 nm range about the filter center wavelength, making SSR filters useful as laser high reflectors (HR). SSR laser mirrors consist of microstructures that are first etched in the surface of fused silica and borosilicate glass windows and subsequently coated with a thin layer of a non-absorbing high-refractive-index dielectric material such as tantalum pentoxide or zinc sulfide. Results of an initial investigation into the LIDT of single-layer SSR laser mirrors operating at 532 nm, 1064 nm, and 1573 nm are described along with data from SEM analysis of the microstructures and spectral reflection measurements. None of the twelve samples tested exhibited damage thresholds above 3 J/cm2 when illuminated at the resonant wavelength, indicating that the simple single-layer, first-order design will need further development to be suitable for high-power laser applications. Samples of SSR high reflectors entered in the Thin Film Damage Competition also exhibited low damage thresholds of less than 1 J/cm2 for the ZnS-coated SSR, and just over 4 J/cm2 for the Ta2O5-coated SSR.

  16. Quantile-based permutation thresholds for quantitative trait loci hotspots.

    PubMed

    Neto, Elias Chaibub; Keller, Mark P; Broman, Andrew F; Attie, Alan D; Jansen, Ritsert C; Broman, Karl W; Yandell, Brian S

    2012-08-01

    Quantitative trait loci (QTL) hotspots (genomic locations affecting many traits) are a common feature in genetical genomics studies and are biologically interesting since they may harbor critical regulators. Therefore, statistical procedures to assess the significance of hotspots are of key importance. One approach, randomly allocating observed QTL across the genomic locations separately by trait, implicitly assumes all traits are uncorrelated. Recently, an empirical test for QTL hotspots was proposed on the basis of the number of traits that exceed a predetermined LOD value, such as the standard permutation LOD threshold. The permutation null distribution of the maximum number of traits across all genomic locations preserves the correlation structure among the phenotypes, avoiding the detection of spurious hotspots due to nongenetic correlation induced by uncontrolled environmental factors and unmeasured variables. However, by considering only the number of traits above a threshold, without accounting for the magnitude of the LOD scores, relevant information is lost. In particular, biologically interesting hotspots composed of a moderate to small number of traits with strong LOD scores may be neglected as nonsignificant. In this article we propose a quantile-based permutation approach that simultaneously accounts for the number and the LOD scores of traits within the hotspots. By considering a sliding scale of mapping thresholds, our method can assess the statistical significance of both small and large hotspots. Although the proposed approach can be applied to any type of heritable high-volume "omic" data set, we restrict our attention to expression (e)QTL analysis. We assess and compare the performances of these three methods in simulations and we illustrate how our approach can effectively assess the significance of moderate and small hotspots with strong LOD scores in a yeast expression data set.

  17. Cost-effectiveness of community-based strategies to strengthen the continuum of HIV care in rural South Africa: a health economic modelling analysis.

    PubMed

    Smith, Jennifer A; Sharma, Monisha; Levin, Carol; Baeten, Jared M; van Rooyen, Heidi; Celum, Connie; Hallett, Timothy B; Barnabas, Ruanne V

    2015-04-01

    Home HIV counselling and testing (HTC) achieves high coverage of testing and linkage to care compared with existing facility-based approaches, particularly among asymptomatic individuals. In a modelling analysis we aimed to assess the effect on population-level health and cost-effectiveness of a community-based package of home HTC in KwaZulu-Natal, South Africa. We parameterised an individual-based model with data from home HTC and linkage field studies that achieved high coverage (91%) and linkage to antiretroviral therapy (80%) in rural KwaZulu-Natal, South Africa. Costs were derived from a linked microcosting study. The model simulated 10,000 individuals over 10 years and incremental cost-effectiveness ratios were calculated for the intervention relative to the existing status quo of facility-based testing, with costs discounted at 3% annually. The model predicted implementation of home HTC in addition to current practice to decrease HIV-associated morbidity by 10–22% and HIV infections by 9–48% with increasing CD4 cell count thresholds for antiretroviral therapy initiation. Incremental programme costs were US$2·7 million to $4·4 million higher in the intervention scenarios than at baseline, and costs increased with higher CD4 cell count thresholds for antiretroviral therapy initiation; antiretroviral therapy accounted for 48–87% of total costs. Incremental cost-effectiveness ratios per disability-adjusted life-year averted were $1340 at an antiretroviral therapy threshold of CD4 count lower than 200 cells per μL, $1090 at lower than 350 cells per μL, $1150 at lower than 500 cells per μL, and $1360 at universal access to antiretroviral therapy. Community-based HTC with enhanced linkage to care can result in increased HIV testing coverage and treatment uptake, decreasing the population burden of HIV-associated morbidity and mortality. The incremental cost-effectiveness ratios are less than 20% of South Africa's gross domestic product per person, and

  18. Hierarchical spatial capture-recapture models: Modeling population density from stratified populations

    USGS Publications Warehouse

    Royle, J. Andrew; Converse, Sarah J.

    2014-01-01

    Capture–recapture studies are often conducted on populations that are stratified by space, time or other factors. In this paper, we develop a Bayesian spatial capture–recapture (SCR) modelling framework for stratified populations – when sampling occurs within multiple distinct spatial and temporal strata. We describe a hierarchical model that integrates distinct models both for the spatial encounter history data from capture–recapture sampling and for variation in density among strata. We use an implementation of data augmentation to parameterize the model in terms of a latent categorical stratum or group membership variable, which provides a convenient implementation in popular BUGS software packages. We provide an example application to an experimental study involving small-mammal sampling on multiple trapping grids over multiple years, where the main interest is in modelling a treatment effect on population density among the trapping grids. Many capture–recapture studies involve some aspect of spatial or temporal replication that requires some attention to modelling variation among groups or strata. We propose a hierarchical model that allows explicit modelling of group or strata effects. Because the model is formulated for individual encounter histories and is easily implemented in the BUGS language and other free software, it also provides a general framework for modelling individual effects, such as are present in SCR models.

  19. History of research on modelling gypsy moth population ecology

    Treesearch

    J. J. Colbert

    1991-01-01

    History of research to develop models of gypsy moth population dynamics and some related studies are described. Empirical regression-based models are reviewed, and then the more comprehensive process models are discussed. Current model-related research efforts are introduced.

  20. Population models of burrowing mayfly recolonization in Western Lake Erie

    USGS Publications Warehouse

    Madenjian, C.P.; Schloesser, D.W.; Krieger, K.A.

    1998-01-01

    Burrowing mayflies, Hexagenia spp. (H. limbata and H. rigida), began recolonizing western Lake Erie during the 1990s. Survey data for mayfly nymph densities indicated that the population experienced exponential growth between 1991 and 1997. To predict the time to full recovery of the mayfly population, we fitted logistic models, ranging in carrying capacity from 600 to 2000 nymphs/m2, to these survey data. Based on the fitted logistic curves, we forecast that the mayfly population in western Lake Erie would achieve full recovery between years 1998 and 2000, depending on the carrying capacity of the western basin. Additionally, we estimated the mortality rate of nymphs in western Lake Erie during 1994 and then applied an age-based matrix model to the mayfly population. The results of the matrix population modeling corroborated the exponential growth model application in that both methods yielded an estimate of the population growth rate, r, in excess of 0.8 yr-1. This was the first evidence that mayfly populations are capable of recolonizing large aquatic ecosystems at rates comparable with those observed in much smaller lentic ecosystems. Our model predictions should prove valuable to managers of power plant facilities along the western basin in planning for mayfly emergences and to managers of the yellow perch (Perca flavescens) fishery in western Lake Erie.
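    A compact version of the logistic projection described above, assuming an illustrative density series and a fixed carrying capacity rather than the published Lake Erie data: fit the intrinsic rate r to survey densities via least squares on the logistic solution, then solve for the year the population reaches 95% of carrying capacity.

        import numpy as np
        from scipy.optimize import curve_fit

        def logistic(t, r, n0, k=1000.0):
            """Logistic growth solution N(t) with carrying capacity k (nymphs/m^2)."""
            return k / (1 + (k - n0) / n0 * np.exp(-r * t))

        # Illustrative survey densities (nymphs/m^2), years since 1991
        years = np.array([0, 1, 2, 3, 4, 5, 6], dtype=float)
        density = np.array([5, 12, 25, 60, 130, 280, 520], dtype=float)

        # Fit r and the initial density; k stays at its assumed default
        (r_hat, n0_hat), _ = curve_fit(logistic, years, density, p0=[0.8, 5.0])

        # Year at which the population reaches 95% of carrying capacity
        k = 1000.0
        t_recover = np.log((k - n0_hat) / n0_hat * 0.95 / 0.05) / r_hat
        print(f"r = {r_hat:.2f} per yr, ~95% of K reached {t_recover:.1f} yr after 1991")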