Science.gov

Sample records for population-based threshold model

  1. Threshold models in radiation carcinogenesis

    SciTech Connect

    Hoel, D.G.; Li, P.

    1998-09-01

    Cancer incidence and mortality data from the atomic bomb survivors cohort have been analyzed to allow for the possibility of a threshold dose response. The same dose-response models as used in the original papers were fit to the data. The estimated cancer incidence from the fitted models over-predicted the observed cancer incidence in the lowest exposure group, which is consistent with a threshold or nonlinear dose response at low doses. Thresholds were added to the dose-response models, and the range of possible thresholds is shown for solid tumor cancers as well as for the different leukemia types. This analysis suggests that the A-bomb cancer incidence data agree better with a threshold or nonlinear dose-response model than with a purely linear model, although the linear model is statistically equivalent. This observation is not found in the mortality data. For both the incidence and mortality data, the addition of a threshold term significantly improves the fit of the linear or linear-quadratic dose response for total leukemias and for the leukemia subtypes ALL, AML, and CML.
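
    As an illustration of adding a threshold term to a dose-response fit, the following Python sketch fits a linear excess-relative-risk model with and without a threshold to hypothetical dose and relative-risk values (the data, starting values, and the suggestion of a formal Poisson likelihood comparison are assumptions, not taken from the study):

      import numpy as np
      from scipy.optimize import curve_fit

      # Hypothetical dose groups (Gy) and observed relative risks.
      dose = np.array([0.0, 0.05, 0.1, 0.2, 0.5, 1.0, 2.0])
      rr_obs = np.array([1.00, 1.01, 1.02, 1.08, 1.25, 1.52, 2.05])

      def linear_rr(d, beta):
          """Purely linear dose response: RR = 1 + beta * d."""
          return 1.0 + beta * d

      def threshold_rr(d, beta, d0):
          """Linear dose response with a threshold d0: no excess risk below d0."""
          return 1.0 + beta * np.clip(d - d0, 0.0, None)

      p_lin, _ = curve_fit(linear_rr, dose, rr_obs, p0=[0.5])
      p_thr, _ = curve_fit(threshold_rr, dose, rr_obs, p0=[0.5, 0.1])

      # Compare residual sums of squares; a formal comparison would use a
      # likelihood-ratio or deviance test on the underlying Poisson counts.
      rss_lin = np.sum((rr_obs - linear_rr(dose, *p_lin)) ** 2)
      rss_thr = np.sum((rr_obs - threshold_rr(dose, *p_thr)) ** 2)
      print(f"linear RSS={rss_lin:.4f}, threshold RSS={rss_thr:.4f}, fitted d0={p_thr[1]:.3f} Gy")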

  2. Universal Screening for Emotional and Behavioral Problems: Fitting a Population-Based Model

    ERIC Educational Resources Information Center

    Schanding, G. Thomas, Jr.; Nowell, Kerri P.

    2013-01-01

    Schools have begun to adopt a population-based approach to conceptualizing the assessment of and intervention for students; however, little empirical evidence has been gathered to support this shift in service delivery. The present study examined the fit of a population-based model in identifying students' behavioral and emotional functioning using a…

  3. POPULATION-BASED EXPOSURE MODELING FOR AIR POLLUTANTS AT EPA'S NATIONAL EXPOSURE RESEARCH LABORATORY

    EPA Science Inventory

    The US EPA's National Exposure Research Laboratory (NERL) has been developing, applying, and evaluating population-based exposure models to improve our understanding of the variability in personal exposure to air pollutants. Estimates of population variability are needed for E...

  4. Population based models of cortical drug response: insights from anaesthesia

    PubMed Central

    Bojak, Ingo; Liley, David T. J.

    2008-01-01

    A great explanatory gap lies between the molecular pharmacology of psychoactive agents and the neurophysiological changes they induce, as recorded by neuroimaging modalities. Causally relating the cellular actions of psychoactive compounds to their influence on population activity is experimentally challenging. Recent developments in the dynamical modelling of neural tissue have attempted to span this explanatory gap between microscopic targets and their macroscopic neurophysiological effects via a range of biologically plausible dynamical models of cortical tissue. Such theoretical models allow exploration of neural dynamics, in particular their modification by drug action. The ability to theoretically bridge scales is due to a biologically plausible averaging of cortical tissue properties. In the resulting macroscopic neural field, individual neurons need not be explicitly represented (as in neural networks). The following paper aims to provide a non-technical introduction to the mean field population modelling of drug action and its recent successes in modelling anaesthesia. PMID:19003456

  5. Validation of population-based disease simulation models: a review of concepts and methods

    PubMed Central

    2010-01-01

    Background Computer simulation models are used increasingly to support public health research and policy, but questions about their quality persist. The purpose of this article is to review the principles and methods for validation of population-based disease simulation models. Methods We developed a comprehensive framework for validating population-based chronic disease simulation models and used this framework in a review of published model validation guidelines. Based on the review, we formulated a set of recommendations for gathering evidence of model credibility. Results Evidence of model credibility derives from examining: 1) the process of model development, 2) the performance of a model, and 3) the quality of decisions based on the model. Many important issues in model validation are insufficiently addressed by current guidelines. These issues include a detailed evaluation of different data sources, graphical representation of models, computer programming, model calibration, between-model comparisons, sensitivity analysis, and predictive validity. The role of external data in model validation depends on the purpose of the model (e.g., decision analysis versus prediction). More research is needed on the methods of comparing the quality of decisions based on different models. Conclusion As the role of simulation modeling in population health is increasing and models are becoming more complex, there is a need for further improvements in model validation methodology and common standards for evaluating model credibility. PMID:21087466

  6. Models of population-based analyses for data collected from large extended families

    PubMed Central

    Lee, Elisa T.; Howard, Barbara V.; Fabsitz, Richard R.; Devereux, Richard B.; MacCluer, Jean W.; Laston, Sandra; Comuzzie, Anthony G.; Shara, Nawar M.; Welty, Thomas K.

    2014-01-01

    Large studies of extended families usually collect valuable phenotypic data that may have scientific value for purposes other than testing genetic hypotheses, provided the families were not selected in a biased manner. These purposes include assessing population-based associations of diseases with risk factors/covariates and estimating population characteristics such as disease prevalence and incidence. Relatedness among participants, however, violates the traditional assumption of independent observations in these classic analyses. The common adjustment for relatedness in population-based analyses is to use marginal models, in which clusters (families) are assumed to be independent (unrelated) and to share a simple, identical covariance (family) structure, such as the independence, exchangeable, or unstructured covariance structures. However, these simple covariance structures may not be appropriate for outcomes collected from large extended families, and may under- or over-estimate the variances of estimators and thus lead to uncertain inferences. Moreover, the assumption that families are unrelated and share an identical family structure in a marginal model may not be satisfied in family studies with large extended families. The aim of this paper is to propose models that combine the marginal-model approach with a covariance structure for assessing population-based associations of diseases with their risk factors/covariates and for estimating population characteristics in epidemiological studies, while adjusting for the complicated relatedness among outcomes (continuous/categorical, normally/non-normally distributed) collected from large extended families. We also discuss theoretical issues of the proposed models and show that the proposed models and covariance structure are appropriate for and capable of achieving this aim. PMID:20882324
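
    A minimal sketch of the commonly used marginal-model adjustment mentioned above, a GEE with an exchangeable working covariance fit to simulated family-clustered data (variable names and data are illustrative, not the models proposed in the paper):

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm

      rng = np.random.default_rng(0)

      # Simulate outcomes clustered within families (hypothetical data).
      n_fam, fam_size = 100, 6
      family = np.repeat(np.arange(n_fam), fam_size)
      fam_effect = np.repeat(rng.normal(0, 1, n_fam), fam_size)  # shared family component
      risk_factor = rng.normal(0, 1, n_fam * fam_size)
      outcome = 1.0 + 0.5 * risk_factor + fam_effect + rng.normal(0, 1, n_fam * fam_size)

      df = pd.DataFrame({"y": outcome, "x": risk_factor, "family": family})

      # Marginal model: families treated as independent clusters, with an
      # exchangeable working correlation among members of the same family.
      model = sm.GEE.from_formula("y ~ x", groups="family", data=df,
                                  cov_struct=sm.cov_struct.Exchangeable(),
                                  family=sm.families.Gaussian())
      result = model.fit()
      print(result.summary())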

  7. Error Threshold of Fully Random Eigen Model

    NASA Astrophysics Data System (ADS)

    Li, Duo-Fang; Cao, Tian-Guang; Geng, Jin-Peng; Qiao, Li-Hua; Gu, Jian-Zhong; Zhan, Yong

    2015-01-01

    Species evolution is essentially a random process of interaction between biological populations and their environments. As a result, some physical parameters in evolution models are subject to statistical fluctuations. In this work, two important parameters in the Eigen model, the fitness and the mutation rate, are treated simultaneously as Gaussian-distributed random variables to examine the properties of the error threshold. Numerical simulation results show that the error threshold in the fully random model appears as a crossover region instead of a phase transition point, and as the fluctuation strength increases, the crossover region becomes progressively smoother. Furthermore, it is shown that the randomization of the mutation rate plays a dominant role in changing the error threshold in the fully random model, which is consistent with the existing experimental data. The implication of the threshold change due to the randomization for antiviral strategies is discussed.
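
    As a rough illustration, the following sketch uses the standard single-peak (two-class) approximation of the Eigen model, with the master-sequence fitness and the per-site error rate drawn as Gaussian random variables; the parameter values are assumptions and this simplification is far cruder than the full model studied in the paper:

      import numpy as np

      rng = np.random.default_rng(1)
      L = 50  # sequence length

      def master_freq(A, eps):
          """Steady-state master-sequence frequency in the single-peak (two-class)
          Eigen model: x = (A*Q - 1) / (A - 1), clipped at 0, where
          Q = (1 - eps)**L is the probability of error-free copying."""
          Q = (1.0 - eps) ** L
          return np.clip((A * Q - 1.0) / (A - 1.0), 0.0, None)

      eps_grid = np.linspace(0.0, 0.1, 201)   # mean per-site error rate
      A0, sigma_A, sigma_eps = 10.0, 2.0, 0.01
      n_samples = 5000

      mean_x = []
      for eps0 in eps_grid:
          A = np.clip(rng.normal(A0, sigma_A, n_samples), 1.01, None)
          eps = np.clip(rng.normal(eps0, sigma_eps, n_samples), 0.0, 1.0)
          mean_x.append(master_freq(A, eps).mean())

      # Without fluctuations the threshold is sharp at eps* = 1 - A0**(-1/L);
      # with Gaussian fluctuations the transition becomes a smooth crossover.
      eps_star = 1.0 - A0 ** (-1.0 / L)
      print(f"deterministic error threshold: eps* = {eps_star:.4f}")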

  8. Scalable Entity-Based Modeling of Population-Based Systems, Final LDRD Report

    SciTech Connect

    Cleary, A J; Smith, S G; Vassilevska, T K; Jefferson, D R

    2005-01-27

    The goal of this project has been to develop tools, capabilities and expertise in the modeling of complex population-based systems via scalable entity-based modeling (EBM). Our initial focal application domain has been the dynamics of large populations exposed to disease-causing agents, a topic of interest to the Department of Homeland Security in the context of bioterrorism. In the academic community, discrete simulation technology based on individual entities has shown initial success, but the technology has not been scaled to the problem sizes or computational resources of LLNL. Our developmental emphasis has been on the extension of this technology to parallel computers and maturation of the technology from an academic to a lab setting.

  9. Comprehensive, Population-Based Sensitivity Analysis of a Two-Mass Vocal Fold Model.

    PubMed

    Robertson, Daniel; Zañartu, Matías; Cook, Douglas

    2016-01-01

    Previous vocal fold modeling studies have generally focused on generating detailed data regarding a narrow subset of possible model configurations. These studies can be interpreted to be the investigation of a single subject under one or more vocal conditions. In this study, a broad population-based sensitivity analysis is employed to examine the behavior of a virtual population of subjects and to identify trends between virtual individuals as opposed to investigating a single subject or model instance. Four different sensitivity analysis techniques were used in accomplishing this task. Influential relationships between model input parameters and model outputs were identified, and an exploration of the model's parameter space was conducted. Results indicate that the behavior of the selected two-mass model is largely dominated by complex interactions, and that few input-output pairs have a consistent effect on the model. Results from the analysis can be used to increase the efficiency of optimization routines of reduced-order models used to investigate voice abnormalities. Results also demonstrate the types of challenges and difficulties to be expected when applying sensitivity analyses to more complex vocal fold models. Such challenges are discussed and recommendations are made for future studies. PMID:26845452

  10. Comprehensive, Population-Based Sensitivity Analysis of a Two-Mass Vocal Fold Model

    PubMed Central

    Robertson, Daniel; Zañartu, Matías; Cook, Douglas

    2016-01-01

    Previous vocal fold modeling studies have generally focused on generating detailed data regarding a narrow subset of possible model configurations. These studies can be interpreted to be the investigation of a single subject under one or more vocal conditions. In this study, a broad population-based sensitivity analysis is employed to examine the behavior of a virtual population of subjects and to identify trends between virtual individuals as opposed to investigating a single subject or model instance. Four different sensitivity analysis techniques were used in accomplishing this task. Influential relationships between model input parameters and model outputs were identified, and an exploration of the model’s parameter space was conducted. Results indicate that the behavior of the selected two-mass model is largely dominated by complex interactions, and that few input-output pairs have a consistent effect on the model. Results from the analysis can be used to increase the efficiency of optimization routines of reduced-order models used to investigate voice abnormalities. Results also demonstrate the types of challenges and difficulties to be expected when applying sensitivity analyses to more complex vocal fold models. Such challenges are discussed and recommendations are made for future studies. PMID:26845452

  11. Differential equation models for sharp threshold dynamics.

    PubMed

    Schramm, Harrison C; Dimitrov, Nedialko B

    2014-01-01

    We develop an extension to differential equation models of dynamical systems to allow us to analyze probabilistic threshold dynamics that fundamentally and globally change system behavior. We apply our novel modeling approach to two cases of interest: a model of infectious disease modified for malware where a detection event drastically changes dynamics by introducing a new class in competition with the original infection; and the Lanchester model of armed conflict, where the loss of a key capability drastically changes the effectiveness of one of the sides. We derive and demonstrate a step-by-step, repeatable method for applying our novel modeling approach to an arbitrary system, and we compare the resulting differential equations to simulations of the system's random progression. Our work leads to a simple and easily implemented method for analyzing probabilistic threshold dynamics using differential equations. PMID:24184349
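
    A simplified, purely deterministic illustration of threshold-triggered dynamics in an ODE model: an SIR-like malware model in which detection at a fixed prevalence threshold introduces a competing 'patched' class. Parameters are hypothetical, and the fixed-threshold switch here stands in for the probabilistic thresholds analyzed in the paper:

      import numpy as np
      from scipy.integrate import solve_ivp

      beta, gamma, patch_rate = 0.4, 0.1, 0.3
      detect_threshold = 0.05   # prevalence at which detection changes the dynamics

      def pre_detection(t, y):
          s, i, p = y                      # susceptible, infected, patched
          return [-beta * s * i, beta * s * i - gamma * i, 0.0]

      def post_detection(t, y):
          s, i, p = y                      # patching now competes with the infection
          return [-beta * s * i - patch_rate * s,
                  beta * s * i - gamma * i,
                  patch_rate * s]

      def detection_event(t, y):
          return y[1] - detect_threshold   # zero when prevalence hits the threshold
      detection_event.terminal = True
      detection_event.direction = 1

      y0, t_end = [0.999, 0.001, 0.0], 200.0
      first = solve_ivp(pre_detection, (0.0, t_end), y0, events=detection_event)
      if first.t_events[0].size:           # threshold was crossed: switch dynamics
          t_det = first.t_events[0][0]
          second = solve_ivp(post_detection, (t_det, t_end), first.y[:, -1])
          print(f"detection at t={t_det:.1f}, final infected={second.y[1, -1]:.4f}")
      else:
          print("threshold never reached")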

  12. Toxicogenetics: population-based testing of drug and chemical safety in mouse models.

    PubMed

    Rusyn, Ivan; Gatti, Daniel M; Wiltshire, Timothy; Kleeberger, Steven R; Threadgill, David W

    2010-08-01

    The rapid decline in the cost of dense genotyping is paving the way for new DNA sequence-based laboratory tests to move quickly into clinical practice, and to ultimately help realize the promise of 'personalized' therapies. These advances are based on the growing appreciation of genetics as an important dimension in science and the practice of investigative pharmacology and toxicology. On the clinical side, both the regulators and the pharmaceutical industry hope that the early identification of individuals prone to adverse drug effects will keep advantageous medicines on the market for the benefit of the vast majority of prospective patients. On the environmental health protection side, there is a clear need for better science to define the range and causes of susceptibility to adverse effects of chemicals in the population, so that the appropriate regulatory limits are established. In both cases, most of the research effort is focused on genome-wide association studies in humans where de novo genotyping of each subject is required. At the same time, the power of population-based preclinical safety testing in rodent models (e.g., mouse) remains to be fully exploited. Here, we highlight the approaches available to utilize the knowledge of DNA sequence and genetic diversity of the mouse as a species in mechanistic toxicology research. We posit that appropriate genetically defined mouse models may be combined with the limited data from human studies to not only discover the genetic determinants of susceptibility, but to also understand the molecular underpinnings of toxicity. PMID:20704464

  13. A Simple Population-Based Finite Element Model Eliminates the Need for Patient-Specific Models to Predict Instability of the Shoulder

    PubMed Central

    Jones, Morgan H.; Walia, Piyush; Fening, Stephen D.; Miniaci, Anthony

    2016-01-01

    computed as a ratio of horizontal reaction force to the compressive load. Results: The specimen-specific model results agreed well with the experimental data for %IT, as the values were similar for the defects created. Results for SR were over-predicted by the FE model, but both the specimen-specific and cadaveric models showed similar linearly decreasing trends. In addition, a humeral head defect size of 44% reduced the %IT from 100% to nearly 0% for all three models. The comparison of all three models with increasing humeral defect size and a 20% glenoid defect is shown in Figure 1 at three arm positions. Conclusion: This study proposed a simple population-based model that can be used to estimate the loss in stability due to combined defects and to determine a threshold for defect augmentation in clinical practice. It was demonstrated that a glenoid defect as small as 10% combined with a 19% humeral head defect can cause significant instability. Similar to past studies, it was also shown that a glenoid defect leads to loss of translation and a humeral head defect leads to instability at a functional arm position of increased abduction and external rotation [5-6]. All three models predicted similar results during validation, which shows that the population-based model can be used to estimate stability instead of requiring patient-specific FE models. A limitation of the study is the absence of soft tissue restraints.

  14. Threshold modeling of extreme spatial rainfall

    NASA Astrophysics Data System (ADS)

    Thibaud, E.; Davison, A.

    2013-12-01

    Complex events such as sustained extreme precipitation have major effects on human populations and environmental sustainability, and there is a growing interest in modeling them realistically. For risk assessment based on spatial quantities such as the total amount of rainfall falling over a region, it is necessary to properly model the dependence among extremes over that region, based on data from perhaps only a few sites within it. We propose an approach to spatial modeling of extreme rainfall, based on max-stable processes fitted using partial duration series and a censored threshold likelihood function. The resulting models are coherent with classical extreme-value theory and allow the consistent treatment of spatial dependence of rainfall using ideas related to those of classical geostatistics. The method can be used to produce simulations needed for hydrological models, and in particular for the generation of spatially heterogeneous extreme rainfall fields over catchments. We illustrate the ideas through data from the Val Ferret watershed in the Swiss Alps, based on daily cumulative rainfall totals recorded at 24 stations for four summers, augmented by a longer series from nearby. References: Davison, A. C., Huser, R., Thibaud, E. (2013). Geostatistics of Dependent and Asymptotically Independent Extremes, Mathematical Geosciences, vol. 45, num. 5, p. 511-529, 2013, doi:10.1007/s11004-013-9469-y Thibaud, E., Mutzner, R., Davison A. C. (2013, to appear). Threshold modeling of extreme spatial rainfall, Water Resources Research, doi:10.1002/wrcr.20329

  15. A threshold model of investor psychology

    NASA Astrophysics Data System (ADS)

    Cross, Rod; Grinfeld, Michael; Lamba, Harbir; Seaman, Tim

    2005-08-01

    We introduce a class of agent-based market models founded upon simple descriptions of investor psychology. Agents are subject to various psychological tensions induced by market conditions and endowed with a minimal ‘personality’. This personality consists of a threshold level for each of the tensions being modeled, and the agent reacts whenever a tension threshold is reached. This paper considers an elementary model including just two such tensions. The first is ‘cowardice’, which is the stress caused by remaining in a minority position with respect to overall market sentiment and leads to herding-type behavior. The second is ‘inaction’, which is the increasing desire to act or re-evaluate one's investment position. There is no inductive learning by agents and they are only coupled via the global market price and overall market sentiment. Even incorporating just these two psychological tensions, important stylized facts of real market data, including fat-tails, excess kurtosis, uncorrelated price returns and clustered volatility over the timescale of a few days are reproduced. By then introducing an additional parameter that amplifies the effect of externally generated market noise during times of extreme market sentiment, long-time volatility correlations can also be recovered.
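
    A toy sketch of the threshold-switching mechanism described above, using a single 'cowardice' tension, uniform random thresholds, and a price driven by sentiment changes plus noise; all parameter values are illustrative and this is not the authors' full specification:

      import numpy as np

      rng = np.random.default_rng(2)
      n_agents, n_steps = 500, 2000
      state = rng.choice([-1, 1], n_agents)                 # +1 long, -1 short
      cowardice_threshold = rng.uniform(0.2, 0.8, n_agents) # heterogeneous 'personality'
      tension = np.zeros(n_agents)
      log_price, returns = 0.0, []

      for t in range(n_steps):
          sentiment = state.mean()
          # Tension accumulates for agents holding a minority position.
          in_minority = np.sign(state) != np.sign(sentiment if sentiment != 0 else 1)
          tension += 0.01 * in_minority
          # Agents whose tension exceeds their threshold switch position (herding).
          switch = tension > cowardice_threshold
          state[switch] *= -1
          tension[switch] = 0.0
          # Price moves with changes in sentiment plus external noise.
          ret = 0.5 * (state.mean() - sentiment) + 0.01 * rng.normal()
          log_price += ret
          returns.append(ret)

      returns = np.array(returns)
      excess_kurtosis = ((returns - returns.mean()) ** 4).mean() / returns.var() ** 2 - 3.0
      print(f"excess kurtosis of returns: {excess_kurtosis:.2f}")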

  16. Neural Field Models with Threshold Noise.

    PubMed

    Thul, Rüdiger; Coombes, Stephen; Laing, Carlo R

    2016-12-01

    The original neural field model of Wilson and Cowan is often interpreted as the averaged behaviour of a network of switch-like neural elements with a distribution of switch thresholds, giving rise to the classic sigmoidal population firing-rate function so prevalent in large-scale neuronal modelling. In this paper we explore the effects of such threshold noise without recourse to averaging and show that spatial correlations can have a strong effect on the behaviour of waves and patterns in continuum models. Moreover, for a prescribed spatial covariance function we explore the differences in behaviour that can emerge when the underlying stationary distribution is changed from Gaussian to non-Gaussian. For travelling front solutions, in a system with exponentially decaying spatial interactions, we make use of an interface approach to calculate the instantaneous wave speed analytically as a series expansion in the noise strength. From this we find that, for weak noise, the spatially averaged speed depends only on the choice of covariance function and not on the shape of the stationary distribution. For a system with a Mexican-hat spatial connectivity we further find that noise can induce localised bump solutions, and using an interface stability argument show that there can be multiple stable solution branches. PMID:26936267

  17. On the two steps threshold selection for over-threshold modelling of extreme events

    NASA Astrophysics Data System (ADS)

    Bernardara, Pietro; Mazas, Franck; Weiss, Jerome; Andreewsky, Marc; Kergadallan, Xavier; Benoit, Michel; Hamm, Luc

    2013-04-01

    The estimation of the probability of occurrence of extreme events is traditionally achieved by fitting a probability distribution to a sample of extreme observations. In particular, extreme value theory (EVT) states that values exceeding a given threshold converge to a Generalized Pareto Distribution (GPD) if the original sample is composed of independent and identically distributed values. However, temporal series of sea and ocean variables usually show strong temporal autocorrelation. Traditionally, in order to select independent events for the subsequent statistical analysis, the concept of a physical threshold is introduced: events that exceed that threshold are defined as "extreme events". This is the so-called peaks-over-threshold (POT) sampling, widespread in the literature and currently used for engineering applications, among many others. In the past, the threshold for the statistical sampling of extreme values asymptotically convergent toward the GPD and the threshold for the physical selection of independent extreme events were confused: the same threshold was used both to sample the data and to meet the hypothesis of extreme value convergence, leading to some inconsistencies. In particular, if the two steps are performed simultaneously, the number of peaks over the threshold can increase but also decrease when the threshold decreases. This is logical from a physical point of view, since the definition of the sample of "extreme events" changes, but it is not consistent with the statistical theory. We introduce a two-step threshold selection for over-threshold modelling, aiming to discriminate (i) a physical threshold for the selection of extreme and independent events, and (ii) a statistical threshold for optimizing consistency with the hypotheses of the EVT. The former is a physical event identification procedure (also called "declustering") aiming at selecting independent extreme events. The latter is a purely statistical optimization
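
    A minimal sketch of the two-step idea: declustering exceedances of a physical threshold into independent events, then fitting a Generalized Pareto Distribution above a separate statistical threshold. The data are surrogate, and the threshold choices here are arbitrary rather than optimized as in the paper:

      import numpy as np
      from scipy.stats import genpareto

      rng = np.random.default_rng(3)
      series = rng.gumbel(loc=2.0, scale=1.0, size=20_000)   # surrogate daily series

      # Step 1 (physical): select independent extreme events by declustering --
      # keep only the peak of each run of consecutive exceedances of u_phys.
      u_phys = np.quantile(series, 0.95)
      peaks, current = [], []
      for x in series:
          if x > u_phys:
              current.append(x)
          elif current:
              peaks.append(max(current))
              current = []
      if current:
          peaks.append(max(current))
      peaks = np.array(peaks)

      # Step 2 (statistical): choose a higher threshold u_stat on the declustered
      # peaks so that the excesses are well approximated by a GPD, then fit it.
      u_stat = np.quantile(peaks, 0.5)
      excesses = peaks[peaks > u_stat] - u_stat
      shape, loc, scale = genpareto.fit(excesses, floc=0.0)
      print(f"{len(peaks)} independent events, {len(excesses)} used for the GPD fit; "
            f"shape={shape:.3f}, scale={scale:.3f}")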

  18. Simulation of Population-Based Commuter Exposure to NO2 Using Different Air Pollution Models

    PubMed Central

    Ragettli, Martina S.; Tsai, Ming-Yi; Braun-Fahrländer, Charlotte; de Nazelle, Audrey; Schindler, Christian; Ineichen, Alex; Ducret-Stich, Regina E.; Perez, Laura; Probst-Hensch, Nicole; Künzli, Nino; Phuleria, Harish C.

    2014-01-01

    We simulated commuter routes and long-term exposure to traffic-related air pollution during commute in a representative population sample in Basel (Switzerland), and evaluated three air pollution models with different spatial resolution for estimating commute exposures to nitrogen dioxide (NO2) as a marker of long-term exposure to traffic-related air pollution. Our approach includes spatially and temporally resolved data on actual commuter routes, travel modes and three air pollution models. Annual mean NO2 commuter exposures were similar between models. However, we found more within-city and within-subject variability in annual mean (±SD) NO2 commuter exposure with a high resolution dispersion model (40 ± 7 µg m−3, range: 21–61) than with a dispersion model with a lower resolution (39 ± 5 µg m−3; range: 24–51), and a land use regression model (41 ± 5 µg m−3; range: 24–54). The highest median cumulative exposures were calculated along motorized transport and bicycle routes, and the lowest for walking. For estimating commuter exposure within a city, particularly when small-scale variability between roads is also of interest, a high-resolution model is recommended. For larger scale epidemiological health assessment studies, models with a coarser spatial resolution are likely sufficient, especially when study areas include suburban and rural areas. PMID:24823664

  19. Simulation of population-based commuter exposure to NO₂ using different air pollution models.

    PubMed

    Ragettli, Martina S; Tsai, Ming-Yi; Braun-Fahrländer, Charlotte; de Nazelle, Audrey; Schindler, Christian; Ineichen, Alex; Ducret-Stich, Regina E; Perez, Laura; Probst-Hensch, Nicole; Künzli, Nino; Phuleria, Harish C

    2014-05-01

    We simulated commuter routes and long-term exposure to traffic-related air pollution during commute in a representative population sample in Basel (Switzerland), and evaluated three air pollution models with different spatial resolution for estimating commute exposures to nitrogen dioxide (NO2) as a marker of long-term exposure to traffic-related air pollution. Our approach includes spatially and temporally resolved data on actual commuter routes, travel modes and three air pollution models. Annual mean NO2 commuter exposures were similar between models. However, we found more within-city and within-subject variability in annual mean (±SD) NO2 commuter exposure with a high resolution dispersion model (40 ± 7 µg m(-3), range: 21-61) than with a dispersion model with a lower resolution (39 ± 5 µg m(-3); range: 24-51), and a land use regression model (41 ± 5 µg m(-3); range: 24-54). The highest median cumulative exposures were calculated along motorized transport and bicycle routes, and the lowest for walking. For estimating commuter exposure within a city, particularly when small-scale variability between roads is also of interest, a high-resolution model is recommended. For larger scale epidemiological health assessment studies, models with a coarser spatial resolution are likely sufficient, especially when study areas include suburban and rural areas. PMID:24823664

  20. Troughs under threshold modeling of minimum flows in perennial streams

    NASA Astrophysics Data System (ADS)

    Önöz, B.; Bayazit, M.

    2002-02-01

    Troughs under threshold analysis has so far found little application in the modeling of minimum streamflows. In this study, all the troughs under a certain threshold level are considered in deriving the probability distribution of annual minima through the total probability theorem. For the occurrence of minima under the threshold, Poissonian, binomial or negative binomial processes are assumed. The magnitude of minima follows the generalized Pareto, exponential or power distribution. It is shown that an asymptotic extreme value distribution for minima or the two-parameter Weibull distribution is obtained for the annual minima, depending on which models are assumed for the occurrence and magnitude of troughs under the threshold. The derived distributions can be used for modeling minimum flows in streams that do not have zero flows. Expressions for the T-year annual minimum flow are obtained. An example illustrates the application of the troughs under threshold model to the minimum flows observed in a stream.
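
    A Monte Carlo check of one of the assumed combinations, Poissonian occurrence of troughs with exponentially distributed deficits, comparing simulated annual minima with the distribution obtained from the total probability theorem (parameter values are illustrative):

      import numpy as np

      rng = np.random.default_rng(4)
      u, lam, theta = 10.0, 3.0, 2.0   # threshold, mean troughs/year, mean deficit
      n_years = 100_000

      # Poissonian occurrence of troughs under the threshold, exponential deficits.
      annual_min = np.full(n_years, u)          # years with no trough stay at u
      n_troughs = rng.poisson(lam, n_years)
      for year in np.nonzero(n_troughs)[0]:
          deficits = rng.exponential(theta, n_troughs[year])
          annual_min[year] = u - deficits.max()

      # Total-probability result for this combination:
      # P(min > x) = exp(-lam * exp(-(u - x)/theta)) for x < u.
      x = np.linspace(u - 12.0, u - 0.5, 6)
      empirical = [(annual_min > xi).mean() for xi in x]
      analytic = np.exp(-lam * np.exp(-(u - x) / theta))
      for xi, e, a in zip(x, empirical, analytic):
          print(f"x={xi:5.1f}  simulated={e:.3f}  derived={a:.3f}")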

  1. Evaluation of a spatially resolved forest fire smoke model for population-based epidemiologic exposure assessment.

    PubMed

    Yao, Jiayun; Eyamie, Jeff; Henderson, Sarah B

    2016-05-01

    Exposure to forest fire smoke (FFS) is associated with multiple adverse health effects, mostly respiratory. Findings for cardiovascular effects have been inconsistent, possibly related to the limitations of conventional methods to assess FFS exposure. In previous work, we developed an empirical model to estimate smoke-related fine particulate matter (PM2.5) for all populated areas in British Columbia (BC), Canada. Here, we evaluate the utility of our model by comparing epidemiologic associations between modeled and measured PM2.5. For each local health area (LHA), we used Poisson regression to estimate the effects of PM2.5 estimates and measurements on counts of medication dispensations and outpatient physician visits. We then used meta-regression to estimate the overall effects. A 10 μg/m(3) increase in modeled PM2.5 was associated with increased salbutamol dispensations (RR=1.04, 95% CI 1.03-1.06), and physician visits for asthma (1.06, 1.04-1.08), COPD (1.02, 1.00-1.03), lower respiratory infections (1.03, 1.00-1.05), and otitis media (1.05, 1.03-1.07), all comparable to measured PM2.5. Effects on cardiovascular outcomes were only significant using model estimates in all LHAs during extreme fire days. This suggests that the exposure model is a promising tool for increasing the power of epidemiologic studies to detect the health effects of FFS via improved spatial coverage and resolution. PMID:25294305
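
    A sketch of the two-stage analysis: area-specific Poisson regressions followed by pooling of the log rate ratios. The data are simulated and a simple inverse-variance fixed-effect pooling is used here, rather than the meta-regression applied in the study:

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm

      rng = np.random.default_rng(5)
      log_rr_per_10, se_list = [], []

      # One Poisson regression of daily dispensation counts on PM2.5 per area.
      for area in range(20):
          pm25 = rng.gamma(2.0, 10.0, 365)                        # daily PM2.5 (ug/m3)
          counts = rng.poisson(np.exp(1.0 + 0.004 * pm25))        # daily dispensations
          X = sm.add_constant(pd.DataFrame({"pm25_per10": pm25 / 10.0}))
          fit = sm.GLM(counts, X, family=sm.families.Poisson()).fit()
          log_rr_per_10.append(fit.params["pm25_per10"])
          se_list.append(fit.bse["pm25_per10"])

      # Inverse-variance pooling of the area-specific log rate ratios.
      log_rr = np.array(log_rr_per_10)
      w = 1.0 / np.array(se_list) ** 2
      pooled = np.sum(w * log_rr) / np.sum(w)
      pooled_se = np.sqrt(1.0 / np.sum(w))
      lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
      print(f"pooled RR per 10 ug/m3: {np.exp(pooled):.3f} "
            f"(95% CI {np.exp(lo):.3f}-{np.exp(hi):.3f})")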

  2. The adverse effect of spasticity on 3-month poststroke outcome using a population-based model.

    PubMed

    Belagaje, S R; Lindsell, C; Moomaw, C J; Alwell, K; Flaherty, M L; Woo, D; Dunning, K; Khatri, P; Adeoye, O; Kleindorfer, D; Broderick, J; Kissela, B

    2014-01-01

    Several devices and medications have been used to address poststroke spasticity. Yet, spasticity's impact on outcomes remains controversial. Using data from a cohort of 460 ischemic stroke patients, we previously published a validated multivariable regression model for predicting 3-month modified Rankin Score (mRS) as an indicator of functional outcome. Here, we tested whether including spasticity improved model fit and estimated the effect spasticity had on the outcome. Spasticity was defined by a positive response to the question "Did you have spasticity following your stroke?" on direct interview at 3 months from stroke onset. Patients who had expired by 90 days (n = 30) or did not have spasticity data available (n = 102) were excluded. Spasticity affected the 3-month functional status (β = 0.420, 95% CI = 0.194 to 0.645) after accounting for age, diabetes, leukoaraiosis, and retrospective NIHSS. Using spasticity as a covariable, the model's R2 changed from 0.599 to 0.622. In our model, the presence of spasticity in the cohort was associated with a worsened 3-month mRS by an average of 0.4 after adjusting for known covariables. This significant adverse effect on functional outcomes adds predictive value beyond previously established factors. PMID:25147752

  3. The Adverse Effect of Spasticity on 3-Month Poststroke Outcome Using a Population-Based Model

    PubMed Central

    Belagaje, S. R.; Lindsell, C.; Moomaw, C. J.; Alwell, K.; Flaherty, M. L.; Woo, D.; Dunning, K.; Khatri, P.; Adeoye, O.; Kleindorfer, D.; Broderick, J.; Kissela, B.

    2014-01-01

    Several devices and medications have been used to address poststroke spasticity. Yet, spasticity's impact on outcomes remains controversial. Using data from a cohort of 460 ischemic stroke patients, we previously published a validated multivariable regression model for predicting 3-month modified Rankin Score (mRS) as an indicator of functional outcome. Here, we tested whether including spasticity improved model fit and estimated the effect spasticity had on the outcome. Spasticity was defined by a positive response to the question “Did you have spasticity following your stroke?” on direct interview at 3 months from stroke onset. Patients who had expired by 90 days (n = 30) or did not have spasticity data available (n = 102) were excluded. Spasticity affected the 3-month functional status (β = 0.420, 95% CI = 0.194 to 0.645) after accounting for age, diabetes, leukoaraiosis, and retrospective NIHSS. Using spasticity as a covariable, the model's R2 changed from 0.599 to 0.622. In our model, the presence of spasticity in the cohort was associated with a worsened 3-month mRS by an average of 0.4 after adjusting for known covariables. This significant adverse effect on functional outcomes adds predictive value beyond previously established factors. PMID:25147752

  4. Validation and extension of the PREMM1,2 model in a population-based cohort of colorectal cancer patients

    PubMed Central

    Balaguer, Francesc; Balmaña, Judith; Castellví-Bel, Sergi; Steyerberg, Ewout W.; Andreu, Montserrat; Llor, Xavier; Jover, Rodrigo; Syngal, Sapna; Castells, Antoni

    2008-01-01

    Summary Background and aims Early recognition of patients at risk for Lynch syndrome is critical but often difficult. Recently, a predictive algorithm, the PREMM1,2 model, has been developed to quantify the risk of carrying a germline mutation in the mismatch repair (MMR) genes, MLH1 and MSH2. However, its performance in an unselected, population-based colorectal cancer population, as well as its performance in combination with tumor MMR testing, is unknown. Methods We included all colorectal cancer cases from the EPICOLON study, a prospective, multicenter, population-based cohort (n=1,222). All patients underwent tumor microsatellite instability analysis and immunostaining for MLH1 and MSH2, and those with MMR deficiency (n=91) underwent tumor BRAF V600E mutation analysis and MLH1/MSH2 germline testing. Results The PREMM1,2 model with a ≥5% cut-off had a sensitivity, specificity and positive predictive value (PPV) of 100%, 68% and 2%, respectively. The use of a higher PREMM1,2 cut-off provided higher specificity and PPV at the expense of lower sensitivity. The combination of a ≥5% cut-off with tumor MMR testing maintained 100% sensitivity with an increased specificity (97%) and PPV (21%). The PPV of a PREMM1,2 score ≥20% alone (16%) approached the PPV obtained with a PREMM1,2 score ≥5% combined with tumor MMR testing. In addition, a PREMM1,2 score of <5% was associated with a high likelihood of a BRAF V600E mutation. Conclusions The PREMM1,2 model is useful to identify MLH1/MSH2 mutation carriers among unselected colorectal cancer patients. Quantitative assessment of the genetic risk may be useful to decide on subsequent tumor MMR and germline testing. PMID:18061181
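
    A small sketch of how sensitivity, specificity and PPV shift with the score cut-off and with the addition of a second tumor-based test, using simulated carrier status and scores (not the EPICOLON data and not the actual PREMM1,2 algorithm):

      import numpy as np

      rng = np.random.default_rng(6)
      n, prevalence = 1222, 0.01                     # cohort size, mutation prevalence
      carrier = rng.random(n) < prevalence
      # Simulated PREMM-like risk scores: carriers score higher on average.
      score = np.where(carrier, rng.beta(5, 10, n), rng.beta(1, 30, n))

      def performance(positive, truth):
          tp = np.sum(positive & truth)
          fp = np.sum(positive & ~truth)
          fn = np.sum(~positive & truth)
          tn = np.sum(~positive & ~truth)
          sens = tp / (tp + fn) if tp + fn else float("nan")
          spec = tn / (tn + fp)
          ppv = tp / (tp + fp) if tp + fp else float("nan")
          return sens, spec, ppv

      for cutoff in (0.05, 0.20):
          sens, spec, ppv = performance(score >= cutoff, carrier)
          print(f"cutoff {cutoff:.2f}: sens={sens:.2f} spec={spec:.2f} PPV={ppv:.2f}")

      # Combining the score cut-off with a second (tumor MMR-like) test raises
      # specificity and PPV while keeping sensitivity, as in the abstract.
      mmr_deficient = carrier | (rng.random(n) < 0.05)   # imperfect second test
      sens, spec, ppv = performance((score >= 0.05) & mmr_deficient, carrier)
      print(f"cutoff 0.05 + second test: sens={sens:.2f} spec={spec:.2f} PPV={ppv:.2f}")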

  5. Mathematical model for adaptive evolution of populations based on a complex domain

    PubMed Central

    Ibrahim, Rabha W.; Ahmad, M.Z.; Al-Janaby, Hiba F.

    2015-01-01

    A mutation is ultimately essential for adaptive evolution in all populations. Mutations arise all the time, but most are repaired by enzymes. Evolution is generally understood to proceed by natural selection acting on variation among organisms that arises from random changes in their DNA, and the evidence for this is overwhelming. An alteration of the structure of a gene, producing a variant form that may be transmitted to succeeding generations, can be caused by the modification of single base units in DNA, or by the deletion, insertion, or rearrangement of larger units of chromosomes or genes; such an alteration is called a mutation. In this paper, a mathematical model of this process is introduced. The model describes the time and space of the evolution, with the space represented by a complex domain. We show that the evolution is distributed according to the hypergeometric function. Boundedness of the evolution is imposed by utilizing the Koebe function. PMID:26858564

  6. Peristomal Skin Complications Are Common, Expensive, and Difficult to Manage: A Population Based Cost Modeling Study

    PubMed Central

    Meisner, Søren; Lehur, Paul-Antoine; Moran, Brendan; Martins, Lina; Jemec, Gregor Borut Ernst

    2012-01-01

    Background Peristomal skin complications (PSCs) are the most common post-operative complications following creation of a stoma. Living with a stoma is a challenge, not only for the patient and their carers, but also for society as a whole. Due to methodological problems of PSC assessment, the associated health-economic burden of medium- to long-term complications has been poorly described. Aim The aim of the present study was to create a model to estimate treatment costs of PSCs using the standardized assessment Ostomy Skin Tool as a reference. The resultant model was applied to a real-life global data set of stoma patients (n = 3017) to determine the prevalence and financial burden of PSCs. Methods Eleven experienced stoma care nurses were interviewed to obtain a global understanding of a treatment algorithm that formed the basis of the cost analysis. The estimated costs were based on a seven-week treatment period. PSC costs were estimated for five underlying diagnostic categories and three levels of severity. The estimated treatment costs of severe cases of PSCs were 2–5 fold higher than those of mild cases across the different diagnostic categories of PSCs. French unit costs were applied to the global data set. Results The estimated total average cost for a seven-week treatment period (including appliances and accessories) was 263€ for those with PSCs (n = 1742) compared to 215€ for those without PSCs (n = 1172). A covariance analysis showed that leakage level had a significant impact on PSC cost, from ‘rarely/never’ to ‘always/often’ (p<0.00001) and from ‘rarely/never’ to ‘sometimes’ (p = 0.0115). Conclusion PSCs are common and troublesome and the consequences are substantial, both for the patient and from a health-economic viewpoint. PSCs should be diagnosed and treated at an early stage to prevent long-term, debilitating and expensive complications. PMID:22679479

  7. Analysis of amyotrophic lateral sclerosis as a multistep process: a population-based modelling study

    PubMed Central

    Al-Chalabi, Ammar; Calvo, Andrea; Chio, Adriano; Colville, Shuna; Ellis, Cathy M; Hardiman, Orla; Heverin, Mark; Howard, Robin S; Huisman, Mark H B; Keren, Noa; Leigh, P Nigel; Mazzini, Letizia; Mora, Gabriele; Orrell, Richard W; Rooney, James; Scott, Kirsten M; Scotton, William J; Seelen, Meinie; Shaw, Christopher E; Sidle, Katie S; Swingler, Robert; Tsuda, Miho; Veldink, Jan H; Visser, Anne E; van den Berg, Leonard H; Pearce, Neil

    2014-01-01

    Summary Background Amyotrophic lateral sclerosis shares characteristics with some cancers, such as onset being more common in later life, progression usually being rapid, the disease affecting a particular cell type, and showing complex inheritance. We used a model originally applied to cancer epidemiology to investigate the hypothesis that amyotrophic lateral sclerosis is a multistep process. Methods We generated incidence data by age and sex from amyotrophic lateral sclerosis population registers in Ireland (registration dates 1995–2012), the Netherlands (2006–12), Italy (1995–2004), Scotland (1989–98), and England (2002–09), and calculated age and sex-adjusted incidences for each register. We regressed the log of age-specific incidence against the log of age with least squares regression. We did the analyses within each register, and also did a combined analysis, adjusting for register. Findings We identified 6274 cases of amyotrophic lateral sclerosis from a catchment population of about 34 million people. We noted a linear relationship between log incidence and log age in all five registers: England r2=0·95, Ireland r2=0·99, Italy r2=0·95, the Netherlands r2=0·99, and Scotland r2=0·97; overall r2=0·99. All five registers gave similar estimates of the linear slope ranging from 4·5 to 5·1, with overlapping confidence intervals. The combination of all five registers gave an overall slope of 4·8 (95% CI 4·5–5·0), with similar estimates for men (4·6, 4·3–4·9) and women (5·0, 4·5–5·5). Interpretation A linear relationship between the log incidence and log age of onset of amyotrophic lateral sclerosis is consistent with a multistage model of disease. The slope estimate suggests that amyotrophic lateral sclerosis is a six-step process. Identification of these steps could lead to preventive and therapeutic avenues. Funding UK Medical Research Council; UK Economic and Social Research Council; Ireland Health Research Board; The
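
    The regression step is simple enough to sketch directly; the age-specific incidence values below are illustrative, chosen only to follow roughly the power law reported, and are not the registry data:

      import numpy as np

      # Illustrative age-specific incidence per 100,000 person-years by age-band midpoint.
      age_mid = np.array([47.5, 52.5, 57.5, 62.5, 67.5, 72.5, 77.5])
      incidence = np.array([1.6, 2.6, 4.0, 6.0, 8.6, 12.2, 16.8])

      # Least-squares regression of log incidence on log age; under a multistep
      # model with k steps the slope is approximately k - 1.
      slope, intercept = np.polyfit(np.log(age_mid), np.log(incidence), 1)
      print(f"slope = {slope:.2f}  ->  about {round(slope + 1)} steps")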

  8. A Results-Based Logic Model for Primary Healthcare: A Conceptual Foundation for Population-Based Information Systems

    PubMed Central

    Watson, Diane E.; Broemeling, Anne-Marie; Wong, Sabrina T.

    2009-01-01

    A conceptual framework for population-based information systems is needed if these data are to be created and used to generate information to support healthcare policy, management and practice communities that seek to improve quality and account for progress in primary healthcare (PHC) renewal. This paper describes work conducted in British Columbia since 2003 to (1) create a Results-Based Logic Model for PHC using the approach of the Treasury Board of Canada in designing management and accountability frameworks, together with a literature review, policy analysis and broad consultation with approximately 650 people, (2) identify priorities for information within that logic model, (3) use the logic model and priorities within it to implement performance measurement and research and (4) identify how information systems need to be structured to assess the impact of variation or change in PHC inputs, activities and outputs on patient, population and healthcare system outcomes. The resulting logic model distinguishes among outcomes for which the PHC sector should be held more or less accountable. PMID:21037902

  9. Effect of model uncertainty on failure detection - The threshold selector

    NASA Technical Reports Server (NTRS)

    Emami-Naeini, Abbas; Akhter, Muhammad M.; Rock, Stephen M.

    1988-01-01

    The performance of all failure detection, isolation, and accommodation (DIA) algorithms is influenced by the presence of model uncertainty. A unique framework is presented to incorporate knowledge of modeling error in the analysis and design of failure detection systems. The tools being used are very similar to those in robust control theory. A concept is introduced called the threshold selector, which is a nonlinear inequality whose solution defines the set of detectable sensor failure signals. The threshold selector represents an innovative tool for analysis and synthesis of DIA algorithms. It identifies the optimal threshold to be used in innovations-based DIA algorithms. The optimal threshold is shown to be a function of the bound on modeling errors, the noise properties, the speed of DIA filters, and the classes of reference and failure signals. The size of the smallest detectable failure is also determined. The results are applied to a multivariable turbofan jet engine example, which demonstrates improvements compared to previous studies.

  10. Uncertainties in the Modelled CO2 Threshold for Antarctic Glaciation

    NASA Technical Reports Server (NTRS)

    Gasson, E.; Lunt, D. J.; DeConto, R.; Goldner, A.; Heinemann, M.; Huber, M.; LeGrande, A. N.; Pollard, D.; Sagoo, N.; Siddall, M.; Winguth, A.; Valdes, P. J.

    2014-01-01

    The frequently cited atmospheric CO2 threshold for the onset of Antarctic glaciation of approximately 780 parts per million by volume is based on the study of DeConto and Pollard (2003) using an ice sheet model and the GENESIS climate model. Proxy records suggest that atmospheric CO2 concentrations passed through this threshold across the Eocene-Oligocene transition approximately 34 million years ago. However, atmospheric CO2 concentrations may have been close to this threshold earlier than this transition, which is used by some to suggest the possibility of Antarctic ice sheets during the Eocene. Here we investigate the climate model dependency of the threshold for Antarctic glaciation by performing offline ice sheet model simulations using the climate from 7 different climate models with Eocene boundary conditions (HadCM3L, CCSM3, CESM1.0, GENESIS, FAMOUS, ECHAM5 and GISS_ER). These climate simulations are sourced from a number of independent studies, and as such the boundary conditions, which are poorly constrained during the Eocene, are not identical between simulations. The results of this study suggest that the atmospheric CO2 threshold for Antarctic glaciation is highly dependent on the climate model used and the climate model configuration. A large discrepancy between the climate model and ice sheet model grids for some simulations leads to a strong sensitivity to the lapse rate parameter.

  11. Octave-Band Thresholds for Modeled Reverberant Fields

    NASA Technical Reports Server (NTRS)

    Begault, Durand R.; Wenzel, Elizabeth M.; Tran, Laura L.; Anderson, Mark R.; Trejo, Leonard J. (Technical Monitor)

    1998-01-01

    Auditory thresholds for 10 subjects were obtained for speech stimuli in reverberation. The reverberation was produced and manipulated by 3-D audio modeling based on an actual room. The independent variables were octave-band filtering (bypassed, 0.25 - 2.0 kHz Fc) and reverberation time (0.2 - 1.1 sec). An ANOVA revealed significant effects (threshold range: -19 to -35 dB re 60 dB SRL).

  12. Setting conservation management thresholds using a novel participatory modeling approach.

    PubMed

    Addison, P F E; de Bie, K; Rumpff, L

    2015-10-01

    We devised a participatory modeling approach for setting management thresholds that show when management intervention is required to address undesirable ecosystem changes. This approach was designed to be used when management thresholds: must be set for environmental indicators in the face of multiple competing objectives; need to incorporate scientific understanding and value judgments; and will be set by participants with limited modeling experience. We applied our approach to a case study where management thresholds were set for a mat-forming brown alga, Hormosira banksii, in a protected area management context. Participants, including management staff and scientists, were involved in a workshop to test the approach, and set management thresholds to address the threat of trampling by visitors to an intertidal rocky reef. The approach involved trading off the environmental objective, to maintain the condition of intertidal reef communities, with social and economic objectives to ensure management intervention was cost-effective. Ecological scenarios, developed using scenario planning, were a key feature that provided the foundation for where to set management thresholds. The scenarios developed represented declines in percent cover of H. banksii that may occur under increased threatening processes. Participants defined 4 discrete management alternatives to address the threat of trampling and estimated the effect of these alternatives on the objectives under each ecological scenario. A weighted additive model was used to aggregate participants' consequence estimates. Model outputs (decision scores) clearly expressed uncertainty, which can be considered by decision makers and used to inform where to set management thresholds. This approach encourages a proactive form of conservation, where management thresholds and associated actions are defined a priori for ecological indicators, rather than reacting to unexpected ecosystem changes in the future. PMID:26040608
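
    A minimal sketch of the weighted additive aggregation step, scoring a few management alternatives against weighted objectives under one ecological scenario (the alternatives, consequence estimates and weights below are invented for illustration):

      import numpy as np

      # Consequence estimates (0-100) for each management alternative against three
      # objectives, under one ecological scenario (illustrative numbers only).
      alternatives = ["no action", "signage", "seasonal closure", "full closure"]
      objectives = ["ecological condition", "visitor experience", "management cost"]
      consequences = np.array([
          [20, 90, 100],   # no action
          [45, 85, 80],    # signage
          [75, 60, 50],    # seasonal closure
          [90, 30, 20],    # full closure
      ], dtype=float)

      weights = np.array([0.5, 0.3, 0.2])      # elicited objective weights, sum to 1

      # Weighted additive model: decision score = sum over objectives of
      # weight * consequence estimate.
      scores = consequences @ weights
      for name, s in sorted(zip(alternatives, scores), key=lambda t: -t[1]):
          print(f"{name:18s} {s:6.1f}")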

  13. Cascades in the Threshold Model for varying system sizes

    NASA Astrophysics Data System (ADS)

    Karampourniotis, Panagiotis; Sreenivasan, Sameet; Szymanski, Boleslaw; Korniss, Gyorgy

    2015-03-01

    A classical model in opinion dynamics is the Threshold Model (TM) aiming to model the spread of a new opinion based on the social drive of peer pressure. Under the TM a node adopts a new opinion only when the fraction of its first neighbors possessing that opinion exceeds a pre-assigned threshold. Cascades in the TM depend on multiple parameters, such as the number and selection strategy of the initially active nodes (initiators), and the threshold distribution of the nodes. For a uniform threshold in the network there is a critical fraction of initiators for which a transition from small to large cascades occurs, which for ER graphs is largely independent of the system size. Here, we study the spread contribution of each newly assigned initiator under the TM for different initiator selection strategies for synthetic graphs of various sizes. We observe that for ER graphs when large cascades occur, the spread contribution of the added initiator on the transition point is independent of the system size, while the contribution of the rest of the initiators converges to zero at infinite system size. This property is used for the identification of large transitions for various threshold distributions. Supported in part by ARL NS-CTA, ARO, ONR, and DARPA.
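
    A minimal sketch of a Threshold Model cascade on an Erdős-Rényi graph with a uniform threshold and randomly selected initiators (toy parameters; this does not reproduce the initiator selection strategies compared in the abstract):

      import networkx as nx
      import numpy as np

      rng = np.random.default_rng(7)
      n, avg_degree, threshold = 10_000, 10.0, 0.3
      g = nx.erdos_renyi_graph(n, avg_degree / (n - 1), seed=7)

      def cascade_size(frac_initiators):
          """Fraction of active nodes once the Threshold Model cascade stops."""
          active = set(rng.choice(n, size=int(frac_initiators * n), replace=False))
          frontier = set(active)
          while frontier:
              # Only neighbors of newly activated nodes can newly cross their threshold.
              candidates = {v for u in frontier for v in g[u]} - active
              frontier = set()
              for v in candidates:
                  neigh = list(g[v])
                  if neigh and sum(u in active for u in neigh) / len(neigh) >= threshold:
                      frontier.add(v)
              active |= frontier
          return len(active) / n

      for p in (0.05, 0.10, 0.15, 0.20, 0.25):
          print(f"initiator fraction {p:.2f} -> cascade size {cascade_size(p):.2f}")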

  14. Modeling threshold detection and search for point and extended sources

    NASA Astrophysics Data System (ADS)

    Friedman, Melvin

    2016-05-01

    This paper deals with three separate topics. 1) The Berek extended object threshold detection model is described, calibrated against a portion of Blackwell's 1946 naked-eye threshold detection data for extended objects against an unstructured background, and then the remainder of Blackwell's data is used to verify and validate the model. A range equation is derived from Berek's model which allows threshold detection range to be predicted for extended to point objects against an uncluttered background as a function of target size and adapting luminance level. The range equation is then used to model threshold detection of stationary reflective and self-luminous targets against an uncluttered background. 2) There is uncertainty whether Travnikova's search data for point source detection against an uncluttered background are described by Rayleigh or exponential distributions. A model which explains the Rayleigh distribution for barely perceptible objects and the exponential distribution for brighter objects is given. 3) A technique is presented which allows a specific observer's target acquisition capability to be characterized. A model is then presented which describes how individual target acquisition probability grows when a specific observer or combination of specific observers search for targets. Applications for the three topics are discussed.

  15. Inflection, canards and excitability threshold in neuronal models.

    PubMed

    Desroches, M; Krupa, M; Rodrigues, S

    2013-10-01

    A technique is presented, based on the differential geometry of planar curves, to evaluate the excitability threshold of neuronal models. The aim is to determine regions of the phase plane where solutions to the model equations have zero local curvature, thereby defining a zero-curvature (inflection) set that discerns between sub-threshold and spiking electrical activity. This transition can arise through a Hopf bifurcation, via the so-called canard explosion that happens in an exponentially small parameter variation, and this is typical for a large class of planar neuronal models (FitzHugh-Nagumo, reduced Hodgkin-Huxley), namely, type II neurons (resonators). This transition can also correspond to the crossing of the stable manifold of a saddle equilibrium, in the case of type I neurons (integrators). We compute inflection sets and study how well they approximate the excitability threshold of these neuron models, that is, both in the canard and in the non-canard regime, using tools from invariant manifold theory and singularity theory. With the latter, we investigate the topological changes that inflection sets undergo upon parameter variation. Finally, we show that the concept of inflection set gives a good approximation of the threshold in both the so-called resonator and integrator neuronal cases. PMID:22945512

  16. Modeling the Interactions Between Multiple Crack Closure Mechanisms at Threshold

    NASA Technical Reports Server (NTRS)

    Newman, John A.; Riddell, William T.; Piascik, Robert S.

    2003-01-01

    A fatigue crack closure model is developed that includes interactions between the three closure mechanisms most likely to occur at threshold: plasticity, roughness, and oxide. This model, herein referred to as the CROP model (for Closure, Roughness, Oxide, and Plasticity), also includes the effects of out-of-plane cracking and multi-axial loading. These features make the CROP closure model uniquely suited for, but not limited to, threshold applications. Rough cracks are idealized here as two-dimensional sawtooths, whose geometry induces mixed-mode crack-tip stresses. Continuum mechanics and crack-tip dislocation concepts are combined to relate crack-face displacements to crack-tip loads. Geometric criteria are used to determine closure loads from crack-face displacements. Finite element results, used to verify model predictions, provide critical information about the locations where crack closure occurs.

  17. Effect of threshold disorder on the quorum percolation model.

    PubMed

    Monceau, Pascal; Renault, Renaud; Métens, Stéphane; Bottani, Samuel

    2016-07-01

    We study the modifications induced in the behavior of the quorum percolation model on neural networks with Gaussian in-degree by taking into account an uncorrelated Gaussian variability of the thresholds. We derive a mean-field approach and show its relevance by carrying out explicit Monte Carlo simulations. It turns out that such a disorder shifts the position of the percolation transition, impacts the size of the giant cluster, and can even destroy the transition. Moreover, we highlight the occurrence of disorder-independent fixed points above the quorum critical value. The mean-field approach enables us to interpret these effects in terms of activation probability. A finite-size analysis enables us to show that the order parameter is weakly self-averaging, with an exponent independent of the threshold disorder. Lastly, we show that the effects of the threshold and connectivity disorders cannot be easily discriminated from the measured averaged physical quantities. PMID:27575157

  18. Effect of threshold disorder on the quorum percolation model

    NASA Astrophysics Data System (ADS)

    Monceau, Pascal; Renault, Renaud; Métens, Stéphane; Bottani, Samuel

    2016-07-01

    We study the modifications induced in the behavior of the quorum percolation model on neural networks with Gaussian in-degree by taking into account an uncorrelated Gaussian variability of the thresholds. We derive a mean-field approach and show its relevance by carrying out explicit Monte Carlo simulations. It turns out that such a disorder shifts the position of the percolation transition, impacts the size of the giant cluster, and can even destroy the transition. Moreover, we highlight the occurrence of disorder-independent fixed points above the quorum critical value. The mean-field approach enables us to interpret these effects in terms of activation probability. A finite-size analysis enables us to show that the order parameter is weakly self-averaging, with an exponent independent of the threshold disorder. Lastly, we show that the effects of the threshold and connectivity disorders cannot be easily discriminated from the measured averaged physical quantities.

  19. Oscillation threshold of a clarinet model: a numerical continuation approach.

    PubMed

    Karkar, Sami; Vergez, Christophe; Cochelin, Bruno

    2012-01-01

    This paper focuses on the oscillation threshold of single reed instruments. Several characteristics such as blowing pressure at threshold, regime selection, and playing frequency are known to change radically when taking into account the reed dynamics and the flow induced by the reed motion. Previous works have shown interesting tendencies, using analytical expressions with simplified models. In the present study, a more elaborate physical model is considered. The influence of several parameters, depending on the reed properties, the design of the instrument, or the control operated by the player, is studied. Previous results on the influence of the reed resonance frequency are confirmed. New results concerning the simultaneous influence of two model parameters on oscillation threshold, regime selection, and playing frequency are presented and discussed. The authors use a numerical continuation approach. Numerical continuation consists in following a given solution of a set of equations as a parameter varies. Considering the instrument as a dynamical system, the oscillation threshold problem is formulated as a path following of Hopf bifurcations, generalizing the usual approach of the characteristic equation, as used in previous works. The proposed numerical approach proves to be useful for the study of musical instruments. It is complementary to analytical analysis and direct time-domain or frequency-domain simulations, since it allows information to be derived that is hard to obtain through simulation, without the approximations needed for an analytical approach. PMID:22280691

  20. Oscillation threshold of a clarinet model: A numerical continuation approach

    NASA Astrophysics Data System (ADS)

    Karkar, Sami; Vergez, Christophe; Cochelin, Bruno

    This paper focuses on the oscillation threshold of single reed instruments. Several characteristics such as blowing pressure at threshold, regime selection, and playing frequency are known to change radically when taking into account the reed dynamics and the flow induced by the reed motion. Previous works have shown interesting tendencies, using analytical expressions with simplified models. In the present study, a more elaborate physical model is considered. The influence of several parameters, depending on the reed properties, the design of the instrument, or the control operated by the player, is studied. Previous results on the influence of the reed resonance frequency are confirmed. New results concerning the simultaneous influence of two model parameters on oscillation threshold, regime selection, and playing frequency are presented and discussed. The authors use a numerical continuation approach. Numerical continuation consists in following a given solution of a set of equations as a parameter varies. Considering the instrument as a dynamical system, the oscillation threshold problem is formulated as a path following of Hopf bifurcations, generalizing the usual approach of the characteristic equation, as used in previous works. The proposed numerical approach proves to be useful for the study of musical instruments. It is complementary to analytical analysis and direct time-domain or frequency-domain simulations, since it allows information to be derived that is hard to obtain through simulation, without the approximations needed for an analytical approach.

  1. Predictors of the nicotine reinforcement threshold, compensation, and elasticity of demand in a rodent model of nicotine reduction policy*

    PubMed Central

    Grebenstein, Patricia E.; Burroughs, Danielle; Roiko, Samuel A.; Pentel, Paul R.; LeSage, Mark G.

    2015-01-01

    Background The FDA is considering reducing the nicotine content in tobacco products as a population-based strategy to reduce tobacco addiction. Research is needed to determine the threshold level of nicotine needed to maintain smoking and the extent of compensatory smoking that could occur during nicotine reduction. Sources of variability in these measures across sub-populations also need to be identified so that policies can take into account the risks and benefits of nicotine reduction in vulnerable populations. Methods The present study examined these issues in a rodent nicotine self-administration model of nicotine reduction policy to characterize individual differences in nicotine reinforcement thresholds, degree of compensation, and elasticity of demand during progressive reduction of the unit nicotine dose. The ability of individual differences in baseline nicotine intake and nicotine pharmacokinetics to predict responses to dose reduction was also examined. Results Considerable variability in the reinforcement threshold, compensation, and elasticity of demand was evident. High baseline nicotine intake was not correlated with the reinforcement threshold, but predicted less compensation and less elastic demand. Higher nicotine clearance predicted low reinforcement thresholds, greater compensation, and less elastic demand. Less elastic demand also predicted lower reinforcement thresholds. Conclusions These findings suggest that baseline nicotine intake, nicotine clearance, and the essential value of nicotine (i.e. elasticity of demand) moderate the effects of progressive nicotine reduction in rats and warrant further study in humans. They also suggest that smokers with fast nicotine metabolism may be more vulnerable to the risks of nicotine reduction. PMID:25891231
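    As a sketch of how elasticity of demand is commonly quantified in dose-reduction studies, the snippet below fits one widely used exponential demand form (often attributed to Hursh and Silberberg) to hypothetical consumption data. The data points, the fixed range constant k, and the starting values are illustrative assumptions; the study's own demand analysis is not reproduced here.

      import numpy as np
      from scipy.optimize import curve_fit

      # Hypothetical unit price (responses per unit dose) and consumption data.
      price = np.array([1, 2, 4, 8, 16, 32, 64], dtype=float)
      consumption = np.array([95, 90, 80, 62, 40, 18, 6], dtype=float)

      k = 4.0  # range constant, commonly held fixed across subjects (assumed here)

      def exponential_demand(C, logQ0, alpha):
          # log10 Q = log10 Q0 + k * (exp(-alpha * Q0 * C) - 1)
          Q0 = 10.0 ** logQ0
          return logQ0 + k * (np.exp(-alpha * Q0 * C) - 1.0)

      (logQ0_hat, alpha_hat), _ = curve_fit(exponential_demand, price,
                                            np.log10(consumption),
                                            p0=(2.0, 1e-4),
                                            bounds=([0.0, 1e-7], [4.0, 1.0]))
      print("Q0 (demand intensity)       :", 10.0 ** logQ0_hat)
      print("alpha (elasticity of demand):", alpha_hat)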

  2. Discrete threshold versus continuous strength models of perceptual recognition.

    PubMed

    Paap, K R; Chun, E; Vonnahme, P

    1999-12-01

    Two experiments were designed to test discrete-threshold models of letter and word recognition against models that assume that decision criteria are applied to measures of continuous strength. Although our goal is to adjudicate this matter with respect to broad classes of models, some of the specific predictions for discrete-threshold models are generated from Grainger and Jacobs' (1994) Dual-Readout Model (DROM) and some of the predictions for continuous-strength models are generated from a revised version of the Activation-Verification Model (Paap, Newsome, McDonald, & Schvaneveldt, 1982). Experiment 1 uses a two-alternative forced-choice task that is followed by an assessment of confidence and then a whole report if a word is recognized. Factors are manipulated to assess the presence or magnitude of a neighbourhood-frequency effect, a lexical-bias effect, a word-superiority effect, and a pseudoword advantage. Several discrepancies between DROM's predictions and the obtained data are noted. Both types of models were also used to predict the distribution of responses across the levels of confidence for each individual participant. The predictions based on continuous strength were superior. Experiment 2 used a same-different task and confidence ratings to enable the generation of receiver operating characteristics (ROCs). The shapes of the ROCs are more consistent with the continuous strength assumption than with a discrete threshold. PMID:10646200
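    The contrast at issue can be made concrete with predicted ROC shapes: a continuous-strength (signal detection) account predicts curvilinear ROCs, whereas a simple discrete high-threshold account predicts linear ones. The sketch below computes both; the d-prime and detection-probability values are arbitrary, and the discrete model shown is the generic high-threshold case rather than DROM itself.

      import numpy as np
      from scipy.stats import norm

      d_prime = 1.5      # assumed sensitivity for the continuous-strength account
      p_detect = 0.55    # assumed detect-state probability for the discrete account

      criteria = np.linspace(-2, 2, 9)
      false_alarms = norm.sf(criteria)                  # P(noise exceeds criterion)
      hits_continuous = norm.sf(criteria - d_prime)     # curvilinear ROC

      # High-threshold model: hits = detections plus guesses on non-detect trials.
      hits_discrete = p_detect + (1.0 - p_detect) * false_alarms  # linear ROC

      for fa, hc, hd in zip(false_alarms, hits_continuous, hits_discrete):
          print(f"FA={fa:.2f}  hit(continuous)={hc:.2f}  hit(discrete)={hd:.2f}")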

  3. A phenomenological model of myelinated nerve with a dynamic threshold.

    PubMed

    Morse, R P; Allingham, D; Stocks, N G

    2015-10-01

    To evaluate coding strategies for cochlear implants a model of the human cochlear nerve is required. Nerve models based on voltage-clamp experiments, such as the Frankenhaeuser-Huxley model of myelinated nerve, can have over forty parameters and are not amenable to fitting to physiological data from a different animal or type of nerve. Phenomenological nerve models, such as leaky integrate-and-fire (LIF) models, have fewer parameters but have not been validated with a wide range of stimuli. In the absence of substantial cochlear nerve data, we have used data from a toad sciatic nerve for validation (50 Hz to 2 kHz with levels up to 20 dB above threshold). We show that the standard LIF model with fixed refractory properties and a single set of parameters cannot adequately predict the toad rate-level functions. Given the deficiency of this standard model, we have abstracted the dynamics of the sodium inactivation variable in the Frankenhaeuser-Huxley model to develop a phenomenological LIF model with a dynamic threshold. This nine-parameter model predicts the physiological rate-level functions much more accurately than the standard LIF model. Because of the low number of parameters, we expect to be able to optimize the model parameters so that the model is more appropriate for cochlear implant simulations. PMID:26141642
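    A minimal sketch of a leaky integrate-and-fire neuron with a dynamic (spike-triggered, decaying) threshold is shown below; the parameter values and the sinusoidal drive are illustrative placeholders rather than the nine fitted parameters of the paper's model.

      import numpy as np

      # Illustrative parameters, not the fitted nine-parameter set from the paper.
      dt, T = 1e-5, 0.05                  # time step and duration (s)
      tau_m, tau_th = 2e-3, 5e-3          # membrane and threshold time constants (s)
      v_rest, th_rest = 0.0, 1.0          # resting potential and resting threshold
      th_jump, refractory = 0.5, 1e-3     # threshold jump per spike, refractory (s)

      def run(stim):
          """Euler-integrate the LIF model with a dynamic threshold; return spike times."""
          v, th, spikes, t_last = v_rest, th_rest, [], -np.inf
          for i, s in enumerate(stim):
              t = i * dt
              v += dt * (-(v - v_rest) / tau_m + s)      # leaky integration of the drive
              th += dt * (-(th - th_rest) / tau_th)      # threshold relaxes back to rest
              if v >= th and (t - t_last) > refractory:
                  spikes.append(t)
                  v, th, t_last = v_rest, th + th_jump, t  # reset v, raise the threshold
          return spikes

      t = np.arange(0.0, T, dt)
      stim = 800.0 * (1.0 + np.sin(2 * np.pi * 200.0 * t))  # 200 Hz sinusoidal drive
      print("spikes in", T, "s:", len(run(stim)))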

  4. Semiautomatic bladder segmentation on CBCT using a population-based model for multiple-plan ART of bladder cancer

    NASA Astrophysics Data System (ADS)

    Chai, Xiangfei; van Herk, Marcel; Betgen, Anja; Hulshof, Maarten; Bel, Arjan

    2012-12-01

    The aim of this study is to develop a novel semiautomatic bladder segmentation approach for selecting the appropriate plan from the library of plans for a multiple-plan adaptive radiotherapy (ART) procedure. A population-based statistical bladder model was first built from a training data set (95 bladder contours from 8 patients). This model was then used as a constraint to segment the bladder in an independent validation data set (233 CBCT scans from the remaining 22 patients). All 3D bladder contours were converted into parametric surface representations using spherical harmonic expansion. Principal component analysis (PCA) was applied in the spherical harmonic-based shape parameter space to calculate the major variation of bladder shapes. The number of dominating PCA modes was chosen such that 95% of the total shape variation of the training data set was described. The automatic segmentation started from the bladder contour of the planning CT of each patient, which was modified by changing the weight of each PCA mode. As a result, the segmentation contour was deformed consistently with the training set to best fit the bladder boundary in the localization CBCT image. A cost function was defined to measure the goodness of fit of the segmentation on the localization CBCT image. The segmentation was obtained by minimizing this cost function using a simplex optimizer. After automatic segmentation, a fast manual correction method was provided to correct those bladders (parts) that were poorly segmented. Volume- and distance-based metrics and the accuracy of plan selection from multiple plans were evaluated to quantify the performance of the automatic and semiautomatic segmentation methods. For the training data set, only seven PCA modes were needed to represent 95% of the bladder shape variation. The mean CI overlap and residual error (SD) of automatic bladder segmentation over all of the validation data were 70.5% and 0.39 cm, respectively. The agreement of plan
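    The core of such a statistical shape model — PCA on vectors of spherical-harmonic shape coefficients, retention of the modes explaining 95% of the variance, and reconstruction of candidate shapes from mode weights — can be sketched as follows. Synthetic coefficient vectors stand in for the real training contours, so the numbers are illustrative only.

      import numpy as np

      rng = np.random.default_rng(1)
      n_shapes, n_coeffs = 95, 150   # e.g. 95 training contours, 150 SH coefficients
      # Synthetic stand-in for the spherical-harmonic coefficient vectors:
      X = rng.normal(size=(n_shapes, n_coeffs)) * np.linspace(3.0, 0.1, n_coeffs)

      mean = X.mean(axis=0)
      U, s, Vt = np.linalg.svd(X - mean, full_matrices=False)
      var = s**2 / (n_shapes - 1)
      explained = np.cumsum(var) / var.sum()
      n_modes = int(np.searchsorted(explained, 0.95)) + 1
      print("PCA modes needed for 95% of the shape variation:", n_modes)

      def reconstruct(weights):
          """Shape coefficients generated from weights on the dominant PCA modes."""
          return mean + weights @ Vt[:n_modes]

      # A candidate segmentation shape, sampled within the training variability:
      candidate = reconstruct(rng.normal(scale=np.sqrt(var[:n_modes])))
      print("candidate coefficient vector length:", candidate.shape[0])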

  5. Compact modeling of perpendicular nanomagnetic logic based on threshold gates

    NASA Astrophysics Data System (ADS)

    Breitkreutz, Stephan; Eichwald, Irina; Kiermaier, Josef; Csaba, Gyorgy; Schmitt-Landsiedel, Doris; Becherer, Markus

    2014-05-01

    In this work, we show that physical-based compact modeling of perpendicular Nanomagnetic Logic is crucial for the design and simulation of complex circuitry. A compact model for field-coupled nanomagnets based on an Arrhenius switching model and finite element calculations is introduced. As physical parameters have an enormous influence on the behavior of the circuit, their modeling is of great importance. As an example, a 1-bit full adder based on threshold logic gates is analyzed with respect to its reliability. The obtained findings are used to design a pure magnetic arithmetic logic unit, which can be used for basic Boolean and logic operations.
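    An Arrhenius switching description of a single field-coupled nanomagnet can be sketched as below; the attempt frequency, barrier height, and Sharrock-type field dependence are generic assumptions for illustration, not the calibrated values of the compact model in the paper.

      import numpy as np

      kB = 1.380649e-23           # Boltzmann constant (J/K)
      f0 = 1e9                    # attempt frequency (Hz), typical assumed value
      T = 300.0                   # temperature (K)
      E0 = 60 * kB * T            # zero-field energy barrier, assumed ~60 kT
      Hk = 1.0                    # switching (anisotropy) field, arbitrary units

      def switching_probability(H, t_pulse):
          """Probability that a magnet switches during a clocking pulse of length t_pulse."""
          barrier = E0 * max(0.0, 1.0 - H / Hk) ** 2      # Sharrock-type field dependence
          rate = f0 * np.exp(-barrier / (kB * T))         # Arrhenius switching rate
          return 1.0 - np.exp(-rate * t_pulse)

      for H in (0.6, 0.8, 0.9, 0.95):
          print(f"H/Hk = {H:.2f} -> P(switch in 1 us) = {switching_probability(H, 1e-6):.3f}")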

  6. The interplay between cooperativity and diversity in model threshold ensembles.

    PubMed

    Cervera, Javier; Manzanares, José A; Mafe, Salvador

    2014-10-01

    The interplay between cooperativity and diversity is crucial for biological ensembles because single molecule experiments show a significant degree of heterogeneity and also for artificial nanostructures because of the high individual variability characteristic of nanoscale units. We study the cross-effects between cooperativity and diversity in model threshold ensembles composed of individually different units that show a cooperative behaviour. The units are modelled as statistical distributions of parameters (the individual threshold potentials here) characterized by central and width distribution values. The simulations show that the interplay between cooperativity and diversity results in ensemble-averaged responses of interest for the understanding of electrical transduction in cell membranes, the experimental characterization of heterogeneous groups of biomolecules and the development of biologically inspired engineering designs with individually different building blocks. PMID:25142516
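    The ensemble-averaging idea can be illustrated in a few lines of Python: each unit responds with a Boltzmann-type activation around its own threshold potential, thresholds are drawn from a normal distribution, and the ensemble response is the average over units. All parameter values below are illustrative, not taken from the paper.

      import numpy as np

      def unit_response(V, V0, z=4.0, kT=25.7):
          """Boltzmann-type activation of one threshold unit; z sets the steepness
          (a stand-in for cooperativity), kT is in mV at room temperature."""
          return 1.0 / (1.0 + np.exp(-z * (V - V0) / kT))

      def ensemble_response(V, center=0.0, width=10.0, n_units=10000, z=4.0, seed=0):
          rng = np.random.default_rng(seed)
          V0 = rng.normal(center, width, n_units)   # diversity of threshold potentials
          return unit_response(V[:, None], V0[None, :], z=z).mean(axis=1)

      V = np.linspace(-60.0, 60.0, 7)
      print("narrow thresholds :", np.round(ensemble_response(V, width=2.0), 2))
      print("diverse thresholds:", np.round(ensemble_response(V, width=25.0), 2))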

  7. The interplay between cooperativity and diversity in model threshold ensembles

    PubMed Central

    Cervera, Javier; Manzanares, José A.; Mafe, Salvador

    2014-01-01

    The interplay between cooperativity and diversity is crucial for biological ensembles because single molecule experiments show a significant degree of heterogeneity and also for artificial nanostructures because of the high individual variability characteristic of nanoscale units. We study the cross-effects between cooperativity and diversity in model threshold ensembles composed of individually different units that show a cooperative behaviour. The units are modelled as statistical distributions of parameters (the individual threshold potentials here) characterized by central and width distribution values. The simulations show that the interplay between cooperativity and diversity results in ensemble-averaged responses of interest for the understanding of electrical transduction in cell membranes, the experimental characterization of heterogeneous groups of biomolecules and the development of biologically inspired engineering designs with individually different building blocks. PMID:25142516

  8. Selection Strategies for Social Influence in the Threshold Model

    NASA Astrophysics Data System (ADS)

    Karampourniotis, Panagiotis; Szymanski, Boleslaw; Korniss, Gyorgy

    The ubiquity of online social networks makes the study of social influence extremely significant for its applications to marketing, politics and security. Maximizing the spread of influence by strategically selecting nodes as initiators of a new opinion or trend is a challenging problem. We study the performance of various strategies for selection of large fractions of initiators on a classical social influence model, the Threshold model (TM). Under the TM, a node adopts a new opinion only when the fraction of its first neighbors possessing that opinion exceeds a pre-assigned threshold. The strategies we study are of two kinds: strategies based solely on the initial network structure (Degree-rank, Dominating Sets, PageRank etc.) and strategies that take into account the change of the states of the nodes during the evolution of the cascade, e.g. the greedy algorithm. We find that the performance of these strategies depends largely on both the network structure properties, e.g. the assortativity, and the distribution of the thresholds assigned to the nodes. We conclude that the optimal strategy needs to combine the network specifics and the model specific parameters to identify the most influential spreaders. Supported in part by ARL NS-CTA, ARO, and ONR.
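    A small simulation of the Threshold Model with two initiator-selection strategies (degree rank versus random choice) is sketched below; the network, the uniform threshold value, and the initiator fraction are arbitrary choices for illustration and do not reproduce the study's settings.

      import random
      import networkx as nx

      random.seed(0)
      G = nx.barabasi_albert_graph(2000, 3, seed=0)   # assumed test network
      phi = {v: 0.4 for v in G}                       # uniform adoption threshold (assumed)

      def cascade(G, seeds):
          """Deterministic threshold-model cascade started from a set of initiators."""
          active, changed = set(seeds), True
          while changed:
              changed = False
              for v in G:
                  if v in active:
                      continue
                  nbrs = list(G.neighbors(v))
                  if nbrs and sum(u in active for u in nbrs) / len(nbrs) >= phi[v]:
                      active.add(v)
                      changed = True
          return len(active) / G.number_of_nodes()

      k = 100                                         # number of initiators
      by_degree = [v for v, _ in sorted(G.degree, key=lambda x: -x[1])[:k]]
      by_random = random.sample(list(G), k)
      print("degree-rank initiators:", cascade(G, by_degree))
      print("random initiators     :", cascade(G, by_random))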

  9. Population-based local search for protein folding simulation in the MJ energy model and cubic lattices.

    PubMed

    Kapsokalivas, L; Gan, X; Albrecht, A A; Steinhöfel, K

    2009-08-01

    We present experimental results on benchmark problems in 3D cubic lattice structures with the Miyazawa-Jernigan energy function for two local search procedures that utilise the pull-move set: (i) population-based local search (PLS) that traverses the energy landscape with greedy steps towards (potential) local minima followed by upward steps up to a certain level of the objective function; (ii) simulated annealing with a logarithmic cooling schedule (LSA). The parameter settings for PLS are derived from short LSA-runs executed in pre-processing and the procedure utilises tabu lists generated for each member of the population. In terms of the total number of energy function evaluations both methods perform equally well; however, PLS has the potential of being parallelised with an expected speed-up in the region of the population size. Furthermore, both methods require a significantly smaller number of function evaluations when compared to Monte Carlo simulations with kink-jump moves. PMID:19647489

  10. A damage model based on failure threshold weakening

    NASA Astrophysics Data System (ADS)

    Gran, Joseph D.; Rundle, John B.; Turcotte, Donald L.; Holliday, James R.; Klein, William

    2011-04-01

    A variety of studies have modeled the physics of material deformation and damage as examples of generalized phase transitions, involving either critical phenomena or spinodal nucleation. Here we study a model for frictional sliding with long-range interactions and recurrent damage that is parameterized by a process of damage and partial healing during sliding. We introduce a failure threshold weakening parameter into the cellular automaton slider-block model which allows blocks to fail at a reduced failure threshold for all subsequent failures during an event. We show that a critical point is reached beyond which the probability of a system-wide event scales with this weakening parameter. We provide a mapping to the percolation transition, and show that the values of the scaling exponents approach the values for mean-field percolation (spinodal nucleation) as lattice size L is increased for fixed R. We also examine the effect of the weakening parameter on the frequency-magnitude scaling relationship and the ergodic behavior of the model.
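    The mechanism can be sketched with a mean-field (equal, long-range load sharing) automaton in which a block that has already failed during an event fails again at a reduced threshold. The dissipation factor, full healing between events, and all parameter values are simplifying assumptions of this sketch rather than features of the published model.

      import numpy as np

      rng = np.random.default_rng(2)
      N = 1000                # number of blocks, long-range (mean-field) load sharing
      beta = 0.1              # failure-threshold weakening parameter (assumed value)
      alpha = 0.8             # fraction of released stress redistributed (dissipative)
      sigma = rng.random(N)   # initial stresses
      thresh = np.ones(N)     # nominal failure thresholds

      def one_event(sigma, thresh):
          """Drive to the next failure, then redistribute stress until the event stops."""
          sigma += (thresh - sigma).min()          # uniform drive to the weakest block
          weakened = np.zeros(N, dtype=bool)       # blocks that already failed this event
          size = 0
          while True:
              limit = np.where(weakened, (1.0 - beta) * thresh, thresh)
              failing = np.flatnonzero(sigma >= limit)
              if failing.size == 0:
                  return size
              size += failing.size
              released = sigma[failing].sum()
              sigma[failing] = 0.0                 # failed blocks drop their stress
              weakened[failing] = True             # and fail more easily in this event
              sigma += alpha * released / N        # equal long-range redistribution

      sizes = [one_event(sigma, thresh) for _ in range(1000)]
      print("mean event size:", np.mean(sizes), " largest event:", max(sizes))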

  11. Threshold dynamics of a malaria transmission model in periodic environment

    NASA Astrophysics Data System (ADS)

    Wang, Lei; Teng, Zhidong; Zhang, Tailei

    2013-05-01

    In this paper, we propose a malaria transmission model with periodic environment. The basic reproduction number R0 is computed for the model and it is shown that the disease-free periodic solution of the model is globally asymptotically stable when R0<1, that is, the disease goes extinct when R0<1, while the disease is uniformly persistent and there is at least one positive periodic solution when R0>1. It indicates that R0 is the threshold value determining the extinction and the uniform persistence of the disease. Finally, some examples are given to illustrate the main theoretical results. The numerical simulations show that, when the disease is uniformly persistent, different dynamic behaviors may be found in this model, such as the global attractivity and the chaotic attractor.

  12. Model to Estimate Threshold Mechanical Stability of Lower Lateral Cartilage

    PubMed Central

    Kim, James Hakjune; Hamamoto, Ashley; Kiyohara, Nicole; Wong, Brian J. F.

    2015-01-01

    IMPORTANCE In rhinoplasty, techniques used to alter the shape of the nasal tip often compromise the structural stability of the cartilage framework in the nose. Determining the minimum threshold level of cartilage stiffness required to maintain long-term structural stability is a critical aspect in performing these surgical maneuvers. OBJECTIVE To quantify the minimum threshold mechanical stability (elastic modulus) of lower lateral cartilage (LLC) according to expert opinion. METHODS Five anatomically correct LLC phantoms were made from urethane via a 3-dimensional computer modeling and injection molding process. All 5 had identical geometry but varied in stiffness along the intermediate crural region (0.63–30.6 MPa). DESIGN, SETTING, AND PARTICIPANTS A focus group of experienced rhinoplasty surgeons (n = 33) was surveyed at a regional professional meeting on October 25, 2013. Each survey participant was presented the 5 phantoms in a random order and asked to arrange the phantoms in order of increasing stiffness based on their sense of touch. Then, they were asked to select a single phantom out of the set that they believed to have the minimum acceptable mechanical stability for LLC to maintain proper form and function. MAIN OUTCOMES AND MEASURES A binary logistic regression was performed to calculate the probability of mechanical acceptability as a function of the elastic modulus of the LLC based on survey data. A Hosmer-Lemeshow test was performed to measure the goodness of fit between the logistic regression and survey data. The minimum threshold mechanical stability for LLC was taken at a 50% acceptability rating. RESULTS Phantom 4 was selected most frequently by the participants as having the minimum acceptable stiffness for the LLC intermediate crural region. The minimum threshold mechanical stability for LLC was determined to be 3.65 MPa. The Hosmer-Lemeshow test revealed good fit between the logistic regression and survey data (χ²₃ = 0.92, P = .82). CONCLUSIONS AND
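    The analysis step — a binary logistic regression of acceptability on elastic modulus, with the threshold read off at 50% probability — can be sketched as follows. Only the 0.63 and 30.6 MPa endpoints come from the abstract; the intermediate stiffness values and all vote counts are hypothetical.

      import numpy as np
      from scipy.optimize import minimize
      from scipy.special import expit

      stiffness = np.array([0.63, 1.9, 3.2, 9.8, 30.6])   # MPa (middle values assumed)
      n_votes   = np.array([33, 33, 33, 33, 33])
      n_accept  = np.array([1, 6, 14, 30, 33])            # hypothetical acceptability counts

      def neg_log_lik(params):
          b0, b1 = params
          p = np.clip(expit(b0 + b1 * stiffness), 1e-12, 1.0 - 1e-12)
          return -np.sum(n_accept * np.log(p) + (n_votes - n_accept) * np.log(1.0 - p))

      b0, b1 = minimize(neg_log_lik, x0=(0.0, 0.1)).x
      # The 50% acceptability point of a logistic curve is where b0 + b1*x = 0.
      print("50% acceptability threshold ≈", -b0 / b1, "MPa")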

  13. Terrestrial Microgravity Model and Threshold Gravity Simulation using Magnetic Levitation

    NASA Technical Reports Server (NTRS)

    Ramachandran, N.

    2005-01-01

    What is the threshold gravity (minimum gravity level) required for the nominal functioning of the human system? What dosage is required? Do human cell lines behave differently in microgravity in response to an external stimulus? The critical need for such a gravity simulator is emphasized by recent experiments on human epithelial cells and lymphocytes on the Space Shuttle clearly showing that cell growth and function are markedly different from those observed terrestrially. Those differences are also dramatic between cells grown in space and those in Rotating Wall Vessels (RWV), or NASA bioreactor often used to simulate microgravity, indicating that although morphological growth patterns (three dimensional growth) can be successfully simulated using RWVs, cell function performance is not reproduced - a critical difference. If cell function is dramatically affected by gravity off-loading, then cell response to stimuli such as radiation, stress, etc. can be very different from terrestrial cell lines. Yet, we have no good gravity simulator for use in study of these phenomena. This represents a profound shortcoming for countermeasures research. We postulate that we can use magnetic levitation of cells and tissue, through the use of strong magnetic fields and field gradients, as a terrestrial microgravity model to study human cells. Specific objectives of the research are: 1. To develop a tried, tested and benchmarked terrestrial microgravity model for cell culture studies; 2. Gravity threshold determination; 3. Dosage (magnitude and duration) of g-level required for nominal functioning of cells; 4. Comparisons of magnetic levitation model to other models such as RWV, hind limb suspension, etc. and 5. Cellular response to reduced gravity levels of Moon and Mars. The paper will discuss experiments and modeling work to date in support of this project.

  14. A model based rule for selecting spiking thresholds in neuron models.

    PubMed

    Mikkelsen, Frederik Riis

    2016-06-01

    Determining excitability thresholds in neuronal models is of high interest due to its applicability in separating spiking from non-spiking phases of neuronal membrane potential processes. However, excitability thresholds are known to depend on various auxiliary variables, including any conductance or gating variables. Such dependences are a double-edged sword: they are natural consequences of the complexity of the model, but they prove difficult to use in practice, since gating variables are rarely measured. In this paper a technique for finding excitability thresholds, based on the local behaviour of the flow in dynamical systems, is presented. The technique incorporates the dynamics of the auxiliary variables, yet only produces thresholds for the membrane potential. The method is applied to several classical neuron models and the threshold's dependence upon external parameters is studied, along with a general evaluation of the technique. PMID:27106187
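    For orientation, the sketch below finds an excitability threshold for a simple neuron model (FitzHugh-Nagumo) by brute-force bisection on a stimulus pulse amplitude. This is a simpler alternative to the flow-based technique of the paper, shown only to make the notion of a membrane-potential threshold concrete; all parameter values are standard textbook choices, not the paper's.

      def peak_voltage(I_amp, T=200.0, dt=0.01, a=0.7, b=0.8, tau=12.5):
          """Integrate the FitzHugh-Nagumo model with a brief current pulse; return max v."""
          v, w = -1.2, -0.625                            # approximate resting state
          peak = v
          for i in range(int(T / dt)):
              I = I_amp if i * dt < 5.0 else 0.0         # short stimulus pulse
              dv = v - v**3 / 3.0 - w + I
              dw = (v + a - b * w) / tau
              v, w = v + dt * dv, w + dt * dw
              peak = max(peak, v)
          return peak

      lo, hi = 0.0, 2.0                 # bracket for the pulse amplitude
      for _ in range(30):               # bisection: call it a spike if peak v exceeds ~1
          mid = 0.5 * (lo + hi)
          lo, hi = (mid, hi) if peak_voltage(mid) < 1.0 else (lo, mid)
      print("threshold pulse amplitude ≈", 0.5 * (lo + hi))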

  15. A unified statistical model for hydrological variables including the selection of threshold for the peak over threshold method

    NASA Astrophysics Data System (ADS)

    Solari, S.; Losada, M. A.

    2012-10-01

    This paper explores the use of a mixture model for determining the marginal distribution of hydrological variables, consisting of a truncated central distribution that is representative of the central or main-mass regime, which for the cases studied is a lognormal distribution, and of two generalized Pareto distributions for the maximum and minimum regimes, representing the upper and lower tails, respectively. The thresholds defining the limits between these regimes and the central regime are parameters of the model and are calculated together with the remaining parameters by maximum likelihood. After testing the model with a simulation study we concluded that the upper threshold of the model can be used when applying the peak over threshold method. This will yield an automatic and objective identification of the threshold presenting an alternative to existing methods. The model was also applied to four hydrological data series: two mean daily flow series, the Thames at Kingston (United Kingdom), and the Guadalfeo River at Orgiva (Spain); and two daily precipitation series, Fort Collins (CO, USA), and Orgiva (Spain). It was observed that the model improved the fit of the data series with respect to the fit obtained with the lognormal (LN) and, in particular, provided a good fit for the upper tail. Moreover, we concluded that the proposed model is able to accommodate the entire range of values of some significant hydrological variables.
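    A simplified, upper-tail-only variant of the idea — a lognormal body below a threshold u and a generalized Pareto tail above it, with u estimated jointly with the other parameters by maximum likelihood — can be sketched as follows. The data are synthetic, and the paper's full model additionally includes a lower GPD tail and a truncated central distribution, which this sketch omits.

      import numpy as np
      from scipy import stats
      from scipy.optimize import minimize

      rng = np.random.default_rng(3)
      x = rng.lognormal(mean=1.0, sigma=0.6, size=5000)   # synthetic "daily flow" series

      def neg_log_lik(params):
          """Lognormal body below u, generalized Pareto tail above u."""
          mu, log_s, u, log_psi, xi = params
          s, psi = np.exp(log_s), np.exp(log_psi)
          if u <= np.quantile(x, 0.5) or u >= x.max():
              return np.inf
          body, tail = x[x <= u], x[x > u]
          p_u = stats.lognorm.cdf(u, s, scale=np.exp(mu))   # probability mass below u
          ll = stats.lognorm.logpdf(body, s, scale=np.exp(mu)).sum()
          ll += tail.size * np.log1p(-p_u)                  # tail weight 1 - F(u)
          ll += stats.genpareto.logpdf(tail - u, xi, scale=psi).sum()
          return -ll if np.isfinite(ll) else np.inf

      start = (np.log(x).mean(), np.log(np.log(x).std()),
               np.quantile(x, 0.90), np.log(x.std()), 0.1)
      res = minimize(neg_log_lik, start, method="Nelder-Mead",
                     options={"maxiter": 5000, "xatol": 1e-4, "fatol": 1e-4})
      mu, log_s, u, log_psi, xi = res.x
      print("estimated upper threshold u ≈", u, "  GPD shape xi ≈", xi)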

  16. A random graph model of density thresholds in swarming cells.

    PubMed

    Jena, Siddhartha G

    2016-03-01

    Swarming behaviour is a type of bacterial motility that has been found to be dependent on reaching a local density threshold of cells. With this in mind, the process through which cell-to-cell interactions develop and how an assembly of cells reaches collective motility becomes increasingly important to understand. Additionally, populations of cells and organisms have been modelled through graphs to draw insightful conclusions about population dynamics on a spatial level. In the present study, we make use of analogous random graph structures to model the formation of large chain subgraphs, representing interactions between multiple cells, as a random graph Markov process. Using numerical simulations and analytical results on how quickly paths of certain lengths are reached in a random graph process, metrics for intercellular interaction dynamics at the swarm layer that may be experimentally evaluated are proposed. PMID:26893102

  17. Stylized facts from a threshold-based heterogeneous agent model

    NASA Astrophysics Data System (ADS)

    Cross, R.; Grinfeld, M.; Lamba, H.; Seaman, T.

    2007-05-01

    A class of heterogeneous agent models is investigated where investors switch trading position whenever their motivation to do so exceeds some critical threshold. These motivations can be psychological in nature or reflect behaviour suggested by the efficient market hypothesis (EMH). By introducing different propensities into a baseline model that displays EMH behaviour, one can attempt to isolate their effects upon the market dynamics. The simulation results indicate that the introduction of a herding propensity results in excess kurtosis and power-law decay consistent with those observed in actual return distributions, but not in significant long-term volatility correlations. Possible alternatives for introducing such long-term volatility correlations are then identified and discussed.

  18. Terrestrial Microgravity Model and Threshold Gravity Simulation using Magnetic Levitation

    NASA Technical Reports Server (NTRS)

    Ramachandran, N.

    2005-01-01

    What is the threshold gravity (minimum gravity level) required for the nominal functioning of the human system? What dosage is required? Do human cell lines behave differently in microgravity in response to an external stimulus? The critical need for such a gravity simulator is emphasized by recent experiments on human epithelial cells and lymphocytes on the Space Shuttle clearly showing that cell growth and function are markedly different from those observed terrestrially. Those differences are also dramatic between cells grown in space and those in Rotating Wall Vessels (RWV), or NASA bioreactor often used to simulate microgravity, indicating that although morphological growth patterns (three dimensional growth) can be successfully simulated using RWVs, cell function performance is not reproduced - a critical difference. If cell function is dramatically affected by gravity off-loading, then cell response to stimuli such as radiation, stress, etc. can be very different from terrestrial cell lines. Yet, we have no good gravity simulator for use in study of these phenomena. This represents a profound shortcoming for countermeasures research. We postulate that we can use magnetic levitation of cells and tissue, through the use of strong magnetic fields and field gradients, as a terrestrial microgravity model to study human cells. Specific objectives of the research are: 1. To develop a tried, tested and benchmarked terrestrial microgravity model for cell culture studies; 2. Gravity threshold determination; 3. Dosage (magnitude and duration) of g-level required for nominal functioning of cells; 4. Comparisons of magnetic levitation model to other models such as RWV, hind limb suspension, etc. and 5. Cellular response to reduced gravity levels of Moon and Mars.

  19. Towards A Complete Model Of Photopic Visual Threshold Performance

    NASA Astrophysics Data System (ADS)

    Overington, I.

    1982-02-01

    Based on a wide variety of fragmentary evidence taken from psycho-physics, neurophysiology and electron microscopy, it has been possible to put together a very widely applicable conceptual model of photopic visual threshold performance. Such a model is so complex that a single comprehensive mathematical version is excessively cumbersome. It is, however, possible to set up a suite of related mathematical models, each of limited application but strictly known envelope of usage. Such models may be used for assessment of a variety of facets of visual performance when using display imagery, including effects and interactions of image quality, random and discrete display noise, viewing distance, image motion, etc., both for foveal interrogation tasks and for visual search tasks. The specific model may be selected from the suite according to the assessment task in hand. The paper discusses in some depth the major facets of preperceptual visual processing and their interaction with instrumental image quality and noise. It then highlights the statistical nature of visual performance before going on to consider a number of specific mathematical models of partial visual function. Where appropriate, these are compared with widely popular empirical models of visual function.

  20. A threshold model of content knowledge transfer for socioscientific argumentation

    NASA Astrophysics Data System (ADS)

    Sadler, Troy D.; Fowler, Samantha R.

    2006-11-01

    This study explores how individuals make use of scientific content knowledge for socioscientific argumentation. More specifically, this mixed-methods study investigates how learners apply genetics content knowledge as they justify claims relative to genetic engineering. Interviews are conducted with 45 participants, representing three distinct groups: high school students with variable genetics knowledge, college nonscience majors with little genetics knowledge, and college science majors with advanced genetics knowledge. During the interviews, participants advance positions concerning three scenarios dealing with gene therapy and cloning. Arguments are assessed in terms of the number of justifications offered as well as justification quality, based on a five-point rubric. Multivariate analysis of variance results indicate that college science majors outperformed the other groups in terms of justification quality and frequency. Argumentation does not differ among nonscience majors or high school students. Follow-up qualitative analyses of interview responses suggest that all three groups tend to focus on similar, sociomoral themes as they negotiate socially complex, genetic engineering issues, but that the science majors frequently reference specific science content knowledge in the justification of their claims. Results support the Threshold Model of Content Knowledge Transfer, which proposes two knowledge thresholds around which argumentation quality can reasonably be expected to increase. Research and educational implications of these findings are discussed.

  1. There and back again: Iterating between population-based modeling and experiments reveals surprising regulation of calcium transients in rat cardiac myocytes.

    PubMed

    Devenyi, Ryan A; Sobie, Eric A

    2016-07-01

    While many ion channels and transporters involved in cardiac cellular physiology have been identified and described, the relative importance of each in determining emergent cellular behaviors remains unclear. Here we address this issue with a novel approach that combines population-based mathematical modeling with experimental tests to systematically quantify the relative contributions of different ion channels and transporters to the amplitude of the cellular Ca(2+) transient. Sensitivity analysis of a mathematical model of the rat ventricular cardiomyocyte quantified the response of cell behaviors to changes in the level of each ion channel and transporter, and experimental tests of these predictions were performed to validate or invalidate the predictions. The model analysis found that partial inhibition of the transient outward current in rat ventricular epicardial myocytes was predicted to have a greater impact on Ca(2+) transient amplitude than either: (1) inhibition of the same current in endocardial myocytes, or (2) comparable inhibition of the sarco/endoplasmic reticulum Ca(2+) ATPase (SERCA). Experimental tests confirmed the model predictions qualitatively but showed some quantitative disagreement. This guided us to recalibrate the model by adjusting the relative importance of several Ca(2+) fluxes, thereby improving the consistency with experimental data and producing a more predictive model. Analysis of human cardiomyocyte models suggests that the relative importance of outward currents to Ca(2+) transporters is generalizable to human atrial cardiomyocytes, but not ventricular cardiomyocytes. Overall, our novel approach of combining population-based mathematical modeling with experimental tests has yielded new insight into the relative importance of different determinants of cell behavior. PMID:26235057
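    The population-based sensitivity analysis strategy — randomly scale each parameter across a population of model variants, then regress the log of the output on the log scale factors to obtain sensitivity coefficients — can be sketched with a toy stand-in for the cardiomyocyte model. The parameter names and the output function below are invented for illustration and do not represent the rat ventricular model used in the study.

      import numpy as np

      rng = np.random.default_rng(4)
      param_names = ["g_to", "g_Ca", "v_SERCA", "k_NCX"]   # invented parameter labels
      baseline = np.ones(len(param_names))

      def ca_amplitude(p):
          """Toy stand-in for a cardiomyocyte model's Ca2+ transient amplitude."""
          g_to, g_ca, v_serca, k_ncx = p
          return (g_ca**0.8 * v_serca**0.5) / (1.0 + 0.6 * g_to + 0.3 * k_ncx)

      n = 500                                               # size of the model population
      scales = rng.lognormal(mean=0.0, sigma=0.2, size=(n, baseline.size))
      outputs = np.array([ca_amplitude(baseline * s) for s in scales])

      # Multivariable regression of log outputs on log scale factors gives the
      # relative sensitivity of the Ca2+ transient to each parameter.
      X = np.column_stack([np.ones(n), np.log(scales)])
      coef, *_ = np.linalg.lstsq(X, np.log(outputs), rcond=None)
      for name, c in zip(param_names, coef[1:]):
          print(f"sensitivity of Ca2+ amplitude to {name}: {c:+.2f}")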

  2. Cost effectiveness analysis of population-based serology screening and 13C-Urea breath test for Helicobacter pylori to prevent gastric cancer: A markov model

    PubMed Central

    Xie, Feng; Luo, Nan; Lee, Hin-Peng

    2008-01-01

    AIM: To compare the costs and effectiveness of no screening and no eradication therapy, the population-based Helicobacter pylori (H pylori) serology screening with eradication therapy and 13C-Urea breath test (UBT) with eradication therapy. METHODS: A Markov model simulation was carried out in all 237 900 Chinese males with age between 35 and 44 from the perspective of the public healthcare provider in Singapore. The main outcome measures were the costs, number of gastric cancer cases prevented, life years saved, and quality-adjusted life years (QALYs) gained from screening age to death. The uncertainty surrounding the cost-effectiveness ratio was addressed by one-way sensitivity analyses. RESULTS: Compared to no screening, the incremental cost-effectiveness ratio (ICER) was $16 166 per life year saved or $13 571 per QALY gained for the serology screening, and $38 792 per life year saved and $32 525 per QALY gained for the UBT. The ICER was $477 079 per life year saved or $390 337 per QALY gained for the UBT compared to the serology screening. The cost-effectiveness of serology screening over the UBT was robust to most parameters in the model. CONCLUSION: The population-based serology screening for H pylori was more cost-effective than the UBT in prevention of gastric cancer in Singapore Chinese males. PMID:18494053
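    The ICER arithmetic underlying such comparisons is simply the incremental cost divided by the incremental effect, as in the sketch below. All per-person costs and QALYs are hypothetical placeholders, not the study's estimates.

      # All per-person costs and QALYs below are hypothetical placeholders.
      def icer(cost_new, effect_new, cost_ref, effect_ref):
          """Incremental cost-effectiveness ratio: extra cost per extra unit of effect."""
          return (cost_new - cost_ref) / (effect_new - effect_ref)

      no_screen = {"cost": 120.0, "qaly": 20.100}
      serology  = {"cost": 260.0, "qaly": 20.112}
      ubt       = {"cost": 520.0, "qaly": 20.118}

      print("serology vs no screening:",
            round(icer(serology["cost"], serology["qaly"],
                       no_screen["cost"], no_screen["qaly"])), "per QALY gained")
      print("UBT vs serology         :",
            round(icer(ubt["cost"], ubt["qaly"],
                       serology["cost"], serology["qaly"])), "per QALY gained")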

  3. Power law distribution of wealth in population based on a modified Eguíluz-Zimmermann model

    NASA Astrophysics Data System (ADS)

    Xie, Yan-Bo; Wang, Bing-Hong; Hu, Bo; Zhou, Tao

    2005-04-01

    We propose a money-based model for the power law distribution (PLD) of wealth in an economically interacting population. It is introduced as a modification of the Eguíluz and Zimmermann (EZ) model for crowding and information transmission in financial markets. Still, it must be stressed that in the EZ model a PLD without exponential correction is obtained only for a particular parameter value, while our model yields the exact PLD over a wide parameter range. The PLD exponent depends on the model parameters in a nontrivial way and is exactly calculated in this paper. The numerical results are in excellent agreement with the analytic prediction, and are also comparable with empirical data on wealth distribution.

  4. Smooth-Threshold Multivariate Genetic Prediction with Unbiased Model Selection.

    PubMed

    Ueki, Masao; Tamiya, Gen

    2016-04-01

    We develop a new genetic prediction method, smooth-threshold multivariate genetic prediction, using single nucleotide polymorphisms (SNPs) data in genome-wide association studies (GWASs). Our method consists of two stages. At the first stage, unlike the usual discontinuous SNP screening as used in the gene score method, our method continuously screens SNPs based on the output from standard univariate analysis for marginal association of each SNP. At the second stage, the predictive model is built by a generalized ridge regression simultaneously using the screened SNPs with SNP weight determined by the strength of marginal association. Continuous SNP screening by the smooth thresholding not only makes prediction stable but also leads to a closed form expression of generalized degrees of freedom (GDF). The GDF leads to the Stein's unbiased risk estimation (SURE), which enables data-dependent choice of optimal SNP screening cutoff without using cross-validation. Our method is very rapid because computationally expensive genome-wide scan is required only once in contrast to the penalized regression methods including lasso and elastic net. Simulation studies that mimic real GWAS data with quantitative and binary traits demonstrate that the proposed method outperforms the gene score method and genomic best linear unbiased prediction (GBLUP), and also shows comparable or sometimes improved performance with the lasso and elastic net being known to have good predictive ability but with heavy computational cost. Application to whole-genome sequencing (WGS) data from the Alzheimer's Disease Neuroimaging Initiative (ADNI) exhibits that the proposed method shows higher predictive power than the gene score and GBLUP methods. PMID:26947266
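    One way to mimic the two-stage scheme described above — continuous screening weights derived from marginal association strengths, followed by a generalized ridge fit with penalties scaled by those weights — is sketched below. The weight function, tuning constants, and toy data are assumptions of this sketch; it does not reproduce the paper's estimator or its GDF/SURE-based tuning.

      import numpy as np

      rng = np.random.default_rng(5)
      n, p = 400, 2000                               # samples and SNPs (toy scale)
      X = rng.binomial(2, 0.3, size=(n, p)).astype(float)
      beta_true = np.zeros(p); beta_true[:10] = 0.4  # ten causal SNPs
      y = X @ beta_true + rng.normal(size=n)

      Xc, yc = X - X.mean(axis=0), y - y.mean()

      # Stage 1: marginal association strength of each SNP (absolute correlation).
      marginal = np.abs(Xc.T @ yc) / (np.linalg.norm(Xc, axis=0) * np.linalg.norm(yc))

      # Continuous ("smooth-threshold") screening weights instead of a hard cutoff.
      lam, gamma = np.quantile(marginal, 0.95), 2.0
      w = np.clip(1.0 - (lam / np.maximum(marginal, 1e-12)) ** gamma, 0.0, None)
      keep = w > 0
      print("SNPs retained by the smooth screen:", int(keep.sum()))

      # Stage 2: generalized ridge on the screened SNPs, penalty scaled by 1/weight.
      Xs, ws = Xc[:, keep], w[keep]
      ridge = 10.0
      beta_hat = np.linalg.solve(Xs.T @ Xs + ridge * np.diag(1.0 / ws), Xs.T @ yc)
      print("largest fitted effects:", np.round(np.sort(np.abs(beta_hat))[-5:], 2))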

  5. Application of a dynamic population-based model for evaluation of exposure reduction strategies in the baking industry

    NASA Astrophysics Data System (ADS)

    Meijster, Tim; Warren, Nick; Heederik, Dick; Tielemans, Erik

    2009-02-01

    Recently a dynamic population model was developed that simulates a population of bakery workers longitudinally through time and tracks the development of work-related sensitisation and respiratory symptoms in each worker. Input for this model comes from cross-sectional and longitudinal epidemiological studies, which allowed estimation of exposure-response relationships and disease transition probabilities. This model allows us to study the development of diseases and transitions between disease states over time in relation to determinants of disease, including flour dust and/or allergen exposure. Furthermore, it enables more realistic modelling of the health impact of different intervention strategies at the workplace (e.g. changes in exposure may take several years to impact on ill-health and often occur as a gradual trend). A large dataset of individual full-shift exposure measurements and real-time exposure measurements was used to obtain detailed insight into the effectiveness of control measures and other determinants of exposure. Given this information, a population-wide reduction of the median exposure by 50% was evaluated in this paper.

  6. Clinical effectiveness of orthogeriatric and fracture liaison service models of care for hip fracture patients: population-based longitudinal study

    PubMed Central

    Hawley, Samuel; Javaid, M. Kassim; Prieto-Alhambra, Daniel; Lippett, Janet; Sheard, Sally; Arden, Nigel K.; Cooper, Cyrus; Judge, Andrew

    2016-01-01

    Objectives: to evaluate orthogeriatric and nurse-led fracture liaison service (FLS) models of post-hip fracture care in terms of impact on mortality (30 days and 1 year) and second hip fracture (2 years). Setting: Hospital Episode Statistics database linked to Office for National Statistics mortality records for 11 acute hospitals in a region of England. Population: patients aged over 60 years admitted for a primary hip fracture from 2003 to 2013. Methods: each hospital was analysed separately and acted as its own control in a before–after time-series design in which the appointment of an orthogeriatrician or set-up/expansion of an FLS was evaluated. Multivariable Cox regression (mortality) and competing risk survival models (second hip fracture) were used. Fixed effects meta-analysis was used to pool estimates of impact for interventions of the same type. Results: of 33,152 primary hip fracture patients, 1,288 sustained a second hip fracture within 2 years (age and sex standardised proportion of 4.2%). 3,033 primary hip fracture patients died within 30 days and 9,662 died within 1 year (age and sex standardised proportion of 9.5% and 29.8%, respectively). The estimated impact of introducing an orthogeriatrician on 30-day and 1-year mortality was hazard ratio (HR) = 0.73 (95% CI: 0.65–0.82) and HR = 0.81 (CI: 0.75–0.87), respectively. Following an FLS, these associations were as follows: HR = 0.80 (95% CI: 0.71–0.91) and HR = 0.84 (0.77–0.93). There was no significant impact on time to second hip fracture. Conclusions: the introduction and/or expansion of orthogeriatric and FLS models of post-hip fracture care has a beneficial effect on subsequent mortality. No evidence for a reduction in second hip fracture rate was found. PMID:26802076

  7. Associations of iron metabolism genes with blood manganese levels: a population-based study with validation data from animal models

    PubMed Central

    2011-01-01

    Background Given mounting evidence for adverse effects from excess manganese exposure, it is critical to understand host factors, such as genetics, that affect manganese metabolism. Methods Archived blood samples, collected from 332 Mexican women at delivery, were analyzed for manganese. We evaluated associations of manganese with functional variants in three candidate iron metabolism genes: HFE [hemochromatosis], TF [transferrin], and ALAD [δ-aminolevulinic acid dehydratase]. We used a knockout mouse model to parallel our significant results as a novel method of validating the observed associations between genotype and blood manganese in our epidemiologic data. Results Percentage of participants carrying at least one copy of HFE C282Y, HFE H63D, TF P570S, and ALAD K59N variant alleles was 2.4%, 17.7%, 20.1%, and 6.4%, respectively. Percentage carrying at least one copy of either C282Y or H63D allele in HFE gene was 19.6%. Geometric mean (geometric standard deviation) manganese concentrations were 17.0 (1.5) μg/l. Women with any HFE variant allele had 12% lower blood manganese concentrations than women with no variant alleles (β = -0.12 [95% CI = -0.23 to -0.01]). TF and ALAD variants were not significant predictors of blood manganese. In animal models, Hfe-/- mice displayed a significant reduction in blood manganese compared with Hfe+/+ mice, replicating the altered manganese metabolism found in our human research. Conclusions Our study suggests that genetic variants in iron metabolism genes may contribute to variability in manganese exposure by affecting manganese absorption, distribution, or excretion. Genetic background may be critical to consider in studies that rely on environmental manganese measurements. PMID:22074419

  8. OEDGE Modeling of Detachment Threshold Experiments on DIII-D

    NASA Astrophysics Data System (ADS)

    Elder, J. D.; Stangeby, P. C.; McLean, A. G.; Leonard, A. W.; Watkins, J. G.

    2015-11-01

    A detachment threshold experiment was performed on DIII-D in which the divertor plasma transitioned from attached to weakly detached at the strike point with minimal changes in upstream parameters. The value of Te at the outer strike point measured by Thomson scattering decreased from ~ 10 eV (attached) to ~ 2 eV (weakly detached). Both the Langmuir probes and the divertor Thomson diagnostics recorded increases in the particle flux on the order of a factor of two between these divertor conditions. OEDGE is used to model both of these plasma regimes for both L-mode and H-mode discharges. The behaviour of molecular hydrogen is assessed using OEDGE and possible roles of hydrogen molecules in the detachment process are examined. Work supported by the US Department of Energy under DE-FC02-04ER54698, DE-FG02-04ER54578, DE-AC04-94AL85000, DE-AC05-00OR22725, and DE-AC52-07NA27344.

  9. Modeling of Auditory Neuron Response Thresholds with Cochlear Implants.

    PubMed

    Venail, Frederic; Mura, Thibault; Akkari, Mohamed; Mathiolon, Caroline; Menjot de Champfleur, Sophie; Piron, Jean Pierre; Sicard, Marielle; Sterkers-Artieres, Françoise; Mondain, Michel; Uziel, Alain

    2015-01-01

    The quality of the prosthetic-neural interface is a critical point for cochlear implant efficiency. It depends not only on technical and anatomical factors such as electrode position in the cochlea (depth and scalar placement), electrode impedance, and distance between the electrode and the stimulated auditory neurons, but also on the number of functional auditory neurons. The efficiency of electrical stimulation can be assessed by the measurement of e-CAP in cochlear implant users. In the present study, we modeled the activation of auditory neurons in cochlear implant recipients (Nucleus device). The electrical response, measured using the auto-NRT (neural response telemetry) algorithm, was analyzed using multivariate regression with cubic splines in order to take into account the variation in electrode insertion depth amongst subjects as well as the other technical and anatomical factors listed above. NRT thresholds depend on the electrode squared impedance (β = -0.11 ± 0.02, P < 0.01), the scalar placement of the electrodes (β = -8.50 ± 1.97, P < 0.01), and the depth of insertion calculated as the characteristic frequency of auditory neurons (CNF). The distribution of NRT residuals according to CNF could provide a proxy for auditory neuron function in implanted cochleas. PMID:26236725

  10. No-Impact Threshold Values for NRAP's Reduced Order Models

    SciTech Connect

    Last, George V.; Murray, Christopher J.; Brown, Christopher F.; Jordan, Preston D.; Sharma, Maneesh

    2013-02-01

    The purpose of this study was to develop methodologies for establishing baseline datasets and statistical protocols for determining statistically significant changes between background concentrations and predicted concentrations that would be used to represent a contamination plume in the Gen II models being developed by NRAP's Groundwater Protection team. The initial effort examined selected portions of two aquifer systems: the urban shallow unconfined aquifer of the Edwards-Trinity Aquifer System (being used to develop the ROM for carbonate-rock aquifers), and a portion of the High Plains Aquifer (an unconsolidated and semi-consolidated sand and gravel aquifer, being used to develop the ROM for sandstone aquifers). Threshold values were determined for Cd, Pb, As, pH, and TDS that could be used to identify contamination due to predicted impacts from carbon sequestration storage reservoirs, based on recommendations found in the EPA's "Unified Guidance for Statistical Analysis of Groundwater Monitoring Data at RCRA Facilities" (US Environmental Protection Agency 2009). Results from this effort can be used to inform a "no change" scenario with respect to groundwater impacts, rather than the use of an MCL that could be significantly higher than existing concentrations in the aquifer.

  11. Effects of mixing in threshold models of social behavior

    NASA Astrophysics Data System (ADS)

    Akhmetzhanov, Andrei R.; Worden, Lee; Dushoff, Jonathan

    2013-07-01

    We consider the dynamics of an extension of the influential Granovetter model of social behavior, where individuals are affected by their personal preferences and observation of the neighbors’ behavior. Individuals are arranged in a network (usually the square lattice), and each has a state and a fixed threshold for behavior changes. We simulate the system asynchronously by picking a random individual and we either update its state or exchange it with another randomly chosen individual (mixing). We describe the dynamics analytically in the fast-mixing limit by using the mean-field approximation and investigate it mainly numerically in the case of finite mixing. We show that the dynamics converge to a manifold in state space, which determines the possible equilibria, and show how to estimate the projection of this manifold by using simulated trajectories, emitted from different initial points. We show that the effects of considering the network can be decomposed into finite-neighborhood effects, and finite-mixing-rate effects, which have qualitatively similar effects. Both of these effects increase the tendency of the system to move from a less-desired equilibrium to the “ground state.” Our findings can be used to probe shifts in behavioral norms and have implications for the role of information flow in determining when social norms that have become unpopular in particular communities (such as foot binding or female genital cutting) persist or vanish.
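    In the fast-mixing (mean-field) limit, the dynamics reduce to iterating the threshold distribution's CDF, and the equilibria are its fixed points. The sketch below iterates that recursion for normally distributed thresholds; the mean and spread values are illustrative, chosen only to show how a small change in threshold diversity can flip the equilibrium.

      from scipy.stats import norm

      def equilibrium(mean_threshold, sd_threshold, r0=0.01, iters=500):
          """Iterate the mean-field recursion r_{t+1} = F(r_t), F the threshold CDF."""
          r = r0
          for _ in range(iters):
              r = norm.cdf(r, loc=mean_threshold, scale=sd_threshold)
          return r

      # A small change in the spread of thresholds can flip the population between
      # a near-zero and a near-universal equilibrium (Granovetter's classic point).
      for sd in (0.05, 0.10, 0.15, 0.20):
          print(f"threshold sd = {sd:.2f} -> equilibrium fraction {equilibrium(0.25, sd):.2f}")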

  12. Phonation threshold power in ex vivo laryngeal models

    PubMed Central

    Regner, Michael F.; Jiang, Jack J.

    2011-01-01

    This study hypothesized that phonation threshold power is measurable and sensitive to changes in the biomechanical properties of the vocal folds. Phonation threshold power was measured in three sample populations of ten excised canine larynges treated with variable posterior glottal gap, variable bilateral vocal fold elongation, and variable vocal fold lesioning. Posterior glottal gap was varied from 0 mm to 4 mm in 0.5 mm intervals. Bilateral vocal fold elongation was varied from 0% to 20% in 5% intervals. Vocal fold lesion treatments included unilateral and bilateral vocal fold lesion groups. Each treatment was investigated independently in a sample population of ten excised canine larynges. Linear regression analysis indicated that phonation threshold power was sensitive to posterior glottal gap (R2=0.298, p<0.001) and weakly to vocal fold elongation (R2=0.052, p=0.003). A one-way repeated measures ANOVA indicated that phonation threshold power was sensitive to the presence of lesions (p<0.001). Theoretical and experimental evidence presented here suggests that phonation threshold power could be used as a broad screening parameter, sensitive to certain changes in the biomechanical properties of the larynx. It has not yet been measured in humans, but because it has the potential to represent the airflow-tissue energy transfer more completely than phonation threshold pressure or flow alone, it may be a more useful parameter than these and could be used to indicate that laryngeal health is likely abnormal. PMID:20817475

  13. Budget Impact Analysis of Switching to Digital Mammography in a Population-Based Breast Cancer Screening Program: A Discrete Event Simulation Model

    PubMed Central

    Comas, Mercè; Arrospide, Arantzazu; Mar, Javier; Sala, Maria; Vilaprinyó, Ester; Hernández, Cristina; Cots, Francesc; Martínez, Juan; Castells, Xavier

    2014-01-01

    Objective To assess the budgetary impact of switching from screen-film mammography to full-field digital mammography in a population-based breast cancer screening program. Methods A discrete-event simulation model was built to reproduce the breast cancer screening process (biennial mammographic screening of women aged 50 to 69 years) combined with the natural history of breast cancer. The simulation started with 100,000 women and, during a 20-year simulation horizon, new women were dynamically entered according to the aging of the Spanish population. Data on screening were obtained from Spanish breast cancer screening programs. Data on the natural history of breast cancer were based on US data adapted to our population. A budget impact analysis comparing digital with screen-film screening mammography was performed in a sample of 2,000 simulation runs. A sensitivity analysis was performed for crucial screening-related parameters. Distinct scenarios for recall and detection rates were compared. Results Statistically significant savings were found for overall costs, treatment costs and the costs of additional tests in the long term. The overall cost saving was 1,115,857€ (95%CI from 932,147 to 1,299,567) in the 10th year and 2,866,124€ (95%CI from 2,492,610 to 3,239,638) in the 20th year, representing 4.5% and 8.1% of the overall cost associated with screen-film mammography. The sensitivity analysis showed net savings in the long term. Conclusions Switching to digital mammography in a population-based breast cancer screening program saves long-term budget expense, in addition to providing technical advantages. Our results were consistent across distinct scenarios representing the different results obtained in European breast cancer screening programs. PMID:24832200

  14. The Random-Threshold Generalized Unfolding Model and Its Application to Computerized Adaptive Testing

    ERIC Educational Resources Information Center

    Wang, Wen-Chung; Liu, Chen-Wei; Wu, Shiu-Lien

    2013-01-01

    The random-threshold generalized unfolding model (RTGUM) was developed by treating the thresholds in the generalized unfolding model as random effects rather than fixed effects to account for the subjective nature of the selection of categories in Likert items. The parameters of the new model can be estimated with the JAGS (Just Another Gibbs…

  15. A Comprehensive Multistate Model Analyzing Associations of Various Risk Factors With the Course of Breast Cancer in a Population-Based Cohort of Breast Cancer Cases.

    PubMed

    Eulenburg, Christine; Schroeder, Jennifer; Obi, Nadia; Heinz, Judith; Seibold, Petra; Rudolph, Anja; Chang-Claude, Jenny; Flesch-Janys, Dieter

    2016-02-15

    We employed a semi-Markov multistate model for the simultaneous analysis of various endpoints describing the course of breast cancer. Results were compared with those from standard analyses using a Cox proportional hazards model. We included 3,012 patients with invasive breast cancer newly diagnosed between 2001 and 2005 who were recruited in Germany for a population-based study, the Mamma Carcinoma Risk Factor Investigation (MARIE Study), and prospectively followed up until the end of 2009. Locoregional recurrence and distant metastasis were included as intermediate states, and deaths from breast cancer, secondary cancer, and other causes were included as competing absorbing states. Tumor characteristics were significantly associated with all breast cancer-related endpoints. Nodal involvement was significantly related to local recurrence but more strongly related to distant metastases. Smoking was significantly associated with mortality from second cancers and other causes, whereas menopausal hormone use was significantly associated with reduced distant metastasis and death from causes other than cancer. The presence of cardiovascular disease at diagnosis was solely associated with mortality from other causes. Compared with separate Cox models, multistate models allow for dissection of prognostic factors and intermediate events in the analysis of cause-specific mortality and can yield new insights into disease progression and associated pathways. PMID:26823437

  16. Fiber bundle model with highly disordered breaking thresholds.

    PubMed

    Roy, Chandreyee; Kundu, Sumanta; Manna, S S

    2015-03-01

    We present a study of the fiber bundle model using equal load-sharing dynamics, where the breaking thresholds of the fibers are drawn randomly from a power-law distribution of the form p(b) ∼ b^{-1} in the range 10^{-β} to 10^{β}. Tuning the value of β continuously over a wide range, the critical behavior of the fiber bundle has been studied both analytically and numerically. Our results are: (i) the critical load σ_c(β,N) for a bundle of size N approaches its asymptotic value σ_c(β) as σ_c(β,N) = σ_c(β) + A·N^{-1/ν(β)}, where σ_c(β) has been obtained analytically as σ_c(β) = 10^β/(2βe·ln10) for β ≥ β_u = 1/(2·ln10), while for β < β_u the failure of the weakest fiber leads to the catastrophic breakdown of the entire fiber bundle, similar to brittle materials, giving σ_c(β) = 10^{-β}; (ii) the fraction of broken fibers right before the complete breakdown of the bundle has the form 1 - 1/(2β·ln10); (iii) the distribution D(Δ) of avalanches of size Δ follows a power law D(Δ) ∼ Δ^{-ξ} with ξ = 5/2 for Δ ≫ Δ_c(β) and ξ = 3/2 for Δ ≪ Δ_c(β), where the crossover avalanche size is Δ_c(β) = 2/(1 - e·10^{-2β})^2. PMID:25871050
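
    A minimal numerical sketch of the equal-load-sharing bundle described above: thresholds are sampled from p(b) ∼ b^{-1} on [10^{-β}, 10^{β}] by inverse-transform sampling, and the critical load is computed from the sorted thresholds, which can then be compared with the analytical σ_c(β) = 10^β/(2βe·ln10) quoted in the abstract. The bundle size and the β values are arbitrary choices.

    ```python
    import numpy as np

    def sample_thresholds(n, beta, rng):
        # Inverse-CDF sampling for p(b) ~ 1/b on [10^-beta, 10^beta].
        u = rng.random(n)
        return 10.0 ** (beta * (2.0 * u - 1.0))

    def critical_load(thresholds):
        # Equal load sharing: the bundle carries per-fiber load sigma if some x exists with
        # x * (fraction of thresholds >= x) >= sigma, so
        # sigma_c = max_k b_(k) * (N - k) / N over the sorted thresholds b_(k).
        b = np.sort(thresholds)
        n = b.size
        return np.max(b * (n - np.arange(n)) / n)

    rng = np.random.default_rng(1)
    for beta in (0.5, 1.0, 2.0):
        sigma_num = critical_load(sample_thresholds(100_000, beta, rng))
        sigma_ana = 10.0 ** beta / (2.0 * beta * np.e * np.log(10.0))
        print(f"beta = {beta}: numerical sigma_c ~ {sigma_num:.4f}, analytical ~ {sigma_ana:.4f}")
    ```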

  17. Laser thresholds in pulp exposure: a rat animal model

    NASA Astrophysics Data System (ADS)

    White, Joel M.; Goodis, Harold E.; Kudler, Joel J.

    1995-05-01

    Laser technology is now being clinically investigated for the removal of carious enamel and dentin. This study used an animal model to evaluate histological pulpal effects of laser exposure. The molars of 24 Sprague-Dawley rats (n = 264) were exposed to a pulsed 1.06 micrometer Nd:YAG laser (120 microseconds, 320 micrometer diameter fiber), prepared with an air rotor drill, or left untreated as controls. The following treatment conditions were investigated: control group (n = 54); high-speed drill with carbide bur (n = 39); laser exposure at 50 mJ/p at 10 Hz (n = 27), 100 mJ/p at 10 Hz (n = 66), and 100 mJ/p at 20 Hz (n = 39). A sixth treatment condition, root surface hypersensitivity, included incremental laser exposure from 30 to 100 mJ/p at 10 Hz (n = 39). The animals were euthanized either immediately after treatment, at one week, or at one month. The jaws were fixed and bioprepared. Remaining dentin thickness was measured and ranged from 0.17 +/- 0.04 mm to 0.35 +/- 0.09 mm. The pulp tissue was examined for histologic inflammatory response. No evidence of pulpal involvement or adverse pulpal effects was found at any time period in teeth receiving 50 mJ/p. When histologic samples were compared with controls, all observations were similar. Of the 210 exposed teeth, 2 teeth receiving 100 mJ/p demonstrated abscess formation and were exfoliated. Further, threshold pulpal effects occurred in rat molars exposed to 100 mJ/p when the remaining dentin thickness was less than 0.5 mm. The response of rat pulp to laser exposure indicated no histologically measurable response to pulsed laser energy at 50 mJ/p.

  18. Threshold models linked to land classification and indicators as guides to restoration – prospects and pitfalls

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Threshold (or regime shift) models are useful for restoration because they match actions to conditions where benefits are likely to be maximized. The procedures by which threshold models should be applied, however, are in the early stages of development. Here, we describe ecological concepts and der...

  19. Prediction of liver disease in patients whose liver function tests have been checked in primary care: model development and validation using population-based observational cohorts

    PubMed Central

    McLernon, David J; Donnan, Peter T; Sullivan, Frank M; Roderick, Paul; Rosenberg, William M; Ryder, Steve D; Dillon, John F

    2014-01-01

    Objective To derive and validate a clinical prediction model to estimate the risk of liver disease diagnosis following liver function tests (LFTs) and to convert the model to a simplified scoring tool for use in primary care. Design Population-based observational cohort study of patients in Tayside Scotland identified as having their LFTs performed in primary care and followed for 2 years. Biochemistry data were linked to secondary care, prescriptions and mortality data to ascertain baseline characteristics of the derivation cohort. A separate validation cohort was obtained from 19 general practices across the rest of Scotland to externally validate the final model. Setting Primary care, Tayside, Scotland. Participants Derivation cohort: LFT results from 310 511 patients. After exclusions (including: patients under 16 years, patients having initial LFTs measured in secondary care, bilirubin >35 μmol/L, liver complications within 6 weeks and history of a liver condition), the derivation cohort contained 95 977 patients with no clinically apparent liver condition. Validation cohort: after exclusions, this cohort contained 11 653 patients. Primary and secondary outcome measures Diagnosis of a liver condition within 2 years. Results From the derivation cohort (n=95 977), 481 (0.5%) were diagnosed with a liver disease. The model showed good discrimination (C-statistic=0.78). Given the low prevalence of liver disease, the negative predictive values were high. Positive predictive values were low but rose to 20–30% for high-risk patients. Conclusions This study successfully developed and validated a clinical prediction model and subsequent scoring tool, the Algorithm for Liver Function Investigations (ALFI), which can predict liver disease risk in patients with no clinically obvious liver disease who had their initial LFTs taken in primary care. ALFI can help general practitioners focus referral on a small subset of patients with higher predicted risk

  20. The relationship between the Five-Factor Model personality traits and peptic ulcer disease in a large population-based adult sample.

    PubMed

    Realo, Anu; Teras, Andero; Kööts-Ausmees, Liisi; Esko, Tõnu; Metspalu, Andres; Allik, Jüri

    2015-12-01

    The current study examined the relationship between the Five-Factor Model personality traits and physician-confirmed peptic ulcer disease (PUD) diagnosis in a large population-based adult sample, controlling for the relevant behavioral and sociodemographic factors. Personality traits were assessed by participants themselves and by knowledgeable informants using the NEO Personality Inventory-3 (NEO PI-3). When controlling for age, sex, education, and cigarette smoking, only one of the five NEO PI-3 domain scales - higher Neuroticism - and two facet scales - lower A1: Trust and higher C1: Competence - made a small, yet significant contribution (p < 0.01) to predicting PUD in logistic regression analyses. In the light of these relatively modest associations, our findings imply that it is certain behavior (such as smoking) and sociodemographic variables (such as age, gender, and education) rather than personality traits that are associated with the diagnosis of PUD at a particular point in time. Further prospective studies with a longitudinal design and multiple assessments would be needed to fully understand if the FFM personality traits serve as risk factors for the development of PUD. PMID:26437682

  1. Natural History of Dependency in the Elderly: A 24-Year Population-Based Study Using a Longitudinal Item Response Theory Model.

    PubMed

    Edjolo, Arlette; Proust-Lima, Cécile; Delva, Fleur; Dartigues, Jean-François; Pérès, Karine

    2016-02-15

    We aimed to describe the hierarchical structure of Instrumental Activities of Daily Living (IADL) and basic Activities of Daily Living (ADL) and trajectories of dependency before death in an elderly population using item response theory methodology. Data were obtained from a population-based French cohort study, the Personnes Agées QUID (PAQUID) Study, of persons aged ≥65 years at baseline in 1988 who were recruited from 75 randomly selected areas in Gironde and Dordogne. We evaluated IADL and ADL data collected at home every 2-3 years over a 24-year period (1988-2012) for 3,238 deceased participants (43.9% men). We used a longitudinal item response theory model to investigate the item sequence of 11 IADL and ADL combined into a single scale and functional trajectories adjusted for education, sex, and age at death. The findings confirmed the earliest losses in IADL (shopping, transporting, finances) at the partial limitation level, and then an overlapping of concomitant IADL and ADL, with bathing and dressing being the earliest ADL losses, and finally total losses for toileting, continence, eating, and transferring. Functional trajectories were sex-specific, with a benefit of high education that persisted until death in men but was only transient in women. An in-depth understanding of this sequence provides an early warning of functional decline for better adaptation of medical and social care in the elderly. PMID:26825927

  2. Modelling the regulatory system for diabetes mellitus with a threshold window

    NASA Astrophysics Data System (ADS)

    Yang, Jin; Tang, Sanyi; Cheke, Robert A.

    2015-05-01

    Piecewise (or non-smooth) glucose-insulin models with threshold windows for type 1 and type 2 diabetes mellitus are proposed and analyzed with a view to improving understanding of the glucose-insulin regulatory system. For glucose-insulin models with a single threshold, the existence and stability of regular, virtual, pseudo-equilibria and tangent points are addressed. Then the relations between regular equilibria and a pseudo-equilibrium are studied. Furthermore, the sufficient and necessary conditions for the global stability of regular equilibria and the pseudo-equilibrium are provided by using qualitative analysis techniques of non-smooth Filippov dynamic systems. Sliding bifurcations related to boundary node bifurcations were investigated with theoretical and numerical techniques, and insulin clinical therapies are discussed. For glucose-insulin models with a threshold window, the effects of glucose thresholds or the widths of threshold windows on the durations of insulin therapy and glucose infusion were addressed. The duration of the effects of an insulin injection is sensitive to the variation of thresholds. Our results indicate that blood glucose level can be maintained within a normal range using piecewise glucose-insulin models with a single threshold or a threshold window. Moreover, our findings suggest that it is critical to individualise insulin therapy for each patient separately, based on initial blood glucose levels.
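
    A hedged toy sketch, not the paper's Filippov analysis: a minimal glucose-insulin system in which insulin infusion switches on only while glucose exceeds a single threshold, illustrating how a threshold policy can hold glucose near a target value (a sliding-mode-like regime around the switching surface). All variable names and parameter values are illustrative.

    ```python
    # Toy single-threshold insulin policy: infuse only while glucose G exceeds G_th.
    # Parameters (production p, insulin sensitivity s, insulin decay d, infusion rate u)
    # are illustrative, not clinical values.
    def simulate(G_th, T=2000.0, dt=0.01):
        G, I = 160.0, 5.0                     # initial glucose (mg/dL) and insulin (arbitrary units)
        p, s, d, u = 2.0, 0.01, 0.1, 1.0
        trace = []
        for step in range(int(T / dt)):
            infusion = u if G > G_th else 0.0  # threshold policy: infuse only above G_th
            dG = p - s * I * G                 # glucose production minus insulin-mediated uptake
            dI = infusion - d * I              # insulin infusion minus clearance
            G += dG * dt
            I += dI * dt
            if step % 10_000 == 0:
                trace.append(round(G, 1))      # sample glucose every 100 time units
        return trace

    # Glucose approaches and then hovers near the threshold as the policy switches on and off.
    print(simulate(G_th=140.0))
    ```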

  3. A Threshold Model of Social Support, Adjustment, and Distress after Breast Cancer Treatment

    ERIC Educational Resources Information Center

    Mallinckrodt, Brent; Armer, Jane M.; Heppner, P. Paul

    2012-01-01

    This study examined a threshold model that proposes that social support exhibits a curvilinear association with adjustment and distress, such that support in excess of a critical threshold level has decreasing incremental benefits. Women diagnosed with a first occurrence of breast cancer (N = 154) completed survey measures of perceived support…

  4. Reversed thresholds in partial credit models: a reason for collapsing categories?

    PubMed

    Wetzel, Eunike; Carstensen, Claus H

    2014-12-01

    When questionnaire data with an ordered polytomous response format are analyzed in the framework of item response theory using the partial credit model or the generalized partial credit model, reversed thresholds may occur. This led to the discussion of whether reversed thresholds violate model assumptions and indicate disordering of the response categories. Adams, Wu, and Wilson showed that reversed thresholds are merely a consequence of low frequencies in the categories concerned and that they do not affect the order of the rating scale. This article applies an empirical approach to elucidate the topic of reversed thresholds using data from the Revised NEO Personality Inventory as well as a simulation study. It is shown that categories differentiate between participants with different trait levels despite reversed thresholds and that category disordering can be analyzed independently of the ordering of the thresholds. Furthermore, we show that reversed thresholds often only occur in subgroups of participants. Thus, researchers should think more carefully about collapsing categories due to reversed thresholds. PMID:24789857

  5. Role of propagation thresholds in sentiment-based model of opinion evolution with information diffusion

    NASA Astrophysics Data System (ADS)

    Si, Xia-Meng; Wang, Wen-Dong; Ma, Yan

    2016-06-01

    The degree of sentiment is the key factor for internet users in determining their propagating behaviors, i.e. whether to participate in a discussion and whether to withdraw from one. To this end, we introduce two sentiment-based propagation thresholds (i.e. an infected threshold and a refractory threshold) and propose an interacting model based on Bayesian updating rules. Our model describes the phenomena that few internet users change their decisions and that some users have already dropped out of a discussion about a topic while others are only just becoming aware of it. Numerical simulations show that a large infected threshold restrains information diffusion but favors the lessening of extremism, while a large refractory threshold facilitates decision interaction but promotes extremism. Making netizens calm down and propagate information sanely can restrain the spread of extremism around rumors.

  6. Medication Adherence Patterns after Hospitalization for Coronary Heart Disease. A Population-Based Study Using Electronic Records and Group-Based Trajectory Models

    PubMed Central

    Librero, Julián; Sanfélix-Gimeno, Gabriel; Peiró, Salvador

    2016-01-01

    Objective To identify adherence patterns over time and their predictors for evidence-based medications used after hospitalization for coronary heart disease (CHD). Patients and Methods We built a population-based retrospective cohort of all patients discharged after hospitalization for CHD from public hospitals in the Valencia region (Spain) during 2008 (n = 7462). From this initial cohort, we created 4 subcohorts with at least one prescription (filled or not) from each therapeutic group (antiplatelet, beta-blockers, ACEI/ARB, statins) within the first 3 months after discharge. Monthly adherence was defined as having ≥24 days covered out of 30, leading to a repeated binary outcome measure. We assessed the membership to trajectory groups of adherence using group-based trajectory models. We also analyzed predictors of the different adherence patterns using multinomial logistic regression. Results We identified a maximum of 5 different adherence patterns: 1) Nearly-always adherent patients; 2) An early gap in adherence with a later recovery; 3) Brief gaps in medication use or occasional users; 4) A slow decline in adherence; and 5) A fast decline. These patterns represented variable proportions of patients, the descending trajectories being more frequent for the beta-blocker and ACEI/ARB cohorts (16% and 17%, respectively) than the antiplatelet and statin cohorts (10% and 8%, respectively). Predictors of poor or intermediate adherence patterns were having a main diagnosis of unstable angina or other forms of CHD vs. AMI in the index hospitalization, being born outside Spain, requiring copayment or being older. Conclusion Distinct adherence patterns over time and their predictors were identified. This may be a useful approach for targeting improvement interventions in patients with poor adherence patterns. PMID:27551748

  7. A phenomenological model on the kink mode threshold varying with the inclination of sheath boundary

    SciTech Connect

    Sun, X.; Intrator, T. P.; Sears, J.; Weber, T.; Liu, M.

    2013-11-15

    In nature and many laboratory plasmas, a magnetic flux tube threaded by current, or a flux rope, has a footpoint at a boundary. The current-driven kink mode is one of the fundamental ideal magnetohydrodynamic instabilities in plasmas. It has an instability threshold that has been found to depend strongly on boundary conditions (BCs). We provide a theoretical model to explain the transition of this threshold dependence between non-line-tied (NLT) and line-tied (LT) boundary conditions. We evaluate model parameters using experimentally measured plasma data, explicitly verify several kink eigenfunctions, and validate the model predictions for BCs that span the range between NLT and LT. Based on this model, one could estimate the kink threshold given knowledge of the displacement of a flux rope end, or conversely estimate flux rope end motion based on knowledge of its kink stability threshold.

  8. Two-threshold model for scaling laws of noninteracting snow avalanches

    USGS Publications Warehouse

    Faillettaz, J.; Louchet, F.; Grasso, J.-R.

    2004-01-01

    A two-threshold model was proposed for scaling laws of noninteracting snow avalanches. It was found that the sizes of the largest avalanches just preceding the breakdown of the lattice system were power-law distributed. The proposed model reproduced the range of power-law exponents observed for land, rock or snow avalanches by tuning the maximum value of the ratio of the two failure thresholds. A two-threshold 2D cellular automaton was introduced to study the scaling for gravity-driven systems.

  9. The threshold of a stochastic delayed SIR epidemic model with temporary immunity

    NASA Astrophysics Data System (ADS)

    Liu, Qun; Chen, Qingmei; Jiang, Daqing

    2016-05-01

    This paper is concerned with the asymptotic properties of a stochastic delayed SIR epidemic model with temporary immunity. Sufficient conditions for extinction and persistence in the mean of the epidemic are established. The threshold between persistence in the mean and extinction of the epidemic is obtained. Compared with the corresponding deterministic model, the threshold affected by the white noise is smaller than the basic reproduction number R0 of the deterministic system.

  10. The Rasch Rating Model and the Disordered Threshold Controversy

    ERIC Educational Resources Information Center

    Adams, Raymond J.; Wu, Margaret L.; Wilson, Mark

    2012-01-01

    The Rasch rating (or partial credit) model is a widely applied item response model that is used to model ordinal observed variables that are assumed to collectively reflect a common latent variable. In the application of the model there is considerable controversy surrounding the assessment of fit. This controversy is most notable when the set of…

  11. Simplified modelling the mode instability threshold of high power fiber amplifiers in the presence of photodarkening.

    PubMed

    Jauregui, Cesar; Otto, Hans-Jürgen; Stutzki, F; Limpert, J; Tünnermann, A

    2015-08-10

    In this paper we present a simple model to predict the behavior of the transversal mode instability threshold when different parameters of a fiber amplifier system are changed. The simulation model includes an estimation of the photodarkening losses, which shows the strong influence that this effect has on the mode instability threshold and on its behavior. Comparison of the simulation results with experimental measurements reveals that, to a good approximation, the mode instability threshold in a fiber amplifier system is reached at a constant average heat load. Based on this model, the expected behavior of the mode instability threshold when changing the seed wavelength, the seed power and/or the fiber length will be presented and discussed. Additionally, guidelines for increasing the average power of fiber amplifier systems will be provided. PMID:26367877

  12. The Threshold Bias Model: A Mathematical Model for the Nomothetic Approach of Suicide

    PubMed Central

    Folly, Walter Sydney Dutra

    2011-01-01

    Background Comparative and predictive analyses of suicide data from different countries are difficult to perform due to varying approaches and the lack of comparative parameters. Methodology/Principal Findings A simple model (the Threshold Bias Model) was tested for comparative and predictive analyses of suicide rates by age. The model comprises a six-parameter distribution that was applied to the USA suicide rates by age for the years 2001 and 2002. Subsequently, linear extrapolations of the parameter values obtained for these years were performed to estimate the corresponding values for 2003. The calculated distributions agreed reasonably well with the aggregate data. The model was also used to determine the age above which suicide rates become statistically observable in the USA, Brazil and Sri Lanka. Conclusions/Significance The Threshold Bias Model has considerable potential applications in demographic studies of suicide. Moreover, since the model can be used to predict the evolution of suicide rates based on information extracted from past data, it will be of great interest to suicidologists and other researchers in the field of mental health. PMID:21909431

  13. Immunization and epidemic threshold of an SIS model in complex networks

    NASA Astrophysics Data System (ADS)

    Wu, Qingchu; Fu, Xinchu

    2016-02-01

    We propose an improved mean-field model to investigate immunization strategies for an SIS model in complex networks. Unlike the traditional mean-field approach, the improved model utilizes degree information from before and after the immunization. The epidemic threshold of degree-correlated networks can be obtained by linear stability analysis. For degree-uncorrelated networks, the model reduces to the SIS epidemic model on the network that remains after removing the immunized nodes. Compared with previous results for random and targeted immunization schemes on degree-uncorrelated networks, we find that the infectious disease has a lower epidemic threshold.
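
    The paper's improved mean-field model is not reproduced here; as a hedged baseline sketch, the standard degree-based (heterogeneous) mean-field estimate of the SIS threshold, roughly <k>/<k^2> for uncorrelated networks, already shows how removing immunized nodes shifts the threshold. The degree distribution, the immunized fraction, and the crude treatment of removed edges are all illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def power_law_degrees(n, gamma=2.5, kmin=3, kmax=300):
        # Approximate continuous power-law degree sequence via inverse-transform sampling.
        u = rng.random(n)
        k = (kmin ** (1 - gamma) + u * (kmax ** (1 - gamma) - kmin ** (1 - gamma))) ** (1.0 / (1 - gamma))
        return np.floor(k).astype(int)

    def mf_threshold(degrees):
        # Degree-based mean-field SIS threshold for uncorrelated networks: <k>/<k^2>.
        k = degrees.astype(float)
        return k.mean() / (k ** 2).mean()

    deg = power_law_degrees(100_000)
    n_imm = int(0.05 * deg.size)                      # immunize 5% of nodes (illustrative)

    random_set = rng.choice(deg.size, size=n_imm, replace=False)
    targeted_set = np.argsort(deg)[-n_imm:]           # highest-degree nodes

    # Crude: deleting immunized entries keeps the remaining nodes' original degrees,
    # ignoring the edges lost to removed neighbors (which would raise the threshold further).
    print("no immunization :", mf_threshold(deg))
    print("random          :", mf_threshold(np.delete(deg, random_set)))
    print("targeted        :", mf_threshold(np.delete(deg, targeted_set)))
    ```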

  14. Epidemic threshold of node-weighted susceptible-infected-susceptible models on networks

    NASA Astrophysics Data System (ADS)

    Wu, Qingchu; Zhang, Haifeng

    2016-08-01

    In this paper, we investigate epidemic spreading on random and regular networks through a pairwise-type model with a general transmission rate, in order to evaluate the influence of the node-weight distribution. By using block matrix theory, an epidemic threshold index is formulated to predict the epidemic outbreak. An upper bound on the epidemic threshold is obtained by analyzing the monotonicity of the spectral radius for nonnegative matrices. Theoretical results suggest that the epidemic threshold depends on two matrices H^(1) and H^(2), the first being related to the mean-field model and the second reflecting the heterogeneous transmission rates. In particular, for a linear transmission rate, this study shows a negative correlation between the heterogeneity of the weight distribution and the epidemic threshold, which differs from existing results for edge-weighted networks.

  15. Investigating the genetic architecture of conditional strategies using the environmental threshold model.

    PubMed

    Buzatto, Bruno A; Buoro, Mathieu; Hazel, Wade N; Tomkins, Joseph L

    2015-12-22

    The threshold expression of dichotomous phenotypes that are environmentally cued or induced comprises the vast majority of phenotypic dimorphisms in colour, morphology, behaviour and life history. Modelled as conditional strategies under the framework of evolutionary game theory, the quantitative genetic basis of these traits is a challenge to estimate. The challenge exists firstly because the phenotypic expression of the trait is dichotomous and secondly because the apparent environmental cue is separate from the biological signal pathway that induces the switch between phenotypes. It is the cryptic variation underlying the translation of cue to phenotype that we address here. With a 'half-sib common environment' and a 'family-level split environment' experiment, we examine the environmental and genetic influences that underlie male dimorphism in the earwig Forficula auricularia. From the conceptual framework of the latent environmental threshold (LET) model, we use pedigree information to dissect the genetic architecture of the threshold expression of forceps length. We investigate for the first time the strength of the correlation between observable and cryptic 'proximate' cues. Furthermore, in support of the environmental threshold model, we found no evidence for a genetic correlation between cue and the threshold between phenotypes. Our results show strong correlations between observable and proximate cues and less genetic variation for thresholds than previous studies have suggested. We discuss the importance of generating better estimates of the genetic variation for thresholds when investigating the genetic architecture and heritability of threshold traits. By investigating genetic architecture by means of the LET model, our study supports several key evolutionary ideas related to conditional strategies and improves our understanding of environmentally cued decisions. PMID:26674955

  16. A two-step framework for over-threshold modelling of environmental extremes

    NASA Astrophysics Data System (ADS)

    Bernardara, P.; Mazas, F.; Kergadallan, X.; Hamm, L.

    2014-03-01

    The evaluation of the probability of occurrence of extreme natural events is important for the protection of urban areas, industrial facilities and others. Traditionally, extreme value theory (EVT) offers a valid theoretical framework on this topic. In an over-threshold modelling (OTM) approach, Pickands' theorem (Pickands, 1975) states that, for a sample composed of independent and identically distributed (i.i.d.) values, the distribution of the data exceeding a given threshold converges to a generalized Pareto distribution (GPD). Following this theoretical result, the analysis of realizations of environmental variables exceeding a threshold has spread widely in the literature. However, applying this theorem to an auto-correlated time series logically involves two successive and complementary steps: the first is required to build a sample of i.i.d. values from the available information, as required by the EVT; the second is to set the threshold for optimal convergence toward the GPD. In the past, the same threshold was often employed both for sampling observations and for meeting the hypothesis of extreme value convergence. This confusion can lead to an erroneous understanding of the methodologies and tools available in the literature. This paper aims at clarifying the conceptual framework involved in threshold selection, reviewing the available methods for the application of both steps, and illustrating them with a double-threshold approach.
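
    A minimal sketch of the two-step idea on synthetic data, assuming SciPy is available: a physical threshold plus declustering yields an approximately independent sample of peaks, and a second, statistical threshold is then used to fit the GPD to the exceedances. Both thresholds and the synthetic series are illustrative choices, not recommendations.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)
    series = rng.gumbel(loc=2.0, scale=1.0, size=50_000)   # synthetic "environmental" series

    # Step 1: sampling threshold u1 plus declustering (one peak per cluster of exceedances).
    u1 = 4.0
    peaks, current = [], []
    for x in series:
        if x > u1:
            current.append(x)
        elif current:
            peaks.append(max(current))
            current = []
    if current:
        peaks.append(max(current))
    peaks = np.array(peaks)

    # Step 2: statistical threshold u2 >= u1 for GPD convergence; fit the GPD to exceedances.
    u2 = 4.5
    exceedances = peaks[peaks > u2] - u2
    shape, loc, scale = stats.genpareto.fit(exceedances, floc=0.0)
    print(f"{exceedances.size} exceedances, GPD shape = {shape:.3f}, scale = {scale:.3f}")

    # Level exceeded with probability 1/1000 by a peak that already exceeds u2.
    p = 1.0 / 1000.0
    print("illustrative return level:", u2 + stats.genpareto.ppf(1 - p, shape, loc=0.0, scale=scale))
    ```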

  17. Modeling the Threshold Wind Speed for Saltation Initiation over Heterogeneous Sand Beds

    NASA Astrophysics Data System (ADS)

    Turney, F. A.; Martin, R. L.; Kok, J. F.

    2015-12-01

    Initiation of aeolian sediment transport is key to understanding the formation of dunes, emission of dust into the atmosphere, and landscape erosion. Previous models of the threshold wind speed required for saltation initiation have assumed that the particle bed is monodisperse and homogeneous in arrangement, thereby ignoring what is in reality a distribution of particle lifting thresholds, influenced by variability in soil particle sizes and bed geometry. To help overcome this problem, we present a numerical model that determines the distribution of threshold wind speeds required for particle lifting for a given soil size distribution. The model results are evaluated against high frequency wind speed and saltation data from a recent field campaign in Oceano Dunes in Southern California. The results give us insight into the range of lifting thresholds present during incipient sediment transport and the simplifications that are often made to characterize the process. In addition, this study provides a framework for moving beyond the 'fluid threshold' paradigm, which is known to be inaccurate, especially for near-threshold conditions.

  18. Estimating nerve excitation thresholds to cutaneous electrical stimulation by finite element modeling combined with a stochastic branching nerve fiber model.

    PubMed

    Mørch, Carsten Dahl; Hennings, Kristian; Andersen, Ole Kæseler

    2011-04-01

    Electrical stimulation of cutaneous tissue through surface electrodes is an often-used method for evoking experimental pain. However, at painful intensities both non-nociceptive Aβ-fibers and nociceptive Aδ- and C-fibers may be activated by the electrical stimulation. This study proposes a finite element (FE) model of the extracellular potential and a stochastic branching fiber model of the afferent fiber excitation thresholds. The FE model described four horizontal layers (stratum corneum, epidermis, dermis, and hypodermis) and was used to estimate the excitation thresholds of Aβ-fibers terminating in the dermis and Aδ-fibers terminating in the epidermis. The perception thresholds of 11 electrodes with diameters ranging from 0.2 to 20 mm were modeled and assessed on the volar forearm of healthy human volunteers by an adaptive two-alternative forced choice algorithm. The model showed that the magnitude of the current density was highest for smaller electrodes and decreased through the skin. The excitation thresholds of the Aδ-fibers were lower than those of the Aβ-fibers when current was applied through small, but not large, electrodes. The experimentally assessed perception threshold followed the lowest excitation threshold of the modeled fibers. The model confirms that preferential excitation of Aδ-fibers may be achieved by small-electrode stimulation due to the higher current density at the dermoepidermal junction. PMID:21207174

  19. Using a combined population-based and kinetic modelling approach to assess timescales and durations of magma migration activities prior to the 1669 flank eruption of Mt. Etna

    NASA Astrophysics Data System (ADS)

    Kahl, M.; Morgan, D. J.; Viccaro, M.; Dingwell, D. B.

    2015-12-01

    The March-July eruption of Mt. Etna in 1669 is ranked as one of the most destructive and voluminous eruptions of Etna volcano in historical times. To assess threats from future eruptions, a better understanding of how and over what timescales magma moved underground prior to and during the 1669 eruption is required. We present a combined population based and kinetic modelling approach [1-2] applied to 185 olivine crystals that erupted during the 1669 eruption. By means of this approach we provide, for the first time, a dynamic picture of magma mixing and magma migration activity prior to and during the 1669 flank eruption of Etna volcano. Following the work of [3] we have studied 10 basaltic lava samples (five SET1 and five SET2 samples) that were erupted from different fissures that opened between 950 and 700 m a.s.l. Following previous work [1-2] we were able to classify different populations of olivine based on their overall core and rim compositional record and the prevalent zoning type (i.e. normal vs. reverse). The core plateau compositions of the SET1 and SET2 olivines range from Fo70 up to Fo83 with a single peak at Fo75-76. The rims differ significantly and can be distinguished into two different groups. Olivine rims from the SET1 samples are generally more evolved and range from Fo50 to Fo64 with a maximum at Fo55-57. SET2 olivine rims vary between Fo65-75 with a peak at Fo69. SET1 and SET2 olivines display normal zonation with cores at Fo75-76 and diverging rim records (Fo55-57 and Fo65-75). The diverging core and rim compositions recorded in the SET1 and SET2 olivines can be attributed to magma evolution possibly in three different magmatic environments (MEs): M1 (=Fo75-76), M2 (=Fo69) and M3 (=Fo55-57) with magma transfer and mixing amongst them. The MEs established in this study differ slightly from those identified in previous works [1-2]. We note the relative lack of olivines with Fo-rich core and rim compositions indicating a major mafic magma

  20. Postscript: Parallel Distributed Processing in Localist Models without Thresholds

    ERIC Educational Resources Information Center

    Plaut, David C.; McClelland, James L.

    2010-01-01

    The current authors reply to a response by Bowers on a comment by the current authors on the original article. Bowers (2010) mischaracterizes the goals of parallel distributed processing (PDP) research--explaining performance on cognitive tasks is the primary motivation. More important, his claim that localist models, such as the interactive…

  1. Threshold behaviour of a SIR epidemic model with age structure and immigration.

    PubMed

    Franceschetti, Andrea; Pugliese, Andrea

    2008-07-01

    We consider a SIR age-structured model with immigration of infectives in all epidemiological compartments; the population is assumed to be in demographic equilibrium between below-replacement fertility and immigration; the spread of the infection occurs through a general age-dependent kernel. We analyse the equations for steady states; because of immigration of infectives a steady state with a positive density of infectives always exists; however, a quasi-threshold theorem is proved, in the sense that, below the threshold, the density of infectives is close to 0, while it is away from 0, above the threshold; furthermore, conditions that guarantee uniqueness of steady states are obtained. Finally, we present some numerical examples, inspired by the Italian demographic situation, that illustrate the threshold-like behaviour, and other features of the stationary solutions and of the transient. PMID:17985131

  2. Neural masking by sub-threshold electric stimuli: animal and computer model results.

    PubMed

    Miller, Charles A; Woo, Jihwan; Abbas, Paul J; Hu, Ning; Robinson, Barbara K

    2011-04-01

    Electric stimuli can prosthetically excite auditory nerve fibers to partially restore sensory function to individuals impaired by profound or severe hearing loss. While basic response properties of electrically stimulated auditory nerve fibers (ANF) are known, responses to complex, time-changing stimuli used clinically are inadequately understood. We report that forward-masker pulse trains can enhance and reduce ANF responsiveness to subsequent stimuli and the novel observation that sub-threshold (nonspike-evoking) electric trains can reduce responsiveness to subsequent pulse-train stimuli. The effect is observed in the responses of cat ANFs and shown by a computational biophysical ANF model that simulates rate adaptation through integration of external potassium cation (K) channels. Both low-threshold (i.e., Klt) and high-threshold (Kht) channels were simulated at each node of Ranvier. Model versions without Klt channels did not produce the sub-threshold effect. These results suggest that some such accumulation mechanism, along with Klt channels, may underlie sub-threshold masking observed in cat ANF responses. As multichannel auditory prostheses typically present sub-threshold stimuli to various ANF subsets, there is clear relevance of these findings to clinical situations. PMID:21080206

  3. Modeling soil quality thresholds to ecosystem recovery at Fort Benning, GA, USA

    SciTech Connect

    Garten Jr, Charles T; Ashwood, Tom L

    2004-12-01

    The objective of this research was to use a simple model of soil carbon (C) and nitrogen (N) dynamics to predict nutrient thresholds to ecosystem recovery on degraded soils at Fort Benning, Georgia, in the southeastern USA. Artillery, wheeled, and tracked vehicle training at military installations can produce soil disturbance and potentially create barren, degraded soils. Ecosystem reclamation is an important component of natural resource management at military installations. Four factors were important to the development of thresholds to recovery of aboveground biomass on degraded soils: (1) initial amounts of aboveground biomass, (2) initial soil C stocks (i.e., soil quality), (3) relative recovery rates of biomass, and (4) soil sand content. Forests and old fields on soils with varying sand content had different predicted thresholds for ecosystem recovery. Soil C stocks at barren sites on Fort Benning were generally below predicted thresholds to 100% recovery of desired future ecosystem conditions defined on the basis of aboveground biomass. Predicted thresholds to ecosystem recovery were less on soils with more than 70% sand content. The lower thresholds for old field and forest recovery on more sandy soils were apparently due to higher relative rates of net soil N mineralization. Calculations with the model indicated that a combination of desired future conditions, initial levels of soil quality (defined by soil C stocks), and the rate of biomass accumulation determine the predicted success of ecosystem recovery on disturbed soils.

  4. Does Imaging Technology Cause Cancer? Debunking the Linear No-Threshold Model of Radiation Carcinogenesis.

    PubMed

    Siegel, Jeffry A; Welsh, James S

    2016-04-01

    In the past several years, there has been a great deal of attention from the popular media focusing on the alleged carcinogenicity of low-dose radiation exposures received by patients undergoing medical imaging studies such as X-rays, computed tomography scans, and nuclear medicine scintigraphy. The media has based its reporting on the plethora of articles published in the scientific literature that claim that there is "no safe dose" of ionizing radiation, while essentially ignoring all the literature demonstrating the opposite point of view. But this reported "scientific" literature in turn bases its estimates of cancer induction on the linear no-threshold hypothesis of radiation carcinogenesis. The use of the linear no-threshold model has yielded hundreds of articles, all of which predict a definite carcinogenic effect of any dose of radiation, regardless of how small. Therefore, hospitals and professional societies have begun campaigns and policies aiming to reduce the use of certain medical imaging studies based on perceived risk:benefit ratio assumptions. However, as they are essentially all based on the linear no-threshold model of radiation carcinogenesis, the risk:benefit ratio models used to calculate the hazards of radiological imaging studies may be grossly inaccurate if the linear no-threshold hypothesis is wrong. Here, we review the myriad inadequacies of the linear no-threshold model and cast doubt on the various studies based on this overly simplistic model. PMID:25824269

  5. A physical-based pMOSFETs threshold voltage model including the STI stress effect

    NASA Astrophysics Data System (ADS)

    Wei, Wu; Gang, Du; Xiaoyan, Liu; Lei, Sun; Jinfeng, Kang; Ruqi, Han

    2011-05-01

    A physics-based threshold voltage model of pMOSFETs under shallow trench isolation (STI) stress has been developed. The model is verified with layout-dependent measurement data from a 130 nm technology. A comparison between pMOSFET and nMOSFET model simulations under STI stress shows that STI stress induces a smaller threshold voltage shift and a larger mobility shift for the pMOSFET. Circuit simulations of a nine-stage ring oscillator with and without STI stress showed an improvement of about 11% in average delay time. This indicates the importance of considering STI stress in circuit design.

  6. Threshold conditions for SIS epidemic models on edge-weighted networks

    NASA Astrophysics Data System (ADS)

    Wu, Qingchu; Zhang, Fei

    2016-07-01

    We consider the disease dynamics of a susceptible-infected-susceptible model in weighted random and regular networks. By using the pairwise approximation, we build an edge-based compartment model, from which the condition of epidemic outbreak is obtained. Our results suggest that there exists a remarkable difference between the linear and nonlinear transmission rate. For a linear transmission rate, the epidemic threshold is completely determined by the mean weight, which is different from the susceptible-infected-recovered model framework. While for a nonlinear transmission rate, the epidemic threshold is not only related to the mean weight, but also closely related to the heterogeneity of weight distribution.

  7. Effects of pump recycling technique on stimulated Brillouin scattering threshold: a theoretical model.

    PubMed

    Al-Asadi, H A; Al-Mansoori, M H; Ajiya, M; Hitam, S; Saripan, M I; Mahdi, M A

    2010-10-11

    We develop a theoretical model that can be used to predict the stimulated Brillouin scattering (SBS) threshold in optical fibers that arises through the effect of a Brillouin pump recycling technique. Simulation results obtained from our model are in close agreement with our experimental results. The developed model utilizes single-mode optical fiber of different lengths as the Brillouin gain medium. For a 5-km long single-mode fiber, the calculated threshold power for SBS is about 16 mW with the conventional technique. This value is reduced to about 8 mW when the residual Brillouin pump is recycled at the end of the fiber. The reduction in the SBS threshold is due to the longer interaction length between the Brillouin pump and the Stokes wave. PMID:20941134

  8. Study on the threshold of a stochastic SIR epidemic model and its extensions

    NASA Astrophysics Data System (ADS)

    Zhao, Dianli

    2016-09-01

    This paper provides a simple but effective method for estimating the threshold of a class of stochastic epidemic models by use of the nonnegative semimartingale convergence theorem. Firstly, the threshold R0^SIR is obtained for the stochastic SIR model with a saturated incidence rate; whether its value is below or above 1 completely determines whether the disease goes extinct or prevails, for any size of the white noise. Besides, when R0^SIR > 1, the system is proved to be convergent in time mean. Then, the thresholds of the stochastic SIVS models with or without a saturated incidence rate are also established by the same method. Compared with the previously known literature, the related results are improved, and the method is simpler than before.
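
    The paper's threshold expression is not reproduced here; as a hedged illustration of the underlying behavior, an Euler-Maruyama simulation of an SIR model with saturated incidence βSI/(1+aI) and multiplicative white noise on transmission shows that sufficiently large noise can extinguish an epidemic whose deterministic R0 exceeds 1. All parameter values are hypothetical.

    ```python
    import numpy as np

    def simulate(sigma, T=2000.0, dt=0.01, seed=0):
        """Euler-Maruyama integration of a noisy SIR model with saturated incidence; returns I(T)."""
        rng = np.random.default_rng(seed)
        Lam, mu, gamma, beta, a = 0.02, 0.02, 0.05, 0.2, 0.1   # hypothetical parameters
        S, I, R = 0.9, 0.1, 0.0
        for _ in range(int(T / dt)):
            dW = rng.normal(0.0, np.sqrt(dt))
            inc = beta * S * I / (1.0 + a * I)           # saturated incidence
            noise = sigma * S * I / (1.0 + a * I) * dW   # multiplicative noise on transmission
            S += (Lam - mu * S - inc) * dt - noise
            I += (inc - (mu + gamma) * I) * dt + noise
            R += (gamma * I - mu * R) * dt
            S, I = max(S, 0.0), max(I, 0.0)
        return I

    # Deterministic R0 = beta * (Lam/mu) / (mu + gamma) ~ 2.86 > 1, so the noise-free model persists;
    # sufficiently strong noise nevertheless drives the infection toward extinction.
    for sigma in (0.0, 0.2, 0.8):
        print(f"sigma = {sigma}: I(T) ~ {simulate(sigma):.4f}")
    ```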

  9. Modeling Soil Quality Thresholds to Ecosystem Recovery at Fort Benning, Georgia, USA

    SciTech Connect

    Garten Jr., C.T.

    2004-03-08

    The objective of this research was to use a simple model of soil C and N dynamics to predict nutrient thresholds to ecosystem recovery on degraded soils at Fort Benning, Georgia, in the southeastern USA. The model calculates aboveground and belowground biomass, soil C inputs and dynamics, soil N stocks and availability, and plant N requirements. A threshold is crossed when predicted soil N supplies fall short of predicted N required to sustain biomass accrual at a specified recovery rate. Four factors were important to development of thresholds to recovery: (1) initial amounts of aboveground biomass, (2) initial soil C stocks (i.e., soil quality), (3) relative recovery rates of biomass, and (4) soil sand content. Thresholds to ecosystem recovery predicted by the model should not be interpreted independent of a specified recovery rate. Initial soil C stocks influenced the predicted patterns of recovery by both old field and forest ecosystems. Forests and old fields on soils with varying sand content had different predicted thresholds to recovery. Soil C stocks at barren sites on Fort Benning generally lie below predicted thresholds to 100% recovery of desired future ecosystem conditions defined on the basis of aboveground biomass (18,000 versus 360 g m^-2 for forests and old fields, respectively). Calculations with the model indicated that reestablishment of vegetation on barren sites to a level below the desired future condition is possible at recovery rates used in the model, but the time to 100% recovery of desired future conditions, without crossing a nutrient threshold, is prolonged by a reduced rate of forest growth. Predicted thresholds to ecosystem recovery were less on soils with more than 70% sand content. The lower thresholds for old field and forest recovery on more sandy soils are apparently due to higher relative rates of net soil N mineralization in more sandy soils. Calculations with the model indicate that a combination of desired future

  10. Modeling the effect of humidity on the threshold friction velocity of coal particles

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaochun; Chen, Weiping; Ma, Chun; Zhan, Shuifen

    2012-09-01

    Coal particle emission can cause serious air pollution in coal production and transport regions. In the coal mining industry, large amounts of water are regularly sprayed onto coal piles to prevent dust emission from the coal particles. The mechanism behind this measure is to manage the threshold friction velocity, which is an important parameter in controlling wind erosion and dust emission. Bagnold developed a threshold friction velocity model for soil particles. However, the Bagnold model cannot be applied directly to coal particles, as coal particles are quite different from soils in their physical and chemical properties. We studied and modeled the threshold friction velocity of coal particles under different humidities by using a wind tunnel. Results showed that the effects of humidity on the threshold friction velocity of coal particles are related to a hydrophilic effect and an adhesive effect. The Bagnold model can be corrected with two new parameter terms that account for these two effects. The new model agreed well with wind tunnel measurements for coal particles in different size categories. Although the new model was developed for coal particles, its physical basis may allow its application to other wind-susceptible particles.
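
    For reference, Bagnold's classical dry-bed expression for the threshold friction velocity is written below; the humidity-corrected form that follows is only a generic placeholder for the two correction terms mentioned in the abstract, not the authors' fitted expression.

    ```latex
    % Bagnold's dry-bed threshold friction velocity (classical form), with A an
    % empirical coefficient of order 0.1, \rho_p the particle density, \rho_a the
    % air density, d the particle diameter and g the gravitational acceleration:
    \[
      u_{*t} \;=\; A\,\sqrt{\frac{(\rho_p - \rho_a)\,g\,d}{\rho_a}} .
    \]
    % Hypothetical placeholder for a humidity-corrected form with two terms in the
    % water content w (a hydrophilic and an adhesive contribution), standing in for
    % the two correction terms mentioned in the abstract:
    \[
      u_{*t}(w) \;=\; u_{*t}\,\sqrt{1 + f_{\mathrm{hyd}}(w) + f_{\mathrm{adh}}(w)} .
    \]
    ```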

  11. Complex Dynamic Thresholds and Generation of the Action Potentials in the Neural-Activity Model

    NASA Astrophysics Data System (ADS)

    Kirillov, S. Yu.; Nekorkin, V. I.

    2016-05-01

    This work is devoted to studying the processes of activation of the neurons whose excitation thresholds are not constant and vary in time (the so-called dynamic thresholds). The neuron dynamics is described by the FitzHugh-Nagumo model with nonlinear behavior of the recovery variable. The neuron response to the external pulsed activating action in the presence of a slowly varying synaptic current is studied within the framework of this model. The structure of the dynamic threshold is studied and its properties depending on the external-action parameters are established. It is found that the formation of the "folds" in the separatrix threshold manifold in the model phase space is a typical feature of the complex dynamic threshold. High neuron sensitivity to the action of the comparatively weak slow control signals is established. This explains the capability of the neurons to perform flexible tuning of their selective properties for detecting various external signals in sufficiently short times (of the order of duration of several spikes).
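
    The modified FitzHugh-Nagumo system used in the paper (with nonlinear behavior of the recovery variable) is not reproduced here; a sketch with the classic FitzHugh-Nagumo equations and standard textbook parameters already illustrates the threshold character of spike generation discussed above: a brief pulse above a critical amplitude produces a full action potential, while a slightly weaker pulse does not.

    ```python
    def fhn_peak(pulse_amp, pulse_dur=1.0, T=100.0, dt=0.01, a=0.7, b=0.8, eps=0.08):
        """Peak membrane variable v after a brief current pulse of the given amplitude."""
        v, w = -1.2, -0.625                   # approximately the resting state
        peak = v
        for step in range(int(T / dt)):
            I = pulse_amp if step * dt < pulse_dur else 0.0
            dv = v - v ** 3 / 3.0 - w + I     # fast (membrane) variable
            dw = eps * (v + a - b * w)        # slow (recovery) variable
            v += dv * dt
            w += dw * dt
            peak = max(peak, v)
        return peak

    # Pulses above a critical amplitude trigger a full spike (peak v near +2);
    # weaker pulses produce only a small sub-threshold excursion.
    for amp in (0.2, 0.4, 0.8, 1.2):
        print(f"pulse amplitude {amp}: peak v = {fhn_peak(amp):+.2f}")
    ```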

  13. A new analytical threshold voltage model of cylindrical gate tunnel FET (CG-TFET)

    NASA Astrophysics Data System (ADS)

    Dash, S.; Mishra, G. P.

    2015-10-01

    The cylindrical gate tunnel FET (CG-TFET) is one of the potential candidates for future nanotechnology, as it exhibits greater scaling capability and a lower subthreshold swing (SS) than the conventional MOSFET. In this paper, a new analytical approach is proposed to extract the gate-dependent threshold voltage of the CG-TFET. The potential distribution and electric field distribution in the cylindrical channel are obtained using the 2-D Poisson's equation, which in turn yields the shortest tunneling distance and the tunneling current. The threshold voltage is extracted using the peak transconductance-change method, based on the saturation of the tunneling barrier width. The impact of scaling the effective oxide thickness, the cylindrical pillar diameter and the gate length on the threshold voltage has been investigated. The consistency of the proposed model is validated against TCAD simulation results. The present model can be helpful for studying the switching behavior of TFETs.

  14. Thresholds in vegetation responses to drought: Implications for rainfall-runoff modeling

    NASA Astrophysics Data System (ADS)

    Tague, C.; Dugger, A. L.

    2011-12-01

    While threshold behavior is often associated with soil and subsurface runoff generation, dynamic vegetation responses to water stress may be an important contributor to threshold-type behavior in rainfall-runoff models. Vegetation water loss varies with vegetation type and biomass, and transpiration dynamics in many settings are regulated by stomatal function. In water-limited environments the timing and frequency of stomatal closure vary from year to year as a function of water stress. Stomatal closure and the associated fine-time-scale (hourly to weekly) plant transpiration may appear as threshold (on/off) behavior. Total seasonal to annual plant water use, however, typically shows a continuous relationship with atmospheric conditions and soil moisture. Thus, while short-time-scale behavior may demonstrate nonlinear, threshold-type behavior, continuous relationships at slightly longer time scales can be used to capture the role of vegetation-mediated water loss and its associated impact on storage and runoff. Many rainfall-runoff models rely on these types of relationships. However, these relationships may change if water stress influences vegetation structure, as it does in drought conditions. Forest dieback under drought is a dramatic example of a threshold event, and one that is expected to occur with increasing frequency under a warmer climate. Less dramatic but still important are changes in leaf and root biomass in response to drought. We demonstrate these effects using a coupled ecosystem carbon cycling and hydrology model and show that by accounting for drought-driven changes in vegetation dynamics we improve our ability to capture inter-annual variation in streamflow for a semi-arid watershed in New Mexico. We also use the model to predict spatial patterns of more catastrophic vegetation dieback with moisture stress and show that we can accurately capture the spatial pattern of ponderosa pine dieback during an early-2000s drought in New Mexico. We use these

  15. On the physical meaning of hillslope thresholds: A combined field-modeling analysis

    NASA Astrophysics Data System (ADS)

    Graham, C. B.; McDonnell, J. J.

    2008-12-01

    Near surface lateral subsurface flow has been shown to be a major component of streamflow in many upland humid areas. Nevertheless, efforts to derive macroscale understanding have proven difficult, often due to the baffling degree of heterogeneity of hillslope scale soil, geologic and hydraulic characteristics. One common finding on gauged hillslopes is a threshold response of subsurface stormflow to total storm precipitation. These thresholds have been attributed to several mechanisms, but increasingly, it appears that such threshold response in areas with strong permeability contrast between soil and bedrock relates to the filling and spilling of connected subsurface saturated patches. Additionally, antecedent moisture conditions appear to have an impact linked to the general soil moisture deficit in the soil profile. Here, we describe a combined field- modeling study at the Maimai, NZ, experimental hillslope. We present a simple reservoir-based model based on hillslope excavations aimed at quantifying subsurface flow paths and processes, and bedrock microtopography. We perform a multiple objective calibration incorporating hydrograph, tracer and internal state response to demonstrate the model is capturing observed behavior at the site. We then present a series of virtual experiments using the calibrated model to examine the relative influence of fill and spill (bedrock leakage and subsurface storage) and soil moisture deficit (PET and storm spacing) factors on threshold development and magnitude. Overall, our work suggests that whole-hillslope thresholds are balanced by fill and spill and soil moisture deficit: at Maimai where rainfall is high and evenly distributed annually, fill and spill factors dominate. In climate regimes where storm spacing is variable and potential evaporation rates are high, our virtual experiments suggest that soil moisture deficit factors would dominate the threshold response.

  16. A probabilistic model of absolute auditory thresholds and its possible physiological basis.

    PubMed

    Heil, Peter; Neubauer, Heinrich; Tetschke, Manuel; Irvine, Dexter R F

    2013-01-01

    Detection thresholds for auditory stimuli, specified in terms of their amplitude or level, depend on the stimulus temporal envelope and decrease with increasing stimulus duration. The neural mechanisms underlying these fundamental across-species observations are not fully understood. Here, we present a "continuous look" model, according to which the stimulus gives rise to stochastic neural detection events whose probability of occurrence is proportional to the 3rd power of the low-pass filtered, time-varying stimulus amplitude. Threshold is reached when a criterion number of events have occurred (probability summation). No long-term integration is required. We apply the model to an extensive set of thresholds measured in humans for tones of different envelopes and durations and find it to fit well. Subtle differences at long durations may be due to limited attention resources. We confirm the probabilistic nature of the detection events by analyses of simple reaction times and verify the exponent of 3 by validating model predictions for binaural thresholds from monaural thresholds. The exponent originates in the auditory periphery, possibly in the intrinsic Ca(2+) cooperativity of the Ca(2+) sensor involved in exocytosis from inner hair cells. It results in growth of the spike rate of auditory-nerve fibers (ANFs) with the 3rd power of the stimulus amplitude before saturating (Heil et al., J Neurosci 31:15424-15437, 2011), rather than with its square (i.e., with stimulus intensity), as is commonly assumed. Our work therefore suggests a link between detection thresholds and a key biochemical reaction in the receptor cells. PMID:23716205
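
    A small, hedged sketch of the probability-summation idea described above: detection events follow a Poisson count whose mean grows with the cube of the (here, constant) stimulus amplitude and with its duration, and detection requires a criterion number of events. The rate constant and criterion below are illustrative, not the fitted values from the study; the sketch simply shows that thresholds fall as duration increases.

    ```python
    import math

    def detection_prob(amplitude, duration, c=50.0, criterion=3):
        """P(at least `criterion` events) for a Poisson count with mean c * A^3 * duration."""
        lam = c * amplitude ** 3 * duration
        p_below = sum(math.exp(-lam) * lam ** j / math.factorial(j) for j in range(criterion))
        return 1.0 - p_below

    def threshold_amplitude(duration, target=0.5):
        # Bisection on amplitude for P(detect) = target; detection_prob is monotone in amplitude.
        lo, hi = 1e-4, 10.0
        for _ in range(60):
            mid = 0.5 * (lo + hi)
            if detection_prob(mid, duration) < target:
                lo = mid
            else:
                hi = mid
        return 0.5 * (lo + hi)

    # Longer stimuli reach the criterion at lower amplitudes, i.e. thresholds fall with duration.
    for dur in (0.01, 0.1, 1.0):   # seconds
        a = threshold_amplitude(dur)
        print(f"duration {dur:5.2f} s: threshold amplitude ~ {a:.3f} ({20 * math.log10(a):+.1f} dB re 1)")
    ```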

  17. The threshold of a stochastic SIVS epidemic model with nonlinear saturated incidence

    NASA Astrophysics Data System (ADS)

    Zhao, Dianli; Zhang, Tiansi; Yuan, Sanling

    2016-02-01

    A stochastic version of the SIS epidemic model with vaccination (SIVS) is studied. When the noise is small, the threshold parameter is identified, which determines the extinction and persistence of the epidemic. Besides, the results show that large noise will suppress the epidemic from prevailing regardless of the saturated incidence. The results are illustrated by computer simulations.

  18. NESTED THRESHOLD SIRE MODELS FOR ESTIMATING GENETIC PARAMETERS FOR STAYABILITY IN BEEF COWS

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Stayability is the ability of a beef cow to remain in production to a specified age. In this study, the interest was in determining the genetic relationship between stayability to an early age with the stayability to a later age. A nested threshold sire model for stayability was used to estimate t...

  19. GENETIC EVALUATION OF STILLBIRTH IN UNITED STATES HOLSTEINS USING A SIRE-MATERNAL GRANDSIRE THRESHOLD MODEL

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A sire-maternal grandsire threshold model was used for genetic evaluation of stillbirth in U.S. Holsteins. Calving ease and stillbirth records for herds reporting at least 10 dead calves were extracted from the AIPL database. About half of the 14 million calving ease records in the database have a k...

  20. Threshold Values for Identification of Contamination Predicted by Reduced-Order Models

    DOE PAGES

    Last, George V.; Murray, Christopher J.; Bott, Yi-Ju; Brown, Christopher F.

    2014-12-31

    The U.S. Department of Energy’s (DOE’s) National Risk Assessment Partnership (NRAP) Project is developing reduced-order models to evaluate potential impacts on underground sources of drinking water (USDWs) if CO2 or brine leaks from deep CO2 storage reservoirs. Threshold values, below which there would be no predicted impacts, were determined for portions of two aquifer systems. These threshold values were calculated using an interwell approach for determining background groundwater concentrations that is an adaptation of methods described in the U.S. Environmental Protection Agency’s Unified Guidance for Statistical Analysis of Groundwater Monitoring Data at RCRA Facilities.

  1. Predicting the epidemic threshold of the susceptible-infected-recovered model

    NASA Astrophysics Data System (ADS)

    Wang, Wei; Liu, Quan-Hui; Zhong, Lin-Feng; Tang, Ming; Gao, Hui; Stanley, H. Eugene

    2016-04-01

    Researchers have developed several theoretical methods for predicting epidemic thresholds, including the mean-field like (MFL) method, the quenched mean-field (QMF) method, and the dynamical message passing (DMP) method. When these methods are applied to predict epidemic threshold they often produce differing results and their relative levels of accuracy are still unknown. We systematically analyze these two issues—relationships among differing results and levels of accuracy—by studying the susceptible-infected-recovered (SIR) model on uncorrelated configuration networks and a group of 56 real-world networks. In uncorrelated configuration networks the MFL and DMP methods yield identical predictions that are larger and more accurate than the prediction generated by the QMF method. As for the 56 real-world networks, the epidemic threshold obtained by the DMP method is more likely to reach the accurate epidemic threshold because it incorporates full network topology information and some dynamical correlations. We find that in most of the networks with positive degree-degree correlations, an eigenvector localized on the high k-core nodes, or a high level of clustering, the epidemic threshold predicted by the MFL method, which uses the degree distribution as the only input information, performs better than the other two methods.
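
    The MFL and QMF predictions mentioned in this abstract can be computed directly from a network; a short sketch follows (the DMP/non-backtracking prediction is omitted for brevity). The use of networkx and the Barabási-Albert test graph are assumptions for illustration, not part of the original study.

    ```python
    import numpy as np
    import networkx as nx

    def mfl_threshold(G):
        """Mean-field-like (degree-based) prediction: <k> / (<k^2> - <k>)."""
        k = np.array([d for _, d in G.degree()], dtype=float)
        return k.mean() / ((k ** 2).mean() - k.mean())

    def qmf_threshold(G):
        """Quenched mean-field prediction: inverse of the largest adjacency eigenvalue."""
        A = nx.to_numpy_array(G)
        return 1.0 / np.linalg.eigvalsh(A).max()

    G = nx.barabasi_albert_graph(2000, 3, seed=1)   # heterogeneous test network (an assumption)
    print("MFL threshold:", round(mfl_threshold(G), 4))
    print("QMF threshold:", round(qmf_threshold(G), 4))
    ```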

  2. Codimension-1 Sliding Bifurcations of a Filippov Pest Growth Model with Threshold Policy

    NASA Astrophysics Data System (ADS)

    Tang, Sanyi; Tang, Guangyao; Qin, Wenjie

    A Filippov system is proposed to describe stage-structured nonsmooth pest growth with threshold policy control (TPC). The TPC measure is represented by the total density of both juveniles and adults being chosen as an index for decisions on when to implement chemical control strategies. The proposed Filippov system can have three pieces of sliding segments and three pseudo-equilibria, which result in rich sliding mode bifurcations and local sliding bifurcations including boundary node (boundary focus, or boundary saddle) and tangency bifurcations. As the threshold density varies, the model exhibits interesting global sliding bifurcations sequentially: touching → buckling → crossing → sliding homoclinic orbit to a pseudo-saddle → crossing → touching bifurcations. In particular, bifurcation of a homoclinic orbit to a pseudo-saddle with a figure-of-eight shape, to a pseudo-saddle-node, or to a standard saddle-node has been observed for some parameter sets. This implies that control outcomes are sensitive to the threshold level, and hence it is crucial to choose the threshold level at which to initiate the control strategy. One more sliding segment (or pseudo-equilibrium) is induced by the switching policy guided by the total population density, compared with the policy guided by the juvenile density only, implying that this control policy is more effective in terms of preventing multiple pest outbreaks or causing the density of pests to stabilize at a desired level such as an economic threshold.

  3. Predicting the epidemic threshold of the susceptible-infected-recovered model

    PubMed Central

    Wang, Wei; Liu, Quan-Hui; Zhong, Lin-Feng; Tang, Ming; Gao, Hui; Stanley, H. Eugene

    2016-01-01

    Researchers have developed several theoretical methods for predicting epidemic thresholds, including the mean-field like (MFL) method, the quenched mean-field (QMF) method, and the dynamical message passing (DMP) method. When these methods are applied to predict epidemic threshold they often produce differing results and their relative levels of accuracy are still unknown. We systematically analyze these two issues—relationships among differing results and levels of accuracy—by studying the susceptible-infected-recovered (SIR) model on uncorrelated configuration networks and a group of 56 real-world networks. In uncorrelated configuration networks the MFL and DMP methods yield identical predictions that are larger and more accurate than the prediction generated by the QMF method. As for the 56 real-world networks, the epidemic threshold obtained by the DMP method is more likely to reach the accurate epidemic threshold because it incorporates full network topology information and some dynamical correlations. We find that in most of the networks with positive degree-degree correlations, an eigenvector localized on the high k-core nodes, or a high level of clustering, the epidemic threshold predicted by the MFL method, which uses the degree distribution as the only input information, performs better than the other two methods. PMID:27091705

  4. Validation of theoretical models of phonation threshold pressure with data from a vocal fold mechanical replica.

    PubMed

    Lucero, Jorge C; Van Hirtum, Annemie; Ruty, Nicolas; Cisonni, Julien; Pelorson, Xavier

    2009-02-01

    This paper analyzes the capability of a mucosal wave model of the vocal fold to predict values of phonation threshold lung pressure. Equations derived from the model are fitted to pressure data collected from a mechanical replica of the vocal folds. The results show that a recent extension of the model to include an arbitrary delay of the mucosal wave in its travel along the glottal channel provides a better approximation to the data than the original version of the model, which assumed a small delay. They also show that modeling the vocal tract as a simple inertive load, as has been proposed in recent analytical studies of phonation, fails to capture the effect of the vocal tract on the phonation threshold pressure with reasonable accuracy. PMID:19206840

  5. Modeling jointly low, moderate, and heavy rainfall intensities without a threshold selection

    NASA Astrophysics Data System (ADS)

    Naveau, Philippe; Huser, Raphael; Ribereau, Pierre; Hannart, Alexis

    2016-04-01

    In statistics, extreme events are often defined as excesses above a given large threshold. This definition allows hydrologists and flood planners to apply Extreme-Value Theory (EVT) to their time series of interest. Even in the stationary univariate context, this approach has at least two main drawbacks. First, working with excesses implies that a lot of observations (those below the chosen threshold) are completely disregarded. The range of precipitation is artificially chopped into two pieces, namely large intensities and the rest, which necessarily imposes different statistical models for each piece. Second, this strategy raises a nontrivial and very practical difficulty: how to choose the optimal threshold which correctly discriminates between low and heavy rainfall intensities. To address these issues, we propose a statistical model in which EVT results apply not only to heavy, but also to low precipitation amounts (zeros excluded). Our model is in compliance with EVT on both ends of the spectrum and allows a smooth transition between the two tails, while keeping a low number of parameters. In terms of inference, we have implemented and tested two classical methods of estimation: likelihood maximization and probability-weighted moments. Last but not least, there is no need to choose a threshold to define low and high excesses. The performance and flexibility of this approach are illustrated on simulated data and on hourly precipitation recorded in Lyon, France.
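
    As a rough illustration of a threshold-free model of this kind, the sketch below uses what is, to my understanding, the simplest member of the extended generalized-Pareto family, F(x) = [H(x; sigma, xi)]^kappa with H the GPD distribution function, so the lower tail behaves like x^kappa while the upper tail stays GPD. The specific functional form and all parameter values here are assumptions for illustration, not the authors' fitted model.

    ```python
    import numpy as np

    def egp_cdf(x, kappa, sigma, xi):
        """Simplest extended-GP cdf: F(x) = H(x)**kappa with H the GPD(sigma, xi) cdf.
        Lower tail behaves like x**kappa, upper tail stays GPD; no threshold is needed."""
        h = 1.0 - (1.0 + xi * np.asarray(x) / sigma) ** (-1.0 / xi)
        return h ** kappa

    def egp_sample(n, kappa, sigma, xi, seed=0):
        rng = np.random.default_rng(seed)
        u = rng.uniform(size=n) ** (1.0 / kappa)         # invert G(u) = u**kappa
        return sigma / xi * ((1.0 - u) ** (-xi) - 1.0)   # GPD quantile function

    r = egp_sample(10_000, kappa=0.8, sigma=5.0, xi=0.15)    # illustrative parameters only
    print("P(X <= 10):", round(float(egp_cdf(10.0, 0.8, 5.0, 0.15)), 3))
    print("sample mean %.2f, 99th percentile %.1f" % (r.mean(), np.quantile(r, 0.99)))
    ```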

  6. Research of adaptive threshold model and its application in iris tracking

    NASA Astrophysics Data System (ADS)

    Zhao, Qijie; Tu, Dawei; Wang, Rensan; Gao, Daming

    2005-02-01

    The relationship between the gray values of pixels and macro-information in an image has been analyzed using methods from statistical mechanics. After simulation and curve fitting of the experimental data by statistical regression, an adaptive threshold model relating the average gray value to the image threshold has been proposed in terms of Boltzmann statistics. In addition, the image characteristics around the eye region and the states of the eyeball have been analyzed, and an algorithm has been proposed to extract the eye feature and locate its position in the image; a further algorithm has been proposed to find the iris characteristic line and then locate the iris center. Finally, considering head gesture, different head positions, and the opening state of the eyes, experiments have been carried out with the adaptive threshold model and the designed algorithms in an eye-gaze input human-computer interaction (HCI) system. The experimental results show that the algorithms can be applied widely in different cases and that real-time iris tracking can be performed with the adaptive threshold model and algorithms.

  7. Near-threshold boson pair production in the model of smeared-mass unstable particles

    SciTech Connect

    Kuksa, V. I.; Pasechnik, R. S.

    2010-09-15

    Near-threshold production of boson pairs is considered within the framework of the model of unstable particles with smeared mass. We describe the principal aspects of the model and consider the strategy of calculations, including the radiative corrections. The results of the calculations are in good agreement with LEP II data and Monte Carlo simulations. The suggested approach significantly simplifies calculations with respect to the standard perturbative one.

  8. A preliminary threshold model of parasitism in the Cockle Cerastoderma edule using delayed exchange of stability

    NASA Astrophysics Data System (ADS)

    O'Grady, E. A.; Culloty, S. C.; Kelly, T. C.; O'Callaghan, M. J. A.; Rachinskii, D.

    2015-02-01

    Thresholds occur, and play an important role, in the dynamics of many biological communities. In this paper, we model a persistence type threshold which has been shown experimentally to exist in hyperparasitised flukes in the cockle, a shellfish. Our model consists of a periodically driven slow-fast host-parasite system of equations for a slow flukes population (host) and a fast Unikaryon hyperparasite population (parasite). The model exhibits two branches of the critical curve crossing in a transcritical bifurcation scenario. We discuss two thresholds due to immediate and delayed exchange of stability effects; and we derive algebraic relationships for parameters of the periodic solution in the limit of the infinite ratio of the time scales. Flukes, which are the host species in our model, parasitise cockles and in turn are hyperparasitised by the microsporidian Unikaryon legeri; the life cycle of flukes includes several life stages and a number of different hosts. That is, the flukes-hyperparasite system in a cockle is, naturally, part of a larger estuarine ecosystem of interacting species involving parasites, shellfish and birds which prey on shellfish. A population dynamics model which accounts for one system of such multi-species interactions and includes the fluke-hyperparasite model in a cockle as a subsystem is presented. We provide evidence that the threshold effect we observed in the flukes-hyperparasite subsystem remains apparent in the multi-species system. Assuming that flukes damage cockles, and taking into account that the hyperparasite is detrimental to flukes, it is natural to suggest that the hyperparasitism may support the abundance of cockles and, thereby, the persistence of the estuarine ecosystem, including shellfish and birds. We confirm the possibility of the existence of this scenario in our model, at least partially, by removing the hyperparasite and demonstrating that this may result in a substantial drop in cockle numbers. The result

  9. Evaluation of a threshold-based model of fatigue in gamma titanium aluminide following impact damage

    NASA Astrophysics Data System (ADS)

    Harding, Trevor Scott

    2000-10-01

    Recent interest in gamma titanium aluminide (gamma-TiAl) for use in gas turbine engine applications has centered on the low density and good elevated temperature strength retention of gamma-TiAl compared to current materials. However, the relatively low ductility and fracture toughness of gamma-TiAl leads to serious concerns regarding its ability to resist impact damage. Furthermore, the limited fatigue crack growth resistance of gamma-TiAl means that the potential for fatigue failures resulting from impact damage is real if a damage tolerant design approach is used. A threshold-based design approach may be required if fatigue crack growth from potential impact sites is to be avoided. The objective of the present research is to examine the feasibility of a threshold-based approach for the design of a gamma-TiAl low-pressure turbine blade subjected to both assembly-related impact damage and foreign object damage. Specimens of three different gamma-TiAl alloys were damaged in such a way as to simulate anticipated impact damage for a turbine blade. Step-loading fatigue tests were conducted at both room temperature and 600°C. In terms of the assembly-related impact damage, the results indicate that there is reasonably good agreement between the threshold-based predictions of the fatigue strength of damaged specimens and the measured data. However, some discrepancies do exist. In the case of very lightly damaged specimens, prediction of the resulting fatigue strength requires that a very conservative small-crack fatigue threshold be used. Consequently, the allowable design conditions are significantly reduced. For severely damaged specimens, an analytical approach found that the potential effects of residual stresses may be related to the discrepancies observed between the threshold-based model and measured fatigue strength data. In the case of foreign object damage, a good correlation was observed between impacts resulting in large cracks and a long-crack threshold

  10. Simulating electrical modulation detection thresholds using a biophysical model of the auditory nerve.

    PubMed

    O'Brien, Gabrielle E; Imennov, Nikita S; Rubinstein, Jay T

    2016-05-01

    Modulation detection thresholds (MDTs) assess listeners' sensitivity to changes in the temporal envelope of a signal and have been shown to strongly correlate with speech perception in cochlear implant users. MDTs are simulated with a stochastic model of a population of auditory nerve fibers that has been verified to accurately simulate a number of physiologically important temporal response properties. The procedure to estimate detection thresholds has previously been applied to stimulus discrimination tasks. The population model simulates the MDT-stimulus intensity relationship measured in cochlear implant users. The model also recreates the shape of the modulation transfer function and the relationship between MDTs and carrier rate. Discrimination based on fluctuations in synchronous firing activity predicts better performance at low carrier rates, but quantitative measures of modulation coding predict better neural representation of high carrier rate stimuli. Manipulating the number of fibers and a temporal integration parameter, the width of a sliding temporal integration window, varies properties of the MDTs, such as cutoff frequency and peak threshold. These results demonstrate the importance of using a multi-diameter fiber population in modeling the MDTs and demonstrate a wider applicability of this model to simulating behavioral performance in cochlear implant listeners. PMID:27250141

  11. Percolation thresholds for discrete-continuous models with nonuniform probabilities of bond formation

    NASA Astrophysics Data System (ADS)

    Szczygieł, Bartłomiej; Dudyński, Marek; Kwiatkowski, Kamil; Lewenstein, Maciej; Lapeyre, Gerald John; Wehr, Jan

    2016-02-01

    We introduce a class of discrete-continuous percolation models and an efficient Monte Carlo algorithm for computing their properties. The class is general enough to include well-known discrete and continuous models as special cases. We focus on a particular example of such a model, a nanotube model of disintegration of activated carbon. We calculate its exact critical threshold in two dimensions and obtain a Monte Carlo estimate in three dimensions. Furthermore, we use this example to analyze and characterize the efficiency of our algorithm, by computing critical exponents and properties, finding that it compares favorably to well-known algorithms for simpler systems.
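
    The paper's discrete-continuous nanotube model is not reproduced here, but the Monte Carlo logic of estimating a percolation threshold can be illustrated with plain site percolation on a square lattice; SciPy's connected-component labeling is assumed, and lattice sizes and trial counts are arbitrary.

    ```python
    import numpy as np
    from scipy import ndimage

    def spans(p, L, rng):
        """True if an occupied cluster connects the top and bottom rows of an L x L lattice."""
        occupied = rng.random((L, L)) < p
        labels, _ = ndimage.label(occupied)                    # 4-connected clusters
        common = np.intersect1d(labels[0], labels[-1])
        return bool(common.max(initial=0) > 0)                 # ignore the background label 0

    def spanning_probability(p, L=128, trials=200, seed=0):
        rng = np.random.default_rng(seed)
        return float(np.mean([spans(p, L, rng) for _ in range(trials)]))

    for p in (0.55, 0.59, 0.63):
        print(f"p = {p:.2f}  spanning probability ≈ {spanning_probability(p):.2f}")
    # As L grows, the crossing point approaches the known site threshold p_c ≈ 0.593.
    ```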

  12. Threshold voltage model of junctionless cylindrical surrounding gate MOSFETs including fringing field effects

    NASA Astrophysics Data System (ADS)

    Gupta, Santosh Kumar

    2015-12-01

    A 2D analytical model of the body center potential (BCP) in short-channel junctionless cylindrical surrounding gate (JLCSG) MOSFETs is developed using evanescent mode analysis (EMA). This model also incorporates the gate-bias-dependent inner and outer fringing capacitances due to the gate-source/drain fringing fields. The developed model provides results in good agreement with simulations for variations of different physical parameters of the JLCSG MOSFET, viz. gate length, channel radius, doping concentration, and oxide thickness. Using the BCP, an analytical model for the threshold voltage has been derived and validated against results obtained from a 3D device simulator.

  13. Recognition ROCS Are Curvilinear--Or Are They? On Premature Arguments against the Two-High-Threshold Model of Recognition

    ERIC Educational Resources Information Center

    Broder, Arndt; Schutz, Julia

    2009-01-01

    Recent reviews of recognition receiver operating characteristics (ROCs) claim that their curvilinear shape rules out threshold models of recognition. However, the shape of ROCs based on confidence ratings is not diagnostic to refute threshold models, whereas ROCs based on experimental bias manipulations are. Also, fitting predicted frequencies to…

  14. Computational model of collective nest selection by ants with heterogeneous acceptance thresholds

    PubMed Central

    Masuda, Naoki; O'shea-Wheller, Thomas A.; Doran, Carolina; Franks, Nigel R.

    2015-01-01

    Collective decision-making is a characteristic of societies ranging from ants to humans. The ant Temnothorax albipennis is known to use quorum sensing to collectively decide on a new home; emigration to a new nest site occurs when the number of ants favouring the new site becomes quorate. There are several possible mechanisms by which ant colonies can select the best nest site among alternatives based on a quorum mechanism. In this study, we use computational models to examine the implications of heterogeneous acceptance thresholds across individual ants in collective nest choice behaviour. We take a minimalist approach to develop a differential equation model and a corresponding non-spatial agent-based model. We show, consistent with existing empirical evidence, that heterogeneity in acceptance thresholds is a viable mechanism for efficient nest choice behaviour. In particular, we show that the proposed models show speed–accuracy trade-offs and speed–cohesion trade-offs when we vary the number of scouts or the quorum threshold. PMID:26543578
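
    A minimal, non-spatial sketch in the spirit of the agent-based model described above: scouts carry individually drawn acceptance thresholds, and the colony commits once a site accumulates a quorum of acceptances. All parameter values are invented and the update scheme is only one plausible reading of the abstract, not the authors' implementation.

    ```python
    import numpy as np

    def simulate(n_scouts=50, quorum=10, qualities=(0.4, 0.7), threshold_sd=0.15, seed=None):
        """Scouts with individually drawn acceptance thresholds assess two sites;
        the colony commits to the first site whose accepted-visit count reaches the quorum."""
        rng = np.random.default_rng(seed)
        thresholds = rng.normal(0.55, threshold_sd, n_scouts)    # per-ant acceptance thresholds
        accepted = np.zeros(2, dtype=int)
        for step in range(1, 100_000):
            scout, site = rng.integers(n_scouts), rng.integers(2)
            if qualities[site] >= thresholds[scout]:             # scout accepts the visited site
                accepted[site] += 1
            if accepted.max() >= quorum:
                best = int(np.argmax(qualities))
                return step, int(accepted.argmax() == best)      # decision time, correctness
        return None, 0

    results = [simulate(seed=s) for s in range(500)]
    times, correct = zip(*results)
    print("mean decision time:", np.mean(times), " accuracy:", np.mean(correct))
    ```

    Lowering the quorum or widening the threshold distribution in this toy version shows the same qualitative speed-accuracy and speed-cohesion tensions the abstract refers to.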

  15. Gauge threshold corrections for {N}=2 heterotic local models with flux, and mock modular forms

    NASA Astrophysics Data System (ADS)

    Carlevaro, Luca; Israël, Dan

    2013-03-01

    We determine threshold corrections to the gauge couplings in local models of {N}=2 smooth heterotic compactifications with torsion, given by the direct product of a warped Eguchi-Hanson space and a two-torus, together with a line bundle. Using the worldsheet CFT description previously found and by suitably regularising the infinite target space volume divergence, we show that threshold corrections to the various gauge factors are governed by the non-holomorphic completion of the Appell-Lerch sum. While its holomorphic mock-modular component captures the contribution of states that localise on the blown-up two-cycle, the non-holomorphic correction originates from non-localised bulk states. We infer from this analysis universality properties for {N}=2 heterotic local models with flux, based on target space modular invariance and the presence of such non-localised states. We finally determine the explicit dependence of these one-loop gauge threshold corrections on the moduli of the two-torus, and by S-duality we extract the corresponding string-loop and E1-instanton corrections to the Kähler potential and gauge kinetic functions of the dual type I model. In both cases, the presence of non-localised bulk states brings about novel perturbative and non-perturbative corrections, some features of which can be interpreted in the light of analogous corrections to the effective theory in compact models.

  16. Computational model of collective nest selection by ants with heterogeneous acceptance thresholds.

    PubMed

    Masuda, Naoki; O'shea-Wheller, Thomas A; Doran, Carolina; Franks, Nigel R

    2015-06-01

    Collective decision-making is a characteristic of societies ranging from ants to humans. The ant Temnothorax albipennis is known to use quorum sensing to collectively decide on a new home; emigration to a new nest site occurs when the number of ants favouring the new site becomes quorate. There are several possible mechanisms by which ant colonies can select the best nest site among alternatives based on a quorum mechanism. In this study, we use computational models to examine the implications of heterogeneous acceptance thresholds across individual ants in collective nest choice behaviour. We take a minimalist approach to develop a differential equation model and a corresponding non-spatial agent-based model. We show, consistent with existing empirical evidence, that heterogeneity in acceptance thresholds is a viable mechanism for efficient nest choice behaviour. In particular, we show that the proposed models show speed-accuracy trade-offs and speed-cohesion trade-offs when we vary the number of scouts or the quorum threshold. PMID:26543578

  17. A multi-timescale adaptive threshold model for the SAI tactile afferent to predict response to mechanical vibration

    PubMed Central

    Jahangiri, Anila F.; Gerling, Gregory J.

    2011-01-01

    The Leaky Integrate and Fire (LIF) model of a neuron is one of the best known models for a spiking neuron. A current limitation of the LIF model is that it may not accurately reproduce the dynamics of an action potential. There have recently been some studies suggesting that a LIF coupled with a multi-timescale adaptive threshold (MAT) may increase LIF’s accuracy in predicting spikes in cortical neurons. We propose a mechanotransduction process coupled with a LIF model with multi-timescale adaptive threshold to model slowly adapting type I (SAI) mechanoreceptor in monkey’s glabrous skin. In order to test the performance of the model, the spike timings predicted by this MAT model are compared with neural data. We also test a fixed threshold variant of the model by comparing its outcome with the neural data. Initial results indicate that the MAT model predicts spike timings better than a fixed threshold LIF model only. PMID:21814636
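
    For readers unfamiliar with the multi-timescale adaptive threshold idea, the sketch below shows a leaky integrator whose threshold, rather than its membrane potential, is incremented at each spike with two exponentially decaying components. The parameter values are illustrative and are not fitted to SAI afferent data; only the general MAT-style mechanism is intended.

    ```python
    import numpy as np

    def mat_spikes(current, dt=1e-4, tau_m=5e-3, R=50e6, omega=0.02,
                   alphas=(0.015, 0.003), taus=(0.01, 0.2), refractory=2e-3):
        """Leaky integrator with a multi-timescale adaptive threshold: the membrane
        potential is never reset; each spike instead adds decaying increments
        alpha_j * exp(-t / tau_j) to the threshold."""
        alphas, taus = np.asarray(alphas), np.asarray(taus)
        v, theta_parts = 0.0, np.zeros(len(alphas))
        spikes, last_spike = [], -np.inf
        for i, I in enumerate(current):
            t = i * dt
            v += dt * (-v + R * I) / tau_m                 # membrane integration (volts)
            theta_parts *= np.exp(-dt / taus)              # threshold components decay
            if v >= omega + theta_parts.sum() and t - last_spike > refractory:
                spikes.append(t)
                theta_parts += alphas                      # threshold jumps; V is not reset
                last_spike = t
        return spikes

    # A 1 s step current: the firing rate adapts from a high onset rate to a lower steady rate.
    spikes = mat_spikes(np.full(10_000, 1.2e-9))           # 1.2 nA, dt = 0.1 ms
    print("spike count:", len(spikes), " first ISIs (s):", np.round(np.diff(spikes[:6]), 4))
    ```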

  18. Two-threshold model for scaling laws of noninteracting snow avalanches.

    PubMed

    Faillettaz, Jerome; Louchet, Francois; Grasso, Jean-Robert

    2004-11-12

    The sizes of snow slab failure that trigger snow avalanches are power-law distributed. Such a power-law probability distribution function has also been proposed to characterize different landslide types. In order to understand this scaling for gravity-driven systems, we introduce a two-threshold 2D cellular automaton, in which failure occurs irreversibly. Taking snow slab avalanches as a model system, we find that the sizes of the largest avalanches just preceding the lattice system breakdown are power-law distributed. By tuning the maximum value of the ratio of the two failure thresholds our model reproduces the range of power-law exponents observed for land, rock, or snow avalanches. We suggest this control parameter represents the material cohesion anisotropy. PMID:15600971

  19. Hierarchical Multiscale Adaptive Variable Fidelity Wavelet-based Turbulence Modeling with Lagrangian Spatially Variable Thresholding

    NASA Astrophysics Data System (ADS)

    Nejadmalayeri, Alireza

    The current work develops a wavelet-based adaptive variable fidelity approach that integrates Wavelet-based Direct Numerical Simulation (WDNS), Coherent Vortex Simulations (CVS), and Stochastic Coherent Adaptive Large Eddy Simulations (SCALES). The proposed methodology employs the notion of spatially and temporally varying wavelet thresholding combined with hierarchical wavelet-based turbulence modeling. The transition between WDNS, CVS, and SCALES regimes is achieved through two-way physics-based feedback between the modeled SGS dissipation (or other dynamically important physical quantity) and the spatial resolution. The feedback is based on spatio-temporal variation of the wavelet threshold, where the thresholding level is adjusted on the fly depending on the deviation of local significant SGS dissipation from the user-prescribed level. This strategy overcomes a major limitation for all previously existing wavelet-based multi-resolution schemes: the global thresholding criterion, which does not fully utilize the spatial/temporal intermittency of the turbulent flow. Hence, the aforementioned concept of physics-based spatially variable thresholding in the context of wavelet-based numerical techniques for solving PDEs is established. The procedure consists of tracking the wavelet thresholding-factor within a Lagrangian frame by exploiting a Lagrangian Path-Line Diffusive Averaging approach based on either linear averaging along characteristics or direct solution of the evolution equation. This innovative technique represents a framework of continuously variable fidelity wavelet-based space/time/model-form adaptive multiscale methodology. This methodology has been tested and has provided very promising results on a benchmark with a time-varying user-prescribed level of SGS dissipation. In addition, a longtime effort to develop a novel parallel adaptive wavelet collocation method for numerical solution of PDEs has been completed during the course of the current work.

  20. Standardization of thresholding for binary conversion of vocal tract modeling in computed tomography.

    PubMed

    Inohara, Ken; Sumita, Yuka I; Ohbayashi, Naoto; Ino, Shuichi; Kurabayashi, Tohru; Ifukube, Tohru; Taniguchi, Hisashi

    2010-07-01

    Postoperative head and neck cancer patients suffer from speech disorders, which are the result of changes in their vocal tracts. Making a solid vocal tract model and measuring its transmission characteristics will provide one of the most useful tools to resolve the problem. In binary conversion of X-ray computed tomographic (CT) images for vocal tract reconstruction, nonobjective methods have been used by many researchers. We hypothesized that a standardized vocal tract model could be reconstructed by adopting the Hounsfield number of fat tissue as a criterion for thresholding of binary conversion, because its Hounsfield number is the nearest to air in the human body. The purpose of this study was to establish a new standardized method for binary conversion in reconstructing three-dimensional (3-D) vocal tract models. CT images for postoperative diagnosis were secondarily obtained from a CT scanner. Each patient's minimum settings of Hounsfield number for the buccal fat-pad regions were measured. Thresholds were set every 50 Hounsfield units (HU) from the bottom line of the buccal fat-pad region to -1024 HU, the images were converted into binary values, and were evaluated according to the three-grade system based on anatomically defined criteria. The optimal threshold between tissue and air was determined by nonlinear multiple regression analyses. Each patient's minimum settings of the buccal fat-pad regions were obtained. The optimal threshold was determined to be -165 HU from each patient's minimum settings of the Hounsfield number for the buccal fat-pad regions. To conclude, a method of 3-D standardized vocal tract modeling was established. PMID:19766442

  1. Integrating physiological threshold experiments with climate modeling to project mangrove species' range expansion.

    PubMed

    Cavanaugh, Kyle C; Parker, John D; Cook-Patton, Susan C; Feller, Ilka C; Williams, A Park; Kellner, James R

    2015-05-01

    Predictions of climate-related shifts in species ranges have largely been based on correlative models. Due to limitations of these models, there is a need for more integration of experimental approaches when studying impacts of climate change on species distributions. Here, we used controlled experiments to identify physiological thresholds that control poleward range limits of three species of mangroves found in North America. We found that all three species exhibited a threshold response to extreme cold, but freeze tolerance thresholds varied among species. From these experiments, we developed a climate metric, freeze degree days (FDD), which incorporates both the intensity and the frequency of freezes. When included in distribution models, FDD accurately predicted mangrove presence/absence. Using 28 years of satellite imagery, we linked FDD to observed changes in mangrove abundance in Florida, further exemplifying the importance of extreme cold. We then used downscaled climate projections of FDD to project that these range limits will move northward by 2.2-3.2 km yr(-1) over the next 50 years. PMID:25558057
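
    The abstract does not give the exact definition of freeze degree days, so the sketch below uses one plausible formulation (the annual sum of degrees below a freeze point computed from daily minima), which grows with both freeze frequency and intensity; the synthetic temperature series are invented and the paper's definition may differ.

    ```python
    import numpy as np

    def freeze_degree_days(daily_tmin_c, freeze_point=0.0):
        """Annual sum of degrees below the freeze point: grows with both the
        frequency and the intensity of freezes (one plausible FDD formulation)."""
        t = np.asarray(daily_tmin_c, dtype=float)
        return float(np.sum(np.maximum(freeze_point - t, 0.0)))

    rng = np.random.default_rng(3)
    mild = rng.normal(12.0, 6.0, 365)     # hypothetical site with few, weak freezes
    cold = rng.normal(8.0, 7.0, 365)      # hypothetical site with more frequent, deeper freezes
    print("FDD, milder site:", round(freeze_degree_days(mild), 1))
    print("FDD, colder site:", round(freeze_degree_days(cold), 1))
    ```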

  2. Generalizing a complex model for gully threshold identification in the Mediterranean environment

    NASA Astrophysics Data System (ADS)

    Torri, D.; Borselli, L.; Iaquinta, P.; Iovine, G.; Poesen, J.; Terranova, O.

    2012-04-01

    Among the physical processes leading to land degradation, soil erosion by water is the most important and gully erosion may contribute, at places, to 70% of the total soil loss. Nevertheless, gully erosion has often been neglected in water soil erosion modeling, whilst more prominence has been given to rill and interrill erosion. Both to facilitate the processing by agricultural machinery and to take advantage of all the arable land, gullies are commonly removed at each crop cycle, with significant soil losses due to the repeated excavation of the channel by successive rainstorms. When the erosive forces of overland flow exceed the strength of the soil particles to detachment and displacement, water erosion occurs and usually a channel is formed. As runoff is proportional to the local catchment area, a relationship between local slope, S, and contributing area, A, is supposed to exist. A "geomorphologic threshold" scheme is therefore suitable to interpret the physical process of gully initiation: accordingly, a gully is formed when a hydraulic threshold for incision exceeds the resistance of the soil particles to detachment and transport. Similarly, it appears reasonable that a gully ends when there is a reduction of slope, or the concentrated flow meets more resistant soil-vegetation complexes. This study aims to predict the location of the beginning of gullies in the Mediterranean environment, based on an evaluation of S and A by means of a mathematical model. For the identification of the areas prone to gully erosion, the model employs two empirical thresholds relevant to the head (Thead) and to the end (Tend) of the gullies (of the type SA^b > Thead and SA^b < Tend). These thresholds represent the resistance of the environment to gully erosion, depending on stoniness, vegetation cover, propensity to tunneling erosion due to soil dispersibility in water, and the intrinsic characteristics of the eroded material and of the erosivity of the rainfall event. Such
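
    A toy classification of grid cells against the two thresholds of the form S*A^b described above might look as follows; the exponent, threshold values, and cell data are hypothetical.

    ```python
    def gully_state(slope, area, b=0.4, t_head=3.0, t_end=0.8):
        """Compare the topographic index S * A**b against hypothetical head/end thresholds."""
        index = slope * area ** b
        if index > t_head:
            return "gully head possible"
        if index < t_end:
            return "gully likely ends"
        return "channel may persist"

    # Hypothetical cells: (local slope [m/m], contributing area [m^2])
    for s, a in [(0.25, 800.0), (0.05, 300.0), (0.12, 1500.0)]:
        print(f"S={s:.2f}, A={a:.0f} -> {gully_state(s, a)}")
    ```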

  3. Genetic evaluation of calf and heifer survival in Iranian Holstein cattle using linear and threshold models.

    PubMed

    Forutan, M; Ansari Mahyari, S; Sargolzaei, M

    2015-02-01

    Calf and heifer survival are important traits in dairy cattle affecting profitability. This study was carried out to estimate genetic parameters of survival traits in female calves at different age periods, until nearly the first calving. Records of 49,583 female calves born during 1998 and 2009 were considered in five age periods as days 1-30, 31-180, 181-365, 366-760 and full period (day 1-760). Genetic components were estimated based on linear and threshold sire models and linear animal models. The models included both fixed effects (month of birth, dam's parity number, calving ease and twin/single) and random effects (herd-year, genetic effect of sire or animal and residual). Rates of death were 2.21, 3.37, 1.97, 4.14 and 12.4% for the above periods, respectively. Heritability estimates were very low ranging from 0.48 to 3.04, 0.62 to 3.51 and 0.50 to 4.24% for linear sire model, animal model and threshold sire model, respectively. Rank correlations between random effects of sires obtained with linear and threshold sire models and with linear animal and sire models were 0.82-0.95 and 0.61-0.83, respectively. The estimated genetic correlations between the five different periods were moderate and only significant for 31-180 and 181-365 (r(g) = 0.59), 31-180 and 366-760 (r(g) = 0.52), and 181-365 and 366-760 (r(g) = 0.42). The low genetic correlations in current study would suggest that survival at different periods may be affected by the same genes with different expression or by different genes. Even though the additive genetic variations of survival traits were small, it might be possible to improve these traits by traditional or genomic selection. PMID:25100295

  4. Threshold models for genome-enabled prediction of ordinal categorical traits in plant breeding.

    PubMed

    Montesinos-López, Osval A; Montesinos-López, Abelardo; Pérez-Rodríguez, Paulino; de Los Campos, Gustavo; Eskridge, Kent; Crossa, José

    2015-02-01

    Categorical scores for disease susceptibility or resistance often are recorded in plant breeding. The aim of this study was to introduce genomic models for analyzing ordinal characters and to assess the predictive ability of genomic predictions for ordered categorical phenotypes using a threshold model counterpart of the Genomic Best Linear Unbiased Predictor (i.e., TGBLUP). The threshold model was used to relate a hypothetical underlying scale to the outward categorical response. We present an empirical application where a total of nine models, five without interaction and four with genomic × environment interaction (G×E) and genomic additive × additive × environment interaction (G×G×E), were used. We assessed the proposed models using data consisting of 278 maize lines genotyped with 46,347 single-nucleotide polymorphisms and evaluated for disease resistance [with ordinal scores from 1 (no disease) to 5 (complete infection)] in three environments (Colombia, Zimbabwe, and Mexico). Models with G×E captured a sizeable proportion of the total variability, which indicates the importance of introducing interaction to improve prediction accuracy. Relative to models based on main effects only, the models that included G×E achieved 9-14% gains in prediction accuracy; adding additive × additive interactions did not increase prediction accuracy consistently across locations. PMID:25538102

  5. Threshold Models for Genome-Enabled Prediction of Ordinal Categorical Traits in Plant Breeding

    PubMed Central

    Montesinos-López, Osval A.; Montesinos-López, Abelardo; Pérez-Rodríguez, Paulino; de los Campos, Gustavo; Eskridge, Kent; Crossa, José

    2014-01-01

    Categorical scores for disease susceptibility or resistance often are recorded in plant breeding. The aim of this study was to introduce genomic models for analyzing ordinal characters and to assess the predictive ability of genomic predictions for ordered categorical phenotypes using a threshold model counterpart of the Genomic Best Linear Unbiased Predictor (i.e., TGBLUP). The threshold model was used to relate a hypothetical underlying scale to the outward categorical response. We present an empirical application where a total of nine models, five without interaction and four with genomic × environment interaction (G×E) and genomic additive × additive × environment interaction (G×G×E), were used. We assessed the proposed models using data consisting of 278 maize lines genotyped with 46,347 single-nucleotide polymorphisms and evaluated for disease resistance [with ordinal scores from 1 (no disease) to 5 (complete infection)] in three environments (Colombia, Zimbabwe, and Mexico). Models with G×E captured a sizeable proportion of the total variability, which indicates the importance of introducing interaction to improve prediction accuracy. Relative to models based on main effects only, the models that included G×E achieved 9–14% gains in prediction accuracy; adding additive × additive interactions did not increase prediction accuracy consistently across locations. PMID:25538102

  6. Statistical models for overdispersion in the frequency of peaks over threshold data for a flow series

    NASA Astrophysics Data System (ADS)

    Eastoe, Emma F.; Tawn, Jonathan A.

    2010-02-01

    In a peaks over threshold analysis of a series of river flows, a sufficiently high threshold is used to extract the peaks of independent flood events. This paper reviews existing, and proposes new, statistical models for both the annual counts of such events and the process of event peak times. The most common existing model for the process of event times is a homogeneous Poisson process. This model is motivated by asymptotic theory. However, empirical evidence suggests that it is not the most appropriate model, since it implies that the mean and variance of the annual counts are the same, whereas the counts appear to be overdispersed, i.e., have a larger variance than mean. This paper describes how the homogeneous Poisson process can be extended to incorporate time variation in the rate at which events occur and so help to account for overdispersion in annual counts through the use of regression and mixed models. The implications of these new models on the implied probability distribution of the annual maxima are also discussed. The models are illustrated using a historical flow series from the River Thames at Kingston.
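
    A quick way to see the overdispersion issue discussed above is to compare the dispersion index of annual event counts and the fit of Poisson versus negative-binomial models; the counts below are invented (not the Thames record) and SciPy is assumed.

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical annual counts of peaks-over-threshold events (not the Thames record).
    counts = np.array([3, 7, 1, 5, 9, 2, 4, 8, 0, 6, 5, 10, 2, 3, 7, 1, 6, 4, 9, 2])

    mean, var = counts.mean(), counts.var(ddof=1)
    print(f"mean = {mean:.2f}, variance = {var:.2f}, dispersion index = {var / mean:.2f}")

    # Poisson vs. a moment-matched negative binomial (valid when variance > mean).
    ll_pois = stats.poisson.logpmf(counts, mean).sum()
    size = mean ** 2 / (var - mean)
    prob = size / (size + mean)
    ll_nb = stats.nbinom.logpmf(counts, size, prob).sum()
    print(f"log-likelihood: Poisson {ll_pois:.1f}  vs  negative binomial {ll_nb:.1f}")
    ```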

  7. An Active Contour Model Based on Adaptive Threshold for Extraction of Cerebral Vascular Structures

    PubMed Central

    Wang, Jiaxin; Zhao, Shifeng; Liu, Zifeng; Duan, Fuqing; Pan, Yutong

    2016-01-01

    Cerebral vessel segmentation is essential and helpful for clinical diagnosis and related research. However, automatic segmentation of brain vessels remains challenging because of the variable vessel shape and high complexity of vessel geometry. This study proposes a new active contour model (ACM) implemented by the level-set method for segmenting vessels from TOF-MRA data. The energy function of the new model, combining both region intensity and boundary information, is composed of two region terms, one boundary term and one penalty term. The global threshold representing the lower gray boundary of the target object by maximum intensity projection (MIP) is defined in the first-region term, and it is used to guide the segmentation of the thick vessels. In the second term, a dynamic intensity threshold is employed to extract the tiny vessels. The boundary term is used to drive the contours to evolve towards the boundaries with high gradients. The penalty term is used to avoid reinitialization of the level-set function. Experimental results on 10 clinical brain data sets demonstrate that our method is not only able to achieve better Dice Similarity Coefficient than the global threshold based method and localized hybrid level-set method but also able to extract whole cerebral vessel trees, including the thin vessels. PMID:27597878

  8. Detection and Modeling of High-Dimensional Thresholds for Fault Detection and Diagnosis

    NASA Technical Reports Server (NTRS)

    He, Yuning

    2015-01-01

    Many Fault Detection and Diagnosis (FDD) systems use discrete models for detection and reasoning. To obtain categorical values like "oil pressure too high", analog sensor values need to be discretized using a suitable threshold. Time series of analog and discrete sensor readings are processed and discretized as they come in. This task is usually performed by the "wrapper code" of the FDD system, together with signal preprocessing and filtering. In practice, selecting the right threshold is very difficult, because it heavily influences the quality of diagnosis. If a threshold causes the alarm to trigger even in nominal situations, false alarms will be the consequence. On the other hand, if the threshold setting does not trigger in case of an off-nominal condition, important alarms might be missed, potentially causing hazardous situations. In this paper, we describe in detail the underlying statistical modeling techniques and algorithm as well as the Bayesian method for selecting the most likely shape and its parameters. Our approach is illustrated by several examples from the aerospace domain.
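
    The false-alarm versus missed-alarm trade-off described above can be made concrete with a toy discretization exercise; this is only a generic illustration, not the paper's Bayesian shape-selection method, and all distributions and numbers are invented.

    ```python
    import numpy as np

    def alarm_rates(threshold, nominal, off_nominal):
        """False-alarm and missed-detection rates for a simple 'value too high' rule."""
        false_alarm = float(np.mean(nominal > threshold))
        missed = float(np.mean(off_nominal <= threshold))
        return false_alarm, missed

    rng = np.random.default_rng(7)
    nominal = rng.normal(50.0, 5.0, 5000)       # e.g. a pressure reading in nominal operation
    off_nominal = rng.normal(70.0, 8.0, 500)    # readings during a simulated fault

    for thr in (55, 60, 65):
        fa, miss = alarm_rates(thr, nominal, off_nominal)
        print(f"threshold {thr}: false-alarm rate {fa:.3f}, missed-detection rate {miss:.3f}")
    ```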

  9. An Active Contour Model Based on Adaptive Threshold for Extraction of Cerebral Vascular Structures.

    PubMed

    Wang, Jiaxin; Zhao, Shifeng; Liu, Zifeng; Tian, Yun; Duan, Fuqing; Pan, Yutong

    2016-01-01

    Cerebral vessel segmentation is essential and helpful for clinical diagnosis and related research. However, automatic segmentation of brain vessels remains challenging because of the variable vessel shape and high complexity of vessel geometry. This study proposes a new active contour model (ACM) implemented by the level-set method for segmenting vessels from TOF-MRA data. The energy function of the new model, combining both region intensity and boundary information, is composed of two region terms, one boundary term and one penalty term. The global threshold representing the lower gray boundary of the target object by maximum intensity projection (MIP) is defined in the first-region term, and it is used to guide the segmentation of the thick vessels. In the second term, a dynamic intensity threshold is employed to extract the tiny vessels. The boundary term is used to drive the contours to evolve towards the boundaries with high gradients. The penalty term is used to avoid reinitialization of the level-set function. Experimental results on 10 clinical brain data sets demonstrate that our method is not only able to achieve better Dice Similarity Coefficient than the global threshold based method and localized hybrid level-set method but also able to extract whole cerebral vessel trees, including the thin vessels. PMID:27597878

  10. Critical Thresholds and the Limit Distribution in the Bak-Sneppen Model

    NASA Astrophysics Data System (ADS)

    Meester, Ronald; Znamenski, Dmitri

    One of the key problems related to the Bak-Sneppen evolution model is to compute the limit distribution of the fitnesses in the stationary regime, as the size of the system tends to infinity. Simulations in [3, 1, 4] suggest that the one-dimensional limit marginal distribution is uniform on (pc, 1), for some pc ≈ 0.667. In this paper we define three critical thresholds related to avalanche characteristics. We prove that if these critical thresholds are the same and equal to some pc (we can only prove that two of them are the same) then the limit distribution is the product of uniform distributions on (pc, 1), and moreover pc < 0.75. Our proofs are based on a self-similar graphical representation of the avalanches.

  11. Reentry Near the Percolation Threshold in a Heterogeneous Discrete Model for Cardiac Tissue

    NASA Astrophysics Data System (ADS)

    Alonso, Sergio; Bär, Markus

    2013-04-01

    Arrhythmias in cardiac tissue are related to irregular electrical wave propagation in the heart. Cardiac tissue is formed by a discrete cell network, which is often heterogeneous. A localized region with a fraction of nonconducting links surrounded by homogeneous conducting tissue can become a source of reentry and ectopic beats. Extensive simulations in a discrete model of cardiac tissue show that a wave crossing a heterogeneous region of cardiac tissue can disintegrate into irregular patterns, provided the fraction of nonconducting links is close to the percolation threshold of the cell network. The dependence of the reentry probability on this fraction, the system size, and the degree of excitability can be inferred from the size distribution of nonconducting clusters near the percolation threshold.

  12. A piecewise model of virus-immune system with two thresholds.

    PubMed

    Tang, Biao; Xiao, Yanni; Wu, Jianhong

    2016-08-01

    The combined antiretroviral therapy with interleukin (IL)-2 treatment may not be enough to preclude exceptionally high growth of the HIV virus nor rebuild the HIV-specific CD4 or CD8 T-cell proliferative immune response for management of HIV-infected patients. Whether extra inclusion of immune therapy can induce the HIV-specific immune response and control HIV replication remains challenging. Here a piecewise virus-immune model with two thresholds is proposed to represent the HIV-1 RNA and effector cell-guided therapy strategies. We first analyze the dynamics of the virus-immune system with effector cell-guided immune therapy only and prove that there exists a critical level of the intensity of immune therapy determining whether the HIV-1 RNA viral loads can be controlled below a relatively low level. Our analysis of the global dynamics of the proposed model shows that the pseudo-equilibrium can be globally stable or locally bistable with an order 1 periodic solution or bistable with the virus-free periodic solution under various appropriate conditions. This indicates that HIV viral loads can either be eradicated or stabilize at a previously given level or go to infinity (corresponding to the effector cells oscillating), depending on the threshold levels and the initial HIV viral loads and effector cell counts. Compared with the single-threshold therapy strategy, we find that with two-threshold therapy strategies either the virus can be eradicated or the controllable region, where HIV viral loads can be maintained below a certain value, can be enlarged. PMID:27321193

  13. On the thresholds, probability densities, and critical exponents of Bak-Sneppen-like models

    NASA Astrophysics Data System (ADS)

    Garcia, Guilherme J. M.; Dickman, Ronald

    2004-10-01

    We report a simple method to accurately determine the threshold and the exponent ν of the Bak-Sneppen (BS) model and also investigate the BS universality class. For the random-neighbor version of the BS model, we find the threshold x* = 0.33332(3), in agreement with the exact result x* = 1/3 given by mean-field theory. For the one-dimensional original model, we find x* = 0.6672(2), in good agreement with the results reported in the literature; for the anisotropic BS model we obtain x* = 0.7240(1). We study the finite-size effect x*(L) - x*(L→∞) ∝ L^(-ν), observed in a system with L sites, and find ν = 1.00(1) for the random-neighbor version, ν = 1.40(1) for the original model, and ν = 1.58(1) for the anisotropic case. Finally, we discuss the effect of defining the extremal site as the one which minimizes a general function f(x), instead of simply f(x) = x as in the original updating rule. We emphasize that models with extremal dynamics have singular stationary probability distributions p(x). Our simulations indicate the existence of two symmetry-based universality classes.
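
    The one-dimensional Bak-Sneppen threshold quoted above is easy to reproduce numerically; a minimal simulation (system size and step count chosen arbitrarily) is sketched below.

    ```python
    import numpy as np

    def bak_sneppen(n_sites=200, steps=200_000, seed=0):
        """1D Bak-Sneppen with periodic boundaries: at each step the minimum-fitness
        site and its two neighbours receive fresh U(0,1) fitnesses."""
        rng = np.random.default_rng(seed)
        f = rng.random(n_sites)
        for _ in range(steps):
            i = int(np.argmin(f))
            f[[(i - 1) % n_sites, i, (i + 1) % n_sites]] = rng.random(3)
        return f

    f = bak_sneppen()
    # In the stationary regime almost all fitnesses lie above the threshold x* ≈ 0.667.
    print("5th percentile of stationary fitnesses:", round(float(np.quantile(f, 0.05)), 3))
    ```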

  14. Kuramoto model with uniformly spaced frequencies: Finite-N asymptotics of the locking threshold.

    PubMed

    Ottino-Löffler, Bertrand; Strogatz, Steven H

    2016-06-01

    We study phase locking in the Kuramoto model of coupled oscillators in the special case where the number of oscillators, N, is large but finite, and the oscillators' natural frequencies are evenly spaced on a given interval. In this case, stable phase-locked solutions are known to exist if and only if the frequency interval is narrower than a certain critical width, called the locking threshold. For infinite N, the exact value of the locking threshold was calculated 30 years ago; however, the leading corrections to it for finite N have remained unsolved analytically. Here we derive an asymptotic formula for the locking threshold when N≫1. The leading correction to the infinite-N result scales like either N^{-3/2} or N^{-1}, depending on whether the frequencies are evenly spaced according to a midpoint rule or an end-point rule. These scaling laws agree with numerical results obtained by Pazó [D. Pazó, Phys. Rev. E 72, 046211 (2005), 10.1103/PhysRevE.72.046211]. Moreover, our analysis yields the exact prefactors in the scaling laws, which also match the numerics. PMID:27415267
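
    A rough numerical check of the finite-N locking threshold can be done by bisection on the half-width of the evenly spaced frequency interval, declaring the system locked when all effective frequencies agree after a transient. Under the usual K/N coupling normalization the infinite-N threshold should, if I recall correctly, be π/4 ≈ 0.785 for unit coupling; near the threshold, slow transients can bias this crude estimate slightly. Integration times, tolerances, and N below are arbitrary choices, not the paper's procedure.

    ```python
    import numpy as np

    def is_locked(half_width, n=64, k=1.0, t_max=400.0, dt=0.05, tol=1e-3):
        """Crude test for phase locking: frequencies evenly spaced (midpoint rule) on
        [-half_width, half_width]; locked if all effective frequencies agree after a transient."""
        omega = half_width * (2.0 * (np.arange(n) + 0.5) / n - 1.0)
        theta = np.zeros(n)
        for _ in range(int(t_max / dt)):
            coupling = (k / n) * np.sin(theta[None, :] - theta[:, None]).sum(axis=1)
            theta += dt * (omega + coupling)
        d = omega + (k / n) * np.sin(theta[None, :] - theta[:, None]).sum(axis=1)
        return (d.max() - d.min()) < tol

    lo, hi = 0.5, 1.0                       # bracket for the critical half-width
    for _ in range(20):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if is_locked(mid) else (lo, mid)
    print("estimated locking threshold (N = 64):", round(0.5 * (lo + hi), 4))
    ```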

  15. Kuramoto model with uniformly spaced frequencies: Finite-N asymptotics of the locking threshold

    NASA Astrophysics Data System (ADS)

    Ottino-Löffler, Bertrand; Strogatz, Steven H.

    2016-06-01

    We study phase locking in the Kuramoto model of coupled oscillators in the special case where the number of oscillators, N, is large but finite, and the oscillators' natural frequencies are evenly spaced on a given interval. In this case, stable phase-locked solutions are known to exist if and only if the frequency interval is narrower than a certain critical width, called the locking threshold. For infinite N, the exact value of the locking threshold was calculated 30 years ago; however, the leading corrections to it for finite N have remained unsolved analytically. Here we derive an asymptotic formula for the locking threshold when N ≫ 1. The leading correction to the infinite-N result scales like either N^(-3/2) or N^(-1), depending on whether the frequencies are evenly spaced according to a midpoint rule or an end-point rule. These scaling laws agree with numerical results obtained by Pazó [D. Pazó, Phys. Rev. E 72, 046211 (2005), 10.1103/PhysRevE.72.046211]. Moreover, our analysis yields the exact prefactors in the scaling laws, which also match the numerics.

  16. Impact of slow K(+) currents on spike generation can be described by an adaptive threshold model.

    PubMed

    Kobayashi, Ryota; Kitano, Katsunori

    2016-06-01

    A neuron that is stimulated by rectangular current injections initially responds with a high firing rate, followed by a decrease in the firing rate. This phenomenon is called spike-frequency adaptation and is usually mediated by slow K(+) currents, such as the M-type K(+) current (I M ) or the Ca(2+)-activated K(+) current (I AHP ). It is not clear how the detailed biophysical mechanisms regulate spike generation in a cortical neuron. In this study, we investigated the impact of slow K(+) currents on spike generation mechanism by reducing a detailed conductance-based neuron model. We showed that the detailed model can be reduced to a multi-timescale adaptive threshold model, and derived the formulae that describe the relationship between slow K(+) current parameters and reduced model parameters. Our analysis of the reduced model suggests that slow K(+) currents have a differential effect on the noise tolerance in neural coding. PMID:27085337

  17. Cumulative t-link threshold models for the genetic analysis of calving ease scores

    PubMed Central

    Kizilkaya, Kadir; Carnier, Paolo; Albera, Andrea; Bittante, Giovanni; Tempelman, Robert J

    2003-01-01

    In this study, a hierarchical threshold mixed model based on a cumulative t-link specification for the analysis of ordinal data or more, specifically, calving ease scores, was developed. The validation of this model and the Markov chain Monte Carlo (MCMC) algorithm was carried out on simulated data from normally and t4 (i.e. a t-distribution with four degrees of freedom) distributed populations using the deviance information criterion (DIC) and a pseudo Bayes factor (PBF) measure to validate recently proposed model choice criteria. The simulation study indicated that although inference on the degrees of freedom parameter is possible, MCMC mixing was problematic. Nevertheless, the DIC and PBF were validated to be satisfactory measures of model fit to data. A sire and maternal grandsire cumulative t-link model was applied to a calving ease dataset from 8847 Italian Piemontese first parity dams. The cumulative t-link model was shown to lead to posterior means of direct and maternal heritabilities (0.40 ± 0.06, 0.11 ± 0.04) and a direct maternal genetic correlation (-0.58 ± 0.15) that were not different from the corresponding posterior means of the heritabilities (0.42 ± 0.07, 0.14 ± 0.04) and the genetic correlation (-0.55 ± 0.14) inferred under the conventional cumulative probit link threshold model. Furthermore, the correlation (> 0.99) between posterior means of sire progeny merit from the two models suggested no meaningful rerankings. Nevertheless, the cumulative t-link model was decisively chosen as the better fitting model for this calving ease data using DIC and PBF. PMID:12939202

  18. A Probabilistic Model for Estimating the Depth and Threshold Temperature of C-fiber Nociceptors.

    PubMed

    Dezhdar, Tara; Moshourab, Rabih A; Fründ, Ingo; Lewin, Gary R; Schmuker, Michael

    2015-01-01

    The subjective experience of thermal pain follows the detection and encoding of noxious stimuli by primary afferent neurons called nociceptors. However, nociceptor morphology has been hard to access and the mechanisms of signal transduction remain unresolved. In order to understand how heat transducers in nociceptors are activated in vivo, it is important to estimate the temperatures that directly activate the skin-embedded nociceptor membrane. Hence, the nociceptor's temperature threshold must be estimated, which in turn will depend on the depth at which transduction happens in the skin. Since the temperature at the receptor cannot be accessed experimentally, such an estimation can currently only be achieved through modeling. However, the current state-of-the-art model to estimate temperature at the receptor suffers from the fact that it cannot account for the natural stochastic variability of neuronal responses. We improve this model using a probabilistic approach which accounts for uncertainties and potential noise in the system. Using a data set of 24 C-fibers recorded in vitro, we show that, even without detailed knowledge of the bio-thermal properties of the system, the probabilistic model that we propose here is capable of providing estimates of threshold and depth in cases where the classical method fails. PMID:26638830

  19. Effects of temporal correlations on cascades: Threshold models on temporal networks

    NASA Astrophysics Data System (ADS)

    Backlund, Ville-Pekka; Saramäki, Jari; Pan, Raj Kumar

    2014-06-01

    A person's decision to adopt an idea or product is often driven by the decisions of peers, mediated through a network of social ties. A common way of modeling adoption dynamics is to use threshold models, where a node may become an adopter given a high enough rate of contacts with adopted neighbors. We study the dynamics of threshold models that take both the network topology and the timings of contacts into account, using empirical contact sequences as substrates. The models are designed such that adoption is driven by the number of contacts with different adopted neighbors within a chosen time. We find that while some networks support cascades leading to network-level adoption, some do not: the propagation of adoption depends on several factors from the frequency of contacts to burstiness and timing correlations of contact sequences. More specifically, burstiness is seen to suppress cascade sizes when compared to randomized contact timings, while timing correlations between contacts on adjacent links facilitate cascades.
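
    As a sketch of the adoption rule described above (Python): a node adopts once it has been contacted by at least k distinct adopted neighbors within a time window delta. The contact-sequence format, k and delta are assumptions for illustration and not the exact parameterization of the study.

```python
from collections import defaultdict

def temporal_threshold_cascade(contacts, seeds, k=2, delta=24.0):
    """Threshold adoption dynamics on a temporal network.

    contacts : iterable of (t, u, v) contact events, sorted by time t
    seeds    : initially adopted nodes
    k        : number of *distinct* adopted neighbors required within delta
    delta    : length of the memory time window
    A node adopts once >= k distinct adopted neighbors have contacted it
    within any window of length delta.
    """
    adopted = set(seeds)
    recent = defaultdict(dict)          # node -> {adopted neighbor: last contact time}
    for t, u, v in contacts:
        for a, b in ((u, v), (v, u)):   # contacts are treated as undirected
            if b in adopted and a not in adopted:
                recent[a][b] = t
                # forget influence older than the time window
                recent[a] = {n: s for n, s in recent[a].items() if t - s <= delta}
                if len(recent[a]) >= k:
                    adopted.add(a)
    return adopted

# Toy contact sequence: node 3 adopts because both adopted neighbors
# contact it within delta time units of each other.
events = [(0.0, 1, 3), (5.0, 2, 3), (40.0, 2, 3)]
print(temporal_threshold_cascade(events, seeds={1, 2}, k=2, delta=10.0))
```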

  20. Computationally Efficient Implementation of a Novel Algorithm for the General Unified Threshold Model of Survival (GUTS)

    PubMed Central

    Albert, Carlo; Vogel, Sören

    2016-01-01

    The General Unified Threshold model of Survival (GUTS) provides a consistent mathematical framework for survival analysis. However, the calibration of GUTS models is computationally challenging. We present a novel algorithm and its fast implementation in our R package, GUTS, that help to overcome these challenges. We show a step-by-step application example consisting of model calibration and uncertainty estimation as well as making probabilistic predictions and validating the model with new data. Using self-defined wrapper functions, we show how to produce informative text printouts and plots without effort, for the inexperienced as well as the advanced user. The complete ready-to-run script is available as supplemental material. We expect that our software facilitates novel re-analysis of existing survival data as well as asking new research questions in a wide range of sciences. In particular the ability to quickly quantify stressor thresholds in conjunction with dynamic compensating processes, and their uncertainty, is an improvement that complements current survival analysis methods. PMID:27340823
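
    For orientation, a minimal numerical sketch of the reduced GUTS stochastic-death (SD) variant that such software calibrates, written in Python rather than R; the parameter values are illustrative and this is not the GUTS package's algorithm or API.

```python
import numpy as np

def guts_sd_survival(conc, dt, kd=0.5, kk=0.3, z=2.0, hb=0.01):
    """GUTS reduced stochastic-death (SD) model, Euler integration.

    conc : external concentration time series
    dt   : time step
    kd   : dominant rate constant (damage dynamics)
    kk   : killing rate above the threshold
    z    : threshold for the scaled damage
    hb   : background hazard rate
    Returns the survival probability over time.  Parameter values are
    illustrative only.
    """
    D = 0.0           # scaled damage
    H = 0.0           # cumulative hazard
    surv = np.empty(len(conc))
    for i, c in enumerate(conc):
        D += dt * kd * (c - D)                 # damage tracks the exposure
        H += dt * (kk * max(D - z, 0.0) + hb)  # hazard accrues only above z
        surv[i] = np.exp(-H)
    return surv

# Pulsed exposure: survival only starts dropping once damage exceeds z.
exposure = np.concatenate([np.full(50, 5.0), np.zeros(150)])
S = guts_sd_survival(exposure, dt=0.1)
```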

  1. A Probabilistic Model for Estimating the Depth and Threshold Temperature of C-fiber Nociceptors

    NASA Astrophysics Data System (ADS)

    Dezhdar, Tara; Moshourab, Rabih A.; Fründ, Ingo; Lewin, Gary R.; Schmuker, Michael

    2015-12-01

    The subjective experience of thermal pain follows the detection and encoding of noxious stimuli by primary afferent neurons called nociceptors. However, nociceptor morphology has been hard to access and the mechanisms of signal transduction remain unresolved. In order to understand how heat transducers in nociceptors are activated in vivo, it is important to estimate the temperatures that directly activate the skin-embedded nociceptor membrane. Hence, the nociceptor’s temperature threshold must be estimated, which in turn will depend on the depth at which transduction happens in the skin. Since the temperature at the receptor cannot be accessed experimentally, such an estimation can currently only be achieved through modeling. However, the current state-of-the-art model to estimate temperature at the receptor suffers from the fact that it cannot account for the natural stochastic variability of neuronal responses. We improve this model using a probabilistic approach which accounts for uncertainties and potential noise in the system. Using a data set of 24 C-fibers recorded in vitro, we show that, even without detailed knowledge of the bio-thermal properties of the system, the probabilistic model that we propose here is capable of providing estimates of threshold and depth in cases where the classical method fails.

  2. A Probabilistic Model for Estimating the Depth and Threshold Temperature of C-fiber Nociceptors

    PubMed Central

    Dezhdar, Tara; Moshourab, Rabih A.; Fründ, Ingo; Lewin, Gary R.; Schmuker, Michael

    2015-01-01

    The subjective experience of thermal pain follows the detection and encoding of noxious stimuli by primary afferent neurons called nociceptors. However, nociceptor morphology has been hard to access and the mechanisms of signal transduction remain unresolved. In order to understand how heat transducers in nociceptors are activated in vivo, it is important to estimate the temperatures that directly activate the skin-embedded nociceptor membrane. Hence, the nociceptor’s temperature threshold must be estimated, which in turn will depend on the depth at which transduction happens in the skin. Since the temperature at the receptor cannot be accessed experimentally, such an estimation can currently only be achieved through modeling. However, the current state-of-the-art model to estimate temperature at the receptor suffers from the fact that it cannot account for the natural stochastic variability of neuronal responses. We improve this model using a probabilistic approach which accounts for uncertainties and potential noise in the system. Using a data set of 24 C-fibers recorded in vitro, we show that, even without detailed knowledge of the bio-thermal properties of the system, the probabilistic model that we propose here is capable of providing estimates of threshold and depth in cases where the classical method fails. PMID:26638830

  3. Spinodals, scaling, and ergodicity in a threshold model with long-range stress transfer

    SciTech Connect

    Ferguson, C.D.; Klein, W.; Rundle, J.B.

    1999-08-01

    We present both theoretical and numerical analyses of a cellular automaton version of a slider-block model or threshold model that includes long-range interactions. Theoretically we develop a coarse-grained description in the mean-field (infinite range) limit and discuss the relevance of the metastable state, limit of stability (spinodal), and nucleation to the phenomenology of the model. We also simulate the model and confirm the relevance of the theory for systems with long- but finite-range interactions. Results of particular interest include the existence of Gutenberg-Richter-like scaling consistent with that found on real earthquake fault systems, the association of large events with nucleation near the spinodal, and the result that such systems can be described, in the mean-field limit, with techniques appropriate to systems in equilibrium. © 1999 The American Physical Society
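
    A minimal sketch of a threshold cellular automaton of this general family (Python), using random long-range redistribution of a fraction of a failing site's stress; the parameters and the random-neighbor approximation to the mean-field limit are choices made for illustration, not the exact model analyzed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def longrange_threshold_model(n_sites=2_000, n_events=1_000,
                              threshold=1.0, alpha=0.8, n_neighbors=100):
    """Cellular-automaton slider-block model with long-range stress transfer.

    When a site's stress reaches `threshold`, it fails, drops to zero, and a
    fraction `alpha` of its stress is shared among `n_neighbors` randomly
    chosen sites (a random-neighbor stand-in for long-range coupling).
    Returns the sizes of the resulting avalanches ("events").
    """
    stress = rng.uniform(0.0, threshold, n_sites)
    sizes = []
    for _ in range(n_events):
        # Uniform driving: load every site until the most loaded one fails.
        imax = int(np.argmax(stress))
        stress += threshold - stress[imax]
        failing = np.array([imax])
        size = 0
        while failing.size:
            size += failing.size
            for site in failing:
                share = alpha * stress[site] / n_neighbors
                stress[site] = 0.0
                targets = rng.integers(0, n_sites, n_neighbors)
                np.add.at(stress, targets, share)
            failing = np.flatnonzero(stress >= threshold)
        sizes.append(size)
    return np.array(sizes)

# Event-size statistics from such models show scaling reminiscent of
# Gutenberg-Richter behavior over a range of sizes.
sizes = longrange_threshold_model()
```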

  4. Bus mathematical model of acceleration threshold limit estimation in lateral rollover test

    NASA Astrophysics Data System (ADS)

    Gauchía, A.; Olmeda, E.; Aparicio, F.; Díaz, V.

    2011-10-01

    Vehicle safety is a major concern for researchers, governments and vehicle manufacturers, and therefore special attention is paid to it. In particular, rollover is one of the accident types on which researchers have focused, owing to the severity of the injuries and the social impact it generates. One of the parameters that define bus lateral behaviour is the acceleration threshold limit, which is defined as the lateral acceleration from which the rollover process begins to take place. This parameter can be obtained by means of a lateral rollover platform test or estimated by means of mathematical models. In this paper, the differences between these methods are analysed in depth, and a new mathematical model is proposed to estimate the acceleration threshold limit in the lateral rollover test. The proposed model simulates the lateral rollover test, and, for the first time, it includes the effect of a variable position of the centre of gravity. Finally, the maximum speed at which the bus can travel in a bend without rolling over is computed.
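
    As a rough point of reference (not the model proposed in the paper), the quasi-static rigid-body estimate of the rollover threshold depends only on the track width and the height of the centre of gravity; a Python sketch with an adjustable centre-of-gravity height, using assumed bus-like numbers:

```python
import math

G = 9.81  # m/s^2

def rollover_threshold(track_width, h_cg):
    """Quasi-static rigid-body lateral acceleration threshold (m/s^2).

    Rollover begins when the overturning moment about the outer wheel line
    equals the restoring moment:  m*a_y*h_cg = m*g*(track_width/2),
    i.e. a_y = g * (track_width / 2) / h_cg.  Suspension and tyre compliance,
    which a full model accounts for, lower this value in practice.
    """
    return G * (track_width / 2.0) / h_cg

def max_cornering_speed(track_width, h_cg, radius):
    """Maximum speed (m/s) in a bend of given radius before rollover,
    from a_y = v^2 / radius."""
    return math.sqrt(rollover_threshold(track_width, h_cg) * radius)

# Illustrative bus-like numbers (assumed, not from the paper):
# 2.5 m track, CG at 1.4 m, 100 m bend radius.
print(max_cornering_speed(2.5, 1.4, 100.0))  # ~ 29.6 m/s
```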

  5. Threshold conditions for integrated pest management models with pesticides that have residual effects.

    PubMed

    Tang, Sanyi; Liang, Juhua; Tan, Yuanshun; Cheke, Robert A

    2013-01-01

    Impulsive differential equations (hybrid dynamical systems) can provide a natural description of pulse-like actions such as when a pesticide kills a pest instantly. However, pesticides may have long-term residual effects, with some remaining active against pests for several weeks, months or years. Therefore, a more realistic method for modelling chemical control in such cases is to use continuous or piecewise-continuous periodic functions which affect growth rates. How to evaluate the effects of the duration of pesticide residual effectiveness on successful pest control is key to the implementation of integrated pest management (IPM) in practice. To address these questions in detail, we have modelled IPM including residual effects of pesticides in terms of fixed pulse-type actions. The stability threshold conditions for pest eradication are given. Moreover, the effects on the threshold conditions of the pesticide's killing efficiency rate and decay rate (for both the pest and its natural enemies), the duration of residual effectiveness, the number of pesticide applications and the number of natural enemy releases are investigated with regard to the extent of depression or resurgence resulting from pulses of pesticide applications and predator releases. Latin Hypercube Sampling/Partial Rank Correlation uncertainty and sensitivity analysis techniques are employed to investigate the key control parameters which are most significantly related to threshold values. The findings combined with Volterra's principle confirm that when the pesticide has a strong effect on the natural enemies, repeated use of the same pesticide can result in target pest resurgence. The results also indicate that there exists an optimal number of pesticide applications which can suppress the pest most effectively, and this may help in the design of an optimal control strategy. PMID:22205243
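
    An illustrative simplification of the kind of threshold condition involved (not the paper's criterion): if the pest grows exponentially at rate r while the kill rate decays exponentially from k0 with rate delta after each application at period T, the pest declines over a period exactly when the integrated kill exceeds the integrated growth. A Python sketch under those assumptions, ignoring the natural-enemy dynamics:

```python
import numpy as np

def per_period_growth_factor(r, k0, delta, T):
    """Net per-period growth factor of the pest under a residual pesticide.

    Assumes dP/dt = (r - k(t)) * P with k(t) = k0 * exp(-delta * t) restarted
    at every application (period T).  The pest declines over one period iff
    r*T < (k0/delta) * (1 - exp(-delta*T)), i.e. the returned factor is < 1.
    This is an illustrative simplification of the impulsive models discussed
    above, not their exact eradication threshold.
    """
    integrated_kill = (k0 / delta) * (1.0 - np.exp(-delta * T))
    return np.exp(r * T - integrated_kill)

# Longer residual effectiveness (smaller decay rate delta) pushes the
# per-period factor below 1, i.e. toward eradication.
for delta in (2.0, 0.5, 0.1):
    print(delta, per_period_growth_factor(r=0.2, k0=1.0, delta=delta, T=10.0))
```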

  6. Low dimensional model of heart rhythm dynamics as a tool for diagnosing the anaerobic threshold

    SciTech Connect

    Anosov, O.L.; Butkovskii, O.Y.; Kadtke, J.; Kravtsov, Y.A.

    1997-05-01

    We report preliminary results on describing the dependence of the heart rhythm variability on the stress level by using qualitative, low dimensional models. The reconstruction of macroscopic heart models yielding the duration of cardiac cycles (RR intervals) was based on actual clinical data. Our results show that the coefficients of the low dimensional models are sensitive to metabolic changes. In particular, at the transition between aerobic and aerobic-anaerobic metabolism, there are pronounced extrema in the functional dependence of the coefficients on the stress level. This strong sensitivity can be used to design an easy indirect method for determining the anaerobic threshold. This method could replace costly and invasive traditional methods such as gas analysis and blood tests. © 1997 American Institute of Physics.

  7. Landslide triggering rainfall thresholds estimation using hydrological modelling of catchments in the Ialomita Subcarpathians, Romania

    NASA Astrophysics Data System (ADS)

    Chitu, Zenaida; Busuioc, Aristita; Burcea, Sorin; Sandric, Ionut

    2016-04-01

    This work focuses on hydro-meteorological analysis for estimating landslide-triggering rainfall thresholds in the Ialomita Subcarpathians. This specific area is a complex geological and geomorphic unit in Romania, affected by landslides that cause extensive damage to the infrastructure every few years (1997, 1998, 2005, 2006, 2010, 2012 and 2014). The semi-distributed ModClark hydrological model implemented in the HEC-HMS software, which integrates radar rainfall data, was used to investigate the hydrological conditions within the catchment responsible for the occurrence of landslides during the main rainfall events. Statistical analysis of the main hydro-meteorological variables during the landslide events that occurred between 2005 and 2014 was carried out in order to identify preliminary rainfall thresholds for landslides in the Ialomita Subcarpathians. Moreover, according to the environmental catchment characteristics, different hydrological behaviors could be identified based on the spatially distributed rainfall estimates from weather radar data. Two hydrological regimes in the catchments were distinguished: one dominated by direct flow, which explains the landslides that occurred due to slope undercutting, and one characterized by high soil water storage during prolonged rainfall, where subsurface runoff is therefore significant. The hydrological precipitation-discharge modelling of the catchment in the Ialomita Subcarpathians in which landslides occurred helped in understanding landslide triggering and as such can be of added value for landslide research.

  8. Hypothesis testing in functional linear regression models with Neyman's truncation and wavelet thresholding for longitudinal data.

    PubMed

    Yang, Xiaowei; Nie, Kun

    2008-03-15

    Longitudinal data sets in biomedical research often consist of large numbers of repeated measures. In many cases, the trajectories do not look globally linear or polynomial, making it difficult to summarize the data or test hypotheses using standard longitudinal data analysis based on various linear models. An alternative approach is to apply the approaches of functional data analysis, which directly target the continuous nonlinear curves underlying discretely sampled repeated measures. For the purposes of data exploration, many functional data analysis strategies have been developed based on various schemes of smoothing, but fewer options are available for making causal inferences regarding predictor-outcome relationships, a common task seen in hypothesis-driven medical studies. To compare groups of curves, two testing strategies with good power have been proposed for high-dimensional analysis of variance: the Fourier-based adaptive Neyman test and the wavelet-based thresholding test. Using a smoking cessation clinical trial data set, this paper demonstrates how to extend the strategies for hypothesis testing into the framework of functional linear regression models (FLRMs) with continuous functional responses and categorical or continuous scalar predictors. The analysis procedure consists of three steps: first, apply the Fourier or wavelet transform to the original repeated measures; then fit a multivariate linear model in the transformed domain; and finally, test the regression coefficients using either adaptive Neyman or thresholding statistics. Since a FLRM can be viewed as a natural extension of the traditional multiple linear regression model, the development of this model and computational tools should enhance the capacity of medical statistics for longitudinal data. PMID:17610294
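
    A simplified sketch of that three-step procedure for the two-group case (Python): transform each subject's curve to the Fourier domain, standardize the group difference per coefficient, and calibrate Fan's adaptive Neyman statistic by permutation. The number of retained coefficients and the permutation calibration are assumptions made for illustration; this is not the authors' implementation.

```python
import numpy as np

def adaptive_neyman_stat(z):
    """Adaptive Neyman statistic for standardized coefficients z:
    max over m of (1/sqrt(2m)) * sum_{i<=m} (z_i^2 - 1)."""
    cumulative = np.cumsum(z ** 2 - 1.0)
    m = np.arange(1, len(z) + 1)
    return np.max(cumulative / np.sqrt(2.0 * m))

def compare_curve_groups(curves_a, curves_b, n_coef=20, n_perm=2000, seed=0):
    """Permutation-calibrated Fourier / adaptive-Neyman comparison of two
    groups of repeated-measure curves (rows = subjects, columns = time points).
    Returns the observed statistic and a permutation p-value."""
    rng = np.random.default_rng(seed)

    def stat(a, b):
        fa = np.fft.rfft(a, axis=1)[:, :n_coef]   # transform to the Fourier domain
        fb = np.fft.rfft(b, axis=1)[:, :n_coef]
        diff = fa.mean(axis=0) - fb.mean(axis=0)  # group difference per coefficient
        se = np.sqrt(fa.var(axis=0) / len(a) + fb.var(axis=0) / len(b))
        return adaptive_neyman_stat(np.abs(diff) / (se + 1e-12))

    observed = stat(np.asarray(curves_a), np.asarray(curves_b))
    pooled = np.vstack([curves_a, curves_b])
    n_a = len(curves_a)
    perm = []
    for _ in range(n_perm):
        idx = rng.permutation(len(pooled))
        perm.append(stat(pooled[idx[:n_a]], pooled[idx[n_a:]]))
    return observed, float(np.mean(np.array(perm) >= observed))
```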

  9. NN-->NNπ reaction near threshold in a covariant one-boson-exchange model

    NASA Astrophysics Data System (ADS)

    Shyam, R.; Mosel, U.

    1998-04-01

    We calculate the cross sections for the p(p,nπ+)p and p(p,pπ0)p reactions for proton beam energies near threshold in a covariant one-boson-exchange model which incorporates the exchange of π, ρ, σ and ω mesons and treats both the nucleon and the delta isobar as intermediate states. Final state interaction effects are included within Watson's theory. Within this model the ω and σ meson exchange terms contribute significantly at these energies, which, along with other meson exchanges, makes it possible to reproduce the available experimental data for the total as well as differential cross sections for both reactions. The cross sections at beam energies <=300 MeV are found to be almost free from the contributions of the Δ isobar excitation.

  10. Probabilistic transport models for plasma transport in the presence of critical thresholds: Beyond the diffusive paradigm

    NASA Astrophysics Data System (ADS)

    Sánchez, R.; van Milligen, B. Ph.; Carreras, B. A.

    2005-05-01

    It is argued that the modeling of plasma transport in tokamaks may benefit greatly from extending the usual local paradigm to accommodate scale-free transport mechanisms. This can be done by combining Lévy distributions and a nonlinear threshold condition within the continuous time random walk concept. The advantages of this nonlocal, nonlinear extension are illustrated by constructing a simple particle density transport model that, as a result of these ideas, spontaneously exhibits much of the nondiffusive phenomenology routinely observed in tokamaks. The fluid limit of the system shows that the kind of equations that are appropriate to capture these dynamics are based on fractional differential operators. In them, effective diffusivities and pinch velocities are found that are dynamically set by the system in response to the specific characteristics of the fueling source and external perturbations. This fact suggests some dramatic consequences for the extrapolation of these transport properties to larger size systems.

  11. The minimal SUSY B - L model: simultaneous Wilson lines and string thresholds

    NASA Astrophysics Data System (ADS)

    Deen, Rehan; Ovrut, Burt A.; Purves, Austin

    2016-07-01

    In previous work, we presented a statistical scan over the soft supersymmetry breaking parameters of the minimal SUSY B - L model. For specificity of calculation, unification of the gauge parameters was enforced by allowing the two Z_3 × Z_3 Wilson lines to have mass scales separated by approximately an order of magnitude. This introduced an additional "left-right" sector below the unification scale. In this paper, for three important reasons, we modify our previous analysis by demanding that the mass scales of the two Wilson lines be simultaneous and equal to an "average unification" mass <M_U>. The present analysis is 1) more "natural" than the previous calculations, which were only valid in a very specific region of the Calabi-Yau moduli space, 2) the theory is conceptually simpler in that the left-right sector has been removed and 3) in the present analysis the lack of gauge unification is due to threshold effects — particularly heavy string thresholds, which we calculate statistically in detail. As in our previous work, the theory is renormalization group evolved from <M_U> to the electroweak scale — being subjected, sequentially, to the requirement of radiative B - L and electroweak symmetry breaking, the present experimental lower bounds on the B - L vector boson and sparticle masses, as well as the lightest neutral Higgs mass of ~125 GeV. The subspace of soft supersymmetry breaking masses that satisfies all such constraints is presented and shown to be substantial.

  12. Comparison between the effects of quercetin on seizure threshold in acute and chronic seizure models.

    PubMed

    Nassiri-Asl, Marjan; Hajiali, Farid; Taghiloo, Mina; Abbasi, Esmail; Mohseni, Fatemeh; Yousefi, Farbod

    2016-05-01

    Flavonoids are important constituents of food and beverages, and several studies have shown that they have neuroactive properties. Many of these compounds are ligands for γ-aminobutyric acid type A receptors in the central nervous system. This study aimed to investigate the anticonvulsant effects of quercetin (3,3',4',5,7-pentahydroxyflavone), which is a flavonoid found in plants, in rats treated with pentylenetetrazole in acute and chronic seizure models. Single intraperitoneal administration of quercetin did not show anticonvulsive effects against acute seizure. Similarly, multiple oral pretreatment with quercetin did not have protective effects against acute seizure. However, multiple intraperitoneal administration of quercetin (25 and 50 mg/kg) significantly increased time to death compared with the control (p < 0.001). Quercetin pretreatment had no significant effects on the pattern of convulsion development during any period of kindling, but on the test day quercetin (100 mg/kg) significantly increased the latency to generalized tonic-clonic seizure (GTCS) onset and decreased GTCS duration compared with the control (p < 0.01, p < 0.05). We conclude that quercetin has a narrow therapeutic dose range for anticonvulsant activities in vivo, and it has different effects on the seizure threshold. The different effects of quercetin on seizure threshold may occur through several mechanisms. PMID:24442347

  13. A model for calculating the threshold for shock initiation of pyrotechnics and explosives

    SciTech Connect

    Maiden, D.E.

    1987-03-01

    A model is proposed for predicting the shock pressure P and pulse width π required to ignite porous reactive mixtures. Essentially, the shock wave collapses the voids, forming high-temperature hot spots that ignite the mixture. The pore temperature is determined by numerical solution of the equations of motion, viscoplastic heating, and heat conduction. The pore radius is determined as a function of the pore size, viscosity, yield stress, and pressure. Temperature-dependent material properties and melting are considered. Ignition occurs when the surface temperature of the pore reaches the critical hot-spot temperature for thermal runaway. Data from flyer-plate impact experiments were analyzed and the pressure pulse at the ignition threshold was determined for 2Al/Fe2O3 (thermite) and the high explosives TATB, PBX 9404, and PETN. Mercury intrusion porosimetry was performed on the samples and the pore size distribution determined. Theoretical and numerical predictions of the ignition threshold are compared with experiment. Results show that P²π appears to be an initiation property of the material.

  14. Can we clinically recognize a vascular depression? The role of personality in an expanded threshold model.

    PubMed

    Turk, Bela R; Gschwandtner, Michael E; Mauerhofer, Michaela; Löffler-Stastka, Henriette

    2015-05-01

    The vascular depression (VD) hypothesis postulates that cerebrovascular disease may "predispose, precipitate, or perpetuate" a depressive syndrome in elderly patients. The clinical presentation of VD has been shown to differ from that of major depression in quantitative disability; however, as little research has addressed qualitative phenomenological differences in the personality aspects of the symptom profile, clinical diagnosis remains a challenge. We attempted to identify differences in clinical presentation between depression patients (n = 50) with (n = 25) and without (n = 25) vascular disease using questionnaires to assess depression, affect regulation, object relations, aggressiveness, alexithymia, personality functioning, personality traits, and countertransference. We were able to show that patients with vascular dysfunction and depression exhibit significantly higher aggressive and auto-aggressive tendencies due to a lower tolerance threshold. These data indicate, first, that VD is a separate clinical entity and, second, that personality itself may be a component of the disease process. We propose an expanded threshold disease model incorporating personality functioning and mood changes. Such findings might also aid the development of a screening program, by serving as differential criteria and thereby improving the diagnostic procedure. PMID:25950684

  15. Thresholds in Atmosphere-Soil Moisture Interactions: Results from Climate Model Studies

    NASA Technical Reports Server (NTRS)

    Oglesby, Robert J.; Marshall, Susan; Erickson, David J., III; Roads, John O.; Robertson, Franklin R.; Arnold, James E. (Technical Monitor)

    2001-01-01

    The potential predictability of the effects of warm season soil moisture anomalies over the central U.S. has been investigated using a series of GCM (Global Climate Model) experiments with the NCAR (National Center for Atmospheric Research) CCM3 (Community Climate Model version 3)/LSM (Land Surface Model). Three different types of experiments have been made, all starting in either March (representing precursor conditions) or June (conditions at the onset of the warm season): (1) 'anomaly' runs with large, exaggerated initial soil moisture reductions, aimed at evaluating the physical mechanisms by which soil moisture can affect the atmosphere; (2) 'predictability' runs aimed at evaluating whether typical soil moisture initial anomalies (indicative of year-to-year variability) can have a significant effect, and if so, for how long; (3) 'threshold' runs aimed at evaluating if a soil moisture anomaly must be of a specific size (i.e., a threshold crossed) before a significant impact on the atmosphere is seen. The 'anomaly' runs show a large, long-lasting response in soil moisture and also quantities such as surface temperature, sea level pressure, and precipitation; effects persist for at least a year. The 'predictability' runs, on the other hand, show very little impact of the initial soil moisture anomalies on the subsequent evolution of soil moisture and other atmospheric parameters; internal variability is most important, with the initial state of the atmosphere (representing remote effects such as SST anomalies) playing a more minor role. The 'threshold' runs, devised to help resolve the dichotomy in 'anomaly' and 'predictability' results, suggest that, at least in CCM3/LSM, the vertical profile of soil moisture is the most important factor, and that deep soil zone anomalies exert a more powerful, long-lasting effect than do anomalies in the near surface soil zone. We therefore suggest that soil moisture feedbacks may be more important in explaining prolonged

  16. A population-based Habitable Zone perspective

    NASA Astrophysics Data System (ADS)

    Zsom, Andras

    2015-08-01

    What can we tell about exoplanet habitability if currently only the stellar properties, planet radius, and the incoming stellar flux are known? The Habitable Zone (HZ) is the region around stars where planets can harbor liquid water on their surfaces. The HZ is traditionally conceived as a sharp region around the star because it is calculated for one planet with specific properties, e.g., Earth-like or desert planets, or rocky planets with H2 atmospheres. Such a planet-specific approach is limiting because the planets’ atmospheric and geophysical properties, which influence the surface climate and the presence of liquid water, are currently unknown but expected to be diverse. A statistical HZ description is outlined which does not select one specific planet type. Instead, the atmospheric and surface properties of exoplanets are treated as random variables and a continuous range of planet scenarios is considered. Various probability density functions are assigned to each observationally unconstrained random variable, and a combination of Monte Carlo sampling and climate modeling is used to generate synthetic exoplanet populations with known surface climates. Then, the properties of the liquid-water-bearing subpopulation are analyzed. Given our current observational knowledge of small exoplanets, the HZ takes the form of a weakly-constrained but smooth probability function. The model shows that the HZ has an inner edge: it is unlikely that planets receiving two to three times more stellar radiation than Earth can harbor liquid water. But a clear outer edge is not seen: a planet that receives a fraction of Earth's stellar radiation (1-10%) can be habitable, if the greenhouse effect of the atmosphere is strong enough. The main benefit of the population-based approach is that it will be refined over time as new data on exoplanets and their atmospheres become available.
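
    A toy version of the population-based approach (Python): treat the unknown albedo and greenhouse warming as random variables, propagate them through a zero-dimensional energy balance, and record the fraction of the synthetic population whose surface temperature allows liquid water. The priors and the climate treatment are drastic simplifications assumed for illustration, not those used in the study.

```python
import numpy as np

SIGMA = 5.670e-8     # Stefan-Boltzmann constant, W m^-2 K^-4
S_EARTH = 1361.0     # solar constant at Earth, W m^-2

def habitable_fraction(rel_flux, n=100_000, seed=1):
    """Fraction of a synthetic planet population with 273 K < T_surf < 373 K.

    rel_flux : incident stellar flux relative to Earth's
    Albedo ~ Uniform(0.05, 0.7) and greenhouse warming ~ Uniform(0, 150 K)
    are illustrative priors, not the distributions used in the study.
    """
    rng = np.random.default_rng(seed)
    albedo = rng.uniform(0.05, 0.7, n)
    greenhouse = rng.uniform(0.0, 150.0, n)
    t_eq = (rel_flux * S_EARTH * (1.0 - albedo) / (4.0 * SIGMA)) ** 0.25
    t_surf = t_eq + greenhouse
    return np.mean((t_surf > 273.15) & (t_surf < 373.15))

# Scanning across incident flux yields a smooth habitability probability
# rather than a sharp-edged zone.
for f in (4.0, 2.0, 1.0, 0.3, 0.05):
    print(f, habitable_fraction(f))
```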

  17. Signal detection and threshold modeling of confidence-rating ROCs: A critical test with minimal assumptions.

    PubMed

    Kellen, David; Klauer, Karl Christoph

    2015-07-01

    An ongoing discussion in the recognition-memory literature concerns the question of whether recognition judgments reflect a direct mapping of graded memory representations (a notion that is instantiated by signal detection theory) or whether they are mediated by a discrete-state representation with the possibility of complete information loss (a notion that is instantiated by threshold models). These 2 accounts are usually evaluated by comparing their (penalized) fits to receiver operating characteristic data, a procedure that is predicated on substantial auxiliary assumptions, which if violated can invalidate results. We show that the 2 accounts can be compared on the basis of critical tests that invoke only minimal assumptions. Using previously published receiver operating characteristic data, we show that confidence-rating judgments are consistent with a discrete-state account. PMID:26120910
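
    The two accounts predict differently shaped receiver operating characteristics, which is what such comparisons exploit; a small Python sketch of the textbook forms of the two models (parameter values are illustrative), not the minimal-assumption test machinery developed in the paper:

```python
import numpy as np
from scipy.stats import norm

def sdt_roc(d_prime=1.5, n=200):
    """Equal-variance signal-detection ROC (curved): old ~ N(d',1), new ~ N(0,1),
    respond "old" when the evidence exceeds a criterion c."""
    c = np.linspace(-4.0, 4.0, n)
    hits = norm.cdf(d_prime - c)
    false_alarms = norm.cdf(-c)
    return false_alarms, hits

def two_ht_roc(p_old=0.6, p_new=0.4, n=200):
    """Two-high-threshold ROC (linear): old/new items are detected with
    probability p_old / p_new, otherwise the response is a guess with bias b."""
    b = np.linspace(0.0, 1.0, n)
    hits = p_old + (1.0 - p_old) * b
    false_alarms = (1.0 - p_new) * b
    return false_alarms, hits

# Confidence-rating ROC points from data can then be compared against the
# curved (signal detection) versus linear (threshold) predictions.
fa_sdt, hit_sdt = sdt_roc()
fa_2ht, hit_2ht = two_ht_roc()
```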

  18. Effect of resiniferatoxin on the noxious heat threshold temperature in the rat: a novel heat allodynia model sensitive to analgesics

    PubMed Central

    Almási, Róbert; Pethö, Gábor; Bölcskei, Kata; Szolcsányi, János

    2003-01-01

    An increasing-temperature hot plate (ITHP) was introduced to measure the noxious heat threshold (45.3±0.3°C) of unrestrained rats, which was reproducible upon repeated determinations at intervals of 5 or 30 min or 1 day. Morphine, diclofenac and paracetamol caused an elevation of the noxious heat threshold following i.p. pretreatment, the minimum effective doses being 3, 10 and 200 mg kg−1, respectively. Unilateral intraplantar injection of the VR1 receptor agonist resiniferatoxin (RTX, 0.048 nmol) induced a profound drop of heat threshold to the innocuous range with a maximal effect (8–10°C drop) 5 min after RTX administration. This heat allodynia was inhibited by pretreatment with morphine, diclofenac and paracetamol, the minimum effective doses being 1, 1 and 100 mg kg−1 i.p., respectively. The long-term sensory desensitizing effect of RTX was examined by bilateral intraplantar injection (0.048 nmol per paw) which produced, after an initial threshold drop, an elevation (up to 2.9±0.5°C) of heat threshold lasting for 5 days. The VR1 receptor antagonist iodo-resiniferatoxin (I-RTX, 0.05 nmol intraplantarly) inhibited by 51% the heat threshold-lowering effect of intraplantar RTX but not α,β-methylene-ATP (0.3 μmol per paw). I-RTX (0.1 or 1 nmol per paw) failed to alter the heat threshold either acutely (5–60 min) or on the long-term (5 days). The heat threshold of VR1 receptor knockout mice was not different from that of wild-type animals (45.6±0.5 vs 45.2±0.4°C). In conclusion, the RTX-induced drop of heat threshold measured by the ITHP is a novel heat allodynia model exhibiting a high sensitivity to analgesics. PMID:12746222

  19. Electrodynamic model of the field effect transistor application for THz/subTHz radiation detection: Subthreshold and above threshold operation

    SciTech Connect

    Dobrovolsky, V.

    2014-10-21

    This work develops an electrodynamic model of the use of a field effect transistor (FET) for THz/sub-THz radiation detection. It is based on the solution of the Maxwell equations in the gate dielectric, an expression for the channel current that takes into account both the drift and diffusion components, and the current continuity equation. For the regimes below threshold and above threshold at strong inversion, the response voltage, responsivity, wave impedance, and power of ohmic loss in the gate and channel have been found, and the electrical noise equivalent power (ENEP) has been estimated. The responsivity is orders of magnitude higher, and the ENEP orders of magnitude lower, below threshold than above it. Below threshold, the electromagnetic field in the gate oxide is identical to the field of plane waves in free space. At the same time, at strong inversion the charging of the gate capacitance through the channel resistance determines the electric field in the oxide.

  20. Linear No-Threshold model and standards for protection against radiation.

    PubMed

    Shamoun, Dima Yazji

    2016-06-01

    In response to the three petitions by Carol S. Marcus, Mark L. Miller, and Mohan Doss, dated February 9, February 13, and February 24, 2015, respectively, the Nuclear Regulatory Commission (NRC or the Commission) has announced that it is considering assessing its choice of dose-response model, the Linear No-Threshold (LNT) model, for exposure to ionizing radiation. This comment is designed to assist the Commission in evaluating the merits of a review of the default dose-response model it uses as the basis for the Standards for Protection against Radiation regulations. It extends the petitioners' argument in favor of reexamining the default hypothesis (LNT) and taking into consideration low-dose hormesis for two main reasons: 1) Failure to review the LNT hypothesis may jeopardize the NRC's mission to protect public health and safety; and 2) The National Research Council's guidelines for choosing adequate defaults indicate that the choice of low-dose default model is due for a reevaluation. PMID:26924276

  1. The intrapleural volume threshold for ultrasound detection of pneumothoraces: An experimental study on porcine models

    PubMed Central

    2013-01-01

    Background Small pneumothoraxes (PTXs) may not pose an immediate threat to trauma patients after chest injuries. However, the amount of pleural air may increase and become a concern for patients who require positive pressure ventilation or air ambulance transport. Lung ultrasonography (US) is a reliable tool in finding intrapleural air, but the performance characteristics regarding the detection of small PTXs need to be defined. The study aimed to define the volume threshold of intrapleural air at which PTXs are accurately diagnosed with US and compare this volume with that for chest x-ray (CXR). Methods Air was insufflated into a unilateral pleural catheter in seven incremental steps (10, 25, 50, 100, 200, 350 and 500 mL) in 20 intubated porcine models, followed by a diagnostic evaluation with US and a supine anteroposterior CXR. The sonographers continued the US scanning until the PTXs could be ruled in, based on the pathognomonic US “lung point” sign. The corresponding threshold volume was noted. A senior radiologist interpreted the CXR images. Results The mean threshold volume to confirm the diagnosis of PTX using US was 18 mL (standard deviation of 13 mL). Sixty-five percent of the PTXs were already diagnosed at 10 mL of intrapleural air; 25%, at 25 mL; and the last 10%, at 50 mL. At an air volume of 50 mL, the radiologist only identified four out of 20 PTXs in the CXR pictures; i.e., a sensitivity of 20% (95% CI: 7%, 44%). The sensitivity of CXR increased as a function of volume but leveled off at 67%, leaving one-third of the PTXs unidentified after 500 mL of insufflated air. Conclusion Lung US is very accurate in diagnosing even small amounts of intrapleural air and should be performed by clinicians treating chest trauma patients when PTX is among the differential diagnoses. PMID:23453044

  2. Centrifuge model study of thresholds for rainfall-induced landslides in sandy slopes

    NASA Astrophysics Data System (ADS)

    Matziaris, V.; Marshall, A. M.; Heron, C. M.; Yu, H.-S.

    2015-09-01

    Rainfall-induced landslides are very common natural disasters which cause damage to property and infrastructure and may result in the loss of human life. These phenomena often take place in unsaturated soil slopes and are triggered by the saturation of the soil profile due to rain infiltration, which leads to the decrease of effective stresses and loss of shear strength. The aim of this study is to determine rainfall thresholds for the initiation of landslides under different initial conditions. Model tests of rainfall-induced landslides were conducted on the Nottingham Centre for Geomechanics geotechnical centrifuge. Initially unsaturated plane-strain slope models made with fine silica sand were prepared at varying densities at 1g and accommodated within a centrifuge container with a rainfall simulator. During the centrifuge flight at 60g, rainfall events of varying intensity and duration, as well as variation of groundwater conditions, were applied to the slope models with the aim of initiating slope failure. This paper presents a discussion on the impact of soil state properties, rainfall characteristics, and groundwater conditions on slope behaviour and the initiation of slope instability.

  3. Marker-based monitoring of seated spinal posture using a calibrated single-variable threshold model.

    PubMed

    Walsh, Pauline; Dunne, Lucy E; Caulfield, Brian; Smyth, Barry

    2006-01-01

    This work, as part of a larger project developing wearable posture monitors for the work environment, seeks to monitor and model seated posture during computer use. A non-wearable marker-based optoelectronic motion capture system was used to monitor seated posture for ten healthy subjects during a calibration exercise and a typing task. Machine learning techniques were used to select overall spinal sagittal flexion as the best indicator of posture from a set of marker and vector variables. Overall flexion data from the calibration exercise were used to define a threshold model designed to classify posture for each subject, which was then applied to the typing task data. Results of the model were analysed visually by qualified physiotherapists with experience in ergonomics and posture analysis to confirm the accuracy of the calibration. The calibration formula was found to be accurate for 100% of subjects. This process will be used as a comparative measure in the evaluation of several wearable posture sensors, and to inform the design of the wearable system. PMID:17946301

  4. Global and local threshold in a metapopulational SEIR model with quarantine

    NASA Astrophysics Data System (ADS)

    Gomes, Marcelo F. C.; Rossi, Luca; Pastore Y Piontti, Ana; Vespignani, Alessandro

    2013-03-01

    Diseases which have the possibility of transmission before the onset of symptoms pose a challenging threat to healthcare since it is hard to track spreaders and implement quarantine measures. More precisely, one of the main concerns regarding the pandemic spreading of diseases is the prediction, and eventually the control, of local outbreaks that will trigger a global invasion of a particular disease. We present a metapopulation disease spreading model with transmission from both symptomatic and asymptomatic agents and analyze the role of quarantine measures and mobility processes between subpopulations. We show that, depending on the disease parameters, it is possible to separate the local and global thresholds in the parameter space and study the system behavior as a function of the fraction of asymptomatic transmissions. This means that it is possible to have a range of parameter values where, although we do not achieve local control of the outbreak, it is possible to control the global spread of the disease. We validate the analytic picture in a data-driven model that integrates commuting, air traffic flows and detailed information about population size and structure worldwide.
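
    For intuition, a minimal computation (Python) of a local reproduction number when a fraction of transmission comes from asymptomatic cases that escape quarantine; the functional form and parameter values are assumptions for illustration, and the global invasion threshold studied in the paper additionally involves the mobility network:

```python
def local_reproduction_number(beta, gamma, p_asym, q):
    """Basic reproduction number of a simple SEIR-like model in which
    asymptomatic cases (fraction p_asym) are never quarantined and a fraction
    q of symptomatic cases is quarantined immediately at symptom onset.

    beta  : transmission rate
    gamma : recovery rate (1/gamma is the infectious period)
    Under these simplifying assumptions,
        R0 = (beta/gamma) * (p_asym + (1 - p_asym) * (1 - q)),
    and a local outbreak requires R0 > 1.  Controlling the global spread
    additionally depends on mobility between subpopulations (not modeled here).
    """
    return (beta / gamma) * (p_asym + (1.0 - p_asym) * (1.0 - q))

# With a large asymptomatic fraction, even perfect quarantine of symptomatic
# cases can leave R0 above the local epidemic threshold.
for p_asym in (0.1, 0.4, 0.7):
    print(p_asym, local_reproduction_number(beta=0.6, gamma=0.25, p_asym=p_asym, q=1.0))
```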

  5. Adaptive Thresholds

    SciTech Connect

    Bremer, P. -T.

    2014-08-26

    ADAPT is a topological analysis code that allows the computation of local thresholds, in particular relevance-based thresholds, for features defined in scalar fields. The initial target application is vortex detection, but the software is more generally applicable to all threshold-based feature definitions.

  6. Time-course and dose-response relationships of imperatorin in the mouse maximal electroshock seizure threshold model.

    PubMed

    Luszczki, Jarogniew J; Glowniak, Kazimierz; Czuczwar, Stanislaw J

    2007-09-01

    This study was designed to evaluate the anticonvulsant effects of imperatorin (a furanocoumarin isolated from fruits of Angelica archangelica) in the mouse maximal electroshock seizure threshold model. The threshold for electroconvulsions in mice was determined at several times: 15, 30, 60 and 120 min after i.p. administration of imperatorin at increasing doses of 10, 20, 30, 40, 50 and 100 mg/kg. The evaluation of the time-course relationship for imperatorin in the maximal electroshock seizure threshold test revealed that the agent produced its maximum antielectroshock action at 30 min after its i.p. administration. In this case, imperatorin at doses of 50 and 100 mg/kg significantly raised the threshold for electroconvulsions in mice by 38 and 68% (P<0.05 and P<0.001), respectively. The antiseizure effects produced by imperatorin at 15, 60 and 120 min after its systemic (i.p.) administration were less pronounced than those observed for imperatorin injected 30 min before the maximal electroshock seizure threshold test. Based on this study, one can conclude that imperatorin produces an anticonvulsant effect in the maximal electroshock seizure threshold test in a dose-dependent manner. PMID:17602770

  7. Solving Cordelia's Dilemma: Threshold Concepts within a Punctuated Model of Learning

    ERIC Educational Resources Information Center

    Kinchin, Ian M.

    2010-01-01

    The consideration of threshold concepts is offered in the context of biological education as a theoretical framework that may have utility in the teaching and learning of biology at all levels. Threshold concepts may provide a mechanism to explain the observed punctuated nature of conceptual change. This perspective raises the profile of periods…

  8. The relationship between the Rating Scale and Partial Credit Models and the implication of disordered thresholds of the Rasch models for polytomous responses.

    PubMed

    Luo, Guanzhong

    2005-01-01

    There is a perception in the literature that the Rating Scale Model (RSM) and Partial Credit Model (PCM) are two different types of Rasch models. This paper clarifies the relationship between the RSM and PCM from the perspectives of literature history and mathematical logic. It is shown that not only are the RSM and the PCM identical, but the two approaches used to introduce them are statistically equivalent. Then the implication of disordered thresholds is discussed. In addition, the difference between the structural thresholds and the Thurstone thresholds is clarified. PMID:16192666
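
    In the conventional notation (included here for reference; it is not specific to this paper), the point can be written compactly:

```latex
% Partial Credit Model: probability that person n scores x on item i,
% with item-category thresholds \delta_{ik} and \delta_{i0} \equiv 0.
P(X_{ni}=x \mid \theta_n)
  = \frac{\exp \sum_{k=0}^{x} (\theta_n - \delta_{ik})}
         {\sum_{m=0}^{M_i} \exp \sum_{k=0}^{m} (\theta_n - \delta_{ik})}
% Rating Scale Model: the constrained special case in which all items share
% one threshold structure, \delta_{ik} = \delta_i + \tau_k, which is why the
% two are members of the same family rather than different types of Rasch model.
```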

  9. A study of jet fuel sooting tendency using the threshold sooting index (TSI) model

    SciTech Connect

    Yang, Yi; Boehman, Andre L.; Santoro, Robert J.

    2007-04-15

    Fuel composition can have a significant effect on soot formation during gas turbine combustion. Consequently, this paper contains a comprehensive review of the relationship between fuel hydrocarbon composition and soot formation in gas turbine combustors. Two levels of correlation are identified. First, lumped fuel composition parameters such as hydrogen content and smoke point, which are conventionally used to represent fuel sooting tendency, are correlated with soot formation in practical combustors. Second, detailed fuel hydrocarbon composition is correlated with these lumped parameters. The two-level correlation makes it possible to predict soot formation in practical combustors from basic fuel composition data. Threshold sooting index (TSI), which correlates linearly with the ratio of fuel molecular weight to smoke point in a diffusion flame, is proposed as a new lumped parameter for sooting tendency correlation. It is found that the TSI model correlates very well with hydrocarbon compositions over a wide range of fuel samples. Also, in predicting soot formation in actual combustors, the TSI model produces the best results overall in comparison with other previously reported correlating parameters, including hydrogen content, smoke point, and composite predictors containing more than one parameter.
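
    The TSI mentioned above is conventionally defined as a linear function of molecular weight divided by smoke point; a small Python sketch of that definition, where the apparatus-dependent constants a and b are placeholders to be fixed by reference fuels rather than values from this study:

```python
def threshold_sooting_index(molecular_weight, smoke_point_mm, a=3.0, b=1.0):
    """Threshold sooting index: TSI = a * (MW / SP) + b.

    molecular_weight : fuel (or blend-average) molecular weight, g/mol
    smoke_point_mm   : measured smoke point, mm
    a, b             : apparatus-dependent constants, chosen so that two
                       reference fuels reproduce their assigned TSI values;
                       the values used here are placeholders.
    Higher TSI means a higher sooting tendency.
    """
    return a * molecular_weight / smoke_point_mm + b

# For a blend, TSI is commonly taken as the mole-fraction-weighted average
# of the component TSIs.
print(threshold_sooting_index(170.0, 25.0))
```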

  10. Myeloid conditional deletion and transgenic models reveal a threshold for the neutrophil survival factor Serpinb1.

    PubMed

    Burgener, Sabrina S; Baumann, Mathias; Basilico, Paola; Remold-O'Donnell, Eileen; Touw, Ivo P; Benarafa, Charaf

    2016-09-01

    Serpinb1 is an inhibitor of neutrophil granule serine proteases cathepsin G, proteinase-3 and elastase. One of its core physiological functions is to protect neutrophils from granule protease-mediated cell death. Mice lacking Serpinb1a (Sb1a-/-), its mouse ortholog, have reduced bone marrow neutrophil numbers due to cell death mediated by cathepsin G and the mice show increased susceptibility to lung infections. Here, we show that conditional deletion of Serpinb1a using the Lyz2-cre and Cebpa-cre knock-in mice effectively leads to recombination-mediated deletion in neutrophils but protein-null neutrophils were only obtained using the latter recombinase-expressing strain. Absence of Serpinb1a protein in neutrophils caused neutropenia and increased granule permeabilization-induced cell death. We then generated transgenic mice expressing human Serpinb1 in neutrophils under the human MRP8 (S100A8) promoter. Serpinb1a expression levels in founder lines correlated positively with increased neutrophil survival when crossed with Sb1a-/- mice, which had their defective neutrophil phenotype rescued in the higher expressing transgenic line. Using new conditional and transgenic mouse models, our study demonstrates the presence of a relatively low Serpinb1a protein threshold in neutrophils that is required for sustained survival. These models will also be helpful in delineating recently described functions of Serpinb1 in metabolism and cancer. PMID:27107834

  11. Extrapolation of extreme sea levels: incorporation of Over-Threshold-Modeling to the Joint Probability Method

    NASA Astrophysics Data System (ADS)

    Mazas, Franck; Hamm, Luc; Kergadallan, Xavier

    2013-04-01

    In France, the storm Xynthia of February 27-28th, 2010 reminded engineers and stakeholders of the necessity for an accurate estimation of extreme sea levels for the risk assessment in coastal areas. Traditionally, two main approaches exist for the statistical extrapolation of extreme sea levels: the direct approach performs a direct extrapolation on the sea level data, while the indirect approach carries out a separate analysis of the deterministic component (astronomical tide) and stochastic component (meteorological residual, or surge). When the tidal component is large compared with the surge one, the latter approach is known to perform better. In this approach, the statistical extrapolation is performed on the surge component, and then the distribution of extreme sea levels is obtained by convolution of the tide and surge distributions. This model is often referred to as the Joint Probability Method. Different models from univariate extreme value theory have been applied in the past for extrapolating extreme surges, in particular the Annual Maxima Method (AMM) and the r-largest method. In this presentation, we apply the Peaks-Over-Threshold (POT) approach for declustering extreme surge events, coupled with the Poisson-GPD model for fitting extreme surge peaks. This methodology allows a sound estimation of both lower and upper tails of the stochastic distribution, including the estimation of the uncertainties associated with the fit by computing the confidence intervals. After convolution with the tide signal, the model yields the distribution for the whole range of possible sea level values. Particular attention is paid to the necessary distinction between sea level values observed at a regular time step, such as hourly, and sea level events, such as those occurring during a storm. Extremal indices for both surges and levels are thus introduced. This methodology will be illustrated with a case study at Brest, France.
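
    A minimal sketch of the Peaks-Over-Threshold step of such an analysis (Python with scipy); the synthetic surge peaks, the threshold choice and the declustering are assumptions for illustration, and the convolution with the tidal distribution performed in the Joint Probability Method is not shown:

```python
import numpy as np
from scipy.stats import genpareto

def fit_pot(surge_peaks, threshold, years_of_record):
    """Fit a Poisson-GPD model to declustered surge peaks above a threshold.

    surge_peaks     : independent (declustered) surge peak values
    threshold       : POT threshold u
    years_of_record : record length, used to get the annual exceedance rate
    Returns the GPD shape, scale and the annual rate of exceedances.
    """
    exceedances = surge_peaks[surge_peaks > threshold] - threshold
    shape, _, scale = genpareto.fit(exceedances, floc=0.0)
    rate = len(exceedances) / years_of_record
    return shape, scale, rate

def return_level(threshold, shape, scale, rate, return_period_years):
    """Surge level exceeded on average once per return period (GPD inversion)."""
    m = rate * return_period_years
    if abs(shape) < 1e-6:
        return threshold + scale * np.log(m)
    return threshold + (scale / shape) * (m ** shape - 1.0)

# Example with synthetic exponential-tailed surges over a 30-year record.
rng = np.random.default_rng(2)
peaks = rng.exponential(0.15, 600)            # assumed surge peaks, in metres
shape, scale, rate = fit_pot(peaks, 0.3, 30.0)
print(return_level(0.3, shape, scale, rate, 100.0))
```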

  12. T Lymphocyte Activation Threshold and Membrane Reorganization Perturbations in Unique Culture Model

    NASA Technical Reports Server (NTRS)

    Adams, C. L.; Sams, C. F.

    2000-01-01

    Quantitative activation thresholds and cellular membrane reorganization are mechanisms by which resting T cells modulate their response to activating stimuli. Here we demonstrate perturbations of these cellular processes in a unique culture system that non-invasively inhibits T lymphocyte activation. During clinorotation, the T cell activation threshold is increased 5-fold. This increased threshold involves a mechanism independent of TCR triggering. Recruitment of lipid rafts to the activation site is impaired during clinorotation but does occur with increased stimulation. This study describes a situation in which an individual cell senses a change in its physical environment and alters its cell biological behavior.

  13. Improving Landslide Susceptibility Modeling Using an Empirical Threshold Scheme for Excluding Landslide Deposition

    NASA Astrophysics Data System (ADS)

    Tsai, F.; Lai, J. S.; Chiang, S. H.

    2015-12-01

    Landslides are frequently triggered by typhoons and earthquakes in Taiwan, causing serious economic losses and human casualties. Remotely sensed images and geo-spatial data consisting of land-cover and environmental information have been widely used for producing landslide inventories and causative factors for slope stability analysis. Landslide susceptibility, on the other hand, can represent the spatial likelihood of landslide occurrence and is an important basis for landslide risk assessment. As multi-temporal satellite images become popular and affordable, they are commonly used to generate landslide inventories for subsequent analysis. However, it is usually difficult to distinguish different landslide sub-regions (scarp, debris flow, deposition etc.) directly from remote sensing imagery. Consequently, the extracted landslide extents using image-based visual interpretation and automatic detections may contain many depositions that may reduce the fidelity of the landslide susceptibility model. This study developed an empirical thresholding scheme based on terrain characteristics for eliminating depositions from detected landslide areas to improve landslide susceptibility modeling. In this study, a Bayesian network classifier is utilized to build a landslide susceptibility model and to predict subsequent rainfall-induced shallow landslides in the Shimen reservoir watershed located in northern Taiwan. Eleven causative factors are considered, including terrain slope, aspect, curvature, elevation, geology, land-use, NDVI, soil, and distance to fault, river and road. Landslide areas detected using satellite images acquired before and after eight typhoons between 2004 and 2008 are collected as the main inventory for training and verification. In the analysis, previous landslide events are used as training data to predict the samples of the next event. The results are then compared with recorded landslide areas in the inventory to evaluate the accuracy. Experimental results

  14. Analytic Model for Description of Above-Threshold Ionization by an Intense, Short Laser Pulse

    NASA Astrophysics Data System (ADS)

    Starace, Anthony F.; Frolov, M. V.; Knyazeva, D. V.; Manakov, N. L.; Geng, J.-W.; Peng, L.-Y.

    2015-05-01

    We present an analytic model for above-threshold ionization (ATI) of an atom by an intense, linearly-polarized short laser pulse. Our quantum analysis provides closed-form formulas for the differential probability of ATI, with amplitudes given by a coherent sum of partial amplitudes describing ionization by neighboring optical cycles near the peak of the intensity envelope of a short laser pulse. These analytic results explain key features of short-pulse ATI spectra, such as the left-right asymmetry in the ionized electron angular distribution, the multi-plateau structures, and both large-scale and fine-scale oscillation patterns resulting from quantum interferences of electron trajectories. The ATI spectrum in the middle part of the ATI plateau is shown to be sensitive to the spatial symmetry of the initial bound state of the active electron owing to contributions from multiple-return electron trajectories. An extension of our analytic formulas to real atoms provides results that are in good agreement with results of numerical solutions of the time-dependent Schrödinger equation for He and Ar atoms. Research supported in part by NSF Grant No. PHY-1208059, by RFBR Grant No. 13-02-00420, by Ministry of Ed. & Sci. of the Russian Fed. Proj. No. 1019, by NNSFC Grant Nos. 11322437, 11174016, and 11121091, and by the Dynasty Fdn. (MVF & DVK).

  15. High-precision percolation thresholds and Potts-model critical manifolds from graph polynomials

    NASA Astrophysics Data System (ADS)

    Jacobsen, Jesper Lykke

    2014-04-01

    The critical curves of the q-state Potts model can be determined exactly for regular two-dimensional lattices G that are of the three-terminal type. This comprises the square, triangular, hexagonal and bow-tie lattices. Jacobsen and Scullard have defined a graph polynomial P_B(q, v) that gives access to the critical manifold for general lattices. It depends on a finite repeating part of the lattice, called the basis B, and its real roots in the temperature variable v = e^K - 1 provide increasingly accurate approximations to the critical manifolds upon increasing the size of B. Using transfer matrix techniques, these authors computed P_B(q, v) for large bases (up to 243 edges), obtaining determinations of the ferromagnetic critical point v_c > 0 for the (4, 8^2), kagome, and (3, 12^2) lattices to a precision (of the order 10^-8) slightly superior to that of the best available Monte Carlo simulations. In this paper we describe a more efficient transfer matrix approach to the computation of P_B(q, v) that relies on a formulation within the periodic Temperley-Lieb algebra. This makes possible computations for substantially larger bases (up to 882 edges), and the precision on v_c is hence taken to the range 10^-13. We further show that a large variety of regular lattices can be cast in a form suitable for this approach. This includes all Archimedean lattices, their duals and their medials. For all these lattices we tabulate high-precision estimates of the bond percolation thresholds p_c and Potts critical points v_c. We also trace and discuss the full Potts critical manifold in the (q, v) plane, paying special attention to the antiferromagnetic region v < 0. Finally, we adapt the technique to site percolation as well, and compute the polynomials P_B(p) for certain Archimedean and dual lattices (those having only cubic and quartic vertices), using very large bases (up to 243 vertices). This produces the site percolation thresholds p_c to a precision of the order of 10^-9.

  16. Modeling direction discrimination thresholds for yaw rotations around an earth-vertical axis for arbitrary motion profiles.

    PubMed

    Soyka, Florian; Giordano, Paolo Robuffo; Barnett-Cowan, Michael; Bülthoff, Heinrich H

    2012-07-01

    Understanding the dynamics of vestibular perception is important, for example, for improving the realism of motion simulation and virtual reality environments or for diagnosing patients suffering from vestibular problems. Previous research has found a dependence of direction discrimination thresholds for rotational motions on the period length (inverse frequency) of a transient (single cycle) sinusoidal acceleration stimulus. However, self-motion is seldom purely sinusoidal, and up to now, no models have been proposed that take into account non-sinusoidal stimuli for rotational motions. In this work, the influence of both the period length and the specific time course of an inertial stimulus is investigated. Thresholds for three acceleration profile shapes (triangular, sinusoidal, and trapezoidal) were measured for three period lengths (0.3, 1.4, and 6.7 s) in ten participants. A two-alternative forced-choice discrimination task was used where participants had to judge if a yaw rotation around an earth-vertical axis was leftward or rightward. The peak velocity of the stimulus was varied, and the threshold was defined as the stimulus yielding 75 % correct answers. In accordance with previous research, thresholds decreased with shortening period length (from ~2 deg/s for 6.7 s to ~0.8 deg/s for 0.3 s). The peak velocity was the determining factor for discrimination: Different profiles with the same period length have similar velocity thresholds. These measurements were used to fit a novel model based on a description of the firing rate of semi-circular canal neurons. In accordance with previous research, the estimates of the model parameters suggest that velocity storage does not influence perceptual thresholds. PMID:22623095

  17. Transfer model of lead in soil-carrot (Daucus carota L.) system and food safety thresholds in soil.

    PubMed

    Ding, Changfeng; Li, Xiaogang; Zhang, Taolin; Wang, Xingxiang

    2015-09-01

    Reliable empirical models describing lead (Pb) transfer in soil-plant systems are needed to improve soil environmental quality standards. A greenhouse experiment was conducted to develop soil-plant transfer models to predict Pb concentrations in carrot (Daucus carota L.). Soil thresholds for food safety were then derived inversely using the prediction model in view of the maximum allowable limit for Pb in food. The 2 most important soil properties that influenced carrot Pb uptake factor (ratio of Pb concentration in carrot to that in soil) were soil pH and cation exchange capacity (CEC), as revealed by path analysis. Stepwise multiple linear regression models were based on soil properties and the pseudo total (aqua regia) or extractable (0.01 M CaCl2 and 0.005 M diethylenetriamine pentaacetic acid) soil Pb concentrations. Carrot Pb contents were best explained by the pseudo total soil Pb concentrations in combination with soil pH and CEC, with the percentage of variation explained being up to 93%. The derived soil thresholds based on added Pb (total soil Pb with the geogenic background part subtracted) have the advantage of better applicability to soils with high natural background Pb levels. Validation of the thresholds against data from field trials and literature studies indicated that the proposed thresholds are reasonable and reliable. PMID:25904232
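    A minimal sketch of the inverse derivation of a soil threshold is given below, assuming an illustrative log-log regression of carrot Pb on soil Pb, pH and CEC. The coefficients and the food-safety limit are placeholders, not the fitted values reported in the paper.

```python
import numpy as np

# log10(Pb_carrot) = a + b*log10(Pb_soil) + c*pH + d*CEC   (illustrative form only)
a, b, c, d = -0.5, 0.8, -0.15, -0.02
food_limit = 0.1          # mg/kg, hypothetical maximum allowable Pb level in carrot

def soil_threshold(pH, cec):
    """Soil Pb (mg/kg) at which the predicted carrot Pb reaches the food limit."""
    log_soil = (np.log10(food_limit) - a - c * pH - d * cec) / b
    return 10.0 ** log_soil

for pH in (5.0, 6.5, 8.0):
    print(f"pH={pH:.1f}, CEC=15: soil threshold ≈ {soil_threshold(pH, 15.0):.1f} mg/kg")
```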

  18. A Violation of the Conditional Independence Assumption in the Two-High-Threshold Model of Recognition Memory

    ERIC Educational Resources Information Center

    Chen, Tina; Starns, Jeffrey J.; Rotello, Caren M.

    2015-01-01

    The 2-high-threshold (2HT) model of recognition memory assumes that test items result in distinct internal states: they are either detected or not, and the probability of responding at a particular confidence level that an item is "old" or "new" depends on the state-response mapping parameters. The mapping parameters are…

  19. Applying the Rasch Model for Ordered Categories To Assess the Relationship between Response Choice Content and Category Threshold Disorder.

    ERIC Educational Resources Information Center

    Popp, Sharon Osborn; Behrens, John T.; Ryan, Joseph M.; Hess, Robert K.

    The Rasch model for ordered categories was applied to responses on a science attitude survey that uses a combined semantic differential and Likert-type scale format. Data were drawn from the Views about Science Survey for 1,300 high school students. Examination of category response function graphs and threshold estimates allowed classification of…

  20. Use of a threshold animal model to estimate calving ease and stillbirth (co)variance components for US Holsteins

    Technology Transfer Automated Retrieval System (TEKTRAN)

    (Co)variance components for calving ease and stillbirth in US Holsteins were estimated using a single-trait threshold animal model and two different sets of data edits. Six sets of approximately 250,000 records each were created by randomly selecting herd codes without replacement from the data used...

  1. Predicting Bed Grain Size in Threshold Channels Using Lidar Digital Elevation Models

    NASA Astrophysics Data System (ADS)

    Snyder, N. P.; Nesheim, A. O.; Wilkins, B. C.; Edmonds, D. A.

    2011-12-01

    Over the past 20 years, researchers have developed GIS-based algorithms to extract channel networks and measure longitudinal profiles from digital elevation models (DEMs), and have used these to study stream morphology in relation to tectonics, climate and ecology. The accuracy of stream elevations from traditional DEMs (10-50 m pixels) is typically limited by the contour interval (3-20 m) of the rasterized topographic map source. This is a particularly severe limitation in low-relief watersheds, where 3 m of channel elevation change may occur over several km. Lidar DEMs (~1 m pixels) allow researchers to resolve channel elevation changes of ~0.5 m, enabling reach-scale calculations of gradient, which is the most important parameter for understanding channel processes at that scale. Lidar DEMs have the additional advantage of allowing users to make estimates of channel width. We present a process-based model that predicts median bed grain size in threshold gravel-bed channels from lidar slope and width measurements using the Shields and Manning equations. We compare these predictions to field grain size measurements in segments of three Maine rivers. Like many paraglacial rivers, these have longitudinal profiles characterized by relatively steep (gradient >0.002) and flat (gradient <0.0005) segments, with length scales of several km. This heterogeneity corresponds to strong variations in channel form, sediment supply, bed grain size, and aquatic habitat characteristics. The model correctly predicts bed sediment size within a factor of two in ~70% of the study sites. The model works best in single-thread channels with relatively low sediment supply, and poorly in depositional, multi-thread and/or fine (median grain size <20 mm) reaches. We evaluate the river morphology (using field and lidar measurements) in the context of the Parker et al. (2007) hydraulic geometry relations for single-thread gravel-bed rivers, and find correspondence in the locations where both
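    The prediction step outlined above (Manning's equation for flow depth, the Shields criterion for threshold of motion) can be sketched as follows. The discharge, Manning roughness, and critical Shields number are illustrative assumptions; the study's calibrated values may differ.

```python
RHO_W, RHO_S, G = 1000.0, 2650.0, 9.81   # water density, sediment density (kg/m^3), gravity (m/s^2)

def predict_d50(slope, width, discharge, n_manning=0.035, tau_star_c=0.045):
    """Median grain size (m) for a threshold gravel-bed channel."""
    # Manning, wide-channel approximation: Q = (1/n) * w * h^(5/3) * S^(1/2)
    depth = (n_manning * discharge / (width * slope ** 0.5)) ** (3.0 / 5.0)
    # boundary shear stress at this flow
    tau = RHO_W * G * depth * slope
    # Shields criterion at threshold of motion: tau = tau*_c * (rho_s - rho_w) * g * D50
    return tau / (tau_star_c * (RHO_S - RHO_W) * G)

# lidar-derived reach slope and width, plus an assumed channel-forming discharge
d50 = predict_d50(slope=0.003, width=30.0, discharge=100.0)
print(f"predicted D50 ≈ {d50 * 1000:.0f} mm")
```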

  2. Evaluation of the pentylenetetrazole seizure threshold test in epileptic mice as surrogate model for drug testing against pharmacoresistant seizures.

    PubMed

    Töllner, Kathrin; Twele, Friederike; Löscher, Wolfgang

    2016-04-01

    Resistance to antiepileptic drugs (AEDs) is a major problem in epilepsy therapy, so that development of more effective AEDs is an unmet clinical need. Several rat and mouse models of epilepsy with spontaneous difficult-to-treat seizures exist, but because testing of antiseizure drug efficacy is extremely laborious in such models, they are only rarely used in the development of novel AEDs. Recently, the use of acute seizure tests in epileptic rats or mice has been proposed as a novel strategy for evaluating novel AEDs for increased antiseizure efficacy. In the present study, we compared the effects of five AEDs (valproate, phenobarbital, diazepam, lamotrigine, levetiracetam) on the pentylenetetrazole (PTZ) seizure threshold in mice that were made epileptic by pilocarpine. Experiments were started 6 weeks after a pilocarpine-induced status epilepticus. At this time, control seizure threshold was significantly lower in epileptic than in nonepileptic animals. Unexpectedly, only one AED (valproate) was less effective at increasing the seizure threshold in epileptic vs. nonepileptic mice, and this difference was restricted to doses of 200 and 300 mg/kg, whereas the difference disappeared at 400 mg/kg. All other AEDs exerted similar seizure threshold increases in epileptic and nonepileptic mice. Thus, induction of acute seizures with PTZ in mice pretreated with pilocarpine does not provide an effective and valuable surrogate method to screen drugs for antiseizure efficacy in a model of difficult-to-treat chronic epilepsy as previously suggested from experiments with this approach in rats. PMID:26930359

  3. Implementation and assessment of the mechanical-threshold-stress model using the EPIC2 and PINON computer codes

    SciTech Connect

    Maudlin, P.J.; Davidson, R.F.; Henninger, R.J.

    1990-09-01

    A flow-stress constitutive model based on dislocation mechanics has been implemented in the EPIC2 and PINON continuum mechanics codes. This model provides a better understanding of the plastic deformation process for ductile materials by using an internal state variable called the mechanical threshold stress. This kinematic quantity tracks the evolution of the material's microstructure along some arbitrary strain, strain-rate, and temperature-dependent path using a differential form that balances dislocation generation and recovery processes. Given a value for the mechanical threshold stress, the flow stress is determined using either a thermal-activation-controlled or a drag-controlled kinetics relationship. We evaluated the performance of the Mechanical Threshold Stress (MTS) model in terms of accuracy and computational resources through a series of assessment problems chosen to exercise the model over a large range of strain rates and strains. Our calculations indicate that the more complicated MTS model is reasonable in terms of computational resources when compared with other models in common hydrocode use. In terms of accuracy, these simulations show that the MTS model is superior for problems containing mostly normal strain with shear strains less than 0.2 but perhaps not as accurate for problems that contain large amounts of shear strain. 29 refs., 33 figs., 9 tabs.

  4. Genetic parameters for calving rate and calf survival from linear, threshold, and logistic models in a multibreed beef cattle population.

    PubMed

    Guerra, J L L; Franke, D E; Blouin, D C

    2006-12-01

    Generalized mixed linear, threshold, and logistic sire models and Markov chain Monte Carlo simulation procedures were used to estimate genetic parameters for calving rate and calf survival in a multibreed beef cattle population. Data were obtained from a 5-generation rotational crossbreeding study involving Angus, Brahman, Charolais, and Hereford (1969 to 1995). Gelbvieh and Simmental bulls sired terminal-cross calves from a sample of generation 5 cows. A total of 1,458 cows sired by 158 bulls had a mean calving rate of 78% based on 4,808 calving records. Ninety-one percent of 5,015 calves sired by 260 bulls survived to weaning. Mean heritability estimates and standard deviations for daughter calving rate from posterior distributions were 0.063 +/- 0.024, 0.150 +/- 0.049, and 0.130 +/- 0.047 for linear, threshold, and logistic models, respectively. For calf survival, mean heritability estimates and standard deviations from posterior distributions were 0.049 +/- 0.022, 0.160 +/- 0.058, and 0.190 +/- 0.078 from linear, threshold, and logistic models, respectively. When transformed to an underlying normal scale, linear sire mixed model heritability estimates were similar to threshold and logistic sire mixed model estimates. Posterior density distributions of estimated heritabilities from all models were normal. Spearman rank correlations between sire EPD across statistical models were greater than 0.97 for daughter calving rate and for calf survival. Sire EPD had similar ranges across statistical models for daughter calving rate and for calf survival. PMID:17093211

  5. Deciphering and modeling interconnections in ecohydrology: The role of scale, thresholds and stochastic storage processes

    NASA Astrophysics Data System (ADS)

    Bartlett, M. S.; McDonnell, J. J.; Porporato, A. M.

    2013-12-01

    Several components of ecohydrological systems are characterized by an interplay of stochastic inputs, finite capacity storage, and nonlinear, threshold-like losses, resulting in a complex partitioning of the rainfall input between the different basin scales. With the goal of more accurate predictions of rainfall partitioning and threshold effects in ecohydrology, we examine ecohydrological processes at the various scales, including canopy interception, soil storage with runoff/percolation, hillslope filling-spilling mechanisms, and the related groundwater recharge and baseflow contribution to streamflow. We apply a probabilistic approach to a hierarchical arrangement of cascading reservoirs that are representative of the components of the basin system. The analytical results of this framework help single out the key parameters controlling the partitioning of rainfall within the storage compartments of river basins. This theoretical framework is a useful learning tool for exploring the physical meaning of known thresholds in ecohydrology.

  6. Representation of Vegetation and Other Nonerodible Elements in Aeolian Shear Stress Partitioning Models for Predicting Transport Threshold

    NASA Technical Reports Server (NTRS)

    King, James; Nickling, William G.; Gillies, John A.

    2005-01-01

    The presence of nonerodible elements is well understood to be a reducing factor for soil erosion by wind, but the limits of its protection of the surface and erosion threshold prediction are complicated by the varying geometry, spatial organization, and density of the elements. The predictive capabilities of the most recent models for estimating wind driven particle fluxes are reduced because of the poor representation of the effectiveness of vegetation to reduce wind erosion. Two approaches have been taken to account for roughness effects on sediment transport thresholds. Marticorena and Bergametti (1995) in their dust emission model parameterize the effect of roughness on threshold with the assumption that there is a relationship between roughness density and the aerodynamic roughness length of a surface. Raupach et al. (1993) offer a different approach based on physical modeling of wake development behind individual roughness elements and the partition of the surface stress and the total stress over a roughened surface. A comparison between the models shows the partitioning approach to be a good framework to explain the effect of roughness on entrainment of sediment by wind. Both models provided very good agreement for wind tunnel experiments using solid objects on a nonerodible surface. However, the Marticorena and Bergametti (1995) approach displays a scaling dependency when the difference between the roughness length of the surface and the overall roughness length is too great, while the Raupach et al. (1993) model's predictions perform better owing to the incorporation of the roughness geometry and the alterations to the flow they can cause.
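    For reference, one commonly quoted form of the Raupach et al. (1993) partition result expresses how much the threshold friction velocity rises with roughness density λ; the sketch below uses illustrative default values for the drag-coefficient ratio β, the basal-to-frontal area ratio σ, and the exposure parameter m, not values fitted in this comparison.

```python
import math

def threshold_ratio(lam, beta=90.0, sigma=1.0, m=0.5):
    """u*_t(rough) / u*_t(bare) as a function of roughness density lambda (illustrative defaults)."""
    return math.sqrt((1.0 - m * sigma * lam) * (1.0 + m * beta * lam))

for lam in (0.0, 0.01, 0.05, 0.10):
    print(f"lambda={lam:.2f}: threshold friction velocity increased by factor {threshold_ratio(lam):.2f}")
```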

  7. Discrimination thresholds of normal and anomalous trichromats: Model of senescent changes in ocular media density on the Cambridge Colour Test.

    PubMed

    Shinomori, Keizo; Panorgias, Athanasios; Werner, John S

    2016-03-01

    Age-related changes in chromatic discrimination along dichromatic confusion lines were measured with the Cambridge Colour Test (CCT). One hundred and sixty-two individuals (16 to 88 years old) with normal Rayleigh matches were the major focus of this paper. An additional 32 anomalous trichromats classified by their Rayleigh matches were also tested. All subjects were screened to rule out abnormalities of the anterior and posterior segments. Thresholds on all three chromatic vectors measured with the CCT showed age-related increases. Protan and deutan vector thresholds increased linearly with age while the tritan vector threshold was described with a bilinear model. Analysis and modeling demonstrated that the nominal vectors of the CCT are shifted by senescent changes in ocular media density, and a method for correcting the CCT vectors is demonstrated. A correction for these shifts indicates that classification among individuals of different ages is unaffected. New vector thresholds for elderly observers and for all age groups are suggested based on calculated tolerance limits. PMID:26974943

  8. Model for Estimating the Threshold Mechanical Stability of Structural Cartilage Grafts Used in Rhinoplasty

    PubMed Central

    Zemek, Allison; Garg, Rohit; Wong, Brian J. F.

    2014-01-01

    Objectives/Hypothesis Characterizing the mechanical properties of structural cartilage grafts used in rhinoplasty is valuable because softer engineered tissues are more time- and cost-efficient to manufacture. The aim of this study is to quantitatively identify the threshold mechanical stability (e.g., Young’s modulus) of columellar, L-strut, and alar cartilage replacement grafts. Study Design Descriptive, focus group survey. Methods Ten mechanical phantoms of identical size (5 × 20 × 2.3 mm) and varying stiffness (0.360 to 0.85 MPa in 0.05 MPa increments) were made from urethane. A focus group of experienced rhinoplasty surgeons (n = 25, 5 to 30 years in practice) were asked to arrange the phantoms in order of increasing stiffness. Then, they were asked to identify the minimum acceptable stiffness that would still result in favorable surgical outcomes for three clinical applications: columellar, L-strut, and lateral crural replacement grafts. Available surgeons were tested again after 1 week to evaluate intra-rater consistency. Results For each surgeon, the threshold stiffness for each clinical application differed from the threshold values derived by logistic regression by no more than 0.05 MPa (accuracy to within 10%). Specific thresholds were 0.56, 0.59, and 0.49 MPa for columellar, L-strut, and alar grafts, respectively. For comparison, human nasal septal cartilage is approximately 0.8 MPa. Conclusions There was little inter- and intra-rater variation of the identified threshold values for adequate graft stiffness. The identified threshold values will be useful for the design of tissue-engineered or semisynthetic cartilage grafts for use in structural nasal surgery. PMID:20513022

  9. Bayesian Threshold Estimation

    ERIC Educational Resources Information Center

    Gustafson, S. C.; Costello, C. S.; Like, E. C.; Pierce, S. J.; Shenoy, K. N.

    2009-01-01

    Bayesian estimation of a threshold time (hereafter simply threshold) for the receipt of impulse signals is accomplished given the following: 1) data, consisting of the number of impulses received in a time interval from zero to one and the time of the largest time impulse; 2) a model, consisting of a uniform probability density of impulse time…

  10. PT-breaking threshold in spatially asymmetric Aubry-André and Harper models: Hidden symmetry and topological states

    NASA Astrophysics Data System (ADS)

    Harter, Andrew K.; Lee, Tony E.; Joglekar, Yogesh N.

    2016-06-01

    Aubry-André-Harper lattice models, characterized by a reflection-asymmetric sinusoidally varying nearest-neighbor tunneling profile, are well known for their topological properties. We consider the fate of such models in the presence of balanced gain and loss potentials ±iγ located at reflection-symmetric sites. We predict that these models have a finite PT-breaking threshold only for specific locations of the gain-loss potential and uncover a hidden symmetry that is instrumental to the finite threshold strength. We also show that the topological edge states remain robust in the PT-symmetry-broken phase. Our predictions substantially broaden the possible experimental realizations of a PT-symmetric system.

  11. Discontinuous non-equilibrium phase transition in a threshold Schloegl model for autocatalysis: Generic two-phase coexistence and metastability

    SciTech Connect

    Wang, Chi -Jen; Liu, Da -Jiang; Evans, James W.

    2015-04-28

    Threshold versions of Schloegl’s model on a lattice, which involve autocatalytic creation and spontaneous annihilation of particles, can provide a simple prototype for discontinuous non-equilibrium phase transitions. These models are equivalent to so-called threshold contact processes. A discontinuous transition between populated and vacuum states can occur selecting a threshold of N ≥ 2 for the minimum number, N, of neighboring particles enabling autocatalytic creation at an empty site. Fundamental open questions remain given the lack of a thermodynamic framework for analysis. For a square lattice with N = 2, we show that phase coexistence occurs not at a unique value but for a finite range of particle annihilation rate (the natural control parameter). This generic two-phase coexistence also persists when perturbing the model to allow spontaneous particle creation. Such behavior contrasts both the Gibbs phase rule for thermodynamic systems and also previous analysis for this model. We find metastability near the transition corresponding to a non-zero effective line tension, also contrasting previously suggested critical behavior. As a result, mean-field type analysis, extended to treat spatially heterogeneous states, further elucidates model behavior.

  12. Discontinuous non-equilibrium phase transition in a threshold Schloegl model for autocatalysis: Generic two-phase coexistence and metastability

    DOE PAGESBeta

    Wang, Chi -Jen; Liu, Da -Jiang; Evans, James W.

    2015-04-28

    Threshold versions of Schloegl’s model on a lattice, which involve autocatalytic creation and spontaneous annihilation of particles, can provide a simple prototype for discontinuous non-equilibrium phase transitions. These models are equivalent to so-called threshold contact processes. A discontinuous transition between populated and vacuum states can occur selecting a threshold of N ≥ 2 for the minimum number, N, of neighboring particles enabling autocatalytic creation at an empty site. Fundamental open questions remain given the lack of a thermodynamic framework for analysis. For a square lattice with N = 2, we show that phase coexistence occurs not at a unique value but for a finite range of particle annihilation rate (the natural control parameter). This generic two-phase coexistence also persists when perturbing the model to allow spontaneous particle creation. Such behavior contrasts both the Gibbs phase rule for thermodynamic systems and also previous analysis for this model. We find metastability near the transition corresponding to a non-zero effective line tension, also contrasting previously suggested critical behavior. As a result, mean-field type analysis, extended to treat spatially heterogeneous states, further elucidates model behavior.

  13. Discontinuous non-equilibrium phase transition in a threshold Schloegl model for autocatalysis: Generic two-phase coexistence and metastability

    SciTech Connect

    Wang, Chi-Jen; Liu, Da-Jiang; Evans, James W.

    2015-04-28

    Threshold versions of Schloegl’s model on a lattice, which involve autocatalytic creation and spontaneous annihilation of particles, can provide a simple prototype for discontinuous non-equilibrium phase transitions. These models are equivalent to so-called threshold contact processes. A discontinuous transition between populated and vacuum states can occur selecting a threshold of N ≥ 2 for the minimum number, N, of neighboring particles enabling autocatalytic creation at an empty site. Fundamental open questions remain given the lack of a thermodynamic framework for analysis. For a square lattice with N = 2, we show that phase coexistence occurs not at a unique value but for a finite range of particle annihilation rate (the natural control parameter). This generic two-phase coexistence also persists when perturbing the model to allow spontaneous particle creation. Such behavior contrasts both the Gibbs phase rule for thermodynamic systems and also previous analysis for this model. We find metastability near the transition corresponding to a non-zero effective line tension, also contrasting previously suggested critical behavior. Mean-field type analysis, extended to treat spatially heterogeneous states, further elucidates model behavior.
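    A discrete-time caricature of the threshold (N ≥ 2) contact process described in these records is sketched below; it is not the authors' continuous-time simulation, and the lattice size, sweep count, and annihilation rates are illustrative only.

```python
import numpy as np

def coverage(L=32, p_ann=0.05, sweeps=400, seed=1):
    """Discrete-time caricature of the threshold (N >= 2) contact process on a periodic square lattice."""
    rng = np.random.default_rng(seed)
    occ = np.ones((L, L), dtype=np.int8)          # start in the populated state
    for _ in range(sweeps):
        for _ in range(L * L):                    # one sweep = L*L random site updates
            i, j = rng.integers(0, L, size=2)
            if occ[i, j]:
                if rng.random() < p_ann:          # spontaneous annihilation
                    occ[i, j] = 0
            else:
                nn = (occ[(i + 1) % L, j] + occ[(i - 1) % L, j]
                      + occ[i, (j + 1) % L] + occ[i, (j - 1) % L])
                if nn >= 2:                       # autocatalytic creation needs >= 2 occupied neighbors
                    occ[i, j] = 1
        if not occ.any():                         # the vacuum state is absorbing
            break
    return occ.mean()

for p in (0.02, 0.08, 0.20):
    print(f"annihilation rate {p:.2f}: steady-state coverage ≈ {coverage(p_ann=p):.2f}")
```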

  14. Analytic model for the description of above-threshold ionization by an intense short laser pulse

    NASA Astrophysics Data System (ADS)

    Frolov, M. V.; Knyazeva, D. V.; Manakov, N. L.; Geng, Ji-Wei; Peng, Liang-You; Starace, Anthony F.

    2014-06-01

    We present an analytic model for the description of above-threshold ionization (ATI) of an atom by an intense, linearly polarized short laser pulse. Our treatment is based upon a description of ATI by an infinitely long train of short laser pulses whereupon we take the limit that the time interval between pulses becomes infinite. In the quasiclassical approximation, we provide detailed quantum-mechanical derivations, within the time-dependent effective range (TDER) model, of the closed-form formulas for the differential probability P (p) of ATI by an intense, short laser pulse that were presented briefly by Frolov et al. [Phys. Rev. Lett. 108, 213002 (2012), 10.1103/PhysRevLett.108.213002] and that were used to describe key features of the high-energy part of ATI spectra for H and He atoms in an intense, few-cycle laser pulse, using a phenomenological generalization of the physically transparent TDER results to the case of real atoms. Moreover, we extend these results here to the case of an electron bound initially in a p state; we also take into account multiple-return electron trajectories. The ATI amplitude in our approach is given by a coherent sum of partial amplitudes describing ionization by neighboring optical cycles near the peak of the intensity envelope of a short laser pulse. These results provide an analytical explanation of key features in short-pulse ATI spectra, such as the left-right asymmetry in the ionized electron angular distribution, the multiplateau structures, and both large-scale and fine-scale oscillation patterns resulting from quantum interferences of electron trajectories. Our results show that the shape of the ATI spectrum in the middle part of the ATI plateau is sensitive to the spatial symmetry of the initial bound state of the active electron. This sensitivity originates from the contributions of multiple-return electron trajectories. Our analytic results are shown to be in good agreement with results of numerical solutions of the

  15. A STATISTICAL MODELING METHODOLOGY FOR THE DETECTION, QUANTIFICATION, AND PREDICTION OF ECOLOGICAL THRESHOLDS

    EPA Science Inventory

    This study will provide a general methodology for integrating threshold information from multiple species ecological metrics, allow for prediction of changes of alternative stable states, and provide a risk assessment tool that can be applied to adaptive management. The integr...

  16. A Multinomial Model for Identifying Significant Pure-Tone Threshold Shifts

    ERIC Educational Resources Information Center

    Schlauch, Robert S.; Carney, Edward

    2007-01-01

    Purpose: Significant threshold differences on retest for pure-tone audiometry are often evaluated by application of ad hoc rules, such as a shift in a pure-tone average or in 2 adjacent frequencies that exceeds a predefined amount. Rules that are so derived do not consider the probability of observing a particular audiogram. Methods: A general…

  17. "Getting Stuck" in Analogue Electronics: Threshold Concepts as an Explanatory Model

    ERIC Educational Resources Information Center

    Harlow, A.; Scott, J.; Peter, M.; Cowie, B.

    2011-01-01

    Could the challenge of mastering threshold concepts be a potential factor that influences a student's decision to continue in electronics engineering? This was the question that led to a collaborative research project between educational researchers and the Faculty of Engineering in a New Zealand university. This paper deals exclusively with the…

  18. Single-Event Upset (SEU) model verification and threshold determination using heavy ions in a bipolar static RAM

    NASA Technical Reports Server (NTRS)

    Zoutendyk, J. A.; Smith, L. S.; Soli, G. A.; Thieberger, P.; Wegner, H. E.

    1985-01-01

    Single-Event Upset (SEU) response of a bipolar low-power Schottky-diode-clamped TTL static RAM has been observed using Br ions in the 100-240 MeV energy range and O ions in the 20-100 MeV range. These data complete the experimental verification of circuit-simulation SEU modeling for this device. The threshold for onset of SEU has been observed by the variation of energy, ion species and angle of incidence. The results obtained from the computer circuit-simulation modeling and experimental model verification demonstrate a viable methodology for modeling SEU in bipolar integrated circuits.

  19. Evolution of the Stokes Wave Side-Band Instability, threshold modification of Tulin NLS Model

    NASA Astrophysics Data System (ADS)

    Shugan, Igor; Hwung, Hwung-Hweng; Yang, Ray-Yeng

    2010-05-01

    of the wave train evolution: wave instability, the side-band asymmetry and wave breaking effects. On the other hand, the continuous wave-breaking dissipation presumed in the model gives significantly overestimated values of wave attenuation in the later stages of wave propagation and cannot describe the wave modulation and restabilization at sufficiently long propagation distances. An adjusted dissipative model based on the Nonlinear Schrödinger Equation is suggested to adequately describe the obtained experimental data. Sink/source terms due to wave-breaking processes on its right-hand side correspond to the well-known Tulin (1996) model. The wave dissipation function includes a wave-steepness threshold function and is applied only in regions with active wave breaking. The permanent frequency downshift resulting from the wave-breaking process and the post-breaking wave modulations described by the model show satisfactory quantitative agreement with the results of experiments conducted in a super tank.

  20. A Threshold Shear Force for Calcium Influx in an Astrocyte Model of Traumatic Brain Injury

    PubMed Central

    Maneshi, Mohammad Mehdi; Sachs, Frederick

    2015-01-01

    Abstract Traumatic brain injury (TBI) refers to brain damage resulting from external mechanical force, such as a blast or crash. Our current understanding of TBI is derived mainly from in vivo studies that show measurable biological effects on neurons sampled after TBI. Little is known about the early responses of brain cells during stimuli and which features of the stimulus are most critical to cell injury. We generated defined shear stress in a microfluidic chamber using a fast pressure servo and examined the intracellular Ca2+ levels in cultured adult astrocytes. Shear stress increased intracellular Ca2+ depending on the magnitude, duration, and rise time of the stimulus. Square pulses with a fast rise time (∼2 ms) caused transient increases in intracellular Ca2+, but when the rise time was extended to 20 ms, the response was much less. The threshold for a response is a matrix of multiple parameters. Cells can integrate the effect of shear force from repeated challenges: A pulse train of 10 narrow pulses (11.5 dyn/cm2 and 10 ms wide) resulted in a 4-fold increase in Ca2+ relative to a single pulse of the same amplitude 100 ms wide. The Ca2+ increase was eliminated in Ca2+-free media, but was observed after depleting the intracellular Ca2+ stores with thapsigargin suggesting the need for a Ca2+ influx. The Ca2+ influx was inhibited by extracellular Gd3+, a nonspecific inhibitor of mechanosensitive ion channels, but it was not affected by the more specific inhibitor, GsMTx4. The voltage-gated channel blockers, nifedipine, diltiazem, and verapamil, were also ineffective. The data show that the mechanically induced Ca2+ influx commonly associated with neuron models for TBI is also present in astrocytes, and there is a viscoelastic/plastic coupling of shear stress to the Ca2+ influx. The site of Ca2+ influx has yet to be determined. PMID:25442327

  1. A new analytical threshold voltage model for symmetrical double-gate MOSFETs with high-k gate dielectrics

    NASA Astrophysics Data System (ADS)

    Chiang, T. K.; Chen, M. L.

    2007-03-01

    Based on the fully two-dimensional (2D) Poisson's solution in both the silicon film and the insulator layer, a compact and analytical threshold voltage model, which accounts for the fringing-field effect of short-channel symmetrical double-gate (SDG) MOSFETs, has been developed. Exploiting the new model, a combined analysis of FIBL-enhanced short-channel effects and high-k gate dielectrics assesses their overall impact on SDG MOSFET scaling. It is found that for the same equivalent oxide thickness, a gate insulator with a high-k dielectric constant, which keeps a large characteristic length, allows less design space than SiO2 to sustain the same FIBL-induced threshold voltage degradation.

  2. Modeling of high composition AlGaN channel high electron mobility transistors with large threshold voltage

    SciTech Connect

    Bajaj, Sanyam; Hung, Ting-Hsiang; Akyol, Fatih; Nath, Digbijoy; Rajan, Siddharth

    2014-12-29

    We report on the potential of high electron mobility transistors (HEMTs) consisting of high composition AlGaN channel and barrier layers for power switching applications. Detailed two-dimensional (2D) simulations show that threshold voltages in excess of 3 V can be achieved through the use of AlGaN channel layers. We also calculate the 2D electron gas mobility in AlGaN channel HEMTs and evaluate their power figures of merit as a function of device operating temperature and Al mole fraction in the channel. Our models show that power switching transistors with AlGaN channels would have comparable on-resistance to GaN-channel based transistors for the same operation voltage. The modeling in this paper shows the potential of high composition AlGaN as a channel material for future high threshold enhancement mode transistors.

  3. Combining physiological threshold knowledge to species distribution models is key to improving forecasts of the future niche for macroalgae.

    PubMed

    Martínez, Brezo; Arenas, Francisco; Trilla, Alba; Viejo, Rosa M; Carreño, Francisco

    2015-04-01

    Species distribution models (SDM) are a useful tool for predicting species range shifts in response to global warming. However, they do not explore the mechanisms underlying biological processes, making it difficult to predict shifts outside the environmental gradient where the model was trained. In this study, we combine correlative SDMs and knowledge on physiological limits to provide more robust predictions. The thermal thresholds obtained in growth and survival experiments were used as proxies of the fundamental niches of two foundational marine macrophytes. The geographic projections of these species' distributions obtained using these thresholds and existing SDMs were similar in areas where the species are either absent-rare or frequent and where their potential and realized niches match, reaching consensus predictions. The cold-temperate foundational seaweed Himanthalia elongata was predicted to become extinct at its southern limit in northern Spain in response to global warming, whereas the occupancy of southern-lusitanic Bifurcaria bifurcata was expected to increase. Combined approaches such as this one may also highlight geographic areas where models disagree potentially due to biotic factors. Physiological thresholds alone tended to over-predict species prevalence, as they cannot identify absences in climatic conditions within the species' range of physiological tolerance or at the optima. Although SDMs tended to have higher sensitivity than threshold models, they may include regressions that do not reflect causal mechanisms, constraining their predictive power. We present a simple example of how combining correlative and mechanistic knowledge provides a rapid way to gain insight into a species' niche resulting in consistent predictions and highlighting potential sources of uncertainty in forecasted responses to climate change. PMID:24917488

  4. Threshold analysis of the susceptible-infected-susceptible model on overlay networks

    NASA Astrophysics Data System (ADS)

    Wu, Qingchu; Zhang, Haifeng; Small, Michael; Fu, Xinchu

    2014-07-01

    In this paper, we study epidemic spreading on overlay networks in which n multiple sets of links interconnect among the same nodes. By using the microscopic Markov-chain approximation (MMA) approach, we establish the conditions of epidemic outbreak for two kinds of spreading mechanisms in such an overlay network: the concatenation case and the switching case. When a uniform infection rate is set in all the subnetworks, we find the epidemic threshold for the switching case is just n times as large as that of concatenation case. We also find that the overlay network with a uniform infection rate can be considered as an equivalent (in the sense of epidemic dynamics and epidemic threshold) weighted network. To be specific, the concatenation case corresponds to the integer weighted network, while the switching case corresponds to the fractional weighted network. Interestingly, the time-varying unweighted network can be mapped into the static weighted network. Our analytic results exhibit good agreement with numerical simulations.
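    The n-fold relation between the two thresholds can be checked numerically under the standard quenched mean-field result β_c = μ/λ_max(A_eff), where the concatenation case uses the (integer-weighted) sum of the layer adjacency matrices and the switching case uses their time average. The random layers below are illustrative stand-ins for the networks studied in the paper.

```python
import numpy as np

rng = np.random.default_rng(2)
n_nodes, n_layers, p_edge, mu = 200, 3, 0.03, 1.0

layers = []
for _ in range(n_layers):
    a = (rng.random((n_nodes, n_nodes)) < p_edge).astype(float)
    a = np.triu(a, 1)
    layers.append(a + a.T)                 # symmetric layer adjacency, no self-loops

def spectral_radius(m):
    return float(np.max(np.real(np.linalg.eigvals(m))))

a_concat = sum(layers)                     # all link sets active at once (integer-weighted network)
a_switch = a_concat / n_layers             # one layer active at a time (time-averaged, fractional weights)

beta_c_concat = mu / spectral_radius(a_concat)
beta_c_switch = mu / spectral_radius(a_switch)
print(f"concatenation threshold ≈ {beta_c_concat:.4f}")
print(f"switching threshold     ≈ {beta_c_switch:.4f}  (= {beta_c_switch / beta_c_concat:.1f}x larger)")
```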

  5. Holes in the Bathtub: Water Table Dependent Services and Threshold Behavior in an Economic Model of Groundwater Extraction

    NASA Astrophysics Data System (ADS)

    Kirk-lawlor, N. E.; Edwards, E. C.

    2012-12-01

    In many groundwater systems, the height of the water table must be above certain thresholds for some types of surface flow to exist. Examples of flows that depend on water table elevation include groundwater baseflow to river systems, groundwater flow to wetland systems, and flow to springs. Meeting many of the goals of sustainable water resource management requires maintaining these flows at certain rates. Water resource management decisions invariably involve weighing tradeoffs between different possible usage regimes and the economic consequences of potential management choices are an important factor in these tradeoffs. Policies based on sustainability may have a social cost from forgoing present income. This loss of income may be worth bearing, but should be well understood and carefully considered. Traditionally, the economic theory of groundwater exploitation has relied on the assumption of a single-cell or "bathtub" aquifer model, which offers a simple means to examine complex interactions between water user and hydrologic system behavior. However, such a model assumes a closed system and does not allow for the simulation of groundwater outflows that depend on water table elevation (e.g. baseflow, springs, wetlands), even though those outflows have value. We modify the traditional single-cell aquifer model by allowing for outflows when the water table is above certain threshold elevations. These thresholds behave similarly to holes in a bathtub, where the outflow is a positive function of the height of the water table above the threshold and the outflow is lost when the water table drops below the threshold. We find important economic consequences to this representation of the groundwater system. The economic value of services provided by threshold-dependent outflows (including non-market value), such as ecosystem services, can be incorporated. The value of services provided by these flows may warrant maintaining the water table at higher levels than would
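    A minimal sketch of the modified single-cell ("bathtub with a hole") model is given below: outflow is proportional to the head above a threshold elevation and vanishes once the water table drops below it. All areas, rates, and elevations are illustrative assumptions, not values from the study.

```python
def simulate_head(head0=50.0, recharge=2.0, pumping=1.5, years=200,
                  threshold=45.0, outflow_coeff=0.5, storage=10.0):
    """Head trajectory (m) of a single-cell aquifer with a threshold-dependent outflow."""
    h, heads = head0, [head0]
    for _ in range(years):
        # the "hole in the bathtub": outflow exists only above the threshold elevation
        outflow = outflow_coeff * max(h - threshold, 0.0)
        h += (recharge - pumping - outflow) / storage
        heads.append(h)
    return heads

for q in (1.5, 2.5):
    traj = simulate_head(pumping=q)
    print(f"pumping={q}: head after 200 years ≈ {traj[-1]:.1f} m "
          f"(threshold-dependent flow {'maintained' if traj[-1] > 45.0 else 'lost'})")
```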

  6. Variable-threshold optical proximity correction (OPC) models for high-performance 0.18-μm process

    NASA Astrophysics Data System (ADS)

    Liao, Hongmei; Palmer, Shane R.; Sadra, Kayvan

    2000-07-01

    The recent development of the lithographic resolution enhancement techniques of optical proximity correction (OPC) and phase shift masks (PSM) enables printing critical dimension (CD) features that are significantly smaller than the exposure wavelength. In this paper, we present a variable-threshold OPC model that describes how a pattern configuration transfers to the wafer after resist and etch processes. This 0.18-μm CMOS technology utilizes isolation with pitches of active device regions below 0.5 μm. The effective gate length on silicon is in the range of 0.11 to 0.18 μm. The OPC model begins with a Hopkins formula for aerial image calculation and is tuned to fit the measured CD data, using commercially available software. The OPC models are anchored at a set of selected CD data including linearity, line-end pullback, and linewidth as a function of pitch. It is found that the threshold values inferred from measured CD data vary approximately linearly with the slope of the aerial image. The accuracy of the model is illustrated by comparing the simulated contour using the OPC model with the measured SEM image. The implementation of OPC models at both the active and gate levels is achieved using two approaches: (1) optimizing the mask bias and the sizes of hammerheads and serifs via a rule-based approach; and (2) correcting the SRAM cell layouts with the OPC model. The OPC models developed have been successfully applied to 0.18-μm technology in a prototyping environment.

  7. Determination of navigation FDI thresholds using a Markov model. [Failure Detection and Identification in triplex inertial platform systems for Shuttle entry

    NASA Technical Reports Server (NTRS)

    Walker, B. K.; Gai, E.

    1978-01-01

    A method for determining time-varying Failure Detection and Identification (FDI) thresholds for single sample decision functions is described in the context of a triplex system of inertial platforms. A cost function consisting of the probability of vehicle loss due to FDI decision errors is minimized. A discrete Markov model is constructed from which this cost can be determined as a function of the decision thresholds employed to detect and identify the first and second failures. Optimal thresholds are determined through the use of parameter optimization techniques. The application of this approach to threshold determination is illustrated for the Space Shuttle's inertial measurement instruments.

  8. Porcine skin visible lesion thresholds for near-infrared lasers including modeling at two pulse durations and spot sizes

    NASA Astrophysics Data System (ADS)

    Cain, Clarence P.; Polhamus, Garrett D.; Roach, William P.; Stolarski, David J.; Schuster, Kurt J.; Stockton, Kevin; Rockwell, Benjamin A.; Chen, Bo; Welch, Ashley J.

    2006-07-01

    With the advent of such systems as the airborne laser and advanced tactical laser, high-energy lasers that use 1315-nm wavelengths in the near-infrared band will soon present a new laser safety challenge to armed forces and civilian populations. Experiments in nonhuman primates using this wavelength have demonstrated a range of ocular injuries, including corneal, lenticular, and retinal lesions as a function of pulse duration. American National Standards Institute (ANSI) laser safety standards have traditionally been based on experimental data, and there is scant data for this wavelength. We are reporting minimum visible lesion (MVL) threshold measurements using a porcine skin model for two different pulse durations and spot sizes for this wavelength. We also compare our measurements to results from our model based on the heat transfer equation and rate process equation, together with actual temperature measurements on the skin surface using a high-speed infrared camera. Our MVL-ED50 thresholds for long pulses (350 µs) at 24-h postexposure are measured to be 99 and 83 J/cm^2 for spot sizes of 0.7 and 1.3 mm diam, respectively. Q-switched laser pulses of 50 ns have a lower threshold of 11 J/cm^2 for a 5-mm-diam top-hat laser pulse.

  9. Modeling of Beams’ Multiple-Contact Mode with an Application in the Design of a High-g Threshold Microaccelerometer

    PubMed Central

    Li, Kai; Chen, Wenyuan; Zhang, Weiping

    2011-01-01

    Beam’s multiple-contact mode, characterized by multiple and discrete contact regions, non-uniform stopper heights, irregular contact sequence, a seesaw-like effect, indirect interaction between different stoppers, and a complex coupling relationship between loads and deformation, is studied. A novel analysis method and a novel high-speed calculation model are developed for the multiple-contact mode under mechanical load and electrostatic load, without limitations on stopper height and distribution, provided the beam has a stepped or curved shape. Accurate values of deflection, contact load, contact region and so on are obtained directly, with a subsequent validation by CoventorWare. A new concept design of a high-g threshold microaccelerometer based on the multiple-contact mode is presented, featuring multiple acceleration thresholds of one sensitive component and consequently a small sensor size. PMID:22163897

  10. A threshold-voltage model for small-scaled GaAs nMOSFET with stacked high-k gate dielectric

    NASA Astrophysics Data System (ADS)

    Chaowen, Liu; Jingping, Xu; Lu, Liu; Hanhan, Lu; Yuan, Huang

    2016-02-01

    A threshold-voltage model for a stacked high-k gate dielectric GaAs MOSFET is established by solving a two-dimensional Poisson's equation in the channel and considering the short-channel, DIBL and quantum effects. The simulated results are in good agreement with the Silvaco TCAD data, confirming the correctness and validity of the model. Using the model, the impacts of structural and physical parameters of the stacked high-k gate dielectric on the threshold-voltage shift and the temperature characteristics of the threshold voltage are investigated. The results show that the stacked gate dielectric structure can effectively suppress the fringing-field and DIBL effects and improve the threshold and temperature characteristics; on the other hand, the influence of temperature on the threshold voltage is overestimated if the quantum effect is ignored. Project supported by the National Natural Science Foundation of China (No. 61176100).

  11. Modeling on oxide dependent 2DEG sheet charge density and threshold voltage in AlGaN/GaN MOSHEMT

    NASA Astrophysics Data System (ADS)

    Panda, J.; Jena, K.; Swain, R.; Lenka, T. R.

    2016-04-01

    We have developed a physics based analytical model for the calculation of threshold voltage, two-dimensional electron gas (2DEG) density and surface potential for AlGaN/GaN metal oxide semiconductor high electron mobility transistors (MOSHEMT). The developed model includes important parameters like polarization charge density at oxide/AlGaN and AlGaN/GaN interfaces, interfacial defect oxide charges and donor charges at the surface of the AlGaN barrier. The effects of two different gate oxides (Al2O3 and HfO2) are compared for the performance evaluation of the proposed MOSHEMT. The MOSHEMTs with Al2O3 dielectric have an advantage of significant increase in 2DEG up to 1.2 × 10^13 cm^-2 with an increase in oxide thickness up to 10 nm as compared to HfO2 dielectric MOSHEMT. The surface potential for HfO2 based device decreases from 2 to -1.6 eV within 10 nm of oxide thickness whereas for the Al2O3 based device a sharp transition of surface potential occurs from 2.8 to -8.3 eV. The variation in oxide thickness and gate metal work function of the proposed MOSHEMT shifts the threshold voltage from negative to positive realizing the enhanced mode operation. Further to validate the model, the device is simulated in Silvaco Technology Computer Aided Design (TCAD) showing good agreement with the proposed model results. The accuracy of the developed calculations of the proposed model can be used to develop a complete physics based 2DEG sheet charge density and threshold voltage model for GaN MOSHEMT devices for performance analysis.

  12. Food allergy population thresholds: an evaluation of the number of oral food challenges and dosing schemes on the accuracy of threshold dose distribution modeling.

    PubMed

    Klein Entink, Rinke H; Remington, Benjamin C; Blom, W Marty; Rubingh, Carina M; Kruizinga, Astrid G; Baumert, Joseph L; Taylor, Steve L; Houben, Geert F

    2014-08-01

    For most allergenic foods, limited availability of threshold dose information within the population restricts the advice on action levels of unintended allergenic foods which should trigger advisory labeling on packaged foods. The objective of this paper is to provide guidance for selecting an optimal sample size for threshold dosing studies for major allergenic foods and to identify factors influencing the accuracy of estimation. A simulation study was performed to evaluate the effects of sample size and dosing schemes on the accuracy of the threshold distribution curve. The relationships between sample size, dosing scheme and the employed statistical distribution on the one hand and accuracy of estimation on the other hand were obtained. It showed that the largest relative gains in accuracy are obtained when sample size increases from N=20 to N=60. Moreover, it showed that the EuroPrevall dosing scheme is a useful start, but that it may need revision for a specific allergen as more data become available, because a proper allocation of the dosing steps is important. The results may guide risk assessors in minimum sample sizes for new studies and in the allocation of proper dosing schemes for allergens in provocation studies. PMID:24815821
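    The flavor of such a sample-size evaluation can be sketched as follows, assuming (for illustration only) a log-normal population threshold distribution and ignoring the interval censoring introduced by discrete dosing schemes; the distribution parameters are hypothetical, not fitted clinical values.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
mu, sigma = np.log(50.0), 1.5              # hypothetical log-scale parameters (dose in mg)
true_ed05 = float(np.exp(mu + sigma * stats.norm.ppf(0.05)))

for n in (20, 60, 200):
    errors = []
    for _ in range(500):                   # repeated simulated threshold studies
        doses = rng.lognormal(mu, sigma, size=n)      # observed individual threshold doses
        m, s = np.log(doses).mean(), np.log(doses).std(ddof=1)
        ed05_hat = np.exp(m + s * stats.norm.ppf(0.05))
        errors.append(abs(np.log(ed05_hat / true_ed05)))
    print(f"N={n:3d}: median absolute log-error in ED05 ≈ {np.median(errors):.2f}")
```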

  13. Deviation from threshold model in ultrafast laser ablation of graphene at sub-micron scale

    SciTech Connect

    Gil-Villalba, A.; Xie, C.; Salut, R.; Furfaro, L.; Giust, R.; Jacquot, M.; Lacourt, P. A.; Dudley, J. M.; Courvoisier, F.

    2015-08-10

    We investigate a method to measure ultrafast laser ablation threshold with respect to spot size. We use structured complex beams to generate a pattern of craters in CVD graphene with a single laser pulse. A direct comparison between beam profile and SEM characterization allows us to determine the dependence of ablation probability on spot-size, for crater diameters ranging between 700 nm and 2.5 μm. We report a drastic decrease of ablation probability when the crater diameter is below 1 μm which we interpret in terms of free-carrier diffusion.

  14. Model selection based on FDR-thresholding optimizing the area under the ROC-curve.

    PubMed

    Graf, Alexandra C; Bauer, Peter

    2009-01-01

    We evaluate variable selection by multiple tests controlling the false discovery rate (FDR) to build a linear score for prediction of clinical outcome in high-dimensional data. Quality of prediction is assessed by the receiver operating characteristic curve (ROC) for prediction in independent patients. Thus we try to combine both goals: prediction and controlled structure estimation. We show that the FDR-threshold which provides the ROC-curve with the largest area under the curve (AUC) varies largely over the different parameter constellations not known in advance. Hence, we investigated a new cross validation procedure based on the maximum rank correlation estimator to determine the optimal selection threshold. This procedure (i) allows choosing an appropriate selection criterion, (ii) provides an estimate of the FDR close to the true FDR and (iii) is simple and computationally feasible for rather moderate to small sample sizes. Low estimates of the cross validated AUC (the estimates generally being positively biased) and large estimates of the cross validated FDR may indicate a lack of sufficiently prognostic variables and/or too small sample sizes. The method is applied to an oncology dataset. PMID:19572830
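    A minimal sketch of the selection-then-prediction pipeline is shown below: per-variable tests are thresholded by the Benjamini-Hochberg procedure at a chosen FDR level, the selected variables form a simple linear score, and the score is evaluated by the AUC on independent data. Dimensions, effect sizes, and the particular FDR procedure are assumptions for illustration, not the paper's settings.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
n_train, n_test, p, p_inf, delta = 100, 100, 2000, 30, 1.0

def make_data(n):
    y = rng.integers(0, 2, size=n)
    x = rng.normal(size=(n, p))
    x[y == 1, :p_inf] += delta                     # informative variables get a mean shift
    return x, y

x_tr, y_tr = make_data(n_train)
x_te, y_te = make_data(n_test)

# per-variable two-sample t-tests, thresholded by Benjamini-Hochberg at level q
_, pvals = stats.ttest_ind(x_tr[y_tr == 1], x_tr[y_tr == 0], axis=0)
q = 0.10
order = np.argsort(pvals)
passed = pvals[order] <= q * np.arange(1, p + 1) / p
k = passed.nonzero()[0].max() + 1 if passed.any() else 0
selected = order[:k]

# simple linear score: sign-weighted sum of the selected variables
signs = np.sign(x_tr[y_tr == 1].mean(0) - x_tr[y_tr == 0].mean(0))[selected]
score = x_te[:, selected] @ signs if k else np.zeros(n_test)

# AUC on independent test data via the rank-sum identity
ranks = stats.rankdata(score)
n1, n0 = y_te.sum(), n_test - y_te.sum()
auc = (ranks[y_te == 1].sum() - n1 * (n1 + 1) / 2) / (n1 * n0)
print(f"selected {k} variables, test AUC ≈ {auc:.2f}")
```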

  15. Unilateral inflammation of the hindpaw in rats as a model of prolonged noxious stimulation: alterations in behavior and nociceptive thresholds.

    PubMed

    Stein, C; Millan, M J; Herz, A

    1988-10-01

    Unilateral intraplantar injection of Freund's complete adjuvant (FCA) into one hindpaw of rats led to a localized inflammation that became apparent within 12 hours and reached its peak between 2 and 3 weeks. FCA-treated rats displayed a diminished rate of body weight gain, a reduction of food and water intake and a disruption of circadian temperature regulation, as well as decreased locomotor activity and pronounced scratching behavior in the open field. Paw pressure thresholds were reduced only in inflamed paws. Contralateral, noninflamed paws showed comparable thresholds to those of control animals. Tail-flick and tail-pressure responses were not different from controls. These data suggest that FCA-treated animals experience increased noxious input from the inflamed limb and that changes in thresholds to acutely applied nociceptive stimuli are due to a peripheral hypersensitivity of inflamed tissue. The present condition resembles most closely a state of acute inflammatory pain. The term "chronic pain" in its strict sense is not appropriate in this model. PMID:3244721

  16. Determination of threshold conditions for a non-linear stochastic partnership model for heterosexually transmitted diseases with stages.

    PubMed

    Gallop, Robert J; Mode, Charles J; Sleeman, Candace K

    2002-01-01

    When comparing the performance of a stochastic model of an epidemic at two points in a parameter space, a threshold is said to have been crossed when at one point an epidemic develops with positive probability; while at the other there is a tendency for an epidemic to become extinct. The approach used to find thresholds in this paper was to embed a system of ordinary non-linear differential equations in a stochastic process, accommodating the formation and dissolution of marital partnerships in a heterosexual population, extra-marital sexual contacts, and diseases such as HIV/AIDS with stages. A symbolic representation of the Jacobian matrix of this system was derived. To determine whether this matrix was stable or non-stable at a particular parameter point, the Jacobian was evaluated at a disease-free equilibrium and its eigenvalues were computed. The stability or non-stability of the matrix was then determined by checking if all real parts of the eigenvalues were negative. By writing software to repeat this process for a selected set of points in the parameter space, it was possible to develop search engines for finding points in the parameter space where thresholds were crossed. The results of a set of Monte Carlo simulation experiments were reported which suggest that, by combining the stochastic and deterministic paradigms within a single formulation, it was possible to obtain more informative interpretations of simulation experiments than if attention were confined solely to either paradigm. PMID:11965260
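    The threshold test described above (evaluate the Jacobian at the disease-free equilibrium and check the signs of the eigenvalue real parts) is sketched below on a toy two-stage infection system that stands in for the much larger partnership model; the equations and rates are illustrative, not those of the paper.

```python
import numpy as np

def jacobian(f, x, eps=1e-7):
    """Finite-difference Jacobian of f at x."""
    fx = np.asarray(f(x), dtype=float)
    j = np.zeros((len(fx), len(x)))
    for k in range(len(x)):
        xp = np.array(x, dtype=float)
        xp[k] += eps
        j[:, k] = (np.asarray(f(xp), dtype=float) - fx) / eps
    return j

def make_rhs(beta, gamma=0.2, mu=0.02):
    """Toy two-stage infection system in (I1, I2), with S = 1 - I1 - I2."""
    def rhs(x):
        i1, i2 = x
        s = 1.0 - i1 - i2
        return [beta * s * (i1 + 0.5 * i2) - (gamma + mu) * i1,
                gamma * i1 - mu * i2]
    return rhs

dfe = np.zeros(2)                      # disease-free equilibrium
for beta in np.linspace(0.01, 0.07, 7):
    eigs = np.linalg.eigvals(jacobian(make_rhs(beta), dfe))
    stable = bool(np.all(eigs.real < 0.0))
    print(f"beta={beta:.2f}: DFE {'stable -> extinction' if stable else 'unstable -> outbreak possible'}")
```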

  17. Cavitation thresholds of contrast agents in an in vitro human clot model exposed to 120-kHz ultrasound

    PubMed Central

    Gruber, Matthew J.; Bader, Kenneth B.; Holland, Christy K.

    2014-01-01

    Ultrasound contrast agents (UCAs) can be employed to nucleate cavitation to achieve desired bioeffects, such as thrombolysis, in therapeutic ultrasound applications. Effective methods of enhancing thrombolysis with ultrasound have been examined at low frequencies (<1 MHz) and low amplitudes (<0.5 MPa). The objective of this study was to determine cavitation thresholds for two UCAs exposed to 120-kHz ultrasound. A commercial ultrasound contrast agent (Definity®) and echogenic liposomes were investigated to determine the acoustic pressure threshold for ultraharmonic (UH) and broadband (BB) generation using an in vitro flow model perfused with human plasma. Cavitation emissions were detected using two passive receivers over a narrow frequency bandwidth (540–900 kHz) and a broad frequency bandwidth (0.54–1.74 MHz). UH and BB cavitation thresholds occurred at the same acoustic pressure (0.3 ± 0.1 MPa, peak to peak) and were found to depend on the sensitivity of the cavitation detector but not on the nucleating contrast agent or ultrasound duty cycle. PMID:25234874

  18. Ground-water vulnerability to nitrate contamination at multiple thresholds in the mid-Atlantic region using spatial probability models

    USGS Publications Warehouse

    Greene, Earl A.; LaMotte, Andrew E.; Cullinan, Kerri-Ann

    2005-01-01

    The U.S. Geological Survey, in cooperation with the U.S. Environmental Protection Agency's Regional Vulnerability Assessment Program, has developed a set of statistical tools to support regional-scale, ground-water quality and vulnerability assessments. The Regional Vulnerability Assessment Program's goals are to develop and demonstrate approaches to comprehensive, regional-scale assessments that effectively inform managers and decision-makers as to the magnitude, extent, distribution, and uncertainty of current and anticipated environmental risks. The U.S. Geological Survey is developing and exploring the use of statistical probability models to characterize the relation between ground-water quality and geographic factors in the Mid-Atlantic Region. Available water-quality data obtained from U.S. Geological Survey National Water-Quality Assessment Program studies conducted in the Mid-Atlantic Region were used in association with geographic data (land cover, geology, soils, and others) to develop logistic-regression equations that use explanatory variables to predict the presence of a selected water-quality parameter exceeding a specified management concentration threshold. The resulting logistic-regression equations were transformed to determine the probability, P(X), of a water-quality parameter exceeding a specified management threshold. Additional statistical procedures modified by the U.S. Geological Survey were used to compare the observed values to model-predicted values at each sample point. In addition, procedures to evaluate the confidence of the model predictions and estimate the uncertainty of the probability value were developed and applied. The resulting logistic-regression models were applied to the Mid-Atlantic Region to predict the spatial probability of nitrate concentrations exceeding specified management thresholds. These thresholds are usually set or established by regulators or managers at National or local levels. At management thresholds of
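    The exceedance-probability step described above can be sketched as follows: a logistic regression of the indicator "nitrate concentration exceeds the management threshold" on covariates is fitted and transformed to P(X) = 1/(1 + exp(-Xb)). The synthetic covariates and coefficients below are illustrative and are not from the study.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(5)
n = 300
pct_agriculture = rng.uniform(0, 100, n)        # % agricultural land cover near the well (synthetic)
well_depth_m = rng.uniform(5, 120, n)           # well depth in metres (synthetic)
x = np.column_stack([np.ones(n), pct_agriculture / 100.0, well_depth_m / 100.0])

true_b = np.array([-2.0, 4.0, -2.0])            # hypothetical generating coefficients
exceeds = rng.random(n) < 1.0 / (1.0 + np.exp(-x @ true_b))

def neg_log_lik(b):
    eta = x @ b
    return np.sum(np.logaddexp(0.0, eta) - exceeds * eta)   # logistic negative log-likelihood

b_hat = minimize(neg_log_lik, np.zeros(3), method="BFGS").x

def p_exceed(agriculture_pct, depth_m):
    eta = b_hat @ np.array([1.0, agriculture_pct / 100.0, depth_m / 100.0])
    return 1.0 / (1.0 + np.exp(-eta))           # P(X): probability of exceeding the threshold

print(f"P(exceed | 80% agriculture, 20 m well) ≈ {p_exceed(80, 20):.2f}")
print(f"P(exceed | 10% agriculture, 90 m well) ≈ {p_exceed(10, 90):.2f}")
```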

  19. [Tremendous Human, Social, and Economic Losses Caused by Obstinate Application of the Failed Linear No-threshold Model].

    PubMed

    Sutou, Shizuyo

    2015-01-01

    The linear no-threshold model (LNT) was recommended in 1956, with abandonment of the traditional threshold dose-response for genetic risk assessment. Adoption of LNT by the International Commission on Radiological Protection (ICRP) became the standard for radiation regulation worldwide. The ICRP recommends a dose limit of 1 mSv/year for the public, which is too low and which terrorizes innocent people. Indeed, LNT arose mainly from the lifespan survivor study (LSS) of atomic bomb survivors. The LSS, which asserts linear dose-response and no threshold, is challenged mainly on three points. 1) Radiation doses were underestimated by half because of disregard for major residual radiation, resulting in cancer risk overestimation. 2) The dose and dose-rate effectiveness factor (DDREF) of 2 is used, but the actual DDREF is estimated as 16, resulting in cancer risk overestimation by several times. 3) Adaptive response (hormesis) is observed in leukemia and solid cancer cases, consistently contradicting the linearity of LNT. Drastic reduction of cancer risk moves the dose-response curve close to the control line, allowing the setting of a threshold. Living organisms have been evolving for 3.8 billion years under radiation exposure, naturally acquiring various defense mechanisms such as DNA repair mechanisms, apoptosis, and immune response. The failure of LNT lies in the neglect of carcinogenesis and these biological mechanisms. Obstinate application of LNT continues to cause tremendous human, social, and economic losses. The 60-year-old LNT must be rejected to establish a new scientific knowledge-based system. PMID:26521869

  20. Endometrial cancer and antidepressants: A nationwide population-based study.

    PubMed

    Lin, Chiao-Fan; Chan, Hsiang-Lin; Hsieh, Yi-Hsuan; Liang, Hsin-Yi; Chiu, Wei-Che; Huang, Kuo-You; Lee, Yena; McIntyre, Roger S; Chen, Vincent Chin-Hung

    2016-07-01

    To our knowledge, the association between antidepressant exposure and endometrial cancer has not been previously explored. Herein, we aim to investigate the association between antidepressant prescription, including novel antidepressants, and the risk for endometrial cancer in a population-based study. Data for the analysis were derived from the National Health Insurance Research Database. We identified 8392 cases with a diagnosis of endometrial cancer and 82,432 matched controls. A conditional logistic regression model was used, adjusting for potentially confounding variables (e.g., comorbid psychiatric diseases, comorbid physical diseases, and other medications). Risk for endometrial cancer in the population-based study sample was categorized by, and assessed as a function of, antidepressant prescription and cumulative dosage. We report no association between endometrial cancer incidence and antidepressant prescription, including those prescribed either selective serotonin reuptake inhibitors (adjusted odds ratio [OR] = 0.98; 95% confidence interval [CI], 0.84-1.15) or serotonin norepinephrine reuptake inhibitors (adjusted OR = 1.14; 95% CI, 0.76-1.71). We also did not identify an association between higher cumulative doses of antidepressant prescription and endometrial cancer. There was no association between antidepressant prescription and endometrial cancer. PMID:27442640

  1. Applications of threshold models and the weighted bootstrap for Hungarian precipitation data

    NASA Astrophysics Data System (ADS)

    Varga, László; Rakonczai, Pál; Zempléni, András

    2016-05-01

    This paper presents applications of the peaks-over-threshold methodology for both the univariate and the recently introduced bivariate case, combined with a novel bootstrap approach. We compare the proposed bootstrap methods to the more traditional profile likelihood. We have investigated 63 years of the European Climate Assessment daily precipitation data for five Hungarian grid points, first separately for the summer and winter months, then aiming at the detection of possible changes by investigating 20-year moving windows. We show that significant changes can be observed both in the univariate and the bivariate cases, the most recent period being the most dangerous in several cases, as some return values have increased substantially. We illustrate these effects by bivariate coverage regions.
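
    For the univariate part of such an analysis, the standard peaks-over-threshold recipe (fit a generalized Pareto distribution to the excesses and convert it to a return level) looks roughly like the sketch below; the threshold choice, the synthetic series, and the parameter values are assumptions, not the paper's data or settings.

    ```python
    import numpy as np
    from scipy.stats import genpareto

    def pot_return_level(daily, u, return_period_yr, per_year=365.25):
        """Univariate peaks-over-threshold: fit a generalized Pareto distribution
        to exceedances over threshold u and return the level expected to be
        exceeded once per `return_period_yr` years (standard POT formula; the
        threshold choice in the demo below is an assumption, not the paper's)."""
        exceed = daily[daily > u] - u
        xi, _, sigma = genpareto.fit(exceed, floc=0)       # shape, loc=0, scale
        zeta = exceed.size / daily.size                    # exceedance probability
        m = return_period_yr * per_year * zeta             # expected exceedances
        if abs(xi) < 1e-6:
            return u + sigma * np.log(m)
        return u + sigma / xi * (m ** xi - 1)

    # toy daily precipitation series standing in for the ECA grid-point data
    rng = np.random.default_rng(2)
    precip = rng.gamma(shape=0.8, scale=4.0, size=63 * 365)
    print(pot_return_level(precip, u=np.quantile(precip, 0.95), return_period_yr=50))
    ```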

  2. A continuum model with a percolation threshold and tunneling-assisted interfacial conductivity for carbon nanotube-based nanocomposites

    SciTech Connect

    Wang, Yang; Weng, George J.; Meguid, Shaker A.; Hamouda, Abdel Magid

    2014-05-21

    A continuum model that possesses several desirable features of the electrical conduction process in carbon-nanotube (CNT) based nanocomposites is developed. Three basic elements are included: (i) percolation threshold, (ii) interface effects, and (iii) tunneling-assisted interfacial conductivity. We approach the first one through the selection of an effective medium theory. We approach the second one by the introduction of a diminishing layer of interface with an interfacial conductivity to build a 'thinly coated' CNT. The third one is introduced through the observation that interface conductivity can be enhanced by electron tunneling which in turn can be facilitated with the formation of CNT networks. We treat this last issue in a continuum fashion by taking the network formation as a statistical process that can be represented by Cauchy's probability density function. The outcome is a simple and yet widely useful model that can simultaneously capture all these fundamental characteristics. It is demonstrated that, without considering the interface effect, the predicted conductivity would be too high, and that, without accounting for the additional contribution from the tunneling-assisted interfacial conductivity, the predicted conductivity beyond the percolation threshold would be too low. It is with the consideration of all three elements that the theory can fully account for the experimentally measured data. We further use the developed model to demonstrate that, despite the anisotropy of the intrinsic CNT conductivity, it is its axial component along the CNT direction that dominates the overall conductivity. It is also proved that this theory, even with a totally insulating matrix, is still capable of delivering non-zero conductivity beyond the percolation threshold.
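
    The role of the Cauchy distribution can be illustrated with a small sketch: its cumulative function serves as a smooth measure of how much of the CNT population has joined percolating networks, and this fraction scales the tunneling-assisted part of the interfacial conductivity. The functional form and numbers below are illustrative assumptions, not the paper's effective-medium equations.

    ```python
    import numpy as np
    from scipy.stats import cauchy

    def network_fraction(f, f_c=0.01, gamma=0.005):
        """Fraction of CNTs participating in percolating networks, modelled as a
        Cauchy cumulative distribution centred on the percolation threshold f_c
        (illustrative parameter values, not fitted ones)."""
        return cauchy.cdf(f, loc=f_c, scale=gamma)

    def interfacial_conductivity(f, sigma_int0=1e-6, sigma_tunnel=1e-3):
        """Interface conductivity (S/m) as a baseline value plus a tunneling-assisted
        contribution that grows with the networked fraction; a sketch of the third
        ingredient only, not the full effective-medium theory."""
        return sigma_int0 + sigma_tunnel * network_fraction(f)

    for f in (0.002, 0.01, 0.03):      # CNT volume fractions around the threshold
        print(f, interfacial_conductivity(f))
    ```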

  3. Simplified models for the nonlinear evolution of two fast-particle-driven modes near the linear stability threshold

    NASA Astrophysics Data System (ADS)

    Galant, Grzegorz; Zaleśny, Jarosław; Lisak, Mietek; Berczyński, Paweł; Berczyński, Stefan

    2011-05-01

    An analytical model of the nonlinear dynamics of two plasma modes driven resonantly by high-energy ions near the instability threshold, based on purely differential equations, is presented here. The well-known integro-differential model of Berk and Breizman (BB), extended to the case of two plasma modes, is simplified here to a system of two coupled nonlinear differential equations of fifth order. The effects of the Krook, diffusion and dynamical friction (drag) relaxation processes are considered, whereas shifts in frequency and wavenumber between the modes are neglected. In spite of these simplifications the main features of the dynamics of the two plasma modes are retained. The numerical solutions to the model equations show competition between the two modes for survival, oscillations, chaotic regimes and 'blow-up' behavior, similar to the BB model.

  4. Threshold driven response of permafrost in Northern Eurasia to climate and environmental change: from conceptual model to quantitative assessment

    NASA Astrophysics Data System (ADS)

    Anisimov, Oleg; Kokorev, Vasiliy; Reneva, Svetlana; Shiklomanov, Nikolai

    2010-05-01

    Numerous efforts have been made to assess the environmental impacts of changing climate in permafrost regions using mathematical models. Despite the significant improvements in representation of individual sub-systems, such as permafrost, vegetation, snow and hydrology, even the most comprehensive models do not replicate the coupled non-linear interactions between them that lead to threshold-driven changes. Observations indicate that ecosystems may change dramatically, rapidly, and often irreversibly, reaching a fundamentally different state once they pass a critical threshold. The key to understanding permafrost threshold phenomena is interaction with other environmental factors that are very likely to change in response to climate warming. One such factor is vegetation. Vegetation control over the thermal state of underlying ground is two-fold. Firstly, canopies have different albedos that affect the radiation balance at the soil surface. Secondly, depending on biome composition, the vegetation canopy may have different thermal conductivity that governs the heat fluxes between soil and atmosphere. There are clear indications based on ground observations and remote sensing that vegetation has already changed in response to climatic warming, consistent with the results of manipulations at experimental plots that involve artificial warming and CO2 fertilization. Under sustained warming, lower vegetation (mosses, lichens) is gradually replaced by shrubs. Mosses have a high thermal insulating effect in summer, which is why their retreat enhances permafrost warming. Taller shrubs accumulate snow that further warms permafrost in winter. Permafrost remains unchanged as long as responding vegetation intercepts and mitigates the climate change signal. Beyond a certain threshold, enhanced abundance and growth of taller vegetation leads to abrupt permafrost changes. Changes in hydrology, i.e. soil wetting or drying, may have a similar effect on permafrost. Wetting increases soil

  5. Multi-host model and threshold of intermediate host Oncomelania snail density for eliminating schistosomiasis transmission in China.

    PubMed

    Zhou, Yi-Biao; Chen, Yue; Liang, Song; Song, Xiu-Xia; Chen, Geng-Xin; He, Zhong; Cai, Bin; Yihuo, Wu-Li; He, Zong-Gui; Jiang, Qing-Wu

    2016-01-01

    Schistosomiasis remains a serious public health issue in many tropical countries, with more than 700 million people at risk of infection. In China, a national integrated control strategy, aiming at blocking its transmission, has been carried out throughout endemic areas since 2005. A longitudinal study was conducted to determine the effects of different intervention measures on the transmission dynamics of S. japonicum in three study areas and the data were analyzed using a multi-host model. The multi-host model was also used to estimate the threshold of Oncomelania snail density for interrupting schistosomiasis transmission based on the longitudinal data as well as data from the national surveillance system for schistosomiasis. The data showed a continuous decline in the risk of human infection and the multi-host model fit the data well. The 25th, 50th and 75th percentiles, and the mean of estimated thresholds of Oncomelania snail density below which the schistosomiasis transmission cannot be sustained were 0.006, 0.009, 0.028 and 0.020 snails/0.11 m2, respectively. The study results could help develop specific strategies of schistosomiasis control and elimination tailored to the local situation for each endemic area. PMID:27535177
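
    A heavily simplified illustration of the threshold idea, solving for the snail density at which a reproduction number crosses 1, is given below using a minimal Ross-Macdonald-style two-host cycle; the structure and all parameter values are hypothetical placeholders, not the authors' multi-host model or estimates.

    ```python
    import numpy as np
    from scipy.optimize import brentq

    def R0(snail_density, a=0.05, b=0.2, c=0.5, mu_snail=0.05, mu_worm=0.002):
        """Reproduction number of a minimal two-host (human-snail) transmission
        cycle; every parameter here is a hypothetical placeholder, not an
        estimate from the paper's multi-host model."""
        human_to_snail = c / mu_snail
        snail_to_human = a * b * snail_density / mu_worm
        return np.sqrt(human_to_snail * snail_to_human)

    # snail density (snails / 0.11 m2) at which R0 crosses 1: below this value
    # transmission cannot be sustained in this toy model
    threshold = brentq(lambda s: R0(s) - 1.0, 1e-6, 10.0)
    print(threshold)
    ```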

  6. Multi-host model and threshold of intermediate host Oncomelania snail density for eliminating schistosomiasis transmission in China

    PubMed Central

    Zhou, Yi-Biao; Chen, Yue; Liang, Song; Song, Xiu-Xia; Chen, Geng-Xin; He, Zhong; Cai, Bin; Yihuo, Wu-Li; He, Zong-Gui; Jiang, Qing-Wu

    2016-01-01

    Schistosomiasis remains a serious public health issue in many tropical countries, with more than 700 million people at risk of infection. In China, a national integrated control strategy, aiming at blocking its transmission, has been carried out throughout endemic areas since 2005. A longitudinal study was conducted to determine the effects of different intervention measures on the transmission dynamics of S. japonicum in three study areas and the data were analyzed using a multi-host model. The multi-host model was also used to estimate the threshold of Oncomelania snail density for interrupting schistosomiasis transmission based on the longitudinal data as well as data from the national surveillance system for schistosomiasis. The data showed a continuous decline in the risk of human infection and the multi-host model fit the data well. The 25th, 50th and 75th percentiles, and the mean of estimated thresholds of Oncomelania snail density below which the schistosomiasis transmission cannot be sustained were 0.006, 0.009, 0.028 and 0.020 snails/0.11 m2, respectively. The study results could help develop specific strategies of schistosomiasis control and elimination tailored to the local situation for each endemic area. PMID:27535177

  7. Modeling the calcium spike as a threshold triggered fixed waveform for synchronous inputs in the fluctuation regime

    PubMed Central

    Chua, Yansong; Morrison, Abigail; Helias, Moritz

    2015-01-01

    Modeling the layer 5 pyramidal neuron as a system of three connected isopotential compartments, the soma, proximal, and distal compartment, with calcium spike dynamics in the distal compartment following first order kinetics, we are able to reproduce in-vitro experimental results which demonstrate the involvement of calcium spikes in action potential generation. To explore how calcium spikes affect the neuronal output in-vivo, we emulate in-vivo-like conditions by embedding the neuron model in a regime of low background fluctuations with occasional large synchronous inputs. In such a regime, a full calcium spike is only triggered by the synchronous events in a threshold-like manner and has a stereotypical waveform. Hence, in such a regime, we are able to replace the calcium dynamics with a simpler threshold-triggered current of fixed waveform, which is amenable to analytical treatment. We obtain analytically the mean somatic membrane potential excursion due to a calcium spike being triggered while in the fluctuating regime. Our analytical form that accounts for the covariance between conductances and the membrane potential shows a better agreement with simulation results than a naive first order approximation. PMID:26283954

  8. The future of population-based postmarket drug risk assessment: a regulator's perspective.

    PubMed

    Hammad, T A; Neyarapally, G A; Iyasu, S; Staffa, J A; Dal Pan, G

    2013-09-01

    The US Food and Drug Administration emphasizes the role of regulatory science in the fulfillment of its mission to promote and protect public health and foster innovation. With respect to the evaluation of drug effects in the real world, regulatory science plays an important role in drug risk assessment and management. This article discusses opportunities and challenges with population-based drug risk assessment as well as related regulatory science knowledge gaps in the following areas: (i) population-based data sources and methods to evaluate drug safety issues; (ii) evidence-based thresholds to account for uncertainty in postmarket data; (iii) approaches to optimize the integration and interpretation of evidence from different sources; and (iv) approaches to evaluate the real-world impact of regulatory decisions. Regulators should continue the ongoing dialogue with multiple stakeholders to strengthen regulatory safety science and address these and other critical knowledge gaps. PMID:23739537

  9. Population-based absolute risk estimation with survey data.

    PubMed

    Kovalchik, Stephanie A; Pfeiffer, Ruth M

    2014-04-01

    Absolute risk is the probability that a cause-specific event occurs in a given time interval in the presence of competing events. We present methods to estimate population-based absolute risk from a complex survey cohort that can accommodate multiple exposure-specific competing risks. The hazard function for each event type consists of an individualized relative risk multiplied by a baseline hazard function, which is modeled nonparametrically or parametrically with a piecewise exponential model. An influence method is used to derive a Taylor-linearized variance estimate for the absolute risk estimates. We introduce novel measures of the cause-specific influences that can guide modeling choices for the competing event components of the model. To illustrate our methodology, we build and validate cause-specific absolute risk models for cardiovascular and cancer deaths using data from the National Health and Nutrition Examination Survey. Our applications demonstrate the usefulness of survey-based risk prediction models for predicting health outcomes and quantifying the potential impact of disease prevention programs at the population level. PMID:23686614
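
    The core quantity, the absolute (crude) risk of one cause in the presence of a competing cause with piecewise-constant baseline hazards scaled by an individualized relative risk, can be computed exactly as sketched below; the hazards, relative risk, and time grid are hypothetical, and the survey weighting and influence-based variance estimation of the paper are omitted.

    ```python
    import numpy as np

    def absolute_risk(t_grid, haz1, haz2, rr1=1.0, rr2=1.0):
        """Absolute risk of cause 1 by each time on t_grid, in the presence of
        competing cause 2: F1(t) = integral of rr1*h1(u)*S(u) du, where S is
        survival from both causes.  haz1/haz2 are piecewise-constant baseline
        hazards on the intervals of t_grid (hypothetical values below)."""
        dt = np.diff(t_grid)
        h1, h2 = rr1 * haz1, rr2 * haz2
        cum = np.concatenate([[0.0], np.cumsum((h1 + h2) * dt)])  # total cumulative hazard
        S = np.exp(-cum[:-1])                                     # survival at interval start
        # probability of a cause-1 event within each interval, then accumulate
        p1 = S * (h1 / (h1 + h2)) * (1.0 - np.exp(-(h1 + h2) * dt))
        return np.cumsum(p1)

    years = np.arange(0, 11)                  # ten one-year intervals
    cvd_haz = np.full(10, 0.010)              # cardiovascular death hazard per year
    cancer_haz = np.full(10, 0.008)           # competing cancer death hazard per year
    print(absolute_risk(years, cvd_haz, cancer_haz, rr1=1.8)[-1])   # 10-year CVD risk
    ```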

  10. Threshold-like complexation of conjugated polymers with small molecule acceptors in solution within the neighbor-effect model.

    PubMed

    Sosorev, Andrey Yu; Parashchuk, Olga D; Zapunidi, Sergey A; Kashtanov, Grigoriy S; Golovnin, Ilya V; Kommanaboyina, Srikanth; Perepichka, Igor F; Paraschuk, Dmitry Yu

    2016-02-14

    In some donor-acceptor blends based on conjugated polymers, a pronounced charge-transfer complex (CTC) forms in the electronic ground state. In contrast to small-molecule donor-acceptor blends, the CTC concentration in polymer:acceptor solution can increase with the acceptor content in a threshold-like way. This threshold-like behavior was earlier attributed to the neighbor effect (NE) in the polymer complexation, i.e., new CTCs are preferentially formed near existing ones; however, the NE origin is unknown. To address the factors affecting the NE, we record the optical absorption data for blends of the most studied conjugated polymers, poly(2-methoxy-5-(2-ethylhexyloxy)-1,4-phenylenevinylene) (MEH-PPV) and poly(3-hexylthiophene) (P3HT), with electron acceptors of the fluorene series, 1,8-dinitro-9,10-anthraquinone, and 7,7,8,8-tetracyanoquinodimethane in different solvents, and then analyze the data within the NE model. We have found that the NE depends on the polymer and acceptor molecular skeletons and solvent, while it does not depend on the acceptor electron affinity and polymer concentration. We conclude that the NE operates within a single macromolecule and stems from planarization of the polymer chain involved in the CTC with an acceptor molecule; as a result, the probability of further complexation with the next acceptor molecules at the adjacent repeat units increases. The steric and electronic microscopic mechanisms of NE are discussed. PMID:26799407

  11. Error thresholds for Abelian quantum double models: Increasing the bit-flip stability of topological quantum memory

    NASA Astrophysics Data System (ADS)

    Andrist, Ruben S.; Wootton, James R.; Katzgraber, Helmut G.

    2015-04-01

    Current approaches for building quantum computing devices focus on two-level quantum systems which nicely mimic the concept of a classical bit, albeit enhanced with additional quantum properties. However, rather than artificially limiting the number of states to two, the use of d -level quantum systems (qudits) could provide advantages for quantum information processing. Among other merits, it has recently been shown that multilevel quantum systems can offer increased stability to external disturbances. In this study we demonstrate that topological quantum memories built from qudits, also known as Abelian quantum double models, exhibit a substantially increased resilience to noise. That is, even when taking into account the multitude of errors possible for multilevel quantum systems, topological quantum error-correction codes employing qudits can sustain a larger error rate than their two-level counterparts. In particular, we find strong numerical evidence that the thresholds of these error-correction codes are given by the hashing bound. Considering the significantly increased error thresholds attained, this might well outweigh the added complexity of engineering and controlling higher-dimensional quantum systems.

  12. Vulnerability and triggers in threshold development: models from the Chihuahuan Desert

    Technology Transfer Automated Retrieval System (TEKTRAN)

    We review models developed for the Draw and Loamy ecological sites in Major Land Resource Area 42.2 in southwestern New Mexico and review empirical support obtained for parts of the model. We describe evidence for 1) vulnerability to transitions and associated triggers and 2) the characteristics of ...

  13. Crossing the Threshold Mindfully: Exploring Rites of Passage Models in Adventure Therapy

    ERIC Educational Resources Information Center

    Norris, Julian

    2011-01-01

    Rites of passage models, drawing from ethnographic descriptions of ritualized transition, are widespread in adventure therapy programmes. However, critical literature suggests that: (a) contemporary rites of passage models derive from a selective and sometimes misleading use of ethnographic materials, and (b) the appropriation of initiatory…

  14. Population-based register of stroke: manual of operations.

    PubMed

    Giampaoli, Simona; Hammar, Niklas; Adany, Roza; De Peretti, Christine

    2007-12-01

    Cardiovascular disease is the leading cause of death and hospitalization in both sexes in nearly all countries of Europe. The main forms of cardiovascular disease are ischaemic heart disease and stroke. Stroke by itself is the second leading cause of death in the European Union, and the annual number of cases of stroke is expected to increase within the next few decades, mainly owing to a growth in the proportion of older people. Stroke is an expensive disease because of the large number of premature deaths, ongoing disability in survivors, and the impact on families or caregivers and on health services (treatment and rehabilitation). Therefore, there is a pressing need to make stroke prevention and treatment a priority, to reduce the growing health burden and lessen its socioeconomic impact. The magnitude of the problem contrasts with the shortage, weak quality, and limited comparability of data available in most European countries. A stepwise surveillance procedure based on standardized data collection, appropriate record linkage, and validation methods was set up by the EUROCISS project (EUROpean Cardiovascular Indicators Surveillance Set), to build up comparable and reliable indicators for the surveillance of stroke at the population level. This manual of operations is intended for health professionals and policy makers. It provides a standardized and simple model for the implementation of a population-based register, which can provide estimates of attack rate and case fatality. The manual recommends starting from a minimum data set. Before implementing a population-based register, it is important to identify the target population under surveillance, which should preferably cover a well-defined geographical and administrative area or region representative of the whole country, where population data and vital statistics (mortality and hospital discharge records at least) are routinely collected and easily available each year. All cases among residents should be recorded

  15. A Distinct Catabolic to Anabolic Threshold Due to Single-Cell Static Nanomechanical Stimulation in a Cartilage Biokinetics Model

    PubMed Central

    Saha, Asit K.; Kohles, Sean S.

    2010-01-01

    Understanding physicochemical interactions during biokinetic regulation will be critical for the creation of relevant nanotechnology supporting cellular and molecular engineering. The impact of nanoscale influences in medicine and biology can be explored in detail through mathematical models as an in silico testbed. In a recent single-cell biomechanical analysis, the cytoskeletal strain response due to fluid-induced stresses was characterized (Wilson, Z. D., and Kohles, S. S., 2010, “Two-Dimensional Modeling of Nanomechanical Strains in Healthy and Diseased Single-Cells During Microfluidic Stress Applications,” J. Nanotech. Eng. Med., 1(2), p. 021005). Results described a microfluidic environment having controlled nanometer and piconewton resolution for explorations of multiscale mechanobiology. In the present study, we constructed a mathematical model exploring the nanoscale biomolecular response to that controlled microenvironment. We introduce mechanical stimuli and scaling factor terms as specific input values for regulating a cartilage molecule synthesis. Iterative model results for this initial multiscale static load application have identified a transition threshold load level from which the mechanical input causes a shift from a catabolic state to an anabolic state. Modeled molecule homeostatic levels appear to be dependent upon the mechanical stimulus as reflected experimentally. This work provides a specific mathematical framework from which to explore biokinetic regulation. Further incorporation of nanomechanical stresses and strains into biokinetic models will ultimately lead to refined mechanotransduction relationships at the cellular and molecular levels. PMID:21152243

  16. Hydrodynamics of sediment threshold

    NASA Astrophysics Data System (ADS)

    Ali, Sk Zeeshan; Dey, Subhasish

    2016-07-01

    A novel hydrodynamic model for the threshold of cohesionless sediment particle motion under a steady unidirectional streamflow is presented. The hydrodynamic forces (drag and lift) acting on a solitary sediment particle resting over a closely packed bed formed by the identical sediment particles are the primary motivating forces. The drag force comprises the form drag and the form-induced drag. The lift force includes the Saffman lift, Magnus lift, centrifugal lift, and turbulent lift. The points of action of the force system are appropriately obtained, for the first time, from the basics of micro-mechanics. The sediment threshold is envisioned as the rolling mode, which is the plausible mode to initiate particle motion on the bed. The moment balance of the force system on the solitary particle about the pivoting point of rolling yields the governing equation. The conditions of sediment threshold under the hydraulically smooth, transitional, and rough flow regimes are examined. The effects of velocity fluctuations are addressed by applying the statistical theory of turbulence. This study shows that for a hindrance coefficient of 0.3, the threshold curve (threshold Shields parameter versus shear Reynolds number) has an excellent agreement with the experimental data of uniform sediments. However, most of the experimental data are bounded by the upper and lower limiting threshold curves, corresponding to the hindrance coefficients of 0.2 and 0.4, respectively. The threshold curve of this study is compared with those of previous researchers. The present model also agrees satisfactorily with the experimental data of nonuniform sediments.

  17. Universal squash model for optical communications using linear optics and threshold detectors

    NASA Astrophysics Data System (ADS)

    Fung, Chi-Hang Fred; Chau, H. F.; Lo, Hoi-Kwong

    2011-08-01

    Transmission of photons through open-air or optical fibers is an important primitive in quantum-information processing. Theoretical descriptions of this process often consider single photons as information carriers and thus fail to accurately describe experimental implementations where any number of photons may enter a detector. It has been a great challenge to bridge this big gap between theory and experiments. One powerful method for achieving this goal is by conceptually squashing the received multiphoton states to single-photon states. However, until now, only a few protocols admit a squash model; furthermore, a recently proven no-go theorem appears to rule out the existence of a universal squash model. Here we show that a necessary condition presumed by all existing squash models is in fact too stringent. By relaxing this condition, we find that, rather surprisingly, a universal squash model actually exists for many protocols, including quantum key distribution, quantum state tomography, Bell's inequality testing, and entanglement verification.

  18. Population-Based Study of Baseline Ethanol Consumption and Risk of Incident Essential Tremor

    PubMed Central

    Louis, Elan D.; Benito-León, Julián; Bermejo-Pareja, Félix

    2009-01-01

    Background Recent postmortem studies have demonstrated pathological changes, including Purkinje cell loss, in the cerebellum in essential tremor (ET). Toxic exposures that compromise cerebellar tissue could lower the threshold for developing ET. Ethanol is a well-established cerebellar toxin, resulting in Purkinje cell loss. Objective To test whether higher baseline ethanol consumption is a risk factor for the subsequent development of incident ET. Methods Lifetime ethanol consumption was assessed at baseline (1994-1995) in a prospective, population-based study in central Spain of 3,285 elderly participants, 76 of whom developed incident ET by follow-up (1997-1998). Results In a Cox proportional hazards model adjusting for cigarette pack-years, depressive symptoms and community, the baseline number of drink-years was marginally associated with higher risk of incident ET (relative risk, RR = 1.003, p = 0.059). In an adjusted Cox model, highest baseline drink-year quartile doubled the risk of incident ET (RR = 2.29, p = 0.018) while other quartiles were associated with more modest elevations in risk (RR3rd quartile = 1.82 [p = 0.10], RR2nd quartile = 1.75 [p = 0.10], RR1st quartile = 1.43 [p = 0.34] vs. non-drinkers [RR = 1.00]). With each higher drink-year quartile, risk of incident ET increased an average of 23% (p = 0.01, test for trend). Conclusions Higher levels of chronic ethanol consumption increased the risk of developing ET. Ethanol is often used for symptomatic relief; studies should explore whether higher consumption levels are a continued source of underlying cerebellar neurotoxicity in patients who already manifest this disease. PMID:19359288
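
    An adjusted Cox model of this kind can be sketched with the lifelines package on synthetic data; the variables, the per-quartile trend coding, and the simulated follow-up below are assumptions meant only to mirror the structure of the reported analysis, not its data.

    ```python
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(3)
    n = 3000
    df = pd.DataFrame({
        "drink_quartile": rng.integers(0, 5, n),   # 0 = non-drinker, 1-4 = quartiles,
                                                   # treated here as a linear trend
        "pack_years":     rng.exponential(10, n),
        "depressive":     rng.integers(0, 2, n),
    })
    # synthetic follow-up: higher drink-year quartile raises the incident-ET hazard
    base_haz = 0.01 * np.exp(0.2 * df.drink_quartile)
    df["time_yr"] = rng.exponential(1.0 / base_haz)
    df["incident_ET"] = (df.time_yr < 3.5).astype(int)   # administrative censoring
    df.loc[df.incident_ET == 0, "time_yr"] = 3.5

    cph = CoxPHFitter()
    cph.fit(df, duration_col="time_yr", event_col="incident_ET")
    cph.print_summary()    # hazard ratios per covariate, cf. the reported RRs
    ```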

  19. Combining regional estimation and historical floods: A multivariate semiparametric peaks-over-threshold model with censored data

    NASA Astrophysics Data System (ADS)

    Sabourin, Anne; Renard, Benjamin

    2015-12-01

    The estimation of extreme flood quantiles is challenging due to the relative scarcity of extreme data compared to typical target return periods. Several approaches have been developed over the years to face this challenge, including regional estimation and the use of historical flood data. This paper investigates the combination of both approaches using a multivariate peaks-over-threshold model that allows estimating altogether the intersite dependence structure and the marginal distributions at each site. The joint distribution of extremes at several sites is constructed using a semiparametric Dirichlet Mixture model. The existence of partially missing and censored observations (historical data) is accounted for within a data augmentation scheme. This model is applied to a case study involving four catchments in Southern France, for which historical data are available since 1604. The comparison of marginal estimates from four versions of the model (with or without regionalizing the shape parameter; using or ignoring historical floods) highlights significant differences in terms of return level estimates. Moreover, the availability of historical data on several nearby catchments allows investigating the asymptotic dependence properties of extreme floods. Catchments display a significant amount of asymptotic dependence, calling for adapted multivariate statistical models.

  20. The simcyp population based simulator: architecture, implementation, and quality assurance.

    PubMed

    Jamei, Masoud; Marciniak, Steve; Edwards, Duncan; Wragg, Kris; Feng, Kairui; Barnett, Adrian; Rostami-Hodjegan, Amin

    2013-01-01

    Developing a user-friendly platform that can handle a vast number of complex physiologically based pharmacokinetic and pharmacodynamic (PBPK/PD) models both for conventional small molecules and larger biologic drugs is a substantial challenge. Over the last decade the Simcyp Population Based Simulator has gained popularity in major pharmaceutical companies (70% of the top 40 in terms of R&D spending). Under the Simcyp Consortium guidance, it has evolved from a simple drug-drug interaction tool to a sophisticated and comprehensive Model Based Drug Development (MBDD) platform that covers a broad range of applications spanning from early drug discovery to late drug development. This article provides an update on the latest architectural and implementation developments within the Simulator. Interconnection between peripheral modules, the dynamic model building process and compound and population data handling are all described. The Simcyp Data Management (SDM) system, which contains the system and drug databases, can help with implementing quality standards by seamless integration and tracking of any changes. This also helps with internal approval procedures, validation and auto-testing of the newly implemented models and algorithms, an area of high interest to regulatory bodies. PMID:25505654

  1. Threshold dynamics of a time periodic reaction-diffusion epidemic model with latent period

    NASA Astrophysics Data System (ADS)

    Zhang, Liang; Wang, Zhi-Cheng; Zhao, Xiao-Qiang

    2015-05-01

    In this paper, we first propose a time-periodic reaction-diffusion epidemic model which incorporates simple demographic structure and the latent period of infectious disease. Then we introduce the basic reproduction number R0 for this model and prove that the sign of R0 - 1 determines the local stability of the disease-free periodic solution. By using the comparison arguments and persistence theory, we further show that the disease-free periodic solution is globally attractive if R0 < 1, while there is an endemic periodic solution and the disease is uniformly persistent if R0 > 1.
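
    For the non-spatial, autonomous analogue of such a model, R0 is the spectral radius of the next-generation matrix F V^-1; the sketch below computes it for a standard SEIR structure with a latent class and hypothetical rates. The paper's R0 for a time-periodic reaction-diffusion system requires an operator eigenvalue problem, which this sketch does not attempt.

    ```python
    import numpy as np

    def basic_reproduction_number(beta, sigma, gamma, mu):
        """R0 as the spectral radius of F V^-1 for a non-spatial, autonomous SEIR
        model with latent class E: F collects new infections, V the transitions.
        This is only the ODE analogue of the paper's periodic reaction-diffusion R0."""
        F = np.array([[0.0, beta],
                      [0.0, 0.0]])                 # new infections enter E
        V = np.array([[sigma + mu, 0.0],
                      [-sigma, gamma + mu]])       # E -> I progression and removal
        return max(abs(np.linalg.eigvals(F @ np.linalg.inv(V))))

    R0 = basic_reproduction_number(beta=0.6, sigma=1/5, gamma=1/7, mu=1/(70*365))
    print(R0, "disease persists" if R0 > 1 else "disease dies out")
    ```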

  2. Identifying Atomic Structure as a Threshold Concept: Student Mental Models and Troublesomeness

    ERIC Educational Resources Information Center

    Park, Eun Jung; Light, Gregory

    2009-01-01

    Atomic theory or the nature of matter is a principal concept in science and science education. This has, however, been complicated by the difficulty students have in learning the concept and the subsequent construction of many alternative models. To understand better the conceptual barriers to learning atomic structure, this study explores the…

  3. Performance of the SWEEP model affected by estimates of threshold friction velocity

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The Wind Erosion Prediction System (WEPS) is a process-based model and needs to be verified under a broad range of climatic, soil, and management conditions. Occasional failure of the WEPS erosion submodel (Single-event Wind Erosion Evaluation Program or SWEEP) to simulate erosion in the Columbia Pl...

  4. Part I. A look at population-based medical care.

    PubMed

    Weiss, K

    1998-08-01

    Recent trends toward managed health care have generated interest in developing strategies to manage the health care of a population as a whole. Population-based medicine places the individual patient within the context of the larger community, which is composed of both sick and well individuals; when viewed in these terms, only a small proportion of the people who consult a primary care physician are at risk for substantial morbidity. However, the physician serves as the central figure for delivering population-based health care to the entire community. Many strategies for population-based care contain the following 4 basic elements: 1. Identifying the health and disease states that are likely to be responsive to population-based care, 2. Applying principles of epidemiology to define the population-of-interest, 3. Assembling a multidisciplinary team, and 4. Building information systems to support ongoing surveillance of population-based care. To date, most of the published examples of population-based management have been conducted in managed care environments, but population-based management may also be used by a single physician practice or a small group practice. Programs aimed at health promotion or disease prevention are among the easiest to implement. By examining the results of an entire population with a given condition, physicians and their teams may begin to identify ways to improve the overall delivery of care, either by establishing new procedures or improving old ones. PMID:9735940

  5. Elaborating on Threshold Concepts

    ERIC Educational Resources Information Center

    Rountree, Janet; Robins, Anthony; Rountree, Nathan

    2013-01-01

    We propose an expanded definition of Threshold Concepts (TCs) that requires the successful acquisition and internalisation not only of knowledge, but also its practical elaboration in the domains of applied strategies and mental models. This richer definition allows us to clarify the relationship between TCs and Fundamental Ideas, and to account…

  6. Bayesian approach to color-difference models based on threshold and constant-stimuli methods.

    PubMed

    Brusola, Fernando; Tortajada, Ignacio; Lengua, Ismael; Jordá, Begoña; Peris, Guillermo

    2015-06-15

    An alternative approach based on statistical Bayesian inference is presented to deal with the development of color-difference models and the precision of parameter estimation. The approach was applied to simulated data and real data, the latter published by selected authors involved with the development of color-difference formulae using traditional methods. Our results show very good agreement between the Bayesian and classical approaches. Among other benefits, our proposed methodology allows one to determine the marginal posterior distribution of each random individual parameter of the color-difference model. In this manner, it is possible to analyze the effect of individual parameters on the statistical significance calculation of a color-difference equation. PMID:26193510
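
    A minimal plain-NumPy version of the idea, Bayesian estimation of a visual threshold from constant-stimuli (yes/no) data with a random-walk Metropolis sampler and a logistic psychometric function, is sketched below; the data, priors, and parameterization are hypothetical and much simpler than the authors' color-difference models.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    # constant-stimuli data: color differences shown and "perceived different" counts
    dE    = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0])
    n_rep = np.full(dE.size, 30)
    n_yes = np.array([2, 6, 14, 22, 27, 29])

    def log_post(theta):
        """Log-posterior for (threshold T, slope s) with vague normal priors and a
        logistic psychometric function P(yes) = 1/(1 + exp(-(dE - T)/s))."""
        T, s = theta
        if s <= 0:
            return -np.inf
        p = np.clip(1.0 / (1.0 + np.exp(-(dE - T) / s)), 1e-9, 1 - 1e-9)
        loglik = np.sum(n_yes * np.log(p) + (n_rep - n_yes) * np.log(1 - p))
        return loglik - 0.5 * ((T - 1.5) ** 2 / 4.0 + (s - 0.5) ** 2 / 4.0)

    # random-walk Metropolis: the marginal posterior of T is the Bayesian answer
    # for the color-difference threshold
    theta = np.array([1.5, 0.5])
    lp = log_post(theta)
    samples = []
    for _ in range(20000):
        prop = theta + rng.normal(0.0, 0.05, 2)
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        samples.append(theta.copy())
    post_T = np.array(samples)[5000:, 0]
    print(post_T.mean(), np.percentile(post_T, [2.5, 97.5]))
    ```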

  7. Permafrost landscapes in transition - towards modeling interactions, thresholds and feedbacks related to ice-rich ground

    NASA Astrophysics Data System (ADS)

    Westermann, Sebastian; Langer, Moritz; Lee, Hanna; Berntsen, Terje; Boike, Julia; Krinner, Gerhard; Aalstad, Kristoffer; Schanke Aas, Kjetil; Peter, Maria; Heikenfeld, Max; Etzelmüller, Bernd

    2016-04-01

    Thawing of permafrost is governed by a complex interplay of different processes, of which only conductive heat transfer is taken into account in most model studies. However, heat conduction alone cannot account for the dynamical evolution of many permafrost landscapes, e.g. in areas rich in ground ice shaped by thermokarst ponds and lakes. Novel process parameterizations are required to include such phenomena in future projections of permafrost thaw and the climatic feedbacks they trigger. Recently, we have demonstrated a physically based parameterization for thaw processes in ice-rich ground in the permafrost model CryoGrid 3, which can reproduce the formation of thermokarst ponds and subsidence of the ground following thawing of ice-rich subsurface layers. Long-term simulations for different subsurface stratigraphies in the Lena River Delta, Siberia, demonstrate that the hydrological regime can both accelerate and delay permafrost thawing. If meltwater from thawed ice-rich layers can drain, the ground subsides while at the same time the formation of a talik is delayed. If the meltwater pools at the surface, a pond is formed which enhances heat transfer in the ground and leads to the formation of a talik. The PERMANOR project, funded by the Norwegian Research Council until 2019, will extend this work by integrating such small-scale processes in larger-scale Earth System Models (ESMs). For this purpose, the project will explore and develop statistical approaches, in particular tiling, to represent permafrost landscape dynamics at the subgrid scale. Ultimately, PERMANOR will conceptualize process understanding from in-situ studies to develop new model algorithms and pursue their implementation in a coupled ESM framework.

  8. Yield threshold decision framework

    SciTech Connect

    Judd, B.R.; Younker, L.W.; Hannon, W.J.

    1989-08-17

    The USA is developing a decision analysis framework for evaluating the relative value of lower yield thresholds and related verification policies. The framework facilitates systematic analysis of the major issues in the yield threshold decision. The framework can be used to evaluate options proposed either in the inter-agency process or in the negotiations. In addition, the framework can measure the importance of uncertainties and alternative judgments, and thereby determine the advantages of additional research. Since the model is explicit and quantitative, it provides a rational, defensible approach for reaching important treaty and verification decisions. 9 figs.

  9. The threshold feeding response of microzooplankton within Pacific high-nitrate low-chlorophyll ecosystem models under steady and variable iron input

    NASA Astrophysics Data System (ADS)

    Leising, Andrew W.; Gentleman, Wendy C.; Frost, Bruce W.

    2003-11-01

    The equatorial Pacific is an HNLC (High-Nitrate Low-Chlorophyll) region. Modeling and in-situ process studies have confirmed the importance of microzooplankton grazing in this ecosystem. Unfortunately, both the parameters and functions representing microzooplankton grazing within current ecosystem models are poorly constrained. We used a simple 4-component food web model to test the assumption that a lower grazing threshold, which is common in many models, is necessary to achieve the HNLC condition. Without the grazing threshold, the model did not reproduce the HNLC condition. However, by raising the half-saturation constant within the microzooplankton functional response with no threshold, it was possible to reproduce the critical dynamics of the HNLC condition under both steady and moderate seasonal variability in nutrient input. It was also possible to reproduce the HNLC system using a sigmoidal functional response for the microzooplankton, with results somewhere between the other two forms of the model, although this version had the highest sensitivity to changes in its parameters. The three models predicted similar phytoplankton biomass and primary productivity under steady nutrient input, but diverged in these metrics as the amplitude of nutrient input variability increased. These three functional responses also imply certain important differences in the microzooplankton community. Whereas the threshold model had the least sensitivity to parameter choice, the high half-saturation constant, no-threshold model may actually be a better approximation when modeling a community of grazers. Ecosystem models that predict carbon production and export in HNLC regions can be very sensitive to assumptions concerning microzooplankton grazing; future studies need to concentrate on the functional responses of microzooplankton before these models can be used for predicting fluxes in times or regions where forcing is beyond that used to constrain the original model.
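
    The three grazing formulations being compared can be written down generically as below: a response with an explicit lower prey threshold, a Michaelis-Menten (Holling type II) response with a deliberately high half-saturation constant, and a sigmoidal (Holling type III) response. The functional forms and parameter values are generic illustrations, not the paper's calibrated parameterizations.

    ```python
    import numpy as np

    def threshold_linear(P, g_max, k, P0=0.1):
        """Grazing with a lower prey threshold P0: no feeding below P0."""
        return np.where(P > P0, g_max * (P - P0) / (k + (P - P0)), 0.0)

    def michaelis_menten(P, g_max, k):
        """Holling type II response; a high half-saturation constant k can mimic
        HNLC-compatible grazing without an explicit threshold."""
        return g_max * P / (k + P)

    def sigmoidal(P, g_max, k):
        """Holling type III (sigmoidal) response."""
        return g_max * P**2 / (k**2 + P**2)

    P = np.linspace(0, 2, 5)                       # phytoplankton biomass (arbitrary units)
    print(threshold_linear(P, g_max=1.0, k=0.5))
    print(michaelis_menten(P, g_max=1.0, k=1.5))   # deliberately high k
    print(sigmoidal(P, g_max=1.0, k=0.7))
    ```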

  10. The CNP signal is able to silence a supra threshold neuronal model

    PubMed Central

    Camera, Francesca; Paffi, Alessandra; Thomas, Alex W.; Apollonio, Francesca; D'Inzeo, Guglielmo; Prato, Frank S.; Liberti, Micaela

    2015-01-01

    Several experimental results published in the literature showed that weak pulsed magnetic fields affected the response of the central nervous system. However, the specific biological mechanisms that regulate the observed behaviors are still unclear and further scientific investigation is required. In this work we performed simulations on a neuronal network model exposed to a specific pulsed magnetic field signal that seems to be very effective in modulating the brain activity: the Complex Neuroelectromagnetic Pulse (CNP). Results show that CNP can silence the neurons of a feed-forward network for signal intensities that depend on the strength of the bias current, the endogenous noise level and the specific waveforms of the pulses. Therefore, it is conceivable that a neuronal network model responds to the CNP signal with an inhibition of its activity. Further studies on more realistic neuronal networks are needed to clarify if such an inhibitory effect on neuronal tissue may be the basis of the induced analgesia seen in humans and the antinociceptive effects seen in animals when exposed to the CNP. PMID:25972807

  11. Finger Vein Segmentation from Infrared Images Based on a Modified Separable Mumford Shah Model and Local Entropy Thresholding.

    PubMed

    Vlachos, Marios; Dermatas, Evangelos

    2015-01-01

    A novel method for finger vein pattern extraction from infrared images is presented. This method involves four steps: preprocessing, which performs local normalization of the image intensity, image enhancement, image segmentation, and finally postprocessing for image cleaning. In the image enhancement step, an image which will be both smooth and similar to the original is sought. The enhanced image is obtained by minimizing the objective function of a modified separable Mumford Shah Model. Since this minimization procedure is computationally intensive for large images, a local application of the Mumford Shah Model in small window neighborhoods is proposed. The finger veins are located in concave nonsmooth regions and so, in order to distinguish them from the other tissue parts, all the differences between the smooth neighborhoods, obtained by the local application of the model, and the corresponding windows of the original image are added. After that, veins in the enhanced image have been sufficiently emphasized. Thus, after image enhancement, an accurate segmentation can be obtained readily by a local entropy thresholding method. Finally, the resulting binary image may suffer from some misclassifications, and so a postprocessing step is performed in order to extract a robust finger vein pattern. PMID:26120357
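
    The final thresholding ingredient can be illustrated with a global maximum-entropy (Kapur-style) threshold on a grayscale array; the paper applies an entropy criterion locally after Mumford-Shah enhancement, so the sketch below, with its synthetic stand-in for the enhanced image, shows only the general idea under simplifying assumptions.

    ```python
    import numpy as np

    def max_entropy_threshold(image, bins=256):
        """Kapur-style maximum-entropy threshold for a grayscale image in [0, 1]:
        choose the cut that maximizes the summed entropies of the two normalized
        sub-histograms (global variant only)."""
        hist, edges = np.histogram(image.ravel(), bins=bins, range=(0.0, 1.0))
        p = hist / hist.sum()
        best_t, best_h = 0, -np.inf
        for t in range(1, bins):
            p0, p1 = p[:t].sum(), p[t:].sum()
            if p0 < 1e-12 or p1 < 1e-12:
                continue
            q0, q1 = p[:t] / p0, p[t:] / p1
            h = -(q0[q0 > 0] * np.log(q0[q0 > 0])).sum() \
                - (q1[q1 > 0] * np.log(q1[q1 > 0])).sum()
            if h > best_h:
                best_h, best_t = h, t
        return edges[best_t]

    # synthetic stand-in for an enhanced image where vein regions have larger values
    rng = np.random.default_rng(5)
    img = np.clip(rng.normal(0.3, 0.05, (64, 64)), 0.0, 1.0)
    img[20:40, 10:50] += 0.3
    img = np.clip(img, 0.0, 1.0)
    veins = img > max_entropy_threshold(img)
    print(veins.mean())    # fraction of pixels labelled as vein
    ```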

  12. Comparison of estimates of hip dysplasia genetic parameters in Estrela Mountain Dog using linear and threshold models.

    PubMed

    Silvestre, A M; Ginja, M M D; Ferreira, A J A; Colaço, J

    2007-08-01

    Genetic parameters, breeding values, and genetic trends of hip dysplasia in Estrela Mountain Dogs were estimated using a linear model (LM) and a threshold model (TM). A database with 313 animals was used. Right and left hip joints were individually scored, according to the Fédération Cynologique Internationale grading rules of the canine hip dysplasia system, as normal (1), borderline (2), slight (3), moderate (4), and severe (5 and 6). The estimate of repeatability was lower in LM (0.86) than in TM (0.90). The same tendency was observed for heritability, estimated at 0.38 in LM and 0.43 in TM. However, these results did not establish any statistical differences between the models. The genetic trend of canine hip dysplasia for LM and TM showed a similarity in shape, but considerable individual differences were found in the EBV ranking lists. Therefore, the selection of breeding animals would not be the same with the 2 methodologies. To select the best method for genetic evaluation of hip dysplasia, further studies using more data and other dog breeds are required. PMID:17468417

  13. Finger Vein Segmentation from Infrared Images Based on a Modified Separable Mumford Shah Model and Local Entropy Thresholding

    PubMed Central

    Vlachos, Marios; Dermatas, Evangelos

    2015-01-01

    A novel method for finger vein pattern extraction from infrared images is presented. This method involves four steps: preprocessing, which performs local normalization of the image intensity, image enhancement, image segmentation, and finally postprocessing for image cleaning. In the image enhancement step, an image which will be both smooth and similar to the original is sought. The enhanced image is obtained by minimizing the objective function of a modified separable Mumford Shah Model. Since this minimization procedure is computationally intensive for large images, a local application of the Mumford Shah Model in small window neighborhoods is proposed. The finger veins are located in concave nonsmooth regions and so, in order to distinguish them from the other tissue parts, all the differences between the smooth neighborhoods, obtained by the local application of the model, and the corresponding windows of the original image are added. After that, veins in the enhanced image have been sufficiently emphasized. Thus, after image enhancement, an accurate segmentation can be obtained readily by a local entropy thresholding method. Finally, the resulting binary image may suffer from some misclassifications, and so a postprocessing step is performed in order to extract a robust finger vein pattern. PMID:26120357

  14. Hot-spot model for calculating the threshold for shock initiation of pyrotechnic mixtures

    SciTech Connect

    Maiden, D.E.; Nutt, G.L.

    1986-05-14

    A model for predicting the pressure required to initiate a reaction in pyrotechnic mixtures is described. The pore temperature is determined by calculating the dynamics of pore collapse. An approximate solution for the motion of the pore radius is determined as a function of the pore size, viscosity, yield stress and pressure. The heating of the material surrounding the pore is given by an approximate solution of the heat conduction equation with a source term accounting for viscoplastic heating as a function of the pore motion. Ignition occurs when the surface temperature of the pore matches the hot-spot ignition criterion. The hot-spot ignition temperatures for 2Al/Fe₂O₃, Ti/2B, and Ti/C are determined. Predictions for the ignition pressure of 2Al/Fe₂O₃ (thermite) are in reasonable agreement with experiment. 18 refs.

  15. Analytical two-dimensional modeling for potential distribution and threshold voltage of the short-channel fully depleted SOI (silicon-on-insulator) MOSFET

    NASA Astrophysics Data System (ADS)

    Aggarwal, Vaneeta; Khanna, Manoj K.; Sood, Rachna; Haldar, Subhasis; Gupta, R. S.

    1994-08-01

    A two-dimensional analytical model for fully depleted SOI MOSFETs is presented. An extensive study of potential distribution in the silicon film is carried out for non-uniform doping distribution and extended to find an expression for threshold voltage in the submicrometer region. The results so obtained are verified with experimental data. The present model calculates a critical gate voltage (for short-channel fully depleted SOI devices) beyond which the gate loses its control over the drain current. The advantages of SOI MOSFETs over their bulk counterparts are explained on the basis of drain-induced barrier lowering (DIBL). It is also shown that the threshold voltage for the thin-film SOI MOSFET is less than that of the bulk MOSFET. The short-channel effects, DIBL and threshold voltage reduction, are well predicted in the present model.

  16. Drought Risk Modeling for Thermoelectric Power Plants Siting using an Excess Over Threshold Approach

    SciTech Connect

    Bekera, Behailu B; Francis, Royce A; Omitaomu, Olufemi A

    2014-01-01

    Water availability is among the most important elements of thermoelectric power plant site selection and evaluation criteria. With increased variability and changes in hydrologic statistical stationarity, one concern is the increased occurrence of extreme drought events that may be attributable to climatic changes. As hydrological systems are altered, operators of thermoelectric power plants need to ensure a reliable supply of water for cooling and generation requirements. The effects of climate change are expected to influence hydrological systems at multiple scales, possibly leading to reduced efficiency of thermoelectric power plants. In this paper, we model drought characteristics from the operational and regulatory perspective of thermoelectric systems. A systematic approach to characterise a stream environment in relation to extreme drought occurrence, duration and deficit-volume is proposed and demonstrated. This approach can potentially enhance early-stage decisions in identifying candidate sites for a thermoelectric power plant application and allow investigation and assessment of varying degrees of drought risk during more advanced stages of the siting process.

  17. Threshold dose for peripheral neuropathy following intraoperative radiotherapy (IORT) in a large animal model

    SciTech Connect

    Kinsella, T.J.; DeLuca, A.M.; Barnes, M.; Anderson, W.; Terrill, R.; Sindelar, W.F. )

    1991-04-01

    Radiation injury to peripheral nerve is a dose-limiting toxicity in the clinical application of intraoperative radiotherapy, particularly for pelvic and retroperitoneal tumors. Intraoperative radiotherapy-related peripheral neuropathy in humans receiving doses of 20-25 Gy is manifested as a mixed motor-sensory deficit beginning 6-9 months following treatment. In a previous experimental study of intraoperative radiotherapy-related neuropathy of the lumbosacral plexus, an approximate inverse linear relationship was reported between the intraoperative dose (20-75 Gy range) and the time to onset of hind limb paresis (1-12 mos following intraoperative radiotherapy). The principal histological lesion in irradiated nerve was loss of large nerve fibers and perineural fibrosis without significant vascular injury. Similar histological changes in irradiated nerves were found in humans. To assess peripheral nerve injury at lower doses of intraoperative radiotherapy in this same large animal model, groups of four adult American Foxhounds received doses of 10, 15, or 20 Gy to the right lumbosacral plexus and sciatic nerve using 9 MeV electrons. The left lumbosacral plexus and sciatic nerve were excluded from the intraoperative field to allow each animal to serve as its own control. Following treatment, a complete neurological exam, electromyogram, and nerve conduction studies were performed monthly for 1 year. Monthly neurological exams were performed in years 2 and 3 whereas electromyogram and nerve conduction studies were performed every 3 months during this follow-up period. With follow-up of greater than or equal to 42 months, no dog receiving 10 or 15 Gy IORT showed any clinical or laboratory evidence of peripheral nerve injury. However, all four dogs receiving 20 Gy developed right hind limb paresis at 8, 9, 9, and 12 mos following intraoperative radiotherapy.

  18. PopTract: Population-Based Tractography

    PubMed Central

    Yap, Pew-Thian; Gilmore, John H.; Lin, Weili

    2016-01-01

    White matter fiber tractography plays a key role in the in vivo understanding of brain circuitry. For tract-based comparison of a population of images, a common approach is to first generate an atlas by averaging, after spatial normalization, all images in the population, and then perform tractography using the constructed atlas. The reconstructed fiber trajectories form a common geometry onto which diffusion properties of each individual subject can be projected based on the corresponding locations in the subject native space. However, in the case of high angular resolution diffusion imaging (HARDI), where modeling fiber crossings is an important goal, the above-mentioned averaging method for generating an atlas results in significant error in the estimation of local fiber orientations and causes a major loss of fiber crossings. These limitations have a significant impact on the accuracy of the reconstructed fiber trajectories and jeopardize subsequent tract-based analysis. As a remedy, we present in this paper a more effective means of performing tractography at a population level. Our method entails determining a bipolar Watson distribution at each voxel location based on information given by all images in the population, giving us not only the local principal orientations of the fiber pathways, but also confidence levels of how reliable these orientations are across subjects. The distribution field is then fed as an input to a probabilistic tractography framework for reconstructing a set of fiber trajectories that are consistent across all images in the population. We observe that the proposed method, called PopTract, results in significantly better preservation of fiber crossings, and hence yields better trajectory reconstruction in the atlas space. PMID:21571607
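
    Estimating the mean axis of a bipolar Watson distribution at a voxel reduces to an eigen-decomposition of the average dyadic (outer-product) tensor of the subjects' unit directions; the sketch below shows that step on synthetic data, with the leading eigenvalue used as a crude across-subject consistency score (an assumption for illustration, not the paper's concentration estimator).

    ```python
    import numpy as np

    def watson_axis(directions):
        """Estimate the mean axis of a bipolar Watson distribution from unit
        vectors (n x 3), sign-invariant: the leading eigenvector of the mean
        dyadic tensor T = mean(x x^T).  The leading eigenvalue (in [1/3, 1])
        serves here as a simple across-subject consistency score."""
        X = directions / np.linalg.norm(directions, axis=1, keepdims=True)
        T = X.T @ X / X.shape[0]
        evals, evecs = np.linalg.eigh(T)        # eigenvalues in ascending order
        return evecs[:, -1], evals[-1]

    # synthetic "population": per-subject principal directions at one voxel,
    # scattered around a common fiber orientation with random sign flips
    rng = np.random.default_rng(6)
    true_axis = np.array([1.0, 1.0, 0.0]) / np.sqrt(2)
    dirs = true_axis + rng.normal(0.0, 0.15, (40, 3))
    dirs *= rng.choice([-1.0, 1.0], size=(40, 1))    # antipodal symmetry
    axis, consistency = watson_axis(dirs)
    print(axis, consistency)
    ```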

  19. Elaborating on threshold concepts

    NASA Astrophysics Data System (ADS)

    Rountree, Janet; Robins, Anthony; Rountree, Nathan

    2013-09-01

    We propose an expanded definition of Threshold Concepts (TCs) that requires the successful acquisition and internalisation not only of knowledge, but also its practical elaboration in the domains of applied strategies and mental models. This richer definition allows us to clarify the relationship between TCs and Fundamental Ideas, and to account for both the important and the problematic characteristics of TCs in terms of the Knowledge/Strategies/Mental Models Framework defined in previous work.

  20. Learning foraging thresholds for lizards

    SciTech Connect

    Goldberg, L.A.; Hart, W.E.; Wilson, D.B.

    1996-01-12

    This work gives a proof of convergence for a randomized learning algorithm that describes how anoles (lizards found in the Caribbean) learn a foraging threshold distance. This model assumes that an anole will pursue a prey item if and only if it is within this threshold of the anole's perch. This learning algorithm was proposed by the biologist Roughgarden and his colleagues. They experimentally confirmed that this algorithm quickly converges to the foraging threshold that is predicted by optimal foraging theory. Our analysis provides an analytic confirmation that the learning algorithm converges to this optimal foraging threshold with high probability.
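
    The record does not give the update rule itself, so the following is only a hypothetical stochastic-approximation sketch of the general idea: chases near the current cutoff nudge the cutoff in the direction of their net energy payoff, so the cutoff settles where a marginal chase just breaks even, which is the optimal cutoff in this toy energy model. All parameter values are made up.

```python
import random

def learn_cutoff(gain=1.2, cost_per_m=0.2, arena=10.0, probe=0.5,
                 steps=50000, lr=0.02, seed=1):
    """Illustrative stochastic rule (not the published algorithm): chases whose
    distance lies within a small probe window around the current cutoff move the
    cutoff in the direction of that chase's net energy payoff, so the cutoff
    settles near the break-even distance gain / cost_per_m."""
    random.seed(seed)
    cutoff = arena / 2.0                       # initial guess for the pursuit distance
    for _ in range(steps):
        d = random.uniform(0.0, arena)         # distance of the next prey item
        if abs(d - cutoff) <= probe / 2.0:     # marginal prey: chase it and learn from it
            cutoff += lr * (gain - cost_per_m * d)
            cutoff = min(max(cutoff, 0.0), arena)
        # prey well inside the cutoff are chased too, but carry no learning signal here
    return cutoff

print(learn_cutoff())   # settles at about 6.0 = gain / cost_per_m in this toy setup
```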

  1. Genetic threshold hypothesis of neocortical spike-and-wave discharges in the rat: An animal model of petit mal epilepsy

    SciTech Connect

    Vadasz, C.; Fleischer, A.; Carpi, D.; Jando, G.

    1995-02-27

    Neocortical high-voltage spike-and-wave discharges (HVS) in the rat are an animal model of petit mal epilepsy. Genetic analysis of the total duration of HVS (s/12 hr) in reciprocal F1 and F2 hybrids of F344 and BN rats indicated that the phenotypic variability of HVS cannot be explained by a simple, monogenic Mendelian model. Biometrical analysis suggested the presence of additive, dominance, and sex-linked-epistatic effects, buffering maternal influence, and heterosis. A high correlation was observed between the average duration (s/episode) and the frequency of occurrence of spike-and-wave episodes (n/12 hr) in parental and segregating generations, indicating that common genes affect both the duration and the frequency of the spike-and-wave pattern. We propose that both genetic and developmental-environmental factors control an underlying quantitative variable, which, above a certain threshold level, precipitates HVS discharges. These findings, together with the recent availability of rat DNA markers for total genome mapping, pave the way to the identification of genes that control the susceptibility of the brain to spike-and-wave discharges. 67 refs., 3 figs., 5 tabs.

  2. Possible recovery or unavoidable fall? A model to predict the one step balance recovery threshold and its stepping characteristics.

    PubMed

    Vallée, Pascal; Tisserand, Romain; Robert, Thomas

    2015-11-01

    In order to prevent fall-related injuries and their consequences, one needs to be able to predict the outcome of a given balance perturbation: a possible Balance Recovery (BR) or an unavoidable fall? Given that results from the existing experimental studies are difficult to compare and to generalize, we propose to address this question with a numerical tool. Built on existing concepts from the biomechanics and robotics literature, it includes the optimal use of BR reactions and particularly the possibility to perform a recovery step. It allows estimation of 1) the possibility of recovering steady balance from a given initial state or perturbation using at most one recovery step, and 2) the set of recovery steps leading to a BR. Using standard sets of parameters for young and elderly populations, we assessed this model's predictions against experimental data from the literature in the anterior direction. Two classical representations of the human body (inverted pendulum (IP) vs. linear inverted pendulum (LIP)) were also compared. The results showed that the model correctly predicted the possibility to recover using a single protective step (1-Step BR threshold) and the characteristics (step length and time) of the protective step for both the young and the elderly. This tool has real potential in the field of fall prevention for detecting risky situations. It could also be used to get insights into the neuromuscular mechanisms involved in the BR process. PMID:26602371
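
    As a rough illustration of the linear-inverted-pendulum (LIP) representation mentioned above, the sketch below uses the standard extrapolated-center-of-mass ("capture point") criterion to classify a perturbation as recoverable in place, recoverable with one step, or a predicted fall. It is not the paper's model; the function name, the base-of-support limit and the maximum step length are assumed inputs.

```python
import math

def one_step_recovery(x_com, v_com, com_height, bos_limit, max_step, g=9.81):
    """Minimal linear-inverted-pendulum check (a sketch, not the paper's tool).

    x_com, v_com : forward CoM position (m, relative to the ankle) and velocity (m/s)
    bos_limit    : forward edge of the base of support without stepping (m)
    max_step     : farthest point at which the swing foot can be placed (m)
    """
    omega = math.sqrt(g / com_height)
    capture_point = x_com + v_com / omega      # extrapolated CoM ("capture point")
    if capture_point <= bos_limit:
        return "no step needed"
    if capture_point <= max_step:
        return "1-step recovery possible"      # place the foot at or beyond the capture point
    return "fall predicted (one step is not enough)"

# example: 1 m/s forward push, CoM 1 m high, 10 cm of in-place support, 80 cm max step
print(one_step_recovery(x_com=0.0, v_com=1.0, com_height=1.0,
                        bos_limit=0.10, max_step=0.80))
```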

  3. A software tool to model genetic regulatory networks. Applications to the modeling of threshold phenomena and of spatial patterning in Drosophila.

    PubMed

    Dilão, Rui; Muraro, Daniele

    2010-01-01

    We present a general methodology in order to build mathematical models of genetic regulatory networks. This approach is based on the mass action law and on the Jacob and Monod operon model. The mathematical models are built symbolically by the Mathematica software package GeneticNetworks. This package accepts as input the interaction graphs of the transcriptional activators and repressors of a biological process and, as output, gives the mathematical model in the form of a system of ordinary differential equations. All the relevant biological parameters are chosen automatically by the software. Within this framework, we show that concentration-dependent threshold effects in biology emerge from the catalytic properties of genes and their associated conservation laws. We apply this methodology to segment patterning in early Drosophila development and we calibrate the genetic transcriptional network responsible for the patterning of the gap gene proteins Hunchback and Knirps along the antero-posterior axis of the Drosophila embryo. In this approach, the zygotically produced proteins Hunchback and Knirps do not diffuse along the antero-posterior axis of the embryo of Drosophila, developing a spatial pattern due to concentration-dependent thresholds. This shows that patterning at the gap gene stage can be explained by the concentration gradients along the embryo of the transcriptional regulators. PMID:20523731
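
    The GeneticNetworks package itself is Mathematica software and is not reproduced here; the sketch below only illustrates, with SciPy, one classic mass-action route to a concentration threshold (molecular titration plus operator conservation), which is in the spirit of the thresholds-from-conservation-laws argument but not necessarily the paper's exact mechanism. All species names and rate constants are arbitrary.

```python
import numpy as np
from scipy.integrate import odeint

# Mass-action sketch: an activator A is titrated by an inhibitor R (A + R <-> C, very tight),
# and only free A can occupy the gene's operator (A + G <-> GA); the bound gene makes protein P.
# Conservation of R and of the operator (G + GA = 1) turns the smooth kinetics into a sharp,
# threshold-like P response around A_total = R_total.

def rhs(y, t, A_tot, R_tot, kon_c=100.0, koff_c=0.01, kon_g=10.0, koff_g=1.0,
        k_p=1.0, d_p=0.5):
    C, GA, P = y                      # A:R complex, occupied operator, protein
    A = A_tot - C - GA                # free activator (conservation of A)
    R = R_tot - C                     # free inhibitor (conservation of R)
    G = 1.0 - GA                      # free operator  (conservation of the gene)
    dC = kon_c * A * R - koff_c * C
    dGA = kon_g * A * G - koff_g * GA
    dP = k_p * GA - d_p * P
    return [dC, dGA, dP]

R_tot = 5.0
t = np.linspace(0.0, 200.0, 2000)
for A_tot in (2.0, 4.0, 5.0, 6.0, 8.0):
    P_end = odeint(rhs, [0.0, 0.0, 0.0], t, args=(A_tot, R_tot))[-1, 2]
    print(f"A_total = {A_tot:4.1f}  ->  steady-state protein = {P_end:5.2f}")
```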

  4. Cost-effectiveness of tenofovir gel in urban South Africa: model projections of HIV impact and threshold product prices

    PubMed Central

    2014-01-01

    Background There is urgent need for effective HIV prevention methods that women can initiate. The CAPRISA 004 trial showed that a tenofovir-based vaginal microbicide had significant impact on HIV incidence among women. This study uses the trial findings to estimate the population-level impact of the gel on HIV and HSV-2 transmission, and price thresholds at which widespread product introduction would be as cost-effective as male circumcision in urban South Africa. Methods The estimated ‘per sex-act’ HIV and HSV-2 efficacies were imputed from CAPRISA 004. A dynamic HIV/STI transmission model, parameterised and fitted to Gauteng (HIV prevalence of 16.9% in 2008), South Africa, was used to estimate the impact of gel use over 15 years. Uptake was assumed to increase linearly to 30% over 10 years, with gel use in 72% of sex-acts. Full economic programme and averted HIV treatment costs were modelled. The cost per DALY averted is estimated, along with the microbicide price that equalises its cost-effectiveness with that of male circumcision. Results Using plausible assumptions about product introduction, we predict that tenofovir gel use could lead to a 12.5% and 4.9% reduction in HIV and HSV-2 incidence respectively, by year 15. Microbicide introduction is predicted to be highly cost-effective (under $300 per DALY averted), though the dose price would need to be just $0.12 to be as cost-effective as male circumcision. A single dose or highly effective (83% HIV efficacy per sex-act) regimen would allow for more realistic threshold prices ($0.25 and $0.33 per dose, respectively). Conclusions These findings show that an effective coitally-dependent microbicide could reduce HIV incidence by 12.5% in this setting, if current condom use is maintained. For microbicides to be in the range of the most cost-effective HIV prevention interventions, product costs will need to decrease substantially. PMID:24405719
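
    The study's cost model is far more detailed, but the threshold-price idea reduces to simple arithmetic: find the per-dose price at which the programme's net cost per DALY averted equals the comparator's. The sketch below uses entirely hypothetical numbers to show the calculation, not the paper's inputs or results.

```python
def threshold_dose_price(base_cost, doses_used, treatment_costs_averted,
                         dalys_averted, comparator_cost_per_daly):
    """Gel price per dose at which the programme's net cost per DALY averted equals
    the comparator's (e.g., male circumcision). All inputs here are hypothetical."""
    allowable_net_cost = comparator_cost_per_daly * dalys_averted
    return (allowable_net_cost + treatment_costs_averted - base_cost) / doses_used

# toy numbers only, for illustration of the arithmetic
price = threshold_dose_price(base_cost=30e6, doses_used=500e6,
                             treatment_costs_averted=40e6,
                             dalys_averted=200e3,
                             comparator_cost_per_daly=150.0)
print(f"threshold price per dose: ${price:.2f}")
```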

  5. Glycemic Change After Pancreaticoduodenectomy: A Population-Based Study.

    PubMed

    Wu, Jin-Ming; Ho, Te-Wei; Kuo, Ting-Chun; Yang, Ching-Yao; Lai, Hong-Shiee; Chiang, Pin-Yi; Hsieh, Su-Hua; Lai, Feipei; Tien, Yu-Wen

    2015-07-01

    The purpose of this population-based study was to determine the change of glucose metabolism in patients undergoing pancreaticoduodenectomy (PD). We conducted a nationwide cohort study using data from Taiwan's National Health Insurance Research Database collected between 2000 and 2010. Our sample included 861 subjects with type 2 diabetes mellitus (DM) and 3914 subjects without DM. Of 861 subjects with type 2 diabetes, 174 patients (20.2%) experienced resolution of their diabetes after PD, including patients with pancreatic ductal adenocarcinoma (PDAC) (20.5%), and non-PDAC (20.1%). Using a multiple logistic regression model, we found that subjects with comorbid chronic pancreatitis (odds ratio, 0.356; 95% CI, 0.167-0.759; P = 0.007) and use of insulin (odds ratio, 0.265; 95% CI, 0.171-0.412; P < 0.001) had significantly lower rates of resolution of diabetes. In the 3914 subjects without diabetes, the only statistically significant comorbidity contributing to pancreatogenic diabetes was chronic pancreatitis (odds ratio, 1.446; 95% CI, 1.146-1.823; P = 0.002). Subjects with comorbid chronic pancreatitis and use of insulin had lower rates of resolution of DM after PD. In subjects without diabetes, chronic pancreatitis contributed significantly to the development of pancreatogenic DM. PMID:26166104

  6. A semi-analytic power balance model for low (L) to high (H) mode transition power threshold

    SciTech Connect

    Singh, R.; Jhang, Hogun; Kaw, P. K.; Diamond, P. H.; Nordman, H.; Bourdelle, C.

    2014-06-15

    We present a semi-analytic model for the low (L) to high (H) mode transition power threshold (P_th). Two main assumptions are made in our study. First, high poloidal mode number drift resistive ballooning modes (high-m DRBM) are assumed to be the dominant turbulence driver in a narrow edge region near the last closed flux surface. Second, the pre-transition edge profile and turbulent diffusivity in the narrow edge region are assumed to follow turbulent equipartition. An edge power balance relation is derived by calculating the power flux dissipated through turbulent conduction and convection, and through radiation, in the edge region. P_th is obtained by imposing the turbulence quench rule due to sheared E × B rotation. Evaluation of P_th shows good agreement with experimental results in existing machines. The increase of P_th at low density (i.e., the existence of a roll-over density in P_th vs. density) is shown to originate from the density profile having a longer scale length than the temperature profile.

  7. Lowered threshold energy for femtosecond laser induced optical breakdown in a water based eye model by aberration correction with adaptive optics.

    PubMed

    Hansen, Anja; Géneaux, Romain; Günther, Axel; Krüger, Alexander; Ripken, Tammo

    2013-06-01

    In femtosecond laser ophthalmic surgery, tissue dissection is achieved by photodisruption based on laser-induced optical breakdown. In order to minimize collateral damage to the eye, laser surgery systems should be optimized towards the lowest possible energy threshold for photodisruption. However, optical aberrations of the eye and the laser system distort the irradiance distribution from an ideal profile, which causes a rise in breakdown threshold energy even if great care is taken to minimize the aberrations of the system during design and alignment. In this study we used a water chamber with an achromatic focusing lens and a scattering sample as an eye model and determined the breakdown threshold in single-pulse plasma transmission loss measurements. Due to aberrations, the precise lower limit for breakdown threshold irradiance in water is still unknown. Here we show that the threshold energy can be substantially reduced when using adaptive optics to improve the irradiance distribution by spatial beam shaping. We found that for initial aberrations with a root-mean-square wave front error of only one third of the wavelength the threshold energy can still be reduced by a factor of three if the aberrations are corrected to the diffraction limit by adaptive optics. The transmitted pulse energy is reduced by 17% at twice the threshold. Furthermore, the gas bubble motions after breakdown for pulse trains at 5 kilohertz repetition rate show a more transverse direction in the corrected case compared to the more spherical distribution without correction. Our results demonstrate how both applied and transmitted pulse energy could be reduced during ophthalmic surgery when correcting for aberrations. As a consequence, the risk of retinal damage by transmitted energy and the extent of collateral damage to the focal volume could be minimized accordingly when using adaptive optics in fs-laser surgery. PMID:23761849

  9. STILLBIRTH (CO)VARIANCE COMPONENTS FOR A SIRE-MATERNAL GRANDSIRE THRESHOLD MODEL AND DEVELOPMENT OF A CALVING ABILITY INDEX FOR SIRE SELECTION

    Technology Transfer Automated Retrieval System (TEKTRAN)

    (Co)variance components for stillbirth in US Holsteins were estimated under a sire-maternal grandsire threshold model using subsets of data from the national calving ease database, which includes over 7 million calving records with associated stillbirth scores. Stillbirth was coded as a binomial tra...

  10. An Analytical Threshold Voltage Model of Fully Depleted (FD) Recessed-Source/Drain (Re-S/D) SOI MOSFETs with Back-Gate Control

    NASA Astrophysics Data System (ADS)

    Saramekala, Gopi Krishna; Tiwari, Pramod Kumar

    2016-06-01

    This paper presents an analytical threshold voltage model for back-gated fully depleted (FD), recessed-source drain silicon-on-insulator metal-oxide-semiconductor field-effect transistors (MOSFETs). Analytical surface potential models have been developed at front and back surfaces of the channel by solving the two-dimensional (2-D) Poisson's equation in the channel region with appropriate boundary conditions assuming a parabolic potential profile in the transverse direction of the channel. The strong inversion criterion is applied to the front surface potential as well as on the back one in order to find two separate threshold voltages for front and back channels of the device, respectively. The device threshold voltage has been assumed to be associated with the surface that offers a lower threshold voltage. The developed model was analyzed extensively for a variety of device geometry parameters like the oxide and silicon channel thicknesses, the thickness of the source/drain extension in the buried oxide, and the applied bias voltages with back-gate control. The proposed model has been validated by comparing the analytical results with numerical simulation data obtained from ATLAS™, a 2-D device simulator from SILVACO.
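
    For readers unfamiliar with the parabolic-approximation step that this and the other 2-D threshold-voltage models in this list share, a generic sketch follows. The coefficients, sign conventions and boundary conditions here are the textbook single-front-gate form, not this paper's exact expressions (assumes amsmath).

```latex
% Generic parabolic-approximation step (a sketch of the method, not this paper's
% exact coefficients or boundary conditions).
\begin{align*}
  &\text{2-D Poisson equation in the fully depleted film:}
    &&\frac{\partial^{2}\phi}{\partial x^{2}}
       + \frac{\partial^{2}\phi}{\partial y^{2}}
       = \frac{q N_{A}}{\varepsilon_{\mathrm{Si}}},\\
  &\text{parabolic ansatz across the film (direction } y\text{):}
    &&\phi(x,y) = \phi_{s}(x) + c_{1}(x)\,y + c_{2}(x)\,y^{2},\\
  &\text{front-oxide boundary condition at } y=0\text{:}
    &&\left.\frac{\partial\phi}{\partial y}\right|_{y=0}
       = \frac{\varepsilon_{\mathrm{ox}}}{\varepsilon_{\mathrm{Si}}}\,
         \frac{\phi_{s}(x) - \left(V_{GS} - V_{FB}\right)}{t_{\mathrm{ox}}}.
\end{align*}
% Applying the boundary conditions eliminates c_1 and c_2 and reduces Poisson's
% equation to d^2(phi_s)/dx^2 - phi_s/lambda^2 = const, where lambda is a natural
% length set by the film and oxide thicknesses; the threshold voltage is then the
% V_GS at which the minimum of phi_s(x) reaches the strong-inversion value 2*phi_F.
```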

  11. A threshold model for opposing actions of acetylcholine on reward behavior: Molecular mechanisms and implications for treatment of substance abuse disorders.

    PubMed

    Grasing, Kenneth

    2016-10-01

    The cholinergic system plays important roles in both learning and addiction. Medications that modify cholinergic tone can have pronounced effects on behaviors reinforced by natural and drug reinforcers. Importantly, enhancing the action of acetylcholine (ACh) in the nucleus accumbens and ventral tegmental area (VTA) dopamine system can either augment or diminish these behaviors. A threshold model is presented that can explain these seemingly contradictory results. Relatively low levels of ACh rise above a lower threshold, facilitating behaviors supported by drugs or natural reinforcers. Further increases in cholinergic tone that rise above a second upper threshold oppose the same behaviors. Accordingly, cholinesterase inhibitors, or agonists for nicotinic or muscarinic receptors, each have the potential to produce biphasic effects on reward behaviors. Pretreatment with either nicotinic or muscarinic antagonists can block drug- or food-reinforced behavior by maintaining cholinergic tone below its lower threshold. Potential threshold mediators include desensitization of nicotinic receptors and biphasic effects of ACh on the firing of medium spiny neurons. Nicotinic receptors with high- and low-affinity appear to play greater roles in reward enhancement and inhibition, respectively. Cholinergic inhibition of natural and drug rewards may serve as mediators of previously described opponent processes. Future studies should evaluate cholinergic agents across a broader range of doses, and include a variety of reinforced behaviors. PMID:27316344

  12. Guiding principles and checklist for population-based quality metrics.

    PubMed

    Krishnan, Mahesh; Brunelli, Steven M; Maddux, Franklin W; Parker, Thomas F; Johnson, Douglas; Nissenson, Allen R; Collins, Allan; Lacson, Eduardo

    2014-06-01

    The Centers for Medicare and Medicaid Services oversees the ESRD Quality Incentive Program to ensure that the highest quality of health care is provided by outpatient dialysis facilities that treat patients with ESRD. To that end, the Centers for Medicare and Medicaid Services uses clinical performance measures to evaluate quality of care under a pay-for-performance or value-based purchasing model. Now more than ever, the ESRD therapeutic area serves as the vanguard of health care delivery. By translating medical evidence into clinical performance measures, the ESRD Prospective Payment System became the first disease-specific sector using the pay-for-performance model. A major challenge for the creation and implementation of clinical performance measures is the adjustments that are necessary to transition from taking care of individual patients to managing the care of patient populations. The National Quality Forum and others have developed effective and appropriate population-based clinical performance measure quality metrics that can be aggregated at the physician, hospital, dialysis facility, nursing home, or surgery center level. Clinical performance measures considered for endorsement by the National Quality Forum are evaluated using five key criteria: evidence, performance gap, and priority (impact); reliability; validity; feasibility; and usability and use. We have developed a checklist of special considerations for clinical performance measure development according to these National Quality Forum criteria. Although the checklist is focused on ESRD, it could also have broad application to chronic disease states, where health care delivery organizations seek to enhance quality, safety, and efficiency of their services. Clinical performance measures are likely to become the norm for tracking performance for health care insurers. Thus, it is critical that the methodologies used to develop such metrics serve the payer and the provider and most importantly, reflect

  13. A two-dimensional analytical model for channel potential and threshold voltage of short channel dual material gate lightly doped drain MOSFET

    NASA Astrophysics Data System (ADS)

    Shweta, Tripathi

    2014-11-01

    An analytical model for the channel potential and the threshold voltage of the short channel dual-material-gate lightly doped drain (DMG-LDD) metal-oxide-semiconductor field-effect transistor (MOSFET) is presented using the parabolic approximation method. The proposed model takes into account the effects of the LDD region length, the LDD region doping, the lengths of the gate materials and their respective work functions, along with all the major geometrical parameters of the MOSFET. The impact of the LDD region length, the LDD region doping, and the channel length on the channel potential is studied in detail. Furthermore, the threshold voltage of the device is calculated using the minimum middle channel potential, and the result obtained is compared with the DMG MOSFET threshold voltage to show the improvement in the threshold voltage roll-off. It is shown that the DMG-LDD MOSFET structure alleviates the problem of short channel effects (SCEs) and the drain induced barrier lowering (DIBL) more efficiently. The proposed model is verified by comparing the theoretical results with the simulated data obtained by using the commercially available ATLAS™ 2D device simulator.

  14. Two-dimensional models of threshold voltage and subthreshold current for symmetrical double-material double-gate strained Si MOSFETs

    NASA Astrophysics Data System (ADS)

    Yan-hui, Xin; Sheng, Yuan; Ming-tang, Liu; Hong-xia, Liu; He-cai, Yuan

    2016-03-01

    The two-dimensional models for symmetrical double-material double-gate (DM-DG) strained Si (s-Si) metal-oxide semiconductor field effect transistors (MOSFETs) are presented. The surface potential and the surface electric field expressions have been obtained by solving Poisson’s equation. The models of threshold voltage and subthreshold current are obtained based on the surface potential expression. The surface potential and the surface electric field are compared with those of single-material double-gate (SM-DG) MOSFETs. The effects of different device parameters on the threshold voltage and the subthreshold current are demonstrated. The analytical models give deep insight into the device parameters design. The analytical results obtained from the proposed models show good matching with the simulation results using DESSIS. Project supported by the National Natural Science Foundation of China (Grant Nos. 61376099, 11235008, and 61205003).

  15. Young adults' trajectories of Ecstasy use: a population based study.

    PubMed

    Smirnov, Andrew; Najman, Jake M; Hayatbakhsh, Reza; Plotnikova, Maria; Wells, Helene; Legosz, Margot; Kemp, Robert

    2013-11-01

    Young adults' Ecstasy use trajectories have important implications for individual and population-level consequences of Ecstasy use, but little relevant research has been conducted. This study prospectively examines Ecstasy trajectories in a population-based sample. Data are from the Natural History Study of Drug Use, a retrospective/prospective cohort study conducted in Australia. Population screening identified a probability sample of Ecstasy users aged 19-23 years. Complete data for 30 months of follow-up, comprising 4 time intervals, were available for 297 participants (88.4% of sample). Trajectories were derived using cluster analysis based on recent Ecstasy use at each interval. Trajectory predictors were examined using a generalized ordered logit model and included Ecstasy dependence (World Mental Health Composite International Diagnostic Instrument), psychological distress (Hospital Anxiety Depression Scale), aggression (Young Adult Self Report) and contextual factors (e.g. attendance at electronic/dance music events). Three Ecstasy trajectories were identified (low, intermediate and high use). At its peak, the high-use trajectory involved 1-2 days Ecstasy use per week. Decreasing frequency of use was observed for intermediate and high-use trajectories from 12 months, independently of market factors. Intermediate and high-use trajectory membership was predicted by past Ecstasy consumption (>70 pills) and attendance at electronic/dance music events. High-use trajectory members were unlikely to have used Ecstasy for more than 3 years and tended to report consistently positive subjective effects at baseline. Given the social context and temporal course of Ecstasy use, Ecstasy trajectories might be better understood in terms of instrumental rather than addictive drug use patterns. PMID:23899430

  16. Numerical Modeling of Cloud Convection With High Condensation Threshold: Implication to Methane Convective Clouds in Titan's Atmosphere

    NASA Astrophysics Data System (ADS)

    Nakajima, K.; Ishiwatari, M.; Takehiro, S.; Hayashi, Y.

    2004-12-01

    Recent ground-based observations and the first Cassini flyby reveal prominent cloud activities near the south pole of Titan. Their characteristics imply a convective origin. On the other hand, it has been proposed that a large degree of supersaturation is required for condensation of methane to occur. Here, we examine how such a high condensation threshold affects the nature of cloud convection and the overall structure of the atmosphere through explicit numerical modeling of cloud convection. As a first step, we perform sensitivity experiments designed to isolate the effects of the large supersaturation in the setup of the earth's tropical atmosphere, because the condition for Titan's atmosphere is not well constrained. We conduct long-term integrations of a two-dimensional non-hydrostatic cloud convection model that extends 4,096 km in the horizontal direction and includes three-category (vapor-cloud-rain) parameterized microphysics. We compare the simulated cloud convection in the case with the "ordinary" condensation scheme with that in the case with "Titan's" condensation scheme, where water vapor is allowed to condense into cloud water only under a highly supersaturated condition; after nucleation, water vapor rapidly condenses onto the cloud water toward the exactly saturated state, and cloud water also evaporates toward the exactly saturated state under appropriate conditions (e.g., in the downward flow of air). The results show that, in "Titan's" case, individual convective clouds are much stronger, larger and longer-lived. The convective towers occur only at one or two limited locations in the 4,096 km domain instead of occurring in a rather scattered manner as in the "ordinary" case. The average atmosphere in "Titan's" case is supersaturated around the condensation level and the tropopause, but the degree of supersaturation is much smaller than that specified as the condensation criterion. The temperature structure is maintained to be conditionally unstable. Although

  17. Bayesian estimation of dose thresholds

    NASA Technical Reports Server (NTRS)

    Groer, P. G.; Carnes, B. A.

    2003-01-01

    An example is described of Bayesian estimation of radiation absorbed dose thresholds (subsequently simply referred to as dose thresholds) using a specific parametric model applied to a data set on mice exposed to 60Co gamma rays and fission neutrons. A Weibull based relative risk model with a dose threshold parameter was used to analyse, as an example, lung cancer mortality and determine the posterior density for the threshold dose after single exposures to 60Co gamma rays or fission neutrons from the JANUS reactor at Argonne National Laboratory. The data consisted of survival, censoring times and cause of death information for male B6CF1 unexposed and exposed mice. The 60Co gamma whole-body doses for the two exposed groups were 0.86 and 1.37 Gy. The neutron whole-body doses were 0.19 and 0.38 Gy. Marginal posterior densities for the dose thresholds for neutron and gamma radiation were calculated with numerical integration and found to have quite different shapes. The density of the threshold for 60Co is unimodal with a mode at about 0.50 Gy. The threshold density for fission neutrons declines monotonically from a maximum value at zero with increasing doses. The posterior densities for all other parameters were similar for the two radiation types.
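
    The record's model is a Weibull-based relative-risk survival model; the sketch below keeps only the grid-based marginalization idea, using a much simpler Poisson count likelihood with a piecewise-linear excess risk and flat priors on synthetic data. The dose groups echo those quoted above, but the person-years and death counts are invented for illustration.

```python
import numpy as np
from scipy.special import logsumexp

# Grid-based marginal posterior for a dose threshold tau (simplified sketch):
# lambda(d) = lambda0 * (1 + beta * max(d - tau, 0)), Poisson counts, flat priors.
doses        = np.array([0.0, 0.19, 0.38, 0.86, 1.37])       # Gy (illustrative groups)
person_years = np.array([4000., 1500., 1500., 1200., 1000.]) # synthetic
deaths       = np.array([40, 15, 18, 20, 26])                # synthetic counts, not real data

tau_grid  = np.linspace(0.0, 1.0, 101)                       # candidate thresholds (Gy)
lam0_grid = np.linspace(0.005, 0.02, 40)                     # baseline rate (nuisance)
beta_grid = np.linspace(0.0, 3.0, 40)                        # excess-risk slope (nuisance)

def log_lik(tau, lam0, beta):
    mu = lam0 * (1.0 + beta * np.maximum(doses - tau, 0.0)) * person_years
    return np.sum(deaths * np.log(mu) - mu)                  # Poisson log-likelihood (constant dropped)

# marginalize the nuisance parameters on the grid with a log-sum-exp
log_marg = np.array([
    logsumexp([[log_lik(tau, l0, b) for b in beta_grid] for l0 in lam0_grid])
    for tau in tau_grid
])
post = np.exp(log_marg - log_marg.max())
post /= np.trapz(post, tau_grid)                             # normalize to a density in tau
print("posterior mode of the threshold:", tau_grid[np.argmax(post)], "Gy")
```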

  18. Using Generalized Additive Modeling to Empirically Identify Thresholds within the ITERS in Relation to Toddlers' Cognitive Development

    ERIC Educational Resources Information Center

    Setodji, Claude Messan; Le, Vi-Nhuan; Schaack, Diana

    2013-01-01

    Research linking high-quality child care programs and children's cognitive development has contributed to the growing popularity of child care quality benchmarking efforts such as quality rating and improvement systems (QRIS). Consequently, there has been an increased interest in and a need for approaches to identifying thresholds, or cutpoints,…

  19. POPULATION-BASED EXPOSURE AND DOSE MODELING FOR AIR POLLUTANTS

    EPA Science Inventory

    This task will address EPA's need to better understand the variability in personal exposure to air pollutants for the purpose of assessing what populations are at risk for adverse health outcomes due to air pollutant exposures. To improve our understanding of exposures to air po...

  20. Air temperature thresholds to evaluate snow melting at the surface of Alpine glaciers by T-index models: the case study of Forni Glacier (Italy)

    NASA Astrophysics Data System (ADS)

    Senese, A.; Maugeri, M.; Vuillermoz, E.; Smiraglia, C.; Diolaiuti, G.

    2014-03-01

    The glacier melt conditions (i.e., null surface temperature and positive energy budget) can be assessed by analyzing meteorological and energy data acquired by a supraglacial Automatic Weather Station (AWS). If such a station is not present, assessing actual melting conditions and evaluating the melt amount is difficult, and simple methods based on T-index (or degree-day) models are generally applied. These models require the choice of a correct temperature threshold. In fact, melt does not necessarily occur at daily air temperatures higher than 273.15 K. In this paper, to detect the threshold most indicative of melt conditions in the April-June period, we have analyzed air temperature data recorded from 2006 to 2012 by a supraglacial AWS set up at 2631 m a.s.l. on the ablation tongue of the Forni Glacier (Italian Alps), and by a weather station located outside the studied glacier (at Bormio, a village at 1225 m a.s.l.). Moreover, we have evaluated the glacier energy budget and the Snow Water Equivalent (SWE) values during this time-frame. The snow ablation amount was then estimated both from the surface energy balance (from supraglacial AWS data) and from the T-index method (from Bormio data, applying the mean tropospheric lapse rate and varying the air temperature threshold), and the results were compared. We found that the mean tropospheric lapse rate permits a good and reliable reconstruction of glacier air temperatures and that the major uncertainty in the computation of snow melt is driven by the choice of an appropriate temperature threshold. Our study shows that a threshold value 5.0 K lower than the widely applied 273.15 K permits the most reliable reconstruction of glacier melt.
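
    A minimal positive-degree-day sketch of the workflow described above: extrapolate off-glacier temperatures to the AWS elevation with the mean tropospheric lapse rate, then accumulate melt above a chosen threshold. The degree-day factor and the example temperature series are assumed values, not the study's data.

```python
# Positive-degree-day (T-index) melt sketch in the spirit of the study.
LAPSE_RATE = -0.0065          # K per metre (mean tropospheric lapse rate)
DDF_SNOW   = 4.5              # mm w.e. per K per day (assumed degree-day factor)

def melt_t_index(temps_valley_K, z_valley, z_glacier, threshold_K):
    """Total snow melt (mm w.e.) from valley-station daily mean temperatures."""
    dz = z_glacier - z_valley
    melt = 0.0
    for t_valley in temps_valley_K:
        t_glacier = t_valley + LAPSE_RATE * dz          # lapse-rate correction
        melt += DDF_SNOW * max(t_glacier - threshold_K, 0.0)
    return melt

# example: one week of (made-up) daily means at Bormio (1225 m) applied to the AWS site
# (2631 m), comparing the usual 273.15 K threshold with one lowered by 5 K
week = [281.0, 282.5, 280.0, 279.0, 283.0, 284.5, 282.0]
for thr in (273.15, 268.15):
    print(thr, "K ->", round(melt_t_index(week, 1225.0, 2631.0, thr), 1), "mm w.e.")
```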

  1. CARA Risk Assessment Thresholds

    NASA Technical Reports Server (NTRS)

    Hejduk, M. D.

    2016-01-01

    Warning remediation threshold (Red threshold): Pc level at which warnings are issued, and active remediation considered and usually executed. Analysis threshold (Green to Yellow threshold): Pc level at which analysis of event is indicated, including seeking additional information if warranted. Post-remediation threshold: Pc level to which remediation maneuvers are sized in order to achieve event remediation and obviate any need for immediate follow-up maneuvers. Maneuver screening threshold: Pc compliance level for routine maneuver screenings (more demanding than regular Red threshold due to additional maneuver uncertainty).

  2. Experimental and Finite Element Modeling of Near-Threshold Fatigue Crack Growth for the K-Decreasing Test Method

    NASA Technical Reports Server (NTRS)

    Smith, Stephen W.; Seshadri, Banavara R.; Newman, John A.

    2015-01-01

    The experimental methods to determine near-threshold fatigue crack growth rate data are prescribed in ASTM standard E647. To produce near-threshold data at a constant stress ratio (R), the applied stress-intensity factor (K) is decreased as the crack grows based on a specified K-gradient. Consequently, as the fatigue crack growth rate threshold is approached and the crack tip opening displacement decreases, remote crack wake contact may occur due to the plastically deformed crack wake surfaces and shield the growing crack tip resulting in a reduced crack tip driving force and non-representative crack growth rate data. If such data are used to life a component, the evaluation could yield highly non-conservative predictions. Although this anomalous behavior has been shown to be affected by K-gradient, starting K level, residual stresses, environmental assisted cracking, specimen geometry, and material type, the specifications within the standard to avoid this effect are limited to a maximum fatigue crack growth rate and a suggestion for the K-gradient value. This paper provides parallel experimental and computational simulations for the K-decreasing method for two materials (an aluminum alloy, AA 2024-T3 and a titanium alloy, Ti 6-2-2-2-2) to aid in establishing clear understanding of appropriate testing requirements. These simulations investigate the effect of K-gradient, the maximum value of stress-intensity factor applied, and material type. A material independent term is developed to guide in the selection of appropriate test conditions for most engineering alloys. With the use of such a term, near-threshold fatigue crack growth rate tests can be performed at accelerated rates, near-threshold data can be acquired in days instead of weeks without having to establish testing criteria through trial and error, and these data can be acquired for most engineering materials, even those that are produced in relatively small product forms.
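
    For reference, the exponential load-shedding law usually quoted for the K-decreasing method can be sketched as follows; the values of K0 and of the normalized K-gradient C below are illustrative only (ASTM E647 remains the authority on acceptable test conditions).

```python
import math

def k_decreasing_schedule(k_initial, a_initial, a_current, c_gradient):
    """Applied stress-intensity factor during a K-decreasing test using the usual
    exponential shedding law K = K0 * exp[C * (a - a0)], where C = (1/K) dK/da is
    the normalized K-gradient (negative for load shedding)."""
    return k_initial * math.exp(c_gradient * (a_current - a_initial))

# example: start at 10 MPa*sqrt(m) and shed with C = -0.08 per mm (illustrative value)
k0, a0, c = 10.0, 12.0, -0.08
for a in (12.0, 16.0, 20.0, 24.0):            # crack length in mm
    print(f"a = {a:4.1f} mm  ->  K = {k_decreasing_schedule(k0, a0, a, c):5.2f} MPa*sqrt(m)")
```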

  3. Evaluation of the most suitable threshold value for modelling snow glacier melt through T-index approach: the case study of Forni Glacier (Italian Alps)

    NASA Astrophysics Data System (ADS)

    Senese, Antonella; Maugeri, Maurizio; Vuillermoz, Elisa; Smiraglia, Claudio; Diolaiuti, Guglielmina

    2014-05-01

    Glacier melt occurs whenever the surface temperature is null (273.15 K) and the net energy budget is positive. These conditions can be assessed by analyzing meteorological and energy data acquired by a supraglacial Automatic Weather Station (AWS). If such a station is not present at the glacier surface, assessing actual melting conditions and evaluating the melt amount is difficult, and degree-day (also named T-index) models are applied. These approaches require the choice of a correct temperature threshold. In fact, melt does not necessarily occur at daily air temperatures higher than 273.15 K, since it is determined by the energy budget, which in turn is only indirectly affected by air temperature. This is the case in the late spring period, when ablation processes start at the glacier surface, thus progressively reducing snow thickness. In this study, to detect the air temperature threshold most indicative of melt conditions in the April-June period, we analyzed air temperature data recorded from 2006 to 2012 by a supraglacial AWS (at 2631 m a.s.l.) on the ablation tongue of the Forni Glacier (Italy), and by a weather station located near the studied glacier (at Bormio, 1225 m a.s.l.). Moreover, we evaluated the glacier energy budget (which gives the actual melt, Senese et al., 2012) and the snow water equivalent values during this time-frame. The ablation amount was then estimated both from the surface energy balance (MEB, from supraglacial AWS data) and from the degree-day method (MT-INDEX, in the latter case applying the mean tropospheric lapse rate to temperature data acquired at Bormio while varying the air temperature threshold), and the results were compared. We found that the mean tropospheric lapse rate permits a good and reliable reconstruction of daily glacier air temperature conditions and that the major uncertainty in the computation of snow melt from degree-day models is driven by the choice of an appropriate air temperature threshold. Then

  4. Analytical Modeling of Potential Distribution and Threshold Voltage of Gate Underlap DG MOSFETs with a Source/Drain Lateral Gaussian Doping Profile

    NASA Astrophysics Data System (ADS)

    Singh, Kunal; Kumar, Mirgender; Goel, Ekta; Singh, Balraj; Dubey, Sarvesh; Kumar, Sanjay; Jit, Satyabrata

    2016-04-01

    This paper reports a new two-dimensional (2D) analytical model for the potential distribution and threshold voltage of the short-channel symmetric gate underlap ultrathin DG MOSFETs with a lateral Gaussian doping profile in the source (S)/drain (D) region. The parabolic approximation and conformal mapping techniques have been explored for solving the 2D Poisson's equation to obtain the channel potential function of the device. The effects of straggle parameter (of the lateral Gaussian doping profile in the S/D region), underlap length, gate length, channel thickness and oxide thickness on the surface potential and threshold voltage have been investigated. The loss of switching speed due to the drain-induced barrier lowering (DIBL) has also been reported. The proposed model results have been validated by comparing them with their corresponding TCAD simulation data obtained by using the commercially available 2D ATLAS™ simulation software.

  5. Coloring geographical threshold graphs

    SciTech Connect

    Bradonjic, Milan; Percus, Allon; Muller, Tobias

    2008-01-01

    We propose a coloring algorithm for sparse random graphs generated by the geographical threshold graph (GTG) model, a generalization of random geometric graphs (RGG). In a GTG, nodes are distributed in a Euclidean space, and edges are assigned according to a threshold function involving the distance between nodes as well as randomly chosen node weights. The motivation for analyzing this model is that many real networks (e.g., wireless networks, the Internet, etc.) need to be studied by using a 'richer' stochastic model (which in this case includes both a distance between nodes and weights on the nodes). Here, we analyze the GTG coloring algorithm together with the graph's clique number, showing formally that in spite of the differences in structure between GTG and RGG, the asymptotic behavior of the chromatic number is identical: χ = ln n / ln ln n (1 + o(1)). Finally, we consider the leading corrections to this expression, again using the coloring algorithm and clique number to provide bounds on the chromatic number. We show that the gap between the lower and upper bound is within C ln n / (ln ln n)^2, and specify the constant C.
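
    A minimal sketch of one common form of the GTG model together with a simple greedy coloring, for readers who want to experiment; it is not the paper's algorithm or analysis, and the exponential weights, the threshold value and the distance kernel used below are assumptions.

```python
import random
import math

def geographical_threshold_graph(n, theta, alpha=2.0, seed=0):
    """One common form of the GTG model (a sketch, not the paper's exact variant):
    nodes get uniform positions in the unit square and i.i.d. exponential weights,
    and u~v iff (w_u + w_v) / dist(u, v)**alpha >= theta."""
    rng = random.Random(seed)
    pos = [(rng.random(), rng.random()) for _ in range(n)]
    w = [rng.expovariate(1.0) for _ in range(n)]
    adj = {u: set() for u in range(n)}
    for u in range(n):
        for v in range(u + 1, n):
            d = math.dist(pos[u], pos[v])
            if d > 0 and (w[u] + w[v]) / d ** alpha >= theta:
                adj[u].add(v)
                adj[v].add(u)
    return adj

def greedy_coloring(adj):
    """Color vertices in decreasing-degree order with the smallest free color."""
    color = {}
    for u in sorted(adj, key=lambda x: -len(adj[x])):
        used = {color[v] for v in adj[u] if v in color}
        color[u] = next(c for c in range(len(adj)) if c not in used)
    return color

adj = geographical_threshold_graph(n=300, theta=500.0)
colors = greedy_coloring(adj)
print("colors used:", max(colors.values()) + 1)
```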

  6. Importance of population-based studies in clinical practice

    PubMed Central

    Ronnie, George; Ve, Ramesh Sathyamangalam; Velumuri, Lokapavani; Asokan, Rashima; Vijaya, Lingam

    2011-01-01

    In the last decade, there have been reports on the prevalence of glaucoma from the Vellore Eye Survey, Andhra Pradesh Eye Diseases Survey, Aravind Comprehensive Eye Survey, Chennai Glaucoma Study and West Bengal Glaucoma Study. Population-based studies provide important information regarding the prevalence and risk factors for glaucoma. They also highlight regional differences in the prevalence of various types of glaucoma. It is possible to gather important insights regarding the number of persons affected with glaucoma and the proportion with undiagnosed disease. We reviewed the different population-based studies from India and compare their findings. The lacunae in ophthalmic care that can be inferred from these studies are identified and possible reasons and solutions are discussed. We also discuss the clinical relevance of the various findings, and how it reflects on clinical practice in the country. Since India has a significantly high disease burden, we examine the possibility of population-based screening for disease in the Indian context. PMID:21150021

  7. Threshold quantum cryptography

    SciTech Connect

    Tokunaga, Yuuki; Okamoto, Tatsuaki; Imoto, Nobuyuki

    2005-01-01

    We present the concept of threshold collaborative unitary transformation or threshold quantum cryptography, which is a kind of quantum version of threshold cryptography. Threshold quantum cryptography states that classical shared secrets are distributed to several parties and a subset of them, whose number is greater than a threshold, collaborates to compute a quantum cryptographic function, while keeping each share secret inside each party. The shared secrets are reusable if no cheating is detected. As a concrete example of this concept, we show a distributed protocol (with threshold) of conjugate coding.

  8. Genetic analysis of the rates of conception using a longitudinal threshold model with random regression in dairy crossbreeding within a tropical environment.

    PubMed

    Buaban, Sayan; Kuchida, Keigo; Suzuki, Mitsuyoshi; Masuda, Yutaka; Boonkum, Wuttigrai; Duangjinda, Monchai

    2016-08-01

    This study was designed to: (i) estimate genetic parameters and breeding values for conception rates (CR) using the repeatability threshold model (RP-THM) and random regression threshold models (RR-THM); and (ii) compare covariance functions for modeling the additive genetic (AG) and permanent environmental (PE) effects in the RR-THM. The CR was defined as the outcome of an insemination. A data set of 130 592 first-lactation insemination records of 55 789 Thai dairy cows, calving between 1996 and 2011, was used in the analyses. All models included fixed effects of year × month of insemination, breed × day in milk to insemination class and age at calving. The random effects consisted of herd × year interaction, service sire, PE, AG and residual. Variance components were estimated using a Bayesian method via Gibbs sampling. Heritability estimates of CR ranged from 0.032 to 0.067, 0.037 to 0.165 and 0.045 to 0.218 for the RR-THM with second-, third- and fourth-order Legendre polynomials, respectively. The heritability estimated from the RP-THM was 0.056. Model comparisons based on goodness of fit, predictive ability, predicted service results of animals, and the pattern of genetic parameter estimates indicated that the model that best fit the outcome of insemination was the RR-THM with two regression coefficients. PMID:26556694

  9. Threshold Concepts in Biochemistry

    ERIC Educational Resources Information Center

    Loertscher, Jennifer

    2011-01-01

    Threshold concepts can be identified for any discipline and provide a framework for linking student learning to curricular design. Threshold concepts represent a transformed understanding of a discipline, without which the learner cannot progress and are therefore pivotal in learning in a discipline. Although threshold concepts have been…

  10. Numerical investigation of a coupled moving boundary model of radial flow in low-permeable stress-sensitive reservoir with threshold pressure gradient

    NASA Astrophysics Data System (ADS)

    Wen-Chao, Liu; Yue-Wu, Liu; Cong-Cong, Niu; Guo-Feng, Han; Yi-Zhao, Wan

    2016-02-01

    The threshold pressure gradient and the formation stress-sensitivity effect, two prominent physical phenomena in the development of a low-permeable reservoir, are both considered here in building a new coupled moving boundary model of radial flow in a porous medium. Moreover, the wellbore storage and skin effects are both incorporated into the inner boundary conditions of the model. The new coupled moving boundary model is strongly nonlinear. A coordinate-transformation-based fully implicit finite difference method is adopted to obtain its numerical solutions. The coordinate transformation involved can equivalently transform the dynamic flow region of the moving boundary model into a fixed region, a unit circle, which is very convenient for computing the model by the finite difference method on fixed spatial grids. The validity of the method can be verified by comparison with a numerical solution obtained from a different numerical method in the existing literature. Finally, the effects of the permeability modulus, threshold pressure gradient, wellbore storage coefficient, and skin factor on the transient wellbore pressure, its derivative, and the formation pressure distribution are analyzed. Project supported by the National Natural Science Foundation of China (Grant No. 51404232), the China Postdoctoral Science Foundation (Grant No. 2014M561074), and the National Science and Technology Major Project, China (Grant No. 2011ZX05038003).

  11. Damage thresholds for terahertz radiation

    NASA Astrophysics Data System (ADS)

    Dalzell, Danielle R.; McQuade, Jill; Vincelette, Rebecca; Ibey, Bennet; Payne, Jason; Thomas, Robert; Roach, W. P.; Roth, Caleb L.; Wilmink, Gerald J.

    2010-02-01

    Several international organizations establish minimum safety standards to ensure that workers and the general population are protected against adverse health effects associated with electromagnetic radiation. Suitable standards are typically defined using published experimental data. To date, few experimental studies have been conducted at Terahertz (THz) frequencies, and as a result, current THz standards have been defined using extrapolated estimates from neighboring spectral regions. In this study, we used computational modeling and experimental approaches to determine tissue-damage thresholds at THz frequencies. For the computational modeling efforts, we used the Arrhenius damage integral to predict damage thresholds. We determined thresholds experimentally for both long (minutes) and short (seconds) THz exposures. For the long exposure studies, we used an in-house molecular gas THz laser (υ= 1.89 THz, 189.92 mW/cm2, 10 minutes) and excised porcine skin. For the short exposure studies, we used the Free Electron Laser (FEL) at Jefferson Laboratory (υ= 0.1-1.0 THz, 2.0-14.0 mW/cm2, 2 seconds) and wet chamois cloths. Thresholds were determined using conventional damage score determination and probit analysis techniques, and tissue temperatures were measured using infrared thermographic techniques. We found that the FEL was ideal for tissue damage studies, while our in-house THz source was not suitable for determining tissue damage thresholds. Using experimental data, the tissue damage threshold (ED50) was determined to be 7.16 W/cm2. This value was in good agreement with that predicted using our computational models. We hope that knowledge of tissue-damage thresholds at THz frequencies helps to ensure the safe use of THz radiation.
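
    The Arrhenius damage integral mentioned above accumulates Omega(t) = ∫ A·exp(-Ea / (R·T(τ))) dτ, with Omega = 1 usually taken as the damage threshold. The sketch below uses the classic Henriques skin coefficients purely for illustration; the study fits its own thresholds for THz exposures.

```python
import math

A  = 3.1e98        # 1/s   (Henriques frequency factor, illustrative)
EA = 6.27e5        # J/mol (Henriques activation energy, illustrative)
R  = 8.314         # J/(mol K)

def damage_integral(temperature_K, dt):
    """Accumulate Omega over a temperature-vs-time history sampled every dt seconds."""
    return sum(A * math.exp(-EA / (R * T)) * dt for T in temperature_K)

# example: skin held at 60 C for 5 s vs. a milder 50 C history of the same duration
for celsius, seconds in ((60.0, 5.0), (50.0, 5.0)):
    history = [celsius + 273.15] * int(seconds / 0.01)
    omega = damage_integral(history, 0.01)
    print(f"{celsius:.0f} C for {seconds:.0f} s -> Omega = {omega:.3f}"
          f" ({'damage' if omega >= 1.0 else 'below threshold'})")
```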

  12. Perinatal risk factors in offenders with severe personality disorder: a population-based investigation

    PubMed Central

    Fazel, Seena; Bakiyeva, Liliya; Cnattingius, Sven; Grann, Martin; Hultman, Christina M.; Lichtenstein, Paul; Geddes, John R.

    2013-01-01

    Although perinatal factors are associated with the development of several psychiatric disorders, it is unknown whether these factors are linked with personality disorder. Cases of personality disorder were drawn from a national registry of all forensic psychiatric evaluations (n=150). Two control groups were used: 1) a sample of forensic evaluations without any psychiatric disorder (n=97), allowing for a nested case-control investigation; and 2) a population-based sample matched by age and gender with no history of psychiatric hospitalization (n=1498). Prematurity (<37 weeks of completed gestation) was significantly associated with a diagnosis of personality disorder, both in the nested and the population-based case-control comparisons, with adjusted odds ratios (OR) for this risk factor ranging from 2 to 4. Asphyxia (adjusted OR=2.4, 95% CI: 1.4-4.1) and complicated delivery (adjusted OR=1.5, 1.0-2.1) were associated with personality disorder in the population-based study, and the former remained significant in multivariate models. Overall, perinatal complications were found to be associated with a later diagnosis of personality disorder in this selected sample. As with other psychiatric disorders where such associations have been demonstrated, changes during the perinatal period may lead to abnormal brain development and function. PMID:23013342

  13. Surface characterizations of color threshold

    NASA Technical Reports Server (NTRS)

    Poirson, Allen B.; Wandell, Brian A.; Varner, Denise C.; Brainard, David H.

    1990-01-01

    The paper evaluates how well three different parametric shapes (ellipsoids, rectangles, and parallelograms) serve as models of three-dimensional detection contours. The constraints that the procedures for deriving the best-fitting shapes place on inferences about the theoretical visual detection mechanisms are described. Results of two statistical tests show that only the parallelogram fits the data with more precision than the variance in repeated threshold measurements, and thus provides a slightly better fit than the other two shapes. Nevertheless, it does not serve as a better guide than the ellipsoidal model for interpolating from the measurements to thresholds in novel color directions.

  14. A compact quasi 3D threshold voltage modeling and performance analysis of a novel linearly graded binary metal alloy quadruple gate MOSFET for subdued short channel effects

    NASA Astrophysics Data System (ADS)

    Sarkhel, Saheli; Sarkar, Subir Kumar

    2015-06-01

    In the present era of low-power devices, to keep pace with aggressive scaling demands, the concept of surrounding-gate MOS geometry is gradually becoming popular among researchers for enhancing the performance of nanoscale MOSFETs, owing to the inherent benefit of the gate-all-around geometry compared with conventional planar structures. In this research endeavour, we have, for the first time, incorporated the novel theory of work function engineering of a binary metal alloy gate with continuous horizontal variation of mole fraction into a fully depleted quadruple gate MOSFET, thereby proposing a new structure, namely the Work Function Engineered Gate Quadruple Gate MOSFET (WFEG QG MOSFET). A detailed analytical model of this novel WFEG QG MOS structure has been formulated to present a quasi-3D threshold voltage model based on the 3D scaling equation instead of the tedious solution of the 3D Poisson's equation. The device short channel effects have been included by calculating the natural length of the proposed QG device using the effective number of gates (ENG) concept. An overall comparative performance analysis of the WFEG QG MOSFET and a normal QG MOSFET has been carried out to establish the superiority of the proposed WFEG structure over its QG equivalent in terms of reduced Short Channel Effects (SCEs), Drain Induced Barrier Lowering (DIBL) and Threshold Voltage Roll-Off (TVRO). The results of our analytical modeling are found to be in good agreement with the simulation results, thereby establishing the accuracy of our modeling.

  15. Population based mortality surveillance in carbon products manufacturing plants.

    PubMed Central

    Teta, M J; Ott, M G; Schnatter, A R

    1987-01-01

    The utility of a population based, corporate wide mortality surveillance system was evaluated after a 10 year observation period of one of the company's divisions. The subject population, 2219 white male, long term employees from Union Carbide Corporation's carbon based electrode and specialty products operations, was followed up for mortality from 1974 to 1983. External comparisons with the United States male population were supplemented with internal comparisons among subgroups of the study population, defined by broad job categories and time related variables, adjusting for important correlates of the healthy worker effect. Significant deficits of deaths were observed for all causes and the major non-cancer causes of death. The numbers of deaths due to malignant neoplasms and respiratory cancer were less than, but not statistically different from, expected. There was a non-significant excess of deaths from lymphopoietic cancer, occurring predominantly among salaried employees. When specific locations were examined, operations with potential exposure to coal tar products exhibited a mortality pattern similar to that of the total cohort. The risk for lung cancer was significantly raised (five observed, 1.4 expected) in one small, but older, location which did not involve coal tar products during the period of employment of these individuals, but which historically used asbestos materials for several unique applications. Although these findings are limited by small numbers and a short observation period, the population based surveillance strategy has provided valuable information regarding the mortality experience of the population, directions for future research, and the allocation of epidemiological resources. PMID:3593661

  16. Continuous bilateral infusion of vigabatrin into the subthalamic nucleus: Effects on seizure threshold and GABA metabolism in two rat models.

    PubMed

    Gey, Laura; Gernert, Manuela; Löscher, Wolfgang

    2016-07-01

    The subthalamic nucleus (STN) plays a crucial role as a regulator of basal ganglia outflow but also influences the activity of cortical and limbic structures, so that it is widely used as a therapeutic target in different brain diseases, including epilepsy. In addition to electrical stimulation of the STN, targeted delivery of anti-seizure drugs to the STN may constitute an alternative treatment approach in patients with pharmacoresistant epilepsy. In the present experimental study, we investigated the anti-seizure and adverse effects of chronic infusion of vigabatrin into the STN of rats. Vigabatrin is a clinically approved anti-seizure drug, which acts by increasing brain GABA levels by irreversibly inhibiting GABA-aminotransferase (GABA-T). Based on functional and neurochemical effects of acute STN microinjection, doses for continuous infusion were calculated and administered, using an innovative drug infusion technology. Bilateral infusion of only 10 μg/day vigabatrin over 3 weeks into the STN resulted in an almost complete inhibition of GABA-T and a 4-fold increase in GABA in the target region, which was associated with a significant increase in seizure threshold, determined once weekly by i.v. infusion of pentylenetetrazole (PTZ). Lower doses or unilateral infusion were less effective, both on PTZ seizures and on kindled seizures. Bilateral infusion into the substantia nigra pars reticulata was less effective and more toxic than STN infusion. In some of the rats, tolerance to the anti-seizure effect developed. The data demonstrate that chronic administration of very low, nontoxic doses of vigabatrin into the STN is an effective means of increasing local GABA concentrations and seizure threshold. PMID:26976738

  17. Population-based advanced practice nursing: where does oncology fit in?

    PubMed

    Lattimer, Jennie Greco

    2013-12-01

    A national work group met in 2004 to discuss the future of advanced practice nursing. The representatives were nursing education, certification, accreditation, and regulation experts, and the goal was to develop a consensus model for advanced practice nursing regulation (Nevidjon et al., 2010). As a result, a set of recommendations was published in an article that defined a new consensus model for advanced practice registered nurse (APRN) regulation (APRN Consensus Workgroup, 2008; Goudreau, 2009). The new model included six population-based focuses of practice (i.e., family and individual across the lifespan, adult and gerontology, neonatal, pediatrics, women's health- and gender-related, and psychiatric and mental health) (Johnson, Dawson, & Brassard, 2010). A goal of the new model was to standardize the licensure, certification, and regulation of nurse practitioners into specific focuses. State boards were facing an increasing number of requests to recognize nurse practitioner specialties (e.g., organ specific, body systems, diseases) (Johnson et al., 2010). The new model helped standardize education programs, which may help certifying agencies set up curriculum review processes to ensure appropriate credentials for APRNs (Johnson et al., 2010). It also supported the mission of nursing to meet future healthcare needs of the public and to protect the public (Johnson et al., 2010). Some advantages exist to delineating into population-based focuses, but the new model leaves out many specialties (e.g., oncology) that encompass the whole person as well as concentrate on certain diseases. PMID:24305476

  18. Oscillatory Threshold Logic

    PubMed Central

    Borresen, Jon; Lynch, Stephen

    2012-01-01

    In the 1940s, the first generation of modern computers used vacuum tube oscillators as their principal components; however, with the development of the transistor, such oscillator-based computers quickly became obsolete. As the demand for faster and lower power computers continues, transistors are themselves approaching their theoretical limit and emerging technologies must eventually supersede them. With the development of optical oscillators and Josephson junction technology, we are again presented with the possibility of using oscillators as the basic components of computers, and it is possible that the next generation of computers will be composed almost entirely of oscillatory devices. Here, we demonstrate how coupled threshold oscillators may be used to perform binary logic in a manner entirely consistent with modern computer architectures. We describe a variety of computational circuitry and demonstrate working oscillator models of both computation and memory. PMID:23173034
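
    The logic the authors implement with coupled oscillators rests on the classical threshold-logic abstraction: a unit fires when a weighted sum of binary inputs reaches a threshold. The short sketch below shows that discrete abstraction only; the weights and thresholds are illustrative choices, and nothing here models the oscillator dynamics themselves.

```python
# Minimal threshold-logic sketch: each "unit" fires (1) when a weighted sum of
# its binary inputs reaches a threshold -- the discrete abstraction that
# coupled-oscillator hardware can realize. Weights and thresholds are illustrative.
def threshold_unit(inputs, weights, theta):
    return int(sum(w * x for w, x in zip(weights, inputs)) >= theta)


def AND(a, b):
    return threshold_unit([a, b], [1, 1], 2)


def OR(a, b):
    return threshold_unit([a, b], [1, 1], 1)


def NOT(a):
    return threshold_unit([a], [-1], 0)


def XOR(a, b):  # needs two layers of threshold units
    return AND(OR(a, b), NOT(AND(a, b)))


for a in (0, 1):
    for b in (0, 1):
        print(a, b, "AND:", AND(a, b), "OR:", OR(a, b), "XOR:", XOR(a, b))
```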

  19. New states above charm threshold

    SciTech Connect

    Eichten, Estia J.; Lane, Kenneth; Quigg, Chris; /Fermilab

    2005-11-01

    We revise and extend expectations for the properties of charmonium states that lie above charm threshold, in light of new experimental information. We refine the Cornell coupled-channel model for the coupling of c{bar c} levels to two-meson states, defining resonance masses and widths by pole positions in the complex energy plane, and suggest new targets for experiment.

  20. Explosive percolation in thresholded networks

    NASA Astrophysics Data System (ADS)

    Hayasaka, Satoru

    2016-06-01

    Explosive percolation in a network is a phase transition where a large portion of nodes becomes connected with an addition of a small number of edges. Although extensively studied in random network models and reconstructed real networks, explosive percolation has not been observed in a more realistic scenario where a network is generated by thresholding a similarity matrix describing between-node associations. In this report, I examine construction schemes of such thresholded networks, and demonstrate that explosive percolation can be observed by introducing edges in a particular order.
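
    As a rough illustration of the construction described above, the sketch below uses an arbitrary random similarity matrix (standing in for real between-node associations), adds edges in order of decreasing similarity, and tracks the giant component with a union-find structure; a large single-edge jump in the giant-component fraction is the signature of explosive-like behaviour. It is a minimal sketch of the general idea, not the report's actual construction schemes.

```python
# Hypothetical sketch: relax the threshold on a similarity matrix by adding
# edges in order of decreasing similarity, tracking the giant component.
import numpy as np


class UnionFind:
    def __init__(self, n):
        self.parent = list(range(n))
        self.size = [1] * n
        self.largest = 1

    def find(self, x):
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]  # path halving
            x = self.parent[x]
        return x

    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra == rb:
            return
        if self.size[ra] < self.size[rb]:
            ra, rb = rb, ra
        self.parent[rb] = ra
        self.size[ra] += self.size[rb]
        self.largest = max(self.largest, self.size[ra])


rng = np.random.default_rng(0)
n = 500
# Toy symmetric similarity matrix; a real study would use correlations etc.
sim = rng.random((n, n))
sim = (sim + sim.T) / 2

# Sort candidate edges by decreasing similarity (i.e., lowering the threshold).
iu = np.triu_indices(n, k=1)
order = np.argsort(sim[iu])[::-1]
edges = list(zip(iu[0][order], iu[1][order]))

uf = UnionFind(n)
giant = []  # giant-component fraction after each added edge
for a, b in edges:
    uf.union(int(a), int(b))
    giant.append(uf.largest / n)

# Largest single-edge jump in the giant component hints at explosive behaviour.
jumps = np.diff([0.0] + giant)
print("largest jump in giant-component fraction:", jumps.max())
```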

  1. Predictors of Childhood Anxiety: A Population-Based Cohort Study

    PubMed Central

    2015-01-01

    Background Few studies have explored predictors of early childhood anxiety. Objective To determine the prenatal, postnatal, and early life predictors of childhood anxiety by age 5. Methods Population-based, provincial administrative data (N = 19,316) from Manitoba, Canada were used to determine the association between demographic, obstetrical, psychosocial, medical, behavioral, and infant factors on childhood anxiety. Results Risk factors for childhood anxiety by age 5 included maternal psychological distress from birth to 12 months and 13 months to 5 years post-delivery and an infant 5-minute Apgar score of ≤7. Factors associated with decreased risk included maternal age < 20 years, multiparity, and preterm birth. Conclusion Identifying predictors of childhood anxiety is a key step to early detection and prevention. Maternal psychological distress is an early, modifiable risk factor. Future research should aim to disentangle early life influences on childhood anxiety occurring in the prenatal, postnatal, and early childhood periods. PMID:26158268

  2. A population-based study of birth defects in Malaysia.

    PubMed

    Thong, M K; Ho, J J; Khatijah, N N

    2005-01-01

    Birth defects are one of the leading causes of paediatric disability and mortality in developed and developing countries. Data on birth defects from population-based studies originating from developing countries are lacking. One of the objectives of this study was to determine the epidemiology of major birth defects in births during the perinatal period in Kinta district, Perak, Malaysia over a 14-month period, using a population-based birth defect register. There were 253 babies with major birth defects in 17,720 births, giving an incidence of 14.3/1000 births, a birth prevalence of 1 in 70. There were 80 babies with multiple birth defects and 173 with isolated birth defects. The exact syndromic diagnosis of the babies with multiple birth defects could not be identified in 18 (22.5%) babies. The main organ systems involved in the isolated birth defects were cardiovascular (13.8%), cleft lip and palate (11.9%), clubfeet (9.1%), central nervous system (CNS) (including neural tube defects) (7.9%), musculoskeletal (5.5%) and gastrointestinal systems (4.7%), and hydrops fetalis (4.3%). The babies with major birth defects were associated with lower birth weights, premature deliveries, higher Caesarean section rates, prolonged hospitalization and increased specialist care. Among the cohort of babies with major birth defects, the mortality rate was 25.2% during the perinatal period. Mothers with affected babies were associated with advanced maternal age, birth defects themselves or their relatives but not in their other offspring, and significantly higher rates of previous abortions. The consanguinity rate of 2.4% was twice that of the control population. It is concluded that a birth defects register is needed to monitor these developments and future interventional trials are needed to reduce birth defects in Malaysia. PMID:16096215

  3. Birth Prevalence of Cerebral Palsy: A Population-Based Study

    PubMed Central

    Van Naarden Braun, Kim; Doernberg, Nancy; Schieve, Laura; Christensen, Deborah; Goodman, Alyson; Yeargin-Allsopp, Marshalyn

    2015-01-01

    OBJECTIVE Population-based data in the United States on trends in cerebral palsy (CP) birth prevalence are limited. The objective of this study was to examine trends in the birth prevalence of congenital spastic CP by birth weight, gestational age, and race/ethnicity in a heterogeneous US metropolitan area. METHODS Children with CP were identified by a population-based surveillance system for developmental disabilities (DDs). Children with CP were included if they were born in metropolitan Atlanta, Georgia, from 1985 to 2002, resided there at age 8 years, and did not have a postneonatal etiology (n = 766). Birth weight, gestational age, and race/ethnicity subanalyses were restricted to children with spastic CP (n = 640). Trends were examined by CP subtype, gender, race/ethnicity, co-occurring DDs, birth weight, and gestational age. RESULTS Birth prevalence of spastic CP per 1000 1-year survivors was stable from 1985 to 2002 (1.9 in 1985 to 1.8 in 2002; 0.3% annual average prevalence; 95% confidence interval [CI] −1.1 to 1.8). Whereas no significant trends were observed by gender, subtype, birth weight, or gestational age overall, CP prevalence with co-occurring moderate to severe intellectual disability significantly decreased (−2.6% [95% CI −4.3 to −0.8]). Racial disparities persisted over time between non-Hispanic black and non-Hispanic white children (prevalence ratio 1.8 [95% CI 1.5 to 2.1]). Different patterns emerged for non-Hispanic white and non-Hispanic black children by birth weight and gestational age. CONCLUSIONS Given improvements in neonatal survival, evidence of stability of CP prevalence is encouraging. Yet lack of overall decreases supports continued monitoring of trends and increased research and prevention efforts. Racial/ethnic disparities, in particular, warrant further study. PMID:26659459

  4. Excessive daytime somnolence and cardiovascular health: A population-based study in rural Ecuador

    PubMed Central

    Del Brutto, Oscar H.; Mera, Robertino M.; Zambrano, Mauricio; Castillo, Pablo R.

    2014-01-01

    In a population-based study conducted in rural Ecuador, 635 stroke-free persons aged ≥40 years were interviewed with the Epworth sleepiness scale and screened to assess their cardiovascular health (CVH) status. Excessive daytime somnolence was present in 22% of persons and a poor CVH status in 69%. In a generalized linear model after adjusting for age and sex, excessive daytime somnolence was not associated with a poor CVH status or with any of the individual metrics in the poor range. Excessive daytime somnolence may not be linked to cardiovascular risk factors at the rural level. PMID:26483927

  5. Excessive daytime somnolence and cardiovascular health: A population-based study in rural Ecuador.

    PubMed

    Del Brutto, Oscar H; Mera, Robertino M; Zambrano, Mauricio; Castillo, Pablo R

    2014-12-01

    In a population-based study conducted in rural Ecuador, 635 stroke-free persons aged ≥40 years were interviewed with the Epworth sleepiness scale and screened to assess their cardiovascular health (CVH) status. Excessive daytime somnolence was present in 22% of persons and a poor CVH status in 69%. In a generalized linear model after adjusting for age and sex, excessive daytime somnolence was not associated with a poor CVH status or with any of the individual metrics in the poor range. Excessive daytime somnolence may not be linked to cardiovascular risk factors at the rural level. PMID:26483927

  6. School Performance and the Risk of Suicidal Thoughts in Young Adults: Population-Based Study

    PubMed Central

    Kosidou, Kyriaki; Dalman, Christina; Fredlund, Peeter; Magnusson, Cecilia

    2014-01-01

    Although low school performance is related to attempted and completed suicide, its relationship with suicidal thoughts has been less clear. We conducted a population-based study including 10081 individuals aged 18–29 years in Stockholm, Sweden, and found a clear positive gradient in the risk of lifetime suicidal thoughts with decreasing levels of compulsory school leaving grades. This relationship was somewhat attenuated but remained significant in multivariate models accounting for family background, severe adult psychopathology and adult socioeconomic conditions. School failure is associated with an increased risk of experiencing suicidal thoughts and may also increase the tendency of acting upon them. PMID:25347404

  7. A Stochastic Hourly Stream Temperature Model to Forecast Land-Use and Climate Change Effects on Temperature Threshold Exceedance Duration for Freshwater Mussels

    NASA Astrophysics Data System (ADS)

    Daraio, J. A.; Bales, J. D.; Pandolfo, T.

    2012-12-01

    Changes in stream temperature can have significant effects on freshwater ecosystems and aquatic organisms. Direct effects of temperature on aquatic organisms are often measured by acute thermal tolerance, or LT%, over some time period (h). For example, LT50 is an average median lethal temperature where 50% of individuals in a population die. Mean daily temperature above a given 24 h LT50 does not necessarily indicate a 24 h exposure, and there is a need for hourly water temperature estimates to obtain exposure durations. We developed a stochastic hourly temperature model that provides forecasts for the probability of exceeding given threshold temperatures for specified durations (24 and 96 h). Daily mean stream temperatures from an existing model of the upper Tar River basin, North Carolina, USA, were used as input to our stochastic hourly temperature model for climate change and land-use change simulations for 2021-2030 and 2051-2060. Time series of hourly temperature data at 20 sites from July 2010 through November 2011 were used to parameterize autoregressive-moving average (ARMA) models. Parameterizations were done using site-specific and basin-wide observations. The use of site-fitted parameters for ARMA simulations of hourly temperatures showed no significant differences with simulations using basin-fitted parameters. Stream temperature observations in 2010 revealed only 2 sites with temperatures above 30°C for > 24 h and temperatures were never > 31°C for more than 24 h at any site. Simulations suggest that higher temperature thresholds are likely to be exceeded for longer durations than have occurred in the past century. The probability that temperatures will exceed 32°C for at least 96 h in a given year increased from 0, at present, to 0.05 in 2021-2030 and to ~ 0.14 in 2051-2060. Simulations indicated that climate change had much greater effects on probabilities of temperature threshold exceedance for 24 and 96 h durations than land-use change.
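
    A minimal sketch of the general workflow implied by the abstract is given below: daily mean temperatures are disaggregated to hourly values with a diurnal cycle plus ARMA(1,1) deviations, and Monte Carlo replicates are used to estimate the probability of exceeding a 32°C threshold for at least 24 or 96 consecutive hours. The ARMA orders, parameter values, diurnal amplitude, and synthetic daily series are placeholders, not the fitted values from the Tar River study, so the printed probabilities are illustrative only.

```python
# Hypothetical sketch: disaggregate daily mean stream temperature to hourly
# values (diurnal cycle + ARMA(1,1) noise) and estimate the probability of
# exceeding 32 degC for at least 24 h or 96 h in a season.
import numpy as np

rng = np.random.default_rng(1)


def simulate_hourly(daily_mean, diurnal_amp=1.5, phi=0.8, theta=0.3, sigma=0.4):
    """Hourly temps = daily mean + diurnal cycle + ARMA(1,1) deviations."""
    n_hours = len(daily_mean) * 24
    base = np.repeat(daily_mean, 24)
    hours = np.arange(n_hours) % 24
    diurnal = diurnal_amp * np.sin(2 * np.pi * (hours - 9) / 24)
    eps = rng.normal(0, sigma, n_hours)
    dev = np.zeros(n_hours)
    for t in range(1, n_hours):
        dev[t] = phi * dev[t - 1] + eps[t] + theta * eps[t - 1]
    return base + diurnal + dev


def longest_run_above(x, threshold):
    run = best = 0
    for v in x:
        run = run + 1 if v > threshold else 0
        best = max(best, run)
    return best


# Toy summer season of daily means peaking near 31 degC (placeholder values).
days = np.arange(120)
daily_mean = 27 + 4 * np.sin(np.pi * days / 120)

n_sim, thr = 500, 32.0
exceed_24 = exceed_96 = 0
for _ in range(n_sim):
    hourly = simulate_hourly(daily_mean)
    run = longest_run_above(hourly, thr)
    exceed_24 += run >= 24
    exceed_96 += run >= 96

print("P(>= 24 h above 32 degC):", exceed_24 / n_sim)
print("P(>= 96 h above 32 degC):", exceed_96 / n_sim)
```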

  8. Clinician Use and Acceptance of Population-Based Data about Respiratory Pathogens: Implications for Enhancing Population-Based Clinical Practice

    PubMed Central

    Gesteland, Per H; Allison, Mandy A; Staes, Catherine J; Samore, Matthew H; Rubin, Michael A; Carter, Marjorie E; Wuthrich, Amyanne; Kinney, Anita Y; Mottice, Susan; Byington, Carrie L

    2008-01-01

    Front line health care providers (HCPs) play a central role in endemic (e.g., pertussis), epidemic (e.g., influenza) and pandemic (e.g., avian influenza) infectious disease outbreaks. Effective preparedness for this role requires access to and awareness of population-based data (PBD). We investigated the degree to which this is currently achieved among HCPs in Utah by surveying a sample about access, awareness and attitudes concerning PBD in clinical practice. We found variability in the number and nature (national vs. local, pushed vs. pulled) of PBD sources accessed by HCPs, with a subset using multiple sources and using them frequently. We found that HCPs believe PBD improves their clinical performance and that they cannot rely on their own practice to remain informed. These findings suggest that an integrated system, which interprets PBD from multiple sources and optimizes the delivery of PBD, may facilitate preparedness of HCPs through the application of PBD in routine clinical practice. PMID:18999305

  9. Pausing at the Threshold

    ERIC Educational Resources Information Center

    Morgan, Patrick K.

    2015-01-01

    Since about 2003, the notion of threshold concepts--the central ideas in any field that change how learners think about other ideas--has become difficult to escape at library conferences and in general information literacy discourse. Their visibility will likely only increase because threshold concepts figure prominently in the Framework for…

  10. Threshold Concepts in Economics

    ERIC Educational Resources Information Center

    Shanahan, Martin

    2016-01-01

    Purpose: The purpose of this paper is to examine threshold concepts in the context of teaching and learning first-year university economics. It outlines some of the arguments for using threshold concepts and provides examples using opportunity cost as an exemplar in economics. Design/ Methodology/Approach: The paper provides an overview of the…

  11. Neighborhood Deprivation Is Strongly Associated with Participation in a Population-Based Health Check

    PubMed Central

    Bender, Anne Mette; Kawachi, Ichiro; Jørgensen, Torben; Pisinger, Charlotta

    2015-01-01

    Background We sought to examine whether neighborhood deprivation is associated with participation in a large population-based health check. Such analyses will help answer the question whether health checks, which are designed to meet the needs of residents in deprived neighborhoods, may increase participation and prove to be more effective in preventing disease. In Europe, no study has previously looked at the association between neighborhood deprivation and participation in a population-based health check. Methods The study population comprised 12,768 persons invited for a health check including screening for ischemic heart disease and lifestyle counseling. The study population was randomly drawn from a population of 179,097 persons living in 73 neighborhoods in Denmark. Data on neighborhood deprivation (percentage with basic education, with low income and not in work) and individual socioeconomic position were retrieved from national administrative registers. Multilevel regression analyses with log links and binary distributions were conducted to obtain relative risks, intraclass correlation coefficients and proportional change in variance. Results Large differences between neighborhoods existed in both deprivation levels and neighborhood health check participation rate (mean 53%; range 35-84%). In multilevel analyses adjusted for age and sex, higher levels of all three indicators of neighborhood deprivation and a deprivation score were associated with lower participation in a dose-response fashion. Persons living in the most deprived neighborhoods had up to 37% decreased probability of participating compared to those living in the least deprived neighborhoods. Inclusion of individual socioeconomic position in the model attenuated the neighborhood deprivation coefficients, but all except for income deprivation remained statistically significant. Conclusion Neighborhood deprivation was associated with participation in a population-based health check in a dose-response fashion.

  12. Thresholds in chemical respiratory sensitisation.

    PubMed

    Cochrane, Stella A; Arts, Josje H E; Ehnes, Colin; Hindle, Stuart; Hollnagel, Heli M; Poole, Alan; Suto, Hidenori; Kimber, Ian

    2015-07-01

    There is a continuing interest in determining whether it is possible to identify thresholds for chemical allergy. Here allergic sensitisation of the respiratory tract by chemicals is considered in this context. This is an important occupational health problem, being associated with rhinitis and asthma, and in addition provides toxicologists and risk assessors with a number of challenges. In common with all forms of allergic disease chemical respiratory allergy develops in two phases. In the first (induction) phase exposure to a chemical allergen (by an appropriate route of exposure) causes immunological priming and sensitisation of the respiratory tract. The second (elicitation) phase is triggered if a sensitised subject is exposed subsequently to the same chemical allergen via inhalation. A secondary immune response will be provoked in the respiratory tract resulting in inflammation and the signs and symptoms of a respiratory hypersensitivity reaction. In this article attention has focused on the identification of threshold values during the acquisition of sensitisation. Current mechanistic understanding of allergy is such that it can be assumed that the development of sensitisation (and also the elicitation of an allergic reaction) is a threshold phenomenon; there will be levels of exposure below which sensitisation will not be acquired. That is, all immune responses, including allergic sensitisation, have threshold requirement for the availability of antigen/allergen, below which a response will fail to develop. The issue addressed here is whether there are methods available or clinical/epidemiological data that permit the identification of such thresholds. This document reviews briefly relevant human studies of occupational asthma, and experimental models that have been developed (or are being developed) for the identification and characterisation of chemical respiratory allergens. The main conclusion drawn is that although there is evidence that the

  13. Histocompatibility antigens in a population based silicosis series.

    PubMed Central

    Kreiss, K; Danilovs, J A; Newman, L S

    1989-01-01

    Individual susceptibility to silicosis is suggested by the lack of a uniform dose response relation and by the presence of immunological epiphenomena, such as increased antibody levels and associated diseases that reflect altered immune regulation. Human leucocyte antigens (HLA) are linked with immune response capability and might indicate a possible genetic susceptibility to silicosis. Forty nine silicotic subjects were identified from chest radiographs in a population based study in Leadville, Colorado. They were interviewed for symptoms and occupational history and gave a blood specimen for HLA-A, -B, -DR, and -DQ typing and for antinuclear antibody, immune complexes, immunoglobulins, and rheumatoid factor. Silicotic subjects had twice the prevalence of B44 (45%) of the reference population and had triple the prevalence of A29 (20%), both of which were statistically significant when corrected for the number of comparisons made. No perturbations in D-region antigen frequencies were detected. B44-positive subjects were older at diagnosis and had less dyspnoea than other subjects. A29-positive subjects were more likely to have abnormal levels of IgA and had higher levels of immune complexes. This study is the first to find significant HLA antigen excesses among a series of silicotic cases and extends earlier reported hypotheses that were based on groups of antigens of which B44 and A29 are components. PMID:2818968

  14. A population-based study of large granular lymphocyte leukemia.

    PubMed

    Shah, M V; Hook, C C; Call, T G; Go, R S

    2016-01-01

    Large granular lymphocyte (LGL) leukemia is a lymphoproliferative disorder of cytotoxic cells. T-cell LGL (T-LGL) leukemia is characterized by accumulation of cytotoxic T cells in blood and infiltration of the bone marrow, liver or spleen. Population-based studies have not been reported in LGL leukemia. We present clinical characteristics, natural history and risk factors for poor survival in patients with LGL leukemia using the Surveillance, Epidemiology, and End Results Program (SEER) and the United States National Cancer Data Base (NCDB). LGL leukemia is an extremely rare disease with an incidence of 0.2 cases per 1 000 000 individuals. The median age at diagnosis was 66.5 years, with females likely to be diagnosed 3 years earlier than males. Analysis of patient-level data using NCDB (n=978) showed that 45% of patients with T-LGL leukemia required some form of systemic treatment at the time of diagnosis. T-LGL leukemia patients have reduced survival compared with the general population, with a median overall survival of 9 years. Multivariate analysis showed that age >60 years at the time of diagnosis and the presence of significant comorbidities were independent predictors of poor survival. PMID:27494824

  15. Provider communication on perinatal depression: a population-based study.

    PubMed

    Farr, Sherry L; Ko, Jean Y; Burley, Kim; Gupta, Seema

    2016-02-01

    Women's lack of knowledge on symptoms of perinatal depression and treatment resources is a barrier to receiving care. We sought to estimate the prevalence and predictors of discussing depression with a prenatal care provider. We used 2011 population-based data from 24 sites participating in the Pregnancy Risk Assessment Monitoring System (n = 32,827 women with recent live births) to examine associations between maternal characteristics and women's reports that a prenatal care provider discussed with them what to do if feeling depressed during or after pregnancy. Overall, 71.9 % of women reported discussing perinatal depression with their prenatal care provider (range 60.7 % in New York City to 85.6 % in Maine). Women were more likely to report a discussion on perinatal depression with their provider if they were 18 to 29 years of age compared with 35 years of age or older (adjusted prevalence ratio [aPR] 18 to 19 y = 1.08, 20 to 24 y = 1.10, 25 to 29 y = 1.09), unmarried (aPR = 1.07) compared to married, had <12 years of education (aPR = 1.05) compared to > 12 years, and had no previous live births (aPR = 1.03) compared to ≥ 1 live births. Research is needed on effective ways to educate women about perinatal depression and whether increased knowledge on perinatal depression results in higher rates of treatment and shorter duration of symptoms. PMID:25578631

  16. A four-state kinetic model of the temporary threshold shift after loud sound based on inactivation of hair cell transduction channels.

    PubMed

    Patuzzi, R

    1998-11-01

    A model of the temporary threshold shift (TTS) following loud sound is presented based on inactivation of the mechano-electrical transduction (MET) channels at the apex of the outer hair cells (OHCs). This inactivation is assumed to reduce temporarily the OHC receptor current with a consequent drop in the mechanical sensitivity of the organ of Corti. With acoustic over-stimulation some of the hair cells' MET channels are assumed to adopt one of three closed and non-transducing conformations or 'TTS states'. The sound-induced inactivation is assumed to occur because the sound makes the TTS states more energetically favourable when compared with the transducing states, and the distribution between these states is assumed to depend on the relative energies of the states and the time allowed for migration between them. By lumping the fast transducing states (one open and two closed) into a single transducing 'pseudo-state', the kinetics of the inactivation and re-activation processes (corresponding to the onset and recovery of TTS) can be described by a four-state kinetic model. The model allows an elegant description of the onset and recovery of TTS time-course in a human subject under a variety of continuous exposure conditions, and some features of intermittent exposure as well. The model also suggests that recovery of TTS may be accelerated by an intermittent tone during the recovery period, which may explain some of the variability in TTS in the literature. Other implications of the model are also discussed. PMID:9833962
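
    A generic four-state chain of the kind described (one transducing pseudo-state T and three non-transducing TTS states) can be written as a linear kinetic system and integrated numerically, as in the hedged sketch below. The rate constants are placeholders rather than the paper's fitted values, and TTS is taken, as a simplifying assumption, to scale with the loss of occupancy of the transducing state.

```python
# Hypothetical four-state kinetic chain T <-> S1 <-> S2 <-> S3, where T lumps
# the transducing conformations and S1-S3 are non-transducing "TTS states".
# Loud sound biases the forward rates; quiet recovery biases the backward rates.
import numpy as np


def rate_matrix(kf, kb):
    """Generator matrix K for dp/dt = K @ p on the chain T-S1-S2-S3 (rates per minute)."""
    K = np.zeros((4, 4))
    for i in range(3):
        K[i, i] -= kf[i]          # leave state i      (forward,  i -> i+1)
        K[i + 1, i] += kf[i]
        K[i + 1, i + 1] -= kb[i]  # leave state i + 1  (backward, i+1 -> i)
        K[i, i + 1] += kb[i]
    return K


def integrate(p, K, minutes, dt=0.01):
    """Explicit Euler integration; returns final state and the full trajectory."""
    out = []
    for _ in range(int(minutes / dt)):
        p = p + dt * (K @ p)
        out.append(p.copy())
    return p, np.array(out)


p0 = np.array([1.0, 0.0, 0.0, 0.0])            # all channels start transducing
K_exposure = rate_matrix(kf=[0.20, 0.05, 0.01], kb=[0.05, 0.02, 0.005])
K_recovery = rate_matrix(kf=[0.00, 0.00, 0.00], kb=[0.10, 0.03, 0.008])

p_end, during = integrate(p0, K_exposure, minutes=30)    # 30 min loud sound
_, after = integrate(p_end, K_recovery, minutes=120)     # 2 h quiet recovery

# TTS proxy = 1 - occupancy of the transducing pseudo-state (an assumption).
tts_proxy = 1.0 - np.concatenate([during, after])[:, 0]
print("peak TTS proxy:", tts_proxy.max().round(3),
      "| after 2 h recovery:", tts_proxy[-1].round(3))
```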

  17. A Functional Threshold for Long-Term Use of Hand and Arm Function Can Be Determined: Predictions From a Computational Model and Supporting Data From the Extremity Constraint-Induced Therapy Evaluation (EXCITE) Trial

    PubMed Central

    Han, Cheol E.; Wolf, Steven L.; Arbib, Michael A.; Winstein, Carolee J.

    2009-01-01

    Background Although spontaneous use of the more-affected arm and hand after stroke is an important determinant of participation and quality of life, a number of patients exhibit decreases in use following rehabilitative therapy. A previous neurocomputational model predicted that if the dose of therapy is sufficient to bring performance above a certain threshold, training can be stopped. Objective The aim of this study was to test the hypothesis that there exists a threshold for function of the paretic arm and hand after therapy. If function is above this threshold, spontaneous use will increase in the months following therapy. In contrast, if function is below this threshold, spontaneous use will decrease. Methods New computer simulations are presented showing that changes in arm use following therapy depend on a performance threshold. This prediction was tested by reanalyzing the data from the Extremity Constraint-Induced Therapy Evaluation (EXCITE) trial, a phase III randomized controlled trial in which participants received constraint-induced movement therapy for 2 weeks and were tested both 1 week and 1 year after therapy. Results The results demonstrate that arm and hand function measured immediately after therapy predicts, on average, the long-term change of arm use. Above a functional threshold, use improves. Below this threshold, use decreases. Limitations The reanalysis of the EXCITE trial data provides a “group” threshold above which a majority of patients, but not all, improve spontaneously. A goal of future research is to provide the means to assess when patients reach their individual threshold. Conclusion Understanding of the causal and nonlinear relationship between limb function and daily use is important for the future development of cost-effective interventions and prevention of “rehabilitation in vain.” PMID:19797304
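
    The qualitative prediction, that spontaneous use rises after therapy ends when function exceeds a threshold and decays otherwise, can be illustrated with a toy difference-equation sketch such as the one below. The parameters and the feedback of use on function are invented for illustration and do not reproduce the published neurocomputational model.

```python
# Toy sketch of the threshold prediction: after therapy, spontaneous use drifts
# up if function exceeds a threshold and decays otherwise, while use in turn
# maintains function. All parameters are invented for illustration.
import numpy as np


def simulate_use(function_after_therapy, threshold=0.5, months=12,
                 gain=0.15, decay=0.10, coupling=0.05):
    func = use = function_after_therapy
    trajectory = [use]
    for _ in range(months):
        if func > threshold:
            use = min(1.0, use + gain * (func - threshold))
        else:
            use = max(0.0, use - decay * (threshold - func))
        # Use feeds back on function (assumed coupling).
        func = min(1.0, max(0.0, func + coupling * (use - func)))
        trajectory.append(use)
    return np.array(trajectory)


for f0 in (0.4, 0.45, 0.55, 0.7):
    traj = simulate_use(f0)
    print(f"initial function {f0:.2f} -> use change over 1 year: "
          f"{traj[-1] - traj[0]:+.2f}")
```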

  18. Modeling the Future Global Carbon Balance: Is there a threshold from terrestrial sink to source in the Near Future?

    NASA Astrophysics Data System (ADS)

    Neilson, R. P.; Lenihan, J. M.; Bachelet, D. M.; Drapek, R.; Wells, J. R.

    2009-12-01

    Attempts to model the terrestrial carbon balance began almost immediately with the inception of the Intergovernmental Panel on Climate Change about 20 years ago. Beginning from empirical approaches to simulating vegetation distribution, ecologists progressed to mechanistic global biogeography models. These models, combined with global biogeochemical cycling models and fire disturbance models, resulted in the emergence of Dynamic General Vegetation Models (DGVM). Alternating latitudinal wet and dry zones, driven by the large-scale Hadley circulation system, produce forests in the tropics, deserts in the subtropics, forests in the temperate and boreal regions and nearly barren tundra zones at the poles, with many transition types in between. Thus, while acknowledging the constraints of limiting nutrients, such as nitrogen and phosphorus, the global distributions of forests, deserts and other major biomes are determined primarily by thermal limits and the water cycle, with notable exceptions driven primarily by fire and herbivory. A warming planet should sequester carbon, as forested zones shift north, but carbon could also be lost if deserts and other semi-arid zones expand. It remains unclear whether the tropical forests will increase or decrease in size and carbon volume, due to many counteracting factors. Confounding the issue even further are the direct effects of elevated CO2 concentration on the photosynthetic processes and on plant water-use efficiency. Thus, among the many goals of this research is the development of simulation capabilities to alter global vegetation distribution, as driven by rising CO2 and a changing climate, while accurately accounting for the mechanisms of carbon gains and losses in all terrestrial ecosystems. Since the terrestrial carbon balance is the difference between two large, opposing fluxes, productivity and respiration (or combustion), it is never in true balance, being either somewhat positive or negative even under a

  19. Effects of lightning and sprites on the ionospheric potential, and threshold effects on sprite initiation, obtained using an analog model of the global atmospheric electric circuit

    NASA Astrophysics Data System (ADS)

    Rycroft, M. J.; Odzimek, A.

    2010-06-01

    A quantitative model of the global atmospheric electric circuit has been constructed using the PSpice electrical engineering software package. Currents (˜1 kA) above thunderstorms and electrified rain/shower clouds raise the potential of the ionosphere (presumed to be an equipotential surface at 80 km altitude) to ˜250 kV with respect to the Earth's surface. The circuit is completed by currents flowing down through the fair-weather atmosphere in the land/sea surface and up to the cloud systems. Using a model for the atmospheric conductivity profile, the effects of both negative and positive cloud-to-ground (CG) lightning discharges on the ionospheric potential have been estimated. A large positive CG discharge creates an electric field that exceeds the breakdown field from the ionosphere down to ˜74 km, thereby forming a halo, a column sprite, and some milliseconds later, from ˜67 km down to ˜55 km at ˜60 ms after the discharge, a “carrot” sprite. Estimates are made of the return stroke current and the thundercloud charge moment change of a +CG discharge required to exceed the threshold breakdown field, or the threshold field for creating and sustaining negative or positive streamers. The values for breakdown at 80 km altitude are 35 kA and 350 C.km, (Coulomb.kilometers), respectively, and those at 70 km altitude are 45 kA and 360 C.km, respectively. The different temporal and spatial developments of the mesospheric electric field distinguishing between column and carrot sprites agree with the latest deductions from recent observations. The current flowing in the highly conducting sprite reduces the ionospheric potential by ˜1 V.

  1. The effects of lightning and sprites on the ionospheric potential, and threshold effects on sprite initiation, obtained using a PSpice model

    NASA Astrophysics Data System (ADS)

    Rycroft, Michael J.; Odzimek, Anna

    2010-05-01

    A quantitative model of the global atmospheric electric circuit has been constructed using the PSpice electrical engineering software package. Currents (~ 1 kA) above thunderstorms and electrified rain/shower clouds raise the potential of the ionosphere, which is presumed to be an equipotential surface at 80 km altitude, to ~ 250 kV with respect to the Earth's surface. The circuit is completed by currents flowing down through the fair weather atmosphere, in the land/sea surface and up to the cloud systems. Using a model for the atmospheric conductivity profile (Rycroft et al., JASTP, 2007), the effects of both negative and positive cloud-to-ground (CG) lightning discharges on the ionospheric potential have been estimated. A large positive CG discharge creates an electric field which exceeds the breakdown field from the ionosphere down to ~ 74 km, so forming a halo and a column sprite, and, some ms later, from ~ 67 km down to ~ 55 km at ~ 60 ms after the discharge, thereby forming a "carrot" sprite. Estimates are made of the return stroke current and the thundercloud charge moment change (CMC) for a +CG discharge required to exceed the threshold breakdown field, or the threshold field for creating and sustaining negative or positive streamers. The values for breakdown at 80 km altitude are 35 kA and 350 C.km, respectively, and 45 kA and 360 C.km at 70 km altitude. The different temporal and spatial developments of the mesospheric electric field distinguishing between column and carrot sprites agree with the latest deductions from recent observations. A current flowing in the highly conducting sprite reduces the ionospheric potential by ~ 1 V.

  2. Modeling weather and stocking rate threshold effects on forage and steer production in northern mixed-grass prairie

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Model evaluations of forage production and yearling steer weight gain (SWG) responses to stocking density (SD) and seasonal weather patterns are presented for semi-arid northern mixed-grass prairie. We used the improved Great Plains Framework for Agricultural Resource Management-Range (GPFARM-Range)...

  3. Computational modeling of distinct neocortical oscillations driven by cell-type selective optogenetic drive: separable resonant circuits controlled by low-threshold spiking and fast-spiking interneurons.

    PubMed

    Vierling-Claassen, Dorea; Cardin, Jessica A; Moore, Christopher I; Jones, Stephanie R

    2010-01-01

    Selective optogenetic drive of fast-spiking (FS) interneurons (INs) leads to enhanced local field potential (LFP) power across the traditional "gamma" frequency band (20-80 Hz; Cardin et al., 2009). In contrast, drive to regular-spiking (RS) pyramidal cells enhances power at lower frequencies, with a peak at 8 Hz. The first result is consistent with previous computational studies emphasizing the role of FS and the time constant of GABA(A) synaptic inhibition in gamma rhythmicity. However, the same theoretical models do not typically predict low-frequency LFP enhancement with RS drive. To develop hypotheses as to how the same network can support these contrasting behaviors, we constructed a biophysically principled network model of primary somatosensory neocortex containing FS, RS, and low-threshold spiking (LTS) INs. Cells were modeled with detailed cell anatomy and physiology, multiple dendritic compartments, and included active somatic and dendritic ionic currents. Consistent with prior studies, the model demonstrated gamma resonance during FS drive, dependent on the time constant of GABA(A) inhibition induced by synchronous FS activity. Lower-frequency enhancement during RS drive was replicated only on inclusion of an inhibitory LTS population, whose activation was critically dependent on RS synchrony and evoked longer-lasting inhibition. Our results predict that differential recruitment of FS and LTS inhibitory populations is essential to the observed cortical dynamics and may provide a means for amplifying the natural expression of distinct oscillations in normal cortical processing. PMID:21152338

  4. Computational Modeling of Distinct Neocortical Oscillations Driven by Cell-Type Selective Optogenetic Drive: Separable Resonant Circuits Controlled by Low-Threshold Spiking and Fast-Spiking Interneurons

    PubMed Central

    Vierling-Claassen, Dorea; Cardin, Jessica A.; Moore, Christopher I.; Jones, Stephanie R.

    2010-01-01

    Selective optogenetic drive of fast-spiking (FS) interneurons (INs) leads to enhanced local field potential (LFP) power across the traditional “gamma” frequency band (20–80 Hz; Cardin et al., 2009). In contrast, drive to regular-spiking (RS) pyramidal cells enhances power at lower frequencies, with a peak at 8 Hz. The first result is consistent with previous computational studies emphasizing the role of FS and the time constant of GABAA synaptic inhibition in gamma rhythmicity. However, the same theoretical models do not typically predict low-frequency LFP enhancement with RS drive. To develop hypotheses as to how the same network can support these contrasting behaviors, we constructed a biophysically principled network model of primary somatosensory neocortex containing FS, RS, and low-threshold spiking (LTS) INs. Cells were modeled with detailed cell anatomy and physiology, multiple dendritic compartments, and included active somatic and dendritic ionic currents. Consistent with prior studies, the model demonstrated gamma resonance during FS drive, dependent on the time constant of GABAA inhibition induced by synchronous FS activity. Lower-frequency enhancement during RS drive was replicated only on inclusion of an inhibitory LTS population, whose activation was critically dependent on RS synchrony and evoked longer-lasting inhibition. Our results predict that differential recruitment of FS and LTS inhibitory populations is essential to the observed cortical dynamics and may provide a means for amplifying the natural expression of distinct oscillations in normal cortical processing. PMID:21152338

  5. A Threshold Continuum for Aeolian Sand Transport

    NASA Astrophysics Data System (ADS)

    Swann, C.; Ewing, R. C.; Sherman, D. J.

    2015-12-01

    The threshold of motion for aeolian sand transport marks the initial entrainment of sand particles by the force of the wind. This is typically defined and modeled as a singular wind speed for a given grain size and is based on field and laboratory experimental data. However, the definition of threshold varies significantly between these empirical models, largely because the definition is based on visual observations of initial grain movement. For example, in his seminal experiments, Bagnold defined threshold of motion when he observed that 100% of the bed was in motion. Others have used 50% and lesser values. Differences in threshold models, in turn, result in large errors in predicting the fluxes associated with sand and dust transport. Here we use a wind tunnel and novel sediment trap to capture the fractions of sand in creep, reptation and saltation at Earth and Mars pressures and show that the threshold of motion for aeolian sand transport is best defined as a continuum in which grains progress through stages defined by the proportion of grains in creep and saltation. We propose the use of scale-dependent thresholds modeled by distinct probability distribution functions that differentiate the threshold based on micro to macro scale applications. For example, a geologic timescale application corresponds to a threshold when 100% of the bed is in motion whereas a sub-second application corresponds to a threshold when a single particle is set in motion. We provide quantitative measurements (number and mode of particle movement) corresponding to visual observations, percent of bed in motion and degrees of transport intermittency for Earth and Mars. Understanding transport as a continuum provides a basis for re-evaluating sand transport thresholds on Earth, Mars and Titan.
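
    One way to express the proposed continuum is to replace a single threshold wind speed with a probability curve for the fraction of the bed in motion, as in the toy sketch below. The logistic form and its parameters are assumptions for illustration, not fitted wind-tunnel values.

```python
# Toy sketch of a threshold continuum: the fraction of surface grains in
# motion is modeled as a cumulative probability in shear velocity rather
# than a single cutoff, so each application can pick its own exceedance level.
import numpy as np


def fraction_in_motion(u_star, u50=0.30, spread=0.03):
    """Logistic CDF: u50 is the shear velocity at which 50% of grains move."""
    return 1.0 / (1.0 + np.exp(-(u_star - u50) / spread))


def threshold_for_fraction(frac, u50=0.30, spread=0.03):
    """Invert the curve: shear velocity at which `frac` of the bed is moving."""
    return u50 + spread * np.log(frac / (1.0 - frac))


print("fraction in motion at u* = 0.35 m/s:", round(float(fraction_in_motion(0.35)), 3))
for frac in (0.01, 0.10, 0.50, 1.00 - 1e-3):
    print(f"{frac*100:5.1f}% of bed in motion at u* ~ "
          f"{threshold_for_fraction(frac):.3f} m/s")
```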

  6. Recurrent Wheezing in Infants: A Population-Based Study.

    PubMed

    Belhassen, Manon; De Blic, Jacques; Laforest, Laurent; Laigle, Valérie; Chanut-Vogel, Céline; Lamezec, Liliane; Brouard, Jacques; Fauroux, Brigitte; de Pouvourville, Gérard; Ginoux, Marine; Van Ganse, Eric

    2016-04-01

    Recurrent wheezing (RW) has a significant impact on infants, caregivers, and society, but morbidity and related medical resource utilization (MRU) have not been thoroughly explored. The burden of RW needs to be documented with population-based data. The objective was to assess the characteristics, medical management, and MRU of RW infants identified from national claims data.Infants aged from 6 to 24 months, receiving ≥2 dispensations of respiratory drugs within 3 months, and presenting a marker of poor control (index date), were selected. During the 6 months after index date, MRU was described in the cohort and among 3 subgroups with more severe RW, defined as ≥4 dispensations of respiratory drugs, ≥3 dispensations of oral corticosteroids (OCS), or ≥1 hospitalization for respiratory symptoms.A total of 115,489 infants had RW, corresponding to 8.2% of subjects in this age group. During follow-up, 68.7% of infants received inhaled corticosteroids, but only 1.8 U (unit) were dispensed over 6 months, suggesting discontinuous use. Control was mostly inadequate: 61.7% of subjects received OCS, 80.2% antibiotics, and 71.2% short-acting beta-agonists, and medical/paramedical visits were numerous, particularly for physiotherapy. Severe RW concerned 39.0% of the cohort; 32.8% and 11.7% of infants had repeated use of respiratory drugs and OCS, respectively, and 5.5% were hospitalized for respiratory symptoms.In this real-life nation-wide study, RW was common and infants had poor control and high MRU. Interventions are needed to support adequate use of controller therapy, and to improve medical care. PMID:27082618

  7. Calibrating a population-based job-exposure matrix using inspection measurements to estimate historical occupational exposure to lead for a population-based cohort in Shanghai, China.

    PubMed

    Koh, Dong-Hee; Bhatti, Parveen; Coble, Joseph B; Stewart, Patricia A; Lu, Wei; Shu, Xiao-Ou; Ji, Bu-Tian; Xue, Shouzheng; Locke, Sarah J; Portengen, Lutzen; Yang, Gong; Chow, Wong-Ho; Gao, Yu-Tang; Rothman, Nathaniel; Vermeulen, Roel; Friesen, Melissa C

    2014-01-01

    The epidemiologic evidence for the carcinogenicity of lead is inconsistent and requires improved exposure assessment to estimate risk. We evaluated historical occupational lead exposure for a population-based cohort of women (n=74,942) by calibrating a job-exposure matrix (JEM) with lead fume (n=20,084) and lead dust (n=5383) measurements collected over four decades in Shanghai, China. Using mixed-effect models, we calibrated intensity JEM ratings to the measurements using fixed-effects terms for year and JEM rating. We developed job/industry-specific estimates from the random-effects terms for job and industry. The model estimates were applied to subjects' jobs when the JEM probability rating was high for either job or industry; remaining jobs were considered unexposed. The models predicted that exposure increased monotonically with JEM intensity rating and decreased 20-50-fold over time. The cumulative calibrated JEM estimates and job/industry-specific estimates were highly correlated (Pearson correlation=0.79-0.84). Overall, 5% of the person-years and 8% of the women were exposed to lead fume; 2% of the person-years and 4% of the women were exposed to lead dust. The most common lead-exposed jobs were manufacturing electronic equipment. These historical lead estimates should enhance our ability to detect associations between lead exposure and cancer risk in the future epidemiologic analyses. PMID:22910004

  8. Calibrating a population-based job-exposure matrix using inspection measurements to estimate historical occupational exposure to lead for a population-based cohort in Shanghai, China

    PubMed Central

    Koh, Dong-Hee; Bhatti, Parveen; Coble, Joseph B.; Stewart, Patricia A; Lu, Wei; Shu, Xiao-Ou; Ji, Bu-Tian; Xue, Shouzheng; Locke, Sarah J.; Portengen, Lutzen; Yang, Gong; Chow, Wong-Ho; Gao, Yu-Tang; Rothman, Nathaniel; Vermeulen, Roel; Friesen, Melissa C.

    2012-01-01

    The epidemiologic evidence for the carcinogenicity of lead is inconsistent and requires improved exposure assessment to estimate risk. We evaluated historical occupational lead exposure for a population-based cohort of women (n=74,942) by calibrating a job-exposure matrix (JEM) with lead fume (n=20,084) and lead dust (n=5,383) measurements collected over four decades in Shanghai, China. Using mixed-effect models, we calibrated intensity JEM ratings to the measurements using fixed-effects terms for year and JEM rating. We developed job/industry-specific estimates from the random-effects terms for job and industry. The model estimates were applied to subjects’ jobs when the JEM probability rating was high for either job or industry; remaining jobs were considered unexposed. The models predicted that exposure increased monotonically with JEM intensity rating and decreased 20–50-fold over time. The cumulative calibrated JEM estimates and job/industry-specific estimates were highly correlated (Pearson correlation=0.79–0.84). Overall, 5% of the person-years and 8% of the women were exposed to lead fume; 2% of the person-years and 4% of the women were exposed to lead dust. The most common lead-exposed jobs were manufacturing electronic equipment. These historical lead estimates should enhance our ability to detect associations between lead exposure and cancer risk in future epidemiologic analyses. PMID:22910004
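
    A minimal sketch of the calibration step, on synthetic data, is shown below: log-transformed measurements are regressed on calendar year and the JEM intensity rating as fixed effects, with a random intercept per job, using statsmodels' MixedLM. The paper's full models also included industry random effects and applied the calibrated estimates only where the JEM probability rating was high; those details are omitted here, and all data and coefficients are invented.

```python
# Sketch of JEM calibration with a linear mixed model on synthetic data:
# fixed effects for year and JEM intensity rating, random intercept per job.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)

# Synthetic inspection measurements: log concentration declines with calendar
# year and rises with the JEM intensity rating; each job has a random offset.
n = 2000
jobs = rng.integers(0, 40, n)
year = rng.integers(1965, 2000, n)
rating = rng.integers(1, 4, n)                       # JEM intensity rating 1-3
job_effect = rng.normal(0, 0.5, 40)[jobs]
log_conc = (2.0 - 0.05 * (year - 1965) + 0.8 * rating + job_effect
            + rng.normal(0, 0.7, n))

df = pd.DataFrame({"log_conc": log_conc, "year": year,
                   "rating": rating, "job": jobs})

model = smf.mixedlm("log_conc ~ year + C(rating)", df, groups=df["job"])
result = model.fit()
print(result.summary())

# Fixed-effect coefficients give the calibration curve; job-specific random
# intercepts (result.random_effects) give job-level adjustments.
```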

  9. Roots at the Percolation Threshold

    NASA Astrophysics Data System (ADS)

    Kroener, E.; Ahmed, M. A.; Kaestner, A.; Vontobel, P.; Zarebanadkouki, M.; Carminati, A.

    2014-12-01

    Much of the carbon assimilated by plants during photosynthesis is lost to the soil via rhizodeposition. One component of rhizodeposition is mucilage, a hydrogel that dramatically alters the soil physical properties. Mucilage was assumed to explain unexpectedly low rhizosphere rewetting rates during irrigation (Carminati et al. 2010) and temporary water repellency in the rhizosphere after severe drying (Moradi et al. 2012). Here, we present an experimental and theoretical study of the rewetting behaviour of a soil mixed with mucilage, which was used as an analogue of the rhizosphere. Our samples were made of two layers of untreated soils separated by a thin layer (ca. 1 mm) of soil treated with mucilage. We prepared soil columns of varying particle size, mucilage concentration and height of the middle layer above the water table. The dry soil columns were re-wetted by capillary rise from the bottom. The rewetting of the middle layer showed a distinct dual behaviour. For mucilage concentrations lower than a certain threshold, water could cross the thin layer almost immediately after rewetting of the bulk soil. At slightly higher mucilage concentrations, the thin layer was almost impermeable. The mucilage concentration at the threshold strongly depended on particle size: the smaller the particle size the larger the soil specific surface and the more mucilage was needed to cover the entire particle surface and to induce water repellency. We applied a classic pore network model to simulate the experimental observations. In the model a certain fraction of nodes were randomly disconnected to reproduce the effect of mucilage in temporarily blocking the flow. The percolation model could qualitatively reproduce the threshold characteristics of the experiments. Our experiments, together with former observations of water dynamics in the rhizosphere, suggest that the rhizosphere is near the percolation threshold, where small variations in mucilage concentration sensitively
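
    The pore-network idea can be caricatured with simple site percolation: block a fraction of lattice nodes at random (standing in for mucilage-covered contacts) and test whether water can still find a connected path from the bottom of the column to the top. The sketch below is illustrative only; the lattice, blocking probabilities, and connectivity rule are assumptions, not the authors' model.

```python
# Toy site-percolation sketch: nodes are blocked with probability p_block and
# we test whether a connected path of open 4-neighbour sites links the bottom
# row of a square lattice to the top (i.e., whether rewetting can proceed).
import numpy as np
from collections import deque

rng = np.random.default_rng(3)


def percolates(open_sites):
    """Breadth-first flood from the bottom row through open neighbouring sites."""
    n = open_sites.shape[0]
    seen = np.zeros_like(open_sites, dtype=bool)
    queue = deque((0, j) for j in range(n) if open_sites[0, j])
    for i, j in queue:
        seen[i, j] = True
    while queue:
        i, j = queue.popleft()
        if i == n - 1:
            return True
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            a, b = i + di, j + dj
            if 0 <= a < n and 0 <= b < n and open_sites[a, b] and not seen[a, b]:
                seen[a, b] = True
                queue.append((a, b))
    return False


n, trials = 50, 40
for p_block in (0.30, 0.38, 0.41, 0.44, 0.50):
    wet = sum(percolates(rng.random((n, n)) > p_block) for _ in range(trials))
    print(f"blocked fraction {p_block:.2f}: rewetting in {wet}/{trials} runs")
```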

  10. SU-D-9A-02: Relative Effects of Threshold Choice and Spatial Resolution Modeling On SUV and Volume Quantification in F18-FDG PET Imaging of Anal Cancer Patients

    SciTech Connect

    Zhao, F; Bowsher, J; Palta, M; Czito, B; Willett, C; Yin, F

    2014-06-01

    Purpose: PET imaging with F18-FDG is utilized for treatment planning, treatment assessment, and prognosis. A region of interest (ROI) encompassing the tumor may be determined on the PET image, often by a threshold T on the PET standard uptake values (SUVs). Several studies have shown prognostic value for relevant ROI properties including maximum SUV value (SUVmax), metabolic tumor volume (MTV), and total glycolytic activity (TGA). The choice of threshold T may affect mean SUV value (SUVmean), MTV, and TGA. Recently spatial resolution modeling (SRM) has been introduced on many PET systems. SRM may also affect these ROI properties. The purpose of this work is to investigate the relative influence of SRM and threshold choice T on SUVmean, MTV, TGA, and SUVmax. Methods: For 9 anal cancer patients, 18F-FDG PET scans were performed prior to treatment. PET images were reconstructed by 2 iterations of Ordered Subsets Expectation Maximization (OSEM), with and without SRM. ROI contours were generated by 5 different SUV threshold values T: 2.5, 3.0, 30%, 40%, and 50% of SUVmax. Paired-samples t tests were used to compare SUVmean, MTV, and TGA (a) for SRM on versus off and (b) between each pair of threshold values T. SUVmax was also compared for SRM on versus off. Results: For almost all (57/60) comparisons of 2 different threshold values, SUVmean, MTV, and TGA showed statistically significant variation. For comparison of SRM on versus off, there were no statistically significant changes in SUVmax and TGA, but there were statistically significant changes in MTV for T=2.5 and T=3.0 and in SUVmean for all T. Conclusion: The near-universal statistical significance of threshold choice T suggests that, regarding harmonization across sites, threshold choice may be a greater concern than choice of SRM. However, broader study is warranted, e.g. other iterations of OSEM should be considered.
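
    The dependence of the reported ROI metrics on the threshold T can be made concrete with a small sketch: voxels with SUV at or above T form the ROI, MTV is the ROI volume, and TGA is commonly computed as SUVmean times MTV. The synthetic SUV map, lesion, and voxel size below are invented for illustration and do not correspond to the patient data in this study.

```python
# Minimal sketch of how a threshold T turns an SUV map into ROI metrics:
# SUVmean, metabolic tumor volume (MTV), and total glycolytic activity
# (TGA = SUVmean x MTV). The synthetic lesion and voxel size are invented.
import numpy as np

shape, voxel_ml = (64, 64, 64), 0.064          # 4 x 4 x 4 mm voxels = 0.064 mL
z, y, x = np.indices(shape)
r = np.sqrt((x - 32) ** 2 + (y - 32) ** 2 + (z - 32) ** 2)
# Background SUV ~1 with a blurred spherical lesion peaking near SUV 12.
suv = 1.0 + 11.0 * np.exp(-(r / 6.0) ** 2) \
      + np.random.default_rng(4).normal(0, 0.1, shape)

suv_max = suv.max()
for label, T in [("SUV 2.5", 2.5), ("SUV 3.0", 3.0),
                 ("30% SUVmax", 0.3 * suv_max),
                 ("40% SUVmax", 0.4 * suv_max),
                 ("50% SUVmax", 0.5 * suv_max)]:
    roi = suv >= T
    suv_mean = suv[roi].mean()
    mtv_ml = roi.sum() * voxel_ml
    tga = suv_mean * mtv_ml
    print(f"{label:11s} T={T:5.2f}  SUVmean={suv_mean:5.2f}  "
          f"MTV={mtv_ml:6.1f} mL  TGA={tga:7.1f}")
```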

  11. Cyberbullying among Finnish adolescents – a population-based study

    PubMed Central

    2012-01-01

    Background Cyberbullying, threatening or harassing another via the internet or mobile phones, does not cause physical harm, and thus the consequences are less visible. Little research has been performed on the occurrence of cyberbullying among adolescents or the perception of its seriousness. Only a few population-based studies have been published, none of which included research on the witnessing of cyberbullying. Here, we examined exposure to cyberbullying during the last year, and its frequency and perceived seriousness among 12 to 18-year-old adolescents in Finland. We studied four dimensions of cyberbullying: being a victim, bully, or both victim and bully of cyberbullying, and witnessing the cyberbullying of friends. Methods Self-administered questionnaires, including four questions on cyberbullying, were mailed to a representative sample of 12-, 14-, 16-, and 18-year-old Finns in 2009 (the Adolescent Health and Lifestyle Survey). The respondents could answer via the internet or paper questionnaire. Results The number of respondents was 5516 and the response rate was 56%. Girls more often than boys reported experiencing at least one dimension of cyberbullying during the last year. The proportion was highest among 14-year-olds and lowest among 18-year-olds of both sexes. Among girls, the most commonly encountered dimension was witnessing the cyberbullying of friends (16%); and being a victim was slightly more common than being a bully (11% vs. 9%). Among boys, an equal proportion, approximately 10%, had been a victim, a bully, or had witnessed cyberbullying. The proportion of bully-victims was 4%. Serious and disruptive cyberbullying was experienced by 2% of respondents and weekly cyberbullying by 1%; only 0.5% of respondents had been bullied weekly and considered bullying serious and disruptive. Conclusions Adolescents are commonly exposed to cyberbullying, but it is rarely frequent or considered serious or disruptive. Cyberbullying exposure differed between

  12. Familial risk of cerebral palsy: population based cohort study

    PubMed Central

    Wilcox, Allen J; Lie, Rolv T; Moster, Dag

    2014-01-01

    Objective To investigate risks of recurrence of cerebral palsy in family members with various degrees of relatedness to elucidate patterns of hereditability. Design Population based cohort study. Setting Data from the Medical Birth Registry of Norway, linked to the Norwegian social insurance scheme to identify cases of cerebral palsy and to databases of Statistics Norway to identify relatives. Participants 2 036 741 Norwegians born during 1967-2002, 3649 of whom had a diagnosis of cerebral palsy; 22 558 pairs of twins, 1 851 144 pairs of first degree relatives, 1 699 856 pairs of second degree relatives, and 5 165 968 pairs of third degree relatives were identified. Main outcome measure Cerebral palsy. Results If one twin had cerebral palsy, the relative risk of recurrence of cerebral palsy was 15.6 (95% confidence interval 9.8 to 25) in the other twin. In families with an affected singleton child, risk was increased 9.2 (6.4 to 13)-fold in a subsequent full sibling and 3.0 (1.1 to 8.6)-fold in a half sibling. Affected parents were also at increased risk of having an affected child (6.5 (1.6 to 26)-fold). No evidence was found of differential transmission through mothers or fathers, although the study had limited power to detect such differences. For people with an affected first cousin, only weak evidence existed for an increased risk (1.5 (0.9 to 2.7)-fold). Risks in siblings or cousins were independent of sex of the index case. After exclusion of preterm births (an important risk factor for cerebral palsy), familial risks remained and were often stronger. Conclusions People born into families in which someone already has cerebral palsy are themselves at elevated risk, depending on their degree of relatedness. Elevated risk may extend even to third degree relatives (first cousins). The patterns of risk suggest multifactorial inheritance, in which multiple genes interact with each other and with environmental factors. These data offer additional
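
    The basic recurrence relative-risk calculation underlying such estimates, with a log-scale 95% confidence interval, is sketched below on invented counts; the published figures come from the full registry linkage and model adjustments, not from this simple 2x2 computation.

```python
# Basic recurrence relative-risk sketch with a log-scale 95% CI.
# The counts below are invented for illustration only.
import math


def relative_risk(a, n1, b, n0):
    """RR of outcome in exposed (a/n1) vs unexposed (b/n0) with a 95% CI."""
    rr = (a / n1) / (b / n0)
    se_log = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n0)
    lo = math.exp(math.log(rr) - 1.96 * se_log)
    hi = math.exp(math.log(rr) + 1.96 * se_log)
    return rr, lo, hi


# Hypothetical example: CP in 12 of 2 000 later siblings of an affected child
# versus 180 of 300 000 births without an affected older sibling.
rr, lo, hi = relative_risk(12, 2_000, 180, 300_000)
print(f"RR = {rr:.1f} (95% CI {lo:.1f} to {hi:.1f})")
```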

  13. Mitochondrial threshold effects.

    PubMed Central

    Rossignol, Rodrigue; Faustin, Benjamin; Rocher, Christophe; Malgat, Monique; Mazat, Jean-Pierre; Letellier, Thierry

    2003-01-01

    The study of mitochondrial diseases has revealed dramatic variability in the phenotypic presentation of mitochondrial genetic defects. To attempt to understand this variability, different authors have studied energy metabolism in transmitochondrial cell lines carrying different proportions of various pathogenic mutations in their mitochondrial DNA. The same kinds of experiments have been performed on isolated mitochondria and on tissue biopsies taken from patients with mitochondrial diseases. The results have shown that, in most cases, phenotypic manifestation of the genetic defect occurs only when a threshold level is exceeded, and this phenomenon has been named the 'phenotypic threshold effect'. Subsequently, several authors showed that it was possible to inhibit considerably the activity of a respiratory chain complex, up to a critical value, without affecting the rate of mitochondrial respiration or ATP synthesis. This phenomenon was called the 'biochemical threshold effect'. More recently, quantitative analysis of the effects of various mutations in mitochondrial DNA on the rate of mitochondrial protein synthesis has revealed the existence of a 'translational threshold effect'. In this review these different mitochondrial threshold effects are discussed, along with their molecular bases and the roles that they play in the presentation of mitochondrial diseases. PMID:12467494

  14. Conflict Resolution as Near-Threshold Decision-Making: A Spiking Neural Circuit Model with Two-Stage Competition for Antisaccadic Task.

    PubMed

    Lo, Chung-Chuan; Wang, Xiao-Jing

    2016-08-01

    Automatic responses enable us to react quickly and effortlessly, but they often need to be inhibited so that an alternative, voluntary action can take place. To investigate the brain mechanism of controlled behavior, we investigated a biologically-based network model of spiking neurons for inhibitory control. In contrast to a simple race between pro- versus anti-response, our model incorporates a sensorimotor remapping module, and an action-selection module endowed with a "Stop" process through tonic inhibition. Both are under the modulation of rule-dependent control. We tested the model by applying it to the well known antisaccade task in which one must suppress the urge to look toward a visual target that suddenly appears, and shift the gaze diametrically away from the target instead. We found that the two-stage competition is crucial for reproducing the complex behavior and neuronal activity observed in the antisaccade task across multiple brain regions. Notably, our model demonstrates two types of errors: fast and slow. Fast errors result from failing to inhibit the quick automatic responses and therefore exhibit very short response times. Slow errors, in contrast, are due to incorrect decisions in the remapping process and exhibit long response times comparable to those of correct antisaccade responses. The model thus reveals a circuit mechanism for the empirically observed slow errors and broad distributions of erroneous response times in antisaccade. Our work suggests that selecting between competing automatic and voluntary actions in behavioral control can be understood in terms of near-threshold decision-making, sharing a common recurrent (attractor) neural circuit mechanism with discrimination in perception. PMID:27551824

  15. Conflict Resolution as Near-Threshold Decision-Making: A Spiking Neural Circuit Model with Two-Stage Competition for Antisaccadic Task

    PubMed Central

    Wang, Xiao-Jing

    2016-01-01

    Automatic responses enable us to react quickly and effortlessly, but they often need to be inhibited so that an alternative, voluntary action can take place. To investigate the brain mechanism of controlled behavior, we investigated a biologically-based network model of spiking neurons for inhibitory control. In contrast to a simple race between pro- versus anti-response, our model incorporates a sensorimotor remapping module, and an action-selection module endowed with a “Stop” process through tonic inhibition. Both are under the modulation of rule-dependent control. We tested the model by applying it to the well known antisaccade task in which one must suppress the urge to look toward a visual target that suddenly appears, and shift the gaze diametrically away from the target instead. We found that the two-stage competition is crucial for reproducing the complex behavior and neuronal activity observed in the antisaccade task across multiple brain regions. Notably, our model demonstrates two types of errors: fast and slow. Fast errors result from failing to inhibit the quick automatic responses and therefore exhibit very short response times. Slow errors, in contrast, are due to incorrect decisions in the remapping process and exhibit long response times comparable to those of correct antisaccade responses. The model thus reveals a circuit mechanism for the empirically observed slow errors and broad distributions of erroneous response times in antisaccade. Our work suggests that selecting between competing automatic and voluntary actions in behavioral control can be understood in terms of near-threshold decision-making, sharing a common recurrent (attractor) neural circuit mechanism with discrimination in perception. PMID:27551824

  16. Effects of Population Based Screening for Chlamydia Infections in The Netherlands Limited by Declining Participation Rates

    PubMed Central

    Schmid, Boris V.; Over, Eelco A. B.; van den Broek, Ingrid V. F.; Op de Coul, Eline L. M.; van Bergen, Jan E. A. M.; Fennema, Johan S. A.; Götz, Hannelore M.; Hoebe, Christian J. P. A.; de Wit, G. Ardine; van der Sande, Marianne A. B.; Kretzschmar, Mirjam E. E.

    2013-01-01

    Background A large trial to investigate the effectiveness of population based screening for chlamydia infections was conducted in the Netherlands in 2008–2012. The trial was register based and consisted of four rounds of screening of women and men in the age groups 16–29 years in three regions in the Netherlands. Data were collected on participation rates and positivity rates per round. A modeling study was conducted to project screening effects for various screening strategies into the future. Methods and Findings We used a stochastic network simulation model incorporating partnership formation and dissolution, aging and a sexual life course perspective. Trends in baseline rates of chlamydia testing and treatment were used to describe the epidemiological situation before the start of the screening program. Data on participation rates were used to describe screening uptake in rural and urban areas. Simulations were used to project the effectiveness of screening on chlamydia prevalence for a time period of 10 years. In addition, we tested alternative screening strategies, such as including only women, targeting different age groups, and biennial screening. Screening reduced prevalence by about 1% in the first two screening rounds, and the effect leveled off after that. Extrapolating observed participation rates into the future indicated very low participation in the long run. Alternative strategies only marginally changed the effectiveness of screening. Higher participation rates as originally foreseen in the program would have succeeded in reducing chlamydia prevalence to very low levels in the long run. Conclusions Decreasing participation rates over time profoundly impact the effectiveness of population based screening for chlamydia infections. Using data from several consecutive rounds of screening in a simulation model enabled us to assess the future effectiveness of screening on prevalence. If participation rates cannot be kept at a sufficient level, the effectiveness

  17. Estimating HIV Prevalence in Zimbabwe Using Population-Based Survey Data

    PubMed Central

    Chinomona, Amos; Mwambi, Henry Godwell

    2015-01-01

    Estimates of HIV prevalence computed using data obtained from sampling a subgroup of the national population may lack the representativeness of all the relevant domains of the population. These estimates are often computed on the assumption that HIV prevalence is uniform across all domains of the population. Use of appropriate statistical methods together with population-based survey data can enable better estimation of national and subgroup level HIV prevalence and can provide improved explanations of the variation in HIV prevalence across different domains of the population. In this study we computed design-consistent estimates of HIV prevalence, and their respective 95% confidence intervals, at both the national and subgroup levels. In addition, we provided a multivariable survey logistic regression model from a generalized linear modelling perspective for explaining the variation in HIV prevalence using demographic, socio-economic, socio-cultural and behavioural factors. Essentially, this study borrows from the proximate determinants conceptual framework which provides guiding principles upon which socio-economic and socio-cultural variables affect HIV prevalence through biological behavioural factors. We utilize the 2010–11 Zimbabwe Demographic and Health Survey (2010–11 ZDHS) data (which are population based) to estimate HIV prevalence in different categories of the population and for constructing the logistic regression model. It was established that HIV prevalence varies greatly with age, gender, marital status, place of residence, literacy level, belief on whether condom use can reduce the risk of contracting HIV and level of recent sexual activity, whereas there was no marked variation in HIV prevalence with social status (measured using a wealth index), method of contraception and an individual’s level of education. PMID:26624280
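
    A minimal sketch of the design-weighted prevalence idea is given below; the column names (hiv_positive, weight, sex) are hypothetical, and a fully design-consistent analysis would also use the ZDHS strata and cluster variables for variance estimation, which this sketch omits.

```python
# Weighted prevalence by subgroup (illustrative only; not the authors' analysis code).
import numpy as np
import pandas as pd

def weighted_prevalence(df, by):
    # Survey-weighted proportion of HIV-positive respondents within each subgroup.
    return df.groupby(by).apply(
        lambda g: np.average(g["hiv_positive"], weights=g["weight"])
    )

# Example with synthetic records standing in for survey data
df = pd.DataFrame({
    "hiv_positive": [0, 1, 0, 1, 0, 0],
    "weight":       [1.2, 0.8, 1.0, 1.5, 0.9, 1.1],
    "sex":          ["f", "f", "m", "m", "f", "m"],
})
print(weighted_prevalence(df, "sex"))
```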

  18. Estimating HIV Prevalence in Zimbabwe Using Population-Based Survey Data.

    PubMed

    Chinomona, Amos; Mwambi, Henry Godwell

    2015-01-01

    Estimates of HIV prevalence computed using data obtained from sampling a subgroup of the national population may lack the representativeness of all the relevant domains of the population. These estimates are often computed on the assumption that HIV prevalence is uniform across all domains of the population. Use of appropriate statistical methods together with population-based survey data can enable better estimation of national and subgroup level HIV prevalence and can provide improved explanations of the variation in HIV prevalence across different domains of the population. In this study we computed design-consistent estimates of HIV prevalence, and their respective 95% confidence intervals, at both the national and subgroup levels. In addition, we provided a multivariable survey logistic regression model from a generalized linear modelling perspective for explaining the variation in HIV prevalence using demographic, socio-economic, socio-cultural and behavioural factors. Essentially, this study borrows from the proximate determinants conceptual framework which provides guiding principles upon which socio-economic and socio-cultural variables affect HIV prevalence through biological behavioural factors. We utilize the 2010-11 Zimbabwe Demographic and Health Survey (2010-11 ZDHS) data (which are population based) to estimate HIV prevalence in different categories of the population and for constructing the logistic regression model. It was established that HIV prevalence varies greatly with age, gender, marital status, place of residence, literacy level, belief on whether condom use can reduce the risk of contracting HIV and level of recent sexual activity, whereas there was no marked variation in HIV prevalence with social status (measured using a wealth index), method of contraception and an individual's level of education. PMID:26624280

  19. A threshold for dissipative fission

    SciTech Connect

    Thoennessen, M.; Bertsch, G.F.

    1993-09-21

    The empirical domain of validity of statistical theory is examined as applied to data on pre-fission neutron, charged particle, and γ-ray multiplicities. Systematics are found for the threshold excitation energy for the appearance of nonstatistical fission. From the data on systems with not too high fissility, the relevant phenomenological parameter is the ratio of the threshold temperature T_thresh to the (temperature-dependent) fission barrier height E_Bar(T). The statistical model reproduces the data for T_thresh/E_Bar(T) < 0.26 ± 0.05, but underpredicts the multiplicities at higher T_thresh/E_Bar(T), independent of the mass and fissility of the systems.

  20. Space-time resolved simulation of femtosecond nonlinear light-matter interactions using a holistic quantum atomic model: application to near-threshold harmonics.

    PubMed

    Kolesik, M; Wright, E M; Andreasen, J; Brown, J M; Carlson, D R; Jones, R J

    2012-07-01

    We introduce a new computational approach for femtosecond pulse propagation in the transparency region of gases that permits full resolution in three space dimensions plus time while fully incorporating quantum coherent effects such as high-harmonic generation and strong-field ionization in a holistic fashion. This is achieved by utilizing a one-dimensional model atom with a delta-function potential which allows for a closed-form solution for the nonlinear optical response due to ground-state to continuum transitions. It side-steps evaluation of the wave function, and offers more than one hundred-fold reduction in computation time in comparison to direct solution of the atomic Schrödinger equation. To illustrate the capability of our new computational approach, we apply it to the example of near-threshold harmonic generation in Xenon, and we also present a qualitative comparison between our model and results from an in-house experiment on extreme ultraviolet generation in a femtosecond enhancement cavity. PMID:22772302

  1. A threshold voltage model of short-channel fully-depleted recessed-source/drain (Re-S/D) SOI MOSFETs with high-k dielectric

    NASA Astrophysics Data System (ADS)

    Gopi Krishna, Saramekala; Sarvesh, Dubey; Pramod, Kumar Tiwari

    2015-10-01

    In this paper, a surface potential based threshold voltage model of fully-depleted (FD) recessed-source/drain (Re-S/D) silicon-on-insulator (SOI) metal-oxide semiconductor field-effect transistor (MOSFET) is presented while considering the effects of high-k gate-dielectric material induced fringing-field. The two-dimensional (2D) Poisson’s equation is solved in a channel region in order to obtain the surface potential under the assumption of the parabolic potential profile in the transverse direction of the channel with appropriate boundary conditions. The accuracy of the model is verified by comparing the model’s results with the 2D simulation results from ATLAS over a wide range of channel lengths and other parameters, including the dielectric constant of gate-dielectric material. The author, Pramod Kumar Tiwari, was supported by the Science and Engineering Research Board (SERB), Department of Science and Technology, Ministry of Human Resource and Development, Government of India under Young Scientist Research (Grant No. SB/FTP/ETA-415/2012).
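
    As a hedged illustration of the modelling step described (the notation below is assumed for this sketch, not taken from the paper), the 2D Poisson equation in the fully depleted film and the parabolic transverse approximation typically take the form:

```latex
% Illustrative form only: 2D Poisson equation in the depleted channel (p-type body,
% acceptor doping N_A) and the parabolic approximation in the transverse direction y.
\frac{\partial^{2}\phi(x,y)}{\partial x^{2}}
  + \frac{\partial^{2}\phi(x,y)}{\partial y^{2}}
  = \frac{q\,N_{A}}{\varepsilon_{\mathrm{Si}}},
\qquad
\phi(x,y) \approx c_{0}(x) + c_{1}(x)\,y + c_{2}(x)\,y^{2},
```

    where the coefficients c_0(x), c_1(x), c_2(x) are fixed by the boundary conditions at the front and back interfaces; the surface potential c_0(x) then yields the threshold voltage.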

  2. The Effects of Social Reforms on Mental Disability in China: Population-Based Study.

    PubMed

    Wang, Zhenjie; Zhang, Lei; Li, Ning; Guo, Chao; Chen, Gong; Zheng, Xiaoying

    2016-04-01

    Few studies have explored how mental disabilities have changed with the waves of Chinese social reforms that occurred between 1912 and 2006. The present study evaluated population-based data from the Second China National Sample Survey on Disability to investigate these trends and their effects on mental disabilities. The Cox proportional hazards model was used to estimate the association between social reforms and mental disabilities. The confounding variables considered were as follows: survey age, gender, residence in 2006, ethnicity, and living arrangements in 2006. The highest risks of mental disabilities were observed in subjects born during the Mao Zedong era. Subjects who experienced social turbulence during their early development may have increased risks of mental disabilities in adulthood. The results and discussion herein contribute to our understanding of mental disabilities in China within the context of changing political, socioeconomic, and health system conditions and a developing mental health system. PMID:26969637

  3. ASSOCIATION BETWEEN INTIMATE PARTNER VIOLENCE AND IRRITABLE BOWEL SYNDROME: A POPULATION-BASED STUDY IN NICARAGUA

    PubMed Central

    Becker-Dreps, Sylvia; Morgan, Douglas; Peña, Rodolfo; Cortes, Loreto; Martin, Christopher F.; Valladares, Eliette

    2010-01-01

    Irritable bowel syndrome (IBS) is a disabling functional gastrointestinal disorder, which serves as a model for abdominal pain syndromes. An association between intimate partner violence and IBS has been shown among Caucasian women in the industrialized world. To determine whether this relationship transcends cultural boundaries, we conducted a population-based, cross-sectional survey in Nicaragua, using the innovative Health and Demographic Surveillance System in the León province. Women who had experienced physical intimate partner violence had significantly increased risk of IBS (OR 2.08, 95% CI, 1.35, 3.21), as did those who had experienced sexual intimate partner violence (OR 2.85, 95% CI 1.45, 5.59). These findings argue for intimate partner violence screening among Latina women with IBS. PMID:20558772

  4. Population based analysis of directional information in serial deformation tensor morphometry.

    PubMed

    Studholme, Colin; Cardenas, Valerie

    2007-01-01

    Deformation morphometry provides a sensitive approach to detecting and mapping subtle volume changes in the brain. Population based analyses of this data have been used successfully to detect characteristic changes in different neurodegenerative conditions. However, most studies have been limited to statistical mapping of the scalar volume change at each point in the brain, by evaluating the determinant of the Jacobian of the deformation field. In this paper we describe an approach to spatial normalisation and analysis of the full deformation tensor. The approach employs a spatial relocation and reorientation of tensors of each subject. Using the assumption of small changes, we use a linear modeling of effects of clinical variables on each deformation tensor component across a population. We illustrate the use of this approach by examining the pattern of significance and orientation of the volume change effects in recovery from alcohol abuse. Results show new local structure which was not apparent in the analysis of scalar volume changes. PMID:18044583
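
    For the scalar analysis mentioned above (local volume change via the Jacobian determinant of the deformation field), a minimal NumPy sketch is given below; it is not the authors' pipeline, and the array layout, unit voxel spacing, and random example field are assumptions of this illustration.

```python
import numpy as np

def jacobian_determinant(disp, spacing=(1.0, 1.0, 1.0)):
    """disp: displacement field with shape (3, X, Y, Z); returns det(J) per voxel."""
    # Partial derivatives d u_i / d x_j of each displacement component along each axis
    grads = [np.gradient(disp[i], *spacing) for i in range(3)]
    jac = np.zeros(disp.shape[1:] + (3, 3))
    for i in range(3):
        for j in range(3):
            jac[..., i, j] = grads[i][j] + (1.0 if i == j else 0.0)  # J = I + grad(u)
    return np.linalg.det(jac)  # >1: local expansion, <1: local contraction

# Example: a small random displacement field on a 16^3 grid
disp = 0.01 * np.random.default_rng(0).standard_normal((3, 16, 16, 16))
print(jacobian_determinant(disp).mean())
```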

  5. Secondary flow structures in the presence of Type-IV stent fractures through a bent tube model for curved arteries: Effect of circulation thresholding

    NASA Astrophysics Data System (ADS)

    Hussain, Shadman; Bulusu, Kartik V.; Plesniak, Michael W.

    2013-11-01

    A common treatment for atherosclerosis is the opening of narrowed arteries resulting from obstructive lesions by angioplasty and stent implantation to restore unrestricted blood flow. ``Type-IV'' stent fractures involve complete transverse, linear fracture of stent struts, along with displacement of the stent fragments. Experimental data pertaining to secondary flows in the presence of stents that underwent ``Type-IV'' fractures in a bent artery model under physiological inflow conditions were obtained through a two-component, two-dimensional (2C-2D) PIV technique. Concomitant stent-induced flow perturbations result in secondary flow structures with complex, multi-scale morphologies and varying size-strength characteristics. Ultimately, these flow structures may have a role to play in restenosis and progression of atherosclerotic plaque. Vortex circulation thresholds were established with the goal of resolving and tracking iso-circulation secondary flow vortical structures and their morphological changes. This allowed for a parametric evaluation and quantitative representation of secondary flow structures undergoing deformation and spatial reorganization. Supported by NSF Grant No. CBET- 0828903 and GW Center for Biomimetics and Bioinspired Engineering.
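
    A minimal sketch of the circulation-thresholding step in Python is given below; the uniform grid, the [y, x] array layout, and the numeric thresholds are assumptions of this illustration, not values from the experiment.

```python
# Identify vortical structures in a 2C-2D velocity field and keep only those whose
# circulation magnitude exceeds a chosen threshold (illustrative values).
import numpy as np
from scipy import ndimage

def thresholded_vortices(u, v, dx, dy, omega_min=1.0, gamma_min=0.5):
    omega = np.gradient(v, dx, axis=1) - np.gradient(u, dy, axis=0)  # vorticity
    labels, n = ndimage.label(np.abs(omega) > omega_min)             # candidate cores
    kept = []
    for lab in range(1, n + 1):
        circ = omega[labels == lab].sum() * dx * dy                  # Gamma = sum(omega) dA
        if abs(circ) >= gamma_min:
            kept.append((lab, circ))
    return kept
```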

  6. Long-term daily vibration exposure alters current perception threshold (CPT) sensitivity and myelinated axons in a rat-tail model of vibration-induced injury.

    PubMed

    Krajnak, Kristine; Raju, Sandya G; Miller, G Roger; Johnson, Claud; Waugh, Stacey; Kashon, Michael L; Riley, Danny A

    2016-01-01

    Repeated exposure to hand-transmitted vibration through the use of powered hand tools may result in pain and progressive reductions in tactile sensitivity. The goal of the present study was to use an established animal model of vibration-induced injury to characterize changes in sensory nerve function and cellular mechanisms associated with these alterations. Sensory nerve function was assessed weekly using the current perception threshold test and tail-flick analgesia test in male Sprague-Dawley rats exposed to 28 d of tail vibration. After 28 d of exposure, Aβ fiber sensitivity was reduced. This reduction in sensitivity was partly attributed to structural disruption of myelin. In addition, the decrease in sensitivity was also associated with a reduction in myelin basic protein and 2',3'-cyclic nucleotide phosphodiesterase (CNPase) staining in tail nerves, and an increase in circulating calcitonin gene-related peptide (CGRP) concentrations. Changes in Aβ fiber sensitivity and CGRP concentrations may serve as early markers of vibration-induced injury in peripheral nerves. It is conceivable that these markers may be utilized to monitor sensorineural alterations in workers exposed to vibration to potentially prevent additional injury. PMID:26852665

  7. Identification of Molecular Fingerprints in Human Heat Pain Thresholds by Use of an Interactive Mixture Model R Toolbox (AdaptGauss).

    PubMed

    Ultsch, Alfred; Thrun, Michael C; Hansen-Goos, Onno; Lötsch, Jörn

    2015-01-01

    Biomedical data obtained during cell experiments, laboratory animal research, or human studies often display a complex distribution. Statistical identification of subgroups in research data poses an analytical challenge. Here we introduce an interactive R-based bioinformatics tool, called "AdaptGauss". It enables a valid identification of a biologically-meaningful multimodal structure in the data by fitting a Gaussian mixture model (GMM) to the data. The interface allows a supervised selection of the number of subgroups. This enables the expectation maximization (EM) algorithm to adapt more complex GMMs than usually observed with a noninteractive approach. Interactively fitting a GMM to heat pain threshold data acquired from human volunteers revealed a distribution pattern with four Gaussian modes located at temperatures of 32.3, 37.2, 41.4, and 45.4 °C. Noninteractive fitting was unable to identify a meaningful data structure. Obtained results are compatible with known activity temperatures of different TRP ion channels, suggesting the mechanistic contribution of different heat sensors to the perception of thermal pain. Thus, sophisticated analysis of the modal structure of biomedical data provides a basis for the mechanistic interpretation of the observations. As it may reflect the involvement of different TRP thermosensory ion channels, the analysis provides a starting point for hypothesis-driven laboratory experiments. PMID:26516852
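
    For comparison, a non-interactive Gaussian mixture fit of the kind the abstract contrasts with AdaptGauss can be sketched with scikit-learn; the synthetic data below are generated around the reported mode temperatures and are not a re-analysis of the original data set.

```python
# Four-component GMM fit to synthetic heat-pain-threshold data (illustrative only;
# not the AdaptGauss R tool described in the record).
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
modes = [32.3, 37.2, 41.4, 45.4]          # mode locations reported in the abstract (deg C)
data = np.concatenate([rng.normal(m, 1.0, 250) for m in modes]).reshape(-1, 1)

gmm = GaussianMixture(n_components=4, random_state=0).fit(data)
for w, mu, var in zip(gmm.weights_, gmm.means_.ravel(), gmm.covariances_.ravel()):
    print(f"weight={w:.2f}  mean={mu:.1f} C  sd={var**0.5:.2f}")
```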

  8. Predictors of Cerebral Palsy in Very Preterm Infants: The EPIPAGE Prospective Population-Based Cohort Study

    ERIC Educational Resources Information Center

    Beaino, Ghada; Khoshnood, Babak; Kaminski, Monique; Pierrat, Veronique; Marret, Stephane; Matis, Jacqueline; Ledesert, Bernard; Thiriez, Gerard; Fresson, Jeanne; Roze, Jean-Christophe; Zupan-Simunek, Veronique; Arnaud, Catherine; Burguet, Antoine; Larroque, Beatrice; Breart, Gerard; Ancel, Pierre-Yves

    2010-01-01

    Aim: The aim of this study was to assess the independent role of cerebral lesions on ultrasound scan, and several other neonatal and obstetric factors, as potential predictors of cerebral palsy (CP) in a large population-based cohort of very preterm infants. Method: As part of EPIPAGE, a population-based prospective cohort study, perinatal data…

  9. Osteoporosis-related fracture case definitions for population-based administrative data

    PubMed Central

    2012-01-01

    Background Population-based administrative data have been used to study osteoporosis-related fracture risk factors and outcomes, but there has been limited research about the validity of these data for ascertaining fracture cases. The objectives of this study were to: (a) compare fracture incidence estimates from administrative data with estimates from population-based clinically-validated data, and (b) test for differences in incidence estimates from multiple administrative data case definitions. Methods Thirty-five case definitions for incident fractures of the hip, wrist, humerus, and clinical vertebrae were constructed using diagnosis codes in hospital data and diagnosis and service codes in physician billing data from Manitoba, Canada. Clinically-validated fractures were identified from the Canadian Multicentre Osteoporosis Study (CaMos). Generalized linear models were used to test for differences in incidence estimates. Results For hip fracture, sex-specific differences were observed in the magnitude of under- and over-ascertainment of administrative data case definitions when compared with CaMos data. The length of the fracture-free period to ascertain incident cases had a variable effect on over-ascertainment across fracture sites, as did the use of imaging, fixation, or repair service codes. Case definitions based on hospital data resulted in under-ascertainment of incident clinical vertebral fractures. There were no significant differences in trend estimates for wrist, humerus, and clinical vertebral case definitions. Conclusions The validity of administrative data for estimating fracture incidence depends on the site and features of the case definition. PMID:22537071

  10. Hip Fracture in People with Erectile Dysfunction: A Nationwide Population-Based Cohort Study

    PubMed Central

    Wu, Chieh-Hsin; Tung, Yi-Ching; Lin, Tzu-Kang; Chai, Chee-Yin; Su, Yu-Feng; Tsai, Tai-Hsin; Tsai, Cheng-Yu; Lu, Ying-Yi; Lin, Chih-Lung

    2016-01-01

    The aims of this study were to investigate the risk of hip fracture and contributing factors in patients with erectile dysfunction (ED). This population-based study was performed using the Taiwan National Health Insurance Research Database. The analysis included 4636 patients aged ≥ 40 years who had been diagnosed with ED (International Classification of Diseases, Ninth Revision, Clinical Modification codes 302.72, 607.84) during 1996–2010. The control group included 18,544 randomly selected age-matched patients without ED (1:4 ratio). The association between ED and hip fracture risk was estimated using a Cox proportional hazard regression model. During the follow-up period, 59 (1.27%) patients in the ED group and 140 (0.75%) patients in the non-ED group developed hip fracture. After adjusting for covariates, the overall incidence of hip fracture was 3.74-times higher in the ED group than in the non-ED group (2.03 vs. 0.50 per 1000 person-years, respectively). The difference in the overall incidence of hip fracture was largest during the 3-year follow-up period (hazard ratio = 7.85; 95% confidence interval = 2.94–20.96; P < 0.0001). To the best of our knowledge, this nationwide population-based study is the first to investigate the relationship between ED and subsequent hip fracture in an Asian population. The results showed that ED patients had a higher risk of developing hip fracture. Patients with ED, particularly those aged 40–59 years, should undergo bone mineral density examinations as early as possible and should take measures to reduce the risk of falls. PMID:27078254

  11. A Nationwide Population-Based Cohort Study of Migraine and Organic-Psychogenic Erectile Dysfunction.

    PubMed

    Wu, Szu-Hsien; Chuang, Eric; Chuang, Tien-Yow; Lin, Cheng-Li; Lin, Ming-Chia; Yen, Der-Jen; Kao, Chia-Hung

    2016-03-01

    As chronic illnesses and chronic pain are related to erectile dysfunction (ED), migraine, a prevalent chronic disorder affecting many people worldwide, may negatively affect quality of life as well as sexual function. However, large-scale population-based studies of erectile dysfunction and other comorbidities in patients with migraine are limited. This longitudinal cohort study aimed to estimate the association between migraine and ED using a nationwide population-based database in Taiwan. The data used for this cohort study were retrieved from the Longitudinal Health Insurance Database 2000 in Taiwan. We identified 5015 patients with migraine and frequency-matched 20,060 controls without migraine from 2000 to 2011. The occurrence of ED was followed up until the end of 2011. We used Cox proportional hazard regression models to analyze the risks of ED. The overall incidence of ED was 1.78-fold greater in the migraine cohort than in the comparison cohort (23.3 vs 10.5 per 10,000 person-years; 95% confidence interval [CI] = 1.31-2.41). Furthermore, patients with migraine were 1.75-fold more likely to develop organic ED (95% CI = 1.27-2.41) than were the comparison cohort. The migraine patients with anxiety had a 3.6-fold higher HR of having been diagnosed with ED than the comparison cohort without anxiety (95% CI, 2.10-6.18). The results support that patients with migraine have a higher incidence of being diagnosed with ED, particularly patients with comorbid anxiety. PMID:26962838

  12. Pioglitazone use and risk of bladder cancer: population based cohort study

    PubMed Central

    Tuccori, Marco; Filion, Kristian B; Yin, Hui; Yu, Oriana H; Platt, Robert W

    2016-01-01

    Objective To determine whether pioglitazone compared with other antidiabetic drugs is associated with an increased risk of bladder cancer in people with type 2 diabetes. Design Population based cohort study. Setting General practices contributing data to the United Kingdom Clinical Practice Research Datalink. Participants A cohort of 145 806 patients newly treated with antidiabetic drugs between 1 January 2000 and 31 July 2013, with follow-up until 31 July 2014. Main outcome measures The use of pioglitazone was treated as a time varying variable, with use lagged by one year for latency purposes. Cox proportional hazards models were used to estimate adjusted hazard ratios with 95% confidence intervals of incident bladder cancer associated with pioglitazone overall and by both cumulative duration of use and cumulative dose. Similar analyses were conducted for rosiglitazone, a thiazolidinedione not previously associated with an increased risk of bladder cancer. Results The cohort generated 689 616 person years of follow-up, during which 622 patients were newly diagnosed as having bladder cancer (crude incidence 90.2 per 100 000 person years). Compared with other antidiabetic drugs, pioglitazone was associated with an increased risk of bladder cancer (121.0 v 88.9 per 100 000 person years; hazard ratio 1.63, 95% confidence interval 1.22 to 2.19). Conversely, rosiglitazone was not associated with an increased risk of bladder cancer (86.2 v 88.9 per 100 000 person years; 1.10, 0.83 to 1.47). Duration-response and dose-response relations were observed for pioglitazone but not for rosiglitazone. Conclusion The results of this large population based study indicate that pioglitazone is associated with an increased risk of bladder cancer. The absence of an association with rosiglitazone suggests that the increased risk is drug specific and not a class effect. PMID:27029385

  13. Population-based register of acute myocardial infarction: manual of operations.

    PubMed

    Madsen, Mette; Gudnason, Vilmundur; Pajak, Andrzej; Palmieri, Luigi; Rocha, Evangelista C; Salomaa, Veikko; Sans, Susana; Steinbach, Konrad; Vanuzzo, Diego

    2007-12-01

    Cardiovascular disease is the leading cause of death and hospitalization in both sexes in nearly all countries of Europe. The main forms of cardiovascular disease are ischaemic heart disease and stroke. The magnitude of the problem contrasts with the shortage, weak quality and comparability of data available in most European countries. Innovations in medical, invasive and biological treatments have substantially contributed to the escalating costs of health services. It is therefore urgent to obtain reliable information on the magnitude and distribution of the disease for both adequate health planning (including preventive strategies) and clinical decision making with correct cost-benefit assessments. A stepwise surveillance procedure based on standardized data collection, appropriate record linkage and validation methods was set up by the EUROCISS Project (EUROpean Cardiovascular Indicators Surveillance Set) to build up comparable and reliable indicators (attack rate and case fatality) for the surveillance of acute myocardial infarction/acute coronary syndrome at population level. This manual of operations is intended for health professionals and policy makers and provides a standardized and simple model for the implementation of a population-based register. It recommends starting from a minimum data set and then following a stepwise procedure. Before implementing a population-based register, it is important to identify the target population under surveillance, which should preferably cover a well-defined geographical and administrative area or region representative of the whole country for which population data and vital statistics (mortality and hospital discharge records at minimum) are routinely collected and easily available each year. All cases among residents should be recorded even if the case occurs outside the area. Validation of a sample of fatal and nonfatal events is mandatory. PMID:18091134

  14. Heterogeneity in ALSFRS-R decline and survival: a population-based study in Italy.

    PubMed

    Mandrioli, Jessica; Biguzzi, Sara; Guidi, Carlo; Sette, Elisabetta; Terlizzi, Emilio; Ravasio, Alessandro; Casmiro, Mario; Salvi, Fabrizio; Liguori, Rocco; Rizzi, Romana; Pietrini, Vladimiro; Borghi, Annamaria; Rinaldi, Rita; Fini, Nicola; Chierici, Elisabetta; Santangelo, Mario; Granieri, Enrico; Mussuto, Vittoria; De Pasqua, Silvia; Georgoulopoulou, Eleni; Fasano, Antonio; Ferro, Salvatore; D'Alessandro, Roberto

    2015-12-01

    Very few studies have examined the trend over time of the revised Amyotrophic Lateral Sclerosis Functional Rating Scale (ALSFRS-R) and the factors influencing it; moreover, previous studies included only patients attending tertiary ALS Centres. We studied ALSFRS-R decline, factors influencing this trend and survival in a population-based setting. From 2009 onwards, a prospective registry records all incident ALS cases among residents in Emilia Romagna (population: 4.4 million). For each patient, demographic and clinical details (including ALSFRS-R) are collected by caring physicians at each follow-up. Analysis was performed on 402 incident cases (1279 ALSFRS-R assessments). The average decline of the ALSFRS-R was 0.60 points/month during the first year after diagnosis and 0.34 points/month in the second year. ALSFRS-R decline was heterogeneous among subgroups. A repeated measures mixed model showed that ALSFRS-R score decline was influenced by age at onset (p < 0.01), phenotype (p = 0.01), body mass index (BMI) (p < 0.01), progression rate at diagnosis (ΔFS) (p < 0.01), El Escorial Criteria-Revised (p < 0.01), and FVC% at diagnosis (p < 0.01). Among these factors, at multivariate analysis, only age, site of onset and ΔFS independently influenced survival. In this first population-based study on ALSFRS-R trend, we confirm that ALSFRS-R decline is not homogeneous among ALS patients and during the disease. Factors influencing ALSFRS-R decline may not match with those affecting survival. These disease modifiers should be taken into consideration for trial design and in clinical practice during discussions with patients on prognosis. PMID:26205535

  15. Thresholds for disease persistence in models for tick-borne infections including non-viraemic transmission, extended feeding and tick aggregation.

    PubMed

    Rosà, Roberto; Pugliese, Andrea; Norman, Rachel; Hudson, Peter J

    2003-10-01

    Lyme disease and Tick-Borne Encephalitis (TBE) are two emergent tick-borne diseases transmitted by the widely distributed European tick Ixodes ricinus. The life cycle of the vector and the number of hosts involved require the development of complex models that consider different routes of pathogen transmission, including those occurring between ticks that co-feed on the same host. Hence, we consider here a general model for tick-borne infections. We assumed ticks feed on two types of host species, one competent for viraemic transmission of infection and the second incompetent, but included a third transmission route through non-viraemic transmission between ticks co-feeding on the same host. Since a blood meal lasts for several days, these routes could lead to interesting nonlinearities in transmission rates, which may have important effects. We derive an explicit formula for the threshold for disease persistence in the case of viraemic transmission, and also for the case of combined viraemic and non-viraemic transmission. From this formula, the effect of parameters on the persistence of infection can be determined. When only viraemic transmission occurs, we confirm that, while the density of the competent host always has a positive effect on infection persistence, the density of the incompetent host may have either a positive effect, by amplifying the tick population, or a negative ("dilution") effect, by wasting tick bites on an incompetent host. With non-viraemic transmission, the "dilution" effect becomes less relevant. On the other hand, if the nonlinearity due to extended feeding is included, the dilution effect always occurs, but often at unrealistically high host densities. Finally, we incorporated the effects of tick aggregation on the hosts and correlation of tick stages and found that both had an important effect on infection persistence, if non-viraemic transmission occurred. PMID:12941594

  16. Network problem threshold

    NASA Technical Reports Server (NTRS)

    Gejji, Raghvendra, R.

    1992-01-01

    Network transmission errors such as collisions, CRC errors, misalignment, etc. are statistical in nature. Although errors can vary randomly, a high level of errors does indicate specific network problems, e.g. equipment failure. In this project, we have studied the random nature of collisions theoretically as well as by gathering statistics, and established a numerical threshold above which a network problem is indicated with high probability.
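
    One way to set such a numerical threshold is sketched below; the Poisson traffic model and the 99.9% quantile are assumptions of this illustration rather than the project's own derivation.

```python
# Alarm threshold for collision counts under an assumed Poisson model of normal traffic.
from scipy.stats import poisson

def collision_alarm_threshold(expected_collisions_per_interval, quantile=0.999):
    # Counts above this value are unlikely under normal operation, so they
    # indicate a probable network problem (e.g., equipment failure).
    return int(poisson.ppf(quantile, mu=expected_collisions_per_interval))

# Example: typical load of 20 collisions per monitoring interval
print(collision_alarm_threshold(20))
```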

  17. Deterministic estimation of hydrological thresholds for shallow landslide initiation and slope stability models: case study from the Somma-Vesuvius area of southern Italy

    USGS Publications Warehouse

    Baum, Rex L.; Godt, Jonathan W.; De Vita, P.; Napolitano, E.

    2012-01-01

    interrupted. These results lead to the identification of a comprehensive hydrogeomorphological model of susceptibility to initial landslides that links morphological, stratigraphical and hydrological conditions. The calculation of intensities and durations of rainfall necessary for slope instability allowed the identification of deterministic hydrological thresholds that account for uncertainty in properties and observed rainfall intensities.

  18. Threshold altitude resulting in decompression sickness

    NASA Technical Reports Server (NTRS)

    Kumar, K. V.; Waligora, James M.; Calkins, Dick S.

    1990-01-01

    A review of case reports, hypobaric chamber training data, and experimental evidence indicated that the threshold for incidence of altitude decompression sickness (DCS) was influenced by various factors such as prior denitrogenation, exercise or rest, and period of exposure, in addition to individual susceptibility. Fitting these data with appropriate statistical models makes it possible to examine the influence of various factors on the threshold for DCS. This approach was illustrated by logistic regression analysis on the incidence of DCS below 9144 m. Estimations using these regressions showed that, under a no-prebreathe, 6-h exposure, simulated EVA profile, the threshold for symptoms occurred at approximately 3353 m, while under a no-prebreathe, 2-h exposure profile with knee-bends exercise, the threshold occurred at 7925 m.
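
    A minimal sketch of the regression-based threshold idea in Python is given below; the synthetic data, the assumed coefficients, and the 5% symptom-probability cut-off are illustrative assumptions, not the authors' values.

```python
# Fit a logistic model of DCS incidence on altitude and exposure time, then find the
# altitude at which predicted probability first exceeds a chosen incidence level.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(4)
altitude_m = rng.uniform(2000, 9000, 500)
exposure_h = rng.choice([2.0, 6.0], 500)
logit = -9.0 + 0.0009 * altitude_m + 0.3 * exposure_h      # assumed "true" model
y = rng.random(500) < 1.0 / (1.0 + np.exp(-logit))         # simulated DCS outcomes

X = np.column_stack([altitude_m, exposure_h])
model = LogisticRegression(max_iter=1000).fit(X, y)

# Altitude at which predicted DCS probability exceeds 5% for a 6-h exposure profile
grid = np.column_stack([np.linspace(2000, 9000, 701), np.full(701, 6.0)])
p = model.predict_proba(grid)[:, 1]
print(grid[p >= 0.05][0, 0] if (p >= 0.05).any() else "never exceeds 5%")
```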

  19. Can we infer the magma overpressure threshold before an eruption? Insights from ground deformation time series and numerical modeling of reservoir failure.

    NASA Astrophysics Data System (ADS)

    Albino, F.; Gregg, P. M.; Amelung, F.

    2015-12-01

    Overpressure within a magma chamber is a key parameter for understanding the onset of an eruption. Recent investigations indicate that surface inflation at a volcanic edifice does not always precede eruption (Chaussard and Amelung, 2012; Biggs et al., 2014), suggesting that the overpressure threshold may differ between volcanoes. To understand the failure conditions of a magma reservoir, mechanical models were developed to quantify the range of overpressure a reservoir can sustain in a given situation. Although the choice of the failure criterion is still debated, most investigators agree that the overpressure required to fail the magma reservoir is at first order a function of the crustal stress field and the shape of the magma reservoir. Radar interferometry (InSAR) provides a large dataset of ground deformation worldwide, but many of these InSAR studies continue to use point or dislocation sources (Mogi, Okada) to explain deformation on volcanoes. Although these simple solutions often fit the data and estimate the depth and the volume change of the source of deformation, key parameters such as the magma overpressure or the mechanical properties of the rocks cannot be derived. We use mechanical numerical models of reservoir failure combined with ground deformation data. It has been observed that volume change before an eruption can easily range over one or two orders of magnitude, from 1 to 100 × 10^6 m^3. The first goal of this study is to understand which parameter(s) control the critical volume changes just before the failure of the reservoir. First, a parametric study is performed to quantify the effect of the geometry of the reservoir (radius, depth), the local stress (compressive/extensive) and even the crust rheology (elastic/viscoelastic). We then compare modeling results with several active volcanoes where long time series of volume change are available: Okmok and Westdahl in Alaska, Sinabung and Agung in Indonesia and Galapagos. For each case, the maximum

  20. Thresholded Power law Size Distributions of Instabilities in Astrophysics

    NASA Astrophysics Data System (ADS)

    Aschwanden, Markus J.

    2015-11-01

    Power-law-like size distributions are ubiquitous in astrophysical instabilities. There are at least four natural effects that cause deviations from ideal power law size distributions, which we model here in a generalized way: (1) a physical threshold of an instability; (2) incomplete sampling of the smallest events below a threshold x0; (3) contamination by an event-unrelated background xb; and (4) truncation effects at the largest events due to a finite system size. These effects can be modeled in the simplest terms with a “thresholded power law” distribution function (also called generalized Pareto [type II] or Lomax distribution), N(x) dx ∝ (x + x0)^(-a) dx, where x0 > 0 is positive for a threshold effect, while x0 < 0 is negative for background contamination. We analytically derive the functional shape of this thresholded power law distribution function from an exponential growth evolution model, which produces avalanches only when a disturbance exceeds a critical threshold x0. We apply the thresholded power law distribution function to terrestrial, solar (HXRBS, BATSE, RHESSI), and stellar flare (Kepler) data sets. We find that the thresholded power law model provides an adequate fit to most of the observed data. Major advantages of this model are the automated choice of the power law fitting range, diagnostics of background contamination, physical instability thresholds, instrumental detection thresholds, and finite system size limits. When testing self-organized criticality models that predict ideal power laws, we suggest including these natural truncation effects.
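
    A minimal sketch of fitting this thresholded power law (Lomax) form with SciPy is given below; the mapping a = c + 1 and x0 = scale follows the distribution's definition, while the synthetic event sizes and "true" parameters are assumptions for illustration.

```python
# Fit the thresholded power law N(x) dx ∝ (x + x0)^(-a) dx via the Lomax distribution.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
a_true, x0_true = 1.8, 5.0
sizes = stats.lomax.rvs(a_true - 1.0, scale=x0_true, size=5000, random_state=rng)

c_hat, loc_hat, scale_hat = stats.lomax.fit(sizes, floc=0.0)   # fix location at 0
print("estimated power-law slope a =", c_hat + 1.0)
print("estimated threshold x0      =", scale_hat)
```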

  1. Predictors of Colorectal Cancer Survival in Golestan, Iran: A Population-based Study

    PubMed Central

    Aryaie, Mohammad; Roshandel, Gholamreza; Semnani, Shahryar; Asadi-Lari, Mohsen; Aarabi, Mohsen; Vakili, Mohammad Ali; Kazemnejhad, Vahideh; Sedaghat, Seyed Mehdi

    2013-01-01

    OBJECTIVES We aimed to investigate factors associated with colorectal cancer survival in Golestan, Iran. METHODS We used a population based cancer registry to recruit study subjects. All patients registered since 2004 were contacted and data were collected using structured questionnaires and trained interviewers. All the existing evidence to determine the stage of the cancer was also collected. The time from first diagnosis to death was compared in patients according to their stage of cancer using the Kaplan-Meier method. A Cox proportional hazard model was built to examine their survival experience by taking into account other covariates. RESULTS Out of a total of 345 subjects, 227 were traced. Median age of the subjects was 54 and more than 42% were under 50 years old. We found 132 deaths among these patients, 5 of which were non-colorectal related deaths. The median survival time for the entire cohort was 3.56 years. A borderline significant difference in survival experience was detected for ethnicity (log rank test, p=0.053). Using Cox proportional hazard modeling, only cancer stage remained significantly associated with time of death in the final model. CONCLUSIONS Colorectal cancer occurs at a younger age among people living in Golestan province. A very young age at presentation and what appears to be a high proportion of patients presenting with late stage in this area suggest this population might benefit substantially from early diagnoses by introducing age adapted screening programs. PMID:23807907
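
    A minimal sketch of the Kaplan-Meier and Cox steps in Python using the lifelines package is given below; the package choice and the synthetic registry-like data are assumptions of this illustration, since the study does not name its software.

```python
# Kaplan-Meier survival estimate and Cox proportional hazards model on synthetic data.
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "time": rng.exponential(3.5, n),      # years from diagnosis to death or censoring
    "event": rng.integers(0, 2, n),       # 1 = death observed, 0 = censored
    "stage": rng.integers(1, 5, n),       # hypothetical cancer stage 1-4
    "age": rng.normal(54, 12, n),
})

kmf = KaplanMeierFitter()
kmf.fit(df["time"], event_observed=df["event"])
print("median survival:", kmf.median_survival_time_)

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")  # stage and age as covariates
cph.print_summary()
```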

  2. Unsolved homicides in Sweden: A population-based study of 264 homicides.

    PubMed

    Sturup, Joakim; Karlberg, Daniel; Kristiansson, Marianne

    2015-12-01

    The clearance rates for homicides have decreased internationally. This retrospective population-based study of all Swedish homicide incidents between 2007 and 2009 (n=264) aims to investigate factors associated with solvability in homicides. Victims were identified in an autopsy registry and offenders in a criminal-conviction registry. Autopsy reports, police files, court verdicts and criminal records were systematically collected and linked. The clearance rate was 86.4% (n=228), and almost three quarters of cases (71.9%) were solved within the first week. Nine factors were significantly associated with the case status; however, only four factors remained significant in the multivariate logistic-regression model. Cases were more likely to be solved if there was an eyewitness and if the victim was intoxicated with alcohol. Moreover, cases were less likely to be solved if the victim had a criminal record in the past five years and was killed by a firearm. In the final model, a Cox proportional-hazards model, where time to arrest was taken into account, only alcohol intoxication (positively) and firearms (negatively) remained significantly associated with clearance status. The study concludes that cases involving these factors should be granted extra, intensive and lasting resources. PMID:26295928

  3. Development of a fracture mechanics/threshold behavior model to assess the effects of competing mechanisms induced by shot peening on cyclic life of a nickel-base superalloy, Rene 88DT

    NASA Astrophysics Data System (ADS)

    Tufft, Marsha Klopmeier

    This research establishes an improved lower-bound predictive method for the cyclic life of shot peened specimens made from a nickel-base superalloy, Rene 88DT. Based on previous work, shot peening is noted to induce the equivalent of fatigue damage, in addition to the beneficial compressive residual stresses. The ability to quantify the relative effects of various shot peening treatments on cyclic life capability provides a basis for more economic use of shot peening, and selection of shot peening parameters to meet design and life requirements, while minimizing production costs. The predictive method developed consists of two major elements: (1) a Fracture Mechanics Model, which accounts for changes in microstructure, residual stress and topography induced by shot peening, and (2) a Threshold Behavior Map which identifies both crack nucleation and crack propagation thresholds. When both thresholds are crossed, life capability can be evaluated using the Fracture Mechanics model developed. When the crack propagation threshold is exceeded but the crack nucleation threshold is not, the FM method produces a conservative lower-bound estimate of life capability. A unique contribution is the characterization of peening-induced damage by an initial flaw size derived from microstructural observations of slip depth. Observations of crack formation along slip bands in a model disk reinforce the definition of a flaw size from slip measurements. Supporting research includes: (1) metallurgical and microstructural evaluation of single impact dimples and production peened coupons, (2) instrumented Single Particle Impact Tests, characterizing changes in material response due to variations in impact conditions (particle size, incidence angle, velocity), and (3) duplication of 16 peening conditions used in a designed experiment, characterizing slip depth, residual stress profiles, surface roughness and velocity measurements taken during production peening conditions.

  4. Percolation Threshold in Polycarbonate Nanocomposites

    NASA Astrophysics Data System (ADS)

    Ahuja, Suresh

    2014-03-01

    Nanocomposites have unique mechanical, electrical, magnetic, optical and thermal properties. Many methods could be applied to prepare polymer-inorganic nanocomposites, such as sol-gel processing, in-situ polymerization, particle in-situ formation, blending, and radiation synthesis. The analytical composite models that have been put forth include the Voigt and Reuss bounds. Polymer nanocomposites offer the possibility of substantial improvements in material properties such as shear and bulk modulus, yield strength, toughness, film scratch resistance, optical properties, electrical conductivity, and gas and solvent transport, with only very small amounts of nanoparticles. Experimental results are compared against composite models including the Hashin-Shtrikman bounds, the Halpin-Tsai model, the Cox model, and various Mori-Tanaka models. Examples of numerical modeling are molecular dynamics modeling and finite element modeling of reduced modulus and hardness that take into account the modulus of the components and the effect of the interface between the hard filler and the relatively soft polymer, polycarbonate. Higher nanoparticle concentration results in poor dispersion and adhesion to the polymer matrix, which results in lower modulus and hardness and a departure from the existing composite models. As the level of silica increases beyond a threshold level, aggregates form, which results in weakening of the structure. The polymer-silica interface is found to be weak, as silica is non-interacting, promoting interfacial slip at silica-matrix junctions. Our experimental results compare favorably with those of nanocomposites of polyesters, where the effect of nanoclay on composite hardness and modulus depended on the dispersion of nanoclay in the polyester.
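
    As a small worked example of one of the named micromechanics models, a Halpin-Tsai sketch is given below; the matrix and filler moduli, shape factor, and volume fractions are illustrative assumptions, not values from the study.

```python
# Halpin-Tsai estimate of composite modulus versus filler volume fraction.
def halpin_tsai_modulus(E_m, E_f, phi, xi=2.0):
    """E_m: matrix modulus, E_f: filler modulus, phi: filler volume fraction,
    xi: shape factor reflecting filler geometry."""
    eta = (E_f / E_m - 1.0) / (E_f / E_m + xi)
    return E_m * (1.0 + xi * eta * phi) / (1.0 - eta * phi)

# Polycarbonate-like matrix (~2.3 GPa) with a stiff silica filler (~70 GPa)
for phi in (0.01, 0.03, 0.05):
    print(phi, round(halpin_tsai_modulus(2.3, 70.0, phi), 2), "GPa")
```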

  5. Thresholds for Epidemic Spreading in Networks

    NASA Astrophysics Data System (ADS)

    Castellano, Claudio; Pastor-Satorras, Romualdo

    2010-11-01

    We study the threshold of epidemic models in quenched networks with degree distribution given by a power law. For the susceptible-infected-susceptible model the activity threshold λ_c vanishes in the large size limit on any network whose maximum degree k_max diverges with the system size, at odds with heterogeneous mean-field (HMF) theory. The vanishing of the threshold has nothing to do with the scale-free nature of the network but stems instead from the largest hub in the system being active for any spreading rate λ > 1/k_max and playing the role of a self-sustained source that spreads the infection to the rest of the system. The susceptible-infected-removed model displays instead agreement with HMF theory and a finite threshold for scale-rich networks. We conjecture that on quenched scale-rich networks the threshold of generic epidemic models is vanishing or finite depending on the presence or absence of a steady state.
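
    A small numerical illustration (not from the paper) of the contrast described: for degree sequences drawn from a power law with exponent 3.5 (an assumed value), the HMF SIS threshold <k>/<k^2> stays roughly constant while the hub-activation scale 1/k_max keeps shrinking as the network grows.

```python
# Compare the HMF threshold estimate with 1/k_max for growing power-law degree sequences.
import numpy as np

rng = np.random.default_rng(5)
gamma, k_min = 3.5, 3.0   # assumed exponent and minimum degree

for n in (10**3, 10**4, 10**5, 10**6):
    u = rng.random(n)
    k = k_min * (1.0 - u) ** (-1.0 / (gamma - 1.0))   # continuous power-law sample
    hmf_threshold = k.mean() / np.mean(k**2)           # HMF prediction for SIS
    hub_scale = 1.0 / k.max()                          # hub self-activation scale
    print(f"N={n:>7}  HMF={hmf_threshold:.3f}  1/kmax={hub_scale:.4f}")
```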

  6. Maternal hypertension with nifedipine treatment associated with a higher risk for right-sided obstructive defects of the heart: a population-based case–control study

    PubMed Central

    Csáky-Szunyogh, Melinda; Vereczkey, Attila; Gerencsér, Balázs; Czeizel, Andrew E

    2014-01-01

    Objective To establish possible aetiological factors contributing to congenital heart defects (CHD) overall and separately for different types of CHD, as causes are unknown for the vast majority of patients. Design To estimate a possible association with maternal diseases and related drug treatments as exposures in the mothers of cases with right-sided obstructive defects of the heart (RSODH). Setting A large population-based Hungarian Case-Control Surveillance of Congenital Abnormalities data set. Patients Newborn infants with four types of RSODH based on autopsy or surgical records. Interventions Comparison of 200 live-born cases with RSODH including 72 (36.0%) with pulmonary valve stenosis, 13 (6.5%) with tricuspid atresia/stenosis, 7 (3.5%) with Ebstein's anomaly and 108 (54.0%) with pulmonary atresia, with 304 matched controls and 38 151 population controls without any defects. Main outcome measures Risk of any RSODH and risk of each type of RSODH. Results High blood pressure, particularly chronic hypertension with nifedipine treatment, was associated with a risk for RSODH (OR 7.03, 95% CI 3.13 to 13.84). High doses of folic acid reduced the birth prevalence of pulmonary atresia (OR 0.29, 95% CI 0.16 to 0.53). Conclusions The multifactorial threshold model provides the best explanation for the origins of RSODH. Genetic predisposition may be triggered by maternal hypertension with nifedipine treatment, while the risk for pulmonary atresia is reduced by high doses of folic acid in early pregnancy.

  7. The Henry street consortium population-based competencies for educating public health nursing students.

    PubMed

    Schaffer, Marjorie A; Cross, Sharon; Keller, Linda O; Nelson, Pamela; Schoon, Patricia M; Henton, Pat

    2011-01-01

    The Henry Street Consortium, a collaboration of nurse educators from universities and colleges and public health nurses (PHNs) from government, school, and community agencies, developed 11 population-based competencies for educating nursing students and the novice PHN. Although many organizations have developed competency lists for experts, the Consortium developed a set of competencies that clearly define expectations for the beginning PHN. The competencies are utilized by both education and practice. They guide nurse educators and PHNs in the creation of learning experiences that develop population-based knowledge and skills for baccalaureate nursing students. Public health nursing leaders use the competencies to frame their expectations and orientations for nurses who are new to public health nursing. This paper explains the meaning of each of the 11 population-based competencies and provides examples of student projects that demonstrate competency development. Strategies are suggested for nurse educators and PHNs to promote effective population-based student projects in public health agencies. PMID:21198818

  8. NSAID Use and Incident Cognitive Impairment in a Population-based Cohort.

    PubMed

    Wichmann, Margarete A; Cruickshanks, Karen J; Carlsson, Cynthia M; Chappell, Rick; Fischer, Mary E; Klein, Barbara E K; Klein, Ronald; Schubert, Carla R

    2016-01-01

    Nonsteroidal anti-inflammatory drugs (NSAIDs) may prevent dementia, but previous studies have yielded conflicting results. This study estimated the association of prior NSAID use with incident cognitive impairment in the population-based Epidemiology of Hearing Loss Study (EHLS, n=2422 without cognitive impairment in 1998-2000). Prospectively collected medication data from 1988-1990, 1993-1995, and 1998-2000 were used to categorize NSAID use history at the cognitive baseline (1998-2000). Aspirin use and nonaspirin NSAID use were separately examined. Cox regression models were used to estimate the associations between NSAID use history at baseline and incident cognitive impairment in 2003-2005 or 2009-2010. Logistic regression analyses were used to estimate associations with a second outcome, mild cognitive impairment/dementia, available in 2009-2010. Participants using aspirin at baseline but not 5 years prior were more likely to develop cognitive impairment (adjusted hazard ratio=1.77; 95% confidence interval=1.11, 2.82; model 2), with nonsignificant associations for longer term use. Nonaspirin NSAID use was not associated with incident cognitive impairment or mild cognitive impairment/dementia odds. These results provided no evidence to support a potential protective effect of NSAIDs against dementia. PMID:26079710
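
    The hazard ratios above come from Cox proportional hazards regression. A minimal sketch of that kind of fit, using the lifelines library and simulated toy data (the column names and effect sizes are invented, not the EHLS variables), might look like this.

      import numpy as np
      import pandas as pd
      from lifelines import CoxPHFitter

      rng = np.random.default_rng(0)
      n = 300

      # Toy cohort: the event hazard depends on baseline aspirin status and age,
      # with administrative censoring at 12 years of follow-up.
      aspirin = rng.integers(0, 2, n)
      age = rng.integers(55, 85, n)
      hazard = 0.03 * np.exp(0.5 * aspirin + 0.04 * (age - 55))
      time_to_event = rng.exponential(1.0 / hazard)

      df = pd.DataFrame({
          "years_followed": np.minimum(time_to_event, 12.0),
          "impairment_event": (time_to_event <= 12.0).astype(int),
          "new_aspirin_use": aspirin,
          "age_at_baseline": age,
      })

      cph = CoxPHFitter()
      cph.fit(df, duration_col="years_followed", event_col="impairment_event")
      cph.print_summary()  # the exp(coef) column holds the adjusted hazard ratios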

  9. Clinical risk factors for fracture in postmenopausal Canadian women: a population-based prevalence study.

    PubMed

    Leslie, William D; Anderson, William A; Metge, Colleen J; Manness, Lori-Jean

    2007-04-01

    Clinical risk factor assessment can be used to enhance fracture risk estimation based upon bone densitometry alone. Population- and age-specific risk factor prevalence data are required for the construction of these risk models. Our objective was to derive population-based prevalence estimates of specific clinical risk factors for postmenopausal women resident in the Province of Manitoba, Canada. A random sample of 40,300 women aged 50 or older identified from the provincial health plan was mailed a validated self-report risk factor survey. There were 8747 respondents (a 21.7% response rate), with a final study population of 8027 women after exclusions. The individual prevalence for each clinical risk factor ranged from 5.8% for hyperthyroidism to 33.0% for a fall in the preceding 12 months. Most point prevalence estimates were similar to those of other large cohort studies, though the prevalences of inactivity and poor mobility were higher than expected, while height at age 25 and the prevalence of any fracture after age 50 were lower than expected. Most of the respondents (86.9%) had at least one non-age clinical risk factor, 60.6% had two or more, and 33.5% had three or more. Age affected risk factor prevalence, and older age was associated with a higher rate of multiple risk factors. The availability of age-specific risk factor prevalence rates in this population may allow for more accurate fracture risk modeling. PMID:17182296

  10. Outcome Predictors in First-Ever Ischemic Stroke Patients: A Population-Based Study

    PubMed Central

    Corso, Giovanni; Bottacchi, Edo; Tosi, Piera; Caligiana, Laura; Lia, Chiara; Veronese Morosini, Massimo; Dalmasso, Paola

    2014-01-01

    Background. There is scant population-based information regarding predictors of stroke severity and long-term mortality for first-ever ischemic strokes. The aims of this study were to determine the characteristics of patients who initially presented with first-ever ischemic stroke and to identify predictors of severity and long-term mortality. Methods. Data were collected from the population-based Cerebrovascular Aosta Registry. Between 2004 and 2008, 1057 patients with first-ever ischemic stroke were included. Variables analysed included comorbidities, sociodemographic factors, prior-to-stroke risk factors, therapy at admission, and pathophysiologic and metabolic factors. Multivariate logistic regression models, Kaplan-Meier estimates, and Cox proportional hazards models were used to assess predictors. Results. Predictors of stroke severity at admission were very old age (odds ratio [OR] 2.98, 95% confidence interval [CI] 1.75–5.06), female gender (OR 1.73, 95% CI 1.21–2.40), atrial fibrillation (OR 2.76, 95% CI 1.72–4.44), low ejection fraction (OR 2.22, 95% CI 1.13–4.32), and cardioembolism (OR 2.0, 95% CI 1.36–2.93). Predictors of long-term mortality were very old age (hazard ratio [HR] 2.02, 95% CI 1.65–2.47), prestroke modified Rankin scale 3–5 (HR 1.82; 95% CI 1.46–2.26), Charlson Index ≥2 (HR 1.97; 95% CI 1.62–2.42), atrial fibrillation (HR 1.43, 95% CI 1.04–1.98), and stroke severity (HR 3.54, 95% CI 2.87–4.36). Conclusions. Very old age and cardiac embolism risk factors are the independent predictors of stroke severity. Moreover, these factors, together with other comorbid medical conditions, independently influence long-term mortality after ischemic stroke.

  11. Adipocytokines, C-Reactive Protein, and Cardiovascular Disease: A Population-Based Prospective Study

    PubMed Central

    Seven, Ekim; Husemoen, Lise L. N.; Sehested, Thomas S. G.; Ibsen, Hans; Wachtell, Kristian; Linneberg, Allan; Jeppesen, Jørgen L.

    2015-01-01

    Background Being overweight or obese is associated with a greater risk of coronary heart disease and stroke compared with normal weight. The role of the specific adipose tissue-derived substances, called adipocytokines, in overweight- and obesity-related cardiovascular disease (CVD) is still unclear. Objective To investigate the associations of three adipose tissue-derived substances (adiponectin, leptin, and interleukin-6) with incident CVD in a longitudinal population-based study, including extensive adjustments for traditional and metabolic risk factors closely associated with overweight and obesity. C-reactive protein (CRP) was used as a proxy for interleukin-6. Methods Prospective population-based study of 6,502 participants, 51.9% women, aged 30–60 years, free of CVD at baseline, with a mean follow-up time of 11.4 years, equivalent to 74,123 person-years of follow-up. The outcome was a composite comprising the first event of fatal or nonfatal coronary heart disease and fatal or nonfatal stroke. Results During the follow-up period, 453 composite CV outcomes occurred among participants with complete datasets. In models including gender, age, smoking status, systolic blood pressure, treatment for hypertension, diabetes, body mass index (BMI), total cholesterol, high-density-lipoprotein cholesterol, homeostasis model assessment of insulin resistance, estimated glomerular filtration rate, adiponectin, leptin, and CRP, neither adiponectin (hazard ratio [HR] with 95% confidence interval [CI]: 0.97 [0.87–1.08] per SD increase, P = 0.60) nor leptin (0.97 [0.85–1.12] per SD increase, P = 0.70) predicted the composite outcome, whereas CRP was significantly associated with the composite outcome (1.19 [1.07–1.35] per SD increase, P = 0.002). Furthermore, in mediation analysis adjusted for age and sex, CRP decreased the BMI-associated CV risk by 43% (95% CI 29–72). Conclusions In this study, neither adiponectin nor leptin were independently

  12. Initiation Pressure Thresholds from Three Sources

    SciTech Connect

    Souers, P C; Vitello, P

    2007-02-28

    Pressure thresholds are minimum pressures needed to start explosive initiation that ends in detonation. We obtain pressure thresholds from three sources. Run-to-detonation times are the poorest source, but fitting a function to them gives rough results. Flyer-induced initiation gives the best results because the initial conditions are the best known. However, very thick flyers are needed to give the lowest, asymptotic pressure thresholds used in modern models, and this kind of data is rarely available. Gap-test data are in much larger supply, but the various test sizes and materials are confusing. We find that explosive pressures are almost the same if the distances in the gap-test spacers are expressed in units of the donor explosive radius. Calculated half-width time pulses in the spacers may be used to create a pressure-time curve similar to that of the flyers. The very large Eglin gap tests give asymptotic thresholds comparable to extrapolated flyer results. The three sources are assembled into a much-expanded set of near-asymptotic pressure thresholds. These thresholds vary greatly with density: for TATB/LX-17/PBX 9502, we find values of 4.9 and 8.7 GPa at 1.80 and 1.90 g/cm³, respectively.

  13. Laser threshold magnetometry

    NASA Astrophysics Data System (ADS)

    Jeske, Jan; Cole, Jared H.; Greentree, Andrew D.

    2016-01-01

    We propose a new type of sensor, which uses diamond containing the optically active nitrogen-vacancy (NV⁻) centres as a laser medium. The magnetometer can be operated at room temperature and generates light that can be readily fibre coupled, thereby permitting use in industrial applications and remote sensing. By combining laser pumping with a radio-frequency Rabi-drive field, an external magnetic field changes the fluorescence of the NV⁻ centres. We use this change in fluorescence level to push the laser above threshold, turning it on with an intensity controlled by the external magnetic field, which provides a coherent amplification of the readout signal with very high contrast. This mechanism is qualitatively different from conventional NV⁻-based magnetometers which use fluorescence measurements, based on incoherent photon emission. We term our approach laser threshold magnetometry (LTM). We predict that an NV⁻-based LTM with a volume of 1 mm³ can achieve shot-noise-limited dc sensitivity of 1.86 fT/√Hz and ac sensitivity of 3.97 fT/√Hz.

  14. Human immunodeficiency virus testing for patient-based and population-based diagnosis.

    PubMed

    Albritton, W L; Vittinghoff, E; Padian, N S

    1996-10-01

    Laboratory testing for human immunodeficiency virus (HIV) has been introduced for individual patient-based diagnosis as well as high-risk and low-risk population-based screening. The choice of test, confirmatory algorithm, and interpretative criteria used depend on the clinical setting. In the context of general population-based testing, factors affecting test performance will have to be considered carefully in the development of testing policy. PMID:8843247
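
    The point that the same assay performs very differently in patient-based diagnosis and in low-prevalence population screening follows directly from Bayes' rule. The sensitivity, specificity, and prevalence values below are purely illustrative, not the characteristics of any particular HIV test.

      def predictive_values(sensitivity, specificity, prevalence):
          """Positive and negative predictive values at a given prevalence."""
          tp = sensitivity * prevalence
          fp = (1 - specificity) * (1 - prevalence)
          fn = (1 - sensitivity) * prevalence
          tn = specificity * (1 - prevalence)
          return tp / (tp + fp), tn / (tn + fn)

      # Same assumed assay, two settings: a high-risk clinic and general screening.
      for prevalence in (0.10, 0.001):
          ppv, npv = predictive_values(0.995, 0.998, prevalence)
          print(f"prevalence {prevalence:.3%}: PPV = {ppv:.3f}, NPV = {npv:.5f}")

    Even with excellent sensitivity and specificity, the positive predictive value collapses at very low prevalence, which is why the choice of confirmatory algorithm matters most for population-based screening.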

  15. Public assistance, drug testing, and the law: the limits of population-based legal analysis.

    PubMed

    Player, Candice T

    2014-01-01

    In Populations, Public Health and the Law, legal scholar Wendy Parmet urges courts to embrace population-based legal analysis, a public health inspired approach to legal reasoning. Parmet contends that population-based legal analysis offers a way to analyze legal issues--not unlike law and economics--as well as a set of values from which to critique contemporary legal discourse. Population-based analysis has been warmly embraced by the health law community as a bold new way of analyzing legal issues. Still, population-based analysis is not without its problems. At times, Parmet claims too much territory for the population perspective. Moreover, Parmet urges courts to recognize population health as an important norm in legal reasoning. What should we do when the insights of public health and conventional legal reasoning conflict? Still in its infancy, population-based analysis offers little in the way of answers to these questions. This Article applies population-based legal analysis to the constitutional problems that arise when states condition public assistance benefits on passing a drug test, thereby highlighting the strengths of the population perspective and exposing its weaknesses. PMID:24844042

  16. Assessing the Validity of a Stage Measure on Physical Activity in a Population-Based Sample of Individuals with Type 1 or Type 2 Diabetes

    ERIC Educational Resources Information Center

    Plotnikoff, Ronald C.; Lippke, Sonia; Reinbold-Matthews, Melissa; Courneya, Kerry S.; Karunamuni, Nandini; Sigal, Ronald J.; Birkett, Nicholas

    2007-01-01

    This study was designed to test the validity of a transtheoretical model's physical activity (PA) stage measure with intention and different intensities of behavior in a large population-based sample of adults living with diabetes (Type 1 diabetes, n = 697; Type 2 diabetes, n = 1,614) and examine different age groups. The overall "specificity"…

  17. Ambient Fine Particulate Matter and Mortality among Survivors of Myocardial Infarction: Population-Based Cohort Study

    PubMed Central

    Chen, Hong; Burnett, Richard T.; Copes, Ray; Kwong, Jeffrey C.; Villeneuve, Paul J.; Goldberg, Mark S.; Brook, Robert D.; van Donkelaar, Aaron; Jerrett, Michael; Martin, Randall V.; Brook, Jeffrey R.; Kopp, Alexander; Tu, Jack V.

    2016-01-01

    Background: Survivors of acute myocardial infarction (AMI) are at increased risk of dying within several hours to days following exposure to elevated levels of ambient air pollution. Little is known, however, about the influence of long-term (months to years) air pollution exposure on survival after AMI. Objective: We conducted a population-based cohort study to determine the impact of long-term exposure to fine particulate matter ≤ 2.5 μm in diameter (PM2.5) on post-AMI survival. Methods: We assembled a cohort of 8,873 AMI patients who were admitted to 1 of 86 hospital corporations across Ontario, Canada in 1999–2001. Mortality follow-up for this cohort extended through 2011. Cumulative time-weighted exposures to PM2.5 were derived from satellite observations based on participants’ annual residences during follow-up. We used standard and multilevel spatial random-effects Cox proportional hazards models and adjusted for potential confounders. Results: Between 1999 and 2011, we identified 4,016 nonaccidental deaths, of which 2,147 were from any cardiovascular disease, 1,650 from ischemic heart disease, and 675 from AMI. For each 10-μg/m3 increase in PM2.5, the adjusted hazard ratio (HR10) of nonaccidental mortality was 1.22 [95% confidence interval (CI): 1.03, 1.45]. The association with PM2.5 was robust to sensitivity analyses and appeared stronger for cardiovascular-related mortality: ischemic heart (HR10 = 1.43; 95% CI: 1.12, 1.83) and AMI (HR10 = 1.64; 95% CI: 1.13, 2.40). We estimated that 12.4% of nonaccidental deaths (or 497 deaths) could have been averted if the lowest measured concentration in an urban area (4 μg/m3) had been achieved at all locations over the course of the study. Conclusions: Long-term air pollution exposure adversely affects the survival of AMI patients.
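
    The averted-deaths estimate rests on scaling the hazard ratio per 10 μg/m3 to the observed exposure distribution. The snippet below shows only the bare arithmetic of that idea for a single hypothetical exposure contrast; it is not the paper's actual method, which works person by person over the cohort.

      # Published HR per 10 ug/m3 of PM2.5 for nonaccidental mortality.
      hr_per_10 = 1.22
      # Hypothetical mean excess exposure above the 4 ug/m3 reference (illustrative).
      delta = 7.0

      hr_for_delta = hr_per_10 ** (delta / 10.0)          # log-linear scaling of the hazard
      attributable_fraction = 1.0 - 1.0 / hr_for_delta    # share of deaths attributable to the excess
      print(f"HR for a {delta} ug/m3 excess: {hr_for_delta:.3f}")
      print(f"Attributable fraction: {attributable_fraction:.1%}")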

  18. The Epidemiology of Chronic Kidney Disease in Northern Tanzania: A Population-Based Survey

    PubMed Central

    Stanifer, John W.; Maro, Venance; Egger, Joseph; Karia, Francis; Thielman, Nathan; Turner, Elizabeth L.; Shimbi, Dionis; Kilaweh, Humphrey; Matemu, Oliver; Patel, Uptal D.

    2015-01-01

    Background In sub-Saharan Africa, kidney failure has a high morbidity and mortality. Despite this, population-based estimates of prevalence, potential etiologies, and awareness are not available. Methods Between January and June 2014, we conducted a household survey of randomly-selected adults in Northern Tanzania. To estimate prevalence we screened for CKD, which was defined as an estimated glomerular filtration rate ≤ 60 ml/min/1.73m2 and/or persistent albuminuria. We also screened for human immunodeficiency virus (HIV), diabetes, hypertension, obesity, and lifestyle practices including alcohol, tobacco, and traditional medicine use. Awareness was defined as a self-reported disease history and subsequently testing positive. We used population-based age- and gender-weights in estimating prevalence, and we used generalized linear models to explore potential risk factors associated with CKD, including living in an urban environment. Results We enrolled 481 adults from 346 households with a median age of 45 years. The community-based prevalence of CKD was 7.0% (95% CI 3.8-12.3), and awareness was low at 10.5% (4.7-22.0). The urban prevalence of CKD was 15.2% (9.6-23.3) while the rural prevalence was 2.0% (0.5-6.9). Half of the cases of CKD (49.1%) were not associated with any of the measured risk factors of hypertension, diabetes, or HIV. Living in an urban environment had the strongest crude (5.40; 95% CI 2.05-14.2) and adjusted prevalence risk ratio (4.80; 1.70-13.6) for CKD, and the majority (79%) of this increased risk was not explained by demographics, traditional medicine use, socioeconomic status, or co-morbid non-communicable diseases (NCDs). Conclusions We observed a high burden of CKD in Northern Tanzania that was associated with low awareness. Although demographic, lifestyle practices including traditional medicine use, socioeconomic factors, and NCDs accounted for some of the excess CKD risk observed with urban residence, much of the increased urban
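
    The urban-versus-rural contrasts above are reported as prevalence risk ratios from generalized linear models. One common way to obtain prevalence ratios for a binary outcome is a modified Poisson regression (log link with robust standard errors); the sketch below uses statsmodels on simulated data, so the variable names and effect sizes are invented rather than the survey's.

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(2)
      n = 500

      # Simulated survey-like data: CKD indicator, urban residence, and age.
      df = pd.DataFrame({
          "urban": rng.integers(0, 2, n),
          "age": rng.integers(30, 80, n),
      })
      p = 0.02 + 0.08 * df["urban"] + 0.001 * (df["age"] - 30)
      df["ckd"] = rng.binomial(1, p)

      # Poisson regression with robust (HC1) errors approximates a log-binomial
      # model and returns prevalence ratios rather than odds ratios.
      fit = smf.glm("ckd ~ urban + age", data=df,
                    family=sm.families.Poisson()).fit(cov_type="HC1")
      print(np.exp(fit.params))  # exponentiated coefficients = prevalence ratios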

  19. Population-based 3D genome structure analysis reveals driving forces in spatial genome organization

    PubMed Central

    Li, Wenyuan; Kalhor, Reza; Dai, Chao; Hao, Shengli; Gong, Ke; Zhou, Yonggang; Li, Haochen; Zhou, Xianghong Jasmine; Le Gros, Mark A.; Larabell, Carolyn A.; Chen, Lin; Alber, Frank

    2016-01-01

    Conformation capture technologies (e.g., Hi-C) chart physical interactions between chromatin regions on a genome-wide scale. However, the structural variability of the genome between cells poses a great challenge to interpreting ensemble-averaged Hi-C data, particularly for long-range and interchromosomal interactions. Here, we present a probabilistic approach for deconvoluting Hi-C data into a model population of distinct diploid 3D genome structures, which facilitates the detection of chromatin interactions likely to co-occur in individual cells. Our approach incorporates the stochastic nature of chromosome conformations and allows a detailed analysis of alternative chromatin structure states. For example, we predict and experimentally confirm the presence of large centromere clusters with distinct chromosome compositions varying between individual cells. The stability of these clusters varies greatly with their chromosome identities. We show that these chromosome-specific clusters can play a key role in the overall chromosome positioning in the nucleus and stabilizing specific chromatin interactions. By explicitly considering genome structural variability, our population-based method provides an important tool for revealing novel insights into the key factors shaping the spatial genome organization. PMID:26951677

  20. Physical Function and Health-Related Quality-of-Life in a Population-Based Sample

    PubMed Central

    Hall, Susan A.; Chiu, Gretchen R.; Williams, Rachel E.; Clark, Richard V.; Araujo, Andre B.

    2011-01-01

    Background It is of interest to understand whether impaired physical function is associated with health-related quality of life (HRQOL). We examined upper and lower body physical function and its relationship with two domains of HRQOL among men. Methods We conducted a population-based observational study of musculoskeletal health among Boston, MA residents, the Boston Area Community Health/Bone Survey. Participants were 1,219 randomly selected Black, Hispanic, and White males (30–79 years). Upper body function was measured using hand grip strength, while lower body function was measured by combining a timed walk and a chair stand test. HRQOL was measured using the physical (PCS-12) and mental health (MCS-12) component scores of the SF-12. Multivariate linear regression models were used to estimate the association between poor function and HRQOL. Results There was a significant association of poor upper body physical function with the MCS-12 (beta coefficient: −4.12, p=0.003) but not the PCS-12 (beta coefficient: 0.79, p=0.30) compared to those without poor function. Those with poor lower body physical function had significantly lower PCS-12 scores (beta: −2.95, p=0.007), compared to those without poor function, but an association was not observed for MCS-12 scores. Conclusions Domains of physical function were not consistently related to domains of HRQOL. PMID:20670102

  1. Population-based study of facial morphology and excessive daytime somnolence.

    PubMed

    Castillo, Pablo R; Mera, Robertino M; Zambrano, Mauricio; Del Brutto, Oscar H

    2014-11-01

    Studies in patients seeking attention for nasal obstruction or pharyngeal disorders suggest that craniofacial abnormalities correlate with obstructive sleep apnea, but there is little information on the relevance of this association in the population at large. We aimed to determine whether characteristics of facial morphology correlate with excessive daytime somnolence (EDS) in a population-based, door-to-door survey. Residents of a village in rural Ecuador were screened with the Epworth sleepiness scale to assess EDS and underwent physical examination with attention to nasal septum deflection, mandibular retrognathia, and presence of Friedman's palate position type IV. Of 665 participants aged ≥40 years, 155 had EDS, 98 had nasal septum deflection, 47 had mandibular retrognathia and 528 had a Friedman's palate position type IV. In a logistic regression model adjusted for age, sex, body mass index, and nightly sleep hours, persons with nasal septum deflection were twice as likely to have EDS (p=0.009). The other two variables were not associated with EDS. Identification of nasal septum deflection may be a cost-effective method of detecting persons at risk for obstructive sleep apnea in remote areas where sophisticated technology is not readily available. PMID:24986788

  2. A Population-Based Cohort Study on Peripheral Arterial Disease in Patients with Schizophrenia

    PubMed Central

    Hsu, Wen-Yu; Lin, Cheng-Li; Kao, Chia-Hung

    2016-01-01

    Purpose Peripheral arterial disease (PAD) is considered the leading cause of atherosclerotic cardiovascular morbidity. Several risk factors of PAD have been observed in patients with schizophrenia. Therefore, we hypothesize that the incidence of PAD is higher in the schizophrenia population than in the general population. Methods The patients in this population-based cohort study were selected from the Taiwanese National Health Insurance Research Database on the basis of the claims data from 2000 to 2011. We compared the incidence of PAD between schizophrenia and nonschizophrenia cohorts. Cox proportional hazard regression models were employed for analyzing the risk of PAD after adjustment for sex, age, and comorbidities. Results The adjusted hazard ratio (HR) for PAD in the schizophrenia cohort was 1.26-fold higher than that in the nonschizophrenia cohort. Furthermore, patients with schizophrenia using atypical antipsychotics exhibited a high adjusted HR for PAD. Conclusion Compared with the general population, the risk of PAD is higher among patients with schizophrenia. Early diagnosis and intervention can mitigate complications resulting from cardiovascular diseases and lower mortality. PMID:26871697

  3. Seasonal variation of peripheral blood leukocyte telomere length in Costa Rica: a population based observational study

    PubMed Central

    Rehkopf, David H; Dow, William H; Rosero-Bixby, Luis; Lin, Jue; Epel, Elissa S; Blackburn, Elizabeth H

    2014-01-01

    Objectives Peripheral blood leukocyte telomere length is increasingly being used as a biomarker of aging, but its natural variation in human populations is not well understood. Several other biomarkers show seasonal variation, as do several determinants of leukocyte telomere length. We examined whether there was monthly variation in leukocyte telomere length in Costa Rica, a country with strong seasonal differences in precipitation and infection. Methods We examined a longitudinal population based cohort of 581 Costa Rican adults age 60 and above, from which blood samples were drawn between October 2006 and July 2008. Leukocyte telomere length was assayed from these samples using the quantitative PCR method. Multivariate regression models were used to examine correlations between month of blood draw and leukocyte telomere length. Results Telomere length from peripheral blood leukocytes varied by as much as 200 base pairs depending on month of blood draw, and this difference is not likely to be due to random variation. A moderate proportion of this association is statistically accounted for by month and region specific average rainfall. We found shorter telomere length associated with greater rainfall. Conclusions There are two possible explanations of our findings. First, there could be relatively rapid month-to-month changes in leukocyte telomere length. This conclusion would have implications for understanding the natural population dynamics of telomere length. Second, there could be seasonal differences in constituent cell populations. This conclusion would suggest that future studies of leukocyte telomere length use methods to account for the potential impact of constituent cell type. PMID:24615938

  4. The association of psychosocial and familial factors with adolescent suicidal ideation: A population-based study.

    PubMed

    An, Hoyoung; Ahn, Joon-ho; Bhang, Soo-young

    2010-05-30

    We aimed to compare the influence of various parental factors on adolescent suicidal ideation in a population-based sample of 2965 adolescents between 15 and 18 years old, and their parents. Among the subject variables, gender, satisfaction with one's health, having an illness, and satisfaction with family, and among the parental variables, fathers' satisfaction with health, mothers' insufficient sleep, parents' history of suicidal ideation, and satisfaction with family differed significantly between adolescents who reported suicidal ideation and those who reported none. Odds ratios indicated that an increased risk of adolescent suicidal ideation was associated with the subject factors of female gender, insufficient sleep, dissatisfaction with one's health, and dissatisfaction with family, and with maternal insufficient sleep and a positive maternal history of suicidal impulse. A path analysis model (comparative fit index (CFI)=0.907; root mean square error of approximation (RMSEA)=0.047) indicated that psychosocial factors (beta=0.232) had a greater influence on adolescent suicidal ideation than did genetic factors (beta=0.120). These results show that psychosocial factors have an almost two-fold greater influence on adolescent suicidal ideation than genetic factors. Assessment and modification of these factors would greatly assist future interventions. PMID:20381165

  5. Incidence of Hidradenitis Suppurativa and Associated Factors: A Population-Based Study of Olmsted County, Minnesota

    PubMed Central

    Vazquez, Benjamin G.; Alikhan, Ali; Weaver, Amy L.; Wetter, David A.; Davis, Mark D.

    2012-01-01

    There are no population-based incidence studies of hidradenitis suppurativa (HS). Using the medical records linkage system of the Rochester Epidemiology Project, we sought to determine incidence, as well as other associations and characteristics, for HS patients diagnosed in Olmsted County, Minnesota between 1968 and 2008. Incidence was estimated using the decennial census data for the county. Logistic regression models were fit to evaluate associations between patient characteristics and disease severity. A total of 268 incident cases were identified, with an overall annual age- and sex-adjusted incidence of 6.0 per 100,000. Age-adjusted incidence was significantly higher in women than in men [8.2 (95% CI, 7.0–9.3) vs. 3.8 (95% CI, 3.0–4.7)]. The highest incidence was among young women aged 20–29 (18.4 per 100,000). The incidence has risen over the past four decades, particularly among women. Women were more likely to have axillary and upper anterior torso involvement, while men were more likely to have perineal or perianal disease. Additionally, 54.9% (140/255) of patients were obese; 70.2% were current or former smokers; 42.9% carried a diagnosis of depression; 36.2% carried a diagnosis of acne; and 6% had pilonidal disease. Smoking and gender were significantly associated with more severe disease. PMID:22931916
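
    Adjusted figures such as the 6.0 per 100,000 incidence above are typically produced by direct standardization: stratum-specific rates are weighted by a standard population's share of each stratum. The strata and counts below are invented solely to show the calculation, not the study's data.

      # (label, cases, person-years observed, persons in the standard population)
      strata = [
          ("F 20-39", 40, 180_000, 1_000_000),
          ("F 40+",   55, 320_000, 1_400_000),
          ("M 20-39", 10, 170_000, 1_050_000),
          ("M 40+",   25, 310_000, 1_350_000),
      ]

      standard_total = sum(std for _, _, _, std in strata)
      adjusted_rate = sum(
          (cases / person_years) * (std / standard_total)
          for _, cases, person_years, std in strata
      )
      print(f"Adjusted incidence: {adjusted_rate * 100_000:.1f} per 100,000")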

  6. Sleep and academic performance in later adolescence: results from a large population-based study.

    PubMed

    Hysing, Mari; Harvey, Allison G; Linton, Steven J; Askeland, Kristin G; Sivertsen, Børge

    2016-06-01

    The aim of the current study was to assess the association between sleep duration and sleep patterns and academic performance in 16-19 year-old adolescents using registry-based academic grades. A large population-based study from Norway conducted in 2012, the youth@hordaland-survey, surveyed 7798 adolescents aged 16-19 years (53.5% girls). The survey was linked with objective outcome data on school performance. Self-reported sleep measures provided information on sleep duration, sleep efficiency, sleep deficit and bedtime differences between weekday and weekend. School performance [grade point average (GPA)] was obtained from official administrative registries. Most sleep parameters were associated with increased risk for poor school performance. After adjusting for sociodemographic information, short sleep duration and sleep deficit were the sleep measures with the highest odds of poor GPA (lowest quartile). Weekday bedtime was associated significantly with GPA, with adolescents going to bed between 22:00 and 23:00 hours having the best GPA. Also, delayed sleep schedule during weekends was associated with poor academic performance. The associations were somewhat reduced after additional adjustment for non-attendance at school, but remained significant in the fully adjusted models. In conclusion, the demonstrated relationship between sleep problems and poor academic performance suggests that careful assessment of sleep is warranted when adolescents are underperforming at school. Future studies are needed on the association between impaired sleep in adolescence and later functioning in adulthood. PMID:26825591

  7. Firearm and Nonfirearm Homicide in 5 South African Cities: A Retrospective Population-Based Study

    PubMed Central

    Thompson, Mary Lou; Myers, Jonathan E.

    2014-01-01

    Objective. We assessed the effectiveness of South Africa’s Firearm Control Act (FCA), passed in 2000, on firearm homicide rates compared with rates of nonfirearm homicide across 5 South African cities from 2001 to 2005. Methods. We conducted a retrospective population-based study of 37 067 firearm and nonfirearm homicide cases. Generalized linear models helped estimate and compare time trends of firearm and nonfirearm homicides, adjusting for age, sex, race, day of week, city, year of death, and population size. Results. There was a statistically significant decreasing trend regarding firearm homicides from 2001, with an adjusted year-on-year homicide rate ratio of 0.864 (95% confidence interval [CI] = 0.848, 0.880), representing a decrease of 13.6% per annum. The year-on-year decrease in nonfirearm homicide rates was also significant, but considerably lower at 0.976 (95% CI = 0.954, 0.997). Results suggest that 4585 (95% CI = 4427, 4723) lives were saved across 5 cities from 2001 to 2005 because of the FCA. Conclusions. Strength, timing and consistent decline suggest stricter gun control mediated by the FCA accounted for a significant decrease in homicide overall, and firearm homicide in particular, during the study period. PMID:24432917
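
    To see how a year-on-year rate ratio translates into a cumulative change over the study window, the arithmetic below compounds the published firearm-homicide ratio over four years. It is illustrative only; the paper's lives-saved figure also depends on city populations and baseline rates, which are not reproduced here.

      yearly_rr = 0.864          # published year-on-year firearm-homicide rate ratio
      years = 4                  # 2001 through 2005
      cumulative_rr = yearly_rr ** years
      print(f"Cumulative rate ratio over {years} years: {cumulative_rr:.3f} "
            f"(about {1 - cumulative_rr:.1%} lower than in 2001)")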

  8. Normal liver enzymes are correlated with severity of metabolic syndrome in a large population based cohort

    PubMed Central

    Kälsch, Julia; Bechmann, Lars P.; Heider, Dominik; Best, Jan; Manka, Paul; Kälsch, Hagen; Sowa, Jan-Peter; Moebus, Susanne; Slomiany, Uta; Jöckel, Karl-Heinz; Erbel, Raimund; Gerken, Guido; Canbay, Ali

    2015-01-01

    Key features of the metabolic syndrome are insulin resistance and diabetes. The liver, as the central metabolic organ, is not only affected by the metabolic syndrome as non-alcoholic fatty liver disease (NAFLD), but may contribute to insulin resistance and metabolic alterations. We aimed to identify potential associations between liver injury markers and diabetes in the population-based Heinz Nixdorf RECALL Study. Demographic and laboratory data were analyzed in participants (n = 4814, aged 45 to 75 y). ALT and AST values were significantly higher in males than in females. Mean BMI was 27.9 kg/m2 and type 2 diabetes (known and unknown) was present in 656 participants (13.7%). Adiponectin and vitamin D both correlated inversely with BMI. ALT, AST, and GGT correlated with BMI, CRP and HbA1c and inversely correlated with adiponectin levels. Logistic regression models using HbA1c and adiponectin or HbA1c and BMI were able to predict diabetes with high accuracy. Transaminase levels within normal ranges were closely associated with BMI and diabetes risk. Transaminase levels and adiponectin were inversely associated. Re-assessment of current normal range limits should be considered to provide a more exact indicator for chronic metabolic liver injury, in particular to reflect the situation in diabetic or obese individuals. PMID:26269425

  9. Normal liver enzymes are correlated with severity of metabolic syndrome in a large population based cohort.

    PubMed

    Kälsch, Julia; Bechmann, Lars P; Heider, Dominik; Best, Jan; Manka, Paul; Kälsch, Hagen; Sowa, Jan-Peter; Moebus, Susanne; Slomiany, Uta; Jöckel, Karl-Heinz; Erbel, Raimund; Gerken, Guido; Canbay, Ali

    2015-01-01

    Key features of the metabolic syndrome are insulin resistance and diabetes. The liver, as the central metabolic organ, is not only affected by the metabolic syndrome as non-alcoholic fatty liver disease (NAFLD), but may contribute to insulin resistance and metabolic alterations. We aimed to identify potential associations between liver injury markers and diabetes in the population-based Heinz Nixdorf RECALL Study. Demographic and laboratory data were analyzed in participants (n = 4814, aged 45 to 75 y). ALT and AST values were significantly higher in males than in females. Mean BMI was 27.9 kg/m2 and type 2 diabetes (known and unknown) was present in 656 participants (13.7%). Adiponectin and vitamin D both correlated inversely with BMI. ALT, AST, and GGT correlated with BMI, CRP and HbA1c and inversely correlated with adiponectin levels. Logistic regression models using HbA1c and adiponectin or HbA1c and BMI were able to predict diabetes with high accuracy. Transaminase levels within normal ranges were closely associated with BMI and diabetes risk. Transaminase levels and adiponectin were inversely associated. Re-assessment of current normal range limits should be considered to provide a more exact indicator for chronic metabolic liver injury, in particular to reflect the situation in diabetic or obese individuals. PMID:26269425

  10. Epidemiology in a changing world: implications for population-based research on mental disorders.

    PubMed

    Cooper, B

    2014-06-01

    Introduction and objectives. Population-based research on mental disorders needs to keep pace with trends in general epidemiology. At present, this requirement is complicated by uncertainty within the parent discipline about its future development. The present study examines proposals for new directions in strategy and methods and considers their significance for psychiatric epidemiology. Method. Narrative review, cross-checked by search of English-language journals of epidemiology for new trends and developments reported in the years from 2000 onwards. Results. The proposals reviewed here are divided into three groups: 1. A new research paradigm of 'eco-epidemiology', which includes both individual risk factors and macro-environmental systems that mediate population levels of health and sickness. 2. Improved 'translation' of research findings - i.e. more rapid and effective implementation of epidemiological evidence into health policy and practice. 3. Adaptation of epidemiology to a globalised economy, with firmer regulation of funding and resources. Conclusions. Each of these proposals has implications for psychiatric epidemiology. Workers in this field, however, are still preoccupied by relatively specific problems of definition, measurement and classification, and so far the current debates in general epidemiology are scarcely reflected. The proposals outlined above call for: • a working model of eco-epidemiology as it relates to psychiatric disorders; • implementation strategies to encourage more active participation in epidemiological research by community health services and caregiver organisations; • international collaborative projects that offer practical benefits in training and service facilities for the countries taking part. PMID:24345606

  11. Physical Trauma and Amyotrophic Lateral Sclerosis: A Population-Based Study Using Danish National Registries.

    PubMed

    Seals, Ryan M; Hansen, Johnni; Gredal, Ole; Weisskopf, Marc G

    2016-02-15

    Prior studies have suggested that physical trauma might be associated with the development of amyotrophic lateral sclerosis (ALS). We conducted a population-based, individually matched case-control study in Denmark to assess whether hospitalization for trauma is associated with a higher risk of developing ALS. There were 3,650 incident cases of ALS in the Danish National Patient Register from 1982 to 2009. We used risk-set sampling to match each case to 100 age- and sex-matched population controls alive on the date of the case's diagnosis. Odds ratios and 95% confidence intervals were calculated using a conditional logistic regression model. History of trauma diagnosis was also obtained from the Danish Patient Register. When traumas in the 5 years prior to the index date were excluded, there was a borderline association between any trauma and ALS (odds ratio (OR) = 1.09, 95% confidence interval (CI): 0.99, 1.19). A first trauma before age 55 years was associated with ALS (OR = 1.22, 95% CI: 1.08, 1.37), whereas first traumas at older ages were not (OR = 0.97, 95% CI: 0.85, 1.10). Our data suggest that physical trauma at earlier ages is associated with ALS risk. Age at first trauma could help explain discrepancies in results of past studies of trauma and ALS. PMID:26825926

  12. Threshold concepts in finance: conceptualizing the curriculum

    NASA Astrophysics Data System (ADS)

    Hoadley, Susan; Tickle, Leonie; Wood, Leigh N.; Kyng, Tim

    2015-08-01

    Graduates with well-developed capabilities in finance are invaluable to our society and in increasing demand. Universities face the challenge of designing finance programmes to develop these capabilities and the essential knowledge that underpins them. Our research responds to this challenge by identifying threshold concepts that are central to the mastery of finance and by exploring their potential for informing curriculum design and pedagogical practices to improve student outcomes. In this paper, we report the results of an online survey of finance academics at multiple institutions in Australia, Canada, New Zealand, South Africa and the United Kingdom. The outcomes of our research are recommendations for threshold concepts in finance endorsed by quantitative evidence, as well as a model of the finance curriculum incorporating finance, modelling and statistics threshold concepts. In addition, we draw conclusions about the application of threshold concept theory supported by both quantitative and qualitative evidence. Our methodology and findings have general relevance to the application of threshold concept theory as a means to investigate and inform curriculum design and delivery in higher education.

  13. Predictability of threshold exceedances in dynamical systems

    NASA Astrophysics Data System (ADS)

    Bódai, Tamás

    2015-12-01

    In a low-order model of the general circulation of the atmosphere we examine the predictability of threshold exceedance events of certain observables. The likelihood of such binary events, which is also the cornerstone of the categoric (as opposed to probabilistic) prediction of threshold exceedances, is established from long time series of one or more observables of the same system. The prediction skill is measured by a summary index of the ROC curve that relates the hit and false-alarm rates. Our results for the examined systems suggest that exceedances of higher thresholds are more predictable; in other words, rare large-magnitude (i.e., extreme) events are more predictable than frequent typical events. We find this to hold provided that the bin size for binning time series data is optimized, but not necessarily otherwise. This can be viewed as a confirmation of a counterintuitive (and seemingly counterfactual) statement that was previously formulated for simpler autoregressive stochastic processes. However, we argue that for dynamical systems in general it may be typical only, but not universally true. We argue that when there is a sufficient amount of data depending on the precision of observation, the skill of a class of data-driven categoric predictions of threshold exceedances approximates the skill of the analogous model-driven prediction, assuming strictly no model errors. Therefore, stronger extremes in terms of higher threshold levels are more predictable for both data- and model-driven prediction. Furthermore, we show that a quantity commonly regarded as a measure of predictability, the finite-time maximal Lyapunov exponent, does not correspond directly to the ROC-based measure of prediction skill when they are viewed as functions of the prediction lead time and the threshold level. This points to the fact that even if the Lyapunov exponent as an intrinsic property of the system, measuring the instability of trajectories, determines predictability

  14. Epidemic thresholds for bipartite networks

    NASA Astrophysics Data System (ADS)

    Hernández, D. G.; Risau-Gusman, S.

    2013-11-01

    It is well known that sexually transmitted diseases (STD) spread across a network of human sexual contacts. This network is most often bipartite, as most STD are transmitted between men and women. Even though network models in epidemiology have quite a long history now, there are few general results about bipartite networks. One of them is the simple dependence, predicted using the mean field approximation, between the epidemic threshold and the average and variance of the degree distribution of the network. Here we show that going beyond this approximation can lead to qualitatively different results that are supported by numerical simulations. One of the new features, that can be relevant for applications, is the existence of a critical value for the infectivity of each population, below which no epidemics can arise, regardless of the value of the infectivity of the other population.

  15. Absolute Cerebral Blood Flow Infarction Threshold for 3-Hour Ischemia Time Determined with CT Perfusion and 18F-FFMZ-PET Imaging in a Porcine Model of Cerebral Ischemia

    PubMed Central

    Cockburn, Neil; Kovacs, Michael

    2016-01-01

    CT perfusion (CTP)-derived cerebral blood flow (CBF) thresholds have been proposed as the optimal parameter for distinguishing the infarct core prior to reperfusion. Previous threshold-derivation studies have been limited by uncertainties introduced by infarct expansion between the acute phase of stroke and follow-up imaging, or DWI lesion reversibility. In this study a model is proposed for determining infarction CBF thresholds at 3 h ischemia time by comparing contemporaneously acquired CTP-derived CBF maps to 18F-FFMZ-PET imaging, with the objective of deriving a CBF threshold for infarction after 3 hours of ischemia. Endothelin-1 (ET-1) was injected into the brain of Duroc-Cross pigs (n = 11) through a burr hole in the skull. CTP images were acquired 10 and 30 minutes post ET-1 injection and then every 30 minutes for 150 minutes. 370 MBq of 18F-FFMZ was injected ~120 minutes post ET-1 injection and PET images were acquired for 25 minutes starting ~155–180 minutes post ET-1 injection. CBF maps from each CTP acquisition were co-registered and converted into a median CBF map. The median CBF map was co-registered to blood volume maps for vessel exclusion, an average CT image for grey/white matter segmentation, and 18F-FFMZ-PET images for infarct delineation. Logistic regression and ROC analysis were performed on infarcted and non-infarcted pixel CBF values for each animal that developed infarction. Six of the eleven animals developed infarction. The mean CBF value corresponding to the optimal operating point of the ROC curves for the 6 animals was 12.6 ± 2.8 mL·min⁻¹·100 g⁻¹ for infarction after 3 hours of ischemia. The porcine ET-1 model of cerebral ischemia is easier to implement than other large animal models of stroke, and performs similarly as long as CBF is monitored using CTP to prevent reperfusion. PMID:27347877
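
    The per-animal CBF cutoff in this kind of analysis is read off the optimal operating point of an ROC curve built from labelled pixels. A generic version of that step, on simulated pixel values rather than the study's data, can be written with scikit-learn as follows.

      import numpy as np
      from sklearn.metrics import roc_curve

      rng = np.random.default_rng(1)

      # Simulated pixel-level CBF values (mL/min/100 g) for one hypothetical animal,
      # labelled infarcted (1) or viable (0) from a reference (e.g., PET-defined) core.
      cbf_infarct = rng.normal(10.0, 4.0, 500)
      cbf_viable = rng.normal(35.0, 10.0, 2000)
      cbf = np.concatenate([cbf_infarct, cbf_viable])
      label = np.concatenate([np.ones(500), np.zeros(2000)])

      # Lower CBF predicts infarction, so use -CBF as the score for roc_curve.
      fpr, tpr, thresholds = roc_curve(label, -cbf)
      best = np.argmax(tpr - fpr)   # Youden's J picks the optimal operating point
      print(f"Optimal cutoff ~ {-thresholds[best]:.1f} mL/min/100 g "
            f"(sensitivity {tpr[best]:.2f}, specificity {1 - fpr[best]:.2f})")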

  16. Population-Based Prospective Study of Cigarette Smoking and Risk of Incident Essential Tremor

    PubMed Central

    Louis, Elan D.; Benito-León, Julián; Bermejo-Pareja, Félix

    2009-01-01

    BACKGROUND Smoking cigarettes is associated with lower risk of Parkinson’s disease (PD). Despite the clinical links between PD and essential tremor (ET), there are few data on smoking in ET. One study showed an association between smoking and lower ET prevalence. We now study whether baseline smoking is associated with lower risk of incident ET. METHODS Using a population-based, cohort design, baseline cigarette smoking habits were assessed in 3,348 participants in an epidemiological study in Spain, among whom 77 developed incident ET. RESULTS There were 3,348 participants, among whom 397 (11.9%) were smokers at baseline. Five (6.5%) of 77 incident ET cases had been smokers at baseline compared with 392 (12.0%) of 3,271 controls (p = 0.14). Baseline pack-years were lower in incident ET cases than controls (9.2 ± 17.7 vs. 15.7 ± 28.4, p = 0.002). Participants were stratified into baseline pack-year tertiles and few incident ET cases were in the highest tertile (4 [5.2%] cases vs. 431 [13.2%] controls, p = 0.039). In Cox Proportional Hazards Models, highest baseline pack-year tertile was associated with lower risk of incident ET; those in the highest pack-year tertile were one-third as likely to develop ET when compared to non-smokers (RR = 0.37, 95% CI = 0.14–1.03, p = 0.057 [unadjusted model] and RR = 0.29, 95% CI = 0.09–0.90, p = 0.03 [adjusted model]). CONCLUSIONS We demonstrated an association between baseline heavy cigarette smoking and lower risk of incident ET. The biological basis for this association requires future investigation. PMID:18458228

  17. Prevalence of Hidradenitis Suppurativa (HS): A Population-Based Study in Olmsted County, Minnesota

    PubMed Central

    Shahi, Varun; Alikhan, Ali; Vazquez, Benjamin G.; Weaver, Amy L.; Davis, Mark D.

    2014-01-01

    BACKGROUND/AIMS Hidradenitis suppurativa (HS) is a follicular occlusion disorder occurring in apocrine-rich regions of the skin. Estimates of the prevalence of this disorder have not been population-based. We sought to provide population-based information on the prevalence of HS in Olmsted County, Minnesota as of 1/1/2009. METHODS The Rochester Epidemiology Project, a unique infrastructure that combines and makes accessible all medical records in Olmsted County since the 1960s, was used to collect population-based data on the prevalence of HS. RESULTS We identified 178 confirmed cases of HS, comprising 135 females and 43 males, and estimated the total sex- and age-adjusted prevalence in Olmsted County to be 127.8 per 100,000 or 0.13%. The total prevalence was significantly higher among women than men. CONCLUSION This study represents the first population-based investigation of the prevalence of HS. In this population-based cohort, HS was less prevalent than previous reports have suggested. PMID:25228133

  18. Erosion thresholds and land surface morphology

    NASA Astrophysics Data System (ADS)

    Dietrich, William E.; Wilson, Cathy J.; Montgomery, David R.; McKean, James; Bauer, Romy

    1992-08-01

    We propose a graphical technique to analyze the entirety of landforms in a catchment to define quantitatively the spatial variation in the dominance of different erosion processes. High-resolution digital elevation data of a 1.2 km² hilly area where the channel network had been mapped in the field were used in the digital terrain model, TOPOG, to test threshold theories for erosion. The land surface was divided into ~20 m² elements whose shapes were then classified as convergent, planar, or divergent. The entire landscape plotted on a graph of area per unit contour length against surface gradient shows each planform plotting as a separate field. A simple steady-state hydrologic model was used to predict zones of saturation and areas of high pore pressure to mimic the extreme hydrologic events responsible for erosive instability of the land surface. The field observation that saturation overland flow is rare outside convergent zones provided a significant constraint on the hydrologic parameter in the model. This model was used in threshold theories to predict areas of slope instability and areas subject to erosion by saturation overland flow, both of which can contribute to channel initiation. The proportion of convergent elements predicted to exceed the threshold varies greatly with relatively small changes in surface resistance, demonstrating a high sensitivity to land use such as cattle grazing. Overall, the landscape can be divided, using erosion threshold lines, into areas prone to channel instability due to runoff and stable areas where diffusive transport predominates.

  19. Hairpin Vortex Regeneration Threshold

    NASA Astrophysics Data System (ADS)

    Sabatino, Daniel; Maharjan, Rijan

    2015-11-01

    A free-surface water channel is used to study hairpin vortex formation created by fluid injection through a narrow slot into a laminar boundary layer. Particle image velocimetry is used to calculate the circulation of the primary hairpin vortex head, which is found to decrease monotonically in strength with downstream distance. When a secondary hairpin vortex is formed upstream of the primary vortex, the circulation strength of the head is comparable to the strength of the primary head at the time of regeneration. However, the legs of the primary vortex strengthen up to the moment the secondary hairpin is generated. Although the peak circulation in the legs is not directly correlated to the strength of the original elongated ring vortex, when the circulation is scaled with the injection momentum ratio it is linearly related to scaled injection time. It is proposed that the injection momentum ratio and a nondimensionalized injection time based on the wall-normal penetration time can be used to identify threshold conditions which produce a secondary vortex. Supported by the National Science Foundation under Grant CBET-1040236.

  20. Probabilistic Threshold Criterion

    SciTech Connect

    Gresshoff, M; Hrousis, C A

    2010-03-09

    The Probabilistic Shock Threshold Criterion (PSTC) Project at LLNL develops phenomenological criteria for estimating safety or performance margin on high explosive (HE) initiation in the shock initiation regime, creating tools for safety assessment and design of initiation systems and HE trains in general. Until recently, there has been little foundation for probabilistic assessment of HE initiation scenarios. This work attempts to use probabilistic information that is available from both historic and ongoing tests to develop a basis for such assessment. Current PSTC approaches start with the functional form of the James Initiation Criterion as a backbone, and generalize to include varying areas of initiation and provide a probabilistic response based on test data for 1.8 g/cc (Ultrafine) 1,3,5-triamino-2,4,6-trinitrobenzene (TATB) and LX-17 (92.5% TATB, 7.5% Kel-F 800 binder). Application of the PSTC methodology is presented investigating the safety and performance of a flying plate detonator and the margin of an Ultrafine TATB booster initiating LX-17.

  1. Developmental Profiles of Eczema, Wheeze, and Rhinitis: Two Population-Based Birth Cohort Studies

    PubMed Central

    2014-01-01

    Background The term “atopic march” has been used to imply a natural progression of a cascade of symptoms from eczema to asthma and rhinitis through childhood. We hypothesize that this expression does not adequately describe the natural history of eczema, wheeze, and rhinitis during childhood. We propose that this paradigm arose from cross-sectional analyses of longitudinal studies, and may reflect a population pattern that may not predominate at the individual level. Methods and Findings Data from 9,801 children in two population-based birth cohorts were used to determine individual profiles of eczema, wheeze, and rhinitis and whether the manifestations of these symptoms followed an atopic march pattern. Children were assessed at ages 1, 3, 5, 8, and 11 y. We used Bayesian machine learning methods to identify distinct latent classes based on individual profiles of eczema, wheeze, and rhinitis. This approach allowed us to identify groups of children with similar patterns of eczema, wheeze, and rhinitis over time. Using a latent disease profile model, the data were best described by eight latent classes: no disease (51.3%), atopic march (3.1%), persistent eczema and wheeze (2.7%), persistent eczema with later-onset rhinitis (4.7%), persistent wheeze with later-onset rhinitis (5.7%), transient wheeze (7.7%), eczema only (15.3%), and rhinitis only (9.6%). When latent variable modelling was carried out separately for the two cohorts, similar results were obtained. Highly concordant patterns of sensitisation were associated with different profiles of eczema, rhinitis, and wheeze. The main limitation of this study was the difference in wording of the questions used to ascertain the presence of eczema, wheeze, and rhinitis in the two cohorts. Conclusions The developmental profiles of eczema, wheeze, and rhinitis are heterogeneous; only a small proportion of children (∼7% of those with symptoms) follow trajectory profiles resembling the atopic march. Please see later

  2. A Population-based Study of Age Inequalities in Access to Palliative Care Among Cancer Patients

    PubMed Central

    Burge, Frederick I.; Lawson, Beverley J.; Johnston, Grace M.; Grunfeld, Eva

    2013-01-01

    Background Inequalities in access to palliative care programs (PCP) by age have been shown to exist in Canada and elsewhere. Few studies have been able to provide greater insight by simultaneously adjusting for multiple demographic, health service, and socio-cultural indicators. Objective To re-examine the relationship between age and registration to specialized community-based PCP programs among cancer patients and identify the multiple indicators contributing to these inequalities. Methods This retrospective, population-based study was a secondary data analysis of linked individual level information extracted from 6 administrative health databases and contextual (neighborhood level) data from provincial and census information. Subjects included all adults who died due to cancer between 1998 and 2003 living within 2 District Health Authorities in the province of Nova Scotia, Canada. The relationship between registration in a PCP and age was examined using hierarchical nonlinear regression modeling techniques. Identification of potential patient and ecologic contributing indicators was guided by Andersen’s conceptual model of health service utilization. Results Overall, 66% of 7511 subjects were registered with a PCP. Older subjects were significantly less likely than those <65 years of age to be registered with a PCP, in particular those aged 85 years and older (adjusted odds ratio: 0.4; 95% confidence interval: 0.3–0.5). Distance to the closest cancer center had a major impact on registration. Conclusions Age continues to be a significant predictor of PCP registration in Nova Scotia even after controlling for the confounding effects of many new demographic, health service, and ecologic indicators. PMID:19300309

  3. Air pollution and newly diagnostic autism spectrum disorders: a population-based cohort study in Taiwan.

    PubMed

    Jung, Chau-Ren; Lin, Yu-Ting; Hwang, Bing-Fang

    2013-01-01

    There is limited evidence that long-term exposure to ambient air pollution increases the risk of childhood autism spectrum disorder (ASD). The objective of the study was to investigate the associations between long-term exposure to air pollution and newly diagnosed ASD in Taiwan. We conducted a population-based cohort study of 49,073 children aged less than 3 years in 2000, retrieved from the Taiwan National Health Insurance Research Database and followed up from 2000 through 2010. The inverse distance weighting method was used to form exposure parameters for ozone (O3), carbon monoxide (CO), nitrogen dioxide (NO2), sulfur dioxide (SO2), and particles with aerodynamic diameter less than 10 µm (PM10). A time-dependent Cox proportional hazards (PH) model was used to evaluate the relationship between the yearly average exposure to air pollutants in preceding years and newly diagnosed ASD. The risk of newly diagnosed ASD increased with increasing O3, CO, NO2, and SO2 levels. The effect estimates, indicating approximately a 59% risk increase per 10 ppb increase in O3 (95% CI 1.42-1.79), a 37% risk increase per 10 ppb of CO (95% CI 1.31-1.44), a 340% risk increase per 10 ppb increase in NO2 (95% CI 3.31-5.85), and a 17% risk increase per 1 ppb of SO2 (95% CI 1.09-1.27), were stable across different combinations of air pollutants in the multi-pollutant models. Our results provide evidence that children's exposure to O3, CO, NO2, and SO2 in the preceding 1 to 4 years may increase the risk of an ASD diagnosis. PMID:24086549
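
    Inverse distance weighting, as used here to assign monitor-based pollutant concentrations to each child's location, is straightforward to sketch. The example below is a minimal illustration with hypothetical monitor coordinates and O3 values; the power parameter, coordinate system, and station data are assumptions, not taken from the study.

```python
import numpy as np

def idw_exposure(monitor_xy, monitor_values, target_xy, power=2.0):
    """Inverse-distance-weighted estimate of a pollutant concentration at
    target_xy from surrounding monitoring stations.

    monitor_xy     : (n, 2) array of station coordinates
    monitor_values : (n,) array of yearly mean concentrations
    target_xy      : (2,) coordinates of the child's residence or township
    power          : distance-decay exponent (2 is a common default)
    """
    d = np.linalg.norm(monitor_xy - target_xy, axis=1)
    if np.any(d == 0):                       # target coincides with a station
        return float(monitor_values[d == 0].mean())
    w = 1.0 / d**power
    return float(np.sum(w * monitor_values) / np.sum(w))

# hypothetical example: three O3 monitors (ppb) around one township centroid
stations = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 8.0]])
o3_ppb = np.array([28.0, 35.0, 31.0])
print(idw_exposure(stations, o3_ppb, np.array([3.0, 2.0])))
```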

  4. Radiotherapy and Survival in Prostate Cancer Patients: A Population-Based Study

    SciTech Connect

    Zhou, Esther H.; Ellis, Rodney J.; Cherullo, Edward; Colussi, Valdir; Xu, Fang; Chen, Weidong; Gupta, Sanjay; Whalen, Christopher C.; Bodner, Donald; Resnick, Martin I.; Rimm, Alfred A.

    2009-01-01

    Purpose: To investigate the association of overall and disease-specific survival with the five standard treatment modalities for prostate cancer (CaP): radical prostatectomy (RP), brachytherapy (BT), external beam radiotherapy, androgen deprivation therapy, and no treatment (NT) within 6 months after CaP diagnosis. Methods and Materials: The study population included 10,179 men aged 65 years and older with incident CaP diagnosed between 1999 and 2001. Using the linked Ohio Cancer Incidence Surveillance System, Medicare, and death certificate files, overall and disease-specific survival through 2005 among the five clinically accepted therapies were analyzed. Results: Disease-specific survival rates were 92.3% and 23.9% for patients with localized vs. distant disease at 7 years, respectively. Controlling for age, race, comorbidities, stage, and Gleason score, results from the Cox multiple regression models indicated that the risk of CaP-specific death was significantly reduced in patients receiving RP or BT, compared with NT. For localized disease, compared with NT, in the monotherapy cohort, RP and BT were associated with reduced hazard ratios (HR) of 0.25 and 0.45 (95% confidence intervals 0.13-0.48 and 0.23-0.87, respectively), whereas in the combination therapy cohort, HR were 0.40 (0.17-0.94) and 0.46 (0.27-0.80), respectively. Conclusions: The present population-based study indicates that RP and BT are associated with improved survival outcomes. Further studies are warranted to improve clinical determinates in the selection of appropriate management of CaP and to improve predictive modeling for which patient subsets may benefit most from definitive therapy vs. conservative management and/or observation.

  5. Direct costs in impaired glucose regulation: results from the population-based Heinz Nixdorf Recall study

    PubMed Central

    Bächle, C; Claessen, H; Andrich, S; Brüne, M; Dintsios, C M; Slomiany, U; Roggenbuck, U; Jöckel, K H; Moebus, S; Icks, A

    2016-01-01

    Objective For the first time, this population-based study sought to analyze healthcare utilization and associated costs in people with normal fasting glycemia (NFG), impaired fasting glycemia (IFG), as well as previously undetected diabetes and previously diagnosed diabetes linking data from the prospective German Heinz Nixdorf Recall (HNR) study with individual claims data from German statutory health insurances. Research design and methods A total of 1709 participants of the HNR 5-year follow-up (mean age (SD) 64.9 (7.5) years, 44.5% men) were included in the study. Age-standardized and sex-standardized healthcare utilization and associated costs (reported as € for the year 2008, perspective of the statutory health insurance) were stratified by diabetes stage defined by the participants' self-report and fasting plasma glucose values. Cost ratios (CRs) were estimated using two-part regression models, adjusting for age, sex, sociodemographic variables and comorbidity. Results The mean total direct healthcare costs for previously diagnosed diabetes, previously undetected diabetes, IFG, and NFG were €2761 (95% CI 2378 to 3268), €2210 (1483 to 4279), €2035 (1732 to 2486) and €1810 (1634 to 2035), respectively. Corresponding age-adjusted and sex-adjusted CRs were 1.53 (1.30 to 1.80), 1.16 (0.91 to 1.47), and 1.09 (0.95 to 1.25) (reference: NFG). Inpatient, outpatient and medication costs varied in order between people with IFG and those with previously undetected diabetes. Conclusions The study provides claims-based detailed cost data in well-defined glucose metabolism subgroups. CRs of individuals with IFG and previously undetected diabetes were surprisingly low. Data are important for the model-based evaluation of screening programs and interventions that are aimed either to prevent diabetes onset or to improve diabetes therapy as well. PMID:27252871
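
    Two-part regression models of this kind handle the many zero-cost observations by combining a model for any healthcare use with a model for costs among users. The sketch below, on simulated data, uses a logit first part and a simple log-linear second part whose exponentiated coefficients approximate cost ratios; the Gamma-GLM specification often used in such studies, the group labels, and the covariates are assumptions for illustration only.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 1500
df = pd.DataFrame({
    "group": rng.choice(["NFG", "IFG", "undetected", "diagnosed"], n),
    "age":   rng.normal(65, 7, n),
    "male":  rng.integers(0, 2, n),
})
# synthetic annual costs: higher on average for "diagnosed", many zeros
lin = 0.4 * (df["group"] == "diagnosed") + 0.02 * (df["age"] - 65)
df["cost"] = np.where(rng.random(n) < 0.85,
                      rng.gamma(2.0, 900 * np.exp(lin)), 0.0)
df["any_cost"] = (df["cost"] > 0).astype(int)

# part 1: probability of incurring any cost
part1 = smf.logit("any_cost ~ C(group, Treatment('NFG')) + age + male", df).fit(disp=0)
# part 2: log-linear model for costs among users; exp(coef) approximates a cost ratio
part2 = smf.ols("np.log(cost) ~ C(group, Treatment('NFG')) + age + male",
                df[df["cost"] > 0]).fit()
print(np.exp(part2.params))   # approximate cost ratios versus the NFG reference group
```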

  6. Population-Based Age Group Specific Annual Incidence Rates of Symptomatic Age-Related Macular Degeneration

    PubMed Central

    Saari, Jukka M

    2014-01-01

    Purpose To study the population-based annual incidence rates of exudative, dry and all cases of symptomatic age-related macular degeneration (AMD) in different age and sex groups. Methods. This is a one year, prospective, population-based study on all consecutive new patients with AMD in the hospital district of Central Finland. The diagnosis was confirmed in all patients with slit lamp biomicroscopy, optical coherence tomography (OCT) using a Spectralis HRA + OCT device, and the Heidelberg Eye Explorer 1.6.2.0 program. Fluorescein angiograms were taken when needed. Results. The population-based annual incidence rates of all cases of symptomatic AMD increased from 0.03% (95% CI, 0.01-0.05%) in the age group 50-59 years to 0.82% (95% CI, 0.55-1.09%) in the age group 85-89 years and were 0.2% (95% CI, 0.17-0.24%) in exudative, 0.11% (95% CI, 0.09-0.14%) in dry, and 0.32% (95% CI, 0.28-0.36%) in all cases of AMD in the age group 60 years and older. During the next 20 years in Central Finland the population-based annual incidence rates can be estimated to increase to 0.27% (95% CI, 0.24-0.30%) in exudative, to 0.13% (95% CI, 0.11-0.15%) in dry, and to 0.41% (95% CI, 0.37-0.45%) in all cases of AMD in the age group 60 years and older. The population-based annual incidence of AMD did not show statistically significant differences between males and females (p>0.1). Conclusion: The population-based age-group specific annual incidence rates of symptomatic AMD of this study may help to plan health care provision for patients of AMD. PMID:25674187

  7. Marital status and risk of dementia: a nationwide population-based prospective study from Sweden

    PubMed Central

    Sundström, Anna; Westerlund, Olle; Kotyrlo, Elena

    2016-01-01

    Objectives To examine the association between marital status and dementia in a cohort of young-old (50–64) and middle-old (65–74) adults, and also whether this may differ by gender. Design Prospective population-based study with follow-up time of up to 10 years. Setting Swedish national register-based study. Participants 2 288 489 individuals, aged 50–74 years, without prior dementia diagnosis at baseline. Dementia was identified using the Swedish National Patient Register and the Cause of Death Register. Outcome measures The influence of marital status on dementia was analysed using Cox proportional hazards models, adjusted stepwise for multiple covariates (model 1: adjusted for age and gender; and model 2: additionally adjusted for having adult children, education, income and prior cardiovascular disease). Results During follow-up, 31 572 individuals in the study were identified as demented. Cox regression showed each non-married subcategory to be associated with a significantly higher risk of dementia than the married group, with the highest risk observed among people in the young-old age group, especially among those who were divorced or single (HRs 1.79 vs 1.71, fully adjusted model). Analyses stratified by gender showed gender differences in the young-old group, with indications of divorced men having a higher relative risk compared with divorced women (HRs 2.1 vs 1.7, only-age adjusted model). However, in the fully adjusted model, these differences were attenuated and there was no longer any significant difference between male and female participants. Conclusions Our results suggest that those living alone as non-marrieds may be at risk for early-onset and late-onset dementia. Although more research is needed to understand the underlying mechanism by which marital status is associated with dementia, this suggests that social relationships should be taken seriously as a risk factor for dementia and that social-based interventions may provide

  8. Usefulness of data from magnetic resonance imaging to improve prediction of dementia: population based cohort study

    PubMed Central

    Stephan, Blossom C M; Tzourio, Christophe; Auriacombe, Sophie; Amieva, Hélène; Dufouil, Carole; Alpérovitch, Annick

    2015-01-01

    Objective To determine whether the addition of data derived from magnetic resonance imaging (MRI) of the brain to a model incorporating conventional risk variables improves prediction of dementia over 10 years of follow-up. Design Population based cohort study of individuals aged ≥65. Setting The Dijon magnetic resonance imaging study cohort from the Three-City Study, France. Participants 1721 people without dementia who underwent an MRI scan at baseline and with known dementia status over 10 years’ follow-up. Main outcome measure Incident dementia (all cause and Alzheimer’s disease). Results During 10 years of follow-up, there were 119 confirmed cases of dementia, 84 of which were Alzheimer’s disease. The conventional risk model incorporated age, sex, education, cognition, physical function, lifestyle (smoking, alcohol use), health (cardiovascular disease, diabetes, systolic blood pressure), and the apolipoprotein genotype (C statistic for discrimination performance was 0.77, 95% confidence interval 0.71 to 0.82). No significant differences were observed in the discrimination performance of the conventional risk model compared with models incorporating data from MRI including white matter lesion volume (C statistic 0.77, 95% confidence interval 0.72 to 0.82; P=0.48 for difference of C statistics), brain volume (0.77, 0.72 to 0.82; P=0.60), hippocampal volume (0.79, 0.74 to 0.84; P=0.07), or all three variables combined (0.79, 0.75 to 0.84; P=0.05). Inclusion of hippocampal volume or all three MRI variables combined in the conventional model did, however, lead to significant improvement in reclassification measured by using the integrated discrimination improvement index (P=0.03 and P=0.04) and showed increased net benefit in decision curve analysis. Similar results were observed when the outcome was restricted to Alzheimer’s disease. Conclusions Data from MRI do not significantly improve discrimination performance in prediction of all cause dementia
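
    The discrimination comparison reported here rests on the C statistic, which for a binary outcome equals the area under the ROC curve of the predicted risks. A minimal sketch on simulated data, comparing a conventional model with one that adds a hypothetical hippocampal-volume predictor, is shown below; the variables and coefficients are invented, and the reclassification (IDI) and decision-curve steps are omitted.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(5)
n = 1721
age = rng.normal(72, 5, n)
hippo = rng.normal(3.0, 0.4, n)      # hypothetical hippocampal volume
risk = 1 / (1 + np.exp(-(-7.0 + 0.1 * age - 0.8 * hippo)))
dementia = rng.binomial(1, risk)     # simulated 10-year dementia outcome

conventional = LogisticRegression().fit(age.reshape(-1, 1), dementia)
with_mri = LogisticRegression().fit(np.column_stack([age, hippo]), dementia)

# C statistic = area under the ROC curve of the predicted risks
print(roc_auc_score(dementia, conventional.predict_proba(age.reshape(-1, 1))[:, 1]))
print(roc_auc_score(dementia, with_mri.predict_proba(np.column_stack([age, hippo]))[:, 1]))
```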

  9. Excitable neurons, firing threshold manifolds and canards.

    PubMed

    Mitry, John; McCarthy, Michelle; Kopell, Nancy; Wechselberger, Martin

    2013-01-01

    We investigate firing threshold manifolds in a mathematical model of an excitable neuron. The model analyzed investigates the phenomenon of post-inhibitory rebound spiking due to propofol anesthesia and is adapted from McCarthy et al. (SIAM J. Appl. Dyn. Syst. 11(4):1674-1697, 2012). Propofol modulates the decay time-scale of an inhibitory GABAa synaptic current. Interestingly, this system gives rise to rebound spiking within a specific range of propofol doses. Using techniques from geometric singular perturbation theory, we identify geometric structures, known as canards of folded saddle-type, which form the firing threshold manifolds. We find that the position and orientation of the canard separatrix is propofol dependent. Thus, the speeds of relevant slow synaptic processes are encoded within this geometric structure. We show that this behavior cannot be understood using a static, inhibitory current step protocol, which can provide a single threshold for rebound spiking but cannot explain the observed cessation of spiking for higher propofol doses. We then compare the analyses of dynamic and static synaptic inhibition, showing how the firing threshold manifolds of each relate, and why a current step approach is unable to fully capture the behavior of this model. PMID:23945278

  10. Outcome-Driven Thresholds for Home Blood Pressure Measurement

    PubMed Central

    Niiranen, Teemu J.; Asayama, Kei; Thijs, Lutgarde; Johansson, Jouni K.; Ohkubo, Takayoshi; Kikuya, Masahiro; Boggia, José; Hozawa, Atsushi; Sandoya, Edgardo; Stergiou, George S.; Tsuji, Ichiro; Jula, Antti M.; Imai, Yutaka; Staessen, Jan A.

    2013-01-01

    The lack of outcome-driven operational thresholds limits the clinical application of home blood pressure (BP) measurement. Our objective was to determine an outcome-driven reference frame for home BP measurement. We measured home and clinic BP in 6470 participants (mean age, 59.3 years; 56.9% women; 22.4% on antihypertensive treatment) recruited in Ohasama, Japan (n=2520); Montevideo, Uruguay (n=399); Tsurugaya, Japan (n=811); Didima, Greece (n=665); and nationwide in Finland (n=2075). In multivariable-adjusted analyses of individual subject data, we determined home BP thresholds, which yielded 10-year cardiovascular risks similar to those associated with stages 1 (120/80 mm Hg) and 2 (130/85 mm Hg) prehypertension, and stages 1 (140/90 mm Hg) and 2 (160/100 mm Hg) hypertension on clinic measurement. During 8.3 years of follow-up (median), 716 cardiovascular end points, 294 cardiovascular deaths, 393 strokes, and 336 cardiac events occurred in the whole cohort; in untreated participants these numbers were 414, 158, 225, and 194, respectively. In the whole cohort, outcome-driven systolic/diastolic thresholds for the home BP corresponding with stages 1 and 2 prehypertension and stages 1 and 2 hypertension were 121.4/77.7, 127.4/79.9, 133.4/82.2, and 145.4/86.8 mm Hg; in 5018 untreated participants, these thresholds were 118.5/76.9, 125.2/79.7, 131.9/82.4, and 145.3/87.9 mm Hg, respectively. Rounded thresholds for stages 1 and 2 prehypertension and stages 1 and 2 hypertension amounted to 120/75, 125/80, 130/85, and 145/90 mm Hg, respectively. Population-based outcome-driven thresholds for home BP are slightly lower than those currently proposed in hypertension guidelines. Our current findings could inform guidelines and help clinicians in diagnosing and managing patients. PMID:23129700
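
    The outcome-driven thresholds described here come from equating predicted risks across measurement methods: the home BP value whose model-based 10-year risk matches the risk at a given clinic threshold is taken as the corresponding home threshold. The sketch below shows that risk-equivalence step for two hypothetical Cox models, each summarised by a baseline 10-year survival and a log hazard ratio per mm Hg; all numbers are illustrative assumptions, not the study's estimates.

```python
import numpy as np

def equivalent_home_threshold(clinic_thr, beta_clinic, mean_clinic, s0_clinic,
                              beta_home, mean_home, s0_home):
    """Home systolic BP giving the same 10-year risk as clinic_thr, assuming
    Cox models with baseline 10-year survival s0 and log hazard ratio beta
    per mm Hg, centred at the cohort mean."""
    lp_clinic = beta_clinic * (clinic_thr - mean_clinic)        # clinic linear predictor
    # equate s0_home ** exp(lp_home) with s0_clinic ** exp(lp_clinic) and solve
    lp_home = lp_clinic + np.log(np.log(s0_clinic) / np.log(s0_home))
    return mean_home + lp_home / beta_home

# hypothetical coefficients, for illustration only
print(equivalent_home_threshold(140.0, 0.020, 135.0, 0.93,
                                0.025, 130.0, 0.94))
```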

  11. Life below the threshold.

    PubMed

    Castro, C

    1991-01-01

    This article explains that malnutrition, poor health, and limited educational opportunities plague Philippine children -- especially female children -- from families living below the poverty threshold. Nearly 70% of households in the Philippines do not meet the required daily level of nutritional intake. Because it is often -- and incorrectly -- assumed that women's nutritional requirements are lower than men's, women suffer higher rates of malnutrition and poor health. A 1987 study revealed that 11.7% of all elementary students were underweight and 13.9% had stunted growth. Among elementary-school girls, 17% were malnourished and 40% suffered from anemia (among lactating mothers, more than 1/2 are anemic). A 1988 Program for Decentralized Educational Development study showed that grade VI students learn only about 1/2 of what they are supposed to learn. 30% of the children enrolled in grade school drop out before they reach their senior year. The Department of Education, Culture and Sports estimates that some 2.56 million students dropped out of school in 1989. That same year, some 3.7 million children were counted as part of the labor force. In Manila alone, some 60,000 children work the streets, whether doing odd jobs or begging, or turning to crime or prostitution. The article tells the story of a 12-year-old girl named Ging, a 4th grader at a public school and the oldest child in a poor family of 6 children. The undernourished Ging dreams of a good future for her family and sees education as a way out of poverty; unfortunately, her time after school is spent working in the streets or looking after her family. She considers herself luckier than many of the other children working in the streets, since she at least has a family. PMID:12285009

  12. Method for calculating multiphoton above-threshold processes in atoms: Two-photon above-threshold ionization

    SciTech Connect

    Manakov, N. L.; Marmo, S. I.; Sviridov, S. A.

    2009-04-15

    The two-photon above-threshold ionization of atoms is calculated using numerical algorithms of the Pade approximation in the model-potential method with the Coulomb asymptotics. The total and differential cross sections of the above-threshold ionization of helium and alkali metal atoms by elliptically polarized radiation are presented. The dependence of the angular distribution of photoelectrons on the sign of the ellipticity of radiation (the elliptic dichroism phenomenon) is analyzed in the above-threshold frequency range.

  13. Threshold Hypothesis: Fact or Artifact?

    ERIC Educational Resources Information Center

    Karwowski, Maciej; Gralewski, Jacek

    2013-01-01

    The threshold hypothesis (TH) assumes the existence of complex relations between creative abilities and intelligence: linear associations below 120 points of IQ and weaker or lack of associations above the threshold. However, diverse results have been obtained over the last six decades--some confirmed the hypothesis and some rejected it. In this…

  14. The Nature of Psychological Thresholds

    ERIC Educational Resources Information Center

    Rouder, Jeffrey N.; Morey, Richard D.

    2009-01-01

    Following G. T. Fechner (1966), thresholds have been conceptualized as the amount of intensity needed to transition between mental states, such as between a states of unconsciousness and consciousness. With the advent of the theory of signal detection, however, discrete-state theory and the corresponding notion of threshold have been discounted.…

  15. Threshold Concepts and Information Literacy

    ERIC Educational Resources Information Center

    Townsend, Lori; Brunetti, Korey; Hofer, Amy R.

    2011-01-01

    What do we teach when we teach information literacy in higher education? This paper describes a pedagogical approach to information literacy that helps instructors focus content around transformative learning thresholds. The threshold concept framework holds promise for librarians because it grounds the instructor in the big ideas and underlying…

  16. Thresholds for Cenozoic bipolar glaciation.

    PubMed

    Deconto, Robert M; Pollard, David; Wilson, Paul A; Pälike, Heiko; Lear, Caroline H; Pagani, Mark

    2008-10-01

    The long-standing view of Earth's Cenozoic glacial history calls for the first continental-scale glaciation of Antarctica in the earliest Oligocene epoch (approximately 33.6 million years ago), followed by the onset of northern-hemispheric glacial cycles in the late Pliocene epoch, about 31 million years later. The pivotal early Oligocene event is characterized by a rapid shift of 1.5 parts per thousand in deep-sea benthic oxygen-isotope values (Oi-1) within a few hundred thousand years, reflecting a combination of terrestrial ice growth and deep-sea cooling. The apparent absence of contemporaneous cooling in deep-sea Mg/Ca records, however, has been argued to reflect the growth of more ice than can be accommodated on Antarctica; this, combined with new evidence of continental cooling and ice-rafted debris in the Northern Hemisphere during this period, raises the possibility that Oi-1 represents a precursory bipolar glaciation. Here we test this hypothesis using an isotope-capable global climate/ice-sheet model that accommodates both the long-term decline of Cenozoic atmospheric CO2 levels and the effects of orbital forcing. We show that the CO2 threshold below which glaciation occurs in the Northern Hemisphere (approximately 280 p.p.m.v.) is much lower than that for Antarctica (approximately 750 p.p.m.v.). Therefore, the growth of ice sheets in the Northern Hemisphere immediately following Antarctic glaciation would have required rapid CO2 drawdown within the Oi-1 timeframe, to levels lower than those estimated by geochemical proxies and carbon-cycle models. Instead of bipolar glaciation, we find that Oi-1 is best explained by Antarctic glaciation alone, combined with deep-sea cooling of up to 4 degrees C and Antarctic ice that is less isotopically depleted (-30 to -35 per thousand) than previously suggested. Proxy CO2 estimates remain above our model's northern-hemispheric glaciation threshold of approximately 280 p.p.m.v. until approximately 25 Myr

  17. Threshold selection for regional peaks-over-threshold data

    NASA Astrophysics Data System (ADS)

    Roth, Martin; Jongbloed, Geurt; Adri Buishand, T.

    2016-04-01

    A hurdle in the peaks-over-threshold approach for analyzing extreme values is the selection of the threshold. A method is developed to reduce this obstacle in the presence of multiple, similar data samples. This is for instance the case in many environmental applications. The idea is to combine threshold selection methods into a regional method. Regionalized versions of the threshold stability and the mean excess plot are presented as graphical tools for threshold selection. Moreover, quantitative approaches based on the bootstrap distribution of the spatially averaged Kolmogorov-Smirnov and Anderson-Darling test statistics are introduced. It is demonstrated that the proposed regional method leads to an increased sensitivity for too low thresholds, compared to methods that do not take into account the regional information. The approach can be used for a wide range of univariate threshold selection methods. We test the methods using simulated data and present an application to rainfall data from the Dutch water board Vallei en Veluwe.
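
    The quantitative part of the proposed method judges candidate thresholds by a spatially averaged goodness-of-fit statistic for the generalized Pareto distribution fitted to the excesses at each site. The sketch below computes such a regionally averaged Kolmogorov-Smirnov distance on synthetic data; the bootstrap calibration used in the paper is omitted, and the data, threshold grid, and minimum-exceedance rule are assumptions.

```python
import numpy as np
from scipy import stats

def regional_ks(samples, threshold):
    """Spatially averaged Kolmogorov-Smirnov distance between the empirical
    distribution of threshold excesses at each site and a fitted generalized
    Pareto distribution. `samples` is a list of 1-D arrays, one per site."""
    ks_values = []
    for x in samples:
        excess = x[x > threshold] - threshold
        if excess.size < 30:                  # too few exceedances to judge the fit
            return np.nan
        shape, _, scale = stats.genpareto.fit(excess, floc=0.0)
        ks = stats.kstest(excess,
                          stats.genpareto(shape, loc=0.0, scale=scale).cdf).statistic
        ks_values.append(ks)
    return float(np.mean(ks_values))

# synthetic daily-rainfall-like data at 5 sites; low thresholds should fit worse
rng = np.random.default_rng(0)
sites = [rng.gamma(0.8, 6.0, size=5000) for _ in range(5)]
for u in (2.0, 5.0, 10.0, 15.0, 20.0):
    print(u, regional_ks(sites, u))
```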

  18. The Risk of Chronic Pancreatitis in Patients with Psoriasis: A Population-Based Cohort Study

    PubMed Central

    Chiang, Yi-Ting; Huang, Weng-Foung; Tsai, Tsen-Fang

    2016-01-01

    Background Psoriasis is a chronic systemic inflammatory disorder, and studies have revealed its association with a variety of comorbidities. However, the risk of chronic pancreatitis (CP) in psoriasis has not been studied. This study aimed to investigate the risk of CP among patients with psoriasis. Methods Using the Taiwan National Health Insurance Research Database, this population-based cohort study enrolled 48430 patients with psoriasis and 193720 subjects without psoriasis. Stratified Cox proportional hazards models were used to compare the risks of CP between the patients with and without psoriasis. Results The incidence of CP was 0.61 per 1000 person-years in patients with psoriasis and 0.34 per 1000 person-years in controls during a mean 6.6-year follow-up period. Before adjustment, patients with psoriasis had a significantly higher risk of CP (crude hazard ratio (HR) = 1.81; 95% confidence interval (CI) = 1.53–2.15), and the risk remained significantly higher after adjustments for gender, age group, medications, and comorbidities (adjusted HR (aHR) = 1.76; 95% CI = 1.47–2.10). All psoriasis patient subgroups other than those with arthritis, including those with mild and severe psoriasis and those without arthritis, had significantly increased aHRs for CP, and the risk increased with increasing psoriasis severity. Psoriasis patients taking nonsteroidal anti-inflammatory drugs (aHR = 0.33; 95% CI = 0.22–0.49) and methotrexate (aHR = 0.28; 95% CI = 0.12–0.64) had a lower risk of developing CP after adjustments. Conclusions Psoriasis is associated with a significantly increased risk of CP. The results of our study call for more research to provide additional insight into the relationship between psoriasis and CP. PMID:27467265

  19. Early Cognitive Deficits in Type 2 Diabetes: A Population-Based Study

    PubMed Central

    Marseglia, Anna; Fratiglioni, Laura; Laukka, Erika J.; Santoni, Giola; Pedersen, Nancy L.; Bäckman, Lars; Xu, Weili

    2016-01-01

    Evidence links type 2 diabetes to dementia risk. However, our knowledge on the initial cognitive deficits in diabetic individuals and the factors that might promote such deficits is still limited. This study aimed to identify the cognitive domains initially impaired by diabetes and the factors that play a role in this first stage. Within the population-based Swedish National Study on Aging and Care–Kungsholmen, 2305 cognitively intact participants aged ≥60 y were identified. Attention/working memory, perceptual speed, category fluency, letter fluency, semantic memory, and episodic memory were assessed. Diabetes (controlled and uncontrolled) and prediabetes were ascertained by clinicians, who also collected information on vascular disorders (hypertension, heart diseases, and stroke) and vascular risk factors (VRFs, including smoking and overweight/obesity). Data were analyzed with linear regression models. Overall, 196 participants (8.5%) had diabetes, of which 144 (73.5%) had elevated glycaemia (uncontrolled diabetes); 571 (24.8%) persons had prediabetes. In addition, diabetes, mainly uncontrolled, was related to lower performance in perceptual speed (β = −1.10 [95% CI −1.98, −0.23]), category fluency (β = −1.27 [95% CI −2.52, −0.03]), and digit span forward (β = −0.35 [95% CI −0.54, −0.17]). Critically, these associations were present only among APOE ɛ4 non-carriers. The associations of diabetes with perceptual speed and category fluency were present only among participants with VRFs or vascular disorders. Diabetes, especially uncontrolled diabetes, is associated with poorer performance in perceptual speed, category fluency, and attention/primary memory. VRFs, vascular disorders, and APOE status play a role in these associations. PMID:27314527

  20. Increased Risk of Osteoporosis in Patients With Peptic Ulcer Disease: A Nationwide Population-Based Study.

    PubMed

    Wu, Chieh-Hsin; Tung, Yi-Ching; Chai, Chee-Yin; Lu, Ying-Yi; Su, Yu-Feng; Tsai, Tai-Hsin; Kuo, Keng-Liang; Lin, Chih-Lung

    2016-04-01

    To investigate osteoporosis risk in patients with peptic ulcer disease (PUD) using a nationwide population-based dataset. This Taiwan National Health Insurance Research Database (NHIRD) analysis included 27,132 patients aged 18 years and older who had been diagnosed with PUD (International Classification of Diseases, Ninth Revision, Clinical Modification [ICD-9-CM] codes 531-534) during 1996 to 2010. The control group consisted of 27,132 randomly selected age- and gender-matched patients without PUD. The association between PUD and the risk of developing osteoporosis was estimated using a Cox proportional hazard regression model. During the follow-up period, osteoporosis was diagnosed in 2538 (9.35%) patients in the PUD group and in 2259 (8.33%) participants in the non-PUD group. After adjusting for covariates, osteoporosis risk was 1.85 times greater in the PUD group compared to the non-PUD group (13.99 vs 5.80 per 1000 person-years, respectively). Osteoporosis developed 1 year after PUD diagnosis. The 1-year follow-up period exhibited the highest significance between the 2 groups (hazard ratio [HR] = 63.44, 95% confidence interval [CI] = 28.19-142.74, P < 0.001). Osteoporosis risk was significantly higher in PUD patients with proton-pump-inhibitor (PPI) use (HR = 1.17, 95% CI = 1.03-1.34) compared to PUD patients without PPI use. This study revealed a significant association between PUD and subsequent risk of osteoporosis. Therefore, PUD patients, especially those treated with PPIs, should be evaluated for subsequent risk of osteoporosis to minimize the occurrence of adverse events. PMID:27100415
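
    Cox proportional hazards regression of this kind, estimating an adjusted hazard ratio for the exposure while controlling for covariates, can be sketched briefly. The example below uses the third-party lifelines package on simulated data; the covariates, effect sizes, and follow-up scheme are invented for illustration and do not reproduce the NHIRD analysis.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter   # assumes the lifelines package is installed

rng = np.random.default_rng(2)
n = 4000
df = pd.DataFrame({
    "pud":     rng.integers(0, 2, n),     # exposure of interest
    "age":     rng.normal(55, 12, n),
    "female":  rng.integers(0, 2, n),
    "ppi_use": rng.integers(0, 2, n),
})
# synthetic follow-up: higher hazard of osteoporosis for PUD (illustration only)
hazard = 0.01 * np.exp(0.6 * df["pud"] + 0.03 * (df["age"] - 55))
time_to_event = rng.exponential(1.0 / hazard)
censor_time = rng.uniform(1, 15, n)
df["duration"] = np.minimum(time_to_event, censor_time)
df["osteoporosis"] = (time_to_event <= censor_time).astype(int)

cph = CoxPHFitter()
cph.fit(df, duration_col="duration", event_col="osteoporosis")
print(np.exp(cph.params_))   # adjusted hazard ratios, e.g. PUD vs no PUD
```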

  1. Shift-work and cardiovascular disease: a population-based 22-year follow-up study.

    PubMed

    Hublin, Christer; Partinen, Markku; Koskenvuo, Karoliina; Silventoinen, Karri; Koskenvuo, Markku; Kaprio, Jaakko

    2010-05-01

    Studies on the association between shift-work and cardiovascular disease (CVD), in particular coronary heart disease (CHD), have given conflicting results. In this prospective population-based study we assessed the association of shift-work with three endpoints: CHD mortality, disability retirement due to CVD, and incident hypertension. A cohort of 20,142 adults (the Finnish Twin Cohort) was followed from 1982 to 2003. Type of working time (daytime/nighttime/shift-work) was assessed by questionnaires in 1975 (response rate 89%) and in 1981 (84%). Causes of death, information on disability retirement and hypertension medication were obtained from nationwide official registers. Cox proportional hazard models were used to obtain hazard ratios (HR) for each endpoint by type of working time. Adjustments were made for 14 socio-demographic and lifestyle covariates. 76.9% were daytime workers and 9.5% shift-workers both in 1975 and in 1981. During the follow-up, 857 deaths due to CHD, 721 disability retirements due to CVD, and 2,642 new cases of medicated hypertension were observed. However, HRs for shift-work were not significant (mortality HR men 1.09 and women 1.22; retirement 1.15 and 0.96; hypertension 1.15 and 0.98, respectively). The results were essentially similar after full adjustments for all covariates. Within twin pairs, no association between shift work and outcome was observed. Our results do not support an association between shift-work and cardiovascular morbidity. PMID:20229313

  2. HIV testing in national population-based surveys: experience from the Demographic and Health Surveys.

    PubMed Central

    Mishra, Vinod; Vaessen, Martin; Boerma, J. Ties; Arnold, Fred; Way, Ann; Barrere, Bernard; Cross, Anne; Hong, Rathavuth; Sangha, Jasbir

    2006-01-01

    OBJECTIVES: To describe the methods used in the Demographic and Health Surveys (DHS) to collect nationally representative data on the prevalence of human immunodeficiency virus (HIV) and assess the value of such data to country HIV surveillance systems. METHODS: During 2001-04, national samples of adult women and men in Burkina Faso, Cameroon, Dominican Republic, Ghana, Mali, Kenya, United Republic of Tanzania and Zambia were tested for HIV. Dried blood spot samples were collected for HIV testing, following internationally accepted ethical standards. The results for each country are presented by age, sex, and urban versus rural residence. To estimate the effects of non-response, HIV prevalence among non-responding males and females was predicted using multivariate statistical models for those who were tested, with a common set of predictor variables. RESULTS: Rates of HIV testing varied from 70% among Kenyan men to 92% among women in Burkina Faso and Cameroon. Despite large differences in HIV prevalence between the surveys (1-16%), fairly consistent patterns of HIV infection were observed by age, sex and urban versus rural residence, with considerably higher rates in urban areas and in women, especially at younger ages. Analysis of non-response bias indicates that although predicted HIV prevalence tended to be higher in non-tested males and females than in those tested, the overall effects of non-response on the observed national estimates of HIV prevalence are insignificant. CONCLUSIONS: Population-based surveys can provide reliable, direct estimates of national and regional HIV seroprevalence among men and women irrespective of pregnancy status. Survey data greatly enhance surveillance systems and the accuracy of national estimates in generalized epidemics. PMID:16878227

  3. Clozapine use in childhood and adolescent schizophrenia: A nationwide population-based study.

    PubMed

    Schneider, Carolina; Papachristou, Efstathios; Wimberley, Theresa; Gasse, Christiane; Dima, Danai; MacCabe, James H; Mortensen, Preben Bo; Frangou, Sophia

    2015-06-01

    Early onset schizophrenia (EOS) begins in childhood or adolescence. EOS is associated with poor treatment response and may benefit from timely use of clozapine. This study aimed to identify the predictors of clozapine use in EOS and characterize the clinical profile and outcome of clozapine-treated youths with schizophrenia. We conducted a nationwide population-based study using linked data from Danish medical registries. We examined all incident cases of EOS (i.e., cases diagnosed prior to their 18th birthday) between December 31st 1994 and December 31st 2006 and characterized their demographic, clinical and treatment profiles. We then used multivariable cox proportional hazard models to identify predictors of clozapine treatment in this patient population. We identified 662 EOS cases (1.9% of all schizophrenia cases), of whom 108 (17.6%) had commenced clozapine by December 31st 2008. Patients had on average 3 antipsychotic trials prior to clozapine initiation. The mean interval between first antipsychotic treatment and clozapine initiation was 3.2 (2.9) years. Older age at diagnosis of schizophrenia [HR=1.2, 95% CI (1.05-1.4), p=0.01], family history of schizophrenia [HR=2.1, 95% CI (1.1-3.04), p=0.02] and attempted suicide [HR=1.8, 95% CI (1.1-3.04), p=0.02] emerged as significant predictors of clozapine use. The majority of patients (n=96, 88.8%) prescribed clozapine appeared to have a favorable clinical response as indicated by continued prescription redemption and improved occupational outcomes. Our findings support current recommendations for the timely use of clozapine in EOS. PMID:25769917

  4. Epilepsy in Onchocerciasis Endemic Areas: Systematic Review and Meta-analysis of Population-Based Surveys

    PubMed Central

    Pion, Sébastien D. S.; Kaiser, Christoph; Boutros-Toni, Fernand; Cournil, Amandine; Taylor, Melanie M.; Meredith, Stefanie E. O.; Stufe, Ansgar; Bertocchi, Ione; Kipp, Walter; Preux, Pierre-Marie; Boussinesq, Michel

    2009-01-01

    Objective We sought to evaluate the relationship between onchocerciasis prevalence and that of epilepsy using available data collected at community level. Design We conducted a systematic review and meta-regression of available data. Data Sources Electronic and paper records on subject area ever produced up to February 2008. Review Methods We searched for population-based studies reporting on the prevalence of epilepsy in communities for which onchocerciasis prevalence was available or could be estimated. Two authors independently assessed eligibility and study quality and extracted data. The estimation of point prevalence of onchocerciasis was standardized across studies using appropriate correction factors. Variation in epilepsy prevalence was then analyzed as a function of onchocerciasis endemicity using random-effect logistic models. Results Eight studies from west (Benin and Nigeria), central (Cameroon and Central African Republic) and east Africa (Uganda, Tanzania and Burundi) met the criteria for inclusion and analysis. Ninety-one communities with a total population of 79,270 individuals screened for epilepsy were included in the analysis. The prevalence of epilepsy ranged from 0 to 8.7% whereas that of onchocerciasis ranged from 5.2 to 100%. Variation in epilepsy prevalence was consistent with a logistic function of onchocerciasis prevalence, with epilepsy prevalence being increased, on average, by 0.4% for each 10% increase in onchocerciasis prevalence. Conclusion These results give further evidence that onchocerciasis is associated with epilepsy and that the disease burden of onchocerciasis might have to be re-estimated by taking into account this relationship. PMID:19529767
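
    The central estimate here (roughly a 0.4 percentage-point rise in epilepsy prevalence per 10-point rise in onchocerciasis prevalence) comes from modelling community-level epilepsy counts as a logistic function of onchocerciasis prevalence. The sketch below fits a fixed-effect binomial GLM to simulated community data; the paper's random-effects formulation, the community sizes, and the coefficients are assumptions for illustration.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(4)
k = 91                                                   # communities
df = pd.DataFrame({"pop": rng.integers(300, 2000, k),
                   "oncho": rng.uniform(0.05, 1.0, k)})  # onchocerciasis prevalence
true_logit = -5.0 + 1.2 * df["oncho"] + rng.normal(0, 0.2, k)
df["cases"] = rng.binomial(df["pop"], 1 / (1 + np.exp(-true_logit)))

# binomial GLM: epilepsy cases / population as a logistic function of
# onchocerciasis prevalence (fixed-effect simplification of the paper's model)
endog = np.column_stack([df["cases"], df["pop"] - df["cases"]])
exog = sm.add_constant(df["oncho"])
fit = sm.GLM(endog, exog, family=sm.families.Binomial()).fit()
print(fit.params)   # intercept and slope on the logit scale
```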

  5. Metformin use and survival after colorectal cancer: A population-based cohort study.

    PubMed

    Mc Menamin, Úna C; Murray, Liam J; Hughes, Carmel M; Cardwell, Chris R

    2016-01-15

    Preclinical evidence suggests that metformin could delay cancer progression. Previous epidemiological studies however have been limited by small sample sizes and certain time-related biases. This study aimed to investigate whether colorectal cancer patients with type 2 diabetes who were exposed to metformin had reduced cancer-specific mortality. We conducted a retrospective cohort study of 1,197 colorectal cancer patients newly diagnosed from 1998 to 2009 (identified from English cancer registries) with type 2 diabetes (based upon Clinical Practice Research Datalink, CPRD, prescription and diagnosis records). In this cohort 382 colorectal cancer-specific deaths occurred up to 2012 from the Office of National Statistics (ONS) mortality data. Metformin use was identified from CPRD prescription records. Using time-dependent Cox regression models, unadjusted and adjusted hazard ratios (HR) and 95% CIs were calculated for the association between post-diagnostic exposure to metformin and colorectal cancer-specific mortality. Overall, there was no evidence of an association between metformin use and cancer-specific death before or after adjustment for potential confounders (adjusted HR 1.06, 95% CI 0.80, 1.40). In addition, after adjustment for confounders, there was also no evidence of associations between other diabetic medications and cancer-specific mortality including sulfonylureas (HR 1.14, 95% CI 0.86, 1.51), insulin use (HR 1.35, 95% CI 0.95, 1.93) or other anti-diabetic medications including thiazolidinediones (HR 0.73, 95% CI 0.46, 1.14). Similar associations were observed by duration of use and for all-cause mortality. This population-based study, the largest to date, does not support a protective association between metformin and survival in colorectal cancer patients. PMID:26331456

  6. Genocide Exposure and Subsequent Suicide Risk: A Population-Based Study

    PubMed Central

    Levine, Stephen Z.; Levav, Itzhak; Yoffe, Rinat; Becher, Yifat; Pugachova, Inna

    2016-01-01

    The association between periods of genocide-related exposures and suicide risk remains unknown. Our study tests that association using a national population-based study design. The source population comprised all persons born during 1922-1945 in Nazi-occupied or Nazi-dominated European nations who immigrated to Israel by 1965; they were identified in the Population Register (N = 220,665) and followed up for suicide to 2014, totaling 16,953,602 person-years. The population was disaggregated to compare a trauma gradient among groups that immigrated before (indirect, n = 20,612, 9%); during (partial direct, n = 17,037, 8%); or after (full direct, n = 183,016, 83%) exposure to the Nazi era. Also, the direct exposure groups were examined regarding pre- or post-natal exposure periods. Cox regression models were used to compute hazard ratios (HR) of suicide risk to compare the exposure groups, adjusting for confounding by gender, residential SES and history of psychiatric hospitalization. In the total population, only the partial direct exposure subgroup was at greater risk compared to the indirect exposure group (HR = 1.73, 95% CI, 1.10, 2.73; P < .05). That effect replicated in six sensitivity analyses. In addition, sensitivity analyses showed that exposure at ages 13 and above among females, and follow-up by years since immigration, were associated with a greater risk; whereas in utero exposure among persons with no psychiatric hospitalization and early postnatal exposure among males were associated with a reduced risk. Tentative mechanisms impute biopsychosocial vulnerability and natural selection during early critical periods among males, and feelings of guilt and entrapment or defeat among females. PMID:26901411

  7. Diagnostic Ionizing Radiation Exposure in a Population-based Sample of Children with Inflammatory Bowel Diseases

    PubMed Central

    Palmer, Lena; Herfarth, Hans; Porter, Carol Q.; Fordham, Lynn A.; Sandler, Robert S.; Kappelman, Michael D.

    2009-01-01

    Background and Aims The degree of diagnostic radiation exposure in children with inflammatory bowel diseases (IBD) is largely unknown. Here we describe this exposure in a population-based sample of children with IBD and determine characteristics associated with moderate radiation exposure. Methods We ascertained radiological study use, demographic characteristics, IBD medication use, and the requirement for hospitalization, emergency department (ED) encounter, or inpatient GI surgery among children with IBD within a large insurance claims database. Characteristics associated with moderate radiation exposure (at least one computed tomography (CT) or three fluoroscopies over two years) were determined using logistic regression models. Results We identified 965 children with Crohn’s Disease (CD) and 628 with Ulcerative Colitis (UC). Over 24 months, 34% of CD subjects and 23% of UC subjects were exposed to moderate diagnostic radiation [odds ratio (OR) 1.71, 95% confidence interval (CI), 1.36–2.14]. CT accounted for 28% and 25% of all studies in CD and UC subjects, respectively. For CD subjects, moderate radiation exposure was associated with hospitalization (OR 4.89, 95% CI 3.37–7.09), surgery (OR 2.93, 95% CI 1.59–5.39), ED encounter (OR 2.65, 95% CI 1.93–3.64), oral steroids (OR 2.25, 95% CI 1.50–3.38), and budesonide (OR 1.80, 95% CI 1.10–3.06); an inverse association was seen with immunomodulator use (OR 0.67, 95% CI 0.47–0.97). Except for oral steroids and immunomodulators, similar relationships were seen in UC. Conclusion A substantial proportion of children with IBD are exposed to moderate amounts of radiation as a result of diagnostic testing. This high utilization may impart long-term risk given the chronic nature of the disease. PMID:19690524

  8. A strategy for teaching students concepts of population-based health care for older persons.

    PubMed

    Xakellis, George C; Robinson, Mark

    2003-08-01

    The authors present a strategy for organizing and teaching the concepts of population-based health care for patients over the age of 65. The key ingredients are a case study based on a representative sample of 5,000 Medicare recipients and a student guide containing the sample group's demographics, clinical characteristics, and utilization patterns. As part of the case study, three subgroups within the sample are described: the basically healthy 50% that consume only 3% of medical resources, the most severely ill 10% that consume 70% of medical resources, and the moderately ill 40% that consume the remaining 27% of medical resources. These categories introduce the concepts of severity of illness, highlight the clinical challenges facing providers of care to the elderly, and contrast the divergent needs of individual health care consumers in an aging population. Armed with this succinct and manageable information packet, students are asked to play the role of an interdisciplinary team that is responsible for effectively managing the care of this population of 5,000 lives. Included in this article is a description of learning resources provided; sample group discussion questions; one strategy for caring for the population developed by a faculty-student group; and a brief description of the educational implications of the model. At the end of the article the reader is provided a Web address containing a description of the case and supporting materials (http://healthyaging.ucdavis.edu/education/continuing/ManagedMedicareTeachingCase.pdf). Readers are invited to view, print, and/or utilize the case in their own academic settings. PMID:12915368

  9. Aspirin Use Associated With Amyotrophic Lateral Sclerosis: a Total Population-Based Case-Control Study

    PubMed Central

    Tsai, Ching-Piao; Lin, Feng-Cheng; Lee, Johnny Kuang-Wu; Lee, Charles Tzu-Chi

    2015-01-01

    Background The association of aspirin use and nonsteroid anti-inflammatory drug (NSAID) use with amyotrophic lateral sclerosis (ALS) risk is unclear. This study determined whether use of any individual compound is associated with ALS risk by conducting a total population-based case-control study in Taiwan. Methods A total of 729 patients with newly diagnosed ALS who had a severely disabling disease certificate between January 1, 2002, and December 1, 2008, comprised the case group. These cases were compared with 7290 sex-, age-, residence-, and insurance premium-matched controls. Drug use by each Anatomical Therapeutic Chemical code was analyzed using conditional logistic regression models. False discovery rate (FDR)-adjusted P values were reported in order to avoid inflating false positives. Results Of the 1336 compounds, only the 266 with use cases exceeding 30 in our database were included in the screening analysis. Without controlling for steroid use, the analysis failed to reveal any compound that was inversely associated with ALS risk according to FDR criteria. After controlling for steroid use, we found use of the following compounds to be associated with ALS risk: aspirin, diphenhydramine (one of the antihistamines), and mefenamic acid (one of the NSAIDs). A multivariate analysis revealed that aspirin was independently inversely associated with ALS risk after controlling for diphenhydramine, mefenamic acid, and steroid use. The inverse association between aspirin and ALS was present predominately in patients older than 55 years. Conclusions The results of this study suggested that aspirin use might reduce the risk of ALS, and the benefit might be more prominent for older people. PMID:25721071
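
    False discovery rate control across the 266 screened compounds is typically done with the Benjamini-Hochberg step-up adjustment, which can be written in a few lines. The sketch below is a generic implementation on made-up p-values; it is not the authors' code, and statsmodels' multipletests offers an equivalent off-the-shelf routine.

```python
import numpy as np

def benjamini_hochberg(p_values):
    """FDR-adjusted p-values (Benjamini-Hochberg step-up), as used to screen
    many drug-disease associations while limiting false positives."""
    p = np.asarray(p_values, dtype=float)
    m = p.size
    order = np.argsort(p)
    ranked = p[order] * m / np.arange(1, m + 1)
    # enforce monotonicity from the largest p-value downwards
    adjusted = np.minimum.accumulate(ranked[::-1])[::-1]
    out = np.empty_like(adjusted)
    out[order] = np.clip(adjusted, 0, 1)
    return out

print(benjamini_hochberg([0.001, 0.008, 0.040, 0.200, 0.700]))
```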

  10. Association between gastroesophageal reflux disease and coronary heart disease: A nationwide population-based analysis.

    PubMed

    Chen, Chien-Hua; Lin, Cheng-Li; Kao, Chia-Hung

    2016-07-01

    In this study, we aimed to determine the association between gastroesophageal reflux disease (GERD) and subsequent coronary heart disease (CHD) development, if any, and to evaluate whether longer use of proton pump inhibitors (PPIs) increases the risk of CHD. Patients diagnosed with GERD between 2000 and 2011 were identified as the study cohort (n = 12,960). Patients without GERD were randomly selected from the general population, frequency-matched with the study group according to age, sex, and index year, and evaluated as the comparison cohort (n = 51,840). Both cohorts were followed up until the end of 2011 to determine the incidence of CHD. The risk of CHD was evaluated in both groups by using Cox proportional hazards regression models. The GERD patients had a greater probability of CHD than the cohort without GERD did (log-rank test, P < 0.001 and 11.8 vs 6.5 per 1000 person-years). The GERD cohort had a higher risk of CHD than the comparison cohort did after adjustment for age, sex, hypertension, diabetes, hyperlipidemia, alcohol-related illness, stroke, chronic obstructive pulmonary disease, asthma, biliary stone, anxiety, depression, chronic kidney disease, and cirrhosis (adjusted hazard ratio [aHR]: 1.49, 95% confidence interval [CI]: 1.34-1.66). The risk of CHD was greater for the patients treated with PPIs for more than 1 year (aHR = 1.67, 95% CI = 1.34-2.08) than for those treated with PPIs for <1 year (aHR = 1.56, 95% CI = 1.39-1.74). Our population-based cohort study results indicate that GERD was associated with an increased risk of developing CHD, and that PPI use for more than 1 year might increase the risk of CHD. PMID:27399102

  11. Dietary patterns and risk of oesophageal cancers: a population-based case-control study.

    PubMed

    Ibiebele, Torukiri I; Hughes, Maria Celia; Whiteman, David C; Webb, Penelope M

    2012-04-01

    Epidemiological studies investigating the association between dietary intake and oesophageal cancer have mostly focused on nutrients and food groups instead of dietary patterns. We conducted a population-based case-control study, which included 365 oesophageal adenocarcinoma (OAC), 426 oesophagogastric junction adenocarcinoma (OGJAC) and 303 oesophageal squamous cell carcinoma (OSCC) cases, frequency-matched on age, sex and geographical location to 1580 controls. Data on demographic, lifestyle and dietary factors were collected using self-administered questionnaires. We used principal component analysis to derive three dietary patterns: 'meat and fat', 'pasta and pizza' and 'fruit and vegetable', and unconditional logistic regression models to estimate risks of OAC, OGJAC and OSCC associated with quartiles (Q) of dietary pattern scores. A high score on the meat-and-fat pattern was associated with increased risk of all three cancers: multivariable-adjusted OR 2.12 (95% CI 1.30, 3.46) for OAC; 1.88 (95% CI 1.21, 2.94) for OGJAC; 2.84 (95% CI 1.67, 4.83) for OSCC (P-trend <0.01 for all three cancers). A high score on the pasta-and-pizza pattern was inversely associated with OSCC risk (OR 0.58, 95% CI 0.36, 0.96, P for trend=0.009); and a high score on the fruit-and-vegetable pattern was associated with a borderline significant decreased risk of OGJAC (OR for Q4 v. Q1 0.66, 95% CI 0.42, 1.04, P=0.07) and significantly decreased risk of OSCC (OR 0.41, 95% CI 0.24, 0.70, P for trend=0.002). High-fat dairy foods appeared to play a dominant role in the association between the meat-and-fat pattern and risk of OAC and OGJAC. Further investigation in prospective studies is needed to confirm these findings. PMID:21899799
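
    Deriving dietary patterns by principal component analysis and then relating quartiles of a pattern score to case status can be sketched compactly. The example below runs on simulated food-frequency data; the number of components, the food items, and the unconditional logistic specification are illustrative assumptions rather than the study's exact analysis.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
foods = pd.DataFrame(rng.poisson(3, size=(500, 12)),        # servings/week (hypothetical)
                     columns=[f"food_{i}" for i in range(12)])
case = rng.integers(0, 2, size=500)                         # 1 = oesophageal cancer case

# derive dietary-pattern scores: PCA on standardised intakes
scores = PCA(n_components=3).fit_transform(StandardScaler().fit_transform(foods))
pattern1_q = pd.Series(pd.qcut(scores[:, 0], 4, labels=False), name="q")

# unconditional logistic regression of case status on quartile indicators (Q1 = reference)
X = sm.add_constant(pd.get_dummies(pattern1_q, prefix="Q", drop_first=True).astype(float))
fit = sm.Logit(case, X).fit(disp=0)
print(np.exp(fit.params))                                   # odds ratios for Q2-Q4 vs Q1
```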

  12. Socioeconomic Status and Incidence of Traffic Accidents in Metropolitan Tehran: A Population-based Study

    PubMed Central

    Sehat, Mojtaba; Naieni, Kourosh Holakouie; Asadi-Lari, Mohsen; Foroushani, Abbas Rahimi; Malek-Afzali, Hossein

    2012-01-01

    Background: Population-based estimates of traffic accidents (TAs) are not readily available for developing countries. This study examined the contribution of socioeconomic status (SES) to the risk of TA among Iranian adults. Methods: A total of 64,200 people aged ≥18 years were identified from the 2008 Urban Health Equity Assessment and Response Tool (Urban HEART) survey. 22,128 households were interviewed to estimate the overall annual incidence, severity and socioeconomic determinants of TAs for males and females in the Iranian capital over the preceding year. A wealth index and a house value index were constructed for economic measurement. Weighted estimates were computed adjusting for the complex survey design. Logistic regression models were used to examine individual and SES measures as potential determinants of TAs in adults. Results: The overall incidence of traffic accidents was 17.3 (95% CI 16.0, 18.7) per 1000 per year. The TA rate in men and women was 22.6 (95% CI 20.6, 24.8) and 11.8 (95% CI 10.4, 13.2), respectively. The overall TA mortality rate was 26.6 (95% CI 13.4, 39.8) per 100,000 person-years, which was almost three times higher in men than in women (40.4 vs. 12.1 per 100,000 person-years). Lower economic level was associated with increased incidence and mortality of TAs. Associations between SES and the incidence, severity, and mortality of TAs were identified. Conclusion: TAs occur more often in the lower socioeconomic layers of society. This should be taken seriously into consideration by policy makers, so that preventive programs aimed at behavioral modifications in society are promoted to decrease the health and economic burden imposed by TAs. PMID:22448311

  13. Characterization of mitochondrial haplogroups in a large population-based sample from the United States.

    PubMed

    Mitchell, Sabrina L; Goodloe, Robert; Brown-Gentry, Kristin; Pendergrass, Sarah A; Murdock, Deborah G; Crawford, Dana C

    2014-07-01

    Mitochondrial DNA (mtDNA) haplogroups are valuable for investigations in forensic science, molecular anthropology, and human genetics. In this study, we developed a custom panel of 61 mtDNA markers for high-throughput classification of European, African, and Native American/Asian mitochondrial haplogroup lineages. Using these mtDNA markers, we constructed a mitochondrial haplogroup classification tree and classified 18,832 participants from the National Health and Nutrition Examination Surveys (NHANES). To our knowledge, this is the largest study to date characterizing mitochondrial haplogroups in a population-based sample from the United States, and the first study characterizing mitochondrial haplogroup distributions in self-identified Mexican Americans separately from Hispanic Americans of other descent. We observed clear differences in the distribution of maternal genetic ancestry consistent with proposed admixture models for these subpopulations, underscoring the genetic heterogeneity of the United States Hispanic population. The mitochondrial haplogroup distributions in the other self-identified racial/ethnic groups within NHANES were largely comparable to previous studies. Mitochondrial haplogroup classification was highly concordant with self-identified race/ethnicity (SIRE) in non-Hispanic whites (94.8 %), but was considerably lower in admixed populations including non-Hispanic blacks (88.3 %), Mexican Americans (81.8 %), and other Hispanics (61.6 %), suggesting SIRE does not accurately reflect maternal genetic ancestry, particularly in populations with greater proportions of admixture. Thus, it is important to consider inconsistencies between SIRE and genetic ancestry when performing genetic association studies. The mitochondrial haplogroup data that we have generated, coupled with the epidemiologic variables in NHANES, is a valuable resource for future studies investigating the contribution of mtDNA variation to human health and disease. PMID:24488180

  14. Sex differences in the outcomes of peripheral arterial disease: a population-based cohort study

    PubMed Central

    Hussain, Mohamad A.; Lindsay, Thomas F.; Mamdani, Muhammad; Wang, Xuesong; Verma, Subodh; Al-Omran, Mohammed

    2016-01-01

    Background: The role of sex in the outcomes of patients with peripheral arterial disease (PAD) has been poorly studied. We sought to investigate differences in the long-term adverse cardiovascular and limb outcomes between men and women with PAD. Methods: We conducted a population-based cohort study with up to 7 years of follow-up using linked administrative databases in Ontario, Canada. Patients aged 40 years or older who visited a vascular surgeon between Apr. 1, 2004, and Mar. 31, 2007 (index date), and carried a diagnosis of PAD comprised the study cohort. The primary outcome was a composite of death or hospital admission for stroke or myocardial infarction. Secondary outcomes included lower limb amputation or revascularization. We used Cox proportional hazards modelling to compute unadjusted hazard ratios (HRs) and HRs adjusted for baseline covariates. Results: A total of 6915 patients were studied, of whom 2461 (35.6%) were women. No significant differences in the risk of the primary outcome were observed between men and women (adjusted HR 0.99 [95% confidence interval (CI) 0.92-1.05]). Women were less likely than men to undergo minor amputation (adjusted HR 0.73 [95% CI 0.62-0.85]) and arterial bypass surgery (adjusted HR 0.82 [95% CI 0.71-0.94]) but were more likely to be admitted to hospital for acute myocardial infarction (adjusted HR 1.15 [95% CI 1.00-1.31]). There were no sex differences in the rates of major amputation or transluminal percutaneous angioplasty. Interpretation: We identified no significant differences in the composite risk of major adverse cardiovascular events between women and men with PAD, although our findings suggest men may be at increased risk for adverse limb events compared with women. Cardiovascular health campaigns should focus on both women and men to promote early diagnosis and management of PAD. PMID:27280110

  15. Opioid use in patients with rheumatoid arthritis 2005-2014: a population-based comparative study.

    PubMed

    Zamora-Legoff, Jorge A; Achenbach, Sara J; Crowson, Cynthia S; Krause, Megan L; Davis, John M; Matteson, Eric L

    2016-05-01

    Opioid prescriptions have increased across the USA, Canada, Europe, and the UK; in the USA, they quadrupled from 1999 to 2010. Opioid use among patients with rheumatoid arthritis (RA) over time is not well described. This study examined trends in opioid use in patients with RA. Retrospective prescription data were examined from 2005 to 2014 in a population-based incidence cohort of patients with RA (1987 ACR criteria) and comparable non-RA subjects. Differences in opioid use were examined with Poisson models. A total of 501 patients with RA (71 % female) and 532 non-RA subjects (70 % female) were included in the study. Total and chronic opioid use in 2014 were substantial in both cohorts (40 % RA vs 24 % non-RA and 12 % RA vs 4 % non-RA, respectively). Opioid use increased by 19 % per year in both cohorts during the study period (95 % confidence interval [CI] 1.15, 1.25). The relative risk (RR) of chronic opioid use for RA patients compared with non-RA subjects was highest in adults aged 50-64 years (RR 2.82; 95 % CI 1.43-6.23). RA disease characteristics, biologic use at index, treated depression/fibromyalgia, education, and smoking status were not significantly associated with chronic opioid use. Over a third of patients with RA use opioids in some form, and in more than a tenth, use is chronic. Use has increased in recent years. Patients aged 50-64 years with RA use substantially more opioids than their non-RA counterparts. PMID:27022929
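
    The 19 % annual increase reported above is the kind of rate ratio a Poisson regression with a person-time offset produces. A hedged sketch of that model, with assumed variable names and data layout, follows.

```python
# Sketch (assumed data layout): Poisson regression for the yearly trend in
# opioid use, expressed as a rate ratio per calendar year.
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("opioid_counts.csv")            # hypothetical cohort-by-year counts
X = sm.add_constant(df[["year_since_2005", "ra_cohort"]])
fit = sm.GLM(df["opioid_users"], X,
             family=sm.families.Poisson(),
             offset=np.log(df["person_years"])).fit()
print(np.exp(fit.params))                        # ~1.19 on the year term would mean +19% per year
```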

  16. Systemic inflammation, depression and obstructive pulmonary function: a population-based study

    PubMed Central

    2013-01-01

    Background Levels of interleukin-6 (IL-6) and C-reactive protein (CRP) indicating systemic inflammation are known to be elevated in chronic diseases including chronic obstructive pulmonary disease (COPD) and depression. Comorbid depression is common in patients with COPD, but no studies have investigated whether proinflammatory cytokines mediate the association between pulmonary function and depressive symptoms in healthy individuals with no known history of obstructive pulmonary diseases. Methods In a population-based sample (n = 2077) of individuals aged 55 and above with no known history of obstructive pulmonary disease in the Singapore Longitudinal Ageing Study (SLAS), we analyzed the relationships between IL-6 and CRP, depressive symptoms (GDS-15 ≥5) and obstructive pulmonary function (FEV1% predicted and FEV1/FVC% predicted). Results High serum levels of IL-6 and CRP were associated with a greater prevalence of depressive symptoms (p < 0.05). High IL-6, high CRP and depressive symptoms were independently associated with decreased FEV1% predicted and FEV1/FVC% predicted after adjusting for smoking status, BMI and number of chronic inflammatory diseases. Increasing grades of combined inflammatory markers and/or depressive symptoms were associated with progressively greater pulmonary obstruction. In hierarchical models, the significant association of depressive symptoms with pulmonary obstruction was reduced by the presence of IL-6 and CRP. Conclusions This study found for the first time an association between depressive symptoms and pulmonary function in older adults which appeared to be partly mediated by proinflammatory cytokines. Further studies should be conducted to investigate proinflammatory immune markers and depressive symptoms as potential phenotypic indicators for chronic obstructive airway disorders in older adults. PMID:23676005
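
    The hierarchical models mentioned above amount to checking whether the depressive-symptom coefficient attenuates once the inflammatory markers enter the regression. A hedged sketch of that comparison, with entirely assumed variable names, is given below.

```python
# Sketch only: nested linear models for FEV1% predicted, comparing the
# depressive-symptom coefficient before and after adding IL-6 and CRP.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("slas_subsample.csv")           # hypothetical analysis file
base = smf.ols("fev1_pct_pred ~ depressive + smoking + bmi + n_chronic_dis",
               data=df).fit()
full = smf.ols("fev1_pct_pred ~ depressive + il6_high + crp_high "
               "+ smoking + bmi + n_chronic_dis", data=df).fit()
print(base.params["depressive"], full.params["depressive"])  # attenuation check
```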

  17. Ethnic Differences in Gestational Weight Gain: A Population-Based Cohort Study in Norway.

    PubMed

    Kinnunen, Tarja I; Waage, Christin W; Sommer, Christine; Sletner, Line; Raitanen, Jani; Jenum, Anne Karen

    2016-07-01

    Objectives To explore ethnic differences in gestational weight gain (GWG). Methods This was a population-based cohort study conducted in primary care child health clinics in Groruddalen, Oslo, Norway. Participants were healthy pregnant women (n = 632) categorised into six ethnic groups (43 % were Western European women, the reference group). Body weight was measured at 15 and 28 weeks' gestation on average. Data on pre-pregnancy weight and total GWG until delivery were self-reported. The main method of analysis was linear regression adjusting for age, weeks' gestation, pre-pregnancy body mass index, education and severe nausea. Results No ethnic differences were observed in GWG by 15 weeks' gestation. By 28 weeks' gestation, Eastern European women had gained 2.71 kg (95 % confidence interval, CI 1.10-4.33) and Middle Eastern women 1.32 kg (95 % CI 0.14-2.50) more weight on average than the Western European women in the fully adjusted model. Among Eastern European women, the total adjusted GWG was 3.47 kg (95 % CI 1.33-5.61) above the reference group. Other ethnic groups (South Asian, East Asian and African) did not differ from the reference group. When only non-smokers (n = 522) were included, the observed between-group differences increased and Middle Eastern women gained more weight than the reference group at all time points. Conclusions Eastern European and Middle Eastern women had higher GWG on average than Western European women, especially among the non-smokers. Although prevention of excessive GWG is important for all pregnant women, these ethnic groups might need special attention during pregnancy. PMID:26979613
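
    The adjusted between-group differences reported above come from an ordinary linear regression of weight gain on ethnic group plus the listed covariates. A minimal sketch with assumed column names:

```python
# Hedged sketch (column names assumed): adjusted ethnic-group differences in
# gestational weight gain by 28 weeks' gestation, via linear regression.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("gwg_cohort.csv")               # hypothetical analysis file
fit = smf.ols("gwg_28wk ~ C(ethnic_group) + age + gest_weeks "
              "+ prepreg_bmi + education + severe_nausea", data=df).fit()
print(fit.params.filter(like="ethnic_group"))    # kg differences vs the reference group
```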

  18. Effect of radical prostatectomy surgeon volume on complication rates from a large population-based cohort

    PubMed Central

    Almatar, Ashraf; Wallis, Christopher J.D.; Herschorn, Sender; Saskin, Refik; Kulkarni, Girish S.; Kodama, Ronald T.; Nam, Robert K.

    2016-01-01

    Introduction: Surgical volume can affect several outcomes following radical prostatectomy (RP). We examined if surgical volume was associated with novel categories of treatment-related complications following RP. Methods: We examined a population-based cohort of men treated with RP in Ontario, Canada between 2002 and 2009. We used Cox proportional hazard modeling to examine the effect of physician, hospital and patient demographic factors on rates of treatment-related hospital admissions, urologic procedures, and open surgeries. Results: Over the study interval, 15 870 men were treated with RP. A total of 196 surgeons performed a median of 15 cases per year (range: 1–131). Patients treated by surgeons in the highest quartile of annual case volume (>39/year) had a lower risk of hospital admission (hazard ratio [HR]=0.54, 95% CI 0.47–0.61) and urologic procedures (HR=0.69, 95% CI 0.64–0.75), but not open surgeries (HR=0.83, 95% CI 0.47–1.45) than patients treated by surgeons in the lowest quartile (<15/year). Treatment at an academic hospital was associated with a decreased risk of hospitalization (HR=0.75, 95% CI 0.67–0.83), but not of urologic procedures (HR=0.94, 95% CI 0.88–1.01) or open surgeries (HR=0.87, 95% CI 0.54–1.39). There was no significant trend in any of the outcomes by population density. Conclusions: The annual case volume of the treating surgeon significantly affects a patient’s risk of requiring hospitalization or urologic procedures (excluding open surgeries) to manage treatment-related complications. PMID:26977206

  19. Severity of malocclusion in adolescents: populational-based study in the north of Minas Gerais, Brazil

    PubMed Central

    Silveira, Marise Fagundes; Freire, Rafael Silveira; Nepomuceno, Marcela Oliveira; Martins, Andrea Maria Eleutério de Barros Lima; Marcopito, Luiz Francisco

    2016-01-01

    ABSTRACT OBJECTIVE To identify the factors associated with severity of malocclusion in a population of adolescents. METHODS In this cross-sectional population-based study, the sample size (n = 761) was calculated considering a prevalence of malocclusion of 50.0%, with a 95% confidence level and a 5.0% precision level. The study adopted correction for the effect of delineation (deff = 2), and a 20.0% increase to offset losses and refusals. Multistage probability cluster sampling was adopted. Trained and calibrated professionals performed the intraoral examinations and interviews in households. The dependent variable (severity of malocclusion) was assessed using the Dental Aesthetic Index (DAI). The independent variables were grouped into five blocks: demographic characteristics, socioeconomic condition, use of dental services, health-related behavior and oral health subjective conditions. The ordinal logistic regression model was used to identify the factors associated with severity of malocclusion. RESULTS We interviewed and examined 736 adolescents (91.5% response rate), 69.9% of whom showed no abnormalities or slight malocclusion. Defined malocclusion was observed in 17.8% of the adolescents, being severe or very severe in 12.6%, with pressing or essential need of orthodontic treatment. The probabilities of greater severity of malocclusion were higher among adolescents who self-reported as black, indigenous, pardo or yellow, with lower per capita income, having harmful oral habits, negative perception of their appearance and perception of social relationship affected by oral health. CONCLUSIONS Severe or very severe malocclusion was more prevalent among socially disadvantaged adolescents, with reported harmful habits and perception of compromised esthetics and social relationships. Given that malocclusion can interfere with the self-esteem of adolescents, it is essential to improve public policy for the inclusion of orthodontic treatment among health care
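
    The ordinal logistic regression described above treats malocclusion severity as an ordered outcome. A hedged sketch of such a model, with assumed grade labels and predictors, is shown below (this is not the authors' code).

```python
# Illustrative sketch only: ordinal (proportional-odds) logistic regression for
# malocclusion severity graded by the Dental Aesthetic Index (DAI).
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

df = pd.read_csv("dai_survey.csv")               # hypothetical household-survey extract
df["dai_grade"] = pd.Categorical(
    df["dai_grade"],
    categories=["none_or_slight", "defined", "severe", "very_severe"],
    ordered=True)

model = OrderedModel(df["dai_grade"],
                     df[["low_income", "harmful_oral_habits", "neg_appearance"]],
                     distr="logit")
res = model.fit(method="bfgs", disp=False)
print(res.summary())                             # coefficients on the log-odds scale
```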

  20. Severity of malocclusion in adolescents: populational-based study in the north of Minas Gerais, Brazil.

    PubMed

    Silveira, Marise Fagundes; Freire, Rafael Silveira; Nepomuceno, Marcela Oliveira; Martins, Andrea Maria Eleutério de Barros Lima; Marcopito, Luiz Francisco

    2016-01-01

    OBJECTIVE To identify the factors associated with severity of malocclusion in a population of adolescents. METHODS In this cross-sectional population-based study, the sample size (n = 761) was calculated considering a prevalence of malocclusion of 50.0%, with a 95% confidence level and a 5.0% precision level. The study adopted correction for the effect of delineation (deff = 2), and a 20.0% increase to offset losses and refusals. Multistage probability cluster sampling was adopted. Trained and calibrated professionals performed the intraoral examinations and interviews in households. The dependent variable (severity of malocclusion) was assessed using the Dental Aesthetic Index (DAI). The independent variables were grouped into five blocks: demographic characteristics, socioeconomic condition, use of dental services, health-related behavior and oral health subjective conditions. The ordinal logistic regression model was used to identify the factors associated with severity of malocclusion. RESULTS We interviewed and examined 736 adolescents (91.5% response rate), 69.9% of whom showed no abnormalities or slight malocclusion. Defined malocclusion was observed in 17.8% of the adolescents, being severe or very severe in 12.6%, with pressing or essential need of orthodontic treatment. The probabilities of greater severity of malocclusion were higher among adolescents who self-reported as black, indigenous, pardo or yellow, with lower per capita income, having harmful oral habits, negative perception of their appearance and perception of social relationship affected by oral health. CONCLUSIONS Severe or very severe malocclusion was more prevalent among socially disadvantaged adolescents, with reported harmful habits and perception of compromised esthetics and social relationships. Given that malocclusion can interfere with the self-esteem of adolescents, it is essential to improve public policy for the inclusion of orthodontic treatment among health care provided to this

  1. Healthcare Costs Attributable to Hypertension: Canadian Population-Based Cohort Study.

    PubMed

    Weaver, Colin G; Clement, Fiona M; Campbell, Norm R C; James, Matthew T; Klarenbach, Scott W; Hemmelgarn, Brenda R; Tonelli, Marcello; McBrien, Kerry A

    2015-09-01

    Accurately documenting the current and future costs of hypertension is required to fully understand the potential economic impact of currently available and future interventions to prevent and treat hypertension. The objective of this work was to calculate the healthcare costs attributable to hypertension in Canada and to project these costs to 2020. Using population-based administrative data for the province of Alberta, Canada (>3 million residents) from 2002 to 2010, we identified individuals with and without diagnosed hypertension. We calculated their total healthcare costs and estimated costs attributable to hypertension using a regression model adjusting for comorbidities and sociodemographic factors. We then extrapolated hypertension-attributable costs to the rest of Canada and projected costs to the year 2020. Twenty-one percent of adults in Alberta had diagnosed hypertension in 2010, with a projected increase to 27% by 2020. The average individual with hypertension had annual healthcare costs of $5768, of which $2341 (41%) were attributed to hypertension. In Alberta, the healthcare costs attributable to hypertension were $1.4 billion in 2010. In Canada, the hypertension-attributable costs were estimated to be $13.9 billion in 2010, rising to $20.5 billion by 2020. The increase was ascribed to demographic changes (52%), increasing prevalence (16%), and increasing per-patient costs (32%). Hypertension accounts for a significant proportion of healthcare spending (10.2% of the Canadian healthcare budget) and is projected to rise even further. Interventions to prevent and treat hypertension may play a role in limiting this cost growth. PMID:26169049

  2. Socioeconomic inequalities in pregnancy outcome associated with Down syndrome: a population-based study

    PubMed Central

    Budd, Judith L S; Draper, Elizabeth S; Lotto, Robyn R; Berry, Laura E; Smith, Lucy K

    2015-01-01

    Objective To investigate socioeconomic inequalities in outcome of pregnancy associated with Down syndrome (DS) compared with other congenital anomalies screened for during pregnancy. Design and setting Retrospective population-based registry study (East Midlands & South Yorkshire in England). Participants All registered cases of DS and nine selected congenital anomalies with poor prognostic outcome (the UK Fetal Anomaly Screening Programme (FASP)9) with an end of pregnancy date between 1 January 1998 and 31 December 2007. Main outcome measures: Poisson regression models were used to explore outcome measures, including socioeconomic variation in rates of anomaly; antenatal detection; pregnancy outcome; live birth incidence and neonatal mortality. Deprivation was measured using the Index of Multiple Deprivation 2004 at super output area level. Results There were 1151 cases of DS and 1572 cases of the nine severe anomalies combined. The overall rate of antenatal detection was 57% for DS, which decreased with increasing deprivation (rate ratio comparing the most deprived tenth with the least deprived: 0.76 (0.60 to 0.97)). Antenatal detection rates were considerably higher for FASP9 anomalies (86%), with no evidence of a trend with deprivation (0.99 95% CI (0.84 to 1.17)). The termination of pregnancy rate following antenatal diagnosis was higher for DS (86%) than the FASP9 anomalies (70%). Both groups showed wide socioeconomic variation in the termination of pregnancy rate (rate ratio: DS: 0.76 (0.58 to 0.99); FASP9 anomalies: 0.80 (0.65 to 0.97)). Consequently, socioeconomic inequalities in live birth and neonatal mortality rates associated with these anomalies arise that were not observed in utero. Conclusions Socioeconomic inequalities exist in the antenatal detection of DS, and subsequent termination rates are much higher for DS than other anomalies. Termination rates for all anomalies are lower in more deprived areas leading to wide socioeconomic inequalities in

  3. Predictors of ovarian cancer survival: a population-based prospective study in Sweden.

    PubMed

    Yang, Ling; Klint, Asa; Lambe, Mats; Bellocco, Rino; Riman, Tomas; Bergfeldt, Kjell; Persson, Ingemar; Weiderpass, Elisabete

    2008-08-01

    Ovarian cancer is the leading cause of death from gynecologic malignancies among women worldwide. Little is known about how reproductive factors or lifestyle determinants relate to ovarian cancer prognosis. The objective of this study was to examine whether ovarian cancer survival is influenced by reproductive history, anthropometric characteristics, prediagnostic lifestyle factors and family history of breast or ovarian cancer. The study population consisted of 635 epithelial ovarian cancer (EOC) cases derived from a nationwide population-based case-control study conducted in Sweden between 1993 and 1995. Exposure data on prediagnostic factors of interest were collected through questionnaires at the beginning of the parent study. Clinical data were abstracted from medical records. Cases were followed up by means of record linkage to nationwide registers until December 31, 2002. A Cox proportional hazards regression model was used to estimate the prognostic effect of each factor in terms of hazard ratios (HR) and 95% confidence intervals (CI), following adjustment for age at diagnosis, FIGO tumor stage and WHO grade of tumor differentiation. Tumor characteristics significantly influenced the risk of death from EOC. After adjustment for these, no clear associations were detected between reproductive history (parity, age at first or last birth, oral contraceptive use, age at menarche or menopause), anthropometric characteristics (body size and shape in different periods of life), lifestyle factors before diagnosis (alcohol consumption, smoking and physical activity over the lifetime), or family history of breast or ovarian cancer, and EOC survival. Our findings indicate that these prediagnostic factors do not influence EOC survival. Nevertheless, among women with early stage disease (FIGO stage I and II), there was some indication that overweight in young adulthood or recent years increased the risk of death, while physical activity in young adult life appeared to reduce

  4. Cell-cycle protein expression in a population-based study of ovarian and endometrial cancers.

    PubMed

    Felix, Ashley S; Sherman, Mark E; Hewitt, Stephen M; Gunja, Munira Z; Yang, Hannah P; Cora, Renata L; Boudreau, Vicky; Ylaya, Kris; Lissowska, Jolanta; Brinton, Louise A; Wentzensen, Nicolas

    2015-01-01

    Aberrant expression of cyclin-dependent kinase (CDK) inhibitors is implicated in the carcinogenesis of many cancers, including ovarian and endometrial cancers. We examined associations between CDK inhibitor expression, cancer risk factors, tumor characteristics, and survival outcomes among ovarian and endometrial cancer patients enrolled in a population-based case-control study. Expression (negative vs. positive) of three CDK inhibitors (p16, p21, and p27) and ki67 was examined with immunohistochemical staining of tissue microarrays. Logistic regression was used to estimate adjusted odds ratios (ORs) and 95% confidence intervals (CIs) for associations between biomarkers, risk factors, and tumor characteristics. Survival outcomes were only available for ovarian cancer patients and examined using Kaplan-Meier plots and Cox proportional hazards regression. Among ovarian cancer patients (n = 175), positive p21 expression was associated with endometrioid tumors (OR = 12.22, 95% CI = 1.45-102.78) and higher overall survival (log-rank p = 0.002). In Cox models adjusted for stage, grade, and histology, the association between p21 expression and overall survival was borderline significant (hazard ratio = 0.65, 95% CI = 0.42-1.05). Among endometrial cancer patients (n = 289), positive p21 expression was inversely associated with age (OR ≥ 65 years of age = 0.25, 95% CI = 0.07-0.84) and current smoking status (OR: 0.33, 95% CI 0.15, 0.72) compared to negative expression. Our study showed heterogeneity in expression of cell-cycle proteins associated with risk factors and tumor characteristics of gynecologic cancers. Future studies to assess these markers of etiological classification and behavior may be warranted. PMID:25709969
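
    The log-rank comparison of overall survival by p21 status reported above can be sketched with standard survival-analysis tooling; the variable names below are assumptions, not the study's data dictionary.

```python
# Illustrative sketch: Kaplan-Meier curves and a log-rank test comparing
# ovarian cancer overall survival by p21 expression status.
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

df = pd.read_csv("ovarian_tma.csv")              # hypothetical marker/outcome file
pos = df[df["p21_positive"] == 1]
neg = df[df["p21_positive"] == 0]

km = KaplanMeierFitter()
km.fit(pos["months"], event_observed=pos["died"], label="p21 positive")
ax = km.plot_survival_function()
km.fit(neg["months"], event_observed=neg["died"], label="p21 negative")
km.plot_survival_function(ax=ax)

res = logrank_test(pos["months"], neg["months"],
                   event_observed_A=pos["died"], event_observed_B=neg["died"])
print(res.p_value)                               # analogous to the reported log-rank p
```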

  5. Childhood ADHD and Risk for Substance Dependence in Adulthood: A Longitudinal, Population-Based Study

    PubMed Central

    Levy, Sharon; Katusic, Slavica K.; Colligan, Robert C.; Weaver, Amy L.; Killian, Jill M.; Voigt, Robert G.; Barbaresi, William J.

    2014-01-01

    Background Adolescents with attention-deficit/hyperactivity disorder (ADHD) are known to be at significantly greater risk for the development of substance use disorders (SUD) compared to peers. Impulsivity, which could lead to higher levels of drug use, is a known symptom of ADHD and likely accounts, in part, for this relationship. Other factors, such as a biologically increased susceptibility to substance dependence (addiction), may also play a role. Objective This report further examines the relationships between childhood ADHD, adolescent-onset SUD, and substance abuse and substance dependence in adulthood. Method Individuals with childhood ADHD and non-ADHD controls from the same population-based birth cohort were invited to participate in a prospective outcome study. Participants completed a structured neuropsychiatric interview with modules for SUD and a psychosocial questionnaire. Information on adolescent SUD was obtained retrospectively, in a previous study, from medical and school records. Associations were summarized using odds ratios (OR) and 95% CIs estimated from logistic regression models adjusted for age and gender. Results A total of 232 ADHD cases and 335 non-ADHD controls participated (mean age, 27.0 and 28.6 years, respectively). ADHD cases were more likely than controls to have a SUD diagnosed in adolescence and were more likely to have alcohol (adjusted OR 14.38, 95% CI 1.49–138.88) and drug (adjusted OR 3.48, 95% CI 1.38–8.79) dependence in adulthood. The subgroup of participating ADHD cases who did not have SUD during adolescence were no more likely than controls to develop new-onset alcohol dependence as adults, although they were significantly more likely to develop new-onset drug dependence. Conclusions Our study found preliminary evidence that adults with childhood ADHD are more susceptible than peers to developing drug dependence, a disorder associated with neurological changes in the brain. The relationship between ADHD and
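
    The adjusted odds ratios above come from logistic regression of each adult outcome on ADHD case status, age, and gender. A minimal hedged sketch, with assumed variable names, follows.

```python
# Sketch with assumed variable names: logistic regression for adult drug
# dependence, with childhood ADHD case status adjusted for age and gender.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("adhd_followup.csv")            # hypothetical outcome file
fit = smf.logit("drug_dependence ~ adhd_case + age + male",
                data=df).fit(disp=False)
print(np.exp(fit.params["adhd_case"]))           # adjusted odds ratio
print(np.exp(fit.conf_int().loc["adhd_case"]))   # 95% confidence interval
```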

  6. Gout increases risk of fracture: A nationwide population-based cohort study.

    PubMed

    Tzeng, Huey-En; Lin, Che-Chen; Wang, I-Kuan; Huang, Po-Hao; Tsai, Chun-Hao

    2016-08-01

    There is still debate on whether high uric acid increases bone mineral density (BMD) and thereby protects against osteoporotic fracture, or whether bone resorption caused by gout inflammation outweighs any such benefit. This study aimed to evaluate whether gout has a protective effect on bone health. We conducted a nationwide population-based retrospective cohort study to evaluate the association between gout history and the risk of fracture. The retrospective cohort study was designed using claims data from the Longitudinal Health Insurance Database (LHID). A total of 43,647 subjects with gout and a cohort of 87,294 comparison subjects without gout were matched in terms of age and sex between 2001 and 2009, and the data were followed until December 31, 2011. The primary outcome of the study was fracture incidence, and the impact of gout on fracture risk was analyzed using Cox proportional hazards models. After an 11-year follow-up period, 6992 and 11,412 incident fractures were reported in the gout and comparison cohorts, respectively. The overall incidence rate of fracture in individuals with gout was nearly 23% higher than that in individuals without gout (252 vs 205 per 10,000 person-years), with an adjusted hazard ratio of 1.17 (95% confidence interval = 1.14-1.21) after adjustment for age, sex, and fracture-associated comorbidities. As for fracture locations, patients with gout were at significantly higher risk of fractures of the upper/lower limbs and spine. Among gout patients, users of allopurinol or benzbromarone had a significantly lower risk of fracture than nonusers. Gout history should be considered a risk factor for fractures, particularly in women and for fracture sites at the spine or upper/lower limbs. PMID:27559970

  7. Genocide Exposure and Subsequent Suicide Risk: A Population-Based Study.

    PubMed

    Levine, Stephen Z; Levav, Itzhak; Yoffe, Rinat; Becher, Yifat; Pugachova, Inna

    2016-01-01

    The association between periods of genocide-related exposure and suicide risk remains unknown. Our study tests that association using a national population-based study design. The source population comprised all persons born during 1922-1945 in Nazi-occupied or Nazi-dominated European nations who immigrated to Israel by 1965; they were identified in the Population Register (N = 220,665) and followed up for suicide to 2014, totaling 16,953,602 person-years. The population was disaggregated to compare a trauma gradient among groups that immigrated before (indirect exposure, n = 20,612, 9%), during (partial direct exposure, n = 17,037, 8%), or after (full direct exposure, n = 183,016, 83%) the Nazi era. The direct exposure groups were also examined with respect to pre- or postnatal exposure periods. Cox regression models were used to compute hazard ratios (HR) of suicide risk comparing the exposure groups, adjusting for confounding by gender, residential SES and history of psychiatric hospitalization. In the total population, only the partial direct exposure subgroup was at greater risk compared with the indirect exposure group (HR = 1.73, 95% CI 1.10-2.73; P < .05). That effect replicated in six sensitivity analyses. In addition, sensitivity analyses showed that exposure at age 13 or older among females, and follow-up by years since immigration, were associated with greater risk, whereas in utero exposure among persons with no psychiatric hospitalization and early postnatal exposure among males were associated with reduced risk. Tentative mechanisms implicate biopsychosocial vulnerability and natural selection during early critical periods among males, and feelings of guilt and entrapment or defeat among females. PMID:26901411

  8. Physical comorbidities in men with mood and anxiety disorders: a population-based study

    PubMed Central

    2013-01-01

    Background The mind-body nexus has been a topic of growing interest. Further data are, however, required to understand the specific relationship between mood and anxiety disorders and individual physical health conditions, and to verify whether these psychiatric disorders are linked to overall medical burden. Methods This study examined data collected from 942 men, 20 to 97 years old, participating in the Geelong Osteoporosis Study. A lifetime history of mood and anxiety disorders was identified using the Structured Clinical Interview for DSM-IV-TR Research Version, Non-patient edition (SCID-I/NP). The presence of medical conditions (lifetime) was self-reported and confirmed by medical records, medication use or clinical data. Anthropometric measurements and socioeconomic status (SES) were determined, and information on medication use and lifestyle was obtained via questionnaire. Logistic regression models were used to test the associations. Results After adjustment for age, socioeconomic status, and health risk factors (body mass index, physical activity and smoking), mood disorders were associated with gastro-oesophageal reflux disease (GORD), recurrent headaches, blackouts and/or epilepsy, liver disorders and, in older people, pulmonary disease, whilst anxiety disorders were significantly associated with thyroid disorders, GORD and other gastrointestinal disorders, and psoriasis. Increased odds of high medical burden were associated with both mood and anxiety disorders. Conclusions Our study provides further population-based evidence supporting the link between mental and physical illness in men. Understanding these associations is necessary not only for individual management, but also to inform the delivery of health promotion messages and health care. PMID:23618390

  9. The Association Between Peptic Ulcer Disease and Ischemic Stroke: A Population-Based Longitudinal Study.

    PubMed

    Cheng, Tain-Junn; Guo, How-Ran; Chang, Chia-Yu; Weng, Shih-Feng; Li, Pi-I; Wang, Jhi-Joung; Wu, Wen-Shiann

    2016-05-01

    Stroke is a common cause of death worldwide, but about 30% of ischemic stroke (IS) patients have no identifiable contributing risk factors. Because peptic ulcer disease (PUD) and vascular events share some common risk factors, we conducted a population-based study to evaluate the association between PUD and IS. We followed up a representative sample of 1 million residents of Taiwan using the National Health Insurance Research Database from 1997 to 2011. We defined patients who received medications for PUD and had related diagnosis codes as the PUD group, and a reference group matched by age and sex was sampled from those who did not have PUD. We also collected data on medical history and monthly income. Events of IS occurring after enrollment were compared between the 2 groups. The data were analyzed using Cox proportional hazards models at a two-tailed significance level of 0.05. The PUD group had higher income and a higher prevalence of hypertension, diabetes mellitus (DM), heart disease, and hyperlipidemia. They also had a higher risk of developing IS, with an adjusted hazard ratio of 1.31 (95% confidence interval: 1.20-1.41). Other independent risk factors included male sex, older age, lower income, and comorbid hypertension, DM, and heart disease. PUD is a risk factor for IS, independent of conventional risk factors such as male sex, older age, lower income, and comorbid hypertension, DM, and heart disease. Prevention strategies taking PUD into account should be developed and evaluated. PMID:27258514

  10. Population-based analysis of Alzheimer’s disease risk alleles implicates genetic interactions

    PubMed Central

    Ebbert, Mark T. W.; Ridge, Perry G.; Wilson, Andrew R.; Sharp, Aaron R.; Bailey, Matthew; Norton, Maria C.; Tschanz, JoAnn T.; Munger, Ronald G.; Corcoran, Christopher D.; Kauwe, John S. K.

    2013-01-01

    Background Reported odds ratios and population attributable fractions (PAF) for late-onset Alzheimer’s disease (LOAD) risk loci (BIN1, ABCA7, CR1, MS4A4E, CD2AP, PICALM, MS4A6A, CD33, and CLU) come from clinically ascertained samples. Little is known about the combined PAF for these LOAD risk alleles and the utility of these combined markers for case-control prediction. Here we evaluate these loci in a large population-based sample to estimate PAF and explore the effects of additive and non-additive interactions on LOAD status prediction performance. Methods 2,419 samples from the Cache County Memory Study were genotyped for APOE and nine LOAD risk loci from AlzGene.org. We used logistic regression and ROC analysis to assess the LOAD status prediction performance of these loci using additive and non-additive models, and compared ORs and PAFs between AlzGene.org and Cache County. Results Odds ratios were comparable between Cache County and AlzGene.org when identical SNPs were genotyped. PAFs from AlzGene.org ranged from 2.25–37%; those from Cache County ranged from 0.05–20%. Including non-APOE alleles significantly improved LOAD status prediction performance (AUC = 0.80) over APOE alone (AUC = 0.78) when not constrained to an additive relationship (p < 0.03). We identified potential allelic interactions (p-values uncorrected): CD33-MS4A4E (Synergy Factor = 5.31; p < 0.003) and CLU-MS4A4E (SF = 3.81; p < 0.016). Conclusions While non-additive interactions between loci significantly improve diagnostic ability, the improvement does not reach the desired sensitivity or specificity for clinical use. Nevertheless, these results suggest that understanding gene-gene interactions may be important in resolving Alzheimer’s disease etiology. PMID:23954108
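
    Two calculations in the abstract above are easy to sketch: the ROC comparison of an APOE-only model against a model with additional risk loci, and the synergy factor used to flag potential allelic interactions (the joint odds ratio divided by the product of the individual odds ratios). The code below is illustrative only; the variable names and the numeric inputs to the synergy factor are assumptions.

```python
# Sketch: AUC for APOE alone vs APOE plus other risk loci, and a synergy
# factor for a putative two-locus interaction.
import pandas as pd
import statsmodels.formula.api as smf
from sklearn.metrics import roc_auc_score

df = pd.read_csv("cache_county_genotypes.csv")   # hypothetical genotype/outcome file
m_apoe = smf.logit("load ~ apoe_e4", data=df).fit(disp=False)
m_full = smf.logit("load ~ apoe_e4 + cd33 + ms4a4e + clu + bin1",
                   data=df).fit(disp=False)
print(roc_auc_score(df["load"], m_apoe.predict(df)),   # APOE-only AUC
      roc_auc_score(df["load"], m_full.predict(df)))   # expanded-model AUC

def synergy_factor(or_joint, or_a, or_b):
    """SF > 1 suggests a supra-multiplicative (synergistic) joint effect."""
    return or_joint / (or_a * or_b)

print(synergy_factor(or_joint=4.5, or_a=1.2, or_b=0.9))  # illustrative values only
```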

  11. Factors Affecting Perceptual Threshold in Argus II Retinal Prosthesis Subjects

    PubMed Central

    Ahuja, A. K.; Yeoh, J.; Dorn, J. D.; Caspi, A.; Wuyyuru, V.; McMahon, M. J.; Humayun, M. S.; Greenberg, R. J.; daCruz, L.

    2013-01-01

    Purpose The Argus II epiretinal prosthesis has been developed to provide partial restoration of vision to subjects blinded from outer retinal degenerative disease. Participants were surgically implanted with the system in the United States and Europe in a single arm, prospective, multicenter clinical trial. The purpose of this investigation was to determine which factors affect electrical thresholds in order to inform surgical placement of the device. Methods Electrode–retina and electrode–fovea distances were determined using SD-OCT and fundus photography, respectively. Perceptual threshold to electrical stimulation of electrodes was measured using custom developed software, in which current amplitude was varied until the threshold was found. Full field stimulus light threshold was measured using the Espion D-FST test. Relationships between electrical threshold and these three explanatory variables (electrode–retina distance, electrode–fovea distance, and monocular light threshold) were quantified using regression. Results Regression analysis showed a significant correlation between electrical threshold and electrode–retina distance (R2 = 0.50, P = 0.0002; n = 703 electrodes). 90.3% of electrodes in contact with the macula (n = 207) elicited percepts at charge densities less than 1 mC/cm2/phase. These threshold data also correlated well with ganglion cell density profile (P = 0.03). A weaker, but still significant, inverse correlation was found between light threshold and electrical threshold (R2 < 0.52, P = 0.01). Multivariate modeling indicated that electrode–retina distance and light threshold are highly predictive of electrode threshold (R2 = 0.87; P < 0.0005). Conclusions Taken together, these results suggest that while light threshold should be used to inform patient selection, macular contact of the array is paramount. Translational Relevance Reported Argus II clinical study results are in good agreement with prior in vitro and in vivo studies

  12. Energy Switching Threshold for Climatic Benefits

    NASA Astrophysics Data System (ADS)

    Zhang, X.; Cao, L.; Caldeira, K.

    2013-12-01

    Climate change is one of the great challenges facing humanity now and in the future. Its most severe impacts may still be avoided if efforts are made to transform current energy systems (1). A transition from the global system of high greenhouse gas (GHG) emission electricity generation to low-GHG-emission energy technologies is required to mitigate climate change (2). Natural gas is increasingly seen as an option in the transition to renewable sources. However, recent research in energy and climate has raised questions about the climate implications of relying more heavily on natural gas. On one hand, a shift to natural gas is promoted as climate mitigation because it emits less carbon per unit of energy than coal (3). On the other hand, the effect of switching to natural gas on the development of nuclear power and other renewable energy sources may offset the benefits of fuel switching (4). Cheap natural gas is causing both coal plants and nuclear plants to close in the US. The objective of this study is to measure and evaluate the threshold of energy switching for climatic benefits. We hypothesized that the threshold ratio of energy switching for climatic benefits is related to the GHG emission factors of the energy technologies involved, but that the relation is not linear. A model was developed to study the fuel-switching threshold for greenhouse gas emission reduction, and the transition from coal and nuclear electricity generation to natural gas electricity generation was analyzed as a case study. The results showed that: (i) the threshold ratio of multi-energy switching for climatic benefits changes with the GHG emission factors of the energy technologies; (ii) the mathematical relation between the threshold ratio of energy switching and the GHG emission factors is a curved-surface function; (iii) the analysis of the energy-switching threshold for climatic benefits can be used to support energy and climate policy decisions.
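
    A hedged sketch of the break-even logic behind such a threshold: displacing a coal/nuclear generation mix with natural gas reduces emissions only if the emission factor of the displaced mix exceeds that of gas, which fixes a minimum coal share in the displaced mix. The emission factors below are illustrative placeholders, not the study's values.

```python
# Break-even sketch: minimum coal fraction in the displaced generation mix for
# a switch to natural gas to reduce emissions (emission factors in gCO2-eq/kWh
# are illustrative, not taken from the study).
E_COAL, E_GAS, E_NUCLEAR = 1000.0, 500.0, 20.0

def coal_share_threshold(e_coal=E_COAL, e_gas=E_GAS, e_nuclear=E_NUCLEAR):
    """Solve e_gas = x*e_coal + (1-x)*e_nuclear for the coal share x."""
    return (e_gas - e_nuclear) / (e_coal - e_nuclear)

x = coal_share_threshold()
print(f"switching pays off only if coal is > {x:.2f} of the displaced mix")  # ~0.49 here
```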

  13. Maternal Use of Antibiotics and the Risk of Childhood Febrile Seizures: A Danish Population-Based Cohort

    PubMed Central

    Miller, Jessica E.; Pedersen, Lars Henning; Vestergaard, Mogens; Olsen, Jørn

    2013-01-01

    Objective To examine, in a large population-based cohort in Denmark, whether maternal use of antibiotics during pregnancy, as a marker of infection, increases the risk of febrile seizures in childhood. Methods All live-born singletons born in Denmark between January 1, 1996 and September 25, 2004 who were alive on the 90th day of life were identified from the Danish National Birth Registry. Diagnoses of febrile seizures were obtained from the Danish National Hospital Register, and maternal use of antibiotics was obtained from the National Register of Medicinal Product Statistics. Hazard ratios (HR) and 95% confidence intervals (95% CI) were estimated by Cox proportional hazards regression models. Results We followed 551,518 singletons for up to 5 years and identified a total of 21,779 children with a diagnosis of febrile seizures. Slightly increased hazard ratios were observed in most exposure groups when compared with the unexposed group (e.g., HR 1.08, 95% CI 1.05–1.11 for use of any systemic antibiotic during pregnancy). Conclusion We found weak associations between the use of pharmacologically different antibiotics during pregnancy and febrile seizures in early childhood, which may indicate that some infections, or causes or effects of infections, during pregnancy could affect the fetal brain and induce susceptibility to febrile seizures. PMID:23613800

  14. Cost-effectiveness analysis of population-based screening of hepatocellular carcinoma: Comparing ultrasonography with two-stage screening

    PubMed Central

    Kuo, Ming-Jeng; Chen, Hsiu-Hsi; Chen, Chi-Ling; Fann, Jean Ching-Yuan; Chen, Sam Li-Sheng; Chiu, Sherry Yueh-Hsia; Lin, Yu-Min; Liao, Chao-Sheng; Chang, Hung-Chuen; Lin, Yueh-Shih; Yen, Amy Ming-Fang

    2016-01-01

    AIM: To assess the cost-effectiveness of two population-based hepatocellular carcinoma (HCC) screening programs: a two-stage biomarker-ultrasound approach and mass screening using abdominal ultrasonography (AUS). METHODS: In this study, we applied a Markov decision model with a societal perspective and a lifetime horizon to general population-based cohorts in an area with high HCC incidence, such as Taiwan. The accuracy of biomarkers and ultrasonography was estimated from published meta-analyses. The costs of surveillance, diagnosis, and treatment were based on a combination of published literature, Medicare payments, and medical expenditure at the National Taiwan University Hospital. The main outcome measure was cost per life-year gained with a 3% annual discount rate. RESULTS: The results show that mass screening using AUS was associated with an incremental cost-effectiveness ratio of USD 39,825 per life-year gained, whereas two-stage screening was associated with an incremental cost-effectiveness ratio of USD 49,733 per life-year gained, as compared with no screening. Screening programs with an initial screening age of 50 years and a biennial screening interval were the most cost-effective. These findings were sensitive to the costs of the screening tools and the specificity of biomarker screening. CONCLUSION: Mass screening using AUS is more cost-effective than two-stage biomarker-ultrasound screening. The optimal strategy is an initial screening age of 50 years with a 2-year inter-screening interval. PMID:27022228
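
    The incremental cost-effectiveness ratios above follow the standard arithmetic of discounting costs and life-years, then dividing the incremental cost by the incremental effectiveness against the comparator. A minimal sketch with placeholder numbers (none of them the study's estimates):

```python
# Minimal sketch of the cost-effectiveness arithmetic: discount yearly costs
# and life-years at 3%, then form an ICER against a no-screening comparator.
def discounted_sum(stream, rate=0.03):
    return sum(v / (1 + rate) ** t for t, v in enumerate(stream))

def icer(cost_new, ly_new, cost_ref, ly_ref):
    return (cost_new - cost_ref) / (ly_new - ly_ref)

cost_screen = discounted_sum([1200, 300, 300, 300])   # USD per person (placeholder)
cost_none   = discounted_sum([0, 150, 400, 900])      # USD per person (placeholder)
ly_screen   = discounted_sum([1.0, 1.0, 1.0, 0.9])    # life-years (placeholder)
ly_none     = discounted_sum([1.0, 1.0, 0.95, 0.8])   # life-years (placeholder)

print(f"ICER ~ USD {icer(cost_screen, ly_screen, cost_none, ly_none):,.0f} per life-year gained")
```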

  15. Risk of Peripheral Artery Disease in Patients With Carbon Monoxide Poisoning: A Population-Based Retrospective Cohort Study.

    PubMed

    Chen, Yu-Guang; Lin, Te-Yu; Dai, Ming-Shen; Lin, Cheng-Li; Hung, Yuan; Huang, Wen-Sheng; Kao, Chia-Hung

    2015-10-01

    Carbon monoxide (CO) poisoning can cause several life-threatening complications, particularly in the cardiovascular and neurological systems. However, no studies have investigated the association between peripheral artery disease (PAD) and CO poisoning. We constructed a population-based retrospective cohort study to clarify the association between CO poisoning and PAD. This cohort study analyzed data from 1998 to 2010 obtained from the Taiwanese National Health Insurance Research Database, with a follow-up period extending to the end of 2011. We identified patients with CO poisoning and selected a comparison cohort that was frequency matched according to age, sex, and year of diagnosis of CO poisoning at a ratio of 1 patient to 4 control patients. We analyzed the risk of PAD in patients with CO poisoning using Cox proportional hazards regression models. In this study, 9046 patients with CO poisoning and 36,183 controls were included. The overall risk of developing PAD was 1.85-fold higher in patients with CO poisoning than in the comparison cohort after adjusting for age, sex, and comorbidities. Our long-term cohort study showed a higher risk of PAD development among patients with CO poisoning. PMID:26448007

  16. Understanding Risk and Protective Factors for Child Maltreatment: The Value of Integrated, Population-Based Data

    ERIC Educational Resources Information Center

    Putnam-Hornstein, Emily; Needell, Barbara; Rhodes, Anne E.

    2013-01-01

    In this article, we argue for expanded efforts to integrate administrative data systems as a "practical strategy" for developing a richer understanding of child abuse and neglect. Although the study of child maltreatment is often critiqued for being atheoretical, we believe that a more pressing concern is the absence of population-based and…

  17. Paediatric cancer stage in population-based cancer registries: the Toronto consensus principles and guidelines.

    PubMed

    Gupta, Sumit; Aitken, Joanne F; Bartels, Ute; Brierley, James; Dolendo, Mae; Friedrich, Paola; Fuentes-Alabi, Soad; Garrido, Claudia P; Gatta, Gemma; Gospodarowicz, Mary; Gross, Thomas; Howard, Scott C; Molyneux, Elizabeth; Moreno, Florencia; Pole, Jason D; Pritchard-Jones, Kathy; Ramirez, Oscar; Ries, Lynn A G; Rodriguez-Galindo, Carlos; Shin, Hee Young; Steliarova-Foucher, Eva; Sung, Lillian; Supriyadi, Eddy; Swaminathan, Rajaraman; Torode, Julie; Vora, Tushar; Kutluk, Tezer; Frazier, A Lindsay

    2016-04-01

    Population-based cancer registries generate estimates of incidence and survival that are essential for cancer surveillance, research, and control strategies. Although data on cancer stage allow meaningful assessments of changes in cancer incidence and outcomes, stage is not recorded by most population-based cancer registries. The main method of staging adult cancers is the TNM classification. The criteria for staging paediatric cancers, however, vary by diagnosis, have evolved over time, and sometimes vary by cooperative trial group. Consistency in the collection of staging data has therefore been challenging for population-based cancer registries. We assembled key experts and stakeholders (oncologists, cancer registrars, epidemiologists) and used a modified Delphi approach to establish principles for paediatric cancer stage collection. In this Review, we make recommendations on which staging systems should be adopted by population-based cancer registries for the major childhood cancers, including adaptations for low-income countries. Wide adoption of these guidelines in registries will ease international comparative incidence and outcome studies. PMID:27300676

  18. Asthma and Attention-Deficit/Hyperactivity Disorder: A Nationwide Population-Based Prospective Cohort Study

    ERIC Educational Resources Information Center

    Chen, Mu-Hong; Su, Tung-Ping; Chen, Ying-Sheue; Hsu, Ju-Wei; Huang, Kai-Lin; Chang, Wen-Han; Chen, Tzeng-Ji; Bai, Ya-Mei

    2013-01-01

    Background: Previous cross-sectional studies have suggested an association between asthma and attention-deficit/hyperactivity disorder (ADHD), but the temporal relationship was not determined. Using a nationwide population-based prospective case-control cohort study (1:4, age-/gender-matched), we hypothesized that asthma in infanthood or early…

  19. Mortality in Adults with Moderate to Profound Intellectual Disability: A Population-Based Study

    ERIC Educational Resources Information Center

    Tyrer, F.; Smith, L. K.; McGrother, C. W.

    2007-01-01

    Background: People with intellectual disability (ID) experience a variety of health inequalities compared with the general population including higher mortality rates. This is the first UK population-based study to measure the extent of excess mortality in people with ID compared with the general population. Method: Indirectly standardized…

  20. Minor Self-Harm and Psychiatric Disorder: A Population-Based Study

    ERIC Educational Resources Information Center

    Skegg, Keren; Nada-Raja, Shyamala; Moffit, Terrie E.

    2004-01-01

    Little is known about the extent to which minor self-harm in the general population is associated with psychiatric disorder. A population-based sample of 980 young adults was interviewed independently about past-year suicidal and self-harm behavior and thoughts, and psychiatric disorders. Self-harm included self-harmful behaviors such as…

  1. A Population-Based Study of Preschoolers' Food Neophobia and Its Associations with Food Preferences

    ERIC Educational Resources Information Center

    Russell, Catherine Georgina; Worsley, Anthony

    2008-01-01

    Objective: This cross-sectional study was designed to investigate the relationships between food preferences, food neophobia, and children's characteristics among a population-based sample of preschoolers. Design: A parent-report questionnaire. Setting: Child-care centers, kindergartens, playgroups, day nurseries, and swimming centers. Subjects:…

  2. Epilepsy Among Children and Adolescents with Autism Spectrum Disorders: A Population-Based Study

    ERIC Educational Resources Information Center

    Jokiranta, Elina; Sourander, Andre; Suominen, Auli; Timonen-Soivio, Laura; Brown, Alan S.; Sillanpää, Matti

    2014-01-01

    The present population-based study examines associations between epilepsy and autism spectrum disorders (ASD). The cohort includes register data of 4,705 children born between 1987 and 2005 and diagnosed as cases of childhood autism, Asperger's syndrome or pervasive developmental disorders--not otherwise specified. Each case was matched to…

  3. Long-Term Benefits of Full-Day Kindergarten: A Longitudinal Population-Based Study

    ERIC Educational Resources Information Center

    Brownell, M. D.; Nickel, N. C.; Chateau, D.; Martens, P. J.; Taylor, C.; Crockett, L.; Katz, A.; Sarkar, J.; Burland, E.; Goh, C. Y.

    2015-01-01

    In the first longitudinal, population-based study of full-day kindergarten (FDK) outcomes beyond primary school in Canada, we used linked administrative data to follow 15 kindergarten cohorts (n ranging from 112 to 736) up to grade 9. Provincial assessments conducted in grades 3, 7, and 8 and course marks and credits earned in grade 9 were…