Sample records for linear non-threshold (LNT) dose-based risk

  1. Regulatory implications of a linear non-threshold (LNT) dose-based risks.

    PubMed

    Aleta, C R

    2009-01-01

    Current radiation protection regulatory limits are based on the linear non-threshold (LNT) theory using health data from atomic bombing survivors. Studies in recent years sparked debate on the validity of the theory, especially at low doses. The present LNT overestimates radiation risks since the dosimetry included only acute gammas and neutrons; the role of other bomb-caused factors, e.g. fallout, induced radioactivity, thermal radiation (UVR), electromagnetic pulse (EMP), and blast, were excluded. Studies are proposed to improve the dose-response relationship.

  2. The threshold vs LNT showdown: Dose rate findings exposed flaws in the LNT model part 2. How a mistake led BEIR I to adopt LNT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Calabrese, Edward J., E-mail: edwardc@schoolph.uma

This paper reveals that nearly 25 years after the BEIR I Committee used Russell's dose-rate data to support the adoption of the linear-no-threshold (LNT) dose response model for genetic and cancer risk assessment, Russell acknowledged a significant under-reporting of the mutation rate of the historical control group. This error, which was unknown to BEIR I, had profound implications, leading it to incorrectly adopt the LNT model, a decision that profoundly changed the course of risk assessment for radiation and chemicals to the present. -- Highlights: • The BEAR I Genetics Panel made an error in denying dose rate for mutation. • The BEIR I Genetics Subcommittee attempted to correct this dose rate error. • The control group used for risk assessment by BEIR I is now known to be in error. • Correcting this error contradicts the LNT, supporting a threshold model.

  3. The threshold vs LNT showdown: Dose rate findings exposed flaws in the LNT model part 1. The Russell-Muller debate

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Calabrese, Edward J., E-mail: edwardc@schoolph.uma

This paper assesses the discovery of the dose-rate effect in radiation genetics and how it challenged fundamental tenets of the linear non-threshold (LNT) dose response model, including the assumptions that all mutational damage is cumulative and irreversible and that the dose-response is linear at low doses. Newly uncovered historical information also describes how a key 1964 report by the International Commission for Radiological Protection (ICRP) addressed the effects of dose rate in the assessment of genetic risk. This unique story involves assessments by two leading radiation geneticists, Hermann J. Muller and William L. Russell, who independently argued that the report's Genetic Summary Section on dose rate was incorrect while simultaneously offering vastly different views as to what the report's summary should have contained. This paper reveals occurrences of scientific disagreements, how conflicts were resolved, which view(s) prevailed and why. During this process the Nobel Laureate, Muller, provided incorrect information to the ICRP in what appears to have been an attempt to manipulate the decision-making process and to prevent the dose-rate concept from being adopted into risk assessment practices. - Highlights: • The discovery of radiation dose rate challenged the scientific basis of LNT. • Radiation dose-rate effects occurred in males and females. • The dose rate concept supported a threshold dose-response for radiation.

  4. The threshold vs LNT showdown: Dose rate findings exposed flaws in the LNT model part 2. How a mistake led BEIR I to adopt LNT.

    PubMed

    Calabrese, Edward J

    2017-04-01

    This paper reveals that nearly 25 years after the National Academy of Sciences (NAS), Biological Effects of Ionizing Radiation (BEIR) I Committee (1972) used Russell's dose-rate data to support the adoption of the linear-no-threshold (LNT) dose response model for genetic and cancer risk assessment, Russell acknowledged a significant under-reporting of the mutation rate of the historical control group. This error, which was unknown to BEIR I, had profound implications, leading it to incorrectly adopt the LNT model, which was a decision that profoundly changed the course of risk assessment for radiation and chemicals to the present. Copyright © 2016 Elsevier Inc. All rights reserved.

  5. LNT IS THE BEST WE CAN DO - TO-DAY

    EPA Science Inventory

    Abstract

    The form of the dose-response curve for radiation-induced cancers, particularly at low doses, is the subject of an ongoing and spirited debate. The present review describes the current data base and basis for establishing a low dose, linear no threshold (LNT) mode...

  6. Observations on the Chernobyl Disaster and LNT.

    PubMed

    Jaworowski, Zbigniew

    2010-01-28

The Chernobyl accident was probably the worst possible catastrophe of a nuclear power station. It was the only such catastrophe since the advent of nuclear power 55 years ago. It resulted in a total meltdown of the reactor core, a vast emission of radionuclides, and early deaths of only 31 persons. Its enormous political, economic, social and psychological impact was mainly due to deeply rooted fear of radiation induced by the linear non-threshold hypothesis (LNT) assumption. It was a historic event that provided invaluable lessons for nuclear industry and risk philosophy. One of them is the demonstration that, counted per electricity units produced, early Chernobyl fatalities amounted to 0.86 death/GWe-year, 47 times lower than from hydroelectric stations (approximately 40 deaths/GWe-year). The accident demonstrated that using the LNT assumption as a basis for protection measures and radiation dose limitations was counterproductive, and led to suffering and pauperization of millions of inhabitants of contaminated areas. The projections of thousands of late cancer deaths based on LNT are in conflict with observations that, in comparison with the general population of Russia, a 15% to 30% deficit of solid cancer mortality was found among the Russian emergency workers, and a 5% deficit of solid cancer incidence among the population of the most contaminated areas.

  7. Do non-targeted effects increase or decrease low dose risk in relation to the linear-non-threshold (LNT) model?☆

    PubMed Central

    Little, M.P.

    2011-01-01

    In this paper we review the evidence for departure from linearity for malignant and non-malignant disease and in the light of this assess likely mechanisms, and in particular the potential role for non-targeted effects. Excess cancer risks observed in the Japanese atomic bomb survivors and in many medically and occupationally exposed groups exposed at low or moderate doses are generally statistically compatible. For most cancer sites the dose–response in these groups is compatible with linearity over the range observed. The available data on biological mechanisms do not provide general support for the idea of a low dose threshold or hormesis. This large body of evidence does not suggest, indeed is not statistically compatible with, any very large threshold in dose for cancer, or with possible hormetic effects, and there is little evidence of the sorts of non-linearity in response implied by non-DNA-targeted effects. There are also excess risks of various types of non-malignant disease in the Japanese atomic bomb survivors and in other groups. In particular, elevated risks of cardiovascular disease, respiratory disease and digestive disease are observed in the A-bomb data. In contrast with cancer, there is much less consistency in the patterns of risk between the various exposed groups; for example, radiation-associated respiratory and digestive diseases have not been seen in these other (non-A-bomb) groups. Cardiovascular risks have been seen in many exposed populations, particularly in medically exposed groups, but in contrast with cancer there is much less consistency in risk between studies: risks per unit dose in epidemiological studies vary over at least two orders of magnitude, possibly a result of confounding and effect modification by well known (but unobserved) risk factors. In the absence of a convincing mechanistic explanation of epidemiological evidence that is, at present, less than persuasive, a cause-and-effect interpretation of the reported

  8. Observations on the Chernobyl Disaster and LNT

    PubMed Central

    Jaworowski, Zbigniew

    2010-01-01

The Chernobyl accident was probably the worst possible catastrophe of a nuclear power station. It was the only such catastrophe since the advent of nuclear power 55 years ago. It resulted in a total meltdown of the reactor core, a vast emission of radionuclides, and early deaths of only 31 persons. Its enormous political, economic, social and psychological impact was mainly due to deeply rooted fear of radiation induced by the linear non-threshold hypothesis (LNT) assumption. It was a historic event that provided invaluable lessons for nuclear industry and risk philosophy. One of them is the demonstration that, counted per electricity units produced, early Chernobyl fatalities amounted to 0.86 death/GWe-year, 47 times lower than from hydroelectric stations (∼40 deaths/GWe-year). The accident demonstrated that using the LNT assumption as a basis for protection measures and radiation dose limitations was counterproductive, and led to suffering and pauperization of millions of inhabitants of contaminated areas. The projections of thousands of late cancer deaths based on LNT are in conflict with observations that, in comparison with the general population of Russia, a 15% to 30% deficit of solid cancer mortality was found among the Russian emergency workers, and a 5% deficit of solid cancer incidence among the population of the most contaminated areas. PMID:20585443

  9. Model Uncertainty via the Integration of Hormesis and LNT as the Default in Cancer Risk Assessment.

    PubMed

    Calabrese, Edward J

    2015-01-01

On June 23, 2015, the US Nuclear Regulatory Commission (NRC) issued a formal notice in the Federal Register that it would consider whether "it should amend its 'Standards for Protection Against Radiation' regulations from the linear non-threshold (LNT) model of radiation protection to the hormesis model." The present commentary supports this recommendation based on (1) the flawed and deceptive history of the adoption of LNT by the US National Academy of Sciences (NAS) in 1956; (2) the documented capacity of hormesis to make more accurate predictions of biological responses for diverse biological end points in the low-dose zone; (3) the occurrence of extensive hormetic data from the peer-reviewed biomedical literature that revealed hormetic responses are highly generalizable, being independent of biological model, end point measured, inducing agent, level of biological organization, and mechanism; and (4) the integration of hormesis and LNT models via a model uncertainty methodology that optimizes public health responses at 10⁻⁴. Thus, both LNT and hormesis can be integratively used for risk assessment purposes, and this integration defines the so-called "regulatory sweet spot."

  10. Time to Reject the Linear-No Threshold Hypothesis and Accept Thresholds and Hormesis: A Petition to the U.S. Nuclear Regulatory Commission.

    PubMed

    Marcus, Carol S

    2015-07-01

    On February 9, 2015, I submitted a petition to the U.S. Nuclear Regulatory Commission (NRC) to reject the linear-no threshold (LNT) hypothesis and ALARA as the bases for radiation safety regulation in the United States, using instead threshold and hormesis evidence. In this article, I will briefly review the history of LNT and its use by regulators, the lack of evidence supporting LNT, and the large body of evidence supporting thresholds and hormesis. Physician acceptance of cancer risk from low dose radiation based upon federal regulatory claims is unfortunate and needs to be reevaluated. This is dangerous to patients and impedes good medical care. A link to my petition is available: http://radiationeffects.org/wp-content/uploads/2015/03/Hormesis-Petition-to-NRC-02-09-15.pdf, and support by individual physicians once the public comment period begins would be extremely important.

  11. Radiation Hormesis: Historical Perspective and Implications for Low-Dose Cancer Risk Assessment

    PubMed Central

    Vaiserman, Alexander M.

    2010-01-01

    Current guidelines for limiting exposure of humans to ionizing radiation are based on the linear-no-threshold (LNT) hypothesis for radiation carcinogenesis under which cancer risk increases linearly as the radiation dose increases. With the LNT model even a very small dose could cause cancer and the model is used in establishing guidelines for limiting radiation exposure of humans. A slope change at low doses and dose rates is implemented using an empirical dose and dose rate effectiveness factor (DDREF). This imposes usually unacknowledged nonlinearity but not a threshold in the dose-response curve for cancer induction. In contrast, with the hormetic model, low doses of radiation reduce the cancer incidence while it is elevated after high doses. Based on a review of epidemiological and other data for exposure to low radiation doses and dose rates, it was found that the LNT model fails badly. Cancer risk after ordinarily encountered radiation exposure (medical X-rays, natural background radiation, etc.) is much lower than projections based on the LNT model and is often less than the risk for spontaneous cancer (a hormetic response). Understanding the mechanistic basis for hormetic responses will provide new insights about both risks and benefits from low-dose radiation exposure. PMID:20585444

  12. Response to, "On the origins of the linear no-threshold (LNT) dogma by means of untruths, artful dodges and blind faith.".

    PubMed

    Beyea, Jan

    2016-07-01

It is not true that successive groups of researchers from academia and research institutions-scientists who served on panels of the US National Academy of Sciences (NAS)-were duped into supporting a linear no-threshold model (LNT) by the opinions expressed in the genetic panel section of the 1956 "BEAR I" report. Successor reports had their own views of the LNT model, relying on mouse and human data, not fruit fly data. Nor was the 1956 report biased and corrupted, as has been charged in an article by Edward J. Calabrese in this journal. With or without BEAR I, the LNT model would likely have been accepted in the US for radiation protection purposes in the 1950s. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  13. On the origins of the linear no-threshold (LNT) dogma by means of untruths, artful dodges and blind faith

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Calabrese, Edward J., E-mail: edwardc@schoolph.umass.edu

This paper is an historical assessment of how prominent radiation geneticists in the United States during the 1940s and 1950s successfully worked to build acceptance for the linear no-threshold (LNT) dose–response model in risk assessment, significantly impacting environmental, occupational and medical exposure standards and practices to the present time. Detailed documentation indicates that actions taken in support of this policy revolution were ideologically driven and deliberately and deceptively misleading; that scientific records were artfully misrepresented; and that people and organizations in positions of public trust failed to perform the duties expected of them. Key activities are described and the roles of specific individuals are documented. These actions culminated in a 1956 report by a Genetics Panel of the U.S. National Academy of Sciences (NAS) on Biological Effects of Atomic Radiation (BEAR). In this report the Genetics Panel recommended that a linear dose response model be adopted for the purpose of risk assessment, a recommendation that was rapidly and widely promulgated. The paper argues that current international cancer risk assessment policies are based on fraudulent actions of the U.S. NAS BEAR I Committee, Genetics Panel and on the uncritical, unquestioning and blind-faith acceptance by regulatory agencies and the scientific community. - Highlights: • The 1956 recommendation of the US NAS to use the LNT for risk assessment was adopted worldwide. • This recommendation is based on a falsification of the research record and represents scientific misconduct. • The record misrepresented the magnitude of panelist disagreement of genetic risk from radiation. • These actions enhanced public acceptance of their risk assessment policy recommendations.

  14. Dose Response Data for Hormonally Active Chemicals ...

    EPA Pesticide Factsheets

The shape of the dose response curve in the low dose region has been debated since the late 1940s. The debate originally focused on linear no threshold (LNT) vs threshold responses in the low dose range for cancer and noncancer related effects. For noncancer effects the default assumption is that noncancer effects generally display threshold rather than LNT responses. More recently, claims have arisen that chemicals, like endocrine disrupters (EDs), which act via high affinity, low capacity nuclear receptors, may display LNT or nonmonotonic low dose responses: responses that could be missed in multigenerational guideline toxicity testing. This presentation will discuss LNT, threshold and nonmonotonic dose response relationships from case studies of chemicals that disrupt reproductive development and function via the ER, AR and AhR pathways and will include in vitro and in vivo multigenerational data. The in vivo studies in this discussion include only robust, well designed, comprehensive studies that administered the chemical via a relevant route(s) of exposure over a broad dose response range, including low dose(s) in the microgram/kg/d range. The chemicals include ethinyl estradiol, estradiol, genistein, bisphenol A, trenbolone, finasteride, flutamide, phthalate esters and 2,3,7,8-TCDD. The objective is to critically evaluate the data from well done studies in this field to address concerns that current multigenerational reproductive test gui

  15. The linear nonthreshold (LNT) model as used in radiation protection: an NCRP update.

    PubMed

    Boice, John D

    2017-10-01

    The linear nonthreshold (LNT) model has been used in radiation protection for over 40 years and has been hotly debated. It relies heavily on human epidemiology, with support from radiobiology. The scientific underpinnings include NCRP Report No. 136 ('Evaluation of the Linear-Nonthreshold Dose-Response Model for Ionizing Radiation'), UNSCEAR 2000, ICRP Publication 99 (2004) and the National Academies BEIR VII Report (2006). NCRP Scientific Committee 1-25 is reviewing recent epidemiologic studies focusing on dose-response models, including threshold, and the relevance to radiation protection. Recent studies after the BEIR VII Report are being critically reviewed and include atomic-bomb survivors, Mayak workers, atomic veterans, populations on the Techa River, U.S. radiological technologists, the U.S. Million Person Study, international workers (INWORKS), Chernobyl cleanup workers, children given computerized tomography scans, and tuberculosis-fluoroscopy patients. Methodologic limitations, dose uncertainties and statistical approaches (and modeling assumptions) are being systematically evaluated. The review of studies continues and will be published as an NCRP commentary in 2017. Most studies reviewed to date are consistent with a straight-line dose response but there are a few exceptions. In the past, the scientific consensus process has worked in providing practical and prudent guidance. So pragmatic judgment is anticipated. The evaluations are ongoing and the extensive NCRP review process has just begun, so no decisions or recommendations are in stone. The march of science requires a constant assessment of emerging evidence to provide an optimum, though not necessarily perfect, approach to radiation protection. Alternatives to the LNT model may be forthcoming, e.g. an approach that couples the best epidemiology with biologically-based models of carcinogenesis, focusing on chronic (not acute) exposure circumstances. Currently for the practical purposes of

  16. Commentary: Ethical Issues of Current Health-Protection Policies on Low-Dose Ionizing Radiation

    PubMed Central

    Socol, Yehoshua; Dobrzyński, Ludwik; Doss, Mohan; Feinendegen, Ludwig E.; Janiak, Marek K.; Miller, Mark L.; Sanders, Charles L.; Scott, Bobby R.; Ulsh, Brant; Vaiserman, Alexander

    2014-01-01

    The linear no-threshold (LNT) model of ionizing-radiation-induced cancer is based on the assumption that every radiation dose increment constitutes increased cancer risk for humans. The risk is hypothesized to increase linearly as the total dose increases. While this model is the basis for radiation safety regulations, its scientific validity has been questioned and debated for many decades. The recent memorandum of the International Commission on Radiological Protection admits that the LNT-model predictions at low doses are “speculative, unproven, undetectable and ‘phantom’.” Moreover, numerous experimental, ecological, and epidemiological studies show that low doses of sparsely-ionizing or sparsely-ionizing plus highly-ionizing radiation may be beneficial to human health (hormesis/adaptive response). The present LNT-model-based regulations impose excessive costs on the society. For example, the median-cost medical program is 5000 times more cost-efficient in saving lives than controlling radiation emissions. There are also lives lost: e.g., following Fukushima accident, more than 1000 disaster-related yet non-radiogenic premature deaths were officially registered among the population evacuated due to radiation concerns. Additional negative impacts of LNT-model-inspired radiophobia include: refusal of some patients to undergo potentially life-saving medical imaging; discouragement of the study of low-dose radiation therapies; motivation for radiological terrorism and promotion of nuclear proliferation. PMID:24910586

  17. Linear-No-Threshold Default Assumptions for Noncancer and Nongenotoxic Cancer Risks: A Mathematical and Biological Critique.

    PubMed

    Bogen, Kenneth T

    2016-03-01

    To improve U.S. Environmental Protection Agency (EPA) dose-response (DR) assessments for noncarcinogens and for nonlinear mode of action (MOA) carcinogens, the 2009 NRC Science and Decisions Panel recommended that the adjustment-factor approach traditionally applied to these endpoints should be replaced by a new default assumption that both endpoints have linear-no-threshold (LNT) population-wide DR relationships. The panel claimed this new approach is warranted because population DR is LNT when any new dose adds to a background dose that explains background levels of risk, and/or when there is substantial interindividual heterogeneity in susceptibility in the exposed human population. Mathematically, however, the first claim is either false or effectively meaningless and the second claim is false. Any dose-and population-response relationship that is statistically consistent with an LNT relationship may instead be an additive mixture of just two quasi-threshold DR relationships, which jointly exhibit low-dose S-shaped, quasi-threshold nonlinearity just below the lower end of the observed "linear" dose range. In this case, LNT extrapolation would necessarily overestimate increased risk by increasingly large relative magnitudes at diminishing values of above-background dose. The fact that chemically-induced apoptotic cell death occurs by unambiguously nonlinear, quasi-threshold DR mechanisms is apparent from recent data concerning this quintessential toxicity endpoint. The 2009 NRC Science and Decisions Panel claims and recommendations that default LNT assumptions be applied to DR assessment for noncarcinogens and nonlinear MOA carcinogens are therefore not justified either mathematically or biologically. © 2015 The Author. Risk Analysis published by Wiley Periodicals, Inc. on behalf of Society for Risk Analysis.
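Bogen's central mathematical claim above — that data statistically consistent with an LNT relationship may instead arise from a mixture of two quasi-threshold (S-shaped) dose-response components — can be illustrated with a small numerical sketch. The sigmoid parameters and mixture weights below are invented purely for illustration and are not taken from the paper:

```python
import numpy as np

# Two quasi-threshold (S-shaped) subpopulation dose-response curves.
def sigmoid(d, d50, slope):
    return 1.0 / (1.0 + np.exp(-slope * (d - d50)))

doses = np.linspace(1.0, 10.0, 50)  # the "observed" dose range

# Equal-weight mixture of two components with different midpoints.
mix = 0.5 * sigmoid(doses, 3.0, 1.0) + 0.5 * sigmoid(doses, 8.0, 1.0)

# Fit a straight line over the observed range and measure fit quality.
coef = np.polyfit(doses, mix, 1)          # [slope, intercept]
resid = mix - np.polyval(coef, doses)
r2 = 1.0 - np.sum(resid**2) / np.sum((mix - mix.mean())**2)
```

Over the observed range the mixture is nearly indistinguishable from a straight line (r² well above 0.99 here), even though each component is strongly nonlinear at low dose, which is the ambiguity the abstract describes.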

  18. Nonmonotonic Dose-Response Curves and Endocrine-Disrupting Chemicals: Fact or Falderal?**

    EPA Science Inventory

    Nonmonotonic Dose-Response Curves and Endocrine-Disrupting Chemicals: Fact or Falderal? The shape of the dose response curve in the low dose region has been debated since the 1940s, originally focusing on linear no threshold (LNT) versus threshold responses for cancer and noncanc...

  19. Dose Response Data for Hormonally Active Chemicals: Estrogens, Antiandrogens and Androgens

    EPA Science Inventory

    The shape of the dose response curve in the low dose region has been debated since the late 1940s. The debate originally focused on linear no threshold (LNT) vs threshold responses in the low dose range for cancer and noncancer related effects. For noncancer effects the defaul...

  20. Advanced Computational Approaches for Characterizing Stochastic Cellular Responses to Low Dose, Low Dose Rate Exposures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scott, Bobby, R., Ph.D.

    2003-06-27

OAK - B135 This project final report summarizes modeling research conducted in the U.S. Department of Energy (DOE), Low Dose Radiation Research Program at the Lovelace Respiratory Research Institute from October 1998 through June 2003. The modeling research described involves critically evaluating the validity of the linear nonthreshold (LNT) risk model as it relates to stochastic effects induced in cells by low doses of ionizing radiation and genotoxic chemicals. The LNT model plays a central role in low-dose risk assessment for humans. With the LNT model, any radiation (or genotoxic chemical) exposure is assumed to increase one's risk of cancer. Based on the LNT model, others have predicted tens of thousands of cancer deaths related to environmental exposure to radioactive material from nuclear accidents (e.g., Chernobyl) and fallout from nuclear weapons testing. Our research has focused on developing biologically based models that explain the shape of dose-response curves for low-dose radiation and genotoxic chemical-induced stochastic effects in cells. Understanding the shape of the dose-response curve for radiation and genotoxic chemical-induced stochastic effects in cells helps to better understand the shape of the dose-response curve for cancer induction in humans. We have used a modeling approach that facilitated model revisions over time, allowing for timely incorporation of new knowledge gained related to the biological basis for low-dose-induced stochastic effects in cells. Both deleterious (e.g., genomic instability, mutations, and neoplastic transformation) and protective (e.g., DNA repair and apoptosis) effects have been included in our modeling. Our most advanced model, NEOTRANS2, involves differing levels of genomic instability. Persistent genomic instability is presumed to be associated with nonspecific, nonlethal mutations and to increase both the risk for neoplastic transformation and for cancer occurrence. Our research results, based

  1. Linear No-Threshold Model VS. Radiation Hormesis

    PubMed Central

    Doss, Mohan

    2013-01-01

    The atomic bomb survivor cancer mortality data have been used in the past to justify the use of the linear no-threshold (LNT) model for estimating the carcinogenic effects of low dose radiation. An analysis of the recently updated atomic bomb survivor cancer mortality dose-response data shows that the data no longer support the LNT model but are consistent with a radiation hormesis model when a correction is applied for a likely bias in the baseline cancer mortality rate. If the validity of the phenomenon of radiation hormesis is confirmed in prospective human pilot studies, and is applied to the wider population, it could result in a considerable reduction in cancers. The idea of using radiation hormesis to prevent cancers was proposed more than three decades ago, but was never investigated in humans to determine its validity because of the dominance of the LNT model and the consequent carcinogenic concerns regarding low dose radiation. Since cancer continues to be a major health problem and the age-adjusted cancer mortality rates have declined by only ∼10% in the past 45 years, it may be prudent to investigate radiation hormesis as an alternative approach to reduce cancers. Prompt action is urged. PMID:24298226

  2. Bayesian estimation of dose thresholds

    NASA Technical Reports Server (NTRS)

    Groer, P. G.; Carnes, B. A.

    2003-01-01

    An example is described of Bayesian estimation of radiation absorbed dose thresholds (subsequently simply referred to as dose thresholds) using a specific parametric model applied to a data set on mice exposed to 60Co gamma rays and fission neutrons. A Weibull based relative risk model with a dose threshold parameter was used to analyse, as an example, lung cancer mortality and determine the posterior density for the threshold dose after single exposures to 60Co gamma rays or fission neutrons from the JANUS reactor at Argonne National Laboratory. The data consisted of survival, censoring times and cause of death information for male B6CF1 unexposed and exposed mice. The 60Co gamma whole-body doses for the two exposed groups were 0.86 and 1.37 Gy. The neutron whole-body doses were 0.19 and 0.38 Gy. Marginal posterior densities for the dose thresholds for neutron and gamma radiation were calculated with numerical integration and found to have quite different shapes. The density of the threshold for 60Co is unimodal with a mode at about 0.50 Gy. The threshold density for fission neutrons declines monotonically from a maximum value at zero with increasing doses. The posterior densities for all other parameters were similar for the two radiation types.
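The grid-based flavor of the posterior calculation described above can be sketched as follows. This toy uses a simple excess-relative-risk form with binary outcomes rather than the paper's Weibull survival model, and every parameter value is invented for illustration:

```python
import numpy as np

# Toy Bayesian estimation of a dose threshold t on a grid:
# risk(d) = background * (1 + alpha * max(d - t, 0))  -- illustrative form only,
# not the Weibull relative-risk model used in the paper.
rng = np.random.default_rng(0)

true_t, background, alpha, n = 0.5, 0.1, 2.0, 400
doses = rng.uniform(0.0, 2.0, n)
p_death = background * (1.0 + alpha * np.maximum(doses - true_t, 0.0))
deaths = rng.random(n) < p_death          # simulated cause-specific deaths

def log_likelihood(t):
    q = background * (1.0 + alpha * np.maximum(doses - t, 0.0))
    q = np.clip(q, 1e-12, 1.0 - 1e-12)
    return np.sum(np.where(deaths, np.log(q), np.log1p(-q)))

grid = np.linspace(0.0, 2.0, 201)         # uniform prior on [0, 2]
logpost = np.array([log_likelihood(t) for t in grid])
post = np.exp(logpost - logpost.max())
post /= np.sum(post) * (grid[1] - grid[0])  # normalize numerically
mode = grid[np.argmax(post)]              # posterior mode of the threshold
```

As in the paper, the shape of the resulting marginal posterior (unimodal versus monotonically declining from zero) is what distinguishes evidence for a positive threshold from evidence against one.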

  3. Nonmonotonic dose response curves (NMDRCs) are common after Estrogen or Androgen signaling pathway disruption. Fact or Falderal? ###SETAC

    EPA Science Inventory

    The shape of the dose response curve in the low dose region has been debated since the late 1940s. The debate originally focused on linear no threshold (LNT) vs threshold responses in the low dose range for cancer and noncancer related effects. Recently, claims have arisen tha...

  4. NONMONOTONIC DOSE RESPONSE CURVES (NMDRCS) ARE COMMON AFTER ESTROGEN OR ANDROGEN SIGNALING PATHWAY DISRUPTION. FACT OR FALDERAL?

    EPA Science Inventory

    ABSTRACT BODY: The shape of the dose response curve in the low dose region has been debated since the 1940s, originally focusing on linear no threshold (LNT) versus threshold responses for cancer and noncancer effects. Recently, it has been claimed that endocrine disrupters (EDCs...

  5. Non-linear relationship of cell hit and transformation probabilities in a low dose of inhaled radon progenies.

    PubMed

    Balásházy, Imre; Farkas, Arpád; Madas, Balázs Gergely; Hofmann, Werner

    2009-06-01

    Cellular hit probabilities of alpha particles emitted by inhaled radon progenies in sensitive bronchial epithelial cell nuclei were simulated at low exposure levels to obtain useful data for the rejection or support of the linear-non-threshold (LNT) hypothesis. In this study, local distributions of deposited inhaled radon progenies in airway bifurcation models were computed at exposure conditions characteristic of homes and uranium mines. Then, maximum local deposition enhancement factors at bronchial airway bifurcations, expressed as the ratio of local to average deposition densities, were determined to characterise the inhomogeneity of deposition and to elucidate their effect on resulting hit probabilities. The results obtained suggest that in the vicinity of the carinal regions of the central airways the probability of multiple hits can be quite high, even at low average doses. Assuming a uniform distribution of activity there are practically no multiple hits and the hit probability as a function of dose exhibits a linear shape in the low dose range. The results are quite the opposite in the case of hot spots revealed by realistic deposition calculations, where practically all cells receive multiple hits and the hit probability as a function of dose is non-linear in the average dose range of 10-100 mGy.
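The qualitative contrast described above — near-linear single-hit behavior under uniform deposition versus dominant multiple hits at hot spots — follows from Poisson hit statistics. The mean-hit values below are illustrative only, not the paper's computed deposition densities:

```python
import math

# If the number of alpha-particle traversals of a cell nucleus is Poisson with
# mean m (proportional to local dose), then:
def hit_probabilities(mean_hits):
    p_ge1 = 1.0 - math.exp(-mean_hits)                      # P(>= 1 hit)
    p_ge2 = 1.0 - math.exp(-mean_hits) * (1.0 + mean_hits)  # P(>= 2 hits)
    return p_ge1, p_ge2

# Uniform activity: small local mean, so P(>=1) ~ m (linear in dose)
# and multiple hits are negligible.
p1, p2 = hit_probabilities(0.01)

# Hot spot: deposition enhancement raises the local mean, so nearly
# all hit cells receive multiple hits.
q1, q2 = hit_probabilities(5.0)
```

At m = 0.01 the single-hit probability is ≈ 0.01 while multiple hits are about 200-fold rarer; at m = 5 more than 95% of cells receive two or more hits, matching the non-linear behavior the abstract reports for carinal hot spots.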

  6. Flaws in the LNT single-hit model for cancer risk: An historical assessment.

    PubMed

    Calabrese, Edward J

    2017-10-01

    The LNT single-hit model was derived from the Nobel Prize-winning research of Herman J. Muller who showed that x-rays could induce gene mutations in Drosophila and that the dose response for these so-called mutational events was linear. Lewis J. Stadler, another well-known and respected geneticist at the time, strongly disagreed with and challenged Muller's claims. Detailed evaluations by Stadler over a prolonged series of investigations revealed that Muller's experiments had induced gross heritable chromosomal damage instead of specific gene mutations as had been claimed by Muller at his Nobel Lecture. These X-ray-induced alterations became progressively more frequent and were of larger magnitude (more destructive) with increasing doses. Thus, Muller's claim of having induced discrete gene mutations represented a substantial speculative overreach and was, in fact, without proof. The post hoc arguments of Muller to support his gene mutation hypothesis were significantly challenged and weakened by a series of new findings in the areas of cytogenetics, reverse mutation, adaptive and repair processes, and modern molecular methods for estimating induced genetic damage. These findings represented critical and substantial limitations to Muller's hypothesis of X-ray-induced gene mutations. Furthermore, they challenged the scientific foundations used in support of the LNT single-hit model by severing the logical nexus between Muller's data on radiation-induced inheritable alterations and the LNT single-hit model. These findings exposed fundamental scientific flaws that undermined not only the seminal recommendation of the 1956 BEAR I Genetics Panel to adopt the LNT single-hit Model for risk assessment but also any rationale for its continued use in the present day. Copyright © 2017 Elsevier Inc. All rights reserved.

  7. Threshold dose for behavioral discrimination of cigarette nicotine content in menthol vs. non-menthol smokers.

    PubMed

    Perkins, Kenneth A; Kunkle, Nicole; Karelitz, Joshua L

    2017-04-01

    The lowest threshold content (or "dose") of nicotine discriminated in cigarettes may differ due to menthol preference. Menthol and non-menthol Spectrum research cigarettes differing in nicotine content were used to determine discrimination thresholds. Dependent smokers preferring menthol (n = 40) or non-menthol (n = 21) brands were tested on ability to discriminate cigarettes (matched for their menthol preference) with nicotine contents of 16-17, 11-12, 5, 2, and 1 mg/g, one per session, from an "ultra-low" cigarette with 0.4 mg/g. Controlled exposure to each cigarette was four puffs/trial, and the number of sessions was determined by the lowest nicotine content they could discriminate on >80% of trials (i.e., ≥5 of 6). We also assessed subjective perceptions and behavioral choice between cigarettes to relate them to discrimination responses. Controlling for Fagerstrom Test of Nicotine Dependence score, discrimination thresholds were more likely to be at higher nicotine content cigarettes for menthol vs. non-menthol smokers (p < .005), with medians of 16 vs. 11 mg/g, respectively. Compared to the ultra-low, threshold and subthreshold (next lowest) cigarettes differed on most perceptions and puff choice, but menthol preference did not alter these associations. Notably, threshold cigarettes did, but subthreshold did not, increase choice over the ultra-low. Threshold for discriminating nicotine via smoking may be generally higher for menthol vs. non-menthol smokers. More research is needed to identify why menthol smoking is related to higher nicotine thresholds and to verify that cigarettes unable to be discriminated do not support reinforcement.

  8. The Genetics Panel of the NAS BEAR I Committee (1956): epistolary evidence suggests self-interest may have prompted an exaggeration of radiation risks that led to the adoption of the LNT cancer risk assessment model.

    PubMed

    Calabrese, Edward J

    2014-09-01

    This paper extends a series of historical papers which demonstrated that the linear-no-threshold (LNT) model for cancer risk assessment was founded on ideological-based scientific deceptions by key radiation genetics leaders. Based on an assessment of recently uncovered personal correspondence, it is shown that some members of the United States (US) National Academy of Sciences (NAS) Biological Effects of Atomic Radiation I (BEAR I) Genetics Panel were motivated by self-interest to exaggerate risks to promote their science and personal/professional agenda. Such activities have profound implications for public policy and may have had a significant impact on the adoption of the LNT model for cancer risk assessment.

  9. Dose-responses for mortality from cerebrovascular and heart diseases in atomic bomb survivors: 1950-2003.

    PubMed

    Schöllnberger, Helmut; Eidemüller, Markus; Cullings, Harry M; Simonetto, Cristoforo; Neff, Frauke; Kaiser, Jan Christian

    2018-03-01

    The scientific community faces important discussions on the validity of the linear no-threshold (LNT) model for radiation-associated cardiovascular diseases at low and moderate doses. In the present study, mortalities from cerebrovascular diseases (CeVD) and heart diseases from the latest data on atomic bomb survivors were analyzed. The analysis was performed with several radio-biologically motivated linear and nonlinear dose-response models. For each detrimental health outcome one set of models was identified that all fitted the data about equally well. This set was used for multi-model inference (MMI), a statistical method of superposing different models to allow risk estimates to be based on several plausible dose-response models rather than just relying on a single model of choice. MMI provides a more accurate determination of the dose response and a more comprehensive characterization of uncertainties. It was found that for CeVD, the dose-response curve from MMI is located below the linear no-threshold model at low and medium doses (0-1.4 Gy). At higher doses MMI predicts a higher risk compared to the LNT model. A sublinear dose-response was also found for heart diseases (0-3 Gy). The analyses provide no conclusive answer to the question whether there is a radiation risk below 0.75 Gy for CeVD and 2.6 Gy for heart diseases. MMI suggests that the dose-response curves for CeVD and heart diseases in the Lifespan Study are sublinear at low and moderate doses. This has relevance for radiotherapy treatment planning and for international radiation protection practices in general.

  10. Growth of non-toxigenic Clostridium botulinum mutant LNT01 in cooked beef: One-step kinetic analysis and comparison with C. sporogenes and C. perfringens.

    PubMed

    Huang, Lihan

    2018-05-01

    The objective of this study was to investigate the growth kinetics of Clostridium botulinum LNT01, a non-toxigenic mutant of C. botulinum 62A, in cooked ground beef. The spores of C. botulinum LNT01 were inoculated into ground beef and incubated anaerobically under different temperature conditions to observe growth and develop growth curves. A one-step kinetic analysis method was used to analyze the growth curves simultaneously to minimize the global residual error. The data analysis was performed using the USDA IPMP-Global Fit, with the Huang model as the primary model and the cardinal parameters model as the secondary model. The results of data analysis showed that the minimum, optimum, and maximum growth temperatures of this mutant are 11.5, 36.4, and 44.3 °C, and the estimated optimum specific growth rate is 0.633 ln CFU/g per h, or 0.275 log CFU/g per h. The maximum cell density is 7.84 log CFU/g. The models and kinetic parameters were validated using additional isothermal and dynamic growth curves. The resulting residual errors of validation followed a Laplace distribution, with about 60% of the residual errors within ±0.5 log CFU/g of experimental observations, suggesting that the models could predict the growth of C. botulinum LNT01 in ground beef with reasonable accuracy. Compared with C. perfringens, C. botulinum LNT01 grows at much slower rates and with much longer lag times. Its growth kinetics are also very similar to those of C. sporogenes in ground beef. The results of computer simulation using kinetic models showed that, while prolific growth of C. perfringens may occur in ground beef during cooling, no growth of C. botulinum LNT01 or C. sporogenes would occur under the same cooling conditions. The models developed in this study may be used for prediction of the growth and risk assessments of proteolytic C. botulinum in cooked meats. Published by Elsevier Ltd.
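The reported rate conversion and the secondary model above can be sanity-checked numerically. A minimal sketch, assuming the Rosso cardinal temperature model (the abstract names a "cardinal parameters model" without specifying the formulation, so this particular equation is an assumption):

```python
import math

# Optimum specific growth rate reported in the abstract
mu_opt_ln = 0.633                        # ln CFU/g per h
mu_opt_log10 = mu_opt_ln / math.log(10)  # convert natural-log to log10 units (~0.275)

# Rosso cardinal temperature model -- an assumed formulation of the
# "cardinal parameters model"; cardinal temperatures are from the abstract.
def cardinal_model(T, T_min=11.5, T_opt=36.4, T_max=44.3, mu_opt=0.633):
    if T <= T_min or T >= T_max:
        return 0.0  # no growth outside the cardinal temperature range
    num = (T - T_max) * (T - T_min) ** 2
    den = (T_opt - T_min) * ((T_opt - T_min) * (T - T_opt)
          - (T_opt - T_max) * (T_opt + T_min - 2 * T))
    return mu_opt * num / den

print(round(mu_opt_log10, 3))            # ≈ 0.275 log CFU/g per h
print(round(cardinal_model(36.4), 3))    # equals mu_opt at T_opt
```

The conversion simply divides by ln(10) ≈ 2.303, which reproduces the abstract's 0.633 ln CFU/g per h ≈ 0.275 log CFU/g per h.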

  11. The Mistaken Birth and Adoption of LNT: An Abridged Version

    PubMed Central

    Calabrese, Edward J.

    2017-01-01

    The historical foundations of cancer risk assessment were based on the discovery of X-ray-induced gene mutations by Hermann J. Muller, its transformation into the linear nonthreshold (LNT) single-hit theory, the recommendation of the model by the US National Academy of Sciences, Biological Effects of Atomic Radiation I, Genetics Panel in 1956, and its subsequent widespread adoption by regulatory agencies worldwide. This article summarizes substantial recent historical revelations of this history, which profoundly challenge the standard and widely accepted history of cancer risk assessment, showing multiple significant scientific errors and incorrect interpretations, mixed with deliberate misrepresentation of the scientific record by leading ideologically motivated radiation geneticists. These novel historical findings demonstrate that the scientific foundations of the LNT single-hit model were seriously flawed and that it should not have been adopted for cancer risk assessment. PMID:29051718

  12. Atomic Bomb Health Benefits

    PubMed Central

    Luckey, T. D.

    2008-01-01

    Media reports of deaths and devastation produced by atomic bombs convinced people around the world that all ionizing radiation is harmful. This concentrated attention on fear of miniscule doses of radiation. Soon the linear no threshold (LNT) paradigm was converted into laws. Scientifically valid information about the health benefits from low dose irradiation was ignored. Here are studies which show increased health in Japanese survivors of atomic bombs. Parameters include decreased mutation, leukemia and solid tissue cancer mortality rates, and increased average lifespan. Each study exhibits a threshold that repudiates the LNT dogma. The average threshold for acute exposures to atomic bombs is about 100 cSv. Conclusions from these studies of atomic bomb survivors are: (1) one burst of low dose irradiation elicits a lifetime of improved health; (2) improved health from low dose irradiation negates the LNT paradigm; (3) effective triage should include radiation hormesis for survivor treatment. PMID:19088902

  13. Atomic bomb health benefits.

    PubMed

    Luckey, T D

    2008-01-01

    Media reports of deaths and devastation produced by atomic bombs convinced people around the world that all ionizing radiation is harmful. This concentrated attention on fear of miniscule doses of radiation. Soon the linear no threshold (LNT) paradigm was converted into laws. Scientifically valid information about the health benefits from low dose irradiation was ignored. Here are studies which show increased health in Japanese survivors of atomic bombs. Parameters include decreased mutation, leukemia and solid tissue cancer mortality rates, and increased average lifespan. Each study exhibits a threshold that repudiates the LNT dogma. The average threshold for acute exposures to atomic bombs is about 100 cSv. Conclusions from these studies of atomic bomb survivors are: (1) one burst of low dose irradiation elicits a lifetime of improved health; (2) improved health from low dose irradiation negates the LNT paradigm; (3) effective triage should include radiation hormesis for survivor treatment.

  14. Comparison of singlet oxygen threshold dose for PDT.

    PubMed

    Zhu, Timothy C; Liu, Baochang; Kim, Michele M; McMillan, Dayton; Liang, Xing; Finlay, Jarod C; Busch, Theresa M

    2014-02-01

    Macroscopic modeling of singlet oxygen (1O2) is of particular interest because it is the major cytotoxic agent causing biological effects for type II photosensitizers during PDT. We have developed a macroscopic model to calculate the reacted singlet oxygen concentration ([1O2]rx) for PDT. An in-vivo RIF tumor mouse model is used to correlate the necrosis depth to the calculation based on explicit PDT dosimetry of light fluence distribution, tissue optical properties, and photosensitizer concentrations. Inputs to the model include 4 photosensitizer-specific photochemical parameters along with the apparent singlet oxygen threshold concentration. Photosensitizer-specific model parameters are determined for several type II photosensitizers (Photofrin, BPD, and HPPH). The singlet oxygen threshold concentration is approximately 0.41-0.56 mM for all three photosensitizers studied, assuming that the fraction of singlet oxygen generated that interacts with the cell is f = 1. In comparison, the value derived from other in-vivo mouse studies is 0.4 mM for mTHPC. However, the singlet oxygen threshold doses were reported to be 7.9 and 12.1 mM for a multicell in-vitro EMT6/Ro spheroid model for mTHPC and Photofrin PDT, respectively. The sensitivity of the threshold singlet oxygen dose for our experiment is examined. The possible influence of vascular vs. apoptotic cell-killing mechanisms on the singlet oxygen threshold dose is discussed using BPD with different drug-light intervals (3 h vs. 15 min). The observed discrepancies between different experiments warrant further investigation to explain the cause of the difference.

  15. Comparison of singlet oxygen threshold dose for PDT

    PubMed Central

    Zhu, Timothy C; Liu, Baochang; Kim, Michele M.; McMillan, Dayton; Liang, Xing; Finlay, Jarod C.; Busch, Theresa M.

    2015-01-01

    Macroscopic modeling of singlet oxygen (1O2) is of particular interest because it is the major cytotoxic agent causing biological effects for type II photosensitizers during PDT. We have developed a macroscopic model to calculate the reacted singlet oxygen concentration ([1O2]rx) for PDT. An in-vivo RIF tumor mouse model is used to correlate the necrosis depth to the calculation based on explicit PDT dosimetry of light fluence distribution, tissue optical properties, and photosensitizer concentrations. Inputs to the model include 4 photosensitizer-specific photochemical parameters along with the apparent singlet oxygen threshold concentration. Photosensitizer-specific model parameters are determined for several type II photosensitizers (Photofrin, BPD, and HPPH). The singlet oxygen threshold concentration is approximately 0.41-0.56 mM for all three photosensitizers studied, assuming that the fraction of singlet oxygen generated that interacts with the cell is f = 1. In comparison, the value derived from other in-vivo mouse studies is 0.4 mM for mTHPC. However, the singlet oxygen threshold doses were reported to be 7.9 and 12.1 mM for a multicell in-vitro EMT6/Ro spheroid model for mTHPC and Photofrin PDT, respectively. The sensitivity of the threshold singlet oxygen dose for our experiment is examined. The possible influence of vascular vs. apoptotic cell-killing mechanisms on the singlet oxygen threshold dose is discussed using BPD with different drug-light intervals (3 h vs. 15 min). The observed discrepancies between different experiments warrant further investigation to explain the cause of the difference. PMID:25999651

  16. Threshold-driven optimization for reference-based auto-planning

    NASA Astrophysics Data System (ADS)

    Long, Troy; Chen, Mingli; Jiang, Steve; Lu, Weiguo

    2018-02-01

    We study threshold-driven optimization methodology for automatically generating a treatment plan that is motivated by a reference DVH for IMRT treatment planning. We present a framework for threshold-driven optimization for reference-based auto-planning (TORA). Commonly used voxel-based quadratic penalties have two components for penalizing under- and over-dosing of voxels: a reference dose threshold and associated penalty weight. Conventional manual- and auto-planning using such a function involves iteratively updating the preference weights while keeping the thresholds constant, an unintuitive and often inconsistent method for planning toward some reference DVH. However, driving a dose distribution by threshold values instead of preference weights can achieve similar plans with less computational effort. The proposed methodology spatially assigns reference DVH information to threshold values, and iteratively improves the quality of that assignment. The methodology effectively handles both sub-optimal and infeasible DVHs. TORA was applied to a prostate case and a liver case as a proof-of-concept. Reference DVHs were generated using a conventional voxel-based objective, then altered to be either infeasible or easy-to-achieve. TORA was able to closely recreate reference DVHs in 5-15 iterations of solving a simple convex sub-problem. TORA has the potential to be effective for auto-planning based on reference DVHs. As dose prediction and knowledge-based planning becomes more prevalent in the clinical setting, incorporating such data into the treatment planning model in a clear, efficient way will be crucial for automated planning. A threshold-focused objective tuning should be explored over conventional methods of updating preference weights for DVH-guided treatment planning.
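The voxel-based quadratic penalty with under- and over-dose thresholds that the abstract describes can be sketched as follows; the function, weights, and dose values are illustrative assumptions, not TORA's actual implementation:

```python
import numpy as np

# Voxel-based quadratic penalty with separate under- and over-dose thresholds.
# Threshold-driven planning (per the abstract) iteratively adjusts t_under and
# t_over toward a reference DVH, rather than re-tuning the penalty weights.
def quadratic_penalty(dose, t_under, t_over, w_under=1.0, w_over=1.0):
    under = np.clip(t_under - dose, 0.0, None)  # shortfall below the threshold
    over = np.clip(dose - t_over, 0.0, None)    # excess above the threshold
    return float(np.sum(w_under * under**2 + w_over * over**2))

# Illustrative voxel doses (Gy): one cold voxel, one in range, one hot voxel.
dose = np.array([58.0, 60.0, 63.0])
print(quadratic_penalty(dose, t_under=60.0, t_over=62.0))  # 2^2 + 0 + 1^2 = 5.0
```

Driving the optimizer by moving the thresholds keeps the objective's scale fixed, which is one way to read the abstract's claim that threshold updates are more consistent than weight updates.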

  17. The 10th anniversary of the publication of genes and environment: memoir of establishing the Japanese environmental mutagen society and a proposal for a new collaborative study on mutagenic hormesis.

    PubMed

    Sutou, Shizuyo

    2017-01-01

    The Japanese Environmental Mutagen Society (JEMS) was established in 1972 by 147 members, 11 of whom are still on the active list as of May 1, 2016. As one of them, I introduce some historic topics here. These include 1) the establishment of JEMS, 2) the issue of 2-(2-furyl)-3-(3-nitro-2-furyl)acrylamide (AF-2), 3) the Mammalian Mutagenicity Study Group (MMS) and its achievements, and 4) the Collaborative Study Group of the Micronucleus Test (CSGMT) and its achievements. In addition to these historic matters, some of which are still ongoing, a new collaborative study is proposed on adaptive response or hormesis induced by mutagens. There is a close relationship between mutagens and carcinogens, whose dose-response relationship has been thought to follow the linear no-threshold model (LNT). The LNT was fabricated on the basis of Drosophila sperm experiments using high-dose radiation delivered in a short period. This fallacious 60-year-old LNT is applied to cancer induction by radiation without solid data, and then to cancer induction by carcinogens, also without solid data. Therefore, even the smallest amount of a carcinogen is now postulated to be carcinogenic, without a threshold. Radiation hormesis is observed in a large variety of living organisms; radiation is beneficial at low doses but hazardous at high doses. There is a threshold at the boundary between benefit and hazard. Hormesis denies the LNT. Quite a few papers report the existence of chemical hormesis. If mutagens and carcinogens show hormesis, the linear dose-response relationship in mutagenesis and carcinogenesis is denied and thresholds can be introduced.

  18. In-vivo singlet oxygen threshold doses for PDT

    PubMed Central

    Zhu, Timothy C.; Kim, Michele M.; Liang, Xing; Finlay, Jarod C.; Busch, Theresa M.

    2015-01-01

    Objective Dosimetry of singlet oxygen (1O2) is of particular interest because it is the major cytotoxic agent causing biological effects for type-II photosensitizers during photodynamic therapy (PDT). An in-vivo model to determine the singlet oxygen threshold dose, [1O2]rx,sh, for PDT was developed. Material and methods An in-vivo radiation-induced fibrosarcoma (RIF) tumor mouse model was used to correlate the radius of necrosis to the calculation based on explicit PDT dosimetry of light fluence distribution, tissue optical properties, and photosensitizer concentrations. Inputs to the model include five photosensitizer-specific photochemical parameters along with [1O2]rx,sh. Photosensitizer-specific model parameters were determined for benzoporphyrin derivative monoacid ring A (BPD) and compared with two other type-II photosensitizers, Photofrin® and m-tetrahydroxyphenylchlorin (mTHPC) from the literature. Results The mean values (standard deviation) of the in-vivo [1O2]rx,sh are approximately 0.56 (0.26) and 0.72 (0.21) mM (or 3.6×10^7 and 4.6×10^7 singlet oxygen per cell to reduce the cell survival to 1/e) for Photofrin® and BPD, respectively, assuming that the fraction of generated singlet oxygen that interacts with the cell is 1. While the values for the photochemical parameters (ξ, σ, g, β) used for BPD were preliminary and may need further refinement, there is reasonable confidence for the values of the singlet oxygen threshold doses. Discussion In comparison, the [1O2]rx,sh value derived from an in-vivo mouse study was reported to be 0.4 mM for mTHPC-PDT. However, the singlet oxygen required per cell is reported to be 9×10^8 per cell per 1/e fractional kill in an in-vitro mTHPC-PDT study on a rat prostate cancer cell line (MLL cells) and is reported to be 7.9 mM for a multicell in-vitro EMT6/Ro spheroid model for mTHPC-PDT. A theoretical analysis is provided to relate the number of in-vitro singlet oxygen required per cell to reach cell killing of 1/e to

  19. In-vivo singlet oxygen threshold doses for PDT.

    PubMed

    Zhu, Timothy C; Kim, Michele M; Liang, Xing; Finlay, Jarod C; Busch, Theresa M

    2015-02-01

    Dosimetry of singlet oxygen (1O2) is of particular interest because it is the major cytotoxic agent causing biological effects for type-II photosensitizers during photodynamic therapy (PDT). An in-vivo model to determine the singlet oxygen threshold dose, [1O2]rx,sh, for PDT was developed. An in-vivo radiation-induced fibrosarcoma (RIF) tumor mouse model was used to correlate the radius of necrosis to the calculation based on explicit PDT dosimetry of light fluence distribution, tissue optical properties, and photosensitizer concentrations. Inputs to the model include five photosensitizer-specific photochemical parameters along with [1O2]rx,sh. Photosensitizer-specific model parameters were determined for benzoporphyrin derivative monoacid ring A (BPD) and compared with two other type-II photosensitizers, Photofrin® and m-tetrahydroxyphenylchlorin (mTHPC) from the literature. The mean values (standard deviation) of the in-vivo [1O2]rx,sh are approximately 0.56 (0.26) and 0.72 (0.21) mM (or 3.6×10^7 and 4.6×10^7 singlet oxygen per cell to reduce the cell survival to 1/e) for Photofrin® and BPD, respectively, assuming that the fraction of generated singlet oxygen that interacts with the cell is 1. While the values for the photochemical parameters (ξ, σ, g, β) used for BPD were preliminary and may need further refinement, there is reasonable confidence for the values of the singlet oxygen threshold doses. In comparison, the [1O2]rx,sh value derived from an in-vivo mouse study was reported to be 0.4 mM for mTHPC-PDT. However, the singlet oxygen required per cell is reported to be 9×10^8 per cell per 1/e fractional kill in an in-vitro mTHPC-PDT study on a rat prostate cancer cell line (MLL cells) and is reported to be 7.9 mM for a multicell in-vitro EMT6/Ro spheroid model for mTHPC-PDT. A theoretical analysis is provided to relate the number of in-vitro singlet oxygen required per cell to reach cell killing of 1/e to in-vivo singlet
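The conversion between a threshold concentration in mM and singlet-oxygen molecules per cell follows from Avogadro's number and a cell volume. The volume below is an assumption chosen so the arithmetic reproduces the abstract's figures; the paper's actual value may differ:

```python
# Relate a threshold concentration (mM) to singlet-oxygen molecules per cell.
N_A = 6.022e23            # Avogadro's number, mol^-1
CELL_VOLUME_L = 1.07e-13  # ~107 femtoliters (assumed, to match the abstract)

def molecules_per_cell(conc_mM, volume_L=CELL_VOLUME_L):
    # mM -> mol/L, then multiply by volume and Avogadro's number
    return conc_mM * 1e-3 * N_A * volume_L

print(f"{molecules_per_cell(0.56):.2e}")  # ~3.6e7 (Photofrin)
print(f"{molecules_per_cell(0.72):.2e}")  # ~4.6e7 (BPD)
```

This also shows why in-vitro spheroid thresholds quoted in mM (e.g. 7.9 mM) imply molecule-per-cell counts more than an order of magnitude above the in-vivo values.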

  20. Review of Chinese Environmental Risk Assessment Regulations and Case Studies

    PubMed Central

    Meng, Xiaojie; Zhang, Yan; Zhao, Yuchao; Lou, In Chio; Gao, Jixi

    2012-01-01

    Environmental risk assessment is an essential step in the development of solutions for pollution problems and new environmental regulations. An assessment system for environmental risks has been developed in China in recent decades. However, many of the Chinese technical guidelines, standards, and regulations were directly adapted from those of developed countries, and were not based on the Chinese environmental and socioeconomic context. Although existing environmental regulations for pollutants are usually obtained by extrapolations from high-dose toxicological data to low-dose scenarios using linear-non-threshold (LNT) models, toxicologists have argued that J-shaped or inverse J-shaped curves may dominate the dose–response relationships for environmental pollutants at low doses because low exposures stimulate biological protective mechanisms that are ineffective at higher doses. The costs of regulations based on LNT and J-shaped models could therefore be dramatically different. Since economic factors strongly affect the decision-making process, particularly for developing countries, it is time to strengthen basic research to provide more scientific support for Chinese environmental regulations. In this paper, we summarize current Chinese environmental policies and standards and the application of environmental risk assessment in China, and recommend a more scientific approach to the development of Chinese regulations. PMID:22740787

  1. The Key Events Dose-Response Framework: a cross-disciplinary mode-of-action based approach to examining dose-response and thresholds.

    PubMed

    Julien, Elizabeth; Boobis, Alan R; Olin, Stephen S

    2009-09-01

    The ILSI Research Foundation convened a cross-disciplinary working group to examine current approaches for assessing dose-response and identifying safe levels of intake or exposure for four categories of bioactive agents-food allergens, nutrients, pathogenic microorganisms, and environmental chemicals. This effort generated a common analytical framework-the Key Events Dose-Response Framework (KEDRF)-for systematically examining key events that occur between the initial dose of a bioactive agent and the effect of concern. Individual key events are considered with regard to factors that influence the dose-response relationship and factors that underlie variability in that relationship. This approach illuminates the connection between the processes occurring at the level of fundamental biology and the outcomes observed at the individual and population levels. Thus, it promotes an evidence-based approach for using mechanistic data to reduce reliance on default assumptions, to quantify variability, and to better characterize biological thresholds. This paper provides an overview of the KEDRF and introduces a series of four companion papers that illustrate initial application of the approach to a range of bioactive agents.

  2. Threshold dose for discrimination of nicotine via cigarette smoking.

    PubMed

    Perkins, Kenneth A; Kunkle, Nicole; Karelitz, Joshua L; Michael, Valerie C; Donny, Eric C

    2016-06-01

    The lowest nicotine threshold "dose" in cigarettes discriminated from a cigarette containing virtually no nicotine may help inform the minimum dose maintaining dependence. Spectrum research cigarettes (from NIDA) differing in nicotine content were used to evaluate a procedure to determine discrimination thresholds. Dependent smokers (n = 18; 13 M, 5 F) were tested on ability to discriminate cigarettes with nicotine contents of 11, 5, 2.4, and 1.3 mg/g, one per session, from the "ultralow" cigarette with 0.4 mg/g, after having discriminated 16 mg/g from 0.4 mg/g (all had 9-10 mg "tar"). Exposure to each was limited to 4 puffs/trial. All subjects were abstinent from smoking overnight prior to each session, and the number of sessions was determined by the participant's success in discrimination behavior on >80 % of trials. Subjective perceptions and behavioral choice between cigarettes were also assessed and related to discrimination behavior. The median threshold was 11 mg/g, but the range was 2.4 to 16 mg/g, suggesting wide variability in discrimination threshold. Compared to the ultralow, puff choice was greater for the subject's threshold dose but only marginal for the subthreshold (next lowest nicotine) cigarette. Threshold and subthreshold also differed on subjective perceptions but not withdrawal relief. Under these testing conditions, threshold content for discriminating nicotine via cigarettes may be 11 mg/g or greater for most smokers, but some can discriminate nicotine contents one-half or one-quarter this amount. Further study with other procedures and cigarette exposure amounts may identify systematic differences in nicotine discrimination thresholds.

  3. The Healthy Worker Effect and Nuclear Industry Workers

    PubMed Central

    Fornalski, Krzysztof W.; Dobrzyński, Ludwik

    2010-01-01

    The linear no-threshold (LNT) dose-effect relationship has been consistently used by most radiation epidemiologists to estimate cancer mortality risk. The large scattering of data by the International Agency for Research on Cancer, IARC (Vrijheid et al. 2007; Thierry-Chef et al. 2007; Cardis et al. 2007), interpreted in accordance with the LNT, has been previously demonstrated (Fornalski and Dobrzyński 2009). Using conventional and Bayesian methods, the present paper demonstrates that the standardized mortality ratios (SMRs), lower in the IARC cohort of exposed nuclear workers than in the non-exposed group, should be considered a hormetic effect rather than a healthy worker effect (HWE) as claimed by the IARC group. PMID:20585442

  4. Threshold and non-threshold chemical carcinogens: A survey of the present regulatory landscape.

    PubMed

    Bevan, Ruth J; Harrison, Paul T C

    2017-08-01

    For the proper regulation of a carcinogenic material it is necessary to fully understand its mode of action, and in particular whether it demonstrates a threshold of effect. This paper explores our present understanding of carcinogenicity and the mechanisms underlying the carcinogenic response. The concepts of genotoxic and non-genotoxic, and threshold and non-threshold, carcinogens are fully described. We provide summary tables of the types of cancer considered to be associated with exposure to a number of carcinogens and the available evidence relating to whether carcinogenicity occurs through a threshold or non-threshold mechanism. In light of these observations we consider how different regulatory bodies approach the question of chemical carcinogenesis, looking in particular at the definitions and methodologies used to derive Occupational Exposure Levels (OELs) for carcinogens. We conclude that unless proper differentiation is made between threshold and non-threshold carcinogens, inappropriate risk management measures may be put in place, which may also lead to difficulties in translating carcinogenicity research findings into appropriate health policies. We recommend that clear differentiation between threshold and non-threshold carcinogens should be made by all expert groups and regulatory bodies dealing with carcinogen classification and risk assessment. Copyright © 2017 Elsevier Inc. All rights reserved.

  5. The Use of Lexical Neighborhood Test (LNT) in the Assessment of Speech Recognition Performance of Cochlear Implantees with Normal and Malformed Cochlea.

    PubMed

    Kant, Anjali R; Banik, Arun A

    2017-09-01

    The present study aims to use the model-based Lexical Neighborhood Test (LNT) to assess speech recognition performance in early- and late-implanted hearing impaired children with normal and malformed cochleae. The LNT was administered to 46 children with congenital (prelingual) bilateral severe-profound sensorineural hearing loss, using the Nucleus 24 cochlear implant. The children were grouped into Group 1 (early implantees with normal cochlea, EI): n = 15, 3½-6½ years of age, mean age at implantation 3½ years; Group 2 (late implantees with normal cochlea, LI): n = 15, 6-12 years of age, mean age at implantation 5 years; Group 3 (early implantees with malformed cochlea, EIMC): n = 9, 4.9-10.6 years of age, mean age at implantation 3.10 years; and Group 4 (late implantees with malformed cochlea, LIMC): n = 7, 7-12.6 years of age, mean age at implantation 6.3 years. The malformations were: dysplastic cochlea, common cavity, Mondini's, incomplete partition-1 and 2 (IP-1 and 2), and enlarged IAC. The children were instructed to repeat the words on hearing them. Means of the word and phoneme scores were computed. The LNT can also be used to assess the speech recognition performance of hearing impaired children with malformed cochleae. When both easy and hard lists of the LNT are considered, although late implantees (with or without normal cochleae) achieved higher word scores than early implantees, the differences are not statistically significant. Using the LNT for assessing speech recognition enables a quantitative as well as descriptive report of the phonological processes used by the children.

  6. Regulatory-Science: Biphasic Cancer Models or the LNT—Not Just a Matter of Biology!

    PubMed Central

    Ricci, Paolo F.; Sammis, Ian R.

    2012-01-01

    There is no doubt that prudence and risk aversion must guide public decisions when the associated adverse outcomes are either serious or irreversible. With any carcinogen, the levels of risk and needed protection before and after an event occurs are determined by dose-response models. Regulatory law should not crowd out the actual beneficial effects from low dose exposures—when demonstrable—that are inevitably lost when it adopts the linear non-threshold (LNT) as its causal model. Because regulating exposures requires planning and developing protective measures for future acute and chronic exposures, public management decisions should be based on minimizing costs and harmful exposures. We address the direct and indirect effects of causation when the danger consists of exposure to very low levels of carcinogens and toxicants. The societal consequences of a policy can be deleterious when that policy is based on a risk assumed by the LNT, in cases where low exposures are actually beneficial. Our work develops the science and the law of causal risk modeling: the two are interwoven. We suggest how their relevant characteristics differ, but do not attempt to keep them separated; as we demonstrate, this union, however unsatisfactory, cannot be severed. PMID:22740778

  7. Derivation of a no-significant-risk-level for tetrabromobisphenol A based on a threshold non-mutagenic cancer mode of action.

    PubMed

    Pecquet, Alison M; Martinez, Jeanelle M; Vincent, Melissa; Erraguntla, Neeraja; Dourson, Michael

    2018-06-01

    A no-significant-risk-level of 20 mg day⁻¹ was derived for tetrabromobisphenol A (TBBPA). Uterine tumors (adenomas, adenocarcinomas, and malignant mixed Müllerian) observed in female Wistar Han rats from a National Toxicology Program 2-year cancer bioassay were identified as the critical effect. Studies suggest that TBBPA is acting through a non-mutagenic mode of action. Thus, the most appropriate approach to derivation of a cancer risk value based on US Environmental Protection Agency guidelines is a threshold approach, akin to a cancer safe dose (RfD-cancer). Using the National Toxicology Program data, we utilized benchmark dose software to derive a benchmark dose lower limit (BMDL₁₀) as the point of departure (POD) of 103 mg kg⁻¹ day⁻¹. The POD was adjusted to a human equivalent dose of 25.6 mg kg⁻¹ day⁻¹ using allometric scaling. We applied a composite adjustment factor of 100 to the POD to derive an RfD-cancer of 0.26 mg kg⁻¹ day⁻¹. Based on a human body weight of 70 kg, the RfD-cancer was adjusted to a no-significant-risk-level of 20 mg day⁻¹. This was compared to other available non-cancer and cancer risk values, and aligns well with our understanding of the underlying biology based on the toxicology data. Overall, the weight of evidence from animal studies indicates that TBBPA has low toxicity and suggests that high doses over long exposure durations are needed to induce uterine tumor formation. Future research needs include a thorough and detailed vetting of the proposed adverse outcome pathway, including further support for key events leading to uterine tumor formation and a quantitative weight-of-evidence analysis. Copyright © 2018 John Wiley & Sons, Ltd.
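    The dose arithmetic described above can be sketched numerically. This is a hedged reconstruction: the BW^(1/4) allometric exponent is the standard body-weight scaling convention, and the rat body weight of 0.27 kg is an assumed value for a female Wistar Han rat, not a figure from the paper.

    ```python
    # Hedged sketch of the threshold-based (RfD-style) derivation steps in the
    # abstract. ASSUMPTIONS: rat body weight (0.27 kg) and the BW**0.25
    # allometric scaling convention are mine, not taken from the paper.
    pod_bmdl10 = 103.0             # mg/kg/day, BMDL10 point of departure (abstract)
    bw_rat, bw_human = 0.27, 70.0  # kg; rat weight is an assumed value

    # Allometric scaling to a human equivalent dose: HED = POD * (BW_a / BW_h)**0.25
    hed = pod_bmdl10 * (bw_rat / bw_human) ** 0.25   # ~25.7 mg/kg/day

    composite_uf = 100.0           # composite adjustment factor (abstract)
    rfd_cancer = hed / composite_uf                  # ~0.26 mg/kg/day
    nsrl = rfd_cancer * bw_human                     # ~18 mg/day

    print(round(hed, 1), round(rfd_cancer, 2), round(nsrl, 1))
    ```

    With these assumed inputs the chain reproduces the abstract's 25.6 and 0.26 mg kg⁻¹ day⁻¹ values to within rounding; the final no-significant-risk-level of 20 mg day⁻¹ evidently reflects the authors' own inputs and rounding.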

  8. Evidence supporting radiation hormesis in atomic bomb survivor cancer mortality data.

    PubMed

    Doss, Mohan

    2012-12-01

    A recent update on the atomic bomb survivor cancer mortality data has concluded that excess relative risk (ERR) for solid cancers increases linearly with dose and that zero dose is the best estimate for the threshold, apparently validating the present use of the linear no threshold (LNT) model for estimating the cancer risk from low dose radiation. A major flaw in the standard ERR formalism for estimating cancer risk from radiation (and other carcinogens) is that it ignores the potential for a large systematic bias in the measured baseline cancer mortality rate, which can have a major effect on the ERR values. Cancer rates are highly variable from year to year and between adjacent regions and so the likelihood of such a bias is high. Calculations show that a correction for such a bias can lower the ERRs in the atomic bomb survivor data to negative values for intermediate doses. This is consistent with the phenomenon of radiation hormesis, providing a rational explanation for the decreased risk of cancer observed at intermediate doses for which there is no explanation based on the LNT model. The recent atomic bomb survivor data provides additional evidence for radiation hormesis in humans.
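    The baseline-bias argument can be made concrete with a small sketch (my construction, not the paper's calculation): since ERR = observed/expected - 1, any systematic inflation of the expected (baseline) rate rescales the ERR, and a few percent of bias is enough to push a small positive ERR negative.

    ```python
    # Sketch of how a systematic bias in the assumed baseline cancer rate
    # propagates into the excess relative risk (ERR). Illustrative only;
    # the bias magnitude below is hypothetical.
    def corrected_err(err_reported, baseline_bias):
        # ERR = O/E - 1; if the true baseline is E*(1 + bias), then
        # ERR_corrected = (1 + ERR) / (1 + bias) - 1
        return (1.0 + err_reported) / (1.0 + baseline_bias) - 1.0

    # A small reported excess at an intermediate dose turns negative
    # if the true baseline was ~5% higher than assumed:
    print(corrected_err(0.02, 0.05))
    ```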

  9. A study of life prediction differences for a nickel-base Alloy 690 using a threshold and a non-threshold model

    NASA Astrophysics Data System (ADS)

    Young, B. A.; Gao, Xiaosheng; Srivatsan, T. S.

    2009-10-01

    In this paper we compare and contrast the crack growth rate of a nickel-base superalloy (Alloy 690) in the Pressurized Water Reactor (PWR) environment. Over the last few years, a preponderance of test data has been gathered on both Alloy 690 thick plate and Alloy 690 tubing. The original model, essentially based on a small data set for thick plate, compensated for temperature, load ratio and stress-intensity range but did not compensate for the fatigue threshold of the material. As additional test data on both plate and tube product became available the model was gradually revised to account for threshold properties. Both the original and revised models generated acceptable results for data that were above 1 × 10⁻¹¹ m/s. However, the test data at the lower growth rates were over-predicted by the non-threshold model. Since the original model did not take the fatigue threshold into account, this model predicted no operating stress below which the material would effectively undergo fatigue crack growth. Because of an over-prediction of the growth rate below 1 × 10⁻¹¹ m/s, due to a combination of low stress, small crack size and long rise-time, the model in general leads to an under-prediction of the total available life of the components.
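    The contrast between the two model families can be sketched with a Paris-type crack-growth law (assumed functional forms and constants, not the authors' fitted Alloy 690 model): without a threshold term, the predicted growth rate stays finite at low stress-intensity ranges, which is exactly the regime where the abstract reports over-prediction.

    ```python
    # Minimal sketch of threshold vs non-threshold fatigue crack growth laws.
    # C, m and dK_th are illustrative constants, not Alloy 690 fitted values.
    C, m = 1e-12, 3.0     # Paris-law coefficient and exponent (illustrative)
    dK_th = 8.0           # assumed fatigue threshold, MPa*sqrt(m)

    def da_dn_no_threshold(dK):
        # classic Paris law: growth predicted at any nonzero dK
        return C * dK ** m

    def da_dn_threshold(dK):
        # threshold form: no growth below dK_th, reduced driving force above it
        return C * max(dK ** m - dK_th ** m, 0.0)

    for dK in (7.0, 9.0, 20.0):
        print(dK, da_dn_no_threshold(dK), da_dn_threshold(dK))
    ```

    Below the threshold (dK = 7.0) the non-threshold form still predicts growth, over-predicting the rate and hence under-predicting component life; well above threshold the two forms converge.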

  10. A Critique of Recent Epidemiologic Studies of Cancer Mortality Among Nuclear Workers.

    PubMed

    Scott, Bobby R

    2018-01-01

    Current justification by linear no-threshold (LNT) cancer risk model advocates for its use in low-dose radiation risk assessment is now mainly based on results from flawed and unreliable epidemiologic studies that manufacture small risk increases (i.e., phantom risks). Four such studies of nuclear workers, essentially carried out by the same group of epidemiologists, are critiqued in this article. Three of the studies that forcibly applied the LNT model (an inappropriate null hypothesis) to cancer mortality data and implicated increased mortality risk from any radiation exposure, no matter how small the dose, are demonstrated to manufacture risk increases for doses up to 100 mSv (or 100 mGy). In a study where risk reduction (hormetic effect/adaptive response) was implicated for nuclear workers, the researchers assumed it related to a "strong healthy worker effect," with no consideration of the possibility that low radiation doses may help prevent cancer mortality (which is consistent with findings from basic radiobiological research). Basic research has found that while large radiation doses suppress our multiple natural defenses (barriers) against cancer, these barriers are enhanced by low radiation doses, thereby decreasing cancer risk and essentially rendering the LNT model inconsistent with the data.

  11. Threshold irradiation dose for amorphization of silicon carbide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Snead, L.L.; Zinkle, S.J.

    1997-04-01

    The amorphization of silicon carbide due to ion and electron irradiation is reviewed with emphasis on the temperature-dependent critical dose for amorphization. The effect of ion mass and energy on the threshold dose for amorphization is summarized, showing only a weak dependence near room temperature. Results are presented for 0.56 MeV silicon ions implanted into single-crystal 6H-SiC as a function of temperature and ion dose. From this, the critical dose for amorphization is found as a function of temperature at depths well separated from the implanted ion region. Results are compared with published data generated using electrons and xenon ions as the irradiating species. High-resolution TEM analysis is presented for the Si ion series, showing the evolution of elongated amorphous islands oriented such that their major axis is parallel to the free surface. This suggests that surface or strain effects may be influencing the apparent amorphization threshold. Finally, a model for the temperature threshold for amorphization is described using the Si ion irradiation flux and the fitted interstitial migration energy, which was found to be ~0.56 eV. This model successfully explains the difference in the temperature-dependent amorphization behavior of SiC irradiated with 0.56 MeV silicon ions at 1 × 10⁻³ dpa/s and with fission neutrons at 1 × 10⁻⁶ dpa/s to a dose of 15 dpa in the temperature range of ~340 ± 10 K.

  12. An individualized radiation dose escalation trial in non-small cell lung cancer based on FDG-PET imaging.

    PubMed

    Wanet, Marie; Delor, Antoine; Hanin, François-Xavier; Ghaye, Benoît; Van Maanen, Aline; Remouchamps, Vincent; Clermont, Christian; Goossens, Samuel; Lee, John Aldo; Janssens, Guillaume; Bol, Anne; Geets, Xavier

    2017-10-01

    The aim of the study was to assess the feasibility of an individualized 18F-fluorodeoxyglucose positron emission tomography (FDG-PET)-guided dose escalation boost in non-small cell lung cancer (NSCLC) patients and to assess its impact on local tumor control and toxicity. A total of 13 patients with stage II-III NSCLC were enrolled to receive a dose of 62.5 Gy in 25 fractions to the CT-based planning target volume (PTV; primary tumor and affected lymph nodes). The fraction dose was increased within the individual PET-based PTV (PTV-PET) using intensity-modulated radiotherapy (IMRT) with a simultaneous integrated boost (SIB) until the predefined organ-at-risk (OAR) threshold was reached. Tumor response was assessed during follow-up by means of repeat FDG-PET/computed tomography. Acute and late toxicity were recorded and classified according to the CTCAE criteria (version 4.0). Local progression-free survival was determined using the Kaplan-Meier method. The average dose to the PTV-PET reached 89.17 Gy for peripheral and 75 Gy for central tumors. After a median follow-up period of 29 months, seven patients were still alive, while six had died (four due to distant progression, two due to grade 5 toxicity). Local progression was seen in two patients in association with further recurrences. One- and two-year local progression-free survival rates were 76.9% and 52.8%, respectively. Three cases of acute grade 3 esophagitis were seen. Two patients with central tumors developed late toxicity and died due to severe hemoptysis. These results suggest that a non-uniform and individualized dose escalation based on FDG-PET in IMRT delivery is feasible. The doses reached were higher in patients with peripheral compared to central tumors. This strategy enables good local control to be achieved at acceptable toxicity rates. However, dose escalation in centrally located tumors with direct invasion of mediastinal organs must be performed with great caution in order to avoid

  13. No-threshold dose-response curves for nongenotoxic chemicals: Findings and applications for risk assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sheehan, Daniel M.

    2006-01-15

    We tested the hypothesis that no threshold exists when estradiol acts through the same mechanism as an active endogenous estrogen. A Michaelis-Menten (MM) equation accounting for response saturation, background effects, and endogenous estrogen level fit a turtle sex-reversal data set with no threshold and estimated the endogenous dose. Additionally, 31 diverse literature dose-response data sets were analyzed by adding a term for nonhormonal background; good fits were obtained but endogenous dose estimations were not significant due to low resolving power. No thresholds were observed. Data sets were plotted using a normalized MM equation; all 178 data points were accommodated on a single graph. Response rates from ~1% to >95% were well fit. The findings contradict the threshold assumption and low-dose safety. Calculating risk and assuming additivity of effects from multiple chemicals acting through the same mechanism, rather than assuming a safe dose for nonthresholded curves, is appropriate.
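    The Michaelis-Menten form described above can be sketched as follows (parameter names are mine; the paper's exact parameterization may differ). The key point is that a nonzero endogenous dose d0 places zero administered dose on the rising part of the curve, so the slope at d = 0 is positive and no threshold exists.

    ```python
    # Sketch of a saturating Michaelis-Menten dose-response with a background
    # term and an endogenous-dose offset d0 (my naming of symbols).
    def mm_response(d, rmax, k, d0, background):
        # d: administered dose; d0: endogenous dose acting via the same mechanism
        return background + rmax * (d + d0) / (k + d + d0)

    # With d0 > 0, the response already sits above background at d = 0,
    # and an arbitrarily small added dose produces an added response:
    eps = 1e-6
    r0 = mm_response(0.0, rmax=100.0, k=10.0, d0=1.0, background=2.0)
    r1 = mm_response(eps, rmax=100.0, k=10.0, d0=1.0, background=2.0)
    print((r1 - r0) / eps)   # positive initial slope -> no threshold
    ```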

  14. Health Physics Society Comments to U.S. Environmental Protection Agency Regulatory Reform Task Force.

    PubMed

    Ring, Joseph; Tupin, Edward; Elder, Deirdre; Hiatt, Jerry; Sheetz, Michael; Kirner, Nancy; Little, Craig

    2018-05-01

    The Health Physics Society (HPS) provided comment to the U.S. Environmental Protection Agency (EPA) on options to consider when developing an action plan for President Trump's Executive Order to evaluate regulations for repeal, replacement, or modification. The HPS recommended that the EPA reconsider their adherence to the linear no-threshold (LNT) model for radiation risk calculations and improve several documents by better addressing uncertainties in low-dose, low dose-rate (LDDR) radiation exposure environments. The authors point out that use of the LNT model near background levels cannot provide reliable risk projections, use of the LNT model and collective-dose calculations in some EPA documents is inconsistent with the recommendations of international organizations, and some EPA documents have not been exposed to the public comment rule-making process. To assist in establishing a better scientific basis for the risks of low dose rate and low dose radiation exposure, the EPA should continue to support the "Million Worker Study," led by the National Council on Radiation Protection and Measurement.

  15. Graded-threshold parametric response maps: towards a strategy for adaptive dose painting

    NASA Astrophysics Data System (ADS)

    Lausch, A.; Jensen, N.; Chen, J.; Lee, T. Y.; Lock, M.; Wong, E.

    2014-03-01

    Purpose: To modify the single-threshold parametric response map (ST-PRM) method for predicting treatment outcomes in order to facilitate its use for guidance of adaptive dose painting in intensity-modulated radiotherapy. Methods: Multiple graded thresholds were used to extend the ST-PRM method (Nat. Med. 2009;15(5):572-576) such that the full functional change distribution within tumours could be represented with respect to multiple confidence interval estimates for functional changes in similar healthy tissue. The ST-PRM and graded-threshold PRM (GT-PRM) methods were applied to functional imaging scans of 5 patients treated for hepatocellular carcinoma. Pre and post-radiotherapy arterial blood flow maps (ABF) were generated from CT-perfusion scans of each patient. ABF maps were rigidly registered based on aligning tumour centres of mass. ST-PRM and GT-PRM analyses were then performed on overlapping tumour regions within the registered ABF maps. Main findings: The ST-PRMs contained many disconnected clusters of voxels classified as having a significant change in function. While this may be useful to predict treatment response, it may pose challenges for identifying boost volumes or for informing dose-painting by numbers strategies. The GT-PRMs included all of the same information as ST-PRMs but also visualized the full tumour functional change distribution. Heterogeneous clusters in the ST-PRMs often became more connected in the GT-PRMs by voxels with similar functional changes. Conclusions: GT-PRMs provided additional information which helped to visualize relationships between significant functional changes identified by ST-PRMs. This may enhance ST-PRM utility for guiding adaptive dose painting.
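    The difference between the single- and graded-threshold classifications can be sketched in a few lines (my construction; in the study the thresholds come from confidence-interval estimates on functional change in similar healthy tissue, while the numbers below are hypothetical):

    ```python
    # Sketch: grade each voxel by how many graded thresholds its absolute
    # functional change exceeds, instead of a single pass/fail cutoff.
    # Threshold values and ABF changes below are hypothetical.
    def graded_prm(delta_abf, thresholds):
        ts = sorted(thresholds)
        return [sum(abs(d) > t for t in ts) for d in delta_abf]

    change = [-40.0, -12.0, 3.0, 18.0, 55.0]   # per-voxel ABF change (hypothetical)
    print(graded_prm(change, thresholds=[10.0, 25.0, 50.0]))
    ```

    A single-threshold PRM keeps only grade 0 versus grade >= 1; the graded map preserves the full change distribution, which is what lets neighbouring voxels with similar changes form connected clusters.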

  16. Classification of radiation effects for dose limitation purposes: history, current situation and future prospects

    PubMed Central

    Hamada, Nobuyuki; Fujimichi, Yuki

    2014-01-01

    Radiation exposure causes cancer and non-cancer health effects, each of which differs greatly in the shape of the dose–response curve, latency, persistency, recurrence, curability, fatality and impact on quality of life. In recent decades, for dose limitation purposes, the International Commission on Radiological Protection has divided such diverse effects into tissue reactions (formerly termed non-stochastic and deterministic effects) and stochastic effects. On the one hand, effective dose limits aim to reduce the risks of stochastic effects (cancer/heritable effects) and are based on the detriment-adjusted nominal risk coefficients, assuming a linear-non-threshold dose response and a dose and dose rate effectiveness factor of 2. On the other hand, equivalent dose limits aim to avoid tissue reactions (vision-impairing cataracts and cosmetically unacceptable non-cancer skin changes) and are based on a threshold dose. However, the boundary between these two categories is becoming vague. Thus, we review the changes in radiation effect classification, dose limitation concepts, and the definition of detriment and threshold. Then, the current situation is overviewed focusing on (i) stochastic effects with a threshold, (ii) tissue reactions without a threshold, (iii) target organs/tissues for circulatory disease, (iv) dose levels for limitation of cancer risks vs prevention of non-life-threatening tissue reactions vs prevention of life-threatening tissue reactions, (v) mortality or incidence of thyroid cancer, and (vi) the detriment for tissue reactions. For future discussion, one approach is suggested that classifies radiation effects according to whether effects are life threatening, and radiobiological research needs are also briefly discussed. PMID:24794798

  17. Absorbed dose thresholds and absorbed dose rate limitations for studies of electron radiation effects on polyetherimides

    NASA Technical Reports Server (NTRS)

    Long, Edward R., Jr.; Long, Sheila Ann T.; Gray, Stephanie L.; Collins, William D.

    1989-01-01

    The threshold values of total absorbed dose for causing changes in tensile properties of a polyetherimide film and the limitations of the absorbed dose rate for accelerated-exposure evaluation of the effects of electron radiation in geosynchronous orbit were studied. Total absorbed doses from 1 kGy to 100 MGy and absorbed dose rates from 0.01 MGy/hr to 100 MGy/hr were investigated, where 1 Gy equals 100 rads. Total doses less than 2.5 MGy did not significantly change the tensile properties of the film whereas doses higher than 2.5 MGy significantly reduced elongation-to-failure. There was no measurable effect of the dose rate on the tensile properties for accelerated electron exposures.

  18. Scientific foundation of regulating ionizing radiation: application of metrics for evaluation of regulatory science information.

    PubMed

    Moghissi, A Alan; Gerraa, Vikrham Kumar; McBride, Dennis K; Swetnam, Michael

    2014-11-01

    This paper starts by describing the historical evolution of the assessment of biologic effects of ionizing radiation leading to the linear non-threshold (LNT) system currently used to regulate exposure to ionizing radiation. The paper briefly describes the concept of Best Available Science (BAS) and the Metrics for Evaluation of Scientific Claims (MESC) derived for BAS. It identifies three phases of regulatory science: the initial phase, when the regulators had to develop regulations without having the needed scientific information; the exploratory phase, when relevant tools were developed; and the standard operating phase, when the tools were applied to regulations. Subsequently, an attempt is made to apply the BAS/MESC system to various stages of LNT. The paper then compares the exposure limits imposed by regulatory agencies with each other and with naturally occurring radiation levels in several cities. Controversies about LNT are addressed, including judgments of the U.S. National Academies and their French counterpart. The paper concludes that, based on the BAS/MESC system, there is no disagreement between the two academies on the scientific foundation of LNT; instead, the disagreement is based on their judgment or speculation.

  19. Determination of the threshold dose distribution in photodynamic action from in vitro experiments.

    PubMed

    de Faria, Clara Maria Gonçalves; Inada, Natalia Mayumi; Kurachi, Cristina; Bagnato, Vanderlei Salvador

    2016-09-01

    The concept of a threshold in photodynamic action on cells or microorganisms is well observed experimentally but not fully explored in in vitro experiments. The intercomparison of the light doses and photosensitizers used across experiments is also poorly evaluated. In this report, we present an analytical model that allows extracting, from survival rate experiments, the threshold dose distribution, i.e., the distribution of energies and photosensitizer concentrations necessary to produce cell death. We then use this model to investigate photodynamic therapy (PDT) data previously published in the literature. The concept of a threshold dose distribution, instead of a single threshold value, is a rich concept for the comparison of photodynamic action in different situations, allowing analysis of its efficiency as well as determination of optimized conditions for PDT. We observed that, in general, as it becomes more difficult to kill a population, the distribution tends to broaden, which means it presents a large spectrum of threshold values within the same cell type population. From the distribution parameters (center peak and full width), we also observed a clear distinction among cell types regarding their response to PDT that can be quantified. Comparing data obtained from the same cell line and photosensitizer (PS), where the only distinct condition was the light source's wavelength, we found that the differences in the distribution parameters were comparable to the differences in the PS absorption. At last, we observed evidence that the threshold dose distribution matches the curve of apoptotic activity for some PSs. Copyright © 2016 Elsevier B.V. All rights reserved.
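    The link between a threshold dose distribution and a survival curve can be sketched as follows (a Gaussian threshold distribution is my assumption for illustration; the paper extracts the distribution from measured survival rates rather than assuming its shape):

    ```python
    import math

    # If each cell dies once its delivered dose exceeds a cell-specific
    # threshold, and thresholds are normally distributed (center mu, width
    # sigma), the surviving fraction is the complementary CDF of that
    # distribution.
    def survival(dose, mu, sigma):
        return 0.5 * math.erfc((dose - mu) / (sigma * math.sqrt(2.0)))

    # A broader threshold distribution (larger sigma) gives a shallower kill curve:
    for sigma in (2.0, 10.0):
        print([round(survival(d, mu=20.0, sigma=sigma), 3) for d in (10, 20, 30)])
    ```

    This mirrors the abstract's observation: populations that are harder to kill show broader distributions, i.e. survival falls off over a wider dose range around the center mu.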

  20. Oscillation patterns are enhanced and firing threshold is lowered in medullary respiratory neuron discharges by threshold doses of a μ-opioid receptor agonist

    PubMed Central

    Mifflin, Steve W.

    2017-01-01

    μ-Opioid receptors are distributed widely in the brain stem respiratory network, and opioids with selectivity for μ-type receptors slow in vivo respiratory rhythm in lowest effective doses. Several studies have reported μ-opioid receptor effects on the three-phase rhythm of respiratory neurons, but there are until now no reports of opioid effects on oscillatory activity within respiratory discharges. In this study, effects of the μ-opioid receptor agonist fentanyl on spike train discharge properties of several different types of rhythm-modulating medullary respiratory neuron discharges were analyzed. Doses of fentanyl that were just sufficient for prolongation of discharges and slowing of the three-phase respiratory rhythm also produced pronounced enhancement of spike train properties. Oscillation and burst patterns detected by autocorrelation measurements were greatly enhanced, and interspike intervals were prolonged. Spike train properties under control conditions and after fentanyl were uniform within each experiment, but varied considerably between experiments, which might be related to variability in acid-base balance in the brain stem extracellular fluid. Discharge threshold was shifted to more negative levels of membrane potential. The effects on threshold are postulated to result from opioid-mediated disinhibition and postsynaptic enhancement of N-methyl-D-aspartate receptor current. Lowering of firing threshold, enhancement of spike train oscillations and bursts and prolongation of discharges by lowest effective doses of fentanyl could represent compensatory adjustments in the brain stem respiratory network to override opioid blunting of CO2/pH chemosensitivity. PMID:28202437
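    The autocorrelation measurement mentioned above can be sketched for a binned spike train (illustrative code, not the study's analysis pipeline): periodic burst structure shows up as positive side peaks in the autocorrelogram at the oscillation period.

    ```python
    # Sketch: normalized autocorrelation of a binned spike train; oscillatory
    # discharge structure appears as peaks at lags equal to the period.
    def autocorr(binned, max_lag):
        n = len(binned)
        mean = sum(binned) / n
        var = sum((b - mean) ** 2 for b in binned) or 1.0
        return [sum((binned[i] - mean) * (binned[i + lag] - mean)
                    for i in range(n - lag)) / var
                for lag in range(max_lag + 1)]

    # A train with period-5 structure (hypothetical data) peaks near lag 5:
    train = [1, 0, 0, 0, 0] * 20
    ac = autocorr(train, 10)
    print(ac[5] > ac[3])
    ```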

  1. History of dose specification in Brachytherapy: From Threshold Erythema Dose to Computational Dosimetry

    NASA Astrophysics Data System (ADS)

    Williamson, Jeffrey F.

    2006-09-01

    This paper briefly reviews the evolution of brachytherapy dosimetry from 1900 to the present. Dosimetric practices in brachytherapy fall into three distinct eras: During the era of biological dosimetry (1900-1938), radium pioneers could only specify Ra-226 and Rn-222 implants in terms of the mass of radium encapsulated within the implanted sources. Due to the high energy of its emitted gamma rays and the long range of its secondary electrons in air, free-air chambers could not be used to quantify the output of Ra-226 sources in terms of exposure. Biological dosimetry, most prominently the threshold erythema dose, gained currency as a means of intercomparing radium treatments with exposure-calibrated orthovoltage x-ray units. The classical dosimetry era (1940-1980) began with successful exposure standardization of Ra-226 sources by Bragg-Gray cavity chambers. Classical dose-computation algorithms, based upon 1-D buildup factor measurements and point-source superposition computational algorithms, were able to accommodate artificial radionuclides such as Co-60, Ir-192, and Cs-137. The quantitative dosimetry era (1980-present) arose in response to the increasing utilization of low-energy K-capture radionuclides such as I-125 and Pd-103, for which classical approaches could not be expected to estimate doses accurately. This led to intensive development of both experimental (largely TLD-100 dosimetry) and Monte Carlo dosimetry techniques along with more accurate air-kerma strength standards. As a result of extensive benchmarking and intercomparison of these different methods, single-seed low-energy radionuclide dose distributions are now known with a total uncertainty of 3%-5%.

  2. Non-invasive indices for the estimation of the anaerobic threshold of oarsmen.

    PubMed

    Erdogan, A; Cetin, C; Karatosun, H; Baydar, M L

    2010-01-01

    This study compared four common non-invasive indices with an invasive index for determining the anaerobic threshold (AT) in 22 adult male rowers using a Concept2 rowing ergometer. A criterion-standard progressive incremental test (invasive method) measured blood lactate concentrations to determine the 4 mmol/l threshold (La4-AT) and the Dmax AT (Dm-AT). This was compared with three indices obtained by analysis of respiratory gases and one based on the heart rate (HR) deflection point (HRDP), all of which used the Conconi test (non-invasive methods). In the Conconi test, the HRDP was determined whilst continuously increasing the power output (PO) by 25 W/min and measuring respiratory gases and HR. The La4-AT and Dm-AT values differed slightly with respect to oxygen uptake, PO, and HR; however, the AT values significantly correlated with each other and with the four non-invasive methods. In conclusion, the non-invasive indices were comparable with the invasive index and could, therefore, be used in the assessment of AT during rowing ergometer use. In this population of elite rowers, the Conconi threshold (Con-AT), based on the measurement of HRDP, tended to be the most adequate way of estimating AT for training regulation purposes.
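    The Dmax construction referenced above can be sketched as follows (the standard geometric definition; the study's curve-fitting details may differ, and the data points below are hypothetical): the threshold is taken at the point on the lactate-power curve farthest from the chord joining its endpoints.

    ```python
    import math

    # Dmax method: find the sample with maximal perpendicular distance from the
    # straight line joining the first and last (power, lactate) points.
    def dmax_index(power, lactate):
        x1, y1, x2, y2 = power[0], lactate[0], power[-1], lactate[-1]
        denom = math.hypot(x2 - x1, y2 - y1)
        def dist(x, y):  # perpendicular distance from the chord
            return abs((y2 - y1) * x - (x2 - x1) * y + x2 * y1 - y2 * x1) / denom
        return max(range(len(power)), key=lambda i: dist(power[i], lactate[i]))

    power = [100, 125, 150, 175, 200, 225, 250]    # W (hypothetical)
    lactate = [1.0, 1.1, 1.3, 1.8, 2.6, 4.5, 7.0]  # mmol/L (hypothetical)
    i = dmax_index(power, lactate)
    print(power[i])
    ```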

  3. Image quality, threshold contrast and mean glandular dose in CR mammography

    NASA Astrophysics Data System (ADS)

    Jakubiak, R. R.; Gamba, H. R.; Neves, E. B.; Peixoto, J. E.

    2013-09-01

    In many countries, computed radiography (CR) systems represent the majority of equipment used in digital mammography. This study presents a method for optimizing image quality and dose in CR mammography of patients with breast thicknesses between 45 and 75 mm. Initially, clinical images of 67 patients (group 1) were analyzed by three experienced radiologists, reporting on anatomical structures, noise and contrast in low and high pixel value areas, and image sharpness and contrast. Exposure parameters (kV, mAs and target/filter combination) used in the examinations of these patients were reproduced to determine the contrast-to-noise ratio (CNR) and mean glandular dose (MGD). The parameters were also used to radiograph a CDMAM (version 3.4) phantom (Artinis Medical Systems, The Netherlands) for image threshold contrast evaluation. After that, different breast thicknesses were simulated with polymethylmethacrylate layers and various sets of exposure parameters were used in order to determine optimal radiographic parameters. For each simulated breast thickness, optimal beam quality was defined as giving a target CNR to reach the threshold contrast of CDMAM images for acceptable MGD. These results were used for adjustments in the automatic exposure control (AEC) by the maintenance team. Using optimized exposure parameters, clinical images of 63 patients (group 2) were evaluated as described above. Threshold contrast, CNR and MGD for such exposure parameters were also determined. Results showed that the proposed optimization method was effective for all breast thicknesses studied in phantoms. The best result was found for breasts of 75 mm. While in group 1 there was no detection of the 0.1 mm critical-diameter detail with threshold contrast below 23%, after the optimization, detection occurred in 47.6% of the images. There was also an average MGD reduction of 7.5%. The clinical image quality criteria were met in 91.7% for all breast thicknesses evaluated in both groups.
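    The contrast-to-noise ratio driving the optimization can be sketched with its standard definition (the ROI values below are hypothetical pixel statistics, not the study's measurements):

    ```python
    # CNR from the mean pixel values of a contrast ROI and a background ROI,
    # normalized by the background noise (standard deviation). Numbers are
    # hypothetical.
    def cnr(mean_signal, mean_background, noise_background):
        return (mean_signal - mean_background) / noise_background

    print(cnr(1520.0, 1385.0, 18.0))
    ```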

  4. Lessons to be learned from a contentious challenge to mainstream radiobiological science (the linear no-threshold theory of genetic mutations)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beyea, Jan, E-mail: jbeyea@cipi.com

    There are both statistically valid and invalid reasons why scientists with differing default hypotheses can disagree in high-profile situations. Examples can be found in recent correspondence in this journal, which may offer lessons for resolving challenges to mainstream science, particularly when adherents of a minority view attempt to elevate the status of outlier studies and/or claim that self-interest explains the acceptance of the dominant theory. Edward J. Calabrese and I have been debating the historical origins of the linear no-threshold theory (LNT) of carcinogenesis and its use in the regulation of ionizing radiation. Professor Calabrese, a supporter of hormesis, has charged a committee of scientists with misconduct in their preparation of a 1956 report on the genetic effects of atomic radiation. Specifically he argues that the report mischaracterized the LNT research record and suppressed calculations of some committee members. After reviewing the available scientific literature, I found that the contemporaneous evidence overwhelmingly favored a (genetics) LNT and that no calculations were suppressed. Calabrese's claims about the scientific record do not hold up primarily because of lack of attention to statistical analysis. Ironically, outlier studies were more likely to favor supra-linearity, not sub-linearity. Finally, the claim of investigator bias, which underlies Calabrese's accusations about key studies, is based on misreading of text. Attention to ethics charges, early on, may help seed a counter narrative explaining the community's adoption of a default hypothesis and may help focus attention on valid evidence and any real weaknesses in the dominant paradigm. - Highlights: • Edward J Calabrese has made a contentious challenge to mainstream radiobiological science. • Such challenges should not be neglected, lest they enter the political arena without review. • Key genetic studies from the 1940s, challenged by Calabrese, were found

  5. Oscillation patterns are enhanced and firing threshold is lowered in medullary respiratory neuron discharges by threshold doses of a μ-opioid receptor agonist.

    PubMed

    Lalley, Peter M; Mifflin, Steve W

    2017-05-01

    μ-Opioid receptors are distributed widely in the brain stem respiratory network, and opioids with selectivity for μ-type receptors slow the in vivo respiratory rhythm at the lowest effective doses. Several studies have reported μ-opioid receptor effects on the three-phase rhythm of respiratory neurons, but until now there have been no reports of opioid effects on oscillatory activity within respiratory discharges. In this study, effects of the μ-opioid receptor agonist fentanyl on spike train discharge properties of several different types of rhythm-modulating medullary respiratory neuron discharges were analyzed. Doses of fentanyl that were just sufficient for prolongation of discharges and slowing of the three-phase respiratory rhythm also produced pronounced enhancement of spike train properties. Oscillation and burst patterns detected by autocorrelation measurements were greatly enhanced, and interspike intervals were prolonged. Spike train properties under control conditions and after fentanyl were uniform within each experiment but varied considerably between experiments, which might be related to variability in acid-base balance in the brain stem extracellular fluid. Discharge threshold was shifted to more negative levels of membrane potential. The effects on threshold are postulated to result from opioid-mediated disinhibition and postsynaptic enhancement of N-methyl-D-aspartate receptor current. Lowering of firing threshold, enhancement of spike train oscillations and bursts, and prolongation of discharges by the lowest effective doses of fentanyl could represent compensatory adjustments in the brain stem respiratory network to override opioid blunting of CO2/pH chemosensitivity. Copyright © 2017 the American Physiological Society.

  6. Avascular Necrosis of the Femoral Head After Palliative Radiotherapy in Metastatic Prostate Cancer: Absence of a Dose Threshold?

    PubMed

    Daoud, Alia M; Hudson, Mack; Magnus, Kenneth G; Huang, Fleur; Danielson, Brita L; Venner, Peter; Saluja, Ronak; LeGuerrier, Bronwen; Daly, Helene; Emmenegger, Urban; Fairchild, Alysa

    2016-03-06

    Avascular necrosis (AVN) is the final common pathway resulting from insufficient blood supply to bone, commonly the femoral head. There are many postulated etiologies of non-traumatic AVN, including corticosteroids, bisphosphonates, and radiotherapy (RT). However, it is unclear whether there is a dose threshold for the development of RT-induced AVN. In this case report, we describe a patient with prostate cancer metastatic to bone diagnosed with AVN after receiving single-fraction palliative RT to the left femoral head. Potential contributing factors are discussed, along with a review of other reported cases. At present, the RT dose threshold below which there is no risk for AVN is unknown, and therefore detrimental impact from the RT cannot be excluded. Given the possibility that RT-induced AVN is a stochastic effect, it is important to be aware of the possibility of this diagnosis in any patient with a painful hip who has received RT to the femoral head.

  7. Dose and Effect Thresholds for Early Key Events in a Mode of ...

    EPA Pesticide Factsheets

    ABSTRACT Strategies for predicting adverse health outcomes of environmental chemicals are centered on early key events in toxicity pathways. However, quantitative relationships between early molecular changes in a given pathway and later health effects are often poorly defined. The goal of this study was to evaluate short-term key event indicators using qualitative and quantitative methods in an established pathway of mouse liver tumorigenesis mediated by peroxisome proliferator-activated receptor-alpha (PPARα). Male B6C3F1 mice were exposed for 7 days to di(2-ethylhexyl) phthalate (DEHP), di-n-octyl phthalate (DNOP), and n-butyl benzyl phthalate (BBP), which vary in PPARα activity and liver tumorigenicity. Each phthalate increased expression of select PPARα target genes at 7 days, while only DEHP significantly increased liver cell proliferation labeling index (LI). Transcriptional benchmark dose (BMDT) estimates for dose-related genomic markers stratified phthalates according to hypothetical tumorigenic potencies, unlike BMDs for non-genomic endpoints (liver weights or proliferation). The 7-day BMDT values for Acot1 as a surrogate measure for PPARα activation were 29, 370, and 676 mg/kg-d for DEHP, DNOP, and BBP, respectively, distinguishing DEHP (liver tumor BMD of 35 mg/kg-d) from non-tumorigenic DNOP and BBP. Effect thresholds were generated using linear regression of DEHP effects at 7 days and 2-year tumor incidence values to anchor early response molec

  8. A comparison of the fragmentation thresholds and inertial cavitation doses of different ultrasound contrast agents

    NASA Astrophysics Data System (ADS)

    Chen, Wen-Shiang; Matula, Thomas J.; Brayman, Andrew A.; Crum, Lawrence A.

    2003-01-01

    Contrast bubble destruction is important in several new diagnostic and therapeutic applications. The pressure threshold of destruction is determined by the shell material, while the propensity of the bubbles to undergo inertial cavitation (IC) depends on both the gas and shell properties of the ultrasound contrast agent (UCA). The ultrasonic fragmentation thresholds of three specific UCAs (Optison, Sonazoid, and biSpheres), each with different shell and gas properties, were determined under various acoustic conditions. The acoustic emissions generated by the agents, or their derivatives, characteristic of IC after fragmentation, were also compared using cumulated broadband-noise emissions (IC "dose"). Albumin-shelled Optison and surfactant-shelled Sonazoid had low fragmentation thresholds (mean=0.13 and 0.15 MPa at 1.1 MHz, 0.48 and 0.58 MPa at 3.5 MHz, respectively), while polymer-shelled biSpheres had a significantly higher threshold (mean=0.19 and 0.23 MPa at 1.1 MHz, 0.73 and 0.96 MPa for thin- and thick-shell biSpheres at 3.5 MHz, respectively, p<0.01). At comparable initial concentrations, surfactant-shelled Sonazoid produced a much larger IC dose after shell destruction than did either biSpheres or Optison (p<0.01). Thick-shelled biSpheres had the highest fragmentation threshold and produced the lowest IC dose. More than two and five acoustic cycles, respectively, were necessary for the thin- and thick-shell biSpheres to reach a steady-state fragmentation threshold.

  9. Lessons to be learned from a contentious challenge to mainstream radiobiological science (the linear no-threshold theory of genetic mutations).

    PubMed

    Beyea, Jan

    2017-04-01

    There are both statistically valid and invalid reasons why scientists with differing default hypotheses can disagree in high-profile situations. Examples can be found in recent correspondence in this journal, which may offer lessons for resolving challenges to mainstream science, particularly when adherents of a minority view attempt to elevate the status of outlier studies and/or claim that self-interest explains the acceptance of the dominant theory. Edward J. Calabrese and I have been debating the historical origins of the linear no-threshold theory (LNT) of carcinogenesis and its use in the regulation of ionizing radiation. Professor Calabrese, a supporter of hormesis, has charged a committee of scientists with misconduct in their preparation of a 1956 report on the genetic effects of atomic radiation. Specifically he argues that the report mischaracterized the LNT research record and suppressed calculations of some committee members. After reviewing the available scientific literature, I found that the contemporaneous evidence overwhelmingly favored a (genetics) LNT and that no calculations were suppressed. Calabrese's claims about the scientific record do not hold up primarily because of lack of attention to statistical analysis. Ironically, outlier studies were more likely to favor supra-linearity, not sub-linearity. Finally, the claim of investigator bias, which underlies Calabrese's accusations about key studies, is based on misreading of text. Attention to ethics charges, early on, may help seed a counter narrative explaining the community's adoption of a default hypothesis and may help focus attention on valid evidence and any real weaknesses in the dominant paradigm. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  10. In-vitro singlet oxygen threshold dose at PDT with Radachlorin photosensitizer

    NASA Astrophysics Data System (ADS)

    Klimenko, V. V.; Shmakov, S. V.; Kaydanov, N. E.; Knyazev, N. A.; Kazakov, N. V.; Rusanov, A. A.; Bogdanov, A. A.; Dubina, M. V.

    2017-07-01

    In the present study, we investigated Radachlorin photosensitizer accumulation in K562 and HeLa cells and determined cell viability after PDT. Using macroscopic singlet oxygen modeling and the cellular photosensitizer concentration, the singlet oxygen threshold doses for K562 and HeLa cells were calculated.

  11. Nefopam, a Non-sedative Benzoxazocine Analgesic, Selectively Reduces the Shivering Threshold

    PubMed Central

    Alfonsi, Pascal; Adam, Frederic; Passard, Andrea; Guignard, Bruno; Sessler, Daniel I.; Chauvin, Marcel

    2005-01-01

    Background The analgesic nefopam does not compromise ventilation, is minimally sedating, and is effective as a treatment for postoperative shivering. We evaluated the effects of nefopam on the major thermoregulatory responses in humans: sweating, vasoconstriction, and shivering. Methods Nine volunteers were studied on three randomly assigned days: 1) control (Saline), 2) nefopam at a target plasma concentration of 35 ng/ml (Small Dose), and 3) nefopam at a target concentration of 70 ng/ml (Large Dose, ≈20 mg total). Each day, skin and core temperatures were increased to provoke sweating and then reduced to elicit peripheral vasoconstriction and shivering. We determined the thresholds (triggering core temperature at a designated skin temperature of 34°C) by mathematically compensating for changes in skin temperature using the established linear cutaneous contributions to control of each response. Results Nefopam did not significantly modify the slopes for sweating (0.0 ± 4.9°C·μg−1·ml; r2 = 0.73 ± 0.32) or vasoconstriction (−3.6 ± 5.0°C·μg−1·ml; r2=−0.47± 0.41). In contrast, nefopam significantly reduced the slope of shivering (−16.8 ± 9.3°C·μg−1·ml; r2 = 0.92 ± 0.06). Large-Dose nefopam thus reduced the shivering threshold by 0.9 ± 0.4°C (P<0.001) without any discernable effect on the sweating or vasoconstriction thresholds. Conclusions Most drugs with thermoregulatory actions — including anesthetics, sedatives, and opioids — synchronously reduce the vasoconstriction and shivering thresholds. Nefopam however reduced only the shivering threshold. This pattern has not previously been reported for a centrally acting drug. That pharmacologic modulation of vasoconstriction and shivering can be separated is of clinical and physiologic interest. PMID:14695722

  12. A Phase II, Randomized, Double-Blind, Placebo Controlled, Dose-Response Trial of the Melatonin Effect on the Pain Threshold of Healthy Subjects

    PubMed Central

    Stefani, Luciana Cadore; Muller, Suzana; Torres, Iraci L. S.; Razzolini, Bruna; Rozisky, Joanna R.; Fregni, Felipe; Markus, Regina; Caumo, Wolnei

    2013-01-01

    Background Previous studies have suggested that melatonin may produce antinociception through peripheral and central mechanisms. Based on the preliminary encouraging results of studies of the effects of melatonin on pain modulation, the important question has been raised of whether melatonin's effect on pain modulation in humans is dose-dependent. Objective The objective was to evaluate the analgesic dose response of the effects of melatonin on pressure and heat pain threshold and tolerance, and the sedative effects. Methods Sixty-one healthy subjects aged 19 to 47 y were randomized into one of four groups: placebo, 0.05 mg/kg sublingual melatonin, 0.15 mg/kg sublingual melatonin or 0.25 mg/kg sublingual melatonin. We determined the pressure pain threshold (PPT) and the pressure pain tolerance (PPTo). Quantitative sensory testing (QST) was used to measure the heat pain threshold (HPT) and the heat pain tolerance (HPTo). Sedation was assessed with a visual analogue scale and bispectral analysis. Results Serum plasma melatonin levels were directly proportional to the melatonin doses given to each subject. We observed a significant effect associated with dose group. Post hoc analysis indicated significant differences between the placebo vs. the intermediate (0.15 mg/kg) and the highest (0.25 mg/kg) melatonin doses for all pain threshold and sedation level tests. A linear regression model indicated a significant association between the serum melatonin concentrations and changes in pain threshold and pain tolerance (R2 = 0.492 for HPT, R2 = 0.538 for PPT, R2 = 0.558 for HPTo and R2 = 0.584 for PPTo). Conclusions The present data indicate that sublingual melatonin exerts well-defined dose-dependent antinociceptive activity. There is a correlation between the plasma melatonin drug concentration and acute changes in the pain threshold. These results provide additional support for the investigation of melatonin as an analgesic agent. Brazilian Clinical
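
    The abstract reports R² values from a linear regression of pain-threshold change on serum melatonin concentration. A minimal ordinary-least-squares sketch of that kind of fit follows; the data points are invented for illustration, not values from the study:

    ```python
    def linear_fit(x, y):
        """Ordinary least-squares line y = a + b*x, returning (a, b, R^2)."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        sxx = sum((xi - mx) ** 2 for xi in x)
        sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
        b = sxy / sxx                      # slope
        a = my - b * mx                    # intercept
        ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
        ss_tot = sum((yi - my) ** 2 for yi in y)
        return a, b, 1 - ss_res / ss_tot   # R^2 = 1 - SS_res / SS_tot

    # hypothetical data: serum melatonin (pg/ml) vs. change in pressure pain threshold
    conc = [10, 50, 120, 200, 310, 420]
    delta = [0.1, 0.4, 0.9, 1.6, 2.4, 3.1]
    a, b, r2 = linear_fit(conc, delta)
    print(f"slope={b:.4f}, intercept={a:.3f}, R^2={r2:.3f}")
    ```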

  13. Threshold-type dose response for induction of neoplastic transformation by 1 GeV/nucleon iron ions.

    PubMed

    Elmore, E; Lao, X-Y; Kapadia, R; Redpath, J L

    2009-06-01

    Neoplastic transformation of HeLa x skin fibroblast human hybrid cells by doses of 1 GeV/nucleon iron ions in the range 1 cGy to 1 Gy to exposed cultures has been examined. The data indicate a threshold-type dose-response curve with no increase in transformation frequency until doses above 20 cGy. At doses <10 cGy, not all exposed cells receive a direct traversal of an iron-ion track core, but all exposed cells receive up to several mGy of low-LET radiation associated with the delta-ray penumbra. It is proposed that the threshold-type response seen is a consequence of an adaptive response associated with the delta-ray exposure. For comparison purposes, the dose response for (137)Cs gamma rays over the same dose range was examined using the same experimental procedure. As we have shown previously, the dose response for (137)Cs gamma radiation was J-shaped. The iron ions were 1.5 to 1.7 times more biologically effective than the gamma radiation over the dose range examined.
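
    The threshold-type curve described above, flat up to roughly 20 cGy and rising with dose beyond it, can be sketched as a "hockey-stick" model. All parameter values here are illustrative placeholders, not fitted values from the study:

    ```python
    def transformation_frequency(dose_cgy, baseline=2e-5, threshold=20.0, slope=1e-6):
        """Threshold ("hockey-stick") dose response: constant at the baseline
        frequency below the threshold, linear in (dose - threshold) above it.
        Parameter values are illustrative placeholders only."""
        excess = max(0.0, dose_cgy - threshold)
        return baseline + slope * excess

    for d in (1, 10, 20, 50, 100):
        print(f"{d:>3} cGy -> {transformation_frequency(d):.2e}")
    ```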

  14. Nonlinear dose response model with repair and repair suppression

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leonard, B.E.

    1996-12-31

    In March 1996, the Health Physics Society issued a position statement supporting a nonlinear threshold (NLT) concept for radiation risk at low-dose/low-dose-rate (LD/LDR) levels. This action followed receipt of an overwhelming consensus from world-renowned radiobiologists and is contrary to the opinions of the United Nations Scientific Committee on the Effects of Atomic Radiation, the National Research Council Committee on the Biological Effects of Ionizing Radiations, and the U.S. Environmental Protection Agency. Alvarez and others have called for a new NLT model for radiation risk. Two mathematical models have historically been used to describe cell survival experimental results. Each provides the ability to account for the shoulder observed in cell survival curves, predominantly for low-linear energy transfer (LET) radiation, and the wide variation in radiosensitivity of cell species and particular phases of the mitotic cycle. Only Kellerer and Rossi, Elkind and Whitmore, and Green and Burki have proposed modified models explicitly incorporating radiobiological repair and departing from LNT. None of these were subsequently used with any extent of success in cell survival analysis. The author reports initial work on a program to reexamine radiobiology research exhibiting repair processes at LD/LDR levels.

  15. Perceptual thresholds for non-ideal diffuse field reverberation.

    PubMed

    Romblom, David; Guastavino, Catherine; Depalle, Philippe

    2016-11-01

    The objective of this study is to understand listeners' sensitivity to directional variations in non-ideal diffuse field reverberation. An ABX discrimination test was conducted using a semi-spherical 28-loudspeaker array; perceptual thresholds were estimated by systematically varying the level of a segment of loudspeakers for lateral, height, and frontal conditions. The overall energy was held constant using a gain compensation scheme. When compared to an ideal diffuse field, the perceptual threshold for detection is -2.5 dB for the lateral condition, -6.8 dB for the height condition, and -3.2 dB for the frontal condition. Measurements of the experimental stimuli were analyzed using a Head and Torso Simulator as well as with opposing cardioid microphones aligned on the three Cartesian axes. Additionally, opposing cardioid measurements made in an acoustic space demonstrate that level differences corresponding to the perceptual thresholds can be found in practice. These results suggest that non-ideal diffuse field reverberation may be a previously unrecognized component of spatial impression.
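
    The gain compensation scheme described above, holding overall energy constant while the level of one loudspeaker segment is varied, can be sketched as follows. The paper does not give the formula, so the renormalization rule used here (rescaling the remaining channels to preserve the sum of squared gains) is an assumption:

    ```python
    import math

    def compensate_gains(gains, segment, segment_gain_db):
        """Apply segment_gain_db to the loudspeakers in `segment`, then rescale
        the remaining channels so the total energy (sum of squared gains) is
        unchanged. The renormalization rule is assumed, not taken from the paper."""
        g = 10 ** (segment_gain_db / 20)                       # dB -> linear gain
        total = sum(x * x for x in gains)                      # energy before change
        seg_energy = sum(gains[i] ** 2 for i in segment) * g * g
        rest = [i for i in range(len(gains)) if i not in segment]
        rest_energy = sum(gains[i] ** 2 for i in rest)
        scale = math.sqrt((total - seg_energy) / rest_energy)  # compensating factor
        out = list(gains)
        for i in segment:
            out[i] *= g
        for i in rest:
            out[i] *= scale
        return out

    # 4 channels at unit gain; attenuate channel 0 by -2.5 dB (the lateral threshold)
    new = compensate_gains([1.0] * 4, {0}, -2.5)
    print(sum(x * x for x in new))  # total energy preserved
    ```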

  16. Diethylene glycol-induced toxicities show marked threshold dose response in rats

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Landry, Greg M., E-mail: Landry.Greg@mayo.edu; Dunning, Cody L., E-mail: cdunni@lsuhsc.edu; Abreo, Fleurette, E-mail: fabreo@lsuhsc.edu

    Diethylene glycol (DEG) exposure poses risks to human health because of widespread industrial use and accidental exposures from contaminated products. To enhance the understanding of the mechanistic role of metabolites in DEG toxicity, this study used a dose response paradigm to determine a rat model that would best mimic DEG exposure in humans. Wistar and Fischer-344 (F-344) rats were treated by oral gavage with 0, 2, 5, or 10 g/kg DEG, and blood, kidney and liver tissues were collected at 48 h. Both rat strains treated with 10 g/kg DEG had equivalent degrees of metabolic acidosis, renal toxicity (increased BUN and creatinine and cortical necrosis) and liver toxicity (increased serum enzyme levels, centrilobular necrosis and severe glycogen depletion). There was no liver or kidney toxicity at the lower DEG doses (2 and 5 g/kg) regardless of strain, demonstrating a steep threshold dose response. Kidney diglycolic acid (DGA), the presumed nephrotoxic metabolite of DEG, was markedly elevated in both rat strains administered 10 g/kg DEG, but no DGA was present at 2 or 5 g/kg, asserting its necessary role in DEG-induced toxicity. These results indicate that, mechanistically, metabolism to and significant target organ accumulation of DGA are required to produce toxicity, and that both strains would be useful for DEG risk assessments. - Highlights: • DEG produces a steep threshold dose response for kidney injury in rats. • Wistar and F-344 rats do not differ in response to DEG-induced renal injury. • The dose response for renal injury closely mirrors that for renal DGA accumulation. • Results demonstrate the importance of DGA accumulation in producing kidney injury.

  17. A threshold dose distribution approach for the study of PDT resistance development: A threshold distribution approach for the study of PDT resistance.

    PubMed

    de Faria, Clara Maria Gonçalves; Inada, Natalia Mayumi; Vollet-Filho, José Dirceu; Bagnato, Vanderlei Salvador

    2018-05-01

    Photodynamic therapy (PDT) is a technique with well-established principles that often demands repeated applications for sequential elimination of tumor cells. An important question concerns the way cells surviving one treatment behave in the subsequent one. Threshold dose is a core concept in PDT dosimetry: the minimum amount of energy that must be delivered for cell destruction via PDT. Concepts of threshold distribution have proven to be an important tool for analyzing PDT results in vitro. In this study, we used some of these concepts to demonstrate that subsequent treatments with partial elimination of cells modify the distribution, which represents an increased resistance of the cells to the photodynamic action. HepG2 and HepaRG cells were used as models of tumor and normal liver cells, respectively, and a protocol to induce resistance, consisting of repeated PDT sessions using Photogem® as the photosensitizer, was applied to the tumor cells. The response of these cells to PDT was assessed using a standard viability assay, and the dose response curves were used to derive the threshold distributions. The changes in the distribution revealed that the resistance protocol effectively eliminated the most sensitive cells. Nevertheless, the HepaRG cell line was the most resistant among the cells analyzed, which indicates a specificity in clinical applications that enables the use of high doses and drug concentrations with minimal damage to the surrounding normal tissue. Copyright © 2018 Elsevier B.V. All rights reserved.

  18. Recent international regulations: low dose-low rate radiation protection and the demise of reason.

    PubMed

    Okkalides, Demetrios

    2008-01-01

    The radiation protection measures suggested by the International Commission on Radiological Protection (ICRP), national regulating bodies and experts have become ever stricter despite the lack of any information supporting the existence of the linear no-threshold model (LNT) and of any adverse effects of Low Dose Low Rate (LDLR) irradiation. This tendency arises from the disproportionate response of human society to hazards that are currently in fashion, and is unreasonable. The 1 mSv/year dose limit for the public suggested by the ICRP corresponds to a 1/18,181 detriment-adjusted cancer risk and is much lower than other hazards faced by modern societies, e.g. driving and smoking, which carry corresponding risks of 1/2,100 and 1/2,000. Even the worldwide rate of deadly work accidents is higher, at 1/8,065. Such excessive safety measures against minimal risks from man-made radiation sources divert resources from very real and much greater hazards. In addition, they undermine research and development of radiation technology and tend to subjugate science and the quest for understanding nature to phobic practices.
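
    The 1/18,181 figure quoted above is what the LNT model yields for 1 mSv if one assumes the ICRP's nominal detriment-adjusted risk coefficient of about 5.5×10⁻² per sievert for the whole population; a quick arithmetic check:

    ```python
    # Detriment-adjusted cancer risk under LNT for the 1 mSv/year public dose limit.
    # Assumes the ICRP nominal risk coefficient of 5.5e-2 per Sv (whole population).
    RISK_PER_SV = 5.5e-2   # detriment-adjusted nominal risk coefficient, per Sv
    dose_sv = 1e-3         # 1 mSv public dose limit, expressed in Sv

    risk = dose_sv * RISK_PER_SV   # LNT: risk scales linearly with dose
    one_in_n = 1 / risk            # express as "1 in N"

    print(f"risk = {risk:.2e}  (about 1 in {one_in_n:,.0f})")
    ```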

  19. Skin entrance dose with and without lead apron in digital panoramic radiography for selected sensitive body regions.

    PubMed

    Schulze, Ralf Kurt Willy; Cremers, Catrin; Karle, Heiko; de Las Heras Gala, Hugo

    2017-05-01

    The aim of this study was to compare the dose at skin level at five significant anatomical regions for panoramic radiography devices with and without lead apron by means of a highly sensitive dosimeter. A female RANDO-phantom was exposed in five different digital panoramic radiography systems, and the dose at skin level was assessed tenfold for each measurement region by means of a highly sensitive solid-state-dosimeter. The five measurement regions selected were the thyroid, both female breasts, the gonads, and a central region in the back of the phantom. For each panoramic machine, the measurements were performed in two modes: with and without a commercial lead apron specifically designed for panoramic radiography. Reproducibility of the measurements was expressed by absolute differences and the coefficient of variation. Values between shielded and unshielded doses were pooled for each region and compared by means of the paired Wilcoxon tests (p ≤ 0.05). Reproducibility as represented by the mean CV was 22 ± 52 % (median 2.3 %) with larger variations for small dose values. Doses at skin level ranged between 0.00 μGy at the gonads and 85.39 μGy at the unshielded thyroid (mean ± SD 15 ± 24 μGy). Except for the gonads, the dose in all the other regions was significantly lower (p < 0.001) when a lead apron was applied. Unshielded doses were between 1.02-fold (thyroid) and 112-fold (at the right breast) higher than those with lead apron shielding (mean: 14-fold ± 18-fold). Although the doses were entirely very low, we observed a significant increase in dose in the radiation-sensitive female breast region when no lead apron was used. Future discussions on shielding requirements for panoramic radiography should focus on these differences in the light of the linear non-threshold (LNT) theory which is generally adopted in medical imaging.

  20. Cavitation and non-cavitation regime for large-scale ultrasonic standing wave particle separation systems--In situ gentle cavitation threshold determination and free radical related oxidation.

    PubMed

    Johansson, Linda; Singh, Tanoj; Leong, Thomas; Mawson, Raymond; McArthur, Sally; Manasseh, Richard; Juliano, Pablo

    2016-01-01

    Here we suggest a novel and straightforward approach, using the sonochemiluminescent chemical luminol, to guide the design of liter-scale ultrasonic standing-wave particle-manipulation systems in terms of frequency and acoustic power for operation in either the cavitation or the non-cavitation regime. We show that this method offers a simple way of determining the cavitation threshold in situ for a selected separation vessel geometry. Since the pressure field is system specific, the cavitation threshold is system specific (within the threshold parameter range). In this study we discuss cavitation effects and also measure one implication of cavitation for the application of milk fat separation: the degree of milk fat lipid oxidation, by headspace volatile measurements. For the evaluated vessel, 2 MHz operation, as opposed to 1 MHz, enabled operation under non-cavitation or low-cavitation conditions as measured by the luminol intensity threshold method. In all cases the lipid-oxidation-derived volatiles were below the human sensory detection level. Ultrasound treatment did not significantly influence the oxidative changes in milk for either 1 MHz (doses of 46 kJ/L and 464 kJ/L) or 2 MHz (doses of 37 kJ/L and 373 kJ/L) operation. Copyright © 2015 Elsevier B.V. All rights reserved.

  1. Impact of tumour motion compensation and delineation methods on FDG PET-based dose painting plan quality for NSCLC radiation therapy.

    PubMed

    Thomas, Hannah Mary; Kinahan, Paul E; Samuel, James Jebaseelan E; Bowen, Stephen R

    2018-02-01

    To quantitatively estimate the impact of different methods for both boost volume delineation and respiratory motion compensation of [18F] FDG PET/CT images on the fidelity of planned non-uniform 'dose painting' plans to the prescribed boost dose distribution. Six locally advanced non-small cell lung cancer (NSCLC) patients were retrospectively reviewed. To assess the impact of respiratory motion, time-averaged (3D AVG), respiratory phase-gated (4D GATED) and motion-encompassing (4D MIP) PET images were used. The boost volumes were defined using manual contour (MANUAL), fixed threshold (FIXED) and gradient search algorithm (GRADIENT). The dose painting prescription of 60 Gy base dose to the planning target volume and an integral dose of 14 Gy (total 74 Gy) was discretized into seven treatment planning substructures and linearly redistributed according to the relative SUV at every voxel in the boost volume. Fifty-four dose painting plan combinations were generated and conformity was evaluated using quality index VQ0.95-1.05, which represents the sum of planned dose voxels within 5% deviation from the prescribed dose. Trends in plan quality and magnitude of achievable dose escalation were recorded. Different segmentation techniques produced statistically significant variations in maximum planned dose (P < 0.02), as well as plan quality between segmentation methods for 4D GATED and 4D MIP PET images (P < 0.05). No statistically significant differences in plan quality and maximum dose were observed between motion-compensated PET-based plans (P > 0.75). Low variability in plan quality was observed for FIXED threshold plans, while MANUAL and GRADIENT plans achieved higher dose with lower plan quality indices. The dose painting plans were more sensitive to segmentation of boost volumes than PET motion compensation in this study sample. Careful consideration of boost target delineation and motion compensation strategies should guide the design of NSCLC dose
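
    The conformity metric VQ0.95-1.05 described above, the fraction of planned-dose voxels within 5% of the prescribed dose, can be sketched as below; the function name and the toy per-voxel doses are hypothetical:

    ```python
    def quality_index(planned, prescribed, lo=0.95, hi=1.05):
        """Fraction of voxels whose planned dose lies within [lo*rx, hi*rx].

        `planned` and `prescribed` are equal-length sequences of per-voxel doses
        (Gy). A sketch of the VQ0.95-1.05 index described in the abstract; in the
        study, the per-voxel prescription comes from linearly redistributing the
        14 Gy boost according to relative SUV.
        """
        inside = sum(1 for p, rx in zip(planned, prescribed) if lo * rx <= p <= hi * rx)
        return inside / len(planned)

    # toy example: three voxels on prescription, one voxel 10% hot
    planned = [60.0, 74.0, 70.0, 81.4]
    prescribed = [60.0, 74.0, 70.0, 74.0]
    print(quality_index(planned, prescribed))  # 0.75
    ```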

  2. A sensitive and rapid radiolabelling method for the in vivo pharmacokinetic study of lentinan.

    PubMed

    Zhang, Yu; Zheng, Ziming; Yang, Xiawen; Pan, Xianglin; Yin, Lianglan; Huang, Xiao; Li, Qiang; Shu, Yamin; Zhang, Qilin; Wang, Kaiping

    2018-06-20

    The aim of this study is to establish a rapid and sensitive method for detecting lentinan (LNT) in biosamples and to evaluate the pharmacokinetics of LNT in mice and rats. A diethylenetriaminepentaacetic acid (DTPA) derivative of LNT (DTPA-LNT) was synthesized first to allow labelling with 99m-technetium (99mTc). After purification and identification, 99mTc-DTPA-LNT was intravenously administered to mice (2 mg kg-1) and rats at different doses (0.5, 2 and 8 mg kg-1). The results showed that the 99mTc-labelling method was suitable for the quantification of the LNT concentration in biological samples, with satisfactory linearity (r2 > 0.998), precision (<7%), accuracy (95.01-104.51%) and total recovery (∼90%). The blood concentration-time profiles of 99mTc-DTPA-LNT were consistent with the two-compartment model and showed a rapid distribution phase and a slow elimination, and no significant difference in the blood level of LNT was found among the tested doses (0.5, 2 and 8 mg kg-1). LNT was predominantly incorporated into the liver and spleen, and there was a small amount of aggregation in the bile, kidneys, lungs and stomach. Approximately 40% of the administered radioactivity was detected in urine and faeces within 24 h post-dosing. In addition, SPECT imaging of 99mTc-DTPA-LNT was performed to visually reveal the pharmacokinetic characteristics of LNT. These findings provide a reference for further study and for use of LNT and other β-glucans.
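
    The two-compartment behavior reported for 99mTc-DTPA-LNT (a rapid distribution phase followed by slow elimination) corresponds to a biexponential blood-concentration curve C(t) = A·exp(-αt) + B·exp(-βt). A sketch with illustrative parameter values, not fitted values from the study:

    ```python
    import math

    def two_compartment(t_h, A=8.0, alpha=2.0, B=2.0, beta=0.05):
        """Biexponential two-compartment model C(t) = A*exp(-alpha*t) + B*exp(-beta*t).
        alpha >> beta gives the fast distribution phase followed by slow elimination.
        All parameter values are illustrative placeholders (units: ug/ml and 1/h)."""
        return A * math.exp(-alpha * t_h) + B * math.exp(-beta * t_h)

    for t in (0, 0.5, 1, 4, 24):
        print(f"t = {t:>4} h   C = {two_compartment(t):.3f}")
    ```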

  3. Below-threshold harmonic generation from strong non-uniform fields

    NASA Astrophysics Data System (ADS)

    Yavuz, I.

    2017-10-01

    Strong-field photoemission below the ionization threshold is a rich and complex region where atomic emission and harmonic generation may coexist. We studied the mechanism of below-threshold harmonics (BTH) from spatially non-uniform local fields near metallic nanostructures. Discrete harmonics are generated due to the broken inversion symmetry, suggesting enriched coherent emission in the VUV frequency range. Through numerical solution of the time-dependent Schrödinger equation, we investigate the wavelength and intensity dependence of BTH. The wavelength dependence identifies counter-regular resonances: individual contributions from multi-photon emission and channel-closing effects due to quantum path interferences. To understand the underlying mechanism of BTH, we devised a generalized semi-classical model including the influence of Coulomb and non-uniform field interactions. As in uniform fields, the Coulomb potential in non-uniform fields is the determinant of BTH; we observed that the generation of BTH is due to returning trajectories with negative energies. Owing to the large-distance effectiveness of the non-uniformity, only long trajectories are noticeably affected.

  4. TU-C-18A-01: Models of Risk From Low-Dose Radiation Exposures: What Does the Evidence Say?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bushberg, J; Boreham, D; Ulsh, B

    2014-06-15

    At dose levels of (approximately) 500 mSv or more, increased cancer incidence and mortality have been clearly demonstrated. However, at the low doses of radiation used in medical imaging, the relationship between dose and cancer risk is not well established. As such, assumptions about the shape of the dose-response curve are made. These assumptions, or risk models, are used to estimate potential long term effects. Common models include 1) the linear non-threshold (LNT) model, 2) threshold models with either a linear or curvilinear dose response above the threshold, and 3) a hormetic model, where the risk is initially decreased below background levels before increasing. The choice of model used when making radiation risk or protection calculations and decisions can have significant implications on public policy and health care decisions. However, the ongoing debate about which risk model best describes the dose-response relationship at low doses of radiation makes informed decision making difficult. This symposium will review the two fundamental approaches to determining the risk associated with low doses of ionizing radiation, namely radiation epidemiology and radiation biology. The strengths and limitations of each approach will be reviewed, the results of recent studies presented, and the appropriateness of different risk models for various real world scenarios discussed. Examples of well-designed and poorly-designed studies will be provided to assist medical physicists in 1) critically evaluating publications in the field and 2) communicating accurate information to medical professionals, patients, and members of the general public. Equipped with the best information that radiation epidemiology and radiation biology can currently provide, and an understanding of the limitations of such information, individuals and organizations will be able to make more informed decisions regarding questions such as 1) how much shielding to install at medical facilities
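
    The three dose-response models contrasted above (LNT, threshold, hormetic) can be sketched as simple excess-risk functions. All slopes, thresholds, and the hormetic dip are illustrative placeholders, not values from the symposium:

```python
import math

def lnt_excess_risk(dose_msv, slope=0.005):
    """Linear no-threshold: excess risk proportional to dose down to zero."""
    return slope * dose_msv

def threshold_excess_risk(dose_msv, threshold=100.0, slope=0.005):
    """Threshold model: no excess risk below the threshold, linear above it."""
    return 0.0 if dose_msv <= threshold else slope * (dose_msv - threshold)

def hormetic_excess_risk(dose_msv, dip=0.1, dip_width=50.0,
                         threshold=100.0, slope=0.005):
    """Hormetic model (schematic): risk dips below the baseline at low
    doses, then rises once the dose exceeds the threshold."""
    benefit = -dip * math.exp(-((dose_msv - dip_width) / dip_width) ** 2)
    harm = 0.0 if dose_msv <= threshold else slope * (dose_msv - threshold)
    return benefit + harm
```

The three functions agree at high doses but diverge exactly in the low-dose region the abstract identifies as unresolved: below 100 mSv the LNT curve is positive, the threshold curve is zero, and the hormetic curve is negative.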

  5. Testing a Threshold-Based Bed Bug Management Approach in Apartment Buildings.

    PubMed

    Singh, Narinderpal; Wang, Changlu; Zha, Chen; Cooper, Richard; Robson, Mark

    2017-07-26

    We tested a threshold-based bed bug ( Cimex lectularius L.) management approach with the goal of achieving elimination with minimal or no insecticide application. Thirty-two bed bug infested apartments were identified. These apartments were divided into four treatment groups based on apartment size and initial bed bug count, obtained through a combination of visual inspection and bed bug monitors: I- Non-chemical only in apartments with 1-12 bed bug count, II- Chemical control only in apartments with 1-12 bed bug count, III- Non-chemical and chemical control in apartments with >12 bed bug count, and IV- Chemical control only in apartments with ≥11 bed bug count. All apartments were monitored or treated once every two weeks for a maximum of 28 wk. Treatment I eliminated bed bugs in a similar amount of time to treatment II. Time to eliminate bed bugs was similar between treatment III and IV but required significantly less insecticide spray in treatment III than that in treatment IV. A threshold-based management approach (non-chemical only or non-chemical and chemical) can eliminate bed bugs in a similar amount of time, using little to no pesticide compared to a chemical only approach.

  6. Testing a Threshold-Based Bed Bug Management Approach in Apartment Buildings

    PubMed Central

    Singh, Narinderpal; Zha, Chen; Cooper, Richard; Robson, Mark

    2017-01-01

    We tested a threshold-based bed bug (Cimex lectularius L.) management approach with the goal of achieving elimination with minimal or no insecticide application. Thirty-two bed bug infested apartments were identified. These apartments were divided into four treatment groups based on apartment size and initial bed bug count, obtained through a combination of visual inspection and bed bug monitors: I- Non-chemical only in apartments with 1–12 bed bug count, II- Chemical control only in apartments with 1–12 bed bug count, III- Non-chemical and chemical control in apartments with >12 bed bug count, and IV- Chemical control only in apartments with ≥11 bed bug count. All apartments were monitored or treated once every two weeks for a maximum of 28 wk. Treatment I eliminated bed bugs in a similar amount of time to treatment II. Time to eliminate bed bugs was similar between treatment III and IV but required significantly less insecticide spray in treatment III than that in treatment IV. A threshold-based management approach (non-chemical only or non-chemical and chemical) can eliminate bed bugs in a similar amount of time, using little to no pesticide compared to a chemical only approach. PMID:28933720

  7. Electrocardiogram signal denoising based on a new improved wavelet thresholding

    NASA Astrophysics Data System (ADS)

    Han, Guoqiang; Xu, Zhijun

    2016-08-01

    Good-quality electrocardiogram (ECG) signals are utilized by physicians for the interpretation and identification of physiological and pathological phenomena. In general, ECG signals may be contaminated by various noises, such as baseline wander, power line interference and electromagnetic interference, during the gathering and recording process. As ECG signals are non-stationary physiological signals, the wavelet transform has proven to be an effective tool for discarding noise from corrupted signals. A new compromising threshold function, a sigmoid function-based thresholding scheme, is adopted in processing ECG signals. Compared with hard/soft thresholding and other existing thresholding functions, the new algorithm has many advantages in the noise reduction of ECG signals: it overcomes the discontinuity at ±T of hard thresholding and reduces the fixed deviation of soft thresholding. The improved wavelet thresholding denoising is shown to be more efficient than existing algorithms in ECG signal denoising. The signal-to-noise ratio, mean square error, and percent root mean square difference are calculated as quantitative tools to verify the denoising performance. The experimental results reveal that the P, Q, R, and S waves of ECG signals denoised by the proposed method coincide with those of the original ECG signals.
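
    The trade-off described above can be illustrated with minimal coefficient-threshold functions. The sigmoid form below is one plausible compromise of the kind the abstract describes, not necessarily the authors' exact function:

```python
import math

def hard_threshold(w, T):
    """Hard: keep the coefficient if |w| > T, else zero.
    Discontinuous at w = ±T."""
    return w if abs(w) > T else 0.0

def soft_threshold(w, T):
    """Soft: shrink the magnitude by T. Continuous, but large
    coefficients carry a fixed bias of T."""
    return math.copysign(abs(w) - T, w) if abs(w) > T else 0.0

def sigmoid_threshold(w, T, k=10.0):
    """Sigmoid-based compromise (illustrative form): smooth everywhere,
    ~0 well below T and ~w well above T, so it avoids both the hard
    function's jump at ±T and the soft function's fixed deviation."""
    return w / (1.0 + math.exp(-k * (abs(w) - T)))
```

For a large coefficient (w = 5, T = 1) the sigmoid output is essentially 5, whereas soft thresholding would return 4; for a small coefficient it is essentially 0, without the jump that hard thresholding introduces at ±T.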

  8. CEM43°C thermal dose thresholds: a potential guide for magnetic resonance radiofrequency exposure levels?

    PubMed

    van Rhoon, Gerard C; Samaras, Theodoros; Yarmolenko, Pavel S; Dewhirst, Mark W; Neufeld, Esra; Kuster, Niels

    2013-08-01

    To define thresholds of safe local temperature increase for MR equipment that exposes patients to radiofrequency fields of high intensity for long durations. These MR systems induce heterogeneous energy absorption patterns inside the body and can create localised hotspots with a risk of overheating. The MRI + EUREKA research consortium organised a "Thermal Workshop on RF Hotspots". The available literature on thresholds for thermal damage and the validity of the thermal dose (TD) model were discussed. The following global TD threshold guidelines for safe use of MR are proposed: 1. All persons: maximum local temperature of any tissue limited to 39 °C. 2. Persons with compromised thermoregulation and (a) uncontrolled conditions: maximum local temperature limited to 39 °C; (b) controlled conditions: TD < 2 CEM43°C. 3. Persons with uncompromised thermoregulation and (a) uncontrolled conditions: TD < 2 CEM43°C; (b) controlled conditions: TD < 9 CEM43°C. The following definitions are applied: controlled conditions, a medical doctor or a dedicated trained person can respond instantly to heat-induced physiological stress; compromised thermoregulation, all persons with impaired systemic or reduced local thermoregulation. • Standard MRI can cause local heating by radiofrequency absorption. • Monitoring thermal dose (in units of CEM43°C) can control risk during MRI. • 9 CEM43°C seems an acceptable thermal dose threshold for most patients. • For skin, muscle, fat and bone, 16 CEM43°C is likely acceptable.
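
    The CEM43°C thermal dose used above is conventionally computed with the Sapareto-Dewey formulation. A minimal sketch, using the commonly cited R values (an assumption on my part; the abstract does not state them):

```python
def cem43(temps_c, dt_min=1.0, r_above=0.5, r_below=0.25):
    """Cumulative equivalent minutes at 43 degC:
    CEM43 = sum_i dt * R**(43 - T_i), with R = 0.5 for T >= 43 degC and
    R = 0.25 below it (commonly used values; conventions for the very
    low temperature range vary in the literature)."""
    total = 0.0
    for t in temps_c:
        r = r_above if t >= 43.0 else r_below
        total += dt_min * r ** (43.0 - t)
    return total

# 30 one-minute temperature samples at a constant 41 degC:
dose = cem43([41.0] * 30)  # 30 * 0.25**2 = 1.875 CEM43
```

Thirty minutes at a constant 41 °C therefore stays just under the proposed 2 CEM43°C limit, while nine minutes at 43 °C exactly reaches the 9 CEM43°C threshold.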

  9. Dose-response approaches for nuclear receptor-mediated ...

    EPA Pesticide Factsheets

    A public workshop, organized by a Steering Committee of scientists from government, industry, universities, and research organizations, was held at the National Institute of Environmental Health Sciences (NIEHS) in September, 2010. The workshop explored the dose-response implications of toxicant modes of action (MOA) mediated by nuclear receptors. The dominant paradigm in human health risk assessment has been linear extrapolation without a threshold for cancer, and estimation of sub-threshold doses for non-cancer and (in appropriate cases) cancer endpoints. However, recent publications question the application of dose-response modeling approaches with a threshold. The growing body of molecular toxicology information and computational toxicology tools has allowed for exploration of the presence or absence of sub-threshold doses for a number of receptor-mediated MOAs. The workshop explored the development of dose-response approaches for nuclear receptor-mediated liver cancer, within a MOA Human Relevance framework (HRF). Case studies addressed activation of the AHR, the CAR/PXR, and the PPARα. This paper describes the workshop process, key issues discussed, and conclusions. The value of an interactive workshop approach to apply current MOA/HRF frameworks was demonstrated. The results may help direct research on the MOA and dose-response of receptor-based toxicity, since there are commonalities for many receptors in the basic pathways involved for late steps in the

  10. Advances of the reverse lactate threshold test: Non-invasive proposal based on heart rate and effect of previous cycling experience

    PubMed Central

    2018-01-01

    Our first aim was to compare the anaerobic threshold (AnT) determined by the incremental protocol with that from the reverse lactate threshold test (RLT), investigating the effect of previous cycling experience. Secondarily, an alternative RLT application based on heart rate was proposed. Two groups (12 per group, according to cycling experience) were evaluated on a cycle ergometer. The incremental protocol started at 25 W with increments of 25 W every 3 minutes, and the AnT was calculated by the bissegmentation, onset of blood lactate concentration and maximal deviation methods. The RLT was applied in two phases: a) lactate priming segment; and b) reverse segment; the AnT (AnTRLT) was calculated from a second-order polynomial function. The AnT from the RLT based on heart rate (AnTRLT-HR) was likewise calculated from a second-order polynomial function. Regarding Study 1, most statistical procedures indicated similarity between the AnT determined by the bissegmentation method and AnTRLT. For 83% of non-experienced and 75% of experienced subjects the bias was 4% and 2%, respectively. In Study 2, no difference was found between the AnTRLT and AnTRLT-HR. For 83% of non-experienced and 91% of experienced subjects, the bias between AnTRLT and AnTRLT-HR was similar (i.e. 6%). In summary, the AnT determined by the incremental protocol and the RLT are consistent. The AnT can be determined during the RLT via heart rate, improving its applicability. However, future studies are required to improve the agreement between variables. PMID:29534108

  11. Preliminary test of cigarette nicotine discrimination threshold in non-dependent versus dependent smokers.

    PubMed

    Perkins, Kenneth A; Kunkle, Nicole; Karelitz, Joshua L; Perkins, K A; Kunkle, N; Karelitz, J L

    2017-06-01

    Despite its potential for understanding tobacco dependence, behavioral discrimination of nicotine via smoking has not been formally examined as a function of nicotine dependence level. Spectrum research cigarettes were used to compare non-dependent with dependent smokers on the lowest content of nicotine they could discriminate (i.e., "threshold"). Dependent (n=21; 16M, 5F) or non-dependent (n=7; 4M, 3F) smokers were tested on ability to discriminate between cigarettes with nicotine contents of 17, 11, 5, 2, and 1mg/g, one per session, from an "ultra-low" cigarette with 0.4mg/g (all had 9-10mg "tar"). All abstained from smoking overnight prior to sessions, and number of sessions was determined by the lowest nicotine content they could reliably discriminate from the ultra-low on >80% of trials (i.e., ≥5 of 6). Subjective perceptions and cigarette choice behavior were also assessed and related to discrimination behavior. Discrimination thresholds (and most perceptions) did not differ between dependent and non-dependent smokers, with median thresholds of 11mg/g for both subgroups. Yet, "liking" and puff choice for threshold cigarettes were greater in dependent but not non-dependent smokers, while cigarettes with nicotine contents below threshold did not support "liking" or choice in both groups. In sum, this preliminary study suggests threshold for discriminating nicotine via smoking may not vary by dependence level, and further study is needed to confirm that cigarettes unable to be discriminated are also not reinforcing. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. A New Approach to Threshold Attribute Based Signatures

    DTIC Science & Technology

    2011-01-01

    Inspired by developments in attribute-based encryption and signatures, there has recently been a spurt of progress in the direction of threshold attribute-based signatures (t-ABS). In this work we propose a novel approach to construct threshold attribute-based signatures inspired by ring signatures. Threshold attribute-based signatures, defined by a (t, n) threshold predicate, ensure that the signer holds at least t out of a specified set of n attributes.

  13. Commentary on Inhaled 239PuO 2 in Dogs — A Prophylaxis against Lung Cancer?

    DOE PAGES

    Cuttler, Jerry M.; Feinendegen, Ludwig E.

    2015-01-01

    Several studies on the effect of inhaled plutonium-dioxide particulates and the incidence of lung tumors in dogs reveal beneficial effects when the cumulative alpha-radiation dose is low. There is a threshold at an exposure level of about 100 cGy for excess tumor incidence and reduced lifespan. The observations conform to the expectations of the radiation hormesis dose-response model and contradict the predictions of the LNT hypothesis. These studies suggest investigating the possibility of employing low-dose alpha-radiation, such as from 239PuO 2 inhalation, as a prophylaxis against lung cancer.

  14. Commentary on Inhaled 239PuO 2 in Dogs — A Prophylaxis against Lung Cancer?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cuttler, Jerry M.; Feinendegen, Ludwig E.

    Several studies on the effect of inhaled plutonium-dioxide particulates and the incidence of lung tumors in dogs reveal beneficial effects when the cumulative alpha-radiation dose is low. There is a threshold at an exposure level of about 100 cGy for excess tumor incidence and reduced lifespan. The observations conform to the expectations of the radiation hormesis dose-response model and contradict the predictions of the LNT hypothesis. These studies suggest investigating the possibility of employing low-dose alpha-radiation, such as from 239PuO 2 inhalation, as a prophylaxis against lung cancer.

  15. Mode of Action (MOA) and Dose-Response Approaches for Nuclear Receptors

    EPA Science Inventory

    Abstract: The presence of sub-threshold doses for non-cancer and (in appropriate cases) cancer has been the dominant paradigm for the practice of risk assessment, but the application of dose-response modeling approaches that include a threshold have been questioned in a 2009 NRC ...

  16. Proposal on Calculation of Ventilation Threshold Using Non-contact Respiration Measurement with Pattern Light Projection

    NASA Astrophysics Data System (ADS)

    Aoki, Hirooki; Ichimura, Shiro; Fujiwara, Toyoki; Kiyooka, Satoru; Koshiji, Kohji; Tsuzuki, Keishi; Nakamura, Hidetoshi; Fujimoto, Hideo

    We proposed a calculation method for the ventilation threshold using non-contact respiration measurement with dot-matrix pattern light projection under pedaling exercise. The validity and effectiveness of our proposed method are examined by simultaneous measurement with an expiration gas analyzer. The experimental results showed a correlation between the quasi ventilation thresholds calculated by our proposed method and the ventilation thresholds calculated by the expiration gas analyzer. This result indicates the possibility of non-contact measurement of the ventilation threshold by the proposed method.

  17. Lower thresholds for lifetime health effects in mammals from high-LET radiation - Comparison with chronic low-LET radiation.

    PubMed

    Sazykina, Tatiana G; Kryshev, Alexander I

    2016-12-01

    Lower threshold dose rates and confidence limits are quantified for lifetime radiation effects in mammalian animals from internally deposited alpha-emitting radionuclides. Extensive datasets on effects from internal alpha-emitters are compiled from the International Radiobiological Archives. In total, the compiled database includes 257 records, which are analyzed by means of non-parametric order statistics. The generic lower threshold for alpha-emitters in mammalian animals (combined datasets) is 6.6·10-5 Gy day-1. Thresholds for individual alpha-emitting elements differ considerably: plutonium and americium, 2.0·10-5 Gy day-1; radium, 2.1·10-4 Gy day-1. The threshold for chronic low-LET radiation was previously estimated at 1·10-3 Gy day-1. For low exposures, the following values of the alpha radiation weighting factor wR for internally deposited alpha-emitters in mammals are quantified: wR(α) = 15 as a generic value for the whole group of alpha-emitters; wR(Pu) = 50 for plutonium; wR(Am) = 50 for americium; wR(Ra) = 5 for radium. These values are proposed to serve as radiation weighting factors in calculations of equivalent doses to non-human biota. The lower threshold dose rate for long-lived mammals (dogs) is significantly lower than the threshold for short-lived mammals (mice): 2.7·10-5 Gy day-1 and 2.0·10-4 Gy day-1, respectively. The difference in thresholds exactly reflects the relationship between the natural longevity of these two species. A graded scale of severity of lifetime radiation effects in mammals is developed, based on the compiled datasets. Placed on the severity scale, the effects of internal alpha-emitters fall in zones of considerably lower dose rates than effects of the same severity caused by low-LET radiation. RBE values, calculated for effects of equal severity, are found to depend on the intensity of chronic exposure: different RBE values are characteristic

  18. Is ``No-Threshold'' a ``Non-Concept''?

    NASA Astrophysics Data System (ADS)

    Schaeffer, David J.

    1981-11-01

    A controversy prominent in scientific literature that has carried over to newspapers, magazines, and popular books is having serious social and political expressions today: “Is there, or is there not, a threshold below which exposure to a carcinogen will not induce cancer?” The distinction between establishing the existence of this threshold (which is a theoretical question) and its value (which is an experimental one) gets lost in the scientific arguments. Establishing the existence of this threshold has now become a philosophical question (and an emotional one). In this paper I qualitatively outline theoretical reasons why a threshold must exist, discuss experiments which measure thresholds on two chemicals, and describe and apply a statistical method for estimating the threshold value from exposure-response data.
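
    A statistical threshold estimate of the kind mentioned above can be sketched as a hockey-stick (piecewise-linear) fit to exposure-response data. This is an illustrative grid-search estimator on synthetic data, not the specific method of the paper:

```python
def fit_threshold(doses, responses, n_grid=200):
    """Fit response = b + s*max(0, dose - theta) by grid-searching the
    threshold theta; for each candidate theta, b and s are solved by
    ordinary least squares. Returns the theta with the smallest SSE."""
    lo, hi = min(doses), max(doses)
    best_theta, best_sse = None, float("inf")
    for i in range(n_grid + 1):
        theta = lo + (hi - lo) * i / n_grid
        x = [max(0.0, d - theta) for d in doses]
        n = len(x)
        sx, sy = sum(x), sum(responses)
        sxx = sum(v * v for v in x)
        sxy = sum(v * y for v, y in zip(x, responses))
        denom = n * sxx - sx * sx
        if denom == 0:  # all predictors zero; model degenerate here
            continue
        s = (n * sxy - sx * sy) / denom
        b = (sy - s * sx) / n
        sse = sum((b + s * v - y) ** 2 for v, y in zip(x, responses))
        if sse < best_sse:
            best_theta, best_sse = theta, sse
    return best_theta

# Synthetic exposure-response data with a true threshold at dose = 4:
doses = [0, 1, 2, 3, 4, 5, 6, 7, 8]
responses = [0, 0, 0, 0, 0, 2, 4, 6, 8]  # flat below 4, slope 2 above
theta_hat = fit_threshold(doses, responses)
```

On this noiseless example the estimator recovers the generating threshold; with real data the grid search would be combined with confidence intervals, as the paper's statistical treatment implies.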

  19. Preliminary test of cigarette nicotine discrimination threshold in non-dependent versus dependent smokers

    PubMed Central

    Perkins, Kenneth A.; Kunkle, Nicole; Karelitz, Joshua L.

    2017-01-01

    Background Despite its potential for understanding tobacco dependence, behavioral discrimination of nicotine via smoking has not been formally examined as a function of nicotine dependence level. Methods Spectrum research cigarettes were used to compare non-dependent with dependent smokers on the lowest content of nicotine they could discriminate (i.e., “threshold”). Dependent (n=21; 16 M, 5 F) or non-dependent (n=7; 4 M, 3 F) smokers were tested on ability to discriminate between cigarettes with nicotine contents of 17, 11, 5, 2, and 1 mg/g, one per session, from an “ultra-low” cigarette with 0.4 mg/g (all had 9–10 mg “tar”). All abstained from smoking overnight prior to sessions, and number of sessions was determined by the lowest nicotine content they could reliably discriminate from the ultra-low on >80% of trials (i.e., ≥5 of 6). Subjective perceptions and cigarette choice behavior were also assessed and related to discrimination behavior. Results Discrimination thresholds (and most perceptions) did not differ between dependent and non-dependent smokers, with median thresholds of 11 mg/g for both subgroups. Yet, “liking” and puff choice for threshold cigarettes were greater in dependent but not non-dependent smokers, while cigarettes with nicotine contents below threshold did not support “liking” or choice in both groups. Conclusions In sum, this preliminary study suggests threshold for discriminating nicotine via smoking may not vary by dependence level, and further study is needed to confirm that cigarettes unable to be discriminated are also not reinforcing. PMID:28380366

  20. Quantitative comparison of the results obtained by the multiple-dose guinea pig maximization test and the non-radioactive murine local lymph-node assay for various biocides.

    PubMed

    Yamano, Tetsuo; Shimizu, Mitsuru; Noda, Tsutomu

    2005-07-01

    We compared the results of the multiple-dose guinea pig maximization test (GPMT) and the non-radioactive murine local lymph-node assay (LLNA) for various biocides. Thirteen out of 17 positive biocides in the GPMT gave positive results in the LLNA. In the GPMT, the minimum first induction doses ranged over four orders of magnitude (0.00005-0.5%), while elicitation-threshold doses, which were evaluated using an optimally sensitized group of animals in the multiple-dose studies, ranged over five orders (0.00006-2.8%). In the LLNA, minimum induction doses ranged over more than three orders (0.01-30%). For the 13 biocides that were positive in both the GPMT and the LLNA, results were quantitatively compared. When compared after conversion to corresponding area doses (µg/cm2), the minimum doses required to elicit a skin reaction in guinea pigs were always lower than those for induction in mice, for all biocides. The correlation between minimum induction doses from the GPMT and the LLNA seemed poor (r=0.57), while that between minimum induction doses in the LLNA and elicitation-threshold doses in the GPMT was relatively good (r=0.73). The results suggest the possibility of estimating human elicitation-threshold doses, which are definitely lacking in the process of risk assessment for skin-sensitizers, from LLNA data.

  1. [Effects of radiation exposure on human body].

    PubMed

    Kamiya, Kenji; Sasatani, Megumi

    2012-03-01

    There are two types of radiation health effects: acute disorders and late-onset disorders. An acute disorder is a deterministic effect whose symptoms appear only after exposure above a threshold. The tissues and cells that compose the human body differ in radiation sensitivity, and symptoms appear in order, starting with highly radiosensitive tissues. The clinical symptoms of acute disorders begin with a decrease in lymphocytes; with increasing radiation dose, symptoms such as alopecia, skin erythema, hematopoietic damage, gastrointestinal damage and central nervous system damage then appear. Regarding late-onset disorders, the predominant health effect is cancer, among symptoms that include cancer, non-cancer disease and genetic effects. Cancer and genetic effects are recognized as stochastic effects without a threshold. At radiation doses of 100 mSv or more, the cancer risk from radiation exposure is observed to increase linearly with dose. On the other hand, the risk of developing cancer through low-dose radiation exposure, below 100 mSv, has not yet been clarified scientifically. Although uncertainty remains in low-dose risk estimation, the ICRP propounds the LNT model and conducts radiation protection in accordance with it for low-dose and low-dose-rate radiation, from the standpoint of radiation protection. Meanwhile, the mechanism of radiation damage has gradually been clarified. The initial event in radiation-induced disease is thought to be damage to the genome, such as radiation-induced DNA double-strand breaks. Recently, it has been clarified that our cells can recognize genome damage and induce diverse cellular responses to maintain genome integrity. This phenomenon is called the DNA damage response, which induces cell cycle arrest, DNA repair, apoptosis, cell senescence and so on. These responses act to maintain genome integrity against genome damage; however, the death of large number of

  2. Thresholds of Toxicological Concern - Setting a threshold for testing below which there is little concern.

    PubMed

    Hartung, Thomas

    2017-01-01

    Low dose, low risk; very low dose, no real risk. Setting a pragmatic threshold below which concerns become negligible is the purpose of thresholds of toxicological concern (TTC). The idea is that such threshold values do not need to be established for each and every chemical based on experimental data, but that by analyzing the distribution of lowest or no-effect doses of many chemicals, a TTC can be defined - typically using the 5th percentile of this distribution and lowering it by an uncertainty factor of, e.g., 100. In doing so, TTC aims to compare exposure information (dose) with a threshold below which any hazard manifestation is very unlikely to occur. The history and current developments of this concept are reviewed and the application of TTC for different regulated products and their hazards is discussed. TTC lends itself as a pragmatic filter to deprioritize testing needs whenever real-life exposures are much lower than levels where hazard manifestation would be expected, a situation that is called "negligible exposure" in the REACH legislation, though the TTC concept has not been fully incorporated in its implementation (yet). Other areas and regulations - especially in the food sector and for pharmaceutical impurities - are more proactive. Large, curated databases on toxic effects of chemicals provide us with the opportunity to set TTC for many hazards and substance classes and thus offer a precautionary second tier for risk assessments if hazard cannot be excluded. This allows focusing testing efforts better on relevant exposures to chemicals.
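
    The TTC derivation described above (take the 5th percentile of a distribution of no-effect doses, then divide by an uncertainty factor of, e.g., 100) can be sketched as:

```python
def ttc_from_noels(noel_doses, percentile=5.0, uncertainty_factor=100.0):
    """Threshold of toxicological concern (TTC): a low percentile of a
    distribution of no-effect doses, lowered by an uncertainty factor.
    The percentile is computed by sorting with linear interpolation."""
    xs = sorted(noel_doses)
    k = (len(xs) - 1) * percentile / 100.0
    f = int(k)
    c = min(f + 1, len(xs) - 1)
    p = xs[f] + (xs[c] - xs[f]) * (k - f)  # interpolated percentile
    return p / uncertainty_factor

# Hypothetical no-effect doses (mg/kg/day) for 21 chemicals:
noels = list(range(1, 22))
ttc = ttc_from_noels(noels)  # 5th percentile = 2.0 -> TTC = 0.02
```

Any exposure below the resulting TTC is then deprioritized for testing; exposures above it fall back to substance-specific risk assessment.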

  3. Optimal threshold of error decision related to non-uniform phase distribution QAM signals generated from MZM based on OCS

    NASA Astrophysics Data System (ADS)

    Han, Xifeng; Zhou, Wen

    2018-03-01

    Optical vector radio-frequency (RF) signal generation based on optical carrier suppression (OCS) in one Mach-Zehnder modulator (MZM) can realize frequency doubling. In order to match the phase or amplitude of the recovered quadrature amplitude modulation (QAM) signal, phase or amplitude pre-coding is necessary on the transmitter side. The detected QAM signals usually have a non-uniform phase distribution after square-law detection at the photodiode because of the imperfect characteristics of the optical and electrical devices. We propose to use an optimal error-decision threshold for this non-uniform phase distribution to reduce the bit error rate (BER). By employing this scheme, the BER of a 16 Gbaud (32 Gbit/s) quadrature-phase-shift-keying (QPSK) millimeter wave signal at 36 GHz is improved from 1 × 10-3 to 1 × 10-4 at -4.6 dBm input power into the photodiode.

  4. Individualized Radical Radiotherapy of Non-Small-Cell Lung Cancer Based on Normal Tissue Dose Constraints: A Feasibility Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baardwijk, Angela van; Bosmans, Geert; Boersma, Liesbeth

    2008-08-01

    Purpose: Local recurrence is a major problem after (chemo-)radiation for non-small-cell lung cancer. We hypothesized that for each individual patient, the highest therapeutic ratio could be achieved by increasing total tumor dose (TTD) to the limits of normal tissues, delivered within 5 weeks. We report first results of a prospective feasibility trial. Methods and Materials: Twenty-eight patients with medically inoperable or locally advanced non-small-cell lung cancer, World Health Organization performance score of 0-1, and reasonable lung function (forced expiratory volume in 1 second > 50%) were analyzed. All patients underwent irradiation using an individualized prescribed TTD based on normal tissue dose constraints (mean lung dose, 19 Gy; maximal spinal cord dose, 54 Gy) up to a maximal TTD of 79.2 Gy in 1.8-Gy fractions twice daily. No concurrent chemoradiation was administered. Toxicity was scored using the Common Terminology Criteria for Adverse Events criteria. An 18F-fluoro-2-deoxy-glucose positron emission tomography-computed tomography scan was performed to evaluate (metabolic) response 3 months after treatment. Results: Mean delivered dose was 63.0 ± 9.8 Gy. The TTD was most often limited by the mean lung dose (32.1%) or spinal cord (28.6%). Acute toxicity generally was mild; only 1 patient experienced Grade 3 cough and 1 patient experienced Grade 3 dysphagia. One patient (3.6%) died of pneumonitis. For late toxicity, 2 patients (7.7%) had Grade 3 cough or dyspnea; none had severe dysphagia. Complete metabolic response was obtained in 44% (11 of 26 patients). With a median follow-up of 13 months, median overall survival was 19.6 months, with a 1-year survival rate of 57.1%. Conclusions: Individualized maximal tolerable dose irradiation based on normal tissue dose constraints is feasible, and initial results are promising.

  5. SU-E-T-647: Quality Assurance of VMAT by Gamma Analysis Dependence On Low-Dose Threshold

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Song, J; Kim, M; Lee, S

    2015-06-15

    Purpose: The AAPM TG-119 instructed institutions to use a low-dose threshold (LDT) of 10% or a ROI determined by the jaw when collecting gamma analysis QA data of planar dose distributions. Also, based on a survey by Nelms and Simon, more than 70% of institutions use an LDT between 0% and 10% for gamma analysis. However, there are no clinical data that quantitatively demonstrate the impact of the LDT on the gamma index. Therefore, we performed a gamma analysis with LDTs of 0% to 15% according to both global and local normalization and different acceptance criteria: 3%/3 mm, 2%/2 mm, and 1%/1 mm. Methods: A total of 30 treatment plans (10 head and neck, 10 brain, and 10 prostate cancer cases) were randomly selected from the Varian Eclipse TPS, retrospectively. For the gamma analysis, a predicted portal image was acquired through a portal dose calculation algorithm in the Eclipse TPS, and a measured portal image was obtained using a Varian Clinac iX and an EPID. Then, the gamma analysis was performed using the Portal Dosimetry software. Results: For global normalization, the gamma passing rate (%GP) decreased as the LDT increased, and all LDT settings exhibited a %GP above 95% for both the 3%/3 mm and 2%/2 mm criteria. However, for local normalization, the %GP increased as the LDT increased. The gamma passing rate with an LDT of 10% increased by 6.86%, 9.22% and 6.14% compared with 0% for the head and neck, brain and prostate cases, respectively, under the 3%/3 mm criteria. Conclusion: Applying the LDT under global normalization does not have a critical impact on judging patient-specific QA results. However, the LDT for local normalization should be carefully selected, because applying the LDT can rapidly increase the average %GP.
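
    A global gamma analysis with a low-dose threshold, as studied above, can be sketched in one dimension. This is an illustrative re-implementation of the standard gamma index, not the Portal Dosimetry algorithm:

```python
import math

def gamma_pass_rate(ref, meas, spacing_mm, dose_pct=3.0, dta_mm=3.0,
                    ldt_pct=10.0):
    """Global 1D gamma analysis. Reference points below the low-dose
    threshold (LDT, % of the global maximum) are excluded from scoring;
    gamma <= 1 counts as a pass. Returns the passing rate in percent."""
    dmax = max(ref)
    dd = dose_pct / 100.0 * dmax  # global dose-difference criterion
    passed = total = 0
    for i, r in enumerate(ref):
        if r < ldt_pct / 100.0 * dmax:  # LDT mask
            continue
        total += 1
        gmin = float("inf")
        for j, m in enumerate(meas):
            dist = (i - j) * spacing_mm
            g = math.sqrt((dist / dta_mm) ** 2 + ((m - r) / dd) ** 2)
            gmin = min(gmin, g)
        if gmin <= 1.0:
            passed += 1
    return 100.0 * passed / total if total else float("nan")

# A small synthetic profile, 1 mm spacing; identical measurement passes
# everywhere, and the LDT removes the low-dose tails from scoring:
ref = [0.0, 1.0, 5.0, 10.0, 5.0, 1.0, 0.0]
rate = gamma_pass_rate(ref, ref, spacing_mm=1.0)
```

Raising `ldt_pct` shrinks the set of scored points, which is exactly why the abstract finds the %GP sensitive to the LDT, especially under local normalization where low-dose points carry tight relative criteria.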

  6. Defining serum ferritin thresholds to predict clinically relevant liver iron concentrations for guiding deferasirox therapy when MRI is unavailable in patients with non-transfusion-dependent thalassaemia.

    PubMed

    Taher, Ali T; Porter, John B; Viprakasit, Vip; Kattamis, Antonis; Chuncharunee, Suporn; Sutcharitchan, Pranee; Siritanaratkul, Noppadol; Origa, Raffaella; Karakas, Zeynep; Habr, Dany; Zhu, Zewen; Cappellini, Maria Domenica

    2015-01-01

Liver iron concentration (LIC) assessment by magnetic resonance imaging (MRI) remains the gold standard to diagnose iron overload and guide iron chelation therapy in patients with non-transfusion-dependent thalassaemia (NTDT). However, limited access to MRI technology and expertise worldwide makes serum ferritin assessment a practical alternative. The THALASSA (assessment of Exjade(®) in non-transfusion-dependent THALASSemiA patients) study assessed the efficacy and safety of deferasirox in iron-overloaded NTDT patients and provided a large data set to allow exploration of the relationship between LIC and serum ferritin. Using data from screened patients and those treated with deferasirox for up to 2 years, we identified clinically relevant serum ferritin thresholds (for when MRI is unavailable) for the initiation of chelation therapy (>800 μg/l), as well as thresholds to guide chelator dose interruption (<300 μg/l) and dose escalation (>2000 μg/l). (clinicaltrials.gov identifier: NCT00873041). © 2014 The Authors. British Journal of Haematology published by John Wiley & Sons Ltd.
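    The three ferritin cut-offs quoted above amount to a simple decision rule; a minimal sketch (function name and action labels are illustrative, not part of the study protocol, and the order of the checks matters because the ranges are nested):

```python
def chelation_guidance(serum_ferritin_ugl):
    """Map a serum ferritin value (μg/l) to the THALASSA-derived
    action thresholds: >2000 escalate, >800 initiate, <300 interrupt."""
    if serum_ferritin_ugl > 2000:
        return "consider dose escalation"
    if serum_ferritin_ugl > 800:
        return "initiate (or continue) chelation"
    if serum_ferritin_ugl < 300:
        return "interrupt chelator dose"
    return "continue current management"
```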

  7. Structure-based rationale for differential recognition of lacto- and neolacto- series glycosphingolipids by the N-terminal domain of human galectin-8

    NASA Astrophysics Data System (ADS)

    Bohari, Mohammad H.; Yu, Xing; Zick, Yehiel; Blanchard, Helen

    2016-12-01

Glycosphingolipids are ubiquitous cell surface molecules undertaking fundamental cellular processes. Lacto-N-tetraose (LNT) and lacto-N-neotetraose (LNnT) are the representative core structures for lacto- and neolacto-series glycosphingolipids. These glycosphingolipids are carriers of blood group antigens and human natural killer antigens found mainly on blood cells, and are also principal components of human milk, contributing to infant health. The β-galactoside-recognising galectins mediate various cellular functions of these glycosphingolipids. We report crystallographic structures of the galectin-8 N-terminal domain (galectin-8N) in complex with LNT and LNnT. We reveal the first example in which the non-reducing end of LNT binds to the primary binding site of a galectin, and provide a structure-based rationale for the significant ten-fold difference in binding affinities of galectin-8N toward LNT compared to LNnT, a magnitude of difference not observed for any other galectin. In addition, the LNnT complex showed that the unique Arg59 has the ability to adopt a new orientation, and comparison of glycerol- and lactose-bound galectin-8N structures reveals a minimum atomic framework for ligand recognition. Overall, these results enhance our understanding of glycosphingolipid interactions with galectin-8N, and highlight a structure-based rationale for its significantly different affinity for components of biologically relevant glycosphingolipids.

  8. Non-abelian factorisation for next-to-leading-power threshold logarithms

    NASA Astrophysics Data System (ADS)

    Bonocore, D.; Laenen, E.; Magnea, L.; Vernazza, L.; White, C. D.

    2016-12-01

    Soft and collinear radiation is responsible for large corrections to many hadronic cross sections, near thresholds for the production of heavy final states. There is much interest in extending our understanding of this radiation to next-to-leading power (NLP) in the threshold expansion. In this paper, we generalise a previously proposed all-order NLP factorisation formula to include non-abelian corrections. We define a nonabelian radiative jet function, organising collinear enhancements at NLP, and compute it for quark jets at one loop. We discuss in detail the issue of double counting between soft and collinear regions. Finally, we verify our prescription by reproducing all NLP logarithms in Drell-Yan production up to NNLO, including those associated with double real emission. Our results constitute an important step in the development of a fully general resummation formalism for NLP threshold effects.

  9. Determining a threshold sub-acute dose leading to minimal physiological alterations following prolonged exposure to the nerve agent VX in rats.

    PubMed

    Bloch-Shilderman, E; Rabinovitz, I; Egoz, I; Yacov, G; Allon, N; Nili, U

    2018-02-01

    VX, a potent inhibitor of cholinesterase (ChE), is considered as one of the most toxic, persistent and least volatile nerve agents. VX is absorbed in various environmental surfaces and is gradually released long after its initial dispersal. Its toxicity is mainly caused by disrupting central and peripheral cholinergic nervous system activity, leading to potential long-term detrimental effects on health. The primary objective of the present study was to assess the threshold VX dose leading to minimal physiological alterations following prolonged VX exposure. Characterization of such a threshold is crucial for dealing with unresolved operative dilemmas such as when it is safe enough to resettle a population that has been evacuated from a VX-contaminated area. Rats, continuously exposed to various doses of VX (0.225-45 µg/kg/day) for 4 weeks via implanted mini-osmotic pumps, showed a dose-dependent and continuous decrease in ChE activity in whole blood, brain and muscles, ranging between 20 and 100%. Exposure to 13.5 µg/kg/day led to a stable low ChE activity level (~ 20%), accompanied by transient and negligible electrocorticogram spectral power transformations, especially in the theta and alpha brain wave frequencies, and a significant decrease in total brain M2 receptor density. These changes were neither accompanied by observable signs of intoxication nor by changes in motor function, circadian rhythm or TSPO level (a reliable marker of brain damage). Following exposure to lower doses of 2.25 and 0.225 µg/kg/day, the only change measured was a reduction in ChE activity of 60 and 20%, respectively. Based on these results, we delineate ChE inhibition as the physiological measure most susceptible to alterations following prolonged VX exposure, and determine for the first time the threshold sub-acute VX dose for minimal physiological effects (up to 20% reduction in ChE activity) in the rat as 0.225 µg/kg/day.

  10. SU-D-BRB-01: A Comparison of Learning Methods for Knowledge Based Dose Prediction for Coplanar and Non-Coplanar Liver Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tran, A; Ruan, D; Woods, K

Purpose: The predictive power of knowledge based planning (KBP) has considerable potential in the development of automated treatment planning. Here, we examine the predictive capabilities and accuracy of previously reported KBP methods, as well as an artificial neural network (ANN) method. Furthermore, we compare the predictive accuracy of these methods on coplanar volumetric-modulated arc therapy (VMAT) and non-coplanar 4π radiotherapy. Methods: 30 liver SBRT patients previously treated using coplanar VMAT were selected for this study. The patients were re-planned using 4π radiotherapy, which involves 20 optimally selected non-coplanar IMRT fields. ANNs were used to incorporate enhanced geometric information including liver and PTV size, prescription dose, patient girth, and proximity to beams. The performance of the ANN was compared to three methods from statistical voxel dose learning (SVDL), wherein the doses of voxels sharing the same distance to the PTV are approximated by either taking the median of the distribution, non-parametric fitting, or skew-normal fitting. These three methods were shown to be capable of predicting DVH, but only median approximation can predict 3D dose. Prediction methods were tested using leave-one-out cross-validation and evaluated using the residual sum of squares (RSS) for DVH and 3D dose predictions. Results: DVH prediction using non-parametric fitting had the lowest average RSS with 0.1176 (4π) and 0.1633 (VMAT), compared to 0.4879 (4π) and 1.8744 (VMAT) for ANN. 3D dose prediction with median approximation had lower RSS with 12.02 (4π) and 29.22 (VMAT), compared to 27.95 (4π) and 130.9 (VMAT) for ANN. Conclusion: Paradoxically, although the ANNs included geometric features in addition to the distances to the PTV, they did not perform better in predicting DVH or 3D dose compared to simpler, faster methods based on the distances alone. The study further confirms that the prediction of 4π non-coplanar plans were more

  11. Biological mechanisms of non-linear dose-response for respirable mineral fibers.

    PubMed

Cox, Louis Anthony (Tony)

    2018-06-19

Sufficiently high and prolonged inhalation exposures to some respirable elongated mineral particles (REMPs), notably including amphibole asbestos fibers, can increase risk of inflammation-mediated diseases including malignant mesothelioma, pleural diseases, fibrosis, and lung cancer. Chronic inflammation involves ongoing activation of the NLRP3 inflammasome, which enables immune cells to produce potent proinflammatory cytokines IL-1β and IL-18. Reactive oxygen species (ROS) (in particular, mitochondrial ROS) contribute to NLRP3 activation via a well-elucidated mechanism involving oxidation of reduced thioredoxin and association of thioredoxin-interacting protein with NLRP3. Lysosomal destabilization, efflux of cytosolic potassium ions and influx of calcium ions, signals from damaged mitochondria, both translational and post-translational controls, and prion-like polymerization have increasingly clear roles in regulating NLRP3 activation. As the molecular biology of inflammation-mediated responses to REMP exposure becomes clearer, a practical question looms: What do these mechanisms imply for the shape of the dose-response function relating exposure concentrations and durations for EMPs to risk of pathological responses? Dose-response thresholds or threshold-like nonlinearities can arise from (a) Cooperativity in assembly of supramolecular signaling complexes; (b) Positive feedback loops and bistability in regulatory networks; (c) Overwhelming of defensive barriers maintaining homeostasis; and (d) Damage thresholds, as in lysosome destabilization-induced activation of NLRP3. Each of these mechanisms holds for NLRP3 activation in response to stimuli such as REMP exposures. It is therefore timely to consider the implications of these advances in biological understanding for human health risk assessment with dose-response thresholds. Copyright © 2018. Published by Elsevier Inc.

  12. Quantification of residual dose estimation error on log file-based patient dose calculation.

    PubMed

    Katsuta, Yoshiyuki; Kadoya, Noriyuki; Fujita, Yukio; Shimizu, Eiji; Matsunaga, Kenichi; Matsushita, Haruo; Majima, Kazuhiro; Jingu, Keiichi

    2016-05-01

The log file-based patient dose estimation includes a residual dose estimation error caused by leaf miscalibration, which cannot be reflected in the estimated dose. The purpose of this study is to determine this residual dose estimation error. Modified log files for seven head-and-neck and prostate volumetric modulated arc therapy (VMAT) plans simulating leaf miscalibration were generated by shifting both leaf banks (systematic leaf gap errors: ±2.0, ±1.0, and ±0.5 mm in opposite directions and systematic leaf shifts: ±1.0 mm in the same direction) using MATLAB-based (MathWorks, Natick, MA) in-house software. The generated modified and non-modified log files were imported back into the treatment planning system and recalculated. Subsequently, the generalized equivalent uniform dose (gEUD) was quantified for the planning target volume (PTV) and organs at risk. For MLC leaves calibrated within ±0.5 mm, the residual dose estimation errors, obtained from the slope of the linear regression of gEUD changes between non-modified and modified log-file doses per unit leaf-gap error, were 1.32±0.27% and 0.82±0.17 Gy for the PTV and spinal cord in head-and-neck plans, respectively, and 1.22±0.36%, 0.95±0.14 Gy, and 0.45±0.08 Gy for the PTV, rectum, and bladder in prostate plans, respectively. In this work, we determined the residual dose estimation errors for VMAT delivery using log file-based patient dose calculation according to the MLC calibration accuracy. Copyright © 2016 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
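    The gEUD metric used above has a standard closed form, gEUD = (mean_i d_i^a)^(1/a), where d_i are voxel doses and a is a structure-specific parameter; a minimal sketch (parameter values are left to the user, not taken from the study):

```python
import numpy as np

def gEUD(dose_voxels, a):
    """Generalized equivalent uniform dose for one structure.

    dose_voxels : 1D array of voxel doses (Gy)
    a : structure parameter (a = 1 gives the mean dose; large positive
        values weight hot spots, as appropriate for serial organs).
    """
    d = np.asarray(dose_voxels, float)
    return np.mean(d ** a) ** (1.0 / a)
```

    For a uniform dose distribution the gEUD equals that dose for any `a`, which is the defining property of the metric.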

  13. Noise reduction in Lidar signal using correlation-based EMD combined with soft thresholding and roughness penalty

    NASA Astrophysics Data System (ADS)

    Chang, Jianhua; Zhu, Lingyan; Li, Hongxu; Xu, Fan; Liu, Binggang; Yang, Zhenbo

    2018-01-01

Empirical mode decomposition (EMD) is widely used to analyze non-linear and non-stationary signals for noise reduction. In this study, a novel EMD-based denoising method, referred to as EMD with soft thresholding and roughness penalty (EMD-STRP), is proposed for Lidar signal denoising. With the proposed method, the relevant and irrelevant intrinsic mode functions are first distinguished via a correlation coefficient. Then, the soft thresholding technique is applied to the irrelevant modes, and the roughness penalty technique is applied to the relevant modes to extract as much information as possible. The effectiveness of the proposed method was evaluated using three typical signals contaminated by white Gaussian noise. The denoising performance was then compared to the denoising capabilities of other techniques, such as correlation-based EMD partial reconstruction, correlation-based EMD hard thresholding, and wavelet transform. The use of EMD-STRP on the measured Lidar signal resulted in the noise being efficiently suppressed, with an improved signal-to-noise ratio of 22.25 dB and an extended detection range of 11 km.
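    A minimal sketch of the correlation-based mode selection and soft-thresholding steps, assuming the IMFs have already been computed by some EMD implementation (the roughness-penalty smoothing of the relevant modes is omitted, and the correlation cut-off and universal-threshold choice are illustrative):

```python
import numpy as np

def soft_threshold(x, t):
    """Shrink values toward zero by t (standard soft thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def denoise_from_imfs(imfs, signal, corr_cut=0.2):
    """Reassemble a signal from IMFs: modes whose correlation with the
    noisy signal falls below `corr_cut` are treated as noise-dominated
    and soft-thresholded; the rest are kept as-is.

    imfs : (n_imfs, n_samples) array from any EMD implementation
    """
    out = np.zeros_like(np.asarray(signal, float))
    for imf in imfs:
        rho = abs(np.corrcoef(imf, signal)[0, 1])
        if rho < corr_cut:
            # universal threshold with a MAD-based noise estimate
            t = np.median(np.abs(imf)) / 0.6745 * np.sqrt(2 * np.log(len(imf)))
            out += soft_threshold(imf, t)
        else:
            out += imf  # relevant mode kept (paper smooths it instead)
    return out
```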

  14. Gamma Low-Dose-Rate Ionizing Radiation Stimulates Adaptive Functional and Molecular Response in Human Aortic Endothelial Cells in a Threshold-, Dose-, and Dose Rate–Dependent Manner

    PubMed Central

    Vieira Dias, Juliana; Gloaguen, Celine; Kereselidze, Dimitri; Manens, Line; Tack, Karine; Ebrahimian, Teni G

    2018-01-01

    A central question in radiation protection research is whether low-dose and low-dose-rate (LDR) exposures to ionizing radiation play a role in progression of cardiovascular disease. The response of endothelial cells to different LDR exposures may help estimate risk of cardiovascular disease by providing the biological mechanism involved. We investigated the effect of chronic LDR radiation on functional and molecular responses of human aorta endothelial cells (HAoECs). Human aorta endothelial cells were continuously irradiated at LDR (6 mGy/h) for 15 days and analyzed at time points when the cumulative dose reached 0.05, 0.5, 1.0, and 2.0 Gy. The same doses were administered acutely at high-dose rate (HDR; 1 Gy/min). The threshold for the loss of angiogenic capacity for both LDR and HDR radiations was between 0.5 and 1.0 Gy. At 2.0 Gy, angiogenic capacity returned to normal only for HAoEC exposed to LDR radiation, associated with increased expression of antioxidant and anti-inflammatory genes. Pre-LDR, but not pre-HDR, radiation, followed by a single acute 2.0 Gy challenge dose sustained the expression of antioxidant and anti-inflammatory genes and stimulated angiogenesis. Our results suggest that dose rate is important in cellular response and that a radioadaptive response is involved for a 2.0 Gy dose at LDR. PMID:29531508

  15. Gamma Low-Dose-Rate Ionizing Radiation Stimulates Adaptive Functional and Molecular Response in Human Aortic Endothelial Cells in a Threshold-, Dose-, and Dose Rate-Dependent Manner.

    PubMed

    Vieira Dias, Juliana; Gloaguen, Celine; Kereselidze, Dimitri; Manens, Line; Tack, Karine; Ebrahimian, Teni G

    2018-01-01

    A central question in radiation protection research is whether low-dose and low-dose-rate (LDR) exposures to ionizing radiation play a role in progression of cardiovascular disease. The response of endothelial cells to different LDR exposures may help estimate risk of cardiovascular disease by providing the biological mechanism involved. We investigated the effect of chronic LDR radiation on functional and molecular responses of human aorta endothelial cells (HAoECs). Human aorta endothelial cells were continuously irradiated at LDR (6 mGy/h) for 15 days and analyzed at time points when the cumulative dose reached 0.05, 0.5, 1.0, and 2.0 Gy. The same doses were administered acutely at high-dose rate (HDR; 1 Gy/min). The threshold for the loss of angiogenic capacity for both LDR and HDR radiations was between 0.5 and 1.0 Gy. At 2.0 Gy, angiogenic capacity returned to normal only for HAoEC exposed to LDR radiation, associated with increased expression of antioxidant and anti-inflammatory genes. Pre-LDR, but not pre-HDR, radiation, followed by a single acute 2.0 Gy challenge dose sustained the expression of antioxidant and anti-inflammatory genes and stimulated angiogenesis. Our results suggest that dose rate is important in cellular response and that a radioadaptive response is involved for a 2.0 Gy dose at LDR.

  16. Suppression of alkylating agent induced cell transformation and gastric ulceration by low-dose alkylating agent pretreatment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Onodera, Akira, E-mail: onodera@pharm.kobegakuin.ac.jp; Department of Pharmaceutical Sciences, Kobegakuin University, 1-1-3 Minatojima, Chuo-ku, Kobe 650-8586; Kawai, Yuichi

    2013-06-14

Highlights: •Low-dose MNNG pretreatment suppresses high-dose MNNG induced in vitro transformation. •Gastric ulcers induced by high-dose MNNG decreased after low-dose MNNG pretreatment. •Efficacy of low-dose MNNG related to resistance to mutation and oxidative stress. -- Abstract: Exposure to mild stress by chemicals and radiation causes DNA damage and leads to acquired stress resistance. Although the linear no-threshold (LNT) model of safety assessment assumes risk from any dose, evidence from radiological research demonstrates a conflicting hormetic phenomenon known as the hormesis effect. However, the mechanisms underlying radiation hormesis have not yet been clarified, and little is known about the effects of low doses of chemical carcinogens. We analyzed the efficacy of pretreatment with low doses of the alkylating agent N-methyl-N′-nitro-N-nitrosoguanidine (MNNG) on the subsequent induction of cell transformation and gastric ulceration by high-dose MNNG. We used an in vitro Balb/3T3 A31-1-1 cell transformation test and monitored the formation of gastric ulcers in 5-week-old male ICR mice that were administered MNNG in drinking water. The treatment concentrations of MNNG were determined by the cell survival rate and past reports. For the low-dose in vitro and in vivo experiments, MNNG was used at 0.028 μM and 2.8 μg/mL, respectively. The frequency of cell transformation induced by 10 μM MNNG was decreased by low-dose MNNG pretreatment to levels similar to that of spontaneous transformation. In addition, reactive oxygen species (ROS) and mutation frequencies induced by 10 μM MNNG were decreased by low-dose MNNG pretreatment. Importantly, low-dose MNNG pretreatment had no effect on cell proliferation. In vivo studies showed that the number of gastric ulcers induced by 1 mg/mL MNNG decreased after low-dose MNNG pretreatment. These data indicate that low-dose pretreatment with carcinogens may play a beneficial role in the prevention of chemical

  17. Identifying Thresholds for Ecosystem-Based Management

    PubMed Central

    Samhouri, Jameal F.; Levin, Phillip S.; Ainsworth, Cameron H.

    2010-01-01

    Background One of the greatest obstacles to moving ecosystem-based management (EBM) from concept to practice is the lack of a systematic approach to defining ecosystem-level decision criteria, or reference points that trigger management action. Methodology/Principal Findings To assist resource managers and policymakers in developing EBM decision criteria, we introduce a quantitative, transferable method for identifying utility thresholds. A utility threshold is the level of human-induced pressure (e.g., pollution) at which small changes produce substantial improvements toward the EBM goal of protecting an ecosystem's structural (e.g., diversity) and functional (e.g., resilience) attributes. The analytical approach is based on the detection of nonlinearities in relationships between ecosystem attributes and pressures. We illustrate the method with a hypothetical case study of (1) fishing and (2) nearshore habitat pressure using an empirically-validated marine ecosystem model for British Columbia, Canada, and derive numerical threshold values in terms of the density of two empirically-tractable indicator groups, sablefish and jellyfish. We also describe how to incorporate uncertainty into the estimation of utility thresholds and highlight their value in the context of understanding EBM trade-offs. Conclusions/Significance For any policy scenario, an understanding of utility thresholds provides insight into the amount and type of management intervention required to make significant progress toward improved ecosystem structure and function. The approach outlined in this paper can be applied in the context of single or multiple human-induced pressures, to any marine, freshwater, or terrestrial ecosystem, and should facilitate more effective management. PMID:20126647
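    One simple stand-in for detecting such a nonlinearity (not the authors' exact estimator, which works within their ecosystem model) is to locate the pressure level at which the attribute's response bends most sharply, i.e. where the second derivative is largest in magnitude:

```python
import numpy as np

def utility_threshold(pressure, attribute):
    """Locate the pressure level where an ecosystem attribute responds
    most nonlinearly, taken here as the point of maximum absolute
    second derivative of the attribute-pressure curve (a simple
    illustrative criterion, not the paper's estimator)."""
    p = np.asarray(pressure, float)
    a = np.asarray(attribute, float)
    # numerical first and second derivatives with respect to pressure
    d2 = np.gradient(np.gradient(a, p), p)
    return p[np.argmax(np.abs(d2))]
```

    On a curve that is flat up to some pressure and then declines, this picks out the kink, which is the intuition behind a utility threshold.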

  18. Evaluation of anaerobic threshold in non-pregnant and pregnant rats.

    PubMed

    Netto, Aline Oliveira; Macedo, Nathália C D; Gallego, Franciane Q; Sinzato, Yuri K; Volpato, Gustavo T; Damasceno, Débora C

    2017-01-01

Several studies present different methodologies and results concerning exercise intensity, and many of them are performed in male rats. However, the impact of different types, intensities, frequencies and durations of exercise on female rats needs more investigation. From the analysis of blood lactate concentration during the lactate minimum test (LacMin) in swimming exercise, the anaerobic threshold (AT), defined as the transition point between aerobic and anaerobic metabolism, was identified. The LacMin test is considered a good indicator of aerobic conditioning and has been used in the prescription of training in different exercise modalities. However, there is no evidence of the LacMin test in female rats. The objective was to determine AT in non-pregnant and pregnant Wistar rats. The LacMin test was performed; mild exercise intensity was defined as a load equivalent to 1% of body weight (bw), moderate exercise as carrying 4% bw, and severe intensity as carrying 7% bw. In pregnant rats, the AT was reached at a lower load, from 5.0% to 5.5% bw, while in non-pregnant rats the load was from 5.5% to 6.0% bw. Thus, this study was effective in identifying exercise intensities in pregnant and non-pregnant rats using the anaerobic threshold determined by the LacMin test.

  19. Development of a landslide EWS based on rainfall thresholds for Tuscany Region, Italy

    NASA Astrophysics Data System (ADS)

    Rosi, Ascanio; Segoni, Samuele; Battistini, Alessandro; Rossi, Guglielmo; Catani, Filippo; Casagli, Nicola

    2017-04-01

We present the set-up of a landslide EWS based on rainfall thresholds for the Tuscany region (central Italy), which shows a heterogeneous distribution of reliefs and precipitation. The work started with the definition of a single set of thresholds for the whole region, but this proved unsuitable for EWS purposes because of the heterogeneity of the Tuscan territory and the non-repeatability of the analyses, which were affected by a high degree of subjectivity. To overcome this problem, we implemented a software tool capable of objectively defining rainfall thresholds, since two of the main issues with such thresholds are the subjectivity of the analysis and therefore its non-repeatability. This software, named MaCumBA, is largely automated and can analyze, in a short time, a large number of rainfall events to define several parameters of the threshold, such as the intensity (I) and duration (D) of the rainfall event, the no-rain time gap (NRG: how many hours without rain are needed to consider two events as separate) and the equation describing the threshold. The possibility of quickly performing several analyses led to the decision to divide the territory into 25 homogeneous areas (named alert zones, AZ), so that a single threshold could be defined for each AZ. For the definition of the thresholds, two independent datasets (of joint rainfall-landslide occurrences) were used: a calibration dataset (data from 2000 to 2007) and a validation dataset (2008-2009). Once the thresholds were defined, a WebGIS-based EWS was implemented. In this system it is possible to focus both on monitoring of real-time data and on forecasting at different lead times up to 48 h; forecasting data are collected from LAMI (Limited Area Model Italy) rainfall forecasts. The EWS works on the basis of the threshold parameters defined by MaCumBA (I, D, NRG). An important feature of the warning system is that the visualization of the thresholds in the Web
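    Intensity-duration rainfall thresholds of the kind MaCumBA calibrates are conventionally written as a power law, I = α·D^β; a minimal sketch of the exceedance test an EWS would run per alert zone (the α and β values in the test are purely illustrative, not Tuscan calibrations):

```python
def exceeds_threshold(intensity_mm_h, duration_h, alpha, beta):
    """Return True when a rainfall event of the given mean intensity
    (mm/h) and duration (h) lies on or above the power-law threshold
    I = alpha * D**beta calibrated for an alert zone."""
    return intensity_mm_h >= alpha * duration_h ** beta
```

    Because β is typically negative, long events trip the alarm at much lower intensities than short bursts.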

  20. Determination of prospective displacement-based gate threshold for respiratory-gated radiation delivery from retrospective phase-based gate threshold selected at 4D CT simulation.

    PubMed

    Vedam, S; Archambault, L; Starkschall, G; Mohan, R; Beddar, S

    2007-11-01

    Four-dimensional (4D) computed tomography (CT) imaging has found increasing importance in the localization of tumor and surrounding normal structures throughout the respiratory cycle. Based on such tumor motion information, it is possible to identify the appropriate phase interval for respiratory gated treatment planning and delivery. Such a gating phase interval is determined retrospectively based on tumor motion from internal tumor displacement. However, respiratory-gated treatment is delivered prospectively based on motion determined predominantly from an external monitor. Therefore, the simulation gate threshold determined from the retrospective phase interval selected for gating at 4D CT simulation may not correspond to the delivery gate threshold that is determined from the prospective external monitor displacement at treatment delivery. The purpose of the present work is to establish a relationship between the thresholds for respiratory gating determined at CT simulation and treatment delivery, respectively. One hundred fifty external respiratory motion traces, from 90 patients, with and without audio-visual biofeedback, are analyzed. Two respiratory phase intervals, 40%-60% and 30%-70%, are chosen for respiratory gating from the 4D CT-derived tumor motion trajectory. From residual tumor displacements within each such gating phase interval, a simulation gate threshold is defined based on (a) the average and (b) the maximum respiratory displacement within the phase interval. The duty cycle for prospective gated delivery is estimated from the proportion of external monitor displacement data points within both the selected phase interval and the simulation gate threshold. The delivery gate threshold is then determined iteratively to match the above determined duty cycle. The magnitude of the difference between such gate thresholds determined at simulation and treatment delivery is quantified in each case. 
Phantom motion tests yielded coincidence of simulation
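    The duty-cycle matching described above can be sketched simply: if the beam is gated on whenever the external displacement lies below a threshold, then the delivery gate threshold that reproduces a given duty cycle is the corresponding quantile of the displacement trace (a one-line simplification of the iterative matching procedure, for illustration only):

```python
import numpy as np

def delivery_gate_threshold(displacement, target_duty):
    """Displacement threshold whose duty cycle (fraction of samples at
    or below the threshold, i.e. beam-on time) matches the duty cycle
    derived at 4D CT simulation."""
    return np.quantile(np.asarray(displacement, float), target_duty)
```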

  1. Outcomes Associated with Reducing the Urine Alkalinization Threshold in Patients Receiving High-Dose Methotrexate.

    PubMed

    Drost, Sarah A; Wentzell, Jason R; Giguère, Pierre; McLurg, Darcy L; Sabloff, Mitchell; Kanji, Salmaan; Nguyen, Tiffany T

    2017-06-01

    Urine alkalinization increases methotrexate (MTX) solubility and reduces the risk of nephrotoxicity. The objectives of this study were to determine whether a reduction in the urine pH threshold from 8 to 7 in patients receiving high-dose methotrexate (HDMTX) results in a shorter length of hospital stay, delayed MTX clearance, or higher rates of nephrotoxicity; and to determine whether specific factors were associated with prolonged MTX clearance. Retrospective cohort study. Hematology service of a large university-affiliated teaching hospital in Ottawa, Canada. Sixty-five adults with 150 HDMTX exposures who had elective admissions for HDMTX between September 1, 2014, and December 18, 2015, were included. Thirty-four patients (with 79 HDMTX exposures) had their urine alkalinized to a pH of 8 or higher, and 31 patients (with 71 HDMTX exposures) had their urine alkalinized to a pH of 7 or higher, after an institutional change in the urine pH threshold from 8 to 7 was implemented on May 1, 2015. Data related to patient demographics, urine alkalinization, MTX serum concentration monitoring, hospital length of stay, and renal function were collected retrospectively from patients' electronic health records. Lowering the urine pH threshold from 8 to 7 did not significantly affect hospital length of stay (absolute difference 3.5 hrs, 95% confidence interval -4.0 to 10.9) or clearance of MTX (elimination rate constant 0.058 in the pH of 7 or higher group vs 0.064 in the pH of 8 or higher group, p=0.233). Nephrotoxicity rates were similar between groups (15.5% in the pH of 7 or higher group vs 10.1% in the pH of 8 or higher group, p=0.34). Higher MTX dose and interacting medications (e.g., proton pump inhibitors and sulfonamide antibiotics) were significantly associated with delayed MTX elimination. 
No significant differences in HDMTX-associated hospital length of stay, MTX clearance, or rates of nephrotoxicity were noted between patients in the urine pH of 7 or higher and 8

  2. Damage threshold of platinum coating used for optics for self-seeding of soft x-ray free electron laser

    DOE PAGES

    Krzywinski, Jacek; Cocco, Daniele; Moeller, Stefan; ...

    2015-02-23

We investigated the experimental damage threshold of a platinum coating on a silicon substrate illuminated by soft x-ray radiation at a grazing incidence angle of 2.1 deg. The coating was the same as that of the blazed grating used for the soft x-ray self-seeding optics of the Linac Coherent Light Source free electron laser. The irradiation condition was chosen such that the absorbed dose was similar to the maximum dose expected for the grating. The expected dose was simulated by solving the Helmholtz equation in non-homogeneous media. The experiment was performed at 900 eV photon energy for both single-pulse and multi-shot conditions. We have not observed single-shot damage, which corresponds to a single-shot damage threshold higher than 3 J/cm². The multiple-shot damage threshold measured for 10 shots and for about 600 shots was determined to be 0.95 J/cm² and 0.75 J/cm², respectively. The damage threshold occurred at an instantaneous dose which is higher than the melt dose of platinum.

  3. Wavelet-based adaptive thresholding method for image segmentation

    NASA Astrophysics Data System (ADS)

    Chen, Zikuan; Tao, Yang; Chen, Xin; Griffis, Carl

    2001-05-01

A nonuniform background distribution may cause a global thresholding method to fail to segment objects. One solution is using a local thresholding method that adapts to local surroundings. In this paper, we propose a novel local thresholding method for image segmentation, using multiscale threshold functions obtained by wavelet synthesis with weighted detail coefficients. In particular, the coarse-to-fine synthesis with attenuated detail coefficients produces a threshold function corresponding to a high-frequency-reduced signal. This wavelet-based local thresholding method adapts to both local size and local surroundings, and its implementation can take advantage of the fast wavelet algorithm. We applied this technique to physical contaminant detection for poultry meat inspection using x-ray imaging. Experiments showed that inclusion objects in deboned poultry could be extracted at multiple resolutions despite their irregular sizes and uneven backgrounds.
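    A 1D sketch of the synthesis-with-attenuated-details idea, using a Haar transform as a stand-in for the paper's multiscale threshold functions (the attenuation factor and level count are illustrative; signal length must be divisible by 2^levels):

```python
import numpy as np

def haar_threshold_function(signal, levels=3, atten=0.2):
    """Build a slowly varying threshold function by Haar wavelet
    analysis followed by synthesis with attenuated detail coefficients,
    yielding a high-frequency-reduced version of the input."""
    x = np.asarray(signal, float)
    details, approx = [], x
    for _ in range(levels):                       # analysis
        a = (approx[0::2] + approx[1::2]) / 2.0   # averages
        d = (approx[0::2] - approx[1::2]) / 2.0   # details
        details.append(d)
        approx = a
    for d in reversed(details):                   # synthesis
        up = np.empty(2 * len(d))
        up[0::2] = approx + atten * d             # attenuated detail ->
        up[1::2] = approx - atten * d             # smoother threshold
        approx = up
    return approx
```

    With `atten=1.0` the transform is exactly invertible; shrinking `atten` toward zero flattens the result toward the local background level, which is what makes it usable as an adaptive threshold surface.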

  4. Impact of PET and MRI threshold-based tumor volume segmentation on patient-specific targeted radionuclide therapy dosimetry using CLR1404.

    PubMed

    Besemer, Abigail E; Titz, Benjamin; Grudzinski, Joseph J; Weichert, Jamey P; Kuo, John S; Robins, H Ian; Hall, Lance T; Bednarz, Bryan P

    2017-07-06

    Variations in tumor volume segmentation methods in targeted radionuclide therapy (TRT) may lead to dosimetric uncertainties. This work investigates the impact of PET and MRI threshold-based tumor segmentation on TRT dosimetry in patients with primary and metastatic brain tumors. In this study, PET/CT images of five brain cancer patients were acquired at 6, 24, and 48 h post-injection of 124I-CLR1404. The tumor volume was segmented using two standardized uptake value (SUV) threshold levels, two tumor-to-background ratio (TBR) threshold levels, and a T1 Gadolinium-enhanced MRI threshold. The Dice similarity coefficient (DSC), Jaccard similarity coefficient (JSC), and overlap volume (OV) metrics were calculated to compare differences in the MRI and PET contours. The therapeutic 131I-CLR1404 voxel-level dose distribution was calculated from the 124I-CLR1404 activity distribution using RAPID, a Geant4 Monte Carlo internal dosimetry platform. The TBR, SUV, and MRI tumor volumes ranged from 2.3-63.9 cc, 0.1-34.7 cc, and 0.4-11.8 cc, respectively. The average ± standard deviation (range) was 0.19 ± 0.13 (0.01-0.51), 0.30 ± 0.17 (0.03-0.67), and 0.75 ± 0.29 (0.05-1.00) for the JSC, DSC, and OV, respectively. The DSC and JSC values were small and the OV values were large for both the MRI-SUV and MRI-TBR combinations because the regions of PET uptake were generally larger than the MRI enhancement. Notable differences in the tumor dose volume histograms were observed for each patient. The mean (standard deviation) 131I-CLR1404 tumor doses ranged from 0.28-1.75 Gy GBq-1 (0.07-0.37 Gy GBq-1). The ratio of maximum-to-minimum mean doses for each patient ranged from 1.4-2.0. The tumor volume and the interpretation of the tumor dose are highly sensitive to the imaging modality, PET enhancement metric, and threshold level used for tumor volume segmentation. The large variations in tumor doses clearly demonstrate the need for
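    The three contour-comparison metrics named in the abstract follow their standard definitions; a minimal sketch (the helper name and toy masks are ours, not from the study):

    ```python
    import numpy as np

    def overlap_metrics(a: np.ndarray, b: np.ndarray):
        """Dice (DSC), Jaccard (JSC) and overlap volume (OV) for two
        boolean segmentation masks, using the standard definitions."""
        a, b = a.astype(bool), b.astype(bool)
        inter = np.logical_and(a, b).sum()
        union = np.logical_or(a, b).sum()
        dsc = 2 * inter / (a.sum() + b.sum())
        jsc = inter / union
        ov = inter / min(a.sum(), b.sum())  # overlap fraction of the smaller volume
        return dsc, jsc, ov

    # Two partially overlapping "tumor" masks on a toy grid
    a = np.zeros((10, 10), bool); a[2:6, 2:6] = True   # 16 voxels
    b = np.zeros((10, 10), bool); b[4:8, 4:8] = True   # 16 voxels, 4 shared
    dsc, jsc, ov = overlap_metrics(a, b)
    print(round(dsc, 3), round(jsc, 3), round(ov, 3))  # 0.25 0.143 0.25
    ```

    Note that OV normalizes by the smaller mask, which is why the abstract reports large OV but small DSC/JSC when one contour (PET) mostly contains the other (MRI).
    
    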

  6. 3D SAPIV particle field reconstruction method based on adaptive threshold.

    PubMed

    Qu, Xiangju; Song, Yang; Jin, Ying; Li, Zhenhua; Wang, Xuezhen; Guo, ZhenYan; Ji, Yunjing; He, Anzhi

    2018-03-01

    Particle image velocimetry (PIV) is an essential flow field diagnostic technique that provides instantaneous velocimetry information non-intrusively. Three-dimensional (3D) PIV methods can supply a full understanding of a 3D structure, the complete stress tensor, and the vorticity vector in complex flows. In synthetic aperture particle image velocimetry (SAPIV), the flow field can be measured at high particle densities from the same direction by different cameras. During SAPIV particle reconstruction, particles are commonly reconstructed by manually setting a threshold to filter out unfocused particles in the refocused images. In this paper, the particle intensity distribution in refocused images is analyzed, and a SAPIV particle field reconstruction method based on an adaptive threshold is presented. By using the adaptive threshold to filter the 3D measurement volume as a whole, the three-dimensional location information of the focused particles can be reconstructed. The cross-correlations between the images captured by the cameras and the images projected from the reconstructed particle field are calculated for different threshold values. The optimal threshold is determined by cubic curve fitting and is defined as the threshold value at which the correlation coefficient reaches its maximum. A numerical simulation of a 16-camera array and a particle field at two adjacent time events quantitatively evaluates the performance of the proposed method. An experimental system consisting of an array of 16 cameras was used to reconstruct four adjacent frames in a vortex flow field. The results show that the proposed reconstruction method can effectively reconstruct 3D particle fields.
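    The cubic-curve-fitting step for selecting the optimal threshold can be sketched as follows (the correlation-scoring function itself is omitted; the function name and the synthetic correlation curve are ours, not the authors'):

    ```python
    import numpy as np

    def optimal_threshold(thresholds, correlations):
        """Fit a cubic to (threshold, correlation) samples and return the
        threshold at which the fitted curve peaks, mirroring the paper's
        cubic-curve-fitting step (sketch only)."""
        coeffs = np.polyfit(thresholds, correlations, 3)
        fine = np.linspace(min(thresholds), max(thresholds), 1000)
        fitted = np.polyval(coeffs, fine)
        return fine[np.argmax(fitted)]

    # Synthetic correlation-vs-threshold curve peaking near t = 0.4
    t = np.linspace(0.1, 0.9, 9)
    corr = -(t - 0.4) ** 2 + 0.8       # a quadratic is fit exactly by a cubic
    print(round(optimal_threshold(t, corr), 2))  # 0.4
    ```
    
    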

  7. Determination of prospective displacement-based gate threshold for respiratory-gated radiation delivery from retrospective phase-based gate threshold selected at 4D CT simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vedam, S.; Archambault, L.; Starkschall, G.

    2007-11-15

    Four-dimensional (4D) computed tomography (CT) imaging has found increasing importance in the localization of tumor and surrounding normal structures throughout the respiratory cycle. Based on such tumor motion information, it is possible to identify the appropriate phase interval for respiratory-gated treatment planning and delivery. Such a gating phase interval is determined retrospectively based on internal tumor displacement. However, respiratory-gated treatment is delivered prospectively based on motion determined predominantly from an external monitor. Therefore, the simulation gate threshold determined from the retrospective phase interval selected for gating at 4D CT simulation may not correspond to the delivery gate threshold that is determined from the prospective external monitor displacement at treatment delivery. The purpose of the present work is to establish a relationship between the thresholds for respiratory gating determined at CT simulation and treatment delivery, respectively. One hundred fifty external respiratory motion traces, from 90 patients, with and without audio-visual biofeedback, are analyzed. Two respiratory phase intervals, 40%-60% and 30%-70%, are chosen for respiratory gating from the 4D CT-derived tumor motion trajectory. From residual tumor displacements within each such gating phase interval, a simulation gate threshold is defined based on (a) the average and (b) the maximum respiratory displacement within the phase interval. The duty cycle for prospective gated delivery is estimated from the proportion of external monitor displacement data points within both the selected phase interval and the simulation gate threshold. The delivery gate threshold is then determined iteratively to match the above determined duty cycle. The magnitude of the difference between such gate thresholds determined at simulation and treatment delivery is quantified in each case. Phantom motion tests yielded coincidence of
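    A displacement gate threshold that reproduces a target duty cycle can be sketched as a percentile of the external monitor trace (this is a simplification of the paper's iterative matching; gating near end-exhale, i.e. low displacement, is an assumption of the example):

    ```python
    import numpy as np

    def delivery_gate_threshold(displacement, duty_cycle):
        """Displacement threshold below which the requested fraction of
        external monitor samples falls, so that gating on it yields the
        target duty cycle (percentile shortcut, not the paper's code)."""
        return np.percentile(displacement, 100 * duty_cycle)

    # Synthetic breathing trace: 4 s period, end-exhale at the minimum
    t = np.linspace(0, 60, 6000)                 # 60 s sampled at ~100 Hz
    disp = 1 - np.cos(2 * np.pi * t / 4)         # displacement in [0, 2]
    thr = delivery_gate_threshold(disp, 0.4)     # beam on 40% of the time
    duty = np.mean(disp <= thr)                  # achieved duty cycle
    print(round(duty, 2))
    ```

    The percentile guarantees the achieved duty cycle matches the one estimated at simulation, which is the quantity the paper's iteration converges on.
    
    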

  8. A fully automatic, threshold-based segmentation method for the estimation of the Metabolic Tumor Volume from PET images: validation on 3D printed anthropomorphic oncological lesions

    NASA Astrophysics Data System (ADS)

    Gallivanone, F.; Interlenghi, M.; Canervari, C.; Castiglioni, I.

    2016-01-01

    18F-Fluorodeoxyglucose (18F-FDG) Positron Emission Tomography (PET) is a standard functional diagnostic technique for imaging cancer in vivo. Different quantitative parameters can be extracted from PET images and used as in vivo cancer biomarkers. Among PET biomarkers, Metabolic Tumor Volume (MTV) has gained an important role, particularly considering the development of patient-personalized radiotherapy treatment for non-homogeneous dose delivery. Different image processing methods have been developed to define MTV. The proposed PET segmentation strategies were typically validated under ideal conditions (e.g. in spherical objects with uniform radioactivity concentration), while the majority of cancer lesions do not fulfill these requirements. In this context, this work has a twofold objective: 1) to implement and optimize a fully automatic, threshold-based segmentation method for the estimation of MTV that is feasible in clinical practice, and 2) to develop a strategy to obtain anthropomorphic phantoms, including non-spherical and non-uniform objects, mimicking realistic oncological patient conditions. The developed PET segmentation algorithm combines an automatic threshold-based algorithm for the definition of MTV and a k-means clustering algorithm for the estimation of the background. The method is based on parameters that are always available in clinical studies and was calibrated using the NEMA IQ Phantom. Validation of the method was performed both under ideal (e.g. in spherical objects with uniform radioactivity concentration) and non-ideal (e.g. in non-spherical objects with a non-uniform radioactivity concentration) conditions. The strategy to obtain a phantom with synthetic realistic lesions (e.g. with irregular shape and non-homogeneous uptake) consisted in the combined use of commercially available anthropomorphic phantoms and irregular molds generated using 3D printer technology and filled with a radioactive chromatic alginate. The proposed segmentation algorithm was feasible in a

  9. Chaotic Signal Denoising Based on Hierarchical Threshold Synchrosqueezed Wavelet Transform

    NASA Astrophysics Data System (ADS)

    Wang, Wen-Bo; Jing, Yun-yu; Zhao, Yan-chao; Zhang, Lian-Hua; Wang, Xiang-Li

    2017-12-01

    To overcome the shortcomings of the single-threshold synchrosqueezed wavelet transform (SWT) denoising method, an adaptive hierarchical-threshold SWT chaotic signal denoising method is proposed. Firstly, a new SWT threshold function is constructed based on Stein's unbiased risk estimate, which is twice continuously differentiable. Then, using the new threshold function, a thresholding process based on the minimum mean square error is implemented, and the optimal estimate of each layer's threshold in SWT chaotic denoising is obtained. Experimental results on a simulated chaotic signal and measured sunspot signals show that the proposed method can effectively filter the noise of a chaotic signal, and the intrinsic chaotic characteristics of the original signal can be recovered very well. Compared with the EEMD denoising method and the single-threshold SWT denoising method, the proposed method obtains better denoising results for chaotic signals.

  10. Ionizing radiation sensitivity of the ocular lens and its dose rate dependence.

    PubMed

    Hamada, Nobuyuki

    2017-10-01

    In 2011, the International Commission on Radiological Protection reduced the threshold for the lens effects of low linear energy transfer (LET) radiation. On one hand, the revised threshold of 0.5 Gy is much lower than previously recommended thresholds, but mechanisms behind high radiosensitivity remain incompletely understood. On the other hand, such a threshold is independent of dose rate, in contrast to previously recommended separate thresholds each for single and fractionated/protracted exposures. Such a change was made predicated on epidemiological evidence suggesting that a threshold for fractionated/protracted exposures is not higher than an acute threshold, and that a chronic threshold is uncertain. Thus, the dose rate dependence is still unclear. This paper therefore reviews the current knowledge on the radiosensitivity of the lens and the dose rate dependence of radiation cataractogenesis, and discusses its mechanisms. Mounting biological evidence indicates that the lens cells are not necessarily radiosensitive to cell killing, and the high radiosensitivity of the lens thus appears to be attributable to other mechanisms (e.g., excessive proliferation, abnormal differentiation, a slow repair of DNA double-strand breaks, telomere, senescence, crystallin changes, non-targeted effects and inflammation). Both biological and epidemiological evidence generally supports the lack of dose rate effects. However, there is also biological evidence for the tissue sparing dose rate (or fractionation) effect of low-LET radiation and an enhancing inverse dose fractionation effect of high-LET radiation at a limited range of LET. Emerging epidemiological evidence in chronically exposed individuals implies the inverse dose rate effect. Further biological and epidemiological studies are warranted to gain deeper knowledge on the radiosensitivity of the lens and dose rate dependence of radiation cataractogenesis.

  11. Pain thresholds, supra-threshold pain and lidocaine sensitivity in patients with erythromelalgia, including the I848T mutation in NaV1.7.

    PubMed

    Helås, T; Sagafos, D; Kleggetveit, I P; Quiding, H; Jönsson, B; Segerdahl, M; Zhang, Z; Salter, H; Schmelz, M; Jørum, E

    2017-09-01

    Nociceptive thresholds and supra-threshold pain ratings, as well as their reduction upon local injection of lidocaine, were compared between healthy subjects and patients with erythromelalgia (EM). Lidocaine (0.25, 0.50, 1.0 or 10 mg/mL) or placebo (saline) was injected intradermally in non-painful areas of the lower arm, in a randomized, double-blind manner, to test the effect on dynamic and static mechanical sensitivity, mechanical pain sensitivity, thermal thresholds and supra-threshold heat pain sensitivity. Heat pain thresholds and pain ratings to supra-threshold heat stimulation did not differ between EM patients (n = 27) and controls (n = 25), and neither did the dose-response curves for lidocaine. Only the subgroup of EM patients with mutations in sodium channel subunits NaV1.7, 1.8 or 1.9 (n = 8) had increased lidocaine sensitivity for supra-threshold heat stimuli, contrasting with lower sensitivity to strong mechanical stimuli. This pattern was particularly clear in the two patients carrying the NaV1.7 I848T mutation, in whom lidocaine's hyperalgesic effect on mechanical pain sensitivity contrasted with more effective heat analgesia. Heat pain thresholds are not sensitized in EM patients, even in those with gain-of-function mutations in NaV1.7. Differential lidocaine sensitivity was overt only for noxious stimuli in the supra-threshold range, suggesting that sensitized supra-threshold encoding is important for the clinical pain phenotype in EM in addition to a lower activation threshold. Intracutaneous lidocaine dose-dependently blocked nociceptive sensations, but we did not identify EM patients with particularly high lidocaine sensitivity that could have provided valuable therapeutic guidance. Acute pain thresholds and supra-threshold heat pain in controls and patients with erythromelalgia do not differ and have the same lidocaine sensitivity. Acute heat pain thresholds even in EM patients with the NaV1.7 I848T mutation are normal and only nociceptor

  12. A Morphological Hessian Based Approach for Retinal Blood Vessels Segmentation and Denoising Using Region Based Otsu Thresholding

    PubMed Central

    BahadarKhan, Khan; A Khaliq, Amir; Shahid, Muhammad

    2016-01-01

    Diabetic Retinopathy (DR) harms retinal blood vessels in the eye, causing visual deficiency. The appearance and structure of blood vessels in retinal images play an essential part in the diagnosis of eye diseases. We propose a computationally inexpensive unsupervised automated technique, with promising results, for detection of the retinal vasculature using a morphological Hessian-based approach and region-based Otsu thresholding. Contrast Limited Adaptive Histogram Equalization (CLAHE) and morphological filters have been used for enhancement and to remove low-frequency noise or geometrical objects, respectively. The Hessian matrix and eigenvalue approach has been used in a modified form at two different scales to extract wide and thin vessel-enhanced images separately. Otsu thresholding has been further applied in a novel way to classify vessel and non-vessel pixels from both enhanced images. Finally, postprocessing steps have been used to eliminate unwanted regions/segments, non-vessel pixels, disease abnormalities and noise, to obtain a final segmented image. The proposed technique has been analyzed on the openly accessible DRIVE (Digital Retinal Images for Vessel Extraction) and STARE (STructured Analysis of the REtina) databases along with ground truth data that has been precisely marked by experts. PMID:27441646

  13. Intravenously administered oxotremorine and atropine, in doses known to affect pain threshold, affect the intraspinal release of acetylcholine in rats.

    PubMed

    Abelson, Klas S P; Höglund, A Urban

    2002-04-01

    Both systemically and intrathecally administered cholinergic agonists produce antinociception while cholinergic antagonists decrease pain threshold. The mechanism and the site of action of these substances are not known. In the present study it was hypothesized that systemically administered muscarinic agonists and antagonists modify nociceptive threshold by affecting intraspinal release of acetylcholine (ACh). Catheters were inserted into the femoral vein in rats maintained on isoflurane anaesthesia for administration of oxotremorine (10-300 microg/kg) and atropine (0.1, 10, 5000 microg/kg). Spinal microdialysis probes were placed intraspinally at approximately the C2-C5 spinal level for sampling of acetylcholine and dialysis delivery of atropine (0.1, 1, 10 nM). Additionally, the tail-flick behaviour was tested on conscious rats injected intraperitoneally with saline, atropine (10, 100 and 5000 microg/kg), or subcutaneously with oxotremorine (30, 100, 300 microg/kg). Subcutaneous administration of oxotremorine (30, 100, 300 microg/kg) significantly increased the tail-flick latency. These doses of oxotremorine dose-dependently increased the intraspinal release of acetylcholine. Intravenously administered atropine, in a dose that produced hyperalgesia (5000 microg/kg) in the tail-flick test, significantly decreased the intraspinal release of acetylcholine. Our results suggest an association between pain threshold and acetylcholine release in spinal cord. It is also suggested that an approximately 30% increase in basal ACh release produces antinociception and that a 30% decrease in basal release produces hyperalgesia.

  14. Application of the "threshold of toxicological concern" to derive tolerable concentrations of "non-relevant metabolites" formed from plant protection products in ground and drinking water.

    PubMed

    Melching-Kollmuss, Stephanie; Dekant, Wolfgang; Kalberlah, Fritz

    2010-03-01

    Limits for tolerable concentrations of ground water metabolites ("non-relevant metabolites" without targeted toxicities and without specific classification and labeling) derived from active ingredients (AIs) of plant protection products (PPPs) are discussed in the European Union. Risk assessments for "non-relevant metabolites" need to be performed when concentrations are above 0.75 microg/L. Since oral uptake is the only relevant exposure pathway for "non-relevant metabolites", risk assessment approaches as used for other chemicals with predominantly oral exposure in humans are applicable. The concept of "thresholds of toxicological concern" (TTC) defines tolerable dietary intakes for chemicals without toxicity data and is widely applied to chemicals present in food in low concentrations, such as flavorings. Based on a statistical evaluation of the results of many toxicity studies and considerations of chemical structures, the TTC concept derives a maximum daily oral intake without concern of 90 microg/person/day for non-genotoxic chemicals, even for those with appreciable toxicity. When using the typical exposure assessment for drinking water contaminants (consumption of 2 L of drinking water/person/day, allocation of 10% of the tolerable daily intake to drinking water), a TTC-based upper concentration limit of 4.5 microg/L for "non-relevant metabolites" in ground/drinking water is delineated. In the present publication, it was evaluated whether this value would cover all relevant toxicities (repeated-dose, reproductive and developmental, and immune effects). Taking into account that, after evaluation of specific reproduction toxicity data on chemicals and pharmaceuticals, a value of 1 microg/kg bw/day was assessed to cover developmental and reproduction toxicity, a TTC value of 60 microg/person/day was assessed to represent a safe value. Based on these reasonable worst-case assumptions, a TTC-derived threshold of 3 microg/L in drinking water is derived.
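    The abstract's arithmetic can be checked directly (values taken from the text above; the function name is ours):

    ```python
    # TTC of 90 ug/person/day, 10% of the tolerable daily intake
    # allocated to drinking water, 2 L consumed per day
    # -> upper concentration limit in ug/L.
    def drinking_water_limit(ttc_ug_per_day, allocation=0.10, litres_per_day=2.0):
        return ttc_ug_per_day * allocation / litres_per_day

    print(round(drinking_water_limit(90), 2))  # 4.5 ug/L (general TTC)
    print(round(drinking_water_limit(60), 2))  # 3.0 ug/L (reproduction-toxicity-protective TTC)
    ```
    
    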

  15. Assessing the role of soil water limitation in determining the Phytotoxic Ozone Dose (PODY) thresholds

    NASA Astrophysics Data System (ADS)

    De Marco, Alessandra; Sicard, Pierre; Fares, Silvano; Tuovinen, Juha-Pekka; Anav, Alessandro; Paoletti, Elena

    2016-12-01

    Phytotoxic Ozone Dose (PODY), defined as the accumulated stomatal ozone flux over a threshold of Y, is considered an optimal metric to evaluate O3 effects on vegetation. PODY is often computed through the DO3SE model, which includes species-specific parameterizations for the environmental response of stomatal conductance. However, the effect of soil water content (SWC) on stomatal aperture is difficult to model on a regional scale and thus often ignored. In this study, we used environmental input data obtained from the WRF-CHIMERE model for 14,546 grid-based forest sites in Southern Europe. SWC was obtained for the upper 10 cm of soil, which resulted in a worst-case risk scenario. PODY was calculated either with or without water limitation for different Y thresholds. Exclusion of the SWC effect on stomatal fluxes caused a serious overestimation of PODY. The difference increased with increasing Y (78%, 128%, 237% and 565% with Y = 0, 1, 2 and 3 nmol O3 m-2 s-1, respectively). This behaviour was confirmed by applying the same approach to field data measured in a Mediterranean Quercus ilex forest. WRF-CHIMERE overestimated SWC at this field site, so under real-world conditions the SWC effect may be larger than modelled. The differences were lower for temperate species (Pinus cembra 50-340%, P. sylvestris 57-363%, Abies alba 57-371%) than for Mediterranean species (P. pinaster 87-356%, P. halepensis 96-429%, P. pinea 107-532%, Q. suber 104-1602%), although a high difference was recorded also for the temperate species Fagus sylvatica with POD3 (524%). We conclude that SWC should be considered in PODY simulations and a low Y threshold should be used for robustness.
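    The PODY metric itself is straightforward to compute from a stomatal flux time series; a sketch with synthetic hourly values (units and the nmol-to-mmol conversion follow the standard definition, not this paper's code):

    ```python
    import numpy as np

    def pody(stomatal_flux, y_threshold, dt_hours=1.0):
        """Accumulated stomatal ozone flux over a threshold Y (PODY).
        Flux in nmol O3 m-2 s-1; only the excess above Y is accumulated,
        and the result is converted to mmol m-2."""
        excess = np.clip(stomatal_flux - y_threshold, 0, None)  # nmol m-2 s-1
        seconds = dt_hours * 3600.0
        return excess.sum() * seconds * 1e-6                    # nmol -> mmol

    flux = np.array([0.5, 1.5, 3.0, 2.0, 0.0])    # hourly mean fluxes
    print(round(pody(flux, y_threshold=1.0), 3))  # 0.013 (only hours above Y count)
    ```

    Because only the excess above Y accumulates, any bias in stomatal conductance (e.g. from ignoring soil water limitation) inflates PODY disproportionately at higher Y, which is the pattern the abstract reports.
    
    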

  16. The ship edge feature detection based on high and low threshold for remote sensing image

    NASA Astrophysics Data System (ADS)

    Li, Xuan; Li, Shengyang

    2018-05-01

    In this paper, a method based on high and low thresholds is proposed to detect ship edge features, addressing the low accuracy caused by noise. The relationship between the human vision system and the target features is analyzed, and the ship target is determined by detecting its edge features. Firstly, a second-order differential method is used to enhance image quality. Secondly, to improve the edge operator, high and low threshold contrast is introduced to separate image edge and non-edge points; treating edge points as the foreground and non-edge points as the background, image segmentation is used to achieve edge detection and remove false edges. Finally, the edge features are described based on the detection result, and the ship target is determined. The experimental results show that the proposed method can effectively reduce the number of false edges in edge detection and achieves high accuracy in remote sensing ship edge detection.
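    The high/low (hysteresis) thresholding step can be sketched as a generic double-threshold edge classifier (this is the textbook scheme, not the authors' implementation; the gradient array is a toy example):

    ```python
    import numpy as np

    def hysteresis_threshold(gradient, low, high, max_iter=100):
        """Pixels above `high` are edges; pixels between `low` and `high`
        are kept only if 4-connected to a strong edge (iterative growth),
        which suppresses isolated false edges from noise."""
        strong = gradient >= high
        weak = (gradient >= low) & ~strong
        edges = strong.copy()
        for _ in range(max_iter):
            grown = edges.copy()
            grown[1:, :] |= edges[:-1, :]   # grow edges into each
            grown[:-1, :] |= edges[1:, :]   # 4-connected neighbour,
            grown[:, 1:] |= edges[:, :-1]   # then keep only candidate
            grown[:, :-1] |= edges[:, 1:]   # (strong or weak) pixels
            grown &= strong | weak
            if np.array_equal(grown, edges):
                break
            edges = grown
        return edges

    g = np.array([[0, 2, 9, 2, 0],
                  [0, 0, 2, 0, 0],
                  [0, 2, 0, 0, 0]], float)
    e = hysteresis_threshold(g, low=1.5, high=8.0)
    print(e.astype(int))   # the isolated weak pixel at (2, 1) is discarded
    ```
    
    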

  17. Automatic threshold optimization in nonlinear energy operator based spike detection.

    PubMed

    Malik, Muhammad H; Saeed, Maryam; Kamboh, Awais M

    2016-08-01

    In neural spike sorting systems, the performance of the spike detector has to be maximized because it affects the performance of all subsequent blocks. The non-linear energy operator (NEO) is a popular spike detector due to its detection accuracy and its hardware-friendly architecture. However, it involves a thresholding stage whose value is usually approximated and is thus not optimal. This approximation deteriorates performance in real-time systems where signal-to-noise ratio (SNR) estimation is a challenge, especially at lower SNRs. In this paper, we propose an automatic and robust threshold calculation method using an empirical gradient technique. The method is tested on two different datasets. The results show that our optimized threshold improves the detection accuracy for both high-SNR and low-SNR signals. Boxplots are presented that provide a statistical analysis of the improvements in accuracy; for instance, the 75th percentile was at 98.7% and 93.5% for the optimized NEO threshold and the traditional NEO threshold, respectively.
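    The NEO stage with the traditional mean-scaled threshold can be sketched as follows (the scaling constant `c` is an assumed conventional choice, i.e. the kind of approximation the paper sets out to replace, not its optimized threshold):

    ```python
    import numpy as np

    def neo_detect(x, c=8.0):
        """Nonlinear energy operator: psi[n] = x[n]^2 - x[n-1]*x[n+1].
        The detection threshold is a multiple of the mean NEO output;
        c is a tuning constant assumed here for illustration."""
        psi = np.zeros_like(x)
        psi[1:-1] = x[1:-1] ** 2 - x[:-2] * x[2:]
        threshold = c * psi.mean()
        return np.flatnonzero(psi > threshold), psi, threshold

    # Flat baseline with one sharp "spike" at sample 50
    x = np.zeros(100)
    x[49:52] = [1.0, 4.0, 1.0]
    idx, psi, thr = neo_detect(x)
    print(idx)   # the spike sample exceeds the NEO threshold
    ```

    NEO responds to signals that are simultaneously large in amplitude and frequency, which is why a single multiply-free comparison against `c * mean(psi)` maps well to hardware.
    
    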

  18. Dose measurement based on threshold shift in MOSFET arrays in commercial SRAMS

    NASA Technical Reports Server (NTRS)

    Scheick, L. Z.; Swift, G.

    2002-01-01

    A new method using an array of MOS transistors is described for measuring the dose absorbed from ionizing radiation. Using the array of MOSFETs in an SRAM, the number of MOS cells that change state is measured directly as a function of the bias applied to the SRAM. Since the input and output of an SRAM used as a dosimeter are completely digital, the dose measurement is easily accessible to a remote processing system.

  19. A threshold selection method based on edge preserving

    NASA Astrophysics Data System (ADS)

    Lou, Liantang; Dan, Wei; Chen, Jiaqi

    2015-12-01

    A method of automatic threshold selection for image segmentation is presented. An optimal threshold is selected so as to preserve image edges during segmentation. The shortcoming of Otsu's method based on gray-level histograms is analyzed. The edge energy function of a bivariate continuous function is expressed as a line integral, while the edge energy function of an image is approximated by discretizing the integral. An optimal thresholding method that maximizes the edge energy function is given. Several experimental results are also presented for comparison with Otsu's method.
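    For reference, Otsu's histogram-based method, the baseline this entry analyzes and compares against, can be written compactly (textbook formulation, not the paper's code):

    ```python
    import numpy as np

    def otsu_threshold(image, bins=256):
        """Otsu's global threshold: pick the gray level that maximizes
        the between-class variance of the histogram."""
        hist, edges = np.histogram(image, bins=bins)
        p = hist / hist.sum()
        centers = (edges[:-1] + edges[1:]) / 2
        w0 = np.cumsum(p)                # probability of class 0 at each cut
        mu = np.cumsum(p * centers)      # cumulative weighted mean
        mu_t = mu[-1]                    # overall mean
        sigma_b = np.zeros(bins)
        valid = (w0 > 1e-12) & (w0 < 1 - 1e-12)   # avoid empty-class division
        sigma_b[valid] = (mu_t * w0[valid] - mu[valid]) ** 2 \
            / (w0[valid] * (1 - w0[valid]))
        return centers[np.argmax(sigma_b)]

    # Bimodal test "image": two well-separated Gaussian modes
    rng = np.random.default_rng(0)
    img = np.concatenate([rng.normal(50, 5, 5000), rng.normal(200, 5, 5000)])
    t = otsu_threshold(img)
    print(t)   # lands in the empty region between the two modes
    ```

    Otsu's criterion depends only on the histogram; the entry's point is that it ignores spatial structure such as edges, which its edge-energy criterion reintroduces.
    
    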

  20. Evaluation of nuclear chromatin using grayscale intensity and thresholded percentage area in liquid-based cervical cytology.

    PubMed

    Lee, Hyekyung; Han, Myungein; Yoo, Taejo; Jung, Chanho; Son, Hyun-Jin; Cho, Migyung

    2018-05-01

    Development of computerized image analysis techniques has opened up the possibility for the quantitative analysis of nuclear chromatin in pathology. We hypothesized that the features extracted from digital images could be used to determine specific cytomorphological findings for nuclear chromatin that may be applicable for establishing a medical diagnosis. Three parameters were evaluated from nuclear chromatin images obtained from the liquid-based cervical cytology samples of patients with biopsy-proven high-grade squamous intraepithelial lesion (HGSIL), and compared between non-neoplastic squamous epithelia and dysplastic epithelia groups: (1) standard deviation (SD) of the grayscale intensity; (2) difference between the maximum and minimum grayscale intensity (M-M); and (3) thresholded area percentage. Each parameter was evaluated at the mean, mean-1SD, and mean-2SD thresholding intensity levels. Between the mean and mean-1SD levels, the thresholded nuclear chromatin pattern was most similar to the chromatin granularity of the unthresholded grayscale images. The SD of the gray intensity and the thresholded area percentage differed significantly between the non-neoplastic squamous epithelia and dysplastic epithelia of HGSIL images at all three thresholding intensity levels (mean, mean-1SD, and mean-2SD). However, the M-M significantly differed between the two sample types for only two of the thresholding intensity levels (mean-1SD and mean-2SD). The digital parameters SD and M-M of the grayscale intensity, along with the thresholded area percentage could be useful in automated cytological evaluations. Further studies are needed to identify more valuable parameters for clinical application. © 2018 Wiley Periodicals, Inc.
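    The three parameters can be computed directly from a grayscale nucleus image; a sketch at the mean-1SD thresholding level, one of the three levels used (the toy array and function name are ours):

    ```python
    import numpy as np

    def chromatin_params(gray):
        """SD of grayscale intensity, max-minus-min (M-M), and the
        percentage of pixels at or below the mean-1SD thresholding
        level (illustrative computation of the abstract's parameters)."""
        sd = gray.std()
        m_m = gray.max() - gray.min()
        level = gray.mean() - sd                 # mean-1SD intensity level
        pct_area = 100.0 * np.mean(gray <= level)  # thresholded area percentage
        return sd, m_m, pct_area

    # Toy "nucleus": bright background with two dark chromatin pixels
    gray = np.array([[200, 200, 200, 50],
                     [200, 200, 200, 50]], float)
    sd, m_m, pct = chromatin_params(gray)
    print(round(m_m), round(pct, 1))   # 150 25.0
    ```
    
    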

  1. Global gray-level thresholding based on object size.

    PubMed

    Ranefall, Petter; Wählby, Carolina

    2016-04-01

    In this article, we propose a fast and robust global gray-level thresholding method based on object size, where the selection of threshold level is based on recall and maximum precision with regard to objects within a given size interval. The method relies on the component tree representation, which can be computed in quasi-linear time. Feature-based segmentation is especially suitable for biomedical microscopy applications where objects often vary in number, but have limited variation in size. We show that for real images of cell nuclei and synthetic data sets mimicking fluorescent spots the proposed method is more robust than all standard global thresholding methods available for microscopy applications in ImageJ and CellProfiler. The proposed method, provided as ImageJ and CellProfiler plugins, is simple to use and the only required input is an interval of the expected object sizes. © 2016 International Society for Advancement of Cytometry.

  2. Aspirin and non-steroidal anti-inflammatory drugs use reduce gastric cancer risk: A dose-response meta-analysis.

    PubMed

    Huang, Xuan-Zhang; Chen, You; Wu, Jian; Zhang, Xi; Wu, Cong-Cong; Zhang, Chao-Ying; Sun, Shuang-Shuang; Chen, Wen-Jun

    2017-01-17

    The association between non-steroidal anti-inflammatory drugs (NSAIDs) and gastric cancer (GC) risk is controversial. The aim of this study is to evaluate the chemopreventive effect of NSAIDs on GC. A literature search was performed for relevant studies using the PubMed and Embase databases (up to March 2016). Risk ratios (RRs) and 95% confidence intervals (CIs) were used as the effect measures. Dose-response and subgroup analyses were also performed. Twenty-four studies were included. Our results indicated that NSAIDs could reduce GC risk (any NSAID: RR=0.78, 95%CI=0.72-0.85; aspirin: RR=0.70, 95%CI=0.62-0.80; non-aspirin NSAIDs: RR=0.86, 95%CI=0.80-0.94), especially non-cardia GC risk. Moreover, the dose-response analysis indicated that the risk of GC decreased by 11% and 5% per 2-year increment of any NSAID and aspirin use, respectively. There were nonlinear relationships between the frequency of any NSAID use or aspirin use and GC risk (P for non-linearity<0.01), with a threshold effect at 5 times/week. A monotonically decreasing trend was observed only for frequencies of less than 5 times/week. Our results indicate that NSAID use is inversely associated with GC risk, especially non-cardia GC risk. NSAID use may become a feasible approach to preventing GC.

  3. Reliability of TMS phosphene threshold estimation: Toward a standardized protocol.

    PubMed

    Mazzi, Chiara; Savazzi, Silvia; Abrahamyan, Arman; Ruzzoli, Manuela

    Phosphenes induced by transcranial magnetic stimulation (TMS) are a subjectively described visual phenomenon employed in basic and clinical research as an index of the excitability of retinotopically organized areas in the brain. Phosphene threshold (PT) estimation is a preliminary step in many TMS experiments in visual cognition for setting the appropriate level of TMS doses; however, the lack of a direct comparison of the available methods for PT estimation leaves unresolved how reliable those methods are for setting TMS doses. The present work aims to fill this gap. We compared the most common methods for PT calculation, namely the Method of Constant Stimuli (MOCS), the Modified Binary Search (MOBS) and the Rapid Estimation of Phosphene Threshold (REPT). In two experiments we tested the reliability of PT estimation under each of the three methods, considering the day of administration, participants' expertise in phosphene perception and the sensitivity of each method to the initial values used for the threshold calculation. We found that MOCS and REPT have comparable reliability when estimating phosphene thresholds, while MOBS estimations appear less stable. Based on our results, researchers and clinicians can estimate phosphene thresholds with MOCS or REPT equally reliably, depending on their specific investigation goals. We suggest several important factors for consideration when calculating phosphene thresholds and describe strategies to adopt in experimental procedures. Copyright © 2017 Elsevier Inc. All rights reserved.

  4. Comparative analysis of risk-based cleanup levels and associated remediation costs using linearized multistage model (cancer slope factor) vs. threshold approach (reference dose) for three chlorinated alkenes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lawton, L.J.; Mihalich, J.P.

    1995-12-31

    The chlorinated alkenes 1,1-dichloroethene (1,1-DCE), tetrachloroethene (PCE), and trichloroethene (TCE) are common environmental contaminants found in soil and groundwater at hazardous waste sites. Recent assessment of data from epidemiology and mechanistic studies indicates that although exposure to 1,1-DCE, PCE, and TCE causes tumor formation in rodents, it is unlikely that these chemicals are carcinogenic to humans. Nevertheless, many state and federal agencies continue to regulate these compounds as carcinogens through the use of the linearized multistage model and resulting cancer slope factor (CSF). The available data indicate that 1,1-DCE, PCE, and TCE should be assessed using a threshold (i.e., reference dose [RfD]) approach rather than a CSF. This paper summarizes the available metabolic, toxicologic, and epidemiologic data that question the use of the linearized multistage model (and CSF) for extrapolation from rodents to humans. A comparative analysis of potential risk-based cleanup goals (RBGs) for these three compounds in soil is presented for a hazardous waste site. Goals were calculated using the USEPA CSFs and using a threshold (i.e., RfD) approach. Costs associated with remediation activities required to meet each set of these cleanup goals are presented and compared.

  5. Dose and dose rate extrapolation factors for malignant and non-malignant health endpoints after exposure to gamma and neutron radiation.

    PubMed

    Tran, Van; Little, Mark P

    2017-11-01

    Murine experiments were conducted at the JANUS reactor in Argonne National Laboratory from 1970 to 1992 to study the effect of acute and protracted radiation dose from gamma rays and fission neutron whole body exposure. The present study reports the reanalysis of the JANUS data on 36,718 mice, of which 16,973 mice were irradiated with neutrons, 13,638 were irradiated with gamma rays, and 6107 were controls. Mice were mostly Mus musculus, but one experiment used Peromyscus leucopus. For both types of radiation exposure, a Cox proportional hazards model was used, with age as timescale, stratifying on sex and experiment. The optimal model was one with linear and quadratic terms in cumulative lagged dose, with adjustments to both linear and quadratic dose terms for low-dose rate irradiation (<5 mGy/h) and with adjustments to the dose for age at exposure and sex. After gamma ray exposure there is significant non-linearity (generally with upward curvature) for all tumours, lymphoreticular, respiratory, connective tissue and gastrointestinal tumours, and also for all non-tumour, other non-tumour, non-malignant pulmonary and non-malignant renal diseases (p < 0.001). Associated with this, the low-dose extrapolation factor, measuring the overestimation in low-dose risk resulting from linear extrapolation, is significantly elevated for lymphoreticular tumours, 1.16 (95% CI 1.06, 1.31), and elevated also for a number of non-malignant endpoints, specifically all non-tumour diseases, 1.63 (95% CI 1.43, 2.00), non-malignant pulmonary disease, 1.70 (95% CI 1.17, 2.76) and other non-tumour diseases, 1.47 (95% CI 1.29, 1.82). However, for a rather larger group of malignant endpoints the low-dose extrapolation factor is significantly less than 1 (implying downward curvature), with central estimates generally ranging from 0.2 to 0.8, in particular for tumours of the respiratory system, vasculature, ovary, kidney/urinary bladder and testis. For neutron exposure most endpoints, malignant

  6. Evaluation of HIFU-induced lesion region using temperature threshold and equivalent thermal dose methods

    NASA Astrophysics Data System (ADS)

    Chang, Shihui; Xue, Fanfan; Zhou, Wenzheng; Zhang, Ji; Jian, Xiqi

    2017-03-01

    Usually, numerical simulation is used to predict the acoustic field and temperature distribution of high intensity focused ultrasound (HIFU). In this paper, the simulated lesion volumes obtained with a temperature threshold (TRT) of 60 °C and an equivalent thermal dose (ETD) of 240 min were compared with experimental results obtained from in vitro animal tissue experiments. In the simulation, the model was established according to the in vitro tissue experiment, and the Finite Difference Time Domain (FDTD) method was used to calculate the acoustic field and temperature distribution in bovine liver via the Westervelt formula and the Pennes bio-heat transfer equation, taking the non-linear characteristics of the ultrasound into account. In the experiment, fresh bovine liver was exposed for 8 s, 10 s, or 12 s under different power conditions (150 W, 170 W, 190 W, 210 W), and each exposure was repeated 6 times at the same dose. After the exposures, the liver was sliced and photographed every 0.2 mm, and the area of the lesion region in every photo was calculated. Each area was then multiplied by 0.2 mm, and the products were summed to approximate the volume of the lesion region. The comparison shows that the lesion volume calculated with TRT 60 °C in simulation was much closer to the lesion volume obtained in experiment; the volume of the region above 60 °C was larger than the experimental result, but the deviation did not exceed 10%. The volume of the lesion region calculated with ETD 240 min was larger than that calculated with TRT 60 °C in simulation, with volume deviations ranging from 4.9% to 23.7%.
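The slice-summation volume estimate described above (lesion area per 0.2 mm slice, summed) amounts to a rectangular rule and is simple enough to sketch directly; the function names are illustrative:

```python
def lesion_volume(slice_areas_mm2, slice_thickness_mm=0.2):
    """Approximate lesion volume by summing the per-slice lesion areas
    multiplied by the slice spacing (rectangular rule)."""
    return sum(slice_areas_mm2) * slice_thickness_mm

def volume_deviation(sim_mm3, exp_mm3):
    """Relative deviation of a simulated volume from experiment, in percent."""
    return abs(sim_mm3 - exp_mm3) / exp_mm3 * 100.0
```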

  7. Compressively sampled MR image reconstruction using generalized thresholding iterative algorithm

    NASA Astrophysics Data System (ADS)

    Elahi, Sana; kaleem, Muhammad; Omer, Hammad

    2018-01-01

    Compressed sensing (CS) is an emerging area of interest in Magnetic Resonance Imaging (MRI). CS is used for the reconstruction of images from a very limited number of samples in k-space. This significantly reduces the MRI data acquisition time. One important requirement for signal recovery in CS is the use of an appropriate non-linear reconstruction algorithm. It is a challenging task to choose a reconstruction algorithm that would accurately reconstruct the MR images from the under-sampled k-space data. Various algorithms have been used to solve the system of non-linear equations for better image quality and reconstruction speed in CS. In the recent past, the iterative soft thresholding algorithm (ISTA) has been introduced in CS-MRI. This algorithm directly cancels the incoherent artifacts produced by the undersampling in k-space. This paper introduces an improved iterative algorithm based on a p-thresholding technique for CS-MRI image reconstruction. The use of a p-thresholding function promotes sparsity in the image, which is a key factor for CS-based image reconstruction. The p-thresholding based iterative algorithm is a modification of ISTA and minimizes non-convex functions. It has been shown that the proposed p-thresholding iterative algorithm can be used effectively to recover a fully sampled image from the under-sampled data in MRI. The performance of the proposed method is verified using simulated and actual MRI data taken at St. Mary's Hospital, London. The quality of the reconstructed images is measured in terms of peak signal-to-noise ratio (PSNR), artifact power (AP), and structural similarity index measure (SSIM). The proposed approach shows improved performance when compared to other iterative algorithms based on log thresholding, soft thresholding and hard thresholding techniques at different reduction factors.
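As an illustration of the p-thresholding idea, here is one common generalized shrinkage (Chartrand-style p-shrinkage, which reduces to ISTA's soft thresholding at p = 1) inside a plain ISTA loop. This is a toy sketch under those assumptions, not the paper's implementation:

```python
import numpy as np

def p_shrink(x, lam, p):
    """Generalized p-shrinkage: sign(x) * max(|x| - lam * |x|**(p-1), 0).
    p = 1 recovers classical soft thresholding; p < 1 shrinks large
    coefficients less, promoting sparsity more aggressively."""
    ax = np.abs(x)
    with np.errstate(divide='ignore', invalid='ignore'):
        t = np.where(ax > 0, ax - lam * ax ** (p - 1.0), 0.0)
    return np.sign(x) * np.maximum(t, 0.0)

def ista_p(A, y, lam=0.1, p=0.5, n_iter=100):
    """ISTA with p-thresholding: gradient step on ||y - Ax||^2,
    then shrinkage (toy sketch for a generic linear operator A)."""
    L = np.linalg.norm(A, 2) ** 2       # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = p_shrink(x + A.T @ (y - A @ x) / L, lam / L, p)
    return x
```

In CS-MRI the operator A would be an undersampled Fourier transform rather than the dense matrix used here.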

  8. Effect of Pulse Polarity on Thresholds and on Non-monotonic Loudness Growth in Cochlear Implant Users.

    PubMed

    Macherey, Olivier; Carlyon, Robert P; Chatron, Jacques; Roman, Stéphane

    2017-06-01

    Most cochlear implants (CIs) activate their electrodes non-simultaneously in order to eliminate electrical field interactions. However, the membrane of auditory nerve fibers needs time to return to its resting state, causing the probability of firing to a pulse to be affected by previous pulses. Here, we provide new evidence on the effect of pulse polarity and current level on these interactions. In experiment 1, detection thresholds and most comfortable levels (MCLs) were measured in CI users for 100-Hz pulse trains consisting of two consecutive biphasic pulses of the same or of opposite polarity. All combinations of polarities were studied: anodic-cathodic-anodic-cathodic (ACAC), CACA, ACCA, and CAAC. Thresholds were lower when the adjacent phases of the two pulses had the same polarity (ACCA and CAAC) than when they were different (ACAC and CACA). Some subjects showed a lower threshold for ACCA than for CAAC while others showed the opposite trend demonstrating that polarity sensitivity at threshold is genuine and subject- or electrode-dependent. In contrast, anodic (CAAC) pulses always showed a lower MCL than cathodic (ACCA) pulses, confirming previous reports. In experiments 2 and 3, the subjects compared the loudness of several pulse trains differing in current level separately for ACCA and CAAC. For 40 % of the electrodes tested, loudness grew non-monotonically as a function of current level for ACCA but never for CAAC. This finding may relate to a conduction block of the action potentials along the fibers induced by a strong hyperpolarization of their central processes. Further analysis showed that the electrodes showing a lower threshold for ACCA than for CAAC were more likely to yield a non-monotonic loudness growth. It is proposed that polarity sensitivity at threshold reflects the local neural health and that anodic asymmetric pulses should preferably be used to convey sound information while avoiding abnormal loudness percepts.

  9. Lentinan diminishes apoptotic bodies in the ileal crypts associated with S-1 administration.

    PubMed

    Suga, Yasuyo; Takehana, Kenji

    2017-09-01

    S-1 is an oral agent containing tegafur (a prodrug of 5-fluorouracil) that is used to treat various cancers, but adverse effects are frequent. Two pilot clinical studies have suggested that lentinan (LNT; β-1,3-glucan) may reduce the incidence of adverse effects caused by S-1 therapy. In this study, we established a murine model for assessment of gastrointestinal toxicity associated with S-1 and studied the effect of LNT. S-1 was administered orally to BALB/c mice at the effective dose (8.3 mg/kg, as tegafur equivalent) once daily (5 days per week) for 3 weeks. Stool consistency and intestinal specimens were examined. We investigated the effect of combined intravenous administration of LNT at 0.1 mg, which is an effective dose in murine tumor models. We also investigated the effect of a single administration of S-1. During long-term administration of S-1, some mice had loose stools and an increase in apoptotic bodies was observed in the ileal crypts. An increase in apoptotic bodies was also noted after a single administration of S-1 (15 mg/kg). Prior or concomitant administration of LNT inhibited the increase in apoptotic bodies in both settings. Administration of LNT also increased the accumulation of CD11b+ TIM-4+ cells in the ileum, while depletion of these cells by liposomal clodronate diminished the inhibitory effect of LNT on S-1 toxicity. Combined administration of LNT with S-1 led to a decrease in apoptotic bodies in the ileal crypts, possibly because LNT promoted phagocytosis of damaged cells by CD11b+ TIM-4+ cells. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. 48 CFR 1352.213-71 - Instructions for submitting quotations under the simplified acquisition threshold-non-commercial.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... submitting quotations under the simplified acquisition threshold-non-commercial. 1352.213-71 Section 1352.213... quotations under the simplified acquisition threshold—non-commercial. As prescribed in 48 CFR 1313.302-1-70... Threshold—Non-Commercial (APR 2010) (a) North American Industry Classification System (NAICS) code and small...

  11. Cochlear implant characteristics and speech perception skills of adolescents with long-term device use.

    PubMed

    Davidson, Lisa S; Geers, Ann E; Brenner, Christine

    2010-10-01

    Updated cochlear implant technology and optimized fitting can have a substantial impact on speech perception. The effects of upgrades in processor technology and aided thresholds on word recognition at soft input levels and sentence recognition in noise were examined. We hypothesized that updated speech processors and lower aided thresholds would allow improved recognition of soft speech without compromising performance in noise. 109 teenagers who had used a Nucleus 22 cochlear implant since preschool were tested with their current speech processor(s) (101 unilateral and 8 bilateral): 13 used the Spectra, 22 the ESPrit 22, 61 the ESPrit 3G, and 13 the Freedom. The Lexical Neighborhood Test (LNT) was administered at 70 and 50 dB SPL, and the Bamford Kowal Bench (BKB) sentences were administered in quiet and in noise. Aided thresholds were obtained for frequency-modulated tones from 250 to 4,000 Hz. Results were analyzed using repeated measures analysis of variance. Aided thresholds for the Freedom/3G group were significantly lower (better) than for the Spectra/Sprint group. LNT scores at 50 dB were significantly higher for the Freedom/3G group. No significant differences between the 2 groups were found for the LNT at 70 dB or for sentences in quiet or noise. Adolescents using updated processors that allowed for aided detection thresholds of 30 dB HL or better performed best at soft levels. The BKB-in-noise results suggest that greater access to soft speech does not compromise listening in noise.

  12. T Cell Activation Thresholds are Affected by Gravitational

    NASA Technical Reports Server (NTRS)

    Adams, Charley; Gonzalez, M.; Nelman-Gonzalez, M.

    1999-01-01

    T cells stimulated in space flight by various mitogenic signals show a dramatic reduction in proliferation and expression of early activation markers. Similar results are also obtained in a ground-based model of microgravity, clinorotation, which provides a vector-averaged reduction of the apparent gravity on cells without significant shear force. Here we demonstrate that T cell inhibition is due to an increase in the required threshold for activation. Dose response curves indicate that cells activated during clinorotation require higher stimulation to achieve the same level of activation, as measured by CD69 expression, interleukin-2 receptor expression, and DNA synthesis. The amount of stimulation necessary for 50% activation is 5-fold higher in the clinostat relative to static conditions. Correlation of TCR internalization with activation also exhibits a dramatic right shift in clinorotation, demonstrating unequivocally that signal transduction mechanisms independent of TCR triggering account for the increased activation threshold. Previous results from space flight experiments are consistent with the dose response curves obtained for clinorotation. Activation thresholds are important aspects of T cell memory, autoimmunity and tolerance. Clinorotation is a useful, noninvasive tool for the study of the cellular and biochemical events regulating T cell activation thresholds and the effects of gravitational forces on these systems.

  13. Thresholds in chemical respiratory sensitisation.

    PubMed

    Cochrane, Stella A; Arts, Josje H E; Ehnes, Colin; Hindle, Stuart; Hollnagel, Heli M; Poole, Alan; Suto, Hidenori; Kimber, Ian

    2015-07-03

    acquisition of sensitisation to chemical respiratory allergens is a dose-related phenomenon, and that thresholds exist, it is frequently difficult to define accurate numerical values for threshold exposure levels. Nevertheless, based on occupational exposure data it may sometimes be possible to derive levels of exposure in the workplace which are safe. An additional observation is the current lack of suitable experimental methods for both routine hazard characterisation and the measurement of thresholds; such methods are still some way off. Given the current trajectory of toxicology, and the move towards the use of non-animal (in vitro and/or in silico) methods, there is a need to consider the development of alternative approaches for the identification and characterisation of respiratory sensitisation hazards, and for risk assessment. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  14. In vivo dose verification method in catheter based high dose rate brachytherapy.

    PubMed

    Jaselskė, Evelina; Adlienė, Diana; Rudžianskas, Viktoras; Urbonavičius, Benas Gabrielis; Inčiūra, Arturas

    2017-12-01

    In vivo dosimetry is a powerful tool for dose verification in radiotherapy. Its application in high dose rate (HDR) brachytherapy is usually limited to the estimation of gross errors, due to the inability of the dosimetry system/method to record non-uniform dose distribution in steep dose gradient fields close to the radioactive source. In vivo dose verification in interstitial catheter-based HDR brachytherapy is crucial, since the treatment is performed by inserting the radioactive source at certain positions within catheters that are pre-implanted into the tumour. We propose an in vivo dose verification method for this type of brachytherapy treatment, based on the comparison between experimentally measured and theoretical dose values calculated at well-defined locations corresponding to dosemeter positions in the catheter. Dose measurements were performed using TLD 100-H rods (6 mm long, 1 mm diameter) inserted in certain sequences into an additionally pre-implanted dosimetry catheter. The adjustment of dosemeter positioning in the catheter was performed using reconstructed CT scans of the patient with pre-implanted catheters. Doses to three Head & Neck and one Breast cancer patient were measured during several randomly selected treatment fractions. It was found that the average experimental dose error varied from 4.02% to 12.93% during independent in vivo dosimetry control measurements for the selected Head & Neck cancer patients, and from 7.17% to 8.63% for the Breast cancer patient. The average experimental dose error was below the AAPM recommended margin of 20% and did not exceed the measurement uncertainty of 17.87% estimated for this type of dosemeter. A tendency of slightly increasing average dose error was observed in every following treatment fraction of the same patient. It was linked to changes of the theoretically estimated dosemeter positions due to possible patient organ movement between different treatment fractions, since catheter reconstruction was

  15. Performance of dose calculation algorithms from three generations in lung SBRT: comparison with full Monte Carlo‐based dose distributions

    PubMed Central

    Kapanen, Mika K.; Hyödynmaa, Simo J.; Wigren, Tuija K.; Pitkänen, Maunu A.

    2014-01-01

    achieved, but 2%/2 mm threshold criteria showed larger discrepancies. The TPS algorithm comparison results showed large dose discrepancies in the PTV mean dose (D50%), nearly 60%, for the PBC algorithm, and differences of nearly 20% for the AAA, occurring also in the small PTV size range. This work suggests the application of independent plan verification, when the AAA or the AXB algorithm are utilized in lung SBRT having PTVs smaller than 20‐25 cc. The calculated data from this study can be used in converting the SBRT protocols based on type ‘a’ and/or type ‘b’ algorithms for the most recent generation type ‘c’ algorithms, such as the AXB algorithm. PACS numbers: 87.55.‐x, 87.55.D‐, 87.55.K‐, 87.55.kd, 87.55.Qr PMID:24710454

  16. Constructing financial network based on PMFG and threshold method

    NASA Astrophysics Data System (ADS)

    Nie, Chun-Xiao; Song, Fu-Tie

    2018-04-01

    Based on planar maximally filtered graph (PMFG) and threshold method, we introduced a correlation-based network named PMFG-based threshold network (PTN). We studied the community structure of PTN and applied ISOMAP algorithm to represent PTN in low-dimensional Euclidean space. The results show that the community corresponds well to the cluster in the Euclidean space. Further, we studied the dynamics of the community structure and constructed the normalized mutual information (NMI) matrix. Based on the real data in the market, we found that the volatility of the market can lead to dramatic changes in the community structure, and the structure is more stable during the financial crisis.
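The threshold-method half of the PTN construction (keep an edge only when the correlation between two return series exceeds a cutoff) can be sketched as follows. The PMFG planarity filtering, ISOMAP embedding, and NMI dynamics are omitted, and the function name and cutoff value are illustrative:

```python
import numpy as np

def threshold_network(returns, theta=0.5):
    """Boolean adjacency matrix of a correlation threshold network:
    assets are rows of `returns`; an undirected edge links two assets
    whose return correlation exceeds theta in absolute value."""
    c = np.corrcoef(returns)        # rows are treated as variables
    a = np.abs(c) >= theta
    np.fill_diagonal(a, False)      # no self-loops
    return a
```

Community detection or NMI comparisons would then operate on this adjacency matrix (or on its PMFG-filtered counterpart).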

  17. A Critical, Nonlinear Threshold Dictates Bacterial Invasion and Initial Kinetics During Influenza

    NASA Astrophysics Data System (ADS)

    Smith, Amber M.; Smith, Amanda P.

    2016-12-01

    Secondary bacterial infections increase morbidity and mortality of influenza A virus (IAV) infections. Bacteria are able to invade due to virus-induced depletion of alveolar macrophages (AMs), but this is not the only contributing factor. By analyzing a kinetic model, we uncovered a nonlinear initial dose threshold that is dependent on the amount of virus-induced AM depletion. The threshold separates the growth and clearance phenotypes such that bacteria decline for dose-AM depletion combinations below the threshold, stay constant near the threshold, and increase above the threshold. In addition, the distance from the threshold correlates to the growth rate. Because AM depletion changes throughout an IAV infection, the dose requirement for bacterial invasion also changes accordingly. Using the threshold, we found that the dose requirement drops dramatically during the first 7 days of IAV infection. We then validated these analytical predictions by infecting mice with doses below or above the predicted threshold over the course of IAV infection. These results identify the nonlinear way in which two independent factors work together to support successful post-influenza bacterial invasion. They provide insight into coinfection timing, the heterogeneity in outcome, the probability of acquiring a coinfection, and the use of new therapeutic strategies to combat viral-bacterial coinfections.

  18. A Critical, Nonlinear Threshold Dictates Bacterial Invasion and Initial Kinetics During Influenza.

    PubMed

    Smith, Amber M; Smith, Amanda P

    2016-12-15

    Secondary bacterial infections increase morbidity and mortality of influenza A virus (IAV) infections. Bacteria are able to invade due to virus-induced depletion of alveolar macrophages (AMs), but this is not the only contributing factor. By analyzing a kinetic model, we uncovered a nonlinear initial dose threshold that is dependent on the amount of virus-induced AM depletion. The threshold separates the growth and clearance phenotypes such that bacteria decline for dose-AM depletion combinations below the threshold, stay constant near the threshold, and increase above the threshold. In addition, the distance from the threshold correlates to the growth rate. Because AM depletion changes throughout an IAV infection, the dose requirement for bacterial invasion also changes accordingly. Using the threshold, we found that the dose requirement drops dramatically during the first 7 days of IAV infection. We then validated these analytical predictions by infecting mice with doses below or above the predicted threshold over the course of IAV infection. These results identify the nonlinear way in which two independent factors work together to support successful post-influenza bacterial invasion. They provide insight into coinfection timing, the heterogeneity in outcome, the probability of acquiring a coinfection, and the use of new therapeutic strategies to combat viral-bacterial coinfections.

  19. Threshold secret sharing scheme based on phase-shifting interferometry.

    PubMed

    Deng, Xiaopeng; Shi, Zhengang; Wen, Wei

    2016-11-01

    We propose a new method for secret image sharing with the (3,N) threshold scheme based on phase-shifting interferometry. The secret image, which is multiplied with an encryption key in advance, is first encrypted by using Fourier transformation. Then, the encoded image is shared into N shadow images based on the recording principle of phase-shifting interferometry. Based on the reconstruction principle of phase-shifting interferometry, any three or more shadow images can retrieve the secret image, while any two or fewer shadow images cannot obtain any information of the secret image. Thus, a (3,N) threshold secret sharing scheme can be implemented. Compared with our previously reported method, the algorithm of this paper is suited for not only a binary image but also a gray-scale image. Moreover, the proposed algorithm can obtain a larger threshold value t. Simulation results are presented to demonstrate the feasibility of the proposed method.
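The optical scheme itself cannot be reproduced in a few lines, but the (3,N) threshold property it implements is the same one that Shamir's polynomial secret sharing provides: any three shares reconstruct the secret, fewer reveal nothing. A minimal finite-field sketch of that property (explicitly not the paper's interferometric method; the prime modulus is an arbitrary choice) is:

```python
import random

P = 2 ** 31 - 1  # Mersenne prime modulus for the finite field

def share(secret, n, k=3):
    """Shamir (k, n) threshold sharing: evaluate a random degree-(k-1)
    polynomial with constant term `secret` at x = 1..n."""
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    return [(x, sum(c * pow(x, j, P) for j, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the constant term."""
    s = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if j != i:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        s = (s + yi * num * pow(den, P - 2, P)) % P  # Fermat inverse of den
    return s
```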

  20. Non-human primate skull effects on the cavitation detection threshold of FUS-induced blood-brain barrier opening

    NASA Astrophysics Data System (ADS)

    Wu, Shih-Ying; Tung, Yao-Sheng; Marquet, Fabrice; Chen, Cherry C.; Konofagou, Elisa E.

    2012-11-01

    Microbubble (MB)-assisted focused ultrasound is a promising technique for delivering drugs to the brain by noninvasively and transiently opening the blood-brain barrier (BBB), and monitoring BBB opening using passive cavitation detection (PCD) is critical in detecting its occurrence and extent, as well as assessing its mechanism. One of the main obstacles in achieving those objectives in large animals is transcranial attenuation. To study these effects, the cavitation response through the in vitro non-human primate (NHP) skull was investigated. In-house manufactured lipid-shelled MBs (diameter: 4-5 µm) were injected into a 4-mm channel of a phantom below a degassed monkey skull. A hydrophone confocally aligned with the FUS transducer served as PCD during sonication (frequency: 0.50 MHz, peak rarefactional pressures: 0.05-0.60 MPa, pulse length: 100 cycles, PRF: 10 Hz, duration: 2 s) for four cases: water without skull, water with skull, MB without skull and MB with skull. A 5.1-MHz linear-array transducer was also used to monitor the MB disruption. The frequency spectra, spectrograms, stable cavitation dose (SCD) and inertial cavitation dose (ICD) were quantified. Results showed that the onset of stable and inertial cavitation in the experiments occurred at 50 kPa and was detectable through the NHP skull, since both detection thresholds remained unchanged compared to the non-skull case; however, the SCD and ICD acquired transcranially may not adequately represent the true extent of stable and inertial cavitation, due to the skull attenuation.

  1. Non-Targeted Effects and the Dose Response for Heavy Ion Tumorigenesis

    NASA Technical Reports Server (NTRS)

    Chappelli, Lori J.; Cucinotta, Francis A.

    2010-01-01

    BACKGROUND: There is no human epidemiology data available to estimate the heavy ion cancer risks experienced by astronauts in space. Studies of tumor induction in mice are a necessary step to estimate risks to astronauts. Previous experimental data can be better utilized to model the dose response for heavy ion tumorigenesis and plan future low dose studies. DOSE RESPONSE MODELS: The Harderian gland data of Alpen et al. [1-3] was re-analyzed [4] using non-linear least squares regression. The data set measured the induction of Harderian gland tumors in mice by high-energy protons, helium, neon, iron, niobium and lanthanum with LETs ranging from 0.4 to 950 keV/micron. We were able to strengthen the individual ion models by combining data for all ions into a model that relates both radiation dose and ion LET to tumor prevalence. We compared models based on Targeted Effects (TE) to one motivated by Non-Targeted Effects (NTE) that included a bystander term increasing tumor induction non-linearly at low doses. When comparing fitted models to the experimental data, we considered the adjusted R², the Akaike Information Criterion (AIC), and the Bayesian Information Criterion (BIC) to test goodness of fit. In the adjusted R² test, the model with the highest R² value provides the better fit to the available data. In the AIC and BIC tests, the model with the smaller summary value provides the better fit. The non-linear NTE models fit the combined data better than the TE models, which are linear at low doses. We evaluated the differences in the relative biological effectiveness (RBE) and found the NTE model provides a higher RBE at low dose compared to the TE model. POWER ANALYSIS: The final NTE model estimates were used to simulate example data to consider the design of new experiments to detect NTE at low dose for validation. Power and sample sizes were calculated for a variety of radiation qualities, including some not considered in the Harderian gland data.
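The AIC/BIC comparison described above follows the standard least-squares forms of the criteria; a minimal sketch (n data points, k fitted parameters, RSS from the fit; lower values indicate the better model) is:

```python
import math

def aic(rss, n, k):
    """Akaike Information Criterion for a least-squares fit; lower is better."""
    return n * math.log(rss / n) + 2 * k

def bic(rss, n, k):
    """Bayesian Information Criterion; penalizes extra parameters more
    strongly than AIC once n exceeds about 8."""
    return n * math.log(rss / n) + k * math.log(n)
```

For example, a 3-parameter NTE-style fit that halves the residual sum of squares of a 2-parameter TE-style fit over 20 dose points wins under both criteria, despite the extra parameter.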

  2. Microscopy mineral image enhancement based on improved adaptive threshold in nonsubsampled shearlet transform domain

    NASA Astrophysics Data System (ADS)

    Li, Liangliang; Si, Yujuan; Jia, Zhenhong

    2018-03-01

    In this paper, a novel microscopy mineral image enhancement method based on adaptive threshold in non-subsampled shearlet transform (NSST) domain is proposed. First, the image is decomposed into one low-frequency sub-band and several high-frequency sub-bands. Second, the gamma correction is applied to process the low-frequency sub-band coefficients, and the improved adaptive threshold is adopted to suppress the noise of the high-frequency sub-bands coefficients. Third, the processed coefficients are reconstructed with the inverse NSST. Finally, the unsharp filter is used to enhance the details of the reconstructed image. Experimental results on various microscopy mineral images demonstrated that the proposed approach has a better enhancement effect in terms of objective metric and subjective metric.
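Two of the steps above, gamma correction of the low-frequency content and the final unsharp filtering, are standard operations that can be sketched directly. The shearlet decomposition and adaptive threshold denoising are omitted here, and a 3x3 box blur stands in for whatever smoothing produces the low-pass image:

```python
import numpy as np

def gamma_correct(img, gamma=0.7):
    """Gamma correction of an image scaled to [0, 1]; gamma < 1 brightens."""
    return np.clip(img, 0.0, 1.0) ** gamma

def box_blur(img):
    """3x3 box blur with edge padding (stand-in low-pass filter)."""
    p = np.pad(img, 1, mode='edge')
    h, w = img.shape
    return sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3)) / 9.0

def unsharp(img, amount=1.0):
    """Unsharp masking: boost detail by adding back (img - blurred)."""
    return img + amount * (img - box_blur(img))
```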

  3. Uncertainty in determining extreme precipitation thresholds

    NASA Astrophysics Data System (ADS)

    Liu, Bingjun; Chen, Junfan; Chen, Xiaohong; Lian, Yanqing; Wu, Lili

    2013-10-01

    Extreme precipitation events are rare and occur mostly on a relatively small, local scale, which makes it difficult to set thresholds for extreme precipitation in a large basin. Based on long-term daily precipitation data from 62 observation stations in the Pearl River Basin, this study assessed the applicability of the non-parametric, parametric, and detrended fluctuation analysis (DFA) methods in determining extreme precipitation thresholds (EPTs), and the certainty of the EPTs from each method. Analyses from this study show the non-parametric absolute critical value method is easy to use but unable to reflect spatial differences in rainfall distribution. The non-parametric percentile method can account for the spatial distribution of precipitation, but its threshold value is sensitive to the size of the rainfall data series and to the choice of percentile, which makes it difficult to determine reasonable threshold values for a large basin. The parametric method can provide the most apt description of extreme precipitation by fitting probability distribution functions to extreme precipitation data; however, the choice of probability distribution function, the goodness-of-fit tests, and the size of the rainfall data series can greatly affect the fitting accuracy. In contrast to the non-parametric and parametric methods, which cannot provide EPTs with certainty, the DFA method, although computationally involved, proved to be the most appropriate method, able to provide a unique set of EPTs for a large basin with uneven spatio-temporal precipitation distribution. The consistency of the spatial distribution of DFA-based thresholds with the annual average precipitation, the coefficient of variation (CV), and the coefficient of skewness (CS) of daily precipitation further supports the EPTs determined by the DFA method.
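
    The percentile method discussed above can be sketched as follows. The station records are synthetic and the 95th/99th wet-day percentiles are illustrative choices; the spread of thresholds across stations shows why a single absolute critical value cannot reflect spatial variability, and comparing the two percentiles shows the sensitivity the abstract criticizes.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    # Synthetic daily precipitation (mm) for three hypothetical stations, 30 years each
    stations = {f"S{i}": rng.gamma(shape=0.6, scale=8.0 + 4.0 * i, size=30 * 365)
                for i in range(3)}

    def percentile_ept(daily, q=95.0, wet_day=1.0):
        """Extreme-precipitation threshold: the q-th percentile of wet-day totals."""
        wet = daily[daily >= wet_day]
        return float(np.percentile(wet, q))

    for name, daily in stations.items():
        t95, t99 = percentile_ept(daily, 95), percentile_ept(daily, 99)
        print(f"{name}: 95th = {t95:.1f} mm, 99th = {t99:.1f} mm")
    ```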

  4. Re-visiting Trichuris trichiura intensity thresholds based on anemia during pregnancy.

    PubMed

    Gyorkos, Theresa W; Gilbert, Nicolas L; Larocque, Renée; Casapía, Martín; Montresor, Antonio

    2012-01-01

    The intensity categories, or thresholds, currently used for Trichuris trichiura (i.e. 1-999 epg (light), 1,000-9,999 epg (moderate), and ≥10,000 epg (heavy)) were developed in the 1980s, when little epidemiological data were available on dose-response relationships. This study was undertaken to determine a threshold for T. trichiura-associated anemia in pregnant women and to describe the implications of this threshold in terms of the need for primary prevention and chemotherapeutic interventions. In Iquitos, Peru, 935 pregnant women were tested for T. trichiura infection in their second trimester of pregnancy; were given daily iron supplements throughout their pregnancy; and had their blood hemoglobin levels measured in their third trimester of pregnancy. Women in the highest two T. trichiura intensity quintiles (601-1632 epg and ≥ 1633 epg) had significantly lower mean hemoglobin concentrations than the lowest quintile (0-24 epg). They also had a statistically significantly higher risk of anemia, with adjusted odds ratios of 1.67 (95% CI: 1.02, 2.62) and 1.73 (95% CI: 1.09, 2.74), respectively. This analysis provides support for categorizing a T. trichiura infection ≥ 1,000 epg as 'moderate', as currently defined by the World Health Organization. Because this 'moderate' level of T. trichiura infection was found to be a significant risk factor for anemia in pregnant women, the intensity of Trichuris infection deemed to cause or aggravate anemia should no longer be restricted to the 'heavy' intensity category. It should now include both 'heavy' and 'moderate' intensities of Trichuris infection. Evidence-based deworming strategies targeting pregnant women or populations where anemia is of concern should be updated accordingly.

  5. Determining the Critical Dose Threshold of Electron-Induced Electron Yield for Minimally Charged Highly Insulating Materials

    NASA Astrophysics Data System (ADS)

    Hoffmann, Ryan; Dennison, J. R.; Abbott, Jonathan

    2006-03-01

    When incident energetic electrons interact with a material, they excite electrons within the material to escape energies. The electron emission is quantified as the ratio of emitted electrons to incident particle flux, termed electron yield. Measuring the electron yield of insulators is difficult due to dynamic surface charge accumulation, which directly affects landing energies and the potential barrier that emitted electrons must overcome. Our recent measurements of highly insulating materials have demonstrated significant changes in total yield curves and yield decay curves for very small electron doses equivalent to a trapped charge density of <10^10 electrons/cm^3. The Chung-Everhart theory provides a basic model for the behavior of the electron emission spectra, which we relate to yield decay curves as charge is allowed to accumulate. Yield measurements as a function of dose for polyimide (Kapton™) and microcrystalline SiO2 will be presented. We use our data and model to address the question of whether there is a minimal dose threshold at which the accumulated charge no longer affects the yield.

  6. Network Motif Basis of Threshold Responses

    EPA Science Inventory

    There has been a long-running debate over the existence of thresholds for adverse effects. The difficulty stems from two fundamental challenges: (i) statistical analysis by itself cannot prove the existence of a threshold, i.e., a dose below which there is no effect; and (ii) the...

  7. Relieving dyspnoea by non-invasive ventilation decreases pain thresholds in amyotrophic lateral sclerosis.

    PubMed

    Dangers, Laurence; Laviolette, Louis; Georges, Marjolaine; Gonzalez-Bermejo, Jésus; Rivals, Isabelle; Similowski, Thomas; Morelot-Panzini, Capucine

    2017-03-01

    Dyspnoea is a threatening sensation of respiratory discomfort that presents many similarities with pain. Experimental dyspnoea in healthy subjects induces analgesia. This 'dyspnoea-pain counter-irritation' could, in reverse, imply that relieving dyspnoea in patients with chronic respiratory diseases would lower their pain thresholds. We first determined pressure pain thresholds in 25 healthy volunteers (22-31 years; 13 men; handheld algometer), during unloaded breathing (BASELINE) and during inspiratory threshold loading (ITL). Two levels of loading were used, adjusted to induce dyspnoea self-rated at 60% or 80% of a 10 cm visual analogue scale (ITL6 and ITL8). 18 patients with chronic respiratory failure due to amyotrophic lateral sclerosis (ALS) were then studied during unassisted breathing and after 30 and 60 min of non-invasive ventilation-NIV30 and NIV60-(same dyspnoea evaluation). In healthy volunteers, pressure pain thresholds increased significantly in the deltoid during ITL6 (p<0.05) and ITL8 (p<0.05) and in the trapezius during ITL8 (p<0.05), validating the use of pressure pain thresholds to study dyspnoea-pain counter-irritation. In patients with ALS, the pressure pain thresholds measured in the deltoid during unassisted breathing decreased by a median of 24.5%-33.0% of baseline during NIV30 and NIV60 (p<0.05). Relieving dyspnoea by NIV in patients with ALS having respiratory failure is associated with decreased pressure pain thresholds. Clinical implications have yet to be determined, but this observation suggests that patients with ALS could become more susceptible to pain after the institution of NIV, hence the need for reinforced attention towards potentially painful diagnostic and therapeutic interventions.

  8. Comparison of epicardial adipose tissue radiodensity threshold between contrast and non-contrast enhanced computed tomography scans: A cohort study of derivation and validation.

    PubMed

    Xu, Lingyu; Xu, Yuancheng; Coulden, Richard; Sonnex, Emer; Hrybouski, Stanislau; Paterson, Ian; Butler, Craig

    2018-05-11

    Epicardial adipose tissue (EAT) volume derived from contrast enhanced (CE) computed tomography (CT) scans is not well validated. We aim to establish a reliable threshold to accurately quantify EAT volume from CE datasets. We analyzed EAT volume on paired non-contrast (NC) and CE datasets from 25 patients to derive appropriate Hounsfield unit (HU) cutpoints to equalize the two EAT volume estimates. The gold standard threshold (-190 HU, -30 HU) was used to assess EAT volume on NC datasets. For CE datasets, EAT volumes were estimated using three previously reported thresholds: (-190 HU, -30 HU), (-190 HU, -15 HU), and (-175 HU, -15 HU), and were analyzed by semi-automated 3D fat analysis software. Subsequently, we applied a threshold correction to (-190 HU, -30 HU) based on the mean difference in radiodensity between NC and CE images (ΔEATrd = CE radiodensity - NC radiodensity). We then validated our findings on the EAT threshold in 21 additional patients with paired CT datasets. EAT volume from CE datasets using previously published thresholds consistently underestimated the NC-dataset standard EAT volume by 8.2%-19.1%. Using our corrected threshold (-190 HU, -3 HU) in CE datasets yielded statistically identical EAT volume to NC EAT volume in the validation cohort (186.1 ± 80.3 vs. 185.5 ± 80.1 cm3, Δ = 0.6 cm3, 0.3%, p = 0.374). Estimating EAT volume from contrast enhanced CT scans using a corrected threshold of (-190 HU, -3 HU) provided excellent agreement with EAT volume from non-contrast CT scans using a standard threshold of (-190 HU, -30 HU).
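
    The threshold-correction idea can be sketched on synthetic voxel data (a toy uniform-shift model, not the study's images): applying the standard fat window to a contrast-shifted volume undercounts fat voxels, and raising the upper bound by the mean NC-to-CE radiodensity difference, as the study does, largely restores the estimate.

    ```python
    import numpy as np

    def eat_volume_ml(hu, mask, voxel_ml, lo=-190.0, hi=-30.0):
        """EAT volume: voxels inside the pericardial mask whose HU fall within [lo, hi]."""
        return float(np.sum(mask & (hu >= lo) & (hu <= hi))) * voxel_ml

    rng = np.random.default_rng(1)
    mask = np.ones((40, 40, 40), dtype=bool)       # toy pericardial contour (all voxels)
    nc = rng.normal(-100.0, 60.0, mask.shape)      # synthetic non-contrast HU values
    ce = nc + 27.0                                 # contrast uniformly raises radiodensity

    v_nc = eat_volume_ml(nc, mask, 0.01)                  # gold standard window (-190, -30)
    v_ce_raw = eat_volume_ml(ce, mask, 0.01)              # same window on CE: underestimates
    delta = ce[mask].mean() - nc[mask].mean()             # ΔEATrd analogue (≈ +27 HU here)
    v_ce_corr = eat_volume_ml(ce, mask, 0.01, hi=-30.0 + delta)  # upper bound shifted
    print(f"NC {v_nc:.1f} ml, CE raw {v_ce_raw:.1f} ml, CE corrected {v_ce_corr:.1f} ml")
    ```

    Note that only the upper bound is shifted, mirroring the study's (-190 HU, -3 HU) window; the small residual mismatch comes from voxels near the unshifted lower bound.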

  9. Low doses of ionizing radiation to mammalian cells may rather control than cause DNA damage

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Feinendegen, L.E.; Bond, V.P.; Sondhaus, C.A.

    This report examines the origin of tissue effects that may follow from different cellular responses to low-dose irradiation, using published data. Two principal categories of cellular responses are considered. One response category relates to the probability of radiation-induced DNA damage. The other category consists of low-dose induced metabolic changes that induce mechanisms of DNA damage mitigation, which do not operate at high levels of exposure. Modeled in this way, tissue is treated as a complex adaptive system. The interaction of the various cellular responses results in a net tissue dose-effect relation that is likely to deviate from linearity in the low-dose region. This suggests that the LNT hypothesis should be reexamined. This paper aims at demonstrating tissue effects as an expression of cellular responses, both damaging and defensive, in relation to the energy deposited in cell mass, by use of microdosimetric concepts.

  10. Dual-threshold segmentation using Arimoto entropy based on chaotic bee colony optimization

    NASA Astrophysics Data System (ADS)

    Li, Li

    2018-03-01

    In order to extract target from complex background more quickly and accurately, and to further improve the detection effect of defects, a method of dual-threshold segmentation using Arimoto entropy based on chaotic bee colony optimization was proposed. Firstly, the method of single-threshold selection based on Arimoto entropy was extended to dual-threshold selection in order to separate the target from the background more accurately. Then intermediate variables in formulae of Arimoto entropy dual-threshold selection was calculated by recursion to eliminate redundant computation effectively and to reduce the amount of calculation. Finally, the local search phase of artificial bee colony algorithm was improved by chaotic sequence based on tent mapping. The fast search for two optimal thresholds was achieved using the improved bee colony optimization algorithm, thus the search could be accelerated obviously. A large number of experimental results show that, compared with the existing segmentation methods such as multi-threshold segmentation method using maximum Shannon entropy, two-dimensional Shannon entropy segmentation method, two-dimensional Tsallis gray entropy segmentation method and multi-threshold segmentation method using reciprocal gray entropy, the proposed method can segment target more quickly and accurately with superior segmentation effect. It proves to be an instant and effective method for image segmentation.
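
    Two ingredients of the method above can be sketched. The tent-map sequence is as the abstract describes (with mu slightly below 2 to avoid floating-point collapse to zero); the dual-threshold search, by contrast, is an exhaustive Shannon-entropy (Kapur-style) stand-in for the paper's Arimoto-entropy criterion and chaotic bee-colony optimizer, which are not reproduced here.

    ```python
    import numpy as np

    def tent_map(n, x0=0.37, mu=1.99):
        """Tent-map chaotic sequence, used to diversify a colony's local search."""
        xs, x = np.empty(n), x0
        for i in range(n):
            x = mu * x if x < 0.5 else mu * (1.0 - x)
            xs[i] = x
        return xs

    def shannon(p):
        p = p[p > 0]
        return float(-np.sum(p * np.log(p)))

    def class_entropy(p_slice):
        s = p_slice.sum()
        return shannon(p_slice / s) if s > 0 else 0.0

    def dual_threshold(hist):
        """Exhaustive stand-in for the optimized search: choose (t1, t2) maximizing
        the summed entropies of the classes [0, t1), [t1, t2) and [t2, L)."""
        p = hist.astype(float) / hist.sum()
        L = len(p)
        best, best_t = -np.inf, (1, 2)
        for t1 in range(1, L - 1):
            for t2 in range(t1 + 1, L):
                h = class_entropy(p[:t1]) + class_entropy(p[t1:t2]) + class_entropy(p[t2:])
                if h > best:
                    best, best_t = h, (t1, t2)
        return best_t

    chaos = tent_map(5)
    print("tent-map seeds:", np.round(chaos, 3))

    # Trimodal 64-bin test histogram with modes at bins 10, 32 and 54
    bins = np.arange(64)
    hist = sum(np.exp(-0.5 * ((bins - m) / 3.0) ** 2) for m in (10, 32, 54))
    t1, t2 = dual_threshold(hist)
    print(f"thresholds: t1 = {t1}, t2 = {t2}")
    ```

    The recursive computation of intermediate variables mentioned in the abstract would replace the repeated slicing here, and the bee-colony search would replace the O(L^2) loop.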

  11. Influence of eye size and beam entry angle on dose to non-targeted tissues of the eye during stereotactic x-ray radiosurgery of AMD

    NASA Astrophysics Data System (ADS)

    Cantley, Justin L.; Hanlon, Justin; Chell, Erik; Lee, Choonsik; Smith, W. Clay; Bolch, Wesley E.

    2013-10-01

    Age-related macular degeneration is a leading cause of vision loss for the elderly population of industrialized nations. The IRay® Radiotherapy System, developed by Oraya® Therapeutics, Inc., is a stereotactic low-voltage irradiation system designed to treat the wet form of the disease. The IRay System uses three robotically positioned 100 kVp collimated photon beams to deliver an absorbed dose of up to 24 Gy to the macula. The present study uses the Monte Carlo radiation transport code MCNPX to assess absorbed dose to six non-targeted tissues within the eye—total lens, radiosensitive tissues of the lens, optic nerve, distal tip of the central retinal artery, non-targeted portion of the retina, and the ciliary body--all as a function of eye size and beam entry angle. The ocular axial length was ranged from 20 to 28 mm in 2 mm increments, with the polar entry angle of the delivery system varied from 18° to 34° in 2° increments. The resulting data showed insignificant variations in dose for all eye sizes. Slight variations in the dose to the optic nerve and the distal tip of the central retinal artery were noted as the polar beam angle changed. An increase in non-targeted retinal dose was noted as the entry angle increased, while the dose to the lens, sensitive volume of the lens, and ciliary body decreased as the treatment polar angle increased. Polar angles of 26° or greater resulted in no portion of the sensitive volume of the lens receiving an absorbed dose of 0.5 Gy or greater. All doses to non-targeted structures reported in this study were less than accepted thresholds for post-procedure complications.

  12. Relationship Between Radiation Therapy Dose and Outcome in Patients Treated With Neoadjuvant Chemoradiation Therapy and Surgery for Stage IIIA Non-Small Cell Lung Cancer: A Population-Based, Comparative Effectiveness Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sher, David J., E-mail: david_sher@rush.edu; Fidler, Mary Jo; Seder, Christopher W.

    Purpose: To compare, using the National Cancer Database, survival, pathologic, and surgical outcomes in patients with stage IIIA non-small cell lung cancer treated with differential doses of neoadjuvant chemoradiation therapy, with the aim to discern whether radiation dose escalation was associated with a comparative effectiveness benefit and/or toxicity risk. Methods and Materials: Patients in the National Cancer Database with stage IIIA non-small cell lung cancer treated with neoadjuvant chemoradiation therapy and surgery between 1998 and 2005 were analyzed. Dose strata were divided between 36 to 45 Gy (low-dose radiation therapy, LD-RT), 45 to 54 Gy (inclusive, standard-dose, SD-RT), and 54 to 74 Gy (high-dose, HD-RT). Outcomes included overall survival, residual nodal disease, positive surgical margin status, hospital length of stay, and adverse surgical outcomes (30-day mortality or readmission). Results: The cohort consisted of 1041 patients: 233 (22%) LD-RT, 584 (56%) SD-RT, and 230 (22%) HD-RT. The median, 3-year, and 5-year overall survival outcomes were 34.9 months, 48%, and 37%, respectively. On univariable analysis, patients treated with SD-RT experienced prolonged overall survival (median 38.3 vs 31.8 vs 29.0 months for SD-RT, LD-RT, and HD-RT, respectively, P=.0089), which was confirmed on multivariable analysis (hazard ratios 0.77 and 0.81 vs LD and HD, respectively). Residual nodal disease was seen less often after HD-RT (25.5% vs 31.8% and 37.5% for HD-RT, LD-RT, and SD-RT, respectively, P=.0038). Patients treated with SD-RT had fewer prolonged hospital stays. There were no differences in positive surgical margin status or adverse surgical outcomes between the cohorts. Conclusions: Neoadjuvant chemoradiation therapy between 45 and 54 Gy was associated with superior survival in comparison with doses above and below this threshold. Although this conclusion is limited by selection bias, clear candidates for trimodality therapy do not

  13. Color difference threshold determination for acrylic denture base resins.

    PubMed

    Ren, Jiabao; Lin, Hong; Huang, Qingmei; Liang, Qifan; Zheng, Gang

    2015-01-01

    This study aimed to set evaluation indicators, i.e., perceptibility and acceptability color difference thresholds, of color stability for acrylic denture base resins for a spectrophotometric assessing method, which offered an alternative to the visual method described in ISO 20795-1:2013. A total of 291 disk specimens 50±1 mm in diameter and 0.5±0.1 mm thick were prepared (ISO 20795-1:2013) and processed through radiation tests in an accelerated aging chamber (ISO 7491:2000) for increasing times of 0 to 42 hours. Color alterations were measured with a spectrophotometer and evaluated using the CIE L*a*b* colorimetric system. Color differences were calculated through the CIEDE2000 color difference formula. Thirty-two dental professionals without color vision deficiencies completed perceptibility and acceptability assessments under controlled conditions in vitro. An S-curve fitting procedure was used to analyze the 50:50% perceptibility and acceptability thresholds. Furthermore, perceptibility and acceptability against the differences of the three color attributes, lightness, chroma, and hue, were also investigated. According to the S-curve fitting procedure, the 50:50% perceptibility threshold was 1.71 ΔE00 (r2 = 0.88) and the 50:50% acceptability threshold was 4.00 ΔE00 (r2 = 0.89). Within the limitations of this study, 1.71/4.00 ΔE00 could be used as perceptibility/acceptability thresholds for acrylic denture base resins.
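
    The 50:50% threshold estimation above can be sketched with a logistic S-curve fit. The rater fractions below are synthetic, and the parameterization (slope k, midpoint t50) is an assumption for illustration, not the study's exact fitting procedure; the 50:50% threshold is read off as the fitted midpoint.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def s_curve(de, k, t50):
        """Logistic probability that a color difference de (ΔE00) is judged unacceptable."""
        return 1.0 / (1.0 + np.exp(-k * (de - t50)))

    # Synthetic observer data: fraction of raters judging each ΔE00 step unacceptable
    de00 = np.linspace(0.5, 8.0, 16)
    rng = np.random.default_rng(3)
    frac = np.clip(s_curve(de00, 1.8, 4.0) + rng.normal(0.0, 0.03, de00.size), 0.0, 1.0)

    (k, t50), _ = curve_fit(s_curve, de00, frac, p0=[1.0, 3.0])
    print(f"50:50 acceptability threshold ≈ {t50:.2f} ΔE00")
    ```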

  14. Invited perspectives: Hydrological perspectives on precipitation intensity-duration thresholds for landslide initiation: proposing hydro-meteorological thresholds

    NASA Astrophysics Data System (ADS)

    Bogaard, Thom; Greco, Roberto

    2018-01-01

    Many shallow landslides and debris flows are precipitation initiated. Therefore, regional landslide hazard assessment is often based on empirically derived precipitation intensity-duration (ID) thresholds and landslide inventories. Generally, two features of precipitation events are plotted and labeled with (shallow) landslide occurrence or non-occurrence. Then a separation line or zone is drawn, mostly in logarithmic space. The practical background of ID is that often only meteorological information is available when analyzing (non-)occurrence of shallow landslides and, at the same time, it could be that precipitation information is a good proxy for both meteorological trigger and hydrological cause. Although applied in many case studies, this approach suffers from many false positives as well as limited physical process understanding. Some first steps towards a more hydrologically based approach have been proposed in the past, but these efforts received limited follow-up. Therefore, the objective of our paper is to (a) critically analyze the concept of precipitation ID thresholds for shallow landslides and debris flows from a hydro-meteorological point of view and (b) propose a trigger-cause conceptual framework for lumped regional hydro-meteorological hazard assessment based on published examples and associated discussion. We discuss the ID thresholds in relation to return periods of precipitation, soil physics, and slope and catchment water balance. With this paper, we aim to contribute to the development of a stronger conceptual model for regional landslide hazard assessment based on physical process understanding and empirical data.

  15. Laser damage threshold measurements of microstructure-based high reflectors

    NASA Astrophysics Data System (ADS)

    Hobbs, Douglas S.

    2008-10-01

    In 2007, the pulsed laser induced damage threshold (LIDT) of anti-reflecting (AR) microstructures built in fused silica and glass was shown to be up to three times greater than the LIDT of single-layer thin-film AR coatings, and at least five times greater than multiple-layer thin-film AR coatings. This result suggested that microstructure-based wavelength selective mirrors might also exhibit high LIDT. Efficient light reflection over a narrow spectral range can be produced by an array of sub-wavelength sized surface relief microstructures built in a waveguide configuration. Such surface structure resonant (SSR) filters typically achieve a reflectivity exceeding 99% over a 1-10 nm range about the filter center wavelength, making SSR filters useful as laser high reflectors (HR). SSR laser mirrors consist of microstructures that are first etched in the surface of fused silica and borosilicate glass windows and subsequently coated with a thin layer of a non-absorbing high refractive index dielectric material such as tantalum pentoxide or zinc sulfide. Results of an initial investigation into the LIDT of single layer SSR laser mirrors operating at 532 nm, 1064 nm and 1573 nm are described along with data from SEM analysis of the microstructures, and spectral reflection measurements. None of the twelve samples tested exhibited damage thresholds above 3 J/cm2 when illuminated at the resonant wavelength, indicating that the simple single layer, first order design will need further development to be suitable for high power laser applications. Samples of SSR high reflectors entered in the Thin Film Damage Competition also exhibited low damage thresholds of less than 1 J/cm2 for the ZnS coated SSR, and just over 4 J/cm2 for the Ta2O5 coated SSR.

  16. Re-Visiting Trichuris trichiura Intensity Thresholds Based on Anemia during Pregnancy

    PubMed Central

    Gyorkos, Theresa W.; Gilbert, Nicolas L.; Larocque, Renée; Casapía, Martín; Montresor, Antonio

    2012-01-01

    Background The intensity categories, or thresholds, currently used for Trichuris trichiura (i.e. 1–999 epg (light), 1,000–9,999 epg (moderate), and ≥10,000 epg (heavy)) were developed in the 1980s, when little epidemiological data were available on dose-response relationships. This study was undertaken to determine a threshold for T. trichiura-associated anemia in pregnant women and to describe the implications of this threshold in terms of the need for primary prevention and chemotherapeutic interventions. Methodology/Principal Findings In Iquitos, Peru, 935 pregnant women were tested for T. trichiura infection in their second trimester of pregnancy; were given daily iron supplements throughout their pregnancy; and had their blood hemoglobin levels measured in their third trimester of pregnancy. Women in the highest two T. trichiura intensity quintiles (601–1632 epg and ≥1633 epg) had significantly lower mean hemoglobin concentrations than the lowest quintile (0–24 epg). They also had a statistically significantly higher risk of anemia, with adjusted odds ratios of 1.67 (95% CI: 1.02, 2.62) and 1.73 (95% CI: 1.09, 2.74), respectively. Conclusions/Significance This analysis provides support for categorizing a T. trichiura infection ≥1,000 epg as ‘moderate’, as currently defined by the World Health Organization. Because this ‘moderate’ level of T. trichiura infection was found to be a significant risk factor for anemia in pregnant women, the intensity of Trichuris infection deemed to cause or aggravate anemia should no longer be restricted to the ‘heavy’ intensity category. It should now include both ‘heavy’ and ‘moderate’ intensities of Trichuris infection. Evidence-based deworming strategies targeting pregnant women or populations where anemia is of concern should be updated accordingly. PMID:23029572

  17. Mechanisms of carcinogenesis: dose response

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gehring, P.J.; Blau, G.E.

    There is great controversy over whether the carcinogenicity of chemicals is dose-dependent and whether a threshold dose exists below which cancer will not be induced by exposure. Evidence for dose-dependency exists and is generally accepted when extricated, as it should be, from the threshold concept. The threshold-concept conflict is not likely to be resolved in the foreseeable future; proponents and opponents argue their case in a manner similar to those arguing religion. In this paper the various arguments are reviewed. Subsequently, a chemical process model for carcinogenesis is developed based on the generally accepted evidence that the carcinogenic activity of many chemicals can be related to electrophilic alkylation of DNA. Using this model, some incidence of cancer, albeit negligible, will be predicted regardless of how low the dose. However, the model reveals that the incidence of cancer induced by real-life exposures is likely to be greatly overestimated by currently used stochastic statistical extrapolations. Even more important, modeling the chemical processes involved in the fate of a carcinogenic chemical in the body reveals experimental approaches to elucidating the mechanism(s) of carcinogenesis and, ultimately, a more scientifically sound basis for assessing the hazard of low-level exposure to a chemical carcinogen.

  18. Effects of Cobalt-60 Exposure on Health of Taiwan Residents Suggest New Approach Needed in Radiation Protection

    PubMed Central

    Chen, W.L.; Luan, Y.C.; Shieh, M.C.; Chen, S.T.; Kung, H.T.; Soong, K.L.; Yeh, Y.C.; Chou, T.S.; Mong, S.H.; Wu, J.T.; Sun, C.P.; Deng, W.P.; Wu, M.F.; Shen, M.L.

    2007-01-01

    The conventional approach for radiation protection is based on the ICRP's linear, no threshold (LNT) model of radiation carcinogenesis, which implies that ionizing radiation is always harmful, no matter how small the dose. But a different approach can be derived from the observed health effects of the serendipitous contamination of 1700 apartments in Taiwan with cobalt-60 (T1/2 = 5.3 y). This experience indicates that chronic exposure of the whole body to low-dose-rate radiation, even accumulated to a high annual dose, may be beneficial to human health. Approximately 10,000 people occupied these buildings and received an average radiation dose of 0.4 Sv, unknowingly, during a 9–20 year period. They did not suffer a higher incidence of cancer mortality, as the LNT theory would predict. On the contrary, the incidence of cancer deaths in this population was greatly reduced—to about 3 per cent of the incidence of spontaneous cancer death in the general Taiwan public. In addition, the incidence of congenital malformations was also reduced—to about 7 per cent of the incidence in the general public. These observations appear to be compatible with the radiation hormesis model. Information about this Taiwan experience should be communicated to the public worldwide to help allay its fear of radiation and create a positive impression about important radiation applications. Expenditures of many billions of dollars in nuclear reactor operation could be saved and expansion of nuclear electricity generation could be facilitated. In addition, this knowledge would encourage further investigation and implementation of very important applications of total-body, low-dose irradiation to treat and cure many illnesses, including cancer. The findings of this study are such a departure from expectations, based on ICRP criteria, that we believe that they ought to be carefully reviewed by other, independent organizations and that population data not available to the authors be provided

  19. Dose equivalent rate constants and barrier transmission data for nuclear medicine facility dose calculations and shielding design.

    PubMed

    Kusano, Maggie; Caldwell, Curtis B

    2014-07-01

    A primary goal of nuclear medicine facility design is to keep public and worker radiation doses As Low As Reasonably Achievable (ALARA). To estimate dose and shielding requirements, one needs to know both the dose equivalent rate constants for soft tissue and barrier transmission factors (TFs) for all radionuclides of interest. Dose equivalent rate constants are most commonly calculated using published air kerma or exposure rate constants, while transmission factors are most commonly calculated using published tenth-value layers (TVLs). Values can be calculated more accurately using the radionuclide's photon emission spectrum and the physical properties of lead, concrete, and/or tissue at these energies. These calculations may be non-trivial due to the polyenergetic nature of the radionuclides used in nuclear medicine. In this paper, the effects of dose equivalent rate constant and transmission factor on nuclear medicine dose and shielding calculations are investigated, and new values based on up-to-date nuclear data and thresholds specific to nuclear medicine are proposed. To facilitate practical use, transmission curves were fitted to the three-parameter Archer equation. Finally, the results of this work were applied to the design of a sample nuclear medicine facility and compared to doses calculated using common methods to investigate the effects of these values on dose estimates and shielding decisions. Dose equivalent rate constants generally agreed well with those derived from the literature with the exception of those from NCRP 124. Depending on the situation, Archer fit TFs could be significantly more accurate than TVL-based TFs. These results were reflected in the sample shielding problem, with unshielded dose estimates agreeing well, with the exception of those based on NCRP 124, and Archer fit TFs providing a more accurate alternative to TVL TFs and a simpler alternative to full spectral-based calculations. 
The data provided by this paper should assist
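
    The three-parameter Archer fit mentioned above can be sketched as follows. The standard Archer broad-beam form is B(x) = [(1 + β/α) e^(αγx) − β/α]^(−1/γ), which satisfies B(0) = 1 by construction; the transmission "data" below are a pure-exponential stand-in built from an illustrative 0.25 cm tenth-value layer, not measured values.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def archer(x, alpha, beta, gamma):
        """Three-parameter Archer broad-beam transmission model; B(0) = 1 by construction."""
        r = beta / alpha
        return ((1.0 + r) * np.exp(alpha * gamma * x) - r) ** (-1.0 / gamma)

    # Stand-in "data": pure exponential attenuation with a 0.25 cm tenth-value layer
    x = np.linspace(0.0, 2.0, 20)                   # shield thickness, cm
    b_obs = 10.0 ** (-x / 0.25)

    popt, _ = curve_fit(archer, x, b_obs,
                        p0=[np.log(10.0) / 0.25, 0.1, 1.0],
                        bounds=([1e-3, 1e-6, 1e-3], [100.0, 10.0, 5.0]))
    print("fitted alpha, beta, gamma:", popt)
    ```

    The bounds keep the optimizer in the physically meaningful region; for real broad-beam data the fitted curve bends away from a single exponential at small thicknesses, which is exactly what makes Archer fits more accurate than TVL-based transmission factors.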

  20. A comparison of intensity modulated x-ray therapy to intensity modulated proton therapy for the delivery of non-uniform dose distributions

    NASA Astrophysics Data System (ADS)

    Flynn, Ryan

    2007-12-01

    The distribution of biological characteristics such as clonogen density, proliferation, and hypoxia throughout tumors is generally non-uniform, therefore it follows that the optimal dose prescriptions should also be non-uniform and tumor-specific. Advances in intensity modulated x-ray therapy (IMXT) technology have made the delivery of custom-made non-uniform dose distributions possible in practice. Intensity modulated proton therapy (IMPT) has the potential to deliver non-uniform dose distributions as well, while significantly reducing normal tissue and organ at risk dose relative to IMXT. In this work, a specialized treatment planning system was developed for the purpose of optimizing and comparing biologically based IMXT and IMPT plans. The IMXT systems of step-and-shoot (IMXT-SAS) and helical tomotherapy (IMXT-HT) and the IMPT systems of intensity modulated spot scanning (IMPT-SS) and distal gradient tracking (IMPT-DGT), were simulated. A thorough phantom study was conducted in which several subvolumes, which were contained within a base tumor region, were boosted or avoided with IMXT and IMPT. Different boosting situations were simulated by varying the size, proximity, and the doses prescribed to the subvolumes, and the size of the phantom. IMXT and IMPT were also compared for a whole brain radiation therapy (WBRT) case, in which a brain metastasis was simultaneously boosted and the hippocampus was avoided. Finally, IMXT and IMPT dose distributions were compared for the case of non-uniform dose prescription in a head and neck cancer patient that was based on PET imaging with the Cu(II)-diacetyl-bis(N4-methylthiosemicarbazone (Cu-ATSM) hypoxia marker. The non-uniform dose distributions within the tumor region were comparable for IMXT and IMPT. 
IMPT, however, was capable of delivering the same non-uniform dose distributions within a tumor using a 180° arc as for a full 360° rotation, which resulted in the reduction of normal tissue integral dose by a factor of

  1. Smeared spectrum jamming suppression based on generalized S transform and threshold segmentation

    NASA Astrophysics Data System (ADS)

    Li, Xin; Wang, Chunyang; Tan, Ming; Fu, Xiaolong

    2018-04-01

Smeared spectrum (SMSP) jamming is effective in countering linear frequency modulation (LFM) radar. Exploiting the time-frequency distribution difference between the jamming and the echo, a jamming suppression method based on the generalized S transform (GST) and threshold segmentation is proposed. The sub-pulse period is first estimated from the autocorrelation function. Next, the time-frequency image and the related gray-scale image are obtained via the GST. Finally, the Tsallis cross entropy is used to compute an optimized segmentation threshold, from which the jamming suppression filter is constructed. Simulation results show that the proposed method performs well in suppressing the false targets produced by SMSP.
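The Tsallis-entropy threshold selection at the heart of this method can be sketched on a gray-level histogram. This is a minimal illustration only: the GST time-frequency transform and the actual jamming filter are omitted, the entropic index `q` and the pseudo-additive combination rule follow a common Tsallis thresholding formulation rather than the paper's exact derivation, and the name `tsallis_threshold` is ours.

```python
import numpy as np

def tsallis_threshold(image, q=0.8, nbins=256):
    """Pick a segmentation threshold by maximizing the Tsallis entropies
    of the background and foreground classes, combined with the
    pseudo-additive rule S_A + S_B + (1-q)*S_A*S_B.
    `image`: any array of gray levels in [0, nbins)."""
    hist, _ = np.histogram(image, bins=nbins, range=(0, nbins))
    p = hist / hist.sum()
    best_t, best_val = 0, -np.inf
    for t in range(1, nbins):
        pa, pb = p[:t].sum(), p[t:].sum()
        if pa == 0 or pb == 0:
            continue  # one class would be empty
        sa = (1 - np.sum((p[:t] / pa) ** q)) / (q - 1)
        sb = (1 - np.sum((p[t:] / pb) ** q)) / (q - 1)
        val = sa + sb + (1 - q) * sa * sb
        if val > best_val:
            best_t, best_val = t, val
    return best_t
```

On a bimodal image the maximizing threshold falls in the gap between the two gray-level modes, which is what separates jamming pixels from echo pixels in the segmented time-frequency image.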

  2. Tensile testing method for rare earth based bulk superconductors at liquid nitrogen temperature

    NASA Astrophysics Data System (ADS)

    Kasaba, K.; Katagiri, K.; Murakami, A.; Sato, G.; Sato, T.; Murakami, M.; Sakai, N.; Teshima, H.; Sawamura, M.

    2005-10-01

Bending tests have been commonly carried out to investigate the mechanical properties of melt-processed rare earth based bulk superconductors. Tensile tests using small specimens, however, are preferable for evaluating the detailed distribution of the mechanical properties and the intrinsic elastic modulus, because no stress distribution exists across the cross-section. In this study, a low-temperature tensile test method using 3 × 3 × 4 mm specimens cut from Y123 and Gd123 bulks was examined. The specimens were glued to Al alloy rods at 400 K using epoxy resin. Tests were carried out at liquid nitrogen temperature (LNT) using an immersion-type jig. Although the bending strength in the direction perpendicular to the c-axis of the bulks is higher at LNT than at room temperature (RT), the tensile strength at LNT was lower than that at RT. Many of the specimens fractured near the interface between the specimen and the Al alloy rod at LNT. Finite element analysis showed a peak thermal stress in the loading direction near the interface that was significantly higher at LNT than at RT. It also showed that replacing the Al alloy rod with a Ti rod, whose coefficient of thermal expansion is close to that of the bulks, significantly increased the tensile strength.

  3. Radiation leukaemogenesis at low doses DE-FG02-05 ER 63947 Final Technical Report 15 May 2005; 14 May 2010

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bouffler, Simon

    2010-07-28

This report provides a complete summary of the work undertaken and results obtained under US Department of Energy grant DE-FG02-05 ER 63947, Radiation leukaemogenesis at low doses. There is ample epidemiological evidence indicating that ionizing radiation is carcinogenic in the higher dose range. This evidence, however, weakens and carries increasing uncertainties at doses below 100-200 mSv. At these low dose levels the form of the dose-response curve for radiation-induced cancer cannot be determined reliably or directly from studies of human populations. Therefore animal, cellular and other experimental systems must be employed to provide supporting evidence on which to base judgements of risk at low doses. Currently in radiological protection a linear non-threshold (LNT) extrapolation of risk estimates derived from human epidemiological studies is used to estimate risks in the dose range of interest for protection purposes. Myeloid leukaemias feature prominently among the cancers associated with human exposures to ionising radiation (eg UNSCEAR 2006; IARC 2000). Good animal models of radiation-induced acute myeloid leukaemia (AML) are available, including strains such as CBA, RFM and SJL (eg Major and Mole 1978; Ullrich et al 1976; Resnitzky et al 1985). Early mechanistic studies using cytogenetic methods in these mouse models established that the majority of radiation-induced AMLs carried substantial interstitial deletions in one copy of chromosome (chr) 2 (eg Hayata et al 1983; Trakhtenbrot et al 1988; Breckon et al 1991; Rithidech et al 1993; Bouffler et al 1996). Chr2 aberrations are known to occur in bone marrow cells as early as 24 hours after in vivo irradiation (Bouffler et al 1997). Subsequent molecular mapping studies defined a distinct region of chr2 that is commonly lost in AMLs (Clark et al 1996; Silver et al 1999). Further, more detailed, analysis identified point mutations at a specific region of the Sfpi1/PU.1 haemopoietic transcription

  4. A threshold-based fixed predictor for JPEG-LS image compression

    NASA Astrophysics Data System (ADS)

    Deng, Lihua; Huang, Zhenghua; Yao, Shoukui

    2018-03-01

In JPEG-LS, the fixed predictor based on the median edge detector (MED) detects only horizontal and vertical edges, and thus produces large prediction errors in the locality of diagonal edges. In this paper, we propose a threshold-based edge detection scheme for the fixed predictor. The proposed scheme detects not only horizontal and vertical edges but also diagonal edges. For certain threshold values, the proposed scheme reduces to other existing schemes, so it can also be regarded as an integration of these schemes. For a suitable threshold, the accuracy of horizontal and vertical edge detection is higher than that of the existing median edge detection in JPEG-LS. The proposed fixed predictor thus outperforms the existing JPEG-LS predictors for all images tested, while the complexity of the overall algorithm is maintained at a similar level.
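For reference, the baseline MED fixed predictor that the proposed scheme extends is fully specified in the JPEG-LS standard and fits in a few lines (`a` = left, `b` = above, `c` = upper-left neighbour of the current pixel). The threshold-based diagonal test itself is not detailed in the abstract, so only the standard predictor is shown:

```python
def med_predict(a, b, c):
    """JPEG-LS median edge detector (MED) fixed predictor."""
    if c >= max(a, b):
        return min(a, b)   # edge detected above: predict from the left
    if c <= min(a, b):
        return max(a, b)   # edge detected on the left: predict from above
    return a + b - c       # smooth region: planar prediction
```

On a diagonal configuration such as a=10, b=20, c=25, MED falls back to min(a, b), which is exactly the failure mode the proposed diagonal detector targets.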

  5. New Fetal Dose Estimates from 18F-FDG Administered During Pregnancy: Standardization of Dose Calculations and Estimations with Voxel-Based Anthropomorphic Phantoms.

    PubMed

    Zanotti-Fregonara, Paolo; Chastan, Mathieu; Edet-Sanson, Agathe; Ekmekcioglu, Ozgul; Erdogan, Ezgi Basak; Hapdey, Sebastien; Hindie, Elif; Stabin, Michael G

    2016-11-01

Data from the literature show that the fetal absorbed dose from 18F-FDG administration to the pregnant mother ranges from 0.5E-2 to 4E-2 mGy/MBq. These figures were, however, obtained using different quantification techniques and with basic geometric anthropomorphic phantoms. The aim of this study was to refine the fetal dose estimates of published as well as new cases using realistic voxel-based phantoms. The 18F-FDG doses to the fetus (n = 19; 5-34 wk of pregnancy) were calculated with new voxel-based anthropomorphic phantoms of the pregnant woman. The image-derived fetal time-integrated activity values were combined with those of the mothers' organs from the International Commission on Radiological Protection publication 106 and the dynamic bladder model with a 1-h bladder-voiding interval. The dose to the uterus was used as a proxy for early pregnancy (up to 10 wk). The time-integrated activities were entered into OLINDA/EXM 1.1 to derive the dose with the classic anthropomorphic phantoms of pregnant women, then into OLINDA/EXM 2.0 to assess the dose using the new voxel-based phantoms. The average fetal doses (mGy/MBq) with OLINDA/EXM 2.0 were 2.5E-02 in early pregnancy, 1.3E-02 in the late part of the first trimester, 8.5E-03 in the second trimester, and 5.1E-03 in the third trimester. The differences compared with the doses calculated with OLINDA/EXM 1.1 were +7%, +70%, +35%, and -8%, respectively. Except in late pregnancy, the doses estimated with realistic voxelwise anthropomorphic phantoms are higher than the doses derived from the old geometric phantoms. The doses remain, however, well below the threshold for any deterministic effects. Thus, pregnancy is not an absolute contraindication to a clinically justified 18F-FDG PET scan. © 2016 by the Society of Nuclear Medicine and Molecular Imaging, Inc.
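The reported coefficients translate directly into absolute dose estimates when multiplied by the administered activity. A small helper along those lines (the function name and the 200 MBq example are illustrative, not from the paper; the coefficients are the OLINDA/EXM 2.0 values quoted above):

```python
# Fetal dose coefficients (mGy/MBq) reported above, OLINDA/EXM 2.0
DOSE_COEFF = {
    "early": 2.5e-2,
    "late_first_trimester": 1.3e-2,
    "second_trimester": 8.5e-3,
    "third_trimester": 5.1e-3,
}

def fetal_dose_mgy(administered_mbq, stage):
    """Estimated fetal absorbed dose (mGy) for a stage of pregnancy."""
    return administered_mbq * DOSE_COEFF[stage]

# e.g. a hypothetical 200 MBq injection in the second trimester:
# fetal_dose_mgy(200, "second_trimester")  # ≈ 1.7 mGy
```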

  6. Low-dose CT reconstruction with patch based sparsity and similarity constraints

    NASA Astrophysics Data System (ADS)

    Xu, Qiong; Mou, Xuanqin

    2014-03-01

With the rapid growth of CT-based medical applications, low-dose CT reconstruction is becoming increasingly important to human health. Compared with other methods, statistical iterative reconstruction (SIR) usually performs better in the low-dose case. However, the reconstructed image quality of SIR depends heavily on prior-based regularization, because low-dose data are insufficient on their own. The frequently used regularization is derived from pixel-based priors, such as smoothness between adjacent pixels. This kind of pixel-based constraint cannot distinguish noise from structures effectively. Recently, patch-based methods, such as dictionary learning and non-local means filtering, have outperformed the conventional pixel-based methods. A patch is a small area of the image that expresses its structural information. In this paper, we propose patch-based constraints to improve the image quality of low-dose CT reconstruction. In the SIR framework, both patch-based sparsity and patch-based similarity are considered in the regularization term: sparsity is addressed by sparse representation and dictionary learning methods, while similarity is addressed by non-local means filtering. We conducted a real-data experiment to evaluate the proposed method. The experimental results show that this method can produce images with less noise and more detail than other methods in low-count and few-view cases.

  7. A de-noising algorithm based on wavelet threshold-exponential adaptive window width-fitting for ground electrical source airborne transient electromagnetic signal

    NASA Astrophysics Data System (ADS)

    Ji, Yanju; Li, Dongsheng; Yu, Mingmei; Wang, Yuan; Wu, Qiong; Lin, Jun

    2016-05-01

The ground electrical source airborne transient electromagnetic system (GREATEM) on an unmanned aircraft offers considerable prospecting depth, lateral resolution and detection efficiency, and in recent years it has become an important technique for rapid resource exploration. However, GREATEM data are extremely vulnerable to stationary white noise and non-stationary electromagnetic noise (sferics, aircraft engine noise and other man-made electromagnetic noise). These noises degrade the imaging quality available for data interpretation. Based on the characteristics of the GREATEM data and the major noise sources, we propose a de-noising algorithm combining wavelet thresholding with exponential adaptive window-width fitting. First, the white noise in the measured data is filtered using the wavelet threshold method. The data are then segmented using windows whose step lengths follow even logarithmic intervals. Within each window, data polluted by electromagnetic noise are identified by an energy-detection criterion, and the attenuation characteristics of the data slope are extracted. Finally, an exponential fitting algorithm fits the attenuation curve of each window, and the data polluted by non-stationary electromagnetic noise are replaced with their fitted values, effectively removing that noise. The proposed algorithm is verified on synthetic and real GREATEM signals. The results show that both stationary white noise and non-stationary electromagnetic noise in the GREATEM signal can be effectively filtered by the wavelet threshold-exponential adaptive window-width-fitting algorithm, which enhances the imaging quality.
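The wavelet-threshold stage can be illustrated with a single-level Haar transform and soft thresholding at the universal threshold. This is a simplified stand-in under stated assumptions: the paper does not specify the wavelet basis, decomposition depth, or threshold rule, and the window segmentation and exponential-fitting stages are omitted.

```python
import numpy as np

def haar_soft_denoise(x, sigma):
    """Single-level Haar wavelet soft-threshold denoising (sketch).
    Uses the universal threshold lam = sigma * sqrt(2 ln N)."""
    x = np.asarray(x, dtype=float)
    n = len(x) - len(x) % 2                   # even length for pairing
    a = (x[:n:2] + x[1:n:2]) / np.sqrt(2)     # approximation coefficients
    d = (x[:n:2] - x[1:n:2]) / np.sqrt(2)     # detail coefficients
    lam = sigma * np.sqrt(2 * np.log(len(x)))
    d = np.sign(d) * np.maximum(np.abs(d) - lam, 0.0)  # soft threshold
    y = np.empty(n)
    y[0::2] = (a + d) / np.sqrt(2)            # inverse Haar transform
    y[1::2] = (a - d) / np.sqrt(2)
    return np.concatenate([y, x[n:]])
```

With sigma = 0 the transform is inverted exactly, which provides a quick self-check of the reconstruction step.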

  8. Landslide triggering thresholds for Switzerland based on a new gridded precipitation dataset

    NASA Astrophysics Data System (ADS)

    Leonarduzzi, Elena; Molnar, Peter; McArdell, Brian W.

    2017-04-01

    In Switzerland floods are responsible for most of the damage caused by rainfall-triggered natural hazards (89%), followed by landslides (6%, ca. 520 M Euros) as reported in Hilker et al. (2009) for the period 1972-2007. The prediction of landslide occurrence is particularly challenging because of their wide distribution in space and the complex interdependence of predisposing and triggering factors. The overall goal of our research is to develop an Early Warning System for landsliding in Switzerland based on hydrological modelling and rainfall forecasts. In order to achieve this, we first analyzed rainfall triggering thresholds for landslides from a new gridded daily precipitation dataset (RhiresD, MeteoSwiss) for Switzerland combined with landslide events recorded in the Swiss Damage Database (Hilker et al.,2009). The high-resolution gridded precipitation dataset allows us to collocate rainfall and landslides accurately in space, which is an advantage over many previous studies. Each of the 2272 landslides in the database in the period 1972-2012 was assigned to the corresponding 2x2 km precipitation cell. For each of these cells, precipitation events were defined as series of consecutive rainy days and the following event parameters were computed: duration (day), maximum and mean daily intensity (mm/day), total rainfall depth (mm) and maximum daily intensity divided by Mean Daily Precipitation (MDP). The events were classified as triggering or non-triggering depending on whether a landslide was recorded in the cell during the event. This classification of observations was compared to predictions based on a threshold for each of the parameters. The predictive power of each parameter and the best threshold value were quantified by ROC analysis and statistics such as AUC and the True Skill Statistic (TSS). Event parameters based on rainfall intensity were found to have similarly high predictive power (TSS=0.54-0.59, AUC=0.85-0.86), while rainfall duration had a
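For a single event parameter and a candidate threshold, the ROC-style scoring described above reduces to a 2 × 2 contingency table. A minimal sketch of the True Skill Statistic computation (function and variable names are ours, not from the paper):

```python
def true_skill_statistic(values, triggered, threshold):
    """TSS = hit rate + correct-rejection rate - 1 for one threshold.
    `values`: the event parameter (e.g. max daily intensity, mm/day);
    `triggered`: booleans, True where a landslide was recorded.
    Assumes both triggering and non-triggering events are present."""
    tp = sum(v >= threshold and t for v, t in zip(values, triggered))
    fn = sum(v < threshold and t for v, t in zip(values, triggered))
    fp = sum(v >= threshold and not t for v, t in zip(values, triggered))
    tn = sum(v < threshold and not t for v, t in zip(values, triggered))
    return tp / (tp + fn) + tn / (tn + fp) - 1
```

Scanning candidate thresholds and keeping the TSS-maximizing one reproduces the "best threshold value" selection the abstract describes.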

  9. 2 CFR 200.88 - Simplified acquisition threshold.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 2 Grants and Agreements 1 2014-01-01 2014-01-01 false Simplified acquisition threshold. 200.88... acquisition threshold. Simplified acquisition threshold means the dollar amount below which a non-Federal... threshold. The simplified acquisition threshold is set by the Federal Acquisition Regulation at 48 CFR...

  10. a Threshold-Free Filtering Algorithm for Airborne LIDAR Point Clouds Based on Expectation-Maximization

    NASA Astrophysics Data System (ADS)

    Hui, Z.; Cheng, P.; Ziggah, Y. Y.; Nie, Y.

    2018-04-01

Filtering is a key step for most applications of airborne LiDAR point clouds. Although many filtering algorithms have been put forward in recent years, most of them require parameter setting or threshold adjustment, which is time-consuming and reduces the degree of automation of the algorithm. To overcome this problem, this paper proposes a threshold-free filtering algorithm based on expectation-maximization. The algorithm rests on the assumption that the point cloud can be modelled as a mixture of Gaussians, so separating ground points from non-ground points can be recast as separating the components of a Gaussian mixture. Expectation-maximization (EM) is applied to realize the separation: EM computes maximum likelihood estimates of the mixture parameters, and with the estimated parameters the likelihood of each point belonging to ground or object can be computed. After several iterations, each point is labelled with the component of larger likelihood. Furthermore, intensity information is used to refine the filtering results obtained by the EM method. The proposed algorithm was tested on two datasets used in practice, and the experimental results show that it filters non-ground points effectively. For quantitative evaluation, the algorithm was also run on the benchmark dataset provided by the ISPRS, where it obtains a 4.48 % total error, much lower than most of the eight classical filtering algorithms reported by the ISPRS.
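The core separation step, EM on a two-component Gaussian mixture, can be sketched in one dimension (e.g. on point elevations). This is a toy version under simplifying assumptions: the paper's actual features, initialisation, and intensity-based refinement are not reproduced, and all names here are ours.

```python
import numpy as np

def em_two_gaussians(z, iters=50):
    """EM for a two-component 1-D Gaussian mixture (sketch).
    Returns (means, stds, weights)."""
    z = np.asarray(z, dtype=float)
    mu = np.array([z.min(), z.max()])        # crude initialisation
    sd = np.array([z.std(), z.std()]) + 1e-6
    w = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: responsibility of each component for each point
        pdf = w * np.exp(-0.5 * ((z[:, None] - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))
        r = pdf / pdf.sum(axis=1, keepdims=True)
        # M-step: maximum-likelihood update of the mixture parameters
        nk = r.sum(axis=0)
        mu = (r * z[:, None]).sum(axis=0) / nk
        sd = np.sqrt((r * (z[:, None] - mu) ** 2).sum(axis=0) / nk) + 1e-6
        w = nk / len(z)
    return mu, sd, w

def label_points(z, mu, sd, w):
    """Label each point with the component of larger likelihood
    (component 0 sits at the lower elevations, i.e. ground)."""
    pdf = w * np.exp(-0.5 * ((np.asarray(z)[:, None] - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))
    return pdf.argmax(axis=1)
```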

  11. ESTIMATING CHILDREN'S DERMAL AND NON-DIETARY INGESTION EXPOSURE AND DOSE WITH EPA'S SHEDS MODEL

    EPA Science Inventory

    A physically-based stochastic model (SHEDS) has been developed to estimate pesticide exposure and dose to children via dermal residue contact and non-dietary ingestion. Time-location-activity data are sampled from national survey results to generate a population of simulated ch...

  12. Heavy particle irradiation, neurochemistry and behavior: thresholds, dose-response curves and recovery of function

    NASA Astrophysics Data System (ADS)

    Rabin, B. M.; Joseph, J. A.; Shukitt-Hale, B.

    2004-01-01

    Exposure to heavy particles can affect the functioning of the central nervous system (CNS), particularly the dopaminergic system. In turn, the radiation-induced disruption of dopaminergic function affects a variety of behaviors that are dependent upon the integrity of this system, including motor behavior (upper body strength), amphetamine (dopamine)-mediated taste aversion learning, and operant conditioning (fixed-ratio bar pressing). Although the relationships between heavy particle irradiation and the effects of exposure depend, to some extent, upon the specific behavioral or neurochemical endpoint under consideration, a review of the available research leads to the hypothesis that the endpoints mediated by the CNS have certain characteristics in common. These include: (1) a threshold, below which there is no apparent effect; (2) the lack of a dose-response relationship, or an extremely steep dose-response curve, depending on the particular endpoint; and (3) the absence of recovery of function, such that the heavy particle-induced behavioral and neural changes are present when tested up to one year following exposure. The current report reviews the data relevant to the degree to which these characteristics are common to neurochemical and behavioral endpoints that are mediated by the effects of exposure to heavy particles on CNS activity.

  13. Heavy particle irradiation, neurochemistry and behavior: thresholds, dose-response curves and recovery of function

    NASA Technical Reports Server (NTRS)

    Rabin, B. M.; Joseph, J. A.; Shukitt-Hale, B.

    2004-01-01

Exposure to heavy particles can affect the functioning of the central nervous system (CNS), particularly the dopaminergic system. In turn, the radiation-induced disruption of dopaminergic function affects a variety of behaviors that are dependent upon the integrity of this system, including motor behavior (upper body strength), amphetamine (dopamine)-mediated taste aversion learning, and operant conditioning (fixed-ratio bar pressing). Although the relationships between heavy particle irradiation and the effects of exposure depend, to some extent, upon the specific behavioral or neurochemical endpoint under consideration, a review of the available research leads to the hypothesis that the endpoints mediated by the CNS have certain characteristics in common. These include: (1) a threshold, below which there is no apparent effect; (2) the lack of a dose-response relationship, or an extremely steep dose-response curve, depending on the particular endpoint; and (3) the absence of recovery of function, such that the heavy particle-induced behavioral and neural changes are present when tested up to one year following exposure. The current report reviews the data relevant to the degree to which these characteristics are common to neurochemical and behavioral endpoints that are mediated by the effects of exposure to heavy particles on CNS activity. © 2004 COSPAR. Published by Elsevier Ltd. All rights reserved.

  14. Excel-Based Tool for Pharmacokinetically Guided Dose Adjustment of Paclitaxel.

    PubMed

    Kraff, Stefanie; Lindauer, Andreas; Joerger, Markus; Salamone, Salvatore J; Jaehde, Ulrich

    2015-12-01

Neutropenia is a frequent and severe adverse event in patients receiving paclitaxel chemotherapy. The time above a paclitaxel threshold concentration of 0.05 μmol/L (Tc > 0.05 μmol/L) is a strong predictor of paclitaxel-associated neutropenia and has been proposed as a target pharmacokinetic (PK) parameter for paclitaxel therapeutic drug monitoring and dose adaptation. Until now, individual Tc > 0.05 μmol/L values have been estimated from a published PK model of paclitaxel using the software NONMEM. Because many clinicians are not familiar with NONMEM, an easy-to-use Excel-based dosing tool was developed to calculate paclitaxel Tc > 0.05 μmol/L. Population PK parameters of paclitaxel were taken from a published PK model. An Alglib VBA code was implemented in Excel 2007 to compute the differential equations of the paclitaxel PK model. Maximum a posteriori Bayesian estimates of the PK parameters were determined with the Excel Solver using individual drug concentrations. Concentrations were simulated for 250 patients receiving 1 cycle of paclitaxel chemotherapy. Predictions of paclitaxel Tc > 0.05 μmol/L as calculated by the Excel tool were compared with NONMEM, where maximum a posteriori Bayesian estimates were obtained using the POSTHOC function. There was good concordance and comparable predictive performance between Excel and NONMEM regarding predicted paclitaxel plasma concentrations and Tc > 0.05 μmol/L values: Tc > 0.05 μmol/L had a maximum bias of 3% and a precision error of <12%, and the median relative deviation of the estimated Tc > 0.05 μmol/L values between the two programs was 1%. The Excel-based tool can estimate the time above a paclitaxel threshold concentration of 0.05 μmol/L with acceptable accuracy and precision. The presented Excel tool allows reliable calculation of paclitaxel Tc > 0.05 μmol/L and thus allows target concentration intervention to improve the benefit-risk ratio of the
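To make the target parameter concrete: for a drug following simple mono-exponential decay, the time above a threshold concentration has a closed form. This is an illustration only; the published paclitaxel model is multi-compartment with nonlinear elimination, which is why the real tool integrates the differential equations numerically (via the Alglib VBA code).

```python
import math

def time_above_threshold(c0, k_el, thresh=0.05):
    """Time a concentration stays above `thresh` (same units as c0),
    assuming mono-exponential decay C(t) = c0 * exp(-k_el * t).
    `k_el` is an assumed first-order elimination rate constant."""
    if c0 <= thresh:
        return 0.0
    return math.log(c0 / thresh) / k_el
```

For example, starting at 1.6 μmol/L with an (assumed) elimination rate of 0.1 h⁻¹, the concentration stays above 0.05 μmol/L for ln(32)/0.1 ≈ 34.7 h.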

  15. Low-Dose N,N-Dimethylformamide Exposure and Liver Injuries in a Cohort of Chinese Leather Industry Workers.

    PubMed

    Qi, Cong; Gu, Yiyang; Sun, Qing; Gu, Hongliang; Xu, Bo; Gu, Qing; Xiao, Jing; Lian, Yulong

    2017-05-01

We assessed the risk of liver injuries following low doses of N,N-dimethylformamide (DMF) below the threshold limit value (20 mg/m³) among leather industry workers and comparison groups. A cohort of 429 workers from a leather factory and 466 non-exposed subjects in China were followed for 4 years. Poisson regression and piece-wise linear regression were used to examine the relationship between DMF and liver injury. Workers exposed to a cumulative dose of DMF were significantly more likely than non-exposed workers to develop liver injury. A nonlinear relationship between DMF and liver injury was observed, with a threshold cumulative DMF dose for liver injury of 7.30 (mg/m³)·year. The findings indicate the importance of reducing DMF occupational exposure limits to promote worker health.

  16. Mercury demethylation in waterbird livers: Dose-response thresholds and differences among species

    USGS Publications Warehouse

    Eagles-Smith, Collin A.; Ackerman, Joshua T.; Julie, Y.E.E.; Adelsbach, T.L.

    2009-01-01

We assessed methylmercury (MeHg) demethylation in the livers of adults and chicks of four waterbird species that commonly breed in San Francisco Bay: American avocets, black-necked stilts, Caspian terns, and Forster's terns. In adults (all species combined), we found strong evidence for a threshold model where MeHg demethylation occurred above a hepatic total mercury concentration threshold of 8.51 ± 0.93 μg/g dry weight, and there was a strong decline in %MeHg values as total mercury (THg) concentrations increased above 8.51 μg/g dry weight. Conversely, there was no evidence for a demethylation threshold in chicks, and we found that %MeHg values declined linearly with increasing THg concentrations. For adults, we also found taxonomic differences in the demethylation responses, with avocets and stilts showing a higher demethylation rate than that of terns when concentrations exceeded the threshold, whereas terns had a lower demethylation threshold (7.48 ± 1.48 μg/g dry wt) than that of avocets and stilts (9.91 ± 1.29 μg/g dry wt). Finally, we assessed the role of selenium (Se) in the demethylation process. Selenium concentrations were positively correlated with inorganic Hg in livers of birds above the demethylation threshold but not below. This suggests that Se may act as a binding site for demethylated Hg and may reduce the potential for secondary toxicity. Our findings indicate that waterbirds demethylate mercury in their livers if exposure exceeds a threshold value and suggest that taxonomic differences in demethylation ability may be an important factor in evaluating species-specific risk of MeHg exposure. Further, we provide strong evidence for a threshold of approximately 8.5 μg/g dry weight of THg in the liver at which demethylation is initiated. © 2009 SETAC.

  17. Normal Threshold Size of Stimuli in Children Using a Game-Based Visual Field Test.

    PubMed

    Wang, Yanfang; Ali, Zaria; Subramani, Siddharth; Biswas, Susmito; Fenerty, Cecilia; Henson, David B; Aslam, Tariq

    2017-06-01

The aim of this study was to demonstrate and explore the ability of novel game-based perimetry to establish normal visual field thresholds in children. One hundred and eighteen children (aged 8.0 ± 2.8 years) with no history of visual field loss or significant medical history were recruited. Each child had one eye tested using a game-based visual field test, 'Caspar's Castle', at four retinal locations 12.7° (N = 118) from fixation. Thresholds were established repeatedly using up/down staircase algorithms with stimuli of varying diameter (luminance 20 cd/m², duration 200 ms, background luminance 10 cd/m²). Relationships between threshold and age were determined along with measures of intra- and intersubject variability. The game-based visual field test was able to establish threshold estimates across the full range of children tested. Threshold size decreased with increasing age, and both intrasubject and intersubject variability were inversely related to age. Normal visual field thresholds were thus established for specific locations in children using a novel game-based visual field test. These could serve as a foundation for developing a game-based perimetry screening test for children.
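The up/down staircase procedure used to estimate each threshold can be sketched as follows. This is a generic 1-up/1-down staircase with step-halving at reversals, not the specific Caspar's Castle algorithm, and all parameter values are illustrative.

```python
def staircase_threshold(sees, start=2.0, step=0.5, reversals=8):
    """Simple up/down staircase: shrink the stimulus after a 'seen'
    response, grow it after a 'miss'; halve the step at each reversal
    and average the last few reversal points as the threshold estimate.
    `sees(size)` -> bool simulates (or queries) the observer."""
    size, last, points = start, None, []
    while len(points) < reversals:
        resp = sees(size)
        if last is not None and resp != last:
            points.append(size)   # response direction reversed
            step /= 2
        size = max(size - step, 0.01) if resp else size + step
        last = resp
    return sum(points[-4:]) / 4

# Deterministic observer with a true size threshold of 1.2 (arbitrary units):
# staircase_threshold(lambda s: s >= 1.2) converges near 1.2.
```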

  18. A flash flood early warning system based on rainfall thresholds and daily soil moisture indexes

    NASA Astrophysics Data System (ADS)

    Brigandì, Giuseppina; Tito Aronica, Giuseppe

    2015-04-01

The main focus of the paper is to present a flash flood early warning system, developed for the Civil Protection Agency of the Sicily Region, that issues alerts for extreme hydrometeorological events using a methodology based on the combined use of rainfall thresholds and soil moisture indexes. Flash flood warning is a key element in improving Civil Protection's ability to mitigate damage and safeguard people's security. It is a rather complicated task, particularly in catchments with flashy response, where even brief anticipations are important and welcome. In this context, hydrological precursors can be considered to improve the effectiveness of emergency actions (i.e. early flood warning). Soil moisture is well known to be an important factor in flood formation, because runoff generation is strongly influenced by the antecedent soil moisture conditions of the catchment. The basic idea of the work presented here is to use soil moisture indexes, derived in continuous form, to define a first alert phase in a flash flood forecasting chain, and then to define a unique rainfall threshold for a given day for activating the subsequent alarm phases, derived as a function of the soil moisture conditions at the beginning of the day. Daily soil moisture indexes, representative of the moisture condition of the catchment, were derived using a parsimonious and simple-to-use approach based on the IHACRES model in a modified form developed by the authors. This is a simple, spatially-lumped rainfall-streamflow model, based on the SCS-CN method and the unit hydrograph approach, that requires only rainfall, streamflow and air temperature data. It consists of two modules. In the first, a nonlinear loss model based on the SCS-CN method transforms total rainfall into effective rainfall. 
In the second, a linear convolution of effective rainfall was performed using a total unit hydrograph with a configuration of
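The first module's SCS-CN transformation of total rainfall into effective rainfall has a standard closed form, sketched below with the usual initial-abstraction assumption Ia = 0.2·S (the authors' modified IHACRES implementation may parameterise this differently):

```python
def effective_rainfall_mm(p_mm, cn):
    """SCS-CN runoff (effective rainfall) in mm for a storm depth p_mm.
    S: potential maximum retention (mm); Ia = 0.2*S is the standard
    SCS initial-abstraction assumption."""
    s = 25400.0 / cn - 254.0
    ia = 0.2 * s
    if p_mm <= ia:
        return 0.0          # all rainfall absorbed before runoff starts
    return (p_mm - ia) ** 2 / (p_mm - ia + s)
```

For CN = 100 the retention S is zero and all rainfall becomes effective; lower curve numbers retain progressively more, mimicking drier antecedent soil moisture.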

  19. Urgent Change Needed to Radiation Protection Policy.

    PubMed

    Cuttler, Jerry M

    2016-03-01

    Although almost 120 y of medical experience and data exist on human exposure to ionizing radiation, advisory bodies and regulators claim there are still significant uncertainties about radiation health risks that require extreme precautions be taken. Decades of evidence led to recommendations in the 1920s for protecting radiologists by limiting their daily exposure. These were shown in later studies to decrease both their overall mortality and cancer mortality below those of unexposed groups. In the 1950s, without scientific evidence, the National Academy of Sciences Biological Effects of Atomic Radiation (BEAR) Committee and the NCRP recommended that the linear no-threshold (LNT) model be used to assess the risk of radiation-induced mutations in germ cells and the risk of cancer in somatic cells. This policy change was accepted by the regulators of every country without a thorough review of its basis. Because use of the LNT model has created extreme public fear of radiation, which impairs vital medical applications of low-dose radiation in diagnostics and therapy and blocks nuclear energy projects, it is time to change radiation protection policy back into line with the data.

  20. Repeated restraint stress lowers the threshold for response to third ventricle CRF administration.

    PubMed

    Harris, Ruth B S

    2017-03-01

    Rats and mice exposed to repeated stress or a single severe stress exhibit a sustained increase in energetic, endocrine, and behavioral response to subsequent novel mild stress. This study tested whether the hyper-responsiveness was due to a lowered threshold of response to corticotropin releasing factor (CRF) or an exaggerated response to a standard dose of CRF. Male Sprague-Dawley rats were subjected to 3h of restraint on each of 3 consecutive days (RRS) or were non-restrained controls. RRS caused a temporary hypophagia but a sustained reduction in body weight. Eight days after the end of restraint, rats received increasing third ventricle doses of CRF (0-3.0μg). The lowest dose of CRF (0.25μg) increased corticosterone release in RRS, but not control rats. Higher doses caused the same stimulation of corticosterone in the two groups of rats. Fifteen days after the end of restraint, rats were food deprived during the light period and received increasing third ventricle doses of CRF at the start of the dark period. The lowest dose of CRF inhibited food intake during the first hour following infusion in RRS, but not control rats. All other doses of CRF inhibited food intake to the same degree in both RRS and control rats. The lowered threshold of response to central CRF is consistent with the chronic hyper-responsiveness to CRF and mild stress in RRS rats during the post-restraint period. Copyright © 2016 Elsevier Inc. All rights reserved.

  1. Molybdenum target specifications for cyclotron production of 99mTc based on patient dose estimates.

    PubMed

    Hou, X; Tanguay, J; Buckley, K; Schaffer, P; Bénard, F; Ruth, T J; Celler, A

    2016-01-21

In response to the recognized fragility of the reactor-produced (99)Mo supply, direct production of (99m)Tc via the (100)Mo(p,2n)(99m)Tc reaction using medical cyclotrons has been investigated. However, because other molybdenum (Mo) isotopes exist in the target, other technetium (Tc) radioactive isotopes (impurities) will be produced in parallel with (99m)Tc. They will be incorporated into the labeled radiopharmaceuticals and result in increased patient dose. The isotopic composition of the target and the beam energy are the main factors that determine the production of impurities, and thus also the dose increase. Therefore, both must be considered when selecting targets for clinical (99m)Tc production. Although for any given Mo target the patient dose can be predicted based on complicated calculations of production yields for each Tc radioisotope, it would be very difficult to reverse these calculations to specify target composition based on dosimetry considerations. In this article, the relationship between patient dosimetry and Mo target composition is studied. A simple and easy algorithm for dose estimation, based solely on knowledge of the target composition and beam energy, is described. Using this algorithm, the patient dose increase due to every Mo isotope that could be present in the target is estimated. Most importantly, a technique is proposed to determine Mo target composition thresholds that would meet any given dosimetry requirement.

  2. Molybdenum target specifications for cyclotron production of 99mTc based on patient dose estimates

    NASA Astrophysics Data System (ADS)

    Hou, X.; Tanguay, J.; Buckley, K.; Schaffer, P.; Bénard, F.; Ruth, T. J.; Celler, A.

    2016-01-01

In response to the recognized fragility of the reactor-produced 99Mo supply, direct production of 99mTc via the 100Mo(p,2n)99mTc reaction using medical cyclotrons has been investigated. However, because other molybdenum (Mo) isotopes exist in the target, other technetium (Tc) radioactive isotopes (impurities) will be produced in parallel with 99mTc. They will be incorporated into the labeled radiopharmaceuticals and result in increased patient dose. The isotopic composition of the target and the beam energy are the main factors that determine the production of impurities, and thus also the dose increase. Therefore, both must be considered when selecting targets for clinical 99mTc production. Although for any given Mo target the patient dose can be predicted based on complicated calculations of production yields for each Tc radioisotope, it would be very difficult to reverse these calculations to specify target composition based on dosimetry considerations. In this article, the relationship between patient dosimetry and Mo target composition is studied. A simple and easy algorithm for dose estimation, based solely on knowledge of the target composition and beam energy, is described. Using this algorithm, the patient dose increase due to every Mo isotope that could be present in the target is estimated. Most importantly, a technique is proposed to determine Mo target composition thresholds that would meet any given dosimetry requirement.
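At its core, the dose-increase estimate described above reduces to a weighted sum over the Mo isotopes present in the target. A minimal sketch of that idea; both the per-isotope coefficients and the example composition below are hypothetical placeholders, since the paper derives the real coefficients from Tc radioisotope production yields at a given beam energy:

```python
def relative_dose_increase(composition, coeff):
    """Patient dose increase (arbitrary units) as a weighted sum over the
    Mo isotopes in the target: sum of (isotopic fraction * per-isotope
    dose-increase coefficient). Coefficients are beam-energy dependent in
    the paper; here they are hypothetical constants."""
    return sum(frac * coeff[iso] for iso, frac in composition.items())

# Hypothetical target: 99.5% 100Mo with small 98Mo/96Mo admixtures,
# and hypothetical per-percent dose-increase coefficients.
composition = {"Mo-100": 99.5, "Mo-98": 0.4, "Mo-96": 0.1}
coeff = {"Mo-100": 0.0, "Mo-98": 0.5, "Mo-96": 2.0}
print(relative_dose_increase(composition, coeff))  # 0.4
```

Inverting this sum for a fixed dose budget is what yields composition thresholds: the allowed fraction of each impurity isotope scales inversely with its coefficient.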

  3. New flux based dose-response relationships for ozone for European forest tree species.

    PubMed

    Büker, P; Feng, Z; Uddling, J; Briolat, A; Alonso, R; Braun, S; Elvira, S; Gerosa, G; Karlsson, P E; Le Thiec, D; Marzuoli, R; Mills, G; Oksanen, E; Wieser, G; Wilkinson, M; Emberson, L D

    2015-11-01

To derive O3 dose-response relationships (DRR) for five European forest tree species and for broadleaf deciduous and needleleaf tree plant functional types (PFTs), phytotoxic O3 doses (PODy) were related to biomass reductions. PODy was calculated using a stomatal flux model with a range of cut-off thresholds (y) indicative of varying detoxification capacities. Linear regression analysis showed that DRR for PFTs and individual tree species differed in their robustness. A simplified parameterisation of the flux model was tested and showed that, for most non-Mediterranean tree species, this simplified model led to similarly robust DRR as a species- and climate-region-specific parameterisation. Experimentally induced soil water stress was not found to substantially reduce PODy, mainly due to the short duration of the soil water stress periods. This study validates the stomatal O3 flux concept and represents a step forward in predicting O3 damage to forests in a spatially and temporally varying climate. Crown Copyright © 2015. Published by Elsevier Ltd. All rights reserved.
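The PODy metric above accumulates hourly stomatal O3 flux in excess of the cut-off threshold y. A minimal sketch of that accumulation, assuming hourly flux values in nmol m-2 s-1 and a result in mmol m-2 (the illustrative numbers are not from the study):

```python
def pod_y(hourly_flux_nmol, y):
    """Phytotoxic ozone dose above threshold y (POD_y): accumulated hourly
    stomatal O3 flux exceeding y nmol m-2 s-1, converted to mmol m-2
    (3600 s per hour, 1e6 nmol per mmol)."""
    return sum(max(f - y, 0.0) for f in hourly_flux_nmol) * 3600.0 / 1e6

# Three hourly flux values with cut-off y = 1 nmol m-2 s-1:
# excesses are 0, 1 and 2, so POD_1 = 3 * 3600 / 1e6 mmol m-2.
print(pod_y([1.0, 2.0, 3.0], 1.0))  # 0.0108
```

The choice of y shifts how much low-level flux (assumed to be detoxified) is excluded, which is why the study sweeps a range of cut-off values.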

  4. Threshold multi-secret sharing scheme based on phase-shifting interferometry

    NASA Astrophysics Data System (ADS)

    Deng, Xiaopeng; Wen, Wei; Shi, Zhengang

    2017-03-01

A threshold multi-secret sharing scheme is proposed based on phase-shifting interferometry. The K secret images to be shared are first encoded using Fourier transformation. These encoded images are then shared into many shadow images based on the recording principle of phase-shifting interferometry. In the recovering stage, the secret images can be restored by combining any 2K+1 or more shadow images, while any 2K or fewer shadow images yield no information about the secret images. As a result, a (2K+1, N) threshold multi-secret sharing scheme can be implemented. Simulation results are presented to demonstrate the feasibility of the proposed method.
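The recording principle the scheme builds on is standard N-step phase-shifting interferometry: a phase encoded across N >= 3 equally shifted intensity frames can be recovered by harmonic summation. A sketch of that recovery step alone (not of the full sharing scheme):

```python
import math

def recover_phase(frames):
    """Recover the encoded phase from N >= 3 equally phase-shifted
    interferograms I_n = a + b*cos(phi + 2*pi*n/N), using the standard
    N-step phase-shifting formula phi = atan2(-sum I_n sin d_n,
    sum I_n cos d_n) with d_n = 2*pi*n/N."""
    n_frames = len(frames)
    s = sum(i_n * math.sin(2 * math.pi * n / n_frames)
            for n, i_n in enumerate(frames))
    c = sum(i_n * math.cos(2 * math.pi * n / n_frames)
            for n, i_n in enumerate(frames))
    return math.atan2(-s, c)

# Synthetic 4-step example: background a = 5, modulation b = 2, phase 1.0.
a, b, phi, N = 5.0, 2.0, 1.0, 4
frames = [a + b * math.cos(phi + 2 * math.pi * n / N) for n in range(N)]
print(round(recover_phase(frames), 6))  # 1.0
```

In the sharing scheme, each shadow image plays the role of one such phase-shifted frame, which is why a minimum number of shadows is needed before the encoded information becomes recoverable.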

  5. Applicability of the linear-quadratic formalism for modeling local tumor control probability in high dose per fraction stereotactic body radiotherapy for early stage non-small cell lung cancer.

    PubMed

    Guckenberger, Matthias; Klement, Rainer Johannes; Allgäuer, Michael; Appold, Steffen; Dieckmann, Karin; Ernst, Iris; Ganswindt, Ute; Holy, Richard; Nestle, Ursula; Nevinny-Stickel, Meinhard; Semrau, Sabine; Sterzing, Florian; Wittig, Andrea; Andratschke, Nicolaus; Flentje, Michael

    2013-10-01

    To compare the linear-quadratic (LQ) and the LQ-L formalism (linear cell survival curve beyond a threshold dose dT) for modeling local tumor control probability (TCP) in stereotactic body radiotherapy (SBRT) for stage I non-small cell lung cancer (NSCLC). This study is based on 395 patients from 13 German and Austrian centers treated with SBRT for stage I NSCLC. The median number of SBRT fractions was 3 (range 1-8) and median single fraction dose was 12.5 Gy (2.9-33 Gy); dose was prescribed to the median 65% PTV encompassing isodose (60-100%). Assuming an α/β-value of 10 Gy, we modeled TCP as a sigmoid-shaped function of the biologically effective dose (BED). Models were compared using maximum likelihood ratio tests as well as Bayes factors (BFs). There was strong evidence for a dose-response relationship in the total patient cohort (BFs>20), which was lacking in single-fraction SBRT (BFs<3). Using the PTV encompassing dose or maximum (isocentric) dose, our data indicated a LQ-L transition dose (dT) at 11 Gy (68% CI 8-14 Gy) or 22 Gy (14-42 Gy), respectively. However, the fit of the LQ-L models was not significantly better than a fit without the dT parameter (p=0.07, BF=2.1 and p=0.86, BF=0.8, respectively). Generally, isocentric doses resulted in much better dose-response relationships than PTV encompassing doses (BFs>20). Our data suggest accurate modeling of local tumor control in fractionated SBRT for stage I NSCLC with the traditional LQ formalism. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
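The biologically effective dose against which the study models TCP follows directly from the LQ formalism. A minimal sketch with the study's assumed α/β of 10 Gy:

```python
def bed(n_fractions, dose_per_fraction, alpha_beta=10.0):
    """Biologically effective dose (Gy) under the linear-quadratic model:
    BED = n * d * (1 + d / (alpha/beta))."""
    d = dose_per_fraction
    return n_fractions * d * (1.0 + d / alpha_beta)

# Median schedule from the study: 3 fractions of 12.5 Gy, alpha/beta = 10 Gy.
print(bed(3, 12.5))  # 84.375
```

The LQ-L variant tested in the paper would switch to a linear survival term above the transition dose dT; the comparison reported above found no significant improvement from adding that parameter.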

  6. Approaches for characterizing threshold dose-response relationships for DNA-damage pathways involved in carcinogenicity in vivo and micronuclei formation in vitro.

    PubMed

    Clewell, Rebecca A; Andersen, Melvin E

    2016-05-01

Assessing the shape of dose-response curves for DNA damage in cellular systems, and for the consequences of DNA damage in intact animals, remains a controversial topic. This overview looks at aspects of the pharmacokinetics (PK) and pharmacodynamics (PD) of cellular DNA damage/repair and their role in defining the shape of dose-response curves, using an in vivo example with formaldehyde and in vitro examples for micronuclei (MN) formation with several test compounds. Formaldehyde is both strongly mutagenic and an endogenous metabolite in cells. With increasing inhaled concentrations, there were transitions in gene changes, from activation of selective stress pathway genes at low concentrations to activation of pathways for cell-cycle control, p53-DNA damage, and stem cell niche pathways at higher exposures. These gene expression changes were more consistent with dose-dependent transitions in the PD responses to formaldehyde in epithelial cells in the intact rat than with the low-dose linear extrapolation methods currently used for carcinogens. However, more complete PD explanations of non-linear dose-response for the creation of fixed damage in cells require detailed examination of cellular responses in vitro, using measures of DNA damage and repair that are not easily accessible in the intact animal. In the second section of the article, we illustrate an approach from our laboratory that develops fit-for-purpose in vitro assays and evaluates the PD of DNA damage and repair through studies using prototypical DNA-damaging agents. Examination of a broad range of responses in these cells showed that transcriptional upregulation of cell cycle control and DNA repair pathways only occurred at doses higher than those causing overt fixed damage, measured as MN formation. Lower levels of damage appear to be handled by post-translational repair processes using pre-existing proteins. In-depth evaluation of the PD properties of one such post-translational process (formation of

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Calabrese, Edward J.

This paper assesses historical reasons that may account for the marginalization of hormesis as a dose-response model in the biomedical sciences in general and toxicology in particular. The most significant and enduring explanatory factors are the early and close association of the concept of hormesis with the highly controversial medical practice of homeopathy, and the difficulty of assessing hormesis with the high-dose testing protocols that have dominated the discipline of toxicology, especially regulatory toxicology. The long-standing and intensely acrimonious conflict between homeopathy and 'traditional' medicine (allopathy) led to the exclusion of the hormesis concept from a vast array of medical- and public health-related activities, including research, teaching, grant funding, publishing, professional societal meetings, and regulatory initiatives of governmental agencies and their advisory bodies. Recent publications indicate that the hormetic dose-response is far more common and fundamental than the dose-response models [threshold/linear no threshold (LNT)] used in toxicology and risk assessment, and by governmental regulatory agencies in the establishment of exposure standards for workers and the general public. Acceptance of the possibility of hormesis has the potential to profoundly affect the practice of toxicology and risk assessment, especially with respect to carcinogen assessment.

  8. Initial analyses of the relationship between 'Thresholds' of toxicity for individual chemicals and 'Interaction Thresholds' for chemical mixtures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Raymond S.H.; Dennison, James E.

    2007-09-01

The inter-relationship of 'Thresholds' between chemical mixtures and their respective component single chemicals was studied using three sets of data and two types of analyses. Two in vitro data sets involve cytotoxicity in human keratinocytes from treatment with metals and a metal mixture [Bae, D.S., Gennings, C., Carter, Jr., W.H., Yang, R.S.H., Campain, J.A., 2001. Toxicological interactions among arsenic, cadmium, chromium, and lead in human keratinocytes. Toxicol. Sci. 63, 132-142; Gennings, C., Carter, Jr., W.H., Campain, J.A., Bae, D.S., Yang, R.S.H., 2002. Statistical analysis of interactive cytotoxicity in human epidermal keratinocytes following exposure to a mixture of four metals. J. Agric. Biol. Environ. Stat. 7, 58-73], and induction of the estrogen receptor alpha (ER-α) reporter gene in MCF-7 human breast cancer cells by estrogenic xenobiotics [Gennings, C., Carter, Jr., W.H., Carney, E.W., Charles, G.D., Gollapudi, B.B., Carchman, R.A., 2004. A novel flexible approach for evaluating fixed ratio mixtures of full and partial agonists. Toxicol. Sci. 80, 134-150]. The third data set came from PBPK modeling of gasoline and its components in the human. For in vitro cellular responses, we employed Benchmark Dose Software (BMDS) to obtain BMD01, BMD05, and BMD10. We then plotted these BMDs against exposure concentrations for the chemical mixture and its components to assess the ranges and slopes of these BMD-concentration lines. In doing so, we consider certain BMDs to be 'Interaction Thresholds' or 'Thresholds' for mixtures and their component single chemicals, and the slope of the line must be a reflection of the potency of the biological effects. For in vivo PBPK modeling, we used 0.1x TLVs, TLVs, and 10x TLVs for gasoline and six component markers as input dosing for PBPK modeling. In this case, the venous blood levels under the hypothetical exposure conditions become our designated 'Interaction Thresholds' or 'Thresholds' for

  9. What is a food and what is a medicinal product in the European Union? Use of the benchmark dose (BMD) methodology to define a threshold for "pharmacological action".

    PubMed

    Lachenmeier, Dirk W; Steffen, Christian; el-Atma, Oliver; Maixner, Sibylle; Löbell-Behrends, Sigrid; Kohl-Himmelseher, Matthias

    2012-11-01

The decision criterion for the demarcation between foods and medicinal products in the EU is significant "pharmacological action". Based on six examples of substances with ambivalent status, the benchmark dose (BMD) method is evaluated to provide a threshold for pharmacological action. Using significant dose-response models from literature clinical trial data or epidemiology, the BMD values were 63 mg/day for caffeine, 5 g/day for alcohol, 6 mg/day for lovastatin, 769 mg/day for glucosamine sulfate, 151 mg/day for Ginkgo biloba extract, and 0.4 mg/day for melatonin. The examples of caffeine and alcohol validate the approach, because intake above the BMD clearly exhibits pharmacological action. Nevertheless, due to uncertainties in dose-response modelling, as well as the need for additional uncertainty factors to account for differences in sensitivity within the human population, a "borderline range" on the dose-response curve remains. "Pharmacological action" has proven to be not very well suited as a binary decision criterion between foods and medicinal products. The European legislator should rethink the definition of medicinal products, as the current situation, based on complicated case-by-case decisions on pharmacological action, leads to an unregulated market flooded with potentially illegal food supplements. Copyright © 2012 Elsevier Inc. All rights reserved.
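The BMD is the dose at which a fitted dose-response model first reaches the chosen benchmark response increment over background. A minimal sketch using bisection on a monotone model; the linear example model and BMR value are illustrative, not taken from the paper:

```python
def benchmark_dose(response, bmr, lo=0.0, hi=1e4, tol=1e-9):
    """Find the smallest dose at which the (monotone, increasing) modelled
    response increment over background reaches the benchmark response
    (BMR), by bisection on [lo, hi]."""
    baseline = response(lo)

    def excess(d):
        return response(d) - baseline - bmr

    if excess(hi) < 0:
        raise ValueError("BMR not reached within [lo, hi]")
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if excess(mid) < 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Hypothetical linear dose-response: 2% extra response per mg/day;
# a BMR of 10% extra response then gives BMD = 5 mg/day.
print(round(benchmark_dose(lambda d: 0.02 * d, 0.10), 6))  # 5.0
```

In practice the response function would come from a fitted clinical or epidemiological dose-response model, and the residual "borderline range" discussed above reflects uncertainty in that fit rather than in the root-finding step.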

  10. Modelling the regulatory system for diabetes mellitus with a threshold window

    NASA Astrophysics Data System (ADS)

    Yang, Jin; Tang, Sanyi; Cheke, Robert A.

    2015-05-01

Piecewise (or non-smooth) glucose-insulin models with threshold windows for type 1 and type 2 diabetes mellitus are proposed and analyzed with a view to improving understanding of the glucose-insulin regulatory system. For glucose-insulin models with a single threshold, the existence and stability of regular, virtual and pseudo-equilibria and of tangent points are addressed. Then the relations between regular equilibria and a pseudo-equilibrium are studied. Furthermore, sufficient and necessary conditions for the global stability of regular equilibria and the pseudo-equilibrium are provided using qualitative analysis techniques for non-smooth Filippov dynamic systems. Sliding bifurcations related to boundary node bifurcations are investigated with theoretical and numerical techniques, and clinical insulin therapies are discussed. For glucose-insulin models with a threshold window, the effects of glucose thresholds or of the widths of threshold windows on the durations of insulin therapy and glucose infusion are addressed. The duration of the effects of an insulin injection is sensitive to the variation of thresholds. Our results indicate that blood glucose levels can be maintained within a normal range using piecewise glucose-insulin models with a single threshold or a threshold window. Moreover, our findings suggest that it is critical to individualise insulin therapy for each patient separately, based on initial blood glucose levels.
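The single-threshold behaviour described above can be illustrated with a toy Filippov-type simulation in which insulin infusion switches on only while glucose exceeds the threshold; the trajectory then slides along the switching surface (a pseudo-equilibrium). All rate constants here are illustrative, not taken from the paper:

```python
def simulate(g0=10.0, i0=0.0, g_threshold=6.0, dt=0.001, t_end=40.0):
    """Euler simulation of a toy piecewise glucose-insulin system:
    infusion u is on only while glucose g exceeds the threshold.
    dg/dt = production - insulin-mediated uptake; di/dt = u - clearance.
    Returns the final glucose value."""
    g, ins = g0, i0
    for _ in range(int(t_end / dt)):
        u = 2.0 if g > g_threshold else 0.0   # threshold switching policy
        dg = 1.0 - 0.5 * ins * g              # production minus uptake
        di = u - 1.0 * ins                    # infusion minus clearance
        g += dg * dt
        ins += di * dt
    return g

# After the initial transient, the trajectory settles onto the switching
# surface g = 6 (sliding mode at the pseudo-equilibrium).
print(round(simulate(), 1))  # 6.0
```

The sliding mode exists here because, on the surface, an intermediate "equivalent" infusion rate between the on and off values exactly balances glucose production against uptake, which is the mechanism behind the pseudo-equilibria analyzed in the paper.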

  11. Scene text detection via extremal region based double threshold convolutional network classification

    PubMed Central

    Zhu, Wei; Lou, Jing; Chen, Longtao; Xia, Qingyuan

    2017-01-01

In this paper, we present a robust text detection approach for natural images based on a region proposal mechanism. A powerful low-level detector named saliency-enhanced MSER, extended from the widely used MSER, is proposed by incorporating saliency detection methods, which ensures a high recall rate. Given a natural image, character candidates are extracted from three channels of a perception-based illumination-invariant color space by the saliency-enhanced MSER algorithm. A discriminative convolutional neural network (CNN) is jointly trained with multi-level information, including pixel-level and character-level information, as a character candidate classifier. Each image patch is classified as strong text, weak text or non-text by double threshold filtering instead of conventional one-step classification, leveraging confidence scores obtained via the CNN. To further prune non-text regions, we develop a recursive neighborhood search algorithm to track credible texts from the weak text set. Finally, characters are grouped into text lines using heuristic features such as spatial location, size, color, and stroke width. We compare our approach with several state-of-the-art methods, and experiments show that our method achieves competitive performance on the public datasets ICDAR 2011 and ICDAR 2013. PMID:28820891
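The double-threshold step can be sketched as hysteresis filtering over a region adjacency graph: strong candidates seed a search that promotes connected weak candidates, while isolated low-score regions are discarded. The threshold values and toy graph below are illustrative, not the paper's:

```python
from collections import deque

def double_threshold_filter(scores, edges, t_high=0.7, t_low=0.4):
    """Classify candidate regions by confidence score into strong
    (>= t_high), weak (in [t_low, t_high)) and non-text (< t_low), then
    keep weak regions reachable from a strong one through weak
    neighbours -- a sketch of double-threshold filtering followed by a
    neighbourhood search over the weak set."""
    strong = {i for i, s in scores.items() if s >= t_high}
    weak = {i for i, s in scores.items() if t_low <= s < t_high}
    kept, queue = set(strong), deque(strong)
    while queue:
        cur = queue.popleft()
        for nb in edges.get(cur, ()):
            if nb in weak and nb not in kept:
                kept.add(nb)
                queue.append(nb)
    return kept

# Region 1 is weak but touches strong region 0, so it is kept;
# region 2 is below t_low and is discarded.
scores = {0: 0.9, 1: 0.5, 2: 0.2}
edges = {0: [1], 1: [0, 2], 2: [1]}
print(sorted(double_threshold_filter(scores, edges)))  # [0, 1]
```

The same two-threshold idea appears in Canny edge tracking; its benefit here is that borderline CNN scores are resolved by context rather than by a single hard cut-off.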

  12. Non-Markovian Infection Spread Dramatically Alters the Susceptible-Infected-Susceptible Epidemic Threshold in Networks

    NASA Astrophysics Data System (ADS)

    Van Mieghem, P.; van de Bovenkamp, R.

    2013-03-01

Most studies on susceptible-infected-susceptible epidemics in networks implicitly assume Markovian behavior: the time to infect a direct neighbor is exponentially distributed. Much effort so far has been devoted to characterizing and precisely computing the epidemic threshold in susceptible-infected-susceptible Markovian epidemics on networks. Here, we report the rather dramatic effect of a non-exponential infection time (while still assuming an exponential curing time) on the epidemic threshold by considering Weibullean infection times with the same mean but different power exponent α. For three basic classes of graphs, the Erdős-Rényi random graph, scale-free graphs and lattices, the average steady-state fraction of infected nodes is simulated, from which the epidemic threshold is deduced. For all graph classes, the epidemic threshold increases significantly with the power exponent α. Hence, real epidemics that violate the exponential or Markovian assumption can behave very differently than anticipated based on Markov theory.
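Holding the mean infection time fixed while varying the Weibull power exponent α, as in the comparison above, amounts to rescaling the Weibull scale parameter via the mean formula E[T] = b·Γ(1 + 1/α). A minimal sketch:

```python
import math

def weibull_scale_for_mean(shape_alpha, mean):
    """Scale parameter b of a Weibull(shape=alpha, scale=b) distribution
    whose mean equals `mean`, using E[T] = b * Gamma(1 + 1/alpha)."""
    return mean / math.gamma(1.0 + 1.0 / shape_alpha)

# alpha = 1 recovers the exponential (Markovian) case: scale == mean.
print(weibull_scale_for_mean(1.0, 2.0))  # 2.0
```

With the mean pinned this way, any shift in the simulated epidemic threshold is attributable to the shape of the infection-time distribution alone, which is the point of the comparison in the paper.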

  13. Estimation of the diagnostic threshold accounting for decision costs and sampling uncertainty.

    PubMed

    Skaltsa, Konstantina; Jover, Lluís; Carrasco, Josep Lluís

    2010-10-01

Medical diagnostic tests are used to classify subjects as non-diseased or diseased. The classification rule usually consists of classifying subjects using the values of a continuous marker that is dichotomised by means of a threshold. Here, the optimum threshold estimate is found by minimising a cost function that accounts for both decision costs and sampling uncertainty. The cost function is optimised either analytically in a normal-distribution setting, or empirically in a distribution-free setting when the underlying probability distributions of diseased and non-diseased subjects are unknown. Inference on the threshold estimates is based on approximate analytical standard errors and on bootstrap-based approaches. The performance of the proposed methodology is assessed by means of a simulation study, and the sample size required for a given confidence interval precision and sample size ratio is also calculated. Finally, a case example based on previously published data concerning the diagnosis of Alzheimer's patients is provided to illustrate the procedure.
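In the normal-distribution setting, the decision-cost part of the criterion can be written down directly and minimised numerically. A grid-scan sketch, ignoring the sampling-uncertainty term the paper also includes; the cost weights and distribution parameters are illustrative:

```python
import math

def norm_cdf(x, mu, sd):
    """Normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sd * math.sqrt(2.0))))

def optimal_threshold(mu0, sd0, mu1, sd1, prev, c_fp=1.0, c_fn=1.0):
    """Threshold t minimising the expected decision cost
    cost(t) = c_fn * prev * P(diseased marker < t)
            + c_fp * (1 - prev) * P(non-diseased marker > t),
    found here by a simple grid scan (sketch only)."""
    lo = min(mu0, mu1) - 4 * max(sd0, sd1)
    hi = max(mu0, mu1) + 4 * max(sd0, sd1)
    best_t, best_cost, steps = lo, float("inf"), 4000
    for k in range(steps + 1):
        t = lo + (hi - lo) * k / steps
        cost = (c_fn * prev * norm_cdf(t, mu1, sd1)
                + c_fp * (1.0 - prev) * (1.0 - norm_cdf(t, mu0, sd0)))
        if cost < best_cost:
            best_t, best_cost = t, cost
    return best_t

# Symmetric case: equal costs and prevalence, marker means 0 and 2, sd 1;
# the optimum sits midway between the two means, at t = 1.
print(round(optimal_threshold(0.0, 1.0, 2.0, 1.0, 0.5), 2))  # 1.0
```

Raising c_fn (the cost of missing a diseased subject) shifts the optimum toward lower thresholds, i.e., toward higher sensitivity, which is the basic trade-off the cost function encodes.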

  14. Heavy particle irradiation, neurochemistry and behavior: thresholds, dose-response curves and recovery of function

    NASA Astrophysics Data System (ADS)

    Rabin, B.; Joseph, J.; Shukitt-Hale, B.

Exposure to heavy particles can affect the functioning of the central nervous system (CNS), particularly the dopaminergic system. In turn, the radiation-induced disruption of dopaminergic function disrupts a variety of behaviors that are dependent upon the integrity of the dopaminergic system, including motor behavior (upper body strength), amphetamine (dopamine)-mediated taste aversion learning, spatial learning and memory (Morris water maze), and operant conditioning (fixed-ratio bar pressing). Although the relationships between heavy particle irradiation and the effects of exposure depend, to some extent, upon the specific behavioral or neurochemical endpoint under consideration, a review of the available research leads to the hypothesis that the endpoints mediated by the CNS have certain characteristics in common. These include: (1) a threshold, below which there is no apparent effect; (2) the lack of a dose-response relationship, or an extremely steep dose-response curve, depending on the particular endpoint; and (3) the absence of recovery of function, such that the heavy particle-induced behavioral and neural changes are present when tested up to one year following exposure. The current presentation will review the data relevant to the degree to which these characteristics are in fact common to neurochemical and behavioral endpoints that are mediated by the effects of exposure to heavy particles on CNS activity. Supported by NASA Grant NAG9-1190.

  15. Discontinuous non-equilibrium phase transition in a threshold Schloegl model for autocatalysis: Generic two-phase coexistence and metastability

    NASA Astrophysics Data System (ADS)

    Wang, Chi-Jen; Liu, Da-Jiang; Evans, James W.

    2015-04-01

    Threshold versions of Schloegl's model on a lattice, which involve autocatalytic creation and spontaneous annihilation of particles, can provide a simple prototype for discontinuous non-equilibrium phase transitions. These models are equivalent to so-called threshold contact processes. A discontinuous transition between populated and vacuum states can occur selecting a threshold of N ≥ 2 for the minimum number, N, of neighboring particles enabling autocatalytic creation at an empty site. Fundamental open questions remain given the lack of a thermodynamic framework for analysis. For a square lattice with N = 2, we show that phase coexistence occurs not at a unique value but for a finite range of particle annihilation rate (the natural control parameter). This generic two-phase coexistence also persists when perturbing the model to allow spontaneous particle creation. Such behavior contrasts both the Gibbs phase rule for thermodynamic systems and also previous analysis for this model. We find metastability near the transition corresponding to a non-zero effective line tension, also contrasting previously suggested critical behavior. Mean-field type analysis, extended to treat spatially heterogeneous states, further elucidates model behavior.
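The bistability behind the discontinuous transition shows up already in a simple mean-field caricature of the N = 2 threshold model, where autocatalytic creation requires two occupied neighbours and therefore enters quadratically in the density. A sketch (this mean-field equation is a standard simplification, not the lattice analysis of the paper):

```python
def steady_density(p, rho0, dt=0.01, t_end=200.0):
    """Euler integration of a mean-field sketch of the N = 2 threshold
    Schloegl model: d(rho)/dt = rho**2 * (1 - rho) - p * rho, with
    annihilation rate p. The quadratic birth term mimics the requirement
    of two occupied neighbours for autocatalytic creation."""
    rho = rho0
    for _ in range(int(t_end / dt)):
        rho += (rho * rho * (1.0 - rho) - p * rho) * dt
    return rho

# For p = 0.2 the populated branch is rho = (1 + sqrt(1 - 4p))/2 ~ 0.724,
# while initial densities below the unstable root (~0.276) decay to the
# vacuum state -- two coexisting attractors, hence a discontinuous transition.
print(round(steady_density(0.2, 0.90), 3))  # 0.724
print(round(steady_density(0.2, 0.05), 3))  # 0.0
```

In this mean-field picture the populated branch disappears abruptly at p = 1/4; the paper's lattice results refine this into generic two-phase coexistence over a finite range of p rather than at a single point.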

  16. Novel threshold pressure sensors based on nonlinear dynamics of MEMS resonators

    NASA Astrophysics Data System (ADS)

    Hasan, Mohammad H.; Alsaleem, Fadi M.; Ouakad, Hassen M.

    2018-06-01

Triggering an alarm in a car for low air pressure in a tire, or tripping an HVAC compressor if the refrigerant pressure falls below a threshold value, are examples of applications where measuring the amount of pressure is not as important as determining whether the pressure has crossed a threshold value for an action to occur. Unfortunately, current technology still relies on analog pressure sensors to perform this functionality by adding a complex interface (extra circuitry, controllers, and/or decision units). In this paper, we demonstrate two new smart tunable-threshold pressure switch concepts that can reduce the complexity of a threshold pressure sensor. The first concept is based on the nonlinear subharmonic resonance of a straight double cantilever microbeam with a proof mass, and the other is based on the snap-through bi-stability of a clamped-clamped MEMS shallow arch. In both designs, the sensor operation concept is simple. Any actuation performed at a pressure lower than the threshold value will activate a nonlinear dynamic behavior (subharmonic resonance or snap-through bi-stability) yielding a large output that would be interpreted as a logic value of ONE, or ON. Once the pressure exceeds the threshold value, the nonlinear response ceases to exist, yielding a small output that would be interpreted as a logic value of ZERO, or OFF. A lumped, single degree of freedom model for the double cantilever beam, which is validated using experimental data, and a continuous beam model for the arch beam are used to simulate the operation range of the proposed sensors by identifying the relationship between the excitation signal and the critical cut-off pressure.

  17. Non-targeted effects of ionizing radiation–implications for low dose risk

    PubMed Central

    Kadhim, Munira; Salomaa, Sisko; Wright, Eric; Hildebrandt, Guido; Belyakov, Oleg V.; Prise, Kevin M.; Little, Mark P.

    2014-01-01

    Non-DNA targeted effects of ionizing radiation, which include genomic instability, and a variety of bystander effects including abscopal effects and bystander mediated adaptive response, have raised concerns about the magnitude of low-dose radiation risk. Genomic instability, bystander effects and adaptive responses are powered by fundamental, but not clearly understood systems that maintain tissue homeostasis. Despite excellent research in this field by various groups, there are still gaps in our understanding of the likely mechanisms associated with non-DNA targeted effects, particularly with respect to systemic (human health) consequences at low and intermediate doses of ionizing radiation. Other outstanding questions include links between the different non-targeted responses and the variations in response observed between individuals and cell lines, possibly a function of genetic background. Furthermore, it is still not known what the initial target and early interactions in cells are that give rise to non-targeted responses in neighbouring or descendant cells. This paper provides a commentary on the current state of the field as a result of the Non-targeted effects of ionizing radiation (NOTE) Integrated Project funded by the European Union. Here we critically examine the evidence for non-targeted effects, discuss apparently contradictory results and consider implications for low-dose radiation health effects. PMID:23262375

  18. Evolution of TUNEL-labeling in the rat lens after in vivo exposure to just above threshold dose UVB.

    PubMed

    Kronschläger, Martin; Yu, Zhaohua; Talebizadeh, Nooshin; Meyer, Linda M; Hallböök, Finn; Söderberg, Per G

    2013-08-01

To quantitatively analyse the evolution of TUNEL-labeling after in vivo exposure to UVB. Altogether, 16 Sprague Dawley rats were unilaterally exposed in vivo for 15 min to a close-to-threshold dose, 5 kJ/m(2), of ultraviolet radiation in the 300 nm wavelength region. Animals were sacrificed in groups of 4 at 1, 5, 24 and 120 h after exposure. For each animal, both eye globes were removed and frozen. The frozen eyes were cryo-sectioned in 10 µm thick midsagittal sections. From each globe, three midsagittal sections, with at least five sections' interval in between, were mounted on a microscope slide. Sections were TUNEL-labeled and counterstained with DAPI. For quantification of apoptosis, a fluorescence microscope was used. In sections with a continuous epithelial cell surface, the number of lens epithelial cell nuclei and the number of TUNEL-positive epithelial cell nuclei were counted. The total number of TUNEL-positive epithelial cell nuclei for all three sections of one lens, in relation to the total number of epithelial cell nuclei for all three sections of the same lens, was compared between the exposed and the contralateral non-exposed lens for each animal. The relative difference in the fraction of TUNEL-positive nuclei between the exposed and contralateral non-exposed lens increased gradually, peaked in the time interval 5-120 h after exposure, and then declined. A close-to-threshold dose of UVB induces TUNEL-labeling that peaks in the time window 5-120 h after exposure.

  19. Variable-Threshold Threshold Elements,

    DTIC Science & Technology

A threshold element is a mathematical model of certain types of logic gates and of a biological neuron. Much work has been done on threshold elements with fixed thresholds; this study concerns itself with elements in which the threshold may be varied: variable-threshold threshold elements. Physical realizations include resistor-transistor elements, in which the threshold is simply a voltage. Variation of the threshold causes the

  20. A critique of the use of indicator-species scores for identifying thresholds in species responses

    USGS Publications Warehouse

    Cuffney, Thomas F.; Qian, Song S.

    2013-01-01

Identification of ecological thresholds is important for both theoretical and applied ecology. Recently, Baker and King (2010; King and Baker 2010) proposed a method, threshold indicator analysis (TITAN), to calculate species and community thresholds based on indicator species scores adapted from Dufrêne and Legendre (1997). We tested the ability of TITAN to detect thresholds using models with (broken-stick, disjointed broken-stick, dose-response, step-function, Gaussian) and without (linear) definitive thresholds. TITAN accurately and consistently detected thresholds in step-function models, but not in models characterized by abrupt changes in response slopes or response direction. Threshold detection in TITAN was very sensitive to the distribution of 0 values, which caused TITAN to identify thresholds associated with relatively small differences in the distribution of 0 values while ignoring thresholds associated with large changes in abundance. Threshold identification and tests of statistical significance were based on the same data permutations, resulting in inflated estimates of statistical significance. Application of bootstrapping to the split-point problem that underlies TITAN led to underestimates of the confidence intervals of thresholds. Bias in the derivation of the z-scores used to identify TITAN thresholds, and skewness in the distribution of data along the gradient, produced TITAN thresholds that were much more similar than the actual thresholds. This tendency may account for the synchronicity of thresholds reported in TITAN analyses. The thresholds identified by TITAN represented disparate characteristics of species responses which, coupled with the inability of TITAN to identify thresholds accurately and consistently, do not support the aggregation of individual species thresholds into a community threshold.
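The split-point problem underlying the critique above can be made concrete with a minimal change-point search: choose the split that best fits a step-function (piecewise-constant) model by minimising the pooled sum of squared errors. This is a sketch of the split-point idea only, without TITAN's indicator scores or permutation/bootstrap machinery:

```python
def step_threshold(x, y):
    """Locate the change point that best fits a step-function model:
    for each candidate split, fit a constant (the mean) to each side and
    pick the split minimising the pooled sum of squared errors. Returns
    the x value of the first point in the right-hand segment."""
    def sse(vals):
        if not vals:
            return 0.0
        m = sum(vals) / len(vals)
        return sum((v - m) ** 2 for v in vals)

    best_i, best = 1, float("inf")
    for i in range(1, len(y)):           # split: left = y[:i], right = y[i:]
        total = sse(y[:i]) + sse(y[i:])
        if total < best:
            best_i, best = i, total
    return x[best_i]

# Perfect step: abundances jump from 0 to 5 at gradient value 10.
xs = list(range(20))
ys = [0.0] * 10 + [5.0] * 10
print(step_threshold(xs, ys))  # 10
```

As the paper's simulations illustrate, such a split is well defined for step-like responses but becomes ambiguous for gradual slope changes, which is one root of the inconsistencies reported for TITAN.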

  1. Position Estimation for Switched Reluctance Motor Based on the Single Threshold Angle

    NASA Astrophysics Data System (ADS)

    Zhang, Lei; Li, Pang; Yu, Yue

    2017-05-01

    This paper presents a position estimation model for a switched reluctance motor based on a single threshold angle. Given the relationship between inductance and rotor position, the position is estimated by comparing the real-time dynamic flux linkage with the flux linkage at the threshold-angle position (7.5° threshold angle, 12/8 SRM). The sensorless model is built in Matlab/Simulink, simulations are carried out under both steady-state and transient conditions, and the validity and feasibility of the method are verified.
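
    A minimal sketch of the flux-linkage comparison behind such a scheme, with invented numbers (the paper's actual model is built in Matlab/Simulink): the phase flux linkage is estimated by integrating v − Ri, and commutation is flagged once the running estimate reaches the flux linkage stored for the threshold-angle position.

```python
# Sketch of single-threshold-angle detection for an SRM phase.
# Flux linkage: psi(t) = integral of (v - R*i) dt, by trapezoidal rule.
# All numbers are illustrative, not motor parameters from the paper.
def estimate_flux(v_samples, i_samples, R, dt):
    """Trapezoidal integration of (v - R*i) to estimate flux linkage."""
    psi, prev, out = 0.0, None, []
    for v, i in zip(v_samples, i_samples):
        e = v - R * i
        if prev is not None:
            psi += 0.5 * (e + prev) * dt
        prev = e
        out.append(psi)
    return out

def crossed_threshold(psi_estimate, psi_threshold):
    """True once the estimated flux reaches the threshold-angle flux."""
    return psi_estimate >= psi_threshold

# Constant net EMF of 2 V for 10 ms -> flux ramps to about 0.02 Wb.
dt = 1e-3
v = [3.0] * 11
i = [1.0] * 11                       # with R = 1 ohm, e = v - R*i = 2 V
psi_trace = estimate_flux(v, i, R=1.0, dt=dt)
fired = crossed_threshold(psi_trace[-1], psi_threshold=0.015)
```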

  2. Hydroxyisohexyl 3-cyclohexene carboxaldehyde allergy: relationship between patch test and repeated open application test thresholds.

    PubMed

    Fischer, L A; Menné, T; Avnstorp, C; Kasting, G B; Johansen, J D

    2009-09-01

    Hydroxyisohexyl 3-cyclohexene carboxaldehyde (HICC) is a synthetic fragrance ingredient. Case reports of allergy to HICC appeared in the 1980s, and HICC has recently been included in the European baseline series. Human elicitation dose-response studies performed with different allergens have shown a significant relationship between the patch-test threshold and the repeated open application test (ROAT) threshold, which mimics some real-life exposure situations. Fragrance ingredients are special as significant amounts of allergen may evaporate from the skin. The study aimed to investigate the relationship between elicitation threshold doses at the patch test and the ROAT, using HICC as the allergen. The expected evaporation rate was calculated. Seventeen HICC-allergic persons were tested with a dilution series of HICC in a patch test and a ROAT (duration up to 21 days). Seventeen persons with no HICC allergy were included as a control group for the ROAT. The response frequency to the ROAT (in microg HICC cm(-2) per application) was significantly higher than the response frequency to the patch test at one of the tested doses. Furthermore, the response rate to the accumulated ROAT dose was significantly lower at half of the doses compared with the patch test. The evaporation rate of HICC was calculated to be 72% over a 24-h period. The ROAT threshold in dose per area per application is lower than the patch test threshold; furthermore, the accumulated ROAT threshold is higher than the patch test threshold, which can probably be explained by the evaporation of HICC from the skin in the open test.

  3. The effect of target-controlled infusion of low-dose ketamine on heat pain and temporal summation threshold.

    PubMed

    Lee, Joon-Ho; Cho, Sung-Hwan; Kim, Sang-Hyun; Chae, Won-Soek; Jin, Hee-Cheol; Lee, Jeong-Seok; Kim, Yong-Ik

    2011-08-01

    We investigated the heat pain threshold (HPT) and temporal summation threshold (TST) before and after target-controlled infusion (TCI) of ketamine with an effect-site concentration (Ce) of 30 and 60 ng/ml. Healthy young volunteers (n = 20) were enrolled. A thermode was applied to the volar side of each volunteer's right forearm, and HPT and TST were measured before and after TCI of ketamine. Vital signs and psychedelic effects according to ketamine infusion were also observed before and after TCI of ketamine. Mean HPT after TCI of ketamine with a Ce of 30 and 60 ng/ml did not increase significantly. However, mean TST after TCI of ketamine with a Ce of 30 and 60 ng/ml increased significantly, in a dose-dependent fashion, compared with the value before ketamine TCI. Vital signs showed no significant difference before and after ketamine TCI. The visual analog scale score of psychedelic symptoms was higher with a Ce of 60 ng/ml than with 30 ng/ml. TCI of ketamine with a Ce of 30 and 60 ng/ml increased TST but not HPT.

  4. Comparison of pressure pain threshold, grip strength, dexterity and touch pressure of dominant and non-dominant hands within and between right- and left-handed subjects.

    PubMed

    Ozcan, Ayse; Tulum, Zeliha; Pinar, Lamia; Başkurt, Ferdi

    2004-12-01

    This study was done to evaluate differences in pressure pain threshold, grip strength, manual dexterity and touch pressure threshold in the dominant and non-dominant hands of right- and left-handed subjects, and to compare findings within and between these groups. Thirty-nine right-handed and twenty-one left-handed subjects participated in the study. Pressure pain threshold was assessed using a dolorimeter, grip strength was assessed with a hand-grip dynamometer, manual dexterity was evaluated using the VALPAR Component Work Sample-4 system, and touch pressure threshold was determined using Semmes Weinstein monofilaments. Results for the dominant and non-dominant hands were compared within and between the groups. In the right-handed subjects, the dominant hand was significantly faster with the VALPAR Component Work Sample-4, showed significantly greater grip strength, and had a significantly higher pressure pain threshold than the non-dominant hand. The corresponding results for the two hands were similar in the left-handed subjects. The study revealed asymmetrical manual performance in grip strength, manual dexterity and pressure pain threshold in right-handed subjects, but no such asymmetries in left-handed subjects.

  5. Non-equilibrium Numerical Analysis of Microwave-supported Detonation Threshold Propagating through Diatomic Gas

    NASA Astrophysics Data System (ADS)

    Shiraishi, Hiroyuki

    2015-09-01

    Microwave-supported Detonation (MSD), one type of Microwave-supported Plasma (MSP), is considered as one of the most important phenomena because it can generate high pressure and high temperature for beam-powered space propulsion systems. In this study, I numerically simulate MSD waves propagating through a diatomic gas. In order to evaluate the threshold of beam intensity, I use the physical-fluid dynamics scheme, which has been developed for simulating unsteady and non-equilibrium LSD waves propagating through a hydrogen gas.

  6. 48 CFR 1352.213-71 - Instructions for submitting quotations under the simplified acquisition threshold-non-commercial.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 5 2011-10-01 2011-10-01 false Instructions for submitting quotations under the simplified acquisition threshold-non-commercial. 1352.213-71 Section 1352.213-71 Federal Acquisition Regulations System DEPARTMENT OF COMMERCE CLAUSES AND FORMS SOLICITATION PROVISIONS AND CONTRACT CLAUSES Text of Provisions...

  7. 48 CFR 1352.213-71 - Instructions for submitting quotations under the simplified acquisition threshold-non-commercial.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 5 2012-10-01 2012-10-01 false Instructions for submitting quotations under the simplified acquisition threshold-non-commercial. 1352.213-71 Section 1352.213-71 Federal Acquisition Regulations System DEPARTMENT OF COMMERCE CLAUSES AND FORMS SOLICITATION PROVISIONS AND CONTRACT CLAUSES Text of Provisions...

  8. 48 CFR 1352.213-71 - Instructions for submitting quotations under the simplified acquisition threshold-non-commercial.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 5 2014-10-01 2014-10-01 false Instructions for submitting quotations under the simplified acquisition threshold-non-commercial. 1352.213-71 Section 1352.213-71 Federal Acquisition Regulations System DEPARTMENT OF COMMERCE CLAUSES AND FORMS SOLICITATION PROVISIONS AND CONTRACT CLAUSES Text of Provisions...

  9. 48 CFR 1352.213-71 - Instructions for submitting quotations under the simplified acquisition threshold-non-commercial.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 5 2013-10-01 2013-10-01 false Instructions for submitting quotations under the simplified acquisition threshold-non-commercial. 1352.213-71 Section 1352.213-71 Federal Acquisition Regulations System DEPARTMENT OF COMMERCE CLAUSES AND FORMS SOLICITATION PROVISIONS AND CONTRACT CLAUSES Text of Provisions...

  10. Flood Extent Delineation by Thresholding Sentinel-1 SAR Imagery Based on Ancillary Land Cover Information

    NASA Astrophysics Data System (ADS)

    Liang, J.; Liu, D.

    2017-12-01

    Emergency responses to floods require timely information on water extents that can be produced by satellite-based remote sensing. As SAR images can be acquired under adverse illumination and weather conditions, they are particularly suitable for delineating water extent during a flood event. Thresholding SAR imagery is one of the most widely used approaches to delineate water extent. However, most studies apply only one threshold to separate water and dry land without considering the complexity and variability of different dry land surface types in an image. This paper proposes a new thresholding method for SAR imagery to delineate water from other land cover types. A probability distribution of SAR backscatter intensity is fitted for each land cover type, including water, before a flood event, and the intersection between two distributions is regarded as the threshold to classify the two. To extract water, a set of thresholds is applied to several pairs of land cover types (e.g., water and urban, water and forest). The subsets are then merged to form the water extent for the SAR image acquired during or after the flood. Experiments show that this land-cover-based thresholding approach outperformed traditional single thresholding by about 5% to 15%. Given the broad acceptance of thresholding-based methods and the availability of land cover data, the method has great application potential, especially for heterogeneous regions.
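
    The per-class thresholding step can be sketched as follows, assuming Gaussian backscatter models per land cover type (the class means and standard deviations below are illustrative, not values from the paper). The threshold for each (water, dry-class) pair is the backscatter value where the two fitted densities intersect.

```python
# Per-land-cover threshold as the intersection of two Gaussian PDFs
# fitted to backscatter (dB). Means/sigmas below are made-up examples.
import math

def gaussian_intersection(mu1, s1, mu2, s2):
    """Backscatter value where two Gaussian class PDFs intersect.
    Solves a*x^2 + b*x + c = 0 obtained by equating the log-densities."""
    if s1 == s2:
        return 0.5 * (mu1 + mu2)
    a = 1 / (2 * s2 ** 2) - 1 / (2 * s1 ** 2)
    b = mu1 / s1 ** 2 - mu2 / s2 ** 2
    c = mu2 ** 2 / (2 * s2 ** 2) - mu1 ** 2 / (2 * s1 ** 2) - math.log(s2 / s1)
    d = math.sqrt(b * b - 4 * a * c)
    roots = [(-b + d) / (2 * a), (-b - d) / (2 * a)]
    lo, hi = sorted((mu1, mu2))
    return next(r for r in roots if lo <= r <= hi)  # root between the means

# Water is dark in SAR: one threshold per (water, dry-class) pair.
t_water_forest = gaussian_intersection(-18.0, 2.0, -8.0, 2.0)
t_water_urban = gaussian_intersection(-18.0, 2.0, -2.0, 3.0)

def is_water(sigma0_db, threshold):
    return sigma0_db < threshold
```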

  11. Lead-induced anemia: Dose-response relationships and evidence for a threshold

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schwartz, J.; Landrigan, P.J.; Baker, E.L. Jr.

    1990-02-01

    We conducted a cross-sectional epidemiologic study to assess the association between blood lead level and hematocrit in 579 one to five year-old children living near a primary lead smelter in 1974. Blood lead levels ranged from 0.53 to 7.91 mumol/L (11 to 164 micrograms/dl). To predict hematocrit as a function of blood lead level and age, we derived non-linear regression models and fit percentile curves. We used logistic regression to predict the probability of hematocrit values less than 35 per cent. We found a strong non-linear, dose-response relationship between blood lead level and hematocrit. This relationship was influenced by age, but (in this age group) not by sex; the effect was strongest in the youngest children. In one year-olds, the age group most severely affected, the risk of a hematocrit value below 35 percent was 2 percent above background at blood lead levels between 0.97 and 1.88 mumol/L (20 and 39 micrograms/dl), 18 percent above background at lead levels of 1.93 to 2.85 mumol/L (40 to 59 micrograms/dl), and 40 percent above background at lead levels of 2.9 mumol/L (60 micrograms/dl) and greater; background was defined as a blood lead level below 1.88 mumol/L (20 micrograms/dl). This effect appeared independent of iron deficiency. These findings suggest that blood lead levels close to the currently recommended limit value of 1.21 mumol/L (25 micrograms/dl) are associated with dose-related depression of hematocrit in young children.
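
    The logistic step of such an analysis can be illustrated with a short sketch; the coefficients below are invented for the example and are not the study's fitted values.

```python
# Illustrative logistic dose-response curve of the kind used in the
# study: P(hematocrit < 35%) as a function of blood lead (umol/L).
# Coefficients b0, b1 are hypothetical, chosen only for the sketch.
import math

def p_low_hematocrit(blood_lead_umol, b0=-4.0, b1=1.2):
    """Logistic model: P = 1 / (1 + exp(-(b0 + b1*lead)))."""
    z = b0 + b1 * blood_lead_umol
    return 1.0 / (1.0 + math.exp(-z))

# Risk rises monotonically with blood lead level.
p_background = p_low_hematocrit(1.0)   # near the background range
p_high = p_low_hematocrit(3.0)         # around 60 micrograms/dl
excess = p_high - p_background         # excess risk above background
```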

  12. A low-threshold high-index-contrast grating (HCG)-based organic VCSEL

    NASA Astrophysics Data System (ADS)

    Shayesteh, Mohammad Reza; Darvish, Ghafar; Ahmadi, Vahid

    2015-12-01

    We propose a low-threshold high-index-contrast grating (HCG)-based organic vertical-cavity surface-emitting laser (OVCSEL). The device is designed to allow both electrical and optical excitation. The microcavity of the laser is a hybrid photonic crystal (HPC) in which the top distributed Bragg reflector (DBR) is replaced by a sub-wavelength high-contrast-grating layer, which provides a high quality factor. The simulated quality factor of the microcavity is shown to be as high as 282,000. We also investigate the threshold behavior and the dynamics of the OVCSEL optically pumped with sub-picosecond pulses. Results from numerical simulation show that the lasing threshold is 75 nJ/cm².

  13. Automated segmentation of cardiac visceral fat in low-dose non-contrast chest CT images

    NASA Astrophysics Data System (ADS)

    Xie, Yiting; Liang, Mingzhu; Yankelevitz, David F.; Henschke, Claudia I.; Reeves, Anthony P.

    2015-03-01

    Cardiac visceral fat was segmented from low-dose non-contrast chest CT images using a fully automated method. Cardiac visceral fat is defined as the fatty tissues surrounding the heart region, enclosed by the lungs and posterior to the sternum. It is measured by constraining the heart region with an Anatomy Label Map that contains robust segmentations of the lungs and other major organs and estimating the fatty tissue within this region. The algorithm was evaluated on 124 low-dose and 223 standard-dose non-contrast chest CT scans from two public datasets. Based on visual inspection, 343 cases had good cardiac visceral fat segmentation. For quantitative evaluation, manual markings of cardiac visceral fat regions were made in 3 image slices for 45 low-dose scans and the Dice similarity coefficient (DSC) was computed. The automated algorithm achieved an average DSC of 0.93. Cardiac visceral fat volume (CVFV), heart region volume (HRV) and their ratio were computed for each case. The correlation between cardiac visceral fat measurement and coronary artery and aortic calcification was also evaluated. Results indicated the automated algorithm for measuring cardiac visceral fat volume may be an alternative method to the traditional manual assessment of thoracic region fat content in the assessment of cardiovascular disease risk.
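
    The Dice similarity coefficient used for the quantitative evaluation above is straightforward to compute on binary masks: DSC = 2|A∩B| / (|A| + |B|). A minimal sketch with toy masks:

```python
# Dice similarity coefficient between two flattened binary masks
# (toy data; 1 = fat voxel, 0 = background).
def dice(mask_a, mask_b):
    inter = sum(1 for a, b in zip(mask_a, mask_b) if a and b)
    size = sum(mask_a) + sum(mask_b)
    return 2.0 * inter / size if size else 1.0  # both empty -> perfect

auto = [1, 1, 1, 0, 0, 1, 0, 1]     # automated segmentation
manual = [1, 1, 0, 0, 1, 1, 0, 1]   # manual marking
score = dice(auto, manual)           # 2*4 / (5+5) = 0.8
```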

  14. Conception, fabrication and characterization of a silicon based MEMS inertial switch with a threshold value of 5 g

    NASA Astrophysics Data System (ADS)

    Zhang, Fengtian; Wang, Chao; Yuan, Mingquan; Tang, Bin; Xiong, Zhuang

    2017-12-01

    Most of the MEMS inertial switches developed in recent years are intended for shock and impact sensing with threshold values above 50 g. To meet the requirement of detecting linear acceleration signals at the low-g level, a silicon based MEMS inertial switch with a threshold value of 5 g was designed, fabricated and characterized. The switch consists of a large proof mass supported by circular spiral springs. An analytical model of the structural stiffness of the proposed switch was derived and verified by finite-element simulation. The structure fabrication was based on a customized double-buried-layer silicon-on-insulator wafer and encapsulated by glass wafers. Centrifugal and nanoindentation experiments were performed to measure the threshold value and the structural stiffness. The actual threshold values were measured to be 0.1-0.3 g lower than the pre-designed value of 5 g due to dimension loss during non-contact lithography processing. Concerning reliability, a series of environmental experiments was conducted and the switches remained operational without excessive errors. However, both the random vibration and the shock tests indicate that metal particles generated during collision of the contact parts might affect contact reliability and long-term stability. Based on these conclusions, future research should include a careful study of switch contact behavior.

  15. Ignition threshold of aluminized HMX-based PBXs

    NASA Astrophysics Data System (ADS)

    Miller, Christopher; Zhou, Min

    2017-06-01

    We report the results of micromechanical simulations of the ignition of aluminized HMX-based PBX under loading due to impact by thin flyers. The conditions analyzed concern loading pulses on the order of 20 nanoseconds to 0.8 microseconds in duration and impact piston velocities on the order of 300-1000 m/s. The samples consist of a stochastically similar bimodal distribution of HMX grains, an Estane binder, and 50 μm aluminum particles. The computational model accounts for constituent elasto-viscoplasticity, viscoelasticity, bulk compressibility, fracture, interfacial debonding, internal contact, bulk and frictional heating, and heat conduction. The analysis focuses on the development of hotspots under different material settings and loading conditions. In particular, the ignition threshold in the form of the James relation and the corresponding ignition probability are calculated for PBXs containing 0%, 6%, 10%, and 18% aluminum by volume. It is found that the addition of aluminum increases the ignition threshold, making the materials less sensitive. The dissipation and heating mechanism changes responsible for this trend are delineated. Support by DOE NNSA SSGF is gratefully acknowledged.

  16. Discontinuous non-equilibrium phase transition in a threshold Schloegl model for autocatalysis: Generic two-phase coexistence and metastability

    DOE PAGES

    Wang, Chi -Jen; Liu, Da -Jiang; Evans, James W.

    2015-04-28

    Threshold versions of Schloegl’s model on a lattice, which involve autocatalytic creation and spontaneous annihilation of particles, can provide a simple prototype for discontinuous non-equilibrium phase transitions. These models are equivalent to so-called threshold contact processes. A discontinuous transition between populated and vacuum states can occur when selecting a threshold of N ≥ 2 for the minimum number, N, of neighboring particles enabling autocatalytic creation at an empty site. Fundamental open questions remain given the lack of a thermodynamic framework for analysis. For a square lattice with N = 2, we show that phase coexistence occurs not at a unique value but for a finite range of particle annihilation rate (the natural control parameter). This generic two-phase coexistence also persists when perturbing the model to allow spontaneous particle creation. Such behavior contrasts both the Gibbs phase rule for thermodynamic systems and also previous analysis for this model. We find metastability near the transition corresponding to a non-zero effective line tension, also contrasting previously suggested critical behavior. As a result, mean-field type analysis, extended to treat spatially heterogeneous states, further elucidates model behavior.

  17. Modelling the Happiness Classification of Addicted, Addiction Risk, Threshold and Non-Addicted Groups on Internet Usage

    ERIC Educational Resources Information Center

    Sapmaz, Fatma; Totan, Tarik

    2018-01-01

    The aim of this study is to model the happiness classification of university students--grouped as addicted, addiction risk, threshold and non-addicted to internet usage--with compatibility analysis on a map as happiness, average and unhappiness. The participants in this study were 400 university students from Turkey. According to the results of…

  18. Excitable Neurons, Firing Threshold Manifolds and Canards

    PubMed Central

    2013-01-01

    We investigate firing threshold manifolds in a mathematical model of an excitable neuron. The model analyzed investigates the phenomenon of post-inhibitory rebound spiking due to propofol anesthesia and is adapted from McCarthy et al. (SIAM J. Appl. Dyn. Syst. 11(4):1674–1697, [2012]). Propofol modulates the decay time-scale of an inhibitory GABAa synaptic current. Interestingly, this system gives rise to rebound spiking within a specific range of propofol doses. Using techniques from geometric singular perturbation theory, we identify geometric structures, known as canards of folded saddle-type, which form the firing threshold manifolds. We find that the position and orientation of the canard separatrix is propofol dependent. Thus, the speeds of relevant slow synaptic processes are encoded within this geometric structure. We show that this behavior cannot be understood using a static, inhibitory current step protocol, which can provide a single threshold for rebound spiking but cannot explain the observed cessation of spiking for higher propofol doses. We then compare the analyses of dynamic and static synaptic inhibition, showing how the firing threshold manifolds of each relate, and why a current step approach is unable to fully capture the behavior of this model. PMID:23945278

  19. Lane change warning threshold based on driver perception characteristics.

    PubMed

    Wang, Chang; Sun, Qinyu; Fu, Rui; Li, Zhen; Zhang, Qiong

    2018-08-01

    Lane Change Warning (LCW) systems are exploited to alleviate driver workload and improve the safety of lane changes. Based on a safety threshold, the lane change warning system transmits cautions to drivers. Although the system possesses substantial benefits, it may perturb the driver's normal operation and affect driver judgment if the warning threshold does not conform to the driver's perception of safety. Therefore, it is essential to establish an appropriate warning threshold to enhance the accuracy and acceptability of the lane change warning system. This research aims to identify the threshold that conforms to the driver's perception of the ability to safely change lanes with a rear vehicle fast approaching. We propose a theoretical warning model of lane change based on a safe minimum distance and the deceleration of the rear vehicle. To acquire the different safety levels of lane changes, 30 licensed drivers were recruited and we obtained the extreme moments, represented by driver perception characteristics, from a Front Extremity Test and a Rear Extremity Test implemented on the freeway. The required deceleration of the rear vehicle corresponding to the extreme time is calculated according to the proposed model. In light of discrepancies in the deceleration in these extremity experiments, we determine two levels of a hierarchical warning system. The purpose of the primary warning is to remind drivers of the existence of potentially dangerous vehicles, and the second warning is used to warn the driver to stop changing lanes immediately. We use signal detection theory to analyze the data. Ultimately, we confirm that the first deceleration threshold is 1.5 m/s² and the second deceleration threshold is 2.7 m/s². The findings provide the basis for the algorithm design of LCW and enhance the acceptability of the intelligent system. Copyright © 2018 Elsevier Ltd. All rights reserved.
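
    One plausible reading of the warning logic is a constant-deceleration kinematic sketch like the following. The form of the model and the 5 m minimum gap are assumptions made for illustration; only the 1.5 m/s² and 2.7 m/s² thresholds come from the abstract.

```python
# Hypothetical two-level lane-change warning: required deceleration of
# the approaching rear vehicle so the gap never drops below d_min, under
# constant deceleration; compared against the two reported thresholds.
def required_deceleration(gap_m, closing_speed_ms, d_min_m=5.0):
    """a_req = v_rel^2 / (2 * (gap - d_min)); inf if gap already unsafe."""
    usable = gap_m - d_min_m
    if usable <= 0:
        return float("inf")
    return closing_speed_ms ** 2 / (2.0 * usable)

def warning_level(a_req, first=1.5, second=2.7):
    """0 = safe, 1 = cautionary warning, 2 = abort the lane change."""
    if a_req >= second:
        return 2
    if a_req >= first:
        return 1
    return 0

lvl_safe = warning_level(required_deceleration(50.0, 5.0))    # a_req ~ 0.28
lvl_warn = warning_level(required_deceleration(25.0, 8.0))    # a_req = 1.6
lvl_abort = warning_level(required_deceleration(15.0, 8.0))   # a_req = 3.2
```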

  20. A phenomenological biological dose model for proton therapy based on linear energy transfer spectra.

    PubMed

    Rørvik, Eivind; Thörnqvist, Sara; Stokkevåg, Camilla H; Dahle, Tordis J; Fjaera, Lars Fredrik; Ytre-Hauge, Kristian S

    2017-06-01

    The relative biological effectiveness (RBE) of protons varies with the radiation quality, quantified by the linear energy transfer (LET). Most phenomenological models employ a linear dependency of the dose-averaged LET (LETd) to calculate the biological dose. However, several experiments have indicated a possible non-linear trend. Our aim was to investigate if biological dose models including non-linear LET dependencies should be considered, by introducing a LET spectrum based dose model. The RBE-LET relationship was investigated by fitting polynomials from 1st to 5th degree to a database of 85 data points from aerobic in vitro experiments. We included both unweighted and weighted regression, the latter taking into account experimental uncertainties. Statistical testing was performed to decide whether higher degree polynomials provided better fits to the data as compared to lower degrees. The newly developed models were compared to three published LETd based models for a simulated spread-out Bragg peak (SOBP) scenario. The statistical analysis of the weighted regression analysis favored a non-linear RBE-LET relationship, with the quartic polynomial found to best represent the experimental data (P = 0.010). The results of the unweighted regression analysis were on the borderline of statistical significance for non-linear functions (P = 0.053), and with the current database a linear dependency could not be rejected. For the SOBP scenario, the weighted non-linear model estimated a similar mean RBE value (1.14) compared to the three established models (1.13-1.17). The unweighted model calculated a considerably higher RBE value (1.22). The analysis indicated that non-linear models could give a better representation of the RBE-LET relationship. However, this is not decisive, as inclusion of the experimental uncertainties in the regression analysis had a significant impact on the determination and ranking of the models. As differences between the models were
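
    The weighted-regression step can be sketched for the linear baseline model, with invented data points; the weights play the role of inverse experimental variances. The paper fits polynomials up to 5th degree, while the normal-equation solution below handles only the 1st-degree case.

```python
# Weighted least-squares fit of a linear RBE(LET) model via the normal
# equations. Data and weights are toy values lying exactly on a line,
# so the fit must recover intercept 1.0 and slope 0.04.
def weighted_linear_fit(x, y, w):
    """Minimize sum_i w_i * (y_i - (c0 + c1*x_i))^2."""
    S = sum(w)
    Sx = sum(wi * xi for wi, xi in zip(w, x))
    Sy = sum(wi * yi for wi, yi in zip(w, y))
    Sxx = sum(wi * xi * xi for wi, xi in zip(w, x))
    Sxy = sum(wi * xi * yi for wi, xi, yi in zip(w, x, y))
    det = S * Sxx - Sx * Sx
    c0 = (Sxx * Sy - Sx * Sxy) / det
    c1 = (S * Sxy - Sx * Sy) / det
    return c0, c1

let_d = [1.0, 2.0, 4.0, 8.0]       # dose-averaged LET (keV/um), toy
rbe = [1.04, 1.08, 1.16, 1.32]     # toy RBE values: 1.0 + 0.04 * LETd
wts = [1.0, 1.0, 0.5, 0.25]        # downweight noisier high-LET points
c0, c1 = weighted_linear_fit(let_d, rbe, wts)
```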

  1. Irreducible normalizer operators and thresholds for degenerate quantum codes with sublinear distances

    NASA Astrophysics Data System (ADS)

    Pryadko, Leonid P.; Dumer, Ilya; Kovalev, Alexey A.

    2015-03-01

    We construct a lower (existence) bound for the threshold of scalable quantum computation which is applicable to all stabilizer codes, including degenerate quantum codes with sublinear distance scaling. The threshold is based on enumerating irreducible operators in the normalizer of the code, i.e., those that cannot be decomposed into a product of two such operators with non-overlapping support. For quantum LDPC codes with logarithmic or power-law distances, we get threshold values which are parametrically better than the existing analytical bound based on percolation. The new bound also gives a finite threshold when applied to other families of degenerate quantum codes, e.g., the concatenated codes. This research was supported in part by the NSF Grant PHY-1416578 and by the ARO Grant W911NF-11-1-0027.

  2. A revision of the gamma-evaluation concept for the comparison of dose distributions.

    PubMed

    Bakai, Annemarie; Alber, Markus; Nüsslin, Fridtjof

    2003-11-07

    A method for the quantitative four-dimensional (4D) evaluation of discrete dose data based on gradient-dependent local acceptance thresholds is presented. The method takes into account the local dose gradients of a reference distribution for critical appraisal of misalignment and collimation errors. These contribute to the maximum tolerable dose error at each evaluation point to which the local dose differences between comparison and reference data are compared. As shown, the presented concept is analogous to the gamma-concept of Low et al (1998a Med. Phys. 25 656-61) if extended to (3+1) dimensions. The pointwise dose comparisons of the reformulated concept are easier to perform and speed up the evaluation process considerably, especially for fine-grid evaluations of 3D dose distributions. The occurrences of false negative indications due to the discrete nature of the data are reduced with the method. The presented method was applied to film-measured, clinical data and compared with gamma-evaluations. 4D and 3D evaluations were performed. Comparisons prove that 4D evaluations have to be given priority, especially if complex treatment situations are verified, e.g., non-coplanar beam configurations.
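
    For reference, a minimal 1-D version of the gamma criterion of Low et al., which the revised concept reproduces pointwise through gradient-dependent local thresholds: a reference point passes (gamma ≤ 1) when some comparison point lies within the combined dose-difference/distance-to-agreement tolerance. Profiles and tolerances below are toy values.

```python
# Minimal 1-D gamma-index computation for two dose profiles sampled on
# the same grid (toy data; 3% dose tolerance, 3 mm distance tolerance).
import math

def gamma_index(ref, comp, dx, dose_tol, dta_tol):
    """Per-point gamma: min over comparison points of the combined
    normalized dose-difference / distance metric."""
    out = []
    for i, d_ref in enumerate(ref):
        best = float("inf")
        for j, d_cmp in enumerate(comp):
            dist = (i - j) * dx
            g = math.sqrt((dist / dta_tol) ** 2 +
                          ((d_cmp - d_ref) / dose_tol) ** 2)
            best = min(best, g)
        out.append(best)
    return out

ref = [0.0, 0.5, 1.0, 0.5, 0.0]     # reference dose profile (Gy)
comp = [0.0, 0.5, 1.02, 0.5, 0.0]   # 2% error at the peak
g = gamma_index(ref, comp, dx=1.0, dose_tol=0.03, dta_tol=3.0)
passed = all(v <= 1.0 for v in g)   # 2% < 3%, so every point passes
```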

  3. Burst of virus infection and a possibly largest epidemic threshold of non-Markovian susceptible-infected-susceptible processes on networks

    NASA Astrophysics Data System (ADS)

    Liu, Qiang; Van Mieghem, Piet

    2018-02-01

    Since a real epidemic process is not necessarily Markovian, the epidemic threshold obtained under the Markovian assumption may be not realistic. To understand general non-Markovian epidemic processes on networks, we study the Weibullian susceptible-infected-susceptible (SIS) process in which the infection process is a renewal process with a Weibull time distribution. We find that, if the infection rate exceeds 1 /ln(λ1+1 ) , where λ1 is the largest eigenvalue of the network's adjacency matrix, then the infection will persist on the network under the mean-field approximation. Thus, 1 /ln(λ1+1 ) is possibly the largest epidemic threshold for a general non-Markovian SIS process with a Poisson curing process under the mean-field approximation. Furthermore, non-Markovian SIS processes may result in a multimodal prevalence. As a byproduct, we show that a limiting Weibullian SIS process has the potential to model bursts of a synchronized infection.
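
    The bound depends on the network only through λ1, so it is easy to evaluate numerically. The sketch below computes λ1 by power iteration for a complete graph on four nodes (where λ1 = 3) and then the claimed threshold 1/ln(λ1+1).

```python
# Evaluate the claimed non-Markovian SIS threshold 1/ln(lambda_1 + 1)
# for a small graph, with lambda_1 found by power iteration.
import math

def largest_eigenvalue(adj, iters=200):
    """Power iteration on a symmetric non-negative adjacency matrix."""
    n = len(adj)
    v = [1.0] * n
    lam = 0.0
    for _ in range(iters):
        w = [sum(adj[i][j] * v[j] for j in range(n)) for i in range(n)]
        lam = max(abs(x) for x in w)
        v = [x / lam for x in w]
    return lam

# Complete graph K4: every vertex adjacent to the other three, lambda_1 = 3.
K4 = [[0, 1, 1, 1],
      [1, 0, 1, 1],
      [1, 1, 0, 1],
      [1, 1, 1, 0]]
lam1 = largest_eigenvalue(K4)
tau_c = 1.0 / math.log(lam1 + 1.0)   # threshold from the abstract's bound
```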

  4. Unipolar Terminal-Attractor Based Neural Associative Memory with Adaptive Threshold

    NASA Technical Reports Server (NTRS)

    Liu, Hua-Kuang (Inventor); Barhen, Jacob (Inventor); Farhat, Nabil H. (Inventor); Wu, Chwan-Hwa (Inventor)

    1996-01-01

    A unipolar terminal-attractor based neural associative memory (TABAM) system with adaptive threshold for perfect convergence is presented. By adaptively setting the threshold values for the dynamic iteration for the unipolar binary neuron states with terminal-attractors for the purpose of reducing the spurious states in a Hopfield neural network for associative memory and using the inner-product approach, perfect convergence and correct retrieval is achieved. Simulation is completed with a small number of stored states (M) and a small number of neurons (N) but a large M/N ratio. An experiment with optical exclusive-OR logic operation using LCTV SLMs shows the feasibility of optoelectronic implementation of the models. A complete inner-product TABAM is implemented using a PC for calculation of adaptive threshold values to achieve a unipolar TABAM (UIT) in the case where there is no crosstalk, and a crosstalk model (CRIT) in the case where crosstalk corrupts the desired state.

  5. Unipolar terminal-attractor based neural associative memory with adaptive threshold

    NASA Technical Reports Server (NTRS)

    Liu, Hua-Kuang (Inventor); Barhen, Jacob (Inventor); Farhat, Nabil H. (Inventor); Wu, Chwan-Hwa (Inventor)

    1993-01-01

    A unipolar terminal-attractor based neural associative memory (TABAM) system with adaptive threshold for perfect convergence is presented. By adaptively setting the threshold values for the dynamic iteration for the unipolar binary neuron states with terminal-attractors for the purpose of reducing the spurious states in a Hopfield neural network for associative memory and using the inner product approach, perfect convergence and correct retrieval is achieved. Simulation is completed with a small number of stored states (M) and a small number of neurons (N) but a large M/N ratio. An experiment with optical exclusive-OR logic operation using LCTV SLMs shows the feasibility of optoelectronic implementation of the models. A complete inner-product TABAM is implemented using a PC for calculation of adaptive threshold values to achieve a unipolar TABAM (UIT) in the case where there is no crosstalk, and a crosstalk model (CRIT) in the case where crosstalk corrupts the desired state.

  6. Gradient-driven flux-tube simulations of ion temperature gradient turbulence close to the non-linear threshold

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peeters, A. G.; Rath, F.; Buchholz, R.

    2016-08-15

    It is shown that Ion Temperature Gradient turbulence close to the threshold exhibits a long time behaviour, with smaller heat fluxes at later times. This reduction is connected with the slow growth of long wave length zonal flows, and consequently, the numerical dissipation on these flows must be sufficiently small. Close to the nonlinear threshold for turbulence generation, a relatively small dissipation can maintain a turbulent state with a sizeable heat flux, through the damping of the zonal flow. Lowering the dissipation causes the turbulence, for temperature gradients close to the threshold, to be subdued. The heat flux then does not go smoothly to zero when the threshold is approached from above. Rather, a finite minimum heat flux is obtained below which no fully developed turbulent state exists. The threshold value of the temperature gradient length at which this finite heat flux is obtained is up to 30% larger compared with the threshold value obtained by extrapolating the heat flux to zero, and the cyclone base case is found to be nonlinearly stable. Transport is subdued when a fully developed staircase structure in the E × B shearing rate forms. Just above the threshold, an incomplete staircase develops, and transport is mediated by avalanche structures which propagate through the marginally stable regions.

  7. [Using fractional polynomials to estimate the safety threshold of fluoride in drinking water].

    PubMed

    Pan, Shenling; An, Wei; Li, Hongyan; Yang, Min

    2014-01-01

    To study the dose-response relationship between fluoride content in drinking water and the prevalence of dental fluorosis on the national scale, and thereby determine the safety threshold of fluoride in drinking water. Meta-regression analysis was applied to the 2001-2002 national endemic fluorosis survey data of key wards. First, fractional polynomials (FP) were adopted to establish a fixed-effect model and determine the best FP structure; then restricted maximum likelihood (REML) was adopted to estimate the between-study variance, and the best random-effect model was established. The best FP structure was a first-order logarithmic transformation. Based on the best random-effect model, the benchmark dose (BMD) of fluoride in drinking water and its lower limit (BMDL) were calculated as 0.98 mg/L and 0.78 mg/L, respectively. Fluoride in drinking water explained only 35.8% of the variability in prevalence; among the other influencing factors, ward type was significant, while temperature condition and altitude were not. Fractional polynomial-based meta-regression is simple and practical and provides a good fit; based on it, the safety threshold of fluoride in drinking water for our country is determined to be 0.8 mg/L.
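
The benchmark-dose step for the selected FP structure (a first-order log-dose term) can be sketched as follows; the logistic link, the coefficients b0 and b1, and the 5% benchmark response are hypothetical illustrations, not the fitted national-survey values:

```python
import math

# Sketch: benchmark dose for a first-order log-dose model,
# p(dose) = expit(b0 + b1 * ln(dose)). All numbers are hypothetical.

def expit(z):
    return 1.0 / (1.0 + math.exp(-z))

def prevalence(dose, b0, b1):
    return expit(b0 + b1 * math.log(dose))

def bmd(bmr, b0, b1):
    """Dose at which the modelled prevalence reaches the benchmark response."""
    logit = math.log(bmr / (1.0 - bmr))
    return math.exp((logit - b0) / b1)

b0, b1 = -3.0, 2.0          # hypothetical fit, not the survey's values
d = bmd(0.05, b0, b1)       # dose at a 5% benchmark response
print(round(d, 3), round(prevalence(d, b0, b1), 3))  # -> 1.028 0.05
```

A BMDL would additionally require the uncertainty of the fitted coefficients, which is omitted here.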

  8. Optimal antimalarial dose regimens for chloroquine in pregnancy based on population pharmacokinetic modelling.

    PubMed

    Salman, Sam; Baiwog, Francesca; Page-Sharp, Madhu; Kose, Kay; Karunajeewa, Harin A; Mueller, Ivo; Rogerson, Stephen J; Siba, Peter M; Ilett, Kenneth F; Davis, Timothy M E

    2017-10-01

    Despite extensive use and accumulated evidence of safety, there have been few pharmacokinetic studies from which appropriate chloroquine (CQ) dosing regimens could be developed specifically for pregnant women. Such optimised CQ-based regimens, used as treatment for acute malaria or as intermittent preventive treatment in pregnancy (IPTp), may have a valuable role if parasite CQ sensitivity returns following reduced drug pressure. In this study, population pharmacokinetic/pharmacodynamic modelling was used to simultaneously analyse plasma concentration-time data for CQ and its active metabolite desethylchloroquine (DCQ) in 44 non-pregnant and 45 pregnant Papua New Guinean women treated with CQ and sulfadoxine/pyrimethamine or azithromycin (AZM). Pregnancy was associated with 16% and 49% increases in CQ and DCQ clearance, respectively, as well as a 24% reduction in CQ relative bioavailability. Clearance of DCQ was 22% lower in those who received AZM in both groups. Simulations based on the final multicompartmental model demonstrated that a 33% CQ dose increase may be suitable for acute treatment for malaria in pregnancy as it resulted in equivalent exposure to that in non-pregnant women receiving recommended doses, whilst a double dose would likely be required for an effective duration of post-treatment prophylaxis when used as IPTp especially in areas of CQ resistance. The impact of co-administered AZM was clinically insignificant in simulations. The results of past/ongoing trials employing recommended adult doses of CQ-based regimens in pregnant women should be interpreted in light of these findings, and consideration should be given to using increased doses in future trials. Copyright © 2017 Elsevier B.V. and International Society of Chemotherapy. All rights reserved.
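
How clearance and bioavailability shifts translate into a dose adjustment can be illustrated with a deliberately simplified one-compartment exposure relation, AUC = F·dose/CL. This is not the paper's population multicompartment model, and the 20% parameter shifts are hypothetical round numbers:

```python
# Toy exposure calculation under a one-compartment assumption (AUC = F*dose/CL).
# NOT the paper's model; parameter shifts below are hypothetical.

def auc(dose, cl, f):
    return f * dose / cl

def dose_factor(cl_ratio, f_ratio):
    """Dose multiplier that restores the reference AUC after CL and F change."""
    return cl_ratio / f_ratio

factor = dose_factor(cl_ratio=1.20, f_ratio=0.80)   # +20% CL, -20% F
print(round(factor, 2))  # -> 1.5
print(round(auc(factor * 100.0, cl=1.20, f=0.80), 6))  # exposure restored
```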

  9. A robust threshold-based cloud mask for the HRV channel of MSG SEVIRI

    NASA Astrophysics Data System (ADS)

    Bley, S.; Deneke, H.

    2013-03-01

    A robust threshold-based cloud mask for the high-resolution visible (HRV) channel (1 × 1 km2) of the METEOSAT SEVIRI instrument is introduced and evaluated. It is based on the operational EUMETSAT cloud mask for the low-resolution channels of SEVIRI (3 × 3 km2), which is used for the selection of suitable thresholds to ensure consistency with its results. The aim of using the HRV channel is to resolve small-scale cloud structures which cannot be detected by the low-resolution channels. We find that it is advantageous to apply thresholds relative to clear-sky reflectance composites and to adapt the threshold regionally. Furthermore, the suitability of the different spectral channels, including the HRV channel, for threshold-based cloud detection is investigated. The case studies cover different situations to demonstrate the behaviour for various surface and cloud conditions. Overall, between 4 and 24% of cloudy low-resolution SEVIRI pixels in our test dataset are found to contain broken clouds, depending on the considered region. Most of these broken pixels are classified as cloudy by EUMETSAT's cloud mask, which will likely result in an overestimate if the mask is used as an estimate of cloud fraction.
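
The core test, thresholding relative to a clear-sky reflectance composite, can be sketched as follows; the reflectances, composite values, and offset are made-up numbers, not SEVIRI data:

```python
# Sketch of a composite-relative cloud test: a pixel is flagged cloudy when
# its reflectance exceeds the clear-sky composite value by a (regionally
# tunable) offset. All numbers are illustrative.

def cloud_mask(reflectance, clear_sky, offset):
    """Flag a pixel cloudy when it exceeds its clear-sky value by `offset`."""
    return [[r - c > offset for r, c in zip(row_r, row_c)]
            for row_r, row_c in zip(reflectance, clear_sky)]

refl  = [[0.12, 0.45], [0.60, 0.10]]
clear = [[0.10, 0.11], [0.09, 0.09]]
print(cloud_mask(refl, clear, offset=0.15))  # -> [[False, True], [True, False]]
```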

  10. A General Iterative Shrinkage and Thresholding Algorithm for Non-convex Regularized Optimization Problems.

    PubMed

    Gong, Pinghua; Zhang, Changshui; Lu, Zhaosong; Huang, Jianhua Z; Ye, Jieping

    2013-01-01

    Non-convex sparsity-inducing penalties have recently received considerable attention in sparse learning. Recent theoretical investigations have demonstrated their superiority over convex counterparts in several sparse learning settings. However, solving the non-convex optimization problems associated with non-convex penalties remains challenging. A commonly used approach is Multi-Stage (MS) convex relaxation (or DC programming), which relaxes the original non-convex problem to a sequence of convex problems. This approach is usually impractical for large-scale problems because its computational cost is a multiple of solving a single convex problem. In this paper, we propose a General Iterative Shrinkage and Thresholding (GIST) algorithm to solve the non-convex optimization problem for a large class of non-convex penalties. The GIST algorithm iteratively solves a proximal operator problem, which in turn has a closed-form solution for many commonly used penalties. At each outer iteration of the algorithm, we use a line search initialized by the Barzilai-Borwein (BB) rule, which allows an appropriate step size to be found quickly. The paper also presents a detailed convergence analysis of the GIST algorithm. The efficiency of the proposed algorithm is demonstrated by extensive experiments on large-scale data sets.
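
The GIST iteration can be sketched for the convex l1 prox (soft threshold), the simplest closed-form case; the paper's point is that many non-convex penalties admit closed-form proximal steps that slot into the same loop. The BB step and the simple monotone backtracking rule below stand in for the paper's non-monotone line search:

```python
import math

# Sketch of a GIST-style loop for min_x 0.5*||Ax - b||^2 + lam*||x||_1.
# Swap prox_l1 for a non-convex prox (e.g. capped-l1) to match the paper.

def mv(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def residual(A, b, x):
    return [ri - bi for ri, bi in zip(mv(A, x), b)]

def grad(A, b, x):
    r = residual(A, b, x)
    return [sum(A[i][j] * r[i] for i in range(len(A))) for j in range(len(x))]

def obj(A, b, x, lam):
    r = residual(A, b, x)
    return 0.5 * sum(v * v for v in r) + lam * sum(abs(v) for v in x)

def prox_l1(v, t):
    # closed-form proximal operator of t*||.||_1 (soft threshold)
    return [math.copysign(max(abs(vi) - t, 0.0), vi) for vi in v]

def gist(A, b, lam, iters=100):
    x, t = [0.0] * len(A[0]), 1.0
    for _ in range(iters):
        g = grad(A, b, x)
        while True:  # backtracking: shrink the step until the objective drops
            x_new = prox_l1([xi - t * gi for xi, gi in zip(x, g)], t * lam)
            if obj(A, b, x_new, lam) <= obj(A, b, x, lam) + 1e-12:
                break
            t *= 0.5
        dx = [xn - xi for xn, xi in zip(x_new, x)]
        dg = [gn - gi for gn, gi in zip(grad(A, b, x_new), g)]
        den = sum(a * c for a, c in zip(dx, dg))
        t = sum(a * a for a in dx) / den if den > 1e-12 else 1.0  # BB rule
        x = x_new
    return x

x = gist([[1.0, 0.0], [0.0, 1.0]], [3.0, 1.0], lam=1.0)
print([round(v, 4) for v in x])  # -> [2.0, 0.0]
```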

  11. Dose escalation to high-risk sub-volumes based on non-invasive imaging of hypoxia and glycolytic activity in canine solid tumors: a feasibility study

    PubMed Central

    2013-01-01

    Introduction Glycolytic activity and hypoxia are associated with poor prognosis and radiation resistance. Including both the tumor uptake of 2-deoxy-2-[18F]-fluorodeoxyglucose (FDG) and the proposed hypoxia tracer copper(II)-diacetyl-bis(N4-methylthiosemicarbazone) (Cu-ATSM) in targeted therapy planning may therefore lead to improved tumor control. In this study we analyzed the overlap between sub-volumes of FDG and hypoxia assessed by the uptake of 64Cu-ATSM in canine solid tumors, and evaluated the possibilities for dose redistribution within the gross tumor volume (GTV). Materials and methods Positron emission tomography/computed tomography (PET/CT) scans of five spontaneous canine solid tumors were included. FDG-PET/CT was obtained at day 1, and 64Cu-ATSM at days 2 and 3 (3 and 24 h post-injection). The GTV was delineated and the CT images were co-registered. Sub-volumes for 3 h and 24 h 64Cu-ATSM (Cu3 and Cu24) were defined by a threshold-based method. FDG sub-volumes were delineated at 40% (FDG40) and 50% (FDG50) of SUVmax. The sizes of the sub-volumes, their intersection and the biological target volume (BTV) were measured in a treatment planning software. By varying the average dose prescription to the tumor from 66 to 85 Gy, the possible dose boost (DB) was calculated for the three scenarios in which the optimal target for the boost was one, the union, or the intersection of the FDG and 64Cu-ATSM sub-volumes. Results The potential boost volumes represented a fairly large fraction of the total GTV: Cu3 49.8% (26.8-72.5%), Cu24 28.1% (2.4-54.3%), FDG40 45.2% (10.1-75.2%), and FDG50 32.5% (2.6-68.1%). A BTV including the union (∪) of Cu3 and FDG would involve boosting a larger fraction of the GTV: Cu3∪FDG40 63.5% (51.8-83.8%) and Cu3∪FDG50 48.1% (43.7-80.8%). The union allowed only a very limited DB, whereas the intersection allowed a substantial dose escalation. Conclusions FDG and 64Cu-ATSM sub-volumes were only partly overlapping, suggesting that the tracers offer
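
The sub-volume bookkeeping (fractions of the GTV covered by each sub-volume, their union, and their intersection) can be sketched on toy 1-D voxel masks standing in for the thresholded FDG and Cu-ATSM uptake maps:

```python
# Sketch of sub-volume overlap bookkeeping on toy boolean voxel masks.
# The 1-D "volumes" below are invented stand-ins for thresholded uptake maps.

def fraction(mask, gtv):
    """Fraction of GTV voxels covered by `mask`."""
    inside = sum(1 for m, g in zip(mask, gtv) if m and g)
    return inside / sum(gtv)

gtv = [1, 1, 1, 1, 1, 1, 1, 1]           # 8 voxels
fdg = [1, 1, 1, 0, 0, 0, 0, 0]           # e.g. >= 40% of SUVmax
cu  = [0, 1, 1, 1, 1, 0, 0, 0]           # e.g. Cu-ATSM threshold
union        = [a or b for a, b in zip(fdg, cu)]
intersection = [a and b for a, b in zip(fdg, cu)]
print(fraction(fdg, gtv), fraction(cu, gtv),
      fraction(union, gtv), fraction(intersection, gtv))
# -> 0.375 0.5 0.625 0.25
```

As in the study, the union is necessarily the largest target (limiting the achievable boost for a fixed mean dose) and the intersection the smallest.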

  12. Computational assessment of effective dose and patient specific doses for kilovoltage stereotactic radiosurgery of wet age-related macular degeneration

    NASA Astrophysics Data System (ADS)

    Hanlon, Justin Mitchell

    Age-related macular degeneration (AMD) is a leading cause of vision loss and a major health problem for people over the age of 50 in industrialized nations. The current standard of care, ranibizumab, is used to help slow and in some cases stabilize the process of AMD, but requires frequent invasive injections into the eye. Interest continues for stereotactic radiosurgery (SRS), an option that provides a non-invasive treatment for the wet form of AMD, through the development of the IRay(TM) (Oraya Therapeutics, Inc., Newark, CA). The goal of this modality is to destroy choroidal neovascularization beneath the pigment epithelium via delivery of three 100 kVp photon beams entering through the sclera and overlapping on the macula delivering up to 24 Gy of therapeutic dose over a span of approximately 5 minutes. The divergent x-ray beams targeting the fovea are robotically positioned and the eye is gently immobilized by a suction-enabled contact lens. Device development requires assessment of patient effective dose, reference patient mean absorbed doses to radiosensitive tissues, and patient specific doses to the lens and optic nerve. A series of head phantoms, including both reference and patient specific, was derived from CT data and employed in conjunction with the MCNPX 2.5.0 radiation transport code to simulate treatment and evaluate absorbed doses to potential tissues-at-risk. The reference phantoms were used to evaluate effective dose and mean absorbed doses to several radiosensitive tissues. The optic nerve was modeled with changeable positions based on individual patient variability seen in a review of head CT scans gathered. Patient specific phantoms were used to determine the effect of varying anatomy and gaze. The results showed that absorbed doses to the non-targeted tissues were below the threshold levels for serious complications; specifically the development of radiogenic cataracts and radiation induced optic neuropathy (RON). The effective dose

  13. Exploiting Sub-threshold and above-threshold characteristics in a silver-enhanced gold nanoparticle based biochip.

    PubMed

    Liu, Yang; Alocilja, Evangelyn; Chakrabartty, Shantanu

    2009-01-01

    Silver-enhanced labeling is a technique used in immunochromatographic assays to improve the sensitivity of pathogen detection. In this paper, we employ the silver enhancement approach to construct a biomolecular transistor that uses a high-density interdigitated electrode to detect rabbit IgG. We show that the response of the biomolecular transistor comprises: (a) a sub-threshold region, where the conductance change is an exponential function of the enhancement time, and (b) an above-threshold region, where the conductance change is a linear function of the enhancement time. By exploiting both regions of operation, it is shown that the silver enhancing time is a reliable indicator of the IgG concentration. The method provides a relatively straightforward alternative to biomolecular signal amplification techniques. Measured results using a biochip prototype fabricated in silicon show that 240 pg/mL rabbit IgG can be detected at a silver enhancing time of 42 min. The biomolecular transistor is also compatible with silicon-based processing, making it well suited to integrated CMOS biosensors.

  14. Noise reduction algorithm with the soft thresholding based on the Shannon entropy and bone-conduction speech cross- correlation bands.

    PubMed

    Na, Sung Dae; Wei, Qun; Seong, Ki Woong; Cho, Jin Ho; Kim, Myoung Nam

    2018-01-01

    Conventional methods of speech enhancement, noise reduction, and voice activity detection are based on suppressing the noise or non-speech components of the target air-conduction signals. However, air-conducted speech is hard to differentiate from babble or white noise. To overcome this problem, a new algorithm for speech detection and noise reduction is proposed that thresholds the wavelet packet coefficients of the noisy speech using soft thresholding based on the Shannon entropy principle and on the cross-correlation between the air- and bone-conduction signals. Each threshold is generated by the entropy and cross-correlation approaches in the bands obtained by wavelet packet decomposition. Performance is evaluated with objective quality measures: PESQ, RMSE, correlation, and SNR. To verify the feasibility of the method, we compared the air- and bone-conduction speech signals and their spectra in MATLAB simulations. The results confirm the high performance of the proposed method, making it well suited to future applications in communication devices, noisy environments, construction, and military operations.
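
One ingredient of the method, the cross-correlation between air- and bone-conduction bands that (together with Shannon entropy) drives the per-band threshold, can be sketched as a zero-lag normalised cross-correlation. The signals below are toy lists, and the mapping from correlation to threshold is not reproduced:

```python
import math

# Sketch: zero-lag normalised cross-correlation between an air-conduction
# band and the bone-conduction reference. High correlation suggests speech;
# low correlation suggests air-borne noise. Signals are invented.

def xcorr(a, b):
    """Zero-lag normalised cross-correlation in [-1, 1]."""
    num = sum(x * y for x, y in zip(a, b))
    den = math.sqrt(sum(x * x for x in a) * sum(y * y for y in b))
    return num / den if den else 0.0

speechy = [0.9, -0.8, 0.7, -0.6]   # band where both sensors agree
noisy   = [0.5, 0.4, -0.3, 0.2]    # band dominated by air-borne noise
bone    = [0.8, -0.7, 0.6, -0.5]
print(round(xcorr(speechy, bone), 2), round(xcorr(noisy, bone), 2))
```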

  15. Defining indoor heat thresholds for health in the UK.

    PubMed

    Anderson, Mindy; Carmichael, Catriona; Murray, Virginia; Dengel, Andy; Swainson, Michael

    2013-05-01

    It has been recognised that as outdoor ambient temperatures increase past a particular threshold, so do mortality/morbidity rates. However, similar thresholds for indoor temperatures have not yet been identified. Due to a warming climate, the non-sustainability of air conditioning as a solution, and the desire for more energy-efficient airtight homes, thresholds for indoor temperature should be defined as a public health issue. The aim of this paper is to outline the need for indoor heat thresholds and to establish whether they can be identified. Our objectives include: describing how indoor temperature is measured; highlighting threshold measurements and indices; describing adaptation to heat; summarising the risks to susceptible groups from heat; reviewing the current evidence on the link between sleep, heat and health; exploring current heat and health warning systems and thresholds; exploring the built environment and the risk of overheating; and identifying the gaps in current knowledge and research. A global literature search of key databases was conducted using a pre-defined set of keywords to retrieve peer-reviewed and grey literature. The paper applies the findings to the context of the UK. In total, 96 articles, reports, government documents and textbooks were analysed and a gap analysis was conducted. Evidence on the effects of indoor heat on health implies that buildings are modifiers of the effect of climate on health outcomes. Personal exposure and place-based heat studies showed the most significant correlations between indoor heat and health outcomes. However, the data are sparse and inconclusive in terms of identifying evidence-based definitions for thresholds. Further research needs to be conducted in order to provide an evidence base for threshold determination. Indoor and outdoor heat are related but are different in terms of language and measurement. Future collaboration between the health and building sectors is needed to develop a common

  16. Probabilistic assessment method of the non-monotonic dose-responses-Part I: Methodological approach.

    PubMed

    Chevillotte, Grégoire; Bernard, Audrey; Varret, Clémence; Ballet, Pascal; Bodin, Laurent; Roudot, Alain-Claude

    2017-08-01

    More and more studies aim to characterize non-monotonic dose-response curves (NMDRCs). The greatest difficulty is to assess the statistical plausibility of NMDRCs from previously conducted dose-response studies. This difficulty is linked to the fact that these studies present (i) few tested doses, (ii) a low sample size per dose, and (iii) no raw data. In this study, we propose a new methodological approach to characterize NMDRCs probabilistically. The methodology comprises three main steps: (i) sampling from summary data to cover all the possibilities that may be presented by the responses measured per dose, yielding a new raw database; (ii) statistical analysis of each sampled dose-response curve to characterize the slopes and their signs; and (iii) characterization of these dose-response curves according to the variation of the sign of the slope. This method can characterize all types of dose-response curves and can be applied both to continuous and to discrete data. The aim of this study is to present the general principle of this probabilistic method for assessing non-monotonic dose-response curves, and to present some results. Copyright © 2017 Elsevier Ltd. All rights reserved.
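
The three steps can be sketched as a small resampling loop; the doses, means, SDs, and group size below are invented, and the slope test is reduced to a bare sign comparison rather than a full statistical analysis:

```python
import random

# Sketch of the three-step approach: (i) resample raw-like responses from
# summary statistics, (ii) compute slopes between consecutive doses,
# (iii) classify each draw as non-monotonic if the slope changes sign.
# All numbers are invented.

def classify(doses, means, sds, n, trials=200, seed=1):
    """Fraction of resampled curves showing a slope sign change."""
    rng = random.Random(seed)
    non_monotonic = 0
    for _ in range(trials):
        group_means = [sum(rng.gauss(m, s) for _ in range(n)) / n
                       for m, s in zip(means, sds)]                 # step (i)
        slopes = [(group_means[i + 1] - group_means[i]) /
                  (doses[i + 1] - doses[i])
                  for i in range(len(doses) - 1)]                   # step (ii)
        signs = [s > 0 for s in slopes]
        if any(a != b for a, b in zip(signs, signs[1:])):           # step (iii)
            non_monotonic += 1
    return non_monotonic / trials

doses = [1.0, 2.0, 4.0, 8.0]
print(classify(doses, [1.0, 2.0, 3.0, 4.0], [0.01] * 4, n=5))  # -> 0.0
print(classify(doses, [1.0, 3.0, 1.5, 4.0], [0.01] * 4, n=5))  # -> 1.0
```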

  17. Estimation of internal organ motion-induced variance in radiation dose in non-gated radiotherapy

    NASA Astrophysics Data System (ADS)

    Zhou, Sumin; Zhu, Xiaofeng; Zhang, Mutian; Zheng, Dandan; Lei, Yu; Li, Sicong; Bennion, Nathan; Verma, Vivek; Zhen, Weining; Enke, Charles

    2016-12-01

    In the delivery of non-gated radiotherapy (RT), owing to intra-fraction organ motion, a certain degree of RT dose uncertainty is present. Herein, we propose a novel mathematical algorithm to estimate the mean and variance of RT dose that is delivered without gating. These parameters are specific to individual internal organ motion, dependent on individual treatment plans, and relevant to the RT delivery process. This algorithm uses images from a patient’s 4D simulation study to model the actual patient internal organ motion during RT delivery. All necessary dose rate calculations are performed in fixed patient internal organ motion states. The analytical and deterministic formulae of mean and variance in dose from non-gated RT were derived directly via statistical averaging of the calculated dose rate over possible random internal organ motion initial phases, and did not require constructing relevant histograms. All results are expressed in dose rate Fourier transform coefficients for computational efficiency. Exact solutions are provided to simplified, yet still clinically relevant, cases. Results from a volumetric-modulated arc therapy (VMAT) patient case are also presented. The results obtained from our mathematical algorithm can aid clinical decisions by providing information regarding both mean and variance of radiation dose to non-gated patients prior to RT delivery.
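
The averaging step can be sketched in discretized form: the delivered dose as a function of the random initial phase offset, and its mean and variance over a uniform offset. The periodic dose-rate samples below are illustrative, and the paper's Fourier-coefficient formulation is not reproduced:

```python
# Sketch: mean and variance of delivered dose over a uniform random initial
# phase of periodic organ motion. `rate` samples one motion cycle of the
# dose rate at the moving point; values are illustrative, not clinical.

def dose_stats(rate, dt, steps):
    """Mean/variance of dose over `steps` time steps, averaged over phase."""
    n = len(rate)
    doses = [sum(rate[(k + i) % n] for i in range(steps)) * dt
             for k in range(n)]             # dose for each initial offset k
    mean = sum(doses) / n
    var = sum((d - mean) ** 2 for d in doses) / n
    return mean, var

rate = [2.0, 1.0, 0.5, 1.0]                 # dose rate over one cycle (Gy/s)
mean, var = dose_stats(rate, dt=0.5, steps=6)
print(mean, var)  # -> 3.375 0.140625
```

Note that when the delivery time is an exact multiple of the motion period, every offset delivers the same dose and the variance vanishes.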

  18. A Shearlet-based algorithm for quantum noise removal in low-dose CT images

    NASA Astrophysics Data System (ADS)

    Zhang, Aguan; Jiang, Huiqin; Ma, Ling; Liu, Yumin; Yang, Xiaopeng

    2016-03-01

    Low-dose CT (LDCT) scanning is a potential way to reduce the population's radiation exposure from X-rays, but it is necessary to improve the quality of low-dose CT images. In this paper, we propose an effective algorithm for quantum noise removal in LDCT images using the shearlet transform. Because quantum noise can be modelled as a Poisson process, we first transform it using the Anscombe variance-stabilizing transform (VST), producing approximately Gaussian noise with unit variance. Second, the non-noise shearlet coefficients are obtained by adaptive hard-threshold processing in the shearlet domain. Third, we reconstruct the de-noised image using the inverse shearlet transform. Finally, an inverse Anscombe transform is applied to the de-noised image, producing the improved image. The main contribution is to combine the Anscombe VST with the shearlet transform. In this way, edge coefficients and noise coefficients can be separated effectively from the high-frequency sub-bands. A number of experiments were performed on LDCT images using the proposed method. Both quantitative and visual results show that the proposed method can effectively reduce the quantum noise while enhancing subtle details. It has value in clinical applications.
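
The first and last steps of the pipeline, the Anscombe VST and its inverse, can be sketched directly. The algebraic inverse is shown here; practical pipelines often use an unbiased inverse, and the shearlet stages in between are omitted:

```python
import math

# Anscombe variance-stabilising transform and its algebraic inverse:
# A(x) = 2*sqrt(x + 3/8) maps Poisson counts to approximately unit-variance
# Gaussian data; thresholding happens in the transformed domain.

def anscombe(x):
    return 2.0 * math.sqrt(x + 3.0 / 8.0)

def inverse_anscombe(y):
    return (y / 2.0) ** 2 - 3.0 / 8.0

counts = [4, 25, 100]
print([round(inverse_anscombe(anscombe(c)), 6) for c in counts])
```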

  19. MO-FG-202-08: Real-Time Monte Carlo-Based Treatment Dose Reconstruction and Monitoring for Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tian, Z; Shi, F; Gu, X

    2016-06-15

    Purpose: This proof-of-concept study is to develop a real-time Monte Carlo (MC) based treatment-dose reconstruction and monitoring system for radiotherapy, especially for treatments with complicated delivery, to catch delivery errors at the earliest possible opportunity and interrupt the treatment only when an unacceptable dosimetric deviation from expectation occurs. Methods: First, an offline scheme is launched to pre-calculate the expected dose from the treatment plan, used as ground truth for real-time monitoring later. Then an online scheme with three concurrent threads is launched during treatment delivery to reconstruct and monitor the patient dose in a temporally resolved fashion in real time. Thread T1 acquires machine status every 20 ms to calculate and accumulate the fluence map (FM). Once our accumulation threshold is reached, T1 transfers the FM to T2 for dose reconstruction and starts to accumulate a new FM. A GPU-based MC dose calculation is performed on T2 when the MC dose engine is ready and a new FM is available. The reconstructed instantaneous dose is directed to T3 for dose accumulation and real-time visualization. Multiple dose metrics (e.g. maximum and mean dose for targets and organs) are calculated from the currently accumulated dose and compared with the pre-calculated expected values. Once the discrepancies go beyond our tolerance, an error message is sent to interrupt the treatment delivery. Results: A VMAT head-and-neck patient case was used to test the performance of our system. Real-time machine status acquisition was simulated. The differences between the actual dose metrics and the expected ones were 0.06%-0.36%, indicating an accurate delivery. A dose reconstruction and monitoring frequency of ∼10 Hz was achieved, with 287.94 s of online computation time compared to 287.84 s of treatment delivery time. Conclusion: Our study has demonstrated the feasibility of computing a dose distribution in a temporally resolved

  20. Twelve automated thresholding methods for segmentation of PET images: a phantom study.

    PubMed

    Prieto, Elena; Lecumberri, Pablo; Pagola, Miguel; Gómez, Marisol; Bilbao, Izaskun; Ecay, Margarita; Peñuelas, Iván; Martí-Climent, Josep M

    2012-06-21

    Tumor volume delineation over positron emission tomography (PET) images is of great interest for proper diagnosis and therapy planning. However, standard segmentation techniques (manual or semi-automated) are operator dependent and time consuming, while fully automated procedures are cumbersome or require complex mathematical development. The aim of this study was to segment PET images in a fully automated way by implementing a set of 12 automated thresholding algorithms, classical in the fields of optical character recognition, tissue engineering or non-destructive testing of high-tech structures. Automated thresholding algorithms select a specific threshold for each image without any a priori spatial information on the segmented object or any special calibration of the tomograph, as opposed to the usual thresholding methods for PET. Spherical (18)F-filled objects of different volumes were acquired on a clinical PET/CT and on a small animal PET scanner, with three different signal-to-background ratios. Images were segmented with the 12 automatic thresholding algorithms and the results were compared with the standard segmentation reference, a threshold at 42% of the maximum uptake. The Ridler and Ramesh thresholding algorithms, based on clustering and histogram-shape information, respectively, provided better results than the classical 42%-based threshold (p < 0.05). We have herein demonstrated that fully automated thresholding algorithms can provide better results than classical PET segmentation tools.
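
The Ridler (Ridler-Calvard, also known as ISODATA) algorithm is simple enough to sketch on a flat list of intensities rather than a real PET volume; the background and "tumour" values below are invented:

```python
# Ridler-Calvard (ISODATA) thresholding: iterate the threshold to the
# midpoint of the means of the two classes it induces. Values are invented.

def ridler_threshold(values, tol=1e-6):
    t = sum(values) / len(values)          # start at the global mean
    while True:
        low  = [v for v in values if v <= t]
        high = [v for v in values if v > t]
        if not low or not high:
            return t
        t_new = 0.5 * (sum(low) / len(low) + sum(high) / len(high))
        if abs(t_new - t) < tol:
            return t_new
        t = t_new

background = [1.0, 1.2, 0.9, 1.1] * 10
tumour     = [8.0, 8.5, 7.5, 9.0] * 3
t = ridler_threshold(background + tumour)
print(round(t, 2))  # -> 4.65
```

Unlike the 42%-of-maximum rule, the threshold here adapts to the data rather than to a single (possibly noisy) maximum voxel.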

  2. Threshold-adaptive canny operator based on cross-zero points

    NASA Astrophysics Data System (ADS)

    Liu, Boqi; Zhang, Xiuhua; Hong, Hanyu

    2018-03-01

    Canny edge detection[1] is a technique to extract useful structural information from different vision objects while dramatically reducing the amount of data to be processed. It has been widely applied in various computer vision systems. Two thresholds have to be set before edges are separated from the background; usually, two static values are chosen as the thresholds based on developer experience[2]. In this paper, a novel automatic thresholding method is proposed. The relation between the thresholds and cross-zero points is analyzed, and an interpolation function is deduced to determine the thresholds. Comprehensive experimental results demonstrate the effectiveness of the proposed method and its advantage for stable edge detection under changing illumination.
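
As a concrete example of deriving Canny thresholds automatically from image statistics, the widely used median heuristic is sketched below. This is not the paper's cross-zero-point interpolation; the sigma = 0.33 spread is a conventional choice:

```python
# Median heuristic for automatic Canny thresholds: place the low/high
# hysteresis thresholds symmetrically around the median intensity.
# This is a common heuristic, NOT the paper's cross-zero-point method.

def auto_canny_thresholds(pixels, sigma=0.33):
    ordered = sorted(pixels)
    median = ordered[len(ordered) // 2]
    low  = max(0, int((1.0 - sigma) * median))
    high = min(255, int((1.0 + sigma) * median))
    return low, high

pixels = [10, 20, 120, 125, 130, 135, 140, 250]  # toy 8-bit intensities
print(auto_canny_thresholds(pixels))  # -> (87, 172)
```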

  3. Evaluation of a threshold-based model of fatigue in gamma titanium aluminide following impact damage

    NASA Astrophysics Data System (ADS)

    Harding, Trevor Scott

    2000-10-01

    Recent interest in gamma titanium aluminide (gamma-TiAl) for use in gas turbine engine applications has centered on the low density and good elevated temperature strength retention of gamma-TiAl compared to current materials. However, the relatively low ductility and fracture toughness of gamma-TiAl leads to serious concerns regarding its ability to resist impact damage. Furthermore, the limited fatigue crack growth resistance of gamma-TiAl means that the potential for fatigue failures resulting from impact damage is real if a damage tolerant design approach is used. A threshold-based design approach may be required if fatigue crack growth from potential impact sites is to be avoided. The objective of the present research is to examine the feasibility of a threshold-based approach for the design of a gamma-TiAl low-pressure turbine blade subjected to both assembly-related impact damage and foreign object damage. Specimens of three different gamma-TiAl alloys were damaged in such a way as to simulate anticipated impact damage for a turbine blade. Step-loading fatigue tests were conducted at both room temperature and 600°C. In terms of the assembly-related impact damage, the results indicate that there is reasonably good agreement between the threshold-based predictions of the fatigue strength of damaged specimens and the measured data. However, some discrepancies do exist. In the case of very lightly damaged specimens, prediction of the resulting fatigue strength requires that a very conservative small-crack fatigue threshold be used. Consequently, the allowable design conditions are significantly reduced. For severely damaged specimens, an analytical approach found that the potential effects of residual stresses may be related to the discrepancies observed between the threshold-based model and measured fatigue strength data. 
In the case of foreign object damage, a good correlation was observed between impacts resulting in large cracks and a long-crack threshold-based

  4. CRADA Final Report for CRADA Number ORNL00-0605: Advanced Engine/Aftertreatment System R&D

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pihl, Josh A; West, Brian H; Toops, Todd J

    2011-10-01

    experiments confirmed the previous results regarding hydrocarbon reactivity: 1-pentene was the most efficient LNT reductant, followed by toluene. Injection location had minimal impact on the reactivity of these two compounds. Iso-octane was an ineffective LNT reductant, requiring high doses (resulting in high HC emissions) to achieve reasonable NOx conversions. Diesel fuel reactivity was sensitive to injection location, with the best performance achieved through fuel injection downstream of the DOC. This configuration generated large LNT temperature excursions, which probably improved the efficiency of the NOx storage/reduction process, but also resulted in very high HC emissions. The ORNL team demonstrated an LNT desulfation under 'road load' conditions using throttling, EGR, and in-pipe injection of diesel fuel. Flow reactor characterization of core samples cut from the front and rear of the engine-aged LNT revealed complex spatially dependent degradation mechanisms. The front of the catalyst contained residual sulfates, which impacted NOx storage and conversion efficiencies at high temperatures. The rear of the catalyst showed significant sintering of the washcoat and precious metal particles, resulting in lower NOx conversion efficiencies at low temperatures. Further flow reactor characterization of engine-aged LNT core samples established that low temperature performance was limited by slow release and reduction of stored NOx during regeneration. Carbon monoxide was only effective at regenerating the LNT at temperatures above 200 C; propene was unreactive even at 250 C. Low temperature operation also resulted in unselective NOx reduction, resulting in high emissions of both N2O and NH3. During the latter years of the CRADA, the focus was shifted from LNTs to other aftertreatment devices. 
Two years of the CRADA were spent developing detailed ammonia SCR device models with sufficient accuracy and computational efficiency to be used in development of model-based

  5. Dose-response effects of corneal anesthetics.

    PubMed

    Polse, K A; Keener, R J; Jauregui, M J

    1978-01-01

    With double-masking procedures, the dose-response curves for 0.1, 0.2, and 0.4% benoxinate and 0.125, 0.25, and 0.50% proparacaine hydrochloride were determined by monitoring changes in corneal touch threshold after applying each anesthetic. The level of corneal anesthesia necessary for applanation tonometry was also determined. The maximum increase in threshold that could be measured following instillation of 50 microliters of the drug was 200 mg/mm2. All six anesthetic solutions produced this amount of decreased corneal sensitivity. Recovery from the anesthetic was exponential for all concentrations; however, the lower doses had the shortest duration. For applanation tonometry, the corneal threshold for touch must be 75 mg/mm2 or higher. We conclude that a quarter to a half of the commonly used anesthetic dose is sufficient for routine tonometric evaluation.
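    The exponential recovery reported above implies a simple estimate of how long a given dose keeps the cornea above the tonometry threshold. A minimal sketch (the function name and the time constant are illustrative assumptions, not values from the study):

    ```python
    import math

    def anesthesia_window_min(peak_mg_mm2, tau_min, floor_mg_mm2=75.0):
        """Minutes until an exponentially decaying touch threshold,
        T(t) = peak * exp(-t / tau), falls to the 75 mg/mm2 floor
        required for applanation tonometry."""
        return tau_min * math.log(peak_mg_mm2 / floor_mg_mm2)
    ```

    For the 200 mg/mm2 ceiling reported above and an assumed tau of 10 min, the window is roughly 9.8 min; halving the peak shortens it logarithmically, which is why lower doses have the shortest duration.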

  6. Toxicity assessment strategies, data requirements, and risk assessment approaches to derive health based guidance values for non-relevant metabolites of plant protection products.

    PubMed

    Dekant, Wolfgang; Melching-Kollmuss, Stephanie; Kalberlah, Fritz

    2010-03-01

    In Europe, limits for tolerable concentrations of "non-relevant metabolites" of active ingredients (AI) of plant protection products in drinking water between 0.1 and 10 microg/L are discussed, depending on the toxicological information available. "Non-relevant metabolites" are degradation products of AIs that do not, or only partially, retain the targeted toxicities of the AIs. For "non-relevant metabolites" without genotoxicity (to be confirmed by testing in vitro), applying the concept of "thresholds of toxicological concern" results in a health-based drinking water limit of 4.5 microg/L even for Cramer class III compounds, using the TTC threshold of 90 microg/person/day (divided by 10 and 2). Taking into account the thresholds derived from two reproduction toxicity databases, a drinking water limit of 3.0 microg/L is proposed. Therefore, for "non-relevant metabolites" whose drinking water concentration is below 3.0 microg/L, no toxicity testing is necessary. This work develops a toxicity assessment strategy as a basis to delineate health-based limits for "non-relevant metabolites" in ground and drinking water. Toxicological testing is recommended to investigate whether the metabolites are relevant or not, based on the hazard properties of the parent AIs, as outlined in the SANCO Guidance document. Genotoxicity testing of the water metabolites is also clearly recommended. In this publication, tiered testing strategies are proposed for non-relevant metabolites when drinking water concentrations >3.0 microg/L will occur. Conclusions based on structure-activity relationships and the detailed toxicity database on the parent AI should be included. 
When testing in animals is required for risk assessment, key aspects are studies along OECD-testing guidelines with "enhanced" study designs addressing additional endpoints such as reproductive toxicity and a developmental screening test to derive health-based tolerable drinking water limits with a limited number

  7. Stress induction in the bacteria Shewanella oneidensis and Deinococcus radiodurans in response to below-background ionizing radiation.

    PubMed

    Castillo, Hugo; Schoderbek, Donald; Dulal, Santosh; Escobar, Gabriela; Wood, Jeffrey; Nelson, Roger; Smith, Geoffrey

    2015-01-01

    The 'linear no-threshold' (LNT) model predicts that any amount of radiation increases the risk that organisms accumulate negative effects. Several studies at below-background radiation levels (4.5-11.4 nGy h(-1)) show decreased growth rates and an increased susceptibility to oxidative stress. The purpose of our study was to obtain molecular evidence of a stress response in Shewanella oneidensis and Deinococcus radiodurans grown at a gamma dose rate of 0.16 nGy h(-1), about 400 times less than normal background radiation. Bacterial cultures were grown at a dose rate of 0.16 or 71.3 nGy h(-1) gamma irradiation. Total RNA was extracted from samples at early-exponential and stationary phases for rt-PCR relative quantification (radiation-deprived treatment/background-radiation control) of the stress-related genes katB (catalase), recA (recombinase), oxyR (oxidative stress transcriptional regulator), lexA (SOS regulon transcriptional repressor), dnaK (heat shock protein 70) and SOA0154 (putative heavy metal efflux pump). Deprivation of normal levels of radiation caused a reduction in growth of both bacterial species, accompanied by the upregulation of the katB, recA and SOA0154 genes in S. oneidensis and the upregulation of dnaK in D. radiodurans. When cells were returned to background radiation levels, growth rates recovered and the stress response dissipated. Our results indicate that below-background levels of radiation inhibited growth and elicited a stress response in two species of bacteria, contrary to the LNT model prediction.

  8. Introducing hydrological information in rainfall intensity-duration thresholds

    NASA Astrophysics Data System (ADS)

    Greco, Roberto; Bogaard, Thom

    2016-04-01

    Regional landslide hazard assessment is mainly based on empirically derived precipitation intensity-duration (PID) thresholds. Generally, two features of rainfall events are plotted to discriminate between observed occurrence and absence of mass movements, and a separation line is then drawn in logarithmic space. Although successfully applied in many case studies, such PID thresholds suffer from many false positives as well as limited physical process insight. One of the main limitations is that they do not include any information about the hydrological processes occurring along the slopes, so that triggering is related only to rainfall characteristics. In order to introduce such hydrological information into the definition of rainfall thresholds for assessing shallow landslide triggering, this study proposes the introduction of non-dimensional rainfall characteristics. In particular, rain storm depth, intensity and duration are divided by a characteristic infiltration depth, a characteristic infiltration rate and a characteristic duration, respectively. These latter variables depend on the hydraulic properties and on the moisture state of the soil cover at the beginning of the precipitation. The proposed variables are applied to the case of a slope covered with shallow pyroclastic deposits in Cervinara (southern Italy), for which experimental data of hourly rainfall and soil suction were available. Rainfall thresholds defined with the proposed non-dimensional variables perform significantly better than those defined with dimensional variables, in either the intensity-duration plane or the depth-duration plane.
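    The scaling step described above can be sketched in a few lines. Names are illustrative; in the paper the characteristic values come from soil hydraulic properties and antecedent moisture, which here are simply inputs:

    ```python
    def nondimensionalize(depth_mm, intensity_mm_h, duration_h,
                          char_depth_mm, char_rate_mm_h):
        """Scale rain depth, intensity and duration by a characteristic
        infiltration depth, infiltration rate, and duration (their ratio)."""
        char_duration_h = char_depth_mm / char_rate_mm_h
        return (depth_mm / char_depth_mm,
                intensity_mm_h / char_rate_mm_h,
                duration_h / char_duration_h)
    ```

    A wet antecedent soil state lowers the characteristic infiltration depth, so the same storm plots higher in the non-dimensional plane and is more likely to cross the threshold, which is the hydrological information the dimensional PID thresholds lack.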

  9. A spatially encoded dose difference maximal intensity projection map for patient dose evaluation: A new first line patient quality assurance tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hu Weigang; Graff, Pierre; Boettger, Thomas

    2011-04-15

    Purpose: To develop a spatially encoded dose difference maximal intensity projection (DD-MIP) as an online patient dose evaluation tool for visualizing the dose differences between the planning dose and dose on the treatment day. Methods: Megavoltage cone-beam CT (MVCBCT) images acquired on the treatment day are used for generating the dose difference index. Each index is represented by different colors for underdose, acceptable, and overdose regions. A maximal intensity projection (MIP) algorithm is developed to compress all the information of an arbitrary 3D dose difference index into a 2D DD-MIP image. In such an algorithm, a distance transformation is generated based on the planning CT. Then, two new volumes representing the overdose and underdose regions of the dose difference index are encoded with the distance transformation map. The distance-encoded indices of each volume are normalized using the skin distance obtained on the planning CT. After that, two MIPs are generated based on the underdose and overdose volumes with green-to-blue and green-to-red lookup tables, respectively. Finally, the two MIPs are merged with an appropriate transparency level and rendered in planning CT images. Results: The spatially encoded DD-MIP was implemented in a dose-guided radiotherapy prototype and tested on 33 MVCBCT images from six patients. The user can easily establish the threshold for the overdose and underdose. A 3% difference between the treatment and planning dose was used as the threshold in the study; hence, the DD-MIP shows red or blue color for dose differences >3% or ≤3%, respectively. With such a method, the overdose and underdose regions can be visualized and distinguished without being overshadowed by superficial dose differences. 
Conclusions: A DD-MIP algorithm was developed that compresses information from 3D into a single projection or two orthogonal projections, while indicating to the user whether the dose difference is on the skin surface or deeper.
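The classify-then-project core of such an algorithm can be sketched as follows. This is a hedged illustration with assumed names; the paper's distance encoding, skin-distance normalization, and lookup-table rendering are omitted:

```python
import numpy as np

def dose_difference_mip(planned, measured, threshold=0.03):
    """Label each voxel overdose (+1), acceptable (0) or underdose (-1)
    relative to the planning dose, then collapse each labeled volume to a
    2D map with a maximal intensity projection along the first axis."""
    rel = (measured - planned) / np.maximum(planned, 1e-6)
    index = np.zeros(rel.shape, dtype=int)
    index[rel > threshold] = 1
    index[rel < -threshold] = -1
    over_mip = (index == 1).max(axis=0)    # True where any voxel along the ray is overdosed
    under_mip = (index == -1).max(axis=0)  # True where any voxel along the ray is underdosed
    return index, over_mip, under_mip
```

Because the projection takes the maximum along each ray, a single flagged voxel anywhere in depth surfaces in the 2D map; the paper's distance encoding then distinguishes superficial from deep differences, which this sketch does not.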

  10. Rethinking the Clinically Based Thresholds of TransCelerate BioPharma for Risk-Based Monitoring.

    PubMed

    Zink, Richard C; Dmitrienko, Anastasia; Dmitrienko, Alex

    2018-01-01

    The quality of data from clinical trials has received a great deal of attention in recent years. Of central importance is the need to protect the well-being of study participants and maintain the integrity of final analysis results. However, traditional approaches to assess data quality have come under increased scrutiny as providing little benefit for the substantial cost. Numerous regulatory guidance documents and industry position papers have described risk-based approaches to identify quality and safety issues. In particular, the position paper of TransCelerate BioPharma recommends defining risk thresholds to assess safety and quality risks based on past clinical experience. This exercise can be extremely time-consuming, and the resulting thresholds may only be relevant to a particular therapeutic area, patient or clinical site population. In addition, predefined thresholds cannot account for safety or quality issues where the underlying rate of observing a particular problem may change over the course of a clinical trial, and often do not consider varying patient exposure. In this manuscript, we appropriate rules commonly utilized for funnel plots to define a traffic-light system for risk indicators based on statistical criteria that consider the duration of patient follow-up. Further, we describe how these methods can be adapted to assess changing risk over time. Finally, we illustrate numerous graphical approaches to summarize and communicate risk, and discuss hybrid clinical-statistical approaches to allow for the assessment of risk at sites with low patient enrollment. We illustrate the aforementioned methodologies for a clinical trial in patients with schizophrenia. Funnel plots are a flexible graphical technique that can form the basis for a risk-based strategy to assess data integrity, while considering site sample size, patient exposure, and changing risk across time.
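    The funnel-plot idea above can be sketched as a simple traffic-light rule. This is a hedged illustration using conventional 95%/99.8% normal-approximation control limits, not TransCelerate's or the authors' exact criteria; the exposure-adjusted and time-varying versions discussed in the paper would replace the site count n with patient-time at risk:

    ```python
    import math

    def site_flag(events, n, pooled_rate):
        """Flag a site by where its event rate falls relative to binomial
        funnel control limits around the pooled rate (normal approximation)."""
        se = math.sqrt(pooled_rate * (1 - pooled_rate) / n)
        z = (events / n - pooled_rate) / se
        if abs(z) > 3.09:   # outside ~99.8% limits
            return "red"
        if abs(z) > 1.96:   # outside ~95% limits
            return "amber"
        return "green"
    ```

    Because the standard error shrinks as n grows, small sites need a larger rate deviation to be flagged; that widening of the limits at low enrollment is exactly the funnel shape, and it addresses the low-enrollment-site problem the abstract raises.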

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doss, Mohan

    Oxidative damage has been implicated in the pathogenesis of most aging-related diseases, including neurodegenerative diseases. Antioxidant supplementation has been found to be ineffective in reducing such diseases, but increased endogenous production of antioxidants from the adaptive response due to physical and cognitive exercises (which increase oxidative metabolism and oxidative stress) has been effective in reducing some of the diseases. Low dose radiation (LDR), which increases oxidative stress and results in an adaptive response of increased antioxidants, may provide an alternative method of controlling the aging-related diseases. We have studied the effect of LDR on the induction of adaptive response in rat brains and the effectiveness of the LDR in reducing the oxidative damage caused by subsequent high dose radiation. We have also investigated the effect of LDR on apomorphine-induced rotations in the 6-hydroxydopamine (6-OHDA) unilaterally-lesioned rat model of Parkinson's disease (PD). LDR was observed to initiate an adaptive response in the brain, and reduce the oxidative damage from subsequent high dose radiation exposure, confirming the effectiveness of the LDR adaptive response in reducing the oxidative damage from the free radicals due to high dose radiation. LDR resulted in a slight improvement in tyrosine hydroxylase expression on the lesioned side of the substantia nigra (indicative of its protective effect on the dopaminergic neurons), and reduced the behavioral symptoms in the 6-OHDA rat model of PD. Translation of this concept to humans, if found to be applicable, may be a possible approach for controlling the progression of PD and other neurodegenerative diseases. Since any translation of the concept to humans would be hindered by the currently prevalent carcinogenic concerns regarding LDR based on the linear no-threshold (LNT) model, we have also studied the justifications for the use of the LNT model. One of the shortcomings of the LNT model is

  12. Electrical percolation threshold of cementitious composites possessing self-sensing functionality incorporating different carbon-based materials

    NASA Astrophysics Data System (ADS)

    Al-Dahawi, Ali; Haroon Sarwary, Mohammad; Öztürk, Oğuzhan; Yıldırım, Gürkan; Akın, Arife; Şahmaran, Mustafa; Lachemi, Mohamed

    2016-10-01

    An experimental study was carried out to understand the electrical percolation thresholds of different carbon-based nano- and micro-scale materials in cementitious composites. Multi-walled carbon nanotubes (CNTs), graphene nanoplatelets (GNPs) and carbon black (CB) were selected as the nano-scale materials, while 6 and 12 mm long carbon fibers (CF6 and CF12) were used as the micro-scale carbon-based materials. After determining the percolation thresholds of different electrical conductive materials, mechanical properties and piezoresistive properties of specimens produced with the abovementioned conductive materials at percolation threshold were investigated under uniaxial compressive loading. Results demonstrate that regardless of initial curing age, the percolation thresholds of CNT, GNP, CB and CFs in ECC mortar specimens were around 0.55%, 2.00%, 2.00% and 1.00%, respectively. Including different carbon-based conductive materials did not harm compressive strength results; on the contrary, it improved overall values. All cementitious composites produced with carbon-based materials, with the exception of the control mixtures, exhibited piezoresistive behavior under compression, which is crucial for sensing capability. It is believed that incorporating the sensing attribute into cementitious composites will enhance benefits for sustainable civil infrastructures.

  13. Automated aortic calcification detection in low-dose chest CT images

    NASA Astrophysics Data System (ADS)

    Xie, Yiting; Htwe, Yu Maw; Padgett, Jennifer; Henschke, Claudia; Yankelevitz, David; Reeves, Anthony P.

    2014-03-01

    The extent of aortic calcification has been shown to be a risk indicator for vascular events, including cardiac events. We have developed a fully automated computer algorithm to segment and measure aortic calcification in low-dose, non-contrast, non-ECG-gated chest CT scans. The algorithm first segments the aorta using a pre-computed Anatomy Label Map (ALM). Then, based on the segmented aorta, aortic calcification is detected and measured in terms of the Agatston score, mass score, and volume score. The automated scores are compared with reference scores obtained from manual markings. For aorta segmentation, the aorta is modeled as a series of discrete overlapping cylinders and the aortic centerline is determined using a cylinder-tracking algorithm. Then the aortic surface location is detected using the centerline and a triangular mesh model. The segmented aorta is used as a mask for the detection of aortic calcification. For calcification detection, the image is first filtered, then an elevated threshold of 160 Hounsfield units (HU) is used within the aorta mask region to reduce the effect of noise in low-dose scans, and finally non-aortic calcification voxels (bony structures, calcification in other organs) are eliminated. The remaining candidates are considered true aortic calcification. The computer algorithm was evaluated on 45 low-dose non-contrast CT scans. Using linear regression, the automated Agatston score is 98.42% correlated with the reference Agatston score. The automated mass and volume scores are, respectively, 98.46% and 98.28% correlated with the reference mass and volume scores.
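    The masked-threshold step above reduces to a few lines. A hedged sketch with illustrative names; the filtering, bone/organ elimination, and Agatston/mass weighting described in the abstract are omitted:

    ```python
    import numpy as np

    def calcification_volume_mm3(ct_hu, aorta_mask, voxel_mm3, threshold_hu=160):
        """Volume score from voxels inside the aorta mask at or above the
        elevated 160 HU threshold used to suppress noise in low-dose scans."""
        candidates = (ct_hu >= threshold_hu) & aorta_mask
        return float(candidates.sum()) * voxel_mm3
    ```

    Restricting the threshold test to the segmented aorta is what lets the elevated 160 HU cutoff work: bright voxels outside the mask (spine, ribs) never become candidates in the first place.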

  14. 198 AAAAI Survey on Immunotherapy Practice Patterns Concerning Dosing, Dose-Adjustment after Missed Doses and Duration of Immunotherapy

    PubMed Central

    Linnemann, Désirée Larenas; Gupta, Payel; Mithani, Sima; Ponda, Punita

    2012-01-01

    Background: Several practical issues dealing with the exact application of allergen immunotherapy (AIT) among European and US allergists are not well known. Guidelines on AIT give recommendations and suggestions for only some of them. We present this unique survey with worldwide response. Methods: The AAAAI immunotherapy committee conducted a web-based practice patterns survey (program: Survey Monkey) among all members in and outside the US on dosing, dose adjustment after missed doses, and duration of AIT. Results: 1201 questionnaires were returned (almost a 25% response rate); 21% were from non-US-Canada members. Maintenance doses in USCan are (mean/median): Dermatophagoides farinae (Df) combined with Dermatophagoides pteronyssinus (Dpt): 2155/1000 AU; Df solo: 2484/1000 AU; Dpt when combined with Df: 1937/1000 AU; Dpt solo: 2183/1000 AU; cat: 3224/2000 BAU; grass: 11,410/4000 BAU. 57-65% of the dosing falls within the ranges recommended by the Practice Parameters. Non-USCan allergists expressed maintenance doses in many different units, making analysis impossible. Dose adjustment after missed doses is based on ‘time elapsed since the last applied dose’ by 77% of USCan and 58% of non-USCan allergists, and on ‘time since missed scheduled dose’ by the rest. Doses are adjusted when a patient comes in more than 14 d/5 wk after the last administration at build-up/maintenance by both USCan and non-USCan colleagues. The most widely followed dose-adjustment schedules after 1, 2, or 3 missed doses are: build-up: repeat last dose, reduce by one dose, reduce by 2 doses; maintenance: reduce by one dose, reduce by 2 doses, reduce by 3 doses. 26% use a different approach, reducing doses by a certain percentage or volume. AIT is restarted after a gap in build-up of >30 days and of >12 weeks during maintenance in both groups (median). Outside USCan, AIT is prescribed for 3 years (median); however, 75% of USCan allergists prescribe AIT for 5 years. Main reasons why to continue AIT beyond 5 years:

  15. Thresholds of allergenic proteins in foods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hourihane, Jonathan O'B.; Knulst, Andre C.

    2005-09-01

    Threshold doses or Estimated Eliciting Doses (EEDs) represent an important new field of research in food allergy. Clinicians and regulators have embraced some toxicological concepts such as LOAEL and NOAEL and applied them to an area of significant clinical uncertainty and interest. The impact of intrinsic human factors (e.g., asthma and exercise) and extrinsic event factors (e.g., season, location and especially dose of allergen) on a future allergic reaction in the community needs to be considered carefully when interpreting results of clinical and research low-dose food challenges. The ongoing cooperation of food allergy research groups in medicine, food science and government will surely deliver results of the highest importance to the wider communities of allergology, food science and technology and the increasing number of allergic consumers.

  16. Methods for automatic trigger threshold adjustment

    DOEpatents

    Welch, Benjamin J; Partridge, Michael E

    2014-03-18

    Methods are presented for adjusting trigger threshold values to compensate for drift in the quiescent level of a signal monitored for initiating a data recording event, thereby avoiding false triggering conditions. Initial threshold values are periodically adjusted by re-measuring the quiescent signal level and offsetting the threshold values by the measured quiescent-level drift. Re-computation of the trigger threshold values can be implemented on time-based or counter-based criteria. Additionally, a qualification width counter can be utilized to require that a trigger threshold criterion be met a given number of times prior to initiating a data recording event, further reducing the possibility of a false triggering situation.
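    The two mechanisms described can be sketched as follows. Names and signatures are illustrative, not taken from the patent:

    ```python
    def adjust_thresholds(thresholds, old_quiescent, new_quiescent):
        """Shift all trigger thresholds by the measured quiescent-level drift."""
        drift = new_quiescent - old_quiescent
        return [t + drift for t in thresholds]

    def qualified_trigger(samples, threshold, width):
        """Fire only after the threshold is exceeded on `width` consecutive
        samples, suppressing one-off noise spikes."""
        run = 0
        for s in samples:
            run = run + 1 if s > threshold else 0
            if run >= width:
                return True
        return False
    ```

    The drift offset keeps the trigger margin constant as the baseline wanders, while the qualification counter rejects isolated excursions that cross the threshold only briefly.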

  17. Evidence for dose-additive effects of pyrethroids on motor activity in rats.

    PubMed

    Wolansky, Marcelo J; Gennings, Chris; DeVito, Michael J; Crofton, Kevin M

    2009-10-01

    Pyrethroids are neurotoxic insecticides used in a variety of indoor and outdoor applications. Previous research characterized the acute dose-effect functions for 11 pyrethroids administered orally in corn oil (1 mL/kg), based on assessment of motor activity. We used a mixture of these 11 pyrethroids and the same testing paradigm used in the single-compound assays to test the hypothesis that cumulative neurotoxic effects of pyrethroid mixtures can be predicted using default dose-addition theory. Mixing ratios of the 11 pyrethroids in the tested mixture were based on the ED30 (effective dose that produces a 30% decrease in response) of each individual chemical (i.e., the mixture comprised equipotent amounts of each pyrethroid). The highest concentration of each individual chemical in the mixture was less than the threshold for inducing behavioral effects. Adult male rats received acute oral exposure to corn oil (control) or dilutions of the stock mixture solution. The mixture of 11 pyrethroids was administered either simultaneously (2 hr before testing) or in a sequence based on the times of peak effect of the individual chemicals (4, 2, and 1 hr before testing). A threshold additivity model was fit to the single-chemical data to predict the theoretical dose-effect relationship for the mixture under the assumption of dose additivity. When subthreshold doses of the individual chemicals were combined in the mixtures, we found significant dose-related decreases in motor activity. Further, we found no departure from the predicted dose-additive curve regardless of the mixture dosing protocol used. In this article we present the first in vivo evidence on pyrethroid cumulative effects supporting the default assumption of dose addition.
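    The dose-addition assumption can be illustrated in one line: component doses are summed after scaling each by its own potency, so a mixture of individually subthreshold doses can still reach an effective total. A hedged sketch of that bookkeeping, not the authors' fitted threshold additivity model:

    ```python
    def ed30_equivalents(doses, ed30s):
        """Total mixture dose in ED30-equivalents under dose addition:
        the sum of each component dose divided by its ED30."""
        return sum(d / e for d, e in zip(doses, ed30s))
    ```

    Eleven components each dosed at one-eleventh of its own ED30 sum to 1.0 ED30-equivalent, i.e., a predicted 30% effect, even though every component alone sits below its behavioral threshold, which is exactly the pattern the study observed.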

  18. A longitudinal study on the ammonia threshold in junior cyclists

    PubMed Central

    Yuan, Y; Chan, K

    2004-01-01

    Objectives: To identify the effect of a one year non-specific training programme on the ammonia threshold of a group of junior cyclists and to correlate ammonia threshold with other common physiological variables. Methods: The cyclists performed tests at three time points (T1, T2, T3) during the year. Follow up tests were conducted every six months after the original test. Ammonia threshold was obtained from a graded exercise with four-minute steps. Results: The relatively non-specific one year training programme was effective in inducing an increase in peak VO2 (60.6 (5.9), 65.9 (7.4), and 64.6 (6.5) ml/min/kg at T1, T2, and T3 respectively) and endurance time (18.3 (4.5), 20.1 (5.2), and 27.0 (6.1) minutes at T1, T2, and T3 respectively), but was not effective for the sprint related variables. Ammonia threshold, together with lactate threshold and ventilatory threshold, was not significantly different at the three test times. Only endurance time correlated significantly with ammonia threshold (r = 0.915, p = 0.001). Conclusions: The findings suggest that a relatively non-specific one year training programme does not modify the ammonia threshold of junior cyclists. The significant correlation between ammonia threshold and endurance time further confirms that ammonia threshold is a measure of the ability to sustain exercise at submaximal intensities. PMID:15039242

  19. The New Radiobiology: Returning to Our Roots

    PubMed Central

    Ulsh, Brant A.

    2012-01-01

    In 2005, two expert advisory bodies examined the evidence on the effects of low doses of ionizing radiation. The U.S. National Research Council concluded that current scientific evidence is consistent with the linear no-threshold dose-response relationship (NRCNA 2005) while the French National Academies of Science and Medicine concluded the opposite (Aurengo et al. 2005). These contradictory conclusions may stem in part from an emphasis on epidemiological data (a “top down” approach) versus an emphasis on biological mechanisms (a “bottom up” approach). In this paper, the strengths and limitations of the top down and bottom up approaches are discussed, and proposals for strengthening and reconciling them are suggested. The past seven years since these two reports were published have yielded increasing evidence of nonlinear responses of biological systems to low radiation doses delivered at low dose-rates. This growing body of evidence is casting ever more doubt on the extrapolation of risks observed at high doses and dose-rates to estimate risks associated with typical environmental and occupational exposures. This paper compares current evidence on low dose, low dose-rate effects against objective criteria of causation. Finally, some questions for a post-LNT world are posed. PMID:23304107

  20. Non-hazardous pesticide concentrations in surface waters: An integrated approach simulating application thresholds and resulting farm income effects.

    PubMed

    Bannwarth, M A; Grovermann, C; Schreinemachers, P; Ingwersen, J; Lamers, M; Berger, T; Streck, T

    2016-01-01

    Pesticide application rates are high and increasing in upland agricultural systems in Thailand producing vegetables, fruits and ornamental crops, leading to the pollution of stream water with pesticide residues. The objective of this study was to determine the maximum per hectare application rates of two widely used pesticides that would achieve non-hazardous pesticide concentrations in the stream water and to evaluate how farm household incomes would be affected if farmers complied with these restricted application rates. For this purpose we perform an integrated modeling approach of a hydrological solute transport model (the Soil and Water Assessment Tool, SWAT) and an agent-based farm decision model (Mathematical Programming-based Multi-Agent Systems, MPMAS). SWAT was used to simulate the pesticide fate and behavior. The model was calibrated to a 77 km(2) watershed in northern Thailand. The results show that to stay under a pre-defined eco-toxicological threshold, the current average application of chlorothalonil (0.80 kg/ha) and cypermethrin (0.53 kg/ha) would have to be reduced by 80% and 99%, respectively. The income effect of such reductions was simulated using MPMAS. The results suggest that if farm households complied with the application thresholds then their income would reduce by 17.3% in the case of chlorothalonil and by 38.3% in the case of cypermethrin. Less drastic income effects can be expected if methods of integrated pest management were more widely available. The novelty of this study is to combine two models from distinctive disciplines to evaluate pesticide reduction scenarios based on real-world data from a single study site.

  1. Analytical modeling and feasibility study of a multi-GPU cloud-based server (MGCS) framework for non-voxel-based dose calculations.

    PubMed

    Neylon, J; Min, Y; Kupelian, P; Low, D A; Santhanam, A

    2017-04-01

    In this paper, a multi-GPU cloud-based server (MGCS) framework is presented for dose calculations, exploring the feasibility of remote computing power for parallelization and acceleration of computationally and time intensive radiotherapy tasks, in moving toward online adaptive therapies. An analytical model was developed to estimate theoretical MGCS performance acceleration and intelligently determine workload distribution. Numerical studies were performed with a computing setup of 14 GPUs distributed over 4 servers interconnected by a 1 gigabit per second (Gbps) network. Inter-process communication methods were optimized to facilitate resource distribution and minimize data transfers over the server interconnect. The analytically predicted computation times matched experimental observations within 1-5%. MGCS performance approached a theoretical limit of acceleration proportional to the number of GPUs utilized when computational tasks far outweighed memory operations. The MGCS implementation reproduced ground-truth dose computations with negligible differences by distributing the work among several processes and implementing optimization strategies. The results showed that a cloud-based computation engine is a feasible solution for enabling clinics to make use of fast dose calculations for advanced treatment planning and adaptive radiotherapy. The cloud-based system was able to exceed the performance of a local machine even for optimized calculations, and provided significant acceleration for computationally intensive tasks. Such a framework can provide access to advanced technology and computational methods to many clinics, providing an avenue for standardization across institutions without the requirements of purchasing, maintaining, and continually updating hardware.
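    The scaling behavior described (speedup approaching the GPU count only when compute dominates memory operations) follows the familiar Amdahl-style pattern. A hedged sketch of that pattern, not the authors' actual analytical model:

    ```python
    def predicted_speedup(n_gpus, compute_s, fixed_s):
        """Speedup for a task whose compute portion divides across n_gpus
        while memory/communication costs (fixed_s) do not."""
        t_one = compute_s + fixed_s
        t_n = compute_s / n_gpus + fixed_s
        return t_one / t_n
    ```

    With no fixed cost the speedup equals the GPU count, the theoretical limit mentioned above; as fixed memory and interconnect costs grow relative to compute, the predicted speedup falls well below it, which is why the framework minimizes data transfers over the server interconnect.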

  2. Seizure threshold increases can be predicted by EEG quality in right unilateral ultrabrief ECT.

    PubMed

    Gálvez, Verònica; Hadzi-Pavlovic, Dusan; Waite, Susan; Loo, Colleen K

    2017-12-01

    Increases in seizure threshold (ST) over a course of brief pulse ECT can be predicted by decreases in EEG quality, informing ECT dose adjustment to maintain adequate supra-threshold dosing. ST increases also occur over a course of right unilateral ultrabrief (RUL UB) ECT, but no data exist on the relationship between ST increases and EEG indices. This study (n = 35) investigated if increases in ST over RUL UB ECT treatments could be predicted by a decline in seizure quality. ST titration was performed at ECT session one and seven, with treatment dosing maintained stable (at 6-8 times ST) in intervening sessions. Seizure quality indices (slow-wave onset, mid-ictal amplitude, regularity, stereotypy, and post-ictal suppression) were manually rated at the first supra-threshold treatment, and last supra-threshold treatment before re-titration, using a structured rating scale, by a single trained rater blinded to the ECT session being rated. Twenty-one subjects (60%) had a ST increase. The association between ST changes and EEG quality indices was analysed by logistic regression, yielding a significant model (p < 0.001). Initial ST (p < 0.05) and percentage change in mid-ictal amplitude (p < 0.05) were significant predictors of change in ST. Percentage change in post-ictal suppression reached trend level significance (p = 0.065). Increases in ST over a RUL UB ECT course may be predicted by decreases in seizure quality, specifically decline in mid-ictal amplitude and potentially in post-ictal suppression. Such EEG indices may be able to inform when dose adjustments are necessary to maintain adequate supra-threshold dosing in RUL UB ECT.

  3. Health Effects of High Radon Environments in Central Europe: Another Test for the LNT Hypothesis?

    PubMed Central

    Becker, Klaus

    2003-01-01

    For Morbus Bechterew (ankylosing spondylitis), radon treatments are beneficial, with the positive effect lasting until at least 6 months after the normally 3-week treatment by inhalation or baths. Studies on the mechanism of these effects are progressing. In other cases of extensive use of radon treatment for a wide spectrum of diseases, for example, in the former Soviet Union, the positive results are not so well established. However, according to a century of radon treatment experience (after millennia of unwitting radon therapy), in particular in Germany and Austria, the positive medical effects for some diseases far exceed any potential detrimental health effects. The total amount of available data in this field is too large to be covered in a brief review. Therefore, less known — in particular recent — work from Central Europe has been analyzed in an attempt to summarize new developments and trends. This includes cost/benefit aspects of radon reduction programs. As a test case for the LNT (linear non-threshold) hypothesis and possible biopositive effects of low radiation exposures, the data support a nonlinear human response to low and medium-level radon exposures. PMID:19330110

  4. Optimizing drug-dose alerts using commercial software throughout an integrated health care system.

    PubMed

    Saiyed, Salim M; Greco, Peter J; Fernandes, Glenn; Kaelber, David C

    2017-11-01

    All default electronic health record and drug reference database vendor drug-dose alerting recommendations (single dose, daily dose, dose frequency, and dose duration) were silently turned on in inpatient, outpatient, and emergency department areas for pediatric-only and nonpediatric-only populations. Drug-dose alerts were evaluated during a 3-month period. Drug-dose alerts fired on 12% of orders (104 098/834 911). System-level and drug-specific strategies to decrease drug-dose alerts were analyzed. System-level strategies included: (1) turning off all minimum drug-dosing alerts, (2) turning off all incomplete information drug-dosing alerts, (3) increasing the maximum single-dose drug-dose alert threshold to 125%, (4) increasing the daily dose maximum drug-dose alert threshold to 125%, and (5) increasing the dose frequency drug-dose alert threshold to more than 2 doses per day above initial threshold. Drug-specific strategies included changing drug-specific maximum single and maximum daily drug-dose alerting parameters for the top 22 drug categories by alert frequency. System-level approaches decreased alerting to 5% (46 988/834 911) and drug-specific approaches decreased alerts to 3% (25 455/834 911). Drug-dose alerts varied between care settings and patient populations. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
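    The effect of raising a maximum-dose alert threshold to 125% of the nominal limit, as in system-level strategies (3) and (4) above, can be sketched as follows (the order doses and the helper function are invented for illustration, not taken from the study's data):

```python
# Illustrative sketch of threshold-based drug-dose alerting: only orders
# whose dose exceeds threshold_factor * max_dose fire an alert.

def alert_rate(ordered_doses, max_dose, threshold_factor=1.0):
    """Fraction of orders whose dose exceeds threshold_factor * max_dose."""
    limit = threshold_factor * max_dose
    alerts = sum(1 for d in ordered_doses if d > limit)
    return alerts / len(ordered_doses)

orders = [80, 95, 102, 110, 120, 130, 140]  # mg; hypothetical ordered doses
print(alert_rate(orders, max_dose=100))                          # default limit
print(alert_rate(orders, max_dose=100, threshold_factor=1.25))   # limit at 125%
```

    Raising the cutoff to 125% suppresses alerts for orders only slightly above the nominal maximum, the same lever the study used to cut the overall alert rate from 12% to 5%.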

  5. SU-F-T-267: A Clarkson-Based Independent Dose Verification for the Helical Tomotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nagata, H; Juntendo University, Hongo, Tokyo; Hongo, H

    2016-06-15

    Purpose: There have been few reports of independent dose verification for Tomotherapy. We evaluated the accuracy and effectiveness of an independent dose verification system for Tomotherapy. Methods: Simple MU Analysis (SMU, Triangle Product, Ishikawa, Japan) was used as the independent verification system; it implements a Clarkson-based dose calculation algorithm using the CT image dataset. For dose calculation in the SMU, the Tomotherapy machine-specific dosimetric parameters (TMR, Scp, OAR and MLC transmission factor) were registered as the machine beam data. Dose calculation was performed after the Tomotherapy sinogram from the DICOM-RT plan information was converted to MU and MLC location information at more finely segmented control points. The performance of the SMU was assessed by point dose measurement in non-IMRT and IMRT plans (simple target and mock prostate plans). Subsequently, 30 patients' treatment plans for prostate were compared. Results: From the comparison, dose differences between the SMU and the measurement were within 3% for all cases in non-IMRT plans. In the IMRT plan for the simple target, the differences (Average±1SD) were −0.70±1.10% (SMU vs. TPS), −0.40±0.10% (measurement vs. TPS) and −1.20±1.00% (measurement vs. SMU), respectively. For the mock prostate, the differences were −0.40±0.60% (SMU vs. TPS), −0.50±0.90% (measurement vs. TPS) and −0.90±0.60% (measurement vs. SMU), respectively. For patients' plans, the difference was −0.50±2.10% (SMU vs. TPS). Conclusion: A Clarkson-based independent dose verification for Tomotherapy can be clinically available as a secondary check with a tolerance level similar to that of AAPM Task Group 114. This research is partially supported by the Japan Agency for Medical Research and Development (AMED).

  6. A strategy for systemic toxicity assessment based on non-animal approaches: The Cosmetics Europe Long Range Science Strategy programme.

    PubMed

    Desprez, Bertrand; Dent, Matt; Keller, Detlef; Klaric, Martina; Ouédraogo, Gladys; Cubberley, Richard; Duplan, Hélène; Eilstein, Joan; Ellison, Corie; Grégoire, Sébastien; Hewitt, Nicola J; Jacques-Jamin, Carine; Lange, Daniela; Roe, Amy; Rothe, Helga; Blaauboer, Bas J; Schepky, Andreas; Mahony, Catherine

    2018-08-01

    When performing safety assessment of chemicals, evaluating their systemic toxicity based only on non-animal approaches is a challenging objective. The Safety Evaluation Ultimately Replacing Animal Testing programme (SEURAT-1) addressed this question from 2011 to 2015 and showed that further research and development of adequate tools in toxicokinetics and toxicodynamics are required for performing non-animal safety assessments. It also showed how to implement tools like thresholds of toxicological concern (TTCs) and read-across in this context. This paper presents a tiered scientific workflow and shows how each tier addresses the four steps of the risk assessment paradigm. Cosmetics Europe established its Long Range Science Strategy (LRSS) programme, running from 2016 to 2020, based on the outcomes of SEURAT-1 to implement this workflow. Dedicated projects address each step of this workflow, which is introduced here. It tackles the question of evaluating the internal dose when systemic exposure occurs. The applicability of the workflow will be shown through a series of case studies, which will be published separately. Although the LRSS puts the emphasis on safety assessment of cosmetics-relevant chemicals, it remains applicable to any type of chemical. Copyright © 2018. Published by Elsevier Ltd.

  7. Feasibility of a low-dose orbital CT protocol with a knowledge-based iterative model reconstruction algorithm for evaluating Graves' orbitopathy.

    PubMed

    Lee, Ho-Joon; Kim, Jinna; Kim, Ki Wook; Lee, Seung-Koo; Yoon, Jin Sook

    2018-06-23

    To evaluate the clinical feasibility of low-dose orbital CT with a knowledge-based iterative model reconstruction (IMR) algorithm for evaluating Graves' orbitopathy. Low-dose orbital CT was performed with a CTDIvol of 4.4 mGy. In 12 patients for whom prior or subsequent non-low-dose orbital CT data obtained within 12 months were available, background noise, SNR, and CNR were compared for images generated using filtered back projection (FBP), hybrid iterative reconstruction (iDose4), and IMR and non-low-dose CT images. Comparison of clinically relevant measurements for Graves' orbitopathy, such as rectus muscle thickness and retrobulbar fat area, was performed in a subset of 6 patients who underwent CT for causes other than Graves' orbitopathy, by using the Wilcoxon signed-rank test. The lens dose estimated from skin dosimetry on a phantom was 4.13 mGy, on average 59.34% lower than that of the non-low-dose protocols. Image quality in terms of background noise, SNR, and CNR was the best for IMR, followed by non-low-dose CT, iDose4, and FBP, in descending order. A comparison of clinically relevant measurements revealed no significant difference in the retrobulbar fat area and the inferior and medial rectus muscle thicknesses between the low-dose and non-low-dose CT images. Low-dose CT with IMR may be performed without significantly affecting the measurement of prognostic parameters for Graves' orbitopathy while lowering the lens dose and image noise. Copyright © 2018 Elsevier Inc. All rights reserved.

  8. The dose-response relationship between the patch test and ROAT and the potential use for regulatory purposes.

    PubMed

    Fischer, Louise Arup; Voelund, Aage; Andersen, Klaus Ejner; Menné, Torkil; Johansen, Jeanne Duus

    2009-10-01

    Allergic contact dermatitis is common and can be prevented. The relationship between thresholds for patch tests and the repeated open application test (ROAT) is unclear. It would be desirable if patch test and ROAT data from already sensitized individuals could be used in prevention. The aim was to develop an equation that could predict the response to an allergen in a ROAT based on the dose-response curve derived by patch testing. Results from two human experimental elicitation studies with non-volatile allergens, nickel and the preservative methyldibromo glutaronitrile (MDBGN), were analysed by logistic dose-response statistics. The relation for volatile compounds was investigated using the results from experiments with the fragrance chemicals hydroxyisohexyl 3-cyclohexene carboxaldehyde and isoeugenol. For non-volatile compounds, the outcome of a ROAT can be estimated from the patch test by: ED_xx(ROAT) = 0.0296 × ED_xx(patch test). For volatile compounds, the equation predicts that the response in the ROAT is more severe than the patch test response, but it overestimates the response. This equation may be used for non-volatile compounds other than nickel and MDBGN, after further validation. The relationship between the patch test and the ROAT can be used for prevention, to set safe levels of allergen exposure based on patch test data.
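    The conversion for non-volatile compounds is simple enough to express directly in code (a minimal sketch of the published equation; the function name and example value are ours):

```python
# Published relation for non-volatile allergens:
# ED_xx(ROAT) = 0.0296 * ED_xx(patch test), same dose units in and out.

def roat_ed_from_patch(patch_ed):
    """Estimate the ROAT elicitation dose from the patch-test dose."""
    return 0.0296 * patch_ed

# e.g. a hypothetical patch-test ED of 1.0 maps to roughly 3% of that dose
# under repeated open application.
print(roat_ed_from_patch(1.0))
```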

  9. Analysis of dynamic thresholds for the normalized difference water index

    USGS Publications Warehouse

    Ji, Lei; Zhang, Li; Wylie, Bruce K.

    2009-01-01

    The normalized difference water index (NDWI) has been successfully used to delineate surface water features. However, two major problems have often been encountered: (a) NDWIs calculated from different band combinations [visible, near-infrared, or shortwave-infrared (SWIR)] can generate different results, and (b) NDWI thresholds vary depending on the proportions of subpixel water/non-water components. We need to evaluate all the NDWIs to determine the best performing index and to establish appropriate thresholds for clearly identifying water features. We used the spectral data obtained from a spectral library to simulate the satellite sensors Landsat ETM+, SPOT-5, ASTER, and MODIS, and calculated the simulated NDWI in different forms. We found that the NDWI calculated from (green − SWIR)/(green + SWIR), where SWIR is the shorter-wavelength region (1.2 to 1.8 μm), has the most stable threshold. We recommend this NDWI be employed for mapping water, but adjustment of the threshold based on actual situations is necessary. © 2009 American Society for Photogrammetry and Remote Sensing.
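    The recommended index can be sketched in a few lines (a minimal illustration; the reflectance values and the zero default threshold are assumptions, and the paper itself notes the threshold should be adjusted per scene):

```python
# NDWI in the form the paper recommends: (green - SWIR) / (green + SWIR),
# with SWIR taken from the 1.2-1.8 um region.

def ndwi(green, swir):
    """Normalized difference water index from green and SWIR reflectance."""
    return (green - swir) / (green + swir)

def is_water(green, swir, threshold=0.0):
    """Classify a pixel as water when NDWI exceeds a scene-tuned threshold."""
    return ndwi(green, swir) > threshold

print(round(ndwi(0.30, 0.05), 3))  # water-like pixel: strongly positive NDWI
print(round(ndwi(0.10, 0.25), 3))  # dry soil/vegetation: negative NDWI
```

    Water absorbs strongly in the SWIR, so water pixels drive the index positive; the subpixel mixing problem in the abstract corresponds to intermediate NDWI values near the threshold.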

  10. Printed dose-recording tag based on organic complementary circuits and ferroelectric nonvolatile memories

    PubMed Central

    Nga Ng, Tse; Schwartz, David E.; Mei, Ping; Krusor, Brent; Kor, Sivkheng; Veres, Janos; Bröms, Per; Eriksson, Torbjörn; Wang, Yong; Hagel, Olle; Karlsson, Christer

    2015-01-01

    We have demonstrated a printed electronic tag that monitors time-integrated sensor signals and writes to nonvolatile memories for later readout. The tag is additively fabricated on flexible plastic foil and comprises a thermistor divider, complementary organic circuits, and two nonvolatile memory cells. With a supply voltage below 30 V, the threshold temperatures can be tuned between 0 °C and 80 °C. The time-temperature dose measurement is calibrated for minute-scale integration. The two memory bits are sequentially written in a thermometer code to provide an accumulated dose record. PMID:26307438
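    The thermometer-code dose record described above can be sketched as follows (a hypothetical illustration; the threshold values and bit convention are invented, not taken from the tag's design):

```python
# Thermometer-code readout: with two memory bits, an accumulated dose
# crossing successive thresholds sets bits in order (00 -> 10 -> 11),
# so the stored record is monotone in the dose.

def thermometer_code(dose, thresholds):
    """Return one bit per threshold, set if the dose has crossed it."""
    return [1 if dose >= t else 0 for t in sorted(thresholds)]

print(thermometer_code(5.0,  [10.0, 20.0]))  # below both thresholds
print(thermometer_code(15.0, [10.0, 20.0]))  # first threshold crossed
print(thermometer_code(25.0, [10.0, 20.0]))  # both crossed
```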

  11. Degraded Chinese rubbing images thresholding based on local first-order statistics

    NASA Astrophysics Data System (ADS)

    Wang, Fang; Hou, Ling-Ying; Huang, Han

    2017-06-01

    It is a necessary step for Chinese character segmentation from degraded document images in Optical Character Recognition (OCR); however, it is challenging due to the various kinds of noise in such images. In this paper, we present three local first-order statistics methods for adaptive thresholding to segment text and non-text regions of Chinese rubbing images. Segmentation results were assessed both by visual inspection and numerically. In experiments, the methods obtained better results than classical techniques for the binarization of real Chinese rubbing images and the PHIBD 2012 datasets.
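    The abstract does not detail the three statistics used, so as one plausible instance, thresholding from local first-order statistics can be illustrated with a Niblack-style rule, T = local mean + k · local std (the window size, k value, and toy image below are all assumptions):

```python
# Adaptive binarization from local first-order statistics (Niblack-style):
# each pixel is thresholded against the mean and std of its neighborhood.
import math

def local_threshold(image, x, y, radius=1, k=-0.2):
    """Threshold for pixel (x, y) from mean/std of its (2r+1)^2 window."""
    h, w = len(image), len(image[0])
    vals = [image[j][i]
            for j in range(max(0, y - radius), min(h, y + radius + 1))
            for i in range(max(0, x - radius), min(w, x + radius + 1))]
    mean = sum(vals) / len(vals)
    std = math.sqrt(sum((v - mean) ** 2 for v in vals) / len(vals))
    return mean + k * std

def binarize(image, radius=1, k=-0.2):
    """1 = text (dark stroke), 0 = background, per local threshold."""
    return [[1 if px < local_threshold(image, x, y, radius, k) else 0
             for x, px in enumerate(row)]
            for y, row in enumerate(image)]

# Tiny synthetic "rubbing": a dark horizontal stroke on a noisy background.
img = [[200, 205, 198],
       [ 40,  35,  45],
       [210, 202, 207]]
print(binarize(img))
```

    Because the threshold adapts to each neighborhood, the rule tolerates the uneven background typical of rubbing images better than a single global cut.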

  12. Weight-based dosing in medication use: what should we know?

    PubMed Central

    Pan, Sheng-dong; Zhu, Ling-ling; Chen, Meng; Xia, Ping; Zhou, Quan

    2016-01-01

    Background Weight-based dosing strategy is still challenging due to poor awareness and adherence. It is necessary to let clinicians know of the latest developments in this respect and the correct circumstances in which weight-based dosing is of clinical relevance. Methods A literature search was conducted using PubMed. Results Clinical indications, physiological factors, and types of medication may determine the applicability of weight-based dosing. In some cases, the weight effect may be minimal or the proper dosage can only be determined when weight is combined with other factors. Medications within a similar therapeutic or structural class (eg, anticoagulants, anti-tumor necrosis factor medications, P2Y12-receptor antagonists, and anti-epidermal growth factor receptor antibodies) may exhibit differences in requirements for weight-based dosing. In some cases, weight-based dosing is superior to the currently recommended fixed-dose regimen in adult patients (eg, hydrocortisone, vancomycin, linezolid, and aprotinin). On the contrary, fixed dosing is noninferior to or even better than the currently recommended weight-based regimen in adult patients in some cases (eg, cyclosporine microemulsion, recombinant activated Factor VII, and epoetin α). Ideal body-weight-based dosing may be superior to the currently recommended total body-weight-based regimen (eg, atracurium and rocuronium). For dosing in pediatrics, whether weight-based dosing is better than body surface-area-based dosing depends on the particular medication (eg, methotrexate, prednisone, prednisolone, zidovudine, didanosine, growth hormone, and 13-cis-retinoic acid). Age-based dosing strategy is better than weight-based dosing in some cases (eg, intravenous busulfan and dalteparin). Dosing guided by pharmacogenetic testing did not show a pharmacoeconomic advantage over weight-adjusted dosing of 6-mercaptopurine. The common viewpoint (ie, pediatric patients should be dosed on the basis of body weight) is not always

  13. OPTIMIZING THE PRECISION OF TOXICITY THRESHOLD ESTIMATION USING A TWO-STAGE EXPERIMENTAL DESIGN

    EPA Science Inventory

    An important consideration for risk assessment is the existence of a threshold, i.e., the highest toxicant dose where the response is not distinguishable from background. We have developed methodology for finding an experimental design that optimizes the precision of threshold mo...

  14. Model-based dose selection for phase III rivaroxaban study in Japanese patients with non-valvular atrial fibrillation.

    PubMed

    Tanigawa, Takahiko; Kaneko, Masato; Hashizume, Kensei; Kajikawa, Mariko; Ueda, Hitoshi; Tajiri, Masahiro; Paolini, John F; Mueck, Wolfgang

    2013-01-01

    The global ROCKET AF phase III trial evaluated rivaroxaban 20 mg once daily (o.d.) for stroke prevention in atrial fibrillation (AF). Based on rivaroxaban pharmacokinetics in Japanese subjects and lower anticoagulation preferences in Japan, particularly in elderly patients, the optimal dose regimen for Japanese AF patients was considered. The aim of this analysis was dose selection for Japanese patients from a pharmacokinetic perspective, by comparing simulated exposures in Japanese patients with those in Caucasian patients. In the population pharmacokinetic-pharmacodynamic analyses, a one-compartment pharmacokinetic model with first-order absorption and direct-link pharmacokinetic-pharmacodynamic models optimally described the plasma concentrations and pharmacodynamic endpoints (Factor Xa activity, prothrombin time, activated partial thromboplastin time, and HepTest), consistent with previous work. Steady-state simulations indicated that a 15 mg rivaroxaban o.d. dose in Japanese patients with AF would yield exposures comparable to the 20 mg o.d. dose in Caucasian patients with AF. In conclusion, in the context of the lower anticoagulation targets in Japanese practice, the population pharmacokinetic and pharmacodynamic modeling supports 15 mg o.d. as the principal rivaroxaban dose in J-ROCKET AF.
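    The model class named above, a one-compartment model with first-order absorption, has a closed-form concentration profile (the Bateman equation). The sketch below uses invented parameter values for illustration, not rivaroxaban estimates:

```python
# One-compartment PK model with first-order absorption (Bateman equation):
# C(t) = F*D*ka / (V*(ka - ke)) * (exp(-ke*t) - exp(-ka*t))
import math

def concentration(t, dose, ka, ke, v, f=1.0):
    """Plasma concentration at time t after a single oral dose."""
    return f * dose * ka / (v * (ka - ke)) * (math.exp(-ke * t) - math.exp(-ka * t))

# Hypothetical parameters: 15 mg dose, ka = 1.0 /h, ke = 0.1 /h, V = 50 L.
times = [0.5, 1, 2, 4, 8, 24]
profile = [round(concentration(t, 15, 1.0, 0.1, 50), 3) for t in times]
print(profile)  # mg/L; rises to a peak a few hours post-dose, then declines
```

    Simulating such profiles at steady state for different doses and covariate sets is, in outline, how exposure matching between populations is assessed.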

  15. The influence of thresholds on the risk assessment of carcinogens in food.

    PubMed

    Pratt, Iona; Barlow, Susan; Kleiner, Juliane; Larsen, John Christian

    2009-08-01

    The risks from exposure to chemical contaminants in food must be scientifically assessed, in order to safeguard the health of consumers. Risk assessment of chemical contaminants that are both genotoxic and carcinogenic presents particular difficulties, since the effects of such substances are normally regarded as being without a threshold. No safe level can therefore be defined, and this has implications for both risk management and risk communication. Risk management of these substances in food has traditionally involved application of the ALARA (As Low as Reasonably Achievable) principle, however ALARA does not enable risk managers to assess the urgency and extent of the risk reduction measures needed. A more refined approach is needed, and several such approaches have been developed. Low-dose linear extrapolation from animal carcinogenicity studies or epidemiological studies to estimate risks for humans at low exposure levels has been applied by a number of regulatory bodies, while more recently the Margin of Exposure (MOE) approach has been applied by both the European Food Safety Authority and the Joint FAO/WHO Expert Committee on Food Additives. A further approach is the Threshold of Toxicological Concern (TTC), which establishes exposure thresholds for chemicals present in food, dependent on structure. Recent experimental evidence that genotoxic responses may be thresholded has significant implications for the risk assessment of chemicals that are both genotoxic and carcinogenic. In relation to existing approaches such as linear extrapolation, MOE and TTC, the existence of a threshold reduces the uncertainties inherent in such methodology and improves confidence in the risk assessment. 
However, for the foreseeable future, regulatory decisions based on the concept of thresholds for genotoxic carcinogens are likely to be taken case-by-case, based on convincing data on the Mode of Action indicating that the rate limiting variable for the development of cancer

  16. A non-rigid point matching method with local topology preservation for accurate bladder dose summation in high dose rate cervical brachytherapy.

    PubMed

    Chen, Haibin; Zhong, Zichun; Liao, Yuliang; Pompoš, Arnold; Hrycushko, Brian; Albuquerque, Kevin; Zhen, Xin; Zhou, Linghong; Gu, Xuejun

    2016-02-07

    GEC-ESTRO guidelines for high dose rate cervical brachytherapy advocate the reporting of the D2cc (the minimum dose received by the maximally exposed 2cc volume) to organs at risk. Due to large interfractional organ motion, reporting of accurate cumulative D2cc over a multifractional course is a non-trivial task requiring deformable image registration and deformable dose summation. To efficiently and accurately describe the point-to-point correspondence of the bladder wall over all treatment fractions while preserving local topologies, we propose a novel graphics processing unit (GPU)-based non-rigid point matching algorithm. This is achieved by introducing local anatomic information into the iterative update of correspondence matrix computation in the 'thin plate splines-robust point matching' (TPS-RPM) scheme. The performance of the GPU-based TPS-RPM with local topology preservation algorithm (TPS-RPM-LTP) was evaluated using four numerically simulated synthetic bladders having known deformations, a custom-made porcine bladder phantom embedded with twenty-one fiducial markers, and 29 fractional computed tomography (CT) images from seven cervical cancer patients. Results show that TPS-RPM-LTP achieved excellent geometric accuracy with landmark residual distance error (RDE) of 0.7 ± 0.3 mm for the numerical synthetic data with different scales of bladder deformation and structure complexity, and 3.7 ± 1.8 mm and 1.6 ± 0.8 mm for the porcine bladder phantom with large and small deformation, respectively. The RDE accuracy of the urethral orifice landmarks in patient bladders was 3.7 ± 2.1 mm. When compared to the original TPS-RPM, the TPS-RPM-LTP improved landmark matching by reducing landmark RDE by 50 ± 19%, 37 ± 11% and 28 ± 11% for the synthetic, porcine phantom and the patient bladders, respectively. This was achieved with a computational time of less than 15 s in all cases.

  17. A non-rigid point matching method with local topology preservation for accurate bladder dose summation in high dose rate cervical brachytherapy

    NASA Astrophysics Data System (ADS)

    Chen, Haibin; Zhong, Zichun; Liao, Yuliang; Pompoš, Arnold; Hrycushko, Brian; Albuquerque, Kevin; Zhen, Xin; Zhou, Linghong; Gu, Xuejun

    2016-02-01

    GEC-ESTRO guidelines for high dose rate cervical brachytherapy advocate the reporting of the D2cc (the minimum dose received by the maximally exposed 2cc volume) to organs at risk. Due to large interfractional organ motion, reporting of accurate cumulative D2cc over a multifractional course is a non-trivial task requiring deformable image registration and deformable dose summation. To efficiently and accurately describe the point-to-point correspondence of the bladder wall over all treatment fractions while preserving local topologies, we propose a novel graphics processing unit (GPU)-based non-rigid point matching algorithm. This is achieved by introducing local anatomic information into the iterative update of correspondence matrix computation in the 'thin plate splines-robust point matching' (TPS-RPM) scheme. The performance of the GPU-based TPS-RPM with local topology preservation algorithm (TPS-RPM-LTP) was evaluated using four numerically simulated synthetic bladders having known deformations, a custom-made porcine bladder phantom embedded with twenty-one fiducial markers, and 29 fractional computed tomography (CT) images from seven cervical cancer patients. Results show that TPS-RPM-LTP achieved excellent geometric accuracy with landmark residual distance error (RDE) of 0.7 ± 0.3 mm for the numerical synthetic data with different scales of bladder deformation and structure complexity, and 3.7 ± 1.8 mm and 1.6 ± 0.8 mm for the porcine bladder phantom with large and small deformation, respectively. The RDE accuracy of the urethral orifice landmarks in patient bladders was 3.7 ± 2.1 mm. When compared to the original TPS-RPM, the TPS-RPM-LTP improved landmark matching by reducing landmark RDE by 50 ± 19%, 37 ± 11% and 28 ± 11% for the synthetic, porcine phantom and the patient bladders, respectively. This was achieved with a computational time of less than 15 s in all cases.

  18. Bitemporal Versus High-Dose Unilateral Twice-Weekly Electroconvulsive Therapy for Depression (EFFECT-Dep): A Pragmatic, Randomized, Non-Inferiority Trial.

    PubMed

    Semkovska, Maria; Landau, Sabine; Dunne, Ross; Kolshus, Erik; Kavanagh, Adam; Jelovac, Ana; Noone, Martha; Carton, Mary; Lambe, Sinead; McHugh, Caroline; McLoughlin, Declan M

    2016-04-01

    ECT is the most effective treatment for severe depression. Previous efficacy studies, using thrice-weekly brief-pulse ECT, reported that high-dose (6× seizure threshold) right unilateral ECT is similar to bitemporal ECT but may have fewer cognitive side effects. The authors aimed to compare the effectiveness and cognitive side effects of twice-weekly moderate-dose (1.5× seizure threshold) bitemporal ECT with those of high-dose unilateral ECT in real-world practice. This was a pragmatic, patient- and rater-blinded, noninferiority trial of patients with major depression (N=138; 63% female; age=56.7 years [SD=14.8]) in a national ECT service with a 6-month follow-up. Participants were independently randomly assigned to bitemporal or high-dose unilateral ECT. The primary outcome was change in the 24-item Hamilton Depression Rating Scale (HAM-D) score after the ECT course; the prespecified noninferiority margin was 4.0 points. Secondary outcomes included response and remission rates, relapse status after 6 months, and cognition. Of the eligible patients, 69 were assigned to bitemporal ECT and 69 to unilateral ECT. High-dose unilateral ECT was noninferior to bitemporal ECT regarding the 24-item HAM-D scores after the ECT course (mean difference=1.08 points in favor of unilateral ECT [95% CI=-1.67 to 3.84]). There were no significant differences in response and remission rates or 6-month relapse status. Recovery of orientation was quicker following unilateral ECT (median=19.1 minutes versus 26.4 minutes). Bitemporal ECT was associated with a lower percent recall of autobiographical information (odds ratio=0.66) that persisted for 6 months. Twice-weekly high-dose unilateral ECT is not inferior to bitemporal ECT for depression and may be preferable because of its better cognitive side-effect profile.

  19. Buprenorphine dose induction in non-opioid-tolerant pre-release prisoners.

    PubMed

    Vocci, Frank J; Schwartz, Robert P; Wilson, Monique E; Gordon, Michael S; Kinlock, Timothy W; Fitzgerald, Terrence T; O'Grady, Kevin E; Jaffe, Jerome H

    2015-11-01

    In a previously reported randomized controlled trial, formerly opioid-dependent prisoners were more likely to enter community drug abuse treatment when they were inducted in prison onto buprenorphine/naloxone (hereafter called buprenorphine) than when they received counseling without buprenorphine in prison (47.5% vs. 33.7%, p=0.012) (Gordon et al., 2014). In this communication we report on the results of the induction schedule and the adverse event profile seen in pre-release prisoners inducted onto buprenorphine. This paper examines the dose induction procedure, a comparison of the proposed versus actual doses given per week, and side effects reported for 104 adult participants who were randomized to buprenorphine treatment in prison. Self-reported side effects were analyzed using generalized estimated equations to determine changes over time in side effects. Study participants were inducted onto buprenorphine at a rate faster than the induction schedule. Of the 104 (72 males, 32 females) buprenorphine recipients, 64 (37 males, 27 females) remained on medication at release from prison. Nine participants (8.6%) discontinued buprenorphine because of unpleasant opioid side effects. There were no serious adverse events reported during the in-prison phase of the study. Constipation was the most frequent symptom reported (69 percent). Our findings suggest that buprenorphine administered to non-opioid-tolerant adults should be started at a lower, individualized dose than customarily used for adults actively using opioids, and that non-opioid-tolerant pre-release prisoners can be successfully inducted onto therapeutic doses prior to release. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  20. First measurement of the K-n → Λπ- non-resonant transition amplitude below threshold

    NASA Astrophysics Data System (ADS)

    Piscicchia, K.; Wycech, S.; Fabbietti, L.; Cargnelli, M.; Curceanu, C.; Del Grande, R.; Marton, J.; Moskal, P.; Scordo, A.; Silarski, M.; Sirghi, D.; Skurzok, M.; Tucakovic, I.; Vázquez Doce, O.; Zmeskal, J.; Branchini, P.; Czerwinski, E.; De Leo, V.; De Lucia, E.; Di Cicco, A.; Fermani, P.; Fiore, S.; Krzemien, W.; Mandaglio, G.; Martini, M.; Perez del Rio, E.; Selce, A.

    2018-07-01

    We present the analysis of K- absorption processes on 4He leading to Λπ- final states, measured with the KLOE spectrometer at the DAΦNE e+e- collider, and extract, for the first time, the modulus of the non-resonant K-n → Λπ- direct production amplitude about 33 MeV below the K̄N threshold. This analysis also allows us to disentangle K- nuclear absorption at rest from in-flight capture, for K- momenta of about 120 MeV. The data are interpreted with the help of a phenomenological model, and the modulus of the non-resonant K-n → Λπ- amplitude for K- absorption at rest is found to be |A(K-n → Λπ-)| = (0.334 ± 0.018 (stat) +0.034/−0.058 (syst)) fm.

  1. Confirmation of model-based dose selection for a Japanese phase III study of rivaroxaban in non-valvular atrial fibrillation patients.

    PubMed

    Kaneko, Masato; Tanigawa, Takahiko; Hashizume, Kensei; Kajikawa, Mariko; Tajiri, Masahiro; Mueck, Wolfgang

    2013-01-01

    This study was designed to confirm the appropriateness of the dose setting for a Japanese phase III study of rivaroxaban in patients with non-valvular atrial fibrillation (NVAF), which had been based on model simulation employing phase II study data. The previously developed mixed-effects pharmacokinetic/pharmacodynamic (PK-PD) model, which consisted of an oral one-compartment model parameterized in terms of clearance, volume, and a first-order absorption rate, was rebuilt and optimized using the data for 597 subjects from the Japanese phase III study, J-ROCKET AF. A mixed-effects modeling technique in NONMEM was used to quantify both unexplained inter-individual variability and inter-occasion variability, which are random-effect parameters. The final PK and PK-PD models were evaluated to identify influential covariates. The empirical Bayes estimates of AUC and Cmax from the final PK model were consistent with the simulated results from the Japanese phase II study. There was no clear relationship between individual estimated exposures and safety-related events, and the estimated exposure levels were consistent with the global phase III data. Therefore, it was concluded that the dose selected for the phase III study with Japanese NVAF patients by means of model simulation employing phase II study data had been appropriate from the PK-PD perspective.

  2. Thresholding Based on Maximum Weighted Object Correlation for Rail Defect Detection

    NASA Astrophysics Data System (ADS)

    Li, Qingyong; Huang, Yaping; Liang, Zhengping; Luo, Siwei

    Automatic thresholding is an important technique for rail defect detection, but traditional methods are not well suited to the characteristics of this application. This paper proposes the Maximum Weighted Object Correlation (MWOC) thresholding method, which fits the features of rail images: the intensity histogram is unimodal and the defect proportion is small. MWOC selects a threshold by optimizing the product of the object correlation and a weight term that expresses the proportion of thresholded defects. Our experimental results demonstrate that MWOC achieves a misclassification error of 0.85% and outperforms the other well-established thresholding methods, including Otsu, maximum correlation thresholding, maximum entropy thresholding, and the valley-emphasis method, for the application of rail defect detection.

  3. Transient dynamics of NbOx threshold switches explained by Poole-Frenkel based thermal feedback mechanism

    NASA Astrophysics Data System (ADS)

    Wang, Ziwen; Kumar, Suhas; Nishi, Yoshio; Wong, H.-S. Philip

    2018-05-01

    Niobium oxide (NbOx) two-terminal threshold switches are potential candidates as selector devices in crossbar memory arrays and as building blocks for neuromorphic systems. However, the physical mechanism of NbOx threshold switches is still under debate. In this paper, we show that a thermal feedback mechanism based on Poole-Frenkel conduction can explain both the quasi-static and the transient electrical characteristics that are experimentally observed for NbOx threshold switches, providing strong support for the validity of this mechanism. Furthermore, a clear picture of the transient dynamics during the thermal-feedback-induced threshold switching is presented, providing useful insights required to model nonlinear devices where thermal feedback is important.
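
    A rough illustration of the mechanism: Poole-Frenkel conduction rises steeply with temperature, and Joule heating feeds that temperature back, which is what produces the threshold switching. The sketch below uses generic, illustrative parameters (barrier height, permittivity, thermal resistance, and geometry are assumptions, not the paper's fitted device values) and a simple fixed-point iteration for the mild-feedback regime:

```python
import math

Q = 1.602e-19        # elementary charge (C)
KB = 1.381e-23       # Boltzmann constant (J/K)
EPS0 = 8.854e-12     # vacuum permittivity (F/m)

def pf_current(v, t, phi_b=0.3, eps_r=45.0, thickness=20e-9,
               sigma0=1e4, area=1e-14):
    """Poole-Frenkel conduction current at voltage v and temperature t.
    The field lowers the trap barrier by sqrt(q^3 E / (pi eps)).
    All parameter values are illustrative, not fitted to any device."""
    e_field = v / thickness
    barrier = phi_b * Q - math.sqrt(Q**3 * e_field / (math.pi * eps_r * EPS0))
    return sigma0 * e_field * math.exp(-barrier / (KB * t)) * area

def steady_state_current(v, t_amb=300.0, r_th=1e6, iters=200):
    """Self-consistently solve T = T_amb + R_th * I * V (thermal feedback)
    by fixed-point iteration; adequate for the mild feedback at these
    illustrative values."""
    t = t_amb
    for _ in range(iters):
        i = pf_current(v, t)
        t = t_amb + r_th * i * v    # Joule heating raises local temperature
    return i, t
```

    At a high enough voltage this feedback loop has no stable low-current solution, which is the thermal-runaway picture of the threshold switch.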

  4. SU-F-R-11: Designing Quality and Safety Informatics Through Implementation of a CT Radiation Dose Monitoring Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilson, JM; Samei, E; Departments of Physics, Electrical and Computer Engineering, and Biomedical Engineering, and Medical Physics Graduate Program, Duke University, Durham, NC

    2016-06-15

    Purpose: Recent legislative and accreditation requirements have driven rapid development and implementation of CT radiation dose monitoring solutions. Institutions must determine how to improve quality, safety, and consistency of their clinical performance. The purpose of this work was to design a strategy and meaningful characterization of results from an in-house, clinically-deployed dose monitoring solution. Methods: A dose monitoring platform was designed by our imaging physics group that focused on extracting protocol parameters, dose metrics, and patient demographics and size. Compared to most commercial solutions, which focus on individual exam alerts and global thresholds, the program sought to characterize overall consistency and targeted thresholds based on eight analytic interrogations. Those were based on explicit questions related to protocol application, national benchmarks, protocol and size-specific dose targets, operational consistency, outliers, temporal trends, intra-system variability, and consistent use of electronic protocols. Using historical data since the start of 2013, 95% and 99% intervals were used to establish yellow and amber parameterized dose alert thresholds, respectively, as a function of protocol, scanner, and size. Results: Quarterly reports have been generated for three hospitals for 3 quarters of 2015 totaling 27880, 28502, 30631 exams, respectively. Four adult and two pediatric protocols were higher than external institutional benchmarks. Four protocol dose levels were being inconsistently applied as a function of patient size. For the three hospitals, the minimum and maximum amber outlier percentages were [1.53%,2.28%], [0.76%,1.8%], [0.94%,1.17%], respectively. Compared with the electronic protocols, 10 protocols were found to be used with some inconsistency. Conclusion: Dose monitoring can satisfy requirements with global alert thresholds and patient dose records, but the real value is in optimizing patient
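
    The 95%/99% interval thresholds described above amount to percentile computations over historical dose data, grouped by protocol, scanner, and patient size. A minimal sketch (the grouping keys and yellow/amber alert names come from the abstract; the data layout and helper functions are assumptions):

```python
def percentile(sorted_vals, p):
    """Linear-interpolation percentile (p in 0-100) of a pre-sorted list."""
    if not sorted_vals:
        raise ValueError("no data")
    k = (len(sorted_vals) - 1) * p / 100.0
    lo, hi = int(k), min(int(k) + 1, len(sorted_vals) - 1)
    return sorted_vals[lo] + (sorted_vals[hi] - sorted_vals[lo]) * (k - lo)

def alert_thresholds(ctdi_history):
    """Map each (protocol, scanner, size_group) key to (yellow, amber)
    CTDIvol thresholds at the 95% and 99% intervals of historical data."""
    out = {}
    for key, doses in ctdi_history.items():
        s = sorted(doses)
        out[key] = (percentile(s, 95), percentile(s, 99))
    return out

def classify(dose, yellow, amber):
    """Flag a new exam against its group-specific thresholds."""
    if dose >= amber:
        return "amber"
    return "yellow" if dose >= yellow else "ok"
```

    Targeting thresholds per (protocol, scanner, size) group, rather than one global cutoff, is what lets the report distinguish a genuinely high dose from a large patient on a high-dose protocol.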

  5. Is weight-based adjustment of automatic exposure control necessary for the reduction of chest CT radiation dose?

    PubMed

    Prakash, Priyanka; Kalra, Mannudeep K; Gilman, Matthew D; Shepard, Jo-Anne O; Digumarthy, Subba R

    2010-01-01

    To assess the effects of radiation dose reduction in chest CT using a weight-based adjustment of the automatic exposure control (AEC) technique. With institutional review board approval, 60 patients (mean age, 59.1 years; M:F = 35:25) and 57 weight-matched patients (mean age, 52.3 years; M:F = 25:32) were scanned using a weight-adjusted AEC and non-weight-adjusted AEC, respectively, on a 64-slice multidetector CT with a 0.984:1 pitch, 0.5 second rotation time, 40 mm table feed/rotation, and 2.5 mm section thickness. Patients were categorized into three weight categories: < 60 kg (n = 17), 60-90 kg (n = 52), and > 90 kg (n = 48). Patient weights, scanning parameters, CT dose index volumes (CTDIvol) and dose length products (DLP) were recorded, while effective dose (ED) was estimated. Image noise was measured in the descending thoracic aorta. Data were analyzed using a standard statistical package (SAS/STAT, Version 9.1, SAS Institute Inc, Cary, NC). Compared to the non-weight-adjusted AEC, the weight-adjusted AEC technique resulted in an average decrease of 29% in CTDIvol and a 27% effective dose reduction (p < 0.0001). With weight-adjusted AEC, the CTDIvol decreased to 15.8, 15.9, and 27.3 mGy for the < 60, 60-90 and > 90 kg weight groups, respectively, compared to 20.3, 27.9 and 32.8 mGy with non-weight-adjusted AEC. No significant difference was observed in objective image noise between chest CT acquired with the non-weight-adjusted (15.0 +/- 3.1) and weight-adjusted (16.1 +/- 5.6) AEC techniques (p > 0.05). The results of this study suggest that AEC should be tailored according to patient weight. Without weight-based adjustment of AEC, patients are exposed to a 17-43% higher radiation dose from a chest CT.
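
    The effective dose mentioned above is commonly estimated from the dose-length product with a region-specific conversion coefficient; roughly 0.014 mSv/(mGy·cm) is a widely published adult-chest value, though the abstract does not state which coefficient this study used:

```python
def effective_dose_mSv(dlp_mGy_cm, k=0.014):
    """Estimate effective dose (mSv) from dose-length product (mGy.cm)
    using a region-specific conversion coefficient. The default k is an
    approximate adult-chest value for illustration; consult current
    guidance for the coefficient appropriate to the exam."""
    return dlp_mGy_cm * k

ed = effective_dose_mSv(500.0)   # e.g. a chest CT with DLP = 500 mGy.cm
```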

  6. Comparative efficacy of low-dose versus standard-dose azithromycin for patients with yaws: a randomised non-inferiority trial in Ghana and Papua New Guinea.

    PubMed

    Marks, Michael; Mitjà, Oriol; Bottomley, Christian; Kwakye, Cynthia; Houinei, Wendy; Bauri, Mathias; Adwere, Paul; Abdulai, Abdul A; Dua, Fredrick; Boateng, Laud; Wangi, James; Ohene, Sally-Ann; Wangnapi, Regina; Simpson, Shirley V; Miag, Helen; Addo, Kennedy K; Basing, Laud A; Danavall, Damien; Chi, Kai H; Pillay, Allan; Ballard, Ronald; Solomon, Anthony W; Chen, Cheng Y; Bieb, Sibauk V; Adu-Sarkodie, Yaw; Mabey, David C W; Asiedu, Kingsley

    2018-04-01

    A dose of 30 mg/kg of azithromycin is recommended for treatment of yaws, a disease targeted for global eradication. Treatment with 20 mg/kg of azithromycin is recommended for the elimination of trachoma as a public health problem. In some settings, these diseases are co-endemic. We aimed to determine the efficacy of 20 mg/kg of azithromycin compared with 30 mg/kg azithromycin for the treatment of active and latent yaws. We did a non-inferiority, open-label, randomised controlled trial in children aged 6-15 years who were recruited from schools in Ghana and schools and the community in Papua New Guinea. Participants were enrolled based on the presence of a clinical lesion that was consistent with infectious primary or secondary yaws and a positive rapid diagnostic test for treponemal and non-treponemal antibodies. Participants were randomly assigned (1:1) to receive either standard-dose (30 mg/kg) or low-dose (20 mg/kg) azithromycin by a computer-generated random number sequence. Health-care workers assessing clinical outcomes in the field were not blinded to the patient's treatment, but investigators involved in statistical or laboratory analyses and the participants were blinded to treatment group. We followed up participants at 4 weeks and 6 months. The primary outcome was cure at 6 months, defined as lesion healing at 4 weeks in patients with active yaws and at least a four-fold decrease in rapid plasma reagin titre from baseline to 6 months in patients with active and latent yaws. Active yaws was defined as a skin lesion that was positive for Treponema pallidum ssp pertenue in PCR testing. We used a non-inferiority margin of 10%. This trial was registered with ClinicalTrials.gov, number NCT02344628. Between June 12, 2015, and July 2, 2016, 583 (65·1%) of 895 children screened were enrolled; 292 patients were assigned a low dose of azithromycin and 291 patients were assigned a standard dose of azithromycin. 
191 participants had active yaws and 392 had presumed
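
    The 10% non-inferiority margin above works as follows: the low dose is declared non-inferior if the confidence interval for the cure-rate difference (low-dose minus standard) excludes a deficit larger than 10 percentage points. A simplified sketch with a Wald interval and made-up counts (illustrative only; the trial's actual analysis may use different interval methods):

```python
import math

def noninferior(cured_new, n_new, cured_std, n_std, margin=0.10, z=1.96):
    """Wald 95% CI for the cure-rate difference (new - standard); the new
    regimen is non-inferior if the CI's lower bound exceeds -margin.
    Simplified relative to the methods a trial would actually report."""
    p_new, p_std = cured_new / n_new, cured_std / n_std
    diff = p_new - p_std
    se = math.sqrt(p_new * (1 - p_new) / n_new + p_std * (1 - p_std) / n_std)
    lower = diff - z * se
    return lower > -margin, diff, lower
```

    Note that a point estimate slightly below the standard arm can still satisfy non-inferiority, provided the sample is large enough to pin the lower confidence bound above the margin.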

  7. Deactivating stimulation sites based on low-rate thresholds improves spectral ripple and speech reception thresholds in cochlear implant users.

    PubMed

    Zhou, Ning

    2017-03-01

    The study examined whether the benefit of deactivating stimulation sites estimated to have broad neural excitation was attributed to improved spectral resolution in cochlear implant users. The subjects' spatial neural excitation pattern was estimated by measuring low-rate detection thresholds across the array [see Zhou (2016). PLoS One 11, e0165476]. Spectral resolution, as assessed by spectral-ripple discrimination thresholds, significantly improved after deactivation of five high-threshold sites. The magnitude of improvement in spectral-ripple discrimination thresholds predicted the magnitude of improvement in speech reception thresholds after deactivation. Results suggested that a smaller number of relatively independent channels provide a better outcome than using all channels that might interact.

  8. Cryogenic ion implantation near amorphization threshold dose for halo/extension junction improvement in sub-30 nm device technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Hugh; Todorov, Stan; Colombeau, Benjamin

    2012-11-06

    We report on junction advantages of cryogenic ion implantation with medium current implanters. We propose a methodical approach to maximizing cryogenic effects on junction characteristics near the amorphization threshold doses that are typically used for halo implants in sub-30 nm technologies. A BF2+ implant at a dose of 8 × 10^13 cm^-2 does not amorphize silicon at room temperature. When implanted at -100 °C, it forms a 30-35 nm thick amorphous layer. The cryogenic BF2+ implant significantly reduces the depth of the boron distribution, both as-implanted and after anneals, which improves short channel rolloff characteristics. It also creates a shallower n+-p junction by steepening the profiles of arsenic that is subsequently implanted in the surface region. We demonstrate effects of implant sequences, germanium preamorphization, and indium and carbon co-implants for extension/halo process integration. When applied to sequences such as Ge+As+C+In+BF2+, the cryogenic implants at -100 °C enable removal of Ge preamorphization, and form more active n+-p junctions and steeper B and In halo profiles than sequences at room temperature.

  9. Salicylate-induced changes in auditory thresholds of adolescent and adult rats.

    PubMed

    Brennan, J F; Brown, C A; Jastreboff, P J

    1996-01-01

    Shifts in auditory intensity thresholds after salicylate administration were examined in postweanling and adult pigmented rats at frequencies ranging from 1 to 35 kHz. A total of 132 subjects from both age levels were tested under two-way active avoidance or one-way active avoidance paradigms. Estimated thresholds were inferred from behavioral responses to presentations of descending and ascending series of intensities for each test frequency value. Reliable threshold estimates were found under both avoidance conditioning methods, and compared to controls, subjects at both age levels showed threshold shifts at selective higher frequency values after salicylate injection, and the extent of shifts was related to salicylate dose level.

  10. Bedding material affects mechanical thresholds, heat thresholds and texture preference

    PubMed Central

    Moehring, Francie; O’Hara, Crystal L.; Stucky, Cheryl L.

    2015-01-01

    It has long been known that the bedding type animals are housed on can affect breeding behavior and cage environment. Yet little is known about its effects on evoked behavior responses or non-reflexive behaviors. C57BL/6 mice were housed for two weeks on one of five bedding types: Aspen Sani Chips® (standard bedding for our institute), ALPHA-Dri®, Cellu-Dri™, Pure-o’Cel™ or TEK-Fresh. Mice housed on Aspen exhibited the lowest (most sensitive) mechanical thresholds while those on TEK-Fresh exhibited 3-fold higher thresholds. While bedding type had no effect on responses to punctate or dynamic light touch stimuli, TEK-Fresh housed animals exhibited greater responsiveness in a noxious needle assay, than those housed on the other bedding types. Heat sensitivity was also affected by bedding as animals housed on Aspen exhibited the shortest (most sensitive) latencies to withdrawal whereas those housed on TEK-Fresh had the longest (least sensitive) latencies to response. Slight differences between bedding types were also seen in a moderate cold temperature preference assay. A modified tactile conditioned place preference chamber assay revealed that animals preferred TEK-Fresh to Aspen bedding. Bedding type had no effect in a non-reflexive wheel running assay. In both acute (two day) and chronic (5 week) inflammation induced by injection of Complete Freund’s Adjuvant in the hindpaw, mechanical thresholds were reduced in all groups regardless of bedding type, but TEK-Fresh and Pure-o’Cel™ groups exhibited a greater dynamic range between controls and inflamed cohorts than Aspen housed mice. PMID:26456764

  11. Switching From Age-Based Stimulus Dosing to Dose Titration Protocols in Electroconvulsive Therapy: Empirical Evidence for Better Patient Outcomes With Lower Peak and Cumulative Energy Doses.

    PubMed

    O'Neill-Kerr, Alex; Yassin, Anhar; Rogers, Stephen; Cornish, Janie

    2017-09-01

    The aim of this study was to test the proposition that adoption of a dose titration protocol may be associated with better patient outcomes, at lower treatment dose, and with comparable cumulative dose to that in patients treated using an age-based stimulus dosing protocol. This was an analysis of data assembled from archived records and based on cohorts of patients treated respectively on an age-based stimulus dosing protocol and on a dose titration protocol in the National Health Service in England. We demonstrated a significantly better response in the patient cohort treated with dose titration than with age-based stimulus dosing. Peak doses were less and the total cumulative dose was less in the dose titration group than in the age-based stimulus dosing group. Our findings are consistent with superior outcomes in patients treated using a dose titration protocol when compared with age-based stimulus dosing in a similar cohort of patients.

  12. Novel Threshold Changeable Secret Sharing Schemes Based on Polynomial Interpolation.

    PubMed

    Yuan, Lifeng; Li, Mingchu; Guo, Cheng; Choo, Kim-Kwang Raymond; Ren, Yizhi

    2016-01-01

    After any distribution of secret sharing shadows in a threshold changeable secret sharing scheme, the threshold may need to be adjusted to deal with changes in the security policy and adversary structure. For example, when employees leave an organization, it is not realistic to expect the departing employees to ensure the security of their secret shadows. Therefore, in 2012, Zhang et al. proposed (t → t', n) and ({t1, t2,⋯, tN}, n) threshold changeable secret sharing schemes. However, their schemes suffer from a number of limitations, such as strict limits on the threshold values, large storage space requirements for secret shadows, and significant computation for constructing and recovering polynomials. To address these limitations, we propose two improved dealer-free threshold changeable secret sharing schemes. In our schemes, we construct polynomials to update secret shadows, and use a two-variable one-way function to resist collusion attacks and secure the information stored by the combiner. We then demonstrate that our schemes can adjust the threshold safely.
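
    For background, the polynomial-interpolation machinery such schemes build on is Shamir's classic (t, n) secret sharing: a degree t-1 polynomial with the secret as its constant term is evaluated at n points, and any t points recover the secret by Lagrange interpolation at x = 0. A minimal sketch of that underlying primitive (not the authors' threshold-changeable construction; requires Python 3.8+ for the modular inverse via three-argument pow):

```python
import random

PRIME = 2 ** 127 - 1  # prime field large enough for the secret

def make_shares(secret, t, n):
    """Split `secret` into n shares, any t of which reconstruct it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(t - 1)]
    shares = []
    for x in range(1, n + 1):
        y = 0
        for c in reversed(coeffs):      # Horner evaluation mod PRIME
            y = (y * x + c) % PRIME
        shares.append((x, y))
    return shares

def recover(shares):
    """Lagrange interpolation at x = 0 over the prime field."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret
```

    Changing the threshold after distribution is exactly what this basic scheme cannot do without redealing shares, which is the limitation the threshold-changeable constructions address.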

  13. Statistical Analysis of SSMIS Sea Ice Concentration Threshold at the Arctic Sea Ice Edge during Summer Based on MODIS and Ship-Based Observational Data.

    PubMed

    Ji, Qing; Li, Fei; Pang, Xiaoping; Luo, Cong

    2018-04-05

    The threshold of sea ice concentration (SIC) is the basis for accurately calculating sea ice extent based on passive microwave (PM) remote sensing data. However, the PM SIC threshold at the sea ice edge used in previous studies and released sea ice products has not always been consistent. To explore the representable value of the PM SIC threshold corresponding on average to the position of the Arctic sea ice edge during summer in recent years, we extracted sea ice edge boundaries from the Moderate-resolution Imaging Spectroradiometer (MODIS) sea ice product (MOD29 with a spatial resolution of 1 km), MODIS images (250 m), and sea ice ship-based observation points (1 km) during the fifth (CHINARE-2012) and sixth (CHINARE-2014) Chinese National Arctic Research Expeditions, and made an overlay and comparison analysis with PM SIC derived from Special Sensor Microwave Imager Sounder (SSMIS, with a spatial resolution of 25 km) in the summer of 2012 and 2014. Results showed that the average SSMIS SIC threshold at the Arctic sea ice edge based on ice-water boundary lines extracted from MOD29 was 33%, which was higher than that of the commonly used 15% discriminant threshold. The average SIC threshold at sea ice edge based on ice-water boundary lines extracted by visual interpretation from four scenes of the MODIS image was 35% when compared to the average value of 36% from the MOD29 extracted ice edge pixels for the same days. The average SIC of 31% at the sea ice edge points extracted from ship-based observations also confirmed that choosing around 30% as the SIC threshold during summer is recommended for sea ice extent calculations based on SSMIS PM data. These results can provide a reference for further studying the variation of sea ice under the rapidly changing Arctic.

  14. A population-based study of dosing and persistence with anti-dementia medications.

    PubMed

    Brewer, Linda; Bennett, Kathleen; McGreevy, Cora; Williams, David

    2013-07-01

    Cholinesterase inhibitors and memantine are the mainstay of pharmacological intervention for the cognitive symptoms of Alzheimer's disease (AD). This study assessed the adequacy of dosing and persistence with AD medications and the predictors of these variables in the 'real world' (outside the clinical trial setting). The Health Service Executive-Primary Care Reimbursement Services prescription claims database in the Republic of Ireland contains prescription information for 1.6 million people. Patients aged >70 years who received at least two prescriptions for donepezil, rivastigmine, galantamine and memantine between January 2006 and December 2010 were included in the study. Rates of dose-maximisation were recorded by examining the initiation dose of each AD drug commenced during the study period and any subsequent dose titrations. Non-persistence was defined by a gap in prescribing of more than 63 consecutive days. Predictors of dose-maximisation and non-persistence were also analysed. Between January 2006 and December 2010, 20,729 patients aged >70 years received a prescription for an AD medication. Despite most patients on donepezil and memantine receiving a prescription for the maximum drug dose, this dose was maintained for 2 consecutive months in only two-thirds of patients. Patients were significantly more likely to have their doses of donepezil and memantine maximised if prescribed in more recent years (2010 vs. 2007). Rates of non-persistence were 30.1 % at 6 months and 43.8 % at 12 months. Older age [75+ vs. <75 years; hazards ratio (HR) 1.16, 95 % confidence interval (CI) 1.06-1.27] and drug type (rivastigmine vs. donepezil; HR 1.15, 95 % CI 1.03-1.27) increased the risk of non-persistence. Non-persistence was lower for those commencing therapy in more recent years (2010 vs. 2007; HR 0.81, 95 % CI 0.73-0.89, p < 0.001) and for those on multiple anti-dementia medications (HR 0.59, 95 % CI 0.54-0.65, p < 0.001). 
Persistence was significantly higher when

  15. Sampling Based Influence Maximization on Linear Threshold Model

    NASA Astrophysics Data System (ADS)

    Jia, Su; Chen, Ling

    2018-04-01

    A sampling based influence maximization method on the linear threshold (LT) model is presented. The method samples the routes in the possible worlds of the social network, and uses the Chernoff bound to estimate the number of samples so that the error can be constrained within a given bound. Then the activation probabilities of the routes in the possible worlds are calculated, and are used to compute the influence spread of each node in the network. Our experimental results show that our method can effectively select an appropriate seed node set that spreads larger influence than other similar methods.
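
    The LT diffusion being sampled can be illustrated with a single possible-world simulation: each node draws a uniform threshold, and activates once the total weight of its active in-neighbors reaches that threshold; averaging the final spread over many samples estimates influence. A minimal sketch (graph representation and sample count are illustrative; the paper's route-based estimator and Chernoff-bound sample sizing are not reproduced here):

```python
import random

def lt_spread(in_edges, seeds, rng):
    """One sampled diffusion under the linear threshold model. in_edges
    maps each node to its (in_neighbor, weight) pairs, with the weights
    into any node summing to at most 1. Node v activates once the total
    weight of its active in-neighbors reaches theta_v ~ U(0, 1)."""
    theta = {v: rng.random() for v in in_edges}
    active = set(seeds)
    changed = True
    while changed:
        changed = False
        for v, edges in in_edges.items():
            if v in active:
                continue
            if sum(wt for u, wt in edges if u in active) >= theta[v]:
                active.add(v)
                changed = True
    return active

def influence(in_edges, seeds, samples=1000, seed=0):
    """Estimate expected spread by averaging over sampled possible worlds."""
    rng = random.Random(seed)
    return sum(len(lt_spread(in_edges, seeds, rng)) for _ in range(samples)) / samples
```

    Influence maximization then greedily picks the seed set maximizing this estimate; the Chernoff bound controls how many samples are needed for a given error.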

  16. [Research on the threshold of Chl-a in Lake Taihu based on microcystins].

    PubMed

    Wei, Dai-chun; Su, Jing; Ji, Dan-feng; Fu, Xiao-yong; Wang, Ji; Huo, Shou-liang; Cui, Chi-fei; Tang, Jun; Xi, Bei-dou

    2014-12-01

    Water samples were collected in Lake Taihu from June to October 2013 in order to investigate the threshold of chlorophyll a (Chl-a). The concentrations of three microcystin isomers (MC-LR, MC-RR, MC-YR) were determined by means of solid phase extraction and high performance liquid chromatography-tandem mass spectrometry. The correlations between the various MCs and eutrophication factors, such as total nitrogen (TN), total phosphorus (TP), chlorophyll a and the permanganate index, were analyzed. The threshold of Chl-a was studied based on the relationships between MC-LR, MCs and Chl-a. The results showed that Lake Taihu was severely polluted by MCs, with a spatial distribution as follows: the concentration in Meiliang Bay was the highest, followed by Gonghu Bay and the Western Lake, and the Lake Center; the least polluted areas were Lake Xuhu and the Southern Lake. The concentration of MC-LR was the highest among the three MCs. The correlation analysis indicated that MC-LR, MC-RR, MC-YR and total MCs were significantly positively correlated with the permanganate index, TN, TP and Chl-a (P < 0.01). The threshold value of Chl-a was 12.26 mg x m(-3) according to the standard thresholds of MC-LR and MCs in drinking water. This threshold value of Chl-a in Lake Taihu was very close to the standard used in the State of North Carolina, which indicates that the threshold value provided in this study is reasonable.

  17. Threshold-based insulin-pump interruption for reduction of hypoglycemia.

    PubMed

    Bergenstal, Richard M; Klonoff, David C; Garg, Satish K; Bode, Bruce W; Meredith, Melissa; Slover, Robert H; Ahmann, Andrew J; Welsh, John B; Lee, Scott W; Kaufman, Francine R

    2013-07-18

    The threshold-suspend feature of sensor-augmented insulin pumps is designed to minimize the risk of hypoglycemia by interrupting insulin delivery at a preset sensor glucose value. We evaluated sensor-augmented insulin-pump therapy with and without the threshold-suspend feature in patients with nocturnal hypoglycemia. We randomly assigned patients with type 1 diabetes and documented nocturnal hypoglycemia to receive sensor-augmented insulin-pump therapy with or without the threshold-suspend feature for 3 months. The primary safety outcome was the change in the glycated hemoglobin level. The primary efficacy outcome was the area under the curve (AUC) for nocturnal hypoglycemic events. Two-hour threshold-suspend events were analyzed with respect to subsequent sensor glucose values. A total of 247 patients were randomly assigned to receive sensor-augmented insulin-pump therapy with the threshold-suspend feature (threshold-suspend group, 121 patients) or standard sensor-augmented insulin-pump therapy (control group, 126 patients). The changes in glycated hemoglobin values were similar in the two groups. The mean AUC for nocturnal hypoglycemic events was 37.5% lower in the threshold-suspend group than in the control group (980 ± 1200 mg per deciliter [54.4 ± 66.6 mmol per liter] × minutes vs. 1568 ± 1995 mg per deciliter [87.0 ± 110.7 mmol per liter] × minutes, P<0.001). Nocturnal hypoglycemic events occurred 31.8% less frequently in the threshold-suspend group than in the control group (1.5 ± 1.0 vs. 2.2 ± 1.3 per patient-week, P<0.001). The percentages of nocturnal sensor glucose values of less than 50 mg per deciliter (2.8 mmol per liter), 50 to less than 60 mg per deciliter (3.3 mmol per liter), and 60 to less than 70 mg per deciliter (3.9 mmol per liter) were significantly reduced in the threshold-suspend group (P<0.001 for each range). After 1438 instances at night in which the pump was stopped for 2 hours, the mean sensor glucose value was 92.6 ± 40.7 mg

  18. SU-E-T-226: Correction of a Standard Model-Based Dose Calculator Using Measurement Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, M; Jiang, S; Lu, W

    Purpose: To propose a hybrid method that combines the advantages of the model-based and measurement-based methods for independent dose calculation. Model-based dose calculation, such as collapsed-cone-convolution/superposition (CCCS) or the Monte Carlo method, models dose deposition in the patient body accurately; however, due to lack of detailed knowledge about the linear accelerator (LINAC) head, commissioning for an arbitrary machine is tedious and challenging in case of hardware changes. On the contrary, the measurement-based method characterizes the beam properties accurately but lacks the capability of modeling dose deposition in heterogeneous media. Methods: We used a standard CCCS calculator, commissioned with published data, as the standard model calculator. For a given machine, water phantom measurements were acquired. A set of dose distributions was also calculated using the CCCS for the same setup. The differences between the measurements and the CCCS results were tabulated and used as the commissioning data for a measurement-based calculator; here we used a direct-ray-tracing calculator (ΔDRT). The proposed independent dose calculation consists of the following steps: 1. calculate D-model using CCCS; 2. calculate D-ΔDRT using ΔDRT; 3. combine: D = D-model + D-ΔDRT. Results: The hybrid dose calculation was tested on digital phantoms and patient CT data for standard fields and an IMRT plan. The results were compared to doses calculated by the treatment planning system (TPS). The agreement between the hybrid method and the TPS was within 3%, 3 mm for over 98% of the volume in phantom studies and lung patients. Conclusion: The proposed hybrid method uses the same commissioning data as the measurement-based method and can be easily extended to any non-standard LINAC. The results met the accuracy, independence, and simple commissioning criteria for an independent dose calculator.

  19. Dose-response studies and 'no-effect-levels' of N-nitroso compounds: some general aspects.

    PubMed

    Preussmann, R

    1980-01-01

    One major problem in the evaluation of potential carcinogenic food additives and contaminants is that of thresholds or, better, of 'no-adverse-effect-levels'. Arguments in favor of the postulated 'irreversibility' of carcinogenic effects are based on dose-response studies, single dose and multigeneration experiments as well as on the concept of somatic mutation as the first step in carcinogenesis with subsequent transmittance of induced defects during cell replication. The problem of extrapolation of results of animal experiments using high doses to low exposure and low incidences in man is not yet solved satisfactorily. Possible practical consequences include zero tolerance, acceptable thresholds at low risk and safety factors. Acceptable intakes should never be considered constants but should be changeable as soon as new facts in regard to the safety evaluation are available.

  20. Non linear processes modulated by low doses of radiation exposure

    NASA Astrophysics Data System (ADS)

    Mariotti, Luca; Ottolenghi, Andrea; Alloni, Daniele; Babini, Gabriele; Morini, Jacopo; Baiocco, Giorgio

    The perturbation induced by radiation impinging on biological targets can stimulate the activation of several different pathways, spanning from DNA damage processing to intra/extra-cellular signalling. In the mechanistic investigation of radiobiological damage this complex “system” response (e.g. omics, signalling networks, micro-environmental modifications, etc.) has to be taken into account, shifting from a focus on the DNA molecule solely to a systemic/collective view. An additional complication comes from the finding that the individual response of each of the involved processes is often not linear as a function of the dose. In this context, a systems biology approach to investigate the effects of low dose irradiations on intra/extra-cellular signalling will be presented, where low doses of radiation act as a mild perturbation of a robustly interconnected network. Results obtained through a multi-level investigation of both DNA damage repair processes (e.g. gamma-H2AX response) and of the activation kinetics for intra/extra-cellular signalling pathways (e.g. NFkB activation) show that the overall cell response is dominated by non-linear processes - such as negative feedbacks - leading to possible non-equilibrium steady states and to a poor signal-to-noise ratio. Together with experimental data of radiation-perturbed pathways, different modelling approaches will also be discussed.

  1. SU-E-T-769: T-Test Based Prior Error Estimate and Stopping Criterion for Monte Carlo Dose Calculation in Proton Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hong, X; Gao, H; Schuemann, J

    2015-06-15

    Purpose: The Monte Carlo (MC) method is a gold standard for dose calculation in radiotherapy. However, it is not a priori clear how many particles need to be simulated to achieve a given dose accuracy. Prior error estimates and stopping criteria are not well established for MC. This work aims to fill this gap. Methods: Due to the statistical nature of MC, our approach is based on the one-sample t-test. We design the prior error estimate method based on the t-test, and then use this t-test based error estimate to develop a simulation stopping criterion. The three major components are as follows. First, the source particles are randomized in energy, space and angle, so that the dose deposition from a particle to the voxel is independent and identically distributed (i.i.d.). Second, a sample under consideration in the t-test is the mean value of dose deposition to the voxel by a sufficiently large number of source particles. Then, according to the central limit theorem, the sample, as the mean value of i.i.d. variables, is normally distributed with expectation equal to the true deposited dose. Third, the t-test is performed with the null hypothesis that the difference between the sample expectation (the same as the true deposited dose) and the on-the-fly calculated mean sample dose from MC is larger than a given error threshold; in addition, users have the freedom to specify the confidence probability and region of interest in the t-test based stopping criterion. Results: The method is validated for proton dose calculation. The difference between the MC result based on the t-test prior error estimate and the statistical result obtained by repeating numerous MC simulations is within 1%. Conclusion: The t-test based prior error estimate and stopping criterion are developed for MC and validated for proton dose calculation. Xiang Hong and Hao Gao were partially supported by the NSFC (#11405105), the 973 Program (#2015CB856000) and the Shanghai Pujiang Talent Program (#14PJ1404500)
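
    The batch-mean idea in the abstract can be sketched as follows: treat each batch's mean deposited dose as an approximately normal sample (by the CLT), and stop once the confidence half-width of the running mean falls below the error threshold. An illustrative stand-in, using a toy exponential "dose per particle" and a normal critical value in place of the exact t quantile (reasonable once many batches have accumulated):

```python
import math
import random

Z95 = 1.96  # normal stand-in for the two-sided 95% t critical value

def mc_mean_with_stopping(sample_fn, tol, batch=1000, max_batches=10000, seed=0):
    """Accumulate batch means (approximately normal by the CLT) and stop
    once the 95% confidence half-width of the running mean falls below
    `tol`. Returns (mean, half_width, batches_used)."""
    rng = random.Random(seed)
    means = []
    while len(means) < max_batches:
        means.append(sum(sample_fn(rng) for _ in range(batch)) / batch)
        n = len(means)
        if n < 2:
            continue                      # need >= 2 batches for a variance
        m = sum(means) / n
        var = sum((x - m) ** 2 for x in means) / (n - 1)
        half_width = Z95 * math.sqrt(var / n)
        if half_width < tol:
            return m, half_width, n
    return m, half_width, n

# Toy "dose deposition per particle": exponential with true mean 2.0.
dose_sample = lambda rng: rng.expovariate(0.5)
```

    In a real MC dose engine the stopping test would run per voxel over the user's region of interest, with the confidence probability as a parameter, as the abstract describes.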

  2. Setting nutrient thresholds to support an ecological assessment based on nutrient enrichment, potential primary production and undesirable disturbance.

    PubMed

    Devlin, Michelle; Painting, Suzanne; Best, Mike

    2007-01-01

    The EU Water Framework Directive recognises that ecological status is supported by the prevailing physico-chemical conditions in each water body. This paper describes an approach to providing guidance on setting thresholds for nutrients, taking account of the biological response to nutrient enrichment evident in different types of water. Indices of pressure, state and impact are used to derive a robust nutrient (nitrogen) threshold by considering each individual index relative to a defined standard, scale or threshold. These indices include winter nitrogen concentrations relative to a predetermined reference value; the potential of the waterbody to support phytoplankton growth (estimated as primary production); and detection of an undesirable disturbance (measured as dissolved oxygen). Proposed reference values are based on a combination of historical records, offshore (limited human influence) nutrient concentrations, literature values and modelled data. Statistical confidence is based on a number of attributes, including the distance of confidence limits from a reference threshold and how well the model is populated with real data. This evidence-based approach ensures that nutrient thresholds are based on knowledge of real and measurable biological responses in transitional and coastal waters.

  3. Probability of an Abnormal Screening PSA Result Based on Age, Race, and PSA Threshold

    PubMed Central

    Espaldon, Roxanne; Kirby, Katharine A.; Fung, Kathy Z.; Hoffman, Richard M.; Powell, Adam A.; Freedland, Stephen J.; Walter, Louise C.

    2014-01-01

    Objective To determine the distribution of screening PSA values in older men and how different PSA thresholds affect the proportion of white, black, and Latino men who would have an abnormal screening result across advancing age groups. Methods We used linked national VA and Medicare data to determine the value of the first screening PSA test (ng/mL) of 327,284 men age 65+ who underwent PSA screening in the VA healthcare system in 2003. We calculated the proportion of men with an abnormal PSA result based on age, race, and common PSA thresholds. Results Among men age 65+, 8.4% had a PSA >4.0ng/mL. The percentage of men with a PSA >4.0ng/mL increased with age and was highest in black men (13.8%) versus white (8.0%) or Latino men (10.0%) (P<0.001). Combining age and race, the probability of having a PSA >4.0ng/mL ranged from 5.1% of Latino men age 65–69 to 27.4% of black men age 85+. Raising the PSA threshold from >4.0ng/mL to >10.0ng/mL reclassified the greatest percentage of black men age 85+ (18.3% absolute change) and the lowest percentage of Latino men age 65–69 (4.8% absolute change) as being under the biopsy threshold (P<0.001). Conclusions Age, race, and PSA threshold together affect the pre-test probability of an abnormal screening PSA result. Based on screening PSA distributions, stopping screening among men whose PSA is <3ng/mL would mean that over 80% of white and Latino men age 70+ stop further screening, and increasing the biopsy threshold to >10ng/mL has the greatest effect on reducing the number of older black men who will face biopsy decisions after screening. PMID:24439009

  4. Comparison between intensity-duration thresholds and cumulative rainfall thresholds for the forecasting of landslide

    NASA Astrophysics Data System (ADS)

    Lagomarsino, Daniela; Rosi, Ascanio; Rossi, Guglielmo; Segoni, Samuele; Catani, Filippo

    2014-05-01

    This work makes a quantitative comparison between the results of landslide forecasting obtained using two different rainfall threshold models, one using intensity-duration thresholds and the other based on cumulative rainfall thresholds, in an area of northern Tuscany of 116 km2. The first methodology identifies rainfall intensity-duration thresholds by means of a software tool called MaCumBA (Massive CUMulative Brisk Analyzer) that analyzes rain-gauge records, extracts the intensities (I) and durations (D) of the rainstorms associated with the initiation of landslides, plots these values on a diagram, and identifies thresholds that define the lower bounds of the I-D values. A back analysis using data from past events can be used to identify the threshold conditions associated with the least number of false alarms. The second method (SIGMA) is based on the hypothesis that anomalous or extreme values of rainfall are responsible for landslide triggering: the statistical distribution of the rainfall series is analyzed, and multiples of the standard deviation (σ) are used as thresholds to discriminate between ordinary and extraordinary rainfall events. The name of the model, SIGMA, reflects the central role of the standard deviation in the proposed methodology. The definition of intensity-duration rainfall thresholds requires the combined use of rainfall measurements and an inventory of dated landslides, whereas the SIGMA model can be implemented using only rainfall data. These two methodologies were applied in an area of 116 km2 where a database of 1200 landslides was available for the period 2000-2012, and the results obtained are compared and discussed. Although several examples of visual comparisons between different intensity-duration rainfall thresholds are reported in the international literature, a quantitative comparison between thresholds obtained in the same area using different techniques and approaches is a relatively unexplored research topic.
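
    The SIGMA idea of flagging extraordinary rainfall by standard-deviation multiples can be sketched as follows. This is a deliberate simplification under stated assumptions: a synthetic gamma-distributed daily rainfall series and plain σ-multiples of the raw series, rather than the published model's cumulative, duration-dependent formulation.

```python
import numpy as np

def sigma_thresholds(rain, multiples=(1, 2, 3)):
    """Illustrative SIGMA-style thresholds: multiples of the standard deviation
    of a rainfall series separate ordinary from extraordinary events."""
    mu, sigma = rain.mean(), rain.std(ddof=1)
    return {k: mu + k * sigma for k in multiples}

rng = np.random.default_rng(1)
rain = rng.gamma(2.0, 10.0, size=5000)   # synthetic daily rainfall (mm)
thr = sigma_thresholds(rain)
exceed_3sigma = (rain > thr[3]).mean()   # fraction of days flagged as extraordinary
```

    In the real model the σ-curves are calibrated against a landslide inventory only for validation; the thresholds themselves need rainfall data alone, which is the practical advantage noted above.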

  5. A MODELING FRAMEWORK FOR ESTIMATING CHILDREN'S RESIDENTIAL EXPOSURE AND DOSE TO CHLORPYRIFOS VIA DERMAL RESIDUE CONTACT AND NON-DIETARY INGESTION

    EPA Science Inventory

    To help address the Food Quality Protection Act of 1996, a physically-based probabilistic model (Residential Stochastic Human Exposure and Dose Simulation Model for Pesticides; Residential-SHEDS) has been developed to quantify and analyze dermal and non-dietary ingestion exposu...

  6. Assessment of olfactory threshold in patients undergoing radiotherapy for head and neck malignancies.

    PubMed

    Jalali, Mir Mohammad; Gerami, Hooshang; Rahimi, Abbas; Jafari, Manizheh

    2014-10-01

    Radiotherapy is a common treatment modality for patients with head and neck malignancies. As the nose lies within the field of radiotherapy of the head and neck, the olfactory fibers and olfactory receptors may be affected by radiation. The aim of this study was to evaluate changes in olfactory threshold in patients with head and neck malignancies who have received radiation to the head and neck. The olfactory threshold of patients with head and neck malignancies was assessed prospectively before radiation therapy and serially for up to 6 months after radiotherapy using sniff bottles. In vivo dosimetry was performed using 82 LiF (MCP) chips and a thermoluminescent dosimeter (TLD) system. Sixty-one patients were recruited before radiotherapy was commenced. Seven patients did not return for evaluation after radiation. Fifty-four patients were available for follow-up assessment (28 women, 26 men; age, 22-86 years; median, 49 years). Total radiation dose was 50.1 Gy (range, 30-66 Gy). Mean olfactory threshold scores were found to deteriorate significantly at various timepoints after radiotherapy (11.7 before radiotherapy versus 4.0 at Month 6, general linear model, P<0.0001). With in vivo dosimetry, we found that the median measured dose to the olfactory area was 334 µC. We also identified a cutoff point according to the dose to the olfactory epithelium. Olfactory threshold was significantly decreased 2-6 weeks after initiation of therapy, with cumulative local radiation >135 µC (Mann-Whitney U test, P=0.01). Deterioration in olfactory threshold scores was found at 6 months after initiation of radiation therapy. Provided that these results are reproducible, an evaluation of olfactory functioning in patients with head and neck malignancies using in vivo dosimetry may be useful for determining the optimal dose for patients treated with conformal radiotherapy techniques while avoiding the side effects of radiation.

  7. SU-E-I-29: CARE kV: Does It Influence Radiation Dose in Non-Contrast Examination of CT Abdomen/Pelvis?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, J; Ganesh, H; Weir, V

    Purpose: CARE kV is a tool that automatically recommends the optimal kV setting for an individual patient for a specific CT examination. The use of CARE kV depends on the topogram and the user-selected contrast behavior. CARE kV is expected to reduce radiation dose while improving image quality. However, this may work only for certain groups of patients and/or certain CT examinations. This study investigates the effects of CARE kV on radiation dose in non-contrast examinations of CT abdomen/pelvis. Methods: Radiation dose (CTDIvol and DLP) from patients who underwent abdomen/pelvis non-contrast examinations with and without CARE kV was retrospectively reviewed. All patients were scanned on the same scanner (Siemens Somatom AS64). To mitigate any possible influence of technologists' unfamiliarity with CARE kV, the data with CARE kV were retrieved 1.5 years after the start of CARE kV usage. A t-test was used to assess significant differences in radiation dose. Results: Volume CTDIs and DLPs from 18 patients before and 24 patients after the use of CARE kV were obtained over a duration of one month. There was a slight increase in both average CTDIvol and average DLP with CARE kV compared to those without CARE kV (25.52 mGy vs. 22.65 mGy for CTDIvol; 1265.81 mGy-cm vs. 1199.19 mGy-cm for DLP). Statistically, there was no significant difference. Without CARE kV, 140 kV was used in 9 of 18 patients, while with CARE kV, 140 kV was used in 15 of 24 patients. 80 kV was not used in either group. Conclusion: The use of CARE kV may save time for protocol optimization and minimize variability among technologists. Radiation dose reduction was not observed in non-contrast examinations of CT abdomen/pelvis. This was partially because our CT protocols had been tailored according to patient size before CARE kV and partially because of large-size patients.
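
    The group comparison reported above is a plain two-sample t-test. A hedged sketch with synthetic CTDIvol values follows; the normality, the hypothetical common spread of 8 mGy, and the choice of Welch's unequal-variance variant are all assumptions for illustration, not details from the abstract.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# hypothetical CTDIvol samples (mGy) mimicking the reported group means
before = rng.normal(22.65, 8.0, 18)   # without CARE kV, n = 18
after = rng.normal(25.52, 8.0, 24)    # with CARE kV, n = 24

# Welch's t-test: does not assume equal variances between groups
t_stat, p_value = stats.ttest_ind(before, after, equal_var=False)
significant = p_value < 0.05
```

    With group sizes this small, a difference of a few mGy is unlikely to reach significance, consistent with the study's finding of no significant dose change.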

  8. Novel Threshold Changeable Secret Sharing Schemes Based on Polynomial Interpolation

    PubMed Central

    Li, Mingchu; Guo, Cheng; Choo, Kim-Kwang Raymond; Ren, Yizhi

    2016-01-01

    After any distribution of secret sharing shadows in a threshold changeable secret sharing scheme, the threshold may need to be adjusted to deal with changes in the security policy and adversary structure. For example, when employees leave the organization, it is not realistic to expect departing employees to ensure the security of their secret shadows. Therefore, in 2012, Zhang et al. proposed (t → t′, n) and ({t1, t2,⋯, tN}, n) threshold changeable secret sharing schemes. However, their schemes suffer from a number of limitations, such as a strict limit on the threshold values, a large storage space requirement for secret shadows, and significant computation for constructing and recovering polynomials. To address these limitations, we propose two improved dealer-free threshold changeable secret sharing schemes. In our schemes, we construct polynomials to update secret shadows, and use a two-variable one-way function to resist collusion attacks and secure the information stored by the combiner. We then demonstrate that our schemes can adjust the threshold safely. PMID:27792784
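
    For orientation, the polynomial-interpolation machinery these schemes build on is classic Shamir (t, n) sharing: the secret is the constant term of a random degree-(t-1) polynomial over a prime field, and any t shares recover it by Lagrange interpolation at zero. The sketch below shows only this base scheme, not Zhang et al.'s construction or the proposed threshold-changeable variants; the prime and the secret are arbitrary illustrative choices.

```python
import random

P = 2**61 - 1  # a Mersenne prime defining the field GF(P); illustrative choice

def make_shares(secret, t, n):
    """Shamir (t, n) sharing: the secret is the constant term of a random
    degree-(t-1) polynomial over GF(P); shares are points on that polynomial."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def recover(shares):
    """Lagrange interpolation at x = 0 recovers the secret from any t shares."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        # pow(den, P - 2, P) is the modular inverse of den (Fermat's little theorem)
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

shares = make_shares(123456789, t=3, n=5)
```

    Changing the threshold after distribution, which is the paper's subject, requires extra structure on top of this base scheme (e.g. refreshed shadows), precisely because plain Shamir fixes t at dealing time.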

  9. Embracing model-based designs for dose-finding trials

    PubMed Central

    Love, Sharon B; Brown, Sarah; Weir, Christopher J; Harbron, Chris; Yap, Christina; Gaschler-Markefski, Birgit; Matcham, James; Caffrey, Louise; McKevitt, Christopher; Clive, Sally; Craddock, Charlie; Spicer, James; Cornelius, Victoria

    2017-01-01

    Background: Dose-finding trials are essential to drug development as they establish recommended doses for later-phase testing. We aim to motivate wider use of model-based designs for dose finding, such as the continual reassessment method (CRM). Methods: We carried out a literature review of dose-finding designs and conducted a survey to identify perceived barriers to their implementation. Results: We describe the benefits of model-based designs (flexibility, superior operating characteristics, extended scope), their current uptake, and existing resources. The most prominent barriers to implementation of a model-based design were lack of suitable training, chief investigators’ preference for algorithm-based designs (e.g., 3+3), and limited resources for study design before funding. We use a real-world example to illustrate how these barriers can be overcome. Conclusions: There is overwhelming evidence for the benefits of CRM. Many leading pharmaceutical companies routinely implement model-based designs. Our analysis identified barriers for academic statisticians and clinical academics in mirroring the progress industry has made in trial design. Unified support from funders, regulators, and journal editors could result in more accurate doses for later-phase testing, and increase the efficiency and success of clinical drug development. We give recommendations for increasing the uptake of model-based designs for dose-finding trials in academia. PMID:28664918
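
    As an illustration of what "model-based" means here, a one-parameter power-model CRM can be sketched in a few lines: a skeleton of prior toxicity guesses is calibrated by a single parameter, the posterior is computed on a grid after each cohort, and the next dose is the one whose posterior mean toxicity is closest to the target. The skeleton, the prior standard deviation (1.34, a common default in the CRM literature), and the toy outcomes below are illustrative assumptions, not from the paper.

```python
import numpy as np

def crm_next_dose(skeleton, tox, n, target=0.25):
    """One-parameter power-model CRM on a grid: p_i(a) = skeleton_i**exp(a),
    with a ~ N(0, 1.34) prior. Returns the index of the dose whose posterior
    mean toxicity is closest to the target, plus the posterior toxicity curve."""
    a = np.linspace(-3, 3, 601)                       # grid over the model parameter
    prior = np.exp(-a**2 / (2 * 1.34**2))             # unnormalized normal prior
    p = np.power.outer(np.asarray(skeleton, float), np.exp(a))  # doses x grid
    tox = np.asarray(tox)[:, None]
    n = np.asarray(n)[:, None]
    like = np.prod(p**tox * (1 - p)**(n - tox), axis=0)  # binomial likelihood
    w = like * prior
    w /= w.sum()                                       # posterior weights on the grid
    ptox = (p * w).sum(axis=1)                         # posterior mean toxicity per dose
    return int(np.argmin(np.abs(ptox - target))), ptox

# toy trial state: 3/3 toxicities at dose 3, none elsewhere, dose 4 untried
skeleton = [0.05, 0.12, 0.25, 0.40]
dose, ptox = crm_next_dose(skeleton, tox=[0, 0, 3, 0], n=[3, 3, 3, 0], target=0.25)
```

    After three toxicities at the third dose, the posterior toxicity curve shifts upward and the design de-escalates, which is exactly the adaptive behavior algorithm-based rules like 3+3 approximate only crudely.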

  10. Confectionery-based dose forms.

    PubMed

    Tangso, Kristian J; Ho, Quy Phuong; Boyd, Ben J

    2015-01-01

    Conventional dosage forms such as tablets, capsules and syrups are prescribed in the normal course of practice. However, concerns about patient preferences and market demands have given rise to the exploration of novel unconventional dosage forms. Among these, confectionery-based dose forms have strong potential to overcome compliance problems. This report will review the availability of these unconventional dose forms used in treating the oral cavity and for systemic drug delivery, with a focus on medicated chewing gums, medicated lollipops, and oral bioadhesive devices. The aim is to stimulate increased interest in the opportunities for innovative new products that are available to formulators in this field, particularly for atypical patient populations.

  11. Ignition criterion for heterogeneous energetic materials based on hotspot size-temperature threshold

    NASA Astrophysics Data System (ADS)

    Barua, A.; Kim, S.; Horie, Y.; Zhou, M.

    2013-02-01

    A criterion for the ignition of granular explosives (GXs) and polymer-bonded explosives (PBXs) under shock and non-shock loading is developed. The formulation integrates a quantification of the distributions of the sizes and locations of hotspots in loading events, obtained using a recently developed cohesive finite element method (CFEM), with the characterization by Tarver et al. [C. M. Tarver et al., "Critical conditions for impact- and shock-induced hot spots in solid explosives," J. Phys. Chem. 100, 5794-5799 (1996)] of the critical size-temperature threshold of hotspots required for chemical ignition of solid explosives. The criterion, along with the CFEM capability to quantify the thermal-mechanical behavior of GXs and PBXs, allows the critical impact velocity for ignition, time to ignition, and critical input energy at ignition to be determined as functions of material composition, microstructure, and loading conditions. The applicability of the relation between the critical input energy (E) and impact velocity of James [H. R. James, "An extension to the critical energy criterion used to predict shock initiation thresholds," Propellants, Explos., Pyrotech. 21, 8-13 (1996)] for shock loading is examined, leading to a modified interpretation which is sensitive to microstructure and loading condition. As an application, numerical studies are undertaken to evaluate the ignition threshold of the granular high melting point eXplosive octahydro-1,3,5,7-tetranitro-1,3,5,7-tetrazocine (HMX) and HMX/Estane PBX under loading with impact velocities up to 350 m/s and strain rates up to 10^5 /s. Results show that, for the GX, the time to criticality (tc) is strongly influenced by initial porosity, but is insensitive to grain size. Analyses also lead to a quantification of the differences between the responses of the GXs and PBXs in terms of critical impact velocity for ignition, time to ignition, and critical input energy at ignition. Since the framework permits

  12. Dose to heart substructures is associated with non-cancer death after SBRT in stage I-II NSCLC patients.

    PubMed

    Stam, Barbara; Peulen, Heike; Guckenberger, Matthias; Mantel, Frederick; Hope, Andrew; Werner-Wasik, Maria; Belderbos, Jose; Grills, Inga; O'Connell, Nicolette; Sonke, Jan-Jakob

    2017-06-01

    To investigate potential associations between dose to heart (sub)structures and non-cancer death in early stage non-small cell lung cancer (NSCLC) patients treated with stereotactic body radiation therapy (SBRT). 803 patients with early stage NSCLC received SBRT with predominant schedules of 3×18 Gy (59%) or 4×12 Gy (19%). All patients were registered to an average anatomy, their planned dose deformed accordingly, and dosimetric parameters for heart substructures were obtained. Multivariate Cox regression and a sensitivity analysis were used to identify doses to heart substructures or heart regions with a significant association with non-cancer death. Median follow-up was 34.8 months. The two-year Kaplan-Meier overall survival rate was 67%. Of the deceased patients, 26.8% died of cancer. Multivariate analysis showed that the maximum dose to the left atrium (median 6.5 Gy EQD2, range=0.009-197, HR=1.005, p-value=0.035) and the dose to 90% of the superior vena cava (median 0.59 Gy EQD2, range=0.003-70, HR=1.025, p-value=0.008) were significantly associated with non-cancer death. Sensitivity analysis identified the upper region of the heart (atria+vessels) as significantly associated with non-cancer death. Doses to mainly the upper region of the heart were significantly associated with non-cancer death. Consequently, dose sparing, in particular of the upper region of the heart, could potentially improve outcome and should be further studied. Copyright © 2017 Elsevier B.V. All rights reserved.

  13. Determination and validation of soil thresholds for cadmium based on food quality standard and health risk assessment.

    PubMed

    Ding, Changfeng; Ma, Yibing; Li, Xiaogang; Zhang, Taolin; Wang, Xingxiang

    2018-04-01

    Cadmium (Cd) is an environmental toxicant with high rates of soil-plant transfer. It is essential to establish an accurate soil threshold for the implementation of soil management practices. This study takes root vegetables as an example to derive soil thresholds for Cd based on the food quality standard as well as health risk assessment using species sensitivity distribution (SSD). A soil type-specific bioconcentration factor (BCF, the ratio of the Cd concentration in the plant to that in soil), generated from soils with a proper Cd concentration gradient, was calculated and applied in the derivation of soil thresholds instead of a generic BCF value to minimize the uncertainty. The sensitivity variations of twelve root vegetable cultivars in accumulating soil Cd and the empirical soil-plant transfer model were investigated and developed in greenhouse experiments. After normalization, the hazardous concentrations at the fifth percentile of the distribution based on added Cd (HC5 add ) were calculated from the SSD curves fitted by a Burr Type III distribution. The derived soil thresholds are presented as continuous or scenario criteria depending on the combination of soil pH and organic carbon content. The soil thresholds based on the food quality standard were on average 0.7-fold of those based on health risk assessment, and were further validated to be reliable using independent data from field surveys and published articles. The results suggest that deriving soil thresholds for Cd using the SSD method is robust and also applicable to other crops as well as other trace elements that have the potential to cause health risks. Copyright © 2017 Elsevier B.V. All rights reserved.
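
    The HC5-from-SSD step can be sketched as follows. The paper fits a Burr Type III distribution to normalized cultivar sensitivities; the sketch substitutes a log-normal SSD (another widely used choice) and uses hypothetical sensitivity values, so it illustrates the procedure rather than reproducing the paper's numbers.

```python
import numpy as np
from scipy import stats

def hc5(sensitivity_values):
    """HC5 from a species sensitivity distribution: fit a distribution to
    log-sensitivities and take its 5th percentile. A log-normal SSD is used
    here in place of the paper's Burr Type III fit."""
    logs = np.log10(sensitivity_values)
    mu, sd = logs.mean(), logs.std(ddof=1)
    return 10 ** stats.norm.ppf(0.05, loc=mu, scale=sd)

# hypothetical added-Cd sensitivities (mg/kg) for twelve cultivars
ec = np.array([0.8, 1.1, 1.3, 1.5, 1.7, 2.0, 2.2, 2.6, 3.0, 3.5, 4.2, 5.0])
hc5_add = hc5(ec)
```

    The resulting HC5 sits near the lower tail of the cultivar sensitivities, which is the point of the construction: a threshold protective of the most sensitive 95% of the modeled population.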

  14. [The effects of intra-cerebroventricular administered rocuronium on the central nervous system of rats and determination of its epileptic seizure-inducing dose].

    PubMed

    Baykal, Mehmet; Gökmen, Necati; Doğan, Alper; Erbayraktar, Serhat; Yılmaz, Osman; Ocmen, Elvan; Erdost, Hale Aksu; Arkan, Atalay

    The aim of this study was to investigate the effects of intracerebroventricularly administered rocuronium bromide on the central nervous system, determine the seizure threshold dose of rocuronium bromide in rats, and investigate the effects of rocuronium on the central nervous system at 1/5, 1/10, and 1/100 dilutions of the determined seizure threshold dose. A permanent cannula was placed in the lateral cerebral ventricle of the animals. The study was designed in two phases. In the first phase, the seizure threshold dose of rocuronium bromide was determined. In the second phase, Group R 1/5 (n=6), Group 1/10 (n=6), and Group 1/100 (n=6) were formed using doses of 1/5, 1/10, and 1/100, respectively, of the obtained rocuronium bromide seizure threshold dose. The rocuronium bromide seizure threshold value was found to be 0.056±0.009 μmoL. The seizure threshold, as a function of the body weight of the rats, was calculated as 0.286 μmoL/kg. A dose of 1/5 of the seizure threshold dose primarily caused splayed limbs, posturing, and tremors of the entire body, whereas the dose of 1/10 of the seizure threshold dose caused agitation and shivering. A dose of 1/100 of the seizure threshold dose was associated with decreased locomotor activity. This study showed that rocuronium bromide has dose-related deleterious effects on the central nervous system and can produce dose-dependent excitatory effects and seizures. Published by Elsevier Editora Ltda.

  16. Single Low-Dose Ionizing Radiation Induces Genotoxicity in Adult Zebrafish and its Non-Irradiated Progeny.

    PubMed

    Lemos, J; Neuparth, T; Trigo, M; Costa, P; Vieira, D; Cunha, L; Ponte, F; Costa, P S; Metello, L F; Carvalho, A P

    2017-02-01

    This study investigated to what extent a single exposure to low doses of ionizing radiation can induce genotoxic damage in irradiated adult zebrafish (Danio rerio) and its non-irradiated F1 progeny. Four groups of adult zebrafish were irradiated with a single dose of X-rays at 0 (control), 100, 500 and 1000 mGy, respectively, and couples of each group were allowed to reproduce following irradiation. Blood of parental fish and whole-body offspring were analysed by the comet assay for detection of DNA damage. The level of DNA damage in irradiated parental fish increased in a radiation dose-dependent manner at day 1 post-irradiation, but returned to the control level thereafter. The level of DNA damage in the progeny was directly correlated with the parental irradiation dose. Results highlight the genotoxic risk of a single exposure to low-dose ionizing radiation in irradiated individuals and also in its non-irradiated progeny.

  17. Learning From Trials on Radiation Dose in Non-Small Cell Lung Cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bradley, Jeffrey, E-mail: jbradley@wustl.edu; Hu, Chen

    2016-11-15

    In this issue of the International Journal of Radiation Oncology • Biology • Physics, Taylor et al. present a meta-analysis of published data supporting 2 findings: (1) radiation dose escalation seems to benefit patients who receive radiation alone for non-small cell lung cancer; and (2) radiation dose escalation has a detrimental effect on overall survival in the setting of concurrent chemotherapy. The latter finding is supported by data but has perplexed the oncology community. Perhaps these findings are not perplexing at all. Perhaps it is simply another lesson in the major principle of radiation oncology: minimize radiation dose to normal tissues.

  18. Sign language spotting with a threshold model based on conditional random fields.

    PubMed

    Yang, Hee-Deok; Sclaroff, Stan; Lee, Seong-Whan

    2009-07-01

    Sign language spotting is the task of detecting and recognizing signs in a signed utterance, in a set vocabulary. The difficulty of sign language spotting is that instances of signs vary in both motion and appearance. Moreover, signs appear within a continuous gesture stream, interspersed with transitional movements between signs in a vocabulary and nonsign patterns (which include out-of-vocabulary signs, epentheses, and other movements that do not correspond to signs). In this paper, a novel method for designing threshold models in a conditional random field (CRF) model is proposed which performs an adaptive threshold for distinguishing between signs in a vocabulary and nonsign patterns. A short-sign detector, a hand appearance-based sign verification method, and a subsign reasoning method are included to further improve sign language spotting accuracy. Experiments demonstrate that our system can spot signs from continuous data with an 87.0 percent spotting rate and can recognize signs from isolated data with a 93.5 percent recognition rate versus 73.5 percent and 85.4 percent, respectively, for CRFs without a threshold model, short-sign detection, subsign reasoning, and hand appearance-based sign verification. Our system can also achieve a 15.0 percent sign error rate (SER) from continuous data and a 6.4 percent SER from isolated data versus 76.2 percent and 14.5 percent, respectively, for conventional CRFs.

  19. Low-dose right unilateral electroconvulsive therapy (ECT): effectiveness of the first treatment.

    PubMed

    Lapidus, Kyle A B; Shin, Joseph S W; Pasculli, Rosa M; Briggs, Mimi C; Popeo, Dennis M; Kellner, Charles H

    2013-06-01

    Electroconvulsive therapy (ECT) is a widely used, highly effective antidepressant treatment. Except for the most severely ill patients, right unilateral (RUL) electrode placement is the most frequent initial treatment choice. In current practice, RUL ECT is administered at several multiples of seizure threshold (ST), based on reports that lower stimulus intensity results in lower response/remission rates. Many patients, as part of an initial dose titration to determine ST, will receive a single treatment with low-dose RUL ECT and subsequent treatments with a stimulus at a multiple of ST. Our aim was to assess response to this first treatment. A retrospective analysis of charts from clinical practice at Mount Sinai Medical Center was performed. A single treatment with low-dose (presumably near-ST) RUL ECT had a significant and immediate antidepressant effect in our sample of patients with major depression. We determined that this response is similar to that of patients receiving a single initial treatment with high-dose RUL ECT (at a multiple of ST). These data suggest, contrary to commonly held belief, that RUL ECT may be effective at a low stimulus dose. This argues against restimulating at 6 times ST in the initial session based on the belief that the near-threshold seizure has no antidepressant efficacy. Our findings suggest a need for further investigation of cases in which low-dose RUL ECT may be an effective antidepressant treatment. Further prospective studies, including larger numbers of patients randomized to low- or high-dose RUL ECT with longer follow-up, are indicated.

  20. Peaks Over Threshold (POT): A methodology for automatic threshold estimation using goodness of fit p-value

    NASA Astrophysics Data System (ADS)

    Solari, Sebastián.; Egüen, Marta; Polo, María. José; Losada, Miguel A.

    2017-04-01

    Threshold estimation in the Peaks Over Threshold (POT) method and the impact of the estimation method on the calculation of high return period quantiles and their uncertainty (or confidence intervals) are issues that are still unresolved. In the past, methods based on goodness of fit tests and EDF-statistics have yielded satisfactory results, but their use has not yet been systematized. This paper proposes a methodology for automatic threshold estimation, based on the Anderson-Darling EDF-statistic and goodness of fit test. When combined with bootstrapping techniques, this methodology can be used to quantify both the uncertainty of threshold estimation and its impact on the uncertainty of high return period quantiles. This methodology was applied to several simulated series and to four precipitation/river flow data series. The results obtained confirmed its robustness. For the measured series, the estimated thresholds corresponded to those obtained by nonautomatic methods. Moreover, even though the uncertainty of the threshold estimation was high, this did not have a significant effect on the width of the confidence intervals of high return period quantiles.
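
    A minimal version of the proposed scan can be sketched as follows: for each candidate threshold, fit a generalized Pareto distribution (GPD) to the excesses and score the fit with the Anderson-Darling statistic. The published method selects thresholds via the AD goodness-of-fit p-value (with bootstrapping for the uncertainty analysis); the sketch below uses the raw A² statistic as a simpler proxy and runs on synthetic data.

```python
import numpy as np
from scipy import stats

def ad_statistic(x, dist):
    """Anderson-Darling statistic of sample x against a fitted distribution:
    A^2 = -n - (1/n) * sum (2i-1) [ln F(x_(i)) + ln(1 - F(x_(n+1-i)))]."""
    x = np.sort(x)
    n = len(x)
    F = np.clip(dist.cdf(x), 1e-10, 1 - 1e-10)
    i = np.arange(1, n + 1)
    return -n - np.mean((2 * i - 1) * (np.log(F) + np.log(1 - F[::-1])))

def pot_threshold(series, candidates):
    """Scan candidate thresholds; fit a GPD to the excesses over each candidate
    and keep the one whose fit has the smallest A^2 (proxy for best p-value)."""
    best = None
    for u in candidates:
        excesses = series[series > u] - u
        if len(excesses) < 30:      # too few peaks to fit reliably
            continue
        c, _, scale = stats.genpareto.fit(excesses, floc=0)
        a2 = ad_statistic(excesses, stats.genpareto(c, loc=0, scale=scale))
        if best is None or a2 < best[1]:
            best = (u, a2)
    return best

rng = np.random.default_rng(2)
data = rng.exponential(10.0, 3000)  # exponential tail: a GPD with shape 0
u, a2 = pot_threshold(data, candidates=np.percentile(data, [70, 80, 90, 95]))
```

    Replacing the A² proxy with a p-value (e.g. from parametric bootstrap of the AD statistic under the fitted GPD) recovers the automatic selection rule the paper systematizes.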

  1. Flavour and identification threshold detection overview of Slovak adepts for certified testing.

    PubMed

    Vietoris, Vladimír; Barborova, Petra; Jancovicova, Jana; Eliasova, Lucia; Karvaj, Marian

    2016-07-01

    During the certification process of sensory assessors by the Slovak certification body, we obtained results for basic taste thresholds and lifestyle habits. 500 adults with a food industry background were screened during the experiment. For the analysis of basic and non-basic tastes, we used the standardized procedure of ISO 8586-1:1993. In the flavour test experiment, the 26-35 y.o. group produced the lowest error ratio (1.438); the highest was the 56+ y.o. group (2.0). The average error value was 1.510 for women, compared to 1.477 for men. People with allergies had an average error ratio of 1.437, compared to 1.511 for people without allergies. Non-smokers produced fewer errors (1.484) than smokers (1.576). Another flavour threshold identification test detected differences among age groups (error values increased with age). In metallic taste, men made their highest number of errors (24%), similar to women (22%). Men made more errors than women in salty taste (19% versus 10%). The analysis detected some differences between the allergic/non-allergic and smoker/non-smoker groups.

  2. Prophylactic G-CSF and antibiotics enable a significant dose-escalation of triplet-chemotherapy in non-small cell lung cancer.

    PubMed

    Timmer-Bonte, J N H; Punt, C J A; vd Heijden, H F M; van Die, C E; Bussink, J; Beijnen, J H; Huitema, A D R; Tjan-Heijnen, V C G

    2008-05-01

    In advanced non-small cell lung cancer (NSCLC) the clinical benefit of a platinum-based doublet is only modest; therefore, attenuated-dose three-drug combinations are being investigated. We hypothesized that, with adequate support, a full-dose chemotherapy triplet is feasible. The study was designed as a dose-finding study of paclitaxel in chemotherapy-naive patients. Paclitaxel was given as a 3-h infusion on day 1, followed by fixed doses of teniposide (or etoposide) 100 mg/m(2) on days 1, 3, and 5 and cisplatin 80 mg/m(2) on day 1 every 3 weeks. As myelotoxicity was expected to be the dose-limiting toxicity, prophylactic G-CSF and antibiotic support was evaluated. Indeed, paclitaxel 120 mg/m(2) resulted in dose-limiting neutropenia despite G-CSF support. Teniposide/etoposide on days 1, 3, and 5 was less myelotoxic than on days 1, 2, and 3. G-CSF support allowed paclitaxel dose-escalation to 250 mg/m(2). The addition of prophylactic antibiotics enabled dose-escalation to 275 mg/m(2) without reaching the MTD. In conclusion, G-CSF and antibiotic prophylaxis enables the delivery of a full-dose chemotherapy triplet in previously untreated NSCLC patients.

  3. Evaluation and comparison of 50 Hz current threshold of electrocutaneous sensations using different methods

    PubMed Central

    Lindenblatt, G.; Silny, J.

    2006-01-01

    Leakage currents, tiny currents flowing from an everyday appliance through the body to the ground, can cause a non-adequate perception (called an electrocutaneous sensation, ECS) or even pain and should be avoided. Safety standards for the low-frequency range are based on experimental results for the current threshold of electrocutaneous sensations, which, however, show a wide range between about 50 μA (rms) and 1000 μA (rms). To explain these differences, the perception threshold was measured repeatedly in experiments with test persons under an identical experimental setup but by means of different methods (measuring strategies), namely: direct adjustment, the classical threshold defined as the amperage of 50% perception probability, and the confidence rating procedure of signal detection theory. The current was injected using a 1 cm2 electrode at the highly touch-sensitive part of the index fingertip. These investigations show for the first time that the threshold of electrocutaneous sensations is influenced both by adaptation to the non-adequate stimulus and by individual, emotional factors. Therefore, classical methods, on which the majority of safety investigations are based, cannot be used to determine a leakage current threshold. The confidence rating procedure of modern signal detection theory yields a value of 179.5 μA (rms) at the 50 Hz power supply net frequency as the lower end of the 95% confidence range, considering the variance in the investigated group. This value is expected to be free of adaptation influences; it is distinctly lower than the European limits and supports the stricter regulations of Canada and the USA. PMID:17111461

  4. Is Weight-Based Adjustment of Automatic Exposure Control Necessary for the Reduction of Chest CT Radiation Dose?

    PubMed Central

    Prakash, Priyanka; Gilman, Matthew D.; Shepard, Jo-Anne O.; Digumarthy, Subba R.

    2010-01-01

    Objective To assess the effects of radiation dose reduction in chest CT using a weight-based adjustment of the automatic exposure control (AEC) technique. Materials and Methods With institutional review board approval, 60 patients (mean age, 59.1 years; M:F = 35:25) and 57 weight-matched patients (mean age, 52.3 years; M:F = 25:32) were scanned using weight-adjusted AEC and non-weight-adjusted AEC, respectively, on a 64-slice multidetector CT with a 0.984:1 pitch, 0.5-second rotation time, 40 mm table feed/rotation, and 2.5 mm section thickness. Patients were categorized into 3 weight categories: < 60 kg (n = 17), 60-90 kg (n = 52), and > 90 kg (n = 48). Patient weights, scanning parameters, CT dose index volumes (CTDIvol) and dose-length products (DLP) were recorded, while effective dose (ED) was estimated. Image noise was measured in the descending thoracic aorta. Data were analyzed using a standard statistical package (SAS/STAT, Version 9.1, SAS Institute Inc, Cary, NC). Results Compared to non-weight-adjusted AEC, the weight-adjusted AEC technique resulted in an average decrease of 29% in CTDIvol and a 27% effective dose reduction (p < 0.0001). With weight-adjusted AEC, the CTDIvol decreased to 15.8, 15.9, and 27.3 mGy for the < 60, 60-90 and > 90 kg weight groups, respectively, compared to 20.3, 27.9 and 32.8 mGy with non-weight-adjusted AEC. No significant difference was observed in objective image noise between chest CT acquired with the non-weight-adjusted (15.0 ± 3.1) and weight-adjusted (16.1 ± 5.6) AEC techniques (p > 0.05). Conclusion The results of this study suggest that AEC should be tailored according to patient weight. Without weight-based adjustment of AEC, patients are exposed to a 17-43% higher radiation dose from a chest CT. PMID:20046494
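The effective-dose estimate and the reported percentage reductions follow from simple arithmetic. The sketch below assumes the commonly used chest conversion coefficient k ≈ 0.014 mSv/(mGy·cm), which the abstract itself does not state:

```python
def effective_dose_msv(dlp_mgy_cm, k=0.014):
    """ED estimate: dose-length product (mGy*cm) times a region-specific
    conversion coefficient; k ~ 0.014 mSv/(mGy*cm) is typical for chest CT."""
    return dlp_mgy_cm * k

def percent_reduction(baseline, adjusted):
    """Percentage decrease of `adjusted` relative to `baseline`."""
    return 100.0 * (baseline - adjusted) / baseline
```

For the 60-90 kg group, `percent_reduction(27.9, 15.9)` gives roughly 43%, the upper end of the 17-43% range quoted in the conclusion.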

  5. Mitochondrial threshold effects.

    PubMed Central

    Rossignol, Rodrigue; Faustin, Benjamin; Rocher, Christophe; Malgat, Monique; Mazat, Jean-Pierre; Letellier, Thierry

    2003-01-01

    The study of mitochondrial diseases has revealed dramatic variability in the phenotypic presentation of mitochondrial genetic defects. To attempt to understand this variability, different authors have studied energy metabolism in transmitochondrial cell lines carrying different proportions of various pathogenic mutations in their mitochondrial DNA. The same kinds of experiments have been performed on isolated mitochondria and on tissue biopsies taken from patients with mitochondrial diseases. The results have shown that, in most cases, phenotypic manifestation of the genetic defect occurs only when a threshold level is exceeded, and this phenomenon has been named the 'phenotypic threshold effect'. Subsequently, several authors showed that it was possible to inhibit considerably the activity of a respiratory chain complex, up to a critical value, without affecting the rate of mitochondrial respiration or ATP synthesis. This phenomenon was called the 'biochemical threshold effect'. More recently, quantitative analysis of the effects of various mutations in mitochondrial DNA on the rate of mitochondrial protein synthesis has revealed the existence of a 'translational threshold effect'. In this review these different mitochondrial threshold effects are discussed, along with their molecular bases and the roles that they play in the presentation of mitochondrial diseases. PMID:12467494

  6. Diversity Outbred Mice Identify Population-Based Exposure Thresholds and Genetic Factors that Influence Benzene-Induced Genotoxicity

    PubMed Central

    Gatti, Daniel M.; Morgan, Daniel L.; Kissling, Grace E.; Shockley, Keith R.; Knudsen, Gabriel A.; Shepard, Kim G.; Price, Herman C.; King, Deborah; Witt, Kristine L.; Pedersen, Lars C.; Munger, Steven C.; Svenson, Karen L.; Churchill, Gary A.

    2014-01-01

    Background Inhalation of benzene at levels below the current exposure limit values leads to hematotoxicity in occupationally exposed workers. Objective We sought to evaluate Diversity Outbred (DO) mice as a tool for exposure threshold assessment and to identify genetic factors that influence benzene-induced genotoxicity. Methods We exposed male DO mice to benzene (0, 1, 10, or 100 ppm; 75 mice/exposure group) via inhalation for 28 days (6 hr/day for 5 days/week). The study was repeated using two independent cohorts of 300 animals each. We measured micronuclei frequency in reticulocytes from peripheral blood and bone marrow and applied benchmark concentration modeling to estimate exposure thresholds. We genotyped the mice and performed linkage analysis. Results We observed a dose-dependent increase in benzene-induced chromosomal damage and estimated a benchmark concentration limit of 0.205 ppm benzene using DO mice. This estimate is an order of magnitude below the value estimated using B6C3F1 mice. We identified a locus on Chr 10 (31.87 Mb) that contained a pair of overexpressed sulfotransferases that were inversely correlated with genotoxicity. Conclusions The genetically diverse DO mice provided a reproducible response to benzene exposure. The DO mice display interindividual variation in toxicity response and, as such, may more accurately reflect the range of response that is observed in human populations. Studies using DO mice can localize genetic associations with high precision. The identification of sulfotransferases as candidate genes suggests that DO mice may provide additional insight into benzene-induced genotoxicity. Citation French JE, Gatti DM, Morgan DL, Kissling GE, Shockley KR, Knudsen GA, Shepard KG, Price HC, King D, Witt KL, Pedersen LC, Munger SC, Svenson KL, Churchill GA. 2015. Diversity Outbred mice identify population-based exposure thresholds and genetic factors that influence benzene-induced genotoxicity. Environ Health Perspect 123:237
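Benchmark concentration modeling, in its simplest form, inverts a fitted dose-response curve at a chosen benchmark response. The sketch below is not the authors' model (they would have used dedicated benchmark-dose software and a richer curve family); it fits a straight line by least squares and is purely illustrative:

```python
def benchmark_concentration(doses, responses, bmr):
    """Fit response = a + b*dose by least squares, then invert the slope to get
    the exposure increment producing the benchmark response (BMR) above background."""
    n = len(doses)
    mean_d = sum(doses) / n
    mean_r = sum(responses) / n
    slope = sum((d - mean_d) * (r - mean_r) for d, r in zip(doses, responses)) \
        / sum((d - mean_d) ** 2 for d in doses)
    return bmr / slope
```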

  7. Predictors of the nicotine reinforcement threshold, compensation, and elasticity of demand in a rodent model of nicotine reduction policy.

    PubMed

    Grebenstein, Patricia E; Burroughs, Danielle; Roiko, Samuel A; Pentel, Paul R; LeSage, Mark G

    2015-06-01

    The FDA is considering reducing the nicotine content in tobacco products as a population-based strategy to reduce tobacco addiction. Research is needed to determine the threshold level of nicotine needed to maintain smoking and the extent of compensatory smoking that could occur during nicotine reduction. Sources of variability in these measures across sub-populations also need to be identified so that policies can take into account the risks and benefits of nicotine reduction in vulnerable populations. The present study examined these issues in a rodent nicotine self-administration model of nicotine reduction policy to characterize individual differences in nicotine reinforcement thresholds, degree of compensation, and elasticity of demand during progressive reduction of the unit nicotine dose. The ability of individual differences in baseline nicotine intake and nicotine pharmacokinetics to predict responses to dose reduction was also examined. Considerable variability in the reinforcement threshold, compensation, and elasticity of demand was evident. High baseline nicotine intake was not correlated with the reinforcement threshold, but predicted less compensation and less elastic demand. Higher nicotine clearance predicted low reinforcement thresholds, greater compensation, and less elastic demand. Less elastic demand also predicted lower reinforcement thresholds. These findings suggest that baseline nicotine intake, nicotine clearance, and the essential value of nicotine (i.e. elasticity of demand) moderate the effects of progressive nicotine reduction in rats and warrant further study in humans. They also suggest that smokers with fast nicotine metabolism may be more vulnerable to the risks of nicotine reduction. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
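Elasticity of demand in such studies is estimated from consumption across unit prices (price taken as 1/unit dose). A minimal point estimate, assuming a simple log-log regression rather than the exponential demand model typically fitted in this literature, could look like:

```python
import math

def demand_elasticity(unit_doses, infusions):
    """Slope of log10(total intake) against log10(price), with price = 1/unit dose.
    A slope of 0 means perfect compensation (constant intake across doses);
    increasingly negative slopes indicate more elastic demand."""
    xs = [math.log10(1.0 / d) for d in unit_doses]
    ys = [math.log10(d * i) for d, i in zip(unit_doses, infusions)]  # total intake
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
```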

  8. Predictors of the nicotine reinforcement threshold, compensation, and elasticity of demand in a rodent model of nicotine reduction policy*

    PubMed Central

    Grebenstein, Patricia E.; Burroughs, Danielle; Roiko, Samuel A.; Pentel, Paul R.; LeSage, Mark G.

    2015-01-01

    Background The FDA is considering reducing the nicotine content in tobacco products as a population-based strategy to reduce tobacco addiction. Research is needed to determine the threshold level of nicotine needed to maintain smoking and the extent of compensatory smoking that could occur during nicotine reduction. Sources of variability in these measures across sub-populations also need to be identified so that policies can take into account the risks and benefits of nicotine reduction in vulnerable populations. Methods The present study examined these issues in a rodent nicotine self-administration model of nicotine reduction policy to characterize individual differences in nicotine reinforcement thresholds, degree of compensation, and elasticity of demand during progressive reduction of the unit nicotine dose. The ability of individual differences in baseline nicotine intake and nicotine pharmacokinetics to predict responses to dose reduction was also examined. Results Considerable variability in the reinforcement threshold, compensation, and elasticity of demand was evident. High baseline nicotine intake was not correlated with the reinforcement threshold, but predicted less compensation and less elastic demand. Higher nicotine clearance predicted low reinforcement thresholds, greater compensation, and less elastic demand. Less elastic demand also predicted lower reinforcement thresholds. Conclusions These findings suggest that baseline nicotine intake, nicotine clearance, and the essential value of nicotine (i.e. elasticity of demand) moderate the effects of progressive nicotine reduction in rats and warrant further study in humans. They also suggest that smokers with fast nicotine metabolism may be more vulnerable to the risks of nicotine reduction. PMID:25891231

  9. Threshold virus dynamics with impulsive antiretroviral drug effects

    PubMed Central

    Lou, Jie; Lou, Yijun; Wu, Jianhong

    2013-01-01

    The purposes of this paper are twofold: to develop a rigorous approach to analyze the threshold behaviors of nonlinear virus dynamics models with impulsive drug effects and to examine the feasibility of virus clearance following the Manuals of National AIDS Free Antiviral Treatment in China. An impulsive system of differential equations is developed to describe the within-host virus dynamics of both wild-type and drug-resistant strains when a combination of antiretroviral drugs is used to induce instantaneous drug effects at a sequence of dosing times equally spaced while drug concentrations decay exponentially after the dosing time. Threshold parameters are derived using the basic reproduction number of periodic epidemic models, and are used to depict virus clearance/persistence scenarios using the theory of asymptotic periodic systems and the persistence theory of discrete dynamical systems. Numerical simulations using model systems parametrized in terms of the antiretroviral therapy recommended in the aforementioned Manuals illustrate the theoretical threshold virus dynamics, and examine conditions under which the impulsive antiretroviral therapy leads to treatment success. In particular, our results show that only the drug-resistant strain can dominate (the first-line treatment program guided by the Manuals) or both strains may be rapidly eliminated (the second-line treatment program), thus the work indicates the importance of implementing the second-line treatment program as soon as possible. PMID:21987085
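The impulsive-dosing dynamics described above, an instantaneous concentration jump at each equally spaced dose followed by exponential decay, admit a closed-form steady-state trough concentration, which is the natural quantity for a threshold criterion. The toy criterion below is an illustration of that one ingredient, not of the paper's reproduction-number analysis:

```python
import math

def steady_state_trough(dose, decay_rate, interval):
    """Pre-dose (trough) concentration once dosing reaches steady state:
    the geometric series D*r + D*r^2 + ... with r = exp(-k*tau)."""
    r = math.exp(-decay_rate * interval)
    return dose * r / (1.0 - r)

def sustained_above_threshold(dose, decay_rate, interval, threshold):
    """Toy clearance criterion: the drug concentration never falls below
    `threshold` once steady state is reached."""
    return steady_state_trough(dose, decay_rate, interval) >= threshold
```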

  10. Evidence for Dose-Additive Effects of Pyrethroids on Motor Activity in Rats

    PubMed Central

    Wolansky, Marcelo J.; Gennings, Chris; DeVito, Michael J.; Crofton, Kevin M.

    2009-01-01

    Background Pyrethroids are neurotoxic insecticides used in a variety of indoor and outdoor applications. Previous research characterized the acute dose–effect functions for 11 pyrethroids administered orally in corn oil (1 mL/kg) based on assessment of motor activity. Objectives We used a mixture of these 11 pyrethroids and the same testing paradigm used in single-compound assays to test the hypothesis that cumulative neurotoxic effects of pyrethroid mixtures can be predicted using the default dose–addition theory. Methods Mixing ratios of the 11 pyrethroids in the tested mixture were based on the ED30 (effective dose that produces a 30% decrease in response) of the individual chemical (i.e., the mixture comprised equipotent amounts of each pyrethroid). The highest concentration of each individual chemical in the mixture was less than the threshold for inducing behavioral effects. Adult male rats received acute oral exposure to corn oil (control) or dilutions of the stock mixture solution. The mixture of 11 pyrethroids was administered either simultaneously (2 hr before testing) or after a sequence based on times of peak effect for the individual chemicals (4, 2, and 1 hr before testing). A threshold additivity model was fit to the single-chemical data to predict the theoretical dose–effect relationship for the mixture under the assumption of dose additivity. Results When subthreshold doses of individual chemicals were combined in the mixtures, we found significant dose-related decreases in motor activity. Further, we found no departure from the predicted dose-additive curve regardless of the mixture dosing protocol used. Conclusion In this article we present the first in vivo evidence on pyrethroid cumulative effects supporting the default assumption of dose addition. PMID:20019907
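Dose addition can be made concrete by expressing each component's dose in units of its own ED30 and summing. The helper below is a minimal sketch of that bookkeeping (the paper fits a full threshold-additivity model to the single-chemical curves):

```python
def dose_addition_index(doses, ed30s):
    """Sum of each component's dose expressed as a fraction of its own ED30.
    Under dose addition, an index of 1.0 predicts roughly an ED30-level joint
    effect even when every individual component is below its own threshold."""
    return sum(d / e for d, e in zip(doses, ed30s))
```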

  11. Combinatorial DNA Damage Pairing Model Based on X-Ray-Induced Foci Predicts the Dose and LET Dependence of Cell Death in Human Breast Cells

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vadhavkar, Nikhil; Pham, Christopher; Georgescu, Walter

    our model are based on experimental RIF and are three times larger than the hypothetical LEM voxel used to fit survival curves. Our model is therefore an alternative to previous approaches that provides a testable biological mechanism (i.e., RIF). In addition, we propose that DSB pairing will help develop more accurate alternatives to the linear cancer risk model (LNT) currently used for regulating exposure to very low levels of ionizing radiation.

  12. Influence of intravenous opioid dose on postoperative ileus.

    PubMed

    Barletta, Jeffrey F; Asgeirsson, Theodor; Senagore, Anthony J

    2011-07-01

    Intravenous opioids represent a major component in the pathophysiology of postoperative ileus (POI). However, the most appropriate measure and threshold to quantify the association between opioid dose (e.g., average daily, cumulative, maximum daily) and POI remain unknown. Our objective was to evaluate the relationship between opioid dose, POI, and length of stay (LOS) and to identify the opioid measure most strongly associated with POI. Consecutive patients admitted to a community teaching hospital who underwent elective colorectal surgery by any technique with an enhanced-recovery protocol postoperatively were retrospectively identified. Patients were excluded if they received epidural analgesia, developed a major intraabdominal complication or medical complication, or had a prolonged workup prior to surgery. Intravenous opioid doses were quantified and converted to hydromorphone equivalents. Classification and regression tree (CART) analysis was used to determine the dosing threshold for the opioid measure most associated with POI and to define high versus low opioid use. Risk factors for POI and prolonged LOS were determined through multivariate analysis. The incidence of POI in 279 patients was 8.6%. CART analysis identified a maximum daily intravenous hydromorphone dose of 2 mg or more as the opioid measure most associated with POI. Multivariate analysis revealed a maximum daily hydromorphone dose of 2 mg or more (p = 0.034), open surgical technique (p = 0.045), and days of intravenous narcotic therapy (p = 0.003) as significant risk factors for POI. Variables associated with increased LOS were POI (p < 0.001), maximum daily hydromorphone dose of 2 mg or more (p < 0.001), and age (p = 0.005); laparoscopy (p < 0.001) was associated with a decreased LOS. Intravenous opioid therapy is significantly associated with POI and prolonged LOS, particularly when the maximum hydromorphone dose per day exceeds 2 mg. Clinicians should consider alternative, nonopioid-based pain

  13. Validation of GPU based TomoTherapy dose calculation engine.

    PubMed

    Chen, Quan; Lu, Weiguo; Chen, Yu; Chen, Mingli; Henderson, Douglas; Sterpin, Edmond

    2012-04-01

    The graphic processing unit (GPU) based TomoTherapy convolution/superposition (C/S) dose engine (GPU dose engine) achieves a dramatic performance improvement over the traditional CPU-cluster based TomoTherapy dose engine (CPU dose engine). Besides the architecture difference between the GPU and CPU, there are several algorithm changes from the CPU dose engine to the GPU dose engine. These changes make the GPU dose slightly different from the CPU-cluster dose. Before the commercial release of the GPU dose engine, its accuracy therefore had to be validated. Thirty-eight TomoTherapy phantom plans and 19 patient plans were calculated with both dose engines to evaluate the equivalency of the two dose engines. Gamma indices (Γ) were used for the equivalency evaluation. The GPU dose was further verified against absolute point dose measurements with an ion chamber and against film measurements for the phantom plans. Monte Carlo calculation was used as a reference for both dose engines in the accuracy evaluation in a heterogeneous phantom and in actual patients. The GPU dose engine showed excellent agreement with the current CPU dose engine. The majority of cases had over 99.99% of voxels with Γ(1%, 1 mm) < 1. The worst case observed among the phantom plans had 0.22% of voxels violating the criterion. In patient cases, the worst percentage of voxels violating the criterion was 0.57%. For absolute point dose verification, all cases agreed with measurement to within ±3%, with an average error magnitude within 1%. All cases passed the acceptance criterion that more than 95% of the pixels have Γ(3%, 3 mm) < 1 in film measurement, and the average passing pixel percentage was 98.5%-99%. The GPU dose engine also showed a similar degree of accuracy in heterogeneous media as the current TomoTherapy dose engine. It is verified and validated that the ultrafast TomoTherapy GPU dose engine can safely replace the existing TomoTherapy cluster-based dose engine without degradation in dose accuracy.
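The gamma index used for the equivalency evaluation combines a dose difference and a distance-to-agreement into a single pass/fail quantity. A brute-force 1-D version, using an absolute rather than percent-of-maximum dose tolerance for simplicity, can be sketched as:

```python
import math

def gamma_index(ref, evl, spacing, dose_tol, dist_tol):
    """Brute-force 1-D gamma: for each reference point, take the minimum over
    all evaluated points of sqrt((dose diff/dose_tol)^2 + (distance/dist_tol)^2)."""
    out = []
    for i, dr in enumerate(ref):
        best = float("inf")
        for j, de in enumerate(evl):
            dd = (de - dr) / dose_tol
            dx = (j - i) * spacing / dist_tol
            best = min(best, math.sqrt(dd * dd + dx * dx))
        out.append(best)
    return out

def pass_rate(gammas):
    """Percentage of points with gamma <= 1 (the usual acceptance criterion)."""
    return 100.0 * sum(g <= 1.0 for g in gammas) / len(gammas)
```

Clinical implementations interpolate the evaluated distribution and work in 3-D; this sketch only shows the structure of the metric.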

  14. An organ-based approach to dose calculation in the assessment of dose-dependent biological effects of ionising radiation in Arabidopsis thaliana.

    PubMed

    Biermans, Geert; Horemans, Nele; Vanhoudt, Nathalie; Vandenhove, Hildegarde; Saenen, Eline; Van Hees, May; Wannijn, Jean; Vives i Batlle, Jordi; Cuypers, Ann

    2014-07-01

    There is a need for a better understanding of the biological effects of radiation exposure in non-human biota. Correct description of these effects requires a more detailed model of dosimetry than that available in current risk assessment tools, particularly for plants. In this paper, we propose a simple model for dose calculations in roots and shoots of Arabidopsis thaliana seedlings exposed to radionuclides in a hydroponic exposure setup. This model is used to compare absorbed doses for three radionuclides: (241)Am (α-radiation), (90)Sr (β-radiation) and (133)Ba (γ-radiation). Using established dosimetric calculation methods, dose conversion coefficient (DCC) values were determined for each organ separately, based on uptake data from the different plant organs. These calculations were then compared to the DCC values obtained with the ERICA tool under equivalent geometry assumptions. Compared with our new method, the ERICA tool appears to overestimate internal doses and underestimate external doses in the roots for all three radionuclides, though each to a different extent. These observations might help to refine dose-response relationships. The DCC values for (90)Sr in roots deviate the most. A dose-effect curve for (90)Sr β-radiation was established for biomass and photosynthesis endpoints, but no significant dose-dependent effects were observed. This indicates the need for endpoints at the molecular and physiological scale. Copyright © 2013 Elsevier Ltd. All rights reserved.

  15. Ultra-low dose (+)-naloxone restores the thermal threshold of morphine tolerant rats.

    PubMed

    Chou, Kuang-Yi; Tsai, Ru-Yin; Tsai, Wei-Yuan; Wu, Ching-Tang; Yeh, Chun-Chang; Cherng, Chen-Hwan; Wong, Chih-Shung

    2013-12-01

    It is well known that long-term morphine infusion leads to tolerance. We previously demonstrated that both co-infusion and post-administration of ultra-low dose (±)-naloxone restore the antinociceptive effect of morphine in morphine-tolerant rats. However, it remained unclear whether ultra-low dose (±)-naloxone acts through opioid receptors. Therefore, in the present study, we further investigated the effect of ultra-low dose (+)-naloxone, which does not bind to opioid receptors, on the antinociceptive effect of morphine. Male Wistar rats were implanted with one or two intrathecal (i.t.) catheters; one catheter was connected to a mini-osmotic pump, used for infusion of morphine (15 μg/h), ultra-low dose (+)-naloxone (15 pg/h), morphine plus ultra-low dose (+)-naloxone (15 pg/h) or saline (1 μl/h) for 5 days. On day 5, either ultra-low dose (+)-naloxone (15 pg) or saline (5 μl) was injected via the other catheter immediately after the morphine or saline infusion was discontinued. Three hours later, morphine (15 μg in 5 μl saline) or saline was given intrathecally. All rats received the nociceptive tail-flick test every 30 minutes for 120 minutes after the morphine challenge at different temperatures (45-52°C). Our results showed that both co-infusion and post-treatment of ultra-low dose (+)-naloxone with morphine preserve the antinociceptive effect of morphine. Moreover, in the post-administration rats, ultra-low dose (+)-naloxone further enhanced the antinociceptive effect of morphine. This study provides evidence for ultra-low dose (+)-naloxone as a therapeutic adjuvant for patients who need long-term opioid administration for pain management. Copyright © 2013. Published by Elsevier B.V.

  16. High dose of plasmid IL-15 inhibits immune responses in an influenza non-human primates immunogenicity model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yin Jiangmei; Dai Anlan; Laddy, Dominick J.

    2009-10-10

    Interleukin (IL)-15 is a cytokine that is important for the maintenance of long-lasting, high-avidity T cell responses to invading pathogens and has, therefore, been used in vaccine and therapeutic platforms as an adjuvant. In addition to pure protein delivery, plasmids encoding the IL-15 gene have been utilized. However, it is critical to determine the appropriate dose to maximize the adjuvanting effects. We immunized rhesus macaques with different doses of IL-15-expressing plasmid in an influenza non-human primate immunogenicity model. We found that co-immunization of rhesus macaques with a Flu DNA-based vaccine and low doses of plasmid encoding macaque IL-15 enhanced the production of IFN-gamma (0.5 mg) and the proliferation of CD4+ and CD8+ T cells, as well as TCM levels in proliferating CD8+ T cells (0.25 mg), whereas high doses of IL-15 (4 mg) decreased the production of IFN-gamma, the proliferation of CD4+ and CD8+ T cells, and TCM levels in the proliferating CD4+ and CD8+ T cells. In addition, the hemagglutination inhibition (HI) antibody titer data suggest that, although not significantly different, there appears to be a slight increase in antibodies at lower doses of IL-15. Importantly, however, the higher doses of IL-15 decreased antibody levels significantly. This study demonstrates the importance of optimizing DNA-based cytokine adjuvants.

  17. “Protective Bystander Effects Simulated with the State-Vector Model”—HeLa x Skin Exposure to 137Cs Not Protective Bystander Response But Mammogram and Diagnostic X-Rays Are

    PubMed Central

    Leonard, Bobby E.

    2008-01-01

    The recent Dose-Response journal article “Protective Bystander Effects Simulated with the State-Vector Model” (Schollnberger and Eckl 2007) identified the suppressive (below the naturally occurring, zero-primer-dose, spontaneous level) dose response for HeLa x skin exposure to 137Cs gamma rays (Redpath et al 2001) as a protective Bystander Effect (BE) behavior. I had previously analyzed the Redpath et al (2001) data with a Microdose Model and conclusively showed that the suppressive response was from Adaptive Response (AR) radio-protection (Leonard 2005, 2007a). The significance of my microdose analysis has been that low-LET radiation-induced single (i.e. only one) charged-particle traversals through a cell can initiate a Poisson-distributed activation of AR radio-protection. The purpose of this correspondence is to clarify the distinctions between the BE and AR behaviors for the Redpath group's 137Cs data, to show, conversely, that the Redpath group data for mammography (Ko et al 2004) and diagnostic (Redpath et al 2003) X-rays do conclusively reflect protective bystander behavior, and to emphasize the need for radio-biologists to apply microdosimetry in planning and analyzing their experiments for BE and AR. Whether we are adamantly pro-LNT, adamantly anti-LNT or, like most of us, just simple scientists searching for the truth in radio-biology, it is important that we accurately identify our results, especially when related to the LNT hypothesis controversy. PMID:18846260

  18. Discriminating the precipitation phase based on different temperature thresholds in the Songhua River Basin, China

    NASA Astrophysics Data System (ADS)

    Zhong, Keyuan; Zheng, Fenli; Xu, Ximeng; Qin, Chao

    2018-06-01

    Different precipitation phases (rain, snow or sleet) differ greatly in their hydrological and erosional processes. Therefore, accurate discrimination of the precipitation phase is highly important when researching hydrologic processes and climate change at high latitudes and in mountainous regions. The objective of this study was to identify suitable temperature thresholds for discriminating the precipitation phase in the Songhua River Basin (SRB), based on 20 years of daily precipitation data collected from 60 meteorological stations located in and around the basin. Two methods, the air temperature method (AT method) and the wet bulb temperature method (WBT method), were used to discriminate the precipitation phase. Thirteen temperature thresholds were used to discriminate snowfall in the SRB: air temperatures from 0 to 5.5 °C at intervals of 0.5 °C, plus the wet bulb temperature (WBT). Three evaluation indices, the error percentage of discriminated snowfall days (Ep), the relative error of discriminated snowfall (Re) and the determination coefficient (R2), were applied to assess the discrimination accuracy. The results showed that 2.5 °C was the optimum threshold temperature for discriminating snowfall at the scale of the entire basin. Due to differences in landscape conditions at the different stations, the optimum threshold varied by station. The optimal thresholds ranged from 1.5 to 4.0 °C; 19, 17 and 18 stations had optimal thresholds of 2.5 °C, 3.0 °C and 3.5 °C respectively, together accounting for 90% of all stations. Compared with using a single basin-wide temperature threshold to discriminate snowfall, it was more accurate to use the optimum threshold at each station to estimate snowfall in the basin. In addition, snowfall was underestimated when the temperature threshold was the WBT or an air temperature below 2.5 °C, whereas snowfall was overestimated when the temperature threshold exceeded 4
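The AT-method discrimination and the Ep index reduce to a comparison against a fixed temperature threshold. A minimal sketch (function and variable names are ours, not the paper's, and sleet is ignored for simplicity):

```python
def discriminate_phase(daily_temps_c, threshold_c):
    """AT method, simplified: classify a day as snow if its temperature is at
    or below the threshold, otherwise rain."""
    return ["snow" if t <= threshold_c else "rain" for t in daily_temps_c]

def error_percentage(observed, predicted):
    """Ep-style index: percentage of days whose discriminated phase disagrees
    with the observed phase."""
    wrong = sum(o != p for o, p in zip(observed, predicted))
    return 100.0 * wrong / len(observed)
```

Sweeping `threshold_c` over the 13 candidates and minimizing `error_percentage` per station mirrors the per-station optimum-threshold search described above.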

  19. Stereotactic, Single-Dose Irradiation of Lung Tumors: A Comparison of Absolute Dose and Dose Distribution Between Pencil Beam and Monte Carlo Algorithms Based on Actual Patient CT Scans

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen Huixiao; Lohr, Frank; Fritz, Peter

    2010-11-01

    Purpose: Dose calculation based on pencil beam (PB) algorithms has its shortcomings predicting dose in tissue heterogeneities. The aim of this study was to compare dose distributions of clinically applied non-intensity-modulated radiotherapy 15-MV plans for stereotactic body radiotherapy between voxel Monte Carlo (XVMC) calculation and PB calculation for lung lesions. Methods and Materials: To validate XVMC, one treatment plan was verified in an inhomogeneous thorax phantom with EDR2 film (Eastman Kodak, Rochester, NY). Both measured and calculated (PB and XVMC) dose distributions were compared regarding profiles and isodoses. Then, 35 lung plans originally created for clinical treatment by PB calculation with the Eclipse planning system (Varian Medical Systems, Palo Alto, CA) were recalculated by XVMC (investigational implementation in PrecisePLAN [Elekta AB, Stockholm, Sweden]). Clinically relevant dose-volume parameters for target and lung tissue were compared and analyzed statistically. Results: The XVMC calculation agreed well with film measurements (<1% difference in lateral profile), whereas the deviation between PB calculation and film measurements was up to +15%. On analysis of 35 clinical cases, the mean dose, minimal dose and coverage dose value for 95% volume of gross tumor volume were 1.14 ± 1.72 Gy, 1.68 ± 1.47 Gy, and 1.24 ± 1.04 Gy lower by XVMC compared with PB, respectively (prescription dose, 30 Gy). The volume covered by the 9 Gy isodose of lung was 2.73% ± 3.12% higher when calculated by XVMC compared with PB. The largest differences were observed for small lesions circumferentially encompassed by lung tissue. Conclusions: Pencil beam dose calculation overestimates dose to the tumor and underestimates lung volumes exposed to a given dose consistently for 15-MV photons. The degree of difference between XVMC and PB is tumor size and location dependent. Therefore XVMC calculation is helpful to further optimize treatment

  20. Effects of reference analgesics and psychoactive drugs on the noxious heat threshold of mice measured by an increasing-temperature water bath.

    PubMed

    Boros, Melinda; Benkó, Rita; Bölcskei, Kata; Szolcsányi, János; Barthó, Loránd; Pethő, Gábor

    2013-12-01

    The study aimed to validate an increasing-temperature water bath for determining the noxious heat threshold in mice. The noxious heat threshold was determined by immersing the tail of the gently held awake mouse into a water container whose temperature was near-linearly increased at a rate of 24°C/min until the animal withdrew its tail, that is, until heating attained the noxious threshold. The effects of standard analgesic, neuroleptic and anxiolytic drugs were investigated in parallel on both the noxious heat threshold and psychomotor activity assessed by the open field test. Morphine, diclofenac and metamizol (dipyrone) elevated the heat threshold of mice with minimum effective doses of 6, 30 and 1000 mg/kg i.p., respectively. These doses of morphine and diclofenac failed to induce any remarkable effect on psychomotor activity in the open field test, while that of metamizol exerted a profound inhibition. The anxiolytic diazepam and the neuroleptic droperidol, at doses evoking mild and moderate psychomotor inhibition, respectively, failed to alter the heat threshold. Combination of a dose of morphine subliminal for both antinociceptive and psychomotor inhibitory action with diclofenac, metamizol, diazepam or droperidol, at doses also subliminal for the thermal antinociceptive effect, elevated the noxious heat threshold without major additional effects in the open field test. It is concluded that the increasing-temperature water bath is suitable for studying the thermal antinociceptive effects of morphine and diclofenac as well as the morphine-sparing action of diclofenac, metamizol, droperidol and diazepam. Behavioural testing is recommended when testing analgesics. © 2013 Nordic Pharmacological Society. Published by John Wiley & Sons Ltd.

  1. Quantile-based permutation thresholds for quantitative trait loci hotspots.

    PubMed

    Neto, Elias Chaibub; Keller, Mark P; Broman, Andrew F; Attie, Alan D; Jansen, Ritsert C; Broman, Karl W; Yandell, Brian S

    2012-08-01

    Quantitative trait loci (QTL) hotspots (genomic locations affecting many traits) are a common feature in genetical genomics studies and are biologically interesting since they may harbor critical regulators. Therefore, statistical procedures to assess the significance of hotspots are of key importance. One approach, randomly allocating observed QTL across the genomic locations separately by trait, implicitly assumes all traits are uncorrelated. Recently, an empirical test for QTL hotspots was proposed on the basis of the number of traits that exceed a predetermined LOD value, such as the standard permutation LOD threshold. The permutation null distribution of the maximum number of traits across all genomic locations preserves the correlation structure among the phenotypes, avoiding the detection of spurious hotspots due to nongenetic correlation induced by uncontrolled environmental factors and unmeasured variables. However, by considering only the number of traits above a threshold, without accounting for the magnitude of the LOD scores, relevant information is lost. In particular, biologically interesting hotspots composed of a moderate to small number of traits with strong LOD scores may be neglected as nonsignificant. In this article we propose a quantile-based permutation approach that simultaneously accounts for the number and the LOD scores of traits within the hotspots. By considering a sliding scale of mapping thresholds, our method can assess the statistical significance of both small and large hotspots. Although the proposed approach can be applied to any type of heritable high-volume "omic" data set, we restrict our attention to expression (e)QTL analysis. We assess and compare the performances of these three methods in simulations and we illustrate how our approach can effectively assess the significance of moderate and small hotspots with strong LOD scores in a yeast expression data set.
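
    The permutation idea behind such hotspot thresholds can be illustrated with a toy eQTL matrix. The sketch below is a simplified stand-in: it permutes each trait's LOD scores independently across loci (the published test permutes at the genotype level to preserve trait correlations) and uses a single count threshold rather than the full sliding scale of mapping thresholds:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical LOD-score matrix: rows = expression traits, columns = genomic
# positions. Purely synthetic; not data from the study.
n_traits, n_loci = 200, 50
lod = rng.exponential(scale=1.0, size=(n_traits, n_loci))
lod[:30, 10] += 4.0  # plant a hotspot: 30 traits with strong LOD at locus 10

lod_threshold = 3.0

# Observed hotspot size at each locus: number of traits exceeding the
# mapping threshold.
hotspot_size = (lod > lod_threshold).sum(axis=0)

# Null distribution of the genome-wide maximum hotspot size, from
# permuting each trait's scores across loci (simplified stand-in).
n_perm = 200
null_max = np.empty(n_perm, dtype=int)
for p in range(n_perm):
    perm = np.array([rng.permutation(row) for row in lod])
    null_max[p] = (perm > lod_threshold).sum(axis=0).max()

# 95% quantile of the null: hotspots larger than this are called significant.
size_threshold = np.quantile(null_max, 0.95)
print(hotspot_size.max() > size_threshold)  # → True (the planted hotspot)
```

    The quantile-based refinement in the article additionally varies `lod_threshold` over a sliding scale, so that small hotspots of traits with very strong LOD scores can also reach significance.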

  2. The value of fixed rasburicase dosing versus weight-based dosing in the treatment and prevention of tumor lysis syndrome.

    PubMed

    Boutin, Alyssa; Blackman, Alison; O'Sullivan, David M; Forcello, Nicholas

    2018-01-01

    Background Rasburicase is a recombinant urate oxidase enzyme used for the treatment and prevention of tumor lysis syndrome. Our objective was to assess the efficacy of indication-based, low-dose rasburicase administration compared to the Food and Drug Administration-approved weight-based dosing. Methods This was a retrospective cohort study utilizing data from a tertiary medical center including patients admitted from 2012 to 2016 who received at least one dose of rasburicase. The primary outcome was achieving a uric acid level less than 7.5 mg/dl after a single dose of rasburicase in the preprotocol (Food and Drug Administration-approved weight-based dosing) and postprotocol (indication-based, low-dose) groups. Secondary outcomes included the change in uric acid levels between the pre- and postprotocol groups, adherence to the new institutional protocol, need for repeat rasburicase doses, and a cost analysis. Results Sixty-four patients received at least one dose of rasburicase between 1 January 2012 and 1 December 2016. Twenty-seven (79.4%) doses in the preprotocol group and 28 (82.4%) doses in the postprotocol group successfully achieved a uric acid level less than 7.5 mg/dl after a single dose of rasburicase (p=1.000). The average total monthly cost of rasburicase was reduced by 59.9% after adoption of the new protocol. Conclusions Indication-based, low-dose rasburicase provided greater value than weight-based dosing, achieving substantial cost savings without compromising clinical efficacy.

  3. Intensity-modulated radiotherapy for locally advanced non-small-cell lung cancer: a dose-escalation planning study.

    PubMed

    Lievens, Yolande; Nulens, An; Gaber, Mousa Amr; Defraene, Gilles; De Wever, Walter; Stroobants, Sigrid; Van den Heuvel, Frank

    2011-05-01

    To evaluate the potential for dose escalation with intensity-modulated radiotherapy (IMRT) in positron emission tomography-based radiotherapy planning for locally advanced non-small-cell lung cancer (LA-NSCLC). For 35 LA-NSCLC patients, three-dimensional conformal radiotherapy and IMRT plans were made to a prescription dose (PD) of 66 Gy in 2-Gy fractions. Dose escalation was performed toward the maximal PD using secondary endpoint constraints for the lung, spinal cord, and heart, with de-escalation according to defined esophageal tolerance. Dose calculation was performed using the Eclipse pencil beam algorithm, and all plans were recalculated using a collapsed cone algorithm. The normal tissue complication probabilities (NTCPs) were calculated for the lung (Grade 2 pneumonitis) and esophagus (acute toxicity, Grade 2 or greater, and late toxicity). IMRT resulted in statistically significant decreases in the mean lung (p < .0001) and maximal spinal cord (p = .002 and p = .0005) doses, allowing an average increase in the PD of 8.6-14.2 Gy (p ≤ .0001). This advantage was lost after de-escalation within the defined esophageal dose limits. The lung NTCPs were significantly lower for IMRT (p < .0001), even after dose escalation. For esophageal toxicity, IMRT significantly decreased the acute NTCP values at the low dose levels (p = .0009 and p < .0001). After maximal dose escalation, late esophageal tolerance became critical (p < .0001), especially when using IMRT, owing to the parallel increases in the esophageal dose and PD. In LA-NSCLC, IMRT offers the potential to significantly escalate the PD, dependent on the lung and spinal cord tolerance. However, parallel increases in the esophageal dose abolished this advantage, even when using collapsed cone algorithms. This is important to consider in the context of concomitant chemoradiotherapy schedules using IMRT. Copyright © 2011 Elsevier Inc. All rights reserved.

  4. Absolute auditory threshold: testing the absolute.

    PubMed

    Heil, Peter; Matysiak, Artur

    2017-11-02

    The mechanisms underlying the detection of sounds in quiet, one of the simplest tasks for auditory systems, are debated. Several models proposed to explain the threshold for sounds in quiet and its dependence on sound parameters include a minimum sound intensity ('hard threshold'), below which sound has no effect on the ear. Also, many models are based on the assumption that threshold is mediated by integration of a neural response proportional to sound intensity. Here, we test these ideas. Using an adaptive forced choice procedure, we obtained thresholds of 95 normal-hearing human ears for 18 tones (3.125 kHz carrier) in quiet, each with a different temporal amplitude envelope. Grand-mean thresholds and standard deviations were well described by a probabilistic model according to which sensory events are generated by a Poisson point process with a low rate in the absence, and higher, time-varying rates in the presence, of stimulation. The subject actively evaluates the process and bases the decision on the number of events observed. The sound-driven rate of events is proportional to the temporal amplitude envelope of the bandpass-filtered sound raised to an exponent. We find no evidence for a hard threshold: When the model is extended to include such a threshold, the fit does not improve. Furthermore, we find an exponent of 3, consistent with our previous studies and further challenging models that are based on the assumption of the integration of a neural response that, at threshold sound levels, is directly proportional to sound amplitude or intensity. © 2017 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.
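
    The Poisson-counting model sketched in this abstract can be written down compactly. All parameter values below (spontaneous rate, gain, criterion) are invented for illustration; only the structure, a Poisson count whose sound-driven rate follows the amplitude envelope raised to an exponent of 3, follows the abstract:

```python
import math

def detection_probability(envelope, dt, gain=1.0, spont_rate=0.5,
                          criterion=3, exponent=3):
    """Probability that the total Poisson event count reaches the decision
    criterion, with the sound-driven event rate proportional to the
    amplitude envelope raised to `exponent` (illustrative parameters)."""
    # Expected number of sensory events over the stimulus duration
    lam = sum((spont_rate + gain * a ** exponent) * dt for a in envelope)
    # P(N >= criterion) for N ~ Poisson(lam)
    p_below = sum(math.exp(-lam) * lam ** k / math.factorial(k)
                  for k in range(criterion))
    return 1.0 - p_below

# A louder tone (larger envelope) yields a higher detection probability;
# note there is no hard threshold: even a faint tone raises the rate a little.
quiet = [0.2] * 100
loud = [0.6] * 100
p_quiet = detection_probability(quiet, dt=0.01)
p_loud = detection_probability(loud, dt=0.01)
print(p_loud > p_quiet)  # → True
```

    The exponent of 3 is the value the authors report; the absence of any cutoff on the envelope is what distinguishes this model from a "hard threshold" account.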

  5. Thresholds of Auditory-Motor Coupling Measured with a Simple Task in Musicians and Non-Musicians: Was the Sound Simultaneous to the Key Press?

    PubMed Central

    van Vugt, Floris T.; Tillmann, Barbara

    2014-01-01

    The human brain is able to predict the sensory effects of its actions. But how precise are these predictions? The present research proposes a tool to measure thresholds between a simple action (keystroke) and a resulting sound. On each trial, participants were required to press a key. Upon each keystroke, a woodblock sound was presented. In some trials, the sound came immediately with the downward keystroke; at other times, it was delayed by a varying amount of time. Participants were asked to verbally report whether the sound came immediately or was delayed. Participants' delay detection thresholds (in msec) were measured with a staircase-like procedure. We hypothesised that musicians would have a lower threshold than non-musicians. Comparing pianists and brass players, we furthermore hypothesised that, as a result of a sharper attack of the timbre of their instrument, pianists might have lower thresholds than brass players. Our results show that non-musicians exhibited higher thresholds for delay detection (180±104 ms) than the two groups of musicians (102±65 ms), but there were no differences between pianists and brass players. The variance in delay detection thresholds could be explained by variance in sensorimotor synchronisation capacities as well as variance in a purely auditory temporal irregularity detection measure. This suggests that the brain's capacity to generate temporal predictions of sensory consequences can be decomposed into general temporal prediction capacities together with auditory-motor coupling. These findings indicate that the brain has a relatively large window of integration within which an action and its resulting effect are judged as simultaneous. Furthermore, musical expertise may narrow this window down, potentially due to a more refined temporal prediction. This novel paradigm provides a simple test to estimate the temporal precision of auditory-motor action-effect coupling, and the paradigm can readily be incorporated in studies

  6. EEG-based functional networks evoked by acupuncture at ST 36: A data-driven thresholding study

    NASA Astrophysics Data System (ADS)

    Li, Huiyan; Wang, Jiang; Yi, Guosheng; Deng, Bin; Zhou, Hexi

    2017-10-01

    This paper investigates how acupuncture at ST 36 modulates the brain functional network. 20-channel EEG signals from 15 healthy subjects are recorded before, during and after acupuncture. The correlation between two EEG channels is calculated using Pearson’s correlation coefficient. A data-driven approach is applied to determine the threshold, which is performed by considering the connected set, connected edge and network connectivity. Based on this thresholding approach, the functional network in each acupuncture period is built with graph theory, and the associated functional connectivity is determined. We show that acupuncture at ST 36 increases the connectivity of the EEG-based functional network, especially for long-distance connections between the two hemispheres. The properties of the functional network in five EEG sub-bands are also characterized. It is found that the delta and gamma bands are affected more obviously by acupuncture than the other sub-bands. These findings highlight the modulatory effects of acupuncture on EEG-based functional connectivity and help clarify how acupuncture participates in cortical and subcortical activities. Further, the data-driven threshold provides an alternative approach for inferring functional connectivity under other physiological conditions.
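
    A data-driven connectivity threshold of this kind can be sketched as follows, using synthetic signals in place of the recorded EEG; the specific rule used here (the largest correlation cutoff at which the graph remains one connected component) is an illustrative assumption, not the authors' exact criterion:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for 20-channel EEG: correlated signals built from a few
# shared sources. The channel count matches the abstract; the data do not.
n_channels, n_samples = 20, 1000
sources = rng.standard_normal((3, n_samples))
mixing = rng.standard_normal((n_channels, 3))
eeg = mixing @ sources + 0.5 * rng.standard_normal((n_channels, n_samples))

# Pairwise Pearson correlations between channels (absolute value)
corr = np.abs(np.corrcoef(eeg))
np.fill_diagonal(corr, 0.0)

def is_connected(adj):
    """Depth-first search: does the graph form one connected set?"""
    seen, stack = {0}, [0]
    while stack:
        node = stack.pop()
        for nxt in np.flatnonzero(adj[node]):
            if nxt not in seen:
                seen.add(int(nxt))
                stack.append(int(nxt))
    return len(seen) == adj.shape[0]

# Data-driven threshold: the largest cutoff at which the network
# still forms a single connected component.
threshold = 0.0
for t in np.arange(0.05, 1.0, 0.05):
    if is_connected(corr > t):
        threshold = t
    else:
        break
print(float(threshold))
```

    Edges with correlation above `threshold` then define the functional network whose graph-theoretic properties are compared across acupuncture periods.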

  7. Wafer plane inspection with soft resist thresholding

    NASA Astrophysics Data System (ADS)

    Hess, Carl; Shi, Rui-fang; Wihl, Mark; Xiong, Yalin; Pang, Song

    2008-10-01

    Wafer Plane Inspection (WPI) is an inspection mode on the KLA-Tencor TeraScan™ platform that uses the high signal-to-noise ratio images from the high numerical aperture microscope and then models the entire lithographic process to enable defect detection on the wafer plane[1]. This technology meets the needs of some advanced mask manufacturers to identify the lithographically-significant defects while ignoring other non-lithographically-significant defects. WPI accomplishes this goal by performing defect detection based on a modeled image of how the mask features would actually print in the photoresist. There are several advantages to this approach: (1) the high fidelity of the images provides a sensitivity advantage over competing approaches; (2) the ability to perform defect detection on the wafer plane allows one to see only those defects that have a printing impact on the wafer; (3) the use of modeling in the lithographic portion of the flow enables unprecedented flexibility to support arbitrary illumination profiles, process-window inspection in unit time, and combination modes that find both printing and non-printing defects. WPI is proving to be a valuable addition to the KLA-Tencor detection algorithm suite. The modeling portion of WPI uses a single resist threshold as the final step in the processing. This has been shown to be adequate on several advanced customer layers, but is not ideal for all layers. Actual resist chemistry involves complicated processes, including acid and base diffusion and quench, that are not consistently well modeled with a single resist threshold. We have considered the use of an advanced resist model for WPI, but rejected it because the burdensome requirements for the calibration of the model were not practical for reticle inspection. This paper describes an alternative approach that allows a "soft" resist threshold to be applied, providing a more robust solution for the most challenging processes. This approach is just

  8. Unidirectional threshold switching in Ag/Si-based electrochemical metallization cells for high-density bipolar RRAM applications

    NASA Astrophysics Data System (ADS)

    Wang, Chao; Song, Bing; Li, Qingjiang; Zeng, Zhongming

    2018-03-01

    We herein present a novel unidirectional threshold selector for cross-point bipolar RRAM arrays. The proposed Ag/amorphous-Si-based threshold selector showed excellent threshold characteristics in positive field, such as high selectivity (~10^5), steep slope (< 5 mV/decade) and low off-state current (< 300 pA). Meanwhile, the selector exhibited rectifying characteristics in the high resistance state as well, with a rectification ratio as high as 10^3 at ±1.5 V. Nevertheless, owing to the high reverse current of about 9 mA at -3 V, this unidirectional threshold selector can be used as a selection element for bipolar-type RRAM. By integrating a bipolar RRAM device with the selector, experiments showed that the undesired sneak current was significantly suppressed, indicating its potential for high-density integrated nonvolatile memory applications.

  9. Investigation of the Effects of Biodiesel-based Na on Emissions Control Components

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brookshear, D. William; Nguyen, Ke; Toops, Todd J

    2012-01-01

    A single-cylinder diesel engine was used to investigate the impact of biodiesel-based Na on emissions control components using specially blended 20% biodiesel fuel (B20). The emissions control components investigated were a diesel oxidation catalyst (DOC), a Cu-zeolite-based NH{sub 3}-SCR (selective catalytic reduction) catalyst, and a diesel particulate filter (DPF). Both light-duty vehicle, DOC-SCR-DPF, and heavy-duty vehicle, DOC-DPF-SCR, emissions control configurations were employed. The accelerated Na aging is achieved by introducing elevated Na levels in the fuel, to represent full useful life exposure, and periodically increasing the exhaust temperature to replicate DPF regeneration. To assess the validity of the implemented accelerated Na aging protocol, engine-aged lean NO{sub x} traps (LNTs), DOCs and DPFs are also evaluated. To fully characterize the impact on the catalytic activity, the LNT, DOC and SCR catalysts were evaluated using a bench flow reactor. The evaluation of the aged DOC samples and LNT shows little to no deactivation as a result of Na contamination. However, the SCR in the light-duty configuration (DOC-SCR-DPF) was severely affected by Na contamination, especially when NO was the only fed NO{sub x} source. In the heavy-duty configuration (DOC-DPF-SCR), no impact is observed in the SCR NO{sub x} reduction activity. Electron probe micro-analysis (EPMA) reveals that Na contamination on the LNT, DOC, and SCR samples is present throughout the length of the catalysts with a higher concentration on the washcoat surface. In both the long-term engine-aged DPF and the accelerated Na-aged DPFs, there is significant Na ash present in the upstream channels; however, in the engine-aged sample lube oil-based ash is the predominant constituent.

  10. Evaluating an Action Threshold-Based Insecticide Program on Onion Cultivars Varying in Resistance to Onion Thrips (Thysanoptera: Thripidae).

    PubMed

    Nault, Brian A; Huseth, Anders S

    2016-08-01

    Onion thrips, Thrips tabaci Lindeman (Thysanoptera: Thripidae), is a highly destructive pest of onion, Allium cepa L., and its management relies on multiple applications of foliar insecticides. Development of insecticide resistance is common in T. tabaci populations, and new strategies are needed to relax existing levels of insecticide use, but still provide protection against T. tabaci without compromising marketable onion yield. An action threshold-based insecticide program combined with or without a thrips-resistant onion cultivar was investigated as an improved approach for managing T. tabaci infestations in commercial onion fields. Regardless of cultivar type, the average number of insecticide applications needed to manage T. tabaci infestations in the action-threshold based program was 4.3, while the average number of sprays in the standard weekly program was 7.2 (a 40% reduction). The mean percent reduction in numbers of applications following the action threshold treatment in the thrips-resistant onion cultivar, 'Advantage', was 46.7% (range 40-50%) compared with the standard program, whereas the percentage reduction in applications in action threshold treatments in the thrips-susceptible onion cultivar, 'Santana', was 34.3% (range 13-50%) compared with the standard program, suggesting a benefit of the thrips-resistant cultivar. Marketable bulb yields for both 'Advantage' and 'Santana' in the action threshold-based program were nearly identical to those in the standard program, indicating that commercially acceptable bulb yields will be generated with fewer insecticide sprays following an action threshold-based program, saving money, time and benefiting the environment. © The Authors 2016. Published by Oxford University Press on behalf of Entomological Society of America. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  11. In vivo TLD dose measurements in catheter-based high-dose-rate brachytherapy.

    PubMed

    Adlienė, Diana; Jakštas, Karolis; Urbonavičius, Benas Gabrielis

    2015-07-01

    Routine in vivo dosimetry is well established in external beam radiotherapy; however, it is restricted mainly to detection of gross errors in high-dose-rate (HDR) brachytherapy due to complicated measurements in the field of steep dose gradients in the vicinity of radioactive source and high uncertainties. The results of in vivo dose measurements using TLD 100 mini rods and TLD 'pin worms' in catheter-based HDR brachytherapy are provided in this paper alongside with their comparison with corresponding dose values obtained using calculation algorithm of the treatment planning system. Possibility to perform independent verification of treatment delivery in HDR brachytherapy using TLDs is discussed. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  12. Genetic Control of the Trigger for the G2/M Checkpoint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hall, Eric J.; Smilenov, Lubomir B.; Young, Erik F.

    The work undertaken in this project addressed two seminal areas of low dose radiation biology that are poorly understood and controversial. These areas are the challenge to the linear-no-threshold (LNT) paradigm at low doses of radiation and the fundamental elements of radiation bystander effect biology. Genetic contributions to low dose checkpoint engagement: The LNT paradigm is an extrapolation of known, measured cancer induction endpoints. Importantly, data for lower doses are often not available. Debatably, radiation protection standards have been introduced which are prudently contingent on the adherence of cancer risk to the established trend seen at higher doses. Intriguing findings from other labs have hinted at separate DNA damage response programs that engage at low or high levels of radiation. Individual radiation sensitivity commensurate with hemizygosity for a radiation sensitivity gene has been estimated at 1-2% in the U.S. Careful interrogation of the DNA damage response at low doses of radiation became important and served as the basis for this grant. Several genes were tested in combinations to determine if combined haploinsufficiency for multiple radiosensitizing genes could render a cell more sensitive to lower levels of acute radiation exposure. We measured a classical radiation response endpoint, cell cycle arrest prior to mitosis. Mouse embryo fibroblasts were used and provided a uniform, rapidly dividing and genetically manipulable population of study. Our system did not report checkpoint engagement at acute doses of gamma rays below 100 mGy. The system did report checkpoint engagement reproducibly at 500 mGy, establishing a threshold for activation between 100 and 500 mGy. Engagement of the checkpoint was ablated in cells nullizygous for ATM but was otherwise unperturbed in cells combinatorially haploinsufficient for ATM and Rad9, ATM and PTEN or PTEN and Rad9. Taken together, these experiments tell us that, in a sensitive fibroblast

  13. Comparison of eye-lens doses imparted during interventional and non-interventional neuroimaging techniques for assessment of intracranial aneurysms.

    PubMed

    Guberina, N; Dietrich, U; Forsting, M; Ringelstein, A

    2018-02-01

    A neurointerventional examination of intracranial aneurysms often involves the eye lens in the primary beam of radiation. To assess and compare eye-lens doses imparted during interventional and non-interventional imaging techniques for the examination of intracranial aneurysms. We performed a phantom study on an anthropomorphic phantom (ATOM dosimetry phantom 702-D; CIRS, Norfolk, Virginia, USA) and assessed eye-lens doses with thermoluminescent dosimeters (TLDs) type 100 (LiF:Mg, Ti) during (1) interventional (depiction of all cerebral arteries with triple 3D-rotational angiography and twice 2-plane DSA anteroposterior and lateral projections) and (2) non-interventional (CT angiography (CTA)) diagnosis of intracranial aneurysms. Eye-lens doses were calculated following recommendations of the ICRP 103. Image quality was analysed in retrospective by two experienced radiologists on the basis of non-interventional and interventional pan-angiography examinations of patients with incidental aneurysms (n=50) on a five-point Likert scale. The following eye-lens doses were assessed: (1) interventional setting (triple 3D-rotational angiography and twice 2-plane DSA anteroposterior and lateral projections) 12 mGy; (2) non-interventional setting (CTA) 4.1 mGy. Image quality for depiction of intracranial aneurysms (>3 mm) was evaluated as good by both readers for both imaging techniques. Eye-lens doses are markedly higher during the interventional than during the non-interventional diagnosis of intracranial aneurysms. For the eye-lens dose, CTA offers considerable radiation dose savings in the diagnosis of intracranial aneurysms. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  14. The Threshold Level--For Schools?

    ERIC Educational Resources Information Center

    Lauerbach, Gerda

    1979-01-01

    Comments on the document "Threshold Level for Modern Language Learning Schools" (J. A. Van Ek, Strasbourg, 1976) and its appropriateness as a description of learning goals for the first years of foreign language teaching. Criticizes particularly the "reduced learning" concept, on which the threshold projects are based. (IFS/WGA)

  15. Adaptive thresholding image series from fluorescence confocal scanning laser microscope using orientation intensity profiles

    NASA Astrophysics Data System (ADS)

    Feng, Judy J.; Ip, Horace H.; Cheng, Shuk H.

    2004-05-01

    Many grey-level thresholding methods based on the histogram or other statistical information about the image of interest, such as maximum entropy, have been proposed in the past. However, most methods based on statistical analysis of the images pay little attention to the morphology of the objects of interest, which can provide important indications for finding the optimum threshold, especially for organisms with distinctive textural morphologies such as vasculature or neural networks in medical imaging. In this paper, we propose a novel method for thresholding fluorescent vasculature image series recorded with a Confocal Scanning Laser Microscope. After extracting the basic orientation of the vessel slices inside a sub-region partitioned from the images, we analyse the intensity profiles perpendicular to the vessel orientation to obtain a reasonable initial threshold for each region. The threshold values of the regions neighbouring the region of interest, in both the x-y and optical directions, are then referenced to obtain the final thresholds of the region, which makes the whole stack of images appear more continuous. The resulting images are characterized by suppression of both noise and non-interest tissues conglutinated to vessels, together with improved vessel connectivity and edge definition. The value of the method for thresholding the fluorescence images of biological objects is demonstrated by a comparison of the results of 3D vascular reconstruction.
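
    The profile-based initial threshold can be illustrated on a single synthetic vessel cross-section; the half-way-between-background-and-peak rule below is an assumption for illustration, not the authors' exact criterion:

```python
import numpy as np

# Synthetic intensity profile taken perpendicular to the vessel orientation:
# a Gaussian-shaped bright vessel on a noisy dim background (illustrative).
x = np.linspace(-10, 10, 81)
rng = np.random.default_rng(3)
profile = 40 + 160 * np.exp(-x**2 / 4) + rng.normal(0, 2, x.size)

background = np.median(profile)   # most samples lie off the vessel
peak = profile.max()              # vessel centre intensity
# Illustrative rule: initial threshold halfway between background and peak
threshold = background + 0.5 * (peak - background)

inside = profile > threshold      # samples classified as vessel
print(inside.sum())
```

    Repeating this per sub-region, then reconciling each region's value with its neighbours in the x-y and optical directions, yields the smoothly varying thresholds described above.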

  16. Thermal bistability-based method for real-time optimization of ultralow-threshold whispering gallery mode microlasers.

    PubMed

    Lin, Guoping; Candela, Y; Tillement, O; Cai, Zhiping; Lefèvre-Seguin, V; Hare, J

    2012-12-15

    A method based on thermal bistability for ultralow-threshold microlaser optimization is demonstrated. When sweeping the pump laser frequency across a pump resonance, the dynamic thermal bistability slows down the power variation. The resulting line shape modification enables a real-time monitoring of the laser characteristic. We demonstrate this method for a functionalized microsphere exhibiting a submicrowatt laser threshold. This approach is confirmed by comparing the results with a step-by-step recording in quasi-static thermal conditions.

  17. A Pilot Study of the Snap & Sniff Threshold Test.

    PubMed

    Jiang, Rong-San; Liang, Kai-Li

    2018-05-01

    The Snap & Sniff® Threshold Test (S&S) has been recently developed to determine the olfactory threshold. The aim of this study was to further evaluate the validity and test-retest reliability of the S&S. The olfactory thresholds of 120 participants were determined using both the Smell Threshold Test (STT) and the S&S. The participants included 30 normosmic volunteers and 90 patients (60 hyposmic, 30 anosmic). The normosmic participants were retested using the STT and S&S at an intertest interval of at least 1 day. The mean olfactory threshold determined with the S&S was -6.76 for the normosmic participants, -3.79 for the hyposmic patients, and -2 for the anosmic patients. The olfactory thresholds were significantly different across the 3 groups (P < .001). Snap & Sniff-based and STT-based olfactory thresholds were correlated weakly in the normosmic group (correlation coefficient = 0.162, P = .391) but more strongly correlated in the patient groups (hyposmic: correlation coefficient = 0.376, P = .003; anosmic: correlation coefficient = 1.0). The test-retest correlation for the S&S-based olfactory thresholds was 0.384 (P = .036). Based on validity and test-retest reliability, we concluded that the S&S is a proper test for olfactory thresholds.

  18. Threshold-based segmentation of fluorescent and chromogenic images of microglia, astrocytes and oligodendrocytes in FIJI.

    PubMed

    Healy, Sinead; McMahon, Jill; Owens, Peter; Dockery, Peter; FitzGerald, Una

    2018-02-01

Image segmentation is often imperfect, particularly in complex image sets such as z-stack micrographs of slice cultures, and there is a need for sufficient details of parameters used in quantitative image analysis to allow independent repeatability and appraisal. For the first time, we have critically evaluated, quantified and validated the performance of different segmentation methodologies using z-stack images of ex vivo glial cells. The BioVoxxel toolbox plugin, available in FIJI, was used to measure the relative quality, accuracy, specificity and sensitivity of 16 global and 9 local automatic thresholding algorithms. Automatic thresholding yields improved binary representation of glial cells compared with the conventional user-chosen single-threshold approach for confocal z-stacks acquired from ex vivo slice cultures. The performance of threshold algorithms varies considerably in quality, specificity, accuracy and sensitivity, with entropy-based thresholds scoring highest for fluorescent staining. We have used the BioVoxxel toolbox to correctly and consistently select the best automated threshold algorithm to segment z-projected images of ex vivo glial cells for downstream digital image analysis and to define segmentation quality. The automated OLIG2 cell count was validated using stereology. As image segmentation and feature extraction can quite critically affect the performance of successive steps in the image analysis workflow, it is becoming increasingly necessary to consider the quality of digital segmentation methodologies. Here, we have applied, validated and extended an existing performance-check methodology in the BioVoxxel toolbox to z-projected images of ex vivo glial cells. Copyright © 2017 Elsevier B.V. All rights reserved.
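The global automatic thresholding approaches evaluated in records like this one, such as Otsu's between-class-variance criterion and entropy-based criteria in the spirit of Kapur's method, can be sketched in plain NumPy. This is an illustrative re-implementation, not the BioVoxxel toolbox code, and the synthetic "micrograph" is invented for the demo:

```python
import numpy as np

def otsu_threshold(img, nbins=256):
    """Global threshold maximizing between-class variance (Otsu's method)."""
    hist, edges = np.histogram(img, bins=nbins)
    p = hist / hist.sum()
    centers = 0.5 * (edges[:-1] + edges[1:])
    w0 = np.cumsum(p)                 # cumulative weight of the "background" class
    m = np.cumsum(p * centers)        # cumulative mean
    valid = (w0 > 0) & (w0 < 1)
    sigma_b = np.zeros(nbins)
    sigma_b[valid] = (m[-1] * w0[valid] - m[valid]) ** 2 / (w0[valid] * (1 - w0[valid]))
    return centers[np.argmax(sigma_b)]

def max_entropy_threshold(img, nbins=256):
    """Global threshold maximizing the summed Shannon entropies of the two classes."""
    hist, edges = np.histogram(img, bins=nbins)
    p = hist / hist.sum()
    centers = 0.5 * (edges[:-1] + edges[1:])
    cum = np.cumsum(p)
    best_t, best_h = centers[0], -np.inf
    for k in range(1, nbins - 1):
        w0, w1 = cum[k - 1], 1.0 - cum[k - 1]
        if w0 <= 0 or w1 <= 0:
            continue
        q0 = p[:k][p[:k] > 0] / w0    # normalized class-0 bin probabilities
        q1 = p[k:][p[k:] > 0] / w1    # normalized class-1 bin probabilities
        h = -np.sum(q0 * np.log(q0)) - np.sum(q1 * np.log(q1))
        if h > best_h:
            best_h, best_t = h, centers[k]
    return best_t

# Synthetic micrograph: dim noisy background with one bright cell-like patch
rng = np.random.default_rng(1)
img = rng.normal(40, 6, size=(128, 128))
img[48:80, 48:80] += 120

for name, fn in [("otsu", otsu_threshold), ("max-entropy", max_entropy_threshold)]:
    t = fn(img)
    print(f"{name}: threshold = {t:.1f}, foreground fraction = {(img > t).mean():.3f}")
```

On this strongly bimodal synthetic image both criteria isolate the bright patch; on real z-projected micrographs the algorithms can diverge substantially, which is what the BioVoxxel performance check quantifies.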

  19. Molecular Signaling Network Motifs Provide a Mechanistic Basis for Cellular Threshold Responses

    PubMed Central

    Bhattacharya, Sudin; Conolly, Rory B.; Clewell, Harvey J.; Kaminski, Norbert E.; Andersen, Melvin E.

    2014-01-01

Background: Increasingly, there is a move toward using in vitro toxicity testing to assess human health risk due to chemical exposure. As with in vivo toxicity testing, an important question for in vitro results is whether there are thresholds for adverse cellular responses. Empirical evaluations may show consistency with thresholds, but the main evidence has to come from mechanistic considerations. Objectives: Cellular response behaviors depend on the molecular pathway and circuitry in the cell and the manner in which chemicals perturb these circuits. Understanding circuit structures that are inherently capable of resisting small perturbations and producing threshold responses is an important step towards mechanistically interpreting in vitro testing data. Methods: Here we have examined dose–response characteristics for several biochemical network motifs. These network motifs are basic building blocks of molecular circuits underpinning a variety of cellular functions, including adaptation, homeostasis, proliferation, differentiation, and apoptosis. For each motif, we present biological examples and models to illustrate how thresholds arise from specific network structures. Discussion and Conclusion: Integral feedback, feedforward, and transcritical bifurcation motifs can generate thresholds. Other motifs (e.g., proportional feedback and ultrasensitivity) produce responses where the slope in the low-dose region is small and stays close to the baseline. Feedforward control may lead to nonmonotonic or hormetic responses. We conclude that network motifs provide a basis for understanding thresholds for cellular responses. Computational pathway modeling of these motifs and their combinations occurring in molecular signaling networks will be a key element in new risk assessment approaches based on in vitro cellular assays. Citation: Zhang Q, Bhattacharya S, Conolly RB, Clewell HJ III, Kaminski NE, Andersen ME. 2014. Molecular signaling network motifs provide a
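The claim that an integral-feedback motif can generate a genuine threshold can be illustrated with a toy simulation: a controller integrates the deviation of the output from its set point and compensates for the stressor until its capacity saturates. All parameters and the saturation mechanism below are invented for illustration and are not taken from the paper:

```python
def steady_output(dose, y_set=1.0, gamma=1.0, u_max=2.0, k=0.5,
                  dt=0.01, t_end=60.0):
    """Toy integral-feedback motif, integrated with the Euler method.
    Plant:      dy/dt = u - dose - gamma*y
    Controller: du/dt = k*(y_set - y), with u clamped to [0, u_max]
    The output holds at y_set until the dose exhausts the controller's
    capacity, giving a threshold at dose = u_max - gamma*y_set."""
    y, u = y_set, gamma * y_set          # start at the unstressed steady state
    for _ in range(int(t_end / dt)):
        u = min(max(u + k * (y_set - y) * dt, 0.0), u_max)
        y += (u - dose - gamma * y) * dt
    return y

for dose in (0.2, 0.8, 1.5):
    print(f"dose = {dose}: steady output ~ {steady_output(dose):.3f}")
# With these parameters the output is held at the set point for doses
# below 1.0; above that, the controller saturates and the output falls.
```

With u_max = 2, gamma = 1 and y_set = 1, doses of 0.2 and 0.8 leave the output at 1.0, while a dose of 1.5 drives it down to about 0.5: a flat, then declining, dose response.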

  20. A threshold-based cloud mask for the high-resolution visible channel of Meteosat Second Generation SEVIRI

    NASA Astrophysics Data System (ADS)

    Bley, S.; Deneke, H.

    2013-10-01

A threshold-based cloud mask for the high-resolution visible (HRV) channel (1 × 1 km2) of the Meteosat SEVIRI (Spinning Enhanced Visible and Infrared Imager) instrument is introduced and evaluated. It is based on the operational EUMETSAT cloud mask for the low-resolution channels of SEVIRI (3 × 3 km2), which is used for the selection of suitable thresholds to ensure consistency with its results. The aim of using the HRV channel is to resolve small-scale cloud structures that cannot be detected by the low-resolution channels. We find that it is advantageous to apply thresholds relative to clear-sky reflectance composites and to adapt the thresholds regionally. Furthermore, the accuracy of the different spectral channels for thresholding and the suitability of the HRV channel for cloud detection are investigated. The case studies show different situations to demonstrate the behavior for various surface and cloud conditions. Overall, between 4 and 24% of cloudy low-resolution SEVIRI pixels are found to contain broken clouds in our test data set, depending on the considered region. Most of these broken pixels are classified as cloudy by EUMETSAT's cloud mask, which will likely result in an overestimate if the mask is used as an estimate of cloud fraction. The HRV cloud mask targets the small-scale convective sub-pixel clouds that are missed by the EUMETSAT cloud mask. The major limitation of the HRV cloud mask is the minimum cloud optical thickness (COT) that can be detected. This threshold COT was found to be about 0.8 over ocean and 2 over land, and is strongly related to the albedo of the underlying surface.
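The core idea of thresholding reflectance relative to a clear-sky composite can be sketched in a few lines of NumPy. The synthetic scene, the composite percentile and the offset value below are invented for illustration and are not the paper's operational settings:

```python
import numpy as np

rng = np.random.default_rng(3)
ny, nx, nt = 32, 32, 30

# Synthetic HRV reflectance time series: a fixed surface albedo map plus
# sensor noise; clouds raise the observed reflectance by a large offset
surface = rng.uniform(0.05, 0.25, size=(ny, nx))
scenes = surface + rng.normal(0.0, 0.01, size=(nt, ny, nx))
cloud_truth = rng.random((nt, ny, nx)) < 0.2
scenes = np.where(cloud_truth, scenes + 0.3, scenes)

# Clear-sky composite: a low per-pixel percentile over time approximates
# the cloud-free reflectance of each pixel
composite = np.percentile(scenes, 10, axis=0)

# Relative threshold: flag a pixel as cloudy when its reflectance exceeds
# the composite by more than a (regionally tunable) offset
offset = 0.1
cloud_mask = scenes > composite + offset

print(f"agreement with synthetic truth: {(cloud_mask == cloud_truth).mean():.3f}")
```

Thresholding against the composite rather than against a fixed reflectance value makes the mask robust to spatial variations in surface albedo, which is the motivation the abstract gives for the relative-threshold design.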

  1. A probabilistic Poisson-based model accounts for an extensive set of absolute auditory threshold measurements.

    PubMed

    Heil, Peter; Matysiak, Artur; Neubauer, Heinrich

    2017-09-01

    Thresholds for detecting sounds in quiet decrease with increasing sound duration in every species studied. The neural mechanisms underlying this trade-off, often referred to as temporal integration, are not fully understood. Here, we probe the human auditory system with a large set of tone stimuli differing in duration, shape of the temporal amplitude envelope, duration of silent gaps between bursts, and frequency. Duration was varied by varying the plateau duration of plateau-burst (PB) stimuli, the duration of the onsets and offsets of onset-offset (OO) stimuli, and the number of identical bursts of multiple-burst (MB) stimuli. Absolute thresholds for a large number of ears (>230) were measured using a 3-interval-3-alternative forced choice (3I-3AFC) procedure. Thresholds decreased with increasing sound duration in a manner that depended on the temporal envelope. Most commonly, thresholds for MB stimuli were highest followed by thresholds for OO and PB stimuli of corresponding durations. Differences in the thresholds for MB and OO stimuli and in the thresholds for MB and PB stimuli, however, varied widely across ears, were negative in some ears, and were tightly correlated. We show that the variation and correlation of MB-OO and MB-PB threshold differences are linked to threshold microstructure, which affects the relative detectability of the sidebands of the MB stimuli and affects estimates of the bandwidth of auditory filters. We also found that thresholds for MB stimuli increased with increasing duration of the silent gaps between bursts. We propose a new model and show that it accurately accounts for our results and does so considerably better than a leaky-integrator-of-intensity model and a probabilistic model proposed by others. Our model is based on the assumption that sensory events are generated by a Poisson point process with a low rate in the absence of stimulation and higher, time-varying rates in the presence of stimulation. A subject in a 3I-3AFC
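The duration/threshold trade-off has a simple qualitative illustration under the paper's core assumption that sensory events are generated by a Poisson point process. The sketch below is not the authors' full model (it ignores the spontaneous rate and the 3I-3AFC decision stage, and the gain and criterion values are invented): if the driven event rate is proportional to stimulus amplitude, the amplitude needed to reach a fixed detection probability falls inversely with duration.

```python
import math

def detection_prob(amplitude, duration, gain=5.0):
    """P(at least one stimulus-driven sensory event) for a Poisson process
    with driven rate gain*amplitude: 1 - exp(-gain*amplitude*duration)."""
    return 1.0 - math.exp(-gain * amplitude * duration)

def threshold_amplitude(duration, p_target=0.75, gain=5.0):
    """Amplitude at which detection probability reaches p_target:
    1 - exp(-gain*A*D) = p  =>  A = -ln(1 - p) / (gain * D)."""
    return -math.log(1.0 - p_target) / (gain * duration)

for d in (0.01, 0.05, 0.2, 1.0):
    print(f"duration {d:5.2f} s: threshold amplitude {threshold_amplitude(d):.3f}")
# Threshold amplitude decreases monotonically with increasing duration,
# the temporal-integration trend reported in the measurements.
```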

  2. Near-Threshold Fatigue Crack Growth Behavior of Fine-Grain Nickel-Based Alloys

    NASA Technical Reports Server (NTRS)

    Newman, John A.; Piascik, Robert S.

    2003-01-01

Constant-Kmax fatigue crack growth tests were performed on two fine-grain nickel-base alloys, Inconel 718 (DA) and René 95, to determine if these alloys exhibit the near-threshold time-dependent crack growth behavior observed for fine-grain aluminum alloys in room-temperature laboratory air. Test results showed that increases in Kmax values resulted in increased crack growth rates, but no evidence of time-dependent crack growth was observed for either nickel-base alloy at room temperature.

  3. The risk equivalent of an exposure to-, versus a dose of radiation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bond, V.P.

The long-term potential carcinogenic effects of low-level exposure (LLE) are addressed. The principal point discussed is the linear, no-threshold dose-response curve. That the linear, no-threshold, or proportional, relationship is widely used is seen in the way in which the values for cancer risk coefficients are expressed - in terms of new cases, per million persons exposed, per year, per unit exposure or dose. This implies that the underlying relationship is proportional, i.e., ''linear, without threshold''. 12 refs., 9 figs., 1 tab.

  4. A method to quantify infectious airborne pathogens at concentrations below the threshold of quantification by culture

    PubMed Central

    Cutler, Timothy D.; Wang, Chong; Hoff, Steven J.; Zimmerman, Jeffrey J.

    2013-01-01

    In aerobiology, dose-response studies are used to estimate the risk of infection to a susceptible host presented by exposure to a specific dose of an airborne pathogen. In the research setting, host- and pathogen-specific factors that affect the dose-response continuum can be accounted for by experimental design, but the requirement to precisely determine the dose of infectious pathogen to which the host was exposed is often challenging. By definition, quantification of viable airborne pathogens is based on the culture of micro-organisms, but some airborne pathogens are transmissible at concentrations below the threshold of quantification by culture. In this paper we present an approach to the calculation of exposure dose at microbiologically unquantifiable levels using an application of the “continuous-stirred tank reactor (CSTR) model” and the validation of this approach using rhodamine B dye as a surrogate for aerosolized microbial pathogens in a dynamic aerosol toroid (DAT). PMID:24082399
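The CSTR approach referred to here can be sketched under standard well-mixed assumptions: after the aerosol source stops, concentration decays exponentially at the air-exchange rate Q/V, and the inhaled dose is the breathing rate times the time-integral of concentration. The model form is the textbook CSTR mass balance, not the paper's exact parameterization, and the numbers are hypothetical:

```python
import math

def cstr_concentration(c0, q, v, t):
    """Well-mixed chamber (CSTR) with clean inflow after the aerosol source
    stops: dC/dt = -(Q/V) * C, so C(t) = C0 * exp(-(Q/V) * t)."""
    return c0 * math.exp(-(q / v) * t)

def inhaled_dose(c0, q, v, minute_volume, t_end, steps=10000):
    """Exposure dose = breathing rate x time-integral of C(t), here
    approximated with the trapezoidal rule."""
    dt = t_end / steps
    area = 0.0
    for i in range(steps):
        ca = cstr_concentration(c0, q, v, i * dt)
        cb = cstr_concentration(c0, q, v, (i + 1) * dt)
        area += 0.5 * (ca + cb) * dt
    return minute_volume * area

# Hypothetical numbers: 100 infectious units/L initially, 50 L/min airflow,
# 1000 L chamber volume, 5 L/min breathing rate, 60 min exposure
dose = inhaled_dose(100.0, 50.0, 1000.0, 5.0, 60.0)
print(f"inhaled dose ~ {dose:.0f} infectious units")
```

Because the decay law is analytic, the numerical integral can be checked against minute_volume * C0 * (V/Q) * (1 - exp(-(Q/V) * t_end)); the same bookkeeping works when C0 is inferred from a surrogate tracer such as the rhodamine B dye used in the paper.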

  5. A threshold-based weather model for predicting stripe rust infection in winter wheat

    USDA-ARS?s Scientific Manuscript database

Wheat stripe rust (WSR) (caused by Puccinia striiformis f. sp. tritici) is a major threat in most wheat-growing regions worldwide, with the potential to inflict regular yield losses when environmental conditions are favorable. We propose a threshold-based disease-forecasting model using a stepwise modeling...

  6. THRIVE: threshold homomorphic encryption based secure and privacy preserving biometric verification system

    NASA Astrophysics Data System (ADS)

    Karabat, Cagatay; Kiraz, Mehmet Sabir; Erdogan, Hakan; Savas, Erkay

    2015-12-01

In this paper, we introduce a new biometric verification and template protection system which we call THRIVE. The system includes novel enrollment and authentication protocols based on threshold homomorphic encryption, where a private key is shared between a user and a verifier. In the THRIVE system, only encrypted binary biometric templates are stored in a database and verification is performed via homomorphically randomized templates; thus, original templates are never revealed during authentication. Due to the underlying threshold homomorphic encryption scheme, a malicious database owner cannot perform full decryption on encrypted templates of the users in the database. In addition, security of the THRIVE system is enhanced using a two-factor authentication scheme involving the user's private key and biometric data. Using simulation-based techniques, the proposed system is proven secure in the malicious model. The proposed system is suitable for applications where the user does not want to reveal her biometrics to the verifier in plain form but needs to prove her identity by using biometrics. The system can be used with any biometric modality where a feature extraction method yields a fixed-size binary template and a query template is verified when its Hamming distance to the database template is less than a threshold. The overall connection time for the proposed THRIVE system is estimated to be 336 ms on average for 256-bit biometric templates on a desktop PC running with quad-core 3.2 GHz CPUs at 10 Mbit/s up/down link connection speed. Consequently, the proposed system can be efficiently used in real-life applications.
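The decision rule (accept a query when its Hamming distance to the enrolled binary template is below a threshold) can be sketched in plaintext. In THRIVE itself this comparison is carried out on homomorphically encrypted, randomized templates, which this illustration deliberately omits; the template length matches the 256-bit templates mentioned in the abstract, but the noise level and decision threshold are invented:

```python
import numpy as np

rng = np.random.default_rng(7)
enrolled = rng.integers(0, 2, size=256)      # stored binary template (plaintext here)

# Genuine query: the same template with a few bits flipped by sensor noise
genuine = enrolled.copy()
flips = rng.choice(256, size=20, replace=False)
genuine[flips] ^= 1

# Impostor query: an unrelated random template
# (expected Hamming distance ~ 128 for 256 independent random bits)
impostor = rng.integers(0, 2, size=256)

def verify(db_template, query, threshold=64):
    """Accept iff the Hamming distance is below the decision threshold."""
    return int(np.count_nonzero(db_template != query)) < threshold

print("genuine accepted: ", verify(enrolled, genuine))
print("impostor accepted:", verify(enrolled, impostor))
```

The threshold trades false accepts against false rejects; the contribution of THRIVE is evaluating exactly this rule without ever exposing the templates in the clear.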

  7. Critical review and hydrologic application of threshold detection methods for the generalized Pareto (GP) distribution

    NASA Astrophysics Data System (ADS)

    Mamalakis, Antonios; Langousis, Andreas; Deidda, Roberto

    2016-04-01

Estimation of extreme rainfall from data constitutes one of the most important issues in statistical hydrology, as it is associated with the design of hydraulic structures and flood water management. To that end, based on asymptotic arguments from Extreme Excess (EE) theory, several studies have focused on developing new, or improving existing, methods to fit a generalized Pareto (GP) distribution model to rainfall excesses above a properly selected threshold u. The latter is generally determined using various approaches, such as non-parametric methods that are intended to locate the changing point between extreme and non-extreme regions of the data, graphical methods where one studies the dependence of GP distribution parameters (or related metrics) on the threshold level u, and Goodness of Fit (GoF) metrics that, for a certain level of significance, locate the lowest threshold u for which a GP distribution model is applicable. In this work, we review representative methods for GP threshold detection, discuss fundamental differences in their theoretical bases, and apply them to 1714 daily rainfall records from the NOAA-NCDC open-access database, each with more than 110 years of data. We find that non-parametric methods that are intended to locate the changing point between extreme and non-extreme regions of the data are generally not reliable, while methods that are based on asymptotic properties of the upper distribution tail lead to unrealistically high threshold and shape parameter estimates. The latter is justified by theoretical arguments, and it is especially the case in rainfall applications, where the shape parameter of the GP distribution is low, i.e. on the order of 0.1 ÷ 0.2. Better performance is demonstrated by graphical methods and GoF metrics that rely on pre-asymptotic properties of the GP distribution. For daily rainfall, we find that GP threshold estimates range between 2÷12 mm/d with a mean value of 6.5 mm/d, while the existence of quantization in the
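One classical graphical diagnostic of the kind reviewed here is the mean-excess (mean residual life) plot: for GP data the mean excess e(u) = (sigma + xi*u)/(1 - xi) is linear in the threshold u, and its slope encodes the shape parameter. The sketch below uses a synthetic GP sample with invented parameter values (the low shape is chosen to echo the 0.1-0.2 range quoted for daily rainfall); it is an illustration of the diagnostic, not the paper's procedure:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic GP sample via inverse-CDF sampling:
# X = (sigma/xi) * ((1-U)^(-xi) - 1) for U ~ Uniform(0, 1)
xi, sigma = 0.15, 5.0
u01 = rng.random(50000)
x = (sigma / xi) * ((1.0 - u01) ** (-xi) - 1.0)

# Empirical mean-excess function over a grid of candidate thresholds;
# approximate linearity in u supports the GP model above those thresholds
thresholds = np.linspace(0.0, 20.0, 21)
mean_excess = np.array([(x[x > t] - t).mean() for t in thresholds])

# Recover the shape parameter from the slope: s = xi/(1-xi)  =>  xi = s/(1+s)
slope = np.polyfit(thresholds, mean_excess, 1)[0]
xi_hat = slope / (1.0 + slope)
print(f"estimated shape parameter: {xi_hat:.3f}")
```

On real records the mean-excess curve is linear only above a sufficiently high threshold, and the lowest u at which linearity sets in is one candidate for the GP threshold.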

  8. The Impact of Genetic and Non-Genetic Factors on Warfarin Dose Prediction in MENA Region: A Systematic Review

    PubMed Central

    2016-01-01

Background Warfarin is the most commonly used oral anticoagulant for the treatment and prevention of thromboembolic disorders. Pharmacogenomics studies have shown that variants in CYP2C9 and VKORC1 genes are strongly and consistently associated with warfarin dose variability. Although different populations from the Middle East and North Africa (MENA) region may share the same ancestry, it is still unclear how they compare in the genetic and non-genetic factors affecting their warfarin dosing. Objective To explore the prevalence of CYP2C9 and VKORC1 variants in MENA, and the effect of these variants along with other non-genetic factors in predicting warfarin dose. Methods In this systematic review, we included observational cross sectional and cohort studies that enrolled patients on stable warfarin dose and had the genetics and non-genetics factors associated with mean warfarin dose as the primary outcome. We searched PubMed, Medline, Scopus, PharmGKB, PHGKB, Google scholar and reference lists of relevant reviews. Results We identified 17 studies in eight different populations: Iranian, Israeli, Egyptian, Lebanese, Omani, Kuwaiti, Sudanese and Turkish. The most common genetic variant in all populations was the VKORC1 (-1639G>A), with a minor allele frequency ranging from 30% in Egyptians and up to 52% and 56% in Lebanese and Iranians, respectively. Variants in the CYP2C9 were less common, with the highest MAF for CYP2C9*2 among Iranians (27%). Variants in the VKORC1 and CYP2C9 were the most significant predictors of warfarin dose in all populations. Along with other genetic and non-genetic factors, they explained up to 63% of the dose variability in Omani and Israeli patients. Conclusion Variants of VKORC1 and CYP2C9 are the strongest predictors of warfarin dose variability among the different populations from MENA. Although many of those populations share the same ancestry and are similar in their warfarin dose predictors, a population specific dosing algorithm is

  9. The Impact of Genetic and Non-Genetic Factors on Warfarin Dose Prediction in MENA Region: A Systematic Review.

    PubMed

    Bader, Loulia Akram; Elewa, Hazem

    2016-01-01

Warfarin is the most commonly used oral anticoagulant for the treatment and prevention of thromboembolic disorders. Pharmacogenomics studies have shown that variants in CYP2C9 and VKORC1 genes are strongly and consistently associated with warfarin dose variability. Although different populations from the Middle East and North Africa (MENA) region may share the same ancestry, it is still unclear how they compare in the genetic and non-genetic factors affecting their warfarin dosing. To explore the prevalence of CYP2C9 and VKORC1 variants in MENA, and the effect of these variants along with other non-genetic factors in predicting warfarin dose. In this systematic review, we included observational cross sectional and cohort studies that enrolled patients on stable warfarin dose and had the genetics and non-genetics factors associated with mean warfarin dose as the primary outcome. We searched PubMed, Medline, Scopus, PharmGKB, PHGKB, Google scholar and reference lists of relevant reviews. We identified 17 studies in eight different populations: Iranian, Israeli, Egyptian, Lebanese, Omani, Kuwaiti, Sudanese and Turkish. The most common genetic variant in all populations was the VKORC1 (-1639G>A), with a minor allele frequency ranging from 30% in Egyptians and up to 52% and 56% in Lebanese and Iranians, respectively. Variants in the CYP2C9 were less common, with the highest MAF for CYP2C9*2 among Iranians (27%). Variants in the VKORC1 and CYP2C9 were the most significant predictors of warfarin dose in all populations. Along with other genetic and non-genetic factors, they explained up to 63% of the dose variability in Omani and Israeli patients. Variants of VKORC1 and CYP2C9 are the strongest predictors of warfarin dose variability among the different populations from MENA. Although many of those populations share the same ancestry and are similar in their warfarin dose predictors, a population specific dosing algorithm is needed for the prospective estimation of warfarin

  10. Theoretical models and simulation codes to investigate bystander effects and cellular communication at low doses

    NASA Astrophysics Data System (ADS)

    Ballarini, F.; Alloni, D.; Facoetti, A.; Mairani, A.; Nano, R.; Ottolenghi, A.

Astronauts in space are continuously exposed to low doses of ionizing radiation from Galactic Cosmic Rays. During the last ten years, the effects of low radiation doses have been widely re-discussed following a large number of observations on the so-called non-targeted effects, in particular bystander effects. The latter consist of induction of cytogenetic damage in cells not directly traversed by radiation, most likely as a response to molecular messengers released by directly irradiated cells. Bystander effects, which are observed both for lethal endpoints (e.g. clonogenic inactivation and apoptosis) and for non-lethal ones (e.g. mutations and neoplastic transformation), tend to show non-linear dose responses. This might have significant consequences in terms of low-dose risk, which is generally calculated on the basis of the Linear No-Threshold hypothesis. Although the mechanisms underlying bystander effects are still largely unknown, it is now clear that two types of cellular communication (i.e. via gap junctions and/or release of molecular messengers into the extracellular environment) play a fundamental role. Theoretical models and simulation codes can be of help in elucidating such mechanisms. In the present paper we will review different available modelling approaches, including one that is being developed at the University of Pavia. The focus will be on the different assumptions adopted by the various authors and on the implications of such assumptions in terms of non-targeted radiobiological damage and, more generally, low-dose

  11. The estimation of absorbed dose rates for non-human biota : an extended inter-comparison.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Batlle, J. V. I.; Beaugelin-Seiller, K.; Beresford, N. A.

An exercise to compare 10 approaches for the calculation of unweighted whole-body absorbed dose rates was conducted for 74 radionuclides and five of the ICRP's Reference Animals and Plants, or RAPs (duck, frog, flatfish egg, rat and elongated earthworm), selected for this exercise to cover a range of body sizes, dimensions and exposure scenarios. Results were analysed using a non-parametric method requiring no specific hypotheses about the statistical distribution of data. The obtained unweighted absorbed dose rates for internal exposure compare well between the different approaches, with 70% of the results falling within a range of variation of ±20%. The variation is greater for external exposure, although 90% of the estimates are within an order of magnitude of one another. There are some discernible patterns where specific models over- or under-predicted. These are explained based on the methodological differences, including the number of daughter products included in the calculation of dose rate for a parent nuclide; source-target geometry; databases for discrete energy and yield of radionuclides; rounding errors in integration algorithms; and intrinsic differences in calculation methods. For certain radionuclides, these factors combine to generate systematic variations between approaches. Overall, the technique chosen to interpret the data enabled methodological differences in dosimetry calculations to be quantified and compared, allowing the identification of common issues between different approaches and providing greater assurance on the fundamental dose conversion coefficient approaches used in available models for assessing radiological effects to biota.

  12. Analysis of image thresholding segmentation algorithms based on swarm intelligence

    NASA Astrophysics Data System (ADS)

    Zhang, Yi; Lu, Kai; Gao, Yinghui; Yang, Bo

    2013-03-01

Swarm intelligence-based image thresholding segmentation algorithms are playing an important role in the research field of image segmentation. In this paper, we briefly introduce the theories of four existing image segmentation algorithms based on swarm intelligence: the fish swarm algorithm, artificial bee colony, the bacterial foraging algorithm and particle swarm optimization. Several benchmark images are then tested in order to show the differences among these four algorithms in segmentation accuracy, time consumption, convergence, and robustness to salt-and-pepper and Gaussian noise. Through these comparisons, this paper gives a qualitative analysis of the performance differences among the four algorithms. The conclusions provide a useful guide for practical image segmentation.

  13. Carbon deposition thresholds on nickel-based solid oxide fuel cell anodes II. Steam:carbon ratio and current density

    NASA Astrophysics Data System (ADS)

    Kuhn, J.; Kesler, O.

    2015-03-01

In the second part of a two-part publication, coking thresholds with respect to molar steam:carbon ratio (SC) and current density in nickel-based solid oxide fuel cells were determined. Anode-supported button cell samples were exposed to 2-component and 5-component gas mixtures with 1 ≤ SC ≤ 2 and zero fuel utilization for 10 h, followed by measurement of the resulting carbon mass. The effect of current density was explored by measuring carbon mass under conditions known to be prone to coking while increasing the current density until the cell was carbon-free. The SC coking thresholds were measured to be ∼1.04 and ∼1.18 at 600 and 700 °C, respectively. Current density experiments validated the thresholds measured with respect to fuel utilization and steam:carbon ratio. Coking thresholds at 600 °C could be predicted with thermodynamic equilibrium calculations when the Gibbs free energy of carbon was appropriately modified. Here, the Gibbs free energy of carbon on nickel-based anode support cermets was measured to be -6.91 ± 0.08 kJ mol-1. The results of this two-part publication show that thermodynamic equilibrium calculations with appropriate modification to the Gibbs free energy of solid-phase carbon can be used to predict coking thresholds on nickel-based anodes at 600-700 °C.

  14. Plant Physiological, Morphological and Yield-Related Responses to Night Temperature Changes across Different Species and Plant Functional Types

    PubMed Central

    Jing, Panpan; Wang, Dan; Zhu, Chunwu; Chen, Jiquan

    2016-01-01

Land surface temperature over the past decades has shown a faster warming trend during the night than during the day. Extremely low night temperatures have occurred frequently due to the influence of land-sea thermal difference, topography and climate change. This asymmetric night temperature change is expected to affect plant ecophysiology and growth, as the plant carbon consumption processes could be affected more than the assimilation processes, because photosynthesis in most plants occurs during the daytime whereas plant respiration occurs throughout the day. The effects of high night temperature (HNT) and low night temperature (LNT) on plant ecophysiological and growth processes, and how the effects vary among different plant functional types (PFTs), have not been analyzed extensively. In this meta-analysis, we examined the effect of HNT and LNT on plant physiology and growth across different PFTs and experimental settings. Plant species were grouped according to their photosynthetic pathways (C3, C4, and CAM), growth forms (herbaceous, woody), and economic purposes (crop, non-crop). We found that HNT and LNT both had a negative effect on plant yield, but the effect of HNT on plant yield was primarily related to a reduction in biomass allocation to reproductive organs, whereas the effect of LNT on plant yield was more related to a negative effect on total biomass. Leaf growth was stimulated at HNT and suppressed at LNT. HNT accelerated plant ecophysiological processes, including photosynthesis and dark respiration, while LNT slowed these processes. Overall, the results showed that the effects of night temperature on plant physiology and growth varied between HNT and LNT, among the response variables and PFTs, and depended on the magnitude of temperature change and experimental design. These findings suggest complexities and challenges in seeking general patterns of terrestrial plant growth in HNT and LNT. The PFT-specific responses of plants are critical for

  15. An Auditory-Masking-Threshold-Based Noise Suppression Algorithm GMMSE-AMT[ERB] for Listeners with Sensorineural Hearing Loss

    NASA Astrophysics Data System (ADS)

    Natarajan, Ajay; Hansen, John H. L.; Arehart, Kathryn Hoberg; Rossi-Katz, Jessica

    2005-12-01

    This study describes a new noise suppression scheme for hearing aid applications based on the auditory masking threshold (AMT) in conjunction with a modified generalized minimum mean square error estimator (GMMSE) for individual subjects with hearing loss. The representation of cochlear frequency resolution is achieved in terms of auditory filter equivalent rectangular bandwidths (ERBs). Estimation of AMT and spreading functions for masking are implemented in two ways: with normal auditory thresholds and normal auditory filter bandwidths (GMMSE-AMT[ERB]-NH) and with elevated thresholds and broader auditory filters characteristic of cochlear hearing loss (GMMSE-AMT[ERB]-HI). Evaluation is performed using speech corpora with objective quality measures (segmental SNR, Itakura-Saito), along with formal listener evaluations of speech quality rating and intelligibility. While no measurable changes in intelligibility occurred, evaluations showed quality improvement with both algorithm implementations. However, the customized formulation based on individual hearing losses was similar in performance to the formulation based on the normal auditory system.
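The ERB representation of cochlear frequency resolution used in this record is conventionally computed with the Glasberg and Moore (1990) formulas for normal hearing. The sketch below shows those standard published formulas as background; it is not code from this study, which additionally models the broader filters characteristic of cochlear hearing loss:

```python
import math

def erb_bandwidth(f_hz):
    """Glasberg & Moore (1990) equivalent rectangular bandwidth (Hz) of the
    normal auditory filter centered at f_hz: 24.7 * (4.37 * f/1000 + 1)."""
    return 24.7 * (4.37 * f_hz / 1000.0 + 1.0)

def erb_number(f_hz):
    """ERB-rate scale (Cams): number of ERBs below f_hz."""
    return 21.4 * math.log10(4.37 * f_hz / 1000.0 + 1.0)

for f in (250, 1000, 4000):
    print(f"{f} Hz: ERB = {erb_bandwidth(f):.1f} Hz, ERB-number = {erb_number(f):.1f}")
```

At 1 kHz the normal-hearing ERB is about 133 Hz; the GMMSE-AMT[ERB]-HI variant described above substitutes broader, loss-dependent filter bandwidths in place of these normal values.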

  16. Theoretical and experimental approaches to possible thresholds of response in carcinogenicity

    EPA Science Inventory

    The determination and utilization of the actual low dose-response relationship for chemical carcinogens has long interested toxicologists, experimental pathologists, modelers and risk assessors. To date, no unequivocal examples of carcinogenic thresholds in humans are known. Ho...

  17. Extended range radiation dose-rate monitor

    DOEpatents

    Valentine, Kenneth H.

    1988-01-01

    An extended range dose-rate monitor is provided which utilizes the pulse pileup phenomenon that occurs in conventional counting systems to alter the dynamic response of the system to extend the dose-rate counting range. The current pulses from a solid-state detector generated by radiation events are amplified and shaped prior to applying the pulses to the input of a comparator. The comparator generates one logic pulse for each input pulse which exceeds the comparator reference threshold. These pulses are integrated and applied to a meter calibrated to indicate the measured dose-rate in response to the integrator output. A portion of the output signal from the integrator is fed back to vary the comparator reference threshold in proportion to the output count rate to extend the sensitive dynamic detection range by delaying the asymptotic approach of the integrator output toward full scale as measured by the meter.

  18. I. RENAL THRESHOLDS FOR HEMOGLOBIN IN DOGS

    PubMed Central

    Lichty, John A.; Havill, William H.; Whipple, George H.

    1932-01-01

    We use the term "renal threshold for hemoglobin" to indicate the smallest amount of hemoglobin which given intravenously will effect the appearance of recognizable hemoglobin in the urine. The initial renal threshold level for dog hemoglobin is established by the methods employed at an average value of 155 mg. hemoglobin per kilo body weight with maximal values of 210 and minimal of 124. Repeated daily injections of hemoglobin will depress this initial renal threshold level on the average 46 per cent with maximal values of 110 and minimal values of 60 mg. hemoglobin per kilo body weight. This minimal or depression threshold is relatively constant if the injections are continued. Rest periods without injections cause a return of the renal threshold for hemoglobin toward the initial threshold levels—recovery threshold level. Injections of hemoglobin below the initial threshold level but above the minimal or depression threshold will eventually reduce the renal threshold for hemoglobin to its depression threshold level. We believe the depression threshold or minimal renal threshold level due to repeated hemoglobin injections is a little above the glomerular threshold which we assume is the base line threshold for hemoglobin. Our reasons for this belief in the glomerular threshold are given above and in the other papers of this series. PMID:19870016

  19. Hematopoietic responses under protracted exposures to low daily dose gamma irradiation

    NASA Astrophysics Data System (ADS)

    Seed, T. M.; Fritz, T. E.; Tolle, D. V.; Jackson, W. E.

    In attempting to evaluate the possible health consequences of chronic ionizing radiation exposure during extended space travel (e.g., Mars Mission), ground-based experimental studies of the clinical and pathological responses of canines under low daily doses of 60Co gamma irradiation (0.3-26.3 cGy d -1) have been examined. Specific reference was given to responses of the blood forming system. Results suggest that the daily dose rate of 7.5 cGy d -1 represents a threshold below which the hematopoietic system can retain either partial or full trilineal cell-producing capacity (erythropoiesis, myelopoiesis, and megakaryopoiesis) for extended periods of exposure (> 1yr). Trilineal capacity was fully retained for several years of exposure at the lowest dose-rate tested (0.3 cGy d -1) but was completely lost within several hundred days at the highest dose-rate (26.3 cGy d -1). Retention of hematopoietic capacity under chronic exposure has been demonstrated to be mediated by hematopoietic progenitors with acquired radioresistance and repair functions, altered cytogenetics, and cell-cycle characteristics. Radiological, biological, and temporal parameters responsible for these vital acquisitions by hematopoietic progenitors have been partially characterized. These parameters, along with threshold responses, are described and discussed in relation to potential health risks of the space traveler under chronic stress of low-dose irradiation.

  20. Hematopoietic responses under protracted exposures to low daily dose gamma irradiation.

    PubMed

    Seed, T M; Fritz, T E; Tolle, D V; Jackson, W E

    2002-01-01

    In attempting to evaluate the possible health consequences of chronic ionizing radiation exposure during extended space travel (e.g., Mars Mission), ground-based experimental studies of the clinical and pathological responses of canines under low daily doses of 60Co gamma irradiation (0.3-26.3 cGy d-1) have been examined. Specific reference was given to responses of the blood forming system. Results suggest that the daily dose rate of 7.5 cGy d-1 represents a threshold below which the hematopoietic system can retain either partial or full trilineal cell-producing capacity (erythropoiesis, myelopoiesis, and megakaryopoiesis) for extended periods of exposure (>1 yr). Trilineal capacity was fully retained for several years of exposure at the lowest dose-rate tested (0.3 cGy d-1) but was completely lost within several hundred days at the highest dose-rate (26.3 cGy d-1). Retention of hematopoietic capacity under chronic exposure has been demonstrated to be mediated by hematopoietic progenitors with acquired radioresistance and repair functions, altered cytogenetics, and cell-cycle characteristics. Radiological, biological, and temporal parameters responsible for these vital acquisitions by hematopoietic progenitors have been partially characterized. These parameters, along with threshold responses, are described and discussed in relation to potential health risks of the space traveler under chronic stress of low-dose irradiation. Published by Elsevier Science Ltd on behalf of COSPAR.

  1. Optimal methotrexate dose is associated with better clinical outcomes than non-optimal dose in daily practice: results from the ESPOIR early arthritis cohort.

    PubMed

    Gaujoux-Viala, Cécile; Rincheval, Nathalie; Dougados, Maxime; Combe, Bernard; Fautrel, Bruno

    2017-12-01

    Although methotrexate (MTX) is the consensual first-line disease-modifying antirheumatic drug (DMARD) for rheumatoid arthritis (RA), substantial heterogeneity remains with its prescription and dosage, which are often not optimal. To evaluate the symptomatic and structural impact of optimal MTX dose in patients with early RA in daily clinical practice over 2 years. Patients included in the early arthritis ESPOIR cohort who fulfilled the ACR-EULAR (American College of Rheumatology/European League against Rheumatism) criteria for RA and received MTX as a first DMARD were assessed. Optimal MTX dose was defined as ≥10 mg/week during the first 3 months, with escalation to ≥20 mg/week or 0.3 mg/kg/week at 6 months without Disease Activity Score in 28 joints remission. Symptomatic and structural efficacy with and without optimal MTX dose was assessed by generalised logistic regression with adjustment for appropriate variables. Within the first year of follow-up, 314 patients (53%) with RA received MTX as a first DMARD (mean dose 12.2±3.8 mg/week). Only 26.4% (n=76) had optimal MTX dose. After adjustment, optimal versus non-optimal MTX dose was more efficient in achieving ACR-EULAR remission at 1 year (OR 4.28 (95% CI 1.86 to 9.86)) and normal functioning (Health Assessment Questionnaire ≤0.5; OR at 1 year 4.36 (95% CI 2.03 to 9.39)), with no effect on radiological progression. Results were similar during the second year. Optimal MTX dose is more efficacious than non-optimal dose for remission and function in early arthritis in daily practice, with no impact on radiological progression over 2 years. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  2. A de-noising method using the improved wavelet threshold function based on noise variance estimation

    NASA Astrophysics Data System (ADS)

    Liu, Hui; Wang, Weida; Xiang, Changle; Han, Lijin; Nie, Haizhao

    2018-01-01

    The precise and efficient noise variance estimation is very important for the processing of all kinds of signals while using the wavelet transform to analyze signals and extract signal features. In view of the problem that the accuracy of traditional noise variance estimation is greatly affected by the fluctuation of noise values, this study puts forward the strategy of using the two-state Gaussian mixture model to classify the high-frequency wavelet coefficients in the minimum scale, which takes both the efficiency and accuracy into account. According to the noise variance estimation, a novel improved wavelet threshold function is proposed by combining the advantages of hard and soft threshold functions, and on the basis of the noise variance estimation algorithm and the improved wavelet threshold function, the research puts forth a novel wavelet threshold de-noising method. The method is tested and validated using random signals and bench test data of an electro-mechanical transmission system. The test results indicate that the wavelet threshold de-noising method based on the noise variance estimation shows preferable performance in processing the testing signals of the electro-mechanical transmission system: it can effectively eliminate the interference of transient signals including voltage, current, and oil pressure and maintain the dynamic characteristics of the signals favorably.
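The hard and soft threshold functions the method combines, together with the standard MAD-based noise estimate from the finest-scale detail coefficients, can be sketched as follows. The blended `improved_threshold` (with parameter `alpha`) is an illustrative compromise, not the authors' exact function:

```python
import numpy as np

def hard_threshold(w, t):
    """Zero coefficients at or below t in magnitude; keep the rest unchanged."""
    return np.where(np.abs(w) > t, w, 0.0)

def soft_threshold(w, t):
    """Shrink coefficients toward zero by t (soft thresholding)."""
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

def improved_threshold(w, t, alpha=0.5):
    """Illustrative blend: alpha=1 reproduces hard, alpha=0 reproduces soft."""
    return alpha * hard_threshold(w, t) + (1 - alpha) * soft_threshold(w, t)

def noise_sigma(finest_details):
    """Donoho-style robust noise estimate from the finest-scale detail
    coefficients: median absolute deviation scaled by 0.6745."""
    return np.median(np.abs(finest_details)) / 0.6745
```

A blend of this kind keeps the soft function's continuity at the threshold while reducing its constant bias on large coefficients, which is the trade-off the abstract alludes to.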

  3. Should we expect population thresholds for wildlife disease?

    USGS Publications Warehouse

    Lloyd-Smith, James O.; Cross, P.C.; Briggs, C.J.; Daugherty, M.; Getz, W.M.; Latto, J.; Sanchez, M.; Smith, A.; Swei, A.

    2005-01-01

    Host population thresholds for invasion or persistence of infectious disease are core concepts of disease ecology, and underlie on-going and controversial disease control policies based on culling and vaccination. Empirical evidence for these thresholds in wildlife populations has been sparse, however, though recent studies have narrowed this gap. Here we review the theoretical bases for population thresholds for disease, revealing why they are difficult to measure and sometimes are not even expected, and identifying important facets of wildlife ecology left out of current theories. We discuss strengths and weaknesses of selected empirical studies that have reported disease thresholds for wildlife, identify recurring obstacles, and discuss implications of our imperfect understanding of wildlife thresholds for disease control policy.

  4. The influence of non-DNA-targeted effects on carbon ion–induced low-dose hyper-radiosensitivity in MRC-5 cells

    PubMed Central

    Ye, Fei; Ning, Jing; Liu, Xinguo; Jin, Xiaodong; Wang, Tieshan; Li, Qiang

    2016-01-01

    Low-dose hyper-radiosensitivity (LDHRS) is a hot topic in normal tissue radiation protection. However, the primary causes for LDHRS still remain unclear. In this study, the impact of non-DNA-targeted effects (NTEs) on high-LET radiation–induced LDHRS was investigated. Human normal lung fibroblast MRC-5 cells were irradiated with high-LET carbon ions, and low-dose biological effects (in terms of various bio-endpoints, including colony formation, DNA damage and micronuclei formation) were detected under conditions with and without gap junctional intercellular communication (GJIC) inhibition. LDHRS was observed when the radiation dose was <0.2 Gy for all bio-endpoints under investigation, but vanished when the GJIC was suppressed. Based on the probability of cells being hit and micro-dose per cell calculation, we deduced that the LDHRS phenomenon came from the combined action of direct hits and NTEs. We concluded that GJIC definitely plays an important role in cytotoxic substance spreading in high-LET carbon ion–induced LDHRS. PMID:26559335

  5. Non-Invasive Early Detection and Molecular Analysis of Low X-ray Dose Effects in the Lens

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldstein, Lee

    This is the Final Progress Report for DOE-funded research project DE-PS02-08ER08-01 titled “Non-Invasive Early Detection and Molecular Analysis of Low X-ray Dose Effects in the Lens”. The project focuses on the effects of low-linear energy transfer (LET) radiation on the ocular lens. The lens is an exquisitely radiosensitive tissue with a highly-ordered molecular structure that is amenable to non-invasive optical study from the periphery. These merits point to the lens as an ideal target for laser-based molecular biodosimetry (MBD). Following exposure to different types of ionizing radiations, the lens demonstrates molecular changes (e.g., oxidation, racemization, crosslinkage, truncation, aggregation, etc.) that impact the structure and function of the long-lived proteins in the cytosol of lens fiber cells. The vast majority of proteins in the lens comprise the highly-ordered crystallins. These highly conserved lens proteins are amongst the most concentrated and stable in the body. Once synthesized, the crystallins are retained in the fiber cell cytoplasm for life. Taken together, these properties point to the lens as an ideal system for quantitative in vivo MBD assessment using quasi-elastic light scattering (QLS) analysis. In this project, we deploy a purpose-designed non-invasive infrared laser QLS instrument as a quantitative tool for longitudinal assessment of pre-cataractous molecular changes in the lenses of living mice exposed to low-dose low-LET radiation compared to non-irradiated sham controls. We hypothesize that radiation exposure will induce dose-dependent changes in the molecular structure of matrix proteins in the lens. Mechanistic assays to ascertain radiation-induced molecular changes in the lens focus on protein aggregation and gene/protein expression patterns. We anticipate that this study will contribute to our understanding of early molecular changes associated with radiation-induced tissue pathology. This study also affords

  6. Current and emerging challenges in toxicopathology: Carcinogenic threshold of phenobarbital and proof of arsenic carcinogenicity using rat medium-term bioassays for carcinogens

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fukushima, Shoji; Morimura, Keiichirou; Wanibuchi, Hideki

    2005-09-01

    For the last 25 years, Prof. Nobuyuki Ito and his laboratory have focused on the development of a liver medium-term bioassay system for detection of carcinogens in F344 rats utilizing glutathione S-transferase placental form (GST-P)-positive foci as an end point marker. In this presentation, the outline and samples of medium-term bioassay systems were described. Furthermore, our data demonstrated the presence of a threshold for the non-genotoxic carcinogen, phenobarbital (PB), and the lack of linearity in the low-dose area of the dose-response curve, providing evidence for hormesis. In addition, the establishment and applications of the multiorgan carcinogenicity bioassay (DMBDD model), used for the examination of the carcinogenicity of genotoxic and non-genotoxic chemicals, are discussed. Dimethylarsinic acid, an organic arsenical, was found to be carcinogenic in the rat bladder using the DMBDD model and carcinogenicity test.

  7. A Comparison of Dose-Response Models for the Parotid Gland in a Large Group of Head-and-Neck Cancer Patients

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Houweling, Antonetta C., E-mail: A.Houweling@umcutrecht.n; Philippens, Marielle E.P.; Dijkema, Tim

    2010-03-15

    Purpose: The dose-response relationship of the parotid gland has been described most frequently using the Lyman-Kutcher-Burman model. However, various other normal tissue complication probability (NTCP) models exist. We evaluated in a large group of patients the value of six NTCP models that describe the parotid gland dose response 1 year after radiotherapy. Methods and Materials: A total of 347 patients with head-and-neck tumors were included in this prospective parotid gland dose-response study. The patients were treated with either conventional radiotherapy or intensity-modulated radiotherapy. Dose-volume histograms for the parotid glands were derived from three-dimensional dose calculations using computed tomography scans. Stimulated salivary flow rates were measured before and 1 year after radiotherapy. A threshold of 25% of the pretreatment flow rate was used to define a complication. The evaluated models included the Lyman-Kutcher-Burman model, the mean dose model, the relative seriality model, the critical volume model, the parallel functional subunit model, and the dose-threshold model. The goodness of fit (GOF) was determined by the deviance and a Monte Carlo hypothesis test. Ranking of the models was based on Akaike's information criterion (AIC). Results: None of the models was rejected based on the evaluation of the GOF. The mean dose model was ranked as the best model based on the AIC. The TD50 in these models was approximately 39 Gy. Conclusions: The mean dose model was preferred for describing the dose-response relationship of the parotid gland.
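The mean dose model ranked best here is commonly written as a probit function of mean parotid dose. A sketch using the TD50 of approximately 39 Gy reported in the abstract (the slope m = 0.4 is an assumed, illustrative value, not taken from the paper):

```python
from math import erf, sqrt

def ntcp_mean_dose(mean_dose_gy, td50=39.0, m=0.4):
    """Probit NTCP model: Phi((D_mean - TD50) / (m * TD50)).

    td50 follows the abstract; m is an illustrative slope parameter.
    Returns the complication probability for a given mean dose in Gy."""
    t = (mean_dose_gy - td50) / (m * td50)
    return 0.5 * (1.0 + erf(t / sqrt(2.0)))
```

By construction the model predicts a 50% complication probability at the TD50 and rises smoothly with mean dose.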

  8. Comparison of Threshold Detection Methods for the Generalized Pareto Distribution (GPD): Application to the NOAA-NCDC Daily Rainfall Dataset

    NASA Astrophysics Data System (ADS)

    Deidda, Roberto; Mamalakis, Antonis; Langousis, Andreas

    2015-04-01

    One of the most crucial issues in statistical hydrology is the estimation of extreme rainfall from data. To that end, based on asymptotic arguments from Extreme Excess (EE) theory, several studies have focused on developing new, or improving existing, methods to fit a Generalized Pareto Distribution (GPD) model to rainfall excesses above a properly selected threshold u. The latter is generally determined using various approaches that can be grouped into three basic classes: a) non-parametric methods that locate the change point between extreme and non-extreme regions of the data, b) graphical methods where one studies the dependence of the GPD parameters (or related metrics) on the threshold level u, and c) Goodness of Fit (GoF) metrics that, for a certain level of significance, locate the lowest threshold u above which a GPD model is applicable. In this work, we review representative methods for GPD threshold detection, discuss fundamental differences in their theoretical bases, and apply them to daily rainfall records from the NOAA-NCDC open-access database (http://www.ncdc.noaa.gov/oa/climate/ghcn-daily/). We find that non-parametric methods that locate the change point between extreme and non-extreme regions of the data are generally not reliable, while graphical methods and GoF metrics that rely on limiting arguments for the upper distribution tail lead to unrealistically high thresholds u. The latter is expected, since one checks the validity of the limiting arguments rather than the applicability of a GPD distribution model. Better performance is demonstrated by graphical methods and GoF metrics that rely on GPD properties. Finally, we discuss the effects of data quantization (common in hydrologic applications) on the estimated thresholds. Acknowledgments: The research project is implemented within the framework of the Action «Supporting Postdoctoral Researchers» of the Operational Program "Education and Lifelong Learning" (Action's Beneficiary: General
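A minimal example of the graphical (class b) approach is the mean residual life plot: above a valid GPD threshold, the empirical mean excess is approximately linear in u. A sketch of the underlying statistic:

```python
def mean_excess(data, u):
    """Empirical mean excess e(u) = mean(x - u) over observations x > u.

    Plotting e(u) against a range of candidate thresholds u and looking
    for the onset of approximate linearity is the classical graphical
    way to pick a GPD threshold. Returns NaN if nothing exceeds u."""
    exceedances = [x - u for x in data if x > u]
    return sum(exceedances) / len(exceedances) if exceedances else float("nan")
```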

  9. Threshold quantum secret sharing based on single qubit

    NASA Astrophysics Data System (ADS)

    Lu, Changbin; Miao, Fuyou; Meng, Keju; Yu, Yue

    2018-03-01

    Based on unitary phase shift operation on a single qubit in association with Shamir's (t, n) secret sharing, a (t, n) threshold quantum secret sharing scheme (or (t, n)-QSS) is proposed to share both classical information and quantum states. The scheme uses decoy photons to prevent eavesdropping and employs the secret in Shamir's scheme as the private value to guarantee the correctness of secret reconstruction. Analyses show it is resistant to the typical intercept-and-resend attack, the entangle-and-measure attack and participant attacks such as the entanglement swapping attack. Moreover, it is easier to realize physically and more practical in applications when compared with related ones. By the method in our scheme, new (t, n)-QSS schemes can be easily constructed using other classical (t, n) secret sharing schemes.
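The classical Shamir (t, n) layer that the quantum scheme builds on can be sketched over a prime field. The modulus choice below is illustrative; any prime larger than the secret works:

```python
import random

P = 2**61 - 1  # a Mersenne prime used as the field modulus (illustrative)

def make_shares(secret, t, n):
    """Split `secret` into n shares; any t of them reconstruct it.

    Samples a random degree-(t-1) polynomial with constant term `secret`
    and evaluates it at x = 1..n."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    return [(x, sum(c * pow(x, k, P) for k, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the constant term."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        # pow(den, P - 2, P) is the modular inverse (Fermat's little theorem)
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret
```

Any t of the n shares suffice, and fewer than t reveal nothing about the secret, which is the threshold property the quantum construction inherits.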

  10. Threshold-Based Random Charging Scheme for Decentralized PEV Charging Operation in a Smart Grid.

    PubMed

    Kwon, Ojin; Kim, Pilkee; Yoon, Yong-Jin

    2016-12-26

    Smart grids have been introduced to replace conventional power distribution systems without real time monitoring for accommodating the future market penetration of plug-in electric vehicles (PEVs). When a large number of PEVs require simultaneous battery charging, charging coordination techniques have become one of the most critical factors to optimize the PEV charging performance and the conventional distribution system. In this case, considerable computational complexity of a central controller and exchange of real time information among PEVs may occur. To alleviate these problems, a novel threshold-based random charging (TBRC) operation for a decentralized charging system is proposed. Using PEV charging thresholds and random access rates, the PEVs themselves can participate in the charging requests. As PEVs with a high battery state do not transmit the charging requests to the central controller, the complexity of the central controller decreases due to the reduction of the charging requests. In addition, both the charging threshold and the random access rate are statistically calculated based on the average supply power of the PEV charging system, and thus do not require a real time update. By using the proposed TBRC with a tolerable PEV charging degradation, a 51% reduction of the PEV charging requests is achieved.

  11. Threshold-Based Random Charging Scheme for Decentralized PEV Charging Operation in a Smart Grid

    PubMed Central

    Kwon, Ojin; Kim, Pilkee; Yoon, Yong-Jin

    2016-01-01

    Smart grids have been introduced to replace conventional power distribution systems without real time monitoring for accommodating the future market penetration of plug-in electric vehicles (PEVs). When a large number of PEVs require simultaneous battery charging, charging coordination techniques have become one of the most critical factors to optimize the PEV charging performance and the conventional distribution system. In this case, considerable computational complexity of a central controller and exchange of real time information among PEVs may occur. To alleviate these problems, a novel threshold-based random charging (TBRC) operation for a decentralized charging system is proposed. Using PEV charging thresholds and random access rates, the PEVs themselves can participate in the charging requests. As PEVs with a high battery state do not transmit the charging requests to the central controller, the complexity of the central controller decreases due to the reduction of the charging requests. In addition, both the charging threshold and the random access rate are statistically calculated based on the average supply power of the PEV charging system, and thus do not require a real time update. By using the proposed TBRC with a tolerable PEV charging degradation, a 51% reduction of the PEV charging requests is achieved. PMID:28035963
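The per-vehicle decision rule behind TBRC can be sketched as below; `soc`, `threshold` and `access_rate` are illustrative parameter names for the battery state, charging threshold and random access rate the abstract describes:

```python
import random

def should_request(soc, threshold, access_rate, rand=random.random):
    """TBRC decision sketch: a PEV transmits a charging request only if
    its battery state of charge is below the charging threshold, and
    then only with probability `access_rate` in the current slot.
    High-state PEVs stay silent, so the central controller sees fewer
    requests without any real-time coordination."""
    return soc < threshold and rand() < access_rate
```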

  12. A technique for setting analytical thresholds in massively parallel sequencing-based forensic DNA analysis

    PubMed Central

    2017-01-01

    Amplicon (targeted) sequencing by massively parallel sequencing (PCR-MPS) is a potential method for use in forensic DNA analyses. In this application, PCR-MPS may supplement or replace other instrumental analysis methods such as capillary electrophoresis and Sanger sequencing for STR and mitochondrial DNA typing, respectively. PCR-MPS also may enable the expansion of forensic DNA analysis methods to include new marker systems such as single nucleotide polymorphisms (SNPs) and insertion/deletions (indels) that currently are assayable using various instrumental analysis methods including microarray and quantitative PCR. Acceptance of PCR-MPS as a forensic method will depend in part upon developing protocols and criteria that define the limitations of a method, including a defensible analytical threshold or method detection limit. This paper describes an approach to establish objective analytical thresholds suitable for multiplexed PCR-MPS methods. A definition is proposed for PCR-MPS method background noise, and an analytical threshold based on background noise is described. PMID:28542338

  13. A technique for setting analytical thresholds in massively parallel sequencing-based forensic DNA analysis.

    PubMed

    Young, Brian; King, Jonathan L; Budowle, Bruce; Armogida, Luigi

    2017-01-01

    Amplicon (targeted) sequencing by massively parallel sequencing (PCR-MPS) is a potential method for use in forensic DNA analyses. In this application, PCR-MPS may supplement or replace other instrumental analysis methods such as capillary electrophoresis and Sanger sequencing for STR and mitochondrial DNA typing, respectively. PCR-MPS also may enable the expansion of forensic DNA analysis methods to include new marker systems such as single nucleotide polymorphisms (SNPs) and insertion/deletions (indels) that currently are assayable using various instrumental analysis methods including microarray and quantitative PCR. Acceptance of PCR-MPS as a forensic method will depend in part upon developing protocols and criteria that define the limitations of a method, including a defensible analytical threshold or method detection limit. This paper describes an approach to establish objective analytical thresholds suitable for multiplexed PCR-MPS methods. A definition is proposed for PCR-MPS method background noise, and an analytical threshold based on background noise is described.
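One common way to turn a background-noise characterization into an analytical threshold is a mean-plus-k-standard-deviations rule. The sketch below is a generic illustration of that idea, not the paper's specific derivation of PCR-MPS background noise:

```python
import statistics

def analytical_threshold(noise_reads, k=3.0):
    """AT = mean background read count + k standard deviations.

    Read counts at or below AT are treated as indistinguishable from
    noise. k = 3 is an illustrative coverage factor; the appropriate
    value depends on the desired false-detection rate."""
    return statistics.mean(noise_reads) + k * statistics.stdev(noise_reads)
```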

  14. Elevation of pain threshold by vaginal stimulation in women.

    PubMed

    Whipple, B; Komisaruk, B R

    1985-04-01

    In 2 studies with 10 women each, vaginal self-stimulation significantly increased the threshold to detect and tolerate painful finger compression, but did not significantly affect the threshold to detect innocuous tactile stimulation. The vaginal self-stimulation was applied with a specially designed pressure transducer assembly to produce a report of pressure or pleasure. In the first study, 6 of the women perceived the vaginal stimulation as producing pleasure. During that condition, the pain tolerance threshold increased significantly by 36.8% and the pain detection threshold increased significantly by 53%. A second study utilized other types of stimuli. Vaginal self-stimulation perceived as pressure significantly increased the pain tolerance threshold by 40.3% and the pain detection threshold by 47.4%. In the second study, when the vaginal stimulation was self-applied in a manner that produced orgasm, the pain tolerance threshold and pain detection threshold increased significantly by 74.6% and 106.7% respectively, while the tactile threshold remained unaffected. A variety of control conditions, including various types of distraction, did not significantly elevate pain or tactile thresholds. We conclude that in women, vaginal self-stimulation decreases pain sensitivity, but does not affect tactile sensitivity. This effect is apparently not due to painful or non-painful distraction.

  15. A distributed lag approach to fitting non-linear dose-response models in particulate matter air pollution time series investigations.

    PubMed

    Roberts, Steven; Martin, Michael A

    2007-06-01

    The majority of studies that have investigated the relationship between particulate matter (PM) air pollution and mortality have assumed a linear dose-response relationship and have used either a single-day's PM or a 2- or 3-day moving average of PM as the measure of PM exposure. Both of these modeling choices have come under scrutiny in the literature, the linear assumption because it does not allow for non-linearities in the dose-response relationship, and the use of the single- or multi-day moving average PM measure because it does not allow for differential PM-mortality effects spread over time. These two problems have been dealt with on a piecemeal basis with non-linear dose-response models used in some studies and distributed lag models (DLMs) used in others. In this paper, we propose a method for investigating the shape of the PM-mortality dose-response relationship that combines a non-linear dose-response model with a DLM. This combined model will be shown to produce satisfactory estimates of the PM-mortality dose-response relationship in situations where non-linear dose response models and DLMs alone do not; that is, the combined model did not systematically underestimate or overestimate the effect of PM on mortality. The combined model is applied to ten cities in the US and a pooled dose-response model formed. When fitted with a change-point value of 60 µg/m³, the pooled model provides evidence for a positive association between PM and mortality. The combined model produced larger estimates for the effect of PM on mortality than when using a non-linear dose-response model or a DLM in isolation. For the combined model, the estimated percentage increase in mortality for PM concentrations of 25 and 75 µg/m³ were 3.3% and 5.4%, respectively. In contrast, the corresponding values from a DLM used in isolation were 1.2% and 3.5%, respectively.
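The combined model can be sketched as regressors that are both lagged and hinge-transformed at the change point; the function below is a simplified reading of that structure, not the authors' exact specification:

```python
import numpy as np

def changepoint_lag_regressor(pm, lag, cp=60.0):
    """One column of a change-point distributed lag design matrix:
    max(PM_{t-lag} - cp, 0), i.e. flat below the change point cp
    (in ug/m^3) and linear above it. Leading entries without a full
    lag history are NaN. One such column per lag, each with its own
    coefficient, gives the combined non-linear DLM."""
    col = np.full(len(pm), np.nan)
    col[lag:] = np.maximum(np.asarray(pm[:len(pm) - lag], float) - cp, 0.0)
    return col
```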

  16. Dose conversion coefficients for electron exposure of the human eye lens

    NASA Astrophysics Data System (ADS)

    Behrens, R.; Dietze, G.; Zankl, M.

    2009-07-01

    Recent epidemiological studies suggest a rather low dose threshold (below 0.5 Gy) for the induction of a cataract of the eye lens. Some other studies even assume that there is no threshold at all. Therefore, protection measures have to be optimized and current dose limits for the eye lens may be reduced in the future. Two questions arise from this situation: first, which dose quantity is related to the risk of developing a cataract, and second, which personal dose equivalent quantity is appropriate for monitoring this dose quantity. While the dose equivalent quantity Hp(0.07) has often been seen as being sufficiently accurate for monitoring the dose to the lens of the eye, this would be questionable in the case when the dose limits were reduced and, thus, it may be necessary to generally use the dose equivalent quantity Hp(3) for this purpose. The basis for a decision, however, must be the knowledge of accurate conversion coefficients from fluence to equivalent dose to the lens. This is especially important for low-penetrating radiation, for example, electrons. Formerly published values of conversion coefficients are based on quite simple models of the eye. In this paper, quite a sophisticated model of the eye including the inner structure of the lens was used for the calculations and precise conversion coefficients for electrons with energies between 0.2 MeV and 12 MeV, and for angles of radiation incidence between 0° and 45° are presented. Compared to the values adopted in 1996 by the International Commission on Radiological Protection (ICRP), the new values are up to 1000 times smaller for electron energies below 1 MeV, nearly equal at 1 MeV and above 4 MeV, and by a factor of 1.5 larger at about 1.5 MeV electron energy.

  17. Dose conversion coefficients for electron exposure of the human eye lens.

    PubMed

    Behrens, R; Dietze, G; Zankl, M

    2009-07-07

    Recent epidemiological studies suggest a rather low dose threshold (below 0.5 Gy) for the induction of a cataract of the eye lens. Some other studies even assume that there is no threshold at all. Therefore, protection measures have to be optimized and current dose limits for the eye lens may be reduced in the future. Two questions arise from this situation: first, which dose quantity is related to the risk of developing a cataract, and second, which personal dose equivalent quantity is appropriate for monitoring this dose quantity. While the dose equivalent quantity H(p)(0.07) has often been seen as being sufficiently accurate for monitoring the dose to the lens of the eye, this would be questionable in the case when the dose limits were reduced and, thus, it may be necessary to generally use the dose equivalent quantity H(p)(3) for this purpose. The basis for a decision, however, must be the knowledge of accurate conversion coefficients from fluence to equivalent dose to the lens. This is especially important for low-penetrating radiation, for example, electrons. Formerly published values of conversion coefficients are based on quite simple models of the eye. In this paper, quite a sophisticated model of the eye including the inner structure of the lens was used for the calculations and precise conversion coefficients for electrons with energies between 0.2 MeV and 12 MeV, and for angles of radiation incidence between 0 degrees and 45 degrees are presented. Compared to the values adopted in 1996 by the International Commission on Radiological Protection (ICRP), the new values are up to 1000 times smaller for electron energies below 1 MeV, nearly equal at 1 MeV and above 4 MeV, and by a factor of 1.5 larger at about 1.5 MeV electron energy.

  18. Optimizing Retransmission Threshold in Wireless Sensor Networks

    PubMed Central

    Bi, Ran; Li, Yingshu; Tan, Guozhen; Sun, Liang

    2016-01-01

    The retransmission threshold in wireless sensor networks is critical to the latency of data delivery in the networks. However, existing works on data transmission in sensor networks did not consider the optimization of the retransmission threshold, and they simply set the same retransmission threshold for all sensor nodes in advance. The method did not take link quality and delay requirement into account, which decreases the probability of a packet passing its delivery path within a given deadline. This paper investigates the problem of finding optimal retransmission thresholds for relay nodes along a delivery path in a sensor network. The object of optimizing retransmission thresholds is to maximize the summation of the probability of the packet being successfully delivered to the next relay node or destination node in time. A dynamic programming-based distributed algorithm for finding optimal retransmission thresholds for relay nodes along a delivery path in the sensor network is proposed. The time complexity is O(nΔ·max{u_i : 1 ≤ i ≤ n}), where u_i is the given upper bound of the retransmission threshold of sensor node i in a given delivery path, n is the length of the delivery path and Δ is the given upper bound of the transmission delay of the delivery path. If Δ is greater than polynomial in n, to reduce the time complexity, a linear programming-based (1 + p_min)-approximation algorithm is proposed. Furthermore, when the ranges of the upper and lower bounds of retransmission thresholds are big enough, a Lagrange multiplier-based distributed O(1)-approximation algorithm with time complexity O(1) is proposed. Experimental results show that the proposed algorithms have better performance. PMID:27171092
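The dynamic program can be sketched as follows. This is a simplified reading of the paper's objective (unit delay per transmission is assumed, and per-hop on-time success within r tries of a link with success probability p is taken as 1 - (1 - p)^r); names and structure are illustrative:

```python
def optimal_thresholds(link_p, upper, delay_bound):
    """DP sketch: choose a retransmission threshold r_i in 1..upper[i]
    for each hop so the worst-case total number of transmissions fits
    the delay budget, maximizing the sum of per-hop success
    probabilities 1 - (1 - p_i)**r_i.

    Returns (objective, [r_1, ..., r_n])."""
    best = {0: (0.0, [])}  # delay used -> (best objective, thresholds so far)
    for i, p in enumerate(link_p):
        nxt = {}
        for used, (val, choice) in best.items():
            for r in range(1, upper[i] + 1):
                d = used + r
                if d > delay_bound:
                    break  # larger r only costs more delay
                v = val + (1.0 - (1.0 - p) ** r)
                if d not in nxt or v > nxt[d][0]:
                    nxt[d] = (v, choice + [r])
        best = nxt
    return max(best.values())
```

The state is the delay consumed so far, so the table has at most Δ entries per hop and each entry tries every feasible threshold, matching the O(nΔ·max u_i) bound quoted in the abstract.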

  19. Disease activity guided dose reduction and withdrawal of adalimumab or etanercept compared with usual care in rheumatoid arthritis: open label, randomised controlled, non-inferiority trial.

    PubMed

    van Herwaarden, Noortje; van der Maas, Aatke; Minten, Michiel J M; van den Hoogen, Frank H J; Kievit, Wietske; van Vollenhoven, Ronald F; Bijlsma, Johannes W J; van den Bemt, Bart J F; den Broeder, Alfons A

    2015-04-09

    To evaluate whether a disease activity guided strategy of dose reduction of two tumour necrosis factor (TNF) inhibitors, adalimumab or etanercept, is non-inferior in maintaining disease control in patients with rheumatoid arthritis compared with usual care. Randomised controlled, open label, non-inferiority strategy trial. Two rheumatology outpatient clinics in the Netherlands, from December 2011 to May 2014. 180 patients with rheumatoid arthritis and low disease activity using adalimumab or etanercept; 121 allocated to the dose reduction strategy, 59 to usual care. Disease activity guided dose reduction (advice to stepwise increase the injection interval every three months, until flare of disease activity or discontinuation) or usual care (no dose reduction advice). Flare was defined as increase in DAS28-CRP (a composite score measuring disease activity) greater than 1.2, or increase greater than 0.6 and current score of at least 3.2. In the case of flare, TNF inhibitor use was restarted or escalated. Difference in proportions of patients with major flare (DAS28-CRP based flare longer than three months) between the two groups at 18 months, compared against a non-inferiority margin of 20%. Secondary outcomes included TNF inhibitor use at study end, functioning, quality of life, radiographic progression, and adverse events. Dose reduction of adalimumab or etanercept was non-inferior to usual care (proportion of patients with major flare at 18 months, 12% v 10%; difference 2%, 95% confidence interval -12% to 12%). In the dose reduction group, TNF inhibitor use could successfully be stopped in 20% (95% confidence interval 13% to 28%), the injection interval successfully increased in 43% (34% to 53%), but no dose reduction was possible in 37% (28% to 46%). Functional status, quality of life, relevant radiographic progression, and adverse events did not differ between the groups, although short lived flares (73% v 27%) and minimal radiographic progression (32% v 15
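    The flare definition quoted above is mechanical enough to state directly. A minimal sketch of the DAS28-CRP flare criterion (function and parameter names are ours, not the trial's):

```python
def is_flare(previous_das28crp, current_das28crp):
    """DAS28-CRP flare as defined in the trial abstract: an increase greater
    than 1.2, or an increase greater than 0.6 with a current score of at
    least 3.2."""
    delta = current_das28crp - previous_das28crp
    return delta > 1.2 or (delta > 0.6 and current_das28crp >= 3.2)
```

    For example, a rise from 2.4 to 3.1 (delta 0.7, but current score below 3.2) does not qualify, while a rise from 2.6 to 3.3 does.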

  20. 40 CFR 98.381 - Reporting threshold.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 22 2012-07-01 2012-07-01 false Reporting threshold. 98.381 Section 98.381 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Suppliers of Coal-based Liquid Fuels § 98.381 Reporting threshold. Any...

  1. 40 CFR 98.381 - Reporting threshold.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Reporting threshold. 98.381 Section 98.381 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Suppliers of Coal-based Liquid Fuels § 98.381 Reporting threshold. Any...

  2. Nonlinear threshold effect in the Z-scan method of characterizing limiters for high-intensity laser light

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tereshchenko, S. A., E-mail: tsa@miee.ru; Savelyev, M. S.; Podgaetsky, V. M.

    A threshold model is described which permits one to determine the properties of limiters for high-powered laser light. It takes into account the threshold characteristics of the nonlinear optical interaction between the laser beam and the limiter working material. The traditional non-threshold model is a particular case of the threshold model in which the limiting threshold is zero. The nonlinear characteristics of carbon nanotubes in liquid and solid media are obtained from experimental Z-scan data. Specifically, the nonlinear threshold effect was observed for aqueous dispersions of nanotubes, but not for nanotubes in solid polymethylmethacrylate. The threshold model fits the experimental Z-scan data better than the non-threshold model. Output characteristics were obtained that integrally describe the nonlinear properties of the optical limiters.

  3. Rejection Thresholds in Solid Chocolate-Flavored Compound Coating

    PubMed Central

    Harwood, Meriel L.; Ziegler, Gregory R.; Hayes, John E.

    2012-01-01

    Classical detection thresholds do not predict liking, as they focus on the presence or absence of a sensation. Recently however, Prescott and colleagues described a new method, the rejection threshold, where a series of forced choice preference tasks are used to generate a dose-response function to determine hedonically acceptable concentrations. That is, how much is too much? To date, this approach has been used exclusively in liquid foods. Here, we determined group rejection thresholds in solid chocolate-flavored compound coating for bitterness. The influences of self-identified preferences for milk or dark chocolate, as well as eating style (chewers versus melters) on rejection thresholds were investigated. Stimuli included milk chocolate-flavored compound coating spiked with increasing amounts of sucrose octaacetate (SOA), a bitter GRAS additive. Paired preference tests (blank vs. spike) were used to determine the proportion of the group that preferred the blank. Across pairs, spiked samples were presented in ascending concentration. We were able to quantify and compare differences between two self-identified market segments. The rejection threshold for the dark chocolate preferring group was significantly higher than the milk chocolate preferring group (p = 0.01). Conversely, eating style did not affect group rejection thresholds (p = 0.14), although this may reflect the amount of chocolate given to participants. Additionally, there was no association between chocolate preference and eating style (p = 0.36). Present work supports the contention that this method can be used to examine preferences within specific market segments and potentially individual differences as they relate to ingestive behavior. PMID:22924788

  4. Cancer risk at low doses of ionizing radiation: artificial neural networks inference from atomic bomb survivors

    PubMed Central

    Sasaki, Masao S.; Tachibana, Akira; Takeda, Shunichi

    2014-01-01

    Cancer risk at low doses of ionizing radiation remains poorly defined because of ambiguity in the quantitative link to doses below 0.2 Sv in atomic bomb survivors in Hiroshima and Nagasaki arising from limitations in the statistical power and information available on overall radiation dose. To deal with these difficulties, a novel nonparametric statistics based on the ‘integrate-and-fire’ algorithm of artificial neural networks was developed and tested in cancer databases established by the Radiation Effects Research Foundation. The analysis revealed unique features at low doses that could not be accounted for by nominal exposure dose, including (i) the presence of a threshold that varied with organ, gender and age at exposure, and (ii) a small but significant bumping increase in cancer risk at low doses in Nagasaki that probably reflects internal exposure to 239Pu. The threshold was distinct from the canonical definition of zero effect in that it was manifested as negative excess relative risk, or suppression of background cancer rates. Such a unique tissue response at low doses of radiation exposure has been implicated in the context of the molecular basis of radiation–environment interplay in favor of recently emerging experimental evidence on DNA double-strand break repair pathway choice and its epigenetic memory by histone marking. PMID:24366315

  5. Cancer risk at low doses of ionizing radiation: artificial neural networks inference from atomic bomb survivors.

    PubMed

    Sasaki, Masao S; Tachibana, Akira; Takeda, Shunichi

    2014-05-01

    Cancer risk at low doses of ionizing radiation remains poorly defined because of ambiguity in the quantitative link to doses below 0.2 Sv in atomic bomb survivors in Hiroshima and Nagasaki arising from limitations in the statistical power and information available on overall radiation dose. To deal with these difficulties, a novel nonparametric statistics based on the 'integrate-and-fire' algorithm of artificial neural networks was developed and tested in cancer databases established by the Radiation Effects Research Foundation. The analysis revealed unique features at low doses that could not be accounted for by nominal exposure dose, including (i) the presence of a threshold that varied with organ, gender and age at exposure, and (ii) a small but significant bumping increase in cancer risk at low doses in Nagasaki that probably reflects internal exposure to (239)Pu. The threshold was distinct from the canonical definition of zero effect in that it was manifested as negative excess relative risk, or suppression of background cancer rates. Such a unique tissue response at low doses of radiation exposure has been implicated in the context of the molecular basis of radiation-environment interplay in favor of recently emerging experimental evidence on DNA double-strand break repair pathway choice and its epigenetic memory by histone marking.

  6. When is rational to order a diagnostic test, or prescribe treatment: the threshold model as an explanation of practice variation.

    PubMed

    Djulbegovic, Benjamin; van den Ende, Jef; Hamm, Robert M; Mayrhofer, Thomas; Hozo, Iztok; Pauker, Stephen G

    2015-05-01

    The threshold model represents an important advance in the field of medical decision-making. It is a linchpin between evidence (which exists on a continuum of credibility) and decision-making (which is a categorical exercise - we decide to act or not act). The threshold concept is closely related to the question of rational decision-making. When should the physician act, that is, order a diagnostic test or prescribe treatment? The threshold model embodies the decision-theoretic rationality that says the most rational decision is to prescribe treatment when the expected treatment benefit outweighs its expected harms. However, the well-documented large variation in the way physicians order diagnostic tests or decide to administer treatments is consistent with the notion that physicians' individual action thresholds vary. We present a narrative review summarizing the existing literature on physicians' use of a threshold strategy for decision-making. We found that the observed variation in decision action thresholds is partially due to the way people integrate benefits and harms. That is, explanation of variation in clinical practice can be reduced to a consideration of thresholds. Limited evidence suggests that non-expected utility threshold (non-EUT) models, such as regret-based and dual-processing models, may explain current medical practice better. However, inclusion of costs and recognition of risk attitudes towards uncertain treatment effects and comorbidities may improve the explanatory and predictive value of the EUT-based threshold models. The decision when to act is closely related to the question of rational choice. We conclude that the medical community has not yet fully defined criteria for rational clinical decision-making. The traditional notion of rationality rooted in EUT may need to be supplemented by reflective rationality, which strives to integrate all aspects of medical practice - medical, humanistic and socio-economic - within a coherent

  7. The Prevalence and Clinical Features of Non-responsive Gastroesophageal Reflux Disease to Practical Proton Pump Inhibitor Dose in Korea: A Multicenter Study.

    PubMed

    Park, Hong Jun; Park, Soo Heon; Shim, Ki Nam; Kim, Yong Sung; Kim, Hyun Jin; Han, Jae Pil; Kim, Yong Sik; Bang, Byoung Wook; Kim, Gwang Ha; Baik, Gwang Ho; Kim, Hyung Hun; Park, Seon Young; Kim, Sung Soo

    2016-07-25

    In Korea, there are no multicenter data available concerning the prevalence of, or diagnostic approaches for, gastroesophageal reflux disease (GERD) that does not respond to a practical dose of proton pump inhibitors (PPIs). The purpose of this study was to evaluate the prevalence and symptom pattern of non-responsive GERD. A total of 12 hospitals belonging to a Korean GERD research group joined this study. We used the composite score (CS) as a reflux symptom scale; it is a standardized questionnaire based on the frequency and severity of typical GERD symptoms. We defined "non-responsive GERD" as follows: a subject with erosive reflux disease (ERD) whose CS did not decrease by at least 50% after standard-dose PPIs for 8 weeks, or a subject with non-erosive reflux disease (NERD) whose CS did not decrease by at least 50% after half-dose PPIs for 4 weeks. A total of 234 subjects were analyzed. Among them, 87 and 147 were confirmed to have ERD and NERD, respectively. The prevalence of non-responsive GERD was 26.9% (63/234). The rates of non-responsive GERD did not differ between the ERD and NERD groups (25.3% vs. 27.9%, respectively, p=0.664). There were no differences between the non-responsive and responsive GERD groups in sex (p=0.659), age (p=0.134), or BMI (p=0.209). However, the initial CS for epigastric pain and fullness was higher in the non-responsive GERD group (p=0.044 and p=0.014, respectively). In conclusion, this multicenter Korean study showed that the rate of non-responsive GERD was substantially high, at 26.9%. In addition, patients with non-responsive GERD frequently showed dyspeptic symptoms such as epigastric pain and fullness.

  8. Ultra-low dose naltrexone enhances cannabinoid-induced antinociception.

    PubMed

    Paquette, Jay; Olmstead, Mary C

    2005-12-01

    Both opioids and cannabinoids have inhibitory effects at micromolar doses, which are mediated by activated receptors coupling to Gi/o-proteins. Surprisingly, the analgesic effects of opioids are enhanced by ultra-low doses (nanomolar to picomolar) of the opioid antagonist, naltrexone. As opioid and cannabinoid systems interact, this study investigated whether ultra-low dose naltrexone also influences cannabinoid-induced antinociception. Separate groups of Long-Evans rats were tested for antinociception following an injection of vehicle, a sub-maximal dose of the cannabinoid agonist WIN 55 212-2, naltrexone (an ultra-low or a high dose) or a combination of WIN 55 212-2 and naltrexone doses. Tail-flick latencies were recorded for 3 h, at 10-min intervals for the first hour, and at 15-min intervals thereafter. Ultra-low dose naltrexone elevated WIN 55 212-2-induced tail flick thresholds without extending its duration of action. This enhancement was replicated in animals receiving intraperitoneal or intravenous injections. A high dose of naltrexone had no effect on WIN 55 212-2-induced tail flick latencies, but a high dose of the cannabinoid 1 receptor antagonist SR 141716 blocked the elevated tail-flick thresholds produced by WIN 55 212-2+ultra-low dose naltrexone. These data suggest a mechanism of cannabinoid-opioid interaction whereby activated opioid receptors that couple to Gs-proteins may attenuate cannabinoid-induced antinociception and/or motor functioning.

  9. Threshold of toxicological concern values for non-genotoxic effects in industrial chemicals: re-evaluation of the Cramer classification.

    PubMed

    Kalkhof, H; Herzler, M; Stahlmann, R; Gundert-Remy, U

    2012-01-01

    The TTC concept employs available data from animal testing to derive a distribution of NOAELs. Taking a probabilistic view, the 5th percentile of the distribution is taken as a threshold value for toxicity. In this paper, we use 824 NOAELs from repeated dose toxicity studies of industrial chemicals to re-evaluate the currently employed TTC values, which have been derived for substances grouped according to the Cramer scheme (Cramer et al. in Food Cosm Toxicol 16:255-276, 1978) by Munro et al. (Food Chem Toxicol 34:829-867, 1996) and refined by Kroes and Kozianowski (Toxicol Lett 127:43-46, 2002) and Kroes et al. (2000). In our data set, consisting of 756 NOAELs from 28-day repeated dose testing and 57 NOAELs from 90-day repeated dose testing, the experimental NOAELs had to be extrapolated to chronic TTC values using regulatorily accepted extrapolation factors. The TTC values derived from our data set were higher than the currently used TTC values, confirming the safety of the latter. We analysed the predictions of the Cramer classification by comparing them with the guidance values for classification according to the Globally Harmonised System of classification and labelling of the United Nations (GHS). Nearly 90% of the chemicals were in Cramer class 3 and thus assumed to be highly toxic, compared with 22% according to the GHS. The Cramer classification underestimates the toxicity of chemicals in only 4.6% of cases. Hence, from a regulatory perspective, the Cramer classification scheme may be applied, as it overestimates the hazard of a chemical.
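    The probabilistic step of the TTC derivation, taking the 5th percentile of chronic-extrapolated NOAELs, can be sketched as below. The assessment factors used here (6 for 28-day studies, 2 for 90-day studies) are commonly cited regulatory defaults and, like the function names, should be treated as assumptions for illustration rather than the paper's exact procedure.

```python
def percentile(values, q):
    """Linear-interpolation (inclusive) percentile; q in [0, 1]."""
    s = sorted(values)
    pos = q * (len(s) - 1)
    lo = int(pos)
    frac = pos - lo
    if lo + 1 >= len(s):
        return float(s[-1])
    return s[lo] * (1.0 - frac) + s[lo + 1] * frac

def chronic_ttc(noaels_28d, noaels_90d, af_28d=6.0, af_90d=2.0):
    """5th percentile of the chronic-extrapolated NOAEL distribution.

    NOAELs are in mg/kg bw/day; subacute (28-day) and subchronic (90-day)
    values are divided by assumed time-extrapolation assessment factors
    before pooling, mirroring the extrapolation step described above."""
    chronic = [x / af_28d for x in noaels_28d] + [x / af_90d for x in noaels_90d]
    return percentile(chronic, 0.05)
```

    The 5th percentile means that, for a randomly chosen substance of the same class, the derived threshold is expected to be protective in about 95% of cases.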

  10. Peak skin and eye lens radiation dose from brain perfusion CT based on Monte Carlo simulation.

    PubMed

    Zhang, Di; Cagnon, Chris H; Villablanca, J Pablo; McCollough, Cynthia H; Cody, Dianna D; Stevens, Donna M; Zankl, Maria; Demarco, John J; Turner, Adam C; Khatonabadi, Maryam; McNitt-Gray, Michael F

    2012-02-01

    The purpose of our study was to accurately estimate the radiation dose to skin and the eye lens from clinical CT brain perfusion studies, investigate how well scanner output (expressed as volume CT dose index [CTDI(vol)]) matches these estimated doses, and investigate the efficacy of eye lens dose reduction techniques. Peak skin dose and eye lens dose were estimated using Monte Carlo simulation methods on a voxelized patient model and 64-MDCT scanners from four major manufacturers. A range of clinical protocols was evaluated. CTDI(vol) for each scanner was obtained from the scanner console. Dose reduction to the eye lens was evaluated for various gantry tilt angles as well as scan locations. Peak skin dose and eye lens dose ranged from 81 mGy to 348 mGy, depending on the scanner and protocol used. Peak skin dose and eye lens dose were observed to be 66-79% and 59-63%, respectively, of the CTDI(vol) values reported by the scanners. The eye lens dose was significantly reduced when the eye lenses were not directly irradiated. CTDI(vol) should not be interpreted as patient dose; this study has shown it to overestimate dose to the skin or eye lens. These results may be used to provide more accurate estimates of actual dose to ensure that protocols are operated safely below thresholds. Tilting the gantry or moving the scanning region further away from the eyes are effective for reducing lens dose in clinical practice. These actions should be considered when they are consistent with the clinical task and patient anatomy.

  11. Multi-threshold de-noising of electrical imaging logging data based on the wavelet packet transform

    NASA Astrophysics Data System (ADS)

    Xie, Fang; Xiao, Chengwen; Liu, Ruilin; Zhang, Lili

    2017-08-01

    A key problem of effectiveness evaluation for fractured-vuggy carbonatite reservoir is how to accurately extract fracture and vug information from electrical imaging logging data. Drill bits quaked during drilling and resulted in rugged surfaces of borehole walls and thus conductivity fluctuations in electrical imaging logging data. The occurrence of the conductivity fluctuations (formation background noise) directly affects the fracture/vug information extraction and reservoir effectiveness evaluation. We present a multi-threshold de-noising method based on wavelet packet transform to eliminate the influence of rugged borehole walls. The noise is present as fluctuations in button-electrode conductivity curves and as pockmarked responses in electrical imaging logging static images. The noise has responses in various scales and frequency ranges and has low conductivity compared with fractures or vugs. Our de-noising method is to decompose the data into coefficients with wavelet packet transform on a quadratic spline basis, then shrink high-frequency wavelet packet coefficients in different resolutions with minimax threshold and hard-threshold function, and finally reconstruct the thresholded coefficients. We use electrical imaging logging data collected from fractured-vuggy Ordovician carbonatite reservoir in Tarim Basin to verify the validity of the multi-threshold de-noising method. Segmentation results and extracted parameters are shown as well to prove the effectiveness of the de-noising procedure.

  12. SU-E-T-436: Fluence-Based Trajectory Optimization for Non-Coplanar VMAT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smyth, G; Bamber, JC; Bedford, JL

    2015-06-15

    Purpose: To investigate a fluence-based trajectory optimization technique for non-coplanar VMAT for brain cancer. Methods: Single-arc non-coplanar VMAT trajectories were determined using a heuristic technique for five patients. Organ at risk (OAR) volume intersected during raytracing was minimized for two cases: absolute volume and the sum of relative volumes weighted by OAR importance. These trajectories and coplanar VMAT formed starting points for the fluence-based optimization method. Iterative least squares optimization was performed on control points 24° apart in gantry rotation. Optimization minimized the root-mean-square (RMS) deviation of PTV dose from the prescription (relative importance 100), maximum dose to the brainstem (10), optic chiasm (5), globes (5) and optic nerves (5), plus mean dose to the lenses (5), hippocampi (3), temporal lobes (2), cochleae (1) and brain excluding other regions of interest (1). Control point couch rotations were varied in steps of up to 10° and accepted if the cost function improved. Final treatment plans were optimized with the same objectives in an in-house planning system and evaluated using a composite metric - the sum of optimization metrics weighted by importance. Results: The composite metric decreased with fluence-based optimization in 14 of the 15 plans. In the remaining case its overall value, and the PTV and OAR components, were unchanged but the balance of OAR sparing differed. PTV RMS deviation was improved in 13 cases and unchanged in two. The OAR component was reduced in 13 plans. In one case the OAR component increased but the composite metric decreased - a 4 Gy increase in OAR metrics was balanced by a reduction in PTV RMS deviation from 2.8% to 2.6%. Conclusion: Fluence-based trajectory optimization improved plan quality as defined by the composite metric. While dose differences were case specific, fluence-based optimization improved both PTV and OAR dosimetry in 80% of cases.

  13. FREQUENCY OF WOUND INFECTION IN NON-PERFORATED APPENDICITIS WITH USE OF SINGLE DOSE PREOPERATIVE ANTIBIOTICS.

    PubMed

    Ali, Kishwar; Latif, Humera; Ahmad, Sajjad

    2015-01-01

    Antibiotics are used both pre and post-operatively in acute appendicitis for preventing wound infection. It has been observed that the routine use of post-operative antibiotics is not necessary in cases of non-perforated appendicitis as only prophylactic antibiotics are sufficient to prevent wound infection. The aim of this study was to see the frequency of wound infection in non-perforated appendicitis with single dose preoperative antibiotics only. This observational study was conducted at the Department of Surgery, Ayub Medical College, Abbottabad from May to November 2014. A total of 121 patients with non-perforated appendicitis were included in the study. Only single dose preoperative antibiotics were used. The patients were followed for wound infection till 8th post-operative day. 121 patients, 56 (46.28%) male and 65 (53.72%) female were included in the study. The mean age of patients was 27.41 +/- 7.12 years with an age range of 18 to 45 years. In the entire series, 7 (5.78%) patients developed wound infection. The infection was minor which settled with conservative therapy. Prophylactic antibiotics were found efficacious in 114 (94.21%) patients. There was no significant association between wound infection and age and gender. Single dose preoperative antibiotics were found effective in controlling post-operative wound infection without the need of extending the antibiotics to post-operative period in cases of non-perforated appendicitis.

  14. Ab initio chemical safety assessment: A workflow based on exposure considerations and non-animal methods.

    PubMed

    Berggren, Elisabet; White, Andrew; Ouedraogo, Gladys; Paini, Alicia; Richarz, Andrea-Nicole; Bois, Frederic Y; Exner, Thomas; Leite, Sofia; Grunsven, Leo A van; Worth, Andrew; Mahony, Catherine

    2017-11-01

    We describe and illustrate a workflow for chemical safety assessment that completely avoids animal testing. The workflow, which was developed within the SEURAT-1 initiative, is designed to be applicable to cosmetic ingredients as well as to other types of chemicals, e.g. active ingredients in plant protection products, biocides or pharmaceuticals. The aim of this work was to develop a workflow to assess chemical safety without relying on any animal testing, but instead constructing a hypothesis based on existing data, in silico modelling, biokinetic considerations and then by targeted non-animal testing. For illustrative purposes, we consider a hypothetical new ingredient x as a new component in a body lotion formulation. The workflow is divided into tiers in which points of departure are established through in vitro testing and in silico prediction, as the basis for estimating a safe external dose in a repeated use scenario. The workflow includes a series of possible exit (decision) points, with increasing levels of confidence, based on the sequential application of the Threshold of Toxicological (TTC) approach, read-across, followed by an "ab initio" assessment, in which chemical safety is determined entirely by new in vitro testing and in vitro to in vivo extrapolation by means of mathematical modelling. We believe that this workflow could be applied as a tool to inform targeted and toxicologically relevant in vitro testing, where necessary, and to gain confidence in safety decision making without the need for animal testing.

  15. SU-E-T-56: A Novel Approach to Computing Expected Value and Variance of Point Dose From Non-Gated Radiotherapy Delivery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, S; Zhu, X; Zhang, M

    Purpose: Randomness in patient internal organ motion phase at the beginning of non-gated radiotherapy delivery may introduce uncertainty into the dose received by the patient. Concern about this deviation from the planned dose has motivated many researchers to study the phenomenon, although a unified theoretical framework for computing it is still missing. This study was conducted to develop such a framework for analyzing the effect. Methods: Two reasonable assumptions were made: a) patient internal organ motion is stationary and periodic; b) no special arrangement is made to start a non-gated radiotherapy delivery at any specific phase of patient internal organ motion. A statistical ensemble was formed consisting of the patient’s non-gated radiotherapy deliveries at all equally possible initial organ motion phases. To characterize the patient received dose, the statistical ensemble average method is employed to derive formulae for two variables: the expected value and variance of the dose received by a patient internal point from a non-gated radiotherapy delivery. Fourier series were utilized to facilitate the analysis. Results: According to our formulae, the two variables can be computed from non-gated radiotherapy generated dose rate time sequences at the point’s corresponding locations on fixed phase 3D CT images sampled evenly in time over one patient internal organ motion period. The expected value of point dose is simply the average of the doses to the point’s corresponding locations on the fixed phase CT images. The variance can be determined by time integration in terms of Fourier series coefficients of the dose rate time sequences on the same fixed phase 3D CT images. Conclusion: Given a non-gated radiotherapy delivery plan and a patient’s 4D CT study, our novel approach can predict the expected value and variance of the patient radiation dose. We expect it to play a significant role in determining both the quality and robustness of patient non-gated radiotherapy plans.
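    Under the two stated assumptions, the ensemble average reduces to a simple discrete computation: a delivery starting at phase s accumulates dose from the periodic dose-rate sequence, and the expected value and variance are taken over all equally likely start phases. A discrete-time sketch (names and discretization are ours, for illustration only):

```python
def dose_statistics(rate, duration):
    """Expected value and variance of accumulated point dose over the
    ensemble of equally likely start phases.

    rate[s]: dose rate at motion phase s over one full period (stationary,
             periodic motion); duration: delivery length in the same units."""
    P = len(rate)
    doses = []
    for s in range(P):   # one ensemble member per possible start phase
        doses.append(sum(rate[(s + t) % P] for t in range(duration)))
    mean = sum(doses) / P
    var = sum((d - mean) ** 2 for d in doses) / P
    return mean, var
```

    As a sanity check consistent with the formulae described above, the variance vanishes whenever the delivery duration is an integer multiple of the motion period, since every start phase then accumulates the same dose.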

  16. Hybrid Artificial Root Foraging Optimizer Based Multilevel Threshold for Image Segmentation

    PubMed Central

    Liu, Yang; Liu, Junfei

    2016-01-01

    This paper proposes a new plant-inspired optimization algorithm for multilevel threshold image segmentation, namely, the hybrid artificial root foraging optimizer (HARFO), which essentially mimics iterative root foraging behaviors. In this algorithm, new growth operators of branching, regrowing, and shrinkage are designed to optimize continuous space search by combining root-to-root communication and a coevolution mechanism. With the auxin-regulated scheme, the various root growth operators are guided systematically. With root-to-root communication, individuals exchange information in different efficient topologies, which essentially improves the exploration ability. With the coevolution mechanism, a hierarchical spatial population driven by the evolutionary pressure of multiple subpopulations is structured, which ensures that the diversity of the root population is well maintained. Comparative results on a suite of benchmarks show the superiority of the proposed algorithm. Finally, the proposed HARFO algorithm is applied to the complex image segmentation problem based on multilevel thresholds. Computational results of this approach on a set of test images show the outperformance of the proposed algorithm in terms of optimization accuracy and computation efficiency. PMID:27725826

  17. Positive-negative corresponding normalized ghost imaging based on an adaptive threshold

    NASA Astrophysics Data System (ADS)

    Li, G. L.; Zhao, Y.; Yang, Z. H.; Liu, X.

    2016-11-01

    Ghost imaging (GI) technology has attracted increasing attention as a new imaging technique in recent years. However, the signal-to-noise ratio (SNR) of GI with pseudo-thermal light needs to be improved before it meets engineering application demands. We therefore propose a new scheme called positive-negative correspondence normalized GI based on an adaptive threshold (PCNGI-AT) to achieve a good performance with less amount of data. In this work, we use both the advantages of normalized GI (NGI) and positive-negative correspondence GI (P-NCGI). The correctness and feasibility of the scheme were proved in theory before we designed an adaptive threshold selection method, in which the parameter of object signal selection conditions is replaced by the normalizing value. The simulation and experimental results reveal that the SNR of the proposed scheme is better than that of time-correspondence differential GI (TCDGI), avoiding the calculation of the matrix of correlation and reducing the amount of data used. The method proposed will make GI far more practical in engineering applications.

  18. Hybrid Artificial Root Foraging Optimizer Based Multilevel Threshold for Image Segmentation.

    PubMed

    Liu, Yang; Liu, Junfei; Tian, Liwei; Ma, Lianbo

    2016-01-01

    This paper proposes a new plant-inspired optimization algorithm for multilevel threshold image segmentation, namely, the hybrid artificial root foraging optimizer (HARFO), which essentially mimics iterative root foraging behaviors. In this algorithm, new growth operators of branching, regrowing, and shrinkage are designed to optimize continuous space search by combining root-to-root communication and a coevolution mechanism. With the auxin-regulated scheme, the various root growth operators are guided systematically. With root-to-root communication, individuals exchange information in different efficient topologies, which essentially improves the exploration ability. With the coevolution mechanism, a hierarchical spatial population driven by the evolutionary pressure of multiple subpopulations is structured, which ensures that the diversity of the root population is well maintained. Comparative results on a suite of benchmarks show the superiority of the proposed algorithm. Finally, the proposed HARFO algorithm is applied to the complex image segmentation problem based on multilevel thresholds. Computational results of this approach on a set of test images show the outperformance of the proposed algorithm in terms of optimization accuracy and computation efficiency.

  19. Determination of simple thresholds for accelerometry-based parameters for fall detection.

    PubMed

    Kangas, Maarit; Konttila, Antti; Winblad, Ilkka; Jämsä, Timo

    2007-01-01

    The increasing population of elderly people mainly lives at home and needs applications that support independence and safety. Falls are one of the major health risks affecting quality of life among older adults. Body-attached accelerometers have been used to detect falls, but the placement of the accelerometric sensor and the fall detection algorithms are still under investigation. The aim of the present pilot study was to determine acceleration thresholds for fall detection, using triaxial accelerometric measurements at the waist, wrist, and head. Intentional falls (forward, backward, and lateral) and activities of daily living (ADL) were performed by two volunteer subjects. The results showed that measurements from the waist and head have the potential to distinguish between falls and ADL. In particular, when simple threshold-based detection was combined with posture detection after the fall, the sensitivity and specificity of fall detection were up to 100%. In contrast, the wrist did not appear to be an optimal site for fall detection.
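    A threshold-based detector of the kind studied reduces to comparing the triaxial acceleration sum vector against a cut-off. A minimal sketch, with an illustrative threshold value rather than the thresholds reported in the study:

```python
import math

def exceeds_fall_threshold(ax, ay, az, threshold_g=2.0):
    """Return True if the total acceleration magnitude exceeds a fall
    threshold. ax, ay, az are accelerations in units of g; threshold_g
    is an illustrative value, not one determined by the study."""
    magnitude = math.sqrt(ax ** 2 + ay ** 2 + az ** 2)
    return magnitude > threshold_g

# A static posture reads ~1 g total (gravity only); an impact spike exceeds it.
print(exceeds_fall_threshold(0.0, 0.0, 1.0))   # False
print(exceeds_fall_threshold(1.5, 2.0, 1.0))   # True
```

    In practice such a magnitude trigger is combined with a posture check after the event, as the abstract notes, to reach high sensitivity and specificity.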

  20. An integrative perspective of the anaerobic threshold.

    PubMed

    Sales, Marcelo Magalhães; Sousa, Caio Victor; da Silva Aguiar, Samuel; Knechtle, Beat; Nikolaidis, Pantelis Theodoros; Alves, Polissandro Mortoza; Simões, Herbert Gustavo

    2017-12-14

    The concept of the anaerobic threshold (AT) was introduced during the 1960s. Since then, several methods to identify the AT have been studied and suggested as novel 'thresholds' named after the variable used for detection (i.e. lactate threshold, ventilatory threshold, glucose threshold). These different techniques have caused some confusion about how this parameter should be named: anaerobic threshold, or after the physiological measure used (i.e. lactate, ventilation). On the other hand, the modernization of scientific methods and apparatus to detect the AT, as well as the body of literature formed in the past decades, could provide a more cohesive understanding of the AT and the multiple physiological systems involved. Thus, the purpose of this review was to provide an integrative perspective of the methods to determine the AT. Copyright © 2017 Elsevier Inc. All rights reserved.

  1. Radiation Dose-Volume Effects in the Stomach and Small Bowel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kavanagh, Brian D., E-mail: Brian.Kavanagh@ucdenver.ed; Pan, Charlie C.; Dawson, Laura A.

    2010-03-01

    Published data suggest that the risk of moderately severe (≥Grade 3) radiation-induced acute small-bowel toxicity can be predicted with a threshold model whereby for a given dose level, D, if the volume receiving that dose or greater (VD) exceeds a threshold quantity, the risk of toxicity escalates. Estimates of VD depend on the means of structure segmenting (e.g., V15 = 120 cc if individual bowel loops are outlined or V45 = 195 cc if the entire peritoneal potential space of bowel is outlined). A similar predictive model of acute toxicity is not available for stomach. Late small-bowel/stomach toxicity is likely related to maximum dose and/or volume threshold parameters qualitatively similar to those related to acute toxicity risk. Concurrent chemotherapy has been associated with a higher risk of acute toxicity, and a history of abdominal surgery has been associated with a higher risk of late toxicity.

  2. Threshold-voltage modulated phase change heterojunction for application of high density memory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yan, Baihan; Tong, Hao, E-mail: tonghao@hust.edu.cn; Qian, Hang

    2015-09-28

    Phase change random access memory is one of the most important candidates for the next generation non-volatile memory technology. However, the ability to reduce its memory size is compromised by the fundamental limitations inherent in the CMOS technology. While 0T1R configuration without any additional access transistor shows great advantages in improving the storage density, the leakage current and small operation window limit its application in large-scale arrays. In this work, phase change heterojunction based on GeTe and n-Si is fabricated to address those problems. The relationship between threshold voltage and doping concentration is investigated, and energy band diagrams and X-ray photoelectron spectroscopy measurements are provided to explain the results. The threshold voltage is modulated to provide a large operational window based on this relationship. The switching performance of the heterojunction is also tested, showing a good reverse characteristic, which could effectively decrease the leakage current. Furthermore, a reliable read-write-erase function is achieved during the tests. Phase change heterojunction is proposed for high-density memory, showing some notable advantages, such as modulated threshold voltage, large operational window, and low leakage current.

  3. Determinants of Change in the Cost-effectiveness Threshold.

    PubMed

    Paulden, Mike; O'Mahony, James; McCabe, Christopher

    2017-02-01

    The cost-effectiveness threshold in health care systems with a constrained budget should be determined by the cost-effectiveness of displacing health care services to fund new interventions. Using comparative statics, we review some potential determinants of the threshold, including the budget for health care, the demand for existing health care interventions, the technical efficiency of existing interventions, and the development of new health technologies. We consider the anticipated direction of impact that would affect the threshold following a change in each of these determinants. Where the health care system is technically efficient, an increase in the health care budget unambiguously raises the threshold, whereas an increase in the demand for existing, non-marginal health interventions unambiguously lowers the threshold. Improvements in the technical efficiency of existing interventions may raise or lower the threshold, depending on the cause of the improvement in efficiency, whether the intervention is already funded, and, if so, whether it is marginal. New technologies may also raise or lower the threshold, depending on whether the new technology is a substitute for an existing technology and, again, whether the existing technology is marginal. Our analysis permits health economists and decision makers to assess if and in what direction the threshold may change over time. This matters, as threshold changes impact the cost-effectiveness of interventions that require decisions now but have costs and effects that fall in future periods.

  4. Vitamin D supplementation increases calcium absorption without a threshold effect

    USDA-ARS?s Scientific Manuscript database

    The maximal calcium absorption in response to vitamin D has been proposed as a biomarker for vitamin D sufficiency. Our objective was to determine whether there is a threshold beyond which increasing doses of vitamin D, or concentrations of serum 25-hydroxyvitamin D [25(OH)D], no longer increase cal...

  5. Fully automated treatment planning for head and neck radiotherapy using a voxel-based dose prediction and dose mimicking method

    NASA Astrophysics Data System (ADS)

    McIntosh, Chris; Welch, Mattea; McNiven, Andrea; Jaffray, David A.; Purdie, Thomas G.

    2017-08-01

    Recent works in automated radiotherapy treatment planning have used machine learning based on historical treatment plans to infer the spatial dose distribution for a novel patient directly from the planning image. We present a probabilistic, atlas-based approach which predicts the dose for novel patients using a set of automatically selected most similar patients (atlases). The output is a spatial dose objective, which specifies the desired dose-per-voxel, and therefore replaces the need to specify and tune dose-volume objectives. Voxel-based dose mimicking optimization then converts the predicted dose distribution to a complete treatment plan with dose calculation using a collapsed cone convolution dose engine. In this study, we investigated automated planning for right-sided oropharynx head and neck patients treated with IMRT and VMAT. We compare four versions of our dose prediction pipeline using a database of 54 training and 12 independent testing patients by evaluating 14 clinical dose evaluation criteria. Our preliminary results are promising and demonstrate that automated methods can generate dose distributions comparable to clinical plans. Overall, automated plans achieved an average of 0.6% higher dose for target coverage evaluation criteria, and 2.4% lower dose at the organs at risk criteria levels evaluated compared with clinical. There was no statistically significant difference detected in high-dose conformity between automated and clinical plans as measured by the conformation number. Automated plans achieved nine more unique criteria than clinical across the 12 patients tested and automated plans scored a significantly higher dose at the evaluation limit for two high-risk target coverage criteria and a significantly lower dose in one critical organ maximum dose. The novel dose prediction method with dose mimicking can generate complete treatment plans in 12-13 min without user interaction. It is a promising approach for fully automated treatment

  6. Fully automated treatment planning for head and neck radiotherapy using a voxel-based dose prediction and dose mimicking method.

    PubMed

    McIntosh, Chris; Welch, Mattea; McNiven, Andrea; Jaffray, David A; Purdie, Thomas G

    2017-07-06

    Recent works in automated radiotherapy treatment planning have used machine learning based on historical treatment plans to infer the spatial dose distribution for a novel patient directly from the planning image. We present a probabilistic, atlas-based approach which predicts the dose for novel patients using a set of automatically selected most similar patients (atlases). The output is a spatial dose objective, which specifies the desired dose-per-voxel, and therefore replaces the need to specify and tune dose-volume objectives. Voxel-based dose mimicking optimization then converts the predicted dose distribution to a complete treatment plan with dose calculation using a collapsed cone convolution dose engine. In this study, we investigated automated planning for right-sided oropharynx head and neck patients treated with IMRT and VMAT. We compare four versions of our dose prediction pipeline using a database of 54 training and 12 independent testing patients by evaluating 14 clinical dose evaluation criteria. Our preliminary results are promising and demonstrate that automated methods can generate dose distributions comparable to clinical plans. Overall, automated plans achieved an average of 0.6% higher dose for target coverage evaluation criteria, and 2.4% lower dose at the organs at risk criteria levels evaluated compared with clinical. There was no statistically significant difference detected in high-dose conformity between automated and clinical plans as measured by the conformation number. Automated plans achieved nine more unique criteria than clinical across the 12 patients tested and automated plans scored a significantly higher dose at the evaluation limit for two high-risk target coverage criteria and a significantly lower dose in one critical organ maximum dose. The novel dose prediction method with dose mimicking can generate complete treatment plans in 12-13 min without user interaction. It is a promising approach for fully automated treatment

  7. SU-G-BRB-14: Uncertainty of Radiochromic Film Based Relative Dose Measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Devic, S; Tomic, N; DeBlois, F

    2016-06-15

    Purpose: Due to its inherently non-linear dose response, measurement of relative dose distributions with radiochromic film requires measurement of absolute dose using a calibration curve, following a previously established reference dosimetry protocol. On the other hand, a functional form that converts the inherently non-linear dose response curve of the radiochromic film dosimetry system into a linear one has been proposed recently [Devic et al, Med. Phys. 39 4850–4857 (2012)]. However, the question remains what the uncertainty of relative dose measured this way would be. Methods: If the relative dose distribution is determined through the reference dosimetry system (conversion of the response into absolute dose using the calibration curve), the total uncertainty of the relative dose is calculated by summing in quadrature the total uncertainties of the doses measured at a given point and at the reference point. On the other hand, if the relative dose is determined using the linearization method, the new response variable is calculated as ζ = a(netOD)^n/ln(netOD). In this case, the total uncertainty in relative dose is calculated by summing in quadrature the uncertainties of the new response function (σζ) at a given point and at the reference point. Results: Except at very low doses, where the measurement uncertainty dominates, the total relative dose uncertainty is less than 1% for the linear response method, compared to an almost 2% uncertainty level for the reference dosimetry method. The result is not surprising, having in mind that the total uncertainty of the reference dose method is dominated by the fitting uncertainty, which is mitigated in the case of the linearization method. Conclusion: Linearization of the radiochromic film dose response provides a convenient and more precise method for relative dose measurements, as it does not require reference dosimetry and the creation of a calibration curve. However, the linearity of the newly introduced function must be verified.
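    The two quantities described above are straightforward to express in code. This sketch uses the abstract's response function with placeholder fit parameters a and n (the actual values come from a calibration fit), and the quadrature rule for the relative-dose uncertainty:

```python
import math

def zeta(net_od, a=1.0, n=1.0):
    """Linearized film response zeta = a * (netOD)^n / ln(netOD), the
    functional form quoted in the abstract. The fit parameters a and n
    are placeholders here; real values come from calibration."""
    return a * net_od ** n / math.log(net_od)

def relative_dose_uncertainty(sigma_point, sigma_ref):
    """Total relative-dose uncertainty: the uncertainties at the point of
    interest and at the reference point, summed in quadrature."""
    return math.sqrt(sigma_point ** 2 + sigma_ref ** 2)
```

    The same quadrature rule applies to either pathway; what changes is whether the per-point sigma comes from the calibration-curve dose or from the linearized response σζ.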

  8. Method of predicting the mean lung dose based on a patient's anatomy and dose-volume histograms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zawadzka, Anna, E-mail: a.zawadzka@zfm.coi.pl; Nesteruk, Marta; Department of Radiation Oncology, University Hospital Zurich and University of Zurich, Zurich

    The aim of this study was to propose a method to predict the minimum achievable mean lung dose (MLD) and corresponding dosimetric parameters for organs-at-risk (OAR) based on individual patient anatomy. For each patient, the dose for 36 equidistant individual multileaf collimator shaped fields in the treatment planning system (TPS) was calculated. Based on these dose matrices, the MLD for each patient was predicted by the homemade DosePredictor software in which the solution of linear equations was implemented. The software prediction results were validated based on 3D conformal radiotherapy (3D-CRT) and volumetric modulated arc therapy (VMAT) plans previously prepared for 16 patients with stage III non–small-cell lung cancer (NSCLC). For each patient, dosimetric parameters derived from plans and the results calculated by DosePredictor were compared. The MLD, the maximum dose to the spinal cord (Dmax,cord) and the mean esophageal dose (MED) were analyzed. There was a strong correlation between the MLD calculated by the DosePredictor and those obtained in treatment plans regardless of the technique used. The correlation coefficient was 0.96 for both 3D-CRT and VMAT techniques. In a similar manner, MED correlations of 0.98 and 0.96 were obtained for 3D-CRT and VMAT plans, respectively. The maximum dose to the spinal cord was not predicted very well. The correlation coefficient was 0.30 and 0.61 for 3D-CRT and VMAT, respectively. The presented method allows us to predict the minimum MLD and corresponding dosimetric parameters to OARs without the necessity of plan preparation. The method can serve as a guide during the treatment planning process, for example, as initial constraints in VMAT optimization. It allows the probability of lung pneumonitis to be predicted.

  9. Comparison of image segmentation of lungs using methods: connected threshold, neighborhood connected, and threshold level set segmentation

    NASA Astrophysics Data System (ADS)

    Amanda, A. R.; Widita, R.

    2016-03-01

    The aim of this research is to compare image segmentation methods for the lungs based on performance evaluation parameters (Mean Square Error (MSE) and Peak Signal-to-Noise Ratio (PSNR)). The methods compared were connected threshold, neighborhood connected, and threshold level set segmentation on images of the lungs. These three methods require one important parameter, i.e. the threshold. The threshold interval was obtained from the histogram of the original image. The software used to segment the images was InsightToolkit-4.7.0 (ITK). Five lung images were analyzed, and the results were compared using the performance evaluation parameters computed in MATLAB. A segmentation method is considered to have good quality if it yields the smallest MSE value and the highest PSNR. The results show that four of the sample images were best served by the connected threshold method, while one favored threshold level set segmentation. Therefore, it can be concluded that the connected threshold method is better than the other two methods for these cases.
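    The two evaluation parameters used for the comparison can be computed directly. A minimal sketch in pure Python (the study used MATLAB) for 8-bit grayscale images given as nested lists:

```python
import math

def mse(a, b):
    """Mean squared error between two equal-sized grayscale images."""
    diffs = [(pa - pb) ** 2 for ra, rb in zip(a, b) for pa, pb in zip(ra, rb)]
    return sum(diffs) / len(diffs)

def psnr(a, b, max_val=255.0):
    """Peak Signal-to-Noise Ratio in dB; higher means a closer match."""
    m = mse(a, b)
    return float("inf") if m == 0 else 10 * math.log10(max_val ** 2 / m)

# A result differing by 5 in one pixel beats one differing by 50.
ref = [[0, 255], [255, 0]]
close, far = [[5, 255], [255, 0]], [[50, 255], [255, 0]]
```

    By the paper's criterion, the segmentation whose result has the smallest MSE (equivalently, the largest PSNR) against the reference is the best of the three methods.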

  10. Voxel-based population analysis for correlating local dose and rectal toxicity in prostate cancer radiotherapy

    NASA Astrophysics Data System (ADS)

    Acosta, Oscar; Drean, Gael; Ospina, Juan D.; Simon, Antoine; Haigron, Pascal; Lafond, Caroline; de Crevoisier, Renaud

    2013-04-01

    The majority of current models utilized for predicting toxicity in prostate cancer radiotherapy are based on dose-volume histograms. One of their main drawbacks is the lack of spatial accuracy, since they consider the organs as a whole volume and thus ignore the heterogeneous intra-organ radio-sensitivity. In this paper, we propose a dose-image-based framework to reveal the relationships between local dose and toxicity. In this approach, the three-dimensional (3D) planned dose distributions across a population are non-rigidly registered into a common coordinate system and compared at a voxel level, therefore enabling the identification of 3D anatomical patterns, which may be responsible for toxicity, at least to some extent. Additionally, different metrics were employed in order to assess the quality of the dose mapping. The value of this approach was demonstrated by prospectively analyzing rectal bleeding (⩾Grade 1 at 2 years) according to the CTCAE v3.0 classification in a series of 105 patients receiving 80 Gy to the prostate by intensity modulated radiation therapy (IMRT). Within the patients presenting bleeding, a significant dose excess (6 Gy on average, p < 0.01) was found in a region of the anterior rectal wall. This region, close to the prostate (1 cm), represented less than 10% of the rectum. This promising voxel-wise approach allowed subregions to be defined within the organ that may be involved in toxicity and, as such, must be considered during the inverse IMRT planning step.

  11. A framework for organ dose estimation in x-ray angiography and interventional radiology based on dose-related data in DICOM structured reports

    NASA Astrophysics Data System (ADS)

    Omar, Artur; Bujila, Robert; Fransson, Annette; Andreo, Pedro; Poludniowski, Gavin

    2016-04-01

    Although interventional x-ray angiography (XA) procedures involve relatively high radiation doses that can lead to deterministic tissue reactions in addition to stochastic effects, convenient and accurate estimation of absorbed organ doses has traditionally been out of reach. This has mainly been due to the absence of practical means to access dose-related data that describe the physical context of the numerous exposures during an XA procedure. The present work provides a comprehensive and general framework for the determination of absorbed organ dose, based on non-proprietary access to dose-related data by utilizing widely available DICOM radiation dose structured reports. The framework comprises a straightforward calculation workflow to determine the incident kerma and reconstruction of the geometrical relation between the projected x-ray beam and the patient’s anatomy. The latter is difficult in practice, as the position of the patient on the table top is unknown. A novel patient-specific approach for reconstruction of the patient position on the table is presented. The proposed approach was evaluated for 150 patients by comparing the estimated position of the primary irradiated organs (the target organs) with their position in clinical DICOM images. The approach is shown to locate the target organ position with a mean (max) deviation of 1.3 (4.3), 1.8 (3.6) and 1.4 (2.9) cm for neurovascular, adult and paediatric cardiovascular procedures, respectively. To illustrate the utility of the framework for systematic and automated organ dose estimation in routine clinical practice, a prototype implementation of the framework with Monte Carlo simulations is included.

  12. Assessing regional and interspecific variation in threshold responses of forest breeding birds through broad scale analyses.

    PubMed

    van der Hoek, Yntze; Renfrew, Rosalind; Manne, Lisa L

    2013-01-01

    Identifying persistence and extinction thresholds in species-habitat relationships is a major focal point of ecological research and conservation. However, one major concern regarding the incorporation of threshold analyses in conservation is the lack of knowledge on the generality and transferability of results across species and regions. We present a multi-region, multi-species approach of modeling threshold responses, which we use to investigate whether threshold effects are similar across species and regions. We modeled local persistence and extinction dynamics of 25 forest-associated breeding birds based on detection/non-detection data, which were derived from repeated breeding bird atlases for the state of Vermont. We did not find threshold responses to be particularly well-supported, with 9 species supporting extinction thresholds and 5 supporting persistence thresholds. This contrasts with a previous study based on breeding bird atlas data from adjacent New York State, which showed that most species support persistence and extinction threshold models (15 and 22 of 25 study species respectively). In addition, species that supported a threshold model in both states had associated average threshold estimates of 61.41% (SE = 6.11, persistence) and 66.45% (SE = 9.15, extinction) in New York, compared to 51.08% (SE = 10.60, persistence) and 73.67% (SE = 5.70, extinction) in Vermont. Across species, thresholds were found at 19.45-87.96% forest cover for persistence and 50.82-91.02% for extinction dynamics. Through an approach that allows for broad-scale comparisons of threshold responses, we show that species vary in their threshold responses with regard to habitat amount, and that differences between even nearby regions can be pronounced. We present both ecological and methodological factors that may contribute to the different model results, but propose that regardless of the reasons behind these differences, our results merit a warning that

  13. Assessing Regional and Interspecific Variation in Threshold Responses of Forest Breeding Birds through Broad Scale Analyses

    PubMed Central

    van der Hoek, Yntze; Renfrew, Rosalind; Manne, Lisa L.

    2013-01-01

    Background Identifying persistence and extinction thresholds in species-habitat relationships is a major focal point of ecological research and conservation. However, one major concern regarding the incorporation of threshold analyses in conservation is the lack of knowledge on the generality and transferability of results across species and regions. We present a multi-region, multi-species approach of modeling threshold responses, which we use to investigate whether threshold effects are similar across species and regions. Methodology/Principal Findings We modeled local persistence and extinction dynamics of 25 forest-associated breeding birds based on detection/non-detection data, which were derived from repeated breeding bird atlases for the state of Vermont. We did not find threshold responses to be particularly well-supported, with 9 species supporting extinction thresholds and 5 supporting persistence thresholds. This contrasts with a previous study based on breeding bird atlas data from adjacent New York State, which showed that most species support persistence and extinction threshold models (15 and 22 of 25 study species respectively). In addition, species that supported a threshold model in both states had associated average threshold estimates of 61.41% (SE = 6.11, persistence) and 66.45% (SE = 9.15, extinction) in New York, compared to 51.08% (SE = 10.60, persistence) and 73.67% (SE = 5.70, extinction) in Vermont. Across species, thresholds were found at 19.45–87.96% forest cover for persistence and 50.82–91.02% for extinction dynamics. Conclusions/Significance Through an approach that allows for broad-scale comparisons of threshold responses, we show that species vary in their threshold responses with regard to habitat amount, and that differences between even nearby regions can be pronounced. We present both ecological and methodological factors that may contribute to the different model results, but propose that regardless of the

  14. Laser-induced retinal damage thresholds for annular retinal beam profiles

    NASA Astrophysics Data System (ADS)

    Kennedy, Paul K.; Zuclich, Joseph A.; Lund, David J.; Edsall, Peter R.; Till, Stephen; Stuck, Bruce E.; Hollins, Richard C.

    2004-07-01

    The dependence of retinal damage thresholds on laser spot size, for annular retinal beam profiles, was measured in vivo for 3 μs, 590 nm pulses from a flashlamp-pumped dye laser. Minimum visible lesion (MVL) ED50 thresholds in rhesus monkeys were measured for annular retinal beam profiles covering 5, 10, and 20 mrad of visual field, which correspond to outer beam diameters of roughly 70, 160, and 300 μm, respectively, on the primate retina. Annular beam profiles at the retinal plane were achieved using a telescopic imaging system, with the focal properties of the eye represented as an equivalent thin lens, and all annular beam profiles had a 37% central obscuration. As a check on the experimental data, theoretical MVL-ED50 thresholds for annular beam exposures were calculated using the Thompson-Gerstman granular model of laser-induced thermal damage to the retina. Threshold calculations were performed for the three experimental beam diameters and for an intermediate case with an outer beam diameter of 230 μm. Results indicate that the threshold vs. spot size trends for annular beams are similar to the trends for top hat beams determined in a previous study; i.e., the threshold dose varies with the retinal image area for larger image sizes. The model correctly predicts the threshold vs. spot size trends seen in the biological data, for both annular and top hat retinal beam profiles.

  15. Non-linear absorption pharmacokinetics of amoxicillin: consequences for dosing regimens and clinical breakpoints.

    PubMed

    de Velde, Femke; de Winter, Brenda C M; Koch, Birgit C P; van Gelder, Teun; Mouton, Johan W

    2016-10-01

    To describe the population pharmacokinetics of oral amoxicillin and to compare the PTA of current dosing regimens. Two groups, each with 14 healthy male volunteers, received oral amoxicillin/clavulanic acid tablets on two separate days 1 week apart. One group received 875/125 mg twice daily and 500/125 mg three times daily and the other group 500/125 mg twice daily and 250/125 mg three times daily. A total of 1428 amoxicillin blood samples were collected before and after administration. We analysed the concentration-time profiles using a non-compartmental pharmacokinetic method (PKSolver) and a population pharmacokinetic method (NONMEM). The PTA was computed using Monte Carlo simulations for several dosing regimens. AUC0-24 and Cmax increased non-linearly with dose. The final model included the following components: Savic's transit compartment model, Michaelis-Menten absorption, two distribution compartments and first-order elimination. The mean central volume of distribution was 27.7 L and mean clearance was 21.3 L/h. We included variability for the central volume of distribution (34.4%), clearance (25.8%), transit compartment model parameters and Michaelis-Menten absorption parameters. For 40% fT>MIC and >97.5% PTA, the breakpoints were 0.125 mg/L (500 mg twice daily), 0.25 mg/L (250 mg three times daily and 875 mg twice daily), 0.5 mg/L (500 mg three times daily) and 1 mg/L (750, 875 or 1000 mg three times daily and 500 mg four times daily). The amoxicillin absorption rate appears to be saturable. The PTAs of high-dose as well as twice-daily regimens are less favourable than regimens with lower doses and higher frequency. © The Author 2016. Published by Oxford University Press on behalf of the British Society for Antimicrobial Chemotherapy. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
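    Saturable (Michaelis-Menten) absorption is what makes AUC and Cmax grow sublinearly with dose. The following simplified one-compartment sketch omits the transit and peripheral compartments of the final model; vmax and km are invented placeholders, while the central volume and clearance are the abstract's reported means:

```python
def simulate_cmax(dose_mg, vmax=500.0, km=200.0, v=27.7, cl=21.3,
                  dt=0.01, t_end=12.0):
    """One-compartment model with Michaelis-Menten (saturable) absorption.

    The gut amount is absorbed at rate vmax*A/(km + A) (mg/h); the central
    compartment (volume v L, clearance cl L/h) is eliminated first order.
    vmax and km are illustrative placeholders, not fitted values; the
    transit and peripheral compartments of the full model are omitted.
    Returns Cmax in mg/L via simple Euler integration.
    """
    a_gut, a_central, cmax, t = dose_mg, 0.0, 0.0, 0.0
    while t < t_end:
        absorbed = vmax * a_gut / (km + a_gut) * dt
        a_gut -= absorbed
        a_central += absorbed - (cl / v) * a_central * dt
        cmax = max(cmax, a_central / v)
        t += dt
    return cmax

# Doubling the dose increases Cmax by less than two-fold once
# absorption saturates, mirroring the non-linearity reported above.
low, high = simulate_cmax(500), simulate_cmax(1000)
```

    This sublinearity is why the abstract finds regimens with lower doses given more frequently achieve more favorable PTAs than high-dose, twice-daily regimens.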

  16. Effects of Tiagabine on Slow Wave Sleep and Arousal Threshold in Patients With Obstructive Sleep Apnea.

    PubMed

    Taranto-Montemurro, Luigi; Sands, Scott A; Edwards, Bradley A; Azarbarzin, Ali; Marques, Melania; de Melo, Camila; Eckert, Danny J; White, David P; Wellman, Andrew

    2017-02-01

    Obstructive sleep apnea (OSA) severity is markedly reduced during slow-wave sleep (SWS) even in patients with a severe disease. The reason for this improvement is uncertain but likely relates to non-anatomical factors (i.e. reduced arousability, chemosensitivity, and increased dilator muscle activity). The anticonvulsant tiagabine produces a dose-dependent increase in SWS in subjects without OSA. This study aimed to test the hypothesis that tiagabine would reduce OSA severity by raising the overall arousal threshold during sleep. After a baseline physiology night to assess patients' OSA phenotypic traits, a placebo-controlled, double-blind, crossover trial of tiagabine 12 mg administered before sleep was performed in 14 OSA patients. Under each condition, we assessed the effects on sleep and OSA severity using standard clinical polysomnography. Tiagabine increased slow-wave activity (SWA) of the electroencephalogram (1-4 Hz) compared to placebo (1.8 [0.4] vs. 2.0 [0.5] LogμV2, p = .04) but did not reduce OSA severity (apnea-hypopnea index [AHI] 41.5 [20.3] vs. 39.1 [16.5], p > .5). SWS duration (25 [20] vs. 26 [43] mins, p > .5) and arousal threshold (-26.5 [5.0] vs. -27.6 [5.1] cmH2O, p = .26) were also unchanged between nights. Tiagabine modified sleep microstructure (increase in SWA) but did not change the duration of SWS, OSA severity, or arousal threshold in this group of OSA patients. Based on these findings, tiagabine should not be considered as a therapeutic option for OSA treatment. © Sleep Research Society 2016. Published by Oxford University Press on behalf of the Sleep Research Society. All rights reserved. For permissions, please e-mail journals.permissions@oup.com.

  17. Nut crop yield records show that budbreak-based chilling requirements may not reflect yield decline chill thresholds

    NASA Astrophysics Data System (ADS)

    Pope, Katherine S.; Dose, Volker; Da Silva, David; Brown, Patrick H.; DeJong, Theodore M.

    2015-06-01

    Warming winters due to climate change may critically affect temperate tree species. Insufficiently cold winters are thought to result in fewer viable flower buds and the subsequent development of fewer fruits or nuts, decreasing the yield of an orchard or fecundity of a species. The best existing approximation for a threshold of sufficient cold accumulation, the "chilling requirement" of a species or variety, has been quantified by manipulating or modeling the conditions that result in dormant bud breaking. However, the physiological processes that affect budbreak are not the same as those that determine yield. This study sought to test whether budbreak-based chilling thresholds can reasonably approximate the thresholds that affect yield, particularly regarding the potential impacts of climate change on temperate tree crop yields. County-wide yield records for almond ( Prunus dulcis), pistachio ( Pistacia vera), and walnut ( Juglans regia) in the Central Valley of California were compared with 50 years of weather records. Bayesian nonparametric function estimation was used to model yield potentials at varying amounts of chill accumulation. In almonds, average yields occurred when chill accumulation was close to the budbreak-based chilling requirement. However, in the other two crops, pistachios and walnuts, the best previous estimate of the budbreak-based chilling requirements was 19-32 % higher than the chilling accumulations associated with average or above average yields. This research indicates that physiological processes beyond requirements for budbreak should be considered when estimating chill accumulation thresholds of yield decline and potential impacts of climate change.

  18. Determining lower threshold concentrations for synergistic effects.

    PubMed

    Bjergager, Maj-Britt Andersen; Dalhoff, Kristoffer; Kretschmann, Andreas; Nørgaard, Katrine Banke; Mayer, Philipp; Cedergreen, Nina

    2017-01-01

    Though only occurring rarely, synergistic interactions between chemicals in mixtures have long been a point of focus. Most studies analyzing synergistic interactions used unrealistically high chemical concentrations. The aim of the present study is to determine the threshold concentration below which proven synergists cease to act as synergists towards the aquatic crustacean Daphnia magna. To do this, we compared several approaches and test setups to evaluate which approach gives the most conservative estimate for the lower threshold for synergy for three known azole synergists. We focus on synergistic interactions between the pyrethroid insecticide alpha-cypermethrin and one of the three azole fungicides prochloraz, propiconazole or epoxiconazole, measured on Daphnia magna immobilization. Three different experimental setups were applied: a standard 48-h acute toxicity test, an adapted 48-h test using passive dosing for constant chemical exposure concentrations, and a 14-day test. Synergy was defined as occurring in mixtures where either EC50 values decreased more than two-fold below what was predicted by concentration addition (horizontal assessment) or the fraction of immobile organisms increased more than two-fold above what was predicted by independent action (vertical assessment). All three tests confirmed the hypothesis of the existence of a lower azole threshold concentration below which no synergistic interaction was observed. The lower threshold concentration, however, decreased with increasing test duration: from 0.026 ± 0.013 μM (9.794 ± 4.897 μg L⁻¹), 0.425 ± 0.089 μM (145.435 ± 30.46 μg L⁻¹) and 0.757 ± 0.253 μM (249.659 ± 83.44 μg L⁻¹) for prochloraz, propiconazole and epoxiconazole in standard 48-h toxicity tests to 0.015 ± 0.004 μM (5.651 ± 1.507 μg L⁻¹), 0.145 ± 0.025 μM (49.619 ± 8.555 μg L⁻¹) and 0.122 ± 0.0417 μM (40.236 ± 13.75 μg L⁻¹), respectively, in the 14-day tests. Testing synergy in relation to concentration addition provided
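    The two synergy criteria in the abstract above (deviation from concentration addition and from independent action) can be sketched in a few lines. This is a minimal illustration of the definitions; the two-fold factor matches the paper's criterion, but all function names and input values here are hypothetical, not the study's data:

```python
def ca_predicted_ec50(fractions, ec50s):
    """EC50 predicted by concentration addition (CA) for a mixture whose
    components are present at the given molar fractions."""
    return 1.0 / sum(f / e for f, e in zip(fractions, ec50s))

def is_synergistic_horizontal(observed_ec50, fractions, ec50s, factor=2.0):
    """Horizontal assessment: synergy if the observed mixture EC50 falls
    more than `factor`-fold below the CA prediction."""
    return observed_ec50 < ca_predicted_ec50(fractions, ec50s) / factor

def ia_predicted_effect(effects):
    """Effect fraction predicted by independent action (IA) from the
    single-compound effect fractions."""
    survive = 1.0
    for e in effects:
        survive *= (1.0 - e)
    return 1.0 - survive

def is_synergistic_vertical(observed_effect, effects, factor=2.0):
    """Vertical assessment: synergy if the observed immobile fraction is
    more than `factor`-fold above the IA prediction."""
    return observed_effect > factor * ia_predicted_effect(effects)
```

For example, a 50:50 mixture of two compounds with identical EC50s of 1.0 μM has a CA-predicted EC50 of 1.0 μM, so an observed mixture EC50 of 0.4 μM would count as synergistic under the horizontal assessment.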

  19. Carbon deposition thresholds on nickel-based solid oxide fuel cell anodes I. Fuel utilization

    NASA Astrophysics Data System (ADS)

    Kuhn, J.; Kesler, O.

    2015-03-01

    In the first of a two part publication, the effect of fuel utilization (Uf) on carbon deposition rates in solid oxide fuel cell nickel-based anodes was studied. Representative 5-component CH4 reformate compositions (CH4, H2, CO, H2O, & CO2) were selected graphically by plotting the solutions to a system of mass-balance constraint equations. The centroid of the solution space was chosen to represent a typical anode gas mixture for each nominal Uf value. Selected 5-component and 3-component gas mixtures were then delivered to anode-supported cells for 10 h, followed by determination of the resulting deposited carbon mass. The empirical carbon deposition thresholds were affected by atomic carbon (C), hydrogen (H), and oxygen (O) fractions of the delivered gas mixtures and temperature. It was also found that CH4-rich gas mixtures caused irreversible damage, whereas atomically equivalent CO-rich compositions did not. The coking threshold predicted by thermodynamic equilibrium calculations employing graphite for the solid carbon phase agreed well with empirical thresholds at 700 °C (Uf ≈ 32%); however, at 600 °C, poor agreement was observed with the empirical threshold of ∼36%. Finally, cell operating temperatures correlated well with the difference in enthalpy between the supplied anode gas mixtures and their resulting thermodynamic equilibrium gas mixtures.

  20. A novel dose-based positioning method for CT image-guided proton therapy

    PubMed Central

    Cheung, Joey P.; Park, Peter C.; Court, Laurence E.; Ronald Zhu, X.; Kudchadker, Rajat J.; Frank, Steven J.; Dong, Lei

    2013-01-01

    Purpose: Proton dose distributions can potentially be altered by anatomical changes in the beam path despite perfect target alignment using traditional image guidance methods. In this simulation study, the authors explored the use of dosimetric factors instead of only anatomy to set up patients for proton therapy using in-room volumetric computed tomographic (CT) images. Methods: To simulate patient anatomy in a free-breathing treatment condition, weekly time-averaged four-dimensional CT data near the end of treatment for 15 lung cancer patients were used in this study for a dose-based isocenter shift method to correct dosimetric deviations without replanning. The isocenter shift was obtained using the traditional anatomy-based image guidance method as the starting position. Subsequent isocenter shifts were established based on dosimetric criteria using a fast dose approximation method. For each isocenter shift, doses were calculated every 2 mm up to ±8 mm in each direction. The optimal dose alignment was obtained by imposing a target coverage constraint that at least 99% of the target would receive at least 95% of the prescribed dose and by minimizing the mean dose to the ipsilateral lung. Results: The authors found that 7 of 15 plans did not meet the target coverage constraint when using only the anatomy-based alignment. After the authors applied dose-based alignment, all met the target coverage constraint. For all but one case in which the target dose was met using both anatomy-based and dose-based alignment, the latter method was able to improve normal tissue sparing. Conclusions: The authors demonstrated that a dose-based adjustment to the isocenter can improve target coverage and/or reduce dose to nearby normal tissue. PMID:23635262
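    The search procedure described — stepping the isocenter every 2 mm up to ±8 mm in each direction, keeping shifts where at least 99% of the target receives at least 95% of the prescription, then minimizing mean ipsilateral-lung dose — can be sketched as a toy grid search. The `dose_at` dose model is a caller-supplied stand-in, not the paper's fast dose approximation method:

```python
import itertools

def best_isocenter_shift(dose_at, target_pts, lung_pts, rx=60.0,
                         step=2, max_shift=8):
    """Toy version of the dose-based alignment: exhaustively try isocenter
    shifts (mm) on a grid, enforce the target coverage constraint, and
    among feasible shifts minimize mean lung dose.
    `dose_at(point, shift)` returns dose (Gy) at a point for a given shift."""
    offsets = range(-max_shift, max_shift + 1, step)
    best = None
    for shift in itertools.product(offsets, repeat=3):
        target_doses = [dose_at(p, shift) for p in target_pts]
        coverage = sum(d >= 0.95 * rx for d in target_doses) / len(target_doses)
        if coverage < 0.99:          # target coverage constraint (V95 >= 99%)
            continue
        mean_lung = sum(dose_at(p, shift) for p in lung_pts) / len(lung_pts)
        if best is None or mean_lung < best[1]:
            best = (shift, mean_lung)
    return best  # (shift_mm, mean_lung_dose) or None if no shift is feasible
```

With a 2 mm step over ±8 mm in three axes, this evaluates 9³ = 729 candidate shifts, which is why the paper pairs the search with a fast dose approximation rather than full dose recomputation.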

  1. Cost–effectiveness thresholds: pros and cons

    PubMed Central

    Lauer, Jeremy A; De Joncheere, Kees; Edejer, Tessa; Hutubessy, Raymond; Kieny, Marie-Paule; Hill, Suzanne R

    2016-01-01

    Abstract Cost–effectiveness analysis is used to compare the costs and outcomes of alternative policy options. Each resulting cost–effectiveness ratio represents the magnitude of additional health gained per additional unit of resources spent. Cost–effectiveness thresholds allow cost–effectiveness ratios that represent good or very good value for money to be identified. In 2001, the World Health Organization’s Commission on Macroeconomics in Health suggested cost–effectiveness thresholds based on multiples of a country’s per-capita gross domestic product (GDP). In some contexts, in choosing which health interventions to fund and which not to fund, these thresholds have been used as decision rules. However, experience with the use of such GDP-based thresholds in decision-making processes at country level shows them to lack country specificity and this – in addition to uncertainty in the modelled cost–effectiveness ratios – can lead to the wrong decision on how to spend health-care resources. Cost–effectiveness information should be used alongside other considerations – e.g. budget impact and feasibility considerations – in a transparent decision-making process, rather than in isolation based on a single threshold value. Although cost–effectiveness ratios are undoubtedly informative in assessing value for money, countries should be encouraged to develop a context-specific process for decision-making that is supported by legislation, has stakeholder buy-in, for example the involvement of civil society organizations and patient groups, and is transparent, consistent and fair. PMID:27994285

  2. Cost-effectiveness thresholds: pros and cons.

    PubMed

    Bertram, Melanie Y; Lauer, Jeremy A; De Joncheere, Kees; Edejer, Tessa; Hutubessy, Raymond; Kieny, Marie-Paule; Hill, Suzanne R

    2016-12-01

    Cost-effectiveness analysis is used to compare the costs and outcomes of alternative policy options. Each resulting cost-effectiveness ratio represents the magnitude of additional health gained per additional unit of resources spent. Cost-effectiveness thresholds allow cost-effectiveness ratios that represent good or very good value for money to be identified. In 2001, the World Health Organization's Commission on Macroeconomics in Health suggested cost-effectiveness thresholds based on multiples of a country's per-capita gross domestic product (GDP). In some contexts, in choosing which health interventions to fund and which not to fund, these thresholds have been used as decision rules. However, experience with the use of such GDP-based thresholds in decision-making processes at country level shows them to lack country specificity and this - in addition to uncertainty in the modelled cost-effectiveness ratios - can lead to the wrong decision on how to spend health-care resources. Cost-effectiveness information should be used alongside other considerations - e.g. budget impact and feasibility considerations - in a transparent decision-making process, rather than in isolation based on a single threshold value. Although cost-effectiveness ratios are undoubtedly informative in assessing value for money, countries should be encouraged to develop a context-specific process for decision-making that is supported by legislation, has stakeholder buy-in, for example the involvement of civil society organizations and patient groups, and is transparent, consistent and fair.
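    The GDP-based rule that the two records above critique is simple to state in code. A minimal sketch follows, using the conventional reading of the Commission on Macroeconomics in Health thresholds (under 1x GDP per capita per DALY averted = very cost-effective, under 3x = cost-effective); the paper's point is precisely that such a rule should not be used in isolation as a funding decision:

```python
def icer(delta_cost, delta_dalys_averted):
    """Incremental cost-effectiveness ratio: extra cost per extra DALY averted."""
    return delta_cost / delta_dalys_averted

def who_cmh_classification(icer_value, gdp_per_capita):
    """Threshold classification based on multiples of per-capita GDP,
    as suggested by WHO's Commission on Macroeconomics in Health (2001)."""
    if icer_value < gdp_per_capita:
        return "very cost-effective"
    if icer_value < 3 * gdp_per_capita:
        return "cost-effective"
    return "not cost-effective"
```

For a country with a per-capita GDP of $2,000, an intervention costing an extra $1,500 per DALY averted would fall in the "very cost-effective" band, while one costing $5,000 would be "cost-effective"; the abstract argues that budget impact, feasibility and a transparent country-specific process matter alongside this label.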

  3. On effective dose for radiotherapy based on doses to nontarget organs and tissues

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Uselmann, Adam J., E-mail: ajuselmann@wisc.edu; Thomadsen, Bruce R.

    2015-02-15

    Purpose: The National Council for Radiation Protection and Measurement (NCRP) published estimates for the collective population dose and the mean effective dose to the population of the United States from medical imaging procedures for 1980/1982 and for 2006. The earlier report ignored the effective dose from radiotherapy and the latter gave a cursory discussion of the topic but again did not include it in the population exposure for various reasons. This paper explains the methodology used to calculate the effective dose due to radiotherapy procedures in the latter NCRP report and revises the values based on more detailed modeling. Methods: This study calculated the dose to nontarget organs from radiotherapy for reference populations using CT images and published peripheral dose data. Results: Using International Commission on Radiological Protection (ICRP) 60 weighting factors, the total effective dose to nontarget organs in radiotherapy patients is estimated as 298 ± 194 mSv per patient, while the U.S. population effective dose is 0.939 ± 0.610 mSv per person, with a collective dose of 283 000 ± 184 000 person Sv per year. Using ICRP 103 weighting factors, the effective dose is 281 ± 183 mSv per patient, 0.887 ± 0.577 mSv per person in the U.S., and 268 000 ± 174 000 person Sv per year. The uncertainty in the calculations is largely governed by variations in patient size, which was accounted for by considering a range of patient sizes and taking the average treatment site to nontarget organ distance. Conclusions: The methods used to estimate the effective doses from radiotherapy used in NCRP Report No. 160 have been explained and the values updated.

  4. Retinal injury thresholds for 532, 578, and 630 nm lasers in connection to photodynamic therapy for choroidal neovascularization.

    PubMed

    Chen, Hongxia; Yang, Zaifu; Zou, Xianbiao; Wang, Jiarui; Zhu, Jianguo; Gu, Ying

    2014-01-01

    The purpose of this study was to explore the retinal injury thresholds in rabbits and evaluate the influence of retinal pigmentation on threshold irradiance at laser wavelengths of 532, 578, and 630 nm, which might be involved in hypocrellin B (HB) and hematoporphyrin monomethyl ether (HMME) photodynamic therapy (PDT) for choroidal neovascularization (CNV). The eyes of pigmented and non-pigmented rabbits were exposed to 532, 578, and 630 nm lasers coupled to a slit lamp biological microscope. The exposure duration was 100 seconds and the retinal spot size was 2 mm throughout the experiment. The minimum visible lesions were detected by funduscopy at 1 and 24 hours post exposure. Bliss probit analysis was performed to determine the ED50 thresholds, fiducial limits and probit slope. In pigmented rabbits, the 24-hour retinal threshold irradiances at 532, 578, and 630 nm were 1,003, 1,475, and 1,720 mW/cm², respectively. In non-pigmented rabbits, the 24-hour threshold irradiances were 1,657, 1,865, and 15,360 mW/cm², respectively. The ED50 for 24-hour observation differed very little from the ED50 for 1-hour observation. The non-pigmented rabbits required a ninefold increase in threshold irradiance at 630 nm compared with the pigmented rabbits. This study will contribute to the knowledge base for the limits of laser irradiance in application of HB or HMME PDT for CNV. © 2013 Wiley Periodicals, Inc.
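    A minimal version of the probit analysis used to obtain ED50 values can be sketched as follows. This is an unweighted least-squares fit of probit-transformed response fractions against log dose, not the full iterative maximum-likelihood procedure of a Bliss probit analysis, and the dose groups in the example are invented for illustration:

```python
import math
from statistics import NormalDist

def probit_ed50(doses, n_affected, n_tested):
    """Fit probit(response fraction) against log10(dose) by ordinary least
    squares and return the ED50 (the dose at which probit = 0, i.e. 50%
    response). Groups with 0% or 100% response are dropped rather than
    corrected, a simplification relative to standard probit software."""
    nd = NormalDist()
    xs, ys = [], []
    for d, a, n in zip(doses, n_affected, n_tested):
        p = a / n
        if 0.0 < p < 1.0:                 # drop degenerate groups
            xs.append(math.log10(d))
            ys.append(nd.inv_cdf(p))      # probit transform
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    intercept = my - slope * mx
    return 10 ** (-intercept / slope)
```

With symmetric hypothetical groups (16%, 50% and 84% responders at 500, 1000 and 2000 units), the fitted ED50 lands at 1000, as expected from the symmetry of the probit transform.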

  5. Measuring the educational impact of Promoting Environmental Awareness in Kids (PEAK): The development and implementation of a new scale

    Treesearch

    Jennifer Miller; Lindsey Brown; Eddie Hill; Amy Shellman; Ron Ramsing; Edwin Gómez

    2012-01-01

    The Leave No Trace Center for Outdoor Ethics (LNT) is a nonprofit educational organization that teaches skills and values for recreating responsibly in the out-of-doors. LNT developed Promoting Environmental Awareness in Kids (PEAK), based on seven ethical principles. The PEAK program provides a pack that contains several interactive activities specifically designed to...

  6. Construction of boundary-surface-based Chinese female astronaut computational phantom and proton dose estimation

    PubMed Central

    Sun, Wenjuan; JIA, Xianghong; XIE, Tianwu; XU, Feng; LIU, Qian

    2013-01-01

    With the rapid development of China's space industry, the importance of radiation protection is increasingly prominent. To provide relevant dose data, we first developed the Visible Chinese Human adult Female (VCH-F) phantom, and performed further modifications to generate the VCH-F Astronaut (VCH-FA) phantom, incorporating statistical body characteristics data from the first batch of Chinese female astronauts as well as reference organ mass data from the International Commission on Radiological Protection (ICRP; both within 1% relative error). Based on cryosection images, the original phantom was constructed via Non-Uniform Rational B-Spline (NURBS) boundary surfaces to strengthen the deformability for fitting the body parameters of Chinese female astronauts. The VCH-FA phantom was voxelized at a resolution of 2 × 2 × 4 mm³ for radiation transport simulations of isotropic protons with energies of 5000-10 000 MeV in Monte Carlo N-Particle eXtended (MCNPX) code. To investigate discrepancies caused by anatomical variations and other factors, the obtained doses were compared with corresponding values from other phantoms and sex-averaged doses. Dose differences were observed among phantom calculation results, especially for effective dose with low-energy protons. Local skin thickness shifts the breast dose curve toward high energy, but has little impact on inner organs. Under a shielding layer, organ dose reduction is greater for skin than for other organs. The calculated skin dose per day closely approximates measurement data obtained in low-Earth orbit (LEO). PMID:23135158

  7. Optimization and Dose Estimation of Aerosol Delivery to Non-Human Primates.

    PubMed

    MacLoughlin, Ronan J; van Amerongen, Geert; Fink, James B; Janssens, Hettie M; Duprex, W Paul; de Swart, Rik L

    2016-06-01

    In pre-clinical animal studies, the uniformity of dosing across subjects and routes of administration is a crucial requirement. In preparation for a study in which aerosolized live-attenuated measles virus vaccine was administered to cynomolgus monkeys (Macaca fascicularis) by inhalation, we assessed the percentage of a nebulized dose inhaled under varying conditions. Drug delivery varies with breathing parameters. Therefore we determined macaque breathing patterns (tidal volume, breathing frequency, and inspiratory to expiratory (I:E) ratio) across a range of 3.3-6.5 kg body weight, using a pediatric pneumotachometer interfaced either with an endotracheal tube or a facemask. Subsequently, these breathing patterns were reproduced using a breathing simulator attached to a filter to collect the inhaled dose. Albuterol was nebulized using a vibrating mesh nebulizer and the percentage inhaled dose was determined by extraction of drug from the filter and subsequent quantification. Tidal volumes ranged from 24 to 46 mL, breathing frequencies from 19 to 31 breaths per minute and I:E ratios from 0.7 to 1.6. A small pediatric resuscitation mask was identified as the best fitting interface between animal and pneumotachometer. The average efficiency of inhaled dose delivery was 32.1% (standard deviation 7.5, range 24%-48%), with variation in tidal volumes as the most important determinant. Studies in non-human primates aimed at comparing aerosol delivery with other routes of administration should take both the inter-subject variation and relatively low efficiency of delivery to these low body weight mammals into account.

  8. A low-threshold nanolaser based on hybrid plasmonic waveguides at the deep subwavelength scale

    NASA Astrophysics Data System (ADS)

    Li, Zhi-Quan; Piao, Rui-Qi; Zhao, Jing-Jing; Meng, Xiao-Yun; Tong, Kai

    2015-07-01

    A novel nanolaser structure based on a hybrid plasmonic waveguide is proposed and investigated. The coupling between the metal nanowire and the high-index semiconductor nanowire with optical gain leads to a strong field enhancement in the air gap region and low propagation loss, which enables the realization of lasing at the deep subwavelength scale. By optimizing the geometric parameters of the structure, a minimal lasing threshold is achieved while maintaining the capacity of ultra-deep subwavelength mode confinement. Compared with the previous coupled nanowire pair based hybrid plasmonic structure, a lower threshold can be obtained with the same geometric parameters. The proposed nanolaser can be integrated into a miniature chip as a nanoscale light source and has the potential to be widely used in optical communication and optical sensing technology. Project supported by the National Natural Science Foundation of China (Grant No. 61172044) and the Natural Science Foundation of Hebei Province, China (Grant No. F2014501150).

  9. THRESHOLD LOGIC.

    DTIC Science & Technology

    synthesis procedures; a 'best' method is definitely established. (2) 'Symmetry Types for Threshold Logic' is a tutorial exposition including a careful...development of the Goto-Takahasi self-dual type ideas. (3) 'Best Threshold Gate Decisions' reports a comparison, on the 2470 7-argument threshold...interpretation is shown best. (4) 'Threshold Gate Networks' reviews the previously discussed 2-algorithm in geometric terms, describes our FORTRAN

  10. Thermal antinociception after dexmedetomidine administration in cats: a dose-finding study.

    PubMed

    Slingsby, L S; Taylor, P M

    2008-04-01

    The optimum dose of dexmedetomidine for antinociception to a thermal stimulus was determined in a crossover study of 12 cats. In five treatment groups (n = 10 per group), dexmedetomidine was administered intramuscularly (i.m.) at 2, 5, 10, 20 and 40 microg/kg; positive and negative controls were administered buprenorphine (20 microg/kg, i.m.) and 0.9% saline (0.006 mL/kg, i.m.) respectively. Baseline thermal thresholds and visual analogue scale (VAS) sedation scores were obtained prior to drug treatment and then at regular intervals until 24 h after administration. The summary measures of overall mean thresholds and overall mean VAS scores were investigated using a univariate general linear model for multiple factors with post hoc Tukey's tests (P < 0.05). Only dexmedetomidine at 40 microg/kg displayed an analgesic effect (less than that of buprenorphine). The VAS for sedation did not significantly affect the thresholds obtained and treatment was the only significant factor to influence VAS. Dexmedetomidine resulted in higher VAS for sedation than saline and buprenorphine. Dexmedetomidine at 40 microg/kg significantly increased nociceptive thresholds compared with saline control, but less than buprenorphine. Dexmedetomidine produced dose-dependent sedation, but only the highest dose produced analgesia, suggesting that induction of analgesia requires the highest dose (or an additional analgesic) in the clinical setting.

  11. Re-assess Vector Indices Threshold as an Early Warning Tool for Predicting Dengue Epidemic in a Dengue Non-endemic Country

    PubMed Central

    Hsu, Pi-Shan; Chen, Chaur-Dong; Lian, Ie-Bin; Chao, Day-Yu

    2015-01-01

    Background Despite dengue dynamics being driven by complex interactions between human hosts, mosquito vectors and viruses that are influenced by climate factors, an operational model that will enable health authorities to anticipate the outbreak risk in a dengue non-endemic area has not been developed. The objectives of this study were to evaluate the temporal relationship between meteorological variables, entomological surveillance indices and confirmed dengue cases; and to establish the threshold for entomological surveillance indices including three mosquito larval indices [Breteau (BI), Container (CI) and House indices (HI)] and one adult index (AI) as an early warning tool for dengue epidemic. Methodology/Principal Findings Epidemiological, entomological and meteorological data were analyzed from 2005 to 2012 in Kaohsiung City, Taiwan. The successive waves of dengue outbreaks with different magnitudes were recorded in Kaohsiung City, and involved a dominant serotype during each epidemic. The annual indigenous dengue cases usually started from May to June and reached a peak in October to November. Vector data from 2005–2012 showed that the peak of the adult mosquito population was followed by a peak in the corresponding dengue activity with a lag period of 1–2 months. Therefore, we focused the analysis on the data from May to December and the high risk district, where the inspection of the immature and mature mosquitoes was carried out on a weekly basis and about 97.9% dengue cases occurred. The two-stage model was utilized here to estimate the risk and time-lag effect of annual dengue outbreaks in Taiwan. First, Poisson regression was used to select the optimal subset of variables and time-lags for predicting the number of dengue cases, and the final results of the multivariate analysis were selected based on the smallest AIC value. Next, each vector index models with selected variables were subjected to multiple logistic regression models to examine the

  12. 40 CFR 98.381 - Reporting threshold.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 21 2011-07-01 2011-07-01 false Reporting threshold. 98.381 Section 98.381 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Suppliers of Coal-based Liquid Fuels § 98.381 Reporting threshold. Any supplier of coal-to-liquid products who...

  13. 40 CFR 98.381 - Reporting threshold.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 21 2014-07-01 2014-07-01 false Reporting threshold. 98.381 Section 98.381 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Suppliers of Coal-based Liquid Fuels § 98.381 Reporting threshold. Any supplier of coal-to-liquid products who...

  14. 40 CFR 98.381 - Reporting threshold.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 22 2013-07-01 2013-07-01 false Reporting threshold. 98.381 Section 98.381 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Suppliers of Coal-based Liquid Fuels § 98.381 Reporting threshold. Any supplier of coal-to-liquid products who...

  15. CORRELATIONS IN LIGHT FROM A LASER AT THRESHOLD,

    DTIC Science & Technology

    Temporal correlations in the electromagnetic field radiated by a laser in the threshold region of oscillation (from one tenth of threshold intensity to ten times threshold) were measured by photoelectron counting techniques. The experimental results were compared with theoretical predictions based...shows that the intensity fluctuations at about one tenth threshold are nearly those of a Gaussian field and continuously approach those of a constant amplitude field as the intensity is increased. (Author)

  16. Subtle changes in brain functions produced by single doses of mevinphos (Phosdrin).

    DOT National Transportation Integrated Search

    1973-02-01

    Mevinphos (Phosdrin) was found to inhibit the amplitude of hippocampal evoked potentials in unanesthetized squirrel monkeys with chronically indwelling electrodes. The threshold dose was 0.050 mg/kg and the maximal dose studied was 0.200 mg/kg. Doses...

  17. An Evaluation of Performance Thresholds in Nursing Home Pay-for-Performance.

    PubMed

    Werner, Rachel M; Skira, Meghan; Konetzka, R Tamara

    2016-12-01

    Performance thresholds are commonly used in pay-for-performance (P4P) incentives, where providers receive a bonus payment for achieving a prespecified target threshold. Such thresholds may, however, produce discontinuous incentives: providers just below the threshold have the strongest incentive to improve, while providers either far below or above the threshold have little incentive. We investigate the effect of performance thresholds on provider response in the setting of nursing home P4P. Data were drawn from the Minimum Data Set (MDS) and Online Survey, Certification, and Reporting (OSCAR) datasets. A difference-in-differences design was used to test for changes in nursing home performance in three states that implemented threshold-based P4P (Colorado, Georgia, and Oklahoma) versus three comparator states (Arizona, Tennessee, and Arkansas) between 2006 and 2009. We find that those farthest below the threshold (i.e., the worst-performing nursing homes) had the largest improvements under threshold-based P4P while those farthest above the threshold worsened. This effect did not vary with the percentage of Medicaid residents in a nursing home. Threshold-based P4P may provide perverse incentives for nursing homes above the performance threshold, but we do not find evidence to support concerns about the effects of performance thresholds on low-performing nursing homes. © Health Research and Educational Trust.

  18. Sensing the intruder: a quantitative threshold for recognition cues perception in honeybees

    NASA Astrophysics Data System (ADS)

    Cappa, Federico; Bruschini, Claudia; Cipollini, Maria; Pieraccini, Giuseppe; Cervo, Rita

    2014-02-01

    The ability to discriminate between nestmates and non-nestmates is essential for defending social insect colonies from intruders. Over the years, nestmate recognition has been extensively studied in the honeybee Apis mellifera; nevertheless, the quantitative perceptual aspects underlying the recognition system remain unexplored in this species. To test the existence of a quantitative perception threshold for cuticular hydrocarbon nestmate recognition cues, we conducted behavioural assays by presenting different amounts of a foreign forager's chemical profile to honeybees at the entrance of their colonies. We found an increase in the explorative and aggressive responses as the amount of cues increased, consistent with a threshold mechanism, highlighting the importance of quantitative perceptual features for the recognition processes in A. mellifera.

  19. Effect of butorphanol on thermal nociceptive threshold in healthy pony foals.

    PubMed

    McGowan, K T; Elfenbein, J R; Robertson, S A; Sanchez, L C

    2013-07-01

    Pain management is an important component of foal nursing care, and no objective data currently exist regarding the analgesic efficacy of opioids in foals. To evaluate the somatic antinociceptive effects of 2 commonly used doses of intravenous (i.v.) butorphanol in healthy foals. Our hypothesis was that thermal nociceptive threshold would increase following i.v. butorphanol in a dose-dependent manner in both neonatal and older pony foals. Seven healthy neonatal pony foals (age 1-2 weeks), and 11 healthy older pony foals (age 4-8 weeks). Five foals were used during both age periods. Treatments, which included saline (0.5 ml), butorphanol (0.05 mg/kg bwt) and butorphanol (0.1 mg/kg bwt), were administered i.v. in a randomised crossover design with at least 2 days between treatments. Response variables included thermal nociceptive threshold, skin temperature and behaviour score. Data within each age period were analysed using a 2-way repeated measures ANOVA, followed by a Holm-Sidak multiple comparison procedure if warranted. There was a significant (P<0.05) increase in thermal threshold, relative to Time 0, following butorphanol (0.1 mg/kg bwt) administration in both age groups. No significant time or treatment effects were apparent for skin temperature. Significant time, but not treatment, effects were evident for behaviour score in both age groups. Butorphanol (0.1 mg/kg bwt, but not 0.05 mg/kg bwt) significantly increased thermal nociceptive threshold in neonatal and older foals without apparent adverse behavioural effects. Butorphanol shows analgesic potential in foals for management of somatic painful conditions. © 2012 EVJ Ltd.

  20. An atlas-based organ dose estimator for tomosynthesis and radiography

    NASA Astrophysics Data System (ADS)

    Hoye, Jocelyn; Zhang, Yakun; Agasthya, Greeshma; Sturgeon, Greg; Kapadia, Anuj; Segars, W. Paul; Samei, Ehsan

    2017-03-01

    The purpose of this study was to provide patient-specific organ dose estimation based on an atlas of human models for twenty tomosynthesis and radiography protocols. The study utilized a library of 54 adult computational phantoms (age: 18-78 years, weight 52-117 kg) and a validated Monte-Carlo simulation (PENELOPE) of a tomosynthesis and radiography system to estimate organ dose. Positioning of patient anatomy was based on radiographic positioning handbooks. The field of view for each exam was calculated to include relevant organs per protocol. Through simulations, the energy deposited in each organ was binned to estimate normalized organ doses into a reference database. The database can be used as the basis to devise a dose calculator to predict patient-specific organ dose values based on kVp, mAs, exposure in air, and patient habitus for a given protocol. As an example of the utility of this tool, dose to an organ was studied as a function of average patient thickness in the field of view for a given exam and as a function of Body Mass Index (BMI). For tomosynthesis, organ doses can also be studied as a function of x-ray tube position. This work developed comprehensive information on organ dose dependencies across tomosynthesis and radiography. Organ dose generally decreased exponentially with increasing patient size, in a manner that was highly protocol dependent. There was a wide range of variability in organ dose across the patient population, which needs to be incorporated in the metrology of organ dose.
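    The exponential size dependence reported here can be captured with a simple log-linear fit of dose against patient thickness. The function and its inputs below are illustrative, not the study's dose calculator:

```python
import math

def fit_exponential_dose(thicknesses_cm, doses_mgy):
    """Fit dose = a * exp(-b * thickness) by ordinary least squares on
    log(dose), a common way to model exponential attenuation of organ
    dose with patient size. Returns (a, b)."""
    xs = list(thicknesses_cm)
    ys = [math.log(d) for d in doses_mgy]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    b = -sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = math.exp(my + b * mx)
    return a, b
```

Given doses simulated from a known model (a = 2.0 mGy, b = 0.05 per cm, hypothetical values), the fit recovers both parameters, which is the sense in which a protocol-specific (a, b) pair summarizes the dose-size dependency.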

  1. [Extraction of temperate vegetation phenology thresholds in North America based on flux tower observation data].

    PubMed

    Zhao, Jing-Jing; Liu, Liang-Yun

    2013-02-01

    Flux tower measurements can effectively monitor seasonal and phenological variation in vegetation. To date, however, the differences among various phenology threshold extraction methods have not been well validated or quantified. Based on gross primary productivity (GPP) and net ecosystem productivity (NEP) data for temperate forests from 9 forest FLUXNET sites in North America, and using the start dates (SOS) and end dates (EOS) of the growing season extracted by different phenology threshold methods, in combination with the forest ecosystems' carbon source/sink functions, this paper analyzed the effects of different threshold standards on the extracted vegetation phenology. The results showed that the effects of different threshold standards on the stability of the extracted phenology were smaller for deciduous broadleaved forest (DBF) than for evergreen needleleaved forest (ENF). Among the absolute and relative GPP thresholds, the DBF threshold of daily GPP = 2 g C m⁻² d⁻¹ agreed best with daily GPP = 20% of maximum GPP (GPPmax); the phenological metrics at a threshold of daily GPP = 4 g C m⁻² d⁻¹ fell between those at daily GPP = 20% GPPmax and daily GPP = 50% GPPmax; and the start date of the ecosystem's carbon sink function was close to the SOS metrics between daily GPP = 4 g C m⁻² d⁻¹ and daily GPP = 20% GPPmax. For ENF, the phenological metrics at thresholds of daily GPP = 2 g C m⁻² d⁻¹ and daily GPP = 4 g C m⁻² d⁻¹ agreed best with daily GPP = 20% GPPmax and daily GPP = 50% GPPmax, respectively, and the start date of the ecosystem's carbon sink function was close to the SOS metrics between daily GPP = 2 g C m⁻² d⁻¹ and daily GPP = 10% GPPmax.
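    The threshold-crossing extraction of SOS and EOS described above can be sketched as follows, for both an absolute threshold (e.g. 2 g C m⁻² d⁻¹) and a relative one (e.g. 20% of GPPmax). The GPP series in the example is synthetic, and day-of-year indexing assumes the series starts on day 1:

```python
def season_bounds(daily_gpp, threshold):
    """Start (SOS) and end (EOS) of the growing season, taken as the first
    and last day (1-indexed into the series) on which daily GPP reaches
    the threshold. Returns (None, None) if the threshold is never met."""
    days = [i + 1 for i, g in enumerate(daily_gpp) if g >= threshold]
    return (days[0], days[-1]) if days else (None, None)

def relative_threshold(daily_gpp, fraction):
    """Relative threshold as a fraction of the seasonal maximum GPP."""
    return fraction * max(daily_gpp)
```

Comparing `season_bounds(gpp, 2.0)` with `season_bounds(gpp, relative_threshold(gpp, 0.2))` on the same series is the kind of absolute-versus-relative comparison the abstract reports.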

  2. Patient doses from chest radiography in Victoria.

    PubMed

    Cardillo, I; Boal, T J; Einsiedel, P F

    1997-06-01

    This survey examines doses from PA chest radiography at radiology practices, private hospitals and public hospitals throughout metropolitan and country Victoria. Data were collected from 111 individual X-ray units at 86 different practices. Entrance skin doses in air were measured for the exposure factors used by each centre for a 23 cm thick male chest. A CDRH LucA1 chest phantom was used when making these measurements. About half of the centres used grid technique and half used non-grid technique. The entrance dose differed by a factor of more than 10 between the highest- and lowest-dose centres using non-grid technique, and by a factor of about 5 among centres using grids. Factors contributing to the high doses recorded at some centres were identified. Guidance levels for chest radiography based on the third quartile value of the entrance doses from this survey have been recommended and compared with guidance levels recommended in other countries.
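
    As a rough sketch of how such a guidance level is derived, the third quartile of surveyed entrance doses can be computed directly. The dose values below are hypothetical, not the survey's data:

```python
import statistics

# Hypothetical entrance skin doses (mGy) from a survey of chest X-ray units.
doses = [0.08, 0.10, 0.12, 0.15, 0.18, 0.22, 0.30, 0.45, 0.60, 0.90, 1.10, 1.40]

# Guidance level: third quartile (75th percentile) of the surveyed doses.
quartiles = statistics.quantiles(doses, n=4)  # [Q1, Q2, Q3]
guidance_level = quartiles[2]

# Units delivering more than the guidance level would be flagged for review.
flagged = [d for d in doses if d > guidance_level]
```

    By construction, roughly a quarter of the surveyed units exceed a third-quartile guidance level.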

  3. Using thresholds based on risk of cardiovascular disease to target treatment for hypertension: modelling events averted and number treated

    PubMed Central

    Baker, Simon; Priest, Patricia; Jackson, Rod

    2000-01-01

    Objective To estimate the impact of using thresholds based on absolute risk of cardiovascular disease to target drug treatment to lower blood pressure in the community. Design Modelling of three thresholds of treatment for hypertension based on the absolute risk of cardiovascular disease. 5 year risk of disease was estimated for each participant using an equation to predict risk. Net predicted impact of the thresholds on the number of people treated and the number of disease events averted over 5 years was calculated assuming a relative treatment benefit of one quarter. Setting Auckland, New Zealand. Participants 2158 men and women aged 35-79 years randomly sampled from the general electoral rolls. Main outcome measures Predicted 5 year risk of cardiovascular disease event, estimated number of people for whom treatment would be recommended, and disease events averted over 5 years at different treatment thresholds. Results 46 374 (12%) Auckland residents aged 35-79 receive drug treatment to lower their blood pressure, averting an estimated 1689 disease events over 5 years. Restricting treatment to individuals with blood pressure ⩾170/100 mm Hg and those with blood pressure between 150/90-169/99 mm Hg who have a predicted 5 year risk of disease ⩾10% would increase the net number for whom treatment would be recommended by 19 401. This 42% relative increase is predicted to avert 1139/1689 (68%) additional disease events overall over 5 years compared with current treatment. If the threshold for 5 year risk of disease is set at 15% the number recommended for treatment increases by <10% but about 620/1689 (37%) additional events can be averted. A 20% threshold decreases the net number of patients recommended for treatment by about 10% but averts 204/1689 (12%) more disease events than current treatment. Conclusions Implementing treatment guidelines that use treatment thresholds based on absolute risk could significantly improve the efficiency of drug treatment to
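
    A minimal sketch of this kind of modelling, using the abstract's blood-pressure bands and its assumed relative treatment benefit of one quarter; the helper function and cohort below are illustrative, not the study's data or method:

```python
def events_averted(patients, risk_threshold, relative_benefit=0.25):
    """patients: list of (systolic, diastolic, five_year_risk) tuples.

    Treatment is recommended when BP >= 170/100 mm Hg, or when BP is in
    the 150/90-169/99 band and predicted 5-year CVD risk >= threshold.
    Expected events averted = relative benefit x summed baseline risk.
    """
    treated = [
        r for s, d, r in patients
        if (s >= 170 or d >= 100)
        or ((s >= 150 or d >= 90) and r >= risk_threshold)
    ]
    return len(treated), relative_benefit * sum(treated)

cohort = [
    (180, 105, 0.30),  # high BP: treated regardless of predicted risk
    (160,  95, 0.12),  # band BP, risk above a 10% threshold
    (155,  92, 0.06),  # band BP, risk below a 10% threshold
    (140,  85, 0.20),  # BP below the band: not treated
]
n_treated, averted = events_averted(cohort, risk_threshold=0.10)
```

    Raising the risk threshold shrinks the treated group while preferentially keeping the patients whose treatment averts the most events, which is the efficiency argument of the paper.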

  4. Comparison of a field-based test to estimate functional threshold power and power output at lactate threshold.

    PubMed

    Gavin, Timothy P; Van Meter, Jessica B; Brophy, Patricia M; Dubis, Gabriel S; Potts, Katlin N; Hickner, Robert C

    2012-02-01

    It has been proposed that field-based tests (FT) used to estimate functional threshold power (FTP) result in power output (PO) equivalent to PO at lactate threshold (LT). However, anecdotal evidence from regional cycling teams tested for LT in our laboratory suggested that PO at LT underestimated FTP. It was hypothesized that estimated FTP is not equivalent to PO at LT. The LT and estimated FTP were measured in 7 trained male competitive cyclists (VO2max = 65.3 ± 1.6 ml O2·kg(-1)·min(-1)). The FTP was estimated from an 8-minute FT and compared with PO at LT using 2 methods: LT(Δ1), a 1 mmol·L(-1) or greater rise in blood lactate in response to an increase in workload, and LT(4.0), a blood lactate of 4.0 mmol·L(-1). The estimated FTP was equivalent to PO at LT(4.0) and greater than PO at LT(Δ1). VO2max explained 93% of the variance in individual PO during the 8-minute FT. When the 8-minute FT PO was expressed relative to maximal PO from the VO2max test (individual exercise performance), VO2max explained 64% of the variance in individual exercise performance. The PO at LT was not related to 8-minute FT PO. In conclusion, FTP estimated from an 8-minute FT is equivalent to PO at LT if LT(4.0) is used but is not equivalent for all methods of LT determination, including LT(Δ1).
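
    The two LT definitions can be sketched as follows; the incremental-test data are invented for illustration, and real protocols determine these points more carefully:

```python
def lt_4_0(power, lactate):
    """Power output at a fixed 4.0 mmol/L blood lactate (linear interpolation)."""
    for i in range(1, len(lactate)):
        if lactate[i] >= 4.0:
            frac = (4.0 - lactate[i - 1]) / (lactate[i] - lactate[i - 1])
            return power[i - 1] + frac * (power[i] - power[i - 1])
    return None

def lt_delta1(power, lactate):
    """Power at the stage preceding the first rise of >= 1 mmol/L between stages."""
    for i in range(1, len(lactate)):
        if lactate[i] - lactate[i - 1] >= 1.0:
            return power[i - 1]
    return None

# Hypothetical incremental test: workload (W) vs blood lactate (mmol/L).
power   = [150, 180, 210, 240, 270, 300]
lactate = [1.2, 1.4, 1.8, 2.5, 3.9, 5.6]
```

    On this invented curve LT(4.0) falls at a higher power than LT(Δ1), which mirrors the study's finding that estimated FTP matches LT(4.0) but exceeds LT(Δ1).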

  5. Aggregate versus day level association between methamphetamine use and HIV medication non-adherence among gay and bisexual men

    PubMed Central

    Parsons, Jeffrey T.; Kowalczyk, William; Botsko, Michael; Tomassilli, Julia; Golub, Sarit A.

    2013-01-01

    Methamphetamine use is associated with HIV infection, especially among gay and bisexual men. Methamphetamine use contributes to disease progression both directly, by increasing viral load and damaging the immune system, and indirectly, by decreasing medication adherence. Research examining the association of methamphetamine use and non-adherence has traditionally compared groups of users and nonusers on adherence, compared methamphetamine use between participants above or below some threshold level of adherence (e.g. >90% dose adherence), or examined aggregate relationships. Using Timeline Follow-back procedures, the present study examined aggregate, threshold, and day-level associations of methamphetamine use with non-adherence in 210 HIV-positive gay and bisexual methamphetamine-using men. Methamphetamine use was not associated with adherence behavior at the aggregate-level, but methamphetamine use on a given day was associated with 2.3 times the odds of non-adherence on that day. Threshold results were equivocal. These data suggest that the methamphetamine and non-adherence relationship is complicated: non-adherence is more likely to occur on days in which methamphetamine is used, but participants reported more non-adherence days in which methamphetamine was not used. This seeming paradox generates questions about the selection of analytical techniques and has important implications for behavioral interventions targeting substance use and adherence among HIV-positive individuals. PMID:23553345

  6. Redefining the Speed Limit of Phase Change Memory Revealed by Time-resolved Steep Threshold-Switching Dynamics of AgInSbTe Devices

    NASA Astrophysics Data System (ADS)

    Shukla, Krishna Dayal; Saxena, Nishant; Durai, Suresh; Manivannan, Anbarasu

    2016-11-01

    Although phase-change memory (PCM) offers promising features for a ‘universal memory’ owing to its high speed and non-volatility, achieving fast electrical switching remains a key challenge. In this work, the correlation between the rate of the applied voltage and the dynamics of threshold switching is investigated on a picosecond timescale. A distinct characteristic of threshold switching, a steep, instantaneous current rise from the amorphous off state to the on state at a critical voltage known as the threshold voltage, is achieved within 250 picoseconds and is followed by a slower current rise leading to crystallization. We also demonstrate that the threshold-switching dynamics of AgInSbTe cells are independent of the rate of the applied voltage, unlike other chalcogenide-based phase-change materials, which exhibit voltage-dependent transient switching characteristics. Furthermore, numerical solutions of the time-dependent conduction process validate the experimental results, revealing the electronic nature of threshold switching. These findings of steep threshold switching with a ‘sub-50 ps delay time’ open up a new way to achieve high-speed non-volatile memory for mainstream computing.

  7. Contributions of adaptation currents to dynamic spike threshold on slow timescales: Biophysical insights from conductance-based models

    NASA Astrophysics Data System (ADS)

    Yi, Guosheng; Wang, Jiang; Wei, Xile; Deng, Bin; Li, Huiyan; Che, Yanqiu

    2017-06-01

    Spike-frequency adaptation (SFA), mediated by various adaptation currents such as the voltage-gated K+ current (IM), the Ca2+-gated K+ current (IAHP), or the Na+-activated K+ current (IKNa), exists in many types of neurons and has been shown to effectively shape their information transmission properties on slow timescales. Here we use conductance-based models to investigate how the activation of these three adaptation currents regulates the threshold voltage for action potential (AP) initiation during the course of SFA. We observe that the spike threshold becomes depolarized and the rate of membrane depolarization (dV/dt) preceding the AP is reduced as the adaptation currents reduce the firing rate. This indicates that inhibitory adaptation currents enable the neuron to generate a dynamic threshold, inversely correlated with the preceding dV/dt, on timescales slower than the fast dynamics of AP generation. By analyzing the interactions of ionic currents at subthreshold potentials, we find that the activation of adaptation currents increases the outward net membrane current prior to AP initiation, which antagonizes the inward Na+ current and results in a depolarized threshold and a lower dV/dt from one AP to the next. Our simulations demonstrate that the threshold dynamics on slow timescales are a secondary effect of the activation of adaptation currents. These findings provide a biophysical interpretation of the relationship between adaptation currents and spike threshold.
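
    The basic SFA mechanism can be illustrated with a far simpler model than the paper's conductance-based ones: a leaky integrate-and-fire neuron with a spike-triggered adaptation current already shows inter-spike intervals lengthening as the adaptation variable accumulates. All parameters below are illustrative assumptions:

```python
# Minimal adaptive leaky integrate-and-fire neuron (Euler integration).
# A spike-triggered adaptation current w decays slowly and opposes the
# drive, so inter-spike intervals lengthen over time (SFA).
def simulate(i_drive=2.0, t_max=500.0, dt=0.1):
    tau_v, tau_w = 10.0, 100.0      # membrane / adaptation time constants (ms)
    v_th, v_reset, b = 1.0, 0.0, 0.5
    v, w, spikes = 0.0, 0.0, []
    t = 0.0
    while t < t_max:
        v += dt * (-v - w + i_drive) / tau_v
        w += dt * (-w) / tau_w
        if v >= v_th:
            spikes.append(t)
            v = v_reset
            w += b                  # spike-triggered adaptation increment
        t += dt
    return spikes

spikes = simulate()
isis = [b - a for a, b in zip(spikes, spikes[1:])]  # inter-spike intervals (ms)
```

    This toy model reproduces only the rate adaptation; the dynamic spike threshold itself requires the conductance-based treatment the paper describes.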

  8. Patient dose estimation from CT scans at the Mexican National Neurology and Neurosurgery Institute

    NASA Astrophysics Data System (ADS)

    Alva-Sánchez, Héctor; Reynoso-Mejía, Alberto; Casares-Cruz, Katiuzka; Taboada-Barajas, Jesús

    2014-11-01

    In the radiology department of the Mexican National Institute of Neurology and Neurosurgery, a dedicated institute in Mexico City, on average 19.3 computed tomography (CT) examinations are performed daily on hospitalized patients for neurological disease diagnosis, control scans and follow-up imaging. The purpose of this work was to estimate the effective dose received by hospitalized patients who underwent a diagnostic CT scan using typical effective dose values for all CT types and to obtain the estimated effective dose distributions received by surgical and non-surgical patients. Effective patient doses were estimated from values per study type reported in the applications guide provided by the scanner manufacturer. This retrospective study included all hospitalized patients who underwent a diagnostic CT scan between 1 January 2011 and 31 December 2012. A total of 8777 CT scans were performed in this two-year period. Simple brain scan was the CT type performed the most (74.3%) followed by contrasted brain scan (6.1%) and head angiotomography (5.7%). The average number of CT scans per patient was 2.83; the average effective dose per patient was 7.9 mSv; the mean estimated radiation dose was significantly higher for surgical (9.1 mSv) than non-surgical patients (6.0 mSv). Three percent of the patients had 10 or more brain CT scans and exceeded the organ radiation dose threshold set by the International Commission on Radiological Protection for deterministic effects of the eye-lens. Although radiation patient doses from CT scans were in general relatively low, 187 patients received a high effective dose (>20 mSv) and 3% might develop cataract from cumulative doses to the eye lens.
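
    A sketch of the cumulative-dose bookkeeping behind such a finding; the per-scan eye-lens dose is an assumed typical value, and the 0.5 Gy (500 mGy) ICRP lens threshold is used for illustration:

```python
from collections import Counter

# Assumed typical eye-lens dose per simple brain CT (mGy) and the ICRP
# threshold for deterministic lens effects (0.5 Gy = 500 mGy).
LENS_DOSE_PER_SCAN_MGY = 50.0
LENS_THRESHOLD_MGY = 500.0

# Hypothetical scan log: one patient id per brain CT performed.
scan_log = ["p1"] * 12 + ["p2"] * 3 + ["p3"] * 10 + ["p4"] * 1

scans_per_patient = Counter(scan_log)
flagged = sorted(
    p for p, n in scans_per_patient.items()
    if n * LENS_DOSE_PER_SCAN_MGY >= LENS_THRESHOLD_MGY
)
# Patients with 10 or more scans reach the threshold under these assumptions.
```

    With a 50 mGy per-scan assumption, 10 brain CTs reach the 500 mGy threshold, which is consistent with the abstract's "10 or more brain CT scans" criterion.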

  9. SU-F-BRB-12: A Novel Haar Wavelet Based Approach to Deliver Non-Coplanar Intensity Modulated Radiotherapy Using Sparse Orthogonal Collimators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nguyen, D; Ruan, D; Low, D

    2015-06-15

    Purpose: Existing efforts to replace complex multileaf collimator (MLC) by simple jaws for intensity modulated radiation therapy (IMRT) resulted in unacceptable compromise in plan quality and delivery efficiency. We introduce a novel fluence map segmentation method based on compressed sensing for plan delivery using a simplified sparse orthogonal collimator (SOC) on the 4π non-coplanar radiotherapy platform. Methods: 4π plans with varying prescription doses were first created by automatically selecting and optimizing 20 non-coplanar beams for 2 GBM, 2 head & neck, and 2 lung patients. To create deliverable 4π plans using SOC, which are two pairs of orthogonal collimators with 1 to 4 leaves in each collimator bank, a Haar Fluence Optimization (HFO) method was used to regulate the number of Haar wavelet coefficients while maximizing the dose fidelity to the ideal prescription. The plans were directly stratified utilizing the optimized Haar wavelet rectangular basis. A matching number of deliverable segments were stratified for the MLC-based plans. Results: Compared to the MLC-based 4π plans, the SOC-based 4π plans increased the average PTV dose homogeneity from 0.811 to 0.913. PTV D98 and D99 were improved by 3.53% and 5.60% of the corresponding prescription doses. The average mean and maximal OAR doses slightly increased by 0.57% and 2.57% of the prescription doses. The average number of segments ranged between 5 and 30 per beam. The collimator travel time to create the segments decreased with increasing leaf numbers in the SOC. The two and four leaf designs were 1.71 and 1.93 times more efficient, on average, than the single leaf design. Conclusion: The innovative dose domain optimization based on compressed sensing enables uncompromised 4π non-coplanar IMRT dose delivery using simple rectangular segments that are deliverable using a sparse orthogonal collimator, which only requires 8 to 16 leaves yet is unlimited in modulation resolution. This work

  10. Analysis of the interrelationship of the pulmonary irritation and elicitation thresholds in rats sensitized with 1,6-hexamethylene diisocyanate (HDI)

    PubMed Central

    Pauluhn, Jürgen

    2015-01-01

    This paper summarizes a range of experimental data central to developing a science-based approach for hazard identification of monomeric and polymeric aliphatic 1,6-hexamethylene diisocyanate (HDI). The dose-response curve of HDI-induced pulmonary responses in naïve or dermally sensitized rats after one or several inhalation priming exposures was examined in the Brown Norway (BN) rat asthma model. Emphasis was directed to demonstrating the need for, and the difficulty of, selecting an appropriate pulmonary dose when much of the inhaled chemically reactive vapor may be retained, concentration-dependently, in the upper airways of obligate nose-breathing rats. The course taken acknowledges the experimental challenges in identifying an elicitation threshold for HDI monomer near or above the saturated vapor concentration or in the presence of an HDI-polymer aerosol. The inhalation threshold dose for elicitation was determined with a fixed concentration (C) × variable exposure duration (t) protocol to improve inhalation dosimetry of the lower airways. Neutrophilic granulocytes (PMN) in bronchoalveolar lavage (BAL) fluid of equally inhalation-primed naïve and dermally sensitized rats were used to define the inhalation elicitation threshold C × t. Sensitized rats elaborated markedly increased PMN relative to equally challenged naïve rats at 5625 mg HDI/m3 × min (75 mg/m3 for 75 min), whereas PMN were essentially indistinguishable at 900 mg HDI/m3 × min. By applying adjustment factors accounting for both inter-species differences in inhalation dosimetry and intra-species susceptibility, the workplace human-equivalent threshold C × t was estimated to be in the range of the current ACGIH TLV® of HDI. Thus, this rat “asthma” model was suitable for demonstrating elicitation thresholds for HDI vapor after one or several inhalation priming exposures and seems suitable for deriving occupational exposure values

  11. Rectal Dose and Source Strength of the High-Dose-Rate Iridium-192 Both Affect Late Rectal Bleeding After Intracavitary Radiation Therapy for Uterine Cervical Carcinoma

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Isohashi, Fumiaki, E-mail: isohashi@radonc.med.osaka-u.ac.j; Yoshioka, Yasuo; Koizumi, Masahiko

    2010-07-01

    Purpose: The purpose of this study was to reconfirm our previous findings that the rectal dose and source strength both affect late rectal bleeding after high-dose-rate intracavitary brachytherapy (HDR-ICBT), by using a rectal dose calculated in accordance with the definitions of the International Commission on Radiation Units and Measurements Report 38 reference point (ICRU-RP) or with the dose-volume histogram (DVH) parameters of the Groupe Europeen de Curietherapie and the European Society for Therapeutic Radiology and Oncology. Methods and Materials: Sixty-two patients who underwent HDR-ICBT and were followed up for 1 year or more were studied. The rectal dose for ICBT was calculated by using the ICRU-RP based on orthogonal radiographs, or the DVH parameters based on computed tomography (CT). The total dose was calculated as the biologically equivalent dose expressed in 2-Gy fractions (EQD2). The relationship between averaged source strength or the EQD2 and late rectal bleeding was then analyzed. Results: When patients were divided into four groups according to rectal EQD2 (≥ or < the threshold dose) and source strength (≥ or <2.4 cGy·m²·h⁻¹), the group with both a high EQD2 and a high source strength showed a significantly greater probability of rectal bleeding for the ICRU-RP, D2cc, and D1cc. The patients with a median rectal dose above the threshold level did not show a greater frequency of rectal bleeding unless the source strength exceeded 2.4 cGy·m²·h⁻¹. Conclusions: Our results obtained with data based on the ICRU-RP and CT-based DVH parameters indicate that rectal dose and source strength both affect rectal bleeding after HDR-ICBT.

  12. Optimal Design for the Precise Estimation of an Interaction Threshold: The Impact of Exposure to a Mixture of 18 Polyhalogenated Aromatic Hydrocarbons

    PubMed Central

    Yeatts, Sharon D.; Gennings, Chris; Crofton, Kevin M.

    2014-01-01

    Traditional additivity models provide little flexibility in modeling the dose–response relationships of the single agents in a mixture. While the flexible single chemical required (FSCR) methods allow greater flexibility, their implicit nature is an obstacle in the formation of the parameter covariance matrix, which forms the basis for many statistical optimality design criteria. The goal of this effort is to develop a method for constructing the parameter covariance matrix for the FSCR models, so that (local) alphabetic optimality criteria can be applied. Data from Crofton et al. are provided as motivation; in an experiment designed to determine the effect of 18 polyhalogenated aromatic hydrocarbons on serum total thyroxine (T4), the interaction among the chemicals was statistically significant. Gennings et al. fit the FSCR interaction threshold model to the data. The resulting estimate of the interaction threshold was positive and within the observed dose region, providing evidence of a dose-dependent interaction. However, the corresponding likelihood-ratio-based confidence interval was wide and included zero. In order to more precisely estimate the location of the interaction threshold, supplemental data are required. Using the available data as the first stage, the Ds-optimal second-stage design criterion was applied to minimize the variance of the hypothesized interaction threshold. Practical concerns associated with the resulting design are discussed and addressed using the penalized optimality criterion. Results demonstrate that the penalized Ds-optimal second-stage design can be used to more precisely define the interaction threshold while maintaining the characteristics deemed important in practice. PMID:22640366

  13. Edge detection based on adaptive threshold b-spline wavelet for optical sub-aperture measuring

    NASA Astrophysics Data System (ADS)

    Zhang, Shiqi; Hui, Mei; Liu, Ming; Zhao, Zhu; Dong, Liquan; Liu, Xiaohua; Zhao, Yuejin

    2015-08-01

    In research on optical synthetic-aperture imaging systems, phase congruency is the main problem, and the sub-aperture phase must be detected. The edges in a sub-aperture system are more complex than those in a traditional optical imaging system. Because large-aperture optical components have steep slopes, the interference fringes may be quite dense in interferometric imaging, and a deep phase gradient may cause a loss of phase information. An efficient edge detection method is therefore needed. Wavelet analysis is a powerful tool widely used in image processing. Owing to its multi-scale transform properties, edge regions are detected with high precision at small scales, while at larger scales noise is increasingly suppressed. In addition, an adaptive threshold method, which sets different thresholds in different regions, can separate edge points from noise. First, the fringe pattern is obtained and a cubic B-spline wavelet is adopted as the smoothing function. After multi-scale wavelet decomposition of the whole image, the local modulus maxima along the gradient directions are computed. Because these maxima still contain noise, the adaptive threshold method is used to select among them: a point whose modulus exceeds the threshold is taken as a boundary point. Finally, erosion and dilation are applied to the resulting image to obtain consecutive image boundaries.
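
    A one-dimensional sketch of the modulus-maxima-plus-adaptive-threshold idea (a moving average stands in for the B-spline smoothing, and all parameters and the test signal are illustrative):

```python
def smooth(signal, scale):
    """Moving-average smoothing as a stand-in for the B-spline kernel."""
    n = len(signal)
    out = []
    for i in range(n):
        lo, hi = max(0, i - scale), min(n, i + scale + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out

def edges(signal, scale=2, window=8, k=2.0):
    """Local gradient-modulus maxima kept only above an adaptive threshold
    (k times the mean modulus in a window centred on the point)."""
    s = smooth(signal, scale)
    grad = [abs(s[i + 1] - s[i - 1]) / 2 for i in range(1, len(s) - 1)]
    pts = []
    for i in range(1, len(grad) - 1):
        if grad[i] >= grad[i - 1] and grad[i] >= grad[i + 1]:
            lo, hi = max(0, i - window), min(len(grad), i + window + 1)
            local_mean = sum(grad[lo:hi]) / (hi - lo)
            if grad[i] > k * local_mean:
                pts.append(i + 1)       # offset back to signal indices
    return pts

# Step edge at index 20 buried in a small deterministic ripple.
sig = [0.0 + 0.05 * ((i * 7) % 3 - 1) for i in range(20)] + \
      [1.0 + 0.05 * ((i * 7) % 3 - 1) for i in range(20, 40)]
detected = edges(sig)
```

    The adaptive threshold rejects the ripple's small modulus maxima because each is compared against its own neighbourhood, while the step's much larger maxima survive.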

  14. Threshold quantum cryptography

    NASA Astrophysics Data System (ADS)

    Tokunaga, Yuuki; Okamoto, Tatsuaki; Imoto, Nobuyuki

    2005-01-01

    We present the concept of threshold collaborative unitary transformation, or threshold quantum cryptography, a quantum analogue of threshold cryptography. In threshold quantum cryptography, classical shared secrets are distributed to several parties, and a subset of them, whose number exceeds a threshold, collaborates to compute a quantum cryptographic function while each party keeps its share secret. The shared secrets are reusable if no cheating is detected. As a concrete example of this concept, we show a distributed protocol (with threshold) of conjugate coding.

  15. Threshold detection in an on-off binary communications channel with atmospheric scintillation

    NASA Technical Reports Server (NTRS)

    Webb, W. E.; Marino, J. T., Jr.

    1974-01-01

    The optimum detection threshold in an on-off binary optical communications system operating in the presence of atmospheric turbulence was investigated, assuming a Poisson detection process and log-normal scintillation. The dependence of the probability of bit error on the log-amplitude variance and received signal strength was analyzed, and semi-empirical relationships to predict the optimum detection threshold were derived. On the basis of this analysis, a piecewise-linear model for an adaptive threshold detection system is presented. Bit error probabilities for non-optimum threshold detection systems were also investigated.
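
    For a Poisson detection process with equiprobable on/off symbols, the optimum integer count threshold can be found by direct search over the bit-error probability. The count means below are illustrative, and scintillation is ignored in this sketch:

```python
import math

def poisson_cdf(k, lam):
    """P(N <= k) for a Poisson distribution with mean lam."""
    return sum(math.exp(-lam) * lam ** i / math.factorial(i) for i in range(k + 1))

def bit_error_probability(threshold, lam_off, lam_on):
    """Decide '1' when the photocount >= threshold; equiprobable symbols."""
    p_false_alarm = 1.0 - poisson_cdf(threshold - 1, lam_off)  # off read as on
    p_miss = poisson_cdf(threshold - 1, lam_on)                # on read as off
    return 0.5 * (p_false_alarm + p_miss)

def optimum_threshold(lam_off, lam_on, max_count=100):
    return min(range(1, max_count),
               key=lambda t: bit_error_probability(t, lam_off, lam_on))

# Example: background (off) mean 2 counts, signal-plus-background (on) mean 20.
t_opt = optimum_threshold(2.0, 20.0)
```

    The search agrees with the likelihood-ratio result, which places the threshold near (λ_on − λ_off) / ln(λ_on/λ_off); under scintillation, an adaptive system would re-solve this as the received signal level fluctuates.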

  16. Investigation on the inertial cavitation threshold and shell properties of commercialized ultrasound contrast agent microbubbles.

    PubMed

    Guo, Xiasheng; Li, Qian; Zhang, Zhe; Zhang, Dong; Tu, Juan

    2013-08-01

    The inertial cavitation (IC) activity of ultrasound contrast agents (UCAs) plays an important role in the development and improvement of ultrasound diagnostic and therapeutic applications. However, various diagnostic and therapeutic applications have different requirements for IC characteristics. Here through IC dose quantifications based on passive cavitation detection, IC thresholds were measured for two commercialized UCAs, albumin-shelled KangRun(®) and lipid-shelled SonoVue(®) microbubbles, at varied UCA volume concentrations (viz., 0.125 and 0.25 vol. %) and acoustic pulse lengths (viz., 5, 10, 20, 50, and 100 cycles). Shell elastic and viscous coefficients of UCAs were estimated by fitting measured acoustic attenuation spectra with Sarkar's model. The influences of sonication condition (viz., acoustic pulse length) and UCA shell properties on IC threshold were discussed based on numerical simulations. Both experimental measurements and numerical simulations indicate that IC thresholds of UCAs decrease with increasing UCA volume concentration and acoustic pulse length. The shell interfacial tension and dilatational viscosity estimated for SonoVue (0.7 ± 0.11 N/m, 6.5 ± 1.01 × 10(-8) kg/s) are smaller than those of KangRun (1.05 ± 0.18 N/m, 1.66 ± 0.38 × 10(-7) kg/s); this might result in lower IC threshold for SonoVue. The current results will be helpful for selecting and utilizing commercialized UCAs for specific clinical applications, while minimizing undesired IC-induced bioeffects.

  17. Estimation of breast dose reduction potential for organ-based tube current modulated CT with wide dose reduction arc

    NASA Astrophysics Data System (ADS)

    Fu, Wanyi; Sturgeon, Gregory M.; Agasthya, Greeshma; Segars, W. Paul; Kapadia, Anuj J.; Samei, Ehsan

    2017-03-01

    This study aimed to estimate the organ dose reduction potential for organ-dose-based tube current modulated (ODM) thoracic CT with a wide dose reduction arc. Twenty-one computational anthropomorphic phantoms (XCAT, age range: 27-75 years, weight range: 52.0-105.8 kg) were used to create a virtual patient population with clinical anatomic variations. For each phantom, two breast tissue compositions were simulated: 50/50 and 20/80 (glandular-to-adipose ratio). A validated Monte Carlo program was used to estimate the organ dose for standard tube current modulation (TCM) (SmartmA, GE Healthcare) and ODM (GE Healthcare) for a commercial CT scanner (Revolution, GE Healthcare) with an explicitly modeled tube current modulation profile, scanner geometry, bowtie filtration, and source spectrum. Organ dose was determined using a typical clinical thoracic CT protocol. Both organ dose and CTDIvol-to-organ dose conversion coefficients (h factors) were compared between TCM and ODM. ODM significantly reduced all radiosensitive organ doses (p<0.01). The breast dose was reduced by 30 ± 2%. For h factors, organs in the anterior region (e.g. thyroid, stomach) exhibited substantial decreases, while organs in the medial, distributed, and posterior regions saw either an increase or no significant change. Organ-dose-based tube current modulation significantly reduced organ doses, especially for radiosensitive superficial anterior organs such as the breasts.

  18. Effect of sonication on particle dispersion, administered dose and metal release of non-functionalized, non-inert metal nanoparticles.

    PubMed

    Pradhan, Sulena; Hedberg, Jonas; Blomberg, Eva; Wold, Susanna; Odnevall Wallinder, Inger

    2016-01-01

    In this study, we elucidate the effect of different sonication techniques to efficiently prepare particle dispersions from selected non-functionalized NPs (Cu, Al, Mn, ZnO), and corresponding consequences on the particle dose, surface charge and release of metals. Probe sonication was shown to be the preferred method for dispersing non-inert, non-functionalized metal NPs (Cu, Mn, Al). However, rapid sedimentation during sonication resulted in differences between the real and the administered doses in the order of 30-80 % when sonicating in 1 and 2.56 g/L NP stock solutions. After sonication, extensive agglomeration of the metal NPs resulted in rapid sedimentation of all particles. DLVO calculations supported these findings, showing the strong van der Waals forces of the metal NPs to result in significant NP agglomeration. Metal release from the metal NPs was slightly increased by increased sonication. The addition of a stabilizing agent (bovine serum albumin) had an accelerating effect on the release of metals in sonicated solutions. For Cu and Mn NPs, the extent of particle dissolution increased from <1.6 to ~5 % after sonication for 15 min. A prolonged sonication time (3-15 min) had negligible effects on the zeta potential of the studied NPs. In all, it is shown that it is of utmost importance to carefully investigate how sonication influences the physico-chemical properties of dispersed metal NPs. This should be considered in nanotoxicology investigations of metal NPs.

  19. How useful is the concept of the 'harm threshold' in reproductive ethics and law?

    PubMed

    Smajdor, Anna

    2014-10-01

    In his book Reasons and Persons, Derek Parfit suggests that people are not harmed by being conceived with a disease or disability if they could not have existed without suffering that particular condition. He nevertheless contends that entities can be harmed if the suffering they experience is sufficiently severe. By implication, there is a threshold which divides harmful from non-harmful conceptions. The assumption that such a threshold exists has come to play a part in UK policy making. I argue that Parfit's distinction between harmful and non-harmful conceptions is untenable. Drawing on Kant's refutation of the ontological argument for God's existence, I suggest that the act of creation cannot be identical with the act of harming, nor indeed of benefiting, however great the offspring's suffering may be. I suggest that Parfit is right that bringing children into existence does not usually harm them, but I argue that this must apply to all conceptions, since Parfit cannot show how the harm threshold can be operationalised. If we think certain conceptions are unethical or should be illegal, this must be on grounds other than that the child is harmed by them. I show that a Millian approach in this context fails to exemplify the empirical and epistemological advantages commonly associated with it, and that harm-based legislation would need to rest on broader harm considerations than those relating to the child who is conceived.

  20. Flood and landslide warning based on rainfall thresholds and soil moisture indexes: the HEWS (Hydrohazards Early Warning System) for Sicily

    NASA Astrophysics Data System (ADS)

    Brigandì, Giuseppina; Tito Aronica, Giuseppe; Bonaccorso, Brunella; Gueli, Roberto; Basile, Giuseppe

    2017-09-01

    The main focus of this paper is to present a flood and landslide early warning system, named HEWS (Hydrohazards Early Warning System), developed for the Civil Protection Department of Sicily and based on the combined use of rainfall thresholds, soil moisture modelling and quantitative precipitation forecasts (QPF). The warning system covers the 9 Alert Zones into which Sicily has been divided and rests on a threshold system with three increasing critical levels: ordinary, moderate and high. In this system, for early flood warning, a Soil Moisture Accounting (SMA) model provides daily soil moisture conditions, which are used to select a specific set of three rainfall thresholds, one for each critical level, for issuing the alert bulletin. Wetness indexes representative of the soil moisture conditions of a catchment are calculated using a simple, spatially lumped rainfall-streamflow model, based on the SCS-CN method and the unit hydrograph approach, which requires daily observed and/or predicted rainfall and temperature data as input. Daily continuous time series of rainfall, streamflow and air temperature data are used to calibrate this model. An event-based lumped rainfall-runoff model was instead used to derive the rainfall thresholds for each catchment in Sicily with an area larger than 50 km2; in particular, a lumped rainfall-runoff model based on the Kinematic Instantaneous Unit Hydrograph, with the SCS-CN routine for net rainfall, was developed for this purpose. For warning of rainfall-induced shallow landslides, the empirical rainfall thresholds of Gariano et al. (2015), derived from a catalogue of 265 shallow landslides in Sicily in the period 2002-2012, have been included in the system. Finally, the Delft-FEWS operational forecasting platform has been applied to link the input data, the SMA model and the rainfall threshold models to produce
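
    The threshold-selection logic can be sketched as follows; the wetness classes, threshold values and rainfall figures are invented for illustration and are not the operational HEWS values:

```python
# Hypothetical rainfall thresholds (mm/24 h) per soil-wetness class, one
# triple per critical level: (ordinary, moderate, high).
THRESHOLDS = {
    "dry":     (60.0, 90.0, 130.0),
    "average": (45.0, 70.0, 100.0),
    "wet":     (30.0, 50.0,  75.0),
}
LEVELS = ["none", "ordinary", "moderate", "high"]

def wetness_class(index):
    """Classify a 0-1 soil moisture index into one of the threshold sets."""
    if index < 0.3:
        return "dry"
    return "average" if index < 0.7 else "wet"

def alert_level(soil_moisture_index, forecast_rain_mm):
    """Return the highest critical level whose rainfall threshold the QPF exceeds."""
    thresholds = THRESHOLDS[wetness_class(soil_moisture_index)]
    level = 0
    for i, t in enumerate(thresholds, start=1):
        if forecast_rain_mm >= t:
            level = i
    return LEVELS[level]
```

    The key design point is that the same forecast rainfall maps to different alert levels depending on antecedent soil moisture, e.g. 60 mm on wet soil triggers a higher level than 60 mm on dry soil.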