The Benchmark Dose Software (BMDS) was designed by EPA to generate dose-response curves and to facilitate the analysis, interpretation, and synthesis of toxicological data. Partial results of QA/QC testing of the software are presented. BMDS pr...
Application of Benchmark Dose Methodology to a Variety of Endpoints and Exposures
This latest beta version (1.1b) of the U.S. Environmental Protection Agency (EPA) Benchmark Dose Software (BMDS) is being distributed for public comment. The BMDS system is being developed as a tool to facilitate the application of benchmark dose (BMD) methods to EPA hazardous p...
RESULTS OF QA/QC TESTING OF EPA BENCHMARK DOSE SOFTWARE VERSION 1.2
EPA is developing benchmark dose software (BMDS) to support cancer and non-cancer dose-response assessments. Following the recent public review of BMDS version 1.1b, EPA developed a Hill model for evaluating continuous data, and improved the user interface and Multistage, Polyno...
Quality Assurance Testing of Version 1.3 of U.S. EPA Benchmark Dose Software (Presentation)
The EPA benchmark dose software (BMDS) is used to evaluate chemical dose-response data in support of Agency risk assessments, and must therefore be dependable. Quality assurance testing methods developed for BMDS were designed to assess model dependability with respect to curve-fitt...
The USEPA's benchmark dose software (BMDS) version 1.2 has been available over the Internet since April, 2000 (epa.gov/ncea/bmds.htm), and has already been used in risk assessments of some significant environmental pollutants (e.g., diesel exhaust, dichloropropene, hexachlorocycl...
Benchmark Dose Software (BMDS) Development and ...
This report is intended to provide an overview of beta version 1.0 of the implementation of a model of repeated measures data referred to as the Toxicodiffusion model. The implementation described here represents the first steps towards integration of the Toxicodiffusion model into the EPA benchmark dose software (BMDS). This version runs from within BMDS 2.0 using an option screen for making model selection, as is done for other models in the BMDS 2.0 suite.
EPA's Benchmark Dose Modeling Software
The EPA developed the Benchmark Dose Software (BMDS) as a tool to help Agency risk assessors apply benchmark dose (BMD) methods to EPA's human health risk assessment (HHRA) documents. The application of BMD methods overcomes many well-known limitations ...
EPA and EFSA approaches for Benchmark Dose modeling
Benchmark dose (BMD) modeling has become the preferred approach in the analysis of toxicological dose-response data for the purpose of deriving human health toxicity values. The software packages most often used are Benchmark Dose Software (BMDS, developed by EPA) and PROAST (de...
Nonparametric estimation of benchmark doses in environmental risk assessment
Piegorsch, Walter W.; Xiong, Hui; Bhattacharya, Rabi N.; Lin, Lizhen
2013-01-01
Summary: An important statistical objective in environmental risk analysis is estimation of minimum exposure levels, called benchmark doses (BMDs), that induce a pre-specified benchmark response in a dose-response experiment. In such settings, representations of the risk are traditionally based on a parametric dose-response model. It is a well-known concern, however, that if the chosen parametric form is misspecified, inaccurate and possibly unsafe low-dose inferences can result. We apply a nonparametric approach for calculating benchmark doses, based on an isotonic regression method for dose-response estimation with quantal-response data (Bhattacharya and Kong, 2007). We determine the large-sample properties of the estimator, develop bootstrap-based confidence limits on the BMDs, and explore the confidence limits’ small-sample properties via a short simulation study. An example from cancer risk assessment illustrates the calculations. PMID:23914133
BENCHMARK DOSES FOR CHEMICAL MIXTURES: EVALUATION OF A MIXTURE OF 18 PHAHS.
Benchmark doses (BMDs), defined as doses of a substance that are expected to result in a pre-specified level of "benchmark" response (BMR), have been used for quantifying the risk associated with exposure to environmental hazards. The lower confidence limit of the BMD is used as...
Introduction to benchmark dose methods and U.S. EPA's benchmark dose software (BMDS) version 2.1.1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davis, J. Allen, E-mail: davis.allen@epa.gov; Gift, Jeffrey S.; Zhao, Q. Jay
2011-07-15
Traditionally, the No-Observed-Adverse-Effect-Level (NOAEL) approach has been used to determine the point of departure (POD) from animal toxicology data for use in human health risk assessments. However, this approach is subject to substantial limitations that have been well defined, such as strict dependence on the dose selection, dose spacing, and sample size of the study from which the critical effect has been identified. Also, the NOAEL approach fails to take into consideration the shape of the dose-response curve and other related information. The benchmark dose (BMD) method, originally proposed as an alternative to the NOAEL methodology in the 1980s, addresses many of the limitations of the NOAEL method. It is less dependent on dose selection and spacing, and it takes into account the shape of the dose-response curve. In addition, the estimation of a BMD 95% lower bound confidence limit (BMDL) results in a POD that appropriately accounts for study quality (i.e., sample size). With the recent advent of user-friendly BMD software programs, including the U.S. Environmental Protection Agency's (U.S. EPA) Benchmark Dose Software (BMDS), BMD has become the method of choice for many health organizations world-wide. This paper discusses the BMD methods and corresponding software (i.e., BMDS version 2.1.1) that have been developed by the U.S. EPA, and includes a comparison with recently released European Food Safety Authority (EFSA) BMD guidance.
APPLICATION OF BENCHMARK DOSE METHODOLOGY TO DATA FROM PRENATAL DEVELOPMENTAL TOXICITY STUDIES
The benchmark dose (BMD) concept was applied to 246 conventional developmental toxicity datasets from government, industry and commercial laboratories. Five modeling approaches were used, two generic and three specific to developmental toxicity (DT models). BMDs for both quantal ...
A Web-Based System for Bayesian Benchmark Dose Estimation.
Shao, Kan; Shapiro, Andrew J
2018-01-11
Benchmark dose (BMD) modeling is an important step in human health risk assessment and is used as the default approach to identify the point of departure for risk assessment. A probabilistic framework for dose-response assessment has been proposed and advocated by various institutions and organizations; therefore, a reliable tool is needed to provide distributional estimates for BMD and other important quantities in dose-response assessment. We developed an online system for Bayesian BMD (BBMD) estimation and compared results from this software with U.S. Environmental Protection Agency's (EPA's) Benchmark Dose Software (BMDS). The system is built on a Bayesian framework featuring the application of Markov chain Monte Carlo (MCMC) sampling for model parameter estimation and BMD calculation, which makes the BBMD system fundamentally different from the currently prevailing BMD software packages. In addition to estimating the traditional BMDs for dichotomous and continuous data, the developed system is also capable of computing model-averaged BMD estimates. A total of 518 dichotomous and 108 continuous data sets extracted from the U.S. EPA's Integrated Risk Information System (IRIS) database (and similar databases) were used as testing data to compare the estimates from the BBMD and BMDS programs. The results suggest that the BBMD system may outperform the BMDS program in a number of aspects, including fewer failed BMD and BMDL calculations. The BBMD system is a useful alternative tool for estimating BMD with additional functionalities for BMD analysis based on the most recent research. Most importantly, the BBMD has the potential to incorporate prior information to make dose-response modeling more reliable and can provide distributional estimates for important quantities in dose-response assessment, which greatly facilitates the current trend for probabilistic risk assessment. https://doi.org/10.1289/EHP1289.
Benchmark dose analysis via nonparametric regression modeling
Piegorsch, Walter W.; Xiong, Hui; Bhattacharya, Rabi N.; Lin, Lizhen
2013-01-01
Estimation of benchmark doses (BMDs) in quantitative risk assessment traditionally is based upon parametric dose-response modeling. It is a well-known concern, however, that if the chosen parametric model is uncertain and/or misspecified, inaccurate and possibly unsafe low-dose inferences can result. We describe a nonparametric approach for estimating BMDs with quantal-response data based on an isotonic regression method, and also study use of corresponding, nonparametric, bootstrap-based confidence limits for the BMD. We explore the confidence limits’ small-sample properties via a simulation study, and illustrate the calculations with an example from cancer risk assessment. It is seen that this nonparametric approach can provide a useful alternative for BMD estimation when faced with the problem of parametric model uncertainty. PMID:23683057
Categorical Regression and Benchmark Dose Software 3.0
The objective of this full-day course is to provide participants with interactive training on the use of the U.S. Environmental Protection Agency’s (EPA) Benchmark Dose software (BMDS, version 3.0, released fall 2018) and Categorical Regression software (CatReg, version 3.1...
BENCHMARK DOSE TECHNICAL GUIDANCE DOCUMENT ...
The purpose of this document is to provide guidance for the Agency on the application of the benchmark dose approach in determining the point of departure (POD) for health effects data, whether a linear or nonlinear low dose extrapolation is used. The guidance includes discussion on computation of benchmark doses and benchmark concentrations (BMDs and BMCs) and their lower confidence limits, data requirements, dose-response analysis, and reporting requirements. This guidance is based on today's knowledge and understanding, and on experience gained in using this approach.
Soeteman-Hernández, Lya G; Fellows, Mick D; Johnson, George E; Slob, Wout
2015-12-01
In this study, we explored the applicability of using in vitro micronucleus (MN) data from human lymphoblastoid TK6 cells to derive in vivo genotoxicity potency information. Nineteen chemicals covering a broad spectrum of genotoxic modes of action were tested in an in vitro MN test in TK6 cells using the same study protocol. Several of these chemicals were considered to need metabolic activation, and these were administered in the presence of S9. The benchmark dose (BMD) approach was applied using the dose-response modeling program PROAST to estimate the genotoxic potency from the in vitro data. The resulting in vitro BMDs were compared with previously derived BMDs from in vivo MN and carcinogenicity studies. A proportional correlation was observed between the BMDs from the in vitro MN and the BMDs from the in vivo MN assays. Further, a clear correlation was found between the BMDs from in vitro MN and the associated BMDs for malignant tumors. Although these results are based on only 19 compounds, they show that genotoxicity potencies estimated from in vitro tests may result in useful information regarding in vivo genotoxic potency, as well as expected cancer potency. Extension of the number of compounds and further investigation of metabolic activation (S9) and of other toxicokinetic factors would be needed to validate our initial conclusions. However, this initial work suggests that this approach could be used for in vitro to in vivo extrapolations which would support the reduction of animals used in research (3Rs: replacement, reduction, and refinement). © The Author 2015. Published by Oxford University Press on behalf of the Society of Toxicology.
BMDExpress Data Viewer: A Visualization Tool to Analyze BMDExpress Datasets(SoTC)
Background: Benchmark Dose (BMD) modelling is a mathematical approach used to determine where a dose-response change begins to take place relative to controls following chemical exposure. BMDs are being increasingly applied in regulatory toxicology to estimate acceptable exposure...
BMDExpress Data Viewer: A Visualization Tool to Analyze BMDExpress Datasets (STC symposium)
Background: Benchmark Dose (BMD) modelling is a mathematical approach used to determine where a dose-response change begins to take place relative to controls following chemical exposure. BMDs are being increasingly applied in regulatory toxicology to estimate acceptable exposure...
Benchmark Dose (BMD) modelling is a mathematical approach used to determine where a dose-response change begins to take place relative to controls following chemical exposure. BMDs are being increasingly applied in regulatory toxicology to determine points of departure. BMDExpres...
CatReg Software for Categorical Regression Analysis (May 2016)
CatReg 3.0 is a Microsoft Windows enhanced version of the Agency’s categorical regression analysis (CatReg) program. CatReg complements EPA’s existing Benchmark Dose Software (BMDS) by greatly enhancing a risk assessor’s ability to determine whether data from separate toxicologic...
Benchmark Dose for Urinary Cadmium based on a Marker of Renal Dysfunction: A Meta-Analysis
Woo, Hae Dong; Chiu, Weihsueh A.; Jo, Seongil; Kim, Jeongseon
2015-01-01
Background: Low doses of cadmium can cause adverse health effects. Benchmark dose (BMD) and the one-sided 95% lower confidence limit of BMD (BMDL) to derive points of departure for urinary cadmium exposure have been estimated in several previous studies, but the methods used to derive BMD and the estimated BMDs differ. Objectives: We aimed to find the factors that affect BMD calculation in the general population, and to estimate the summary BMD for urinary cadmium using reported BMDs. Methods: A meta-regression was performed and the pooled BMD/BMDL was estimated using studies reporting a BMD and BMDL, weighted by sample size, that were calculated from individual data based on markers of renal dysfunction. Results: BMDs were highly heterogeneous across studies. Meta-regression analysis showed that a significant predictor of BMD was the cut-off point denoting an abnormal level. Using the 95th percentile as a cut-off, the BMD5/BMDL5 estimates for a 5% benchmark response (BMR) of β2-microglobulinuria (β2-MG) were 6.18/4.88 μg/g creatinine in conventional quantal analysis and 3.56/3.13 μg/g creatinine in the hybrid approach, and the BMD5/BMDL5 estimates for a 5% BMR of N-acetyl-β-d-glucosaminidase (NAG) were 10.31/7.61 μg/g creatinine in quantal analysis and 3.21/2.24 μg/g creatinine in the hybrid approach. However, the meta-regression showed that BMD and BMDL were significantly associated with the cut-off point, whereas the BMD calculation method did not significantly affect the results. The urinary cadmium BMDL5 of β2-MG was 1.9 μg/g creatinine in the lowest cut-off point group. Conclusion: The BMD was significantly associated with the cut-off point defining the abnormal level of renal dysfunction markers. PMID:25970611
Labib, Sarah; Williams, Andrew; Kuo, Byron; Yauk, Carole L; White, Paul A; Halappanavar, Sabina
2017-07-01
The assumption of additivity applied in the risk assessment of environmental mixtures containing carcinogenic polycyclic aromatic hydrocarbons (PAHs) was investigated using transcriptomics. Muta™Mouse mice were gavaged for 28 days with three doses of eight individual PAHs, two defined mixtures of PAHs, or coal tar, an environmentally ubiquitous complex mixture of PAHs. Microarrays were used to identify differentially expressed genes (DEGs) in lung tissue collected 3 days post-exposure. Cancer-related pathways perturbed by the individual or mixtures of PAHs were identified, and dose-response modeling of the DEGs was conducted to calculate gene/pathway benchmark doses (BMDs). Individual PAH-induced pathway perturbations (the median gene expression changes for all genes in a pathway relative to controls) and pathway BMDs were applied to models of additivity [i.e., concentration addition (CA), generalized concentration addition (GCA), and independent action (IA)] to generate predicted pathway-specific dose-response curves for each PAH mixture. The predicted and observed pathway dose-response curves were compared to assess the sensitivity of different additivity models. Transcriptomics-based additivity calculation showed that IA accurately predicted the pathway perturbations induced by all mixtures of PAHs. CA did not support the additivity assumption for the defined mixtures; however, GCA improved the CA predictions. Moreover, pathway BMDs derived for coal tar were comparable to BMDs derived from previously published coal tar-induced mouse lung tumor incidence data. These results suggest that in the absence of tumor incidence data, individual chemical-induced transcriptomics changes associated with cancer can be used to investigate the assumption of additivity and to predict the carcinogenic potential of a mixture.
Comparison of Points of Departure for Health Risk Assessment Based on High-Throughput Screening Data
Sand, Salomon; Parham, Fred; Portier, Christopher J.; Tice, Raymond R.; Krewski, Daniel
2016-01-01
Background: The National Research Council’s vision for toxicity testing in the 21st century anticipates that points of departure (PODs) for establishing human exposure guidelines in future risk assessments will increasingly be based on in vitro high-throughput screening (HTS) data. Objectives: The aim of this study was to compare different PODs for HTS data. Specifically, benchmark doses (BMDs) were compared to the signal-to-noise crossover dose (SNCD), which has been suggested as the lowest dose applicable as a POD. Methods: Hill models were fit to > 10,000 in vitro concentration–response curves, obtained for > 1,400 chemicals tested as part of the U.S. Tox21 Phase I effort. BMDs and lower confidence limits on the BMDs (BMDLs) corresponding to extra effects (i.e., changes in response relative to the maximum response) of 5%, 10%, 20%, 30%, and 40% were estimated for > 8,000 curves, along with BMDs and BMDLs corresponding to additional effects (i.e., absolute changes in response) of 5%, 10%, 15%, 20%, and 25%. The SNCD, defined as the dose where the ratio between the additional effect and the difference between the upper and lower bounds of the two-sided 90% confidence interval on absolute effect was 1, 0.67, and 0.5, respectively, was also calculated and compared with the BMDLs. Results: The BMDL40, BMDL25, and BMDL18, defined in terms of extra effect, corresponded to the SNCD1.0, SNCD0.67, and SNCD0.5, respectively, at the median. Similarly, the BMDL25, BMDL17, and BMDL13, defined in terms of additional effect, corresponded to the SNCD1.0, SNCD0.67, and SNCD0.5, respectively, at the median. Conclusions: The SNCD may serve as a reference level that guides the determination of standardized BMDs for risk assessment based on HTS concentration–response data. The SNCD may also have application as a POD for low-dose extrapolation. Citation: Sand S, Parham F, Portier CJ, Tice RR, Krewski D. 2017. Comparison of points of departure for health risk assessment based on high-throughput screening data. Environ Health Perspect 125:623–633; http://dx.doi.org/10.1289/EHP408 PMID:27384688
Bi, Jian
2010-01-01
As the desire to promote health increases, reductions of certain ingredients, for example, sodium, sugar, and fat in food products, are widely requested. However, the reduction is not risk free in sensory and marketing aspects. Over-reduction may change the taste and influence the flavor of a product and lead to a decrease in consumers' overall liking or purchase intent for the product. This article uses the benchmark dose (BMD) methodology to determine an appropriate reduction. Calculations of BMD and the one-sided lower confidence limit of BMD (BMDL) are illustrated. The article also discusses how to calculate BMD and BMDL for overdispersed binary data in replicated testing based on a corrected beta-binomial model. USEPA Benchmark Dose Software (BMDS) was used and S-Plus programs were developed. The method discussed in the article can be used to determine an appropriate reduction of certain ingredients, for example, sodium, sugar, and fat in food products, considering both health reasons and sensory or marketing risk.
Labib, Sarah; Williams, Andrew; Yauk, Carole L; Nikota, Jake K; Wallin, Håkan; Vogel, Ulla; Halappanavar, Sabina
2016-03-15
A diverse class of engineered nanomaterials (ENMs) exhibiting a wide array of physical-chemical properties that are associated with toxicological effects in experimental animals is in commercial use. However, an integrated framework for human health risk assessment (HHRA) of ENMs has yet to be established. Rodent 2-year cancer bioassays, clinical chemistry, and histopathological endpoints are still considered the 'gold standard' for detecting substance-induced toxicity in animal models. However, the use of data derived from alternative toxicological tools, such as genome-wide expression profiling and in vitro high-throughput assays, are gaining acceptance by the regulatory community for hazard identification and for understanding the underlying mode-of-action. Here, we conducted a case study to evaluate the application of global gene expression data in deriving pathway-based points of departure (PODs) for multi-walled carbon nanotube (MWCNT)-induced lung fibrosis, a non-cancer endpoint of regulatory importance. Gene expression profiles from the lungs of mice exposed to three individual MWCNTs with different physical-chemical properties were used within the framework of an adverse outcome pathway (AOP) for lung fibrosis to identify key biological events linking MWCNT exposure to lung fibrosis. Significantly perturbed pathways were categorized along the key events described in the AOP. Benchmark doses (BMDs) were calculated for each perturbed pathway and were used to derive transcriptional BMDs for each MWCNT. Similar biological pathways were perturbed by the different MWCNT types across the doses and post-exposure time points studied. The pathway BMD values showed a time-dependent trend, with lower BMDs for pathways perturbed at the earlier post-exposure time points (24 h, 3d). The transcriptional BMDs were compared to the apical BMDs derived by the National Institute for Occupational Safety and Health (NIOSH) using alveolar septal thickness and fibrotic lesions endpoints. We found that regardless of the type of MWCNT, the BMD values for pathways associated with fibrosis were 14.0-30.4 μg/mouse, which are comparable to the BMDs derived by NIOSH for MWCNT-induced lung fibrotic lesions (21.0-27.1 μg/mouse). The results demonstrate that transcriptomic data can be used as an effective mechanism-based method to derive acceptable levels of exposure to nanomaterials in product development when epidemiological data are unavailable.
Recommended approaches in the application of ...
ABSTRACT: Only a fraction of chemicals in commerce have been fully assessed for their potential hazards to human health due to difficulties involved in conventional regulatory tests. It has recently been proposed that quantitative transcriptomic data can be used to determine benchmark dose (BMD) and estimate a point of departure (POD). Several studies have shown that transcriptional PODs correlate with PODs derived from analysis of pathological changes, but there is no consensus on how the genes that are used to derive a transcriptional POD should be selected. Because of the very large number of unrelated genes in gene expression data, the process of selecting subsets of informative genes is a major challenge. We used published microarray data from studies on rats exposed orally to multiple doses of six chemicals for 5, 14, 28, and 90 days. We evaluated eight different approaches to select genes for POD derivation and compared them to three previously proposed approaches. The relationships between transcriptional BMDs derived using these 11 approaches were compared with PODs derived from apical data that might be used in a human health risk assessment. We found that transcriptional benchmark dose values for all 11 approaches were remarkably aligned with different apical PODs, while a subset of between 3 and 8 of the approaches met standard statistical criteria across the 5-, 14-, 28-, and 90-day time points and thus qualify as effective estimates of apical PODs. Our r
The current state of knowledge on the use of the benchmark dose concept in risk assessment.
Sand, Salomon; Victorin, Katarina; Filipsson, Agneta Falk
2008-05-01
This review deals with the current state of knowledge on the use of the benchmark dose (BMD) concept in health risk assessment of chemicals. The BMD method is an alternative to the traditional no-observed-adverse-effect level (NOAEL) and has been presented as a methodological improvement in the field of risk assessment. The BMD method has mostly been employed in the USA but is now receiving greater attention in Europe as well. The review presents a number of arguments in favor of the BMD, relative to the NOAEL. In addition, it gives a detailed overview of the several procedures that have been suggested and applied for BMD analysis, for quantal as well as continuous data. For quantal data the BMD is generally defined as corresponding to an additional or extra risk of 5% or 10%. For continuous endpoints it is suggested that the BMD is defined as corresponding to a percentage change in response relative to background or relative to the dynamic range of response. Under such definitions, a 5% or 10% change can be considered as default. Besides how to define the BMD and its lower bound, the BMDL, the question of how to select the dose-response model to be used in the BMD and BMDL determination is highlighted. Issues of study design and comparison of dose-response curves and BMDs are also covered. Copyright (c) 2007 John Wiley & Sons, Ltd.
Fournier, K; Tebby, C; Zeman, F; Glorennec, P; Zmirou-Navier, D; Bonvallot, N
2016-02-01
Semi-Volatile Organic Compounds (SVOCs) are commonly present in dwellings and several are suspected of having effects on male reproductive function mediated by an endocrine disruption mode of action. To improve knowledge of the health impact of these compounds, cumulative toxicity indicators are needed. This work derives Benchmark Doses (BMD) and Relative Potency Factors (RPF) for SVOCs acting on the male reproductive system through the same mode of action. We included SVOCs fulfilling the following conditions: detection frequency (>10%) in French dwellings, availability of data on the mechanism/mode of action for male reproductive toxicity, and availability of comparable dose-response relationships. Of 58 SVOCs selected, 18 induce a decrease in serum testosterone levels. Six have sufficient and comparable data to derive BMDs based on 10 or 50% of the response. The SVOCs inducing the largest decrease in serum testosterone concentration are: for 10%, bisphenol A (BMD10 = 7.72E-07 mg/kg bw/d; RPF10 = 7,033,679); for 50%, benzo[a]pyrene (BMD50 = 0.030 mg/kg bw/d; RPF50 = 1630), and the one inducing the smallest decrease is benzyl butyl phthalate (RPF10 and RPF50 = 0.095). This approach encompasses contaminants from diverse chemical families acting through similar modes of action, and makes possible a cumulative risk assessment in indoor environments. The main limitation remains the lack of comparable toxicological data. Copyright © 2015 Elsevier Inc. All rights reserved.
Rager, Julia E; Auerbach, Scott S; Chappell, Grace A; Martin, Elizabeth; Thompson, Chad M; Fry, Rebecca C
2017-10-16
Prenatal inorganic arsenic (iAs) exposure influences the expression of critical genes and proteins associated with adverse outcomes in newborns, in part through epigenetic mediators. The doses at which these genomic and epigenomic changes occur have yet to be evaluated in the context of dose-response modeling. The goal of the present study was to estimate iAs doses that correspond to changes in transcriptomic, proteomic, epigenomic, and integrated multi-omic signatures in human cord blood through benchmark dose (BMD) modeling. Genome-wide DNA methylation, microRNA expression, mRNA expression, and protein expression levels in cord blood were modeled against total urinary arsenic (U-tAs) levels from pregnant women exposed to varying levels of iAs. Dose-response relationships were modeled in BMDExpress, and BMDs representing 10% response levels were estimated. Overall, DNA methylation changes were estimated to occur at lower exposure concentrations in comparison to other molecular endpoints. Multi-omic module eigengenes were derived through weighted gene co-expression network analysis, representing co-modulated signatures across transcriptomic, proteomic, and epigenomic profiles. One module eigengene was associated with decreased gestational age occurring alongside increased iAs exposure. Genes/proteins within this module eigengene showed enrichment for organismal development, including potassium voltage-gated channel subfamily Q member 1 (KCNQ1), an imprinted gene showing differential methylation and expression in response to iAs. Modeling of this prioritized multi-omic module eigengene resulted in a BMD(BMDL) of 58(45) μg/L U-tAs, which was estimated to correspond to drinking water arsenic concentrations of 51(40) μg/L. Results are in line with epidemiological evidence supporting effects of prenatal iAs occurring at levels <100 μg As/L urine. Together, findings present a variety of BMD measures to estimate doses at which prenatal iAs exposure influences neonatal outcome-relevant transcriptomic, proteomic, and epigenomic profiles.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Raymond S.H.; Dennison, James E.
2007-09-01
The inter-relationship of 'Thresholds' between chemical mixtures and their respective component single chemicals was studied using three sets of data and two types of analyses. Two in vitro data sets involve cytotoxicity in human keratinocytes from treatment of metals and a metal mixture [Bae, D.S., Gennings, C., Carter, Jr., W.H., Yang, R.S.H., Campain, J.A., 2001. Toxicological interactions among arsenic, cadmium, chromium, and lead in human keratinocytes. Toxicol. Sci. 63, 132-142; Gennings, C., Carter, Jr., W.H., Campain, J.A., Bae, D.S., Yang, R.S.H., 2002. Statistical analysis of interactive cytotoxicity in human epidermal keratinocytes following exposure to a mixture of four metals. J. Agric. Biol. Environ. Stat. 7, 58-73], and induction of estrogen receptor alpha (ER-α) reporter gene in MCF-7 human breast cancer cells by estrogenic xenobiotics [Gennings, C., Carter, Jr., W.H., Carney, E.W., Charles, G.D., Gollapudi, B.B., Carchman, R.A., 2004. A novel flexible approach for evaluating fixed ratio mixtures of full and partial agonists. Toxicol. Sci. 80, 134-150]. The third data set came from PBPK modeling of gasoline and its components in the human. For in vitro cellular responses, we employed Benchmark Dose Software (BMDS) to obtain BMD01, BMD05, and BMD10. We then plotted these BMDs against exposure concentrations for the chemical mixture and its components to assess the ranges and slopes of these BMD-concentration lines. In doing so, we consider certain BMDs to be 'Interaction Thresholds' or 'Thresholds' for mixtures and their component single chemicals, and the slope of the line must be a reflection of the potency of the biological effects. For in vivo PBPK modeling, we used 0.1x TLVs, TLVs, and 10x TLVs for gasoline and six component markers as input dosing for PBPK modeling. In this case, the venous blood levels under the hypothetical exposure conditions become our designated 'Interaction Thresholds' or 'Thresholds' for gasoline and its component single chemicals. Our analyses revealed that the mixture 'Interaction Thresholds' appear to stay within the bounds of the 'Thresholds' of its respective component single chemicals. Although such a trend appears to be emerging, nevertheless, it should be emphasized that our analyses are based on limited data sets and further analyses, preferably on more comprehensive experimental data sets, are needed before a definitive conclusion can be drawn.
Modeling Respiratory Toxicity of Authentic Lunar Dust
NASA Technical Reports Server (NTRS)
Santana, Patricia A.; James, John T.; Lam, Chiu-Wing
2010-01-01
The lunar expeditions of the Apollo operations from the '60s and early '70s have generated awareness about lunar dust exposures and their implication towards future lunar explorations. Critical analyses of the reports from the Apollo crew members suggest that lunar dust is a mild respiratory and ocular irritant. Currently, NASA's space toxicology group is working with the Lunar Airborne Dust Toxicity Assessment Group (LADTAG) and the National Institute for Occupational Safety and Health (NIOSH) to investigate and examine toxic effects to the respiratory system of rats in order to establish permissible exposure levels (PELs) for human exposure to lunar dust. In collaboration with the space toxicology group, LADTAG and NIOSH, the goal of the present research is to analyze dose-response curves from rat exposures seven and twenty-eight days after intrapharyngeal instillations, and model the response using Benchmark Dose Software (BMDS) from the Environmental Protection Agency (EPA). Via this analysis, the relative toxicities of three types of Apollo 14 lunar dust samples and two control dust samples, titanium dioxide (TiO2) and quartz, will be determined. This will be executed for several toxicity endpoints such as cell counts and biochemical markers in bronchoalveolar lavage fluid (BALF) harvested from the rats.
Palmer, Cameron S; Davey, Tamzyn M; Mok, Meng Tuck; McClure, Rod J; Farrow, Nathan C; Gruen, Russell L; Pollard, Cliff W
2013-06-01
Trauma registries are central to the implementation of effective trauma systems. However, differences between trauma registry datasets make comparisons between trauma systems difficult. In 2005, the collaborative Australian and New Zealand National Trauma Registry Consortium began a process to develop a bi-national minimum dataset (BMDS) for use in Australasian trauma registries. This study aims to describe the steps taken in the development and preliminary evaluation of the BMDS. A working party comprising sixteen representatives from across Australasia identified and discussed the collectability and utility of potential BMDS fields. This included evaluating existing national and international trauma registry datasets, as well as reviewing all quality indicators and audit filters in use in Australasian trauma centres. After the working party activities concluded, this process was continued by a number of interested individuals, with broader feedback sought from the Australasian trauma community on a number of occasions. Once the BMDS had reached a suitable stage of development, an email survey was conducted across Australasian trauma centres to assess whether BMDS fields met an ideal minimum standard of field collectability. The BMDS was also compared with three prominent international datasets to assess the extent of dataset overlap. Following this, the BMDS was encapsulated in a data dictionary, which was introduced in late 2010. The finalised BMDS contained 67 data fields. Forty-seven of these fields met a previously published criterion of 80% collectability across respondent trauma institutions; the majority of the remaining fields either could be collected without any change in resources, or could be calculated from other data fields in the BMDS. However, comparability with international registry datasets was poor. Only nine BMDS fields had corresponding, directly comparable fields in all the national and international-level registry datasets evaluated. A draft BMDS has been developed for use in trauma registries across Australia and New Zealand. The email survey provided strong indications of the utility of the fields contained in the BMDS. The BMDS has been adopted as the dataset to be used by an ongoing Australian Trauma Quality Improvement Program. Copyright © 2012 Elsevier Ltd. All rights reserved.
BMDS 2.0 BETA WITH NEW QUANTAL MODEL DEVELOPMENT EXTERNAL REVIEW REPORTS AND SUPPORTING DOCUMENTS
With the availability of BMDS 2.0 on the BMDS web site, EPA is providing (a) results of the external review (charge to reviewers and reviewer comments), (b) EPA responses to the review comments, and (c) a report describing development and testing of the models in BMDS 2.0 with ne...
Dose and Effect Thresholds for Early Key Events in a Mode of ...
ABSTRACT: Strategies for predicting adverse health outcomes of environmental chemicals are centered on early key events in toxicity pathways. However, quantitative relationships between early molecular changes in a given pathway and later health effects are often poorly defined. The goal of this study was to evaluate short-term key event indicators using qualitative and quantitative methods in an established pathway of mouse liver tumorigenesis mediated by peroxisome proliferator-activated receptor-alpha (PPARα). Male B6C3F1 mice were exposed for 7 days to di(2-ethylhexyl) phthalate (DEHP), di-n-octyl phthalate (DNOP), and n-butyl benzyl phthalate (BBP), which vary in PPARα activity and liver tumorigenicity. Each phthalate increased expression of select PPARα target genes at 7 days, while only DEHP significantly increased liver cell proliferation labeling index (LI). Transcriptional benchmark dose (BMDT) estimates for dose-related genomic markers stratified phthalates according to hypothetical tumorigenic potencies, unlike BMDs for non-genomic endpoints (liver weights or proliferation). The 7-day BMDT values for Acot1 as a surrogate measure for PPARα activation were 29, 370, and 676 mg/kg-d for DEHP, DNOP, and BBP, respectively, distinguishing DEHP (liver tumor BMD of 35 mg/kg-d) from non-tumorigenic DNOP and BBP. Effect thresholds were generated using linear regression of DEHP effects at 7 days and 2-year tumor incidence values to anchor early response molec
2015 Assessment of the Ballistic Missile Defense System (BMDS)
2016-04-01
performance and test adequacy of the BMDS, its four autonomous BMDS systems, and its sensor/command and control architecture. The four autonomous BMDS...Patriot. The Command and Control, Battle Management, and Communications (C2BMC) element anchors the sensor/command and control architecture. This...Warfare operations against a cruise missile surrogate. Ground-based Midcourse Defense (GMD). GMD has demonstrated capability against small
BMDExpress Data Viewer: A Visualization Tool to Analyze ...
Regulatory agencies increasingly apply benchmark dose (BMD) modeling to determine points of departure in human risk assessments. BMDExpress applies BMD modeling to transcriptomics datasets and groups genes to biological processes and pathways for rapid assessment of doses at which biological perturbations occur. However, graphing and analytical capabilities within BMDExpress are limited, and the analysis of output files is challenging. We developed a web-based application, BMDExpress Data Viewer, for visualization and graphical analyses of BMDExpress output files. The software application consists of two main components: ‘Summary Visualization Tools’ and ‘Dataset Exploratory Tools’. We demonstrate through two case studies that the ‘Summary Visualization Tools’ can be used to examine and assess the distributions of probe and pathway BMD outputs, as well as derive a potential regulatory BMD through the modes or means of the distributions. The ‘Functional Enrichment Analysis’ tool presents biological processes in a two-dimensional bubble chart view. By applying filters of pathway enrichment p-value and minimum number of significant genes, we showed that the Functional Enrichment Analysis tool can be applied to select pathways that are potentially sensitive to chemical perturbations. The ‘Multiple Dataset Comparison’ tool enables comparison of BMDs across multiple experiments (e.g., across time points, tissues, or organisms, etc.). The ‘BMDL-BM
Developing chemical criteria for wildlife: The benchmark dose versus NOAEL approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Linder, G.
1995-12-31
Wildlife may be exposed to a wide variety of chemicals in their environment, and various strategies for evaluating wildlife risk for these chemicals have been developed. One, a "no-observable-adverse-effects-level" or NOAEL-approach, has increasingly been applied to develop chemical criteria for wildlife. In this approach, the NOAEL represents the highest experimental concentration at which there is no statistically significant change in some toxicity endpoint relative to a control. Another, the "benchmark dose" or BMD-approach, relies on the lower confidence limit for a concentration that corresponds to a small, but statistically significant, change in effect over some reference condition. Rather than corresponding to a single experimental concentration as does the NOAEL, the BMD-approach considers the full concentration response curve for derivation of the BMD. Here, using a variety of vertebrates and an assortment of chemicals (including carbofuran, paraquat, methylmercury, cadmium, zinc, and copper), the NOAEL-approach will be critically evaluated relative to the BMD approach. Statistical models used in the BMD approach suggest these methods are potentially available for eliminating safety factors in risk calculations. A reluctance to recommend this, however, stems from the uncertainty associated with the shape of concentration-response curves at low concentrations. Also, with existing data the derivation of BMDs has shortcomings when sample size is small (10 or fewer animals per treatment). The success of BMD models clearly depends upon the continued collection of wildlife data in the field and laboratory, the design of toxicity studies sufficient for BMD calculations, and complete reporting of these results in the literature. Overall, the BMD approach for developing chemical criteria for wildlife should be given further consideration, since it more fully evaluates concentration-response data.
A health risk benchmark for the neurologic effects of styrene: comparison with NOAEL/LOAEL approach.
Rabovsky, J; Fowles, J; Hill, M D; Lewis, D C
2001-02-01
Benchmark dose (BMD) analysis was used to estimate an inhalation benchmark concentration for styrene neurotoxicity. Quantal data on neuropsychologic test results from styrene-exposed workers [Mutti et al. (1984). American Journal of Industrial Medicine, 5, 275-286] were used to quantify neurotoxicity, defined as the percent of tested workers who responded abnormally to ≥1, ≥2, or ≥3 out of a battery of eight tests. Exposure was based on previously published results on mean urinary mandelic- and phenylglyoxylic acid levels in the workers, converted to air styrene levels (15, 44, 74, or 115 ppm). Nonstyrene-exposed workers from the same region served as a control group. Maximum-likelihood estimates (MLEs) and BMDs at 5 and 10% response levels of the exposed population were obtained from log-normal analysis of the quantal data. The highest MLE was 9 ppm (BMD = 4 ppm) styrene and represents abnormal responses to ≥3 tests by 10% of the exposed population. The most health-protective MLE was 2 ppm styrene (BMD = 0.3 ppm) and represents abnormal responses to ≥1 test by 5% of the exposed population. A no observed adverse effect level/lowest observed adverse effect level (NOAEL/LOAEL) analysis of the same quantal data showed workers in all styrene exposure groups responded abnormally to ≥1, ≥2, or ≥3 tests, compared to controls, and the LOAEL was 15 ppm. A comparison of the BMD and NOAEL/LOAEL analyses suggests that at air styrene levels below the LOAEL, a segment of the worker population may be adversely affected. The benchmark approach will be useful for styrene noncancer risk assessment purposes by providing a more accurate estimate of potential risk that should, in turn, help to reduce the uncertainty that is a common problem in setting exposure levels.
A Mode-of-Action Approach for the Identification of Genotoxic Carcinogens
Hernández, Lya G.; van Benthem, Jan; Johnson, George E.
2013-01-01
Distinguishing between clastogens and aneugens is vital in cancer risk assessment because the default assumption is that clastogens and aneugens have linear and non-linear dose-response curves, respectively. Any observed non-linearity must be supported by mode of action (MOA) analyses where biological mechanisms are linked with dose-response evaluations. For aneugens, the MOA has been well characterised as disruption of the mitotic machinery, where chromosome loss via micronuclei (MN) formation is an accepted endpoint used in risk assessment. In this study we performed the cytokinesis-block micronucleus assay and immunofluorescence visualisation of the mitotic machinery in human lymphoblastoid (AHH-1) and Chinese hamster fibroblast (V79) cell lines after treatment with the aneugen 17-β-oestradiol (E2). Results were compared to previously published data on bisphenol-A (BPA) and rotenone. Two concentration-response approaches (the threshold [Td] and benchmark dose [BMD] approaches) were applied to derive a point of departure (POD) for in vitro MN induction. BMDs were also derived from the most sensitive carcinogenic endpoint. Ranking comparisons of the PODs from the in vitro MN and the carcinogenicity studies demonstrated a link between these two endpoints for BPA, E2 and rotenone. This analysis was extended to include 5 additional aneugens, 5 clastogens and 3 mutagens, and further concentration- and dose-response correlations were observed between PODs from the in vitro MN and carcinogenicity studies. This approach is promising and may be further extended to other genotoxic carcinogens, where MOA and quantitative information from the in vitro MN studies could be used in a quantitative manner to further inform cancer risk assessment. PMID:23675539
Peer Review of EPA's Draft BMDS Document: Exponential ...
BMDS is one of the Agency's premier tools for risk assessment modeling; therefore, the validity and reliability of its statistical models are of paramount importance. This page provides links to peer reviews of the BMDS applications and their models as they were developed and eventually released, documenting the rigorous review process taken to provide the best science tools available for statistical modeling.
BMDS/SSA Integrated Sensing Demonstration (BISD)
NASA Astrophysics Data System (ADS)
Turner, T.; Springford, K.; Grimaldi, L.
2011-09-01
This demonstration is intended to provide a near-term prototype, leave-behind capability for integrating Ballistic Missile Defense System (BMDS) ground sensors for use in the Space Situational Awareness (SSA) mission. Closed-loop tasking and cueing capability will be implemented, and a demonstration of net-centric space data dissemination using the BMDS sensors will be undertaken using various SSA mission threads. The demonstration is designed to highlight the implications of modifying software and/or hardware at the BMDS command and control node so that cost, risk, and schedule for an operational implementation can be fully understood. Additionally, this demonstration is intended to assess the impacts to both mission areas as a multi-mission, non-traditional sensor capability is integrated into the SSA mission. A successful demonstration will have many leave-behind capabilities and first-of-its-kind achievements, including: a) an extensible SSA operational prototype configuration for BMDS X-band radars such as AN/TPY-2 and Sea-Based X-Band (SBX); b) a prototype SSA tasking and cueing capability between the Joint Functional Component Command for Space (JFCC Space) Joint Space Operations Center (JSpOC) and the Command, Control, Battle Management, and Communications (C2BMC) Experimental Laboratory (X-Lab), extensible to the Combatant Commands (COCOMs) and out to BMDS sensors; c) a two-way, net-centric interface for JSpOC space operations, to include translation from net-centric communications to legacy systems; and d) processing of BMDS X-band radar tracks in the Space Defense Operations Center (SPADOC).
Gu, Jie-mei; Wang, Li; Lin, Hua; Chen, De-cai; Tang, Hai; Jin, Xiao-lan; Xia, Wei-bo; Hu, Yun-qiu; Fu, Wen-zhen; He, Jin-wei; Zhang, Hao; Wang, Chun; Yue, Hua; Hu, Wei-wei; Liu, Yu-juan; Zhang, Zhen-lin
2015-07-01
Oral risedronate is effective in the treatment of postmenopausal osteoporosis when administered daily, weekly, or monthly. In this 1-year, randomized, double-blind, multicenter study we compared the weekly 35-mg and daily 5-mg risedronate dosing regimens in the treatment of Chinese postmenopausal women with osteoporosis or osteopenia. Postmenopausal women with primary osteoporosis or osteopenia were randomly assigned to the weekly group or daily group (n=145 for each), which received oral risedronate 35 mg once a week or 5 mg daily, respectively, for 1 year. The subjects' bone mineral densities (BMDs), bone turnover markers (P1NP and β-CTX), new vertebral fractures, and adverse events were assessed at baseline and during the treatments. All subjects in the weekly group and 144 subjects in the daily group completed the study. For the primary efficacy endpoint after 1 year, i.e., the mean percent change in the lumbar spine BMD (95% CI), results were 4.87% (3.92% to 5.81%) for the weekly group and 4.35% (3.31% to 5.39%) for the daily group. The incidences of clinical adverse events were 48.3% in the weekly group and 54.2% in the daily group. The weekly 35-mg and daily 5-mg risedronate dosing regimens during 1 year of follow-up show similar efficacy in improving BMDs and biochemical markers of bone turnover in Chinese postmenopausal women with osteoporosis or osteopenia. Moreover, the two dosing regimens exhibit similar safety and tolerability.
Peer Review Documents Related to the Evaluation of ...
BMDS is one of the Agency's premier tools for risk assessment modeling; therefore, the validity and reliability of its statistical models are of paramount importance. This page provides links to peer reviews and expert summaries of the BMDS applications and their models as they were developed and eventually released, documenting the rigorous review process taken to provide the best science tools available for statistical modeling.
2007-01-01
Exhibit 4-8. Freshwater Species Tolerance to Acidity...environments or specific threatened or endangered species. Radio frequency use and testing would be coordinated with the appropriate resource management...impacts to the environment and the threatened and endangered species, the unique or sensitive environments, and the migratory, breeding, and
The Model Averaging for Dichotomous Response Benchmark Dose (MADr-BMD) Tool
Provides the quantal response models that are also used in the U.S. EPA benchmark dose software suite, and generates a model-averaged dose-response model to produce benchmark dose and benchmark dose lower bound estimates.
Ballistic Missile Defense System (BMDS)
2015-12-01
Mission and Description: To develop, test, and field a layered
2014 Assessment of the Ballistic Missile Defense System (BMDS)
2015-03-23
...take several more years to collect the test data needed to adequately VV&A the BMDS M&S required to perform such assessments. As data are collected... Accreditation is possible only if a sufficient quantity and quality of flight test data have been collected to support model verification and
2007-07-01
Boeing-led Airborne Laser Team Actively Tracks Airborne Target, Compensates for Atmospheric Turbulence and Fires Surrogate High-Energy Laser... System Requirements Analysis and Technological Support for the Ballistic Missile Defense System (BMDS) FY07 Progress Report
77 FR 36533 - Notice of Availability of the Benchmark Dose Technical Guidance
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-19
ENVIRONMENTAL PROTECTION AGENCY [FRL-9688-7] Notice of Availability of the Benchmark Dose Technical Guidance AGENCY: Environmental Protection Agency (EPA). ACTION: Notice of Availability. SUMMARY: The U.S. Environmental Protection Agency is announcing the availability of Benchmark Dose Technical...
A Multimodel Approach for Calculating Benchmark Dose
Ramon I. Garcia and R. Woodrow Setzer
In the assessment of dose response, a number of plausible dose-response models may give fits that are consistent with the data. If no dose-response formulation had been speci...
Role of the standard deviation in the estimation of benchmark doses with continuous data.
Gaylor, David W; Slikker, William
2004-12-01
For continuous data, risk is defined here as the proportion of animals with values above a large percentile, e.g., the 99th percentile or below the 1st percentile, for the distribution of values among control animals. It is known that reducing the standard deviation of measurements through improved experimental techniques will result in less stringent (higher) doses for the lower confidence limit on the benchmark dose that is estimated to produce a specified risk of animals with abnormal levels for a biological effect. Thus, a somewhat larger (less stringent) lower confidence limit is obtained that may be used as a point of departure for low-dose risk assessment. It is shown in this article that it is important for the benchmark dose to be based primarily on the standard deviation among animals, s(a), apart from the standard deviation of measurement errors, s(m), within animals. If the benchmark dose is incorrectly based on the overall standard deviation among average values for animals, which includes measurement error variation, the benchmark dose will be overestimated and the risk will be underestimated. The bias increases as s(m) increases relative to s(a). The bias is relatively small if s(m) is less than one-third of s(a), a condition achieved in most experimental designs.
Model Uncertainty and Bayesian Model Averaged Benchmark Dose Estimation for Continuous Data
The benchmark dose (BMD) approach has gained acceptance as a valuable risk assessment tool, but risk assessors still face significant challenges associated with selecting an appropriate BMD/BMDL estimate from the results of a set of acceptable dose-response models. Current approa...
Detection technique of targets for missile defense system
NASA Astrophysics Data System (ADS)
Guo, Hua-ling; Deng, Jia-hao; Cai, Ke-rong
2009-11-01
Ballistic missile defense system (BMDS) is a weapon system for intercepting enemy ballistic missiles. It includes a ballistic-missile warning system, a target discrimination system, anti-ballistic-missile guidance systems, and a command-control communication system. Infrared imaging detection and laser imaging detection are widely used in BMDS for surveillance, target detection, target tracking, and target discrimination. Based on a comprehensive review of the application of target-detection techniques in the missile defense system, including infrared focal plane arrays (IRFPA), ground-based radar detection technology, and 3-dimensional imaging laser radar with photon-counting avalanche photodiode (APD) arrays and microchip lasers, this paper focuses on the infrared and laser imaging detection techniques in the missile defense system, as well as the trends for their future development.
Ikedo, Aoi; Ishibashi, Aya; Matsumiya, Saori; Kaizaki, Aya; Ebi, Kumiko; Fujita, Satoshi
2016-01-01
We aimed to compare site-specific bone mineral densities (BMDs) between adolescent endurance runners and sprinters and examine the relationship of fat-free mass (FFM) and nutrient intake to BMD. In this cross-sectional study, 37 adolescent female endurance runners and sprinters (16.1 ± 0.8 years) were recruited. BMD and FFM were assessed by dual-energy X-ray absorptiometry. Nutrient intake and menstrual state were evaluated by questionnaires. After adjusting for covariates, spine and total body less head (TBLH) BMDs were significantly higher in sprinters than endurance runners (TBLH, 1.02 ± 0.05 vs. 0.98 ± 0.06 g/cm2; spine, 0.99 ± 0.06 vs. 0.94 ± 0.06 g/cm2; p < 0.05). There was no significant difference between groups at other sites. The rate of menstrual abnormality was higher in endurance runners than in sprinters (56.3% vs. 23.8%; p < 0.05). FFM was a significant covariate for BMD at all sites except the spine (p < 0.05). Dietary intake of vitamin D was identified as a significant covariate only for pelvic BMD (p < 0.05). The BMDs of different sites among endurance runners and sprinters were strongly related to FFM. However, spine BMD cannot be explained by FFM alone; other factors, including nutrition and/or mechanical loading, may affect the spine BMD. PMID:27916891
Guo, Liting; Gao, Zhihong; Ge, Huanqi
2017-01-01
The aim of this study was to observe the levels of serum 25-hydroxyvitamin D (25OHD), parathyroid hormone, and bone mineral density (BMD) in type 2 diabetes, and to analyze the correlation between 25OHD level and BMD. The subjects included 368 type 2 diabetic patients, aged 40-79 years, and 300 non-diabetic control subjects matched for age, gender, and body mass index. The serum 25OHD concentration, parathyroid hormone level, and BMD values at the lumbar spine (L1-L4), femoral neck, total hip, and total body were measured. BMDs (g/cm2) were measured by LUNAR dual-energy X-ray absorptiometry (DEXA). (1) Compared with control subjects, the serum 25OHD level and the BMDs at the femoral neck and total hip were lower in type 2 diabetes [(45±17 vs. 36±12 nmol/L), (0.93±0.17 vs. 0.85±0.14 g/cm2), (0.93±0.14 vs. 0.87±0.15 g/cm2) (all P<0.05)]; the parathyroid hormone level was higher in type 2 diabetes than in control subjects (8.5±4.2 vs. 5.6±3.9 pmol/L) (P<0.05). (2) Compared with the diabetes duration ≤10 years group, BMDs at the femoral neck and total hip were lower in the duration >10 years group [(0.88±0.11 vs. 0.81±0.15 g/cm2), (0.91±0.14 vs. 0.84±0.16 g/cm2) (all P<0.05)]; the parathyroid hormone level was higher in the duration >10 years group (10.6±9.1 vs. 7.1±3.7 pmol/L) (P<0.05). (3) Compared with the hemoglobin A1c (HbA1c) ≤8% group, 25OHD and BMDs at the femoral neck and total hip were lower in the HbA1c >8% group [(40±15 vs. 32±13 nmol/L), (0.89±0.13 vs. 0.83±0.13 g/cm2), (0.95±0.13 vs. 0.83±0.16 g/cm2) (all P<0.05)] and the parathyroid hormone level was higher (7.2±4.0 vs. 10.0±8.8 pmol/L) (P<0.05). (4) The prevalence of diabetic osteoporosis and osteopenia (41.0%, 47.8%) was higher than in control subjects (27.0%, 33.3%) (χ2 = 4.37 and 4.70, P = 0.04 and 0.03); diabetes duration, HbA1c, and parathyroid hormone levels were longer or higher in the diabetic osteoporosis group than in the normal-BMD and osteopenia groups (all P<0.05). (5) Simple correlation analysis showed that BMD at the femoral neck was negatively correlated with age, diabetes duration, HbA1c, and parathyroid hormone (rs = -0.18, -0.23, -0.18, -0.25), and positively correlated with 25OHD (rs = 0.23). Decreased BMDs and an increased incidence of osteoporosis were observed in type 2 diabetic patients, and these were closely related to the serum 25OHD level. These findings were more prominent at the femoral neck and total hip in patients with a longer diabetic history and poor glycemic control.
Correlation of Noncancer Benchmark Doses in Short- and Long-Term Rodent Bioassays.
Kratchman, Jessica; Wang, Bing; Fox, John; Gray, George
2018-05-01
This study investigated whether, in the absence of chronic noncancer toxicity data, short-term noncancer toxicity data can be used to predict chronic toxicity effect levels by focusing on the dose-response relationship instead of a critical effect. Data from National Toxicology Program (NTP) technical reports have been extracted and modeled using the Environmental Protection Agency's Benchmark Dose Software. Best-fit, minimum benchmark dose (BMD), and benchmark dose lower limits (BMDLs) have been modeled for all NTP pathologist identified significant nonneoplastic lesions, final mean body weight, and mean organ weight of 41 chemicals tested by NTP between 2000 and 2012. Models were then developed at the chemical level using orthogonal regression techniques to predict chronic (two years) noncancer health effect levels using the results of the short-term (three months) toxicity data. The findings indicate that short-term animal studies may reasonably provide a quantitative estimate of a chronic BMD or BMDL. This can allow for faster development of human health toxicity values for risk assessment for chemicals that lack chronic toxicity data. © 2017 Society for Risk Analysis.
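As an illustration of the chemical-level regression step described above, the sketch below fits an orthogonal (total-least-squares) line to hypothetical paired log-BMDLs from 90-day and 2-year studies. The data values and the use of log10 units are assumptions for illustration, not values from the study:

```python
import numpy as np

def orthogonal_fit(x, y):
    """Total-least-squares (orthogonal) regression line y = a + b*x via SVD."""
    X = np.column_stack([x - x.mean(), y - y.mean()])
    _, _, vt = np.linalg.svd(X, full_matrices=False)
    b = vt[0, 1] / vt[0, 0]          # direction of the first principal axis
    a = y.mean() - b * x.mean()
    return a, b

# Hypothetical log10 BMDLs (mg/kg-day): 90-day study vs 2-year study
log_bmdl_90d = np.log10(np.array([12.0, 3.5, 45.0, 0.8, 7.2]))
log_bmdl_2yr = np.log10(np.array([8.0, 2.1, 30.0, 0.5, 5.0]))
a, b = orthogonal_fit(log_bmdl_90d, log_bmdl_2yr)
print(f"predicted chronic log-BMDL = {a:.2f} + {b:.2f} * subchronic log-BMDL")
```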
Diniz, Tiego Aparecido; Agostinete, Ricardo Ribeiro; Costa, Paulo; Saraiva, Bruna Thamyres Ciccotti; Sonvenso, Diego Kanashiro; Freitas, Ismael Forte; Fernandes, Rômulo Araujo; Christofaro, Diego Giulliano Destro
2017-01-01
This study aimed to investigate the relationship between total and segmental bone mineral density (BMD) and physical activity (PA) in different domains (school, leisure and sports) among adolescents and children. Cross-sectional study at the Universidade Estadual Paulista Júlio de Mesquita Filho (UNESP). The study sample consisted of 173 children and adolescents (10.31 ± 1.87 years). The BMDs for the whole body (WB) and the regions of the trunk and legs were measured using dual energy X-ray absorptiometry (DXA). PA was measured using the Baecke questionnaire. A regression model was used to analyze the relationship between all the BMDs and the different domains of PA. 41.5% of the adolescents had high percentages of body fat. Comparing physically active and insufficiently active adolescents, there were no statistically significant differences in any BMD variables (P > 0.05). The BMD of the legs showed positive relationships with total PA (β = 0.009; P = 0.013) and sports PA (β = 0.010; P = 0.049) after adjustment for confounders. Similarly, the WB BMD showed the same relationships (total PA: β = 0.005; P = 0.045; and sports PA: β = 0.008; P = 0.049). No relationship was found between leisure or school PA and any of the BMDs (P > 0.05). The results indicated that practice of sport was related to higher BMD values, independent of sex, age and body fatness.
The Missile Defense Agency's space tracking and surveillance system
NASA Astrophysics Data System (ADS)
Watson, John; Zondervan, Keith
2008-10-01
The Ballistic Missile Defense System (BMDS) is a layered system incorporating elements in space. In addition to missile warning systems at geosynchronous altitudes, an operational BMDS will include a low Earth orbit (LEO) system, the Space Tracking and Surveillance System (STSS). It will use infrared sensing technologies synergistically with the Space Based Infrared Systems (SBIRS) and will provide a seamless adjunct to radars and sensors on the ground and in airborne platforms. STSS is being designed for a future operational capability to defend against evolving threats. STSS development is divided into phases, commencing with a two-satellite demonstration constellation scheduled for launch in 2008. The demonstration satellites will conduct a menu of tests and experiments to prove the system concept, including the ground segment. They will have limited operational capability within the integrated BMDS. Data from the demonstration satellites will be received and processed by the Missile Defense Space Experiment Center (MDSEC), a part of the Missile Defense Integration and Operations Center (MDIOC). In 2007, MDA launched into LEO a satellite (NFIRE) designed to make near-field multispectral measurements of boosting targets and to demonstrate laser communication, the latter in conjunction with the German satellite TerraSAR-X. The gimbaled, lightweight laser terminal has demonstrated on orbit a 5.5 Gbps rate in both directions. The filter passbands of NFIRE are similar to those of the STSS demonstrator track sensor. While providing useful phenomenology during its time on orbit, NFIRE will also serve as a pathfinder in the development of STSS operations procedures.
Wheeler, Matthew W; Bailer, A John
2007-06-01
Model averaging (MA) has been proposed as a method of accounting for model uncertainty in benchmark dose (BMD) estimation. The technique has been used to average BMD dose estimates derived from dichotomous dose-response experiments, microbial dose-response experiments, as well as observational epidemiological studies. While MA is a promising tool for the risk assessor, a previous study suggested that the simple strategy of averaging individual models' BMD lower limits did not yield interval estimators that met nominal coverage levels in certain situations, and this performance was very sensitive to the underlying model space chosen. We present a different, more computationally intensive, approach in which the BMD is estimated using the average dose-response model and the corresponding benchmark dose lower bound (BMDL) is computed by bootstrapping. This method is illustrated with TiO2 dose-response rat lung cancer data, and then systematically studied through an extensive Monte Carlo simulation. The results of this study suggest that the MA-BMD, estimated using this technique, performs better, in terms of bias and coverage, than the previous MA methodology. Further, the MA-BMDL achieves nominal coverage in most cases, and is superior to picking the "best fitting model" when estimating the benchmark dose. Although these results show utility of MA for benchmark dose risk estimation, they continue to highlight the importance of choosing an adequate model space as well as proper model fit diagnostics.
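The bootstrap step described above can be sketched as follows. The dose-response data are hypothetical, and the model-averaged BMD estimator is replaced here by a simple model-free stand-in (linear interpolation of extra risk), since the paper's estimator averages fitted models by information-criterion weight:

```python
import numpy as np

rng = np.random.default_rng(0)
doses = np.array([0.0, 10.0, 50.0, 250.0])
n = np.array([50, 50, 50, 50])
events = np.array([2, 5, 14, 30])          # hypothetical quantal tumor data

def bmd_estimate(events, n, doses, bmr=0.10):
    """Stand-in for the model-averaged BMD: linear interpolation of extra
    risk vs dose. (A real implementation should fit and average the candidate
    models, and handle non-monotone resamples.)"""
    p = events / n
    extra = (p - p[0]) / (1 - p[0])        # extra risk over background
    return float(np.interp(bmr, extra, doses))

# Parametric bootstrap: resample counts per dose group, re-estimate the BMD,
# and take the 5th percentile as a one-sided 95% lower bound (the BMDL).
boot = [bmd_estimate(rng.binomial(n, events / n), n, doses) for _ in range(2000)]
bmdl = np.percentile(boot, 5)
print(f"BMD = {bmd_estimate(events, n, doses):.1f}, bootstrap BMDL = {bmdl:.1f}")
```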
Experimental benchmarking of a Monte Carlo dose simulation code for pediatric CT
NASA Astrophysics Data System (ADS)
Li, Xiang; Samei, Ehsan; Yoshizumi, Terry; Colsher, James G.; Jones, Robert P.; Frush, Donald P.
2007-03-01
In recent years, there has been a desire to reduce CT radiation dose to children because of their susceptibility and prolonged risk for cancer induction. Concerns arise, however, as to the impact of dose reduction on image quality and thus potentially on diagnostic accuracy. To study the dose and image quality relationship, we are developing a simulation code to calculate organ dose in pediatric CT patients. To benchmark this code, a cylindrical phantom was built to represent a pediatric torso, which allows measurements of dose distributions from its center to its periphery. Dose distributions for axial CT scans were measured on a 64-slice multidetector CT (MDCT) scanner (GE Healthcare, Chalfont St. Giles, UK). The same measurements were simulated using a Monte Carlo code (PENELOPE, Universitat de Barcelona) with the applicable CT geometry including bowtie filter. The deviations between simulated and measured dose values were generally within 5%. To our knowledge, this work is one of the first attempts to compare measured radial dose distributions on a cylindrical phantom with Monte Carlo simulated results. It provides a simple and effective method for benchmarking organ dose simulation codes and demonstrates the potential of Monte Carlo simulation for investigating the relationship between dose and image quality for pediatric CT patients.
Yoshimura, Toshihiro; Tohya, Toshimitsu; Onoda, Chikashi; Okamura, Hitoshi
2005-09-16
To assess the extent to which malnutrition in childhood affects bone mineral density (BMD) decades later, BMDs were compared in healthy women (35-59 years old) who visited our hospital for annual examinations between 1992 and 1993 (group 1) and between 1999 and 2002 (group 2). The BMDs of 50- to 54-year-old women in group 1 averaged 0.86±0.15 g/cm2, which was significantly (p<0.001) lower than in age-matched women in group 2 (1.02±0.16 g/cm2). At the end of World War II (1945), undernutrition was rampant throughout Japan, and there were unprecedented numbers of cases of malnutrition. BMD was lower in women who experienced those conditions at an average age of 5 years, a time when rapid skeletal growth was beginning. Thus, nutrition in childhood is a particularly crucial determinant of lifelong bone health.
De Bondt, Timo; Mulkens, Tom; Zanca, Federica; Pyfferoen, Lotte; Casselman, Jan W; Parizel, Paul M
2017-02-01
To benchmark regional standard practice for paediatric cranial CT procedures in terms of radiation dose and acquisition parameters, paediatric cranial CT data were retrospectively collected during a 1-year period in 3 different hospitals of the same country. A dose tracking system was used to automatically gather information. Dose (CTDI and DLP), scan length, number of retakes and demographic data were stratified by age and clinical indication; appropriate use of child-specific protocols was assessed. In total, 296 paediatric cranial CT procedures were collected. Although the median dose of each hospital was below national and international diagnostic reference levels (DRLs) for all age categories, statistically significant (p-value < 0.001) dose differences among hospitals were observed. The hospital with the lowest dose levels showed the smallest dose variability and used age-stratified protocols for standardizing paediatric head exams. Erroneous selection of adult protocols for children still occurred, mostly in the oldest age-group. Even though all hospitals complied with national and international DRLs, dose tracking and benchmarking showed that further dose optimization and standardization are possible by using age-stratified protocols for paediatric cranial CT. Moreover, having a dose tracking system revealed that adult protocols are still applied for paediatric CT, a practice that must be avoided. • Significant differences were observed in the delivered dose between age-groups and hospitals. • Using age-adapted scanning protocols gives a nearly linear dose increase. • Sharing dose-data can be a trigger for hospitals to reduce dose levels.
Megias, Daniel; Phillips, Mark; Clifton-Hadley, Laura; Harron, Elizabeth; Eaton, David J; Sanghera, Paul; Whitfield, Gillian
2017-03-01
The HIPPO trial is a UK randomized Phase II trial of hippocampal sparing (HS) vs conventional whole-brain radiotherapy after surgical resection or radiosurgery in patients with favourable prognosis and 1-4 brain metastases. Each participating centre completed a planning benchmark case as part of the dedicated radiotherapy trials quality assurance (RTQA) programme, promoting the safe and effective delivery of HS intensity-modulated radiotherapy (IMRT) in a multicentre trial setting. Submitted planning benchmark cases were reviewed using the VODCA visualization software for radiotherapy, evaluating plan quality and compliance in relation to the HIPPO radiotherapy planning and delivery guidelines. Comparison of the planning benchmark data highlighted a plan specified using dose to medium as an outlier by comparison with those specified using dose to water. Further evaluation identified that the reported plan statistics for dose to medium were lower because the dose calculated in regions of PTV that include bony cranium is lower relative to brain. Specification of dose to water or medium remains a source of potential ambiguity, and it is essential that, as part of a multicentre trial, consideration is given to reported differences, particularly in the presence of bone. Evaluation of planning benchmark data as part of an RTQA programme has highlighted an important feature of HS IMRT dosimetry dependent on dose being specified to water or medium, informing the development and undertaking of HS IMRT as part of the HIPPO trial. Advances in knowledge: The potential clinical impact of differences between dose to medium and dose to water is demonstrated for the first time, in the setting of HS whole-brain radiotherapy.
2007-01-01
buffalo grass (Buchloe dactyloides), peppergrass (Lepidium lasiocarpum), and Bermuda grass (Cynodon dactylon). Some examples of indigenous... aethiopicus), zebra (Equus burchelli), rhinoceros (Diceros bicornis [black], Ceratotherium simum [white]), giraffe (Giraffa camelopardalis), gazelle
Bohl, Michael A; Goswami, Roopa; Strassner, Brett; Stanger, Paula
2016-08-01
The purpose of this investigation was to evaluate the potential of using the ACR's Dose Index Registry(®) to meet The Joint Commission's requirements to identify incidents in which the radiation dose index from diagnostic CT examinations exceeded the protocol's expected dose index range. In total, 10,970 records in the Dose Index Registry were statistically analyzed to establish both an upper and lower expected dose index for each protocol. All 2015 studies to date were then retrospectively reviewed to identify examinations whose total examination dose index exceeded the protocol's defined upper threshold. Each dose incident was then logged and reviewed per the new Joint Commission requirements. Facilities may leverage their participation in the ACR's Dose Index Registry to fully meet The Joint Commission's dose incident identification review and external benchmarking requirements. Copyright © 2016 American College of Radiology. Published by Elsevier Inc. All rights reserved.
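The article does not state the exact statistical method used to establish the expected dose index ranges; one plausible reading is percentile-based bounds per protocol. A minimal sketch along those lines (the column names and data are hypothetical):

```python
import pandas as pd

# Hypothetical registry extract: one row per exam with its protocol and CTDIvol.
df = pd.DataFrame({
    "protocol": ["head", "head", "chest", "chest", "chest", "abdomen"],
    "ctdi_vol": [55.0, 48.0, 12.1, 9.8, 30.5, 18.2],
})

# Percentile-based upper and lower thresholds per protocol.
bounds = df.groupby("protocol")["ctdi_vol"].quantile([0.025, 0.975]).unstack()
bounds.columns = ["lower", "upper"]

# Flag exams outside the protocol's expected dose index range for review.
flagged = df.join(bounds, on="protocol")
flagged = flagged[(flagged.ctdi_vol > flagged.upper) | (flagged.ctdi_vol < flagged.lower)]
print(flagged)
```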
ORANGE: a Monte Carlo dose engine for radiotherapy.
van der Zee, W; Hogenbirk, A; van der Marck, S C
2005-02-21
This study presents data for the verification of ORANGE, a fast MCNP-based dose engine for radiotherapy treatment planning. To verify the new algorithm, it was benchmarked against DOSXYZ and against measurements. For the benchmarking, calculations were first done using the ICCR-XIII benchmark. Next, calculations were done with DOSXYZ and ORANGE in five different phantoms (one homogeneous, two with bone-equivalent inserts and two with lung-equivalent inserts). The calculations were done with two mono-energetic photon beams (2 MeV and 6 MeV) and two mono-energetic electron beams (10 MeV and 20 MeV). Comparison of the calculated data (from DOSXYZ and ORANGE) against measurements was possible for a realistic 10 MV photon beam and a realistic 15 MeV electron beam in a homogeneous phantom only. For the comparison of the calculated dose distributions with each other and against measurements, the concept of the confidence limit (CL) was used. This concept reduces the difference between two data sets to a single number, which gives the deviation for 90% of the dose distributions. Using this concept, it was found that ORANGE was always within the statistical bandwidth of DOSXYZ and the measurements. The ICCR-XIII benchmark showed that ORANGE is seven times faster than DOSXYZ, a result comparable with other accelerated Monte Carlo dose systems when no variance reduction is used. As shown for XVMC, using variance reduction techniques has the potential for further acceleration. Using modern computer hardware, this brings the total calculation time for a dose distribution with 1.5% (statistical) accuracy within the clinical range (less than 10 min). This means that ORANGE can be a candidate for a dose engine in radiotherapy treatment planning.
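The confidence limit (CL) metric mentioned above is often formulated, following Venselaar and colleagues, as the absolute mean deviation plus 1.5 standard deviations, chosen to bound roughly 90% of point-wise deviations. A sketch under that assumption (the ORANGE authors' exact constant may differ, and the dose values are hypothetical):

```python
import numpy as np

def confidence_limit(calc, meas):
    """Confidence-limit metric for comparing two dose data sets:
    CL = |mean deviation| + 1.5 * SD of deviations, a single number
    bounding roughly 90% of the point-wise percent deviations."""
    delta = 100.0 * (calc - meas) / meas   # percent deviation per point
    return abs(delta.mean()) + 1.5 * delta.std(ddof=1)

calc = np.array([1.00, 0.98, 0.87, 0.74, 0.61])   # hypothetical depth-dose values
meas = np.array([1.00, 0.99, 0.88, 0.73, 0.62])
print(f"CL = {confidence_limit(calc, meas):.2f}%")
```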
Multiple-Tumor Analysis with MS_Combo Model (Use with BMDS Wizard)
Exercises and procedures on setting up and using the MS_Combo Wizard. The MS_Combo model provides BMD and BMDL estimates for the risk of getting one or more tumors for any combination of tumors observed in a single bioassay.
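The combined-tumor risk underlying MS_Combo can be sketched as follows: assuming independence of tumor types given dose, the probability of one or more tumors is one minus the product of the per-tumor complements, and the combined BMD inverts the resulting extra-risk function. The multistage parameters below are hypothetical, and this is a simplification of the BMDS implementation (which also profiles the likelihood to obtain the combined BMDL, omitted here):

```python
import numpy as np
from scipy.optimize import brentq

def multistage(d, g, betas):
    """Multistage model: P(d) = g + (1-g) * (1 - exp(-(b1*d + b2*d^2 + ...)))."""
    poly = sum(b * d**(i + 1) for i, b in enumerate(betas))
    return g + (1 - g) * (1 - np.exp(-poly))

# Hypothetical fitted (background, beta) parameters for two tumor sites
# observed in the same bioassay.
tumors = [(0.05, [0.002, 1e-5]), (0.02, [0.004])]

def combined_extra_risk(d):
    # Probability of one or more tumors, assuming independence given dose.
    p0 = 1 - np.prod([1 - multistage(0, g, b) for g, b in tumors])
    pd = 1 - np.prod([1 - multistage(d, g, b) for g, b in tumors])
    return (pd - p0) / (1 - p0)

bmd = brentq(lambda d: combined_extra_risk(d) - 0.10, 1e-6, 1e4)
print(f"combined BMD10 = {bmd:.1f}")
```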
Shen, Jun; Fu, Shiping; Song, Yuan
2017-12-01
The aim of this study was to determine the relationship between serum fibroblast growth factor-23 (FGF-23) level and bone mass in postmenopausal women. A total of 60 premenopausal, 60 early postmenopausal, and 60 late postmenopausal women were investigated by measurement of bone mineral densities (BMDs) at the lumbar spine and proximal femur by DXA, together with serum concentrations of Ca, P, 25(OH)D3, OC, iPTH, CTX-I, PINP, and FGF-23. The levels of FGF-23 and PINP in the early postmenopausal group were significantly higher than those in the premenopausal or late postmenopausal groups, and their changing patterns differed from those of 25(OH)D3, iPTH, IGF, CTX-I, and OC. According to the AUCs in the ROC analysis, serum FGF-23 level had the highest validity compared with the other bone metabolism factors. Further analysis indicated significant negative relationships between serum FGF-23 level and lumbar spine/proximal femur BMDs in postmenopausal women. After assessing the sensitivity and specificity of serum FGF-23 for low bone mass at different T-score (SD) thresholds of lumbar spine/proximal femur BMDs, we found that serum FGF-23 level may be a reliable marker for low bone mass in postmenopausal women. The performance of FGF-23 in the differential diagnosis of low bone mass indicated that FGF-23 has the capacity to differentiate women with low bone mass from normal ones. Our study indicates that serum FGF-23 level could be useful for the early detection of low bone mass in women. J. Cell. Biochem. 118: 4454-4459, 2017. © 2017 Wiley Periodicals, Inc.
Zhou, Yijun; Li, Yan; Zhang, Dan; Wang, Jiahe; Yang, Hongwu
2010-12-01
To determine the prevalence and biochemical/hormonal determinants of osteopenia/osteoporosis in postmenopausal Chinese women with type 2 diabetes. This cross-sectional study was carried out in 890 postmenopausal women with type 2 diabetes and 689 age-matched non-diabetic women. Subjects in both groups were classified as obese (BMI ≥ 25 kg/m²) or non-obese (BMI < 25 kg/m²). Bone mineral density (BMD) at the lumbar spine, femoral neck, and hip, obtained by dual X-ray absorptiometry, and other relevant clinical and laboratory indices of bone mineral metabolism were investigated. The prevalence of osteopenia and osteoporosis was evaluated. BMDs and T- and Z-scores at the total hip, femoral neck and Ward's triangle were significantly lower in non-obese diabetic women than in BMI-matched control subjects (P < 0.038). Obese diabetic patients and control subjects had similar BMDs and T- and Z-scores at the various skeletal regions. Osteopenia/osteoporosis was more common at the hip and femoral neck in non-obese diabetic women than in obese diabetic women and control subjects (P = 0.026). On multiple linear regression analysis adjusted for sex hormone concentration, BMI, fasting insulin level, and serum osteocalcin were positively associated with BMDs at the hip and lumbar spine. Age, mean HbA1c levels, and NTx/Cr showed negative correlations (P < 0.0284) with BMD at the lumbar spine and femoral neck. Postmenopausal non-obese women with type 2 diabetes have lower BMD levels and a higher osteopenia/osteoporosis rate than BMI-matched control subjects. Impaired bone formation may occur in Chinese postmenopausal women with type 2 diabetes. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
Tirabassi, G; delli Muti, N; Gioia, A; Biagioli, A; Lenzi, A; Balercia, G
2014-04-01
The relationship between androgen receptor (AR) CAG polymorphism and bone metabolism is highly controversial. We, therefore, aimed to evaluate the independent role of AR CAG repeat polymorphism on bone metabolism improvement induced by testosterone replacement therapy (TRT) in male post-surgical hypogonadotropic hypogonadism, a condition frequently associated with hypopituitarism and in which the effects of TRT have to be distinguished from those resulting from concomitant administration of pituitary function replacing hormones. 12 men affected by post-surgical hypogonadotropic hypogonadism [mean duration of hypogonadism 8.3 ± 2.05 (SD) months] were retrospectively assessed before and after TRT (from 74 to 84 weeks after the beginning of therapy). The following measures were studied: parameters of bone metabolism [serum markers and bone mineral density (BMD)], pituitary dependent hormones and genetic analysis (AR CAG repeat number). Total testosterone, estradiol, free T4 (FT4) and insulin-like growth factor-1 (IGF-1) increased between the two phases, while follicle stimulating hormone (FSH) decreased. While serum markers did not vary significantly between the two phases, BMD improved slightly but significantly in all the studied sites. The number of CAG triplets correlated negatively and significantly with all the variations (Δ-) of BMDs. Conversely, Δ-testosterone correlated positively and significantly with all studied Δ-BMDs, while Δ-FSH, Δ-estradiol, Δ-FT4, and Δ-IGF-1 did not correlate significantly with any of the Δ-BMDs. Multiple linear regression analysis, after correction for Δ-testosterone, showed that CAG repeat length was negatively and significantly associated with Δ-BMD of all measured sites. Our data suggest that, in post-surgical male hypogonadotropic hypogonadism, shorter AR CAG tract is independently associated with greater TRT-induced improvement of BMD.
High-energy neutron depth-dose distribution experiment.
Ferenci, M S; Hertel, N E
2003-01-01
A unique set of high-energy neutron depth-dose benchmark experiments were performed at the Los Alamos Neutron Science Center/Weapons Neutron Research (LANSCE/WNR) complex. The experiments consisted of filtered neutron beams with energies up to 800 MeV impinging on a 30 x 30 x 30 cm3 liquid, tissue-equivalent phantom. The absorbed dose was measured in the phantom at various depths with tissue-equivalent ion chambers. This experiment is intended to serve as a benchmark experiment for the testing of high-energy radiation transport codes for the international radiation protection community.
Ballistic Missile Defense System (BMDS) Programmatic Environmental Impact Statement
2007-01-01
upon the fish for their nutrition, health, and economy. Additional studies need to be done to assess this potential threat to the Alaska...the Yucatan. (Lincoln et al., 1998) Several studies of bird migrations using NEXRAD (weather radar) have allowed researchers to estimate the
Demb, Joshua; Chu, Philip; Nelson, Thomas; Hall, David; Seibert, Anthony; Lamba, Ramit; Boone, John; Krishnam, Mayil; Cagnon, Christopher; Bostani, Maryam; Gould, Robert; Miglioretti, Diana; Smith-Bindman, Rebecca
2017-06-01
Radiation doses for computed tomography (CT) vary substantially across institutions. To assess the impact of institutional-level audit and collaborative efforts to share best practices on CT radiation doses across 5 University of California (UC) medical centers. In this before/after interventional study, we prospectively collected radiation dose metrics on all diagnostic CT examinations performed between October 1, 2013, and December 31, 2014, at 5 medical centers. Using data from January to March (baseline), we created audit reports detailing the distribution of radiation dose metrics for chest, abdomen, and head CT scans. In April, we shared reports with the medical centers and invited radiology professionals from the centers to a 1.5-day in-person meeting to review reports and share best practices. We calculated changes in mean effective dose 12 weeks before and after the audits and meeting, excluding a 12-week implementation period when medical centers could make changes. We compared proportions of examinations exceeding previously published benchmarks at baseline and following the audit and meeting, and calculated changes in proportion of examinations exceeding benchmarks. Of 158 274 diagnostic CT scans performed in the study period, 29 594 CT scans were performed in the 3 months before and 32 839 CT scans were performed 12 to 24 weeks after the audit and meeting. Reductions in mean effective dose were considerable for chest and abdomen. Mean effective dose for chest CT decreased from 13.2 to 10.7 mSv (18.9% reduction; 95% CI, 18.0%-19.8%). Reductions at individual medical centers ranged from 3.8% to 23.5%. The mean effective dose for abdominal CT decreased from 20.0 to 15.0 mSv (25.0% reduction; 95% CI, 24.3%-25.8%). Reductions at individual medical centers ranged from 10.8% to 34.7%. The number of CT scans that had an effective dose measurement that exceeded benchmarks was reduced considerably by 48% and 54% for chest and abdomen, respectively. After the audit and meeting, head CT doses varied less, although some institutions increased and some decreased mean head CT doses and the proportion above benchmarks. Reviewing institutional doses and sharing dose-optimization best practices resulted in lower radiation doses for chest and abdominal CT and more consistent doses for head CT.
Caoili, Salvador Eugenio C.
2014-01-01
B-cell epitope prediction can enable novel pharmaceutical product development. However, a mechanistically framed consensus has yet to emerge on benchmarking such prediction, thus presenting an opportunity to establish standards of practice that circumvent epistemic inconsistencies of casting the epitope prediction task as a binary-classification problem. As an alternative to conventional dichotomous qualitative benchmark data, quantitative dose-response data on antibody-mediated biological effects are more meaningful from an information-theoretic perspective in the sense that such effects may be expressed as probabilities (e.g., of functional inhibition by antibody) for which the Shannon information entropy (SIE) can be evaluated as a measure of informativeness. Accordingly, half-maximal biological effects (e.g., at median inhibitory concentrations of antibody) correspond to maximally informative data while undetectable and maximal biological effects correspond to minimally informative data. This applies to benchmarking B-cell epitope prediction for the design of peptide-based immunogens that elicit antipeptide antibodies with functionally relevant cross-reactivity. Presently, the Immune Epitope Database (IEDB) contains relatively few quantitative dose-response data on such cross-reactivity. Only a small fraction of these IEDB data is maximally informative, and many more of them are minimally informative (i.e., with zero SIE). Nevertheless, the numerous qualitative data in IEDB suggest how to overcome the paucity of informative benchmark data. PMID:24949474
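The information measure invoked above is simply the binary Shannon entropy of an effect probability, which peaks at half-maximal effects. A minimal sketch:

```python
import numpy as np

def shannon_entropy(p):
    """Binary Shannon information entropy (bits) of an effect probability p."""
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

# Half-maximal effects (p = 0.5, e.g., antibody at its median inhibitory
# concentration) are maximally informative; undetectable (p -> 0) or
# saturating (p -> 1) effects carry almost no information by this measure.
for p in (0.001, 0.25, 0.5, 0.75, 0.999):
    print(f"p = {p:5.3f}  SIE = {shannon_entropy(p):.3f} bits")
```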
BENCHMARK DOSE TECHNICAL GUIDANCE DOCUMENT ...
The U.S. EPA conducts risk assessments for an array of health effects that may result from exposure to environmental agents, and that require an analysis of the relationship between exposure and health-related outcomes. The dose-response assessment is essentially a two-step process, the first being the definition of a point of departure (POD), and the second extrapolation from the POD to low environmentally-relevant exposure levels. The benchmark dose (BMD) approach provides a more quantitative alternative to the first step in the dose-response assessment than the current NOAEL/LOAEL process for noncancer health effects, and is similar to that for determining the POD proposed for cancer endpoints. As the Agency moves toward harmonization of approaches for human health risk assessment, the dichotomy between cancer and noncancer health effects is being replaced by consideration of mode of action and whether the effects of concern are likely to be linear or nonlinear at low doses. Thus, the purpose of this project is to provide guidance for the Agency and the outside community on the application of the BMD approach in determining the POD for all types of health effects data, whether a linear or nonlinear low dose extrapolation is used. A guidance document is being developed under the auspices of EPA's Risk Assessment Forum.
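As a concrete illustration of deriving a BMD as the point of departure, the sketch below fits a background-plus-log-logistic model to hypothetical quantal data by maximum likelihood and inverts it at a 10% extra-risk BMR. This is a generic example, not the guidance document's prescribed model suite, and BMDS additionally computes a BMDL by profile likelihood, which is omitted here:

```python
import numpy as np
from scipy.optimize import minimize

doses = np.array([0.0, 5.0, 25.0, 125.0])
n = np.array([20, 20, 20, 20])
events = np.array([1, 3, 8, 15])          # hypothetical incidence data

def log_logistic(d, g, a, b):
    """Background + log-logistic: P(d) = g + (1-g)/(1 + exp(-a - b*ln d))."""
    p = np.full_like(d, g, dtype=float)
    pos = d > 0
    p[pos] = g + (1 - g) / (1 + np.exp(-a - b * np.log(d[pos])))
    return p

def neg_loglik(theta):
    g, a, b = theta
    p = np.clip(log_logistic(doses, g, a, b), 1e-9, 1 - 1e-9)
    return -np.sum(events * np.log(p) + (n - events) * np.log(1 - p))

fit = minimize(neg_loglik, x0=[0.05, -4.0, 1.0],
               bounds=[(0, 0.99), (None, None), (1e-6, None)])
g, a, b = fit.x
# BMD: dose giving extra risk BMR over background (closed form for this model,
# since extra risk reduces to 1/(1 + exp(-a - b*ln d))).
bmr = 0.10
bmd = np.exp((np.log(bmr / (1 - bmr)) - a) / b)
print(f"BMD10 = {bmd:.1f}")
```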
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ramos-Mendez, J; Faddegon, B; Perl, J
2015-06-15
Purpose: To develop and verify an extension to TOPAS for calculation of dose response models (TCP/NTCP). TOPAS wraps and extends Geant4. Methods: The TOPAS DICOM interface was extended to include structure contours, for subsequent calculation of DVH's and TCP/NTCP. The following dose response models were implemented: Lyman-Kutcher-Burman (LKB), critical element (CE), population based critical volume (CV), parallel-serials, a sigmoid-based model of Niemierko for NTCP and TCP, and a Poisson-based model for TCP. For verification, results for the parallel-serial and Poisson models, with 6 MV x-ray dose distributions calculated with TOPAS and Pinnacle v9.2, were compared to data from the benchmark configuration of the AAPM Task Group 166 (TG166). We provide a benchmark configuration suitable for proton therapy along with results for the implementation of the Niemierko, CV and CE models. Results: The maximum difference in DVH calculated with Pinnacle and TOPAS was 2%. Differences between TG166 data and Monte Carlo calculations of up to 4.2%±6.1% were found for the parallel-serial model and up to 1.0%±0.7% for the Poisson model (including the uncertainty due to lack of knowledge of the point spacing in TG166). For CE, CV and Niemierko models, the discrepancies between the Pinnacle and TOPAS results are 74.5%, 34.8% and 52.1% when using 29.7 cGy point spacing, the differences being highly sensitive to dose spacing. On the other hand, with our proposed benchmark configuration, the largest differences were 12.05%±0.38%, 3.74%±1.6%, 1.57%±4.9% and 1.97%±4.6% for the CE, CV, Niemierko and LKB models, respectively. Conclusion: Several dose response models were successfully implemented with the extension module. Reference data was calculated for future benchmarking. Dose response calculated for the different models varied much more widely for the TG166 benchmark than for the proposed benchmark, which had much lower sensitivity to the choice of DVH dose points. This work was supported by National Cancer Institute Grant R01CA140735.
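Of the models listed above, the Poisson-based TCP is the simplest to sketch. The following illustration (hypothetical radiobiological parameters and DVH; not the TOPAS implementation) computes TCP from a differential DVH under the linear-quadratic model:

```python
import numpy as np

def poisson_tcp(dose_bins, vol_fracs, d_fx=2.0, alpha=0.3, beta=0.03, rho_v=1e7):
    """Poisson-LQ TCP from a differential DVH: each bin receives its total dose
    in d_fx-Gy fractions; TCP = exp(-sum of expected surviving clonogens)."""
    sf = np.exp(-(alpha + beta * d_fx) * dose_bins)    # LQ survival per bin
    return float(np.exp(-np.sum(rho_v * vol_fracs * sf)))

# Hypothetical 3-bin differential DVH (total dose in Gy, fraction of volume)
dose_bins = np.array([58.0, 60.0, 62.0])
vol_fracs = np.array([0.2, 0.6, 0.2])
print(f"TCP = {poisson_tcp(dose_bins, vol_fracs):.3f}")
```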
Al-Hallaq, Hania A; Chmura, Steven J; Salama, Joseph K; Lowenstein, Jessica R; McNulty, Susan; Galvin, James M; Followill, David S; Robinson, Clifford G; Pisansky, Thomas M; Winter, Kathryn A; White, Julia R; Xiao, Ying; Matuszak, Martha M
2017-01-01
The NRG-BR001 trial is the first National Cancer Institute-sponsored trial to treat multiple (range 2-4) extracranial metastases with stereotactic body radiation therapy. Benchmark credentialing is required to ensure adherence to this complex protocol, in particular, for metastases in close proximity. The present report summarizes the dosimetric results and approval rates. The benchmark used anonymized data from a patient with bilateral adrenal metastases, separated by <5 cm of normal tissue. Because the planning target volume (PTV) overlaps with organs at risk (OARs), institutions must use the planning priority guidelines to balance PTV coverage (45 Gy in 3 fractions) against OAR sparing. Submitted plans were processed by the Imaging and Radiation Oncology Core and assessed by the protocol co-chairs by comparing the doses to targets, OARs, and conformity metrics using nonparametric tests. Of 63 benchmarks submitted through October 2015, 94% were approved, with 51% approved at the first attempt. Most used volumetric arc therapy (VMAT) (78%), a single plan for both PTVs (90%), and prioritized the PTV over the stomach (75%). The median dose to 95% of the volume was 44.8 ± 1.0 Gy and 44.9 ± 1.0 Gy for the right and left PTV, respectively. The median dose to 0.03 cm3 was 14.2 ± 2.2 Gy to the spinal cord and 46.5 ± 3.1 Gy to the stomach. Plans that spared the stomach significantly reduced the dose to the left PTV and stomach. Conformity metrics were significantly better for single plans that simultaneously treated both PTVs with VMAT, intensity modulated radiation therapy, or 3-dimensional conformal radiation therapy compared with separate plans. No significant differences existed in the dose at 2 cm from the PTVs. Although most plans used VMAT, the range of conformity and dose falloff was large. The decision to prioritize either OARs or PTV coverage varied considerably, suggesting that the toxicity outcomes in the trial could be affected. Several benchmarks met the dose-volume histogram metrics but produced unacceptable plans owing to low conformity. Dissemination of a frequently-asked-questions document improved the approval rate at the first attempt. Benchmark credentialing was found to be a valuable tool for educating institutions about the protocol requirements. Copyright © 2016 Elsevier Inc. All rights reserved.
Renner, Franziska
2016-09-01
Monte Carlo simulations are regarded as the most accurate method of solving complex problems in the field of dosimetry and radiation transport. In (external) radiation therapy they are increasingly used for the calculation of dose distributions during treatment planning. In comparison to other algorithms for the calculation of dose distributions, Monte Carlo methods have the capability of improving the accuracy of dose calculations - especially under complex circumstances (e.g. consideration of inhomogeneities). However, there is a lack of knowledge of how accurate the results of Monte Carlo calculations are on an absolute basis. A practical verification of the calculations can be performed by direct comparison with the results of a benchmark experiment. This work presents such a benchmark experiment and compares its results (with detailed consideration of measurement uncertainty) with the results of Monte Carlo calculations using the well-established Monte Carlo code EGSnrc. The experiment was designed to have parallels to external beam radiation therapy with respect to the type and energy of the radiation, the materials used and the kind of dose measurement. Because the properties of the beam have to be well known in order to compare the results of the experiment and the simulation on an absolute basis, the benchmark experiment was performed using the research electron accelerator of the Physikalisch-Technische Bundesanstalt (PTB), whose beam was accurately characterized in advance. The benchmark experiment and the corresponding Monte Carlo simulations were carried out for two different types of ionization chambers and the results were compared. Considering the uncertainty, which is about 0.7 % for the experimental values and about 1.0 % for the Monte Carlo simulation, the results of the simulation and the experiment coincide. Copyright © 2015. Published by Elsevier GmbH.
NASA Technical Reports Server (NTRS)
James, John T.; Lam, Chiu-wing; Scully, Robert R.
2013-01-01
Brief exposures of Apollo astronauts to lunar dust occasionally elicited upper respiratory irritation; however, no limits were ever set for prolonged exposure to lunar dust. Habitats for exploration, whether mobile or fixed, must be designed to limit human exposure to lunar dust to safe levels. We have used a new technique we call Comparative Benchmark Dose Modeling to estimate safe exposure limits for lunar dust collected during the Apollo 14 mission.
BMDS: A Collection of R Functions for Bayesian Multidimensional Scaling
ERIC Educational Resources Information Center
Okada, Kensuke; Shigemasu, Kazuo
2009-01-01
Bayesian multidimensional scaling (MDS) has attracted a great deal of attention because: (1) it provides a better fit than do classical MDS and ALSCAL; (2) it provides estimation errors of the distances; and (3) the Bayesian dimension selection criterion, MDSIC, provides a direct indication of optimal dimensionality. However, Bayesian MDS is not…
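For contrast with the Bayesian variant the abstract evaluates, the classical (Torgerson) MDS baseline can be sketched in a few lines. The BMDS package additionally provides posterior estimation errors and the MDSIC dimension-selection criterion, which this sketch does not attempt:

```python
import numpy as np

def classical_mds(dist, dim=2):
    """Classical (Torgerson) MDS: double-centre the squared distance matrix
    and embed points via the top eigenvectors of the resulting Gram matrix."""
    n = dist.shape[0]
    j = np.eye(n) - np.ones((n, n)) / n
    b = -0.5 * j @ (dist ** 2) @ j
    vals, vecs = np.linalg.eigh(b)
    idx = np.argsort(vals)[::-1][:dim]
    return vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0))

# Toy symmetric distance matrix for three objects
dist = np.array([[0, 2, 5], [2, 0, 4], [5, 4, 0]], dtype=float)
print(classical_mds(dist))
```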
Zhang, Y D; Zhang, Z; Zhou, N F; Jia, W T; Cheng, X G; Wei, X J
2014-08-28
Primary osteoporosis is a common health problem in postmenopausal women. This study aimed to detect the association of the g.19074G>A genetic variant in the osteoprotegerin gene (OPG) with bone mineral density (BMD) and primary osteoporosis. The created restriction site-polymerase chain reaction (CRS-PCR) method was used to investigate the g.19074G>A genetic variant. The BMDs of the femoral neck, lumbar spine (L2-4), and total hip were assessed by dual-energy X-ray absorptiometry (DEXA) in 856 unrelated Chinese postmenopausal women. We found significant differences in the BMDs of the femoral neck, lumbar spine (L2-4), and total hip among the different genotypes; individuals with the GG genotype had significantly higher BMDs than those with the GA and AA genotypes (P < 0.05). Our results indicated that the A allele was a risk factor for primary osteoporosis and that the g.19074G>A genetic variant of the OPG gene was associated with BMD and primary osteoporosis in Chinese postmenopausal women.
IL-17A-mediated sRANK ligand elevation involved in postmenopausal osteoporosis.
Molnár, I; Bohaty, I; Somogyiné-Vári, É
2014-02-01
The role of the proinflammatory cytokine IL-17 was studied in postmenopausal bone loss in 31 osteopenic and 41 osteoporotic women. The effect of serum IL-17A, soluble receptor activator of NF-κB (sRANK) ligand, and osteoprotegerin (OPG) levels on lumbar bone mineral densities was measured. The results demonstrated an increased IL-17A-mediated sRANK ligand elevation in postmenopausal osteoporotic bone loss. The proinflammatory cytokine IL-17 is a newly recognized inducer of bone loss. Postmenopausal osteoporosis represents a cross talk between estrogen deprivation and increased immune reactivity. The role of IL-17 was studied in the bone loss of postmenopausal osteoporosis. Serum IL-17A, sRANK ligand, and OPG levels were investigated in relation to bone mineral densities (BMDs) of the total lumbar (L1-L4) region in 18 pre- and 72 postmenopausal women. IL-17A, sRANK ligand, and OPG levels and BMDs were measured with enzyme-linked immunosorbent assay (ELISA) and dual-energy X-ray absorptiometry (DXA). Increased serum IL-17A, sRANK ligand, and OPG levels were demonstrated in postmenopausal osteoporotic women compared to osteopenic women (3.65 ± 0.61 vs 3.31 ± 0.43 ng/ml for IL-17A, P < 0.007; 2.88 ± 0.84 vs 2.49 ± 0.61 ng/ml for sRANK ligand, P < 0.027; and 1.43 ± 0.07 vs 1.39 ± 0.07 ng/ml for OPG, P < 0.038). In postmenopausal women, IL-17A levels correlated inversely with total lumbar BMDs (P < 0.008, r = -0.279) and positively with sRANK ligand levels (P < 0.0001, r = 0.387) and with the ratio of sRANK ligand to OPG (P < 0.013, r = 0.261), but not with OPG levels alone. Increased IL-17A levels are involved in postmenopausal osteoporosis, playing a role in bone-resorbing processes.
Multiscale benchmarking of drug delivery vectors.
Summers, Huw D; Ware, Matthew J; Majithia, Ravish; Meissner, Kenith E; Godin, Biana; Rees, Paul
2016-10-01
Cross-system comparisons of drug delivery vectors are essential to ensure optimal design. An in-vitro experimental protocol is presented that separates the role of the delivery vector from that of its cargo in determining the cell response, thus allowing quantitative comparison of different systems. The technique is validated through benchmarking of the dose-response of human fibroblast cells exposed to the cationic molecule polyethylene imine (PEI), delivered as a free molecule and as a cargo on the surface of CdSe nanoparticles and silica microparticles. The exposure metrics are converted to a delivered dose, with the transport properties of the different scale systems characterized by a delivery time, τ. The benchmarking highlights an agglomeration of the free PEI molecules into micron-sized clusters and identifies the metric determining cell death as the total number of PEI molecules presented to cells, determined by the delivery vector dose and the surface density of the cargo. Copyright © 2016 Elsevier Inc. All rights reserved.
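The conversion from exposure to delivered dose via a delivery time τ can be illustrated with a first-order uptake form. This functional form and the τ values are assumptions for illustration only; the paper derives τ from the measured transport properties of each vector system:

```python
import numpy as np

def delivered_dose(exposure_dose, t, tau):
    """Convert an applied (exposure) dose to a delivered dose at time t,
    assuming first-order transport with characteristic delivery time tau.
    Illustrative form only; not the paper's derivation."""
    return exposure_dose * (1.0 - np.exp(-t / tau))

# Hypothetical comparison of carrier systems with different delivery times (h)
for label, tau in [("free PEI", 2.0), ("CdSe nanoparticle", 8.0), ("silica microparticle", 0.5)]:
    print(label, delivered_dose(100.0, t=4.0, tau=tau))
```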
Benchmark studies of induced radioactivity produced in LHC materials, Part II: Remanent dose rates.
Brugger, M; Khater, H; Mayer, S; Prinz, A; Roesler, S; Ulrici, L; Vincke, H
2005-01-01
A new method to estimate remanent dose rates, to be used with the Monte Carlo code FLUKA, was benchmarked against measurements from an experiment performed at the CERN-EU high-energy reference field facility. An extensive collection of samples of different materials was placed downstream of, and laterally to, a copper target intercepting a positively charged mixed hadron beam with a momentum of 120 GeV/c. Emphasis was put on the reduction of uncertainties by taking measures such as careful monitoring of the irradiation parameters, using different instruments to measure dose rates, adopting detailed elemental analyses of the irradiated materials and making detailed simulations of the irradiation experiment. The measured and calculated dose rates are in good agreement.
SU-E-T-148: Benchmarks and Pre-Treatment Reviews: A Study of Quality Assurance Effectiveness
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lowenstein, J; Nguyen, H; Roll, J
Purpose: To determine the impact benchmarks and pre-treatment reviews have on improving the quality of submitted clinical trial data. Methods: Benchmarks are used to evaluate a site's ability to develop a treatment plan that meets a specific protocol's treatment guidelines before the site places its first patient on the protocol. A pre-treatment review is an actual patient placed on the protocol, in which the dosimetry and contour volumes are evaluated for compliance with protocol guidelines before treatment is allowed to begin. A key component of these QA mechanisms is that sites are provided timely feedback to educate them on how to plan per the protocol and to prevent protocol deviations in patients accrued to a protocol. For both benchmarks and pre-treatment reviews, a dose volume analysis (DVA) was performed using MIM software. For pre-treatment reviews, a volume contour evaluation was also performed. Results: IROC Houston performed a QA effectiveness analysis of a protocol which required both benchmarks and pre-treatment reviews. In 70 percent of the patient cases submitted, the benchmark played an effective role in assuring that the pre-treatment review of the cases met protocol requirements. The 35 percent of sites failing the benchmark subsequently modified their planning technique to pass the benchmark before being allowed to submit a patient for pre-treatment review. However, in 30 percent of the submitted cases the pre-treatment review failed, and the majority (71 percent) failed the DVA. 20 percent of sites submitting patients failed to correct the dose volume discrepancies indicated by the benchmark case. Conclusion: Benchmark cases and pre-treatment reviews can be an effective QA tool to educate sites on protocol guidelines and to minimize deviations. Without the benchmark cases, it is possible that 65 percent of the cases undergoing a pre-treatment review would have failed to meet the protocol's requirements. Support: U24-CA-180803.
Bibbo, Giovanni; Brown, Scott; Linke, Rebecca
2016-08-01
Diagnostic reference levels (DRLs) for procedures involving ionizing radiation are important tools for optimizing radiation doses delivered to patients and for identifying cases where dose levels are unusually high. This is particularly important for paediatric patients undergoing computed tomography (CT) examinations, as these examinations are associated with relatively high doses. Paediatric CT studies performed at our institution from January 2010 to March 2014 were retrospectively analysed to determine the 75th and 95th percentiles of both the volume computed tomography dose index (CTDIvol) and dose-length product (DLP) for the most commonly performed studies, in order to: establish local diagnostic reference levels for paediatric CT examinations performed at our institution, benchmark our DRLs against national and international published paediatric values, and determine the compliance of CT radiographers with established protocols. The derived local 75th percentile DRLs were found to be acceptable when compared with those published by the Australian National Radiation Dose Register and two national children's hospitals, and, at the international level, with the National Reference Doses for the UK. The 95th percentiles of CTDIvol for the various CT examinations were found to be acceptable values for the CT scanner Dose-Check notification. Benchmarking of CT radiographers showed that they follow the set protocols for the various examinations without significant variations in the machine setting factors. The derivation of DRLs has given us a tool to evaluate and improve the performance of our CT service through improved compliance and a reduction in radiation dose to our paediatric patients. We have also been able to benchmark our performance against similar national and international institutions. © 2016 The Royal Australian and New Zealand College of Radiologists.
Bhat, Virunya S; Hester, Susan D; Nesnow, Stephen; Eastmond, David A
2013-11-01
The ability to anchor chemical class-based gene expression changes to phenotypic lesions and to describe these changes as a function of dose and time informs mode-of-action determinations and improves quantitative risk assessments. Previous global expression profiling identified a 330-probe cluster differentially expressed and commonly responsive to 3 hepatotumorigenic conazoles (cyproconazole, epoxiconazole, and propiconazole) at 30 days. Extended to 2 more conazoles (triadimefon and myclobutanil), the present assessment encompasses 4 tumorigenic and 1 nontumorigenic conazole. Transcriptional benchmark dose levels (BMDL(T)) were estimated for a subset of the cluster with dose-responsive behavior and a ≥ 5-fold increase or decrease in signal intensity at the highest dose. These genes primarily encompassed CAR/RXR activation, P450 metabolism, liver hypertrophy-glutathione depletion, LPS/IL-1-mediated inhibition of RXR, and NRF2-mediated oxidative stress pathways. Median BMDL(T) estimates from the subset were concordant (within a factor of 2.4) with apical benchmark doses (BMDL(A)) for increased liver weight at 30 days for the 5 conazoles. The 30-day median BMDL(T) estimates were within one-half order of magnitude of the chronic BMDL(A) for hepatocellular tumors. Potency differences seen in the dose-responsive transcription of certain phase II metabolism, bile acid detoxification, and lipid oxidation genes mirrored each conazole's tumorigenic potency. The 30-day BMDL(T) corresponded to tumorigenic potency on a milligram per kilogram day basis with cyproconazole > epoxiconazole > propiconazole > triadimefon > myclobutanil (nontumorigenic). These results support the utility of measuring short-term gene expression changes to inform quantitative risk assessments from long-term exposures.
van Wijngaarden, Edwin; Beck, Christopher; Shamlaye, Conrad F; Cernichiari, Elsa; Davidson, Philip W; Myers, Gary J; Clarkson, Thomas W
2006-09-01
Methyl mercury (MeHg) is highly toxic to the developing nervous system. Human exposure is mainly from fish consumption, since small amounts are present in all fish. Findings of developmental neurotoxicity following high-level prenatal exposure to MeHg raised the question of whether children whose mothers consumed fish contaminated with background levels during pregnancy are at an increased risk of impaired neurological function. Benchmark doses determined from studies in New Zealand, the Faroe Islands, and the Seychelles indicate that a level of 4-25 parts per million (ppm) measured in maternal hair may carry a risk to the infant. However, there are numerous sources of uncertainty that could affect the derivation of benchmark doses, and it is crucial to continue to investigate the most appropriate derivation of safe consumption levels. Earlier, we published the findings from benchmark analyses applied to the data collected on the Seychelles main cohort at the 66-month follow-up period. Here, we expand on the main cohort analyses by determining the benchmark doses (BMD) of MeHg level in maternal hair based on 643 Seychellois children for whom 26 different neurobehavioral endpoints were measured at 9 years of age. Dose-response models applied to these continuous endpoints incorporated a variety of covariates and included the k-power model, the Weibull model, and the logistic model. The average 95% lower confidence limit of the BMD (BMDL) across all 26 endpoints varied from 20.1 ppm (range=17.2-22.5) for the logistic model to 20.4 ppm (range=17.9-23.0) for the k-power model. These estimates are somewhat lower than those obtained after 66 months of follow-up. The Seychelles Child Development Study continues to provide a firm scientific basis for the derivation of safe levels of MeHg consumption.
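Of the dose-response models named above, the k-power model gives the simplest closed-form BMD for a continuous endpoint: with mean m(d) = β0 + β1·d^k, the dose producing a pre-specified mean change δ is (δ/β1)^(1/k). A minimal sketch with hypothetical fitted values (the study's actual BMD definition, based on a specified response cutoff, is more involved):

```python
def kpower_bmd(beta1, k, delta):
    """BMD under the k-power mean model m(d) = beta0 + beta1 * d**k:
    the dose at which the mean response changes by a benchmark amount delta."""
    return (delta / beta1) ** (1.0 / k)

# Hypothetical fitted values for one continuous endpoint (dose in ppm maternal hair Hg)
print(kpower_bmd(beta1=0.012, k=1.1, delta=0.3))  # BMD in ppm
```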
Wake Island Supplemental Environmental Assessment
2007-02-01
operations, the oxidizer transfer system would be flushed with water. This operation is expected to yield approximately 5 grams (0.2 ounces) of nitric...Defense System (BMDS) to provide a defensive capability for the U.S., its deployed forces, friends, and allies from ballistic missile threats. The...infrastructure, land use, physical resources, noise, socioeconomics, transportation, and water resources. MDA determined that six of the thirteen resource
2007-01-01
Bering Sea through direct effects on the fish, as well as the thousands of people dependent upon the fish for their nutrition, health, and economy...Newfoundland to British Columbia; however, the width of the migratory path narrows to 400 miles from east-west at the latitude of the Yucatan. (Lincoln et
NASA Astrophysics Data System (ADS)
Han, S. M.; Davis, J.
1997-10-01
The bone mineral density (BMD), ultrasound velocity (UV) and attenuation were examined in sixteen matched sets of human patellae and calcanei. For the sixteen calcanei, BMD was strongly correlated with all ultrasound parameters. Calcaneal UV appeared to be inferior to attenuation in the ability to predict BMD. For the sixteen patellae, the average UV was found to be greater in the superior/inferior direction than in the anterior/posterior and medial/lateral directions. It was found that patella BMD was significantly correlated with each of three directional ultrasound velocities. The relationship between BMD and ultrasound attenuation parameters was not significant in the patella. A comparative study of the two different bone sets demonstrated that the BMDs of the patella and calcaneus were significantly correlated with each other. Ultrasound velocity of calcaneus, measured in the medial/lateral direction, was not significantly associated with any of three directional ultrasound velocities in the patella. Similarly, ultrasound attenuation parameters of calcaneus were not significantly correlated with those of patella. The present study also demonstrated evidence that when predicting BMDs at their respective sites using ultrasound, the calcaneus appeared to be superior to the patella.
Zheng, Hao; Wang, Xiong; Ren, Feifei; Zou, Shenglong; Feng, Min; Xu, Liangliang; Yao, Lunguang; Sun, Jingchen
2018-06-19
The classical baculovirus display system (BDS) has often been applied in fields including gene delivery, gene therapy, and the genetic engineering of vaccines, as it can present foreign polypeptides on the membranes of recombinant baculovirus through a transmembrane protein. However, the classical BDS's high cost, complicated operation, low display efficiency, and inability to simultaneously display multiple gene products impede its practicality. In this study, we present a novel and highly efficient display system based on IRES-dependent gp64 for rescuing a gp64-null baculovirus Bacmid without affecting the viral replication cycle, which we name the baculovirus multigene display system (BMDS). Laser scanning confocal microscopy demonstrated that eGFP, eYFP, and mCherry were successfully translocated to the membrane of Spodoptera frugiperda 9 (Sf9) cells, as expected. Western blot analysis further confirmed the presence of the fluorescent proteins on budded, mature viral particles. The results showed that the display efficiency of the target gene on the cell surface was fourfold that of the classical BDS. In addition, a recombinant baculovirus displaying three fluorescent proteins simultaneously was constructed, demonstrating the effectiveness of the BMDS as a co-display system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, M; Chetty, I; Zhong, H
2014-06-01
Purpose: Tumor control probability (TCP) calculated with accumulated radiation doses may help design appropriate treatment margins. Image registration errors, however, may compromise the calculated TCP. The purpose of this study is to develop benchmark CT images to quantify registration-induced errors in the accumulated doses and their corresponding TCP. Methods: 4DCT images were registered from end-inhale (EI) to end-exhale (EE) using a “demons” algorithm. The demons DVFs were corrected by an FEM model to get realistic deformation fields. The FEM DVFs were used to warp the EI images to create the FEM-simulated images. The two images combined with the FEM DVF formed a benchmark model. Maximum intensity projection (MIP) images, created from the EI and simulated images, were used to develop IMRT plans. Two plans with 3 and 5 mm margins were developed for each patient. With these plans, radiation doses were recalculated on the simulated images and warped back to the EI images using the FEM DVFs to get the accumulated doses. The Elastix software was used to register the FEM-simulated images to the EI images. TCPs calculated with the Elastix-accumulated doses were compared with those generated by the FEM to get the TCP error of the Elastix registrations. Results: For six lung patients, the mean Elastix registration error ranged from 0.93 to 1.98 mm. Their relative dose errors in PTV were between 0.28% and 6.8% for 3 mm margin plans, and between 0.29% and 6.3% for 5 mm margin plans. As the PTV margin was reduced from 5 to 3 mm, the mean TCP error of the Elastix-reconstructed doses increased from 2.0% to 2.9%, and the mean NTCP errors decreased from 1.2% to 1.1%. Conclusion: Patient-specific benchmark images can be used to evaluate the impact of registration errors on the computed TCPs, and may help select appropriate PTV margins for lung SBRT patients.
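For orientation, a voxel-based TCP of the kind compared here can be sketched in a few lines with a linear-quadratic Poisson model; the radiosensitivity, clonogen density, and fractionation values below are generic textbook-style assumptions, not the parameters used in the study.

```python
# Linear-quadratic Poisson TCP from a voxel dose array (illustrative only).
import numpy as np

def poisson_tcp(dose_per_voxel, n_fractions=30, alpha=0.35, beta=0.035,
                clonogen_density=1e7, voxel_volume_cc=0.001):
    d = dose_per_voxel / n_fractions            # dose per fraction (Gy)
    sf = np.exp(-alpha * dose_per_voxel - beta * dose_per_voxel * d)
    surviving = clonogen_density * voxel_volume_cc * sf
    return float(np.exp(-surviving.sum()))      # Poisson product over voxels

accumulated = np.full(1000, 60.0)               # 1000 voxels, uniform 60 Gy
print(f"TCP = {poisson_tcp(accumulated):.3f}")
```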
Evaluation of triclosan in Minnesota lakes and rivers: Part II - human health risk assessment.
Yost, Lisa J; Barber, Timothy R; Gentry, P Robinan; Bock, Michael J; Lyndall, Jennifer L; Capdevielle, Marie C; Slezak, Brian P
2017-08-01
Triclosan, an antimicrobial compound found in consumer products, has been detected in low concentrations in Minnesota municipal wastewater treatment plant (WWTP) effluent. This assessment evaluates potential health risks for exposure of adults and children to triclosan in Minnesota surface water, sediments, and fish. Potential exposures via fish consumption are considered for recreational or subsistence-level consumers. This assessment uses two chronic oral toxicity benchmarks, which bracket other available toxicity values. The first benchmark is a lower bound on a benchmark dose associated with a 10% risk (BMDL10) of 47 mg per kilogram per day (mg/kg-day) for kidney effects in hamsters. This value was identified as the most sensitive endpoint and species in a review by Rodricks et al. (2010) and is used herein to derive an estimated reference dose (RfD(Rodricks)) of 0.47 mg/kg-day. The second benchmark is a reference dose (RfD) of 0.047 mg/kg-day derived from a no observed adverse effect level (NOAEL) of 10 mg/kg-day for hepatic and hematopoietic effects in mice (Minnesota Department of Health [MDH] 2014). Based on conservative assumptions regarding human exposures to triclosan, calculated risk estimates are far below levels of concern. These estimates are likely to overestimate risks for potential receptors, particularly because sample locations were generally biased towards known discharges (i.e., WWTP effluent). Copyright © 2017 Elsevier Inc. All rights reserved.
Lachenmeier, Dirk W; Rehm, Jürgen
2015-01-30
A comparative risk assessment of drugs including alcohol and tobacco using the margin of exposure (MOE) approach was conducted. The MOE is defined as the ratio between the toxicological threshold (benchmark dose) and the estimated human intake. Median lethal dose values from animal experiments were used to derive the benchmark dose. The human intake was calculated for individual scenarios and population-based scenarios. The MOE was calculated using probabilistic Monte Carlo simulations. The benchmark dose values ranged from 2 mg/kg bodyweight for heroin to 531 mg/kg bodyweight for alcohol (ethanol). For individual exposure the four substances alcohol, nicotine, cocaine and heroin fall into the "high risk" category with MOE < 10, the rest of the compounds except THC fall into the "risk" category with MOE < 100. On a population scale, only alcohol would fall into the "high risk" category, and cigarette smoking would fall into the "risk" category, while all other agents (opiates, cocaine, amphetamine-type stimulants, ecstasy, and benzodiazepines) had MOEs > 100, and cannabis had a MOE > 10,000. The toxicological MOE approach validates epidemiological and social science-based drug ranking approaches especially in regard to the positions of alcohol and tobacco (high risk) and cannabis (low risk).
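The ratio definition translates directly into a probabilistic calculation. Below is a minimal Monte Carlo sketch of an MOE distribution; the lognormal intake distribution and its parameters are assumptions for illustration, not the exposure model of the published assessment (only the 531 mg/kg benchmark is taken from the abstract).

```python
# Probabilistic MOE sketch: MOE = benchmark dose / human intake.
import numpy as np

rng = np.random.default_rng(42)
bmd = 531.0  # mg/kg bw, alcohol (ethanol) benchmark dose from the abstract

# Assumed lognormal daily intake (mg/kg bw/day) across consumers
intake = rng.lognormal(mean=np.log(25.0), sigma=0.6, size=100_000)

moe = bmd / intake
print(f"median MOE: {np.median(moe):.1f}")
print(f"share with MOE < 10 ('high risk'): {(moe < 10).mean():.1%}")
```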
Baumung, Claudia; Rehm, Jürgen; Franke, Heike; Lachenmeier, Dirk W.
2016-01-01
Nicotine was not included in previous efforts to identify the most important toxicants of tobacco smoke. A health risk assessment of nicotine for smokers of cigarettes was conducted using the margin of exposure (MOE) approach and results were compared to literature MOEs of various other tobacco toxicants. The MOE is defined as the ratio between the toxicological threshold (benchmark dose) and the estimated human intake. Dose-response modelling of human and animal data was used to derive the benchmark dose. The MOE was calculated using probabilistic Monte Carlo simulations for daily cigarette smokers. Benchmark dose values ranged from 0.004 mg/kg bodyweight for symptoms of intoxication in children to 3 mg/kg bodyweight for mortality in animals; MOEs ranged from below 1 up to 7.6, indicating a considerable consumer risk. The dimension of the MOEs is similar to those of other tobacco toxicants with high concerns relating to adverse health effects, such as acrolein or formaldehyde. Owing to the lack of toxicological data, in particular relating to cancer, long-term animal testing studies for nicotine are urgently necessary. There is an immediate need for action concerning the risk of nicotine, also with regard to electronic cigarettes and smokeless tobacco. PMID:27759090
Translational benchmark risk analysis
Piegorsch, Walter W.
2010-01-01
Translational development – in the sense of translating a mature methodology from one area of application to another, evolving area – is discussed for the use of benchmark doses in quantitative risk assessment. Illustrations are presented with traditional applications of the benchmark paradigm in biology and toxicology, and also with risk endpoints that differ from traditional toxicological archetypes. It is seen that the benchmark approach can apply to a diverse spectrum of risk management settings. This suggests a promising future for this important risk-analytic tool. Extensions of the method to a wider variety of applications represent a significant opportunity for enhancing environmental, biomedical, industrial, and socio-economic risk assessments. PMID:20953283
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Grace L.; Jiang, Jing
Purpose: High-quality treatment for intact cervical cancer requires external radiation therapy, brachytherapy, and chemotherapy, carefully sequenced and completed without delays. We sought to determine how frequently current treatment meets quality benchmarks and whether new technologies have influenced patterns of care. Methods and Materials: By searching diagnosis and procedure claims in MarketScan, an employment-based health care claims database, we identified 1508 patients with nonmetastatic, intact cervical cancer treated from 1999 to 2011, who were <65 years of age and received >10 fractions of radiation. Treatments received were identified using procedure codes and compared with 3 quality benchmarks: receipt of brachytherapy, receipt of chemotherapy, and radiation treatment duration not exceeding 63 days. The Cochran-Armitage test was used to evaluate temporal trends. Results: Seventy-eight percent of patients (n=1182) received brachytherapy, with brachytherapy receipt stable over time (Cochran-Armitage Ptrend=.15). Among patients who received brachytherapy, 66% had high-dose-rate and 34% had low-dose-rate treatment, although use of high-dose-rate brachytherapy steadily increased to 75% by 2011 (Ptrend<.001). Eighteen percent of patients (n=278) received intensity modulated radiation therapy (IMRT), and IMRT receipt increased to 37% by 2011 (Ptrend<.001). Only 2.5% of patients (n=38) received IMRT in the setting of brachytherapy omission. Overall, 79% of patients (n=1185) received chemotherapy, and chemotherapy receipt increased to 84% by 2011 (Ptrend<.001). Median radiation treatment duration was 56 days (interquartile range, 47-65 days); however, duration exceeded 63 days in 36% of patients (n=543). Although 98% of patients received at least 1 benchmark treatment, only 44% received treatment that met all 3 benchmarks. With more stringent indicators (brachytherapy, ≥4 chemotherapy cycles, and duration not exceeding 56 days), only 25% of patients received treatment that met all benchmarks. Conclusion: In this cohort, most cervical cancer patients received treatment that did not comply with all 3 benchmarks for quality treatment. In contrast to increasing receipt of newer radiation technologies, there was little improvement in receipt of essential treatment benchmarks.
Faught, Austin M; Davidson, Scott E; Popple, Richard; Kry, Stephen F; Etzel, Carol; Ibbott, Geoffrey S; Followill, David S
2017-09-01
The Imaging and Radiation Oncology Core-Houston (IROC-H) Quality Assurance Center (formerly the Radiological Physics Center) has reported varying levels of compliance from their anthropomorphic phantom auditing program. IROC-H studies have suggested that one source of disagreement between institution-submitted calculated doses and measurement is the accuracy of the institution's treatment planning system dose calculations and heterogeneity corrections used. In order to audit this step of the radiation therapy treatment process, an independent dose calculation tool is needed. Monte Carlo multiple source models for Varian flattening filter free (FFF) 6 MV and FFF 10 MV therapeutic x-ray beams were commissioned based on central axis depth dose data from a 10 × 10 cm 2 field size and dose profiles for a 40 × 40 cm 2 field size. The models were validated against open-field measurements in a water tank for field sizes ranging from 3 × 3 cm 2 to 40 × 40 cm 2 . The models were then benchmarked against IROC-H's anthropomorphic head and neck phantom and lung phantom measurements. Validation results, assessed with a ±2%/2 mm gamma criterion, showed average agreement of 99.9% and 99.0% for central axis depth dose data for FFF 6 MV and FFF 10 MV models, respectively. Dose profile agreement using the same evaluation technique averaged 97.8% and 97.9% for the respective models. Phantom benchmarking comparisons were evaluated with a ±3%/2 mm gamma criterion, and agreement averaged 90.1% and 90.8% for the respective models. Multiple source models for Varian FFF 6 MV and FFF 10 MV beams have been developed, validated, and benchmarked for inclusion in an independent dose calculation quality assurance tool for use in clinical trial audits. © 2017 American Association of Physicists in Medicine.
Angus, Simon D.; Piotrowska, Monika Joanna
2014-01-01
Multi-dose radiotherapy protocols (fraction dose and timing) currently used in the clinic are the product of human selection based on habit, received wisdom, physician experience and intra-day patient timetabling. However, due to combinatorial considerations, the potential treatment protocol space for a given total dose or treatment length is enormous, even for relatively coarse search; well beyond the capacity of traditional in-vitro methods. In contrast, high fidelity numerical simulation of tumor development is well suited to the challenge. Building on our previous single-dose numerical simulation model of EMT6/Ro spheroids, a multi-dose irradiation response module is added and calibrated to the effective dose arising from 18 independent multi-dose treatment programs available in the experimental literature. With the developed model a constrained, non-linear, search for better performing candidate protocols is conducted within the vicinity of two benchmarks by genetic algorithm (GA) techniques. After evaluating less than 0.01% of the potential benchmark protocol space, candidate protocols were identified by the GA which conferred an average of 9.4% (max benefit 16.5%) and 7.1% (13.3%) improvement (reduction) on tumour cell count compared to the two benchmarks, respectively. Noticing that a convergent phenomenon of the top performing protocols was their temporal synchronicity, a further series of numerical experiments was conducted with periodic time-gap protocols (10 h to 23 h), leading to the discovery that the performance of the GA search candidates could be replicated by 17–18 h periodic candidates. Further dynamic irradiation-response cell-phase analysis revealed that such periodicity cohered with latent EMT6/Ro cell-phase temporal patterning. Taken together, this study provides powerful evidence towards the hypothesis that even simple inter-fraction timing variations for a given fractional dose program may present a facile and highly cost-effective means of significantly improving clinical efficacy. PMID:25460164
BMDExpress Data Viewer: A Visualization Tool to Analyze BMDExpress Datasets
Regulatory agencies increasingly apply benchmark dose (BMD) modeling to determine points of departure in human risk assessments. BMDExpress applies BMD modeling to transcriptomics datasets and groups genes to biological processes and pathways for rapid assessment of doses at whic...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Faillace, E.R.; Cheng, J.J.; Yu, C.
A series of benchmarking runs were conducted so that results obtained with the RESRAD code could be compared against those obtained with six pathway analysis models used to determine the radiation dose to an individual living on a radiologically contaminated site. The RESRAD computer code was benchmarked against five other computer codes - GENII-S, GENII, DECOM, PRESTO-EPA-CPG, and PATHRAE-EPA - and the uncodified methodology presented in the NUREG/CR-5512 report. Estimated doses for the external gamma pathway; the dust inhalation pathway; and the soil, food, and water ingestion pathways were calculated for each methodology by matching, to the extent possible, input parameters such as occupancy, shielding, and consumption factors.
Comparison of Vocal Vibration-Dose Measures for Potential-Damage Risk Criteria
ERIC Educational Resources Information Center
Titze, Ingo R.; Hunter, Eric J.
2015-01-01
Purpose: School-teachers have become a benchmark population for the study of occupational voice use. A decade of vibration-dose studies on the teacher population allows a comparison to be made between specific dose measures for eventual assessment of damage risk. Method: Vibration dosimetry is reformulated with the inclusion of collision stress.…
Markowski, V P; Zareba, G; Stern, S; Cox, C; Weiss, B
2001-06-01
Pregnant Holtzman rats were exposed to a single oral dose of 0, 20, 60, or 180 ng/kg 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD) on the 18th day of gestation. Their adult female offspring were trained to respond on a lever for brief opportunities to run in specially designed running wheels. Once they had begun responding on a fixed-ratio 1 (FR1) schedule of reinforcement, the fixed-ratio requirement for lever pressing was increased at five-session intervals to values of FR2, FR5, FR10, FR20, and FR30. We examined vaginal cytology after each behavior session to track estrous cyclicity. Under each of the FR values, perinatal TCDD exposure produced a significant dose-related reduction in the number of earned opportunities to run, the lever response rate, and the total number of revolutions in the wheel. Estrous cyclicity was not affected. Because of the consistent dose-response relationship at all FR values, we used the behavioral data to calculate benchmark doses based on displacements from modeled zero-dose performance of 1% (ED01) and 10% (ED10), as determined by a quadratic fit to the dose-response function. The mean ED10 benchmark dose for earned run opportunities was 10.13 ng/kg with a 95% lower bound of 5.77 ng/kg. The corresponding ED01 was 0.98 ng/kg with a 95% lower bound of 0.83 ng/kg. The mean ED10 for total wheel revolutions was calculated as 7.32 ng/kg with a 95% lower bound of 5.41 ng/kg. The corresponding ED01 was 0.71 ng/kg with a 95% lower bound of 0.60 ng/kg. These values should be viewed from the perspective of current human body burdens, whose average value, based on TCDD toxic equivalents, has been calculated as 13 ng/kg.
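The quadratic-fit ED10 described here is easy to reproduce in outline. The sketch below fits a quadratic to hypothetical group means and solves for the dose giving a 10% displacement from modeled zero-dose performance; the data values are synthetic, and the study's 95% lower bounds would additionally require a variance model.

```python
# ED10 from a quadratic dose-response fit, on synthetic group means.
import numpy as np

dose = np.array([0.0, 20.0, 60.0, 180.0])   # ng/kg TCDD
runs = np.array([40.0, 35.0, 28.0, 12.0])   # earned run opportunities (synthetic)

b2, b1, b0 = np.polyfit(dose, runs, deg=2)  # highest-order coefficient first

# Solve b0 + b1*d + b2*d^2 = 0.9*b0, i.e. b2*d^2 + b1*d + 0.1*b0 = 0,
# and keep the smallest positive real root.
roots = np.roots([b2, b1, 0.1 * b0])
ed10 = min(r.real for r in roots if r.real > 0 and abs(r.imag) < 1e-9)
print(f"ED10 ≈ {ed10:.1f} ng/kg")
```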
Thompson, Chad M; Gaylor, David W; Tachovsky, J Andrew; Perry, Camarie; Carakostas, Michael C; Haws, Laurie C
2013-12-01
Sulfolane is a widely used industrial solvent that is often used for gas treatment (sour gas sweetening; hydrogen sulfide removal from shale and coal processes, etc.), and in the manufacture of polymers and electronics, and may be found in pharmaceuticals as a residual solvent used in the manufacturing processes. Sulfolane is considered a high production volume chemical with worldwide production around 18 000-36 000 tons per year. Given that sulfolane has been detected as a contaminant in groundwater, an important potential route of exposure is tap water ingestion. Because there are currently no federal drinking water standards for sulfolane in the USA, we developed a noncancer oral reference dose (RfD) based on benchmark dose modeling, as well as a tap water screening value that is protective of ingestion. Review of the available literature suggests that sulfolane is not likely to be mutagenic, clastogenic or carcinogenic, or pose reproductive or developmental health risks except perhaps at very high exposure concentrations. RfD values derived using benchmark dose modeling were 0.01-0.04 mg kg⁻¹ per day, although modeling of developmental endpoints resulted in higher values, approximately 0.4 mg kg⁻¹ per day. The lowest, most conservative, RfD of 0.01 mg kg⁻¹ per day was based on reduced white blood cell counts in female rats. This RfD was used to develop a tap water screening level that is protective of ingestion, viz. 365 µg l⁻¹. It is anticipated that these values, along with the hazard identification and dose-response modeling described herein, should be informative for risk assessors and regulators interested in setting health-protective drinking water guideline values for sulfolane. Copyright © 2012 John Wiley & Sons, Ltd.
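As a rough plausibility check, the reported screening level follows from the standard RfD-to-drinking-water conversion; the body weight, intake rate, and relative source contribution below are assumptions chosen for illustration (a 73 kg adult happens to reproduce the 365 µg l⁻¹ figure exactly), not necessarily the values the authors used.

```python
# Back-of-envelope RfD -> tap-water screening level conversion.
# All exposure factors here are illustrative assumptions.
rfd = 0.01          # mg/kg-day, most conservative RfD from the abstract
body_weight = 73.0  # kg, assumed adult body weight
water_intake = 2.0  # L/day, assumed drinking-water ingestion rate
rsc = 1.0           # relative source contribution, assumed

screening_ug_per_l = rfd * body_weight * rsc / water_intake * 1000.0
print(f"screening level ≈ {screening_ug_per_l:.0f} µg/L")  # ≈ 365 µg/L
```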
NASA Astrophysics Data System (ADS)
He, Weizhen; Zhu, Yunhao; Feng, Ting; Wang, Huaideng; Yuan, Jie; Xu, Guan; Wang, Xueding; Carson, Paul
2017-03-01
Osteoporosis is a progressive bone disease characterized by a decrease in bone mass and deterioration in bone micro-architecture. In theory, photoacoustic (PA) imaging analysis has the potential to characterize bone effectively. A previous study demonstrated that the photoacoustic spectral analysis (PASA) method, with the quantified parameter slope, could provide an objective assessment of bone microstructure and deterioration. In this study, we compared the PASA method with the traditional quantitative ultrasound (QUS) method for osteoporosis assessment. Numerical simulations of both PA and ultrasound (US) signals were performed on computerized tomographic (CT) images of trabecular bone with different bone mineral densities (BMDs). Ex vivo experiments were conducted on porcine femur bone models of different BMDs. We compared the quantified parameter slope and the broadband ultrasound attenuation (BUA) coefficient from PASA and QUS among the different bone models. Both the simulation and ex vivo experimental results show that bone with low BMD has a higher slope value and a lower BUA value. Our results demonstrate that the PASA method has the same efficacy as QUS in bone assessment; since PA is a non-ionizing, non-invasive technique, the PASA method holds potential for clinical diagnosis of osteoporosis and other bone diseases.
DOSE-RESPONSE ASSESSMENT FOR DEVELOPMENTAL TOXICITY III. STATISTICAL MODELS
Although quantitative modeling has been central to cancer risk assessment for years, the concept of dose-response modeling for developmental effects is relatively new. The benchmark dose (BMD) approach has been proposed for use with developmental (as well as other noncancer) endpo...
Bone density loss after allogeneic hematopoietic stem cell transplantation: a prospective study.
Stern, J M; Sullivan, K M; Ott, S M; Seidel, K; Fink, J C; Longton, G; Sherrard, D J
2001-01-01
The incidence and course of bone density abnormalities following hematopoietic stem cell transplantation are poorly understood and complicated by the impact of multiple factors. Hip, spine, and wrist bone mineral densities (BMDs) were measured in 104 adults (54 women, 54 men; mean age, 40 years [range, 18-64 years]) at 3 and 12 months after allogeneic transplantation. Clinical and laboratory variables were evaluated using univariate and multivariate analyses to determine risk factors for osteoporosis, fracture, and avascular necrosis. At 3 months posttransplantation, combined (male and female) hip, spine, and wrist z scores were -0.35, -0.42, and +0.04 standard deviations, respectively. At 12 months both men and women experienced significant loss of hip BMD (4.2%, P < .0001); changes in the spine and wrist were minimal. The cumulative dose and number of days of glucocorticoid therapy and the number of days of cyclosporine or tacrolimus therapy showed significant associations with loss of BMD; age, total body irradiation, diagnosis, and donor type did not. Nontraumatic fractures occurred in 10.6% of patients and avascular necrosis in 9.6% within 3 years posttransplantation. The decrease in height between pretransplantation and 12 months posttransplantation was significant (P = .0001). Results indicate that loss of BMD after allogeneic stem cell transplantation is common and accelerated by the length of immunosuppressive therapy and cumulative dose of glucocorticoid. An increased incidence of fracture and avascular necrosis may adversely impact long-term quality of life. Prevention of bone demineralization appears warranted after stem cell transplantation.
The ability to anchor chemical class-based gene expression changes to phenotypic lesions and to describe these changes as a function of dose and time informs mode of action determinations and improves quantitative risk assessments. Previous transcription-based microarra...
Modification and benchmarking of SKYSHINE-III for use with ISFSI cask arrays
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hertel, N.E.; Napolitano, D.G.
1997-12-01
Dry cask storage arrays are becoming more and more common at nuclear power plants in the United States. Title 10 of the Code of Federal Regulations, Part 72, limits doses at the controlled area boundary of these independent spent-fuel storage installations (ISFSI) to 0.25 mSv (25 mrem)/yr. The minimum controlled area boundaries of such a facility are determined by cask array dose calculations, which include direct radiation and radiation scattered by the atmosphere, also known as skyshine. NAC International (NAC) uses SKYSHINE-III to calculate the gamma-ray and neutron dose rates as a function of distance from ISFSI arrays. In this paper, we present modifications to SKYSHINE-III that more explicitly model cask arrays. In addition, we have benchmarked the radiation transport methods used in SKYSHINE-III against 60Co gamma-ray experiments and MCNP neutron calculations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Renaud, M; Seuntjens, J; Roberge, D
Purpose: Assessing the performance and uncertainty of a pre-calculated Monte Carlo (PMC) algorithm for proton and electron transport running on graphics processing units (GPU). While PMC methods have been described in the past, an explicit quantification of the latent uncertainty arising from recycling a limited number of tracks in the pre-generated track bank is missing from the literature. With a proper uncertainty analysis, an optimal pre-generated track bank size can be selected for a desired dose calculation uncertainty. Methods: Particle tracks were pre-generated for electrons and protons using EGSnrc and GEANT4, respectively. The PMC algorithm for track transport was implemented on the CUDA programming framework. GPU-PMC dose distributions were compared to benchmark dose distributions simulated using general-purpose MC codes in the same conditions. A latent uncertainty analysis was performed by comparing GPU-PMC dose values to a “ground truth” benchmark while varying the track bank size and primary particle histories. Results: GPU-PMC dose distributions and benchmark doses were within 1% of each other in voxels with dose greater than 50% of Dmax. In proton calculations, a submillimeter distance-to-agreement error was observed at the Bragg peak. Latent uncertainty followed a Poisson distribution with the number of tracks per energy (TPE) and a track bank of 20,000 TPE produced a latent uncertainty of approximately 1%. Efficiency analysis showed a 937× and 508× gain over a single processor core running DOSXYZnrc for 16 MeV electrons in water and bone, respectively. Conclusion: The GPU-PMC method can calculate dose distributions for electrons and protons to a statistical uncertainty below 1%. The track bank size necessary to achieve an optimal efficiency can be tuned based on the desired uncertainty. Coupled with a model to calculate dose contributions from uncharged particles, GPU-PMC is a candidate for inverse planning of modulated electron radiotherapy and scanned proton beams. This work was supported in part by FRSQ-MSSS (Grant No. 22090), NSERC RG (Grant No. 432290) and CIHR MOP (Grant No. MOP-211360)
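If the latent uncertainty scales Poisson-like with the number of unique tracks per energy (TPE), as reported, it falls as 1/√TPE; the sketch below pins the constant to the abstract's "20,000 TPE ≈ 1%" observation, which is an extrapolation assumption rather than a fitted model from the paper.

```python
# 1/sqrt scaling of latent uncertainty with track bank size, anchored
# (by assumption) to the reported 20,000 tracks/energy -> ~1% point.
import math

def latent_uncertainty(tpe: int, k: float = 0.01 * math.sqrt(20_000)) -> float:
    # Poisson-like statistics: sigma proportional to 1/sqrt(TPE)
    return k / math.sqrt(tpe)

for tpe in (5_000, 20_000, 60_000):
    print(f"{tpe:>6} tracks/energy -> ~{latent_uncertainty(tpe) * 100:.2f}% latent uncertainty")
```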
Latent uncertainties of the precalculated track Monte Carlo method.
Renaud, Marc-André; Roberge, David; Seuntjens, Jan
2015-01-01
While significant progress has been made in speeding up Monte Carlo (MC) dose calculation methods, they remain too time-consuming for the purpose of inverse planning. To achieve clinically usable calculation speeds, a precalculated Monte Carlo (PMC) algorithm for proton and electron transport was developed to run on graphics processing units (GPUs). The algorithm utilizes pregenerated particle track data from conventional MC codes for different materials such as water, bone, and lung to produce dose distributions in voxelized phantoms. While PMC methods have been described in the past, an explicit quantification of the latent uncertainty arising from the limited number of unique tracks in the pregenerated track bank is missing from the literature. With a proper uncertainty analysis, an optimal number of tracks in the pregenerated track bank can be selected for a desired dose calculation uncertainty. Particle tracks were pregenerated for electrons and protons using EGSnrc and GEANT4 and saved in a database. The PMC algorithm for track selection, rotation, and transport was implemented on the Compute Unified Device Architecture (CUDA) 4.0 programming framework. PMC dose distributions were calculated in a variety of media and compared to benchmark dose distributions simulated from the corresponding general-purpose MC codes in the same conditions. A latent uncertainty metric was defined and analysis was performed by varying the pregenerated track bank size and the number of simulated primary particle histories and comparing dose values to a "ground truth" benchmark dose distribution calculated to 0.04% average uncertainty in voxels with dose greater than 20% of Dmax. Efficiency metrics were calculated against benchmark MC codes on a single CPU core with no variance reduction. Dose distributions generated using PMC and benchmark MC codes were compared and found to be within 2% of each other in voxels with dose values greater than 20% of the maximum dose. In proton calculations, a small (≤1 mm) distance-to-agreement error was observed at the Bragg peak. Latent uncertainty was characterized for electrons and found to follow a Poisson distribution with the number of unique tracks per energy. A track bank of 12 energies and 60,000 unique tracks per pregenerated energy in water had a size of 2.4 GB and achieved a latent uncertainty of approximately 1% at an optimal efficiency gain over DOSXYZnrc. Larger track banks produced a lower latent uncertainty at the cost of increased memory consumption. Using an NVIDIA GTX 590, efficiency analysis showed an 807× efficiency increase over DOSXYZnrc for 16 MeV electrons in water and 508× for 16 MeV electrons in bone. The PMC method can calculate dose distributions for electrons and protons to a statistical uncertainty of 1% with a large efficiency gain over conventional MC codes. Before performing clinical dose calculations, models to calculate dose contributions from uncharged particles must be implemented. Following the successful implementation of these models, the PMC method will be evaluated as a candidate for inverse planning of modulated electron radiation therapy and scanned proton beams.
SU-E-T-577: Commissioning of a Deterministic Algorithm for External Photon Beams
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhu, T; Finlay, J; Mesina, C
Purpose: We report commissioning results for a deterministic algorithm for external photon beam treatment planning. A deterministic algorithm solves the radiation transport equations directly using a finite difference method, thus improving the accuracy of dose calculation, particularly under heterogeneous conditions, with results similar to those of Monte Carlo (MC) simulation. Methods: Commissioning data for photon energies 6-15 MV include the percentage depth dose (PDD) measured at SSD = 90 cm and output ratio in water (Spc), both normalized to 10 cm depth, for field sizes between 2 and 40 cm and depths between 0 and 40 cm. Off-axis ratio (OAR) for the same set of field sizes was used at 5 depths (dmax, 5, 10, 20, 30 cm). The final model was compared with the commissioning data as well as additional benchmark data. The benchmark data include dose per MU determined for 17 points for SSD between 80 and 110 cm, depth between 5 and 20 cm, and lateral offset of up to 16.5 cm. Relative comparisons were made in a heterogeneous phantom made of cork and solid water. Results: Compared to the commissioning beam data, the agreement is generally better than 2%, with large errors (up to 13%) observed in the buildup regions of the PDD and penumbra regions of the OAR profiles. The overall mean standard deviation is 0.04% when all data are taken into account. Compared to the benchmark data, the agreement is generally better than 2%. Relative comparison in the heterogeneous phantom is in general better than 4%. Conclusion: A commercial deterministic algorithm was commissioned for megavoltage photon beams. In a homogeneous medium, the agreement between the algorithm and measurement at the benchmark points is generally better than 2%. The dose accuracy of a deterministic algorithm is better than that of a convolution algorithm in heterogeneous media.
Arnold, Scott M; Collins, Michael A; Graham, Cynthia; Jolly, Athena T; Parod, Ralph J; Poole, Alan; Schupp, Thomas; Shiotsuka, Ronald N; Woolhiser, Michael R
2012-12-01
Polyurethanes (PU) are polymers made from diisocyanates and polyols for a variety of consumer products. It has been suggested that PU foam may contain trace amounts of residual toluene diisocyanate (TDI) monomers and present a health risk. To address this concern, the exposure scenario and health risks posed by sleeping on a PU foam mattress were evaluated. Toxicity benchmarks for key non-cancer endpoints (i.e., irritation, sensitization, respiratory tract effects) were determined by dividing points of departure by uncertainty factors. The cancer benchmark was derived using the USEPA Benchmark Dose Software. Results of previous migration and emission data of TDI from PU foam were combined with conservative exposure factors to calculate upper-bound dermal and inhalation exposures to TDI as well as a lifetime average daily dose to TDI from dermal exposure. For each non-cancer endpoint, the toxicity benchmark was divided by the calculated exposure to determine the margin of safety (MOS), which ranged from 200 (respiratory tract) to 3×10⁶ (irritation). Although available data indicate TDI is not carcinogenic, a theoretical excess cancer risk (1×10⁻⁷) was calculated. We conclude from this assessment that sleeping on a PU foam mattress does not pose TDI-related health risks to consumers. Copyright © 2012 Elsevier Inc. All rights reserved.
Current modeling practice may lead to falsely high benchmark dose estimates.
Ringblom, Joakim; Johanson, Gunnar; Öberg, Mattias
2014-07-01
Benchmark dose (BMD) modeling is increasingly used as the preferred approach to define the point of departure for health risk assessment of chemicals. As data are inherently variable, there is always a risk of selecting a model whose lower confidence bound of the BMD (BMDL), contrary to expectation, exceeds the true BMD. The aim of this study was to investigate how often and under what circumstances such anomalies occur under current modeling practice. Continuous data were generated from a realistic dose-effect curve by Monte Carlo simulations using four dose groups and a set of five different dose placement scenarios, group sizes between 5 and 50 animals, and coefficients of variation of 5-15%. The BMD calculations were conducted using nested exponential models, as most BMD software use nested approaches. "Non-protective" BMDLs (higher than the true BMD) were frequently observed, in some scenarios reaching 80%. The phenomenon was mainly related to the selection of the non-sigmoidal exponential model (Effect = a·e^(b·dose)). In conclusion, non-sigmoid models should be used with caution as they may underestimate the risk, illustrating that awareness of the model selection process and sound identification of the point of departure is vital for health risk assessment. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
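Because the flagged model has a closed form, the BMD it implies can be written down directly, which makes the model-selection stakes easy to see; the fitted parameters below are illustrative assumptions.

```python
# Closed-form BMD for the non-sigmoidal model Effect = a*exp(b*dose):
# solving a*exp(b*BMD) = a*(1 + BMR) gives BMD = ln(1 + BMR)/b.
import math

a, b = 1.0, 0.004  # assumed fitted parameters (a cancels out of the BMD)
for bmr in (0.05, 0.10):
    print(f"BMR {bmr:.0%}: BMD = {math.log(1 + bmr) / b:.1f} dose units")
```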
Evaluating MoE and its Uncertainty and Variability for Food Contaminants (EuroTox presentation)
Margin of Exposure (MoE) is a metric for quantifying the relationship between exposure and hazard. Ideally, it is the ratio of the dose associated with hazard to an estimate of exposure. For example, hazard may be characterized by a benchmark dose (BMD), and, for food contami...
The US EPA’s N-Methyl Carbamate (NMC) Cumulative Risk assessment was based on the effect on acetylcholine esterase (AChE) activity of exposure to 10 NMC pesticides through dietary, drinking water, and residential exposures, assuming the effects of joint exposure to NMCs are dose-...
75 FR 40729 - Residues of Quaternary Ammonium Compounds, N-Alkyl (C12-14
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-14
.... Systemic toxicity occurs after absorption and distribution of the chemical to tissues in the body. Such... identified (the LOAEL) or a Benchmark Dose (BMD) approach is sometimes used for risk assessment. Uncertainty.... No systemic effects observed up to 20 mg/kg/day, highest dose of technical that could be tested...
The ability to anchor chemical class-based gene expression changes to phenotypic lesions and to describe these changes as a function of dose and time can inform mode of action and improve quantitative risk assessment. Previous research identified a 330-gene cluster commonly resp...
For more than three decades chronic studies in rodents have been the benchmark for assessing the potential long-term toxicity, and particularly the carcinogenicity, of chemicals. With doses typically administered for about 2 years (18 months to lifetime), the rodent bioassay has ...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fujii, K; UCLA School of Medicine, Los Angeles, CA; Bostani, M
Purpose: The aim of this study was to collect CT dose index data from adult head exams to establish benchmarks based on either: (a) values pooled from all head exams or (b) values for specific protocols. One part of this was to investigate differences in scan frequency and CT dose index data for inpatients versus outpatients. Methods: We collected CT dose index data (CTDIvol) from adult head CT examinations performed at our medical facilities from Jan 1st to Dec 31st, 2014. Four of these scanners were used for inpatients, the other five were used for outpatients. All scanners used tube current modulation. We used X-ray dose management software to mine dose index data and evaluate CTDIvol for 15807 inpatients and 4263 outpatients undergoing Routine Brain, Sinus, Facial/Mandible, Temporal Bone, CTA Brain and CTA Brain-Neck protocols, and combined across all protocols. Results: For inpatients, Routine Brain series represented 84% of total scans performed. For outpatients, Sinus scans represented the largest fraction (36%). The CTDIvol (mean ± SD) across all head protocols was 39 ± 30 mGy (min-max: 3.3–540 mGy). The CTDIvol for Routine Brain was 51 ± 6.2 mGy (min-max: 36–84 mGy). The values for Sinus were 24 ± 3.2 mGy (min-max: 13–44 mGy) and for Facial/Mandible were 22 ± 4.3 mGy (min-max: 14–46 mGy). The mean CTDIvol for inpatients and outpatients was similar across protocols with one exception (CTA Brain-Neck). Conclusion: There is substantial dose variation when results from all protocols are pooled together; this is primarily a function of the differences in technical factors of the protocols themselves. When protocols are analyzed separately, there is much less variability. While analyzing pooled data affords some utility, reviewing protocols segregated by clinical indication provides greater opportunity for optimization and establishing useful benchmarks.
Missile Defense Information Technology Small Business Conference
2009-09-01
NetOps Survivability 4 • Supported User Base • Number of Workstations • Number of Servers • Number of Special Circuits • Number of Sites • Number...Contracts, MDIOC • Ground Test (DTC) • MDSEC (SS) • Infrastructure (IC) • BMDS Support (BCT) • JTAAS – SETA • Mod & Sim (DES) • Analysis (GML) • Tenants...AUG 09) 4 MDA DOCE Engineering Functions • Design Engineers – Develop detailed design artifacts based on architectural specifications – Coordinate
Information Management Principles Applied to the Ballistic Missile Defense System
2007-03-01
of a BMDS. From this, the Army produced the Nike-Zeus system comprised of four radars, the Zeus missile, and a computer fire control system (General...made the Nike-Zeus our first National Missile Defense (NMD) system named Sentinel. The architecture was to cover 14 locations, 10 of which were...1999). Additionally, there are cultural impacts (Gordon & Gordon, 1999). A company choosing an Apple OS may have to wage a big fight against the
Ali, F; Waker, A J; Waller, E J
2014-10-01
Tissue-equivalent proportional counters (TEPC) can potentially be used as a portable and personal dosemeter in mixed neutron and gamma-ray fields, but what hinders this use is their typically large physical size. To formulate compact TEPC designs, the use of a Monte Carlo transport code is necessary to predict the performance of compact designs in these fields. To perform this modelling, three candidate codes were assessed: MCNPX 2.7.E, FLUKA 2011.2 and PHITS 2.24. In each code, benchmark simulations were performed involving the irradiation of a 5-in. TEPC with monoenergetic neutron fields and a 4-in. wall-less TEPC with monoenergetic gamma-ray fields. The frequency and dose mean lineal energies and dose distributions calculated from each code were compared with experimentally determined data. For the neutron benchmark simulations, PHITS produces data closest to the experimental values and for the gamma-ray benchmark simulations, FLUKA yields data closest to the experimentally determined quantities. © The Author 2013. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
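The two spectrum summaries compared in these benchmarks, the frequency-mean and dose-mean lineal energies, are simple moments of the measured distribution f(y); the sketch below computes them on a synthetic spectrum (the lognormal-shaped f is an assumption for illustration).

```python
# Frequency-mean y_F = sum(y*f*dy)/sum(f*dy) and
# dose-mean      y_D = sum(y^2*f*dy)/sum(y*f*dy), on a synthetic spectrum.
import numpy as np

y = np.logspace(-1, 2, 200)            # lineal energy (keV/µm)
f = np.exp(-((np.log(y) - 1.0) ** 2))  # synthetic frequency distribution

dy = np.gradient(y)                    # bin widths on the log-spaced grid
y_f = np.sum(y * f * dy) / np.sum(f * dy)
y_d = np.sum(y**2 * f * dy) / np.sum(y * f * dy)
print(f"y_F ≈ {y_f:.2f} keV/µm, y_D ≈ {y_d:.2f} keV/µm")
```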
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, Donald L.; Hilohi, C. Michael; Spelic, David C.
2012-10-15
Purpose: To determine patient radiation doses from interventional cardiology procedures in the U.S. and to suggest possible initial values for U.S. benchmarks for patient radiation dose from selected interventional cardiology procedures [fluoroscopically guided diagnostic cardiac catheterization and percutaneous coronary intervention (PCI)]. Methods: Patient radiation dose metrics were derived from analysis of data from the 2008 to 2009 Nationwide Evaluation of X-ray Trends (NEXT) survey of cardiac catheterization. This analysis used deidentified data and did not require review by an IRB. Data from 171 facilities in 30 states were analyzed. The distributions (percentiles) of radiation dose metrics were determined for diagnostic cardiac catheterizations, PCI, and combined diagnostic and PCI procedures. Confidence intervals for these dose distributions were determined using bootstrap resampling. Results: Percentile distributions (advisory data sets) and possible preliminary U.S. reference levels (based on the 75th percentile of the dose distributions) are provided for cumulative air kerma at the reference point (Ka,r), cumulative air kerma-area product (PKA), fluoroscopy time, and number of cine runs. Dose distributions are sufficiently detailed to permit dose audits as described in National Council on Radiation Protection and Measurements Report No. 168. Fluoroscopy times are consistent with those observed in European studies, but PKA is higher in the U.S. Conclusions: Sufficient data exist to suggest possible initial benchmarks for patient radiation dose for certain interventional cardiology procedures in the U.S. Our data suggest that patient radiation dose in these procedures is not optimized in U.S. practice.
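Mechanically, a reference level of this kind is just the 75th percentile of the observed dose-metric distribution, and the bootstrap confidence interval mentioned above takes only a few lines; the PKA values below are synthetic, not NEXT survey data.

```python
# 75th-percentile reference level with a bootstrap 95% CI, on synthetic data.
import numpy as np

rng = np.random.default_rng(0)
pka = rng.lognormal(np.log(50.0), 0.7, size=500)  # synthetic P_KA (Gy·cm²)

ref_level = np.percentile(pka, 75)
boot = [np.percentile(rng.choice(pka, pka.size), 75) for _ in range(2000)]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"reference level ≈ {ref_level:.1f} Gy·cm² (95% CI {lo:.1f}-{hi:.1f})")
```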
EPA's methodology for estimation of inhalation reference concentrations (RfCs) as benchmark estimates of the quantitative dose-response assessment of chronic noncancer toxicity for individual inhaled chemicals.
Benchmarking the minimum Electron Beam (eBeam) dose required for the sterilization of space foods
NASA Astrophysics Data System (ADS)
Bhatia, Sohini S.; Wall, Kayley R.; Kerth, Chris R.; Pillai, Suresh D.
2018-02-01
As manned space missions extend in length, the safety, nutrition, acceptability, and shelf life of space foods are of paramount importance to NASA. Since food and mealtimes play a key role in reducing stress and boredom of prolonged missions, the quality of food in terms of appearance, flavor, texture, and aroma can have significant psychological ramifications on astronaut performance. The FDA, which oversees space foods, currently requires a minimum dose of 44 kGy for irradiated space foods. The underlying hypothesis was that commercial sterility of space foods could be achieved at a significantly lower dose, and this lowered dose would positively affect the shelf life of the product. Electron beam processed beef fajitas were used as an example NASA space food to benchmark the minimum eBeam dose required for sterility. A 15 kGy dose was able to achieve an approximately 10 log reduction in Shiga-toxin-producing Escherichia coli bacteria, and a 5 log reduction in Clostridium sporogenes spores. Furthermore, accelerated shelf life testing (ASLT) to determine sensory and quality characteristics under various conditions was conducted. Using multidimensional gas chromatography-olfactometry-mass spectrometry (MDGC-O-MS), numerous volatiles were shown to be dependent on the dose applied to the product. In addition, concentrations of off-flavor aroma compounds such as dimethyl sulfide were decreased at the reduced 15 kGy dose. The results suggest that the combination of conventional cooking combined with eBeam processing (15 kGy) can achieve the safety and shelf-life objectives needed for long-duration space foods.
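The log-reduction arithmetic behind the 15 kGy benchmark is linear in dose once an organism's D10 (the dose for one decade of kill) is known; the D10 values below are back-calculated from the reported reductions (15 kGy ÷ 10 logs and 15 kGy ÷ 5 logs) and should be read as illustrative assumptions.

```python
# Dose-to-log-reduction arithmetic: log10 reduction = dose / D10.
# D10 values are assumptions inferred from the reductions quoted above.
def log_reduction(dose_kgy: float, d10_kgy: float) -> float:
    return dose_kgy / d10_kgy

for organism, d10 in [("Shiga-toxigenic E. coli", 1.5), ("C. sporogenes spores", 3.0)]:
    print(f"{organism}: {log_reduction(15.0, d10):.0f}-log reduction at 15 kGy")
```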
The grout/glass performance assessment code system (GPACS) with verification and benchmarking
DOE Office of Scientific and Technical Information (OSTI.GOV)
Piepho, M.G.; Sutherland, W.H.; Rittmann, P.D.
1994-12-01
GPACS is a computer code system for calculating water flow (unsaturated or saturated), solute transport, and human doses due to the slow release of contaminants from a waste form (in particular grout or glass) through an engineered system and through a vadose zone to an aquifer, well and river. This dual-purpose document is intended to serve as a user's guide and verification/benchmark document for the Grout/Glass Performance Assessment Code system (GPACS). GPACS can be used for low-level-waste (LLW) Glass Performance Assessment and many other applications including other low-level-waste performance assessments and risk assessments. Based on all the cases presented, GPACS is adequate (verified) for calculating water flow and contaminant transport in unsaturated-zone sediments and for calculating human doses via the groundwater pathway.
Pasler, Marlies; Kaas, Jochem; Perik, Thijs; Geuze, Job; Dreindl, Ralf; Künzler, Thomas; Wittkamper, Frits; Georg, Dietmar
2015-12-01
To systematically evaluate machine-specific quality assurance (QA) for volumetric modulated arc therapy (VMAT) based on log files by applying a dynamic benchmark plan. A VMAT benchmark plan was created and tested on 18 Elekta linacs (13 MLCi or MLCi2, 5 Agility) at 4 different institutions. Linac log files were analyzed and a delivery robustness index was introduced. An ionization chamber array was used for dosimetric measurements. Relative dose deviations were assessed by the mean gamma for each control point and compared to the log file evaluation. Fourteen linacs delivered the VMAT benchmark plan, while 4 linacs failed by consistently terminating the delivery. The mean leaf error (±1 SD) was 0.3±0.2 mm for all linacs. Large MLC maximum errors up to 6.5 mm were observed at reversal positions. The delivery robustness index accounting for MLC position correction (0.8-1.0) correlated with delivery time (80-128 s) and depended on dose rate performance. Dosimetric evaluation indicated generally accurate plan reproducibility, with γmean (±1 SD) = 0.4±0.2 for 1 mm/1%. However, single control point analysis revealed larger deviations that agreed well with the log file analysis. The designed benchmark plan helped identify linac-related malfunctions in dynamic mode for VMAT. Log files serve as an important additional QA measure to understand and visualize dynamic linac parameters. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
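For context, the per-control-point dose comparison quoted here uses the gamma index; a minimal 1D version with a global 1%/1 mm criterion is sketched below on synthetic profiles (a full QA comparison would be 2D/3D with interpolation and dose thresholds).

```python
# Minimal 1D global gamma index (1%/1 mm) on synthetic dose profiles.
import numpy as np

def gamma_1d(x, d_ref, d_eval, dta_mm=1.0, dd_frac=0.01):
    d_max = d_ref.max()                  # global dose normalization
    g = np.empty_like(d_ref)
    for i, (xi, di) in enumerate(zip(x, d_ref)):
        dist2 = ((x - xi) / dta_mm) ** 2
        dose2 = ((d_eval - di) / (dd_frac * d_max)) ** 2
        g[i] = np.sqrt((dist2 + dose2).min())
    return g

x = np.linspace(0.0, 100.0, 201)         # position (mm)
ref = np.exp(-((x - 50.0) / 20.0) ** 2)  # synthetic reference profile
ev = np.exp(-((x - 50.4) / 20.0) ** 2)   # evaluated profile, 0.4 mm shifted
print(f"mean gamma = {gamma_1d(x, ref, ev).mean():.2f}")
```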
Gamma irradiator dose mapping simulation using the MCNP code and benchmarking with dosimetry.
Sohrabpour, M; Hassanzadeh, M; Shahriari, M; Sharifzadeh, M
2002-10-01
The Monte Carlo transport code MCNP has been applied to simulating the dose rate distribution in the IR-136 gamma irradiator system. Isodose curves, cumulative dose values, and system design data such as throughputs, over-dose ratios, and efficiencies have been simulated as functions of product density. Simulated isodose curves and cumulative dose values were compared with dosimetry values obtained using polymethyl methacrylate, Fricke, ethanol-chlorobenzene, and potassium dichromate dosimeters. The produced system design data were also found to agree quite favorably with the system manufacturer's data. MCNP has thus been found to be an effective transport code for handling various dose mapping exercises for gamma irradiators.
Bercu, Joel P; Jolly, Robert A; Flagella, Kelly M; Baker, Thomas K; Romero, Pedro; Stevens, James L
2010-12-01
In order to determine a threshold for nongenotoxic carcinogens, the traditional risk assessment approach has been to identify a mode of action (MOA) with a nonlinear dose-response. The dose-response for one or more key event(s) linked to the MOA for carcinogenicity allows a point of departure (POD) to be selected from the most sensitive effect dose or no-effect dose. However, this can be challenging because multiple MOAs and key events may exist for carcinogenicity, and oftentimes extensive research is required to elucidate the MOA. In the present study, a microarray analysis was conducted to determine if a POD could be identified following short-term oral rat exposure to two nongenotoxic rodent carcinogens, fenofibrate and methapyrilene, using a benchmark dose analysis of genes aggregated in Kyoto Encyclopedia of Genes and Genomes (KEGG) pathways and Gene Ontology (GO) biological processes, which likely encompass key event(s) for carcinogenicity. The gene expression response for fenofibrate given to rats for 2 days was consistent with its MOA and known key events linked to PPARα activation. The temporal response from daily dosing with methapyrilene demonstrated biological complexity with waves of pathways/biological processes occurring over 1, 3, and 7 days; nonetheless, the benchmark dose values were consistent over time. When comparing the dose-response of toxicogenomic data to tumorigenesis or precursor events, the toxicogenomics POD was slightly below any effect level. Our results suggest that toxicogenomic analysis using short-term studies can be used to identify a threshold for nongenotoxic carcinogens based on evaluation of potential key event(s), which then can be used within a risk assessment framework. Copyright © 2010 Elsevier Inc. All rights reserved.
Girard, Raphaële; Aupee, Martine; Erb, Martine; Bettinger, Anne; Jouve, Alice
2012-12-01
The 3 ml volume currently used as the hand hygiene (HH) measure has been explored as the pertinent dose for an indirect indicator of HH compliance. A multicenter study was conducted in order to ascertain the required dose using different products. The average contact duration before drying was measured and compared with references. Effective hand coverage had to include the whole hand and the wrist. Two durations were chosen as points of reference: 30 s, as given by guidelines, and the duration validated by the European standard EN 1500. Each product was to be tested, using standardized procedures, by three nosocomial infection prevention teams, for three different doses (3, 2 and 1.5 ml). Data from 27 products and 1706 tests were analyzed. Depending on the product, the dose needed to ensure a 30-s contact duration in 75% of tests ranged from 2 ml to more than 3 ml, and the dose needed to ensure a contact duration exceeding the EN 1500 times in 75% of tests ranged from 1.5 ml to more than 3 ml. The interpretation is as follows: if different products are used, the volume used does not give an unbiased estimate of HH compliance. Other compliance evaluation methods remain necessary for efficient benchmarking. Copyright © 2012 Ministry of Health, Saudi Arabia. Published by Elsevier Ltd. All rights reserved.
Lachenmeier, Dirk W; Steffen, Christian; el-Atma, Oliver; Maixner, Sibylle; Löbell-Behrends, Sigrid; Kohl-Himmelseher, Matthias
2012-11-01
The decision criterion for the demarcation between foods and medicinal products in the EU is significant "pharmacological action". Based on six examples of substances with ambivalent status, the benchmark dose (BMD) method is evaluated as a means to provide a threshold for pharmacological action. Using significant dose-response models from clinical trial data or epidemiology in the literature, the BMD values were 63 mg/day for caffeine, 5 g/day for alcohol, 6 mg/day for lovastatin, 769 mg/day for glucosamine sulfate, 151 mg/day for Ginkgo biloba extract, and 0.4 mg/day for melatonin. The examples of caffeine and alcohol validate the approach, because intake above the BMD clearly exhibits pharmacological action. Nevertheless, due to uncertainties in dose-response modelling, as well as the need for additional uncertainty factors to account for differences in sensitivity within the human population, a "borderline range" on the dose-response curve remains. "Pharmacological action" has proven to be not very well suited as a binary decision criterion between foods and medicinal products. The European legislator should rethink the definition of medicinal products, as the current situation, based on complicated case-by-case decisions on pharmacological action, leads to an unregulated market flooded with potentially illegal food supplements. Copyright © 2012 Elsevier Inc. All rights reserved.
Benchmark solutions for the galactic heavy-ion transport equations with energy and spatial coupling
NASA Technical Reports Server (NTRS)
Ganapol, Barry D.; Townsend, Lawrence W.; Lamkin, Stanley L.; Wilson, John W.
1991-01-01
Nontrivial benchmark solutions are developed for the galactic heavy-ion transport equations in the straight-ahead approximation with energy and spatial coupling. Analytical representations of the ion fluxes are obtained for a variety of sources with the assumption that the nuclear interaction parameters are energy independent. The method utilizes an analytical Laplace transform inversion to yield a closed-form representation that is computationally efficient. The flux profiles are then used to predict ion dose profiles, which are important for shield design studies.
Prevalence and associated factors of low bone mass in adults with systemic lupus erythematosus.
Cramarossa, G; Urowitz, M B; Su, J; Gladman, D; Touma, Z
2017-04-01
Background Systemic lupus erythematosus (SLE) patients are often treated with glucocorticoids, which place them at risk of bone loss. Objectives The objectives of this article are to determine: (1) the prevalence of low bone mineral density (BMD) and factors associated with low BMD and (2) the prevalence of symptomatic fragility fractures in inception patients of the Toronto Lupus Cohort (TLC). Methods Prospectively collected data from the TLC (1996-2015) of inception patients' first BMD were analyzed. For pre-menopausal women/males <50 years, BMD 'below expected range for age' was defined by Z-score ≤ -2.0 SD. For post-menopausal women/males age 50 or older, osteoporosis was defined by T-score ≤ -2.5 SD and low bone mass by T-score between -1.0 and -2.5 SD. Patients' BMDs were defined as abnormal if Z-score ≤ -2.0 or T-score < -1.0 SD, and the remainder as normal. Descriptive analysis and logistic regression were employed. Results Of 1807 patients, 286 are inception patients with BMD results (mean age 37.9 ± 13.7 years); 88.8% are female. The overall prevalence of abnormal BMD is 31.5%. In pre-menopausal women (n = 173), the prevalence of BMD below expected range is 17.3%. In post-menopausal women (n = 81), the prevalence of osteoporosis and low BMD are 12.3% and 43.2%, respectively. Age and cumulative dose of glucocorticoids are statistically significantly associated with abnormal BMD in multivariate analysis. Of 769 inception patients from TLC, 11.1% experienced symptomatic fragility fractures (peripheral and vertebral) over the course of their disease. Conclusion The prevalence of low BMD is high in SLE patients, and is associated with older age and higher cumulative glucocorticoid dose.
NASA Technical Reports Server (NTRS)
Ganapol, Barry D.; Townsend, Lawrence W.; Wilson, John W.
1989-01-01
Nontrivial benchmark solutions are developed for the galactic ion transport (GIT) equations in the straight-ahead approximation. These equations are used to predict potential radiation hazards in the upper atmosphere and in space. Two levels of difficulty are considered: (1) energy independent, and (2) spatially independent. The analysis emphasizes analytical methods never before applied to the GIT equations. Most of the representations derived have been numerically implemented and compared to more approximate calculations. Accurate ion fluxes are obtained (3 to 5 digits) for nontrivial sources. For monoenergetic beams, both accurate doses and fluxes are found. The benchmarks presented are useful in assessing the accuracy of transport algorithms designed to accommodate more complex radiation protection problems. In addition, these solutions can provide fast and accurate assessments of relatively simple shield configurations.
Shao, Kan; Small, Mitchell J
2011-10-01
A methodology is presented for assessing the information value of an additional dosage experiment in existing bioassay studies. The analysis demonstrates the potential reduction in the uncertainty of toxicity metrics derived from expanded studies, providing insights for future studies. Bayesian methods are used to fit alternative dose-response models using Markov chain Monte Carlo (MCMC) simulation for parameter estimation and Bayesian model averaging (BMA) is used to compare and combine the alternative models. BMA predictions for benchmark dose (BMD) are developed, with uncertainty in these predictions used to derive the lower bound BMDL. The MCMC and BMA results provide a basis for a subsequent Monte Carlo analysis that backcasts the dosage where an additional test group would have been most beneficial in reducing the uncertainty in the BMD prediction, along with the magnitude of the expected uncertainty reduction. Uncertainty reductions are measured in terms of reduced interval widths of predicted BMD values and increases in BMDL values that occur as a result of this reduced uncertainty. The methodology is illustrated using two existing data sets for TCDD carcinogenicity, fitted with two alternative dose-response models (logistic and quantal-linear). The example shows that an additional dose at a relatively high value would have been most effective for reducing the uncertainty in BMA BMD estimates, with predicted reductions in the widths of uncertainty intervals of approximately 30%, and expected increases in BMDL values of 5-10%. The results demonstrate that dose selection for studies that subsequently inform dose-response models can benefit from consideration of how these models will be fit, combined, and interpreted. © 2011 Society for Risk Analysis.
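To make the model-averaging step concrete, here is a minimal sketch of how BMA combines model-specific BMD posteriors into a single BMD distribution and BMDL. The posterior draws and model weights are hypothetical stand-ins for the MCMC output and marginal-likelihood (or BIC-approximated) weights described above.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical MCMC draws of the BMD under each dose-response model
# (in a real analysis these come from fitting the logistic and
# quantal-linear models to the bioassay data).
bmd_logistic = rng.lognormal(mean=np.log(12.0), sigma=0.25, size=20_000)
bmd_qlinear = rng.lognormal(mean=np.log(9.0), sigma=0.40, size=20_000)

# Hypothetical posterior model probabilities.
w_logistic = 0.6

# BMA posterior: a mixture of the model-specific posteriors, sampled
# in proportion to the model weights.
n = 20_000
pick = rng.random(n) < w_logistic
bma = np.where(pick, rng.choice(bmd_logistic, n), rng.choice(bmd_qlinear, n))

bmdl = np.percentile(bma, 5)          # lower 5th percentile -> BMDL
ci = np.percentile(bma, [2.5, 97.5])  # interval width tracks uncertainty
print(f"BMDL = {bmdl:.2f}, 95% interval = {ci.round(2)}")
```

Narrowing this interval (and raising the BMDL) is exactly the uncertainty reduction the backcasting analysis quantifies when it asks where an extra dose group would have helped most.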
Poet, T S; Schlosser, P M; Rodriguez, C E; Parod, R J; Rodwell, D E; Kirman, C R
2016-04-01
The developmental effects of NMP are well studied in Sprague-Dawley rats following oral, inhalation, and dermal routes of exposure. Short-term and chronic occupational exposure limit (OEL) values were derived using an updated physiologically based pharmacokinetic (PBPK) model for NMP, along with benchmark dose modeling. Two suitable developmental endpoints were evaluated for human health risk assessment: (1) for acute exposures, the increased incidence of skeletal malformations, an effect noted only at oral doses that were toxic to the dam and fetus; and (2) for repeated exposures to NMP, changes in fetal/pup body weight. Where possible, data from multiple studies were pooled to increase the predictive power of the dose-response data sets. For the purposes of internal dose estimation, the window of susceptibility was estimated for each endpoint and used in the dose-response modeling. A point of departure value of 390 mg/L (in terms of peak NMP in blood) was calculated for skeletal malformations based on pooled data from oral and inhalation studies. Acceptable dose-response model fits were not obtained using the pooled data for fetal/pup body weight changes. These data sets were therefore assessed individually, and the geometric mean value obtained from the inhalation studies (470 mg·h/L) was used to derive the chronic OEL. A PBPK model for NMP in humans was used to calculate human equivalent concentrations corresponding to the internal dose point of departure values. Application of a net uncertainty factor of 20-21, which incorporates data-derived extrapolation factors, to the point of departure values yields short-term and chronic occupational exposure limit values of 86 and 24 ppm, respectively. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
Suwazono, Yasushi; Dochi, Mirei; Kobayashi, Etsuko; Oishi, Mitsuhiro; Okubo, Yasushi; Tanaka, Kumihiko; Sakata, Kouichi
2008-12-01
The objective of this study was to calculate benchmark durations and lower 95% confidence limits for benchmark durations of working hours associated with subjective fatigue symptoms by applying the benchmark dose approach while adjusting for job-related stress using multiple logistic regression analyses. A self-administered questionnaire was completed by 3,069 male and 412 female daytime workers (age 18-67 years) in a Japanese steel company. The eight dependent variables in the Cumulative Fatigue Symptoms Index were decreased vitality, general fatigue, physical disorders, irritability, decreased willingness to work, anxiety, depressive feelings, and chronic tiredness. Independent variables were daily working hours, four subscales (job demand, job control, interpersonal relationship, and job suitability) of the Brief Job Stress Questionnaire, and other potential covariates. Using significant parameters for working hours and those for other covariates, the benchmark durations of working hours were calculated for the corresponding Index property. Benchmark response was set at 5% or 10%. Assuming a condition of worst job stress, the benchmark duration/lower 95% confidence limit for benchmark duration of working hours per day with a benchmark response of 5% or 10% were 10.0/9.4 or 11.7/10.7 (irritability) and 9.2/8.9 or 10.4/9.8 (chronic tiredness) in men and 8.9/8.4 or 9.8/8.9 (chronic tiredness) in women. The threshold amounts of working hours for fatigue symptoms under the worst job-related stress were very close to the standard daily working hours in Japan. The results strongly suggest that special attention should be paid to employees whose working hours exceed threshold amounts based on individual levels of job-related stress.
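As a sketch of how a benchmark duration follows from a fitted logistic model, the snippet below solves for the daily working time at which extra risk over the model baseline reaches the benchmark response. The coefficients are hypothetical, and the covariate adjustment (job stress, age, and so on) is collapsed into the intercept for brevity.

```python
import numpy as np
from scipy.optimize import brentq

def extra_risk(hours, b0, b1):
    """Extra risk of a fatigue symptom at a given daily working time,
    from a logistic model P(h) = 1 / (1 + exp(-(b0 + b1*h))).
    Background risk is taken at the model intercept (h = 0)."""
    p = lambda h: 1.0 / (1.0 + np.exp(-(b0 + b1 * h)))
    p0 = p(0.0)
    return (p(hours) - p0) / (1.0 - p0)

def benchmark_duration(b0, b1, bmr=0.05, hmax=24.0):
    """Working hours per day at which extra risk reaches the BMR."""
    return brentq(lambda h: extra_risk(h, b0, b1) - bmr, 1e-6, hmax)

# Hypothetical coefficients evaluated at a fixed covariate pattern
print(benchmark_duration(b0=-3.0, b1=0.25, bmr=0.05))
print(benchmark_duration(b0=-3.0, b1=0.25, bmr=0.10))
```

The lower 95% confidence limit on the benchmark duration would then come from repeating this solve over the sampling distribution of the fitted coefficients.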
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilson, JM; Samei, E; Departments of Physics, Electrical and Computer Engineering, and Biomedical Engineering, and Medical Physics Graduate Program, Duke University, Durham, NC
2016-06-15
Purpose: Recent legislative and accreditation requirements have driven rapid development and implementation of CT radiation dose monitoring solutions. Institutions must determine how to improve quality, safety, and consistency of their clinical performance. The purpose of this work was to design a strategy and meaningful characterization of results from an in-house, clinically deployed dose monitoring solution. Methods: A dose monitoring platform was designed by our imaging physics group that focused on extracting protocol parameters, dose metrics, and patient demographics and size. Compared to most commercial solutions, which focus on individual exam alerts and global thresholds, the program sought to characterize overall consistency and targeted thresholds based on eight analytic interrogations. Those were based on explicit questions related to protocol application, national benchmarks, protocol- and size-specific dose targets, operational consistency, outliers, temporal trends, intra-system variability, and consistent use of electronic protocols. Using historical data since the start of 2013, 95% and 99% intervals were used to establish yellow and amber parameterized dose alert thresholds, respectively, as a function of protocol, scanner, and size. Results: Quarterly reports have been generated for three hospitals for 3 quarters of 2015, totaling 27,880, 28,502, and 30,631 exams, respectively. Four adult and two pediatric protocols were higher than external institutional benchmarks. Four protocol dose levels were being inconsistently applied as a function of patient size. For the three hospitals, the minimum and maximum amber outlier percentages were [1.53%, 2.28%], [0.76%, 1.8%], and [0.94%, 1.17%], respectively. Compared with the electronic protocols, 10 protocols were found to be used with some inconsistency. Conclusion: Dose monitoring can satisfy requirements with global alert thresholds and patient dose records, but the real value is in optimizing patient-specific protocols, balancing the image quality trade-offs that dose-reduction strategies promise, and improving the performance and consistency of a clinical operation. Data plots that capture patient demographics and scanner performance demonstrate that value.
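A minimal sketch of the percentile-based alert thresholds described above, using hypothetical exam records and pandas; the real platform's dose metrics, grouping variables, and protocol names will differ.

```python
import pandas as pd

# Hypothetical exam-level dose records exported from a monitoring
# platform; 'ctdivol' stands in for whatever dose metric is tracked.
exams = pd.DataFrame({
    "protocol": ["abd", "abd", "abd", "head", "head", "head"] * 50,
    "scanner": ["A", "B"] * 150,
    "size_bin": ["S", "M", "L"] * 100,
    "ctdivol": pd.Series(range(300)).astype(float) / 10.0,
})

# Yellow / amber thresholds from historical 95th / 99th percentiles,
# parameterized by protocol, scanner, and patient-size bin.
thresholds = (exams
              .groupby(["protocol", "scanner", "size_bin"])["ctdivol"]
              .quantile([0.95, 0.99])
              .unstack()
              .rename(columns={0.95: "yellow", 0.99: "amber"}))

# Flag new exams against their own protocol/scanner/size thresholds.
flagged = exams.join(thresholds, on=["protocol", "scanner", "size_bin"])
flagged["alert"] = flagged["ctdivol"] > flagged["amber"]
print(thresholds.head())
```

Grouping the thresholds this way is what distinguishes "targeted" alerts from the single global threshold most commercial tools apply.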
Edler, Lutz; Hart, Andy; Greaves, Peter; Carthew, Philip; Coulet, Myriam; Boobis, Alan; Williams, Gary M; Smith, Benjamin
2014-08-01
This article addresses a number of concepts related to the selection and modelling of carcinogenicity data for the calculation of a Margin of Exposure. It follows up on the recommendations put forward by the International Life Sciences Institute - European branch in 2010 on the application of the Margin of Exposure (MoE) approach to substances in food that are genotoxic and carcinogenic. The aims are to provide practical guidance on the relevance of animal tumour data for human carcinogenic hazard assessment, appropriate selection of tumour data for Benchmark Dose Modelling, and approaches for dealing with the uncertainty associated with the selection of data for modelling and, consequently, the derived Point of Departure (PoD) used to calculate the MoE. Although the concepts outlined in this article are interrelated, the background expertise needed to address each topic varies. For instance, the expertise needed to make a judgement on the biological relevance of a specific tumour type is clearly different to that needed to determine the statistical uncertainty around the data used for modelling a benchmark dose. As such, each topic is dealt with separately to allow those with specialised knowledge to target key areas of guidance and provide a more in-depth discussion on each subject for those new to the concept of the Margin of Exposure approach. Copyright © 2013 ILSI Europe. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Jaboulay, Jean-Charles; Brun, Emeric; Hugot, François-Xavier; Huynh, Tan-Dat; Malouch, Fadhel; Mancusi, Davide; Tsilanizara, Aime
2017-09-01
After fission or fusion reactor shutdown, the activated structure emits decay photons. For maintenance operations, the radiation dose map must be established in the reactor building. Several calculation schemes have been developed to calculate the shutdown dose rate. These schemes are widely developed for fusion applications, and more specifically for the ITER tokamak. This paper presents the rigorous-two-steps scheme implemented at CEA. It is based on the TRIPOLI-4® Monte Carlo code and the inventory code MENDEL. The ITER shutdown dose rate benchmark has been carried out; the results are in good agreement with those of the other participants.
Sex-specific effect of Pirin gene on bone mineral density in a cohort of 4000 Chinese.
Tang, Nelson L S; Liao, Chen Di; Ching, Jasmine K L; Suen, Eddie W C; Chan, Iris H S; Orwoll, Eric; Ho, Suzanne C; Chan, Frank W K; Kwok, Anthony W L; Kwok, Timothy; Woo, Jean; Leung, Ping Chung
2010-02-01
Osteoporosis is a common condition among the elderly. Genetic mapping studies have repeatedly located the distal short arm of the X chromosome as a quantitative trait locus (QTL) for BMD in mice. Fine mapping of a syntenic segment on Xp22 in a Caucasian female population suggested a moderate association between lumbar spine (LS) BMD and 2 intronic SNPs in the Pirin (PIR) gene, which encodes an iron-binding nuclear protein. This study aimed to examine genetic variations in the PIR gene by a comprehensive tagging method and their sex-specific effects on BMD and osteoporotic risk. Two thousand men and 2000 women aged 65 or above were recruited from the community. BMDs at the LS, femoral neck, total hip and whole body were measured and followed up at 4 years. Genotyping was performed for tagSNPs of the PIR gene including adjacent regions, and the PIR haplotypes were inferred using the PHASE program. Analysis by linear regression showed a significant association between SNP rs5935970 and LS-BMD, while haplotype T-T-A was significantly associated with BMD at all measured sites. However, none of these associations were found in men. A linear mixed model also confirmed the same sex-specific and site-specific effect for longitudinal BMD changes. In addition to confirming the association between BMDs and the PIR gene, we also revealed that this finding is sex-specific, possibly due to an X-linked effect. This study demonstrated the importance of considering sex and genetic interactions in studies of disease predisposition and complex traits. (c) 2009 Elsevier Inc. All rights reserved.
Chinese Herbal Medicine for Osteoporosis: A Meta-analysis of Randomized Controlled Trials.
Jin, Yong-Xiang; Wu, Peng; Mao, Yi-Fan; Wang, Bo; Zhang, Jia-Feng; Chen, Wen-Liang; Liu, Zhong; Shi, Xiao-Lin
Osteoporosis is a major public health problem in the elderly population. Several studies have suggested that Chinese herbal medicine has antiosteoporotic activities that might be beneficial for osteoporosis. This study aimed to assess the effectiveness of Chinese herbal medicine in osteoporosis patients. We comprehensively searched for randomized controlled trials (until December 2016) that compared Chinese herbal medicine with Western medicine in adults with osteoporosis and reported bone mineral densities (BMDs). A total of 10 randomized controlled trials were included. The pooled results suggested that the increase in spine BMD was lower, but not significantly so, in the Chinese herbal medicine group than in the Western drug group (standard mean difference [SMD] = -0.11, 95% confidence interval [CI]: -0.62 to 0.39, p > 0.05). In the subgroup analysis of postmenopausal women, Chinese herbal medicine showed an insignificantly higher increment in BMD than the control group (SMD = 0.22, 95% CI: -0.00 to 0.43, p = 0.05). For different treatment durations, subgroups over 6 mo (SMD = 0.09, 95% CI: -0.24 to 0.41, p > 0.05) and less than 6 mo (SMD = -0.25, 95% CI: -1.14 to 0.64, p > 0.05) showed comparable BMDs between the 2 therapies. Our study demonstrated that Chinese herbal medicine alone did not significantly increase lumbar spine BMD. Further studies with better adherence to the intervention are needed to confirm the results of this meta-analysis. Copyright © 2017 The International Society for Clinical Densitometry. Published by Elsevier Inc. All rights reserved.
In the media: Burns as a method of assault.
O'Halloran, E; Duke, J; Rea, S; Wood, F
2013-09-01
The aims of this study were to determine whether a change occurred in the pattern of assault burn injury cases admitted to the adult state burns unit, Western Australia, from 2004 to mid-2012, and to compare patient and burn characteristics of adult assault burns with those admitted for unintentional burns. Study data were obtained from the Royal Perth Hospital (RPH) Burns Minimum Dataset (BMDS). Aggregated data on unintentional burn admissions during the same period were provided by the BMDS data manager to enable comparisons with assault burn patients. Assault burn admissions during 2004-2012 accounted for approximately 1% of all adult burn hospitalisations. All assault victims were burned by either thermal or scald agents. A high rate of intubation (24%) and ICU admission (1 in 3 cases) was observed in the fire assault group. The six assault cases undergoing intubation were severe burns, median TBSA 50%, most commonly affecting the face, head and torso; half of these cases had inhalational injuries and also required escharotomies. Comparison of admissions by calendar period showed no statistically significant differences in demographics, burn cause or TBSA%. However, statistically significant differences were found for pre-morbid psychiatric history (15% vs. 58%, p=0.025) and concomitant fractures or dislocations (46% vs. 2%, p=0.011). While the proportion of assault burn admissions per total burn admissions steadily increased from 0.4% in 2009 to 1.5% in mid-2012, this proportion did not exceed the peak level of 2.1% observed in 2004. Crown Copyright © 2013. Published by Elsevier Ltd. All rights reserved.
Cornelius, Iwan; Guatelli, Susanna; Fournier, Pauline; Crosbie, Jeffrey C; Sanchez Del Rio, Manuel; Bräuer-Krisch, Elke; Rosenfeld, Anatoly; Lerch, Michael
2014-05-01
Microbeam radiation therapy (MRT) is a synchrotron-based radiotherapy modality that uses high-intensity beams of spatially fractionated radiation to treat tumours. The rapid evolution of MRT towards clinical trials demands accurate treatment planning systems (TPS), as well as independent tools for the verification of TPS calculated dose distributions in order to ensure patient safety and treatment efficacy. Monte Carlo computer simulation represents the most accurate method of dose calculation in patient geometries and is best suited for the purpose of TPS verification. A Monte Carlo model of the ID17 biomedical beamline at the European Synchrotron Radiation Facility has been developed, including recent modifications, using the Geant4 Monte Carlo toolkit interfaced with the SHADOW X-ray optics and ray-tracing libraries. The code was benchmarked by simulating dose profiles in water-equivalent phantoms subject to irradiation by broad-beam (without spatial fractionation) and microbeam (with spatial fractionation) fields, and comparing against those calculated with a previous model of the beamline developed using the PENELOPE code. Validation against additional experimental dose profiles in water-equivalent phantoms subject to broad-beam irradiation was also performed. Good agreement between codes was observed, with the exception of out-of-field doses and toward the field edge for larger field sizes. Microbeam results showed good agreement between both codes and experimental results within uncertainties. Results of the experimental validation showed agreement for different beamline configurations. The asymmetry in the out-of-field dose profiles due to polarization effects was also investigated, yielding important information for the treatment planning process in MRT. This work represents an important step in the development of a Monte Carlo-based independent verification tool for treatment planning in MRT.
ANALYSES OF NEUROBEHAVIORAL SCREENING DATA: BENCHMARK DOSE ESTIMATION.
Analysis of neurotoxicological screening data such as those of the functional observational battery (FOB) traditionally relies on analysis of variance (ANOVA) with repeated measurements, followed by determination of a no-adverse-effect level (NOAEL). The US EPA has proposed the ...
An Improved Method of Heterogeneity Compensation for the Convolution / Superposition Algorithm
NASA Astrophysics Data System (ADS)
Jacques, Robert; McNutt, Todd
2014-03-01
Purpose: To improve the accuracy of convolution/superposition (C/S) in heterogeneous material by developing a new algorithm: heterogeneity compensated superposition (HCS). Methods: C/S has proven to be a good estimator of the dose deposited in a homogeneous volume. However, near heterogeneities electron disequilibrium occurs, leading to a faster fall-off and re-buildup of dose. We propose to filter the actual patient density in a position- and direction-sensitive manner, allowing the dose deposited near interfaces to be increased or decreased relative to C/S. We implemented the effective density function as a multivariate first-order recursive filter and incorporated it into a GPU-accelerated, multi-energetic C/S implementation. We compared HCS against C/S using the ICCR 2000 Monte Carlo accuracy benchmark, 23 similar accuracy benchmarks and 5 patient cases. Results: Multi-energetic HCS increased the dosimetric accuracy for the vast majority of voxels; in many cases near-Monte Carlo results were achieved. We defined the per-voxel error, %|mm, as the minimum of the distance to agreement in mm and the dosimetric percentage error relative to the maximum MC dose. HCS improved the average mean error by 0.79 %|mm for the patient volumes, reducing the average mean error from 1.93 %|mm to 1.14 %|mm. Very low densities (i.e., <0.1 g/cm³) remained problematic, but may be solvable with a better filter function. Conclusions: HCS improved upon C/S's density-scaled heterogeneity correction with a position- and direction-sensitive density filter. This method significantly improved the accuracy of the GPU-based algorithm, reaching the accuracy levels of Monte Carlo based methods with performance in a few tenths of a second per beam. Acknowledgement: Funding for this research was provided by the NSF Cooperative Agreement EEC9731748, Elekta / IMPAC Medical Systems, Inc. and the Johns Hopkins University. James Satterthwaite provided the Monte Carlo benchmark simulations.
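The effective-density idea can be sketched in one dimension: a causal first-order recursive filter applied along the ray in each direction, with different smoothing constants standing in for the position- and direction-sensitive behavior described above. This is an illustrative reading of the approach under stated assumptions, not the published multivariate filter; the coefficients are hypothetical.

```python
import numpy as np

def effective_density(rho, alpha_fwd=0.6, alpha_bwd=0.2):
    """Direction-sensitive first-order recursive filtering of density
    along one ray (a 1D sketch only). Separate forward and backward
    passes let the filtered density, and hence the dose, change
    differently on the two sides of an interface."""
    fwd = np.empty_like(rho)
    fwd[0] = rho[0]
    for i in range(1, len(rho)):           # downstream smoothing
        fwd[i] = alpha_fwd * rho[i] + (1 - alpha_fwd) * fwd[i - 1]
    bwd = np.empty_like(rho)
    bwd[-1] = rho[-1]
    for i in range(len(rho) - 2, -1, -1):  # upstream smoothing
        bwd[i] = alpha_bwd * rho[i] + (1 - alpha_bwd) * bwd[i + 1]
    return 0.5 * (fwd + bwd)

# Water slab with a low-density (lung-like) insert
ray = np.r_[np.ones(20), 0.25 * np.ones(20), np.ones(20)]
print(effective_density(ray).round(2))
```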
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crowhurst, James A; Whitby, Mark
Radiation dose to patients undergoing invasive coronary angiography (ICA) is relatively high. Guidelines suggest that a local benchmark or diagnostic reference level (DRL) be established for these procedures. This study sought to create a DRL for ICA procedures in Queensland public hospitals. Data were collected for all cardiac catheter laboratories in Queensland public hospitals, covering diagnostic coronary angiography (CA) and single-vessel percutaneous intervention (PCI) procedures. Dose area product (P_KA), skin surface entrance dose (K_AR), fluoroscopy time (FT), and patient height and weight were collected for 3 months. The DRL was set from the 75th percentile of the P_KA. 2590 patients were included in the CA group, where the median FT was 3.5 min (inter-quartile range = 2.3-6.1), median K_AR = 581 mGy (374-876), and median P_KA = 3908 µGy·m² (2489-5865); DRL = 5865 µGy·m². 947 patients were included in the PCI group, where median FT was 11.2 min (7.7-17.4), median K_AR = 1501 mGy (928-2224), and median P_KA = 8736 µGy·m² (5449-12,900); DRL = 12,900 µGy·m². This study established a benchmark for radiation dose for diagnostic and interventional coronary angiography in Queensland public facilities.
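Computing the DRL itself is a one-liner once per-procedure dose records are pooled; a sketch with simulated P_KA values (lognormal placeholders, not the study data):

```python
import numpy as np

# Hypothetical per-procedure dose-area products (uGy.m2) for one
# procedure type, pooled across the participating laboratories.
pka = np.random.default_rng(0).lognormal(np.log(3900), 0.6, 2590)

median = np.median(pka)
q1, q3 = np.percentile(pka, [25, 75])
drl = np.percentile(pka, 75)  # DRL = 75th percentile of the P_KA

print(f"median P_KA = {median:.0f} uGy.m2 (IQR {q1:.0f}-{q3:.0f}), "
      f"DRL = {drl:.0f} uGy.m2")
```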
Kawamoto, Taisuke; Ito, Yuichi; Morita, Osamu; Honda, Hiroshi
2017-01-01
Cholestasis is one of the major causes of drug-induced liver injury (DILI), which can result in withdrawal of approved drugs from the market. Early identification of cholestatic drugs is difficult due to the complex mechanisms involved. In order to develop a strategy for mechanism-based risk assessment of cholestatic drugs, we analyzed gene expression data obtained from the livers of rats that had been orally administered 12 known cholestatic compounds repeatedly for 28 days at three dose levels. Qualitative analyses were performed using two statistical approaches (hierarchical clustering and principal component analysis), in addition to pathway analysis. The transcriptional benchmark dose (tBMD) and the tBMD 95% lower limit (tBMDL) were used for quantitative analyses, which revealed three compound sub-groups that produced different types of differential gene expression; these groups of genes were mainly involved in inflammation, cholesterol biosynthesis, and oxidative stress. Furthermore, the tBMDL values for each test compound were in good agreement with the relevant no observed adverse effect level. These results indicate that our novel strategy for drug safety evaluation using mechanism-based classification and tBMDL would facilitate the application of toxicogenomics for risk assessment of cholestatic DILI.
Contributions to Integral Nuclear Data in ICSBEP and IRPhEP since ND 2013
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bess, John D.; Briggs, J. Blair; Gulliford, Jim
2016-09-01
The status of the International Criticality Safety Benchmark Evaluation Project (ICSBEP) and the International Reactor Physics Experiment Evaluation Project (IRPhEP) was last discussed directly with the international nuclear data community at ND2013. Since ND2013, the integral benchmark data available for nuclear data testing have continued to increase. The status of the international benchmark efforts and the latest contributions to integral nuclear data for testing are discussed. Select benchmark configurations that have been added to the ICSBEP and IRPhEP Handbooks since ND2013 are highlighted. The 2015 edition of the ICSBEP Handbook now contains 567 evaluations with benchmark specifications for 4,874 critical, near-critical, or subcritical configurations, 31 criticality alarm placement/shielding configurations with multiple dose points apiece, and 207 configurations that have been categorized as fundamental physics measurements relevant to criticality safety applications. The 2015 edition of the IRPhEP Handbook contains data from 143 different experimental series that were performed at 50 different nuclear facilities. Currently 139 of the 143 evaluations are published as approved benchmarks, with the remaining four evaluations published in draft format only. Measurements found in the IRPhEP Handbook include criticality, buckling and extrapolation length, spectral characteristics, reactivity effects, reactivity coefficients, kinetics, reaction-rate distributions, power distributions, isotopic compositions, and/or other miscellaneous types of measurements for various types of reactor systems. Annual technical review meetings for both projects were held in April 2016; additional approved benchmark evaluations will be included in the 2016 editions of these handbooks.
Benchmarking of MCNP for calculating dose rates at an interim storage facility for nuclear waste.
Heuel-Fabianek, Burkhard; Hille, Ralf
2005-01-01
During the operation of research facilities at Research Centre Jülich, Germany, nuclear waste is stored in drums and other vessels in an interim storage building on-site, which has concrete shielding at the side walls. Owing to the lack of a well-defined source, measured gamma spectra were unfolded to determine the photon flux on the surface of the containers. The dose rate simulation, including the effects of skyshine, using the Monte Carlo transport code MCNP is compared with the measured dosimetric data at some locations in the vicinity of the interim storage building. The MCNP data for direct radiation confirm the data calculated using a point-kernel method. However, a comparison of the modelled dose rates for direct radiation and skyshine with the measured data demonstrates the need for a more precise definition of the source. Both the measured and the modelled dose rates confirmed that the legal limits (<1 mSv a⁻¹) are met in the area outside the perimeter fence of the storage building to which members of the public have access. Using container surface data (gamma spectra) to define the source may be a useful tool for practical calculations, and additionally for benchmarking of computer codes, if the discussed critical aspects with respect to the source can be addressed adequately.
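The point-kernel comparison mentioned above reduces, for a single source point, to attenuated inverse-square transport times a flux-to-dose conversion. A schematic sketch follows; all parameter values are hypothetical, and a real calculation sums over many source points and includes build-up and skyshine.

```python
import numpy as np

def point_kernel_dose_rate(S, mu_cm, t_cm, r_m, k):
    """Schematic point-kernel estimate of the photon dose rate from a
    single source point behind a slab shield (build-up ignored; a
    build-up factor B(mu*t) would multiply the attenuated term in a
    practical calculation).

    S     : source strength (photons/s)
    mu_cm : linear attenuation coefficient of the shield (1/cm)
    t_cm  : shield thickness (cm)
    r_m   : source-to-detector distance (m)
    k     : flux-to-dose-rate conversion (uSv/h per photon/cm2/s)
    """
    flux = S * np.exp(-mu_cm * t_cm) / (4.0 * np.pi * (100.0 * r_m) ** 2)
    return k * flux

# Hypothetical example: Cs-137-like photons behind 30 cm of concrete
print(point_kernel_dose_rate(S=1e9, mu_cm=0.15, t_cm=30.0,
                             r_m=10.0, k=2.1e-3))
```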
U.S. EPA Superfund Program's Policy for Risk and Dose Assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walker, Stuart
2008-01-15
The Environmental Protection Agency (EPA) Office of Superfund Remediation and Technology Innovation (OSRTI) has primary responsibility for implementing the long-term (non-emergency) portion of a key U.S. law regulating cleanup: the Comprehensive Environmental Response, Compensation and Liability Act, CERCLA, nicknamed 'Superfund'. The purpose of the Superfund program is to protect human health and the environment over the long term from releases or potential releases of hazardous substances from abandoned or uncontrolled hazardous waste sites. The focus of this paper is on risk and dose assessment policies and tools for addressing radioactively contaminated sites by the Superfund program. EPA has almost completed two risk assessment tools that are particularly relevant to decommissioning activities conducted under CERCLA authority. These are: 1. the Building Preliminary Remediation Goals for Radionuclides (BPRG) electronic calculator, and 2. the Radionuclide Outdoor Surfaces Preliminary Remediation Goals (SPRG) electronic calculator. EPA developed the BPRG calculator to help standardize the evaluation and cleanup of radiologically contaminated buildings at which risk is being assessed for occupancy. BPRGs are radionuclide concentrations in dust, air and building materials that correspond to a specified level of human cancer risk. The intent of the SPRG calculator is to address hard outside surfaces such as building slabs, outside building walls, sidewalks and roads. SPRGs are radionuclide concentrations in dust and hard outside surface materials. EPA is also developing the 'Radionuclide Ecological Benchmark' calculator. This calculator provides biota concentration guides (BCGs), also known as ecological screening benchmarks, for use in ecological risk assessments at CERCLA sites. This calculator is intended to develop ecological benchmarks as part of the EPA guidance 'Ecological Risk Assessment Guidance for Superfund: Process for Designing and Conducting Ecological Risk Assessments'. The calculator develops ecological benchmarks for ionizing radiation based on cell death only.
Practical examples of modeling choices and their consequences for risk assessment
Although benchmark dose (BMD) modeling has become the preferred approach to identifying a point of departure (POD) over the No Observed Adverse Effect Level, there remain challenges to its application in human health risk assessment. BMD modeling, as currently implemented by the...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jarabek, A.M.; Menache, M.G.; Overton, J.H. Jr.
1990-10-01
The U.S. Environmental Protection Agency (U.S. EPA) has advocated the establishment of general and scientific guidelines for the evaluation of toxicological data and their use in deriving benchmark values to protect exposed populations from adverse health effects. The Agency's reference dose (RfD) methodology for deriving benchmark values for noncancer toxicity originally addressed risk assessment of oral exposures. This paper presents a brief background on the development of the inhalation reference dose (RfDi) methodology, including concepts and issues related to addressing the dynamics of the respiratory system as the portal of entry. Different dosimetric adjustments are described that were incorporated into the methodology to account for the nature of the inhaled agent (particle or gas) and the site of the observed toxic effects (respiratory or extra-respiratory). Impacts of these adjustments on the extrapolation of toxicity data of inhaled agents for human health risk assessment and future research directions are also discussed.
U. S. Environmental Protection Agency's inhalation RFD methodology: Risk assessment for air toxics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jarabek, A.M.; Menache, M.G.; Overton, J.H.
1989-01-01
The U.S. Environmental Protection Agency (U.S. EPA) has advocated the establishment of general and scientific guidelines for the evaluation of toxicological data and their use in deriving benchmark values to protect exposed populations from adverse health effects. The Agency's reference dose (RfD) methodology for deriving benchmark values for noncancer toxicity originally addressed risk assessment of oral exposures. The paper presents a brief background on the development of the inhalation reference dose (RFDi) methodology, including concepts and issues related to addressing the dynamics of the respiratory system as the portal of entry. Different dosimetric adjustments are described that were incorporated into the methodology to account for the nature of the inhaled agent (particle or gas) and the site of the observed toxic effects (respiratory or extrarespiratory). Impacts of these adjustments on the extrapolation of toxicity data of inhaled agents for human health risk assessment and future research directions are also discussed.
Lee, Kam L; Bernardo, Michael; Ireland, Timothy A
2016-06-01
This is part two of a two-part study in benchmarking system performance of fixed digital radiographic systems. The study compares the system performance of seven fixed digital radiography systems based on quantitative metrics like modulation transfer function (sMTF), normalised noise power spectrum (sNNPS), detective quantum efficiency (sDQE) and entrance surface air kerma (ESAK). It was found that the most efficient image receptors (greatest sDQE) were not necessarily operating at the lowest ESAK. In part one of this study, sMTF is shown to depend on system configuration while sNNPS is shown to be relatively consistent across systems. Systems are ranked on their signal-to-noise ratio efficiency (sDQE) and their ESAK. Systems using the same equipment configuration do not necessarily have the same system performance. This implies radiographic practice at the site will have an impact on the overall system performance. In general, systems are more dose efficient at low dose settings.
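The metrics compared above are tied together by the standard relation DQE(f) = MTF(f)² / (q·NNPS(f)), with q the incident photon fluence for the beam quality and air kerma used. A toy sketch of that calculation follows; the curve shapes and the fluence value are placeholders, not the study's measured data.

```python
import numpy as np

def s_dqe(s_mtf, s_nnps, q):
    """System DQE from system MTF and normalised NPS via the standard
    relation DQE(f) = MTF(f)^2 / (q * NNPS(f)), where q is the incident
    photon fluence (photons/mm^2) at the detector."""
    return s_mtf ** 2 / (q * s_nnps)

f = np.linspace(0.05, 3.0, 60)       # spatial frequency (1/mm)
s_mtf = np.abs(np.sinc(f / 3.2))     # toy MTF shape
s_nnps = np.full_like(f, 4.0e-6)     # toy white normalised NPS (mm^2)
print(s_dqe(s_mtf, s_nnps, q=2.5e5)[:5].round(3))
```

Because q scales with ESAK, this relation also makes plain why the most dose-efficient detector (highest sDQE) need not be the one operated at the lowest ESAK.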
Faddegon, Bruce A.; Shin, Jungwook; Castenada, Carlos M.; Ramos-Méndez, José; Daftari, Inder K.
2015-01-01
Purpose: To measure depth dose curves for a 67.5 ± 0.1 MeV proton beam for benchmarking and validation of Monte Carlo simulation. Methods: Depth dose curves were measured in 2 beam lines. Protons in the raw beam line traversed a Ta scattering foil, 0.1016 or 0.381 mm thick, a secondary emission monitor comprised of thin Al foils, and a thin Kapton exit window. The beam energy and peak width and the composition and density of material traversed by the beam were known with sufficient accuracy to permit benchmark quality measurements. Diodes for charged particle dosimetry from two different manufacturers were used to scan the depth dose curves with 0.003 mm depth reproducibility in a water tank placed 300 mm from the exit window. Depth in water was determined with an uncertainty of 0.15 mm, including the uncertainty in the water equivalent depth of the sensitive volume of the detector. Parallel-plate chambers were used to verify the accuracy of the shape of the Bragg peak and the peak-to-plateau ratio measured with the diodes. The uncertainty in the measured peak-to-plateau ratio was 4%. Depth dose curves were also measured with a diode for a Bragg curve and treatment beam spread out Bragg peak (SOBP) on the beam line used for eye treatment. The measurements were compared to Monte Carlo simulation done with geant4 using topas. Results: The 80% dose at the distal side of the Bragg peak for the thinner foil was at 37.47 ± 0.11 mm (average of measurements with diodes from two different manufacturers), compared to the simulated value of 37.20 mm. The 80% dose for the thicker foil was at 35.08 ± 0.15 mm, compared to the simulated value of 34.90 mm. The measured peak-to-plateau ratio was within one standard deviation experimental uncertainty of the simulated result for the thinner foil and two standard deviations for the thicker foil. It was necessary to include the collimation in the simulation, which had a more pronounced effect on the peak-to-plateau ratio for the thicker foil. The treatment beam, being unfocussed, had a broader Bragg peak than the raw beam. A 1.3 ± 0.1 MeV FWHM peak width in the energy distribution was used in the simulation to match the Bragg peak width. An additional 1.3-2.24 mm of water in the water column was required over the nominal values to match the measured depth penetration. Conclusions: The proton Bragg curve measured for the 0.1016 mm thick Ta foil provided the most accurate benchmark, having a low contribution of proton scatter from upstream of the water tank. The accuracy was 0.15% in measured beam energy and 0.3% in measured depth penetration at the Bragg peak. The depth of the distal edge of the Bragg peak in the simulation fell short of measurement, suggesting that the mean ionization potential of water is 2-5 eV higher than the 78 eV used in the stopping power calculation for the simulation. The eye treatment beam line depth dose curves provide validation of Monte Carlo simulation of a Bragg curve and SOBP with 4%/2 mm accuracy. PMID:26133619
Comment on ‘egs_brachy: a versatile and fast Monte Carlo code for brachytherapy’
NASA Astrophysics Data System (ADS)
Yegin, Gultekin
2018-02-01
In a recent paper, Chamberland et al (2016 Phys. Med. Biol. 61 8214) develop a new Monte Carlo code called egs_brachy for brachytherapy treatments. It is based on EGSnrc and written in the C++ programming language. In order to benchmark the egs_brachy code, the authors use it in various test-case scenarios involving complex geometry conditions. Another EGSnrc-based brachytherapy dose calculation engine, BrachyDose, is used for dose comparisons. The authors fail to prove that egs_brachy can produce reasonable dose values for brachytherapy sources in a given medium. The dose comparisons in the paper are erroneous and misleading. egs_brachy should not be used in any further research studies unless and until all the potential bugs are fixed in the code.
Lenora, Janaka; Lekamwasam, Sarath; Karlsson, Magnus K
2009-07-01
Studies conducted in Western countries have shown that bone loss associated with pregnancy and breast-feeding is recovered after weaning. However, it is not clear whether recovery takes place after repeated pregnancies followed by prolonged periods of breast-feeding, especially in developing countries where nutritional intake is comparatively low. This study was designed to examine the effects of multiparity and prolonged breast-feeding on maternal bone mineral density (BMD) in a community-based sample of 210 Sri Lankan women, aged between 46 and 98 years. BMD of the lumbar spine (L2-L4) and femoral neck was measured by dual-energy X-ray absorptiometry. Reproductive history was recorded using a questionnaire. Women were first divided into groups according to parity (nulliparous, 1-2, 3-4, and 5 or more children), and BMDs in the different groups were compared, initially unadjusted and then adjusted for age. The same subjects were then subdivided according to the total duration of breast-feeding (0, 1-48, 49-96, and 97 months or more) and a similar analysis was carried out. Women who had 5 or more children and women who had breast-fed for 97 months or more were older than the other women (p < 0.01), but no differences in height, weight or BMI were observed among the groups. Age-adjusted lumbar spine and femoral neck BMDs of women grouped according to parity were not significantly different. Neither was there any difference in lumbar spine or femoral neck BMD between groups based on duration of breast-feeding. From this population-based study conducted in a developing country, we infer that a history of multiparity or prolonged breast-feeding has no detrimental effect on post-menopausal maternal BMD.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dourson, M.L.
The quantitative procedures associated with noncancer risk assessment include reference dose (RfD), benchmark dose, and severity modeling. The RfD, which is part of the EPA risk assessment guidelines, is an estimate of an exposure level that is likely to be without appreciable health risk to sensitive individuals. The RfD requires two major judgments: the first is the choice of a critical effect(s) and its No Observed Adverse Effect Level (NOAEL); the second is the choice of an uncertainty factor. This paper discusses major assumptions and limitations of the RfD model.
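Once the two judgments are made, the arithmetic of the RfD model is a single division; a worked toy example with hypothetical numbers:

```python
# Illustrative RfD calculation (all numbers hypothetical): the point of
# departure is the NOAEL of the critical effect, divided by a composite
# uncertainty factor (here 10 for interspecies extrapolation times 10
# for human variability).
noael = 10.0        # mg/kg-day, NOAEL of the critical effect
uf = 10 * 10        # interspecies x intraspecies uncertainty factors
rfd = noael / uf
print(f"RfD = {rfd} mg/kg-day")  # -> 0.1 mg/kg-day
```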
Fogliata, Antonella; Nicolini, Giorgia; Clivio, Alessandro; Vanetti, Eugenio; Laksar, Sarbani; Tozzi, Angelo; Scorsetti, Marta; Cozzi, Luca
2015-10-31
To evaluate the performance of a broad-scope model-based optimisation process for volumetric modulated arc therapy applied to esophageal cancer. A set of 70 patients previously treated in two different institutions was selected to train a model for the prediction of dose-volume constraints. The model was built with a broad-scope purpose, aiming to be effective for different dose prescriptions and tumour localisations. It was validated on three groups of patients from the same institutions and from another clinic not providing patients for the training phase. Comparison of the automated plans was made against reference cases given by the clinically accepted plans. Quantitative improvements (statistically significant for the majority of the analysed dose-volume parameters) were observed between the benchmark and the test plans. Of 624 dose-volume objectives assessed for plan evaluation, in 21 cases (3.3%) the reference plans failed to respect the constraints while the model-based plans succeeded. In only 3 cases (<0.5%) did the reference plans pass the criteria while the model-based plans failed. In 5.3% of the cases both groups of plans failed, and in the remaining cases both passed the tests. Plans were optimised using a broad-scope knowledge-based model to determine the dose-volume constraints. The results showed dosimetric improvements when compared to the benchmark data. In particular, the plans optimised for patients from the third centre, which did not participate in the training, were of superior quality. The data suggest that the new engine is reliable and could encourage its application in clinical practice.
Severe iodine deficiency is known to cause adverse health outcomes and remains a benchmark for understanding the effects of hypothyroidism. However, the implications of marginal iodine deficiency on function of the thyroid axis remain less well known. The current study examined t...
Guillén, J; Baeza, A; Beresford, N A; Wood, M D
2017-09-01
Fungi are used as biomonitors of forest ecosystems, having comparatively high uptakes of anthropogenic and naturally occurring radionuclides. However, whilst they are known to accumulate radionuclides, they are not typically considered in radiological assessment tools for environmental (non-human biota) assessment. In this paper the total dose rate to fungi is estimated using the ERICA Tool, assuming different fruiting body geometries: a single ellipsoid, and more complex geometries considering the different components of the fruit body and their differing radionuclide contents based upon measurement data. Anthropogenic and naturally occurring radionuclide concentrations from the Mediterranean ecosystem (Spain) were used in this assessment. The total estimated weighted dose rate was in the range 0.31-3.4 μGy/h (5th-95th percentile), similar to natural exposure rates reported for other wild groups. The total estimated dose was dominated by internal exposure, especially from 226Ra and 210Po. Differences in dose rate between complex geometries and a simple ellipsoid model were negligible. Therefore, the simple ellipsoid model is recommended to assess dose rates to fungal fruiting bodies. Fungal mycelium was also modelled assuming a long filament. Using these geometries, assessments for fungal fruiting bodies and mycelium under different scenarios (post-accident, planned release and existing exposure) were conducted, each being based on available monitoring data. The estimated total dose rate in each case was below the ERICA screening benchmark dose rate, except for the example post-accident existing exposure scenario (the Chernobyl Exclusion Zone), for which a dose rate in excess of 35 μGy/h was estimated for the fruiting body. The estimated mycelium dose rate in this post-accident existing exposure scenario was close to the 400 μGy/h benchmark for plants, although fungi are generally considered to be less radiosensitive than plants. Further research on appropriate mycelium geometries and their radionuclide content is required. Based on the assessments presented in this paper, there is no need to recommend that fungi be added to the existing assessment tools and frameworks; if required, some tools allow a geometry representing fungi to be created and used within a dose assessment. Copyright © 2017 Elsevier Ltd. All rights reserved.
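A sketch of the weighted total dose rate computation for a single ellipsoid geometry, in the spirit of the ERICA Tool: activity concentrations are multiplied by internal and external dose conversion coefficients (DCCs) and summed over radionuclides. All coefficient values are hypothetical placeholders rather than the ERICA Tool's tabulated DCCs, and applying one weighting factor per nuclide (rather than per radiation type) is a simplification.

```python
import numpy as np

def weighted_dose_rate(act_int, dcc_int, act_soil, dcc_ext, wf):
    """Total weighted dose rate (uGy/h) to an ellipsoidal organism:
    sum over radionuclides of internal and external contributions,
    each scaled by a radiation weighting factor (simplified to one
    factor per nuclide here)."""
    internal = act_int * dcc_int    # Bq/kg * (uGy/h per Bq/kg)
    external = act_soil * dcc_ext   # Bq/kg soil * (uGy/h per Bq/kg)
    return float(np.sum(wf * (internal + external)))

# Hypothetical example: (Cs-137, Ra-226, Po-210) in a fruiting body
act_int  = np.array([350.0, 40.0, 60.0])       # Bq/kg fresh weight
dcc_int  = np.array([3.0e-4, 5.0e-3, 8.0e-3])  # internal DCCs
act_soil = np.array([500.0, 30.0, 30.0])       # Bq/kg dry soil
dcc_ext  = np.array([1.5e-4, 2.0e-4, 1.0e-6])  # external DCCs
wf       = np.array([1.0, 10.0, 10.0])         # e.g. alpha weighting

print(weighted_dose_rate(act_int, dcc_int, act_soil, dcc_ext, wf))
```

With DCC values of this form, the dominance of internal exposure from the alpha emitters 226Ra and 210Po reported above follows directly from the weighting.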
Ciesielski, Bartlomiej; Marciniak, Agnieszka; Zientek, Agnieszka; Krefft, Karolina; Cieszyński, Mateusz; Boguś, Piotr; Prawdzik-Dampc, Anita
2016-12-01
This study examines the accuracy of EPR dosimetry in bones based on deconvolution of the experimental spectra into background (BG) and radiation-induced signal (RIS) components. The model RISs were represented by EPR spectra from irradiated enamel or bone powder, and the model BG signals by EPR spectra of unirradiated bone samples or by simulated spectra. Samples of compact and trabecular bones were irradiated in the 30-270 Gy range and the intensities of their RISs were calculated using various combinations of those benchmark spectra. The relationships between the dose and the RIS were linear (R² > 0.995), with practically no difference between results obtained when using signals from irradiated enamel or bone as the model RIS. Use of different experimental spectra for the model BG resulted in variations in the intercepts of the dose-RIS calibration lines, leading to systematic errors in reconstructed doses, in particular for high-BG samples of trabecular bone. These errors were reduced when simulated spectra, instead of the experimental ones, were used as the benchmark BG signal in the applied deconvolution procedures. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
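The deconvolution step described above amounts to a linear least-squares fit of the measured spectrum onto the two benchmark components; a minimal sketch with toy line shapes (the real spectra are recorded, not analytic):

```python
import numpy as np

def unfold_spectrum(measured, ris_model, bg_model):
    """Decompose a measured EPR spectrum into benchmark components by
    linear least squares: measured ~ a*RIS + b*BG. The RIS amplitude
    'a' is then converted to dose through a calibration line built
    from samples irradiated to known doses."""
    A = np.column_stack([ris_model, bg_model])
    (a, b), *_ = np.linalg.lstsq(A, measured, rcond=None)
    return a, b

# Toy spectra (hypothetical shapes standing in for recorded spectra)
x = np.linspace(-1.0, 1.0, 400)
ris = -x * np.exp(-40.0 * x**2)  # derivative-like EPR line
bg = np.exp(-2.0 * x**2)         # broad native background
noise = np.random.default_rng(0).normal(0.0, 0.01, x.size)
measured = 3.2 * ris + 0.9 * bg + noise

a, b = unfold_spectrum(measured, ris, bg)
print(round(a, 2), round(b, 2))  # ~3.2, ~0.9
```

The study's finding maps onto this picture directly: a biased bg_model shifts the recovered intercept of the dose-RIS calibration line, which is why simulated BG spectra reduced the systematic error.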
Benchmark Dose Software Development and Maintenance Ten Berge Cxt Models
This report is intended to provide an overview of beta version 1.0 of the implementation of a concentration-time (CxT) model originally programmed and provided by Wil ten Berge (referred to hereafter as the ten Berge model). The recoding and development described here represent ...
Al-Hallaq, Hania A; Chmura, Steven; Salama, Joseph K; Winter, Kathryn A; Robinson, Clifford G; Pisansky, Thomas M; Borges, Virginia; Lowenstein, Jessica R; McNulty, Susan; Galvin, James M; Followill, David S; Timmerman, Robert D; White, Julia R; Xiao, Ying; Matuszak, Martha M
In 2014, the NRG Oncology Group initiated the first National Cancer Institute-sponsored, phase 1 clinical trial of stereotactic body radiation therapy (SBRT) for the treatment of multiple metastases in multiple organ sites (BR001; NCT02206334). The primary endpoint is to test the safety of SBRT for the treatment of 2 to 4 multiple lesions in several anatomic sites in a multi-institutional setting. Because of the technical challenges inherent to treating multiple lesions as their spatial separation decreases, we present the technical requirements for NRG-BR001 and the rationale for their selection. Patients with controlled primary tumors of breast, non-small cell lung, or prostate are eligible if they have 2 to 4 metastases distributed among 7 extracranial anatomic locations throughout the body. Prescription and organ-at-risk doses were determined by expert consensus. Credentialing requirements include (1) irradiation of the Imaging and Radiation Oncology Core phantom with SBRT, (2) submitting image guided radiation therapy case studies, and (3) planning the benchmark. Guidelines for navigating challenging planning cases including assessing composite dose are discussed. Dosimetric planning to multiple lesions receiving differing doses (45-50 Gy) and fractionation (3-5) while irradiating the same organs at risk is discussed, particularly for metastases in close proximity (≤5 cm). The benchmark case was selected to demonstrate the planning tradeoffs required to satisfy protocol requirements for 2 nearby lesions. Examples of passing benchmark plans exhibited a large variability in plan conformity. NRG-BR001 was developed using expert consensus on multiple issues from the dose fractionation regimen to the minimum image guided radiation therapy guidelines. Credentialing was tied to the task rather than the anatomic site to reduce its burden. Every effort was made to include a variety of delivery methods to reflect current SBRT technology. Although some simplifications were adopted, the successful completion of this trial will inform future designs of both national and institutional trials and would allow immediate clinical adoption of SBRT trials for oligometastases. Copyright © 2016 American Society for Radiation Oncology. Published by Elsevier Inc. All rights reserved.
Benchmark dose and the three Rs. Part I. Getting more information from the same number of animals.
Slob, Wout
2014-08-01
Evaluating dose-response data using the benchmark dose (BMD) approach rather than the no observed adverse effect level (NOAEL) approach implies a considerable step forward from the perspective of the three Rs (Reduction, Replacement, and Refinement), in particular the R of reduction: more information is obtained from the same number of animals, or, vice versa, similar information may be obtained from fewer animals. The first part of this twin paper focuses on the former aspect, the second on the latter. Regarding the former, the BMD approach provides more information from any given dose-response dataset in various ways. First, the BMDL (the BMD lower confidence bound) provides more information through its more explicit definition. Further, compared to the NOAEL approach, the BMD approach results in greater statistical precision in the value of the point of departure (PoD) used for deriving exposure limits. While some of the animals in a study do not directly contribute to the numerical value of a NOAEL, all animals are effectively used and do contribute to a BMDL. In addition, the BMD approach allows similar datasets for the same chemical (e.g., both sexes) to be combined in a single analysis, which further increases precision. By combining a dose-response dataset with similar historical data for other chemicals, the precision can be increased even more substantially. Further, the BMD approach results in more precise estimates of relative potency factors (RPFs, or TEFs). And finally, the BMD approach is not only more precise, it also allows the precision of the BMD estimate to be quantified, which is not possible in the NOAEL approach.
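To make the contrast concrete, here is a generic illustration of the BMD idea (not the specific software or models used in the paper): a dose-response model is fitted to all observations and inverted at the benchmark response, so every animal shapes the estimate. The doses, responses, and exponential model below are hypothetical:

```python
import numpy as np
from scipy.optimize import curve_fit

def expon_model(dose, a, b):
    # illustrative continuous dose-response model: y = a * exp(b * dose)
    return a * np.exp(b * dose)

dose = np.array([0.0, 10.0, 30.0, 100.0, 300.0])  # hypothetical dose groups
resp = np.array([1.00, 0.97, 0.90, 0.75, 0.50])   # hypothetical mean responses

(a, b), _ = curve_fit(expon_model, dose, resp, p0=(1.0, -0.002))
bmr = 0.05                    # benchmark response: 5% change from background
bmd = np.log(1.0 - bmr) / b   # dose at which response = a * (1 - bmr)
print(f"BMD at {bmr:.0%} BMR: {bmd:.1f}")
```

A BMDL would then be obtained as a lower confidence bound on this estimate, e.g., via profile likelihood or bootstrap.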
Webster, A. Francina; Chepelev, Nikolai; Gagné, Rémi; Kuo, Byron; Recio, Leslie; Williams, Andrew; Yauk, Carole L.
2015-01-01
Many regulatory agencies are exploring ways to integrate toxicogenomic data into their chemical risk assessments. The major challenge lies in determining how to distill the complex data produced by high-content, multi-dose gene expression studies into quantitative information. It has been proposed that benchmark dose (BMD) values derived from toxicogenomics data be used as point of departure (PoD) values in chemical risk assessments. However, there is limited information regarding which genomics platforms are most suitable and how to select appropriate PoD values. In this study, we compared BMD values modeled from RNA sequencing-, microarray-, and qPCR-derived gene expression data from a single study, and explored multiple approaches for selecting a single PoD from these data. The strategies evaluated include several that do not require prior mechanistic knowledge of the compound for selection of the PoD, thus providing approaches for assessing data-poor chemicals. We used RNA extracted from the livers of female mice exposed to non-carcinogenic (0 and 2 mg/kg/day [mkd]) and carcinogenic (4 and 8 mkd) doses of furan for 21 days. We show that transcriptional BMD values were consistent across technologies and highly predictive of the two-year cancer bioassay-based PoD. We also demonstrate that filtering data based on statistically significant changes in gene expression prior to BMD modeling creates more conservative BMD values. Taken together, this case study on mice exposed to furan demonstrates that high-content toxicogenomics studies produce robust data for BMD modeling that are minimally affected by inter-technology variability and highly predictive of cancer-based PoD doses. PMID:26313361
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sample, B.E.; Opresko, D.M.; Suter, G.W.
Ecological risks of environmental contaminants are evaluated using a two-tiered process. In the first tier, a screening assessment is performed in which concentrations of contaminants in the environment are compared to no observed adverse effects level (NOAEL)-based toxicological benchmarks. These benchmarks represent concentrations of chemicals in environmental media (water, sediment, soil, food, etc.) presumed to be nonhazardous to the biota. While exceedance of these benchmarks does not indicate any particular level or type of risk, concentrations below the benchmarks should not result in significant effects. In practice, when contaminant concentrations in food or water resources are less than these toxicological benchmarks, the contaminants may be excluded from further consideration. However, if the concentration of a contaminant exceeds a benchmark, that contaminant should be retained as a contaminant of potential concern (COPC) and investigated further. The second tier in ecological risk assessment, the baseline ecological risk assessment, may use toxicological benchmarks as part of a weight-of-evidence approach (Suter 1993). Under this approach, toxicological benchmarks are one of several lines of evidence used to support or refute the presence of ecological effects. Other sources of evidence include media toxicity tests, surveys of biota (abundance and diversity), measures of contaminant body burdens, and biomarkers. This report presents NOAEL- and lowest observed adverse effects level (LOAEL)-based toxicological benchmarks for assessing the effects of 85 chemicals on 9 representative mammalian wildlife species (short-tailed shrew, little brown bat, meadow vole, white-footed mouse, cottontail rabbit, mink, red fox, and whitetail deer) and 11 avian wildlife species (American robin, rough-winged swallow, American woodcock, wild turkey, belted kingfisher, great blue heron, barred owl, barn owl, Cooper's hawk, red-tailed hawk, and osprey); scientific names for both the mammalian and avian species are presented in Appendix B. [In this document, NOAEL refers to both dose (mg contaminant per kg animal body weight per day) and concentration (mg contaminant per kg of food or L of drinking water).] The 20 wildlife species were chosen because they are widely distributed and provide a representative range of body sizes and diets. The chemicals are some of those that occur at U.S. Department of Energy (DOE) waste sites. The NOAEL-based benchmarks presented in this report represent values believed to be nonhazardous for the listed wildlife species; LOAEL-based benchmarks represent threshold levels at which adverse effects are likely to become evident. These benchmarks consider contaminant exposure through oral ingestion of contaminated media only. Exposure through inhalation and/or direct dermal contact is not considered in this report.
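The first-tier screen reduces to a simple comparison of estimated exposures against the tabulated benchmarks. A minimal sketch, with hypothetical chemicals and values:

```python
# First-tier screen: compare an exposure estimate against a NOAEL-based
# benchmark and retain exceedances as contaminants of potential concern.
def screen_copcs(exposures, benchmarks):
    """exposures/benchmarks: dicts of chemical -> dose (mg/kg body wt/day)."""
    return [chem for chem, dose in exposures.items()
            if dose > benchmarks.get(chem, float("inf"))]

exposures = {"cadmium": 0.8, "zinc": 12.0}    # hypothetical daily doses
benchmarks = {"cadmium": 1.0, "zinc": 9.6}    # hypothetical NOAEL benchmarks
print(screen_copcs(exposures, benchmarks))    # -> ['zinc']
```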
Experimental validation of the TOPAS Monte Carlo system for passive scattering proton therapy
Testa, M.; Schümann, J.; Lu, H.-M.; Shin, J.; Faddegon, B.; Perl, J.; Paganetti, H.
2013-01-01
Purpose: TOPAS (TOol for PArticle Simulation) is a particle simulation code recently developed with the specific aim of making Monte Carlo simulations user-friendly for research and clinical physicists in the particle therapy community. The authors present a thorough and extensive experimental validation of Monte Carlo simulations performed with TOPAS in a variety of setups relevant to proton therapy applications. The set of validation measurements performed in this work represents an overall end-to-end testing strategy recommended for all clinical centers planning to rely on TOPAS for quality assurance or patient dose calculation and, more generally, for all institutions using passive-scattering proton therapy systems. Methods: The authors systematically compared TOPAS simulations with measurements that are performed routinely within the quality assurance (QA) program in our institution, as well as experiments specifically designed for this validation study. First, the authors compared TOPAS simulations with measurements of depth-dose curves for spread-out Bragg peak (SOBP) fields. Second, absolute dosimetry simulations were benchmarked against measured machine output factors (OFs). Third, the authors simulated and measured 2D dose profiles and analyzed the differences in terms of field flatness, symmetry, and usable field size. Fourth, the authors designed a simple experiment using a half-beam shifter to assess the effects of multiple Coulomb scattering, beam divergence, and inverse-square attenuation on lateral and longitudinal dose profiles measured and simulated in a water phantom. Fifth, TOPAS' capability to simulate time-dependent beam delivery was benchmarked against dose rate functions (i.e., dose per unit time vs time) measured at different depths inside an SOBP field. Sixth, simulations of the charge deposited by protons fully stopping in two different types of multilayer Faraday cups (MLFCs) were compared with measurements to benchmark the nuclear interaction models used in the simulations. Results: SOBP range and modulation width were reproduced, on average, with accuracies of +1/−2 mm and ±3 mm, respectively. OF simulations reproduced measured data within ±3%. Simulated 2D dose profiles show field flatness and average field radius within ±3% of measured profiles. The field symmetry was, on average, within ±3% of commissioned profiles. TOPAS accuracy in reproducing measured dose profiles downstream of the half-beam shifter is better than 2%. Dose rate function simulations reproduced the measurements within ∼2%, showing that the four-dimensional modeling of the passive modulation system was implemented correctly and that millimeter accuracy can be achieved in reproducing measured data. For MLFC simulations, 2% agreement was found between TOPAS and both sets of experimental measurements. The overall results show that TOPAS simulations are within the clinically accepted tolerances for all QA measurements performed at our institution. Conclusions: Our Monte Carlo simulations accurately reproduced the experimental data acquired through all the measurements performed in this study. Thus, TOPAS can reliably be applied to quality assurance for proton therapy and also as an input for commissioning of commercial treatment planning systems. This work also provides the basis for routine clinical dose calculations in patients for all passive scattering proton therapy centers using TOPAS. PMID:24320505
Introduction of risk size in the determination of uncertainty factor UFL in risk assessment
NASA Astrophysics Data System (ADS)
Xue, Jinling; Lu, Yun; Velasquez, Natalia; Yu, Ruozhen; Hu, Hongying; Liu, Zhengtao; Meng, Wei
2012-09-01
The methodology for using uncertainty factors in health risk assessment has been developed over several decades. A default value is usually applied for the uncertainty factor UFL, which is used to extrapolate from the LOAEL (lowest observed adverse effect level) to the NAEL (no adverse effect level). Here, we have developed a new method that establishes a linear relationship between UFL and the additional risk level at the LOAEL based on dose-response information, a very important factor that should be carefully considered. This linear formula makes it possible to select UFL properly for additional risk levels in the range from 5.3% to 16.2%. The results also indicate that the default value of 10 may not be conservative enough when the additional risk level at the LOAEL exceeds 16.2%. Furthermore, this novel method not only provides a flexible UFL instead of the traditional default value, but also ensures a conservative estimate of UFL with fewer errors, and avoids the benchmark response selection involved in the benchmark dose method. These advantages can improve the estimation of the extrapolation starting point in risk assessment.
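The abstract gives the structure of the relationship but not its coefficients; a sketch of that structure, with α and β left symbolic (they are not reported here) and the anchor implied by the abstract's remark that the default factor of 10 is reached at a 16.2% additional risk:

```latex
\[
  \mathrm{UF_L}(r) = \alpha + \beta\, r,
  \qquad 0.053 \le r \le 0.162,
  \qquad \alpha + 0.162\,\beta = 10 .
\]
```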
Osteogenic actions of metoprolol in an ovariectomized rat model of menopause.
Zang, Yuan; Tan, Quanchang; Ma, Xiangyu; Zhao, Xiong; Lei, Wei
2016-09-01
Osteoporosis and hypertension are age-related chronic diseases with increased morbidity rates among postmenopausal women. Clinical epidemiological investigations have demonstrated that hypertensive patients treated with β1-selective β-blockers have higher bone mineral density (BMD) and lower fracture risk. Nevertheless, no fundamental studies have examined how β1-selective β-blockers produce these effects. The present study explored the effects and mechanisms of metoprolol in the in vitro treatment of osteoblasts and the in vivo treatment of ovariectomy-induced osteoporosis in rats. Primary osteoblasts were obtained by digestion of the cranial bones of 24-hour-old Sprague-Dawley rats. After metoprolol treatment, cell proliferation and differentiation capacities were assessed at the corresponding time points. In addition, 3-month-old female Sprague-Dawley rats (200-220 g) were divided into a sham-operated group (n = 8) and three ovariectomized (OVX; bilateral removal of ovaries) groups: vehicle (OVX; n = 8), low-dose metoprolol (L-M, oral, 120 mg/kg/d; n = 8), and high-dose metoprolol (H-M, oral, 240 mg/kg/d; n = 8). After 12 weeks of metoprolol treatment, BMD, microarchitecture, and biomechanical properties were evaluated. The results indicated that treatment with 0.01 to 0.1 μM metoprolol increased osteoblast proliferation, alkaline phosphatase activity, and calcium mineralization, and promoted the expression of osteogenic genes. The in vivo study indicated that administration of metoprolol to OVX rats resulted in maintenance of the BMD of the L4 vertebrae. Moreover, amelioration of trabecular microarchitecture deterioration and preservation of bone biomechanical properties were detected in the trabecular bones of the OVX rats. Our findings indicate that metoprolol prevents estrogen deficiency-induced bone loss by increasing the number and enhancing the biological functions of osteoblasts, implying its potential use as an alternative treatment for postmenopausal osteoporosis in hypertensive patients.
Boos, J; Meineke, A; Rubbert, C; Heusch, P; Lanzman, R S; Aissa, J; Antoch, G; Kröpil, P
2016-03-01
To implement automated CT dose monitoring based on the DICOM-Structured Report (DICOM-SR) and to benchmark dose-related CT data against national diagnostic reference levels (DRLs). We used a novel in-house co-developed software tool based on the DICOM-SR to automatically monitor dose-related data from CT examinations. The DICOM-SR for each CT examination performed between 09/2011 and 03/2015 was automatically anonymized and sent from the CT scanners to a cloud server. Data were automatically analyzed according to body region, patient age, and the corresponding DRLs for the volumetric computed tomography dose index (CTDIvol) and dose-length product (DLP). Data from 36,523 examinations (131,527 scan series) performed on three different CT scanners and one PET/CT were analyzed. The overall mean CTDIvol and DLP were 51.3% and 52.8% of the national DRLs, respectively. CTDIvol and DLP reached 43.8% and 43.1% of the compared national DRLs for abdominal CT (n=10,590), 66.6% and 69.6% for cranial CT (n=16,098), and 37.8% and 44.0% for chest CT (n=10,387), respectively. Overall, the CTDIvol exceeded national DRLs in 1.9% of the examinations, while the DLP exceeded national DRLs in 2.9% of the examinations. Between different CT protocols for the same body region, radiation exposure varied by up to 50% of the DRLs. The implemented cloud-based CT dose monitoring based on the DICOM-SR enables automated benchmarking against national DRLs. Overall, the local dose exposure from CT reached approximately 50% of these DRLs, indicating that updated as well as protocol-specific DRLs are desirable. The cloud-based approach enables multi-center dose monitoring and offers great potential to further optimize radiation exposure in radiological departments. • The newly developed software based on the DICOM-Structured Report enables large-scale cloud-based CT dose monitoring. • The implemented software solution enables automated benchmarking against national DRLs. • The local radiation exposure from CT reached approximately 50% of the national DRLs. • The cloud-based approach offers great potential for multi-center dose analysis.
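The benchmarking step reported above amounts to expressing each examination's CTDIvol and DLP as a fraction of the applicable DRL and flagging exceedances. A minimal sketch, assuming already-parsed DICOM-SR records; the field names and DRL values are hypothetical:

```python
def benchmark_exam(exam, drl_table):
    """exam: dict with 'region', 'ctdi_vol' (mGy), 'dlp' (mGy*cm)."""
    drl = drl_table[exam["region"]]
    return {
        "ctdi_ratio": exam["ctdi_vol"] / drl["ctdi_vol"],
        "dlp_ratio": exam["dlp"] / drl["dlp"],
        "exceeds_drl": exam["ctdi_vol"] > drl["ctdi_vol"]
                       or exam["dlp"] > drl["dlp"],
    }

drl_table = {"abdomen": {"ctdi_vol": 20.0, "dlp": 900.0}}  # hypothetical DRLs
print(benchmark_exam({"region": "abdomen", "ctdi_vol": 9.1, "dlp": 410.0},
                     drl_table))
```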
Russell, Louise B.; Pentakota, Sri Ram; Toscano, Cristiana Maria; Cosgriff, Ben; Sinha, Anushua
2016-01-01
Background. Despite longstanding infant vaccination programs in low- and middle-income countries (LMICs), pertussis continues to cause deaths in the youngest infants. A maternal monovalent acellular pertussis (aP) vaccine, currently in development, could prevent many of these deaths. We estimated the infant pertussis mortality rates at which maternal vaccination would be a cost-effective use of public health resources in LMICs. Methods. We developed a decision model to evaluate the cost-effectiveness of maternal aP immunization plus routine infant vaccination vs routine infant vaccination alone in Bangladesh, Nigeria, and Brazil. For a range of maternal aP vaccine prices, one-way sensitivity analyses identified the infant pertussis mortality rates required to make maternal immunization cost-effective against alternative cost-per-DALY benchmarks ($100, 0.5 gross domestic product [GDP] per capita, and GDP per capita per disability-adjusted life-year [DALY] averted). Probabilistic sensitivity analysis provided uncertainty intervals for these mortality rates. Results. The infant pertussis mortality rates necessary to make maternal aP immunization cost-effective exceed the rates suggested by current evidence, except at low vaccine prices and/or cost-effectiveness benchmarks at the high end of those considered in this report. For example, at a vaccine price of $0.50/dose, pertussis mortality would need to be 0.051 per 1000 infants in Bangladesh, and 0.018 per 1000 in Nigeria, to cost 0.5 per-capita GDP per DALY averted. In Brazil, a middle-income country, at a vaccine price of $4/dose, infant pertussis mortality would need to be 0.043 per 1000 to cost 0.5 per-capita GDP per DALY averted. Conclusions. For commonly used cost-effectiveness benchmarks, maternal aP immunization would be cost-effective in many LMICs only if the vaccine were offered at less than $1–$2/dose. PMID:27838677
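The threshold logic can be sketched in back-of-envelope form: the mortality rate at which the cost per DALY averted just equals the benchmark. All parameter values below are hypothetical and deliberately simplified (the study's decision model is far richer):

```python
def threshold_mortality(vaccine_cost_per_dose, benchmark_per_daly,
                        efficacy, dalys_per_infant_death):
    """Mortality rate (deaths per vaccinated pregnancy) at which the
    cost per DALY averted equals the benchmark."""
    # cost/DALY = cost_per_dose / (mortality * efficacy * dalys_per_death)
    return vaccine_cost_per_dose / (
        benchmark_per_daly * efficacy * dalys_per_infant_death)

# e.g. $0.50/dose against a $1000/DALY benchmark, 90% efficacy, 30 DALYs/death
print(threshold_mortality(0.50, 1000.0, 0.9, 30.0))  # ~1.9e-5 per pregnancy
```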
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hirayama, S; Fujimoto, R
Purpose: To demonstrate a newly developed acceleration technique for dose optimization and to investigate its applicability to the optimization process in a treatment planning system (TPS) for proton therapy. Methods: In the developed technique, the dose matrix is divided into two parts, main and halo, based on beam size. The boundary between the two parts varies with beam energy and water-equivalent depth, using the beam size as a single threshold parameter. The optimization is executed with two levels of iteration. In the inner loop, doses from the main part are updated, whereas doses from the halo part remain constant. In the outer loop, the doses from the halo part are recalculated. We implemented this technique in the optimization process of the TPS and investigated the dependence of the speedup effect on target volume, as well as its applicability to worst-case optimization (WCO), in benchmarks. Results: We created irradiation plans for various cubic targets and measured the optimization time as a function of target volume. The speedup improved as the target volume increased, and the calculation speed increased by a factor of six for a 1000 cm3 target. An IMPT plan for the RTOG benchmark phantom was created in consideration of ±3.5% range uncertainties using the WCO. Beams were delivered at 0, 45, and 315 degrees. The target's prescribed dose and the OAR's Dmax were set to 3 Gy and 1.5 Gy, respectively. Using the developed technique, the calculation speed increased by a factor of 1.5. Meanwhile, no significant difference in the calculated DVHs was found before and after incorporating the technique into the WCO. Conclusion: The developed technique could be adapted to the TPS's optimization and was particularly effective for large target volumes.
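A schematic sketch of the two-level iteration described in the Methods (not the authors' implementation; the update rule and matrix names are placeholders):

```python
import numpy as np

def optimize_weights(w, d_main, d_halo, update_step, n_outer=10, n_inner=50):
    """w: beam weights; d_main, d_halo: dose-influence matrices."""
    dose_halo = d_halo @ w                 # halo contribution, outer estimate
    for _ in range(n_outer):
        for _ in range(n_inner):
            dose = d_main @ w + dose_halo  # halo held constant in inner loop
            w = update_step(w, dose)       # e.g., a gradient or ratio update
        dose_halo = d_halo @ w             # outer loop: recompute halo doses
    return w

# toy usage: 3 voxels, 2 beams, trivial "update" that leaves weights unchanged
w0 = np.ones(2)
d_main = np.array([[1.0, 0.2], [0.3, 1.0], [0.1, 0.1]])
d_halo = 0.05 * np.ones((3, 2))
w = optimize_weights(w0, d_main, d_halo, lambda w, dose: w)
```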
Development of risk-based nanomaterial groups for occupational exposure control
NASA Astrophysics Data System (ADS)
Kuempel, E. D.; Castranova, V.; Geraci, C. L.; Schulte, P. A.
2012-09-01
Given the almost limitless variety of nanomaterials, it will be virtually impossible to assess the possible occupational health hazard of each nanomaterial individually. The development of science-based hazard and risk categories for nanomaterials is needed for decision-making about exposure control practices in the workplace. A possible strategy would be to select representative (benchmark) materials from various mode of action (MOA) classes, evaluate the hazard and develop risk estimates, and then apply a systematic comparison of new nanomaterials with the benchmark materials in the same MOA class. Poorly soluble particles are used here as an example to illustrate quantitative risk assessment methods for possible benchmark particles and occupational exposure control groups, given mode of action and relative toxicity. Linking such benchmark particles to specific exposure control bands would facilitate the translation of health hazard and quantitative risk information to the development of effective exposure control practices in the workplace. A key challenge is obtaining sufficient dose-response data, based on standard testing, to systematically evaluate the nanomaterials' physical-chemical factors influencing their biological activity. Categorization processes involve both science-based analyses and default assumptions in the absence of substance-specific information. Utilizing data and information from related materials may facilitate initial determinations of exposure control systems for nanomaterials.
Methodology and Data Sources for Assessing Extreme Charging Events within the Earth's Magnetosphere
NASA Astrophysics Data System (ADS)
Parker, L. N.; Minow, J. I.; Talaat, E. R.
2016-12-01
Spacecraft surface and internal charging is a potential threat to space technologies because electrostatic discharges on, or within, charged spacecraft materials can result in a number of adverse impacts to spacecraft systems. The Space Weather Action Plan (SWAP) ionizing radiation benchmark team recognized that spacecraft charging will need to be considered in completing the ionizing radiation benchmarks, in order to evaluate the threat of charging to critical space infrastructure operating within the near-Earth ionizing radiation environments. However, the team chose to defer work on the lower-energy charging environments and to focus the initial benchmark efforts on the higher-energy galactic cosmic ray, solar energetic particle, and trapped radiation belt particle environments of concern for radiation dose and single-event effects in humans and hardware. Therefore, an initial set of 1-in-100-year spacecraft charging environment benchmarks remains to be defined to meet the SWAP goals. This presentation will discuss the available data sources and a methodology to assess the 1-in-100-year extreme space weather events that drive surface and internal charging threats to spacecraft. Environments to be considered are the hot plasmas in the outer magnetosphere during geomagnetic storms, relativistic electrons in the outer radiation belt, and energetic auroral electrons in low Earth orbit at high latitudes.
Results of global gene expression profiling after short-term exposures can be used to inform tumorigenic potency and chemical mode of action (MOA) and thus serve as a strategy to prioritize future or data-poor chemicals for further evaluation. This compilation of cas...
Dose-additivity has been the default assumption in risk assessments of pesticides with a common mechanism of action but it has been suspected that there could be non-additive effects. Inhibition of plasma cholinesterase (ChE) activity and hypothermia were used as benchmarks of e...
Neutron skyshine calculations with the integral line-beam method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gui, A.A.; Shultis, J.K.; Faw, R.E.
1997-10-01
Recently developed line- and conical-beam response functions are used to calculate neutron skyshine doses for four idealized source geometries. These calculations, which can serve as benchmarks, are compared with MCNP calculations, and the excellent agreement indicates that the integral conical- and line-beam method is an effective alternative to more computationally expensive transport calculations.
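For context, the integral line-beam method folds the source's energy and angular emission with a precomputed line-beam response function; in our notation (not necessarily the authors'), the skyshine dose at source-detector distance x is

```latex
\[
  D(x) = \int \! dE \int_{\Omega_s} \! d\Omega \;
  S(E,\Omega)\, \mathcal{R}(E,\varphi,x),
\]
```

where φ is the beam direction's angle from the source-detector axis and Ω_s is the solid angle into which the collimated source emits.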
NASA Astrophysics Data System (ADS)
Carpentieri, C.; Schwarz, C.; Ludwig, J.; Ashfaq, A.; Fiederle, M.
2002-07-01
High precision in the dose calibration of X-ray sources is required when counting and integrating methods are compared. The dose calibration for a dental X-ray tube was carried out with dedicated dose calibration equipment (a dosimeter) as a function of exposure time and rate. Results were compared with a benchmark spectrum and agree within ±1.5%. Dead-time investigations with the Medipix1 photon-counting chip (PCC) have been performed by varying the count rate. Two different types of dead time, paralysable and non-paralysable, are discussed. The dead time depends on the settings of the front-end electronics and is a function of signal height, which may lead to systematic errors. Dead-time losses in excess of 30% have been found for the PCC at an absorbed-photon rate of 200 kHz per pixel.
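For reference, the two standard dead-time models mentioned above relate the true event rate n to the measured rate m for a dead time τ:

```latex
\[
  m = n\,e^{-n\tau} \quad\text{(paralysable)}, \qquad
  m = \frac{n}{1 + n\tau} \quad\text{(non-paralysable)} .
\]
```

Taken at face value, a 30% loss at a 200 kHz absorbed-photon rate corresponds to a per-pixel dead time on the order of 2 µs under either model.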
Soller, Jeffrey A; Eftim, Sorina E; Nappier, Sharon P
2018-01-01
Understanding pathogen risks is a critically important consideration in the design of water treatment, particularly for potable reuse projects. As an extension of our published microbial risk assessment methodology for estimating infection risks associated with Direct Potable Reuse (DPR) treatment train unit process combinations, herein we (1) provide an updated compilation of pathogen density data in raw wastewater and dose-response models; (2) conduct a series of sensitivity analyses to consider potential risk implications using the updated data; (3) evaluate the risks associated with log credit allocations in the United States; and (4) identify reference pathogen reductions needed to consistently meet currently applied benchmark risk levels. The sensitivity analyses illustrated changes in cumulative annual risk estimates, the significance of which depends on the pathogen group driving the risk for a given treatment train. For example, updates to norovirus (NoV) raw wastewater values and use of a NoV dose-response approach capturing the full range of uncertainty increased the risks associated with one of the treatment trains evaluated, but not the other. Additionally, compared to traditional log-credit allocation approaches, our results indicate that the risk methodology provides more nuanced information about how consistently public health benchmarks are achieved. Our results indicate that viruses need to be reduced by 14 logs or more to consistently achieve currently applied benchmark levels of protection associated with DPR. The refined methodology, updated model inputs, and log credit allocation comparisons will be useful to regulators considering DPR projects and to design engineers as they consider which unit treatment processes should be employed for particular projects.
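A minimal QMRA-style sketch of the chain linking raw pathogen density, treatment log reduction, and annual infection risk; the single-hit exponential dose-response form is standard in this literature, but every parameter value below is hypothetical rather than taken from the study:

```python
import numpy as np

def daily_infection_risk(raw_density, lrv, volume_l, r):
    """raw_density: organisms/L; lrv: total log10 reduction credit;
    volume_l: daily ingestion (L); r: dose-response parameter."""
    dose = raw_density * 10.0 ** (-lrv) * volume_l
    return 1.0 - np.exp(-r * dose)   # single-hit exponential model

def annual_risk(p_daily, days=365):
    return 1.0 - (1.0 - p_daily) ** days

p = daily_infection_risk(raw_density=1e5, lrv=14, volume_l=2.0, r=0.1)
print(annual_risk(p))   # compare against a benchmark, e.g. 1e-4 per year
```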
Lu, Shan; Zhao, Lan-Juan; Chen, Xiang-Ding; Papasian, Christopher J.; Wu, Ke-Hao; Tan, Li-Jun; Wang, Zhuo-Er; Pei, Yu-Fang; Tian, Qing
2018-01-01
Several studies have indicated that bone mineral density (BMD) and alcohol intake might share common genetic factors. This study aimed to explore potential SNPs/genes related to both phenotypes in US Caucasians at the genome-wide level. A bivariate genome-wide association study (GWAS) was performed in 2069 unrelated participants. Regular drinking was graded as 1, 2, 3, 4, 5, or 6, representing drinking alcohol never, less than once, once or twice, three to six times, seven to ten times, or more than ten times per week, respectively. Hip, spine, and whole-body BMDs were measured. The bivariate GWAS was conducted on the basis of a bivariate linear regression model. Sex-stratified association analyses were performed in the male and female subgroups. In males, the most significant association signal was detected for SNP rs685395 in DYNC2H1 with bivariate spine BMD and alcohol drinking (P = 1.94 × 10^-8). SNP rs685395 and five other SNPs, rs657752, rs614902, rs682851, rs626330, and rs689295, located in the same haplotype block in DYNC2H1, were among the top ten most significant SNPs in the bivariate GWAS in males. Additionally, two SNPs in GRIK4 in males and three SNPs in OPRM1 in females were suggestively associated with BMDs (of the hip, spine, and whole body) and alcohol drinking. Nine SNPs in IL1RN were suggestively associated only with female whole-body BMD and alcohol drinking. Our study indicated that DYNC2H1 may contribute to the genetic mechanisms of both spine BMD and alcohol drinking in male Caucasians. Moreover, our study suggested potential pleiotropic roles of OPRM1 and IL1RN in females and GRIK4 in males underlying variation in both BMD and alcohol drinking. PMID:28012008
Lenora, Janaka; Lekamwasam, Sarath; Karlsson, Magnus K
2009-01-01
Background: Studies conducted in Western countries have shown that bone loss associated with pregnancy and breast-feeding is recovered after weaning. However, it is not clear whether recovery takes place after repeated pregnancies followed by prolonged periods of breast-feeding, especially in developing countries where nutritional intake is comparatively low. This study was designed to examine the effects of multiparity and prolonged breast-feeding on maternal bone mineral density (BMD) in a community-based sample of 210 Sri Lankan women aged between 46 and 98 years. Methods: BMD of the lumbar spine (L2–L4) and femoral neck was measured by dual-energy X-ray absorptiometry. Reproductive history was recorded using a questionnaire. Women were first divided into groups according to parity (nulliparous, 1–2, 3–4, and 5 or more children), and the BMDs of the groups were compared, initially unadjusted and then adjusted for age. The same subjects were then subdivided according to the total duration of breast-feeding (0, 1–48, 49–96, and 97 months or more) and a similar analysis was carried out. Results: Women who had 5 or more children and women who had breast-fed for 97 months or more were older than the other women (p < 0.01), but no differences in height, weight, or BMI were observed among the groups. Age-adjusted lumbar spine and femoral neck BMDs did not differ significantly among the parity groups. Neither was there any difference in lumbar spine or femoral neck BMD among the groups based on duration of breast-feeding. Conclusion: From this population-based study conducted in a developing country, we infer that a history of multiparity or prolonged breast-feeding has no detrimental effect on maternal BMD at post-menopausal age. PMID:19570205
Life expectancy and causes of death in Bernese mountain dogs in Switzerland.
Klopfenstein, Michael; Howard, Judith; Rossetti, Menga; Geissbühler, Urs
2016-07-25
New regulations by the Swiss Federal Food Safety and Veterinary Office provide for the monitoring of breed health by Swiss breeding clubs. In collaboration with the Swiss Bernese Mountain Dog Club, the purpose of this study was to investigate the causes of death in purebred dogs registered by the club and born in 2001 and 2002. Of a total of 1290 Bernese mountain dogs (BMDs) born in 2001 and 2002 in Switzerland, data were collected for 389 dogs (30.2%) from owners and veterinarians using a questionnaire designed for this study. By the end of the study, 381/389 dogs (97.9%) had died. The median life expectancy of all dogs was 8.4 years (IQR, 6.9-9.7). Female dogs had a significantly longer median survival (8.8 years; IQR, 7.1-10.3) than male dogs (7.7 years; IQR, 6.6-9.3) (P < 0.001). The cause of death was unknown in 89/381 dogs (23.4%). For the remaining dogs, the most frequent causes of death were neoplasia (222/381, 58.3%), degenerative joint disease (16/381, 4.2%), spinal disorders (13/381, 3.4%), renal injury (12/381, 3.1%), and gastric or mesenteric volvulus (7/381, 1.8%). However, a large number of dogs were diagnosed with neoplasia without histopathologic or cytologic confirmation. Dogs with neoplasms had a shorter median survival than dogs with other disorders. The shortest median survival (6.8 years) was found for dogs with renal injury. The findings of this study confirm a high prevalence of neoplasia and an associated low life expectancy in BMDs. The results underline the need for more widespread, precise diagnostics and further research on malignant tumours in this breed to improve overall breed health.
Mavrogeni, Sophie; Giannakopoulou, Aikaterini; Papavasiliou, Antigoni; Markousis-Mavrogenis, George; Pons, Roser; Karanasios, Evangelos; Noutsias, Michel; Kolovou, Genovefa; Papadopoulos, George
2017-07-24
To evaluate cardiovascular function in boys with Duchenne (DMD) and Becker (BMD) muscular dystrophy using cardiac magnetic resonance (CMR). In this single-point, cross-sectional study, twenty-four boys with genetically ascertained DMD and 10 with BMD, aged 10.5 ± 1.5 years (range 9-13), were prospectively evaluated on a 1.5 T system and compared with age- and sex-matched controls. The DMD patients were divided into two groups. Group A (n = 12) were under treatment with both deflazacort and perindopril, while Group B (n = 12) were under treatment with deflazacort only. BMD patients did not take any medication. Biventricular function was assessed using a standard SSFP sequence. Late gadolinium enhancement (LGE) was assessed from T1 images taken 15 min after injection of 0.2 mg/kg gadolinium DTPA using a 3D T1-TFE sequence. Group A and the BMD patients were asymptomatic with normal ECG, 24 h ECG recording, and echocardiogram. Group B were asymptomatic, but 6/12 had abnormal ECG and mildly impaired LVEF; their 24 h ECG recordings revealed supraventricular and ventricular extrasystoles (all at 12-13 years). LV indices in Group A and the BMD patients did not differ from those of controls. However, LV indices in Group B were significantly impaired compared with controls, Group A, and the BMD patients (p < 0.001). An epicardial LGE area of 3 ± 0.5% of LV mass was identified in the posterolateral wall of the LV in 6/12 patients of Group B, but not in any BMD or Group A patients. Children with either BMD or DMD under treatment with both deflazacort and perindopril present preserved LV function and no LGE. However, further large-scale multicenter studies, including further CMR mapping approaches, are warranted to confirm these data.
Geffner-Sclarsky, D
To determine the burden and characteristics of cerebrovascular disease (CVD) admissions across the hospital network of the Valencian Region. The paper reports an analysis of the information included in the basic minimum data set (BMDS) from the 26 hospitals run by the Valencian Regional Ministry of Health in the year 2001. Patients were selected whose main diagnosis was coded C.430 to C.437 according to the International Classification of Diseases, 9th revision, clinical modification (ICD-9-CM). A total of 10,558 patients with CVD were discharged, accounting for 2.6% of admissions and 3% of hospital stays. The mean age of the series was 71.03 years (standard deviation [SD]: 9), and 94.8% were admitted as emergencies. By diagnosis, 3% (319) were subarachnoid haemorrhages (SAH; C.430); 13.4% (1,412) were cerebral haemorrhages (ICH; C.431); 18.5% (1,956) were transient ischaemic attacks (TIA; C.435); 49.5% (5,225) were cases of cerebral infarction (CI; C.434 and C.436); and 15.6% involved other vascular processes (C.433 and C.437). Mortality rates were 30.1% in SAH, 33.9% in ICH, 11.7% in CI, and 2.7% in TIA. Mean number of days in hospital: SAH 17.4 (SD: 15); ICH 13.1 (SD: 11.8); CI 9.9 (SD: 6.4); TIA 7.2 (SD: 4). The percentages of survivors discharged home were 78.9% in SAH, 83.2% in ICH, and 91.9% in CI. In all, 51.3% (5,413 patients) were discharged by neurological units. In spite of the possible limitations analysed in this work, the BMDS provides valuable epidemiological information that is very useful for health care management.
Transcriptomic Dose-Response Analysis for Mode of Action ...
Microarray and RNA-seq technologies can play an important role in assessing the health risks associated with environmental exposures. The utility of gene expression data to predict hazard has been well documented. Early toxicogenomics studies used relatively high, single doses with minimal replication, and thus were not useful for understanding health risks at environmentally relevant doses. Until the past decade, application of toxicogenomics to dose-response assessment and determination of chemical mode of action was limited. New transcriptomic biomarkers have evolved to detect chemical hazards in multiple tissues, together with pathway methods to study biological effects across the full dose-response range and critical time course. Comprehensive low-dose datasets are now available, and with the use of transcriptomic benchmark dose estimation techniques within a mode of action framework, the ability to incorporate informative genomic data into human health risk assessment has substantially improved. The key advantage of applying transcriptomic technology to risk assessment is both the sensitivity and the comprehensive examination of direct and indirect molecular changes that lead to adverse outcomes. This book chapter addresses future applications of toxicogenomics technologies for MoA determination and risk assessment.
Accuracy of a simplified method for shielded gamma-ray skyshine sources
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bassett, M.S.; Shultis, J.K.
1989-11-01
Rigorous transport or Monte Carlo methods for estimating far-field gamma-ray skyshine doses generally are computationally intensive. Consequently, several simplified techniques, such as point-kernel methods and methods based on beam response functions, have been proposed. For unshielded skyshine sources, these simplified methods have been shown to be quite accurate by comparison with benchmark problems and benchmark experimental results. For shielded sources, the simplified methods typically use exponential attenuation and photon buildup factors to describe the effect of the shield. However, the energy and directional redistribution of photons scattered in the shield is usually ignored, i.e., scattered photons are assumed to emerge from the shield with the same energy and direction as the uncollided photons. The accuracy of this shield treatment is largely unknown due to the paucity of benchmark results for shielded sources. In this paper, the validity of such a shield treatment is assessed by comparison with a composite method, which accurately calculates the energy and angular distribution of photons penetrating the shield.
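The simplified shield treatment under assessment scales the unshielded result by exponential attenuation with a buildup factor, leaving the scattered photons' energy and direction unchanged:

```latex
\[
  D_{\text{shielded}} \approx B(\mu t)\, e^{-\mu t}\, D_{\text{unshielded}},
\]
```

where μ is the shield's attenuation coefficient, t its thickness, and B(μt) the photon buildup factor.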
Comparison of Vocal Vibration-Dose Measures for Potential-Damage Risk Criteria
Hunter, Eric J.
2015-01-01
Purpose Schoolteachers have become a benchmark population for the study of occupational voice use. A decade of vibration-dose studies on the teacher population allows a comparison to be made between specific dose measures for eventual assessment of damage risk. Method Vibration dosimetry is reformulated with the inclusion of collision stress. Two methods of estimating amplitude of vocal-fold vibration are compared to capture variations in vocal intensity. Energy loss from collision is added to the energy-dissipation dose. An equal-energy-dissipation criterion is defined and used on the teacher corpus as a potential-damage risk criterion. Results Comparison of time-, cycle-, distance-, and energy-dose calculations for 57 teachers reveals a progression in information content in the ability to capture variations in duration, speaking pitch, and vocal intensity. The energy-dissipation dose carries the greatest promise in capturing excessive tissue stress and collision but also the greatest liability, due to uncertainty in parameters. Cycle dose is least correlated with the other doses. Conclusion As a first guide to damage risk in excessive voice use, the equal-energy-dissipation dose criterion can be used to structure trade-off relations between loudness, adduction, and duration of speech. PMID:26172434
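For orientation, the commonly used vibration-dose definitions from the vocal-dosimetry literature (notation ours; the paper's energy-dissipation dose additionally folds in collision losses): with voicing indicator k_v(t) ∈ {0, 1}, fundamental frequency F_0, and vibration amplitude A,

```latex
\[
  D_{\mathrm{time}} = \int k_v \, dt, \qquad
  D_{\mathrm{cycle}} = \int k_v F_0 \, dt, \qquad
  D_{\mathrm{dist}} = \int k_v \, 4 A F_0 \, dt .
\]
```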
GROWTH OF THE INTERNATIONAL CRITICALITY SAFETY AND REACTOR PHYSICS EXPERIMENT EVALUATION PROJECTS
DOE Office of Scientific and Technical Information (OSTI.GOV)
J. Blair Briggs; John D. Bess; Jim Gulliford
2011-09-01
Since the International Conference on Nuclear Criticality Safety (ICNC) 2007, the International Criticality Safety Benchmark Evaluation Project (ICSBEP) and the International Reactor Physics Experiment Evaluation Project (IRPhEP) have continued to expand their efforts and broaden their scope. Eighteen countries participated in the ICSBEP in 2007; now there are 20, with recent contributions from Sweden and Argentina. The IRPhEP has also expanded, from eight contributing countries in 2007 to 16 in 2011. Since ICNC 2007, the contents of the 'International Handbook of Evaluated Criticality Safety Benchmark Experiments' [1] have increased from 442 evaluations (38,000 pages), containing benchmark specifications for 3,955 critical or subcritical configurations, to 516 evaluations (nearly 55,000 pages), containing benchmark specifications for 4,405 critical or subcritical configurations, in the 2010 edition of the ICSBEP Handbook. The contents of the Handbook have also increased from 21 to 24 criticality-alarm-placement/shielding configurations with multiple dose points for each, and from 20 to 200 configurations categorized as fundamental physics measurements relevant to criticality safety applications. Approximately 25 new evaluations and 150 additional configurations are expected to be added to the 2011 edition of the Handbook. Since ICNC 2007, the contents of the 'International Handbook of Evaluated Reactor Physics Benchmark Experiments' [2] have increased from 16 experimental series performed at 12 different reactor facilities to 53 experimental series performed at 30 different reactor facilities in the 2011 edition of the Handbook. Considerable effort has also been made to improve the functionality of the searchable database, DICE (Database for the International Criticality Benchmark Evaluation Project), and to verify the accuracy of the data contained therein. DICE will be discussed in separate papers at ICNC 2011. The status of the ICSBEP and the IRPhEP will be discussed in the full paper, selected benchmarks that have been added to the ICSBEP Handbook will be highlighted, and a preview of the new benchmarks that will appear in the September 2011 edition of the Handbook will be provided. Accomplishments of the IRPhEP will also be highlighted, and the future of both projects will be discussed. REFERENCES: (1) International Handbook of Evaluated Criticality Safety Benchmark Experiments, NEA/NSC/DOC(95)03/I-IX, Organisation for Economic Co-operation and Development-Nuclear Energy Agency (OECD-NEA), September 2010 Edition, ISBN 978-92-64-99140-8. (2) International Handbook of Evaluated Reactor Physics Benchmark Experiments, NEA/NSC/DOC(2006)1, Organisation for Economic Co-operation and Development-Nuclear Energy Agency (OECD-NEA), March 2011 Edition, ISBN 978-92-64-99141-5.
Shutdown Dose Rate Analysis Using the Multi-Step CADIS Method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ibrahim, Ahmad M.; Peplow, Douglas E.; Peterson, Joshua L.
2015-01-01
The Multi-Step Consistent Adjoint Driven Importance Sampling (MS-CADIS) hybrid Monte Carlo (MC)/deterministic radiation transport method was proposed to speed up the shutdown dose rate (SDDR) neutron MC calculation using an importance function that represents the neutron importance to the final SDDR. This work applied the MS-CADIS method to the ITER SDDR benchmark problem. The MS-CADIS method was also used to calculate the SDDR uncertainty resulting from uncertainties in the MC neutron calculation, and to determine the degree of undersampling in SDDR calculations arising from the limited ability of the MC method to tally detailed spatial and energy distributions. The analysis that used the ITER benchmark problem compared the efficiency of the MS-CADIS method to the traditional approach of using global MC variance reduction techniques to speed up the SDDR neutron MC calculation. Compared to the standard Forward-Weighted-CADIS (FW-CADIS) method, the MS-CADIS method increased the efficiency of the SDDR neutron MC calculation by 69%. The MS-CADIS method also increased the fraction of nonzero scoring mesh tally elements in the space-energy regions of high importance to the final SDDR.
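For context, the single-step CADIS relations that MS-CADIS generalizes (notation ours): with adjoint flux φ† and detector response R, the biased source and weight-window targets over phase space P are

```latex
\[
  \hat{q}(P) = \frac{q(P)\,\phi^{\dagger}(P)}{R}, \qquad
  \bar{w}(P) = \frac{R}{\phi^{\dagger}(P)},
  \qquad R = \int q\,\phi^{\dagger}\,dP .
\]
```

MS-CADIS replaces the single-step adjoint with an importance function expressing neutron importance to the final SDDR, i.e., to the photon dose computed in the subsequent step.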
Benchmark of PENELOPE code for low-energy photon transport: dose comparisons with MCNP4 and EGS4.
Ye, Sung-Joon; Brezovich, Ivan A; Pareek, Prem; Naqvi, Shahid A
2004-02-07
The expanding clinical use of low-energy photon-emitting 125I and 103Pd seeds in recent years has led to renewed interest in their dosimetric properties. Numerous papers have pointed out that higher accuracy can be obtained in Monte Carlo simulations by utilizing newer libraries for the low-energy photon cross-sections, such as XCOM and EPDL97. The recently developed PENELOPE 2001 Monte Carlo code is user-friendly and incorporates photon cross-section data from the EPDL97. The code has been verified for clinical dosimetry of high-energy electron and photon beams, but has not yet been tested at low energies. In the present work, we have benchmarked the PENELOPE code for 10-150 keV photons. We computed radial dose distributions from 0 to 10 cm in water at photon energies of 10-150 keV using both PENELOPE and MCNP4C with either the DLC-146 or DLC-200 cross-section library, assuming a point source located at the centre of a cylinder 30 cm in diameter and 20 cm in length. Throughout the energy range of the simulated photons (except for 10 keV), PENELOPE agreed within statistical uncertainties (at worst +/- 5%) with MCNP/DLC-146 over the entire region of 1-10 cm and with published EGS4 data up to 5 cm. The PENELOPE dose at 1 cm (i.e., the dose rate constant) agreed with MCNP/DLC-146 and EGS4 data within approximately +/- 2% in the range of 20-150 keV, while MCNP/DLC-200 produced values up to 9% lower in the range of 20-100 keV than PENELOPE or the other codes. However, the differences among the four datasets became negligible above 100 keV.
Formative usability evaluation of a fixed-dose pen-injector platform device
Lange, Jakob; Nemeth, Tobias
2018-01-01
Background: This article presents the first formative usability study of a fixed-dose pen injector platform device used for the subcutaneous delivery of biopharmaceuticals, primarily for self-administration by the patient. The study was conducted with a user population of both naïve and experienced users across a range of ages. The goals of the study were to evaluate whether users could use the device safely and effectively relying on the instructions for use (IFU) for guidance, and to benchmark the device against a similar injector established in the market. Further objectives were to capture any usability issues and obtain participants' subjective ratings of the properties and performance of both devices. Methods: A total of 20 participants in three groups studied the IFU and performed simulated injections into an injection pad. Results: All participants were able to use the device successfully. The device was well received by all users, with maximum usability feedback scores reported by 90% or more of participants for handling forces and device feedback, and by 85% or more for the fit and grip of the device. The presence of clear audible and visible feedback upon successful loading of a dose and completion of injection was seen as a significant improvement over the benchmark injector. Conclusion: The observation that the platform device can be safely and efficiently used by all user groups provides confidence that the device and IFU in their current form will pass future summative testing in specific applications. PMID:29670411
Blanck, Oliver; Wang, Lei; Baus, Wolfgang; Grimm, Jimm; Lacornerie, Thomas; Nilsson, Joakim; Luchkovskyi, Sergii; Cano, Isabel Palazon; Shou, Zhenyu; Ayadi, Myriam; Treuer, Harald; Viard, Romain; Siebert, Frank-Andre; Chan, Mark K H; Hildebrandt, Guido; Dunst, Jürgen; Imhoff, Detlef; Wurster, Stefan; Wolff, Robert; Romanelli, Pantaleo; Lartigau, Eric; Semrau, Robert; Soltys, Scott G; Schweikard, Achim
2016-05-08
Stereotactic radiosurgery (SRS) is the accurate, conformal delivery of high-dose radiation to well-defined targets while minimizing normal-structure doses via steep dose gradients. While inverse treatment planning (ITP) with computerized optimization algorithms is routine, many aspects of the planning process remain user-dependent. We performed an international, multi-institutional benchmark trial to study planning variability and to analyze preferable ITP practice for spinal robotic radiosurgery. Ten SRS treatment plans were generated for a complex-shaped spinal metastasis with 21 Gy in 3 fractions and tight constraints for spinal cord (V14Gy < 2 cc, V18Gy < 0.1 cc) and target (coverage > 95%). The resulting plans were rated on a scale from 1 to 4 (excellent to poor) in five categories (constraint compliance, optimization goals, low-dose regions, ITP complexity, and clinical acceptability) by a blinded review panel. Additionally, the plans were mathematically rated based on plan indices (critical structure and target doses, conformity, monitor units, normal tissue complication probability, and treatment time) and compared to the human rankings. The treatment plans and the reviewers' rankings varied substantially among the participating centers. The average mean overall rank was 2.4 (1.2-4.0), and 8/10 plans were rated excellent in at least one category by at least one reviewer. The mathematical rankings agreed with the mean overall human rankings in 9/10 cases, pointing toward the possibility of purely mathematical plan-quality comparison. The final rankings revealed that a plan with a well-balanced trade-off among all planning objectives was preferred for treatment by most participants, reviewers, and the mathematical ranking system. Furthermore, this plan was generated with simple planning techniques. Our multi-institutional planning study found wide variability in ITP approaches for spinal robotic radiosurgery. The participants', reviewers', and mathematical match on preferable treatment plans and ITP techniques indicates that agreement on treatment planning and plan quality can be reached for spinal robotic radiosurgery.
Pecquet, Alison M; Martinez, Jeanelle M; Vincent, Melissa; Erraguntla, Neeraja; Dourson, Michael
2018-06-01
A no-significant-risk-level (NSRL) of 20 mg/day was derived for tetrabromobisphenol A (TBBPA). Uterine tumors (adenomas, adenocarcinomas, and malignant mixed Müllerian tumors) observed in female Wistar Han rats in a National Toxicology Program 2-year cancer bioassay were identified as the critical effect. Studies suggest that TBBPA acts through a non-mutagenic mode of action. Thus, the most appropriate approach to derivation of a cancer risk value based on US Environmental Protection Agency guidelines is a threshold approach, akin to a cancer safe dose (RfDcancer). Using the National Toxicology Program data, we used Benchmark Dose Software to derive a benchmark dose lower limit (BMDL10) as the point of departure (POD) of 103 mg/kg/day. The POD was adjusted to a human equivalent dose (HED) of 25.6 mg/kg/day using allometric scaling. We applied a composite adjustment factor of 100 to the HED to derive an RfDcancer of 0.26 mg/kg/day. Based on a human body weight of 70 kg, the RfDcancer was converted to an NSRL of 20 mg/day. This was compared to other available non-cancer and cancer risk values, and aligns well with our understanding of the underlying biology based on the toxicology data. Overall, the weight of evidence from animal studies indicates that TBBPA has low toxicity and suggests that high doses over long exposure durations are needed to induce uterine tumor formation. Future research needs include a thorough and detailed vetting of the proposed adverse outcome pathway, including further support for key events leading to uterine tumor formation and a quantitative weight-of-evidence analysis.
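The arithmetic chain in this derivation can be reproduced directly; the rat body weight used below for the allometric (BW^3/4) scaling is our assumption, while the POD, composite factor, and 70 kg human body weight come from the abstract:

```python
pod = 103.0                              # BMDL10, mg/kg/day (rat)
bw_rat, bw_human = 0.25, 70.0            # kg; rat value is an assumption
hed = pod * (bw_rat / bw_human) ** 0.25  # allometric (BW^3/4) scaling
rfd_cancer = hed / 100.0                 # composite adjustment factor of 100
nsrl = rfd_cancer * bw_human             # mg/day, before rounding
print(f"HED ~ {hed:.1f} mg/kg/day, RfDcancer ~ {rfd_cancer:.2f} mg/kg/day, "
      f"NSRL ~ {nsrl:.0f} mg/day")       # ~25, ~0.25, ~18 -> rounded to 20
```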
SU-E-T-776: Use of Quality Metrics for a New Hypo-Fractionated Pre-Surgical Mesothelioma Protocol
DOE Office of Scientific and Technical Information (OSTI.GOV)
Richardson, S; Mehta, V
Purpose: The “SMART” (Surgery for Mesothelioma After Radiation Therapy) approach involves hypo-fractionated radiotherapy of the lung pleura to 25 Gy over 5 days followed by surgical resection within 7 days. Early clinical results suggest that this approach is very promising, but it is also logistically challenging due to the multidisciplinary involvement. Because of the compressed schedule, high dose, and shortened planning time, the delivery of the planned doses was monitored for safety with quality-metric software. Methods: Hypo-fractionated IMRT treatment plans were developed for all patients and exported to Quality Reports™ software. Plan quality metrics (PQMs™) were created to calculate an objective scoring function for each plan. This allows an objective assessment of the quality of each plan and a benchmark for plan improvement for subsequent patients. The priorities of the various components were set based on similar hypo-fractionated protocols such as lung SBRT treatments. Results: Five patients have been treated at our institution using this approach. The plans were developed, QA performed, and treatment ready within 5 days of simulation. Plan quality metrics used in scoring included doses to OARs and target coverage. All patients tolerated treatment well and proceeded to surgery as scheduled. Reported toxicity included grade 1 nausea (n=1), grade 1 esophagitis (n=1), and grade 2 fatigue (n=3). One patient had recurrent fluid accumulation following surgery. No patient experienced any pulmonary toxicity prior to surgery. Conclusion: An accelerated course of pre-operative high-dose radiation for mesothelioma is an innovative and promising new protocol. Without historical data, one must proceed cautiously and monitor the data carefully. The development of quality metrics and scoring functions for these treatments allows us to benchmark our plans and monitor improvement. If subsequent toxicities occur, they will be easy to investigate and incorporate into the metrics. This will improve the safe delivery of large doses for these patients.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ma, Y; Lacroix, F; Lavallee, M
Purpose: To evaluate the commercially released collapsed-cone convolution (CCC) dose calculation module of the Elekta OncentraBrachy (OcB) treatment planning system (TPS). Methods: An all-water phantom was used to perform TG-43 benchmarks with a single source and with seventeen sources, separately. Furthermore, four real-patient heterogeneous geometries (chest wall, lung, breast, and prostate) were used. They were selected as clinically representative of classes of anatomy that pose clear challenges. The plans were used as is (no modification). For each case, TG-43 and CCC calculations were performed in the OcB TPS, with TG-186-recommended materials properly assigned to ROIs. For comparison, a Monte Carlo (MC) simulation was run for each case with the same material scheme and grid mesh as the TPS calculations. Both modes of CCC (standard and high quality) were tested. Results: For the benchmark case, the CCC dose, when divided by that of TG-43, yields hot and cold spots in a radial pattern. The pattern of the high mode is denser than that of the standard mode and is representative of angular discretization. The total deviation ((hot − cold)/TG-43) is 18% for the standard mode and 11% for the high mode. Seventeen dwell positions help to reduce the ray effect, lowering the total deviation to 6% (standard) and 5% (high), respectively. For the four patient cases, CCC produces, as expected, more realistic dose distributions than TG-43. Close agreement was observed between CCC and MC for all isodose lines from 20% and up; the 10% isodose line of CCC appears shifted compared to that of MC. The DVH plots show dose deviations of CCC from MC in small-volume, high-dose regions (>100% isodose). For the patient cases, the difference between the standard and high modes is almost indiscernible. Conclusion: The OncentraBrachy CCC algorithm marks a significant dosimetric improvement relative to TG-43 in real-patient cases. Further research is recommended regarding the clinical implications of the above observations. Support provided by a CIHR grant; the CCC system was provided by Elekta-Nucletron.
Initial characterization, dosimetric benchmark and performance validation of Dynamic Wave Arc.
Burghelea, Manuela; Verellen, Dirk; Poels, Kenneth; Hung, Cecilia; Nakamura, Mitsuhiro; Dhont, Jennifer; Gevaert, Thierry; Van den Begin, Robbe; Collen, Christine; Matsuo, Yukinori; Kishi, Takahiro; Simon, Viorica; Hiraoka, Masahiro; de Ridder, Mark
2016-04-29
Dynamic Wave Arc (DWA) is a clinical approach designed to maximize the versatility of the Vero SBRT system by synchronizing the gantry-ring noncoplanar movement with D-MLC optimization. The purpose of this study was to verify the delivery accuracy of the DWA approach and to evaluate its potential dosimetric benefits. DWA is an extended form of VMAT with a continuously varying ring position. The main difference between the optimization modules of VMAT and DWA lies in the angular spacing: the DWA algorithm does not consider the gantry spacing alone, but only the Euclidean norm of the ring and gantry angles. A preclinical version of RayStation v4.6 (RaySearch Laboratories, Sweden) was used to create patient-specific wave arc trajectories for 31 patients with tumors in various anatomical regions (prostate, oligometastatic cases, centrally located non-small cell lung cancer (NSCLC) and locally advanced pancreatic cancer, LAPC). DWA was benchmarked against the current clinical approaches and coplanar VMAT. Each plan was evaluated with regard to dose distribution, modulation complexity (MCS), monitor units and treatment time efficiency. Delivery accuracy was evaluated using a 2D diode array that accounts for the multi-dimensionality of DWA during dose reconstruction. In centrally located NSCLC cases, DWA improved the low-dose spillage by 20%, while target coverage increased by 17% compared with 3D CRT. The structures that benefited most from DWA were the proximal bronchus and esophagus, with the maximal dose reduced by 17% and 24%, respectively. For prostate and LAPC, neither technique was clearly superior; however, DWA reduced delivery time by more than 65% relative to IMRT. A steeper dose gradient outside the target was observed for all treatment sites (p < 0.01) with DWA. Except for the oligometastatic cases, where the DWA MCS values indicate higher modulation, DWA and VMAT provide plans of similar complexity. The average γ (3%/3 mm) passing rate for DWA plans was 99.2 ± 1% (range 96.8 to 100%). DWA proved to be a fully functional treatment technique, allowing additional flexibility in dose shaping while preserving dosimetrically robust delivery and treatment times comparable to coplanar VMAT.
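The γ (3%/3 mm) passing rate quoted above combines a dose-difference criterion with a distance-to-agreement criterion. As a rough illustration of the underlying computation (not the study's diode-array reconstruction), a brute-force global 1D gamma evaluation might look like this:

```python
import numpy as np

def gamma_pass_rate(ref_dose, eval_dose, positions, dose_tol=0.03, dist_tol=3.0):
    """Brute-force global 1D gamma analysis (e.g. 3%/3 mm) over a dose profile."""
    ref_dose, eval_dose = np.asarray(ref_dose), np.asarray(eval_dose)
    positions = np.asarray(positions, dtype=float)
    norm = dose_tol * ref_dose.max()          # global dose-difference criterion
    gammas = np.empty(len(ref_dose))
    for i, (xr, dr) in enumerate(zip(positions, ref_dose)):
        # gamma at a reference point: minimum combined dose/distance metric
        g2 = ((eval_dose - dr) / norm) ** 2 + ((positions - xr) / dist_tol) ** 2
        gammas[i] = np.sqrt(g2.min())
    return 100.0 * np.mean(gammas <= 1.0)

# Illustrative planned vs. measured profiles on a 1 mm grid
x = np.arange(0.0, 100.0, 1.0)
planned = np.exp(-((x - 50.0) / 15.0) ** 2)
measured = 1.01 * np.exp(-((x - 50.5) / 15.0) ** 2)
print(f"gamma passing rate: {gamma_pass_rate(planned, measured, x):.1f}%")
```

Clinical tools extend the same idea to 2D/3D grids with interpolation; the passing rate is simply the fraction of points with γ ≤ 1.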
NASA Astrophysics Data System (ADS)
Bahadori, Amir A.; Sato, Tatsuhiko; Slaba, Tony C.; Shavers, Mark R.; Semones, Edward J.; Van Baalen, Mary; Bolch, Wesley E.
2013-10-01
NASA currently uses one-dimensional deterministic transport to generate values of the organ dose equivalent needed to calculate stochastic radiation risk following crew space exposures. In this study, organ absorbed doses and dose equivalents are calculated for 50th percentile male and female astronaut phantoms using both the NASA High Charge and Energy Transport Code to perform one-dimensional deterministic transport and the Particle and Heavy Ion Transport Code System to perform three-dimensional Monte Carlo transport. Two measures of radiation risk, effective dose and risk of exposure-induced death (REID) are calculated using the organ dose equivalents resulting from the two methods of radiation transport. For the space radiation environments and simplified shielding configurations considered, small differences (<8%) in the effective dose and REID are found. However, for the galactic cosmic ray (GCR) boundary condition, compensating errors are observed, indicating that comparisons between the integral measurements of complex radiation environments and code calculations can be misleading. Code-to-code benchmarks allow for the comparison of differential quantities, such as secondary particle differential fluence, to provide insight into differences observed in integral quantities for particular components of the GCR spectrum.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beer, M.; Cohen, M.O.
1975-02-01
The adjoint Monte Carlo method previously developed by MAGI has been applied to the calculation of initial radiation dose due to air-secondary gamma rays and fission-product gamma rays at detector points within buildings for a wide variety of problems. These provide an in-depth survey of structure shielding effects as well as many new benchmark problems for matching by simplified models. Specifically, elevated ring source results were obtained in the following areas: doses at on- and off-centerline detectors in four concrete blockhouse structures; doses at detector positions along the centerline of a high-rise structure without walls; dose mapping at basement detector positions in the high-rise structure; doses at detector points within a complex concrete structure containing exterior windows and walls and interior partitions; modeling of the complex structure by replacing interior partitions with additional material at exterior walls; effects of elevation angle changes; effects on the dose of changes in fission-product ambient spectra; and modeling of mutual shielding due to external structures. In addition, point source results yielding dose extremes about the ring source average were obtained.
Analytical dose evaluation of neutron and secondary gamma-ray skyshine from nuclear facilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hayashi, K.; Nakamura, T.
1985-11-01
The skyshine dose distributions of neutrons and secondary gamma rays were calculated systematically using the Monte Carlo method for distances up to 2 km from the source. The energy of the source neutrons ranged from thermal to 400 MeV; their emission angle, from 0 to 90 degrees from the vertical, was treated with a distribution of the direction cosine divided into five equal intervals. The calculated dose distributions D(r) were fitted to the formula D(r) = Q exp(-r/λ)/r, where Q and λ are slowly varying functions of energy. This formula was applied to benchmark problems of neutron skyshine from fission, fusion, and accelerator facilities, and good agreement was achieved. The formula should be quite useful for shielding design of various nuclear facilities.
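The fitted form D(r) = Q exp(-r/λ)/r is straightforward to evaluate and refit. A minimal sketch, with invented dose points standing in for the paper's Monte Carlo results:

```python
import numpy as np
from scipy.optimize import curve_fit

def skyshine_dose(r, Q, lam):
    # D(r) = Q * exp(-r/lambda) / r, with r in metres
    return Q * np.exp(-r / lam) / r

# Hypothetical Monte Carlo doses at several distances (illustrative values only)
r = np.array([100.0, 300.0, 600.0, 1000.0, 2000.0])
d = np.array([2.1e-3, 3.9e-4, 7.2e-5, 1.4e-5, 4.0e-7])

(Q, lam), _ = curve_fit(skyshine_dose, r, d, p0=(1.0, 500.0))
print(f"Q = {Q:.3e}, lambda = {lam:.0f} m")
```

With Q and λ tabulated against source energy and emission angle, a shielding designer can estimate the skyshine dose at any distance without rerunning the transport calculation.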
A Monte-Carlo Benchmark of TRIPOLI-4® and MCNP on ITER neutronics
NASA Astrophysics Data System (ADS)
Blanchet, David; Pénéliau, Yannick; Eschbach, Romain; Fontaine, Bruno; Cantone, Bruno; Ferlet, Marc; Gauthier, Eric; Guillon, Christophe; Letellier, Laurent; Proust, Maxime; Mota, Fernando; Palermo, Iole; Rios, Luis; Guern, Frédéric Le; Kocan, Martin; Reichle, Roger
2017-09-01
Radiation protection and shielding studies are often based on the extensive use of 3D Monte Carlo neutron and photon transport simulations. The ITER organization hence recommends the use of the MCNP-5 code (version 1.60), in association with the FENDL-2.1 neutron cross-section data library, specifically dedicated to fusion applications. The MCNP reference model of the ITER tokamak, the 'C-lite', is being continuously developed and improved. This article proposes an alternative model, equivalent to the 'C-lite', for the Monte Carlo code TRIPOLI-4®. A benchmark study is defined to test this new model. Since one of the most critical areas for ITER neutronics analysis is the assessment of radiation levels and shutdown dose rates (SDDR) behind the equatorial port plugs (EPP), the benchmark compares the neutron flux through the EPP. This problem is quite challenging given the complex geometry and the strong neutron flux attenuation, ranging from 10^14 down to 10^8 n·cm^-2·s^-1. Such code-to-code comparison provides independent validation of the Monte Carlo simulations, improving confidence in the neutronics results.
Jansen, Esther J S; Dijkman, Koen P; van Lingen, Richard A; de Vries, Willem B; Vijlbrief, Daniel C; de Boode, Willem P; Andriessen, Peter
2017-10-01
The aim of this study was to identify inter-centre differences in persistent ductus arteriosus treatment and their related outcomes. We carried out a retrospective, multicentre study including infants between 24+0 and 27+6 weeks of gestation in the period 2010-2011. In all centres, echocardiography was the standard procedure used to diagnose a patent ductus arteriosus and to document ductal closure. In total, 367 preterm infants were included. All four participating neonatal ICUs had comparable numbers of preterm infants; however, differences were observed in the incidence of treatment (33-63%), choice and dosing of medication (ibuprofen or indomethacin), number of pharmacological courses (1-4), and the need for surgical ligation after failure of pharmacological treatment (8-52%). Despite the differences in treatment, we found no difference in short-term morbidity between the centres. Adjusted mortality showed independent risk contributions of gestational age, birth weight, ductal ligation, and perinatal centre. Benchmarking thus identified inter-centre differences. In these four perinatal centres, the factors explaining the differences in patent ductus arteriosus treatment are quite complex. Timing, choice of medication, and dosing are probably important determinants of successful patent ductus arteriosus closure.
Mayo, Charles; Conners, Steve; Warren, Christopher; Miller, Robert; Court, Laurence; Popple, Richard
2013-01-01
Purpose: With the emergence of clinical outcomes databases as tools routinely used within institutions comes the need for software tools that support automated statistical analysis of these large data sets and exchange between independent federated databases to support data pooling. In this paper, the authors present a design approach and analysis methodology that address both issues. Methods: A software application was constructed to automate analysis of patient outcomes data using a wide range of statistical metrics, combining C#.Net and R code. The accuracy and speed of the code were evaluated using benchmark data sets. Results: The approach provides the data needed to evaluate combinations of statistical measurements for their ability to identify patterns of interest in the data. Through application of the tools to a benchmark data set for dose-response threshold and to SBRT lung data sets, an algorithm was developed that uses receiver operating characteristic curves to identify a threshold value and combines contingency tables, Fisher exact tests, Welch t-tests, and Kolmogorov-Smirnov tests to filter the large data set and identify values demonstrating dose-response. Kullback-Leibler divergences were used to provide additional confirmation. Conclusions: The work demonstrates the viability of the design approach and the software tool for analysis of large data sets. PMID:24320426
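The screening pipeline described (an ROC-derived threshold followed by contingency-table and two-sample tests) can be sketched with standard statistical library calls. This illustrates the combination of tests named in the abstract, not the authors' C#.Net/R implementation; the data and threshold below are invented.

```python
import numpy as np
from scipy import stats

def screen_metric(doses, toxicity, threshold):
    """Test one dose metric for association with a binary outcome at a
    candidate threshold (e.g. one chosen from an ROC curve)."""
    doses = np.asarray(doses, dtype=float)
    toxicity = np.asarray(toxicity, dtype=bool)
    high = doses >= threshold
    # 2x2 contingency table: exposure above/below threshold vs. outcome
    table = [[np.sum(high & toxicity), np.sum(high & ~toxicity)],
             [np.sum(~high & toxicity), np.sum(~high & ~toxicity)]]
    _, p_fisher = stats.fisher_exact(table)
    p_welch = stats.ttest_ind(doses[toxicity], doses[~toxicity],
                              equal_var=False).pvalue
    p_ks = stats.ks_2samp(doses[toxicity], doses[~toxicity]).pvalue
    return p_fisher, p_welch, p_ks

# A metric might be kept as "demonstrating dose-response" only if all tests agree
p = screen_metric([12, 30, 8, 45, 22, 50, 15, 41], [0, 1, 0, 1, 0, 1, 0, 1], 25)
print("keep" if max(p) < 0.05 else "drop", p)
```

Requiring agreement across several tests, as the abstract describes, guards against any single test flagging spurious structure in a large, heterogeneous data set.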
Key Performance Indicators in the Evaluation of the Quality of Radiation Safety Programs.
Schultz, Cheryl Culver; Shaffer, Sheila; Fink-Bennett, Darlene; Winokur, Kay
2016-08-01
Beaumont is a multi-hospital health care system with a centralized radiation safety department. The health system operates under a broad-scope Nuclear Regulatory Commission license but also maintains several other limited-use NRC licenses in off-site facilities and clinics. The hospital-based program is expansive, including diagnostic radiology and nuclear medicine (molecular imaging), interventional radiology, a comprehensive cardiovascular program, multiple forms of radiation therapy (low-dose-rate brachytherapy, high-dose-rate brachytherapy, external beam radiotherapy, and Gamma Knife), and the Research Institute (including basic benchtop, human, and animal research). Each year, in the annual report, data are analyzed and then tracked and trended. While any summary report will, by nature, include items such as the number of pieces of equipment, inspections performed, staff monitored and educated, and other similar parameters, not all include an objective review of the quality and effectiveness of the program. On the basis of objective numerical data, Beaumont adopted seven key performance indicators. The assertion made is that key performance indicators can be used to establish benchmarks for evaluating and comparing the effectiveness and quality of radiation safety programs. Based on over a decade of data collection and the adoption of key performance indicators, this paper demonstrates one way to establish objective benchmarking for radiation safety programs in the health care environment.
Alcohol calibration of tests measuring skills related to car driving.
Jongen, Stefan; Vuurman, Eric; Ramaekers, Jan; Vermeeren, Annemiek
2014-06-01
Medication and illicit drugs can have detrimental side effects that impair driving performance. A drug's impairing potential should be determined by well-validated, reliable, and sensitive tests, and ideally be calibrated with benchmark drugs and doses. To date, no consensus has been reached on which psychometric tests are best suited for initial screening of a drug's driving impairment potential. The aim of this alcohol calibration study was to determine which performance tests are useful for measuring drug-induced impairment. The effects of alcohol were used to compare psychometric quality between tests and as a benchmark to quantify performance changes in each test associated with potentially impairing drug effects. Twenty-four healthy volunteers participated in a double-blind, four-way crossover study. Treatments were placebo and three different doses of alcohol leading to blood alcohol concentrations (BACs) of 0.2, 0.5, and 0.8 g/L. Main effects of alcohol were found in most tests. Compared with placebo, performance in the Divided Attention Test (DAT) was significantly impaired after all alcohol doses, and performance in the Psychomotor Vigilance Test (PVT) and the Balance Test was impaired at BACs of 0.5 and 0.8 g/L. The largest effect sizes were found for postural balance with eyes open and for mean reaction time in the divided attention and psychomotor vigilance tests. The preferable tests for initial screening are the DAT and the PVT, as these were the most sensitive to the impairing effects of alcohol while showing considerable validity for assessing potential driving impairment.
Benchmarking the MCNP Monte Carlo code with a photon skyshine experiment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Olsher, R.H.; Hsu, Hsiao Hua; Harvey, W.F.
1993-07-01
The MCNP Monte Carlo transport code is used by the Los Alamos National Laboratory Health and Safety Division for a broad spectrum of radiation shielding calculations. One such application involves the determination of skyshine dose for a variety of photon sources. To verify the accuracy of the code, it was benchmarked against the Kansas State Univ. (KSU) photon skyshine experiment of 1977. The KSU experiment for the unshielded source geometry was simulated in great detail, including the contribution of groundshine, in-silo photon scatter, and the effect of spectral degradation in the source capsule. The standard deviation of the KSU experimental data was stated to be 7%, while the statistical uncertainty of the simulation was kept at or under 1%. The results of the simulation agreed closely with the experimental data, generally to within 6%. At distances of under 100 m from the silo, modeling the in-silo scatter was crucial to achieving close agreement with the experiment; scatter off the top layer of the source cask accounted for approximately 12% of the dose at 50 m. At distances >300 m, using the 60Co line spectrum led to a dose overresponse as great as 19% at 700 m. It was necessary to use the actual source spectrum, which includes a Compton tail from photon collisions in the source capsule, to achieve close agreement with the experimental data. These results highlight the importance of using Monte Carlo transport techniques to account for the nonideal features of even simple experiments.
A deterministic partial differential equation model for dose calculation in electron radiotherapy.
Duclous, R; Dubroca, B; Frank, M
2010-07-07
High-energy ionizing radiation is a prominent modality for the treatment of many cancers. The approaches to electron dose calculation can be categorized into semi-empirical models (e.g. Fermi-Eyges, convolution-superposition) and probabilistic methods (e.g. Monte Carlo). A third approach to dose calculation has only recently attracted attention in the medical physics community. This approach is based on the deterministic kinetic equations of radiative transfer. We derive a macroscopic partial differential equation model for electron transport in tissue. This model involves an angular closure in the phase space. It is exact for the free-streaming and the isotropic regime. We solve it numerically by a newly developed HLLC scheme based on Berthon et al (2007 J. Sci. Comput. 31 347-89) that exactly preserves the key properties of the analytical solution on the discrete level. We discuss several test cases taken from the medical physics literature. A test case with an academic Henyey-Greenstein scattering kernel is considered. We compare our model to a benchmark discrete ordinate solution. A simplified model of electron interactions with tissue is employed to compute the dose of an electron beam in a water phantom, and a case of irradiation of the vertebral column. Here our model is compared to the PENELOPE Monte Carlo code. In the academic example, the fluences computed with the new model and a benchmark result differ by less than 1%. The depths at half maximum differ by less than 0.6%. In the two comparisons with Monte Carlo, our model gives qualitatively reasonable dose distributions. Due to the crude interaction model, these so far do not have the accuracy needed in clinical practice. However, the new model has a computational cost that is less than one-tenth of the cost of a Monte Carlo simulation. In addition, simulations can be set up in a similar way as a Monte Carlo simulation. If more detailed effects such as coupled electron-photon transport, bremsstrahlung, Compton scattering and the production of delta electrons are added to our model, the computation time will only slightly increase. Its margin of error, on the other hand, will decrease and should be within a few per cent of the actual dose. Therefore, the new model has the potential to become useful for dose calculations in clinical practice.
Alonzo, Frédéric; Hertel-Aas, Turid; Real, Almudena; Lance, Emilie; Garcia-Sanchez, Laurent; Bradshaw, Clare; Vives I Batlle, Jordi; Oughton, Deborah H; Garnier-Laplace, Jacqueline
2016-02-01
In this study, we modelled population responses to chronic external gamma radiation in 12 laboratory species (including aquatic and soil invertebrates, fish and terrestrial mammals). Our aim was to compare radiosensitivity between individual and population endpoints and to examine how internationally proposed benchmarks for environmental radioprotection protect species against various risks at the population level. To do so, we used population matrix models, combining life history and chronic radiotoxicity data (derived from laboratory experiments and described in the literature and the FREDERICA database) to simulate changes in population endpoints (net reproductive rate R0, asymptotic population growth rate λ, equilibrium population size Neq) over a range of dose rates. Elasticity analyses of the models showed that population responses differed depending on the affected individual endpoint (juvenile or adult survival, delay in maturity or reduction in fecundity), the considered population endpoint (R0, λ or Neq) and the life history of the studied species. Among population endpoints, the net reproductive rate R0 showed the lowest EDR10 (effective dose rate inducing a 10% effect) in all species, with values ranging from 26 μGy h(-1) in the mouse Mus musculus to 38,000 μGy h(-1) in the fish Oryzias latipes. For several species, EDR10 values for population endpoints were lower than the lowest EDR10 for individual endpoints. Various population-level risks, differing in severity for the population, were investigated. Population extinction (predicted when radiation effects caused the population growth rate λ to decrease below 1, indicating no long-term population growth) was predicted for dose rates ranging from 2,700 μGy h(-1) in fish to 12,000 μGy h(-1) in soil invertebrates. A milder risk, that population growth rate λ will be reduced by 10% of the reduction causing extinction, was predicted for dose rates ranging from 24 μGy h(-1) in mammals to 1,800 μGy h(-1) in soil invertebrates. These predictions suggest that reference benchmarks proposed in the literature for different taxonomic groups protect all simulated species against population extinction. A generic reference benchmark of 10 μGy h(-1) protected all simulated species against 10% of the effect causing population extinction. Finally, a risk of pseudo-extinction, representing a slight but statistically significant population decline whose importance remains to be evaluated in natural settings, was predicted from 2.0 μGy h(-1) in mammals to 970 μGy h(-1) in soil invertebrates.
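Population matrix models of this kind propagate stage-specific survival and fecundity through a Leslie matrix whose dominant eigenvalue is the asymptotic growth rate λ. A minimal sketch with invented vital rates (not those of any species in the study):

```python
import numpy as np

def growth_rate(fecundity, survival):
    """Asymptotic growth rate (dominant eigenvalue) of a Leslie matrix."""
    n = len(fecundity)
    L = np.zeros((n, n))
    L[0, :] = fecundity                                   # stage fecundities
    L[np.arange(1, n), np.arange(n - 1)] = survival       # sub-diagonal survivals
    return np.max(np.abs(np.linalg.eigvals(L)))

fec, surv = [0.0, 2.0, 4.0], [0.5, 0.3]   # illustrative vital rates
lam0 = growth_rate(fec, surv)
# e.g. a radiation effect that cuts fecundity by 10% at some dose rate:
lam1 = growth_rate([f * 0.9 for f in fec], surv)
print(f"lambda: control {lam0:.3f} -> exposed {lam1:.3f}")
```

Repeating the calculation over a grid of dose rates, with each vital rate reduced according to its fitted dose-response curve, yields the population-level dose-response from which endpoints such as EDR10 and the λ = 1 extinction threshold can be read off.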
The MARS15-based FermiCORD code system for calculation of the accelerator-induced residual dose
NASA Astrophysics Data System (ADS)
Grebe, A.; Leveling, A.; Lu, T.; Mokhov, N.; Pronskikh, V.
2018-01-01
The FermiCORD code system, a set of codes based on MARS15 that calculates the accelerator-induced residual doses at experimental facilities of arbitrary configurations, has been developed. FermiCORD is written in C++ as an add-on to Fortran-based MARS15. The FermiCORD algorithm consists of two stages: 1) simulation of residual doses on contact with the surfaces surrounding the studied location and of radionuclide inventories in the structures surrounding those locations using MARS15, and 2) simulation of the emission of the nuclear decay γ-quanta by the residuals in the activated structures and scoring the prompt doses of these γ-quanta at arbitrary distances from those structures. The FermiCORD code system has been benchmarked against similar algorithms based on other code systems and against experimental data from the CERF facility at CERN, and FermiCORD showed reasonable agreement with these. The code system has been applied for calculation of the residual dose of the target station for the Mu2e experiment and the results have been compared to approximate dosimetric approaches.
Clinical decision-making tools for exam selection, reporting and dose tracking.
Brink, James A
2014-10-01
Although many efforts have been made to reduce the radiation dose associated with individual medical imaging examinations to "as low as reasonably achievable," efforts to ensure such examinations are performed only when medically indicated and appropriate are equally if not more important. Variations in the use of ionizing radiation for medical imaging are concerning, whether they occur on a local, regional or national basis. Such variations among practices can be reduced with the use of decision support tools at the time of order entry. These tools help reduce radiation exposure among practices through the appropriate use of medical imaging. Similarly, the adoption of best practices among imaging facilities can be promoted by tracking the radiation exposure of imaging patients. Practices can benchmark their aggregate radiation exposures for medical imaging through the use of dose index registries. However, several variables must be considered when contemplating individual patient dose tracking. The specific dose measures, and the variation among them introduced by variations in body habitus, must be understood. Moreover, the uncertainties in risk estimation from dose metrics related to age, gender and life expectancy must also be taken into account.
Kim, Steven B; Kodell, Ralph L; Moon, Hojin
2014-03-01
In chemical and microbial risk assessments, risk assessors fit dose-response models to high-dose data and extrapolate downward to risk levels in the range of 1-10%. Although multiple dose-response models may be able to fit the data adequately in the experimental range, the estimated effective dose (ED) corresponding to an extremely small risk can be substantially different from model to model. In this respect, model averaging (MA) provides more robustness than a single dose-response model in the point and interval estimation of an ED. In MA, accounting for both data uncertainty and model uncertainty is crucial, but addressing model uncertainty is not achieved simply by increasing the number of models in a model space. A plausible set of models for MA can be characterized by goodness of fit and diversity surrounding the truth. We propose a diversity index (DI) to balance these two characteristics in model-space selection. It addresses a collective property of a model space rather than the individual performance of each model. Tuning parameters in the DI control the size of the model space for MA.
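The averaging step in MA typically weights each fitted model by a goodness-of-fit criterion. The sketch below uses ordinary Akaike weights with invented AIC and ED values; it illustrates only the averaging itself, not the proposed diversity-index selection of the model space.

```python
import numpy as np

def akaike_weights(aic):
    """Akaike weights: w_i proportional to exp(-0.5 * (AIC_i - AIC_min))."""
    delta = np.asarray(aic, dtype=float) - np.min(aic)
    w = np.exp(-0.5 * delta)
    return w / w.sum()

# Hypothetical fits of three dose-response models to the same quantal data
aic = [102.3, 103.1, 107.8]    # goodness-of-fit per model (illustrative)
ed01 = [1.8, 2.4, 0.9]         # each model's ED for 1% extra risk, mg/kg (illustrative)
w = akaike_weights(aic)
print("weights:", np.round(w, 3), " model-averaged ED01:", round(np.dot(w, ed01), 2))
```

The spread of the per-model ED values against their weights makes the point of the abstract concrete: models that fit the observed range almost equally well can still imply very different low-dose EDs, so the composition of the model space matters as much as the weighting.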
Tang, Leilei; Guérard, Melanie; Zeller, Andreas
2014-01-01
Mutagenic and clastogenic effects of some DNA-damaging agents such as methyl methanesulfonate (MMS) and ethyl methanesulfonate (EMS) have been demonstrated to exhibit a nonlinear or even "thresholded" dose-response in vitro and in vivo. DNA repair seems to be mainly responsible for these thresholds. To this end, we assessed several mutagenic alkylators in the Ames test with four different strains of Salmonella typhimurium: the alkyltransferase-proficient strain TA1535 (Ogt+/Ada+), as well as the alkyltransferase-deficient strains YG7100 (Ogt+/Ada-), YG7104 (Ogt-/Ada+) and YG7108 (Ogt-/Ada-). The known genotoxins EMS, MMS, temozolomide (TMZ), ethylnitrosourea (ENU) and methylnitrosourea (MNU) were tested at as many as 22 concentration levels. Dose-response curves were statistically fitted with the PROAST benchmark dose model and the Lutz-Lutz "hockey-stick" model. These dose-response curves suggest efficient DNA repair of lesions inflicted by all agents in strain TA1535. In the absence of Ogt, Ada predominantly repairs methylations but not ethylations. It is concluded that the capacity of alkyltransferases to successfully repair DNA lesions up to certain dose levels contributes to genotoxicity thresholds.
Low-dose CT image reconstruction using gain intervention-based dictionary learning
NASA Astrophysics Data System (ADS)
Pathak, Yadunath; Arya, K. V.; Tiwari, Shailendra
2018-05-01
Computed tomography (CT) is extensively utilized in clinical diagnosis. However, X-ray exposure of the human body may induce somatic damage such as cancer. Owing to this radiation risk, research has focused on the radiation exposure delivered to patients through CT investigations, and low-dose CT has become a significant research area. Many researchers have proposed different low-dose CT reconstruction techniques, but these techniques suffer from various issues such as over-smoothing, artifacts, and noise. Therefore, in this paper, we propose a novel integrated low-dose CT reconstruction technique. The proposed technique utilizes global dictionary-based statistical iterative reconstruction (GDSIR) and adaptive dictionary-based statistical iterative reconstruction (ADSIR). If the dictionary (D) is predetermined, GDSIR can be used; if D is adaptively defined, ADSIR is the appropriate choice. A gain intervention-based filter is also used as a post-processing technique for removing artifacts from the low-dose CT reconstructed images. Experiments were performed with the proposed and other low-dose CT reconstruction techniques on well-known benchmark CT images. Extensive experiments show that the proposed technique outperforms the available approaches.
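The paper embeds the dictionaries inside a statistical iterative reconstruction loop (GDSIR/ADSIR) followed by a gain intervention filter. The sketch below illustrates only the underlying idea of a patch-based dictionary-sparsity prior, using scikit-learn; all parameters are illustrative and this is not the authors' algorithm.

```python
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning
from sklearn.feature_extraction.image import (extract_patches_2d,
                                              reconstruct_from_patches_2d)

def dictionary_denoise(image, n_atoms=64, patch=(7, 7)):
    """Learn a patch dictionary from the noisy image itself, then rebuild
    each patch from its sparse code (the 'global dictionary' flavour)."""
    patches = extract_patches_2d(image, patch)
    X = patches.reshape(len(patches), -1).astype(float)
    mean = X.mean(axis=1, keepdims=True)          # remove per-patch DC level
    dico = MiniBatchDictionaryLearning(n_components=n_atoms, alpha=1.0,
                                       batch_size=256)
    code = dico.fit_transform(X - mean)           # sparse codes per patch
    recon = (code @ dico.components_ + mean).reshape(patches.shape)
    return reconstruct_from_patches_2d(recon, image.shape)

# Usage (hypothetical input): denoised = dictionary_denoise(noisy_ct_slice)
```

The "adaptive" variant differs mainly in learning D during the reconstruction iterations rather than fixing it in advance.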
Laitinen, E-M; Hero, M; Vaaralahti, K; Tommiska, J; Raivio, T
2012-08-01
Patients with congenital hypogonadotropic hypogonadism (HH) may have reduced peak bone mass in early adulthood and increased risk for osteoporosis despite long-term hormonal replacement therapy (HRT). To investigate the relationship between HRT history and measures of bone health in patients with HH, we recruited 33 subjects (24 men, nine women; mean age 39.8 years, range 24.0-69.1) with congenital HH (Kallmann syndrome or normosmic HH). They underwent clinical examination and were interviewed, and their medical charts were reviewed. Twenty-six subjects underwent dual-energy X-ray absorptiometry for evaluation of BMD of the lumbar spine, hip, femoral neck and whole body; body composition and vertebral morphology were evaluated in 22 and 23 subjects, respectively. Circulating PINP, ICTP and sex hormone levels were measured. HRT history was clearly associated with bone health: BMDs of the lumbar spine, femoral neck, hip and whole body were lower in subjects (n = 9) who had had long (≥5 years) treatment pauses or low-dose testosterone (T) treatment than in subjects without such history (n = 17; all p-values < 0.05). In addition, fat mass and body mass index (BMI) were significantly higher in men with a deficient treatment history (median fat mass: 37.5 vs. 23.1%, p = 0.005; BMI: 32.6 vs. 25.2 kg/m(2), p < 0.05). Serum PINP correlated with ICTP (r(s) = 0.61; p < 0.005) in men, but these markers correlated neither with circulating T in men, nor with serum estradiol levels in women. In conclusion, patients with congenital HH require life-long follow-up to avoid inadequate HRT, long treatment pauses and further morbidity.
Buyukkaplan, U S; Guldag, M U
2012-07-01
Fluoride is one of the biological trace elements with a strong affinity for osseous, cartilaginous and dental tissue. The dental and skeletal effects of high fluoride intake have been studied in the literature, but little is known about the effects of high fluoride intake on edentulous mandibles. The purpose of this study was to evaluate the effects of high fluoride intake on mandibular bone mineral density (BMD), measured by the dual-energy X-ray absorptiometry (DXA) technique, in edentulous individuals with systemic fluorosis. Thirty-two people who had lived in an endemic fluorosis area since birth and 31 people who had lived in a non-endemic fluorosis area since birth (control group) participated in this study. Systemic fluorosis was diagnosed using the sialic acid (NANA)/glycosaminoglycan (GAG) ratio. The BMDs of the mandibles were determined by the DXA technique. The serum NANA/GAG ratios in the fluorosis group were significantly lower than those in the control group (p < 0.001). There was also a statistically significant difference in mandibular BMD measurements (p < 0.05) between the systemic fluorosis and control groups. Mandibular body BMD measurements were higher in the fluorosis group (1.25 ± 0.24 g cm(-2)) than in the control group (1.01 ± 0.31 g cm(-2)). The results show that fluoride intake higher than the optimum level increases mandibular BMD in edentulous individuals. Further dose-related studies are needed to determine the effects of high fluoride intake on the bony structures of the stomatognathic system.
ANABOLIC BONE WINDOW WITH WEEKLY TERIPARATIDE THERAPY IN POSTMENOPAUSAL OSTEOPOROSIS: A PILOT STUDY.
Gopalaswamy, Vinaya; Dhibar, Deba Prasad; Gupta, Vipin; Arya, Ashutosh Kumar; Khandelwal, Niranjan; Bhansali, Anil; Garg, Sudhir Kumar; Agarwal, Neelam; Rao, Sudhaker D; Bhadada, Sanjay Kumar
2017-06-01
Osteoporosis is a major public health problem that reduces bone strength and increases fracture risk. Teriparatide is an established and the only currently available anabolic therapy for the treatment of postmenopausal osteoporosis (PMO) with a recommended daily dose of 20 μg given subcutaneously. However, there are limited data regarding the long-term effect of once-weekly teriparatide therapy on bone mineral density (BMD), bone turnover markers (BTMs), and anabolic bone window. In this prospective observational study, 26 patients with PMO were treated with weekly teriparatide therapy (60 μg) for 2 years. BMD was measured at baseline, 12 months, and 24 months. The bone formation marker type 1 collagen C-terminal propeptide (P1NP) and the bone resorption marker C-terminal telopeptide of type 1 collagen (CTx) were measured at baseline; 6 weeks; and 6, 12, 18, and 24 months. BMDs at the lumbar spine increased by 3.1% and 10.8% after 1 and 2 years of weekly teriparatide therapy, respectively. The T-score increased significantly at the lumbar spine compared to baseline after 2 years of therapy (P = .015). Serum P1NP levels increased significantly at 6 months (P = .024), peaked at 1 year, and remained above the baseline even after 2 years. Serum CTx levels decreased significantly at 6 months (P = .025) and remained below baseline after 2 years of teriparatide therapy. Weekly teriparatide therapy (60 μg) appears to be as effective as daily teriparatide for the treatment of PMO by extending the anabolic bone window. AE = adverse event; BMD = bone mineral density; BTM = bone turnover marker; CTx = C-terminal telopeptide of type 1 collagen; DXA = dual-energy X-ray absorptiometry; iPTH = intact parathyroid hormone; P1NP = type 1 collagen C-terminal propeptide; PMO = postmenopausal osteoporosis.
Dosimetric evaluation of a Monte Carlo IMRT treatment planning system incorporating the MIMiC
NASA Astrophysics Data System (ADS)
Rassiah-Szegedi, P.; Fuss, M.; Sheikh-Bagheri, D.; Szegedi, M.; Stathakis, S.; Lancaster, J.; Papanikolaou, N.; Salter, B.
2007-12-01
The high dose per fraction delivered to lung lesions in stereotactic body radiation therapy (SBRT) demands high dose-calculation and delivery accuracy. The inhomogeneous density in the thoracic region, along with the small fields typically used in intensity-modulated radiation therapy (IMRT) treatments, poses a challenge to the accuracy of dose calculation. In this study we dosimetrically evaluated a pre-release version of a Monte Carlo planning system (PEREGRINE 1.6b, NOMOS Corp., Cranberry Township, PA), which incorporates the modeling of serial tomotherapy IMRT treatments with the binary multileaf intensity-modulating collimator (MIMiC). The aim of this study was to document the validation of PEREGRINE 1.6b, since it served in our previous study as a benchmark for investigating the accuracy of doses calculated by a finite-size pencil beam (FSPB) algorithm for lung lesions treated under the SBRT dose regimen via serial tomotherapy. Doses calculated by PEREGRINE were compared against measurements in homogeneous and inhomogeneous materials carried out on a Varian 600C with a 6 MV photon beam. Phantom studies simulating various-sized lesions were also carried out to explain some of the large dose discrepancies seen in the dose calculations for small lesions. Doses calculated by PEREGRINE agreed to within 2% in water and up to 3% for measurements in an inhomogeneous phantom containing lung, bone and unit-density tissue.
Park, Robert M; Bowler, Rosemarie M; Roels, Harry A
2009-10-01
The exposure-response relationship for manganese (Mn)-induced adverse nervous system effects is not well described. Symptoms and neuropsychological deficits associated with early manganism were previously reported for welders constructing bridge piers during 2003 to 2004. A reanalysis using improved exposure and work history information and diverse exposure metrics is presented here. Ten neuropsychological performance measures were examined, including the working memory index (WMI), verbal intelligence quotient, design fluency, Stroop color word test, Rey-Osterrieth Complex Figure, and Auditory Consonant Trigram tests. Mn blood levels and air sampling data, in the form of both personal and area samples, were available. The exposure metrics used were cumulative exposure to Mn, body burden assuming simple first-order kinetics for Mn elimination, and cumulative burden (effective dose). Benchmark doses were calculated. Burden with a half-life of about 150 days was the best predictor of blood Mn. WMI performance declined by 3.6 points (normal = 100, SD = 15) for each 1.0 mg/m3-month of exposure (P = 0.02, one-tailed). At the group mean exposure metric (burden; half-life = 275 days), WMI performance was at the 17th percentile of normal, and at the maximum observed metric, performance was at the 2.5th percentile. Four other outcomes also exhibited statistically significant associations (verbal intelligence quotient, verbal comprehension index, design fluency, Stroop color word test); no dose-rate effect was observed for three of the five outcomes. A risk assessment performed for the five stronger effects, choosing various percentiles of normal performance to represent impairment, identified benchmark doses for a 2-year exposure leading to 5% excess impairment prevalence in the range of 0.03 to 0.15 mg/m3 (30 to 150 μg/m3) total Mn in air, levels far below those permitted by current occupational standards. More than one-third of workers would be impaired after working 2 years at 0.2 mg/m3 Mn (the current threshold limit value).
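A benchmark dose for a continuous outcome such as the WMI is often obtained with the "hybrid" approach: find the exposure at which the prevalence of scores below an impairment cutoff exceeds the background prevalence by the benchmark response. The sketch below reuses the slope reported in the abstract (3.6 index points per mg/m3-month) and the normed scale (mean 100, SD 15), but is otherwise a generic illustration rather than the authors' model.

```python
from scipy import stats
from scipy.optimize import brentq

beta = 3.6              # WMI decline per mg/m3-month of Mn exposure (from abstract)
mu0, sd = 100.0, 15.0   # normed performance scale
cutoff = stats.norm.ppf(0.05, mu0, sd)   # impairment = below the 5th percentile
bmr = 0.05                               # benchmark response: 5% excess prevalence

def excess_prevalence(d):
    # extra probability of falling below the cutoff at cumulative exposure d
    return stats.norm.cdf(cutoff, mu0 - beta * d, sd) - 0.05

bmd = brentq(lambda d: excess_prevalence(d) - bmr, 0.0, 50.0)
print(f"benchmark cumulative exposure ~ {bmd:.2f} mg/m3-months")
```

Dividing such a cumulative benchmark by an assumed exposure duration (here, 2 years) is what converts it into the air-concentration benchmarks quoted in the abstract.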
SU-F-T-231: Improving the Efficiency of a Radiotherapy Peer-Review System for Quality Assurance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hsu, S; Basavatia, A; Garg, M
Purpose: To improve the efficiency of a radiotherapy peer-review system using a commercially available software application for plan quality evaluation and documentation. Methods: A commercial application, FullAccess (Radialogica LLC, Version 1.4.4), was implemented on a Citrix platform for the peer-review process and patient documentation. This application can display images, isodose lines, and dose-volume histograms and create plan reports for the peer-review process. Dose metrics in the report can also be benchmarked for plan quality evaluation. Site-specific templates were generated based on departmental treatment planning policies and procedures for each disease site, which generally follow RTOG protocols as well as published prospective clinical trial data, including both conventional fractionation and hypo-fractionation schemes. Once a plan is ready for review, the planner exports the plan to FullAccess, applies the site-specific template, and presents the report for plan review. The plan is still reviewed in the treatment planning system, as that is the legal record. Upon the physician's approval of a plan, the plan is packaged for peer review with the plan report, and the dose metrics are saved to the database. Results: The reports show dose metrics of PTVs and critical organs for the plans and indicate whether or not the metrics are within tolerance. Green, yellow, and red indicators display whether planning objectives have been met. In addition, benchmarking statistics are collected to show where the current plan falls relative to all historical plans on each metric. All physicians in peer review can easily verify constraints using these reports. Conclusion: We have demonstrated improvements to a radiotherapy peer-review system that allow physicians to easily verify planning constraints for different disease sites and fractionation schemes, allow for standardization in the clinic to ensure that departmental policies are maintained, and build a comprehensive database for potential clinical outcome evaluation.
SU-E-T-22: A Deterministic Solver of the Boltzmann-Fokker-Planck Equation for Dose Calculation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hong, X; Gao, H; Paganetti, H
2015-06-15
Purpose: The Boltzmann-Fokker-Planck equation (BFPE) accurately models the migration of photons and charged particles in tissues. While the Monte Carlo (MC) method is popular for solving the BFPE in a statistical manner, we aim to develop a deterministic BFPE solver based on state-of-the-art numerical acceleration techniques for rapid and accurate dose calculation. Methods: Our BFPE solver is based on a structured grid that is maximally parallelizable, with discretization in energy, angle and space; its cross-section coefficients are derived or directly imported from the Geant4 database. The physical processes taken into account are Compton scattering, the photoelectric effect and pair production for photons, and elastic scattering, ionization and bremsstrahlung for charged particles. While the spatial discretization is based on the diamond scheme, the angular discretization combines the finite element method (FEM) and spherical harmonics (SH): SH are used to globally expand the scattering kernel, and FEM is used to locally discretize the angular sphere. As a result, this hybrid method (FEM-SH) is both accurate in dealing with forward-peaked scattering, via FEM, and efficient for multi-energy-group computation, via SH. In addition, FEM-SH enables analytical integration in the energy variable of the delta scattering kernel for elastic scattering, reducing the truncation error incurred by the numerical integration used in the classic SH-based multi-energy-group method. Results: The accuracy of the proposed BFPE solver was benchmarked against Geant4 for photon dose calculation. In particular, FEM-SH had improved accuracy compared to FEM alone, while both were within 2% of the results obtained with Geant4. Conclusion: A deterministic solver of the Boltzmann-Fokker-Planck equation has been developed for dose calculation and benchmarked against Geant4. Xiang Hong and Hao Gao were partially supported by the NSFC (#11405105), the 973 Program (#2015CB856000) and the Shanghai Pujiang Talent Program (#14PJ1404500).
The relationship of maternal bone density with nutritional rickets in Nigerian children.
Hsu, Jennifer; Fischer, Philip R; Pettifor, John M; Thacher, Tom D
2017-04-01
Factors that affect maternal bone mineral density may be related to the risk of nutritional rickets in their offspring. Our aim was to determine the relationship between maternal areal bone mineral density (aBMD) and rickets in Nigerian children. Using a case-control design, we measured forearm aBMD in 56 mothers of children with nutritional rickets and 135 mothers of children without rickets. Active rickets was confirmed or excluded radiographically in all children. Using logistic regression, we assessed the association of maternal aBMD with nutritional rickets, adjusted for parity, pregnancy and lactation status, duration of the most recent completed lactation, age at menarche, height, body mass index, and maternal age. The median (range) age of the mothers was 30 years (17-47 years), and parity was 4 (1-12). A total of 36 (19%) were pregnant and 55 (29%) were currently breastfeeding. Mean (±SD) metaphyseal forearm aBMDs were 0.321±0.057 and 0.316±0.053 g/cm2 in mothers of children with and without rickets, respectively (P=0.60). Diaphyseal forearm aBMDs were 0.719±0.071 and 0.715±0.072 g/cm2, respectively (P=0.69). In the adjusted analysis, maternal forearm aBMD, bone mineral content and bone area at metaphyseal and diaphyseal sites were not associated with rickets in the child. Rickets was associated with a shorter duration of the most recently completed lactation (aOR 0.91 for each additional month; 95% CI 0.83-0.99), older maternal age (aOR 1.07 for each additional year; 1.00-1.14), and less frequent maternal use of lead-containing eye cosmetics (aOR 0.20; 95% CI 0.05-0.64), without any difference in maternal blood lead levels. Maternal age, parity, age at menarche, height, and body mass index were not associated with having had a child with rickets in the multivariate analysis. Nutritional rickets in Nigerian children was not associated with maternal forearm aBMD; other, unidentified maternal characteristics and practices likely contribute to the risk of rickets in Nigerian children.
Wang, Lei; Baus, Wolfgang; Grimm, Jimm; Lacornerie, Thomas; Nilsson, Joakim; Luchkovskyi, Sergii; Cano, Isabel Palazon; Shou, Zhenyu; Ayadi, Myriam; Treuer, Harald; Viard, Romain; Siebert, Frank‐Andre; Chan, Mark K.H.; Hildebrandt, Guido; Dunst, Jürgen; Imhoff, Detlef; Wurster, Stefan; Wolff, Robert; Romanelli, Pantaleo; Lartigau, Eric; Semrau, Robert; Soltys, Scott G.; Schweikard, Achim
2016-01-01
Stereotactic radiosurgery (SRS) is the accurate, conformal delivery of high-dose radiation to well-defined targets while minimizing normal structure doses via steep dose gradients. While inverse treatment planning (ITP) with computerized optimization algorithms is routine, many aspects of the planning process remain user-dependent. We performed an international, multi-institutional benchmark trial to study planning variability and to analyze preferable ITP practice for spinal robotic radiosurgery. Ten SRS treatment plans were generated for a complex-shaped spinal metastasis with 21 Gy in 3 fractions and tight constraints for the spinal cord (V14Gy < 2 cc, V18Gy < 0.1 cc) and target (coverage > 95%). The resulting plans were rated on a scale from 1 to 4 (excellent to poor) in five categories (constraint compliance, optimization goals, low-dose regions, ITP complexity, and clinical acceptability) by a blinded review panel. Additionally, the plans were rated mathematically based on plan indices (critical structure and target doses, conformity, monitor units, normal tissue complication probability, and treatment time) and compared to the human rankings. The treatment plans and the reviewers' rankings varied substantially among the participating centers. The average mean overall rank was 2.4 (1.2-4.0), and 8/10 plans were rated excellent in at least one category by at least one reviewer. The mathematical rankings agreed with the mean overall human rankings in 9/10 cases, pointing toward the possibility of purely mathematical plan-quality comparison. The final rankings revealed that a plan with a well-balanced trade-off among all planning objectives was preferred for treatment by most participants, reviewers, and the mathematical ranking system; furthermore, this plan was generated with simple planning techniques. Our multi-institutional planning study found wide variability in ITP approaches for spinal robotic radiosurgery. The agreement among participants, reviewers, and the mathematical ranking on preferable treatment plans and ITP techniques indicates that consensus on treatment planning and plan quality can be reached for spinal robotic radiosurgery. PMID:27167291
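Whether a mathematical ranking reproduces reviewers' rankings is essentially a rank-correlation question. A minimal sketch with invented ranks for ten plans (not the study's data):

```python
from scipy import stats

# Hypothetical ranks for 10 plans: mean reviewer rank vs. plan-index rank
human_rank = [2, 1, 4, 3, 6, 5, 8, 7, 10, 9]
math_rank = [1, 2, 4, 3, 5, 6, 8, 7, 9, 10]

rho, p = stats.spearmanr(human_rank, math_rank)
print(f"Spearman rho = {rho:.2f} (p = {p:.3g})")
```

A high rank correlation is the quantitative form of the study's observation that the index-based ranking matched the human ranking in 9 of 10 cases.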
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tang, S; Ho, M; Chen, C
Purpose: The use of log files to perform patient-specific quality assurance for both protons and IMRT is established. Here, we extend that approach to a proprietary log file format and compare our results to measurements in phantom. Our goal was to create a system that would permit gross errors to be found within 3 fractions, before direct measurements are completed; this approach could eventually replace direct measurements. Methods: Scanned proton spots pass through multi-wire ionization chambers, which provide information about the charge, location, and size of each delivered spot. We have written a program that calculates the dose in phantom from these log files and compares the measurements with the plan. The program has three different spot shape models: single Gaussian, double Gaussian and the ASTROID model. The program was benchmarked across different treatment sites for 23 patients and 74 fields. Results: The doses calculated from the log files were compared to those generated by the treatment planning system (RayStation). While the double Gaussian model often gave better agreement, overall the ASTROID model gave the most consistent results. Using a 5%/3 mm gamma criterion with a 90% passing threshold, and excluding doses below 20% of prescription, all patient samples passed. However, the agreement of the log file approach was slightly worse than that of the chamber array measurements. Operationally, this implies that if the beam passes the log file model, it should pass direct measurement. Conclusion: We have established and benchmarked a model for log file QA on an IBA Proteus Plus system. The choice of optimal spot model for a given class of patients may be affected by factors such as site, field size, and range shifter, and will be investigated further.
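The simplest of the three lateral spot models, a single 2D Gaussian per delivered spot, can be evaluated directly from log-file quantities (spot position and charge); the double-Gaussian and ASTROID models add halo terms to better capture the low-dose envelope. All names and values below are illustrative, not the program's actual interface.

```python
import numpy as np

def plane_dose(xgrid, ygrid, spots, sigma):
    """Single-Gaussian lateral model: sum of charge-weighted, normalized
    2D Gaussians for the spots recorded in the delivery log at one depth."""
    dose = np.zeros_like(xgrid, dtype=float)
    norm = 1.0 / (2.0 * np.pi * sigma ** 2)
    for cx, cy, charge in spots:        # log-file spot: position (mm) and charge
        r2 = (xgrid - cx) ** 2 + (ygrid - cy) ** 2
        dose += charge * norm * np.exp(-r2 / (2.0 * sigma ** 2))
    return dose

# Illustrative 100 x 100 mm plane and three logged spots
x, y = np.meshgrid(np.linspace(-50, 50, 201), np.linspace(-50, 50, 201))
spots = [(0.0, 0.0, 1.0), (5.0, 0.0, 0.8), (0.0, 5.0, 0.8)]
d = plane_dose(x, y, spots, sigma=4.0)   # sigma in mm at this depth (assumed)
```

Comparing such a log-reconstructed plane against the planning-system dose with a gamma criterion is what allows gross delivery errors to surface within the first few fractions.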
Dunnick, June K; Shockley, Keith R; Morgan, Daniel L; Brix, Amy; Travlos, Gregory S; Gerrish, Kevin; Michael Sanders, J; Ton, T V; Pandiri, Arun R
2017-04-01
N,N-dimethyl-p-toluidine (DMPT), an accelerant for methyl methacrylate monomers in medical devices, was a liver carcinogen in male and female F344/N rats and B6C3F1 mice in a 2-year oral exposure study. p-Toluidine, a structurally related chemical, was a liver carcinogen in mice but not in rats in an 18-month feed exposure study. In the current study, liver transcriptomic data were used to characterize mechanisms of DMPT and p-toluidine liver toxicity and to conduct benchmark dose (BMD) analysis. Male F344/N rats were exposed orally to DMPT or p-toluidine (0, 1, 6, 20, 60 or 120 mg/kg/day) for 5 days. The liver was examined for lesions and transcriptomic alterations. Both chemicals caused mild hepatic toxicity at 60 and 120 mg/kg and dose-related transcriptomic alterations in the liver. There were 511 liver transcripts differentially expressed for DMPT and 354 for p-toluidine at 120 mg/kg/day (false discovery rate threshold of 5%). The liver transcriptomic alterations were characteristic of an anti-oxidative damage response (activation of the Nrf2 pathway) and hepatic toxicity. The top cellular processes in gene ontology (GO) categories altered in livers exposed to DMPT or p-toluidine were used for BMD calculations. The lower confidence bound benchmark doses were 2 mg/kg/day for DMPT and 7 mg/kg/day for p-toluidine. These studies show the promise of using 5-day target-organ transcriptomic data to identify chemical-induced molecular changes that can serve as markers for preliminary toxicity risk assessment.
Schentag, J J; Paladino, J A; Birmingham, M C; Zimmer, G; Carr, J R; Hanson, S C
1995-01-01
To apply basic benchmarking techniques to hospital antibiotic expenditures and to clinical pharmacy personnel and their duties, in order to identify cost-saving strategies for clinical pharmacy services. Prospective survey of 18 hospitals ranging in size from 201 to 942 beds. Each was asked to provide antibiotic expenditures and an overview of its clinical pharmacy services, and to describe the duties of clinical pharmacists involved in antibiotic management activities. Specific information was sought on the use of pharmacokinetic dosing services, antibiotic streamlining, and oral switch in each of the hospitals. Most smaller hospitals (<300 beds) did not employ clinical pharmacists with the specific duties of antibiotic management or streamlining. At these institutions, antibiotic management services consisted of formulary enforcement and aminoglycoside and/or vancomycin dosing services. The larger hospitals we surveyed employed clinical pharmacists designated as antibiotic management specialists, but their usual activities were also aminoglycoside and/or vancomycin dosing services and formulary enforcement. In virtually all hospitals, the yearly expenses for antibiotics exceeded those of Millard Fillmore Hospitals by $2,000-3,000 per occupied bed. In a 500-bed hospital, this difference in expenditures would exceed $1.5 million yearly. Millard Fillmore Health System has similar types of patients, but employs clinical pharmacists to perform streamlining and/or oral switch functions at days 2-4, when cultures come back from the laboratory. The antibiotic streamlining and oral switch duties of clinical pharmacy specialists are associated with the majority of cost savings in hospital antibiotic management programs. The savings are considerable: most hospitals with 200-300 beds could readily cost-justify a full-time clinical pharmacist to perform these activities on a daily basis, with the expenses of the program offset entirely by the reduction in actual pharmacy expenditures on antibiotics.
Natto, S A; Lewis, D G; Ryde, S J
1998-01-01
The Monte Carlo computer code MCNP (version 4A) has been used to develop a personal computer-based model of the Swansea in vivo neutron activation analysis (IVNAA) system. The model included specification of the neutron source (252Cf), collimators, reflectors and shielding. The MCNP model was 'benchmarked' against fast neutron and thermal neutron fluence data obtained experimentally from the IVNAA system. The Swansea system allows two irradiation geometries using 'short' and 'long' collimators, which provide alternative dose rates for IVNAA. The data presented here relate to the short collimator, although results of similar accuracy were obtained using the long collimator. The fast neutron fluence was measured in air at a series of depths inside the collimator. The measurements agreed with the MCNP simulation within the statistical uncertainty (5-10%) of the calculations. The thermal neutron fluence was measured and calculated inside the cuboidal water phantom. The depth of maximum thermal fluence was 3.2 cm (measured) and 3.0 cm (calculated). The width of the 50% thermal fluence level across the phantom at its mid-depth was found to be the same by both MCNP and experiment. This benchmarking exercise has given us a high degree of confidence in MCNP as a tool for the design of IVNAA systems.
Men, Wu; Deng, Fangfang; He, Jianhua; Yu, Wen; Wang, Fenfen; Li, Yiliang; Lin, Feng; Lin, Jing; Lin, Longshan; Zhang, Yusheng; Yu, Xingguang
2017-10-01
This study investigated the radioactive impacts on 10 nekton species in the Northwest Pacific more than one year after the Fukushima Nuclear Accident (FNA), from the two perspectives of contamination and harm. Squids in particular were used for spatial and temporal comparisons to demonstrate the impacts of the FNA. The radiation doses to nekton species and to humans were assessed to link this radioactive contamination to possible harm. The total dose rates to nekton were lower than the ERICA ecosystem screening benchmark of 10 μGy/h. Further dose-contribution analysis showed that internal doses from the naturally occurring nuclide 210Po were the main contributor. The dose rates from 134Cs, 137Cs, 90Sr and 110mAg were approximately three to four orders of magnitude lower than those from naturally occurring radionuclides. The 210Po-derived dose was also the main contributor to the total human dose from immersion in seawater and ingestion of nekton species. The human doses from anthropogenic radionuclides were ~100 to ~10,000 times lower than the doses from naturally occurring radionuclides. A morbidity assessment based on the Linear No Threshold assumption of exposure showed 7 additional cancer cases per 100,000,000 similarly exposed people. Taken together, there is no cause for concern regarding radioactive harm in the open ocean area of the Northwest Pacific. Copyright © 2017 Elsevier Inc. All rights reserved.
Ralston, Shawn; Garber, Matthew; Narang, Steve; Shen, Mark; Pate, Brian; Pope, John; Lossius, Michele; Croland, Trina; Bennett, Jeff; Jewell, Jennifer; Krugman, Scott; Robbins, Elizabeth; Nazif, Joanne; Liewehr, Sheila; Miller, Ansley; Marks, Michelle; Pappas, Rita; Pardue, Jeanann; Quinonez, Ricardo; Fine, Bryan R; Ryan, Michael
2013-01-01
Acute viral bronchiolitis is the most common diagnosis resulting in hospital admission in pediatrics. Utilization of non-evidence-based therapies and testing remains common despite a large volume of evidence to guide quality improvement efforts. Our objective was to reduce utilization of unnecessary therapies in the inpatient care of bronchiolitis across a diverse network of clinical sites. We formed a voluntary quality improvement collaborative of pediatric hospitalists for the purpose of benchmarking the use of bronchodilators, steroids, chest radiography, chest physiotherapy, and viral testing in bronchiolitis using hospital administrative data. We shared resources within the network, including protocols, scores, order sets, and key bibliographies, and established group norms for decreasing utilization. Aggregate data on 11,568 hospitalizations for bronchiolitis from 17 centers were analyzed for this report. The network was organized in 2008. By 2010, we saw a 46% reduction in overall volume of bronchodilators used, an absolute decrease of 3.4 doses per patient (95% confidence interval [CI] 1.4-5.8). Overall exposure to any dose of bronchodilator also decreased, by 12 percentage points (95% CI 5%-25%). There was also a statistically significant decline in chest physiotherapy usage, but not in steroids, chest radiography, or viral testing. Benchmarking within a voluntary pediatric hospitalist collaborative facilitated decreased utilization of bronchodilators and chest physiotherapy in bronchiolitis. Copyright © 2012 Society of Hospital Medicine.
Benchmark dose for cadmium exposure and elevated N-acetyl-β-D-glucosaminidase: a meta-analysis.
Liu, CuiXia; Li, YuBiao; Zhu, ChunShui; Dong, ZhaoMin; Zhang, Kun; Zhao, YanBin; Xu, YiLu
2016-10-01
Cadmium (Cd) is a well-known nephrotoxic contaminant, and N-acetyl-β-D-glucosaminidase (NAG) is considered an early and sensitive marker of tubular dysfunction. The link between Cd exposure and NAG level enables derivation of a benchmark dose (BMD) for Cd. Although several reports have documented urinary Cd (UCd)-NAG relationships and BMD estimations, high heterogeneity arises from the sub-populations (age, gender, and ethnicity) and BMD methodologies employed. To clarify the influence of these variables, a random-effects meta-analysis was first performed in this study to correlate UCd and NAG, based on 92 datasets collected from 30 publications. This established correlation (Ln(NAG) = 0.51 × Ln(UCd) + 0.83) was then applied to derive a UCd BMD5 of 1.76 μg/g creatinine and a 95% lower confidence limit of the BMD5 (BMDL5) of 1.67 μg/g creatinine. While the regressions for different age groups and genders differed slightly, it is age, not gender, that significantly affects BMD estimations. Ethnic differences may require further investigation given that limited data are currently available. Based on a comprehensive and systematic literature review, this study is a new attempt to quantify the UCd-NAG link and estimate the BMD.
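The pooled regression quoted above can be inverted directly to find the urinary Cd predicted to produce a given NAG level. The short sketch below performs only that inversion; the NAG cutoff value is hypothetical, and the published BMD5/BMDL5 additionally involve a benchmark-response definition and confidence-limit calculation not shown here.

```python
import math

# Pooled regression from the meta-analysis: Ln(NAG) = 0.51 * Ln(UCd) + 0.83
SLOPE, INTERCEPT = 0.51, 0.83

def ucd_for_nag(nag_target):
    """Urinary Cd (ug/g creatinine) predicted by the pooled log-log
    regression to produce a given NAG level. Illustrative inversion only."""
    return math.exp((math.log(nag_target) - INTERCEPT) / SLOPE)

# e.g. the UCd predicted to raise NAG to a hypothetical cutoff of 3 U/g
print("UCd at NAG = 3: %.2f ug/g creatinine" % ucd_for_nag(3.0))
```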
NASA Astrophysics Data System (ADS)
Lin, Yi-Chun; Huang, Tseng-Te; Liu, Yuan-Hao; Chen, Wei-Lin; Chen, Yen-Fu; Wu, Shu-Wei; Nievaart, Sander; Jiang, Shiang-Huei
2015-06-01
The paired ionization chamber (IC) technique is commonly employed to determine neutron and photon doses in radiology or radiotherapy neutron beams, where the neutron dose depends strongly on the accuracy of the accompanying high-energy photon dose. In deriving the dose, an important issue is to evaluate the photon and electron response functions of the two commercially available ionization chambers, denoted TE(TE) and Mg(Ar), used in our reactor-based epithermal neutron beam. Nowadays, most perturbation corrections for accurate dose determination and many treatment planning systems are based on the Monte Carlo technique. We used the general-purpose Monte Carlo codes MCNP5, EGSnrc, FLUKA, and GEANT4 for benchmark verification among the codes and against carefully measured values, for a precise estimation of chamber current from the absorbed dose rate of the cavity gas. Energy-dependent response functions of the two chambers were calculated in a parallel beam with mono-energies from 20 keV to 20 MeV for photons and electrons, using both an optimal simple spherical model and a detailed IC model. The measurements were performed in well-defined (a) four primary M-80, M-100, M-120, and M-150 X-ray calibration fields, (b) a primary 60Co calibration beam, (c) 6 MV and 10 MV photon and (d) 6 MeV and 18 MeV electron LINAC beams in hospital, and (e) a BNCT clinical trial neutron beam. For the TE(TE) chamber, all codes were almost identical over the whole photon energy range. For the Mg(Ar) chamber, MCNP5 showed a lower response than the other codes below 0.1 MeV photon energy and a similar response above 0.2 MeV (agreement within 5% in the simple spherical model). With increasing electron energy, the response difference between MCNP5 and the other codes became larger in both chambers. Compared with the measured currents, MCNP5 agreed with the measurement data within 5% for the 60Co, 6 MV, 10 MV, 6 MeV, and 18 MeV LINAC beams. For the Mg(Ar) chamber, however, the deviations reached 7.8-16.5% below 120 kVp X-ray beams. In this study we were especially interested in BNCT doses, where the low-energy photon contribution is small enough to ignore; the MCNP model is recognized as the most suitable to simulate the widely distributed photon-electron and neutron energy responses of the paired ICs. MCNP also provides the best prediction of BNCT source adjustment from the detector's neutron and photon responses.
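For context, the paired-chamber dose separation referred to above reduces, in its schematic textbook form, to a two-by-two linear system; the sensitivities h and k below are chamber- and beam-specific calibration quantities, not values from this study.

```latex
% Schematic paired-ionization-chamber dose separation: the TE(TE) chamber
% responds to photons and neutrons, the Mg(Ar) chamber mainly to photons.
\[
\begin{aligned}
M_{\mathrm{TE}} &= h_{\mathrm{TE}}\,D_\gamma + k_{\mathrm{TE}}\,D_n,\\
M_{\mathrm{Mg}} &= h_{\mathrm{Mg}}\,D_\gamma + k_{\mathrm{Mg}}\,D_n,
\end{aligned}
\qquad\Longrightarrow\qquad
D_n=\frac{h_{\mathrm{Mg}}M_{\mathrm{TE}}-h_{\mathrm{TE}}M_{\mathrm{Mg}}}
        {h_{\mathrm{Mg}}k_{\mathrm{TE}}-h_{\mathrm{TE}}k_{\mathrm{Mg}}}.
\]
```

This is why the photon and electron response functions computed by the MC codes dominate the accuracy of the derived neutron dose.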
Use of a computer code for dose distribution studies in a 60Co industrial irradiator
NASA Astrophysics Data System (ADS)
Piña-Villalpando, G.; Sloan, D. P.
1995-09-01
This paper presents a benchmark comparison between calculated and experimental absorbed dose values for a typical product in a 60Co industrial irradiator located at ININ, México. The irradiator is a two-level, two-layer system with an overlapping product configuration and an activity of around 300 kCi. Experimental values were obtained from routine dosimetry using red acrylic pellets. The typical product was packages of Petri dishes, with an apparent density of 0.13 g/cm3; that product was chosen because of its uniform size, large quantity, and low density. The minimum dose was fixed at 15 kGy. Calculated values were obtained from the QAD-CGGP code. This code uses a point-kernel technique; build-up factors are fitted by a geometric progression, and combinatorial geometry is used for the system description. The main modifications to the code concerned source simulation, using point sources instead of pencils, and an energy spectrum with anisotropic emission was included. For the maximum dose, the calculated value (18.2 kGy) was 8% higher than the experimental average value (16.8 kGy); for the minimum dose, the calculated value (13.8 kGy) was 3% lower than the experimental average value (14.3 kGy).
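The point-kernel technique used by QAD-CGGP evaluates, in its generic single-energy form, an attenuated inverse-square kernel with a build-up factor for each point source; the expression below is the standard textbook form of that kernel, not a code-specific formula.

```latex
% Generic photon point-kernel dose sum over point sources i at energy E:
\[
D(\mathbf{r}) \;=\; \sum_i
\frac{S_i\,E\,(\mu_{en}/\rho)}{4\pi\,|\mathbf{r}-\mathbf{r}_i|^{2}}\;
B(\mu t_i)\;e^{-\mu t_i},
\]
% S_i: source strength; t_i: path length through attenuating material;
% mu: linear attenuation coefficient; B: build-up factor (fitted here by
% a geometric progression, as the abstract notes).
```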
Oral toxicity of 3-nitro-1,2,4-triazol-5-one in rats.
Crouse, Lee C B; Lent, Emily May; Leach, Glenn J
2015-01-01
3-Nitro-1,2,4-triazol-5-one (NTO), an insensitive explosive, was evaluated to assess potential environmental and human health effects. A 14-day oral toxicity study in Sprague-Dawley rats was conducted with NTO in polyethylene glycol-200 by gavage at doses of 0, 250, 500, 1000, 1500, or 2000 mg/kg-d. Body mass and food consumption decreased in males (2000 mg/kg-d), and testes mass was reduced at doses of 500 mg/kg-d and greater. Based on the findings in the 14-day study, a 90-day study was conducted at doses of 0, 30, 100, 315, or 1000 mg/kg-d NTO. There was no effect on food consumption, body mass, or neurobehavioral parameters. Males in the 315 and 1000 mg/kg-d groups had reduced testes mass with associated tubular degeneration and atrophy. The testicular effects were the most sensitive adverse effect and were used to derive a benchmark dose (BMD) of 70 mg/kg-d for a 10% effect level, with a lower confidence limit (BMDL10) of 40 mg/kg-d. © The Author(s) 2015.
The Use of National Weather Service Data to Compute the Dose to the MEOI.
Vickers, Linda
2018-05-01
The Turner method is the "benchmark method" for computing the stability class used to compute the atmospheric dispersion factor X/Q (s/m3). The Turner method should be used to ascertain the validity of X/Q results determined by other methods. This paper used site-specific meteorological data obtained from the National Weather Service. The Turner method described herein is simple, quick, accurate, and transparent, because all of the data, calculations, and results are visible for verification and validation against the published literature.
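To illustrate the kind of lookup the Turner method performs, the sketch below maps daytime wind speed and insolation to a Pasquill stability class. This is a simplified illustrative table, not the full method: Turner's procedure derives a net radiation index from solar altitude, cloud cover and ceiling height, and also covers night-time classes.

```python
# Simplified Pasquill-Gifford-style daytime stability lookup, in the spirit
# of the Turner method (table values and bands are illustrative).
DAY = {  # wind speed band (m/s) -> class by (strong, moderate, slight) insolation
    (0.0, 2.0): ("A", "A", "B"),
    (2.0, 3.0): ("A", "B", "C"),
    (3.0, 5.0): ("B", "B", "C"),
    (5.0, 6.0): ("C", "C", "D"),
    (6.0, 99.0): ("C", "D", "D"),
}
INSOLATION = {"strong": 0, "moderate": 1, "slight": 2}

def stability_class(wind_speed_ms, insolation):
    """Return the daytime Pasquill stability class (A = most unstable)."""
    col = INSOLATION[insolation]
    for (lo, hi), classes in DAY.items():
        if lo <= wind_speed_ms < hi:
            return classes[col]
    raise ValueError("wind speed out of range")

print(stability_class(2.5, "moderate"))  # -> "B"
```

The selected class then indexes the dispersion coefficients used in the X/Q calculation.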
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Y M; Bush, K; Han, B
Purpose: Accurate and fast dose calculation is a prerequisite of precision radiation therapy in modern photon and particle therapy. While Monte Carlo (MC) dose calculation provides high dosimetric accuracy, the drastically increased computational time hinders its routine use. Deterministic dose calculation methods are fast, but problematic in the presence of tissue density inhomogeneity. We leverage the useful features of deterministic methods and MC to develop a hybrid dose calculation platform with autonomous utilization of MC and deterministic calculation depending on the local geometry, for optimal accuracy and speed. Methods: Our platform utilizes a Geant4 based “localized Monte Carlo” (LMC) method that isolates MC dose calculations to only those volumes that have potential for dosimetric inaccuracy. In our approach, additional structures are created encompassing heterogeneous volumes. Deterministic methods calculate dose and energy fluence up to the volume surfaces, where the energy fluence distribution is sampled into discrete histories and transported using MC. Histories exiting the volume are converted back into energy fluence and transported deterministically. By matching boundary conditions at both interfaces, the deterministic dose calculation accounts for dose perturbations “downstream” of localized heterogeneities. Hybrid dose calculation was performed for water and anthropomorphic phantoms. Results: We achieved <1% agreement between deterministic and MC calculations in the water benchmark for photon and proton beams, and dose differences of 2%–15% were observed in heterogeneous phantoms. The saving in computational time (a factor of ∼4–7 compared to a full Monte Carlo dose calculation) was found to be approximately proportional to the volume of the heterogeneous region. Conclusion: Our hybrid dose calculation approach takes advantage of the computational efficiency of deterministic methods and the accuracy of MC, providing a practical tool for high-performance dose calculation in modern RT. The approach is generalizable to all modalities where heterogeneities play a large role, notably particle therapy.
Comparison of Monte Carlo and analytical dose computations for intensity modulated proton therapy
NASA Astrophysics Data System (ADS)
Yepes, Pablo; Adair, Antony; Grosshans, David; Mirkovic, Dragan; Poenisch, Falk; Titt, Uwe; Wang, Qianxia; Mohan, Radhe
2018-02-01
To evaluate the effect of approximations in clinical analytical calculations performed by a treatment planning system (TPS) on dosimetric indices in intensity modulated proton therapy. TPS-calculated dose distributions were compared with dose distributions estimated by Monte Carlo (MC) simulations, calculated with the fast dose calculator (FDC), a system previously benchmarked against full MC. This study analyzed a total of 525 patients across four treatment sites (brain, head-and-neck, thorax, and prostate). Dosimetric indices (D02, D05, D20, D50, D95, D98, EUD, and mean dose) and a gamma-index analysis were used to evaluate the differences. The gamma-index passing rates for a 3%/3 mm criterion, for voxels with a dose larger than 10% of the maximum dose, had a median larger than 98% for all sites. The median difference in all dosimetric indices for target volumes was less than 2% for all cases. However, differences for target volumes as large as 10% were found for 2% of the thoracic patients. For organs at risk (OARs), the median absolute dose difference was smaller than 2 Gy for all indices and cohorts. However, absolute dose differences as large as 10 Gy were found for some small-volume organs in brain and head-and-neck patients. This analysis concludes that for a fraction of the patients studied, the TPS may overestimate the dose in the target by as much as 10%, while for some OARs the dose could be underestimated by as much as 10 Gy. Monte Carlo dose calculations may be needed to ensure more accurate dose computations to improve target coverage and sparing of OARs in proton therapy.
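For readers unfamiliar with the dosimetric indices compared above, the sketch below computes Dxx-style indices and a generalized EUD from a flat dose array, assuming equal-volume voxels; the toy data and the EUD parameter are illustrative, not values from the study.

```python
import numpy as np

def d_index(dose, pct):
    """Dxx: dose received by at least xx% of the volume
    (the (100 - xx)th percentile of the voxel doses)."""
    return np.percentile(dose, 100.0 - pct)

def eud(dose, a):
    """Generalized equivalent uniform dose: EUD = (mean(d_i^a))^(1/a);
    a large negative 'a' emphasizes cold spots (target-like behaviour)."""
    d = np.asarray(dose, float)
    return np.mean(d ** a) ** (1.0 / a)

# Toy target dose distribution in Gy
dose = np.random.default_rng(0).normal(60.0, 1.5, 10000)
print("D95 = %.1f Gy" % d_index(dose, 95))  # near-minimum target dose
print("D02 = %.1f Gy" % d_index(dose, 2))   # near-maximum dose
print("EUD = %.1f Gy" % eud(dose, -10))
```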
Extension of PENELOPE to protons: simulation of nuclear reactions and benchmark with Geant4.
Sterpin, E; Sorriaux, J; Vynckier, S
2013-11-01
Describing the implementation of nuclear reactions in the extension of the Monte Carlo code (MC) PENELOPE to protons (PENH) and benchmarking with Geant4. PENH is based on mixed-simulation mechanics for both elastic and inelastic electromagnetic (EM) collisions. The adopted differential cross sections for EM elastic collisions are calculated using the eikonal approximation with the Dirac-Hartree-Fock-Slater atomic potential. Cross sections for EM inelastic collisions are computed within the relativistic Born approximation, using the Sternheimer-Liljequist model of the generalized oscillator strength. Nuclear elastic and inelastic collisions were simulated explicitly using the Scattering Analysis Interactive Dial-in (SAID) database for 1H and ICRU 63 data for 12C, 14N, 16O, 31P, and 40Ca. Secondary protons, alphas, and deuterons were all simulated as protons, with the energy adapted to ensure a consistent range. Prompt gamma emission can also be simulated upon user request. Simulations were performed in a water phantom with nuclear interactions switched off or on, and integral depth-dose distributions were compared. Binary-cascade and precompound models were used for Geant4. Initial energies of 100 and 250 MeV were considered. For cases with no nuclear interactions simulated, additional simulations in a water phantom with tight resolution (1 mm in all directions) were performed with FLUKA. Finally, integral depth-dose distributions for 250 MeV were computed with Geant4 and PENH in a homogeneous phantom with, first, ICRU striated muscle and, second, ICRU compact bone. For simulations with EM collisions only, integral depth-dose distributions were within 1%/1 mm for doses higher than 10% of the Bragg-peak dose. For central-axis depth-dose and lateral profiles in a phantom with tight resolution, there are significant deviations between Geant4 and PENH (up to 60%/1 cm for depth-dose distributions). The agreement is much better with FLUKA, with deviations within 3%/3 mm. When nuclear interactions were turned on, agreement between PENH and Geant4 (within 6% before the Bragg peak) was consistent with the uncertainties on nuclear models and cross sections, whatever the material simulated (water, muscle, or bone). A detailed and flexible description of nuclear reactions has been implemented in the PENH extension of PENELOPE to protons, which utilizes a mixed-simulation scheme for both elastic and inelastic EM collisions, analogous to the well-established algorithm for electrons/positrons. PENH is compatible with all current main programs that use PENELOPE as the MC engine. The nuclear model of PENH is realistic enough to give dose distributions in fair agreement with those computed by Geant4.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fasso, A.; Ferrari, A.; Ferrari, A.
In 1974, Nelson, Kase and Svensson published an experimental investigation on muon shielding around SLAC high-energy electron accelerators [1]. They measured muon fluence and absorbed dose induced by 14 and 18 GeV electron beams hitting a copper/water beam dump and attenuated in a thick steel shield. In their paper, they compared the results with the theoretical models available at that time. In order to compare their experimental results with present model calculations, we use the modern transport Monte Carlo codes MARS15, FLUKA2011 and GEANT4 to model the experimental setup and run simulations. The results are then compared between the codes, and with the SLAC data.
NASA Technical Reports Server (NTRS)
Gronoff, Guillaume; Norman, Ryan B.; Mertens, Christopher J.
2014-01-01
The ability to evaluate the cosmic ray environment at Mars is of interest for future manned exploration. To support exploration, tools must be developed to accurately assess the radiation environment in both free space and on planetary surfaces. The primary tool NASA uses to quantify radiation exposure behind shielding materials is the space radiation transport code HZETRN. In order to build confidence in HZETRN, code benchmarking against Monte Carlo radiation transport codes is often used. This work compares dose calculations at Mars by HZETRN and the Geant4 application Planetocosmics. The dose at ground level and the energy deposited in the atmosphere by galactic cosmic ray protons and alpha particles have been calculated for the Curiosity landing conditions. In addition, this work considered Solar Energetic Particle events, allowing for the comparison of varying input radiation environments. The results for protons and alpha particles show very good agreement between HZETRN and Planetocosmics.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, Thomas Martin; Celik, Cihangir; McMahan, Kimberly L.
This benchmark experiment was conducted as a joint venture between the US Department of Energy (DOE) and the French Commissariat à l'Energie Atomique (CEA). Staff at the Oak Ridge National Laboratory (ORNL) in the US and the Centre de Valduc in France planned this experiment. The experiment was conducted on October 11, 2010 in the SILENE critical assembly facility at Valduc. Several other organizations contributed to this experiment and the subsequent evaluation, including CEA Saclay, Lawrence Livermore National Laboratory (LLNL), the Y-12 National Security Complex (NSC), Babcock International Group in the United Kingdom, and Los Alamos National Laboratory (LANL). The goal of this experiment was to measure neutron activation and thermoluminescent dosimeter (TLD) doses from a source similar to a fissile solution critical excursion. The resulting benchmark can be used for validation of computer codes and nuclear data libraries as required when performing analysis of criticality accident alarm systems (CAASs). A secondary goal of this experiment was to qualitatively test the performance of two CAAS detectors similar to those currently and formerly in use in some US DOE facilities. The detectors tested were the CIDAS MkX and the Rocky Flats NCD-91. These detectors were being evaluated to determine whether they would alarm, so they were not expected to generate benchmark-quality data.
Radiological assessment for bauxite mining and alumina refining.
O'Connor, Brian H; Donoghue, A Michael; Manning, Timothy J H; Chesson, Barry J
2013-01-01
Two international benchmarks assess whether the mining and processing of ores containing Naturally Occurring Radioactive Material (NORM) require management under radiological regulations set by local jurisdictions. First, the 1 Bq/g benchmark for head-of-chain radionuclide activity concentration determines whether materials may be excluded from radiological regulation. Second, processes may be exempted from radiological regulation where occupational above-background exposures for members of the workforce do not exceed 1 mSv/year; this is also the upper limit of exposure prescribed for members of the public. Alcoa of Australia Limited (Alcoa) has undertaken radiological evaluations of the mining and processing of bauxite from the Darling Range of Western Australia since the 1980s. Short-term monitoring projects have demonstrated that above-background exposures for workers do not exceed 1 mSv/year. A whole-of-year evaluation of above-background occupational radiological doses for bauxite mining, alumina refining and residue operations was conducted during 2008/2009 as part of the Alcoa NORM Quality Assurance System (NQAS). The NQAS has been guided by publications from the International Commission on Radiological Protection (ICRP), the International Atomic Energy Agency (IAEA) and the Australian Radiation Protection and Nuclear Safety Agency (ARPANSA). The NQAS has been developed specifically in response to implementation of the Australian National Directory on Radiation Protection (NDRP). Positional monitoring was undertaken to increase the accuracy of the natural background levels required for correction of occupational exposures. This is important in view of the small increments in exposure that occur in bauxite mining, alumina refining and residue operations relative to natural background. Positional monitoring was also undertaken to assess the potential for exposure in operating locations. Personal monitoring was undertaken to characterise exposures in Similar Exposure Groups (SEGs). The monitoring was undertaken over 12 months to provide annual average assessments of above-background doses, thereby reducing temporal variations, especially for radon exposures. The monitoring program concentrated on gamma and radon exposures rather than gross alpha exposures, as past studies have shown that gross alpha exposures from inhalable dust are, for most of the workforce, small in comparison to combined gamma and radon exposures. The natural background determinations were consistent with data in the literature for localities near Alcoa's mining, refining and residue operations in Western Australia, and also with UNSCEAR global data. Within the mining operations, there was further consistency between the above-background dose estimates and the local geochemistry, with slight elevation of dose levels in mining pits. Conservative estimates of above-background levels for the workforce have been made using an assumption of 100% occupancy (1920 hours per year) for the SEGs considered. Total incremental composite doses for individuals were clearly less than 1.0 mSv/year when gamma, radon progeny and gross alpha exposures were considered. This is despite the activity concentration of some materials being slightly higher than the benchmark of 1 Bq/g. The results are consistent with previous monitoring and demonstrate compliance with the 1 mSv/year exemption level within mining, refining and residue operations. These results will be of value to bauxite mines and alumina refineries elsewhere in the world.
Quality assurance of the SCOPE 1 trial in oesophageal radiotherapy.
Wills, Lucy; Maggs, Rhydian; Lewis, Geraint; Jones, Gareth; Nixon, Lisette; Staffurth, John; Crosby, Tom
2017-11-15
SCOPE 1 was the first UK-based multi-centre trial involving radiotherapy of the oesophagus. A comprehensive radiotherapy trials quality assurance programme was launched with two main aims: 1. To assist centres, where needed, to adapt their radiotherapy techniques in order to achieve protocol compliance and thereby enable their participation in the trial. 2. To support the trial's clinical outcomes by ensuring the consistent planning and delivery of radiotherapy across all participating centres. A detailed information package was provided, and centres were required to complete a benchmark case, in which the delineated target volumes and organs at risk, dose distribution and completion of a plan assessment form were assessed prior to recruiting patients into the trial. Once centres were recruiting, the quality assurance (QA) programme continued to monitor the outlining and planning of radiotherapy treatments. Completion of a questionnaire was requested in order to gather information about each centre's equipment and techniques relating to their trial participation, and to assess the impact of the trial nationally on standard practice for radiotherapy of the oesophagus. During the trial, advice was available for individual planning issues and was circulated amongst the SCOPE 1 community in response to common areas of concern using bulletins. 36 centres were supported through QA processes to enable their participation in SCOPE 1. We discuss the issues which arose throughout this process and present details of the benchmark case solutions, centre questionnaires and on-trial protocol compliance. The range of submitted benchmark case GTV volumes was 29.8-67.8 cm3, and PTV volumes 221.9-513.3 cm3. For the dose distributions associated with these volumes, the percentage volume of the lungs receiving 20 Gy (V20Gy) ranged from 20.4 to 33.5%. Similarly, heart V40Gy ranged from 16.1 to 33.0%. The incidence of incorrect outlining of OAR volumes increased from 50% of centres at the benchmark case to 64% on trial. Sixty-five percent of the centres who returned the trial questionnaire stated that their standard practice had changed as a result of their participation in the SCOPE 1 trial. The SCOPE 1 QA programme outcomes lend support to the trial's clinical conclusions. The range of patient planning outcomes for the benchmark case indicated, at the outset of the trial, the significant degree of variation present in UK oesophageal radiotherapy planning, despite the presence of a protocol. This supports the case for increasingly detailed definition of practice by means of consensus protocols, training and peer review. The incidence of minor inconsistencies of technique highlights the potential for improved QA systems and the need for sufficient resource for this to be addressed within future trials. As indicated in questionnaire responses, the QA exercise as a whole has contributed to greater consistency of oesophageal radiotherapy in the UK via the adoption into standard practice of elements of the protocol. The SCOPE 1 trial is an International Standard Randomised Controlled Trial, ISRCTN47718479.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lin, L; Huang, S; Kang, M
Purpose: Eclipse proton Monte Carlo AcurosPT 13.7 was commissioned and experimentally validated for an IBA dedicated PBS nozzle in water. Topas 1.3 was used to isolate the cause of differences in output and penumbra between simulation and experiment. Methods: The spot profiles were measured in air at five locations using a Lynx detector. A PTW-34070 Bragg peak chamber (Freiburg, Germany) was used to collect the relative integral Bragg peaks for 15 proton energies from 100 MeV to 225 MeV. The phase space parameters (σx, σθ, ρxθ), number of protons per MU, energy spread and calculated mean energy provided by AcurosPT were identically implemented into Topas. The absolute dose, profiles and field size factors measured using ionization chamber arrays were compared with both AcurosPT and Topas. Results: The beam spot size, σx, and the angular spread, σθ, in air were both energy dependent: in particular, the spot size in air at isocentre ranged from 2.8 to 5.3 mm, and the angular spread ranged from 2.7 mrad to 6 mrad. The number of protons per MU increased from ∼9E7 at 100 MeV to ∼1.5E8 at 225 MeV. Both AcurosPT and Topas agreed with experiment within a 2 mm penumbra difference or 3% dose difference for scenarios including central-axis depth dose and profiles at two depths in multi-spot square fields, from 40 to 200 mm, for all the investigated single-energy and multi-energy beams, indicating a clinically acceptable source model and radiation transport algorithm in water. Conclusion: By comparing measured data and Topas simulations using the same source model, AcurosPT 13.7 was validated in water within a 2 mm penumbra difference or 3% dose difference. Benchmarking against an independent Monte Carlo code is recommended to study agreement in output, field size factors and penumbra. This project is partially supported by a Varian grant under the master agreement between the University of Pennsylvania and Varian.
NASA Astrophysics Data System (ADS)
Pappas, E. P.; Moutsatsos, A.; Pantelis, E.; Zoros, E.; Georgiou, E.; Torrens, M.; Karaiskos, P.
2016-02-01
This work presents a comprehensive Monte Carlo (MC) simulation model for the Gamma Knife Perfexion (PFX) radiosurgery unit. Model-based dosimetry calculations were benchmarked in terms of relative dose profiles (RDPs) and output factors (OFs) against corresponding EBT2 measurements. To reduce the rather prolonged computational time associated with the comprehensive PFX model MC simulations, two approximations were explored and evaluated on the grounds of dosimetric accuracy. The first consists in directional biasing of the 60Co photon emission, while the second refers to the implementation of simplified source geometric models. The effect of the dose scoring volume dimensions on OF calculation accuracy was also explored. RDP calculations for the comprehensive PFX model were found to be in agreement with corresponding EBT2 measurements. Output factors of 0.819 ± 0.004 and 0.8941 ± 0.0013 were calculated for the 4 mm and 8 mm collimators, respectively, which agree, within uncertainties, with corresponding EBT2 measurements and published experimental data. Volume averaging was found to affect OF results by more than 0.3% for scoring volume radii greater than 0.5 mm and 1.4 mm for the 4 mm and 8 mm collimators, respectively. Directional biasing of photon emission resulted in a time efficiency gain factor of up to 210 with respect to isotropic photon emission. Although no considerable effect on relative dose profiles was detected, directional biasing led to OF overestimations, which were more pronounced for the 4 mm collimator and increased with decreasing emission cone half-angle, reaching up to 6% for a 5° angle. Implementation of simplified source models revealed that omitting the sources' stainless steel capsule significantly affects both OF results and relative dose profiles, while the aluminum-based bushing did not have a considerable dosimetric effect. In conclusion, the results of this work suggest that any PFX simulation model should be benchmarked in terms of both RDP and OF results.
Fisher, Nicholas S.; Beaugelin-Seiller, Karine; Hinton, Thomas G.; Baumann, Zofia; Madigan, Daniel J.; Garnier-Laplace, Jacqueline
2013-01-01
Radioactive isotopes originating from the damaged Fukushima nuclear reactor in Japan following the earthquake and tsunami in March 2011 were found in resident marine animals and in migratory Pacific bluefin tuna (PBFT). Publication of this information resulted in a worldwide response that caused public anxiety and concern, although PBFT captured off California in August 2011 contained activity concentrations below those from naturally occurring radionuclides. To link the radioactivity to possible health impairments, we calculated doses, attributable to the Fukushima-derived and the naturally occurring radionuclides, to both the marine biota and human fish consumers. We showed that doses in all cases were dominated by the naturally occurring alpha-emitter 210Po and that Fukushima-derived doses were three to four orders of magnitude below 210Po-derived doses. Doses to marine biota were about two orders of magnitude below the lowest benchmark protection level proposed for ecosystems (10 µGy/h). The additional dose from Fukushima radionuclides to humans consuming tainted PBFT in the United States was calculated to be 0.9 and 4.7 µSv for average consumers and subsistence fishermen, respectively. Such doses are comparable to, or less than, the dose all humans routinely obtain from naturally occurring radionuclides in many food items, medical treatments, air travel, or other background sources. Although uncertainties remain regarding the assessment of cancer risk at low doses of ionizing radiation to humans, the dose received from PBFT consumption by subsistence fishermen can be estimated to result in two additional fatal cancer cases per 10,000,000 similarly exposed people. PMID:23733934
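The closing risk figure follows from simple arithmetic once a nominal fatal-cancer risk coefficient is assumed; the coefficient of about 0.05 per sievert used below is an ICRP-style nominal value introduced here for illustration, not a number taken from the paper.

```latex
% Worked arithmetic for the subsistence-fisherman estimate, assuming a
% nominal fatal-cancer risk coefficient of about 0.05 per sievert:
\[
4.7\ \mu\mathrm{Sv}\times 5\times10^{-2}\,\mathrm{Sv}^{-1}
= 4.7\times10^{-6}\,\mathrm{Sv}\times 5\times10^{-2}\,\mathrm{Sv}^{-1}
\approx 2.4\times10^{-7},
\]
% i.e. roughly two additional fatal cancers per 10^7 similarly exposed
% people, consistent with the abstract's figure.
```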
NASA Astrophysics Data System (ADS)
Giménez-Alventosa, Vicent; Antunes, Paula C. G.; Vijande, Javier; Ballester, Facundo; Pérez-Calatayud, José; Andreo, Pedro
2017-01-01
The AAPM TG-43 brachytherapy dosimetry formalism, introduced in 1995, has become a standard for brachytherapy dosimetry worldwide; it implicitly assumes that charged-particle equilibrium (CPE) exists for the determination of absorbed dose to water at different locations, except in the vicinity of the source capsule. Subsequent dosimetry developments, based on Monte Carlo calculations or analytical solutions of transport equations, do not rely on the CPE assumption and determine directly the dose to different tissues. At the time of relating dose to tissue and dose to water, or vice versa, it is usually assumed that the photon fluence in water and in tissues are practically identical, so that the absorbed dose in the two media can be related by their ratio of mass energy-absorption coefficients. In this work, an efficient way to correlate absorbed dose to water and absorbed dose to tissue in brachytherapy calculations at clinically relevant distances for low-energy photon emitting seeds is proposed. A correction is introduced that is based on the ratio of the water-to-tissue photon energy-fluences. State-of-the-art Monte Carlo calculations are used to score photon fluence differential in energy in water and in various human tissues (muscle, adipose and bone), which in all cases include a realistic modelling of low-energy brachytherapy sources in order to benchmark the formalism proposed. The energy-fluence based corrections given in this work are able to correlate absorbed dose to tissue and absorbed dose to water with an accuracy better than 0.5% in the most critical cases (e.g. bone tissue).
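A schematic form of the correction described in this abstract can be written as the usual mass energy-absorption coefficient ratio multiplied by an energy-fluence ratio; the notation below is illustrative, with spectrum-averaged quantities left implicit, and is not the paper's exact formulation.

```latex
% Schematic dose conversion with an energy-fluence correction:
\[
D_{\mathrm{tis}} \;\approx\; D_{\mathrm{w}}\,
\overline{\left(\frac{\mu_{en}}{\rho}\right)}^{\,\mathrm{tis}}_{\,\mathrm{w}}\,
\frac{\Psi_{\mathrm{tis}}}{\Psi_{\mathrm{w}}},
\]
% where the first factor is the spectrum-averaged tissue-to-water ratio of
% mass energy-absorption coefficients and Psi denotes the photon energy
% fluence scored at the point of interest in each medium.
```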
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stathakis, S; Defoor, D; Saenz, D
Purpose: Stereotactic radiosurgery (SRS) outcomes are related to the delivered dose to the target and to surrounding tissue. We have commissioned a Monte Carlo based dose calculation algorithm to recalculate the delivered dose for plans created with a pencil beam dose engine. Methods: Twenty consecutive previously treated patients were selected for this study. All plans were generated using the iPlan treatment planning system (TPS) and calculated using the pencil beam algorithm. Each patient plan consisted of 1 to 3 targets treated using dynamically conformal arcs or intensity modulated beams. Multi-target treatments were delivered using multiple isocenters, one for each target. These plans were recalculated for the purpose of this study using a single isocenter. The CT image sets, along with the plans, doses and structures, were exported via DICOM to the Monaco TPS, and the dose was recalculated using the same voxel resolution and monitor units. Benchmark data were also generated prior to the patient calculations to assess the accuracy of the two TPSs against measurements made with a micro ionization chamber in solid water. Results: Good agreement, within −0.4% for Monaco and +2.2% for iPlan, was observed for measurements in the water phantom. Doses in the patient geometry revealed differences of up to 9.6% for single-target plans and 9.3% for multiple-target, multiple-isocenter plans. The average dose difference for multi-target, single-isocenter plans was approximately 1.4%. Similar differences were observed for the OARs and the integral dose. Conclusion: Accuracy of the beam model is crucial for dose calculation, especially in the case of small fields such as those used in SRS treatments. A superior dose calculation algorithm such as Monte Carlo, with a properly commissioned beam model, which is unaffected by the lack of electronic equilibrium, should be preferred for the calculation of small fields to improve accuracy.
Using a knowledge-based planning solution to select patients for proton therapy.
Delaney, Alexander R; Dahele, Max; Tol, Jim P; Kuijper, Ingrid T; Slotman, Ben J; Verbakel, Wilko F A R
2017-08-01
Patient selection for proton therapy by comparing proton/photon treatment plans is time-consuming and prone to bias. RapidPlan™, a knowledge-based planning solution, uses plan libraries to model and predict organ-at-risk (OAR) dose-volume histograms (DVHs). We investigated whether RapidPlan, utilizing an algorithm based only on photon beam characteristics, could generate proton DVH predictions and whether these could correctly identify patients for proton therapy. ModelPROT and ModelPHOT comprised 30 head-and-neck cancer proton and photon plans, respectively. Proton and photon knowledge-based plans (KBPs) were made for ten evaluation patients. DVH prediction accuracy was analyzed by comparing predicted versus achieved mean OAR doses. KBPs and manual plans were compared using salivary gland and swallowing muscle mean doses. For illustration, patients were selected for protons if the predicted ModelPHOT mean dose minus the predicted ModelPROT mean dose (ΔPrediction) for the combined OARs was ≥6 Gy, and this rule was benchmarked using achieved KBP doses. The R2 between achieved and predicted ModelPROT/ModelPHOT mean dose was 0.95/0.98. Generally, achieved mean doses for ModelPHOT/ModelPROT KBPs were respectively lower/higher than predicted. Comparing ModelPROT/ModelPHOT KBPs with manual plans, salivary and swallowing mean doses increased/decreased by <2 Gy on average. ΔPrediction ≥ 6 Gy correctly selected 4 of 5 patients for protons. Knowledge-based DVH prediction can provide efficient, patient-specific selection for protons. A proton-specific RapidPlan solution could improve results. Copyright © 2017 Elsevier B.V. All rights reserved.
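The selection rule used for illustration in this abstract is easy to express programmatically; the sketch below mirrors the ΔPrediction ≥ 6 Gy rule with hypothetical OAR names and dose values, not data from the study.

```python
# Illustrative patient-selection rule based on model-predicted OAR mean doses
# (structure and threshold mirror the abstract; the data are hypothetical).
def delta_prediction(pred_photon, pred_proton):
    """Sum over OARs of (photon-model - proton-model) predicted mean dose, in Gy."""
    return sum(pred_photon[oar] - pred_proton[oar] for oar in pred_photon)

pred_photon = {"parotid_L": 26.0, "parotid_R": 24.0, "pcm": 38.0}  # Gy
pred_proton = {"parotid_L": 22.5, "parotid_R": 21.0, "pcm": 36.0}  # Gy

if delta_prediction(pred_photon, pred_proton) >= 6.0:
    print("select for proton therapy")
else:
    print("keep on photons")
```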
İnal, Tolga; Ataç, Gökçe
2014-01-01
We aimed to determine the radiation doses delivered to patients undergoing general examinations using computed or digital radiography systems in Turkey. Radiographs of 20 patients undergoing posteroanterior chest X-ray and of 20 patients undergoing anteroposterior kidney-ureter-bladder radiography were evaluated in five X-ray rooms at four local hospitals in the Ankara region. Currently, almost all radiology departments in Turkey have switched from conventional radiography systems to computed radiography or digital radiography systems. Patient dose was measured for both systems. The results were compared with published diagnostic reference levels (DRLs) from the European Union and International Atomic Energy Agency. The average entrance surface doses (ESDs) for chest examinations exceeded established international DRLs at two of the X-ray rooms in a hospital with computed radiography. All of the other ESD measurements were approximately equal to or below the DRLs for both examinations in all of the remaining hospitals. Improper adjustment of the exposure parameters, uncalibrated automatic exposure control systems, and failure of the technologists to choose exposure parameters properly were problems we noticed during the study. This study is an initial attempt at establishing local DRL values for digital radiography systems, and will provide a benchmark so that the authorities can establish reference dose levels for diagnostic radiology in Turkey.
Wang, L; Lovelock, M; Chui, C S
1999-12-01
To further validate the Monte Carlo dose-calculation method [Med. Phys. 25, 867-878 (1998)] developed at the Memorial Sloan-Kettering Cancer Center, we have performed experimental verification in various inhomogeneous phantoms. The phantom geometries included simple layered slabs, a simulated bone column, a simulated missing-tissue hemisphere, and an anthropomorphic head geometry (Alderson Rando phantom). The densities of the inhomogeneities range from 0.14 to 1.86 g/cm3, simulating both clinically relevant lunglike and bonelike materials. The data are reported as central-axis depth doses, dose profiles, and dose values at points of interest, such as points at the interface of two different media and in the "nasopharynx" region of the Rando head. The dosimeters used in the measurements included dosimetry film, TLD chips, and TLD rods. The measured data were compared to Monte Carlo calculations for the same geometrical configurations. In the case of the Rando head phantom, a CT scan of the phantom was used to define the calculation geometry and to locate the points of interest. The agreement between calculation and measurement is generally within 2.5%. This work validates the accuracy of the Monte Carlo method. While Monte Carlo is at present still too slow for routine treatment planning, it can be used as a benchmark against which other dose calculation methods can be compared.
Lee, Joon Oh; Chung, Moon Sang; Baek, Goo Hyun; Oh, Joo Han; Lee, Young Ho; Gong, Hyun Sik
2010-09-01
To assess age- and site-related bone mineral density (BMD) values in Korean female patients with a distal radius fracture, and to compare them with those of the community-based general Korean female population. For this study, we recruited 54 consecutive Korean women, 50 to 79 years of age, with a distal radius fracture caused by minor trauma. We performed dual-energy x-ray absorptiometry scans at central sites: the lumbar spine, femoral neck, trochanter, and Ward's triangle, which is a triangular area within the femoral neck. Age- and site-related BMDs were assessed and compared with those of population-based reference data for Korean women. The overall prevalence (defined as meeting the osteoporosis criteria in at least one of the earlier-described measurement areas) of osteoporosis in patients with a distal radius fracture was 57%. The site-related prevalence was 54% at Ward's triangle, 43% at the lumbar spine, 32% at the femoral neck, and 26% at the trochanter, and these values were individually statistically significantly higher than those of the general Korean female population except for the lumbar spine. In patients 50 to 59 and 70 to 79 years of age, patients' mean BMD values at the hip were statistically significantly lower than those of the reference female population of corresponding age groups, but the hip BMD differences were not statistically significant in patients 60 to 69 years of age. There were no statistically significant BMD differences measured at the lumbar spine in any age group. Korean female patients with a distal radius fracture, 50 to 59 and 70 to 79 years of age, had lower BMDs at the hip than the reference Korean female population. However, no statistically significant BMD differences were found in those 60 to 69 years of age. Low BMD may have a greater impact on distal radius fracture in women younger than 60 years of age or over 70 years of age. Considering the young onset of bone loss, patients younger than 60 years of age with a distal radius fracture are a good target group for secondary prevention of osteoporosis. Copyright 2010 American Society for Surgery of the Hand. Published by Elsevier Inc. All rights reserved.
Svensson, J; Lall, S; Dickson, S L; Bengtsson, B A; Rømer, J; Ahnfelt-Rønne, I; Ohlsson, C; Jansson, J O
2000-06-01
Growth hormone (GH) is of importance for normal bone remodelling. A recent clinical study demonstrated that MK-677, a member of a class of GH secretagogues (GHSs), increases serum concentrations of biochemical markers of bone formation and bone resorption. The aim of the present study was to investigate whether the GHSs, ipamorelin (IPA) and GH-releasing peptide-6 (GHRP-6), increase bone mineral content (BMC) in young adult female rats. Thirteen-week-old female Sprague-Dawley rats were given IPA (0.5 mg/kg per day; n=7), GHRP-6 (0.5 mg/kg per day; n=8), GH (3.5 mg/kg per day; n=7), or vehicle administered continuously s.c. via osmotic minipumps for 12 weeks. The animals were followed in vivo by dual X-ray absorptiometry (DXA) measurements every 4th week. After the animals were killed, femurs were analysed in vitro by mid-diaphyseal peripheral quantitative computed tomography (pQCT) scans. After this, excised femurs and vertebrae L6 were analysed by the use of Archimedes' principle and by determinations of ash weights. All treatments increased body weight and total tibial and vertebral BMC measured by DXA in vivo compared with vehicle-treated controls. However, total BMC corrected for the increase in body weight (total BMC:body weight ratio) was unaffected. Tibial area bone mineral density (BMD, BMC/area) was increased, but total and vertebral area BMDs were unchanged. The pQCT measurements in vitro revealed that the increase in the cortical BMC was due to an increased cross-sectional bone area, whereas the cortical volumetric BMD was unchanged. Femur and vertebra L6 volumes were increased but no effect was seen on the volumetric BMDs as measured by Archimedes' principle. Ash weight was increased by all treatments, but the mineral concentration was unchanged. We conclude that treatment of adult female rats with the GHSs ipamorelin and GHRP-6 increases BMC as measured by DXA in vivo. The results of in vitro measurements using pQCT and Archimedes' principle, in addition to ash weight determinations, show that the increases in cortical and total BMC were due to an increased growth of the bones with increased bone dimensions, whereas the volumetric BMD was unchanged.
Evaluation of neutron skyshine from a cyclotron
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huyashi, K.; Nakamura, T.
1984-06-01
The dose distribution and the spectrum variation of neutrons due to the skyshine effect have been measured with various detectors in the environment surrounding the cyclotron of the Institute for Nuclear Study, University of Tokyo. The source neutrons were produced by stopping a 52-MeV proton beam in a carbon beam stopper; they were extracted upward through an opening in the concrete shield surrounding the cyclotron and then leaked into the atmosphere through the cyclotron building. The dose distribution and the spectrum of neutrons near the beam stopper were also measured in order to characterize the skyshine source. The measured skyshine neutron spectra and dose distribution were analyzed with two codes, MMCR2 and SKYSHINE-II, and the calculated results are in good agreement with the experiment. Valuable characteristics of this experiment are the determination of the energy spectrum and dose distribution of the source neutrons and the measurement of skyshine neutrons from an actual large-scale accelerator building, to the exclusion of direct neutrons transported through the air. This experiment should be useful as a benchmark experiment on the skyshine phenomenon.
Effective Dose in Nuclear Medicine Studies and SPECT/CT: Dosimetry Survey Across Quebec Province.
Charest, Mathieu; Asselin, Chantal
2018-06-01
The aims of the current study were to draw a portrait of the delivered dose in selected nuclear medicine studies in Québec province and to assess the degree of change between an earlier survey performed in 2010 and a later survey performed in 2014. Methods: Each surveyed nuclear medicine department had to complete 2 forms: the first, about the administered activity in selected nuclear medicine studies, and the second, about the CT parameters used in SPECT/CT imaging, if available. The administered activities were converted into effective doses using the most recent conversion factors. Diagnostic reference levels were computed for each imaging procedure to obtain a benchmark for comparison. Results: The distributions of administered activity in various nuclear medicine studies, along with the corresponding distributions of effective doses, were determined. Excluding 131I for thyroid studies, 67Ga-citrate for infectious workups, and combined stress and rest myocardial perfusion studies, the remainder of the 99mTc-based studies delivered average effective doses clustered below 10 mSv. Between the 2010 survey and the 2014 survey, the delivered dose for myocardial perfusion studies showed a statistically significant decrease, from 18.3 to 14.5 mSv. 67Ga-citrate studies for infectious workups also showed a significant decrease in delivered dose, from 31.0 to 26.2 mSv. The standardized CT portion of SPECT/CT studies yielded a mean effective dose 14 times lower than the radiopharmaceutical portion of the study. Conclusion: Between 2010 and 2014, there was a significant decrease in the delivered effective dose in myocardial perfusion and 67Ga-citrate studies. The CT portions of the surveyed SPECT/CT studies contributed a relatively small fraction of the total delivered effective dose. © 2018 by the Society of Nuclear Medicine and Molecular Imaging.
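The survey's computation reduces to two steps: administered activity times a published conversion factor gives the effective dose, and a percentile of the resulting distribution gives the diagnostic reference level. A minimal Python sketch under stated assumptions: the conversion factors below are placeholders rather than ICRP values, and the DRL is taken at the 75th percentile, the common convention, although the abstract does not say which percentile was used.

```python
import numpy as np

# Hypothetical effective-dose conversion factors (mSv per MBq administered);
# real values are procedure-specific and come from ICRP publications.
CONVERSION_FACTORS = {"99mTc bone scan": 0.0057, "67Ga-citrate": 0.10}

def effective_doses(administered_mbq, factor):
    """Convert administered activities (MBq) to effective doses (mSv)."""
    return np.asarray(administered_mbq, dtype=float) * factor

def diagnostic_reference_level(doses_msv, percentile=75):
    """DRLs are conventionally set at the 75th percentile of the observed
    distribution across departments (an assumption here; the abstract does
    not state which percentile was used)."""
    return float(np.percentile(doses_msv, percentile))

# Usage: illustrative activities reported by surveyed departments.
activities = [550, 600, 740, 800, 925]  # MBq
doses = effective_doses(activities, CONVERSION_FACTORS["99mTc bone scan"])
print(f"mean {doses.mean():.2f} mSv, DRL {diagnostic_reference_level(doses):.2f} mSv")
```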
Maier, Andrew; Vincent, Melissa J; Parker, Ann; Gadagbui, Bernard K; Jayjock, Michael
2015-12-01
Asthma is a complex syndrome with significant consequences for those affected. The number of individuals affected is growing, although the reasons for the increase are uncertain. Ensuring the effective management of potential exposures follows from substantial evidence that exposure to some chemicals can increase the likelihood of asthma responses. We have developed a safety assessment approach tailored to the screening of asthma risks from residential consumer product ingredients as a proactive risk management tool. Several key features of the proposed approach advance the assessment resources often used for asthma issues. First, a quantitative health benchmark for asthma or related endpoints (irritation and sensitization) is provided that extends qualitative hazard classification methods. Second, a parallel structure is employed to include dose-response methods for asthma endpoints and methods for scenario specific exposure estimation. The two parallel tracks are integrated in a risk characterization step. Third, a tiered assessment structure is provided to accommodate different amounts of data for both the dose-response assessment (i.e., use of existing benchmarks, hazard banding, or the threshold of toxicological concern) and exposure estimation (i.e., use of empirical data, model estimates, or exposure categories). Tools building from traditional methods and resources have been adapted to address specific issues pertinent to asthma toxicology (e.g., mode-of-action and dose-response features) and the nature of residential consumer product use scenarios (e.g., product use patterns and exposure durations). A case study for acetic acid as used in various sentinel products and residential cleaning scenarios was developed to test the safety assessment methodology. In particular, the results were used to refine and verify relationships among tiered approaches such that each lower data tier in the approach provides a similar or greater margin of safety for a given scenario. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
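The parallel dose-response and exposure tracks integrate in a margin-of-safety comparison, with lower data tiers required to give similar or larger margins. A sketch of that risk characterization step, with all benchmark and exposure numbers hypothetical rather than taken from the paper:

```python
# Hypothetical screening sketch of the parallel-track risk characterization:
# a health benchmark (from an existing value, hazard band, or TTC tier) is
# compared against a scenario exposure estimate (measured, modeled, or a
# category). All numbers below are illustrative, not from the paper.
from dataclasses import dataclass

@dataclass
class Assessment:
    benchmark_mg_m3: float   # asthma/irritation/sensitization benchmark
    exposure_mg_m3: float    # scenario-specific exposure estimate

    def margin_of_safety(self) -> float:
        return self.benchmark_mg_m3 / self.exposure_mg_m3

    def screen(self, required_margin: float = 1.0) -> str:
        return "pass" if self.margin_of_safety() >= required_margin else "refine or manage"

# Usage: acetic acid in a surface-cleaning scenario (illustrative values).
a = Assessment(benchmark_mg_m3=0.25, exposure_mg_m3=0.05)
print(a.margin_of_safety(), a.screen())
```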
An adaptive algorithm for the detection of microcalcifications in simulated low-dose mammography
NASA Astrophysics Data System (ADS)
Treiber, O.; Wanninger, F.; Führ, H.; Panzer, W.; Regulla, D.; Winkler, G.
2003-02-01
This paper uses the task of microcalcification detection as a benchmark problem to assess the potential for dose reduction in x-ray mammography. We present the results of a newly developed algorithm for detection of microcalcifications as a case study for a typical commercial film-screen system (Kodak Min-R 2000/2190). The first part of the paper deals with the simulation of dose reduction for film-screen mammography based on a physical model of the imaging process. Use of a more sensitive film-screen system is expected to result in additional smoothing of the image. We introduce two different models of that behaviour, called moderate and strong smoothing. We then present an adaptive, model-based microcalcification detection algorithm. Comparing detection results with ground-truth images obtained under the supervision of an expert radiologist allows us to establish the soundness of the detection algorithm. We measure the performance on the dose-reduced images in order to assess the loss of information due to dose reduction. It turns out that the smoothing behaviour has a strong influence on detection rates. For moderate smoothing, a dose reduction by 25% has no serious influence on the detection results, whereas a dose reduction by 50% already entails a marked deterioration of the performance. Strong smoothing generally leads to an unacceptable loss of image quality. The test results emphasize the impact of the more sensitive film-screen system and its characteristics on the problem of assessing the potential for dose reduction in film-screen mammography. The general approach presented in the paper can be adapted to fully digital mammography.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ma, R; Zhu, X; Li, S
Purpose: High dose rate (HDR) brachytherapy forward planning is principally an iterative process; hence, plan quality is affected by planners' experience and limited planning time, which may lead to sporadic errors and inconsistencies in planning. A statistical tool based on previously approved clinical treatment plans would help to maintain the consistency of planning quality and improve the efficiency of second checking. Methods: An independent dose calculation tool was developed from commercial software. Thirty-three previously approved cervical HDR plans with the same prescription dose (550 cGy), applicator type, and treatment protocol were examined, and ICRU-defined reference point doses (bladder, vaginal mucosa, rectum, and points A/B) along with dwell times were collected. The dose calculation tool then calculated an appropriate range with a 95% confidence interval for each parameter obtained, which would be used as the benchmark for evaluating those parameters in future HDR treatment plans. Model quality was verified using five randomly selected approved plans from the same dataset. Results: Dose variations appear to be larger at the bladder and mucosa reference points than at the rectum. Most reference point doses from the verification plans fell within the predicted range, except the doses at two rectum points and two reference position A points (owing to rectal anatomical variations and clinical adjustment of prescription points, respectively). Similar results were obtained for tandem and ring dwell times, despite relatively larger uncertainties. Conclusion: This statistical tool provides insight into the clinically acceptable range of cervical HDR plans, which could be useful in plan checking and identifying potential planning errors, thus improving the consistency of plan quality.
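The statistical tool amounts to deriving a per-parameter acceptance range from approved plans and flagging new plans that fall outside it. A minimal sketch, assuming a t-based 95% interval as one plausible reading of the abstract's "95% confidence interval"; the dose values are illustrative:

```python
import numpy as np
from scipy import stats

def benchmark_range(values, confidence=0.95):
    """Interval for a plan parameter derived from approved historical plans,
    using a normal approximation with a t multiplier (one plausible reading
    of the abstract's 95% confidence interval)."""
    v = np.asarray(values, dtype=float)
    mean, sd = v.mean(), v.std(ddof=1)
    t = stats.t.ppf(0.5 + confidence / 2, df=len(v) - 1)
    return mean - t * sd, mean + t * sd

def check_plan(parameters, history):
    """Flag any reference-point dose or dwell time outside its benchmark range."""
    flags = {}
    for name, value in parameters.items():
        lo, hi = benchmark_range(history[name])
        flags[name] = "OK" if lo <= value <= hi else f"OUTSIDE [{lo:.1f}, {hi:.1f}]"
    return flags

# Illustrative: bladder point doses (cGy) from 33 approved plans vs a new plan.
history = {"bladder": list(np.random.default_rng(1).normal(330, 40, 33))}
print(check_plan({"bladder": 480.0}, history))
```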
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duwel, D; Lamba, M; Elson, H
Purpose: Various cancers of the eye are successfully treated with radiotherapy utilizing one anterior-posterior (A/P) beam that encompasses the entire content of the orbit. In such cases, a hanging lens shield can be used to spare the radiosensitive lens of the eye and prevent cataracts. Methods: This research focused on Monte Carlo characterization of dose distributions resulting from a single A/P field to the orbit with a hanging shield in place. Monte Carlo codes were developed which calculated dose distributions for various electron beam energies, hanging lens shield radii, shield heights above the eye, and beam spoiler configurations. Film dosimetry was used to benchmark the coding to ensure it was calculating relative dose accurately. Results: The Monte Carlo dose calculations indicated that lateral and depth dose profiles are insensitive to changes in shield height and electron beam energy. Dose deposition was sensitive to shield radius and to beam spoiler composition and height above the eye. Conclusion: The use of a single A/P electron beam to treat cancers of the eye while maintaining adequate lens sparing is feasible. The shield radius should be customized to match the radius of the patient's lens. A beam spoiler should be used if it is desired to deliver substantial dose to the eye tissues lying posterior to the lens, in the shadow of the lens shield. The compromise between lens sparing and dose to diseased tissues surrounding the lens can be modulated by varying the beam spoiler thickness, spoiler material composition, and spoiler height above the eye. The sparing ratio is a metric that can be used to evaluate this compromise: the higher the ratio, the more dose received by the tissues immediately posterior to the lens relative to the dose received by the lens.
Xia, Pu; Zhang, Xiaowei; Zhang, Hanxin; Wang, Pingping; Tian, Mingming; Yu, Hongxia
2017-08-15
One of the major challenges in environmental science is monitoring and assessing the risk of complex environmental mixtures. In vitro bioassays with limited key toxicological end points have been shown to be suitable for evaluating mixtures of organic pollutants in wastewater and recycled water. Omics approaches such as transcriptomics can monitor biological effects at the genome scale. However, few studies have applied omics approaches to the assessment of mixtures of organic micropollutants. Here, an omics approach was developed for profiling the bioactivity of 10 water samples, ranging from wastewater to drinking water, in human cells by a reduced human transcriptome (RHT) approach and dose-response modeling. Transcriptional expression of 1200 selected genes was measured by Ampliseq technology in two cell lines, HepG2 and MCF7, that were exposed to eight serial dilutions of each sample. Concentration-effect models were used to identify differentially expressed genes (DEGs) and to calculate effect concentrations (ECs) of DEGs, which could be ranked to investigate low-dose response. Furthermore, molecular pathways disrupted by different samples were evaluated by Gene Ontology (GO) enrichment analysis. The ability of RHT to represent bioactivity in both HepG2 and MCF7 cells was shown to be comparable to the results of previous in vitro bioassays. Finally, the relative potencies of the mixtures indicated by RHT analysis were consistent with the chemical profiles of the samples. RHT analysis with human cells provides an efficient and cost-effective approach to benchmarking mixtures of micropollutants and may offer novel insight into the assessment of mixture toxicity in water.
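The abstract does not specify the concentration-effect model; a common choice for such monotonic responses is a four-parameter Hill curve, fitted per gene across the eight dilutions to yield an effect concentration. A sketch under that assumption, with synthetic data:

```python
import numpy as np
from scipy.optimize import curve_fit

def hill(conc, bottom, top, ec50, n):
    """Four-parameter Hill model for gene expression vs. concentration."""
    return bottom + (top - bottom) * conc**n / (ec50**n + conc**n)

def fit_gene(conc, expression):
    """Fit a concentration-effect curve for one gene across the serial
    dilutions and return the EC50 (a stand-in for the abstract's
    unspecified concentration-effect models)."""
    p0 = [expression.min(), expression.max(), np.median(conc), 1.0]
    params, _ = curve_fit(hill, conc, expression, p0=p0, maxfev=10000)
    return params[2]  # EC50

# Illustrative data: relative enrichment factors and normalized counts.
conc = np.array([0.1, 0.3, 1, 3, 10, 30, 100, 300])
expr = hill(conc, 1.0, 3.0, 8.0, 1.5) + np.random.default_rng(0).normal(0, 0.05, 8)
print(f"estimated EC50 ~ {fit_gene(conc, expr):.1f}")
```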
Beyer, W. Nelson; Chen, Yu; Henry, Paula; May, Thomas; Mosby, David; Rattner, Barnett A.; Shearn-Bochsler, Valerie I.; Sprague, Daniel; Weber, John
2014-01-01
This study relates tissue concentrations and toxic effects of Pb in Japanese quail (Coturnix japonica) to dietary exposure to soil-borne Pb associated with mining and smelting. From 0% to 12% contaminated soil, by weight, was added to 5 experimental diets (0.12 to 382 mg Pb/kg, dry wt) and fed to the quail for 6 weeks. Benchmark doses associated with a 50% reduction in delta-aminolevulinic acid dehydratase activity were 0.62 mg Pb/kg in the blood, dry wt, and 27 mg Pb/kg in the diet. Benchmark doses associated with a 20% increase in the concentration of erythrocyte protoporphyrin were 2.7 mg Pb/kg in the blood and 152 mg Pb/kg in the diet. The quail showed no other signs of toxicity (histopathological lesions, alterations in plasma testosterone concentration, or changes in body and organ weights). The relation of the blood Pb concentration to the dietary Pb concentration was linear, with a slope of 0.013 mg Pb/kg of blood (dry wt) per mg Pb/kg of diet. We suggest that this slope is potentially useful in ecological risk assessments on birds in the same way that the intake slope factor is an important parameter in risk assessments of children exposed to Pb. The slope may also be used in a tissue-residue approach as an additional line of evidence in ecological risk assessment, supplementary to an estimate of hazard based on dietary toxicity reference values.
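The reported slope and benchmark doses support a simple screening calculation: predict blood Pb from dietary Pb and compare both against their benchmarks. A sketch using the abstract's numbers, with a zero intercept assumed since only the slope is reported:

```python
# Screening sketch using values reported in the abstract: blood Pb (dry wt)
# scales linearly with dietary Pb at a slope of 0.013, and the ALAD-based
# benchmark doses are 0.62 mg/kg (blood) and 27 mg/kg (diet).
SLOPE = 0.013          # (mg Pb/kg blood, dry wt) per (mg Pb/kg diet)
BMD_BLOOD = 0.62       # mg Pb/kg blood, 50% ALAD reduction
BMD_DIET = 27.0        # mg Pb/kg diet

def predicted_blood_pb(diet_pb, intercept=0.0):
    """Predict blood Pb from dietary Pb; the zero intercept is an
    assumption, since the abstract reports only the slope."""
    return intercept + SLOPE * diet_pb

for diet in (10, 27, 150, 382):
    blood = predicted_blood_pb(diet)
    risk = "exceeds" if blood > BMD_BLOOD or diet > BMD_DIET else "below"
    print(f"diet {diet:5.0f} mg/kg -> blood {blood:.2f} mg/kg ({risk} benchmark)")
```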
NASA Astrophysics Data System (ADS)
Bu, Zhongming; Zhang, Yinping; Mmereki, Daniel; Yu, Wei; Li, Baizhan
2016-02-01
Six phthalates - dimethyl phthalate (DMP), diethyl phthalate (DEP), di(isobutyl) phthalate (DiBP), di(n-butyl) phthalate (DnBP), butyl benzyl phthalate (BBzP) and di(2-ethylhexyl) phthalate (DEHP) - were measured in indoor gas-phase and dust samples from thirty residential apartments, for the first time in Chongqing, China. Monte Carlo simulation was used to estimate preschool children's exposure via inhalation, non-dietary ingestion and dermal absorption, based on the gas-phase and dust concentrations. Risk was assessed by comparing the modeled exposure doses with child-specific benchmarks specified in California's Proposition 65. The detection frequency for all the targeted phthalates was more than 80%, except for BBzP. DMP was the most abundant compound in the gas phase (median = 0.91 μg/m3 and 0.82 μg/m3 in living rooms and bedrooms, respectively), and DEHP was the most abundant compound in the dust samples (median = 1543 μg/g and 1450 μg/g in living rooms and bedrooms, respectively). Correlation analysis suggests that indoor DiBP and DnBP might come from the same emission sources. The simulations showed that the median DEHP daily intake was 3.18-4.28 μg/day/kg-bw across all age groups, the highest among the targeted phthalates. The risk assessment indicated that the exposure doses of DnBP and DEHP exceeded the child-specific benchmarks for more than 90% of preschool children in Chongqing. Therefore, from a children's health perspective, efforts should focus on controlling indoor phthalate concentrations and exposures.
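The exposure model combines pathway-specific intakes sampled by Monte Carlo. A heavily simplified sketch of that calculation for DEHP, omitting the dermal pathway; every distribution parameter, exposure factor, and the benchmark value below is an illustrative assumption, not an input from the paper:

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_dehp_intake(n=100_000):
    """Monte Carlo sketch of aggregate daily intake (ug/day/kg-bw) from
    inhalation and dust ingestion; the lognormal parameters, breathing
    rate, dust ingestion rate, and body weight are all illustrative."""
    gas = rng.lognormal(mean=np.log(0.2), sigma=0.8, size=n)    # ug/m3
    dust = rng.lognormal(mean=np.log(1500), sigma=0.6, size=n)  # ug/g
    inhalation = gas * 8.0 / 15.0    # 8 m3/day inhaled, 15 kg child
    ingestion = dust * 0.06 / 15.0   # 60 mg dust ingested per day
    return inhalation + ingestion

intake = simulate_dehp_intake()
benchmark = 5.8  # hypothetical child-specific benchmark, ug/day/kg-bw
print(f"median {np.median(intake):.2f}; {100*(intake > benchmark).mean():.0f}% exceed benchmark")
```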
Slob, Wout
2017-04-01
A general theory of effect size for continuous data predicts a relationship between maximum response and within-group variation of biological parameters, which is empirically confirmed by results from dose-response analyses of 27 different biological parameters. The theory shows how effect sizes observed in distinct biological parameters can be compared, and provides a basis for a generic definition of small, intermediate and large effects. While the theory is useful for experimental science in general, it has specific consequences for risk assessment: it resolves the current debate on the appropriate metric for the benchmark response (BMR) in continuous data. The theory shows that scaling a BMR expressed as a percent change in means to the maximum response (in the way specified) automatically takes "natural variability" into account. Thus, the theory supports the underlying rationale of a BMR of 1 SD. For various reasons, it is, however, recommended to use a BMR in terms of a percent change that is scaled to the maximum response and/or the within-group variation (averaged over studies), as a single harmonized approach.
Whole-body to tissue concentration ratios for use in biota dose assessments for animals.
Yankovich, Tamara L; Beresford, Nicholas A; Wood, Michael D; Aono, Tasuo; Andersson, Pål; Barnett, Catherine L; Bennett, Pamela; Brown, Justin E; Fesenko, Sergey; Fesenko, J; Hosseini, Ali; Howard, Brenda J; Johansen, Mathew P; Phaneuf, Marcel M; Tagami, Keiko; Takata, Hyoe; Twining, John R; Uchida, Shigeo
2010-11-01
Environmental monitoring programs often measure contaminant concentrations in animal tissues consumed by humans (e.g., muscle). By comparison, demonstration of the protection of biota from the potential effects of radionuclides involves a comparison of whole-body doses to radiological dose benchmarks. Consequently, methods for deriving whole-body concentration ratios based on tissue-specific data are required to make best use of the available information. This paper provides a series of look-up tables with whole-body:tissue-specific concentration ratios for non-human biota. Focus was placed on relatively broad animal categories (including molluscs, crustaceans, freshwater fishes, marine fishes, amphibians, reptiles, birds and mammals) and commonly measured tissues (specifically, bone, muscle, liver and kidney). Depending upon organism, whole-body to tissue concentration ratios were derived for between 12 and 47 elements. The whole-body to tissue concentration ratios can be used to estimate whole-body concentrations from tissue-specific measurements. However, we recommend that any given whole-body to tissue concentration ratio should not be used if the value falls between 0.75 and 1.5. Instead, a value of one should be assumed.
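Applying the look-up tables reduces to multiplying a tissue measurement by the tabulated ratio, with the paper's rule that ratios between 0.75 and 1.5 are replaced by one. A sketch with a placeholder ratio:

```python
# Sketch of applying a whole-body:tissue concentration ratio (CR) from the
# paper's look-up tables; the numeric CRs used below are placeholders, not
# values from the tables.
def whole_body_concentration(tissue_conc, cr_wb_tissue):
    """Estimate whole-body concentration from a tissue measurement.
    Per the paper's recommendation, ratios falling between 0.75 and 1.5
    are replaced by 1 (i.e., the tissue value is used directly)."""
    if 0.75 < cr_wb_tissue < 1.5:
        cr_wb_tissue = 1.0
    return tissue_conc * cr_wb_tissue

# Usage: a muscle measurement of 40 Bq/kg with two hypothetical CRs.
print(whole_body_concentration(40.0, 1.2))  # -> 40.0 (CR treated as 1)
print(whole_body_concentration(40.0, 2.3))  # -> 92.0
```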
DOE Office of Scientific and Technical Information (OSTI.GOV)
Overton, J.H.; Jarabek, A.M.
1989-01-01
The U.S. EPA advocates the assessment of health-effects data and calculation of inhaled reference doses as benchmark values for gauging systemic toxicity of inhaled gases. The assessment often requires an inter- or intra-species dose extrapolation from no-observed-adverse-effect-level (NOAEL) exposure concentrations in animals to human-equivalent NOAEL exposure concentrations. To achieve this, a dosimetric extrapolation procedure was developed based on the form or type of equations that describe the uptake and disposition of inhaled volatile organic compounds (VOCs) in physiologically based pharmacokinetic (PB-PK) models. The procedure assumes allometric scaling of most physiological parameters and that the value of the time-integrated human arterial-blood concentration must be limited to no more than that of experimental animals. The scaling assumption replaces the need for most parameter values and allows the derivation of a simple formula for dose extrapolation of VOCs that gives equivalent or more conservative exposure concentration values than those that would be obtained using a PB-PK model in which scaling was assumed.
Ghadyani, Hamid R.; Bastien, Adam D.; Lutz, Nicholas N.; Hepel, Jaroslaw T.
2015-01-01
Purpose Noninvasive image-guided breast brachytherapy delivers conformal HDR 192Ir brachytherapy treatments with the breast compressed and treated in the cranial-caudal and medial-lateral directions. This technique subjects breast tissue to extreme deformations not observed for other disease sites. Given that commercially available software for deformable image registration cannot accurately co-register image sets obtained in these two states, a finite element analysis based on a biomechanical model was developed to deform dose distributions for each compression circumstance for dose summation. Material and methods The model assumed the breast was under planar stress, with values of 30 kPa for Young's modulus and 0.3 for Poisson's ratio. Dose distributions from round and skin-dose optimized applicators in cranial-caudal and medial-lateral compressions were deformed using 0.1 cm planar resolution. Dose distributions, skin doses, and dose-volume histograms were generated. Results were examined as a function of breast thickness, applicator size, target size, and offset distance from the center. Results Over the range of examined thicknesses, target size increased several millimeters as compression thickness decreased. This trend increased with increasing offset distances. Applicator size minimally affected target coverage until the applicator was smaller than the compressed target. In all cases with an applicator larger than or equal to the compressed target size, > 90% of the target was covered by > 90% of the prescription dose. In all cases, dose coverage became less uniform and average dose increased as offset distance increased. This effect was more pronounced for smaller target-applicator combinations. Conclusions The model exhibited skin dose trends that matched MC-generated benchmarking results within 2% and clinical observations over a similar range of breast thicknesses and target sizes. The model provided quantitative insight on dosimetric treatment variables over a range of clinical circumstances. These findings highlight the need for careful target localization and accurate identification of compression thickness and target offset. PMID:25829938
An analysis of MCNP cross-sections and tally methods for low-energy photon emitters.
Demarco, John J; Wallace, Robert E; Boedeker, Kirsten
2002-04-21
Monte Carlo calculations are frequently used to analyse a variety of radiological science applications using low-energy (10-1000 keV) photon sources. This study seeks to create a low-energy benchmark for the MCNP Monte Carlo code by simulating the absolute dose rate in water and the air-kerma rate for monoenergetic point sources with energies between 10 keV and 1 MeV. The analysis compares four cross-section datasets as well as the tally method for collision kerma versus absorbed dose. The total photon attenuation coefficient cross-section for low atomic number elements has changed significantly as cross-section data have changed between 1967 and 1989. Differences of up to 10% are observed in the photoelectric cross-section for water at 30 keV between the standard MCNP cross-section dataset (DLC-200) and the most recent XCOM/NIST tabulation. At 30 keV, the absolute dose rate in water at 1.0 cm from the source increases by 7.8% after replacing the DLC-200 photoelectric cross-sections for water with those from the XCOM/NIST tabulation. The differences in the absolute dose rate are analysed when calculated with either the MCNP absorbed dose tally or the collision kerma tally. Significant differences between the collision kerma tally and the absorbed dose tally can occur when using the DLC-200 attenuation coefficients in conjunction with a modern tabulation of mass energy-absorption coefficients.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ding, G; Wang, L
Purpose: The unintended radiation dose to organs at risk (OAR) can come from imaging guidance procedures as well as from leakage and scatter of therapeutic beams. This study compares the imaging dose with the unintended out-of-field therapeutic dose to patients' sensitive organs. Methods: The Monte Carlo EGSnrc user codes BEAMnrc and DOSXYZnrc were used to simulate kV X-ray sources from imaging devices as well as the therapeutic IMRT/VMAT beams, and to calculate doses to the target and OARs on patient treatment planning CT images. The accuracy of the Monte Carlo simulations was benchmarked against measurements in phantoms. Dose-volume histograms were utilized in analyzing the patient organ doses. Results: The dose resulting from Standard Head kV-CBCT scans to bone and soft tissues ranges from 0.7 to 1.1 cGy and from 0.03 to 0.3 cGy, respectively. The dose resulting from Thorax scans on the chest to bone and soft tissues ranges from 1.1 to 1.8 cGy and from 0.3 to 0.6 cGy, respectively. The dose resulting from Pelvis scans on the abdomen to bone and soft tissues ranges from 3.2 to 4.2 cGy and from 1.2 to 2.2 cGy, respectively. The out-of-field doses to OARs are sensitive to the distance between the treated target and the OAR. For a typical head-and-neck IMRT/VMAT treatment, the out-of-field doses to the eyes are 1-3% of the target dose, or 2-6 cGy per fraction. Conclusion: The imaging doses to OARs are predictable based on the imaging protocols used when the OARs are within the imaged volume, and can be estimated and accounted for by using tabulated values. The unintended out-of-field doses are proportional to the target dose, depend strongly on the distance between the treated target and the OAR, and are generally higher than the imaging dose. This work was partially supported by Varian research grant VUMC40590.
NASA Astrophysics Data System (ADS)
Mahnam, Mehdi; Gendreau, Michel; Lahrichi, Nadia; Rousseau, Louis-Martin
2017-07-01
In this paper, we propose a novel heuristic algorithm for the volumetric-modulated arc therapy treatment planning problem, optimizing the trade-off between delivery time and treatment quality. We present a new mixed integer programming model in which the multi-leaf collimator leaf positions, gantry speed, and dose rate are determined simultaneously. Our heuristic is based on column generation; the aperture configuration is modeled in the columns and the dose distribution and time restriction in the rows. To reduce the number of voxels and increase the efficiency of the master model, we aggregate similar voxels using a clustering technique. The efficiency of the algorithm and the treatment quality are evaluated on a benchmark clinical prostate cancer case. The computational results show that a high-quality treatment is achievable using a four-thread CPU. Finally, we analyze the effects of the various parameters and two leaf-motion strategies.
Praveen Pole, R P; Feroz Khan, M; Godwin Wesley, S
2017-04-01
The activity concentration of 210Po in 26 species of marine macroalgae found along the coast near a nuclear installation on the southeast coast of India was studied. Phaeophytes were found to accumulate the highest 210Po concentrations and chlorophytes the lowest. The average 210Po activity concentrations in the three groups were 6.2 ± 2.5 Bq kg⁻¹ (Chlorophyta), 14.4 ± 5.2 Bq kg⁻¹ (Phaeophyta) and 11.3 ± 3.9 Bq kg⁻¹ (Rhodophyta). A statistically significant variation in accumulation was found between groups (p < 0.05). The un-weighted dose rate to these algae due to 210Po was calculated to be well below the benchmark dose limit of 10 μGy h⁻¹. Copyright © 2017 Elsevier Ltd. All rights reserved.
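The screening step in the abstract is a product of the measured activity concentration and a dose conversion coefficient, compared against the 10 μGy h⁻¹ benchmark. A sketch with a placeholder coefficient; real coefficients come from tools such as the ERICA assessment tool:

```python
# Sketch of the screening comparison in the abstract: an unweighted internal
# dose rate from 210Po is derived from the measured activity concentration
# and compared to the 10 uGy/h benchmark. The dose conversion coefficient
# (DCC) below is a hypothetical placeholder, not a value from the paper.
DCC_PO210 = 3.0e-3      # hypothetical, uGy/h per Bq/kg
BENCHMARK = 10.0        # uGy/h

def screen_dose_rate(activity_bq_per_kg: float) -> tuple[float, bool]:
    """Return the dose rate and whether it is below the benchmark."""
    dose_rate = activity_bq_per_kg * DCC_PO210
    return dose_rate, dose_rate < BENCHMARK

for group, conc in (("Chlorophyta", 6.2), ("Phaeophyta", 14.4), ("Rhodophyta", 11.3)):
    rate, ok = screen_dose_rate(conc)
    print(f"{group}: {rate:.3f} uGy/h, below benchmark: {ok}")
```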
Combining uncertainty factors in deriving human exposure levels of noncarcinogenic toxicants.
Kodell, R L; Gaylor, D W
1999-01-01
Acceptable levels of human exposure to noncarcinogenic toxicants in environmental and occupational settings generally are derived by reducing experimental no-observed-adverse-effect levels (NOAELs) or benchmark doses (BDs) by a product of uncertainty factors (Barnes and Dourson, Ref. 1). These factors are presumed to ensure safety by accounting for uncertainty in dose extrapolation, uncertainty in duration extrapolation, differential sensitivity between humans and animals, and differential sensitivity among humans. The common default value for each uncertainty factor is 10. This paper shows how estimates of means and standard deviations of the approximately log-normal distributions of individual uncertainty factors can be used to estimate percentiles of the distribution of the product of uncertainty factors. An appropriately selected upper percentile, for example, 95th or 99th, of the distribution of the product can be used as a combined uncertainty factor to replace the conventional product of default factors.
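Because the individual factors are approximately log-normal, their product is log-normal with log-scale mean and variance equal to the sums of the components', so any percentile follows directly. A sketch with placeholder geometric means and geometric standard deviations:

```python
import numpy as np
from scipy import stats

def combined_uncertainty_factor(log_means, log_sds, percentile=0.95):
    """Upper percentile of a product of independent, approximately
    log-normal uncertainty factors: on the log scale, the product's mean
    is the sum of the means and its variance the sum of the variances."""
    mu = sum(log_means)
    sigma = np.sqrt(sum(sd**2 for sd in log_sds))
    return float(np.exp(mu + stats.norm.ppf(percentile) * sigma))

# Illustrative: three UFs, each with geometric mean 4 and geometric SD 2
# (placeholder values; the paper estimates these quantities from data).
log_means = [np.log(4)] * 3
log_sds = [np.log(2)] * 3
print(f"95th percentile of product: {combined_uncertainty_factor(log_means, log_sds):.0f}")
# Compare with the conventional default product 10 * 10 * 10 = 1000.
```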
Neutron skyshine from intense 14-MeV neutron source facility
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nakamura, T.; Hayashi, K.; Takahashi, A.
1985-07-01
The dose distribution and the spectrum variation of neutrons due to the skyshine effect have been measured with a high-efficiency rem counter, a multisphere spectrometer, and an NE-213 scintillator in the environment surrounding an intense 14-MeV neutron source facility. The dose distribution and the energy spectra of neutrons around the facility used as a skyshine source have also been measured to enable absolute evaluation of the skyshine effect. The skyshine effect was analyzed by two multigroup Monte Carlo codes, NIMSAC and MMCR-2, by two discrete ordinates Sn codes, ANISN and DOT3.5, and by the shield structure design code for skyshine, SKYSHINE-II. The calculated results show good agreement with the measured results in absolute values. These experimental results should be useful as benchmark data for skyshine analysis and for the shielding design of fusion facilities.
NASA Astrophysics Data System (ADS)
Davidson, S.; Cui, J.; Followill, D.; Ibbott, G.; Deasy, J.
2008-02-01
The Dose Planning Method (DPM) is one of several 'fast' Monte Carlo (MC) computer codes designed to produce an accurate dose calculation for advanced clinical applications. We have developed a flexible machine modeling process and validation tests for open-field and IMRT calculations. To complement the DPM code, a practical and versatile source model has been developed, whose parameters are derived from a standard set of planning system commissioning measurements. The primary photon spectrum and the spectrum resulting from the flattening filter are modeled by a Fatigue function, cut-off by a multiplying Fermi function, which effectively regularizes the difficult energy spectrum determination process. Commonly-used functions are applied to represent the off-axis softening, increasing primary fluence with increasing angle ('the horn effect'), and electron contamination. The patient dependent aspect of the MC dose calculation utilizes the multi-leaf collimator (MLC) leaf sequence file exported from the treatment planning system DICOM output, coupled with the source model, to derive the particle transport. This model has been commissioned for Varian 2100C 6 MV and 18 MV photon beams using percent depth dose, dose profiles, and output factors. A 3-D conformal plan and an IMRT plan delivered to an anthropomorphic thorax phantom were used to benchmark the model. The calculated results were compared to Pinnacle v7.6c results and measurements made using radiochromic film and thermoluminescent detectors (TLD).
Dose-Response Analysis of RNA-Seq Profiles in Archival ...
Use of archival resources has been limited to date by inconsistent methods for genomic profiling of degraded RNA from formalin-fixed paraffin-embedded (FFPE) samples. RNA-sequencing offers a promising way to address this problem. Here we evaluated transcriptomic dose responses using RNA-sequencing in paired FFPE and frozen (FROZ) samples from two archival studies in mice, one 20 years old. Experimental treatments included 3 different doses of di(2-ethylhexyl)phthalate or dichloroacetic acid for the recently archived and older studies, respectively. Total RNA was ribo-depleted and sequenced using the Illumina HiSeq platform. In the recently archived study, FFPE samples had 35% lower total counts compared to FROZ samples but high concordance in fold-change values of differentially expressed genes (DEGs) (r2 = 0.99), highly enriched pathways (90% overlap with FROZ), and benchmark dose estimates for preselected target genes (2% difference vs FROZ). In contrast, older FFPE samples had markedly lower total counts (3% of FROZ) and poor concordance in global DEGs and pathways. However, counts from FFPE and FROZ samples still positively correlated (r2 = 0.84 across all transcripts) and showed comparable dose responses for more highly expressed target genes. These findings highlight potential applications and issues in using RNA-sequencing data from FFPE samples. Recently archived FFPE samples were highly similar to FROZ samples in sequencing q
İnal, Tolga; Ataç, Gökçe
2014-01-01
PURPOSE We aimed to determine the radiation doses delivered to patients undergoing general examinations using computed or digital radiography systems in Turkey. MATERIALS AND METHODS Radiographs of 20 patients undergoing posteroanterior chest X-ray and of 20 patients undergoing anteroposterior kidney-ureter-bladder radiography were evaluated in five X-ray rooms at four local hospitals in the Ankara region. Currently, almost all radiology departments in Turkey have switched from conventional radiography systems to computed radiography or digital radiography systems. Patient dose was measured for both systems. The results were compared with published diagnostic reference levels (DRLs) from the European Union and International Atomic Energy Agency. RESULTS The average entrance surface doses (ESDs) for chest examinations exceeded established international DRLs at two of the X-ray rooms in a hospital with computed radiography. All of the other ESD measurements were approximately equal to or below the DRLs for both examinations in all of the remaining hospitals. Improper adjustment of the exposure parameters, uncalibrated automatic exposure control systems, and failure of the technologists to choose exposure parameters properly were problems we noticed during the study. CONCLUSION This study is an initial attempt at establishing local DRL values for digital radiography systems, and will provide a benchmark so that the authorities can establish reference dose levels for diagnostic radiology in Turkey. PMID:24317331
Benchmarking reference services: step by step.
Buchanan, H S; Marshall, J G
1996-01-01
This article is a companion to an introductory article on benchmarking published in an earlier issue of Medical Reference Services Quarterly. Librarians interested in benchmarking often ask the following questions: How do I determine what to benchmark; how do I form a benchmarking team; how do I identify benchmarking partners; what's the best way to collect and analyze benchmarking information; and what will I do with the data? Careful planning is a critical success factor of any benchmarking project, and these questions must be answered before embarking on a benchmarking study. This article summarizes the steps necessary to conduct benchmarking research. Relevant examples of each benchmarking step are provided.
Ojala, J; Hyödynmaa, S; Barańczyk, R; Góra, E; Waligórski, M P R
2014-03-01
Electron radiotherapy is applied to treat the chest wall close to the mediastinum. The performance of the GGPB and eMC algorithms implemented in the Varian Eclipse treatment planning system (TPS) was studied in this region for 9 and 16 MeV beams, against Monte Carlo (MC) simulations, point dosimetry in a water phantom and dose distributions calculated in virtual phantoms. For the 16 MeV beam, the accuracy of these algorithms was also compared over the lung-mediastinum interface region of an anthropomorphic phantom, against MC calculations and thermoluminescence dosimetry (TLD). In the phantom with a lung-equivalent slab the results were generally congruent, the eMC results for the 9 MeV beam slightly overestimating the lung dose, and the GGPB results for the 16 MeV beam underestimating the lung dose. Over the lung-mediastinum interface, for 9 and 16 MeV beams, the GGPB code underestimated the lung dose and overestimated the dose in water close to the lung, compared to the congruent eMC and MC results. In the anthropomorphic phantom, results of TLD measurements and MC and eMC calculations agreed, while the GGPB code underestimated the lung dose. Good agreement between TLD measurements and MC calculations attests to the accuracy of "full" MC simulations as a reference for benchmarking TPS codes. Application of the GGPB code in chest wall radiotherapy may result in significant underestimation of the lung dose and overestimation of dose to the mediastinum, affecting plan optimization over volumes close to the lung-mediastinum interface, such as the lung or heart. Copyright © 2013 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
Mukumoto, Nobutaka; Tsujii, Katsutomo; Saito, Susumu; Yasunaga, Masayoshi; Takegawa, Hideki; Yamamoto, Tokihiro; Numasaki, Hodaka; Teshima, Teruki
2009-10-01
To develop an infrastructure for the integrated Monte Carlo verification system (MCVS) to verify the accuracy of conventional dose calculations, which often fail to accurately predict dose distributions, mainly due to inhomogeneities in the patient's anatomy, for example, in lung and bone. The MCVS consists of the graphical user interface (GUI) based on a computational environment for radiotherapy research (CERR) with MATLAB language. The MCVS GUI acts as an interface between the MCVS and a commercial treatment planning system to import the treatment plan, create MC input files, and analyze MC output dose files. The MCVS consists of the EGSnrc MC codes, which include EGSnrc/BEAMnrc to simulate the treatment head and EGSnrc/DOSXYZnrc to calculate the dose distributions in the patient/phantom. In order to improve computation time without approximations, an in-house cluster system was constructed. The phase-space data of a 6-MV photon beam from a Varian Clinac unit was developed and used to establish several benchmarks under homogeneous conditions. The MC results agreed with the ionization chamber measurements to within 1%. The MCVS GUI could import and display the radiotherapy treatment plan created by the MC method and various treatment planning systems, such as RTOG and DICOM-RT formats. Dose distributions could be analyzed by using dose profiles and dose volume histograms and compared on the same platform. With the cluster system, calculation time was improved in line with the increase in the number of central processing units (CPUs) at a computation efficiency of more than 98%. Development of the MCVS was successful for performing MC simulations and analyzing dose distributions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wan Chan Tseung, H; Ma, J; Ma, D
2015-06-15
Purpose: To demonstrate the feasibility of fast Monte Carlo (MC) based biological planning for the treatment of thyroid tumors in spot-scanning proton therapy. Methods: Recently, we developed a fast and accurate GPU-based MC simulation of proton transport that was benchmarked against Geant4.9.6 and used as the dose calculation engine in a clinically applicable GPU-accelerated IMPT optimizer. Besides dose, it can simultaneously score the dose-averaged LET (LETd), which makes fast biological dose (BD) estimates possible. To convert from LETd to BD, we used a linear relation based on cellular irradiation data. Given a thyroid patient with a 93 cc tumor volume, we created a 2-field IMPT plan in Eclipse (Varian Medical Systems). This plan was re-calculated with our MC to obtain the BD distribution. A second, 5-field plan was made with our in-house optimizer, using pre-generated MC dose and LETd maps. Constraints were placed to maintain the target dose to within 25% of the prescription, while maximizing the BD. The plan optimization and the calculation of dose and LETd maps were performed on a GPU cluster. The conventional IMPT and biologically optimized plans were compared. Results: The mean target physical and biological doses from our biologically optimized plan were, respectively, 5% and 14% higher than those from the MC re-calculation of the IMPT plan. Dose sparing to critical structures in our plan was also improved. The biological optimization, including the initial dose and LETd map calculations, can be completed in a clinically viable time (~30 minutes) on a cluster of 25 GPUs. Conclusion: Taking advantage of GPU acceleration, we created a MC-based, biologically optimized treatment plan for a thyroid patient. Compared to a standard IMPT plan, a 5% increase in the target's physical dose resulted in roughly 3 times as much increase in the BD. Biological planning was thus effective in escalating the target BD.
Qualitative and quantitative approaches in the dose-response assessment of genotoxic carcinogens.
Fukushima, Shoji; Gi, Min; Kakehashi, Anna; Wanibuchi, Hideki; Matsumoto, Michiharu
2016-05-01
Qualitative and quantitative approaches are important issues in the field of carcinogenic risk assessment of genotoxic carcinogens. Herein, we provide quantitative data on low-dose hepatocarcinogenicity studies for three genotoxic hepatocarcinogens: 2-amino-3,8-dimethylimidazo[4,5-f]quinoxaline (MeIQx), 2-amino-3-methylimidazo[4,5-f]quinoline (IQ) and N-nitrosodiethylamine (DEN). Hepatocarcinogenicity was examined by quantitative analysis of glutathione S-transferase placental form (GST-P) positive foci, which are the preneoplastic lesions in rat hepatocarcinogenesis and the endpoint carcinogenic marker in the rat liver medium-term carcinogenicity bioassay. We also examined DNA damage and gene mutations which occurred through the initiation stage of carcinogenesis. For the establishment of points of departure (PoDs) from which the cancer-related risk can be estimated, we analyzed the above events by quantitative no-observed-effect-level and benchmark dose approaches. MeIQx at low doses induced formation of DNA-MeIQx adducts; somewhat higher doses caused elevation of 8-hydroxy-2'-deoxyguanosine levels; at still higher doses gene mutations occurred; and the highest dose induced formation of GST-P positive foci. These data indicate that early genotoxic events in the pathway to carcinogenesis showed the expected trend of lower PoDs for earlier events in the carcinogenic process. Similarly, only the highest dose of IQ caused an increase in the number of GST-P positive foci in the liver, while IQ-DNA adduct formation was observed at low doses. Moreover, treatment with DEN at low doses had no effect on the development of GST-P positive foci in the liver. These data on PoDs for the markers contribute to understanding whether genotoxic carcinogens have a threshold for their carcinogenicity. The most appropriate approach to use in low-dose-response assessment must be chosen on the basis of scientific judgment. © The Author 2015. Published by Oxford University Press on behalf of the UK Environmental Mutagen Society. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
NASA Astrophysics Data System (ADS)
Magro, G.; Molinelli, S.; Mairani, A.; Mirandola, A.; Panizza, D.; Russo, S.; Ferrari, A.; Valvo, F.; Fossati, P.; Ciocca, M.
2015-09-01
This study was performed to evaluate the accuracy of a commercial treatment planning system (TPS) in optimising proton pencil beam dose distributions for small targets of different sizes (5-30 mm side) located at increasing depths in water. The TPS analytical algorithm was benchmarked against experimental data and the FLUKA Monte Carlo (MC) code, previously validated for the selected beam-line. We tested the Siemens syngo® TPS plan optimisation module for water cubes, fixing the configurable parameters at clinical standards, with homogeneous target coverage to a 2 Gy (RBE) dose prescription as the unique goal. Plans were delivered, and the dose at each volume centre was measured in water with a calibrated PTW Advanced Markus® chamber. An EBT3® film was also positioned at the phantom entrance window for the acquisition of 2D dose maps. Discrepancies between TPS-calculated and MC-simulated values were mainly due to the different lateral spread modeling and were found to be related to the field-to-spot size ratio. The accuracy of the TPS was proved to be clinically acceptable in all cases but very small and shallow volumes. In this context, the use of MC to validate TPS results proved to be a reliable procedure for pre-treatment plan verification.
NASA Astrophysics Data System (ADS)
Srinivasan, P.; Priya, S.; Patel, Tarun; Gopalakrishnan, R. K.; Sharma, D. N.
2015-01-01
DD/DT fusion neutron generators are used as sources of 2.5 MeV/14.1 MeV neutrons in experimental laboratories for various applications. Detailed knowledge of the radiation dose rates around neutron generators is essential for ensuring radiological protection of the personnel involved with their operation. This work describes the experimental and Monte Carlo studies carried out in the Purnima Neutron Generator facility of the Bhabha Atomic Research Centre (BARC), Mumbai. Verification and validation of the shielding adequacy were carried out by measuring the neutron and gamma dose rates at various locations inside and outside the neutron generator hall during different operational conditions, both for 2.5-MeV and 14.1-MeV neutrons, and comparing them with theoretical simulations. The calculated and experimental dose rates were found to agree within a maximum deviation of 20% at certain locations. This study has served to benchmark the Monte Carlo simulation methods adopted for shield design of such facilities. It has also helped in augmenting the existing shield thickness to reduce the neutron and associated gamma dose rates for radiological protection of personnel during operation of the generators at higher source neutron yields, up to 1 × 10¹⁰ n/s.
Modeling antimicrobial tolerance and treatment of heterogeneous biofilms.
Zhao, Jia; Seeluangsawat, Paisa; Wang, Qi
2016-12-01
A multiphasic, hydrodynamic model for spatially heterogeneous biofilms based on the phase field formulation is developed and applied to analyze antimicrobial tolerance of biofilms by acknowledging the existence of persistent and susceptible cells in the total population of bacteria. The model implements a new conversion rate between persistent and susceptible cells, and its homogeneous dynamics is benchmarked quantitatively against a known experiment. It is then discretized and solved on graphics processing units (GPUs) in 3-D space and time. With the model, biofilm development and antimicrobial treatment of biofilms in a flow cell are investigated numerically. Model predictions agree qualitatively well with available experimental observations. Specifically, the numerical results demonstrate that: (i) in a flow cell, nutrient, diffused in solvent and transported by hydrodynamics, has an apparent impact on persister formation, and thereby on the antimicrobial persistence of biofilms; (ii) dosing antimicrobial agents inside biofilms is more effective than dosing through diffusion in solvent; (iii) periodic dosing is less effective in antimicrobial treatment of biofilms in a nutrient-deficient environment than in a nutrient-sufficient environment. This model provides us with a simulation tool to analyze mechanisms of biofilm tolerance to antimicrobial agents and to derive potentially optimal dosing strategies for biofilm control and treatment. Copyright © 2016 Elsevier Inc. All rights reserved.
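The susceptible-persister structure of the model can be illustrated with its well-mixed (homogeneous) reduction: growth and killing act on susceptible cells only, with first-order conversion between the two subpopulations. A sketch with illustrative rate constants, not the paper's fitted values or its full phase-field hydrodynamics:

```python
import numpy as np
from scipy.integrate import solve_ivp

def biofilm_ode(t, y, k_sp, k_ps, mu, kill, dose_on):
    """Reduced (well-mixed) sketch of susceptible/persister dynamics:
    growth of susceptible cells, first-order conversion between the two
    subpopulations, and antimicrobial killing of susceptible cells only."""
    s, p = y
    d = kill if dose_on(t) else 0.0
    ds = mu * s - k_sp * s + k_ps * p - d * s
    dp = k_sp * s - k_ps * p
    return [ds, dp]

# Periodic dosing: antimicrobial applied for the first 4 h of every 12 h.
dose_on = lambda t: (t % 12.0) < 4.0
sol = solve_ivp(biofilm_ode, (0, 72), [1.0, 0.01],
                args=(0.02, 0.001, 0.3, 2.0, dose_on), max_step=0.05)
s, p = sol.y[:, -1]
print(f"after 72 h: susceptible {s:.3g}, persisters {p:.3g}")
```

The persister pool survives the dosing windows that wipe out susceptible cells, which is the tolerance mechanism the paper analyzes.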
DOE Office of Scientific and Technical Information (OSTI.GOV)
Petroccia, H; Olguin, E; Culberson, W
2016-06-15
Purpose: Innovations in radiotherapy treatments, such as dynamic IMRT, VMAT, and SBRT/SRS, result in larger proportions of low-dose regions where normal tissues are exposed to low dose levels. Low doses of radiation have been linked to secondary cancers and cardiac toxicities. The AAPM Task Group No. 158, entitled 'Measurements and Calculations of Doses outside the Treatment Volume from External-Beam Radiation Therapy', has been formed to review the dosimetry of non-target and out-of-field exposures using experimental and computational approaches. Studies on historical patients can provide comprehensive information about secondary effects from out-of-field doses when combined with long-term patient follow-up, thus providing significant insight into projecting future outcomes of patients undergoing modern-day treatments. Methods: We present a Monte Carlo model of a Theratron-1000 cobalt-60 teletherapy unit, which historically treated patients at the University of Florida, as a means of determining doses located outside the primary beam. Experimental data for a similar Theratron-1000 were obtained at the University of Wisconsin's ADCL to benchmark the model for out-of-field dosimetry. An Exradin A12 ion chamber and TLD-100 chips were used to measure doses in an extended water phantom to 60 cm outside the primary field at 5 and 10 cm depths. Results: Comparison between simulated and experimental measurements of PDDs and lateral profiles shows good agreement for in-field and out-of-field doses. At 10 cm away from the edge of 6×6, 10×10, and 20×20 cm2 fields, relative out-of-field doses were measured in the range of 0.5% to 3% of the dose at 5 cm depth along the CAX. Conclusion: Out-of-field doses can be as high as 90 to 180 cGy assuming historical prescription doses of 30 to 60 Gy, and should be considered when correlating late effects with normal tissue dose.
Limitations of Community College Benchmarking and Benchmarks
ERIC Educational Resources Information Center
Bers, Trudy H.
2006-01-01
This chapter distinguishes between benchmarks and benchmarking, describes a number of data and cultural limitations to benchmarking projects, and suggests that external demands for accountability are the dominant reason for growing interest in benchmarking among community colleges.
Georg, Dietmar; Stock, Markus; Kroupa, Bernhard; Olofsson, Jörgen; Nyholm, Tufve; Ahnesjö, Anders; Karlsson, Mikael
2007-08-21
Experimental methods are commonly used for patient-specific intensity-modulated radiotherapy (IMRT) verification. The purpose of this study was to investigate the accuracy and performance of independent dose calculation software (denoted as 'MUV' (monitor unit verification)) for patient-specific quality assurance (QA). 52 patients receiving step-and-shoot IMRT were considered. IMRT plans were recalculated by the treatment planning systems (TPS) in a dedicated QA phantom, in which an experimental 1D and 2D verification (0.3 cm³ ionization chamber; films) was performed. Additionally, an independent dose calculation was performed. The fluence-based algorithm of MUV accounts for collimator transmission, rounded leaf ends, tongue-and-groove effect, backscatter to the monitor chamber and scatter from the flattening filter. The dose calculation utilizes a pencil beam model based on a beam quality index. DICOM RT files from patient plans, exported from the TPS, were directly used as patient-specific input data in MUV. For composite IMRT plans, average deviations in the high dose region between ionization chamber measurements and point dose calculations performed with the TPS and MUV were 1.6 ± 1.2% and 0.5 ± 1.1% (1 S.D.). The dose deviations between MUV and TPS slightly depended on the distance from the isocentre position. For individual intensity-modulated beams (total 367), an average deviation of 1.1 ± 2.9% was determined between calculations performed with the TPS and with MUV, with maximum deviations up to 14%. However, absolute dose deviations were mostly less than 3 cGy. Based on the current results, we aim to apply a confidence limit of 3% (with respect to the prescribed dose) or 6 cGy for routine IMRT verification. For off-axis points at distances larger than 5 cm and for low dose regions, we consider 5% dose deviation or 10 cGy acceptable. The time needed for an independent calculation compares very favourably with the net time for an experimental approach. The physical effects modelled in the dose calculation software MUV allow accurate dose calculations in individual verification points. Independent calculations may be used to replace experimental dose verification once the IMRT programme is mature.
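The proposed confidence limits translate directly into a pass/fail check for each verification point. A sketch of that acceptance test using the tolerances stated in the abstract:

```python
def within_confidence_limit(dose_calc, dose_meas, prescribed,
                            off_axis=False, low_dose=False):
    """Check an independent-calculation point against the tolerances the
    authors propose: 3% of the prescribed dose or 6 cGy for routine points,
    relaxed to 5% or 10 cGy for off-axis points (>5 cm) and low-dose regions."""
    pct, abs_cgy = (0.05, 10.0) if (off_axis or low_dose) else (0.03, 6.0)
    diff = abs(dose_calc - dose_meas)
    return diff <= pct * prescribed or diff <= abs_cgy

# Usage: 200 cGy prescription, 4 cGy discrepancy at the isocentre -> passes.
print(within_confidence_limit(196.0, 200.0, 200.0))
```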
A clinical study of lung cancer dose calculation accuracy with Monte Carlo simulation.
Zhao, Yanqun; Qi, Guohai; Yin, Gang; Wang, Xianliang; Wang, Pei; Li, Jian; Xiao, Mingyong; Li, Jie; Kang, Shengwei; Liao, Xiongfei
2014-12-16
The accuracy of dose calculation is crucial to the quality of treatment planning and, consequently, to the dose delivered to patients undergoing radiation therapy. Current general calculation algorithms such as Pencil Beam Convolution (PBC) and Collapsed Cone Convolution (CCC) have shortcomings in regard to severe inhomogeneities, particularly in those regions where charged particle equilibrium does not hold. The aim of this study was to evaluate the accuracy of the PBC and CCC algorithms in lung cancer radiotherapy using Monte Carlo (MC) technology. Four treatment plans were designed using the Oncentra Masterplan TPS for each patient. Two intensity-modulated radiation therapy (IMRT) plans were developed using the PBC and CCC algorithms, and two three-dimensional conformal therapy (3DCRT) plans were developed using the PBC and CCC algorithms. The DICOM-RT files of the treatment plans were exported to the Monte Carlo system for recalculation. The dose distributions of GTV, PTV and the ipsilateral lung calculated by the TPS and MC were compared. For the 3DCRT and IMRT plans, the mean dose differences for the GTV between CCC and MC increased with decreasing GTV volume. For IMRT, the mean dose differences were found to be higher than those for 3DCRT. The CCC algorithm overestimated the GTV mean dose by approximately 3% for IMRT. For 3DCRT plans, when the volume of the GTV was greater than 100 cm³, the mean doses calculated by CCC and MC showed almost no difference. PBC showed large deviations from the MC algorithm. For the dose to the ipsilateral lung, the CCC algorithm overestimated the dose to the entire lung, and the PBC algorithm overestimated V20 but underestimated V5; the difference in V10 was not statistically significant. PBC substantially overestimates the dose to the tumour, but CCC is similar to the MC simulation. It is recommended that treatment plans for lung cancer be developed using an advanced dose calculation algorithm other than PBC. MC can accurately calculate the dose distribution in lung cancer and provides a notably effective tool for benchmarking the performance of other dose calculation algorithms within patients.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, Thomas Martin; Celik, Cihangir; McMahan, Kimberly L.
This benchmark experiment was conducted as a joint venture between the US Department of Energy (DOE) and the French Commissariat à l'Energie Atomique (CEA). Staff at the Oak Ridge National Laboratory (ORNL) in the US and the Centre de Valduc in France planned this experiment. The experiment was conducted on October 19, 2010 in the SILENE critical assembly facility at Valduc. Several other organizations contributed to this experiment and the subsequent evaluation, including CEA Saclay, Lawrence Livermore National Laboratory (LLNL), the Y-12 National Security Complex (NSC), Babcock International Group in the United Kingdom, and Los Alamos National Laboratory (LANL). The goal of this experiment was to measure neutron activation and thermoluminescent dosimeter (TLD) doses from a source similar to a fissile solution critical excursion. The resulting benchmark can be used for validation of computer codes and nuclear data libraries as required when performing analysis of criticality accident alarm systems (CAASs). A secondary goal of this experiment was to qualitatively test performance of two CAAS detectors similar to those currently and formerly in use in some US DOE facilities. The detectors tested were the CIDAS MkX and the Rocky Flats NCD-91. The CIDAS detects gammas with a Geiger-Muller tube and the Rocky Flats detects neutrons via charged particles produced in a thin 6LiF disc depositing energy in a Si solid state detector. These detectors were being evaluated to determine whether they would alarm, so they were not expected to generate benchmark quality data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, Thomas Martin; Celik, Cihangir; Isbell, Kimberly McMahan
This benchmark experiment was conducted as a joint venture between the US Department of Energy (DOE) and the French Commissariat à l'Energie Atomique (CEA). Staff at the Oak Ridge National Laboratory (ORNL) in the US and the Centre de Valduc in France planned this experiment. The experiment was conducted on October 13, 2010 in the SILENE critical assembly facility at Valduc. Several other organizations contributed to this experiment and the subsequent evaluation, including CEA Saclay, Lawrence Livermore National Laboratory (LLNL), the Y-12 National Security Complex (NSC), Babcock International Group in the United Kingdom, and Los Alamos National Laboratory (LANL). The goal of this experiment was to measure neutron activation and thermoluminescent dosimeter (TLD) doses from a source similar to a fissile solution critical excursion. The resulting benchmark can be used for validation of computer codes and nuclear data libraries as required when performing analysis of criticality accident alarm systems (CAASs). A secondary goal of this experiment was to qualitatively test performance of two CAAS detectors similar to those currently and formerly in use in some US DOE facilities. The detectors tested were the CIDAS MkX and the Rocky Flats NCD-91. The CIDAS detects gammas with a Geiger-Muller tube, and the Rocky Flats detects neutrons via charged particles produced in a thin 6LiF disc, depositing energy in a Si solid-state detector. These detectors were being evaluated to determine whether they would alarm, so they were not expected to generate benchmark quality data.
Jameson, Michael G; McNamara, Jo; Bailey, Michael; Metcalfe, Peter E; Holloway, Lois C; Foo, Kerwyn; Do, Viet; Mileshkin, Linda; Creutzberg, Carien L; Khaw, Pearly
2016-08-01
Protocol deviations in randomised controlled trials have been found to result in a significant decrease in survival and local control. In some cases, the magnitude of the detrimental effect can be larger than the anticipated benefits of the interventions involved. The implementation of appropriate quality assurance of radiotherapy measures for clinical trials has been found to result in fewer deviations from protocol. This paper reports on a benchmarking study conducted in preparation for the PORTEC-3 trial in Australasia. A benchmarking CT dataset was sent to each of the Australasian investigators, who were asked to contour and plan the case according to the trial protocol using local treatment planning systems. These data were then sent back to the Trans-Tasman Oncology Group for collation and analysis. Thirty-three investigators from eighteen institutions across Australia and New Zealand took part in the study. The mean clinical target volume (CTV) was 383.4 (228.5-497.8) cm³ and the mean dose to a reference gold standard CTV was 48.8 (46.4-50.3) Gy. Although there were some large differences in the contouring of the CTV and its constituent parts, these did not translate into large variations in dosimetry. Where individual investigators deviated from the trial contouring protocol, feedback was provided. The results of this study will be compared with the international study QA for the PORTEC-3 trial. © 2016 The Royal Australian and New Zealand College of Radiologists.
A measurement-based generalized source model for Monte Carlo dose simulations of CT scans
Ming, Xin; Feng, Yuanming; Liu, Ransheng; Yang, Chengwen; Zhou, Li; Zhai, Hezheng; Deng, Jun
2018-01-01
The goal of this study is to develop a generalized source model (GSM) for accurate Monte Carlo dose simulations of CT scans based solely on measurement data, without a priori knowledge of scanner specifications. The proposed generalized source model consists of an extended circular source located at the x-ray target level, with its energy spectrum, source distribution and fluence distribution derived from a set of measurement data conveniently available in the clinic. Specifically, the central axis percent depth dose (PDD) curves measured in water and the cone output factors measured in air were used to derive the energy spectrum and the source distribution, respectively, with a Levenberg-Marquardt algorithm. The in-air film measurement of fan-beam dose profiles at fixed gantry was back-projected to generate the fluence distribution of the source model. A benchmarked Monte Carlo user code was used to simulate the dose distributions in water with the developed source model as beam input. The feasibility and accuracy of the proposed source model were tested on GE LightSpeed and Philips Brilliance Big Bore multi-detector CT (MDCT) scanners available in our clinic. In general, the Monte Carlo simulations of the PDDs in water and dose profiles along the lateral and longitudinal directions agreed with the measurements within 4%/1 mm for both CT scanners. The absolute dose comparison using two CTDI phantoms (16 cm and 32 cm in diameter) indicated better than 5% agreement between the Monte Carlo-simulated and the ion chamber-measured doses at a variety of locations for the two scanners. Overall, this study demonstrated that a generalized source model can be constructed based only on a set of measurement data and used for accurate Monte Carlo dose simulations of patients' CT scans, which would facilitate patient-specific CT organ dose estimation and cancer risk management in diagnostic and therapeutic radiology. PMID:28079526
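The spectrum-derivation step above is essentially an unfolding problem: find spectrum weights whose weighted sum of mono-energetic basis PDDs reproduces the measured PDD. The paper uses a Levenberg-Marquardt fit; the sketch below substitutes a non-negative least-squares solve for the same idea, with toy exponential curves standing in for real Monte Carlo-computed basis PDDs. All numbers are invented.

```python
import numpy as np
from scipy.optimize import nnls

z = np.linspace(0.5, 15, 30)                    # depth in water, cm
energies = np.array([40, 60, 80, 100, 120])     # keV, illustrative bins
mu = 0.25 * (energies / 60.0) ** -0.4           # toy attenuation coefficients
basis = np.exp(-np.outer(z, mu))                # one toy PDD column per energy

true_w = np.array([0.10, 0.30, 0.35, 0.20, 0.05])  # "unknown" spectrum
rng = np.random.default_rng(1)
measured = basis @ true_w + 0.002 * rng.normal(size=z.size)

weights, _ = nnls(basis, measured)              # non-negative unfolding
print(weights / weights.sum())                  # recovered relative spectrum
```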
Evaluation of radiochromic gel dosimetry and polymer gel dosimetry in a clinical dose verification
NASA Astrophysics Data System (ADS)
Vandecasteele, Jan; De Deene, Yves
2013-09-01
A quantitative comparison of two full three-dimensional (3D) gel dosimetry techniques was performed in a clinical setting: radiochromic gel dosimetry with an in-house developed optical laser CT scanner, and polymer gel dosimetry with magnetic resonance imaging (MRI). To benchmark both gel dosimeters, they were exposed to a 6 MV photon beam and the depth dose was compared against a diamond detector measurement that served as the gold standard. Both gel dosimeters were found to be accurate to within 4%. In the 3D dose matrix of the radiochromic gel, hotspot dose deviations up to 8% were observed, which are attributed to the fabrication procedure. The polymer gel readout was shown to be sensitive to B0 field and B1 field non-uniformities as well as temperature variations during scanning. The performance of the two gel dosimeters was also evaluated for a brain tumour IMRT treatment. Both gel-measured dose distributions were compared against treatment planning system predicted dose maps, which were validated independently with ion chamber measurements and portal dosimetry. In the radiochromic gel measurement, two sources of deviations could be identified. Firstly, the dose in a cluster of voxels near the edge of the phantom deviated from the planned dose. Secondly, the presence of dose hotspots in the order of 10%, related to inhomogeneities in the gel, limits the clinical acceptance of this dosimetry technique. Based on the results of the micelle gel dosimeter prototype presented here, chemical optimization will be the subject of future work. Polymer gel dosimetry is capable of measuring the absolute dose in the whole 3D volume within 5% accuracy. A temperature stabilization technique is incorporated to increase the accuracy during short measurements; however, keeping the temperature stable during long measurement times in both the calibration phantoms and the volumetric phantom is more challenging. The sensitivity of MRI readout to minimal temperature fluctuations is demonstrated, which proves the need for adequate compensation strategies.
Toxicological profile of ultrapure 2,2',3,4,4',5,5'-heptachlorbiphenyl (PCB 180) in adult rats.
Viluksela, Matti; Heikkinen, Päivi; van der Ven, Leo T M; Rendel, Filip; Roos, Robert; Esteban, Javier; Korkalainen, Merja; Lensu, Sanna; Miettinen, Hanna M; Savolainen, Kari; Sankari, Satu; Lilienthal, Hellmuth; Adamsson, Annika; Toppari, Jorma; Herlin, Maria; Finnilä, Mikko; Tuukkanen, Juha; Leslie, Heather A; Hamers, Timo; Hamscher, Gerd; Al-Anati, Lauy; Stenius, Ulla; Dervola, Kine-Susann; Bogen, Inger-Lise; Fonnum, Frode; Andersson, Patrik L; Schrenk, Dieter; Halldin, Krister; Håkansson, Helen
2014-01-01
PCB 180 is a persistent non-dioxin-like polychlorinated biphenyl (NDL-PCB) abundantly present in food and the environment. Risk characterization of NDL-PCBs is confounded by the presence of highly potent dioxin-like impurities. We used ultrapure PCB 180 to characterize its toxicity profile in a 28-day repeat dose toxicity study in young adult rats, extended to cover endocrine and behavioral effects. Using a loading dose/maintenance dose regimen, groups of 5 males and 5 females were given total doses of 0, 3, 10, 30, 100, 300, 1000 or 1700 mg PCB 180/kg body weight by gavage. Dose-responses were analyzed using benchmark dose modeling based on dose and adipose tissue PCB concentrations. Body weight gain was retarded at 1700 mg/kg during loading dosing, but recovered thereafter. The most sensitive endpoint of toxicity used for risk characterization was altered open field behavior in females, i.e., increased activity and distance moved in the inner zone of an open field, suggesting altered emotional responses to an unfamiliar environment and impaired behavioral inhibition. Other dose-dependent changes included decreased serum thyroid hormones with associated histopathological changes, altered tissue retinoid levels, decreased hematocrit and hemoglobin, decreased follicle stimulating hormone and luteinizing hormone levels in males, and increased expression of DNA damage markers in the liver of females. Dose-dependent hypertrophy of zona fasciculata cells was observed in the adrenals, suggesting activation of the cortex. There were sex differences in sensitivity, and toxicity profiles were partly different in males and females. PCB 180 adipose tissue concentrations were clearly above general human population levels, but close to the levels in highly exposed populations. The results demonstrate a distinct toxicological profile of PCB 180, lacking the dioxin-like properties required for assignment of a WHO toxic equivalency factor. However, PCB 180 shares several toxicological targets with dioxin-like compounds, emphasizing the potential for interactions.
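Benchmark dose modelling of a continuous endpoint, as used above, amounts to fitting a dose-response model and inverting it at the benchmark response (BMR). A minimal sketch with invented data and a simple exponential model; the real analysis used dedicated BMD software, and a BMDL would additionally require a profile-likelihood or bootstrap lower confidence bound.

```python
import numpy as np
from scipy.optimize import curve_fit, brentq

# Invented continuous endpoint (% of control) versus total dose (mg/kg bw).
dose = np.array([0, 3, 10, 30, 100, 300, 1000, 1700], float)
resp = np.array([100, 99, 97, 93, 85, 72, 55, 48], float)

def expo(d, a, b):                 # simple exponential decline model
    return a * np.exp(-b * d)

(a, b), _ = curve_fit(expo, dose, resp, p0=(100.0, 1e-3))

bmr = 0.05                         # benchmark response: 5% change from control
target = a * (1.0 - bmr)
bmd = brentq(lambda d: expo(d, a, b) - target, 1e-6, dose.max())
print(f"BMD(5%) ~ {bmd:.1f} mg/kg")
```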
Faught, Austin M; Davidson, Scott E; Fontenot, Jonas; Kry, Stephen F; Etzel, Carol; Ibbott, Geoffrey S; Followill, David S
2017-09-01
The Imaging and Radiation Oncology Core Houston (IROC-H) (formerly the Radiological Physics Center) has reported varying levels of agreement in their anthropomorphic phantom audits. There is reason to believe one source of error in this observed disagreement is the accuracy of the dose calculation algorithms and heterogeneity corrections used. To audit this component of the radiotherapy treatment process, an independent dose calculation tool is needed. Monte Carlo multiple source models for Elekta 6 MV and 10 MV therapeutic x-ray beams were commissioned based on measurement of central axis depth dose data for a 10 × 10 cm² field size and dose profiles for a 40 × 40 cm² field size. The models were validated against open field measurements consisting of depth dose data and dose profiles for field sizes ranging from 3 × 3 cm² to 30 × 30 cm². The models were then benchmarked against measurements in IROC-H's anthropomorphic head and neck and lung phantoms. Validation results showed 97.9% and 96.8% of depth dose data passed a ±2% Van Dyk criterion for the 6 MV and 10 MV models, respectively. Dose profile comparisons showed an average agreement using a ±2%/2 mm criterion of 98.0% and 99.0% for the 6 MV and 10 MV models, respectively. Phantom plan comparisons were evaluated using a ±3%/2 mm gamma criterion, and averaged passing rates between Monte Carlo and measurements were 87.4% and 89.9% for the 6 MV and 10 MV models, respectively. Accurate multiple source models for Elekta 6 MV and 10 MV x-ray beams have been developed for inclusion in an independent dose calculation tool for use in clinical trial audits. © 2017 American Association of Physicists in Medicine.
Benchmarking specialty hospitals, a scoping review on theory and practice.
Wind, A; van Harten, W H
2017-04-04
Although benchmarking may improve hospital processes, research on this subject is limited. The aim of this study was to provide an overview of publications on benchmarking in specialty hospitals and a description of study characteristics. We searched PubMed and EMBASE for articles published in English in the last 10 years. Eligible articles described a project stating benchmarking as its objective and involving a specialty hospital or specific patient category, or dealt with the methodology or evaluation of benchmarking. Of 1,817 articles identified in total, 24 were included in the study. Articles were categorized as: pathway benchmarking, institutional benchmarking, articles on benchmarking methodology or evaluation, and benchmarking using a patient registry. There was a large degree of variability: (1) study designs were mostly descriptive and retrospective; (2) not all studies generated and showed data in sufficient detail; and (3) studies varied in whether a benchmarking model was merely described or quality improvement as a consequence of the benchmark was reported upon. Most of the studies that described a benchmark model used benchmarking partners from the same industry category, sometimes from all over the world. Benchmarking seems to be more developed in eye hospitals, emergency departments and oncology specialty hospitals. Some studies showed promising improvement effects. However, the majority of the articles lacked a structured design and did not report on benchmark outcomes. In order to evaluate the effectiveness of benchmarking to improve quality in specialty hospitals, robust and structured designs are needed, including follow-up to check whether the benchmark study has led to improvements.
High exposure to inorganic arsenic by food: the need for risk reduction.
Gundert-Remy, Ursula; Damm, Georg; Foth, Heidi; Freyberger, Alexius; Gebel, Thomas; Golka, Klaus; Röhl, Claudia; Schupp, Thomas; Wollin, Klaus-Michael; Hengstler, Jan Georg
2015-12-01
Arsenic is a human carcinogen that occurs ubiquitously in soil and water. Based on epidemiological studies, a benchmark dose (lower/higher bound estimate) between 0.3 and 8 μg/kg bw/day was estimated to cause a 1% increased risk of lung, skin and bladder cancer. A recently published study by EFSA on dietary exposure to inorganic arsenic in the European population reported 95th percentiles (lower bound min to upper bound max) for different age groups in the same range as the benchmark dose. For toddlers, a highly exposed group, the highest values ranged between 0.61 and 2.09 μg arsenic/kg bw/day. For all other age classes, the margin of exposure is also small. This scenario calls for regulatory action to reduce arsenic exposure. One priority measure should be to reduce arsenic in the food categories that contribute most to exposure. In the EFSA study the food categories 'milk and dairy products,' 'drinking water' and 'food for infants' represent major sources of inorganic arsenic for infants, and rice is also an important source. Long-term strategies are required to reduce inorganic arsenic in these food groups. The reduced consumption of rice and rice products which has been recommended may be helpful for a minority of individuals consuming unusually high amounts of rice. However, it is only of limited value for the general European population, because the food categories 'grain-based processed products (non rice-based)' and 'milk and dairy products' contribute more to exposure to inorganic arsenic than the food category 'rice.' A balanced regulatory activity focusing on the most relevant food categories is required. In conclusion, exposure to inorganic arsenic represents a risk to the health of the European population, particularly to young children. Regulatory measures to reduce exposure are urgently required.
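The comparison being made above is a margin-of-exposure (MOE) calculation: the benchmark dose (lower bound) divided by the exposure estimate. A worked example pairing the extreme values quoted in the abstract; the pairing itself is illustrative.

```python
bmdl = 0.3          # lower end of the benchmark dose range, ug/kg bw/day
exposure = 2.09     # upper-bound 95th percentile for toddlers, ug/kg bw/day

moe = bmdl / exposure
print(f"MOE = {moe:.2f}")   # ~0.14: exposure lies inside the benchmark range
```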
Ellis, Judith
2006-07-01
The aim of this article is to review published descriptions of benchmarking activity and synthesize benchmarking principles, to encourage the acceptance and use of Essence of Care as a new benchmarking approach to continuous quality improvement, and to promote its acceptance as an integral and effective part of benchmarking activity in health services. Essence of Care was launched by the Department of Health in England in 2001 to provide a benchmarking tool kit to support continuous improvement in the quality of fundamental aspects of health care, for example, privacy and dignity, nutrition and hygiene. The tool kit is now being used effectively by some frontline staff. However, use is inconsistent, and the value of the tool kit, or the support that clinical practice benchmarking requires to be effective, is not always recognized or provided by National Health Service managers, who are absorbed with quantitative benchmarking approaches and the measurability of comparative performance data. The published benchmarking literature was reviewed through an ever-narrowing search strategy, commencing from benchmarking within the quality improvement literature, through benchmarking activity in health services, and including not only published examples of benchmarking approaches and models but also web-based benchmarking data. This supported identification of how benchmarking approaches have developed and been used while remaining true to the basic benchmarking principles of continuous improvement through comparison and sharing (Camp 1989). Descriptions of models and exemplars of quantitative, and specifically performance, benchmarking activity in industry abound (Camp 1998), with far fewer examples of more qualitative and process benchmarking approaches in use in the public services and applied to the health service (Bullivant 1998). The literature is also mainly descriptive in its support of the effectiveness of benchmarking activity; although this does not seem to have restricted the popularity of quantitative activity, reticence about the value of the more qualitative approaches, for example Essence of Care, needs to be overcome in order to improve the quality of patient care and experiences. The perceived immeasurability and subjectivity of Essence of Care and clinical practice benchmarks mean that these benchmarking approaches are not always accepted or supported by health service organizations as valid benchmarking activity. In conclusion, Essence of Care benchmarking is a sophisticated clinical practice benchmarking approach which needs to be accepted as an integral part of health service benchmarking activity to support improvement in the quality of patient care and experiences.
NASA Astrophysics Data System (ADS)
Williamson, Jeffrey F.
2006-09-01
This paper briefly reviews the evolution of brachytherapy dosimetry from 1900 to the present. Dosimetric practices in brachytherapy fall into three distinct eras. During the era of biological dosimetry (1900-1938), radium pioneers could only specify Ra-226 and Rn-222 implants in terms of the mass of radium encapsulated within the implanted sources. Due to the high energy of its emitted gamma rays and the long range of its secondary electrons in air, free-air chambers could not be used to quantify the output of Ra-226 sources in terms of exposure. Biological dosimetry, most prominently the threshold erythema dose, gained currency as a means of intercomparing radium treatments with exposure-calibrated orthovoltage x-ray units. The classical dosimetry era (1940-1980) began with successful exposure standardization of Ra-226 sources by Bragg-Gray cavity chambers. Classical dose-computation algorithms, based upon 1-D buildup factor measurements and point-source superposition computational algorithms, were able to accommodate artificial radionuclides such as Co-60, Ir-192, and Cs-137. The quantitative dosimetry era (1980-present) arose in response to the increasing utilization of low-energy K-capture radionuclides such as I-125 and Pd-103, for which classical approaches could not be expected to estimate accurate doses. This led to intensive development of both experimental (largely TLD-100 dosimetry) and Monte Carlo dosimetry techniques, along with more accurate air-kerma strength standards. As a result of extensive benchmarking and intercomparison of these different methods, single-seed low-energy radionuclide dose distributions are now known with a total uncertainty of 3-5%.
Wang, Haitao; Duan, Huawei; Meng, Tao; Yang, Mo; Cui, Lianhua; Bin, Ping; Dai, Yufei; Niu, Yong; Shen, Meili; Zhang, Liping; Zheng, Yuxin; Leng, Shuguang
2018-04-01
Diesel exhaust (DE), the major source of vehicle-emitted particulate matter in ambient air, impairs lung function. The objectives were to assess the contribution of local (e.g., the fraction of exhaled nitric oxide [FeNO] and serum Club cell secretory protein [CC16]) and systemic (e.g., serum C-reactive protein [CRP] and interleukin-6 [IL-6]) inflammation to DE-induced lung function impairment using a unique cohort of diesel engine testers (DETs, n = 137) and non-DETs (n = 127), made up of current and noncurrent smokers. Urinary metabolites, FeNO, serum markers, and spirometry were assessed. A 19% reduction in CC16 and a 94% increase in CRP were identified in DETs compared with non-DETs (all p values < 10⁻⁴), findings further corroborated by a dose-response relationship with the internal dose for DE exposure (all p values < .04) and a time-course relationship with DE exposure history (all p values < .005). Mediation analysis showed that 43% of the difference in FEV1 between DETs and non-DETs can be explained by circulating CC16 and CRP (permuted p < .001). An inverse dose-dependent relationship between FeNO and the internal dose for cigarette smoke was identified (p = .0003). A range of 1.0261-1.4513 μg phenanthrols/g creatinine in urine was recommended as the 95% lower bound of the benchmark dose (internal dose) for regulatory risk assessment. Local and systemic inflammation may be key processes that contribute to the subsequent development of obstructive lung disease in DE-exposed populations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lalonde, Michel; Alexander, Kevin; Olding, Tim
Purpose: Radiochromic film dosimetry is a standard technique used in clinics to verify modern conformal radiation therapy delivery, and sometimes in research to validate other dosimeters. We are using film as a standard for comparison as we improve high-resolution three-dimensional gel systems for small field dosimetry; however, precise film dosimetry can be technically challenging. We report here measurements for fractionated stereotactic radiation therapy (FSRT) delivered using volumetric modulated arc therapy (VMAT) to investigate the accuracy and reproducibility of film measurements with a novel in-house readout system. We show that radiochromic film can accurately and reproducibly validate FSRT deliveries and also benchmark our gel dosimetry work. Methods: VMAT FSRT plans for metastases alone (PTV_MET) and whole brain plus metastases (WB+PTV_MET) were delivered onto a multi-configurational phantom with a sheet of EBT3 Gafchromic film inserted mid-plane. A dose of 400 cGy was prescribed to 4 small PTV_MET structures in the phantom, while a WB structure was prescribed a dose of 200 cGy in the WB+PTV_MET iterations. Doses generated from film readout with our in-house system were compared to treatment planned doses. Each delivery was repeated multiple times to assess reproducibility. Results and Conclusions: The reproducibility of film optical density readout was excellent throughout all experiments. Doses measured from the film agreed well with plans for the WB+PTV_MET delivery. However, film doses for PTV_MET-only deliveries were significantly below planned doses. This discrepancy is due to stray/scattered light perturbations in our system during readout. Correction schemes will be presented.
NASA Astrophysics Data System (ADS)
Jansen, Jan T. M.; Shrimpton, Paul C.
2016-07-01
The ImPACT (imaging performance assessment of CT scanners) CT patient dosimetry calculator is still used worldwide to estimate organ and effective doses (E) for computed tomography (CT) examinations, although the tool is based on Monte Carlo calculations reflecting practice in the early 1990s. Subsequent developments in CT scanners, definitions of E, anthropomorphic phantoms, computers and radiation transport codes have all fuelled an urgent need for updated organ dose conversion factors for contemporary CT. A new system for such simulations has been developed and satisfactorily tested. Benchmark comparisons of normalised organ doses presently derived for three old scanners (General Electric 9800, Philips Tomoscan LX and Siemens Somatom DRH) are within 5% of published values. Moreover, calculated normalised values of the CT Dose Index for these scanners are in reasonable agreement (within measurement and computational uncertainties of ±6% and ±1%, respectively) with reported standard measurements. Organ dose coefficients calculated for a contemporary CT scanner (Siemens Somatom Sensation 16) demonstrate potential deviations of up to around 30% from the surrogate values presently assumed (through a scanner matching process) when using the ImPACT CT Dosimetry tool for newer scanners. Also, illustrative estimates of E for some typical examinations and a range of anthropomorphic phantoms demonstrate the significant differences (by some tens of percent) that can arise when changing from the previously adopted stylised mathematical phantom to the voxel phantoms presently recommended by the International Commission on Radiological Protection (ICRP), and when following the 2007 ICRP recommendations (updated from 1990) concerning tissue weighting factors. Further simulations with the validated dosimetry system will provide updated series of dose coefficients for a wide range of contemporary scanners.
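Effective dose in such dosimetry systems is the weighted sum E = Σ_T w_T H_T over organ equivalent doses H_T with ICRP tissue weighting factors w_T, which is why revising the factors (as ICRP did in 2007) changes E. A sketch over a subset of ICRP 103 tissues; the organ doses are invented numbers for a hypothetical chest scan, and the sum is deliberately partial.

```python
# Subset of ICRP 103 tissue weighting factors (dimensionless).
w_t = {"lung": 0.12, "breast": 0.12, "stomach": 0.12,
       "liver": 0.04, "thyroid": 0.04}

# Hypothetical organ equivalent doses for a chest CT, mSv (invented).
h_t = {"lung": 12.0, "breast": 10.5, "stomach": 4.0,
       "liver": 6.5, "thyroid": 1.2}

e = sum(w_t[t] * h_t[t] for t in w_t)
print(f"E ~ {e:.2f} mSv (partial sum over the listed tissues only)")
```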
Mirsch, Johanna; Tommasino, Francesco; Frohns, Antonia; Conrad, Sandro; Durante, Marco; Scholz, Michael; Friedrich, Thomas; Löbrich, Markus
2015-01-01
Charged particles are increasingly used in cancer radiotherapy and contribute significantly to the natural radiation risk. The difference in the biological effects of high-energy charged particles compared with X-rays or γ-rays is determined largely by the spatial distribution of their energy deposition events. Part of the energy is deposited in a densely ionizing manner in the inner part of the track, with the remainder spread out more sparsely over the outer track region. Our knowledge about the dose distribution is derived solely from modeling approaches and physical measurements in inorganic material. Here we exploited the exceptional sensitivity of γH2AX foci technology and quantified the spatial distribution of DNA lesions induced by charged particles in a mouse model tissue. We observed that charged particles damage tissue nonhomogeneously, with single cells receiving high doses and many other cells exposed to isolated damage resulting from high-energy secondary electrons. Using calibration experiments, we transformed the 3D lesion distribution into a dose distribution and compared it with predictions from modeling approaches. We obtained a radial dose distribution with sub-micrometer resolution that decreased with increasing distance to the particle path, following a 1/r² dependence. The analysis further revealed the existence of a background dose at larger distances from the particle path arising from overlapping dose deposition events from independent particles. Our study provides, to our knowledge, the first quantification of the spatial dose distribution of charged particles in biologically relevant material, and will serve as a benchmark for biophysical models that predict the biological effects of these particles. PMID:26392532
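The radial fall-off reported above, a 1/r² decrease plus a constant background from overlapping independent tracks, can be recovered from data by a two-parameter fit. A sketch with synthetic data; the model form follows the abstract, all numbers are invented.

```python
import numpy as np
from scipy.optimize import curve_fit

def radial_dose(r, A, B):
    return A / r**2 + B        # track term plus inter-track background

r = np.linspace(0.1, 5.0, 25)                     # distance from track, um
rng = np.random.default_rng(2)
d = radial_dose(r, A=0.8, B=0.05) * rng.normal(1.0, 0.03, r.size)

(A, B), _ = curve_fit(radial_dose, r, d, p0=(1.0, 0.0))
print(f"A = {A:.3f}, background B = {B:.3f}")
```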
An international dosimetry exchange for BNCT part II: computational dosimetry normalizations.
Riley, K J; Binns, P J; Harling, O K; Albritton, J R; Kiger, W S; Rezaei, A; Sköld, K; Seppälä, T; Savolainen, S; Auterinen, I; Marek, M; Viererbl, L; Nievaart, V A; Moss, R L
2008-12-01
The meaningful sharing and combining of clinical results from different centers in the world performing boron neutron capture therapy (BNCT) requires improved precision in dose specification between programs. To this end, absorbed dose normalizations were performed for the European clinical centers at the Joint Research Centre of the European Commission, Petten (The Netherlands), Nuclear Research Institute, Rez (Czech Republic), VTT, Espoo (Finland), and Studsvik, Nyköping (Sweden). Each European group prepared a treatment plan calculation that was benchmarked against Massachusetts Institute of Technology (MIT) dosimetry performed in a large, water-filled phantom to uniformly evaluate dose specifications with an estimated precision of ±2-3%. These normalizations were compared with those derived from an earlier exchange between Brookhaven National Laboratory (BNL) and MIT in the USA. Neglecting the uncertainties related to biological weighting factors, large variations between calculated and measured dose are apparent that depend upon the ¹⁰B uptake in tissue. Assuming a boron concentration of 15 μg g⁻¹ in normal tissue, differences in the evaluated maximum dose to brain for the same nominal specification of 10 Gy(w) at the different facilities range between 7.6 and 13.2 Gy(w) in the trials using boronophenylalanine (BPA) as the boron delivery compound, and between 8.9 and 11.1 Gy(w) in the two boron sulfhydryl (BSH) studies. Most notably, the value for the same specified dose of 10 Gy(w) determined at the different participating centers using BPA is significantly higher than at BNL: by 32% (MIT), 43% (VTT), 49% (JRC), and 74% (Studsvik). Conversion of dose specification is now possible between all active participants and should be incorporated into future multi-center patient analyses.
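The dependence on ¹⁰B uptake arises because the weighted dose in BNCT is a sum of physical dose components multiplied by biological weighting factors, with the boron component scaling linearly with tissue ¹⁰B concentration. A bookkeeping sketch; the weighting factors and component doses below are typical illustrative values, not those of the trials above.

```python
# Biological weighting factors (illustrative: RBE ~3.2 for neutron components,
# CBE ~1.3 for BPA in normal tissue) and physical dose components in Gy.
w = {"boron": 1.3, "nitrogen": 3.2, "fast_n": 3.2, "gamma": 1.0}
d = {"boron_per_ppm": 0.04, "nitrogen": 0.8, "fast_n": 0.5, "gamma": 2.0}

def weighted_dose(boron_ppm):
    """Total weighted dose in Gy(w) for a given tissue 10B concentration."""
    return (w["boron"] * d["boron_per_ppm"] * boron_ppm
            + w["nitrogen"] * d["nitrogen"]
            + w["fast_n"] * d["fast_n"]
            + w["gamma"] * d["gamma"])

print(f"{weighted_dose(15.0):.1f} Gy(w) at 15 ppm 10B")
```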
Monaco and film dosimetry of 3D CRT, IMRT and VMAT cases in a realistic pelvic prosthetic phantom
NASA Astrophysics Data System (ADS)
Ade, Nicholas; du Plessis, F. C. P.
2018-04-01
The dosimetry of patients with metallic hip implants during irradiation of pelvic lesions is challenging due to dose distortions caused by the implants. This work presents a dosimetric comparison of various multi-field photon-beam dose distributions in the presence of a unilateral hip titanium prosthesis (UHTiP) embedded in a unique pelvic phantom made of water-equivalent nylon slices. The impact of the UHTiP on the accuracy of dose calculations from a Monaco TPS (treatment planning system) using the X-ray voxel Monte Carlo (XVMC) algorithm was benchmarked against dose data measured with Gafchromic EBT3 film. Multi-field beam arrangements including a 4-field box, 5-field 3DCRT (three-dimensional conformal radiation therapy), 6-field IMRT (intensity-modulated radiation therapy) and a single-arc VMAT (volumetric modulated arc therapy) plan were set up for 6 MV and 15 MV beams. These plans were generated for the pelvic phantom containing the prosthesis, with film inserted. Compared to Monaco TPS dose calculations, film measurements showed enhanced dose in the prosthesis which was not predicted by Monaco, due to its limitation in relative density assignment. The enhanced prosthesis dose increased with increasing beam energy and decreased with the complexity of the treatment plans, with VMAT giving the least escalated dose. The dose increased between 5% and 19% for 6 MV and between 6% and 21% for 15 MV. A gamma index analysis showed that 70-92% of dose points (excluding the prosthesis) were within 3% discrepancy. Increasing the number of treatment fields increases target dose coverage and improves the agreement between film and Monaco. When the relative electron density (RED) in the prosthesis was varied between 3.72 and 15, the dose discrepancy between film and Monaco increased from 30% to 57% for 6 MV and from 30% to 50% for 15 MV. The study indicates that beam weights for fields that pass through the prosthesis should be minimised, and that the RED of the prosthesis must be correct for accurate dose calculation in Monaco.
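The gamma index used above combines a dose tolerance and a distance-to-agreement tolerance into a single pass/fail number per point. A minimal 1D global-gamma sketch (the study's analysis was 2D on film); the profiles and tolerances below are invented.

```python
import numpy as np

def gamma_1d(x_mm, d_eval, d_ref, dose_tol=0.03, dta_mm=2.0):
    """Global 1D gamma; dose tolerance is relative to max of the reference."""
    dd = dose_tol * d_ref.max()
    g = np.empty(d_eval.size)
    for i, (xi, di) in enumerate(zip(x_mm, d_eval)):
        g[i] = np.min(np.sqrt(((x_mm - xi) / dta_mm) ** 2
                              + ((d_ref - di) / dd) ** 2))
    return g

x = np.linspace(0, 100, 201)                     # position, mm
ref = np.exp(-((x - 50) / 18.0) ** 2)            # reference profile
ev = 1.01 * np.exp(-((x - 50.8) / 18.0) ** 2)    # shifted, rescaled copy
g = gamma_1d(x, ev, ref)
print(f"pass rate: {100 * np.mean(g <= 1.0):.1f}%")
```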
NASA Technical Reports Server (NTRS)
Bell, Michael A.
1999-01-01
Informal benchmarking using personal or professional networks has taken place for many years at the Kennedy Space Center (KSC). The National Aeronautics and Space Administration (NASA) recognized early on the need to formalize the benchmarking process for better utilization of resources and improved benchmarking performance. The need to compete in a faster, better, cheaper environment has been the catalyst for formalizing these efforts. A pioneering benchmarking consortium was chartered at KSC in January 1994. The consortium, known as the Kennedy Benchmarking Clearinghouse (KBC), is a collaborative effort of NASA and all major KSC contractors. The charter of this consortium is to facilitate effective benchmarking and leverage the resulting quality improvements across KSC. The KBC acts as a resource with experienced facilitators and a proven process. One of the initial actions of the KBC was to develop a holistic methodology for Center-wide benchmarking. This approach to benchmarking integrates the best features of proven benchmarking models (i.e., Camp, Spendolini, Watson, and Balm). This cost-effective alternative to conventional benchmarking approaches has provided a foundation for consistent benchmarking at KSC through the development of common terminology, tools, and techniques. Through these efforts a foundation and infrastructure have been built which allow short-duration benchmarking studies yielding results gleaned from world-class partners that can be readily implemented. The KBC has been recognized with the Silver Medal Award (in the applied research category) from the International Benchmarking Clearinghouse.
Field, Kevin G.; Yang, Ying; Busby, Jeremy T.; ...
2015-03-09
Radiation induced segregation (RIS) is a well-studied phenomenon which occurs in many structurally relevant nuclear materials, including austenitic stainless steels. RIS occurs because solute atoms preferentially couple to mobile point defect fluxes that migrate and interact with defect sinks. Here, a 304 stainless steel was neutron irradiated up to 47.1 dpa at 320 °C. Investigations into the RIS response at specific grain boundary types were used to determine the sink characteristics of different boundary types as a function of irradiation dose. A rate theory model built on the foundation of the modified inverse Kirkendall (MIK) model is proposed and benchmarked to the experimental results. This model, termed the GiMIK model, includes alterations in the boundary conditions based on grain boundary structure and includes expressions for interstitial binding. This investigation, through experiment and modeling, found that specific grain boundary structures exhibit unique defect sink characteristics depending on their local structure. Furthermore, such interactions were found to be consistent across all doses investigated and have larger global implications, including precipitation of Ni-Si clusters near different grain boundary types.
Moudgal, Chandrika J; Garrahan, Kevin; Brady-Roberts, Eletha; Gavrelis, Naida; Arbogast, Michelle; Dun, Sarah
2008-11-15
The toxicity value database of the United States Environmental Protection Agency's (EPA) National Homeland Security Research Center has been in development since 2004. The toxicity value database includes a compilation of agent property, toxicity, dose-response, and health effects data for 96 agents: 84 chemical and radiological agents and 12 biotoxins. The database is populated with multiple toxicity benchmark values and agent property information from secondary sources, with web links to the secondary sources where available. A selected set of primary literature citations and associated dose-response data are also included. The toxicity value database offers a powerful means to quickly and efficiently gather pertinent toxicity and dose-response data for a number of agents that are of concern to the nation's security. This database, in conjunction with other tools, will play an important role in understanding human health risks, and will provide a means for risk assessors and managers to make quick and informed decisions on potential health risks and determine appropriate responses (e.g., cleanup) to agent release. A final, stand-alone MS Access working version of the toxicity value database was completed in November 2007.
Hasegawa, R; Hirata-Koizumi, M; Dourson, M; Parker, A; Hirose, A; Nakai, S; Kamata, E; Ema, M
2007-04-01
We comprehensively re-analyzed the previously published toxicity data for 18 industrial chemicals from repeated oral exposures in newborn and young rats. Two new toxicity endpoints specific to this comparative analysis were identified: the first, the presumed no observed adverse effect level (pNOAEL), was estimated based on the results of both main and dose-finding studies; the second, the presumed unequivocally toxic level (pUETL), was defined as a clear toxic dose giving similar severity in both newborn and young rats. Based on the analyses of both pNOAEL and pUETL ratios between the different ages, newborn rats demonstrated greater susceptibility (at most 8-fold) to nearly two thirds of these 18 chemicals (mostly phenolic substances), and less or nearly equal sensitivity to the others. Exceptionally, one chemical showed toxicity only in newborn rats. In addition, benchmark dose lower bound (BMDL) estimates were calculated as an alternative endpoint. Most BMDLs were comparable to their corresponding pNOAELs, and the overall correlation coefficient was 0.904. We discuss how our results can be incorporated into chemical risk assessment approaches to protect pediatric health from direct oral exposure to chemicals.
Application of the first collision source method to CSNS target station shielding calculation
NASA Astrophysics Data System (ADS)
Zheng, Ying; Zhang, Bin; Chen, Meng-Teng; Zhang, Liang; Cao, Bo; Chen, Yi-Xue; Yin, Wen; Liang, Tian-Jiao
2016-04-01
Ray effects are an inherent problem of the discrete ordinates method. RAY3D, a functional module of ARES, a discrete ordinates code system, employs a semi-analytic first collision source method to mitigate ray effects. This method decomposes the flux into uncollided and collided components, and then calculates them with an analytical method and the discrete ordinates method, respectively. In this article, RAY3D is validated against the Kobayashi benchmarks and applied to the neutron beamline shielding problem of the China Spallation Neutron Source (CSNS) target station. The numerical results of the Kobayashi benchmarks indicate that the solutions of DONTRAN3D with RAY3D agree well with the Monte Carlo solutions. The dose rate at the end of the neutron beamline is less than 10.83 μSv/h in the CSNS target station neutron beamline shutter model. RAY3D can effectively mitigate ray effects and obtain relatively reasonable results. Supported by the Major National S&T Specific Program of Large Advanced Pressurized Water Reactor Nuclear Power Plant (2011ZX06004-007), the National Natural Science Foundation of China (11505059, 11575061), and the Fundamental Research Funds for the Central Universities (13QN34).
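The semi-analytic step in a first collision source method is the analytic uncollided flux: for an isotropic point source, φ_u(r) = S e^(−Σ_t r)/(4πr²), and its scattering interactions form the first-collision source that feeds the discrete ordinates sweep. A one-group sketch with invented cross sections.

```python
import numpy as np

S = 1.0e10        # source strength, n/s (invented)
sigma_t = 0.25    # total macroscopic cross section, 1/cm (invented)
sigma_s = 0.20    # scattering cross section, 1/cm (invented)

def uncollided_flux(r_cm):
    """phi_u(r) = S * exp(-sigma_t * r) / (4 * pi * r^2)."""
    return S * np.exp(-sigma_t * r_cm) / (4.0 * np.pi * r_cm**2)

r = np.array([10.0, 50.0, 100.0])
phi_u = uncollided_flux(r)
q_fc = sigma_s * phi_u     # first-collision source density for the SN sweep
print(phi_u, q_fc)
```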
NASA Astrophysics Data System (ADS)
Smekens, F.; Létang, J. M.; Noblet, C.; Chiavassa, S.; Delpon, G.; Freud, N.; Rit, S.; Sarrut, D.
2014-12-01
We propose the split exponential track length estimator (seTLE), a new kerma-based method combining the exponential variant of the TLE and a splitting strategy to speed up Monte Carlo (MC) dose computation for low-energy photon beams. The splitting strategy is applied to both the primary and the secondary emitted photons, triggered by either the MC event generator for primaries or the photon interaction generator for secondaries. Split photons are replaced by virtual particles for fast dose calculation using the exponential TLE. Virtual particles are propagated by ray-tracing in voxelized volumes and by conventional MC navigation elsewhere. Hence, the contribution of volumes such as collimators, treatment couch and holding devices can be taken into account in the dose calculation. We evaluated and analysed the seTLE method for two realistic small animal radiotherapy treatment plans. The effect of the kerma approximation, i.e. the complete deactivation of electron transport, was investigated. The efficiency of seTLE against splitting multiplicities was also studied. A benchmark with analog MC and TLE was carried out in terms of dose convergence and efficiency. The results showed that the deactivation of electrons impacts the dose at the water/bone interface in high dose regions. The maximum and mean dose differences, normalized to the dose at the isocenter, were 14% and 2%, respectively. Optimal splitting multiplicities were found to be around 300. In all situations, discrepancies in integral dose were below 0.5% and 99.8% of the voxels fulfilled a 1%/0.3 mm gamma index criterion. Efficiency gains of seTLE varied from 3.2 × 10⁵ to 7.7 × 10⁵ compared to analog MC, and from 13 to 15 compared to conventional TLE. In conclusion, seTLE provides results similar to the TLE while increasing the efficiency by a factor of between 13 and 15, which makes it particularly well-suited to typical small animal radiation therapy applications.
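The exponential TLE at the core of seTLE scores kerma deterministically along a virtual photon's ray, attenuating the contribution by the optical depth accumulated up to each voxel. A deliberately simplified 1D sketch (volume/mass normalisation omitted); all material data are invented.

```python
import numpy as np

def exp_tle_score(e_mev, weight, step_cm, mu, mu_en_over_rho):
    """Score a quantity proportional to kerma in each voxel along one ray."""
    scores = np.zeros(mu.size)
    tau = 0.0                                  # accumulated optical depth
    for i in range(mu.size):
        # attenuated track-length contribution of voxel i
        scores[i] = weight * e_mev * mu_en_over_rho[i] * np.exp(-tau) * step_cm
        tau += mu[i] * step_cm
    return scores

mu = np.array([0.02, 0.02, 0.05, 0.02])         # 1/cm, toy water/bone/water
muen = np.array([0.018, 0.018, 0.045, 0.018])   # cm^2/g, toy values
print(exp_tle_score(0.1, 1.0, 0.5, mu, muen))
```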
NASA Astrophysics Data System (ADS)
Bäumer, C.; Janson, M.; Timmermann, B.; Wulff, J.
2018-04-01
To assess whether apertures should be mounted upstream or downstream of a range shifting block when these field-shaping devices are combined with the pencil-beam scanning delivery technique (PBS). The lateral dose fall-off served as a benchmark parameter. Both options for realizing PBS-with-apertures were compared to the uniform scanning mode. We also evaluated the difference in out-of-field dose caused by interactions of protons in beam-shaping devices. The potential benefit of the downstream configuration over the upstream configuration was estimated analytically. Guided by this theoretical evaluation, a mechanical adapter was developed which transforms the upstream configuration provided by the proton machine vendor into a downstream configuration. Transversal dose profiles were calculated with the Monte Carlo based dose engine of the commercial treatment planning system RayStation 6. Two-dimensional dose planes were measured with an ionization chamber array and a scintillation detector at different depths and compared to the calculation. Additionally, a clinical example for irradiation of the orbit was compared for both PBS options and a uniform scanning treatment plan. Assuming the same air gap, the lateral dose fall-off at the field edge at a few centimetres depth is 20% smaller for the aperture-downstream configuration than for the upstream one. For both options of PBS-with-apertures the dose fall-off is larger than in the uniform scanning delivery mode if the minimum accelerator energy is 100 MeV. The RayStation treatment planning system calculated the width of the lateral dose fall-off with an accuracy of typically 0.1 mm–0.3 mm. Although experiments and calculations indicate a ranking of the three delivery options regarding lateral dose fall-off, there seems to be a limited impact on a multi-field treatment plan.
Yang, Y. M.; Geurts, M.; Smilowitz, J. B.; Sterpin, E.; Bednarz, B. P.
2015-01-01
Purpose: Several groups are exploring the integration of magnetic resonance (MR) image guidance with radiotherapy to reduce tumor position uncertainty during photon radiotherapy. The therapeutic gain from reducing tumor position uncertainty using intrafraction MR imaging during radiotherapy could be partially offset if the negative effects of magnetic field-induced dose perturbations are not appreciated or accounted for. The authors hypothesize that a more rotationally symmetric modality such as helical tomotherapy will permit a systematic mediation of these dose perturbations. This investigation offers a unique look at the dose perturbations due to a homogeneous transverse magnetic field during the delivery of Tomotherapy® Treatment System plans under varying degrees of rotational beamlet symmetry. Methods: The authors accurately reproduced treatment plan beamlet and patient configurations using the Monte Carlo code geant4. This code has a thoroughly benchmarked electromagnetic particle transport physics package well-suited for the radiotherapy energy regime. The three approved clinical treatment plans for this study were for a prostate, head and neck, and lung treatment. The dose heterogeneity index metric was used to quantify the effect of the dose perturbations on the target volumes. Results: The authors demonstrate the ability to reproduce the clinical dose–volume histograms (DVH) to within 4% dose agreement at each DVH point for the target volumes and most planning structures, and therefore are able to confidently examine the effects of transverse magnetic fields on the plans. The authors investigated field strengths of 0.35, 0.7, 1, 1.5, and 3 T. Changes to the dose heterogeneity index of 0.1% were seen in the prostate and head and neck cases, reflecting negligible dose perturbations to the target volumes; a change from 5.5% to 20.1% was observed for the lung case. Conclusions: This study demonstrated that the effect of external magnetic fields can be mitigated by exploiting a more rotationally symmetric treatment modality. PMID:25652485
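The heterogeneity index itself is a single DVH-derived number; definitions vary between studies, so the common form below, HI = (D2% − D98%)/D50%, is an illustrative stand-in rather than necessarily the metric used in this paper.

```python
import numpy as np

def heterogeneity_index(target_doses_gy):
    """HI = (D2% - D98%) / D50%; one common definition, assumed here."""
    d2, d50, d98 = np.percentile(target_doses_gy, [98, 50, 2])
    return (d2 - d98) / d50

rng = np.random.default_rng(4)
ptv = rng.normal(60.0, 1.2, 5000)       # toy PTV voxel doses, Gy
print(f"HI = {heterogeneity_index(ptv):.3f}")
```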
DOE Office of Scientific and Technical Information (OSTI.GOV)
Suter, G.W. II; Tsao, C.L.
1996-06-01
This report presents potential screening benchmarks for protection of aquatic life from contaminants in water. Because there is no guidance for screening benchmarks, a set of alternative benchmarks is presented herein. This report presents the alternative benchmarks for chemicals that have been detected on the Oak Ridge Reservation. It also presents the data used to calculate the benchmarks and the sources of the data. It compares the benchmarks and discusses their relative conservatism and utility. Benchmark values are updated where appropriate, new benchmark values are added, secondary sources are replaced by primary sources, and more complete documentation of the sources and derivation of all values is provided.
Benchmarking in emergency health systems.
Kennedy, Marcus P; Allen, Jacqueline; Allen, Greg
2002-12-01
This paper discusses the role of benchmarking as a component of quality management. It describes the historical background of benchmarking, its competitive origin and the requirement in today's health environment for a more collaborative approach. The classical 'functional and generic' types of benchmarking are discussed, with a suggestion to adopt a different terminology that describes the purpose and practicalities of benchmarking. Benchmarking is not without risks. The consequences of inappropriate focus and the need for a balanced overview of process are explored. The competition that is intrinsic to benchmarking is questioned, and the negative impact it may have on improvement strategies in poorly performing organizations is recognized. The difficulty in achieving cross-organizational validity in benchmarking is emphasized, as is the need to scrutinize benchmarking measures. The cost-effectiveness of benchmarking projects is questioned, and the concept of 'best value, best practice' in an environment of fixed resources is examined.
NASA Technical Reports Server (NTRS)
Bailey, David (Editor); Barton, John (Editor); Lasinski, Thomas (Editor); Simon, Horst (Editor)
1993-01-01
A new set of benchmarks was developed for the performance evaluation of highly parallel supercomputers. These benchmarks consist of a set of kernels, the 'Parallel Kernels,' and a simulated application benchmark. Together they mimic the computation and data movement characteristics of large scale computational fluid dynamics (CFD) applications. The principal distinguishing feature of these benchmarks is their 'pencil and paper' specification: all details of these benchmarks are specified only algorithmically. In this way many of the difficulties associated with conventional benchmarking approaches on highly parallel systems are avoided.
Benchmarking and Performance Measurement.
ERIC Educational Resources Information Center
Town, J. Stephen
This paper defines benchmarking and its relationship to quality management, describes a project which applied the technique in a library context, and explores the relationship between performance measurement and benchmarking. Numerous benchmarking methods contain similar elements: deciding what to benchmark; identifying partners; gathering…
HPC Analytics Support. Requirements for Uncertainty Quantification Benchmarks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paulson, Patrick R.; Purohit, Sumit; Rodriguez, Luke R.
2015-05-01
This report outlines techniques for extending benchmark generation products so they support uncertainty quantification by benchmarked systems. We describe how uncertainty quantification requirements can be presented to candidate analytical tools supporting SPARQL. We describe benchmark data sets for evaluating uncertainty quantification, as well as an approach for using our benchmark generator to produce such data sets.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Suter, G.W., II
1993-01-01
One of the initial stages in ecological risk assessment of hazardous waste sites is the screening of contaminants to determine which, if any, of them are worthy of further consideration; this process is termed contaminant screening. Screening is performed by comparing concentrations in ambient media to benchmark concentrations that are either indicative of a high likelihood of significant effects (upper screening benchmarks) or of a very low likelihood of significant effects (lower screening benchmarks). Exceedance of an upper screening benchmark indicates that the chemical in question is clearly of concern and remedial actions are likely to be needed. Exceedance of a lower screening benchmark indicates that a contaminant is of concern unless other information indicates that the data are unreliable or the comparison is inappropriate. Chemicals with concentrations below the lower benchmark are not of concern if the ambient data are judged to be adequate. This report presents potential screening benchmarks for protection of aquatic life from contaminants in water. Because there is no guidance for screening benchmarks, a set of alternative benchmarks is presented herein. The alternative benchmarks are based on different conceptual approaches to estimating concentrations causing significant effects. For the upper screening benchmark, there are the acute National Ambient Water Quality Criteria (NAWQC) and the Secondary Acute Values (SAV). The SAV concentrations are values estimated with 80% confidence not to exceed the unknown acute NAWQC for those chemicals with no NAWQC. The alternative chronic benchmarks are the chronic NAWQC, the Secondary Chronic Value (SCV), the lowest chronic values for fish and daphnids, the lowest EC20 for fish and daphnids from chronic toxicity tests, the estimated EC20 for a sensitive species, and the concentration estimated to cause a 20% reduction in the recruit abundance of largemouth bass. It is recommended that ambient chemical concentrations be compared to all of these benchmarks. If NAWQC are exceeded, the chemicals must be contaminants of concern because the NAWQC are applicable or relevant and appropriate requirements (ARARs). If NAWQC are not exceeded, but other benchmarks are, contaminants should be selected on the basis of the number of benchmarks exceeded and the conservatism of the particular benchmark values, as discussed in the text. To the extent that toxicity data are available, this report presents the alternative benchmarks for chemicals that have been detected on the Oak Ridge Reservation. It also presents the data used to calculate the benchmarks and the sources of the data. It compares the benchmarks and discusses their relative conservatism and utility. This report supersedes a prior aquatic benchmarks report (Suter and Mabrey 1994). It adds two new types of benchmarks. It also updates the benchmark values where appropriate, adds some new benchmark values, replaces secondary sources with primary sources, and provides more complete documentation of the sources and derivation of all values.
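The screening logic spelled out above maps cleanly onto a small decision function. An illustrative encoding: the wording of the outcomes follows the report, but the function itself is hypothetical.

```python
def screen_contaminant(conc, upper, lower, data_adequate=True):
    """Compare an ambient concentration to upper/lower screening benchmarks."""
    if upper is not None and conc > upper:
        return "contaminant of concern; remedial action likely needed"
    if lower is not None and conc > lower:
        return "of concern unless data are unreliable or comparison inappropriate"
    if data_adequate:
        return "not of concern"
    return "cannot be screened out; ambient data inadequate"

print(screen_contaminant(12.0, upper=10.0, lower=1.0))
```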
The KMAT: Benchmarking Knowledge Management.
ERIC Educational Resources Information Center
de Jager, Martha
Provides an overview of knowledge management and benchmarking, including the benefits and methods of benchmarking (e.g., competitive, cooperative, collaborative, and internal benchmarking). Arthur Andersen's KMAT (Knowledge Management Assessment Tool) is described. The KMAT is a collaborative benchmarking tool, designed to help organizations make…
An experimental MOSFET approach to characterize (192)Ir HDR source anisotropy.
Toye, W C; Das, K R; Todd, S P; Kenny, M B; Franich, R D; Johnston, P N
2007-09-07
The dose anisotropy around a (192)Ir HDR source in a water phantom has been measured using MOSFETs as relative dosimeters. In addition, modeling using the EGSnrc code has been performed to provide a complete dose distribution consistent with the MOSFET measurements. Doses around the Nucletron 'classic' (192)Ir HDR source were measured for a range of radial distances from 5 to 30 mm within a 40 x 30 x 30 cm(3) water phantom, using a TN-RD-50 MOSFET dosimetry system with an active area of 0.2 mm by 0.2 mm. For each successive measurement a linear stepper capable of movement in intervals of 0.0125 mm re-positioned the MOSFET at the required radial distance, while a rotational stepper enabled angular displacement of the source at intervals of 0.9 degrees. The source-dosimeter arrangement within the water phantom was modeled using the standardized cylindrical geometry of the DOSRZnrc user code. In general, the measured relative anisotropy at each radial distance from 5 mm to 30 mm is in good agreement with the EGSnrc simulations, benchmark Monte Carlo simulation and TLD measurements where they exist. The experimental approach employing a MOSFET detection system of small size, high spatial resolution and fast read-out capability allowed a practical approach to the determination of dose anisotropy around a HDR source.
NASA Technical Reports Server (NTRS)
Bailey, D. H.; Barszcz, E.; Barton, J. T.; Carter, R. L.; Lasinski, T. A.; Browning, D. S.; Dagum, L.; Fatoohi, R. A.; Frederickson, P. O.; Schreiber, R. S.
1991-01-01
A new set of benchmarks has been developed for the performance evaluation of highly parallel supercomputers in the framework of the NASA Ames Numerical Aerodynamic Simulation (NAS) Program. These consist of five 'parallel kernel' benchmarks and three 'simulated application' benchmarks. Together they mimic the computation and data movement characteristics of large-scale computational fluid dynamics applications. The principal distinguishing feature of these benchmarks is their 'pencil and paper' specification: all details of these benchmarks are specified only algorithmically. In this way many of the difficulties associated with conventional benchmarking approaches on highly parallel systems are avoided.
Radiation breakage of DNA: a model based on random-walk chromatin structure
NASA Technical Reports Server (NTRS)
Ponomarev, A. L.; Sachs, R. K.
2001-01-01
Monte Carlo computer software, called DNAbreak, has recently been developed to analyze observed non-random clustering of DNA double strand breaks in chromatin after exposure to densely ionizing radiation. The software models coarse-grained configurations of chromatin and radiation tracks, small-scale details being suppressed in order to obtain statistical results for larger scales, up to the size of a whole chromosome. We here give an analytic counterpart of the numerical model, useful for benchmarks, for elucidating the numerical results, for analyzing the assumptions of a more general but less mechanistic "randomly-located-clusters" formalism, and, potentially, for speeding up the calculations. The equations characterize multi-track DNA fragment-size distributions in terms of one-track action; an important step in extrapolating high-dose laboratory results to the much lower doses of main interest in environmental or occupational risk estimation. The approach can utilize the experimental information on DNA fragment-size distributions to draw inferences about large-scale chromatin geometry during cell-cycle interphase.
Benchmark Analysis of Pion Contribution from Galactic Cosmic Rays
NASA Technical Reports Server (NTRS)
Aghara, Sukesh K.; Blattnig, Steve R.; Norbury, John W.; Singleterry, Robert C., Jr.
2008-01-01
Shielding strategies for extended stays in space must include a comprehensive resolution of the secondary radiation environment inside the spacecraft induced by the primary, external radiation. The distribution of absorbed dose and dose equivalent is a function of the type, energy and population of these secondary products. A systematic verification and validation effort is underway for HZETRN, a space radiation transport code currently used by NASA. It performs neutron, proton and heavy ion transport explicitly, but it does not take into account the production and transport of mesons, photons and leptons. The question naturally arises as to how much these particles contribute to space radiation. The pion has a production kinetic energy threshold of about 280 MeV. The Galactic cosmic ray (GCR) spectra, coincidentally, reach flux maxima in the hundreds of MeV range, corresponding to the pion production threshold. We present results from the Monte Carlo code MCNPX, showing the effect of lepton and meson physics when produced and transported explicitly in a GCR environment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
2015-06-15
With the recent introduction of heterogeneity correction algorithms for brachytherapy, the AAPM community is still unclear on how to commission and implement these into clinical practice. The recently-published AAPM TG-186 report discusses important issues for clinical implementation of these algorithms. A charge of the AAPM-ESTRO-ABG Working Group on MBDCA in Brachytherapy (WGMBDCA) is the development of a set of well-defined test case plans, available as references in the software commissioning process to be performed by clinical end-users. In this practical medical physics course, specific examples on how to perform the commissioning process are presented, as well as descriptions of the clinical impact from recent literature reporting comparisons of TG-43 and heterogeneity-based dosimetry. Learning Objectives: Identify key clinical applications needing advanced dose calculation in brachytherapy. Review TG-186 and WGMBDCA guidelines, commission process, and dosimetry benchmarks. Evaluate clinical cases using commercially available systems and compare to TG-43 dosimetry.
Alternative stitching method for massively parallel e-beam lithography
NASA Astrophysics Data System (ADS)
Brandt, Pieter; Tranquillin, Céline; Wieland, Marco; Bayle, Sébastien; Milléquant, Matthieu; Renault, Guillaume
2015-03-01
In this study a novel stitching method, distinct from Soft Edge (SE) and Smart Boundary (SB), is introduced and benchmarked against SE. The method is based on locally enhanced exposure latitude without throughput cost, making use of the fact that the two beams that pass through the stitching region can deposit up to 2x the nominal dose. The method requires a complex proximity effect correction that takes a preset stitching dose profile into account. On a Metal clip at a minimum half-pitch of 32 nm for MAPPER FLX 1200 tool specifications, the novel stitching method effectively mitigates beam-to-beam (B2B) position errors such that they do not induce an increase in CD uniformity (CDU). In other words, the same CDU can be realized inside the stitching region as outside it. For the SE method, the CDU inside is 0.3 nm higher than outside the stitching region. The 5 nm direct overlay impact from B2B position errors cannot be reduced by a stitching strategy.
Berger, Thomas; Bilski, Paweł; Hajek, Michael; Puchalska, Monika; Reitz, Günther
2013-12-01
Astronauts working and living in space are exposed to considerably higher doses and different qualities of ionizing radiation than people on Earth. The multilateral MATROSHKA (MTR) experiment, coordinated by the German Aerospace Center, represents the most comprehensive effort to date in radiation protection dosimetry in space, using an anthropomorphic upper-torso phantom of the kind used for radiotherapy treatment planning. Acting as a simulated human body, the phantom maps the radiation distribution outside (MTR-1) and inside different compartments (MTR-2A: Pirs; MTR-2B: Zvezda) of the Russian Segment of the International Space Station. Thermoluminescence dosimeters arranged in a 2.54 cm orthogonal grid, at the sites of vital organs, and on the surface of the phantom allow for visualization of the absorbed dose distribution with superior spatial resolution. These results should help improve the estimation of radiation risks for long-term human space exploration and support benchmarking of radiation transport codes.
42 CFR 440.335 - Benchmark-equivalent health benefits coverage.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 42 Public Health 4 2013-10-01 2013-10-01 false Benchmark-equivalent health benefits coverage. 440... and Benchmark-Equivalent Coverage § 440.335 Benchmark-equivalent health benefits coverage. (a) Aggregate actuarial value. Benchmark-equivalent coverage is health benefits coverage that has an aggregate...
42 CFR 440.335 - Benchmark-equivalent health benefits coverage.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 42 Public Health 4 2011-10-01 2011-10-01 false Benchmark-equivalent health benefits coverage. 440... and Benchmark-Equivalent Coverage § 440.335 Benchmark-equivalent health benefits coverage. (a) Aggregate actuarial value. Benchmark-equivalent coverage is health benefits coverage that has an aggregate...
NASA Astrophysics Data System (ADS)
Pappas, Eleftherios P.; Zoros, Emmanouil; Moutsatsos, Argyris; Peppa, Vasiliki; Zourari, Kyveli; Karaiskos, Pantelis; Papagiannis, Panagiotis
2017-05-01
There is an acknowledged need for the design and implementation of physical phantoms appropriate for the experimental validation of model-based dose calculation algorithms (MBDCA) introduced recently in 192Ir brachytherapy treatment planning systems (TPS), and this work investigates whether it can be met. A PMMA phantom was prepared to accommodate material inhomogeneities (air and Teflon), four plastic brachytherapy catheters, as well as 84 LiF TLD dosimeters (MTS-100M 1 × 1 × 1 mm3 microcubes), two radiochromic films (Gafchromic EBT3) and a plastic 3D dosimeter (PRESAGE). An irradiation plan consisting of 53 source dwell positions was prepared on phantom CT images using a commercially available TPS and taking into account the calibration dose range of each detector. Irradiation was performed using an 192Ir high dose rate (HDR) source. Dose to medium in medium, Dmm, was calculated using the MBDCA option of the same TPS as well as Monte Carlo (MC) simulation with the MCNP code and a benchmarked methodology. Measured and calculated dose distributions were spatially registered and compared. The total standard (k = 1) spatial uncertainties for TLD, film and PRESAGE were: 0.71, 1.58 and 2.55 mm. Corresponding percentage total dosimetric uncertainties were: 5.4-6.4%, 2.5-6.4% and 4.85%, owing mainly to the absorbed dose sensitivity correction and the relative energy dependence correction (position dependent) for TLD, the film sensitivity calibration (dose dependent) and the dependencies of PRESAGE sensitivity. Results imply a LiF over-response due to a relative intrinsic energy dependence between 192Ir and megavoltage calibration energies, and a dose rate dependence of PRESAGE sensitivity at low dose rates (<1 Gy min-1). Calculations were experimentally validated within uncertainties except for MBDCA results for points in the phantom periphery and dose levels <20%. Experimental MBDCA validation is laborious, yet feasible. Further work is required for the full characterization of dosimeter response for 192Ir and the reduction of experimental uncertainties.
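For context, a minimal sketch of combining independent standard uncertainty components in quadrature, which underlies k = 1 totals such as those quoted above; the component values are illustrative, not the paper's uncertainty budget:

import math

# Quadrature combination of independent standard (k = 1) uncertainty components.
components_mm = [0.5, 0.4, 0.3]  # e.g. registration, detector positioning, scan resolution
total_mm = math.sqrt(sum(u ** 2 for u in components_mm))
print(f"total spatial uncertainty: {total_mm:.2f} mm")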
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang Yibao; Yan Yulong; Nath, Ravinder
2012-08-01
Purpose: To develop a quantitative method for the estimation of kV cone beam computed tomography (kVCBCT) doses in pediatric patients undergoing image-guided radiotherapy. Methods and Materials: Forty-two children were retrospectively analyzed in subgroups of different scanned regions: one group in the head-and-neck and the other group in the pelvis. Critical structures in planning CT images were delineated on an Eclipse treatment planning system before being converted into CT phantoms for Monte Carlo simulations. A benchmarked EGS4 Monte Carlo code was used to calculate three-dimensional dose distributions of kVCBCT scans with full-fan high-quality head or half-fan pelvis protocols predefined by the manufacturer. Based on planning CT images and structures exported in DICOM RT format, occipital-frontal circumferences (OFC) were calculated for head-and-neck patients using DICOMan software. Similarly, hip circumferences (HIP) were acquired for the pelvic group. Correlations between mean organ doses and age, weight, OFC, and HIP values were analyzed with the SigmaPlot software suite, where regression performances were analyzed with relative dose differences (RDD) and coefficients of determination (R²). Results: kVCBCT-contributed mean doses to all critical structures decreased monotonically with studied parameters, with a steeper decrease in the pelvis than in the head. Empirical functions have been developed for a dose estimation of the major organs at risk in the head and pelvis, respectively. If evaluated with physical parameters other than age, a mean RDD of up to 7.9% was observed for all the structures in our population of 42 patients. Conclusions: kVCBCT doses are highly correlated with patient size. According to this study, weight can be used as a primary index for dose assessment in both head and pelvis scans, while OFC and HIP may serve as secondary indices for dose estimation in corresponding regions. With the proposed empirical functions, it is possible to perform an individualized quantitative dose assessment of kVCBCT scans.
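A sketch of fitting an empirical dose-versus-size function and scoring it with relative dose differences (RDD); the exponential form and all data are hypothetical, since the abstract does not give the fitted functional forms:

import numpy as np
from scipy.optimize import curve_fit

# Monotonically decreasing organ dose as a function of patient weight (illustrative).
def organ_dose(weight_kg, a, b):
    return a * np.exp(-b * weight_kg)

weights = np.array([10.0, 15.0, 20.0, 30.0, 40.0, 55.0])  # hypothetical patients
doses = np.array([4.1, 3.4, 2.9, 2.2, 1.7, 1.2])          # hypothetical mean organ doses (cGy)

params, _ = curve_fit(organ_dose, weights, doses, p0=(5.0, 0.02))
predicted = organ_dose(weights, *params)
rdd = np.abs(predicted - doses) / doses * 100              # relative dose difference (%)
print(f"mean RDD: {rdd.mean():.1f}%")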
Sarayani, Amir; Rashidian, Arash; Gholami, Kheirollah
2014-01-01
Objectives Diabetes is a major public health concern worldwide, particularly in low-income and middle-income countries (LMICs). Limited data exist on the status of access to diabetes medicines in LMICs. We assessed the utilisation and affordability of diabetes medicines in Iran as a middle-income country. Design We used a retrospective time-series design (2000–2012) and assessed national diabetes medicines' utilisation using pharmaceutical wholesale data. Methods We calculated the defined daily doses per 1000 inhabitants per day (DDDs/1000 inhabitants/day; DIDs) indicator. Findings were benchmarked against data from Organization for Economic Co-operation and Development (OECD) countries. We also employed the Drug Utilization-90% (DU-90) method to compare the DU-90 segment with the Essential Medicines List published by the WHO. We measured affordability as the number of minimum daily wages required to purchase a treatment course for 1 month. Results Diabetes medicines' consumption increased from 4.47 to 33.54 DIDs. The benchmarking showed that medicines' utilisation in Iran in 2011 was only 54% of the median DIDs of 22 OECD countries. Oral hypoglycaemic agents accounted for over 80% of use throughout the study period. Regular and isophane insulin (NPH), glibenclamide, metformin and gliclazide were the DU-90 drugs in 2012. Metformin, glibenclamide and regular/NPH insulin combination therapy were affordable throughout the study period (∼0.4, ∼0.1, ∼0.3 of minimum daily wage, respectively). While the affordability of novel insulin preparations improved over time, they were still unaffordable in 2012. Conclusions The utilisation of diabetes medicines was relatively low, perhaps due to underdiagnosis and inadequate management of patients with diabetes. This had occurred despite affordability of essential diabetes medicines in Iran. Appropriate policies are required to address the underutilisation of diabetes medicines in Iran. PMID:25324322
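A minimal sketch of the DID indicator used above, with hypothetical sales and population figures; the WHO DDD for metformin (2000 mg) is used for illustration:

# Defined daily doses per 1000 inhabitants per day (DDDs/1000 inhabitants/day; DIDs).
def dids(total_mg_sold, ddd_mg, population, days=365):
    return (total_mg_sold / ddd_mg) / (population * days) * 1000

# Hypothetical: 1.2e12 mg of metformin sold in one year to 70 million people.
print(f"{dids(total_mg_sold=1.2e12, ddd_mg=2000, population=70e6):.1f} DIDs")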
42 CFR 440.330 - Benchmark health benefits coverage.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 42 Public Health 4 2012-10-01 2012-10-01 false Benchmark health benefits coverage. 440.330 Section 440.330 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN... Benchmark-Equivalent Coverage § 440.330 Benchmark health benefits coverage. Benchmark coverage is health...
Adeleye, Yeyejide; Andersen, Melvin; Clewell, Rebecca; Davies, Michael; Dent, Matthew; Edwards, Sue; Fowler, Paul; Malcomber, Sophie; Nicol, Beate; Scott, Andrew; Scott, Sharon; Sun, Bin; Westmoreland, Carl; White, Andrew; Zhang, Qiang; Carmichael, Paul L
2015-06-05
Risk assessment methodologies in toxicology have remained largely unchanged for decades. The default approach uses high-dose animal studies, together with human exposure estimates, and conservative assessment (uncertainty) factors or linear extrapolations to determine whether a specific chemical exposure is 'safe' or 'unsafe'. Although some incremental changes have appeared over the years, results from all new approaches are still judged against this process of extrapolating high-dose effects in animals to low-dose exposures in humans. The US National Research Council blueprint for change, entitled Toxicity Testing in the 21st Century: A Vision and a Strategy, called for a transformation of toxicity testing from a system based on high-dose studies in laboratory animals to one founded primarily on in vitro methods that evaluate changes in normal cellular signalling pathways using human-relevant cells or tissues. More recently, this concept of pathways-based approaches to risk assessment has been expanded by the description of 'Adverse Outcome Pathways' (AOPs). The question, however, has been how to translate this AOP/TT21C vision into the practical tools that will be useful to those expected to make safety decisions. We have sought to provide a practical example of how the TT21C vision can be implemented to facilitate a safety assessment for a commercial chemical without the use of animal testing. To this end, the key elements of the TT21C vision have been broken down into a set of actions that can be brought together to achieve such a safety assessment. Such components of a pathways-based risk assessment have been widely discussed; however, to date, no worked examples of the entire risk assessment process exist. In order to begin to test the process, we have taken the approach of examining a prototype toxicity pathway (DNA damage responses mediated by the p53 network) and constructing a strategy for the development of a pathway-based risk assessment for a specific chemical in a case study mode. This contribution represents a 'work-in-progress' and is meant to both highlight concepts that are well-developed and identify aspects of the overall process which require additional development. To guide our understanding of what a pathways-based risk assessment could look like in practice, we chose to work on a case study chemical (quercetin) with a defined human exposure and to bring a multidisciplinary team of chemists, biologists, modellers and risk assessors to work together towards a safety assessment. Our goal was to see if the in vitro dose response for quercetin could be sufficiently understood to construct a TT21C risk assessment without recourse to rodent carcinogenicity study data. The data presented include high throughput pathway biomarkers (p-H2AX, p-ATM, p-ATR, p-Chk2, p53, p-p53, MDM2 and Wip1) and markers of cell-cycle, apoptosis and micronuclei formation, plus gene transcription in HT1080 cells. Eighteen-point dose-response curves were generated using flow cytometry and imaging to determine the concentrations that resulted in significant perturbation. NOELs and BMDs were compared to the output from biokinetic modelling, and the potential for in vitro to in vivo extrapolation was explored. A first-tier risk assessment was performed comparing the total quercetin concentration in the in vitro systems with the predicted total quercetin concentration in plasma and tissues. The shortcomings of this approach and recommendations for improvement are described.
This paper therefore describes the current progress in an ongoing research effort aimed at providing a pathways-based, proof-of-concept in vitro-only safety assessment for a consumer use product. Copyright © 2014 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
Ferretti, A; Martignano, A; Simonato, F; Paiusco, M
2014-02-01
The aim of the present work was the validation of the VMC(++) Monte Carlo (MC) engine implemented in the Oncentra Masterplan (OMTPS) and used to calculate the dose distribution produced by the electron beams (energy 5-12 MeV) generated by the linear accelerator (linac) Primus (Siemens), shaped by a digital variable applicator (DEVA). The BEAMnrc/DOSXYZnrc (EGSnrc package) MC model of the linac head was used as a benchmark. Commissioning results for both MC codes were evaluated by means of 1D Gamma Analysis (2%, 2 mm), calculated with a home-made Matlab (The MathWorks) program, comparing the calculations with the measured profiles. The results of the commissioning of OMTPS were good [average gamma index (γ) > 97%]; some mismatches were found with large beams (size ≥ 15 cm). The optimization of the BEAMnrc model required increasing the beam exit window to match the calculated and measured profiles (final average γ > 98%). Then OMTPS dose distribution maps were compared with DOSXYZnrc with a 2D Gamma Analysis (3%, 3 mm), in 3 virtual water phantoms: (a) with an air step, (b) with an air insert, and (c) with a bone insert. The OMTPS and EGSnrc dose distributions with the air-water step phantom were in very high agreement (γ ∼ 99%), while for heterogeneous phantoms there were differences of about 9% in the air insert and of about 10-15% in the bone region. This is due to the Masterplan implementation of VMC(++), which reports the dose as "dose to water", instead of "dose to medium". Copyright © 2013 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Sayre, George Anthony
The purpose of this dissertation was to develop the C++ program Emergency Dose to calculate transport of radionuclides through indoor spaces using intermediate fidelity physics that provides improved spatial heterogeneity over well-mixed models such as MELCOR and much lower computation times than CFD codes such as FLUENT. Modified potential flow theory (MPFT), an original formulation of potential flow theory with added turbulent jet and natural convection approximations, calculates spatially heterogeneous velocity fields that well-mixed models cannot predict. Other original contributions of MPFT are: (1) generation of high fidelity boundary conditions relative to well-mixed-CFD coupling methods (conflation), (2) broadening of potential flow applications to arbitrary indoor spaces, previously restricted to specific applications such as exhaust hood studies, and (3) great reduction of computation time relative to CFD codes without total loss of heterogeneity. Additionally, the Lagrangian transport module, which is discussed in Sections 1.3 and 2.4, showcases an ensemble-based formulation thought to be original to interior studies. Velocity and concentration transport benchmarks against analogous formulations in COMSOL produced favorable results, with discrepancies resulting from the tetrahedral meshing used in COMSOL outperforming the Cartesian method used by Emergency Dose. A performance comparison of the concentration transport modules against MELCOR showed that Emergency Dose held advantages over the well-mixed model, especially in scenarios with many interior partitions and varied source positions. A performance comparison of the velocity module against FLUENT showed that viscous drag provided the largest error between Emergency Dose and CFD velocity calculations, but that Emergency Dose's turbulent jets well approximated the corresponding CFD jets. Overall, Emergency Dose was found to provide a viable intermediate solution method for concentration transport with relatively low computation times.
A method for modeling laterally asymmetric proton beamlets resulting from collimation
Gelover, Edgar; Wang, Dongxu; Hill, Patrick M.; Flynn, Ryan T.; Gao, Mingcheng; Laub, Steve; Pankuch, Mark; Hyer, Daniel E.
2015-01-01
Purpose: To introduce a method to model the 3D dose distribution of laterally asymmetric proton beamlets resulting from collimation. The model enables rapid beamlet calculation for spot scanning (SS) delivery using a novel penumbra-reducing dynamic collimation system (DCS) with two pairs of trimmers oriented perpendicular to each other. Methods: Trimmed beamlet dose distributions in water were simulated with MCNPX and the collimating effects noted in the simulations were validated by experimental measurement. The simulated beamlets were modeled analytically using integral depth dose curves along with an asymmetric Gaussian function to represent fluence in the beam’s eye view (BEV). The BEV parameters consisted of Gaussian standard deviations (sigmas) along each primary axis (σx1,σx2,σy1,σy2) together with the spatial location of the maximum dose (μx,μy). Percent depth dose variation with trimmer position was accounted for with a depth-dependent correction function. Beamlet growth with depth was accounted for by combining the in-air divergence with Hong’s fit of the Highland approximation along each axis in the BEV. Results: The beamlet model showed excellent agreement with the Monte Carlo simulation data used as a benchmark. The overall passing rate for a 3D gamma test with 3%/3 mm passing criteria was 96.1% between the analytical model and Monte Carlo data in an example treatment plan. Conclusions: The analytical model is capable of accurately representing individual asymmetric beamlets resulting from use of the DCS. This method enables integration of the DCS into a treatment planning system to perform dose computation in patient datasets. The method could be generalized for use with any SS collimation system in which blades, leaves, or trimmers are used to laterally sharpen beamlets. PMID:25735287
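A minimal sketch of the asymmetric Gaussian beam's-eye-view fluence described above; parameter values are hypothetical and the depth-dependent corrections are omitted:

import numpy as np

# Asymmetric Gaussian fluence in the BEV: separate sigmas on each side of the
# dose maximum along x and y, as for a trimmed beamlet.
def bev_fluence(x, y, mu_x, mu_y, sx1, sx2, sy1, sy2):
    sx = np.where(x < mu_x, sx1, sx2)
    sy = np.where(y < mu_y, sy1, sy2)
    return np.exp(-0.5 * ((x - mu_x) / sx) ** 2) * np.exp(-0.5 * ((y - mu_y) / sy) ** 2)

# Hypothetical beamlet with a sharper penumbra on the trimmed (+x, +y) sides.
x, y = np.meshgrid(np.linspace(-10, 10, 201), np.linspace(-10, 10, 201))
fluence = bev_fluence(x, y, mu_x=0.5, mu_y=0.3, sx1=3.0, sx2=1.5, sy1=3.0, sy2=1.8)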
Sodickson, Aaron; Warden, Graham I; Farkas, Cameron E; Ikuta, Ichiro; Prevedello, Luciano M; Andriole, Katherine P; Khorasani, Ramin
2012-08-01
To develop and validate an informatics toolkit that extracts anatomy-specific computed tomography (CT) radiation exposure metrics (volume CT dose index and dose-length product) from existing digital image archives through optical character recognition of CT dose report screen captures (dose screens) combined with Digital Imaging and Communications in Medicine attributes. This institutional review board-approved, HIPAA-compliant study was performed in a large urban health care delivery network. Data were drawn from a random sample of CT encounters that occurred between 2000 and 2010; images from these encounters were contained within the enterprise image archive, which encompassed images obtained at an adult academic tertiary referral hospital and its affiliated sites, including a cancer center, a community hospital, and outpatient imaging centers, as well as images imported from other facilities. Software was validated by using 150 randomly selected encounters for each major CT scanner manufacturer, with outcome measures of dose screen retrieval rate (proportion of correctly located dose screens) and anatomic assignment precision (proportion of extracted exposure data with correctly assigned anatomic region, such as head, chest, or abdomen and pelvis). The 95% binomial confidence intervals (CIs) were calculated for discrete proportions, and CIs were derived from the standard error of the mean for continuous variables. After validation, the informatics toolkit was used to populate an exposure repository from a cohort of 54 549 CT encounters, of which 29 948 had available dose screens. Validation yielded a dose screen retrieval rate of 99% (597 of 605 CT encounters; 95% CI: 98%, 100%) and an anatomic assignment precision of 94% (summed DLP fraction correct, 563 of 600 CT encounters; 95% CI: 92%, 96%). Patient safety applications of the resulting data repository include benchmarking between institutions, CT protocol quality control and optimization, and cumulative patient- and anatomy-specific radiation exposure monitoring. Large-scale anatomy-specific radiation exposure data repositories can be created with high fidelity from existing digital image archives by using open-source informatics tools.
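A minimal sketch of the normal-approximation 95% binomial confidence interval used for the discrete validation proportions above:

import math

def binomial_ci_95(successes, n):
    # Normal-approximation 95% CI for a binomial proportion.
    p = successes / n
    half = 1.96 * math.sqrt(p * (1 - p) / n)
    return p, max(0.0, p - half), min(1.0, p + half)

p, lo, hi = binomial_ci_95(597, 605)  # dose screen retrieval validation
print(f"retrieval rate {p:.0%} (95% CI: {lo:.0%}, {hi:.0%})")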
SU-F-T-513: Dosimetric Validation of Spatially Fractionated Radiotherapy Using Gel Dosimetry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Papanikolaou, P; Watts, L; Kirby, N
2016-06-15
Purpose: Spatially fractionated radiation therapy, also known as GRID therapy, is used to treat large solid tumors by irradiating the target with a single dose of 10–20 Gy through spatially distributed beamlets. We have investigated the use of a 3D gel for dosimetric characterization of GRID therapy. Methods: GRID therapy is an external beam analog of volumetric brachytherapy, whereby we produce a distribution of hot and cold dose columns inside the tumor volume. Such a distribution can be produced with a block or by using a checker-like pattern with MLC. We have studied both types of GRID delivery. A cube-shaped acrylic phantom was filled with polymer gel and served as a 3D dosimeter. The phantom was scanned and the CT images were used to produce two plans in Pinnacle, one with the grid block and one with the MLC-defined grid. A 6 MV beam was used for the plan with a prescription of 1500 cGy at dmax. The irradiated phantom was scanned in a 3T MRI scanner. Results: 3D dose maps were derived from the MR scans of the gel dosimeter and were found to be in good agreement with the predicted dose distribution from the RTP system. Gamma analysis showed a passing rate of 93% for 5% dose and 2 mm DTA scoring criteria. Both relative and absolute dose profiles are in good agreement, except in the peripheral beamlets, where the gel measured a slightly higher dose, possibly because of the changing head scatter conditions that the RTP is not fully accounting for. Our results have also been benchmarked against ionization chamber measurements. Conclusion: We have investigated the use of a polymer gel for the 3D dosimetric characterization and evaluation of GRID therapy. Our results demonstrated that the planning system can predict fairly accurately the dose distribution for GRID type therapy.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Ruohui; Department of Medical Physics, Medical Faculty Mannheim, Heidelberg University; Fan, Xiaomei
2014-08-15
Objective: The purpose of this study is to propose an alternative planning approach for VMAT using constant dose rate and constant gantry speed arc therapy (CDR-CAS-IMAT) implemented on a conventional Varian 23EX linac, using IMRT as a benchmark to evaluate performance. Methods and materials: Eighteen patients with thoracic esophageal carcinoma who were previously treated with IMRT on the Varian 23EX were retrospectively re-planned with CDR-CAS-IMAT. The dose prescription was 60 Gy to the PTVs in 30 fractions. The planning objectives for the PTVs and OAR corresponded with those of the IMRT plans. Doses to the PTVs and OAR were compared with IMRT with respect to plan quality, MU, treatment time and delivery accuracy. Results: CDR-CAS-IMAT plans led to equivalent or superior plan quality compared with IMRT; the PTV conformity index (CI) increased by 16.2% relative to IMRT, while small deviations were observed in the minimum dose to the PTV. The cord volume receiving 40 Gy increased from 3.6% with IMRT to 7.0%. Treatment times were reduced significantly with CDR-CAS-IMAT (mean 85.7 s vs. 232.1 s, p < .05); however, MU increased by a factor of 1.3, and lung V10/V5/V3.5 and average lung dose showed relative increases of 6.7%, 12%, 17.9% and 4.2%, respectively. The E-P low-dose volume increased while the high-dose volume decreased. There was no significant difference in Delta4 measurement results between the two planning techniques. Conclusion: CDR-CAS-IMAT plans can be implemented smoothly and quickly in a busy cancer center; they improved the PTV CI and reduced treatment time but increased MU and the low-dose irradiated volume. Weight loss must be evaluated during treatment for CDR-CAS-IMAT patients.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Suter, G.W. II; Mabrey, J.B.
1994-07-01
This report presents potential screening benchmarks for protection of aquatic life from contaminants in water. Because there is no guidance for screening benchmarks, a set of alternative benchmarks is presented herein. The alternative benchmarks are based on different conceptual approaches to estimating concentrations causing significant effects. For the upper screening benchmark, there are the acute National Ambient Water Quality Criteria (NAWQC) and the Secondary Acute Values (SAV). The SAV concentrations are values estimated with 80% confidence not to exceed the unknown acute NAWQC for those chemicals with no NAWQC. The alternative chronic benchmarks are the chronic NAWQC, the Secondary Chronic Value (SCV), the lowest chronic values for fish and daphnids from chronic toxicity tests, the estimated EC20 for a sensitive species, and the concentration estimated to cause a 20% reduction in the recruit abundance of largemouth bass. It is recommended that ambient chemical concentrations be compared to all of these benchmarks. If NAWQC are exceeded, the chemicals must be contaminants of concern because the NAWQC are applicable or relevant and appropriate requirements (ARARs). If NAWQC are not exceeded, but other benchmarks are, contaminants should be selected on the basis of the number of benchmarks exceeded and the conservatism of the particular benchmark values, as discussed in the text. To the extent that toxicity data are available, this report presents the alternative benchmarks for chemicals that have been detected on the Oak Ridge Reservation. It also presents the data used to calculate benchmarks and the sources of the data. It compares the benchmarks and discusses their relative conservatism and utility.
Raising Quality and Achievement. A College Guide to Benchmarking.
ERIC Educational Resources Information Center
Owen, Jane
This booklet introduces the principles and practices of benchmarking as a way of raising quality and achievement at further education colleges in Britain. Section 1 defines the concept of benchmarking. Section 2 explains what benchmarking is not and the steps that should be taken before benchmarking is initiated. The following aspects and…
Benchmarking in Education: Tech Prep, a Case in Point. IEE Brief Number 8.
ERIC Educational Resources Information Center
Inger, Morton
Benchmarking is a process by which organizations compare their practices, processes, and outcomes to standards of excellence in a systematic way. The benchmarking process entails the following essential steps: determining what to benchmark and establishing internal baseline data; identifying the benchmark; determining how that standard has been…
Benchmarks: The Development of a New Approach to Student Evaluation.
ERIC Educational Resources Information Center
Larter, Sylvia
The Toronto Board of Education Benchmarks are libraries of reference materials that demonstrate student achievement at various levels. Each library contains video benchmarks, print benchmarks, a staff handbook, and summary and introductory documents. This book is about the development and the history of the benchmark program. It has taken over 3…
Wu, Yue; Gu, Jun-Ming; Huang, Yun; Duan, Yan-Ying; Huang, Rui-Xue; Hu, Jian-An
2016-01-01
Long-term airborne lead exposure, even below official occupational limits, has been found to cause lead poisoning at higher frequencies than expected, which suggests that China's existing occupational exposure limits should be reexamined. A retrospective cohort study was conducted on 1832 smelting workers from 1988 to 2008 in China. These were individuals who entered the plant and came into continuous contact with lead at work for longer than 3 months. The dose-response relationships between occupational cumulative lead exposure and lead poisoning, abnormal blood lead, urinary lead and erythrocyte zinc protoporphyrin (ZPP) were analyzed and the benchmark dose lower bound confidence limits (BMDLs) were calculated. Statistically significant positive correlations were found between cumulative lead dust and lead fume exposures and workplace seniority, blood lead, urinary lead and ZPP values. A dose-response relationship was observed between cumulative lead dust or lead fume exposure and lead poisoning (p < 0.01). The BMDLs of the cumulative occupational lead dust and fume doses were 0.68 mg-year/m3 and 0.30 mg-year/m3 for lead poisoning, respectively. The BMDLs of workplace airborne lead concentrations associated with lead poisoning were 0.02 mg/m3 and 0.01 mg/m3 for occupational lead dust and lead fume exposure, respectively. In conclusion, the BMDLs for airborne lead were lower than occupational exposure limits, suggesting that the occupational lead exposure limits need re-examination and adjustment. Occupational cumulative exposure limits (OCELs) should be established to better prevent occupational lead poisoning. PMID:26999177
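A sketch of the kind of benchmark-dose calculation described above: a log-logistic dose-response is fitted to hypothetical quantal data and solved for the dose giving 10% extra risk. The study's actual models and its BMDL confidence-limit procedure (profile likelihood or bootstrap) are not reproduced here:

import numpy as np
from scipy.optimize import brentq, curve_fit

# Log-logistic dose-response for quantal data (hypothetical example).
def model(d, b0, b1):
    return 1.0 / (1.0 + np.exp(-(b0 + b1 * np.log(d + 1e-9))))

dose = np.array([0.01, 0.1, 0.3, 0.7, 1.5])      # cumulative exposure, mg-year/m3
frac = np.array([0.02, 0.05, 0.12, 0.30, 0.55])  # observed response fractions

(b0, b1), _ = curve_fit(model, dose, frac, p0=(-3.0, 1.0))
p0 = model(0.0, b0, b1)                          # background response
extra_risk = lambda d: (model(d, b0, b1) - p0) / (1.0 - p0) - 0.10
bmd10 = brentq(extra_risk, 1e-6, 10.0)           # dose at 10% extra risk
print(f"BMD10 = {bmd10:.3f} mg-year/m3")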
HS06 Benchmark for an ARM Server
NASA Astrophysics Data System (ADS)
Kluth, Stefan
2014-06-01
We benchmarked an ARM Cortex-A9 based server system with a four-core CPU running at 1.1 GHz. The system used Ubuntu 12.04 as its operating system and the HEPSPEC 2006 (HS06) benchmarking suite was compiled natively with gcc-4.4 on the system. The benchmark was run for various settings of the relevant gcc compiler options. We did not find significant influence from the compiler options on the benchmark result. The final HS06 benchmark result is 10.4.
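For context, SPEC-derived suites such as HS06 aggregate per-benchmark runtime ratios with a geometric mean; a minimal sketch with made-up ratios for four of the suite's C++ benchmarks:

from math import prod

# Made-up per-benchmark ratios; real HS06 scoring runs one copy per core and
# follows the SPEC all_cpp aggregation rules.
ratios = {"444.namd": 9.8, "447.dealII": 11.2, "450.soplex": 10.1, "471.omnetpp": 10.6}
geo_mean = prod(ratios.values()) ** (1.0 / len(ratios))
print(f"geometric mean ratio: {geo_mean:.1f}")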
PMLB: a large benchmark suite for machine learning evaluation and comparison.
Olson, Randal S; La Cava, William; Orzechowski, Patryk; Urbanowicz, Ryan J; Moore, Jason H
2017-01-01
The selection, development, or comparison of machine learning methods in data mining can be a difficult task based on the target problem and goals of a particular study. Numerous publicly available real-world and simulated benchmark datasets have emerged from different sources, but their organization and adoption as standards have been inconsistent. As such, selecting and curating specific benchmarks remains an unnecessary burden on machine learning practitioners and data scientists. The present study introduces an accessible, curated, and developing public benchmark resource to facilitate identification of the strengths and weaknesses of different machine learning methodologies. We compare meta-features among the current set of benchmark datasets in this resource to characterize the diversity of available data. Finally, we apply a number of established machine learning methods to the entire benchmark suite and analyze how datasets and algorithms cluster in terms of performance. From this study, we find that existing benchmarks lack the diversity to properly benchmark machine learning algorithms, and there are several gaps in benchmarking problems that still need to be considered. This work represents another important step towards understanding the limitations of popular benchmarking suites and developing a resource that connects existing benchmarking standards to more diverse and efficient standards in the future.
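A minimal usage sketch, assuming the pmlb Python package and scikit-learn are installed; 'mushroom' is one of the suite's standard classification datasets:

from pmlb import fetch_data
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Download (and cache) a PMLB dataset, then benchmark a simple classifier on it.
X, y = fetch_data('mushroom', return_X_y=True)
scores = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5)
print(f"mean 5-fold accuracy: {scores.mean():.3f}")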
SU-C-202-05: Pilot Study of Online Treatment Evaluation and Adaptive Re-Planning for Laryngeal SBRT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mao, W; Henry Ford Health System, Detroit, MI; Liu, C
Purpose: We have initiated a phase I trial of 5-fraction stereotactic body radiotherapy (SBRT) for advanced-stage laryngeal cancer. We conducted this pilot dosimetric study to confirm the potential utility of online adaptive re-planning to preserve treatment quality. Methods: Ten cases of larynx cancer were evaluated. Baseline and daily SBRT treatment plans were generated per trial protocol. Daily volumetric images were acquired prior to every fraction of treatment. Reference simulation CT images were deformably registered to daily volumetric images using Eclipse. Planning contours were then deformably propagated to daily images. Reference SBRT plans were directly copied to calculate delivered dose distributions on deformed reference CT images. An in-house software platform was developed to calculate cumulative dose over a course of treatment in four steps: 1) deforming the delivered dose grid to reference CT images using deformation information exported from Eclipse; 2) generating tetrahedrons using the deformed dose grid as vertices; 3) resampling dose to a high resolution within every tetrahedron; 4) calculating dose-volume histograms. Our in-house software was benchmarked against a commercial software package, Mirada. Results: In all ten cases, comprising 49 fractions of treatment, delivered daily doses were completely evaluated and treatment could be re-planned within 10 minutes. Prescription dose coverage of the PTV was less than intended in 53% of fractions of treatment (mean: 94%, range: 84%–98%), while minimum coverage of the CTV and GTV was 94% and 97%, respectively. Maximum bystander point dose limits to arytenoids, parotids, and spinal cord remained respected in all cases, although variances in carotid artery doses were observed in a minority of cases. Conclusion: Although GTV and CTV coverage is preserved by in-room 3D image guidance of larynx SBRT, PTV coverage can vary significantly from intended plans. Online adaptive treatment evaluation and re-planning is potentially necessary, and our procedure is clinically applicable to fully preserve treatment quality. This project is supported by CPRIT Individual Investigator Research Award RP150386.
Dose assessment in environmental radiological protection: State of the art and perspectives.
Stark, Karolina; Goméz-Ros, José M; Vives I Batlle, Jordi; Lindbo Hansen, Elisabeth; Beaugelin-Seiller, Karine; Kapustka, Lawrence A; Wood, Michael D; Bradshaw, Clare; Real, Almudena; McGuire, Corynne; Hinton, Thomas G
2017-09-01
Exposure to radiation is a potential hazard to humans and the environment. The Fukushima accident reminded the world of the importance of a reliable risk management system that incorporates the dose received from radiation exposures. The dose to humans from exposure to radiation can be quantified using a well-defined system; its environmental equivalent, however, is still in a developmental state. Additionally, the results of several papers published over the last decade have been criticized because of poor dosimetry. Therefore, a workshop on environmental dosimetry was organized by the STAR (Strategy for Allied Radioecology) Network of Excellence to review the state of the art in environmental dosimetry and prioritize areas of methodological and guidance development. Herein, we report the key findings from that international workshop, summarise parameters that affect the dose animals and plants receive when exposed to radiation, and identify further research needs. Current dosimetry practices for determining environmental protection are based on simple screening dose assessments using knowledge of fundamental radiation physics, source-target geometry relationships, the influence of organism shape and size, and knowledge of how radionuclide distributions in the body and in the soil profile alter dose. In screening model calculations that estimate whole-body dose to biota the shapes of organisms are simply represented as ellipsoids, while recently developed complex voxel phantom models allow organ-specific dose estimates. We identified several research and guidance development priorities for dosimetry. For external exposures, the uncertainty in dose estimates due to spatially heterogeneous distributions of radionuclide contamination is currently being evaluated. Guidance is needed on the level of dosimetry that is required when screening benchmarks are exceeded and how to report exposure in dose-effect studies, including quantification of uncertainties. Further research is needed to establish whether and how dosimetry should account for differences in tissue physiology, organism life stages, seasonal variability (in ecology, physiology and radiation field), species life span, and the proportion of a population that is actually exposed. We contend that, although major advances have recently been made in environmental radiation protection, substantive improvements are required to reduce uncertainties and increase the reliability of environmental dosimetry. Copyright © 2017 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stewart, B; Kanal, K; Dickinson, R
2014-06-15
Purpose: We have implemented a commercially available Radiation Exposure Monitoring System (REMS) to enhance the processes of radiation dose data collection, analysis and alerting developed over the past decade at our sites of practice. REMS allows for consolidation of multiple radiation dose information sources and quicker alerting than previously developed processes. Methods: Thirty-nine x-ray producing imaging modalities were interfaced with the REMS: thirteen computed tomography scanners, sixteen angiography/interventional systems, nine digital radiography systems and one mammography system. A number of methodologies were used to provide dose data to the REMS: Modality Performed Procedure Step (MPPS) messages, DICOM Radiation Dose Structured Reports (RDSR), and DICOM header information. Once interfaced, the dosimetry information from each device underwent validation (first 15–20 exams) before release for viewing by end-users: physicians, medical physicists, technologists and administrators. Results: Before REMS, our diagnostic physics group pulled dosimetry data from seven disparate databases throughout the radiology, radiation oncology, cardiology, electrophysiology, anesthesiology/pain management and vascular surgery departments at two major medical centers and four associated outpatient clinics. With the REMS implementation, we now have one authoritative source of dose information for alerting, longitudinal analysis, dashboard/graphics generation and benchmarking. REMS provides immediate automatic dose alerts utilizing thresholds calculated through daily statistical analysis. This has streamlined our Closing the Loop process for estimated skin exposures in excess of our institutional specific substantial radiation dose level which relied on technologist notification of the diagnostic physics group and daily report from the radiology information system (RIS). REMS also automatically calculates the CT size-specific dose estimate (SSDE) as well as provides two-dimensional angulation dose maps for angiography/interventional procedures. Conclusion: REMS implementation has streamlined and consolidated the dosimetry data collection and analysis process at our institutions while eliminating manual entry error and providing immediate alerting and access to dosimetry data to both physicists and physicians. Brent Stewart has funded research through GE Healthcare.
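For context, a sketch of the size-specific dose estimate (SSDE) computation that REMS automates; the conversion-factor coefficients approximate the AAPM TG-204 fit for the 32 cm reference phantom and should be treated as illustrative:

import math

# SSDE = CTDIvol scaled by a size-dependent conversion factor f(D), where D is
# the patient's effective diameter. Coefficients approximate TG-204 (32 cm phantom).
def ssde(ctdi_vol_mGy, effective_diameter_cm, a=3.704369, b=0.03671937):
    return ctdi_vol_mGy * a * math.exp(-b * effective_diameter_cm)

print(f"SSDE = {ssde(10.0, 28.0):.1f} mGy")  # hypothetical abdominal scan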
3D conditional generative adversarial networks for high-quality PET image estimation at low dose.
Wang, Yan; Yu, Biting; Wang, Lei; Zu, Chen; Lalush, David S; Lin, Weili; Wu, Xi; Zhou, Jiliu; Shen, Dinggang; Zhou, Luping
2018-07-01
Positron emission tomography (PET) is a widely used imaging modality, providing insight into both the biochemical and physiological processes of the human body. Usually, a full-dose radioactive tracer is required to obtain high-quality PET images for clinical needs. This inevitably raises concerns about potential health hazards. On the other hand, dose reduction may increase the noise in the reconstructed PET images, which impacts the image quality to a certain extent. In this paper, in order to reduce the radiation exposure while maintaining the high quality of PET images, we propose a novel method based on 3D conditional generative adversarial networks (3D c-GANs) to estimate high-quality full-dose PET images from low-dose ones. Generative adversarial networks (GANs) include a generator network and a discriminator network which are trained simultaneously with the goal of one beating the other. Similar to GANs, in the proposed 3D c-GANs, we condition the model on an input low-dose PET image and generate a corresponding output full-dose PET image. Specifically, to render the same underlying information between the low-dose and full-dose PET images, a 3D U-net-like deep architecture which can combine hierarchical features by using skip connections is designed as the generator network to synthesize the full-dose image. In order to guarantee that the synthesized PET image is close to the real one, we take into account the estimation error loss in addition to the discriminator feedback to train the generator network. Furthermore, a concatenated 3D c-GANs based progressive refinement scheme is also proposed to further improve the quality of estimated images. Validation was done on a real human brain dataset including both normal subjects and subjects diagnosed with mild cognitive impairment (MCI). Experimental results show that our proposed 3D c-GANs method outperforms the benchmark methods and achieves much better performance than the state-of-the-art methods in both qualitative and quantitative measures. Copyright © 2018 Elsevier Inc. All rights reserved.
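A minimal sketch of the conditional-GAN objective described above, written with PyTorch; the tiny 3D networks, dummy volumes, and the L1 weight of 100 are placeholders rather than the paper's architecture:

import torch
import torch.nn as nn

# Generator maps a low-dose PET volume to a full-dose estimate; the discriminator
# sees the (condition, image) pair, as in a conditional GAN.
G = nn.Sequential(nn.Conv3d(1, 8, 3, padding=1), nn.ReLU(), nn.Conv3d(8, 1, 3, padding=1))
D = nn.Sequential(nn.Conv3d(2, 8, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool3d(1),
                  nn.Flatten(), nn.Linear(8, 1))

bce, l1 = nn.BCEWithLogitsLoss(), nn.L1Loss()
low = torch.randn(2, 1, 16, 16, 16)   # dummy low-dose volumes
full = torch.randn(2, 1, 16, 16, 16)  # dummy full-dose targets

fake = G(low)
d_fake = D(torch.cat([low, fake], dim=1))
# Generator loss: adversarial feedback plus the estimation-error (L1) term.
g_loss = bce(d_fake, torch.ones_like(d_fake)) + 100.0 * l1(fake, full)
g_loss.backward()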
The General Concept of Benchmarking and Its Application in Higher Education in Europe
ERIC Educational Resources Information Center
Nazarko, Joanicjusz; Kuzmicz, Katarzyna Anna; Szubzda-Prutis, Elzbieta; Urban, Joanna
2009-01-01
The purposes of this paper are twofold: a presentation of the theoretical basis of benchmarking and a discussion on practical benchmarking applications. Benchmarking is also analyzed as a productivity accelerator. The authors study benchmarking usage in the private and public sectors with due consideration of the specificities of the two areas…
Intrinsic Gettering in Nitrogen-Doped and Hydrogen-Annealed Czochralski-Grown Silicon Wafers
NASA Astrophysics Data System (ADS)
Goto, Hiroyuki; Pan, Lian-Sheng; Tanaka, Masafumi; Kashima, Kazuhiko
2001-06-01
The properties of nitrogen-doped and hydrogen-annealed Czochralski-grown silicon (NHA-CZ-Si) wafers were investigated in this study. The quality of the subsurface was investigated by monitoring the generation lifetime of minority carriers, as measured by the capacitance-time measurements of a metal oxide silicon capacitor (MOS C-t). The intrinsic gettering (IG) ability was investigated by determining the nickel concentration on the surface and in the subsurface as measured by graphite furnace atomic absorption spectrometry (GFAAS) after the wafer was deliberately contaminated with nickel. From the results obtained, the generation lifetimes of these NHA-CZ-Si wafers were determined to be almost the same as, or a little longer than those of epitaxial wafers, and the IG ability was proportional to the total volume of oxygen precipitates [i.e., bulk micro defects (BMDs)], which was influenced by the oxygen and nitrogen concentrations in the wafers. Therefore, it is suggested that the subsurface of the NHA-CZ-Si wafers is of good quality and the IG capacity is controllable by the nitrogen and oxygen concentrations in the wafers.
Benchmarking reference services: an introduction.
Marshall, J G; Buchanan, H S
1995-01-01
Benchmarking is based on the common sense idea that someone else, either inside or outside of libraries, has found a better way of doing certain things and that your own library's performance can be improved by finding out how others do things and adopting the best practices you find. Benchmarking is one of the tools used for achieving continuous improvement in Total Quality Management (TQM) programs. Although benchmarking can be done on an informal basis, TQM puts considerable emphasis on formal data collection and performance measurement. Used to its full potential, benchmarking can provide a common measuring stick to evaluate process performance. This article introduces the general concept of benchmarking, linking it whenever possible to reference services in health sciences libraries. Data collection instruments that have potential application in benchmarking studies are discussed and the need to develop common measurement tools to facilitate benchmarking is emphasized.
Taking the Battle Upstream: Towards a Benchmarking Role for NATO
2012-09-01
[Only fragments of this report's front matter survive extraction: a figure-list entry for Figure 8, "World Bank Benchmarking Work on Quality of Governance"; a citation to "In Search of a Benchmarking Theory for the Public Sector"; and a note that, for comparison purposes, McKinsey categorized the Ministries of Defense in the countries in which it works.]
ERIC Educational Resources Information Center
Kent State Univ., OH. Ohio Literacy Resource Center.
This document is intended to show the relationship between Ohio's Standards and Competencies, Equipped for the Future's (EFF's) Standards and Components of Performance, and Ohio's Revised Benchmarks. The document is divided into three parts, with Part 1 covering mathematics instruction, Part 2 covering reading instruction, and Part 3 covering…
How do I know if my forecasts are better? Using benchmarks in hydrological ensemble prediction
NASA Astrophysics Data System (ADS)
Pappenberger, F.; Ramos, M. H.; Cloke, H. L.; Wetterhall, F.; Alfieri, L.; Bogner, K.; Mueller, A.; Salamon, P.
2015-03-01
The skill of a forecast can be assessed by comparing the relative proximity of both the forecast and a benchmark to the observations. Example benchmarks include climatology or a naïve forecast. Hydrological ensemble prediction systems (HEPS) are currently transforming the hydrological forecasting environment, but in this new field there is little information to guide researchers and operational forecasters on how benchmarks can best be used to evaluate their probabilistic forecasts. This study shows that calculated forecast skill can vary depending on the benchmark selected, and that the selection of a benchmark for determining forecasting system skill is sensitive to a number of hydrological and system factors. A benchmark intercomparison experiment is then undertaken using the continuous ranked probability score (CRPS), a reference forecasting system and a suite of 23 different methods to derive benchmarks. The benchmarks are assessed within the operational set-up of the European Flood Awareness System (EFAS) to determine those that are 'toughest to beat' and so give the most robust discrimination of forecast skill, particularly for the spatial average fields that EFAS relies upon. Evaluated against an observed discharge proxy, the benchmark that has the most utility for EFAS and avoids the most naïve skill across different hydrological situations is found to be meteorological persistency. This benchmark uses the latest meteorological observations of precipitation and temperature to drive the hydrological model. Hydrological long-term average benchmarks, which are currently used in EFAS, are very easily beaten by the forecasting system, and their use produces much naïve skill. When decomposed into seasons, the advanced meteorological benchmarks, which make use of meteorological observations from the past 20 years at the same calendar date, have the most skill discrimination. They are also good at discriminating skill in low flows and for all catchment sizes. Simpler meteorological benchmarks are particularly useful for high flows. Recommendations for EFAS are to move to routine use of meteorological persistency, an advanced meteorological benchmark and a simple meteorological benchmark in order to provide a robust evaluation of forecast skill. This work provides the first comprehensive evidence on how benchmarks can be used in the evaluation of skill in probabilistic hydrological forecasts and which benchmarks are most useful for skill discrimination and avoidance of naïve skill in a large-scale HEPS. It is recommended that all HEPS use the evidence and methodology provided here to evaluate which benchmarks to employ, so that forecasters can trust their skill evaluation and have confidence that their forecasts are indeed better.
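A minimal sketch of benchmark-relative skill scoring as used in such intercomparisons: a continuous ranked probability skill score (CRPSS), positive when the forecast beats the benchmark. The data are synthetic and the properscoring package is an assumed dependency, not part of the study:

import numpy as np
import properscoring as ps

rng = np.random.default_rng(0)
obs = rng.gamma(2.0, 50.0, size=365)                    # observed discharge proxy
forecast = obs[:, None] + rng.normal(0, 20, (365, 51))  # 51-member ensemble
benchmark = obs[:, None] + rng.normal(0, 60, (365, 51)) # e.g. meteorological persistency

crpss = 1 - ps.crps_ensemble(obs, forecast).mean() / ps.crps_ensemble(obs, benchmark).mean()
print(f"CRPSS vs benchmark: {crpss:.2f}")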
Gaining competitive advantage in personal dosimetry services through ISO 9001 certification.
Noriah, M A
2007-01-01
This paper discusses the advantages of the certification process for the quality assurance of individual dose monitoring in Malaysia. The demand by customers and the regulatory authority for a higher degree of service quality requires a switch in emphasis from a technically focused quality assurance program to comprehensive quality management for service provision. Achieving ISO 9001:2000 certification by an accredited third party provides recognized acceptance and documents that the methods used are capable of generating results that satisfy the performance criteria of the certification program. It also offers proof of the commitment to quality and, as a benchmark, allows measurement of progress toward continual improvement of service performance.
A benchmarking method to measure dietary absorption efficiency of chemicals by fish.
Xiao, Ruiyang; Adolfsson-Erici, Margaretha; Åkerman, Gun; McLachlan, Michael S; MacLeod, Matthew
2013-12-01
Understanding the dietary absorption efficiency of chemicals in the gastrointestinal tract of fish is important from both a scientific and a regulatory point of view. However, reported fish absorption efficiencies for well-studied chemicals are highly variable. In the present study, the authors developed and exploited an internal chemical benchmarking method that has the potential to reduce uncertainty and variability and, thus, to improve the precision of measurements of fish absorption efficiency. The authors applied the benchmarking method to measure the gross absorption efficiency for 15 chemicals with a wide range of physicochemical properties and structures. They selected 2,2',5,6'-tetrachlorobiphenyl (PCB53) and decabromodiphenyl ethane as absorbable and nonabsorbable benchmarks, respectively. Quantities of chemicals determined in fish were benchmarked to the fraction of PCB53 recovered in fish, and quantities of chemicals determined in feces were benchmarked to the fraction of decabromodiphenyl ethane recovered in feces. The performance of the benchmarking procedure was evaluated based on the recovery of the test chemicals and precision of absorption efficiency from repeated tests. Benchmarking did not improve the precision of the measurements; after benchmarking, however, the median recovery for 15 chemicals was 106%, and variability of recoveries was reduced compared with before benchmarking, suggesting that benchmarking could account for incomplete extraction of chemical in fish and incomplete collection of feces from different tests. © 2013 SETAC.
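The benchmark correction itself is simple arithmetic: scale each measured amount by the recovered fraction of the co-dosed benchmark chemical. The sketch below shows one plausible formulation with hypothetical numbers; the exact equations and values the authors used are not given in this abstract.

```python
def benchmark_correct(amount, benchmark_recovery):
    """Scale a measured chemical amount by the recovery of the co-dosed benchmark."""
    return amount / benchmark_recovery

# Hypothetical numbers: ng of test chemical recovered, and benchmark recoveries
fish_measured, feces_measured = 42.0, 55.0
pcb53_recovery_in_fish = 0.85    # fraction of the absorbable benchmark found in fish
dbdpe_recovery_in_feces = 0.90   # fraction of the non-absorbable benchmark found in feces

fish_corrected = benchmark_correct(fish_measured, pcb53_recovery_in_fish)
feces_corrected = benchmark_correct(feces_measured, dbdpe_recovery_in_feces)

# Gross absorption efficiency as the benchmark-corrected fraction not excreted,
# relative to the total dose accounted for (one plausible formulation, assumed here).
absorption_efficiency = fish_corrected / (fish_corrected + feces_corrected)
print(absorption_efficiency)
```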
Bess, John D.; Fujimoto, Nozomu
2014-10-09
Benchmark models were developed to evaluate six cold-critical and two warm-critical, zero-power measurements of the HTTR. Additional measurements of a fully-loaded subcritical configuration, core excess reactivity, shutdown margins, six isothermal temperature coefficients, and axial reaction-rate distributions were also evaluated as acceptable benchmark experiments. Insufficient information is publicly available to develop finely-detailed models of the HTTR, as much of the design information is still proprietary. However, the uncertainties in the benchmark models are judged to be of sufficient magnitude to encompass any biases and bias uncertainties incurred through the simplification process used to develop the benchmark models. Dominant uncertainties in the experimental keff for all core configurations come from uncertainties in the impurity content of the various graphite blocks that comprise the HTTR. Monte Carlo calculations of keff are between approximately 0.9% and 2.7% greater than the benchmark values. Reevaluation of the HTTR models as additional information becomes available could improve the quality of this benchmark and possibly reduce the computational biases. High-quality characterization of graphite impurities would significantly improve the quality of the HTTR benchmark assessment. Simulations of the other reactor physics measurements are in good agreement with the benchmark experiment values. The complete benchmark evaluation details are available in the 2014 edition of the International Handbook of Evaluated Reactor Physics Benchmark Experiments.
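The reported 0.9%-2.7% computational bias is the usual calculated-versus-benchmark comparison; a trivial sketch (the keff values below are hypothetical, not from the evaluation):

```python
def keff_bias_pct(calculated, benchmark):
    """Relative bias of a calculated keff against the benchmark value, in percent."""
    return 100.0 * (calculated - benchmark) / benchmark

# Hypothetical values: benchmark-model keff and a Monte Carlo result
print(keff_bias_pct(calculated=1.0125, benchmark=1.0000))  # -> 1.25, within the 0.9-2.7% range
```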
Yang, Y M; Bednarz, B
2013-02-21
Following the proposal by several groups to integrate magnetic resonance imaging (MRI) with radiation therapy, much attention has been afforded to examining the impact of strong (on the order of a Tesla) transverse magnetic fields on photon dose distributions. The effect of the magnetic field on dose distributions must be considered in order to take full advantage of the benefits of real-time intra-fraction imaging. In this investigation, we compared the handling of particle transport in magnetic fields between two Monte Carlo codes, EGSnrc and Geant4, to analyze various aspects of their electromagnetic transport algorithms; both codes are well-benchmarked for medical physics applications in the absence of magnetic fields. A water-air-water slab phantom and a water-lung-water slab phantom were used to highlight dose perturbations near high- and low-density interfaces. We have implemented a method of calculating the Lorentz force in EGSnrc based on theoretical models in literature, and show very good consistency between the two Monte Carlo codes. This investigation further demonstrates the importance of accurate dosimetry for MRI-guided radiation therapy (MRIgRT), and facilitates the integration of a ViewRay MRIgRT system in the University of Wisconsin-Madison's Radiation Oncology Department.
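The key physics addition described here is magnetic deflection of charged particles between interactions. Below is a minimal sketch of that idea, not the EGSnrc implementation referenced in the abstract: a non-relativistic explicit Euler step under the Lorentz force F = qv×B. Production Monte Carlo codes use relativistic, higher-order integrators (e.g., the Boris push), and all values below are hypothetical.

```python
import numpy as np

E_CHARGE = 1.602176634e-19   # C
M_E = 9.1093837015e-31       # kg
C = 2.99792458e8             # m/s

def lorentz_step(pos, vel, b_field, dt, charge=-E_CHARGE, mass=M_E):
    """One explicit (non-relativistic) Euler step of charged-particle motion, F = q v x B."""
    accel = charge * np.cross(vel, b_field) / mass
    return pos + vel * dt, vel + accel * dt

# Hypothetical: an electron crossing a 1.5 T transverse field (ViewRay-like magnitude)
pos = np.zeros(3)
vel = np.array([0.0, 0.0, 0.5 * C])   # 0.5c along z
b = np.array([1.5, 0.0, 0.0])         # 1.5 T along x
for _ in range(1000):
    pos, vel = lorentz_step(pos, vel, b, dt=1e-13)
print(pos)  # curved trajectory: deflection in y from the v x B force
```

It is exactly this curvature of secondary-electron paths near low-density regions (air, lung) that produces the interface dose perturbations the study examines.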
Bertzbach, F; Franz, T; Möller, K
2012-01-01
This paper shows the results of performance improvement achieved in benchmarking projects in the wastewater industry in Germany over the last 15 years. A large number of changes in operational practice, and the annual savings they achieved, can be shown, induced in particular by benchmarking at the process level. Investigation of these results produces some general findings on how to include performance improvement in a benchmarking project and how to communicate its results. We therefore elaborate on the concept of benchmarking at both the utility and the process level, a distinction that remains necessary for integrating performance improvement into our benchmarking approach. To achieve performance improvement via benchmarking, it should be made clear that the outcome depends, on one hand, on a well-conducted benchmarking programme and, on the other, on the individual situation within each participating utility.
Benchmarking clinical photography services in the NHS.
Arbon, Giles
2015-01-01
Benchmarking is used across the National Health Service (NHS) through various benchmarking programs, but clinical photography services have no program in place and must rely on ad hoc surveys of other services. A trial benchmarking exercise was therefore undertaken with 13 services in NHS Trusts. It highlights valuable data and comparisons that can be used to benchmark and improve services throughout the profession.
A Seafloor Benchmark for 3-dimensional Geodesy
NASA Astrophysics Data System (ADS)
Chadwell, C. D.; Webb, S. C.; Nooner, S. L.
2014-12-01
We have developed an inexpensive, permanent seafloor benchmark to increase the longevity of seafloor geodetic measurements. The benchmark provides a physical tie to the sea floor lasting for decades (perhaps longer) on which geodetic sensors can be repeatedly placed and removed with millimeter resolution. Global coordinates estimated with seafloor geodetic techniques will remain attached to the benchmark allowing for the interchange of sensors as they fail or become obsolete, or for the sensors to be removed and used elsewhere, all the while maintaining a coherent series of positions referenced to the benchmark. The benchmark has been designed to free fall from the sea surface with transponders attached. The transponder can be recalled via an acoustic command sent from the surface to release from the benchmark and freely float to the sea surface for recovery. The duration of the sensor attachment to the benchmark will last from a few days to a few years depending on the specific needs of the experiment. The recovered sensors are then available to be reused at other locations, or again at the same site in the future. Three pins on the sensor frame mate precisely and unambiguously with three grooves on the benchmark. To reoccupy a benchmark a Remotely Operated Vehicle (ROV) uses its manipulator arm to place the sensor pins into the benchmark grooves. In June 2014 we deployed four benchmarks offshore central Oregon. We used the ROV Jason to successfully demonstrate the removal and replacement of packages onto the benchmark. We will show the benchmark design and its operational capabilities. Presently models of megathrust slip within the Cascadia Subduction Zone (CSZ) are mostly constrained by the sub-aerial GPS vectors from the Plate Boundary Observatory, a part of Earthscope. More long-lived seafloor geodetic measures are needed to better understand the earthquake and tsunami risk associated with a large rupture of the thrust fault within the Cascadia subduction zone. Using a ROV to place and remove sensors on the benchmarks will significantly reduce the number of sensors required by the community to monitor offshore strain in subduction zones.
Jang, Cheng-Shin; Liang, Ching-Ping
2018-01-01
Taiwan is surrounded by oceans, and numerous pleasure beaches attract millions of tourists annually to participate in recreational swimming. However, impaired water quality caused by fecal pollution poses a potential threat to the tourists' health. This study probabilistically characterized the health risks posed to recreational swimmers by waterborne enterococci at 13 Taiwanese beaches using quantitative microbial risk assessment. First, data on enterococci concentrations at coastal beaches monitored by the Taiwan Environmental Protection Administration were reproduced using nonparametric Monte Carlo simulation (MCS). The volumes of water ingested during recreational swimming, based on uniform and gamma distributions, were subsequently determined using MCS. Finally, after combining the distributions of the two parameters, the beta-Poisson dose-response function was employed to quantitatively estimate health risks to recreational swimmers. Moreover, various levels of risk to recreational swimmers were classified and spatially mapped to explore feasible recreational and environmental management strategies at the beaches. The results revealed that although the health risks associated with recreational swimming did not exceed an acceptable benchmark of 0.019 illnesses daily at any beach, they approached this benchmark at certain beaches. Beaches with relatively high risks are located in northwestern Taiwan owing to the current movements.
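The workflow described (resample concentrations, sample ingestion volumes, apply a beta-Poisson dose-response) is easy to sketch. All distributions and parameters below are made-up placeholders, since the abstract does not report the fitted values:

```python
import numpy as np

rng = np.random.default_rng(0)

def beta_poisson(dose, alpha, beta):
    """Beta-Poisson dose-response: probability of illness for a given dose."""
    return 1.0 - (1.0 + dose / beta) ** (-alpha)

# Hypothetical inputs: resampled concentrations and ingestion volumes
n = 100_000
enterococci = rng.lognormal(mean=3.0, sigma=1.0, size=n)   # CFU per 100 mL (resampled)
volume = rng.gamma(shape=2.0, scale=25.0, size=n) / 100.0  # mL ingested -> units of 100 mL
dose = enterococci * volume

risk = beta_poisson(dose, alpha=0.2, beta=30.0)            # alpha, beta assumed
print(np.mean(risk > 0.019))  # fraction of simulated swims above the 0.019 benchmark
```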
Benchmarking--Measuring and Comparing for Continuous Improvement.
ERIC Educational Resources Information Center
Henczel, Sue
2002-01-01
Discussion of benchmarking focuses on the use of internal and external benchmarking by special librarians. Highlights include defining types of benchmarking; historical development; benefits, including efficiency, improved performance, increased competitiveness, and better decision making; problems, including inappropriate adaptation; developing a…
Conformal image-guided microbeam radiation therapy at the ESRF biomedical beamline ID17
DOE Office of Scientific and Technical Information (OSTI.GOV)
Donzelli, Mattia, E-mail: donzelli@esrf.fr; Bräuer-Krisch, Elke; Nemoz, Christian
Purpose: Upcoming veterinary trials in microbeam radiation therapy (MRT) demand more advanced irradiation techniques than preclinical research with small animals. The treatment of deep-seated tumors in cats and dogs with MRT requires sophisticated irradiation geometries from multiple ports, which impose further efforts to spare the normal tissue surrounding the target. Methods: This work presents the development and benchmarking of a precise patient alignment protocol for MRT at the biomedical beamline ID17 of the European Synchrotron Radiation Facility (ESRF). The positioning of the patient prior to irradiation is verified by taking x-ray projection images from different angles. Results: Using four external fiducial markers of 1.7 mm diameter and computed tomography-based treatment planning, a target alignment error of less than 2 mm can be achieved with an angular deviation of less than 2°. Minor improvements to the protocol and the use of smaller markers indicate that a precision better than 1 mm is technically feasible. Detailed investigations of the imaging dose lead to the conclusion that doses for skull radiographs lie in the same range as dose reference levels for human head radiographs. A currently used online dose monitor for MRT has been proven to give reliable results for the imaging beam. Conclusions: The ESRF biomedical beamline ID17 is technically ready to apply conformal image-guided MRT from multiple ports to large animals during future veterinary trials.
Clinical practice variations in prescribing antipsychotics for patients with schizophrenia.
Owen, Richard R; Fischer, Ellen P; Kirchner, JoAnn E; Thrush, Carol R; Williams, D Keith; Cuffel, Brian J; Elliott, Carl E; Booth, Brenda M
2003-01-01
Few studies have examined the variations among individual physicians in prescribing antipsychotics for schizophrenia. This study examined clinical practice variations in the route and dosage of antipsychotic medication prescribed for inpatients with schizophrenia by 11 different psychiatrists. The sample consisted of 130 patients with a DSM-III-R diagnosis of schizophrenia who had received inpatient care at a state hospital or Veterans Affairs medical center in the southeastern United States in 1992-1993. Mixed-effects regression models were developed to explore the influence of individual physicians and hospitals on route of antipsychotic administration (oral or depot) and daily antipsychotic dose, controlling for patient case-mix variables (age, race, sex, duration of illness, symptom severity, and substance-abuse diagnosis). The average daily antipsychotic dose was 1092 +/- 892 chlorpromazine mg equivalents. Almost half of the patients (48%) were prescribed doses above or below the range recommended by current practice guidelines. The proportion of patients prescribed depot antipsychotics was significantly different at the 2 hospitals, as was the antipsychotic dose prescribed at discharge. Individual physicians and patient characteristics were not significantly associated with prescribing practices. These data, which were obtained before clinical practice guidelines were widely disseminated, provide a benchmark against which to examine more current practice variations in antipsychotic prescribing. The results raise several questions about deviations from practice guidelines in the pharmacological treatment of schizophrenia. To adequately assess quality and inform and possibly further develop clinical practice guideline recommendations for schizophrenia, well-designed research studies conducted in routine clinical settings are needed.
TU-FG-201-06: Remote Dosimetric Auditing for Clinical Trials Using EPID Dosimetry: A Pilot Study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miri, N; Legge, K; Greer, P
2016-06-15
Purpose: To perform a pilot study for remote dosimetric credentialing of intensity modulated radiation therapy (IMRT) based clinical trials. The study introduces a novel, time-efficient, and inexpensive dosimetry audit method for multi-center credentialing. The method employs an electronic portal imaging device (EPID) to reconstruct the delivered dose inside a virtual flat/cylindrical water phantom. Methods: Five centers, with different accelerator types and treatment planning systems (TPS), were asked to download two CT data sets, of a head-and-neck (H&N) and a post-prostatectomy (P-P) patient, to produce benchmark plans. These were then transferred to virtual flat and cylindrical phantom data sets that were also provided. In-air EPID images of the plans were then acquired, and the data sent to the central site for analysis. At the central site, these were converted to DICOM format, and all images were used to reconstruct 2D and 3D dose distributions inside, respectively, the flat and cylindrical phantoms using in-house EPID-to-dose conversion software. 2D dose was calculated for individual fields and 3D dose for the combined fields. The results were compared to the corresponding TPS doses using three gamma criteria, 3%/3 mm, 3%/2 mm, and 2%/2 mm, with a 10% dose threshold. Results: All centers had a high pass rate for the 3%/3 mm criterion. For 2D dose, the average of the centers' mean pass rates was 99.6% (SD: 0.3%) and 99.8% (SD: 0.3%) for the H&N and P-P patients, respectively. For 3D dose, 3D gamma was used to compare the model dose with the TPS combined dose; the mean pass rates were 97.7% (SD: 2.8%) and 98.3% (SD: 1.6%). Conclusion: Successful performance of the method for the pilot centers establishes it for dosimetric multi-center credentialing. The results show a high level of gamma agreement, and the procedure is efficient, consistent, and inexpensive. Funding has been provided by the Department of Radiation Oncology, TROG Cancer Research, and the University of Newcastle. Narges Miri is a recipient of a University of Newcastle postgraduate scholarship.
Knowledge-based IMRT planning for individual liver cancer patients using a novel specific model.
Yu, Gang; Li, Yang; Feng, Ziwei; Tao, Cheng; Yu, Zuyi; Li, Baosheng; Li, Dengwang
2018-03-27
The purpose of this work is to benchmark RapidPlan against clinical plans for liver intensity-modulated radiotherapy (IMRT) in patients with special anatomical characteristics, and to investigate the prediction capability of a general model (Model-G) versus our specific model (Model-S). A library of 60 liver cancer patients with IMRT plans was used to set up the two models with the RapidPlan knowledge-based planning system. Model-S consisted of the 30 patients with special anatomical characteristics, defined as a distance from the planning target volume (PTV) to the right kidney of less than three centimeters, and Model-G was configured using all 60 patients in the library. Knowledge-based IMRT plans were created by Model-G, by Model-S, and manually (named RPG-plans, RPS-plans, and M-plans, respectively) for an evaluation group of 13 patients similar to those included in Model-S. Differences in the dose-volume histograms (DVHs) were compared, not only between the RP-plans and their respective M-plans, but also between RPG-plans and RPS-plans. For all 13 patients, RapidPlan automatically produced clinically acceptable plans. Compared with the M-plans, the RP-plans improved the V95% of the PTV and gave greater dose sparing in the right kidney. For the normal liver, RPG-plans delivered similar doses, while RPS-plans delivered a higher dose than M-plans. With respect to the RapidPlan models, RPS-plans had better conformity index (CI) values and delivered lower doses to the right kidney V20Gy and lower maximum point doses to the spinal cord, while delivering higher doses to the normal liver. The study shows that RapidPlan can create high-quality plans, and that our specific model can improve the CI of the PTV, resulting in more sparing of organs at risk (OAR) in IMRT for individual liver cancer patients.
MutAIT: an online genetic toxicology data portal and analysis tools.
Avancini, Daniele; Menzies, Georgina E; Morgan, Claire; Wills, John; Johnson, George E; White, Paul A; Lewis, Paul D
2016-05-01
Assessment of genetic toxicity and/or carcinogenic activity is an essential element of chemical screening programs employed to protect human health. Dose-response and gene mutation data are frequently analysed by industry, academia and governmental agencies for regulatory evaluations and decision making. Over the years, a number of efforts at different institutions have led to the creation and curation of databases to house genetic toxicology data, largely with the aim of providing public access to facilitate research and regulatory assessments. This article provides a brief introduction to a new genetic toxicology portal called Mutation Analysis Informatics Tools (MutAIT) (www.mutait.org) that provides easy access to two of the largest genetic toxicology databases, the Mammalian Gene Mutation Database (MGMD) and TransgenicDB. TransgenicDB is a comprehensive collection of transgenic rodent mutation data initially compiled and collated by Health Canada. The updated MGMD contains approximately 50 000 individual mutation spectral records from the published literature. The portal not only gives access to an enormous quantity of genetic toxicology data, but also provides statistical tools for dose-response analysis and calculation of benchmark doses. Two important R packages for dose-response analysis are provided as web-distributed applications with user-friendly graphical interfaces: the 'drsmooth' package performs dose-response shape analysis and determines various point-of-departure (PoD) metrics, and the 'PROAST' package provides algorithms for dose-response modelling. The MutAIT statistical tools, which are currently being enhanced, provide users with an efficient and comprehensive platform to conduct quantitative dose-response analyses and determine PoD values that can then be used to calculate human exposure limits or margins of exposure. © The Author 2015. Published by Oxford University Press on behalf of the UK Environmental Mutagen Society.
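To make the benchmark-dose idea concrete, the sketch below fits a simple one-hit quantal model by maximum likelihood and solves for the dose giving 10% extra risk. It is a minimal illustration of the concept that tools such as PROAST and drsmooth operationalize, not their implementation, and the data are invented.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical quantal data: dose, number responding, group size
doses     = np.array([0.0, 10.0, 30.0, 100.0])
responses = np.array([1, 4, 9, 18])
n         = np.array([20, 20, 20, 20])

def neg_log_lik(params):
    """Binomial negative log-likelihood for a one-hit model with background g."""
    g, b = params
    p = g + (1.0 - g) * (1.0 - np.exp(-b * doses))
    p = np.clip(p, 1e-9, 1 - 1e-9)
    return -np.sum(responses * np.log(p) + (n - responses) * np.log(1.0 - p))

fit = minimize(neg_log_lik, x0=[0.05, 0.01], bounds=[(0.0, 0.5), (1e-6, 1.0)])
g_hat, b_hat = fit.x

# Extra risk for the one-hit model is 1 - exp(-b*d), so the BMD has a closed form
bmr = 0.10  # 10% extra risk
bmd = -np.log(1.0 - bmr) / b_hat
print(bmd)
```

A regulatory analysis would additionally compute the lower confidence bound (BMDL) as the point of departure, typically by profile likelihood or bootstrap.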
Miften, Moyed; Olch, Arthur; Mihailidis, Dimitris; Moran, Jean; Pawlicki, Todd; Molineu, Andrea; Li, Harold; Wijesooriya, Krishni; Shi, Jie; Xia, Ping; Papanikolaou, Nikos; Low, Daniel A
2018-04-01
Patient-specific IMRT QA measurements are important components of processes designed to identify discrepancies between calculated and delivered radiation doses. Discrepancy tolerance limits are neither well defined nor consistently applied across centers. The AAPM TG-218 report provides a comprehensive review aimed at improving the understanding and consistency of these processes, as well as recommendations for methodologies and tolerance limits in patient-specific IMRT QA. The performance of the dose difference/distance-to-agreement (DTA) and γ dose distribution comparison metrics is investigated. Measurement methods are reviewed and followed by a discussion of the pros and cons of each. Methodologies for absolute dose verification are discussed and new IMRT QA verification tools are presented. Literature on the expected or achievable agreement between measurements and calculations for different types of planning and delivery systems is reviewed and analyzed. Tests of vendor implementations of the γ verification algorithm employing benchmark cases are presented. Operational shortcomings that can reduce the γ tool accuracy and subsequent effectiveness for IMRT QA are described. Practical considerations including spatial resolution, normalization, dose threshold, and data interpretation are discussed. Published data on IMRT QA and the clinical experience of the group members are used to develop guidelines and recommendations on tolerance and action limits for IMRT QA. Steps to check failed IMRT QA plans are outlined. Recommendations on delivery methods, data interpretation, dose normalization, the use of γ analysis routines, and choice of tolerance limits for IMRT QA are made with a focus on detecting differences between calculated and measured doses via the use of robust analysis methods and an in-depth understanding of IMRT verification metrics. The recommendations are intended to improve the IMRT QA process and establish consistent and comparable IMRT QA criteria among institutions. © 2018 American Association of Physicists in Medicine.
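For readers unfamiliar with the γ metric the report reviews, here is a minimal 1D sketch using global normalization and hypothetical dose profiles; clinical tools operate on interpolated 2D/3D grids, and this is an illustration of the metric, not the TG-218 reference implementation.

```python
import numpy as np

def gamma_1d(dose_ref, dose_eval, x, dose_tol=0.03, dta_mm=3.0, threshold=0.10):
    """Global 1D gamma analysis: gamma <= 1 means the point passes."""
    d_max = dose_ref.max()
    gam = np.full(dose_ref.shape, np.nan)
    for i, (xi, di) in enumerate(zip(x, dose_ref)):
        if di < threshold * d_max:
            continue  # below the low-dose threshold: excluded from analysis
        dist = (x - xi) / dta_mm                       # distance term, in DTA units
        diff = (dose_eval - di) / (dose_tol * d_max)   # dose term, in tolerance units
        gam[i] = np.sqrt(dist**2 + diff**2).min()      # search over evaluation points
    return gam

x = np.linspace(0, 100, 201)               # positions in mm
ref = np.exp(-((x - 50) / 20) ** 2)        # hypothetical reference profile
ev = np.exp(-((x - 51) / 20) ** 2) * 1.01  # shifted, rescaled "measurement"
g = gamma_1d(ref, ev, x)
evaluated = ~np.isnan(g)
print((g[evaluated] <= 1.0).mean())        # pass rate over evaluated points
```

The operational shortcomings the report describes (resolution, normalization, threshold choice) correspond directly to the `dose_tol`, `dta_mm`, and `threshold` knobs above.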
Bohm, Tim D; DeLuca, Paul M; DeWerd, Larry A
2003-04-01
Permanent implantation of low energy (20-40 keV) photon emitting radioactive seeds to treat prostate cancer is an important treatment option for patients. In order to produce accurate implant brachytherapy treatment plans, the dosimetry of a single source must be well characterized. Monte Carlo based transport calculations can be used for source characterization, but must have up to date cross section libraries to produce accurate dosimetry results. This work benchmarks the MCNP code and its photon cross section library for low energy photon brachytherapy applications. In particular, we calculate the emitted photon spectrum, air kerma, depth dose in water, and radial dose function for both 125I and 103Pd based seeds and compare to other published results. Our results show that MCNP's cross section library differs from recent data primarily in the photoelectric cross section for low energies and low atomic number materials. In water, differences as large as 10% in the photoelectric cross section and 6% in the total cross section occur at 125I and 103Pd photon energies. This leads to differences in the dose rate constant of 3% and 5%, and differences as large as 18% and 20% in the radial dose function for the 125I and 103Pd based seeds, respectively. Using a partially updated photon library, calculations of the dose rate constant and radial dose function agree with other published results. Further, the use of the updated photon library allows us to verify air kerma and depth dose in water calculations performed using MCNP's perturbation feature to simulate updated cross sections. We conclude that in order to most effectively use MCNP for low energy photon brachytherapy applications, we must update its cross section library. Following this update, the MCNP code system will be a very effective tool for low energy photon brachytherapy dosimetry applications.
Allen, Bruce C.; Andres, Kara L.; Ehresman, David J.; Falvo, Ria; Provencher, Anne; Olsen, Geary W.; Butenhoff, John L.
2017-01-01
An oral dose study with perfluorooctanesulfonate (PFOS) was undertaken to identify potential associations between serum PFOS and changes in serum clinical chemistry parameters in purpose-bred young adult cynomolgus monkeys (Macaca fascicularis). In this study, the control group (n = 6/sex) was sham-dosed with vehicle (0.5% Tween 20 and 5% ethanol in water), the low-dose group (n = 6/sex) received a single K+PFOS dose (9 mg/kg), and the high-dose group (n = 4-6/sex) received 3 separate K+PFOS doses (11-17.2 mg/kg). Monkeys were given routine checkups and observed carefully for health problems on a daily basis. Scheduled blood samples were drawn from all monkeys prior to, during, and after K+PFOS administration for up to 1 year, and were analyzed for PFOS concentrations and clinical chemistry markers for coagulation, lipids, hepatic and renal function, electrolytes, and thyroid-related hormones. No mortality occurred during the study. All monkeys remained healthy, gained weight, and were released back to the colony at the end of the study. The highest serum PFOS achieved was approximately 165 μg/ml. When compared with time-matched controls, administration of K+PFOS did not result in any toxicologically meaningful or clinically relevant changes in these serum clinical measurements. A slight reduction in serum cholesterol (primarily the high-density lipoprotein fraction), although not toxicologically significant, was observed. The corresponding lower-bound fifth percentile benchmark concentrations (BMCL1sd) were 74 and 76 μg/ml for male and female monkeys, respectively. Compared with the 2013-2014 geometric mean serum PFOS level of 4.99 ng/ml (0.00499 μg/ml) in the US general population reported by CDC NHANES, this represents a margin of exposure of more than 4 orders of magnitude. PMID:28115654
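The margin-of-exposure claim can be checked directly from the numbers given in the abstract:

```python
import math

bmcl = 74.0          # µg/mL, male-monkey BMCL(1SD) from the study
serum_us = 0.00499   # µg/mL, 2013-2014 NHANES geometric mean (4.99 ng/mL)

moe = bmcl / serum_us
print(moe, math.log10(moe))  # ~14,830 -> ~4.2 orders of magnitude
```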
Developing Benchmarks for Solar Radio Bursts
NASA Astrophysics Data System (ADS)
Biesecker, D. A.; White, S. M.; Gopalswamy, N.; Black, C.; Domm, P.; Love, J. J.; Pierson, J.
2016-12-01
Solar radio bursts can interfere with radar, communication, and tracking signals. In severe cases, radio bursts can inhibit the successful use of radio communications and disrupt a wide range of systems that rely on Position, Navigation, and Timing services, on timescales ranging from minutes to hours and across wide areas on the dayside of Earth. The White House's Space Weather Action Plan has asked for solar radio burst intensity benchmarks for an event occurrence frequency of 1 in 100 years, and also for a theoretical maximum intensity benchmark. The solar radio benchmark team was also asked to define the wavelength/frequency bands of interest. The team developed preliminary (phase 1) benchmarks for the VHF (30-300 MHz), UHF (300-3000 MHz), GPS (1176-1602 MHz), F10.7 (2800 MHz), and microwave (4000-20000 MHz) bands. The preliminary benchmarks were derived from previously published work; limitations in that work will be addressed in phase 2 of the benchmark process. In addition, deriving the theoretical maxima requires further work to determine where doing so is even possible, in order to meet the Action Plan objectives. In this presentation, we will present the phase 1 benchmarks and the basis used to derive them, as well as the work that remains to complete the final, phase 2 benchmarks.
Benchmarking in national health service procurement in Scotland.
Walker, Scott; Masson, Ron; Telford, Ronnie; White, David
2007-11-01
The paper reports the results of a study on benchmarking activities undertaken by the procurement organization within the National Health Service (NHS) in Scotland, namely National Procurement (previously Scottish Healthcare Supplies Contracts Branch). NHS performance is of course politically important, and benchmarking is increasingly seen as a means to improve performance, so the study was carried out to determine if the current benchmarking approaches could be enhanced. A review of the benchmarking activities used by the private sector, local government and NHS organizations was carried out to establish a framework of the motivations, benefits, problems and costs associated with benchmarking. This framework was used to carry out the research through case studies and a questionnaire survey of NHS procurement organizations both in Scotland and other parts of the UK. Nine of the 16 Scottish Health Boards surveyed reported carrying out benchmarking during the last three years. The findings of the research were that there were similarities in approaches between local government and NHS Scotland Health, but differences between NHS Scotland and other UK NHS procurement organizations. Benefits were seen as significant and it was recommended that National Procurement should pursue the formation of a benchmarking group with members drawn from NHS Scotland and external benchmarking bodies to establish measures to be used in benchmarking across the whole of NHS Scotland.
Blecher, Evan
2010-08-01
To investigate the appropriateness of tax incidence (the percentage of the retail price occupied by taxes) benchmarking in low-income and middle-income countries (LMICs) with rapidly growing economies, and to explore the viability of an alternative tax policy rule based on the affordability of cigarettes. The paper outlines criticisms of tax incidence benchmarking, particularly in the context of LMICs. It then considers an affordability-based benchmark using the relative income price (RIP) as a measure of affordability. The RIP measures the percentage of annual per capita GDP required to purchase 100 packs of cigarettes. Using South Africa as a case study of an LMIC, future consumption is simulated under both tax incidence benchmarks and affordability benchmarks. I show that a tax incidence benchmark is not an optimal policy tool in South Africa and that an affordability benchmark could be a more effective means of reducing tobacco consumption in the future. Although a tax incidence benchmark was successful in increasing prices and reducing tobacco consumption in South Africa in the past, this approach has drawbacks, particularly in the context of a rapidly growing LMIC economy. An affordability benchmark represents an appropriate alternative that would be more effective in reducing future cigarette consumption.
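The RIP measure itself is a one-line calculation. The sketch below uses made-up price and GDP figures purely to show the mechanics, including what a constant-affordability rule implies for prices as incomes grow:

```python
def relative_income_price(price_per_pack, gdp_per_capita):
    """RIP: percentage of annual per-capita GDP needed to buy 100 packs of cigarettes."""
    return 100.0 * (100.0 * price_per_pack) / gdp_per_capita

# Hypothetical figures in local currency (illustration only, not the paper's data)
print(relative_income_price(price_per_pack=30.0, gdp_per_capita=90_000.0))  # ~3.3%

# Under an affordability benchmark that holds the RIP constant while incomes grow,
# the nominal price must rise at least as fast as per-capita GDP:
gdp_growth = 0.08
print(30.0 * (1 + gdp_growth))  # price needed next year to keep cigarettes from becoming more affordable
```

By contrast, a tax-incidence benchmark fixes only the tax share of the price, so in a fast-growing economy cigarettes can become steadily more affordable even while the benchmark is met.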
Mille, Matthew M; Jung, Jae Won; Lee, Choonik; Kuzmin, Gleb A; Lee, Choonsik
2018-06-01
Radiation dosimetry is an essential input for epidemiological studies of radiotherapy patients aimed at quantifying the dose-response relationship of late-term morbidity and mortality. Individualised organ dose must be estimated for all tissues of interest located in-field, near-field, or out-of-field. Whereas conventional measurement approaches are limited to points in water or anthropomorphic phantoms, computational approaches using patient images or human phantoms offer greater flexibility and can provide more detailed three-dimensional dose information. In the current study, we systematically compared four different dose calculation algorithms so that dosimetrists and epidemiologists can better understand the advantages and limitations of the various approaches at their disposal. The four dose calculation algorithms considered were as follows: (1) the Analytical Anisotropic Algorithm (AAA) and (2) the Acuros XB algorithm, as implemented in the Eclipse treatment planning system (TPS); (3) a Monte Carlo radiation transport code, EGSnrc; and (4) an accelerated Monte Carlo code, the x-ray Voxel Monte Carlo (XVMC). The four algorithms were compared in terms of their accuracy and appropriateness in the context of dose reconstruction for epidemiological investigations. Accuracy in peripheral dose was evaluated first by benchmarking the calculated dose profiles against measurements in a homogeneous water phantom. Additional simulations in a heterogeneous cylinder phantom evaluated the performance of the algorithms in the presence of tissue heterogeneity. In general, we found that the algorithms contained within the commercial TPS (AAA and Acuros XB) were fast and accurate in-field or near-field, but not acceptable out-of-field. Therefore, the TPS is best suited for epidemiological studies involving large cohorts where the organs of interest are located in-field or partially in-field. The EGSnrc and XVMC codes showed excellent agreement with measurements both in-field and out-of-field. The EGSnrc code was the most accurate dosimetry approach, but was too slow to be used for large-scale epidemiological cohorts. The XVMC code showed similar accuracy to EGSnrc but was significantly faster, making epidemiological applications feasible, especially when the organs of interest reside far from the field edge.
Benchmarking: applications to transfusion medicine.
Apelseth, Torunn Oveland; Molnar, Laura; Arnold, Emmy; Heddle, Nancy M
2012-10-01
Benchmarking is a structured, continuous, collaborative process in which comparisons of selected indicators are used to identify factors that, when implemented, will improve transfusion practices. This study aimed to identify transfusion medicine studies reporting on benchmarking, summarize the benchmarking approaches used, and identify important considerations for moving the concept of benchmarking forward in the field of transfusion medicine. A systematic review of published literature was performed to identify transfusion medicine-related studies that compared at least 2 separate institutions or regions with the intention of benchmarking, focusing on 4 areas: blood utilization, safety, operational aspects, and blood donation. Forty-five studies were included: blood utilization (n = 35), safety (n = 5), operational aspects of transfusion medicine (n = 5), and blood donation (n = 0). Based on predefined criteria, 7 publications were classified as benchmarking, 2 as trending, and 36 as single-event studies. Three models of benchmarking are described: (1) a regional benchmarking program that collects and links relevant data from existing electronic sources, (2) a sentinel site model where data from a limited number of sites are collected, and (3) an institution-initiated model where a site identifies indicators of interest and approaches other institutions. Benchmarking approaches are needed in the field of transfusion medicine. Major challenges include defining best practices and developing cost-effective methods of data collection. For those interested in initiating a benchmarking program, the sentinel site model may be most effective and sustainable as a starting point, although the regional model would be the ideal goal. Copyright © 2012 Elsevier Inc. All rights reserved.
42 CFR 440.330 - Benchmark health benefits coverage.
Code of Federal Regulations, 2011 CFR
2011-10-01
... Benchmark-Equivalent Coverage § 440.330 Benchmark health benefits coverage. Benchmark coverage is health...) Federal Employees Health Benefit Plan Equivalent Coverage (FEHBP—Equivalent Health Insurance Coverage). A benefit plan equivalent to the standard Blue Cross/Blue Shield preferred provider option service benefit...
42 CFR 440.330 - Benchmark health benefits coverage.
Code of Federal Regulations, 2014 CFR
2014-10-01
... Benchmark-Equivalent Coverage § 440.330 Benchmark health benefits coverage. Benchmark coverage is health...) Federal Employees Health Benefit Plan Equivalent Coverage (FEHBP—Equivalent Health Insurance Coverage). A benefit plan equivalent to the standard Blue Cross/Blue Shield preferred provider option service benefit...
42 CFR 440.330 - Benchmark health benefits coverage.
Code of Federal Regulations, 2013 CFR
2013-10-01
... Benchmark-Equivalent Coverage § 440.330 Benchmark health benefits coverage. Benchmark coverage is health...) Federal Employees Health Benefit Plan Equivalent Coverage (FEHBP—Equivalent Health Insurance Coverage). A benefit plan equivalent to the standard Blue Cross/Blue Shield preferred provider option service benefit...
42 CFR 440.330 - Benchmark health benefits coverage.
Code of Federal Regulations, 2010 CFR
2010-10-01
... Benchmark-Equivalent Coverage § 440.330 Benchmark health benefits coverage. Benchmark coverage is health...) Federal Employees Health Benefit Plan Equivalent Coverage (FEHBP—Equivalent Health Insurance Coverage). A benefit plan equivalent to the standard Blue Cross/Blue Shield preferred provider option service benefit...
Ó Conchúir, Shane; Barlow, Kyle A; Pache, Roland A; Ollikainen, Noah; Kundert, Kale; O'Meara, Matthew J; Smith, Colin A; Kortemme, Tanja
2015-01-01
The development and validation of computational macromolecular modeling and design methods depend on suitable benchmark datasets and informative metrics for comparing protocols. In addition, if a method is intended to be adopted broadly in diverse biological applications, there needs to be information on appropriate parameters for each protocol, as well as metrics describing the expected accuracy compared to experimental data. In certain disciplines, there exist established benchmarks and public resources where experts in a particular methodology are encouraged to supply their most efficient implementation of each particular benchmark. We aim to provide such a resource for protocols in macromolecular modeling and design. We present a freely accessible web resource (https://kortemmelab.ucsf.edu/benchmarks) to guide the development of protocols for protein modeling and design. The site provides benchmark datasets and metrics to compare the performance of a variety of modeling protocols using different computational sampling methods and energy functions, providing a "best practice" set of parameters for each method. Each benchmark has an associated downloadable benchmark capture archive containing the input files, analysis scripts, and tutorials for running the benchmark. The captures may be run with any suitable modeling method; we supply command lines for running the benchmarks using the Rosetta software suite. We have compiled initial benchmarks for the resource spanning three key areas: prediction of energetic effects of mutations, protein design, and protein structure prediction, each with associated state-of-the-art modeling protocols. With the help of the wider macromolecular modeling community, we hope to expand the variety of benchmarks included on the website and continue to evaluate new iterations of current methods as they become available.
Edwards, Roger A; Dee, Deborah; Umer, Amna; Perrine, Cria G; Shealy, Katherine R; Grummer-Strawn, Laurence M
2014-02-01
A substantial proportion of US maternity care facilities engage in practices that are not evidence-based and that interfere with breastfeeding. The CDC Survey of Maternity Practices in Infant Nutrition and Care (mPINC) showed significant variation in maternity practices among US states. The purpose of this article is to use benchmarking techniques to identify states within relevant peer groups that were top performers on mPINC survey indicators related to breastfeeding support. We used 11 indicators of breastfeeding-related maternity care from the 2011 mPINC survey and benchmarking techniques to organize and compare hospital-based maternity practices across the 50 states and Washington, DC. We created peer categories for benchmarking, first by region (grouping states by West, Midwest, South, and Northeast) and then by size (grouping states by the number of maternity facilities and dividing each region into approximately equal halves based on that number). Thirty-four states had scores high enough to serve as benchmarks, and 32 states had scores low enough to reflect the largest gap from the benchmark on at least 1 indicator. No state served as the benchmark on more than 5 indicators, and no state was furthest from the benchmark on more than 7 indicators. The small peer-group benchmarks in the South, West, and Midwest were better than the large peer-group benchmarks on 91%, 82%, and 36% of the indicators, respectively. In the West-large, Midwest-large, Midwest-small, and South-large peer groups, 4 to 6 benchmarks showed that fewer than 50% of hospitals achieved ideal practice in every state. The evaluation presents benchmarks for peer-group state comparisons that provide potential and feasible targets for improvement.
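Mechanically, this style of benchmarking amounts to taking the best score on each indicator within a peer group and measuring every other member's gap from it. A minimal sketch with invented indicator names and scores (not the mPINC data):

```python
import pandas as pd

# Hypothetical mPINC-style scores (0-100) for three states in one peer group
scores = pd.DataFrame(
    {"rooming_in": [85, 78, 91], "skin_to_skin": [72, 80, 69]},
    index=["State A", "State B", "State C"],
)

benchmarks = scores.max()    # benchmark = top score on each indicator
gaps = benchmarks - scores   # each state's gap from the benchmark
print(benchmarks)
print(gaps.idxmax())         # the state furthest from the benchmark, per indicator
```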
Hospital benchmarking: are U.S. eye hospitals ready?
de Korne, Dirk F; van Wijngaarden, Jeroen D H; Sol, Kees J C A; Betz, Robert; Thomas, Richard C; Schein, Oliver D; Klazinga, Niek S
2012-01-01
Benchmarking is increasingly considered a useful management instrument to improve quality in health care, but little is known about its applicability in hospital settings. The aims of this study were to assess the applicability of a benchmarking project in U.S. eye hospitals and compare the results with an international initiative. We evaluated multiple cases by applying an evaluation frame abstracted from the literature to five U.S. eye hospitals that used a set of 10 indicators for efficiency benchmarking. Qualitative analysis entailed 46 semistructured face-to-face interviews with stakeholders, document analyses, and questionnaires. The case studies only partially met the conditions of the evaluation frame. Although learning and quality improvement were stated as overall purposes, the benchmarking initiative was at first focused on efficiency only. No ophthalmic outcomes were included, and clinicians were skeptical about their reporting relevance and disclosure. However, in contrast with earlier findings in international eye hospitals, all U.S. hospitals worked with internal indicators that were integrated in their performance management systems and supported benchmarking. Benchmarking can support performance management in individual hospitals. Having a certain number of comparable institutes provide similar services in a noncompetitive milieu seems to lay fertile ground for benchmarking. International benchmarking is useful only when these conditions are not met nationally. Although the literature focuses on static conditions for effective benchmarking, our case studies show that it is a highly iterative and learning process. The journey of benchmarking seems to be more important than the destination. Improving patient value (health outcomes per unit of cost) requires, however, an integrative perspective where clinicians and administrators closely cooperate on both quality and efficiency issues. If these worlds do not share such a relationship, the added "public" value of benchmarking in health care is questionable.
Development and application of freshwater sediment-toxicity benchmarks for currently used pesticides
Nowell, Lisa H.; Norman, Julia E.; Ingersoll, Christopher G.; Moran, Patrick W.
2016-01-01
Sediment-toxicity benchmarks are needed to interpret the biological significance of currently used pesticides detected in whole sediments. Two types of freshwater sediment benchmarks for pesticides were developed using spiked-sediment bioassay (SSB) data from the literature. These benchmarks can be used to interpret sediment-toxicity data or to assess the potential toxicity of pesticides in whole sediment. The Likely Effect Benchmark (LEB) defines a pesticide concentration in whole sediment above which there is a high probability of adverse effects on benthic invertebrates, and the Threshold Effect Benchmark (TEB) defines a concentration below which adverse effects are unlikely. For compounds without available SSBs, benchmarks were estimated using equilibrium partitioning (EqP). When a sediment sample contains a pesticide mixture, benchmark quotients can be summed for all detected pesticides to produce an indicator of potential toxicity for that mixture. Benchmarks were developed for 48 pesticide compounds using SSB data and 81 compounds using the EqP approach. In an example application, data for pesticides measured in sediment from 197 streams across the United States were evaluated using these benchmarks, and compared to measured toxicity from whole-sediment toxicity tests conducted with the amphipod Hyalella azteca (28-d exposures) and the midge Chironomus dilutus (10-d exposures). Amphipod survival, weight, and biomass were significantly and inversely related to summed benchmark quotients, whereas midge survival, weight, and biomass showed no relationship to benchmarks. Samples with LEB exceedances were rare (n = 3), but all were toxic to amphipods (i.e., significantly different from control). Significant toxicity to amphipods was observed for 72% of samples exceeding one or more TEBs, compared to 18% of samples below all TEBs. Factors affecting toxicity below TEBs may include the presence of contaminants other than pesticides, physical/chemical characteristics of sediment, and uncertainty in TEB values. Additional evaluations of benchmarks in relation to sediment chemistry and toxicity are ongoing.
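The mixture screen the authors describe, summing per-pesticide benchmark quotients, is straightforward arithmetic. The concentrations and TEB values below are hypothetical placeholders, not values from the study:

```python
# Hypothetical sediment sample: measured concentration and TEB for each detected pesticide
detected = {
    "bifenthrin":   (4.0, 2.5),  # (concentration, Threshold Effect Benchmark), same units
    "chlorpyrifos": (1.0, 3.0),
}

sum_teb_quotient = sum(conc / teb for conc, teb in detected.values())
print(sum_teb_quotient)  # > 1 suggests the mixture may exceed the threshold-effect level
```

The same sum computed against LEBs flags samples with a high probability of adverse effects, which in the study's dataset were rare but consistently toxic to amphipods.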
40 CFR 141.172 - Disinfection profiling and benchmarking.
Code of Federal Regulations, 2011 CFR
2011-07-01
... benchmarking. 141.172 Section 141.172 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED... Disinfection-Systems Serving 10,000 or More People § 141.172 Disinfection profiling and benchmarking. (a... sanitary surveys conducted by the State. (c) Disinfection benchmarking. (1) Any system required to develop...
42 CFR 440.390 - Assurance of transportation.
Code of Federal Regulations, 2014 CFR
2014-10-01
...-Equivalent Coverage § 440.390 Assurance of transportation. If a benchmark or benchmark-equivalent plan does... nevertheless assure that emergency and non-emergency transportation is covered for beneficiaries enrolled in the benchmark or benchmark-equivalent plan, as required under § 431.53 of this chapter. ...
42 CFR 440.390 - Assurance of transportation.
Code of Federal Regulations, 2012 CFR
2012-10-01
...-Equivalent Coverage § 440.390 Assurance of transportation. If a benchmark or benchmark-equivalent plan does... nevertheless assure that emergency and non-emergency transportation is covered for beneficiaries enrolled in the benchmark or benchmark-equivalent plan, as required under § 431.53 of this chapter. ...
42 CFR 440.390 - Assurance of transportation.
Code of Federal Regulations, 2011 CFR
2011-10-01
...-Equivalent Coverage § 440.390 Assurance of transportation. If a benchmark or benchmark-equivalent plan does... nevertheless assure that emergency and non-emergency transportation is covered for beneficiaries enrolled in the benchmark or benchmark-equivalent plan, as required under § 431.53 of this chapter. ...
42 CFR 440.390 - Assurance of transportation.
Code of Federal Regulations, 2010 CFR
2010-10-01
...-Equivalent Coverage § 440.390 Assurance of transportation. If a benchmark or benchmark-equivalent plan does... nevertheless assure that emergency and non-emergency transportation is covered for beneficiaries enrolled in the benchmark or benchmark-equivalent plan, as required under § 431.53 of this chapter. ...
42 CFR 440.390 - Assurance of transportation.
Code of Federal Regulations, 2013 CFR
2013-10-01
...-Equivalent Coverage § 440.390 Assurance of transportation. If a benchmark or benchmark-equivalent plan does... nevertheless assure that emergency and non-emergency transportation is covered for beneficiaries enrolled in the benchmark or benchmark-equivalent plan, as required under § 431.53 of this chapter. ...
The Zoo, Benchmarks & You: How To Reach the Oregon State Benchmarks with Zoo Resources.
ERIC Educational Resources Information Center
2002
This document aligns Oregon state educational benchmarks and standards with Oregon Zoo resources. Benchmark areas examined include English, mathematics, science, social studies, and career and life roles. Brief descriptions of the programs offered by the zoo are presented. (SOE)
The Isprs Benchmark on Indoor Modelling
NASA Astrophysics Data System (ADS)
Khoshelham, K.; Díaz Vilariño, L.; Peter, M.; Kang, Z.; Acharya, D.
2017-09-01
Automated generation of 3D indoor models from point cloud data has been a topic of intensive research in recent years. While results on various datasets have been reported in the literature, a comparison of the performance of different methods has not been possible due to the lack of benchmark datasets and a common evaluation framework. The ISPRS benchmark on indoor modelling aims to address this issue by providing a public benchmark dataset and an evaluation framework for performance comparison of indoor modelling methods. In this paper, we present the benchmark dataset, comprising several point clouds of indoor environments captured by different sensors. We also discuss the evaluation and comparison of indoor modelling methods based on manually created reference models and appropriate quality evaluation criteria. The benchmark dataset is available for download at: http://www2.isprs.org/commissions/comm4/wg5/benchmark-on-indoor-modelling.html.
Combining Phase Identification and Statistic Modeling for Automated Parallel Benchmark Generation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jin, Ye; Ma, Xiaosong; Liu, Qing Gary
2015-01-01
Parallel application benchmarks are indispensable for evaluating and optimizing HPC software and hardware. However, it is very challenging and costly to obtain high-fidelity benchmarks reflecting the scale and complexity of state-of-the-art parallel applications. Hand-extracted synthetic benchmarks are time- and labor-intensive to create. Real applications themselves, while offering the most accurate performance evaluation, are expensive to compile, port, and reconfigure, and are often plainly inaccessible due to security or ownership concerns. This work contributes APPRIME, a novel tool for trace-based automatic parallel benchmark generation. Taking as input standard communication-I/O traces of an application's execution, it couples accurate automatic phase identification with statistical regeneration of event parameters to create compact, portable, and to some degree reconfigurable parallel application benchmarks. Experiments with four NAS Parallel Benchmarks (NPB) and three real scientific simulation codes confirm the fidelity of APPRIME benchmarks: they retain the original applications' performance characteristics, in particular the relative performance across platforms.
Benchmarking in academic pharmacy departments.
Bosso, John A; Chisholm-Burns, Marie; Nappi, Jean; Gubbins, Paul O; Ross, Leigh Ann
2010-10-11
Benchmarking in academic pharmacy, and recommendations for the potential uses of benchmarking in academic pharmacy departments are discussed in this paper. Benchmarking is the process by which practices, procedures, and performance metrics are compared to an established standard or best practice. Many businesses and industries use benchmarking to compare processes and outcomes, and ultimately plan for improvement. Institutions of higher learning have embraced benchmarking practices to facilitate measuring the quality of their educational and research programs. Benchmarking is used internally as well to justify the allocation of institutional resources or to mediate among competing demands for additional program staff or space. Surveying all chairs of academic pharmacy departments to explore benchmarking issues such as department size and composition, as well as faculty teaching, scholarly, and service productivity, could provide valuable information. To date, attempts to gather this data have had limited success. We believe this information is potentially important, urge that efforts to gather it should be continued, and offer suggestions to achieve full participation.
Peeters, Dominique; Sekeris, Elke; Verschaffel, Lieven; Luwel, Koen
2017-01-01
Some authors argue that age-related improvements in number line estimation (NLE) performance result from changes in strategy use. More specifically, children's strategy use develops from only using the origin of the number line, to using the origin and the endpoint, to eventually also relying on the midpoint of the number line. Recently, Peeters et al. (unpublished) investigated whether the provision of additional unlabeled benchmarks at 25, 50, and 75% of the number line positively affects third and fifth graders' NLE performance and benchmark-based strategy use. It was found that only the older children benefitted from the presence of these benchmarks at the quartiles of the number line (i.e., 25 and 75%), as they made more use of these benchmarks, leading to more accurate estimates. A possible explanation for this lack of improvement in third graders might be their inability to correctly link the presented benchmarks with their corresponding numerical values. In the present study, we investigated whether labeling these benchmarks with their corresponding numerical values would have a positive effect on younger children's NLE performance and quartile-based strategy use as well. Third and sixth graders were assigned to one of three conditions: (a) a control condition with an empty number line bounded by 0 at the origin and 1,000 at the endpoint, (b) an unlabeled condition with three additional external benchmarks without numerical labels at 25, 50, and 75% of the number line, and (c) a labeled condition in which these benchmarks were labeled with 250, 500, and 750, respectively. Results indicated that labeling the benchmarks has a positive effect on third graders' NLE performance and quartile-based strategy use, whereas sixth graders already benefited from the mere provision of unlabeled benchmarks. These findings imply that children's benchmark-based strategy use can be stimulated by adding additional externally provided benchmarks on the number line, but that, depending on children's age and familiarity with the number range, these additional external benchmarks might need to be labeled. PMID:28713302
Liao, Hehuan; Krometis, Leigh-Anne H; Kline, Karen
2016-05-01
Within the United States, elevated levels of fecal indicator bacteria (FIB) remain the leading cause of surface water-quality impairments requiring formal remediation plans under the federal Clean Water Act's Total Maximum Daily Load (TMDL) program. The sufficiency of compliance with numerical FIB criteria as the targeted endpoint of TMDL remediation plans may be questionable given poor correlations between FIB and pathogenic microorganisms and varying degrees of risk associated with exposure to different fecal pollution sources (e.g. human vs animal). The present study linked a watershed-scale FIB fate and transport model with a dose-response model to continuously predict human health risks via quantitative microbial risk assessment (QMRA), for comparison to regulatory benchmarks. This process permitted comparison of risks associated with different fecal pollution sources in an impaired urban watershed in order to identify remediation priorities. Results indicate that total human illness risks were consistently higher than the regulatory benchmark of 36 illnesses/1000 people for the study watershed, even when the predicted FIB levels were in compliance with the Escherichia coli geometric mean standard of 126 CFU/100 mL. Sanitary sewer overflows were associated with the greatest risk of illness. This is of particular concern, given increasing indications that sewer leakage is ubiquitous in urban areas, yet not typically fully accounted for during TMDL development. Uncertainty analysis suggested the accuracy of risk estimates would be improved by more detailed knowledge of site-specific pathogen presence and densities. While previous applications of the QMRA process to impaired waterways have mostly focused on single storm events or hypothetical situations, the continuous modeling framework presented in this study could be integrated into long-term water quality management planning, especially the United States' TMDL program, providing greater clarity to watershed stakeholders and decision-makers. Copyright © 2016 Elsevier B.V. All rights reserved.
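The abstract does not give the dose-response parameterization used in the QMRA step; the sketch below shows the generic exponential dose-response form often used in such studies, with an illustrative infectivity parameter r and illustrative per-event doses, compared against the 36-illnesses-per-1000 benchmark:

    import numpy as np

    def p_illness_exponential(dose, r):
        # Exponential QMRA dose-response: P = 1 - exp(-r * dose)
        return 1.0 - np.exp(-r * dose)

    doses = np.array([1.0, 5.0, 20.0])  # ingested organisms per event (illustrative)
    r = 0.02                            # infectivity parameter (illustrative)

    illnesses_per_1000 = 1000 * p_illness_exponential(doses, r)
    print(illnesses_per_1000, illnesses_per_1000 > 36)  # vs. 36/1000 benchmark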
Medical school benchmarking - from tools to programmes.
Wilkinson, Tim J; Hudson, Judith N; Mccoll, Geoffrey J; Hu, Wendy C Y; Jolly, Brian C; Schuwirth, Lambert W T
2015-02-01
Benchmarking among medical schools is essential, but may result in unwanted effects. To apply a conceptual framework to selected benchmarking activities of medical schools. We present an analogy between the effects of assessment on student learning and the effects of benchmarking on medical school educational activities. A framework by which benchmarking can be evaluated was developed and applied to key current benchmarking activities in Australia and New Zealand. The analogy generated a conceptual framework that tested five questions to be considered in relation to benchmarking: what is the purpose? what are the attributes of value? what are the best tools to assess the attributes of value? what happens to the results? and, what is the likely "institutional impact" of the results? If the activities were compared against a blueprint of desirable medical graduate outcomes, notable omissions would emerge. Medical schools should benchmark their performance on a range of educational activities to ensure quality improvement and to assure stakeholders that standards are being met. Although benchmarking potentially has positive benefits, it could also result in perverse incentives with unforeseen and detrimental effects on learning if it is undertaken using only a few selected assessment tools.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clewell, H.J., E-mail: hclewell@thehamner.org; Efremenko, A.; Campbell, J.L.
Male and female Fischer 344 rats were exposed to naphthalene vapors at 0 (controls), 0.1, 1, 10, and 30 ppm for 6 h/d, 5 d/wk, over a 90-day period. Following exposure, the respiratory epithelium and olfactory epithelium from the nasal cavity were dissected separately, RNA was isolated, and gene expression microarray analysis was conducted. Only a few significant gene expression changes were observed in the olfactory or respiratory epithelium of either gender at the lowest concentration (0.1 ppm). At the 1.0 ppm concentration there was limited evidence of an oxidative stress response in the respiratory epithelium, but not in the olfactory epithelium. In contrast, a large number of significantly enriched cellular pathway responses were observed in both tissues at the two highest concentrations (10 and 30 ppm, which correspond to tumorigenic concentrations in the NTP bioassay). The nature of these responses supports a mode of action involving oxidative stress, inflammation and proliferation. These results are consistent with a dose-dependent transition in the mode of action for naphthalene toxicity/carcinogenicity between 1.0 and 10 ppm in the rat. In the female olfactory epithelium (the gender/site with the highest incidences of neuroblastomas in the NTP bioassay), the lowest concentration at which any signaling pathway was significantly affected, as characterized by the median pathway benchmark dose (BMD) or its 95% lower bound (BMDL) was 6.0 or 3.7 ppm, respectively, while the lowest female olfactory BMD values for pathways related to glutathione homeostasis, inflammation, and proliferation were 16.1, 11.1, and 8.4 ppm, respectively. In the male respiratory epithelium (the gender/site with the highest incidences of adenomas in the NTP bioassay), the lowest pathway BMD and BMDL were 0.4 and 0.3 ppm, respectively, and the lowest male respiratory BMD values for pathways related to glutathione homeostasis, inflammation, and proliferation were 0.5, 0.7, and 0.9 ppm, respectively. Using a published physiologically based pharmacokinetic (PBPK) model to estimate target tissue dose relevant to the proposed mode of action (total naphthalene metabolism per gram nasal tissue), the lowest transcriptional BMDLs from this analysis equate to human continuous naphthalene exposure at approximately 0.3 ppm. It is unlikely that significant effects of naphthalene or its metabolites will occur at exposures below this concentration. - Highlights: • We investigated mode of action for carcinogenicity of inhaled naphthalene in rats. • Gene expression changes were measured in rat nasal tissues after 90 day exposures. • Support a non-linear mode of action (oxidative stress, inflammation, and proliferation) • Suggest a dose-dependent transition in the mode of action between 1.0 and 10 ppm • Transcriptional benchmark doses could inform point of departure for risk assessment.
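Pathway BMD and BMDL values like those above come from fitting dose-response models to the expression data and solving for the dose that produces a defined benchmark response. A minimal sketch of that solving step, assuming a Hill model with illustrative parameters (not the paper's fitted values):

    import numpy as np
    from scipy.optimize import brentq

    def hill(dose, v, k, n):
        # Hill model for response above background
        return v * dose**n / (k**n + dose**n)

    def bmd_for_bmr(bmr, v, k, n, dmax=100.0):
        # Solve hill(d) = bmr for d on (0, dmax]
        return brentq(lambda d: hill(d, v, k, n) - bmr, 1e-9, dmax)

    v, k, n = 2.0, 8.0, 1.5  # max response, half-max dose (ppm), slope (illustrative)
    bmr = 0.5                # benchmark response on the same scale (illustrative)
    print(f"BMD ~ {bmd_for_bmr(bmr, v, k, n):.2f} ppm")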
DOE Office of Scientific and Technical Information (OSTI.GOV)
Teter, Sarah A
Conversion of biomass to sugars plays a central role in reducing our dependence on petroleum, as it allows production of a wide range of biobased fuels and chemicals through fermentation of those sugars. The DECREASE project delivers an effective enzyme cocktail for this conversion, enabling reduced costs for producing advanced biofuels such as cellulosic ethanol. Benefits to the public contributed by growth of the advanced biofuels industry include job creation, economic growth, and energy security. The DECREASE primary project objective was to develop a two-fold improved enzyme cocktail, relative to an advanced cocktail (CZP00005) that had been developed previously (from 2000 to 2007). While the final milestone was delivery of all enzyme components as an experimental mixture, a secondary objective was to deploy an improved cocktail within 3 years following the close of the project. In February 2012, Novozymes launched Cellic CTec3, a multi-enzyme cocktail derived in part from components developed under DECREASE. The externally validated performance of CTec3 and an additional component under project benchmarking conditions indicated a 1.8-fold reduction in the enzyme dose required for 90% conversion (based on all available glucose and xylose sources) of NREL dilute acid pretreated PCS, relative to the starting advanced enzyme cocktail. While the ability to achieve 90% conversion is impressive, targeting such high levels of biomass digestion is likely not the most cost-effective strategy. Novozymes techno-economic modeling showed that for NREL's dilute acid pretreated corn stover (PCS), an 80% conversion target enables a lower total production cost for cellulosic ethanol than 90% conversion, and this was also found to be the case when cost assumptions were based on the NREL 2002 Design Report. A 1.8X dose reduction was observed for 80% conversion in the small scale (50 g) DECREASE benchmark assay for CTec3 and an additional component. An upscaled experiment (in 0.5 kg kettle reactors) was performed to compare the starting enzyme mixture CZP00005 with CTec3 alone; these results indicated a 1.9X dose reduction for 80% conversion. The CTec3 composition does not include the best available enzyme components from the DECREASE effort. While these components are not yet available in a commercial product, experimental mixtures were assayed in a smaller scale assay using DECREASE PCS, at high solids loadings (21.5% TS). The results indicated that the newer mixtures required 2.9X less enzyme for 90% conversion, and 3.2X less enzyme for 80% conversion, relative to the starting enzyme cocktail. In conclusion, CTec3 delivers a 1.8-1.9X dose reduction on NREL PCS at high solids loadings, and the next-generation enzyme from Novozymes will continue to show dramatically improved biochemical performance. CTec3 allows reduced costs today, and the experimental cocktails point to continued biotechnological improvements that will further drive down costs for biorefineries of tomorrow.
42 CFR 457.430 - Benchmark-equivalent health benefits coverage.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 42 Public Health 4 2011-10-01 2011-10-01 false Benchmark-equivalent health benefits coverage. 457... STATES State Plan Requirements: Coverage and Benefits § 457.430 Benchmark-equivalent health benefits coverage. (a) Aggregate actuarial value. Benchmark-equivalent coverage is health benefits coverage that has...
42 CFR 457.430 - Benchmark-equivalent health benefits coverage.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 42 Public Health 4 2013-10-01 2013-10-01 false Benchmark-equivalent health benefits coverage. 457... STATES State Plan Requirements: Coverage and Benefits § 457.430 Benchmark-equivalent health benefits coverage. (a) Aggregate actuarial value. Benchmark-equivalent coverage is health benefits coverage that has...
42 CFR 457.430 - Benchmark-equivalent health benefits coverage.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 42 Public Health 4 2010-10-01 2010-10-01 false Benchmark-equivalent health benefits coverage. 457... STATES State Plan Requirements: Coverage and Benefits § 457.430 Benchmark-equivalent health benefits coverage. (a) Aggregate actuarial value. Benchmark-equivalent coverage is health benefits coverage that has...
42 CFR 440.335 - Benchmark-equivalent health benefits coverage.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 42 Public Health 4 2012-10-01 2012-10-01 false Benchmark-equivalent health benefits coverage. 440.335 Section 440.335 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND... and Benchmark-Equivalent Coverage § 440.335 Benchmark-equivalent health benefits coverage. (a...
42 CFR 440.335 - Benchmark-equivalent health benefits coverage.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 42 Public Health 4 2014-10-01 2014-10-01 false Benchmark-equivalent health benefits coverage. 440.335 Section 440.335 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND... and Benchmark-Equivalent Coverage § 440.335 Benchmark-equivalent health benefits coverage. (a...
egs_brachy: a versatile and fast Monte Carlo code for brachytherapy
NASA Astrophysics Data System (ADS)
Chamberland, Marc J. P.; Taylor, Randle E. P.; Rogers, D. W. O.; Thomson, Rowan M.
2016-12-01
egs_brachy is a versatile and fast Monte Carlo (MC) code for brachytherapy applications. It is based on the EGSnrc code system, enabling simulation of photons and electrons. Complex geometries are modelled using the EGSnrc C++ class library and egs_brachy includes a library of geometry models for many brachytherapy sources, in addition to eye plaques and applicators. Several simulation efficiency enhancing features are implemented in the code. egs_brachy is benchmarked by comparing TG-43 source parameters of three source models to previously published values. 3D dose distributions calculated with egs_brachy are also compared to ones obtained with the BrachyDose code. Well-defined simulations are used to characterize the effectiveness of many efficiency improving techniques, both as an indication of the usefulness of each technique and to find optimal strategies. Efficiencies and calculation times are characterized through single source simulations and simulations of idealized and typical treatments using various efficiency improving techniques. In general, egs_brachy shows agreement within uncertainties with previously published TG-43 source parameter values. 3D dose distributions from egs_brachy and BrachyDose agree at the sub-percent level. Efficiencies vary with radionuclide and source type, number of sources, phantom media, and voxel size. The combined effects of efficiency-improving techniques in egs_brachy lead to short calculation times: simulations approximating prostate and breast permanent implant (both with (2 mm)^3 voxels) and eye plaque (with (1 mm)^3 voxels) treatments take between 13 and 39 s, on a single 2.5 GHz Intel Xeon E5-2680 v3 processor core, to achieve 2% average statistical uncertainty on doses within the PTV. egs_brachy will be released as free and open source software to the research community.
Dean, Jeffry L; Zhao, Q Jay; Lambert, Jason C; Hawkins, Belinda S; Thomas, Russell S; Wesselkamper, Scott C
2017-05-01
The rate of new chemical development in commerce combined with a paucity of toxicity data for legacy chemicals presents a unique challenge for human health risk assessment. There is a clear need to develop new technologies and incorporate novel data streams to more efficiently inform derivation of toxicity values. One avenue of exploitation lies in the field of transcriptomics and the application of gene expression analysis to characterize biological responses to chemical exposures. In this context, gene set enrichment analysis (GSEA) was employed to evaluate tissue-specific, dose-response gene expression data generated following exposure to multiple chemicals for various durations. Patterns of transcriptional enrichment were evident across time and with increasing dose, and coordinated enrichment plausibly linked to the etiology of the biological responses was observed. GSEA was able to capture both transient and sustained transcriptional enrichment events facilitating differentiation between adaptive versus longer term molecular responses. When combined with benchmark dose (BMD) modeling of gene expression data from key drivers of biological enrichment, GSEA facilitated characterization of dose ranges required for enrichment of biologically relevant molecular signaling pathways, and promoted comparison of the activation dose ranges required for individual pathways. Median transcriptional BMD values were calculated for the most sensitive enriched pathway as well as the overall median BMD value for key gene members of significantly enriched pathways, and both were observed to be good estimates of the most sensitive apical endpoint BMD value. Together, these efforts support the application of GSEA to qualitative and quantitative human health risk assessment. Published by Oxford University Press on behalf of the Society of Toxicology 2017. This work is written by US Government employees and is in the public domain in the US.
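The two summarization steps described here (median BMD for the most sensitive enriched pathway, and the overall median across key gene members of significantly enriched pathways) are simple to sketch; the gene-level BMD values below are invented for illustration only:

    import numpy as np

    # Illustrative gene-level BMDs (mg/kg-day) for key members of two
    # significantly enriched pathways; values are made up for this sketch.
    pathway_gene_bmds = {
        "oxidative_stress": [1.2, 0.8, 2.5, 1.9],
        "inflammation": [3.1, 4.0, 2.7],
    }

    # Median BMD per pathway, the most sensitive pathway, and the
    # overall median across all key gene members.
    medians = {p: float(np.median(v)) for p, v in pathway_gene_bmds.items()}
    most_sensitive = min(medians, key=medians.get)
    overall = float(np.median([b for v in pathway_gene_bmds.values() for b in v]))
    print(medians, most_sensitive, overall)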
A chronic oral reference dose for hexavalent chromium-induced intestinal cancer†
Thompson, Chad M; Kirman, Christopher R; Proctor, Deborah M; Haws, Laurie C; Suh, Mina; Hays, Sean M; Hixon, J Gregory; Harris, Mark A
2014-01-01
High concentrations of hexavalent chromium [Cr(VI)] in drinking water induce villous cytotoxicity and compensatory crypt hyperplasia in the small intestines of mice (but not rats). Lifetime exposure to such cytotoxic concentrations increases intestinal neoplasms in mice, suggesting that the mode of action for Cr(VI)-induced intestinal tumors involves chronic wounding and compensatory cell proliferation of the intestine. Therefore, we developed a chronic oral reference dose (RfD) designed to be protective of intestinal damage and thus intestinal cancer. A physiologically based pharmacokinetic model for chromium in mice was used to estimate the amount of Cr(VI) entering each intestinal tissue section (duodenum, jejunum and ileum) from the lumen per day (normalized to intestinal tissue weight). These internal dose metrics, together with corresponding incidences for diffuse hyperplasia, were used to derive points of departure using benchmark dose modeling and constrained nonlinear regression. Both modeling techniques resulted in similar points of departure, which were subsequently converted to human equivalent doses using a human physiologically based pharmacokinetic model. Applying appropriate uncertainty factors, an RfD of 0.006 mg kg^-1 day^-1 was derived for diffuse hyperplasia, an effect that precedes tumor formation. This RfD is protective of both noncancer and cancer effects in the small intestine and corresponds to a safe drinking water equivalent level of 210 µg l^-1. This concentration is higher than the current federal maximum contaminant level for total Cr (100 µg l^-1) and well above levels of Cr(VI) in US drinking water supplies (typically ≤ 5 µg l^-1). © 2013 The Authors. Journal of Applied Toxicology published by John Wiley & Sons, Ltd. PMID:23943231
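The abstract's numbers can be checked directly: converting the RfD to a drinking-water equivalent level with the standard default body weight and intake (70 kg and 2 L/day, assumed here rather than stated in the abstract) reproduces the 210 µg/L figure:

    # Worked check of the abstract's numbers.
    rfd = 0.006          # mg/kg-day, from the paper
    body_weight = 70.0   # kg (standard default, assumed)
    water_intake = 2.0   # L/day (standard default, assumed)

    dwel_mg_per_l = rfd * body_weight / water_intake
    print(f"{dwel_mg_per_l * 1000:.0f} ug/L")  # -> 210 ug/L, matching the abstract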
Impact of dose engine algorithm in pencil beam scanning proton therapy for breast cancer.
Tommasino, Francesco; Fellin, Francesco; Lorentini, Stefano; Farace, Paolo
2018-06-01
Proton therapy for the treatment of breast cancer is attracting increasing interest, due to the potential reduction of radiation-induced side effects such as cardiac and pulmonary toxicity. While several in silico studies demonstrated the gain in plan quality offered by pencil beam scanning (PBS) compared to passive scattering techniques, the related dosimetric uncertainties have been poorly investigated so far. Five breast cancer patients were planned with Raystation 6 analytical pencil beam (APB) and Monte Carlo (MC) dose calculation algorithms. Plans were optimized with APB and then MC was used to recalculate the dose distribution. Movable snout and beam splitting techniques (i.e. using two sub-fields for the same beam entrance, one with and the other without the use of a range shifter) were considered. PTV dose statistics were recorded. The same planning configurations were adopted for the experimental benchmark. Dose distributions were measured with a 2D array of ionization chambers and compared to APB and MC calculated ones by means of a γ analysis (agreement criteria 3%, 3 mm). Our results indicate that, when using proton PBS for breast cancer treatment, the Raystation 6 APB algorithm does not provide sufficient accuracy, especially with large air gaps. On the contrary, the MC algorithm resulted in much higher accuracy in all beam configurations tested and is therefore recommended. Centers where an MC algorithm is not yet available should consider a careful use of APB, possibly combined with a movable snout system or in any case with strategies aimed at minimizing air gaps. Copyright © 2018 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
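The γ analysis mentioned above combines a dose-difference tolerance with a distance-to-agreement (DTA) criterion. A minimal 1D sketch of the idea (not the evaluation software used in the study), using the abstract's 3%/3 mm criteria:

    import numpy as np

    def gamma_1d(ref_dose, eval_dose, positions, dose_tol=0.03, dta_mm=3.0):
        # For each reference point, search all evaluated points for the
        # minimum combined dose-difference / DTA metric (global normalization).
        d_max = ref_dose.max()
        gammas = np.empty_like(ref_dose)
        for i, (x_r, d_r) in enumerate(zip(positions, ref_dose)):
            dose_term = ((eval_dose - d_r) / (dose_tol * d_max)) ** 2
            dist_term = ((positions - x_r) / dta_mm) ** 2
            gammas[i] = np.sqrt((dose_term + dist_term).min())
        return gammas

    x = np.linspace(0, 50, 101)                  # positions in mm
    ref = np.exp(-((x - 25) / 10) ** 2)          # toy reference profile
    ev = 1.01 * np.exp(-((x - 25.5) / 10) ** 2)  # slightly shifted/scaled
    g = gamma_1d(ref, ev, x)
    print(f"pass rate: {100 * (g <= 1).mean():.1f}%")  # points with gamma <= 1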
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sandor, Debra; Chung, Donald; Keyser, David
This report documents the CEMAC methodologies for developing and reporting annual global clean energy manufacturing benchmarks. The report reviews previously published manufacturing benchmark reports and foundational data, establishes a framework for benchmarking clean energy technologies, describes the CEMAC benchmark analysis methodologies, and describes the application of the methodologies to the manufacturing of four specific clean energy technologies.
Benchmarking for Higher Education.
ERIC Educational Resources Information Center
Jackson, Norman, Ed.; Lund, Helen, Ed.
The chapters in this collection explore the concept of benchmarking as it is being used and developed in higher education (HE). Case studies and reviews show how universities in the United Kingdom are using benchmarking to aid in self-regulation and self-improvement. The chapters are: (1) "Introduction to Benchmarking" (Norman Jackson…
How Benchmarking and Higher Education Came Together
ERIC Educational Resources Information Center
Levy, Gary D.; Ronco, Sharron L.
2012-01-01
This chapter introduces the concept of benchmarking and how higher education institutions began to use benchmarking for a variety of purposes. Here, benchmarking is defined as a strategic and structured approach whereby an organization compares aspects of its processes and/or outcomes to those of another organization or set of organizations to…
Benchmark Study of Global Clean Energy Manufacturing | Advanced Manufacturing Research | NREL
Through a first-of-its-kind benchmark study, NREL examined four clean energy technologies, including wind turbine components.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Field, Kevin G.; Yang, Ying; Busby, Jeremy T.; Allen, Todd R.
Radiation induced segregation (RIS) is a well-studied phenomenon which occurs in many structurally relevant nuclear materials including austenitic stainless steels. RIS occurs due to solute atoms preferentially coupling to mobile point defect fluxes that migrate and interact with defect sinks. Here, a 304 stainless steel was neutron irradiated up to 47.1 dpa at 320 °C. Investigations into the RIS response at specific grain boundary types were utilized to determine the sink characteristics of different boundary types as a function of irradiation dose. A rate theory model built on the foundation of the modified inverse Kirkendall (MIK) model is proposed and benchmarked to the experimental results. This model, termed the GiMIK model, includes alterations in the boundary conditions based on grain boundary structure and includes expressions for interstitial binding. This investigation, through experiment and modeling, found specific grain boundary structures exhibit unique defect sink characteristics depending on their local structure. Furthermore, such interactions were found to be consistent across all doses investigated and had larger global implications including precipitation of Ni-Si clusters near different grain boundary types.
NASA Astrophysics Data System (ADS)
Ortego, Pedro; Rodriguez, Alain; Töre, Candan; Compadre, José Luis de Diego; Quesada, Baltasar Rodriguez; Moreno, Raul Orive
2017-09-01
In order to increase the storage capacity of the East Spent Fuel Pool at the Cofrentes NPP, located in Valencia province, Spain, the existing storage stainless steel racks were replaced by a new design of compact borated stainless steel racks, allowing a 65% increase in fuel storage capacity. Calculation of the activation of the used racks was successfully performed with the MCNP4B code. Additionally, the dose rate in contact with a row of racks in standing position and behind a wall of shielding material was calculated using the MCNP4B code as well. These results allowed a preliminary definition of the bunker required for the storage of the racks. Recently, the activity in the racks has been recalculated with the SEACAB system, which combines the mesh tally of MCNP codes with the activation code ACAB, applying the rigorous two-step (R2S) method developed in-house, benchmarked with FNG irradiation experiments and usually applied in fusion calculations for the ITER project.
Dose-response algorithms for water-borne Pseudomonas aeruginosa folliculitis.
Roser, D J; Van Den Akker, B; Boase, S; Haas, C N; Ashbolt, N J; Rice, S A
2015-05-01
We developed two dose-response algorithms for P. aeruginosa pool folliculitis using bacterial and lesion density estimates, associated with undetectable, significant, and almost certain folliculitis. Literature data were fitted to Furumoto & Mickey's equations, developed for plant epidermis-invading pathogens: N_l = A ln(1 + BC) (log-linear model) and P_inf = 1 - e^(-r_C C) (exponential model), where A = 2.51644 × 10^7 lesions/m^2 and B = 2.28011 × 10^-11 (c.f.u./ml)^-1 for P. aeruginosa; C = pathogen density (c.f.u./ml); N_l = folliculitis lesions/m^2; P_inf = probability of infection; and r_C = 4.3 × 10^-7 (c.f.u./ml)^-1 for P. aeruginosa. Outbreak data indicate these algorithms apply to exposure durations of 41 ± 25 min. Typical water quality benchmarks (≈10^-2 c.f.u./ml) appear conservative but still useful, as the literature indicates that repeated detection likely implies unstable control barriers and bacterial bloom potential. In future, culture-based outbreak testing should be supplemented with quantitative polymerase chain reaction and organic carbon assays, and quantification of folliculitis aetiology, to better understand P. aeruginosa risks.
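The fitted parameters reported above are enough to evaluate both models directly. A minimal sketch using those values (exposure-duration effects and uncertainty are omitted):

    import numpy as np

    # Parameter values as reported in the abstract
    A = 2.51644e7    # lesions/m^2
    B = 2.28011e-11  # per (c.f.u./ml)
    r_C = 4.3e-7     # per (c.f.u./ml)

    def lesion_density(C):
        # Log-linear model: N_l = A * ln(1 + B*C)
        return A * np.log1p(B * C)

    def p_infection(C):
        # Exponential model: P_inf = 1 - exp(-r_C * C)
        return -np.expm1(-r_C * C)

    # From the ~1e-2 c.f.u./ml water-quality benchmark up to bloom densities
    for C in (1e-2, 1e4, 1e6):
        print(f"C={C:g} c.f.u./ml: N_l={lesion_density(C):.3g}/m^2, "
              f"P_inf={p_infection(C):.3g}")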
Poster — Thur Eve — 61: A new framework for MPERT plan optimization using MC-DAO
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baker, M; Lloyd, S AM; Townson, R
2014-08-15
This work combines the inverse planning technique known as Direct Aperture Optimization (DAO) with Intensity Modulated Radiation Therapy (IMRT) and combined electron and photon therapy plans. In particular, determining conditions under which Modulated Photon/Electron Radiation Therapy (MPERT) produces better dose conformality and sparing of organs at risk than traditional IMRT plans is central to the project. Presented here are the materials and methods used to generate and manipulate the DAO procedure. Included is the introduction of a powerful Java-based toolkit, the Aperture-based Monte Carlo (MC) MPERT Optimizer (AMMO), that serves as a framework for optimization and provides streamlined access to underlying particle transport packages. Comparison of the toolkit's dose calculations to those produced by the Eclipse TPS and the demonstration of a preliminary optimization are presented as first benchmarks. Excellent agreement is illustrated between the Eclipse TPS and AMMO for a 6 MV photon field. The results of a simple optimization show the functioning of the optimization framework, while significant research remains to characterize appropriate constraints.
Cross-industry benchmarking: is it applicable to the operating room?
Marco, A P; Hart, S
2001-01-01
The use of benchmarking has been growing in nonmedical industries. This concept is being increasingly applied to medicine as the industry strives to improve quality and improve financial performance. Benchmarks can be either internal (set by the institution) or external (use other's performance as a goal). In some industries, benchmarking has crossed industry lines to identify breakthroughs in thinking. In this article, we examine whether the airline industry can be used as a source of external process benchmarking for the operating room.
Overview of TPC Benchmark E: The Next Generation of OLTP Benchmarks
NASA Astrophysics Data System (ADS)
Hogan, Trish
Set to replace the aging TPC-C, the TPC Benchmark E is the next generation OLTP benchmark, which more accurately models client database usage. TPC-E addresses the shortcomings of TPC-C. It has a much more complex workload, requires the use of RAID-protected storage, generates much less I/O, and is much cheaper and easier to set up, run, and audit. After a period of overlap, it is expected that TPC-E will become the de facto OLTP benchmark.
Davidson, Scott E; Cui, Jing; Kry, Stephen; Deasy, Joseph O; Ibbott, Geoffrey S; Vicic, Milos; White, R Allen; Followill, David S
2016-08-01
A dose calculation tool that combines the accuracy of the dose planning method (DPM) Monte Carlo code with the versatility of a practical analytical multisource model, reported previously, has been improved and validated for the Varian 6 and 10 MV linear accelerators (linacs). The calculation tool can be used to calculate doses in advanced clinical application studies. One shortcoming of current clinical trials that report dose from patient plans is the lack of a standardized dose calculation methodology. Because commercial treatment planning systems (TPSs) have their own dose calculation algorithms and the clinical trial participant who uses these systems is responsible for commissioning the beam model, variation exists in the reported calculated dose distributions. Today's modern linac is manufactured to tight specifications, so variability within a linac model is quite low. The expectation is that a single dose calculation tool for a specific linac model can be used to accurately recalculate dose from patient plans that have been submitted to the clinical trial community from any institution. The calculation tool would provide for a more meaningful outcome analysis. The analytical source model was described by a primary point source, a secondary extra-focal source, and a contaminant electron source. Off-axis energy softening and fluence effects were also included. Hyperbolic functions have been incorporated into the model to correct for the changes in output and in electron contamination with field size. A multileaf collimator (MLC) model is included to facilitate phantom and patient dose calculations. An offset to the MLC leaf positions was used to correct for the rudimentary assumed primary point source. Dose calculations of the depth dose and profiles for field sizes 4 × 4 to 40 × 40 cm agree with measurement within 2% of the maximum dose or 2 mm distance to agreement (DTA) for 95% of the data points tested. The model was capable of predicting the depth of the maximum dose within 1 mm. Anthropomorphic phantom benchmark testing of modulated and patterned MLC treatment plans showed agreement to measurement within 3% in target regions using thermoluminescent dosimeters (TLD). Using radiochromic film normalized to TLD, a gamma criterion of 3% of maximum dose and 2 mm DTA was applied, with a pass rate of at least 85% in the high dose, high gradient, and low dose regions. Finally, recalculations of patient plans using DPM showed good agreement relative to a commercial TPS when comparing dose volume histograms and 2D dose distributions. A unique analytical source model coupled to the dose planning method Monte Carlo dose calculation code has been modified and validated using basic beam data and anthropomorphic phantom measurement. While this tool can be applied in general use for a particular linac model, it was specifically developed to provide a singular methodology to independently assess treatment plan dose distributions from clinical institutions participating in National Cancer Institute trials.
Implementation and validation of a conceptual benchmarking framework for patient blood management.
Kastner, Peter; Breznik, Nada; Gombotz, Hans; Hofmann, Axel; Schreier, Günter
2015-01-01
Public health authorities and healthcare professionals are obliged to ensure high quality health service. Because of the high variability of the utilisation of blood and blood components, benchmarking is indicated in transfusion medicine. Implementation and validation of a benchmarking framework for Patient Blood Management (PBM) based on the report from the second Austrian Benchmark trial. Core modules for automatic report generation have been implemented with KNIME (Konstanz Information Miner) and validated by comparing the output with the results of the second Austrian benchmark trial. Delta analysis shows a deviation <0.1% for 95% (max. 1.4%). The framework provides a reliable tool for PBM benchmarking. The next step is technical integration with hospital information systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Will, M.E.; Suter, G.W. II
1995-09-01
An important step in ecological risk assessments is screening the chemicals occurring on a site for contaminants of potential concern. Screening may be accomplished by comparing reported ambient concentrations to a set of toxicological benchmarks. Multiple endpoints for assessing risks posed by soil-borne contaminants to organisms directly impacted by them have been established. This report presents benchmarks for soil invertebrates and microbial processes and addresses only chemicals found at United States Department of Energy (DOE) sites. No benchmarks for pesticides are presented. After discussing methods, this report presents the results of the literature review and benchmark derivation for toxicity to earthworms (Sect. 3), heterotrophic microbes and their processes (Sect. 4), and other invertebrates (Sect. 5). The final sections compare the benchmarks to other criteria and background and draw conclusions concerning the utility of the benchmarks.
Benchmarks for target tracking
NASA Astrophysics Data System (ADS)
Dunham, Darin T.; West, Philip D.
2011-09-01
The term benchmark originates from the chiseled horizontal marks that surveyors made, into which an angle-iron could be placed to bracket ("bench") a leveling rod, thus ensuring that the leveling rod can be repositioned in exactly the same place in the future. A benchmark in computer terms is the result of running a computer program, or a set of programs, in order to assess the relative performance of an object by running a number of standard tests and trials against it. This paper will discuss the history of simulation benchmarks that are being used by multiple branches of the military and agencies of the US government. These benchmarks range from missile defense applications to chemical biological situations. Typically, a benchmark is used with Monte Carlo runs in order to tease out how algorithms deal with variability and the range of possible inputs. We will also describe problems that can be solved by a benchmark.
Benchmarking Using Basic DBMS Operations
NASA Astrophysics Data System (ADS)
Crolotte, Alain; Ghazal, Ahmad
The TPC-H benchmark proved to be successful in the decision support area. Many commercial database vendors and their related hardware vendors used these benchmarks to show the superiority and competitive edge of their products. However, over time, TPC-H became less representative of industry trends as vendors kept tuning their databases to this benchmark-specific workload. In this paper, we present XMarq, a simple benchmark framework that can be used to compare various software/hardware combinations. Our benchmark model is currently composed of 25 queries that measure the performance of basic operations such as scans, aggregations, joins, and index access. This benchmark model is based on the TPC-H data model due to its maturity and well-understood data generation capability. We also propose metrics to evaluate single-system performance and compare two systems. Finally, we illustrate the effectiveness of this model by showing experimental results comparing two systems under different conditions.
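As a rough illustration of timing the kinds of basic operations XMarq targets (scans, aggregations, index access), the sketch below benchmarks three queries against an in-memory SQLite database; the queries and data scale are illustrative, not XMarq's actual workload:

    import sqlite3
    import time

    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, v REAL)")
    con.executemany("INSERT INTO t (v) VALUES (?)",
                    [(i * 0.5,) for i in range(100_000)])

    def bench(label, sql):
        # Time one query end-to-end, including result materialization
        t0 = time.perf_counter()
        con.execute(sql).fetchall()
        print(f"{label}: {time.perf_counter() - t0:.4f} s")

    bench("full scan", "SELECT v FROM t")
    bench("aggregate", "SELECT AVG(v) FROM t")
    bench("index lookup", "SELECT v FROM t WHERE id = 4242")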
Management of postmenopausal osteoporosis and the prevention of fractures.
Gambacciani, M; Levancini, M
2014-06-01
Postmenopausal osteoporosis affects millions of women, with estrogen deficiency being the key factor in the pathogenesis of involutional osteoporosis. Fracture prevention is one of the public health priorities worldwide. Different treatments for osteoporosis are available. The various options are aimed at maintaining bone health and decreasing the risk of fractures. The majority of these drugs are antiresorptive agents, i.e., drugs that lower bone turnover by inhibiting osteoclastic bone resorption. Dietary sources of calcium and vitamin D are ideal, while pharmacological supplements should be used if diet alone cannot provide the recommended daily intake. Bisphosphonates are first-line therapy for patients with established osteoporosis at high risk of fracture. Some serious, but rare, adverse events have been associated with their long-term administration. The monoclonal antibody to RANKL, denosumab, administered as a 60-mg subcutaneous injection every 6 months, is a valuable option for the treatment of postmenopausal osteoporosis in women at increased or high risk of fractures who are unable to take other osteoporosis treatments. Teriparatide (PTH 1-34) is the only osteoanabolic drug available for osteoporosis treatment at present. Its use is limited to severe osteoporosis because of the high cost of the treatment. In climacteric women, in different stages of the menopausal transition and beyond, hormone replacement therapy (HRT) at different doses rapidly normalizes bone turnover, preventing and/or treating osteoporosis. HRT is able to preserve and even increase BMD at all skeletal sites, leading to a significant reduction in vertebral and non-vertebral fractures. Selective estrogen receptor modulators (SERMs) such as raloxifene and bazedoxifene reduce bone turnover and maintain or increase vertebral and femoral BMD in comparison to placebo, and reduce the risk of vertebral and new vertebral fractures in high-risk women. The combination of a SERM with an estrogen has been defined as a tissue selective estrogen complex (TSEC). Bazedoxifene with conjugated estrogens is able to reduce climacteric symptoms while reducing bone turnover and preserving BMD. Studies investigating the actions of phytoestrogens on BMD or bone turnover are largely contradictory, making them inconclusive. At the present time, phytoestrogens cannot be recommended for postmenopausal osteoporosis. In conclusion, the use of HRT for osteoporosis prevention is based on biology, epidemiology, animal and preclinical data, observational studies, and randomized clinical trials. Osteoporosis prevention can actually be considered a major additional benefit in climacteric women who use HRT for treatment of climacteric symptoms. Bone protection is one of the major benefits of HRT. The possibility that low-dose HRT or a TSEC decreases fracture risk has not been demonstrated directly, but the scientific evidence is compelling. Conversely, established osteoporosis, often occurring in elderly women, can better be treated with specific treatments, such as bisphosphonates or, in more severe and selected cases, anabolic agents (teriparatide).
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-26
... coverage \\1\\ in the individual and small group markets, Medicaid benchmark and benchmark-equivalent plans...) Act extends the coverage of the EHB package to issuers of non-grandfathered individual and small group... small group markets, and not to Medicaid benchmark or benchmark-equivalent plans. EHB applicability to...
Discovering and Implementing Best Practices to Strengthen SEAs: Collaborative Benchmarking
ERIC Educational Resources Information Center
Building State Capacity and Productivity Center, 2013
2013-01-01
This paper is written for state educational agency (SEA) leaders who are considering the benefits of collaborative benchmarking, and it addresses the following questions: (1) What does benchmarking of best practices entail?; (2) How does "collaborative benchmarking" enhance the process?; (3) How do SEAs control the process so that "their" needs…
The Concepts "Benchmarks and Benchmarking" Used in Education Planning: Teacher Education as Example
ERIC Educational Resources Information Center
Steyn, H. J.
2015-01-01
Planning in education is a structured activity that includes several phases and steps that take into account several kinds of information (Steyn, Steyn, De Waal & Wolhuter, 2002: 146). One of the sets of information that are usually considered is the (so-called) "benchmarks" and "benchmarking" regarding the focus of a…
ERIC Educational Resources Information Center
McGregor, Ellen N.; Attinasi, Louis C., Jr.
This paper describes the processes involved in selecting peer institutions for appropriate benchmarking using national databases (NCES-IPEDS). Benchmarking involves the identification of peer institutions and/or best practices in specific operational areas for the purpose of developing standards. The benchmarking process was born in the early…
Measuring How Benchmark Assessments Affect Student Achievement. Issues & Answers. REL 2007-No. 039
ERIC Educational Resources Information Center
Henderson, Susan; Petrosino, Anthony; Guckenburg, Sarah; Hamilton, Stephen
2007-01-01
This report examines a Massachusetts pilot program for quarterly benchmark exams in middle-school mathematics, finding that program schools do not show greater gains in student achievement after a year. But that finding might reflect limited data rather than ineffective benchmark assessments. Benchmark assessments are used in many districts…
24 CFR 990.185 - Utilities expense level: Incentives for energy conservation/rate reduction.
Code of Federal Regulations, 2011 CFR
2011-04-01
...) Utility benchmarking. HUD will pursue benchmarking utility consumption at the project level as part of the... convene a meeting with representation of appropriate stakeholders to review utility benchmarking options so that HUD may determine whether or how to implement utility benchmarking to be effective in FY 2011...
ERIC Educational Resources Information Center
Ossiannilsson, E.; Landgren, L.
2012-01-01
Between 2008 and 2010, Lund University took part in three international benchmarking projects, "E-xcellence+," the "eLearning Benchmarking Exercise 2009," and the "First Dual-Mode Distance Learning Benchmarking Club." A comparison of these models revealed a rather high level of correspondence. From this finding and…
24 CFR 990.185 - Utilities expense level: Incentives for energy conservation/rate reduction.
Code of Federal Regulations, 2010 CFR
2010-04-01
...) Utility benchmarking. HUD will pursue benchmarking utility consumption at the project level as part of the... convene a meeting with representation of appropriate stakeholders to review utility benchmarking options so that HUD may determine whether or how to implement utility benchmarking to be effective in FY 2011...
40 CFR 141.543 - How is the disinfection benchmark calculated?
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 24 2012-07-01 2012-07-01 false How is the disinfection benchmark... Disinfection-Systems Serving Fewer Than 10,000 People Disinfection Benchmark § 141.543 How is the disinfection benchmark calculated? If your system is making a significant change to its disinfection practice, it must...
40 CFR 141.543 - How is the disinfection benchmark calculated?
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 23 2014-07-01 2014-07-01 false How is the disinfection benchmark... Disinfection-Systems Serving Fewer Than 10,000 People Disinfection Benchmark § 141.543 How is the disinfection benchmark calculated? If your system is making a significant change to its disinfection practice, it must...
40 CFR 141.543 - How is the disinfection benchmark calculated?
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 24 2013-07-01 2013-07-01 false How is the disinfection benchmark... Disinfection-Systems Serving Fewer Than 10,000 People Disinfection Benchmark § 141.543 How is the disinfection benchmark calculated? If your system is making a significant change to its disinfection practice, it must...
40 CFR 141.543 - How is the disinfection benchmark calculated?
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 23 2011-07-01 2011-07-01 false How is the disinfection benchmark... Disinfection-Systems Serving Fewer Than 10,000 People Disinfection Benchmark § 141.543 How is the disinfection benchmark calculated? If your system is making a significant change to its disinfection practice, it must...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Will, M.E.; Suter, G.W. II
1994-09-01
One of the initial stages in ecological risk assessment for hazardous waste sites is screening contaminants to determine which of them are worthy of further consideration as contaminants of potential concern. This process is termed contaminant screening. It is performed by comparing measured ambient concentrations of chemicals to benchmark concentrations. Currently, no standard benchmark concentrations exist for assessing contaminants in soil with respect to their toxicity to plants. This report presents a standard method for deriving benchmarks for this purpose (phytotoxicity benchmarks), a set of data concerning effects of chemicals in soil or soil solution on plants, and a set of phytotoxicity benchmarks for 38 chemicals potentially associated with United States Department of Energy (DOE) sites. In addition, background information on the phytotoxicity and occurrence of the chemicals in soils is presented, and literature describing the experiments from which data were drawn for benchmark derivation is reviewed. Chemicals that are found in soil at concentrations exceeding both the phytotoxicity benchmark and the background concentration for the soil type should be considered contaminants of potential concern.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Suter, G.W. II
1993-01-01
One of the initial stages in ecological risk assessment for hazardous waste sites is screening contaminants to determine which of them are worthy of further consideration as contaminants of potential concern. This process is termed contaminant screening. It is performed by comparing measured ambient concentrations of chemicals to benchmark concentrations. Currently, no standard benchmark concentrations exist for assessing contaminants in soil with respect to their toxicity to plants. This report presents a standard method for deriving benchmarks for this purpose (phytotoxicity benchmarks), a set of data concerning effects of chemicals in soil or soil solution on plants, and a set of phytotoxicity benchmarks for 38 chemicals potentially associated with United States Department of Energy (DOE) sites. In addition, background information on the phytotoxicity and occurrence of the chemicals in soils is presented, and literature describing the experiments from which data were drawn for benchmark derivation is reviewed. Chemicals that are found in soil at concentrations exceeding both the phytotoxicity benchmark and the background concentration for the soil type should be considered contaminants of potential concern.
NASA Technical Reports Server (NTRS)
Krause, David L.; Brewer, Ethan J.; Pawlik, Ralph
2013-01-01
This report provides test methodology details and qualitative results for the first structural benchmark creep test of an Advanced Stirling Convertor (ASC) heater head of ASC-E2 design heritage. The test article was recovered from a flight-like Microcast MarM-247 heater head specimen previously used in helium permeability testing. The test article was utilized for benchmark creep test rig preparation, wall thickness and diametral laser scan hardware metrological developments, and induction heater custom coil experiments. In addition, a benchmark creep test was performed, terminated after one week when through-thickness cracks propagated at thermocouple weld locations. Following this, it was used to develop a unique temperature measurement methodology using contact thermocouples, thereby enabling future benchmark testing to be performed without the use of conventional welded thermocouples, proven problematic for the alloy. This report includes an overview of heater head structural benchmark creep testing, the origin of this particular test article, test configuration developments accomplished using the test article, creep predictions for its benchmark creep test, qualitative structural benchmark creep test results, and a short summary.
How to Advance TPC Benchmarks with Dependability Aspects
NASA Astrophysics Data System (ADS)
Almeida, Raquel; Poess, Meikel; Nambiar, Raghunath; Patil, Indira; Vieira, Marco
Transactional systems are the core of the information systems of most organizations. Although there is general acknowledgement that failures in these systems often entail significant impact both on the proceeds and reputation of companies, the benchmarks developed and managed by the Transaction Processing Performance Council (TPC) still maintain their focus on reporting bare performance. Each TPC benchmark has to pass a list of dependability-related tests (to verify ACID properties), but not all benchmarks require measuring their performances. While TPC-E measures the recovery time of some system failures, TPC-H and TPC-C only require functional correctness of such recovery. Consequently, systems used in TPC benchmarks are tuned mostly for performance. In this paper we argue that nowadays systems should be tuned for a more comprehensive suite of dependability tests, and that a dependability metric should be part of TPC benchmark publications. The paper discusses WHY and HOW this can be achieved. Two approaches are introduced and discussed: augmenting each TPC benchmark in a customized way, by extending each specification individually; and pursuing a more unified approach, defining a generic specification that could be adjoined to any TPC benchmark.
Space Weather Action Plan Solar Radio Burst Phase 1 Benchmarks and the Steps to Phase 2
NASA Astrophysics Data System (ADS)
Biesecker, D. A.; White, S. M.; Gopalswamy, N.; Black, C.; Love, J. J.; Pierson, J.
2017-12-01
Solar radio bursts, when at the right frequency and when strong enough, can interfere with radar, communication, and tracking signals. In severe cases, radio bursts can inhibit the successful use of radio communications and disrupt a wide range of systems that are reliant on Position, Navigation, and Timing services on timescales ranging from minutes to hours across wide areas on the dayside of Earth. The White House's Space Weather Action Plan asked for solar radio burst intensity benchmarks for an event occurrence frequency of 1 in 100 years and also a theoretical maximum intensity benchmark. The benchmark team has developed preliminary (phase 1) benchmarks for the VHF (30-300 MHz), UHF (300-3000 MHz), GPS (1176-1602 MHz), F10.7 (2800 MHz), and Microwave (4000-20000 MHz) bands. The preliminary benchmarks were derived from previously published work. Limitations in the published work will be addressed in phase 2 of the benchmark process. In addition, deriving theoretical maxima requires further work to determine where doing so is even possible, in order to meet the Action Plan objectives. In this presentation, we will present the phase 1 benchmarks, the basis used to derive them, and the limitations of that work. We will also discuss the work that needs to be done to complete the phase 2 benchmarks.
Comparison of Origin 2000 and Origin 3000 Using NAS Parallel Benchmarks
NASA Technical Reports Server (NTRS)
Turney, Raymond D.
2001-01-01
This report describes results of benchmark tests on the Origin 3000 system currently being installed at the NASA Ames National Advanced Supercomputing facility. This machine will ultimately contain 1024 R14K processors. The first part of the system, installed in November 2000 and named mendel, is an Origin 3000 with 128 R12K processors. For comparison purposes, the tests were also run on lomax, an Origin 2000 with R12K processors. The BT, LU, and SP application benchmarks in the NAS Parallel Benchmark Suite and the kernel benchmark FT were chosen to determine system performance and measure the impact of changes on the machine as it evolves. Having been written to measure performance on Computational Fluid Dynamics applications, these benchmarks are assumed appropriate to represent the NAS workload. Since the NAS runs both message passing (MPI) and shared-memory, compiler directive type codes, both MPI and OpenMP versions of the benchmarks were used. The MPI versions used were the latest official release of the NAS Parallel Benchmarks, version 2.3. The OpenMP versions used were PBN3b2, a beta version that is in the process of being released. NPB 2.3 and PBN3b2 are technically different benchmarks, and NPB results are not directly comparable to PBN results.
Mitchell, L
1996-01-01
The processes of benchmarking, benchmark data comparative analysis, and study of best practices are distinctly different. The study of best practices is explained with an example based on the Arthur Andersen & Co. 1992 "Study of Best Practices in Ambulatory Surgery". The results of a national best practices study in ambulatory surgery were used to provide our quality improvement team with the goal of improving the turnaround time between surgical cases. The team used a seven-step quality improvement problem-solving process to improve the surgical turnaround time. The national benchmark for turnaround times between surgical cases in 1992 was 13.5 minutes. The initial turnaround time at St. Joseph's Medical Center was 19.9 minutes. After the team implemented solutions, the time was reduced to an average of 16.3 minutes, an 18% improvement. Cost-benefit analysis showed a potential enhanced revenue of approximately $300,000, or a potential savings of $10,119. Applying quality improvement principles to benchmarking, benchmarks, or best practices can improve process performance. Understanding which form of benchmarking the institution wishes to embark on will help focus a team and use appropriate resources. Communicating with professional organizations that have experience in benchmarking will save time and money and help achieve the desired results.
Barty, Rebecca L; Gagliardi, Kathleen; Owens, Wendy; Lauzon, Deborah; Scheuermann, Sheena; Liu, Yang; Wang, Grace; Pai, Menaka; Heddle, Nancy M
2015-07-01
Benchmarking is a quality improvement tool that compares an organization's performance to that of its peers for selected indicators, to improve practice. Processes to develop evidence-based benchmarks for red blood cell (RBC) outdating in Ontario hospitals, based on RBC hospital disposition data from Canadian Blood Services, have been previously reported. These benchmarks were implemented in 160 hospitals provincewide with a multifaceted approach, which included hospital education, inventory management tools and resources, summaries of best practice recommendations, recognition of high-performing sites, and audit tools on the Transfusion Ontario website (http://transfusionontario.org). In this study we describe the implementation process and the impact of the benchmarking program on RBC outdating. A conceptual framework for continuous quality improvement of a benchmarking program was also developed. The RBC outdating rate for all hospitals trended downward continuously from April 2006 to February 2012, irrespective of hospitals' transfusion rates or their distance from the blood supplier. The highest annual outdating rate was 2.82%, at the beginning of the observation period. Each year brought further reductions, with a nadir outdating rate of 1.02% achieved in 2011. The key elements of the successful benchmarking strategy included dynamic targets, a comprehensive and evidence-based implementation strategy, ongoing information sharing, and a robust data system to track information. The Ontario benchmarking program for RBC outdating resulted in continuous and sustained quality improvement. Our conceptual iterative framework for benchmarking provides a guide for institutions implementing a benchmarking program. © 2015 AABB.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Munro, J.F.; Kristal, J.; Thompson, G.
The Office of Environmental Management is bringing Headquarters and the Field together to implement process improvements throughout the Complex through a systematic process of organizational learning called benchmarking. Simply stated, benchmarking is a process of continuously comparing and measuring practices, processes, or methodologies with those of other private and public organizations. The EM benchmarking program, which began as the result of a recommendation from Xerox Corporation, is building trust and removing barriers to performance enhancement across the DOE organization. The EM benchmarking program is designed to be field-centered, with Headquarters providing facilitative and integrative functions on an "as needed" basis. One of the main goals of the program is to assist Field Offices and their associated M&O/M&I contractors in developing the capabilities to do benchmarking for themselves. In this regard, a central precept is that in order to realize tangible performance benefits, program managers and staff, the ones closest to the work, must take ownership of the studies. This avoids the "check the box" mentality associated with some third-party studies. This workshop will provide participants with a basic understanding of why the EM benchmarking team was developed and the nature and scope of its mission. Participants will also begin to understand the types of study levels and the particular methodology the EM benchmarking team is using to conduct studies. The EM benchmarking team will also encourage discussion on ways that DOE (both Headquarters and the Field) can team with its M&O/M&I contractors to conduct additional benchmarking studies. This "introduction to benchmarking" is intended to create a desire to know more and a greater appreciation of how benchmarking processes could be creatively employed to enhance performance.
Edwards, Roger A.; Dee, Deborah; Umer, Amna; Perrine, Cria G.; Shealy, Katherine R.; Grummer-Strawn, Laurence M.
2015-01-01
Background: A substantial proportion of US maternity care facilities engage in practices that are not evidence-based and that interfere with breastfeeding. The CDC Survey of Maternity Practices in Infant Nutrition and Care (mPINC) showed significant variation in maternity practices among US states. Objective: The purpose of this article is to use benchmarking techniques to identify states within relevant peer groups that were top performers on mPINC survey indicators related to breastfeeding support. Methods: We used 11 indicators of breastfeeding-related maternity care from the 2011 mPINC survey and benchmarking techniques to organize and compare hospital-based maternity practices across the 50 states and Washington, DC. We created peer categories for benchmarking first by region (grouping states by West, Midwest, South, and Northeast) and then by size (grouping states by the number of maternity facilities and dividing each region into approximately equal halves based on the number of facilities). Results: Thirty-four states had scores high enough to serve as benchmarks, and 32 states had scores low enough to reflect the lowest score gap from the benchmark on at least 1 indicator. No state served as the benchmark on more than 5 indicators and no state was furthest from the benchmark on more than 7 indicators. The small peer group benchmarks in the South, West, and Midwest were better than the large peer group benchmarks on 91%, 82%, and 36% of the indicators, respectively. In the West large, Midwest large, Midwest small, and South large peer groups, 4–6 benchmarks showed that less than 50% of hospitals have ideal practice in all states. Conclusion: The evaluation presents benchmarks for peer group state comparisons that provide potential and feasible targets for improvement. PMID:24394963
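The peer-group logic above is easy to make concrete: group states by region and by facility-count size class, then take the top performer in each group as the benchmark. The sketch below illustrates this with invented state scores; none of the names, groupings, or numbers are mPINC data.

```python
# Minimal sketch of peer-group benchmarking: states are grouped by (region,
# size class) and the top performer in each group serves as the benchmark.
# All scores and group assignments are illustrative, not mPINC results.
from collections import defaultdict

scores = {  # state -> (region, size class, indicator score)
    "CA": ("West", "large", 72.0),
    "AZ": ("West", "large", 65.5),
    "WA": ("West", "small", 80.5),
    "MT": ("West", "small", 77.0),
    "TX": ("South", "large", 61.3),
    "FL": ("South", "large", 66.8),
    "ND": ("Midwest", "small", 74.2),
    "SD": ("Midwest", "small", 70.1),
}

peer_groups = defaultdict(dict)
for state, (region, size, score) in scores.items():
    peer_groups[(region, size)][state] = score

for group, members in sorted(peer_groups.items()):
    best = max(members, key=members.get)    # benchmark state
    worst = min(members, key=members.get)   # largest gap from benchmark
    print(group, "benchmark:", best,
          "largest gap:", round(members[best] - members[worst], 1))
```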
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heilemann, G., E-mail: gerd.heilemann@meduniwien.ac.at; Kostiukhina, N.; Nesvacil, N.
2015-10-15
Purpose: The purpose of this study was to establish a method to perform multidimensional radiochromic film measurements of ¹⁰⁶Ru plaques and to benchmark the resulting dose distributions against Monte Carlo simulations (MC), microdiamond, and diode measurements. Methods: Absolute dose rates and relative dose distributions in multiple planes were determined for three different plaque models (CCB, CCA, and COB), and three different plaques per model, using EBT3 films in an in-house developed polystyrene phantom and the MCNP6 MC code. Dose difference maps were generated to analyze interplaque variations for a specific type, and for comparing measurements against MC simulations. Furthermore, dose distributions were validated against values specified by the manufacturer (BEBIG) and against microdiamond and diode measurements in a water scanning phantom. Radial profiles were assessed and used to estimate dosimetric margins for a given combination of representative tumor geometry and plaque size. Results: Absolute dose rates at a reference depth of 2 mm on the central axis of the plaque show an agreement better than 5% (10%) when comparing film measurements (MCNP6) to the manufacturer's data. The reproducibility of depth-dose profile measurements was <7% (2 SD) for all investigated detectors and plaque types. Dose difference maps revealed minor interplaque deviations for a specific plaque type due to inhomogeneities of the active layer. The evaluation of dosimetric margins showed that for a majority of the investigated cases, the tumor was not completely covered by the 100% isodose prescribed to the tumor apex if the difference between geometrical plaque size and tumor base was ≤4 mm. Conclusions: EBT3 film dosimetry in an in-house developed phantom was successfully used to characterize the dosimetric properties of different ¹⁰⁶Ru plaque models. The film measurements were validated against MC calculations and other experimental methods and showed good agreement with data from BEBIG, well within published tolerances. The dosimetric information as well as the interplaque comparison can be used for comprehensive quality assurance and for considerations in the treatment planning of ophthalmic brachytherapy.
NASA Astrophysics Data System (ADS)
van de Water, Steven; Albertini, Francesca; Weber, Damien C.; Heijmen, Ben J. M.; Hoogeman, Mischa S.; Lomax, Antony J.
2018-01-01
The aim of this study is to develop an anatomical robust optimization method for intensity-modulated proton therapy (IMPT) that accounts for interfraction variations in nasal cavity filling, and to compare it with conventional single-field uniform dose (SFUD) optimization and online plan adaptation. We included CT data of five patients with tumors in the sinonasal region. Using the planning CT, we generated for each patient 25 ‘synthetic’ CTs with varying nasal cavity filling. The robust optimization method available in our treatment planning system ‘Erasmus-iCycle’ was extended to also account for anatomical uncertainties by including (synthetic) CTs with varying patient anatomy as error scenarios in the inverse optimization. For each patient, we generated treatment plans using anatomical robust optimization and, for benchmarking, using SFUD optimization and online plan adaptation. Clinical target volume (CTV) and organ-at-risk (OAR) doses were assessed by recalculating the treatment plans on the synthetic CTs, evaluating dose distributions individually and accumulated over an entire fractionated 50 GyRBE treatment, assuming each synthetic CT to correspond to a 2 GyRBE fraction. Treatment plans were also evaluated using actual repeat CTs. Anatomical robust optimization resulted in adequate CTV doses (V95% ⩾ 98% and V107% ⩽ 2%) if at least three synthetic CTs were included in addition to the planning CT. These CTV requirements were also fulfilled for online plan adaptation, but not for the SFUD approach, even when applying a margin of 5 mm. Compared with anatomical robust optimization, OAR dose parameters for the accumulated dose distributions were on average 5.9 GyRBE (20%) higher when using SFUD optimization and on average 3.6 GyRBE (18%) lower for online plan adaptation. In conclusion, anatomical robust optimization effectively accounted for changes in nasal cavity filling during IMPT, providing substantially improved CTV and OAR doses compared with conventional SFUD optimization. OAR doses can be further reduced by using online plan adaptation.
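The core of this kind of anatomical robust optimization is scenario-based scoring: the plan's objective is evaluated on dose distributions computed for the nominal CT and for each synthetic-CT scenario, and the worst case governs. The sketch below illustrates only that worst-case bookkeeping with invented masks, doses, and weights; it is not the Erasmus-iCycle implementation.

```python
# Toy illustration of worst-case (minimax) scenario scoring for robust
# optimization. The dose arrays stand in for doses recomputed on synthetic
# CTs; masks, prescription, and weights are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)
ctv_mask = np.zeros(1000, dtype=bool)
ctv_mask[:200] = True
oar_mask = ~ctv_mask

# Pretend per-fraction dose distributions (GyRBE) for the nominal CT plus
# three anatomy-variation scenarios.
scenario_doses = [2.0 + 0.1 * rng.standard_normal(1000) for _ in range(4)]

def objective(dose, prescription=2.0):
    # Penalize CTV underdose plus a small weight on mean OAR dose.
    underdose = np.clip(prescription - dose[ctv_mask], 0.0, None).mean()
    return underdose + 0.1 * dose[oar_mask].mean()

# The worst scenario drives the score, so a plan is acceptable only if it
# remains acceptable under every modeled change in nasal cavity filling.
worst = max(objective(d) for d in scenario_doses)
print(f"worst-case objective: {worst:.3f}")
```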
Warden, Graham I.; Farkas, Cameron E.; Ikuta, Ichiro; Prevedello, Luciano M.; Andriole, Katherine P.; Khorasani, Ramin
2012-01-01
Purpose: To develop and validate an informatics toolkit that extracts anatomy-specific computed tomography (CT) radiation exposure metrics (volume CT dose index and dose-length product) from existing digital image archives through optical character recognition of CT dose report screen captures (dose screens) combined with Digital Imaging and Communications in Medicine attributes. Materials and Methods: This institutional review board-approved HIPAA-compliant study was performed in a large urban health care delivery network. Data were drawn from a random sample of CT encounters that occurred between 2000 and 2010; images from these encounters were contained within the enterprise image archive, which encompassed images obtained at an adult academic tertiary referral hospital and its affiliated sites, including a cancer center, a community hospital, and outpatient imaging centers, as well as images imported from other facilities. Software was validated by using 150 randomly selected encounters for each major CT scanner manufacturer, with outcome measures of dose screen retrieval rate (proportion of correctly located dose screens) and anatomic assignment precision (proportion of extracted exposure data with correctly assigned anatomic region, such as head, chest, or abdomen and pelvis). The 95% binomial confidence intervals (CIs) were calculated for discrete proportions, and CIs were derived from the standard error of the mean for continuous variables. After validation, the informatics toolkit was used to populate an exposure repository from a cohort of 54 549 CT encounters, of which 29 948 had available dose screens. Results: Validation yielded a dose screen retrieval rate of 99% (597 of 605 CT encounters; 95% CI: 98%, 100%) and an anatomic assignment precision of 94% (summed DLP fraction correct, 563 of 600 CT encounters; 95% CI: 92%, 96%). Patient safety applications of the resulting data repository include benchmarking between institutions, CT protocol quality control and optimization, and cumulative patient- and anatomy-specific radiation exposure monitoring. Conclusion: Large-scale anatomy-specific radiation exposure data repositories can be created with high fidelity from existing digital image archives by using open-source informatics tools. ©RSNA, 2012 Supplemental material: http://radiology.rsna.org/lookup/suppl/doi:10.1148/radiol.12111822/-/DC1 PMID:22668563
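The quoted validation intervals follow from a standard binomial confidence interval. A minimal sketch using the normal approximation, applied to the reported 597-of-605 retrieval rate, is below; the helper function is illustrative and not part of the authors' toolkit.

```python
# 95% binomial confidence interval (normal approximation) for a proportion,
# applied to the dose screen retrieval rate reported above (597 of 605).
import math

def binomial_ci(successes, n, z=1.96):
    p = successes / n
    half = z * math.sqrt(p * (1.0 - p) / n)
    return p, max(0.0, p - half), min(1.0, p + half)

p, lo, hi = binomial_ci(597, 605)
print(f"retrieval rate {p:.1%}, 95% CI ({lo:.1%}, {hi:.1%})")
# -> roughly 98.7% with a CI of about (97.8%, 99.6%), matching the
#    reported 95% CI of 98% to 100% after rounding.
```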
An in-house developed resettable MOSFET dosimeter for radiotherapy.
Verellen, Dirk; Van Vaerenbergh, Sven; Tournel, Koen; Heuninckx, Karina; Joris, Laurent; Duchateau, Michael; Linthout, Nadine; Gevaert, Thierry; Reynders, Truus; Van de Vondel, Iwein; Coppens, Luc; Depuydt, Tom; De Ridder, Mark; Storme, Guy
2010-02-21
The purpose of this note is to report the feasibility and clinical validation of an in-house developed MOSFET dosimetry system and describe an integrated non-destructive reset procedure. Off-the-shelf MOSFETs are connected to a common PC using an 18-bit analogue-input and 16-bit output data acquisition card. A reading algorithm was developed defining the zero-temperature-coefficient point (ZTC) to determine the threshold voltage. A wireless interface was established for ease of use. The reset procedure consists of an internal circuit generating local heating induced by an electrical current. Sensitivity has been investigated as a function of bias voltage (0-9 V) to the gate. Dosimetric properties have been evaluated for 6 MV and 15 MV clinical photon beams and in vivo benchmarking was performed against thermoluminescence dosimeters (TLD) for conventional treatments (two groups of ten patients for each energy) and total body irradiation (TBI). MOSFETs were pre-irradiated with 20 Gy. A sensitivity of 0.08 mV cGy⁻¹ can be obtained for 200 cGy irradiations at 5 V bias voltage. Ten consecutive measurements at 200 cGy yield an SD of 2.08 cGy (1.05%). Increasing the dose in steps from 5 cGy to 1000 cGy yields a 1.00 Pearson correlation coefficient and agreement within 2.0%. Dose rate dependence (160-800 cGy min⁻¹) was within 2.5%, temperature dependence within 2.0% (25-37 degrees C). A strong angular dependence has been observed for gantry incidences exceeding ±30 degrees. Dose response is stable up to 50 Gy (saturation occurs at approximately 90 Gy), which is used as the threshold dose before resetting the MOSFET. An average measured-over-calculated dose ratio within 1.05 (SD: 0.04) has been obtained in vivo. TBI midplane dose assessed by entrance and exit dose measurements agreed within 1.9% with ionization chamber in phantom, and within 1.0% with TLD in vivo. An in-house developed resettable MOSFET-based dosimetry system is proposed. The system has been validated and is currently used for in vivo entrance dose measurement in clinical routine for simple (open field) treatment configurations.
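In outline, the readout reduces to dividing the threshold-voltage shift by the calibrated sensitivity and tracking accumulated dose against the reset threshold. The sketch below uses the sensitivity (0.08 mV/cGy at 5 V bias) and the roughly 50 Gy reset point reported in the note; the voltage shifts themselves are invented.

```python
# Minimal sketch of MOSFET dose readout: dose = threshold-voltage shift /
# sensitivity, with a reset once accumulated dose nears saturation.
SENSITIVITY_MV_PER_CGY = 0.08   # reported calibration at 5 V bias
RESET_THRESHOLD_CGY = 5000.0    # ~50 Gy accumulated dose before reset

def dose_from_shift(delta_v_mv, sensitivity=SENSITIVITY_MV_PER_CGY):
    """Return dose in cGy from a threshold-voltage shift in mV."""
    return delta_v_mv / sensitivity

accumulated = 0.0
for shift_mv in (16.0, 15.8, 16.3):   # three hypothetical ~200 cGy fractions
    d = dose_from_shift(shift_mv)
    accumulated += d
    print(f"measured dose: {d:.1f} cGy (accumulated {accumulated:.0f} cGy)")
if accumulated > RESET_THRESHOLD_CGY:
    print("reset MOSFET via the internal heating circuit")
```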
Amoush, Ahmad; Wilkinson, Douglas A.
2015-01-01
This work is a comparative study of the dosimetry calculated by Plaque Simulator, a treatment planning system for eye plaque brachytherapy, to the dosimetry calculated using Monte Carlo simulation for an Eye Physics model EP917 eye plaque. Monte Carlo (MC) simulation using MCNPX 2.7 was used to calculate the central axis dose in water for an EP917 eye plaque fully loaded with 17 IsoAid Advantage ¹²⁵I seeds. In addition, the dosimetry parameters Λ, gL(r), and F(r,θ) were calculated for the IsoAid Advantage model IAI-125 ¹²⁵I seed and benchmarked against published data. Bebig Plaque Simulator (PS) v5.74 was used to calculate the central axis dose based on the AAPM Updated Task Group 43 (TG-43U1) dose formalism. The calculated central axis dose from MC and PS was then compared. When the MC dosimetry parameters for the IsoAid Advantage ¹²⁵I seed were compared with the consensus values, Λ agreed with the consensus value to within 2.3%. However, much larger differences were found between MC-calculated gL(r) and F(r,θ) and the consensus values. The differences between MC-calculated dosimetry parameters are much smaller when compared with recently published data. The differences between the calculated central axis absolute dose from MC and PS ranged from 5% to 10% for distances between 1 and 12 mm from the outer scleral surface. When the dosimetry parameters for the ¹²⁵I seed from this study were used in PS, the calculated absolute central axis dose differences were reduced by 2.3% from depths of 4 to 12 mm from the outer scleral surface. We conclude that PS adequately models the central dose profile of this plaque using its defaults for the IsoAid model IAI-125 at distances of 1 to 7 mm from the outer scleral surface. However, improved dose accuracy can be obtained by using updated dosimetry parameters for the IsoAid model IAI-125 ¹²⁵I seed. PACS number: 87.55.K- PMID:26699577
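For reference, the TG-43U1 formalism underlying these calculations expresses the dose rate at radial distance r and polar angle θ from a line source as

\dot{D}(r,\theta) = S_K \,\Lambda\, \frac{G_L(r,\theta)}{G_L(r_0,\theta_0)}\, g_L(r)\, F(r,\theta)

where S_K is the air-kerma strength, Λ the dose-rate constant, G_L the line-source geometry function, g_L(r) the radial dose function, and F(r,θ) the 2D anisotropy function, with the reference point at r_0 = 1 cm and θ_0 = π/2. This is the standard published formalism, stated here only to make the compared quantities explicit.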
DOE Office of Scientific and Technical Information (OSTI.GOV)
Russell E. Feder and Mahmoud Z. Youssef
Neutronics analyses to find nuclear heating rates and personnel dose rates were conducted in support of the integration of diagnostics into the ITER Upper Port Plugs. Simplified shielding models of the Visible-Infrared diagnostic and of a large aperture diagnostic were incorporated into the ITER global CAD model. Results for these systems are representative of typical designs with maximum shielding and a small aperture (Vis-IR) and minimal shielding with a large aperture. The neutronics discrete-ordinates codes ATTILA® and SEVERIAN® (the ATTILA parallel processing version) were used. Material properties and the 500 MW D-T volume source were taken from the ITER “Brand Model” MCNP benchmark model. A biased quadrature set equivalent to Sn=32 and a scattering degree of Pn=3 were used along with a 46-neutron and 21-gamma FENDL energy subgrouping. Total nuclear heating (neutron plus gamma heating) in the upper port plugs ranged between 380 and 350 kW for the Vis-IR and Large Aperture cases. The Large Aperture model exhibited lower total heating but much higher peak volumetric heating on the upper port plug structure. Personnel dose rates are calculated in a three-step process involving a neutron-only transport calculation, the generation of activation volume sources at pre-defined time steps, and finally gamma transport analyses run for selected time steps. ANSI-ANS 6.1.1 1977 flux-to-dose conversion factors were used. Dose rates were evaluated for 1 full year of 500 MW DT operation, which is comprised of 3000 1800-second pulses. After one year the machine is shut down for maintenance and personnel are permitted to access the diagnostic interspace after 2 weeks if dose rates are below 100 μSv/hr. Dose rates in the Visible-IR diagnostic model after one day of shutdown were 130 μSv/hr but fell below the limit to 90 μSv/hr 2 weeks later. The Large Aperture style shielding model exhibited higher and more persistent dose rates. After 1 day the dose rate was 230 μSv/hr and was still at 120 μSv/hr 4 weeks later.
Castorina, Rosemary; Bradman, Asa; McKone, Thomas E; Barr, Dana B; Harnly, Martha E; Eskenazi, Brenda
2003-01-01
Approximately 230,000 kg of organophosphate (OP) pesticides are applied annually in California's Salinas Valley. These activities have raised concerns about exposures to area residents. We collected three spot urine samples from pregnant women (between 1999 and 2001) enrolled in CHAMACOS (Center for the Health Assessment of Mothers and Children of Salinas), a longitudinal birth cohort study, and analyzed them for six dialkyl phosphate metabolites. We used urine from 446 pregnant women to estimate OP pesticide doses with two deterministic steady-state modeling methods: method 1, which assumed the metabolites were attributable entirely to a single diethyl or dimethyl OP pesticide; and method 2, which adapted U.S. Environmental Protection Agency (U.S. EPA) draft guidelines for cumulative risk assessment to estimate dose from a mixture of OP pesticides that share a common mechanism of toxicity. We used pesticide use reporting data for the Salinas Valley to approximate the mixture to which the women were exposed. Based on average OP pesticide dose estimates that assumed exposure to a single OP pesticide (method 1), between 0% and 36.1% of study participants' doses failed to attain a margin of exposure (MOE) of 100 relative to the U.S. EPA oral benchmark dose (BMD10), depending on the assumption made about the parent compound. These BMD10 values are doses expected to produce a 10% reduction in brain cholinesterase activity compared with background response in rats. Given the participants' average cumulative OP pesticide dose estimates (method 2) and regardless of the index chemical selected, we found that 14.8% of the doses failed to attain an MOE of 100 relative to the BMD10 of the selected index chemical. An uncertainty analysis of the pesticide mixture parameter, which is extrapolated from pesticide application data for the study area and not directly quantified for each individual, suggests that this point estimate could range from 1% to 34%. In future analyses, we will use pesticide-specific urinary metabolites, when available, to evaluate cumulative OP pesticide exposures. PMID:14527844
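The screen applied above is a simple ratio test: the margin of exposure is MOE = BMD10 / estimated dose, flagged when the MOE falls below 100. The sketch below illustrates the computation; the BMD10 and dose values are placeholders, not CHAMACOS results.

```python
# Margin-of-exposure screen: MOE = BMD10 / dose, flagged when MOE < 100.
# The BMD10 and dose estimates are hypothetical illustration values.
BMD10_MG_PER_KG_DAY = 2.5      # hypothetical BMD10 for an index OP pesticide
TARGET_MOE = 100.0

dose_estimates = [0.001, 0.012, 0.06]   # hypothetical doses, mg/kg-day

for dose in dose_estimates:
    moe = BMD10_MG_PER_KG_DAY / dose
    verdict = "fails" if moe < TARGET_MOE else "attains"
    print(f"dose {dose:g} mg/kg-day -> MOE {moe:,.0f} ({verdict} an MOE of 100)")
```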
Fukushima Daiichi Radionuclide Inventories
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cardoni, Jeffrey N.; Jankovsky, Zachary Kyle
Radionuclide inventories are generated to permit detailed analyses of the Fukushima Daiichi meltdowns. This is necessary information for severe accident calculations, dose calculations, and source term and consequence analyses. Inventories are calculated using SCALE6 and compared to values predicted by international researchers supporting the OECD/NEA's Benchmark Study on the Accident at Fukushima Daiichi Nuclear Power Station (BSAF). Both sets of inventory information are acceptable for best-estimate analyses of the Fukushima reactors. Consistent nuclear information for severe accident codes, including radionuclide class masses and core decay powers, is also derived from the SCALE6 analyses. Key nuclide activity ratios are calculated as functions of burnup and nuclear data in order to explore their utility for nuclear forensics and to support future decommissioning efforts.
Sulfur activation at the Little Boy-Comet Critical Assembly: a replica of the Hiroshima bomb
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kerr, G.D.; Emery, J.F.; Pace, J.V. III
1985-04-01
Studies have been completed on the activation of sulfur by fast neutrons from the Little Boy-Comet Critical Assembly, which replicates the general features of the Hiroshima bomb. The complex effects of the bomb's design and construction on leakage of sulfur-activation neutrons were investigated both experimentally and theoretically. Our sulfur activation studies were performed as part of a larger program to provide benchmark data for testing of methods used in recent source-term calculations for the Hiroshima bomb. Source neutrons capable of activating sulfur play an important role in determining neutron doses in Hiroshima at a kilometer or more from the point of explosion.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-21
...] RIN 0691-AA80 Direct Investment Surveys: BE-12, Benchmark Survey of Foreign Direct Investment in the... of Foreign Direct Investment in the United States. Benchmark surveys are conducted every five years; the prior survey covered 2007. The benchmark survey covers the universe of foreign direct investment...
Unstructured Adaptive (UA) NAS Parallel Benchmark. Version 1.0
NASA Technical Reports Server (NTRS)
Feng, Huiyu; VanderWijngaart, Rob; Biswas, Rupak; Mavriplis, Catherine
2004-01-01
We present a complete specification of a new benchmark for measuring the performance of modern computer systems when solving scientific problems featuring irregular, dynamic memory accesses. It complements the existing NAS Parallel Benchmark suite. The benchmark involves the solution of a stylized heat transfer problem in a cubic domain, discretized on an adaptively refined, unstructured mesh.
Operationalizing the Rubric: The Effect of Benchmark Selection on the Assessed Quality of Writing.
ERIC Educational Resources Information Center
Popp, Sharon E. Osborn; Ryan, Joseph M.; Thompson, Marilyn S.; Behrens, John T.
The purposes of this study were to investigate the role of benchmark writing samples in direct assessment of writing and to examine the consequences of differential benchmark selection with a common writing rubric. The influences of discourse and grade level were also examined within the context of differential benchmark selection. Raters scored…
ERIC Educational Resources Information Center
Galloway, Melissa Ritchie
2016-01-01
The purpose of this causal comparative study was to test the theory of assessment that relates benchmark assessments to the Georgia middle grades science Criterion Referenced Competency Test (CRCT) percentages, controlling for schools that do not administer benchmark assessments versus schools that do administer benchmark assessments for all middle…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-21
... 2006 Decision Memorandum) at ``Benchmarks for Short-Term Financing.'' B. Benchmark for Long-Term Loans.... Subsidies Valuation Information A. Benchmarks for Short-Term Financing For those programs requiring the application of a won-denominated, short-term interest rate benchmark, in accordance with 19 CFR 351.505(a)(2...
Transaction Processing Performance Council (TPC): State of the Council 2010
NASA Astrophysics Data System (ADS)
Nambiar, Raghunath; Wakou, Nicholas; Carman, Forrest; Majdalany, Michael
The Transaction Processing Performance Council (TPC) is a non-profit corporation founded to define transaction processing and database benchmarks and to disseminate objective, verifiable performance data to the industry. Established in August 1988, the TPC has been integral in shaping the landscape of modern transaction processing and database benchmarks over the past twenty-two years. This paper provides an overview of the TPC's existing benchmark standards and specifications, introduces two new TPC benchmarks under development, and examines the TPC's active involvement in the early creation of additional future benchmarks.
Shift Verification and Validation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pandya, Tara M.; Evans, Thomas M.; Davidson, Gregory G
2016-09-07
This documentation outlines the verification and validation of Shift for the Consortium for Advanced Simulation of Light Water Reactors (CASL). Five main types of problems were used for validation: small criticality benchmark problems; full-core reactor benchmarks for light water reactors; fixed-source coupled neutron-photon dosimetry benchmarks; depletion/burnup benchmarks; and full-core reactor performance benchmarks. We compared Shift results to measured data and other simulated Monte Carlo radiation transport code results, and found very good agreement in a variety of comparison measures. These include prediction of critical eigenvalue, radial and axial pin power distributions, rod worth, leakage spectra, and nuclide inventories over a burn cycle. Based on this validation, we are confident in Shift's ability to provide reference results for CASL benchmarking.
Deterministic absorbed dose estimation in computed tomography using a discrete ordinates method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Norris, Edward T.; Liu, Xin, E-mail: xinliu@mst.edu; Hsieh, Jiang
Purpose: Organ dose estimation for a patient undergoing computed tomography (CT) scanning is very important. Although Monte Carlo methods are considered the gold standard in patient dose estimation, the computation time required is formidable for routine clinical calculations. Here, the authors investigate a deterministic method for estimating an absorbed dose more efficiently. Methods: Compared with current Monte Carlo methods, a more efficient approach to estimating the absorbed dose is to solve the linear Boltzmann equation numerically. In this study, an axial CT scan was modeled with a software package, Denovo, which solved the linear Boltzmann equation using the discrete ordinates method. The CT scanning configuration included 16 x-ray source positions, beam collimators, flat filters, and bowtie filters. The phantom was the standard 32 cm CT dose index (CTDI) phantom. Four different Denovo simulations were performed with different simulation parameters, including the number of quadrature sets and the order of Legendre polynomial expansions. A Monte Carlo simulation was also performed for benchmarking the Denovo simulations. A quantitative comparison was made of the simulation results obtained by the Denovo and the Monte Carlo methods. Results: The difference between the simulation results of the discrete ordinates method and those of the Monte Carlo methods was found to be small, with a root-mean-square difference of around 2.4%. It was found that the discrete ordinates method, with a higher order of Legendre polynomial expansions, underestimated the absorbed dose near the center of the phantom (i.e., the low-dose region). The simulation using quadrature set 8 and first-order Legendre polynomial expansion proved to be the most efficient computation method in the authors' study; its single-thread computation time was 21 min on a personal computer. Conclusions: The simulation results showed that the deterministic method can be effectively used to estimate the absorbed dose in a CTDI phantom. The accuracy of the discrete ordinates method was close to that of a Monte Carlo simulation, and the primary benefit of the discrete ordinates method lies in its rapid computation speed. It is expected that further optimization of this method in routine clinical CT dose estimation will improve its accuracy and speed.
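For context, the equation such a discrete-ordinates code solves can be written schematically in one-group, fixed-source form, with the angular flux ψ_m carried along quadrature directions Ω_m (weights w_m) and the scattering kernel expanded in Legendre polynomials up to order L, the expansion order varied in the study:

\Omega_m \cdot \nabla \psi_m(\mathbf{r}) + \Sigma_t(\mathbf{r})\,\psi_m(\mathbf{r}) = \sum_{l=0}^{L} \frac{2l+1}{4\pi}\,\Sigma_{s,l}(\mathbf{r}) \sum_{m'} w_{m'}\, P_l(\Omega_m \cdot \Omega_{m'})\, \psi_{m'}(\mathbf{r}) + q_m(\mathbf{r})

This is a textbook schematic form, not Denovo's exact multigroup implementation.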
Characteristic of EBT-XD and EBT3 radiochromic film dosimetry for photon and proton beams
NASA Astrophysics Data System (ADS)
Khachonkham, Suphalak; Dreindl, Ralf; Heilemann, Gerd; Lechner, Wolfgang; Fuchs, Hermann; Palmans, Hugo; Georg, Dietmar; Kuess, Peter
2018-03-01
Recently, a new type of radiochromic film, the EBT-XD film, has been introduced for high-dose radiotherapy. The EBT-XD film has the same structure as the EBT3 film but a slightly different composition and a thinner active layer. This study benchmarks the EBT-XD against the EBT3 film for 6 MV and 10 MV photon beams, as well as for 97.4 MeV and 148.2 MeV proton beams and 15-100 kV x-rays. Dosimetric and film reading characteristics, such as post-irradiation darkening, film orientation effect, lateral response artifact (LRA), film sensitivity, and energy and beam quality dependency, were investigated. Furthermore, quenching effects in the Bragg peak were investigated for a single proton beam energy for both film types; in addition, measurements were performed in a spread-out Bragg peak. EBT-XD films showed the same film darkening characteristic as EBT3. The difference between portrait and landscape orientation was reduced by 3.1% (in pixel value) for EBT-XD compared to EBT3 at a dose of 2000 cGy. The LRA is reduced for EBT-XD films for all investigated dose ranges. The sensitivity of EBT-XD films is superior to EBT3 for doses higher than 500 cGy. In addition, EBT-XD showed a similar dosimetric response for photon and proton irradiation with low energy and beam quality dependency. A quenching effect of 10% was found for both film types. The slightly thinner active layer and different composition of EBT-XD resulted in a reduced film orientation effect and LRA, as well as a sensitivity increase in high-dose regions for both photon and proton beams. Overall, the EBT-XD film shows improved film reading characteristics and advantages in the high-dose region for photon and proton beams.
An Update of Recent Phits Code
NASA Astrophysics Data System (ADS)
Sihver, Lembit; Sato, Tatsuhiko; Niita, Koji; Iwase, Hiroshi; Iwamoto, Yosuke; Matsuda, Norihiro; Nakashima, Hiroshi; Sakamoto, Yukio; Gustafsson, Katarina; Mancusi, Davide
We will first present the current status of the General-Purpose Particle and Heavy-Ion Transport code System (PHITS). In particular, we will describe benchmarking of calculated cross sections against measurements; we will introduce a relativistically covariant version of JQMD, called R-JQMD, that features an improved ground-state initialization algorithm, and we will show heavy-ion charge-changing cross sections simulated with R-JQMD and compare them to experimental data and to results predicted by the JQMD model. We will also show calculations of dose received by aircrews and personnel in space from cosmic radiation. In recent years, many countries have issued regulations or recommendations to set annual dose limitations for aircrews. Since estimation of cosmic-ray spectra in the atmosphere is an essential issue for the evaluation of aviation doses, we have calculated these spectra using PHITS. The accuracy of the simulation, which has been well verified by experimental data taken under various conditions, will be presented together with software called EXPACS-V, which can visualize the cosmic-ray dose rates at ground level or at a certain altitude on the map of Google Earth, using the PHITS-based Analytical Radiation Model in the Atmosphere (PARMA). PARMA can instantaneously calculate the cosmic-ray spectra anywhere in the world by specifying the atmospheric depth, the vertical cut-off rigidity and the force-field potential. For the purpose of examining the applicability of PHITS to shielding design in space, the absorbed doses in a tissue-equivalent water phantom inside an imaginary space vessel have been estimated for different shielding materials of different thicknesses. The results confirm previous results which indicate that PHITS is a suitable tool when performing shielding design studies of spacecraft. Finally, we have used PHITS for calculations of depth-dose distributions in MATROSHKA, an ESA project dedicated to determining the radiation load on astronauts within and outside the International Space Station (ISS).
A Signal-to-Noise Crossover Dose as the Point of Departure for Health Risk Assessment
Portier, Christopher J.; Krewski, Daniel
2011-01-01
Background: The U.S. National Toxicology Program (NTP) cancer bioassay database provides an opportunity to compare both existing and new approaches to determining points of departure (PoDs) for establishing reference doses (RfDs). Objectives: The aims of this study were a) to investigate the risk associated with the traditional PoD used in human health risk assessment [the no observed adverse effect level (NOAEL)]; b) to present a new approach based on the signal-to-noise crossover dose (SNCD); and c) to compare the SNCD and SNCD-based RfD with PoDs and RfDs based on the NOAEL and benchmark dose (BMD) approaches. Methods: The complete NTP database was used as the basis for these analyses, which were performed using the Hill model. We determined NOAELs and estimated corresponding extra risks. Lower 95% confidence bounds on the BMD (BMDLs) corresponding to extra risks of 1%, 5%, and 10% (BMDL01, BMDL05, and BMDL10, respectively) were also estimated. We introduce the SNCD as a new PoD, defined as the dose where the additional risk is equal to the “background noise” (the difference between the upper and lower bounds of the two-sided 90% confidence interval on absolute risk) or a specified fraction thereof. Results: The median risk at the NOAEL was approximately 10%, and the default uncertainty factor (UF = 100) was considered most applicable to the BMDL10. Therefore, we chose a target risk of 1/1,000 (0.1/100) to derive an SNCD-based RfD by linear extrapolation. At the median, this approach provided the same RfD as the BMDL10 divided by the default UF. Conclusions: Under a standard BMD approach, the BMDL10 is considered to be the most appropriate PoD. The SNCD approach, which is based on the lowest dose at which the signal can be reliably detected, warrants further development as a PoD for human health risk assessment. PMID:21813365
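To make the point-of-departure machinery concrete, the sketch below solves for a central-estimate BMD at 10% extra risk under a quantal Hill-type model with invented parameters. The paper's BMDLs are lower confidence bounds on this quantity, and the SNCD adds the signal-to-noise criterion; neither is reproduced here.

```python
# Central-estimate BMD for 10% extra risk under a quantal Hill-type model.
# All parameter values are illustrative, not fitted NTP values.

def hill_risk(d, background=0.05, vmax=0.8, k=50.0, n=2.0):
    """Probability of response at dose d (quantal Hill model)."""
    return background + (vmax - background) * d**n / (k**n + d**n)

def extra_risk(d):
    p0 = hill_risk(0.0)
    return (hill_risk(d) - p0) / (1.0 - p0)

def bmd(target=0.10, lo=0.0, hi=1000.0, tol=1e-8):
    """Bisection for the dose where extra risk equals the benchmark response."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if extra_risk(mid) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

print(f"BMD10 ~ {bmd(0.10):.2f} (same dose units as k)")  # ~19 for these parameters
```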
DOE Office of Scientific and Technical Information (OSTI.GOV)
Egan, A; Laub, W
2014-06-15
Purpose: Several shortcomings of the current implementation of the analytic anisotropic algorithm (AAA) may lead to dose calculation errors in highly modulated treatments delivered to highly heterogeneous geometries. Here we introduce a set of dosimetric error predictors that can be applied to a clinical treatment plan and patient geometry in order to identify high-risk plans. Once a problematic plan is identified, the treatment can be recalculated with a more accurate algorithm in order to better assess its viability. Methods: Here we focus on three distinct sources of dosimetric error in the AAA algorithm. First, due to a combination of discrepancies in small-field beam modeling as well as volume averaging effects, dose calculated through small MLC apertures can be underestimated, while that behind small MLC blocks can be overestimated. Second, due to the rectilinear scaling of the Monte Carlo generated pencil beam kernel, energy is not properly transported through heterogeneities near, but not impeding, the central axis of the beamlet. And third, AAA overestimates dose in regions of very low density (< 0.2 g/cm³). We have developed an algorithm to detect the location and magnitude of each scenario within the patient geometry, namely the field-size index (FSI), the heterogeneous scatter index (HSI), and the low-density index (LDI), respectively. Results: The error indices successfully identify deviations between AAA and Monte Carlo dose distributions in simple phantom geometries. The algorithms are currently implemented in the MATLAB computing environment and are able to run on a typical RapidArc head and neck geometry in less than an hour. Conclusion: Because these error indices successfully identify each type of error in contrived cases, with sufficient benchmarking, this method can be developed into a clinical tool that may be able to help estimate AAA dose calculation errors and indicate when it might be advisable to use Monte Carlo calculations.
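Of the three predictors, the low-density criterion is the simplest to make concrete: flag voxels whose mass density falls below the 0.2 g/cm³ threshold cited above. The sketch below (in Python rather than the authors' MATLAB environment) uses an invented density grid, and it does not reproduce the authors' actual FSI, HSI, or LDI definitions.

```python
# Flag low-density voxels where AAA is reported to overestimate dose.
# The density grid is an invented stand-in for CT-derived densities.
import numpy as np

rng = np.random.default_rng(1)
density = rng.uniform(0.05, 1.2, size=(40, 40, 20))   # g/cm^3, toy grid

LOW_DENSITY_G_PER_CM3 = 0.2
low_density_mask = density < LOW_DENSITY_G_PER_CM3

print(f"{low_density_mask.mean():.1%} of voxels flagged as low-density")
```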
Bayesian Dose-Response Modeling in Sparse Data
NASA Astrophysics Data System (ADS)
Kim, Steven B.
This book discusses Bayesian dose-response modeling in small samples applied to two different settings. The first setting is early phase clinical trials, and the second setting is toxicology studies in cancer risk assessment. In early phase clinical trials, experimental units are humans who are actual patients. Prior to a clinical trial, opinions from multiple subject area experts are generally more informative than the opinion of a single expert, but we may face a dilemma when they have disagreeing prior opinions. In this regard, we consider compromising the disagreement and compare two different approaches for making a decision. In addition to combining multiple opinions, we also address balancing two levels of ethics in early phase clinical trials. The first level is individual-level ethics, which reflects the perspective of trial participants. The second level is population-level ethics, which reflects the perspective of future patients. We extensively compare two existing statistical methods which focus on each perspective and propose a new method which balances the two conflicting perspectives. In toxicology studies, experimental units are living animals. Here we focus on a potential non-monotonic dose-response relationship which is known as hormesis. Briefly, hormesis is a phenomenon which can be characterized by a beneficial effect at low doses and a harmful effect at high doses. In cancer risk assessments, the estimation of a parameter, which is known as a benchmark dose, can be highly sensitive to a class of assumptions, monotonicity or hormesis. In this regard, we propose a robust approach which considers both monotonicity and hormesis as a possibility. In addition, we discuss statistical hypothesis testing for hormesis and consider various experimental designs for detecting hormesis based on Bayesian decision theory. Past experiments have not been optimally designed for testing for hormesis, and some Bayesian optimal designs may not be optimal under a wrong parametric assumption. In this regard, we consider a robust experimental design which does not require any parametric assumption.
Dosimetric comparison of peripheral NSCLC SBRT using Acuros XB and AAA calculation algorithms.
Ong, Chloe C H; Ang, Khong Wei; Soh, Roger C X; Tin, Kah Ming; Yap, Jerome H H; Lee, James C L; Bragg, Christopher M
2017-01-01
There is a concern for dose calculation in highly heterogeneous environments such as the thorax region. This study compares the quality of treatment plans for peripheral non-small cell lung cancer (NSCLC) stereotactic body radiation therapy (SBRT) using 2 calculation algorithms, namely, the Eclipse Anisotropic Analytical Algorithm (AAA) and Acuros External Beam (AXB), for 3-dimensional conformal radiation therapy (3DCRT) and volumetric-modulated arc therapy (VMAT). Four-dimensional computed tomography (4DCT) data from 20 anonymized patients were studied using the Varian Eclipse planning system, AXB, and AAA version 10.0.28. A 3DCRT plan and a VMAT plan were generated using AAA and AXB with constant plan parameters for each patient. The prescription and dose constraints were benchmarked against the Radiation Therapy Oncology Group (RTOG) 0915 protocol. Planning parameters were compared statistically using Mann-Whitney U tests. Results showed that 3DCRT and VMAT plans have a lower target coverage, by up to 8%, when calculated using AXB as compared with AAA. The conformity index (CI) for AXB plans was 4.7% lower than for AAA plans, but was closer to unity, which indicated better target conformity. AXB produced plans with global maximum doses that were, on average, 2% hotter than AAA plans. Both 3DCRT and VMAT plans were able to achieve D95%. VMAT plans were shown to be more conformal (CI = 1.01) and were at least 3.2% and 1.5% lower in terms of PTV maximum and mean dose, respectively. There was no statistically significant difference in doses received by organs at risk (OARs) regardless of calculation algorithm or treatment technique. In general, the difference in tissue modeling between AXB and AAA is responsible for the differences in dose distribution between the two algorithms. AXB VMAT plans could be used to benefit patients receiving peripheral NSCLC SBRT. Copyright © 2017 American Association of Medical Dosimetrists. Published by Elsevier Inc. All rights reserved.
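The conformity index used in such comparisons is commonly computed as the volume enclosed by the prescription isodose divided by the target volume, so values above unity indicate spill beyond the target and values below unity indicate under-coverage. A minimal sketch of that computation on an invented dose grid and PTV mask follows; this is one common (RTOG-style) definition, not necessarily the exact formula used in this study.

```python
# Conformity index sketch: CI = V(dose >= prescription) / V(PTV).
# Dose grid, voxel size, prescription, and PTV mask are invented.
import numpy as np

voxel_volume_cc = 0.2 ** 3    # 2 mm isotropic voxels
dose = np.random.default_rng(2).uniform(0.0, 60.0, size=(50, 50, 50))  # Gy
ptv_mask = np.zeros_like(dose, dtype=bool)
ptv_mask[20:30, 20:30, 20:30] = True
prescription_gy = 48.0

v_ri = (dose >= prescription_gy).sum() * voxel_volume_cc  # prescription isodose volume
tv = ptv_mask.sum() * voxel_volume_cc                     # target volume
print(f"CI = {v_ri / tv:.2f}")
```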
A method for modeling laterally asymmetric proton beamlets resulting from collimation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gelover, Edgar; Wang, Dongxu; Flynn, Ryan T.
2015-03-15
Purpose: To introduce a method to model the 3D dose distribution of laterally asymmetric proton beamlets resulting from collimation. The model enables rapid beamlet calculation for spot scanning (SS) delivery using a novel penumbra-reducing dynamic collimation system (DCS) with two pairs of trimmers oriented perpendicular to each other. Methods: Trimmed beamlet dose distributions in water were simulated with MCNPX and the collimating effects noted in the simulations were validated by experimental measurement. The simulated beamlets were modeled analytically using integral depth dose curves along with an asymmetric Gaussian function to represent fluence in the beam's eye view (BEV). The BEV parameters consisted of Gaussian standard deviations (sigmas) along each primary axis (σx1, σx2, σy1, σy2) together with the spatial location of the maximum dose (μx, μy). Percent depth dose variation with trimmer position was accounted for with a depth-dependent correction function. Beamlet growth with depth was accounted for by combining the in-air divergence with Hong's fit of the Highland approximation along each axis in the BEV. Results: The beamlet model showed excellent agreement with the Monte Carlo simulation data used as a benchmark. The overall passing rate for a 3D gamma test with 3%/3 mm passing criteria was 96.1% between the analytical model and Monte Carlo data in an example treatment plan. Conclusions: The analytical model is capable of accurately representing individual asymmetric beamlets resulting from use of the DCS. This method enables integration of the DCS into a treatment planning system to perform dose computation in patient datasets. The method could be generalized for use with any SS collimation system in which blades, leaves, or trimmers are used to laterally sharpen beamlets.
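The asymmetric Gaussian fluence model is simple to state: along each BEV axis, one sigma applies below the position of maximum dose and a different sigma applies above it. A minimal one-axis sketch with invented parameter values:

```python
# Two-sided (asymmetric) Gaussian fluence along one beam's-eye-view axis:
# sigma_left applies for x < mu, sigma_right for x >= mu. Values invented.
import numpy as np

def asymmetric_gaussian(x, mu, sigma_left, sigma_right):
    sigma = np.where(x < mu, sigma_left, sigma_right)
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2)

x = np.linspace(-15.0, 15.0, 301)    # mm in the beam's eye view
fluence = asymmetric_gaussian(x, mu=1.0, sigma_left=2.5, sigma_right=4.0)

half = fluence >= 0.5
print(f"half-maximum extent: {x[half][0]:.1f} mm to {x[half][-1]:.1f} mm")
# The trimmed side (smaller sigma) gives the sharper penumbra.
```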
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williamson, Jeffrey F.
This paper briefly reviews the evolution of brachytherapy dosimetry from 1900 to the present. Dosimetric practices in brachytherapy fall into three distinct eras. During the era of biological dosimetry (1900-1938), radium pioneers could only specify Ra-226 and Rn-222 implants in terms of the mass of radium encapsulated within the implanted sources. Due to the high energy of its emitted gamma rays and the long range of its secondary electrons in air, free-air chambers could not be used to quantify the output of Ra-226 sources in terms of exposure. Biological dosimetry, most prominently the threshold erythema dose, gained currency as a means of intercomparing radium treatments with exposure-calibrated orthovoltage x-ray units. The classical dosimetry era (1940-1980) began with successful exposure standardization of Ra-226 sources by Bragg-Gray cavity chambers. Classical dose-computation algorithms, based upon 1-D buildup factor measurements and point-source superposition computational algorithms, were able to accommodate artificial radionuclides such as Co-60, Ir-192, and Cs-137. The quantitative dosimetry era (1980- ) arose in response to the increasing utilization of low-energy K-capture radionuclides such as I-125 and Pd-103, for which classical approaches could not be expected to estimate correct doses. This led to intensive development of both experimental (largely TLD-100 dosimetry) and Monte Carlo dosimetry techniques, along with more accurate air-kerma strength standards. As a result of extensive benchmarking and intercomparison of these different methods, single-seed low-energy radionuclide dose distributions are now known with a total uncertainty of 3%-5%.
SU-E-T-493: Accelerated Monte Carlo Methods for Photon Dosimetry Using a Dual-GPU System and CUDA.
Liu, T; Ding, A; Xu, X
2012-06-01
To develop a Graphics Processing Unit (GPU) based Monte Carlo (MC) code that accelerates dose calculations on a dual-GPU system. We simulated a clinical case of prostate cancer treatment. A voxelized abdomen phantom derived from 120 CT slices was used, containing 218×126×60 voxels, and a GE LightSpeed 16-MDCT scanner was modeled. A CPU version of the MC code was first developed in C++ and tested on an Intel Xeon X5660 2.8 GHz CPU; it was then translated into a GPU version using CUDA C 4.1 and run on a dual Tesla M2090 GPU system. The code featured automatic assignment of simulation tasks to multiple GPUs, as well as accurate calculation of energy- and material-dependent cross sections. Double-precision floating point format was used for accuracy. Doses to the rectum, prostate, bladder and femoral heads were calculated. When running on a single GPU, the MC GPU code was found to be 19 times faster than the CPU code and 42 times faster than MCNPX. These speedup factors were doubled on the dual-GPU system. The dose results were benchmarked against MCNPX and a maximum difference of 1% was observed when the relative error was kept below 0.1%. A GPU-based MC code was developed for dose calculations using detailed patient and CT scanner models. Efficiency and accuracy were both guaranteed in this code. Scalability of the code was confirmed on the dual-GPU system. © 2012 American Association of Physicists in Medicine.
Kavlock, R J
1997-01-01
During the last several years, significant changes in the risk assessment process for developmental toxicity of environmental contaminants have begun to emerge. The first of these changes is the development and beginning use of statistically based dose-response models [the benchmark dose (BMD) approach] that better utilize data derived from existing testing approaches. Accompanying this change is the greater emphasis placed on understanding and using mechanistic information to yield more accurate, reliable, and less uncertain risk assessments. The next stage in the evolution of risk assessment will be the use of biologically based dose-response (BBDR) models that begin to build into the statistically based models factors related to the underlying kinetic, biochemical, and/or physiologic processes perturbed by a toxicant. Such models are now emerging from several research laboratories. The introduction of quantitative models and the incorporation of biologic information into them has pointed to the need for even more sophisticated modifications for which we offer the term embryologically based dose-response (EBDR) models. Because these models would be based upon the understanding of normal morphogenesis, they represent a quantum leap in our thinking, but their complexity presents daunting challenges both to the developmental biologist and the developmental toxicologist. Implementation of these models will require extensive communication between developmental toxicologists, molecular embryologists, and biomathematicians. The remarkable progress in the understanding of mammalian embryonic development at the molecular level that has occurred over the last decade combined with advances in computing power and computational models should eventually enable these as yet hypothetical models to be brought into use.