Sample records for computational exposure model

  1. Operation of the computer model for microenvironment atomic oxygen exposure

    NASA Technical Reports Server (NTRS)

    Bourassa, R. J.; Gillis, J. R.; Gruenbaum, P. E.

    1995-01-01

    A computer model for microenvironment atomic oxygen exposure has been developed to extend atomic oxygen modeling capability to include shadowing and reflections. The model uses average exposure conditions established by the direct exposure model and extends the application of these conditions to treat surfaces of arbitrary shape and orientation.

  2. Predicting Forearm Physical Exposures During Computer Work Using Self-Reports, Software-Recorded Computer Usage Patterns, and Anthropometric and Workstation Measurements.

    PubMed

    Huysmans, Maaike A; Eijckelhof, Belinda H W; Garza, Jennifer L Bruno; Coenen, Pieter; Blatter, Birgitte M; Johnson, Peter W; van Dieën, Jaap H; van der Beek, Allard J; Dennerlein, Jack T

    2017-12-15

    Alternative techniques to assess physical exposures, such as prediction models, could facilitate more efficient epidemiological assessments in future large cohort studies examining physical exposures in relation to work-related musculoskeletal symptoms. The aim of this study was to evaluate two types of models that predict arm-wrist-hand physical exposures (i.e. muscle activity, wrist postures and kinematics, and keyboard and mouse forces) during computer use, which only differed with respect to the candidate predicting variables: (i) a full set of predicting variables, including self-reported factors, software-recorded computer usage patterns, and worksite measurements of anthropometrics and workstation set-up (full models); and (ii) a practical set of predicting variables, only including the self-reported factors and software-recorded computer usage patterns, that are relatively easy to assess (practical models). Prediction models were built using data from a field study among 117 office workers who were symptom-free at the time of measurement. Arm-wrist-hand physical exposures were measured for approximately two hours while workers performed their own computer work. Each worker's anthropometry and workstation set-up were measured by an experimenter, computer usage patterns were recorded using software, and self-reported factors (including individual factors, job characteristics, computer work behaviours, psychosocial factors, workstation set-up characteristics, and leisure-time activities) were collected by an online questionnaire. We determined the predictive quality of the models in terms of R2 and root mean squared (RMS) values and exposure classification agreement to low-, medium-, and high-exposure categories (in the practical models only). The full models had R2 values that ranged from 0.16 to 0.80, whereas for the practical models values ranged from 0.05 to 0.43. Interquartile ranges were similar for the two models, indicating that the full models performed better only for some physical exposures. Relative RMS errors ranged between 5% and 19% for the full models, and between 10% and 19% for the practical models. When the predicted physical exposures were classified into low, medium, and high, classification agreement ranged from 26% to 71%. The full prediction models, based on self-reported factors, software-recorded computer usage patterns, and additional measurements of anthropometrics and workstation set-up, showed better predictive quality than the practical models based on self-reported factors and recorded computer usage patterns only. However, predictive quality varied considerably across different arm-wrist-hand exposure parameters. Future exploration of the relation between predicted physical exposure and symptoms is therefore only recommended for physical exposures that can be reasonably well predicted. © The Author 2017. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.
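
The predictive-quality metrics used in the study above, R2 and relative RMS error, are standard and can be sketched in a few lines; the function names and example data below are illustrative, not taken from the study:

```python
import math

def r_squared(observed, predicted):
    """Coefficient of determination: fraction of exposure variance explained."""
    mean_obs = sum(observed) / len(observed)
    ss_res = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - ss_res / ss_tot

def relative_rms_error(observed, predicted):
    """RMS prediction error expressed as a fraction of the mean observed exposure."""
    mse = sum((o - p) ** 2 for o, p in zip(observed, predicted)) / len(observed)
    return math.sqrt(mse) / (sum(observed) / len(observed))
```

A model whose predictions sit within a few percent of the mean observed exposure would score near the 5% lower bound of relative RMS error reported for the full models.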

  3. Operation of the computer model for microenvironment solar exposure

    NASA Technical Reports Server (NTRS)

    Gillis, J. R.; Bourassa, R. J.; Gruenbaum, P. E.

    1995-01-01

    A computer model for microenvironmental solar exposure was developed to predict solar exposure to satellite surfaces which may shadow or reflect on one another. This document describes the technical features of the model as well as instructions for the installation and use of the program.

  4. Children, computer exposure and musculoskeletal outcomes: the development of pathway models for school and home computer-related musculoskeletal outcomes.

    PubMed

    Harris, Courtenay; Straker, Leon; Pollock, Clare; Smith, Anne

    2015-01-01

    Children's computer use is rapidly growing, together with reports of related musculoskeletal outcomes. Models and theories of adult-related risk factors demonstrate multivariate risk factors associated with computer use. Children's use of computers differs from adults' computer use at work. This study developed and tested a child-specific model demonstrating multivariate relationships between musculoskeletal outcomes, computer exposure and child factors. Using pathway modelling, factors such as gender, age, television exposure, computer anxiety, sustained attention (flow), socio-economic status and somatic complaints (headache and stomach pain) were found to have effects on children's reports of musculoskeletal symptoms. The potential for children's computer exposure to follow a dose-response relationship was also evident. Developing a child-related model can assist in understanding risk factors for children's computer use and support the development of recommendations to encourage children to use this valuable resource in educational, recreational and communication environments in a safe and productive manner. Computer use is an important part of children's school and home life. Application of this developed model, which encapsulates related risk factors, enables practitioners, researchers, teachers and parents to develop strategies that assist young people to use information technology for school, home and leisure in a safe and productive manner.

  5. Computer modelling as a tool for the exposure assessment of operators using faulty agricultural pesticide spraying equipment.

    PubMed

    Bańkowski, Robert; Wiadrowska, Bozena; Beresińska, Martyna; Ludwicki, Jan K; Noworyta-Głowacka, Justyna; Godyń, Artur; Doruchowski, Grzegorz; Hołownicki, Ryszard

    2013-01-01

    Faulty but still operating agricultural pesticide sprayers may pose an unacceptable health risk for operators. The computerized models designed to calculate exposure and risk for pesticide sprayers used as an aid in the evaluation and further authorisation of plant protection products may be applied also to assess a health risk for operators when faulty sprayers are used. To evaluate the impact of different exposure scenarios on the health risk for the operators using faulty agricultural spraying equipment by means of computer modelling. The exposure modelling was performed for 15 pesticides (5 insecticides, 7 fungicides and 3 herbicides). The critical parameter, i.e. toxicological end-point, on which the risk assessment was based was the no observable adverse effect level (NOAEL). This enabled risk to be estimated under various exposure conditions such as pesticide concentration in the plant protection product and type of the sprayed crop as well as the number of treatments. Computer modelling was based on the UK POEM model including determination of the acceptable operator exposure level (AOEL). Thus the degree of operator exposure could be defined during pesticide treatment whether or not personal protection equipment had been employed by individuals. Data used for computer modelling was obtained from simulated, pesticide substitute treatments using variously damaged knapsack sprayers. These substitute preparations consisted of markers that allowed computer simulations to be made, analogous to real-life exposure situations, in a dose dependent fashion. Exposures were estimated according to operator dosimetry exposure under 'field' conditions for low level, medium and high target field crops. The exposure modelling in the high target field crops demonstrated exceedance of the AOEL in all simulated treatment cases (100%) using damaged sprayers irrespective of the type of damage or if individual protective measures had been adopted or not. 
For low level and medium field crops, exceedances occurred in 40-80% of cases. Computer modelling may be considered a practical tool for hazard assessment when faulty agricultural sprayers are used. It may also be applied to programming the quality checks and maintenance systems for this equipment.

  6. ADDRESSING HUMAN EXPOSURE TO AIR POLLUTANTS AROUND BUILDINGS IN URBAN AREAS WITH COMPUTATIONAL FLUID DYNAMICS (CFD) MODELS

    EPA Science Inventory

    Computational Fluid Dynamics (CFD) simulations provide a number of unique opportunities for expanding and improving capabilities for modeling exposures to environmental pollutants. The US Environmental Protection Agency's National Exposure Research Laboratory (NERL) has been c...

  7. Chemical Computer Man: Chemical Agent Response Simulation (CARS). Technical report, January 1983-September 1985

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, E.G.; Mioduszewski, R.J.

    The Chemical Computer Man: Chemical Agent Response Simulation (CARS) is a computer model and simulation program for estimating the dynamic changes in human physiological dysfunction resulting from exposures to chemical-threat nerve agents. The newly developed CARS methodology simulates agent exposure effects on the following five indices of human physiological function: mental, vision, cardio-respiratory, visceral, and limbs. Mathematical models and the application of basic pharmacokinetic principles were incorporated into the simulation so that, for each chemical exposure, the relationship between exposure dosage, absorbed dosage (agent blood plasma concentration), and level of physiological response is computed as a function of time. CARS, as a simulation tool, is designed for users with little or no computer-related experience. The model combines maximum flexibility with a comprehensive, user-friendly, interactive menu-driven system. Users define an exposure problem and obtain immediate results displayed in tabular, graphical, and image formats. CARS has broad scientific and engineering applications, not only in technology for the soldier in the area of Chemical Defense, but also in minimizing animal testing in biomedical and toxicological research and the development of a modeling system for human exposure to hazardous-waste chemicals.
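
The pharmacokinetic relationship described above, from exposure dose to blood plasma concentration over time, is commonly captured by a one-compartment model with first-order absorption and elimination (the Bateman equation). A minimal sketch; the parameter values are purely illustrative and not taken from CARS:

```python
import math

def plasma_concentration(t, dose, ka, ke, volume):
    """One-compartment Bateman model: concentration at time t after a single
    absorbed dose, with absorption rate ka and elimination rate ke (ka != ke)."""
    return (dose * ka) / (volume * (ka - ke)) * (math.exp(-ke * t) - math.exp(-ka * t))

# Concentration rises, peaks at t_max = ln(ka/ke) / (ka - ke), then declines.
t_max = math.log(1.0 / 0.1) / (1.0 - 0.1)
```

In a response simulation like CARS, curves of this shape would then be mapped onto dysfunction levels for each physiological index as a function of time.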

  8. ADDRESSING HUMAN EXPOSURES TO AIR POLLUTANTS AROUND BUILDINGS IN URBAN AREAS WITH COMPUTATIONAL FLUID DYNAMICS MODELS

    EPA Science Inventory

    This paper discusses the status and application of Computational Fluid Dynamics (CFD) models to address challenges for modeling human exposures to air pollutants around urban building microenvironments. There are challenges for more detailed understanding of air pollutant sour...

  9. ASSESSING A COMPUTER MODEL FOR PREDICTING HUMAN EXPOSURE TO PM2.5

    EPA Science Inventory

    This paper compares outputs of a model for predicting PM2.5 exposure with experimental data obtained from exposure studies of selected subpopulations. The exposure model is built on a WWW platform called pCNEM, "A PC Version of pNEM." Exposure models created by pCNEM are sim...

  10. Computational Toxicology

    EPA Science Inventory

    ‘Computational toxicology’ is a broad term that encompasses all manner of computer-facilitated informatics, data-mining, and modeling endeavors in relation to toxicology, including exposure modeling, physiologically based pharmacokinetic (PBPK) modeling, dose-response modeling, ...

  11. Development and application of air quality models at the US ...

    EPA Pesticide Factsheets

    Overview of the development and application of air quality models at the U.S. EPA, particularly focused on the development and application of the Community Multiscale Air Quality (CMAQ) model developed within the Computational Exposure Division (CED) of the National Exposure Research Laboratory (NERL). This presentation will provide a simple overview of air quality model development and application geared toward a non-technical student audience. The NERL Computational Exposure Division develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.

  12. Computational Exposure Science: An Emerging Discipline to ...

    EPA Pesticide Factsheets

    Background: Computational exposure science represents a frontier of environmental science that is emerging and quickly evolving. Objectives: In this commentary, we define this burgeoning discipline, describe a framework for implementation, and review some key ongoing research elements that are advancing the science with respect to exposure to chemicals in consumer products. Discussion: The fundamental elements of computational exposure science include the development of reliable, computationally efficient predictive exposure models; the identification, acquisition, and application of data to support and evaluate these models; and the generation of improved methods for extrapolating across chemicals. We describe our efforts in each of these areas and provide examples that demonstrate both progress and potential. Conclusions: Computational exposure science, linked with comparable efforts in toxicology, is ushering in a new era of risk assessment that greatly expands our ability to evaluate chemical safety and sustainability and to protect public health. The National Exposure Research Laboratory's (NERL's) Human Exposure and Atmospheric Sciences Division (HEASD) conducts research in support of EPA's mission to protect human health and the environment. HEASD's research program supports Goal 1 (Clean Air) and Goal 4 (Healthy People) of EPA's strategic plan. More specifically, our division conducts research to characterize the movement of pollutants from the source

  13. Exposure Science and the US EPA National Center for Computational Toxicology

    EPA Science Inventory

    The emerging field of computational toxicology applies mathematical and computer models and molecular biological and chemical approaches to explore both qualitative and quantitative relationships between sources of environmental pollutant exposure and adverse health outcomes. The...

  14. DEVELOPMENT OF A DIETARY EXPOSURE POTENTIAL MODEL FOR EVALUATING DIETARY EXPOSURE TO CHEMICAL RESIDUES IN FOOD

    EPA Science Inventory

    The Dietary Exposure Potential Model (DEPM) is a computer-based model developed for estimating dietary exposure to chemical residues in food. The DEPM is based on food consumption data from the 1987-1988 Nationwide Food Consumption Survey (NFCS) administered by the United States ...

  15. EXPOSURE ANALYSIS MODELING SYSTEM (EXAMS): USER MANUAL AND SYSTEM DOCUMENTATION

    EPA Science Inventory

    The Exposure Analysis Modeling System, first published in 1982 (EPA-600/3-82-023), provides interactive computer software for formulating aquatic ecosystem models and rapidly evaluating the fate, transport, and exposure concentrations of synthetic organic chemicals - pesticides, ...

  16. NEXT GENERATION MULTIMEDIA/MULTIPATHWAY EXPOSURE MODELING

    EPA Science Inventory

    The Stochastic Human Exposure and Dose Simulation model for pesticides (SHEDS-Pesticides) supports the efforts of EPA to better understand human exposures and doses to multimedia, multipathway pollutants. It is a physically-based, probabilistic computer model that predicts, for u...

  17. OVERVIEW OF THE U.S. EPA NERL'S HUMAN EXPOSURE MODELING

    EPA Science Inventory

    Computational modeling of human exposure to environmental pollutants is one of the primary activities of the US Environmental Protection Agency's National Exposure Research Laboratory (NERL). Assessment of human exposures is a critical part of the overall risk assessment para...

  18. Applying mathematical modeling to create job rotation schedules for minimizing occupational noise exposure.

    PubMed

    Tharmmaphornphilas, Wipawee; Green, Benjamin; Carnahan, Brian J; Norman, Bryan A

    2003-01-01

    This research developed worker schedules by using administrative controls and a mathematical programming model to reduce the likelihood of worker hearing loss. By rotating the workers through different jobs during the day it was possible to reduce their exposure to hazardous noise levels. Computer simulations were made based on data collected in a real setting. Worker schedules currently used at the site are compared with proposed worker schedules from the computer simulations. For the worker assignment plans found by the computer model, the authors calculated a significant decrease in time-weighted average (TWA) sound level exposure. The maximum daily dose that any worker is exposed to is reduced by 58.8%, and the maximum TWA value for the workers is reduced by 3.8 dB from the current schedule.
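
The dose and TWA figures above follow the standard OSHA 29 CFR 1910.95 relationships (90 dBA criterion level, 5 dB exchange rate), the usual basis for such rotation schedules; the schedule data below are illustrative:

```python
import math

def permissible_hours(level_dba):
    """Permissible exposure duration (hours) at a given A-weighted sound level."""
    return 8.0 / 2 ** ((level_dba - 90.0) / 5.0)

def daily_dose(segments):
    """Daily noise dose (%) for a schedule of (level_dBA, hours) job segments."""
    return 100.0 * sum(hours / permissible_hours(level) for level, hours in segments)

def twa(dose_percent):
    """Equivalent 8-hour time-weighted average sound level (dBA)."""
    return 90.0 + 16.61 * math.log10(dose_percent / 100.0)

# Rotating a worker out of the loud job for half the day lowers dose and TWA.
fixed = daily_dose([(95.0, 8.0)])                  # all day at 95 dBA -> 200%
rotated = daily_dose([(95.0, 4.0), (85.0, 4.0)])   # split schedule -> 125%
```

A rotation scheduler then searches over worker-to-job assignments to minimize the maximum such dose across the crew.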

  19. AN OVERVIEW OF HUMAN EXPOSURE MODELING ACTIVITIES AT THE U.S. EPA'S NATIONAL EXPOSURE RESEARCH LABORATORY

    EPA Science Inventory

    The computational modeling of human exposure to environmental pollutants is one of the primary activities of the US Environmental Protection Agency (USEPA)'s National Exposure Research Laboratory (NERL). Assessment of human exposures is a critical part of the overall risk assessm...

  20. Use of computer models to assess exposure to agricultural chemicals via drinking water.

    PubMed

    Gustafson, D I

    1995-10-27

    Surveys of drinking water quality throughout the agricultural regions of the world have revealed the tendency of certain crop protection chemicals to enter water supplies. Fortunately, the trace concentrations that have been detected are generally well below the levels thought to have any negative impact on human health or the environment. However, the public expects drinking water to be pristine and seems willing to bear the costs involved in further regulating agricultural chemical use in such a way as to eliminate the potential for such materials to occur at any detectable level. Of all the tools available to assess exposure to agricultural chemicals via drinking water, computer models are among the most cost-effective. Although not sufficiently predictive to be used in the absence of any field data, such computer programs can be used with some degree of certainty to perform quantitative extrapolations and thereby quantify regional exposure from field-scale monitoring information. Specific models and modeling techniques are discussed for performing such exposure analyses. Improvements in computer technology have recently made it practical to use Monte Carlo and other probabilistic techniques as routine tools for estimating human exposure. Such methods make it possible, at least in principle, to prepare exposure estimates with known confidence intervals and sufficient statistical validity to be used in the regulatory management of agricultural chemicals.
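
The Monte Carlo approach mentioned above amounts to repeatedly sampling the input distributions of an exposure equation and reading percentiles off the resulting distribution. A minimal sketch of a drinking-water intake estimate; every distribution and parameter value here is an illustrative assumption, not from the paper:

```python
import math
import random

random.seed(42)  # reproducible draws

def one_draw():
    """One Monte Carlo sample of daily intake (mg per kg body weight per day)."""
    conc_ug_per_l = random.lognormvariate(math.log(0.5), 0.4)   # residue in water
    water_l_per_day = max(random.normalvariate(2.0, 0.5), 0.0)  # consumption
    body_weight_kg = max(random.normalvariate(70.0, 10.0), 1.0)
    return conc_ug_per_l * water_l_per_day / body_weight_kg / 1000.0

draws = sorted(one_draw() for _ in range(10_000))
median_intake = draws[len(draws) // 2]
upper_95th = draws[int(0.95 * len(draws))]  # upper percentile used in screening
```

Reporting the median together with an upper percentile is what gives regulators the "known confidence intervals" the abstract refers to.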

  1. Computational modeling of temperature elevation and thermoregulatory response in the brains of anesthetized rats locally exposed at 1.5 GHz

    NASA Astrophysics Data System (ADS)

    Hirata, Akimasa; Masuda, Hiroshi; Kanai, Yuya; Asai, Ryuichi; Fujiwara, Osamu; Arima, Takuji; Kawai, Hiroki; Watanabe, Soichi; Lagroye, Isabelle; Veyret, Bernard

    2011-12-01

    The dominant effect of human exposure to microwaves is temperature elevation (the 'thermal effect'). In the safety guidelines/standards, the specific absorption rate averaged over a specific volume is used as a metric for human protection from localized exposure. Further investigation of the use of this metric is required, especially in terms of thermophysiology. The World Health Organization (2006 RF research agenda) has given high priority to research into the extent and consequences of microwave-induced temperature elevation in children. In this study, an electromagnetic-thermal computational code was developed to model electromagnetic power absorption and the resulting temperature elevation leading to changes in active blood flow in response to localized 1.457 GHz exposure in rat heads. Both juvenile (4 week old) and young adult (8 week old) rats were considered. The computational code was validated against measurements for 4 and 8 week old rats. Our computational results suggest that the blood flow rate depends on both brain and core temperature elevations. No significant difference was observed between thermophysiological responses in 4 and 8 week old rats under these exposure conditions. The computational model developed herein is thus applicable for setting exposure conditions for rats in laboratory investigations, as well as in planning treatment protocols in thermal therapy.

  2. Enduring Influence of Stereotypical Computer Science Role Models on Women's Academic Aspirations

    ERIC Educational Resources Information Center

    Cheryan, Sapna; Drury, Benjamin J.; Vichayapai, Marissa

    2013-01-01

    The current work examines whether a brief exposure to a computer science role model who fits stereotypes of computer scientists has a lasting influence on women's interest in the field. One-hundred undergraduate women who were not computer science majors met a female or male peer role model who embodied computer science stereotypes in appearance…

  3. Development and application of a complex numerical model and software for the computation of dose conversion factors for radon progenies.

    PubMed

    Farkas, Árpád; Balásházy, Imre

    2015-04-01

    A more exact determination of dose conversion factors associated with radon progeny inhalation has become possible due to advancements in epidemiological health risk estimates in recent years. The enhancement of computational power and the development of numerical techniques allow computing dose conversion factors with increasing reliability. The objective of this study was to develop an integrated model and software based on a self-developed airway deposition code, the authors' own bronchial dosimetry model, and the computational methods accepted by the International Commission on Radiological Protection (ICRP) to calculate dose conversion coefficients for different exposure conditions. The model was tested by its application for exposure and breathing conditions characteristic of mines and homes. The dose conversion factors were 8 and 16 mSv WLM(-1) for homes and mines when applying a stochastic deposition model combined with the ICRP dosimetry model (named PM-A model), and 9 and 17 mSv WLM(-1) when applying the same deposition model combined with the authors' bronchial dosimetry model and the ICRP bronchiolar and alveolar-interstitial dosimetry model (called PM-B model). User-friendly software for the computation of dose conversion factors has also been developed. The software allows one to compute conversion factors for a large range of exposure and breathing parameters and to perform sensitivity analyses. © The Author 2014. Published by Oxford University Press. All rights reserved.

  4. Evidence Theory Based Uncertainty Quantification in Radiological Risk due to Accidental Release of Radioactivity from a Nuclear Power Plant

    NASA Astrophysics Data System (ADS)

    Ingale, S. V.; Datta, D.

    2010-10-01

    The consequence of an accidental release of radioactivity from a nuclear power plant is assessed in terms of exposure or dose to members of the public. Assessment of risk is routed through this dose computation. Dose computation depends on the basic dose assessment model and the exposure pathways. One of the exposure pathways is the ingestion of contaminated food. The aim of the present paper is to compute the uncertainty associated with the risk to members of the public due to the ingestion of contaminated food. As the governing parameters of the ingestion dose assessment model are imprecise, we have applied evidence theory to compute bounds on the risk. The uncertainty is addressed by the belief and plausibility fuzzy measures.
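
The belief and plausibility measures invoked above come from Dempster-Shafer evidence theory: belief sums the mass of all focal sets contained in a hypothesis, plausibility the mass of all focal sets consistent with it. A minimal discrete sketch; the mass assignments are illustrative, not from the paper:

```python
def belief(masses, hypothesis):
    """Bel(A): total mass committed to focal sets wholly inside A (lower bound)."""
    return sum(m for focal, m in masses.items() if focal <= hypothesis)

def plausibility(masses, hypothesis):
    """Pl(A): total mass of focal sets intersecting A (upper bound)."""
    return sum(m for focal, m in masses.items() if focal & hypothesis)

# Imprecise evidence about a risk level; the last entry encodes pure ignorance.
masses = {
    frozenset({"low"}): 0.5,
    frozenset({"low", "medium"}): 0.3,
    frozenset({"low", "medium", "high"}): 0.2,
}
low = frozenset({"low"})
# The probability of "low" risk is bounded by the interval [Bel(low), Pl(low)].
```

The width of the [belief, plausibility] interval is exactly the "bound of the risk" the abstract describes.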

  5. MODEL DEVELOPMENT AND APPLICATION FOR ASSESSING HUMAN EXPOSURE AND DOSE TO TOXIC CHEMICALS AND POLLUTANTS

    EPA Science Inventory

    This project aims to strengthen the general scientific foundation of EPA's exposure and risk assessment processes by developing state-of-the-art exposure to dose computational models. This research will produce physiologically-based pharmacokinetic (PBPK) and pharmacodynamic (PD)...

  6. A Series of Molecular Dynamics and Homology Modeling Computer Labs for an Undergraduate Molecular Modeling Course

    ERIC Educational Resources Information Center

    Elmore, Donald E.; Guayasamin, Ryann C.; Kieffer, Madeleine E.

    2010-01-01

    As computational modeling plays an increasingly central role in biochemical research, it is important to provide students with exposure to common modeling methods in their undergraduate curriculum. This article describes a series of computer labs designed to introduce undergraduate students to energy minimization, molecular dynamics simulations,…

  7. Computational toxicology as implemented by the U.S. EPA: providing high throughput decision support tools for screening and assessing chemical exposure, hazard and risk.

    PubMed

    Kavlock, Robert; Dix, David

    2010-02-01

    Computational toxicology is the application of mathematical and computer models to help assess chemical hazards and risks to human health and the environment. Supported by advances in informatics, high-throughput screening (HTS) technologies, and systems biology, the U.S. Environmental Protection Agency (EPA) is developing robust and flexible computational tools that can be applied to the thousands of chemicals in commerce, and contaminant mixtures found in air, water, and hazardous-waste sites. The Office of Research and Development (ORD) Computational Toxicology Research Program (CTRP) is composed of three main elements. The largest component is the National Center for Computational Toxicology (NCCT), which was established in 2005 to coordinate research on chemical screening and prioritization, informatics, and systems modeling. The second element consists of related activities in the National Health and Environmental Effects Research Laboratory (NHEERL) and the National Exposure Research Laboratory (NERL). The third and final component consists of academic centers working on various aspects of computational toxicology and funded by the U.S. EPA Science to Achieve Results (STAR) program. Together these elements form the key components in the implementation of both the initial strategy, A Framework for a Computational Toxicology Research Program (U.S. EPA, 2003), and the newly released The U.S. Environmental Protection Agency's Strategic Plan for Evaluating the Toxicity of Chemicals (U.S. EPA, 2009a). Key intramural projects of the CTRP include digitizing legacy toxicity testing information into the toxicity reference database (ToxRefDB), predicting toxicity (ToxCast) and exposure (ExpoCast), and creating virtual liver (v-Liver) and virtual embryo (v-Embryo) systems models. U.S. EPA-funded STAR centers are also providing bioinformatics, computational toxicology data and models, and developmental toxicity data and models.
The models and underlying data are being made publicly available through the Aggregated Computational Toxicology Resource (ACToR), the Distributed Structure-Searchable Toxicity (DSSTox) Database Network, and other U.S. EPA websites. While initially focused on improving the hazard identification process, the CTRP is placing increasing emphasis on using high-throughput bioactivity profiling data in systems modeling to support quantitative risk assessments, and in developing complementary higher throughput exposure models. This integrated approach will enable analysis of life-stage susceptibility, and understanding of the exposures, pathways, and key events by which chemicals exert their toxicity in developing systems (e.g., endocrine-related pathways). The CTRP will be a critical component in next-generation risk assessments utilizing quantitative high-throughput data and providing a much higher capacity for assessing chemical toxicity than is currently available.

  8. A Data-Driven Framework for Incorporating New Tools for ...

    EPA Pesticide Factsheets

    This talk was given during the “Exposure-Based Toxicity Testing” session at the annual meeting of the International Society for Exposure Science. It provided an update on the state of the science and tools that may be employed in risk-based prioritization efforts. It outlined knowledge gained from the data provided using these high-throughput tools to assess chemical bioactivity and to predict chemical exposures and also identified future needs. It provided an opportunity to showcase ongoing research efforts within the National Exposure Research Laboratory and the National Center for Computational Toxicology within the Office of Research and Development to an international audience. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.

  9. IAQ MODEL FOR WINDOWS - RISK VERSION 1.0 USER MANUAL

    EPA Science Inventory

    The manual describes the use of the computer model, RISK, to calculate individual exposure to indoor air pollutants from sources. The model calculates exposure due to individual, as opposed to population, activity patterns and source use. The model also provides the capability to...

  10. Operation of the computer model for direct atomic oxygen exposure of Earth satellites

    NASA Technical Reports Server (NTRS)

    Bourassa, R. J.; Gruenbaum, P. E.; Gillis, J. R.; Hargraves, C. R.

    1995-01-01

    One of the primary causes of material degradation in low Earth orbit (LEO) is exposure to atomic oxygen. When oxygen atoms collide with an orbiting spacecraft, the relative velocity is 7 to 8 km/sec and the collision energy is 4 to 5 eV per atom. Under these conditions, atomic oxygen may initiate a number of chemical and physical reactions with exposed materials. These reactions contribute to material degradation, surface erosion, and contamination. Interpretation of these effects on materials and the design of space hardware to withstand on-orbit conditions require quantitative knowledge of the atomic oxygen exposure environment. Atomic oxygen flux is a function of orbit altitude, the orientation of the orbit plane to the Sun, solar and geomagnetic activity, and the angle between exposed surfaces and the spacecraft heading. We have developed a computer model to predict the atomic oxygen exposure of spacecraft in low Earth orbit. The application of this computer model is discussed.
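
The quoted 4 to 5 eV collision energy follows directly from the kinetic energy of one oxygen atom at the stated 7 to 8 km/s relative velocity; a quick consistency check using standard physical constants:

```python
ATOMIC_MASS_KG = 1.66054e-27        # one unified atomic mass unit, in kg
JOULES_PER_EV = 1.602177e-19
OXYGEN_MASS_KG = 15.999 * ATOMIC_MASS_KG

def collision_energy_ev(velocity_m_per_s):
    """Kinetic energy 0.5 * m * v**2 of a single O atom, in electron-volts."""
    return 0.5 * OXYGEN_MASS_KG * velocity_m_per_s ** 2 / JOULES_PER_EV

# 7 km/s gives about 4.1 eV and 8 km/s about 5.3 eV, matching the 4-5 eV figure.
```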

  11. Source-to-Outcome Microbial Exposure and Risk Modeling Framework

    EPA Science Inventory

    A Quantitative Microbial Risk Assessment (QMRA) is a computer-based data-delivery and modeling approach that integrates interdisciplinary fate/transport, exposure, and impact models and databases to characterize potential health impacts/risks due to pathogens. As such, a QMRA ex...

  12. LDEF microenvironments, observed and predicted

    NASA Astrophysics Data System (ADS)

    Bourassa, R. J.; Pippin, H. G.; Gillis, J. R.

    1993-04-01

    A computer model for prediction of atomic oxygen exposure of spacecraft in low earth orbit, referred to as the primary atomic oxygen model, was originally described at the First Long Duration Exposure Facility (LDEF) Post-Retrieval Symposium. The primary atomic oxygen model accounts for variations in orbit parameters, the condition of the atmosphere, and for the orientation of exposed surfaces relative to the direction of spacecraft motion. The use of the primary atomic oxygen model to define average atomic oxygen exposure conditions for a spacecraft is discussed and a second microenvironments computer model is described that accounts for shadowing and scattering of atomic oxygen by complex surface protrusions and indentations. Comparisons of observed and predicted erosion of fluorinated ethylene propylene (FEP) thermal control blankets using the models are presented. Experimental and theoretical results are in excellent agreement. Work is in progress to expand modeling capability to include ultraviolet radiation exposure and to obtain more detailed information on reflecting and scattering characteristics of material surfaces.

  13. LDEF microenvironments, observed and predicted

    NASA Technical Reports Server (NTRS)

    Bourassa, R. J.; Pippin, H. G.; Gillis, J. R.

    1993-01-01

    A computer model for prediction of atomic oxygen exposure of spacecraft in low earth orbit, referred to as the primary atomic oxygen model, was originally described at the First Long Duration Exposure Facility (LDEF) Post-Retrieval Symposium. The primary atomic oxygen model accounts for variations in orbit parameters, the condition of the atmosphere, and for the orientation of exposed surfaces relative to the direction of spacecraft motion. The use of the primary atomic oxygen model to define average atomic oxygen exposure conditions for a spacecraft is discussed and a second microenvironments computer model is described that accounts for shadowing and scattering of atomic oxygen by complex surface protrusions and indentations. Comparisons of observed and predicted erosion of fluorinated ethylene propylene (FEP) thermal control blankets using the models are presented. Experimental and theoretical results are in excellent agreement. Work is in progress to expand modeling capability to include ultraviolet radiation exposure and to obtain more detailed information on reflecting and scattering characteristics of material surfaces.

  14. A simulation study to quantify the impacts of exposure ...

    EPA Pesticide Factsheets

    A simulation study to quantify the impacts of exposure measurement error on air pollution health risk estimates in copollutant time-series models. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.
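
The core phenomenon such a simulation study examines, attenuation of a health risk estimate by classical exposure measurement error, can be sketched in a single-pollutant linear setting. The paper's copollutant time-series setting is considerably richer; all values below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000
beta_true = 0.8  # true exposure-response slope

x = rng.normal(0.0, 1.0, n)                  # true exposure
y = beta_true * x + rng.normal(0.0, 1.0, n)  # health-related response
z = x + rng.normal(0.0, 1.0, n)              # measured exposure with classical error

beta_hat = np.polyfit(z, y, 1)[0]   # slope estimated from the error-prone z
reliability = np.var(x) / np.var(z)  # expected attenuation factor (~0.5 here)
print(beta_hat, beta_true * reliability)
```

Classical error biases the estimated slope toward the null by the reliability ratio, here roughly halving it.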

  15. Prediction of Chemical Function: Model Development and Application

    EPA Science Inventory

    The United States Environmental Protection Agency’s Exposure Forecaster (ExpoCast) project is developing both statistical and mechanism-based computational models for predicting exposures to thousands of chemicals, including those in consumer products. The high-throughput (...

  16. COSIM: A Finite-Difference Computer Model to Predict Ternary Concentration Profiles Associated With Oxidation and Interdiffusion of Overlay-Coated Substrates

    NASA Technical Reports Server (NTRS)

    Nesbitt, James A.

    2001-01-01

    A finite-difference computer program (COSIM) has been written which models the one-dimensional, diffusional transport associated with high-temperature oxidation and interdiffusion of overlay-coated substrates. The program predicts concentration profiles for up to three elements in the coating and substrate after various oxidation exposures. Surface recession due to solute loss is also predicted. Ternary cross terms and concentration-dependent diffusion coefficients are taken into account. The program also incorporates a previously-developed oxide growth and spalling model to simulate either isothermal or cyclic oxidation exposures. In addition to predicting concentration profiles after various oxidation exposures, the program can also be used to predict coating life based on a concentration dependent failure criterion (e.g., surface solute content drops to 2%). The computer code is written in FORTRAN and employs numerous subroutines to make the program flexible and easily modifiable to other coating oxidation problems.
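
The kind of one-dimensional, diffusional transport COSIM solves can be sketched with an explicit finite-difference scheme for Fick's second law. This binary, constant-diffusivity version omits the ternary cross terms, concentration-dependent coefficients, and the oxide growth/spalling coupling of the real program; all numerical values are illustrative.

```python
import numpy as np

def diffuse_1d(c0, D, dx, dt, steps, c_surface):
    """Explicit finite-difference solution of Fick's second law,
    dc/dt = D * d2c/dx2, with a fixed surface concentration at x = 0
    (solute lost to the growing oxide) and a far-field substrate
    boundary. Binary, constant-D sketch only."""
    r = D * dt / dx**2
    assert r <= 0.5, "explicit scheme stability limit violated"
    c = c0.copy()
    for _ in range(steps):
        c_new = c.copy()
        c_new[1:-1] = c[1:-1] + r * (c[2:] - 2 * c[1:-1] + c[:-2])
        c_new[0] = c_surface  # surface boundary condition
        c = c_new
    return c

c0 = np.full(101, 0.10)  # 10 at.% solute, uniform initial profile
profile = diffuse_1d(c0, D=1e-14, dx=1e-7, dt=0.25, steps=2000, c_surface=0.02)
```

The resulting profile rises monotonically from the depleted surface value back to the bulk concentration, the qualitative shape COSIM predicts after an oxidation exposure.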

  17. Risk assessment of occupational exposure to benzene using numerical simulation in a complex geometry of a reforming unit of petroleum refinery.

    PubMed

    Bayatian, Majid; Ashrafi, Khosro; Azari, Mansour Rezazadeh; Jafari, Mohammad Javad; Mehrabi, Yadollah

    2018-04-01

    There has been an increasing concern about the continuous and the sudden release of volatile organic pollutants from petroleum refineries and occupational and environmental exposures. Benzene is one of the most prevalent volatile compounds, and it has been addressed by many authors for its potential toxicity in occupational and environmental settings. Due to the complexities of sampling and analysis of benzene in routine and accidental situations, a reliable estimation of the benzene concentration in the outdoor setting of a refinery using computational fluid dynamics (CFD) could be instrumental for risk assessment of occupational exposure. In the present work, a computational fluid dynamics model was applied for exposure risk assessment, with benzene considered to be released continuously from a reforming unit of a refinery. For simulation of benzene dispersion, GAMBIT, FLUENT, and CFD-Post software were used for preprocessing, processing, and post-processing, respectively. Computational fluid dynamics validation was carried out by comparing the computed data with the experimental measurements. Eventually, chronic daily intake and lifetime cancer risk for routine operations through the two seasons of a year were estimated through the simulation model. Root mean square errors were 0.19 and 0.17 for wind speed and concentration, respectively. Lifetime risks of workers were 0.4-3.8 and 0.0096-0.25 per 1000 workers in stable and unstable atmospheric conditions, respectively. Exposure risk was unacceptable for the head of shift work, the chief engineer, and general workers on 141 days (38.77%) of the year. The results of this study show that computational fluid dynamics is a useful tool for modeling benzene exposure in a complex geometry and can be used to estimate lifetime risks of occupational groups in a refinery setting.
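
The chronic-intake and lifetime-risk step of such an assessment follows the standard EPA inhalation-route pattern: a lifetime-averaged exposure concentration multiplied by an inhalation unit risk (IUR). The sketch below uses an IUR at the upper end of EPA's published range for benzene; the worker scenario values are hypothetical, not taken from the study.

```python
def lifetime_inhalation_risk(c_ug_m3, hours_per_day, days_per_year,
                             years_exposed, iur_per_ug_m3=7.8e-6):
    """Excess lifetime cancer risk = IUR * EC, where EC is the exposure
    concentration averaged over a 70-year lifetime. The default IUR is
    the upper end of EPA's range for benzene (2.2e-6 to 7.8e-6 per
    ug/m3); scenario values below are hypothetical."""
    at_hours = 70 * 365 * 24  # averaging time in hours
    ec = c_ug_m3 * hours_per_day * days_per_year * years_exposed / at_hours
    return iur_per_ug_m3 * ec

# hypothetical worker: 50 ug/m3 benzene, 8 h/day, 240 d/yr, 30 yr
risk = lifetime_inhalation_risk(50, 8, 240, 30)
print(f"excess lifetime cancer risk ~ {risk:.1e}")  # ~ 3.7e-05
```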

  18. Multiphysics and Thermal Response Models to Improve Accuracy of Local Temperature Estimation in Rat Cortex under Microwave Exposure

    PubMed Central

    Kodera, Sachiko; Gomez-Tames, Jose; Hirata, Akimasa; Masuda, Hiroshi; Arima, Takuji; Watanabe, Soichi

    2017-01-01

    The rapid development of wireless technology has led to widespread concerns regarding adverse human health effects caused by exposure to electromagnetic fields. Temperature elevation in biological bodies is an important factor that can adversely affect health. A thermophysiological model is desired to quantify microwave (MW) induced temperature elevations. In this study, parameters related to thermophysiological responses for MW exposures were estimated using an electromagnetic-thermodynamics simulation technique. To the authors’ knowledge, this is the first study in which parameters related to regional cerebral blood flow in a rat model were extracted at a high degree of accuracy through experimental measurements for localized MW exposure at frequencies exceeding 6 GHz. The findings indicate that the improved modeling parameters yield computed results that match well with the measured quantities during and after exposure in rats. It is expected that the computational model will be helpful in estimating the temperature elevation in the rat brain at multiple observation points (that are difficult to measure simultaneously) and in explaining the physiological changes in the local cortex region. PMID:28358345
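
The exponential rise toward a steady temperature during exposure (and decay afterward) that such thermophysiological models reproduce can be sketched with a lumped first-order thermal response; all parameter values are illustrative, not the fitted rat-cortex parameters of the study.

```python
import math

def temp_rise(t_s, sar=10.0, c_heat=3600.0, tau=60.0):
    """Lumped first-order thermal response to microwave exposure:
    dT/dt = SAR/c_heat - T/tau, where tau bundles perfusion and
    conduction losses. Solution for exposure starting at t = 0.
    SAR in W/kg, c_heat in J/(kg*K), tau in s; returns rise in K.
    All values illustrative."""
    return (sar * tau / c_heat) * (1.0 - math.exp(-t_s / tau))

steady = temp_rise(1e9)  # asymptotic elevation = SAR * tau / c_heat
print(steady)
```

A shorter perfusion time constant (stronger blood flow) lowers both the steady-state elevation and the time needed to reach it, which is why the blood-flow parameters matter so much in the full model.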

  19. Development and application of air quality models at the U.S. EPA

    EPA Science Inventory

    Overview of the development and application of air quality models at the U.S. EPA, particularly focused on the development and application of the Community Multiscale Air Quality (CMAQ) model developed within the Computational Exposure Division (CED) of the National Exposure Resear...

  20. Computational Toxicology (S)

    EPA Science Inventory

    The emerging field of computational toxicology applies mathematical and computer models and molecular biological and chemical approaches to explore both qualitative and quantitative relationships between sources of environmental pollutant exposure and adverse health outcomes. Th...

  1. PREMOR: a point reactor exposure model computer code for survey analysis of power plant performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vondy, D.R.

    1979-10-01

    The PREMOR computer code was written to exploit a simple, two-group point nuclear reactor power plant model for survey analysis. Up to thirteen actinides, fourteen fission products, and one lumped absorber nuclide density are followed over a reactor history. Successive feed batches are accounted for with provision for from one to twenty batches resident. The effect of exposure of each of the batches to the same neutron flux is determined.
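
The exposure step at the heart of such a code, depleting nuclide densities under a constant neutron flux, can be sketched as follows. The cross sections and flux are illustrative, and PREMOR additionally tracks production chains among its actinides and fission products.

```python
import numpy as np

def burnup_step(n, sigma_a, phi, dt):
    """Deplete nuclide densities over one exposure interval at constant
    flux: dN/dt = -sigma_a * phi * N (absorption only, no production
    chains). N in atoms/cm^3, sigma_a in cm^2, phi in n/cm^2-s."""
    return n * np.exp(-np.asarray(sigma_a) * phi * dt)

n0 = np.array([1.0e21, 5.0e20])      # two nuclide densities (illustrative)
sigma = np.array([600e-24, 50e-24])  # absorption cross sections, cm^2
phi = 3e13                           # neutron flux, n/cm^2-s
n1 = burnup_step(n0, sigma, phi, dt=30 * 86400)  # one 30-day batch exposure
```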

  2. Radiation exposure and risk assessment for critical female body organs

    NASA Technical Reports Server (NTRS)

    Atwell, William; Weyland, Mark D.; Hardy, Alva C.

    1991-01-01

    Space radiation exposure limits for astronauts are based on recommendations of the National Council on Radiation Protection and Measurements. These limits now include the age at exposure and sex of the astronaut. A recently-developed computerized anatomical female (CAF) model is discussed in detail. Computer-generated, cross-sectional data are presented to illustrate the completeness of the CAF model. By applying ray-tracing techniques, shield distribution functions have been computed to calculate absorbed dose and dose equivalent values for a variety of critical body organs (e.g., breasts, lungs, thyroid gland, etc.) and mission scenarios. Specific risk assessments, i.e., cancer induction and mortality, are reviewed.

  3. Computational fluid dynamics modeling of Bacillus anthracis spore deposition in rabbit and human respiratory airways

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kabilan, S.; Suffield, S. R.; Recknagle, K. P.

    Three-dimensional computational fluid dynamics and Lagrangian particle deposition models were developed to compare the deposition of aerosolized Bacillus anthracis spores in the respiratory airways of a human with that of the rabbit, a species commonly used in the study of anthrax disease. The respiratory airway geometries for each species were derived respectively from computed tomography (CT) and µCT images. Both models encompassed airways that extended from the external nose to the lung with a total of 272 outlets in the human model and 2878 outlets in the rabbit model. All simulations of spore deposition were conducted under transient, inhalation–exhalation breathing conditions using average species-specific minute volumes. Two different exposure scenarios were modeled in the rabbit based upon experimental inhalation studies. For comparison, human simulations were conducted at the highest exposure concentration used during the rabbit experimental exposures. Results demonstrated that regional spore deposition patterns were sensitive to airway geometry and ventilation profiles. Due to the complex airway geometries in the rabbit nose, higher spore deposition efficiency was predicted in the nasal sinus compared to the human at the same air concentration of anthrax spores. In contrast, higher spore deposition was predicted in the lower conducting airways of the human compared to the rabbit lung due to differences in airway branching pattern. This information can be used to refine published and ongoing biokinetic models of inhalation anthrax spore exposures, which currently estimate deposited spore concentrations based solely upon exposure concentrations and inhaled doses that do not factor in species-specific anatomy and physiology for deposition.

  4. Emerging trends in geospatial artificial intelligence (geoAI): potential applications for environmental epidemiology.

    PubMed

    VoPham, Trang; Hart, Jaime E; Laden, Francine; Chiang, Yao-Yi

    2018-04-17

    Geospatial artificial intelligence (geoAI) is an emerging scientific discipline that combines innovations in spatial science, artificial intelligence methods in machine learning (e.g., deep learning), data mining, and high-performance computing to extract knowledge from spatial big data. In environmental epidemiology, exposure modeling is a commonly used approach to conduct exposure assessment to determine the distribution of exposures in study populations. geoAI technologies provide important advantages for exposure modeling in environmental epidemiology, including the ability to incorporate large amounts of big spatial and temporal data in a variety of formats; computational efficiency; flexibility in algorithms and workflows to accommodate relevant characteristics of spatial (environmental) processes including spatial nonstationarity; and scalability to model other environmental exposures across different geographic areas. The objectives of this commentary are to provide an overview of key concepts surrounding the evolving and interdisciplinary field of geoAI including spatial data science, machine learning, deep learning, and data mining; recent geoAI applications in research; and potential future directions for geoAI in environmental epidemiology.

  5. Computational model of retinal photocoagulation and rupture

    NASA Astrophysics Data System (ADS)

    Sramek, Christopher; Paulus, Yannis M.; Nomoto, Hiroyuki; Huie, Phil; Palanker, Daniel

    2009-02-01

    In patterned scanning laser photocoagulation, shorter duration (< 20 ms) pulses help reduce thermal damage beyond the photoreceptor layer, decrease treatment time and minimize pain. However, the safe therapeutic window (defined as the ratio of the rupture threshold power to that of light coagulation) decreases for shorter exposures. To quantify the extent of thermal damage in the retina, and maximize the therapeutic window, we developed a computational model of retinal photocoagulation and rupture. Model parameters were adjusted to match measured thresholds of vaporization, coagulation, and retinal pigment epithelial (RPE) damage. Computed lesion width agreed with histological measurements over a wide range of pulse durations and power. Application of a ring-shaped beam profile was predicted to double the therapeutic window width for exposures in the range of 1 - 10 ms.
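
Thermal damage models of this kind are commonly built on an Arrhenius damage integral; a constant-temperature sketch (with rate parameters taken from the general retinal-damage literature, not this paper's fitted values) looks like:

```python
import math

R_GAS = 8.314  # J/(mol*K)

def damage_integral(temp_k, t_s, a_rate=3.1e98, e_act=6.28e5):
    """Arrhenius thermal-damage integral for a constant-temperature
    exposure: Omega = A * exp(-Ea / (R*T)) * t, with Omega >= 1 taken
    as the coagulation threshold. A and Ea are generic retina values
    from the literature, not this model's fitted parameters."""
    return a_rate * math.exp(-e_act / (R_GAS * temp_k)) * t_s

# the same 10 ms pulse: sub-threshold at 55 C, well past threshold at 70 C
print(damage_integral(273.15 + 55, 0.010))
print(damage_integral(273.15 + 70, 0.010))
```

The steep temperature dependence is what makes the therapeutic window narrow at short pulse durations: a small change in peak temperature moves the tissue from no coagulation to rupture.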

  6. COSIM: A Finite-Difference Computer Model to Predict Ternary Concentration Profiles Associated with Oxidation and Interdiffusion of Overlay-Coated Substrates

    NASA Technical Reports Server (NTRS)

    Nesbitt, James A.

    2000-01-01

    A finite-difference computer program (COSIM) has been written which models the one-dimensional, diffusional transport associated with high-temperature oxidation and interdiffusion of overlay-coated substrates. The program predicts concentration profiles for up to three elements in the coating and substrate after various oxidation exposures. Surface recession due to solute loss is also predicted. Ternary cross terms and concentration-dependent diffusion coefficients are taken into account. The program also incorporates a previously-developed oxide growth and spalling model to simulate either isothermal or cyclic oxidation exposures. In addition to predicting concentration profiles after various oxidation exposures, the program can also be used to predict coating life based on a concentration dependent failure criterion (e.g., surface solute content drops to two percent). The computer code, written in an extension of FORTRAN 77, employs numerous subroutines to make the program flexible and easily modifiable to other coating oxidation problems.

  7. A latent process model for forecasting multiple time series in environmental public health surveillance.

    PubMed

    Morrison, Kathryn T; Shaddick, Gavin; Henderson, Sarah B; Buckeridge, David L

    2016-08-15

    This paper outlines a latent process model for forecasting multiple health outcomes arising from a common environmental exposure. Traditionally, surveillance models in environmental health do not link health outcome measures, such as morbidity or mortality counts, to measures of exposure, such as air pollution. Moreover, different measures of health outcomes are treated as independent, while it is known that they are correlated with one another over time as they arise in part from a common underlying exposure. We propose modelling an environmental exposure as a latent process, and we describe the implementation of such a model within a hierarchical Bayesian framework and its efficient computation using integrated nested Laplace approximations. Through a simulation study, we compare distinct univariate models for each health outcome with a bivariate approach. The bivariate model outperforms the univariate models in bias and coverage of parameter estimation, in forecast accuracy and in computational efficiency. The methods are illustrated with a case study using healthcare utilization and air pollution data from British Columbia, Canada, 2003-2011, where seasonal wildfires produce high levels of air pollution, significantly impacting population health. Copyright © 2016 John Wiley & Sons, Ltd. Copyright © 2016 John Wiley & Sons, Ltd.
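
The key idea, that outcome series become correlated because they share one latent exposure process, can be sketched by forward simulation (an AR(1) latent exposure driving two Poisson count series; all parameters illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
n_days, rho, sigma = 1000, 0.8, 0.3

# latent (unobserved) daily exposure: AR(1) process
x = np.zeros(n_days)
for t in range(1, n_days):
    x[t] = rho * x[t - 1] + rng.normal(0.0, sigma)

# two health-outcome count series driven by the same latent exposure
y1 = rng.poisson(np.exp(1.0 + 0.8 * x))  # e.g. daily morbidity counts
y2 = rng.poisson(np.exp(0.5 + 0.6 * x))  # e.g. daily mortality counts

r = np.corrcoef(y1, y2)[0, 1]
print(r)  # positive, induced purely by the shared exposure
```

Fitting each count series separately discards this cross-series information, which is why the bivariate latent-process model forecasts better than independent univariate models.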

  8. Predicting Adaptive Response to Fadrozole Exposure: Computational Model of the Fathead Minnow Hypothalamic-Pituitary-Gonadal Axis

    EPA Science Inventory

    Exposure to endocrine disrupting chemicals can affect reproduction and development in both humans and wildlife. We are developing a mechanistic mathematical model of the hypothalamic-pituitary-gonadal (HPG) axis in female fathead minnows to predict dose-response and time-course (...

  9. Linkage of exposure and effects using genomics, proteomics and metabolomics in small fish models (presentation)

    EPA Science Inventory

    This research project combines the use of whole organism endpoints, genomic, proteomic and metabolomic approaches, and computational modeling in a systems biology approach to 1) identify molecular indicators of exposure and biomarkers of effect to EDCs representing several modes/...

  10. A reassessment of Galileo radiation exposures in the Jupiter magnetosphere.

    PubMed

    Atwell, William; Townsend, Lawrence; Miller, Thomas; Campbell, Christina

    2005-01-01

    Earlier particle experiments in the 1970s on Pioneer-10 and -11 and Voyager-1 and -2 provided Jupiter flyby particle data, which were used by Divine and Garrett to develop the first Jupiter trapped radiation environment model. This model was used to establish a baseline radiation effects design limit for the Galileo onboard electronics. Recently, Garrett et al. have developed an updated Galileo Interim Radiation Environment (GIRE) model based on Galileo electron data. In this paper, we have used the GIRE model to reassess the computed radiation exposures and dose effects for Galileo. The 34-orbit 'as flown' Galileo trajectory data and the updated GIRE model were used to compute the electron and proton spectra for each of the 34 orbits. The total ionisation doses of electrons and protons have been computed based on a parametric shielding configuration, and these results are compared with previously published results.

  11. Predicting Adaptive Response to Fadrozole Exposure: Computational Model of the Fathead Minnow Hypothalamic-Pituitary-Gonadal Axis

    EPA Science Inventory

    Exposure to endocrine disrupting chemicals can affect reproduction and development in both humans and wildlife. We are developing a mechanistic mathematical model of the hypothalamic-pituitary-gonadal (HPG) axis in female fathead minnows to predict dose-response and time-course (...

  12. Modeling Potential Carbon Monoxide Exposure Due to Operation of a Major Rocket Engine Altitude Test Facility Using Computational Fluid Dynamics

    NASA Technical Reports Server (NTRS)

    Blotzer, Michael J.; Woods, Jody L.

    2009-01-01

    This viewgraph presentation reviews computational fluid dynamics as a tool for modelling the dispersion of carbon monoxide at the Stennis Space Center's A3 Test Stand. The contents include: 1) Constellation Program; 2) Constellation Launch Vehicles; 3) J2X Engine; 4) A-3 Test Stand; 5) Chemical Steam Generators; 6) Emission Estimates; 7) Located in Existing Test Complex; 8) Computational Fluid Dynamics; 9) Computational Tools; 10) CO Modeling; 11) CO Model results; and 12) Next steps.

  13. Recent Enhancements to the Community Multiscale Air Quality Modeling System (CMAQ)

    EPA Science Inventory

    EPA’s Office of Research and Development, Computational Exposure Division held a webinar on January 31, 2017 to present the recent scientific and computational updates made by EPA to the Community Multi-Scale Air Quality Model (CMAQ). Topics covered included: (1) Improveme...

  14. COOPERATIVE RESEARCH AND DEVELOPMENT FOR APPLICATION OF CFD TO ESTIMATING HUMAN EXPOSURES TO ENVIRONMENTAL POLLUTANTS

    EPA Science Inventory

    Under a Cooperative Research and Development Agreement (CRADA), Fluent, Inc. and the US EPA National Exposure Research Laboratory (NERL) propose to improve the ability of environmental scientists to use computer modeling for environmental exposure to air pollutants in human exp...

  15. ToxCast and the Use of Human Relevant In Vitro Exposures ...

    EPA Pesticide Factsheets

    The path for incorporating new approach methods and technologies into quantitative chemical risk assessment poses a diverse set of scientific challenges. These challenges include sufficient coverage of toxicological mechanisms to meaningfully interpret negative test results, development of increasingly relevant test systems, computational modeling to integrate experimental data, putting results in a dose and exposure context, characterizing uncertainty, and efficient validation of the test systems and computational models. The presentation will cover progress at the U.S. EPA in systematically addressing each of these challenges and delivering more human-relevant risk-based assessments. This abstract does not necessarily reflect U.S. EPA policy. Presentation at the British Toxicological Society Annual Congress on ToxCast and the Use of Human Relevant In Vitro Exposures: Incorporating high-throughput exposure and toxicity testing data for 21st century risk assessments.

  16. Thinking Through Computational Exposure as an Evolving Paradigm Shift for Exposure Science: Development and Application of Predictive Models from Big Data

    EPA Science Inventory

    Symposium Abstract: Exposure science has evolved from a time when the primary focus was on measurements of environmental and biological media and the development of enabling field and laboratory methods. The Total Exposure Assessment Method (TEAM) studies of the 1980s were class...

  17. Single-Photon Emission Computed Tomography/Computed Tomography Imaging in a Rabbit Model of Emphysema Reveals Ongoing Apoptosis In Vivo

    PubMed Central

    Goldklang, Monica P.; Tekabe, Yared; Zelonina, Tina; Trischler, Jordis; Xiao, Rui; Stearns, Kyle; Romanov, Alexander; Muzio, Valeria; Shiomi, Takayuki; Johnson, Lynne L.

    2016-01-01

    Evaluation of lung disease is limited by the inability to visualize ongoing pathological processes. Molecular imaging that targets cellular processes related to disease pathogenesis has the potential to assess disease activity over time to allow intervention before lung destruction. Because apoptosis is a critical component of lung damage in emphysema, a functional imaging approach was taken to determine if targeting apoptosis in a smoke exposure model would allow the quantification of early lung damage in vivo. Rabbits were exposed to cigarette smoke for 4 or 16 weeks and underwent single-photon emission computed tomography/computed tomography scanning using technetium-99m–rhAnnexin V-128. Imaging results were correlated with ex vivo tissue analysis to validate the presence of lung destruction and apoptosis. Lung computed tomography scans of long-term smoke–exposed rabbits exhibit anatomical similarities to human emphysema, with increased lung volumes compared with controls. Morphometry on lung tissue confirmed increased mean linear intercept and destructive index at 16 weeks of smoke exposure and compliance measurements documented physiological changes of emphysema. Tissue and lavage analysis displayed the hallmarks of smoke exposure, including increased tissue cellularity and protease activity. Technetium-99m–rhAnnexin V-128 single-photon emission computed tomography signal was increased after smoke exposure at 4 and 16 weeks, with confirmation of increased apoptosis through terminal deoxynucleotidyl transferase dUTP nick end labeling staining and increased tissue neutral sphingomyelinase activity in the tissue. These studies not only describe a novel emphysema model for use with future therapeutic applications, but, most importantly, also characterize a promising imaging modality that identifies ongoing destructive cellular processes within the lung. PMID:27483341

  18. Computer program for diagnostic X-ray exposure conversion.

    PubMed

    Lewis, S

    1984-01-01

    Presented is a computer program designed to convert any given set of exposure factors sequentially into another, yielding either an equivalent photographic density or one increased or decreased by a specifiable proportion. In addition to containing the wherewithal with which to manipulate a set of exposure factors, the facility to print hard (paper) copy is included enabling the results to be pasted into a notebook and used at any time. This program was originally written as an investigative exercise into examining the potential use of computers for practical radiographic purposes as conventionally encountered. At the same time, its possible use as an educational tool was borne in mind. To these ends, the current version of this program may be used as a means whereby exposure factors used in a diagnostic department may be altered to suit a particular requirement or may be used in the school as a mathematical model to describe the behaviour of exposure factors under manipulation without patient exposure.
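
Two standard radiographic conversions of the kind such a program manipulates are the inverse-square distance correction and the 15% kVp rule. These are textbook rules of thumb, not necessarily the exact relations implemented in the published code.

```python
import math

def mas_for_new_distance(mas, d_old, d_new):
    """Inverse-square correction: mAs needed to keep receptor exposure
    (photographic density) constant when focus-receptor distance changes."""
    return mas * (d_new / d_old) ** 2

def mas_for_new_kvp(mas, kvp_old, kvp_new):
    """15%-rule approximation: each 15% increase in kVp roughly halves
    the mAs required for the same density (and vice versa)."""
    steps = math.log(kvp_new / kvp_old) / math.log(1.15)
    return mas / 2.0 ** steps

print(mas_for_new_distance(20, 100, 180))  # 20 mAs at 100 cm -> 64.8 mAs at 180 cm
print(mas_for_new_kvp(40, 70, 80.5))       # +15% kVp -> ~20 mAs
```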

  19. Computational Toxicology at the US EPA | Science Inventory ...

    EPA Pesticide Factsheets

    Computational toxicology is the application of mathematical and computer models to help assess chemical hazards and risks to human health and the environment. Supported by advances in informatics, high-throughput screening (HTS) technologies, and systems biology, EPA is developing robust and flexible computational tools that can be applied to the thousands of chemicals in commerce, and contaminant mixtures found in America’s air, water, and hazardous-waste sites. The ORD Computational Toxicology Research Program (CTRP) is composed of three main elements. The largest component is the National Center for Computational Toxicology (NCCT), which was established in 2005 to coordinate research on chemical screening and prioritization, informatics, and systems modeling. The second element consists of related activities in the National Health and Environmental Effects Research Laboratory (NHEERL) and the National Exposure Research Laboratory (NERL). The third and final component consists of academic centers working on various aspects of computational toxicology and funded by the EPA Science to Achieve Results (STAR) program. Key intramural projects of the CTRP include digitizing legacy toxicity testing information toxicity reference database (ToxRefDB), predicting toxicity (ToxCast™) and exposure (ExpoCast™), and creating virtual liver (v-Liver™) and virtual embryo (v-Embryo™) systems models. The models and underlying data are being made publicly available t

  20. Computational strategy for quantifying human pesticide exposure based upon a saliva measurement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Timchalk, Charles; Weber, Thomas J.; Smith, Jordan N.

    The National Research Council of the National Academies report, Toxicity Testing in the 21st Century: A Vision and Strategy, highlighted the importance of quantitative exposure data for evaluating human toxicity risk and noted that biomonitoring is a critical tool for quantitatively evaluating exposure from both environmental and occupational settings. Direct measurement of chemical exposures using personal monitoring provides the most accurate estimation of a subject’s true exposure, and non-invasive methods have also been advocated for quantifying the pharmacokinetics and bioavailability of drugs and xenobiotics. In this regard, there is a need to identify chemicals that are readily cleared in saliva at concentrations that can be quantified to support the implementation of this approach. The current manuscript describes the use of computational modeling approaches that are closely coupled to in vivo and in vitro experiments to predict salivary uptake and clearance of xenobiotics. The primary mechanism by which xenobiotics leave the blood and enter saliva is thought to involve paracellular transport, passive transcellular diffusion, or transcellular active transport, with the majority of drugs and xenobiotics cleared from plasma into saliva by passive diffusion. The transcellular or paracellular diffusion of unbound chemicals in plasma to saliva has been computationally modeled using a combination of compartmental and physiologically based approaches. Of key importance for determining the plasma:saliva partitioning was the utilization of a modified Schmitt algorithm that calculates partitioning based upon the tissue composition, pH, chemical pKa and plasma protein-binding.
    Sensitivity analysis of key model parameters specifically identified that both protein-binding and pKa (for weak acids and bases) had the most significant impact on the determination of partitioning and that there were clear species dependent differences based upon physiological variance between rats and humans. Ongoing efforts are focused on extending this modeling strategy to an in vitro salivary acinar cell based system that will be utilized to experimentally determine and computationally predict salivary gland uptake and clearance for a broad range of xenobiotics. Hence, it is envisioned that a combination of salivary biomonitoring and computational modeling will enable the non-invasive measurement of both environmental and occupational exposure in human populations using saliva.
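
The pH-partition idea underlying the modified Schmitt approach, partitioning of the unbound, un-ionized species driven by pKa, pH, and protein binding, can be sketched for passive diffusion as follows (the example compound is hypothetical):

```python
def saliva_to_plasma_ratio(pka, fu_plasma, fu_saliva=1.0,
                           ph_plasma=7.4, ph_saliva=6.8, acid=True):
    """pH-partition estimate of the saliva:plasma concentration ratio,
    assuming passive diffusion equilibrates the unbound, un-ionized
    species across the gland epithelium. fu_* are unbound fractions."""
    if acid:   # weak acid: ionization increases with pH
        total = lambda ph: 1.0 + 10.0 ** (ph - pka)
    else:      # weak base: ionization increases as pH falls
        total = lambda ph: 1.0 + 10.0 ** (pka - ph)
    return (fu_plasma / fu_saliva) * total(ph_saliva) / total(ph_plasma)

# hypothetical weak acid: pKa 4.5, 5% unbound in plasma
print(saliva_to_plasma_ratio(4.5, 0.05))  # ~0.0126: low saliva levels
```

The form makes the sensitivity-analysis result plausible: the ratio scales directly with the unbound plasma fraction, and the pKa terms enter exponentially through the ionization factors.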

  1. Computational Fluid Dynamics Modeling of Bacillus anthracis ...

    EPA Pesticide Factsheets

    Three-dimensional computational fluid dynamics and Lagrangian particle deposition models were developed to compare the deposition of aerosolized Bacillus anthracis spores in the respiratory airways of a human with that of the rabbit, a species commonly used in the study of anthrax disease. The respiratory airway geometries for each species were derived from computed tomography (CT) or µCT images. Both models encompassed airways that extended from the external nose to the lung, with a total of 272 outlets in the human model and 2878 outlets in the rabbit model. All simulations of spore deposition were conducted under transient, inhalation-exhalation breathing conditions using average species-specific minute volumes. Four different exposure scenarios were modeled in the rabbit based upon experimental inhalation studies. For comparison, human simulations were conducted at the highest exposure concentration used during the rabbit experimental exposures. Results demonstrated that regional spore deposition patterns were sensitive to airway geometry and ventilation profiles. Despite the complex airway geometries in the rabbit nose, higher spore deposition efficiency was predicted in the upper conducting airways of the human at the same air concentration of anthrax spores. This greater deposition of spores in the upper airways in the human resulted in lower penetration and deposition in the tracheobronchial airways and the deep lung than that predicted in the rabbit.

  2. A PHYSIOLOGICALLY BASED COMPUTATIONAL MODEL OF THE BPG AXIS IN FATHEAD MINNOWS: PREDICTING EFFECTS OF ENDOCRINE DISRUPTING CHEMICAL EXPOSURE ON REPRODUCTIVE ENDPOINTS

    EPA Science Inventory

    This presentation describes development and application of a physiologically-based computational model that simulates the brain-pituitary-gonadal (BPG) axis and other endpoints important in reproduction, such as concentrations of sex steroid hormones, 17β-estradiol, testosterone, a...

  3. Adaptive Response in Female Fathead Minnows Exposed to an Aromatase Inhibitor: Computational Modeling of the Hypothalamic-Pituitary-Gonadal Axis

    EPA Science Inventory

    Exposure to endocrine disrupting chemicals can affect reproduction and development in both humans and wildlife. We are developing a mechanistic computational model of the hypothalamic-pituitary-gonadal (HPG) axis in female fathead minnows to predict dose-response and time-course ...

  4. Improved heat transfer modeling of the eye for electromagnetic wave exposures.

    PubMed

    Hirata, Akimasa

    2007-05-01

    This study proposed an improved heat transfer model of the eye for exposure to electromagnetic (EM) waves. Particular attention was paid to the difference from the simplified heat transfer model commonly used in this field. From our computational results, the temperature elevation in the eye calculated with the simplified heat transfer model was largely influenced by the EM absorption outside the eyeball, but not when we used our improved model.

  5. Modeling heat and moisture transport in firefighter protective clothing during flash fire exposure

    NASA Astrophysics Data System (ADS)

    Chitrphiromsri, Patirop; Kuznetsov, Andrey V.

    2005-01-01

    In this paper, a model of heat and moisture transport in firefighter protective clothing during a flash fire exposure is presented. The aim of this study is to investigate the effect of coupled heat and moisture transport on the protective performance of the garment. Computational results show the distribution of temperature and moisture content in the fabric during the exposure to the flash fire as well as during the cool-down period. Moreover, the duration of the exposure during which the garment protects the firefighter from getting second and third degree burns from the flash fire exposure is numerically predicted. A complete model for the fire-fabric-air gap-skin system is presented.

  6. An efficient use of mixing model for computing the effective dielectric and thermal properties of the human head.

    PubMed

    Mishra, Varsha; Puthucheri, Smitha; Singh, Dharmendra

    2018-05-07

    As a preventive measure against electromagnetic (EM) wave exposure of the human body, EM radiation regulatory authorities such as ICNIRP and the FCC have defined limits on the specific absorption rate (SAR) for the human head during EM wave exposure from mobile phones. SAR quantifies the absorption of EM waves in the human body and mainly depends on the dielectric properties (ε', σ) of the corresponding tissues. The head is the part of the body most susceptible to EM wave exposure due to the usage of mobile phones. The human head is a complex structure made up of multiple tissues with intermixing of many layers; thus, accurate measurement of the permittivity (ε') and conductivity (σ) of the tissues of the human head remains a challenge. For computing the SAR, researchers use multilayer models, which pose challenges in defining the boundaries between layers. Therefore, in this paper, a method is proposed to compute the effective complex permittivity of the human head in the range of 0.3 to 3.0 GHz by applying the De-Loor mixing model. Similarly, for characterizing thermal effects in tissue, the thermal properties of the human head have also been computed using the De-Loor mixing method. The effective dielectric and thermal properties of the equivalent human head model are compared with IEEE Std. 1528.
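    The two quantities at the core of this abstract can be illustrated with a deliberately simplified sketch: a volume-fraction-weighted permittivity average standing in for the more involved De-Loor mixing formula (which also accounts for inclusion shape via depolarization factors), plus the standard point-SAR relation:

```python
def effective_permittivity(fractions, eps_values):
    """Volume-fraction-weighted effective permittivity of a tissue
    mixture. NOTE: a deliberately simple stand-in for the De-Loor
    formulation, kept linear for illustration."""
    assert abs(sum(fractions) - 1.0) < 1e-9  # fractions must sum to 1
    return sum(f * e for f, e in zip(fractions, eps_values))

def point_sar(sigma, e_rms, density):
    """Local SAR (W/kg) = sigma * |E_rms|^2 / rho, with conductivity
    sigma in S/m, RMS internal field in V/m, and density in kg/m^3."""
    return sigma * e_rms ** 2 / density
```

    An equivalent homogeneous head model with the right effective (ε', σ) lets a single-material simulation reproduce the absorption of the full multilayer structure without per-layer boundary definitions.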

  7. Inferring ultraviolet anatomical exposure patterns while distinguishing the relative contribution of radiation components

    NASA Astrophysics Data System (ADS)

    Vuilleumier, Laurent; Milon, Antoine; Bulliard, Jean-Luc; Moccozet, Laurent; Vernez, David

    2013-05-01

    Exposure to solar ultraviolet (UV) radiation is the main causative factor for skin cancer. UV exposure depends on environmental and individual factors, but individual exposure data remain scarce. While ground UV irradiance is monitored via different techniques, it is difficult to translate such observations into human UV exposure or dose because of confounding factors. A multi-disciplinary collaboration developed a model predicting the dose and distribution of UV exposure on the basis of ground irradiation and morphological data. Standard 3D computer graphics techniques were adapted to develop a simulation tool that estimates solar exposure of a virtual manikin depicted as a triangle mesh surface. The amount of solar energy received by various body locations is computed for direct, diffuse and reflected radiation separately. Dosimetric measurements obtained in field conditions were used to assess the model performance. The model predicted exposure to solar UV adequately with a symmetric mean absolute percentage error of 13% and half of the predictions within 17% range of the measurements. Using this tool, solar UV exposure patterns were investigated with respect to the relative contribution of the direct, diffuse and reflected radiation. Exposure doses for various body parts and exposure scenarios of a standing individual were assessed using erythemally-weighted UV ground irradiance data measured in 2009 at Payerne, Switzerland as input. For most anatomical sites, mean daily doses were high (typically 6.2-14.6 Standard Erythemal Dose, SED) and exceeded recommended exposure values. Direct exposure was important during specific periods (e.g. midday during summer), but contributed moderately to the annual dose, ranging from 15 to 24% for vertical and horizontal body parts, respectively. Diffuse irradiation explained about 80% of the cumulative annual exposure dose.
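    The direct-radiation component received by one facet of such a triangle-mesh manikin is essentially a cosine-weighted projection; a toy sketch (the published tool additionally handles the diffuse-sky and ground-reflected terms, which the abstract notes dominate the annual dose, plus shadowing between facets):

```python
import numpy as np

def direct_component(normal, sun_dir, direct_irradiance):
    """Direct erythemal irradiance received by one mesh facet:
    irradiance scaled by the cosine of the incidence angle, clamped
    to zero when the facet faces away from the sun.
    Both vectors are assumed unit-length."""
    cos_i = float(np.dot(normal, sun_dir))
    return direct_irradiance * max(0.0, cos_i)
```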

  8. Using meta-regression models to systematically evaluate data in the published literature: relative contributions of agricultural drift, para-occupational, and residential use exposure pathways to house dust pesticide concentrations

    EPA Science Inventory

    Background: Data reported in the published literature have been used qualitatively to aid exposure assessment activities in epidemiologic studies. Analyzing these data in computational models presents statistical challenges because these data are often reported as summary statist...

  9. Evaluation of the Community Multi-scale Air Quality (CMAQ) ...

    EPA Pesticide Factsheets

    The Community Multiscale Air Quality (CMAQ) model is a state-of-the-science air quality model that simulates the emission, transport and fate of numerous air pollutants, including ozone and particulate matter. The Computational Exposure Division (CED) of the U.S. Environmental Protection Agency develops the CMAQ model and periodically releases new versions that include bug fixes and various other improvements to the modeling system. In the fall of 2015, CMAQ version 5.1 was released. This new version contains important fixes to several issues that were identified in CMAQv5.0.2 and additionally includes updates to other portions of the code. Several annual, and numerous episodic, CMAQv5.1 simulations were performed to assess the impact of these improvements on the model results. These results will be presented, along with a base evaluation of the performance of the CMAQv5.1 modeling system against surface and upper-air measurements available during the simulated period. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.

  10. ADDRESSING ENVIRONMENTAL ENGINEERING CHALLENGES WITH COMPUTATIONAL FLUID DYNAMICS

    EPA Science Inventory

    This paper discusses the status and application of Computational Fluid Dynamics (CFD) models to address environmental engineering challenges for more detailed understanding of air pollutant source emissions, atmospheric dispersion and resulting human exposure. CFD simulations ...

  11. Predictive Models and Computational Toxicology

    EPA Science Inventory

    Understanding the potential health risks posed by environmental chemicals is a significant challenge elevated by the large number of diverse chemicals with generally uncharacterized exposures, mechanisms, and toxicities. The ToxCast computational toxicology research program was l...

  12. Computational Modeling and Simulation of Developmental Toxicity (EuroTox 2016)

    EPA Science Inventory

    Standard practice for assessing developmental toxicity is the observation of apical endpoints (intrauterine death, fetal growth retardation, structural malformations) in pregnant rats/rabbits following exposure during organogenesis. EPA’s computational toxicology research program...

  13. Computational and Organotypic Modeling of Microcephaly (Teratology Society)

    EPA Science Inventory

    Microcephaly is associated with reduced cortical surface area and ventricular dilations. Many genetic and environmental factors precipitate this malformation, including prenatal alcohol exposure and maternal Zika infection. This complexity motivates the engineering of computation...

  14. Computational Modeling of Hypothalamic-Pituitary-Gonadal Axis to Predict Adaptive Responses in Female Fathead Minnows Exposed to an Aromatase Inhibitor

    EPA Science Inventory

    Exposure to endocrine disrupting chemicals can affect reproduction and development in both humans and wildlife. We are developing a mechanistic computational model of the hypothalamic-pituitary-gonadal (HPG) axis in female fathead minnows to predict dose response and time-course...

  15. The need for non- or minimally-invasive biomonitoring strategies and the development of pharmacokinetic/pharmacodynamic models for quantification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Timchalk, Charles; Weber, Thomas J.; Smith, Jordan N.

    Advancements in Exposure Science involving the development and deployment of biomarkers of exposure and biological response are anticipated to significantly (and positively) influence health outcomes associated with occupational, environmental and clinical exposure to chemicals/drugs. To achieve this vision, innovative strategies are needed to develop multiplex sensor platforms capable of quantifying individual and mixed exposures (i.e. systemic dose) by measuring biomarkers of dose and biological response in readily obtainable (non-invasive) biofluids. Secondly, the use of saliva (an alternative to blood) for biomonitoring, coupled with the ability to rapidly analyze multiple samples in real-time, offers an innovative opportunity to revolutionize biomonitoring assessments. In this regard, the timing and number of samples taken for biomonitoring will not be limited as is currently the case. In addition, real-time analysis will facilitate identification of work practices or conditions that are contributing to increased exposures and will make possible a more rapid and successful intervention strategy. The initial development and application of computational models for evaluation of saliva/blood analyte concentration at anticipated exposure levels represents an important opportunity to establish the limits of quantification and robustness of multiplex sensor systems by exploiting a unique computational modeling framework. The use of these pharmacokinetic models will also enable prediction of an exposure dose based on the saliva/blood measurement. This novel strategy will result in a more accurate prediction of exposures and, once validated, can be employed to assess dosimetry to a broad range of chemicals in support of biomonitoring and epidemiology studies.

  16. Linking environmental effects to health impacts: a computer modelling approach for air pollution

    PubMed Central

    Mindell, J.; Barrowcliffe, R.

    2005-01-01

    Study objective and setting: To develop a computer model, using a geographical information system (GIS), to quantify the potential health effects of air pollution from a new energy-from-waste facility on the surrounding urban population. Design: Health impacts were included where evidence of causality is sufficiently convincing. The evidence for no threshold means that annual average increases in concentration can be used to model changes in outcome. The study combined the "contours" of additional pollutant concentrations for the new source, generated by a dispersion model, with a population database within a GIS, which was set up to calculate the product of the concentration increase with the numbers of people exposed within each enumeration district, exposure-response coefficients, and the background rates of mortality and hospital admissions for several causes. Main results: The magnitude of health effects that might result from the increased PM10 exposure is small: about 0.03 deaths each year in a population of 3 500 000, with 0.04 extra hospital admissions for respiratory disease. Long term exposure might bring forward 1.8–7.8 deaths in 30 years. Conclusions: This computer model is a feasible approach to estimating impacts on human health from environmental effects, but sensitivity analyses are recommended. Relevance to clinical or professional practice: The availability of GIS and dispersion models on personal computers enables quantification of the health effects resulting from the additional air pollution that new industrial development might cause. This approach could also be used in environmental impact assessment. Care must be taken in presenting results to emphasise methodological limitations and uncertainties in the numbers. PMID:16286501
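    The core GIS calculation described in the design can be sketched as a linear no-threshold health-impact function summed over enumeration districts (the function names and coefficient values below are illustrative, not taken from the study):

```python
def attributable_cases(delta_conc, beta, baseline_rate, population):
    """Linear no-threshold health-impact function: extra annual cases
    attributable to an increment in annual-average concentration.
    beta: fractional change in the outcome per ug/m^3 increase;
    baseline_rate: baseline cases per person per year."""
    return beta * delta_conc * baseline_rate * population

def total_impact(districts, beta, baseline_rate):
    """Sum the impact over GIS enumeration districts, each given as a
    (concentration increment, exposed population) pair."""
    return sum(attributable_cases(dc, beta, baseline_rate, pop)
               for dc, pop in districts)
```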

  17. Inhalation toxicity of indoor air pollutants in Drosophila melanogaster using integrated transcriptomics and computational behavior analyses

    NASA Astrophysics Data System (ADS)

    Eom, Hyun-Jeong; Liu, Yuedan; Kwak, Gyu-Suk; Heo, Muyoung; Song, Kyung Seuk; Chung, Yun Doo; Chon, Tae-Soo; Choi, Jinhee

    2017-06-01

    We conducted an inhalation toxicity test on the alternative animal model, Drosophila melanogaster, to investigate potential hazards of indoor air pollution. The inhalation toxicity of toluene and formaldehyde was investigated using comprehensive transcriptomics and computational behavior analyses. The ingenuity pathway analysis (IPA) based on microarray data suggests the involvement of pathways related to immune response, stress response, and metabolism in formaldehyde and toluene exposure based on hub molecules. We conducted a toxicity test using mutants of the representative genes in these pathways to explore the toxicological consequences of alterations of these pathways. Furthermore, extensive computational behavior analysis showed that exposure to either toluene or formaldehyde reduced most of the behavioral parameters of both wild-type and mutants. Interestingly, behavioral alteration caused by toluene or formaldehyde exposure was most severe in the p38b mutant, suggesting that the defects in the p38 pathway underlie behavioral alteration. Overall, the results indicate that exposure to toluene and formaldehyde via inhalation causes severe toxicity in Drosophila, by inducing significant alterations in gene expression and behavior, suggesting that Drosophila can be used as a potential alternative model in inhalation toxicity screening.

  18. Inhalation toxicity of indoor air pollutants in Drosophila melanogaster using integrated transcriptomics and computational behavior analyses

    PubMed Central

    Eom, Hyun-Jeong; Liu, Yuedan; Kwak, Gyu-Suk; Heo, Muyoung; Song, Kyung Seuk; Chung, Yun Doo; Chon, Tae-Soo; Choi, Jinhee

    2017-01-01

    We conducted an inhalation toxicity test on the alternative animal model, Drosophila melanogaster, to investigate potential hazards of indoor air pollution. The inhalation toxicity of toluene and formaldehyde was investigated using comprehensive transcriptomics and computational behavior analyses. The ingenuity pathway analysis (IPA) based on microarray data suggests the involvement of pathways related to immune response, stress response, and metabolism in formaldehyde and toluene exposure based on hub molecules. We conducted a toxicity test using mutants of the representative genes in these pathways to explore the toxicological consequences of alterations of these pathways. Furthermore, extensive computational behavior analysis showed that exposure to either toluene or formaldehyde reduced most of the behavioral parameters of both wild-type and mutants. Interestingly, behavioral alteration caused by toluene or formaldehyde exposure was most severe in the p38b mutant, suggesting that the defects in the p38 pathway underlie behavioral alteration. Overall, the results indicate that exposure to toluene and formaldehyde via inhalation causes severe toxicity in Drosophila, by inducing significant alterations in gene expression and behavior, suggesting that Drosophila can be used as a potential alternative model in inhalation toxicity screening. PMID:28621308

  19. FDTD computation of human eye exposure to ultra-wideband electromagnetic pulses.

    PubMed

    Simicevic, Neven

    2008-03-21

    With an increase in the application of ultra-wideband (UWB) electromagnetic pulses in the communications industry, radar, biotechnology and medicine, comes an interest in UWB exposure safety standards. Despite an increase of the scientific research on bioeffects of exposure to non-ionizing UWB pulses, characterization of those effects is far from complete. A numerical computational approach, such as a finite-difference time domain (FDTD) method, is required to visualize and understand the complexity of broadband electromagnetic interactions. The FDTD method has almost no limits in the description of the geometrical and dispersive properties of the simulated material, it is numerically robust and appropriate for current computer technology. In this paper, a complete calculation of exposure of the human eye to UWB electromagnetic pulses in the frequency range of 3.1-10.6, 22-29 and 57-64 GHz is performed. Computation in this frequency range required a geometrical resolution of the eye of 0.1 mm and an arbitrary precision in the description of its dielectric properties in terms of the Debye model. New results show that the interaction of UWB pulses with the eye tissues exhibits the same properties as the interaction of the continuous electromagnetic waves (CWs) with the frequencies from the pulse's frequency spectrum. It is also shown that under the same exposure conditions the exposure to UWB pulses is from one to many orders of magnitude safer than the exposure to CW.
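    The leapfrog update at the heart of the FDTD method is easiest to see in one dimension; a minimal normalized free-space sketch (the actual eye computations are three-dimensional, at 0.1 mm resolution, with Debye-dispersive tissue models, so everything here is a simplified illustration):

```python
import numpy as np

def fdtd_1d(n_cells=200, n_steps=250, imp0=377.0):
    """Bare-bones 1-D free-space FDTD (Yee leapfrog, Courant number 1,
    normalized units). The same staggered E/H update structure,
    extended to 3-D with dispersive tissue parameters, underlies
    UWB exposure computations."""
    ez = np.zeros(n_cells)
    hy = np.zeros(n_cells)
    for t in range(n_steps):
        hy[:-1] += (ez[1:] - ez[:-1]) / imp0   # update H from curl of E
        ez[1:] += (hy[1:] - hy[:-1]) * imp0    # update E from curl of H
        ez[50] += np.exp(-((t - 30.0) / 10.0) ** 2)  # soft Gaussian source
    return ez
```

    Because a Gaussian pulse contains a broad band of frequencies, a single FDTD run characterizes the response across the whole UWB spectrum, which is why the pulse results mirror the CW results at the pulse's component frequencies.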

  20. Decompression management by 43 models of dive computer: single square-wave exposures to between 15 and 50 metres' depth.

    PubMed

    Sayer, Martin D J; Azzopardi, Elaine; Sieber, Arne

    2014-12-01

    Dive computers are used in some occupational diving sectors to manage decompression but there is little independent assessment of their performance. A significant proportion of occupational diving operations employ single square-wave pressure exposures in support of their work. Single examples of 43 models of dive computer were compressed to five simulated depths between 15 and 50 metres' sea water (msw) and maintained at those depths until they had registered over 30 minutes of decompression. At each depth, and for each model, downloaded data were used to collate the times at which the unit was still registering "no decompression" and the times at which various levels of decompression were indicated or exceeded. Each depth profile was replicated three times for most models. Decompression isopleths for no-stop dives indicated that computers tended to be more conservative than standard decompression tables at depths shallower than 30 msw but less conservative between 30-50 msw. For dives requiring decompression, computers were predominantly more conservative than tables across the whole depth range tested. There was considerable variation between models in the times permitted at all of the depth/decompression combinations. The present study would support the use of some dive computers for controlling single, square-wave diving by some occupational sectors. The choice of which makes and models to use would have to consider their specific dive management characteristics which may additionally be affected by the intended operational depth and whether staged decompression was permitted.
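    The decompression algorithms inside such computers are typically built on Haldane-style exponential gas loading of parallel tissue compartments; a single-compartment sketch (the half-times and pressures here are illustrative, and commercial units differ in their compartment sets and tolerated-supersaturation rules, which is the variation the study measured):

```python
import math

def compartment_tension(p_start, p_ambient, minutes, half_time):
    """Haldane exponential inert-gas loading of one tissue compartment:
    tension relaxes toward the ambient inert-gas pressure with the
    compartment's characteristic half-time (minutes)."""
    k = math.log(2.0) / half_time
    return p_ambient + (p_start - p_ambient) * math.exp(-k * minutes)
```

    After one half-time the compartment has closed half the gap to ambient pressure; no-stop limits fall out of comparing each compartment's tension against its tolerated maximum at the planned ascent depth.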

  1. Impact of input data uncertainty on environmental exposure assessment models: A case study for electromagnetic field modelling from mobile phone base stations.

    PubMed

    Beekhuizen, Johan; Heuvelink, Gerard B M; Huss, Anke; Bürgi, Alfred; Kromhout, Hans; Vermeulen, Roel

    2014-11-01

    With the increased availability of spatial data and computing power, spatial prediction approaches have become a standard tool for exposure assessment in environmental epidemiology. However, such models are largely dependent on accurate input data. Uncertainties in the input data can therefore have a large effect on model predictions, but are rarely quantified. With Monte Carlo simulation we assessed the effect of input uncertainty on the prediction of radio-frequency electromagnetic fields (RF-EMF) from mobile phone base stations at 252 receptor sites in Amsterdam, The Netherlands. The impact on ranking and classification was determined by computing the Spearman correlations and weighted Cohen's kappas (based on tertiles of the RF-EMF exposure distribution) between modelled values and RF-EMF measurements performed at the receptor sites. The uncertainty in modelled RF-EMF levels was large, with a median coefficient of variation of 1.5. Uncertainty in receptor site height, building damping and building height contributed most to model output uncertainty. For exposure ranking and classification, the heights of buildings and receptor sites were the most important sources of uncertainty, followed by building damping and antenna and site location. Uncertainty in antenna power, tilt, height and direction had a smaller impact on model performance. We quantified the effect of input data uncertainty on the prediction accuracy of an RF-EMF environmental exposure model, thereby identifying the most important sources of uncertainty and estimating the total uncertainty stemming from potential errors in the input data. This approach can be used to optimize the model and better interpret model output.
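    The Monte Carlo approach used here, propagating input uncertainty through the exposure model and summarizing the spread of the output, can be sketched with a toy propagation model (the free-space field formula and the input distributions below are illustrative stand-ins for the full RF-EMF model and its geographic inputs):

```python
import numpy as np

rng = np.random.default_rng(0)

def toy_field_model(power_w, distance_m):
    """Free-space field strength E = sqrt(30 * P) / d (V/m): a toy
    stand-in for the full RF-EMF propagation model."""
    return np.sqrt(30.0 * power_w) / distance_m

n = 10_000
# sample uncertain inputs from assumed (illustrative) distributions
power = rng.normal(40.0, 5.0, n).clip(min=1.0)     # antenna power, W
dist = rng.normal(100.0, 20.0, n).clip(min=10.0)   # receptor distance, m
field = toy_field_model(power, dist)
cv = field.std() / field.mean()  # coefficient of variation of the output
```

    Repeating this while perturbing one input at a time apportions the output uncertainty to individual inputs, which is how the study ranked receptor-site height, building damping, and building height as the dominant contributors.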

  2. Reconstruction of Exposure to m-Xylene from Human Biomonitoring Data Using PBPK Modelling, Bayesian Inference, and Markov Chain Monte Carlo Simulation

    PubMed Central

    McNally, Kevin; Cotton, Richard; Cocker, John; Jones, Kate; Bartels, Mike; Rick, David; Price, Paul; Loizou, George

    2012-01-01

    There are numerous biomonitoring programs, both recent and ongoing, to evaluate environmental exposure of humans to chemicals. Due to the lack of exposure and kinetic data, the correlation of biomarker levels with exposure concentrations leads to difficulty in utilizing biomonitoring data for biological guidance values. Exposure reconstruction or reverse dosimetry is the retrospective interpretation of external exposure consistent with biomonitoring data. We investigated the integration of physiologically based pharmacokinetic modelling, global sensitivity analysis, Bayesian inference, and Markov chain Monte Carlo simulation to obtain a population estimate of inhalation exposure to m-xylene. We used exhaled breath and venous blood m-xylene and urinary 3-methylhippuric acid measurements from a controlled human volunteer study in order to evaluate the ability of our computational framework to predict known inhalation exposures. We also investigated the importance of model structure and dimensionality with respect to its ability to reconstruct exposure. PMID:22719759

  3. Computational Toxicology of Chloroform: Reverse Dosimetry Using Bayesian Inference, Markov Chain Monte Carlo Simulation, and Human Biomonitoring Data

    PubMed Central

    Lyons, Michael A.; Yang, Raymond S.H.; Mayeno, Arthur N.; Reisfeld, Brad

    2008-01-01

    Background One problem of interpreting population-based biomonitoring data is the reconstruction of corresponding external exposure in cases where no such data are available. Objectives We demonstrate the use of a computational framework that integrates physiologically based pharmacokinetic (PBPK) modeling, Bayesian inference, and Markov chain Monte Carlo simulation to obtain a population estimate of environmental chloroform source concentrations consistent with human biomonitoring data. The biomonitoring data consist of chloroform blood concentrations measured as part of the Third National Health and Nutrition Examination Survey (NHANES III), and for which no corresponding exposure data were collected. Methods We used a combined PBPK and shower exposure model to consider several routes and sources of exposure: ingestion of tap water, inhalation of ambient household air, and inhalation and dermal absorption while showering. We determined posterior distributions for chloroform concentration in tap water and ambient household air using U.S. Environmental Protection Agency Total Exposure Assessment Methodology (TEAM) data as prior distributions for the Bayesian analysis. Results Posterior distributions for exposure indicate that 95% of the population represented by the NHANES III data had likely chloroform exposures ≤ 67 μg/L in tap water and ≤ 0.02 μg/L in ambient household air. Conclusions Our results demonstrate the application of computer simulation to aid in the interpretation of human biomonitoring data in the context of the exposure–health evaluation–risk assessment continuum. These results should be considered as a demonstration of the method and can be improved with the addition of more detailed data. PMID:18709138
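    The reverse-dosimetry machinery shared by this and the preceding record, Bayesian inference of an external exposure consistent with a biomarker measurement via Markov chain Monte Carlo, can be sketched with a toy one-parameter model in place of the full PBPK model (the linear kinetic slope, noise level, and sampler settings are invented for illustration):

```python
import math
import random

random.seed(1)

def loglik(exposure, observed, slope=0.5, sd=0.2):
    """Toy kinetic link: biomarker = slope * exposure + N(0, sd).
    A PBPK model would replace this linear prediction."""
    pred = slope * exposure
    return -0.5 * ((observed - pred) / sd) ** 2

def metropolis(observed, n_iter=5000, step=0.5):
    """Metropolis sampler over the (non-negative) exposure level,
    with a symmetric reflect-at-zero Gaussian proposal."""
    x = 1.0
    ll = loglik(x, observed)
    samples = []
    for _ in range(n_iter):
        prop = abs(x + random.gauss(0.0, step))
        ll_prop = loglik(prop, observed)
        if math.log(random.random()) < ll_prop - ll:  # accept/reject
            x, ll = prop, ll_prop
        samples.append(x)
    return samples
```

    After burn-in, the samples approximate the posterior distribution of exposure given the biomarker; with a biomarker reading of 1.0 and slope 0.5, the chain concentrates around an exposure of 2.0.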

  4. Model of spacecraft atomic oxygen and solar exposure microenvironments

    NASA Technical Reports Server (NTRS)

    Bourassa, R. J.; Pippin, H. G.

    1993-01-01

    Computer models of environmental conditions in Earth orbit are needed for the following reasons: (1) derivation of material performance parameters from orbital test data, (2) evaluation of spacecraft hardware designs, (3) prediction of material service life, and (4) scheduling spacecraft maintenance. To meet these needs, Boeing has developed programs for modeling atomic oxygen (AO) and solar radiation exposures. The model allows determination of AO and solar ultraviolet (UV) radiation exposures for spacecraft surfaces (1) in arbitrary orientations with respect to the direction of spacecraft motion, (2) overall ranges of solar conditions, and (3) for any mission duration. The models have been successfully applied to prediction of experiment environments on the Long Duration Exposure Facility (LDEF) and for analysis of selected hardware designs for deployment on other spacecraft. The work on these models has been reported at previous LDEF conferences. Since publication of these reports, a revision has been made to the AO calculation for LDEF, and further work has been done on the microenvironments model for solar exposure.

  5. Functionalized anatomical models for EM-neuron Interaction modeling

    NASA Astrophysics Data System (ADS)

    Neufeld, Esra; Cassará, Antonino Mario; Montanaro, Hazael; Kuster, Niels; Kainz, Wolfgang

    2016-06-01

    The understanding of interactions between electromagnetic (EM) fields and nerves is crucial in contexts ranging from therapeutic neurostimulation to low-frequency EM exposure safety. To properly consider the impact of in vivo induced field inhomogeneity on non-linear neuronal dynamics, coupled EM-neuronal dynamics modeling is required. For that purpose, novel functionalized computable human phantoms have been developed. Their implementation and the systematic verification of the integrated anisotropic quasi-static EM solver and neuronal dynamics modeling functionality, based on the method of manufactured solutions and numerical reference data, are described. Electric and magnetic stimulation of the ulnar and sciatic nerve were modeled to help understand a range of controversial issues related to the magnitude and optimal determination of strength-duration (SD) time constants. The results indicate the importance of considering the stimulation-specific inhomogeneous field distributions (especially at tissue interfaces), realistic models of non-linear neuronal dynamics, very short pulses, and suitable SD extrapolation models. These results and the functionalized computable phantom will influence and support the development of safe and effective neuroprosthetic devices and novel electroceuticals. Furthermore, they will assist the evaluation of existing low-frequency exposure standards for the entire population under all exposure conditions.

  6. Linkage Of Exposure And Effects Using Genomics, Proteomics, And Metabolomics In Small Fish Models

    EPA Science Inventory

    Poster for the BOSC Computational Toxicology Research Program review. Knowledge of possible toxic mechanisms/modes of action (MOA) of chemicals can provide valuable insights as to appropriate methods for assessing exposure and effects, thereby reducing uncertainties related to e...

  7. Models, Measurements, and Local Decisions: Assessing and ...

    EPA Pesticide Factsheets

    This presentation includes a combination of modeling and measurement results to characterize near-source air quality in Newark, New Jersey with consideration of how this information could be used to inform decision making to reduce risk of health impacts. Decisions could include either exposure or emissions reduction, and a host of stakeholders, including residents, academics, NGOs, local and federal agencies. This presentation includes results from the C-PORT modeling system, and from a citizen science project from the local area. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.

  8. The Role of Dosimetry in High-Quality EMI Risk Assessment

    DTIC Science & Technology

    2006-09-14

    wireless communication usage and exposure to different parts of the body (especially for children and foetuses), including multiple exposure from... Calculation of induced electric fields in pregnant women and in the foetus is urgently needed. Very little computation has been carried out on... advanced models of the pregnant human and the foetus with appropriate anatomical modelling. It is important to assess possible enhanced induction of

  9. Modeling the United States government's economic cost of noise-induced hearing loss for a military population.

    PubMed

    Tufts, Jennifer B; Weathersby, Paul K; Rodriguez, Francisco A

    2010-05-01

    The purpose of this paper is to demonstrate the feasibility and utility of developing economic cost models for noise-induced hearing loss (NIHL). First, we outline an economic model of NIHL for a population of US Navy sailors with an "industrial"-type noise exposure. Next, we describe the effect on NIHL-related cost of varying the two central model inputs--the noise-exposure level and the duration of exposure. Such an analysis can help prioritize the areas to which limited resources for reducing NIHL-related costs should be devoted. NIHL-related costs borne by the US government were computed on a yearly basis using a finite element approach that took into account varying levels of susceptibility to NIHL. Predicted hearing thresholds for the population were computed with ANSI S3.44-1996 and then used as the basis for the calculation of NIHL-related costs. Annual and cumulative costs were tracked. Noise-exposure level and duration were systematically varied to determine their effects on the expected lifetime NIHL-related cost of a specific US Navy sailor population. Our nominal noise-exposure case [93 dB(A) for six years] yielded a total expected lifetime cost of US $13,472 per sailor, with plausible lower and upper bounds of US $2,500 and US $26,000. Starting with the nominal case, a decrease of 50% in exposure level or duration would yield cost savings of approximately 23% and 19%, respectively. We concluded that a reduction in noise level would be somewhat more cost-effective than the same percentage reduction in years of exposure. Our economic cost model can be used to estimate the changes in NIHL-related costs that would result from changes in noise-exposure level and/or duration for a single military population. Although the model is limited at present, suggestions are provided for adapting it to civilian populations.

  10. The Air Quality Model Evaluation International Initiative ...

    EPA Pesticide Factsheets

    This presentation provides an overview of the Air Quality Model Evaluation International Initiative (AQMEII). It contains a synopsis of the three phases of AQMEII, including objectives, logistics, and timelines. It also provides a number of examples of analyses conducted through AQMEII with a particular focus on past and future analyses of deposition. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.

  11. HYDROCARBON SPILL EXPOSURE ASSESSMENT MODELING

    EPA Science Inventory

    Hydrocarbon spills impact drinking water supplies at down gradient locations. onventional finite difference and finite element models of multiphase, multicomponent flow have extreme requirements for both computer time and site data. ite data and the intent of the modeling often d...

  12. Ku-Band rendezvous radar performance computer simulation model

    NASA Technical Reports Server (NTRS)

    Magnusson, H. G.; Goff, M. F.

    1984-01-01

    All work performed on the Ku-band rendezvous radar performance computer simulation model program since the release of the preliminary final report is summarized. Developments on the program fall into three distinct categories: (1) modifications to the existing Ku-band radar tracking performance computer model; (2) the addition of a highly accurate, nonrealtime search and acquisition performance computer model to the total software package developed on this program; and (3) development of radar cross section (RCS) computation models for three additional satellites. All changes in the tracking model involved improvements in the automatic gain control (AGC) and the radar signal strength (RSS) computer models. Although the search and acquisition computer models were developed under the auspices of the Hughes Aircraft Company Ku-Band Integrated Radar and Communications Subsystem program office, they have been supplied to NASA as part of the Ku-band radar performance computer model package. Their purpose is to predict Ku-band acquisition performance for specific satellite targets on specific missions. The RCS models were developed for three satellites: the Long Duration Exposure Facility (LDEF) spacecraft, the Solar Maximum Mission (SMM) spacecraft, and the Space Telescopes.

  13. Ku-Band rendezvous radar performance computer simulation model

    NASA Astrophysics Data System (ADS)

    Magnusson, H. G.; Goff, M. F.

    1984-06-01

    All work performed on the Ku-band rendezvous radar performance computer simulation model program since the release of the preliminary final report is summarized. Developments on the program fall into three distinct categories: (1) modifications to the existing Ku-band radar tracking performance computer model; (2) the addition of a highly accurate, nonrealtime search and acquisition performance computer model to the total software package developed on this program; and (3) development of radar cross section (RCS) computation models for three additional satellites. All changes in the tracking model involved improvements in the automatic gain control (AGC) and the radar signal strength (RSS) computer models. Although the search and acquisition computer models were developed under the auspices of the Hughes Aircraft Company Ku-Band Integrated Radar and Communications Subsystem program office, they have been supplied to NASA as part of the Ku-band radar performance computer model package. Their purpose is to predict Ku-band acquisition performance for specific satellite targets on specific missions. The RCS models were developed for three satellites: the Long Duration Exposure Facility (LDEF) spacecraft, the Solar Maximum Mission (SMM) spacecraft, and the Space Telescopes.

  14. Computational Toxicology as Implemented by the U.S. EPA: Providing High Throughput Decision Support Tools for Screening and Assessing Chemical Exposure, Hazard and Risk

    EPA Science Inventory

    Computational toxicology is the application of mathematical and computer models to help assess chemical hazards and risks to human health and the environment. Supported by advances in informatics, high-throughput screening (HTS) technologies, and systems biology, the U.S. Environ...

  15. SAR exposure from UHF RFID reader in adult, child, pregnant woman, and fetus anatomical models.

    PubMed

    Fiocchi, Serena; Markakis, Ioannis A; Ravazzani, Paolo; Samaras, Theodoros

    2013-09-01

    The spread of radio frequency identification (RFID) devices in ubiquitous applications without their simultaneous exposure assessment could give rise to public concerns about their potential adverse health effects. Among the various RFID system categories, the ultra high frequency (UHF) RFID systems have recently started to be widely used in many applications. This study addresses a computational exposure assessment of the electromagnetic radiation generated by a realistic UHF RFID reader, quantifying the exposure levels in different exposure scenarios and subjects (two adults, four children, and two anatomical models of women 7 and 9 months pregnant). The results of the computations are presented in terms of the whole-body and peak spatial specific absorption rate (SAR) averaged over 10 g of tissue to allow comparison with the basic restrictions of the exposure guidelines. The SAR levels in the adults and children were below 0.02 and 0.8 W/kg in whole-body SAR and maximum peak SAR levels, respectively, for all tested positions of the antenna. On the contrary, exposure of pregnant women and fetuses resulted in maximum peak SAR(10 g) values close to the values suggested by the guidelines (2 W/kg) in some of the exposure scenarios with the antenna positioned in front of the abdomen and with a 100% duty cycle and 1 W radiated power. Copyright © 2013 Wiley Periodicals, Inc.
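
    The quantity being compared against the guideline limit above can be sketched with the textbook point-SAR formula, SAR = sigma*|E|^2/rho. This is not the study's dosimetric pipeline; the tissue properties and field values below are assumed purely for illustration:

```python
import numpy as np

def point_sar(e_rms, sigma, rho):
    """Local SAR in W/kg from RMS E-field (V/m), conductivity (S/m), density (kg/m3)."""
    return sigma * e_rms**2 / rho

def exceeds_limit(sar_values, limit_w_per_kg=2.0):
    """True if the peak of (already mass-averaged) SAR values exceeds the limit
    cited in the abstract for peak spatial SAR averaged over 10 g of tissue."""
    return max(sar_values) > limit_w_per_kg

# Illustrative RMS field values in a muscle-like tissue (assumed, not study data)
e_fields = np.array([10.0, 25.0, 40.0])            # V/m
sar = point_sar(e_fields, sigma=0.9, rho=1050.0)   # W/kg
print(sar.round(3), bool(exceeds_limit(sar)))
```

    The guideline comparison in the abstract additionally requires spatial averaging over 10 g of tissue and scaling by the reader's duty cycle, which this sketch omits.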

  16. Can Computational Models Be Used to Assess the Developmental Toxicity of Environmental Exposures?

    EPA Science Inventory

    Environmental causes of birth defects include maternal exposure to drugs, chemicals, or physical agents. Environmental factors account for an estimated 3–7% of birth defects although a broader contribution is likely based on the mother’s general health status and genetic blueprin...

  17. Computational Fluid Dynamics Modeling of Bacillus anthracis Spore Deposition in Rabbit and Human Respiratory Airways

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kabilan, Senthil; Suffield, Sarah R.; Recknagle, Kurtis P.

    Three-dimensional computational fluid dynamics and Lagrangian particle deposition models were developed to compare the deposition of aerosolized Bacillus anthracis spores in the respiratory airways of a human with that of the rabbit, a species commonly used in the study of anthrax disease. The respiratory airway geometries for each species were derived from computed tomography (CT) or µCT images. Both models encompassed airways that extended from the external nose to the lung with a total of 272 outlets in the human model and 2878 outlets in the rabbit model. All simulations of spore deposition were conducted under transient, inhalation-exhalation breathing conditions using average species-specific minute volumes. The highest exposure concentration was modeled in the rabbit based upon prior acute inhalation studies. For comparison, the human simulation was also conducted at the same concentration. Results demonstrated that regional spore deposition patterns were sensitive to airway geometry and ventilation profiles. Due to the complex airway geometries in the rabbit nose, higher spore deposition efficiency was predicted in the upper conducting airways compared to the human at the same air concentration of anthrax spores. As a result, higher particle deposition was predicted in the conducting airways and deep lung of the human compared to the rabbit lung due to differences in airway branching pattern. This information can be used to refine published and ongoing biokinetic models of inhalation anthrax spore exposures, which currently estimate deposited spore concentrations based solely upon exposure concentrations and inhaled doses that do not factor in species-specific anatomy and physiology.
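
    The regional comparison described above rests on simple bookkeeping over particle fates: a region's deposition efficiency is the number of particles deposited there divided by the number entering it. A hedged sketch of that bookkeeping, with invented counts (not study data):

```python
def deposition_efficiency(deposited, entering):
    """Fraction of particles entering a region that deposit in it."""
    return deposited / entering if entering else 0.0

# Illustrative particle counts per airway region (assumed for demonstration)
regions = {
    "nasal":      {"entering": 100_000, "deposited": 38_000},
    "conducting": {"entering": 62_000,  "deposited": 9_000},
    "deep_lung":  {"entering": 53_000,  "deposited": 21_000},
}

for name, r in regions.items():
    eff = deposition_efficiency(r["deposited"], r["entering"])
    print(f"{name}: {eff:.1%}")
```

    In the study's comparison, the species difference arises because the rabbit's complex nasal geometry filters more spores upstream, leaving fewer to reach the deep lung than in the human.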

  18. Relations between work and upper extremity musculoskeletal problems (UEMSP) and the moderating role of psychosocial work factors on the relation between computer work and UEMSP.

    PubMed

    Nicolakakis, Nektaria; Stock, Susan R; Abrahamowicz, Michal; Kline, Rex; Messing, Karen

    2017-11-01

    Computer work has been identified as a risk factor for upper extremity musculoskeletal problems (UEMSP). But few studies have investigated how psychosocial and organizational work factors affect this relation. Nor have gender differences in the relation between UEMSP and these work factors  been studied. We sought to estimate: (1) the association between UEMSP and a range of physical, psychosocial and organizational work exposures, including the duration of computer work, and (2) the moderating effect of psychosocial work exposures on the relation between computer work and UEMSP. Using 2007-2008 Québec survey data on 2478 workers, we carried out gender-stratified multivariable logistic regression modeling and two-way interaction analyses. In both genders, odds of UEMSP were higher with exposure to high physical work demands and emotionally demanding work. Additionally among women, UEMSP were associated with duration of occupational computer exposure, sexual harassment, tense situations when dealing with clients, high quantitative demands and lack of prospects for promotion, and among men, with low coworker support, episodes of unemployment, low job security and contradictory work demands. Among women, the effect of computer work on UEMSP was considerably increased in the presence of emotionally demanding work, and may also be moderated by low recognition at work, contradictory work demands, and low supervisor support. These results suggest that the relations between UEMSP and computer work are moderated by psychosocial work exposures and that the relations between working conditions and UEMSP are somewhat different for each gender, highlighting the complexity of these relations and the importance of considering gender.

  19. Developmental and Life-Stage Physiologically-Based Pharmacokinetic (PBPK) Models in Humans and Animal Models.

    EPA Science Inventory

    PBPK models provide a computational framework for incorporating pertinent physiological and biochemical information to estimate in vivo levels of xenobiotics in biological tissues. In general, PBPK models are used to correlate exposures to target tissue levels of chemicals and th...

  20. Improvements in Modelling Bystander and Resident Exposure to Pesticide Spray Drift: Investigations into New Approaches for Characterizing the 'Collection Efficiency' of the Human Body.

    PubMed

    Butler Ellis, M Clare; Kennedy, Marc C; Kuster, Christian J; Alanis, Rafael; Tuck, Clive R

    2018-05-28

    The BREAM (Bystander and Resident Exposure Assessment Model) (Kennedy et al. in BREAM: A probabilistic bystander and resident exposure assessment model of spray drift from an agricultural boom sprayer. Comput Electron Agric 2012;88:63-71) for bystander and resident exposure to spray drift from boom sprayers has recently been incorporated into the European Food Safety Authority (EFSA) guidance for determining non-dietary exposures of humans to plant protection products. The component of BREAM, which relates airborne spray concentrations to bystander and resident dermal exposure, has been reviewed to identify whether it is possible to improve this and its description of variability captured in the model. Two approaches have been explored: a more rigorous statistical analysis of the empirical data and a semi-mechanistic model based on established studies combined with new data obtained in a wind tunnel. A statistical comparison between field data and model outputs was used to determine which approach gave the better prediction of exposures. The semi-mechanistic approach gave the better prediction of experimental data and resulted in a reduction in the proposed regulatory values for the 75th and 95th percentiles of the exposure distribution.
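
    BREAM is probabilistic, and the regulatory values mentioned above are percentiles of a predicted exposure distribution. A minimal sketch of that final step, using a lognormal stand-in for dermal exposure (distribution shape and parameters are assumed, not BREAM's):

```python
import numpy as np

# Monte Carlo draw from an assumed exposure distribution (arbitrary units);
# BREAM's actual distribution is built from drift physics and empirical data.
rng = np.random.default_rng(0)
exposure = rng.lognormal(mean=0.0, sigma=0.8, size=100_000)

# The 75th and 95th percentiles are the summary statistics the abstract
# says feed into regulatory values.
p75, p95 = np.percentile(exposure, [75, 95])
print(f"75th percentile: {p75:.2f}, 95th percentile: {p95:.2f}")
```

    Replacing the empirical collection-efficiency model with the semi-mechanistic one shifts this whole distribution, which is why the proposed 75th and 95th percentile regulatory values changed.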

  1. Computer Simulation of Embryonic Systems: What can a virtual embryo teach us about developmental toxicity? Microcephaly: Computational and organotypic modeling of a complex human birth defect (seminar and lecture - Thomas Jefferson University, Philadelphia, PA)

    EPA Science Inventory

    (1) Standard practice for assessing developmental toxicity is the observation of apical endpoints (intrauterine death, fetal growth retardation, structural malformations) in pregnant rats/rabbits following exposure during organogenesis. EPA’s computational toxicology research pro...

  2. Impacts of Lateral Boundary Conditions on US Ozone ...

    EPA Pesticide Factsheets

    Chemical boundary conditions are a key input to regional-scale photochemical models. In this study, we perform annual simulations over North America with chemical boundary conditions prepared from two global models (GEOS-CHEM and Hemispheric CMAQ). Results indicate that the impacts of different boundary conditions on ozone can be significant throughout the year. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.

  3. 49 CFR Appendix A to Part 227 - Noise Exposure Computation

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 49 Transportation 4 2011-10-01 2011-10-01 false Noise Exposure Computation A Appendix A to Part... ADMINISTRATION, DEPARTMENT OF TRANSPORTATION OCCUPATIONAL NOISE EXPOSURE Pt. 227, App. A Appendix A to Part 227—Noise Exposure Computation This appendix is mandatory. I. Computation of Employee Noise Exposure A...
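
    The computation this appendix tabulates follows the general form used in US occupational noise rules. A hedged sketch, assuming the OSHA-style 90 dB(A) criterion level and 5 dB exchange rate that Part 227 follows; the appendix tables themselves remain the authoritative reference:

```python
import math

def reference_duration_hours(level_dba):
    """Permissible duration at a sound level: T = 8 / 2^((L - 90) / 5) hours."""
    return 8.0 / 2 ** ((level_dba - 90.0) / 5.0)

def noise_dose(exposures):
    """Dose in percent: D = 100 * sum(C_i / T_i) over (hours, dBA) segments."""
    return 100.0 * sum(c / reference_duration_hours(l) for c, l in exposures)

def twa(dose_percent):
    """Eight-hour time-weighted average: TWA = 16.61 * log10(D / 100) + 90."""
    return 16.61 * math.log10(dose_percent / 100.0) + 90.0

# Example shift: 4 h at 90 dBA plus 2 h at 95 dBA
d = noise_dose([(4.0, 90.0), (2.0, 95.0)])
print(round(d, 1), round(twa(d), 1))  # → 100.0 90.0
```

    A dose of 100% corresponds exactly to the criterion exposure (8 hours at 90 dBA), which is why the TWA comes out to 90 dB(A) in this example.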

  4. 49 CFR Appendix A to Part 227 - Noise Exposure Computation

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 4 2010-10-01 2010-10-01 false Noise Exposure Computation A Appendix A to Part... ADMINISTRATION, DEPARTMENT OF TRANSPORTATION OCCUPATIONAL NOISE EXPOSURE Pt. 227, App. A Appendix A to Part 227—Noise Exposure Computation This appendix is mandatory. I. Computation of Employee Noise Exposure A...

  5. Computational modeling of the amphibian thyroid axis ...

    EPA Pesticide Factsheets

    In vitro screening of chemicals for bioactivity together with computational modeling are beginning to replace animal toxicity testing in support of chemical risk assessment. To facilitate this transition, an amphibian thyroid axis model has been developed to describe thyroid homeostasis during Xenopus laevis pro-metamorphosis. The model simulates the dynamic relationships of normal thyroid biology throughout this critical period of amphibian development and includes molecular initiating events (MIEs) for thyroid axis disruption to allow in silico simulations of hormone levels following chemical perturbations. One MIE that has been formally described using the adverse outcome pathway (AOP) framework is thyroperoxidase (TPO) inhibition. The goal of this study was to refine the model parameters and validate model predictions by generating dose-response and time-course biochemical data following exposure to three TPO inhibitors, methimazole, 6-propylthiouracil and 2-mercaptobenzothiazole. Key model variables including gland and blood thyroid hormone (TH) levels were compared to empirical values measured in biological samples at 2, 4, 7 and 10 days following initiation of exposure at Nieuwkoop and Faber (NF) stage 54 (onset of pro-metamorphosis). The secondary objective of these studies was to relate depleted blood TH levels to delayed metamorphosis, the adverse apical outcome. Delayed metamorphosis was evaluated by continuing exposure with a subset of larvae until a

  6. Development of PIMAL: Mathematical Phantom with Moving Arms and Legs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Akkurt, Hatice; Eckerman, Keith F.

    2007-05-01

    The computational model of the human anatomy (phantom) has gone through many revisions since its initial development in the 1970s. The computational phantom model currently used by the Nuclear Regulatory Commission (NRC) is based on a model published in 1974. Hence, the phantom model used by the NRC staff was missing some organs (e.g., neck, esophagus) and tissues. Further, locations of some organs were inappropriate (e.g., thyroid). Moreover, all the computational phantoms were assumed to be in the vertical-upright position. However, many occupational radiation exposures occur with the worker in other positions. In the first phase of this work, updates on the computational phantom models were reviewed and a revised phantom model, which includes the updates for the relevant organs and compositions, was identified. This revised model was adopted as the starting point for this development work, and hence a series of radiation transport computations, using the Monte Carlo code MCNP5, was performed. The computational results were compared against values reported by the International Commission on Radiological Protection (ICRP) in Publication 74. For some of the organs (e.g., thyroid), there were discrepancies between the computed values and the results reported in ICRP-74. The reasons behind these discrepancies have been investigated and are discussed in this report. Additionally, sensitivity computations were performed to determine the sensitivity of the organ doses to certain parameters, including the composition and cross sections used in the simulations. To assess the dose for more realistic exposure configurations, the phantom model was revised to enable flexible positioning of the arms and legs. Furthermore, to reduce the user time for analyses, a graphical user interface (GUI) was developed. The GUI can be used to visualize the positioning of the arms and legs until the desired posture is achieved, to generate the input file, invoke the computations, and extract the organ dose values from the MCNP5 output file. In this report, the main features of the phantom model with moving arms and legs and the user interface are described.

  7. A new surrogate modeling technique combining Kriging and polynomial chaos expansions – Application to uncertainty analysis in computational dosimetry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kersaudy, Pierric, E-mail: pierric.kersaudy@orange.com; Whist Lab, 38 avenue du Général Leclerc, 92130 Issy-les-Moulineaux; ESYCOM, Université Paris-Est Marne-la-Vallée, 5 boulevard Descartes, 77700 Marne-la-Vallée

    2015-04-01

    In numerical dosimetry, the recent advances in high performance computing have led to a strong reduction of the required computational time to assess the specific absorption rate (SAR) characterizing the human exposure to electromagnetic waves. However, this procedure remains time-consuming and a single simulation can require several hours. As a consequence, the influence of uncertain input parameters on the SAR cannot be analyzed using crude Monte Carlo simulation. The solution presented here to perform such an analysis is surrogate modeling. This paper proposes a novel approach to build such a surrogate model from a design of experiments. Considering a sparse representation of the polynomial chaos expansions using least-angle regression as a selection algorithm to retain the most influential polynomials, this paper proposes to use the selected polynomials as regression functions for the universal Kriging model. The leave-one-out cross validation is used to select the optimal number of polynomials in the deterministic part of the Kriging model. The proposed approach, called LARS-Kriging-PC modeling, is applied to three benchmark examples and then to a full-scale metamodeling problem involving the exposure of a numerical fetus model to a femtocell device. The performances of the LARS-Kriging-PC are compared to an ordinary Kriging model and to a classical sparse polynomial chaos expansion. The LARS-Kriging-PC appears to have better performances than the two other approaches. A significant accuracy improvement is observed compared to the ordinary Kriging or to the sparse polynomial chaos depending on the studied case. This approach seems to be an effective compromise between the two other classical approaches. A global sensitivity analysis is finally performed on the LARS-Kriging-PC model of the fetus exposure problem.
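
    One ingredient of the LARS-Kriging-PC approach can be sketched in isolation: leave-one-out cross-validation to choose how many basis functions to keep in a least-squares surrogate. The basis below is plain one-dimensional polynomials, a toy stand-in for a polynomial chaos basis, and the data are synthetic:

```python
import numpy as np

def loo_error(X, y):
    """Mean squared leave-one-out residual for ordinary least squares,
    computed cheaply via the hat-matrix identity e_i / (1 - h_ii)."""
    H = X @ np.linalg.pinv(X.T @ X) @ X.T
    resid = y - H @ y
    return np.mean((resid / (1.0 - np.diag(H))) ** 2)

rng = np.random.default_rng(1)
x = rng.uniform(-1, 1, 40)
y = 1.0 + 2.0 * x - 1.5 * x**2 + rng.normal(0, 0.05, 40)  # quadratic ground truth

# Score surrogate bases of increasing degree and keep the LOO minimizer.
errors = []
for degree in range(1, 8):
    X = np.vander(x, degree + 1, increasing=True)  # columns 1, x, ..., x^degree
    errors.append(loo_error(X, y))
best_degree = 1 + int(np.argmin(errors))
print(best_degree)
```

    In the paper's method, the candidate terms are instead ranked by least-angle regression and the retained polynomials become the trend functions of a universal Kriging model; the LOO criterion plays the same model-selection role as here.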

  8. The EPA Comptox Chemistry Dashboard . (BOSC)

    EPA Science Inventory

    A consolidated web platform is necessary for researchers to access chemical information look-up, models and model predictions and linkages to Agency and public resources. This will provide access to: curated chemical structures, computed and measured physchem properties, exposure...

  9. Carpal tunnel syndrome and computer exposure at work in two large complementary cohorts.

    PubMed

    Mediouni, Z; Bodin, J; Dale, A M; Herquelot, E; Carton, M; Leclerc, A; Fouquet, N; Dumontier, C; Roquelaure, Y; Evanoff, B A; Descatha, A

    2015-09-09

    The boom in computer use and concurrent high rates in musculoskeletal complaints and carpal tunnel syndrome (CTS) among users have led to a controversy about a possible link. Most studies have used cross-sectional designs and shown no association. The present study used longitudinal data from two large complementary cohorts to evaluate a possible relationship between CTS and the performance of computer work. The Cosali cohort is a representative sample of a French working population that evaluated CTS using standardised clinical examinations and assessed self-reported computer use. The PrediCTS cohort study enrolled newly hired clerical, service and construction workers in several industries in the USA, evaluated CTS using symptoms and nerve conduction studies (NCS), and estimated exposures to computer work using a job exposure matrix. During a follow-up of 3-5 years, the association between new cases of CTS and computer work was calculated using logistic regression models adjusting for sex, age, obesity and relevant associated disorders. In the Cosali study, 1551 workers (41.8%) completed follow-up physical examinations; 36 (2.3%) participants were diagnosed with CTS. In the PrediCTS study, 711 workers (64.2%) completed follow-up evaluations, whereas 31 (4.3%) had new cases of CTS. The adjusted OR for the group with the highest exposure to computer use was 0.39 (0.17; 0.89) in the Cosali cohort and 0.16 (0.05; 0.59) in the PrediCTS cohort. Data from two large cohorts in two different countries showed no association between computer work and new cases of CTS among workers in diverse jobs with varying job exposures. CTS is far more common among workers in non-computer related jobs; prevention efforts and work-related compensation programmes should focus on workers performing forceful hand exertion. Published by the BMJ Publishing Group Limited. 

  10. Carpal tunnel syndrome and computer exposure at work in two large complementary cohorts

    PubMed Central

    Mediouni, Z; Bodin, J; Dale, A M; Herquelot, E; Carton, M; Leclerc, A; Fouquet, N; Dumontier, C; Roquelaure, Y; Evanoff, B A; Descatha, A

    2015-01-01

    Objectives The boom in computer use and concurrent high rates in musculoskeletal complaints and carpal tunnel syndrome (CTS) among users have led to a controversy about a possible link. Most studies have used cross-sectional designs and shown no association. The present study used longitudinal data from two large complementary cohorts to evaluate a possible relationship between CTS and the performance of computer work. Settings and participants The Cosali cohort is a representative sample of a French working population that evaluated CTS using standardised clinical examinations and assessed self-reported computer use. The PrediCTS cohort study enrolled newly hired clerical, service and construction workers in several industries in the USA, evaluated CTS using symptoms and nerve conduction studies (NCS), and estimated exposures to computer work using a job exposure matrix. Primary and secondary outcome measures During a follow-up of 3–5 years, the association between new cases of CTS and computer work was calculated using logistic regression models adjusting for sex, age, obesity and relevant associated disorders. Results In the Cosali study, 1551 workers (41.8%) completed follow-up physical examinations; 36 (2.3%) participants were diagnosed with CTS. In the PrediCTS study, 711 workers (64.2%) completed follow-up evaluations, whereas 31 (4.3%) had new cases of CTS. The adjusted OR for the group with the highest exposure to computer use was 0.39 (0.17; 0.89) in the Cosali cohort and 0.16 (0.05; 0.59) in the PrediCTS cohort. Conclusions Data from two large cohorts in two different countries showed no association between computer work and new cases of CTS among workers in diverse jobs with varying job exposures. CTS is far more common among workers in non-computer related jobs; prevention efforts and work-related compensation programmes should focus on workers performing forceful hand exertion. PMID:26353869
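
    The effect measure both cohort reports rely on is the odds ratio with a 95% confidence interval. The studies fit multivariable logistic regression models; as a hedged illustration of the underlying arithmetic only, here is the unadjusted OR from a 2x2 table with a Wald interval, using invented counts chosen to land near the Cosali estimate:

```python
import math

def odds_ratio_ci(a, b, c, d):
    """OR and Wald 95% CI for a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Illustrative counts (hypothetical, not the cohort data)
or_, lo, hi = odds_ratio_ci(8, 500, 28, 700)
print(f"OR = {or_:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```

    An OR below 1 with a confidence interval excluding 1, as in both cohorts, indicates lower odds of CTS in the highest computer-use group.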

  11. Relationship Between Vehicle Size and Fatality Risk in Model Year 1985-93 Passenger Cars and Light Trucks

    DOT National Transportation Integrated Search

    1997-01-01

    Fatality rates per million exposure years are computed by make, model, and model year, based on the crash experience of model year 1985-93 passenger cars and light trucks (pickups, vans, and sport utility vehicles) in the United States during calen...

  12. [Physically-based model of pesticide application for risk assessment of agricultural workers].

    PubMed

    Rubino, F M; Mandic-Rajcevic, S; Vianello, G; Brambilla, G; Colosio, C

    2012-01-01

    Due to their unavoidable toxicity to non-target organisms, including man, the use of Plant Protection Products requires a thorough risk assessment to rationally advise farmers on safe use procedures and protection equipment. Most information on active substances and formulations, such as dermal absorption rates and exposure limits, is available in the large body of regulatory data. Physically-based computational models can be used to forecast risk in real-life conditions (preventive assessment by 'exposure profiles'), to drive the cost-effective use of products and equipment, and to understand the sources of unexpected exposure.

  13. Pieces of the Puzzle: Tracking the Chemical Component of the ...

    EPA Pesticide Factsheets

    This presentation provides an overview of the risk assessment conducted at the U.S. EPA, as well as some research examples related to the exposome concept. This presentation also provides the recommendation of using two organizational and predictive frameworks for tracking chemical components in the exposome. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.

  14. Computational exposure assessment of electromagnetic fields generated by an RFID system for mother--newborn identity reconfirmation.

    PubMed

    Fiocchi, Serena; Parazzini, Marta; Paglialonga, Alessia; Ravazzani, Paolo

    2011-07-01

    Radio frequency identification (RFID) is an innovative technology currently applied in a large number of industrial and consumer applications. The spread of RFID technology does not correspond to a parallel increase in studies on its possible impact on health in terms of electromagnetic field (EMF) exposure. The aim of this paper is to estimate, by computational techniques, the EMF generated by passive RFID systems for mother-newborn identity reconfirmation. The computation was performed on realistic models of newborn and mother for three different reader positions. The compliance with EMF exposure guidelines was investigated as a function of the change in reader-tag specifications (magnetic field threshold and maximum distance of the reader to awake the tag) and time of use of the reader close to the body. The results show that attention should be paid to the identification of the optimal reader-tag technical specifications to be used in this type of application. That should be done by an accurate exposure assessment investigation, in particular for newborn exposure. The need to reduce the exposure time as much as possible indicates the importance of specific training on the practical applications of the RFID (DATALOGIC J-series, Bologna, Italy) device. Copyright © 2011 Wiley-Liss, Inc.

  15. A math model for high velocity sensoring with a focal plane shuttered camera.

    NASA Technical Reports Server (NTRS)

    Morgan, P.

    1971-01-01

    A new mathematical model is presented which describes the image produced by a focal plane shutter-equipped camera. The model is based upon the well-known collinearity condition equations and incorporates both the translational and rotational motion of the camera during the exposure interval. The first differentials of the model with respect to exposure interval, delta t, yield the general matrix expressions for image velocities which may be simplified to known cases. The exposure interval, delta t, may be replaced under certain circumstances with a function incorporating blind velocity and image position if desired. The model is tested using simulated Lunar Orbiter data and found to be computationally stable as well as providing excellent results, provided that some external information is available on the velocity parameters.

  16. An original imputation technique of missing data for assessing exposure of newborns to perchlorate in drinking water.

    PubMed

    Caron, Alexandre; Clement, Guillaume; Heyman, Christophe; Aernout, Eva; Chazard, Emmanuel; Le Tertre, Alain

    2015-01-01

    Incompleteness of epidemiological databases is a major drawback when it comes to analyzing data. We conceived an epidemiological study to assess the association between newborn thyroid function and exposure to perchlorate found in the tap water of the mother's home. Perchlorate exposure was known for only 9% of newborns. The aim of our study was to design, test and evaluate an original method for imputing the perchlorate exposure of newborns based on their maternity ward of birth. The first database contained an exhaustive collection of newborn thyroid function measurements from systematic neonatal screening; in this database, the municipality of residence of the newborn's mother was available only for 2012. Between 2004 and 2011, the closest data available was the municipality of the maternity ward of birth. Exposure was assessed using a second database, which contained the perchlorate levels for each municipality. We computed the catchment area of every maternity ward based on the French nationwide exhaustive database of inpatient stays. Municipality, and consequently perchlorate exposure, was imputed by a weighted draw in the catchment area. Missing values for the remaining covariates were imputed by chained equations. A linear mixed model was computed on each imputed dataset. We compared odds ratios (ORs) and 95% confidence intervals (95% CI) estimated on real versus imputed 2012 data. The same model was then carried out for the whole imputed database. The ORs estimated on 36,695 observations by our multiple imputation method are comparable to those from the real 2012 data. On the 394,979 observations of the whole database, the ORs remain stable but the 95% CIs tighten considerably. The model estimates computed on imputed data are similar to those calculated on real data. The main advantage of multiple imputation is to provide unbiased estimates of the ORs while maintaining their variances. 
Thus, our method will be used to increase the statistical power of future studies by including all 394,979 newborns.
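The weighted-draw imputation step described above can be sketched in a few lines. Everything below is illustrative: the municipality names, catchment weights, and perchlorate levels are invented for the example and are not taken from the study.

```python
import random

# Hypothetical catchment area for one maternity ward: each municipality is
# weighted by its assumed share of the ward's deliveries (invented numbers).
catchment = {"Municipality A": 0.60, "Municipality B": 0.30, "Municipality C": 0.10}

# Assumed perchlorate levels (ug/L) per municipality, standing in for the
# second database described above.
perchlorate = {"Municipality A": 4.0, "Municipality B": 15.0, "Municipality C": 1.0}

def impute_exposure(rng: random.Random) -> float:
    """Impute a newborn's exposure by a weighted draw of the mother's
    municipality from the maternity ward's catchment area."""
    town = rng.choices(list(catchment), weights=list(catchment.values()))[0]
    return perchlorate[town]

# Multiple imputation: repeating the draw propagates the uncertainty about
# the unknown municipality into several completed datasets.
rng = random.Random(42)
imputations = [impute_exposure(rng) for _ in range(5)]
```

Each completed dataset would then be analysed separately and the estimates pooled, as in the chained-equations workflow the abstract describes.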

  17. ADDRESSING ENVIRONMENTAL ENGINEERING CHALLENGES WITH COMPUTATIONAL FLUID DYNAMICS

    EPA Science Inventory

    In the field of environmental engineering, modeling tools are playing an ever larger role in addressing air quality issues, including source pollutant emissions, atmospheric dispersion and human exposure risks. More detailed modeling of environmental flows requires tools for c...

  18. EXPOSURE ASSESSMENT MODELING FOR HYDROCARBON SPILLS INTO THE SUBSURFACE

    EPA Science Inventory

    Hydrocarbons which enter the subsurface through spills or leaks may create serious, long-lived ground-water contamination problems. Conventional finite difference and finite element models of multiphase, multicomponent flow often have extreme requirements for both computer time an...

  19. PROGRAM PARAMS USERS GUIDE

    EPA Science Inventory

    PARAMS is a Windows-based computer program that implements 30 methods for estimating the parameters in indoor emissions source models, which are an essential component of indoor air quality (IAQ) and exposure models. These methods fall into eight categories: (1) the properties o...

  20. A Well-Mixed Computational Model for Estimating Room Air Levels of Selected Constituents from E-Vapor Product Use.

    PubMed

    Rostami, Ali A; Pithawalla, Yezdi B; Liu, Jianmin; Oldham, Michael J; Wagner, Karl A; Frost-Pineda, Kimberly; Sarkar, Mohamadi A

    2016-08-16

    Concerns have been raised in the literature for the potential of secondhand exposure from e-vapor product (EVP) use. It would be difficult to experimentally determine the impact of various factors on secondhand exposure including, but not limited to, room characteristics (indoor space size, ventilation rate), device specifications (aerosol mass delivery, e-liquid composition), and use behavior (number of users and usage frequency). Therefore, a well-mixed computational model was developed to estimate the indoor levels of constituents from EVPs under a variety of conditions. The model is based on physical and thermodynamic interactions between aerosol, vapor, and air, similar to indoor air models referred to by the Environmental Protection Agency. The model results agree well with measured indoor air levels of nicotine from two sources: smoking machine-generated aerosol and aerosol exhaled from EVP use. Sensitivity analysis indicated that increasing air exchange rate reduces room air level of constituents, as more material is carried away. The effect of the amount of aerosol released into the space due to variability in exhalation was also evaluated. The model can estimate the room air level of constituents as a function of time, which may be used to assess the level of non-user exposure over time.
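A single-zone mass balance of the kind referred to above can be illustrated with a minimal sketch. The volume, air exchange rate, and emission rate below are assumptions for illustration only, not values from the study, and the real model additionally tracks aerosol-vapor partitioning and deposition.

```python
# Minimal well-mixed (single-zone) indoor air mass balance -- a simplified
# sketch of the class of model described above; all numbers are assumed.
V = 30.0      # room volume, m^3
ach = 1.0     # air exchange rate, 1/h
S = 0.2       # constant emission rate of a constituent, mg/h

def concentration(t_hours: float, dt: float = 0.001) -> float:
    """Integrate dC/dt = S/V - ach*C with forward Euler, C(0) = 0 (mg/m^3)."""
    c = 0.0
    for _ in range(int(t_hours / dt)):
        c += dt * (S / V - ach * c)
    return c

# The analytical steady state is S/(V*ach); the simulation approaches it,
# and a higher air exchange rate lowers it -- the sensitivity the abstract notes.
steady = S / (V * ach)
```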

  1. Quantitative disease progression model of α‐1 proteinase inhibitor therapy on computed tomography lung density in patients with α‐1 antitrypsin deficiency

    PubMed Central

    Rogers, James A.; Vit, Oliver; Bexon, Martin; Sandhaus, Robert A.; Burdon, Jonathan; Chorostowska‐Wynimko, Joanna; Thompson, Philip; Stocks, James; McElvaney, Noel G.; Chapman, Kenneth R.; Edelman, Jonathan M.

    2017-01-01

    Aims: Early‐onset emphysema attributed to α‐1 antitrypsin deficiency (AATD) is frequently overlooked and undertreated. RAPID‐RCT/RAPID‐OLE, the largest clinical trials of purified human α‐1 proteinase inhibitor (A1‐PI; 60 mg kg–1 week–1) therapy completed to date, demonstrated for the first time that A1‐PI is clinically effective in slowing lung tissue loss in AATD. A post hoc pharmacometric analysis was undertaken to further explore dose, exposure and response. Methods: A disease progression model was constructed, utilizing observed A1‐PI exposure and lung density decline rates (measured by computed tomography) from RAPID‐RCT/RAPID‐OLE, to predict effects of population variability and higher doses on A1‐PI exposure and clinical response. Dose–exposure and exposure–response relationships were characterized using nonlinear and linear mixed effects models, respectively. The dose–exposure model predicts summary exposures and not individual concentration kinetics; covariates included baseline serum A1‐PI, forced expiratory volume in 1 s and body weight. The exposure–response model relates A1‐PI exposure to lung density decline rate at varying exposure levels. Results: A dose of 60 mg kg–1 week–1 achieved trough serum levels >11 μmol l–1 (putative ‘protective threshold’) in ≥98% of patients. Dose–exposure–response simulations revealed increasing separation between A1‐PI and placebo in the proportions of patients achieving higher reductions in lung density decline rate; improvements in decline rates ≥0.5 g l–1 year–1 occurred more often in patients receiving A1‐PI: 63 vs. 12%. Conclusion: Weight‐based A1‐PI dosing reliably raises serum levels above the 11 μmol l–1 threshold. However, our exposure–response simulations question whether this is the maximal, clinically effective threshold for A1‐PI therapy in AATD. The model suggested higher doses of A1‐PI would yield greater clinical effects. PMID:28662542

  2. Evaluation of the Community Multiscale Air Quality (CMAQ) Model Version 5.2

    EPA Science Inventory

    The Community Multiscale Air Quality (CMAQ) model is a state-of-the-science air quality model that simulates the emission, transport and fate of numerous air pollutants, including ozone and particulate matter. The Computational Exposure Division (CED) of the U.S. Environmental Pr...

  3. Evaluation of the Community Multi-scale Air Quality Model Version 5.2

    EPA Science Inventory

    The Community Multiscale Air Quality (CMAQ) model is a state-of-the-science air quality model that simulates the emission, transport and fate of numerous air pollutants, including ozone and particulate matter. The Computational Exposure Division (CED) of the U.S. Environmental Pr...

  4. A Bayesian context fear learning algorithm/automaton

    PubMed Central

    Krasne, Franklin B.; Cushman, Jesse D.; Fanselow, Michael S.

    2015-01-01

    Contextual fear conditioning is thought to involve the synaptic plasticity-dependent establishment in hippocampus of representations of to-be-conditioned contexts which can then become associated with USs in the amygdala. A conceptual and computational model of this process is proposed in which contextual attributes are assumed to be sampled serially and randomly during contextual exposures. Given this assumption, moment-to-moment information about such attributes will often be quite different from one exposure to another and, in particular, between exposures during which representations are created, exposures during which conditioning occurs, and during recall sessions. This presents challenges to current conceptual models of hippocampal function. In order to meet these challenges, our model's hippocampus was made to operate in different modes during representation creation and recall, and non-hippocampal machinery was constructed that controlled these hippocampal modes. This machinery uses a comparison between contextual information currently observed and information associated with existing hippocampal representations of familiar contexts to compute the Bayesian Weight of Evidence that the current context is (or is not) a known one, and it uses this value to assess the appropriateness of creation or recall modes. The model predicts a number of known phenomena such as the immediate shock deficit, spurious fear conditioning to contexts that are absent but similar to actually present ones, and modulation of conditioning by pre-familiarization with contexts. It also predicts a number of as yet unknown phenomena. PMID:26074792
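The Bayesian Weight of Evidence computation described above can be caricatured as an accumulating log-likelihood ratio over serially sampled attributes. The binary attributes and match/non-match probabilities below are invented simplifications for illustration, not the paper's actual machinery.

```python
from math import log

def weight_of_evidence(observed, p_match=0.8, p_nonmatch=0.2):
    """Accumulate the log-likelihood ratio that the current context is a
    known one, over serially sampled binary attributes (1 = the attribute
    agrees with a stored hippocampal representation). Probabilities are
    illustrative assumptions."""
    woe = 0.0
    for attr in observed:
        if attr:
            woe += log(p_match / p_nonmatch)       # evidence for "known"
        else:
            woe += log((1 - p_match) / (1 - p_nonmatch))  # evidence against
    return woe

# Mostly matching attributes push the evidence toward recall mode;
# mostly mismatching attributes push it toward creating a new representation.
familiar = weight_of_evidence([1, 1, 1, 1, 0, 1])
novel = weight_of_evidence([0, 0, 1, 0, 0, 0])
```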

  5. Ten-year ground exposure of composite materials used on the Bell Model 206L helicopter flight service program

    NASA Technical Reports Server (NTRS)

    Baker, Donald J.

    1994-01-01

    Residual strength results are presented for four composite material systems that have been exposed for up to 10 years to the environment at five different locations on the North American continent. The exposure locations are near where the Bell Model 206L helicopters, which participated in a flight service program sponsored by NASA Langley Research Center and the U.S. Army, were flying in daily commercial service. The composite material systems are (1) Kevlar-49 fabric/F-185 epoxy; (2) Kevlar-49 fabric/LRF-277 epoxy; (3) Kevlar-49 fabric/CE-306 epoxy; and (4) T-300 graphite/E-788 epoxy. Six replicates of each material were removed and tested after 1, 3, 5, 7, and 10 years of exposure. The average baseline strength was determined from testing six as-fabricated specimens. More than 1700 specimens have been tested. All specimens that were tested to determine their strength were painted with a polyurethane paint. Each set of specimens also included an unpainted panel for observing the weathering effects on the composite materials. A statistically based procedure has been used to determine the strength value above which at least 90 percent of the population is expected to fall with a 95-percent confidence level. The computed compression strengths are 80 to 90 percent of the baseline (no-exposure) strengths. The resulting compression strengths are approximately 8 percent below the population mean strengths. The computed short-beam-shear strengths are 83 to 92 percent of the baseline (no-exposure) strengths. The computed tension strength of all materials is 93 to 97 percent of the baseline (no-exposure) strengths.
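The "strength value above which at least 90 percent of the population is expected to fall with a 95-percent confidence level" is a one-sided statistical tolerance bound. A common normal-theory approximation (Natrella's k-factor) is sketched below; the specimen strengths are invented, and the paper's exact statistical procedure may differ.

```python
from math import sqrt
from statistics import NormalDist, mean, stdev

def one_sided_tolerance_factor(n: int, coverage: float = 0.90,
                               confidence: float = 0.95) -> float:
    """Approximate one-sided normal tolerance factor k (Natrella's
    approximation); the bound is then mean - k * sd."""
    zp = NormalDist().inv_cdf(coverage)
    za = NormalDist().inv_cdf(confidence)
    a = 1.0 - za**2 / (2.0 * (n - 1))
    b = zp**2 - za**2 / n
    return (zp + sqrt(zp**2 - a * b)) / a

# Hypothetical strengths (ksi) of six replicate specimens -- illustrative only.
strengths = [61.0, 58.5, 60.2, 59.1, 62.3, 57.8]
k = one_sided_tolerance_factor(len(strengths))
lower_bound = mean(strengths) - k * stdev(strengths)
```

For n = 6 at 90% coverage and 95% confidence, k is close to 3, which is why the computed basis values sit well below the sample mean.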

  6. Low Dose Radiation Cancer Risks: Epidemiological and Toxicological Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    David G. Hoel, PhD

    2012-04-19

    The basic purpose of this one-year research grant was to extend the two-stage clonal expansion (TSCE) model of carcinogenesis to exposures other than the usual single acute exposure. The two-stage clonal expansion model of carcinogenesis incorporates the biological process of carcinogenesis, which involves two mutations and the clonal proliferation of the intermediate cells, in a stochastic, mathematical way. The current TSCE model serves the general purpose of acute-exposure modeling but requires numerical computation of both the survival and hazard functions. The primary objective of this research project was to develop the analytical expressions for the survival function and the hazard function of the occurrence of the first cancer cell for acute, continuous and multiple exposure cases within the framework of the piece-wise constant parameter two-stage clonal expansion model of carcinogenesis. For acute exposure and multiple exposures of acute series, either only the first mutation rate is allowed to vary with the dose, or all the parameters may be dose dependent; for multiple continuous exposures, all the parameters are allowed to vary with the dose. With these analytical functions, it becomes easy to evaluate the risks of cancer and to deal with the various exposure patterns in cancer risk assessment. A second objective was to apply the TSCE model with varying continuous exposures using the cancer studies of inhaled plutonium in beagle dogs. Using step functions to estimate the retention functions of the pulmonary exposure to plutonium, the multiple-exposure versions of the TSCE model were to be used to estimate the beagle dog lung cancer risks. The mathematical equations of the multiple-exposure versions of the TSCE model were developed. A draft manuscript, which is attached, provides the results of this mathematical work. The application work using the beagle dog data from plutonium exposure was not completed because the research project did not continue beyond its first year.

  7. Delivering The Benefits of Chemical-Biological Integration in ...

    EPA Pesticide Factsheets

    Abstract: Researchers at the EPA’s National Center for Computational Toxicology integrate advances in biology, chemistry, and computer science to examine the toxicity of chemicals and help prioritize chemicals for further research based on potential human health risks. The intention of this research program is to quickly evaluate thousands of chemicals for potential risk at much reduced cost relative to historical approaches. This work involves computational and data-driven approaches including high-throughput screening, modeling, text-mining and the integration of chemistry, exposure and biological data. We have developed a number of databases and applications that are delivering on the vision of developing a deeper understanding of chemicals and their effects on exposure and biological processes that are supporting a large community of scientists in their research efforts. This presentation will provide an overview of our work to bring together diverse large-scale data from the chemical and biological domains, our approaches to integrate and disseminate these data, and the delivery of models supporting computational toxicology. This abstract does not reflect U.S. EPA policy. Presentation at ACS TOXI session on Computational Chemistry and Toxicology in Chemical Discovery and Assessment (QSARs).

  8. Evaluation of the Community Multi-scale Air Quality (CMAQ) Model Version 5.1

    EPA Science Inventory

    The Community Multiscale Air Quality (CMAQ) model is a state-of-the-science air quality model that simulates the emission, transport and fate of numerous air pollutants, including ozone and particulate matter. The Computational Exposure Division (CED) of the U.S. Environmental Pr...

  9. Overview and Evaluation of the Community Multiscale Air Quality Model Version 5.2

    EPA Science Inventory

    The Community Multiscale Air Quality (CMAQ) model is a state-of-the-science air quality model that simulates the emission, transport and fate of numerous air pollutants, including ozone and particulate matter. The Computational Exposure Division (CED) of the U.S. Environmental Pr...

  10. Evaluation of the Community Multi-scale Air Quality (CMAQ) Model Version 5.2

    EPA Science Inventory

    The Community Multiscale Air Quality (CMAQ) model is a state-of-the-science air quality model that simulates the emission, transport and fate of numerous air pollutants, including ozone and particulate matter. The Computational Exposure Division (CED) of the U.S. Environmental Pr...

  11. Space-Time Fusion Under Error in Computer Model Output: An Application to Modeling Air Quality

    EPA Science Inventory

    In the last two decades a considerable amount of research effort has been devoted to modeling air quality with public health objectives. These objectives include regulatory activities such as setting standards along with assessing the relationship between exposure to air pollutan...

  12. WRF/CMAQ AQMEII3 Simulations of US Regional-Scale ...

    EPA Pesticide Factsheets

    Chemical boundary conditions are a key input to regional-scale photochemical models. In this study, performed during the third phase of the Air Quality Model Evaluation International Initiative (AQMEII3), we perform annual simulations over North America with chemical boundary conditions prepared from four different global models. Results indicate that the impacts of different boundary conditions are significant for ozone throughout the year and most pronounced outside the summer season. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.

  13. Assessment of the computational uncertainty of temperature rise and SAR in the eyes and brain under far-field exposure from 1 to 10 GHz

    NASA Astrophysics Data System (ADS)

    Laakso, Ilkka

    2009-06-01

    This paper presents finite-difference time-domain (FDTD) calculations of specific absorption rate (SAR) values in the head under plane-wave exposure from 1 to 10 GHz using a resolution of 0.5 mm in adult male and female voxel models. Temperature rise due to the power absorption is calculated by the bioheat equation using a multigrid method solver. The computational accuracy is investigated by repeating the calculations with resolutions of 1 mm and 2 mm and comparing the results. Cubically averaged 10 g SAR in the eyes and brain and eye-averaged SAR are calculated and compared to the corresponding temperature rise as well as the recommended limits for exposure. The results suggest that 2 mm resolution should only be used for frequencies smaller than 2.5 GHz, and 1 mm resolution only under 5 GHz. Morphological differences in models seemed to be an important cause of variation: differences in results between the two different models were usually larger than the computational error due to the grid resolution, and larger than the difference between the results for open and closed eyes. Limiting the incident plane-wave power density to smaller than 100 W m-2 was sufficient for ensuring that the temperature rise in the eyes and brain were less than 1 °C in the whole frequency range.
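The bioheat equation referred to above is usually written in its Pennes form (standard notation, not necessarily the paper's):

```latex
\rho c \,\frac{\partial T}{\partial t}
  \;=\; \nabla \cdot \left( k \nabla T \right)
  \;-\; \rho_b c_b\, \omega \,(T - T_b)
  \;+\; \rho\,\mathrm{SAR}
  \;+\; Q_m
```

where ρ, c and k are the tissue density, specific heat capacity and thermal conductivity, ω is the blood perfusion rate with the b subscripts referring to blood, Q_m is the metabolic heat production, and ρ·SAR is the deposited EMF power driving the temperature rise.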

  14. Exposure calculation code module for reactor core analysis: BURNER

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vondy, D.R.; Cunningham, G.W.

    1979-02-01

    The code module BURNER for nuclear reactor exposure calculations is presented. The computer requirements are shown, as are the reference data and interface data file requirements, and the programmed equations and procedure of calculation are described. The operating history of a reactor is followed over the period between solutions of the space, energy neutronics problem. The end-of-period nuclide concentrations are determined given the necessary information. A steady-state, continuous fueling model is treated in addition to the usual fixed fuel model. The control options provide flexibility to select among an unusually wide variety of programmed procedures. The code also provides a user option to make a number of auxiliary calculations and print such information as the local gamma source, cumulative exposure, and a fine-scale power density distribution in a selected zone. The code is used locally in a system for computation which contains the VENTURE diffusion theory neutronics code and other modules.

  15. Meet EPA Scientist Valerie Zartarian, Ph.D.

    EPA Pesticide Factsheets

    Senior exposure scientist and research environmental engineer Valerie Zartarian, Ph.D. helps build computer models and other tools that advance our understanding of how people interact with chemicals.

  16. Time series analysis of personal exposure to ambient air pollution and mortality using an exposure simulator.

    PubMed

    Chang, Howard H; Fuentes, Montserrat; Frey, H Christopher

    2012-09-01

    This paper describes a modeling framework for estimating the acute effects of personal exposure to ambient air pollution in a time series design. First, a spatial hierarchical model is used to relate Census tract-level daily ambient concentrations and simulated exposures for a subset of the study period. The complete exposure time series is then imputed for risk estimation. Modeling exposure via a statistical model reduces the computational burden associated with simulating personal exposures considerably. This allows us to consider personal exposures at a finer spatial resolution to improve exposure assessment and for a longer study period. The proposed approach is applied to an analysis of fine particulate matter of <2.5 μm in aerodynamic diameter (PM(2.5)) and daily mortality in the New York City metropolitan area during the period 2001-2005. Personal PM(2.5) exposures were simulated from the Stochastic Human Exposure and Dose Simulation. Accounting for exposure uncertainty, the authors estimated a 2.32% (95% posterior interval: 0.68, 3.94) increase in mortality per a 10 μg/m(3) increase in personal exposure to PM(2.5) from outdoor sources on the previous day. The corresponding estimate per a 10 μg/m(3) increase in PM(2.5) ambient concentration was 1.13% (95% confidence interval: 0.27, 2.00). The risks of mortality associated with PM(2.5) were also higher during the summer months.
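The percentage figures quoted in such time series studies are the usual back-transformation of a log-linear (Poisson) regression coefficient. The sketch below assumes a coefficient chosen to reproduce the 2.32% figure; it is not the study's fitted value.

```python
from math import exp, log

def percent_increase(beta: float, delta: float = 10.0) -> float:
    """Excess risk (%) for a `delta`-unit increase in exposure, from a
    log-linear rate model log(rate) = alpha + beta * exposure."""
    return 100.0 * (exp(beta * delta) - 1.0)

# Assumed coefficient, back-calculated so the example reproduces a 2.32%
# excess risk per 10 ug/m^3 -- illustrative, not the study's actual beta.
beta = log(1.0232) / 10.0
excess = percent_increase(beta)
```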

  17. Biological and statistical approaches to predicting human lung cancer risk from silica.

    PubMed

    Kuempel, E D; Tran, C L; Bailer, A J; Porter, D W; Hubbs, A F; Castranova, V

    2001-01-01

    Chronic inflammation is a key step in the pathogenesis of particle-elicited fibrosis and lung cancer in rats, and possibly in humans. In this study, we compute the excess risk estimates for lung cancer in humans with occupational exposure to crystalline silica, using both rat and human data, and using both a threshold approach and linear models. From a toxicokinetic/dynamic model fit to lung burden and pulmonary response data from a subchronic inhalation study in rats, we estimated the minimum critical quartz lung burden (Mcrit) associated with reduced pulmonary clearance and increased neutrophilic inflammation. A chronic study in rats was also used to predict the human excess risk of lung cancer at various quartz burdens, including mean Mcrit (0.39 mg/g lung). We used a human kinetic lung model to link the equivalent lung burdens to external exposures in humans. We then computed the excess risk of lung cancer at these external exposures, using data of workers exposed to respirable crystalline silica and using Poisson regression and lifetable analyses. Finally, we compared the lung cancer excess risks estimated from male rat and human data. We found that the rat-based linear model estimates were approximately three times higher than those based on human data (e.g., 2.8% in rats vs. 0.9-1% in humans, at mean Mcrit lung burden or associated mean working lifetime exposure of 0.036 mg/m3). Accounting for variability and uncertainty resulted in 100-1000 times lower estimates of human critical lung burden and airborne exposure. This study illustrates that assumptions about the relevant biological mechanism, animal model, and statistical approach can all influence the magnitude of lung cancer risk estimates in humans exposed to crystalline silica.
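The kinetic lung models above link airborne exposure to retained lung burden. A minimal one-compartment caricature (constant deposition, first-order clearance) is shown below with invented parameters; the actual rat and human models in the study are multi-compartment and account for dose-dependent (overload) clearance.

```python
from math import exp

# Assumed illustrative parameters -- not the study's fitted values.
dep_rate = 0.01   # deposited quartz dose rate, mg per g lung per year
clearance = 0.05  # first-order clearance rate constant, 1/year

def lung_burden(t_years: float) -> float:
    """Burden L(t) solving dL/dt = dep_rate - clearance * L, with L(0) = 0."""
    return (dep_rate / clearance) * (1.0 - exp(-clearance * t_years))

# Over a 45-year working lifetime the burden approaches, but does not reach,
# its steady state dep_rate / clearance = 0.2 mg/g lung.
burden_45 = lung_burden(45.0)
```

Impaired (slowed) clearance above a critical burden such as the Mcrit described in the abstract would make `clearance` burden-dependent, which this linear sketch deliberately omits.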

  18. Adaptive Response in Female Modeling of the Hypothalamic-pituitary-gonadal Axis

    EPA Science Inventory

    Exposure to endocrine disrupting chemicals can affect reproduction and development in both humans and wildlife. We are developing a mechanistic computational model of the hypothalamic-pituitary-gonadal (HPG) axis in female fathead minnows to predict dose-response and time-course ...

  19. Targeted intervention: Computational approaches to elucidate and predict relapse in alcoholism.

    PubMed

    Heinz, Andreas; Deserno, Lorenz; Zimmermann, Ulrich S; Smolka, Michael N; Beck, Anne; Schlagenhauf, Florian

    2017-05-01

    Alcohol use disorder (AUD) and addiction in general is characterized by failures of choice resulting in repeated drug intake despite severe negative consequences. Behavioral change is hard to accomplish, and relapse after detoxification is common and can be promoted by consumption of small amounts of alcohol as well as by exposure to alcohol-associated cues or stress. While the environmental factors contributing to relapse have long been identified, the underlying psychological and neurobiological mechanisms on which those factors act are to date incompletely understood. Based on the reinforcing effects of drugs of abuse, animal experiments showed that drug, cue and stress exposure affect Pavlovian and instrumental learning processes, which can increase the salience of drug cues and promote habitual drug intake. In humans, computational approaches can help to quantify changes in key learning mechanisms during the development and maintenance of alcohol dependence, e.g. by using sequential decision making in combination with computational modeling to elucidate individual differences in model-free versus more complex, model-based learning strategies and their neurobiological correlates such as prediction error signaling in fronto-striatal circuits. Computational models can also help to explain how alcohol-associated cues trigger relapse: mechanisms such as Pavlovian-to-Instrumental Transfer can quantify to what degree Pavlovian conditioned stimuli can facilitate approach behavior, including alcohol seeking and intake. By using generative models of behavioral and neural data, computational approaches can help to quantify individual differences in psychophysiological mechanisms that underlie the development and maintenance of AUD and thus promote targeted intervention. Copyright © 2016 Elsevier Inc. All rights reserved.
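The model-free learning and prediction-error signaling mentioned above are often formalized with a Rescorla-Wagner / temporal-difference update. A minimal sketch, with an illustrative learning rate and reward schedule:

```python
def rescorla_wagner(rewards, alpha=0.1):
    """Model-free value learning: V <- V + alpha * (r - V). The term
    (r - V) is the reward prediction error thought to be signalled in
    fronto-striatal circuits. Parameters here are illustrative."""
    v = 0.0
    trace = []
    for r in rewards:
        delta = r - v          # prediction error
        v += alpha * delta     # value update
        trace.append(v)
    return trace

# Repeated pairing of a cue with reward drives its learned value toward 1 --
# one way a drug-associated cue can come to dominate choice.
values = rescorla_wagner([1.0] * 50)
```

Fitting `alpha` (and richer model-based terms) to individual choice data is what lets such models quantify the individual differences the abstract describes.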

  20. Case Study: Organotypic human in vitro models of embryonic ...

    EPA Pesticide Factsheets

    Morphogenetic fusion of tissues is a common event in embryonic development, and disruption of fusion is associated with birth defects of the eye, heart, neural tube, phallus, palate, and other organ systems. Embryonic tissue fusion requires precise regulation of cell-cell and cell-matrix interactions that drive proliferation, differentiation, and morphogenesis. Chemical low-dose exposures can disrupt morphogenesis across space and time by interfering with key embryonic fusion events. The Morphogenetic Fusion Task uses computer and in vitro models to elucidate consequences of developmental exposures. The Morphogenetic Fusion Task integrates multiple approaches to model responses to chemicals that lead to birth defects, including integrative mining on ToxCast DB, ToxRefDB, and chemical structures, advanced computer agent-based models, and human cell-based cultures that model disruption of cellular and molecular behaviors, including mechanisms predicted from integrative data mining and agent-based models. The purpose of the poster is to indicate progress on the CSS 17.02 Virtual Tissue Models Morphogenesis Task 1 products for the Board of Scientific Counselors meeting on Nov 16-17.

  1. Physiologically Based Pharmacokinetic Modeling of the Lactating Rat and Nursing Pup: a Multiroute Exposure Model for Trichloroethylene and its Metabolite, Trichloroacetic Acid

    DTIC Science & Technology

    1990-01-01

    ...cumulated during pregnancy was described as a linear process changing from 12.0% of body weight of... animal (allometrically scaled), was estimated by comput... biochemical effects of TCE in neonatal rats born to dams exposed to TCE via drinking water during pregnancy and lactation... (Fisher et al., 1989) were used for repeated-exposure studies during lactation. Female cesarean-derived Fischer-344 rats, obtained from Charles River... Breeding

  2. International Seminar on The Role of Dosimetry in High-Quality EMF Risk Assessment Held in Ljubljana, Slovenia and Zagreb, Croatia on 13-15 September 2006

    DTIC Science & Technology

    2006-09-01

    wireless communication usage and exposure to different parts of the body (especially for children and foetuses), including multiple exposure from... Calculation of induced electric fields in pregnant women and in the foetus is urgently needed. Very little computation has been carried out on... advanced models of the pregnant human and the foetus with appropriate anatomical modelling. It is important to assess possible enhanced induction of

  3. Incorporating High-Dimensional Exposure Modelling into Studies of Air Pollution and Health.

    PubMed

    Liu, Yi; Shaddick, Gavin; Zidek, James V

    2017-01-01

    Performing studies on the risks of environmental hazards on human health requires accurate estimates of exposures that might be experienced by the populations at risk. Often there will be missing data and in many epidemiological studies, the locations and times of exposure measurements and health data do not match. To a large extent this will be due to the health and exposure data having arisen from completely different data sources and not as the result of a carefully designed study, leading to problems of both 'change of support' and 'misaligned data'. In such cases, a direct comparison of the exposure and health outcome is often not possible without an underlying model to align the two in the spatial and temporal domains. The Bayesian approach provides the natural framework for such models; however, the large amounts of data that can arise from environmental networks means that inference using Markov Chain Monte Carlo might not be computationally feasible in this setting. Here we adapt the integrated nested Laplace approximation to implement spatio-temporal exposure models. We also propose methods for the integration of large-scale exposure models and health analyses. It is important that any model structure allows the correct propagation of uncertainty from the predictions of the exposure model through to the estimates of risk and associated confidence intervals. The methods are demonstrated using a case study of the levels of black smoke in the UK, measured over several decades, and respiratory mortality.

  4. Computational modeling of blast exposure associated with recoilless weapons combat training

    NASA Astrophysics Data System (ADS)

    Wiri, S.; Ritter, A. C.; Bailie, J. M.; Needham, C.; Duckworth, J. L.

    2017-11-01

    Military personnel are exposed to blast as part of routine combat training with shoulder-fired recoilless rifles. These weapons fire large-caliber ammunition capable of disabling structures and uparmored vehicles (e.g., tanks). Scientific, medical, and military leaders are beginning to recognize that the blast overpressure from these shoulder-fired weapons may result in acute and even long-term physiological effects on military personnel. However, the back blast imposed on the weapon operator by the Carl Gustav and Shoulder-launched Multipurpose Assault Weapon (SMAW) has not been quantified. By quantifying and modeling the full-body blast exposure from these weapons, better injury correlations can be constructed. Blast exposure data from the Carl Gustav and SMAW were used to calibrate a propellant burn source term for computational simulations of blast exposure on operators of these shoulder-mounted weapon systems. A propellant burn model provided the source term for each weapon to capture blast effects. Blast data from personnel-mounted gauges during weapon firing were used to create initial, high-fidelity 3D computational fluid dynamic simulations using SHAMRC (Second-order Hydrodynamic Automatic Mesh Refinement Code). These models were then improved upon using data collected from static blast sensors positioned around the military personnel while weapons were utilized in actual combat training. The final simulation models for both the Carl Gustav and SMAW were in good agreement with the data collected from the personnel-mounted and static pressure gauges. Using the final simulation results, contour maps were created for peak overpressure and peak overpressure impulse experienced by military personnel firing the weapon as well as those assisting with firing of those weapons. 
Reconstruction of the full-body blast loading enables a more accurate assessment of the cause of potential mechanisms of injury due to air blast even for subjects not wearing blast gauges themselves. By accurately understanding the blast exposure and its variations across an individual, more meaningful correlations with physiologic response including potential TBI spectrum physiology associated with sub-concussive blast exposure can be established. As blast injury thresholds become better defined, results from these reconstructions can provide important insights into approaches for reducing possible risk of injury to personnel operating shoulder-launched weapons.

  5. The CREp program, a fully parameterizable program to compute exposure ages (3He, 10Be)

    NASA Astrophysics Data System (ADS)

    Martin, L.; Blard, P. H.; Lave, J.; Delunel, R.; Balco, G.

    2015-12-01

    Over the last decades, cosmogenic exposure dating has permitted major advances in Earth surface sciences, and particularly in paleoclimatology. Yet, exposure age calculation is an involved procedure. It requires numerous choices of parameterization and the use of an appropriate production rate. Nowadays, Earth surface scientists may either calculate exposure ages on their own or use the available programs. However, these programs do not offer the possibility to include all the most recent advances in Cosmic Ray Exposure (CRE) dating. Notably, they do not propose the most recent production rate datasets and they offer only a few possibilities to test the impact of the atmosphere model and the geomagnetic model on the computed ages. We present the CREp program, a Matlab© code that computes CRE ages for 3He and 10Be over the last 2 million years. The CREp program includes the scaling models of Lal-Stone in the "Lal modified" version (Balco et al., 2008; Lal, 1991; Stone, 2000) and the LSD model (Lifton et al., 2014). For any of these models, CREp allows choosing between the ERA-40 atmosphere model (Uppala et al., 2005) and the standard atmosphere (National Oceanic and Atmospheric Administration, 1976). Regarding the geomagnetic database, users can opt for one of the three proposed datasets: Muscheler et al. (2005), GLOPIS-75 (Laj et al., 2004) and the geomagnetic framework proposed in the LSD model (Lifton et al., 2014). They may also import their own geomagnetic database. Importantly, the reference production rate can be chosen among a large variety of possibilities. We made an effort to propose a wide and homogenous calibration database in order to promote the use of local calibration rates: CREp includes all the calibration data published until July 2015 and will be able to access an updated online database including all the newly published production rates. This is crucial for improving the accuracy of the ages. 
    Users may also choose a global production rate or use their own data to either calibrate a production rate or directly input a Sea Level High Latitude value. The program can quickly calculate a large number of ages and export the final probability density function associated with each age in Excel© spreadsheet format.
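    The core age computation such programs perform reduces, in the simplest case (no erosion, no burial, constant production), to inverting the nuclide build-up equation. A minimal sketch, with illustrative numbers that are not calibrated production rates:

    ```python
    import math

    def exposure_age(N, P, half_life=None):
        """Cosmic-ray exposure age for a surface with no erosion or burial.

        N: nuclide concentration (atoms/g), P: local production rate
        (atoms/g/yr). half_life: in years for a radioactive nuclide
        (e.g. 10Be); None for a stable nuclide such as 3He.
        """
        if half_life is None:          # stable nuclide: N = P * t
            return N / P
        lam = math.log(2) / half_life  # decay constant (1/yr)
        # Radioactive nuclide: N = (P / lam) * (1 - exp(-lam * t))
        return -math.log(1.0 - N * lam / P) / lam

    # Illustrative concentrations and production rates only:
    age_he3 = exposure_age(1.2e8, 120.0)                    # 3He, = 1 Myr
    age_be10 = exposure_age(4.0e6, 4.0, half_life=1.387e6)  # 10Be
    print(f"{age_he3:.0f} yr, {age_be10:.0f} yr")
    ```

    The real complexity a program like CREp addresses lies in the production rate P: the scaling model, atmosphere model, and geomagnetic database all modulate it over the exposure interval.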

  6. Evaluation of SAR in a human body model due to wireless power transmission in the 10 MHz band.

    PubMed

    Laakso, Ilkka; Tsuchida, Shogo; Hirata, Akimasa; Kamimura, Yoshitsugu

    2012-08-07

    This study discusses a computational method for calculating the specific absorption rate (SAR) due to a wireless power transmission system in the 10 MHz frequency band. A two-step quasi-static method comprising the method of moments and the scalar-potential finite-difference method is proposed. The applicability of the quasi-static approximation for localized exposure in this frequency band is discussed by comparing the SAR in a lossy dielectric cylinder computed with a full-wave electromagnetic analysis and the quasi-static approximation. From the computational results, the input impedance of the resonant coils was affected by the presence of the cylinder. On the other hand, the magnetic field distributions computed in free space and in the presence of the cylinder with an impedance matching circuit were in good agreement; the maximum difference in the amplitude of the magnetic field was 4.8%. For a cylinder-coil distance of 10 mm, the difference between the peak 10 g averaged SAR in the cylinder computed with the full-wave electromagnetic method and our quasi-static method was 7.8%. These results suggest that the quasi-static approach is applicable for conducting the dosimetry of wireless power transmission in the 10 MHz band. With our two-step quasi-static method, the SAR in the anatomically based model was computed for different exposure scenarios. From those computations, the allowable input power satisfying the limit of a peak 10 g averaged SAR of 2.0 W kg(-1) was 830 W in the worst case exposure scenario with a coil positioned at a distance of 30 mm from the chest.
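    The quantity being limited has a simple local definition: SAR is absorbed power per unit tissue mass. A minimal sketch (the tissue values and the per-watt peak SAR s0 below are assumed placeholders, not figures from the paper):

    ```python
    import numpy as np

    def point_sar(E_rms, sigma, rho):
        """Point SAR (W/kg) from the rms electric field E_rms (V/m), tissue
        conductivity sigma (S/m), and mass density rho (kg/m^3):
        SAR = sigma * |E|^2 / rho."""
        return sigma * np.abs(E_rms) ** 2 / rho

    # Assumed muscle-like tissue values around 10 MHz (illustrative only).
    sigma, rho = 0.6, 1050.0
    E = np.array([5.0, 10.0, 20.0])  # rms field strengths, V/m
    sar = point_sar(E, sigma, rho)
    print(sar)

    # SAR scales linearly with input power at fixed geometry: if 1 W of coil
    # input yields a (hypothetical) peak 10 g averaged SAR of s0, the
    # allowable power under a 2.0 W/kg limit is 2.0 / s0 watts.
    s0 = 2.4e-3
    print(2.0 / s0)  # ~833 W with this assumed s0
    ```

    This linear scaling with input power is what makes a single-scenario dosimetry computation sufficient to derive an allowable power for each coil position.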

  7. Human Environmental Disease Network: A computational model to assess toxicology of contaminants.

    PubMed

    Taboureau, Olivier; Audouze, Karine

    2017-01-01

    During the past decades, many epidemiological, toxicological and biological studies have been performed to assess the role of environmental chemicals as potential toxicants associated with diverse human disorders. However, the relationships between diseases based on chemical exposure have rarely been studied by computational biology. We developed a human environmental disease network (EDN) to explore and suggest novel disease-disease and chemical-disease relationships. The presented scored EDN model is built upon the integration of systems biology and chemical toxicology using information on chemical contaminants and their disease relationships reported in the TDDB database. The resulting human EDN takes into consideration the level of evidence of the toxicant-disease relationships, allowing inclusion of some degrees of significance in the disease-disease associations. Such a network can be used to identify uncharacterized connections between diseases. Examples are discussed for type 2 diabetes (T2D). Additionally, this computational model allows confirmation of already known links between chemicals and diseases (e.g., between bisphenol A and behavioral disorders) and also reveals unexpected associations between chemicals and diseases (e.g., between chlordane and olfactory alteration), thus predicting which chemicals may be risk factors to human health. The proposed human EDN model allows exploration of common biological mechanisms of diseases associated with chemical exposure, helping us to gain insight into disease etiology and comorbidity. This computational approach is an alternative to animal testing supporting the 3R concept.

  8. Cerebellar White Matter Abnormalities following Primary Blast Injury in US Military Personnel

    PubMed Central

    Mac Donald, Christine; Johnson, Ann; Cooper, Dana; Malone, Thomas; Sorrell, James; Shimony, Joshua; Parsons, Matthew; Snyder, Abraham; Raichle, Marcus; Fang, Raymond; Flaherty, Stephen; Russell, Michael; Brody, David L.

    2013-01-01

    Little is known about the effects of blast exposure on the human brain in the absence of head impact. Clinical reports, experimental animal studies, and computational modeling of blast exposure have suggested effects on the cerebellum and brainstem. In US military personnel with isolated, primary blast-related ‘mild’ traumatic brain injury and no other known insult, we found diffusion tensor MRI abnormalities consistent with cerebellar white matter injury in 3 of 4 subjects. No abnormalities in other brain regions were detected. These findings add to the evidence supporting the hypothesis that primary blast exposure contributes to brain injury in the absence of head impact and that the cerebellum may be particularly vulnerable. However, the clinical effects of these abnormalities cannot be determined with certainty; none of the subjects had ataxia or other detected evidence of cerebellar dysfunction. The details of the blast events themselves cannot be disclosed at this time, thus additional animal and computational modeling will be required to dissect the mechanisms underlying primary blast-related traumatic brain injury. Furthermore, the effects of possible subconcussive impacts and other military-related exposures cannot be determined from the data presented. Thus, many aspects of this topic will require further investigation. PMID:23409052

  9. Theoretical assessment of the maximum obtainable power in wireless power transfer constrained by human body exposure limits in a typical room scenario.

    PubMed

    Chen, Xi Lin; De Santis, Valerio; Umenei, Aghuinyue Esai

    2014-07-07

    In this study, the maximum received power obtainable through wireless power transfer (WPT) by a small receiver (Rx) coil from a relatively large transmitter (Tx) coil is numerically estimated in the frequency range from 100 kHz to 10 MHz based on human body exposure limits. Analytical calculations were first conducted to determine the worst-case coupling between a homogeneous cylindrical phantom with a radius of 0.65 m and a Tx coil positioned 0.1 m away with the radius ranging from 0.25 to 2.5 m. Subsequently, three high-resolution anatomical models were employed to compute the peak induced field intensities with respect to various Tx coil locations and dimensions. Based on the computational results, scaling factors which correlate the cylindrical phantom and anatomical model results were derived. Next, the optimal operating frequency, at which the highest transmitter source power can be utilized without exceeding the exposure limits, is found to be around 2 MHz. Finally, a formulation is proposed to estimate the maximum obtainable power of WPT in a typical room scenario while adhering to the human body exposure compliance mandates.

  10. Theoretical assessment of the maximum obtainable power in wireless power transfer constrained by human body exposure limits in a typical room scenario

    NASA Astrophysics Data System (ADS)

    Chen, Xi Lin; De Santis, Valerio; Esai Umenei, Aghuinyue

    2014-07-01

    In this study, the maximum received power obtainable through wireless power transfer (WPT) by a small receiver (Rx) coil from a relatively large transmitter (Tx) coil is numerically estimated in the frequency range from 100 kHz to 10 MHz based on human body exposure limits. Analytical calculations were first conducted to determine the worst-case coupling between a homogeneous cylindrical phantom with a radius of 0.65 m and a Tx coil positioned 0.1 m away with the radius ranging from 0.25 to 2.5 m. Subsequently, three high-resolution anatomical models were employed to compute the peak induced field intensities with respect to various Tx coil locations and dimensions. Based on the computational results, scaling factors which correlate the cylindrical phantom and anatomical model results were derived. Next, the optimal operating frequency, at which the highest transmitter source power can be utilized without exceeding the exposure limits, is found to be around 2 MHz. Finally, a formulation is proposed to estimate the maximum obtainable power of WPT in a typical room scenario while adhering to the human body exposure compliance mandates.

  11. Remembrance of phases past: An autoregressive method for generating realistic atmospheres in simulations

    NASA Astrophysics Data System (ADS)

    Srinath, Srikar; Poyneer, Lisa A.; Rudy, Alexander R.; Ammons, S. M.

    2014-08-01

    The advent of expensive, large-aperture telescopes and complex adaptive optics (AO) systems has strengthened the need for detailed simulation of such systems from the top of the atmosphere to control algorithms. The credibility of any simulation is underpinned by the quality of the atmosphere model used for introducing phase variations into the incident photons. Hitherto, simulations which incorporate wind layers have relied upon phase screen generation methods that tax the computation and memory capacities of the platforms on which they run. This places limits on parameters of a simulation, such as exposure time or resolution, thus compromising its utility. As aperture sizes and fields of view increase, the problem will only get worse. We present an autoregressive method for evolving atmospheric phase that is efficient in its use of computation resources and allows for variability in the power contained in frozen flow or stochastic components of the atmosphere. Users have the flexibility either to generate atmosphere datacubes in advance of runs, where memory constraints allow, to save on computation time, or to compute the phase at each time step for long exposure times. Preliminary tests of model atmospheres generated using this method show power spectral density and rms phase in accordance with established metrics for Kolmogorov models.
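    The autoregressive idea can be sketched per Fourier mode: each spatial frequency of the phase screen is evolved with an AR(1) update that preserves its variance while decorrelating it over time. In the full method the coefficient would be a complex phasor encoding wind translation (frozen flow); the real coefficient below models only the stochastic 'boiling' component, and all sizes are arbitrary:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    n = 64          # phase screen size in Fourier modes (arbitrary)
    alpha = 0.999   # AR(1) coefficient: closer to 1 = slower decorrelation
    n_steps = 100

    # Each mode evolves independently as
    #   phi[t+1] = alpha * phi[t] + sqrt(1 - alpha^2) * w,
    # with w complex Gaussian, which keeps the mode variance stationary.
    phi = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    var0 = np.mean(np.abs(phi) ** 2)
    for _ in range(n_steps):
        w = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
        phi = alpha * phi + np.sqrt(1 - alpha ** 2) * w

    print(var0, np.mean(np.abs(phi) ** 2))  # variance approximately preserved
    ```

    Because each step needs only the previous state plus fresh noise, the memory cost is a single screen rather than a precomputed datacube, which is the efficiency the abstract describes.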

  12. in vitro Models of Human Embryonic Mesenchymal Transitions in Morphogenesis

    EPA Science Inventory

    Our ability to predict human developmental consequences produced by exposure to environmental chemicals is limited by the current experimental and computational models. Human heart defects are among the most common type of birth defects and affect 1% of children (~40,000 children)...

  13. Rapid Prototyping of Physiologically-Based Toxicokinetic (PBTK) Models (SOT annual meeting)

    EPA Science Inventory

    Determining the tissue concentrations resulting from chemical exposure (i.e., toxicokinetics (TK)) is essential in emergency or other situations where time and data are lacking. Generic TK models can be created rapidly using in vitro assays and computational approaches to generat...

  14. The Influence of Self-Regulated Learning and Prior Knowledge on Knowledge Acquisition in Computer-Based Learning Environments

    ERIC Educational Resources Information Center

    Bernacki, Matthew

    2010-01-01

    This study examined how learners construct textbase and situation model knowledge in hypertext computer-based learning environments (CBLEs) and documented the influence of specific self-regulated learning (SRL) tactics, prior knowledge, and characteristics of the learner on posttest knowledge scores from exposure to a hypertext. A sample of 160…

  15. Industrial machinery noise impact modeling, volume 1

    NASA Astrophysics Data System (ADS)

    Hansen, C. H.; Kugler, B. A.

    1981-07-01

    The development of a machinery noise computer model which may be used to assess the effect of occupational noise on the health and welfare of industrial workers is discussed. The purpose of the model is to provide EPA with the methodology to evaluate the personnel noise problem, to identify the equipment types responsible for the exposure and to assess the potential benefits of a given noise control action. Due to its flexibility in design and application, the model and supportive computer program can be used by other federal agencies, state governments, labor and industry as an aid in the development of noise abatement programs.

  16. Space Shuttle and Space Station Radio Frequency (RF) Exposure Analysis

    NASA Technical Reports Server (NTRS)

    Hwu, Shian U.; Loh, Yin-Chung; Sham, Catherine C.; Kroll, Quin D.

    2005-01-01

    This paper outlines the modeling techniques and important parameters to define a rigorous but practical procedure that can verify the compliance of RF exposure to the NASA standards for astronauts and electronic equipment. The electromagnetic modeling techniques are applied to analyze RF exposure in Space Shuttle and Space Station environments with reasonable computing time and resources. The modeling techniques are capable of taking into account the field interactions with Space Shuttle and Space Station structures. The obtained results illustrate the multipath effects due to the presence of the space vehicle structures. It is necessary to include the field interactions with the space vehicle in the analysis for an accurate assessment of the RF exposure. Based on the obtained results, RF keep-out zones are identified for appropriate operational scenarios, flight rules and necessary RF transmitter constraints to ensure a safe operating environment and mission success.

  17. Does exposure to computers affect the routine parameters of semen quality?

    PubMed

    Sun, Yue-Lian; Zhou, Wei-Jin; Wu, Jun-Qing; Gao, Er-Sheng

    2005-09-01

    To assess whether exposure to computers harms the semen quality of healthy young men. A total of 178 subjects were recruited from two maternity and children healthcare centers in Shanghai: 91 with a history of exposure to computers (i.e., exposure for 20 h or more per week in the last 2 years) and 87 controls (no or little exposure to computers). Data on the history of exposure to computers and other characteristics were obtained by means of a structured questionnaire interview. Semen samples were collected by masturbation at the site where they were analyzed. No differences in the distribution of the semen parameters (semen volume, sperm density, percentage of progressive sperm, sperm viability and percentage of normal form sperm) were found between the exposed group and the control group. Exposure to computers was not found to be a risk factor for inferior semen quality after adjusting for potential confounders, including abstinence days, testicle size, occupation, and history of exposure to toxic substances. The present study did not find that healthy men exposed to computers had inferior semen quality.

  18. Predictive Modeling and Computational Toxicology

    EPA Science Inventory

    Embryonic development is orchestrated via a complex series of cellular interactions controlling behaviors such as mitosis, migration, differentiation, adhesion, contractility, apoptosis, and extracellular matrix remodeling. Any chemical exposure that perturbs these cellular proce...

  19. Neurological Effects of Blast Injury

    PubMed Central

    Hicks, Ramona R.; Fertig, Stephanie J.; Desrocher, Rebecca E.; Koroshetz, Walter J.; Pancrazio, Joseph J.

    2010-01-01

    Over the last few years, thousands of soldiers and an even greater number of civilians have suffered traumatic injuries due to blast exposure, largely attributed to improvised explosive devices in terrorist and insurgent activities. The use of body armor is allowing soldiers to survive blasts that would otherwise be fatal due to systemic damage. Emerging evidence suggests that exposure to a blast can produce neurological consequences in the brain, but much remains unknown. To elucidate the current scientific basis for understanding blast-induced traumatic brain injury (bTBI), the NIH convened a workshop in April, 2008. A multidisciplinary group of neuroscientists, engineers, and clinicians were invited to share insights on bTBI, specifically pertaining to: physics of blast explosions, acute clinical observations and treatments, preclinical and computational models, and lessons from the international community on civilian exposures. This report provides an overview of the state of scientific knowledge of bTBI, drawing from the published literature, as well as presentations, discussions, and recommendations from the workshop. One of the major recommendations from the workshop was the need to characterize the effects of blast exposure on clinical neuropathology. Clearer understanding of the human neuropathology would enable validation of preclinical and computational models, which are attempting to simulate blast wave interactions with the central nervous system. Furthermore, the civilian experience with bTBI suggests that polytrauma models incorporating both brain and lung injuries may be more relevant to the study of civilian countermeasures than considering models with a neurological focus alone. PMID:20453776

  20. Uncertainty Modeling of Pollutant Transport in Atmosphere and Aquatic Route Using Soft Computing

    NASA Astrophysics Data System (ADS)

    Datta, D.

    2010-10-01

    Hazardous radionuclides are released as pollutants into the atmospheric and aquatic environment (ATAQE) during the normal operation of nuclear power plants. Atmospheric and aquatic dispersion models are routinely used to assess the impact of the release of radionuclides from any nuclear facility, or of hazardous chemicals from any chemical plant, on the ATAQE. The effect of exposure to the hazardous nuclides or chemicals is measured in terms of risk. Uncertainty modeling is an integral part of risk assessment. The paper focuses on the uncertainty modeling of pollutant transport in the atmospheric and aquatic environment using soft computing. Soft computing is adopted because of the lack of information on the parameters of the corresponding models. In this domain, soft computing uses fuzzy set theory to explore the uncertainty of the model parameters; this type of uncertainty is called epistemic uncertainty. Each uncertain input parameter of the model is described by a triangular membership function.
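    A triangular membership function and its alpha-cuts, the building blocks this abstract names, can be sketched as follows (the parameter values and the toy dispersion relation below are invented for illustration):

    ```python
    import numpy as np

    def triangular(x, a, b, c):
        """Triangular membership function with support [a, c] and peak at b."""
        x = np.asarray(x, dtype=float)
        left = (x - a) / (b - a)
        right = (c - x) / (c - b)
        return np.clip(np.minimum(left, right), 0.0, 1.0)

    def alpha_cut(a, b, c, alpha):
        """Interval of values whose membership is at least alpha."""
        return a + alpha * (b - a), c - alpha * (c - b)

    # Hypothetical fuzzy dispersion parameter, e.g. wind speed in m/s:
    lo, hi = alpha_cut(2.0, 5.0, 9.0, 0.5)
    print(lo, hi)                                      # 3.5 7.0
    print(triangular([2.0, 5.0, 9.0], 2.0, 5.0, 9.0))  # [0. 1. 0.]

    # Interval arithmetic propagates the alpha-cut through a monotone model,
    # e.g. a toy concentration C = Q / u for a fixed release rate Q:
    Q = 100.0
    print(Q / hi, Q / lo)  # concentration bounds at alpha = 0.5
    ```

    Propagating each alpha-cut interval through the transport model (interval arithmetic for monotone models, optimization otherwise) yields a fuzzy, rather than probabilistic, description of the output concentration.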

  1. Association between activity space exposure to food establishments and individual risk of overweight.

    PubMed

    Kestens, Yan; Lebel, Alexandre; Chaix, Basile; Clary, Christelle; Daniel, Mark; Pampalon, Robert; Theriault, Marius; P Subramanian, S V

    2012-01-01

    Environmental exposure to food sources may underpin area level differences in individual risk for overweight. Place of residence is generally used to assess neighbourhood exposure. Yet, because people are mobile, multiple exposures should be accounted for to assess the relation between food environments and overweight. Unfortunately, mobility data is often missing from health surveys. We hereby test the feasibility of linking travel survey data with food listings to derive food store exposure predictors of overweight among health survey participants. Food environment exposure measures accounting for non-residential activity places (activity spaces) were computed and modelled in Montreal and Quebec City, Canada, using travel surveys and food store listings. Models were then used to predict activity space food exposures for 5,578 participants of the Canadian Community Health Survey. These food exposure estimates, accounting for daily mobility, were used to model self-reported overweight in a multilevel framework. Median odds ratios were used to assess the proportion of between-neighborhood variance explained by such food exposure predictors. Estimates of food environment exposure accounting for both residential and non-residential destinations were significantly and more strongly associated with overweight than residential-only measures of exposure for men. For women, residential exposures were more strongly associated with overweight than non-residential exposures. In Montreal, adjusted models showed men in the highest quartile of exposure to food stores were at lesser risk of being overweight considering exposure to restaurants (OR = 0.36 [0.21-0.62]), fast food outlets (0.48 [0.30-0.79]), or corner stores (0.52 [0.35-0.78]). Conversely, men experiencing the highest proportion of restaurants being fast-food outlets were at higher risk of being overweight (2.07 [1.25-3.42]). Women experiencing higher residential exposures were at lower risk of overweight. 
Using residential neighbourhood food exposure measures may underestimate true exposure and observed associations. Using mobility data offers potential for deriving activity space exposure estimates in epidemiological models.
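    The gap between residential-only and activity-space exposure measures can be sketched with a toy point-buffer count (the coordinates, outlet count, and 0.5 km radius are invented; the study's actual measures are built from travel surveys and food store listings):

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Invented coordinates (km): food outlets, plus one person's home and
    # full set of activity locations (home, work, other destinations).
    outlets = rng.uniform(0, 10, size=(200, 2))
    home = np.array([[2.0, 3.0]])
    activity = np.array([[2.0, 3.0], [7.5, 6.0], [5.0, 1.5]])

    def exposure(points, outlets, radius=0.5):
        """Count outlets within `radius` km of any of the person's locations."""
        d = np.linalg.norm(outlets[:, None, :] - points[None, :, :], axis=2)
        return int(np.sum(d.min(axis=1) <= radius))

    print(exposure(home, outlets))      # residential-only exposure
    print(exposure(activity, outlets))  # activity-space exposure
    ```

    Because the activity set contains the home location, the activity-space count is by construction at least the residential count, which is why residential-only measures can understate true exposure.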

  2. Developing Predictive Approaches to Characterize Adaptive Responses of the Reproductive Endocrine Axis to Aromatase Inhibition II: Computational Modeling

    EPA Science Inventory

    ABSTRACT Exposure to endocrine disrupting chemicals can affect reproduction and development in both humans and wildlife. We developed a mechanistic mathematical model of the hypothalamic­ pituitary-gonadal (HPG) axis in female fathead minnows to predic...

  3. Preclinical evaluation of implantable cardioverter-defibrillator developed for magnetic resonance imaging use.

    PubMed

    Gold, Michael R; Kanal, Emanuel; Schwitter, Juerg; Sommer, Torsten; Yoon, Hyun; Ellingson, Michael; Landborg, Lynn; Bratten, Tara

    2015-03-01

    Many patients with an implantable cardioverter-defibrillator (ICD) have indications for magnetic resonance imaging (MRI). However, MRI is generally contraindicated in ICD patients because of potential risks from hazardous interactions between the MRI and ICD system. The purpose of this study was to use preclinical computer modeling, animal studies, and bench and scanner testing to demonstrate the safety of an ICD system developed for 1.5-T whole-body MRI. MRI hazards were assessed and mitigated using multiple approaches: design decisions to increase safety and reliability, modeling and simulation to quantify clinical MRI exposure levels, animal studies to quantify the physiologic effects of MRI exposure, and bench testing to evaluate safety margin. Modeling estimated the incidence of a chronic change in pacing capture threshold >0.5 V and 1.0 V to be less than 1 in 160,000 and less than 1 in 1,000,000 cases, respectively. Modeling also estimated the incidence of unintended cardiac stimulation to occur in less than 1 in 1,000,000 cases. Animal studies demonstrated no delay in ventricular fibrillation detection and no reduction in ventricular fibrillation amplitude at clinical MRI exposure levels, even with multiple exposures. Bench and scanner testing demonstrated performance and safety against all other MRI-induced hazards. A preclinical strategy that includes comprehensive computer modeling, animal studies, and bench and scanner testing predicts that an ICD system developed for the magnetic resonance environment is safe and poses very low risks when exposed to 1.5-T normal operating mode whole-body MRI. Copyright © 2015 Heart Rhythm Society. Published by Elsevier Inc. All rights reserved.

  4. Numerical simulation of gender differences in a long-term microgravity exposure

    NASA Astrophysics Data System (ADS)

    Perez-Poch, Antoni

    The objective of this work is to analyse and simulate gender differences when individuals are exposed to long-term microgravity. The risk probability of a health impairment which may put a long-term mission in jeopardy is also evaluated. Computer simulations are becoming a promising line of research, as physiological models become more and more sophisticated and reliable. Technological advances in state-of-the-art hardware and software nowadays allow for better and more accurate simulations of complex phenomena, such as the response of the human cardiovascular system to long-term exposure to microgravity. Experimental data for long-term missions are difficult to obtain and reproduce, therefore the predictions of computer simulations are of major importance in this field. Our approach is based on a previous model developed and implemented in our laboratory (NELME: Numerical Evaluation of Long-term Microgravity Effects). The software simulates the behaviour of the cardiovascular system and different human organs, has a modular architecture, and allows perturbations such as physical exercise or countermeasures to be introduced. The implementation is based on a complex electrical-like model of this control system, using inexpensive software development frameworks, and has been tested and validated with the available experimental data. Gender differences have been implemented for this specific work as an adjustment of a number of parameters included in the model. Physiological differences between women and men have therefore been taken into account, based upon estimates from the physiology literature. A number of simulations have been carried out for long-term exposure to microgravity. Gravity, varying from Earth level to zero, and exposure time are the two main variables involved in the construction of results, including responses to patterns of aerobic exercise, and also thermal stress simulating an extra-vehicular activity. 
Results show that significant differences appear between men's and women's physiological responses after long-term exposure (more than three months) to microgravity. Risk evaluations for each gender, and specific risk thresholds, are provided. Initial results are compatible with the existing data, and provide unique information regarding different patterns of microgravity exposure. We conclude that computer-based models such as NELME are a promising line of work to predict health risks in long-term missions. More experimental work is needed to adjust some parameters of the model. This work may be seen as another contribution to a better understanding of the underlying processes involved in the adaptation of both women and men to long-term microgravity.

  5. Simulation of Spatial and Temporal Radiation Exposures for ISS in the South Atlantic Anomaly

    NASA Technical Reports Server (NTRS)

    Anderson, Brooke M.; Nealy, John E.; Luetke, Nathan J.; Sandridge, Christopher A.; Qualls, Garry D.

    2004-01-01

    The International Space Station (ISS) living areas receive the preponderance of ionizing radiation exposure from Galactic Cosmic Rays (GCR) and geomagnetically trapped protons. Practically all trapped proton exposure occurs when the ISS passes through the South Atlantic Anomaly (SAA) region. The fact that this region is in proximity to a trapping mirror point indicates that the proton flux is highly directional. The inherent shielding provided by the ISS structure is represented by a recently-developed CAD model of the current 11-A configuration. Using modeled environment and configuration, trapped proton exposures have been analytically estimated at selected target points within the Service and Lab Modules. The results indicate that the directional flux may lead to substantially different exposure characteristics than the more common analyses that assume an isotropic environment. Additionally, predictive capability of the computational procedure should allow sensitive validation with corresponding on-board directional dosimeters.

  6. [Navigated drilling for femoral head necrosis. Experimental and clinical results].

    PubMed

    Beckmann, J; Tingart, M; Perlick, L; Lüring, C; Grifka, J; Anders, S

    2007-05-01

In the early stages of osteonecrosis of the femoral head, core decompression by exact drilling into the ischemic areas can reduce pain and achieve reperfusion. Using computer-aided surgery, the precision of the drilling can be improved while simultaneously lowering radiation exposure time for both staff and patients. We describe the experimental and clinical results of drilling under the guidance of the fluoroscopically based VectorVision navigation system (BrainLAB, Munich, Germany). A total of 70 sawbones were prepared mimicking an osteonecrosis of the femoral head. In two experimental models, bone only and obesity, as well as in a clinical setting involving ten patients with osteonecrosis of the femoral head, the precision and the duration of radiation exposure were compared between the VectorVision system and conventional drilling. No target was missed. For both models, there was a statistically significant difference in precision, in the number of drilling corrections, and in radiation exposure time. The average distance to the desired midpoint of the lesion for both models was 0.48 mm for navigated drilling and 1.06 mm for conventional drilling, the average numbers of drilling corrections were 0.175 and 2.1, and the radiation exposure times were less than 1 s and 3.6 s, respectively. In the clinical setting, the reductions in radiation exposure (below 1 s for navigation compared to 56 s for the conventional technique) and in drilling corrections (0.2 compared to 3.4) were also significant. Computer-guided drilling using the fluoroscopically based VectorVision navigation system shows clearly improved precision with an enormous simultaneous reduction in radiation exposure. It is therefore recommended for clinical routine.

  7. Development of an RF-EMF Exposure Surrogate for Epidemiologic Research.

    PubMed

    Roser, Katharina; Schoeni, Anna; Bürgi, Alfred; Röösli, Martin

    2015-05-22

Exposure assessment is a crucial part of studying potential effects of RF-EMF. Using data from the HERMES study on adolescents, we developed an integrative exposure surrogate combining near-field and far-field RF-EMF exposure in a single brain and whole-body exposure measure. Contributions from far-field sources were modelled by propagation modelling and multivariable regression modelling using personal measurements. Contributions from near-field sources were assessed from both questionnaires and mobile phone operator records. Mean cumulative brain and whole-body doses were 1559.7 mJ/kg and 339.9 mJ/kg per day, respectively. 98.4% of the brain dose originated from near-field sources, mainly from GSM mobile phone calls (93.1%) and DECT phone calls (4.8%). Main contributors to the whole-body dose were GSM mobile phone calls (69.0%), use of computers, laptops and tablets connected to WLAN (12.2%), and data traffic on the mobile phone via WLAN (6.5%). Exposure from mobile phone base stations contributed 1.8% to the whole-body dose, while uplink exposure from other people's mobile phones contributed 3.6%. In conclusion, the proposed approach is considered useful for combining near-field and far-field exposure into an integrative exposure surrogate for exposure assessment in epidemiologic studies. However, substantial uncertainties remain about exposure contributions from various near-field and far-field sources.
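An integrative surrogate of this kind reduces to a duration-weighted sum of source-specific dose rates. A minimal sketch, in which the dose-rate values and the `daily_dose` helper are invented placeholders rather than the dosimetric quantities derived in the study:

```python
# Hypothetical brain dose rates in mJ/kg per minute of use -- the
# study derived its values from propagation and dosimetric modelling,
# not from these numbers.
DOSE_RATE = {
    "gsm_call": 7.5,      # GSM voice call held at the head
    "dect_call": 0.6,     # cordless DECT phone call
    "wlan_data": 0.02,    # data traffic over WLAN
}

def daily_dose(minutes_by_source):
    """Integrative surrogate: sum over sources of duration x dose rate."""
    return sum(DOSE_RATE[src] * t for src, t in minutes_by_source.items())

dose = daily_dose({"gsm_call": 10, "dect_call": 20, "wlan_data": 60})
```

The same accumulation extends to far-field sources by adding terms for modelled ambient exposure, which is how near- and far-field contributions end up in a single measure.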

  8. Development of an RF-EMF Exposure Surrogate for Epidemiologic Research

    PubMed Central

    Roser, Katharina; Schoeni, Anna; Bürgi, Alfred; Röösli, Martin

    2015-01-01

Exposure assessment is a crucial part of studying potential effects of RF-EMF. Using data from the HERMES study on adolescents, we developed an integrative exposure surrogate combining near-field and far-field RF-EMF exposure in a single brain and whole-body exposure measure. Contributions from far-field sources were modelled by propagation modelling and multivariable regression modelling using personal measurements. Contributions from near-field sources were assessed from both questionnaires and mobile phone operator records. Mean cumulative brain and whole-body doses were 1559.7 mJ/kg and 339.9 mJ/kg per day, respectively. 98.4% of the brain dose originated from near-field sources, mainly from GSM mobile phone calls (93.1%) and DECT phone calls (4.8%). Main contributors to the whole-body dose were GSM mobile phone calls (69.0%), use of computers, laptops and tablets connected to WLAN (12.2%), and data traffic on the mobile phone via WLAN (6.5%). Exposure from mobile phone base stations contributed 1.8% to the whole-body dose, while uplink exposure from other people's mobile phones contributed 3.6%. In conclusion, the proposed approach is considered useful for combining near-field and far-field exposure into an integrative exposure surrogate for exposure assessment in epidemiologic studies. However, substantial uncertainties remain about exposure contributions from various near-field and far-field sources. PMID:26006132

  9. Sample size calculations for case-control studies

    Cancer.gov

This R package can be used to calculate the required sample size for unconditional multivariate analyses of unmatched case-control studies. The sample sizes are for a scalar exposure effect, such as binary, ordinal, or continuous exposures. The sample sizes can also be computed for scalar interaction effects. The analyses account for the effects of potential confounder variables that are also included in the multivariate logistic model.
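The package's exact algorithm is not reproduced here, but the classical two-proportion approximation conveys the shape of the computation for a binary exposure without confounders. `case_control_n` and its parameter values are illustrative assumptions, not the package's API:

```python
from statistics import NormalDist

def case_control_n(p0, odds_ratio, alpha=0.05, power=0.80):
    """Approximate number of cases (= controls, 1:1 unmatched design)
    needed to detect a given odds ratio for a binary exposure, via the
    classical two-proportion formula. p0 is the exposure prevalence
    among controls."""
    # Exposure prevalence among cases implied by the odds ratio
    p1 = odds_ratio * p0 / (1 + p0 * (odds_ratio - 1))
    z_a = NormalDist().inv_cdf(1 - alpha / 2)   # two-sided alpha
    z_b = NormalDist().inv_cdf(power)
    variance = p0 * (1 - p0) + p1 * (1 - p1)
    return (z_a + z_b) ** 2 * variance / (p1 - p0) ** 2

# e.g. 20% exposure among controls, target OR of 2.0 -> ~169 per group
n_per_group = case_control_n(p0=0.20, odds_ratio=2.0)
```

Adjusting for confounders in the multivariate logistic model inflates this number, which is the part the package handles beyond this simple formula.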

  10. FDTD computation of temperature elevation in the elderly for far-field RF exposures.

    PubMed

    Nomura, Tomoki; Laakso, Ilkka; Hirata, Akimasa

    2014-03-01

Core temperature elevation and perspiration in younger and older adults are investigated for plane-wave exposure at a whole-body averaged specific absorption rate of 0.4 W kg⁻¹. A numerical Japanese male model is considered together with a thermoregulatory response formula proposed in the authors' previous study. The frequencies considered were 65 MHz and 2 GHz, where the total power absorption in humans becomes maximal for the allowable power density prescribed in the international guidelines. The computational results show that the core temperature elevation in the older adult model was larger than that in the younger one at both frequencies. This difference is attributable to differences in sweating, which originate from a difference in the threshold activating sweating and from the decline of sweating in the legs.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Jordan N.; Hinderliter, Paul M.; Timchalk, Charles

Sensitivity to chemicals in animals and humans is known to vary with age. Age-related changes in sensitivity to chlorpyrifos (CPF) have been reported in animal models. A life-stage physiologically based pharmacokinetic and pharmacodynamic (PBPK/PD) model was developed to computationally predict disposition of CPF and its metabolites, chlorpyrifos-oxon (the ultimate toxicant) and 3,5,6-trichloro-2-pyridinol (TCPy), as well as B-esterase inhibition by chlorpyrifos-oxon in humans. In this model, age-dependent body weight was calculated from a generalized Gompertz function, and compartments (liver, brain, fat, blood, diaphragm, rapid, and slow) were scaled based on body weight from polynomial functions on a fractional body weight basis. Blood flows among compartments were calculated as a constant flow per compartment volume. The life-stage PBPK/PD model was calibrated and tested against controlled adult human exposure studies. Model simulations suggest age-dependent pharmacokinetics and response may exist. At oral doses ≥ 0.55 mg/kg of chlorpyrifos (significantly higher than environmental exposure levels), 6-mo-old children are predicted to have higher levels of chlorpyrifos-oxon in blood and higher levels of red blood cell cholinesterase inhibition compared to adults given equivalent oral doses of chlorpyrifos. At lower doses that are more relevant to environmental exposures, the model predicts that adults will have slightly higher levels of chlorpyrifos-oxon in blood and greater cholinesterase inhibition. This model provides a computational framework for age-comparative simulations that can be utilized to predict CPF disposition and biological response over various postnatal life stages.
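The life-stage scaling idea can be sketched in a few lines. The Gompertz parameters and fractional volumes below are made-up illustrative values, not the published model's fitted coefficients (which also use age-dependent polynomial fractions rather than constants):

```python
import math

def gompertz_bw(age_yr, bw_max=70.0, b=2.3, k=0.2):
    """Gompertz growth curve for body weight (kg). Parameter values
    are illustrative placeholders, not the published fit."""
    return bw_max * math.exp(-b * math.exp(-k * age_yr))

# Fixed fractional volumes stand in for the paper's age-dependent
# polynomial fractions (illustrative values only).
FRACTIONS = {"liver": 0.026, "brain": 0.02, "fat": 0.21, "blood": 0.079}

def compartment_volumes(age_yr):
    """Scale compartment volumes (L, assuming ~1 kg/L tissue density)
    as fractions of the age-dependent body weight."""
    bw = gompertz_bw(age_yr)
    return {tissue: frac * bw for tissue, frac in FRACTIONS.items()}
```

With blood flows set proportional to compartment volume, the whole physiology then tracks a single age input, which is what makes age-comparative simulation tractable.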

  12. Impact of temporal upscaling and chemical transport model horizontal resolution on reducing ozone exposure misclassification

    NASA Astrophysics Data System (ADS)

    Xu, Yadong; Serre, Marc L.; Reyes, Jeanette M.; Vizuete, William

    2017-10-01

We have developed a Bayesian Maximum Entropy (BME) framework that integrates observations from a surface monitoring network and predictions from a Chemical Transport Model (CTM) to create improved exposure estimates that can be resolved to any spatial and temporal resolution. The flexibility of the framework allows for input of data at any choice of time scale and CTM predictions of any spatial resolution, with varying associated degrees of estimation error and cost in terms of implementation and computation. This study quantifies the impact of these choices on exposure estimation error by first comparing estimation errors when BME relied on ozone concentration data either as an hourly average, the daily maximum 8-h average (DM8A), or the daily 24-h average (D24A). Our analysis found that the use of DM8A and D24A data, although less computationally intensive, reduced estimation error more than the use of hourly data. This was primarily due to the poorer CTM performance for the hourly average predicted ozone. Our second analysis compared spatial variability and estimation errors when BME relied on CTM predictions with a grid cell resolution of 12 × 12 km² versus a coarser resolution of 36 × 36 km². Our analysis found that integrating the finer-resolution CTM predictions not only reduced estimation error but also increased the spatial variability in daily ozone estimates fivefold. This improvement was due to the improved spatial gradients and model performance of the finer-resolved CTM simulation. The integration of observations and model predictions that is permitted in a BME framework continues to be a powerful approach for improving exposure estimates of ambient air pollution. The results of this analysis demonstrate the importance of also understanding model performance variability and its implications for exposure error.
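The temporal metrics being compared can be computed directly from hourly data. A minimal sketch, using the simplifying convention that the 8-h windows lie entirely within one day (regulatory definitions also allow windows spanning midnight):

```python
def daily_max_8h_avg(hourly):
    """Daily maximum 8-h average (DM8A): the largest mean over the 17
    8-h windows that fit entirely within the day's 24 hourly values."""
    assert len(hourly) == 24
    return max(sum(hourly[i:i + 8]) / 8 for i in range(17))

def daily_24h_avg(hourly):
    """Daily 24-h average (D24A)."""
    return sum(hourly) / len(hourly)
```

Both aggregates discard the hour-to-hour detail, which is why they are cheaper inputs to BME, and they sidestep the hourly time scale where the CTM performs worst.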

  13. The Virtual Liver Project: Modeling Tissue Response To Chemicals Through Multiscale Simulation

    EPA Science Inventory

    The US EPA Virtual Liver Project is aimed at simulating the risk of toxic effects from environmental chemicals in silico. The computational systems model of organ injury due to chronic chemical exposure is based on: (i) the dynamics of perturbed molecular pathways, (ii) their lin...

  14. Developing Predictive Approaches to Characterize Adaptive Responses of the Reproductive Endocrine Axis to Aromatase Inhibition: Computational Modeling

    EPA Science Inventory

    Exposure to endocrine disrupting chemicals can affect reproduction and development in both humans and wildlife. We developed a mechanistic mathematical model of the hypothalamic-pituitary-gonadal (HPG) axis in female fathead minnows to predict dose-response and time-course (DRTC)...

  15. Global Dynamic Exposure and the OpenBuildingMap

    NASA Astrophysics Data System (ADS)

    Schorlemmer, D.; Beutin, T.; Hirata, N.; Hao, K. X.; Wyss, M.; Cotton, F.; Prehn, K.

    2015-12-01

    Detailed understanding of local risk factors regarding natural catastrophes requires in-depth characterization of the local exposure. Current exposure capture techniques have to find the balance between resolution and coverage. We aim at bridging this gap by employing a crowd-sourced approach to exposure capturing focusing on risk related to earthquake hazard. OpenStreetMap (OSM), the rich and constantly growing geographical database, is an ideal foundation for us. More than 2.5 billion geographical nodes, more than 150 million building footprints (growing by ~100'000 per day), and a plethora of information about school, hospital, and other critical facility locations allow us to exploit this dataset for risk-related computations. We will harvest this dataset by collecting exposure and vulnerability indicators from explicitly provided data (e.g. hospital locations), implicitly provided data (e.g. building shapes and positions), and semantically derived data, i.e. interpretation applying expert knowledge. With this approach, we can increase the resolution of existing exposure models from fragility classes distribution via block-by-block specifications to building-by-building vulnerability. To increase coverage, we will provide a framework for collecting building data by any person or community. We will implement a double crowd-sourced approach to bring together the interest and enthusiasm of communities with the knowledge of earthquake and engineering experts. The first crowd-sourced approach aims at collecting building properties in a community by local people and activists. This will be supported by tailored building capture tools for mobile devices for simple and fast building property capturing. 
The second crowd-sourced approach involves local experts in estimating building vulnerability that will provide building classification rules that translate building properties into vulnerability and exposure indicators as defined in the Building Taxonomy 2.0 developed by the Global Earthquake Model (GEM). These indicators will then be combined with a hazard model using the GEM OpenQuake engine to compute a risk model. The free/open framework we will provide can be used on commodity hardware for local to regional exposure capturing and for communities to understand their earthquake risk.

  16. A Synchronization Account of False Recognition

    ERIC Educational Resources Information Center

    Johns, Brendan T.; Jones, Michael N.; Mewhort, Douglas J. K.

    2012-01-01

    We describe a computational model to explain a variety of results in both standard and false recognition. A key attribute of the model is that it uses plausible semantic representations for words, built through exposure to a linguistic corpus. A study list is encoded in the model as a gist trace, similar to the proposal of fuzzy trace theory…

  17. Pregnant women and children's exposure to tobacco and solid fuel smoke in southwestern India.

    PubMed

    Kelly, Patricia J; Goudar, Shivaprasad S; Chakraborty, Hrishikesh; Moore, Janet; Derman, Richard; Kodkany, Bhala; Bellad, Mrutyunjaya; Naik, Vijjaya A; Angolkar, Mubashir; Bloch, Michele

    2011-07-01

To examine factors associated with smoke exposure among pregnant women in rural India, we conducted a survey of exposure to second-hand smoke (SHS) and solid fuel smoke (SFS) among 736 pregnant women. Odds ratios (OR) and 95% confidence intervals (CI) were computed using logistic regression models to assess the relationship between demographic variables and exposure to SHS and to SFS. While few respondents smoked cigarettes, 19.9% of women and 27.8% of children were frequently or always exposed to SHS, and 43.5% were at high and 46.7% at medium risk for SFS exposure. Low educational levels and illiteracy were associated with exposure. Smoke exposure is a serious health risk for many poor women and children in India.
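The study's estimates came from multivariable logistic models; as an unadjusted illustration of the quantity being reported, an odds ratio with a Woolf (Wald) confidence interval can be computed from a 2×2 table. The counts below are hypothetical, not the survey's data:

```python
from math import exp, log, sqrt
from statistics import NormalDist

def odds_ratio_ci(a, b, c, d, alpha=0.05):
    """Unadjusted odds ratio with a Woolf (Wald) CI from a 2x2 table:
    a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se_log = sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
    z = NormalDist().inv_cdf(1 - alpha / 2)
    return or_, exp(log(or_) - z * se_log), exp(log(or_) + z * se_log)

# Hypothetical counts: 20/80 exposed among cases, 10/90 among controls
or_, lo, hi = odds_ratio_ci(20, 80, 10, 90)
```

The logistic-regression version reported in the paper yields the same estimand while adjusting for the other demographic covariates in the model.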

  18. An Integrated Model of the Cardiovascular and Central Nervous Systems for Analysis of Microgravity Induced Fluid Redistribution

    NASA Technical Reports Server (NTRS)

    Price, R.; Gady, S.; Heinemann, K.; Nelson, E. S.; Mulugeta, L.; Ethier, C. R.; Samuels, B. C.; Feola, A.; Vera, J.; Myers, J. G.

    2015-01-01

A recognized side effect of prolonged microgravity exposure is visual impairment and intracranial pressure (VIIP) syndrome. The medical understanding of this phenomenon is at present preliminary, although it is hypothesized that the headward shift of bodily fluids in microgravity may be a contributor. Computational models can be used to provide insight into the origins of VIIP. To further investigate this phenomenon, NASA's Digital Astronaut Project (DAP) is developing an integrated computational model of the human body which is divided into the eye, the cerebrovascular system, and the cardiovascular system. This presentation will focus on the development and testing of an integrated model of the cardiovascular system (CVS) and central nervous system (CNS) that simulates the behavior of pressures, volumes, and flows within these two physiological systems.

  19. Exploration of the molecular basis of blast injury in a biofidelic model of traumatic brain injury

    NASA Astrophysics Data System (ADS)

    Thielen, P.; Mehoke, T.; Gleason, J.; Iwaskiw, A.; Paulson, J.; Merkle, A.; Wester, B.; Dymond, J.

    2018-01-01

    Biological response to blast overpressure is complex and results in various and potentially non-concomitant acute and long-term deficits to exposed individuals. Clinical links between blast severity and injury outcomes remain elusive and have yet to be fully described, resulting in a critical inability to develop associated protection and mitigation strategies. Further, experimental models frequently fail to reproduce observed physiological phenomena and/or introduce artifacts that confound analysis and reproducibility. New models are required that employ consistent mechanical inputs, scale with biological analogs and known clinical data, and permit high-throughput examination of biological responses for a range of environmental and battlefield- relevant exposures. Here we describe a novel, biofidelic headform capable of integrating complex biological samples for blast exposure studies. We additionally demonstrate its utility in detecting acute transcriptional responses in the model organism Caenorhabditis elegans after exposure to blast overpressure. This approach enables correlation between mechanical exposure and biological outcome, permitting both the enhancement of existing surrogate and computational models and the high-throughput biofidelic testing of current and future protection systems.

  20. Accommodating the ecological fallacy in disease mapping in the absence of individual exposures.

    PubMed

    Wang, Feifei; Wang, Jian; Gelfand, Alan; Li, Fan

    2017-12-30

In health exposure modeling, and in particular disease mapping, the ecological fallacy arises because the relationship between aggregated disease incidence on areal units and average exposure on those units differs from the relationship between the event of individual incidence and the associated individual exposure. This article presents a novel modeling approach to address the ecological fallacy in the least informative data setting. We assume a known population at risk with an observed incidence for a collection of areal units and, separately, environmental exposure recorded during the period of incidence at a collection of monitoring stations. We do not assume any partial individual-level information or random allocation of individuals to observed exposures. We specify a conceptual incidence surface over the study region as a function of an exposure surface, resulting in a stochastic integral for the block-average disease incidence. The true block-level incidence is an unavailable Monte Carlo integration for this stochastic integral. We propose an alternative, manageable Monte Carlo integration for the integral. Modeling in this setting is immediately hierarchical, and we fit our model within a Bayesian framework. To alleviate the resulting computational burden, we offer two strategies for efficient model fitting: one is through modularization, the other is through sparse or dimension-reduced Gaussian processes. We illustrate the performance of our model with simulations based on a heat-related mortality dataset in Ohio and then analyze associated real data. Copyright © 2017 John Wiley & Sons, Ltd.
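The Monte Carlo integration at the heart of the approach can be sketched for a unit-square areal unit. `exposure` and `rate` below are illustrative stand-ins for the exposure surface and the individual-level incidence function, not the paper's specification:

```python
import math
import random

def block_average_incidence(exposure, rate, n=20_000, seed=42):
    """Monte Carlo estimate of the block-averaged incidence
    (1/|B|) * integral over B of rate(exposure(s)) ds,
    for a unit-square areal unit B sampled uniformly."""
    rng = random.Random(seed)
    return sum(rate(exposure((rng.random(), rng.random())))
               for _ in range(n)) / n

# Illustrative stand-ins: a linear exposure surface and a rate that is
# nonlinear in exposure.
exposure = lambda s: 10.0 * s[0]
rate = lambda x: 1.0 - math.exp(-0.1 * x)

block_rate = block_average_incidence(exposure, rate)  # average of rates
naive_rate = rate(5.0)  # rate at the block-mean exposure
```

Because the rate is nonlinear, the block average of the rates differs from the rate at the block-average exposure, which is precisely the gap the ecological fallacy opens.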

  1. Virtual Reality versus Computer-Aided Exposure Treatments for Fear of Flying

    ERIC Educational Resources Information Center

    Tortella-Feliu, Miquel; Botella, Cristina; Llabres, Jordi; Breton-Lopez, Juana Maria; del Amo, Antonio Riera; Banos, Rosa M.; Gelabert, Joan M.

    2011-01-01

    Evidence is growing that two modalities of computer-based exposure therapies--virtual reality and computer-aided psychotherapy--are effective in treating anxiety disorders, including fear of flying. However, they have not yet been directly compared. The aim of this study was to analyze the efficacy of three computer-based exposure treatments for…

  2. Model for Porosity Changes Occurring during Ultrasound-Enhanced Transcorneal Drug Delivery.

    PubMed

    Hariharan, Prasanna; Nabili, Marjan; Guan, Allan; Zderic, Vesna; Myers, Matthew

    2017-06-01

Ultrasound-enhanced drug delivery through the cornea has considerable therapeutic potential. However, our understanding of how ultrasound enhances drug transport is poor, as is our ability to predict the increased level of transport for given ultrasound parameters. Described here is a computational model for quantifying changes in corneal porosity during ultrasound exposure. The model is calibrated through experiments involving sodium fluorescein transport through rabbit cornea. Validation was performed using nylon filters, for which the properties are known. It was found that exposure to 800-kHz ultrasound at an intensity of 2 W/cm² for 5 min increased the porosity of the epithelium by a factor of 5. The model can be useful for determining the extent to which ultrasound enhances the amount of drug transported through biological barriers, and the time at which a therapeutic dose is achieved at a given location, for different drugs and exposure strategies. Published by Elsevier Inc.

  3. Impact of AMS-02 Measurements on Reducing GCR Model Uncertainties

    NASA Technical Reports Server (NTRS)

    Slaba, T. C.; O'Neill, P. M.; Golge, S.; Norbury, J. W.

    2015-01-01

    For vehicle design, shield optimization, mission planning, and astronaut risk assessment, the exposure from galactic cosmic rays (GCR) poses a significant and complex problem both in low Earth orbit and in deep space. To address this problem, various computational tools have been developed to quantify the exposure and risk in a wide range of scenarios. Generally, the tool used to describe the ambient GCR environment provides the input into subsequent computational tools and is therefore a critical component of end-to-end procedures. Over the past few years, several researchers have independently and very carefully compared some of the widely used GCR models to more rigorously characterize model differences and quantify uncertainties. All of the GCR models studied rely heavily on calibrating to available near-Earth measurements of GCR particle energy spectra, typically over restricted energy regions and short time periods. In this work, we first review recent sensitivity studies quantifying the ions and energies in the ambient GCR environment of greatest importance to exposure quantities behind shielding. Currently available measurements used to calibrate and validate GCR models are also summarized within this context. It is shown that the AMS-II measurements will fill a critically important gap in the measurement database. The emergence of AMS-II measurements also provides a unique opportunity to validate existing models against measurements that were not used to calibrate free parameters in the empirical descriptions. Discussion is given regarding rigorous approaches to implement the independent validation efforts, followed by recalibration of empirical parameters.

  4. Systems Toxicology: From Basic Research to Risk Assessment

    PubMed Central

    2014-01-01

    Systems Toxicology is the integration of classical toxicology with quantitative analysis of large networks of molecular and functional changes occurring across multiple levels of biological organization. Society demands increasingly close scrutiny of the potential health risks associated with exposure to chemicals present in our everyday life, leading to an increasing need for more predictive and accurate risk-assessment approaches. Developing such approaches requires a detailed mechanistic understanding of the ways in which xenobiotic substances perturb biological systems and lead to adverse outcomes. Thus, Systems Toxicology approaches offer modern strategies for gaining such mechanistic knowledge by combining advanced analytical and computational tools. Furthermore, Systems Toxicology is a means for the identification and application of biomarkers for improved safety assessments. In Systems Toxicology, quantitative systems-wide molecular changes in the context of an exposure are measured, and a causal chain of molecular events linking exposures with adverse outcomes (i.e., functional and apical end points) is deciphered. Mathematical models are then built to describe these processes in a quantitative manner. The integrated data analysis leads to the identification of how biological networks are perturbed by the exposure and enables the development of predictive mathematical models of toxicological processes. This perspective integrates current knowledge regarding bioanalytical approaches, computational analysis, and the potential for improved risk assessment. PMID:24446777

  5. Systems toxicology: from basic research to risk assessment.

    PubMed

    Sturla, Shana J; Boobis, Alan R; FitzGerald, Rex E; Hoeng, Julia; Kavlock, Robert J; Schirmer, Kristin; Whelan, Maurice; Wilks, Martin F; Peitsch, Manuel C

    2014-03-17

    Systems Toxicology is the integration of classical toxicology with quantitative analysis of large networks of molecular and functional changes occurring across multiple levels of biological organization. Society demands increasingly close scrutiny of the potential health risks associated with exposure to chemicals present in our everyday life, leading to an increasing need for more predictive and accurate risk-assessment approaches. Developing such approaches requires a detailed mechanistic understanding of the ways in which xenobiotic substances perturb biological systems and lead to adverse outcomes. Thus, Systems Toxicology approaches offer modern strategies for gaining such mechanistic knowledge by combining advanced analytical and computational tools. Furthermore, Systems Toxicology is a means for the identification and application of biomarkers for improved safety assessments. In Systems Toxicology, quantitative systems-wide molecular changes in the context of an exposure are measured, and a causal chain of molecular events linking exposures with adverse outcomes (i.e., functional and apical end points) is deciphered. Mathematical models are then built to describe these processes in a quantitative manner. The integrated data analysis leads to the identification of how biological networks are perturbed by the exposure and enables the development of predictive mathematical models of toxicological processes. This perspective integrates current knowledge regarding bioanalytical approaches, computational analysis, and the potential for improved risk assessment.

  6. Rotenone and paraquat perturb dopamine metabolism: a computational analysis of pesticide toxicity

    PubMed Central

    Qi, Zhen; Miller, Gary W.; Voit, Eberhard O.

    2014-01-01

    Pesticides, such as rotenone and paraquat, are suspected in the pathogenesis of Parkinson’s disease (PD), whose hallmark is the progressive loss of dopaminergic neurons in the substantia nigra pars compacta. Thus, compounds expected to play a role in the pathogenesis of PD will likely impact the function of dopaminergic neurons. To explore the relationship between pesticide exposure and dopaminergic toxicity, we developed a custom-tailored mathematical model of dopamine metabolism and utilized it to infer potential mechanisms underlying the toxicity of rotenone and paraquat, asking how these pesticides perturb specific processes. We performed two types of analyses, which are conceptually different and complement each other. The first analysis, a purely algebraic reverse engineering approach, analytically and deterministically computes the altered profile of enzyme activities that characterize the effects of a pesticide. The second method consists of large-scale Monte Carlo simulations that statistically reveal possible mechanisms of pesticides. The results from the reverse engineering approach show that rotenone and paraquat exposures lead to distinctly different flux perturbations. Rotenone seems to affect all fluxes associated with dopamine compartmentalization, whereas paraquat exposure perturbs fluxes associated with dopamine and its breakdown metabolites. The statistical results of the Monte-Carlo analysis suggest several specific mechanisms. The findings are interesting, because no a priori assumptions are made regarding specific pesticide actions, and all parameters characterizing the processes in the dopamine model are treated in an unbiased manner. Our results show how approaches from computational systems biology can help identify mechanisms underlying the toxicity of pesticide exposure. PMID:24269752

  7. Performance characteristics of a Kodak computed radiography system.

    PubMed

    Bradford, C D; Peppler, W W; Dobbins, J T

    1999-01-01

The performance characteristics of a photostimulable phosphor based computed radiographic (CR) system were studied. The modulation transfer function (MTF), noise power spectra (NPS), and detective quantum efficiency (DQE) of the Kodak Digital Science computed radiography (CR) system (Eastman Kodak Co.-model 400) were measured and compared to previously published results of a Fuji based CR system (Philips Medical Systems-PCR model 7000). To maximize comparability, the same measurement techniques and analysis methods were used. The DQE at four exposure levels (30, 3, 0.3, 0.03 mR) and two plate types (standard and high resolution) were calculated from the NPS and MTF measurements. The NPS was determined from two-dimensional Fourier analysis of uniformly exposed plates. The presampling MTF was determined from the Fourier transform (FT) of the system's finely sampled line spread function (LSF) as produced by a narrow slit. A comparison of the slit type ("beveled edge" versus "straight edge") and its effect on the resulting MTF measurements was also performed. The results show that both systems are comparable in resolution performance. The noise power studies indicated a higher level of noise for the Kodak images (approximately 20% higher at the low exposure levels and 40%-70% higher at the higher exposure levels). Within the clinically relevant exposure range (0.3-3 mR), the resulting DQE for the Kodak plates ranged from 20% to 50% lower than for the corresponding Fuji plates. Measurements of the presampling MTF with the two slit types have shown that a correction factor can be applied to compensate for transmission through the relief edges.
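The MTF-from-LSF step can be illustrated in a few lines: the presampling MTF is the modulus of the Fourier transform of the area-normalized line spread function. The Gaussian LSF below is synthetic, standing in for a measured slit profile:

```python
import cmath
import math

def mtf_at(lsf, dx, freq):
    """Presampling MTF at spatial frequency `freq` (cycles/mm): the
    modulus of the discrete Fourier transform of the area-normalized
    line spread function sampled at spacing dx (mm)."""
    ft = sum(v * cmath.exp(-2j * math.pi * freq * i * dx)
             for i, v in enumerate(lsf))
    return abs(ft) / sum(lsf)

# Synthetic Gaussian LSF (sigma = 0.2 mm) standing in for a measured
# slit profile; its analytic MTF is exp(-2 pi^2 sigma^2 f^2).
lsf = [math.exp(-((i - 128) * 0.05) ** 2 / (2 * 0.2 ** 2))
       for i in range(256)]
m0 = mtf_at(lsf, 0.05, 0.0)   # = 1 by normalization
m1 = mtf_at(lsf, 0.05, 1.0)   # ~0.45 for this sigma
```

Taking the modulus makes the result independent of where the slit sits in the sampled window, which is why only the LSF's shape, not its position, matters here.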

  8. Testing of the European Union exposure-response relationships and annoyance equivalents model for annoyance due to transportation noises: The need of revised exposure-response relationships and annoyance equivalents model.

    PubMed

    Gille, Laure-Anne; Marquis-Favre, Catherine; Morel, Julien

    2016-09-01

An in situ survey was performed in 8 French cities in 2012 to study annoyance due to combined transportation noises. As the European Commission recommends using the exposure-response relationships suggested by Miedema and Oudshoorn [Environmental Health Perspectives, 2001] to predict annoyance due to a single transportation noise, these exposure-response relationships were tested against the annoyance due to each transportation noise measured during the French survey. These relationships only enabled a good prediction of the percentages of people highly annoyed by road traffic noise. For the percentages of people annoyed and a little annoyed by road traffic noise, the quality of prediction was weak. For aircraft and railway noises, prediction of annoyance was not satisfactory either. As a consequence, the annoyance equivalents model of Miedema [The Journal of the Acoustical Society of America, 2004], based on these exposure-response relationships, did not enable a good prediction of annoyance due to combined transportation noises. Local exposure-response relationships were derived, following the whole computation suggested by Miedema and Oudshoorn [Environmental Health Perspectives, 2001]. They led to a better calculation of annoyance due to each transportation noise in the French cities. A new version of the annoyance equivalents model was proposed using these new exposure-response relationships. This model enabled a better prediction of the total annoyance due to the combined transportation noises. These results therefore encourage improving annoyance prediction for noises in isolation with local or revised exposure-response relationships, which will also contribute to improving annoyance modeling for combined noises. With this aim in mind, a methodology is proposed to consider noise sensitivity in exposure-response relationships and in the annoyance equivalents model. The results showed that taking this variable into account did not enhance either the exposure-response relationships or the annoyance equivalents model. Copyright © 2016 Elsevier Ltd. All rights reserved.

  9. Foveal light exposure is increased at the time of removal of silicone oil with the potential for phototoxicity.

    PubMed

    Dogramaci, Mahmut; Williams, Katie; Lee, Ed; Williamson, Tom H

    2013-01-01

    There is sudden and dramatic visual function deterioration in 1-10% of eyes filled with silicone oil at the time of removal of the silicone oil. Transmission of high-energy blue light is increased in eyes filled with silicone oil. We sought to identify whether increased foveal light exposure is a potential factor in the pathophysiology of the visual loss at the time of removal of silicone oil. A graphic ray tracing computer program and laboratory models were used to determine the effect of the intraocular silicone oil bubble size on the foveal illuminance at the time of removal of silicone oil under direct microscope light. The graphic ray tracing computer program revealed a range of optical vignetting effects created by different sizes of silicone oil bubble within the vitreous cavity, giving rise to an uneven macular illumination. The laboratory model was used to quantify the variation of illuminance at the foveal region with different sizes of silicone oil bubble within the vitreous cavity at the time of removal of silicone oil under direct microscope light. To substantiate the hypothesis of light toxicity during removal of silicone oil, the outcomes of oil removal procedures performed under direct microscope illumination were compared with those performed under blocked illumination. The computer program showed that the optical vignetting effect at the macula was dependent on the size of the intraocular silicone oil bubble. The laboratory eye model showed that the foveal illuminance followed a bell-shaped curve, with 70% greater illuminance demonstrated at 50-60% silicone oil fill. The clinical data identified five eyes with unexplained vision loss out of 114 eyes that had the procedure performed under direct microscope illumination, compared to none out of 78 eyes that had the procedure under blocked illumination. Foveal light exposure, and therefore the potential for phototoxicity, is transiently increased at the time of removal of silicone oil. 
    This is due to uneven macular illumination resulting from the optical vignetting effect of different silicone oil bubble sizes. The increase in foveal light exposure may be significant when the procedure is performed under bright operating microscope light on already stressed photoreceptors of an eye filled with silicone oil. We advocate the use of precautions, such as a central shadow filter on the operating microscope light source, to reduce foveal light exposure and the risk of phototoxicity at the time of removal of silicone oil. The graphic ray tracing computer program used in this study shows promise in eye modeling for future studies.

  10. Modeling of road traffic noise and estimated human exposure in Fulton County, Georgia, USA.

    PubMed

    Seong, Jeong C; Park, Tae H; Ko, Joon H; Chang, Seo I; Kim, Minho; Holt, James B; Mehdi, Mohammed R

    2011-11-01

    Environmental noise is a major source of public complaints. Noise in the community causes physical and socio-economic effects and has been shown to be related to adverse health impacts. Noise, however, has not been actively researched in the United States compared with the European Union countries in recent years. In this research, we aimed at modeling road traffic noise and analyzing human exposure in Fulton County, Georgia, United States. We modeled road traffic noise levels using the United States Department of Transportation Federal Highway Administration Traffic Noise Model implemented in SoundPLAN®. After analyzing noise levels with raster, vector and façade maps, we estimated human exposure to high noise levels. Accurate digital elevation models and building heights were derived from Light Detection And Ranging survey datasets and building footprint boundaries. Traffic datasets were collected from the Georgia Department of Transportation and the Atlanta Regional Commission. Noise level simulation was performed with 62 computers in a distributed computing environment. Finally, the noise-exposed population was calculated using geographic information system techniques. Results show that 48% of the total county population [N=870,166 residents] is potentially exposed to 55 dB(A) or higher noise levels during daytime. About 9% of the population is potentially exposed to 67 dB(A) or higher noise levels. At nighttime, 32% of the population is expected to be exposed to noise levels higher than 50 dB(A). This research shows that large-scale traffic noise estimation is possible with the help of various organizations. We believe that this research is a significant stepping stone for analyzing community health associated with noise exposures in the United States. Copyright © 2011 Elsevier Ltd. All rights reserved.
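    The final population-exposure step described above reduces to aggregating modeled noise levels over population units. A minimal sketch of that GIS step, using hypothetical block-level data rather than the study's datasets:

```python
# Minimal sketch: estimate the share of a population exposed at or above a
# noise threshold, given per-block population counts and modeled noise levels.
# The block data below are hypothetical, not from the Fulton County study.

def exposed_fraction(blocks, threshold_dba):
    """Fraction of total population living in blocks at/above threshold_dba."""
    total = sum(pop for pop, _ in blocks)
    exposed = sum(pop for pop, level in blocks if level >= threshold_dba)
    return exposed / total

# (population, modeled daytime level in dB(A)) per census block
blocks = [(1200, 58.0), (800, 49.5), (1500, 67.2), (500, 52.3)]
day_share = exposed_fraction(blocks, 55.0)
```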

  11. Modelling indirect interactions during failure spreading in a project activity network.

    PubMed

    Ellinas, Christos

    2018-03-12

    Spreading broadly refers to the notion of an entity propagating throughout a networked system via its interacting components. Evidence of its ubiquity and severity can be seen in a range of phenomena, from disease epidemics to financial systemic risk. In order to understand the dynamics of these critical phenomena, computational models map the probability of propagation as a function of direct exposure, typically in the form of pairwise interactions between components. By doing so, the important role of indirect interactions remains unexplored. In response, we develop a simple model that accounts for the effect of both direct and subsequent exposure, which we deploy in the novel context of failure propagation within a real-world engineering project. We show that subsequent exposure has a significant effect in key aspects, including: (a) the final spreading event size, (b) the propagation rate, and (c) the spreading event structure. In addition, we demonstrate the existence of 'hidden influentials' in large-scale spreading events, and evaluate the role of direct and subsequent exposure in their emergence. Given the evidence of the importance of subsequent exposure, our findings offer new insight on particular aspects that need to be included when modelling network dynamics in general, and spreading processes specifically.
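    The distinction drawn above between direct and subsequent exposure can be illustrated with a toy exposure-accumulation cascade, in which a node may fail only after several neighbours have failed. The graph, weights, and threshold below are illustrative, not the paper's calibrated model:

```python
# Toy cascade: nodes accumulate exposure from every neighbour that fails
# (direct and subsequent), and fail once a threshold is crossed.

def cascade(adjacency, seeds, threshold):
    """Return the set of failed nodes after exposure-accumulation spreading."""
    failed = set(seeds)
    exposure = {node: 0.0 for node in adjacency}
    frontier = set(seeds)
    while frontier:
        new_frontier = set()
        for node in frontier:
            for nbr, weight in adjacency[node]:
                if nbr in failed:
                    continue
                exposure[nbr] += weight       # each failed neighbour adds exposure
                if exposure[nbr] >= threshold:
                    failed.add(nbr)
                    new_frontier.add(nbr)
        frontier = new_frontier
    return failed

adj = {
    "a": [("b", 0.6), ("c", 0.6)],
    "b": [("d", 0.5)],
    "c": [("d", 0.5)],
    "d": [],
}
result = cascade(adj, seeds={"a"}, threshold=0.6)
```

    Here node "d" fails only through the combined exposure from "b" and "c", which a purely pairwise (direct-exposure) model would miss.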

  12. CFD-RANS prediction of individual exposure from continuous release of hazardous airborne materials in complex urban environments

    NASA Astrophysics Data System (ADS)

    Efthimiou, G. C.; Andronopoulos, S.; Bartzis, J. G.; Berbekar, E.; Harms, F.; Leitl, B.

    2017-02-01

    One of the key issues of recent research on the dispersion inside complex urban environments is the ability to predict individual exposure (maximum dosages) of an airborne material which is released continuously from a point source. The present work addresses the question whether the computational fluid dynamics (CFD)-Reynolds-averaged Navier-Stokes (RANS) methodology can be used to predict individual exposure for various exposure times. This is feasible by providing the two RANS concentration moments (mean and variance) and a turbulent time scale to a deterministic model. The whole effort is focused on the prediction of individual exposure inside a complex real urban area. The capabilities of the proposed methodology are validated against wind-tunnel data (CUTE experiment). The present simulations were performed 'blindly', i.e. the modeller had limited information for the inlet boundary conditions and the results were kept unknown until the end of the COST Action ES1006. Thus, a high uncertainty of the results was expected. The general performance of the methodology due to this 'blind' strategy is good. The validation metrics fulfil the acceptance criteria. The effect of the grid and the turbulence model on the model performance is examined.
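    The quantity being predicted, individual exposure as a maximum dosage over a given exposure time, can be sketched as a sliding-window integral over a concentration time series. The series and window below are illustrative:

```python
# Minimal sketch: "individual exposure" as the maximum dosage (time-integrated
# concentration) over any contiguous window of a given exposure time.

def max_dosage(concentrations, dt, exposure_time):
    """Max of sum(c * dt) over contiguous windows of length exposure_time."""
    n = max(1, round(exposure_time / dt))
    best = 0.0
    for i in range(len(concentrations) - n + 1):
        best = max(best, sum(concentrations[i:i + n]) * dt)
    return best

series = [0.0, 1.0, 3.0, 2.0, 0.5, 0.0]   # illustrative concentration samples
peak = max_dosage(series, dt=1.0, exposure_time=2.0)
```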

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Callahan, M.A.

    Three major issues to be dealt with over the next ten years in the exposure assessment field are: consistency in terminology, the impact of computer technology on the choice of data and modeling, and conceptual issues such as the use of time-weighted averages.

  14. Creation of an idealized nasopharynx geometry for accurate computational fluid dynamics simulations of nasal airflow in patient-specific models lacking the nasopharynx anatomy

    PubMed Central

    Borojeni, Azadeh A.T.; Frank-Ito, Dennis O.; Kimbell, Julia S.; Rhee, John S.; Garcia, Guilherme J. M.

    2016-01-01

    Virtual surgery planning based on computational fluid dynamics (CFD) simulations has the potential to improve surgical outcomes for nasal airway obstruction (NAO) patients, but the benefits of virtual surgery planning must outweigh the risks of radiation exposure. Cone beam computed tomography (CBCT) scans represent an attractive imaging modality for virtual surgery planning due to lower costs and lower radiation exposures compared with conventional CT scans. However, to minimize the radiation exposure, the CBCT sinusitis protocol sometimes images only the nasal cavity, excluding the nasopharynx. The goal of this study was to develop an idealized nasopharynx geometry for accurate representation of outlet boundary conditions when the nasopharynx geometry is unavailable. Anatomically-accurate models of the nasopharynx created from thirty CT scans were intersected with planes rotated at different angles to obtain an average geometry. Cross sections of the idealized nasopharynx were approximated as ellipses with cross-sectional areas and aspect ratios equal to the average in the actual patient-specific models. CFD simulations were performed to investigate whether nasal airflow patterns were affected when the CT-based nasopharynx was replaced by the idealized nasopharynx in 10 NAO patients. Despite the simple form of the idealized geometry, all biophysical variables (nasal resistance, airflow rate, and heat fluxes) were very similar in the idealized vs. patient-specific models. The results confirmed the expectation that the nasopharynx geometry has a minimal effect in the nasal airflow patterns during inspiration. The idealized nasopharynx geometry will be useful in future CFD studies of nasal airflow based on medical images that exclude the nasopharynx. PMID:27525807
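    The geometric step described above, approximating each cross section as an ellipse with a prescribed area and aspect ratio, amounts to solving for the two semi-axes. A minimal sketch with illustrative values:

```python
import math

# Recover the semi-axes of an elliptical cross section from a target area and
# aspect ratio (aspect = semi-major / semi-minor), using area = pi * a * b.

def ellipse_semi_axes(area, aspect):
    """Return (semi_major, semi_minor) with pi*a*b == area and a/b == aspect."""
    semi_minor = math.sqrt(area / (math.pi * aspect))
    semi_major = aspect * semi_minor
    return semi_major, semi_minor

a, b = ellipse_semi_axes(area=2.0, aspect=4.0)  # illustrative values, cm^2
```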

  15. Reconstructing population exposures to environmental chemicals from biomarkers: challenges and opportunities.

    PubMed

    Georgopoulos, Panos G; Sasso, Alan F; Isukapalli, Sastry S; Lioy, Paul J; Vallero, Daniel A; Okino, Miles; Reiter, Larry

    2009-02-01

    A conceptual/computational framework for exposure reconstruction from biomarker data combined with auxiliary exposure-related data is presented, evaluated with example applications, and examined in the context of future needs and opportunities. This framework employs physiologically based toxicokinetic (PBTK) modeling in conjunction with numerical "inversion" techniques. To quantify the value of different types of exposure data "accompanying" biomarker data, a study was conducted focusing on reconstructing exposures to chlorpyrifos, from measurements of its metabolite levels in urine. The study employed biomarker data as well as supporting exposure-related information from the National Human Exposure Assessment Survey (NHEXAS), Maryland, while the MENTOR-3P system (Modeling ENvironment for TOtal Risk with Physiologically based Pharmacokinetic modeling for Populations) was used for PBTK modeling. Recently proposed, simple numerical reconstruction methods were applied in this study, in conjunction with PBTK models. Two types of reconstructions were studied using (a) just the available biomarker and supporting exposure data and (b) synthetic data developed via augmenting available observations. Reconstruction using only available data resulted in a wide range of variation in estimated exposures. Reconstruction using synthetic data facilitated evaluation of numerical inversion methods and characterization of the value of additional information, such as study-specific data that can be collected in conjunction with the biomarker data. Although the NHEXAS data set provides a significant amount of supporting exposure-related information, especially when compared to national studies such as the National Health and Nutrition Examination Survey (NHANES), this information is still not adequate for detailed reconstruction of exposures under several conditions, as demonstrated here. 
The analysis presented here provides a starting point for introducing improved designs for future biomonitoring studies, from the perspective of exposure reconstruction; identifies specific limitations in existing exposure reconstruction methods that can be applied to population biomarker data; and suggests potential approaches for addressing exposure reconstruction from such data.
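    The forward-model "inversion" idea at the core of this framework can be sketched with a toy monotone forward model and simple root finding. The model form and its parameters below are illustrative, not an actual PBTK model:

```python
# Toy exposure reconstruction: given a forward toxicokinetic model mapping
# daily intake to a urinary biomarker level, recover intake by bisection.

def forward_biomarker(intake_ug_per_day):
    """Hypothetical forward model: urinary metabolite (ug/L) vs daily intake."""
    return 2.0 * intake_ug_per_day / (1.0 + 0.01 * intake_ug_per_day)

def invert(observed, lo=0.0, hi=1e4):
    """Bisection on the (monotone increasing) forward model."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if forward_biomarker(mid) < observed:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

intake = invert(observed=50.0)   # reconstructed intake for a measured level
```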

  16. A New Model of Sensorimotor Coupling in the Development of Speech

    ERIC Educational Resources Information Center

    Westermann, Gert; Miranda, Eduardo Reck

    2004-01-01

    We present a computational model that learns a coupling between motor parameters and their sensory consequences in vocal production during a babbling phase. Based on the coupling, preferred motor parameters and prototypically perceived sounds develop concurrently. Exposure to an ambient language modifies perception to coincide with the sounds from…

  17. Guide to Selected Algorithms, Distributions, and Databases Used in Exposure Models Developed By the Office of Air Quality Planning and Standards

    EPA Pesticide Factsheets

    In the evaluation of emissions standards, OAQPS frequently uses one or more computer-based models to estimate the number of people who will be exposed to the air pollution levels that are expected to occur under various air quality scenarios.

  18. Developing predictive approaches to characterize adaptive responses of the reproductive endocrine axis to aromatase inhibition: I. Data generation in a small fish model

    EPA Science Inventory

    Adaptive or compensatory responses to chemical exposure can significantly influence in vivo concentration-duration-response relationships. The aim of this study was to provide data to support development of a computational dynamic model of the hypothalamic-pituitary-gonadal axis ...

  19. Dynamics of retinal photocoagulation and rupture

    NASA Astrophysics Data System (ADS)

    Sramek, Christopher; Paulus, Yannis; Nomoto, Hiroyuki; Huie, Phil; Brown, Jefferson; Palanker, Daniel

    2009-05-01

    In laser retinal photocoagulation, short (<20 ms) pulses have been found to reduce thermal damage to the inner retina, decrease treatment time, and minimize pain. However, the safe therapeutic window (defined as the ratio of power for producing a rupture to that of mild coagulation) decreases with shorter exposures. To quantify the extent of retinal heating and maximize the therapeutic window, a computational model of millisecond retinal photocoagulation and rupture was developed. Optical attenuation of 532-nm laser light in ocular tissues was measured, including retinal pigment epithelial (RPE) pigmentation and cell-size variability. Threshold powers for vaporization and RPE damage were measured with pulse durations ranging from 1 to 200 ms. A finite element model of retinal heating inferred that vaporization (rupture) takes place at 180-190°C. RPE damage was accurately described by the Arrhenius model with activation energy of 340 kJ/mol. Computed photocoagulation lesion width increased logarithmically with pulse duration, in agreement with histological findings. The model will allow for the optimization of beam parameters to increase the width of the therapeutic window for short exposures.
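    The Arrhenius description of RPE damage referenced above integrates a temperature-dependent rate over the exposure. A minimal sketch using the abstract's activation energy of 340 kJ/mol, with a hypothetical frequency factor and temperature history:

```python
import math

# Arrhenius damage integral: Omega = integral of A * exp(-Ea / (R * T(t))) dt,
# with damage conventionally reached at Omega >= 1. Ea follows the abstract;
# the frequency factor A and the temperature history are illustrative.

R = 8.314      # gas constant, J/(mol*K)
EA = 340e3     # activation energy from the abstract, J/mol
A = 1.6e52     # hypothetical frequency factor, 1/s

def damage_integral(temps_kelvin, dt):
    """Accumulate the Arrhenius damage integral over a temperature history."""
    return sum(A * math.exp(-EA / (R * t)) * dt for t in temps_kelvin)

# 10 ms at a constant elevated retinal temperature:
omega = damage_integral([330.0] * 10, dt=1e-3)
```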

  20. ART, Stoffenmanager, and TRA: A Systematic Comparison of Exposure Estimates Using the TREXMO Translation System.

    PubMed

    Savic, Nenad; Gasic, Bojan; Vernez, David

    2017-12-15

    Several occupational exposure models are recommended under the EU's REACH legislation. Due to the limited availability of high-quality exposure data, their validation is an ongoing process. It has been shown, however, that different models may calculate significantly different estimates and thus lead to potentially dangerous conclusions about chemical risk. In this paper, the between-model translation rules defined in TREXMO were used to generate 319,000 different in silico exposure situations in ART, Stoffenmanager, and ECETOC TRA v3. The three models' estimates were computed and the correlation and consistency between them were investigated. The best-correlated pair was Stoffenmanager-ART (R, 0.52-0.90), whereas the ART-TRA and Stoffenmanager-TRA correlations were either lower (R, 0.36-0.69) or absent. Consistency varied significantly according to exposure type (e.g. vapour versus dust) and setting (near-field versus far-field, indoors versus outdoors). The percentages of generated situations for which estimates differed by more than a factor of 100 ranged from 14 to 97%, 37 to 99%, and 1 to 68% for Stoffenmanager-ART, TRA-ART, and TRA-Stoffenmanager, respectively. Overall, the models were more consistent for vapours than for dusts and solids, for near-field than for far-field, and for indoor than for outdoor exposure. Multiple linear regression analyses evidenced the relationship between the models' parameters and the relative differences between the models' predictions. The relative difference can be used to estimate the consistency between the models. Furthermore, the study showed that the tiered approach is not generally applicable to all exposure situations. These findings emphasize the need for a multiple-model approach to assessing critical exposure scenarios under REACH. Moreover, in combination with occupational exposure measurements, they might also be used in future studies to improve prediction accuracy. © The Author(s) 2017. 
    Published by Oxford University Press on behalf of the British Occupational Hygiene Society.
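    The factor-of-100 consistency measure used in the comparison can be sketched directly. The paired estimates below are illustrative, not TREXMO output:

```python
# Share of generated exposure situations in which two models' estimates
# differ by more than a given factor (in either direction).

def share_differing(estimates_a, estimates_b, factor=100.0):
    """Fraction of paired estimates whose ratio exceeds `factor` either way."""
    pairs = list(zip(estimates_a, estimates_b))
    off = sum(1 for a, b in pairs if max(a / b, b / a) > factor)
    return off / len(pairs)

model_x = [1.0, 0.005, 2.0, 400.0]   # illustrative exposure estimates, mg/m^3
model_y = [0.5, 1.0, 1.5, 3.0]
share = share_differing(model_x, model_y)
```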

  1. Radiation dose predictions for SPE events during solar cycle 23 from NASA's Nowcast of Atmospheric Ionizing Radiation for Aviation Safety (NAIRAS) model

    NASA Astrophysics Data System (ADS)

    Mertens, Christopher; Blattnig, Steve; Slaba, Tony; Kress, Brian; Wiltberger, Michael; Solomon, Stan

    NASA's High Charge and Energy Transport (HZETRN) code is a deterministic model for rapid and accurate calculations of the particle radiation fields in the space environment. HZETRN is used to calculate dosimetric quantities on the International Space Station (ISS) and assess astronaut risk to space radiations, including realistic spacecraft and human geometry for final exposure evaluation. HZETRN is used as an engineering design tool for materials research for radiation shielding protection. Moreover, it is used to calculate HZE propagation through the Earth and Martian atmospheres, and to evaluate radiation exposures for epidemiological studies. A new research project has begun that will use HZETRN as the transport engine for the development of a nowcast prediction of air-crew radiation exposure for both background galactic cosmic ray (GCR) exposure and radiation exposure during solar particle events (SPE) that may accompany solar storms. The new air-crew radiation exposure model is called the Nowcast of Atmospheric Ionizing Radiation for Aviation Safety (NAIRAS) model, which utilizes real-time observations from ground-based, atmospheric, and satellite measurements. In this paper, we compute the global distribution of atmospheric radiation dose for several SPE events during solar cycle 23, with particular emphasis on the high-latitude and polar region. We also characterize the suppression of the geomagnetic cutoff rigidity during these storm periods and their subsequent influence on atmospheric radiation exposure.

  2. Image based Monte Carlo Modeling for Computational Phantom

    NASA Astrophysics Data System (ADS)

    Cheng, Mengyun; Wang, Wen; Zhao, Kai; Fan, Yanchang; Long, Pengcheng; Wu, Yican

    2014-06-01

    The evaluation of the effects of ionizing radiation and the risk of radiation exposure on the human body has become one of the most important issues in the radiation protection and radiotherapy fields, helping to avoid unnecessary radiation and reduce harm to the human body. In order to accurately evaluate the dose to the human body, it is necessary to construct a more realistic computational phantom. However, manual description and verification of models for Monte Carlo (MC) simulation are very tedious, error-prone and time-consuming. In addition, it is difficult to locate and fix geometry errors, and difficult to describe material information and assign it to cells. MCAM (CAD/Image-based Automatic Modeling Program for Neutronics and Radiation Transport Simulation) was developed as an interface program to achieve both CAD- and image-based automatic modeling by the FDS Team (Advanced Nuclear Energy Research Team, http://www.fds.org.cn). The advanced version (Version 6) of MCAM can achieve automatic conversion from CT/segmented sectioned images to computational phantoms such as MCNP models. The image-based automatic modeling program (MCAM 6.0) has been tested with several medical images and sectioned images, and it has been applied in the construction of Rad-HUMAN. Following manual segmentation and 3D reconstruction, a whole-body computational phantom of a Chinese adult female, called Rad-HUMAN, was created using MCAM 6.0 from sectioned images of a Chinese visible human dataset. Rad-HUMAN contains 46 organs/tissues, which faithfully represent the average anatomical characteristics of the Chinese female. The dose conversion coefficients (Dt/Ka) from kerma free-in-air to absorbed dose of Rad-HUMAN were calculated. Rad-HUMAN can be applied to predict and evaluate dose distributions in the Treatment Plan System (TPS), as well as radiation exposure of the human body in radiation protection.

  3. Association between Activity Space Exposure to Food Establishments and Individual Risk of Overweight

    PubMed Central

    Kestens, Yan; Lebel, Alexandre; Chaix, Basile; Clary, Christelle; Daniel, Mark; Pampalon, Robert; Theriault, Marius; Subramanian, S. V.

    2012-01-01

    Objective Environmental exposure to food sources may underpin area level differences in individual risk for overweight. Place of residence is generally used to assess neighbourhood exposure. Yet, because people are mobile, multiple exposures should be accounted for to assess the relation between food environments and overweight. Unfortunately, mobility data is often missing from health surveys. We hereby test the feasibility of linking travel survey data with food listings to derive food store exposure predictors of overweight among health survey participants. Methods Food environment exposure measures accounting for non-residential activity places (activity spaces) were computed and modelled in Montreal and Quebec City, Canada, using travel surveys and food store listings. Models were then used to predict activity space food exposures for 5,578 participants of the Canadian Community Health Survey. These food exposure estimates, accounting for daily mobility, were used to model self-reported overweight in a multilevel framework. Median Odd Ratios were used to assess the proportion of between-neighborhood variance explained by such food exposure predictors. Results Estimates of food environment exposure accounting for both residential and non-residential destinations were significantly and more strongly associated with overweight than residential-only measures of exposure for men. For women, residential exposures were more strongly associated with overweight than non-residential exposures. In Montreal, adjusted models showed men in the highest quartile of exposure to food stores were at lesser risk of being overweight considering exposure to restaurants (OR = 0.36 [0.21–0.62]), fast food outlets (0.48 [0.30–0.79]), or corner stores (0.52 [0.35–0.78]). Conversely, men experiencing the highest proportion of restaurants being fast-food outlets were at higher risk of being overweight (2.07 [1.25–3.42]). 
Women experiencing higher residential exposures were at lower risk of overweight. Conclusion Using residential neighbourhood food exposure measures may underestimate true exposure and observed associations. Using mobility data offers potential for deriving activity space exposure estimates in epidemiological models. PMID:22936974
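    The Median Odds Ratio used above translates between-neighbourhood variance from a multilevel logistic model onto the odds-ratio scale. A minimal sketch of the standard formula, with an illustrative variance:

```python
import math

# Median Odds Ratio (MOR): MOR = exp(sqrt(2 * var) * 0.6745), where 0.6745
# is the 75th percentile of the standard normal distribution and `var` is
# the between-area variance on the log-odds scale.

def median_odds_ratio(between_area_variance):
    return math.exp(math.sqrt(2.0 * between_area_variance) * 0.6745)

mor = median_odds_ratio(0.25)   # illustrative between-neighbourhood variance
```

    A variance of zero gives MOR = 1 (no neighbourhood effect); larger variances give MOR further above 1.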

  4. Chronic Exposure to Methamphetamine Disrupts Reinforcement-Based Decision Making in Rats.

    PubMed

    Groman, Stephanie M; Rich, Katherine M; Smith, Nathaniel J; Lee, Daeyeol; Taylor, Jane R

    2018-03-01

    The persistent use of psychostimulant drugs, despite the detrimental outcomes associated with continued drug use, may be because of disruptions in reinforcement-learning processes that enable behavior to remain flexible and goal directed in dynamic environments. To identify the reinforcement-learning processes that are affected by chronic exposure to the psychostimulant methamphetamine (MA), the current study sought to use computational and biochemical analyses to characterize decision-making processes, assessed by probabilistic reversal learning, in rats before and after they were exposed to an escalating dose regimen of MA (or saline control). The ability of rats to use flexible and adaptive decision-making strategies following changes in stimulus-reward contingencies was significantly impaired following exposure to MA. Computational analyses of parameters that track choice and outcome behavior indicated that exposure to MA significantly impaired the ability of rats to use negative outcomes effectively. These MA-induced changes in decision making were similar to those observed in rats following administration of a dopamine D2/3 receptor antagonist. These data use computational models to provide insight into drug-induced maladaptive decision making that may ultimately identify novel targets for the treatment of psychostimulant addiction. We suggest that the disruption in utilization of negative outcomes to adaptively guide dynamic decision making is a new behavioral mechanism by which MA rigidly biases choice behavior.
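    The class of reinforcement-learning model used in such computational analyses can be sketched as Q-learning with separate learning rates for positive and negative prediction errors, so that impaired use of negative outcomes corresponds to a low negative-outcome learning rate. Parameters and the choice sequence below are illustrative:

```python
# Q-learning update with outcome-dependent learning rates: positive
# prediction errors are learned with alpha_pos, negative ones with alpha_neg.

def update_q(q, action, reward, alpha_pos, alpha_neg):
    """Update the value of the chosen action toward the observed outcome."""
    delta = reward - q[action]                  # reward prediction error
    alpha = alpha_pos if delta >= 0 else alpha_neg
    q[action] += alpha * delta
    return q

q = {"left": 0.5, "right": 0.5}
q = update_q(q, "left", 1.0, alpha_pos=0.4, alpha_neg=0.1)   # rewarded choice
q = update_q(q, "right", 0.0, alpha_pos=0.4, alpha_neg=0.1)  # unrewarded choice
```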

  5. Prediction of Chemical Function: Model Development and ...

    EPA Pesticide Factsheets

    The United States Environmental Protection Agency’s Exposure Forecaster (ExpoCast) project is developing both statistical and mechanism-based computational models for predicting exposures to thousands of chemicals, including those in consumer products. The high-throughput (HT) screening-level exposures developed under ExpoCast can be combined with HT screening (HTS) bioactivity data for the risk-based prioritization of chemicals for further evaluation. The functional role (e.g. solvent, plasticizer, fragrance) that a chemical performs can drive both the types of products in which it is found and the concentration at which it is present, thereby impacting exposure potential. However, critical chemical use information (including functional role) is lacking for the majority of commercial chemicals for which exposure estimates are needed. A suite of machine-learning-based models for classifying chemicals in terms of their likely functional roles in products, based on structure, was developed. This effort required collection, curation, and harmonization of publicly available data sources of chemical functional use information from government and industry bodies. Physicochemical and structure descriptor data were generated for chemicals with function data. Machine-learning classifier models for function were then built in a cross-validated manner from the descriptor/function data using the method of random forests. The models were applied to: 1) predict chemi

  6. Modeling low-dose mortality and disease incubation period of inhalational anthrax in the rabbit.

    PubMed

    Gutting, Bradford W; Marchette, David; Sherwood, Robert; Andrews, George A; Director-Myska, Alison; Channel, Stephen R; Wolfe, Daniel; Berger, Alan E; Mackie, Ryan S; Watson, Brent J; Rukhin, Andrey

    2013-07-21

    There is a need to advance our ability to conduct credible human risk assessments for inhalational anthrax associated with exposure to a low number of bacteria. Combining animal data with computational models of disease will be central to the low-dose and cross-species extrapolations required to achieve this goal. The objective of the current work was to apply and advance the competing risks (CR) computational model of inhalational anthrax, with data collected from NZW rabbits exposed to aerosols of Ames strain Bacillus anthracis. An initial aim was to parameterize the CR model using high-dose rabbit data and then conduct a low-dose extrapolation. The CR low-dose attack rate was then compared against known low-dose rabbit data as well as the low-dose curve obtained when the entire rabbit dose-response data set was fitted to an exponential dose-response (EDR) model. The CR model predictions demonstrated excellent agreement with actual low-dose rabbit data. We next used a modified CR model (MCR) to examine disease incubation period (the time to reach a fever >40 °C). The MCR model predicted a germination period of 14.5 h following exposure to a low spore dose, which was confirmed by monitoring spore germination in the rabbit lung using PCR, and predicted a low-dose disease incubation period in the rabbit of between 14.7 and 16.8 days. Overall, the CR and MCR models appeared to describe rabbit inhalational anthrax well. These results are discussed in the context of conducting laboratory studies in other relevant animal models, combining the CR/MCR model with other computational models of inhalational anthrax, and using the resulting information towards extrapolating a low-dose response prediction for man. Published by Elsevier Ltd.
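    The exponential dose-response (EDR) model mentioned above has a simple closed form, P = 1 - exp(-k * dose). A minimal sketch with an illustrative k, not a fitted anthrax parameter:

```python
import math

# Exponential dose-response model: probability of infection as a function of
# inhaled dose, with a single per-organism infectivity parameter k.

def edr_attack_rate(dose, k):
    """P(infection) = 1 - exp(-k * dose)."""
    return 1.0 - math.exp(-k * dose)

low = edr_attack_rate(10, k=1e-4)     # low-dose extrapolation
high = edr_attack_rate(1e5, k=1e-4)   # high-dose regime
```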

  7. The influence of age, gender and other information technology use on young people's computer use at school and home.

    PubMed

    Harris, C; Straker, L; Pollock, C

    2013-01-01

    Young people are exposed to a range of information technologies (IT) in different environments, including home and school, however the factors influencing IT use at home and school are poorly understood. The aim of this study was to investigate young people's computer exposure patterns at home and school, and related factors such as age, gender and the types of IT used. 1351 children in Years 1, 6, 9 and 11 from 10 schools in metropolitan Western Australia were surveyed. Most children had access to computers at home and school, with computer exposures comparable to TV, reading and writing. Total computer exposure was greater at home than school, and increased with age. Computer activities varied with age and gender and became more social with increased age, at the same time parental involvement reduced. Bedroom computer use was found to result in higher exposure patterns. High use of home and school computers were associated with each other. Associations varied depending on the type of IT exposure measure (frequency, mean weekly hours, usual and longest duration). The frequency and duration of children's computer exposure were associated with a complex interplay of the environment of use, the participant's age and gender and other IT activities.

  8. Comparison of modeling approaches to prioritize chemicals based on estimates of exposure and exposure potential

    PubMed Central

    Mitchell, Jade; Arnot, Jon A.; Jolliet, Olivier; Georgopoulos, Panos G.; Isukapalli, Sastry; Dasgupta, Surajit; Pandian, Muhilan; Wambaugh, John; Egeghy, Peter; Cohen Hubal, Elaine A.; Vallero, Daniel A.

    2014-01-01

    While only limited data are available to characterize the potential toxicity of over 8 million commercially available chemical substances, there is even less information available on the exposure and use-scenarios that are required to link potential toxicity to human and ecological health outcomes. Recent improvements and advances such as high throughput data gathering, high performance computational capabilities, and predictive chemical inherency methodology make this an opportune time to develop an exposure-based prioritization approach that can systematically utilize and link the asymmetrical bodies of knowledge for hazard and exposure. In response to the US EPA’s need to develop novel approaches and tools for rapidly prioritizing chemicals, a “Challenge” was issued to several exposure model developers to aid the understanding of current systems in a broader sense and to assist the US EPA’s effort to develop an approach comparable to other international efforts. A common set of chemicals was prioritized under each current approach. The results are presented herein along with a comparative analysis of the rankings of the chemicals based on metrics of exposure potential or actual exposure estimates. The analysis illustrates the similarities and differences across the domains of information incorporated in each modeling approach. The overall findings indicate a need to reconcile exposures from diffuse, indirect sources (far-field) with exposures from directly applied chemicals in consumer products or resulting from the presence of a chemical in a microenvironment like a home or vehicle. Additionally, the exposure scenario, including the mode of entry into the environment (i.e., through air, water, or sediment), appears to be an important determinant of the level of agreement between modeling approaches. PMID:23707726
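    One common way to quantify the "level of agreement between modeling approaches" discussed above is a rank correlation between two models' chemical prioritizations. A minimal Spearman rank-correlation sketch in pure Python; the exposure scores are hypothetical, not values from the Challenge:

```python
import math

def rank(values):
    """Ranks (1 = highest score), with ties given their average rank."""
    order = sorted(range(len(values)), key=lambda i: -values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1  # extend over a run of tied scores
        avg = (i + j) / 2.0 + 1.0
        for t in range(i, j + 1):
            ranks[order[t]] = avg
        i = j + 1
    return ranks

def spearman(scores_a, scores_b):
    """Spearman rank correlation between two models' prioritization scores."""
    ra, rb = rank(scores_a), rank(scores_b)
    n = len(ra)
    ma, mb = sum(ra) / n, sum(rb) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(ra, rb))
    va = sum((x - ma) ** 2 for x in ra)
    vb = sum((y - mb) ** 2 for y in rb)
    return cov / math.sqrt(va * vb)
```

    Two models that order the chemicals identically score 1.0; perfectly reversed orderings score -1.0, so the statistic directly summarizes agreement between prioritization approaches.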

  9. Comparison of modeling approaches to prioritize chemicals based on estimates of exposure and exposure potential.

    PubMed

    Mitchell, Jade; Arnot, Jon A; Jolliet, Olivier; Georgopoulos, Panos G; Isukapalli, Sastry; Dasgupta, Surajit; Pandian, Muhilan; Wambaugh, John; Egeghy, Peter; Cohen Hubal, Elaine A; Vallero, Daniel A

    2013-08-01

    While only limited data are available to characterize the potential toxicity of over 8 million commercially available chemical substances, there is even less information available on the exposure and use-scenarios that are required to link potential toxicity to human and ecological health outcomes. Recent improvements and advances such as high throughput data gathering, high performance computational capabilities, and predictive chemical inherency methodology make this an opportune time to develop an exposure-based prioritization approach that can systematically utilize and link the asymmetrical bodies of knowledge for hazard and exposure. In response to the US EPA's need to develop novel approaches and tools for rapidly prioritizing chemicals, a "Challenge" was issued to several exposure model developers to aid the understanding of current systems in a broader sense and to assist the US EPA's effort to develop an approach comparable to other international efforts. A common set of chemicals was prioritized under each current approach. The results are presented herein along with a comparative analysis of the rankings of the chemicals based on metrics of exposure potential or actual exposure estimates. The analysis illustrates the similarities and differences across the domains of information incorporated in each modeling approach. The overall findings indicate a need to reconcile exposures from diffuse, indirect sources (far-field) with exposures from directly applied chemicals in consumer products or resulting from the presence of a chemical in a microenvironment like a home or vehicle. Additionally, the exposure scenario, including the mode of entry into the environment (i.e., through air, water, or sediment), appears to be an important determinant of the level of agreement between modeling approaches. Copyright © 2013 Elsevier B.V. All rights reserved.

  10. Assessment of the magnetic field exposure due to the battery current of digital mobile phones.

    PubMed

    Jokela, Kari; Puranen, Lauri; Sihvonen, Ari-Pekka

    2004-01-01

    Hand-held digital mobile phones generate pulsed magnetic fields associated with the battery current. The peak value and the waveform of the battery current were measured for seven different models of digital mobile phones, and the results were applied to approximately compute the magnetic flux density and induced currents in the phone user's head. A simple circular loop model was used for the magnetic field source, and a homogeneous sphere consisting of average brain tissue equivalent material simulated the head. The broadband magnetic flux density and the maximal induced current density were compared with the guidelines of ICNIRP using two different approaches. In the first approach the relative exposure was determined separately at each frequency and the exposure ratios were summed to obtain the total exposure (multiple-frequency rule). In the second approach the waveform was weighted in the time domain with a simple low-pass RC filter and the peak value was divided by a peak limit, both derived from the guidelines (weighted peak approach). With the maximum transmitting power (2 W) the measured peak current varied from 1 to 2.7 A. The ICNIRP exposure ratio based on the current density varied from 0.04 to 0.14 for the weighted peak approach and from 0.08 to 0.27 for the multiple-frequency rule. The latter values are considerably greater than the corresponding exposure ratios of 0.005 (min) to 0.013 (max) obtained by applying the evaluation based on frequency components presented in the new IEEE standard. Hence, the exposure does not seem to exceed the guidelines. The computed peak magnetic flux density substantially exceeded the derived peak reference level of ICNIRP, but it should be noted that in a near-field exposure the external field strengths are not valid indicators of exposure. Currently, no biological data exist to give a reason for concern about the health effects of magnetic field pulses from mobile phones.
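    The two ICNIRP evaluation approaches described above can be sketched in a few lines: the multiple-frequency rule sums per-frequency exposure ratios, while the weighted peak approach filters the waveform with a first-order RC low-pass and compares its peak with a limit. The reference levels, cut-off frequency, and waveform in this example are placeholders, not values from the guidelines:

```python
import math

def multiple_frequency_ratio(spectrum, limits):
    """Multiple-frequency rule: sum the exposure ratio at each frequency.
    spectrum: {frequency_Hz: field amplitude}; limits: {frequency_Hz: reference level}."""
    return sum(amp / limits[f] for f, amp in spectrum.items())

def weighted_peak_ratio(samples, dt, f_cut, peak_limit):
    """Weighted peak approach: weight the time-domain waveform with a simple
    first-order RC low-pass filter (cut-off f_cut), then divide the filtered
    peak by the peak limit."""
    rc = 1.0 / (2.0 * math.pi * f_cut)
    alpha = dt / (rc + dt)
    y, peak = 0.0, 0.0
    for x in samples:
        y += alpha * (x - y)      # discrete first-order low-pass update
        peak = max(peak, abs(y))
    return peak / peak_limit
```

    For a constant (DC-like) waveform the filtered output settles at the input level, so the weighted peak ratio approaches peak/limit, while the multiple-frequency rule adds up contributions across spectral lines; the abstract's observation that the summation rule yields larger ratios follows from summing many positive terms.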

  11. Development of a Physiologically Based Computational Kidney Model to Describe the Renal Excretion of Hydrophilic Agents in Rats

    PubMed Central

    Niederalt, Christoph; Wendl, Thomas; Kuepfer, Lars; Claassen, Karina; Loosen, Roland; Willmann, Stefan; Lippert, Joerg; Schultze-Mosgau, Marcus; Winkler, Julia; Burghaus, Rolf; Bräutigam, Matthias; Pietsch, Hubertus; Lengsfeld, Philipp

    2013-01-01

    A physiologically based kidney model was developed to analyze the renal excretion and kidney exposure of hydrophilic agents, in particular contrast media, in rats. In order to study the influence of osmolality and viscosity changes, the model mechanistically represents urine concentration by water reabsorption in different segments of kidney tubules and viscosity dependent tubular fluid flow. The model was established using experimental data on the physiological steady state without administration of any contrast media or drugs. These data included the sodium and urea concentration gradient along the cortico-medullary axis, water reabsorption, urine flow, and sodium as well as urea urine concentrations for a normal hydration state. The model was evaluated by predicting the effects of mannitol and contrast media administration and comparing to experimental data on cortico-medullary concentration gradients, urine flow, urine viscosity, hydrostatic tubular pressures and single nephron glomerular filtration rate. Finally the model was used to analyze and compare typical examples of ionic and non-ionic monomeric as well as non-ionic dimeric contrast media with respect to their osmolality and viscosity. With the computational kidney model, urine flow depended mainly on osmolality, while osmolality and viscosity were important determinants for tubular hydrostatic pressure and kidney exposure. The low diuretic effect of dimeric contrast media in combination with their high intrinsic viscosity resulted in a high viscosity within the tubular fluid. In comparison to monomeric contrast media, this led to a higher increase in tubular pressure, to a reduction in glomerular filtration rate and tubular flow and to an increase in kidney exposure. 
The presented kidney model can be implemented into whole body physiologically based pharmacokinetic models and extended in order to simulate the renal excretion of lipophilic drugs which may also undergo active secretion and reabsorption. PMID:23355822

  12. Outside-In Systems Pharmacology Combines Innovative Computational Methods With High-Throughput Whole Vertebrate Studies.

    PubMed

    Schulthess, Pascal; van Wijk, Rob C; Krekels, Elke H J; Yates, James W T; Spaink, Herman P; van der Graaf, Piet H

    2018-04-25

    To advance the systems approach in pharmacology, experimental models and computational methods need to be integrated from early drug discovery onward. Here, we propose outside-in model development, a model identification technique to understand and predict the dynamics of a system without requiring prior biological and/or pharmacological knowledge. The advanced data required could be obtained by whole vertebrate, high-throughput, low-resource dose-exposure-effect experimentation with the zebrafish larva. Combinations of these innovative techniques could improve early drug discovery. © 2018 The Authors CPT: Pharmacometrics & Systems Pharmacology published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.

  13. The impact of computer self-efficacy, computer anxiety, and perceived usability and acceptability on the efficacy of a decision support tool for colorectal cancer screening

    PubMed Central

    Lindblom, Katrina; Gregory, Tess; Flight, Ingrid H K; Zajac, Ian

    2011-01-01

    Objective This study investigated the efficacy of an internet-based personalized decision support (PDS) tool designed to aid in the decision to screen for colorectal cancer (CRC) using a fecal occult blood test. We tested whether the efficacy of the tool in influencing attitudes to screening was mediated by perceived usability and acceptability, and considered the role of computer self-efficacy and computer anxiety in these relationships. Methods Eighty-one participants aged 50–76 years worked through the on-line PDS tool and completed questionnaires on computer self-efficacy, computer anxiety, attitudes to and beliefs about CRC screening before and after exposure to the PDS, and perceived usability and acceptability of the tool. Results Repeated measures ANOVA found that PDS exposure led to a significant increase in knowledge about CRC and screening, and more positive attitudes to CRC screening as measured by factors from the Preventive Health Model. Perceived usability and acceptability of the PDS mediated changes in attitudes toward CRC screening (but not CRC knowledge), and computer self-efficacy and computer anxiety were significant predictors of individuals' perceptions of the tool. Conclusion Interventions designed to decrease computer anxiety, such as computer courses and internet training, may improve the acceptability of new health information technologies including internet-based decision support tools, increasing their impact on behavior change. PMID:21857024

  14. Design, development and validation of software for modelling dietary exposure to food chemicals and nutrients.

    PubMed

    McNamara, C; Naddy, B; Rohan, D; Sexton, J

    2003-10-01

    The Monte Carlo computational system for stochastic modelling of dietary exposure to food chemicals and nutrients is presented. This system was developed through a European Commission-funded research project and is accessible as a Web-based application service. The system supports highly complex data sets as model input while providing a simple, general-purpose, linear kernel for model evaluation. Specific features of the system include the ability to enter (arbitrarily) complex mathematical or probabilistic expressions in each and every input data field, automatic bootstrapping on subjects and on subject food intake diaries, and custom kernels that apply brand information, such as market share and loyalty, to the calculation of food and chemical intake.
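    The core of a stochastic dietary-exposure kernel like the one described is simple: on each Monte Carlo iteration, sample a chemical concentration for every food in the intake diary and sum concentration × amount. A minimal stdlib-only sketch; the foods, amounts, and concentration distributions below are invented for illustration, not part of the described system:

```python
import random

def simulate_daily_intake(diary, concentration_samplers, n_iter=10000, seed=1):
    """Monte Carlo dietary exposure: each iteration draws a concentration
    (mg/g) for every food and sums concentration * amount over the diary.
    diary: {food: grams eaten per day}
    concentration_samplers: {food: callable(rng) -> mg/g}"""
    rng = random.Random(seed)
    intakes = []
    for _ in range(n_iter):
        intakes.append(sum(amount * concentration_samplers[food](rng)
                           for food, amount in diary.items()))
    return intakes

def percentile(values, p):
    """Nearest-rank percentile, e.g. a high-percentile consumer's intake."""
    s = sorted(values)
    return s[min(len(s) - 1, int(p / 100.0 * len(s)))]
```

    Reporting a high percentile of the simulated intake distribution (rather than the mean) is the usual way such systems characterize high-consumer exposure; bootstrapping over subjects and diaries, as the abstract notes, would wrap this kernel in an outer resampling loop.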

  15. Holographic Reciprocity Law Failure, with Applications to the Three-Dimensional Display of Medical Data

    NASA Astrophysics Data System (ADS)

    Johnson, Kristina Mary

    In 1973 the computerized tomography (CT) scanner revolutionized medical imaging. This machine can isolate and display, in two-dimensional cross-sections, internal lesions and organs previously impossible to visualize. The possibility of three-dimensional imaging, however, is not yet exploited by present tomographic systems. Using multiple-exposure holography, three-dimensional displays can be synthesized from two-dimensional CT cross-sections. A multiple-exposure hologram is an incoherent superposition of many individual holograms. Intuitively, one expects holograms recorded with equal energy to reconstruct images with equal brightness. It is found, however, that holograms recorded first are brighter than holograms recorded later in the superposition. This phenomenon is called Holographic Reciprocity Law Failure (HRLF). Computer simulations of latent image formation in multiple-exposure holography are one of the methods used to investigate HRLF. These simulations indicate that it is the time between individual exposures in the multiple-exposure hologram that is responsible for HRLF. This physical parameter introduces an asymmetry into the latent image formation process that favors the signal of previously recorded holograms over holograms recorded later in the superposition. The origin of this asymmetry lies in the dynamics of latent image formation, and in particular in the decay of single-atom latent image specks, which have lifetimes that are short compared with typical times between exposures. An analytical model is developed for a double-exposure hologram that predicts a decrease in the brightness of the second exposure, relative to the first, as the time between exposures increases. These results are consistent with the computer simulations. 
Experiments investigating the influence of this parameter on the diffraction efficiency of reconstructed images in a double-exposure hologram are also found to be consistent with the computer simulations and analytical results. From this information, two techniques are presented that correct for HRLF and succeed in reconstructing multiple holographic images of CT cross-sections with equal brightness. The multiple multiple-exposure hologram is a new hologram that increases the number of equally bright images that can be superimposed on one photographic plate.

  16. System-based identification of toxicity pathways associated with multi-walled carbon nanotube-induced pathological responses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Snyder-Talkington, Brandi N.; Dymacek, Julian; Mary Babb Randolph Cancer Center, West Virginia University, Morgantown, WV 26506-9300

    2013-10-15

    The fibrous shape and biopersistence of multi-walled carbon nanotubes (MWCNT) have raised concern over their potential toxicity after pulmonary exposure. As in vivo exposure to MWCNT produced a transient inflammatory and progressive fibrotic response, this study sought to identify significant biological processes associated with lung inflammation and fibrosis pathology data, based upon whole genome mRNA expression, bronchoalveolar lavage scores, and morphometric analysis from C57BL/6J mice exposed by pharyngeal aspiration to 0, 10, 20, 40, or 80 μg MWCNT at 1, 7, 28, or 56 days post-exposure. Using a novel computational model employing non-negative matrix factorization and Markov chain Monte Carlo simulation, significant biological processes with expression similar to MWCNT-induced lung inflammation and fibrosis pathology data in mice were identified. A subset of genes in these processes was determined to be functionally related to either fibrosis or inflammation by Ingenuity Pathway Analysis and was used to determine potential significant signaling cascades. Two genes determined to be functionally related to inflammation and fibrosis, vascular endothelial growth factor A (vegfa) and C-C motif chemokine 2 (ccl2), were confirmed by in vitro studies of mRNA and protein expression in small airway epithelial cells exposed to MWCNT as concordant with in vivo expression. This study identified that the novel computational model was sufficient to determine biological processes strongly associated with the pathology of lung inflammation and fibrosis and could identify potential toxicity signaling pathways and mechanisms of MWCNT exposure which could be used for future animal studies to support human risk assessment and intervention efforts. - Highlights: • A novel computational model identified toxicity pathways matching in vivo pathology. 
• Systematic identification of MWCNT-induced biological processes in mouse lungs. • MWCNT-induced functional networks of lung inflammation and fibrosis were revealed. • Two functional, representative genes, ccl2 and vegfa, were validated in vitro.
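    Non-negative matrix factorization of the kind used in this study decomposes a non-negative expression matrix V (genes × samples) into additive parts, V ≈ W·H, so that columns of W can be read as "processes". A compact pure-Python sketch of the standard Lee-Seung multiplicative updates; this illustrates the decomposition only, not the authors' full model, which also involves Markov chain Monte Carlo simulation:

```python
import random

def nmf(V, k, iters=500, seed=0):
    """Factor a non-negative matrix V (list of rows) into W (n x k) and
    H (k x m) by Lee-Seung multiplicative updates for the Frobenius norm."""
    rng = random.Random(seed)
    n, m = len(V), len(V[0])
    W = [[rng.random() + 0.1 for _ in range(k)] for _ in range(n)]
    H = [[rng.random() + 0.1 for _ in range(m)] for _ in range(k)]

    def matmul(A, B):
        return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
                for row in A]

    def transpose(A):
        return [list(c) for c in zip(*A)]

    eps = 1e-9  # guards against division by zero
    for _ in range(iters):
        WH, Wt = matmul(W, H), transpose(W)
        num, den = matmul(Wt, V), matmul(Wt, WH)
        H = [[H[i][j] * num[i][j] / (den[i][j] + eps) for j in range(m)]
             for i in range(k)]
        WH, Ht = matmul(W, H), transpose(H)
        num, den = matmul(V, Ht), matmul(WH, Ht)
        W = [[W[i][j] * num[i][j] / (den[i][j] + eps) for j in range(k)]
             for i in range(n)]
    return W, H
```

    Because the updates are multiplicative on non-negative initial values, W and H stay non-negative throughout, which is what makes the factors interpretable as additive biological processes rather than signed components.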

  17. A Modular Approach to Teaching Mathematical Modeling in Biotechnology in the Undergraduate Curriculum

    ERIC Educational Resources Information Center

    Larripa, Kamila R.; Mazzag, Borbala

    2016-01-01

    Our paper describes a solution we found to a still existing need to develop mathematical modeling courses for undergraduate biology majors. Some challenges of such courses are: (i) relatively limited exposure of biology students to higher-level mathematical and computational concepts; (ii) availability of texts that can give a flavor of how…

  18. A Sustainable Model for Integrating Current Topics in Machine Learning Research into the Undergraduate Curriculum

    ERIC Educational Resources Information Center

    Georgiopoulos, M.; DeMara, R. F.; Gonzalez, A. J.; Wu, A. S.; Mollaghasemi, M.; Gelenbe, E.; Kysilka, M.; Secretan, J.; Sharma, C. A.; Alnsour, A. J.

    2009-01-01

    This paper presents an integrated research and teaching model that has resulted from an NSF-funded effort to introduce results of current Machine Learning research into the engineering and computer science curriculum at the University of Central Florida (UCF). While in-depth exposure to current topics in Machine Learning has traditionally occurred…

  19. Low Earth orbit assessment of proton anisotropy using AP8 and AP9 trapped proton models

    NASA Astrophysics Data System (ADS)

    Badavi, Francis F.; Walker, Steven A.; Santos Koos, Lindsey M.

    2015-04-01

    The completion of the International Space Station (ISS) in 2011 has provided the space research community with an ideal evaluation and testing facility for future long duration human activities in space. Ionized and secondary neutral particle radiation measurements inside ISS form the ideal tool for validation of radiation environment models, nuclear reaction cross sections and transport codes. Studies using thermoluminescent detectors (TLD), tissue equivalent proportional counters (TEPC), and computer aided design (CAD) models of early ISS configurations confirmed that, as input, computational dosimetry at low Earth orbit (LEO) requires an environmental model with directional (anisotropic) capability to properly describe the exposure to trapped protons within ISS. At LEO, ISS encounters exposure from trapped electrons, protons and geomagnetically attenuated galactic cosmic rays (GCR). For short duration studies at LEO, one can ignore trapped electrons and the ever-present GCR exposure contributions during quiet times. However, within the trapped proton field, a challenge arises in properly estimating the amount of proton exposure acquired. A number of models exist to define the intensity of trapped particles. Among the established trapped models are the historic AE8/AP8, dating back to the 1980s, and the recently released AE9/AP9/SPM. Since at LEO electrons make a minimal exposure contribution to ISS, this work ignores the AE8 and AE9 components of the models and couples a measurement-derived anisotropic trapped proton formalism to omnidirectional output from the AP8 and AP9 models, allowing assessment of the differences between the two proton models. The assessment is done at a target point within the ISS-11A configuration (circa 2003) crew quarter (CQ) of the Russian Zvezda service module (SM), during its ascending and descending node passes through the South Atlantic Anomaly (SAA). 
The anisotropic formalism incorporates the contributions of proton narrow pitch angle (PA) and east-west (EW) effects. Within the SAA, the EW anisotropy results in different levels of exposure on each side of the ISS Zvezda SM, allowing angular evaluation of the anisotropic proton spectrum. While the combined magnitude of PA and EW effects at LEO depends on a multitude of factors, such as trapped proton energy and the orientation and altitude of the spacecraft along the velocity vector, this paper draws quantitative conclusions on the combined anisotropic magnitude differences at the ISS SM target point between the AP8 and AP9 models.

  20. Low Earth orbit assessment of proton anisotropy using AP8 and AP9 trapped proton models.

    PubMed

    Badavi, Francis F; Walker, Steven A; Santos Koos, Lindsey M

    2015-04-01

    The completion of the International Space Station (ISS) in 2011 has provided the space research community with an ideal evaluation and testing facility for future long duration human activities in space. Ionized and secondary neutral particle radiation measurements inside ISS form the ideal tool for validation of radiation environment models, nuclear reaction cross sections and transport codes. Studies using thermoluminescent detectors (TLD), tissue equivalent proportional counters (TEPC), and computer aided design (CAD) models of early ISS configurations confirmed that, as input, computational dosimetry at low Earth orbit (LEO) requires an environmental model with directional (anisotropic) capability to properly describe the exposure to trapped protons within ISS. At LEO, ISS encounters exposure from trapped electrons, protons and geomagnetically attenuated galactic cosmic rays (GCR). For short duration studies at LEO, one can ignore trapped electrons and the ever-present GCR exposure contributions during quiet times. However, within the trapped proton field, a challenge arises in properly estimating the amount of proton exposure acquired. A number of models exist to define the intensity of trapped particles. Among the established trapped models are the historic AE8/AP8, dating back to the 1980s, and the recently released AE9/AP9/SPM. Since at LEO electrons make a minimal exposure contribution to ISS, this work ignores the AE8 and AE9 components of the models and couples a measurement-derived anisotropic trapped proton formalism to omnidirectional output from the AP8 and AP9 models, allowing assessment of the differences between the two proton models. The assessment is done at a target point within the ISS-11A configuration (circa 2003) crew quarter (CQ) of the Russian Zvezda service module (SM), during its ascending and descending node passes through the South Atlantic Anomaly (SAA). 
The anisotropic formalism incorporates the contributions of proton narrow pitch angle (PA) and east-west (EW) effects. Within the SAA, the EW anisotropy results in different levels of exposure on each side of the ISS Zvezda SM, allowing angular evaluation of the anisotropic proton spectrum. While the combined magnitude of PA and EW effects at LEO depends on a multitude of factors, such as trapped proton energy and the orientation and altitude of the spacecraft along the velocity vector, this paper draws quantitative conclusions on the combined anisotropic magnitude differences at the ISS SM target point between the AP8 and AP9 models. Copyright © 2015 The Committee on Space Research (COSPAR). All rights reserved.

  1. A Five- Year CMAQ Model Performance for Wildfires and ...

    EPA Pesticide Factsheets

    Biomass burning has been identified as an important contributor to the degradation of air quality because of its impact on ozone and particulate matter. Two components of the biomass burning inventory, wildfires and prescribed fires, are routinely estimated in the national emissions inventory. However, there is a large amount of uncertainty in the development of these emission inventory sectors. We have completed a five-year set of CMAQ model simulations (2008-2012) in which we have simulated regional air quality with and without the wildfire and prescribed fire inventory. We will examine CMAQ model performance over regions with significant PM2.5 and ozone contributions from prescribed fires and wildfires. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.

  2. An Eye Model for Computational Dosimetry Using A Multi-Scale Voxel Phantom

    NASA Astrophysics Data System (ADS)

    Caracappa, Peter F.; Rhodes, Ashley; Fiedler, Derek

    2014-06-01

    The lens of the eye is a radiosensitive tissue, with cataract formation being the major concern. Recently reduced recommended dose limits for the lens of the eye have made understanding the dose to this tissue increasingly important. Due to memory limitations, the voxel resolution of computational phantoms used for radiation dose calculations is too coarse to accurately represent the dimensions of the eye. A revised eye model is constructed using physiological data for the dimensions of radiosensitive tissues and is then transformed into a high-resolution voxel model. This eye model is combined with an existing set of whole body models to form a multi-scale voxel phantom, which is used with the MCNPX code to calculate radiation dose from various exposure types. This phantom provides an accurate representation of the radiation transport through the structures of the eye. Two alternate methods of including a high-resolution eye model within an existing whole body model are developed. The accuracy and performance of each method is compared against existing computational phantoms.

  3. Estimation of whole-body radiation exposure from brachytherapy for oral cancer using a Monte Carlo simulation

    PubMed Central

    Ozaki, Y.; Kaida, A.; Miura, M.; Nakagawa, K.; Toda, K.; Yoshimura, R.; Sumi, Y.; Kurabayashi, T.

    2017-01-01

    Early-stage oral cancer can be cured with oral brachytherapy, but whole-body radiation exposure status has not been previously studied. Recently, the International Commission on Radiological Protection (ICRP) recommended the use of ICRP phantoms to estimate radiation exposure from external and internal radiation sources. In this study, we used a Monte Carlo simulation with ICRP phantoms to estimate whole-body exposure from oral brachytherapy. We used the Particle and Heavy Ion Transport code System (PHITS) to model oral brachytherapy with 192Ir hairpins and 198Au grains and to perform a Monte Carlo simulation on the ICRP adult reference computational phantoms. To confirm the simulations, we also computed local dose distributions from these small sources and compared them with the results from Oncentra manual Low Dose Rate Treatment Planning (mLDR) software, which is used in day-to-day clinical practice. We successfully obtained data on absorbed dose for each organ in males and females. Sex-averaged equivalent doses were 0.547 and 0.710 Sv with 192Ir hairpins and 198Au grains, respectively. Simulation with PHITS was reliable when compared with an alternative computational technique using mLDR software. We concluded that the absorbed dose for each organ and whole-body exposure from oral brachytherapy can be estimated with Monte Carlo simulation using PHITS on ICRP reference phantoms. Effective doses for patients with oral cancer were obtained. PMID:28339846
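    Once per-organ equivalent doses are available from such a Monte Carlo simulation, the ICRP effective dose is a tissue-weighted sum, and the sex-averaged value is the mean over the male and female reference phantoms. A sketch of that bookkeeping step; the two weights used in the test below (lung 0.12, thyroid 0.04, from ICRP 103) are a small illustrative subset, and the dose values are invented:

```python
def effective_dose(organ_equivalent_doses, tissue_weights):
    """ICRP effective dose: tissue-weighted sum of organ equivalent doses (Sv).
    organ_equivalent_doses: {organ: H_T in Sv}; tissue_weights: {organ: w_T}."""
    return sum(tissue_weights[organ] * h
               for organ, h in organ_equivalent_doses.items())

def sex_averaged(dose_male, dose_female):
    """ICRP sex averaging: arithmetic mean of the male and female phantom values."""
    return 0.5 * (dose_male + dose_female)
```

    In a full calculation the weighted sum runs over the complete ICRP 103 tissue list (whose weights sum to 1), with the "remainder" tissues pooled under a shared weight.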

  4. SAR in a child voxel phantom from exposure to wireless computer networks (Wi-Fi).

    PubMed

    Findlay, R P; Dimbylow, P J

    2010-08-07

    Specific energy absorption rate (SAR) values have been calculated in a sitting 10-year-old voxel model from exposure to electromagnetic fields at 2.4 and 5 GHz, frequencies commonly used by Wi-Fi devices. Both plane-wave exposure of the model and irradiation from antennas in the near field were investigated for a variety of exposure conditions. In all situations studied, the SAR values calculated were considerably below basic restrictions. For a typical Wi-Fi exposure scenario using an inverted-F antenna operating at 100 mW, a duty factor of 0.1 and an antenna-body separation of 34 cm, the maximum peak localized SAR was found to be 3.99 mW/kg in the torso region. At 2.4 GHz, using a power of 100 mW and a duty factor of 1, the highest localized SAR value in the head was calculated as 5.7 mW/kg. This represents less than 1% of the SAR previously calculated in the head for a typical mobile phone exposure condition.
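    Two simple relations underlie the quantities reported above: point SAR follows from the local electric field and tissue properties, SAR = σ·E²/ρ, and because Wi-Fi transmits in bursts, the time-averaged SAR scales with the duty factor. A sketch; the tissue parameters and field value below are illustrative inputs, not the paper's dosimetry:

```python
def local_sar(sigma, e_rms, rho):
    """Point SAR (W/kg) from tissue conductivity sigma (S/m), RMS electric
    field (V/m), and mass density rho (kg/m^3): SAR = sigma * E^2 / rho."""
    return sigma * e_rms ** 2 / rho

def time_averaged_sar(sar_continuous, duty_factor):
    """Burst transmission: the time-averaged SAR is the continuous-wave
    SAR scaled by the fraction of time the transmitter is actually on."""
    return sar_continuous * duty_factor
```

    The duty-factor scaling is why a realistic Wi-Fi scenario (duty factor 0.1 in the abstract) yields a tenth of the continuous-transmission exposure.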

  5. How Accumulated Real Life Stress Experience and Cognitive Speed Interact on Decision-Making Processes

    PubMed Central

    Friedel, Eva; Sebold, Miriam; Kuitunen-Paul, Sören; Nebe, Stephan; Veer, Ilya M.; Zimmermann, Ulrich S.; Schlagenhauf, Florian; Smolka, Michael N.; Rapp, Michael; Walter, Henrik; Heinz, Andreas

    2017-01-01

    Rationale: Advances in neurocomputational modeling suggest that the valuation systems for goal-directed (deliberative) decision-making on the one hand, and habitual (automatic) decision-making on the other, may rely on distinct computational strategies for reinforcement learning, namely model-free vs. model-based learning. As a key theoretical difference, the model-based system strongly demands cognitive functions to plan actions prospectively based on an internal cognitive model of the environment, whereas valuation in the model-free system relies on rather simple learning rules from operant conditioning to retrospectively associate actions with their outcomes and is thus cognitively less demanding. Acute stress reactivity is known to impair model-based but not model-free choice behavior, with higher working memory capacity protecting the model-based system from acute stress. However, it is not clear what impact accumulated real life stress has on the model-free and model-based decision systems and how this influence interacts with cognitive abilities. Methods: We used a sequential decision-making task distinguishing the relative contributions of both learning strategies to choice behavior, the Social Readjustment Rating Scale questionnaire to assess accumulated real life stress, and the Digit Symbol Substitution Test to test cognitive speed in 95 healthy subjects. Results: Individuals reporting high stress exposure who had low cognitive speed showed reduced model-based but increased model-free behavioral control. In contrast, subjects exposed to accumulated real life stress with high cognitive speed displayed increased model-based performance but reduced model-free control. Conclusion: These findings suggest that accumulated real life stress exposure can enhance reliance on cognitive speed for model-based computations, which may ultimately protect the model-based system from the detrimental influences of accumulated real life stress. 
The combination of accumulated real life stress exposure and slower information processing capacities, however, might favor model-free strategies. Thus, the valence and preference of either system strongly depends on stressful experiences and individual cognitive capacities. PMID:28642696

  6. How Accumulated Real Life Stress Experience and Cognitive Speed Interact on Decision-Making Processes.

    PubMed

    Friedel, Eva; Sebold, Miriam; Kuitunen-Paul, Sören; Nebe, Stephan; Veer, Ilya M; Zimmermann, Ulrich S; Schlagenhauf, Florian; Smolka, Michael N; Rapp, Michael; Walter, Henrik; Heinz, Andreas

    2017-01-01

    Rationale: Advances in neurocomputational modeling suggest that the valuation systems for goal-directed (deliberative) and habitual (automatic) decision-making may rely on distinct computational strategies for reinforcement learning, namely model-free vs. model-based learning. As a key theoretical difference, the model-based system strongly demands cognitive functions to plan actions prospectively based on an internal cognitive model of the environment, whereas valuation in the model-free system relies on rather simple learning rules from operant conditioning to retrospectively associate actions with their outcomes and is thus cognitively less demanding. Acute stress reactivity is known to impair model-based but not model-free choice behavior, with higher working memory capacity protecting the model-based system from acute stress. However, it is not clear what impact accumulated real life stress has on model-free and model-based decision systems and how this influence interacts with cognitive abilities. Methods: We used a sequential decision-making task distinguishing relative contributions of both learning strategies to choice behavior, the Social Readjustment Rating Scale questionnaire to assess accumulated real life stress, and the Digit Symbol Substitution Test to test cognitive speed in 95 healthy subjects. Results: Individuals reporting high stress exposure who had low cognitive speed showed reduced model-based but increased model-free behavioral control. In contrast, subjects exposed to accumulated real life stress with high cognitive speed displayed increased model-based performance but reduced model-free control. Conclusion: These findings suggest that accumulated real life stress exposure can enhance reliance on cognitive speed for model-based computations, which may ultimately protect the model-based system from the detrimental influences of accumulated real life stress. 
The combination of accumulated real life stress exposure and slower information processing capacities, however, might favor model-free strategies. Thus, the valence and preference of either system strongly depend on stressful experiences and individual cognitive capacities.
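The model-free/model-based distinction at the core of this record can be illustrated with a minimal sketch (all names and numbers are illustrative, not taken from the study): a model-based learner evaluates first-stage actions by planning through a learned transition model, a model-free learner caches values updated directly from outcomes, and choice can be modeled as a weighted mixture of the two systems.

```python
def model_based_q1(T, q2):
    """Model-based first-stage values: expected best second-stage value
    under a learned transition model T[a][s2] = P(s2 | a)."""
    return [sum(p * max(q2[s2]) for s2, p in enumerate(T[a]))
            for a in range(len(T))]

def td_update(q, a, reward, alpha=0.1):
    """Model-free TD(0) update of a cached action value."""
    q[a] += alpha * (reward - q[a])

def hybrid_q1(q_mb, q_mf, w):
    """Weighted mixture: w=1 is purely model-based, w=0 purely model-free."""
    return [w * mb + (1 - w) * mf for mb, mf in zip(q_mb, q_mf)]

# Toy task: two first-stage actions, two second-stage states with two actions.
T = [[0.7, 0.3],   # action 0 usually leads to state 0
     [0.3, 0.7]]   # action 1 usually leads to state 1
q2 = [[0.8, 0.2],  # cached second-stage action values
      [0.1, 0.4]]
q_mb = model_based_q1(T, q2)  # planning through the transition model
```

The interpretation mirrors the abstract: shifting the weight `w` toward zero corresponds to increased reliance on the cognitively cheaper model-free controller.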

  7. Review and standardization of cell phone exposure calculations using the SAM phantom and anatomically correct head models.

    PubMed

    Beard, Brian B; Kainz, Wolfgang

    2004-10-13

    We reviewed articles using computational RF dosimetry to compare the Specific Anthropomorphic Mannequin (SAM) to anatomically correct models of the human head. Published conclusions based on such comparisons have varied widely. We looked for reasons that might cause apparently similar comparisons to produce dissimilar results. We also looked at the information needed to adequately compare the results of computational RF dosimetry studies. We concluded studies were not comparable because of differences in definitions, models, and methodology. Therefore we propose a protocol, developed by an IEEE standards group, as an initial step in alleviating this problem. The protocol calls for a benchmark validation study comparing the SAM phantom to two anatomically correct models of the human head. It also establishes common definitions and reporting requirements that will increase the comparability of all computational RF dosimetry studies of the human head.

  8. Review and standardization of cell phone exposure calculations using the SAM phantom and anatomically correct head models

    PubMed Central

    Beard, Brian B; Kainz, Wolfgang

    2004-01-01

    We reviewed articles using computational RF dosimetry to compare the Specific Anthropomorphic Mannequin (SAM) to anatomically correct models of the human head. Published conclusions based on such comparisons have varied widely. We looked for reasons that might cause apparently similar comparisons to produce dissimilar results. We also looked at the information needed to adequately compare the results of computational RF dosimetry studies. We concluded studies were not comparable because of differences in definitions, models, and methodology. Therefore we propose a protocol, developed by an IEEE standards group, as an initial step in alleviating this problem. The protocol calls for a benchmark validation study comparing the SAM phantom to two anatomically correct models of the human head. It also establishes common definitions and reporting requirements that will increase the comparability of all computational RF dosimetry studies of the human head. PMID:15482601

  9. On the equivalence of case-crossover and time series methods in environmental epidemiology.

    PubMed

    Lu, Yun; Zeger, Scott L

    2007-04-01

    The case-crossover design was introduced in epidemiology 15 years ago as a method for studying the effects of a risk factor on a health event using only cases. The idea is to compare a case's exposure immediately prior to or during the case-defining event with that same person's exposure at otherwise similar "reference" times. An alternative approach to the analysis of daily exposure and case-only data is time series analysis. Here, log-linear regression models express the expected total number of events on each day as a function of the exposure level and potential confounding variables. In time series analyses of air pollution, smooth functions of time and weather are the main confounders. Time series and case-crossover methods are often viewed as competing methods. In this paper, we show that case-crossover using conditional logistic regression is a special case of time series analysis when there is a common exposure such as in air pollution studies. This equivalence provides computational convenience for case-crossover analyses and a better understanding of time series models. Time series log-linear regression accounts for overdispersion of the Poisson variance, while case-crossover analyses typically do not. This equivalence also permits model checking for case-crossover data using standard log-linear model diagnostics.
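The case-crossover side of this equivalence hinges on how referent times are chosen. As a hedged illustration, the sketch below builds time-stratified referent sets (all days in the same calendar month sharing the event's day of week), a referent scheme commonly used in air pollution case-crossover studies; the scheme in any particular analysis may differ.

```python
from datetime import date, timedelta

def time_stratified_referents(event_day):
    """Referent days for a time-stratified case-crossover design:
    all other days in the same calendar month that share the
    event day's day of week."""
    d = event_day.replace(day=1)
    refs = []
    while d.month == event_day.month:
        if d.weekday() == event_day.weekday() and d != event_day:
            refs.append(d)
        d += timedelta(days=1)
    return refs

# Each case contributes one stratum: exposure on the event day is compared
# with exposure on its referent days via conditional logistic regression.
refs = time_stratified_referents(date(2007, 4, 18))
```

Because every subject in a stratum shares the same calendar days, a common daily exposure series (as in air pollution studies) makes the conditional logistic likelihood coincide with a log-linear time series model, which is the equivalence the paper establishes.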

  10. Spatial aspects of building and population exposure data and their implications for global earthquake exposure modeling

    USGS Publications Warehouse

    Dell’Acqua, F.; Gamba, P.; Jaiswal, K.

    2012-01-01

    This paper discusses spatial aspects of the global exposure dataset and mapping needs for earthquake risk assessment. We discuss this in the context of development of a Global Exposure Database for the Global Earthquake Model (GED4GEM), which requires compilation of a multi-scale inventory of assets at risk, for example, buildings, populations, and economic exposure. After defining the relevant spatial and geographic scales of interest, different procedures are proposed to disaggregate coarse-resolution data, to map them, and if necessary to infer missing data by using proxies. We discuss the advantages and limitations of these methodologies and detail the potentials of utilizing remote-sensing data. The latter is used especially to homogenize an existing coarser dataset and, where possible, replace it with detailed information extracted from remote sensing using the built-up indicators for different environments. Present research shows that the spatial aspects of earthquake risk computation are tightly connected with the availability of datasets of the resolution necessary for producing sufficiently detailed exposure. The global exposure database designed by the GED4GEM project is able to manage datasets and queries of multiple spatial scales.
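A minimal sketch of the disaggregation step described above, assuming a simple proportional (dasymetric) allocation with a remotely sensed built-up fraction as the proxy weight; the GED4GEM procedures are considerably more elaborate, so this only conveys the idea.

```python
def disaggregate(total, weights):
    """Distribute a coarse-cell total (e.g., population or building count)
    to fine grid cells in proportion to a proxy weight such as the
    built-up fraction extracted from remote sensing."""
    s = sum(weights)
    if s == 0:  # no built-up signal in any cell: spread uniformly
        return [total / len(weights)] * len(weights)
    return [total * w / s for w in weights]

# A district of 10,000 people split over four grid cells by built-up fraction.
cells = disaggregate(10000, [0.5, 0.3, 0.2, 0.0])
```

Cells with no built-up signal receive nothing, which is exactly how a remote-sensing proxy concentrates exposure where assets actually are.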

  11. Projected 2050 Model Simulations for the Chesapeake Bay ...

    EPA Pesticide Factsheets

    The Chesapeake Bay Program has been tasked with assessing how changes in climate systems are expected to alter key variables and processes within the Watershed in concurrence with land use changes. EPA’s Office of Research and Development will be conducting historical and future (2050) Weather Research and Forecasting (WRF) meteorological and Community Multiscale Air Quality (CMAQ) chemical transport model simulations to provide meteorological and nutrient deposition estimates for inclusion in the Chesapeake Bay Program’s assessment of how climate and land use change may impact water quality and ecosystem health. This presentation will cover the timeline and research updates. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.

  12. Exposure to electromagnetic fields from laptop use of "laptop" computers.

    PubMed

    Bellieni, C V; Pinto, I; Bogi, A; Zoppetti, N; Andreuccetti, D; Buonocore, G

    2012-01-01

    Portable computers are often used in close contact with the body and are therefore called "laptops." The authors measured the electromagnetic fields (EMFs) that laptop computers produce and estimated the induced currents in the body, to assess the safety of laptop computers. The authors evaluated 5 commonly used laptops of different brands. They measured the EMF exposure produced and, using validated computerized models, exploited the data of one of the laptop computers (LTCs) to estimate the magnetic flux exposure of the user and of the fetus in the womb, when the laptop is used in close contact with the woman's womb. In the LTCs analyzed, EMF values (range 1.8-6 μT) are within International Commission on Non-Ionizing Radiation Protection (ICNIRP) guidelines, but are considerably higher than the values recommended by 2 recent guidelines for computer monitor magnetic field emissions, MPR II (Swedish Board for Technical Accreditation) and TCO (Swedish Confederation of Professional Employees), and those considered risky for tumor development. When close to the body, the laptop induces currents in the adult's body and in the fetus (in pregnant women) that are within 34.2% to 49.8% of ICNIRP recommendations, but not negligible. By contrast, the power supply induces strong intracorporal electric current densities in the fetus and in the adult subject, which are respectively 182-263% and 71-483% higher than the ICNIRP 98 basic restriction recommended to prevent adverse health effects. The lap is thus paradoxically an improper site for the use of an LTC, which consequently should be renamed so as not to induce customers toward improper use.
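The guideline comparisons reported above reduce to a simple percent-of-limit calculation, sketched here with hypothetical numbers (the applicable ICNIRP reference level depends on the field frequency, so the 100 μT figure below is only an assumed placeholder):

```python
def percent_of_limit(measured, limit):
    """Express a measured exposure as a percentage of a guideline limit;
    values above 100% exceed the guideline."""
    return 100.0 * measured / limit

# Hypothetical: a 6 uT magnetic flux density against an assumed 100 uT
# reference level (not necessarily the limit applicable to laptop emissions).
pct = percent_of_limit(6.0, 100.0)
```

Expressing each quantity as a percentage of its own applicable restriction is what allows the abstract to compare field measurements and induced-current estimates on one scale.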

  13. Constructing inverse probability weights for continuous exposures: a comparison of methods.

    PubMed

    Naimi, Ashley I; Moodie, Erica E M; Auger, Nathalie; Kaufman, Jay S

    2014-03-01

    Inverse probability-weighted marginal structural models with binary exposures are common in epidemiology. Constructing inverse probability weights for a continuous exposure can be complicated by the presence of outliers, and the need to identify a parametric form for the exposure and account for nonconstant exposure variance. We explored the performance of various methods to construct inverse probability weights for continuous exposures using Monte Carlo simulation. We generated two continuous exposures and binary outcomes using data sampled from a large empirical cohort. The first exposure followed a normal distribution with homoscedastic variance. The second exposure followed a contaminated Poisson distribution, with heteroscedastic variance equal to the conditional mean. We assessed six methods to construct inverse probability weights using: a normal distribution, a normal distribution with heteroscedastic variance, a truncated normal distribution with heteroscedastic variance, a gamma distribution, a t distribution (1, 3, and 5 degrees of freedom), and a quantile binning approach (based on 10, 15, and 20 exposure categories). We estimated the marginal odds ratio for a single-unit increase in each simulated exposure in a regression model weighted by the inverse probability weights constructed using each approach, and then computed the bias and mean squared error for each method. For the homoscedastic exposure, the standard normal, gamma, and quantile binning approaches performed best. For the heteroscedastic exposure, the quantile binning, gamma, and heteroscedastic normal approaches performed best. Our results suggest that the quantile binning approach is a simple and versatile way to construct inverse probability weights for continuous exposures.
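A minimal sketch of one of the approaches evaluated above, assuming normal densities for both the marginal and conditional exposure distributions (a stabilized weight is the marginal density of the exposure divided by its density conditional on confounders); the parameter values are illustrative, and in practice the conditional mean comes from a regression of the exposure on confounders.

```python
import math

def normal_pdf(x, mu, sigma):
    """Density of a normal distribution at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def stabilized_weight(x, mu_marg, sd_marg, mu_cond, sd_cond):
    """Stabilized inverse probability weight for a continuous exposure:
    marginal exposure density over the density conditional on confounders."""
    return normal_pdf(x, mu_marg, sd_marg) / normal_pdf(x, mu_cond, sd_cond)

# One subject's exposure x = 2.0; conditional mean/SD would be fitted values
# from an exposure model (numbers here are hypothetical).
w = stabilized_weight(x=2.0, mu_marg=0.0, sd_marg=1.5, mu_cond=1.0, sd_cond=1.0)
```

Swapping `normal_pdf` for a gamma or t density, or replacing the densities with quantile-bin probabilities, yields the other weight constructions compared in the paper.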

  14. Towards Probablistic Assessment of Hypobaric Decompression Sickness Treatment

    NASA Technical Reports Server (NTRS)

    Conkin, J.; Abercromby, A. F.; Feiveson, A. H.; Gernhardt, M. L.; Norcross, J. R.; Ploutz-Snyder, R.; Wessel, J. H., III

    2013-01-01

    INTRODUCTION: Pressure, oxygen (O2), and time are the pillars of effective treatment of decompression sickness (DCS). The NASA DCS Treatment Model links a decrease in computed bubble volume to the resolution of a symptom. The decrease in volume is realized in two stages: a) during the Boyle's Law compression and b) during subsequent dissolution of the gas phase by the O2 window. METHODS: The cumulative distribution of 154 symptoms that resolved during repressurization was described with a log-logistic density function of the pressure difference (deltaP, as psid) associated with symptom resolution and two other explanatory variables. The 154 symptoms originated from 119 cases of DCS during 969 exposures in 47 different altitude tests. RESULTS: The probability of symptom resolution [P(symptom resolution)] = 1 / (1 + exp(-(ln(deltaP) - 1.682 + 1.089×AMB - 0.00395×SYMPTOM TIME) / 0.633)), where AMB is 1 when the subject ambulated as part of the altitude exposure and 0 otherwise, and SYMPTOM TIME is the elapsed time in min from the start of the altitude exposure to recognition of a DCS symptom. The P(symptom resolution) was estimated from the deltaP computed by the Tissue Bubble Dynamics Model based on the "effective" Boyle's Law change: P2 - P1 (deltaP, psid) = P1×V1/V2 - P1, where V1 is the computed volume of a spherical bubble in a unit volume of tissue at low pressure P1 and V2 is the computed volume after a change to a higher pressure P2. V2 continues to decrease through time at P2, at a faster rate if 100% ground-level O2 was breathed. The computed deltaP is the effective treatment pressure at any point in time, as if the entire deltaP was just from Boyle's Law compression. DISCUSSION: Given the low probability of DCS during extravehicular activity and the prompt treatment of a symptom with options through the model, it is likely that the symptom and gas phase will resolve with minimum resources and minimal impact on astronaut health, safety, and productivity.
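The two relations quoted in the abstract can be transcribed directly; the sketch below implements them as given (coefficient values are those reported above, and the inputs in the test are illustrative, not study data).

```python
import math

def effective_delta_p(p1, v1, v2):
    """Effective Boyle's-law treatment pressure from the abstract:
    deltaP = P2 - P1 = P1*V1/V2 - P1 (psid), where V1 is bubble volume
    at pressure P1 and V2 the volume after compression to P2."""
    return p1 * v1 / v2 - p1

def p_symptom_resolution(delta_p, amb, symptom_time):
    """Log-logistic probability of DCS symptom resolution:
    amb = 1 if the subject ambulated during the altitude exposure, else 0;
    symptom_time = minutes from start of exposure to symptom recognition."""
    z = (math.log(delta_p) - 1.682 + 1.089 * amb - 0.00395 * symptom_time) / 0.633
    return 1.0 / (1.0 + math.exp(-z))
```

Note that halving the bubble volume at constant P1 doubles P2, so the effective deltaP equals P1 itself, and larger deltaP values drive the resolution probability monotonically upward, matching the model's premise that compression shrinks the gas phase.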

  15. Accidental release of chlorine in Chicago: Coupling of an exposure model with a Computational Fluid Dynamics model

    NASA Astrophysics Data System (ADS)

    Sanchez, E. Y.; Colman Lerner, J. E.; Porta, A.; Jacovkis, P. M.

    2013-01-01

    The adverse health effects of the release of hazardous substances into the atmosphere continue to be a matter of concern, especially in densely populated urban regions. Emergency responders need estimates of these adverse health effects in the local population to aid planning, emergency response, and recovery efforts. For this purpose, models that predict the transport and dispersion of hazardous materials are as necessary as those that estimate the adverse health effects in the population. In this paper, we present the results obtained by coupling a Computational Fluid Dynamics model, FLACS (FLame ACceleration Simulator), with an exposure model, DDC (Damage Differential Coupling). This coupled model system is applied to a scenario of hypothetical release of chlorine with obstacles, such as buildings, and the results show how it is capable of predicting the atmospheric dispersion of hazardous chemicals, and the adverse health effects in the exposed population, to support decision makers both in charge of emergency planning and in charge of real-time response. The results show how knowledge of the influence of obstacles on the trajectory of the toxic cloud and on the diffusion of the transported pollutants, together with dynamic information on the potentially affected population and associated symptoms, contributes to improved planning of protection and response measures.

  16. Standardized input for Hanford environmental impact statements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Napier, B.A.

    1981-05-01

    Models and computer programs for simulating the environmental behavior of radionuclides in the environment and the resulting radiation dose to humans have been developed over the years by the Environmental Analysis Section staff, Ecological Sciences Department at the Pacific Northwest Laboratory (PNL). Methodologies have evolved for calculating radiation doses from many exposure pathways for any type of release mechanism. Depending on the situation or process being simulated, different sets of computer programs, assumptions, and modeling techniques must be used. This report is a compilation of recommended computer programs and necessary input information for use in calculating doses to members of the general public for environmental impact statements prepared for DOE activities to be conducted on or near the Hanford Reservation.

  17. Computational fluid dynamics modeling of transport and deposition of pesticides in an aircraft cabin

    PubMed Central

    Isukapalli, Sastry S.; Mazumdar, Sagnik; George, Pradeep; Wei, Binnian; Jones, Byron; Weisel, Clifford P.

    2015-01-01

    Spraying of pesticides in aircraft cabins is required by some countries as part of a disinsection process to kill insects that pose a public health threat. However, public health concerns remain regarding exposures of cabin crew and passengers to pesticides in aircraft cabins. While large-scale field measurements of pesticide residues and air concentrations in aircraft cabin scenarios are expensive and time-consuming, Computational Fluid Dynamics (CFD) models provide an effective alternative for characterizing concentration distributions and exposures. This study involved CFD modeling of a twin-aisle, 11-row cabin mockup with heated manikins, mimicking a part of a fully occupied Boeing 767 cabin. The model was applied to study the flow and deposition of pesticides under representative scenarios with different spraying patterns (sideways and overhead) and cabin air exchange rates (low and high). Corresponding spraying experiments were conducted in the cabin mockup, and pesticide deposition samples were collected at the manikin’s lap and seat top for a limited set of five seats. The CFD model performed well for scenarios corresponding to high air exchange rates, captured the concentration profiles for middle seats under low air exchange rates, and underestimated the concentrations at window seats under low air exchange rates. Additionally, both the CFD and experimental measurements showed no major variation in deposition characteristics between sideways and overhead spraying. The CFD model can estimate concentration fields and deposition profiles at very high resolutions, which can be used for characterizing the overall variability in air concentrations and surface loadings. Additionally, these model results can also provide a realistic range of surface and air concentrations of pesticides in the cabin that can be used to estimate potential exposures of cabin crew and passengers to these pesticides. PMID:25642134

  18. Wait, are you sad or angry? Large exposure time differences required for the categorization of facial expressions of emotion

    PubMed Central

    Du, Shichuan; Martinez, Aleix M.

    2013-01-01

    Facial expressions of emotion are essential components of human behavior, yet little is known about the hierarchical organization of their cognitive analysis. We study the minimum exposure time needed to successfully classify the six classical facial expressions of emotion (joy, surprise, sadness, anger, disgust, fear) plus neutral as seen at different image resolutions (240 × 160 to 15 × 10 pixels). Our results suggest a consistent hierarchical analysis of these facial expressions regardless of the resolution of the stimuli. Happiness and surprise can be recognized after very short exposure times (10–20 ms), even at low resolutions. Fear and anger are recognized the slowest (100–250 ms), even in high-resolution images, suggesting a later computation. Sadness and disgust are recognized in between (70–200 ms). The minimum exposure time required for successful classification of each facial expression correlates with the ability of a human subject to identify it correctly at low resolutions. These results suggest a fast, early computation of expressions represented mostly by low spatial frequencies or global configural cues and a later, slower process for those categories requiring a more fine-grained analysis of the image. We also demonstrate that those expressions that are mostly visible in higher-resolution images are not recognized as accurately. We summarize implications for current computational models. PMID:23509409

  19. On the role of numerical simulations in studies of reduced gravity-induced physiological effects in humans. Results from NELME.

    NASA Astrophysics Data System (ADS)

    Perez-Poch, Antoni

    Computer simulations are becoming a promising line of research, as physiological models become more and more sophisticated and reliable. Technological advances in state-of-the-art hardware and software nowadays allow better and more accurate simulations of complex phenomena, such as the response of the human cardiovascular system to long-term exposure to microgravity. Experimental data for long-term missions are difficult to obtain and reproduce; therefore the predictions of computer simulations are of major importance in this field. Our approach is based on a previous model developed and implemented in our laboratory (NELME: Numerical Evaluation of Long-term Microgravity Effects). The software simulates the behaviour of the cardiovascular system and different human organs, has a modular architecture, and allows perturbations such as physical exercise or countermeasures to be introduced. The implementation is based on a complex electrical-like model of this control system, using inexpensive development frameworks, and has been tested and validated with the available experimental data. The objective of this work is to analyse and simulate long-term effects and gender differences when individuals are exposed to long-term microgravity. The risk probability of a health impairment that may jeopardize a long-term mission is also evaluated. Gender differences have been implemented for this specific work as an adjustment of a number of parameters that are included in the model. Women-versus-men physiological differences have therefore been taken into account, based upon estimations from the physiology bibliography. A number of simulations have been carried out for long-term exposure to microgravity. 
Gravity varying continuously from Earth-based to zero and time of exposure are the two main variables involved in the construction of results, including responses to patterns of physical aerobic exercise and thermal stress simulating an extra-vehicular activity. Results show that significant differences appear between men's and women's physiological responses after long-term exposure (more than three months) to microgravity. Risk evaluation for each gender, and specific risk thresholds, are provided. Different scenarios, like a long-term mission to the Moon or Mars, are evaluated, including countermeasures such as aerobic exercise. Initial results are compatible with the existing data and provide useful insights regarding different patterns of microgravity exposure. We conclude that computer-based models such as NELME are a promising line of work to predict health risks in long-term missions.

  20. A Comparison of Exposure Metrics for Traffic-Related Air Pollutants: Application to Epidemiology Studies in Detroit, Michigan

    PubMed Central

    Batterman, Stuart; Burke, Janet; Isakov, Vlad; Lewis, Toby; Mukherjee, Bhramar; Robins, Thomas

    2014-01-01

    Vehicles are major sources of air pollutant emissions, and individuals living near large roads endure high exposures and health risks associated with traffic-related air pollutants. Air pollution epidemiology, health risk, environmental justice, and transportation planning studies would all benefit from an improved understanding of the key information and metrics needed to assess exposures, as well as the strengths and limitations of alternate exposure metrics. This study develops and evaluates several metrics for characterizing exposure to traffic-related air pollutants for the 218 residential locations of participants in the NEXUS epidemiology study conducted in Detroit (MI, USA). Exposure metrics included proximity to major roads, traffic volume, vehicle mix, traffic density, vehicle exhaust emissions density, and pollutant concentrations predicted by dispersion models. Results presented for each metric include comparisons of exposure distributions, spatial variability, intraclass correlation, concordance and discordance rates, and overall strengths and limitations. While showing some agreement, the simple categorical and proximity classifications (e.g., high diesel/low diesel traffic roads and distance from major roads) do not reflect the range and overlap of exposures seen in the other metrics. Information provided by the traffic density metric, defined as the number of kilometers traveled (VKT) per day within a 300 m buffer around each home, was reasonably consistent with the more sophisticated metrics. Dispersion modeling provided spatially- and temporally-resolved concentrations, along with apportionments that separated concentrations due to traffic emissions and other sources. 
While several of the exposure metrics showed broad agreement, including traffic density, emissions density and modeled concentrations, these alternatives still produced exposure classifications that differed for a substantial fraction of study participants, e.g., from 20% to 50% of homes, depending on the metric, would be incorrectly classified into “low”, “medium” or “high” traffic exposure classes. These and other results suggest the potential for exposure misclassification and the need for refined and validated exposure metrics. While data and computational demands for dispersion modeling of traffic emissions are non-trivial concerns, once established, dispersion modeling systems can provide exposure information for both on- and near-road environments that would benefit future traffic-related assessments. PMID:25226412
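The traffic density metric defined above (daily vehicle-kilometers traveled within a 300 m buffer of the home) can be sketched in simplified form; this version treats each road segment as wholly inside or outside the buffer, whereas a GIS implementation would clip segments to the buffer, and the segment data are invented for illustration.

```python
def traffic_density(segments, buffer_m=300.0):
    """Daily vehicle-kilometers traveled (VKT) on road segments within a
    buffer of the home. Each segment is a tuple:
    (distance_to_home_m, length_km, vehicles_per_day)."""
    return sum(length_km * volume
               for distance_m, length_km, volume in segments
               if distance_m <= buffer_m)

# Hypothetical segments around one residence:
segments = [(50.0, 0.4, 20000),   # major road near the home
            (250.0, 0.8, 3000),   # local street
            (900.0, 1.2, 40000)]  # highway outside the 300 m buffer
vkt = traffic_density(segments)
```

Thresholding the resulting VKT values is then what produces the "low", "medium", and "high" exposure classes whose concordance the paper evaluates.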

  1. Life sciences research in space: The requirement for animal models

    NASA Technical Reports Server (NTRS)

    Fuller, C. A.; Philips, R. W.; Ballard, R. W.

    1987-01-01

    Use of animals in NASA space programs is reviewed. Animals are needed because life science experimentation frequently requires long-term controlled exposure to environments, statistical validation, invasive instrumentation or biological tissue sampling, tissue destruction, exposure to dangerous or unknown agents, or sacrifice of the subject. The availability and use of human subjects inflight is complicated by the multiple needs and demands upon crew time. Because only living organisms can sense, integrate and respond to the environment around them, the sole use of tissue culture and computer models is insufficient for understanding the influence of the space environment on intact organisms. Equipment for spaceborne experiments with animals is described.

  2. Computational Model for Oxygen Transport and Consumption in Human Vitreous

    PubMed Central

    Filas, Benjamen A.; Shui, Ying-Bo; Beebe, David C.

    2013-01-01

    Purpose. Previous studies that measured liquefaction and oxygen content in human vitreous suggested that exposure of the lens to excess oxygen causes nuclear cataracts. Here, we developed a computational model that reproduced available experimental oxygen distributions for intact and degraded human vitreous in physiologic and environmentally perturbed conditions. After validation, the model was used to estimate how age-related changes in vitreous physiology and structure alter oxygen levels at the lens. Methods. A finite-element model for oxygen transport and consumption in the human vitreous was created. Major inputs included ascorbate-mediated oxygen consumption in the vitreous, consumption at the posterior lens surface, and inflow from the retinal vasculature. Concentration-dependent relations were determined from experimental human data or estimated from animal studies, with the impact of all assumptions explored via parameter studies. Results. The model reproduced experimental data in humans, including oxygen partial pressure (Po2) gradients (≈15 mm Hg) across the anterior-posterior extent of the vitreous body, higher oxygen levels at the pars plana relative to the vitreous core, increases in Po2 near the lens after cataract surgery, and equilibration in the vitreous chamber following vitrectomy. Loss of the antioxidative capacity of ascorbate increases oxygen levels 3-fold at the lens surface. Homogeneous vitreous degeneration (liquefaction), but not partial posterior vitreous detachment, greatly increases oxygen exposure to the lens. Conclusions. Ascorbate content and the structure of the vitreous gel are critical determinants of lens oxygen exposure. Minimally invasive surgery and restoration of vitreous structure warrant further attention as strategies for preventing nuclear cataracts. PMID:24008409

  3. Computational model for oxygen transport and consumption in human vitreous.

    PubMed

    Filas, Benjamen A; Shui, Ying-Bo; Beebe, David C

    2013-10-15

    Previous studies that measured liquefaction and oxygen content in human vitreous suggested that exposure of the lens to excess oxygen causes nuclear cataracts. Here, we developed a computational model that reproduced available experimental oxygen distributions for intact and degraded human vitreous in physiologic and environmentally perturbed conditions. After validation, the model was used to estimate how age-related changes in vitreous physiology and structure alter oxygen levels at the lens. A finite-element model for oxygen transport and consumption in the human vitreous was created. Major inputs included ascorbate-mediated oxygen consumption in the vitreous, consumption at the posterior lens surface, and inflow from the retinal vasculature. Concentration-dependent relations were determined from experimental human data or estimated from animal studies, with the impact of all assumptions explored via parameter studies. The model reproduced experimental data in humans, including oxygen partial pressure (Po2) gradients (≈15 mm Hg) across the anterior-posterior extent of the vitreous body, higher oxygen levels at the pars plana relative to the vitreous core, increases in Po2 near the lens after cataract surgery, and equilibration in the vitreous chamber following vitrectomy. Loss of the antioxidative capacity of ascorbate increases oxygen levels 3-fold at the lens surface. Homogeneous vitreous degeneration (liquefaction), but not partial posterior vitreous detachment, greatly increases oxygen exposure to the lens. Ascorbate content and the structure of the vitreous gel are critical determinants of lens oxygen exposure. Minimally invasive surgery and restoration of vitreous structure warrant further attention as strategies for preventing nuclear cataracts.
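The transport-and-consumption balance described in this record can be illustrated in one dimension: steady diffusion with a uniform consumption term reduces to a tridiagonal linear system. This is a toy stand-in for the paper's finite-element model, assuming constant consumption rather than the concentration-dependent, ascorbate-mediated kinetics described above, with all numbers illustrative.

```python
def solve_oxygen_1d(c0, cL, L, k, n=50):
    """Steady 1-D diffusion with uniform consumption: c''(x) = k,
    with boundary values c(0)=c0 and c(L)=cL, discretized on n interior
    nodes and solved with the Thomas (tridiagonal) algorithm."""
    h = L / (n + 1)
    # Central differences give: c[i-1] - 2*c[i] + c[i+1] = k*h^2
    a = [1.0] * n           # sub-diagonal
    b = [-2.0] * n          # diagonal
    c = [1.0] * n           # super-diagonal
    d = [k * h * h] * n     # right-hand side
    d[0] -= c0              # fold boundary values into the RHS
    d[-1] -= cL
    for i in range(1, n):   # forward elimination
        m = a[i] / b[i - 1]
        b[i] -= m * c[i - 1]
        d[i] -= m * d[i - 1]
    x = [0.0] * n
    x[-1] = d[-1] / b[-1]
    for i in range(n - 2, -1, -1):  # back substitution
        x[i] = (d[i] - c[i] * x[i + 1]) / b[i]
    return x

# Oxygen tension dipping between a well-oxygenated boundary (retina side)
# and a consuming boundary (lens side); units are arbitrary here.
profile = solve_oxygen_1d(c0=20.0, cL=5.0, L=1.0, k=10.0)
```

Because the exact solution of c'' = k is the quadratic c(x) = c0 + (cL - c0)x/L + (k/2)x(x - L), the central-difference scheme reproduces it to rounding error, which makes the sketch easy to verify.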

  4. Integration and Exposure of Large Scale Computational Resources Across the Earth System Grid Federation (ESGF)

    NASA Astrophysics Data System (ADS)

    Duffy, D.; Maxwell, T. P.; Doutriaux, C.; Williams, D. N.; Chaudhary, A.; Ames, S.

    2015-12-01

    As the size of remote sensing observations and model output data grows, the volume of the data has become overwhelming, even to many scientific experts. As societies are forced to better understand, mitigate, and adapt to climate changes, the combination of Earth observation data and global climate model projections is crucial not only to scientists but also to policy makers, downstream applications, and even the public. Scientific progress on understanding climate is critically dependent on the availability of a reliable infrastructure that promotes data access, management, and provenance. The Earth System Grid Federation (ESGF) has created such an environment for the Intergovernmental Panel on Climate Change (IPCC). ESGF provides a federated global cyber infrastructure for data access and management of model outputs generated for the IPCC Assessment Reports (AR). The current generation of the ESGF federated grid allows consumers of the data to find and download data with limited capabilities for server-side processing. Since the amount of data for future ARs is expected to grow dramatically, ESGF is working on integrating server-side analytics throughout the federation. The ESGF Compute Working Team (CWT) has created a Web Processing Service (WPS) Application Programming Interface (API) to enable access to scalable computational resources. The API is the exposure point to high performance computing resources across the federation. Specifically, the API allows users to execute simple operations, such as maximum, minimum, average, and anomalies, on ESGF data without having to download the data. These operations are executed at the ESGF data node site with access to large amounts of parallel computing capabilities. This presentation will highlight the WPS API and its capabilities, provide implementation details, and discuss future developments.
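
A WPS Execute request of the kind described can be sketched as a key-value-pair URL built client-side. The node URL and operation identifier below are hypothetical placeholders, not actual ESGF service names; the `service`/`version`/`request` keys follow the generic WPS 1.0.0 KVP encoding.

```python
# Minimal client-side sketch of a WPS 1.0.0 KVP "Execute" request, of the
# general kind an ESGF CWT-style API exposes. Endpoint and operation names
# are hypothetical placeholders.
from urllib.parse import urlencode

def build_wps_execute(base_url, identifier, data_inputs):
    """Encode a WPS Execute request asking a node to run a server-side op."""
    inputs = ";".join(f"{k}={v}" for k, v in data_inputs.items())
    query = urlencode({
        "service": "WPS",
        "version": "1.0.0",
        "request": "Execute",
        "identifier": identifier,
        "datainputs": inputs,
    })
    return f"{base_url}?{query}"

url = build_wps_execute(
    "https://esgf-node.example.org/wps",      # placeholder data node
    "average",                                # server-side operation
    {"variable": "tas", "axes": "time"},      # e.g. a time mean, no download
)
print(url)
```

The point of the pattern is that only the small request (and later the small result) crosses the network; the data stays at the node.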

  5. Computational Modeling of Male Reproductive Tract Development for Use in Predictive Toxicology (ASCCT meeting)

    EPA Science Inventory

    Adverse trends in male reproductive health have been reported for increased rates of testicular germ cell tumor, low semen quality, cryptorchidism, and hypospadias. An association with prenatal environmental exposure has been inferred from human and animal studies underlying male...

  6. Evaluating Pharmacokinetic and Pharmacodynamic Interactions with Computational Models in Cumulative Risk Assessment

    EPA Science Inventory

    Simultaneous or sequential exposure to multiple chemicals may cause interactions in the pharmacokinetics (PK) and/or pharmacodynamics (PD) of the individual chemicals. Such interactions can cause modification of the internal or target dose/response of one chemical in the mixture ...

  7. Crowd-Sourced Verification of Computational Methods and Data in Systems Toxicology: A Case Study with a Heat-Not-Burn Candidate Modified Risk Tobacco Product.

    PubMed

    Poussin, Carine; Belcastro, Vincenzo; Martin, Florian; Boué, Stéphanie; Peitsch, Manuel C; Hoeng, Julia

    2017-04-17

    Systems toxicology intends to quantify the effect of toxic molecules in biological systems and unravel their mechanisms of toxicity. The development of advanced computational methods is required for analyzing and integrating high-throughput data generated for this purpose, as well as for extrapolating predictive toxicological outcomes and risk estimates. To ensure the performance and reliability of the methods and verify conclusions from systems toxicology data analysis, it is important to conduct unbiased evaluations by independent third parties. As a case study, we report here the results of an independent verification of methods and data in systems toxicology by crowdsourcing. The sbv IMPROVER systems toxicology computational challenge aimed to evaluate computational methods for the development of blood-based gene expression signature classification models with the ability to predict smoking exposure status. Participants created/trained models on blood gene expression data sets including smokers/mice exposed to 3R4F (a reference cigarette) or noncurrent smokers/Sham (mice exposed to air). Participants applied their models to unseen data to predict whether subjects classified closer to smoke-exposed or non-smoke-exposed groups. The data sets also included data from subjects that had been exposed to potential modified risk tobacco products (MRTPs) or that had switched to a MRTP after exposure to conventional cigarette smoke. The scoring of anonymized participants' predictions was done using predefined metrics. The top 3 performers' methods predicted class labels with area under the precision-recall curve scores above 0.9. Furthermore, although various computational approaches were used, the crowd's results confirmed our own data analysis outcomes with regard to the classification of MRTP-related samples. Mice exposed directly to a MRTP were classified closer to the Sham group. After switching to a MRTP, the confidence that subjects belonged to the smoke-exposed group decreased significantly. Smoking exposure gene signatures that contributed to the group separation included a core set of genes highly consistent across teams, such as AHRR, LRRN3, SASH1, and P2RY6. In conclusion, crowdsourcing constitutes a pertinent approach, complementary to the classical peer review process, for independently and unbiasedly verifying computational methods and data for risk assessment using systems toxicology.

  8. Statistical learning and probabilistic prediction in music cognition: mechanisms of stylistic enculturation.

    PubMed

    Pearce, Marcus T

    2018-05-11

    Music perception depends on internal psychological models derived through exposure to a musical culture. It is hypothesized that this musical enculturation depends on two cognitive processes: (1) statistical learning, in which listeners acquire internal cognitive models of statistical regularities present in the music to which they are exposed; and (2) probabilistic prediction based on these learned models that enables listeners to organize and process their mental representations of music. To corroborate these hypotheses, I review research that uses a computational model of probabilistic prediction based on statistical learning (the information dynamics of music (IDyOM) model) to simulate data from empirical studies of human listeners. The results show that a broad range of psychological processes involved in music perception-expectation, emotion, memory, similarity, segmentation, and meter-can be understood in terms of a single, underlying process of probabilistic prediction using learned statistical models. Furthermore, IDyOM simulations of listeners from different musical cultures demonstrate that statistical learning can plausibly predict causal effects of differential cultural exposure to musical styles, providing a quantitative model of cultural distance. Understanding the neural basis of musical enculturation will benefit from close coordination between empirical neuroimaging and computational modeling of underlying mechanisms, as outlined here. © 2018 The Authors. Annals of the New York Academy of Sciences published by Wiley Periodicals, Inc. on behalf of New York Academy of Sciences.
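
The statistical-learning-plus-prediction loop can be illustrated with a toy bigram analogue of IDyOM: train on melodies from one "culture", then measure the information content (unexpectedness, in bits) of new melodies under the learned model. The corpus and melodies below are invented, and IDyOM itself uses much richer variable-order models; this is only the shape of the idea.

```python
# Toy analogue of statistical learning + probabilistic prediction: learn
# bigram pitch statistics from a "musical culture", then score how
# unexpected each note of a new melody is (information content, in bits).
# Corpus and melodies are invented for illustration.
import math
from collections import defaultdict

def train_bigrams(melodies):
    counts = defaultdict(lambda: defaultdict(int))
    for mel in melodies:
        for prev, nxt in zip(mel, mel[1:]):
            counts[prev][nxt] += 1
    return counts

def information_content(counts, melody, alpha=1.0, vocab=12):
    """Mean -log2 p(note | previous note), with add-alpha smoothing."""
    total = 0.0
    for prev, nxt in zip(melody, melody[1:]):
        ctx = counts[prev]
        p = (ctx[nxt] + alpha) / (sum(ctx.values()) + alpha * vocab)
        total += -math.log2(p)
    return total / (len(melody) - 1)

corpus = [[0, 2, 4, 5, 7, 5, 4, 2, 0]] * 20   # a stepwise, scalar "style"
model = train_bigrams(corpus)
stepwise = information_content(model, [0, 2, 4, 5, 7])   # in-style
leaping = information_content(model, [0, 7, 1, 8, 3])    # out-of-style
# The enculturated model finds the in-style melody far more predictable.
print(stepwise < leaping)
```

Comparing models trained on different corpora in this way is, in miniature, the paper's notion of a quantitative "cultural distance".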

  9. Problematic Substance Use in Urban Adolescents: Role of Intrauterine Exposures to Cocaine and Marijuana and Post-Natal Environment

    PubMed Central

    Frank, Deborah A.; Kuranz, Seth; Appugliese, Danielle; Cabral, Howard; Chen, Clara; Crooks, Denise; Heeren, Timothy; Liebschutz, Jane; Richardson, Mark; Rose-Jacobs, Ruth

    2014-01-01

    Background Linkages between intrauterine exposures to cocaine and marijuana and adolescents’ problematic substance use have not been fully delineated. Methods Prospective longitudinal study with assessors unaware of intrauterine exposure history followed 157 urban participants from birth until late adolescence. Level of intrauterine exposures was identified by mother's report and infant’s meconium. Problematic substance use, identified by the Voice Diagnostic Interview Schedule for Children (V-DISC) or the Audio Computer Assisted Self-Interview (ACASI) and urine assay, was a composite encompassing DSM-IV indication of tolerance, abuse, and dependence on alcohol, marijuana, and tobacco and any use of cocaine, glue, or opiates. Results Twenty percent (32/157) of the sample experienced problematic substance use by age 18 years, of whom the majority (22/157) acknowledged abuse, tolerance or dependence on marijuana with or without other substances. Structural equation models examining direct and indirect pathways, linking a Cox survival model for early substance initiation to logistic regression models, found effects of post-natal factors including childhood exposure to violence and household substance use, early youth substance initiation, and ongoing youth violence exposure contributing to adolescent problematic substance use. Conclusion We did not identify direct relationships between intrauterine cocaine or marijuana exposure and problematic substance use, but did find potentially modifiable post-natal risk factors also noted to be associated with problematic substance use in the general population, including earlier substance initiation and exposure to violence and to household substance use. PMID:24999059

  10. An anisotropic thermomechanical damage model for concrete at transient elevated temperatures.

    PubMed

    Baker, Graham; de Borst, René

    2005-11-15

    The behaviour of concrete at elevated temperatures is important for an assessment of integrity (strength and durability) of structures exposed to a high-temperature environment, in applications such as fire exposure, smelting plants and nuclear installations. In modelling terms, a coupled thermomechanical analysis represents a generalization of the computational mechanics of fracture and damage. Here, we develop a fully coupled anisotropic thermomechanical damage model for concrete under high stress and transient temperature, with emphasis on the adherence of the model to the laws of thermodynamics. Specific analytical results are given, deduced from thermodynamics, of a novel interpretation on specific heat, evolution of entropy and the identification of the complete anisotropic, thermomechanical damage surface. The model is also shown to be stable in a computational sense, and to satisfy the laws of thermodynamics.

  11. Daily computer usage correlated with undergraduate students' musculoskeletal symptoms.

    PubMed

    Chang, Che-Hsu Joe; Amick, Benjamin C; Menendez, Cammie Chaumont; Katz, Jeffrey N; Johnson, Peter W; Robertson, Michelle; Dennerlein, Jack Tigh

    2007-06-01

    A pilot prospective study was performed to examine the relationships between daily computer usage time and musculoskeletal symptoms in undergraduate students. For three separate 1-week study periods distributed over a semester, 27 students reported body-part-specific musculoskeletal symptoms three to five times daily. Daily computer usage time for the 24-hr period preceding each symptom report was calculated from computer input device activities measured directly by software loaded on each participant's primary computer. Generalized Estimating Equation models tested the relationships between daily computer usage and symptom reporting. Daily computer usage longer than 3 hr was significantly associated with an odds ratio of 1.50 (1.01-2.25) for reporting symptoms. Odds of reporting symptoms also increased with quartiles of daily exposure. These data suggest a potential dose-response relationship between daily computer usage time and musculoskeletal symptoms.
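
The reported association came from GEE models on repeated measures; a crude 2x2 odds ratio with a Woolf (log-based) confidence interval merely illustrates the quantity being estimated. The counts below are invented for illustration, not the study's data.

```python
# Crude 2x2 odds ratio with a Woolf log-based confidence interval. This
# illustrates the quantity a GEE model estimates while adjusting for
# repeated measures; the counts are invented.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR for exposed (a=symptoms, b=none) vs unexposed (c=symptoms, d=none)."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of ln(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# invented counts: symptom reports with >3 h vs <=3 h daily computer use
or_, lo, hi = odds_ratio_ci(a=45, b=60, c=30, d=60)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```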

  12. Monte Carlo Computational Modeling of the Energy Dependence of Atomic Oxygen Undercutting of Protected Polymers

    NASA Technical Reports Server (NTRS)

    Banks, Bruce A.; Stueber, Thomas J.; Norris, Mary Jo

    1998-01-01

    A Monte Carlo computational model has been developed which simulates atomic oxygen attack of protected polymers at defect sites in the protective coatings. The parameters defining how atomic oxygen interacts with polymers and protective coatings as well as the scattering processes which occur have been optimized to replicate experimental results observed from protected polyimide Kapton on the Long Duration Exposure Facility (LDEF) mission. Computational prediction of atomic oxygen undercutting at defect sites in protective coatings for various arrival energies was investigated. The atomic oxygen undercutting energy dependence predictions enable one to predict mass loss that would occur in low Earth orbit, based on lower energy ground laboratory atomic oxygen beam systems. Results of computational model prediction of undercut cavity size as a function of energy and defect size will be presented to provide insight into expected in-space mass loss of protected polymers with protective coating defects based on lower energy ground laboratory testing.
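
A schematic version of such a simulation can be sketched in a few lines: atoms enter at a coating defect, then at each step either react with (erode) a polymer cell or scatter onward, with the reaction probability standing in for arrival energy. Grid size, probabilities, and the scattering rule below are illustrative assumptions, not the calibrated LDEF-fit parameters of the NASA model.

```python
# Schematic 2-D Monte Carlo of atomic-oxygen undercutting at a coating
# defect. Reaction probability stands in for arrival energy; all numbers
# are illustrative, not the model's calibrated parameters.
import random

def undercut_cavity(p_react, n_atoms=4000, width=41, depth=30, crack=20, seed=1):
    rng = random.Random(seed)
    polymer = [[True] * width for _ in range(depth)]  # True = intact polymer
    eroded = 0
    for _ in range(n_atoms):
        x, y = crack, 0                      # atom enters at the defect site
        for _ in range(200):                 # follow atom until lost/reacted
            if rng.random() < p_react:
                if polymer[y][x]:            # react: erode this cell
                    polymer[y][x] = False
                    eroded += 1
                break
            x += rng.choice((-1, 0, 1))      # diffuse sideways scatter
            y += rng.choice((0, 1))          # drift deeper into the cavity
            if not (0 <= x < width and 0 <= y < depth):
                break                        # escaped the material
    return eroded

# low vs high reaction probability as a proxy for beam energy
low_energy, high_energy = undercut_cavity(0.05), undercut_cavity(0.25)
print(low_energy, high_energy)
```

Comparing cavity sizes across `p_react` values is the spirit of the paper's energy-dependence study: it lets low-energy ground-laboratory erosion be related to in-orbit conditions.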

  13. Prosthetically directed implant placement using computer software to ensure precise placement and predictable prosthetic outcomes. Part 2: rapid-prototype medical modeling and stereolithographic drilling guides requiring bone exposure.

    PubMed

    Rosenfeld, Alan L; Mandelaris, George A; Tardieu, Philippe B

    2006-08-01

    The purpose of this paper is to expand on part 1 of this series (published in the previous issue) regarding the emerging future of computer-guided implant dentistry. This article will introduce the concept of rapid-prototype medical modeling as well as describe the utilization and fabrication of computer-generated surgical drilling guides used during implant surgery. The placement of dental implants has traditionally been an intuitive process, whereby the surgeon relies on mental navigation to achieve optimal implant positioning. Through rapid-prototype medical modeling and the stereolithographic process, surgical drilling guides (eg, SurgiGuide) can be created. These guides are generated from a surgical implant plan created with a computer software system that incorporates all relevant prosthetic information from which the surgical plan is developed. The utilization of computer-generated planning and stereolithographically generated surgical drilling guides embraces the concept of collaborative accountability and supersedes traditional mental navigation on all levels of implant therapy.

  14. Estimation of whole-body radiation exposure from brachytherapy for oral cancer using a Monte Carlo simulation.

    PubMed

    Ozaki, Y; Watanabe, H; Kaida, A; Miura, M; Nakagawa, K; Toda, K; Yoshimura, R; Sumi, Y; Kurabayashi, T

    2017-07-01

    Early stage oral cancer can be cured with oral brachytherapy, but whole-body radiation exposure status has not been previously studied. Recently, the International Commission on Radiological Protection Committee (ICRP) recommended the use of ICRP phantoms to estimate radiation exposure from external and internal radiation sources. In this study, we used a Monte Carlo simulation with ICRP phantoms to estimate whole-body exposure from oral brachytherapy. We used a Particle and Heavy Ion Transport code System (PHITS) to model oral brachytherapy with 192Ir hairpins and 198Au grains and to perform a Monte Carlo simulation on the ICRP adult reference computational phantoms. To confirm the simulations, we also computed local dose distributions from these small sources, and compared them with the results from Oncentra manual Low Dose Rate Treatment Planning (mLDR) software which is used in day-to-day clinical practice. We successfully obtained data on absorbed dose for each organ in males and females. Sex-averaged equivalent doses were 0.547 and 0.710 Sv with 192Ir hairpins and 198Au grains, respectively. Simulation with PHITS was reliable when compared with an alternative computational technique using mLDR software. We concluded that the absorbed dose for each organ and whole-body exposure from oral brachytherapy can be estimated with Monte Carlo simulation using PHITS on ICRP reference phantoms. Effective doses for patients with oral cancer were obtained. © The Author 2017. Published by Oxford University Press on behalf of The Japan Radiation Research Society and Japanese Society for Radiation Oncology.
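
The final step of such a phantom calculation, combining per-organ equivalent doses into an effective dose with ICRP Publication 103 tissue weighting factors, can be sketched directly. The organ doses below are invented, and only a subset of the full ICRP tissue list is shown (so the weights here do not sum to 1).

```python
# Combining per-organ equivalent doses into an effective dose using ICRP
# Publication 103 tissue weighting factors (subset shown). Organ doses
# are invented for illustration.
W_T = {
    "lung": 0.12,
    "stomach": 0.12,
    "colon": 0.12,
    "red_bone_marrow": 0.12,
    "gonads": 0.08,
    "thyroid": 0.04,
    "liver": 0.04,
}

def effective_dose(organ_dose_sv):
    """E = sum over tissues of w_T * H_T, for the tissues we have weights for."""
    return sum(W_T[t] * h for t, h in organ_dose_sv.items() if t in W_T)

doses = {"lung": 0.30, "thyroid": 1.20, "stomach": 0.25, "liver": 0.40}  # invented, Sv
print(round(effective_dose(doses), 4))
```

In the paper itself the per-organ doses come from the PHITS Monte Carlo transport on the ICRP reference phantoms; this sketch only shows how such organ doses roll up into a single effective-dose figure.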

  15. A Computer Model for Evaluating the Effects on Fighting Vehicle Crewmembers of Exposure to Carbon Monoxide Emissions.

    DTIC Science & Technology

    1980-01-01

    KEY WORDS: Carbon Monoxide (CO); Computer Program; Carboxyhemoglobin (COHb). The model ... predicts the instantaneous amount of carboxyhemoglobin (COHb) in the blood of a person based upon the amount of carbon monoxide ... developed from an empirical equation (derived from reference 1 and detailed in reference 3) which predicts the amount of carboxyhemoglobin (COHb) in ...

  16. 20170312 - Computer Simulation of Developmental ...

    EPA Pesticide Factsheets

    Rationale: Recent progress in systems toxicology and synthetic biology has paved the way to new thinking about in vitro/in silico modeling of developmental processes and toxicities, both for embryological and reproductive impacts. Novel in vitro platforms such as 3D organotypic culture models, engineered microscale tissues and complex microphysiological systems (MPS), together with computational models and computer simulation of tissue dynamics, lend themselves to integrated testing strategies for predictive toxicology. As these emergent methodologies continue to evolve, they must be integrally tied to maternal/fetal physiology and toxicity of the developing individual across early lifestage transitions, from fertilization to birth, through puberty and beyond. Scope: This symposium will focus on how the novel technology platforms can help, now and in the future, with in vitro/in silico modeling of complex biological systems for developmental and reproductive toxicity issues, and with translating systems models into integrative testing strategies. The symposium is based on three main organizing principles: (1) that novel in vitro platforms with human cells configured in nascent tissue architectures with native microphysiological environments yield mechanistic understanding of developmental and reproductive impacts of drug/chemical exposures; (2) that novel in silico platforms with high-throughput screening (HTS) data, biologically-inspired computational models of

  17. Computer Simulation of Developmental Processes and ...

    EPA Pesticide Factsheets

    Rationale: Recent progress in systems toxicology and synthetic biology has paved the way to new thinking about in vitro/in silico modeling of developmental processes and toxicities, both for embryological and reproductive impacts. Novel in vitro platforms such as 3D organotypic culture models, engineered microscale tissues and complex microphysiological systems (MPS), together with computational models and computer simulation of tissue dynamics, lend themselves to integrated testing strategies for predictive toxicology. As these emergent methodologies continue to evolve, they must be integrally tied to maternal/fetal physiology and toxicity of the developing individual across early lifestage transitions, from fertilization to birth, through puberty and beyond. Scope: This symposium will focus on how the novel technology platforms can help, now and in the future, with in vitro/in silico modeling of complex biological systems for developmental and reproductive toxicity issues, and with translating systems models into integrative testing strategies. The symposium is based on three main organizing principles: (1) that novel in vitro platforms with human cells configured in nascent tissue architectures with native microphysiological environments yield mechanistic understanding of developmental and reproductive impacts of drug/chemical exposures; (2) that novel in silico platforms with high-throughput screening (HTS) data, biologically-inspired computational models of

  18. The Interactive Effects of Computer Conferencing and Multiple Intelligences on Expository Writing.

    ERIC Educational Resources Information Center

    Cifuentes, Lauren; Hughey, Jane

    2003-01-01

    Investigates the differential effects of computer conferencing on expository writing for students of seven intelligence types. Students were assigned to treatment groups that provided controlled exposure to a topic: unstructured exposure; computer conferencing; face-to-face discussion; and computer conferencing and face-to-face discussion.…

  19. Computer-Assisted Exposure Treatment for Flight Phobia

    ERIC Educational Resources Information Center

    Tortella-Feliu, Miguel; Bornas, Xavier; Llabres, Jordi

    2008-01-01

    This review introduces the state of the art in computer-assisted treatment for behavioural disorders. The core of the paper describes one of these interventions, which provides computer-assisted exposure for the treatment of flight phobia: the Computer-Assisted Fear of Flying Treatment (CAFFT). The rationale, contents and structure of the CAFFT…

  20. A model for the perception of environmental sound based on notice-events.

    PubMed

    De Coensel, Bert; Botteldooren, Dick; De Muer, Tom; Berglund, Birgitta; Nilsson, Mats E; Lercher, Peter

    2009-08-01

    An approach is proposed to shed light on the mechanisms underlying human perception of environmental sound that intrudes in everyday living. Most research on exposure-effect relationships aims at relating overall effects to overall exposure indicators in an epidemiological fashion, without including available knowledge on the possible underlying mechanisms. Here, it is proposed to start from available knowledge on audition and perception to construct a computational framework for the effect of environmental sound on individuals. Obviously, at the individual level additional mechanisms (inter-sensory, attentional, cognitive, emotional) play a role in the perception of environmental sound. As a first step, current knowledge is made explicit by building a model mimicking some aspects of human auditory perception. This model is grounded in the hypothesis that long-term perception of environmental sound is determined primarily by short notice-events. The applicability of the notice-event model is illustrated by simulating a synthetic population exposed to typical Flemish environmental noise. From these simulation results, it is demonstrated that the notice-event model is able to mimic the differences between the annoyance caused by road traffic noise exposure and railway traffic noise exposure that are also observed empirically in other studies and thus could provide an explanation for these differences.
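
A minimal notice-event detector in this spirit flags the moments when the instantaneous sound level emerges from a slowly tracked background by some margin. The margin, smoothing constant, and synthetic level trace below are illustrative assumptions, not the model's calibrated mechanism.

```python
# Minimal notice-event sketch: a sample becomes a "notice-event" when its
# level exceeds a slowly adapting background estimate by a margin.
# Threshold values and the synthetic trace are illustrative.

def notice_events(levels_db, margin_db=10.0, smoothing=0.95):
    """Flag sample indices where the level jumps margin_db above background."""
    background = levels_db[0]
    events = []
    for t, level in enumerate(levels_db):
        if level > background + margin_db:
            events.append(t)              # sound emerges from the background
        # slowly track the ambient level (exponential smoothing)
        background = smoothing * background + (1 - smoothing) * level
    return events

# synthetic trace: 45 dB ambience with two vehicle pass-bys at t=30 and t=70
trace = [45.0] * 100
for t, peak in ((30, 70.0), (70, 62.0)):
    for dt in range(-3, 4):
        trace[t + dt] = max(trace[t + dt], peak - 4 * abs(dt))
events = notice_events(trace)
print(events)
```

Long-term indicators (e.g. annoyance proxies) would then be accumulated over the detected events rather than over the raw level trace, which is the model's central hypothesis.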

  1. Early Childhood Media Exposure and Self-Regulation: Bi-Directional Longitudinal Associations.

    PubMed

    Cliff, Dylan P; Howard, Steven J; Radesky, Jenny S; McNeill, Jade; Vella, Stewart A

    2018-04-26

    To investigate: i) prospective associations between media exposure (television viewing, computers, and electronic games) at 2 years and self-regulation at 4 and 6 years, and ii) bi-directional associations between media exposure and self-regulation at 4 and 6 years. We hypothesized that media exposure and self-regulation would display a negative prospective association and subsequent bi-directional inverse associations. Data from the nationally-representative Longitudinal Study of Australian Children (LSAC) when children were aged 2 (n=2786) and 4/6 years (n=3527) were used. Primary caregivers reported children's weekly electronic media exposure. A composite measure of self-regulation was computed from caregiver-, teacher-, and observer-report data. Associations were examined using linear regression and cross-lagged panel models, accounting for covariates. Lower television viewing and total media exposure at 2 years were associated with higher self-regulation at 4 years (both β -0.02; 95% confidence interval [CI] -0.03, -0.01). Lower self-regulation at 4 years was also significantly associated with higher television viewing (β -0.15; 95% CI -0.21, -0.08), electronic game use (β -0.05; 95% CI -0.09, -0.01), and total media exposure (β -0.19; 95% CI -0.29, -0.09) at 6 years. However, media exposure at 4 years was not associated with self-regulation at 6 years. Although media exposure duration at 2 years was associated with later self-regulation, and self-regulation at 4 years was associated with later media exposure, associations were of small magnitude. More research is needed examining content quality, social context, and mobile media use and child self-regulation. Copyright © 2018. Published by Elsevier Inc.

  2. Differences in muscle load between computer and non-computer work among office workers.

    PubMed

    Richter, J M; Mathiassen, S E; Slijper, H P; Over, E A B; Frens, M A

    2009-12-01

    Introduction of more non-computer tasks has been suggested to increase exposure variation and thus reduce musculoskeletal complaints (MSC) in computer-intensive office work. This study investigated whether muscle activity did, indeed, differ between computer and non-computer activities. Whole-day logs of input device use in 30 office workers were used to identify computer and non-computer work, using a range of classification thresholds (non-computer thresholds (NCTs)). Exposure during these activities was assessed by bilateral electromyography recordings from the upper trapezius and lower arm. Contrasts in muscle activity between computer and non-computer work were distinct but small, even at the individualised, optimal NCT. Using an average group-based NCT resulted in less contrast, even in smaller subgroups defined by job function or MSC. Thus, computer activity logs should be used cautiously as proxies of biomechanical exposure. Conventional non-computer tasks may have a limited potential to increase variation in muscle activity during computer-intensive office work.
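
The non-computer-threshold (NCT) idea can be sketched as a gap classifier over input-device timestamps: pauses longer than the threshold are counted as non-computer work, shorter gaps as computer work. The timestamps and threshold value below are invented for illustration.

```python
# Sketch of the non-computer-threshold (NCT) idea: gaps between input-device
# events longer than the threshold are classified as non-computer work.
# Timestamps (seconds) and the threshold are illustrative.

def classify_noncomputer(event_times_s, nct_s=30.0):
    """Return (computer_s, noncomputer_s) from input-event timestamps."""
    computer = noncomputer = 0.0
    for prev, nxt in zip(event_times_s, event_times_s[1:]):
        gap = nxt - prev
        if gap > nct_s:
            noncomputer += gap       # idle long enough: non-computer task
        else:
            computer += gap          # continuous input-device activity
    return computer, noncomputer

# a burst of typing, a 5-minute break, then mouse work
events = [0, 2, 4, 6, 8, 308, 310, 312, 320]
comp, noncomp = classify_noncomputer(events)
print(comp, noncomp)
```

The study's point is that the muscle-activity contrast between the two classes is small even when `nct_s` is tuned per individual, so such logs are only rough proxies of biomechanical exposure.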

  3. A stochastic whole-body physiologically based pharmacokinetic model to assess the impact of inter-individual variability on tissue dosimetry over the human lifespan.

    PubMed

    Beaudouin, Rémy; Micallef, Sandrine; Brochot, Céline

    2010-06-01

    Physiologically based pharmacokinetic (PBPK) models have proven to be successful in integrating and evaluating the influence of age- or gender-dependent changes with respect to the pharmacokinetics of xenobiotics throughout entire lifetimes. Nevertheless, for an effective application of toxicokinetic modelling to chemical risk assessment, a PBPK model has to be detailed enough to include all the multiple tissues that could be targeted by the various xenobiotics present in the environment. For this reason, we developed a PBPK model based on a detailed compartmentalization of the human body and parameterized with new relationships describing the time evolution of physiological and anatomical parameters. To take into account the impact of human variability on the predicted toxicokinetics, we defined probability distributions for key parameters related to the xenobiotics absorption, distribution, metabolism and excretion. The model predictability was evaluated by a direct comparison between computational predictions and experimental data for the internal concentrations of two chemicals (1,3-butadiene and 2,3,7,8-tetrachlorodibenzo-p-dioxin). A good agreement between predictions and observed data was achieved for different scenarios of exposure (e.g., acute or chronic exposure and different populations). Our results support that the general stochastic PBPK model can be a valuable computational support in the area of chemical risk analysis. (c)2010 Elsevier Inc. All rights reserved.
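
The population-variability step can be illustrated with a one-compartment stand-in for the full multi-tissue PBPK model: sample clearance and volume of distribution from lognormal distributions and propagate them to an internal concentration. All distributions and parameter values below are invented for illustration.

```python
# Population-variability sketch in the spirit of a stochastic PBPK analysis:
# a one-compartment analogue where clearance and volume are sampled from
# lognormal distributions, yielding a distribution of internal concentrations.
# A real PBPK model has many coupled tissue compartments; parameters invented.
import math
import random

def conc_at(t_h, dose_mg, cl_l_h, v_l):
    """C(t) for an IV bolus, one-compartment: (dose/V) * exp(-(CL/V) * t)."""
    return (dose_mg / v_l) * math.exp(-(cl_l_h / v_l) * t_h)

def population_conc(t_h=4.0, dose_mg=100.0, n=2000, seed=7):
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        cl = rng.lognormvariate(math.log(5.0), 0.3)   # clearance, L/h
        v = rng.lognormvariate(math.log(40.0), 0.2)   # volume, L
        out.append(conc_at(t_h, dose_mg, cl, v))
    return out

concs = sorted(population_conc())
p05, p50, p95 = (concs[int(q * len(concs))] for q in (0.05, 0.5, 0.95))
# the 5th-95th percentile band summarises inter-individual variability
print(round(p05, 3), round(p50, 3), round(p95, 3))
```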

  4. Evaluation and characterization of fetal exposures to low frequency magnetic fields generated by laptop computers.

    PubMed

    Zoppetti, Nicola; Andreuccetti, Daniele; Bellieni, Carlo; Bogi, Andrea; Pinto, Iole

    2011-12-01

    Portable - or "laptop" - computers (LCs) are widely and increasingly used all over the world. Since LCs are often used in tight contact with the body even by pregnant women, fetal exposures to low frequency magnetic fields generated by these units can occur. LC emissions are usually characterized by complex waveforms and are often generated by the main AC power supply (when connected) and by the display power supply sub-system. In the present study, low frequency magnetic field emissions were measured for a set of five models of portable computers. For each of them, the magnetic flux density was characterized in terms not just of field amplitude, but also of the so-called "weighted peak" (WP) index, introduced in the 2003 ICNIRP Statement on complex waveforms and confirmed in the 2010 ICNIRP Guidelines for low frequency fields. For the LC model presenting the highest emission, a deeper analysis was also carried out, using numerical dosimetry techniques to calculate internal quantities (current density and in-situ electric field) with reference to a digital body model of a pregnant woman. Since internal quantities have complex waveforms too, the concept of the WP index was extended to them, considering the ICNIRP basic restrictions defined in the 1998 Guidelines for the current density and in the 2010 Guidelines for the in-situ electric field. Induced quantities and WP indexes were computed using an appropriate original formulation of the well-known Scalar Potential Finite Difference (SPFD) numerical method for electromagnetic dosimetry in quasi-static conditions. Copyright © 2011 Elsevier Ltd. All rights reserved.
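
A simplified time-domain illustration of the weighted-peak idea: divide each spectral component of the field by the exposure limit at its frequency, sum the weighted components, and take the peak of the result (compliance if the index stays at or below 1). The toy limit curve and waveform below are invented, and the weighting filter's phase response, which the full ICNIRP method includes, is ignored here.

```python
# Simplified "weighted peak" (WP) sketch: weight each spectral component by
# the reciprocal of the limit at its frequency, sum in the time domain, and
# take the peak. Limit curve and waveform are invented; the ICNIRP weighting
# filter's phase response is deliberately omitted for brevity.
import math

def exposure_limit_ut(f_hz):
    """Toy reference level (microtesla) falling as 1/f, floored at 1 uT."""
    return max(5000.0 / f_hz, 1.0)

def weighted_peak(components, duration_s=0.1, n=4000):
    """components: list of (amplitude_uT, freq_hz, phase_rad) tuples."""
    wp = 0.0
    for i in range(n):
        t = i * duration_s / n
        s = sum(a / exposure_limit_ut(f) * math.cos(2 * math.pi * f * t + ph)
                for a, f, ph in components)
        wp = max(wp, abs(s))
    return wp

# complex waveform: 50 Hz fundamental plus third and fifth harmonics
index = weighted_peak([(30.0, 50.0, 0.0), (12.0, 150.0, 0.5), (6.0, 250.0, 1.0)])
print(round(index, 3))
```

Because harmonics are weighted by their own (stricter) limits before summation, a waveform can exceed the WP criterion even when every individual component is below its limit, which is the point of the method.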

  5. Multi-resolution voxel phantom modeling: a high-resolution eye model for computational dosimetry

    NASA Astrophysics Data System (ADS)

    Caracappa, Peter F.; Rhodes, Ashley; Fiedler, Derek

    2014-09-01

    Voxel models of the human body are commonly used for simulating radiation dose with a Monte Carlo radiation transport code. Due to memory limitations, the voxel resolution of these computational phantoms is typically too large to accurately represent the dimensions of small features such as the eye. The recently reduced recommended dose limits for the lens of the eye, a radiosensitive tissue with significant concern for cataract formation, have lent increased importance to understanding the dose to this tissue. A high-resolution eye model is constructed using physiological data for the dimensions of radiosensitive tissues, and combined with an existing set of whole-body models to form a multi-resolution voxel phantom, which is used with the MCNPX code to calculate radiation dose from various exposure types. This phantom provides an accurate representation of the radiation transport through the structures of the eye. Two alternate methods of including a high-resolution eye model within an existing whole-body model are developed. The accuracy and performance of each method is compared against existing computational phantoms.

  6. Thermal modeling of lesion growth with radiofrequency ablation devices

    PubMed Central

    Chang, Isaac A; Nguyen, Uyen D

    2004-01-01

    Background Temperature is a frequently used parameter to describe the predicted size of lesions computed by computational models. In many cases, however, temperature correlates poorly with lesion size. Although many studies have been conducted to characterize the relationship between time-temperature exposure of tissue heating to cell damage, to date these relationships have not been employed in a finite element model. Methods We present an axisymmetric two-dimensional finite element model that calculates cell damage in tissues and compare lesion sizes using common tissue damage and iso-temperature contour definitions. The model accounts for both temperature-dependent changes in the electrical conductivity of tissue as well as tissue damage-dependent changes in local tissue perfusion. The data are validated using excised porcine liver tissues. Results The data demonstrate that the size of thermal lesions is grossly overestimated when calculated using traditional temperature isocontours of 42°C and 47°C. The computational model results predicted lesion dimensions that were within 5% of the experimental measurements. Conclusion When modeling radiofrequency ablation problems, temperature isotherms may not be representative of actual tissue damage patterns. PMID:15298708
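
One common way to map a time-temperature history to accumulated cell damage in such models is the Arrhenius damage integral (whether this paper uses exactly this formulation is not stated in the abstract). The kinetic parameters below are illustrative, chosen only to show the threshold behaviour.

```python
# Arrhenius damage integral, a common time-temperature damage measure:
# Omega = integral of A * exp(-Ea / (R * T(t))) dt, with Omega >= 1
# conventionally taken as irreversible damage. A and Ea are illustrative.
import math

A = 7.39e39      # frequency factor, 1/s (illustrative)
EA = 2.577e5     # activation energy, J/mol (illustrative)
R = 8.314        # gas constant, J/(mol K)

def damage(temps_c, dt_s):
    """Accumulate Omega over a sampled temperature history."""
    omega = 0.0
    for t_c in temps_c:
        omega += A * math.exp(-EA / (R * (t_c + 273.15))) * dt_s
    return omega

# 60 s at 43 C stays below the damage threshold; 60 s at 55 C far exceeds it
mild = damage([43.0] * 60, dt_s=1.0)
hot = damage([55.0] * 60, dt_s=1.0)
print(mild < 1.0 < hot)
```

This exponential temperature dependence is exactly why fixed 42°C or 47°C isotherms can misstate lesion size: damage depends on the whole time-temperature history, not on crossing a single temperature.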

  7. Principles for the wise use of computers by children.

    PubMed

    Straker, L; Pollock, C; Maslen, B

    2009-11-01

    Computer use by children at home and school is now common in many countries. Child computer exposure varies with the type of computer technology available and the child's age, gender and social group. This paper reviews the current exposure data and the evidence for positive and negative effects of computer use by children. Potential positive effects of computer use by children include enhanced cognitive development and school achievement, reduced barriers to social interaction, enhanced fine motor skills and visual processing, and effective rehabilitation. Potential negative effects include threats to child safety, inappropriate content, exposure to violence, bullying, Internet 'addiction', displacement of moderate/vigorous physical activity, exposure to junk food advertising, sleep displacement, vision problems and musculoskeletal problems. The case for child-specific evidence-based guidelines for wise use of computers is presented, based on children using computers differently from adults; being physically, cognitively and socially different from adults; being in a state of change and development; and the potential for computer use to affect later adult risk. Progress towards child-specific guidelines is reported. Finally, a set of guideline principles is presented as the basis for more detailed guidelines on the physical, cognitive and social impact of computer use by children. The principles cover computer literacy, technology safety, child safety and privacy, and appropriate social, cognitive and physical development. The majority of children in affluent communities now have substantial exposure to computers. This is likely to have significant effects on child physical, cognitive and social development. Ergonomics can provide and promote guidelines for wise use of computers by children and, by doing so, promote the positive effects and reduce the negative effects of computer-child, and subsequent computer-adult, interaction.

  8. Transport of Space Environment Electrons: A Simplified Rapid-Analysis Computational Procedure

    NASA Technical Reports Server (NTRS)

    Nealy, John E.; Anderson, Brooke M.; Cucinotta, Francis A.; Wilson, John W.; Katz, Robert; Chang, C. K.

    2002-01-01

    A computational procedure for describing transport of electrons in condensed media has been formulated for application to effects and exposures from spectral distributions typical of electrons trapped in planetary magnetic fields. The procedure is based on earlier parameterizations established from numerous electron beam experiments. New parameterizations have been derived that logically extend the domain of application to low molecular weight (high hydrogen content) materials and higher energies (approximately 50 MeV). The production and transport of high energy photons (bremsstrahlung) generated in the electron transport processes have also been modeled using tabulated values of photon production cross sections. A primary purpose for developing the procedure has been to provide a means for rapidly performing numerous repetitive calculations essential for electron radiation exposure assessments for complex space structures. Several favorable comparisons have been made with previous calculations for typical space environment spectra, which have indicated that accuracy has not been substantially compromised at the expense of computational speed.

  9. A modeling investigation of the impact of street and building configurations on personal air pollutant exposure in isolated deep urban canyons.

    PubMed

    Ng, Wai-Yin; Chau, Chi-Kwan

    2014-01-15

    This study evaluated the effectiveness of different configurations of two building design elements, namely building permeability and setback, proposed for mitigating air pollutant exposure problems in isolated deep canyons, using an indirect exposure approach. The indirect approach predicted the exposures of three different population subgroups (i.e. pedestrians, shop vendors and residents) by multiplying the pollutant concentrations by the duration of exposure within a specific micro-environment. In this study, the pollutant concentrations for different configurations were predicted using a computational fluid dynamics model. The model was constructed based on the Reynolds-Averaged Navier-Stokes (RANS) equations with the standard k-ε turbulence model. Fifty-one canyon configurations with aspect ratios of 2, 4 and 6 and different building permeability values (ratio of building spacing to building façade length) or different types of building setback (recess of a high building from the road) were examined. The findings indicated that personal exposures of shop vendors were extremely high if they were present inside a canyon without any setback or separation between buildings and when the prevailing wind was perpendicular to the canyon axis. Building separation and building setbacks were effective in reducing personal air exposures in canyons with perpendicular wind, although their effectiveness varied with configuration. Increasing the permeability value from 0 to 10% significantly lowered the personal exposures of the different population subgroups. Likewise, the personal exposures could also be reduced by the introduction of building setbacks, although their effects were strongly influenced by the aspect ratio of the canyon. Equivalent findings were observed when the reduction in the total development floor area (the total floor area permitted to be developed within a particular site area) was also considered. 
These findings were employed to formulate a hierarchical decision-making model to guide the planning of deep canyons in high-density urban cities. © 2013 Elsevier B.V. All rights reserved.
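
    The indirect exposure approach described above multiplies micro-environment concentrations by occupancy times; a minimal sketch (all concentrations and durations below are illustrative assumptions, not the study's values):

```python
# Indirect exposure: sum of (pollutant concentration x time spent)
# over the micro-environments a subgroup occupies.
def personal_exposure(schedule):
    """schedule: list of (concentration in ug/m3, hours) pairs."""
    return sum(c * h for c, h in schedule)

# Illustrative subgroup schedules for one canyon configuration:
pedestrian = [(80.0, 0.5)]    # brief transit through the canyon
shop_vendor = [(80.0, 10.0)]  # whole working day at street level
resident = [(35.0, 14.0)]     # upper floors, lower concentration

print(personal_exposure(shop_vendor))  # ug*h/m3
```

The same street-level concentration yields a far higher exposure for the vendor than for the pedestrian, which is why the abstract singles out shop vendors as the critical subgroup.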

  10. Computational approach on PEB process in EUV resist: multi-scale simulation

    NASA Astrophysics Data System (ADS)

    Kim, Muyoung; Moon, Junghwan; Choi, Joonmyung; Lee, Byunghoon; Jeong, Changyoung; Kim, Heebom; Cho, Maenghyo

    2017-03-01

    For decades, downsizing has been a key issue for achieving high performance and low cost in semiconductors, and extreme ultraviolet lithography is one of the promising candidates for reaching that goal. Post-exposure bake, the step of extreme ultraviolet lithography that predominantly determines resolution and sensitivity, has mainly been studied by experimental groups, and photoresist development has stalled because the mechanisms at work during the process remain unclear. Herein, we provide a theoretical approach to investigate the underlying mechanism of the post-exposure bake process in chemically amplified resist; it covers three important reactions during the process: acid generation by photo-acid generator dissociation, acid diffusion, and deprotection. Density functional theory calculations (quantum mechanical simulation) were conducted to quantitatively predict the activation energy and probability of the chemical reactions, and these were applied to molecular dynamics simulation to construct a reliable computational model. The overall chemical reactions were then simulated in the molecular dynamics unit cell, and the final configuration of the photoresist was used to predict the line edge roughness. The presented multiscale model unifies the phenomena of both quantum and atomic scales during the post-exposure bake process, and it will be helpful for understanding critical factors affecting the performance of the resulting photoresist and for designing the next-generation material.

  11. Company-level, semi-quantitative assessment of occupational styrene exposure when individual data are not available.

    PubMed

    Kolstad, Henrik A; Sønderskov, Jette; Burstyn, Igor

    2005-03-01

    In epidemiological research, self-reported information about determinants and levels of occupational exposures is difficult to obtain, especially if the disease under study has a high mortality rate or follow-up has exceeded several years. In this paper, we present a semi-quantitative exposure assessment strategy for nested case-control studies of styrene exposure among workers of the Danish reinforced plastics industry when no information on job title, task or other indicators of individual exposure was readily available from cases and controls. The strategy takes advantage of the variability in styrene exposure level and styrene exposure probability across companies. The study comprised 1522 cases of selected malignancies and neurodegenerative diseases, and controls, employed in 230 reinforced plastics companies and other related industries. Between 1960 and 1996, 3057 measurements of styrene exposure level, obtained from 191 companies, were identified. Mixed effects models were used to estimate expected styrene exposure levels by production characteristics for all companies. Styrene exposure probability within each company was estimated for all but three cases and controls from the fraction of laminators, which was reported by a sample of 945 living colleagues of the cases and controls and by employers and dealers of plastic raw materials. The estimates were validated against a subset of 427 living cases and controls who reported their own work as laminators in the industry. We computed styrene exposure scores that integrated estimated styrene exposure level and styrene exposure probability. Product (boats), process (hand and spray lamination) and calendar year period were the major determinants of styrene exposure level. Within-company styrene exposure variability increased by calendar year and was accounted for when computing the styrene exposure scores. 
Exposure probability estimates based on colleagues' reports showed the highest predictive values in the validation test, which also indicated that up to 67% of the workers were correctly classified into a styrene-exposed job. Styrene exposure scores declined about 10-fold from the 1960s to the 1990s. This exposure assessment approach may be applicable in other industries, especially industries dominated by small companies with simple exposure conditions.
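
    The scoring idea, combining an estimated company-level exposure intensity with the probability of holding an exposed job, can be sketched as follows (intensities, durations and laminator fractions below are illustrative assumptions, not values from the study):

```python
# Semi-quantitative exposure score: estimated exposure intensity
# times probability of holding an exposed job (fraction of laminators
# in the company), summed over calendar periods of employment.
def exposure_score(periods):
    """periods: list of (years employed, level in ppm, prob. exposed)."""
    return sum(years * level * prob for years, level, prob in periods)

worker = [
    (5, 180.0, 0.6),   # 1960s: hand lamination of boats, high levels
    (10, 40.0, 0.6),   # 1980s: same job, levels declining over time
]
score = exposure_score(worker)
print(score)  # roughly 780 ppm-years weighted by exposure probability
```

Weighting level by probability lets a company-level strategy assign lower scores where only a minority of workers laminated, which is the misclassification trade-off the validation step quantifies.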

  12. Assessment of global flood exposures - developing an appropriate approach

    NASA Astrophysics Data System (ADS)

    Millinship, Ian; Booth, Naomi

    2015-04-01

    Increasingly complex probabilistic catastrophe models have become the standard for quantitative flood risk assessments by re/insurance companies. Probabilistic modelling of this nature is extremely useful, as a large range of risk metrics can be output; however, such models can be time-consuming and computationally expensive to develop and run. Levels of uncertainty are persistently high despite, or perhaps because of, attempts to increase resolution and complexity. A cycle of dependency between modelling companies and re/insurers has developed whereby available models are purchased, models run, and both portfolio and model data 'improved' every year. This can lead to potential exposures in perils and territories that are not currently modelled being largely overlooked by companies, who may then face substantial and unexpected losses when large events occur in these areas. We present here an approach to assessing global flood exposures which reduces the scale and complexity of the approach used and begins with the identification of hotspots where there is a significant exposure to flood risk. The method comprises four stages: (i) compile consistent exposure information; (ii) apply reinsurance terms and conditions to calculate values exposed; (iii) assess the potential hazard using a global set of flood hazard maps; and (iv) identify potential risk 'hotspots', taking into consideration spatially and/or temporally clustered historical events and local flood defences. This global exposure assessment is designed as a scoping exercise, and reveals areas or cities where the potential for accumulated loss is of significant interest to a reinsurance company, and for which there is no existing catastrophe model. These regions are then candidates for the development of deterministic scenarios, or probabilistic models. The key advantages of this approach will be discussed. 
These include simplicity and the ability of business leaders to understand results, as well as the ease and speed of analysis and the advantages this can offer in terms of monitoring changing exposures over time. Significantly, in many areas of the world, this increase in exposure is likely to have more of an impact on increasing catastrophe losses than potential anthropogenically driven changes in weather extremes.

  13. Prediction of mesothelioma and lung cancer in a cohort of asbestos exposed workers.

    PubMed

    Gasparrini, Antonio; Pizzo, Anna Maria; Gorini, Giuseppe; Seniori Costantini, Adele; Silvestri, Stefano; Ciapini, Cesare; Innocenti, Andrea; Berry, Geoffrey

    2008-01-01

    Several papers have reported state-wide projections of mesothelioma deaths, but few have computed these predictions in selected exposed groups. The aim was to predict the future deaths attributable to asbestos in a cohort of railway rolling stock workers. The future mortality of the 1,146 living workers has been computed in terms of individual probability of dying from three different risks: baseline mortality, lung cancer excess, and mesothelioma mortality. Lung cancer mortality attributable to asbestos was calculated assuming the excess risk to be stable or to decrease after a period of time since first exposure. Mesothelioma mortality was based on cumulative exposure and time since first exposure, with the inclusion of a term for clearance of asbestos fibres from the lung. The most likely range of the number of deaths attributable to asbestos in the period 2005-2050 was 15-30 for excess lung cancer and 23-35 for mesothelioma. This study provides predictions of asbestos-related mortality even in a selected cohort of exposed subjects, using previous knowledge about the exposure-response relationship. The inclusion of individual information in the projection model helps reduce misclassification and improves the results. The method could be extended to other selected cohorts.

  14. Bayesian adjustment for measurement error in continuous exposures in an individually matched case-control study.

    PubMed

    Espino-Hernandez, Gabriela; Gustafson, Paul; Burstyn, Igor

    2011-05-14

    In epidemiological studies explanatory variables are frequently subject to measurement error. The aim of this paper is to develop a Bayesian method to correct for measurement error in multiple continuous exposures in individually matched case-control studies. This is a topic that has not been widely investigated. The new method is illustrated using data from an individually matched case-control study of the association between thyroid hormone levels during pregnancy and exposure to perfluorinated acids. The objective of the motivating study was to examine the risk of maternal hypothyroxinemia due to exposure to three perfluorinated acids measured on a continuous scale. Results from the proposed method are compared with those obtained from a naive analysis. Using a Bayesian approach, the developed method considers a classical measurement error model for the exposures, as well as the conditional logistic regression likelihood as the disease model, together with a random-effect exposure model. Proper and diffuse prior distributions are assigned, and results from a quality control experiment are used to estimate the perfluorinated acids' measurement error variability. As a result, posterior distributions and 95% credible intervals of the odds ratios are computed. A sensitivity analysis of the method's performance in this particular application with different measurement error variability was performed. The proposed Bayesian method to correct for measurement error is feasible and can be implemented using statistical software. For the study on perfluorinated acids, a comparison of the inferences which are corrected for measurement error to those which ignore it indicates that little adjustment is manifested for the level of measurement error actually exhibited in the exposures. Nevertheless, a sensitivity analysis shows that more substantial adjustments arise if larger measurement errors are assumed. 
In individually matched case-control studies, the use of conditional logistic regression likelihood as a disease model in the presence of measurement error in multiple continuous exposures can be justified by having a random-effect exposure model. The proposed method can be successfully implemented in WinBUGS to correct individually matched case-control studies for several mismeasured continuous exposures under a classical measurement error model.
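
    The classical measurement error model the method assumes (observed exposure = true exposure + independent noise) attenuates naive effect estimates in a predictable way. The following is a minimal frequentist illustration of the phenomenon the Bayesian machinery corrects for, using simulated data and ordinary linear regression for brevity, not the study's conditional-logistic setting:

```python
import numpy as np

# Classical measurement error: W = X + U. A naive slope estimated on W
# is attenuated by the reliability ratio var(X)/(var(X)+var(U));
# regression calibration divides the attenuation back out.
rng = np.random.default_rng(0)
n = 200_000
x = rng.normal(0.0, 1.0, n)            # true exposure, var = 1
u = rng.normal(0.0, 1.0, n)            # measurement error, var = 1
w = x + u                              # observed surrogate
y = 0.5 * x + rng.normal(0.0, 1.0, n)  # outcome, true slope 0.5

naive = np.cov(w, y)[0, 1] / np.var(w)   # attenuated, about 0.25 here
reliability = np.var(x) / np.var(w)      # about 0.5 here
corrected = naive / reliability          # recovers about 0.5
print(naive, corrected)
```

The Bayesian approach in the paper achieves the same correction coherently for several exposures at once, propagating the quality-control estimate of the error variance into the posterior rather than plugging it in.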

  15. Shuttle Space Suit: Fabric/LCVG Model Validation. Chapter 8

    NASA Technical Reports Server (NTRS)

    Wilson, J. W.; Tweed, J.; Zeitlin, C.; Kim, M.-H. Y.; Anderson, B. M.; Cucinotta, F. A.; Ware, J.; Persans, A. E.

    2003-01-01

    A detailed space suit computational model is being developed at the Langley Research Center for radiation exposure evaluation studies. The details of the construction of the space suit are critical to estimation of exposures and assessing the risk to the astronaut on EVA. Past evaluations of space suit shielding properties assumed the basic fabric layup (Thermal Micrometeoroid Garment, fabric restraints, and pressure envelope) and LCVG could be homogenized as a single layer overestimating the protective properties over 60 percent of the fabric area. The present space suit model represents the inhomogeneous distributions of LCVG materials (mainly the water filled cooling tubes). An experimental test is performed using a 34-MeV proton beam and high-resolution detectors to compare with model-predicted transmission factors. Some suggestions are made on possible improved construction methods to improve the space suit's protection properties.

  16. Shuttle Spacesuit: Fabric/LCVG Model Validation

    NASA Technical Reports Server (NTRS)

    Wilson, J. W.; Tweed, J.; Zeitlin, C.; Kim, M.-H. Y.; Anderson, B. M.; Cucinotta, F. A.; Ware, J.; Persans, A. E.

    2001-01-01

    A detailed spacesuit computational model is being developed at the Langley Research Center for radiation exposure evaluation studies. The details of the construction of the spacesuit are critical to estimation of exposures and assessing the risk to the astronaut on EVA. Past evaluations of spacesuit shielding properties assumed the basic fabric lay-up (Thermal Micrometeoroid Garment, fabric restraints, and pressure envelope) and Liquid Cooling and Ventilation Garment (LCVG) could be homogenized as a single layer overestimating the protective properties over 60 percent of the fabric area. The present spacesuit model represents the inhomogeneous distributions of LCVG materials (mainly the water filled cooling tubes). An experimental test is performed using a 34-MeV proton beam and high-resolution detectors to compare with model-predicted transmission factors. Some suggestions are made on possible improved construction methods to improve the spacesuit's protection properties.

  17. Numerical simulation of aerobic exercise as a countermeasure in human spaceflight

    NASA Astrophysics Data System (ADS)

    Perez-Poch, Antoni

    The objective of this work is to analyse the efficacy of long-term regular exercise on relevant cardiovascular parameters when the human body is also exposed to microgravity. Computer simulations are an important tool which may be used to predict and analyse these possible effects, and compare them with in-flight experiments. We based our study on an electrical-like computer model (NELME: Numerical Evaluation of Long-term Microgravity Effects) which was developed in our laboratory and validated with the available data, focusing on the cardiovascular parameters affected by changes in gravity exposure. NELME is based on an electrical-like control system model of the physiological changes that are known to take place when gravity changes are applied. The computer implementation has a modular architecture. Hence, different output parameters, potential effects, organs and countermeasures can be easily implemented and evaluated. We added to the previous cardiovascular system module a perturbation module to evaluate the effect of regular exercise on the output parameters previously studied. Therefore, we simulated a well-known countermeasure with different protocols of exercising, as a pattern of input electric-like perturbations on the basic module. Different scenarios have been numerically simulated for both men and women, in different patterns of microgravity, reduced gravity and time exposure. EVAs were also simulated as perturbations to the system. Results show slight differences by gender, with more risk reduction for women than for men after following an aerobic exercise pattern during a simulated mission. Risk reduction of a cardiovascular malfunction is also evaluated, with a ceiling effect found in all scenarios. A turning point in vascular resistance for long-term exposure to microgravity below 0.4g has been found to be of particular interest. 
In conclusion, we show that computer simulations are a valuable tool to analyse different effects of long-term microgravity exposure on the human body. Potential countermeasures such as physical exercise can also be evaluated as an induced perturbation into the system. Relevant results are compatible with existing data, and are of valuable interest as an assessment of the efficacy of aerobic exercise as a countermeasure in future missions to Mars.

  18. Experimental and numerical study on particle distribution in a two-zone chamber

    NASA Astrophysics Data System (ADS)

    Lai, Alvin C. K.; Wang, K.; Chen, F. Z.

    Better understanding of aerosol dynamics is an important step towards improving personal exposure assessments in indoor environments. Although the limitation of the assumptions in a well-mixed model is well known, there has been very little research reported in the published literature on the discrepancy in exposure assessments between numerical models that account for gravitational effects and the well-mixed model. A new Eulerian-type drift-flux model has been developed to simulate particle dispersion and personal exposure in a two-zone geometry, which accounts for the drift velocity resulting from gravitational settling and diffusion. To validate the numerical model, a small-scale chamber was fabricated. The airflow characteristics and particle concentrations were measured by a phase Doppler anemometer. Both simulated airflow and concentration profiles agree well with the experimental results. A strongly inhomogeneous concentration field was observed experimentally for 10 μm aerosols. The computational model was further applied to study a simple hypothetical, yet more realistic scenario. The aim was to explore the different levels of exposure predicted by the new model and the well-mixed model. Aerosols are initially uniformly distributed in one zone and subsequently transported and dispersed to an adjacent zone through an opening. Owing to the significant difference between aerosols and gases in their rates of transport and dispersion, the results indicate that the well-mixed model tends to over-predict the concentration in the source zone and under-predict the concentration in the exposed zone. The results illustrate that the well-mixed assumption must be applied cautiously in exposure assessments, as such an ideal condition may not hold for coarse particles.
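
    The qualitative gas-versus-coarse-particle contrast can be reproduced with a toy two-zone well-mixed mass balance in which settling appears as a first-order loss (geometry and rate constants are illustrative assumptions; the paper's drift-flux CFD model resolves the spatial gradients this sketch deliberately ignores):

```python
# Two-zone well-mixed mass balance with a first-order deposition loss
# beta = v_s/H for coarse particles (beta = 0 recovers the gas case).
def two_zone(beta, q_over_v=1.0, t_end=2.0, dt=1e-3):
    c1, c2 = 1.0, 0.0                # material starts in the source zone
    for _ in range(int(t_end / dt)):
        flow = q_over_v * (c1 - c2)  # interzonal exchange through the opening
        c1 += (-flow - beta * c1) * dt
        c2 += (flow - beta * c2) * dt
    return c1, c2

gas = two_zone(beta=0.0)
coarse = two_zone(beta=1.5)  # strong settling, e.g. ~10 um particles
print(gas, coarse)
```

Settling removes mass before it reaches the adjacent zone, so the coarse-particle concentration in the exposed zone falls below the gas value, the same direction of discrepancy the abstract reports between the drift-flux and well-mixed predictions.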

  19. Computational estimation of errors generated by lumping of physiologically-based pharmacokinetic (PBPK) interaction models of inhaled complex chemical mixtures

    EPA Science Inventory

    Many cases of environmental contamination result in concurrent or sequential exposure to more than one chemical. However, limitations of available resources make it unlikely that experimental toxicology will provide health risk information about all the possible mixtures to which...

  20. Computational Model of the Hypothalamic-pituitary-gonadal Axis to Predict Biochemical Adaptive Response to Endocrine Disrupting Fungicide Prochloraz

    EPA Science Inventory

    There is increasing evidence that exposure to endocrine disrupting chemicals can induce adverse effects on reproduction and development in both humans and wildlife. Recent studies report adaptive changes within exposed organisms in response to endocrine disrupting chemicals, and ...

  1. Modelling of polymer photodegradation for solar cell modules

    NASA Technical Reports Server (NTRS)

    Guillet, J. E.

    1982-01-01

    A computer program was developed which simulates the complex processes of photooxidation that take place in a polymer upon prolonged outdoor exposure, causing it to fail in photovoltaic and other applications. From an input data set of elementary reactions and rates, the method calculates the concentration profiles of all species over time.
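
    The core loop of such a program, marching species concentrations forward from a table of elementary reactions and rate constants, can be sketched as follows (the two-step mechanism is a toy stand-in for a real photooxidation scheme, not the program's actual input set):

```python
# Minimal mass-action kinetics integrator: each reaction is
# (rate constant, reactant names, product names).
reactions = [
    (0.02, ["PH"], ["R"]),          # photo-initiation: polymer C-H -> radical
    (0.05, ["R", "O2"], ["ROOH"]),  # oxidation to hydroperoxide
]
conc = {"PH": 1.0, "O2": 8.0, "R": 0.0, "ROOH": 0.0}

def step(conc, reactions, dt):
    delta = {s: 0.0 for s in conc}
    for k, reactants, products in reactions:
        rate = k
        for s in reactants:
            rate *= conc[s]        # mass-action rate law
        for s in reactants:
            delta[s] -= rate
        for s in products:
            delta[s] += rate
    return {s: conc[s] + delta[s] * dt for s in conc}

for _ in range(1000):
    conc = step(conc, reactions, dt=0.1)
print(conc)  # after 100 time units most of the polymer has photo-oxidised
```

Production codes replace the explicit Euler step with a stiff ODE solver, but the input format, a list of elementary reactions with rates, is exactly what the abstract describes.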

  2. Computational Modeling and Simulation of Developmental Toxicity: what can we learn from a virtual embryo? (RIVM, Brussels)

    EPA Science Inventory

    Developmental and Reproductive Toxicity (DART) testing is important for assessing the potential consequences of drug and chemical exposure on human health and well-being. Complexity of pregnancy and the reproductive cycle makes DART testing challenging and costly for traditional ...

  3. High-Throughput Analysis of Ovarian Cycle Disruption by Mixtures of Aromatase Inhibitors

    PubMed Central

    Golbamaki-Bakhtyari, Nazanin; Kovarich, Simona; Tebby, Cleo; Gabb, Henry A.; Lemazurier, Emmanuel

    2017-01-01

    Background: Combining computational toxicology with ExpoCast exposure estimates and ToxCast™ assay data gives us access to predictions of human health risks stemming from exposures to chemical mixtures. Objectives: We explored, through mathematical modeling and simulations, the size of potential effects of random mixtures of aromatase inhibitors on the dynamics of women's menstrual cycles. Methods: We simulated random exposures to millions of potential mixtures of 86 aromatase inhibitors. A pharmacokinetic model of intake and disposition of the chemicals predicted their internal concentration as a function of time (up to 2 y). A ToxCast™ aromatase assay provided concentration–inhibition relationships for each chemical. The resulting total aromatase inhibition was input to a mathematical model of the hormonal hypothalamus–pituitary–ovarian control of ovulation in women. Results: Above 10% inhibition of estradiol synthesis by aromatase inhibitors, noticeable (eventually reversible) effects on ovulation were predicted. Exposures to individual chemicals never led to such effects. In our best estimate, ∼10% of the combined exposures simulated had mild to catastrophic impacts on ovulation. A lower bound on that figure, obtained using an optimistic exposure scenario, was 0.3%. Conclusions: These results demonstrate the possibility to predict large-scale mixture effects for endocrine disrupters with a predictive toxicology approach that is suitable for high-throughput ranking and risk assessment. The size of the effects predicted is consistent with an increased risk of infertility in women from everyday exposures to our chemical environment. https://doi.org/10.1289/EHP742 PMID:28886606
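
    The mixture logic, in which many individually weak aromatase inhibitors combine into one effective inhibition, can be sketched under a concentration-addition assumption (a common simplification; the IC50s and internal concentrations below are invented for illustration, not ToxCast values):

```python
# Concentration addition: scale each chemical's internal concentration
# by its IC50 and drive a single Hill curve with the sum.
def total_inhibition(conc_ic50_pairs, hill=1.0):
    u = sum(c / ic50 for c, ic50 in conc_ic50_pairs)
    return u**hill / (1.0 + u**hill)   # fraction of aromatase inhibited

# (internal concentration in uM, IC50 in uM) -- illustrative values
mixture = [(0.05, 1.0), (0.02, 0.5), (0.001, 0.01)]
print(total_inhibition(mixture))
```

With these numbers no single component alone crosses the 10% inhibition level the abstract flags as the threshold for noticeable ovulation effects, but the mixture does, which is the essence of the reported mixture risk.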

  4. Air flow and concentration fields at urban road intersections for improved understanding of personal exposure.

    PubMed

    Tiwary, Abhishek; Robins, Alan; Namdeo, Anil; Bell, Margaret

    2011-07-01

    This paper reviews the state of knowledge on modelling air flow and concentration fields at road intersections. The first part covers the available literature from the past two decades on experimental (both field and wind tunnel) and modelling activities in order to provide insight into the physical basis of flow behaviour at a typical cross-street intersection. This is followed by a review of associated investigations of the impact of traffic-generated localised turbulence on the concentration fields due to emissions from vehicles. There is a discussion on the role of adequate characterisation of vehicle-induced turbulence in making predictions using hybrid models, combining the merits of conventional approaches with information obtained from more detailed modelling. The review concludes that, despite advancements in computational techniques, there are crucial knowledge gaps affecting the parameterisations used in current models for individual exposure. This is specifically relevant to the growing impetus on walking and cycling on urban roads in the context of current drives for sustainable transport and healthy living. Owing to the inherently longer travel times involved in such trips, compared with automotive transport, pedestrians and cyclists are subjected to higher levels of exposure to emissions. Current modelling tools seem to under-predict this exposure because of limitations in their design and in the empirical parameters employed. Copyright © 2011 Elsevier Ltd. All rights reserved.

  5. Stochastic modelling of human exposure to food chemicals and nutrients within the "Montecarlo" project: an exploration of the influence of brand loyalty and market share on intake estimates of intense sweeteners from sugar-free soft drinks.

    PubMed

    Leclercq, Catherine; Arcella, Davide; Le Donne, Cinzia; Piccinelli, Raffaela; Sette, Stefania; Soggiu, Maria Eleonora

    2003-04-11

    To get a more realistic view of exposure to food chemicals, risk managers are becoming more interested in stochastic modelling as an alternative to deterministic approaches based on conservative assumptions. It allows all the available information on the concentration of the chemical present in foods and on food consumption patterns to be taken into account. Within the EC-funded "Montecarlo" project, a comprehensive set of mathematical algorithms was developed to take into account all the necessary components for stochastic modelling of a variety of food chemicals, nutrients and ingredients. Appropriate computer software is being developed. Since the concentration of food chemicals may vary among different brands of the same product, consumer behaviour with respect to brands may have an impact on exposure assessments. Numerical experiments were carried out on different ways of incorporating indicators of market share and brand loyalty in the mathematical algorithms developed within the stochastic model of exposure to intense sweeteners from sugar-free beverages. The 95th percentiles of intake were shown to vary according to the inclusion/exclusion of these indicators. The market share should be included in the model, especially if the market is not equitably distributed between brands. If brand loyalty data are not available, the model may be run under theoretical scenarios.
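
    The effect of brand loyalty on upper-tail intake estimates can be demonstrated with a small Monte Carlo sketch (market shares, concentrations and consumption patterns are invented for illustration, not "Montecarlo" project data):

```python
import random

# Each simulated consumer drinks several servings per day; the sweetener
# concentration depends on the brand. Under full brand loyalty a consumer
# always buys one brand, sampled by market share; without loyalty each
# serving is re-sampled independently.
random.seed(1)
brands = [(0.7, 50.0), (0.3, 300.0)]   # (market share, mg/L)

def pick_brand():
    r, acc = random.random(), 0.0
    for share, conc in brands:
        acc += share
        if r < acc:
            return conc
    return brands[-1][1]

def daily_intake(loyal, drinks=4, volume_l=0.33):
    if loyal:
        conc = pick_brand()            # one brand for every serving
        return drinks * volume_l * conc
    return sum(volume_l * pick_brand() for _ in range(drinks))

pop_loyal = sorted(daily_intake(True) for _ in range(10_000))
pop_mixed = sorted(daily_intake(False) for _ in range(10_000))
p95 = lambda xs: xs[int(0.95 * len(xs))]
print(p95(pop_loyal), p95(pop_mixed))
```

Full loyalty concentrates some consumers entirely on the high-concentration brand, so the simulated 95th percentile rises relative to the no-loyalty scenario; this is exactly the sensitivity of high-percentile intake estimates to the loyalty indicator that the abstract reports.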

  6. Computed tomography assessment of peripubertal craniofacial morphology in a sheep model of binge alcohol drinking in the first trimester

    PubMed Central

    Birch, Sharla M.; Lenox, Mark W.; Kornegay, Joe N.; Shen, Li; Ai, Huisi; Ren, Xiaowei; Goodlett, Charles R.; Cudd, Tim A.; Washburn, Shannon E.

    2015-01-01

    Identification of facial dysmorphology is essential for the diagnosis of fetal alcohol syndrome (FAS); however, most children with fetal alcohol spectrum disorders (FASD) do not meet the dysmorphology criterion. Additional objective indicators are needed to help identify the broader spectrum of children affected by prenatal alcohol exposure. Computed tomography (CT) was used in a sheep model of prenatal binge alcohol exposure to test the hypothesis that quantitative measures of craniofacial bone volumes and linear distances could identify alcohol-exposed lambs. Pregnant sheep were randomly assigned to four groups: heavy binge alcohol, 2.5 g/kg/day (HBA); binge alcohol, 1.75 g/kg/day (BA); saline control (SC); and normal control (NC). Intravenous alcohol (BA; HBA) or saline (SC) infusions were given three consecutive days per week from gestation day 4–41, and a CT scan was performed on postnatal day 182. The volumes of eight skull bones, cranial circumference, and 19 linear measures of the face and skull were compared among treatment groups. Lambs from both alcohol groups showed significant reduction in seven of the eight skull bones and total skull bone volume, as well as cranial circumference. Alcohol exposure also decreased four of the 19 craniofacial measures. Discriminant analysis showed that alcohol-exposed and control lambs could be classified with high accuracy based on total skull bone volume, frontal, parietal, or mandibular bone volumes, cranial circumference, or interorbital distance. Total skull volume was significantly more sensitive than cranial circumference in identifying the alcohol-exposed lambs when alcohol-exposed lambs were classified using the typical FAS diagnostic cutoff of ≤10th percentile. 
This first demonstration of the usefulness of CT-derived craniofacial measures in a sheep model of FASD following binge-like alcohol exposure during the first trimester suggests that volumetric measurement of cranial bones may be a novel biomarker for binge alcohol exposure during the first trimester to help identify non-dysmorphic children with FASD. PMID:26496796
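    The ≤10th-percentile screening rule described above can be sketched as follows. This is an illustrative stand-in, not the study's code or data: the measurement values and the `sensitivity_at_10th` helper are hypothetical.

```python
# Sketch: flag subjects at or below the control group's 10th percentile
# (the typical FAS diagnostic cutoff) and compute sensitivity.
def percentile(values, p):
    """Linear-interpolation percentile (p in 0-100) of a list of numbers."""
    xs = sorted(values)
    k = (len(xs) - 1) * p / 100.0
    lo, hi = int(k), min(int(k) + 1, len(xs) - 1)
    return xs[lo] + (xs[hi] - xs[lo]) * (k - lo)

def sensitivity_at_10th(controls, exposed):
    """Fraction of exposed subjects falling at/below the controls' 10th percentile."""
    cutoff = percentile(controls, 10)
    flagged = sum(1 for v in exposed if v <= cutoff)
    return flagged / len(exposed)

# Hypothetical skull-bone volumes (arbitrary units), not the sheep data.
controls = [100, 102, 98, 105, 99, 101, 103, 97, 104, 100]
exposed = [88, 92, 95, 90, 99, 87, 91, 93]
rate = sensitivity_at_10th(controls, exposed)
```

Comparing `rate` across candidate measures (e.g. total skull volume vs. cranial circumference) is the kind of sensitivity comparison the abstract reports.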

  7. Dynamic Evaluation of Two Decades of CMAQ Simulations ...

    EPA Pesticide Factsheets

    This presentation focuses on the dynamic evaluation of the CMAQ model over the continental United States using multi-decadal simulations for the period from 1990 to 2010 to examine how well the changes in observed ozone air quality induced by variations in meteorology and/or emissions are simulated by the model. We applied spectral decomposition of the ozone time-series using the KZ filter to assess the variations in the strengths of synoptic (weather-induced variations) and baseline (long-term variation) forcings, embedded in the simulated and observed concentrations. The results reveal that CMAQ captured the year-to-year variability (more so in the later years than the earlier years) and the synoptic forcing in accordance with the observations. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.
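    The Kolmogorov-Zurbenko (KZ) filter used for the spectral decomposition above is an iterated centred moving average; the baseline is the filtered series and the synoptic component is the residual. A minimal sketch, with illustrative window parameters rather than those used in the CMAQ evaluation:

```python
# KZ(m, k): a centred moving average of window m, applied k times.
import math

def moving_average(x, m):
    half = m // 2
    out = []
    for i in range(len(x)):
        lo, hi = max(0, i - half), min(len(x), i + half + 1)
        out.append(sum(x[lo:hi]) / (hi - lo))
    return out

def kz_filter(x, m, k):
    for _ in range(k):
        x = moving_average(x, m)
    return x

# Synthetic daily ozone-like series: slow seasonal cycle + fast synoptic wiggle.
series = [50 + 10 * math.sin(2 * math.pi * t / 365)
          + 5 * math.sin(2 * math.pi * t / 7)
          for t in range(730)]
baseline = kz_filter(series, m=15, k=3)   # smooths out the weekly-scale forcing
synoptic = [s - b for s, b in zip(series, baseline)]
```

The same decomposition applied to observed and simulated concentrations lets the two forcing strengths be compared term by term.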

  8. Dermal uptake of phthalates from clothing: Comparison of model to human participant results.

    PubMed

    Morrison, G C; Weschler, C J; Bekö, G

    2017-05-01

    In this research, we extend a model of transdermal uptake of phthalates to include a layer of clothing. When compared with experimental results, this model better estimates dermal uptake of diethyl phthalate and di-n-butyl phthalate (DnBP) than a previous model. The model predictions are consistent with the observation that previously exposed clothing can increase dermal uptake over that observed in bare-skin participants for the same exposure air concentrations. The model predicts that dermal uptake from clothing of DnBP is a substantial fraction of total uptake from all sources of exposure. For compounds that have high dermal permeability coefficients, dermal uptake is increased for (i) thinner clothing, (ii) a narrower gap between clothing and skin, and (iii) longer time intervals between laundering and wearing. Enhanced dermal uptake is most pronounced for compounds with clothing-air partition coefficients between 10^4 and 10^7. In the absence of direct measurements of cotton cloth-air partition coefficients, dermal exposure may be predicted using equilibrium data for compounds in equilibrium with cellulose and water, in combination with computational methods of predicting partition coefficients. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
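    A minimal resistances-in-series view of steady-state transport through a clothing layer can illustrate why thinner clothing and a narrower air gap raise uptake. All coefficients below are hypothetical; the published model is considerably more detailed, treating clothing as a sorptive reservoir governed by its partition coefficient.

```python
# Illustrative sketch: combine boundary-layer, clothing, and skin
# mass-transfer coefficients (m/h) as resistances in series.
def overall_transfer_coeff(k_gap, k_cloth, k_skin):
    return 1.0 / (1.0 / k_gap + 1.0 / k_cloth + 1.0 / k_skin)

def uptake_rate(c_air, k_total, area):
    """Steady-state dermal flux (ug/h) for air concentration c_air (ug/m^3)."""
    return c_air * k_total * area

# Hypothetical values: thinner clothing / narrower gap -> larger k_cloth / k_gap.
k_total = overall_transfer_coeff(k_gap=3.0, k_cloth=0.5, k_skin=0.2)
rate = uptake_rate(c_air=1.0, k_total=k_total, area=1.8)
```

The smallest coefficient dominates: here skin permeability limits uptake, so the compounds with high dermal permeability mentioned in the abstract are exactly those for which the clothing terms start to matter.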

  9. Cellular burdens and biological effects on tissue level caused by inhaled radon progenies.

    PubMed

    Madas, B G; Balásházy, I; Farkas, Á; Szoke, I

    2011-02-01

    In the case of radon exposure, the spatial distribution of deposited radioactive particles is highly inhomogeneous in the central airways. The object of this research is to investigate the consequences of this heterogeneity regarding cellular burdens in the bronchial epithelium and to study the possible biological effects at tissue level. Applying computational fluid and particle dynamics techniques, the deposition distribution of inhaled radon daughters has been determined in a bronchial airway model for 23 min of work in the New Mexico uranium mine corresponding to 0.0129 WLM exposure. A numerical epithelium model based on experimental data has been utilised in order to quantify cellular hits and doses. Finally, a carcinogenesis model considering cell death-induced cell-cycle shortening has been applied to assess the biological responses. Present computations reveal that cellular dose may reach 1.5 Gy, which is several orders of magnitude higher than tissue dose. The results are in agreement with the histological finding that the uneven deposition distribution of radon progenies may lead to inhomogeneous spatial distribution of tumours in the bronchial airways. In addition, at the macroscopic level, the relationship between cancer risk and radiation burden seems to be non-linear.

  10. Computational and Organotypic Modeling of Microcephaly ...

    EPA Pesticide Factsheets

    Microcephaly is associated with reduced cortical surface area and ventricular dilations. Many genetic and environmental factors precipitate this malformation, including prenatal alcohol exposure and maternal Zika infection. This complexity motivates the engineering of computational and experimental models to probe the underlying molecular targets, cellular consequences, and biological processes. We describe an Adverse Outcome Pathway (AOP) framework for microcephaly derived from literature on all gene-, chemical-, or viral- effects and brain development. Overlap with neural tube defects (NTDs) is likely, although the AOP connections identified here focused on microcephaly as the adverse outcome. A query of the Mammalian Phenotype Browser database for ‘microcephaly’ (MP:0000433) returned 85 gene associations; several function in microtubule assembly and the centrosome cycle regulated by microcephalin (MCPH1), a gene for primary microcephaly in humans. The developing ventricular zone is the likely target. In this zone, neuroprogenitor cells (NPCs) self-replicate during the 1st trimester, setting brain size, followed by neural differentiation of the neocortex. Recent studies with human NPCs confirmed infectivity with Zika virions invoking critical cell loss (apoptosis) of precursor NPCs; similar findings have been shown with fetal alcohol or methylmercury exposure in rodent studies, leading to mathematical models of NPC dynamics in size determination of the ventricular zone. A key event

  11. The UF family of hybrid phantoms of the developing human fetus for computational radiation dosimetry

    NASA Astrophysics Data System (ADS)

    Maynard, Matthew R.; Geyer, John W.; Aris, John P.; Shifrin, Roger Y.; Bolch, Wesley

    2011-08-01

    Historically, the development of computational phantoms for radiation dosimetry has primarily been directed at capturing and representing adult and pediatric anatomy, with less emphasis devoted to models of the human fetus. As concern grows over possible radiation-induced cancers from medical and non-medical exposures of the pregnant female, the need to better quantify fetal radiation doses, particularly at the organ-level, also increases. Studies such as the European Union's SOLO (Epidemiological Studies of Exposed Southern Urals Populations) hope to improve our understanding of cancer risks following chronic in utero radiation exposure. For projects such as SOLO, currently available fetal anatomic models do not provide sufficient anatomical detail for organ-level dose assessment. To address this need, two fetal hybrid computational phantoms were constructed using high-quality magnetic resonance imaging and computed tomography image sets obtained for two well-preserved fetal specimens aged 11.5 and 21 weeks post-conception. Individual soft tissue organs, bone sites and outer body contours were segmented from these images using 3D-DOCTOR™ and then imported to the 3D modeling software package Rhinoceros™ for further modeling and conversion of soft tissue organs, certain bone sites and outer body contours to deformable non-uniform rational B-spline surfaces. The two specimen-specific phantoms, along with a modified version of the 38 week UF hybrid newborn phantom, comprised a set of base phantoms from which a series of hybrid computational phantoms was derived for fetal ages 8, 10, 15, 20, 25, 30, 35 and 38 weeks post-conception. 
The methodology used to construct the series of phantoms accounted for the following age-dependent parameters: (1) variations in skeletal size and proportion, (2) bone-dependent variations in relative levels of bone growth, (3) variations in individual organ masses and total fetal masses and (4) statistical percentile variations in skeletal size, individual organ masses and total fetal masses. The resulting series of fetal hybrid computational phantoms is applicable to organ-level and bone-level internal and external radiation dosimetry for human fetuses of various ages and weight percentiles.

  12. A modelling exercise to examine variations of NOx concentrations on adjacent footpaths in a street canyon: The importance of accounting for wind conditions and fleet composition.

    PubMed

    Gallagher, J

    2016-04-15

    Personal measurement studies and modelling investigations are used to examine pollutant exposure for pedestrians in the urban environment: each presenting various strengths and weaknesses in relation to labour and equipment costs, a sufficient sampling period and the accuracy of results. This modelling exercise considers the potential benefits of modelling results over personal measurement studies and aims to demonstrate how variations in fleet composition affect exposure results (presented as mean concentrations along the centre of both footpaths) in different traffic scenarios. A model of Pearse Street in Dublin, Ireland was developed by combining a computational fluid dynamic (CFD) model and a semi-empirical equation to simulate pollutant dispersion in the street. Using local NOx concentrations, traffic and meteorological data from a two-week period in 2011, the model was validated and showed a good fit. To explore the long-term variations in personal exposure due to variations in fleet composition, synthesised traffic data was used to compare short-term personal exposure data (over a two-week period) with the results for an extended one-year period. Personal exposure during the two-week period underestimated the one-year results by between 8% and 65% on adjacent footpaths. The findings demonstrate the potential for relative differences in pedestrian exposure to exist between the north and south footpaths due to changing wind conditions in both peak and off-peak traffic scenarios. This modelling approach may help overcome potential under- or over-estimations of concentrations in personal measurement studies on the footpaths. Further research aims to measure pollutant concentrations on adjacent footpaths in different traffic and wind conditions and to develop a simpler modelling system to identify pollutant hotspots on our city footpaths so that urban planners can implement improvement strategies to improve urban air quality.
Copyright © 2016 Elsevier B.V. All rights reserved.

  13. Investigating the role of transportation models in epidemiologic studies of traffic related air pollution and health effects.

    PubMed

    Shekarrizfard, Maryam; Valois, Marie-France; Goldberg, Mark S; Crouse, Dan; Ross, Nancy; Parent, Marie-Elise; Yasmin, Shamsunnahar; Hatzopoulou, Marianne

    2015-07-01

    In two earlier case-control studies conducted in Montreal, nitrogen dioxide (NO2), a marker for traffic-related air pollution was found to be associated with the incidence of postmenopausal breast cancer and prostate cancer. These studies relied on a land use regression model (LUR) for NO2 that is commonly used in epidemiologic studies for deriving estimates of traffic-related air pollution. Here, we investigate the use of a transportation model developed during the summer season to generate a measure of traffic emissions as an alternative to the LUR model. Our traffic model provides estimates of emissions of nitrogen oxides (NOx) at the level of individual roads, as does the LUR model. Our main objective was to compare the distribution of the spatial estimates of NOx computed from our transportation model to the distribution obtained from the LUR model. A secondary objective was to compare estimates of risk using these two exposure estimates. We observed that the correlation (Spearman) between our two measures of exposure (NO2 and NOx) ranged from less than 0.3 to more than 0.9 across Montreal neighborhoods. The most important factor affecting the "agreement" between the two measures in a specific area was found to be the length of roads. Areas affected by a high level of traffic-related air pollution had a far better agreement between the two exposure measures. A comparison of odds ratios (ORs) obtained from NO2 and NOx used in two case-control studies of breast and prostate cancer showed that the ORs associated with NO2 exposure vs NOx exposure differed by 5.2-8.8%. Copyright © 2015 Elsevier Inc. All rights reserved.
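    The Spearman correlation used to compare the two exposure surfaces is the Pearson correlation of ranks. A self-contained sketch with made-up paired values (not the Montreal data):

```python
# Spearman rank correlation, with average ranks for ties.
def ranks(values):
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2.0 + 1.0          # average 1-based rank for the tie group
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

no2 = [10, 12, 15, 20, 25, 30]            # hypothetical LUR NO2 estimates
nox = [22, 25, 33, 41, 58, 60]            # hypothetical traffic-model NOx
rho = spearman(no2, nox)
```

Computing `rho` neighbourhood by neighbourhood reproduces the kind of 0.3-0.9 spread the abstract describes.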

  14. Meta-analysis of cranial CT scans in children. A mathematical model to predict radiation-induced tumors.

    PubMed

    Stein, Sherman C; Hurst, Robert W; Sonnad, Seema S

    2008-01-01

    We aimed to estimate the risks of radiation exposure from a single head CT scan to children of different ages. We constructed a multistate time-dependent Markov model to simulate the course of children exposed to a head CT. The relevant literature was reviewed for probabilities, which were used to calculate tumor types, latencies after exposure and outcomes in the model. Where multiple approximations of the same probability had been reported, meta-analytic techniques were employed to compute pooled estimates. The model was then used to calculate the effect of the radiation exposure on life expectancy and quality of life for children following head CT at different ages. The tumors likely to be induced by low-level cranial irradiation include thyroid carcinoma (47%), meningioma (34%) and glioma (19%). According to the model, a single head CT is likely to cause one of these tumors in 0.22% of 1-year-olds, 30% of whom will consequently die. The exposure will shorten the life expectancy of all exposed 1-year-olds by an average of 0.04 years and their expected quality of life by 0.02 quality-adjusted life years. The risks of radiation exposure diminish for older children. The model predicts that the effective radiation dose from a single head CT is capable of inducing a thyroid or brain tumor in an infant or child. These tumors can severely impact both quality of life and life expectancy. Care should be taken before ordering CT scans in children, particularly in infants and toddlers. Copyright 2008 S. Karger AG, Basel.
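    The headline figures above can be sanity-checked with simple arithmetic on the abstract's own numbers; this is not the multistate Markov model itself, only a back-of-the-envelope consistency check.

```python
# Arithmetic on the reported figures for a 1-year-old head CT.
tumor_risk = 0.0022            # 0.22% of exposed 1-year-olds develop a tumor
fatality_given_tumor = 0.30    # 30% of those tumors are fatal

# Implied radiation-attributable deaths per 10,000 scans in 1-year-olds.
deaths_per_10k_scans = 10_000 * tumor_risk * fatality_given_tumor

# Reported tumor-type shares should account for the whole induced-tumor mix.
tumor_shares = {"thyroid carcinoma": 0.47, "meningioma": 0.34, "glioma": 0.19}
```

The product works out to roughly 6-7 deaths per 10,000 scans, which is consistent with the reported 0.04-year average loss of life expectancy being small per child but non-trivial across a population.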

  15. Validation of Aircraft Noise Models at Lower Levels of Exposure

    NASA Technical Reports Server (NTRS)

    Page, Juliet A.; Plotkin, Kenneth J.; Carey, Jeffrey N.; Bradley, Kevin A.

    1996-01-01

    Noise levels around airports and airbases in the United States are computed via the FAA's Integrated Noise Model (INM) or the Air Force's NOISEMAP (NMAP) program. These models were originally developed for use in the vicinity of airports, at distances which encompass a day-night average sound level in decibels (Ldn) of 65 dB or higher. There is increasing interest in aircraft noise at larger distances from the airport, including en-route noise. To evaluate the applicability of INM and NMAP at larger distances, a measurement program was conducted at a major air carrier airport with monitoring sites located in areas exposed to an Ldn of 55 dB and higher. Automated Radar Terminal System (ARTS) radar tracking data were obtained to provide actual flight parameters and positive identification of aircraft. Flight operations were grouped according to aircraft type, stage length, straight versus curved flight tracks, and arrival versus departure. Sound exposure levels (SEL) were computed at monitoring locations, using the INM, and compared with measured values. While individual overflight SEL data were characterized by high variance, analysis performed on an energy-averaging basis indicates that INM and similar models can be applied to regions exposed to an Ldn of 55 dB with no loss of reliability.
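    Energy-averaging, as used in the comparison above, averages acoustic energy (10^(L/10)) rather than the decibel values themselves. A short sketch with hypothetical SELs:

```python
# Energy-average a set of sound exposure levels (dB): convert to energy,
# take the arithmetic mean, and convert back to decibels.
import math

def energy_average_db(levels_db):
    energies = [10 ** (level / 10.0) for level in levels_db]
    return 10.0 * math.log10(sum(energies) / len(energies))

sels = [78.0, 80.0, 85.0]      # hypothetical overflight SELs at one monitor
avg = energy_average_db(sels)
```

Because the loudest events dominate the energy sum, the energy average always exceeds the plain arithmetic mean of the decibel values, which is why it is the appropriate basis for comparing modelled and measured exposure.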

  16. Life-Stage Physiologically-Based Pharmacokinetic (PBPK) ...

    EPA Pesticide Factsheets

    This presentation discusses methods used to extrapolate from in vitro high-throughput screening (HTS) toxicity data for an endocrine pathway to in vivo for early life stages in humans, and the use of a life stage PBPK model to address rapidly changing physiological parameters. Adverse outcome pathways (AOPs), in this case endocrine disruption during development, provide a biologically-based framework for linking molecular initiating events triggered by chemical exposures to key events leading to adverse outcomes. The application of AOPs to human health risk assessment requires extrapolation of in vitro HTS toxicity data to in vivo exposures (IVIVE) in humans, which can be achieved through the use of a PBPK/PD model. Exposure scenarios for chemicals in the PBPK/PD model will consider both placental and lactational transfer of chemicals, with a focus on age dependent dosimetry during fetal development and after birth for a nursing infant. This talk proposes a universal life-stage computational model that incorporates changing physiological parameters to link environmental exposures to in vitro levels of HTS assays related to a developmental toxicological AOP for vascular disruption. In vitro toxicity endpoints discussed are based on two mechanisms: 1) Fetal vascular disruption, and 2) Neurodevelopmental toxicity induced by altering thyroid hormone levels in neonates via inhibition of thyroperoxidase in the thyroid gland. Application of our Life-stage computati

  17. Quantifying the association between white matter integrity changes and subconcussive head impact exposure from a single season of youth and high school football using 3D convolutional neural networks

    NASA Astrophysics Data System (ADS)

    Saghafi, Behrouz; Murugesan, Gowtham; Davenport, Elizabeth; Wagner, Ben; Urban, Jillian; Kelley, Mireille; Jones, Derek; Powers, Alexander; Whitlow, Christopher; Stitzel, Joel; Maldjian, Joseph; Montillo, Albert

    2018-02-01

    The effect of subconcussive head impact exposure during contact sports, including American football, on brain health is poorly understood particularly in young and adolescent players, who may be more vulnerable to brain injury during periods of rapid brain maturation. This study aims to quantify the association between cumulative effects of head impact exposure from a single season of football on white matter (WM) integrity as measured with diffusion MRI. The study targets football players aged 9-18 years. All players were imaged pre- and post-season with structural MRI and diffusion tensor MRI (DTI). Fractional Anisotropy (FA) maps, shown to be closely correlated with WM integrity, were computed for each subject, co-registered and subtracted to compute the change in FA per subject. Biomechanical metrics were collected at every practice and game using helmet mounted accelerometers. Each head impact was converted into a risk of concussion, and the risk of concussion-weighted cumulative exposure (RWE) was computed for each player for the season. Athletes with high and low RWE were selected for a two-category classification task. This task was addressed by developing a 3D Convolutional Neural Network (CNN) to automatically classify players into high and low impact exposure groups from the change in FA maps. Using the proposed model, high classification performance, including ROC Area Under Curve score of 85.71% and F1 score of 83.33% was achieved. This work adds to the growing body of evidence for the presence of detectable neuroimaging brain changes in white matter integrity from a single season of contact sports play, even in the absence of a clinically diagnosed concussion.
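    Risk-weighted cumulative exposure (RWE) sums a per-impact concussion probability over the season. The logistic risk curve below is a hypothetical stand-in for the study's calibrated biomechanical risk function, and the impact magnitudes are made up:

```python
# Sketch of RWE: map each head impact to a concussion probability and
# accumulate those probabilities over a season.
import math

def concussion_risk(linear_accel_g, alpha=-10.0, beta=0.08):
    """Hypothetical logistic risk vs. peak linear acceleration (g)."""
    return 1.0 / (1.0 + math.exp(-(alpha + beta * linear_accel_g)))

def season_rwe(impacts_g):
    return sum(concussion_risk(a) for a in impacts_g)

season = [20, 35, 60, 25, 90, 40]   # hypothetical peak accelerations (g)
rwe = season_rwe(season)
```

Players in the tails of the resulting RWE distribution form the high- and low-exposure groups fed to the CNN classifier.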

  18. Bladder cancer mortality trends and patterns in Córdoba, Argentina (1986-2006).

    PubMed

    Pou, Sonia Alejandra; Osella, Alberto Ruben; Diaz, Maria Del Pilar

    2011-03-01

    Bladder cancer is common worldwide and the fourth most commonly diagnosed malignancy in men in Argentina. To describe bladder cancer mortality trends in Córdoba (1986-2006), considering the effect of age, period, and cohort, and to estimate the effect of arsenic exposure on bladder cancer, and its interaction with sex, while controlling by smoking habits and space and time variation of the rates. A joinpoint regression was performed to compute the estimated annual percentage changes (EAPC) of the age-standardized mortality rates (ASMR) in an adult population from Córdoba, Argentina. A Poisson model was fitted to estimate the effect of age, period, and cohort. The influence of gender, tobacco smoking (using lung cancer ASMR as surrogate), and arsenic in drinking water was examined using a hierarchical model. A favorable trend (1986-2006) in bladder cancer ASMR in both sexes was found: EAPC of -2.54 in men and -1.69 in women. There was a decreasing trend in relative risk (RR) for cohorts born in 1931 or after. The multilevel model showed an increasing risk for each increase in lung cancer ASMR unit (RR = 1.001) and a biological interaction between sex and arsenic exposure. RR was higher among men exposed to increasing As-exposure categories (RR male low exposure 3.14, RR male intermediate exposure 4.03, RR male high exposure 4.71 versus female low exposure). A non-random space-time distribution of the rates was observed. There has been a decreasing trend in ASMR for bladder cancer in Córdoba. This study confirms that bladder cancer is associated with age, gender, smoking habit, and exposure to arsenic. Moreover, an effect measure modification between exposure to arsenic and sex was found.
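    The estimated annual percentage change (EAPC) reported above is conventionally obtained from a log-linear fit of the age-standardized rates against calendar year: EAPC = 100 × (exp(b) − 1), where b is the regression slope of ln(rate). A sketch with fabricated rates constructed to have a known trend:

```python
# EAPC from an ordinary least-squares fit of ln(rate) = a + b * year.
import math

def eapc(years, rates):
    n = len(years)
    ys = [math.log(r) for r in rates]
    mx = sum(years) / n
    my = sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(years, ys)) / \
        sum((x - mx) ** 2 for x in years)
    return 100.0 * (math.exp(b) - 1.0)

# Fabricated ASMR series declining exactly 2.54% per year (for illustration).
years = list(range(1986, 1991))
rates = [8.0 * (1 - 0.0254) ** (y - 1986) for y in years]
```

Joinpoint regression extends this by letting the slope b change at estimated break years, so each segment gets its own EAPC.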

  19. Improvements to the Ionizing Radiation Risk Assessment Program for NASA Astronauts

    NASA Technical Reports Server (NTRS)

    Semones, E. J.; Bahadori, A. A.; Picco, C. E.; Shavers, M. R.; Flores-McLaughlin, J.

    2011-01-01

    To perform dosimetry and risk assessment, NASA collects astronaut ionizing radiation exposure data from space flight, medical imaging and therapy, aviation training activities and prior occupational exposure histories. Career risk of exposure induced death (REID) from radiation is limited to 3 percent at a 95 percent confidence level. The Radiation Health Office at Johnson Space Center (JSC) is implementing a program to integrate the gathering, storage, analysis and reporting of astronaut ionizing radiation dose and risk data and records. This work has several motivations, including more efficient analyses and greater flexibility in testing and adopting new methods for evaluating risks. The foundation for these improvements is a set of software tools called the Astronaut Radiation Exposure Analysis System (AREAS). AREAS is a series of MATLAB(Registered TradeMark)-based dose and risk analysis modules that interface with an enterprise level SQL Server database by means of a secure web service. It communicates with other JSC medical and space weather databases to maintain data integrity and consistency across systems. AREAS is part of a larger NASA Space Medicine effort, the Mission Medical Integration Strategy, with the goal of collecting accurate, high-quality and detailed astronaut health data, and then presenting it to medical support personnel securely, promptly and reliably. The modular approach to the AREAS design accommodates past, current, and future sources of data from active and passive detectors, space radiation transport algorithms, computational phantoms and cancer risk models. Revisions of the cancer risk model, new radiation detection equipment and improved anthropomorphic computational phantoms can be incorporated. 
Notable hardware updates include the Radiation Environment Monitor (which uses Medipix technology to report real-time, on-board dosimetry measurements), an updated Tissue-Equivalent Proportional Counter, and the Southwest Research Institute Radiation Assessment Detector. Also, the University of Florida hybrid phantoms, which are flexible in morphometry and positioning, are being explored as alternatives to the current NASA computational phantoms.

  20. Data-driven nonlinear optimisation of a simple air pollution dispersion model generating high resolution spatiotemporal exposure

    NASA Astrophysics Data System (ADS)

    Yuval; Bekhor, Shlomo; Broday, David M.

    2013-11-01

    Spatially detailed estimation of exposure to air pollutants in the urban environment is needed for many air pollution epidemiological studies. To benefit studies of acute effects of air pollution, such exposure maps are required at high temporal resolution. This study introduces a nonlinear optimisation framework that produces high resolution spatiotemporal exposure maps. An extensive traffic model output, serving as a proxy for traffic emissions, is fitted, via a nonlinear model embodying basic dispersion properties, to high temporal resolution routine observations of a traffic-related air pollutant. An optimisation problem is formulated and solved at each time point to recover the unknown model parameters. These parameters are then used to produce a detailed concentration map of the pollutant for the whole area covered by the traffic model. Repeating the process for multiple time points results in the spatiotemporal concentration field. The exposure at any location and for any span of time can then be computed by temporal integration of the concentration time series at selected receptor locations for the durations of desired periods. The methodology is demonstrated for NO2 exposure using the output of a traffic model for the greater Tel Aviv area, Israel, and the half-hourly monitoring and meteorological data from the local air quality network. A leave-one-out cross-validation resulted in simulated half-hourly concentrations that are almost unbiased compared to the observations, with a mean error (ME) of 5.2 ppb, a normalised mean error (NME) of 32%, 78% of the simulated values within a factor of two (FAC2) of the observations, and a coefficient of determination (R2) of 0.6. The whole-study-period integrated exposure estimates are also unbiased compared with their corresponding observations, with an ME of 2.5 ppb, an NME of 18%, a FAC2 of 100% and an R2 of 0.62.
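    The reported validation statistics can be reproduced from paired observed/simulated values. In this sketch ME is taken as the mean absolute (gross) error, and the paired values are made up:

```python
# ME, NME, FAC2, and R^2 for paired observed (obs) and simulated (sim) values.
def validation_metrics(obs, sim):
    n = len(obs)
    abs_errors = [abs(s - o) for s, o in zip(sim, obs)]
    me = sum(abs_errors) / n                                   # mean abs. error
    nme = 100.0 * sum(abs_errors) / sum(obs)                   # % of total obs
    fac2 = sum(1 for s, o in zip(sim, obs) if 0.5 <= s / o <= 2.0) / n
    mo = sum(obs) / n
    ss_res = sum((o - s) ** 2 for o, s in zip(obs, sim))
    ss_tot = sum((o - mo) ** 2 for o in obs)
    r2 = 1.0 - ss_res / ss_tot                                 # coeff. of determination
    return me, nme, fac2, r2

obs = [10.0, 20.0, 30.0, 40.0]   # hypothetical half-hourly NO2 observations (ppb)
sim = [12.0, 18.0, 33.0, 38.0]   # hypothetical model output (ppb)
me, nme, fac2, r2 = validation_metrics(obs, sim)
```

In a leave-one-out cross-validation, each monitoring station is withheld in turn, the model is refit on the rest, and these metrics are accumulated over the withheld predictions.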

  1. Coupled Neutron Transport for HZETRN

    NASA Technical Reports Server (NTRS)

    Slaba, Tony C.; Blattnig, Steve R.

    2009-01-01

    Exposure estimates inside space vehicles, surface habitats, and high-altitude aircraft exposed to space radiation are highly influenced by secondary neutron production. The deterministic transport code HZETRN has been identified as a reliable and efficient tool for such studies, but improvements to the underlying transport models and numerical methods are still necessary. In this paper, the forward-backward (FB) and directionally coupled forward-backward (DC) neutron transport models are derived, numerical methods for the FB model are reviewed, and a computationally efficient numerical solution is presented for the DC model. Both models are compared to the Monte Carlo codes HETC-HEDS, FLUKA, and MCNPX, and the DC model is shown to agree closely with the Monte Carlo results. Finally, it is found in the development of either model that the decoupling of low energy neutrons from the light particle transport procedure adversely affects low energy light ion fluence spectra and exposure quantities. A first order correction is presented to resolve the problem, and it is shown to be both accurate and efficient.

  2. Modelling deuterium release from tungsten after high flux high temperature deuterium plasma exposure

    NASA Astrophysics Data System (ADS)

    Grigorev, Petr; Matveev, Dmitry; Bakaeva, Anastasiia; Terentyev, Dmitry; Zhurkin, Evgeny E.; Van Oost, Guido; Noterdaeme, Jean-Marie

    2016-12-01

    Tungsten is a primary candidate for plasma facing materials for future fusion devices. An important safety concern in the design of plasma facing components is the retention of hydrogen isotopes. Available experimental data is vast and scattered, and a consistent physical model of retention of hydrogen isotopes in tungsten is still missing. In this work we propose a model of non-equilibrium hydrogen isotopes trapping under fusion relevant plasma exposure conditions. The model is coupled to a diffusion-trapping simulation tool and is used to interpret recent experiments involving high plasma flux exposures. From the computational analysis performed, it is concluded that high flux high temperature exposures (T = 1000 K, flux = 10^24 D/m^2/s and fluence of 10^26 D/m^2) result in generation of sub-surface damage and bulk diffusion, so that the retention is driven by both sub-surface plasma-induced defects (bubbles) and trapping at natural defects. On the basis of the non-equilibrium trapping model we have estimated the amount of H stored in the sub-surface region to be ∼10^-5 at^-1, while the bulk retention is about 4 × 10^-7 at^-1, calculated by assuming the sub-surface layer thickness of about 10 μm and adjusting the trap concentration to comply with the experimental results for the integral retention.

  3. Naphthalene distributions and human exposure in Southern California

    NASA Astrophysics Data System (ADS)

    Lu, Rong; Wu, Jun; Turco, Richard P.; Winer, Arthur M.; Atkinson, Roger; Arey, Janet; Paulson, Suzanne E.; Lurmann, Fred W.; Miguel, Antonio H.; Eiguren-Fernandez, Arantzazu

    The regional distribution of, and human exposure to, naphthalene are investigated for Southern California. A comprehensive approach is taken in which advanced models are linked for the first time to quantify population exposure to the emissions of naphthalene throughout Southern California. Naphthalene is the simplest and most abundant of the polycyclic aromatic hydrocarbons found in polluted urban environments, and has been detected in both outdoor and indoor air samples. Exposure to high concentrations of naphthalene may have adverse health effects, possibly causing cancer in humans. Among the significant emission sources are volatilization from naphthalene-containing products, petroleum refining, and combustion of fossil fuels and wood. Gasoline and diesel engine exhaust, with related vaporization from fuels, are found to contribute roughly half of the daily total naphthalene burden in Southern California. As part of this study, the emission inventory for naphthalene has been verified against new field measurements of the naphthalene-to-benzene ratio in a busy traffic tunnel in Los Angeles, supporting the modeling work carried out here. The Surface Meteorology and Ozone Generation (SMOG) airshed model is used to compute the spatial and temporal distributions of naphthalene and its photooxidation products in Southern California. The present simulations reveal a high degree of spatial variability in the concentrations of naphthalene-related species, with large diurnal and seasonal variations as well. Peak naphthalene concentrations are estimated to occur in the early morning hours in the winter season. The naphthalene concentration estimates obtained from the SMOG model are employed in the Regional Human Exposure (REHEX) model to calculate population exposure statistics. Results show average hourly naphthalene exposures in Southern California under summer and winter conditions of 270 and 430 ng m^-3, respectively.
Exposure to significantly higher concentrations may occur for individuals close to local sources, or in naphthalene "hotspots" revealed by simulations and observations. Such levels of naphthalene exposure may be used to gauge the potential health impacts of long-term naphthalene exposure. Results are also given for the distributions of 1,4-naphthoquinone, a naphthalene reaction product that may have significant health effects.

  4. Computer aided radiation analysis for manned spacecraft

    NASA Technical Reports Server (NTRS)

    Appleby, Matthew H.; Griffin, Brand N.; Tanner, Ernest R., II; Pogue, William R.; Golightly, Michael J.

    1991-01-01

    In order to assist in the design of radiation shielding, an analytical tool is presented that can be employed in combination with CAD facilities and NASA transport codes. The nature of radiation in space is described, and the operational requirements for protection are listed as background for the use of the technique. The method is based on the Boeing radiation exposure model (BREM), which combines NASA radiation transport codes with CAD facilities; the output is given as contour maps of the radiation-shield distribution so that dangerous areas can be identified. Computational models are used to solve the 1D Boltzmann transport equation and determine the shielding needs for the worst-case scenario. BREM can be employed directly with the radiation computations to assess radiation protection during all phases of design, which saves time and, ultimately, spacecraft weight.

  5. A new model of sensorimotor coupling in the development of speech.

    PubMed

    Westermann, Gert; Reck Miranda, Eduardo

    2004-05-01

    We present a computational model that learns a coupling between motor parameters and their sensory consequences in vocal production during a babbling phase. Based on the coupling, preferred motor parameters and prototypically perceived sounds develop concurrently. Exposure to an ambient language modifies perception to coincide with the sounds from the language. The model develops motor mirror neurons that are active when an external sound is perceived. An extension to visual mirror neurons for oral gestures is suggested.

  6. Radiomics-based differentiation of lung disease models generated by polluted air based on X-ray computed tomography data.

    PubMed

    Szigeti, Krisztián; Szabó, Tibor; Korom, Csaba; Czibak, Ilona; Horváth, Ildikó; Veres, Dániel S; Gyöngyi, Zoltán; Karlinger, Kinga; Bergmann, Ralf; Pócsik, Márta; Budán, Ferenc; Máthé, Domokos

    2016-02-11

    Lung diseases resulting from air pollution require a widely accessible method for risk estimation and early diagnosis to ensure proper and responsive treatment. Radiomics-based fractal dimension analysis of X-ray computed tomography attenuation patterns in chest voxels of mice exposed to different air-polluting agents was performed to model early stages of disease and to establish a differential diagnosis. To model different types of air pollution, BALBc/ByJ mouse groups were exposed to cigarette smoke combined with ozone or to sulphur dioxide gas, and a control group was established. Two weeks after exposure, the frequency distributions of image voxel attenuation data were evaluated. Specific cut-off ranges were defined to group voxels by attenuation. Each cut-off range was binarized and its spatial pattern was associated with a calculated fractal dimension, then abstracted as a fractal dimension versus cut-off range mathematical function. Nonparametric Kruskal-Wallis (KW) and Mann-Whitney post hoc (MWph) tests were used. Each cut-off range versus fractal dimension plot was found to contain two distinctive Gaussian curves. The ratios of the Gaussian curve parameters differ significantly and are statistically distinguishable among the three exposure groups. A new radiomics evaluation method was thus established, based on analysis of the fractal dimension of chest X-ray computed tomography data segments. The specific attenuation patterns calculated with this method may help diagnose and monitor certain lung diseases, such as chronic obstructive pulmonary disease (COPD), asthma, tuberculosis or lung carcinomas.
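The fractal dimension step can be illustrated with a standard box-counting estimate on a binarized mask: count occupied boxes at several scales and take the slope of log N(s) versus log(1/s). The study works on 3-D CT voxel data; this simplified 2-D sketch, using a hypothetical straight-line mask whose dimension should be 1, only demonstrates the technique.

```python
import math

def box_count_dimension(points, scales):
    """Estimate the fractal dimension of a set of (x, y) pixels by box
    counting: slope of log N(s) versus log(1/s) over the given box sizes."""
    logs = []
    for s in scales:
        boxes = {(x // s, y // s) for x, y in points}  # occupied boxes
        logs.append((math.log(1.0 / s), math.log(len(boxes))))
    # Least-squares slope of log N versus log(1/s).
    n = len(logs)
    mx = sum(x for x, _ in logs) / n
    my = sum(y for _, y in logs) / n
    return (sum((x - mx) * (y - my) for x, y in logs)
            / sum((x - mx) ** 2 for x, _ in logs))

# A straight horizontal line of pixels should have dimension ~1.
line = {(x, 32) for x in range(64)}
print(round(box_count_dimension(line, [1, 2, 4, 8]), 2))  # 1.0
```

For 3-D data the boxes become cubes of voxels; the regression step is unchanged.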

  7. [Measures of occupational exposure to time-varying low frequency magnetic fields of non-uniform spatial distribution in the light of international guidelines and electrodynamic exposure effects in the human body].

    PubMed

    Karpowicz, Jolanta; Zradziński, Patryk; Gryz, Krzysztof

    2012-01-01

    The aim of the study was to analyze, by computer simulations, the electrodynamic effects of magnetic fields (MF) on workers, in order to harmonize the principles of occupational hazard assessment with international guidelines. Simulations involved 50 Hz MF of various spatial distributions, representing workers' exposure in enterprises. Homogeneous models of sigma = 0.2 S/m conductivity and the dimensions of body parts (palm, head and trunk) were located at 50 cm ("hand distance") or 5 cm (adjacent) from the source (a circular conductor 20 cm or 200 cm in diameter). Parameters of the magnetic flux density (B(i)) affecting the models were the exposure measures, and the induced electric field strength (E(in)) was the measure of MF exposure effects. The ratio E(in)/B(i) in the analyzed cases ranged from 2.59 to 479 (V/m)/T. The strongest correlation (p < 0.001) between B(i) and E(in) was found for parameters characterizing the MF at the surface of the body models. Parameters characterizing the averaged value of the field affecting the models (measures of non-uniform field exposure following ICNIRP guidelines) were less correlated with exposure effects (p < 0.005). The ratio E(in)(trunk)/E(in)(palm) estimated from E(in) calculations was 3.81-4.56, but estimated from parameters representing B(i) measurements it was 3.96-9.74. It is therefore justified to accept a 3.96-9.74 times higher exposure to the limbs than to the trunk. This supports the regulation of labour law in Poland, which provides that the ceiling value for limb exposure to MF below 800 kHz is fivefold higher than that for the trunk. The high uncertainty in assessing the effects of non-uniform field exposure, resulting from the strong dependence of the E(in)/B(i) ratio on exposure conditions and the applied measures, requires special caution when defining permissible MF levels and the principles of exposure assessment at the workplace.

  8. Evaluation of electrical fields inside a biological structure.

    PubMed Central

    Drago, G. P.; Ridella, S.

    1982-01-01

    A digital computer simulation has been carried out of the exposure of a cell, modelled as a multilayered spherical structure, to an alternating electric field. Electrical quantities of possible biological interest can be evaluated everywhere inside the cell. A strongly frequency-selective behaviour in the range 0-10 MHz has been obtained. PMID:6279135

  9. Analysis of Radiation Exposure for Troop Observers, Exercise Desert Rock V, Operation Upshot-Knothole.

    DTIC Science & Technology

    1981-04-28

    on initial doses. Residual doses are determined through an automated procedure that utilizes raw data in regression analyses to fit space-time models...show their relationship to the observer positions. The computer-calculated doses do not reflect the presence of the human body in the radiological

  10. Survey mirrors and lenses and their required surface accuracy. Volume 1. Technical report. Final report for September 15, 1978-December 1, 1979

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beesing, M. E.; Buchholz, R. L.; Evans, R. A.

    1980-01-01

    An investigation of the optical performance of a variety of concentrating solar collectors is reported. The study addresses two important issues: the accuracy of reflective or refractive surfaces required to achieve specified performance goals, and the effect of environmental exposure on the performance of concentrators. To assess the importance of surface accuracy for optical performance, 11 tracking and nontracking concentrator designs were selected for detailed evaluation. Mathematical models were developed for each design and incorporated into a Monte Carlo ray-trace computer program to carry out detailed calculations. Results for the 11 concentrators are presented in graphic form. The models and computer program are provided along with a user's manual. A survey data base was established on the effect of environmental exposure on the optical degradation of mirrors and lenses. Information on environmental and maintenance effects was found to be insufficient to permit specific recommendations for operating and maintenance procedures, but the available information is compiled and reported and does contain procedures that other workers have found useful.
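The role of surface accuracy can be sketched with a toy Monte Carlo: a mirror slope error of ε deflects the reflected ray by roughly 2ε (small-angle optics), so one can sample Gaussian slope errors and count the rays that stay within a receiver's acceptance half-angle. All parameter values below are illustrative, not from the report.

```python
import random

# Toy Monte Carlo estimate of the fraction of reflected rays landing within
# a receiver acceptance half-angle, for a mirror with Gaussian slope error.

def intercept_fraction(sigma_slope_mrad, accept_half_angle_mrad,
                       n=100_000, seed=1):
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        slope_err = rng.gauss(0.0, sigma_slope_mrad)
        # A slope error of eps deflects the reflected ray by ~2*eps.
        if abs(2.0 * slope_err) <= accept_half_angle_mrad:
            hits += 1
    return hits / n

# 2 mrad slope error -> ~4 mrad beam spread; 8 mrad acceptance is ~2 sigma.
print(intercept_fraction(2.0, 8.0))
```

A full ray-trace program would also sample incidence angles and mirror geometry; the principle of sampling surface errors is the same.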

  11. Enhanced representation of soil NO emissions in the ...

    EPA Pesticide Factsheets

    Modeling of soil nitric oxide (NO) emissions is highly uncertain and may misrepresent its spatial and temporal distribution. This study builds upon a recently introduced parameterization to improve the timing and spatial distribution of soil NO emission estimates in the Community Multiscale Air Quality (CMAQ) model. The parameterization considers soil parameters, meteorology, land use, and mineral nitrogen (N) availability to estimate NO emissions. We incorporate daily year-specific fertilizer data from the Environmental Policy Integrated Climate (EPIC) agricultural model to replace the annual generic data of the initial parameterization, and use a 12 km resolution soil biome map over the continental USA. CMAQ modeling for July 2011 shows slight differences in model performance in simulating fine particulate matter and ozone from Interagency Monitoring of Protected Visual Environments (IMPROVE) and Clean Air Status and Trends Network (CASTNET) sites and NO2 columns from Ozone Monitoring Instrument (OMI) satellite retrievals. We also simulate how the change in soil NO emissions scheme affects the expected O3 response to projected emissions reductions. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and

  12. Computational Modeling and Simulation of Developmental ...

    EPA Pesticide Factsheets

    Standard practice for assessing developmental toxicity is the observation of apical endpoints (intrauterine death, fetal growth retardation, structural malformations) in pregnant rats/rabbits following exposure during organogenesis. EPA's computational toxicology research program (ToxCast) generated vast in vitro cellular and molecular effects data on >1858 chemicals in >600 high-throughput screening (HTS) assays. The diversity of assays has been increased for developmental toxicity with several HTS platforms, including the devTOX-quickPredict assay from Stemina Biomarker Discovery utilizing the human embryonic stem cell line (H9). Translating these HTS data into higher-order predictions of developmental toxicity is a significant challenge. Here, we address the application of computational systems models that recapitulate the kinematics of dynamical cell signaling networks (e.g., SHH, FGF, BMP, retinoids) in a CompuCell3D.org modeling environment. Examples include angiogenesis (angiodysplasia) and dysmorphogenesis. Being numerically responsive to perturbation, these models are amenable to data integration for systems toxicology and Adverse Outcome Pathways (AOPs). The AOP simulation outputs predict potential phenotypes based on the in vitro ToxCast HTS data. A heuristic computational intelligence framework that recapitulates the kinematics of dynamical cell signaling networks in the embryo, together with the in vitro profiling data, produce quantitative predic

  13. System-based Identification of Toxicity Pathways Associated With Multi-Walled Carbon Nanotube-Induced Pathological Responses

    PubMed Central

    Snyder-Talkington, Brandi N.; Dymacek, Julian; Porter, Dale W.; Wolfarth, Michael G.; Mercer, Robert R.; Pacurari, Maricica; Denvir, James; Castranova, Vincent; Qian, Yong; Guo, Nancy L.

    2014-01-01

    The fibrous shape and biopersistence of multi-walled carbon nanotubes (MWCNT) have raised concern over their potential toxicity after pulmonary exposure. As in vivo exposure to MWCNT produced a transient inflammatory and progressive fibrotic response, this study sought to identify significant biological processes associated with lung inflammation and fibrosis pathology data, based upon whole genome mRNA expression, bronchoalveolar lavage scores, and morphometric analysis from C57BL/6J mice exposed by pharyngeal aspiration to 0, 10, 20, 40, or 80 µg MWCNT at 1, 7, 28, or 56 days post-exposure. Using a novel computational model employing non-negative matrix factorization and Markov chain Monte Carlo simulation, significant biological processes with expression similar to MWCNT-induced lung inflammation and fibrosis pathology data in mice were identified. A subset of genes in these processes was determined by Ingenuity Pathway Analysis to be functionally related to either fibrosis or inflammation and was used to determine potentially significant signaling cascades. Two genes determined to be functionally related to inflammation and fibrosis, vascular endothelial growth factor A (vegfa) and C-C motif chemokine 2 (ccl2), were confirmed by in vitro studies of mRNA and protein expression in small airway epithelial cells exposed to MWCNT as concordant with in vivo expression. This study demonstrated that the novel computational model was sufficient to determine biological processes strongly associated with the pathology of lung inflammation and fibrosis and could identify potential toxicity signaling pathways and mechanisms of MWCNT exposure, which could be used in future animal studies to support human risk assessment and intervention efforts. PMID:23845593

  14. Development of the NASA Digital Astronaut Project Muscle Model

    NASA Technical Reports Server (NTRS)

    Lewandowski, Beth E.; Pennline, James A.; Thompson, W. K.; Humphreys, B. T.; Ryder, J. W.; Ploutz-Snyder, L. L.; Mulugeta, L.

    2015-01-01

    This abstract describes development work performed on the NASA Digital Astronaut Project Muscle Model. Muscle atrophy is a known physiological response to exposure to a low gravity environment. The DAP muscle model computationally predicts the change in muscle structure and function vs. time in a reduced gravity environment. The spaceflight muscle model can then be used in biomechanical models of exercise countermeasures and spaceflight tasks to: 1) develop site specific bone loading input to the DAP bone adaptation model over the course of a mission; 2) predict astronaut performance of spaceflight tasks; 3) inform effectiveness of new exercise countermeasures concepts.

  15. Mirror, mirror on my Facebook wall: effects of exposure to Facebook on self-esteem.

    PubMed

    Gonzales, Amy L; Hancock, Jeffrey T

    2011-01-01

    Contrasting hypotheses were posed to test the effect of Facebook exposure on self-esteem. Objective Self-Awareness (OSA) theory from social psychology and the Hyperpersonal Model from computer-mediated communication were used to argue that Facebook would either diminish or enhance self-esteem, respectively. The results revealed that, in contrast to previous work on OSA, becoming self-aware by viewing one's own Facebook profile enhances rather than diminishes self-esteem. Participants who updated their profiles and viewed their own profiles during the experiment also reported greater self-esteem, lending additional support to the Hyperpersonal Model. These findings suggest that selective self-presentation in digital media, which leads to intensified relationship formation, also influences impressions of the self.

  16. Defining Nitrogen Kinetics for Air Break in Prebreathe

    NASA Technical Reports Server (NTRS)

    Conkin, Johnny

    2010-01-01

    Actual tissue nitrogen (N2) kinetics are complex; uptake and elimination are often approximated with a single half-time compartment in statistical descriptions of denitrogenation [prebreathe (PB)] protocols. Air breaks during PB complicate N2 kinetics. A comparison of symmetrical versus asymmetrical N2 kinetics was performed using the time to onset of hypobaric decompression sickness (DCS) as a surrogate for actual venous N2 tension. METHODS: Published results of 12 tests involving 179 hypobaric exposures in altitude chambers after PB, with and without air breaks, provide the complex protocols from which to model N2 kinetics. DCS survival times for combined control and air-break exposures were described with an accelerated log logistic model, where N2 uptake and elimination before, during, and after the air break was computed with a simple exponential function or a function that changed half-time depending on ambient N2 partial pressure. P1N2 − P2 = ΔP defined the decompression dose for each altitude exposure, where P2 was the test altitude pressure and P1N2 was the computed N2 pressure at the beginning of the altitude exposure. RESULTS: The log likelihood (LL) without decompression dose (null model) was -155.6, and improved (best-fit) to -97.2 when dose was defined with a 240 min half-time for both N2 elimination and uptake during the PB. The description of DCS survival time was less precise with asymmetrical N2 kinetics; for example, LL was -98.9 with a 240 min half-time for elimination and a 120 min half-time for uptake. CONCLUSION: The statistical regression described survival time mechanistically linked to symmetrical N2 kinetics during PBs that also included air breaks. The results are data-specific, and additional data may change the conclusion. The regression is useful to compute additional PB time to compensate for an air break in PB within the narrow range of tested conditions.
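The symmetric single-compartment kinetics described above can be sketched directly: tissue N2 tension relaxes exponentially toward the ambient N2 partial pressure with a 240 min half-time, using the same exponential for uptake and elimination. The schedule and pressures below are illustrative, not the tested protocols.

```python
import math

# Sketch of symmetric single-compartment N2 kinetics with a 240 min
# half-time, in the spirit of the best-fit model. Pressures in psia are
# illustrative only.

def n2_tension(p0, p_ambient, minutes, half_time=240.0):
    """Exponential relaxation of tissue N2 tension toward ambient N2."""
    k = math.log(2.0) / half_time
    return p_ambient + (p0 - p_ambient) * math.exp(-k * minutes)

p = 11.6                         # starting tissue N2 (~79% of 14.7 psia)
p = n2_tension(p, 0.0, 60.0)     # 60 min 100% O2 prebreathe: ambient N2 = 0
p = n2_tension(p, 11.6, 10.0)    # 10 min air break: uptake toward 11.6 psia
p = n2_tension(p, 0.0, 30.0)     # 30 min more O2 prebreathe
print(round(p, 2))               # tissue N2 tension after the schedule
```

The asymmetric variant in the paper would simply switch the half-time between segments depending on the ambient N2 partial pressure.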

  17. Defining Nitrogen Kinetics for Air Break in Prebreathe

    NASA Technical Reports Server (NTRS)

    Conkin, Johnny

    2009-01-01

    Actual tissue nitrogen (N2) kinetics are complex; uptake and elimination are often approximated with a single half-time compartment in statistical descriptions of denitrogenation [prebreathe (PB)] protocols. Air breaks during PB complicate N2 kinetics. A comparison of symmetrical versus asymmetrical N2 kinetics was performed using the time to onset of hypobaric decompression sickness (DCS) as a surrogate for actual venous N2 tension. Published results of 12 tests involving 179 hypobaric exposures in altitude chambers after PB, with and without air breaks, provide the complex protocols from which to model N2 kinetics. DCS survival times for combined control and air-break exposures were described with an accelerated log logistic model, where N2 uptake and elimination before, during, and after the air break was computed with a simple exponential function or a function that changed half-time depending on ambient N2 partial pressure. P1N2 − P2 = ΔP defined the DCS dose for each altitude exposure, where P2 was the test altitude pressure and P1N2 was the computed N2 pressure at the beginning of the altitude exposure. The log likelihood (LL) without DCS dose (null model) was -155.6, and improved (best-fit) to -97.2 when dose was defined with a 240 min half-time for both N2 elimination and uptake during the PB. The description of DCS survival time was less precise with asymmetrical N2 kinetics; for example, LL was -98.9 with a 240 min half-time for elimination and a 120 min half-time for uptake. The statistical regression described survival time mechanistically linked to symmetrical N2 kinetics during PBs that also included air breaks. The results are data-specific, and additional data may change the conclusion. The regression is useful to compute additional PB time to compensate for an air break in PB within the narrow range of tested conditions.

  18. Modelling of aircrew radiation exposure from galactic cosmic rays and solar particle events.

    PubMed

    Takada, M; Lewis, B J; Boudreau, M; Al Anid, H; Bennett, L G I

    2007-01-01

    Correlations have been developed for implementation into the semi-empirical Predictive Code for Aircrew Radiation Exposure (PCAIRE) to account for effects of extremum conditions of solar modulation and low altitude based on transport code calculations. An improved solar modulation model, as proposed by NASA, has been further adopted to interpolate between the bounding correlations for solar modulation. The conversion ratio of effective dose to ambient dose equivalent, as applied to the PCAIRE calculation (based on measurements) for the legal regulation of aircrew exposure, was re-evaluated in this work to take into consideration new ICRP-92 radiation-weighting factors and different possible irradiation geometries of the source cosmic-radiation field. A computational analysis with Monte Carlo N-Particle eXtended Code was further used to estimate additional aircrew exposure that may result from sporadic solar energetic particle events considering real-time monitoring by the Geosynchronous Operational Environmental Satellite. These predictions were compared with the ambient dose equivalent rates measured on-board an aircraft and to count rate data observed at various ground-level neutron monitors.

  19. Reactive decontamination of absorbing thin film polymer coatings: model development and parameter determination

    NASA Astrophysics Data System (ADS)

    Varady, Mark; Mantooth, Brent; Pearl, Thomas; Willis, Matthew

    2014-03-01

    A continuum model of reactive decontamination in absorbing polymeric thin film substrates exposed to the chemical warfare agent O-ethyl S-[2-(diisopropylamino)ethyl] methylphosphonothioate (known as VX) was developed to assess the performance of various decontaminants. Experiments were performed in conjunction with an inverse analysis method to obtain the necessary model parameters. The experiments involved contaminating a substrate with a fixed VX exposure, applying a decontaminant, and then performing a time-resolved, liquid-phase extraction of the absorbing substrate to measure the residual contaminant by chromatography. Decontamination model parameters were uniquely determined using the Levenberg-Marquardt nonlinear least-squares fitting technique to best fit the experimental time evolution of extracted mass. The model was implemented numerically in both a 2D axisymmetric finite element program and a 1D finite difference code, and the more computationally efficient 1D implementation was found to be sufficiently accurate. The resulting decontamination model provides an accurate quantification of the contaminant concentration profile in the material, which is necessary to assess exposure hazards.
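The inverse analysis can be illustrated on a much-reduced problem: fit a single rate constant of a hypothetical first-order extraction model to synthetic data by least squares. A plain grid search stands in here for the Levenberg-Marquardt algorithm used in the paper; the model, data, and parameter values are invented for illustration.

```python
import math

# Stand-in for the inverse analysis: recover the rate constant k of a
# hypothetical first-order extraction model m(t) = m_inf * (1 - exp(-k t))
# from synthetic "extracted mass" data by least squares.

def model(t, k, m_inf=1.0):
    return m_inf * (1.0 - math.exp(-k * t))

true_k = 0.3
times = [1.0, 2.0, 4.0, 8.0, 16.0]
data = [model(t, true_k) for t in times]   # synthetic, noise-free data

def sse(k):
    """Sum of squared errors between model and data for a trial k."""
    return sum((model(t, k) - y) ** 2 for t, y in zip(times, data))

# Simple 1-D grid search over k in (0, 1]; LM would use gradient steps.
best_k = min((k / 1000.0 for k in range(1, 1001)), key=sse)
print(round(best_k, 2))  # recovers 0.3
```

With noisy data and several parameters, a damped Gauss-Newton scheme such as Levenberg-Marquardt replaces the grid search, but the objective function is the same.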

  20. Probabilistic modelling of human exposure to intense sweeteners in Italian teenagers: validation and sensitivity analysis of a probabilistic model including indicators of market share and brand loyalty.

    PubMed

    Arcella, D; Soggiu, M E; Leclercq, C

    2003-10-01

    For the assessment of exposure to food-borne chemicals, the most commonly used methods in the European Union follow a deterministic approach based on conservative assumptions. Over the past few years, to obtain a more realistic view of exposure to food chemicals, risk managers have become increasingly interested in the probabilistic approach. Within the EU-funded 'Monte Carlo' project, a stochastic model of exposure to chemical substances from the diet and a computer software program were developed. The aim of this paper was to validate the model with respect to the intake of saccharin from table-top sweeteners and of cyclamate from soft drinks by Italian teenagers, using the software, and to evaluate the impact of the inclusion/exclusion of indicators of market share and brand loyalty through a sensitivity analysis. Data on food consumption and the concentration of sweeteners were collected. A food frequency questionnaire aimed at identifying females who were high consumers of sugar-free soft drinks and/or of table-top sweeteners was filled in by 3982 teenagers living in the District of Rome. Moreover, 362 subjects participated in a detailed food survey by recording, at brand level, all foods and beverages ingested over 12 days. Producers were asked to provide the intense-sweetener concentrations of sugar-free products. Results showed that consumer behaviour with respect to brands has an impact on exposure assessments. Only probabilistic models that took into account indicators of market share and brand loyalty met the validation criteria.
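Why market share and brand loyalty matter can be sketched with a tiny Monte Carlo intake model: a loyal consumer keeps one brand (and its concentration) across eating occasions, while a non-loyal consumer re-draws a brand each time, which narrows the intake distribution. The brands, shares, concentrations, and consumption figures below are invented for illustration and are not the project's data.

```python
import random

# Illustrative Monte Carlo intake model with market share and brand loyalty.
conc_mg_per_l = {"A": 120.0, "B": 40.0, "C": 0.0}   # sweetener concentration
share = [("A", 0.5), ("B", 0.3), ("C", 0.2)]         # market shares

def pick_brand(rng):
    r, acc = rng.random(), 0.0
    for brand, s in share:
        acc += s
        if r < acc:
            return brand
    return share[-1][0]

def daily_intake_mg(rng, occasions=3, volume_l=0.33, loyalty=0.7):
    loyal = rng.random() < loyalty
    brand = pick_brand(rng)                 # the loyal consumer's brand
    total = 0.0
    for _ in range(occasions):
        if not loyal:
            brand = pick_brand(rng)         # non-loyal: re-draw every time
        total += volume_l * conc_mg_per_l[brand]
    return total

rng = random.Random(42)
intakes = sorted(daily_intake_mg(rng) for _ in range(10_000))
p95 = intakes[int(0.95 * len(intakes))]
print(p95 > intakes[len(intakes) // 2])  # upper tail exceeds the median
```

High percentiles of such a distribution are exactly the quantities that deterministic, conservative assessments tend to overstate.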

  1. Track structure model of microscopic energy deposition by protons and heavy ions in segments of neuronal cell dendrites represented by cylinders or spheres

    PubMed Central

    Alp, Murat; Cucinotta, Francis A.

    2017-01-01

    Changes to cognition, including memory, following radiation exposure are a concern for cosmic ray exposures to astronauts and in Hadron therapy with proton and heavy ion beams. The purpose of the present work is to develop computational methods to evaluate microscopic energy deposition (ED) in volumes representative of neuron cell structures, including segments of dendrites and spines, using a stochastic track structure model. A challenge for biophysical models of neuronal damage is the large sizes (>100 μm) and variability in volumes of possible dendritic segments and pre-synaptic elements (spines and filopodia). We consider cylindrical and spherical microscopic volumes of varying geometric parameters and aspect ratios from 0.5 to 5 irradiated by protons, and 3He and 12C particles at energies corresponding to a distance of 1 cm to the Bragg peak, which represent particles of interest in Hadron therapy as well as space radiation exposure. We investigate the optimal axis length of dendritic segments to evaluate microscopic ED and hit probabilities along the dendritic branches at a given macroscopic dose. Because of large computation times to analyze ED in volumes of varying sizes, we developed an analytical method to find the mean primary dose in spheres that can guide numerical methods to find the primary dose distribution for cylinders. Considering cylindrical segments of varying aspect ratio at constant volume, we assess the chord length distribution, mean number of hits and ED profiles by primary particles and secondary electrons (δ-rays). For biophysical modeling applications, segments on dendritic branches are proposed to have equal diameters and axes lengths along the varying diameter of a dendritic branch. PMID:28554507
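The analytical mean-dose reasoning above rests on a classical chord-length result used in microdosimetry: for a convex body under uniform isotropic randomness the mean chord length is 4V/S, which for a sphere of radius R gives 4R/3. A quick Monte Carlo check (an illustrative sketch, not the paper's code) samples chords through a sphere via a uniformly distributed squared impact parameter:

```python
import math, random

# Monte Carlo check that the mean chord length of a sphere under uniform
# isotropic randomness is 4R/3 (the 4V/S Cauchy formula).

def mean_chord(radius, n=200_000, seed=7):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        b2 = rng.random() * radius ** 2          # impact parameter squared
        total += 2.0 * math.sqrt(radius ** 2 - b2)  # chord length
    return total / n

r = 3.0                                          # e.g. a 3 um radius volume
print(abs(mean_chord(r) - 4.0 * r / 3.0) < 0.02)
```

For cylinders no such simple closed form covers all aspect ratios, which is why the paper resorts to numerical methods guided by the spherical result.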

  2. Track structure model of microscopic energy deposition by protons and heavy ions in segments of neuronal cell dendrites represented by cylinders or spheres

    NASA Astrophysics Data System (ADS)

    Alp, Murat; Cucinotta, Francis A.

    2017-05-01

    Changes to cognition, including memory, following radiation exposure are a concern for cosmic ray exposures to astronauts and in Hadron therapy with proton and heavy ion beams. The purpose of the present work is to develop computational methods to evaluate microscopic energy deposition (ED) in volumes representative of neuron cell structures, including segments of dendrites and spines, using a stochastic track structure model. A challenge for biophysical models of neuronal damage is the large sizes (> 100 μm) and variability in volumes of possible dendritic segments and pre-synaptic elements (spines and filopodia). We consider cylindrical and spherical microscopic volumes of varying geometric parameters and aspect ratios from 0.5 to 5 irradiated by protons, and 3He and 12C particles at energies corresponding to a distance of 1 cm to the Bragg peak, which represent particles of interest in Hadron therapy as well as space radiation exposure. We investigate the optimal axis length of dendritic segments to evaluate microscopic ED and hit probabilities along the dendritic branches at a given macroscopic dose. Because of large computation times to analyze ED in volumes of varying sizes, we developed an analytical method to find the mean primary dose in spheres that can guide numerical methods to find the primary dose distribution for cylinders. Considering cylindrical segments of varying aspect ratio at constant volume, we assess the chord length distribution, mean number of hits and ED profiles by primary particles and secondary electrons (δ-rays). For biophysical modeling applications, segments on dendritic branches are proposed to have equal diameters and axes lengths along the varying diameter of a dendritic branch.

  3. Track structure model of microscopic energy deposition by protons and heavy ions in segments of neuronal cell dendrites represented by cylinders or spheres.

    PubMed

    Alp, Murat; Cucinotta, Francis A

    2017-05-01

    Changes to cognition, including memory, following radiation exposure are a concern for cosmic ray exposures to astronauts and in Hadron therapy with proton and heavy ion beams. The purpose of the present work is to develop computational methods to evaluate microscopic energy deposition (ED) in volumes representative of neuron cell structures, including segments of dendrites and spines, using a stochastic track structure model. A challenge for biophysical models of neuronal damage is the large sizes (>100 µm) and variability in volumes of possible dendritic segments and pre-synaptic elements (spines and filopodia). We consider cylindrical and spherical microscopic volumes of varying geometric parameters and aspect ratios from 0.5 to 5 irradiated by protons, and 3He and 12C particles at energies corresponding to a distance of 1 cm to the Bragg peak, which represent particles of interest in Hadron therapy as well as space radiation exposure. We investigate the optimal axis length of dendritic segments to evaluate microscopic ED and hit probabilities along the dendritic branches at a given macroscopic dose. Because of large computation times to analyze ED in volumes of varying sizes, we developed an analytical method to find the mean primary dose in spheres that can guide numerical methods to find the primary dose distribution for cylinders. Considering cylindrical segments of varying aspect ratio at constant volume, we assess the chord length distribution, mean number of hits and ED profiles by primary particles and secondary electrons (δ-rays). For biophysical modeling applications, segments on dendritic branches are proposed to have equal diameters and axis lengths along the varying diameter of a dendritic branch.

  4. Occupational risk factors have to be considered in the definition of high-risk lung cancer populations.

    PubMed

    Wild, P; Gonzalez, M; Bourgkard, E; Courouble, N; Clément-Duchêne, C; Martinet, Y; Févotte, J; Paris, C

    2012-03-27

    The aim of this study was to compute the attributable fractions (AFs) of lung cancer due to occupational factors in an area of north-eastern France with high lung cancer rates and a history of mining and steel industry. A population-based case-control study among males aged 40-79 was conducted, including confirmed primary lung cancer cases from all hospitals of the study region. Controls were stratified by broad age class, district and socioeconomic class. Detailed occupational and personal risk factors were obtained in face-to-face interviews, and cumulative occupational exposure indices were derived from the questionnaires. Attributable fractions were computed from multiple unconditional logistic regression models. A total of 246 cases and 531 controls were included. The odds ratios (ORs), adjusted for cumulative smoking and family history of lung cancer, increased significantly with the cumulative exposure indices for asbestos, polycyclic aromatic hydrocarbons and crystalline silica, and with exposure to diesel motor exhaust. The AF for occupational factors exceeded 50%, the most important contributors being crystalline silica and asbestos. These AFs are higher than most published figures, which may reflect the highly industrialised study area or the methods used for exposure assessment. Occupational factors are important risk factors and should not be forgotten when defining high-risk lung cancer populations.
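One common way to turn case-control odds ratios into an attributable fraction is Miettinen's case-based formula, AF = Σ p_c(level) × (OR − 1)/OR, where p_c is the proportion of cases at each exposure level. The sketch below applies that formula; it is not necessarily the exact estimator used in the study, and the ORs and case fractions are invented for illustration.

```python
# Sketch of an attributable-fraction calculation using Miettinen's
# case-based formula: AF = sum over exposure levels of
#   p_c(level) * (OR - 1) / OR,
# where p_c is the proportion of cases at that level.

def attributable_fraction(levels):
    """levels: list of (proportion_of_cases, adjusted_odds_ratio)."""
    return sum(pc * (or_ - 1.0) / or_ for pc, or_ in levels if or_ > 1.0)

# Hypothetical exposure: 30% of cases at OR 2.0, 10% of cases at OR 4.0.
af = attributable_fraction([(0.3, 2.0), (0.1, 4.0)])
print(round(af, 3))  # 0.225, i.e. 22.5% of cases attributable
```

Summing such per-factor AFs naively can exceed the joint AF when exposures overlap, which is why multivariable models are used in practice.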

  5. Computational modeling and statistical analyses on individual contact rate and exposure to disease in complex and confined transportation hubs

    NASA Astrophysics Data System (ADS)

    Wang, W. L.; Tsui, K. L.; Lo, S. M.; Liu, S. B.

    2018-01-01

    Crowded transportation hubs such as metro stations are considered ideal places for the development and spread of epidemics. However, because of their complex spatial layouts and confined environments with large numbers of highly mobile individuals, human contacts are difficult to quantify in such settings, and disease-spreading dynamics there have received little attention in previous studies. Owing to the heterogeneity and dynamic nature of human interactions, a growing number of studies have demonstrated the importance of contact distance and contact duration for transmission probabilities. In this study, we show how detailed information on contact and exposure patterns can be obtained by statistical analyses of microscopic crowd simulation data. Specifically, a pedestrian simulation model, CityFlow, was employed to reproduce individuals' movements in a metro station based on site survey data; values and distributions of individual contact rate and exposure in different simulation cases were then obtained and analyzed. Interestingly, the Weibull distribution fitted the histogram of individual-based exposure in each case very well. Moreover, we found that both individual contact rate and exposure had a linear relationship with the average crowd density of the environment. The results obtained in this paper can serve as a reference for epidemic studies in complex and confined transportation hubs and help refine existing disease-spreading models.
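
    The reported Weibull fit can be sanity-checked against the analytic form of the two-parameter distribution. A small sketch with illustrative shape and scale values (the fitted parameters are specific to each simulation case and are not given here):

```python
import math

def weibull_pdf(x, shape, scale):
    # Two-parameter Weibull density, the form found to fit per-person exposure
    z = x / scale
    return (shape / scale) * z ** (shape - 1.0) * math.exp(-z ** shape)

def weibull_mean(shape, scale):
    # Analytic mean, useful to sanity-check a fitted distribution
    return scale * math.gamma(1.0 + 1.0 / shape)

# Illustrative parameters only
print(round(weibull_mean(1.5, 2.0), 4))
```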

  6. Comparison of Highly Resolved Model-Based Exposure ...

    EPA Pesticide Factsheets

    Human exposure to air pollution in many studies is represented by ambient concentrations from space-time kriging of observed values. Space-time kriging techniques based on a limited number of ambient monitors may fail to capture the concentration from local sources. Further, because people spend most of their time indoors, using ambient concentration to represent exposure may introduce error. To quantify the associated exposure error, we computed a series of six different hourly-based exposure metrics at 16,095 Census blocks of three counties in North Carolina for CO, NOx, PM2.5, and elemental carbon (EC) during 2012. These metrics include ambient background concentration from space-time ordinary kriging (STOK), ambient on-road concentration from the Research LINE source dispersion model (R-LINE), a hybrid concentration combining STOK and R-LINE, and their associated indoor concentrations from an indoor infiltration mass balance model. Using the hybrid-based indoor concentration as the standard, the comparison showed that outdoor STOK metrics yielded large errors at both the population level (67% to 93%) and the individual level (average bias between −10% and 95%). For pollutants with significant contribution from on-road emission (EC and NOx), the on-road based indoor metric performs the best at the population level (error less than 52%). At the individual level, however, the STOK-based indoor concentration performs the best (average bias below 30%). For PM2.5, due to the relatively low co
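
    The indoor infiltration mass balance model reduces, at steady state with no indoor sources, to scaling the ambient concentration by an infiltration factor. A simplified single-zone sketch (parameter values illustrative; the study's model may include further terms):

```python
def indoor_infiltration(c_ambient, air_exchange, penetration=1.0, deposition=0.0):
    """Steady-state single-zone mass balance with no indoor sources:
    C_in = C_out * P * a / (a + k), where a is the air-exchange rate (1/h),
    P the particle penetration factor, and k the deposition loss rate (1/h)."""
    return c_ambient * penetration * air_exchange / (air_exchange + deposition)

# Example: PM2.5 at 10 µg/m^3 ambient, a = 0.5 1/h, P = 0.8, k = 0.2 1/h
print(indoor_infiltration(10.0, 0.5, 0.8, 0.2))
```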

  7. Finite element simulation of the mechanical impact of computer work on the carpal tunnel syndrome.

    PubMed

    Mouzakis, Dionysios E; Rachiotis, George; Zaoutsos, Stefanos; Eleftheriou, Andreas; Malizos, Konstantinos N

    2014-09-22

    Carpal tunnel syndrome (CTS) is a clinical disorder resulting from the compression of the median nerve. The available evidence regarding the association between computer use and CTS is controversial. There is some evidence that computer mouse or keyboard work, or both are associated with the development of CTS. Despite the availability of pressure measurements in the carpal tunnel during computer work (exposure to keyboard or mouse) there are no available data to support a direct effect of the increased intracarpal canal pressure on the median nerve. This study presents an attempt to simulate the direct effects of computer work on the whole carpal area section using finite element analysis. A finite element mesh was produced from computerized tomography scans of the carpal area, involving all tissues present in the carpal tunnel. Two loading scenarios were applied on these models based on biomechanical data measured during computer work. It was found that mouse work can produce large deformation fields on the median nerve region. Also, the high stressing effect of the carpal ligament was verified. Keyboard work produced considerable and heterogeneous elongations along the longitudinal axis of the median nerve. Our study provides evidence that increased intracarpal canal pressures caused by awkward wrist postures imposed during computer work were associated directly with deformation of the median nerve. Despite the limitations of the present study the findings could be considered as a contribution to the understanding of the development of CTS due to exposure to computer work. Copyright © 2014 Elsevier Ltd. All rights reserved.

  8. Comparison of monoenergetic photon organ dose rate coefficients for stylized and voxel phantoms submerged in air

    DOE PAGES

    Bellamy, Michael B.; Hiller, Mauritius M.; Dewji, Shaheen A.; ...

    2016-02-01

    As part of a broader effort to calculate effective dose rate coefficients for external exposure to photons and electrons emitted by radionuclides distributed in air, soil or water, age-specific stylized phantoms have been employed to determine dose coefficients relating dose rate to organs and tissues in the body. In this article, dose rate coefficients computed using the International Commission on Radiological Protection reference adult male voxel phantom are compared with values computed using the Oak Ridge National Laboratory adult male stylized phantom in an air submersion exposure geometry. Monte Carlo calculations for both phantoms were performed for monoenergetic source photons in the range of 30 keV to 5 MeV. Furthermore, these calculations largely result in differences under 10% for photon energies above 50 keV, and it can be expected that both models show comparable results for the environmental sources of radionuclides.

  9. Comparison of monoenergetic photon organ dose rate coefficients for stylized and voxel phantoms submerged in air

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bellamy, Michael B.; Hiller, Mauritius M.; Dewji, Shaheen A.

    As part of a broader effort to calculate effective dose rate coefficients for external exposure to photons and electrons emitted by radionuclides distributed in air, soil or water, age-specific stylized phantoms have been employed to determine dose coefficients relating dose rate to organs and tissues in the body. In this article, dose rate coefficients computed using the International Commission on Radiological Protection reference adult male voxel phantom are compared with values computed using the Oak Ridge National Laboratory adult male stylized phantom in an air submersion exposure geometry. Monte Carlo calculations for both phantoms were performed for monoenergetic source photons in the range of 30 keV to 5 MeV. Furthermore, these calculations largely result in differences under 10% for photon energies above 50 keV, and it can be expected that both models show comparable results for the environmental sources of radionuclides.

  10. A parallel graded-mesh FDTD algorithm for human-antenna interaction problems.

    PubMed

    Catarinucci, Luca; Tarricone, Luciano

    2009-01-01

    The finite difference time domain method (FDTD) is frequently used for the numerical solution of a wide variety of electromagnetic (EM) problems and, among them, those concerning human exposure to EM fields. In many practical cases related to the assessment of occupational EM exposure, large simulation domains are modeled and high space resolution adopted, so that strong memory and central processing unit power requirements have to be satisfied. To better afford the computational effort, the use of parallel computing is a winning approach; alternatively, subgridding techniques are often implemented. However, the simultaneous use of subgridding schemes and parallel algorithms is very new. In this paper, an easy-to-implement and highly-efficient parallel graded-mesh (GM) FDTD scheme is proposed and applied to human-antenna interaction problems, demonstrating its appropriateness in dealing with complex occupational tasks and showing its capability to guarantee the advantages of a traditional subgridding technique without affecting the parallel FDTD performance.
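
    The FDTD method itself rests on leapfrog updates of Maxwell's curl equations on a staggered grid. A serial 1-D sketch in normalized units (the paper's actual contribution, graded meshing combined with parallel domain decomposition, is not reproduced here):

```python
def fdtd_1d(ez, hy, steps, ce=0.5, ch=0.5):
    """Leapfrog updates of the 1-D Yee scheme in normalized units:
    H is advanced from the curl of E, then E from the curl of H.
    Endpoint ez values are held at zero (PEC boundaries)."""
    for _ in range(steps):
        for i in range(len(hy)):
            hy[i] += ch * (ez[i + 1] - ez[i])
        for i in range(1, len(ez) - 1):
            ez[i] += ce * (hy[i] - hy[i - 1])
    return ez, hy

# Launch a point excitation in the middle of a small grid (Courant number 0.5)
ez = [0.0] * 101
hy = [0.0] * 100
ez[50] = 1.0
fdtd_1d(ez, hy, 40)
```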

  11. Virtual Reality-Enhanced Extinction of Phobias and Post-Traumatic Stress.

    PubMed

    Maples-Keller, Jessica L; Yasinski, Carly; Manjin, Nicole; Rothbaum, Barbara Olasov

    2017-07-01

    Virtual reality (VR) refers to an advanced technological communication interface in which the user is actively participating in a computer-generated 3-dimensional virtual world that includes computer sensory input devices used to simulate real-world interactive experiences. VR has been used within psychiatric treatment for anxiety disorders, particularly specific phobias and post-traumatic stress disorder, given several advantages that VR provides for use within treatment for these disorders. Exposure therapy for anxiety disorder is grounded in fear-conditioning models, in which extinction learning involves the process through which conditioned fear responses decrease or are inhibited. The present review will provide an overview of extinction training and anxiety disorder treatment, advantages for using VR within extinction training, a review of the literature regarding the effectiveness of VR within exposure therapy for specific phobias and post-traumatic stress disorder, and limitations and future directions of the extant empirical literature.

  12. Observed differences in upper extremity forces, muscle efforts, postures, velocities and accelerations across computer activities in a field study of office workers.

    PubMed

    Bruno Garza, J L; Eijckelhof, B H W; Johnson, P W; Raina, S M; Rynell, P W; Huysmans, M A; van Dieën, J H; van der Beek, A J; Blatter, B M; Dennerlein, J T

    2012-01-01

    This study, a part of the PRedicting Occupational biomechanics in OFfice workers (PROOF) study, investigated whether there are differences in field-measured forces, muscle efforts, postures, velocities and accelerations across computer activities. These parameters were measured continuously for 120 office workers performing their own work for two hours each. There were differences in nearly all forces, muscle efforts, postures, velocities and accelerations across keyboard, mouse and idle activities. Keyboard activities showed a 50% increase in the median right trapezius muscle effort when compared to mouse activities. Median shoulder rotation changed from 25 degrees internal rotation during keyboard use to 15 degrees external rotation during mouse use. Only keyboard use was associated with median ulnar deviations greater than 5 degrees. Idle activities led to the greatest variability observed in all muscle efforts and postures measured. In future studies, measurements of computer activities could be used to provide information on the physical exposures experienced during computer use. Practitioner Summary: Computer users may develop musculoskeletal disorders due to their force, muscle effort, posture and wrist velocity and acceleration exposures during computer use. We report that many physical exposures are different across computer activities. This information may be used to estimate physical exposures based on patterns of computer activities over time.

  13. High-Resolution Computed Tomography and Pulmonary Function Findings of Occupational Arsenic Exposure in Workers.

    PubMed

    Ergün, Recai; Evcik, Ender; Ergün, Dilek; Ergan, Begüm; Özkan, Esin; Gündüz, Özge

    2017-05-05

    The number of studies in which non-malignant pulmonary diseases are evaluated after occupational arsenic exposure is very small. To investigate the effects of occupational arsenic exposure on the lung by high-resolution computed tomography and pulmonary function tests. Retrospective cross-sectional study. In this study, 256 workers with suspected respiratory occupational arsenic exposure were included, with an average age of 32.9±7.8 years and an average of 3.5±2.7 working years. Hair and urinary arsenic levels were analysed. High-resolution computed tomography and pulmonary function tests were performed. In workers with occupational arsenic exposure, high-resolution computed tomography showed 18.8% pulmonary involvement. Among cases of pulmonary involvement, pulmonary nodules were the most frequent lesion (64.5%); the other findings were diffuse interstitial lung disease (18.8%), bronchiectasis (12.5%), and bullae-emphysema (27.1%). Patients with pulmonary involvement were older and had smoked more. Pulmonary involvement was 5.2 times more frequent in patients with arsenic-related skin lesions. Diffusing capacity of the lung for carbon monoxide was significantly lower in patients with pulmonary involvement. Besides lung cancer, chronic occupational inhalation of arsenic may cause non-malignant pulmonary findings such as bronchiectasis, pulmonary nodules and diffuse interstitial lung disease. Therefore, to detect pulmonary involvement at an early stage, workers with occupational arsenic exposure should be followed up with diffusion testing and high-resolution computed tomography.

  14. The impact of urban open space and 'lift-up' building design on building intake fraction and daily pollutant exposure in idealized urban models.

    PubMed

    Sha, Chenyuan; Wang, Xuemei; Lin, Yuanyuan; Fan, Yifan; Chen, Xi; Hang, Jian

    2018-08-15

    Sustainable urban design is an effective way to improve urban ventilation and reduce vehicular pollutant exposure for urban residents. This paper investigated the impacts of urban open space and 'lift-up' building design on vehicular CO (carbon monoxide) exposure in typical three-dimensional (3D) urban canopy layer (UCL) models under neutral atmospheric conditions. The building intake fraction (IF) represents the fraction of total vehicular pollutant emissions inhaled by residents while they are at home. The building daily CO exposure (Et) represents residents' total contact with CO over one day indoors at home. Computational fluid dynamics (CFD) simulations integrating these two concepts were performed to solve the turbulent flow and assess vehicular CO exposure for urban residents. The CFD technique with the standard k-ε model was validated against wind tunnel data. The initial numerical UCL model consists of 5-row and 5-column (5×5) cubic buildings (building height H = street width W = 30 m) with four approaching wind directions (θ=0°, 15°, 30°, 45°). In Group I, one of the 25 building models is removed to attain urban open space settings. In Group II, the first floor (Lift-up1), second floor (Lift-up2), or third floor (Lift-up3) of all buildings is elevated to create wind pathways through the buildings. Compared to the initial case, urban open space can slightly or significantly reduce pollutant exposure for urban residents. At θ=30° and 45°, open space settings are more effective in reducing pollutant exposure than at θ=0° and 15°. The pollutant dilution near or surrounding open space and in its adjacent downstream regions is usually enhanced. Lift-up1 and Lift-up2 achieve much greater pollutant exposure reduction in all wind directions than Lift-up3 and open space.
Although further investigations are still required to provide practical guidelines, this study is one of the first attempts for reducing urban pollutant exposure by improving urban design. Copyright © 2018. Published by Elsevier B.V.
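
    The building intake fraction is, by definition, the inhaled fraction of the emitted pollutant mass. A minimal sketch of the bookkeeping (all magnitudes hypothetical; the study computes the concentration term with CFD):

```python
def intake_fraction(indoor_conc, breathing_rate, occupants, emission_rate):
    """Building intake fraction iF = N * Q_b * C / E: the fraction of the
    emitted mass inhaled by N residents while at home. Units must cancel,
    e.g. C in µg/m^3, Q_b in m^3/h, E in µg/h."""
    return occupants * breathing_rate * indoor_conc / emission_rate

# Hypothetical magnitudes; urban iF values are typically very small fractions
print(intake_fraction(5.0, 0.6, 100, 3.6e7))
```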

  15. Computational modelling of temperature rises in the eye in the near field of radiofrequency sources at 380, 900 and 1800 MHz

    NASA Astrophysics Data System (ADS)

    Wainwright, P. R.

    2007-07-01

    This paper reports calculations of the temperature rises induced in the eye and lens by near-field exposure to radiation from communication handsets, using the finite difference time domain method and classical bioheat equation. Various models are compared, including the analytic solution for a sphere, a finite element model of an isolated eye and a modern model of the whole head. The role of the blood supply to the choroid in moderating temperature is discussed. Three different frequencies are considered, namely 380 MHz (used by TETRA), and 900 and 1800 MHz (used by GSM mobile phones). At 380 MHz, monopole and helical antennas are compared. An 'equivalent blood flow' is derived for the choroid in order to facilitate comparison of the whole head and isolated eye models. In the whole head model, the heating of the lens receives a significant contribution from energy absorbed outside the eye. The temperature rise in the lens is compared to the ICNIRP-recommended average specific energy absorption rate (SAR) and the SAR averaged over the eye alone. The temperature rise may reach 1.4 °C at the ICNIRP occupational exposure limit if an antenna is placed less than 24 mm from the eye and the exposure is sufficiently prolonged.

  16. Computational modelling of temperature rises in the eye in the near field of radiofrequency sources at 380, 900 and 1800 MHz.

    PubMed

    Wainwright, P R

    2007-06-21

    This paper reports calculations of the temperature rises induced in the eye and lens by near-field exposure to radiation from communication handsets, using the finite difference time domain method and classical bioheat equation. Various models are compared, including the analytic solution for a sphere, a finite element model of an isolated eye and a modern model of the whole head. The role of the blood supply to the choroid in moderating temperature is discussed. Three different frequencies are considered, namely 380 MHz (used by TETRA), and 900 and 1800 MHz (used by GSM mobile phones). At 380 MHz, monopole and helical antennas are compared. An 'equivalent blood flow' is derived for the choroid in order to facilitate comparison of the whole head and isolated eye models. In the whole head model, the heating of the lens receives a significant contribution from energy absorbed outside the eye. The temperature rise in the lens is compared to the ICNIRP-recommended average specific energy absorption rate (SAR) and the SAR averaged over the eye alone. The temperature rise may reach 1.4 degrees C at the ICNIRP occupational exposure limit if an antenna is placed less than 24 mm from the eye and the exposure is sufficiently prolonged.
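
    The classical bioheat equation referred to in these two records is the Pennes formulation, balancing heat conduction, blood perfusion, and the absorbed RF power (SAR) as a source term. A 1-D explicit finite-difference sketch (parameters and discretization illustrative only; the paper couples FDTD-computed SAR to 3-D thermal models):

```python
def pennes_step(T, sar, dx, dt, k, rho, c, w_b, rho_b, c_b, T_b):
    """One explicit step of the 1-D Pennes bioheat equation:
    rho*c*dT/dt = k*d2T/dx2 + rho_b*c_b*w_b*(T_b - T) + rho*SAR.
    Boundary temperatures are held fixed."""
    new = T[:]
    for i in range(1, len(T) - 1):
        conduction = k * (T[i - 1] - 2.0 * T[i] + T[i + 1]) / dx ** 2
        perfusion = rho_b * c_b * w_b * (T_b - T[i])
        new[i] = T[i] + dt * (conduction + perfusion + rho * sar[i]) / (rho * c)
    return new

# With zero SAR and tissue at blood temperature, the field stays at equilibrium
baseline = pennes_step([37.0] * 5, [0.0] * 5, 1e-3, 0.1,
                       0.5, 1000.0, 3600.0, 1e-3, 1060.0, 3600.0, 37.0)
print(baseline)
```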

  17. A Cloud Computing Approach to Personal Risk Management: The Open Hazards Group

    NASA Astrophysics Data System (ADS)

    Graves, W. R.; Holliday, J. R.; Rundle, J. B.

    2010-12-01

    According to the California Earthquake Authority, only about 12% of current California residences are covered by any form of earthquake insurance, down from about 30% in 1996 following the 1994 M6.7 Northridge earthquake. Part of the reason for this decreasing rate of insurance uptake is the high deductible, either 10% or 15% of the value of the structure, and the relatively high cost of the premiums, as much as thousands of dollars per year. The earthquake insurance industry is composed of the CEA, a public-private partnership; modeling companies that produce damage and loss models similar to the FEMA HAZUS model; and financial companies such as the insurance, reinsurance, and investment banking companies in New York, London, the Cayman Islands, Zurich, Dubai, Singapore, and elsewhere. In setting earthquake insurance rates, financial companies rely on models like HAZUS that calculate risk and exposure. In California, the process begins with an official earthquake forecast by the Working Group on California Earthquake Probabilities. Modeling companies use these 30 year earthquake probabilities as inputs to their attenuation and damage models to estimate the possible damage factors from scenario earthquakes. Economic loss is then estimated from processes such as structural failure, lost economic activity, demand surge, and fire following the earthquake. Once the potential losses are known, rates can be set so that a target ruin probability of less than 1% or so can be assured. Open Hazards Group was founded with the idea that the global public might be interested in a personal estimate of earthquake risk, computed using data supplied by the public, with models running in a cloud computing environment. These models process data from the ANSS catalog, updated at least daily, to produce rupture forecasts that are backtested with standard Reliability/Attributes and Receiver Operating Characteristic tests, among others.
Models for attenuation and structural damage are then used in a computationally efficient workflow to produce real-time estimates of damage and loss for individual structures. All models are based on techniques that either have been published in the literature or will soon be published. Using these results, members of the public can gain an appreciation of their risk of exposure to damage from destructive earthquakes, information that has heretofore only been available to a few members of the financial and insurance industries.
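
    The 30-year earthquake probabilities that drive such forecasts are conventionally read as Poisson exceedance probabilities over the time window. A minimal sketch (the rate value is hypothetical):

```python
import math

def exceedance_probability(annual_rate, years):
    """Poisson probability of at least one event in the window,
    the standard interpretation of a 30-year rupture forecast."""
    return 1.0 - math.exp(-annual_rate * years)

# Hypothetical rate of 0.01 events/year over a 30-year window
print(round(exceedance_probability(0.01, 30.0), 3))  # ~0.259
```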

  18. Biomarkers in Computational Toxicology

    EPA Science Inventory

    Biomarkers are a means to evaluate chemical exposure and/or the subsequent impacts on toxicity pathways that lead to adverse health outcomes. Computational toxicology can integrate biomarker data with knowledge of exposure, chemistry, biology, pharmacokinetics, toxicology, and e...

  19. Strengthening the weak link: Built Environment modelling for loss analysis

    NASA Astrophysics Data System (ADS)

    Millinship, I.

    2012-04-01

    Methods to analyse insured losses from a range of natural perils, including pricing by primary insurers and catastrophe modelling by reinsurers, typically lack sufficient exposure information. Understanding the hazard intensity in terms of spatial severity and frequency is only the first step towards quantifying the risk of a catastrophic event. For any given event we need to know: Are any structures affected? What type of buildings are they? How much damage occurred? How much will the repairs cost? To achieve this, detailed exposure information is required to assess the likely damage and to effectively calculate the resultant loss. Modelling exposures in the Built Environment therefore plays as important a role in understanding re/insurance risk as characterising the physical hazard. Across both primary insurance books and aggregated reinsurance portfolios, the location of a property (a risk) and its monetary value is typically known. Exactly what that risk is in terms of detailed property descriptors including structure type and rebuild cost - and therefore its vulnerability to loss - is often omitted. This data deficiency is a primary source of variations between modelled losses and the actual claims value. Built Environment models are therefore required at a high resolution to describe building attributes that relate vulnerability to property damage. However, national-scale household-level datasets are often not computationally practical in catastrophe models and data must be aggregated. In order to provide more accurate risk analysis, we have developed and applied a methodology for Built Environment modelling for incorporation into a range of re/insurance applications, including operational models for different international regions and different perils and covering residential, commercial and industry exposures.
Illustrated examples are presented, including exposure modelling suitable for aggregated reinsurance analysis for the UK and bespoke high resolution modelling for industrial sites in Germany. A range of attributes are included following detailed claims analysis and engineering research with property type, age and condition identified as important differentiators of damage from flood, wind and freeze events.

  20. Can virtual reality improve anatomy education? A randomised controlled study of a computer-generated three-dimensional anatomical ear model.

    PubMed

    Nicholson, Daren T; Chalk, Colin; Funnell, W Robert J; Daniel, Sam J

    2006-11-01

    The use of computer-generated 3-dimensional (3-D) anatomical models to teach anatomy has proliferated. However, there is little evidence that these models are educationally effective. The purpose of this study was to test the educational effectiveness of a computer-generated 3-D model of the middle and inner ear. We reconstructed a fully interactive model of the middle and inner ear from a magnetic resonance imaging scan of a human cadaver ear. To test the model's educational usefulness, we conducted a randomised controlled study in which 28 medical students completed a Web-based tutorial on ear anatomy that included the interactive model, while a control group of 29 students took the tutorial without exposure to the model. At the end of the tutorials, both groups were asked a series of 15 quiz questions to evaluate their knowledge of 3-D relationships within the ear. The intervention group's mean score on the quiz was 83%, while that of the control group was 65%. This difference in means was highly significant (P < 0.001). Our findings stand in contrast to the handful of previous randomised controlled trials that evaluated the effects of computer-generated 3-D anatomical models on learning. The equivocal and negative results of these previous studies may be due to the limitations of these studies (such as small sample size) as well as the limitations of the models that were studied (such as a lack of full interactivity). Given our positive results, we believe that further research is warranted concerning the educational effectiveness of computer-generated anatomical models.

  1. Recent advances in modeling and simulation of the exposure and response of tungsten to fusion energy conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marian, Jaime; Becquart, Charlotte S.; Domain, Christophe

    2017-06-09

    Under the anticipated operating conditions for demonstration magnetic fusion reactors beyond ITER, structural materials will be exposed to unprecedented conditions of irradiation, heat flux, and temperature. While such extreme environments remain inaccessible experimentally, computational modeling and simulation can provide qualitative and quantitative insights into materials response and complement the available experimental measurements with carefully validated predictions. For plasma facing components such as the first wall and the divertor, tungsten (W) has been selected as the best candidate material due to its superior high-temperature and irradiation properties. In this paper we provide a review of recent efforts in computational modeling of W both as a plasma-facing material exposed to He deposition as well as a bulk structural material subjected to fast neutron irradiation. We use a multiscale modeling approach, commonly used as the materials modeling paradigm, to define the outline of the paper and highlight recent advances using several classes of techniques and their interconnection. We highlight several of the most salient findings obtained via computational modeling and point out a number of remaining challenges and future research directions.

  2. Computational Toxicology as Implemented by the US EPA ...

    EPA Pesticide Factsheets

    Computational toxicology is the application of mathematical and computer models to help assess chemical hazards and risks to human health and the environment. Supported by advances in informatics, high-throughput screening (HTS) technologies, and systems biology, the U.S. Environmental Protection Agency (EPA) is developing robust and flexible computational tools that can be applied to the thousands of chemicals in commerce, and contaminant mixtures found in air, water, and hazardous-waste sites. The Office of Research and Development (ORD) Computational Toxicology Research Program (CTRP) is composed of three main elements. The largest component is the National Center for Computational Toxicology (NCCT), which was established in 2005 to coordinate research on chemical screening and prioritization, informatics, and systems modeling. The second element consists of related activities in the National Health and Environmental Effects Research Laboratory (NHEERL) and the National Exposure Research Laboratory (NERL). The third and final component consists of academic centers working on various aspects of computational toxicology and funded by the U.S. EPA Science to Achieve Results (STAR) program. Together these elements form the key components in the implementation of both the initial strategy, A Framework for a Computational Toxicology Research Program (U.S. EPA, 2003), and the newly released The U.S. Environmental Protection Agency's Strategic Plan for Evaluating the T

  3. Estimate of Space Radiation-Induced Cancer Risks for International Space Station Orbits

    NASA Technical Reports Server (NTRS)

    Wu, Honglu; Atwell, William; Cucinotta, Francis A.; Yang, Chui-hsu

    1996-01-01

    Excess cancer risks from exposures to space radiation are estimated for various orbits of the International Space Station (ISS). Organ exposures are computed with the transport codes, BRYNTRN and HZETRN, and the computerized anatomical male and computerized anatomical female models. Cancer risk coefficients in the National Council on Radiation Protection and Measurements report No. 98 are used to generate lifetime excess cancer incidence and cancer mortality after a one-month mission to ISS. The generated data are tabulated to serve as a quick reference for assessment of radiation risk to astronauts on ISS missions.
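
    The risk construction described, organ doses from transport codes combined with NCRP Report No. 98 risk coefficients, amounts to a dose-times-coefficient sum over organs. A hedged sketch with placeholder numbers (not the report's coefficients):

```python
def excess_lifetime_risk(organ_doses_sv, risk_per_sv):
    """Sum of organ dose times organ-specific risk coefficient, the basic
    construction behind NCRP Report No. 98 style estimates."""
    return sum(dose * risk_per_sv[organ] for organ, dose in organ_doses_sv.items())

doses = {"lung": 0.02, "stomach": 0.015}        # Sv, hypothetical mission doses
coeffs = {"lung": 1.0e-2, "stomach": 0.5e-2}    # per Sv, placeholder values
print(excess_lifetime_risk(doses, coeffs))
```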

  4. Radon induced hyperplasia: effective adaptation reducing the local doses in the bronchial epithelium.

    PubMed

    Madas, Balázs G

    2016-09-01

    There is experimental and histological evidence that chronic irritation and cell death may cause hyperplasia in the exposed tissue. As the heterogeneous deposition of inhaled radon progeny results in high local doses at the peaks of bronchial bifurcations, it was proposed earlier that hyperplasia occurs in these deposition hot spots upon chronic radon exposure. The objective of the present study is to quantify how the induction of basal cell hyperplasia modulates the microdosimetric consequences of a given radon exposure. For this purpose, computational epithelium models were constructed with spherical cell nuclei of six different cell types based on histological data. Basal cell hyperplasia was modelled by epithelium models with additional basal cells and increased epithelium thickness. Microdosimetry for alpha-particles was performed with a purpose-built Monte Carlo code. Results show that the average tissue dose, and the average hit number and dose of basal cells, decrease as the degree of hyperplasia increases. Hit and dose distributions reveal that the induction of hyperplasia may result in a basal cell pool which is shielded from alpha-radiation. This highlights that the exposure history affects the microdosimetric consequences of a present exposure, while the biological and health effects may also depend on previous exposures. The induction of hyperplasia can be considered a radioadaptive response at the tissue level. Such an adaptation of the tissue challenges the validity of applying the dose and dose rate effectiveness factor from a mechanistic point of view. As the location of radiosensitive target cells may change due to previous exposures, dosimetry models that assume the tissue geometry of normal conditions may be inappropriate for dose estimation in the case of protracted exposures. As internal exposures are frequently chronic, such changes in tissue geometry may be highly relevant for other incorporated radionuclides.
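
    The hit statistics underlying such microdosimetric results are Poissonian: the number of alpha-particle traversals of a cell nucleus follows a Poisson distribution whose mean scales with fluence and the nucleus' projected area. A minimal sketch (values illustrative):

```python
import math

def hit_probability(mean_hits, n):
    """Poisson probability of exactly n alpha-particle traversals of a cell
    nucleus, with mean_hits = fluence x mean projected nuclear area."""
    return math.exp(-mean_hits) * mean_hits ** n / math.factorial(n)

# A thicker epithelium lowers the mean hit number of shielded basal cells,
# raising the zero-hit probability (numbers illustrative only)
print(round(hit_probability(2.0, 0), 4), round(hit_probability(0.5, 0), 4))
```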

  5. Documentation of the Ecological Risk Assessment Computer Model ECORSK.5

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anthony F. Gallegos; Gilbert J. Gonzales

    1999-06-01

    The FORTRAN 77 ecological risk computer model--ECORSK.5--has been used to estimate the potential toxicity of surficial deposits of radioactive and non-radioactive contaminants to several threatened and endangered (T and E) species at the Los Alamos National Laboratory (LANL). These analyses to date include preliminary toxicity estimates for the Mexican spotted owl, the American peregrine falcon, the bald eagle, and the southwestern willow flycatcher. This work has been performed as required for the Record of Decision for the construction of the Dual Axis Radiographic Hydrodynamic Test (DARHT) Facility at LANL as part of the Environmental Impact Statement. The model depends on the use of the geographic information system and associated software--ARC/INFO--and has been used in conjunction with LANL's Facility for Information Management and Display (FIMAD) contaminant database. The integration of FIMAD data and ARC/INFO using ECORSK.5 allows the generation of spatial information from a gridded area of potential exposure called an Ecological Exposure Unit. ECORSK.5 was used to simulate exposures using a modified Environmental Protection Agency Quotient Method. The model can handle a large number of contaminants within the home range of T and E species. This integration results in the production of hazard indices which, when compared to risk evaluation criteria, estimate the potential for impact from consumption of contaminants in food and ingestion of soil. The assessment is considered a Tier-2 type of analysis. This report summarizes and documents the ECORSK.5 code, the mathematical models used in the development of ECORSK.5, and the input and other requirements for its operation. Other auxiliary FORTRAN 77 codes used for processing and graphing output from ECORSK.5 are also discussed. The reader may refer to reports cited in the introduction to obtain greater detail on past applications of ECORSK.5 and assumptions used in deriving model parameters.
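    At its core, a Quotient Method assessment of this kind sums contaminant-specific hazard quotients (intake divided by a toxicity reference value) into a hazard index. A minimal sketch in Python, with contaminant names, intakes, and reference values that are hypothetical placeholders rather than ECORSK.5 data:

```python
# illustrative daily intakes (mg/kg-day) and toxicity reference values;
# the contaminants and numbers are hypothetical, not ECORSK.5 inputs
intakes = {"Cs-137": 0.02, "Pb": 0.15, "Hg": 0.01}
trvs    = {"Cs-137": 0.10, "Pb": 0.30, "Hg": 0.05}

# hazard quotient per contaminant: intake / toxicity reference value
hazard_quotients = {c: intakes[c] / trvs[c] for c in intakes}

# hazard index: sum over contaminants within the home range
hazard_index = sum(hazard_quotients.values())
```

    An index above 1, compared against risk evaluation criteria, would flag a potential for impact; here the illustrative total is 0.9.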

  6. Computer Model Used to Help Customize Medicine

    NASA Technical Reports Server (NTRS)

    Stauber, Laurel J.; Veris, Jenise

    2001-01-01

    Dr. Radhakrishnan, a researcher at the NASA Glenn Research Center, in collaboration with biomedical researchers at the Case Western Reserve University School of Medicine and Rainbow Babies and Children's Hospital, is developing computational models of human physiology that quantitate metabolism and its regulation, in both healthy and pathological states. These models can help predict the effects of stresses or interventions, such as drug therapies, and contribute to the development of customized medicine. Customized medical treatment protocols can give more comprehensive evaluations and lead to more specific and effective treatments for patients, reducing treatment time and cost. Commercial applications of this research may help the pharmaceutical industry identify therapeutic needs and predict drug-drug interactions. Researchers will be able to study human metabolic reactions to particular treatments while in different environments as well as establish more definite blood metabolite concentration ranges in normal and pathological states. These computational models may help NASA provide the background for developing strategies to monitor and safeguard the health of astronauts and civilians in space stations and colonies. They may also help to develop countermeasures that ameliorate the effects of both acute and chronic space exposure.

  7. Integrating Embedded Computing Systems into High School and Early Undergraduate Education

    ERIC Educational Resources Information Center

    Benson, B.; Arfaee, A.; Choon Kim; Kastner, R.; Gupta, R. K.

    2011-01-01

    Early exposure to embedded computing systems is crucial for students to be prepared for the embedded computing demands of today's world. However, exposure to systems knowledge often comes too late in the curriculum to stimulate students' interests and to provide a meaningful difference in how they direct their choice of electives for future…

  8. Subject-enabled analytics model on measurement statistics in health risk expert system for public health informatics.

    PubMed

    Chung, Chi-Jung; Kuo, Yu-Chen; Hsieh, Yun-Yu; Li, Tsai-Chung; Lin, Cheng-Chieh; Liang, Wen-Miin; Liao, Li-Na; Li, Chia-Ing; Lin, Hsueh-Chun

    2017-11-01

    This study applied open source technology to establish a subject-enabled analytics model that can enhance measurement statistics of case studies with public health data in cloud computing. The infrastructure of the proposed model comprises three domains: 1) the health measurement data warehouse (HMDW) for the case study repository, 2) the self-developed modules of online health risk information statistics (HRIStat) for cloud computing, and 3) the prototype of a Web-based process automation system in statistics (PASIS) for the health risk assessment of case studies with subject-enabled evaluation. The system design employed freeware including Java applications, MySQL, and R packages to drive a health risk expert system (HRES). In the design, the HRIStat modules enforce typical analytics methods for biomedical statistics, and the PASIS interfaces enable process automation of the HRES for cloud computing. The Web-based model supports two modes, step-by-step analysis and an automated computing process, for preliminary evaluation and real-time computation, respectively. The proposed model was evaluated by re-computing prior studies on the epidemiological measurement of diseases caused by either heavy metal exposures in the environment or clinical complications in hospital. The validity of the computations was confirmed against commercial statistics software. The model was installed on a stand-alone computer and on a cloud-server workstation to verify computing performance for more than 230K data sets. Both setups reached an efficiency of about 10^5 sets per second. The Web-based PASIS interface can be used for cloud computing, and the HRIStat module can be flexibly expanded with advanced subjects for measurement statistics. The analytics procedure of the HRES prototype is capable of providing assessment criteria prior to estimating the potential risk to public health. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. Relationships (II) of International Classification of High-resolution Computed Tomography for Occupational and Environmental Respiratory Diseases with ventilatory functions indices for parenchymal abnormalities.

    PubMed

    Tamura, Taro; Suganuma, Narufumi; Hering, Kurt G; Vehmas, Tapio; Itoh, Harumi; Akira, Masanori; Takashima, Yoshihiro; Hirano, Harukazu; Kusaka, Yukinori

    2015-01-01

    The International Classification of High-Resolution Computed Tomography (HRCT) for Occupational and Environmental Respiratory Diseases (ICOERD) is used to screen and diagnose respiratory illnesses. Using univariate and multivariate analysis, we investigated the relationship between subject characteristics and parenchymal abnormalities graded according to ICOERD and the results of ventilatory function tests (VFT). Thirty-five patients with and 27 controls without mineral-dust exposure underwent VFT and HRCT. We recorded all subjects' occupational history of mineral dust exposure and smoking history. Experts independently assessed HRCT using the ICOERD parenchymal abnormality (Items) grades for well-defined rounded opacities (RO), linear and/or irregular opacities (IR), and emphysema (EM). High-resolution computed tomography showed that 11 patients had RO; 15 patients, IR; and 19 patients, EM. According to the multiple regression model, age and height had significant associations with many ventilatory function indices, such as vital capacity, forced vital capacity, and forced expiratory volume in 1 s (FEV1). The EM summed grades for the upper, middle, and lower zones of the right and left lungs also had significant associations with FEV1 and the maximum mid-expiratory flow rate. The results suggest the ICOERD notation is adequate, based on the good and significant multiple regression modeling of ventilatory function with the EM summed grades.

  10. Occupational Exposures and Subclinical Interstitial Lung Disease. The MESA (Multi-Ethnic Study of Atherosclerosis) Air and Lung Studies.

    PubMed

    Sack, Coralynn S; Doney, Brent C; Podolanczuk, Anna J; Hooper, Laura G; Seixas, Noah S; Hoffman, Eric A; Kawut, Steven M; Vedal, Sverre; Raghu, Ganesh; Barr, R Graham; Lederer, David J; Kaufman, Joel D

    2017-10-15

    The impact of a broad range of occupational exposures on subclinical interstitial lung disease (ILD) has not been studied. To determine whether occupational exposures to vapors, gas, dust, and fumes (VGDF) are associated with high-attenuation areas (HAA) and interstitial lung abnormalities (ILA), which are quantitative and qualitative computed tomography (CT)-based measurements of subclinical ILD, respectively. We performed analyses of participants enrolled in MESA (Multi-Ethnic Study of Atherosclerosis), a population-based cohort aged 45-84 years at recruitment. HAA was measured at baseline and on serial cardiac CT scans in 5,702 participants. ILA was ascertained in a subset of 2,312 participants who underwent full-lung CT scanning at 10-year follow-up. Occupational exposures were assessed by self-reported VGDF exposure and by job-exposure matrix (JEM). Linear mixed models and logistic regression were used to determine whether occupational exposures were associated with log-transformed HAA and ILA. Models were adjusted for age, sex, race/ethnicity, education, employment status, tobacco use, and scanner technology. Each JEM score increment in VGDF exposure was associated with 2.64% greater HAA (95% confidence interval [CI], 1.23-4.19%). Self-reported vapors/gas exposure was associated with an increased odds of ILA among those currently employed (1.76-fold; 95% CI, 1.09-2.84) and those less than 65 years old (1.97-fold; 95% CI, 1.16-3.35). There was no consistent evidence that occupational exposures were associated with progression of HAA over the follow-up period. JEM-assigned and self-reported exposures to VGDF were associated with measurements of subclinical ILD in community-dwelling adults.

  11. Development and application of a catchment scale pesticide fate and transport model for use in drinking water risk assessment.

    PubMed

    Pullan, S P; Whelan, M J; Rettino, J; Filby, K; Eyre, S; Holman, I P

    2016-09-01

    This paper describes the development and application of IMPT (Integrated Model for Pesticide Transport), a parameter-efficient tool for predicting diffuse-source pesticide concentrations in surface waters used for drinking water supply. The model was applied to a small UK headwater catchment with high frequency (8 h) pesticide monitoring data and to five larger catchments (479-1653 km²) with sampling approximately every 14 days. Model performance was good for predictions of both flow (Nash Sutcliffe Efficiency generally >0.59 and PBIAS <10%) and pesticide concentrations, although low sampling frequency in the larger catchments is likely to mask the true episodic nature of exposure. The computational efficiency of the model, along with the fact that most of its parameters can be derived from existing national soil property data, mean that it can be used to rapidly predict pesticide exposure in multiple surface water resources to support operational and strategic risk assessments. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
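    The two performance metrics quoted above have standard definitions that are easy to reproduce. A short sketch with made-up flow values (this uses one common sign convention for PBIAS; conventions differ between authors):

```python
def nse(obs, sim):
    """Nash-Sutcliffe Efficiency: 1 is a perfect fit, 0 means the model
    is no better than predicting the observed mean."""
    mean_obs = sum(obs) / len(obs)
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1 - num / den

def pbias(obs, sim):
    """Percent bias of simulated relative to observed values
    (sign convention varies between authors)."""
    return 100 * sum(o - s for o, s in zip(obs, sim)) / sum(obs)

# hypothetical observed and simulated daily flows
obs = [1.0, 2.0, 3.0, 4.0, 5.0]
sim = [1.1, 1.9, 3.2, 3.8, 5.1]
```

    For these invented values the fit would comfortably clear the thresholds quoted in the abstract (NSE > 0.59, |PBIAS| < 10%).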

  12. Large scale study on the variation of RF energy absorption in the head & brain regions of adults and children and evaluation of the SAM phantom conservativeness.

    PubMed

    Keshvari, J; Kivento, M; Christ, A; Bit-Babik, G

    2016-04-21

    This paper presents the results of two large-scale computational studies using highly realistic exposure scenarios, MRI-based human head and hand models, and two mobile phone models. The objectives are (i) to study the relevance of age when people are exposed to RF by comparing adult and child heads, and (ii) to analyze and discuss the conservativeness of the SAM phantom for all age groups. Representative use conditions were simulated using detailed CAD models of two mobile phones operating between 900 MHz and 1950 MHz, including configurations with the hand holding the phone, which were not considered in most previous studies. The peak spatial-average specific absorption rate (psSAR) in the head and the pinna tissues is assessed using anatomically accurate head and hand models. The first of the two studies involved nine head, four hand, and two phone models; the second study included six head, four hand, and three simplified phone models (over 400 configurations in total). In addition, both studies evaluated the exposure using the SAM phantom. Results show no systematic differences between psSAR induced in the adult and child heads. The exposure level and its variation for different age groups may be different for particular phones, but no correlation between psSAR and model age was found. The psSAR from all exposure conditions was compared to the corresponding configurations using SAM, which was found to be conservative in the large majority of cases.

  13. Large scale study on the variation of RF energy absorption in the head & brain regions of adults and children and evaluation of the SAM phantom conservativeness

    NASA Astrophysics Data System (ADS)

    Keshvari, J.; Kivento, M.; Christ, A.; Bit-Babik, G.

    2016-04-01

    This paper presents the results of two large-scale computational studies using highly realistic exposure scenarios, MRI-based human head and hand models, and two mobile phone models. The objectives are (i) to study the relevance of age when people are exposed to RF by comparing adult and child heads, and (ii) to analyze and discuss the conservativeness of the SAM phantom for all age groups. Representative use conditions were simulated using detailed CAD models of two mobile phones operating between 900 MHz and 1950 MHz, including configurations with the hand holding the phone, which were not considered in most previous studies. The peak spatial-average specific absorption rate (psSAR) in the head and the pinna tissues is assessed using anatomically accurate head and hand models. The first of the two studies involved nine head, four hand, and two phone models; the second study included six head, four hand, and three simplified phone models (over 400 configurations in total). In addition, both studies evaluated the exposure using the SAM phantom. Results show no systematic differences between psSAR induced in the adult and child heads. The exposure level and its variation for different age groups may be different for particular phones, but no correlation between psSAR and model age was found. The psSAR from all exposure conditions was compared to the corresponding configurations using SAM, which was found to be conservative in the large majority of cases.

  14. The ability of non-computer tasks to increase biomechanical exposure variability in computer-intensive office work.

    PubMed

    Barbieri, Dechristian França; Srinivasan, Divya; Mathiassen, Svend Erik; Nogueira, Helen Cristina; Oliveira, Ana Beatriz

    2015-01-01

    Postures and muscle activity in the upper body were recorded from 50 academic office workers during 2 hours of normal work, categorised by observation into computer work (CW) and three non-computer (NC) tasks (NC seated work, NC standing/walking work, and breaks). NC tasks differed significantly in exposures from CW, with standing/walking NC tasks representing the largest contrasts for most of the exposure variables. For the majority of workers, exposure variability was larger in their present job than in CW alone, as measured by the job variance ratio (JVR), i.e. the ratio between minute-to-minute variabilities in the job and in CW. Calculations of JVRs for simulated jobs containing different proportions of CW showed that variability could, indeed, be increased by redistributing available tasks, but that substantial increases could only be achieved by introducing more vigorous tasks into the job, in this case illustrated by cleaning.
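    The job variance ratio can be sketched numerically. Below, hypothetical minute-level exposure samples for CW and a more vigorous NC task are pooled into a simulated job; the JVR is taken here as a ratio of standard deviations, as a stand-in for the paper's exact variability metric:

```python
import random
import statistics

rng = random.Random(0)

# hypothetical minute-by-minute exposure samples (e.g. %MVC) per task type;
# means and spreads are invented for illustration
cw       = [rng.gauss(5, 1)  for _ in range(120)]   # computer work
standing = [rng.gauss(15, 4) for _ in range(30)]    # NC standing/walking work

# simulated job: mostly CW plus a block of the more vigorous NC task
job = cw + standing

# job variance ratio: variability in the whole job vs. in CW alone
jvr = statistics.stdev(job) / statistics.stdev(cw)
```

    Mixing in the contrasting task pushes the JVR well above 1, which is the pattern the study reports for jobs that add vigorous NC work to computer-intensive schedules.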

  15. Verification, Validation and Credibility Assessment of a Computational Model of the Advanced Resistive Exercise Device (ARED)

    NASA Technical Reports Server (NTRS)

    Werner, C. R.; Humphreys, B. T.; Mulugeta, L.

    2014-01-01

    The Advanced Resistive Exercise Device (ARED) is the resistive exercise device used by astronauts on the International Space Station (ISS) to mitigate bone loss and muscle atrophy due to extended exposure to microgravity (micro g). The Digital Astronaut Project (DAP) has developed a multi-body dynamics biomechanics model of the ARED for use in spaceflight exercise physiology research and operations. In an effort to advance the maturity and credibility of the ARED model, the DAP performed a verification, validation and credibility (VV and C) assessment of the model in accordance with NASA-STD-7009, 'Standards for Models and Simulations'.

  16. Decompression sickness after air break in prebreathe described with a survival model.

    PubMed

    Conkin, Johnny

    2011-06-01

    A perception exists in aerospace that a brief interruption of a 100% oxygen prebreathe (PB) by breathing air has a substantial decompression sickness (DCS) consequence. The consequences of an air break during PB on subsequent hypobaric DCS outcomes were evaluated. The hypothesis was that asymmetrical, not symmetrical, nitrogen (N2) kinetics best modeled the distribution of subsequent DCS survival times after PBs that included air breaks. DCS survival times from 95 controls for a 60-min PB prior to 2- or 4-h exposures to 4.37 psia (9144 m; 30,000 ft) were analyzed along with 3 experimental conditions: a 10-min air break (N = 40), a 20-min air break (N = 40), or a 60-min air break (N = 32) 30 min into the PB, followed by 30 min of PB. Ascent rate was 1524 m x min(-1) and all 207 exposures included light exercise at 4.37 psia. Various computations of decompression dose were evaluated: either the difference or the ratio of P1N2 and P2, where P1N2 was the computed tissue N2 pressure accounting for the PB and P2 was the altitude pressure. Survival times were described with an accelerated log logistic model with asymmetrical N2 kinetics, with P1N2 - P2 as the best decompression dose. Exponential N2 uptake during the air break was described with a 10-min half time and N2 elimination during PB with a 60-min half time. A simple conclusion about compensation for an air break is not possible because the duration and location of a break in a PB vary. The resulting survival model is used to compute the additional PB time needed to compensate for an air break in PB within the range of tested conditions.
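    The asymmetrical kinetics can be illustrated with a single-compartment sketch (illustrative numbers only; the published model's parameters and units may differ). Tissue N2 pressure decays toward zero during O2 breathing with a 60-min half time and rises toward the inspired air N2 pressure with a 10-min half time:

```python
import math

def update_pn2(p_start, p_inspired_n2, minutes, half_time):
    """Single-compartment exponential approach of tissue N2 pressure
    toward the inspired N2 pressure with the given half-time."""
    k = math.log(2) / half_time
    return p_inspired_n2 + (p_start - p_inspired_n2) * math.exp(-k * minutes)

AIR_N2 = 11.6  # approximate inspired N2 partial pressure on air at sea level, psia

# 30 min of O2 PB, a 20-min air break, then 30 more min of O2 PB,
# with asymmetric kinetics: 10-min half time for uptake, 60-min for elimination
p = AIR_N2
p = update_pn2(p, 0.0, 30, half_time=60)     # elimination on 100% O2
p = update_pn2(p, AIR_N2, 20, half_time=10)  # rapid uptake during the air break
p = update_pn2(p, 0.0, 30, half_time=60)     # elimination on 100% O2

# decompression dose P1N2 - P2 at the 4.37 psia altitude exposure
dose = p - 4.37

# for comparison: an uninterrupted 60-min PB
p_no_break = update_pn2(AIR_N2, 0.0, 60, half_time=60)
```

    Because uptake is much faster than elimination, the interrupted PB ends with a higher tissue N2 pressure, and hence a larger decompression dose, than the uninterrupted 60-min PB.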

  17. The Evolving MCART Multimodal Imaging Core: Establishing a protocol for Computed Tomography and Echocardiography in the Rhesus macaque to perform longitudinal analysis of radiation-induced organ injury

    PubMed Central

    de Faria, Eduardo B.; Barrow, Kory R.; Ruehle, Bradley T.; Parker, Jordan T.; Swartz, Elisa; Taylor-Howell, Cheryl; Kieta, Kaitlyn M.; Lees, Cynthia J.; Sleeper, Meg M.; Dobbin, Travis; Baron, Adam D.; Mohindra, Pranshu; MacVittie, Thomas J.

    2015-01-01

    Computed Tomography (CT) and Echocardiography (EC) are two imaging modalities that produce critical longitudinal data that can be analyzed for radiation-induced organ-specific injury to the lung and heart. The Medical Countermeasures Against Radiological Threats (MCART) consortium has a well-established animal model research platform that includes nonhuman primate (NHP) models of the acute radiation syndrome and the delayed effects of acute radiation exposure. These models call for a definition of the latency, incidence, severity, duration, and resolution of different organ-specific radiation-induced subsyndromes. The pulmonary subsyndromes and cardiac effects are a pair of inter-dependent syndromes impacted by exposure to potentially lethal doses of radiation. Establishing a connection between these will reveal important information about their interaction and the progression of injury and recovery. Herein, we demonstrate the use of CT and EC data in rhesus macaque models to define delayed organ injury, thereby: a) establishing consistent and reliable methodology to assess radiation-induced damage to the lung and heart, b) building an extensive database in normal age-matched NHP for key primary and secondary endpoints, c) identifying problematic variables in imaging techniques and proposing solutions to maintain data integrity, and d) initiating longitudinal analysis of potentially lethal radiation-induced damage to the lung and heart. PMID:26425907

  18. In vitro dosimetry of agglomerates

    NASA Astrophysics Data System (ADS)

    Hirsch, V.; Kinnear, C.; Rodriguez-Lorenzo, L.; Monnier, C. A.; Rothen-Rutishauser, B.; Balog, S.; Petri-Fink, A.

    2014-06-01

    Agglomeration of nanoparticles in biological fluids is a pervasive phenomenon that leads to difficulty in the interpretation of results from in vitro exposure, primarily due to differing particokinetics of agglomerates to nanoparticles. Therefore, well-defined small agglomerates were designed that possessed different particokinetic profiles, and their cellular uptake was compared to a computational model of dosimetry. The approach used here paves the way for a better understanding of the impact of agglomeration on the nanoparticle-cell interaction. Electronic supplementary information (ESI) available: ITC data for tiopronin/Au-NP interactions, agglomeration kinetics at different pHs for tiopronin-coated Au-NPs, UV-Vis spectra in water, PBS and DMEM and temporal correlation functions for single Au-NPs and corresponding agglomerates, calculation of diffusion and sedimentation parameters, modelling of relative cell uptake based on the ISDD model and cytotoxicity of single Au-NPs and their agglomerates, and synthesis and cell uptake of large spherical Au-NPs. See DOI: 10.1039/c4nr00460d

  19. Optimal segmentation and packaging process

    DOEpatents

    Kostelnik, Kevin M.; Meservey, Richard H.; Landon, Mark D.

    1999-01-01

    A process for improving packaging efficiency uses three-dimensional, computer-simulated models with various optimization algorithms to determine the optimal segmentation process and packaging configurations based on constraints including container limitations. The present invention is applied to a process for decontaminating, decommissioning (D&D), and remediating a nuclear facility involving the segmentation and packaging of contaminated items in waste containers in order to minimize the number of cuts, maximize packaging density, and reduce worker radiation exposure. A three-dimensional, computer-simulated facility model of the contaminated items is created. The contaminated items are differentiated. The optimal location, orientation, and sequence of the segmentation and packaging of the contaminated items is determined using the simulated model, the algorithms, and various constraints including container limitations. The cut locations and orientations are transposed to the simulated model. The contaminated items are actually segmented and packaged. The segmentation and packaging may be simulated beforehand. In addition, the contaminated items may be cataloged and recorded.

  20. Invited commentary: G-computation--lost in translation?

    PubMed

    Vansteelandt, Stijn; Keiding, Niels

    2011-04-01

    In this issue of the Journal, Snowden et al. (Am J Epidemiol. 2011;173(7):731-738) give a didactic explanation of G-computation as an approach for estimating the causal effect of a point exposure. The authors of the present commentary reinforce the idea that this use of G-computation is equivalent to a particular form of model-based standardization, whereby reference is made to the observed study population, a technique that epidemiologists have been applying for several decades. They comment on the use of standardized versus conditional effect measures and on the relative predominance of the inverse probability-of-treatment weighting approach as opposed to G-computation. They further propose a compromise approach, doubly robust standardization, that combines the benefits of both of these causal inference techniques and is not more difficult to implement.
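    The mechanics of G-computation as model-based standardization can be shown in a few lines: fit an outcome model (here a saturated one, i.e. cell means), predict every subject's outcome under each exposure level, and average over the observed confounder distribution. The data values below are invented purely for illustration:

```python
# toy data: tuples of (exposure A, confounder L, binary outcome Y)
data = [(1, 1, 1), (1, 1, 1), (1, 0, 1), (1, 0, 0),
        (0, 1, 1), (0, 1, 0), (0, 0, 0), (0, 0, 0)]

def cell_mean(a, l):
    """Saturated outcome model: mean of Y within the (A, L) cell."""
    ys = [y for (A, L, y) in data if A == a and L == l]
    return sum(ys) / len(ys)

# G-computation: predict each subject's outcome with A set to 1 and to 0,
# then average over the observed distribution of L (the study population)
risk1 = sum(cell_mean(1, L) for (_, L, _) in data) / len(data)
risk0 = sum(cell_mean(0, L) for (_, L, _) in data) / len(data)

effect = risk1 - risk0  # standardized (marginal) risk difference
```

    Averaging over the observed L distribution is exactly the "reference is made to the observed study population" step the commentary highlights.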

  1. Neutron Transport Models and Methods for HZETRN and Coupling to Low Energy Light Ion Transport

    NASA Technical Reports Server (NTRS)

    Blattnig, S.R.; Slaba, T.C.; Heinbockel, J.H.

    2008-01-01

    Exposure estimates inside space vehicles, surface habitats, and high altitude aircraft exposed to space radiation are highly influenced by secondary neutron production. The deterministic transport code HZETRN has been identified as a reliable and efficient tool for such studies, but improvements to the underlying transport models and numerical methods are still necessary. In this paper, the forward-backward (FB) and directionally coupled forward-backward (DC) neutron transport models are derived, numerical methods for the FB model are reviewed, and a computationally efficient numerical solution is presented for the DC model. Both models are compared to the Monte Carlo codes HETC-HEDS and FLUKA, and the DC model is shown to agree closely with the Monte Carlo results. Finally, it is found in the development of either model that decoupling low energy neutrons from the light ion (A < 4) transport procedure adversely affects low energy light ion fluence spectra and exposure quantities. A first order correction is presented to resolve the problem, and it is shown to be both accurate and efficient.

  2. High-resolution ophthalmic imaging system

    DOEpatents

    Olivier, Scot S.; Carrano, Carmen J.

    2007-12-04

    A system for providing an improved resolution retina image comprising an imaging camera for capturing a retina image and a computer system operatively connected to the imaging camera, the computer producing short exposures of the retina image and providing speckle processing of the short exposures to provide the improved resolution retina image. The system comprises the steps of capturing a retina image, producing short exposures of the retina image, and speckle processing the short exposures of the retina image to provide the improved resolution retina image.

  3. Patient dose, gray level and exposure index with a computed radiography system

    NASA Astrophysics Data System (ADS)

    Silva, T. R.; Yoshimura, E. M.

    2014-02-01

    Computed radiography (CR) is gradually replacing the conventional screen-film system in Brazil. To assess image quality, manufacturers provide the calculation of an exposure index through the acquisition software of the CR system. The objective of this study is to verify whether the CR image can also be used to evaluate patient absorbed dose, through a relationship between the entrance skin dose and the exposure index or the gray level values obtained in the image. The CR system used for this study (Agfa model 30-X with NX acquisition software) calculates an exposure index called Log of the Median (lgM), related to the absorbed dose to the IP. The lgM value depends on the average gray level (called the Scan Average Level, SAL) of the segmented pixel value histogram of the whole image. A Rando male phantom was used to simulate a human body (chest and head) and was irradiated with X-ray equipment, using usual radiologic techniques for chest exams. Thermoluminescent dosimeters (LiF, TLD-100) were used to evaluate entrance skin dose and exit dose. The results showed a logarithmic relation between entrance dose and SAL in the image center, regardless of the beam filtration. The exposure index varies linearly with the entrance dose, but the angular coefficient is beam quality dependent. We conclude that, with an adequate calibration, the CR system can be used to evaluate patient absorbed dose.

  4. Cohort mortality study of garment industry workers exposed to formaldehyde: update and internal comparisons.

    PubMed

    Meyers, Alysha R; Pinkerton, Lynne E; Hein, Misty J

    2013-09-01

    To further evaluate the association between formaldehyde and leukemia, we extended follow-up through 2008 for a cohort mortality study of 11,043 US formaldehyde-exposed garment workers. We computed standardized mortality ratios and standardized rate ratios stratified by year of first exposure, exposure duration, and time since first exposure. Associations between exposure duration and rates of leukemia and myeloid leukemia were further examined using Poisson regression models. Compared to the US population, myeloid leukemia mortality was elevated but overall leukemia mortality was not. In internal analyses, overall leukemia mortality increased with increasing exposure duration and this trend was statistically significant. We continue to see limited evidence of an association between formaldehyde and leukemia. However, the extended follow-up did not strengthen previously observed associations. In addition to continued epidemiologic research, we recommend further research to evaluate the biological plausibility of a causal relation between formaldehyde and leukemia. Copyright © 2013 Wiley Periodicals, Inc.

  5. Phantom dosimetry and image quality of i-CAT FLX cone-beam computed tomography

    PubMed Central

    Ludlow, John B.; Walker, Cameron

    2013-01-01

    Introduction: Increasing use of cone-beam computed tomography in orthodontics has been coupled with heightened concern about the long-term risks of x-ray exposure in orthodontic populations. An industry response has been to offer low-exposure alternative scanning options in newer cone-beam computed tomography models. Methods: Effective doses resulting from various combinations of field size and field location, comparing child and adult anthropomorphic phantoms with the recently introduced i-CAT FLX cone-beam computed tomography unit, were measured with optically stimulated luminescence dosimetry using previously validated protocols. Scan protocols included High Resolution (360° rotation, 600 image frames, 120 kVp, 5 mA, 7.4 sec), Standard (360°, 300 frames, 120 kVp, 5 mA, 3.7 sec), QuickScan (180°, 160 frames, 120 kVp, 5 mA, 2 sec), and QuickScan+ (180°, 160 frames, 90 kVp, 3 mA, 2 sec). Contrast-to-noise ratio (CNR) was calculated as a quantitative measure of image quality for the various exposure options using the QUART DVT phantom. Results: Child phantom doses were on average 36% greater than adult phantom doses. QuickScan+ protocols resulted in significantly lower doses than Standard protocols for the child (p = 0.0167) and adult (p = 0.0055) phantoms. The 13 × 16 cm cephalometric fields of view ranged from 11-85 μSv in the adult phantom and 18-120 μSv in the child for the QuickScan+ and Standard protocols, respectively. CNR was reduced by approximately two-thirds comparing QuickScan+ to Standard exposure parameters. Conclusions: QuickScan+ effective doses are comparable to conventional panoramic examinations. Significant dose reductions are accompanied by significant reductions in image quality. However, this trade-off may be acceptable for certain diagnostic tasks, such as interim assessment of treatment results. PMID:24286904
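    Contrast-to-noise ratio as used above is typically computed from mean pixel values in two regions of interest and the noise in a uniform background region; one common form is sketched below with invented pixel values (QA protocols differ in the exact definition):

```python
import statistics

def cnr(roi_a, roi_b, background):
    """Contrast-to-noise ratio: absolute difference of ROI means divided
    by the standard deviation of a uniform background region. This is one
    common form; phantom QA protocols vary in the exact definition."""
    contrast = abs(statistics.mean(roi_a) - statistics.mean(roi_b))
    noise = statistics.stdev(background)
    return contrast / noise

# invented pixel samples from two contrast inserts and a uniform region
roi_a = [120, 122, 118, 121]
roi_b = [80, 79, 82, 81]
background = [100, 102, 98, 101, 99]
```

    Halving the contrast or doubling the background noise halves the CNR, which is why the low-dose QuickScan+ protocol, with fewer frames and lower kVp/mA, shows a markedly reduced CNR.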

  6. Computed tomography assessment of peripubertal craniofacial morphology in a sheep model of binge alcohol drinking in the first trimester.

    PubMed

    Birch, Sharla M; Lenox, Mark W; Kornegay, Joe N; Shen, Li; Ai, Huisi; Ren, Xiaowei; Goodlett, Charles R; Cudd, Tim A; Washburn, Shannon E

    2015-11-01

    Identification of facial dysmorphology is essential for the diagnosis of fetal alcohol syndrome (FAS); however, most children with fetal alcohol spectrum disorders (FASD) do not meet the dysmorphology criterion. Additional objective indicators are needed to help identify the broader spectrum of children affected by prenatal alcohol exposure. Computed tomography (CT) was used in a sheep model of prenatal binge alcohol exposure to test the hypothesis that quantitative measures of craniofacial bone volumes and linear distances could identify alcohol-exposed lambs. Pregnant sheep were randomly assigned to four groups: heavy binge alcohol, 2.5 g/kg/day (HBA); binge alcohol, 1.75 g/kg/day (BA); saline control (SC); and normal control (NC). Intravenous alcohol (BA; HBA) or saline (SC) infusions were given three consecutive days per week from gestation day 4-41, and a CT scan was performed on postnatal day 182. The volumes of eight skull bones, cranial circumference, and 19 linear measures of the face and skull were compared among treatment groups. Lambs from both alcohol groups showed significant reduction in seven of the eight skull bones and total skull bone volume, as well as cranial circumference. Alcohol exposure also decreased four of the 19 craniofacial measures. Discriminant analysis showed that alcohol-exposed and control lambs could be classified with high accuracy based on total skull bone volume, frontal, parietal, or mandibular bone volumes, cranial circumference, or interorbital distance. Total skull volume was significantly more sensitive than cranial circumference in identifying the alcohol-exposed lambs when alcohol-exposed lambs were classified using the typical FAS diagnostic cutoff of ≤10th percentile. 
This first demonstration of the usefulness of CT-derived craniofacial measures in a sheep model of FASD following binge-like alcohol exposure during the first trimester suggests that volumetric measurement of cranial bones may be a novel biomarker for binge alcohol exposure during the first trimester to help identify non-dysmorphic children with FASD. Copyright © 2015 Elsevier Inc. All rights reserved.

  7. Nanosecond laser ablation of target Al in a gaseous medium: explosive boiling

    NASA Astrophysics Data System (ADS)

    Mazhukin, V. I.; Mazhukin, A. V.; Demin, M. M.; Shapranov, A. V.

    2018-03-01

    An approximate mathematical description of the processes of homogeneous nucleation and homogeneous evaporation (explosive boiling) of a metal target (Al) under the influence of ns laser radiation is proposed in the framework of the hydrodynamic model. Within the continuum approach, a multi-phase, multi-front hydrodynamic model and a computational algorithm are designed to simulate nanosecond laser ablation of the metal targets immersed in gaseous media. The proposed approach is intended for modeling and detailed analysis of the mechanisms of heterogeneous and homogeneous evaporation and their interaction with each other. It is shown that the proposed model and computational algorithm allow modeling of interrelated mechanisms of heterogeneous and homogeneous evaporation of metals, manifested in the form of pulsating explosive boiling. Modeling has shown that explosive evaporation in metals is due to the presence of a near-surface temperature maximum. It has been established that in nanosecond pulsed laser ablation, such exposure regimes can be implemented in which phase explosion is the main mechanism of material removal.

  8. The Impact of Iodide-Mediated Ozone Deposition and ...

    EPA Pesticide Factsheets

    The air quality of many large coastal areas in the United States is affected by the confluence of polluted urban and relatively clean marine airmasses, each with distinct atmospheric chemistry. In this context, the role of iodide-mediated ozone (O3) deposition over seawater and marine halogen chemistry, accounted for in both the lateral boundary conditions and the coastal waters surrounding the continental U.S., is examined using the Community Multiscale Air Quality (CMAQ) model. Several nested simulations are conducted in which these halogen processes are implemented separately in the continental U.S. and hemispheric CMAQ domains, the latter providing lateral boundary conditions for the former. Overall, it is the combination of these processes within both the continental U.S. domain and the lateral boundary conditions that leads to the largest reductions in modeled surface O3 concentrations. Predicted reductions in surface O3 concentrations occur mainly along the coast, where CMAQ typically has large overpredictions. These results suggest that a realistic representation of halogen processes in marine regions can improve model prediction of O3 concentrations near the coast. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and

  9. The EPA Comptox Chemistry Dashboard: A Web-Based Data ...

    EPA Pesticide Factsheets

    The U.S. Environmental Protection Agency (EPA) Computational Toxicology Program integrates advances in biology, chemistry, and computer science to help prioritize chemicals for further research based on potential human health risks. This work involves computational and data driven approaches that integrate chemistry, exposure and biological data. As an outcome of these efforts the National Center for Computational Toxicology (NCCT) has measured, assembled and delivered an enormous quantity and diversity of data for the environmental sciences including high-throughput in vitro screening data, in vivo and functional use data, exposure models and chemical databases with associated properties. A series of software applications and databases have been produced over the past decade to deliver these data but recent developments have focused on the development of a new software architecture that assembles the resources into a single platform. A new web application, the CompTox Chemistry Dashboard provides access to data associated with ~720,000 chemical substances. These data include experimental and predicted physicochemical property data, bioassay screening data associated with the ToxCast program, product and functional use information and a myriad of related data of value to environmental scientists. The dashboard provides chemical-based searching based on chemical names, synonyms and CAS Registry Numbers. Flexible search capabilities allow for chemical identificati

  10. The EPA CompTox Chemistry Dashboard - an online resource ...

    EPA Pesticide Factsheets

    The U.S. Environmental Protection Agency (EPA) Computational Toxicology Program integrates advances in biology, chemistry, and computer science to help prioritize chemicals for further research based on potential human health risks. This work involves computational and data-driven approaches that integrate chemistry, exposure and biological data. As an outcome of these efforts the National Center for Computational Toxicology (NCCT) has measured, assembled and delivered an enormous quantity and diversity of data for the environmental sciences including high-throughput in vitro screening data, in vivo and functional use data, exposure models and chemical databases with associated properties. A series of software applications and databases have been produced over the past decade to deliver these data. Recent work has focused on the development of a new architecture that assembles the resources into a single platform. With a focus on delivering access to open data streams, web service integration, and a user-friendly web application, the CompTox Dashboard provides access to data associated with ~720,000 chemical substances. These data include research data in the form of bioassay screening data associated with the ToxCast program, experimental and predicted physicochemical properties, product and functional use information and related data of value to environmental scientists. This presentation will provide an overview of the CompTox Dashboard and its va

  11. Long-term Exposure to Fine Particulate Matter Air Pollution and Mortality Among Canadian Women.

    PubMed

    Villeneuve, Paul J; Weichenthal, Scott A; Crouse, Daniel; Miller, Anthony B; To, Teresa; Martin, Randall V; van Donkelaar, Aaron; Wall, Claus; Burnett, Richard T

    2015-07-01

    Long-term exposure to fine particulate matter (PM2.5) has been associated with increased mortality, especially from cardiovascular disease. There are, however, uncertainties about the nature of the exposure-response relation at lower concentrations. In Canada, where ambient air pollution levels are substantially lower than in most other countries, there have been few attempts to study associations between long-term exposure to PM2.5 and mortality. We present a prospective cohort analysis of 89,248 women who enrolled in the Canadian National Breast Screening Study between 1980 and 1985, and for whom residential measures of PM2.5 could be assigned. We derived individual-level estimates of long-term exposure to PM2.5 from satellite observations. We linked cohort records to national mortality data to ascertain mortality between 1980 and 2005. We used Cox proportional hazards models to characterize associations between PM2.5 and several causes of death. The hazard ratios (HRs) and 95% confidence intervals (CIs) computed from these models were adjusted for several individual and neighborhood-level characteristics. The cohort was composed predominantly of Canadian-born (82%) and married (80%) women. The median residential concentration of PM2.5 was 9.1 μg/m3 (standard deviation = 3.4). In fully adjusted models, a 10 μg/m3 increase in PM2.5 exposure was associated with elevated risks of nonaccidental (HR: 1.12; 95% CI = 1.04, 1.19) and ischemic heart disease mortality (HR: 1.34; 95% CI = 1.09, 1.66). The findings from this study provide additional support for the hypothesis that exposure to very low levels of ambient PM2.5 increases the risk of cardiovascular mortality.
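In a Cox model with a continuous exposure, hazard ratios scale exponentially with the increment: the reported HR of 1.12 per 10 μg/m3 corresponds to exp(10·β) for the fitted coefficient β. A small sketch of that arithmetic (illustrative only, not a re-analysis of the cohort):

```python
import math

def hazard_ratio_per_increment(beta, increment):
    """Cox proportional hazards: the hazard ratio for an `increment`-unit
    rise in a continuous exposure is exp(beta * increment)."""
    return math.exp(beta * increment)

# Back out the per-unit log hazard implied by the reported HR of 1.12
# per 10 ug/m3, then rescale it to a different increment:
beta = math.log(1.12) / 10
hr_per_5 = hazard_ratio_per_increment(beta, 5)  # HR for a 5 ug/m3 rise
```

The same rescaling is what makes HRs reported per different increments (10 μg/m3 here, 5 μg/m3 elsewhere) directly comparable across studies.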

  12. C60 fullerene localization and membrane interactions in RAW 264.7 immortalized mouse macrophages

    NASA Astrophysics Data System (ADS)

    Russ, K. A.; Elvati, P.; Parsonage, T. L.; Dews, A.; Jarvis, J. A.; Ray, M.; Schneider, B.; Smith, P. J. S.; Williamson, P. T. F.; Violi, A.; Philbert, M. A.

    2016-02-01

    There continues to be a significant increase in the number and complexity of hydrophobic nanomaterials that are engineered for a variety of commercial purposes making human exposure a significant health concern. This study uses a combination of biophysical, biochemical and computational methods to probe potential mechanisms for uptake of C60 nanoparticles into various compartments of living immune cells. Cultures of RAW 264.7 immortalized murine macrophage were used as a canonical model of immune-competent cells that are likely to provide the first line of defense following inhalation. Modes of entry studied were endocytosis/pinocytosis and passive permeation of cellular membranes. The evidence suggests marginal uptake of C60 clusters is achieved through endocytosis/pinocytosis, and that passive diffusion into membranes provides a significant source of biologically-available nanomaterial. Computational modeling of both a single molecule and a small cluster of fullerenes predicts that low concentrations of fullerenes enter the membrane individually and produce limited perturbation; however, at higher concentrations the clusters in the membrane causes deformation of the membrane. These findings are bolstered by nuclear magnetic resonance (NMR) of model membranes that reveal deformation of the cell membrane upon exposure to high concentrations of fullerenes. The atomistic and NMR models fail to explain escape of the particle out of biological membranes, but are limited to idealized systems that do not completely recapitulate the complexity of cell membranes. 
The surprising contribution of passive modes of cellular entry provides new avenues for toxicological research that go beyond the pharmacological inhibition of bulk transport systems such as pinocytosis. 
Electronic supplementary information (ESI) available. See DOI: 10.1039/c5nr07003a

  13. The Complexity of Biomechanics Causing Primary Blast-Induced Traumatic Brain Injury: A Review of Potential Mechanisms

    PubMed Central

    Courtney, Amy; Courtney, Michael

    2015-01-01

    Primary blast-induced traumatic brain injury (bTBI) is a prevalent battlefield injury in recent conflicts, yet biomechanical mechanisms of bTBI remain unclear. Elucidating specific biomechanical mechanisms is essential to developing animal models for testing candidate therapies and for improving protective equipment. Three hypothetical mechanisms of primary bTBI have received the most attention. Because translational and rotational head accelerations are primary contributors to TBI from non-penetrating blunt force head trauma, the acceleration hypothesis suggests that blast-induced head accelerations may cause bTBI. The hypothesis of direct cranial transmission suggests that a pressure transient traverses the skull into the brain and directly injures brain tissue. The thoracic hypothesis of bTBI suggests that some combination of a pressure transient reaching the brain via the thorax and a vagally mediated reflex result in bTBI. These three mechanisms may not be mutually exclusive, and quantifying exposure thresholds (for blasts of a given duration) is essential for determining which mechanisms may be contributing for a given level of blast exposure. Progress has been hindered by experimental designs that do not effectively expose animal models to a single mechanism, and by over-reliance on poorly validated computational models. The path forward should be predictive validation of computational models by quantitative confirmation with blast experiments in animal models, human cadavers, and biofidelic human surrogates over a range of relevant blast magnitudes and durations, coupled with experimental designs that isolate a single injury mechanism. PMID:26539158

  14. Measurement error in mobile source air pollution exposure estimates due to residential mobility during pregnancy

    PubMed Central

    Pennington, Audrey Flak; Strickland, Matthew J.; Klein, Mitchel; Zhai, Xinxin; Russell, Armistead G.; Hansen, Craig; Darrow, Lyndsey A.

    2018-01-01

    Prenatal air pollution exposure is frequently estimated using maternal residential location at the time of delivery as a proxy for residence during pregnancy. We describe residential mobility during pregnancy among 19,951 children from the Kaiser Air Pollution and Pediatric Asthma Study, quantify measurement error in spatially-resolved estimates of prenatal exposure to mobile source fine particulate matter (PM2.5) due to ignoring this mobility, and simulate the impact of this error on estimates of epidemiologic associations. Two exposure estimates were compared, one calculated using complete residential histories during pregnancy (weighted average based on time spent at each address) and the second calculated using only residence at birth. Estimates were computed using annual averages of primary PM2.5 from traffic emissions modeled using a research line-source dispersion model (RLINE) at 250 meter resolution. In this cohort, 18.6% of children were born to mothers who moved at least once during pregnancy. Mobile source PM2.5 exposure estimates calculated using complete residential histories during pregnancy and only residence at birth were highly correlated (rS>0.9). Simulations indicated that ignoring residential mobility resulted in modest bias of epidemiologic associations toward the null, but varied by maternal characteristics and prenatal exposure windows of interest (ranging from −2% to −10% bias). PMID:27966666
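The exposure contrast at the heart of this study is a weighted average over the residential history versus a single address at delivery. A minimal sketch of the time-weighted version; the (days, concentration) pair structure is an assumed illustration, not the study's actual data format:

```python
def time_weighted_exposure(residences):
    """Pregnancy-average exposure from a residential history, weighting
    each address's annual-average concentration by days spent there."""
    total_days = sum(days for days, _ in residences)
    return sum(days * conc for days, conc in residences) / total_days

# A mother who moved once: 180 days at 1.2 ug/m3, then 90 days at 0.8
avg = time_weighted_exposure([(180, 1.2), (90, 0.8)])
```

Using only the birth address would assign this mother 0.8 μg/m3 for the whole pregnancy, illustrating how ignoring mobility misclassifies exposure for the 18.6% of mothers who moved.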

  15. Measurement error in mobile source air pollution exposure estimates due to residential mobility during pregnancy.

    PubMed

    Pennington, Audrey Flak; Strickland, Matthew J; Klein, Mitchel; Zhai, Xinxin; Russell, Armistead G; Hansen, Craig; Darrow, Lyndsey A

    2017-09-01

    Prenatal air pollution exposure is frequently estimated using maternal residential location at the time of delivery as a proxy for residence during pregnancy. We describe residential mobility during pregnancy among 19,951 children from the Kaiser Air Pollution and Pediatric Asthma Study, quantify measurement error in spatially resolved estimates of prenatal exposure to mobile source fine particulate matter (PM2.5) due to ignoring this mobility, and simulate the impact of this error on estimates of epidemiologic associations. Two exposure estimates were compared, one calculated using complete residential histories during pregnancy (weighted average based on time spent at each address) and the second calculated using only residence at birth. Estimates were computed using annual averages of primary PM2.5 from traffic emissions modeled using a Research LINE-source dispersion model for near-surface releases (RLINE) at 250 m resolution. In this cohort, 18.6% of children were born to mothers who moved at least once during pregnancy. Mobile source PM2.5 exposure estimates calculated using complete residential histories during pregnancy and only residence at birth were highly correlated (rS > 0.9). Simulations indicated that ignoring residential mobility resulted in modest bias of epidemiologic associations toward the null, but varied by maternal characteristics and prenatal exposure windows of interest (ranging from -2% to -10% bias).

  16. Space environment and lunar surface processes

    NASA Technical Reports Server (NTRS)

    Comstock, G. M.

    1979-01-01

    The development of a general rock/soil model capable of simulating, in a self-consistent manner, the mechanical and exposure history of an assemblage of solid and loose material from submicron to planetary size scales, applicable to lunar and other space-exposed planetary surfaces, is discussed. The model was incorporated into a computer code called MESS.2 (model for the evolution of space-exposed surfaces). MESS.2, which represents a considerable increase in sophistication and scope over previous soil and rock surface models, is described. The capabilities of previous models for near-surface soil and rock surfaces are compared with those of the rock/soil model, MESS.2.

  17. Hobbies with solvent exposure and risk of non-Hodgkin lymphoma.

    PubMed

    Colt, Joanne S; Hartge, Patricia; Davis, Scott; Cerhan, James R; Cozen, Wendy; Severson, Richard K

    2007-05-01

    Occupational exposure to solvents has been reported to increase non-Hodgkin lymphoma (NHL) risk in some, but not all, studies. In a population-based case-control study, we examined whether participation in selected hobbies involving solvent exposure increases NHL risk. We identified NHL cases diagnosed at ages 20-74 years between 1998 and 2000 in Iowa or metropolitan Los Angeles, Detroit, and Seattle. Controls were selected using random digit dialing or Medicare files. Computer-assisted personal interviews (551 cases, 462 controls) elicited data on model building, painting/silkscreening/artwork, furniture refinishing, and woodworking/home carpentry. Hobby participation (68% of cases, 69% of controls) was not associated with NHL risk (OR = 0.9, 95% CI = 0.7-1.2). Compared to people with none of the hobbies evaluated, those who built models had significantly lower risk (OR = 0.7, CI = 0.5-1.0), but risk did not vary with the number of years or lifetime hours. Risk estimates for the other hobbies were generally less than one, but the associations were not significant and there were no notable patterns with duration of exposure. Use of oil-based, acrylic, or water-based paints; paint strippers; polyurethane; or varnishes was not associated with NHL risk. We conclude that participation in hobbies involving exposure to organic solvents is unlikely to increase NHL risk.
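The odds ratios quoted in this case-control analysis come from 2×2 exposure-by-outcome tables. A minimal sketch with a Woolf (log-based) confidence interval; the counts in the usage line are hypothetical, not the study's:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio for a 2x2 table (a=exposed cases, b=exposed controls,
    c=unexposed cases, d=unexposed controls) with a Woolf log-based CI."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(or_) - z * se)
    upper = math.exp(math.log(or_) + z * se)
    return or_, lower, upper

# Hypothetical counts for one hobby exposure:
or_, lo, hi = odds_ratio_ci(40, 90, 60, 70)
```

In practice the study's estimates were adjusted via logistic regression rather than taken from crude tables, but the crude OR shows what the adjusted estimate approximates.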

  18. National-scale exposure prediction for long-term concentrations of particulate matter and nitrogen dioxide in South Korea.

    PubMed

    Kim, Sun-Young; Song, Insang

    2017-07-01

    The limited spatial coverage of the air pollution data available from regulatory air quality monitoring networks hampers national-scale epidemiological studies of air pollution. The present study aimed to develop a national-scale exposure prediction model for estimating annual average concentrations of PM10 and NO2 at residences in South Korea using regulatory monitoring data for 2010. Using hourly measurements of PM10 and NO2 at 277 regulatory monitoring sites, we calculated the annual average concentrations at each site. We also computed 322 geographic variables in order to represent plausible local and regional pollution sources. Using these data, we developed universal kriging models, including three summary predictors estimated by partial least squares (PLS). The model performance was evaluated with fivefold cross-validation. In sensitivity analyses, we compared our approach with two alternative approaches, which added regional interactions and replaced the PLS predictors with up to ten selected variables. Finally, we predicted the annual average concentrations of PM10 and NO2 at 83,463 centroids of residential census output areas in South Korea to investigate the population exposure to these pollutants and to compare the exposure levels between monitored and unmonitored areas. The means of the annual average concentrations of PM10 and NO2 for 2010, across regulatory monitoring sites in South Korea, were 51.63 μg/m3 (SD = 8.58) and 25.64 ppb (11.05), respectively. The universal kriging exposure prediction models yielded cross-validated R2 values of 0.45 and 0.82 for PM10 and NO2, respectively. Compared to our model, the two alternative approaches gave consistent or worse performances. Population exposure levels in unmonitored areas were lower than in monitored areas. 
This is the first study to focus on developing a national-scale, pointwise exposure prediction approach in South Korea, which will allow national exposure assessments and epidemiological research to answer policy-related questions and to draw comparisons among different countries. Copyright © 2017 Elsevier Ltd. All rights reserved.
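The fivefold cross-validated R2 used to evaluate the kriging models can be sketched as follows. The kriging fit itself is omitted, and the random fold-assignment scheme is an assumption; only the pooled R2 computation is shown:

```python
import random

def kfold_indices(n, k=5, seed=0):
    """Randomly assign n observations (e.g. 277 monitoring sites) to k
    folds; each fold is held out once while the model is fit to the rest."""
    idx = list(range(n))
    random.Random(seed).shuffle(idx)
    return [idx[i::k] for i in range(k)]

def r_squared(y_true, y_pred):
    """R2 of held-out predictions pooled across folds: 1 - SSres/SStot."""
    mean_y = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean_y) ** 2 for t in y_true)
    return 1 - ss_res / ss_tot
```

An R2 of 0.82 (NO2) means the cross-validated predictions explain most of the spatial variability across sites, while 0.45 (PM10) leaves over half unexplained.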

  19. Virtual reality in anxiety disorders: the past and the future.

    PubMed

    Gorini, Alessandra; Riva, Giuseppe

    2008-02-01

    One of the most effective treatments of anxiety is exposure therapy: a person is exposed to specific feared situations or objects that trigger anxiety. This exposure may be accomplished through actual exposure, visualization, imagination, or virtual reality (VR), which provides users with computer-simulated environments with which, and within which, they can interact. VR is made possible by the capability of computers to synthesize a 3D graphical environment from numerical data. Furthermore, because input devices sense the subject's reactions and motions, the computer can modify the synthetic environment accordingly, creating the illusion of interacting with, and thus being immersed within, the environment. Since 1995, different experimental studies have been conducted to investigate the effect of VR exposure in the treatment of subclinical fears and anxiety disorders. This review discusses their outcomes and provides guidelines for the use of VR exposure for the treatment of anxious patients.

  20. Computer Simulation of Embryonic Systems: What can a ...

    EPA Pesticide Factsheets

    (1) Standard practice for assessing developmental toxicity is the observation of apical endpoints (intrauterine death, fetal growth retardation, structural malformations) in pregnant rats/rabbits following exposure during organogenesis. EPA’s computational toxicology research program (ToxCast) generated vast in vitro cellular and molecular effects data on >1858 chemicals in >600 high-throughput screening (HTS) assays. The diversity of assays has been increased for developmental toxicity with several HTS platforms, including the devTOX-quickPredict assay from Stemina Biomarker Discovery utilizing the human embryonic stem cell line (H9). Translating these HTS data into higher-order predictions of developmental toxicity is a significant challenge. Here, we address the application of computational systems models that recapitulate the kinematics of dynamical cell signaling networks (e.g., SHH, FGF, BMP, retinoids) in a CompuCell3D.org modeling environment. Examples include angiogenesis (angiodysplasia) and dysmorphogenesis. Being numerically responsive to perturbation, these models are amenable to data integration for systems toxicology and Adverse Outcome Pathways (AOPs). The AOP simulation outputs predict potential phenotypes based on the in vitro HTS data from ToxCast. A heuristic computational intelligence framework that recapitulates the kinematics of dynamical cell signaling networks in the embryo, together with the in vitro profiling data, produce quantitative pr

  1. Comparison of observed lung retention and urinary excretion of thorium workers and members of the public in India with the values predicted by the ICRP biokinetic model.

    PubMed

    Jaiswal, D D; Singh, I S; Nair, Suma; Dang, H S; Garg, S P; Pradhan, A S

    2004-01-01

    The daily intake of natural Th and its contents in the lungs, skeleton and liver of an Indian adult population group were estimated using the radiochemical neutron activation analysis (RNAA) technique. These data on daily intake (through inhalation and ingestion) were used to compute Th contents in the lungs and other systemic organs such as the skeleton and liver using the new human respiratory tract model (HRTM) and the new biokinetic model of Th. The theoretically computed Th contents in the lungs, skeleton and liver of an average Indian adult are 2.56, 4.00 and 0.17 microg, respectively, which are comparable with the corresponding experimentally measured values of 4.31, 3.45 and 0.14 microg in an urban population group living in Mumbai. The measured lung contents of Th in a group of five occupational workers were used to compute their total body Th contents and the corresponding daily urinary excretions. The computed total body contents and daily urinary excretions of Th in the five subjects compared favourably with their measured values. These studies thus validate the new biokinetic model of Th in natural as well as occupational exposures under Indian conditions.

  2. Momentary effects of exposure to prosmoking media on college students' future smoking risk.

    PubMed

    Shadel, William G; Martino, Steven C; Setodji, Claude; Scharf, Deborah

    2012-07-01

    This study used ecological momentary assessment to examine acute changes in college students' future smoking risk as a function of their exposure to prosmoking media (e.g., smoking in movies, paid advertising, point-of-sale displays). A sample of 135 college students ("ever" and "never" smokers) carried handheld computers for 21 days, recording their exposures to all forms of prosmoking media during the assessment period. They also responded to three investigator-initiated control prompts during each day of the assessment period (i.e., programmed to occur randomly). After each prosmoking media exposure and after each random control prompt they answered questions that measured their risk of future smoking. Responses between prosmoking media encounters were compared (within subjects) to responses made during random control prompts. Compliance with the study protocol was high, with participants responding to over 83% of all random prompts. Participants recorded nearly three encounters with prosmoking media each week. Results of linear mixed modeling indicated that all participants had higher future smoking risk following exposure to prosmoking media compared with control prompts (p < .05); this pattern of response did not differ between ever and never smokers (p = .769). Additional modeling of the variances around participants' risk of future smoking revealed that the response of never smokers to prosmoking media was significantly more variable than the response of ever smokers. Exposure to prosmoking media is associated with acute changes in future smoking risk, and never smokers and ever smokers respond differently to these exposures.

  3. Exposure to Radiofrequency Electromagnetic Fields and Sleep Quality: A Prospective Cohort Study

    PubMed Central

    Mohler, Evelyn; Frei, Patrizia; Fröhlich, Jürg; Braun-Fahrländer, Charlotte; Röösli, Martin

    2012-01-01

    Background There is persistent public concern about sleep disturbances due to radiofrequency electromagnetic field (RF-EMF) exposure. The aim of this prospective cohort study was to investigate whether sleep quality is affected by mobile phone use or by other RF-EMF sources in the everyday environment. Methods We conducted a prospective cohort study with 955 study participants aged between 30 and 60 years. Sleep quality and daytime sleepiness were assessed by means of standardized questionnaires in May 2008 (baseline) and May 2009 (follow-up). We also asked about mobile and cordless phone use and asked study participants for consent to obtain their mobile phone connection data from the mobile phone operators. Exposure to environmental RF-EMF was computed for each study participant using a previously developed and validated prediction model. In a nested sample of 119 study participants, RF-EMF exposure was measured in the bedroom and data on sleep behavior were collected by means of actigraphy during two weeks. Data were analyzed using multivariable regression models adjusted for relevant confounders. Results In the longitudinal analyses neither operator-recorded nor self-reported mobile phone use was associated with sleep disturbances or daytime sleepiness. Also, exposure to environmental RF-EMF did not affect self-reported sleep quality. The results from the longitudinal analyses were confirmed in the nested sleep study with objectively recorded exposure and measured sleep behavior data. Conclusions We did not find evidence for adverse effects on sleep quality from RF-EMF exposure in our everyday environment. PMID:22624036

  4. Multicore Challenges and Benefits for High Performance Scientific Computing

    DOE PAGES

    Nielsen, Ida M. B.; Janssen, Curtis L.

    2008-01-01

    Until recently, performance gains in processors were achieved largely by improvements in clock speeds and instruction level parallelism. Thus, applications could obtain performance increases with relatively minor changes by upgrading to the latest generation of computing hardware. Currently, however, processor performance improvements are realized by using multicore technology and hardware support for multiple threads within each core, and taking full advantage of this technology to improve the performance of applications requires exposure of extreme levels of software parallelism. We will here discuss the architecture of parallel computers constructed from many multicore chips as well as techniques for managing the complexity of programming such computers, including the hybrid message-passing/multi-threading programming model. We will illustrate these ideas with a hybrid distributed memory matrix multiply and a quantum chemistry algorithm for energy computation using Møller–Plesset perturbation theory.
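The owner-computes decomposition behind a hybrid distributed-memory matrix multiply can be illustrated without MPI. In this Python sketch (a simplification: a production HPC code would combine MPI ranks with OpenMP threads in C/C++ or Fortran), worker threads each compute one contiguous block of rows of the product, standing in for ranks that own those rows:

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def block_matmul(A, B, n_workers=4):
    """Compute C = A @ B by assigning contiguous row blocks of A to workers,
    mimicking the owner-computes decomposition of a distributed matrix multiply."""
    C = np.empty((A.shape[0], B.shape[1]))
    row_blocks = np.array_split(np.arange(A.shape[0]), n_workers)
    def work(rows):
        C[rows] = A[rows] @ B          # each "rank" fills only the rows it owns
    with ThreadPoolExecutor(n_workers) as ex:
        list(ex.map(work, row_blocks))
    return C
```

NumPy releases the GIL inside the underlying BLAS call, so the threads can genuinely overlap; across nodes, the same decomposition would exchange blocks of B via message passing instead of sharing memory.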

  5. CAD-based stand-alone spacecraft radiation exposure analysis system: An application of the early man-tended Space Station

    NASA Technical Reports Server (NTRS)

    Appleby, M. H.; Golightly, M. J.; Hardy, A. C.

    1993-01-01

    Major improvements have been completed in the approach to analyses and simulation of spacecraft radiation shielding and exposure. A computer-aided design (CAD)-based system has been developed for determining the amount of shielding provided by a spacecraft and simulating transmission of an incident radiation environment to any point within or external to the vehicle. Shielding analysis is performed using a customized ray-tracing subroutine contained within a standard engineering modeling software package. This improved shielding analysis technique has been used in several vehicle design programs such as a Mars transfer habitat, pressurized lunar rover, and the redesigned international Space Station. Results of analysis performed for the Space Station astronaut exposure assessment are provided to demonstrate the applicability and versatility of the system.
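The ray-tracing step of such a shielding analysis reduces, per dose point, to summing material path lengths along sampled directions. A minimal sketch for an idealized spherical aluminum shell (the geometry, density, and ray count are illustrative, not taken from the Space Station model):

```python
import numpy as np

def shell_path_length(p, d, r_in, r_out):
    """Path length through a spherical shell (centred at the origin) for a ray
    starting at interior point p (|p| < r_in) along unit direction d."""
    def exit_distance(R):
        b = float(np.dot(p, d))
        c = float(np.dot(p, p)) - R * R
        return -b + np.sqrt(b * b - c)   # positive root: the ray starts inside the sphere
    return exit_distance(r_out) - exit_distance(r_in)

def mean_areal_density(p, r_in, r_out, rho=2.70, n_rays=500, seed=1):
    """Average shielding areal density (g/cm^2) over isotropically sampled rays,
    the quantity a ray-tracing shielding analysis accumulates at each dose point."""
    rng = np.random.default_rng(seed)
    dirs = rng.normal(size=(n_rays, 3))
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
    paths = [shell_path_length(p, d, r_in, r_out) for d in dirs]
    return rho * float(np.mean(paths))

# a dose point at the centre of a 1 cm aluminium shell sees 2.70 g/cm^2 in every direction
density = mean_areal_density(np.zeros(3), r_in=100.0, r_out=101.0)
```

A CAD-based tool performs the same accumulation against the actual vehicle geometry, then folds the per-direction shielding into a dose-versus-depth curve for the incident environment.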

  6. The role of skin conductivity in a low frequency exposure assessment for peripheral nerve tissue according to the ICNIRP 2010 guidelines

    NASA Astrophysics Data System (ADS)

    Schmid, Gernot; Cecil, Stefan; Überbacher, Richard

    2013-07-01

    Based on numerical computations using commercially available finite difference time domain code and a state-of-the-art anatomical model of a 5-year-old child, the influence of skin conductivity on the induced electric field strength inside the tissue was computed for homogeneous front-to-back magnetic field exposure and homogeneous vertical electric field exposure. Both ungrounded and grounded conditions of the body model were considered. For electric field strengths induced inside CNS tissue, the impact of skin conductivity was found to be less than 15%. However, the results demonstrated that skin conductivity values as obtainable from the most widely used database of dielectric tissue properties and recommended by safety standards are not suitable for exposure assessment with respect to peripheral nerve tissue according to the ICNIRP 2010 guidelines, in which the induced electric field strength inside the skin is suggested as a conservative surrogate for peripheral nerve exposure. This is because the skin conductivity values derived from these databases refer to the stratum corneum, the uppermost layer of the skin, which does not contain any nerve or receptor cells to be protected from stimulation effects. Using these skin conductivity values, which are approximately a factor of 250-500 lower than the values used in the studies on which the ICNIRP 2010 guidelines are based, may lead to overestimation of the induced electric field strength inside the skin by substantially more than a factor of 10. However, reliable conductivity data for the deeper skin layers where nerve and receptor cells are located are very limited. It is therefore recommended to include appropriate background information in the ICNIRP guidelines and the dielectric tissue property databases, and to put some emphasis on a detailed layer-specific characterization of skin conductivity in the near future.

  7. Computer Aided Dosimetry and Verification of Exposure to Radiation

    NASA Astrophysics Data System (ADS)

    Waller, Edward; Stodilka, Robert Z.; Leach, Karen E.; Lalonde, Louise

    2002-06-01

    In the timeframe following the September 11th attacks on the United States, increased emphasis has been placed on Chemical, Biological, Radiological and Nuclear (CBRN) preparedness. Of prime importance is rapid field assessment of potential radiation exposure to Canadian Forces field personnel. This work set up a framework for generating an 'expert' computer system for aiding and assisting field personnel in determining the extent of radiation insult to military personnel. Data were gathered by review of the available literature, discussions with medical and health physics personnel having hands-on experience dealing with radiation accident victims, and from the experience of the principal investigator. Flow charts and generic data fusion algorithms were developed. Relationships between known exposure parameters, patient interview and history, clinical symptoms, clinical work-ups, physical dosimetry, biological dosimetry, and dose reconstruction as critical data indicators were investigated. The data obtained were examined in terms of information theory. A main goal was to determine how best to generate an adaptive model (i.e., when more data become available, how is the prediction improved). Consideration was given to determination of predictive algorithms for health outcome. In addition, the concept of coding an expert medical treatment advisor system was developed.

  8. Efficacy of It's Your Game-Tech: A Computer-Based Sexual Health Education Program for Middle School Youth.

    PubMed

    Peskin, Melissa F; Shegog, Ross; Markham, Christine M; Thiel, Melanie; Baumler, Elizabeth R; Addy, Robert C; Gabay, Efrat K; Emery, Susan Tortolero

    2015-05-01

    Few computer-based HIV, sexually transmitted infection (STI), and pregnancy prevention programs are available, and even fewer target early adolescents. In this study, we tested the efficacy of It's Your Game (IYG)-Tech, a completely computer-based, middle school sexual health education program. The primary hypothesis was that students who received IYG-Tech would significantly delay sexual initiation by ninth grade. We evaluated IYG-Tech using a randomized, two-arm nested design among 19 schools in a large, urban school district in southeast Texas (20 schools were originally randomized). The target population was English-speaking eighth-grade students who were followed into the ninth grade. The final analytic sample included 1,374 students. Multilevel logistic regression models were used to test for differences in sexual initiation between intervention and control students, while adjusting for age, gender, ethnicity, time between measures, and family structure. There was no significant difference in the delay of sexual activity or in any other sexual behavior between intervention and control students. However, there were significant positive between-group differences for psychosocial variables related to STI and condom knowledge, attitudes about abstinence, condom use self-efficacy, and perceived norms about sex. Post hoc analyses conducted among intervention students revealed some significant associations: "full exposure" (completion of all 13 lessons) and "mid-exposure" (5-8 lessons) students were less likely than "low exposure" (1-4 lessons) students to initiate sex. Collectively, our findings indicate that IYG-Tech impacts some determinants of sexual behavior, and that additional efficacy evaluation with full intervention exposure may be warranted. Copyright © 2015 Society for Adolescent Health and Medicine. Published by Elsevier Inc. All rights reserved.

  9. Eulerian-Lagrangian CFD modelling of pesticide dust emissions from maize planters

    NASA Astrophysics Data System (ADS)

    Devarrewaere, Wouter; Foqué, Dieter; Nicolai, Bart; Nuyttens, David; Verboven, Pieter

    2018-07-01

    An Eulerian-Lagrangian 3D computational fluid dynamics (CFD) model of pesticide dust drift from precision vacuum planters in field conditions was developed. Tractor and planter models were positioned in an atmospheric computational domain, representing the field and its edges. Physicochemical properties of dust abraded from maize seeds (particle size, shape, porosity, density, a.i. content), dust emission rates and exhaust air velocity values at the planter fan outlets were measured experimentally and implemented in the model. The wind profile, the airflow pattern around the machines and the dust dispersion were computed. Various maize sowing scenarios with different wind conditions, dust properties, planter designs and vacuum pressures were simulated. Dust particle trajectories were calculated by means of Lagrangian particle tracking, considering nonspherical particle drag, gravity and turbulent dispersion. The dust dispersion model was previously validated with wind tunnel data. In this study, simulated pesticide concentrations in the air and on the soil in the different sowing scenarios were compared and discussed. The model predictions were similar to experimental literature data in terms of concentrations and drift distance. Pesticide exposure levels to bees during flight and foraging were estimated from the simulated concentrations. The proposed CFD model can be used in risk assessment studies and in the evaluation of dust drift mitigation measures.
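The Lagrangian particle-tracking step can be sketched with a toy explicit-Euler integrator. The particle size, densities, wind, and turbulence intensity below are hypothetical, and the drag is pure Stokes drag rather than the nonspherical drag law used in the study; the sketch only illustrates the drag + gravity + random-dispersion structure of the trajectory equation:

```python
import numpy as np

def track_particle(u_air, dp=30e-6, rho_p=1200.0, dt=1e-3, steps=2000, seed=0):
    """Explicit-Euler trajectory of one dust particle under Stokes drag, gravity,
    and a crude random-walk turbulent dispersion term (all parameters hypothetical)."""
    rng = np.random.default_rng(seed)
    mu = 1.8e-5                        # air dynamic viscosity, Pa s
    tau = rho_p * dp**2 / (18 * mu)    # particle relaxation time (Stokes regime)
    g = np.array([0.0, 0.0, -9.81])
    x = np.zeros(3)
    v = np.zeros(3)
    for _ in range(steps):
        u = u_air + rng.normal(0.0, 0.05, 3)   # mean wind + turbulent fluctuation
        v += dt * ((u - v) / tau + g)          # drag toward local air velocity + gravity
        x += dt * v
    return x

# 2 m/s wind along x: the particle drifts downwind while slowly settling
x_end = track_particle(u_air=np.array([2.0, 0.0, 0.0]))
```

In the study's CFD setting, `u_air` would be interpolated from the Eulerian flow field at the particle position, and the turbulent fluctuation would come from the resolved turbulence model rather than white noise.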

  10. Use of a variable exposure photographic pyrometer to measure surface temperatures on a hemispherical-face model

    NASA Technical Reports Server (NTRS)

    Kantsios, A. G.; Henley, W. C., Jr.; Snow, W. L.

    1982-01-01

    The use of a photographic pyrometer for nonintrusive measurement of high temperature surfaces in a wind tunnel test is described. The advantages of the pyrometer for measuring surfaces whose unique shape makes use of thermocouples difficult are pointed out. The use of computer-operated densitometers or optical processors for the data reduction is recommended.

  11. Index extraction for electromagnetic field evaluation of high power wireless charging system.

    PubMed

    Park, SangWook

    2017-01-01

    This paper presents a precise dosimetry study of a highly resonant wireless power transfer (HR-WPT) system using an anatomically realistic human voxel model. Dosimetry for the HR-WPT system, designed to operate at 13.56 MHz (one of the ISM frequency bands), is conducted at various distances between the human model and the system, and under aligned and misaligned conditions of the transmitting and receiving circuits. The specific absorption rates in the human body are computed by a two-step approach: in the first step, the field generated by the HR-WPT system is calculated, and in the second step the specific absorption rates are computed with the scattered-field finite-difference time-domain method, treating the fields obtained in the first step as the incident fields. Safety compliance for non-uniform field exposure from the HR-WPT system is discussed with reference to the international safety guidelines. Furthermore, the coupling factor concept is employed to relax the maximum allowable transmitting power, and coupling factors derived from the dosimetry results are presented. In this calculation, the limit on the external magnetic field from the HR-WPT system can be relaxed by approximately a factor of four using the coupling factor in the worst exposure scenario.
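Whatever the field solver, the quantity reported at each voxel reduces to the standard local-SAR relation, SAR = σ|E|²/(2ρ) for the peak internal field amplitude. A minimal sketch with illustrative muscle-like tissue values (the numbers are not taken from this study):

```python
def sar(sigma, e_peak, rho):
    """Local specific absorption rate (W/kg) from peak E-field amplitude (V/m),
    tissue conductivity sigma (S/m) and mass density rho (kg/m^3)."""
    return sigma * e_peak**2 / (2.0 * rho)

# illustrative values only: muscle-like tissue with a 50 V/m peak internal field
sar_local = sar(sigma=0.94, e_peak=50.0, rho=1050.0)
```

Whole-body and 10 g averaged SAR, the quantities compared against guideline limits, are then obtained by mass-averaging this local value over the voxel model.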

  12. Multimodal Word Meaning Induction From Minimal Exposure to Natural Text.

    PubMed

    Lazaridou, Angeliki; Marelli, Marco; Baroni, Marco

    2017-04-01

    By the time they reach early adulthood, English speakers are familiar with the meaning of thousands of words. In recent decades, computational simulations known as distributional semantic models (DSMs) have demonstrated that it is possible to induce word meaning representations solely from word co-occurrence statistics extracted from a large amount of text. However, while these models learn in batch mode from large corpora, human word learning proceeds incrementally after minimal exposure to new words. In this study, we run a set of experiments investigating whether minimal distributional evidence from very short passages suffices to trigger successful word learning in subjects, testing their linguistic and visual intuitions about the concepts associated with new words. After confirming that subjects are indeed very efficient distributional learners even from small amounts of evidence, we test a DSM on the same multimodal task, finding that it behaves in a remarkably human-like way. We conclude that DSMs provide a convincing computational account of word learning even at the early stages in which a word is first encountered, and the way they build meaning representations can offer new insights into human language acquisition. Copyright © 2017 Cognitive Science Society, Inc.
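At their simplest, the count-based DSMs referred to here start from nothing more than a word-by-word co-occurrence matrix compared with cosine similarity. A minimal sketch on a toy four-sentence corpus (the corpus and window size are invented for illustration):

```python
import numpy as np

def dsm_vectors(corpus, window=2):
    """Count-based distributional vectors: each word is represented by its
    co-occurrence counts with every vocabulary word within +/-window tokens."""
    vocab = sorted({w for sent in corpus for w in sent})
    idx = {w: i for i, w in enumerate(vocab)}
    M = np.zeros((len(vocab), len(vocab)))
    for sent in corpus:
        for i, w in enumerate(sent):
            for j in range(max(0, i - window), min(len(sent), i + window + 1)):
                if j != i:
                    M[idx[w], idx[sent[j]]] += 1
    return vocab, M

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

corpus = [["the", "cat", "purrs"], ["the", "dog", "barks"],
          ["a", "cat", "meows"], ["a", "dog", "growls"]]
vocab, M = dsm_vectors(corpus)
pos = {w: k for k, w in enumerate(vocab)}
sim_cat_dog = cosine(M[pos["cat"]], M[pos["dog"]])   # similar contexts -> nonzero similarity
```

Modern DSMs replace the raw counts with weighting schemes or learned embeddings, but the minimal-exposure question in the study applies to exactly this kind of representation built from a handful of context tokens.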

  13. Estimates of radiological risk from depleted uranium weapons in war scenarios.

    PubMed

    Durante, Marco; Pugliese, Mariagabriella

    2002-01-01

    Several weapons used during the recent conflict in Yugoslavia contain depleted uranium, including missiles and armor-piercing incendiary rounds. Health concern is related to the use of these weapons, because of the heavy-metal toxicity and radioactivity of uranium. Although chemical toxicity is considered the more important source of health risk related to uranium, radiation exposure has been allegedly related to cancers among veterans of the Balkan conflict, and uranium munitions are a possible source of contamination in the environment. Actual measurements of radioactive contamination are needed to assess the risk. In this paper, a computer simulation is proposed to estimate radiological risk related to different exposure scenarios. Doses caused by inhalation of radioactive aerosols and by ground contamination induced by Tomahawk missile impact are simulated using a Gaussian plume model (HOTSPOT code). Environmental contamination and committed dose to the population resident in contaminated areas are predicted by a food-web model (RESRAD code). Small values of committed effective dose equivalent appear to be associated with missile impacts (50-y CEDE < 5 mSv), or population exposure by water-independent pathways (50-y CEDE < 80 mSv). The greatest hazard is related to water contamination in conditions of effective leaching of uranium into the groundwater (50-y CEDE < 400 mSv). Even in this worst case scenario, the chemical toxicity largely predominates over radiological risk. These computer simulations suggest that little radiological risk is associated with the use of depleted uranium weapons.
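The Gaussian plume formula underlying codes such as HOTSPOT can be sketched directly. In a real calculation the dispersion parameters σy and σz come from stability-class curves evaluated at the downwind distance; here they are simply passed in as numbers (all values illustrative):

```python
import numpy as np

def gaussian_plume(y, z, Q, u, sigma_y, sigma_z, H=0.0):
    """Steady-state ground-reflected Gaussian plume concentration at crosswind
    offset y and height z, for release rate Q, wind speed u, release height H."""
    cross = np.exp(-y**2 / (2 * sigma_y**2))
    vert = (np.exp(-(z - H)**2 / (2 * sigma_z**2))
            + np.exp(-(z + H)**2 / (2 * sigma_z**2)))  # image source: ground reflection
    return Q / (2 * np.pi * u * sigma_y * sigma_z) * cross * vert

# illustrative ground-level release: 1 unit/s into a 5 m/s wind
c_centreline = gaussian_plume(y=0.0, z=0.0, Q=1.0, u=5.0, sigma_y=10.0, sigma_z=5.0)
```

Multiplying the concentration by a breathing rate and a dose conversion factor then yields the inhalation committed dose for a receptor at that location.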

  14. Computational Exposure Science: An Emerging Discipline to Support 21st-Century Risk Assessment

    EPA Science Inventory

    Background: Computational exposure science represents a frontier of environmental science that is emerging and quickly evolving.Objectives: In this commentary, we define this burgeoning discipline, describe a framework for implementation, and review some key ongoing research elem...

  15. Computer-aided video exposure monitoring.

    PubMed

    Walsh, P T; Clark, R D; Flaherty, S; Gentry, S J

    2000-01-01

    A computer-aided video exposure monitoring system was used to record exposure information. The system comprised a handheld camcorder, portable video cassette recorder, radio-telemetry transmitter/receiver, and handheld or notebook computers for remote data logging, photoionization gas/vapor detectors (PIDs), and a personal aerosol monitor. The following workplaces were surveyed using the system: dry cleaning establishments--monitoring tetrachloroethylene in the air and in breath; printing works--monitoring white spirit type solvent; tire manufacturing factory--monitoring rubber fume; and a slate quarry--monitoring respirable dust and quartz. The system based on the handheld computer, in particular, simplified the data acquisition process compared with earlier systems in use by our laboratory. The equipment is more compact and easier to operate, and allows more accurate calibration of the instrument reading on the video image. Although a variety of data display formats are possible, the best format for videos intended for educational and training purposes was the review-preview chart superimposed on the video image of the work process. Recommendations for reducing exposure by engineering or by modifying work practice were possible through use of the video exposure system in the dry cleaning and tire manufacturing applications. The slate quarry work illustrated how the technique can be used to test ventilation configurations quickly to see their effect on the worker's personal exposure.

  16. Association of Parkinson's Disease and Its Subtypes with Agricultural Pesticide Exposures in Men: A Case-Control Study in France.

    PubMed

    Moisan, Frédéric; Spinosi, Johan; Delabre, Laurène; Gourlet, Véronique; Mazurie, Jean-Louis; Bénatru, Isabelle; Goldberg, Marcel; Weisskopf, Marc G; Imbernon, Ellen; Tzourio, Christophe; Elbaz, Alexis

    2015-11-01

    Pesticides have been associated with Parkinson's disease (PD), but there are few data on important exposure characteristics such as dose-effect relations. It is unknown whether associations depend on clinical PD subtypes. We examined quantitative aspects of occupational pesticide exposure associated with PD and investigated whether associations were similar across PD subtypes. As part of a French population-based case-control study including men enrolled in the health insurance plan for farmers and agricultural workers, cases with clinically confirmed PD were identified through antiparkinsonian drug claims. Two controls were matched to each case. Using a comprehensive occupational questionnaire, we computed indicators for different dimensions of exposure (duration, cumulative exposure, intensity). We used conditional logistic regression to compute odds ratios (ORs) and 95% confidence intervals (CIs) among exposed male farmers (133 cases, 298 controls). We examined the relation between pesticides and PD subtypes (tremor dominant/non-tremor dominant) using polytomous logistic regression. There appeared to be a stronger association with intensity than duration of pesticide exposure based on separate models, as well as a synergistic interaction between duration and intensity (p-interaction = 0.04). High-intensity exposure to insecticides was positively associated with PD among those with low-intensity exposure to fungicides and vice versa, suggesting independent effects. Pesticide exposure in farms that specialized in vineyards was associated with PD (OR = 2.56; 95% CI: 1.31, 4.98). The association with intensity of pesticide use was stronger, although not significantly (p-heterogeneity = 0.60), for tremor-dominant (p-trend < 0.01) than for non-tremor-dominant PD (p-trend = 0.24). 
This study helps to better characterize different aspects of pesticide exposure associated with PD, and shows a significant association of pesticides with tremor-dominant PD in men, the most typical PD presentation. Moisan F, Spinosi J, Delabre L, Gourlet V, Mazurie JL, Bénatru I, Goldberg M, Weisskopf MG, Imbernon E, Tzourio C, Elbaz A. 2015. Association of Parkinson's disease and its subtypes with agricultural pesticide exposures in men: a case-control study in France. Environ Health Perspect 123:1123-1129; http://dx.doi.org/10.1289/ehp.1307970.

  17. Considerations of Environmentally Relevant Test Conditions for Improved Evaluation of Ecological Hazards of Engineered Nanomaterials.

    PubMed

    Holden, Patricia A; Gardea-Torresdey, Jorge L; Klaessig, Fred; Turco, Ronald F; Mortimer, Monika; Hund-Rinke, Kerstin; Cohen Hubal, Elaine A; Avery, David; Barceló, Damià; Behra, Renata; Cohen, Yoram; Deydier-Stephan, Laurence; Ferguson, P Lee; Fernandes, Teresa F; Herr Harthorn, Barbara; Henderson, W Matthew; Hoke, Robert A; Hristozov, Danail; Johnston, John M; Kane, Agnes B; Kapustka, Larry; Keller, Arturo A; Lenihan, Hunter S; Lovell, Wess; Murphy, Catherine J; Nisbet, Roger M; Petersen, Elijah J; Salinas, Edward R; Scheringer, Martin; Sharma, Monita; Speed, David E; Sultan, Yasir; Westerhoff, Paul; White, Jason C; Wiesner, Mark R; Wong, Eva M; Xing, Baoshan; Steele Horan, Meghan; Godwin, Hilary A; Nel, André E

    2016-06-21

    Engineered nanomaterials (ENMs) are increasingly entering the environment with uncertain consequences, including potential ecological effects. Research communities differ on whether ecotoxicological testing of ENMs should be conducted using environmentally relevant concentrations, where observing outcomes is difficult, or using higher ENM doses, where responses are observable. What exposure conditions are typically used in assessing ENM hazards to populations? What conditions are used to test ecosystem-scale hazards? What is known regarding actual ENMs in the environment, via measurements or modeling simulations? How should exposure conditions, ENM transformation, dose, and body burden be used in interpreting biological and computational findings for assessing risks? These questions were addressed in the context of this critical review. As a result, three main recommendations emerged. First, researchers should improve the ecotoxicology of ENMs by choosing test end points, duration, and study conditions, including ENM test concentrations, that align with realistic exposure scenarios. Second, testing should proceed via tiers with iterative feedback that informs experiments at other levels of biological organization. Finally, environmental realism in ENM hazard assessments should involve greater coordination among ENM quantitative analysts, exposure modelers, and ecotoxicologists, across government, industry, and academia.

  18. Low-energy light bulbs, computers, tablets and the blue light hazard.

    PubMed

    O'Hagan, J B; Khazova, M; Price, L L A

    2016-02-01

    The introduction of low-energy lighting and the widespread use of computer and mobile technologies have changed the exposure of human eyes to light. Occasional claims that light sources whose emissions contain blue light may cause eye damage raise concerns in the media. The aim of the study was to determine whether it was appropriate to issue advice on these public health concerns. A number of sources were assessed and the exposure conditions were compared with international exposure limits, and with the exposure likely to be received from staring at a blue sky. None of the sources assessed approached the exposure limits, even for extended viewing times.

  19. Comparison of radio frequency energy absorption in ear and eye region of children and adults at 900, 1800 and 2450 MHz.

    PubMed

    Keshvari, J; Lang, S

    2005-09-21

    The increasing use of mobile communication devices, especially mobile phones by children, has triggered discussions on whether there is a larger radio frequency (RF) energy absorption in the heads of children compared to that of adults. The objective of this study was to clarify possible differences in RF energy absorption in the head region of children and adults using computational techniques. Using the finite-difference time-domain (FDTD) computational method, a set of specific absorption rate (SAR) calculations were performed for anatomically correct adult and child head models. A half-wave dipole was used as an exposure source at 900, 1800 and 2450 MHz frequencies. The ear and eye regions were studied representing realistic exposure scenarios to current and upcoming mobile wireless communication devices. The differences in absorption were compared with the maximum energy absorption of the head model. Four magnetic resonance imaging (MRI) based head models, one female, one adult, two child head models, aged 3 and 7 years, were used. The head models greatly differ from each other in terms of size, external shape and the internal anatomy. The same tissue dielectric parameters were applied for all models. The analyses suggest that the SAR difference between adults and children is more likely caused by the general differences in the head anatomy and geometry of the individuals rather than age. It seems that the external shape of the head and the distribution of different tissues within the head play a significant role in the RF energy absorption.

  20. Ultrafine particles dispersion modeling in a street canyon: development and evaluation of a composite lattice Boltzmann model.

    PubMed

    Habilomatis, George; Chaloulakou, Archontoula

    2013-10-01

    Recently, a branch of particulate matter research has focused on ultrafine particles found in the urban environment, which originate, to a significant extent, from traffic sources. In urban street canyons, dispersion of ultrafine particles affects both pedestrians' short-term exposure and residents' long-term exposure. The aim of the present work is the development and evaluation of a composite lattice Boltzmann model to study the dispersion of ultrafine particles in the urban street canyon microenvironment. The proposed model has the potential to penetrate into the physics of this complex system. In order to evaluate the model performance against suitable experimental data, ultrafine particle levels were monitored on an hourly basis for a period of 35 days in a street canyon in the Athens area. The results of the comparative analysis are quite satisfactory. Furthermore, our modeled results are in good agreement with the results of other computational and experimental studies. This work is a first attempt to study the dispersion of an air pollutant by application of the lattice Boltzmann method. Copyright © 2013 Elsevier B.V. All rights reserved.
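The lattice Boltzmann treatment of a passive pollutant can be sketched in one dimension. This D1Q3 BGK model (all parameters invented, and far simpler than the composite street-canyon model of the study) advects a concentration pulse at velocity u while diffusing it with D = c_s²(τ − 1/2), using the usual collide-then-stream update:

```python
import numpy as np

def lbm_advect_diffuse(rho0, u=0.1, tau=0.8, steps=200):
    """D1Q3 lattice Boltzmann (BGK) for a passive scalar advected at constant
    lattice velocity u, with diffusivity cs^2*(tau - 1/2); periodic boundaries."""
    w = np.array([4/6, 1/6, 1/6])          # lattice weights
    c = np.array([0, 1, -1])               # discrete velocities
    cs2 = 1/3                              # lattice speed of sound squared
    f = w[:, None] * rho0[None, :] * (1 + c[:, None] * u / cs2)
    for _ in range(steps):
        rho = f.sum(axis=0)
        feq = w[:, None] * rho[None, :] * (1 + c[:, None] * u / cs2)
        f += (feq - f) / tau               # BGK collision
        for i, ci in enumerate(c):         # streaming to neighbouring nodes
            f[i] = np.roll(f[i], ci)
    return f.sum(axis=0)

n = 200
x = np.arange(n)
rho0 = 1.0 + np.exp(-(x - 50)**2 / 25.0)   # Gaussian concentration pulse on a background
rho_t = lbm_advect_diffuse(rho0)           # pulse drifts ~u*steps cells downstream
```

A street-canyon model couples a flow LBM (or another flow solver) to such a scalar scheme on a 2D/3D lattice with wall boundary conditions, but the collide-and-stream structure is identical.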

  1. A human life-stage physiologically based pharmacokinetic and pharmacodynamic model for chlorpyrifos: development and validation.

    PubMed

    Smith, Jordan Ned; Hinderliter, Paul M; Timchalk, Charles; Bartels, Michael J; Poet, Torka S

    2014-08-01

    Sensitivity to some chemicals in animals and humans is known to vary with age. Age-related changes in sensitivity to chlorpyrifos have been reported in animal models. A life-stage physiologically based pharmacokinetic and pharmacodynamic (PBPK/PD) model was developed to predict disposition of chlorpyrifos and its metabolites, chlorpyrifos-oxon (the ultimate toxicant) and 3,5,6-trichloro-2-pyridinol (TCPy), as well as B-esterase inhibition by chlorpyrifos-oxon in humans. In this model, previously measured age-dependent metabolism of chlorpyrifos and chlorpyrifos-oxon was integrated into age-related descriptions of human anatomy and physiology. The life-stage PBPK/PD model was calibrated and tested against controlled adult human exposure studies. Simulations suggest age-dependent pharmacokinetics and response may exist. At oral doses ≥0.6 mg/kg of chlorpyrifos (100- to 1000-fold higher than environmental exposure levels), 6-month-old children are predicted to have higher levels of chlorpyrifos-oxon in blood and higher levels of red blood cell cholinesterase inhibition than adults receiving equivalent doses. At lower doses more relevant to environmental exposures, simulations predict that adults will have slightly higher levels of chlorpyrifos-oxon in blood and greater cholinesterase inhibition. This model provides a computational framework for age-comparative simulations that can be utilized to predict chlorpyrifos disposition and biological response over various postnatal life stages. Copyright © 2013 Elsevier Inc. All rights reserved.
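A full PBPK/PD model chains many tissue compartments with age-specific flows and volumes, but the building block is a mass-balance ODE like the toy one-compartment oral model below. All rate constants are invented for illustration; they are not the chlorpyrifos parameters:

```python
import numpy as np

def one_compartment(dose_mg_per_kg, ka=1.0, ke=0.3, vd=2.0, hours=24.0, dt=0.01):
    """Toy one-compartment oral PK model (first-order absorption and elimination),
    a minimal stand-in for a single tissue block of a full PBPK model.
    ka, ke in 1/h; vd in L/kg; returns plasma concentration vs time."""
    n = int(hours / dt)
    gut = dose_mg_per_kg          # amount remaining at the absorption site, mg/kg
    blood = 0.0                   # amount in the central compartment, mg/kg
    conc = np.empty(n)
    for i in range(n):
        absorbed = ka * gut * dt
        gut -= absorbed
        blood += absorbed - ke * blood * dt   # explicit-Euler mass balance
        conc[i] = blood / vd                  # plasma concentration, mg/L
    return conc

c = one_compartment(0.6)   # the 0.6 mg/kg dose mentioned in the abstract
```

Life-stage modeling enters by making parameters such as `ke` and `vd` functions of age and body composition, so the same dose produces different internal doses at different ages.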

  2. Adding Four-Dimensional Data Assimilation (aka grid ...

    EPA Pesticide Factsheets

    Adding four-dimensional data assimilation (a.k.a. grid nudging) to MPAS. The U.S. Environmental Protection Agency is investigating the use of MPAS as the meteorological driver for its next-generation air quality model. To function as such, MPAS needs to operate in a diagnostic mode in much the same manner as the current meteorological driver, the Weather Research and Forecasting (WRF) model. The WRF operates in diagnostic mode using Four-Dimensional Data Assimilation, also known as "grid nudging". MPAS version 4.0 has been modified with the addition of an FDDA routine to the standard physics drivers to nudge the state variables for wind, temperature and water vapor towards MPAS initialization fields defined at 6-hour intervals from GFS-derived data. The results to be shown demonstrate the ability to constrain MPAS simulations to known historical conditions and thus provide the U.S. EPA with a practical meteorological driver for global-scale air quality simulations. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use bo
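Grid nudging itself is just Newtonian relaxation: a term G·(analysis − state) is added to each nudged variable's tendency, pulling the free-running model toward the driving analysis. A toy scalar sketch (the tendency, nudging coefficient, and analysis value are all invented for illustration):

```python
def nudge(x, tendency, x_analysis, g=1.0 / 3600.0, dt=60.0, steps=360):
    """Newtonian relaxation ('grid nudging'): augment the model tendency with
    G*(analysis - state). G ~ 1/3600 s^-1 gives a one-hour relaxation timescale."""
    for _ in range(steps):
        x = x + dt * (tendency(x) + g * (x_analysis - x))
    return x

# toy state: a temperature the free-running model would relax toward 280 K,
# nudged instead toward a 290 K analysis over a 6-hour integration
t_end = nudge(300.0, tendency=lambda t: -1e-5 * (t - 280.0), x_analysis=290.0)
```

In MPAS/WRF the same relaxation is applied per grid cell to wind, temperature, and water vapor, with the analysis fields interpolated in time between the 6-hourly inputs.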

  3. Validation of 2D flood models with insurance claims

    NASA Astrophysics Data System (ADS)

    Zischg, Andreas Paul; Mosimann, Markus; Bernet, Daniel Benjamin; Röthlisberger, Veronika

    2018-02-01

    Flood impact modelling requires reliable models for the simulation of flood processes. In recent years, flood inundation models have been remarkably improved and widely used for flood hazard simulation, flood exposure and loss analyses. In this study, we validate a 2D inundation model for the purpose of flood exposure analysis at the river reach scale. We validate the BASEMENT simulation model against insurance claims using conventional validation metrics. The flood model is established on the basis of available topographic data at a high spatial resolution for four test cases. The validation metrics were calculated with two different datasets: a dataset of event documentations reporting flooded areas and a dataset of insurance claims. In three out of four test cases, the model fit based on insurance claims is slightly lower than the model fit computed on the basis of the observed inundation areas. This comparison between two independent validation datasets suggests that validation metrics using insurance claims can be compared to conventional validation data, such as the flooded area. However, a validation on the basis of insurance claims might be more conservative in cases where model errors are more pronounced in areas with a high density of values at risk.
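Conventional validation metrics of the kind referred to here compare binary flooded/not-flooded maps cell by cell via a contingency table. A small sketch with invented maps (the specific metric set used in the study is not restated here):

```python
import numpy as np

def contingency_scores(simulated, observed):
    """Binary-map validation metrics commonly used for flood extent:
    hit rate, false alarm ratio, and critical success index (CSI)."""
    sim = np.asarray(simulated, dtype=bool)
    obs = np.asarray(observed, dtype=bool)
    hits = np.sum(sim & obs)              # flooded in both model and observation
    misses = np.sum(~sim & obs)           # observed flooding the model missed
    false_alarms = np.sum(sim & ~obs)     # modelled flooding not observed
    return {"hit_rate": hits / (hits + misses),
            "false_alarm_ratio": false_alarms / (hits + false_alarms),
            "csi": hits / (hits + misses + false_alarms)}

# invented 6-cell maps: 1 = flooded, 0 = dry
scores = contingency_scores([1, 1, 0, 1, 0, 0], [1, 0, 0, 1, 1, 0])
```

Validating against insurance claims replaces the observed map with the set of cells containing claimed damages, which is why value-dense areas weigh more heavily in that comparison.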

  4. DEVELOPMENT OF DNA MICROARRAYS FOR ECOLOGICAL EXPOSURE ASSESSMENT

    EPA Science Inventory

    EPA/ORD is moving forward with a computational toxicology initiative in FY 04 which aims to integrate genomics and computational methods to provide a mechanistic basis for prediction of exposure and effects of chemical stressors in the environment.

    The goal of the presen...

  5. Spatial Resolution Requirements for Traffic-Related Air Pollutant Exposure Evaluations

    PubMed Central

    Batterman, Stuart; Chambliss, Sarah; Isakov, Vlad

    2014-01-01

    Vehicle emissions represent one of the most important air pollution sources in most urban areas, and elevated concentrations of pollutants found near major roads have been associated with many adverse health impacts. To understand these impacts, exposure estimates should reflect the spatial and temporal patterns observed for traffic-related air pollutants. This paper evaluates the spatial resolution and zonal systems required to accurately estimate intraurban and near-road exposures to traffic-related air pollutants. The analyses use the detailed information assembled for a large (800 km2) area centered on Detroit, Michigan, USA. Concentrations of nitrogen oxides (NOx) due to vehicle emissions were estimated using hourly traffic volumes and speeds on 9,700 links representing all but minor roads in the city, the MOVES2010 emission model, the RLINE dispersion model, local meteorological data, a temporal resolution of 1 hr, and spatial resolution as low as 10 m. Model estimates were joined with the corresponding shape files to estimate residential exposures for 700,000 individuals at property parcel, census block, census tract, and ZIP code levels. We evaluate joining methods, the spatial resolution needed to meet specific error criteria, and the extent of exposure misclassification. To portray traffic-related air pollutant exposure, raster or inverse distance-weighted interpolations are superior to nearest neighbor approaches, and interpolation distances between receptors and points of interest should not exceed about 40 m near major roads, and 100 m at larger distances. For census tracts and ZIP codes, average exposures are overestimated since few individuals live very near major roads, the range of concentrations is compressed, most exposures are misclassified, and high concentrations near roads are entirely omitted. While smaller zones improve performance considerably, even block-level data can misclassify many individuals. 
To estimate exposures and impacts of traffic-related pollutants accurately, data should be geocoded or estimated at the most finely resolved spatial level available; census tracts and larger zones have little if any ability to represent intraurban variation in traffic-related air pollutant concentrations. These conclusions are based on one of the most comprehensive intraurban modeling studies in the literature, and the results are robust. Recommendations address the value of dispersion models for portraying spatial and temporal variation of air pollutants in epidemiology and other studies; techniques to improve accuracy and reduce the computational burden in urban-scale modeling; the necessary spatial resolution for health surveillance, demographic, and pollution data; and the consequences of low-resolution data in terms of exposure misclassification. PMID:25132794
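
    The paper's preferred joining approach, inverse distance-weighted interpolation with a cap on receptor-to-point distance, can be sketched as follows. This is an illustrative implementation, not the study's actual pipeline; the 40 m cap, the power of 2, and all coordinates are assumptions for demonstration.

```python
import numpy as np

def idw_exposure(receptors, conc, homes, power=2.0, max_dist=40.0):
    """Inverse distance-weighted exposure estimate at each home location
    from modeled receptor concentrations, refusing to interpolate when no
    receptor lies within max_dist (meters) -- a hypothetical cap mirroring
    the ~40 m guidance near major roads."""
    est = np.full(len(homes), np.nan)
    for i, home in enumerate(homes):
        d = np.linalg.norm(receptors - home, axis=1)
        near = d <= max_dist
        if not near.any():
            continue                      # too far from any receptor: flag as missing
        if (d[near] == 0.0).any():        # home coincides with a receptor
            est[i] = conc[near][d[near] == 0.0][0]
        else:
            w = d[near] ** -power
            est[i] = np.average(conc[near], weights=w)
    return est
```

    A nearest-neighbor join would instead copy the single closest receptor's value, which is exactly the behavior the paper finds inferior near steep roadside gradients.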

  6. Organ radiation exposure with EOS: GATE simulations versus TLD measurements

    NASA Astrophysics Data System (ADS)

    Clavel, A. H.; Thevenard-Berger, P.; Verdun, F. R.; Létang, J. M.; Darbon, A.

    2016-03-01

    EOS® is an innovative X-ray imaging system allowing the acquisition of two simultaneous images of a patient in the standing position during the vertical scan of two orthogonal fan beams. This study aimed to compute organ radiation exposure to a patient in the particular geometry of this system. Two different positions of the patient in the machine were studied, corresponding to postero-anterior plus left lateral projections (PA-LLAT) and antero-posterior plus right lateral projections (AP-RLAT). To achieve this goal, a Monte Carlo simulation was developed in the GATE environment. To model the physical properties of the patient, a computational phantom was produced based on computed tomography scan data of an anthropomorphic phantom. The simulations provided several organ doses, which were compared to previously published dose results measured with thermoluminescent detectors (TLDs) in the same conditions and with the same phantom. The simulation results showed good agreement with measured doses at the TLD locations for both AP-RLAT and PA-LLAT projections. This study also showed that assessing organ dose from only a sample of locations, rather than considering the whole organ, introduced significant bias, depending on the organ and projection.

  7. A coupled airflow and source/sink model for simulating indoor VOC exposures.

    PubMed

    Yang, X; Chen, Q

    2001-12-01

    In this paper, a numerical model is presented to study the indoor air quality (IAQ) in a room with different emission sources, sinks, and ventilation methods. A computer program, ACCESS-IAQ, is developed to simulate the airflow pattern, the time history of the contaminant concentrations in the occupied zone, and the inhalation exposures. The program may be useful for IAQ professionals designing healthy and comfortable indoor environments. A numerical study has been carried out to predict the effectiveness of displacement ventilation and mixing ventilation for volatile organic compound (VOC) removal in a model office. Results from the numerical predictions show that when a "wet" emission source (a freshly painted wood stain) is distributed uniformly across the floor area with sinks (gypsum board) on the four vertical walls, displacement ventilation yields consistently lower exposure at the breathing level of the occupant in the room. This effect is mainly due to the higher ventilation efficiency of displacement ventilation compared to mixing ventilation. The simulation results also show that the walls adsorb significant amounts of VOCs during the first hour and act as secondary sources thereafter.
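
    The source/sink coupling described above can be illustrated with a much-simplified, well-mixed single-zone mass balance (the paper itself resolves the room airflow with CFD). All parameter values below are hypothetical:

```python
import numpy as np

def simulate_voc(hours=8.0, dt=1e-3):
    """Euler integration of a single-zone VOC mass balance with a decaying
    'wet' source and a reversible first-order sink. Parameters are assumed
    for illustration, not taken from the ACCESS-IAQ model."""
    V, Q = 30.0, 30.0            # room volume (m^3), outdoor airflow (m^3/h)
    S0, tau = 5000.0, 1.0        # initial source strength (ug/h), source decay time (h)
    ka, kd = 2.0, 0.05           # sink adsorption (m^3/h), desorption (1/h)
    n = int(hours / dt)
    t = np.linspace(0.0, hours, n)
    C = np.zeros(n)              # air concentration (ug/m^3)
    m = np.zeros(n)              # mass held by the sink (ug)
    for i in range(1, n):
        S = S0 * np.exp(-t[i - 1] / tau)
        dC = (S - Q * C[i - 1] - ka * C[i - 1] + kd * m[i - 1]) / V
        dm = ka * C[i - 1] - kd * m[i - 1]
        C[i] = C[i - 1] + dt * dC
        m[i] = m[i - 1] + dt * dm
    return t, C, m
```

    Even this toy version reproduces the qualitative behavior reported: the walls load up while the source is strong, then re-emit slowly, sustaining a low concentration after the source has decayed.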

  8. Diffusion Tensor Imaging Reveals White Matter Injury in a Rat Model of Repetitive Blast-Induced Traumatic Brain Injury

    PubMed Central

    Calabrese, Evan; Du, Fu; Garman, Robert H.; Johnson, G. Allan; Riccio, Cory; Tong, Lawrence C.

    2014-01-01

    Blast-induced traumatic brain injury (bTBI) is one of the most common combat-related injuries seen in U.S. military personnel, yet relatively little is known about the underlying mechanisms of injury. In particular, the effects of the primary blast pressure wave are poorly understood. Animal models have proven invaluable for the study of primary bTBI, because it rarely occurs in isolation in human subjects. Even less is known about the effects of repeated primary blast wave exposure, but existing data suggest cumulative increases in brain damage with a second blast. MRI and, in particular, diffusion tensor imaging (DTI), have become important tools for assessing bTBI in both clinical and preclinical settings. Computational statistical methods such as voxelwise analysis have shown promise in localizing and quantifying bTBI throughout the brain. In this study, we use voxelwise analysis of DTI to quantify white matter injury in a rat model of repetitive primary blast exposure. Our results show a significant increase in microstructural damage with a second blast exposure, suggesting that primary bTBI may sensitize the brain to subsequent injury. PMID:24392843

  9. Media and youth: access, exposure, and privatization.

    PubMed

    Roberts, D F

    2000-08-01

    To describe U.S. youth's access and exposure to the full array of media, as well as the social contexts in which media exposure occurs. A cross-sectional national random sample of 2065 adolescents aged 8 through 18 years, including oversamples of African-American and Hispanic youth, completed questionnaires about use of television, videotapes, movies, computers, video games, radio, compact discs, tape players, books, newspapers, and magazines. U.S. youngsters are immersed in media. Most households contain most media (computers and video game systems are the exception); the majority of youth have their own personal media. The average youth devotes 6¾ h daily to media; simultaneous use of multiple media increases exposure to 8 h of media messages daily. Overall, media exposure and exposure to individual media vary as a function of age, gender, race/ethnicity, and family socioeconomic level. Television remains the dominant medium. About one-half of the youth sampled use a computer daily. A substantial proportion of children's and adolescents' media use occurs in the absence of parents. American youth devote more time to media than to any other waking activity, as much as one-third of each day. This demands increased parental attention and research into the effects of such extensive exposure.

  10. Development of a computational model for astronaut reorientation.

    PubMed

    Stirling, Leia; Willcox, Karen; Newman, Dava

    2010-08-26

    The ability to model astronaut reorientations computationally provides a simple way to develop and study human motion control strategies. Since the cost of experimenting in microgravity is high, and underwater training can lead to motions inappropriate for microgravity, these techniques allow for motions to be developed and well-understood prior to any microgravity exposure. By including a model of the current space suit, we have the ability to study both intravehicular and extravehicular activities. We present several techniques for rotating about the axes of the body and show that motions performed by the legs create a greater net rotation than those performed by the arms. Adding a space suit to the motions was seen to increase the resistance torque and limit the available range of motion. While rotations about the body axes can be performed in the current space suit, the resulting motions generated a reduced rotation when compared to the unsuited configuration.

  11. Computational Modeling Using OpenSim to Simulate a Squat Exercise Motion

    NASA Technical Reports Server (NTRS)

    Gallo, C. A.; Thompson, W. K.; Lewandowski, B. E.; Humphreys, B. T.; Funk, J. H.; Funk, N. H.; Weaver, A. S.; Perusek, G. P.; Sheehan, C. C.; Mulugeta, L.

    2015-01-01

    Long duration space travel to destinations such as Mars or an asteroid will expose astronauts to extended periods of reduced gravity. Astronauts will use an exercise regime for the duration of the space flight to minimize the loss of bone density, muscle mass and aerobic capacity that occurs during exposure to a reduced gravity environment. Since the area available in the spacecraft for an exercise device is limited and gravity is not present to aid loading, compact resistance exercise device prototypes are being developed. Since it is difficult to rigorously test these proposed devices in space flight, computational modeling provides an estimation of the muscle forces, joint torques and joint loads during exercise, giving insight into how well these devices protect the musculoskeletal health of astronauts.

  12. Interactive vs passive screen time and nighttime sleep duration among school-aged children.

    PubMed

    Yland, Jennifer; Guan, Stanford; Emanuele, Erin; Hale, Lauren

    2015-09-01

    Insufficient sleep among school-aged children is a growing concern, as numerous studies have shown that chronic short sleep duration increases the risk of poor academic performance and specific adverse health outcomes. We examined the association between weekday nighttime sleep duration and 3 types of screen exposure: television, computer use, and video gaming. We used age 9 data from an ethnically diverse national birth cohort study, the Fragile Families and Child Wellbeing Study, to assess the association between screen time and sleep duration among 9-year-olds, using screen time data reported by both the child (n = 3269) and by the child's primary caregiver (n = 2770). Within the child-reported models, children who watched more than 2 hours of television per day had shorter sleep duration by approximately 11 minutes per night compared to those who watched less than 2 hours of television (β = -0.18; P < .001). Using the caregiver-reported models, both television and computer use were associated with reduced sleep duration. For both child- and parent-reported screen time measures, we did not find statistically significant differences in effect size across the various types of screen time. Screen time from televisions and computers is associated with reduced sleep duration among 9-year-olds, using 2 sources of estimates of screen time exposure (child and parent reports). No specific type or use of screen time resulted in significantly shorter sleep duration than another, suggesting that caution is warranted against excessive use of screens of any type.

  13. Informatics in radiology: use of a C-arm fluoroscopy simulator to support training in intraoperative radiography.

    PubMed

    Bott, Oliver Johannes; Dresing, Klaus; Wagner, Markus; Raab, Björn-Werner; Teistler, Michael

    2011-01-01

    Mobile image intensifier systems (C-arms) are used frequently in orthopedic and reconstructive surgery, especially in trauma and emergency settings, but image quality and radiation exposure levels may vary widely, depending on the extent of the C-arm operator's knowledge and experience. Current training programs consist mainly of theoretical instruction in C-arm operation, the physical foundations of radiography, and radiation avoidance, and are largely lacking in hands-on application. A computer-based simulation program such as that tested by the authors may be one way to improve the effectiveness of C-arm training. In computer simulations of various scenarios commonly encountered in the operating room, trainees using the virtX program interact with three-dimensional models to test their knowledge base and improve their skill levels. Radiographs showing the simulated patient anatomy and surgical implants are "reconstructed" from data computed on the basis of the trainee's positioning of models of a C-arm, patient, and table, and are displayed in real time on the desktop monitor. Trainee performance is signaled in real time by color graphics in several control panels and, on completion of the exercise, is compared in detail with the performance of an expert operator. Testing of this computer-based training program in continuing medical education courses for operating room personnel showed an improvement in the overall understanding of underlying principles of intraoperative radiography performed with a C-arm, with resultant higher image quality, lower overall radiation exposure, and greater time efficiency. Supplemental material available at http://radiographics.rsna.org/lookup/suppl/doi:10.1148/rg.313105125/-/DC1.

  14. LDEF data: Comparisons with existing models

    NASA Astrophysics Data System (ADS)

    Coombs, Cassandra R.; Watts, Alan J.; Wagner, John D.; Atkinson, Dale R.

    1993-04-01

    The relationship between the observed cratering impact damage on the Long Duration Exposure Facility (LDEF) versus the existing models for both the natural environment of micrometeoroids and the man-made debris was investigated. Experimental data was provided by several LDEF Principal Investigators, Meteoroid and Debris Special Investigation Group (M&D SIG) members, and by the Kennedy Space Center Analysis Team (KSC A-Team) members. These data were collected from various aluminum materials around the LDEF satellite. A PC (personal computer) computer program, SPENV, was written which incorporates the existing models of the Low Earth Orbit (LEO) environment. This program calculates the expected number of impacts per unit area as functions of altitude, orbital inclination, time in orbit, and direction of the spacecraft surface relative to the velocity vector, for both micrometeoroids and man-made debris. Since both particle models are couched in terms of impact fluxes versus impactor particle size, and much of the LDEF data is in the form of crater production rates, scaling laws have been used to relate the two. Also many hydrodynamic impact computer simulations were conducted, using CTH, of various impact events, that identified certain modes of response, including simple metallic target cratering, perforations and delamination effects of coatings.
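
    The kind of calculation SPENV performs can be illustrated with a bare-bones sketch: a cumulative power-law particle flux integrated over surface area and exposure time, plus a simple crater-scaling inversion relating crater counts to particle sizes. The power law, its constants, and the scaling factor K are assumptions for demonstration, not the actual LEO micrometeoroid or debris models:

```python
def expected_impacts(flux_ref, slope, d_min_cm, area_m2, years):
    """Expected impact count on a surface by particles larger than d_min_cm,
    assuming a cumulative flux F(>d) = flux_ref * d^-slope impacts/m^2/yr.
    The real models also depend on altitude, inclination, and the surface's
    orientation relative to the velocity vector."""
    return flux_ref * d_min_cm ** (-slope) * area_m2 * years

def particle_from_crater(crater_um, K=5.0):
    """Invert a toy linear crater-scaling law D_crater = K * d_particle
    (K is an assumed, material-dependent constant) so crater production
    rates can be compared against particle-size flux models."""
    return crater_um / K
```

    In practice the scaling laws are nonlinear in impact velocity and target material; the linear form above only shows where the conversion sits in the comparison.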

  16. Pulmonary deposition modeling with airborne fiber exposure data: a study of workers manufacturing refractory ceramic fibers.

    PubMed

    Lentz, Thomas J; Rice, Carol H; Succop, Paul A; Lockey, James E; Dement, John M; LeMasters, Grace K

    2003-04-01

    Increasing production of refractory ceramic fiber (RCF), a synthetic vitreous material with industrial applications (e.g., kiln insulation), has created interest in potential respiratory effects of exposure to airborne fibers during manufacturing. An ongoing study of RCF manufacturing workers in the United States has indicated an association between cumulative fiber exposure and pleural plaques. Fiber sizing data, obtained from electron microscopy analyses of 118 air samples collected in three independent studies over a 20-year period (1976-1995), were used with a computer deposition model to estimate pulmonary dose of fibers of specified dimensions for 652 former and current RCF production workers. Separate dose correction factors reflecting differences in fiber dimensions in six uniform job title groups were used with data on airborne fiber concentration and employment duration to calculate cumulative dose estimates for each worker. From review of the literature, critical dimensions (diameter <0.4 µm, length <10 µm) were defined for fibers that may translocate to the parietal pleura. Each of three continuous exposure/dose metrics analyzed in separate logistic regression models was significantly related to plaques, even after adjusting for possible past asbestos exposure: cumulative fiber exposure, χ² = 15.2 (p < 0.01); cumulative pulmonary dose (all fibers), χ² = 14.6 (p < 0.01); cumulative pulmonary dose (critical dimension fibers), χ² = 12.4 (p < 0.01). Odds ratios (ORs) were calculated for levels of each metric. Increasing ORs were statistically significant for the two highest dose levels of critical dimension fibers (level three, OR = 11, 95% CI = [1.4, 98]; level four, OR = 25, 95% CI = [3.2, 190]). Similar associations existed for all metrics after adjustment for possible asbestos exposure.
It was concluded that development of pleural plaques follows exposure- and dose-response patterns, and that airborne fibers in RCF manufacturing facilities include those with critical dimensions associated with pleural plaque formation. Analysis of additional air samples may improve estimates of the dose-response relationship.
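
    The odds ratios and confidence intervals reported above come from exponentiating logistic-regression coefficients for exposure-level indicators; a generic sketch of that step (the coefficient and standard error in the usage below are made-up illustrations, not the study's estimates):

```python
import math

def odds_ratio(beta, se, z=1.96):
    """Odds ratio and 95% CI from a logistic-regression coefficient,
    using the normal approximation on the log-odds scale."""
    return math.exp(beta), (math.exp(beta - z * se), math.exp(beta + z * se))
```

    For instance, a coefficient of ln(25) with a standard error near 1.0 produces an OR of 25 with a wide, asymmetric interval, the same shape as the level-four estimate quoted above.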

  17. Single-exposure quantitative phase imaging in color-coded LED microscopy.

    PubMed

    Lee, Wonchan; Jung, Daeseong; Ryu, Suho; Joo, Chulmin

    2017-04-03

    We demonstrate single-shot quantitative phase imaging (QPI) in a platform of color-coded LED microscopy (cLEDscope). The light source in a conventional microscope is replaced by a circular LED pattern that is trisected into subregions with equal area, assigned to red, green, and blue colors. Image acquisition with a color image sensor and subsequent computation based on weak object transfer functions allow for the QPI of a transparent specimen. We also provide a correction method for color-leakage, which may be encountered in implementing our method with consumer-grade LEDs and image sensors. Most commercially available LEDs and image sensors do not provide spectrally isolated emissions and pixel responses, generating significant error in phase estimation in our method. We describe the correction scheme for this color-leakage issue, and demonstrate improved phase measurement accuracy. The computational model and single-exposure QPI capability of our method are presented by showing images of calibrated phase samples and cellular specimens.
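
    The color-leakage correction can be thought of as inverting a channel-mixing matrix; a minimal sketch with an assumed calibration matrix (the actual method additionally involves weak-object transfer functions, which are omitted here):

```python
import numpy as np

def correct_color_leakage(raw_rgb, M):
    """Recover leakage-free channel intensities. M[i, j] is the calibrated
    response of sensor channel i to LED color j, so measured = M @ true and
    the correction is a 3x3 matrix inversion. The matrix in the usage below
    is an assumed calibration, not values from the paper."""
    return np.linalg.inv(M) @ raw_rgb
```

    In a real system the correction would be applied per pixel (or vectorized over the whole image), with M measured by illuminating each LED subregion separately.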

  18. Visualization of Wind Data on Google Earth for the Three-dimensional Wind Field (3DWF) Model

    DTIC Science & Technology

    2012-09-01

    ActiveX components or XPCOM extensions can be used by JavaScript to write data to the local file system. Since there is an inherent risk, it is very important to only use these types of objects (ActiveX or XPCOM) from a trusted source in order to minimize the exposure of a computer system to malware.

  19. Low-energy light bulbs, computers, tablets and the blue light hazard

    PubMed Central

    O'Hagan, J B; Khazova, M; Price, L L A

    2016-01-01

    The introduction of low energy lighting and the widespread use of computer and mobile technologies have changed the exposure of human eyes to light. Occasional claims that the light sources with emissions containing blue light may cause eye damage raise concerns in the media. The aim of the study was to determine if it was appropriate to issue advice on the public health concerns. A number of sources were assessed and the exposure conditions were compared with international exposure limits, and the exposure likely to be received from staring at a blue sky. None of the sources assessed approached the exposure limits, even for extended viewing times. PMID:26768920

  20. An assessment of air pollutant exposure methods in Mexico City, Mexico.

    PubMed

    Rivera-González, Luis O; Zhang, Zhenzhen; Sánchez, Brisa N; Zhang, Kai; Brown, Daniel G; Rojas-Bracho, Leonora; Osornio-Vargas, Alvaro; Vadillo-Ortega, Felipe; O'Neill, Marie S

    2015-05-01

    Geostatistical interpolation methods to estimate individual exposure to outdoor air pollutants can be used in pregnancy cohorts where personal exposure data are not collected. Our objectives were to a) develop four assessment methods (citywide average (CWA); nearest monitor (NM); inverse distance weighting (IDW); and ordinary Kriging (OK)), and b) compare daily metrics and cross-validations of interpolation models. We obtained 2008 hourly data from Mexico City's outdoor air monitoring network for PM10, PM2.5, O3, CO, NO2, and SO2 and constructed daily exposure metrics for 1,000 simulated individual locations across five populated geographic zones. Descriptive statistics from all methods were calculated for dry and wet seasons, and by zone. We also evaluated IDW and OK methods' ability to predict measured concentrations at monitors using cross validation and a coefficient of variation (COV). All methods were performed using SAS 9.3, except ordinary Kriging which was modeled using R's gstat package. Overall, mean concentrations and standard deviations were similar among the different methods for each pollutant. Correlations between methods were generally high (r=0.77 to 0.99). However, ranges of estimated concentrations determined by NM, IDW, and OK were wider than the ranges for CWA. Root mean square errors for OK were consistently equal to or lower than for the IDW method. OK standard errors varied considerably between pollutants and the computed COVs ranged from 0.46 (least error) for SO2 and PM10 to 3.91 (most error) for PM2.5. OK predicted concentrations measured at the monitors better than IDW and NM. Given the similarity in results for the exposure methods, OK is preferred because this method alone provides predicted standard errors which can be incorporated in statistical models. 
The daily estimated exposures calculated using these different exposure methods provide flexibility to evaluate multiple windows of exposure during pregnancy, not just trimester or pregnancy-long exposures. Many studies evaluating associations between outdoor air pollution and adverse pregnancy outcomes rely on outdoor air pollution monitoring data linked to information gathered from large birth registries, and often lack residence location information needed to estimate individual exposure. This study simulated 1,000 residential locations to evaluate four air pollution exposure assessment methods, and describes possible exposure misclassification from using spatial averaging versus geostatistical interpolation models. An implication of this work is that policies to reduce air pollution and exposure among pregnant women based on epidemiologic literature should take into account possible error in estimates of effect when spatial averages alone are evaluated.
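
    The cross-validation step, predicting each monitor from the remaining monitors and summarizing the error, can be sketched as follows (illustrative Python rather than the study's SAS/gstat workflow, shown here for the IDW method):

```python
import numpy as np

def loo_rmse_idw(sites, obs, power=2.0):
    """Leave-one-out cross-validation of inverse distance weighting at the
    monitoring sites themselves: each site is predicted from all the others,
    and the overall RMSE summarizes interpolation error."""
    preds = np.empty_like(obs, dtype=float)
    for i in range(len(sites)):
        others = np.arange(len(sites)) != i
        d = np.linalg.norm(sites[others] - sites[i], axis=1)
        preds[i] = np.average(obs[others], weights=d ** -power)
    rmse = float(np.sqrt(np.mean((preds - obs) ** 2)))
    return rmse, preds
```

    The same held-out predictions can feed a coefficient of variation like the paper's COV; ordinary Kriging would replace the inverse-distance weights with weights derived from a fitted variogram.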

  1. Low lifetime stress exposure is associated with reduced stimulus–response memory

    PubMed Central

    Goldfarb, Elizabeth V.; Shields, Grant S.; Daw, Nathaniel D.; Slavich, George M.; Phelps, Elizabeth A.

    2017-01-01

    Exposure to stress throughout life can cumulatively influence later health, even among young adults. The negative effects of high cumulative stress exposure are well-known, and a shift from episodic to stimulus–response memory has been proposed to underlie forms of psychopathology that are related to high lifetime stress. At the other extreme, effects of very low stress exposure are mixed, with some studies reporting that low stress leads to better outcomes, while others demonstrate that low stress is associated with diminished resilience and negative outcomes. However, the influence of very low lifetime stress exposure on episodic and stimulus–response memory is unknown. Here we use a lifetime stress assessment system (STRAIN) to assess cumulative lifetime stress exposure and measure memory performance in young adults reporting very low and moderate levels of lifetime stress exposure. Relative to moderate levels of stress, very low levels of lifetime stress were associated with reduced use and retention (24 h later) of stimulus–response (SR) associations, and a higher likelihood of using context memory. Further, computational modeling revealed that participants with low levels of stress exhibited worse expression of memory for SR associations than those with moderate stress. These results demonstrate that very low levels of stress exposure can have negative effects on cognition. PMID:28298555

  2. Discovery of a Series of Indazole TRPA1 Antagonists

    PubMed Central

    2017-01-01

    A series of TRPA1 antagonists is described which has as its core structure an indazole moiety. The physical properties and in vitro DMPK profiles are discussed. Good in vivo exposure was obtained with several analogs, allowing efficacy to be assessed in rodent models of inflammatory pain. Two compounds showed significant activity in these models when administered either systemically or topically. Protein chimeras were constructed to show that compounds from the series bind in the S5 region of the channel, and a computational docking model was used to propose a binding mode for example compounds. PMID:28626530

  3. Calculating excess lifetime risk in relative risk models.

    PubMed Central

    Vaeth, M; Pierce, D A

    1990-01-01

    When assessing the impact of radiation exposure it is common practice to present the final conclusions in terms of excess lifetime cancer risk in a population exposed to a given dose. The present investigation is mainly a methodological study focusing on some of the major issues and uncertainties involved in calculating such excess lifetime risks and related risk projection methods. The age-constant relative risk model used in the recent analyses of the cancer mortality that was observed in the follow-up of the cohort of A-bomb survivors in Hiroshima and Nagasaki is used to describe the effect of the exposure on the cancer mortality. In this type of model the excess relative risk is constant in age-at-risk, but depends on the age-at-exposure. Calculation of excess lifetime risks usually requires rather complicated life-table computations. In this paper we propose a simple approximation to the excess lifetime risk; the validity of the approximation for low levels of exposure is justified empirically as well as theoretically. This approximation provides important guidance in understanding the influence of the various factors involved in risk projections. Among the further topics considered are the influence of a latent period, the additional problems involved in calculations of site-specific excess lifetime cancer risks, the consequences of a leveling off or a plateau in the excess relative risk, and the uncertainties involved in transferring results from one population to another. The main part of this study relates to the situation with a single, instantaneous exposure, but a brief discussion is also given of the problem with a continuous exposure at a low-dose rate. PMID:2269245
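
    The life-table computation underlying excess lifetime risk can be sketched for the age-constant relative risk model described above; the mortality rates used in the example are toy values, not the A-bomb cohort data:

```python
def excess_lifetime_risk(q_cancer, q_other, err, age_at_exposure):
    """Life-table excess lifetime cancer risk under an age-constant excess
    relative risk (ERR) acting on the cancer hazard from age_at_exposure
    onward. q_cancer[a] and q_other[a] are annual probabilities of cancer
    and non-cancer death at age a."""
    def lifetime_risk(exposed):
        surv, risk = 1.0, 0.0
        for a in range(len(q_cancer)):
            qc = q_cancer[a]
            if exposed and a >= age_at_exposure:
                qc *= 1.0 + err            # relative-risk model on the cancer hazard
            risk += surv * qc              # probability of dying of cancer at age a
            surv *= max(0.0, 1.0 - qc - q_other[a])
        return risk
    return lifetime_risk(True) - lifetime_risk(False)
```

    Because the raised cancer hazard also removes people from the at-risk population, the excess is somewhat smaller than ERR times the baseline lifetime risk, which is the competing-mortality effect the paper's simple approximation has to account for.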

  5. Development of skeletal system for mesh-type ICRP reference adult phantoms

    NASA Astrophysics Data System (ADS)

    Yeom, Yeon Soo; Wang, Zhao Jun; Tat Nguyen, Thang; Kim, Han Sung; Choi, Chansoo; Han, Min Cheol; Kim, Chan Hyeong; Lee, Jai Ki; Chung, Beom Sun; Zankl, Maria; Petoussi-Henss, Nina; Bolch, Wesley E.; Lee, Choonsik

    2016-10-01

    The reference adult computational phantoms of the International Commission on Radiological Protection (ICRP) described in Publication 110 are voxel-type computational phantoms based on whole-body computed tomography (CT) images of adult male and female patients. The voxel resolutions of these phantoms are on the order of a few millimeters, and smaller tissues such as the eye lens, the skin, and the walls of some organs cannot be properly defined in the phantoms, resulting in limitations in dose coefficient calculations for weakly penetrating radiations. In order to address the limitations of the ICRP-110 phantoms, an ICRP Task Group was recently formed and the voxel phantoms are now being converted to a high-quality mesh format. As a part of the conversion project, in the present study, the skeleton models, one of the most important and complex organs of the body, were constructed. The constructed skeleton models were then tested by calculating red bone marrow (RBM) and endosteum dose coefficients (DCs) for broad parallel beams of photons and electrons and comparing the calculated values with those of the original ICRP-110 phantoms. The results show that for the photon exposures, there is generally good agreement in the DCs between the mesh-type phantoms and the original voxel-type ICRP-110 phantoms; that is, the dose discrepancies were less than 7% in all cases except for the 0.03 MeV cases, for which the maximum difference was 14%. On the other hand, for the electron exposures (≤4 MeV), the DCs of the mesh-type phantoms deviate from those of the ICRP-110 phantoms by up to a factor of ~1600 at 0.03 MeV, which is due to the improvement of the skeletal anatomy of the developed skeleton mesh models.

  6. Report on computation of repetitive hyperbaric-hypobaric decompression tables

    NASA Technical Reports Server (NTRS)

    Edel, P. O.

    1975-01-01

The tables were constructed specifically for NASA's simulated weightlessness training program; they provide for 8 depth ranges covering depths from 7 to 47 FSW, with exposure times of 15 to 360 minutes. These tables were based upon an 8-compartment model using tissue half-time values of 5 to 360 minutes and Workman-line M-values for control of the decompression obligation resulting from hyperbaric exposures. Supersaturation ratios of 1.55:1 to 2:1 were used for control of ascents to altitude following such repetitive dives. The adequacy of the method and the resultant tables was determined in light of past experience with decompression involving hyperbaric-hypobaric interfaces in human exposures. Using these criteria, the method showed conformity with empirically determined values. In areas where a discrepancy existed, the tables would err in the direction of safety.
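The half-time compartment scheme described above can be sketched generically. The half-times below match the 5-360 minute range quoted in the report, but the pressures and M-values are illustrative placeholders, not the actual table parameters.

```python
def tissue_tension(p0, p_amb, minutes, half_time):
    """Exponential inert-gas uptake/washout for one compartment:
    P(t) = P_amb + (P0 - P_amb) * 2**(-t / half_time)."""
    return p_amb + (p0 - p_amb) * 2.0 ** (-minutes / half_time)

def ascent_permitted(tensions, m_values):
    """Workman-style check: every compartment tension must stay at or
    below its M-value for the target depth before ascending."""
    return all(p <= m for p, m in zip(tensions, m_values))

# Illustrative: 8 compartments spanning the 5-360 min half-times cited,
# starting equilibrated at 1.0 atm and held at 2.0 atm for 60 minutes.
half_times = [5, 10, 20, 40, 80, 160, 240, 360]
tensions = [tissue_tension(1.0, 2.0, 60, ht) for ht in half_times]
# Fast compartments approach ambient pressure; slow ones lag behind.
```

Controlling an ascent then amounts to comparing each compartment's tension against its M-value (or, for ascents to altitude, against a supersaturation-ratio limit) and holding the diver until all compartments pass.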

  7. Improved inhalation technology for setting safe exposure levels for workplace chemicals

    NASA Technical Reports Server (NTRS)

    Stuart, Bruce O.

    1993-01-01

Threshold Limit Values recommended as allowable air concentrations of a chemical in the workplace are often based upon a no-observable-effect-level (NOEL) determined by experimental inhalation studies using rodents. A 'safe level' for human exposure must then be estimated by the use of generalized safety factors in attempts to extrapolate from experimental rodents to man. The recent development of chemical-specific, physiologically-based toxicokinetic modeling makes use of measured physiological, biochemical, and metabolic parameters to construct a validated model that is able to 'scale up' rodent response data to predict the behavior of the chemical in man. This procedure is made possible by recent advances in personal computer software and the emergence of appropriate biological data, and provides an analytical tool for much more reliable risk evaluation and airborne chemical exposure level setting for humans.
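As a minimal illustration of the kinetic bookkeeping such models automate, here is a one-compartment sketch with constant inhalation uptake and first-order elimination. The rate constants are hypothetical; a real physiologically-based model uses many coupled, organ-specific compartments with measured parameters.

```python
def body_burden(uptake_mg_per_h, k_elim_per_h, hours, dt=0.01):
    """Euler integration of a one-compartment model:
    dA/dt = uptake - k_elim * A; A approaches uptake / k_elim."""
    a = 0.0
    for _ in range(int(hours / dt)):
        a += (uptake_mg_per_h - k_elim_per_h * a) * dt
    return a

# Hypothetical exposure: 0.5 mg/h absorbed dose with k_elim = 0.2 /h;
# the steady-state body burden is 0.5 / 0.2 = 2.5 mg.
burden_48h = body_burden(0.5, 0.2, 48)
```

Cross-species scale-up then replaces these constants with species-specific physiological values rather than applying a blanket safety factor.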

  8. Integrated Experimental and Computational Approach to Understand the Effects of Heavy Ion Radiation on Skin Homeostasis.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    von Neubeck, Claere; Shankaran, Harish; Geniza, Matthew

    2013-08-08

The effects of low dose high linear energy transfer (LET) radiation on human health are of concern for both space and clinical exposures. As epidemiological data for such radiation exposures are scarce for making relevant predictions, we need to understand the mechanism of response especially in normal tissues. Our objective here is to understand the effects of heavy ion radiation on tissue homeostasis in a realistic model system. Towards this end, we exposed an in vitro three-dimensional skin equivalent to low fluences of neon (Ne) ions (300 MeV/u), and determined the differentiation profile as a function of time following exposure using immunohistochemistry. We found that Ne ion exposures resulted in transient increases in the tissue regions expressing the differentiation markers keratin 10, and filaggrin, and more subtle time-dependent effects on the number of basal cells in the epidermis. We analyzed the data using a mathematical model of the skin equivalent, to quantify the effect of radiation on cell proliferation and differentiation. The agent-based mathematical model for the epidermal layer treats the epidermis as a collection of heterogeneous cell types with different proliferation/differentiation properties. We obtained model parameters from the literature where available, and calibrated the unknown parameters to match the observed properties in unirradiated skin. We then used the model to rigorously examine alternate hypotheses regarding the effects of high LET radiation on the tissue. Our analysis indicates that Ne ion exposures induce rapid, but transient, changes in cell division, differentiation and proliferation. We have validated the modeling results by histology and quantitative reverse transcription polymerase chain reaction (qRT-PCR). The integrated approach presented here can be used as a general framework to understand the responses of multicellular systems, and can be adapted to other epithelial tissues.

  9. Momentary Effects of Exposure to Pro-Smoking Media on College Students’ Future Smoking Risk

    PubMed Central

    Shadel, William G.; Martino, Steven C.; Setodji, Claude; Scharf, Deborah

    2012-01-01

Objective This study used ecological momentary assessment to examine acute changes in college students’ future smoking risk as a function of their exposure to pro-smoking media (e.g., smoking in movies, paid advertising, point-of-sale promotions). Methods A sample of 135 college students (ever and never smokers) carried handheld computers for 21 days, recording their exposures to all forms of pro-smoking media during the assessment period. They also responded to three investigator-initiated control prompts during each day of the assessment period (i.e., programmed to occur randomly). After each pro-smoking media exposure and after each random control prompt they answered questions that measured their risk of future smoking. Responses between pro-smoking media encounters were compared to responses made during random control prompts. Results Compliance with the study protocol was high, with participants responding to over 83% of all random prompts. Participants recorded nearly three encounters with pro-smoking media each week. Results of linear mixed modeling indicated that all participants had higher future smoking risk following exposure to pro-smoking media compared with control prompts (p < 0.05); this pattern of response did not differ between ever and never smokers (p = 0.769). Additional modeling of the variances around participants’ risk of future smoking revealed that the response of never smokers to pro-smoking media was significantly more variable than the response of ever smokers. Conclusions Exposure to pro-smoking media is associated with acute changes in future smoking risk, and never smokers and ever smokers respond differently to these exposures. PMID:22353027

  10. Pediatric in vitro and in silico models of deposition via oral and nasal inhalation.

    PubMed

    Carrigy, Nicholas B; Ruzycki, Conor A; Golshahi, Laleh; Finlay, Warren H

    2014-06-01

Respiratory tract deposition models provide a useful method for optimizing the design and administration of inhaled pharmaceutical aerosols, and can be useful for estimating exposure risks to inhaled particulate matter. As aerosol must first pass through the extrathoracic region prior to reaching the lungs, deposition in this region plays an important role in both cases. Compared to adults, much less extrathoracic deposition data are available with pediatric subjects. Recently, the use of magnetic resonance imaging and computed tomography scans to develop pediatric extrathoracic airway replicas has facilitated addressing this issue. Indeed, the use of realistic replicas for benchtop inhaler testing is now relatively common during the development and in vitro evaluation of pediatric respiratory drug delivery devices. Recently, in vitro empirical modeling studies using a moderate number of these realistic replicas have related airway geometry, particle size, fluid properties, and flow rate to extrathoracic deposition. Idealized geometries provide a standardized platform for inhaler testing and exposure risk assessment and have been designed to mimic average in vitro deposition in infants and children by replicating representative average geometrical dimensions. In silico mathematical models have used morphometric data and aerosol physics to illustrate the relative importance of different deposition mechanisms on respiratory tract deposition. Computational fluid dynamics simulations allow for the quantification of local deposition patterns and an in-depth examination of aerosol behavior in the respiratory tract. Recent studies have used both in vitro and in silico deposition measurements in realistic pediatric airway geometries with some success. This article reviews the current understanding of pediatric in vitro and in silico deposition modeling via oral and nasal inhalation.

  11. Computational modeling of nanoscale and microscale particle deposition, retention and dosimetry in the mouse respiratory tract.

    PubMed

    Asgharian, B; Price, O T; Oldham, M; Chen, Lung-Chi; Saunders, E L; Gordon, T; Mikheev, V B; Minard, K R; Teeguarden, J G

    2014-12-01

Comparing effects of inhaled particles across rodent test systems and between rodent test systems and humans is a key obstacle to the interpretation of common toxicological test systems for human risk assessment. These comparisons, correlation with effects and prediction of effects, are best conducted using measures of tissue dose in the respiratory tract. Differences in lung geometry, physiology and the characteristics of ventilation can give rise to differences in the regional deposition of particles in the lung in these species. Differences in regional lung tissue doses cannot currently be measured experimentally. Regional lung tissue dosimetry can however be predicted using models developed for rats, monkeys, and humans. A computational model of particle respiratory tract deposition and clearance was developed for BALB/c and B6C3F1 mice, creating a cross-species suite of available models for particle dosimetry in the lung. Airflow and particle transport equations were solved throughout the respiratory tract of these mouse strains to obtain temporal and spatial concentration of inhaled particles from which deposition fractions were determined. Particle inhalability (inhalable fraction, IF) and upper respiratory tract (URT) deposition were directly related to particle diffusive and inertial properties. Measurements of the retained mass at several times following exposure to iron oxide nanoparticles, micro- and nanoscale C60 fullerene, and nanoscale silver particles were used to calibrate and verify model predictions of total lung dose. Interstrain (mice) and interspecies (mouse, rat and human) differences in particle inhalability, fractional deposition and tissue dosimetry are described for ultrafine, fine and coarse particles.

  12. Efficacy of visor and helmet for blast protection assessed using a computational head model

    NASA Astrophysics Data System (ADS)

    Singh, D.; Cronin, D. S.

    2017-11-01

    Head injury resulting from blast exposure has been identified as a challenge that may be addressed, in part, through improved protective systems. Existing detailed head models validated for blast loading were applied to investigate the influence of helmet visor configuration, liner properties, and shell material stiffness. Response metrics including head acceleration and intracranial pressures (ICPs) generated in brain tissue during primary blast exposure were used to assess and compare helmet configurations. The addition of a visor was found to reduce peak head acceleration and positive ICPs. However, negative ICPs associated with a potential for injury were increased when a visor and a foam liner were present. In general, the foam liner material was found to be more significant in affecting the negative ICP response than positive ICP or acceleration. Shell stiffness was found to have relatively small effects on either metric. A strap suspension system, modeled as an air gap between the head and helmet, was more effective in reducing response metrics compared to a foam liner. In cases with a foam liner, lower-density foam offered a greater reduction of negative ICPs. The models demonstrated the "underwash" effect in cases where no foam liner was present; however, the reflected pressures generated between the helmet and head did not translate to significant ICPs in adjacent tissue, when compared to peak ICPs from initial blast wave interaction. This study demonstrated that the efficacy of head protection can be expressed in terms of load transmission pathways when assessed with a detailed computational model.

  13. Comprehensive computational model for combining fluid hydrodynamics, light transport and biomass growth in a Taylor vortex algal photobioreactor: Lagrangian approach.

    PubMed

    Gao, Xi; Kong, Bo; Vigil, R Dennis

    2017-01-01

    A comprehensive quantitative model incorporating the effects of fluid flow patterns, light distribution, and algal growth kinetics on biomass growth rate is developed in order to predict the performance of a Taylor vortex algal photobioreactor for culturing Chlorella vulgaris. A commonly used Lagrangian strategy for coupling the various factors influencing algal growth was employed whereby results from computational fluid dynamics and radiation transport simulations were used to compute numerous microorganism light exposure histories, and this information in turn was used to estimate the global biomass specific growth rate. The simulations provide good quantitative agreement with experimental data and correctly predict the trend in reactor performance as a key reactor operating parameter is varied (inner cylinder rotation speed). However, biomass growth curves are consistently over-predicted and potential causes for these over-predictions and drawbacks of the Lagrangian approach are addressed. Copyright © 2016 Elsevier Ltd. All rights reserved.
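The Lagrangian coupling strategy described above can be caricatured as follows: sample light intensities along many particle histories, evaluate a local growth kinetics at each point, and average over all histories. The Monod-type kinetics, Beer-Lambert attenuation, and every parameter value below are hypothetical stand-ins for the paper's calibrated model, and random positions stand in for the CFD-computed trajectories.

```python
import math, random

def local_growth_rate(intensity, mu_max=0.1, k_i=50.0):
    """Hypothetical Monod-type light-limited growth kinetics (1/h)."""
    return mu_max * intensity / (k_i + intensity)

def light_at_depth(depth_m, i0=500.0, attenuation_per_m=150.0):
    """Beer-Lambert attenuation of incident light through the culture."""
    return i0 * math.exp(-attenuation_per_m * depth_m)

def mean_growth_rate(n_particles=1000, n_steps=200, depth_max_m=0.01, seed=1):
    """Lagrangian averaging: sample positions along each particle history,
    evaluate the local kinetics there, and average over all histories."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_particles):
        rates = [local_growth_rate(light_at_depth(rng.uniform(0.0, depth_max_m)))
                 for _ in range(n_steps)]
        total += sum(rates) / n_steps
    return total / n_particles
```

In the actual model the positions come from computational fluid dynamics, the intensities from a radiation transport solution, and the averaged rate feeds a biomass growth prediction.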

  14. Space Radiation Analysis for the Mark III Spacesuit

    NASA Technical Reports Server (NTRS)

    Atwell, Bill; Boeder, Paul; Ross, Amy

    2013-01-01

NASA has continued the development of space systems by applying and integrating improved technologies that include safety issues, lightweight materials, and electronics. One such area is extravehicular activity (EVA) spacesuit development, with the most recent Mark III spacesuit. In this paper the Mark III spacesuit is discussed in detail, including the various components that comprise the suit, the materials and their chemical composition, and the 3-D CAD model of the suit. In addition, the male (CAM) and female (CAF) computerized anatomical models are also discussed in detail. We combined the spacesuit and the human models, that is, we developed a method of incorporating the human models in the Mark III spacesuit and performed a ray-tracing technique to determine the space radiation shielding distributions for all of the critical body organs. These body organ shielding distributions include the BFO (Blood-Forming Organs), skin, eye, lungs, stomach, and colon, to name a few, for both the male and female. Using models of the trapped (Van Allen) proton and electron environments, radiation exposures were computed for a typical low earth orbit (LEO) EVA mission scenario as well as for the high-electron geostationary (GEO) environment. A radiation exposure assessment of these mission scenarios is made to determine whether or not the crew radiation exposure limits are satisfied, and if not, the additional shielding material that would be required to satisfy the crew limits.
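The ray-tracing step behind a shielding distribution can be illustrated with a toy geometry: isotropic rays from a dose point beneath an infinite uniform slab accumulate areal density proportional to the slant path. The real analysis traces rays through the CAD spacesuit and anatomical models; the slab thickness and material density here are hypothetical.

```python
import random

def slab_areal_densities(t_cm=2.0, density_g_cm3=2.7, n_rays=10_000, seed=7):
    """Ray-trace the areal density (g/cm^2) seen by a dose point beneath an
    infinite uniform slab: an isotropic ray at polar angle theta has slant
    path t/cos(theta), so areal density = density * t / cos(theta).
    Sampling cos(theta) uniformly on (0, 1] gives an isotropic hemisphere."""
    rng = random.Random(seed)
    out = []
    for _ in range(n_rays):
        cos_theta = rng.random() or 1e-12  # guard the cos(theta) = 0 corner
        out.append(density_g_cm3 * t_cm / cos_theta)
    return sorted(out)

# The sorted list is the shielding distribution; its low end approaches the
# normal-incidence areal density, density * thickness = 5.4 g/cm^2 here.
areal = slab_areal_densities()
```

Folding such a distribution against a dose-versus-depth curve for the trapped proton and electron environments yields the organ dose estimate.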

  15. A hybrid solution using computational prediction and measured data to accurately determine process corrections with reduced overlay sampling

    NASA Astrophysics Data System (ADS)

    Noyes, Ben F.; Mokaberi, Babak; Mandoy, Ram; Pate, Alex; Huijgen, Ralph; McBurney, Mike; Chen, Owen

    2017-03-01

Reducing overlay error via an accurate APC feedback system is one of the main challenges in high volume production of the current and future nodes in the semiconductor industry. The overlay feedback system directly affects the number of dies meeting overlay specification and the number of layers requiring dedicated exposure tools through the fabrication flow. Increasing the former number and reducing the latter number is beneficial for the overall efficiency and yield of the fabrication process. An overlay feedback system requires accurate determination of the overlay error, or fingerprint, on exposed wafers in order to determine corrections to be automatically and dynamically applied to the exposure of future wafers. Since current and future nodes require correction per exposure (CPE), the resolution of the overlay fingerprint must be high enough to accommodate CPE in the overlay feedback system, or overlay control module (OCM). Determining a high resolution fingerprint from measured data requires extremely dense overlay sampling that takes a significant amount of measurement time. For static corrections this is acceptable, but in an automated dynamic correction system this method creates extreme bottlenecks for the throughput of said system as new lots have to wait until the previous lot is measured. One solution is to use a less dense overlay sampling scheme and computationally up-sample the data to a dense fingerprint. That method uses a global fingerprint model over the entire wafer; measured localized overlay errors are therefore not always represented in its up-sampled output. This paper discusses a hybrid system, shown in Fig. 1, that combines a computationally up-sampled fingerprint with the measured data to more accurately capture the actual fingerprint, including local overlay errors. Such a hybrid system is shown to result in reduced modelled residuals while determining the fingerprint, and better on-product overlay performance.
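One way to realize such a hybrid is to fit a low-order global model to the sparse measurements, up-sample it onto a dense grid, and then restore the measured residuals at the sampled sites so local overlay errors are not smoothed away. The 1-D polynomial sketch below illustrates only this idea; it is not the production OCM algorithm, and the data are synthetic.

```python
import numpy as np

def hybrid_fingerprint(x_meas, y_meas, x_dense, order=3):
    """Fit a low-order global model to sparse overlay measurements,
    up-sample it onto a dense grid, then add the measured residuals
    back at the nearest dense sites so local errors survive."""
    coeffs = np.polyfit(x_meas, y_meas, order)          # global model
    dense = np.polyval(coeffs, x_dense)                 # up-sampled fingerprint
    residuals = y_meas - np.polyval(coeffs, x_meas)     # local overlay errors
    idx = np.argmin(np.abs(x_dense[:, None] - x_meas[None, :]), axis=0)
    hybrid = dense.copy()
    hybrid[idx] += residuals                            # hybrid correction
    return dense, hybrid

# Synthetic 1-D "wafer": smooth quadratic fingerprint plus one local bump
# at x = 0.5 that a purely global up-sampled model partially smooths away.
x_meas = np.linspace(0.0, 1.0, 9)
y_meas = 0.5 * x_meas**2 + np.where(x_meas == 0.5, 0.1, 0.0)
x_dense = np.linspace(0.0, 1.0, 81)
dense, hybrid = hybrid_fingerprint(x_meas, y_meas, x_dense)
```

The hybrid output reproduces the measured bump at the sampled site, while the purely global up-sampled fingerprint absorbs only part of it.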

  16. A Perceptual Pathway to Bias: Interracial Exposure Reduces Abrupt Shifts in Real-Time Race Perception That Predict Mixed-Race Bias.

    PubMed

    Freeman, Jonathan B; Pauker, Kristin; Sanchez, Diana T

    2016-04-01

    In two national samples, we examined the influence of interracial exposure in one's local environment on the dynamic process underlying race perception and its evaluative consequences. Using a mouse-tracking paradigm, we found in Study 1 that White individuals with low interracial exposure exhibited a unique effect of abrupt, unstable White-Black category shifting during real-time perception of mixed-race faces, consistent with predictions from a neural-dynamic model of social categorization and computational simulations. In Study 2, this shifting effect was replicated and shown to predict a trust bias against mixed-race individuals and to mediate the effect of low interracial exposure on that trust bias. Taken together, the findings demonstrate that interracial exposure shapes the dynamics through which racial categories activate and resolve during real-time perceptions, and these initial perceptual dynamics, in turn, may help drive evaluative biases against mixed-race individuals. Thus, lower-level perceptual aspects of encounters with racial ambiguity may serve as a foundation for mixed-race prejudice. © The Author(s) 2016.

  17. Optimal segmentation and packaging process

    DOEpatents

    Kostelnik, K.M.; Meservey, R.H.; Landon, M.D.

    1999-08-10

A process for improving packaging efficiency uses three dimensional, computer simulated models with various optimization algorithms to determine the optimal segmentation process and packaging configurations based on constraints including container limitations. The present invention is applied to a process for decontaminating, decommissioning (D and D), and remediating a nuclear facility involving the segmentation and packaging of contaminated items in waste containers in order to minimize the number of cuts, maximize packaging density, and reduce worker radiation exposure. A three-dimensional, computer-simulated facility model of the contaminated items is created. The contaminated items are differentiated. The optimal location, orientation and sequence of the segmentation and packaging of the contaminated items is determined using the simulated model, the algorithms, and various constraints including container limitations. The cut locations and orientations are transposed to the simulated model. The contaminated items are actually segmented and packaged. The segmentation and packaging may be simulated beforehand. In addition, the contaminated items may be cataloged and recorded. 3 figs.

  18. Risk factors for keratinocyte skin cancer in patients diagnosed with melanoma, a large retrospective study.

    PubMed

    Espinosa, Pablo; Pfeiffer, Ruth M; García-Casado, Zaida; Requena, Celia; Landi, Maria Teresa; Kumar, Rajiv; Nagore, Eduardo

    2016-01-01

Melanoma survivors are at an increased risk of developing other malignancies, including keratinocyte skin cancer (KSC). While it is known that many risk factors for melanoma also impact risk of KSC in the general population, no previous study has investigated risk factors for KSC development in melanoma patients. We assessed associations of personal and clinical characteristics, including skin phenotype and variations in the melanocortin 1 receptor (MC1R) gene, with KSC risk in melanoma patients. We used prospective follow-up information on 1200 patients treated for melanoma at the Instituto Valenciano de Oncología, Spain, between 2000 and 2011. We computed hazard ratios and 95% confidence intervals (CIs) for the association of clinical, personal and genetic characteristics with risk of KSC, squamous cell carcinoma (SCC), or basal cell carcinoma (BCC) from Cox proportional hazards models. Five-year cumulative incidence of SCC, BCC, or KSC overall was computed using multivariate subdistribution hazard (competing-risk) models. To assess predictive performance of the models, we computed areas under the receiver-operating characteristic curves (AUCs, discriminatory power) using cross-validation. Median follow-up was 57.2 months; a KSC was detected in 163 patients (13.6%). In multivariable Cox models, age, sex, sunburns, chronic sun exposure, past personal history of non-melanoma skin cancer or other non-cutaneous neoplasia, and the MC1R variants p.D294H and p.R163Q were significantly associated with KSC risk. A cumulative incidence model including age, sex, personal history of KSC, and of other non-cutaneous neoplasia had an AUC of 0.76 (95% CI: 0.71-0.80). When p.D294H and p.R163Q variants were added to the model, the AUC increased to 0.81 (95% CI: 0.77-0.84) (p-value for difference <0.0001). In addition to age, sex, skin characteristics, and sun exposure, p.R163Q and p.D294H MC1R variants significantly increased KSC risk among melanoma patients.
Our findings may help identify patients who could benefit most from preventive measures. Copyright © 2015 Elsevier Ltd. All rights reserved.

  19. FUZZY COMPUTATIONAL MODELS TO EVALUATE THE EFFECTS OF AIR POLLUTION ON CHILDREN.

    PubMed

    David, Gleise Silva; Rizol, Paloma Maria Silva Rocha; Nascimento, Luiz Fernando Costa

    2018-01-01

To build a fuzzy computational model to estimate the number of hospitalizations of children aged up to 10 years due to respiratory conditions based on pollutants and climatic factors in the city of São José do Rio Preto, Brazil. A computational model was constructed using fuzzy logic. The model has 4 inputs, each with 2 membership functions generating 16 rules, and an output with 5 membership functions, based on Mamdani's method, to estimate the association between the pollutants and the number of hospitalizations. Hospitalization data for 2011-2013 were obtained from DATASUS, and data on the pollutants particulate matter (PM10) and nitrogen dioxide (NO2), wind speed, and temperature were obtained from the Environmental Company of São Paulo State (CETESB). A total of 1,161 children were hospitalized in the period, and the mean pollutant concentrations were 36 µg/m³ for PM10 and 51 µg/m³ for NO2. The best Pearson correlation (0.34) and accuracy values, measured by the receiver operating characteristic (ROC) curve (96.7% for NO2 and 90.4% for PM10), were for hospitalizations on the same day as exposure. The model was effective in predicting the number of hospitalizations of children and could be used as a tool in the hospital management of the studied region.
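A Mamdani pipeline of the kind described (fuzzify inputs, fire AND rules with min, aggregate with max, defuzzify by centroid) can be sketched in miniature. For brevity this sketch uses 2 inputs with 2 membership functions each (4 rules rather than the paper's 16), and every membership function and rule below is invented, not the paper's calibrated system.

```python
def tri(x, a, b, c):
    """Triangular membership function rising on [a, b], falling on [b, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical membership functions for two pollutant inputs (ug/m3)
# and a hospitalization-count output on [0, 30].
pm10 = {"low": lambda x: tri(x, -1, 0, 60), "high": lambda x: tri(x, 30, 100, 201)}
no2 = {"low": lambda x: tri(x, -1, 0, 70), "high": lambda x: tri(x, 40, 120, 241)}
hosp = {"few": lambda y: tri(y, -1, 0, 10),
        "some": lambda y: tri(y, 5, 10, 15),
        "many": lambda y: tri(y, 10, 20, 31)}

# Mamdani rules: (pm10 level, no2 level) -> output set; AND uses min.
rules = [("low", "low", "few"), ("low", "high", "some"),
         ("high", "low", "some"), ("high", "high", "many")]

def predict(pm, no):
    """Min implication, max aggregation, discrete centroid over [0, 30]."""
    ys = [y * 0.1 for y in range(301)]
    agg = [0.0] * len(ys)
    for p_lvl, n_lvl, out in rules:
        strength = min(pm10[p_lvl](pm), no2[n_lvl](no))
        for i, y in enumerate(ys):
            agg[i] = max(agg[i], min(strength, hosp[out](y)))
    den = sum(agg)
    return sum(y * m for y, m in zip(ys, agg)) / den if den else 0.0
```

Calling `predict(36, 51)` (the mean pollutant levels reported) returns an estimated count on the output scale; higher pollutant inputs push the centroid toward the "many" set.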

  20. Occupational exposure to ionizing radiation and electromagnetic fields in relation to the risk of thyroid cancer in Sweden.

    PubMed

    Lope, Virginia; Pérez-Gómez, Beatriz; Aragonés, Nuria; López-Abente, Gonzalo; Gustavsson, Per; Floderus, Birgitta; Dosemeci, Mustafa; Silva, Agustín; Pollán, Marina

    2006-08-01

This study sought to ascertain the risk of thyroid cancer in relation to occupational exposure to ionizing radiation and extremely low-frequency magnetic fields (ELFMF) in a cohort representative of Sweden's gainfully employed population. A historical cohort of 2 992 166 gainfully employed Swedish male and female workers was followed up from 1971 through 1989. Exposure to ELFMF and ionizing radiation was assessed using three job exposure matrices based on industrial branch or occupational codes. Relative risks (RR) for male and female workers, adjusted for age and geographic area, were computed using log-linear Poisson models. Occupational ELFMF exposure showed no effect on the risk of thyroid cancer in the study. However, female workers exposed to high intensities of ionizing radiation registered a marked excess risk (RR 1.85; 95% confidence interval (95% CI) 1.02-3.35). This trend was not in evidence among the men. While the study confirms the etiologic role of ionizing radiation, with a higher incidence of thyroid cancer being recorded for the most-exposed female workers, our results do not support the possibility of occupational exposure to ELFMF being a risk factor for the development of thyroid cancer.
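The underlying rate-ratio computation can be shown for a single unadjusted comparison; the study itself fit age- and area-adjusted log-linear Poisson models, and the counts below are hypothetical, not the cohort's data.

```python
import math

def rate_ratio(cases_exp, py_exp, cases_ref, py_ref, z=1.96):
    """Crude incidence rate ratio with a Wald 95% CI on the log scale:
    RR = (cases_exp/py_exp) / (cases_ref/py_ref),
    SE(log RR) = sqrt(1/cases_exp + 1/cases_ref)."""
    rr = (cases_exp / py_exp) / (cases_ref / py_ref)
    se = math.sqrt(1.0 / cases_exp + 1.0 / cases_ref)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Hypothetical counts: 12 cancers in 40,000 person-years exposed
# versus 90 in 600,000 person-years in the reference group.
rr, lo, hi = rate_ratio(12, 40_000, 90, 600_000)
```

A Poisson regression with a person-years offset generalizes this to several exposure levels while adjusting for covariates such as age and geographic area.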

  1. Do Hassles and Uplifts Change with Age? Longitudinal Findings from the VA Normative Aging Study

    PubMed Central

    Aldwin, Carolyn M.; Jeong, Yu-Jin; Igarashi, Heidi; Spiro, Avron

    2014-01-01

To examine emotion regulation in later life, we contrasted the modified hedonic treadmill theory with developmental theories, using hassles and uplifts to assess emotion regulation in context. The sample was 1,315 men from the VA Normative Aging Study aged 53 to 85 years, who completed 3,894 observations between 1989 and 2004. We computed three scores for both hassles and uplifts: intensity (ratings reflecting appraisal processes), exposure (count), and summary (total) scores. Growth curves over age showed marked differences in trajectory patterns for intensity and exposure scores. Although exposure to hassles and uplifts decreased in later life, intensity scores increased. Group-based modeling showed individual differences in patterns of hassles and uplifts intensity and exposure, with relative stability in uplifts intensity, normative non-linear changes in hassles intensity, and complex patterns of individual differences in exposure for both hassles and uplifts. Analyses with the summary scores showed that emotion regulation in later life is a function of both developmental change and contextual exposure, with different patterns emerging for hassles and uplifts. Thus, support was found for both hedonic treadmill and developmental change theories, reflecting different aspects of emotion regulation in late life. PMID:24660796

  2. Do hassles and uplifts change with age? Longitudinal findings from the VA normative aging study.

    PubMed

    Aldwin, Carolyn M; Jeong, Yu-Jin; Igarashi, Heidi; Spiro, Avron

    2014-03-01

    To examine emotion regulation in later life, we contrasted the modified hedonic treadmill theory with developmental theories, using hassles and uplifts to assess emotion regulation in context. The sample was 1,315 men from the VA Normative Aging Study aged 53 to 85 years, who completed 3,894 observations between 1989 and 2004. We computed 3 scores for both hassles and uplifts: intensity (ratings reflecting appraisal processes), exposure (count), and summary (total) scores. Growth curves over age showed marked differences in trajectory patterns for intensity and exposure scores. Although exposure to hassles and uplifts decreased in later life, intensity scores increased. Group-based modeling showed individual differences in patterns of hassles and uplifts intensity and exposure, with relative stability in uplifts intensity, normative nonlinear changes in hassles intensity, and complex patterns of individual differences in exposure for both hassles and uplifts. Analyses with the summary scores showed that emotion regulation in later life is a function of both developmental change and contextual exposure, with different patterns emerging for hassles and uplifts. Thus, support was found for both hedonic treadmill and developmental change theories, reflecting different aspects of emotion regulation in late life. (c) 2014 APA, all rights reserved.

  3. Development of 1-year-old computational phantom and calculation of organ doses during CT scans using Monte Carlo simulation.

    PubMed

    Pan, Yuxi; Qiu, Rui; Gao, Linfeng; Ge, Chaoyong; Zheng, Junzheng; Xie, Wenzhang; Li, Junli

    2014-09-21

    With the rapidly growing number of CT examinations, the consequential radiation risk has aroused more and more attention. The average dose in each organ during CT scans can only be obtained by using Monte Carlo simulation with computational phantoms. Since children tend to have higher radiation sensitivity than adults, the radiation dose of pediatric CT examinations requires special attention and needs to be assessed accurately. So far, studies on organ doses from CT exposures for pediatric patients are still limited. In this work, a 1-year-old computational phantom was constructed. The body contour was obtained from the CT images of a 1-year-old physical phantom and the internal organs were deformed from an existing Chinese reference adult phantom. To ensure the organ locations in the 1-year-old computational phantom were consistent with those of the physical phantom, the organ locations in 1-year-old computational phantom were manually adjusted one by one, and the organ masses were adjusted to the corresponding Chinese reference values. Moreover, a CT scanner model was developed using the Monte Carlo technique and the 1-year-old computational phantom was applied to estimate organ doses derived from simulated CT exposures. As a result, a database including doses to 36 organs and tissues from 47 single axial scans was built. It has been verified by calculation that doses of axial scans are close to those of helical scans; therefore, this database could be applied to helical scans as well. Organ doses were calculated using the database and compared with those obtained from the measurements made in the physical phantom for helical scans. The differences between simulation and measurement were less than 25% for all organs. The result shows that the 1-year-old phantom developed in this work can be used to calculate organ doses in CT exposures, and the dose database provides a method for the estimation of 1-year-old patient doses in a variety of CT examinations.

  4. Development of 1-year-old computational phantom and calculation of organ doses during CT scans using Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Pan, Yuxi; Qiu, Rui; Gao, Linfeng; Ge, Chaoyong; Zheng, Junzheng; Xie, Wenzhang; Li, Junli

    2014-09-01

    With the rapidly growing number of CT examinations, the consequential radiation risk has aroused more and more attention. The average dose in each organ during CT scans can only be obtained by using Monte Carlo simulation with computational phantoms. Since children tend to have higher radiation sensitivity than adults, the radiation dose of pediatric CT examinations requires special attention and needs to be assessed accurately. So far, studies on organ doses from CT exposures for pediatric patients are still limited. In this work, a 1-year-old computational phantom was constructed. The body contour was obtained from the CT images of a 1-year-old physical phantom and the internal organs were deformed from an existing Chinese reference adult phantom. To ensure the organ locations in the 1-year-old computational phantom were consistent with those of the physical phantom, the organ locations in 1-year-old computational phantom were manually adjusted one by one, and the organ masses were adjusted to the corresponding Chinese reference values. Moreover, a CT scanner model was developed using the Monte Carlo technique and the 1-year-old computational phantom was applied to estimate organ doses derived from simulated CT exposures. As a result, a database including doses to 36 organs and tissues from 47 single axial scans was built. It has been verified by calculation that doses of axial scans are close to those of helical scans; therefore, this database could be applied to helical scans as well. Organ doses were calculated using the database and compared with those obtained from the measurements made in the physical phantom for helical scans. The differences between simulation and measurement were less than 25% for all organs. The result shows that the 1-year-old phantom developed in this work can be used to calculate organ doses in CT exposures, and the dose database provides a method for the estimation of 1-year-old patient doses in a variety of CT examinations.

  5. The scientific jigsaw puzzle: Fitting the pieces of the low-level radiation debate

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beyea, Jan

    2012-05-01

    Quantitative risk estimates from exposure to ionizing radiation are dominated by analysis of the one-time exposures received by the Japanese survivors at Hiroshima and Nagasaki. Three recent epidemiologic studies suggest that the risk from protracted exposure is no lower, and in fact may be higher, than from single exposures. There is near-universal acceptance that epidemiologic data demonstrate an excess risk of delayed cancer incidence above a dose of 0.1 sievert (Sv), which, for the average American, is equivalent to 40 years of unavoidable exposure from natural background radiation. Model fits, both parametric and nonparametric, to the atomic-bomb data support a linear no-threshold model below 0.1 Sv. On the basis of biologic arguments, the scientific establishment in the United States and many other countries accepts this dose model down to zero dose, but there is spirited dissent. The dissent may be irrelevant for developed countries, given the increase in medical diagnostic radiation in recent decades; a sizeable percentage of the population will receive cumulative doses from medical procedures in excess of 0.1 Sv, making talk of a threshold or other sublinear response below that dose moot for future releases from nuclear facilities or a dirty bomb. The risks from both medical diagnostic doses and nuclear accident doses can be computed using the linear dose-response model, with uncertainties assigned below 0.1 Sv in a way that captures alternative scientific hypotheses. Then the important debate over low-level radiation exposures, namely planning for accident response and weighing the benefits and risks of technologies, can proceed with less distraction. One of the biggest paradoxes in the low-level radiation debate is that an individual risk can be a minor concern while the societal risk (the total delayed cancers in an exposed population) can be of major concern.

  6. WAZA-ARI: computational dosimetry system for X-ray CT examinations II: development of web-based system.

    PubMed

    Ban, Nobuhiko; Takahashi, Fumiaki; Ono, Koji; Hasegawa, Takayuki; Yoshitake, Takayasu; Katsunuma, Yasushi; Sato, Kaoru; Endo, Akira; Kai, Michiaki

    2011-07-01

    A web-based dose computation system, WAZA-ARI, is being developed for patients undergoing X-ray CT examinations. The system is implemented in Java on a Linux server running Apache Tomcat. Users choose scanning options and input parameters via a web browser over the Internet. Dose coefficients, which were calculated in a Japanese adult male phantom (JM phantom), are retrieved upon user request and summed over the scan range specified by the user to estimate a normalised dose. Tissue doses are finally computed from the radiographic exposure (mAs) and the pitch factor. While dose coefficients are currently available only for a limited number of CT scanner models, the system has achieved a high degree of flexibility and scalability without the use of commercial software.
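    The dose chain described above (coefficients summed over the scan range, then scaled by exposure and pitch) can be sketched as follows; the coefficient values, function name, and table layout are hypothetical illustrations, not the actual WAZA-ARI internals:

```python
# Sketch of a WAZA-ARI-style tissue dose estimate (hypothetical values).
def tissue_dose(coeffs, start, stop, mAs, pitch):
    """Tissue dose = (dose coefficients summed over the scan range,
    i.e. the normalised dose) x radiographic exposure (mAs) / pitch."""
    normalised = sum(coeffs[start:stop])  # per-axial-position coefficients
    return normalised * mAs / pitch

coeffs = [0.010, 0.012, 0.015, 0.011]  # hypothetical coefficients, mGy/mAs
print(tissue_dose(coeffs, 1, 3, 100, 1.375))
```

The linear scaling with mAs and the division by pitch mirror the abstract's description; a real system would index coefficients by scanner model and organ.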

  7. Computational Approaches and Tools for Exposure Prioritization and Biomonitoring Data Interpretation

    EPA Science Inventory

    The ability to describe the source-environment-exposure-dose-response continuum is essential for identifying exposures of greater concern to prioritize chemicals for toxicity testing or risk assessment, as well as for interpreting biomarker data for better assessment of exposure ...

  8. Human exposure assessment in the near field of GSM base-station antennas using a hybrid finite element/method of moments technique.

    PubMed

    Meyer, Frans J C; Davidson, David B; Jakobus, Ulrich; Stuchly, Maria A

    2003-02-01

    A hybrid finite-element method (FEM)/method of moments (MoM) technique is employed for specific absorption rate (SAR) calculations in a human phantom in the near field of a typical Global System for Mobile Communications (GSM) base-station antenna. The MoM is used to model the metallic surfaces and wires of the base-station antenna, and the FEM is used to model the heterogeneous human phantom. The advantages of each of these frequency-domain techniques are thus exploited, leading to a highly efficient and robust numerical method for this type of bioelectromagnetic problem. The basic mathematical formulation of the hybrid technique is presented, followed by a discussion of important implementation details, in particular the linear algebra routines for sparse, complex FEM matrices combined with dense MoM matrices. The implementation is validated by comparing results to MoM (surface equivalence principle) and finite-difference time-domain (FDTD) solutions of human exposure problems. A comparison of the computational efficiency of the different techniques is presented. The FEM/MoM implementation is then used for whole-body and critical-organ SAR calculations in a phantom at different positions in the near field of a base-station antenna. This problem cannot, in general, be solved using the MoM or FDTD alone due to computational limitations. This paper shows that the hybrid FEM/MoM implementation is an efficient numerical tool for accurate assessment of human exposure in the near field of base-station antennas.

  9. Air toxics and birth defects: a Bayesian hierarchical approach to evaluate multiple pollutants and spina bifida.

    PubMed

    Swartz, Michael D; Cai, Yi; Chan, Wenyaw; Symanski, Elaine; Mitchell, Laura E; Danysh, Heather E; Langlois, Peter H; Lupo, Philip J

    2015-02-09

    While there is evidence that maternal exposure to benzene is associated with spina bifida in offspring, to our knowledge there have been no assessments of the simultaneous role of multiple hazardous air pollutants (HAPs) in the risk of this relatively common birth defect. In the current study, we evaluated the association between maternal exposure to HAPs identified by the United States Environmental Protection Agency (U.S. EPA) and spina bifida in offspring using hierarchical Bayesian modeling with Stochastic Search Variable Selection (SSVS). The Texas Birth Defects Registry provided data on spina bifida cases delivered between 1999 and 2004. The control group was a random sample of unaffected live births, frequency matched to cases on year of birth. Census tract-level estimates of annual HAP levels were obtained from the U.S. EPA's 1999 Assessment System for Population Exposure Nationwide. Using the distribution among controls, exposure was categorized as high (>95th percentile), medium (5th-95th percentile), and low (<5th percentile, reference). We used hierarchical Bayesian logistic regression models with SSVS to evaluate the association between HAPs and spina bifida, computing an odds ratio (OR) for each HAP from the posterior mean and a 95% credible interval (CI) from the 2.5th and 97.5th quantiles of the posterior samples. Based on previous assessments, any pollutant with a Bayes factor greater than 1 was selected for inclusion in a final model. Twenty-five HAPs were selected in the final analysis to represent "bins" of highly correlated HAPs (ρ > 0.80). We identified two of the 25 HAPs with a Bayes factor greater than 1: quinoline (OR for high exposure = 2.06, 95% CI: 1.11-3.87, Bayes factor = 1.01) and trichloroethylene (OR for medium exposure = 2.00, 95% CI: 1.14-3.61, Bayes factor = 3.79). Overall, there is evidence that quinoline and trichloroethylene may be significant contributors to the risk of spina bifida. Additionally, Bayesian hierarchical modeling with SSVS is an alternative approach for evaluating the effects of multiple environmental pollutants on disease risk, and can be easily extended to other environmental exposures where novel approaches are needed in the context of multi-pollutant modeling.
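    The OR and CI summaries described above reduce to simple functions of the posterior samples of a log-odds coefficient. A minimal stdlib-only sketch, using simulated draws in place of actual MCMC output (the location and scale below are arbitrary illustrations, not study values):

```python
import math
import random

random.seed(0)
# Simulated posterior samples of a log-odds coefficient, standing in
# for MCMC output from the hierarchical logistic regression.
beta = sorted(random.gauss(0.7, 0.3) for _ in range(5000))

# OR from the posterior mean; 95% CI from the 2.5th and 97.5th
# quantiles of the posterior samples, as in the abstract.
odds_ratio = math.exp(sum(beta) / len(beta))
ci_low = math.exp(beta[int(0.025 * len(beta))])
ci_high = math.exp(beta[int(0.975 * len(beta))])
print(round(odds_ratio, 2), round(ci_low, 2), round(ci_high, 2))
```

Because exp is monotonic, exponentiating the quantiles of the coefficient gives the same interval as taking quantiles of the exponentiated samples.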

  10. Reconstruction of improvised explosive device blast loading to personnel in the open

    NASA Astrophysics Data System (ADS)

    Wiri, Suthee; Needham, Charles

    2016-05-01

    Significant advances in reconstructing attacks by improvised explosive devices (IEDs) and other blast events are reported. A high-fidelity three-dimensional computational fluid dynamics tool, the Second-order Hydrodynamic Automatic Mesh Refinement Code, was used for the analysis. Computer-aided design models for subjects or vehicles in the scene accurately represent the geometries of objects in the blast field. A wide range of scenario types and blast exposure levels were reconstructed, including free-field blast, the enclosed space of a vehicle cabin, IED attack on a vehicle, buried charges, recoilless rifle operation, rocket-propelled grenade attack, and missile attack, with single-subject or multiple-subject exposure to pressure levels from ~27.6 kPa (~4 psi) to greater than 690 kPa (>100 psi). To create a full 3D pressure time-resolved reconstruction of a blast event for injury and blast exposure analysis, a combination of intelligence data and Blast Gauge data can be used to reconstruct an actual in-theatre blast event. The methodology to reconstruct an event and the "lessons learned" from multiple reconstructions in open space are presented. The analysis uses records of blast pressure at discrete points, and the output is a spatial and temporal blast load distribution for all personnel involved.

  11. Illuminating cancer health disparities using ethnogenetic layering (EL) and phenotype segregation network analysis (PSNA).

    PubMed

    Jackson, Fatimah L C

    2006-01-01

    Resolving cancer health disparities continues to befuddle simplistic racial models. The racial groups alluded to in biomedicine, public health, and epidemiology are often profoundly substructured. EL and PSNA are computationally assisted techniques that focus on microethnic group (MEG) substructure. Geographical variations in cancer may be due to differences in MEG ancestry or similar environmental exposures to a recognized carcinogen. Examples include breast and prostate cancers in the Chesapeake Bay region and Bight of Biafra biological ancestry, hypertension and stroke in the Carolina Coast region and Central African biological ancestry, and pancreatic cancer in the Mississippi Delta region and dietary/medicinal exposure to safrole from Sassafras albidum.

  12. Index extraction for electromagnetic field evaluation of high power wireless charging system

    PubMed Central

    2017-01-01

    This paper presents precise dosimetry for a highly resonant wireless power transfer (HR-WPT) system using an anatomically realistic human voxel model. The dosimetry for the HR-WPT system, designed to operate at 13.56 MHz (one of the ISM frequency bands), is conducted at various distances between the human model and the system, and under both alignment and misalignment of the transmitting and receiving circuits. The specific absorption rates in the human body are computed by a two-step approach: in the first step, the field generated by the HR-WPT system is calculated, and in the second step the specific absorption rates are computed with the scattered-field finite-difference time-domain method, treating the fields obtained in the first step as the incident fields. Safety compliance for non-uniform field exposure from the HR-WPT system is discussed with respect to the international safety guidelines. Furthermore, the coupling factor concept is employed to relax the maximum allowable transmitting power, and coupling factors derived from the dosimetry results are presented. In this calculation, the external magnetic field limit for the HR-WPT system can be relaxed by approximately a factor of four using the coupling factor in the worst exposure scenario. PMID:28708840

  13. Computer programs for producing single-event aircraft noise data for specific engine power and meteorological conditions for use with USAF (United States Air Force) community noise model (NOISEMAP)

    NASA Astrophysics Data System (ADS)

    Mohlman, H. T.

    1983-04-01

    The Air Force community noise prediction model (NOISEMAP) is used to describe the aircraft noise exposure around airbases and thereby aid airbase planners to minimize exposure and prevent community encroachment which could limit mission effectiveness of the installation. This report documents two computer programs (OMEGA 10 and OMEGA 11) which were developed to prepare aircraft flight and ground runup noise data for input to NOISEMAP. OMEGA 10 is for flight operations and OMEGA 11 is for aircraft ground runups. All routines in each program are documented at a level useful to a programmer working with the code or a reader interested in a general overview of what happens within a specific subroutine. Both programs input normalized, reference aircraft noise data; i.e., data at a standard reference distance from the aircraft, for several fixed engine power settings, a reference airspeed and standard day meteorological conditions. Both programs operate on these normalized, reference data in accordance with user-defined, non-reference conditions to derive single-event noise data for 22 distances (200 to 25,000 feet) in a variety of physical and psycho-acoustic metrics. These outputs are in formats ready for input to NOISEMAP.

  14. Assessment of phantom dosimetry and image quality of i-CAT FLX cone-beam computed tomography.

    PubMed

    Ludlow, John B; Walker, Cameron

    2013-12-01

    The increasing use of cone-beam computed tomography in orthodontics has been coupled with heightened concern about the long-term risks of x-ray exposure in orthodontic populations. An industry response has been to offer low-exposure alternative scanning options in newer cone-beam computed tomography models. Effective doses resulting from various combinations of field-of-view size and field location, comparing child and adult anthropomorphic phantoms with the recently introduced i-CAT FLX cone-beam computed tomography unit (Imaging Sciences, Hatfield, Pa), were measured with optically stimulated dosimetry using previously validated protocols. Scan protocols included high resolution (360° rotation, 600 image frames, 120 kVp, 5 mA, 7.4 seconds), standard (360°, 300 frames, 120 kVp, 5 mA, 3.7 seconds), QuickScan (180°, 160 frames, 120 kVp, 5 mA, 2 seconds), and QuickScan+ (180°, 160 frames, 90 kVp, 3 mA, 2 seconds). Contrast-to-noise ratio was calculated as a quantitative measure of image quality for the various exposure options using the QUART DVT phantom. Child phantom doses were on average 36% greater than adult phantom doses. QuickScan+ protocols resulted in significantly lower doses than standard protocols for the child (P = 0.0167) and adult (P = 0.0055) phantoms. The 13 × 16-cm cephalometric fields of view ranged from 11 to 85 μSv in the adult phantom and 18 to 120 μSv in the child phantom for the QuickScan+ and standard protocols, respectively. The contrast-to-noise ratio was reduced by approximately two thirds when comparing QuickScan+ with standard exposure parameters. QuickScan+ effective doses are comparable with conventional panoramic examinations. Significant dose reductions are accompanied by significant reductions in image quality. However, this trade-off might be acceptable for certain diagnostic tasks, such as interim assessment of treatment results.
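    Effective doses like those quoted above are tissue-weighted sums of organ equivalent doses, E = Σ w_T·H_T. A small sketch using an illustrative subset of ICRP 103-style tissue weighting factors and entirely hypothetical organ doses (a real calculation uses the full weight set over all measured organs):

```python
# Effective dose as a tissue-weighted sum of organ equivalent doses.
# The weights below are an illustrative subset in the style of ICRP 103;
# the organ dose values are hypothetical, not measurements from the study.
tissue_weights = {"thyroid": 0.04, "salivary_glands": 0.01,
                  "brain": 0.01, "remainder": 0.12}

def effective_dose(organ_doses_uSv, weights):
    """Sum w_T * H_T over tissues; doses in microsievert."""
    return sum(weights[t] * h for t, h in organ_doses_uSv.items())

doses = {"thyroid": 400.0, "salivary_glands": 900.0,
         "brain": 150.0, "remainder": 120.0}  # hypothetical values
print(effective_dose(doses, tissue_weights))
```

This is why field-of-view location matters so much in the reported results: moving a field off a heavily weighted tissue lowers E even at identical technique factors.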

  15. InMAP: a new model for air pollution interventions

    NASA Astrophysics Data System (ADS)

    Tessum, C. W.; Hill, J. D.; Marshall, J. D.

    2015-10-01

    Mechanistic air pollution models are essential tools in air quality management. Widespread use of such models is hindered, however, by the extensive expertise or computational resources needed to run most models. Here, we present InMAP (Intervention Model for Air Pollution), which offers an alternative to comprehensive air quality models for estimating the air pollution health impacts of emission reductions and other potential interventions. InMAP estimates annual-average changes in primary and secondary fine particle (PM2.5) concentrations (the air pollution outcome generally causing the largest monetized health damages) attributable to annual changes in precursor emissions. InMAP leverages pre-processed physical and chemical information from the output of a state-of-the-science chemical transport model (WRF-Chem) within an Eulerian modeling framework to perform simulations that are several orders of magnitude less computationally intensive than comprehensive model simulations. InMAP uses a variable-resolution grid that focuses on human exposures by employing higher spatial resolution in urban areas and lower spatial resolution in rural and remote locations and in the upper atmosphere, and by directly calculating steady-state, annual-average concentrations. In the comparisons run here, InMAP recreates WRF-Chem predictions of changes in total PM2.5 concentrations with population-weighted mean fractional error (MFE) and bias (MFB) < 10% and population-weighted R2 ~ 0.99. Among individual PM2.5 species, the best predictive performance is for primary PM2.5 (MFE: 16%; MFB: 13%) and the worst is for particulate nitrate (MFE: 119%; MFB: 106%). Potential uses of InMAP include studying exposure, health, and environmental justice impacts of potential shifts in emissions for annual-average PM2.5. Features planned for future model releases include a larger spatial domain, more temporal information, and the ability to predict ground-level ozone (O3) concentrations. The InMAP model source code and input data are freely available online.
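    The MFE and MFB statistics quoted above have standard definitions in air quality model evaluation; a minimal sketch with hypothetical sample concentrations (the real evaluation is population-weighted over grid cells, which this omits):

```python
# Mean fractional bias (MFB) and mean fractional error (MFE), the
# standard model-evaluation metrics quoted for InMAP.
def mfb(model, obs):
    return sum(2 * (m - o) / (m + o) for m, o in zip(model, obs)) / len(obs)

def mfe(model, obs):
    return sum(2 * abs(m - o) / (m + o) for m, o in zip(model, obs)) / len(obs)

wrf_chem = [8.0, 12.0, 5.0]  # reference concentrations, ug/m3 (hypothetical)
inmap    = [9.0, 11.0, 5.5]  # model predictions (hypothetical)
print(round(mfb(inmap, wrf_chem), 3), round(mfe(inmap, wrf_chem), 3))
```

Because each term is normalized by the mean of model and observation, both metrics are bounded (MFB in [-2, 2], MFE in [0, 2]), which is why they are preferred over raw percent error for near-zero concentrations.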

  16. A Liver-centric Multiscale Modeling Framework for Xenobiotics ...

    EPA Pesticide Factsheets

    We describe a multi-scale framework for modeling acetaminophen-induced liver toxicity. Acetaminophen is a widely used analgesic. Overdose of acetaminophen can result in liver injury via its biotransformation into a toxic product, which further induces massive necrosis. Our study focuses on developing a multi-scale computational model to characterize both phase I and phase II metabolism of acetaminophen, by bridging Physiologically Based Pharmacokinetic (PBPK) modeling at the whole-body level, cell movement and blood flow at the tissue level, and cell signaling and drug metabolism at the sub-cellular level. To validate the model, we estimated our model parameters by fitting serum concentrations of acetaminophen and its glucuronide and sulfate metabolites to experiments, and carried out sensitivity analysis on 35 parameters selected from three modules. This multiscale model bridges the CompuCell3D tool used by the Virtual Tissue project with the httk tool developed by the Rapid Exposure and Dosimetry project.

  17. Entrainment of circadian rhythms to irregular light/dark cycles: a subterranean perspective

    PubMed Central

    Flôres, Danilo E. F. L.; Jannetti, Milene G.; Valentinuzzi, Veronica S.; Oda, Gisele A.

    2016-01-01

    Synchronization of biological rhythms to the 24-hour day/night cycle has long been studied with model organisms under artificial light/dark cycles in the laboratory. The commonly used rectangular light/dark cycles, comprising hours of continuous light and darkness, may not be representative of the natural light exposure of most species, including humans. Subterranean rodents live in dark underground tunnels and offer a unique opportunity to investigate extreme mechanisms of photic entrainment in the wild. Here, we show automated field recordings of the daily light exposure patterns of a South American subterranean rodent, the tuco-tuco (Ctenomys aff. knighti). In the laboratory, we exposed tuco-tucos to a simplified version of this natural light exposure pattern, to determine the minimum light timing information that is necessary for synchronization. As predicted from our previous studies using mathematical modeling, the activity rhythm of tuco-tucos synchronized to this maximally simplified light/dark regimen consisting of a single light pulse per day, occurring at randomly scattered times within a day-length interval. Our integrated semi-natural, laboratory, and computer simulation findings indicate that photic entrainment of circadian oscillators is robust, even in the face of artificially reduced exposure and increased phase instability of the synchronizing stimuli. PMID:27698436
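    The robustness claim can be illustrated with a toy phase-only oscillator (my illustration under stated assumptions, not the authors' model): an oscillator whose free-running period exceeds 24 h, corrected once per day by a light pulse arriving at a jittered time, still settles into a stable phase relationship with the pulses.

```python
import random

random.seed(2)

# Toy phase-only entrainment model (hypothetical parameters): period 24.5 h,
# one light pulse per day at a random time within a 2-h window; each pulse
# corrects a fraction of the phase mismatch (a crude phase response curve).
period = 24.5
phase = 0.0   # oscillator phase, in hours
gain = 0.3    # fraction of the mismatch corrected per pulse

for day in range(200):
    pulse_time = 12.0 + random.uniform(-1.0, 1.0)   # jittered pulse time
    phase = (phase + period) % 24.0                 # free-run for one day
    mismatch = (pulse_time - phase + 12.0) % 24.0 - 12.0  # wrapped to +/-12 h
    phase = (phase + gain * mismatch) % 24.0        # photic correction

# The oscillator locks to a stable phase near the pulse window despite
# the daily jitter, i.e. it entrains.
print(round(phase, 1))
```

The steady-state phase sits slightly offset from the mean pulse time (the offset compensates the daily drift of period minus 24 h), which is the signature of stable entrainment.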

  18. Characterization and Evaluation of a Commercial WLAN System for Human Provocation Studies.

    PubMed

    Zentai, Norbert; Fiocchi, Serena; Parazzini, Marta; Trunk, Attila; Juhász, Péter; Ravazzani, Paolo; Hernádi, István; Thuróczy, György

    2015-01-01

    This work evaluates the complex exposure characteristics of Wireless Local Area Network (WLAN) technology and describes the design of a WLAN exposure system built from commercially available modular parts for the study of possible biological health effects of WLAN exposure in a controlled environment. The system consisted of an access point and a client unit (CU) with router board cards of types R52 and R52n, with 18 dBm and 25 dBm peak power, respectively. Free-space radiofrequency (RF) field measurements were performed with a field meter at a distance of 40 cm from the CU in order to evaluate the RF exposure for several signal configurations of the exposure system. Finally, the specific absorption rate (SAR) generated by the CU was estimated computationally in the head of two human models. Results suggest that exposure to the RF fields of WLAN systems strongly depends on the router configuration: the exposure was most stable and reliable when both antennas were active and vertically positioned, with the best signal quality obtained with the R52n router board at channel 9 in UDP mode. The maximum peak SAR levels, found over the skin, were far below the limits of the international guidelines.

  20. Numerical-experimental analysis of a carbon-phenolic composite via plasma jet ablation test

    NASA Astrophysics Data System (ADS)

    Guilherme Silva Pesci, Pedro; Araújo Machado, Humberto; Silva, Homero de Paula e.; Cley Paterniani Rita, Cristian; Petraconi Filho, Gilberto; Cocchieri Botelho, Edson

    2018-06-01

    Materials used in space vehicle components are subjected to thermally aggressive environments when exposed to atmospheric reentry. In order to protect the payload and the vehicle itself, ablative composites are employed as the TPS (Thermal Protection System). The development of TPS materials generally goes through phases of material production, atmospheric reentry testing, and comparison with a mathematical model. The state of the art comprises reentry tests in subsonic or supersonic arc-jet facilities and complex mathematical models, which normally require a large computational cost. This work presents a reliable method for estimating the performance of ablative composites, combining empirical and experimental data. Composite materials used in thermal protection systems were tested through exposure to a plasma jet, with heat fluxes emulating those present during atmospheric reentry of space vehicle components. The carbon/phenolic samples were tested in the hypersonic plasma tunnel of the Plasma and Process Laboratory at the Aeronautics Institute of Technology (ITA), using a plasma torch with a 50 kW DC power source. The plasma tunnel parameters were optimized to reproduce conditions close to the critical reentry point of the space vehicle payloads developed by the Aeronautics and Space Institute (IAE). The specimens under study were developed and manufactured in Brazil. The mass loss and specific mass loss rates of the samples and the back-surface temperatures, as a function of exposure time to the thermal flow, were determined. A computational simulation based on a two-front ablation model was performed in order to compare the test and simulation results. The results allowed the ablative behavior of the tested material to be estimated and validated the theoretical model used in the computational simulation for use in geometries close to the thermal protection systems of Brazilian space and suborbital vehicles.

  1. Are There Effects of Intrauterine Cocaine Exposure on Delinquency during Early Adolescence? A Preliminary Report

    PubMed Central

    Gerteis, Jessie; Chartrand, Molinda; Martin, Brett; Cabral, Howard J.; Rose-Jacobs, Ruth; Crooks, Denise; Frank, Deborah A.

    2011-01-01

    Objective To ascertain whether level of intrauterine cocaine exposure (IUCE) is associated with early adolescent delinquent behavior, after accounting for prenatal exposures to other psychoactive substances and relevant psychosocial factors. Methods Ninety-three early adolescents (12.5-14.5 years old) participating since birth in a longitudinal study of IUCE reported delinquent acts via an audio computer-assisted self-interview (ACASI). Level of IUCE and exposure to cigarettes, alcohol, and marijuana were determined by maternal report, maternal and infant urine assays, and infant meconium assays at birth. Participants reported their exposure to violence on the Violence Exposure Scale for Children - Revised (VEX-R) at ages 8.5, 9.5, and 11 years and during early adolescence, and the strictness of supervision by their caregivers during early adolescence. Results Of the 93 participants, 24 (26%) reported ≥3 delinquent behaviors during early adolescence. In the final multivariate model (including level of IUCE and cigarette exposure, childhood exposure to violence, and caregiver strictness/supervision), ≥3 delinquent behaviors were not significantly associated with level of IUCE but were significantly associated with intrauterine exposure to half a pack or more of cigarettes per day and with higher levels of childhood exposure to violence, effects substantially unchanged after control for early adolescent violence exposure. Conclusions In this cohort, prospectively ascertained prenatal exposure to cigarettes and childhood exposure to violence were associated with self-reported delinquent behaviors during early adolescence. Contrary to initial popular predictions, intrauterine cocaine exposure was not a strong predictor of adolescent delinquent behaviors in this cohort. PMID:21558951

  2. Evaluation and recommendation of sensitivity analysis methods for application to Stochastic Human Exposure and Dose Simulation models.

    PubMed

    Mokhtari, Amirhossein; Christopher Frey, H; Zheng, Junyu

    2006-11-01

    Sensitivity analyses of exposure or risk models can help identify the most significant factors to aid in risk management or to prioritize additional research to reduce uncertainty in the estimates. However, sensitivity analysis is challenged by non-linearity, interactions between inputs, and multiple days or time scales. Selected sensitivity analysis methods are evaluated with respect to their applicability to human exposure models with such features using a testbed. The testbed is a simplified version of the US Environmental Protection Agency's Stochastic Human Exposure and Dose Simulation (SHEDS) model. The methods evaluated include Pearson and Spearman correlation, sample and rank regression, analysis of variance, the Fourier amplitude sensitivity test (FAST), and Sobol's method. The first five methods are known as "sampling-based" techniques, whereas the latter two are known as "variance-based" techniques. The main objective of the test cases was to identify the main and total contributions of individual inputs to the output variance. Sobol's method and FAST directly quantified these measures of sensitivity. Results show that the sensitivity of an input typically changed when evaluated under different time scales (e.g., daily versus monthly). All methods provided similar insights regarding less important inputs; however, Sobol's method and FAST provided more robust insights with respect to the sensitivity of important inputs compared with the sampling-based techniques. Thus, the sampling-based methods can be used in a screening step to identify unimportant inputs, followed by application of the more computationally intensive refined methods to a smaller set of inputs. The implications of time variation in sensitivity results for risk management are briefly discussed.
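    The variance-based idea behind Sobol's method can be shown on a toy model (my sketch, not the SHEDS testbed): the first-order index of an input is estimated with a pick-freeze pair of samples, where the input of interest is held fixed and the others are resampled.

```python
import random

random.seed(1)
N = 100_000

def model(x1, x2):
    return x1 + 0.5 * x2  # toy additive model; x1, x2 ~ Uniform(0, 1)

# Pick-freeze (Sobol) estimate of the first-order index of x1:
# S1 = Cov(Y, Y') / Var(Y), where Y' keeps x1 fixed and resamples x2.
a1 = [random.random() for _ in range(N)]
a2 = [random.random() for _ in range(N)]
b2 = [random.random() for _ in range(N)]
y  = [model(x1, x2) for x1, x2 in zip(a1, a2)]
yp = [model(x1, x2) for x1, x2 in zip(a1, b2)]

my, myp = sum(y) / N, sum(yp) / N
cov = sum((u - my) * (v - myp) for u, v in zip(y, yp)) / N
var = sum((u - my) ** 2 for u in y) / N
print(round(cov / var, 2))  # analytic first-order index here is 0.8
```

For this additive model the analytic value is Var(x1)/(Var(x1) + 0.25·Var(x2)) = 0.8, so the Monte Carlo estimate should land close to that; real exposure models need the same machinery per input, which is where the computational cost cited above comes from.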

  3. Digital radiography: are the manufacturers' settings too high? Optimisation of the Kodak digital radiography system with aid of the computed radiography dose index.

    PubMed

    Peters, Sinead E; Brennan, Patrick C

    2002-09-01

    Manufacturers offer exposure indices as a safeguard against overexposure in computed radiography, but the basis for the recommended values is unclear. This study establishes an optimum exposure index to be used as a guideline for a specific CR system to minimise radiation exposures for mobile computed chest radiography, and compares this with manufacturer guidelines and current practice. An anthropomorphic phantom was employed to establish the minimum tube current-time product (mAs) consistent with acceptable image quality for mobile chest radiography images; this was found to be 2 mAs. Subsequently, 10 patients were exposed with this optimised mAs value and 10 patients were exposed with the 3.2 mAs routinely used in the department of the study. Image quality was objectively assessed using anatomical criteria. A retrospective analysis of 717 exposure indices recorded over 2 months of mobile chest examinations was performed. The optimised mAs value provided a significant reduction in the average exposure index, from 1840 to 1570 (p < 0.0001). This new "optimum" exposure index is substantially lower than the manufacturer guideline of 2000 and significantly lower than the exposure indices from the retrospective study (1890). The retrospective data also showed a significant increase in exposure indices when the examination was performed out of hours. The data provided by this study emphasise the need for clinicians and personnel to consider establishing their own optimum exposure indices for digital investigations rather than simply accepting manufacturers' guidelines. Such an approach, along with regular monitoring of indices, may result in a substantial reduction in patient exposure.

  4. Efficient storage, computation, and exposure of computer-generated holograms by electron-beam lithography.

    PubMed

    Newman, D M; Hawley, R W; Goeckel, D L; Crawford, R D; Abraham, S; Gallagher, N C

    1993-05-10

    An efficient storage format was developed for computer-generated holograms for use in electron-beam lithography. This method employs run-length encoding and Lempel-Ziv-Welch compression and succeeds in exposing holograms that were previously infeasible owing to the hologram's tremendous pattern-data file size. These holograms also require significant computation; thus the algorithm was implemented on a parallel computer, which improved performance by 2 orders of magnitude. The decompression algorithm was integrated into the Cambridge electron-beam machine's front-end processor. Although this provides much-needed capability, some hardware enhancements will be required in the future to overcome inadequacies in the current front-end processor that result in a lengthy exposure time.
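The storage idea is easiest to see with the run-length step alone; a minimal sketch (the Lempel-Ziv-Welch pass that the authors apply on top of the run-length encoding is omitted here):

```python
def rle_encode(bits):
    """Run-length encode a binary pattern row as (value, run_length) pairs."""
    runs = []
    for b in bits:
        if runs and runs[-1][0] == b:
            runs[-1] = (b, runs[-1][1] + 1)
        else:
            runs.append((b, 1))
    return runs

def rle_decode(runs):
    """Expand (value, run_length) pairs back into the original row."""
    out = []
    for value, length in runs:
        out.extend([value] * length)
    return out

# Hologram pattern rows are largely empty, so long runs compress very well:
row = [0] * 12 + [1] * 3 + [0] * 9
packed = rle_encode(row)
assert rle_decode(packed) == row
```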

  5. Computational algorithm for lifetime exposure to antimicrobials in pigs using register data-The LEA algorithm.

    PubMed

    Birkegård, Anna Camilla; Andersen, Vibe Dalhoff; Halasa, Tariq; Jensen, Vibeke Frøkjær; Toft, Nils; Vigre, Håkan

    2017-10-01

    Accurate and detailed data on antimicrobial exposure in pig production are essential when studying the association between antimicrobial exposure and antimicrobial resistance. Due to difficulties in obtaining primary data on antimicrobial exposure in a large number of farms, there is a need for a robust and valid method to estimate the exposure using register data. An approach that estimates the antimicrobial exposure in every rearing period during the lifetime of a pig using register data was developed into a computational algorithm. In this approach, data from national registers on antimicrobial purchases, movements of pigs and farm demographics registered at farm level are used. The algorithm traces batches of pigs retrospectively from slaughter to the farm(s) that housed the pigs during their finisher, weaner, and piglet periods. Subsequently, the algorithm estimates the antimicrobial exposure as the number of Animal Defined Daily Doses for treatment of one kg pig in each of the rearing periods. Thus, the antimicrobial purchase data at farm level are translated into antimicrobial exposure estimates at batch level. A batch of pigs is defined here as pigs sent to slaughter on the same day from the same farm. In this study we present, validate, and optimise a computational algorithm that calculates the lifetime exposure to antimicrobials for slaughter pigs. The algorithm was evaluated by comparing the computed estimates to data on antimicrobial usage from farm records in 15 farm units. We found a good positive correlation between the two estimates. The algorithm was run for Danish slaughter pigs sent to slaughter from January to March 2015 from farms with more than 200 finishers to estimate the proportion of farms that it was applicable for. In the final process, the algorithm was successfully run for batches of pigs originating from 3026 farms with finisher units (77% of the initial population). 
This number can be increased if more accurate register data can be obtained. The algorithm provides a systematic and repeatable approach to estimating the antimicrobial exposure throughout the rearing period, independent of rearing site for finisher batches, as a lifetime exposure measurement. Copyright © 2017 Elsevier B.V. All rights reserved.
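The core dose conversion behind the Animal Defined Daily Dose measure can be sketched as follows; the function name and numbers are illustrative, not the registry's actual assigned doses:

```python
def animal_defined_daily_doses(amount_mg, assigned_dose_mg_per_kg_per_day):
    """Translate a purchased amount of an antimicrobial into ADDkg: the number
    of doses for treating one kg of pig for one day (illustrative sketch)."""
    return amount_mg / assigned_dose_mg_per_kg_per_day

# Hypothetical purchase of 50,000 mg of a product with an assigned dose of
# 10 mg/kg/day yields 5,000 one-kg daily doses, which the algorithm would then
# apportion across the batches housed on the farm during the relevant period.
add_kg = animal_defined_daily_doses(50_000, 10)
```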

  6. The passive control of air pollution exposure in Dublin, Ireland: a combined measurement and modelling case study.

    PubMed

    Gallagher, J; Gill, L W; McNabola, A

    2013-08-01

    This study investigates the potential real-world application of passive control systems to reduce personal pollutant exposure in an urban street canyon in Dublin, Ireland. The implementation of parked cars and/or low boundary walls as a passive control system has been shown to minimise personal exposure to pollutants on footpaths in previous investigations. However, previous research has been limited to generic numerical modelling studies. This study combines real-time traffic data, meteorological conditions and pollution concentrations in a real-world urban street canyon before and after the implementation of a passive control system. Using a combination of field measurements and numerical modelling, this study assessed the potential impact of passive controls on personal exposure to nitric oxide (NO) concentrations in the street canyon in winter conditions. A calibrated numerical model of the urban street canyon was developed, taking into account the variability in traffic and meteorological conditions. The modelling system combined computational fluid dynamics (CFD) simulations and a semi-empirical equation, and demonstrated good agreement with measured field data collected in the street canyon. The results indicated that lane distribution, fleet composition and vehicular turbulence all affected pollutant dispersion, in addition to the canyon geometry and local meteorological conditions. The introduction of passive controls displayed mixed results for improvements in air quality on the footpaths under different wind and traffic conditions. Parked cars proved the most effective passive control system, with average improvements in air quality of up to 15% on the footpaths. This study highlights the potential of passive controls in a real street canyon to increase dispersion and improve air quality at street level. Copyright © 2013 Elsevier B.V. All rights reserved.

  7. Time series analysis and mortality model of dog bite victims presented for treatment at a referral clinic for rabies exposure in Monrovia, Liberia, 2010-2013.

    PubMed

    Olarinmoye, Ayodeji O; Ojo, Johnson F; Fasunla, Ayotunde J; Ishola, Olayinka O; Dakinah, Fahnboah G; Mulbah, Charles K; Al-Hezaimi, Khalid; Olugasa, Babasola O

    2017-08-01

    We developed a time trend model, determined treatment outcomes and estimated annual human deaths among dog bite victims (DBVs) from 2010 to 2013 in Monrovia, Liberia. Data obtained from clinic records included the victim's age, gender and site of bite marks, the name of the residence site of rabies-exposed patients, promptness of care sought, initial treatment and post-exposure prophylaxis (PEP) compliance. We computed the DBV time-trend plot, seasonal index and year 2014 case forecast. Associated annual human deaths (AHD) were estimated using a standardized decision tree model. Of the 775 DBVs enlisted, care-seeking time was within 24 h of injury for 328 (42.32%) DBVs. The victim's residential location, site of bite mark, and time-dependent variables were significantly associated with treatment outcome (p < 0.05). The equation X̂(t) = 28.278 - 0.365t models the trend of DBVs. The high PEP default rate (n = 705, 90.97%) and an average of 155 AHD from rabies imply an urgent need for policy formulation on a national programme for rabies prevention in Liberia. Copyright © 2017 Elsevier Ltd. All rights reserved.
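The fitted trend can be evaluated directly; t is the period index in the paper's coding (assumed here to start at the beginning of the 2010-2013 series):

```python
def dbv_trend(t):
    """Fitted linear trend of dog bite victim counts: X-hat(t) = 28.278 - 0.365*t."""
    return 28.278 - 0.365 * t

# The negative slope implies a slow decline of roughly 0.365 victims per period:
start = dbv_trend(0)
later = dbv_trend(10)
```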

  8. Long-term health experience of jet engine manufacturing workers: IX. further investigation of general mortality patterns in relation to workplace exposures.

    PubMed

    Youk, Ada O; Marsh, Gary M; Buchanich, Jeanine M; Downing, Sarah; Kennedy, Kathleen J; Esmen, Nurtan A; Hancock, Roger P; Lacey, Steven E

    2013-06-01

    To evaluate mortality rates among a cohort of jet engine manufacturing workers. Subjects were 222,123 workers employed from 1952 to 2001. Vital status was determined through 2004 for 99% of subjects and cause of death for 95% of 68,317 deaths. We computed standardized mortality ratios and modeled internal cohort rates. Mortality excesses reported initially no longer met the criteria for further investigation. We found two chronic obstructive pulmonary disease-related mortality excesses that met the criteria in two of eight study plants. At the total cohort level, chronic obstructive pulmonary disease-related categories were not related to any factors or occupational exposures considered. A full evaluation of these excesses was limited by the lack of data on smoking history. Exposures received outside of work or uncontrolled positive confounding by smoking cannot be ruled out as explanations for these excesses.
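The standardized mortality ratio computed in studies like this one is a simple observed-to-expected ratio; the numbers below are illustrative, not the cohort's:

```python
def standardized_mortality_ratio(observed_deaths, expected_deaths):
    """SMR = observed / expected, where expected deaths come from applying
    reference-population death rates to the cohort's person-years at risk."""
    return observed_deaths / expected_deaths

# 130 observed deaths against 100 expected gives an SMR of 1.3,
# i.e. a 30% mortality excess relative to the reference population.
smr = standardized_mortality_ratio(130, 100.0)
```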

  9. Fractional Poisson--a simple dose-response model for human norovirus.

    PubMed

    Messner, Michael J; Berger, Philip; Nappier, Sharon P

    2014-10-01

    This study utilizes old and new Norovirus (NoV) human challenge data to model the dose-response relationship for human NoV infection. The combined data set is used to update estimates from a previously published beta-Poisson dose-response model that includes parameters for virus aggregation and for a beta-distribution that describes variable susceptibility among hosts. The quality of the beta-Poisson model is examined and a simpler model is proposed. The new model (fractional Poisson) characterizes hosts as either perfectly susceptible or perfectly immune, requiring a single parameter (the fraction of perfectly susceptible hosts) in place of the two-parameter beta-distribution. A second parameter is included to account for virus aggregation in the same fashion as it is added to the beta-Poisson model. Infection probability is simply the product of the probability of nonzero exposure (at least one virus or aggregate is ingested) and the fraction of susceptible hosts. The model is computationally simple and appears to be well suited to the data from the NoV human challenge studies. The model's deviance is similar to that of the beta-Poisson, but with one parameter, rather than two. As a result, the Akaike information criterion favors the fractional Poisson over the beta-Poisson model. At low, environmentally relevant exposure levels (<100), estimation error is small for the fractional Poisson model; however, caution is advised because no subjects were challenged at such a low dose. New low-dose data would be of great value to further clarify the NoV dose-response relationship and to support improved risk assessment for environmentally relevant exposures. © 2014 Society for Risk Analysis Published 2014. This article is a U.S. Government work and is in the public domain for the U.S.A.
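The fractional Poisson form described above is simple enough to state in a few lines. The parameter names here are mine, and the values in the example are illustrative rather than the fitted NoV estimates:

```python
import math

def fractional_poisson(dose, frac_susceptible, mean_aggregate_size=1.0):
    """P(infection) = (fraction of perfectly susceptible hosts) x
    P(at least one virus or aggregate is ingested).
    mean_aggregate_size plays the role of the aggregation parameter, added in
    the same fashion as in the beta-Poisson model."""
    p_nonzero_exposure = 1.0 - math.exp(-dose / mean_aggregate_size)
    return frac_susceptible * p_nonzero_exposure

# Infection probability saturates at the susceptible fraction at high dose:
p_low = fractional_poisson(1.0, 0.72)
p_high = fractional_poisson(1e6, 0.72)
```

Because the model has a single shape parameter (plus aggregation), its near-equal deviance with the two-parameter beta-Poisson is what tips the Akaike information criterion in its favor.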

  10. Promoting healthy computer use among middle school students: a pilot school-based health promotion program.

    PubMed

    Ciccarelli, Marina; Portsmouth, Linda; Harris, Courtenay; Jacobs, Karen

    2012-01-01

    Introduction of notebook computers in many schools has become integral to learning. This has increased students' screen-based exposure and the potential risks to physical and visual health. Unhealthy computing behaviours include frequent and long durations of exposure; awkward postures due to inappropriate furniture and workstation layout; and ignoring computer-related discomfort. This paper describes the framework for a planned school-based health promotion program to encourage healthy computing behaviours among middle school students. The planned program uses a community-based participatory research approach. Students in Year 7 in 2011 at a co-educational middle school, their parents, and teachers have been recruited. Baseline data were collected on students' knowledge of computer ergonomics, current notebook exposure, and attitudes towards healthy computing behaviours; and on teachers' and parents' self-perceived competence to promote healthy notebook use among students, and what education they wanted. The health promotion program is being developed by an inter-professional team in collaboration with students, teachers and parents to embed concepts of ergonomics education in relevant school activities and school culture. End-of-year changes in reported and observed student computing behaviours will be used to determine the effectiveness of the program. Building a body of evidence regarding physical health benefits to students from this school-based ergonomics program can guide policy development on the healthy use of computers within children's educational environments.

  11. Effect of computer radiation on weight and oxidant-antioxidant status of mice.

    PubMed

    Pei, Xuexian; Gu, Qijun; Ye, Dongdong; Wang, Yang; Zou, Xu; He, Lianping; Jin, Yuelong; Yao, Yingshui

    2014-10-20

    To explore the effects of computer radiation on the weight and oxidant-antioxidant status of mice, and further to confirm whether vitamin C has protective effects against computer radiation. Sixty male adult ICR mice were randomly divided into six groups. Each group received a different treatment as follows: group A was the control, group B was given vitamin C, group C was given 8 h/day of computer radiation exposure, group D was given vitamin C plus 8 h/day of computer radiation, group E was given 16 h/day of computer radiation exposure, and group F was given vitamin C plus 16 h/day of computer radiation. After seven weeks, the mice were sacrificed to collect blood samples, and total antioxidant capacity (T-AOC) and alkaline phosphatase (ALP) content in serum and liver tissue were determined by ELISA. No difference in weight change was found among the six groups at any week. In groups C, D and F, liver tissue T-AOC levels were higher than in group A. In groups B, C and E, serum ALP levels were lower than in group A (P<0.05). The study indicates that computer radiation may have an adverse effect on T-AOC and ALP levels in mice, and that vitamin C has a protective effect against computer radiation. Copyright AULA MEDICA EDICIONES 2014. Published by AULA MEDICA. All rights reserved.

  12. ISDD: A computational model of particle sedimentation, diffusion and target cell dosimetry for in vitro toxicity studies

    PubMed Central

    2010-01-01

    Background The difficulty of directly measuring cellular dose is a significant obstacle to application of target tissue dosimetry for nanoparticle and microparticle toxicity assessment, particularly for in vitro systems. As a consequence, the target tissue paradigm for dosimetry and hazard assessment of nanoparticles has largely been ignored in favor of using metrics of exposure (e.g. μg particle/mL culture medium, particle surface area/mL, particle number/mL). We have developed a computational model of solution particokinetics (sedimentation, diffusion) and dosimetry for non-interacting spherical particles and their agglomerates in monolayer cell culture systems. Particle transport to cells is calculated by simultaneous solution of Stokes Law (sedimentation) and the Stokes-Einstein equation (diffusion). Results The In vitro Sedimentation, Diffusion and Dosimetry model (ISDD) was tested against measured transport rates or cellular doses for multiple sizes of polystyrene spheres (20-1100 nm), 35 nm amorphous silica, and large agglomerates of 30 nm iron oxide particles. Overall, without adjusting any parameters, model predicted cellular doses were in close agreement with the experimental data, differing from as little as 5% to as much as three-fold, but in most cases approximately two-fold, within the limits of the accuracy of the measurement systems. Applying the model, we generalize the effects of particle size, particle density, agglomeration state and agglomerate characteristics on target cell dosimetry in vitro. Conclusions Our results confirm our hypothesis that for liquid-based in vitro systems, the dose-rates and target cell doses for all particles are not equal; they can vary significantly, in direct contrast to the assumption of dose-equivalency implicit in the use of mass-based media concentrations as metrics of exposure for dose-response assessment. 
The difference between equivalent nominal media concentration exposures on a μg/mL basis and target cell doses on a particle surface area or number basis can be as high as three to six orders of magnitude. As a consequence, in vitro hazard assessments utilizing mass-based exposure metrics have inherently high errors where particle number or surface area target cell doses are believed to drive response. The gold standard for particle dosimetry in in vitro nanotoxicology studies should be direct experimental measurement of the cellular content of the studied particle. However, where such measurements are impractical or infeasible, and before such measurements become common, particle dosimetry models such as ISDD provide a valuable, immediately useful alternative and, eventually, an adjunct to such measurements. PMID:21118529
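The two transport laws that ISDD solves simultaneously are compact. This sketch evaluates their steady-state forms independently (not the model's coupled numerics), with illustrative fluid properties standing in for culture medium:

```python
import math

KB = 1.380649e-23  # Boltzmann constant, J/K

def sedimentation_velocity(d_m, rho_p, rho_f, mu):
    """Stokes settling velocity (m/s) for a sphere of diameter d_m:
    v = (rho_p - rho_f) * g * d^2 / (18 * mu)."""
    g = 9.81
    return (rho_p - rho_f) * g * d_m ** 2 / (18.0 * mu)

def diffusion_coefficient(d_m, temp_k, mu):
    """Stokes-Einstein diffusivity (m^2/s): D = kB*T / (3*pi*mu*d)."""
    return KB * temp_k / (3.0 * math.pi * mu * d_m)

# A 1 um polystyrene sphere (1050 kg/m^3) in medium approximated as water
# (1000 kg/m^3, mu ~ 0.00089 Pa*s) at 310 K:
v = sedimentation_velocity(1e-6, 1050.0, 1000.0, 0.00089)
d_small = diffusion_coefficient(20e-9, 310.0, 0.00089)
d_large = diffusion_coefficient(1e-6, 310.0, 0.00089)
# Diffusion dominates delivery for small particles, sedimentation for large
# agglomerates - the source of the dose-rate differences the paper reports.
```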

  13. Computer Literacy Project. A General Orientation in Basic Computer Concepts and Applications.

    ERIC Educational Resources Information Center

    Murray, David R.

    This paper proposes a two-part, basic computer literacy program for university faculty, staff, and students with no prior exposure to computers. The program described would introduce basic computer concepts and computing center service programs and resources; provide fundamental preparation for other computer courses; and orient faculty towards…

  14. Associations of Mortality with Long-Term Exposures to Fine and Ultrafine Particles, Species and Sources: Results from the California Teachers Study Cohort

    PubMed Central

    Hu, Jianlin; Goldberg, Debbie; Reynolds, Peggy; Hertz, Andrew; Bernstein, Leslie; Kleeman, Michael J.

    2015-01-01

    Background Although several cohort studies report associations between chronic exposure to fine particles (PM2.5) and mortality, few have studied the effects of chronic exposure to ultrafine (UF) particles. In addition, few studies have estimated the effects of the constituents of either PM2.5 or UF particles. Methods We used a statewide cohort of > 100,000 women from the California Teachers Study who were followed from 2001 through 2007. Exposure data at the residential level were provided by a chemical transport model that computed pollutant concentrations from > 900 sources in California. Besides particle mass, monthly concentrations of 11 species and 8 sources or primary particles were generated at 4-km grids. We used a Cox proportional hazards model to estimate the association between the pollutants and all-cause, cardiovascular, ischemic heart disease (IHD), and respiratory mortality. Results We observed statistically significant (p < 0.05) associations of IHD with PM2.5 mass, nitrate, elemental carbon (EC), copper (Cu), and secondary organics and the sources gas- and diesel-fueled vehicles, meat cooking, and high-sulfur fuel combustion. The hazard ratio estimate of 1.19 (95% CI: 1.08, 1.31) for IHD in association with a 10-μg/m3 increase in PM2.5 is consistent with findings from the American Cancer Society cohort. We also observed significant positive associations between IHD and several UF components including EC, Cu, metals, and mobile sources. Conclusions Using an emissions-based model with a 4-km spatial scale, we observed significant positive associations between IHD mortality and both fine and ultrafine particle species and sources. Our results suggest that the exposure model effectively measured local exposures and facilitated the examination of the relative toxicity of particle species. Citation Ostro B, Hu J, Goldberg D, Reynolds P, Hertz A, Bernstein L, Kleeman MJ. 2015. 
Associations of mortality with long-term exposures to fine and ultrafine particles, species and sources: results from the California Teachers Study cohort. Environ Health Perspect 123:549–556; http://dx.doi.org/10.1289/ehp.1408565 PMID:25633926

  15. Interactive vs passive screen time and nighttime sleep duration among school-aged children

    PubMed Central

    Yland, Jennifer; Guan, Stanford; Emanuele, Erin; Hale, Lauren

    2016-01-01

    Background Insufficient sleep among school-aged children is a growing concern, as numerous studies have shown that chronic short sleep duration increases the risk of poor academic performance and specific adverse health outcomes. We examined the association between weekday nighttime sleep duration and 3 types of screen exposure: television, computer use, and video gaming. Methods We used age 9 data from an ethnically diverse national birth cohort study, the Fragile Families and Child Wellbeing Study, to assess the association between screen time and sleep duration among 9-year-olds, using screen time data reported by both the child (n = 3269) and by the child's primary caregiver (n= 2770). Results Within the child-reported models, children who watched more than 2 hours of television per day had shorter sleep duration by approximately 11 minutes per night compared to those who watched less than 2 hours of television (β = −0.18; P < .001). Using the caregiver-reported models, both television and computer use were associated with reduced sleep duration. For both child- and parent-reported screen time measures, we did not find statistically significant differences in effect size across various types of screen time. Conclusions Screen time from televisions and computers is associated with reduced sleep duration among 9-year-olds, using 2 sources of estimates of screen time exposure (child and parent reports). No specific type or use of screen time resulted in significantly shorter sleep duration than another, suggesting that caution should be advised against excessive use of all screens. PMID:27540566

  16. Estimating the time and temperature relationship for causation of deep-partial thickness skin burns.

    PubMed

    Abraham, John P; Plourde, Brian; Vallez, Lauren; Stark, John; Diller, Kenneth R

    2015-12-01

    The objective of this study is to develop and present a simple procedure for evaluating the temperature and exposure-time conditions that lead to causation of a deep-partial thickness burn, and the effect that the immediate post-burn thermal environment can have on the process. A computational model has been designed and applied to predict the time required for skin burns to reach a deep-partial thickness level of injury. The model includes multiple tissue layers: the epidermis, dermis, hypodermis, and subcutaneous tissue. Simulated exposure temperatures ranged from 62.8 to 87.8°C (145-190°F). Two scenarios were investigated. The first and worst-case scenario was a direct exposure to water (characterized by a large convection coefficient) with the clothing left on the skin following the exposure. The second case consisted of a scald insult followed immediately by the skin being washed with cool water (20°C). For both cases, an Arrhenius injury model was applied whereby the extent and depth of injury were calculated and compared for the different post-burn treatments. In addition, injury values were compared with experimental data from the literature to verify the numerical methodology. It was found that the clinical observations of injury extent agreed with the calculated values. Furthermore, inundation with cool water decreased skin temperatures more quickly than the insulating clothing case and led to a modest decrease in the burn extent. Copyright © 2015 Elsevier Ltd and ISBI. All rights reserved.
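The Arrhenius injury model referred to above accumulates damage over the temperature history. The rate constants below are the widely cited Henriques-Moritz values for skin and are assumptions here; the paper's layered model may use different coefficients:

```python
import math

A = 3.1e98    # frequency factor, 1/s (Henriques-Moritz value, assumed here)
EA = 6.28e5   # activation energy, J/mol (assumed here)
R = 8.314     # universal gas constant, J/(mol*K)

def injury_integral(temps_kelvin, dt_s):
    """Omega = sum of A * exp(-Ea / (R*T)) * dt over the exposure history.
    Omega >= 1 is the conventional threshold for irreversible damage at the
    evaluated depth; deeper tissue sees lower T and thus accumulates less."""
    return sum(A * math.exp(-EA / (R * t)) * dt_s for t in temps_kelvin)

# 30 s of skin held at 60 C (333.15 K) far exceeds the damage threshold,
# while 30 s at body temperature (310.15 K) accumulates negligible damage:
omega_hot = injury_integral([333.15] * 30, 1.0)
omega_body = injury_integral([310.15] * 30, 1.0)
```

The steep exponential explains the paper's central point: small post-burn temperature reductions (e.g. from cool-water inundation) translate into disproportionately large reductions in accumulated injury.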

  17. Ambient air pollution, traffic noise and adult asthma prevalence: a BioSHaRE approach.

    PubMed

    Cai, Yutong; Zijlema, Wilma L; Doiron, Dany; Blangiardo, Marta; Burton, Paul R; Fortier, Isabel; Gaye, Amadou; Gulliver, John; de Hoogh, Kees; Hveem, Kristian; Mbatchou, Stéphane; Morley, David W; Stolk, Ronald P; Elliott, Paul; Hansell, Anna L; Hodgson, Susan

    2017-01-01

    We investigated the effects of both ambient air pollution and traffic noise on adult asthma prevalence, using harmonised data from three European cohort studies established in 2006-2013 (HUNT3, Lifelines and UK Biobank). Residential exposures to ambient air pollution (particulate matter with aerodynamic diameter ≤10 µm (PM10) and nitrogen dioxide (NO2)) were estimated by a pan-European Land Use Regression model for 2007. Traffic noise for 2009 was modelled at home addresses by adapting a standardised noise assessment framework (CNOSSOS-EU). A cross-sectional analysis of 646,731 participants aged ≥20 years was undertaken using DataSHIELD to pool data for individual-level analysis via a "compute to the data" approach. Multivariate logistic regression models were fitted to assess the effects of each exposure on lifetime and current asthma prevalence. PM10 or NO2 concentrations higher by 10 µg·m-3 were associated with 12.8% (95% CI 9.5-16.3%) and 1.9% (95% CI 1.1-2.8%) higher lifetime asthma prevalence, respectively, independent of confounders. Effects were larger in those aged ≥50 years, ever-smokers and the less educated. Noise exposure was not significantly associated with asthma prevalence. This study suggests that long-term ambient PM10 exposure is associated with asthma prevalence in western European adults. Traffic noise is not associated with asthma prevalence, but its potential to impact on asthma exacerbations needs further investigation. Copyright ©ERS 2017.

  18. Two Decades of WRF/CMAQ simulations over the continental ...

    EPA Pesticide Factsheets

    Confidence in the application of models for forecasting and regulatory assessments is furthered by conducting four types of model evaluation: operational, dynamic, diagnostic, and probabilistic. Operational model evaluation alone does not reveal the confidence limits that can be associated with modeled air quality concentrations. This paper presents novel approaches for performing dynamic model evaluation and for evaluating the confidence limits of ozone exceedances using the WRF/CMAQ model simulations over the continental United States for the period from 1990 to 2010. The methodology presented here entails spectral decomposition of ozone time series using the KZ filter to assess the variations in the strengths of the synoptic (i.e., weather-induced variation) and baseline (i.e., long-term variation attributable to emissions, policy, and trends) forcings embedded in the modeled and observed concentrations. A method is presented where the future year observations are estimated based on the changes in the concentrations predicted by the model applied to the current year observations. The proposed method can provide confidence limits for ozone exceedances for a given emission reduction scenario. We present and discuss these new approaches to identify the strengths of the model in representing the changes in simulated O3 air quality over the 21-year period. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates
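The KZ filter at the heart of the spectral decomposition is just an iterated moving average; a minimal sketch (the window width and iteration count are illustrative, not necessarily the paper's exact choices):

```python
def moving_average(xs, m):
    """Centered moving average with window m (odd); end windows are truncated."""
    half = m // 2
    out = []
    for i in range(len(xs)):
        lo, hi = max(0, i - half), min(len(xs), i + half + 1)
        window = xs[lo:hi]
        out.append(sum(window) / len(window))
    return out

def kz_filter(xs, m, k):
    """Kolmogorov-Zurbenko filter: k iterations of an m-point moving average.
    Applied to an ozone series, the filtered output estimates the baseline
    (long-term) forcing; the residual estimates the synoptic forcing."""
    for _ in range(k):
        xs = moving_average(xs, m)
    return xs

# Toy series with a short-period wiggle riding on a constant baseline:
series = [30.0 + (i % 7) for i in range(100)]
baseline = kz_filter(series, 15, 5)
synoptic = [a - b for a, b in zip(series, baseline)]
```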

  19. Computational Model Prediction and Biological Validation Using Simplified Mixed Field Exposures for the Development of a GCR Reference Field

    NASA Technical Reports Server (NTRS)

    Hada, M.; Rhone, J.; Beitman, A.; Saganti, P.; Plante, I.; Ponomarev, A.; Slaba, T.; Patel, Z.

    2018-01-01

    The yield of chromosomal aberrations has been shown to increase in the lymphocytes of astronauts after long-duration missions of several months in space. Chromosome exchanges, especially translocations, are positively correlated with many cancers and are therefore a potential biomarker of cancer risk associated with radiation exposure. Although extensive studies have been carried out on the induction of chromosomal aberrations by low- and high-LET radiation in human lymphocytes, fibroblasts, and epithelial cells exposed in vitro, there is a lack of data on chromosome aberrations induced by low dose-rate chronic exposure and mixed field beams such as those expected in space. Chromosome aberration studies at NSRL will provide the biological validation needed to extend the computational models over a broader range of experimental conditions (more complicated mixed fields leading up to the galactic cosmic ray (GCR) simulator), helping to reduce uncertainties in radiation quality effects and dose-rate dependence in cancer risk models. These models can then be used to answer some of the open questions regarding requirements for a full GCR reference field, including particle type and number, energy, dose rate, and delivery order. In this study, we designed a simplified mixed field beam with a combination of proton, helium, oxygen, and iron ions with shielding, or proton, helium, oxygen, and titanium ions without shielding. Human fibroblast cells were irradiated with these mixed field beams, as well as with each single-ion beam, at acute and chronic dose rates, and chromosome aberrations (CA) were measured with 3-color fluorescence in situ hybridization (FISH) chromosome painting methods. The frequency and types of CA induced at acute and chronic dose rates with single and mixed field beams will be discussed. 
A computational chromosome and radiation-induced DNA damage model, BDSTRACKS (Biological Damage by Stochastic Tracks), was updated to simulate various types of CA induced by acute exposures to the mixed field beams used for the experiments. The chromosomes were simulated by a polymer random walk algorithm with restrictions to their respective domains in the nucleus [1]. The stochastic dose to the nucleus was calculated with the code RITRACKS [2]. Irradiation of a target volume by a mixed field of ions was implemented within RITRACKS, and the fields of ions can be delivered over specific periods of time, allowing the simulation of dose-rate effects. Similarly, particles of various types and energies extracted from a pre-calculated spectrum of galactic cosmic rays (GCR) can be used in RITRACKS. The number and spatial locations of DSBs (DNA double-strand breaks) were calculated in BDSTRACKS using the simulated chromosomes and local (voxel) dose. Assuming that DSBs lead to chromosome breaks, and simulating the rejoining of damaged chromosomes occurring during repair, BDSTRACKS produces the yield of various types of chromosome aberrations as a function of time (only final yields are presented). A comparison between experimental and simulation results will be shown.

  20. Assessment of health and economic effects by PM2.5 pollution in Beijing: a combined exposure-response and computable general equilibrium analysis.

    PubMed

    Wang, Guizhi; Gu, SaiJu; Chen, Jibo; Wu, Xianhua; Yu, Jun

    2016-12-01

    Assessment of the health and economic impacts of PM2.5 pollution is of great importance for urban air pollution prevention and control. In this study, we evaluate the damage of PM2.5 pollution using Beijing as an example. First, we use exposure-response functions to estimate the adverse health effects due to PM2.5 pollution. Then, the corresponding labour loss and excess medical expenditure are computed as two conducting variables. Finally, different from the conventional valuation methods, this paper introduces the two conducting variables into the computable general equilibrium (CGE) model to assess the impacts on sectors and the whole economic system caused by PM2.5 pollution. The results show that substantial health effects on residents of Beijing from PM2.5 pollution occurred in 2013, including 20,043 premature deaths and about one million other related medical cases. Correspondingly, using the 2010 social accounting data, Beijing's gross domestic product loss due to the health impact of PM2.5 pollution is estimated at 1286.97 (95% CI: 488.58-1936.33) million RMB. This demonstrates that PM2.5 pollution not only has adverse health effects, but also brings huge economic losses.
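The exposure-response step feeding a CGE analysis typically takes the log-linear form below; the coefficient and baseline values here are purely illustrative, not those estimated for Beijing:

```python
import math

def attributable_cases(population, baseline_incidence, beta, delta_concentration):
    """Log-linear exposure-response form widely used in PM2.5 burden studies:
    cases = population * incidence * (1 - exp(-beta * delta_C)).
    beta (per ug/m^3) and the baseline incidence are endpoint-specific inputs;
    the example values below are hypothetical."""
    return population * baseline_incidence * (1.0 - math.exp(-beta * delta_concentration))

# Hypothetical city of 20 million, baseline mortality 0.005/yr, beta = 0.0004
# per ug/m^3, and PM2.5 elevated 50 ug/m^3 above the reference level:
cases = attributable_cases(20_000_000, 0.005, 0.0004, 50.0)
```

Outputs like this (cases, then the associated labour loss and medical expenditure) are what enter the CGE model as the two conducting variables.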

  1. Lighting Condition Analysis for Mars' Moon Phobos

    NASA Technical Reports Server (NTRS)

    Li, Zu Qun; de Carufel, Guy; Crues, Edwin Z.; Bielski, Paul

    2016-01-01

    This study used high-fidelity computer simulation to investigate the lighting conditions, specifically the solar radiation flux over the surface, on Phobos. Ephemeris data from the Jet Propulsion Laboratory (JPL) DE405 model were used to model the states of the Sun, Earth, Moon, and Mars. An occultation model was developed to simulate Phobos' self-shadowing and its solar eclipses by Mars. The propagated Phobos state was compared with data from JPL's Horizons system to ensure the accuracy of the result. Results for Phobos lighting conditions over one Martian year are presented, which include the duration of solar eclipses, average solar radiation intensity, surface exposure time, and radiant exposure for both Sun-tracking and fixed solar arrays. The results show that: Phobos' solar eclipse time varies throughout the Martian year, with longer eclipse durations during the Martian northern spring and fall seasons and no eclipses during the Martian northern summer and winter seasons; solar radiation intensity is close to minimum in late spring and close to maximum in late fall; exposure time per orbit is relatively constant over the surface during the spring and fall but varies with latitude during the summer and winter; and Sun-tracking solar arrays generate more energy than a fixed solar array. A usage example of the results is also presented to demonstrate their utility.

  2. C 60 fullerene localization and membrane interactions in RAW 264.7 immortalized mouse macrophages

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Russ, K. A.; Elvati, P.; Parsonage, T. L.

    There continues to be a significant increase in the number and complexity of hydrophobic nanomaterials engineered for a variety of commercial purposes, making human exposure a significant health concern. This study uses a combination of biophysical, biochemical and computational methods to probe potential mechanisms for uptake of C 60 nanoparticles into various compartments of living immune cells. Cultures of RAW 264.7 immortalized murine macrophages were used as a canonical model of immune-competent cells that are likely to provide the first line of defense following inhalation. The modes of entry studied were endocytosis/pinocytosis and passive permeation of cellular membranes. The evidence suggests that marginal uptake of C 60 clusters is achieved through endocytosis/pinocytosis, and that passive diffusion into membranes provides a significant source of biologically available nanomaterial. Computational modeling of both a single molecule and a small cluster of fullerenes predicts that at low concentrations fullerenes enter the membrane individually and produce limited perturbation; at higher concentrations, however, the clusters cause deformation of the membrane. These findings are bolstered by nuclear magnetic resonance (NMR) studies of model membranes that reveal deformation of the cell membrane upon exposure to high concentrations of fullerenes. The atomistic and NMR models fail to explain escape of the particles out of biological membranes, but are limited to idealized systems that do not completely recapitulate the complexity of cell membranes. Lastly, the surprising contribution of passive modes of cellular entry provides new avenues for toxicological research that go beyond the pharmacological inhibition of bulk transport systems such as pinocytosis.

  3. C 60 fullerene localization and membrane interactions in RAW 264.7 immortalized mouse macrophages

    DOE PAGES

    Russ, K. A.; Elvati, P.; Parsonage, T. L.; ...

    2016-01-01

    There continues to be a significant increase in the number and complexity of hydrophobic nanomaterials engineered for a variety of commercial purposes, making human exposure a significant health concern. This study uses a combination of biophysical, biochemical and computational methods to probe potential mechanisms for uptake of C 60 nanoparticles into various compartments of living immune cells. Cultures of RAW 264.7 immortalized murine macrophages were used as a canonical model of immune-competent cells that are likely to provide the first line of defense following inhalation. The modes of entry studied were endocytosis/pinocytosis and passive permeation of cellular membranes. The evidence suggests that marginal uptake of C 60 clusters is achieved through endocytosis/pinocytosis, and that passive diffusion into membranes provides a significant source of biologically available nanomaterial. Computational modeling of both a single molecule and a small cluster of fullerenes predicts that at low concentrations fullerenes enter the membrane individually and produce limited perturbation; at higher concentrations, however, the clusters cause deformation of the membrane. These findings are bolstered by nuclear magnetic resonance (NMR) studies of model membranes that reveal deformation of the cell membrane upon exposure to high concentrations of fullerenes. The atomistic and NMR models fail to explain escape of the particles out of biological membranes, but are limited to idealized systems that do not completely recapitulate the complexity of cell membranes. Lastly, the surprising contribution of passive modes of cellular entry provides new avenues for toxicological research that go beyond the pharmacological inhibition of bulk transport systems such as pinocytosis.

  4. The effect of increase in dielectric values on specific absorption rate (SAR) in eye and head tissues following 900, 1800 and 2450 MHz radio frequency (RF) exposure

    NASA Astrophysics Data System (ADS)

    Keshvari, Jafar; Keshvari, Rahim; Lang, Sakari

    2006-03-01

    Numerous studies have attempted to address the question of the RF energy absorption difference between children and adults using computational methods. They have assumed the same dielectric parameters for child and adult head models in SAR calculations. This has been criticized by many researchers, who have stated that child organs are not fully developed, their anatomy is different, and their tissue composition differs slightly, with higher water content. Higher water content would affect dielectric values, which in turn would have an effect on RF energy absorption. The objective of this study was to investigate possible variation in specific absorption rate (SAR) in the head region of children and adults by applying the finite-difference time-domain (FDTD) method and using anatomically correct child and adult head models. In the calculations, the conductivity and permittivity of all tissues were increased by 5 to 20%, while all other exposure conditions were kept the same. A half-wave dipole antenna was used as the exposure source to minimize the positioning uncertainties of a real mobile device and to make the simulations easily replicable. Common mobile telephony frequencies of 900, 1800 and 2450 MHz were used in this study. The exposures of the ear and eye regions were investigated. The SARs of models with increased dielectric values were compared to the SARs of models with unchanged dielectric values. The analyses suggest that increasing the value of the dielectric parameters does not necessarily mean that volume-averaged SAR will increase. Under many exposure conditions, specifically at higher frequencies in eye exposure, volume-averaged SAR decreases. An increase of up to 20% in conductivity, or in both conductivity and permittivity, always caused a SAR variation of less than 20%, usually about 5%, when averaged over 1, 5 or 10 g of cubic mass for all models. The thickness and composition of the different tissue layers in the exposed regions of the human head play a more significant role in SAR variation than do the 5-20% variations in the tissue dielectric parameters.
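
    The quantity behind these comparisons is the local specific absorption rate, SAR = σ|E|²/ρ. The sketch below evaluates this point formula with illustrative tissue values (not the study's FDTD data), and the comment notes why a conductivity increase does not scale averaged SAR proportionally:

```python
def point_sar(sigma, e_rms, rho):
    """Local specific absorption rate (W/kg): SAR = sigma * |E|^2 / rho,
    with sigma in S/m, E (rms) in V/m, and rho in kg/m^3."""
    return sigma * e_rms**2 / rho

# Illustrative values only (roughly brain-like tissue around 900 MHz).
sigma, e_rms, rho = 0.94, 30.0, 1040.0
base = point_sar(sigma, e_rms, rho)

# A 20% conductivity increase with an UNCHANGED internal field raises point
# SAR by exactly 20%. In a coupled FDTD solution, however, the internal
# E-field itself drops as conductivity and permittivity rise, which is why
# the study finds much smaller (often ~5%) net changes in averaged SAR.
print(point_sar(1.2 * sigma, e_rms, rho) / base)
```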

  5. Polybrominated diphenyl ethers (PBDE) in serum: findings from a US cohort of consumers of sport-caught fish.

    PubMed

    Anderson, Henry A; Imm, Pamela; Knobeloch, Lynda; Turyk, Mary; Mathew, John; Buelow, Carol; Persky, Victoria

    2008-09-01

    Polybrominated diphenyl ethers (PBDEs) have been used as flame retardants in foams, fabrics and plastics; they are common contaminants of household air and dust, bioaccumulate in wildlife, and are detectable in human tissues and in fish and animal food products. In the Great Lakes Basin, sport fish consumption has been demonstrated to be an important source of PCB and DDE exposure. PBDEs are present in the same sport fish, but prior to our study the contribution of Great Lakes sport fish consumption to human PBDE body burdens had not been investigated. This study was designed to assess PBDE, PCB and 1,1-bis(4-chlorophenyl)-2,2-dichloroethene (DDE) serum concentrations in an existing cohort of 508 frequent and infrequent consumers of sport-caught fish living in five Great Lakes states. BDE congeners 47 and 99 were identified in the majority of blood samples (98% and 62%, respectively). ΣPBDE levels were positively associated with age, hours spent outdoors, DDE, ΣPCB, years of sport fish consumption, and catfish and shellfish intake, and negatively associated with income and recent weight loss. Other dietary components collected were not predictive of measured ΣPBDE levels. In multivariate models, ΣPBDE levels were positively associated with age, years consuming sport fish, shellfish meals, and computer use, and negatively associated with recent weight loss. Having ΣPBDE levels in the highest quintile was independently associated with older age, male gender, consumption of catfish and shellfish, computer use, and spending less time indoors. ΣPCB and DDE were strongly associated, suggesting common exposure routes. The association between ΣPBDE and ΣPCB or DDE was much weaker, and modeling suggested more diverse PBDE sources with few identified multi-contaminant shared exposure routes. In our cohort, Great Lakes sport fish consumption does not contribute strongly to PBDE exposure.

  6. Potential impact of clinical use of noninvasive FFRCT on radiation dose exposure and downstream clinical event rate.

    PubMed

    Bilbey, Nicolas; Blanke, Philipp; Naoum, Christopher; Arepalli, Chesnel Dey; Norgaard, Bjarne Linde; Leipsic, Jonathon

    2016-01-01

    This study aims to determine the potential impact of introducing noninvasive fractional flow reserve based on coronary computed tomography angiography (CTA) into clinical practice, with respect to radiation dose exposure and downstream event rate. We modeled a population of 1000 stable, symptomatic patients with suspected coronary artery disease, using the disease prevalence from the CONFIRM registry to estimate the pretest likelihood. Four potential clinical pathways were modeled based on the first noninvasive diagnostic test performed: (1) dobutamine echocardiography; (2) single-photon emission computed tomography (SPECT); (3) coronary CTA; and (4) CTA+FFRCT, each potentially leading to invasive coronary angiography. The posttest likelihood of testing positive/negative by each test was based on the presenting disease burden and the diagnostic accuracy of each test. The dobutamine echo pathway resulted in the lowest radiation dose of 5.4 mSv, with 4.0 mSv from angiography and 1.4 mSv from percutaneous coronary intervention (PCI). The highest dose was with SPECT, at 26.5 mSv. The coronary computed tomography angiography (cCTA) pathway demonstrated a dose of 14.2 mSv: 3.7 mSv from cCTA, 7.7 mSv from angiography, and 2.8 mSv from PCI. The CTA+FFRCT pathway exhibited a radiation dose of 9.7 mSv: 3.7 mSv for cCTA, 4.2 mSv for angiography, and 1.8 mSv for PCI. Radiation dose exposure for CTA+FFRCT was lower than for SPECT (P<.001). The CTA+FFRCT pathway resulted in the lowest projected death/myocardial infarction rate at 1 year (2.44%), while the dobutamine stress pathway had the highest 1-year event rate (2.84%). Our analysis suggests that integrating FFRCT into the CTA clinical pathway may reduce cumulative radiation exposure while promoting favorable clinical outcomes. Copyright © 2016 Elsevier Inc. All rights reserved.
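
    The pathway doses quoted above decompose into per-stage components (initial test, downstream angiography, downstream PCI) that are already averaged over the modeled population. Summing the components reported in the abstract reproduces the pathway totals:

```python
# Per-pathway mean effective dose (mSv), decomposed as in the abstract:
# (initial test, downstream invasive angiography, downstream PCI).
pathways = {
    "dobutamine echo": (0.0, 4.0, 1.4),
    "coronary CTA":    (3.7, 7.7, 2.8),
    "CTA+FFRCT":       (3.7, 4.2, 1.8),
}

# Total pathway dose is simply the sum of the population-averaged stages.
totals = {name: round(sum(parts), 1) for name, parts in pathways.items()}
print(totals)
```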

  7. ELF exposure from mobile and cordless phones for the epidemiological MOBI-Kids study.

    PubMed

    Calderón, Carolina; Ichikawa, Hiroki; Taki, Masao; Wake, Kanako; Addison, Darren; Mee, Terry; Maslanyj, Myron; Kromhout, Hans; Lee, Ae-Kyoung; Sim, Malcolm R; Wiart, Joe; Cardis, Elisabeth

    2017-04-01

    This paper describes measurements and computational modelling carried out in the MOBI-Kids case-control study to assess the extremely low frequency (ELF) exposure of the brain from use of mobile and cordless phones. Four different communication systems were investigated: Global System for Mobile (GSM), Universal Mobile Telecommunications System (UMTS), Digital Enhanced Cordless Telecommunications (DECT) and Wi-Fi Voice over Internet Protocol (VoIP). The magnetic fields produced by the phones during transmission were measured under controlled laboratory conditions, and an equivalent loop was fitted to the data to produce three-dimensional extrapolations of the field. Computational modelling was then used to calculate the induced current density and electric field strength in the brain resulting from exposure to these magnetic fields. Human voxel phantoms of four different ages were used: 8, 11, 14 and adult. The results indicate that the current densities induced in the brain during DECT calls are likely to be an order of magnitude lower than those generated during GSM calls but over twice that during UMTS calls. The average current density during Wi-Fi VoIP calls was found to be lower than for UMTS by 30%, but the variability across the samples investigated was high. Spectral contributions were important to consider in relation to current density, particularly for DECT phones. This study suggests that the spatial distribution of the ELF induced current densities in brain tissues is determined by the physical characteristics of the phone (in particular battery position) while the amplitude is mainly dependent on communication system, thus providing a feasible basis for assessing ELF exposure in the epidemiological study. The number of phantoms was not large enough to provide definitive evidence of an increase of induced current density with age, but the data that are available suggest that, if present, the effect is likely to be very small. 
Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. Cloud immersion building shielding factors for US residential structures.

    PubMed

    Dickson, E D; Hamby, D M

    2014-12-01

    This paper presents validated building shielding factors for contemporary US housing stock under an idealized, yet realistic, exposure scenario within a semi-infinite cloud of radioactive material. The building shielding factors are intended for use in emergency planning and level three probabilistic risk assessments for a variety of postulated radiological events in which a realistic assessment is necessary to better understand the potential risks for accident mitigation and emergency response planning. Factors are calculated from detailed computational housing-unit models using the general-purpose Monte Carlo N-Particle code, MCNP5, and are benchmarked against a series of narrow- and broad-beam measurements analyzing the shielding effectiveness of ten common general-purpose construction materials and ten shielding models representing the primary weather barriers (walls and roofs) of likely US housing stock. Each model was designed to scale based on common residential construction practices and includes, to the extent practical, all structurally significant components important for shielding against ionizing radiation. Calculations were performed for floor-specific locations as well as for computing a weighted-average representative building shielding factor for single- and multi-story detached homes, both with and without basements, and for single-wide manufactured housing units.
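
    A weighted-average representative shielding factor of the kind described is an occupancy-weighted mean of location-specific dose ratios (dose inside divided by dose in the unshielded cloud). A minimal sketch with hypothetical factors and weights, not values from the paper:

```python
def weighted_shielding_factor(location_factors, occupancy_weights):
    """Occupancy-weighted representative building shielding factor.
    location_factors: dose-rate ratio (inside/unshielded) per location.
    occupancy_weights: fraction of time spent at each location (sums to 1)."""
    if abs(sum(occupancy_weights) - 1.0) > 1e-9:
        raise ValueError("occupancy weights must sum to 1")
    return sum(f * w for f, w in zip(location_factors, occupancy_weights))

# Hypothetical two-story home with basement (factors NOT from the paper):
factors = [0.6, 0.5, 0.2]   # first floor, second floor, basement
weights = [0.5, 0.3, 0.2]   # fraction of time spent at each location
print(weighted_shielding_factor(factors, weights))
```

    Lower factors mean more shielding; the basement's low ratio pulls the representative value down in proportion to time spent there.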

  9. Computed radiography utilizing laser-stimulated luminescence: detectability of simulated low-contrast radiographic objects.

    PubMed

    Higashida, Y; Moribe, N; Hirata, Y; Morita, K; Doudanuki, S; Sonoda, Y; Katsuda, N; Hiai, Y; Misumi, W; Matsumoto, M

    1988-01-01

    Threshold contrasts of low-contrast objects in computed radiography (CR) images were compared with those of blue- and green-emitting screen-film systems by employing the 18-alternative forced choice (18-AFC) procedure. The dependence of the threshold contrast on the incident X-ray exposure and on the object size was studied. The results indicated that the threshold contrasts of the CR system were comparable to those of the blue and green screen-film systems; they decreased with increasing object size and increased with decreasing incident X-ray exposure. The increase in threshold contrast was small when the relative incident exposure decreased from 1 to 1/4, and large when the incident exposure was decreased further.

  10. Comparative Risks of Aldehyde Constituents in Cigarette Smoke Using Transient Computational Fluid Dynamics/Physiologically Based Pharmacokinetic Models of the Rat and Human Respiratory Tracts

    PubMed Central

    Corley, Richard A.; Kabilan, Senthil; Kuprat, Andrew P.; Carson, James P.; Jacob, Richard E.; Minard, Kevin R.; Teeguarden, Justin G.; Timchalk, Charles; Pipavath, Sudhakar; Glenny, Robb; Einstein, Daniel R.

    2015-01-01

    Computational fluid dynamics (CFD) modeling is well suited for addressing species-specific anatomy and physiology in calculating respiratory tissue exposures to inhaled materials. In this study, we overcame prior CFD model limitations to demonstrate the importance of realistic, transient breathing patterns for predicting site-specific tissue dose. Specifically, extended airway CFD models of the rat and human were coupled with airway region-specific physiologically based pharmacokinetic (PBPK) tissue models to describe the kinetics of 3 reactive constituents of cigarette smoke: acrolein, acetaldehyde and formaldehyde. Simulations of aldehyde no-observed-adverse-effect levels for nasal toxicity in the rat were conducted until breath-by-breath tissue concentration profiles reached steady state. Human oral breathing simulations were conducted using representative aldehyde yields from cigarette smoke, measured puff ventilation profiles and numbers of cigarettes smoked per day. As with prior steady-state CFD/PBPK simulations, the anterior respiratory nasal epithelial tissues received the greatest initial uptake rates for each aldehyde in the rat. However, integrated time- and tissue depth-dependent area under the curve (AUC) concentrations were typically greater in the anterior dorsal olfactory epithelium using the more realistic transient breathing profiles. For human simulations, oral and laryngeal tissues received the highest local tissue dose with greater penetration to pulmonary tissues than predicted in the rat. Based upon lifetime average daily dose comparisons of tissue hot-spot AUCs (top 2.5% of surface area-normalized AUCs in each region) and numbers of cigarettes smoked/day, the order of concern for human exposures was acrolein > formaldehyde > acetaldehyde even though acetaldehyde yields were 10-fold greater than formaldehyde and acrolein. PMID:25858911

  11. Computational model of chromosome aberration yield induced by high- and low-LET radiation exposures.

    PubMed

    Ponomarev, Artem L; George, Kerry; Cucinotta, Francis A

    2012-06-01

    We present a computational model for calculating the yield of radiation-induced chromosomal aberrations in human cells based on a stochastic Monte Carlo approach and calibrated using the relative frequencies and distributions of chromosomal aberrations reported in the literature. A previously developed DNA-fragmentation model for high- and low-LET radiation called the NASARadiationTrackImage model was enhanced to simulate a stochastic process of the formation of chromosomal aberrations from DNA fragments. The current version of the model gives predictions of the yields and sizes of translocations, dicentrics, rings, and more complex-type aberrations formed in the G(0)/G(1) cell cycle phase during the first cell division after irradiation. As the model can predict smaller-sized deletions and rings (<3 Mbp) that are below the resolution limits of current cytogenetic analysis techniques, we present predictions of hypothesized small deletions that may be produced as a byproduct of properly repaired DNA double-strand breaks (DSB) by nonhomologous end-joining. Additionally, the model was used to scale chromosomal exchanges in two or three chromosomes that were obtained from whole-chromosome FISH painting analysis techniques to whole-genome equivalent values.

  12. Shift of the focal spot of an X-ray tube with a transmission anode under long exposure

    NASA Astrophysics Data System (ADS)

    Obodovskiy, A. V.; Bessonov, V. B.; Larionov, I. A.

    2018-02-01

    X-ray non-destructive testing is an integral part of modern industrial production. Microfocus X-ray sources make it possible to obtain projection images with increased spatial resolution by using direct geometric magnification during the survey. A model of a microfocus X-ray computed tomography system has been designed by the staff of the Department of Electronic Devices and Equipment at St. Petersburg State Electrotechnical University.

  13. Fractional dynamics pharmacokinetics–pharmacodynamic models

    PubMed Central

    2010-01-01

    While an increasing number of applications of fractional-order integrals and differential equations have been reported in the physics, signal processing, engineering and bioengineering literatures, little attention has been paid to this class of models in the pharmacokinetics–pharmacodynamic (PKPD) literature. One of the reasons is computational: while the analytical solution of fractional differential equations is available in special cases, it turns out that even the simplest PKPD models that can be constructed using fractional calculus do not allow an analytical solution. In this paper, we first introduce new families of PKPD models incorporating fractional-order integrals and differential equations, and, second, exemplify and investigate their qualitative behavior. The families represent extensions of frequently used PK link and PD direct and indirect action models, using the tools of fractional calculus. In addition, the PD models can be a function of a variable, the active drug, which can smoothly transition from concentration to exposure, to hyper-exposure, according to a fractional integral transformation. To investigate the behavior of the models we propose, we implement numerical algorithms for fractional integration and for the numerical solution of a system of fractional differential equations. For simplicity, in our investigation we concentrate on the pharmacodynamic side of the models, assuming standard (integer-order) pharmacokinetics. PMID:20455076
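
    A numerical fractional-integration algorithm of the sort the authors describe can be sketched with Grünwald–Letnikov weights. This generic first-order scheme is an illustration, not the authors' implementation:

```python
import math

def gl_fractional_integral(f, alpha, t, n=1000):
    """Grunwald-Letnikov approximation of the Riemann-Liouville fractional
    integral of order alpha of f over [0, t], using n steps of size h = t/n:
        I^alpha f(t) ~= h^alpha * sum_{j=0..n} w_j * f(t - j*h),
    with w_0 = 1 and the recursion w_j = w_{j-1} * (j - 1 + alpha) / j."""
    h = t / n
    w, total = 1.0, f(t)          # j = 0 term
    for j in range(1, n + 1):
        w *= (j - 1 + alpha) / j  # next binomial weight
        total += w * f(t - j * h)
    return h**alpha * total

# Sanity check against the exact result I^alpha[1](t) = t^alpha / Gamma(alpha+1):
approx = gl_fractional_integral(lambda t: 1.0, 0.5, 1.0)
exact = 1.0 / math.gamma(1.5)
print(approx, exact)
```

    With alpha = 1 the weights all equal 1 and the scheme reduces to an ordinary rectangle-rule integral, which is a convenient way to see that the recursion is right.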

  14. Are air pollution and traffic noise independently associated with atherosclerosis: the Heinz Nixdorf Recall Study.

    PubMed

    Kälsch, Hagen; Hennig, Frauke; Moebus, Susanne; Möhlenkamp, Stefan; Dragano, Nico; Jakobs, Hermann; Memmesheimer, Michael; Erbel, Raimund; Jöckel, Karl-Heinz; Hoffmann, Barbara

    2014-04-01

    Living close to high traffic has been linked to subclinical atherosclerosis; however, it is not clear whether fine particulate matter (PM) air pollution or noise, two important traffic-related exposures, is responsible for the association. We investigated the independent associations of long-term exposure to fine PM and road traffic noise with thoracic aortic calcification (TAC), a reliable measure of subclinical atherosclerosis. We used baseline data (2000-2003) from the German Heinz Nixdorf Recall Study, a population-based cohort of 4814 randomly selected participants. We assessed residential long-term exposure to PM with a chemistry transport model, and to road traffic noise using façade levels from noise models as weighted 24 h mean noise (Lden) and night-time noise (Lnight). Thoracic aortic calcification was quantified from non-contrast-enhanced electron beam computed tomography. We used multiple linear regression to estimate associations of the environmental exposures with ln(TAC+1), adjusting for each other and for individual and neighbourhood characteristics. In 4238 participants (mean age 60 years, 49.9% male), PM2.5 (aerodynamic diameter ≤2.5 µm) and Lnight were both associated with an increasing TAC burden: 18.1% (95% CI: 6.6; 30.9%) per 2.4 µg/m(3) PM2.5 and 3.9% (95% CI: 0.0; 8.0%) per 5 dB(A) Lnight, respectively, in the full model and after mutual adjustment. We did not observe effect measure modification of the PM2.5 association by Lnight or vice versa. Long-term exposure to fine PM and night-time traffic noise are both independently associated with subclinical atherosclerosis and may both contribute to the association of traffic proximity with atherosclerosis.

  15. Reconstructing Population Exposures to Environmental Chemicals from Biomarkers: Challenges and Opportunities

    EPA Science Inventory

    A conceptual/computational framework for exposure reconstruction from biomarker data combined with auxiliary exposure-related data is presented, evaluated with example applications, and examined in the context of future needs and opportunities. This framework employs Physiologica...

  16. Computer Programming Languages and Expertise Needed by Practicing Engineers.

    ERIC Educational Resources Information Center

    Doelling, Irvin

    1980-01-01

    Discussed is the present engineering computer environment of a large aerospace company recognized as a leader in the application and development of computer-aided design and computer-aided manufacturing techniques. A review is given of the exposure spectrum of engineers to the world of computing, the computer languages used, and the career impacts…

  17. Completing the Link between Exposure Science and Toxicology for Improved Environmental Health Decision Making: The Aggregate Exposure Pathway Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Teeguarden, Justin G.; Tan, Yu-Mei; Edwards, Stephen W.

    Driven by major scientific advances in analytical methods, biomonitoring, and computational exposure assessment, and by a newly articulated vision for a greater impact in public health, the field of exposure science is undergoing a rapid transition from a field of observation to a field of prediction. Deployment of an organizational and predictive framework for exposure science analogous to the computationally enabled “systems approaches” used in the biological sciences is a necessary step in this evolution. Here we propose the aggregate exposure pathway (AEP) concept as the natural and complementary companion in the exposure sciences to the adverse outcome pathway (AOP) concept in the toxicological sciences. The AEP framework offers an intuitive approach to successful organization of exposure science data within individual units of prediction common to the field, setting the stage for exposure forecasting. Looking farther ahead, we envision direct linkages between aggregate exposure pathways and adverse outcome pathways, completing the source-to-outcome continuum and setting the stage for more efficient integration of exposure science and toxicity-testing information. Together these frameworks form and inform a decision-making framework with the flexibility for risk-based, hazard-based, or exposure-based decisions.

  18. HADOC: a computer code for calculation of external and inhalation doses from acute radionuclide releases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strenge, D.L.; Peloquin, R.A.

    The computer code HADOC (Hanford Acute Dose Calculations) is described and instructions for its use are presented. The code calculates external dose from air submersion and inhalation doses following acute radionuclide releases. Atmospheric dispersion is calculated using the Hanford model, with options to determine maximum conditions. Building wake effects and terrain variation may also be considered. Doses are calculated using dose conversion factors supplied in a data library. Doses are reported for one- and fifty-year dose commitment periods for the maximum individual and the regional population (within 50 miles). The fractional contributions to dose by radionuclide and exposure mode are also printed if requested.
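
    The basic chain of calculation in a code of this kind (release multiplied by an atmospheric dispersion factor chi/Q, by the breathing rate, and by a dose conversion factor) can be illustrated in miniature. All numbers below are illustrative, and this sketch is not the HADOC code itself:

```python
def inhalation_dose(release_bq, chi_over_q, breathing_rate, dcf):
    """Acute inhalation dose (Sv) for a single radionuclide:
    time-integrated air concentration (Bq*s/m^3) = release (Bq) * chi/Q (s/m^3);
    intake (Bq) = concentration * breathing rate (m^3/s);
    dose (Sv) = intake * dose conversion factor (Sv/Bq)."""
    return release_bq * chi_over_q * breathing_rate * dcf

# Illustrative values only: 1e12 Bq release, chi/Q = 1e-5 s/m^3,
# adult breathing rate 3.3e-4 m^3/s, DCF = 1e-8 Sv per Bq inhaled.
dose = inhalation_dose(1e12, 1e-5, 3.3e-4, 1e-8)
print(dose)
```

    A full code sums such terms over all released radionuclides and adds the external air-submersion pathway; this sketch shows one radionuclide and one pathway.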

  19. Genocide Exposure and Subsequent Suicide Risk: A Population-Based Study

    PubMed Central

    Levine, Stephen Z.; Levav, Itzhak; Yoffe, Rinat; Becher, Yifat; Pugachova, Inna

    2016-01-01

    The association between periods of genocide-related exposure and suicide risk remains unknown. Our study tests that association using a national population-based study design. The source population comprised all persons born during 1922-1945 in Nazi-occupied or -dominated European nations who immigrated to Israel by 1965; they were identified in the Population Register (N = 220,665) and followed up for suicide to 2014, totaling 16,953,602 person-years. The population was disaggregated to compare a trauma gradient among groups that immigrated before (indirect, n = 20,612, 9%), during (partial direct, n = 17,037, 8%), or after (full direct, n = 183,016, 83%) exposure to the Nazi era. The direct exposure groups were also examined with regard to pre- or postnatal exposure periods. Cox regression models were used to compute hazard ratios (HR) of suicide risk comparing the exposure groups, adjusting for confounding by gender, residential SES, and history of psychiatric hospitalization. In the total population, only the partial direct exposure subgroup was at greater risk compared to the indirect exposure group (HR = 1.73, 95% CI: 1.10, 2.73; P < .05). That effect replicated in six sensitivity analyses. In addition, sensitivity analyses showed that exposure at age 13 or older among females, and follow-up by years since immigration, were associated with greater risk, whereas in utero exposure among persons with no psychiatric hospitalization and early postnatal exposure among males were associated with reduced risk. Tentative mechanisms impute biopsychosocial vulnerability and natural selection during early critical periods among males, and feelings of guilt and entrapment or defeat among females. PMID:26901411

  20. Pet exposure and risk of atopic dermatitis at the pediatric age: a meta-analysis of birth cohort studies.

    PubMed

    Pelucchi, Claudio; Galeone, Carlotta; Bach, Jean-François; La Vecchia, Carlo; Chatenoud, Liliane

    2013-09-01

    Findings on pet exposure and the risk of atopic dermatitis (AD) in children are inconsistent. To summarize the effects of exposure to different pets on AD, we undertook a meta-analysis of epidemiologic studies on this issue. In August 2012, we conducted a systematic literature search in Medline and Embase. We included analytic studies considering exposure to dogs, cats, other pets, or pets overall during pregnancy, infancy, and/or childhood, with AD assessment performed during infancy or childhood. We calculated summary relative risks and 95% CIs using both fixed- and random-effects models. We computed summary estimates across selected subgroups. Twenty-six publications from 21 birth cohort studies were used in the meta-analyses. The pooled relative risks of AD for exposure versus no exposure were 0.72 (95% CI, 0.61-0.85; I(2) = 46%; based on 15 studies) for exposure to dogs, 0.94 (95% CI, 0.76-1.16; I(2) = 54%; based on 13 studies) for exposure to cats, and 0.75 (95% CI, 0.67-0.85; I(2) = 54%; based on 11 studies) for exposure to pets overall. No heterogeneity emerged across the subgroups examined, except for geographic area. This meta-analysis found a favorable effect of exposure to dogs and to pets overall on the risk of AD in infants and children, whereas no association emerged with exposure to cats. Copyright © 2013 American Academy of Allergy, Asthma & Immunology. Published by Mosby, Inc. All rights reserved.
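
    Random-effects pooling in meta-analyses of this kind is commonly done with the DerSimonian–Laird estimator. A self-contained sketch on toy data (not the studies pooled here):

```python
import math

def dl_random_effects(log_rr, se):
    """DerSimonian-Laird random-effects pooling of per-study log relative
    risks with standard errors. Returns (pooled RR, 95% CI low, 95% CI high)."""
    w = [1.0 / s**2 for s in se]                       # fixed-effect weights
    fixed = sum(wi * y for wi, y in zip(w, log_rr)) / sum(w)
    q = sum(wi * (y - fixed)**2 for wi, y in zip(w, log_rr))   # Cochran's Q
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(log_rr) - 1)) / c)       # between-study variance
    w_star = [1.0 / (s**2 + tau2) for s in se]         # random-effects weights
    pooled = sum(wi * y for wi, y in zip(w_star, log_rr)) / sum(w_star)
    se_pooled = math.sqrt(1.0 / sum(w_star))
    return tuple(math.exp(pooled + z * se_pooled) for z in (0.0, -1.96, 1.96))

# Toy data (NOT the studies in this meta-analysis): three log-RRs and SEs.
rr, lo, hi = dl_random_effects([-0.33, -0.22, -0.41], [0.10, 0.15, 0.20])
print(round(rr, 2), round(lo, 2), round(hi, 2))
```

    When Cochran's Q falls below its degrees of freedom, tau² is truncated to zero and the estimate coincides with the fixed-effect result, as it does for this toy data.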
