Lu, Jingtao; Goldsmith, Michael-Rock; Grulke, Christopher M; Chang, Daniel T; Brooks, Raina D; Leonard, Jeremy A; Phillips, Martin B; Hypes, Ethan D; Fair, Matthew J; Tornero-Velez, Rogelio; Johnson, Jeffre; Dary, Curtis C; Tan, Yu-Mei
2016-02-01
Developing physiologically-based pharmacokinetic (PBPK) models for chemicals can be resource-intensive, as neither chemical-specific parameters nor in vivo pharmacokinetic data are easily available for model construction. Previously developed, well-parameterized, and thoroughly-vetted models can be a great resource for the construction of models pertaining to new chemicals. A PBPK knowledgebase was compiled and developed from existing PBPK-related articles and used to develop new models. From 2,039 PBPK-related articles published between 1977 and 2013, 307 unique chemicals were identified for use as the basis of our knowledgebase. Keywords related to species, gender, developmental stages, and organs were analyzed from the articles within the PBPK knowledgebase. A correlation matrix of the 307 chemicals in the PBPK knowledgebase was calculated based on pharmacokinetic-relevant molecular descriptors. Chemicals in the PBPK knowledgebase were ranked based on their correlation toward ethylbenzene and gefitinib. Next, multiple chemicals were selected to represent exact matches, close analogues, or non-analogues of the target case study chemicals. Parameters, equations, or experimental data relevant to existing models for these chemicals and their analogues were used to construct new models, and model predictions were compared to observed values. This compiled knowledgebase provides a chemical structure-based approach for identifying PBPK models relevant to other chemical entities. Using suitable correlation metrics, we demonstrated that models of chemical analogues in the PBPK knowledgebase can guide the construction of PBPK models for other chemicals.
Ding, Feng; Yang, Xianhai; Chen, Guosong; Liu, Jining; Shi, Lili; Chen, Jingwen
2017-10-01
The partition coefficients between bovine serum albumin (BSA) and water (K_BSA/w) of ionogenic organic chemicals (IOCs) differ greatly from those of neutral organic chemicals (NOCs). For NOCs, several excellent models have been developed to predict log K_BSA/w; however, the conventional descriptors used in those models proved inappropriate for modeling the log K_BSA/w of IOCs. Thus, alternative approaches are urgently needed to develop predictive models for the K_BSA/w of IOCs. In this study, molecular descriptors that characterize ionization effects (e.g., chemical form adjusted descriptors) were calculated and used to develop predictive models for the log K_BSA/w of IOCs. The models developed had high goodness-of-fit, robustness, and predictive ability. The predictor variables selected to construct the models included the chemical form adjusted average of the negative potentials on the molecular surface (V_s,adj^-), the chemical form adjusted molecular dipole moment (dipolemoment_adj), and the logarithm of the n-octanol/water distribution coefficient (log D). As these molecular descriptors can be calculated directly from molecular structures, the developed model can easily be used to fill the log K_BSA/w data gap for other IOCs within the applicability domain. Furthermore, the chemical form adjusted descriptors calculated in this study could also be used to construct predictive models for other endpoints of IOCs. Copyright © 2017 Elsevier Inc. All rights reserved.
Önlü, Serli; Saçan, Melek Türker
2017-04-01
The authors modeled the 72-h algal toxicity data of hundreds of chemicals with different modes of action as a function of chemical structure. They developed mode of action-based local quantitative structure-toxicity relationship (QSTR) models for nonpolar and polar narcotics, as well as a global QSTR model with wide applicability for industrial chemicals and pharmaceuticals. The present study rigorously evaluated the generated models, which meet the Organisation for Economic Co-operation and Development principles of robustness, validity, and transparency. The proposed global model had broad structural coverage for the toxicity prediction of diverse chemicals (some of which are high-production-volume chemicals) with no experimental toxicity data. The global model is potentially useful for endpoint prediction, algal toxicity screening, and the prioritization of chemicals, as well as for decisions on further testing and the development of risk-management measures in a scientific and regulatory frame. Environ Toxicol Chem 2017;36:1012-1019. © 2016 SETAC.
Development of estrogen receptor beta binding prediction model using large sets of chemicals.
Sakkiah, Sugunadevi; Selvaraj, Chandrabose; Gong, Ping; Zhang, Chaoyang; Tong, Weida; Hong, Huixiao
2017-11-03
We developed an ERβ binding prediction model to facilitate identification of chemicals that specifically bind ERβ or ERα, complementing our previously developed ERα binding model. Decision Forest was used to train the ERβ binding prediction model on a large set of compounds obtained from EADB. Model performance was estimated through 1000 iterations of 5-fold cross-validation. Prediction confidence was analyzed using predictions from the cross-validations. Informative chemical features for ERβ binding were identified by analyzing the frequency of the chemical descriptors used in the models across the 5-fold cross-validations. 1000 permutations were conducted to assess chance correlation. The average accuracy across the 5-fold cross-validations was 93.14%, with a standard deviation of 0.64%. Prediction confidence analysis indicated that the higher the prediction confidence, the more accurate the predictions. Permutation testing revealed that the prediction model is unlikely to have been generated by chance. Eighteen informative descriptors were identified as important to ERβ binding prediction. Application of the prediction model to data from the ToxCast project yielded a very high sensitivity of 90-92%. Our results demonstrate that ERβ binding of chemicals can be accurately predicted using the developed model. Coupled with our previously developed ERα prediction model, this model is expected to facilitate drug development through identification of chemicals that specifically bind ERβ or ERα.
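The repeated 5-fold cross-validation described above can be sketched in a few lines. The toy descriptor data and the threshold "classifier" below are hypothetical stand-ins for Decision Forest and the EADB compounds; the point is only to show how fold accuracies are accumulated into a mean and standard deviation.

```python
import random
from statistics import mean, stdev

def k_fold_indices(n, k, rng):
    """Shuffle sample indices and deal them into k disjoint folds."""
    idx = list(range(n))
    rng.shuffle(idx)
    return [idx[i::k] for i in range(k)]

def repeated_cv(X, y, k, train_fn, n_iter, seed=0):
    """Repeat k-fold cross-validation n_iter times; return mean/sd of fold accuracies."""
    rng = random.Random(seed)
    accs = []
    for _ in range(n_iter):
        for test_idx in k_fold_indices(len(X), k, rng):
            test = set(test_idx)
            X_tr = [x for i, x in enumerate(X) if i not in test]
            y_tr = [t for i, t in enumerate(y) if i not in test]
            predict = train_fn(X_tr, y_tr)
            correct = sum(predict(X[i]) == y[i] for i in test_idx)
            accs.append(correct / len(test_idx))
    return mean(accs), stdev(accs)

def train_threshold(X, y):
    """Toy 'model': threshold on the first descriptor, midway between class means."""
    m0 = mean(x[0] for x, t in zip(X, y) if t == 0)
    m1 = mean(x[0] for x, t in zip(X, y) if t == 1)
    cut = (m0 + m1) / 2
    return lambda x: int(x[0] > cut)

# Hypothetical descriptor values for 8 compounds (binders have larger values).
X = [[0.1], [0.2], [0.3], [0.4], [0.6], [0.7], [0.8], [0.9]]
y = [0, 0, 0, 0, 1, 1, 1, 1]
acc_mean, acc_sd = repeated_cv(X, y, k=4, train_fn=train_threshold, n_iter=10)
```

Because the toy classes are perfectly separable, every fold scores 1.0 here; with real descriptors the mean and standard deviation summarize performance as in the abstract (93.14% ± 0.64%).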
CFD Code Development for Combustor Flows
NASA Technical Reports Server (NTRS)
Norris, Andrew
2003-01-01
During the lifetime of this grant, work was performed in the areas of model development, code development, code validation, and code application. Model development included the PDF combustion module, chemical kinetics based on thermodynamics, neural-network storage of chemical kinetics, ILDM chemical kinetics, and assumed-PDF work. Many of these models were then implemented in the code, and many improvements were made to the code as well, including the addition of new chemistry integrators, property evaluation schemes, new chemistry models, and turbulence-chemistry interaction methodology. All new models and code improvements were validated, and the code was applied to the ZCET program and to the NPSS GEW combustor program. Several important items remain under development, including NOx post-processing, assumed-PDF model development, and chemical kinetics development. It is expected that this work will continue under the new grant.
The application of chemical leasing business models in Mexico.
Schwager, Petra; Moser, Frank
2006-03-01
To better address the requirements of the changing multilateral order, the United Nations Industrial Development Organization (UNIDO) Cleaner Production Programme developed the new Sustainable Industrial Resource Management (SIRM) approach in 2004. This approach is in accordance with the principles agreed at the United Nations Conference on Environment and Development (UNCED) in Rio de Janeiro, Brazil, in 1992. Unlike traditional approaches to environmental management, the SIRM concept captures the idea of achieving sustainable industrial development by implementing circular material and energy flows throughout the production chain and reducing the amount of material and energy used through greater-efficiency solutions. The SIRM approach seeks to develop new models that encourage a shift from selling products to supplying services, modifying the supplier/user relationship and resulting in a win-win situation for the economy and the environment. Chemical Leasing represents such a new service-oriented business model and is currently being promoted by UNIDO's Cleaner Production Programme. MAIN FEATURES: One potential approach to the problems of ineffective use and over-consumption of chemicals is the development and implementation of Chemical Leasing business models. These provide concrete solutions for the effective management of chemicals and for reducing negative releases to the environment. The Chemical Leasing approach is a strategy that addresses the obligations of the changing international chemicals policy by focusing on a more service-oriented strategy. Mexico is one of the countries selected for the implementation of UNIDO's demonstration project to promote Chemical Leasing models. The target sector of this project is the chemical industry, which is expected to shift its traditional business concept towards a more service- and value-added approach.
This is being achieved through the development of company-specific business models that implement the above-described Chemical Leasing concept with support from the Mexican National Cleaner Production Centre (NCPC). The implementation of Chemical Leasing in Mexico has proven to be an efficient instrument for enhancing sustainable chemical management and significantly reducing emissions. Several companies from the chemical industry have implemented, or agreed to implement, Chemical Leasing business models. Based on the positive findings of the project, several Mexican companies started to negotiate the contents of possible Chemical Leasing contracts with suitable business partners. The project further aimed at disseminating information on Chemical Leasing. It successfully attracted globally operating companies in the chemicals sector to explore possibilities for implementing Chemical Leasing business models in Mexico. At the international level, the results of the UNIDO project were presented on 20 September 2005 during a side event of the Strategic Approach to International Chemicals Management (SAICM) preparation conference in Vienna. To facilitate the promotion and application of Chemical Leasing projects at the international level, UNIDO is currently developing a number of tools to standardize them. These include, among others, Chemical Leasing contract models; a Chemical Leasing database for finding partners; and guidelines for implementing Chemical Leasing projects and work programmes.
Modeling of Spacecraft Advanced Chemical Propulsion Systems
NASA Technical Reports Server (NTRS)
Benfield, Michael P. J.; Belcher, Jeremy A.
2004-01-01
This paper outlines the development of the Advanced Chemical Propulsion System (ACPS) model for Earth and Space Storable propellants. This model was developed by the System Technology Operation of SAIC-Huntsville for the NASA MSFC In-Space Propulsion Project Office. Each subsystem of the model is described. Selected model results will also be shown to demonstrate the model's ability to evaluate technology changes in chemical propulsion systems.
Identification of chemical vascular disruptors during development using an integrative predictive toxicity model and zebrafish and in vitro functional angiogenesis assays
Chemically-induced vascular toxicity during embryonic development can result in a wide range of adverse pre...
Ma, Guangcai; Yuan, Quan; Yu, Haiying; Lin, Hongjun; Chen, Jianrong; Hong, Huachang
2017-04-01
The binding of organic chemicals to serum albumin can significantly reduce their unbound concentration in blood and affect their biological reactions. In this study, we developed a new QSAR model for the bovine serum albumin (BSA)-water partition coefficients (K_BSA/W) of neutral organic chemicals with large structural variance, with log K_BSA/W values covering 3.5 orders of magnitude (1.19-4.76). All chemical geometries were optimized with the semi-empirical PM6 algorithm. Several quantum chemical parameters reflecting various intermolecular interactions, as well as hydrophobicity, were selected to develop the QSAR model. The results indicate that the regression model derived from log K_ow, the most positive net atomic charge on an atom, Connolly solvent-excluded volume, polarizability, and Abraham acidity can explain the partitioning mechanism of organic chemicals between BSA and water. Simulated external validation and cross-validation verify that the developed model has good statistical robustness and predictive ability; it can thus be used to estimate log K_BSA/W values for chemicals within the applicability domain, providing basic data for toxicity assessment of those chemicals. Copyright © 2016 Elsevier Inc. All rights reserved.
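A regression QSAR of this kind reduces to an ordinary least-squares fit of log K values against descriptor columns plus an intercept. The sketch below uses synthetic, noise-free data (not the paper's PM6-derived descriptors), so the fit recovers the assumed coefficients exactly and R² is 1; with real data the residuals would be nonzero.

```python
import numpy as np

rng = np.random.default_rng(0)
n_chem, n_desc = 30, 3

# Hypothetical descriptor matrix (think log Kow, polarizability, acidity) and
# assumed "true" contributions; all numbers are synthetic, not from the paper.
D = rng.normal(size=(n_chem, n_desc))
true_w = np.array([0.8, -0.3, 0.5])
log_k = D @ true_w + 1.2                   # noise-free log K values, intercept 1.2

A = np.hstack([D, np.ones((n_chem, 1))])   # append intercept column
coef, *_ = np.linalg.lstsq(A, log_k, rcond=None)

pred = A @ coef
r2 = 1.0 - np.sum((log_k - pred) ** 2) / np.sum((log_k - log_k.mean()) ** 2)
```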
J. E. Winandy; P. K. Lebow
2001-01-01
In this study, we develop models for predicting loss in bending strength of clear, straight-grained pine from changes in chemical composition. Although significant work needs to be done before truly universal predictive models are developed, a quantitative fundamental relationship between changes in chemical composition and strength loss for pine was demonstrated. In...
Approaches to developing alternative and predictive toxicology based on PBPK/PD and QSAR modeling.
Yang, R S; Thomas, R S; Gustafson, D L; Campain, J; Benjamin, S A; Verhaar, H J; Mumtaz, M M
1998-01-01
Systematic toxicity testing of single chemicals and chemical mixtures using conventional toxicology methodologies is highly impractical because of the immense number of chemicals and chemical mixtures involved and the limited scientific resources. Therefore, the development of unconventional, efficient, and predictive toxicology methods is imperative. Using carcinogenicity as an end point, we present approaches for developing predictive tools for toxicologic evaluation of chemicals and chemical mixtures relevant to environmental contamination. Central to the approaches presented is the integration of physiologically based pharmacokinetic/pharmacodynamic (PBPK/PD) and quantitative structure-activity relationship (QSAR) modeling with focused, mechanistically based experimental toxicology. In this development, molecular and cellular biomarkers critical to the carcinogenesis process are evaluated quantitatively between different chemicals and/or chemical mixtures. Examples presented include the integration of PBPK/PD and QSAR modeling with a time-course medium-term liver foci assay, molecular biology and cell proliferation studies, Fourier transform infrared spectroscopic analyses of DNA changes, and cancer modeling to assess and attempt to predict the carcinogenicity of the series of 12 chlorobenzene isomers. Also presented is an ongoing effort to develop and apply a similar approach to chemical mixtures using in vitro cell culture methodologies (Syrian hamster embryo cell transformation assay and human keratinocytes) and in vivo studies. The promise and pitfalls of these developments are elaborated. When successfully applied, these approaches may greatly reduce the animal usage, personnel, resources, and time required to evaluate the carcinogenicity of chemicals and chemical mixtures. PMID:9860897
Prediction of Chemical Function: Model Development and ...
The United States Environmental Protection Agency's Exposure Forecaster (ExpoCast) project is developing both statistical and mechanism-based computational models for predicting exposures to thousands of chemicals, including those in consumer products. The high-throughput (HT) screening-level exposures developed under ExpoCast can be combined with HT screening (HTS) bioactivity data for risk-based prioritization of chemicals for further evaluation. The functional role (e.g., solvent, plasticizer, fragrance) that a chemical performs can drive both the types of products in which it is found and the concentration at which it is present, thereby impacting exposure potential. However, critical chemical use information (including functional role) is lacking for the majority of commercial chemicals for which exposure estimates are needed. A suite of machine-learning-based models was developed for classifying chemicals, based on structure, in terms of their likely functional roles in products. This effort required collection, curation, and harmonization of publicly available data sources of chemical functional use information from government and industry bodies. Physicochemical and structural descriptor data were generated for chemicals with function data. Machine-learning classifier models for function were then built in a cross-validated manner from the descriptor/function data using the method of random forests. The models were applied to: 1) predict chemi
Development of a database for chemical mechanism assignments for volatile organic emissions.
Carter, William P L
2015-10-01
The development of a database for making model species assignments when preparing total organic gas (TOG) emissions input for atmospheric models is described. This database currently has assignments of model species for 12 different gas-phase chemical mechanisms for over 1700 chemical compounds, and it covers over 3000 chemical categories used in five different anthropogenic TOG profile databases or output by two different biogenic emissions models. This involved developing a unified chemical classification system, assigning compounds to mixtures, assigning model species for the mechanisms to the compounds, and making assignments for unknown, unassigned, and nonvolatile mass. The comprehensiveness of the assignments, the contributions of various types of speciation categories to current profile and total emissions data, inconsistencies with existing undocumented model species assignments, and remaining speciation issues and areas of needed work are also discussed. The use of the system to prepare input for SMOKE, the Speciation Tool, and biogenic models is described in the supplementary materials. The database, associated programs and files, and a user's manual are available online at http://www.cert.ucr.edu/~carter/emitdb . Assigning air quality model species to the hundreds of emitted chemicals is a necessary link between emissions data and modeling the effects of emissions on air quality. This is not easy, which makes it difficult to implement new and more chemically detailed mechanisms in models. If done incorrectly, the effect is similar to that of errors in the emissions speciation or in the chemical mechanism used. Nevertheless, making such assignments is often an afterthought in chemical mechanism development and emissions processing, and existing assignments are usually undocumented and have errors and inconsistencies. This work is designed to address some of these problems.
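The core assignment step can be pictured as a lookup from (chemical, mechanism) pairs to model species, with unmatched mass flagged rather than silently dropped. The table entries and mechanism names below are placeholders for illustration, not actual assignments from the Carter database.

```python
# Placeholder assignment table: (chemical, mechanism) -> model species.
# Mechanism and species names are illustrative, not real database assignments.
ASSIGN = {
    ("toluene", "MECH_A"): "ARO1",
    ("toluene", "MECH_B"): "TOL",
    ("isoprene", "MECH_A"): "ISOP",
}

def model_species(chemical, mechanism):
    """Look up the model species for an emitted chemical; flag unassigned mass."""
    return ASSIGN.get((chemical, mechanism), "UNASSIGNED")

def speciate(profile, mechanism):
    """Aggregate an emissions profile {chemical: mass fraction} into model species."""
    out = {}
    for chem, frac in profile.items():
        species = model_species(chem, mechanism)
        out[species] = out.get(species, 0.0) + frac
    return out

# A toy TOG profile; note the unknown compound's mass is kept, just flagged.
profile = {"toluene": 0.6, "isoprene": 0.3, "unknown_compound": 0.1}
speciated = speciate(profile, "MECH_A")
```

Keeping an explicit UNASSIGNED bucket mirrors the database's handling of unknown, unassigned, and nonvolatile mass: total emitted mass is conserved through speciation.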
Unicorns in the world of chemical bonding models.
Frenking, Gernot; Krapp, Andreas
2007-01-15
The appearance and significance of heuristically developed bonding models are compared with the phenomenon of unicorns in mythical saga. It is argued that classical bonding models played an essential role in the development of chemical science, providing the language that is spoken in the territory of chemistry. The advent and further development of quantum chemistry impose some restrictions and boundary conditions on classical chemical bonding models, which will nonetheless continue to be integral parts of chemistry. Copyright (c) 2006 Wiley Periodicals, Inc.
Liu, Huihui; Wei, Mengbi; Yang, Xianhai; Yin, Cen; He, Xiao
2017-01-01
Partition coefficients are vital parameters for accurately measuring chemical concentrations with passive sampling devices. Given the wide use of low-density polyethylene (LDPE) film in passive sampling, we developed a theoretical linear solvation energy relationship (TLSER) model and a quantitative structure-activity relationship (QSAR) model for predicting the partition coefficient of chemicals between LDPE and water (K_pew). For chemicals with an octanol-water partition coefficient (log K_ow) < 8, a TLSER model with V_x (McGowan volume) and qA^- (the most negative charge on O, N, S, and X atoms) as descriptors was developed, but the model had a relatively low determination coefficient (R^2) and cross-validated coefficient (Q^2). To further explore the theoretical mechanisms involved in the partition process, a QSAR model with four descriptors (MLOGP (Moriguchi octanol-water partition coefficient), P_VSA_s_3 (P_VSA-like on I-state, bin 3), Hy (hydrophilic factor), and NssO (number of atoms of type ssO)) was established; statistical analysis indicated that the model had satisfactory goodness-of-fit, robustness, and predictive ability. For chemicals with log K_ow > 8, a TLSER model with V_x and a QSAR model with MLOGP as descriptors were developed. This is the first paper to explore models for such highly hydrophobic chemicals. The applicability domain of the models, characterized by the Euclidean distance-based method and the Williams plot, covered a large number of structurally diverse chemicals, including nearly all common hydrophobic organic compounds. Additionally, through mechanistic interpretation, we explored the structural features that govern the partition behavior of chemicals between LDPE and water. Copyright © 2016 Elsevier B.V. All rights reserved.
Predicting the effectiveness of chemical-protective clothing: model and test method development
A predictive model and test method were developed for determining the chemical resistance of protective polymeric gloves exposed to liquid organic chemicals. The prediction of permeation through protective gloves by solvents was based on theories of the solution thermodynamics of...
Although the literature is replete with QSAR models developed for many toxic effects caused by reversible chemical interactions, the development of QSARs for the toxic effects of reactive chemicals lacks a consistent approach. While limitations exist, an appropriate starting-point...
SHEDS-HT: An Integrated Probabilistic Exposure Model for ...
United States Environmental Protection Agency (USEPA) researchers are developing a strategy for high-throughput (HT) exposure-based prioritization of chemicals under the ExpoCast program. These novel modeling approaches for evaluating chemicals based on their potential for biologically relevant human exposures will inform toxicity testing and prioritization for chemical risk assessment. Based on probabilistic methods and algorithms developed for the Stochastic Human Exposure and Dose Simulation Model for Multimedia, Multipathway Chemicals (SHEDS-MM), a new mechanistic modeling approach has been developed to accommodate HT assessment of exposure potential. In this SHEDS-HT model, the residential and dietary modules of SHEDS-MM have been operationally modified to reduce the user burden, input data demands, and run times of the higher-tier model, while maintaining critical features and inputs that influence exposure. The model has been implemented in R; the modeling framework links chemicals to consumer product categories or food groups (and thus exposure scenarios) to predict HT exposures and intake doses. Initially, SHEDS-HT has been applied to 2507 organic chemicals associated with consumer products and agricultural pesticides. These evaluations employ data from recent USEPA efforts to characterize usage (prevalence, frequency, and magnitude), chemical composition, and exposure scenarios for a wide range of consumer products. In modeling indirec
Ng, Hui Wen; Doughty, Stephen W; Luo, Heng; Ye, Hao; Ge, Weigong; Tong, Weida; Hong, Huixiao
2015-12-21
Some chemicals in the environment possess the potential to interact with the endocrine system in the human body. Multiple receptors are involved in the endocrine system; estrogen receptor α (ERα) plays very important roles in endocrine activity and is the most studied receptor. Understanding and predicting estrogenic activity of chemicals facilitates the evaluation of their endocrine activity. Hence, we have developed a decision forest classification model to predict chemical binding to ERα using a large training data set of 3308 chemicals obtained from the U.S. Food and Drug Administration's Estrogenic Activity Database. We tested the model using cross validations and external data sets of 1641 chemicals obtained from the U.S. Environmental Protection Agency's ToxCast project. The model showed good performance in both internal (92% accuracy) and external validations (∼ 70-89% relative balanced accuracies), where the latter involved the validations of the model across different ER pathway-related assays in ToxCast. The important features that contribute to the prediction ability of the model were identified through informative descriptor analysis and were related to current knowledge of ER binding. Prediction confidence analysis revealed that the model had both high prediction confidence and accuracy for most predicted chemicals. The results demonstrated that the model constructed based on the large training data set is more accurate and robust for predicting ER binding of chemicals than the published models that have been developed using much smaller data sets. The model could be useful for the evaluation of ERα-mediated endocrine activity potential of environmental chemicals.
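The prediction-confidence analysis mentioned above amounts to filtering predictions by their distance from the decision boundary and recomputing accuracy on what remains. A minimal sketch with made-up probabilities and labels (not EADB or ToxCast data):

```python
def confidence_accuracy(probs, labels, thresholds=(0.5, 0.7, 0.9)):
    """Accuracy restricted to predictions whose confidence meets each threshold.

    probs are hypothetical P(binder) outputs; confidence is max(p, 1 - p),
    i.e. distance from the 0.5 decision boundary."""
    results = {}
    for t in thresholds:
        kept = [(p, y) for p, y in zip(probs, labels) if max(p, 1 - p) >= t]
        correct = sum((p >= 0.5) == bool(y) for p, y in kept)
        results[t] = correct / len(kept) if kept else None
    return results

# Made-up predicted probabilities and true labels for six chemicals.
probs = [0.95, 0.90, 0.80, 0.60, 0.55, 0.10]
labels = [1, 1, 1, 0, 1, 0]
by_confidence = confidence_accuracy(probs, labels)
```

In this toy example the only errors sit near the boundary, so accuracy rises as the confidence cutoff tightens, which is the qualitative pattern the abstract reports.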
QSAR modeling for predicting mutagenic toxicity of diverse chemicals for regulatory purposes.
Basant, Nikita; Gupta, Shikha
2017-06-01
The safety assessment process for chemicals requires information on their mutagenic potential. Experimental determination of the mutagenicity of a large number of chemicals is tedious and time- and cost-intensive, compelling the use of alternative methods. We established local and global QSAR models for discriminating low and high mutagenic compounds and for quantitatively predicting their mutagenic activity in Salmonella typhimurium (TA) bacterial strains (TA98 and TA100). The decision treeboost (DTB)-based classification QSAR models discriminated between the two categories with accuracies of >96%, and the regression QSAR models precisely predicted the mutagenic activity of diverse chemicals, yielding high correlations (R^2) between the experimental and model-predicted values in the respective training (>0.96) and test (>0.94) sets. The test set root mean squared error (RMSE) and mean absolute error (MAE) values emphasized the usefulness of the developed models for predicting new compounds. Relevant structural features of diverse chemicals that influence mutagenic activity were identified. The applicability domains of the developed models were defined. The developed models can be used as tools for screening new chemicals for mutagenicity assessment for regulatory purposes.
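The two test-set error measures cited, RMSE and MAE, are straightforward to compute; the observed and predicted activity values below are illustrative only, not values from the paper.

```python
from math import sqrt

def rmse(obs, pred):
    """Root mean squared error between observed and predicted activities."""
    return sqrt(sum((o - p) ** 2 for o, p in zip(obs, pred)) / len(obs))

def mae(obs, pred):
    """Mean absolute error between observed and predicted activities."""
    return sum(abs(o - p) for o, p in zip(obs, pred)) / len(obs)

# Hypothetical observed vs. predicted mutagenic activities for four compounds.
observed  = [1.0, 2.0, 3.0, 4.0]
predicted = [1.0, 2.5, 3.0, 3.0]
err_rmse = rmse(observed, predicted)
err_mae = mae(observed, predicted)
```

RMSE squares the residuals, so a single large miss (the fourth compound) inflates it more than MAE; reporting both, as the authors do, separates typical error from outlier error.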
A comprehensive physiologically based pharmacokinetic ...
Published physiologically based pharmacokinetic (PBPK) models from peer-reviewed articles are often well-parameterized and thoroughly vetted, and they can be utilized as excellent resources for the construction of models pertaining to related chemicals. Specifically, the chemical-specific parameters and in vivo pharmacokinetic data used to calibrate these published models can act as valuable starting points for model development for new chemicals with similar molecular structures. A knowledgebase of published PBPK-related articles was compiled to support PBPK model construction for new chemicals based on their close analogues within the knowledgebase, and a web-based interface was developed to allow users to query those close analogues. A list of 689 unique chemicals and their corresponding 1,751 articles was created after analysis of 2,245 PBPK-related articles. For each model, the PMID, chemical name, major metabolites, species, gender, life stages, and tissue compartments were extracted from the published articles. PaDEL-Descriptor, a Chemistry Development Kit-based software, was used to calculate molecular fingerprints. The Tanimoto index was implemented in the user interface as a measure of structural similarity. The utility of the PBPK knowledgebase and web-based user interface was demonstrated using two case studies with ethylbenzene and gefitinib. Our PBPK knowledgebase is a novel tool for ranking chemicals based on similarities to other chemicals associated with existi
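The Tanimoto index on binary fingerprints is simply intersection over union of the "on" bits, and ranking a knowledgebase by it is a one-line sort. The bit sets below are invented for illustration, not PaDEL-Descriptor output.

```python
def tanimoto(fp_a, fp_b):
    """Tanimoto index between two binary fingerprints given as sets of on-bit positions."""
    union = len(fp_a | fp_b)
    return len(fp_a & fp_b) / union if union else 0.0

# Invented fingerprints for a target chemical and two knowledgebase entries.
target    = {1, 4, 7, 9, 12}
analogue  = {1, 4, 7, 12, 15}
unrelated = {2, 3, 20}

# Rank knowledgebase chemicals by structural similarity to the target.
knowledgebase = {"analogue": analogue, "unrelated": unrelated}
ranked = sorted(knowledgebase,
                key=lambda name: tanimoto(target, knowledgebase[name]),
                reverse=True)
```

The closest analogue sorts to the top, which is exactly the query the web interface answers when a user supplies a new chemical.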
Enabling PBPK model development through the application of ...
The creation of physiologically based pharmacokinetic (PBPK) models for a new chemical requires the selection of an appropriate model structure and the collection of a large amount of data for parameterization. Commonly, a large proportion of the needed information is collected from previously published PBPK models for compounds analogous to the chemical of interest. A key difficulty in quickly developing new models is therefore the identification of appropriate chemical analogs within the PBPK model literature. To reduce the burden on researchers of finding the appropriate literature to inform new modeling efforts, we sought to collect a comprehensive listing of chemicals contained in the corpus of PBPK articles and embed them into a chemically searchable database for facile analog identification. To cull the list of chemicals from the PBPK literature, we investigated the use of three easily accessible methods: collecting chemicals via MeSH controlled vocabulary, processing abstracts using OSCAR4 text-mining software, and annotating abstracts using chemicalize.org. In total, just over 300 unique compounds spanning a variety of chemical classes were identified as having completed PBPK models from over 1700 articles. Additional annotations of PBPK model details, including species, lifestage, number of compartments, gender, and exposure routes, were tabulated. These data were then embedded into the Toxicokinetic Knowledge Base (TKKB), an internal website for chemicall
Developing Computer Model-Based Assessment of Chemical Reasoning: A Feasibility Study
ERIC Educational Resources Information Center
Liu, Xiufeng; Waight, Noemi; Gregorius, Roberto; Smith, Erica; Park, Mihwa
2012-01-01
This paper reports a feasibility study on developing computer model-based assessments of chemical reasoning at the high school level. Computer models are flash and NetLogo environments to make simultaneously available three domains in chemistry: macroscopic, submicroscopic, and symbolic. Students interact with computer models to answer assessment…
We developed a numerical model to predict chemical concentrations in indoor environments resulting from soil vapor intrusion and volatilization from groundwater. The model, which integrates new and existing algorithms for chemical fate and transport, was originally...
DEVELOPING EMISSION INVENTORIES FOR BIOMASS BURNING FOR REAL-TIME AND RETROSPECTIVE MODELING
The EPA uses chemical transport models to simulate historic meteorological episodes for developing air quality management strategies. In addition, chemical transport models are now being used operationally to create air quality forecasts. There are currently a number of methods a...
Chen, Guangchao; Li, Xuehua; Chen, Jingwen; Zhang, Ya-Nan; Peijnenburg, Willie J G M
2014-12-01
Biodegradation is the principal environmental dissipation process of chemicals. As such, it is a dominant factor determining the persistence and fate of organic chemicals in the environment, and is therefore of critical importance to chemical management and regulation. In the present study, the authors developed in silico methods assessing biodegradability based on a large heterogeneous set of 825 organic compounds, using the techniques of the C4.5 decision tree, the functional inner regression tree, and logistic regression. External validation was subsequently carried out by 2 independent test sets of 777 and 27 chemicals. As a result, the functional inner regression tree exhibited the best predictability with predictive accuracies of 81.5% and 81.0%, respectively, on the training set (825 chemicals) and test set I (777 chemicals). Performance of the developed models on the 2 test sets was subsequently compared with that of the Estimation Program Interface (EPI) Suite Biowin 5 and Biowin 6 models, which also showed a better predictability of the functional inner regression tree model. The model built in the present study exhibits a reasonable predictability compared with existing models while possessing a transparent algorithm. Interpretation of the mechanisms of biodegradation was also carried out based on the models developed. © 2014 SETAC.
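External validation of the kind reported above (predictive accuracies of roughly 81% on independent test sets) reduces to scoring a fitted classifier on chemicals it never saw. The sketch below uses a single hypothetical descriptor threshold as a stand-in for the fitted regression-tree model; the descriptor values and class labels are illustrative only.

```python
# External validation of a biodegradability classifier: the "model" is a
# one-rule stand-in for a fitted decision tree, and accuracy is the fraction
# of correct ready (1) / not-ready (0) calls on an independent test set.

def predict(logkow: float) -> int:
    # Hypothetical rule: highly hydrophobic chemicals predicted persistent (0),
    # otherwise readily biodegradable (1).
    return 1 if logkow < 4.0 else 0

def accuracy(pairs) -> float:
    """pairs: iterable of (descriptor_value, observed_class)."""
    hits = sum(1 for x, y in pairs if predict(x) == y)
    return hits / len(pairs)

# Hypothetical external test set: (log Kow, observed class).
test_set = [(1.2, 1), (2.8, 1), (5.1, 0), (6.3, 0), (3.9, 0)]
print(f"external accuracy: {accuracy(test_set):.1%}")
```

Real models such as the functional inner regression tree combine many descriptors, but the validation arithmetic is the same.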
AQUATIC TOXICITY MODE OF ACTION STUDIES APPLIED TO QSAR DEVELOPMENT
A series of QSAR models for predicting fish acute lethality were developed using systematically collected data on more than 600 chemicals. These models were developed based on the assumption that chemicals producing toxicity through a common mechanism will have commonality in the...
Chemical process simulation has long been used as a design tool in the development of chemical plants, and has long been considered a means to evaluate different design options. With the advent of large scale computer networks and interface models for program components, it is po...
ACToR: Aggregated Computational Toxicology Resource ...
We are developing the ACToR system (Aggregated Computational Toxicology Resource) to serve as a repository for a variety of types of chemical, biological and toxicological data that can be used for predictive modeling of chemical toxicology.
Engineered Barrier System: Physical and Chemical Environment
DOE Office of Scientific and Technical Information (OSTI.GOV)
P. Dixon
2004-04-26
The conceptual and predictive models documented in this Engineered Barrier System: Physical and Chemical Environment Model report describe the evolution of the physical and chemical conditions within the waste emplacement drifts of the repository. The modeling approaches and model output data will be used in the total system performance assessment (TSPA-LA) to assess the performance of the engineered barrier system and the waste form. These models evaluate the range of potential water compositions within the emplacement drifts, resulting from the interaction of introduced materials and minerals in dust with water seeping into the drifts and with aqueous solutions forming by deliquescence of dust (as influenced by atmospheric conditions), and from thermal-hydrological-chemical (THC) processes in the drift. These models also consider the uncertainty and variability in water chemistry inside the drift and the compositions of introduced materials within the drift. This report develops and documents a set of process- and abstraction-level models that constitute the engineered barrier system: physical and chemical environment model. Where possible, these models use information directly from other process model reports as input, which promotes integration among process models used for total system performance assessment. Specific tasks and activities of modeling the physical and chemical environment are included in the technical work plan ''Technical Work Plan for: In-Drift Geochemistry Modeling'' (BSC 2004 [DIRS 166519]). As described in the technical work plan, the development of this report is coordinated with the development of other engineered barrier system analysis model reports.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peters, Junenette L., E-mail: petersj@bu.edu; Patricia Fabian, M., E-mail: pfabian@bu.edu; Levy, Jonathan I., E-mail: jonlevy@bu.edu
High blood pressure is associated with exposure to multiple chemical and non-chemical risk factors, but epidemiological analyses to date have not assessed the combined effects of both chemical and non-chemical stressors on human populations in the context of cumulative risk assessment. We developed a novel modeling approach to evaluate the combined impact of lead, cadmium, polychlorinated biphenyls (PCBs), and multiple non-chemical risk factors on four blood pressure measures using data for adults aged ≥20 years from the National Health and Nutrition Examination Survey (1999–2008). We developed predictive models for chemical and other stressors. Structural equation models were applied to account for complex associations among predictors of stressors as well as blood pressure. Models showed that blood lead, serum PCBs, and established non-chemical stressors were significantly associated with blood pressure. Lead was the chemical stressor most predictive of diastolic blood pressure and mean arterial pressure, while PCBs had a greater influence on systolic blood pressure and pulse pressure, and blood cadmium was not a significant predictor of blood pressure. The simultaneously fit exposure models explained 34%, 43% and 52% of the variance for lead, cadmium and PCBs, respectively. The structural equation models were developed using predictors available from public data streams (e.g., U.S. Census), which would allow the models to be applied to any U.S. population exposed to these multiple stressors in order to identify high risk subpopulations, direct intervention strategies, and inform public policy. - Highlights: • We evaluated joint impact of chemical and non-chemical stressors on blood pressure. • We built predictive models for lead, cadmium and polychlorinated biphenyls (PCBs). • Our approach allows joint evaluation of predictors from population-specific data. • Lead, PCBs and established non-chemical stressors were related to blood pressure. • Framework allows cumulative risk assessment in specific geographic settings.
Cunningham, Albert R.; Trent, John O.
2012-01-01
Structure–activity relationship (SAR) models are powerful tools to investigate the mechanisms of action of chemical carcinogens and to predict the potential carcinogenicity of untested compounds. We describe the use of a traditional fragment-based SAR approach along with a new virtual ligand-protein interaction-based approach for modeling of nonmutagenic carcinogens. The ligand-based SAR models used descriptors derived from computationally calculated ligand-binding affinities for learning set agents to 5495 proteins. Two learning sets were developed. One set was from the Carcinogenic Potency Database, where chemicals tested for rat carcinogenesis along with Salmonella mutagenicity data were provided. The second was from Malacarne et al. who developed a learning set of nonalerting compounds based on rodent cancer bioassay data and Ashby’s structural alerts. When the rat cancer models were categorized based on mutagenicity, the traditional fragment model outperformed the ligand-based model. However, when the learning sets were composed solely of nonmutagenic or nonalerting carcinogens and noncarcinogens, the fragment model demonstrated a concordance of near 50%, whereas the ligand-based models demonstrated a concordance of 71% for nonmutagenic carcinogens and 74% for nonalerting carcinogens. Overall, these findings suggest that expert system analysis of virtual chemical protein interactions may be useful for developing predictive SAR models for nonmutagenic carcinogens. Moreover, a more practical approach for developing SAR models for carcinogenesis may include fragment-based models for chemicals testing positive for mutagenicity and ligand-based models for chemicals devoid of DNA reactivity. PMID:22678118
Wignall, Jessica A; Muratov, Eugene; Sedykh, Alexander; Guyton, Kathryn Z; Tropsha, Alexander; Rusyn, Ivan; Chiu, Weihsueh A
2018-05-01
Human health assessments synthesize human, animal, and mechanistic data to produce toxicity values that are key inputs to risk-based decision making. Traditional assessments are data-, time-, and resource-intensive, and they cannot be developed for most environmental chemicals owing to a lack of appropriate data. As recommended by the National Research Council, we propose a solution for predicting toxicity values for data-poor chemicals through development of quantitative structure-activity relationship (QSAR) models. We used a comprehensive database of chemicals with existing regulatory toxicity values from U.S. federal and state agencies to develop quantitative QSAR models. We compared QSAR-based model predictions to those based on high-throughput screening (HTS) assays. QSAR models for noncancer threshold-based values and cancer slope factors had cross-validation-based Q² of 0.25–0.45, mean model errors of 0.70–1.11 log10 units, and applicability domains covering >80% of environmental chemicals. Toxicity values predicted from QSAR models developed in this study were more accurate and precise than those based on HTS assays or mean-based predictions. A publicly accessible web interface to make predictions for any chemical of interest is available at http://toxvalue.org. An in silico tool that can predict toxicity values with an uncertainty of an order of magnitude or less can be used to quickly and quantitatively assess risks of environmental chemicals when traditional toxicity data or human health assessments are unavailable. This tool can fill a critical gap in the risk assessment and management of data-poor chemicals. https://doi.org/10.1289/EHP2998.
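The cross-validated Q² reported for these models is defined as Q² = 1 − PRESS/SS_tot, where PRESS accumulates squared prediction errors on held-out chemicals. A minimal leave-one-out sketch for a one-descriptor model follows; the descriptor and toxicity values are hypothetical, not from the study's database.

```python
# Leave-one-out cross-validated Q² = 1 - PRESS / SS_tot for a toy
# one-descriptor QSAR regression, refit with the i-th chemical held out.

def fit_line(xs, ys):
    """Ordinary least squares for y = slope * x + intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def loo_q2(xs, ys):
    my = sum(ys) / len(ys)
    press = 0.0
    for i in range(len(xs)):
        a, b = fit_line(xs[:i] + xs[i + 1:], ys[:i] + ys[i + 1:])
        press += (ys[i] - (a * xs[i] + b)) ** 2
    ss_tot = sum((y - my) ** 2 for y in ys)
    return 1.0 - press / ss_tot

xs = [0.5, 1.0, 1.8, 2.4, 3.1, 3.9]   # hypothetical descriptor values
ys = [1.1, 1.9, 3.2, 4.4, 5.8, 7.1]   # hypothetical log toxicity values
print(f"Q2 = {loo_q2(xs, ys):.2f}")
```

A Q² near 1 indicates strong predictivity on held-out chemicals; the study's values of 0.25–0.45 reflect the much noisier relationship between structure and in vivo toxicity values.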
NASA Technical Reports Server (NTRS)
Prinn, Ronald G.
2001-01-01
For interpreting observational data, and in particular for use in inverse methods, accurate and realistic chemical transport models are essential. Toward this end we have, in recent years, helped develop and utilize a number of three-dimensional models including the Model for Atmospheric Transport and Chemistry (MATCH).
Prediction of Chemical Function: Model Development and Application
The United States Environmental Protection Agency’s Exposure Forecaster (ExpoCast) project is developing both statistical and mechanism-based computational models for predicting exposures to thousands of chemicals, including those in consumer products. The high-throughput (...
The Dietary Exposure Potential Model (DEPM) is a computer-based model developed for estimating dietary exposure to chemical residues in food. The DEPM is based on food consumption data from the 1987-1988 Nationwide Food Consumption Survey (NFCS) administered by the United States ...
The mode of toxic action (MOA) has been recognized as a key determinant of chemical toxicity and as an alternative to chemical class-based predictive toxicity modeling. However, the development of quantitative structure activity relationship (QSAR) and other models has been limit...
Some aspects of mathematical and chemical modeling of complex chemical processes
NASA Technical Reports Server (NTRS)
Nemes, I.; Botar, L.; Danoczy, E.; Vidoczy, T.; Gal, D.
1983-01-01
Some theoretical questions involved in the mathematical modeling of the kinetics of complex chemical process are discussed. The analysis is carried out for the homogeneous oxidation of ethylbenzene in the liquid phase. Particular attention is given to the determination of the general characteristics of chemical systems from an analysis of mathematical models developed on the basis of linear algebra.
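A kinetic scheme of first-order steps can always be written as the linear system dc/dt = K c, which is what makes linear-algebra analysis of its structure possible. The sketch below integrates a small hypothetical chain A → B → C; the rate constants are illustrative, not values from the ethylbenzene oxidation study.

```python
# First-order reaction chain A -> B -> C as the linear system dc/dt = K c.
# Each column of K sums to zero, so total mass is conserved exactly.

k1, k2 = 0.5, 0.2          # hypothetical rate constants (1/s)
K = [[-k1, 0.0, 0.0],      # d[A]/dt = -k1 A
     [ k1, -k2, 0.0],      # d[B]/dt =  k1 A - k2 B
     [0.0,  k2, 0.0]]      # d[C]/dt =  k2 B

def step(c, dt):
    """One explicit-Euler step of dc/dt = K c."""
    return [ci + dt * sum(K[i][j] * c[j] for j in range(3))
            for i, ci in enumerate(c)]

c = [1.0, 0.0, 0.0]        # start with pure A
for _ in range(1000):
    c = step(c, 0.01)      # integrate to t = 10 s

print(c)                   # A nearly consumed, most mass now in C
print(sum(c))              # mass balance follows from the linear structure
```

General characteristics of such systems (time scales, conserved quantities) fall out of the eigenvalues and null space of K, which is the linear-algebra analysis the abstract refers to.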
MODELING THREE-DIMENSIONAL SUBSURFACE FLOW, FATE AND TRANSPORT OF MICROBES AND CHEMICALS (3DFATMIC)
A three-dimensional model simulating the subsurface flow, microbial growth and degradation, microbial-chemical reaction, and transport of microbes and chemicals has been developed. The model is designed to solve the coupled flow and transport equations. Basically, the saturated-uns...
The development of physiologically based toxicokinetic (PBTK) models for hydrophobic chemicals in fish requires: 1) an understanding of chemical efflux at fish gills; 2) knowledge of the factors that limit chemical exchange between blood and tissues; and, 3) a mechanistic descrip...
Program of research in severe storms
NASA Technical Reports Server (NTRS)
1979-01-01
Two modeling areas, the development of a mesoscale chemistry-meteorology interaction model, and the development of a combined urban chemical kinetics-transport model are examined. The problems associated with developing a three dimensional combined meteorological-chemical kinetics computer program package are defined. A similar three dimensional hydrostatic real time model which solves the fundamental Navier-Stokes equations for nonviscous flow is described. An urban air quality simulation model, developed to predict the temporal and spatial distribution of reactive and nonreactive gases in and around an urban area and to support a remote sensor evaluation program is reported.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wittwehr, Clemens; Aladjov, Hristo; Ankley, Gerald
Efforts are underway to transform regulatory toxicology and chemical safety assessment from a largely empirical science based on direct observation of apical toxicity outcomes in whole organism toxicity tests to a predictive one in which outcomes and risk are inferred from accumulated mechanistic understanding. The adverse outcome pathway (AOP) framework has emerged as a systematic approach for organizing knowledge that supports such inference. We argue that this systematic organization of knowledge can inform and help direct the design and development of computational prediction models that can further enhance the utility of mechanistic and in silico data for chemical safety assessment. Examples of AOP-informed model development and its application to the assessment of chemicals for skin sensitization and multiple modes of endocrine disruption are provided. The role of problem formulation, not only as a critical phase of risk assessment, but also as a guide for both AOP and complementary model development, is described. Finally, a proposal for actively engaging the modeling community in AOP-informed computational model development is made. The contents serve as a vision for how AOPs can be leveraged to facilitate development of computational prediction models needed to support the next generation of chemical safety assessment.
Card, Marcella L; Gomez-Alvarez, Vicente; Lee, Wen-Hsiung; Lynch, David G; Orentas, Nerija S; Lee, Mari Titcombe; Wong, Edmund M; Boethling, Robert S
2017-03-22
Chemical property estimation is a key component in many industrial, academic, and regulatory activities, including in the risk assessment associated with the approximately 1000 new chemical pre-manufacture notices the United States Environmental Protection Agency (US EPA) receives annually. The US EPA evaluates fate, exposure and toxicity under the 1976 Toxic Substances Control Act (amended by the 2016 Frank R. Lautenberg Chemical Safety for the 21st Century Act), which does not require test data with new chemical applications. Though the submission of data is not required, the US EPA has, over the past 40 years, occasionally received chemical-specific data with pre-manufacture notices. The US EPA has been actively using this and publicly available data to develop and refine predictive computerized models, most of which are housed in EPI Suite™, to estimate chemical properties used in the risk assessment of new chemicals. The US EPA develops and uses models based on (quantitative) structure-activity relationships ([Q]SARs) to estimate critical parameters. As in any evolving field, (Q)SARs have experienced successes, suffered failures, and responded to emerging trends. Correlations of a chemical structure with its properties or biological activity were first demonstrated in the late 19th century and today have been encapsulated in a myriad of quantitative and qualitative SARs. The development and proliferation of the personal computer in the late 20th century gave rise to a quickly increasing number of property estimation models, and continually improved computing power and connectivity among researchers via the internet are enabling the development of increasingly complex models.
Chemical leasing--a review of implementation in the past decade.
Moser, Frank; Jakl, Thomas
2015-04-01
In the past decade, research on innovative business models to manage the risk of chemical substances has sought to provide solutions to achieve the goals of the World Summit on Sustainable Development of 2002, which called for a renewal of the commitment to the sound management of chemicals and of hazardous wastes throughout their life cycle and set the ambitious goal, by 2020, to use and produce chemicals in ways that do not lead to significant adverse effects on human health and the environment. Chemical Leasing is an innovative business model that shows a great potential to become a global model for sustainable development within chemical management. This paper provides a review of the current standing of the literature regarding the implementation of Chemical Leasing in the past decade. In doing so, the paper highlights the potential of this business model to serve as an approach for dematerializing production processes and managing the risks of chemicals at all levels. In more detail, it provides an outline of how Chemical Leasing has supported the alignment and implementation of the objectives of chemicals policy-makers and industry regarding the production and use of chemicals, and analyses to what extent Chemical Leasing contributes to the implementation of a number of voluntary global initiatives, such as Cleaner Production, Sustainable Chemistry and Corporate Social Responsibility. This paper provides a systematic analysis of the gaps identified in the literature regarding the implementation of Chemical Leasing business models. Based on this analysis, specific aspects in the field of Chemical Leasing are recommended to be further elaborated in order to increase the understanding and applicability of the business model.
Jin, Xiaochen; Fu, Zhiqiang; Li, Xuehua; Chen, Jingwen
2017-03-22
The octanol-air partition coefficient (KOA) is a key parameter describing the partition behavior of organic chemicals between air and environmental organic phases. As the experimental determination of KOA is costly, time-consuming and sometimes limited by the availability of authentic chemical standards for the compounds to be determined, it becomes necessary to develop credible predictive models for KOA. In this study, a polyparameter linear free energy relationship (pp-LFER) model for predicting KOA at 298.15 K and a novel model incorporating pp-LFERs with temperature (pp-LFER-T model) were developed from 795 log KOA values for 367 chemicals at different temperatures (263.15-323.15 K), and were evaluated with the OECD guidelines on QSAR model validation and applicability domain description. Statistical results show that both models are well-fitted, robust and have good predictive capabilities. Particularly, the pp-LFER model shows a strong predictive ability for polyfluoroalkyl substances and organosilicon compounds, and the pp-LFER-T model maintains a high predictive accuracy within a wide temperature range (263.15-323.15 K).
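A pp-LFER of the Abraham form predicts log KOA as a linear combination of solute descriptors, log KOA = c + e·E + s·S + a·A + b·B + l·L. The evaluation step can be sketched directly; every coefficient and descriptor value below is a placeholder, not a fitted value from the study.

```python
# Evaluate an Abraham-type pp-LFER: log KOA = c + e*E + s*S + a*A + b*B + l*L.
# Coefficients characterize the octanol-air system; descriptors characterize
# the solute. All numbers here are hypothetical placeholders.

def log_koa(coeffs, descriptors):
    """coeffs: (c, e, s, a, b, l); descriptors: solute (E, S, A, B, L)."""
    c, *weights = coeffs
    return c + sum(w * d for w, d in zip(weights, descriptors))

coeffs = (-0.20, 0.50, 1.60, 3.60, 0.90, 0.75)   # hypothetical system coefficients
solute = (0.61, 0.51, 0.0, 0.15, 3.94)           # hypothetical E, S, A, B, L
print(f"predicted log KOA = {log_koa(coeffs, solute):.2f}")
```

The pp-LFER-T variant adds temperature-dependent terms so one equation covers the full 263.15–323.15 K range; fitting either model is an ordinary multiple linear regression over the measured log KOA values.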
In vitro screening of chemicals for bioactivity together with computational modeling are beginning to replace animal toxicity testing in support of chemical risk assessment. To facilitate this transition, an amphibian thyroid axis model has been developed to describe thyroid home...
Numerical modeling tools for chemical vapor deposition
NASA Technical Reports Server (NTRS)
Jasinski, Thomas J.; Childs, Edward P.
1992-01-01
Development of general numerical simulation tools for chemical vapor deposition (CVD) was the objective of this study. Physical models of important CVD phenomena were developed and implemented into the commercial computational fluid dynamics software FLUENT. The resulting software can address general geometries as well as the most important phenomena occurring within CVD reactors: fluid flow patterns, temperature and chemical species distribution, and gas-phase and surface deposition. The available physical models are documented, and examples of CVD simulation capabilities are provided.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alves, Vinicius M.; Muratov, Eugene
Repetitive exposure to a chemical agent can induce an immune reaction in inherently susceptible individuals that leads to skin sensitization. Although many chemicals have been reported as skin sensitizers, there have been very few rigorously validated QSAR models with defined applicability domains (AD) that were developed using a large group of chemically diverse compounds. In this study, we have aimed to compile, curate, and integrate the largest publicly available dataset related to chemically-induced skin sensitization, use this data to generate rigorously validated QSAR models for skin sensitization, and employ these models as a virtual screening tool for identifying putative sensitizers among environmental chemicals. We followed best practices for model building and validation implemented with our predictive QSAR workflow using the Random Forest modeling technique in combination with SiRMS and Dragon descriptors. The Correct Classification Rate (CCR) for QSAR models discriminating sensitizers from non-sensitizers was 71–88% when evaluated on several external validation sets, within a broad AD, with positive (for sensitizers) and negative (for non-sensitizers) predicted rates of 85% and 79% respectively. When compared to the skin sensitization module included in the OECD QSAR Toolbox as well as to the skin sensitization model in publicly available VEGA software, our models showed a significantly higher prediction accuracy for the same sets of external compounds as evaluated by Positive Predicted Rate, Negative Predicted Rate, and CCR. These models were applied to identify putative chemical hazards in the Scorecard database of possible skin or sense organ toxicants as primary candidates for experimental validation. - Highlights: • The largest publicly available skin sensitization dataset was compiled. • Predictive QSAR models were developed for skin sensitization. • Developed models have higher prediction accuracy than OECD QSAR Toolbox. • Putative chemical hazards in the Scorecard database were found using our models.
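The CCR used to score these models is a balanced accuracy: the mean of the true-positive rate on sensitizers and the true-negative rate on non-sensitizers, which keeps an unbalanced external set from inflating the score. A minimal sketch, with hypothetical predictions rather than Random Forest output:

```python
# Correct Classification Rate (CCR): mean of sensitivity (true-positive rate
# on sensitizers) and specificity (true-negative rate on non-sensitizers).

def ccr(y_true, y_pred):
    pos = [(t, p) for t, p in zip(y_true, y_pred) if t == 1]
    neg = [(t, p) for t, p in zip(y_true, y_pred) if t == 0]
    sens = sum(1 for t, p in pos if p == 1) / len(pos)
    spec = sum(1 for t, p in neg if p == 0) / len(neg)
    return 0.5 * (sens + spec)

# Hypothetical external-set labels (1 = sensitizer, 0 = non-sensitizer).
y_true = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
y_pred = [1, 1, 1, 0, 0, 0, 0, 0, 1, 1]
print(f"CCR = {ccr(y_true, y_pred):.2f}")
```

On this toy set the model gets 7 of 10 chemicals right, but the CCR weighs the two classes equally regardless of their sizes.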
Advanced Modeling Techniques to Study Anthropogenic Influences on Atmospheric Chemical Budgets
NASA Technical Reports Server (NTRS)
Mathur, Rohit
1997-01-01
This research work is a collaborative effort between research groups at MCNC and the University of North Carolina at Chapel Hill. The overall objective of this research is to improve the level of understanding of the processes that determine the budgets of chemically and radiatively active compounds in the atmosphere through development and application of advanced methods for calculating the chemical change in atmospheric models. The research performed during the second year of this project focused on four major aspects: (1) The continued development and refinement of multiscale modeling techniques to address the issue of the disparate scales of the physico-chemical processes that govern the fate of atmospheric pollutants; (2) Development and application of analysis methods utilizing process and mass balance techniques to increase the interpretive powers of atmospheric models and to aid in complementary analysis of model predictions and observations; (3) Development of meteorological and emission inputs for initial application of the chemistry/transport model over the north Atlantic region; and, (4) The continued development and implementation of a totally new adaptive chemistry representation that changes the details of what is represented as the underlying conditions change.
High Throughput Exposure Modeling of Semi-Volatile Chemicals in Articles of Commerce (ACS)
Risk due to chemical exposure is a function of both chemical hazard and exposure. Near-field exposures to chemicals in consumer products are identified as the main drivers of exposure and yet are not well quantified or understood. The ExpoCast project is developing a model that e...
The U.S. EPA’s Chemical Safety and Sustainability research program is developing the Human Exposure Model (HEM) to assess near-field exposures to chemicals that occur in various populations over the entire life cycle of a consumer product. The model will be implemented as a...
How adverse outcome pathways can aid the development and ...
Efforts are underway to transform regulatory toxicology and chemical safety assessment from a largely empirical science based on direct observation of apical toxicity outcomes in whole organism toxicity tests to a predictive one in which outcomes and risk are inferred from accumulated mechanistic understanding. The adverse outcome pathway (AOP) framework has emerged as a systematic approach for organizing knowledge that supports such inference. We argue that this systematic organization of knowledge can inform and help direct the design and development of computational prediction models that can further enhance the utility of mechanistic and in silico data for chemical safety assessment. Examples of AOP-informed model development and its application to the assessment of chemicals for skin sensitization and multiple modes of endocrine disruption are provided. The role of problem formulation, not only as a critical phase of risk assessment, but also as a guide for both AOP and complementary model development, is described. Finally, a proposal for actively engaging the modeling community in AOP-informed computational model development is made. The contents serve as a vision for how AOPs can be leveraged to facilitate development of computational prediction models needed to support the next generation of chemical safety assessment. The present manuscript reports on expert opinion and case studies that came out of a European Commission, Joint Research Centre-sponsored work
Rusyn, Ivan; Sedykh, Alexander; Guyton, Kathryn Z.; Tropsha, Alexander
2012-01-01
Quantitative structure-activity relationship (QSAR) models are widely used for in silico prediction of in vivo toxicity of drug candidates or environmental chemicals, adding value to candidate selection in drug development or in a search for less hazardous and more sustainable alternatives for chemicals in commerce. The development of traditional QSAR models is enabled by numerical descriptors representing the inherent chemical properties that can be easily defined for any number of molecules; however, traditional QSAR models often have limited predictive power due to the lack of data and complexity of in vivo endpoints. Although it has been indeed difficult to obtain experimentally derived toxicity data on a large number of chemicals in the past, the results of quantitative in vitro screening of thousands of environmental chemicals in hundreds of experimental systems are now available and continue to accumulate. In addition, publicly accessible toxicogenomics data collected on hundreds of chemicals provide another dimension of molecular information that is potentially useful for predictive toxicity modeling. These new characteristics of molecular bioactivity arising from short-term biological assays, i.e., in vitro screening and/or in vivo toxicogenomics data can now be exploited in combination with chemical structural information to generate hybrid QSAR–like quantitative models to predict human toxicity and carcinogenicity. Using several case studies, we illustrate the benefits of a hybrid modeling approach, namely improvements in the accuracy of models, enhanced interpretation of the most predictive features, and expanded applicability domain for wider chemical space coverage. PMID:22387746
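The hybrid modeling described above amounts to feature fusion: each chemical's structural descriptors and its short-term bioassay readouts are concatenated into one vector before model fitting. A minimal sketch of that fusion step, with hypothetical chemical names and values:

```python
# Hybrid QSAR-like feature fusion: combine structural descriptors with
# in vitro bioactivity readouts into one feature vector per chemical.
# All names and numbers below are hypothetical placeholders.

chem_descriptors = {"chemA": [0.31, 1.20], "chemB": [0.85, 0.40]}
invitro_activity = {"chemA": [0.02, 0.77, 0.10], "chemB": [0.90, 0.05, 0.64]}

def hybrid_features(chemical):
    """Concatenate structural and bioactivity descriptors for one chemical."""
    return chem_descriptors[chemical] + invitro_activity[chemical]

X = [hybrid_features(c) for c in ("chemA", "chemB")]
print(X)   # each row now carries both structure and bioactivity information
```

The fused matrix X then feeds any standard QSAR learner; the case studies report that this widens the applicability domain relative to structure-only descriptors.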
Di Guardo, Antonio; Gouin, Todd; MacLeod, Matthew; Scheringer, Martin
2018-01-24
Environmental fate and exposure models are a powerful means to integrate information on chemicals, their partitioning and degradation behaviour, the environmental scenario and the emissions in order to compile a picture of chemical distribution and fluxes in the multimedia environment. A 1995 pioneering book, resulting from a series of workshops among model developers and users, reported the main advantages and identified needs for research in the field of multimedia fate models. Considerable efforts were devoted to their improvement in the past 25 years and many aspects were refined; notably the inclusion of nanomaterials among the modelled substances, the development of models at different spatial and temporal scales, the estimation of chemical properties and emission data, the incorporation of additional environmental media and processes, the integration of sensitivity and uncertainty analysis in the simulations. However, some challenging issues remain and require research efforts and attention: the need of methods to estimate partition coefficients for polar and ionizable chemical in the environment, a better description of bioavailability in different environments as well as the requirement of injecting more ecological realism in exposure predictions to account for the diversity of ecosystem structures and functions in risk assessment. Finally, to transfer new scientific developments into the realm of regulatory risk assessment, we propose the formation of expert groups that compare, discuss and recommend model modifications and updates and help develop practical tools for risk assessment.
Using model-based screening to help discover unknown environmental contaminants.
McLachlan, Michael S; Kierkegaard, Amelie; Radke, Michael; Sobek, Anna; Malmvärn, Anna; Alsberg, Tomas; Arnot, Jon A; Brown, Trevor N; Wania, Frank; Breivik, Knut; Xu, Shihe
2014-07-01
Of the tens of thousands of chemicals in use, only a small fraction have been analyzed in environmental samples. To effectively identify environmental contaminants, methods to prioritize chemicals for analytical method development are required. We used a high-throughput model of chemical emissions, fate, and bioaccumulation to identify chemicals likely to have high concentrations in specific environmental media, and we prioritized these for target analysis. This model-based screening was applied to 215 organosilicon chemicals culled from industrial chemical production statistics. The model-based screening prioritized several recognized organosilicon contaminants and generated hypotheses leading to the selection of three chemicals that have not previously been identified as potential environmental contaminants for target analysis. Trace analytical methods were developed, and the chemicals were analyzed in air, sewage sludge, and sediment. All three substances were found to be environmental contaminants. Phenyl-tris(trimethylsiloxy)silane was present in all samples analyzed, with concentrations of ∼50 pg m⁻³ in Stockholm air and ∼0.5 ng g⁻¹ dw in sediment from the Stockholm archipelago. Tris(trifluoropropyl)trimethyl-cyclotrisiloxane and tetrakis(trifluoropropyl)tetramethyl-cyclotetrasiloxane were found in sediments from Lake Mjøsa at ∼1 ng g⁻¹ dw. The discovery of three novel environmental contaminants shows that models can be useful for prioritizing chemicals for exploratory assessment.
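The prioritization idea behind such model-based screening can be sketched in a few lines. This is an illustrative toy, not the authors' emission/fate/bioaccumulation model: chemicals are ranked by a crude exposure score, here emission rate times persistence, and all names and values below are hypothetical.

```python
# Illustrative sketch of model-based screening: rank chemicals for target
# analysis by a crude exposure proxy (steady-state burden ~ emission rate
# x residence time). All chemical names and values are hypothetical.
def exposure_score(emission_t_per_yr, half_life_days):
    return emission_t_per_yr * half_life_days

chemicals = {
    "siloxane_A": (120.0, 40.0),   # (tonnes/yr emitted, half-life in days)
    "siloxane_B": (15.0, 300.0),
    "siloxane_C": (500.0, 2.0),    # high emission but very short-lived
}

# Highest-priority candidates for analytical method development come first.
ranked = sorted(chemicals, key=lambda c: exposure_score(*chemicals[c]), reverse=True)
print(ranked)  # → ['siloxane_A', 'siloxane_B', 'siloxane_C']
```

A real screening model would replace `exposure_score` with multimedia fate and bioaccumulation calculations, but the ranking step is the same.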
Use of statistical and neural net approaches in predicting toxicity of chemicals.
Basak, S C; Grunwald, G D; Gute, B D; Balasubramanian, K; Opitz, D
2000-01-01
Hierarchical quantitative structure-activity relationships (H-QSAR) have been developed as a new approach in constructing models for estimating physicochemical, biomedicinal, and toxicological properties of interest. This approach uses increasingly more complex molecular descriptors in a graduated approach to model building. In this study, statistical and neural network methods have been applied to the development of H-QSAR models for estimating the acute aquatic toxicity (LC50) of 69 benzene derivatives to Pimephales promelas (fathead minnow). Topostructural, topochemical, geometrical, and quantum chemical indices were used as the four levels of the hierarchical method. It is clear from both the statistical and neural network models that topostructural indices alone cannot adequately model this set of congeneric chemicals. Not surprisingly, topochemical indices greatly increase the predictive power of both statistical and neural network models. Quantum chemical indices also add significantly to the modeling of this set of acute aquatic toxicity data.
CHOOSING A CHEMICAL MECHANISM FOR REGULATORY AND RESEARCH AIR QUALITY MODELING APPLICATIONS
There are numerous, different chemical mechanisms currently available for use in air quality models, and new mechanisms and versions of mechanisms are continually being developed. The development of Morphecule-type mechanisms will add a near-infinite number of additional mecha...
Researchers facilitated evaluation of chemicals that lack chronic oral toxicity values using a QSAR model to develop estimates of potential toxicity for chemicals used in HF fluids or found in flowback or produced water
DOE Office of Scientific and Technical Information (OSTI.GOV)
McPherson, Brian J.; Pan, Feng
2014-09-24
This report summarizes development of a coupled-process reservoir model for simulating enhanced geothermal systems (EGS) that utilize supercritical carbon dioxide as a working fluid. Specifically, the project team developed an advanced chemical kinetic model for evaluating important processes in EGS reservoirs, such as mineral precipitation and dissolution at elevated temperature and pressure, and for evaluating potential impacts on EGS surface facilities by related chemical processes. We assembled a new database for better-calibrated simulation of water/brine/rock/CO2 interactions in EGS reservoirs. This database utilizes existing kinetic and other chemical data, and we updated those data to reflect corrections for elevated temperature and pressure conditions of EGS reservoirs.
Brinkmann, Markus; Schlechtriem, Christian; Reininghaus, Mathias; Eichbaum, Kathrin; Buchinger, Sebastian; Reifferscheid, Georg; Hollert, Henner; Preuss, Thomas G
2016-02-16
The potential to bioconcentrate is generally considered to be an unwanted property of a substance. Consequently, chemical legislation, including the European REACH regulations, requires the chemical industry to provide bioconcentration data for chemicals that are produced or imported at volumes exceeding 100 tons per annum or if there is a concern that a substance is persistent, bioaccumulative, and toxic. To fill the existing data gap for chemicals produced or imported at levels below this stipulated volume, without the need for additional animal experiments, physiologically-based toxicokinetic (PBTK) models can be used to predict whole-body and tissue concentrations of neutral organic chemicals in fish. PBTK models have been developed for many different fish species with promising results. In this study, we developed PBTK models for zebrafish (Danio rerio) and roach (Rutilus rutilus) and combined them with existing models for rainbow trout (Oncorhynchus mykiss), lake trout (Salvelinus namaycush), and fathead minnow (Pimephales promelas). The resulting multispecies model framework allows for cross-species extrapolation of the bioaccumulative potential of neutral organic compounds. Predictions were compared with experimental data and were accurate for most substances. Our model can be used for probabilistic risk assessment of chemical bioaccumulation, with particular emphasis on cross-species evaluations.
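The simplest kinetic core shared by such fish toxicokinetic models can be sketched as a one-compartment model, a deliberate simplification of a full multi-tissue PBTK model: uptake from water at rate k1 and elimination at rate k2, giving a steady-state bioconcentration factor of k1/k2. The rate constants below are hypothetical.

```python
# One-compartment toxicokinetic sketch (a simplification of a full PBTK
# model): dCf/dt = k1*Cw - k2*Cf. At steady state, Cf = (k1/k2)*Cw,
# i.e. the kinetic bioconcentration factor is k1/k2.
def simulate_fish(k1, k2, cw, t_end, dt=0.01):
    """Forward-Euler integration of fish body burden Cf (mg/kg)."""
    cf, t = 0.0, 0.0
    while t < t_end:
        cf += dt * (k1 * cw - k2 * cf)
        t += dt
    return cf

k1, k2, cw = 100.0, 0.5, 1e-3      # L/kg/d, 1/d, mg/L -- hypothetical values
bcf_kinetic = k1 / k2              # steady-state BCF = 200 L/kg
cf = simulate_fish(k1, k2, cw, t_end=20.0)
print(bcf_kinetic, round(cf, 3))   # cf approaches bcf_kinetic * cw = 0.2
```

Cross-species extrapolation in the multispecies framework amounts to swapping in species-specific physiological parameters while keeping the mass-balance structure unchanged.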
DOE Office of Scientific and Technical Information (OSTI.GOV)
Purdy, R.
A hierarchical model consisting of quantitative structure-activity relationships based mainly on chemical reactivity was developed to predict the carcinogenicity of organic chemicals to rodents. The model comprises quantitative structure-activity relationships, QSARs based on hypothesized mechanisms of action, metabolism, and partitioning. Predictors included octanol/water partition coefficient, molecular size, atomic partial charge, bond angle strain, atomic acceptor delocalizability, atomic radical superdelocalizability, the lowest unoccupied molecular orbital (LUMO) energy of the hypothesized intermediate nitrenium ion of primary aromatic amines, difference in charge of ionized and unionized carbon-chlorine bonds, substituent size and pattern on polynuclear aromatic hydrocarbons, the distance between lone electron pairs over a rigid structure, and the presence of functionalities such as nitroso and hydrazine. The model correctly classified 96% of the carcinogens in the training set of 306 chemicals, and 90% of the carcinogens in the test set of 301 chemicals. The test set by chance contained 84% of the positive thio-containing chemicals. A QSAR for these chemicals was developed. This post-test-set modified model correctly predicted 94% of the carcinogens in the test set. This model was used to predict the carcinogenicity of the 25 organic chemicals the U.S. National Toxicology Program was testing at the writing of this article. 12 refs., 3 tabs.
Jarvis, J; Seed, M; Elton, R; Sawyer, L; Agius, R
2005-01-01
Aims: To quantitatively investigate relationships between chemical structure and reported occupational asthma hazard for low molecular weight (LMW) organic compounds; to develop and validate a model linking asthma hazard with chemical substructure; and to generate mechanistic hypotheses that might explain the relationships. Methods: A learning dataset used 78 LMW chemical asthmagens reported in the literature before 1995, and 301 control compounds with recognised occupational exposures and hazards other than respiratory sensitisation. The chemical structures of the asthmagens and control compounds were characterised by the presence of chemical substructure fragments. Odds ratios were calculated for these fragments to determine which were associated with a likelihood of being reported as an occupational asthmagen. Logistic regression modelling was used to identify the independent contribution of these substructures. A post-1995 set of 21 asthmagens and 77 controls were selected to externally validate the model. Results: Nitrogen or oxygen containing functional groups such as isocyanate, amine, acid anhydride, and carbonyl were associated with an occupational asthma hazard, particularly when the functional group was present twice or more in the same molecule. A logistic regression model using only statistically significant independent variables for occupational asthma hazard correctly assigned 90% of the model development set. The external validation showed a sensitivity of 86% and specificity of 99%. Conclusions: Although a wide variety of chemical structures are associated with occupational asthma, bifunctional reactivity is strongly associated with occupational asthma hazard across a range of chemical substructures. This suggests that chemical cross-linking is an important molecular mechanism leading to the development of occupational asthma.
The logistic regression model is freely available on the internet and may offer a useful but inexpensive adjunct to the prediction of occupational asthma hazard. PMID:15778257
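The two core statistics of this study, fragment odds ratios from the learning dataset and sensitivity/specificity from the external validation, are straightforward to compute. The fragment counts below are hypothetical illustrations (only the set sizes, 78 asthmagens / 301 controls and 21 / 77 in validation, come from the abstract), not the paper's actual data.

```python
# Odds ratio for a substructure fragment from a 2x2 table:
# (asthmagens with fragment / without) / (controls with fragment / without).
def odds_ratio(asth_with, asth_without, ctrl_with, ctrl_without):
    return (asth_with / asth_without) / (ctrl_with / ctrl_without)

# Hypothetical fragment counts within the reported 78 asthmagens / 301 controls:
or_fragment = odds_ratio(20, 58, 3, 298)
print(round(or_fragment, 1))  # → 34.3

# External validation metrics from a 2x2 confusion table.
def sensitivity_specificity(tp, fn, tn, fp):
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts consistent with the 21 asthmagens / 77 controls set:
sens, spec = sensitivity_specificity(18, 3, 76, 1)
print(round(100 * sens), round(100 * spec))  # → 86 99
```

An odds ratio well above 1 flags a fragment as a candidate hazard marker; the logistic regression then isolates each fragment's independent contribution.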
Coupling biology and oceanography in models.
Fennel, W; Neumann, T
2001-08-01
The dynamics of marine ecosystems, i.e. the changes of observable chemical-biological quantities in space and time, are driven by biological and physical processes. Predictions of future developments of marine systems need a theoretical framework, i.e. models, solidly based on research and understanding of the different processes involved. The natural way to describe marine systems theoretically seems to be the embedding of chemical-biological models into circulation models. However, while circulation models are relatively advanced the quantitative theoretical description of chemical-biological processes lags behind. This paper discusses some of the approaches and problems in the development of consistent theories and indicates the beneficial potential of the coupling of marine biology and oceanography in models.
Process Model of A Fusion Fuel Recovery System for a Direct Drive IFE Power Reactor
NASA Astrophysics Data System (ADS)
Natta, Saswathi; Aristova, Maria; Gentile, Charles
2008-11-01
A task has been initiated to develop a detailed representative model for the fuel recovery system (FRS) in the prospective direct drive inertial fusion energy (IFE) reactor. As part of the conceptual design phase of the project, a chemical process model is developed in order to observe the interaction of system components. This process model is developed using FEMLAB Multiphysics software with the corresponding chemical engineering module (CEM). Initially, the reactants, system structure, and processes are defined using known chemical species of the target chamber exhaust. Each step within the fuel recovery system is modeled compartmentally and then merged to form the closed-loop fuel recovery system. The output, which includes physical properties and chemical content of the products, is analyzed after each step of the system to determine the most efficient and productive system parameters. This will serve to attenuate possible bottlenecks in the system. This modeling evaluation is instrumental in optimizing and closing the fusion fuel cycle in a direct drive IFE power reactor. The results of the modeling are presented in this paper.
A Bayesian network model for predicting aquatic toxicity mode ...
The mode of toxic action (MoA) has been recognized as a key determinant of chemical toxicity, but development of predictive MoA classification models in aquatic toxicology has been limited. We developed a Bayesian network model to classify aquatic toxicity MoA using a recently published dataset containing over one thousand chemicals with MoA assignments for aquatic animal toxicity. Two-dimensional theoretical chemical descriptors were generated for each chemical using the Toxicity Estimation Software Tool. The model was developed through augmented Markov blanket discovery from the dataset of 1098 chemicals with the MoA broad classifications as a target node. From cross validation, the overall precision for the model was 80.2%. The best precision was for the AChEI MoA (93.5%) where 257 chemicals out of 275 were correctly classified. Model precision was poorest for the reactivity MoA (48.5%) where 48 out of 99 reactive chemicals were correctly classified. Narcosis represented the largest class within the MoA dataset and had a precision and reliability of 80.0%, reflecting the global precision across all of the MoAs. False negatives for narcosis most often fell into electron transport inhibition, neurotoxicity or reactivity MoAs. False negatives for all other MoAs were most often narcosis. A probabilistic sensitivity analysis was undertaken for each MoA to examine the sensitivity to individual and multiple descriptor findings. The results show that the Markov blank
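The per-class figures quoted above follow directly from the raw counts given in the abstract (correctly classified out of total for each MoA class):

```python
# Reproduce the quoted per-class percentages from the abstract's raw counts.
def pct_correct(correct, total):
    return 100.0 * correct / total

print(round(pct_correct(257, 275), 1))  # AChEI MoA → 93.5
print(round(pct_correct(48, 99), 1))    # reactivity MoA → 48.5
```

The gap between these two classes illustrates the abstract's point: acetylcholinesterase inhibition has a well-defined structural signature, while "reactivity" lumps together heterogeneous mechanisms that 2D descriptors separate poorly.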
MODELING MULTICOMPONENT ORGANIC CHEMICAL TRANSPORT IN THREE-FLUID-PHASE POROUS MEDIA
A two-dimensional finite-element model was developed to predict coupled transient flow and multicomponent transport of organic chemicals which can partition between NAPL, water, gas and solid phases in porous media under the assumption of local chemical equilibrium. Gas-phase pres...
MODELING MULTICOMPONENT ORGANIC CHEMICAL TRANSPORT IN THREE FLUID PHASE POROUS MEDIA
A two-dimensional finite-element model was developed to predict coupled transient flow and multicomponent transport of organic chemicals which can partition between nonaqueous phase liquid, water, gas and solid phases in porous media under the assumption of local chemical equilib...
Scott Andersson, Asa; Tysklind, Mats; Fängmark, Ingrid
2007-08-17
The environment consists of a variety of different compartments and processes that act together in a complex system, which complicates environmental risk assessment after a chemical accident. The Environment-Accident Index (EAI) is an example of a tool based on a strategy to join the properties of a chemical with site-specific properties to facilitate this assessment and to be used in the planning process. In the development of the EAI it is necessary to make an unbiased judgement of relevant variables to include in the formula and to estimate their relative importance. The development of the EAI has so far included the assimilation of chemical accidents, the selection of a representative set of chemical accidents, and the development of response values (representing effects in the environment after a chemical accident) by means of an expert panel. The developed responses were then related to the chemical and site-specific properties through a mathematical model based on multivariate modelling (PLS) to create an improved EAI model. This resulted in EAI(new), a PLS-based EAI model connected to a new classification scale. The advantages of EAI(new) compared to the old EAI (EAI(old)) are that it can be calculated without the use of tables, it can estimate the effects for all included responses, and it can make a rough classification of chemical accidents according to the new classification scale. Finally, EAI(new) is a more stable model than EAI(old), built on a valid base of accident scenarios, which makes it more reliable to use for a variety of chemicals and situations as it covers a broader spectrum of accident scenarios. EAI(new) can be expressed as a regression model to facilitate calculation of the index for persons who do not have access to PLS. Future work could include an external validation of EAI(new), completing the formula structure, adjusting the classification scale, and a real-life evaluation of EAI(new).
The use of mental models in chemical risk protection: developing a generic workplace methodology.
Cox, Patrick; Niewöhmer, Jörg; Pidgeon, Nick; Gerrard, Simon; Fischhoff, Baruch; Riley, Donna
2003-04-01
We adopted a comparative approach to evaluate and extend a generic methodology to analyze the different sets of beliefs held about chemical hazards in the workplace. Our study mapped existing knowledge structures about the risks associated with the use of perchloroethylene and rosin-based solder flux in differing workplaces. "Influence diagrams" were used to represent beliefs held by chemical experts; "user models" were developed from data elicited from open-ended interviews with the workplace users of the chemicals. The juxtaposition of expert and user understandings of chemical risks enabled us to identify knowledge gaps and misunderstandings and to reinforce appropriate sets of safety beliefs and behavior relevant to chemical risk communications. By designing safety information to be more relevant to the workplace context of users, we believe that employers and employees may gain improved knowledge about chemical hazards in the workplace, such that better chemical risk management, self-protection, and informed decision making develop over time.
Protocols for terrestrial bioaccumulation assessments are far less-developed than for aquatic systems. This manuscript reviews modeling approaches that can be used to assess the terrestrial bioaccumulation potential of commercial organic chemicals. Models exist for plant, inver...
NASA Astrophysics Data System (ADS)
Michael, R. A.; Stuart, A. L.
2007-12-01
Phase partitioning during freezing affects the transport and distribution of volatile chemical species in convective clouds. This consequently can have impacts on tropospheric chemistry, air quality, pollutant deposition, and climate change. Here, we discuss the development, evaluation, and application of a mechanistic model for the study and prediction of volatile chemical partitioning during steady-state hailstone growth. The model estimates the fraction of a chemical species retained in a two-phase freezing hailstone. It is based upon mass rate balances over water and solute for accretion under wet-growth conditions. Expressions for the calculation of model components, including the rates of super-cooled drop collection, shedding, evaporation, and hail growth were developed and implemented based on available cloud microphysics literature. Solute fate calculations assume equilibrium partitioning at air-liquid and liquid-ice interfaces. Currently, we are testing the model by performing mass balance calculations, sensitivity analyses, and comparison to available experimental data. Application of the model will improve understanding of the effects of cloud conditions and chemical properties on the fate of dissolved chemical species during hail growth.
Biasetti, Jacopo; Spazzini, Pier Giorgio; Swedenborg, Jesper; Gasser, T Christian
2012-01-01
Abdominal Aortic Aneurysms (AAAs) are frequently characterized by the presence of an Intra-Luminal Thrombus (ILT) known to influence their evolution biochemically and biomechanically. The ILT progression mechanism is still unclear and little is known regarding the impact of the chemical species transported by blood flow on this mechanism. Chemical agonists and antagonists of platelets activation, aggregation, and adhesion and the proteins involved in the coagulation cascade (CC) may play an important role in ILT development. Starting from this assumption, the evolution of chemical species involved in the CC, their relation to coherent vortical structures (VSs) and their possible effect on ILT evolution have been studied. To this end a fluid-chemical model that simulates the CC through a series of convection-diffusion-reaction (CDR) equations has been developed. The model involves plasma-phase and surface-bound enzymes and zymogens, and includes both plasma-phase and membrane-phase reactions. Blood is modeled as a non-Newtonian incompressible fluid. VSs convect thrombin in the domain and lead to the high concentration observed in the distal portion of the AAA. This finding is in line with the clinical observations showing that the thickest ILT is usually seen in the distal AAA region. The proposed model, due to its ability to couple the fluid and chemical domains, provides an integrated mechanochemical picture that potentially could help unveil mechanisms of ILT formation and development.
Improving plant bioaccumulation science through consistent reporting of experimental data.
Fantke, Peter; Arnot, Jon A; Doucette, William J
2016-10-01
Experimental data and models for plant bioaccumulation of organic contaminants play a crucial role for assessing the potential human and ecological risks associated with chemical use. Plants are receptor organisms and direct or indirect vectors for chemical exposures to all other organisms. As new experimental data are generated they are used to improve our understanding of plant-chemical interactions that in turn allows for the development of better scientific knowledge and conceptual and predictive models. The interrelationship between experimental data and model development is an ongoing, never-ending process needed to advance our ability to provide reliable quality information that can be used in various contexts including regulatory risk assessment. However, relatively few standard experimental protocols for generating plant bioaccumulation data are currently available and because of inconsistent data collection and reporting requirements, the information generated is often less useful than it could be for direct applications in chemical assessments and for model development and refinement. We review existing testing guidelines, common data reporting practices, and provide recommendations for revising testing guidelines and reporting requirements to improve bioaccumulation knowledge and models. This analysis provides a list of experimental parameters that will help to develop high quality datasets and support modeling tools for assessing bioaccumulation of organic chemicals in plants and ultimately addressing uncertainty in ecological and human health risk assessments.
In silico study of in vitro GPCR assays by QSAR modeling ...
The U.S. EPA is screening thousands of chemicals of environmental interest in hundreds of in vitro high-throughput screening (HTS) assays (the ToxCast program). One goal is to prioritize chemicals for more detailed analyses based on activity in molecular initiating events (MIE) of adverse outcome pathways (AOPs). However, the chemical space of interest for environmental exposure is much wider than this set of chemicals. Thus, there is a need to fill data gaps with in silico methods, and quantitative structure-activity relationships (QSARs) are a proven and cost effective approach to predict biological activity. ToxCast in turn provides relatively large datasets that are ideal for training and testing QSAR models. The overall goal of the study described here was to develop QSAR models to fill the data gaps in a larger environmental database of ~32k structures. The specific aim of the current work was to build QSAR models for 18 G-Protein Coupled Receptor (GPCR) assays, part of the aminergic category. Two QSAR modeling strategies were adopted: classification models were developed to separate chemicals into active/non-active classes, and then regression models were built to predict the potency values of the bioassays for the active chemicals. Multiple software programs were used to calculate constitutional, topological and substructural molecular descriptors from two-dimensional (2D) chemical structures. Model-fitting methods included PLSDA (partial least squares d
Development of Algal Interspecies Correlation Estimation Models for Chemical Hazard Assessment
Web-based Interspecies Correlation Estimation (ICE) is an application developed to predict the acute toxicity of a chemical from 1 species to another taxon. Web-ICE models use the acute toxicity value for a surrogate species to predict effect values for other species, thus potent...
Development and Application of In Vitro Models for Screening Drugs and Environmental Chemicals that Predict Toxicity in Animals and Humans (Presented by James McKim, Ph.D., DABT, Founder and Chief Science Officer, CeeTox) (5/25/2012)
Hemmer, Michael J., Robert T. Hudson and Calvin C. Walker. In press. Development of Protein Profile Technology to Evaluate Ecological Effects of Environmental Chemicals Using a Small Fish Model (Abstract). To be presented at the EPA Science Forum: Healthy Communities and Ecosyste...
Chemical genetics and strigolactone perception
Lumba, Shelley; Bunsick, Michael; McCourt, Peter
2017-01-01
Strigolactones (SLs) are a collection of related small molecules that act as hormones in plant growth and development. Intriguingly, SLs also act as ecological communicators between plants and mycorrhizal fungi and between host plants and a collection of parasitic plant species. In the case of mycorrhizal fungi, SLs exude into the soil from host roots to attract fungal hyphae for a beneficial interaction. In the case of parasitic plants, however, root-exuded SLs cause dormant parasitic plant seeds to germinate, thereby allowing the resulting seedling to infect the host and withdraw nutrients. Because a laboratory-friendly model does not exist for parasitic plants, researchers are currently using information gleaned from model plants like Arabidopsis in combination with the chemical probes developed through chemical genetics to understand SL perception of parasitic plants. This work first shows that understanding SL signaling is useful in developing chemical probes that perturb SL perception. Second, it indicates that the chemical space available to probe SL signaling in both model and parasitic plants is sizeable. Because these parasitic pests represent a major concern for food insecurity in the developing world, there is great need for chemical approaches to uncover novel lead compounds that perturb parasitic plant infections. PMID:28690842
United States Environmental Protection Agency (USEPA) researchers are developing a strategy for high-throughput (HT) exposure-based prioritization of chemicals under the ExpoCast program. These novel modeling approaches for evaluating chemicals based on their potential for biologi...
Ernstoff, Alexi S; Fantke, Peter; Huang, Lei; Jolliet, Olivier
2017-11-01
Specialty software and simplified models are often used to estimate migration of potentially toxic chemicals from packaging into food. Current models, however, are not suitable for emerging applications in decision-support tools, e.g. in Life Cycle Assessment and risk-based screening and prioritization, which require rapid computation of accurate estimates for diverse scenarios. To fulfil this need, we develop an accurate and rapid (high-throughput) model that estimates the fraction of organic chemicals migrating from polymeric packaging materials into foods. Several hundred step-wise simulations optimised the model coefficients to cover a range of user-defined scenarios (e.g. temperature). The developed model, operationalised in a spreadsheet for future dissemination, nearly instantaneously estimates chemical migration, and has improved performance over commonly used model simplifications. When using measured diffusion coefficients the model accurately predicted (R² = 0.9, standard error (Se) = 0.5) hundreds of empirical data points for various scenarios. Diffusion coefficient modelling, which determines the speed of chemical transfer from package to food, was a major contributor to uncertainty and dramatically decreased model performance (R² = 0.4, Se = 1). In all, this study provides a rapid migration modelling approach to estimate exposure to chemicals in food packaging for emerging screening and prioritization approaches.
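A common simplification that models like this one improve upon is the short-time Fickian estimate for migration from a polymer slab. The sketch below shows that textbook estimate, not the authors' fitted high-throughput model; the diffusion coefficient, film thickness, and contact time are hypothetical.

```python
import math

# Short-time Fickian estimate for one-sided migration from a polymer slab
# of thickness Lp into a well-mixed food: fraction migrated
#   m(t)/m_inf ~ (2 / Lp) * sqrt(D * t / pi),  capped at 1.
# This is the classic slab-diffusion approximation, NOT the paper's model.
def fraction_migrated(diff_coeff_cm2_s, thickness_cm, time_s):
    frac = (2.0 / thickness_cm) * math.sqrt(diff_coeff_cm2_s * time_s / math.pi)
    return min(frac, 1.0)

# Hypothetical scenario: D = 1e-12 cm^2/s, 50 um film, 10 days of contact.
f = fraction_migrated(1e-12, 0.005, 10 * 86400)
print(round(f, 2))  # → 0.21
```

Because the migrated fraction scales with sqrt(D), an uncertain diffusion coefficient propagates directly into the estimate, which is consistent with the abstract's finding that diffusion-coefficient modelling dominated the uncertainty.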
NASA Astrophysics Data System (ADS)
Dalton, Rebecca Marie
The development of students' mental models of chemical substances and processes at the molecular level was studied in a three-phase project. Animations produced in the VisChem project were used as an integral part of the chemistry instruction to help students develop their mental models. Phase one of the project involved examining the effectiveness of using animations to help first-year university chemistry students develop useful mental models of chemical phenomena. Phase two explored factors affecting the development of students' mental models, analysing results in terms of a proposed model of the perceptual processes involved in interpreting an animation. Phase three involved four case studies that served to confirm and elaborate on the effects of prior knowledge and disembedding ability on students' mental model development, and support the influence of study style on learning outcomes. Recommendations for use of the VisChem animations, based on the above findings, include: considering the prior knowledge of students; focusing attention on relevant features; encouraging a deep approach to learning; using animation to teach visual concepts; presenting ideas visually, verbally and conceptually; establishing 'animation literacy'; minimising cognitive load; using animation as feedback; using student drawings; repeating animations; and discussing 'scientific modelling'.
NASA Astrophysics Data System (ADS)
Adhikary, B.; Kulkarni, S.; Carmichael, G. R.; Tang, Y.; Dallura, A.; Mena, M.; Streets, D.; Zhang, Q.
2007-12-01
The Intercontinental Chemical Transport Experiment-Phase B (INTEX-B) was conducted over the Pacific Ocean during the 2006 North American spring season. One of the scientific objectives of the INTEX-B field campaign was to quantify the transport and chemical evolution/aging of Asian air pollution into North America. The field campaign deployed multiple experimental platforms such as satellites, aircraft and surface measurement stations to study the pollution outflow to North America. Three-dimensional chemical transport models were used to provide chemical weather forecasts and assist in flight planning during the mission. The Sulfur Transport and dEposition Model (STEM) is a regional chemical transport model developed at the University of Iowa; it provided chemical weather forecasts and assisted in flight planning during the INTEX-B intensive field campaign. In this study we will report the STEM model's performance for aerosols and trace gases and its ability to capture the pollutant plume, compared against experimental observations obtained from the field campaign. The study will then relate the emissions of trace gases and aerosols to atmospheric composition, sources and sinks using the newly developed emissions inventory for the INTEX-B field campaign.
Solutions of the chemical kinetic equations for initially inhomogeneous mixtures.
NASA Technical Reports Server (NTRS)
Hilst, G. R.
1973-01-01
Following the recent discussions by O'Brien (1971) and Donaldson and Hilst (1972) of the effects of inhomogeneous mixing and turbulent diffusion on simple chemical reaction rates, the present report provides a more extensive analysis of when inhomogeneous mixing has a significant effect on chemical reaction rates. The analysis is then extended to the development of an approximate chemical sub-model which provides much improved predictions of chemical reaction rates over a wide range of inhomogeneities and pathological distributions of the concentrations of the reacting chemical species. In particular, the development of an approximate representation of the third-order correlations of the joint concentration fluctuations permits closure of the chemical sub-model at the level of the second-order moments of these fluctuations and the mean concentrations.
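The central point of the analysis above can be illustrated numerically: for a second-order reaction A + B → products, the spatially averaged rate is k(⟨A⟩⟨B⟩ + ⟨a′b′⟩), so anti-correlated (segregated) concentration fluctuations slow the reaction relative to the well-mixed estimate k⟨A⟩⟨B⟩. The sketch below uses synthetic one-dimensional concentration fields, not data from the report.

```python
import numpy as np

# Synthetic, perfectly anti-correlated concentration fields: A is rich
# exactly where B is poor, mimicking an initially inhomogeneous mixture.
k = 1.0
x = np.linspace(0.0, 1.0, 1000)
A = 1.0 + 0.9 * np.sin(2 * np.pi * x)
B = 1.0 - 0.9 * np.sin(2 * np.pi * x)

mean_rate = k * np.mean(A * B)               # true spatially averaged rate
mixed_rate = k * A.mean() * B.mean()         # naive well-mixed estimate
covariance = np.mean((A - A.mean()) * (B - B.mean()))  # <a'b'>, negative here
```

Closing the model at second order means carrying `covariance` as a prognostic quantity, since `mean_rate = mixed_rate + k * covariance` exactly.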
Gandhi, Nilima; Bhavsar, Satyendra P; Gewurtz, Sarah B; Diamond, Miriam L; Evenset, Anita; Christensen, Guttorm N; Gregor, Dennis
2006-08-01
A multichemical food web model has been developed to estimate the biomagnification of interconverting chemicals in aquatic food webs. We extended a fugacity-based food web model for single chemicals to account for reversible and irreversible biotransformation among a parent chemical and transformation products, by simultaneously solving mass balance equations of the chemicals using a matrix solution. The model can be applied to any number of chemicals and organisms or taxonomic groups in a food web. The model was illustratively applied to four PBDE congeners, BDE-47, -99, -100, and -153, in the food web of Lake Ellasjøen, Bear Island, Norway. In Ellasjøen arctic char (Salvelinus alpinus), the multichemical model estimated PBDE biotransformation from higher to lower brominated congeners and improved the correspondence between estimated and measured concentrations in comparison to estimates from the single-chemical food web model. The underestimation of BDE-47, even after considering bioformation due to biotransformation of the other three congeners, suggests its formation from additional biotransformation pathways not considered in this application. The model estimates approximate values for congener-specific biotransformation half-lives of 5.7, 0.8, 1.14, and 0.45 years for BDE-47, -99, -100, and -153, respectively, in large arctic char (S. alpinus) of Lake Ellasjøen.
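The matrix solution described above works because steady-state mass balances for interconverting chemicals in an organism are linear and can be solved simultaneously. The sketch below is a minimal, hypothetical version for one organism with invented rate constants; it is not the published parameterization.

```python
import numpy as np

# Steady state of dM_i/dt = uptake_i + sum_j k_conv[j,i]*M_j
#                           - (k_elim_i + sum_j k_conv[i,j])*M_i = 0,
# solved as a linear system A @ M = -uptake. All rates are illustrative.
def steady_state_masses(uptake, k_elim, k_conv):
    """uptake: intake rate of each chemical; k_elim: first-order elimination
    rate constants; k_conv[i][j]: biotransformation rate from chemical i to j."""
    k_conv = np.asarray(k_conv, dtype=float)
    n = len(uptake)
    A = np.zeros((n, n))
    for i in range(n):
        A[i, i] = -(k_elim[i] + k_conv[i].sum())  # all losses from chemical i
        for j in range(n):
            if j != i:
                A[i, j] = k_conv[j, i]            # formation of i from j
    return np.linalg.solve(A, -np.asarray(uptake, dtype=float))

# Two interconverting congeners: the parent (index 0) debrominates to index 1.
m = steady_state_masses(uptake=[1.0, 0.2],
                        k_elim=[0.01, 0.02],
                        k_conv=[[0.0, 0.005], [0.0, 0.0]])
```

With more chemicals the same code handles any interconversion network, which is the advantage of the matrix formulation over chaining single-chemical models.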
Learning of Chemical Equilibrium through Modelling-Based Teaching
ERIC Educational Resources Information Center
Maia, Poliana Flavia; Justi, Rosaria
2009-01-01
This paper presents and discusses students' learning process of chemical equilibrium from a modelling-based approach developed from the use of the "Model of Modelling" diagram. The investigation was conducted in a regular classroom (students 14-15 years old) and aimed at discussing how modelling-based teaching can contribute to students…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davis, E.G.; Mioduszewski, R.J.
The Chemical Computer Man: Chemical Agent Response Simulation (CARS) is a computer model and simulation program for estimating the dynamic changes in human physiological dysfunction resulting from exposures to chemical-threat nerve agents. The newly developed CARS methodology simulates agent exposure effects on the following five indices of human physiological function: mental, vision, cardio-respiratory, visceral, and limbs. Mathematical models and the application of basic pharmacokinetic principles were incorporated into the simulation so that for each chemical exposure, the relationship between exposure dosage, absorbed dosage (agent blood plasma concentration), and level of physiological response are computed as a function of time. CARS, as a simulation tool, is designed for users with little or no computer-related experience. The model combines maximum flexibility with a comprehensive user-friendly interactive menu-driven system. Users define an exposure problem and obtain immediate results displayed in tabular, graphical, and image formats. CARS has broad scientific and engineering applications, not only in technology for the soldier in the area of Chemical Defense, but also in minimizing animal testing in biomedical and toxicological research and the development of a modeling system for human exposure to hazardous-waste chemicals.
Model reduction of multiscale chemical Langevin equations: a numerical case study.
Sotiropoulos, Vassilios; Contou-Carrere, Marie-Nathalie; Daoutidis, Prodromos; Kaznessis, Yiannis N
2009-01-01
Two very important characteristics of biological reaction networks need to be considered carefully when modeling these systems. First, models must account for the inherent probabilistic nature of systems far from the thermodynamic limit. Often, biological systems cannot be modeled with traditional continuous-deterministic models. Second, models must take into consideration the disparate spectrum of time scales observed in biological phenomena, such as slow transcription events and fast dimerization reactions. In the last decade, significant efforts have been expended on the development of stochastic chemical kinetics models to capture the dynamics of biomolecular systems, and on the development of robust multiscale algorithms, able to handle stiffness. In this paper, the focus is on the dynamics of reaction sets governed by stiff chemical Langevin equations, i.e., stiff stochastic differential equations. These are particularly challenging systems to model, requiring prohibitively small integration step sizes. We describe and illustrate the application of a semianalytical reduction framework for chemical Langevin equations that results in significant gains in computational cost.
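A chemical Langevin equation of the kind discussed above can be illustrated with a minimal example: Euler-Maruyama integration of a single reversible isomerization A ⇌ B. This is only a sketch of the model class (the reaction, rates, and step size are invented), not the paper's semianalytical reduction framework.

```python
import numpy as np

# CLE for A <-> B with rates k1 (A->B) and k2 (B->A); x is the number of A
# molecules and NTOT = A + B is conserved. Each reaction channel contributes
# a drift term and an independent Gaussian noise term scaled by sqrt(propensity).
rng = np.random.default_rng(0)
NTOT = 1000.0

def cle_step(x, k1, k2, dt):
    a1, a2 = k1 * x, k2 * (NTOT - x)      # propensities of the two channels
    drift = (a2 - a1) * dt
    noise = (np.sqrt(a2) * rng.normal() - np.sqrt(a1) * rng.normal()) * np.sqrt(dt)
    return max(x + drift + noise, 0.0)    # keep molecule count non-negative

x, dt = 1000.0, 1e-3
for _ in range(20000):
    x = cle_step(x, k1=1.0, k2=1.0, dt=dt)
# x now fluctuates around the deterministic equilibrium NTOT*k2/(k1+k2)
```

Stiffness enters when the channel propensities differ by orders of magnitude, forcing `dt` far below the slow timescale of interest, which is exactly the regime the reduction framework targets.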
Modeling of the HiPco process for carbon nanotube production. I. Chemical kinetics
NASA Technical Reports Server (NTRS)
Dateo, Christopher E.; Gokcen, Tahir; Meyyappan, M.
2002-01-01
A chemical kinetic model is developed to help understand and optimize the production of single-walled carbon nanotubes via the high-pressure carbon monoxide (HiPco) process, which employs iron pentacarbonyl as the catalyst precursor and carbon monoxide as the carbon feedstock. The model separates the HiPco process into three steps, precursor decomposition, catalyst growth and evaporation, and carbon nanotube production resulting from the catalyst-enhanced disproportionation of carbon monoxide, known as the Boudouard reaction: 2 CO(g)-->C(s) + CO2(g). The resulting detailed model contains 971 species and 1948 chemical reactions. A second model with a reduced reaction set containing 14 species and 22 chemical reactions is developed on the basis of the detailed model and reproduces the chemistry of the major species. Results showing the parametric dependence of temperature, total pressure, and initial precursor partial pressures are presented, with comparison between the two models. The reduced model is more amenable to coupled reacting flow-field simulations, presented in the following article.
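The Boudouard stoichiometry named above can be sketched as a single second-order rate law integrated with explicit Euler steps. The rate constant and conditions below are illustrative only; they are not taken from the 971-reaction or 14-species HiPco models.

```python
# 2 CO(g) -> C(s) + CO2(g): rate proportional to [CO]^2, with two CO
# consumed and one CO2 (plus one solid C) formed per reaction event.
def integrate_boudouard(co0, k, dt, steps):
    co, co2 = co0, 0.0
    for _ in range(steps):
        r = k * co * co          # reaction rate at the current CO level
        co -= 2.0 * r * dt       # stoichiometric coefficient 2 for CO
        co2 += r * dt            # stoichiometric coefficient 1 for CO2
    return co, co2

co, co2 = integrate_boudouard(co0=1.0, k=0.5, dt=1e-3, steps=4000)
```

For these parameters the exact solution is [CO](t) = 1/(1 + 2k·t·[CO]₀), so the Euler result should sit close to 0.2 at t = 4, and the CO consumed is exactly twice the CO2 formed.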
Frontiers of chemical bioaccumulation modeling with fish
Predictive models for chemical accumulation in fish have been provided by numerous authors. Historically, these models were developed to describe the accumulation of neutral hydrophobic compounds which undergo little or no biotransformation. In such cases, accumulation can be p...
Advancing Models and Data for Characterizing Exposures to Chemicals in Consumer Products
EPA’s Office of Research and Development (ORD) is leading several efforts to develop data and methods for estimating population chemical exposures related to the use of consumer products. New curated chemical, ingredient, and product use information are being collected fro...
ERIC Educational Resources Information Center
McPeake, John D.; And Others
1991-01-01
Describes adolescent chemical dependency treatment model developed at Beech Hill Hospital (New Hampshire) which integrated Twelve Step-oriented alcohol and drug rehabilitation program with experiential education school, Hurricane Island Outward Bound School. Describes Beech Hill Hurricane Island Outward Bound School Adolescent Chemical Dependency…
High throughput screening (HTS) models are being developed and applied to prioritize chemicals for more comprehensive exposure and risk assessment. Dermal pathways are possible exposure routes to humans for thousands of chemicals found in personal care products and the indoor env...
Dry Chemical Development - A Model for the Extinction of Hydrocarbon Flames.
1984-02-08
Various fire-extinguishing agents are carried on board Navy ships to control fires. The flame extinguishment model describes and predicts the suppression effectiveness of a wide variety of gaseous, liquid, and solid agents, and is generalized by consideration of all endothermic reaction sinks, e.g., vaporization, dissociation, and decomposition, through a general correlating equation.
Truong, Lisa; Ouedraogo, Gladys; Pham, LyLy; Clouzeau, Jacques; Loisel-Joubert, Sophie; Blanchet, Delphine; Noçairi, Hicham; Setzer, Woodrow; Judson, Richard; Grulke, Chris; Mansouri, Kamel; Martin, Matthew
2018-02-01
In an effort to address a major challenge in chemical safety assessment, alternative approaches for characterizing systemic effect levels, a predictive model was developed. Systemic effect levels were curated from ToxRefDB, HESS-DB and COSMOS-DB from numerous study types totaling 4379 in vivo studies for 1247 chemicals. Observed systemic effects in mammalian models are a complex function of chemical dynamics, kinetics, and inter- and intra-individual variability. To address this complex problem, systemic effect levels were modeled at the study level by leveraging study covariates (e.g., study type, strain, administration route) in addition to multiple descriptor sets, including chemical (ToxPrint, PaDEL, and Physchem), biological (ToxCast), and kinetic descriptors. Using random forest modeling with cross-validation and external validation procedures, study-level covariates alone accounted for approximately 15% of the variance, reducing the root mean squared error (RMSE) from 0.96 log10 to 0.85 log10 mg/kg/day and providing a baseline performance metric (lower expectation of model performance). A consensus model developed using a combination of study-level covariates, chemical, biological, and kinetic descriptors explained a total of 43% of the variance with an RMSE of 0.69 log10 mg/kg/day. A benchmark model (upper expectation of model performance) was also developed with an RMSE of 0.5 log10 mg/kg/day by incorporating study-level covariates and the mean effect level per chemical. To achieve a representative chemical-level prediction, the minimum study-level predicted and observed effect levels per chemical were compared, reducing the RMSE from 1.0 to 0.73 log10 mg/kg/day, equivalent to 87% of predictions falling within an order of magnitude of the observed value.
Although biological descriptors did not improve model performance, the final model was enriched for biological descriptors that indicated xenobiotic metabolism gene expression, oxidative stress, and cytotoxicity, demonstrating the importance of accounting for kinetics and non-specific bioactivity in predicting systemic effect levels. Herein, we generated an externally predictive model of systemic effect levels for use as a safety assessment tool and have generated forward predictions for over 30,000 chemicals.
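The performance metrics quoted above (RMSE on log10 effect levels, and fraction of variance explained relative to a mean-only null model) can be sketched with small helper functions. The data below are synthetic, for illustration only.

```python
import numpy as np

# RMSE and variance explained: a model explains variance to the extent its
# RMSE beats that of a null model that always predicts the mean response.
def rmse(y_true, y_pred):
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

def variance_explained(y_true, y_pred):
    y_true = np.asarray(y_true)
    null_rmse = rmse(y_true, np.full_like(y_true, y_true.mean()))
    return 1.0 - (rmse(y_true, y_pred) / null_rmse) ** 2

# Synthetic log10(mg/kg/day) effect levels and model predictions.
y = np.array([1.0, 2.0, 3.0, 4.0])
pred = np.array([1.1, 1.9, 3.2, 3.8])
```

An RMSE of 0.73 log10 mg/kg/day corresponds roughly to the "within an order of magnitude" criterion the abstract reports, since one order of magnitude is 1.0 on the log10 scale.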
Gupta, S; Basant, N; Mohan, D; Singh, K P
2016-07-01
Experimental determination of the rate constants of the reaction of NO3 with a large number of organic chemicals is tedious and resource-intensive, and the development of computational methods has been widely advocated. In this study, we developed room-temperature (298 K) and temperature-dependent quantitative structure-reactivity relationship (QSRR) models based on ensemble learning approaches (decision tree forest (DTF) and decision treeboost (DTB)) for predicting the rate constant of the reaction of NO3 radicals with diverse organic chemicals, under OECD guidelines. The predictive power of the developed models was established in terms of statistical coefficients. In the test phase, the QSRR models yielded a squared correlation (r²) of >0.94 between experimental and predicted rate constants. The applicability domains of the constructed models were determined. An attempt has been made to provide a mechanistic interpretation of the features selected for QSRR development. The proposed QSRR models outperformed previous reports, and the temperature-dependent models offered a much wider applicability domain. This is the first report presenting a temperature-dependent QSRR model for predicting the nitrate radical reaction rate constant at different temperatures. The proposed models can be useful tools for predicting the reactivities of chemicals towards NO3 radicals in the atmosphere and hence their persistence, supporting exposure risk assessment.
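The temperature dependence underlying a temperature-dependent rate-constant model of this kind is typically Arrhenius-like, k(T) = A·exp(−Ea/(R·T)). The pre-exponential factor and activation energy below are invented for illustration; they are not fitted NO3 reaction data from the study.

```python
import math

R = 8.314  # gas constant, J/(mol K)

def arrhenius(A, Ea, T):
    """Arrhenius rate constant: A in the units of k, Ea in J/mol, T in K."""
    return A * math.exp(-Ea / (R * T))

# Illustrative parameters only: compare a room-temperature rate constant
# with one at elevated temperature for a positive activation energy.
k298 = arrhenius(A=1e-12, Ea=5000.0, T=298.0)
k350 = arrhenius(A=1e-12, Ea=5000.0, T=350.0)
```

A QSRR that predicts ln k as a function of descriptors and 1/T effectively learns chemical-specific A and Ea terms, which is what widens the applicability domain across temperatures.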
DOE Office of Scientific and Technical Information (OSTI.GOV)
Singh, Kunwar P., E-mail: kpsingh_52@yahoo.com; Environmental Chemistry Division, CSIR-Indian Institute of Toxicology Research, Post Box 80, Mahatma Gandhi Marg, Lucknow 226 001; Gupta, Shikha
Robust global models capable of discriminating positive and non-positive carcinogens, and predicting carcinogenic potency of chemicals in rodents, were developed. A dataset of 834 structurally diverse chemicals extracted from the Carcinogenic Potency Database (CPDB) was used, which contained 466 positive and 368 non-positive carcinogens. Twelve non-quantum mechanical molecular descriptors were derived. Structural diversity of the chemicals and nonlinearity in the data were evaluated using the Tanimoto similarity index and Brock–Dechert–Scheinkman statistics. Probabilistic neural network (PNN) and generalized regression neural network (GRNN) models were constructed for classification and function optimization problems using the carcinogenicity end point in rat. Validation of the models was performed using internal and external procedures employing a wide series of statistical checks. The PNN constructed using five descriptors rendered a classification accuracy of 92.09% in the complete rat data. The PNN model rendered classification accuracies of 91.77%, 80.70% and 92.08% in mouse, hamster and pesticide data, respectively. The GRNN constructed with nine descriptors yielded a correlation coefficient of 0.896 between the measured and predicted carcinogenic potency with a mean squared error (MSE) of 0.44 in the complete rat data. The rat carcinogenicity model (GRNN) applied to the mouse and hamster data yielded correlation coefficients and MSEs of 0.758, 0.71 and 0.760, 0.46, respectively. The results suggest the wide applicability of the inter-species models in predicting carcinogenic potency of chemicals. Both the PNN and GRNN (inter-species) models constructed here can be useful tools in predicting the carcinogenicity of new chemicals for regulatory purposes. - Graphical abstract: Figure (a) shows classification accuracies (positive and non-positive carcinogens) in rat, mouse, hamster, and pesticide data yielded by the optimal PNN model.
Figure (b) shows generalization and predictive abilities of the interspecies GRNN model to predict the carcinogenic potency of diverse chemicals. - Highlights: • Global robust models constructed for carcinogenicity prediction of diverse chemicals. • Tanimoto/BDS test revealed structural diversity of chemicals and nonlinearity in data. • PNN/GRNN successfully predicted carcinogenicity/carcinogenic potency of chemicals. • Developed interspecies PNN/GRNN models for carcinogenicity prediction. • Proposed models can be used as tools to predict carcinogenicity of new chemicals.
Aladjov, Hristo; Ankley, Gerald; Byrne, Hugh J.; de Knecht, Joop; Heinzle, Elmar; Klambauer, Günter; Landesmann, Brigitte; Luijten, Mirjam; MacKay, Cameron; Maxwell, Gavin; Meek, M. E. (Bette); Paini, Alicia; Perkins, Edward; Sobanski, Tomasz; Villeneuve, Dan; Waters, Katrina M.; Whelan, Maurice
2017-01-01
Efforts are underway to transform regulatory toxicology and chemical safety assessment from a largely empirical science based on direct observation of apical toxicity outcomes in whole organism toxicity tests to a predictive one in which outcomes and risk are inferred from accumulated mechanistic understanding. The adverse outcome pathway (AOP) framework provides a systematic approach for organizing knowledge that may support such inference. Likewise, computational models of biological systems at various scales provide another means and platform to integrate current biological understanding to facilitate inference and extrapolation. We argue that the systematic organization of knowledge into AOP frameworks can inform and help direct the design and development of computational prediction models that can further enhance the utility of mechanistic and in silico data for chemical safety assessment. This concept was explored as part of a workshop on AOP-Informed Predictive Modeling Approaches for Regulatory Toxicology held September 24–25, 2015. Examples of AOP-informed model development and its application to the assessment of chemicals for skin sensitization and multiple modes of endocrine disruption are provided. The role of problem formulation, not only as a critical phase of risk assessment, but also as guide for both AOP and complementary model development is described. Finally, a proposal for actively engaging the modeling community in AOP-informed computational model development is made. The contents serve as a vision for how AOPs can be leveraged to facilitate development of computational prediction models needed to support the next generation of chemical safety assessment. PMID:27994170
Organism and population-level ecological models for chemical risk assessment
Ecological risk assessment typically focuses on animal populations as endpoints for regulatory ecotoxicology. Scientists at USEPA are developing models for animal populations exposed to a wide range of chemicals from pesticides to emerging contaminants. Modeled taxa include aquat...
The aim of this work is to develop group-contribution+ (GC+) method (combined group-contribution (GC) method and atom connectivity index (CI) method) based property models to provide reliable estimations of environment-related properties of organic chemicals together with uncert...
Further development of a global pollution model for CO, CH4, and CH2O
NASA Technical Reports Server (NTRS)
Peters, L. K.
1975-01-01
Global tropospheric pollution models are developed that describe the transport and the physical and chemical processes occurring between the principal sources and sinks of CH4 and CO. Results are given of long term static chemical kinetic computer simulations and preliminary short term dynamic simulations.
Multiscale modeling of nerve agent hydrolysis mechanisms: a tale of two Nobel Prizes
NASA Astrophysics Data System (ADS)
Field, Martin J.; Wymore, Troy W.
2014-10-01
The 2013 Nobel Prize in Chemistry was awarded for the development of multiscale models for complex chemical systems, whereas the 2013 Peace Prize was given to the Organisation for the Prohibition of Chemical Weapons for their efforts to eliminate chemical warfare agents. This review relates the two by introducing the field of multiscale modeling and highlighting its application to the study of the biological mechanisms by which selected chemical weapon agents exert their effects at an atomic level.
NASA Astrophysics Data System (ADS)
Baklanov, Alexander; Smith Korsholm, Ulrik; Nuterman, Roman; Mahura, Alexander; Pagh Nielsen, Kristian; Hansen Sass, Bent; Rasmussen, Alix; Zakey, Ashraf; Kaas, Eigil; Kurganskiy, Alexander; Sørensen, Brian; González-Aparicio, Iratxe
2017-08-01
The Environment - High Resolution Limited Area Model (Enviro-HIRLAM) is developed as a fully online integrated numerical weather prediction (NWP) and atmospheric chemical transport (ACT) model for research and forecasting of joint meteorological, chemical and biological weather. The integrated modelling system is developed by the Danish Meteorological Institute (DMI) in collaboration with several European universities. It is the baseline system in the HIRLAM Chemical Branch and is used in several countries and different applications. The development was initiated at DMI more than 15 years ago. The model is based on the HIRLAM NWP model with online integrated pollutant transport and dispersion, chemistry, aerosol dynamics, deposition and atmospheric composition feedbacks. To make the model suitable for chemical weather forecasting in urban areas, the meteorological part was improved by implementation of urban parameterisations. The dynamical core was improved by implementing a locally mass-conserving semi-Lagrangian numerical advection scheme, which improves forecast accuracy and model performance. The current version (7.2), in comparison with previous versions, has a more advanced and cost-efficient chemistry, an aerosol multi-compound approach, and aerosol feedbacks (direct and semi-direct) on radiation and (first and second indirect effects) on cloud microphysics. Since 2004, Enviro-HIRLAM has been used for different studies, including operational pollen forecasting for Denmark since 2009 and operational forecasting of atmospheric composition with downscaling for China since 2017. Following the main research and development strategy, further model developments will be extended towards the new NWP platform, HARMONIE. Different aspects of the online coupling methodology, the research strategy, possible applications of the modelling system, and fit-for-purpose model configurations for the meteorological and air quality communities are discussed.
Vorberg, Susann
2013-01-01
Biodegradability describes the capacity of substances to be mineralized by free‐living bacteria. It is a crucial property in estimating a compound’s long‐term impact on the environment. The ability to reliably predict biodegradability would reduce the need for laborious experimental testing. However, this endpoint is difficult to model due to the unavailability or inconsistency of experimental data. Our approach makes use of the Online Chemical Modeling Environment (OCHEM) and its rich supply of machine learning methods and descriptor sets to build classification models for ready biodegradability. These models were analyzed to determine the relationship between characteristic structural properties and biodegradation activity. The distinguishing feature of the developed models is their ability to estimate the accuracy of prediction for each individual compound. The models developed using seven individual descriptor sets were combined in a consensus model, which provided the highest accuracy. The identified overrepresented structural fragments can be used by chemists to improve the biodegradability of new chemical compounds. The consensus model, the datasets used, and the calculated structural fragments are publicly available at http://ochem.eu/article/31660. PMID:27485201
Kim, Cheol-Hee; Park, Jin-Ho; Park, Cheol-Jin; Na, Jin-Gyun
2004-03-01
The Chemical Accidents Response Information System (CARIS) was developed at the Center for Chemical Safety Management in South Korea in order to track and predict the dispersion of hazardous chemicals in the case of an accident or terrorist attack involving chemical companies. The main objective of CARIS is to facilitate an efficient emergency response to hazardous chemical accidents by rapidly providing key information in the decision-making process. In particular, the atmospheric modeling system implemented in CARIS, which is composed of a real-time numerical weather forecasting model and an air pollution dispersion model, can be used as a tool to forecast concentrations and to provide a wide range of assessments associated with various hazardous chemicals in real time. This article introduces the components of CARIS and describes its operational modeling system. Some examples of the operational modeling system and its use for emergency preparedness are presented and discussed. Finally, this article evaluates the current numerical weather prediction model for Korea.
QSAR models of human data can enrich or replace LLNA testing for human skin sensitization
Alves, Vinicius M.; Capuzzi, Stephen J.; Muratov, Eugene; Braga, Rodolpho C.; Thornton, Thomas; Fourches, Denis; Strickland, Judy; Kleinstreuer, Nicole; Andrade, Carolina H.; Tropsha, Alexander
2016-01-01
Skin sensitization is a major environmental and occupational health hazard. Although many chemicals have been evaluated in humans, there have been no efforts to model these data to date. We have compiled, curated, analyzed, and compared the available human and LLNA data. Using these data, we have developed reliable computational models and applied them for virtual screening of chemical libraries to identify putative skin sensitizers. The overall concordance between murine LLNA and human skin sensitization responses for a set of 135 unique chemicals was low (R = 28-43%), although several chemical classes had high concordance. We succeeded in developing predictive QSAR models of all available human data with an external correct classification rate of 71%. A consensus model integrating concordant QSAR predictions and LLNA results afforded a higher CCR of 82%, but at the expense of reduced external dataset coverage (52%). We used the developed QSAR models for virtual screening of the CosIng database and identified 1061 putative skin sensitizers; for seventeen of these compounds, we found published evidence of their skin sensitization effects. The models reported herein provide a more accurate alternative to LLNA testing for human skin sensitization assessment across diverse chemical data. In addition, they can also be used to guide the structural optimization of toxic compounds to reduce their skin sensitization potential. PMID:28630595
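The correct classification rate (CCR) quoted above is the balanced average of sensitivity and specificity, which avoids rewarding a model for simply predicting the majority class. A minimal sketch, with an invented confusion matrix:

```python
# CCR = (sensitivity + specificity) / 2, computed from confusion-matrix
# counts: tp/fn are sensitizers classified right/wrong, tn/fp non-sensitizers.
def ccr(tp, tn, fp, fn):
    sensitivity = tp / (tp + fn)   # fraction of true sensitizers caught
    specificity = tn / (tn + fp)   # fraction of non-sensitizers caught
    return 0.5 * (sensitivity + specificity)

# Hypothetical example: 40 of 50 sensitizers and 45 of 50 non-sensitizers
# are classified correctly.
score = ccr(tp=40, tn=45, fp=5, fn=10)
```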
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ho, Clifford Kuofei
Chemical transport through human skin can play a significant role in human exposure to toxic chemicals in the workplace, as well as to chemical/biological warfare agents in the battlefield. The viability of transdermal drug delivery also relies on chemical transport processes through the skin. Models of percutaneous absorption are needed for risk-based exposure assessments and drug-delivery analyses, but previous mechanistic models have been largely deterministic. A probabilistic, transient, three-phase model of percutaneous absorption of chemicals has been developed to assess the relative importance of uncertain parameters and processes that may be important to risk-based assessments. Penetration routes through the skin that were modeled include the following: (1) intercellular diffusion through the multiphase stratum corneum; (2) aqueous-phase diffusion through sweat ducts; and (3) oil-phase diffusion through hair follicles. Uncertainty distributions were developed for the model parameters, and a Monte Carlo analysis was performed to simulate probability distributions of mass fluxes through each of the routes. Sensitivity analyses using stepwise linear regression were also performed to identify the model parameters that were most important to the simulated mass fluxes at different times. This probabilistic analysis of percutaneous absorption (PAPA) method has been developed to improve risk-based exposure assessments and transdermal drug-delivery analyses, where parameters and processes can be highly uncertain.
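The Monte Carlo treatment of parallel penetration routes described above can be sketched as follows: sample an uncertain permeability coefficient for each route, sum the steady-state fluxes J = Kp·C across routes per realization, and read off percentiles. All distribution parameters below are invented for illustration, not the model's actual uncertainty distributions.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 10_000                                   # Monte Carlo realizations
conc = 1.0                                   # applied concentration, mg/cm^3

# Lognormal permeability coefficient (Kp, cm/h) samples for the three
# parallel routes named in the abstract; parameters are hypothetical.
kp = {
    "stratum_corneum": rng.lognormal(-6.0, 0.8, N),
    "sweat_ducts": rng.lognormal(-8.0, 1.0, N),
    "hair_follicles": rng.lognormal(-7.5, 1.2, N),
}

total_flux = sum(kp.values()) * conc         # mg/(cm^2 h), one value per realization
p5, p50, p95 = np.percentile(total_flux, [5, 50, 95])
```

Sensitivity analysis then amounts to regressing `total_flux` on the sampled parameters, which is the stepwise-regression step the abstract describes.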
Chemically reacting supersonic flow calculation using an assumed PDF model
NASA Technical Reports Server (NTRS)
Farshchi, M.
1990-01-01
This work is motivated by the need to develop accurate models for chemically reacting compressible turbulent flow fields that are present in a typical supersonic combustion ramjet (SCRAMJET) engine. In this paper the development of a new assumed probability density function (PDF) reaction model for supersonic turbulent diffusion flames and its implementation into an efficient Navier-Stokes solver are discussed. The application of this model to a supersonic hydrogen-air flame will be considered.
Toward a 3D model of human brain development for studying gene/environment interactions
2013-01-01
This project aims to establish and characterize an in vitro model of the developing human brain for the purpose of testing drugs and chemicals. To accurately assess risk, a model needs to recapitulate the complex interactions between different types of glial cells and neurons in a three-dimensional platform. Moreover, human cells are preferred over cells from rodents to eliminate cross-species differences in sensitivity to chemicals. Previously, we established conditions to culture rat primary cells as three-dimensional aggregates, which will be humanized and evaluated here with induced pluripotent stem cells (iPSCs). The use of iPSCs allows us to address gene/environment interactions as well as the potential of chemicals to interfere with epigenetic mechanisms. Additionally, iPSCs afford us the opportunity to study the effect of chemicals during very early stages of brain development. It is well recognized that assays for testing toxicity in the developing brain must consider differences in sensitivity and susceptibility that arise depending on the time of exposure. This model will reflect critical developmental processes such as proliferation, differentiation, lineage specification, migration, axonal growth, dendritic arborization and synaptogenesis, which will probably display differences in sensitivity to different types of chemicals. Functional endpoints will evaluate the complex cell-to-cell interactions that are affected in neurodevelopment through chemical perturbation, and the efficacy of drug intervention to prevent or reverse phenotypes. The model described is designed to assess developmental neurotoxicity effects on unique processes occurring during human brain development by leveraging human iPSCs from diverse genetic backgrounds, which can be differentiated into different cell types of the central nervous system. 
Our goal is to demonstrate the feasibility of the personalized model using iPSCs derived from individuals with neurodevelopmental disorders caused by known mutations and chromosomal aberrations. Notably, such a human brain model will be a versatile tool for more complex testing platforms and strategies as well as research into central nervous system physiology and pathology. PMID:24564953
QSAR modeling of cumulative environmental end-points for the prioritization of hazardous chemicals.
Gramatica, Paola; Papa, Ester; Sangion, Alessandro
2018-01-24
The hazard of chemicals in the environment is inherently related to the molecular structure and derives simultaneously from various chemical properties/activities/reactivities. Models based on Quantitative Structure Activity Relationships (QSARs) are useful to screen, rank and prioritize chemicals that may have an adverse impact on humans and the environment. This paper reviews a selection of QSAR models (based on theoretical molecular descriptors) developed for cumulative multivariate endpoints, which were derived by mathematical combination of multiple effects and properties. The cumulative end-points provide an integrated holistic point of view to address environmentally relevant properties of chemicals.
NSR&D FY15 Final Report. Modeling Mechanical, Thermal, and Chemical Effects of Impact
DOE Office of Scientific and Technical Information (OSTI.GOV)
Long, Christopher Curtis; Ma, Xia; Zhang, Duan Zhong
2015-11-02
The main goal of this project is to develop a computer model that explains and predicts the coupled mechanical, thermal and chemical responses of HE under impact and friction insults. The modeling effort is based on the LANL-developed CartaBlanca code, which implements the dual domain material point (DDMP) method to calculate complex and coupled thermal, chemical and mechanical effects among fluids, solids and the transitions between the states. In FY15, we implemented the TEPLA material model for metal, performed preliminary can-penetration simulations, and began to link with experiment. Currently, we are working on implementing a shock-to-detonation transition (SDT) model (SURF) and a JWL equation of state.
Mitchell, Jade; Arnot, Jon A.; Jolliet, Olivier; Georgopoulos, Panos G.; Isukapalli, Sastry; Dasgupta, Surajit; Pandian, Muhilan; Wambaugh, John; Egeghy, Peter; Cohen Hubal, Elaine A.; Vallero, Daniel A.
2014-01-01
While only limited data are available to characterize the potential toxicity of over 8 million commercially available chemical substances, there is even less information available on the exposure and use-scenarios that are required to link potential toxicity to human and ecological health outcomes. Recent improvements and advances such as high throughput data gathering, high performance computational capabilities, and predictive chemical inherency methodology make this an opportune time to develop an exposure-based prioritization approach that can systematically utilize and link the asymmetrical bodies of knowledge for hazard and exposure. In response to the US EPA’s need to develop novel approaches and tools for rapidly prioritizing chemicals, a “Challenge” was issued to several exposure model developers to aid the understanding of current systems in a broader sense and to assist the US EPA’s effort to develop an approach comparable to other international efforts. A common set of chemicals were prioritized under each current approach. The results are presented herein along with a comparative analysis of the rankings of the chemicals based on metrics of exposure potential or actual exposure estimates. The analysis illustrates the similarities and differences across the domains of information incorporated in each modeling approach. The overall findings indicate a need to reconcile exposures from diffuse, indirect sources (far-field) with exposures from directly, applied chemicals in consumer products or resulting from the presence of a chemical in a microenvironment like a home or vehicle. Additionally, the exposure scenario, including the mode of entry into the environment (i.e. through air, water or sediment) appears to be an important determinant of the level of agreement between modeling approaches. PMID:23707726
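A comparative analysis of chemical rankings like the one described typically relies on a rank-agreement metric. The sketch below computes a Spearman rank correlation between two prioritisations; the chemical names and scores are invented, and the paper does not specify this exact metric.

```python
# Sketch (not from the paper): compare two chemical prioritisations with a
# Spearman rank correlation.  Chemical names and scores are invented.

def ranks(scores):
    """Map each item to its rank (1 = highest score); no tie handling."""
    ordered = sorted(scores, key=scores.get, reverse=True)
    return {item: i + 1 for i, item in enumerate(ordered)}

def spearman(scores_a, scores_b):
    """Spearman rho for two score dicts over the same items (assumes no ties)."""
    ra, rb = ranks(scores_a), ranks(scores_b)
    n = len(ra)
    d2 = sum((ra[c] - rb[c]) ** 2 for c in ra)
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

model_1 = {"chem_A": 0.9, "chem_B": 0.5, "chem_C": 0.3, "chem_D": 0.1}
model_2 = {"chem_A": 0.8, "chem_B": 0.6, "chem_C": 0.2, "chem_D": 0.4}
rho = spearman(model_1, model_2)
```

A rho near 1 indicates the two exposure models largely agree on which chemicals to prioritise; disagreements concentrated in particular chemicals point to differences in the domains of information each model incorporates.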
ZEBRAFISH AS AN IN VIVO MODEL FOR SUSTAINABLE CHEMICAL DESIGN.
Noyes, Pamela D; Garcia, Gloria R; Tanguay, Robert L
2016-12-21
Heightened public awareness about the many thousands of chemicals in use and present as persistent contaminants in the environment has increased the demand for safer chemicals and more rigorous toxicity testing. There is a growing recognition that the use of traditional test models and empirical approaches is impractical for screening the many thousands of chemicals in the environment, and the hundreds of new chemistries introduced each year, for toxicity. These realities coupled with the green chemistry movement have prompted efforts to implement more predictive-based approaches to evaluate chemical toxicity early in product development. While used for many years in environmental toxicology and biomedicine, zebrafish use has accelerated more recently in genetic toxicology, high throughput screening (HTS), and behavioral testing. This review describes major advances in these testing methods that have positioned the zebrafish as a highly applicable model in chemical safety evaluations and sustainable chemistry efforts. Many toxic responses have been shown to be shared among fish and mammals owing to their generally well-conserved development, cellular networks, and organ systems. These shared responses have been observed for chemicals that impair endocrine functioning, development, and reproduction, as well as those that elicit cardiotoxicity and carcinogenicity, among other diseases. HTS technologies with zebrafish enable screening large chemical libraries for bioactivity that provide opportunities for testing early in product development. A compelling attribute of the zebrafish centers on being able to characterize toxicity mechanisms across multiple levels of biological organization from the genome to receptor interactions and cellular processes leading to phenotypic changes such as developmental malformations.
Finally, there is a growing recognition of the links between human and wildlife health and the need for approaches that allow for assessment of real world multi-chemical exposures. The zebrafish is poised to be an important model in bridging these two conventionally separate areas of toxicology and characterizing the biological effects of chemical mixtures that could augment its role in sustainable chemistry.
Chemical supply chain modeling for analysis of homeland security events
Ehlen, Mark A.; Sun, Amy C.; Pepple, Mark A.; ...
2013-09-06
The potential impacts of man-made and natural disasters on chemical plants, complexes, and supply chains are of great importance to homeland security. To be able to estimate these impacts, we developed an agent-based chemical supply chain model that includes: chemical plants with enterprise operations such as purchasing, production scheduling, and inventories; merchant chemical markets; and multi-modal chemical shipments. Large-scale simulations of chemical-plant activities and supply chain interactions, running on desktop computers, are used to estimate the scope and duration of disruptive-event impacts, and overall system resilience, based on the extent to which individual chemical plants can adjust their internal operations (e.g., production mixes and levels) versus their external interactions (market sales and purchases, and transportation routes and modes). Finally, to illustrate how the model estimates the impacts of a hurricane disruption, a simple example model centered on 1,4-butanediol is presented.
OPERA models for predicting physicochemical properties and environmental fate endpoints.
Mansouri, Kamel; Grulke, Chris M; Judson, Richard S; Williams, Antony J
2018-03-08
The collection of chemical structure information and associated experimental data for quantitative structure-activity/property relationship (QSAR/QSPR) modeling is facilitated by an increasing number of public databases containing large amounts of useful data. However, the performance of QSAR models highly depends on the quality of the data and modeling methodology used. This study aims to develop robust QSAR/QSPR models for chemical properties of environmental interest that can be used for regulatory purposes. This study primarily uses data from the publicly available PHYSPROP database consisting of a set of 13 common physicochemical and environmental fate properties. These datasets have undergone extensive curation using an automated workflow to select only high-quality data, and the chemical structures were standardized prior to calculation of the molecular descriptors. The modeling procedure was developed based on the five Organization for Economic Cooperation and Development (OECD) principles for QSAR models. A weighted k-nearest neighbor approach was adopted using a minimum number of required descriptors calculated using PaDEL, an open-source software. The genetic algorithms selected only the most pertinent and mechanistically interpretable descriptors (2-15, with an average of 11 descriptors). The sizes of the modeled datasets varied from 150 chemicals for biodegradability half-life to 14,050 chemicals for logP, with an average of 3222 chemicals across all endpoints. The optimal models were built on randomly selected training sets (75%) and validated using fivefold cross-validation (CV) and test sets (25%). The CV Q² of the models varied from 0.72 to 0.95, with an average of 0.86, and the test-set R² from 0.71 to 0.96, with an average of 0.82. Modeling and performance details are described in QSAR model reporting format and were validated by the European Commission's Joint Research Center to be OECD compliant.
All models are freely available as an open-source, command-line application called OPEn structure-activity/property Relationship App (OPERA). OPERA models were applied to more than 750,000 chemicals to produce freely available predicted data on the U.S. Environmental Protection Agency's CompTox Chemistry Dashboard.
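The weighted k-nearest-neighbour approach described above can be sketched minimally: predict a property as the distance-weighted mean of the k closest training chemicals in descriptor space. This is an illustrative stand-in, not OPERA's implementation; the descriptor vectors and property values are invented.

```python
# Minimal weighted-kNN sketch: predict a property as the inverse-distance-
# weighted mean of the k nearest training chemicals in descriptor space.
import math

def euclid(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def weighted_knn(train, query, k=3):
    """train: list of (descriptor_vector, property_value); returns prediction."""
    nearest = sorted(train, key=lambda t: euclid(t[0], query))[:k]
    weights = [1.0 / (euclid(vec, query) + 1e-9) for vec, _ in nearest]
    return sum(w * y for w, (_, y) in zip(weights, nearest)) / sum(weights)

# Invented training set: 2 descriptors -> a logP-like value.
train = [
    ((1.0, 0.0), 1.0),
    ((2.0, 0.0), 2.0),
    ((3.0, 0.0), 3.0),
    ((10.0, 0.0), 10.0),
]
pred = weighted_knn(train, (2.1, 0.0), k=3)
```

The inverse-distance weighting makes an exact descriptor match dominate the prediction, which is the behaviour one wants when the query chemical is already in the training set.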
A two-dimensional, finite-difference model simulating a highway has been developed which is able to handle linear and nonlinear chemical reactions. Transport of the pollutants is accomplished by use of an upstream-flux-corrected algorithm developed at the Naval Research Laborator...
Modeling of high speed chemically reacting flow-fields
NASA Technical Reports Server (NTRS)
Drummond, J. P.; Carpenter, Mark H.; Kamath, H.
1989-01-01
The SPARK3D and SPARK3D-PNS computer programs were developed to model 3-D supersonic, chemically reacting flow-fields. The SPARK3D code is a full Navier-Stokes solver, and is suitable for use in scramjet combustors and other regions where recirculation may be present. The SPARK3D-PNS is a parabolized Navier-Stokes solver and provides an efficient means of calculating steady-state combustor far-fields and nozzles. Each code has a generalized chemistry package, making modeling of any chemically reacting flow possible. Research activities by the Langley group range from addressing fundamental theoretical issues to simulating problems of practical importance. Algorithmic development includes work on higher order and upwind spatial difference schemes. Direct numerical simulations employ these algorithms to address the fundamental issues of flow stability and transition, and the chemical reaction of supersonic mixing layers and jets. It is believed that this work will lend greater insight into phenomenological model development for simulating supersonic chemically reacting flows in practical combustors. Currently, the SPARK3D and SPARK3D-PNS codes are used to study problems of engineering interest, including various injector designs and 3-D combustor-nozzle configurations. Examples that demonstrate the capabilities of each code are presented.
A Market-Basket Approach to Predict the Acute Aquatic Toxicity of Munitions and Energetic Materials.
Burgoon, Lyle D
2016-06-01
An ongoing challenge in chemical production, including the production of insensitive munitions and energetics, is the ability to make predictions about potential environmental hazards early in the process. To address this challenge, a quantitative structure activity relationship model was developed to predict acute fathead minnow toxicity of insensitive munitions and energetic materials. Computational predictive toxicology models like this one may be used to identify and prioritize environmentally safer materials early in their development. The developed model is based on the Apriori market-basket/frequent itemset mining approach to identify probabilistic prediction rules using chemical atom-pairs and the lethality data for 57 compounds from a fathead minnow acute toxicity assay. Lethality data were discretized into four categories based on the Globally Harmonized System of Classification and Labelling of Chemicals. Apriori identified toxicophores for categories two and three. The model classified 32 of the 57 compounds correctly, with a fivefold cross-validation classification rate of 74 %. A structure-based surrogate approach classified the remaining 25 chemicals with 48 % accuracy. This result is unsurprising, as these 25 chemicals were fairly unique within the larger set.
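The market-basket idea can be illustrated with a small sketch: treat each chemical as a "basket" of structural fragments, count how often each fragment co-occurs with each toxicity category, and keep fragment-to-category rules that pass support and confidence thresholds. The fragments, categories, and thresholds below are invented; this is a single-item simplification of Apriori, not the published model.

```python
# Illustrative single-item frequent-itemset mining: fragments frequent within
# a toxicity category become prediction rules.  Data are invented.
from collections import Counter

def mine_rules(chemicals, min_support=2, min_confidence=0.75):
    """chemicals: list of (fragment_set, category). Returns {fragment: category}."""
    frag_total = Counter()
    frag_by_cat = Counter()
    for frags, cat in chemicals:
        for f in frags:
            frag_total[f] += 1
            frag_by_cat[(f, cat)] += 1
    rules = {}
    for (f, cat), n in frag_by_cat.items():
        # support: absolute count; confidence: P(category | fragment)
        if n >= min_support and n / frag_total[f] >= min_confidence:
            rules[f] = cat
    return rules

data = [
    ({"nitro", "aromatic"}, "cat2"),
    ({"nitro", "amine"}, "cat2"),
    ({"ester", "aromatic"}, "cat3"),
    ({"ester"}, "cat3"),
    ({"amine"}, "cat3"),
]
rules = mine_rules(data)
```

Full Apriori additionally enumerates multi-fragment itemsets (here that would be atom-pair combinations), pruning any candidate whose sub-itemsets are infrequent.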
A novel and simple model of the uptake of organic chemicals by vegetation from air and soil.
Hung, H; Mackay, D
1997-09-01
A novel and simple three-compartment fugacity model has been developed to predict the kinetics and equilibria of the uptake of organic chemicals in herbaceous agricultural plants at various times, including the time of harvest, using only readily available input data. The chemical concentration in each of the three plant compartments (leaf, stem (which includes fruits and seeds), and root) is expressed as a function of both time and chemical concentrations in soil and air. The model was developed using the fugacity concept; however, the final expressions are presented in terms of concentrations in soil and air, equilibrium partition coefficients and a set of transport and transformation half-lives. An illustrative application of the model is presented which describes the uptake of bromacil by a soybean plant under hydroponic conditions. The model, which is believed to give acceptably accurate prediction of the distribution of chemicals among plant tissues, air and soil, may be used for the assessment of exposure to, and risk from contaminants consumed either directly from vegetation or indirectly in natural and agricultural food chains.
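The kinetics of a three-compartment uptake model can be sketched as a chain of first-order transfers integrated in time. The rate constants, soil concentration, and soil-root-stem-leaf topology below are invented for illustration; the actual model is formulated in fugacities with partition coefficients and half-lives.

```python
# Toy three-compartment uptake sketch: first-order transfer from soil into
# root, root to stem, stem to leaf, with loss from leaf to air.  All rate
# constants (per hour) and the soil level are invented.

def simulate(c_soil=1.0, k_uptake=0.1, k_rs=0.05, k_sl=0.05, k_loss=0.02,
             dt=0.1, hours=500.0):
    """Forward-Euler integration; returns (root, stem, leaf) concentrations."""
    root = stem = leaf = 0.0
    for _ in range(int(hours / dt)):
        d_root = k_uptake * c_soil - k_rs * root
        d_stem = k_rs * root - k_sl * stem
        d_leaf = k_sl * stem - k_loss * leaf
        root += d_root * dt
        stem += d_stem * dt
        leaf += d_leaf * dt
    return root, stem, leaf

root, stem, leaf = simulate()
```

Run long enough, the system approaches the equilibrium the model also predicts analytically: each compartment settles at inflow rate divided by its loss rate (here root = 0.1/0.05 = 2, leaf = 0.05·2/0.02 = 5).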
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ho, P.; Johannes, J.; Kudriavtsev, V.
The use of computational modeling to improve equipment and process designs for chemical vapor deposition (CVD) reactors is becoming increasingly common. Commercial codes are available that facilitate the modeling of chemically-reacting flows, but chemical reaction mechanisms must be separately developed for each system of interest. One of the products of the Watkins-Johnson Company (WJ) is a reactor marketed to semiconductor manufacturers for the atmospheric-pressure chemical vapor deposition (APCVD) of silicon oxide films. In this process, TEOS (tetraethoxysilane, Si(OC₂H₅)₄) and ozone (O₃) are injected (in nitrogen and oxygen carrier gases) over hot silicon wafers that are being carried through the system on a moving belt. As part of their equipment improvement process, WJ is developing computational models of this tool. In this effort, they are collaborating with Sandia National Laboratories (SNL) to draw on Sandia's experience base in understanding and modeling the chemistry of CVD processes.
Exploring the Use of Multiple Analogical Models when Teaching and Learning Chemical Equilibrium
ERIC Educational Resources Information Center
Harrison, Allan G.; De Jong, Onno
2005-01-01
This study describes the multiple analogical models used to introduce and teach Grade 12 chemical equilibrium. We examine the teacher's reasons for using models, explain each model's development during the lessons, and analyze the understandings students derived from the models. A case study approach was used and the data were drawn from the…
Lei, Tailong; Sun, Huiyong; Kang, Yu; Zhu, Feng; Liu, Hui; Zhou, Wenfang; Wang, Zhe; Li, Dan; Li, Youyong; Hou, Tingjun
2017-11-06
Xenobiotic chemicals and their metabolites are mainly excreted out of our bodies by the urinary tract through the urine. Chemical-induced urinary tract toxicity is one of the main reasons that cause failure during drug development, and it is a common adverse event for medications, natural supplements, and environmental chemicals. Despite its importance, there are only a few in silico models for assessing urinary tract toxicity for a large number of compounds with diverse chemical structures. Here, we developed a series of qualitative and quantitative structure-activity relationship (QSAR) models for predicting urinary tract toxicity. In our study, the recursive feature elimination method incorporated with random forests (RFE-RF) was used for dimension reduction, and then eight machine learning approaches were used for QSAR modeling, i.e., relevance vector machine (RVM), support vector machine (SVM), regularized random forest (RRF), C5.0 trees, eXtreme gradient boosting (XGBoost), AdaBoost.M1, SVM boosting (SVMBoost), and RVM boosting (RVMBoost). For building classification models, the synthetic minority oversampling technique was used to handle the imbalance data set problem. Among all the machine learning approaches, SVMBoost based on the RBF kernel achieves both the best quantitative (external q² = 0.845) and qualitative predictions for the test set (MCC of 0.787, AUC of 0.893, sensitivity of 89.6%, specificity of 94.1%, and global accuracy of 90.8%). The application domains were then analyzed, and all of the tested chemicals fall within the application domain coverage. We also examined the structure features of the chemicals with large prediction errors. In brief, both the regression and classification models developed by the SVMBoost approach have reliable prediction capability for assessing chemical-induced urinary tract toxicity.
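The RFE step of the pipeline can be sketched generically: repeatedly score features and drop the weakest until the requested number remain. In this hedged sketch the importance measure is absolute Pearson correlation with the target rather than random-forest importance, and the feature names and data are invented.

```python
# Hedged sketch of recursive feature elimination: score features, drop the
# weakest, repeat.  Importance here is |Pearson r| with the target, a simple
# stand-in for random-forest importance; data are invented.
import math

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy) if sx and sy else 0.0

def rfe(X, y, keep):
    """X: {feature_name: list of values}; drop weakest until `keep` remain."""
    features = dict(X)
    while len(features) > keep:
        weakest = min(features, key=lambda f: abs(pearson(features[f], y)))
        del features[weakest]
    return sorted(features)

y = [1.0, 2.0, 3.0, 4.0, 5.0]
X = {
    "useful": [1.1, 2.0, 2.9, 4.2, 5.0],     # tracks y closely
    "noisy": [0.5, 0.4, 0.6, 0.5, 0.4],      # nearly constant
    "inverse": [5.0, 4.0, 3.0, 2.0, 1.0],    # perfectly anti-correlated
}
selected = rfe(X, y, keep=2)
```

Swapping in a model-based importance (as RFE-RF does) changes only the scoring line; the elimination loop is the same.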
Modelling of evaporation of a dispersed liquid component in a chemically active gas flow
NASA Astrophysics Data System (ADS)
Kryukov, V. G.; Naumov, V. I.; Kotov, V. Yu.
1994-01-01
A model has been developed to investigate evaporation of dispersed liquids in chemically active gas flow. Major efforts have been directed at the development of algorithms for implementing this model. The numerical experiments demonstrate that, in the boundary layer, significant changes in the composition and temperature of combustion products take place. This gives the opportunity to more correctly model energy release processes in combustion chambers of liquid-propellant rocket engines, gas-turbine engines, and other power devices.
Reliable models for assessing human exposures are important for understanding health risks from chemicals. The Stochastic Human Exposure and Dose Simulation model for multimedia, multi-route/pathway chemicals (SHEDS-Multimedia), developed by EPA’s Office of Research and Developm...
Chemical structure-based predictive model for methanogenic anaerobic biodegradation potential.
Meylan, William; Boethling, Robert; Aronson, Dallas; Howard, Philip; Tunkel, Jay
2007-09-01
Many screening-level models exist for predicting aerobic biodegradation potential from chemical structure, but anaerobic biodegradation generally has been ignored by modelers. We used a fragment contribution approach to develop a model for predicting biodegradation potential under methanogenic anaerobic conditions. The new model has 37 fragments (substructures) and classifies a substance as either fast or slow, relative to the potential to be biodegraded in the "serum bottle" anaerobic biodegradation screening test (Organization for Economic Cooperation and Development Guideline 311). The model correctly classified 90, 77, and 91% of the chemicals in the training set (n = 169) and two independent validation sets (n = 35 and 23), respectively. Accuracy of predictions of fast and slow degradation was equal for training-set chemicals, but fast-degradation predictions were less accurate than slow-degradation predictions for the validation sets. Analysis of the signs of the fragment coefficients for this and the other (aerobic) Biowin models suggests that in the context of simple group contribution models, the majority of positive and negative structural influences on ultimate degradation are the same for aerobic and methanogenic anaerobic biodegradation.
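A fragment-contribution classifier of this kind reduces to summing coefficients for the substructures present and thresholding the score. The fragments, coefficients, intercept, and threshold below are invented stand-ins, not the published 37-fragment model.

```python
# Toy fragment-contribution classifier: sum coefficients for each fragment
# present; above the cutoff -> "fast" anaerobic biodegradation.  All numbers
# are invented for illustration.

COEFFS = {                 # positive favours degradation, negative hinders it
    "ester": 0.6,
    "hydroxyl": 0.4,
    "aromatic_ring": -0.3,
    "halogen": -0.8,
}
INTERCEPT = 0.2
CUTOFF = 0.5

def classify(fragments):
    """fragments: list of substructure names found in the chemical."""
    score = INTERCEPT + sum(COEFFS.get(f, 0.0) for f in fragments)
    return "fast" if score > CUTOFF else "slow"

label_a = classify(["ester", "hydroxyl"])         # 0.2 + 0.6 + 0.4 = 1.2
label_b = classify(["halogen", "aromatic_ring"])  # 0.2 - 0.8 - 0.3 = -0.9
```

In the real model the coefficients are fit by regression against the screening-test outcomes, which is what lets the signs be compared against the aerobic Biowin models as described above.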
Wittwehr, Clemens; Aladjov, Hristo; Ankley, Gerald; Byrne, Hugh J; de Knecht, Joop; Heinzle, Elmar; Klambauer, Günter; Landesmann, Brigitte; Luijten, Mirjam; MacKay, Cameron; Maxwell, Gavin; Meek, M E Bette; Paini, Alicia; Perkins, Edward; Sobanski, Tomasz; Villeneuve, Dan; Waters, Katrina M; Whelan, Maurice
2017-02-01
Efforts are underway to transform regulatory toxicology and chemical safety assessment from a largely empirical science based on direct observation of apical toxicity outcomes in whole organism toxicity tests to a predictive one in which outcomes and risk are inferred from accumulated mechanistic understanding. The adverse outcome pathway (AOP) framework provides a systematic approach for organizing knowledge that may support such inference. Likewise, computational models of biological systems at various scales provide another means and platform to integrate current biological understanding to facilitate inference and extrapolation. We argue that the systematic organization of knowledge into AOP frameworks can inform and help direct the design and development of computational prediction models that can further enhance the utility of mechanistic and in silico data for chemical safety assessment. This concept was explored as part of a workshop on AOP-Informed Predictive Modeling Approaches for Regulatory Toxicology held September 24-25, 2015. Examples of AOP-informed model development and its application to the assessment of chemicals for skin sensitization and multiple modes of endocrine disruption are provided. The role of problem formulation, not only as a critical phase of risk assessment, but also as guide for both AOP and complementary model development is described. Finally, a proposal for actively engaging the modeling community in AOP-informed computational model development is made. The contents serve as a vision for how AOPs can be leveraged to facilitate development of computational prediction models needed to support the next generation of chemical safety assessment. © The Author 2016. Published by Oxford University Press on behalf of the Society of Toxicology.
Leclercq, Catherine; Arcella, Davide; Le Donne, Cinzia; Piccinelli, Raffaela; Sette, Stefania; Soggiu, Maria Eleonora
2003-04-11
To get a more realistic view of exposure to food chemicals, risk managers are becoming more interested in stochastic modelling as an alternative to deterministic approaches based on conservative assumptions. It makes it possible to take into account all the available information on the concentration of the chemical present in foods and on food consumption patterns. Within the EC-funded "Montecarlo" project, a comprehensive set of mathematical algorithms was developed to take into account all the necessary components for stochastic modelling of a variety of food chemicals, nutrients and ingredients. Appropriate computer software is being developed. Since the concentration of food chemicals may vary among different brands of the same product, consumer behaviour with respect to brands may have an impact on exposure assessments. Numeric experiments were carried out on different ways of incorporating indicators of market share and brand loyalty in the mathematical algorithms developed within the stochastic model of exposure to intense sweeteners from sugar-free beverages. The 95th percentiles of intake were shown to vary according to the inclusion/exclusion of these indicators. The market share should be included in the model especially if the market is not equitably distributed between brands. If brand loyalty data are not available, the model may be run under theoretical scenarios.
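The effect of market share on the intake percentiles can be illustrated with a minimal Monte Carlo sketch: each simulated consumer is assigned a brand according to market share (a fully brand-loyal scenario), brands differ in sweetener concentration, and the 95th percentile of daily intake is read off the simulated distribution. The brands, shares, concentrations, and consumption volume are invented.

```python
# Sketch of the stochastic intake idea: brand chosen by market share, intake
# = brand concentration x daily consumption.  All numbers are invented.
import random

BRANDS = [("brand_A", 0.7, 50.0),    # (name, market share, mg sweetener / L)
          ("brand_B", 0.3, 200.0)]

def simulate_intakes(n=20000, litres_per_day=0.5, seed=42):
    rng = random.Random(seed)
    names, shares, concs = zip(*BRANDS)
    intakes = []
    for _ in range(n):
        # fully brand-loyal consumer: one brand per person, weighted by share
        conc = rng.choices(concs, weights=shares)[0]
        intakes.append(conc * litres_per_day)
    return sorted(intakes)

intakes = simulate_intakes()
p95 = intakes[int(0.95 * len(intakes))]
```

Dropping the share weights (equitable market) or letting each consumption occasion re-draw the brand (no loyalty) shifts the upper percentiles, which is the sensitivity the numeric experiments above examined.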
“Impact of CB6 and CB05TU chemical mechanisms on air quality”
In this study, we incorporate the newly developed Carbon Bond chemical mechanism (CB6) into the Community Multiscale Air Quality modeling system (CMAQv5.0.1) and perform air quality model simulations with the CB6 and t...
Recent developments in broadly applicable structure-biodegradability relationships.
Jaworska, Joanna S; Boethling, Robert S; Howard, Philip H
2003-08-01
Biodegradation is one of the most important processes influencing concentration of a chemical substance after its release to the environment. It is the main process for removal of many chemicals from the environment and therefore is an important factor in risk assessments. This article reviews available methods and models for predicting biodegradability of organic chemicals from structure. The first section of the article briefly discusses current needs for biodegradability estimation methods related to new and existing chemicals and in the context of multimedia exposure models. Following sections include biodegradation test methods and endpoints used in modeling, with special attention given to the Japanese Ministry of International Trade and Industry test; a primer on modeling, describing the various approaches that have been used in the structure/biodegradability relationship work, and contrasting statistical and mechanistic approaches; and recent developments in structure/biodegradability relationships, divided into group contribution, chemometric, and artificial intelligence approaches.
Luo, Wen; Medrek, Sarah; Misra, Jatin; Nohynek, Gerhard J
2007-02-01
The objective of this study was to construct and validate a quantitative structure-activity relationship model for skin absorption. Such models are valuable tools for screening and prioritization in safety and efficacy evaluation, and risk assessment of drugs and chemicals. A database of 340 chemicals with percutaneous absorption data was assembled. Two models were derived from the training set consisting of 306 chemicals (90/10 random split). In addition to the experimental K(ow) values, over 300 2D and 3D atomic and molecular descriptors were analyzed using MDL's QsarIS computer program. Subsequently, the models were validated using both internal (leave-one-out) and external validation (test set) procedures. Using the stepwise regression analysis, three molecular descriptors were determined to have significant statistical correlation with K(p) (R² = 0.8225): logK(ow), X0 (quantification of both molecular size and the degree of skeletal branching), and SsssCH (count of aromatic carbon groups). In conclusion, two models to estimate skin absorption were developed. When compared to other skin absorption QSAR models in the literature, our model incorporated more chemicals and explored a large number of descriptors. Additionally, our models are reasonably predictive and have met both internal and external statistical validations.
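The core regression step can be sketched in one variable: fit log Kp as a linear function of log Kow by ordinary least squares and report R². The published model uses three descriptors chosen by stepwise selection; the data points below are invented.

```python
# One-descriptor OLS sketch: log Kp ~ a * log Kow + b.  Data are invented.

def ols(xs, ys):
    """Return (slope, intercept, r_squared) for y ~ slope*x + intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return slope, intercept, 1 - ss_res / ss_tot

log_kow = [0.5, 1.0, 2.0, 3.0, 4.0]
log_kp = [-3.2, -2.9, -2.2, -1.6, -0.9]   # invented permeability values
slope, intercept, r2 = ols(log_kow, log_kp)
pred = slope * 2.5 + intercept            # predicted log Kp at log Kow = 2.5
```

Adding the other two descriptors turns this into a multiple regression solved the same way on a 3-column design matrix; stepwise selection simply adds descriptors one at a time while the fit improves significantly.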
Hukkerikar, Amol Shivajirao; Kalakul, Sawitree; Sarup, Bent; Young, Douglas M; Sin, Gürkan; Gani, Rafiqul
2012-11-26
The aim of this work is to develop group-contribution(+) (GC(+)) method (combined group-contribution (GC) method and atom connectivity index (CI) method) based property models to provide reliable estimations of environment-related properties of organic chemicals together with uncertainties of estimated property values. For this purpose, a systematic methodology for property modeling and uncertainty analysis is used. The methodology includes a parameter estimation step to determine parameters of property models and an uncertainty analysis step to establish statistical information about the quality of parameter estimation, such as the parameter covariance, the standard errors in predicted properties, and the confidence intervals. For parameter estimation, large data sets of experimentally measured property values of a wide range of chemicals (hydrocarbons, oxygenated chemicals, nitrogenated chemicals, polyfunctional chemicals, etc.) taken from the database of the US Environmental Protection Agency (EPA) and from the database of USEtox are used. For property modeling and uncertainty analysis, the Marrero and Gani GC method and atom connectivity index method have been considered. In total, 22 environment-related properties, which include the fathead minnow 96-h LC(50), Daphnia magna 48-h LC(50), oral rat LD(50), aqueous solubility, bioconcentration factor, permissible exposure limit (OSHA-TWA), photochemical oxidation potential, global warming potential, ozone depletion potential, acidification potential, emission to urban air (carcinogenic and noncarcinogenic), emission to continental rural air (carcinogenic and noncarcinogenic), emission to continental fresh water (carcinogenic and noncarcinogenic), emission to continental seawater (carcinogenic and noncarcinogenic), emission to continental natural soil (carcinogenic and noncarcinogenic), and emission to continental agricultural soil (carcinogenic and noncarcinogenic) have been modeled and analyzed.
The application of the developed property models for the estimation of environment-related properties and uncertainties of the estimated property values is highlighted through an illustrative example. The developed property models provide reliable estimates of environment-related properties needed to perform process synthesis, design, and analysis of sustainable chemical processes and allow one to evaluate the effect of uncertainties of estimated property values on the calculated performance of processes giving useful insights into quality and reliability of the design of sustainable processes.
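The parameter-estimation-plus-uncertainty workflow can be illustrated with the smallest possible case: a one-parameter group-contribution model y = a·x (x = group count) fit by least squares, with a standard error and an approximate 95% confidence interval on the estimated contribution. The data and the normal-quantile approximation (1.96 instead of a t-value) are illustrative.

```python
# Minimal parameter estimation with uncertainty: least squares through the
# origin, standard error of the slope, approximate 95% CI.  Data invented.
import math

def fit_with_uncertainty(xs, ys):
    """Fit y = a*x; return (a_hat, std_err, ci_low, ci_high)."""
    sxx = sum(x * x for x in xs)
    a_hat = sum(x * y for x, y in zip(xs, ys)) / sxx
    n = len(xs)
    residual_var = sum((y - a_hat * x) ** 2 for x, y in zip(xs, ys)) / (n - 1)
    std_err = math.sqrt(residual_var / sxx)
    return a_hat, std_err, a_hat - 1.96 * std_err, a_hat + 1.96 * std_err

# invented measurements of a property vs. number of CH2 groups
xs = [1, 2, 3, 4, 5, 6]
ys = [2.1, 3.9, 6.2, 8.1, 9.8, 12.2]
a_hat, std_err, lo, hi = fit_with_uncertainty(xs, ys)
```

With many groups the same idea generalises: the parameter covariance matrix comes from the residual variance and the inverse of the design-matrix cross-product, which is the statistical information the methodology above propagates into standard errors of predicted properties.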
Good, Kevin; Winkel, David; VonNiederhausern, Michael; Hawkins, Brian; Cox, Jessica; Gooding, Rachel; Whitmire, Mark
2013-06-01
The Chemical Terrorism Risk Assessment (CTRA) and Chemical Infrastructure Risk Assessment (CIRA) are programs that estimate the risk of chemical terrorism attacks to help inform and improve the US defense posture against such events. One aspect of these programs is the development and advancement of a Medical Mitigation Model, a mathematical model that simulates the medical response to a chemical terrorism attack and estimates the resulting number of saved or benefited victims. At the foundation of the CTRA/CIRA Medical Mitigation Model is the concept of stock-and-flow modeling; "stocks" are states that individuals progress through during the event, while "flows" permit and govern movement from one stock to another. Using this approach, the model is able to simulate and track individual victims as they progress from exposure to an end state. Some of the considerations in the model include chemical used, type of attack, route and severity of exposure, response-related delays, detailed treatment regimens with efficacy defined as a function of time, medical system capacity, the influx of worried well individuals, and medical countermeasure availability. As will be demonstrated, the output of the CTRA/CIRA Medical Mitigation Model makes it possible to assess the effectiveness of the existing public health response system and develop and examine potential improvement strategies. Such a modeling and analysis capability can be used to inform first-responder actions/training, guide policy decisions, justify resource allocation, and direct knowledge-gap studies.
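The stock-and-flow mechanics described above can be sketched in a few lines: stocks hold victim counts, and each time step moves a rate-proportional flow between them. The states and rates below are illustrative assumptions, not CTRA/CIRA parameters:

```python
# Toy stock-and-flow simulation: victims move exposed -> in_treatment ->
# recovered/deceased. Flows are rate * source stock * dt (explicit Euler).
def simulate(exposed=1000.0, steps=100, dt=1.0,
             treat_rate=0.10, recover_rate=0.08, death_rate=0.02):
    stocks = {"exposed": exposed, "in_treatment": 0.0,
              "recovered": 0.0, "deceased": 0.0}
    for _ in range(steps):
        to_treat = treat_rate * stocks["exposed"] * dt
        to_recover = recover_rate * stocks["in_treatment"] * dt
        to_die = death_rate * stocks["in_treatment"] * dt
        stocks["exposed"] -= to_treat
        stocks["in_treatment"] += to_treat - to_recover - to_die
        stocks["recovered"] += to_recover
        stocks["deceased"] += to_die
    return stocks

final = simulate()
# Flows only move victims between stocks, so the total is conserved.
print(round(sum(final.values()), 1))  # -> 1000.0
```

The real model layers time-varying treatment efficacy, capacity limits, and worried-well inflows onto this same skeleton.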
NASA Astrophysics Data System (ADS)
Chang, Hsin-Yi; Chang, Hsiang-Chi
2013-08-01
In this study, we developed online critiquing activities using an open-source computer learning environment. We investigated how well the activities scaffolded students to critique molecular models of chemical reactions made by scientists, peers, and a fictitious peer, and whether the activities enhanced the students' understanding of science models and chemical reactions. The activities were implemented in an eighth-grade class with 28 students in a public junior high school in southern Taiwan. The study employed mixed research methods. Data collected included pre- and post-instructional assessments, post-instructional interviews, and students' electronic written responses and oral discussions during the critiquing activities. The results indicated that these activities guided the students to produce overall quality critiques. Also, the students developed a more sophisticated understanding of chemical reactions and scientific models as a result of the intervention. Design considerations for effective model critiquing activities are discussed based on observational results, including the use of peer-generated artefacts for critiquing to promote motivation and collaboration, coupled with critiques of scientific models to enhance students' epistemological understanding of model purpose and communication.
McNamara, C; Naddy, B; Rohan, D; Sexton, J
2003-10-01
The Monte Carlo computational system for stochastic modelling of dietary exposure to food chemicals and nutrients is presented. This system was developed through a European Commission-funded research project. It is accessible as a Web-based application service. The system allows and supports very significant complexity in the data sets used as the model input, but provides a simple, general purpose, linear kernel for model evaluation. Specific features of the system include the ability to enter (arbitrarily) complex mathematical or probabilistic expressions at each and every input data field, automatic bootstrapping on subjects and on subject food intake diaries, and custom kernels to apply brand information such as market share and loyalty to the calculation of food and chemical intake.
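The basic Monte Carlo kernel of such a dietary-exposure system samples intake and concentration distributions and accumulates a daily dose; a minimal sketch in which both distributions are illustrative assumptions, not data from the project:

```python
# Toy Monte Carlo dietary exposure: daily intake (mg/day) is sampled as
# food consumed (g/day) times chemical concentration (mg/g).
import random

def simulate_intake(n=10000, seed=42):
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        food_g = max(0.0, rng.gauss(200.0, 50.0))   # g of food per day
        conc = rng.lognormvariate(-7.0, 0.5)        # mg chemical per g food
        samples.append(food_g * conc)
    return samples

intakes = sorted(simulate_intake())
mean = sum(intakes) / len(intakes)
p95 = intakes[int(0.95 * len(intakes))]
# Exposure distributions are right-skewed: the upper tail exceeds the mean.
print(p95 > mean)
```

The full system generalizes this kernel with user-entered probabilistic expressions at every input field, subject-diary bootstrapping, and brand-share weighting.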
CADASTER QSPR Models for Predictions of Melting and Boiling Points of Perfluorinated Chemicals.
Bhhatarai, Barun; Teetz, Wolfram; Liu, Tao; Öberg, Tomas; Jeliazkova, Nina; Kochev, Nikolay; Pukalov, Ognyan; Tetko, Igor V; Kovarich, Simona; Papa, Ester; Gramatica, Paola
2011-03-14
Quantitative structure-property relationship (QSPR) studies of the melting point (MP) and boiling point (BP) of per- and polyfluorinated chemicals (PFCs) are presented. The training and prediction chemicals used for developing and validating the models were selected from the Syracuse PhysProp database and the literature. The available experimental data sets were split in two different ways: a) random selection on response value, and b) structural similarity verified by self-organizing map (SOM), in order to propose reliable predictive models, developed only on the training sets and externally verified on the prediction sets. Individual models based on linear and non-linear approaches, developed by different CADASTER partners using 0D-2D Dragon descriptors, E-state descriptors, and fragment-based descriptors, as well as a consensus model, are presented together with their predictions. In addition, the predictive performance of the developed models was verified on a blind external validation set (EV-set) prepared using the PERFORCE database, comprising 15 MP and 25 BP data points. This database contains only long-chain perfluoroalkylated chemicals, which are particularly monitored by regulatory agencies such as US-EPA and EU-REACH. QSPR models with internal and external validation on two different external prediction/validation sets, and a study of the applicability domain highlighting the robustness and high accuracy of the models, are discussed. Finally, MPs for an additional 303 PFCs and BPs for 271 PFCs, for which experimental measurements are unavailable, were predicted. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
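The train/external-validation discipline at the heart of this QSPR workflow can be sketched with a one-descriptor least-squares model: fit only on the training set, then score on held-out chemicals. The chain-length/boiling-point numbers below are fabricated for illustration, not PhysProp or PERFORCE data:

```python
# Toy QSPR: ordinary least squares on one descriptor, fitted on a training
# set and evaluated (external R^2) on a separate prediction set.
def fit_ols(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b  # intercept, slope

train_x, train_y = [4, 6, 8, 10], [30, 56, 82, 108]  # hypothetical
test_x, test_y = [5, 7, 9], [43, 69, 95]             # hypothetical

a, b = fit_ols(train_x, train_y)
pred = [a + b * x for x in test_x]
ss_res = sum((p - y) ** 2 for p, y in zip(pred, test_y))
ss_tot = sum((y - sum(test_y) / 3) ** 2 for y in test_y)
print(round(1 - ss_res / ss_tot, 3))  # external R^2 -> 1.0
```

Real QSPR models use hundreds of descriptors and, as in the paper, must additionally check that prediction chemicals fall inside the model's applicability domain.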
This commentary provides an overview of the challenges that arise from applying molecular modeling tools developed and commonly used for pharmaceutical discovery to the problem of predicting the potential toxicities of environmental chemicals.
NASA Astrophysics Data System (ADS)
Jorba, O.; Pérez, C.; Baldasano, J. M.
2009-04-01
Chemical processes in air quality modelling systems are usually treated independently from the meteorological models. This approach is computationally attractive since off-line chemical transport simulations only require a single meteorological dataset to produce many chemical simulations. However, this separation of chemistry and meteorology produces a loss of important information about atmospheric processes and does not allow for feedbacks between chemistry and meteorology. To take such processes into account, current models are evolving toward an online coupling of chemistry and meteorology to produce consistent chemical weather predictions. The Earth Sciences Department of the Barcelona Supercomputing Center (BSC) develops NMMB/BSC-DUST (Pérez et al., 2008), an online dust model within the global-regional NCEP/NMMB numerical weather prediction model (Janjic and Black, 2007) under development at the National Centers for Environmental Prediction (NCEP). The current implementation is based on the well-established regional dust model and forecast system DREAM (Nickovic et al., 2001). The most relevant characteristics of NMMB/BSC-DUST are the online coupling of the dust scheme with the meteorological driver, the wide range of applications from meso to global scales, and the inclusion of dust radiative effects allowing feedbacks between aerosols and meteorology. To complement this development, BSC is also working on the implementation of a fully coupled online chemical mechanism within NMMB/BSC-DUST. The final objective is to develop a fully coupled chemical weather prediction system able to resolve gas-aerosol-meteorology interactions from global to local scales. In this contribution we present the design of the chemistry coupling and the current progress of its implementation. Following the NCEP/NMMB approach, the chemistry part will be coupled through the Earth System Modeling Framework (ESMF) as a pluggable component.
The chemical mechanism and chemistry solver are based on the Kinetic PreProcessor (KPP) package (Sandu and Sander, 2006), with the main purpose of maintaining wide flexibility when configuring the model. Such an approach will allow using a simple general chemical mechanism for global applications or a more complex mechanism for regional to local applications at higher resolution. REFERENCES: Janjic, Z.I., and Black, T.L., 2007. An ESMF unified model for a broad range of spatial and temporal scales. Geophysical Research Abstracts, 9, 05025. Nickovic, S., Papadopoulos, A., Kakaliagou, O., and Kallos, G., 2001. A model for prediction of desert dust cycle in the atmosphere. J. Geophys. Res., 106, 18113-18129. Pérez, C., Haustein, K., Janjic, Z.I., Jorba, O., Baldasano, J.M., Black, T.L., and Nickovic, S., 2008. An online dust model within the meso to global NMMB: current progress and plans. AGU Fall Meeting, San Francisco, A41K-03. Sandu, A., and Sander, R., 2006. Technical note: Simulating chemical systems in Fortran90 and Matlab with the Kinetic PreProcessor KPP-2.1. Atmos. Chem. Phys., 6, 187-195.
Predicting Drug-induced Hepatotoxicity Using QSAR and Toxicogenomics Approaches
Low, Yen; Uehara, Takeki; Minowa, Yohsuke; Yamada, Hiroshi; Ohno, Yasuo; Urushidani, Tetsuro; Sedykh, Alexander; Muratov, Eugene; Fourches, Denis; Zhu, Hao; Rusyn, Ivan; Tropsha, Alexander
2014-01-01
Quantitative Structure-Activity Relationship (QSAR) modeling and toxicogenomics are used independently as predictive tools in toxicology. In this study, we evaluated the power of several statistical models for predicting drug hepatotoxicity in rats using different descriptors of drug molecules, namely their chemical descriptors and toxicogenomic profiles. The records were taken from the Toxicogenomics Project rat liver microarray database containing information on 127 drugs (http://toxico.nibio.go.jp/datalist.html). The model endpoint was hepatotoxicity in the rat following 28 days of exposure, established by liver histopathology and serum chemistry. First, we developed multiple conventional QSAR classification models using a comprehensive set of chemical descriptors and several classification methods (k nearest neighbor, support vector machines, random forests, and distance weighted discrimination). With chemical descriptors alone, external predictivity (Correct Classification Rate, CCR) from 5-fold external cross-validation was 61%. Next, the same classification methods were employed to build models using only toxicogenomic data (24h after a single exposure) treated as biological descriptors. The optimized models used only 85 selected toxicogenomic descriptors and had CCR as high as 76%. Finally, hybrid models combining both chemical descriptors and transcripts were developed; their CCRs were between 68 and 77%. Although the accuracy of hybrid models did not exceed that of the models based on toxicogenomic data alone, the use of both chemical and biological descriptors enriched the interpretation of the models. In addition to finding 85 transcripts that were predictive and highly relevant to the mechanisms of drug-induced liver injury, chemical structural alerts for hepatotoxicity were also identified. 
These results suggest that concurrent exploration of the chemical features and acute treatment-induced changes in transcript levels will both enrich the mechanistic understanding of sub-chronic liver injury and afford models capable of accurate prediction of hepatotoxicity from chemical structure and short-term assay results. PMID:21699217
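The Correct Classification Rate (CCR) quoted above is commonly computed in QSAR studies as the mean of the per-class accuracies, so it is not inflated by class imbalance the way plain accuracy is. A minimal sketch with made-up labels:

```python
# CCR as the average of per-class accuracies (balanced accuracy).
def ccr(y_true, y_pred):
    per_class = []
    for c in set(y_true):
        idx = [i for i, y in enumerate(y_true) if y == c]
        correct = sum(1 for i in idx if y_pred[i] == c)
        per_class.append(correct / len(idx))
    return sum(per_class) / len(per_class)

# 9 non-toxic (8 classified right), 3 toxic (2 right): plain accuracy is
# 10/12 ~ 0.83, but CCR averages 8/9 and 2/3.
y_true = [0] * 9 + [1] * 3
y_pred = [0] * 8 + [1] + [1] * 2 + [0]
score = ccr(y_true, y_pred)
print(round(score, 3))  # -> 0.778
```

In the 5-fold external cross-validation described above, this metric is computed on each held-out fold and averaged.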
Biodegradation of organic chemicals in soil/water microcosms system: Model development
Liu, L.; Tindall, J.A.; Friedel, M.J.; Zhang, W.
2007-01-01
The chemical interactions of hydrophobic organic contaminants with soils and sediments may result in strong binding and slow subsequent release rates that significantly affect remediation rates and endpoints. In order to illustrate the recalcitrance of chemicals to degradation at contaminated sites, a sorption mechanism of intraparticle sequestration was postulated to operate at chemical remediation sites. Pseudo-first-order sequestration kinetics are used in the study, with the hypothesis that sequestration is an irreversible, surface-mediated process. A mathematical model based on mass balance equations was developed to describe the fate of chemical degradation in soil/water microcosm systems. In the model, diffusion was represented by Fick's second law, local sorption-desorption by a linear isotherm, irreversible sequestration by pseudo-first-order kinetics, and biodegradation by Monod kinetics. Solutions were obtained to provide estimates of chemical concentrations. The mathematical model was applied to a benzene biodegradation batch test, and simulated model responses correlated well with measurements of benzene biodegradation in the batch soil/water microcosm system. A sensitivity analysis was performed to assess the effects of several parameters on model behavior. Overall chemical removal rate decreased and sequestration increased quickly with an increase in the sorption partition coefficient. When soil particle radius, a, was greater than 1 mm, an increase in radius produced a significant decrease in overall chemical removal rate as well as an increase in sequestration. However, when soil particle radius was less than 0.1 mm, an increase in radius resulted in small changes in the removal rate and sequestration. As the pseudo-first-order sequestration rate increased, both chemical removal rate and sequestration increased slightly.
Model simulation results showed that desorption resistance played an important role in the bioavailability of organic chemicals in porous media. Complete biostabilization of chemicals at remediation sites can be achieved when the concentration of the reversibly sorbed chemical is reduced to zero (i.e., undetectable), with a certain amount of irreversibly sequestered chemical left inside the soil particle solid phase. © 2006 Springer Science + Business Media B.V.
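Setting aside the intraparticle diffusion and sorption terms, the batch-scale interplay of Monod biodegradation and pseudo-first-order sequestration can be sketched with explicit Euler integration; all rate constants below are illustrative, not the paper's fitted values:

```python
# Well-mixed batch sketch: dc/dt = -Vmax*c/(Ks + c) - k_seq*c, where the
# Monod term is biodegradation and the linear term is irreversible
# sequestration into the particle interior.
def simulate(c0=10.0, vmax=0.5, ks=2.0, k_seq=0.01, dt=0.01, t_end=50.0):
    c, seq = c0, 0.0
    for _ in range(int(t_end / dt)):
        bio = vmax * c / (ks + c)   # Monod biodegradation rate
        dseq = k_seq * c            # pseudo-first-order sequestration rate
        c = max(0.0, c - (bio + dseq) * dt)
        seq += dseq * dt
    return c, seq

c_final, sequestered = simulate()
print(round(c_final, 3), round(sequestered, 2))
```

The aqueous concentration is driven near zero while a small sequestered fraction remains locked in the solid phase, mirroring the "biostabilization" endpoint described above; the full model couples this to Fick's second law for radial diffusion inside particles.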
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brooks, Kriston P.; Sprik, Samuel J.; Tamburello, David A.
The U.S. Department of Energy (DOE) has developed a vehicle framework model to simulate fuel cell-based light-duty vehicle operation for various hydrogen storage systems. This transient model simulates the performance of the storage system, fuel cell, and vehicle for comparison to DOE's Technical Targets using four drive cycles/profiles. Chemical hydrogen storage models have been developed for the Framework model for both exothermic and endothermic materials. Despite the utility of such models, they require that material researchers input system design specifications that cannot be easily estimated. To address this challenge, a design tool has been developed that allows researchers to directly enter kinetic and thermodynamic chemical hydrogen storage material properties into a simple sizing module that then estimates the system parameters required to run the storage system model. Additionally, this design tool can be used as a standalone executable file to estimate the storage system mass and volume outside of the framework model and compare it to the DOE Technical Targets. These models will be explained and exercised with existing hydrogen storage materials.
INTEGRATED CHEMICAL INFORMATION TECHNOLOGIES ...
A central regulatory mandate of the Environmental Protection Agency, spanning many Program Offices and issues, is to assess the potential health and environmental risks of large numbers of chemicals released into the environment, often in the absence of relevant test data. Models for predicting potential adverse effects of chemicals based primarily on chemical structure play a central role in prioritization and screening strategies yet are highly dependent and conditional upon the data used for developing such models. Hence, limits on data quantity, quality, and availability are considered by many to be the largest hurdles to improving prediction models in diverse areas of toxicology. Generation of new toxicity data for additional chemicals and endpoints, development of new high-throughput, mechanistically relevant bioassays, and increased generation of genomics and proteomics data that can clarify relevant mechanisms will all play important roles in improving future SAR prediction models. The potential for much greater immediate gains, across large domains of chemical and toxicity space, comes from maximizing the ability to mine and model useful information from existing toxicity data, data that represent huge past investment in research and testing expenditures. In addition, the ability to place newer “omics” data, data that potentially span many possible domains of toxicological effects, in the broader context of historical data is the means for opti
Toxicokinetic Triage for Environmental Chemicals | Science ...
Toxicokinetic (TK) models are essential for linking administered doses to blood and tissue concentrations. In vitro-to-in vivo extrapolation (IVIVE) methods have been developed to determine TK from limited in vitro measurements and chemical structure-based property predictions, providing a less resource–intensive alternative to traditional in vivo TK approaches. High throughput TK (HTTK) methods use IVIVE to estimate doses that produce steady-state plasma concentrations equivalent to those producing biological activity in in vitro screening studies (e.g., ToxCast). In this study, the domain of applicability and assumptions of HTTK approaches were evaluated using both in vivo data and simulation analysis. Based on in vivo data for 87 chemicals, specific properties (e.g., in vitro HTTK data, physico-chemical descriptors, chemical structure, and predicted transporter affinities) were identified that correlate with poor HTTK predictive ability. For 350 xenobiotics with literature HTTK data, we then differentiated those xenobiotics for which HTTK approaches are likely to be sufficient, from those that may require additional data. For 272 chemicals we also developed a HT physiologically-based TK (HTPBTK) model that requires somewhat greater information than a steady-state model, but allows non-steady state dynamics and can predict chemical concentration time-courses for a variety of exposure scenarios, tissues, and species. We used this HTPBTK model to show that the
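The steady-state logic behind HTTK screening can be illustrated with the standard analytic expression Css = dose rate / total clearance, combining renal filtration of the unbound fraction with the well-stirred hepatic model. The parameter values and example chemical below are assumptions for illustration, not httk defaults:

```python
# Analytic steady-state plasma concentration for constant oral dosing:
# Css = dose rate / (renal clearance + hepatic clearance), with
# CL_renal = GFR * fub and the well-stirred liver model
# CL_hep = Q_l * fub * CLint / (Q_l + fub * CLint).
def css_uM(dose_umol_per_day, fub, clint_L_per_day,
           gfr_L_per_day=110.0, q_liver_L_per_day=1900.0):
    cl_renal = gfr_L_per_day * fub
    cl_hep = (q_liver_L_per_day * fub * clint_L_per_day /
              (q_liver_L_per_day + fub * clint_L_per_day))
    return dose_umol_per_day / (cl_renal + cl_hep)

# hypothetical chemical: 1 mg/kg/day for 70 kg at MW 200 -> 350 umol/day
css = css_uM(350.0, fub=0.1, clint_L_per_day=500.0)
print(round(css, 3))  # -> 5.861
```

The triage question posed in the abstract is precisely when this steady-state shortcut suffices and when the full HTPBTK time-course model is needed instead.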
Bakire, Serge; Yang, Xinya; Ma, Guangcai; Wei, Xiaoxuan; Yu, Haiying; Chen, Jianrong; Lin, Hongjun
2018-01-01
Organic chemicals in the aquatic ecosystem may inhibit algae growth and subsequently lead to the decline of primary productivity. Growth inhibition tests are required for ecotoxicological assessments for regulatory purposes. In silico studies play an important role in replacing or reducing animal tests and decreasing experimental expense because of their efficiency. In this work, a series of theoretical models was developed for predicting algal growth inhibition (log EC50) after 72 h of exposure to diverse chemicals. In total, 348 organic compounds were classified into five modes of toxic action using the Verhaar scheme. Each model was established using molecular descriptors that characterize electronic and structural properties. External validation and leave-one-out cross-validation proved the statistical robustness of the derived models. Thus, they can be used to predict log EC50 values for chemicals that lack authoritative algal growth inhibition values (72 h). This work systematically studied algal growth inhibition according to toxic modes, and the developed model suite covers all five toxic modes. The outcome of this research will promote toxic mechanism analysis and is applicable to structurally diverse chemicals. Copyright © 2017 Elsevier Ltd. All rights reserved.
Development of a Design Tool for Planning Aqueous Amendment Injection Systems
2012-08-01
Table-of-contents fragments (abstract not captured): Chemical Oxidation with Permanganate (MnO4-); Implementation Issues; SS Design Tool Development and Evaluation; Numerical Modeling of Permanganate Distribution; CDISCO Development and Evaluation.
Computational Modeling and Simulation of Genital Tubercle Development
Hypospadias is a developmental defect of urethral tube closure that has a complex etiology involving genetic and environmental factors, including anti-androgenic and estrogenic disrupting chemicals; however, little is known about the morphoregulatory consequences of androgen/estrogen balance during genital tubercle (GT) development. Computer models that predictively model sexual dimorphism of the GT may provide a useful resource to translate chemical-target bipartite networks and their developmental consequences across the human-relevant chemical universe. Here, we describe a multicellular agent-based model of genital tubercle (GT) development that simulates urethrogenesis from the sexually-indifferent urethral plate stage to urethral tube closure. The prototype model, constructed in CompuCell3D, recapitulates key aspects of GT morphogenesis controlled by SHH, FGF10, and androgen pathways through modulation of stochastic cell behaviors, including differential adhesion, motility, proliferation, and apoptosis. Proper urethral tube closure in the model was shown to depend quantitatively on SHH- and FGF10-induced effects on mesenchymal proliferation and epithelial apoptosis, both ultimately linked to androgen signaling. In the absence of androgen, GT development was feminized and with partial androgen deficiency, the model resolved with incomplete urethral tube closure, thereby providing an in silico platform for probabilistic prediction of hypospadias risk across c
Public Databases Supporting Computational Toxicology
A major goal of the emerging field of computational toxicology is the development of screening-level models that predict potential toxicity of chemicals from a combination of mechanistic in vitro assay data and chemical structure descriptors. In order to build these models, resea...
GREENSCOPE: A Method for Modeling Chemical Process Sustainability
Current work within the U.S. Environmental Protection Agency’s National Risk Management Research Laboratory is focused on the development of a method for modeling chemical process sustainability. The GREENSCOPE methodology, defined for the four bases of Environment, Economics, Ef...
Aznar, Margarita; López, Ricardo; Cacho, Juan; Ferreira, Vicente
2003-04-23
Partial least squares regression (PLSR) models able to predict some of the wine aroma nuances from its chemical composition have been developed. The aromatic sensory characteristics of 57 Spanish aged red wines were determined by 51 experts from the wine industry. The individual descriptions given by the experts were recorded, and the frequency with which a sensory term was used to define a given wine was taken as a measurement of its intensity. The aromatic chemical composition of the wines was determined by already published gas chromatography (GC)-flame ionization detector and GC-mass spectrometry methods. In all, 69 odorants were analyzed. Both matrices, the sensory and chemical data, were simplified by grouping and rearranging correlated sensory terms or chemical compounds and by the exclusion of secondary aroma terms or of weak aroma chemicals. Finally, models were developed for 18 sensory terms and 27 chemicals or groups of chemicals. Satisfactory models, explaining more than 45% of the original variance, could be found for nine of the most important sensory terms (wood-vanillin-cinnamon, animal-leather-phenolic, toasted-coffee, old wood-reduction, vegetal-pepper, raisin-flowery, sweet-candy-cacao, fruity, and berry fruit). For this set of terms, the correlation coefficients between the measured and predicted Y (determined by cross-validation) ranged from 0.62 to 0.81. Models confirmed the existence of complex multivariate relationships between chemicals and odors. In general, pleasant descriptors were positively correlated to chemicals with pleasant aroma, such as vanillin, beta-damascenone, or (E)-beta-methyl-gamma-octalactone, and negatively correlated to compounds showing less favorable odor properties, such as 4-ethyl and vinyl phenols, 3-(methylthio)-1-propanol, or phenylacetaldehyde.
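A one-component PLS regression of the kind used here can be written in a few lines of pure Python (NIPALS-style: project the centered descriptors onto a weight vector derived from the covariance with the response, then regress on that score). The odorant/sensory numbers below are fabricated so the example stays checkable:

```python
# One-component PLS1: weights w ~ X'y, scores t = Xw, coefficient
# b = t'y / t't; prediction is y_mean + b * (x - x_mean) . w.
import math

def pls1_fit(X, y):
    n, p = len(X), len(X[0])
    xm = [sum(row[j] for row in X) / n for j in range(p)]
    ym = sum(y) / n
    Xc = [[row[j] - xm[j] for j in range(p)] for row in X]
    yc = [v - ym for v in y]
    w = [sum(Xc[i][j] * yc[i] for i in range(n)) for j in range(p)]
    norm = math.sqrt(sum(v * v for v in w))
    w = [v / norm for v in w]
    t = [sum(Xc[i][j] * w[j] for j in range(p)) for i in range(n)]
    b = sum(ti * yi for ti, yi in zip(t, yc)) / sum(ti * ti for ti in t)
    return xm, ym, w, b

def pls1_predict(model, x):
    xm, ym, w, b = model
    t = sum((xi - mi) * wi for xi, mi, wi in zip(x, xm, w))
    return ym + b * t

X = [[1, 2], [2, 4], [3, 6], [4, 8]]  # two correlated "odorant" levels
y = [3, 6, 9, 12]                     # sensory term frequency (hypothetical)
model = pls1_fit(X, y)
pred = pls1_predict(model, [5, 10])
print(round(pred, 2))  # -> 15.0
```

Collinear descriptors, ubiquitous in aroma chemistry (co-extracted compounds rise and fall together), are exactly the case where PLS outperforms ordinary least squares; the wine models above simply use more components and cross-validate their number.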
Predicting SVOC Emissions into Air and Foods in Support of ...
The release of semi-volatile organic compounds (SVOCs) from consumer articles may be a critical human exposure pathway. In addition, the migration of SVOCs from food packaging materials into foods may also be a dominant source of exposure for some chemicals. Here we describe recent efforts to characterize emission-related parameters for these exposure pathways to support prediction of aggregate exposures for thousands of chemicals For chemicals in consumer articles, Little et al. (2012) developed a screening-level indoor exposure prediction model which, for a given SVOC, principally depends on steady-state gas-phase concentrations (y0). We have developed a model that predicts y0 for SVOCs in consumer articles, allowing exposure predictions for 274 ToxCast chemicals. Published emissions data for 31 SVOCs found in flooring materials, provided a training set where both chemical-specific physicochemical properties, article specific formulation properties, and experimental design aspects were available as modeling descriptors. A linear regression yielded R2- and p- values of approximately 0.62 and 3.9E-05, respectively. A similar model was developed based upon physicochemical properties alone, since article information is often not available for a given SVOC or product. This latter model yielded R2 - and p- values of approximately 0.47 and 1.2E-10, respectively. Many SVOCs are also used as additives (e.g. plasticizers, antioxidants, lubricants) in plastic food pac
Liu, Zhichao; Kelly, Reagan; Fang, Hong; Ding, Don; Tong, Weida
2011-07-18
The primary testing strategy to identify nongenotoxic carcinogens largely relies on the 2-year rodent bioassay, which is time-consuming and labor-intensive. There is an increasing effort to develop alternative approaches to prioritize chemicals for, supplement, or even replace the cancer bioassay. In silico approaches based on quantitative structure-activity relationships (QSAR) are rapid and inexpensive and thus have been investigated for such purposes. A slightly more expensive approach based on short-term animal studies with toxicogenomics (TGx) represents another attractive option for this application. Thus, the primary questions are how much better the predictive performance of short-term TGx models is compared to that of QSAR models, and what length of exposure is sufficient for high-quality prediction based on TGx. In this study, we developed predictive models for rodent liver carcinogenicity using gene expression data generated from short-term animal models at different time points and QSAR. The study was focused on the prediction of nongenotoxic carcinogenicity since the genotoxic chemicals can be inexpensively removed from further development using various in vitro assays individually or in combination. We identified 62 chemicals whose hepatocarcinogenic potential was available from the National Center for Toxicological Research liver cancer database (NCTRlcdb). The gene expression profiles of liver tissue obtained from rats treated with these chemicals at different time points (1 day, 3 days, and 5 days) are available from the Gene Expression Omnibus (GEO) database. Both TGx and QSAR models were developed on the basis of the same set of chemicals using the same modeling approach, a nearest-centroid method with minimum-redundancy maximum-relevancy feature selection, with performance assessed using compound-based 5-fold cross-validation. We found that the TGx models outperformed QSAR in every aspect of modeling.
For example, the TGx models' predictive accuracy (0.77, 0.77, and 0.82 for the 1-day, 3-day, and 5-day models, respectively) was much higher for an independent validation set than that of a QSAR model (0.55). Permutation tests confirmed the statistical significance of the model's prediction performance. The study concluded that a short-term 5-day TGx animal model holds the potential to predict nongenotoxic hepatocarcinogenicity. © 2011 American Chemical Society
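The nearest-centroid method used for both the TGx and QSAR models above classifies a sample by the closest class mean in descriptor space; a minimal sketch with made-up two-descriptor data:

```python
# Nearest-centroid classifier: fit = average descriptors per class,
# predict = label of the nearest centroid (squared Euclidean distance).
def fit_centroids(X, y):
    cents = {}
    for c in set(y):
        rows = [x for x, yi in zip(X, y) if yi == c]
        cents[c] = [sum(col) / len(rows) for col in zip(*rows)]
    return cents

def predict(cents, x):
    def d2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(cents, key=lambda c: d2(cents[c], x))

X = [[0.1, 0.2], [0.2, 0.1], [0.9, 1.0], [1.0, 0.8]]  # hypothetical descriptors
y = ["noncarcinogen", "noncarcinogen", "carcinogen", "carcinogen"]
cents = fit_centroids(X, y)
label = predict(cents, [0.85, 0.9])
print(label)  # -> carcinogen
```

In the study, the descriptors fed to this classifier are either chemical descriptors (QSAR), selected transcript levels (TGx), or both (hybrid), which is what makes the head-to-head comparison clean.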
A FRAMEWORK TO DESIGN AND OPTIMIZE CHEMICAL FLOODING PROCESSES
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mojdeh Delshad; Gary A. Pope; Kamy Sepehrnoori
2005-07-01
The goal of this proposed research is to provide an efficient and user-friendly simulation framework for screening and optimizing chemical/microbial enhanced oil recovery processes. The framework will include (1) a user-friendly interface to identify the variables that have the most impact on oil recovery using the concept of experimental design and response surface maps, (2) the UTCHEM reservoir simulator to perform the numerical simulations, and (3) an economic model that automatically imports the simulation production data to evaluate the profitability of a particular design. Such a reservoir simulation framework is not currently available to the oil industry. The objectives of Task 1 are to develop three primary modules representing reservoir, chemical, and well data. The modules will be interfaced with an already available experimental design model. The objective of Task 2 is to incorporate the UTCHEM reservoir simulator and the modules with the strategic variables and to develop the response surface maps to identify the significant variables from each module. The objective of Task 3 is to develop the economic model designed specifically for the chemical processes targeted in this proposal and to interface the economic model with UTCHEM production output. Task 4 concerns the validation of the framework and performing simulations of oil reservoirs to screen, design, and optimize the chemical processes.
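An experimental-design module of the kind described in item (1) typically starts from a coded two-level factorial design, one simulator run per factor combination; a minimal sketch with hypothetical factor names:

```python
# Two-level full-factorial design: every combination of coded low (-1)
# and high (+1) levels for each screening factor.
from itertools import product

factors = ["surfactant_conc", "polymer_conc", "slug_size"]  # hypothetical
levels = [-1, +1]

design = [dict(zip(factors, combo))
          for combo in product(levels, repeat=len(factors))]
print(len(design))  # -> 8 runs
```

Each run would be handed to the reservoir simulator, and the resulting recoveries fitted to a response surface to rank which factors matter most.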
A Framework to Design and Optimize Chemical Flooding Processes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mojdeh Delshad; Gary A. Pope; Kamy Sepehrnoori
2006-08-31
The goal of this proposed research is to provide an efficient and user friendly simulation framework for screening and optimizing chemical/microbial enhanced oil recovery processes. The framework will include (1) a user friendly interface to identify the variables that have the most impact on oil recovery using the concept of experimental design and response surface maps, (2) UTCHEM reservoir simulator to perform the numerical simulations, and (3) an economic model that automatically imports the simulation production data to evaluate the profitability of a particular design. Such a reservoir simulation framework is not currently available to the oil industry. The objectivesmore » of Task 1 are to develop three primary modules representing reservoir, chemical, and well data. The modules will be interfaced with an already available experimental design model. The objective of the Task 2 is to incorporate UTCHEM reservoir simulator and the modules with the strategic variables and developing the response surface maps to identify the significant variables from each module. The objective of the Task 3 is to develop the economic model designed specifically for the chemical processes targeted in this proposal and interface the economic model with UTCHEM production output. Task 4 is on the validation of the framework and performing simulations of oil reservoirs to screen, design and optimize the chemical processes.« less
A FRAMEWORK TO DESIGN AND OPTIMIZE CHEMICAL FLOODING PROCESSES
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mojdeh Delshad; Gary A. Pope; Kamy Sepehrnoori
2004-11-01
The goal of this proposed research is to provide an efficient and user-friendly simulation framework for screening and optimizing chemical/microbial enhanced oil recovery processes. The framework will include (1) a user-friendly interface to identify the variables that have the most impact on oil recovery, using the concepts of experimental design and response surface maps; (2) the UTCHEM reservoir simulator to perform the numerical simulations; and (3) an economic model that automatically imports the simulated production data to evaluate the profitability of a particular design. Such a reservoir simulation framework is not currently available to the oil industry. The objective of Task 1 is to develop three primary modules representing reservoir, chemical, and well data; the modules will be interfaced with an already available experimental design model. The objective of Task 2 is to incorporate the UTCHEM reservoir simulator and the modules with the strategic variables, and to develop the response surface maps that identify the significant variables from each module. The objective of Task 3 is to develop an economic model designed specifically for the chemical processes targeted in this proposal and to interface it with UTCHEM production output. Task 4 covers validation of the framework and simulations of oil reservoirs to screen, design, and optimize the chemical processes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Park, J.Y.; Batchelor, B.
1999-03-01
Chemical equilibrium models are useful for evaluating stabilized/solidified waste. A general equilibrium model, SOLTEQ, a modified version of MINTEQA2 for S/S, was applied to predict the chemical speciation in the stabilized/solidified waste form. A method was developed to prepare SOLTEQ input data that can chemically represent various stabilized/solidified binders. Taylor's empirical model was used to describe partitioning of alkali ions. As a result, SOLTEQ could represent chemical speciation in pure binder systems such as ordinary Portland cement and ordinary Portland cement + fly ash. Moreover, SOLTEQ could reasonably describe the effects on the chemical speciation due to variations in water-to-cement ratio, fly ash content, and hydration time of various binder systems. However, this application of SOLTEQ was not accurate in predicting concentrations of Ca, Si, and SO₄ ions, due to uncertainties in the CSH solubility model and K_sp values of cement hydrates at high pH values.
Global two dimensional chemistry model and simulation of atmospheric chemical composition
NASA Astrophysics Data System (ADS)
Zhang, Renjian; Wang, Mingxing; Zeng, Qingcun
2000-03-01
A global two-dimensional zonally averaged chemistry model is developed to study the chemical composition of the atmosphere. The model domain extends from 90°S to 90°N and from the ground to an altitude of 20 km, with a resolution of 5° × 1 km. The wind field is the residual circulation calculated from the diabatic heating rate. 34 species and 104 chemical and photochemical reactions are considered in the model. The sources of CH4, CO, and NOx, which are divided into seasonal and non-seasonal sources, are parameterized as functions of latitude and time. The chemical composition of the atmosphere was simulated with the 1990 emission levels of CH4, CO, and NOx. The results are compared with observations and other model results, showing that the model successfully simulates the atmospheric chemical composition and the distribution of CH4.
In this paper, a screening model for flow of a nonaqueous phase liquid (NAPL) and associated chemical transport in the vadose zone is developed. The model is based on a kinematic approximation of the governing equations for both the NAPL and a partitionable chemical constituent. The ...
Stratospheric chemistry and transport
NASA Technical Reports Server (NTRS)
Prather, Michael; Garcia, Maria M.
1990-01-01
A Chemical Tracer Model (CTM) that can use wind field data generated by the General Circulation Model (GCM) is developed to implement chemistry in the three dimensional GCM of the middle atmosphere. Initially, chemical tracers with simple first order losses such as N2O are used. Successive models are to incorporate more complex ozone chemistry.
The creation of Physiologically Based Pharmacokinetic (PBPK) models for a new chemical requires the selection of an appropriate model structure and the collection of a large amount of data for parameterization. Commonly, a large proportion of the needed information is collected ...
A REVIEW AND COMPARISON OF MODELS FOR PREDICTING DYNAMIC CHEMICAL BIOCONCENTRATION IN FISH
Over the past 20 years, a variety of models have been developed to simulate the bioconcentration of hydrophobic organic chemicals by fish. These models differ not only in the processes they address but also in the way a given process is described. Processes described by these m...
The U.S. EPA, under its ExpoCast program, is developing high-throughput near-field modeling methods to estimate human chemical exposure and to provide real-world context to high-throughput screening (HTS) hazard data. These novel modeling methods include reverse methods to infer ...
Rapid Chemical Exposure and Dose Research
EPA evaluates the potential risks of the manufacture and use of thousands of chemicals. To assist with this evaluation, EPA scientists developed a rapid, automated model using off-the-shelf technology that predicts exposures for thousands of chemicals.
Evaluating the multimedia fate of organic chemicals: A level III fugacity model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mackay, D.; Paterson, S.
A multimedia model is developed and applied to selected organic chemicals in evaluative and real regional environments. The model employs the fugacity concept and treats four bulk compartments: air, water, soil, and bottom sediment, which consist of subcompartments of varying proportions of air, water, and mineral and organic matter. Chemical equilibrium is assumed to apply within (but not between) each bulk compartment. Expressions are included for emissions, advective flows, degrading reactions, and interphase transport by diffusive and non-diffusive processes. Input to the model consists of a description of the environment, the physical-chemical and reaction properties of the chemical, and emission rates. For steady-state conditions the solution is a simple algebraic expression. The model is applied to six chemicals in the region of southern Ontario, and the calculated fate and concentrations are compared with observations. The results suggest that the model may be used to determine the processes that control the environmental fate of chemicals in a region and to provide approximate estimates of relative media concentrations.
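The steady-state Level III calculation described above reduces to a small linear system in the compartment fugacities: emissions plus intermedia inputs balance reaction, advection, and intermedia losses. Below is a minimal two-compartment (air/water) sketch; all D values, Z values, and emission rates are invented placeholders for illustration, not values from the paper.

```python
# Minimal steady-state Level III fugacity sketch (illustrative values only).
# D values (mol/(Pa·h)) lump reaction/advection losses and intermedia transfer.

def solve_2x2(a11, a12, a21, a22, b1, b2):
    """Solve a 2x2 linear system by Cramer's rule."""
    det = a11 * a22 - a12 * a21
    return (b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det

# Hypothetical D values and emission rates (mol/h)
D_r_air, D_r_wat = 1.0e3, 5.0e2   # reaction + advection losses per compartment
D_aw, D_wa = 2.0e2, 1.0e2         # air->water and water->air transfer
E_air, E_wat = 10.0, 2.0

# Mass balance: E_air + D_wa*f_w = f_a*(D_r_air + D_aw), and likewise for water
f_a, f_w = solve_2x2(D_r_air + D_aw, -D_wa,
                     -D_aw, D_r_wat + D_wa,
                     E_air, E_wat)

# Concentrations follow from C = Z * f; Z values (mol/(m3·Pa)) are also invented
Z_air, Z_wat = 4.0e-4, 1.0e-2
C_air, C_wat = Z_air * f_a, Z_wat * f_w
```

Because C = Z·f, the relative media concentrations the paper reports fall directly out of this one algebraic solution.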
Developing and applying metamodels of high resolution ...
As defined by Wikipedia (https://en.wikipedia.org/wiki/Metamodeling), “(a) metamodel or surrogate model is a model of a model, and metamodeling is the process of generating such metamodels.” The goals of metamodeling include, but are not limited to (1) developing functional or statistical relationships between a model’s input and output variables for model analysis, interpretation, or information consumption by users’ clients; (2) quantifying a model’s sensitivity to alternative or uncertain forcing functions, initial conditions, or parameters; and (3) characterizing the model’s response or state space. Using five existing models developed by the US Environmental Protection Agency, we generate a metamodeling database of the expected environmental and biological concentrations of 644 organic chemicals released into nine US rivers from wastewater treatment works (WTWs), assuming multiple loading rates and sizes of populations serviced. The chemicals of interest have log n-octanol/water partition coefficients (log K_OW) ranging from 3 to 14, and the rivers of concern have mean annual discharges ranging from 1.09 to 3240 m³/s. Log-linear regression models are derived to predict mean annual dissolved and total water concentrations and total sediment concentrations of chemicals of concern based on their log K_OW, Henry’s law constant, and WTW loading rate and on the mean annual discharges of the receiving rivers. Metamodels are also derived to predict mean annual chemical
An object-oriented software for fate and exposure assessments.
Scheil, S; Baumgarten, G; Reiter, B; Schwartz, S; Wagner, J O; Trapp, S; Matthies, M
1995-07-01
The model system CemoS (Chemical Exposure Model System) was developed for exposure prediction of hazardous chemicals released to the environment. Eight different models were implemented, covering chemical fate simulation in air, water, soil, and plants after continuous or single emissions from point and diffuse sources. Scenario studies are supported by a substance database and an environmental database. All input data are checked for plausibility. Substance and environmental process estimation functions facilitate generic model calculations. CemoS is implemented in a modular structure using object-oriented programming.
NASA Technical Reports Server (NTRS)
Nguyen, H. L.; Ying, S.-J.
1990-01-01
Jet-A spray combustion has been evaluated in gas turbine combustion with the use of propane chemical kinetics as a first approximation for the chemical reactions. Here, the numerical solutions are obtained using the KIVA-2 computer code. The KIVA-2 code is the most developed of the available multidimensional combustion computer programs for application to the in-cylinder combustion dynamics of internal combustion engines. The released version of KIVA-2 assumes that 12 chemical species are present; the code uses an Arrhenius kinetic-controlled combustion model governed by a four-step global chemical reaction and six equilibrium reactions. The researchers' efforts involve the addition of Jet-A thermophysical properties and the implementation of detailed reaction mechanisms for propane oxidation. Three different detailed reaction mechanism models are considered. The first model consists of 131 reactions and 45 species. This is considered the full mechanism, developed through the study of the chemical kinetics of propane combustion in an enclosed chamber. The full mechanism is evaluated by comparing calculated ignition delay times with available shock tube data. However, these detailed reactions occupy too much computer memory and CPU time for the computation. Therefore, it serves only as a benchmark case by which to evaluate other simplified models. Two possible simplified models were tested in the existing computer code KIVA-2 under the same conditions as used with the full mechanism. One model is obtained through a sensitivity analysis using LSENS, the general kinetics and sensitivity analysis program code of D. A. Bittker and K. Radhakrishnan. This model consists of 45 chemical reactions and 27 species. The other model is based on the work published by C. K. Westbrook and F. L. Dryer.
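The Arrhenius kinetic-controlled combustion model mentioned above evaluates each global reaction's rate constant in the modified Arrhenius form k = A·T^b·exp(−Ea/(R·T)). A minimal sketch of that evaluation; A, b, and Ea below are placeholder values, not KIVA-2 mechanism parameters.

```python
import math

R = 8.314  # universal gas constant, J/(mol·K)

def arrhenius(T, A=1.0e9, b=0.0, Ea=1.2e5):
    """Modified Arrhenius rate constant k = A * T^b * exp(-Ea/(R*T)).

    A, b, Ea are illustrative placeholders (Ea in J/mol, T in K).
    """
    return A * T**b * math.exp(-Ea / (R * T))

# Rate constants rise steeply with temperature for a positive activation energy
k_1000 = arrhenius(1000.0)
k_2000 = arrhenius(2000.0)
```

The steep temperature sensitivity shown here is why the kinetic-controlled steps, rather than the equilibrium reactions, dominate ignition-delay predictions.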
Mingguang, Zhang; Juncheng, Jiang
2008-10-30
Overpressure is one important cause of the domino effect in accidents involving chemical process equipment. Damage probability and the corresponding threshold value are two necessary parameters in QRA of this phenomenon. Some simple models had been proposed based on scarce data or oversimplified assumptions. Hence, more data about damage to chemical process equipment were gathered and analyzed, a quantitative relationship between damage probability and damage degree of equipment was built, and reliable probit models were developed for specific categories of chemical process equipment. Finally, the improvements of the present models were evidenced through comparison with other models in the literature, taking into account the consistency between models and data and the depth of quantitativeness in QRA.
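A probit damage model of the kind described relates damage probability to peak overpressure through a probit Y = k1 + k2·ln(P), converted to a probability with the standard normal CDF. A sketch under that convention; k1, k2, and the overpressures below are hypothetical, not the paper's fitted coefficients.

```python
import math

def damage_probability(p_over_kpa, k1=-10.0, k2=3.0):
    """Probability of equipment damage at peak overpressure p_over_kpa (kPa).

    Probit convention: Y = k1 + k2*ln(P), Pr = Phi(Y - 5), with Phi the
    standard normal CDF (computed here via math.erf).
    k1 and k2 are illustrative placeholders.
    """
    y = k1 + k2 * math.log(p_over_kpa)
    return 0.5 * (1.0 + math.erf((y - 5.0) / math.sqrt(2.0)))

# Damage probability increases monotonically with overpressure
probs = [damage_probability(p) for p in (10.0, 50.0, 200.0)]
```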
Emissions model of waste treatment operations at the Idaho Chemical Processing Plant
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schindler, R.E.
1995-03-01
An integrated model of the waste treatment systems at the Idaho Chemical Processing Plant (ICPP) was developed using commercially available process simulation software (ASPEN Plus) to calculate atmospheric emissions of hazardous chemicals for use in an application for an environmental permit to operate (PTO). The processes covered by the model are the Process Equipment Waste evaporator, High Level Liquid Waste evaporator, New Waste Calcining Facility, and Liquid Effluent Treatment and Disposal facility. The processes are described along with the model and its assumptions. The model calculates emissions of NOx, CO, volatile acids, hazardous metals, and organic chemicals. Some calculated relative emissions are summarized and insights on building simulations are discussed.
DEVELOPMENT OF AN IMPROVED SIMULATOR FOR CHEMICAL AND MICROBIAL IOR METHODS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gary A. Pope; Kamy Sepehrnoori; Mojdeh Delshad
2001-10-01
This is the final report of a three-year research project on further development of a chemical and microbial improved oil recovery reservoir simulator. The objective of this research was to extend the capability of an existing simulator (UTCHEM) to improved oil recovery methods which use surfactants, polymers, gels, alkaline chemicals, microorganisms, and foam, as well as various combinations of these, in both conventional and naturally fractured oil reservoirs. The first task was the addition of a dual-porosity model for chemical IOR in naturally fractured oil reservoirs. The researchers formulated and implemented a multiphase, multicomponent dual-porosity model for enhanced oil recovery from naturally fractured reservoirs. The multiphase dual-porosity model was tested against analytical solutions, coreflood data, and commercial simulators. The second task was the addition of a foam model. They implemented a semi-empirical surfactant/foam model in UTCHEM and validated the foam model by comparison with published laboratory data. The third task addressed several numerical and coding enhancements that greatly improve its versatility and performance. Major enhancements were made in UTCHEM output files and memory management. A graphical user interface to set up the simulation input and to process the output data on a Windows PC was developed. New solvers for the pressure equation and the geochemical system of equations were implemented and tested. A corner-point grid geometry option for gridding complex reservoirs was implemented and tested. Enhancements of physical property models for both chemical and microbial IOR simulations were included in the final task of this proposal. Additional options for calculating physical properties such as relative permeability and capillary pressure were added. A microbiological population model was developed and incorporated into UTCHEM. The researchers have applied the model to microbial enhanced oil recovery (MEOR) processes by including the capability of permeability reduction due to biomass growth and retention. The formation of bio-products such as surfactant and polymer surfactant has also been incorporated.
Dennison, James E; Andersen, Melvin E; Yang, Raymond S H
2003-09-01
Gasoline consists of a few toxicologically significant components and a large number of other hydrocarbons in a complex mixture. By using an integrated, physiologically based pharmacokinetic (PBPK) modeling and lumping approach, we have developed a method for characterizing the pharmacokinetics (PKs) of gasoline in rats. The PBPK model tracks selected target components (benzene, toluene, ethylbenzene, o-xylene [BTEX], and n-hexane) and a lumped chemical group representing all nontarget components, with competitive metabolic inhibition between all target compounds and the lumped chemical. PK data were acquired by performing gas uptake PK studies with male F344 rats in a closed chamber. Chamber air samples were analyzed every 10-20 min by gas chromatography/flame ionization detection, and all nontarget chemicals were co-integrated. A four-compartment PBPK model with metabolic interactions was constructed using the BTEX, n-hexane, and lumped chemical data. Target chemical kinetic parameters were refined by studies with either the single chemical alone or with all five chemicals together. o-Xylene, at high concentrations, decreased alveolar ventilation, consistent with respiratory irritation. A six-chemical interaction model with the lumped chemical group was used to estimate lumped chemical partitioning and metabolic parameters for a winter blend of gasoline with methyl t-butyl ether and a summer blend without any oxygenate. Computer simulation results from this model matched well with experimental data from single chemical, five-chemical mixture, and the two blends of gasoline. The PBPK model analysis indicated that metabolism of individual components was inhibited up to 27% during the 6-h gas uptake experiments of gasoline exposures.
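The competitive metabolic inhibition between target compounds and the lumped chemical is typically written as a Michaelis-Menten rate whose denominator is inflated by the summed occupancy of the co-exposed chemicals. A sketch of that term; the Vmax, Km, and concentration values below are invented, not the paper's fitted parameters.

```python
# Competitive-inhibition metabolism term used in mixture PBPK models:
#   v_i = Vmax_i * C_i / (Km_i * (1 + sum_{j!=i} C_j/Km_j) + C_i)
# All numeric values below are hypothetical placeholders.

def inhibited_rates(conc, vmax, km):
    """Metabolic rate of each chemical in the mixture (dicts keyed by name)."""
    rates = {}
    for i in conc:
        inhibition = sum(conc[j] / km[j] for j in conc if j != i)
        rates[i] = vmax[i] * conc[i] / (km[i] * (1.0 + inhibition) + conc[i])
    return rates

conc = {"benzene": 0.5, "toluene": 1.0, "lumped": 5.0}
vmax = {"benzene": 2.0, "toluene": 3.0, "lumped": 4.0}
km   = {"benzene": 0.3, "toluene": 0.5, "lumped": 2.0}

with_mix = inhibited_rates(conc, vmax, km)
# For comparison: uninhibited Michaelis-Menten rate of each chemical alone
alone = {i: vmax[i] * conc[i] / (km[i] + conc[i]) for i in conc}
```

Running this shows each chemical's metabolic rate in the mixture falling below its rate when dosed alone, which is the qualitative behavior the gas-uptake experiments were designed to quantify.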
CHEMFLO: ONE-DIMENSIONAL WATER AND CHEMICAL MOVEMENT IN UNSATURATED SOILS
An interactive software system was developed to enable decision-makers, regulators, policy-makers, scientists, consultants, and students to simulate the movement of water and chemicals in unsaturated soils. Water movement is modeled using Richards' (1931) equation. Chemical trans...
SIMULATING METABOLISM TO ENHANCE EFFECTS MODELING
A major uncertainty that has long been recognized in evaluating chemical toxicity is accounting for metabolic activation of chemicals resulting in increased toxicity. The proposed research will develop a capability for forecasting the metabolism of xenobiotic chemicals of EPA int...
Pred-Skin: A Fast and Reliable Web Application to Assess Skin Sensitization Effect of Chemicals.
Braga, Rodolpho C; Alves, Vinicius M; Muratov, Eugene N; Strickland, Judy; Kleinstreuer, Nicole; Tropsha, Alexander; Andrade, Carolina Horta
2017-05-22
Chemically induced skin sensitization is a complex immunological disease with a profound impact on quality of life and working ability. Despite some progress in developing alternative methods for assessing the skin sensitization potential of chemical substances, there is no in vitro test that correlates well with human data. Computational QSAR models provide a rapid screening approach and contribute valuable information for the assessment of chemical toxicity. We describe the development of a freely accessible web-based and mobile application for the identification of potential skin sensitizers. The application is based on previously developed binary QSAR models of skin sensitization potential from human (109 compounds) and murine local lymph node assay (LLNA, 515 compounds) data with good external correct classification rate (0.70-0.81 and 0.72-0.84, respectively). We also included a multiclass skin sensitization potency model based on LLNA data (accuracy ranging between 0.73 and 0.76). When a user evaluates a compound in the web app, the outputs are (i) binary predictions of human and murine skin sensitization potential; (ii) multiclass prediction of murine skin sensitization; and (iii) probability maps illustrating the predicted contribution of chemical fragments. The app is the first tool available that incorporates quantitative structure-activity relationship (QSAR) models based on human data as well as multiclass models for LLNA. The Pred-Skin web app version 1.0 is freely available for the web, iOS, and Android (in development) at the LabMol web portal ( http://labmol.com.br/predskin/ ), in the Apple Store, and on Google Play, respectively. We will continuously update the app as new skin sensitization data and respective models become available.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alves, Vinicius M. (Laboratory for Molecular Modeling, Division of Chemical Biology and Medicinal Chemistry, Eshelman School of Pharmacy, University of North Carolina, Chapel Hill, NC 27599); Muratov, Eugene
Skin permeability is widely considered to be mechanistically implicated in chemically-induced skin sensitization. Although many chemicals have been identified as skin sensitizers, there have been very few reports analyzing the relationships between molecular structure and skin permeability of sensitizers and non-sensitizers. The goals of this study were to: (i) compile, curate, and integrate the largest publicly available dataset of chemicals studied for their skin permeability; (ii) develop and rigorously validate QSAR models to predict skin permeability; and (iii) explore the complex relationships between skin sensitization and skin permeability. Based on the largest publicly available dataset compiled in this study, we found no overall correlation between skin permeability and skin sensitization. In addition, the cross-species correlation coefficient between human and rodent permeability data was found to be as low as R² = 0.44. Human skin permeability models based on the random forest method have been developed and validated using an OECD-compliant QSAR modeling workflow. Their external accuracy was high (Q²_ext = 0.73 for 63% of external compounds inside the applicability domain). The extended analysis using both experimentally-measured and QSAR-imputed data still confirmed the absence of any overall concordance between skin permeability and skin sensitization. This observation suggests that chemical modifications that affect skin permeability should not be presumed a priori to modulate the sensitization potential of chemicals. The models reported herein, as well as those developed in the companion paper on skin sensitization, suggest that it may be possible to rationally design compounds with the desired high skin permeability but low sensitization potential. Highlights: • The largest publicly available skin permeability dataset was compiled. • Predictive QSAR models were developed for skin permeability. • No concordance between skin sensitization and skin permeability was found. • Structural rules for optimizing sensitization and penetration were established.
Lewis, F.M.; Voss, C.I.; Rubin, Jacob
1986-01-01
A model was developed that can simulate the effect of certain chemical and sorption reactions occurring simultaneously among solutes involved in advective-dispersive transport through porous media. The model is based on a methodology that utilizes physical-chemical relationships in the development of the basic solute mass-balance equations; however, the form of these equations allows their solution to be obtained by methods that do not depend on the chemical processes. The chemical environment is governed by the condition of local chemical equilibrium, and may be defined either by the linear sorption of a single species and two soluble complexation reactions that also involve that species, or by binary ion exchange and one complexation reaction involving a common ion. Partial differential equations that describe solute mass balance entirely in the liquid phase are developed for each tenad (a chemical entity whose total mass is independent of the reaction process) in terms of its total dissolved concentration. These equations are solved numerically in two dimensions through modification of an existing groundwater flow/transport computer code. (Author's abstract)
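In the simplest linear-sorption case, a tenad mass balance of the kind described reduces to a one-dimensional advection-dispersion equation with a retardation factor, R·∂C/∂t = D·∂²C/∂x² − v·∂C/∂x. A minimal explicit finite-difference sketch of that reduced problem; grid, time step, and parameters are invented, and the paper's code solves the full 2-D system.

```python
# Explicit upwind finite-difference sketch of retarded 1-D transport:
#   R * dC/dt = D * d2C/dx2 - v * dC/dx
# with a fixed-concentration inlet. All parameters are illustrative placeholders
# chosen to satisfy the explicit stability limits.

nx, dx, dt = 50, 1.0, 0.1
v, D, R = 0.5, 1.0, 2.0       # velocity, dispersion, retardation factor

c = [0.0] * nx
c[0] = 1.0                    # inlet boundary held at C = 1

for _ in range(200):
    new = c[:]
    for i in range(1, nx - 1):
        adv = -v * (c[i] - c[i - 1]) / dx                    # upwind advection
        disp = D * (c[i + 1] - 2 * c[i] + c[i - 1]) / dx**2  # dispersion
        new[i] = c[i] + dt * (adv + disp) / R
    new[0] = 1.0
    c = new
```

With these parameters the scheme is a convex combination of neighboring values, so the computed front stays monotone between the inlet value and zero; retardation R simply slows the front by the factor 1/R.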
Zhu, Hao; Ye, Lin; Richard, Ann; Golbraikh, Alexander; Wright, Fred A.; Rusyn, Ivan; Tropsha, Alexander
2009-01-01
Background Accurate prediction of in vivo toxicity from in vitro testing is a challenging problem. Large public–private consortia have been formed with the goal of improving chemical safety assessment by the means of high-throughput screening. Objective A wealth of available biological data requires new computational approaches to link chemical structure, in vitro data, and potential adverse health effects. Methods and results A database containing experimental cytotoxicity values for in vitro half-maximal inhibitory concentration (IC50) and in vivo rodent median lethal dose (LD50) for more than 300 chemicals was compiled by Zentralstelle zur Erfassung und Bewertung von Ersatz- und Ergaenzungsmethoden zum Tierversuch (ZEBET; National Center for Documentation and Evaluation of Alternative Methods to Animal Experiments). The application of conventional quantitative structure–activity relationship (QSAR) modeling approaches to predict mouse or rat acute LD50 values from chemical descriptors of ZEBET compounds yielded no statistically significant models. The analysis of these data showed no significant correlation between IC50 and LD50. However, a linear IC50 versus LD50 correlation could be established for a fraction of compounds. To capitalize on this observation, we developed a novel two-step modeling approach as follows. First, all chemicals are partitioned into two groups based on the relationship between IC50 and LD50 values: One group comprises compounds with linear IC50 versus LD50 relationships, and another group comprises the remaining compounds. Second, we built conventional binary classification QSAR models to predict the group affiliation based on chemical descriptors only. Third, we developed k-nearest neighbor continuous QSAR models for each subclass to predict LD50 values from chemical descriptors. All models were extensively validated using special protocols. 
Conclusions The novelty of this modeling approach is that it uses the relationships between in vivo and in vitro data only to inform the initial construction of the hierarchical two-step QSAR models. Models resulting from this approach employ chemical descriptors only for external prediction of acute rodent toxicity. PMID:19672406
Zhu, Hao; Ye, Lin; Richard, Ann; Golbraikh, Alexander; Wright, Fred A; Rusyn, Ivan; Tropsha, Alexander
2009-08-01
Accurate prediction of in vivo toxicity from in vitro testing is a challenging problem. Large public-private consortia have been formed with the goal of improving chemical safety assessment by the means of high-throughput screening. A wealth of available biological data requires new computational approaches to link chemical structure, in vitro data, and potential adverse health effects. A database containing experimental cytotoxicity values for in vitro half-maximal inhibitory concentration (IC50) and in vivo rodent median lethal dose (LD50) for more than 300 chemicals was compiled by Zentralstelle zur Erfassung und Bewertung von Ersatz- und Ergaenzungsmethoden zum Tierversuch (ZEBET; National Center for Documentation and Evaluation of Alternative Methods to Animal Experiments). The application of conventional quantitative structure-activity relationship (QSAR) modeling approaches to predict mouse or rat acute LD50 values from chemical descriptors of ZEBET compounds yielded no statistically significant models. The analysis of these data showed no significant correlation between IC50 and LD50. However, a linear IC50 versus LD50 correlation could be established for a fraction of compounds. To capitalize on this observation, we developed a novel two-step modeling approach as follows. First, all chemicals are partitioned into two groups based on the relationship between IC50 and LD50 values: One group comprises compounds with linear IC50 versus LD50 relationships, and another group comprises the remaining compounds. Second, we built conventional binary classification QSAR models to predict the group affiliation based on chemical descriptors only. Third, we developed k-nearest neighbor continuous QSAR models for each subclass to predict LD50 values from chemical descriptors. All models were extensively validated using special protocols.
The novelty of this modeling approach is that it uses the relationships between in vivo and in vitro data only to inform the initial construction of the hierarchical two-step QSAR models. Models resulting from this approach employ chemical descriptors only for external prediction of acute rodent toxicity.
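The hierarchical two-step scheme can be sketched end to end: a descriptor-based classifier first assigns a compound to the "linear IC50-LD50" group or the remainder, then a per-group k-nearest-neighbor model predicts LD50. A toy Python version with invented two-dimensional descriptors and training values; the actual models used large descriptor sets and rigorous validation protocols.

```python
import math

# Toy training set: (descriptor vector, group label, log LD50).
# Descriptors, labels, and values are fabricated for illustration.
train = [
    ((0.1, 0.2), "linear", 2.0),
    ((0.2, 0.1), "linear", 2.2),
    ((0.9, 0.8), "other", 3.5),
    ((0.8, 0.9), "other", 3.3),
]

def predict(x, k=2):
    """Two-step prediction from descriptors alone."""
    # Step 1: nearest-neighbor group assignment (stand-in for the binary QSAR)
    group = min(train, key=lambda t: math.dist(x, t[0]))[1]
    # Step 2: kNN regression restricted to the assigned group
    members = sorted((t for t in train if t[1] == group),
                     key=lambda t: math.dist(x, t[0]))[:k]
    return group, sum(t[2] for t in members) / len(members)

group, ld50 = predict((0.15, 0.15))
```

The point the abstract stresses survives even in this toy: the in vivo/in vitro relationship shapes only the group labels at training time, while prediction consumes chemical descriptors alone.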
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pope, G.A.; Lake, L.W.; Sepehrnoori, K.
1988-11-01
The objective of this research is to develop, validate, and apply a comprehensive chemical flooding simulator for chemical recovery processes involving surfactants, polymers, and alkaline chemicals in various combinations. This integrated program includes components of laboratory experiments, physical property modelling, scale-up theory, and numerical analysis as necessary and integral components of the simulation activity. Development, testing, and application of the chemical flooding simulator (UTCHEM) to a wide variety of laboratory and reservoir problems involving tracers, polymers, polymer gels, surfactants, and alkaline agents has continued. Improvements in both the physical-chemical and numerical aspects of UTCHEM have been made which enhance its versatility, accuracy, and speed. Supporting experimental studies during the past year include relative permeability and trapping of microemulsion, tracer flow studies, oil recovery in cores using alcohol-free surfactant slugs, and microemulsion viscosity measurements. These have enabled model improvement and simulator testing. Another code, called PROPACK, has also been developed and is used as a preprocessor for UTCHEM. Specifically, it is used to evaluate input to UTCHEM by computing and plotting key physical properties such as phase behavior and interfacial tension.
Toxicokinetic and Dosimetry Modeling Tools for Exposure ...
New technologies and in vitro testing approaches have been valuable additions to risk assessments that have historically relied solely on in vivo test results. Compared to in vivo methods, in vitro high throughput screening (HTS) assays are less expensive, faster and can provide mechanistic insights on chemical action. However, extrapolating from in vitro chemical concentrations to target tissue or blood concentrations in vivo is fraught with uncertainties, and modeling is dependent upon pharmacokinetic variables not measured in in vitro assays. To address this need, new tools have been created for characterizing, simulating, and evaluating chemical toxicokinetics. Physiologically-based pharmacokinetic (PBPK) models provide estimates of chemical exposures that produce potentially hazardous tissue concentrations, while tissue microdosimetry PK models relate whole-body chemical exposures to cell-scale concentrations. These tools rely on high-throughput in vitro measurements, and successful methods exist for pharmaceutical compounds that determine PK from limited in vitro measurements and chemical structure-derived property predictions. These high throughput (HT) methods provide a more rapid and less resource-intensive alternative to traditional PK model development. We have augmented these in vitro data with chemical structure-based descriptors and mechanistic tissue partitioning models to construct HTPBPK models for over three hundred environmental and pharmace
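A common minimal form of such HT toxicokinetic models is a one-compartment steady state in which total clearance combines renal filtration and well-stirred hepatic clearance; inverting it converts an in vitro active concentration into an oral-equivalent dose. A sketch under those assumptions, with placeholder physiological and chemical parameters (not values from the tools described above).

```python
# One-compartment steady-state sketch of in vitro-to-in vivo extrapolation.
# All parameter values (flows, GFR, body weight, fub, Clint) are hypothetical.

def css_per_unit_dose(fub, clint_lh, q_liver_lh=90.0, gfr_lh=6.7, bw_kg=70.0):
    """Steady-state plasma conc. (mg/L) for a 1 mg/kg/day oral infusion.

    fub: fraction unbound in plasma; clint_lh: intrinsic hepatic clearance (L/h).
    Hepatic clearance uses the well-stirred liver approximation.
    """
    cl_hepatic = q_liver_lh * fub * clint_lh / (q_liver_lh + fub * clint_lh)
    cl_total = gfr_lh * fub + cl_hepatic        # renal filtration + hepatic, L/h
    dose_rate = 1.0 * bw_kg / 24.0              # mg/h corresponding to 1 mg/kg/day
    return dose_rate / cl_total

def oral_equivalent_dose(ac50_mg_per_l, fub, clint_lh):
    # Dose (mg/kg/day) that reaches the in vitro active concentration at steady state
    return ac50_mg_per_l / css_per_unit_dose(fub, clint_lh)

oed = oral_equivalent_dose(ac50_mg_per_l=1.0, fub=0.1, clint_lh=20.0)
```

Chemicals cleared faster (higher Clint) reach lower steady-state concentrations per unit dose, so their oral-equivalent doses are correspondingly higher.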
Predictive Models and Tools for Assessing Chemicals under the Toxic Substances Control Act (TSCA)
EPA has developed databases and predictive models to help evaluate the hazard, exposure, and risk of chemicals released to the environment and how workers, the general public, and the environment may be exposed to and affected by them.
Modeling Zebrafish Developmental Toxicity using a Concurrent In vitro Assay Battery (SOT)
We describe the development of computational models that predict activity in a repeat-dose zebrafish embryo developmental toxicity assay using a combination of physico-chemical parameters and in vitro (human) assay measurements. The data set covered 986 chemicals including pestic...
CHEMICAL MASS BALANCE MODEL: EPA-CMB8.2
The Chemical Mass Balance (CMB) method has been a popular approach for receptor modeling of ambient air pollutants for over two decades. For the past few years the U.S. Environmental Protection Agency's Office of Research and Development (ORD) and Office of Air Quality Plannin...
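At its core, the CMB method models measured ambient species concentrations as a linear combination of source profiles and recovers the source contributions by least squares (EPA-CMB8.2 uses an effective-variance weighting; the unweighted version is shown here). A stripped-down sketch with invented profiles for two sources and three species.

```python
# Chemical mass balance in outline: ambient = F @ s, solved for s by ordinary
# least squares via the normal equations. Profiles and the ambient vector are
# fabricated so that the exact solution is s = (2, 3).

# Rows = species, columns = source profiles (mass fractions)
F = [(0.5, 0.1),
     (0.3, 0.6),
     (0.2, 0.3)]
ambient = (1.3, 2.4, 1.3)   # measured species concentrations

# Normal equations (F^T F) s = F^T c, written out for the 2-source case
a11 = sum(f[0] * f[0] for f in F)
a12 = sum(f[0] * f[1] for f in F)
a22 = sum(f[1] * f[1] for f in F)
b1 = sum(f[0] * c for f, c in zip(F, ambient))
b2 = sum(f[1] * c for f, c in zip(F, ambient))
det = a11 * a22 - a12 * a12
s1 = (b1 * a22 - b2 * a12) / det   # contribution of source 1
s2 = (a11 * b2 - a12 * b1) / det   # contribution of source 2
```

With more species than sources the system is overdetermined, which is exactly the situation the receptor-modeling formulation exploits.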
Estimation of hydrolysis rate constants for carbamates ...
Cheminformatics-based tools, such as the Chemical Transformation Simulator under development in EPA’s Office of Research and Development, are being increasingly used to evaluate chemicals for their potential to degrade in the environment or be transformed through metabolism. Hydrolysis represents a major environmental degradation pathway; unfortunately, hydrolysis rates are in the public domain for only a small fraction of the roughly 85,000 chemicals on the Toxic Substances Control Act (TSCA) inventory, making it critical to develop in silico approaches to estimate hydrolysis rate constants. In this presentation, we compare three complementary approaches to estimate hydrolysis rates for carbamates, an important chemical class widely used in agriculture as pesticides, herbicides, and fungicides. Fragment-based Quantitative Structure Activity Relationships (QSARs) using Hammett-Taft sigma constants are widely published and implemented for relatively simple functional groups such as carboxylic acid esters, phthalate esters, and organophosphate esters, and we extend these to carbamates. We also develop a pKa-based model and a quantitative structure property relationship (QSPR) model, and evaluate them against measured rate constants using R-squared and root mean square (RMS) error. Our work shows that for our relatively small sample size of carbamates, the Hammett-Taft based fragment model performs best, followed by the pKa and QSPR models.
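The Hammett-Taft fragment approach mentioned above has the general form log k = log k0 + ρ·Σσ, where each substituent contributes a sigma constant. The sketch below shows the shape of such a model; the parent-rate intercept, reaction constant ρ, and the sigma table are all illustrative values, not the presentation's fitted parameters.

```python
# Illustrative Hammett/Taft-type fragment QSAR for a hydrolysis rate:
# log k = log k0 + rho * sum(sigma_i). All constants are hypothetical.

HYPOTHETICAL_SIGMA = {"NO2": 0.78, "Cl": 0.23, "CH3": -0.17, "H": 0.0}

def log_rate_constant(substituents, log_k0=-2.0, rho=1.5):
    """With rho > 0, electron-withdrawing groups (positive sigma)
    accelerate base-catalyzed hydrolysis; donors slow it."""
    return log_k0 + rho * sum(HYPOTHETICAL_SIGMA[s] for s in substituents)

log_rate_constant(["NO2"])   # faster than the unsubstituted parent
log_rate_constant(["CH3"])   # slower than the unsubstituted parent
```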
Chen, Yun; Nielsen, Jens
2013-12-01
Bio-based production of chemical building blocks from renewable resources is an attractive alternative to petroleum-based platform chemicals. Metabolic pathway and strain engineering is the key element in constructing robust microbial chemical factories within the constraints of cost effective production. Here we discuss how the development of computational algorithms, novel modules and methods, omics-based techniques combined with modeling refinement are enabling reduction in development time and thus advance the field of industrial biotechnology. We further discuss how recent technological developments contribute to the development of novel cell factories for the production of the building block chemicals: adipic acid, succinic acid and 3-hydroxypropionic acid. Copyright © 2013 Elsevier Ltd. All rights reserved.
Quantitative metrics for assessment of chemical image quality and spatial resolution
Kertesz, Vilmos; Cahill, John F.; Van Berkel, Gary J.
2016-02-28
Rationale: Currently objective/quantitative descriptions of the quality and spatial resolution of mass spectrometry derived chemical images are not standardized. Development of these standardized metrics is required to objectively describe chemical imaging capabilities of existing and/or new mass spectrometry imaging technologies. Such metrics would allow unbiased judgment of intra-laboratory advancement and/or inter-laboratory comparison for these technologies if used together with standardized surfaces. Methods: We developed two image metrics, viz., chemical image contrast (ChemIC) based on signal-to-noise related statistical measures on chemical image pixels and corrected resolving power factor (cRPF) constructed from statistical analysis of mass-to-charge chronograms across features of interest in an image. These metrics, quantifying chemical image quality and spatial resolution, respectively, were used to evaluate chemical images of a model photoresist patterned surface collected using a laser ablation/liquid vortex capture mass spectrometry imaging system under different instrument operational parameters. Results: The calculated ChemIC and cRPF metrics determined in an unbiased fashion the relative ranking of chemical image quality obtained with the laser ablation/liquid vortex capture mass spectrometry imaging system. These rankings were used to show that both chemical image contrast and spatial resolution deteriorated with increasing surface scan speed, increased lane spacing and decreasing size of surface features. Conclusions: ChemIC and cRPF, respectively, were developed and successfully applied for the objective description of chemical image quality and spatial resolution of chemical images collected from model surfaces using a laser ablation/liquid vortex capture mass spectrometry imaging system.
Gupta, Shikha; Basant, Nikita; Mohan, Dinesh; Singh, Kunwar P
2016-07-01
The persistence and the removal of organic chemicals from the atmosphere are largely determined by their reactions with the OH radical and O3. Experimental determination of the kinetic rate constants of OH and O3 with a large number of chemicals is tedious and resource intensive, and the development of computational approaches has been widely advocated. Recently, ensemble machine learning (EML) methods have emerged as unbiased tools to establish relationships between independent and dependent variables having a nonlinear dependence. In this study, EML-based, temperature-dependent quantitative structure-reactivity relationship (QSRR) models have been developed for predicting the kinetic rate constants for OH (kOH) and O3 (kO3) reactions with diverse chemicals. Structural diversity of chemicals was evaluated using a Tanimoto similarity index. The generalization and prediction abilities of the constructed models were established through rigorous internal and external validation employing statistical checks. On the test data, the EML QSRR models yielded correlations (R²) of ≥0.91 between the measured and predicted reactivities. The applicability domains of the constructed models were determined using methods based on descriptor ranges, Euclidean distance, leverage, and standardization approaches. Prediction accuracies for the higher-reactivity compounds were relatively better than those for the low-reactivity compounds. The proposed EML QSRR models performed well and outperformed previously reported models, and they can make predictions of rate constants at different temperatures. They can be useful tools for predicting the reactivities of chemicals towards the OH radical and O3 in the atmosphere.
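The Tanimoto similarity index used above to assess structural diversity has a simple set-based definition: the number of shared fingerprint bits divided by the number of bits set in either fingerprint. A minimal sketch over toy fingerprints (real workflows would derive bit positions from molecular descriptors, e.g. via a cheminformatics toolkit):

```python
def tanimoto(fp_a, fp_b):
    """Tanimoto (Jaccard) similarity between two fingerprints given as
    sets of "on" bit positions: |A & B| / |A | B|.
    Returns 1.0 for identical sets and 0.0 for disjoint ones."""
    a, b = set(fp_a), set(fp_b)
    union = a | b
    return len(a & b) / len(union) if union else 1.0

# Toy bit sets, not real molecular fingerprints:
tanimoto({1, 2, 3, 4}, {2, 3, 4, 5})   # 3 shared bits of 5 total -> 0.6
```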
In this paper, a screening model for flow of a nonaqueous phase liquid (NAPL) and associated chemical transport in the vadose zone is developed. The model is based on kinematic approximation of the governing equations for both the NAPL and a partitionable chemical constituent. Th...
A new in silico classification model for ready biodegradability, based on molecular fragments.
Lombardo, Anna; Pizzo, Fabiola; Benfenati, Emilio; Manganaro, Alberto; Ferrari, Thomas; Gini, Giuseppina
2014-08-01
Regulations such as the European REACH (Registration, Evaluation, Authorization and restriction of Chemicals) often require chemicals to be evaluated for ready biodegradability, to assess the potential risk to environmental and human health. Because not all chemicals can be tested, there is an increasing demand for tools for quick and inexpensive biodegradability screening, such as computer-based (in silico) theoretical models. We developed an in silico model starting from a dataset of 728 chemicals with ready biodegradability data (MITI test, Ministry of International Trade and Industry). We used the novel software SARpy to automatically extract, through a structural fragmentation process, a set of substructures statistically related to ready biodegradability. We then analysed these substructures in order to build some general rules. The model consists of a rule set combining the statistically relevant fragments with the expert-based rules. The model gives good statistical performance, with 92%, 82%, and 76% accuracy on the training, test, and external sets respectively. These results are comparable with other in silico models such as BIOWIN, developed by the United States Environmental Protection Agency (EPA); moreover, this new model includes an easily understandable explanation. Copyright © 2014 Elsevier Ltd. All rights reserved.
Prediction of biodegradability from chemical structure: Modeling of ready biodegradation test data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Loonen, H.; Lindgren, F.; Hansen, B.
1999-08-01
Biodegradation data were collected and evaluated for 894 substances with widely varying chemical structures. All data were determined according to the Japanese Ministry of International Trade and Industry (MITI) I test protocol. The MITI I test is a screening test for ready biodegradability and has been described by Organization for Economic Cooperation and Development (OECD) test guideline 301 C and European Union (EU) test guideline C4F. The chemicals were characterized by a set of 127 predefined structural fragments. This data set was used to develop a model for the prediction of the biodegradability of chemicals under standardized OECD and EU ready biodegradation test conditions. Partial least squares (PLS) discriminant analysis was used for the model development. The model was evaluated by means of internal cross-validation and repeated external validation. The importance of various structural fragments and fragment interactions was investigated. The most important fragments include the presence of a long alkyl chain and hydroxy, ester, and acid groups (enhancing biodegradation), and the presence of one or more aromatic rings and halogen substituents (retarding biodegradation). More than 85% of the model predictions were correct when using the complete data set. The not-readily-biodegradable predictions were slightly better than the readily biodegradable predictions (86 vs. 84%). The average percentage of correct predictions from four external validation studies was 83%. Model optimization by including fragment interactions improved the model's predictive capability to 89%. It can be concluded that the PLS model provides predictions of high reliability for a diverse range of chemical structures. The predictions conform to the concept of readily biodegradable (or not readily biodegradable) as defined by OECD and EU test guidelines.
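The fragment-based discrimination idea above can be illustrated with a much simpler stand-in for PLS discriminant analysis: a nearest-centroid classifier over 0/1 fragment-presence vectors. The fragment columns mirror the groups the abstract names, but the training vectors are invented toy data, and nearest-centroid is only a pedagogical substitute for PLS-DA.

```python
# Toy nearest-centroid discriminant over structural-fragment vectors
# (a simplified stand-in for the paper's PLS discriminant analysis).

def centroid(vectors):
    """Component-wise mean of a list of equal-length vectors."""
    n = len(vectors)
    return [sum(col) / n for col in zip(*vectors)]

def classify(x, ready_centroid, not_ready_centroid):
    """Assign x to whichever class centroid is closer (squared distance)."""
    def dist2(u, v):
        return sum((a - b) ** 2 for a, b in zip(u, v))
    return ("ready" if dist2(x, ready_centroid) <= dist2(x, not_ready_centroid)
            else "not ready")

# Columns: [long alkyl chain, hydroxy/ester/acid, aromatic ring, halogen]
ready     = [[1, 1, 0, 0], [1, 0, 0, 0], [0, 1, 0, 0]]   # toy training data
not_ready = [[0, 0, 1, 1], [0, 0, 1, 0], [0, 1, 1, 1]]
label = classify([1, 1, 0, 0], centroid(ready), centroid(not_ready))
```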
MODELING CHEMICAL FATE AND METABOLISM FOR COMPUTATIONAL TOXICOLOGY
The goal of ORD's Computational Toxicology initiative is to develop the science for EPA to prioritize toxicity-testing requirements for chemicals subject to regulation. Many toxic effects, however, result from metabolism of parent chemicals to form metabolites that are much more...
Cheminformatic Analysis of the US EPA ToxCast Chemical Library
The ToxCast project is employing high throughput screening (HTS) technologies, along with chemical descriptors and computational models, to develop approaches for screening and prioritizing environmental chemicals for further toxicity testing. ToxCast Phase I generated HTS data f...
In general, the accuracy of a predicted toxicity value increases with the similarity between the query chemical and the chemicals used to develop a QSAR model. A toxicity estimation methodology employing this finding has been developed. A hierarchical clustering t...
Using Pareto points for model identification in predictive toxicology
2013-01-01
Predictive toxicology is concerned with the development of models that are able to predict the toxicity of chemicals. A reliable prediction of toxic effects of chemicals in living systems is highly desirable in cosmetics, drug design or food protection to speed up the process of chemical compound discovery while reducing the need for lab tests. There is an extensive literature associated with the best practice of model generation and data integration, but the management and automated identification of relevant models from available collections remains an open problem. Currently, the decision on which model should be used for a new chemical compound is left to users. This paper intends to initiate the discussion on automated model identification. We present an algorithm, based on Pareto optimality, which mines model collections and identifies a model that offers a reliable prediction for a new chemical compound. The performance of this new approach is verified for two endpoints: IGC50 and LogP. The results show a great potential for automated model identification methods in predictive toxicology. PMID:23517649
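The Pareto-optimality idea above can be sketched concretely: among candidate models, keep those that are not dominated across the objectives being traded off. The two objectives and all model scores below are invented for illustration; the paper's actual objective functions may differ.

```python
# Sketch of Pareto-based model identification: retain models not dominated
# on (predictive accuracy, applicability-domain similarity to the query).
# Scores are hypothetical, chosen only to illustrate the mechanics.

def dominates(a, b):
    """a dominates b if a is >= in every objective and > in at least one."""
    return (all(x >= y for x, y in zip(a, b))
            and any(x > y for x, y in zip(a, b)))

def pareto_front(models):
    """Return the non-dominated subset, preserving input order."""
    return [m for m in models
            if not any(dominates(o["scores"], m["scores"])
                       for o in models if o is not m)]

models = [
    {"name": "QSAR-1", "scores": (0.90, 0.40)},
    {"name": "QSAR-2", "scores": (0.75, 0.80)},
    {"name": "QSAR-3", "scores": (0.70, 0.60)},  # dominated by QSAR-2
]
front = [m["name"] for m in pareto_front(models)]
```

A final selection step (e.g. preferring similarity over raw accuracy for a query far from a model's training set) would then pick one model from the front.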
The increasing use of tissue dosimetry estimated using pharmacokinetic models in chemical risk assessments in multiple countries necessitates the development of internationally recognized good modelling practices. These practices would facilitate sharing of models and model eva...
Non-equilibrium Quasi-Chemical Nucleation Model
NASA Astrophysics Data System (ADS)
Gorbachev, Yuriy E.
2018-04-01
The quasi-chemical model, which is widely used for nucleation description, is revised on the basis of recent results in the study of non-equilibrium effects in reacting gas mixtures (Kolesnichenko and Gorbachev in Appl Math Model 34:3778-3790, 2010; Shock Waves 23:635-648, 2013; Shock Waves 27:333-374, 2017). Non-equilibrium effects in chemical reactions are caused by the chemical reactions themselves, and therefore these contributions should be taken into account in the corresponding expressions for reaction rates. Corrections to quasi-equilibrium reaction rates are of two types: (a) spatially homogeneous (caused by physical-chemical processes) and (b) spatially inhomogeneous (caused by gas expansion/compression processes and proportional to the velocity divergence). Both of these processes play an important role during nucleation and are included in the proposed model. The method developed for solving the generalized Boltzmann equation for chemically reactive gases is applied to the set of equations of the revised quasi-chemical model. It is shown that non-equilibrium processes lead to an essential deviation of the quasi-stationary distribution, and therefore of the nucleation rate, from its traditional form.
The development and preliminary application of an invariant coupled diffusion and chemistry model
NASA Technical Reports Server (NTRS)
Hilst, G. R.; Donaldson, C. DUP.; Teske, M.; Contiliano, R.; Freiberg, J.
1973-01-01
In many real-world pollution chemical reaction problems, the rate of reaction may be greatly affected by unmixedness. An approximate closure scheme for a chemical kinetic submodel which conforms to the principles of invariant modeling and which accounts for the effects of inhomogeneous mixing over a wide range of conditions has been developed. This submodel has been coupled successfully with invariant turbulence and diffusion models, permitting calculation of two-dimensional diffusion of two reacting (isothermally) chemical species. The initial calculations indicate the ozone reactions in the wake of stratospheric aircraft will be substantially affected by the rate of diffusion of ozone into the wake, and in the early wake, by unmixedness.
Analysis of an algae-based CELSS. I - Model development
NASA Technical Reports Server (NTRS)
Holtzapple, Mark T.; Little, Frank E.; Makela, Merry E.; Patterson, C. O.
1989-01-01
A steady state chemical model and computer program have been developed for a life support system and applied to trade-off studies. The model is based on human demand for food and oxygen determined from crew metabolic needs. The model includes modules for water recycle, waste treatment, CO2 removal and treatment, and food production. The computer program calculates rates of use and material balance for food, O2, the recycle of human waste and trash, H2O, N2, and food production/supply. A simple noniterative solution for the model has been developed using the steady state rate equations for the chemical reactions. The model and program have been used in system sizing and subsystem trade-off studies of a partially closed life support system.
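The noniterative steady-state solution described above follows directly from fixing rates by crew demand and propagating mass balances through the modules. A toy version of that arithmetic is shown below; all per-person rates and yields are hypothetical placeholders, not values from the paper.

```python
# Toy steady-state mass balance in the spirit of the CELSS model: rates
# are fixed by crew metabolic demand, so the balance closes with simple
# arithmetic rather than iteration. All constants are hypothetical.

CREW = 4
O2_DEMAND_KG_DAY = 0.84       # hypothetical per-person O2 demand
FOOD_DEMAND_KG_DAY = 0.62     # hypothetical per-person dry food demand
O2_PER_KG_FOOD_GROWN = 1.1    # hypothetical photosynthetic O2 yield

food_rate = CREW * FOOD_DEMAND_KG_DAY           # kg/day food production
o2_from_crops = food_rate * O2_PER_KG_FOOD_GROWN
# Make-up O2 the physico-chemical subsystem must supply (partial closure):
o2_makeup = max(0.0, CREW * O2_DEMAND_KG_DAY - o2_from_crops)
```

Sizing a subsystem then reduces to reading off the corresponding steady-state rate, which is what makes this formulation convenient for trade-off studies.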
Chemicals in the environment have the potential to cause reproductive toxicity by acting on the hypothalamus-pituitary-gonadal (HPG) axis. We have developed a mathematical model to predict chemical impacts on reproductive hormone production in the highly conserved HPG axis using...
This presentation describes EPA efforts to collect, model, and measure publicly available consumer product data for use in exposure assessment. The development of the ORD Chemicals and Products database will be described, as will machine-learning based models for predicting ch...
Identifying the predominant chemical reductants and pathways for electron transfer in anaerobic systems is paramount to the development of environmental fate models that incorporate pathways for abiotic reductive transformations. Currently, such models do not exist. In this chapt...
Various models have been developed to predict the relative binding affinity (RBA) of chemicals to estrogen receptors (ER). These models are important for prioritizing chemicals for screening in biological assays assessing the potential for endocrine disruption. One shortcoming of...
Mathematical models for predicting the transport and fate of pollutants in the environment require reactivity parameter values, that is, values of the physical and chemical constants that govern reactivity. Although empirical structure-activity relationships have been developed th...
In silico prediction of potential chemical reactions mediated by human enzymes.
Yu, Myeong-Sang; Lee, Hyang-Mi; Park, Aaron; Park, Chungoo; Ceong, Hyithaek; Rhee, Ki-Hyeong; Na, Dokyun
2018-06-13
Administered drugs are often converted into an ineffective or activated form by enzymes in our body. Conventional in silico prediction approaches have focused on therapeutically important enzymes such as CYP450. However, there are thousands of different cellular enzymes that can potentially convert an administered drug into other forms. We developed an in silico model to predict which human enzymes, including metabolic enzymes as well as the CYP450 family, can catalyze a given chemical compound. The prediction is based on the chemical and physical similarity between known enzyme substrates and a query chemical compound. Our in silico model was developed using multiple linear regression, and the model showed high performance (AUC = 0.896) despite the large number of enzymes. When evaluated on a test dataset, it also showed significantly high performance (AUC = 0.746). Interestingly, evaluation with literature data showed that our model can be used to predict not only enzymatic reactions but also drug conversion and enzyme inhibition. Our model was able to predict enzymatic reactions of a query molecule with high accuracy. This may foster the discovery of new metabolic routes and accelerate the computational development of drug candidates by enabling the prediction of the potential conversion of administered drugs into active or inactive forms.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ehlen, Mark A.; Sun, Amy C.; Pepple, Mark A.
The potential impacts of man-made and natural disasters on chemical plants, complexes, and supply chains are of great importance to homeland security. To be able to estimate these impacts, we developed an agent-based chemical supply chain model that includes: chemical plants with enterprise operations such as purchasing, production scheduling, and inventories; merchant chemical markets; and multi-modal chemical shipments. Large-scale simulations of chemical-plant activities and supply chain interactions, running on desktop computers, are used to estimate the scope and duration of disruptive-event impacts, and overall system resilience, based on the extent to which individual chemical plants can adjust their internal operations (e.g., production mixes and levels) versus their external interactions (market sales and purchases, and transportation routes and modes). To illustrate how the model estimates the impacts of a hurricane disruption, a simple example model centered on 1,4-butanediol is presented.
New public QSAR model for carcinogenicity
2010-01-01
Background: One of the main goals of the new chemicals regulation REACH (Registration, Evaluation and Authorization of Chemicals) is to fill the gaps in data on properties of chemicals affecting human health. (Q)SAR models are accepted as a suitable source of information. The EU-funded CAESAR project aimed to develop models for the prediction of five endpoints for regulatory purposes; carcinogenicity is one of the endpoints under consideration. Results: Models for the prediction of carcinogenic potency according to the specific requirements of chemical regulation were developed. A dataset of 805 non-congeneric chemicals extracted from the Carcinogenic Potency Database (CPDBAS) was used, and a Counter-Propagation Artificial Neural Network (CP ANN) algorithm was implemented. Two alternative models for predicting carcinogenicity are described: the first employed eight MDL descriptors (model A) and the second twelve Dragon descriptors (model B). The CAESAR models were assessed according to the OECD principles for the validation of QSAR, using a wide series of statistical checks. Models A and B yielded training-set (644 compounds) accuracies of 91% and 89%, respectively; test-set (161 compounds) accuracies were 73% and 69%, with specificities of 69% and 61%, respectively. Sensitivity in both cases was 75%. Leave-20%-out cross-validation accuracy on the training set was 66% for model A and 62% for model B. To verify that the models perform correctly on new compounds, external validation was carried out on a set of 738 compounds, yielding accuracies of 61.4% and 60.0%, sensitivities of 64.0% and 61.8%, and specificities of 58.9% and 58.4% for models A and B, respectively.
Conclusion: Carcinogenicity is a particularly important endpoint, and it is expected that QSAR models will not replace human experts' opinions and conventional methods. However, we believe that a combination of several methods will provide useful support to the overall evaluation of carcinogenicity. In the present paper, models for the classification of carcinogenic compounds using MDL and Dragon descriptors were developed. The models could be used to set priorities among chemicals for further testing. The models at the CAESAR site were implemented in Java and are publicly accessible. PMID:20678182
NASA Astrophysics Data System (ADS)
Meng, X.; Liu, Y.; Diner, D. J.; Garay, M. J.
2016-12-01
Ambient fine particulate matter (PM2.5) has been positively associated with increased mortality and morbidity worldwide. Recent studies highlight the characteristics and differential toxicity of PM2.5 chemical components, which are important for identifying sources, developing targeted particulate matter (PM) control strategies, and protecting public health. Modelling with satellite-retrieved data has proven to be a cost-effective way to estimate ground-level PM2.5; however, few studies have predicted PM2.5 chemical components with this method. In this study, the experimental MISR 4.4 km aerosol retrievals were used to predict ground-level particle sulfate, nitrate, organic carbon, and elemental carbon concentrations in 16 counties of southern California. The PM2.5 chemical component concentrations were obtained from the national Chemical Speciation Network (CSN) and the Interagency Monitoring of Protected Visual Environments (IMPROVE) network. A generalized additive model (GAM) was developed based on 16 years of data (2000-2015) by combining the MISR aerosol retrievals, meteorological variables, and geographical indicators. Model performance was assessed by model-fitted R², root-mean-square error (RMSE), and 10-fold cross-validation. Spatial patterns of sulfate, nitrate, OC, and EC concentrations were also examined with 2-D prediction surfaces. This is the first attempt to develop high-resolution spatial models to predict PM2.5 chemical component concentrations with MISR-retrieved aerosol properties, which will provide valuable population exposure estimates for future studies on the characteristics and differential toxicity of PM2.5 speciation.
CERAPP: Collaborative Estrogen Receptor Activity Prediction ...
Humans are potentially exposed to thousands of man-made chemicals in the environment. Some chemicals mimic natural endocrine hormones and thus have the potential to be endocrine disruptors. Many of these chemicals have never been tested for their ability to interact with the estrogen receptor (ER). Risk assessors need tools to prioritize chemicals for assessment in costly in vivo tests, for instance, within the EPA Endocrine Disruptor Screening Program. Here, we describe a large-scale modeling project called CERAPP (Collaborative Estrogen Receptor Activity Prediction Project) demonstrating the efficacy of using predictive computational models on high-throughput screening data to screen thousands of chemicals against the ER. CERAPP combined multiple models developed in collaboration among 17 groups in the United States and Europe to predict ER activity of a common set of 32,464 chemical structures. Quantitative structure-activity relationship models and docking approaches were employed, mostly using a common training set of 1677 compounds provided by EPA, to build a total of 40 categorical and 8 continuous models for binding, agonist, and antagonist ER activity. All predictions were tested using an evaluation set of 7522 chemicals collected from the literature. To overcome the limitations of single models, a consensus was built weighting models using a scoring function (0 to 1) based on their accuracies. Individual model scores ranged from 0.69 to 0.85.
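The accuracy-weighted consensus idea described in this record can be sketched simply: each model votes with a weight given by its score in [0, 1], and the consensus class is the weighted majority. The models, votes, and scores below are invented for illustration and are not CERAPP's actual scoring function.

```python
# Sketch of an accuracy-weighted consensus of binary classifiers
# (hypothetical weights and votes, not the CERAPP values).

def consensus(predictions, weights, threshold=0.5):
    """predictions: 1 = active, 0 = inactive; weights: model scores in
    [0, 1]. Returns the weighted-majority consensus class."""
    total = sum(weights)
    weighted = sum(p * w for p, w in zip(predictions, weights)) / total
    return 1 if weighted >= threshold else 0

# Three hypothetical models; the two higher-scoring ones call "active":
consensus([1, 1, 0], [0.85, 0.72, 0.69])   # -> 1
```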
In silico Testing of Environmental Impact on Embryonic Vascular Development
Understanding risks to embryonic development from exposure to environmental chemicals is a significant challenge given the diverse chemical landscape and paucity of data for most of these compounds. EPA’s Virtual Embryo project is building in silico models of morphogenesis to tes...
Use of an Existing Airborne Radon Data Base in the Verification of the NASA/AEAP Core Model
NASA Technical Reports Server (NTRS)
Kritz, Mark A.
1998-01-01
The primary objective of this project was to apply tropospheric atmospheric radon (Rn222) measurements to the development and verification of the global 3-D atmospheric chemical transport model under development by NASA's Atmospheric Effects of Aviation Project (AEAP). The AEAP project had two principal components: (1) a modeling effort, whose goal was to create and test an elaborate three-dimensional atmospheric chemical transport model (the NASA/AEAP Core model) and apply it to an evaluation of the possible short- and long-term effects of aircraft emissions on atmospheric chemistry and climate; and (2) a measurement effort, whose goal was to obtain a focused set of atmospheric measurements that would provide some of the observational data used in the modeling effort. My activity in this project was confined to the first of these components. Both atmospheric transport and atmospheric chemical reactions (as well as the input and removal of chemical species) are accounted for in the NASA/AEAP Core model. Thus, for example, in assessing the effect of aircraft effluents on the chemistry of a given region of the upper troposphere, the model must keep track not only of the chemical reactions of the effluent species emitted by aircraft flying in this region, but also of the transport into the region of these (and other) species from other, remote sources, for example, via the vertical convection of boundary layer air to the upper troposphere. Radon, because of its known surface source, known radioactive half-life, and freedom from chemical production or loss and from removal from the atmosphere by physical scavenging, is a recognized and valuable tool for testing the transport components of global transport and circulation models.
Numerical Validation of Chemical Compositional Model for Wettability Alteration Processes
NASA Astrophysics Data System (ADS)
Bekbauov, Bakhbergen; Berdyshev, Abdumauvlen; Baishemirov, Zharasbek; Bau, Domenico
2017-12-01
Chemical compositional simulation of enhanced oil recovery and surfactant-enhanced aquifer remediation processes is a complex task that involves solving dozens of equations for all grid blocks representing a reservoir. In the present work, we perform a numerical validation of the newly developed mathematical formulation, which satisfies the conservation laws of mass and energy and allows applying a sequential solution approach to solve the governing equations separately and implicitly. Through its application to a numerical experiment using a wettability alteration model, and comparisons with an existing chemical compositional model's numerical results, the new model has proven to be practical, reliable, and stable.
MODELING A MIXTURE: PBPK/PD APPROACHES FOR PREDICTING CHEMICAL INTERACTIONS.
Since environmental chemical exposures generally involve multiple chemicals, there are both regulatory and scientific drivers to develop methods to predict outcomes of these exposures. Even using efficient statistical and experimental designs, it is not possible to test in vivo a...
2017-06-01
The Chemical Transformation Simulator (CTS) was developed by the U.S. Environmental Protection Agency (EPA) to provide physicochemical properties of complex...
NASA Astrophysics Data System (ADS)
Lezberg, Erwin A.; Mularz, Edward J.; Liou, Meng-Sing
1991-03-01
The objectives and accomplishments of research in chemical reacting flows, covering both experimental and computational problems, are described. The experimental research emphasizes the acquisition of reliable reacting-flow data for code validation, the development of chemical kinetics mechanisms, and the understanding of two-phase flow dynamics. Typical results from two nonreacting spray studies are presented. The computational fluid dynamics (CFD) research emphasizes the development of efficient and accurate algorithms and codes, as well as validation of methods and modeling (turbulence and kinetics) for reacting flows. Major developments of the RPLUS code and its application to mixing concepts, the General Electric combustor, and the Government baseline engine for the National Aerospace Plane are detailed. Finally, the turbulence research in the newly established Center for Modeling of Turbulence and Transition (CMOTT) is described.
Exploring Contextual Models in Chemical Patent Search
NASA Astrophysics Data System (ADS)
Urbain, Jay; Frieder, Ophir
We explore the development of probabilistic retrieval models for integrating term statistics with entity search using multiple levels of document context to improve the performance of chemical patent search. A distributed indexing model was developed to enable efficient named entity search and aggregation of term statistics at multiple levels of patent structure including individual words, sentences, claims, descriptions, abstracts, and titles. The system can be scaled to an arbitrary number of compute instances in a cloud computing environment to support concurrent indexing and query processing operations on large patent collections.
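The multi-level aggregation of term statistics described in this abstract can be illustrated with a toy sketch. The field names, level weights, and mixture-of-language-models scoring below are illustrative assumptions, not details taken from the paper:

```python
from collections import Counter

# Hypothetical miniature "patent" with three structural levels; the real
# system also indexes words, sentences, and descriptions.
patent = {
    "title": "catalytic converter coating",
    "abstract": "a coating composition for catalytic converters",
    "claims": ["a coating comprising platinum", "the coating of claim 1"],
}

def level_stats(doc):
    """Term frequencies aggregated separately at each structural level."""
    return {
        "title": Counter(doc["title"].split()),
        "abstract": Counter(doc["abstract"].split()),
        "claims": Counter(w for claim in doc["claims"] for w in claim.split()),
    }

def score(query, stats, weights=None):
    """Score a query as a weighted mixture of level-specific
    maximum-likelihood language models (weights are assumptions)."""
    weights = weights or {"title": 0.5, "abstract": 0.3, "claims": 0.2}
    total = 0.0
    for term in query.split():
        for level, w in weights.items():
            n = sum(stats[level].values())
            if n:
                total += w * stats[level][term] / n
    return total
```

A term that appears at several levels (e.g. in both the title and the claims) accumulates evidence from each, which is the intuition behind contextual models of document structure.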
The number of chemicals with limited toxicological information for chemical safety decision-making has accelerated alternative model development, which often are evaluated via referencing animal toxicology studies. In vivo studies are generally considered the standard for hazard ...
MICHTOX: A MASS BALANCE AND BIOACCUMULATION MODEL FOR TOXIC CHEMICALS IN LAKE MICHIGAN
MICHTOX is a toxic chemical mass balance and bioaccumulation model for Lake Michigan. It was developed for USEPA's Region V in support of the Lake Michigan Lake-wide Management Plan (LaMP) to provide guidance on expected water quality improvements in response to critical pollutan...
The X*TRAX™ Model 200 Thermal Desorption System developed by Chemical Waste Management, Inc. (CWM), is a low-temperature process designed to separate organic contaminants from soils, sludges, and other solid media. The X*TRAX™ Model 200 is fully transportable and consists of thre...
Various models have been developed to predict the relative binding affinity (RBA) of chemicals to estrogen receptors (ER). These models can be used to prioritize chemicals for further tiered biological testing to assess the potential for endocrine disruption. One shortcoming of mode...
There is a need to develop rapid and efficient models to screen chemicals for their potential to cause developmental neurotoxicity. Use of in vitro neuronal models, including human cells, is one approach that allows for timely, cost-effective toxicity screening. The present study...
Framework for a Quantitative Systemic Toxicity Model (FutureToxII)
EPA’s ToxCast program profiles the bioactivity of chemicals in a diverse set of ~700 high throughput screening (HTS) assays. In collaboration with L’Oreal, a quantitative model of systemic toxicity was developed using no effect levels (NEL) from ToxRefDB for 633 chemicals with HT...
There is a need to develop rapid and efficient models for screening chemicals for their potential to cause developmental neurotoxicity. Use of in vitro neuronal models, including human cells, is one approach that allows for timely, cost-effective toxicity screening. The present s...
M4FT-16LL080302052-Update to Thermodynamic Database Development and Sorption Database Integration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zavarin, Mavrik; Wolery, T. J.; Atkins-Duffin, C.
2016-08-16
This progress report (Level 4 Milestone Number M4FT-16LL080302052) summarizes research conducted at Lawrence Livermore National Laboratory (LLNL) within the Argillite Disposal R&D Work Package Number FT-16LL08030205. The focus of this research is the thermodynamic modeling of Engineered Barrier System (EBS) materials and properties and development of thermodynamic databases and models to evaluate the stability of EBS materials and their interactions with fluids at various physico-chemical conditions relevant to subsurface repository environments. The development and implementation of equilibrium thermodynamic models are intended to describe chemical and physical processes such as solubility, sorption, and diffusion.
Recent advances in the in silico modelling of UDP glucuronosyltransferase substrates.
Sorich, Michael J; Smith, Paul A; Miners, John O; Mackenzie, Peter I; McKinnon, Ross A
2008-01-01
UDP glucuronosyltransferases (UGT) are a superfamily of enzymes that catalyse the conjugation of a range of structurally diverse drugs, environmental and endogenous chemicals with glucuronic acid. This process plays a significant role in the clearance and detoxification of many chemicals. Over the last decade the regulation and substrate profiles of UGT isoforms have been increasingly characterised. The resulting data have facilitated the prototyping of ligand-based in silico models capable of predicting, and gaining insights into, binding affinity and the substrate- and regioselectivity of glucuronidation by UGT isoforms. Pharmacophore modelling has produced particularly insightful models, and quantitative structure-activity relationships based on machine learning algorithms result in accurate predictions. Simple structural chemical descriptors were found to capture much of the chemical information relevant to UGT metabolism. However, quantum chemical properties of molecules and of the nucleophilic atoms in the molecule can enhance both the predictivity and the chemical intuitiveness of structure-activity models. Chemical diversity analysis of known substrates has shown some bias towards chemicals with aromatic and aliphatic hydroxyl groups. Future progress in in silico model development will depend on larger and more diverse high-quality metabolic datasets. Furthermore, improved protein structure data on UGTs will enable the application of structural modelling techniques, likely leading to greater insight into the binding and reactive processes of UGT-catalysed glucuronidation.
A Simplified Method for the 3D Printing of Molecular Models for Chemical Education
ERIC Educational Resources Information Center
Jones, Oliver A. H.; Spencer, Michelle J. S.
2018-01-01
Using tangible models to help students visualize chemical structures in three dimensions has been a mainstay of chemistry education for many years. Conventional chemistry modeling kits are, however, limited in the types and accuracy of the molecules, bonds and structures they can be used to build. The recent development of 3D printing technology…
Scott V. Ollinger; John D. Aber; Anthony C. Federer; Gary M. Lovett; Jennifer M. Ellis
1995-01-01
A model of physical and chemical climate was developed for New York and New England that can be used in a GIS for integration with ecosystem models. The variables included are monthly average maximum and minimum daily temperatures, precipitation, humidity, and solar radiation, as well as annual atmospheric deposition of sulfur and nitrogen. Equations generated from...
Alves, Vinicius M.; Muratov, Eugene; Fourches, Denis; Strickland, Judy; Kleinstreuer, Nicole; Andrade, Carolina H.; Tropsha, Alexander
2015-01-01
Skin permeability is widely considered to be mechanistically implicated in chemically-induced skin sensitization. Although many chemicals have been identified as skin sensitizers, there have been very few reports analyzing the relationships between molecular structure and the skin permeability of sensitizers and non-sensitizers. The goals of this study were to: (i) compile, curate, and integrate the largest publicly available dataset of chemicals studied for their skin permeability; (ii) develop and rigorously validate QSAR models to predict skin permeability; and (iii) explore the complex relationships between skin sensitization and skin permeability. Based on the largest publicly available dataset compiled in this study, we found no overall correlation between skin permeability and skin sensitization. In addition, the cross-species correlation coefficient between human and rodent permeability data was found to be as low as R2=0.44. Human skin permeability models based on the random forest method have been developed and validated using an OECD-compliant QSAR modeling workflow. Their external accuracy was high (Q2ext = 0.73 for the 63% of external compounds inside the applicability domain). The extended analysis using both experimentally measured and QSAR-imputed data still confirmed the absence of any overall concordance between skin permeability and skin sensitization. This observation suggests that chemical modifications that affect skin permeability should not be presumed a priori to modulate the sensitization potential of chemicals. The models reported herein, as well as those developed in the companion paper on skin sensitization, suggest that it may be possible to rationally design compounds with the desired high skin permeability but low sensitization potential. PMID:25560673
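The external validation statistic quoted in this abstract (Q2ext) has a standard form in QSAR practice. The sketch below shows one common formulation, relative to the training-set mean; it is not necessarily the exact variant the authors used:

```python
def q2_ext(y_true, y_pred, y_train_mean):
    """External predictive squared correlation coefficient:
    Q2_ext = 1 - PRESS / SS, where PRESS is the prediction error sum of
    squares on the external set and SS is taken about the training-set
    mean. Used to judge QSAR predictions for held-out compounds."""
    press = sum((yt - yp) ** 2 for yt, yp in zip(y_true, y_pred))
    ss = sum((yt - y_train_mean) ** 2 for yt in y_true)
    return 1.0 - press / ss
```

A value near 1 indicates strong external predictivity; the abstract reports Q2ext = 0.73 for compounds inside the applicability domain.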
Brooks, Kriston P.; Sprik, Samuel J.; Tamburello, David A.; ...
2018-04-07
The U.S. Department of Energy (DOE) developed a vehicle Framework model to simulate fuel cell-based light-duty vehicle operation for various hydrogen storage systems. This transient model simulates the performance of the storage system, fuel cell, and vehicle for comparison to Technical Targets established by DOE for four drive cycles/profiles. Chemical hydrogen storage models have been developed for the Framework for both exothermic and endothermic materials. Despite the utility of such models, they require that material researchers input system design specifications that cannot be estimated easily. To address this challenge, a design tool has been developed that allows researchers to directly enter kinetic and thermodynamic chemical hydrogen storage material properties into a simple sizing module that then estimates the system parameters required to run the storage system model. Additionally, the design tool can be used as a standalone executable file to estimate the storage system mass and volume outside of the Framework model. Here, these models will be explained and exercised with the representative hydrogen storage materials exothermic ammonia borane (NH3BH3) and endothermic alane (AlH3).
Towards cleaner combustion engines through groundbreaking detailed chemical kinetic models
Battin-Leclerc, Frédérique; Blurock, Edward; Bounaceur, Roda; Fournet, René; Glaude, Pierre-Alexandre; Herbinet, Olivier; Sirjean, Baptiste; Warth, V.
2013-01-01
In the context of limiting the environmental impact of transportation, this paper reviews new directions being followed in the development of more predictive and more accurate detailed chemical kinetic models for the combustion of fuels. In the first part, the performance of current models, especially in terms of the prediction of pollutant formation, is evaluated. In the subsequent parts, recent methods and ways to improve these models are described. Emphasis is placed on the development of detailed models based on elementary reactions, on the production of the related thermochemical and kinetic parameters, and on the experimental techniques available to produce the data necessary to evaluate model predictions under well-defined conditions. PMID:21597604
Developing a predictive model for the chemical composition of soot nanoparticles
DOE Office of Scientific and Technical Information (OSTI.GOV)
Violi, Angela; Michelsen, Hope; Hansen, Nils
In order to provide the scientific foundation to enable technology breakthroughs in transportation fuel, it is important to develop a combustion modeling capability to optimize the operation and design of evolving fuels in advanced engines for transportation applications. The goal of this proposal is to develop a validated predictive model to describe the chemical composition of soot nanoparticles in premixed and diffusion flames. Atomistic studies in conjunction with state-of-the-art experiments are the distinguishing characteristics of this unique interdisciplinary effort. The modeling effort has been conducted at the University of Michigan by Prof. A. Violi. The experimental work has entailed a series of studies using different techniques to analyze gas-phase soot precursor chemistry and soot particle production in premixed and diffusion flames. Measurements have provided spatial distributions of polycyclic aromatic hydrocarbons and other gas-phase species, and the size and composition of incipient soot nanoparticles, for comparison with model results. The experimental team includes Dr. N. Hansen and Dr. H. Michelsen at Sandia National Labs' Combustion Research Facility, and Dr. K. Wilson as collaborator at Lawrence Berkeley National Lab's Advanced Light Source. Our results show that the chemical and physical properties of nanoparticles affect the coagulation behavior in soot formation, and an experimentally validated, predictive model for the chemical composition of soot nanoparticles will not only enhance our understanding of soot formation but will also allow the prediction of particle size distributions under combustion conditions. These results provide a novel description of soot formation based on the physical and chemical properties of the particles for use in the next generation of soot models, and an enhanced capability for facilitating the design of alternative fuels and the engines they will power.
Reppas-Chrysovitsinos, Efstathios; Sobek, Anna; MacLeod, Matthew
2016-06-15
Polymeric materials flowing through the technosphere are repositories of organic chemicals throughout their life cycle. Equilibrium partition ratios of organic chemicals between these materials and air (KMA) or water (KMW) are required for models of fate and transport, high-throughput exposure assessment and passive sampling. KMA and KMW have been measured for a growing number of chemical/material combinations, but significant data gaps still exist. We assembled a database of 363 KMA and 910 KMW measurements for 446 individual compounds and nearly 40 individual polymers and biopolymers, collected from 29 studies. We used the EPI Suite and ABSOLV software packages to estimate physicochemical properties of the compounds and we employed an empirical correlation based on Trouton's rule to adjust the measured KMA and KMW values to a standard reference temperature of 298 K. Then, we used a thermodynamic triangle with Henry's law constant to calculate a complete set of 1273 KMA and KMW values. Using simple linear regression, we developed a suite of single parameter linear free energy relationship (spLFER) models to estimate KMA from the EPI Suite-estimated octanol-air partition ratio (KOA) and KMW from the EPI Suite-estimated octanol-water (KOW) partition ratio. Similarly, using multiple linear regression, we developed a set of polyparameter linear free energy relationship (ppLFER) models to estimate KMA and KMW from ABSOLV-estimated Abraham solvation parameters. We explored the two LFER approaches to investigate (1) their performance in estimating partition ratios, and (2) uncertainties associated with treating all different polymers as a single "bulk" polymeric material compartment. The models we have developed are suitable for screening assessments of the tendency for organic chemicals to be emitted from materials, and for use in multimedia models of the fate of organic chemicals in the indoor environment. 
In screening applications we recommend that KMA and KMW be modeled as 0.06 × KOA and 0.06 × KOW, respectively, with an uncertainty range of a factor of 15.
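The recommended screening rule translates directly into code. The sketch below implements the stated relationship and its factor-of-15 uncertainty band; the function names and interface are my own, not the authors':

```python
def screen_kma(log_koa):
    """Screening estimate of the material-air partition ratio KMA from
    log10(KOA), using the recommended rule KMA ~ 0.06 * KOA.
    Returns (estimate, lower, upper) with a factor-of-15 band."""
    kma = 0.06 * 10.0 ** log_koa
    return kma, kma / 15.0, kma * 15.0

def screen_kmw(log_kow):
    """Same screening rule applied to the material-water ratio KMW
    via KOW: KMW ~ 0.06 * KOW."""
    kmw = 0.06 * 10.0 ** log_kow
    return kmw, kmw / 15.0, kmw * 15.0
```

For example, a compound with log KOA = 6 would get a screening KMA of about 6 × 10^4, bounded between 4 × 10^3 and 9 × 10^5.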
High-throughput screening of chemicals as functional ...
Identifying chemicals that provide a specific function within a product, yet have minimal impact on the human body or environment, is the goal of most formulation chemists and engineers practicing green chemistry. We present a methodology to identify potential chemical functional substitutes from large libraries of chemicals using machine learning based models. We collect and analyze publicly available information on the function of chemicals in consumer products or industrial processes to identify a suite of harmonized function categories suitable for modeling. We use structural and physicochemical descriptors for these chemicals to build 41 quantitative structure–use relationship (QSUR) models for harmonized function categories using random forest classification. We apply these models to screen a library of nearly 6400 chemicals with available structure information for potential functional substitutes. Using our Functional Use database (FUse), we could identify uses for 3121 chemicals; 4412 predicted functional uses had a probability of 80% or greater. We demonstrate the potential application of the models to high-throughput (HT) screening for “candidate alternatives” by merging the valid functional substitute classifications with hazard metrics developed from HT screening assays for bioactivity. A descriptor set could be obtained for 6356 Tox21 chemicals that have undergone a battery of HT in vitro bioactivity screening assays. By applying QSURs, we wer
Sarmah, Swapnalee; Marrs, James A
2016-12-16
Environmental pollution is a serious problem of the modern world that poses a major threat to public health. Exposure to environmental pollutants during embryonic development is particularly risky. Although many pollutants have been verified as potential toxicants, there are new chemicals in the environment that need assessment. Heart development is an extremely sensitive process, which can be affected by environmentally toxic molecule exposure during embryonic development. Congenital heart defects are the most common life-threatening global health problems, and the etiology is mostly unknown. The zebrafish has emerged as an invaluable model to examine substance toxicity on vertebrate development, particularly on cardiac development. The zebrafish offers numerous advantages for toxicology research not found in other model systems. Many laboratories have used the zebrafish to study the effects of widespread chemicals in the environment on heart development, including pesticides, nanoparticles, and various organic pollutants. Here, we review the uses of the zebrafish in examining effects of exposure to external molecules during embryonic development in causing cardiac defects, including chemicals ubiquitous in the environment and illicit drugs. Known or potential mechanisms of toxicity and how zebrafish research can be used to provide mechanistic understanding of cardiac defects are discussed.
Liu, Yewei; Yin, Ting; Feng, Yuanbo; Cona, Marlein Miranda; Huang, Gang; Liu, Jianjun; Song, Shaoli; Jiang, Yansheng; Xia, Qian; Swinnen, Johannes V.; Bormans, Guy; Himmelreich, Uwe; Oyen, Raymond
2015-01-01
Compared with transplanted tumor models or genetically engineered cancer models, chemically induced primary malignancies in experimental animals can mimic clinical cancer progression from the early stage on. Cancer caused by chemical carcinogens generally develops through three phases, namely initiation, promotion, and progression. Based on their mechanisms, chemical carcinogens can be divided into genotoxic and non-genotoxic, or complete and incomplete, ones, usually with an organ-specific property. Chemical carcinogens can be classified according to their origins, such as environmental pollutants, cooked-meat-derived carcinogens, N-nitroso compounds, food additives, antineoplastic agents, naturally occurring substances, and synthetic carcinogens. Carcinogen-induced models of primary cancers can be used to evaluate the diagnostic/therapeutic effects of candidate drugs, investigate biological influential factors, explore preventive measures against carcinogenicity, and better understand the molecular mechanisms involved in tumor initiation, promotion, and progression. Among commonly adopted cancer models, chemically induced primary malignancies in mammals have several advantages, including simple procedures, productive tumor generation, and close analogy to clinical human primary cancers. However, in addition to the time-consuming process, the major drawback of chemical carcinogenesis for translational research is the difficulty of noninvasive tumor burden assessment in small animals. As with human cancers, tumors in animals occur unpredictably in terms of timing, location, and number of lesions.
Thanks to the availability of magnetic resonance imaging (MRI), with advantages such as ionizing-radiation-free scanning, superb soft-tissue contrast, multi-parametric information, and the utility of diverse contrast agents, a workable solution to this bottleneck is to apply MRI for noninvasive detection, diagnosis, and therapeutic monitoring in these otherwise uncontrollable animal models with primary cancers. Moreover, it is foreseeable that the combined use of chemically induced primary cancer models and molecular imaging techniques may help to develop new anticancer diagnostics and therapeutics. PMID:26682141
Luo, Xiang; Yang, Xianhai; Qiao, Xianliang; Wang, Ya; Chen, Jingwen; Wei, Xiaoxuan; Peijnenburg, Willie J G M
2017-03-22
Reaction with hydroxyl radicals (˙OH) is an important removal pathway for organic pollutants in the aquatic environment. The aqueous reaction rate constant (kOH) is therefore an important parameter for fate assessment of aquatic pollutants. Since experimental determination cannot efficiently handle the large number of organic chemicals at limited cost and within a relatively short period of time, in silico methods such as quantitative structure-activity relationship (QSAR) models are needed to predict kOH. In this study, a QSAR model with a larger and wider applicability domain compared with existing models was developed. Following the guidelines for the development and validation of QSAR models proposed by the Organization for Economic Co-operation and Development (OECD), the model shows satisfactory performance. The applicability domain of the model has been extended to contain chemicals that have rarely been covered in most previous studies. The chemicals covered in the current model contain functional groups including >C=C<, -C≡C-, -C6H5, -OH, -CHO, -O-, >C=O, -C(=O)O-, -COOH, -C≡N, >N-, -NH2, -NH-C(O)-, -NO2, -N=C-N<, >N-N<, -N=N-, -S-, -S-S-, -SH, -SO3, -SO4, -PO4, and -X (F, Cl, Br, and I).
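As a concrete illustration of the applicability-domain concept invoked in this abstract, a leverage-based check with the warning leverage h* = 3(p + 1)/n is commonly paired with OECD-compliant QSAR validation. The sketch below shows that generic technique, not the authors' specific implementation:

```python
import numpy as np

def leverages(X):
    """Diagonal of the hat matrix H = X (X'X)^-1 X'. The leverage h_i
    measures how far descriptor vector i lies from the bulk of the
    training data in descriptor space."""
    H = X @ np.linalg.inv(X.T @ X) @ X.T
    return np.diag(H)

def warning_leverage(n_samples, n_descriptors):
    """Common QSAR warning threshold h* = 3(p + 1)/n; a query chemical
    whose leverage exceeds h* is flagged as outside the domain."""
    return 3.0 * (n_descriptors + 1) / n_samples
```

Predictions for chemicals above the warning leverage are treated as extrapolations and reported with lower confidence.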
NASA Astrophysics Data System (ADS)
Jozwiak, Zbigniew Boguslaw
1995-01-01
Ethylene is an important auto-catalytic plant growth hormone. Removal of ethylene from the atmosphere surrounding ethylene-sensitive horticultural products may be very beneficial, allowing an extended period of storage and preventing or delaying the induction of disorders. Various ethylene removal techniques have been studied and put into practice. One technique is based on using low-pressure mercury ultraviolet lamps as a source of photochemical energy to initiate chemical reactions that destroy ethylene. Although previous research showed that ethylene disappeared in experiments with mercury ultraviolet lamps, the reactions were not described and the actual cause of ethylene disappearance remained unknown. Proposed causes for this disappearance were the direct action of ultraviolet rays on ethylene, reaction of ethylene with ozone (which is formed when air or gas containing molecular oxygen is exposed to radiation emitted by this type of lamp), or reactions with atomic oxygen leading to the formation of ozone. The objective of the present study was to determine the set of physical and chemical processes leading to the disappearance of ethylene from an artificial storage atmosphere under conditions of ultraviolet irradiation. The goal was achieved by developing a static chemical model based on the physical properties of a commercially available ultraviolet lamp, the photochemistry of gases, and the kinetics of chemical reactions. The model was used to perform computer simulations predicting time-dependent concentrations of the chemical species included in the model. Development of the model was accompanied by the design of a reaction chamber used for experimental verification. The model provided a good prediction of the general behavior of the species involved in the chemistry under consideration; however, it predicted a lower rate of ethylene disappearance than was measured.
Some reasons for the model-experiment disagreement are radiation-intensity averaging, the experimental technique, mass transfer in the chamber, and the incompleteness of the set of chemical reactions included in the model. The work concludes with guidelines for the development of a more complex mathematical model that includes elements of mass transfer inside the reaction chamber and uses a three-dimensional approach to distribute radiation from the low-pressure mercury ultraviolet tube.
NASA Astrophysics Data System (ADS)
Gramatica, Paola
This chapter surveys the QSAR modeling approaches (developed by the author's research group) for the validated prediction of environmental properties of organic pollutants. Various chemometric methods, based on different theoretical molecular descriptors, have been applied: explorative techniques (such as PCA for ranking and SOM for similarity analysis), modeling approaches by multiple linear regression (MLR, in particular OLS), and classification methods (mainly k-NN, CART, CP-ANN). The focus of this review is on the main topics of environmental chemistry and ecotoxicology related to the physico-chemical properties, reactivity, and biological activity of chemicals of high environmental concern. Thus, the review deals with atmospheric degradation reactions of VOCs by tropospheric oxidants, persistence and long-range transport of POPs, sorption behavior of pesticides (Koc and leaching), bioconcentration, toxicity (acute aquatic toxicity, mutagenicity of PAHs, estrogen-binding activity of endocrine-disrupting compounds (EDCs)), and finally persistent, bioaccumulative, and toxic (PBT) behavior for the screening and prioritization of organic pollutants. Common to all the proposed models is the attention paid to model validation for predictive ability (not only internal, but also external, for chemicals not participating in model development) and to checking the chemical domain of applicability. Adherence to such a policy, also requested by the OECD principles, ensures the production of reliable predicted data, useful also under the new European regulation of chemicals, REACH.
NASA Astrophysics Data System (ADS)
Belis, Claudio A.; Pernigotti, Denise; Pirovano, Guido
2017-04-01
Source Apportionment (SA) is the identification of ambient air pollution sources and the quantification of their contribution to pollution levels. This task can be accomplished using different approaches: chemical transport models and receptor models. Receptor models are derived from measurements and are therefore considered a reference for the urban background levels of primary sources. Chemical transport models provide better estimates of secondary (inorganic) pollutants and are capable of providing gridded results with high time resolution. Assessing the performance of SA model results is essential to guarantee reliable information on source contributions to be used for reporting to the Commission and in the development of pollution abatement strategies. This is the first intercomparison ever designed to test both receptor-oriented models (receptor models) and chemical transport models (source-oriented models) using a comprehensive method based on model quality indicators and pre-established criteria. The target pollutant of this exercise, organised in the frame of FAIRMODE WG 3, is PM10. Both receptor models and chemical transport models perform well when evaluated against their respective references. Both types of models estimate yearly source contributions quite satisfactorily, while the estimation of source contributions at the daily level (time series) is more critical. Chemical transport models showed a tendency to underestimate the contribution of some single sources when compared to receptor models. For receptor models the most critical source category is industry, probably because of the variety of single sources with different characteristics that belong to this category.
Dust is the most problematic source for chemical transport models, likely due to the poor information about this kind of source in the emission inventories, particularly concerning road-dust re-suspension, and consequently the little detail about the chemical components of this source used in the models. The sensitivity tests show that chemical transport models perform better when using a detailed set of sources (14) than when using a simplified one (only 8). It was also observed that enhanced vertical profiling can improve the estimation of specific sources, such as industry, under complex meteorological conditions, and that insufficient spatial resolution in urban areas can impact the capability of models to estimate the contribution of diffuse primary sources (e.g. traffic). Both families of models identify traffic and biomass burning as the first and second most contributing categories, respectively, to elemental carbon. The results of this study demonstrate that the source apportionment assessment methodology developed by the JRC is applicable to any kind of SA model. The same methodology is implemented in the on-line DeltaSA tool to support source apportionment model evaluation (http://source-apportionment.jrc.ec.europa.eu/).
Bio-chemo-mechanical models of vascular mechanics
Kim, Jungsil; Wagenseil, Jessica E.
2014-01-01
Models of vascular mechanics are necessary to predict the response of an artery under a variety of loads, for complex geometries, and in pathological adaptation. Classic constitutive models for arteries are phenomenological and the fitted parameters are not associated with physical components of the wall. Recently, microstructurally-linked models have been developed that associate structural information about the wall components with tissue-level mechanics. Microstructurally-linked models are useful for correlating changes in specific components with pathological outcomes, so that targeted treatments may be developed to prevent or reverse the physical changes. However, most treatments, and many causes, of vascular disease have chemical components. Chemical signaling within cells, between cells, and between cells and matrix constituents affects the biology and mechanics of the arterial wall in the short- and long-term. Hence, bio-chemo-mechanical models that include chemical signaling are critical for robust models of vascular mechanics. This review summarizes bio-mechanical and bio-chemo-mechanical models with a focus on large elastic arteries. We provide applications of these models and challenges for future work. PMID:25465618
Perspective: Reaches of chemical physics in biology.
Gruebele, Martin; Thirumalai, D
2013-09-28
Chemical physics as a discipline contributes many experimental tools, algorithms, and fundamental theoretical models that can be applied to biological problems. This is especially true now as the molecular level and the systems level descriptions begin to connect, and multi-scale approaches are being developed to solve cutting edge problems in biology. In some cases, the concepts and tools got their start in non-biological fields, and migrated over, such as the idea of glassy landscapes, fluorescence spectroscopy, or master equation approaches. In other cases, the tools were specifically developed with biological physics applications in mind, such as modeling of single molecule trajectories or super-resolution laser techniques. In this introduction to the special topic section on chemical physics of biological systems, we consider a wide range of contributions, all the way from the molecular level, to molecular assemblies, chemical physics of the cell, and finally systems-level approaches, based on the contributions to this special issue. Chemical physicists can look forward to an exciting future where computational tools, analytical models, and new instrumentation will push the boundaries of biological inquiry.
Numerical simulation of hypersonic inlet flows with equilibrium or finite rate chemistry
NASA Technical Reports Server (NTRS)
Yu, Sheng-Tao; Hsieh, Kwang-Chung; Shuen, Jian-Shun; Mcbride, Bonnie J.
1988-01-01
An efficient numerical program incorporating comprehensive high-temperature gas property models has been developed to simulate hypersonic inlet flows. The computer program employs an implicit lower-upper time-marching scheme to solve the two-dimensional Navier-Stokes equations with variable thermodynamic and transport properties. Both finite-rate and local-equilibrium approaches are adopted in the chemical reaction model for dissociation and ionization of the inlet air. In the finite-rate approach, eleven species equations coupled with the fluid dynamic equations are solved simultaneously. In the local-equilibrium approach, instead of solving species equations, an efficient chemical equilibrium package has been developed and incorporated into the flow code to obtain chemical compositions directly. Gas properties for the reaction product species are calculated by the methods of statistical mechanics and fitted to a polynomial form for Cp. In the present study, since the chemical reaction time is comparable to the flow residence time, the local-equilibrium model underpredicts the temperature in the shock layer. Significant differences between the chemical compositions predicted in the shock layer by the finite-rate and local-equilibrium approaches were observed.
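The polynomial fit for Cp mentioned above is conventionally the NASA 7-coefficient form, Cp/R = a1 + a2·T + a3·T² + a4·T³ + a5·T⁴. A minimal sketch of evaluating such a fit follows; the coefficients are hypothetical placeholders, not values from the paper.

```python
# Sketch: evaluating a NASA-style 7-coefficient polynomial fit for Cp, the
# form commonly used to tabulate high-temperature gas properties.
R_UNIVERSAL = 8.314462618  # J/(mol*K)

def cp_from_nasa7(T, a):
    """Cp/R = a1 + a2*T + a3*T^2 + a4*T^3 + a5*T^4; returns Cp in J/(mol*K)."""
    return R_UNIVERSAL * (a[0] + a[1]*T + a[2]*T**2 + a[3]*T**3 + a[4]*T**4)

# Hypothetical coefficients for an N2-like diatomic over 300-1000 K:
a_demo = (3.5, 1.0e-4, 0.0, 0.0, 0.0)
print(cp_from_nasa7(300.0, a_demo))  # ~29.35 J/(mol*K)
```

In a full property package, separate coefficient sets would be used for low- and high-temperature ranges, with enthalpy and entropy obtained from the same coefficients by integration.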
Building new physiologically based pharmacokinetic (PBPK) models requires a lot of data, such as chemical-specific parameters and in vivo pharmacokinetic data. Previously developed, well-parameterized, and thoroughly vetted models can be a great resource for supporting the constr...
NASA Technical Reports Server (NTRS)
Bernhardt, Paul A.; Scales, W. A.
1990-01-01
Ionospheric plasma density irregularities can be produced by chemical releases into the upper atmosphere. F-region plasma modification occurs by: (1) chemically enhancing the electron number density; (2) chemically reducing the electron population; or (3) physically convecting the plasma from one region to another. The three processes (production, loss, and transport) determine the effectiveness of ionospheric chemical releases in subtle and surprising ways. Initially, a chemical release produces a localized change in plasma density. Subsequent processes, however, can lead to enhanced transport in chemically modified regions. Ionospheric modification by chemical releases excites artificial enhancements in airglow intensity through exothermic chemical reactions among the newly created plasma species. Numerical models were developed to describe the creation and evolution of large-scale density irregularities and airglow clouds generated by artificial means. Experimental data compare favorably with these models. It was found that chemical releases produce transient, large-amplitude perturbations in electron density which can evolve into fine-scale irregularities via nonlinear transport properties.
Mathematical Description of Complex Chemical Kinetics and Application to CFD Modeling Codes
NASA Technical Reports Server (NTRS)
Bittker, D. A.
1993-01-01
A major effort in combustion research at the present time is devoted to the theoretical modeling of practical combustion systems. These include turbojet and ramjet air-breathing engines as well as ground-based gas-turbine power generating systems. The ability to use computational modeling extensively in designing these products not only saves time and money, but also helps designers meet the quite rigorous environmental standards that have been imposed on all combustion devices. The goal is to combine the very complex solution of the Navier-Stokes flow equations with realistic turbulence and heat-release models into a single computer code. Such a computational fluid-dynamic (CFD) code simulates the coupling of fluid mechanics with the chemistry of combustion to describe the practical devices. This paper will focus on the task of developing a simplified chemical model which can predict realistic heat-release rates as well as species composition profiles, and is also computationally rapid. We first discuss the mathematical techniques used to describe a complex, multistep fuel oxidation chemical reaction and develop a detailed mechanism for the process. We then show how this mechanism may be reduced and simplified to give an approximate model which adequately predicts heat release rates and a limited number of species composition profiles, but is computationally much faster than the original one. Only such a model can be incorporated into a CFD code without adding significantly to long computation times. Finally, we present some of the recent advances in the development of these simplified chemical mechanisms.
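The reduced-mechanism idea described above can be caricatured with a toy two-step global scheme, fuel → intermediate → product, integrated with a small fixed-step RK4 loop. The rate constants are hypothetical; a real reduced mechanism would be fitted against the detailed one, as the paper describes.

```python
# Toy two-step "reduced" global mechanism with first-order kinetics,
# integrated by classical RK4. Rate constants are illustrative only.
K1, K2 = 5.0, 1.0  # hypothetical rate constants, 1/s

def rhs(y):
    fuel, inter, prod = y
    r1, r2 = K1 * fuel, K2 * inter
    return (-r1, r1 - r2, r2)

def rk4(y, h, steps):
    for _ in range(steps):
        k1 = rhs(y)
        k2 = rhs([a + 0.5 * h * b for a, b in zip(y, k1)])
        k3 = rhs([a + 0.5 * h * b for a, b in zip(y, k2)])
        k4 = rhs([a + h * b for a, b in zip(y, k3)])
        y = [a + h / 6.0 * (b + 2 * c + 2 * d + e)
             for a, b, c, d, e in zip(y, k1, k2, k3, k4)]
    return y

state = rk4([1.0, 0.0, 0.0], h=0.001, steps=10_000)  # integrate to t = 10 s
print(state)  # nearly all mass has become product; total mass is conserved
```

Even this trivial system shows why reduced mechanisms matter for CFD: each species adds a transported scalar, so cutting a detailed mechanism from hundreds of species to a handful changes the cost of every flow iteration.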
Vorberg, Susann; Tetko, Igor V
2014-01-01
Biodegradability describes the capacity of substances to be mineralized by free-living bacteria. It is a crucial property in estimating a compound's long-term impact on the environment. The ability to reliably predict biodegradability would reduce the need for laborious experimental testing. However, this endpoint is difficult to model due to unavailability or inconsistency of experimental data. Our approach makes use of the Online Chemical Modeling Environment (OCHEM) and its rich supply of machine learning methods and descriptor sets to build classification models for ready biodegradability. These models were analyzed to determine the relationship between characteristic structural properties and biodegradation activity. The distinguishing feature of the developed models is their ability to estimate the accuracy of prediction for each individual compound. The models developed using seven individual descriptor sets were combined in a consensus model, which provided the highest accuracy. The identified overrepresented structural fragments can be used by chemists to improve the biodegradability of new chemical compounds. The consensus model, the datasets used, and the calculated structural fragments are publicly available at http://ochem.eu/article/31660. © 2014 The Authors. Published by Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim. This is an open access article under the terms of the Creative Commons Attribution License, which permits use, distribution and reproduction in any medium, provided the original work is properly cited.
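The per-compound accuracy estimation described above can be illustrated in miniature with ensemble agreement: each individual model votes, and the majority fraction serves as a crude confidence score. This is only a sketch of the general idea; OCHEM's actual applicability-domain machinery is more sophisticated.

```python
# Sketch: consensus classification with a per-compound confidence estimate
# derived from agreement among the individual models (here, seven votes,
# mirroring the seven descriptor-set models in the abstract).
def consensus_with_confidence(votes):
    """votes: list of 0/1 ready-biodegradability predictions from models."""
    frac = sum(votes) / len(votes)
    label = 1 if frac >= 0.5 else 0
    confidence = max(frac, 1 - frac)  # agreement of the majority side
    return label, confidence

print(consensus_with_confidence([1, 1, 1, 0, 1, 1, 0]))  # majority label, agreement
```

Compounds where the models disagree strongly get low confidence, flagging predictions that may need experimental follow-up.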
Remote Chemical Sensing Using Quantum Cascade Lasers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harper, Warren W.; Schultz, John F.
2003-01-30
Spectroscopic chemical sensing research at Pacific Northwest National Laboratory (PNNL) is focused on developing advanced sensors for detecting the production of nuclear, chemical, or biological weapons; use of chemical weapons; or the presence of explosives, firearms, narcotics, or other contraband of significance to homeland security in airports, cargo terminals, public buildings, or other sensitive locations. For most of these missions, the signature chemicals are expected to occur in very low concentrations, and in mixture with ambient air or airborne waste streams that contain large numbers of other species that may interfere with spectroscopic detection, or be mistaken for signatures of illicit activity. PNNL’s emphasis is therefore on developing remote and sampling sensors with extreme sensitivity, and resistance to interferents, or selectivity. PNNL’s research activities include: 1. Identification of signature chemicals and quantification of their spectral characteristics, 2. Identification and development of laser and other technologies that enable breakthroughs in sensitivity and selectivity, 3. Development of promising sensing techniques through experimentation and modeling the physical phenomenology and practical engineering limitations affecting their performance, and 4. Development and testing of data collection methods and analysis algorithms. Close coordination of all aspects of the research is important to ensure that all parts are focused on productive avenues of investigation. Close coordination of experimental development and numerical modeling is particularly important because the theoretical component provides understanding and predictive capability, while the experiments validate calculations and ensure that all phenomena and engineering limitations are considered.
The vast landscape of environmental chemicals has motivated the need for alternative methods to traditional whole-animal bioassays in toxicity testing. Embryonic stem (ES) cells provide an in vitro model of embryonic development and an alternative method for assessing development...
Chemically induced vascular toxicity during embryonic development can result in a wide range of adverse prenatal outcomes. We used information from genetic mouse models linked to phenotypic outcomes and a vascular toxicity knowledge base to construct an embryonic vascular disrupt...
The perchlorate anion inhibits thyroid hormone (TH) synthesis via inhibition of the sodium-iodide symporter. It is, therefore, a good model chemical to aid in the development of a bioassay to screen chemicals for effects on thyroid function. Xenopus laevis larvae were exposed to ...
Modeling the influence of organic acids on soil weathering
NASA Astrophysics Data System (ADS)
Lawrence, Corey; Harden, Jennifer; Maher, Kate
2014-08-01
Biological inputs and organic matter cycling have long been regarded as important factors in the physical and chemical development of soils. In particular, the extent to which low molecular weight organic acids, such as oxalate, influence geochemical reactions has been widely studied. Although the effects of organic acids are diverse, there is strong evidence that organic acids accelerate the dissolution of some minerals. However, the influence of organic acids at the field-scale and over the timescales of soil development has not been evaluated in detail. In this study, a reactive-transport model of soil chemical weathering and pedogenic development was used to quantify the extent to which organic acid cycling controls mineral dissolution rates and long-term patterns of chemical weathering. Specifically, oxalic acid was added to simulations of soil development to investigate a well-studied chronosequence of soils near Santa Cruz, CA. The model formulation includes organic acid input, transport, decomposition, organic-metal aqueous complexation and mineral surface complexation in various combinations. Results suggest that although organic acid reactions accelerate mineral dissolution rates near the soil surface, the net response is an overall decrease in chemical weathering. Model results demonstrate the importance of organic acid input concentrations, fluid flow, decomposition and secondary mineral precipitation rates on the evolution of mineral weathering fronts. In particular, model soil profile evolution is sensitive to kaolinite precipitation and oxalate decomposition rates. The soil profile-scale modeling presented here provides insights into the influence of organic carbon cycling on soil weathering and pedogenesis and supports the need for further field-scale measurements of the flux and speciation of reactive organic compounds.
Philip Ye, X; Liu, Lu; Hayes, Douglas; Womac, Alvin; Hong, Kunlun; Sokhansanj, Shahab
2008-10-01
The objectives of this research were to determine the variation in chemical composition across botanical fractions of cornstover, and to probe the potential of Fourier transform near-infrared (FT-NIR) techniques both for qualitatively classifying separated cornstover fractions and for quantitatively analyzing chemical composition, by developing calibration models that predict the chemical composition of cornstover from FT-NIR spectra. The large variation in cornstover chemical composition needed for the wide calibration ranges required by a reliable calibration model was achieved by manually separating the cornstover samples into six botanical fractions; their chemical compositions were determined by conventional wet chemical analyses, confirming that chemical composition varies significantly among botanical fractions of cornstover. In descending order of total saccharide content, the botanical fractions are husk, sheath, pith, rind, leaf, and node. Based on FT-NIR spectra acquired on the biomass, Soft Independent Modeling of Class Analogy (SIMCA) was employed for qualitative classification of cornstover fractions, and partial least squares (PLS) regression was used for quantitative chemical composition analysis. SIMCA successfully classified the botanical fractions of cornstover. The developed PLS model yielded root mean square errors of prediction (RMSEP, %w/w) of 0.92, 1.03, 0.17, 0.27, 0.21, 1.12, and 0.57 for glucan, xylan, galactan, arabinan, mannan, lignin, and ash, respectively. The results show the potential of FT-NIR techniques, combined with multivariate analysis, to be utilized by biomass feedstock suppliers, bioethanol manufacturers, and bio-power producers to better manage bioenergy feedstocks and enhance bioconversion.
Fernández, Alberto; Rallo, Robert; Giralt, Francesc
2015-10-01
Ready biodegradability is a key property for evaluating the long-term effects of chemicals on the environment and human health. As such, it is used as a screening test for the assessment of persistent, bioaccumulative and toxic substances. Regulators encourage the use of non-testing methods, such as in silico models, to save money and time. A dataset of 757 chemicals was collected to assess the performance of four freely available in silico models that predict ready biodegradability. They were applied to develop a new consensus method that prioritizes the use of each individual model according to its performance on chemical subsets driven by the presence or absence of different molecular descriptors. This consensus method was capable of almost eliminating unpredictable chemicals, while the performance of combined models was substantially improved with respect to that of the individual models. Copyright © 2015 Elsevier Inc. All rights reserved.
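The descriptor-driven routing at the heart of this consensus can be caricatured in a few lines: each chemical is sent to whichever individual model has the best recorded accuracy on chemicals sharing its descriptor profile. The model names, the single routing descriptor, and the accuracy figures below are hypothetical placeholders, not the four models actually benchmarked.

```python
# Sketch: route each chemical to the individual model that performs best on
# its descriptor-defined subset (here, one binary descriptor for brevity).
def pick_model(has_aromatic_ring, accuracies):
    """accuracies: {model_name: (acc_on_ring_subset, acc_on_no_ring_subset)}."""
    idx = 0 if has_aromatic_ring else 1
    return max(accuracies, key=lambda m: accuracies[m][idx])

acc = {"model_A": (0.82, 0.70), "model_B": (0.75, 0.78)}  # hypothetical
print(pick_model(True, acc))
print(pick_model(False, acc))
```

A real implementation would partition on many descriptors and fall back to a default model when a subset has too few chemicals to estimate accuracy reliably.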
A Bayesian network model for predicting aquatic toxicity mode ...
The mode of toxic action (MoA) has been recognized as a key determinant of chemical toxicity, but MoA classification in aquatic toxicology has been limited. We developed a Bayesian network model to classify aquatic toxicity mode of action using a recently published dataset containing over one thousand chemicals with MoA assignments for aquatic animal toxicity. Two-dimensional theoretical chemical descriptors were generated for each chemical using the Toxicity Estimation Software Tool. The model was developed through augmented Markov blanket discovery from the data set with the MoA broad classifications as a target node. From cross validation, the overall precision for the model was 80.2% with an R2 of 0.959. The best precision was for the AChEI MoA (93.5%), where 257 chemicals out of 275 were correctly classified. Model precision was poorest for the reactivity MoA (48.5%), where 48 out of 99 reactive chemicals were correctly classified. Narcosis represented the largest class within the MoA dataset and had a precision and reliability of 80.0%, reflecting the global precision across all of the MoAs. False negatives for narcosis most often fell into the electron transport inhibition, neurotoxicity, or reactivity MoAs. False negatives for all other MoAs were most often narcosis. A probabilistic sensitivity analysis was undertaken for each MoA to examine the sensitivity to individual and multiple descriptor findings. The results show that the Markov blanket of a structurally
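The per-class figures quoted above follow directly from the reported counts, and it is worth noting that the abstract's "precision" is the fraction of each class correctly assigned. A quick check:

```python
# Reproducing the per-class figures from the counts reported in the abstract.
def class_fraction_correct(n_correct, n_in_class):
    """Fraction of the class correctly assigned (called precision in the abstract)."""
    return n_correct / n_in_class

print(f"AChEI:      {100 * class_fraction_correct(257, 275):.1f}%")  # 93.5%
print(f"reactivity: {100 * class_fraction_correct(48, 99):.1f}%")    # 48.5%
```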
Chen, Yu; Dong, Fengqing; Wang, Yonghong
2016-09-01
With determined components and experimental reproducibility, chemically defined media (CDM) and minimal chemically defined media (MCDM) are used in many metabolism and regulation studies. This research aimed to develop a chemically defined medium supporting high-cell-density growth of Bacillus coagulans, a promising producer of lactic acid and other bio-chemicals. In this study, a systematic methodology combining experimental techniques with flux balance analysis (FBA) was proposed to design and simplify a CDM. The single-omission and single-addition techniques were employed to determine the essential and stimulatory compounds, before optimization of their concentrations by statistical methods. In addition, to improve growth rationally, in silico omission and addition were performed by FBA based on the construction of a medium-size metabolic model of B. coagulans 36D1. CDMs were thus developed that support considerable biomass production by at least five B. coagulans strains, including the two model strains B. coagulans 36D1 and ATCC 7050.
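FBA, as used above, is a linear program: maximize a biomass flux subject to steady-state mass balance S·v = 0 and flux bounds. A deliberately trivial toy network (uptake → A → B → biomass) makes the structure visible; the medium-size B. coagulans model in the paper is of course far larger.

```python
# Toy flux balance analysis: maximize "biomass" flux v3 subject to
# steady-state balance S @ v = 0 and an uptake bound on v1.
import numpy as np
from scipy.optimize import linprog

# Columns: v1 (uptake -> A), v2 (A -> B), v3 (B -> biomass)
S = np.array([
    [1, -1,  0],   # metabolite A balance
    [0,  1, -1],   # metabolite B balance
])
c = [0, 0, -1]                             # linprog minimizes, so negate biomass
bounds = [(0, 10), (0, None), (0, None)]   # uptake capped at 10 (arbitrary units)
res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
print(res.x)  # every flux is pinned at the uptake limit of 10
```

In silico single omission then amounts to setting a nutrient's uptake bound to zero and re-solving: if the optimal biomass flux drops to zero, the model predicts that compound is essential.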
Ankley, Gerald T; Bencic, David C; Breen, Michael S; Collette, Timothy W; Conolly, Rory B; Denslow, Nancy D; Edwards, Stephen W; Ekman, Drew R; Garcia-Reyero, Natalia; Jensen, Kathleen M; Lazorchak, James M; Martinović, Dalma; Miller, David H; Perkins, Edward J; Orlando, Edward F; Villeneuve, Daniel L; Wang, Rong-Lin; Watanabe, Karen H
2009-05-05
Knowledge of possible toxic mechanisms (or modes) of action (MOA) of chemicals can provide valuable insights as to appropriate methods for assessing exposure and effects, thereby reducing uncertainties related to extrapolation across species, endpoints and chemical structure. However, MOA-based testing seldom has been used for assessing the ecological risk of chemicals. This is in part because past regulatory mandates have focused more on adverse effects of chemicals (reductions in survival, growth or reproduction) than the pathways through which these effects are elicited. A recent departure from this involves endocrine-disrupting chemicals (EDCs), where there is a need to understand both MOA and adverse outcomes. To achieve this understanding, advances in predictive approaches are required whereby mechanistic changes caused by chemicals at the molecular level can be translated into apical responses meaningful to ecological risk assessment. In this paper we provide an overview and illustrative results from a large, integrated project that assesses the effects of EDCs on two small fish models, the fathead minnow (Pimephales promelas) and zebrafish (Danio rerio). For this work a systems-based approach is being used to delineate toxicity pathways for 12 model EDCs with different known or hypothesized toxic MOA. The studies employ a combination of state-of-the-art genomic (transcriptomic, proteomic, metabolomic), bioinformatic and modeling approaches, in conjunction with whole animal testing, to develop response linkages across biological levels of organization. This understanding forms the basis for predictive approaches for species, endpoint and chemical extrapolation. Although our project is focused specifically on EDCs in fish, we believe that the basic conceptual approach has utility for systematically assessing exposure and effects of chemicals with other MOA across a variety of biological systems.
Biomonitoring data can help inform the development and calibration of high-throughput exposure modeling for use in prioritization and risk evaluation. A pilot project was conducted to evaluate the feasibility of using pooled banked blood samples to generate initial data on popul...
There is international concern about chemicals that alter endocrine system function in humans and/or wildlife and subsequently cause adverse effects. We previously developed a mechanistic computational model of the hypothalamic-pituitary-gonadal (HPG) axis in female fathead minno...
Modeling Chemistry for Effective Chemical Education: An Interview with Ronald J. Gillespie
ERIC Educational Resources Information Center
Cardellini, Liberato
2010-01-01
Ronald J. Gillespie, the inventor of the Valence Shell Electron Pair Repulsion (VSEPR) model, relates how his career as researcher in Christopher Ingold's laboratories started. Gillespie developed a passion for chemistry and chemical education, searching for more appropriate and interesting ways to transmit the essential knowledge and enthusiasm…
Predicting dermal penetration for ToxCast chemicals using in silico estimates for diffusion in combination with physiologically based pharmacokinetic (PBPK) modeling. Evans, M.V., Sawyer, M.E., Isaacs, K.K., and Wambaugh, J. With the development of efficient high-throughput (HT) in ...
Quantitative Model of Systemic Toxicity Using ToxCast and ToxRefDB (SOT)
EPA’s ToxCast program profiles the bioactivity of chemicals in a diverse set of ~700 high throughput screening (HTS) assays. In collaboration with L’Oreal, a quantitative model of systemic toxicity was developed using no effect levels (NEL) from ToxRefDB for 633 chemicals with HT...
BACKGROUND: An in vitro steroidogenesis assay using the human adrenocortical carcinoma cells H295R is being evaluated as a possible toxicity screening approach to detect and assess the impact of endocrine active chemicals (EAC) capable of altering steroid biosynthesis. Interpreta...
Arnot, Jon A; Mackay, Donald
2018-01-24
The chemical dietary absorption efficiency (ED) quantifies the amount of chemical absorbed by an organism relative to the amount of chemical an organism is exposed to following ingestion. In particular, ED can influence the extent of bioaccumulation and biomagnification for hydrophobic chemicals. A new ED model is developed to quantify chemical process rates in the gastrointestinal tract (GIT). The new model is calibrated with critically evaluated measured ED values (n = 250) for 80 hydrophobic persistent chemicals. The new ED model is subsequently used to estimate chemical reaction rate constants (kR) assumed to occur in the lumen of the GIT from experimental dietary exposure tests (n = 255) for 165 chemicals. The new kR estimates are corroborated with kR estimates for the same chemicals from the same data derived previously by other methods. The roles of kR and the biotransformation rate constant (kB) on biomagnification factors (BMFs) determined under laboratory test conditions and on BMFs and bioaccumulation factors (BAFs) in the environment are examined with the new model. In this regard, differences in lab and field BMFs are highlighted. Recommendations to address uncertainty in ED and kR data are provided.
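A purely illustrative competing-rates picture shows qualitatively how a lumen reaction rate depresses ED: chemical in the gut is either absorbed (kA), reacted in the lumen (kR), or egested (kE). This is NOT the authors' actual parameterization, just a minimal first-order caricature of the competition the abstract describes.

```python
# Hypothetical competing first-order processes: absorption vs lumen
# reaction vs egestion. Not the model from the paper; illustration only.
def dietary_absorption_efficiency(kA, kR, kE):
    """Fraction of ingested chemical absorbed when three first-order
    processes compete for the same gut-lumen pool."""
    return kA / (kA + kR + kE)

print(dietary_absorption_efficiency(1.0, 0.0, 0.25))  # 0.8 with no lumen reaction
print(dietary_absorption_efficiency(1.0, 1.0, 0.25))  # lumen reaction lowers ED
```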
Prior knowledge-based approach for associating ...
Evaluating the potential human health and/or ecological risks associated with exposures to complex chemical mixtures in the ambient environment is one of the central challenges of chemical safety assessment and environmental protection. There is a need for approaches that can help integrate chemical monitoring and bio-effects data to evaluate risks associated with chemicals present in the environment. We used prior knowledge about chemical-gene interactions to develop a knowledge assembly model for detected chemicals at five locations near two wastewater treatment plants. The assembly model was used to generate hypotheses about the biological impacts of the chemicals at each location. The hypotheses were tested using empirical hepatic gene expression data from fathead minnows exposed for 12 d at each location. Empirical gene expression data were also mapped to the assembly models to statistically evaluate the likelihood of a chemical contributing to the observed biological responses. The prior knowledge approach was able to reasonably hypothesize the biological impacts at one site, but not the other. Chemicals most likely contributing to the observed biological responses were identified at each location. Despite limitations to the approach, knowledge assembly models have strong potential for associating chemical occurrence with potential biological effects and providing a foundation for hypothesis generation to guide research and/or monitoring efforts relat
NASA Astrophysics Data System (ADS)
Navaz, H. K.; Dang, A. L.; Atkinson, T.; Zand, A.; Nowakowski, A.; Kamensky, K.
2014-05-01
A general-purpose multi-phase and multi-component computer model capable of solving the complex problems encountered in agent-substrate interaction is developed. The model solves the transient and time-accurate mass and momentum governing equations in three-dimensional space. Provisions for considering all inter-phase activities (solidification, evaporation, condensation, etc.) are included in the model. Chemical reactions among all phases are allowed, and products of the existing chemical reactions in all three phases are possible. The impact of chemical reaction products on transport properties in porous media, such as porosity, capillary pressure, and permeability, is considered. Numerous validations for simulants, agents, and pesticides against laboratory and open-air data are presented. Results are presented for chemical reactions in the presence of pre-existing water in porous materials, such as moisture, or of separated agent and water droplets on porous substrates. The model will greatly enhance capabilities for predicting the level of threat after the release of any chemical, such as Toxic Industrial Chemicals (TICs) or Toxic Industrial Materials (TIMs), on environmental substrates. The model's generality makes it suitable for both defense and pharmaceutical applications.
Predictive performance of the Vitrigel‐eye irritancy test method using 118 chemicals
Yamaguchi, Hiroyuki; Kojima, Hajime
2015-01-01
Abstract We recently developed a novel Vitrigel‐eye irritancy test (EIT) method. The Vitrigel‐EIT method is composed of two parts, i.e., the construction of a human corneal epithelium (HCE) model in a collagen vitrigel membrane chamber and the prediction of eye irritancy by analyzing the time‐dependent profile of transepithelial electrical resistance values for 3 min after exposing the HCE model to a chemical. In this study, we evaluated the predictive performance of the Vitrigel‐EIT method by testing a total of 118 chemicals. Comparing the categories determined by the Vitrigel‐EIT method to the globally harmonized system classification revealed that the sensitivity, specificity and accuracy were 90.1%, 65.9% and 80.5%, respectively. Five of seven false‐negative chemicals were acidic chemicals that induced an irregular rise of transepithelial electrical resistance values. When test chemical solutions of pH 5 or lower were excluded, the sensitivity, specificity and accuracy improved to 96.8%, 67.4% and 84.4%, respectively. Meanwhile, nine of 16 false‐positive chemicals were classified as irritants by the US Environmental Protection Agency. In addition, the disappearance of ZO‐1, a tight junction‐associated protein, and MUC1, a cell membrane‐spanning mucin, was immunohistologically confirmed in the HCE models after exposure not only to eye‐irritant chemicals but also to false‐positive chemicals, suggesting that such false‐positive chemicals have an eye‐irritant potential. These data demonstrate that the Vitrigel‐EIT method can provide excellent predictive performance for judging a wide range of eye irritancy, including very mild irritant chemicals. We hope that the Vitrigel‐EIT method contributes to the development of safe commodity chemicals. Copyright © 2015 The Authors. Journal of Applied Toxicology published by John Wiley & Sons Ltd. PMID:26472347
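The reported performance figures follow from a standard 2×2 confusion matrix. The sketch below uses counts consistent with the abstract (118 chemicals, 7 false negatives, 16 false positives); the per-class totals are inferred for illustration, not stated directly in the abstract.

```python
def predictive_performance(tp, fn, tn, fp):
    """Sensitivity, specificity, and accuracy from a 2x2 confusion matrix."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / (tp + fn + tn + fp)
    return sensitivity, specificity, accuracy

# Counts inferred from the abstract: 71 irritants (64 TP, 7 FN) and
# 47 non-irritants (31 TN, 16 FP) among 118 chemicals.
sens, spec, acc = predictive_performance(tp=64, fn=7, tn=31, fp=16)
assert round(sens, 3) == 0.901 and round(acc, 3) == 0.805
```

Working backwards from the reported sensitivity and accuracy to these counts is a useful consistency check when reading validation studies of this kind.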
Toxicokinetic Triage for Environmental Chemicals
Wambaugh, John F.; Wetmore, Barbara A.; Pearce, Robert; Strope, Cory; Goldsmith, Rocky; Sluka, James P.; Sedykh, Alexander; Tropsha, Alex; Bosgra, Sieto; Shah, Imran; Judson, Richard; Thomas, Russell S.; Woodrow Setzer, R.
2015-01-01
Toxicokinetic (TK) models link administered doses to plasma, blood, and tissue concentrations. High-throughput TK (HTTK) performs in vitro-to-in vivo extrapolation to predict TK from rapid in vitro measurements and chemical structure-based properties. A significant toxicological application of HTTK has been “reverse dosimetry,” in which bioactive concentrations from in vitro screening studies are converted into in vivo doses (mg/kg BW/day). These doses are predicted to produce steady-state plasma concentrations that are equivalent to in vitro bioactive concentrations. In this study, we evaluate the impact of the approximations and assumptions necessary for reverse dosimetry and develop methods to determine whether HTTK tools are appropriate or may lead to false conclusions for a particular chemical. Based on literature in vivo data for 87 chemicals, we identified specific properties (e.g., in vitro HTTK data, physico-chemical descriptors, and predicted transporter affinities) that correlate with poor HTTK predictive ability. For 271 chemicals we developed a generic HT physiologically based TK (HTPBTK) model that predicts non-steady-state chemical concentration time-courses for a variety of exposure scenarios. We used this HTPBTK model to find that assumptions previously used for reverse dosimetry are usually appropriate, except most notably for highly bioaccumulative compounds. For the thousands of man-made chemicals in the environment that currently have no TK data, we propose a 4-element framework for chemical TK triage that can group chemicals into 7 different categories associated with varying levels of confidence in HTTK predictions. For 349 chemicals with literature HTTK data, we differentiated those chemicals for which HTTK approaches are likely to be sufficient from those that may require additional data. PMID:26085347
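Reverse dosimetry rests on a linear steady-state model. A common formulation in the HTTK literature treats total clearance as renal filtration of the unbound chemical plus well-stirred hepatic metabolism; the sketch below uses that formulation with invented parameter values and simplified units, and is not this paper's exact implementation.

```python
def css_per_unit_dose(fub, clint, gfr=6.7, q_liver=90.0):
    """Steady-state plasma concentration per unit oral dose rate.

    Total clearance = renal filtration of unbound chemical (gfr * fub)
    plus well-stirred hepatic clearance. Flow values and units here are
    illustrative placeholders, not measured physiological constants.
    """
    cl_hepatic = q_liver * fub * clint / (q_liver + fub * clint)
    return 1.0 / (gfr * fub + cl_hepatic)

def oral_equivalent_dose(bioactive_conc, fub, clint):
    """Reverse dosimetry: the dose rate predicted to produce the in vitro
    bioactive concentration at steady state."""
    return bioactive_conc / css_per_unit_dose(fub, clint)

# Lower intrinsic clearance -> higher Css -> lower oral equivalent dose.
assert oral_equivalent_dose(1.0, 0.5, 1.0) < oral_equivalent_dose(1.0, 0.5, 50.0)
```

The linearity of this model is exactly what breaks down for the highly bioaccumulative compounds the abstract flags, since those chemicals may never approach steady state over realistic exposure durations.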
Biological indicators for monitoring water quality of MTF canals system
NASA Technical Reports Server (NTRS)
Sethi, S. L.
1975-01-01
Biological models based on diversity indexes were developed to predict the environmental effects of chemical operations at NASA's Mississippi Test Facility (MTF) on canal systems in the area. To predict the effects on local streams, a physical model of unpolluted streams was established. The model is fed by artesian well water free of background levels of pollutants. The species diversity and biota composition of an unpolluted MTF stream were determined; the resulting information will be used to form baseline data for future comparisons. Biological modeling was accomplished by adding controlled quantities or kinds of chemical pollutants and evaluating the effects of these chemicals on the biological life of the stream.
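The abstract does not say which diversity index was applied; the Shannon index H' = -Σ p_i ln p_i is a typical choice for such baseline comparisons and can be sketched as:

```python
import math

def shannon_index(counts):
    """Shannon diversity H' = -sum(p_i * ln p_i) over species proportions,
    given a list of per-species individual counts."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

# An even community is more diverse than one dominated by a single species,
# so pollution that eliminates sensitive taxa depresses the index.
assert shannon_index([10, 10, 10, 10]) > shannon_index([37, 1, 1, 1])
```

Comparing the index of a perturbed canal against the unpolluted baseline stream is the kind of before/after comparison the study describes.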
Oparin's coacervates as an important milestone in chemical evolution
NASA Astrophysics Data System (ADS)
Kolb, Vera M.
2015-09-01
Although Oparin's coacervate model for the origin of life by chemical evolution is almost 100 years old, it is still valid. However, the structure of his originally proposed coacervate is not considered prebiotic, based on some recent developments in prebiotic chemistry. We have remedied this deficiency of Oparin's model by substituting his coacervate with a prebiotically feasible one. Oparin's coacervates are aqueous structures, but have a boundary with the rest of the aqueous medium. They exhibit properties of self-replication and provide a path to a primitive metabolism via chemical competition and thus a primitive selection. Coacervates are therefore good models for proto-cells. We review here some salient points of Oparin's model and also address some philosophical views on the beginning of natural selection in primitive chemical systems.
Group Contribution Methods for Phase Equilibrium Calculations.
Gmehling, Jürgen; Constantinescu, Dana; Schmid, Bastian
2015-01-01
The development and design of chemical processes are carried out by solving the balance equations of a mathematical model for sections of or the whole chemical plant with the help of process simulators. For process simulation, besides kinetic data for the chemical reaction, various pure-component and mixture properties are required. Because of the great importance of separation processes for a chemical plant in particular, reliable knowledge of the phase equilibrium behavior is required. The phase equilibrium behavior can be calculated with the help of modern equations of state or gE-models using only binary parameters. Unfortunately, only a very small part of the experimental data needed for fitting the required binary model parameters is available, so very often these models cannot be applied directly. To solve this problem, powerful predictive thermodynamic models have been developed. Group contribution methods allow the prediction of the required phase equilibrium data using only a limited number of group interaction parameters. A prerequisite for fitting the required group interaction parameters is a comprehensive database. That is why, for the development of powerful group contribution methods, almost all published pure-component properties, phase equilibrium data, excess properties, etc., were stored in computerized form in the Dortmund Data Bank. In this review, the present status, weaknesses, advantages and disadvantages, possible applications, and typical results of the different group contribution methods for the calculation of phase equilibria are presented.
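As a toy illustration of how a gE-model turns binary parameters into phase-equilibrium predictions, the one-parameter Margules equations (ln γ1 = A·x2², ln γ2 = A·x1²) can be combined with modified Raoult's law. The parameter A and the vapor pressures below are invented for the example; real predictive methods such as UNIFAC involve many group interaction parameters and are far more elaborate.

```python
import math

def margules_gammas(x1, A):
    """One-parameter Margules activity coefficients for a binary mixture:
    ln g1 = A * x2^2, ln g2 = A * x1^2."""
    x2 = 1.0 - x1
    return math.exp(A * x2 ** 2), math.exp(A * x1 ** 2)

def bubble_pressure(x1, A, p1_sat, p2_sat):
    """Bubble-point pressure via modified Raoult's law:
    P = x1*g1*P1sat + x2*g2*P2sat. Pressures in arbitrary units."""
    g1, g2 = margules_gammas(x1, A)
    return x1 * g1 * p1_sat + (1.0 - x1) * g2 * p2_sat

# Positive deviation from ideality (A > 0) raises the bubble pressure
# above the ideal (Raoult's law, A = 0) line.
p_ideal = bubble_pressure(0.4, 0.0, 100.0, 60.0)
p_real = bubble_pressure(0.4, 1.2, 100.0, 60.0)
assert p_real > p_ideal
```

A group contribution method plays the role of supplying the activity-coefficient model's parameters when no experimental binary data exist to fit them.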
McKim, James M; Keller, Donald J; Gorski, Joel R
2012-12-01
Chemical sensitization is a serious condition caused by small reactive molecules and is characterized by a delayed-type hypersensitivity known as allergic contact dermatitis (ACD). Contact with these molecules via dermal exposure represents a significant concern for chemical manufacturers. Recent legislation in the EU has created the need to develop non-animal alternative methods for many routine safety studies, including sensitization. Although most of the alternative research has focused on pure chemicals that possess reasonable solubility properties, it is important for any successful in vitro method to have the ability to test compounds with low aqueous solubility. This is especially true for the medical device industry, where device extracts must be prepared in both polar and non-polar vehicles in order to evaluate chemical sensitization. The aim of this research was to demonstrate the functionality and applicability of the human reconstituted skin models (MatTek EpiDerm® and SkinEthic RHE) as a test system for the evaluation of chemical sensitization and its potential use for medical device testing. In addition, the development of the human 3D skin model should allow the in vitro sensitization assay to be used for finished-product testing in the personal care, cosmetics, and pharmaceutical industries. This approach combines solubility, chemical reactivity, cytotoxicity, and activation of the Nrf2/ARE expression pathway to identify and categorize chemical sensitizers. Known chemical sensitizers representing extreme/strong-, moderate-, weak-, and non-sensitizing potency categories were first evaluated in the skin models at six exposure concentrations ranging from 0.1 to 2500 µM for 24 h. The expression of eight Nrf2/ARE-, one AhR/XRE-, and two Nrf1/MRE-controlled genes was measured by qRT-PCR. The fold induction at each exposure concentration was combined with reactivity and cytotoxicity data to determine the sensitization potential.
The results demonstrated that both the MatTek and SkinEthic models performed in a manner consistent with data previously reported for the human keratinocyte (HaCaT) cell line. The system was tested further by evaluating chemicals known to be associated with the manufacture of medical devices. In all cases, the human skin models performed as well as or better than the previously evaluated HaCaT cell model. In addition, this study identifies a clear unifying trigger that controls both the Nrf2/ARE pathway and essential biochemical events required for the development of ACD. Finally, this study demonstrated that by utilizing human reconstructed skin models, it is possible to evaluate non-polar extracts from medical devices and low-solubility finished products.
Rowat, S C
1998-01-01
The central nervous, immune, and endocrine systems communicate through multiple common messengers. Over evolutionary time, what may be termed integrated defense system(s) (IDS) have developed to coordinate these communications for specific contexts; these include the stress response, acute-phase response, nonspecific immune response, immune response to antigen, kindling, tolerance, time-dependent sensitization, neurogenic switching, and traumatic dissociation (TD). These IDSs are described and their overlap is examined. Three models of disease production are generated: damage, in which IDSs function incorrectly; inadequate/inappropriate, in which IDS response is outstripped by a changing context; and evolving/learning, in which the IDS learned response to a context is deemed pathologic. Mechanisms of multiple chemical sensitivity (MCS) are developed from several IDS disease models. Model 1A is pesticide damage to the central nervous system, overlapping with body chemical burdens, TD, and chronic zinc deficiency; model 1B is benzene disruption of interleukin-1, overlapping with childhood developmental windows and hapten-antigenic spreading; and model 1C is autoimmunity to immunoglobulin-G (IgG), overlapping with spreading to other IgG-inducers, sudden spreading of inciters, and food-contaminating chemicals. Model 2A is chemical and stress overload, including comparison with the susceptibility/sensitization/triggering/spreading model; model 2B is genetic mercury allergy, overlapping with heavy metals/zinc displacement and childhood/gestational mercury exposures; and model 3 is MCS as evolution and learning. Remarks are offered on current MCS research. Problems with clinical measurement are suggested on the basis of IDS models. Large-sample patient self-report epidemiology is described as an alternative or addition to clinical biomarker and animal testing. PMID:9539008
NASA Astrophysics Data System (ADS)
Yu, Minghao; Yamada, Kazuhiko; Takahashi, Yusuke; Liu, Kai; Zhao, Tong
2016-12-01
A numerical model for simulating air and nitrogen inductively coupled plasmas (ICPs) was developed considering thermochemical nonequilibrium and the third-order electron transport properties. A modified far-field electromagnetic model was introduced and tightly coupled with the flow field equations to describe the Joule heating and inductive discharge phenomena. In total, 11 species and 49 chemical reactions of air, which include 5 species and 8 chemical reactions of nitrogen, were employed to model the chemical reaction process. The internal energy transfers among translational, vibrational, rotational, and electronic energy modes of chemical species were taken into account to study thermal nonequilibrium effects. The low-Reynolds number Abe-Kondoh-Nagano k-ɛ turbulence model was employed to consider the turbulent heat transfer. In this study, the fundamental characteristics of an ICP flow, such as the weak ionization, high temperature but low velocity in the torch, and wide area of the plasma plume, were reproduced by the developed numerical model. The flow field differences between the air and nitrogen ICP flows inside the 10-kW ICP wind tunnel were made clear. The interactions between the electromagnetic and flow fields were also revealed for an inductive discharge.
Dilution physics modeling: Dissolution/precipitation chemistry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Onishi, Y.; Reid, H.C.; Trent, D.S.
This report documents progress made to date on integrating dilution/precipitation chemistry and new physical models into the TEMPEST thermal-hydraulics computer code. Implementation of dissolution/precipitation chemistry models is necessary for predicting nonhomogeneous, time-dependent, physical/chemical behavior of tank wastes with and without a variety of possible engineered remediation and mitigation activities. Such behavior includes chemical reactions, gas retention, solids resuspension, solids dissolution and generation, solids settling/rising, and convective motion of physical and chemical species. Thus this model development is important from the standpoint of predicting the consequences of various engineered activities, such as mitigation by dilution, retrieval, or pretreatment, that can affect safe operations. The integration of a dissolution/precipitation chemistry module allows the various phase species concentrations to enter into the physical calculations that affect the TEMPEST hydrodynamic flow calculations. The yield strength model of non-Newtonian sludge correlates yield to a power function of solids concentration. Likewise, shear stress is concentration-dependent, and the dissolution/precipitation chemistry calculations develop the species concentration evolution that produces fluid flow resistance changes. Dilution of waste with pure water, molar concentrations of sodium hydroxide, and other chemical streams can be analyzed for the reactive species changes and hydrodynamic flow characteristics.
Rethinking Chemistry: A Learning Progression on Chemical Thinking
ERIC Educational Resources Information Center
Sevian, Hannah; Talanquer, Vicente
2014-01-01
Dominant educational approaches in chemistry focus on the learning of somewhat isolated concepts and ideas about chemical substances and reactions. Reform efforts often seek to engage students in the generation of knowledge through the investigation of chemical phenomena, with emphasis on the development and application of models to build causal…
Knowledge of possible toxic mechanisms/modes of action (MOA) of chemicals can provide valuable insights as to appropriate methods for assessing exposure and effects, such as reducing uncertainties related to extrapolation across species, endpoints and chemical structure. However,...
For thousands of chemicals in commerce, there is little or no information about exposure or health and ecological effects. The US Environmental Protection Agency (USEPA) has ongoing research programs to develop and evaluate models that use the often minimal chemical information a...
Evaluation of High-Throughput Chemical Exposure Models ...
The U.S. EPA, under its ExpoCast program, is developing high-throughput near-field modeling methods to estimate human chemical exposure and to provide real-world context to high-throughput screening (HTS) hazard data. These novel modeling methods include reverse methods to infer parent chemical exposures from biomonitoring measurements and forward models to predict multi-pathway exposures from chemical use information and/or residential media concentrations. Here, both forward and reverse modeling methods are used to characterize the relationship between matched near-field environmental (air and dust) and biomarker measurements. Indoor air, house dust, and urine samples from a sample of 120 females (aged 60 to 80 years) were analyzed. In the measured data, 78% of the residential media measurements (across 80 chemicals) and 54% of the urine measurements (across 21 chemicals) were censored, i.e. below the limit of quantification (LOQ). Because of the degree of censoring, we applied a Bayesian approach to impute censored values for 69 chemicals having at least 15% of measurements above LOQ. This resulted in 10 chemicals (5 phthalates, 5 pesticides) with matched air, dust, and urine metabolite measurements. The population medians of indoor air and dust concentrations were compared to population median exposures inferred from urine metabolites concentrations using a high-throughput reverse-dosimetry approach. Median air and dust concentrations were found to be correl
Sankar, Punnaivanam; Alain, Krief; Aghila, Gnanasekaran
2010-05-24
We have developed a model structure-editing tool, ChemEd, programmed in JAVA, which allows drawing chemical structures on a graphical user interface (GUI) by selecting appropriate structural fragments defined in a fragment library. The terms representing the structural fragments are organized in fragment ontology to provide a conceptual support. ChemEd describes the chemical structure in an XML document (ChemFul) with rich semantics explicitly encoding the details of the chemical bonding, the hybridization status, and the electron environment around each atom. The document can be further processed through suitable algorithms and with the support of external chemical ontologies to generate understandable reports about the functional groups present in the structure and their specific environment.
Ariyadasa, B H A K T; Kondo, Akira; Inoue, Yoshio
2015-02-01
A system is needed to predict the behavior, fate, and occurrence of environmental pollutants for effective environmental monitoring. Available monitoring data and computational modeling were used to develop a one-box multimedia model based on the mass balance of the emitted chemicals. Eight physicochemical phenomena in the atmosphere, soil, water, and sediment were considered in this model. This study was carried out in the Lake Biwa-Yodo River basin, which supports multiple land uses and also provides the natural water resource for a population of nearly 13 million in the region. Annual emissions for 214 nonmetallic compounds were calculated from the chemical emission data in the Japanese pollutant release and transfer registry and used as input data for model simulations of 1997, 2002, and 2008. The chemical concentrations calculated by the model for all environmental media were analyzed to determine trends in concentration over the study span. The majority of the chemicals decreased in concentration over time. Among the 214 nonmetallic chemical pollutants, 36 chemicals did not decrease in concentration and were in the top 10% for concentration on average. Of these 36 pollutants, 7 occur in all 4 environmental media and pose a potential health risk to humans in the Lake Biwa-Yodo River basin.
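At steady state, a one-box mass-balance model reduces to emissions balanced against the sum of first-order losses. A minimal sketch (the function names and numbers are illustrative, not the study's parameterization):

```python
def steady_state_concentration(emission_rate, volume, loss_rate_constants):
    """Steady state of dC/dt = E/V - k_total*C, i.e. C_ss = E / (V * k_total).

    loss_rate_constants is a list of first-order rate constants, which in a
    multimedia model might include degradation, advection, and inter-media
    transfer. Units are illustrative (e.g. E in kg/yr, V in m^3, k in 1/yr).
    """
    k_total = sum(loss_rate_constants)
    return emission_rate / (volume * k_total)

# The model is linear in emissions: doubling E doubles C_ss, which is why
# annual PRTR emission inventories map directly onto concentration trends.
c1 = steady_state_concentration(100.0, 1e6, [0.01, 0.002])
c2 = steady_state_concentration(200.0, 1e6, [0.01, 0.002])
```

In a full multimedia model each compartment (air, soil, water, sediment) has such a balance, coupled by transfer terms; the single-box version above shows only the core structure.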
A review on chemical-induced inflammatory bowel disease models in rodents.
Randhawa, Puneet Kaur; Singh, Kavinder; Singh, Nirmal; Jaggi, Amteshwar Singh
2014-08-01
Ulcerative colitis and Crohn's disease are a set of chronic, idiopathic, immunological and relapsing inflammatory disorders of the gastrointestinal tract referred to as inflammatory bowel disease (IBD). Although the etiological factors involved in the perpetuation of IBD remain uncertain, the development of various animal models provides new insights into the onset and progression of IBD. Various chemical-induced colitis models are widely used on the laboratory scale. Furthermore, these models closely mimic the morphological, histopathological and symptomatic features of human IBD. Among the chemical-induced colitis models, the trinitrobenzene sulfonic acid (TNBS)-induced colitis, oxazolone-induced colitis and dextran sulphate sodium (DSS)-induced colitis models are most widely used. TNBS elicits a Th-1-driven immune response, whereas oxazolone predominantly exhibits an immune response of Th-2 phenotype. The DSS-induced colitis model also induces changes in the Th-1/Th-2 cytokine profile. The present review discusses the methodology and rationale of using various chemical-induced colitis models for evaluating the pathogenesis of IBD.
P80 SRM low torque flex-seal development - thermal and chemical modeling of molding process
NASA Astrophysics Data System (ADS)
Descamps, C.; Gautronneau, E.; Rousseau, G.; Daurat, M.
2009-09-01
The development of the flex-seal component of the P80 nozzle gave the opportunity to set up new design and manufacturing process methods. Due to the short development lead time required by VEGA program, the usual manufacturing iterative tests work flow, which is usually time consuming, had to be enhanced in order to use a more predictive approach. A newly refined rubber vulcanization description was built up and identified on laboratory samples. This chemical model was implemented in a thermal analysis code. The complete model successfully supports the manufacturing processes. These activities were conducted with the support of ESA/CNES Research & Technologies and DGA (General Delegation for Armament).
Kavlock, Robert; Dix, David
2010-02-01
Computational toxicology is the application of mathematical and computer models to help assess chemical hazards and risks to human health and the environment. Supported by advances in informatics, high-throughput screening (HTS) technologies, and systems biology, the U.S. Environmental Protection Agency (EPA) is developing robust and flexible computational tools that can be applied to the thousands of chemicals in commerce, and contaminant mixtures found in air, water, and hazardous-waste sites. The Office of Research and Development (ORD) Computational Toxicology Research Program (CTRP) is composed of three main elements. The largest component is the National Center for Computational Toxicology (NCCT), which was established in 2005 to coordinate research on chemical screening and prioritization, informatics, and systems modeling. The second element consists of related activities in the National Health and Environmental Effects Research Laboratory (NHEERL) and the National Exposure Research Laboratory (NERL). The third and final component consists of academic centers working on various aspects of computational toxicology and funded by the U.S. EPA Science to Achieve Results (STAR) program. Together these elements form the key components in the implementation of both the initial strategy, A Framework for a Computational Toxicology Research Program (U.S. EPA, 2003), and the newly released The U.S. Environmental Protection Agency's Strategic Plan for Evaluating the Toxicity of Chemicals (U.S. EPA, 2009a). Key intramural projects of the CTRP include digitizing legacy toxicity testing information into the Toxicity Reference Database (ToxRefDB), predicting toxicity (ToxCast) and exposure (ExpoCast), and creating virtual liver (v-Liver) and virtual embryo (v-Embryo) systems models. U.S. EPA-funded STAR centers are also providing bioinformatics, computational toxicology data and models, and developmental toxicity data and models.
The models and underlying data are being made publicly available through the Aggregated Computational Toxicology Resource (ACToR), the Distributed Structure-Searchable Toxicity (DSSTox) Database Network, and other U.S. EPA websites. While initially focused on improving the hazard identification process, the CTRP is placing increasing emphasis on using high-throughput bioactivity profiling data in systems modeling to support quantitative risk assessments, and in developing complementary higher throughput exposure models. This integrated approach will enable analysis of life-stage susceptibility, and understanding of the exposures, pathways, and key events by which chemicals exert their toxicity in developing systems (e.g., endocrine-related pathways). The CTRP will be a critical component in next-generation risk assessments utilizing quantitative high-throughput data and providing a much higher capacity for assessing chemical toxicity than is currently available.
NASA Technical Reports Server (NTRS)
Kacynski, Kenneth John
1994-01-01
An advanced engineering model has been developed to aid in the analysis and design of hydrogen/oxygen chemical rocket engines. The complete multispecies, chemically reacting and multidiffusing Navier-Stokes equations are modeled, including the Soret thermal diffusion and the Dufour energy transfer terms. In addition to the spectrum of multispecies aspects developed, the model developed in this study is also conservative in axisymmetric flow for both inviscid and viscous flow environments, and the boundary conditions employ a viscous, chemically reacting, reference-plane characteristics method. Demonstration cases are presented for a 1030:1 area ratio nozzle, a 25 lbf film-cooled nozzle, and a transpiration-cooled plug-and-spool rocket engine. The results indicate that the thrust coefficient predictions for the 1030:1 nozzle and the 25 lbf film-cooled nozzle are within 0.2 and 0.5 percent, respectively, of experimental measurements when all of the chemical reaction and diffusion terms are considered. Further, the model's predictions agree very well with the heat transfer measurements made in all of the nozzle test cases. The Soret thermal diffusion term is demonstrated to have a significant effect on the predicted mass fraction of hydrogen along the wall of the nozzle in both the laminar-flow 1030:1 nozzle and the turbulent-flow plug-and-spool nozzle analysis cases performed. Further, the Soret term was shown to represent an important fraction of the diffusion fluxes occurring in a transpiration-cooled rocket engine.
Development of a Human Neurovascular Unit Organotypic Systems Model of Early Brain Development
The inability to model human brain and blood-brain barrier development in vitro poses a major challenge in studies of how chemicals impact early neurogenic periods. During human development, disruption of thyroid hormone (TH) signaling is related to adverse morphological effects ...
Advancing Consumer Product Composition and Chemical ...
This presentation describes EPA efforts to collect, model, and measure publicly available consumer product data for use in exposure assessment. The development of the ORD Chemicals and Products database will be described, as will machine-learning-based models for predicting chemical function. Finally, the talk describes new mass spectrometry-based methods for measuring chemicals in formulations and articles. This presentation is an invited talk to the ICCA-LRI workshop "Fit-For-Purpose Exposure Assessments For Risk-Based Decision Making". The talk will share EPA efforts to characterize the components of consumer products for use in exposure assessment with the international exposure science community.
Lombardo, Andrea; Franco, Antonio; Pivato, Alberto; Barausse, Alberto
2015-03-01
Conventional approaches to estimating protective ecotoxicological thresholds of chemicals, i.e. predicted no-effect concentrations (PNEC), for an entire ecosystem are based on the use of assessment factors to extrapolate from single-species toxicity data derived in the laboratory to community-level effects on ecosystems. Aquatic food web models may be a useful tool to improve the ecological realism of chemical risk assessment because they enable a more insightful evaluation of the fate and effects of chemicals in dynamic trophic networks. A case study was developed in AQUATOX to simulate the effects of the anionic surfactant linear alkylbenzene sulfonate and the antimicrobial triclosan on a lowland riverine ecosystem. The model was built for a section of the River Thames (UK), for which detailed ecological surveys were available, allowing for a quantification of energy flows through the whole ecosystem. A control scenario was successfully calibrated for a simulation period of one year, and tested for stability over six years. Then, the model ecosystem was perturbed with varying inputs of the two chemicals. Simulations showed that both chemicals rapidly approach steady-state, with internal concentrations in line with the input bioconcentration factors throughout the year. At realistic environmental concentrations, both chemicals have insignificant effects on biomass trends. At hypothetical higher concentrations, direct and indirect effects of chemicals on the ecosystem dynamics emerged from the simulations. Indirect effects due to competition for food sources and predation can lead to responses in biomass density of the same magnitude as those caused by direct toxicity. Indirect effects can both exacerbate or compensate for direct toxicity. Uncertainties in key model assumptions are high as the validation of perturbed simulations remains extremely challenging. 
Nevertheless, the study is a step towards the development of realistic ecological scenarios and their potential use in prospective risk assessment of down-the-drain chemicals. Copyright © 2014 Elsevier B.V. All rights reserved.
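The conventional approach the authors contrast with food-web modeling derives a PNEC by dividing the lowest available single-species endpoint by an assessment factor. A minimal sketch (the factor conventions follow common EU guidance; the endpoint values are invented for illustration):

```python
def pnec(toxicity_endpoints_mg_per_l, assessment_factor):
    """PNEC = lowest single-species toxicity endpoint / assessment factor.

    Typical conventions (EU technical guidance): a factor of 1000 when only
    three acute EC50s are available, down to 10 when chronic NOECs cover
    three trophic levels. Endpoint values here are invented examples.
    """
    return min(toxicity_endpoints_mg_per_l) / assessment_factor

# Three acute EC50s (e.g. fish, daphnia, algae) -> assessment factor 1000.
acute_pnec = pnec([4.2, 1.8, 9.5], 1000)
```

The large, fixed assessment factors are precisely the extrapolation step that ecosystem models such as the AQUATOX simulation above aim to replace with a mechanistic account of direct and indirect effects.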
Ni, Haochen; Rui, Yikang; Wang, Jiechen; Cheng, Liang
2014-09-05
The chemical industry poses a potential security risk to factory personnel and neighboring residents. In order to mitigate prospective damage, a synthetic method must be developed for an emergency response. With the development of environmental numeric simulation models, model integration methods, and modern information technology, many Decision Support Systems (DSSs) have been established. However, existing systems still have limitations in terms of synthetic simulation and network interoperation. In order to resolve these limitations, mature simulation models for chemical accidents were integrated into a WEB Geographic Information System (WEBGIS) platform. The complete workflow of the emergency response, including raw data (meteorological and accident information) management, numeric simulation of different kinds of accidents, environmental impact assessment, and representation of the simulation results, was achieved. This allowed comprehensive and real-time simulation of acute accidents in the chemical industry. The main contributions of this paper are an organizational mechanism for the model set, based on accident type and pollutant substance; a scheduling mechanism for the parallel processing of multiple accident types, accident substances, and simulation models; and a presentation method for scalar and vector data in the web browser, integrated into the WEBGIS platform. The outcomes demonstrated that this method could provide effective support for emergency response decisions in acute chemical accidents.
PMID:25198686
Biryol, Derya; Nicolas, Chantel I; Wambaugh, John; Phillips, Katherine; Isaacs, Kristin
2017-11-01
Under the ExpoCast program, United States Environmental Protection Agency (EPA) researchers have developed a high-throughput (HT) framework for estimating aggregate exposures to chemicals from multiple pathways to support rapid prioritization of chemicals. Here, we present methods to estimate HT exposures to chemicals migrating into food from food contact substances (FCS). These methods consisted of combining an empirical model of chemical migration with estimates of daily population food intakes derived from food diaries from the National Health and Nutrition Examination Survey (NHANES). A linear regression model for migration at equilibrium was developed by fitting available migration measurements as a function of temperature, food type (i.e., fatty, aqueous, acidic, alcoholic), initial chemical concentration in the FCS (C0), and chemical properties. The most predictive variables in the resulting model were C0, molecular weight, log Kow, and food type (R²=0.71, p<0.0001). Migration-based concentrations for 1009 chemicals identified via publicly-available data sources as being present in polymer FCSs were predicted for 12 food groups (combinations of 3 storage temperatures and food type). The model was parameterized with screening-level estimates of C0 based on the functional role of chemicals in FCS. By combining these concentrations with daily intakes for food groups derived from NHANES, population ingestion exposures of chemicals in mg/kg-bodyweight/day (mg/kg-BW/day) were estimated. Calibrated aggregate exposures were estimated for 1931 chemicals by fitting HT FCS and consumer product exposures to exposures inferred from NHANES biomonitoring (R²=0.61, p<0.001); both FCS and consumer product pathway exposures were significantly predictive of inferred exposures. Including the FCS pathway significantly impacted the ratio of predicted exposures to those estimated to produce steady-state blood concentrations equal to in-vitro bioactive concentrations.
While these HT methods have large uncertainties (and thus may not be appropriate for assessments of single chemicals), they can provide critical refinement to aggregate exposure predictions used in risk-based chemical priority-setting. Published by Elsevier Ltd.
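A regression of the form the ExpoCast abstract describes (migration at equilibrium as a function of initial concentration, molecular weight, log Kow, and food type) can be sketched in a few lines. The coefficients and food-type offsets below are invented placeholders for illustration, not the fitted EPA values.

```python
import math

# Hypothetical coefficients for a migration model of the form described:
# log10(C_food) = b0 + b1*log10(C0) + b2*MW + b3*logKow + food-type offset.
# All numeric values here are illustrative placeholders, not the published fit.
B0, B_C0, B_MW, B_KOW = -1.0, 0.9, -0.002, -0.15
FOOD_OFFSET = {"fatty": 0.5, "aqueous": 0.0, "acidic": 0.1, "alcoholic": 0.3}

def migrated_concentration(c0_mg_kg, mw, log_kow, food_type):
    """Predict equilibrium chemical concentration in food (mg/kg)."""
    log_c = (B0 + B_C0 * math.log10(c0_mg_kg)
             + B_MW * mw + B_KOW * log_kow
             + FOOD_OFFSET[food_type])
    return 10 ** log_c

# Higher initial concentration in the packaging implies more migration here,
# because the C0 coefficient is positive by assumption.
low = migrated_concentration(100.0, 250.0, 3.0, "fatty")
high = migrated_concentration(10000.0, 250.0, 3.0, "fatty")
```

The signs chosen (migration increasing with C0, decreasing with molecular weight) reflect the qualitative direction one would expect, but the magnitudes are arbitrary.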
Biodegradation of sorbed chemicals in soil
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scow, K.M.; Fan, S.; Johnson, C.
Rates of biodegradation of sorbed chemicals are usually lower in soil than in aqueous systems, in part because sorption reduces the availability of the chemical to microorganisms. Biodegradation, sorption, and diffusion occur simultaneously and are tightly coupled. In soil, the rate of biodegradation is a function of a chemical's diffusion coefficient, sorption partition coefficient, the distance it must diffuse from the site of sorption to microbial populations that can degrade it, and its biodegradation rate constant. A model (DSB model) was developed that describes biodegradation of chemicals limited in their availability by sorption and diffusion. Different kinetic expressions describe biodegradation depending on whether the reaction is controlled by mass transfer (diffusion and sorption) or the intrinsic biodegradation rate, and whether biodegradation begins during or after the majority of sorption has occurred. We tested the hypothesis that there is a direct relationship between how strongly a chemical is sorbed and the chemical's biodegradation rate. In six soils with different organic carbon contents, there was no relationship between the extent or rate of biodegradation and the sorption partition coefficient for phenanthrene. Aging of phenanthrene residues in soil led to a substantial reduction in the rate of biodegradation compared to biodegradation rates of recently added phenanthrene. Considerable research has focused on identification and development of techniques for enhancing in situ biodegradation of sorbed chemicals. Development of such techniques, especially those involving inoculation with microbial strains, should consider physical mass transfer limitations and potential decreases in bioavailability over time. 4 refs., 3 figs., 1 tab.
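The coupling this abstract describes, where biodegradation acts only on the dissolved phase while sorbed chemical must first desorb, can be caricatured as a two-compartment model. This is a toy sketch with invented rate constants, not the authors' DSB model.

```python
def simulate_dsb(kd=5.0, k_mass_transfer=0.1, k_bio=0.5,
                 c_dissolved=1.0, c_sorbed=0.0, dt=0.01, steps=10000):
    """Two-compartment caricature of the DSB idea: biodegradation acts only
    on dissolved chemical; sorbed chemical must desorb (first-order mass
    transfer toward the sorption equilibrium c_sorbed = kd * c_dissolved)
    before it can be degraded. All rate constants are illustrative."""
    for _ in range(steps):
        exchange = k_mass_transfer * (kd * c_dissolved - c_sorbed)
        degraded = k_bio * c_dissolved
        c_dissolved += dt * (-exchange - degraded)
        c_sorbed += dt * exchange
    return c_dissolved + c_sorbed  # total residue remaining at t = dt*steps

weak_sorption = simulate_dsb(kd=1.0)    # little chemical left
strong_sorption = simulate_dsb(kd=50.0) # desorption-limited, slower removal
```

Even this minimal version reproduces the qualitative point of the abstract: when desorption is the bottleneck, the effective degradation rate scales roughly as k_bio/(1+kd), so strong sorption leaves a larger residue.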
Bachis, Giulia; Maruéjouls, Thibaud; Tik, Sovanna; Amerlinck, Youri; Melcer, Henryk; Nopens, Ingmar; Lessard, Paul; Vanrolleghem, Peter A
2015-01-01
Characterization and modelling of primary settlers have been largely neglected to date. However, whole-plant and resource recovery modelling requires primary settler model development, as current models lack detail in describing the dynamics and the diversity of the removal process for different particulate fractions. This paper focuses on the improved modelling and experimental characterization of primary settlers. First, a new modelling concept based on particle settling velocity distribution is proposed, which is then applied to the development of an improved primary settler model as well as to its characterization under addition of chemicals (chemically enhanced primary treatment, CEPT). This model is compared to two existing simple primary settler models (Otterpohl and Freund; Lessard and Beck), proving better than the first and statistically comparable to the second, but with easier calibration thanks to the ease with which wastewater characteristics can be translated into model parameters. Second, the changes in the activated sludge model (ASM)-based chemical oxygen demand fractionation between inlet and outlet induced by primary settling are investigated, showing that typical wastewater fractions are modified by primary treatment. As they clearly impact the downstream processes, both model improvements demonstrate the need for more detailed primary settler models in view of whole-plant modelling.
2013-06-01
Applications of Molecular Modeling to Challenges in Clean Energy; Fitzgerald, G., et al.; ACS Symposium Series; American Chemical Society: Washington, DC, 2013.
Huang, L; Fantke, P; Ernstoff, A; Jolliet, O
2017-11-01
Indoor releases of organic chemicals encapsulated in solid materials are major contributors to human exposures and are directly related to the internal diffusion coefficient in solid materials. Existing correlations to estimate the diffusion coefficient are only valid for a limited number of chemical-material combinations. This paper develops and evaluates a quantitative property-property relationship (QPPR) to predict diffusion coefficients for a wide range of organic chemicals and materials. We first compiled a training dataset of 1103 measured diffusion coefficients for 158 chemicals in 32 consolidated material types. Following a detailed analysis of the temperature influence, we developed a multiple linear regression model to predict diffusion coefficients as a function of chemical molecular weight (MW), temperature, and material type (adjusted R² of 0.93). The internal validations showed the model to be robust, stable, and not a result of chance correlation. The external validation against two separate prediction datasets demonstrated that the model has good predictive ability within its applicability domain (R²ext > 0.8), namely MW between 30 and 1178 g/mol and temperature between 4 and 180°C. By covering a much wider range of organic chemicals and materials, this QPPR facilitates high-throughput estimates of human exposures for chemicals encapsulated in solid materials. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
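A QPPR of the shape described, log-diffusivity as a linear function of molecular weight with a temperature term and a material-type offset, can be sketched as follows. The coefficients, offsets, and material names are invented for illustration, not the fitted values from the paper.

```python
# QPPR-style sketch: log10(D) modelled as a linear function of MW, 1/T,
# and a material-specific offset. All numbers are invented placeholders.
MATERIAL_OFFSET = {"PVC": -1.0, "polyethylene": 0.0, "polystyrene": -0.5}

def log10_diffusion_coeff(mw, temp_k, material):
    """Predicted log10 of the diffusion coefficient in a solid material.
    The 1/T term gives Arrhenius-like behaviour: D rises with temperature."""
    return -6.0 - 0.01 * mw - 1500.0 / temp_k + MATERIAL_OFFSET[material]

d_cold = 10 ** log10_diffusion_coeff(200.0, 277.15, "polyethylene")  # 4 °C
d_hot = 10 ** log10_diffusion_coeff(200.0, 373.15, "polyethylene")   # 100 °C
```

A production version would also refuse to extrapolate outside the applicability domain the abstract reports (MW 30-1178 g/mol, 4-180°C).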
Adverse Outcome Pathway (AOP) framework for ...
Vascular development commences with de novo assembly of a primary capillary plexus (vasculogenesis) followed by its expansion (angiogenesis) and maturation (angio-adaptation) into a hierarchical system of arteries and veins. These processes are tightly regulated by genetic signals and environmental factors linked to morphogenesis and microphysiology. Gestational exposure to some chemicals disrupts vascular development leading to adverse outcomes. To broadly assess consequences of gestational toxicant exposure on vascular development, an Adverse Outcome Pathway (AOP) framework was constructed that integrates data from ToxCast high-throughput screening (HTS) assays with pathway-level information from the literature and public databases. The AOP-based model resolved the ToxCast library (1065 compounds) into a matrix based on several dozen molecular functions critical for developmental angiogenesis. A sample of 38 ToxCast chemicals selected across the matrix tested model performance. Putative vascular disrupting chemical (pVDC) bioactivity was assessed by multiple laboratories utilizing diverse angiogenesis assays, including: transgenic zebrafish, complex human cell co-cultures, engineered microscale systems, and human-synthetic models. The ToxCast pVDC signature predicted vascular disruption in a manner that was chemical-specific and assay-dependent. An AOP for developmental vascular toxicity was constructed that focuses on inhibition of VEGF receptor (VEGFR2). Thi
Explanation of non-additive effects in mixtures of similar mode of action chemicals.
Kamo, Masashi; Yokomizo, Hiroyuki
2015-09-01
Many models have been developed to predict the combined effect of drugs and chemicals. Most models are classified into two additive models: independent action (IA) and concentration addition (CA). It is generally considered that if the modes of action of chemicals are similar, then the combined effect obeys CA; however, many empirical studies report nonlinear effects deviating from the predictions of CA. Such deviations are termed synergism and antagonism. Synergism, which leads to a stronger toxicity, requires more careful management, and hence it is important to understand how and which combinations of chemicals lead to synergism. In this paper, three types of chemical reactions are mathematically modeled and the cause of the nonlinear effects among chemicals with similar modes of action was investigated. Our results show that combined effects obey CA only when the modes of action are exactly the same. Contrary to existing knowledge, combined effects are generally nonlinear even if the modes of action of the chemicals are similar. Our results further show that the nonlinear effects vanish when chemical concentrations are low, suggesting that the current management practice of assuming CA is rarely inappropriate because environmental concentrations of chemicals are generally low. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
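The two additive reference models named in this abstract can be written down directly. Below, single-chemical responses follow a Hill curve with slope 1; the EC50 values are arbitrary examples.

```python
def hill_effect(c, ec50, n=1.0):
    """Fractional effect of a single chemical (Hill dose-response)."""
    return c**n / (c**n + ec50**n)

def independent_action(concs, ec50s):
    """IA: non-response probabilities combine multiplicatively."""
    survival = 1.0
    for c, ec50 in zip(concs, ec50s):
        survival *= 1.0 - hill_effect(c, ec50)
    return 1.0 - survival

def concentration_addition(concs, ec50s):
    """CA: concentrations add after scaling by potency (toxic units).
    With Hill slope 1, the mixture behaves like one chemical at the
    summed toxic units."""
    toxic_units = sum(c / ec50 for c, ec50 in zip(concs, ec50s))
    return toxic_units / (toxic_units + 1.0)

# Two chemicals, each at half its EC50: the two models disagree.
ia = independent_action([1.0, 1.0], [2.0, 2.0])
ca = concentration_addition([1.0, 1.0], [2.0, 2.0])
```

Consistent with the paper's central result, CA coincides with the single-chemical curve only when the dose-response shapes are identical; here `ca` equals `hill_effect(2.0, 2.0)` exactly, while IA gives a different mixture effect.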
A simulation study on the abatement of CO2 emissions by de-absorption with monoethanolamine.
Greer, T; Bedelbayev, A; Igreja, J M; Gomes, J F; Lie, B
2010-01-01
Because of the adverse effects of CO2 from fossil fuel combustion on the earth's ecosystems, identifying the most cost-effective method for CO2 capture is an important area of research. The predominant process for CO2 capture currently employed by industry is chemical absorption in amine solutions. A dynamic model for the de-absorption process was developed with monoethanolamine (MEA) solution. Henry's law was used for modelling the vapour phase equilibrium of the CO2, and fugacity ratios calculated by the Peng-Robinson equation of state (EOS) were used for H2O, MEA, N2 and O2. Chemical reactions between CO2 and MEA were included in the model along with the enhancement factor for chemical absorption. Liquid and vapour energy balances were developed to calculate the liquid and vapour temperature, respectively.
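The Henry's-law equilibrium step used in such a model can be sketched with a van 't Hoff temperature correction. The constants below are rough, water-like illustrative values; a real MEA model would use solvent-specific parameters and the reaction enhancement factor.

```python
import math

# Henry's law: p_CO2 = H(T) * x_CO2, with a van 't Hoff temperature
# correction. H_REF and DH_R are rough, illustrative values in the spirit
# of CO2 in water (mole-fraction basis), not fitted MEA-solution constants.
H_REF = 1.64e3   # atm at T_REF (mole-fraction basis), illustrative
T_REF = 298.15   # K
DH_R = 2400.0    # -(enthalpy of solution)/R, in K, illustrative

def henry_constant(temp_k):
    """H(T): gas solubility falls with temperature, so H rises with T."""
    return H_REF * math.exp(-DH_R * (1.0 / temp_k - 1.0 / T_REF))

def partial_pressure(x_co2, temp_k):
    """Equilibrium CO2 partial pressure over the liquid (atm)."""
    return henry_constant(temp_k) * x_co2
```

The increasing H(T) is what drives de-absorption in a heated stripper: at the same liquid loading, the equilibrium CO2 partial pressure is higher at higher temperature.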
Development of a global pollution model for CO, CH4, and CH2O
NASA Technical Reports Server (NTRS)
Peters, L. K.
1974-01-01
The current status of a global pollution model for carbon monoxide, methane, and formaldehyde is described. The physico-chemical behavior of these three pollutants in the troposphere is considered. This geographic restriction is convenient since the tropopause provides a natural boundary across which little transport occurs. The data on sources and sinks for these pollutants are based on available information and assumptions about the major man-made and natural contributions. The distributions and concentrations of methane, formaldehyde, and carbon monoxide in the atmosphere are interrelated by the chemical reactions in which they participate. A chemical kinetic model based on the pseudo-steady state approximation for the intermediate species was developed to account for these reactions. The numerical procedure used to mathematically describe the pollution transport is a mass conservative scheme employing an integral flux approach.
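The pseudo-steady-state approximation (PSSA) mentioned above sets the net rate of change of an intermediate to zero. For a schematic chain A -> B -> C (loosely, CH4 -> CH2O -> CO), this gives [B] = k1[A]/k2; the rate constants and source concentration below are arbitrary illustrative numbers.

```python
def pssa_intermediate(k1, k2, c_source):
    """PSSA for the intermediate B in A -> B -> C:
    set dB/dt = k1*A - k2*B = 0, giving B = k1*A/k2."""
    return k1 * c_source / k2

def euler_intermediate(k1, k2, c_source, t_end=50.0, dt=0.001):
    """Direct forward-Euler integration of dB/dt with A held fixed,
    to show the PSSA value is the long-time limit."""
    b = 0.0
    for _ in range(int(t_end / dt)):
        b += dt * (k1 * c_source - k2 * b)
    return b

# Illustrative numbers only (c_source loosely evokes ~1.7 ppm CH4).
approx = pssa_intermediate(k1=0.01, k2=0.5, c_source=1.7)
exact = euler_intermediate(k1=0.01, k2=0.5, c_source=1.7)
```

The approximation is good exactly when the intermediate is consumed much faster than the source changes, which is the regime the abstract's kinetic model assumes.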
Sarmah, Swapnalee; Marrs, James A.
2016-01-01
Environmental pollution is a serious problem of the modern world that poses a major threat to public health. Exposure to environmental pollutants during embryonic development is particularly risky. Although many pollutants have been verified as potential toxicants, there are new chemicals in the environment that need assessment. Heart development is an extremely sensitive process, which can be affected by environmentally toxic molecule exposure during embryonic development. Congenital heart defects are the most common life-threatening global health problems, and the etiology is mostly unknown. The zebrafish has emerged as an invaluable model to examine substance toxicity on vertebrate development, particularly on cardiac development. The zebrafish offers numerous advantages for toxicology research not found in other model systems. Many laboratories have used the zebrafish to study the effects of widespread chemicals in the environment on heart development, including pesticides, nanoparticles, and various organic pollutants. Here, we review the uses of the zebrafish in examining effects of exposure to external molecules during embryonic development in causing cardiac defects, including chemicals ubiquitous in the environment and illicit drugs. Known or potential mechanisms of toxicity and how zebrafish research can be used to provide mechanistic understanding of cardiac defects are discussed. PMID:27999267
Modeling Human Exposure to Indoor Contaminants: External Source to Body Tissues.
Webster, Eva M; Qian, Hua; Mackay, Donald; Christensen, Rebecca D; Tietjen, Britta; Zaleski, Rosemary
2016-08-16
Information on human indoor exposure is necessary to assess the potential risk to individuals from many chemicals of interest. Dynamic indoor and human physiologically based pharmacokinetic (PBPK) models of the distribution of nonionizing, organic chemical concentrations in indoor environments resulting in delivered tissue doses are developed, described, and tested. The Indoor model successfully reproduced independently measured, reported time-dependent air concentrations of chloroform released during showering and of 2-butoxyethanol following use of a volatile surface cleaner. The Indoor model predictions were also comparable to those from a higher-tier consumer model (ConsExpo 4.1) for the surface cleaner scenario. The PBPK model successfully reproduced observed chloroform exhaled-air concentrations resulting from an inhalation exposure. Fugacity-based modeling provided a seamless description of the partitioning, fluxes, accumulation, and release of the chemical in indoor media and tissues of the exposed subject. This has the potential to assist in health risk assessments, provided that appropriate physical/chemical property, usage characteristics, and toxicological information are available.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fernández, Alberto; Rallo, Robert; Giralt, Francesc
2015-10-15
Ready biodegradability is a key property for evaluating the long-term effects of chemicals on the environment and human health. As such, it is used as a screening test for the assessment of persistent, bioaccumulative and toxic substances. Regulators encourage the use of non-testing methods, such as in silico models, to save money and time. A dataset of 757 chemicals was collected to assess the performance of four freely available in silico models that predict ready biodegradability. They were applied to develop a new consensus method that prioritizes the use of each individual model according to its performance on chemical subsets driven by the presence or absence of different molecular descriptors. This consensus method was capable of almost eliminating unpredictable chemicals, while the performance of the combined models was substantially improved with respect to that of the individual models. Highlights: • Consensus method to predict ready biodegradability by prioritizing multiple QSARs. • Consensus reduced the amount of unpredictable chemicals to less than 2%. • Performance increased with the number of QSAR models considered. • The absence of 2D atom pairs contributed significantly to the consensus model.
MATHEMATICAL MODELING OF PESTICIDES IN THE ENVIRONMENT: CURRENT AND FUTURE DEVELOPMENTS
Transport models, total ecosystem models with aggregated linear approximations, evaluative models, hierarchical models, and influence analysis methods are mathematical techniques that are particularly applicable to the problems encountered when characterizing pesticide chemicals ...
Li, Xuehua; Zhao, Wenxing; Li, Jing; Jiang, Jingqiu; Chen, Jianji; Chen, Jingwen
2013-08-01
To assess the persistence and fate of volatile organic compounds in the troposphere, the rate constants for the reaction with ozone (kO3) are needed. As kO3 values are only available for hundreds of compounds, and experimental determination of kO3 is costly and time-consuming, it is important to develop predictive models for kO3. In this study, a total of 379 logkO3 values at different temperatures were used to develop and validate a model for the prediction of kO3, based on quantum chemical descriptors, Dragon descriptors, and structural fragments. Molecular descriptors were screened by stepwise multiple linear regression, and the model was constructed by partial least-squares regression. The cross-validation coefficient Q²CUM of the model is 0.836, and the external validation coefficient Q²ext is 0.811, indicating that the model has high robustness and good predictive performance. The most significant descriptor explaining logkO3 is the BELm2 descriptor, with connectivity information weighted by atomic masses. kO3 increases with increasing BELm2, and decreases with increasing ionization potential. The applicability domain of the proposed model was visualized by the Williams plot. The developed model can be used to predict kO3 at different temperatures for a wide range of organic chemicals, including alkenes, cycloalkenes, haloalkenes, alkynes, oxygen-containing compounds, nitrogen-containing compounds (except primary amines) and aromatic compounds. Copyright © 2013 Elsevier Ltd. All rights reserved.
Accessible tools to quantify adverse outcomes pathways (AOPs) that can predict the ecological effects of chemicals and other stressors are a major goal of Chemical Safety and Sustainability research within US EPA’s Office of Research and Development. To address this goal, w...
Disorder and Chaos: Developing and Teaching an Interdisciplinary Course on Chemical Dynamics
ERIC Educational Resources Information Center
Desjardins, Steven G.
2008-01-01
In this paper we describe an interdisciplinary course on dynamics that is appropriate for nonscience majors. This course introduces ideas about mathematical modeling using examples based on pendulums, chemical kinetics, and population dynamics. The unique emphasis for a nonmajors course is on chemical reactions as dynamical systems that do more…
An Integrated Approach to Laser Crystal Development
NASA Technical Reports Server (NTRS)
Ries, Heidi R.
1996-01-01
Norfolk State University has developed an integrated research program in the area of laser crystal development, including crystal modeling, crystal growth, spectroscopy, and laser modeling. This research program supports a new graduate program in Chemical Physics, designed in part to address the shortage of minority scientists.
Silva, Carlos; Nunes, Bruno; Nogueira, António Ja; Gonçalves, Fernando; Pereira, Joana L
2016-11-01
Using the bivalve macrofouler Corbicula fluminea, the suitability of in vitro testing as a stepping stone towards the improvement of control methods based on chemical mixtures was addressed in this study. In vitro cholinesterase (ChE) activity inhibition following single exposure of C. fluminea tissue to four model chemicals (the organophosphates dimethoate and dichlorvos, copper, and sodium dodecyl sulfate [SDS]) was first assessed. Subsequently, mixtures of dimethoate with copper and of dichlorvos with SDS were tested and modelled; ChE inhibition revealed synergistic interactions for both chemical pairs. These synergistic combinations were then validated in vivo and the increased control potential of the selected combinations was verified, with gains of up to 50% in C. fluminea mortality relative to the corresponding single-chemical treatments. Such consistency supports the suitability of using time- and cost-effective surrogate testing platforms to assist the development of biofouling control strategies incorporating mixtures.
Development of algal interspecies correlation estimation models for chemical hazard assessment.
Brill, Jessica L; Belanger, Scott E; Chaney, Joel G; Dyer, Scott D; Raimondo, Sandy; Barron, Mace G; Pittinger, Charles A
2016-09-01
Web-based Interspecies Correlation Estimation (ICE) is an application developed to predict the acute toxicity of a chemical from 1 species to another taxon. Web-ICE models use the acute toxicity value for a surrogate species to predict effect values for other species, thus potentially filling in data gaps for a variety of environmental assessment purposes. Web-ICE has historically been dominated by aquatic and terrestrial animal prediction models. Web-ICE models for algal species were essentially absent and are addressed in the present study. A compilation of public and private sector-held algal toxicity data were compiled and reviewed for quality based on relevant aspects of individual studies. Interspecies correlations were constructed from the most commonly tested algal genera for a broad spectrum of chemicals. The ICE regressions were developed based on acute 72-h and 96-h endpoint values involving 1647 unique studies on 476 unique chemicals encompassing 40 genera and 70 species of green, blue-green, and diatom algae. Acceptance criteria for algal ICE models were established prior to evaluation of individual models and included a minimum sample size of 3, a statistically significant regression slope, and a slope estimation parameter ≥0.65. A total of 186 ICE models were possible at the genus level, with 21 meeting quality criteria; and 264 ICE models were developed at the species level, with 32 meeting quality criteria. Algal ICE models will have broad utility in screening environmental hazard assessments, data gap filling in certain regulatory scenarios, and as supplemental information to derive species sensitivity distributions. Environ Toxicol Chem 2016;35:2368-2378. Published 2016 Wiley Periodicals Inc. on behalf of SETAC. This article is a US government work and, as such, is in the public domain in the United States of America.
Douziech, Mélanie; Conesa, Irene Rosique; Benítez-López, Ana; Franco, Antonio; Huijbregts, Mark; van Zelm, Rosalie
2018-01-24
Large variations in removal efficiencies (REs) of chemicals have been reported for monitoring studies of activated sludge wastewater treatment plants (WWTPs). In this work, we conducted a meta-analysis on REs (1539 data points) for a set of 209 chemicals consisting of fragrances, surfactants, and pharmaceuticals in order to assess the drivers of the variability relating to inherent properties of the chemicals and operational parameters of activated sludge WWTPs. For a reduced dataset (n = 542), we developed a mixed-effect model (meta-regression) to explore the observed variability in REs for the chemicals using three chemical-specific factors and four WWTP-related parameters. The overall removal efficiency of the set of chemicals was 82.1% (95% CI 75.2-87.1%, N = 1539). Our model accounted for 17% of the total variability in REs, while the process-based model SimpleTreat did not perform better than the average of the measured REs. We identified that, after accounting for other factors potentially influencing RE, readily biodegradable compounds were better removed than non-readily biodegradable ones. Further, we showed that REs increased with increasing sludge retention times (SRTs), especially for non-readily biodegradable compounds. Finally, our model highlighted a decrease in RE with increasing KOC. The counterintuitive relationship to KOC stresses the need for a better understanding of electrochemical interactions influencing the RE of ionisable chemicals. In addition, we highlighted the need to improve the modelling of chemicals that undergo deconjugation when predicting RE. Our meta-analysis represents a first step in better explaining the observed variability in measured REs of chemicals. It can be of particular help to prioritize the improvements required in existing process-based models to predict removal efficiencies of chemicals in WWTPs.
Combustion chamber analysis code
NASA Technical Reports Server (NTRS)
Przekwas, A. J.; Lai, Y. G.; Krishnan, A.; Avva, R. K.; Giridharan, M. G.
1993-01-01
A three-dimensional, time dependent, Favre averaged, finite volume Navier-Stokes code has been developed to model compressible and incompressible flows (with and without chemical reactions) in liquid rocket engines. The code has a non-staggered formulation with generalized body-fitted-coordinates (BFC) capability. Higher order differencing methodologies such as MUSCL and Osher-Chakravarthy schemes are available. Turbulent flows can be modeled using any of the five turbulent models present in the code. A two-phase, two-liquid, Lagrangian spray model has been incorporated into the code. Chemical equilibrium and finite rate reaction models are available to model chemically reacting flows. The discrete ordinate method is used to model effects of thermal radiation. The code has been validated extensively against benchmark experimental data and has been applied to model flows in several propulsion system components of the SSME and the STME.
Netzeva, Tatiana I; Gallegos Saliner, Ana; Worth, Andrew P
2006-05-01
The aim of the present study was to illustrate that it is possible and relatively straightforward to compare the domain of applicability of a quantitative structure-activity relationship (QSAR) model in terms of its physicochemical descriptors with a large inventory of chemicals. A training set of 105 chemicals with data for relative estrogenic gene activation, obtained in a recombinant yeast assay, was used to develop the QSAR. A binary classification model for predicting active versus inactive chemicals was developed using classification tree analysis and two descriptors with a clear physicochemical meaning (octanol-water partition coefficient, or log Kow, and the number of hydrogen bond donors, or n(Hdon)). The model demonstrated a high overall accuracy (90.5%), with a sensitivity of 95.9% and a specificity of 78.1%. The robustness of the model was evaluated using the leave-many-out cross-validation technique, whereas the predictivity was assessed using an artificial external test set composed of 12 compounds. The domain of the QSAR training set was compared with the chemical space covered by the European Inventory of Existing Commercial Chemical Substances (EINECS), as incorporated in the CDB-EC software, in the log Kow / n(Hdon) plane. The results showed that the training set and, therefore, the applicability domain of the QSAR model covers a small part of the physicochemical domain of the inventory, even though a simple method for defining the applicability domain (ranges in the descriptor space) was used. However, a large number of compounds are located within the narrow descriptor window.
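A two-descriptor classifier in the spirit of the tree described above, together with the range-based applicability-domain check the study uses, can be sketched as follows. The split values and descriptor ranges are invented for illustration, not the published model.

```python
# Hypothetical applicability domain: descriptor ranges in the
# log Kow / n(Hdon) plane. Values are illustrative, not the published ones.
DOMAIN = {"log_kow": (-2.0, 8.0), "n_hdon": (0, 5)}

def in_domain(log_kow, n_hdon):
    """Range-based applicability-domain check in the descriptor space."""
    lo_k, hi_k = DOMAIN["log_kow"]
    lo_h, hi_h = DOMAIN["n_hdon"]
    return lo_k <= log_kow <= hi_k and lo_h <= n_hdon <= hi_h

def classify(log_kow, n_hdon):
    """Return 'active', 'inactive', or 'out of domain'. The single split
    rule here stands in for the fitted classification tree."""
    if not in_domain(log_kow, n_hdon):
        return "out of domain"
    if log_kow > 3.0 and n_hdon <= 2:
        return "active"
    return "inactive"
```

As the abstract notes, even this simple range-based domain definition is enough to flag the large fraction of an inventory that falls outside the training set's descriptor window.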
CONCEPTUAL MODEL DEVELOPMENT AND INFORMATION MANAGEMENT FRAMEWORK FOR DIAGNOSTICS RESEARCH
Conceptual model development will focus on the effects of habitat alteration, nutrients,suspended and bedded sediments, and toxic chemicals on appropriate endpoints (individuals, populations, communities, ecosystems) across spatial scales (habitats, water body, watershed, region)...
QUANTITATIVE PROCEDURES FOR NEUROTOXICOLOGY RISK ASSESSMENT
In this project, previously published information on biologically based dose-response model for brain development was used to quantitatively evaluate critical neurodevelopmental processes, and to assess potential chemical impacts on early brain development. This model has been ex...
Recent Developments in Toxico-Cheminformatics: A New ...
Efforts to improve public access to chemical toxicity information resources, coupled with new high-throughput screening (HTS) data and efforts to systematize legacy toxicity studies, have the potential to significantly improve predictive capabilities in toxicology. Important recent developments include: 1) large and growing public resources that link chemical structures to biological activity and toxicity data in searchable format, and that offer more nuanced and varied representations of activity; 2) standardized relational data models that capture relevant details of chemical treatment and effects of published in vivo experiments; and 3) the generation of large amounts of new data from public efforts that are employing HTS technologies to probe a wide range of bioactivity and cellular processes across large swaths of chemical space. Most recently, EPA’s DSSTox project has published several new EPA chemical data inventories (IRIS, HPV, ToxCast) and added an on-line capability for structure (substructure or similarity)-searching through all or parts of the published DSSTox data files. These efforts are, for the first time in many cases, opening up a structure-paved two-way highway between previously inaccessible or isolated public chemical data repositories and large public resources, such as PubChem. In addition, public initiatives (such as ToxML) are developing systematized data models of toxicity study areas, and introducing standardized templates, contr
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sorooshian, S.; Bales, R.C.; Gupta, V.K.
1992-02-01
In order to better understand the implications of acid deposition in watershed systems in the Sierra Nevada, the California Air Resources Board (CARB) initiated an intensive integrated watershed study at Emerald Lake in Sequoia National Park. The comprehensive nature of the data obtained from these studies provided an opportunity to develop a quantitative description of how watershed characteristics and inputs to the watershed influence within-watershed fluxes, chemical composition of streams and lakes, and, therefore, biotic processes. Two different but closely-related modeling approaches were followed. In the first, the emphasis was placed on the development of systems-theoretic models. In the second approach, development of a compartmental model was undertaken. The systems-theoretic effort results in simple time-series models that allow the consideration of the stochastic properties of model errors. The compartmental model (the University of Arizona Alpine Hydrochemical Model (AHM)) is a comprehensive and detailed description of the various interacting physical and chemical processes occurring on the watershed.
Delivering The Benefits of Chemical-Biological Integration in ...
Abstract: Researchers at the EPA’s National Center for Computational Toxicology integrate advances in biology, chemistry, and computer science to examine the toxicity of chemicals and help prioritize chemicals for further research based on potential human health risks. The intention of this research program is to quickly evaluate thousands of chemicals for potential risk, but at much reduced cost relative to historical approaches. This work involves computational and data-driven approaches including high-throughput screening, modeling, text-mining, and the integration of chemistry, exposure, and biological data. We have developed a number of databases and applications that are delivering on the vision of developing a deeper understanding of chemicals and their effects on exposure and biological processes, and that are supporting a large community of scientists in their research efforts. This presentation will provide an overview of our work to bring together diverse large-scale data from the chemical and biological domains, our approaches to integrating and disseminating these data, and the delivery of models supporting computational toxicology. This abstract does not reflect U.S. EPA policy. Presentation at ACS TOXI session on Computational Chemistry and Toxicology in Chemical Discovery and Assessment (QSARs).
Predicting long-range transport: a systematic evaluation of two multimedia transport models.
Bennett, D H; Scheringer, M; McKone, T E; Hungerbühler, K
2001-03-15
The United Nations Environment Program has recently developed criteria to identify and restrict chemicals with a potential for persistence and long-range transport (persistent organic pollutants or POPs). There are many stakeholders involved, and the issues are not only scientific but also include social, economic, and political factors. This work focuses on one aspect of the POPs debate, the criteria for determining the potential for long-range transport (LRT). Our goal is to determine if current models are reliable enough to support decisions that classify a chemical based on the LRT potential. We examine the robustness of two multimedia fate models for determining the relative ranking and absolute spatial range of various chemicals in the environment. We also consider the effect of parameter uncertainties and the model uncertainty associated with the selection of an algorithm for gas-particle partitioning on the model results. Given the same chemical properties, both models give virtually the same ranking. However, when chemical parameter uncertainties and model uncertainties such as particle partitioning are considered, the spatial range distributions obtained for the individual chemicals overlap, preventing a distinct rank order. The absolute values obtained for the predicted spatial range or travel distance differ significantly between the two models for the uncertainties evaluated. We find that to evaluate a chemical when large and unresolved uncertainties exist, it is more informative to use two or more models and include multiple types of uncertainty. Model differences and uncertainties must be explicitly confronted to determine how the limitations of scientific knowledge impact predictions in the decision-making process.
Alves, Vinicius M.; Muratov, Eugene; Fourches, Denis; Strickland, Judy; Kleinstreuer, Nicole; Andrade, Carolina H.; Tropsha, Alexander
2015-01-01
Repetitive exposure to a chemical agent can induce an immune reaction in inherently susceptible individuals that leads to skin sensitization. Although many chemicals have been reported as skin sensitizers, there have been very few rigorously validated QSAR models with defined applicability domains (AD) that were developed using a large group of chemically diverse compounds. In this study, we have aimed to compile, curate, and integrate the largest publicly available dataset related to chemically-induced skin sensitization, use this data to generate rigorously validated and QSAR models for skin sensitization, and employ these models as a virtual screening tool for identifying putative sensitizers among environmental chemicals. We followed best practices for model building and validation implemented with our predictive QSAR workflow using random forest modeling technique in combination with SiRMS and Dragon descriptors. The Correct Classification Rate (CCR) for QSAR models discriminating sensitizers from non-sensitizers were 71–88% when evaluated on several external validation sets, within a broad AD, with positive (for sensitizers) and negative (for non-sensitizers) predicted rates of 85% and 79% respectively. When compared to the skin sensitization module included in the OECD QSAR toolbox as well as to the skin sensitization model in publicly available VEGA software, our models showed a significantly higher prediction accuracy for the same sets of external compounds as evaluated by Positive Predicted Rate, Negative Predicted Rate, and CCR. These models were applied to identify putative chemical hazards in the ScoreCard database of possible skin or sense organ toxicants as primary candidates for experimental validation. PMID:25560674
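The Correct Classification Rate (CCR) used to evaluate these models is the mean of sensitivity and specificity; a minimal sketch of the metric from a confusion matrix:

```python
def ccr(tp, fn, tn, fp):
    """Correct Classification Rate: mean of sensitivity and specificity.

    tp/fn: sensitizers correctly/incorrectly classified,
    tn/fp: non-sensitizers correctly/incorrectly classified.
    """
    sensitivity = tp / (tp + fn)   # positive predicted rate for sensitizers
    specificity = tn / (tn + fp)   # negative predicted rate for non-sensitizers
    return (sensitivity + specificity) / 2
```

Because CCR averages the two class-wise rates, it is robust to the class imbalance typical of sensitization datasets, unlike plain accuracy.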
Ribay, Kathryn; Kim, Marlene T; Wang, Wenyi; Pinolini, Daniel; Zhu, Hao
2016-03-01
Estrogen receptors (ERα) are a critical target for drug design as well as a potential source of toxicity when activated unintentionally. Thus, evaluating potential ERα binding agents is critical in both drug discovery and chemical toxicity areas. Using computational tools, e.g., Quantitative Structure-Activity Relationship (QSAR) models, can predict potential ERα binding agents before chemical synthesis. The purpose of this project was to develop enhanced predictive models of ERα binding agents by utilizing advanced cheminformatics tools that can integrate publicly available bioassay data. The initial ERα binding agent data set, consisting of 446 binders and 8307 non-binders, was obtained from the Tox21 Challenge project organized by the NIH Chemical Genomics Center (NCGC). After removing the duplicates and inorganic compounds, this data set was used to create a training set (259 binders and 259 non-binders). This training set was used to develop QSAR models using chemical descriptors. The resulting models were then used to predict the binding activity of 264 external compounds, which were available to us after the models were developed. The cross-validation results of the training set [Correct Classification Rate (CCR) = 0.72] were much higher than the external predictivity for the unknown compounds (CCR = 0.59). To improve the conventional QSAR models, all compounds in the training set were used to search PubChem and generate a profile of their biological responses across thousands of bioassays. The most important bioassays were prioritized to generate a similarity index that was used to calculate the biosimilarity score between each pair of compounds. The nearest neighbors for each compound within the set were then identified, and its ERα binding potential was predicted by its nearest neighbors in the training set. The hybrid model performance (CCR = 0.94 for cross validation; CCR = 0.68 for external prediction) showed significant improvement over the original QSAR models, particularly for the activity cliffs that induce prediction errors. The results of this study indicate that the response profile of chemicals from public data provides useful information for modeling and evaluation purposes. The public big data resources should be considered along with chemical structure information when predicting new compounds, such as unknown ERα binding agents.
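The biosimilarity nearest-neighbor step can be sketched roughly as below; the cosine similarity and the k value are assumptions for illustration, not the paper's exact similarity index:

```python
import numpy as np

def biosimilarity(a, b):
    """Cosine similarity between two bioassay response profiles (illustrative)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def knn_predict(query, profiles, labels, k=3):
    """Predict a label from the k profiles most biosimilar to the query."""
    scores = [biosimilarity(query, p) for p in profiles]
    nearest = np.argsort(scores)[::-1][:k]      # indices, most similar first
    votes = [labels[i] for i in nearest]
    return max(set(votes), key=votes.count)     # majority vote
```

The point of the hybrid approach is that two compounds with very different structures (an activity cliff) can still have nearly identical bioassay response profiles, so a biological-response neighborhood can succeed where a structural one fails.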
Impacts of Lateral Boundary Conditions on US Ozone ...
Chemical boundary conditions are a key input to regional-scale photochemical models. In this study, we perform annual simulations over North America with chemical boundary conditions prepared from two global models (GEOS-CHEM and Hemispheric CMAQ). Results indicate that the impacts of different boundary conditions on ozone can be significant throughout the year. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.
Approximation and inference methods for stochastic biochemical kinetics—a tutorial review
NASA Astrophysics Data System (ADS)
Schnoerr, David; Sanguinetti, Guido; Grima, Ramon
2017-03-01
Stochastic fluctuations of molecule numbers are ubiquitous in biological systems. Important examples include gene expression and enzymatic processes in living cells. Such systems are typically modelled as chemical reaction networks whose dynamics are governed by the chemical master equation. Despite its simple structure, no analytic solutions to the chemical master equation are known for most systems. Moreover, stochastic simulations are computationally expensive, making systematic analysis and statistical inference a challenging task. Consequently, significant effort has been spent in recent decades on the development of efficient approximation and inference methods. This article gives an introduction to basic modelling concepts as well as an overview of state-of-the-art methods. First, we motivate and introduce deterministic and stochastic methods for modelling chemical networks, and give an overview of simulation and exact solution methods. Next, we discuss several approximation methods, including the chemical Langevin equation, the system size expansion, moment closure approximations, time-scale separation approximations and hybrid methods. We discuss their various properties and review recent advances and remaining challenges for these methods. We present a comparison of several of these methods by means of a numerical case study and highlight some of their respective advantages and disadvantages. Finally, we discuss the problem of inference from experimental data in the Bayesian framework and review recent methods developed in the literature. In summary, this review gives a self-contained introduction to modelling, approximations and inference methods for stochastic chemical kinetics.
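The exact simulation approach the review covers can be illustrated with the Gillespie stochastic simulation algorithm for a single first-order decay reaction A → ∅; the reaction, rate constant, and seed below are illustrative choices:

```python
import random

def gillespie_decay(n0, k, t_max, seed=0):
    """Gillespie SSA for the single reaction A -> 0 with rate constant k.

    A minimal sketch: with one reaction channel the propensity is k*n and
    every firing removes one molecule. Returns the sampled trajectory.
    """
    rng = random.Random(seed)
    t, n = 0.0, n0
    traj = [(0.0, n0)]
    while t < t_max and n > 0:
        a = k * n                    # total propensity of the network
        t += rng.expovariate(a)      # waiting time to the next reaction event
        n -= 1                       # fire the decay reaction
        traj.append((t, n))
    return traj
```

Each run is one realization of the master-equation dynamics; averaging many independent runs (different seeds) recovers the mean exponential decay that the deterministic rate equation predicts.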
Cheminformatics Analysis of EPA ToxCast Chemical Libraries ...
An important goal of toxicology research is the development of robust methods that use in vitro and chemical structure information to predict in vivo toxicity endpoints. The US EPA ToxCast program is addressing this goal using ~600 in vitro assays to create bioactivity profiles on a set of 320 compounds, mostly pesticide actives, that have well characterized in vivo toxicity. These 320 compounds (EPA-320 set evaluated in Phase I of ToxCast) are a subset of a much larger set of ~10,000 candidates that are of interest to the EPA (called here EPA-10K). Predictive models of in vivo toxicity are being constructed from the in vitro assay data on the EPA-320 chemical set. These models require validation on additional chemicals prior to wide acceptance, and this will be carried out by evaluating compounds from EPA-10K in Phase II of ToxCast. We have used cheminformatics approaches including clustering, data visualization, and QSAR to develop models for EPA-320 that could help prioritizing EPA-10K validation chemicals. Both chemical descriptors, as well as calculated physicochemical properties have been used. Compounds from EPA-10K are prioritized based on their similarity to EPA-320 using different similarity metrics, with similarity thresholds defining the domain of applicability for the predictive models built for EPA-320 set. In addition, prioritized lists of compounds of increasing dissimilarity from the EPA-320 have been produced, to test the ability of the EPA-320
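Similarity-based prioritization of the kind described above is commonly done with the Tanimoto coefficient over fingerprint bit sets; in this sketch the set-based fingerprint representation and the similarity threshold are assumptions, not the ToxCast project's specific metrics:

```python
def tanimoto(fp_a, fp_b):
    """Tanimoto coefficient between two fingerprints given as sets of on-bits."""
    a, b = set(fp_a), set(fp_b)
    inter = len(a & b)
    return inter / (len(a) + len(b) - inter)   # |A∩B| / |A∪B|

def prioritize(candidates, reference_fps, threshold=0.7):
    """Keep candidate IDs whose best similarity to any reference chemical
    meets the threshold, i.e. falls inside the model's applicability domain."""
    return [cid for cid, fp in candidates
            if max(tanimoto(fp, ref) for ref in reference_fps) >= threshold]
```

Lowering the threshold produces the lists of increasing dissimilarity mentioned above, which probe how far outside the training set's domain the predictive models remain useful.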
Verma, Rajeshwar P; Matthews, Edwin J
2015-03-01
This is part II of an in silico investigation of chemical-induced eye injury that was conducted at FDA's CFSAN. Serious eye damage caused by chemicals (eye corrosion) is assessed using the rabbit Draize test, and this endpoint is an essential part of hazard identification and labeling of industrial and consumer products to ensure occupational and consumer safety. There is an urgent need to develop an alternative to the Draize test because the EU's 7th amendment to the Cosmetic Directive (EC, 2003; 76/768/EEC) and recast Regulation now ban animal testing on all cosmetic product ingredients, and the EU's REACH Program limits animal testing for chemicals in commerce. Although in silico methods have been reported for eye irritation (reversible damage), QSARs specific for eye corrosion (irreversible damage) have not been published. This report describes the development of 21 ANN c-QSAR models (QSAR-21) for assessing the eye corrosion potential of chemicals using a large and diverse CFSAN data set of 504 chemicals, ADMET Predictor's three sensitivity analyses, and ANNE classification functionalities with 20% test set selection from seven different methods. QSAR-21 models were internally and externally validated and exhibited high predictive performance: average statistics for the training, verification, and external test sets of these models were 96/96/94% sensitivity and 91/91/90% specificity. Copyright © 2014 Elsevier Inc. All rights reserved.
Liquid rocket combustor computer code development
NASA Technical Reports Server (NTRS)
Liang, P. Y.
1985-01-01
The Advanced Rocket Injector/Combustor Code (ARICC), which has been developed to model the complete chemical/fluid/thermal processes occurring inside rocket combustion chambers, is highlighted. The code, derived from the CONCHAS-SPRAY code originally developed at Los Alamos National Laboratory, incorporates powerful features such as the ability to model complex injector combustion chamber geometries, Lagrangian tracking of droplets, full chemical equilibrium and kinetic reactions for multiple species, a fractional volume of fluid (VOF) description of liquid jet injection in addition to the gaseous phase fluid dynamics, and turbulent mass, energy, and momentum transport. Atomization and droplet dynamic models from earlier generation codes are transplanted into the present code. Currently, ARICC is specialized for liquid oxygen/hydrogen propellants, although other fuel/oxidizer pairs can be easily substituted.
Chemical Operations Technology Curriculum Development Project. PY95 Final Detailed Report.
ERIC Educational Resources Information Center
Texas State Technical Coll., Marshall.
A model curriculum for an associate of applied science degree in chemical operations technology (COT) was developed at Texas State Technical College in Marshall, Texas. First, a comprehensive analysis of the local and statewide labor market demand for trained personnel in the advanced field of COT was conducted. Next, a comprehensive task analysis…
Applicability of western chemical dietary exposure models to the Chinese population.
Zhao, Shizhen; Price, Oliver; Liu, Zhengtao; Jones, Kevin C; Sweetman, Andrew J
2015-07-01
A range of exposure models, which have been developed in Europe and North America, are playing an increasingly important role in priority setting and the risk assessment of chemicals. However, the applicability of these tools, which are based on Western dietary exposure pathways, for estimating chemical exposure of the Chinese population to support the development of risk-based environmental and exposure assessments, is unclear. Three frequently used modelling tools, EUSES, RAIDAR and ACC-HUMANsteady, have been evaluated in terms of human dietary exposure estimation by application to a range of chemicals with different physicochemical properties under both model default and Chinese dietary scenarios. Hence, the modelling approaches were assessed by considering dietary pattern differences only. The predicted dietary exposure pathways were compared under both scenarios using a range of hypothetical and current emerging contaminants. Although the differences across models are greater than those between dietary scenarios, model predictions indicated that dietary preference can have a significant impact on human exposure, with the relatively high consumption of vegetables and cereals resulting in higher exposure via plant-based foodstuffs under Chinese consumption patterns compared to Western diets. The selected models demonstrated a good ability to identify key dietary exposure pathways which can be used for screening purposes and an evaluative risk assessment. However, some model adaptations will be required to cover a number of important Chinese exposure pathways, such as freshwater farmed-fish, grains and pork. Copyright © 2015 Elsevier Inc. All rights reserved.
CHEMICAL PROCESSES AND MODELING IN ECOSYSTEMS
Trends in regulatory strategies require EPA to better understand chemical behavior in natural and impacted ecosystems and in biological systems in order to carry out the increasingly complex array of exposure and risk assessments needed to develop scientifically defensible regulations (GP...
ACToR A Aggregated Computational Toxicology Resource
We are developing the ACToR system (Aggregated Computational Toxicology Resource) to serve as a repository for a variety of types of chemical, biological and toxicological data that can be used for predictive modeling of chemical toxicology.
Toxicokinetic Triage for Environmental Chemicals
Toxicokinetic (TK) models are essential for linking administered doses to blood and tissue concentrations. In vitro-to-in vivo extrapolation (IVIVE) methods have been developed to determine TK from limited in vitro measurements and chemical structure-based property predictions, p...
ABSTRACT Exposure to endocrine disrupting chemicals can affect reproduction and development in both humans and wildlife. We developed a mechanistic mathematical model of the hypothalamic pituitary-gonadal (HPG) axis in female fathead minnows to predic...
Petri Nets - A Mathematical Formalism to Analyze Chemical Reaction Networks.
Koch, Ina
2010-12-17
In this review we introduce and discuss Petri nets, a mathematical formalism to describe and analyze chemical reaction networks. Petri nets were developed to describe concurrency in general systems. Most applications are to technical and financial systems, but for about twenty years they have also been used in systems biology to model biochemical systems. This review aims to give a short informal introduction to the basic formalism, illustrated by a chemical example, and to discuss possible applications to the analysis of chemical reaction networks, including cheminformatics. We give a short overview of qualitative as well as quantitative Petri net modeling techniques useful in systems biology, summarizing the state of the art in that field and providing the main literature references. Finally, we discuss advantages and limitations of Petri nets and give an outlook on further development. Copyright © 2010 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
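The basic Petri net formalism for a reaction network reduces to pre/post matrices over places (species) and transitions (reactions), plus a token-firing rule; a minimal sketch for the single reaction A + B → C (the network and marking are illustrative):

```python
import numpy as np

# Petri net for A + B -> C: places = [A, B, C], one transition (the reaction).
pre  = np.array([[1, 1, 0]])   # tokens consumed per firing (reactant stoichiometry)
post = np.array([[0, 0, 1]])   # tokens produced per firing (product stoichiometry)
incidence = post - pre         # net change in marking per firing

def fire(marking, t=0):
    """Fire transition t if enabled (enough tokens on all input places)."""
    if np.all(marking >= pre[t]):
        return marking + incidence[t]
    return marking             # disabled: marking is unchanged

m0 = np.array([2, 1, 0])       # initial marking: 2 A, 1 B, 0 C
m1 = fire(m0)                  # one reaction event: [1, 0, 1]
```

Markings correspond to molecule counts and the incidence matrix is exactly the stoichiometric matrix, which is why structural Petri net analyses (invariants, reachability) translate directly to conservation relations and attainable states of the chemical network.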
Chemical structure determines target organ carcinogenesis in rats
Carrasquer, C. A.; Malik, N.; States, G.; Qamar, S.; Cunningham, S.L.; Cunningham, A.R.
2012-01-01
SAR models were developed for 12 rat tumour sites using data derived from the Carcinogenic Potency Database. Essentially, the models fall into two categories: Target Site Carcinogen – Non-Carcinogen (TSC-NC) and Target Site Carcinogen – Non-Target Site Carcinogen (TSC-NTSC). The TSC-NC models were composed of active chemicals that were carcinogenic to a specific target site and inactive ones that were whole animal non-carcinogens. On the other hand, the TSC-NTSC models used an inactive category also composed of carcinogens, but to any/all sites other than the target site. Leave-one-out validations produced an overall average concordance value for all 12 models of 0.77 for the TSC-NC models and 0.73 for the TSC-NTSC models. Overall, these findings suggest that while the TSC-NC models are able to distinguish between carcinogens and non-carcinogens, the TSC-NTSC models are identifying structural attributes that associate carcinogens with specific tumour sites. Since the TSC-NTSC models are composed of active and inactive compounds that are genotoxic and non-genotoxic carcinogens, the TSC-NTSC models may be capable of deciphering non-genotoxic mechanisms of carcinogenesis. Together, models of this type may also prove useful in anticancer drug development since they essentially contain chemical moieties that target specific tumour sites. PMID:23066888
Jasper, Micah N; Martin, Sheppard A; Oshiro, Wendy M; Ford, Jermaine; Bushnell, Philip J; El-Masri, Hisham
2016-03-15
People are often exposed to complex mixtures of environmental chemicals such as gasoline, tobacco smoke, water contaminants, or food additives. We developed an approach that applies chemical lumping methods to complex mixtures, in this case gasoline, based on biologically relevant parameters used in physiologically based pharmacokinetic (PBPK) modeling. Inhalation exposures were performed with rats to evaluate the performance of our PBPK model and chemical lumping method. There were 109 chemicals identified and quantified in the vapor in the chamber. The time-course toxicokinetic profiles of 10 target chemicals were also determined from blood samples collected during and following the in vivo experiments. A general PBPK model was used to compare the experimental data to the simulated values of blood concentration for 10 target chemicals with various numbers of lumps, iteratively increasing from 0 to 99. Large reductions in simulation error were gained by incorporating enzymatic chemical interactions, in comparison to simulating the individual chemicals separately. The error was further reduced by lumping the 99 nontarget chemicals. The same biologically based lumping approach can be used to simplify any complex mixture with tens, hundreds, or thousands of constituents.
Development of a Scale-up Tool for Pervaporation Processes
Thiess, Holger; Strube, Jochen
2018-01-01
In this study, an engineering tool for the design and optimization of pervaporation processes is developed based on physico-chemical modelling coupled with laboratory/mini-plant experiments. The model incorporates the solution-diffusion-mechanism, polarization effects (concentration and temperature), axial dispersion, pressure drop and the temperature drop in the feed channel due to vaporization of the permeating components. The permeance, being the key model parameter, was determined via dehydration experiments on a mini-plant scale for the binary mixtures ethanol/water and ethyl acetate/water. A second set of experimental data was utilized for the validation of the model for two chemical systems. The industrially relevant ternary mixture, ethanol/ethyl acetate/water, was investigated close to its azeotropic point and compared to a simulation conducted with the determined binary permeance data. Experimental and simulation data proved to agree very well for the investigated process conditions. In order to test the scalability of the developed engineering tool, large-scale data from an industrial pervaporation plant used for the dehydration of ethanol was compared to a process simulation conducted with the validated physico-chemical model. Since the membranes employed in both mini-plant and industrial scale were of the same type, the permeance data could be transferred. The comparison of the measured and simulated data proved the scalability of the derived model. PMID:29342956
Acute toxicity prediction to threatened and endangered ...
Evaluating contaminant sensitivity of threatened and endangered (listed) species and protectiveness of chemical regulations often depends on toxicity data for commonly tested surrogate species. The U.S. EPA’s Internet application Web-ICE is a suite of Interspecies Correlation Estimation (ICE) models that can extrapolate species sensitivity to listed taxa using least-squares regressions of the sensitivity of a surrogate species and a predicted taxon (species, genus, or family). Web-ICE was expanded with new models that can predict toxicity to over 250 listed species. A case study was used to assess protectiveness of genus and family model estimates derived from either geometric mean or minimum taxa toxicity values for listed species. Models developed from the most sensitive value for each chemical were generally protective of the most sensitive species within predicted taxa, including listed species, and were more protective than geometric means models. ICE model estimates were compared to HC5 values derived from Species Sensitivity Distributions for the case study chemicals to assess protectiveness of the two approaches. ICE models provide robust toxicity predictions and can generate protective toxicity estimates for assessing contaminant risk to listed species. Reporting on the development and optimization of ICE models for listed species toxicity estimation
Partitioning of polar and non-polar neutral organic chemicals into human and cow milk.
Geisler, Anett; Endo, Satoshi; Goss, Kai-Uwe
2011-10-01
The aim of this work was to develop a predictive model for milk/water partition coefficients of neutral organic compounds. Batch experiments were performed for 119 diverse organic chemicals in human milk and raw and processed cow milk at 37°C. No differences (<0.3 log units) in the partition coefficients of these types of milk were observed. The polyparameter linear free energy relationship model fit the calibration data well (SD=0.22 log units). An experimental validation data set including hormones and hormone active compounds was predicted satisfactorily by the model. An alternative modelling approach based on log K(ow) revealed a poorer performance. The model presented here provides a significant improvement in predicting enrichment of potentially hazardous chemicals in milk. In combination with physiologically based pharmacokinetic modelling this improvement in the estimation of milk/water partitioning coefficients may allow a better risk assessment for a wide range of neutral organic chemicals. Copyright © 2011 Elsevier Ltd. All rights reserved.
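A polyparameter linear free energy relationship of the kind calibrated here is just a linear combination of solute descriptors with system-specific coefficients; the sketch below uses hypothetical coefficient and descriptor names, not the fitted milk/water values:

```python
def pp_lfer(descriptors, coeffs):
    """Polyparameter LFER: log K = c + sum(system coefficient * solute descriptor).

    `descriptors` maps solute descriptor names (e.g. "E", "S", "A", "B", "V")
    to their values; `coeffs` maps the same names to system coefficients and
    includes the intercept "c". All numbers here are hypothetical placeholders.
    """
    return coeffs["c"] + sum(coeffs[name] * value
                             for name, value in descriptors.items())
```

Because each term models one interaction type (polarity, hydrogen bonding, cavity formation), a pp-LFER can extrapolate across chemically diverse neutral solutes where a single-parameter log Kow regression, as the abstract notes, performs worse.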
Boyd, Windy A.; Smith, Marjolein V.; Co, Caroll A.; Pirone, Jason R.; Rice, Julie R.; Shockley, Keith R.; Freedman, Jonathan H.
2015-01-01
Background: Modern toxicology is shifting from an observational to a mechanistic science. As part of this shift, high-throughput toxicity assays are being developed using alternative, nonmammalian species to prioritize chemicals and develop prediction models of human toxicity. Methods: The nematode Caenorhabditis elegans (C. elegans) was used to screen the U.S. Environmental Protection Agency’s (EPA’s) ToxCast™ Phase I and Phase II libraries, which contain 292 and 676 chemicals, respectively, for chemicals leading to decreased larval development and growth. Chemical toxicity was evaluated using three parameters: a biologically defined effect size threshold, half-maximal activity concentration (AC50), and lowest effective concentration (LEC). Results: Across both the Phase I and Phase II libraries, 62% of the chemicals were classified as active at ≤ 200 μM in the C. elegans assay. Chemical activities and potencies in C. elegans were compared with those from two zebrafish embryonic development toxicity studies and developmental toxicity data for rats and rabbits. Concordance of chemical activity was higher between C. elegans and one zebrafish assay across Phase I chemicals (79%) than with a second zebrafish assay (59%). Using C. elegans or zebrafish to predict rat or rabbit developmental toxicity resulted in balanced accuracies (the average value of the sensitivity and specificity for an assay) ranging from 45% to 53%, slightly lower than the concordance between rat and rabbit (58%). Conclusions: Here, we present an assay that quantitatively and reliably describes the effects of chemical toxicants on C. elegans growth and development. We found significant overlap in the activity of chemicals in the ToxCast™ libraries between C. elegans and zebrafish developmental screens. Incorporating C. elegans toxicological assays as part of a battery of in vitro and in vivo assays provides additional information for the development of models to predict a chemical’s potential toxicity to humans. Citation: Boyd WA, Smith MV, Co CA, Pirone JR, Rice JR, Shockley KR, Freedman JH. 2016. Developmental effects of the ToxCast™ Phase I and II chemicals in Caenorhabditis elegans and corresponding responses in zebrafish, rats, and rabbits. Environ Health Perspect 124:586–593; http://dx.doi.org/10.1289/ehp.1409645 PMID:26496690
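Balanced accuracy, as defined in the abstract above (the average of sensitivity and specificity), can be computed directly from a 2x2 confusion matrix. This is a generic sketch with made-up counts, not code or data from the study.

```python
def balanced_accuracy(tp, fn, tn, fp):
    """Average of sensitivity (true-positive rate) and
    specificity (true-negative rate)."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return (sensitivity + specificity) / 2

# Hypothetical counts: 40 of 80 toxicants detected (sensitivity 0.5),
# 70 of 100 non-toxicants cleared (specificity 0.7)
print(round(balanced_accuracy(tp=40, fn=40, tn=70, fp=30), 3))
```

Balanced accuracy is preferred over raw accuracy when, as in these screens, active and inactive chemicals are present in very unequal numbers.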
Consensus models to predict endocrine disruption for all ...
Humans are potentially exposed to tens of thousands of man-made chemicals in the environment. It is well known that some environmental chemicals mimic natural hormones and thus have the potential to be endocrine disruptors. Most of these environmental chemicals have never been tested for their ability to disrupt the endocrine system, in particular their ability to interact with the estrogen receptor. The EPA needs tools to prioritize thousands of chemicals, for instance in the Endocrine Disruptor Screening Program (EDSP). The Collaborative Estrogen Receptor Activity Prediction Project (CERAPP) was intended to demonstrate the use of predictive computational models on HTS data, including ToxCast and Tox21 assays, to prioritize a large chemical universe of 32,464 unique structures for one specific molecular target, the estrogen receptor. CERAPP combined multiple computational models for prediction of estrogen receptor activity and used the predicted results to build a unique consensus model. Models were developed in collaboration among 17 groups in the U.S. and Europe and applied to predict the common set of chemicals. Structure-based techniques such as docking and several QSAR modeling approaches were employed, mostly using a common training set of 1,677 compounds provided by the U.S. EPA, to build a total of 42 classification models and 8 regression models for binding, agonist, and antagonist activity. All predictions were evaluated on ToxCast data and on an exte
Hamm, V; Collon-Drouaillet, P; Fabriol, R
2008-02-19
The flooding of abandoned mines in the Lorraine Iron Basin (LIB) over the past 25 years has degraded the quality of the groundwater tapped for drinking water. High concentrations of dissolved sulphate have made the water unsuitable for human consumption. This problematic issue has led to the development of numerical tools to support water-resource management in mining contexts. Here we examine two modelling approaches using different numerical tools that we tested on the Saizerais flooded iron-ore mine (Lorraine, France). A first approach considers the Saizerais Mine as a network of two chemical reactors (NCR). The second approach is based on a physically distributed pipe network model (PNM) built with EPANET 2 software, which considers the mine as a network of pipes defined by their geometric and chemical parameters. Each reactor in the NCR model includes a detailed chemical model built to simulate quality evolution in the flooded mine water. However, in order to obtain a robust PNM, we simplified the detailed chemical model into a specific sulphate dissolution-precipitation model that is included as a sulphate source/sink in both the NCR model and the pipe network model. Both the NCR model and the PNM, based on different numerical techniques, give good post-calibration agreement between the simulated and measured sulphate concentrations in the drinking-water well and overflow drift. The NCR model incorporating the detailed chemical model is useful when detailed chemical behaviour at the overflow is needed. The PNM incorporating the simplified sulphate dissolution-precipitation model provides better information about the physics controlling flow and low-flow zones and about the time required for solid sulphate removal, whereas the NCR model underestimates clean-up time because of its complete-mixing assumption.
In conclusion, the detailed NCR model gives a first assessment of chemical processes at the overflow, and the PNM model then provides more detailed information on flow and chemical behaviour (dissolved sulphate concentrations, remaining mass of solid sulphate) in the network. Nevertheless, both modelling methods require hydrological and chemical parameters (recharge flow rate, outflows, volume of mine voids, mass of solids, kinetic constants of the dissolution-precipitation reactions) that are commonly not available for a mine and therefore call for calibration data.
Theory and Modeling of Liquid Explosive Detonation
NASA Astrophysics Data System (ADS)
Tarver, Craig M.; Urtiew, Paul A.
2010-10-01
The current understanding of the detonation reaction zones of liquid explosives is discussed in this article. The physical and chemical processes that precede and follow exothermic chemical reaction within the detonation reaction zone are discussed within the framework of the nonequilibrium Zeldovich-von Neumann-Doring (NEZND) theory of self-sustaining detonation. Nonequilibrium chemical and physical processes cause finite time duration induction zones before exothermic chemical energy release occurs. This separation between the leading shock wave front and the chemical energy release needed to sustain it results in shock wave amplification and the subsequent formation of complex three-dimensional cellular structures in all liquid detonation waves. To develop a practical Zeldovich-von Neumann-Doring (ZND) reactive flow model for liquid detonation, experimental data on reaction zone structure, confined failure diameter, unconfined failure diameter, and failure wave velocity in the Dremin-Trofimov test for detonating nitromethane are calculated using the ignition and growth reactive flow model.
Merging Applicability Domains for in Silico Assessment of Chemical Mutagenicity
2014-02-04
molecular fingerprints as descriptors for developing quantitative structure−activity relationship ( QSAR ) models and defining applicability domains with...used to define and quantify an applicability domain for either method. The importance of using applicability domains in QSAR modeling cannot be...domain from roughly 80% to 90%. These results indicated that the proposed QSAR protocol constituted a highly robust chemical mutagenicity prediction
Sliding mode control: an approach to regulate nonlinear chemical processes
Camacho; Smith
2000-01-01
A new approach for the design of sliding mode controllers, based on a first-order-plus-deadtime model of the process, is developed. This approach results in a fixed-structure controller with a set of tuning equations as a function of the characteristic parameters of the model. The controller performance is judged by simulations on two nonlinear chemical processes.
Thermo-Chemical Phenomena Simulation for Ablation
2011-02-21
DATES COVERED: 1/01/08-30/11/10. First, a physics-based chemical kinetic model for high-temperature gas is developed and verified by comparing with data from the RAM-C-II probe and the ... found to be negligible and the energy exchange is dominated by the chemical process for conductive-convective heat transfer. A simplified and more
Development of a QSAR Model for Thyroperoxidase Inhibition ...
Thyroid hormones (THs) are involved in multiple biological processes and are critical modulators of fetal development. Even moderate changes in maternal or fetal TH levels can produce irreversible neurological deficits in children, such as lower IQ. The enzyme thyroperoxidase (TPO) plays a key role in the synthesis of THs, and inhibition of TPO by xenobiotics results in decreased TH synthesis. Recently, a high-throughput screening assay for TPO inhibition (AUR-TPO) was developed and used to test the ToxCast Phase I and II chemicals. In the present study, we used the results from AUR-TPO to develop a Quantitative Structure-Activity Relationship (QSAR) model for TPO inhibition. The training set consisted of 898 discrete organic chemicals: 134 inhibitors and 764 non-inhibitors. A five-times two-fold cross-validation of the model was performed, yielding a balanced accuracy of 78.7%. More recently, an additional ~800 chemicals were tested in the AUR-TPO assay. These data were used for a blinded external validation of the QSAR model, demonstrating a balanced accuracy of 85.7%. Overall, the cross- and external validation indicate a robust model with high predictive performance. Next, we used the QSAR model to predict 72,526 REACH pre-registered substances. The model could predict 49.5% (35,925) of the substances within its applicability domain, and of these, 8,863 (24.7%) were predicted to be TPO inhibitors. Predictions from this screening can be used in a tiered approach to
PARTICLE FLOW, MIXING, AND CHEMICAL REACTION IN CIRCULATING FLUIDIZED BED ABSORBERS
A mixing model has been developed to simulate the particle residence time distribution (RTD) in a circulating fluidized bed absorber (CFBA). Also, a gas/solid reaction model for sulfur dioxide (SO2) removal by lime has been developed. For the reaction model that considers RTD dis...
We describe the development and evaluation of two new model algorithms for NOx chemistry in the R-LINE near-road dispersion model for traffic sources. With increased urbanization, there is increased mobility leading to higher amount of traffic related activity on a global scale. ...
Hong, Huixiao; Shen, Jie; Ng, Hui Wen; Sakkiah, Sugunadevi; Ye, Hao; Ge, Weigong; Gong, Ping; Xiao, Wenming; Tong, Weida
2016-03-25
Endocrine disruptors such as polychlorinated biphenyls (PCBs), diethylstilbestrol (DES) and dichlorodiphenyltrichloroethane (DDT) are agents that interfere with the endocrine system and cause adverse health effects, and they have raised huge public health concern. One of the mechanisms of endocrine disruption is binding of endocrine disruptors to hormone receptors in target cells. Entrance of endocrine disruptors into target cells is the precondition of endocrine disruption. The binding capability of a chemical with proteins in the blood affects its entrance into the target cells and, thus, is very informative for the assessment of potential endocrine disruption of chemicals. α-fetoprotein is one of the major serum proteins that binds to a variety of chemicals such as estrogens. To better facilitate assessment of endocrine disruption of environmental chemicals, we developed a model for α-fetoprotein binding activity prediction using a novel pattern recognition method (Decision Forest) and molecular descriptors calculated from two-dimensional structures by Mold² software. The predictive capability of the model has been evaluated through internal validation using 125 training chemicals (average balanced accuracy of 69%) and external validation using 22 chemicals (balanced accuracy of 71%). Prediction confidence analysis revealed that the model performed much better at high prediction confidence. Our results indicate that the model is useful (when predictions are at high confidence) in endocrine disruption risk assessment of environmental chemicals, though improvement by increasing the number of training chemicals is needed.
From QSAR to QSIIR: Searching for Enhanced Computational Toxicology Models
Zhu, Hao
2017-01-01
Quantitative Structure-Activity Relationship (QSAR) is the most frequently used modeling approach for exploring the dependency of biological, toxicological, or other types of activities/properties of chemicals on their molecular features. In the past two decades, QSAR modeling has been used extensively in the drug discovery process. However, the predictive models resulting from QSAR studies have limited use for chemical risk assessment, especially for animal and human toxicity evaluations, due to their low predictivity for new compounds. To develop enhanced toxicity models with independently validated external prediction power, novel modeling protocols have been pursued by computational toxicologists based on the rapidly increasing toxicity testing data of recent years. This chapter reviews the recent effort in our laboratory to incorporate biological testing results as descriptors in the toxicity modeling process. This effort extended the concept of QSAR to Quantitative Structure In vitro-In vivo Relationship (QSIIR). The QSIIR study examples provided in this chapter indicate that QSIIR models based on hybrid (biological and chemical) descriptors are indeed superior to conventional QSAR models based only on chemical descriptors for several animal toxicity endpoints. We believe that the applications introduced in this review will be of interest and value to researchers working in the field of computational drug discovery and environmental chemical risk assessment. PMID:23086837
Challenges in Developing Models Describing Complex Soil Systems
NASA Astrophysics Data System (ADS)
Simunek, J.; Jacques, D.
2014-12-01
Quantitative mechanistic models that consider basic physical, mechanical, chemical, and biological processes have the potential to be powerful tools for integrating our understanding of complex soil systems, and the soil science community has often called for models that would include a large number of these diverse processes. However, once attempts have been made to develop such models, the response from the community has not always been overwhelming, especially after it was discovered that these models are consequently highly complex. They require a large number of parameters, not all of which can be easily (or at all) measured and/or identified, and which are often associated with large uncertainties; they also require from their users deep knowledge of most or all of the implemented physical, mechanical, chemical, and biological processes. Real, or perceived, complexity of these models then discourages users from using them even for relatively simple applications for which they would be perfectly adequate. Due to the nonlinear nature and chemical/biological complexity of soil systems, it is also virtually impossible to verify these types of models analytically, raising doubts about their applicability. Code inter-comparison, which is then likely the most suitable method to assess code capabilities and model performance, requires the existence of multiple models of similar or overlapping capabilities, which may not always exist. It is thus a challenge not only to develop models describing complex soil systems, but also to persuade the soil science community to use them. As a result, complex quantitative mechanistic models remain an underutilized tool in soil science research. We will demonstrate some of the challenges discussed above using our own efforts in developing quantitative mechanistic models (such as HP1/2) for complex soil systems.
Sharma, Nripen S.; Jindal, Rohit; Mitra, Bhaskar; Lee, Serom; Li, Lulu; Maguire, Tim J.; Schloss, Rene; Yarmush, Martin L.
2014-01-01
Skin sensitization remains a major environmental and occupational health hazard. Animal models have been used as the gold standard method of choice for estimating chemical sensitization potential. However, a growing international drive and consensus for minimizing animal usage have prompted the development of in vitro methods to assess chemical sensitivity. In this paper, we examine existing approaches including in silico models, cell and tissue based assays for distinguishing between sensitizers and irritants. The in silico approaches that have been discussed include Quantitative Structure Activity Relationships (QSAR) and QSAR based expert models that correlate chemical molecular structure with biological activity and mechanism based read-across models that incorporate compound electrophilicity. The cell and tissue based assays rely on an assortment of mono and co-culture cell systems in conjunction with 3D skin models. Given the complexity of allergen induced immune responses, and the limited ability of existing systems to capture the entire gamut of cellular and molecular events associated with these responses, we also introduce a microfabricated platform that can capture all the key steps involved in allergic contact sensitivity. Finally, we describe the development of an integrated testing strategy comprised of two or three tier systems for evaluating sensitization potential of chemicals. PMID:24741377
Wang, XinJie; Wu, YanQing; Huang, FengLei
2017-01-05
A mesoscopic framework is developed to quantify the thermal-mechanical-chemical responses of polymer-bonded explosive (PBX) samples under impact loading. A mesoscopic reactive model is developed for the cyclotetramethylenetetranitramine (HMX) crystal, which incorporates nonlinear elasticity, crystal plasticity, and temperature-dependent chemical reaction. The proposed model was implemented in the finite element code ABAQUS by the user subroutine VUMAT. A series of three-dimensional mesoscale models were constructed and calculated under low-strength impact loading scenarios from 100m/s to 600m/s where only the first wave transit is studied. Crystal anisotropy and microstructural heterogeneity are responsible for the nonuniform stress field and fluctuations of the stress wave front. At a critical impact velocity (≥300m/s), a chemical reaction is triggered because the temperature contributed by the volumetric and plastic works is sufficiently high. Physical quantities, including stress, temperature, and extent of reaction, are homogenized from those across the microstructure at the mesoscale to compare with macroscale measurements, which will advance the continuum-level models. The framework presented in this study has important implications in understanding hot spot ignition processes and improving predictive capabilities in energetic materials. Copyright © 2016 Elsevier B.V. All rights reserved.
Development of an Improved Simulator for Chemical and Microbial EOR Methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pope, Gary A.; Sepehrnoori, Kamy; Delshad, Mojdeh
2000-09-11
The objective of this research was to extend the capability of an existing simulator (UTCHEM) to improved oil recovery methods that use surfactants, polymers, gels, alkaline chemicals, microorganisms and foam, as well as various combinations of these, in both conventional and naturally fractured oil reservoirs. Task 1 is the addition of a dual-porosity model for chemical improved oil recovery processes in naturally fractured oil reservoirs. Task 2 is the addition of a foam model. Task 3 addresses several numerical and coding enhancements that will greatly improve the versatility and performance of UTCHEM. Task 4 is the enhancement of physical property models.
Screening level risk assessment model for chemical fate and effects in the environment.
Arnot, Jon A; Mackay, Don; Webster, Eva; Southwood, Jeanette M
2006-04-01
A screening level risk assessment model is developed and described to assess and prioritize chemicals by estimating environmental fate and transport, bioaccumulation, and exposure to humans and wildlife for a unit emission rate. The most sensitive risk endpoint is identified and a critical emission rate is then calculated as a result of that endpoint being reached. Finally, this estimated critical emission rate is compared with the estimated actual emission rate as a risk assessment factor. This "back-tracking" process avoids the use of highly uncertain emission rate data as model input. The application of the model is demonstrated in detail for three diverse chemicals and in less detail for a group of 70 chemicals drawn from the Canadian Domestic Substances List. The simple Level II and the more complex Level III fate calculations are used to "bin" substances into categories of similar probable risk. The essential role of the model is to synthesize information on chemical and environmental properties within a consistent mass balance framework to yield an overall estimate of screening level risk with respect to the defined endpoint. The approach may be useful to identify and prioritize those chemicals of commerce that are of greatest potential concern and require more comprehensive modeling and monitoring evaluations in actual regional environments and food webs.
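The "back-tracking" step described in this abstract can be sketched generically: run the fate model at a unit emission rate, identify the most sensitive risk endpoint, convert it into a critical emission rate, and compare that with the estimated actual emission. All names, endpoints, and numbers below are illustrative assumptions, not values from the paper.

```python
def critical_emission_rate(conc_per_unit_emission, endpoint_thresholds):
    """Given modeled exposure concentrations per unit emission rate for
    several endpoints, return the most sensitive endpoint and the
    critical emission rate at which that endpoint is just reached."""
    # Risk per unit emission for each endpoint; the largest ratio
    # identifies the most sensitive endpoint.
    ratios = {k: conc_per_unit_emission[k] / endpoint_thresholds[k]
              for k in endpoint_thresholds}
    most_sensitive = max(ratios, key=ratios.get)
    return most_sensitive, 1.0 / ratios[most_sensitive]

# Illustrative endpoints (hypothetical units per unit emission)
conc = {"human_intake": 2e-6, "fish_tissue": 5e-4}
limits = {"human_intake": 1e-3, "fish_tissue": 5e-2}  # no-effect levels
endpoint, e_crit = critical_emission_rate(conc, limits)
actual_emission = 20.0
risk_factor = actual_emission / e_crit  # > 1 flags potential concern
```

Because fate models at Level II/III are linear in the emission rate, concentrations computed for a unit emission can simply be rescaled, which is what makes this back-tracking valid.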
Real-time plasma control in a dual-frequency, confined plasma etcher
NASA Astrophysics Data System (ADS)
Milosavljević, V.; Ellingboe, A. R.; Gaman, C.; Ringwood, J. V.
2008-04-01
The physics issues of developing model-based control of plasma etching are presented. A novel methodology for incorporating real-time model-based control of plasma processing systems is developed. The methodology is developed for control of two dependent variables (ion flux and chemical densities) by two independent controls (27 MHz power and O2 flow). A phenomenological physics model of the nonlinear coupling between the independent controls and the dependent variables of the plasma is presented. By using a design of experiments, the functional dependencies of the response surface are determined. In conjunction with the physical model, the dependencies are used to deconvolve the sensor signals onto the control inputs, allowing compensation of the interaction between control paths. The compensated sensor signals and compensated set-points are then used as inputs to proportional-integral-derivative controllers to adjust radio-frequency power and oxygen flow to yield the desired ion flux and chemical density. To illustrate the methodology, model-based real-time control is realized in a commercial semiconductor dielectric etch chamber. The dual-frequency symmetric diode operates with typical commercial fluorocarbon feed-gas mixtures (Ar/O2/C4F8). Key parameters for dielectric etching are known to include ion flux to the surface and surface flux of oxygen-containing species. Control is demonstrated using diagnostics of electrode-surface ion current and chemical densities of O, O2, and CO measured by optical emission spectrometry and/or mass spectrometry. Using our model-based real-time control, the set-point tracking accuracy in response to changes in chemical species density and ion flux is enhanced.
Consequence and Resilience Modeling for Chemical Supply Chains
NASA Technical Reports Server (NTRS)
Stamber, Kevin L.; Vugrin, Eric D.; Ehlen, Mark A.; Sun, Amy C.; Warren, Drake E.; Welk, Margaret E.
2011-01-01
The U.S. chemical sector produces more than 70,000 chemicals that are essential material inputs to critical infrastructure systems, such as the energy, public health, and food and agriculture sectors. Disruptions to the chemical sector can potentially cascade to other dependent sectors, resulting in serious national consequences. To address this concern, the U.S. Department of Homeland Security (DHS) tasked Sandia National Laboratories to develop a predictive consequence modeling and simulation capability for global chemical supply chains. This paper describes that capability, which includes a dynamic supply chain simulation platform called N_ABLE(tm). The paper also presents results from a case study that simulates the consequences of a Gulf Coast hurricane on selected segments of the U.S. chemical sector. The case study identified consequences that include impacted chemical facilities, cascading impacts to other parts of the chemical sector, and estimates of the lengths of chemical shortages and recovery. Overall, these simulation results can help DHS prepare for and respond to actual disruptions.
Metabolic biotransformation half-lives in fish: QSAR modeling and consensus analysis.
Papa, Ester; van der Wal, Leon; Arnot, Jon A; Gramatica, Paola
2014-02-01
Bioaccumulation in fish is a function of competing rates of chemical uptake and elimination. For hydrophobic organic chemicals, bioconcentration, bioaccumulation, and biomagnification potential are high, and the biotransformation rate constant is a key parameter. Few measured biotransformation rate constant data are available compared to the number of chemicals that are being evaluated for bioaccumulation hazard and for exposure and risk assessment. Three new Quantitative Structure-Activity Relationships (QSARs) for predicting whole-body biotransformation half-lives (HLN) in fish were developed and validated using theoretical molecular descriptors that seek to capture structural characteristics of the whole molecule, along with three data set splitting schemes. The new QSARs were developed using a minimal number of theoretical descriptors (n=9) and compared to existing QSARs developed using fragment contribution methods that include up to 59 descriptors. The predictive statistics of the models are similar, further corroborating the predictive performance of the different QSARs: Q(2)ext ranges from 0.75 to 0.77, CCCext ranges from 0.86 to 0.87, and RMSE in prediction ranges from 0.56 to 0.58. The new QSARs provide additional mechanistic insights into the biotransformation capacity of organic chemicals in fish by including whole-molecule descriptors, and they also include information on the domain of applicability for the chemical of interest. Advantages of consensus modeling are illustrated for improving overall prediction and minimizing false-negative errors in chemical screening assessments, for identifying potential sources of residual error in the empirical HLN database, and for identifying structural features that are not well represented in the HLN dataset to prioritize future testing needs. © 2013.
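The consensus idea above, averaging the predictions of several validated QSARs and scoring the result with RMSE, can be sketched in a few lines. The data here are toy numbers, not the HLN dataset.

```python
import math

def rmse(obs, pred):
    """Root-mean-square error between observed and predicted values."""
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(obs, pred)) / len(obs))

def consensus(*model_preds):
    """Average the predictions of several models, chemical by chemical."""
    return [sum(vals) / len(vals) for vals in zip(*model_preds)]

# Toy log HLN values for three chemicals, two hypothetical QSARs
observed = [1.2, 0.4, 2.0]
qsar_a   = [1.0, 0.6, 2.3]
qsar_b   = [1.5, 0.3, 1.8]
cons = consensus(qsar_a, qsar_b)  # approx. [1.25, 0.45, 2.05]
print(round(rmse(observed, cons), 3))
```

Averaging tends to cancel the uncorrelated errors of individual models, which is why the consensus prediction here beats either QSAR alone.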
ABSTRACT Results of global gene expression profiling after short-term exposures can be used to inform tumorigenic potency and chemical mode of action (MOA) and thus serve as a strategy to prioritize future or data-poor chemicals for further evaluation. This compilation of cas...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kropka, Jamie Michael; Stavig, Mark E.; Arechederra, Gabe Kenneth
Develop an understanding of the evolution of glassy polymer mechanical response during aging and the mechanisms associated with that evolution. That understanding will be used to develop constitutive models to assess the impact of stress evolution in encapsulants on NW designs.
Exposure to endocrine disrupting chemicals can affect reproduction and development in both humans and wildlife. We developed a mechanistic mathematical model of the hypothalamic-pituitary-gonadal (HPG) axis in female fathead minnows to predict dose-response and time-course (DRTC)...
As part of a broader exploratory effort to develop ecological risk assessment approaches to estimate potential chemical effects on non-target populations, we describe an approach for developing simple population models to estimate the extent to which acute effects on individual...
Advanced physical-chemical life support systems research
NASA Technical Reports Server (NTRS)
Evanich, Peggy L.
1988-01-01
A proposed NASA space research and technology development program will provide adequate data for designing closed-loop life support systems for long-duration manned space missions. This program, referred to as the Pathfinder Physical-Chemical Closed Loop Life Support Program, aims to identify and develop critical chemical engineering technologies for the closure of air and water loops within spacecraft, surface habitats, or mobility devices. Computerized simulation can be used both as a research and a management tool. Validated models will guide the selection of the best known applicable processes and the development of new processes. For the integration of the habitat system, a biological subsystem would be introduced to provide food production and to enhance the physical-chemical life support functions on an ever-increasing basis.
Wildenhain, Jan; Spitzer, Michaela; Dolma, Sonam; Jarvik, Nick; White, Rachel; Roy, Marcia; Griffiths, Emma; Bellows, David S.; Wright, Gerard D.; Tyers, Mike
2016-01-01
The network structure of biological systems suggests that effective therapeutic intervention may require combinations of agents that act synergistically. However, a dearth of systematic chemical combination datasets has limited the development of predictive algorithms for chemical synergism. Here, we report two large datasets of linked chemical-genetic and chemical-chemical interactions in the budding yeast Saccharomyces cerevisiae. We screened 5,518 unique compounds against 242 diverse yeast gene deletion strains to generate an extended chemical-genetic matrix (CGM) of 492,126 chemical-gene interaction measurements. This CGM dataset contained 1,434 genotype-specific inhibitors, termed cryptagens. We selected 128 structurally diverse cryptagens and tested all pairwise combinations to generate a benchmark dataset of 8,128 pairwise chemical-chemical interaction tests for synergy prediction, termed the cryptagen matrix (CM). An accompanying database resource called ChemGRID was developed to enable analysis, visualisation and downloads of all data. The CGM and CM datasets will facilitate the benchmarking of computational approaches for synergy prediction, as well as chemical structure-activity relationship models for anti-fungal drug discovery. PMID:27874849
Chemical reactions simulated by ground-water-quality models
Grove, David B.; Stollenwerk, Kenneth G.
1987-01-01
Recent literature concerning the modeling of chemical reactions during transport in ground water is examined with emphasis on sorption reactions. The theory of transport and reactions in porous media has been well documented. Numerous equations have been developed from this theory, to provide both continuous and sequential or multistep models, with the water phase considered for both mobile and immobile phases. Chemical reactions can be either equilibrium or non-equilibrium, and can be quantified in linear or non-linear mathematical forms. Non-equilibrium reactions can be separated into kinetic and diffusional rate-limiting mechanisms. Solutions to the equations are available by either analytical expressions or numerical techniques. Saturated and unsaturated batch, column, and field studies are discussed with one-dimensional, laboratory-column experiments predominating. A summary table is presented that references the various kinds of models studied and their applications in predicting chemical concentrations in ground waters.
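For the linear-equilibrium sorption case discussed above, the effect of sorption on solute transport is commonly summarized by a retardation factor, R = 1 + (rho_b / theta) * Kd. The sketch below uses illustrative aquifer values, not data from any particular study.

```python
def retardation_factor(bulk_density, porosity, kd):
    """Linear-equilibrium sorption retardation factor:
    R = 1 + (rho_b / theta) * Kd.
    Units: rho_b in kg/L, theta dimensionless, Kd in L/kg."""
    return 1.0 + (bulk_density / porosity) * kd

# Illustrative aquifer: rho_b = 1.6 kg/L, theta = 0.4, Kd = 0.5 L/kg
R = retardation_factor(1.6, 0.4, 0.5)
print(R)  # the sorbing solute migrates R times slower than the water
```

With R = 3, a sorbing solute front arrives at a well roughly three times later than a conservative tracer, which is the practical import of the equilibrium sorption models reviewed here.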
Thyroid hormones (TH) are critical for normal brain development. Environmental chemicals may disrupt TH homeostasis through a variety of physiological systems including membrane transporters, serum transporters, synthesis and catabolic enzymes, and nuclear receptors. Current comp...
There is increasing evidence that exposure to endocrine disrupting chemicals (EDCs) in the environment can induce adverse effects on reproduction and development in both humans and wildlife, mediated through hormonal disturbances.
Application of Chemistry in Materials Research at NASA GRC
NASA Technical Reports Server (NTRS)
Kavandi, Janet L.
2016-01-01
Overview of NASA GRC Materials Development. New materials enabled by new chemistries offering unique properties and chemical processing techniques. Durability of materials in harsh environments requires understanding and modeling of chemical interaction of materials with the environment.
Alarms about structural alerts.
Alves, Vinicius; Muratov, Eugene; Capuzzi, Stephen; Politi, Regina; Low, Yen; Braga, Rodolpho; Zakharov, Alexey V; Sedykh, Alexander; Mokshyna, Elena; Farag, Sherif; Andrade, Carolina; Kuz'min, Victor; Fourches, Denis; Tropsha, Alexander
2016-08-21
Structural alerts are widely accepted in chemical toxicology and regulatory decision support as a simple and transparent means to flag potential chemical hazards or group compounds into categories for read-across. However, there has been a growing concern that alerts disproportionally flag too many chemicals as toxic, which questions their reliability as toxicity markers. Conversely, the rigorously developed and properly validated statistical QSAR models can accurately and reliably predict the toxicity of a chemical; however, their use in regulatory toxicology has been hampered by the lack of transparency and interpretability. We demonstrate that contrary to the common perception of QSAR models as "black boxes" they can be used to identify statistically significant chemical substructures (QSAR-based alerts) that influence toxicity. We show through several case studies, however, that the mere presence of structural alerts in a chemical, irrespective of the derivation method (expert-based or QSAR-based), should be perceived only as hypotheses of possible toxicological effect. We propose a new approach that synergistically integrates structural alerts and rigorously validated QSAR models for a more transparent and accurate safety assessment of new chemicals.
Zhang, Li; Ai, Haixin; Chen, Wen; Yin, Zimo; Hu, Huan; Zhu, Junfeng; Zhao, Jian; Zhao, Qi; Liu, Hongsheng
2017-05-18
Carcinogenicity refers to a highly toxic end point of certain chemicals, and has become an important issue in the drug development process. In this study, three novel ensemble classification models, namely Ensemble SVM, Ensemble RF, and Ensemble XGBoost, were developed to predict carcinogenicity of chemicals using seven types of molecular fingerprints and three machine learning methods based on a dataset containing 1003 diverse compounds with rat carcinogenicity. Among these three models, Ensemble XGBoost is found to be the best, giving an average accuracy of 70.1 ± 2.9%, sensitivity of 67.0 ± 5.0%, and specificity of 73.1 ± 4.4% in five-fold cross-validation and an accuracy of 70.0%, sensitivity of 65.2%, and specificity of 76.5% in external validation. In comparison with some recent methods, the ensemble models outperform some machine learning-based approaches and yield equal accuracy and higher specificity but lower sensitivity than rule-based expert systems. It is also found that the ensemble models could be further improved if more data were available. As an application, the ensemble models are employed to discover potential carcinogens in the DrugBank database. The results indicate that the proposed models are helpful in predicting the carcinogenicity of chemicals. A web server called CarcinoPred-EL has been built for these models ( http://ccsipb.lnu.edu.cn/toxicity/CarcinoPred-EL/ ).
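The evaluation metrics reported above (accuracy, sensitivity, specificity) and the idea of combining base classifiers can be sketched in a few lines. This is a minimal illustration with invented toy predictions and a simple majority-vote rule, not the paper's actual ensemble code (which uses molecular fingerprints with SVM, RF, and XGBoost):

```python
# Toy sketch: binary-classification metrics plus a majority-vote ensemble
# (1 = predicted carcinogen, 0 = non-carcinogen). Illustrative only.

def confusion(y_true, y_pred):
    """Count true/false positives and negatives."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return tp, tn, fp, fn

def metrics(y_true, y_pred):
    tp, tn, fp, fn = confusion(y_true, y_pred)
    return {
        "accuracy": (tp + tn) / len(y_true),
        "sensitivity": tp / (tp + fn),   # true-positive rate
        "specificity": tn / (tn + fp),   # true-negative rate
    }

def majority_vote(predictions):
    """Combine per-model binary predictions column-wise by majority vote."""
    return [1 if sum(col) * 2 > len(col) else 0 for col in zip(*predictions)]

if __name__ == "__main__":
    y_true = [1, 1, 1, 0, 0, 0, 1, 0]
    model_preds = [                      # invented outputs of 3 base models
        [1, 1, 0, 0, 1, 0, 1, 0],
        [1, 0, 1, 0, 0, 0, 1, 0],
        [1, 1, 1, 0, 0, 1, 0, 0],
    ]
    print(metrics(y_true, majority_vote(model_preds)))
```

In the same spirit as the study, the ensemble prediction is typically more stable than any single member, since uncorrelated errors tend to be voted away.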
Human Environmental Disease Network: A computational model to assess toxicology of contaminants.
Taboureau, Olivier; Audouze, Karine
2017-01-01
During the past decades, many epidemiological, toxicological and biological studies have been performed to assess the role of environmental chemicals as potential toxicants associated with diverse human disorders. However, the relationships between diseases based on chemical exposure rarely have been studied by computational biology. We developed a human environmental disease network (EDN) to explore and suggest novel disease-disease and chemical-disease relationships. The presented scored EDN model is built upon the integration of systems biology and chemical toxicology using information on chemical contaminants and their disease relationships reported in the TDDB database. The resulting human EDN takes into consideration the level of evidence of the toxicant-disease relationships, allowing inclusion of some degrees of significance in the disease-disease associations. Such a network can be used to identify uncharacterized connections between diseases. Examples are discussed for type 2 diabetes (T2D). Additionally, this computational model allows confirmation of already known links between chemicals and diseases (e.g., between bisphenol A and behavioral disorders) and also reveals unexpected associations between chemicals and diseases (e.g., between chlordane and olfactory alteration), thus predicting which chemicals may be risk factors to human health. The proposed human EDN model allows exploration of common biological mechanisms of diseases associated with chemical exposure, helping us to gain insight into disease etiology and comorbidity. This computational approach is an alternative to animal testing supporting the 3R concept.
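The core assembly step of such an environmental disease network can be sketched simply: connect two diseases whenever they share an associated chemical, and score the edge by the weaker of the two evidence levels. The chemicals, diseases, and evidence weights below are invented placeholders, not TDDB data:

```python
# Minimal sketch of a scored disease-disease network built from
# chemical-disease links, in the spirit of the EDN described above.
from collections import defaultdict
from itertools import combinations

# (chemical, disease, evidence weight) -- e.g. 2 = curated, 1 = inferred
links = [
    ("bisphenol A", "behavioral disorder", 2),
    ("bisphenol A", "type 2 diabetes", 1),
    ("chlordane", "olfactory alteration", 1),
    ("chlordane", "type 2 diabetes", 2),
]

def disease_network(links):
    """Connect two diseases whenever they share a chemical; the edge score
    sums min(evidence) over all shared chemicals."""
    by_chem = defaultdict(dict)
    for chem, disease, weight in links:
        by_chem[chem][disease] = weight
    edges = defaultdict(int)
    for diseases in by_chem.values():
        for (d1, w1), (d2, w2) in combinations(sorted(diseases.items()), 2):
            edges[(d1, d2)] += min(w1, w2)
    return dict(edges)

print(disease_network(links))
```

Taking the minimum of the two evidence weights is one plausible conservative scoring choice; the published model's actual scoring scheme may differ.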
A Probabilistic Diagram to Guide Chemical Design with ...
Toxicity is a concern with many chemicals currently in commerce, and with new chemicals that are introduced each year. The standard approach to testing chemicals is to run studies in laboratory animals (e.g. rats, mice, dogs), but because of the expense of these studies and concerns for animal welfare, few chemicals besides pharmaceuticals and pesticides are fully tested. Over the last decade there have been significant developments in the field of computational toxicology, which combines in vitro tests and computational models. The ultimate goal of this field is to test all chemicals in a rapid, cost-effective manner with minimal use of animals. One of the simplest measures of toxicity is provided by high-throughput in vitro cytotoxicity assays, which measure the concentration of a chemical that kills particular types of cells. Chemicals that are cytotoxic at low concentrations tend to be more toxic to animals than chemicals that are less cytotoxic. We employed molecular characteristics derived from density functional theory (DFT) and predicted values of log(octanol-water partition coefficient) (logP) to construct a design variable space, and built a predictive model for cytotoxicity using a Naive Bayesian algorithm. External evaluation showed that the area under the curve (AUC) for the receiver operating characteristic (ROC) of the model to be 0.81. Using this model, we provide design rules to help synthetic chemists minimize the chance that a newly synthesize
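The AUC-ROC figure of merit quoted above has a simple probabilistic reading: it is the probability that a randomly chosen positive (cytotoxic) compound receives a higher model score than a randomly chosen negative one. That Mann-Whitney formulation can be computed directly; the labels and scores below are invented:

```python
# Hedged sketch of AUC-ROC via the Mann-Whitney (rank) formulation:
# fraction of positive/negative pairs where the positive outscores the
# negative, counting ties as half a win. O(P*N); fine for small sets.

def roc_auc(labels, scores):
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = 0.0
    for p in pos:
        for n in neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos) * len(neg))

print(roc_auc([1, 1, 0, 0], [0.9, 0.4, 0.6, 0.2]))  # one misranked pair
```

An AUC of 0.81, as reported, means roughly four out of five random positive/negative pairs are ranked correctly by the model.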
The Geochemical Earth Reference Model (GERM)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Staudigel, H.; Albarede, F.; Shaw, H.
The Geochemical Earth Reference Model (GERM) initiative is a grass-roots effort with the goal of establishing a community consensus on a chemical characterization of the Earth, its major reservoirs, and the fluxes between them. The long-term goal of GERM is a chemical reservoir characterization analogous to the geophysical effort of the Preliminary Reference Earth Model (PREM). Chemical fluxes between reservoirs are included in GERM to illuminate the long-term chemical evolution of the Earth and to characterize the Earth as a dynamic chemical system. In turn, these fluxes control geological processes and influence hydrosphere-atmosphere-climate dynamics. While these long-term goals are clearly the focus of GERM, the process of establishing GERM itself is just as important as its ultimate goal. The GERM initiative is developed in an open community discussion on the World Wide Web (the GERM home page is at http://www-ep.es.llnl.gov/germ/germ-home.html) that is mediated by a series of editors with responsibilities for distinct reservoirs and fluxes. Beginning with the original workshop in Lyons (March 1996), GERM has continued to be developed on the Internet, punctuated by workshops and special sessions at professional meetings. It is planned to complete the first model by mid-1997, followed by a call for papers for a February 1998 GERM conference in La Jolla, California.
NASA Astrophysics Data System (ADS)
Bergqvist, Anna; Chang Rundgren, Shu-Nu
2017-04-01
Background: Textbooks are integral tools for teachers' lessons. Several researchers observed that school teachers rely heavily on textbooks as informational sources when planning lessons. Moreover, textbooks are an important resource for developing students' knowledge as they contain various representations that influence students' learning. However, several studies report that students have difficulties understanding models in general, and chemical bonding models in particular, and that students' difficulties understanding chemical bonding are partly due to the way it is taught by teachers and presented in textbooks.
Interactive models of communication at the nanoscale using nanoparticles that talk to one another
Llopis-Lorente, Antoni; Díez, Paula; Sánchez, Alfredo; Marcos, María D.; Sancenón, Félix; Martínez-Ruiz, Paloma; Villalonga, Reynaldo; Martínez-Máñez, Ramón
2017-01-01
'Communication' between abiotic nanoscale chemical systems is an almost-unexplored field with enormous potential. Here we show the design and preparation of a chemical communication system based on enzyme-powered Janus nanoparticles, which mimics an interactive model of communication. Cargo delivery from one nanoparticle is governed by the biunivocal communication with another nanoparticle, which involves two enzymatic processes and the interchange of chemical messengers. The conceptual idea of establishing communication between nanodevices opens the opportunity to develop complex nanoscale systems capable of sharing information and cooperating. PMID:28556828
Multicellular Models of Morphogenesis
EPA’s Virtual Embryo project (v-Embryo™), in collaboration with developers of CompuCell3D, aims to create computer models of morphogenesis that can be used to address the effects of chemical perturbation on embryo development at the cellular level. Such computational (in silico) ...
NASA Astrophysics Data System (ADS)
Jorba, O.; Pérez, C.; Karsten, K.; Janjic, Z.; Dabdub, D.; Baldasano, J. M.
2009-09-01
This contribution presents the ongoing development of a new fully on-line chemical weather prediction system for meso- to global-scale applications. The modeling system consists of a mineral dust module and a gas-phase chemistry module coupled on-line to a unified global-regional atmospheric driver. This approach allows solving small-scale processes and their interactions at local to global scales. Its unified environment maintains the consistency of all the physico-chemical processes involved. The atmospheric driver is the NCEP/NMMB numerical weather prediction model (Janjic and Black, 2007) developed at the National Centers for Environmental Prediction (NCEP). It represents an evolution of the operational WRF-NMM model, extending from meso to global scales. Its unified non-hydrostatic dynamical core supports regional and global simulations. The Barcelona Supercomputing Center is currently designing and implementing a chemistry transport model coupled online with the new global/regional NMMB. The new modeling system is intended to be a powerful tool for research and to provide efficient global and regional chemical weather forecasts at sub-synoptic and mesoscale resolutions. The online coupling of the chemistry follows an approach similar to that of the mineral dust module already coupled to the atmospheric driver, NMMB/BSC-DUST (Pérez et al., 2008). Chemical species are advected and mixed at the corresponding time steps of the meteorological tracers using the same numerical scheme. Advection is Eulerian, positive definite, and monotone. The chemical mechanism and chemistry solver are based on the Kinetic PreProcessor (KPP) package (Damian et al., 2002), with the main purpose of maintaining wide flexibility when configuring the model. Such an approach will allow using a simplified chemical mechanism for global applications or a more complete mechanism for high-resolution local or regional studies.
Moreover, it will permit the implementation of a specific configuration for forecasting applications in regional or global domains. An emission process allows the coupling of different emission inventories sources such as RETRO, EDGAR and GEIA for the global domain, EMEP for Europe and HERMES for Spain. The photolysis scheme is based on the Fast-J scheme, coupled with physics of each model layer (e.g., aerosols, clouds, absorbers as ozone) and it considers grid-scale clouds from the atmospheric driver. The dry deposition scheme follows the deposition velocity analogy for gases, enabling the calculation of deposition fluxes from airborne concentrations. No cloud-chemistry processes are included in the system yet (no wet deposition, scavenging and aqueous chemistry). The modeling system developments will be presented and first results of the gas-phase chemistry at global scale will be discussed. REFERENCES Janjic, Z.I., and Black, T.L., 2007. An ESMF unified model for a broad range of spatial and temporal scales, Geophysical Research Abstracts, 9, 05025. Pérez, C., Haustein, K., Janjic, Z.I., Jorba, O., Baldasano, J.M., Black, T.L., and Nickovic, S., 2008. An online dust model within the meso to global NMMB: current progress and plans. AGU Fall Meeting, San Francisco, A41K-03, 2008. Damian, V., Sandu, A., Damian, M., Potra, F., and Carmichael, G.R., 2002. The kinetic preprocessor KPP - A software environment for solving chemical kinetics. Comp. Chem. Eng., 26, 1567-1579. Sandu, A., and Sander, R., 2006. Technical note:Simulating chemical systems in Fortran90 and Matlab with the Kinetic PreProcessor KPP-2.1. Atmos. Chem. and Phys., 6, 187-195.
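The kind of solver that KPP generates for a chemical mechanism is, at heart, an ODE integrator over species concentrations. A deliberately tiny analogue is sketched below: a two-step mechanism A → B → C integrated with fixed-step RK4. The species and rate constants are invented; a real KPP-generated mechanism couples hundreds of reactions and uses stiff solvers:

```python
# Toy chemistry solver: A -> B -> C (first-order steps), fixed-step RK4.
# Illustrative of the structure of generated kinetics code, not of KPP's
# actual output (which handles stiff systems and large mechanisms).

def rhs(c, k1, k2):
    """Time derivatives of (A, B, C) for the two-step mechanism."""
    a, b, _ = c
    return (-k1 * a, k1 * a - k2 * b, k2 * b)

def rk4_step(c, dt, k1, k2):
    def shifted(u, v, s):  # u + s*v, componentwise
        return tuple(ui + s * vi for ui, vi in zip(u, v))
    f1 = rhs(c, k1, k2)
    f2 = rhs(shifted(c, f1, dt / 2), k1, k2)
    f3 = rhs(shifted(c, f2, dt / 2), k1, k2)
    f4 = rhs(shifted(c, f3, dt), k1, k2)
    return tuple(ci + dt / 6 * (a + 2 * b + 2 * g + d)
                 for ci, a, b, g, d in zip(c, f1, f2, f3, f4))

def integrate(c0, dt, steps, k1=1.0, k2=0.5):
    c = c0
    for _ in range(steps):
        c = rk4_step(c, dt, k1, k2)
    return c

# Start with all mass in A; total mass is conserved by construction.
print(integrate((1.0, 0.0, 0.0), dt=0.01, steps=500))
```

Because the right-hand sides of the three species sum to zero, total mass is conserved up to rounding, which is a useful sanity check for any mechanism integrator.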
Chemical-gene interaction networks and causal reasoning for ...
Evaluating the potential human health and ecological risks associated with exposures to complex chemical mixtures in the environment is one of the main challenges of chemical safety assessment and environmental protection. There is a need for approaches that can help to integrate chemical monitoring and biological effects data to evaluate risks associated with chemicals present in the environment. Here, we used prior knowledge about chemical-gene interactions to develop a knowledge assembly model for detected chemicals at five locations near the North Branch and Chisago wastewater treatment plants (WWTP) in the St. Croix River Basin, MN and WI. The assembly model was used to generate hypotheses about the biological impacts of the chemicals at each location. The hypotheses were tested using empirical hepatic gene expression data from fathead minnows exposed for 12 d at each location. Empirical gene expression data were also mapped to the assembly models to evaluate the likelihood of a chemical contributing to the observed biological responses using richness and concordance statistics. The prior knowledge approach was able to predict the observed biological pathways impacted at one site but not the other. Atrazine was identified as a potential contributor to the observed gene expression responses at a location upstream of the North Branch WWTP. Four chemicals were identified as contributors to the observed biological responses at the effluent and downstream o
Khashan, Raed; Zheng, Weifan; Tropsha, Alexander
2014-03-01
We present a novel approach to generating fragment-based molecular descriptors. The molecules are represented as labeled undirected chemical graphs. Fast Frequent Subgraph Mining (FFSM) is used to find chemical fragments (subgraphs) that occur in at least a subset of all molecules in a dataset. The collection of frequent subgraphs (FSG) forms a set of dataset-specific descriptors whose values for each molecule are defined by the number of times each frequent fragment occurs in that molecule. We have employed the FSG descriptors to develop variable-selection k Nearest Neighbor (kNN) QSAR models of several datasets with binary target properties, including Maximum Recommended Therapeutic Dose (MRTD), Salmonella Mutagenicity (Ames Genotoxicity), and P-Glycoprotein (PGP) data. Each dataset was divided into training, test, and validation sets to establish the statistical figures of merit reflecting the models' validated predictive power. The classification accuracies of the models for both training and test sets for all datasets exceeded 75%, and the accuracy for the external validation sets exceeded 72%. The model accuracies were comparable to or better than those reported earlier in the literature for the same datasets. Furthermore, the use of fragment-based descriptors affords mechanistic interpretation of validated QSAR models in terms of essential chemical fragments responsible for the compounds' target property. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
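Once each molecule is reduced to a vector of fragment occurrence counts, the kNN classification step is straightforward: find the k training molecules with the nearest count vectors and take the majority class. The fragment counts and distance choice (Euclidean) below are illustrative assumptions, not the paper's exact protocol:

```python
# Sketch of kNN classification on fragment-count descriptor vectors,
# in the spirit of the FSG/kNN QSAR approach described above.
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """train: list of (count_vector, label) pairs.
    Returns the majority label among the k nearest training molecules
    (Euclidean distance on the count vectors)."""
    ranked = sorted((math.dist(vec, query), label) for vec, label in train)
    votes = Counter(label for _, label in ranked[:k])
    return votes.most_common(1)[0][0]

# Invented fragment-count vectors: two "inactive" and three "active" molecules.
train = [((0, 0), 0), ((0, 1), 0), ((5, 5), 1), ((6, 5), 1), ((5, 6), 1)]
print(knn_predict(train, (5, 5)))
```

Variable selection, as used in the paper, would additionally search for the descriptor subset that maximizes cross-validated accuracy before applying this prediction rule.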
NASA Astrophysics Data System (ADS)
Avianti, R.; Suyatno; Sugiarto, B.
2018-04-01
This study aims to create appropriate learning materials based on the CORE (Connecting, Organizing, Reflecting, Extending) model to improve students' learning achievement on the topic of chemical bonding. The study used the 4-D model as its research design and a one-group pretest-posttest design for the material tryout. The subject of the study was CORE-based teaching materials, tried out with 30 grade-10 science students. Data were collected through validation, observation, tests, and questionnaires. The findings were that: (1) all the contents were valid, and (2) the practicality and effectiveness of all the contents were good. The study concludes that the CORE model is appropriate for improving students' learning outcomes in chemical bonding.
Toxicokinetic Triage for Environmental Chemicals.
Wambaugh, John F; Wetmore, Barbara A; Pearce, Robert; Strope, Cory; Goldsmith, Rocky; Sluka, James P; Sedykh, Alexander; Tropsha, Alex; Bosgra, Sieto; Shah, Imran; Judson, Richard; Thomas, Russell S; Setzer, R Woodrow
2015-09-01
Toxicokinetic (TK) models link administered doses to plasma, blood, and tissue concentrations. High-throughput TK (HTTK) performs in vitro to in vivo extrapolation to predict TK from rapid in vitro measurements and chemical structure-based properties. A significant toxicological application of HTTK has been "reverse dosimetry," in which bioactive concentrations from in vitro screening studies are converted into in vivo doses (mg/kg BW/day). These doses are predicted to produce steady-state plasma concentrations that are equivalent to in vitro bioactive concentrations. In this study, we evaluate the impact of the approximations and assumptions necessary for reverse dosimetry and develop methods to determine whether HTTK tools are appropriate or may lead to false conclusions for a particular chemical. Based on literature in vivo data for 87 chemicals, we identified specific properties (eg, in vitro HTTK data, physico-chemical descriptors, and predicted transporter affinities) that correlate with poor HTTK predictive ability. For 271 chemicals we developed a generic HT physiologically based TK (HTPBTK) model that predicts non-steady-state chemical concentration time-courses for a variety of exposure scenarios. We used this HTPBTK model to find that assumptions previously used for reverse dosimetry are usually appropriate, except most notably for highly bioaccumulative compounds. For the thousands of man-made chemicals in the environment that currently have no TK data, we propose a 4-element framework for chemical TK triage that can group chemicals into 7 different categories associated with varying levels of confidence in HTTK predictions. For 349 chemicals with literature HTTK data, we differentiated those chemicals for which HTTK approaches are likely to be sufficient, from those that may require additional data. Published by Oxford University Press on behalf of Society of Toxicology 2015. 
This work is written by US Government employees and is in the public domain in the US.
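The "reverse dosimetry" step described above has a simple core: under the steady-state assumption, plasma concentration scales linearly with dose rate, so an in vitro bioactive concentration maps directly to an oral equivalent dose. The sketch below uses a deliberately simplified HTTK-style clearance expression; the functional form and all parameter values are illustrative assumptions, not the published model:

```python
# Hedged sketch of steady-state reverse dosimetry (toy units throughout).
# Assumption: Css per unit dose ~ fraction unbound / total clearance,
# a simplification of the hepatic + renal clearance terms in HTTK models.

def css_per_unit_dose(f_ub, cl_hep, cl_renal):
    """Steady-state plasma concentration per 1 mg/kg/day dose."""
    return f_ub / (cl_hep + cl_renal)

def oral_equivalent_dose(bioactive_conc, f_ub, cl_hep, cl_renal):
    """Dose (mg/kg/day) predicted to yield Css equal to the in vitro
    bioactive concentration -- the reverse-dosimetry conversion."""
    return bioactive_conc / css_per_unit_dose(f_ub, cl_hep, cl_renal)

# Invented chemical: 50% unbound, hepatic clearance 1.5, renal 0.5.
print(oral_equivalent_dose(2.0, f_ub=0.5, cl_hep=1.5, cl_renal=0.5))
```

The paper's point is that this linear steady-state shortcut is usually adequate, but breaks down for highly bioaccumulative compounds, where non-steady-state kinetics dominate.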
Predictive performance of the Vitrigel-eye irritancy test method using 118 chemicals.
Yamaguchi, Hiroyuki; Kojima, Hajime; Takezawa, Toshiaki
2016-08-01
We recently developed a novel Vitrigel-eye irritancy test (EIT) method. The Vitrigel-EIT method is composed of two parts: the construction of a human corneal epithelium (HCE) model in a collagen vitrigel membrane chamber, and the prediction of eye irritancy by analyzing the time-dependent profile of transepithelial electrical resistance values for 3 min after exposing a chemical to the HCE model. In this study, we estimated the predictive performance of the Vitrigel-EIT method by testing a total of 118 chemicals. Comparing the categories determined by the Vitrigel-EIT method with the globally harmonized system classification revealed a sensitivity, specificity, and accuracy of 90.1%, 65.9%, and 80.5%, respectively. Five of the seven false-negative chemicals were acidic chemicals that induced an irregular rise in transepithelial electrical resistance values. When test chemical solutions of pH 5 or lower were excluded, the sensitivity, specificity, and accuracy improved to 96.8%, 67.4%, and 84.4%, respectively. Meanwhile, nine of the 16 false-positive chemicals were classified as irritants by the US Environmental Protection Agency. In addition, the disappearance of ZO-1, a tight junction-associated protein, and MUC1, a cell membrane-spanning mucin, was immunohistologically confirmed in the HCE models after exposure not only to eye irritant chemicals but also to false-positive chemicals, suggesting that these false-positive chemicals have eye irritant potential. These data demonstrated that the Vitrigel-EIT method could provide excellent predictive performance in judging widespread eye irritancy, including very mild irritant chemicals. We hope that the Vitrigel-EIT method contributes to the development of safe commodity chemicals. Copyright © 2015 The Authors. Journal of Applied Toxicology published by John Wiley & Sons Ltd.
Golbamaki, Azadi; Benfenati, Emilio; Golbamaki, Nazanin; Manganaro, Alberto; Merdivan, Erinc; Roncaglioni, Alessandra; Gini, Giuseppina
2016-04-02
In this study, new molecular fragments associated with genotoxic and nongenotoxic carcinogens are introduced to estimate the carcinogenic potential of compounds. Two rule-based carcinogenesis models were developed with the aid of SARpy: model R (from rodents' experimental data) and model E (from human carcinogenicity data). SARpy's structural alert extraction method operates in a completely automated and unbiased manner, with statistical significance. The carcinogenicity models developed in this study are collections of carcinogenic-potential fragments extracted from two carcinogenicity databases: the ANTARES carcinogenicity dataset, with information from bioassays on rats, and the combination of the ISSCAN and CGX datasets, which take human-based assessment into account. The performance of these two models was evaluated in terms of cross-validation and external validation using a 258-compound case study dataset. Combining the two models' predictions, and scoring a positive or negative result only when both models concur, increased accuracy to 72% and specificity to 79% on the external test set. The carcinogenic fragments present in the two models were compared and analyzed from the point of view of chemical class. The results of this study show that the developed rule sets will be a useful tool for identifying new structural alerts of carcinogenicity and provide effective information on the molecular structures of carcinogenic chemicals.
The importance of data curation on QSAR Modeling ...
During the last few decades many QSAR models and tools have been developed at the US EPA, including the widely used EPISuite. During this period the arsenal of computational capabilities supporting cheminformatics has broadened dramatically, with multiple software packages. These modern tools allow for more advanced techniques in terms of chemical structure representation and storage, as well as enabling automated data-mining and standardization approaches to examine and fix data quality issues. This presentation will investigate the impact of data curation on the reliability of QSAR models being developed within the EPA's National Center for Computational Toxicology. As part of this work we have attempted to disentangle the influence of the quality versus the quantity of data, based on the Syracuse PHYSPROP database partly used by the EPISuite software. We will review our automated approaches for examining key datasets related to the EPISuite data, validating across chemical structure representations (e.g., mol file and SMILES) and identifiers (chemical names and registry numbers), and our approaches for standardizing data into QSAR-ready formats prior to modeling. Our efforts to quantify and segregate data into quality categories have allowed us to evaluate the models that can be developed from these data slices and to quantify the extent to which efforts to develop high-quality datasets pay off in terms of predictive performance. The most accur
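One concrete curation check of the kind described above is cross-validating identifiers: flagging chemical names that map to more than one registry number, and removing exact duplicate records, before any modeling. The records, names, and CAS numbers below are invented for illustration; a real pipeline would also standardize structures (mol files, SMILES):

```python
# Illustrative identifier-consistency check for a QSAR-ready dataset:
# deduplicate records and flag names mapped to conflicting CAS numbers.
from collections import defaultdict

records = [
    {"name": "toluene", "cas": "108-88-3", "logp": 2.73},
    {"name": "toluene", "cas": "108-88-3", "logp": 2.73},  # exact duplicate
    {"name": "benzene", "cas": "71-43-2", "logp": 2.13},
    {"name": "benzene", "cas": "108-88-3", "logp": 2.13},  # conflicting CAS
]

def curate(records):
    """Return (deduplicated records, names with conflicting CAS numbers)."""
    seen, clean = set(), []
    cas_by_name = defaultdict(set)
    for r in records:
        cas_by_name[r["name"]].add(r["cas"])
        key = (r["name"], r["cas"], r["logp"])
        if key not in seen:
            seen.add(key)
            clean.append(r)
    conflicts = {name for name, cs in cas_by_name.items() if len(cs) > 1}
    return clean, conflicts

clean, conflicts = curate(records)
print(len(clean), conflicts)
```

Records in the conflict set would be routed to manual review or dropped into a lower-quality data slice, matching the quality-category strategy the presentation describes.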
Modeling of adipose/blood partition coefficient for environmental chemicals.
Papadaki, K C; Karakitsios, S P; Sarigiannis, D A
2017-12-01
A Quantitative Structure-Activity Relationship (QSAR) model was developed to predict the adipose/blood partition coefficient of environmental chemical compounds. The first step of QSAR modeling was the collection of inputs. Input data included the experimental values of the adipose/blood partition coefficient and two sets of molecular descriptors for 67 organic chemical compounds: a) descriptors from Linear Free Energy Relationships (LFER) and b) PaDEL descriptors. The datasets were split into training and prediction sets and were analysed using two statistical methods: Genetic Algorithm-based Multiple Linear Regression (GA-MLR) and Artificial Neural Networks (ANN). The models with LFER and PaDEL descriptors, coupled with ANN, produced satisfactory performance results. The fitting performance (R²) of the models using LFER and PaDEL descriptors was 0.94 and 0.96, respectively. The Applicability Domain (AD) of the models was assessed, and the models were then applied to a large number of chemical compounds with unknown values of the adipose/blood partition coefficient. In conclusion, the proposed models were checked for fit, validity, and applicability. It was demonstrated that they are stable, reliable, and capable of predicting the adipose/blood partition coefficient of "data-poor" chemical compounds that fall within the applicability domain. Copyright © 2017. Published by Elsevier Ltd.
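The regression step and the R² fitting statistic quoted above can be illustrated with a one-descriptor analogue: fit log K(adipose/blood) against a single descriptor by closed-form least squares and report R². The (descriptor, log K) pairs are invented; the paper itself uses multi-descriptor GA-MLR and ANN models:

```python
# Minimal least-squares sketch: one descriptor, closed-form slope/intercept,
# plus the R^2 statistic used to report fitting performance above.

def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

def r_squared(xs, ys, a, b):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    my = sum(ys) / len(ys)
    ss_res = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return 1 - ss_res / ss_tot

# Invented perfectly linear toy data: a=1, b=2, R^2 = 1.
a, b = fit_line([0, 1, 2, 3], [1, 3, 5, 7])
print(a, b, r_squared([0, 1, 2, 3], [1, 3, 5, 7], a, b))
```

GA-MLR extends this by using a genetic algorithm to choose which subset of descriptors enters the multiple-regression version of this fit.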
The use of high-throughput screening techniques to evaluate mitochondrial toxicity.
Wills, Lauren P
2017-11-01
Toxicologists and chemical regulators depend on accurate and effective methods to evaluate and predict the toxicity of thousands of current and future compounds. Robust high-throughput screening (HTS) experiments have the potential to efficiently test large numbers of chemical compounds for effects on biological pathways. HTS assays can be utilized to examine chemical toxicity across multiple mechanisms of action, experimental models, concentrations, and lengths of exposure. Many agricultural, industrial, and pharmaceutical chemicals classified as harmful to human and environmental health exert their effects through the mechanism of mitochondrial toxicity. Mitochondrial toxicants are compounds that cause a decrease in the number of mitochondria within a cell, and/or decrease the ability of mitochondria to perform normal functions including producing adenosine triphosphate (ATP) and maintaining cellular homeostasis. Mitochondrial dysfunction can lead to apoptosis, necrosis, altered metabolism, muscle weakness, neurodegeneration, decreased organ function, and eventually disease or death of the whole organism. The development of HTS techniques to identify mitochondrial toxicants will provide extensive databases with essential connections between mechanistic mitochondrial toxicity and chemical structure. Computational and bioinformatics approaches can be used to evaluate compound databases for specific chemical structures associated with toxicity, with the goal of developing quantitative structure-activity relationship (QSAR) models and mitochondrial toxicophores. Ultimately these predictive models will facilitate the identification of mitochondrial liabilities in consumer products, industrial compounds, pharmaceuticals and environmental hazards. Copyright © 2017 Elsevier B.V. All rights reserved.
Benchmark results in the 2D lattice Thirring model with a chemical potential
NASA Astrophysics Data System (ADS)
Ayyar, Venkitesh; Chandrasekharan, Shailesh; Rantaharju, Jarno
2018-03-01
We study the two-dimensional lattice Thirring model in the presence of a fermion chemical potential. Our model is asymptotically free and contains massive fermions that mimic a baryon and light bosons that mimic pions. Hence, it is a useful toy model for QCD, especially since it, too, suffers from a sign problem in the auxiliary field formulation in the presence of a fermion chemical potential. In this work, we formulate the model in both the world line and fermion-bag representations and show that the sign problem can be completely eliminated with open boundary conditions when the fermions are massless. Hence, we are able to accurately compute a variety of interesting quantities in the model, and these results could provide benchmarks for other methods that are being developed to solve the sign problem in QCD.
Pieces of the Puzzle: Tracking the Chemical Component of the ...
This presentation provides an overview of the risk assessment conducted at the U.S. EPA, as well as some research examples related to the exposome concept. This presentation also provides the recommendation of using two organizational and predictive frameworks for tracking chemical components in the exposome. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.
Heterogenous Combustion of Porous Graphite Particles in Normal and Microgravity
NASA Technical Reports Server (NTRS)
Chelliah, Harsha K.; Miller, Fletcher J.; Delisle, Andrew J.
2001-01-01
Combustion of solid fuel particles has many important applications, including power generation and space propulsion systems. The current models available for describing the combustion process of these particles, especially porous solid particles, include various simplifying approximations. One of the most limiting approximations is the lumping of the physical properties of the porous fuel with the heterogeneous chemical reaction rate constants. The primary objective of the present work is to develop a rigorous model that could decouple such physical and chemical effects from the global heterogeneous reaction rates. For the purpose of validating this model, experiments with porous graphite particles of varying sizes and porosity are being performed. The details of this experimental and theoretical model development effort are described.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Politi, Regina; Rusyn, Ivan
2014-10-01
The thyroid hormone receptor (THR) is an important member of the nuclear receptor family that can be activated by endocrine disrupting chemicals (EDC). Quantitative Structure–Activity Relationship (QSAR) models have been developed to facilitate the prioritization of THR-mediated EDC for experimental validation. The largest database of binding affinities available at the time of the study for the ligand binding domain (LBD) of THRβ was assembled to generate both continuous and classification QSAR models, with external accuracies of R² = 0.55 and CCR = 0.76, respectively. In addition, for the first time a QSAR model was developed to predict binding affinities of antagonists inhibiting the interaction of coactivators with the AF-2 domain of THRβ (R² = 0.70). Furthermore, molecular docking studies were performed for a set of THRβ ligands (57 agonists and 15 antagonists of the LBD, 210 antagonists of the AF-2 domain, supplemented by putative decoys/non-binders) using several THRβ structures retrieved from the Protein Data Bank. We found that two agonist-bound THRβ conformations could effectively discriminate their corresponding ligands from presumed non-binders. Moreover, one of the agonist conformations could discriminate agonists from antagonists. Finally, we conducted virtual screening of a chemical library compiled by the EPA as part of the Tox21 program to identify potential THRβ-mediated EDCs using both QSAR models and docking. We concluded that the library is unlikely to have any EDC that would bind to THRβ. Models developed in this study can be employed either to identify environmental chemicals interacting with the THR or, conversely, to eliminate the THR-mediated mechanism of action for chemicals of concern. - Highlights: • This is the largest curated dataset for the ligand binding domain (LBD) of THRβ. • We report the first QSAR model for antagonists of the AF-2 domain of THRβ.
• A combination of QSAR and docking enables prediction of both affinity and efficacy. • Models can be used to identify environmental chemicals interacting with THRβ. • Models can be used to eliminate the THRβ-mediated mechanism of action.
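The external accuracy of the classification model above is reported as a CCR (correct classification rate), i.e. the mean of per-class recalls. A minimal sketch of that metric follows; the binder/non-binder labels are hypothetical, not taken from the study's dataset:

```python
def ccr(y_true, y_pred):
    """Correct classification rate: the mean of per-class recalls
    (balanced accuracy), a standard external-accuracy metric for
    classification QSAR models."""
    classes = sorted(set(y_true))
    recalls = []
    for c in classes:
        members = [i for i, y in enumerate(y_true) if y == c]
        hits = sum(1 for i in members if y_pred[i] == c)
        recalls.append(hits / len(members))
    return sum(recalls) / len(recalls)

# Hypothetical external-set labels: 1 = THR binder, 0 = non-binder
y_true = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
y_pred = [1, 1, 1, 0, 0, 0, 0, 0, 1, 1]
score = ccr(y_true, y_pred)   # (3/4 + 4/6) / 2 ≈ 0.71
```

Unlike plain accuracy, CCR is insensitive to class imbalance, which matters when binders are rare relative to non-binders.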
Assessment of the GECKO-A Modeling Tool and Simplified 3D Model Parameterizations for SOA Formation
NASA Astrophysics Data System (ADS)
Aumont, B.; Hodzic, A.; La, S.; Camredon, M.; Lannuque, V.; Lee-Taylor, J. M.; Madronich, S.
2014-12-01
Explicit chemical mechanisms aim to embody the current knowledge of the transformations occurring in the atmosphere during the oxidation of organic matter. These explicit mechanisms are therefore useful tools to explore the fate of organic matter during its tropospheric oxidation and to examine how these chemical processes shape the composition and properties of the gaseous and condensed phases. Furthermore, explicit mechanisms provide powerful benchmarks to design and assess simplified parameterizations for inclusion in 3D models. Nevertheless, explicit mechanisms describing the oxidation of hydrocarbons with backbones larger than a few carbon atoms involve millions of secondary organic compounds, far exceeding the size of chemical mechanisms that can be written manually. Data processing tools can however be designed to overcome these difficulties and automatically generate consistent and comprehensive chemical mechanisms on a systematic basis. The Generator for Explicit Chemistry and Kinetics of Organics in the Atmosphere (GECKO-A) has been developed for the automatic writing of explicit chemical schemes of organic species and their partitioning between the gas and condensed phases. GECKO-A can be viewed as an expert system that mimics the steps by which chemists might develop chemical schemes. GECKO-A generates chemical schemes according to a prescribed protocol assigning reaction pathways and kinetics data on the basis of experimental data and structure-activity relationships. In its current version, GECKO-A can generate the full atmospheric oxidation scheme for most linear, branched and cyclic precursors, including alkanes and alkenes up to C25. Assessments of the GECKO-A modeling tool based on chamber SOA observations will be presented. GECKO-A was recently used to design a parameterization for SOA formation based on a Volatility Basis Set (VBS) approach. First results will be presented.
NASA Astrophysics Data System (ADS)
Young, J. C.; Boronska, K.; Martin, C. J.; Rickard, A. R.; Vázquez Moreno, M.; Pilling, M. J.; Haji, M. H.; Dew, P. M.; Lau, L. M.; Jimack, P. K.
2010-12-01
AtChem On-line is a simple-to-use zero-dimensional box modelling toolkit, developed for use by laboratory, field and chamber scientists. Any set of chemical reactions can be simulated, in particular the whole Master Chemical Mechanism (MCM) or any subset of it. Parameters and initial data can be provided through a self-explanatory web form, and the resulting model is compiled and run on a dedicated server. The core part of the toolkit, providing a robust solver for thousands of chemical reactions, is written in Fortran and uses the SUNDIALS CVODE libraries. Chemical systems can be constrained at multiple, user-determined timescales; this has enabled studies of radical chemistry at one-minute timescales. AtChem On-line is free to use and requires no installation - a web browser, text editor and any compression software is all the user needs. CPU and storage are provided by the server (input and output data are saved indefinitely). An off-line version is also being developed, which will provide batch processing, an advanced graphical user interface and post-processing tools, for example Rate of Production Analysis (ROPA) and chain-length analysis. The source code is freely available for advanced users wishing to adapt and run the program locally. Data management, dissemination and archiving are essential in all areas of science. In order to do this in an efficient and transparent way, there is a critical need to capture high-quality metadata/provenance for modelling activities. An Electronic Laboratory Notebook (ELN) has been developed in parallel with AtChem On-line as part of the EC EUROCHAMP-2 project. In order to use controlled chamber experiments to evaluate the MCM, we need to be able to archive, track and search information on all associated chamber model runs, so that they can be used in subsequent mechanism development. Therefore it would be extremely useful if experiment and model metadata/provenance could be easily and automatically stored electronically.
Archiving metadata/provenance via an ELN makes it easier to write a paper or thesis, and for mechanism developers, evaluators and peer reviewers to search for appropriate experimental and modelling results and conclusions. The development of an ELN in the context of mechanism evaluation/development using large experimental chamber datasets is presented.
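The zero-dimensional box-model idea at the core of AtChem — integrating mass-action kinetics for an arbitrary list of reactions — can be sketched in miniature. AtChem itself is Fortran and uses the stiff SUNDIALS CVODE solver; the fixed-step Runge-Kutta integrator, the two-reaction mechanism, and the rate constants below are simplified, hypothetical stand-ins:

```python
def box_model(y0, reactions, t_end, dt=1e-3):
    """Integrate a zero-dimensional (box) chemical mechanism with a
    classic 4th-order Runge-Kutta scheme under mass-action kinetics.
    reactions: list of (rate_constant, reactants, products) tuples."""
    def rates(y):
        d = {s: 0.0 for s in y}
        for k, reactants, products in reactions:
            r = k
            for s in reactants:
                r *= y[s]          # mass-action rate
            for s in reactants:
                d[s] -= r
            for s in products:
                d[s] += r
        return d

    y, t = dict(y0), 0.0
    while t < t_end - 1e-12:
        h = min(dt, t_end - t)
        k1 = rates(y)
        k2 = rates({s: y[s] + 0.5 * h * k1[s] for s in y})
        k3 = rates({s: y[s] + 0.5 * h * k2[s] for s in y})
        k4 = rates({s: y[s] + h * k3[s] for s in y})
        y = {s: y[s] + h / 6.0 * (k1[s] + 2 * k2[s] + 2 * k3[s] + k4[s])
             for s in y}
        t += h
    return y

# Hypothetical two-reaction subset (rate constants illustrative only):
# NO2 -> NO + O3 (lumped photolysis), O3 + NO -> NO2
rxns = [(0.01, ["NO2"], ["NO", "O3"]),
        (4e-4, ["O3", "NO"], ["NO2"])]
state = box_model({"NO2": 10.0, "NO": 0.0, "O3": 0.0}, rxns, t_end=10.0)
```

Real MCM subsets are stiff, which is why AtChem delegates integration to CVODE rather than an explicit fixed-step scheme like this one.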
Automated workflows for modelling chemical fate, kinetics and toxicity.
Sala Benito, J V; Paini, Alicia; Richarz, Andrea-Nicole; Meinl, Thorsten; Berthold, Michael R; Cronin, Mark T D; Worth, Andrew P
2017-12-01
Automation is universal in today's society, from operating equipment such as machinery, in factory processes, to self-parking automobile systems. While these examples show the efficiency and effectiveness of automated mechanical processes, automated procedures that support the chemical risk assessment process are still in their infancy. Future human safety assessments will rely increasingly on the use of automated models, such as physiologically based kinetic (PBK) and dynamic models and the virtual cell based assay (VCBA). These biologically-based models will be coupled with chemistry-based prediction models that also automate the generation of key input parameters such as physicochemical properties. The development of automated software tools is an important step in harmonising and expediting the chemical safety assessment process. In this study, we illustrate how the KNIME Analytics Platform can be used to provide a user-friendly graphical interface for these biokinetic models, such as PBK models and VCBA, which simulate the fate of chemicals in vivo within the body and in in vitro test systems, respectively. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
Review: Modelling chemical kinetics and convective heating in giant planet entries
NASA Astrophysics Data System (ADS)
Reynier, Philippe; D'Ammando, Giuliano; Bruno, Domenico
2018-01-01
A review of the existing chemical kinetics models for H2 / He mixtures and related transport and thermodynamic properties is presented as a pre-requisite towards the development of innovative models based on the state-to-state approach. A survey of the available results obtained during the mission preparation and post-flight analyses of the Galileo mission has been undertaken and a computational matrix has been derived. Different chemical kinetics schemes for hydrogen/helium mixtures have been applied to numerical simulations of the selected points along the entry trajectory. First, a reacting scheme, based on literature data, has been set up for computing the flow-field around the probe at high altitude and comparisons with existing numerical predictions are performed. Then, a macroscopic model derived from a state-to-state model has been constructed and incorporated into a CFD code. Comparisons with existing numerical results from the literature have been performed as well as cross-check comparisons between the predictions provided by the different models in order to evaluate the potential of innovative chemical kinetics models based on the state-to-state approach.
Stadnicka-Michalak, Julita; Tanneberger, Katrin; Schirmer, Kristin; Ashauer, Roman
2014-01-01
Effect concentrations in the toxicity assessment of chemicals with fish and fish cells are generally based on external exposure concentrations. External concentrations as dose metrics may, however, hamper interpretation and extrapolation of toxicological effects because it is the internal concentration that gives rise to the biologically effective dose. Thus, we need to understand the relationship between the external and internal concentrations of chemicals. The objectives of this study were to: (i) elucidate the time-course of the concentration of chemicals with a wide range of physicochemical properties in the compartments of an in vitro test system, (ii) derive a predictive model for toxicokinetics in the in vitro test system, (iii) test the hypothesis that internal effect concentrations in fish (in vivo) and fish cell lines (in vitro) correlate, and (iv) develop a quantitative in vitro to in vivo toxicity extrapolation method for fish acute toxicity. To achieve these goals, time-dependent amounts of organic chemicals were measured in medium, cells (RTgill-W1) and the plastic of exposure wells. Then, the relation between uptake, elimination rate constants, and log KOW was investigated for cells in order to develop a toxicokinetic model. This model was used to predict internal effect concentrations in cells, which were compared with internal effect concentrations in fish gills predicted by a Physiologically Based Toxicokinetic model. Our model could predict concentrations of non-volatile organic chemicals with log KOW between 0.5 and 7 in cells. The correlation of the log ratio of internal effect concentrations in fish gills and the fish gill cell line with the log KOW was significant (r>0.85, p = 0.0008, F-test). This ratio can be predicted from the log KOW of the chemical (77% of variance explained), comprising a promising model to predict lethal effects on fish based on in vitro data. PMID:24647349
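In its simplest form, the toxicokinetic description above reduces to a one-compartment model with uptake and elimination rate constants. A sketch follows; the rate-constant values are hypothetical, not the paper's fitted parameters:

```python
import math

def internal_conc(c_water, ku, ke, t):
    """One-compartment toxicokinetic model: internal concentration in
    cells exposed to a constant external (medium) concentration c_water,
    with uptake rate constant ku and elimination rate constant ke:
    C_int(t) = c_water * (ku/ke) * (1 - exp(-ke * t))."""
    return c_water * (ku / ke) * (1.0 - math.exp(-ke * t))

# Hypothetical rate constants (e.g. ku in mL/g/h, ke in 1/h)
c48 = internal_conc(c_water=1.0, ku=5.0, ke=0.1, t=48.0)  # near steady state
bcf = 5.0 / 0.1   # steady-state accumulation ratio ku/ke
```

In studies of this kind, ku and ke are typically regressed against log KOW so that internal concentrations can be predicted for untested chemicals from hydrophobicity alone.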
Reactive solute transport in streams: 1. Development of an equilibrium- based model
Runkel, Robert L.; Bencala, Kenneth E.; Broshears, Robert E.; Chapra, Steven C.
1996-01-01
An equilibrium-based solute transport model is developed for the simulation of trace metal fate and transport in streams. The model is formed by coupling a solute transport model with a chemical equilibrium submodel based on MINTEQ. The solute transport model considers the physical processes of advection, dispersion, lateral inflow, and transient storage, while the equilibrium submodel considers the speciation and complexation of aqueous species, precipitation/dissolution and sorption. Within the model, reactions in the water column may result in the formation of solid phases (precipitates and sorbed species) that are subject to downstream transport and settling processes. Solid phases on the streambed may also interact with the water column through dissolution and sorption/desorption reactions. Consideration of both mobile (water-borne) and immobile (streambed) solid phases requires a unique set of governing differential equations and solution techniques that are developed herein. The partial differential equations describing physical transport and the algebraic equations describing chemical equilibria are coupled using the sequential iteration approach.
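The sequential iteration approach mentioned above couples a physical transport step with an equilibrium chemistry step each time increment. A heavily simplified, non-iterative sketch of that split-operator pattern follows (1D upwind advection only, and a linear sorption isotherm s = Kd·c standing in for a full MINTEQ-style speciation solve; all grid values are hypothetical):

```python
def advect(c, courant, c_in=0.0):
    """One explicit upwind advection step for the dissolved (mobile)
    phase; courant = v*dt/dx must be <= 1 for stability."""
    return [c[0] - courant * (c[0] - c_in)] + [
        c[i] - courant * (c[i] - c[i - 1]) for i in range(1, len(c))]

def equilibrate(c, s, kd):
    """Linear sorption equilibrium (s = kd*c) standing in for a full
    chemical equilibrium submodel: redistribute the total mass in each
    cell between the dissolved and sorbed phases."""
    total = [ci + si for ci, si in zip(c, s)]
    c_new = [t / (1.0 + kd) for t in total]
    s_new = [kd * ci for ci in c_new]
    return c_new, s_new

# Hypothetical pulse of dissolved metal entering a 10-cell stream reach
c = [1.0] + [0.0] * 9        # dissolved (water-borne) phase
s = [0.0] * 10               # sorbed (immobile) phase
for _ in range(5):           # split-operator coupling: transport, then chemistry
    c = advect(c, courant=0.5)
    c, s = equilibrate(c, s, kd=1.0)
```

The real model additionally handles dispersion, lateral inflow, transient storage, settling of mobile solids, and streambed interactions, which is what drives the unique governing equations developed in the paper.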
Das, Rudra Narayan; Roy, Kunal
2014-06-01
Hazardous potential of ionic liquids is becoming an issue of high concern with increasing application of these compounds in various industrial processes. Predictive toxicological modeling on ionic liquids provides a rational assessment strategy and aids in developing suitable guidance for designing novel analogues. The present study attempts to explore the chemical features of ionic liquids responsible for their ecotoxicity towards the green algae Scenedesmus vacuolatus by developing mathematical models using extended topochemical atom (ETA) indices along with other categories of chemical descriptors. The entire study has been conducted with reference to the OECD guidelines for QSAR model development using predictive classification and regression modeling strategies. The best models from both the analyses showed that ecotoxicity of ionic liquids can be decreased by reducing chain length of cationic substituents and increasing hydrogen bond donor feature in cations, and replacing bulky unsaturated anions with simple saturated moiety having less lipophilic heteroatoms. Copyright © 2013 Elsevier Ltd. All rights reserved.
Dyekjaer, Jane Dannow; Jónsdóttir, Svava Osk
2004-01-22
Quantitative Structure-Property Relationships (QSPR) have been developed for a series of monosaccharides, including the physical properties of partial molar heat capacity, heat of solution, melting point, heat of fusion, glass-transition temperature, and solid state density. The models were based on molecular descriptors obtained from molecular mechanics and quantum chemical calculations, combined with other types of descriptors. Saccharides exhibit a large degree of conformational flexibility, therefore a methodology for selecting the energetically most favorable conformers has been developed, and was used for the development of the QSPR models. In most cases good correlations were obtained for monosaccharides. For five of the properties predictions were made for disaccharides, and the predicted values for the partial molar heat capacities were in excellent agreement with experimental values.
Integrated Multimedia Modeling System Response to Regional Land Management Change
A multi-media system of nitrogen and co-pollutant models describing critical physical and chemical processes that cascade synergistically and competitively through the environment, the economy and society has been developed at the USEPA Office of Research and Development. It is ...
Adaptive Response in Female Modeling of the Hypothalamic-pituitary-gonadal Axis
Exposure to endocrine disrupting chemicals can affect reproduction and development in both humans and wildlife. We are developing a mechanistic computational model of the hypothalamic-pituitary-gonadal (HPG) axis in female fathead minnows to predict dose-response and time-course ...
Development of a Reactor Model for Chemical Conversion of Lunar Regolith
NASA Technical Reports Server (NTRS)
Hegde, U.; Balasubramaniam, R.; Gokoglu, S.
2009-01-01
Lunar regolith will be used for a variety of purposes such as oxygen and propellant production and manufacture of various materials. The design and development of chemical conversion reactors for processing lunar regolith will require an understanding of the coupling among the chemical, mass and energy transport processes occurring at the length and time scales of the overall reactor with those occurring at the corresponding scales of the regolith particles. To this end, a coupled transport model is developed using, as an example, the reduction of ilmenite-containing regolith by a continuous flow of hydrogen in a flow-through reactor. The ilmenite conversion occurs on the surface and within the regolith particles. As the ilmenite reduction proceeds, the hydrogen in the reactor is consumed, and this, in turn, affects the conversion rate of the ilmenite in the particles. Several important quantities are identified as a result of the analysis. Reactor scale parameters include the void fraction (i.e., the fraction of the reactor volume not occupied by the regolith particles) and the residence time of hydrogen in the reactor. Particle scale quantities include the time for hydrogen to diffuse into the pores of the regolith particles and the chemical reaction time. The paper investigates the relationships between these quantities and their impact on the regolith conversion. Application of the model to various chemical reactor types, such as fluidized-bed, packed-bed, and rotary-bed configurations, are discussed.
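The characteristic quantities identified in the abstract (void fraction, gas residence time, pore diffusion time, reaction time) can be compared directly to see which scale limits conversion. A sketch with hypothetical SI-unit values follows; the numbers are illustrative, not from the paper:

```python
def residence_time(void_fraction, reactor_volume, volumetric_flow):
    """Mean residence time of hydrogen in the reactor's void (gas) space:
    the gas-filled volume divided by the volumetric flow rate."""
    return void_fraction * reactor_volume / volumetric_flow

def pore_diffusion_time(particle_radius, effective_diffusivity):
    """Characteristic time for hydrogen to diffuse into the pores of a
    regolith particle (radius squared over effective diffusivity)."""
    return particle_radius ** 2 / effective_diffusivity

# Hypothetical values for illustration only (SI units)
tau_res = residence_time(void_fraction=0.4, reactor_volume=0.05,
                         volumetric_flow=1e-3)              # 20 s
tau_diff = pore_diffusion_time(particle_radius=5e-4,
                               effective_diffusivity=1e-6)  # 0.25 s
# If pore diffusion is much slower than the gas residence time, the
# particle-scale supply of hydrogen becomes the limiting step.
diffusion_limited = tau_diff > tau_res
```

Comparing such timescales (together with the chemical reaction time) is what lets the model be mapped onto fluidized-bed, packed-bed, or rotary-bed configurations.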
Modeling MIC copper release from drinking water pipes.
Pizarro, Gonzalo E; Vargas, Ignacio T; Pastén, Pablo A; Calle, Gustavo R
2014-06-01
Copper is used for household drinking water distribution systems given its physical and chemical properties that make it resistant to corrosion. However, there is evidence that, under certain conditions, it can corrode and release unsafe concentrations of copper to the water. Research on drinking water copper pipes has produced conceptual models that include several physical-chemical mechanisms. Nevertheless, there is still a need for mathematical models of this phenomenon that consider the interaction among physical-chemical processes at different spatial scales. We developed a conceptual and a mathematical model that reproduces the main processes in copper release from copper pipes subject to stagnation and flow cycles, where corrosion is associated with biofilm growth on the surface of the pipes. We discuss the influence of the reactive surface and the copper release curves observed. The modeling and experimental observations indicated that after 10 h of stagnation, most of the copper is located close to the surface of the pipe. This copper is associated with the reactive surface, which acts as a reservoir of labile copper. Thus, for pipes with biofilm present, the complexation of copper with the biomass and the hydrodynamics are the main mechanisms of copper release. Copyright © 2013 Elsevier B.V. All rights reserved.
Alternative Test Methods for Developmental Neurotoxicity: A ...
Exposure to environmental contaminants is well documented to adversely impact the development of the nervous system. However, the time, animal and resource intensive EPA and OECD testing guideline methods for developmental neurotoxicity (DNT) are not a viable solution to characterizing potential chemical hazards for the thousands of untested chemicals currently in commerce. Thus, research efforts over the past decade have endeavored to develop cost-effective alternative DNT testing methods. These efforts have begun to generate data that can inform regulatory decisions. Yet there are major challenges to both the acceptance and use of this data. Major scientific challenges for DNT include development of new methods and models that are “fit for purpose”, development of a decision-use framework, and regulatory acceptance of the methods. It is critical to understand that use of data from these methods will be driven mainly by the regulatory problems being addressed. Some problems may be addressed with limited datasets, while others may require data for large numbers of chemicals, or require the development and use of new biological and computational models. For example mechanistic information derived from in vitro DNT assays can be used to inform weight of evidence (WoE) or integrated approaches to testing and assessment (IATA) approaches for chemical-specific assessments. Alternatively, in vitro data can be used to prioritize (for further testing) the thousands
Introduction to Chemical Engineering Reactor Analysis: A Web-Based Reactor Design Game
ERIC Educational Resources Information Center
Orbey, Nese; Clay, Molly; Russell, T.W. Fraser
2014-01-01
An approach to explain chemical engineering through a Web-based interactive game design was developed and used with college freshman and junior/senior high school students. The goal of this approach was to demonstrate how to model a lab-scale experiment, and use the results to design and operate a chemical reactor. The game incorporates both…
Chemical spills and associated deaths in the US have increased 2.6-fold and 16-fold, respectively, from 1983 to 2012. In addition, the number of chemicals to which humans are exposed in the environment has increased almost 10-fold from 2001 to 2013 within the US. Internationally...
Pharmacophore Modelling and Synthesis of Quinoline-3-Carbohydrazide as Antioxidants
El Bakkali, Mustapha; Ismaili, Lhassane; Tomassoli, Isabelle; Nicod, Laurence; Pudlo, Marc; Refouvelet, Bernard
2011-01-01
From well-known antioxidant agents, we developed a first pharmacophore model containing four common chemical features: one aromatic ring and three hydrogen bond acceptors. This model served as a template in virtual screening of the Maybridge and NCI databases, which resulted in the selection of sixteen compounds. The selected compounds showed good antioxidant activity measured by three chemical tests: DPPH radical, OH• radical, and superoxide radical scavenging. New synthetic compounds with a good correlation with the model were prepared, and some of them presented good antioxidant activity. PMID:25954520
Toxico-Cheminformatics: New and Expanding Public ...
High-throughput screening (HTS) technologies, along with efforts to improve public access to chemical toxicity information resources and to systematize older toxicity studies, have the potential to significantly improve information gathering efforts for chemical assessments and predictive capabilities in toxicology. Important developments include: 1) large and growing public resources that link chemical structures to biological activity and toxicity data in searchable format, and that offer more nuanced and varied representations of activity; 2) standardized relational data models that capture relevant details of chemical treatment and effects of published in vivo experiments; and 3) the generation of large amounts of new data from public efforts that are employing HTS technologies to probe a wide range of bioactivity and cellular processes across large swaths of chemical space. By annotating toxicity data with associated chemical structure information, these efforts link data across diverse study domains (e.g., ‘omics’, HTS, traditional toxicity studies), toxicity domains (carcinogenicity, developmental toxicity, neurotoxicity, immunotoxicity, etc) and database sources (EPA, FDA, NCI, DSSTox, PubChem, GEO, ArrayExpress, etc.). Public initiatives are developing systematized data models of toxicity study areas and introducing standardized templates, controlled vocabularies, hierarchical organization, and powerful relational searching capability across capt
Wardlow, Nathan; Polin, Chris; Villagomez-Bernabe, Balder; Currell, Fred
2015-11-01
We present a simple model for a component of the radiolytic production of any chemical species due to electron emission from irradiated nanoparticles (NPs) in a liquid environment, provided the expression for the G value for product formation is known and is reasonably well characterized by a linear dependence on beam energy. This model takes nanoparticle size, composition, density and a number of other readily available parameters (such as X-ray and electron attenuation data) as inputs and therefore allows for the ready determination of this contribution. Several approximations are used, thus this model provides an upper limit to the yield of chemical species due to electron emission, rather than a distinct value, and this upper limit is compared with experimental results. After the general model is developed we provide details of its application to the generation of HO• through irradiation of gold nanoparticles (AuNPs), a potentially important process in nanoparticle-based enhancement of radiotherapy. This model has been constructed with the intention of making it accessible to other researchers who wish to estimate chemical yields through this process, and is shown to be applicable to NPs of single elements and mixtures. The model can be applied without the need to develop additional skills (such as using a Monte Carlo toolkit), providing a fast and straightforward method of estimating chemical yields. A simple framework for determining the HO• yield for different NP sizes at constant NP concentration and initial photon energy is also presented.
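The upper-limit estimate described above — a G value linear in electron energy, with every emitted electron assumed to deposit its full energy in the surrounding water — can be sketched as follows. The emitted-electron spectrum and the fit coefficients a and b below are hypothetical, not values from the paper:

```python
def radiolytic_yield_upper_limit(electron_energies_eV, a, b):
    """Upper-limit count of radical species (e.g. HO•) produced by
    electrons emitted from an irradiated nanoparticle, assuming each
    electron deposits all of its energy in water and the G value
    (molecules per 100 eV) is linear in electron energy: G(E) = a + b*E."""
    total = 0.0
    for energy in electron_energies_eV:
        g_value = a + b * energy            # molecules per 100 eV
        total += g_value * energy / 100.0   # molecules from this electron
    return total

# Hypothetical emitted-electron energies (eV) and G-value fit coefficients
spectrum = [500.0, 1200.0, 5000.0]
n_radicals = radiolytic_yield_upper_limit(spectrum, a=2.5, b=1e-5)
```

Because every electron is assumed to thermalize entirely in water, the result bounds the true yield from above, consistent with the paper's framing of the model as an upper limit rather than a point estimate.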
Developmental neurotoxicity of industrial chemicals.
Grandjean, P; Landrigan, P J
2006-12-16
Neurodevelopmental disorders such as autism, attention deficit disorder, mental retardation, and cerebral palsy are common, costly, and can cause lifelong disability. Their causes are mostly unknown. A few industrial chemicals (eg, lead, methylmercury, polychlorinated biphenyls [PCBs], arsenic, and toluene) are recognised causes of neurodevelopmental disorders and subclinical brain dysfunction. Exposure to these chemicals during early fetal development can cause brain injury at doses much lower than those affecting adult brain function. Recognition of these risks has led to evidence-based programmes of prevention, such as elimination of lead additives in petrol. Although these prevention campaigns are highly successful, most were initiated only after substantial delays. Another 200 chemicals are known to cause clinical neurotoxic effects in adults. Despite an absence of systematic testing, many additional chemicals have been shown to be neurotoxic in laboratory models. The toxic effects of such chemicals in the developing human brain are not known and they are not regulated to protect children. The two main impediments to prevention of neurodevelopmental deficits of chemical origin are the great gaps in testing chemicals for developmental neurotoxicity and the high level of proof required for regulation. New, precautionary approaches that recognise the unique vulnerability of the developing brain are needed for testing and control of chemicals.
Modeling of Laser Material Interactions
NASA Astrophysics Data System (ADS)
Garrison, Barbara
2009-03-01
Irradiation of a substrate by laser light initiates the complex chemical and physical process of ablation, in which large amounts of material are removed. Ablation has been successfully used in techniques such as nanolithography and LASIK surgery; however, a fundamental understanding of the process is necessary in order to further optimize and develop applications. To accurately describe the ablation phenomenon, a model must take into account the multitude of events which occur when a laser irradiates a target, including electronic excitation, bond cleavage, desorption of small molecules, ongoing chemical reactions, propagation of stress waves, and bulk ejection of material. A coarse-grained molecular dynamics (MD) protocol with an embedded Monte Carlo (MC) scheme has been developed which effectively addresses each of these events during the simulation. Using the simulation technique, thermal and chemical excitation channels are separately studied with a model polymethyl methacrylate system. The effects of the irradiation parameters and reaction pathways on the process dynamics are investigated. The mechanism of ablation for thermal processes is governed by a critical number of bond breaks following the deposition of energy. For the case where an absorbed photon directly causes a bond scission, ablation occurs following the rapid chemical decomposition of material. The study provides insight into the influence of thermal and chemical processes in polymethyl methacrylate and facilitates greater understanding of the complex nature of polymer ablation.
Detailed Modeling of Distillation Technologies for Closed-Loop Water Recovery Systems
NASA Technical Reports Server (NTRS)
Allada, Rama Kumar; Lange, Kevin E.; Anderson, Molly S.
2011-01-01
Detailed chemical process simulations are a useful tool in designing and optimizing complex systems and architectures for human life support. Dynamic and steady-state models of these systems help contrast the interactions of various operating parameters and hardware designs, which become extremely useful in trade-study analyses. NASA's Exploration Life Support technology development project recently made use of such models to complement a series of tests on different waste water distillation systems. This paper presents efforts to develop chemical process simulations for three technologies: the Cascade Distillation System (CDS), the Vapor Compression Distillation (VCD) system and the Wiped-Film Rotating Disk (WFRD), using the Aspen Custom Modeler and Aspen Plus process simulation tools. The paper discusses system design, modeling details, and modeling results for each technology and presents some comparisons between the model results and recent test data. Following these initial comparisons, some general conclusions and forward work are discussed.
3D Bioprinting of Tissue/Organ Models.
Pati, Falguni; Gantelius, Jesper; Svahn, Helene Andersson
2016-04-04
In vitro tissue/organ models are useful platforms that can facilitate systematic, repetitive, and quantitative investigations of drugs/chemicals. The primary objective when developing tissue/organ models is to reproduce physiologically relevant functions that typically require complex culture systems. Bioprinting offers exciting prospects for constructing 3D tissue/organ models, as it enables the reproducible, automated production of complex living tissues. Bioprinted tissues/organs may prove useful for screening novel compounds or predicting toxicity, as the spatial and chemical complexity inherent to native tissues/organs can be recreated. In this Review, we highlight the importance of developing 3D in vitro tissue/organ models by 3D bioprinting techniques, characterization of these models for evaluating their resemblance to native tissue, and their application in the prioritization of lead candidates, toxicity testing, and as disease/tumor models. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
WRF/CMAQ AQMEII3 Simulations of US Regional-Scale ...
Chemical boundary conditions are a key input to regional-scale photochemical models. In this study, performed during the third phase of the Air Quality Model Evaluation International Initiative (AQMEII3), we perform annual simulations over North America with chemical boundary conditions prepared from four different global models. Results indicate that the impacts of different boundary conditions are significant for ozone throughout the year and most pronounced outside the summer season. The National Exposure Research Laboratory (NERL) Computational Exposure Division (CED) develops and evaluates data, decision-support tools, and models to be applied to media-specific or receptor-specific problem areas. CED uses modeling-based approaches to characterize exposures, evaluate fate and transport, and support environmental diagnostics/forensics with input from multiple data sources. It also develops media- and receptor-specific models, process models, and decision support tools for use both within and outside of EPA.
New Vistas in Chemical Product and Process Design.
Zhang, Lei; Babi, Deenesh K; Gani, Rafiqul
2016-06-07
Design of chemicals-based products is broadly classified into those that are process centered and those that are product centered. In this article, the designs of both classes of products are reviewed from a process systems point of view; developments related to the design of the chemical product, its corresponding process, and its integration are highlighted. Although significant advances have been made in the development of systematic model-based techniques for process design (also for optimization, operation, and control), much work is needed to reach the same level for product design. Timeline diagrams illustrating key contributions in product design, process design, and integrated product-process design are presented. The search for novel, innovative, and sustainable solutions must be matched by consideration of issues related to the multidisciplinary nature of problems, the lack of data needed for model development, solution strategies that incorporate multiscale options, and reliability versus predictive power. The need for an integrated model-experiment-based design approach is discussed together with benefits of employing a systematic computer-aided framework with built-in design templates.
Kanematsu, Yusuke; Tachikawa, Masanori
2014-04-28
We have developed a multicomponent hybrid density functional theory [MC_(HF+DFT)] method with a polarizable continuum model (PCM) for the analysis of molecular properties, including both nuclear quantum effects and solvent effects. The chemical shifts and H/D isotope shifts of the picolinic acid N-oxide (PANO) molecule in chloroform and acetonitrile solvents were computed using the B3LYP electron exchange-correlation functional within our MC_(HF+DFT) method with PCM (MC_B3LYP/PCM). Our MC_B3LYP/PCM results for PANO are in reasonable agreement with the corresponding experimental chemical shifts and isotope shifts. We further investigated the applicability of our method to acetylacetone in several solvents.
Modeling multiphase migration of organic chemicals in groundwater systems--a review and assessment.
Abriola, L M
1989-01-01
Over the past two decades, a number of models have been developed to describe the multiphase migration of organic chemicals in the subsurface. This paper presents the state-of-the-art with regard to such modeling efforts. The mathematical foundations of these models are explored and individual models are presented and discussed. Models are divided into three groups: a) those that assume a sharp interface between the migrating fluids; b) those that incorporate capillarity; and c) those that consider interphase transport of mass. Strengths and weaknesses of each approach are considered along with supporting data for model validation. Future research directions are also highlighted. PMID:2695322
Doucet, O; Lanvin, M; Thillou, C; Linossier, C; Pupat, C; Merlin, B; Zastrow, L
2006-06-01
The aim of this study was to evaluate the potential of a new three-dimensional epithelial model cultivated from human corneal cells to replace animal testing in the assessment of eye tolerance. To this end, 65 formulated cosmetic products and 36 chemicals were tested by means of this in vitro model using a simplified toxicokinetic approach. The chemicals were selected from the ECETOC data bank and the EC/HO International validation study list. Very satisfactory results were obtained in terms of concordance with Draize test data for the formulated cosmetic products. Moreover, the response of the corneal model appeared predictive of the human ocular response clinically observed by ophthalmologists. The in vitro scores for the chemicals tested strongly correlated with their respective in vivo scores. For all the compounds tested, the response of the corneal model to irritants was similar regardless of their chemical structure, suggesting good robustness of the proposed prediction model. We concluded that this new three-dimensional epithelial model, developed from human corneal cells, could be promising for predicting eye irritation induced by chemicals and complex formulated products, and that these two types of materials should be tested using a similar protocol. A simple shortening of the exposure period was required for chemicals assumed to be more aggressively irritant to epithelial tissues than the cosmetic formulae.
Modeling chemical reactions for drug design.
Gasteiger, Johann
2007-01-01
Chemical reactions are involved at many stages of the drug design process. This starts with the analysis of biochemical pathways that are controlled by enzymes that might be downregulated in certain diseases. In the lead discovery and lead optimization process compounds have to be synthesized in order to test them for their biological activity. And finally, the metabolism of a drug has to be established. A better understanding of chemical reactions could strongly help in making the drug design process more efficient. We have developed methods for quantifying the concepts an organic chemist is using in rationalizing reaction mechanisms. These methods allow a comprehensive modeling of chemical reactivity and thus are applicable to a wide variety of chemical reactions, from gas phase reactions to biochemical pathways. They are empirical in nature and therefore allow the rapid processing of large sets of structures and reactions. We will show here how methods have been developed for the prediction of acidity values and of the regioselectivity in organic reactions, for designing the synthesis of organic molecules and of combinatorial libraries, and for furthering our understanding of enzyme-catalyzed reactions and of the metabolism of drugs.
Fuzzy rule based estimation of agricultural diffuse pollution concentration in streams.
Singh, Raj Mohan
2008-04-01
Outflow from agricultural fields carries diffuse pollutants such as nutrients, pesticides, and herbicides into nearby streams. This is a matter of serious concern for water managers and environmental researchers. Both the application of chemicals to agricultural fields and the transport of those chemicals into streams are uncertain processes, which complicates reliable stream quality prediction. The characteristics of the applied chemical and the percentage of area under application are among the main inputs governing pollutant concentration in streams, and each of these inputs and outputs may contain measurement errors. A fuzzy rule based model is well suited to addressing uncertainty in the inputs, incorporating overlapping membership functions for each input even when data availability is limited. In this study, this property of fuzzy sets is used to estimate the concentration of the herbicide atrazine in a stream. Data from the White river basin, part of the Mississippi river system, are used to develop the fuzzy rule based models. The performance of the developed methodology was found to be encouraging.
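The mechanics of such a fuzzy rule based estimator can be sketched with overlapping triangular membership functions on two normalized inputs. The set shapes, rule base, and output levels below are illustrative assumptions, not values fitted to the White river data.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def estimate_concentration(app_rate, treated_frac):
    """Normalized inputs in [0, 1] -> weighted-average (Sugeno-style) output."""
    # Overlapping "low"/"high" fuzzy sets for each input
    app_low, app_high = tri(app_rate, -1.0, 0.0, 1.0), tri(app_rate, 0.0, 1.0, 2.0)
    frac_low, frac_high = tri(treated_frac, -1.0, 0.0, 1.0), tri(treated_frac, 0.0, 1.0, 2.0)
    # Rule base: firing strength is the min of the antecedent memberships
    rules = [
        (min(app_low, frac_low), 0.1),   # low rate, small area -> low concentration
        (min(app_low, frac_high), 0.4),
        (min(app_high, frac_low), 0.6),
        (min(app_high, frac_high), 1.0), # high rate, large area -> high concentration
    ]
    total = sum(w for w, _ in rules)
    return sum(w * out for w, out in rules) / total if total else 0.0
```

Because the membership functions overlap, intermediate inputs fire several rules at once, which is how measurement uncertainty in the inputs is absorbed.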
Multi-Scale Modeling of the Gamma Radiolysis of Nitrate Solutions.
Horne, Gregory P; Donoclift, Thomas A; Sims, Howard E; Orr, Robin M; Pimblott, Simon M
2016-11-17
A multiscale modeling approach has been developed for the extended time scale long-term radiolysis of aqueous systems. The approach uses a combination of stochastic track structure and track chemistry as well as deterministic homogeneous chemistry techniques and involves four key stages: radiation track structure simulation, the subsequent physicochemical processes, nonhomogeneous diffusion-reaction kinetic evolution, and homogeneous bulk chemistry modeling. The first three components model the physical and chemical evolution of an isolated radiation chemical track and provide radiolysis yields, within the extremely low dose isolated track paradigm, as the input parameters for a bulk deterministic chemistry model. This approach to radiation chemical modeling has been tested by comparison with the experimentally observed yield of nitrite from the gamma radiolysis of sodium nitrate solutions. This is a complex radiation chemical system which is strongly dependent on secondary reaction processes. The concentration of nitrite is not just dependent upon the evolution of radiation track chemistry and the scavenging of the hydrated electron and its precursors but also on the subsequent reactions of the products of these scavenging reactions with other water radiolysis products. Without the inclusion of intratrack chemistry, the deterministic component of the multiscale model is unable to correctly predict experimental data, highlighting the importance of intratrack radiation chemistry in the chemical evolution of the irradiated system.
Modeling Liver-Related Adverse Effects of Drugs Using kNN QSAR Method
Rodgers, Amie D.; Zhu, Hao; Fourches, Dennis; Rusyn, Ivan; Tropsha, Alexander
2010-01-01
Adverse effects of drugs (AEDs) continue to be a major cause of drug withdrawals both in development and post-marketing. While liver-related AEDs are a major concern for drug safety, there are few in silico models for predicting human liver toxicity for drug candidates. We have applied the Quantitative Structure Activity Relationship (QSAR) approach to model liver AEDs. In this study, we aimed to construct a QSAR model capable of binary classification (active vs. inactive) of drugs for liver AEDs based on chemical structure. To build QSAR models, we have employed an FDA spontaneous reporting database of human liver AEDs (elevations in activity of serum liver enzymes), which contains data on approximately 500 approved drugs. Approximately 200 compounds with wide clinical data coverage, structural similarity and balanced (40/60) active/inactive ratio were selected for modeling and divided into multiple training/test and external validation sets. QSAR models were developed using the k nearest neighbor method and validated using external datasets. Models with high sensitivity (>73%) and specificity (>94%) for prediction of liver AEDs in external validation sets were developed. To test applicability of the models, three chemical databases (World Drug Index, Prestwick Chemical Library, and Biowisdom Liver Intelligence Module) were screened in silico and the validity of predictions was determined, where possible, by comparing model-based classification with assertions in publicly available literature. Validated QSAR models of liver AEDs based on the data from the FDA spontaneous reporting system can be employed as sensitive and specific predictors of AEDs in pre-clinical screening of drug candidates for potential hepatotoxicity in humans. PMID:20192250
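The k nearest neighbor classification scheme the study builds on can be sketched as a majority vote over Euclidean distances in descriptor space. The descriptor vectors and labels below are fabricated toy data, not the FDA reporting set or its fitted model.

```python
import math

def knn_classify(query, training, k=3):
    """training: iterable of (descriptor_vector, label) pairs; returns the
    majority label among the k nearest neighbors by Euclidean distance."""
    nearest = sorted(training, key=lambda pair: math.dist(query, pair[0]))[:k]
    votes = [label for _, label in nearest]
    return max(set(votes), key=votes.count)

# Toy 2-D descriptor space: two clusters standing in for inactive/active drugs
train = [((0.0, 0.0), "inactive"), ((0.2, 0.1), "inactive"), ((0.1, 0.3), "inactive"),
         ((5.0, 5.0), "active"), ((5.2, 4.8), "active"), ((4.9, 5.3), "active")]
```

In practice such models are validated on held-out external sets and restricted to an applicability domain, as the abstract describes.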
In-Space Chemical Propulsion System Model
NASA Technical Reports Server (NTRS)
Byers, David C.; Woodcock, Gordon; Benfield, Michael P. J.
2004-01-01
Multiple, new technologies for chemical systems are becoming available and include high temperature rockets, very light propellant tanks and structures, new bipropellant and monopropellant options, lower mass propellant control components, and zero boil off subsystems. Such technologies offer promise of increasing the performance of in-space chemical propulsion for energetic space missions. A mass model for pressure-fed, Earth and space-storable, advanced chemical propulsion systems (ACPS) was developed in support of the NASA MSFC In-Space Propulsion Program. Data from flight systems and studies defined baseline system architectures and subsystems, and parametric scaling relationships were formulated for all ACPS subsystems. The paper will first provide summary descriptions of the approaches used for the systems and the subsystems and then present selected analyses to illustrate use of the model for missions with characteristics of current interest.
Application of Biologically-Based Lumping To Investigate the ...
People are often exposed to complex mixtures of environmental chemicals such as gasoline, tobacco smoke, water contaminants, or food additives. However, investigators have often considered complex mixtures as one lumped entity. Valuable information can be obtained from these experiments, though this simplification provides little insight into the impact of a mixture's chemical composition on toxicologically-relevant metabolic interactions that may occur among its constituents. We developed an approach that applies chemical lumping methods to complex mixtures, in this case gasoline, based on biologically relevant parameters used in physiologically-based pharmacokinetic (PBPK) modeling. Inhalation exposures were performed with rats to evaluate performance of our PBPK model. There were 109 chemicals identified and quantified in the vapor in the chamber. The time-course kinetic profiles of 10 target chemicals were also determined from blood samples collected during and following the in vivo experiments. A general PBPK model was used to compare the experimental data to the simulated values of blood concentration for the 10 target chemicals with various numbers of lumps, iteratively increasing from 0 to 99. Large reductions in simulation error were gained by incorporating enzymatic chemical interactions, in comparison to simulating the individual chemicals separately. The error was further reduced by lumping the 99 non-target chemicals. Application of this biologic
VOLATILIZATION RATES FROM WATER TO INDOOR AIR ...
Contaminated water can lead to volatilization of chemicals to residential indoor air. Previous research has focused on only one source (shower stalls) and has been limited to chemicals in which gas-phase resistance to mass transfer is of marginal significance. As a result, attempts to extrapolate chemical emissions from high-volatility chemicals to lower volatility chemicals, or to sources other than showers, have been difficult or impossible. This study involved the development of two-phase, dynamic mass balance models for estimating chemical emissions from washing machines, dishwashers, and bathtubs. An existing model was adopted for showers only. Each model required the use of source- and chemical-specific mass transfer coefficients. Air exchange (ventilation) rates were required for dishwashers and washing machines as well. These parameters were estimated based on a series of 113 experiments involving 5 tracer chemicals (acetone, ethyl acetate, toluene, ethylbenzene, and cyclohexane) and 4 sources (showers, bathtubs, washing machines, and dishwashers). Each set of experiments led to the determination of chemical stripping efficiencies and mass transfer coefficients (overall, liquid-phase, gas-phase), and to an assessment of the importance of gas-phase resistance to mass transfer. Stripping efficiencies ranged from 6.3% to 80% for showers, 2.6% to 69% for bathtubs, 18% to 100% for dishwashers, and 3.8% to 100% for washing machines. Acetone and cyclohexane al
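The resistance-in-series (two-film) relation that underlies such two-phase mass balance models can be sketched as follows. The coefficient and Henry's constant values in the test below are illustrative, not the fitted values from the 113 experiments.

```python
def overall_kl(kl, kg, henry):
    """Overall liquid-phase mass transfer coefficient from the two-film
    model: 1/KL = 1/kl + 1/(H*kg), with H the dimensionless Henry's
    constant, kl the liquid-film and kg the gas-film coefficient."""
    return 1.0 / (1.0 / kl + 1.0 / (henry * kg))

def gas_phase_resistance_fraction(kl, kg, henry):
    """Share of the total resistance contributed by the gas film; it is
    large for low-volatility (small H) chemicals and negligible for
    highly volatile ones, which is why extrapolation across volatility
    classes fails when the gas film is ignored."""
    return (1.0 / (henry * kg)) / (1.0 / kl + 1.0 / (henry * kg))
```

For a volatile tracer like cyclohexane (large H) the overall coefficient collapses to the liquid-film value, whereas for a low-volatility tracer like acetone the gas film dominates.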
NASA Astrophysics Data System (ADS)
Erduran, Sibel
The central problem underlying this dissertation is the design of learning environments that enable the teaching and learning of chemistry through modeling. The significant role of models in chemistry knowledge is highlighted, with a shift in emphasis from conceptual to epistemological accounts of models. The research context is the design and implementation of the student-centered Acids & Bases Curriculum, developed as part of Project SEPIA. The qualitative study focused on 3 curriculum activities conducted in one 7th grade class of 19 students in an urban public middle school in the eastern United States. The questions guiding the study were: (a) How can learning environments be designed to promote growth of chemistry knowledge through modeling? (b) What epistemological criteria facilitate learning of growth of chemistry knowledge through modeling? Curriculum materials and verbal data from whole-class conversations and student group interviews were analyzed. Group interviews consisted of the same 4 students, selected randomly before curriculum implementation, and were conducted following each activity to investigate students' developing understandings of models. Theoretical categories concerning the definition, properties, and kinds of models, as well as educational and chemical models, informed curriculum design and were redefined as codes in the analysis of verbal data. Results indicate more diversity of codes in student than in teacher talk across all activities. The teacher concentrated on educational and chemical models. A significant finding is that model properties such as 'compositionality' and 'projectability' were not present in teacher talk as expected by the curriculum design. Students did make reference to model properties. Another finding is that students demonstrate an understanding of models characterized by the seventeenth-century Lemery model of acids and bases.
Two students' developing understandings of models across the curriculum implementation suggest that the curriculum brought about some change in students' understanding of models. The tension between students' everyday knowledge and the teacher's scientific knowledge is highlighted relative to the patterns in codes observed in the data. Implications for theories of learning, curriculum design, and teacher education are discussed. It is argued that future educational research should acknowledge and incorporate perspectives from chemical epistemology.
Screening for endocrine-disrupting chemicals (EDCs) requires sensitive, scalable assays. Current high-throughput screening (HTPS) approaches for estrogenic and androgenic activity yield rapid results, but many are not sensitive to physiological hormone concentrations, suggesting ...
EXPERIMENTAL AND MATHEMATICAL MODELING METHODS FOR THE INVESTIGATION OF TOXICOLOGICAL INTERACTIONS
While procedures have been developed and used for many years to assess risk and determine acceptable exposure levels to individual chemicals, most cases of environmental contamination can result in concurrent or sequential exposure to more than one chemical. Toxicological predict...
Paul, Nicholas A; Svensson, Carl Johan; de Nys, Rocky; Steinberg, Peter D
2014-01-01
All of the theory and most of the data on the ecology and evolution of chemical defences derive from terrestrial plants, which have considerable capacity for internal movement of resources. In contrast, most macroalgae--seaweeds--have no or very limited capacity for resource translocation, meaning that trade-offs between growth and defence, for example, should be localised rather than systemic. This may change the predictions of chemical defence theories for seaweeds. We developed a model that mimicked the simple growth pattern of the red seaweed Asparagopsis armata which is composed of repeating clusters of somatic cells and cells which contain deterrent secondary chemicals (gland cells). To do this we created a distinct growth curve for the somatic cells and another for the gland cells using empirical data. The somatic growth function was linked to the growth function for defence via differential equations modelling, which effectively generated a trade-off between growth and defence as these neighbouring cells develop. By treating growth and defence as separate functions we were also able to model a trade-off in growth of 2-3% under most circumstances. However, we found contrasting evidence for this trade-off in the empirical relationships between growth and defence, depending on the light level under which the alga was cultured. After developing a model that incorporated both branching and cell division rates, we formally demonstrated that positive correlations between growth and defence are predicted in many circumstances and also that allocation costs, if they exist, will be constrained by the intrinsic growth patterns of the seaweed. Growth patterns could therefore explain contrasting evidence for cost of constitutive chemical defence in many studies, highlighting the need to consider the fundamental biology and ontogeny of organisms when assessing the allocation theories for defence.
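The coupled growth functions described above can be sketched with a simple Euler integration in which somatic growth pays a small allocation cost for defence. The rate constants and the form of the cost term are illustrative assumptions, not the fitted Asparagopsis parameters.

```python
def simulate(r_soma=0.5, r_gland=0.3, cost=0.05, dt=0.01, steps=100):
    """Integrate coupled somatic- and gland-cell growth over steps*dt
    time units; returns the final (somatic, gland) cell quantities."""
    soma, gland = 1.0, 0.1
    for _ in range(steps):
        d_soma = r_soma * soma - cost * gland  # growth minus allocation to defence
        d_gland = r_gland * gland              # defence tracks gland-cell division
        soma += d_soma * dt
        gland += d_gland * dt
    return soma, gland
```

Even with a strictly positive cost term, both pools grow over time, illustrating how a positive correlation between growth and defence can coexist with a localised allocation cost.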
This project aims to strengthen the general scientific foundation of EPA's exposure and risk assessment processes by developing state-of-the-art exposure to dose computational models. This research will produce physiologically-based pharmacokinetic (PBPK) and pharmacodynamic (PD)...
Developing an Experimental Model of Vascular Toxicity in Embryonic Zebrafish
Tamara Tal, Integrated Systems Toxicology Division, U.S. EPA. Background: There are tens of thousands of chemicals that have yet to be fully evaluated for their toxicity by validated in vivo testing ...
ERIC Educational Resources Information Center
Judge, Sarah; Delgaty, Laura; Broughton, Mark; Dyter, Laura; Grimes, Callum; Metcalf, James; Nicholson, Rose; Pennock, Erin; Jankowski, Karl
2017-01-01
A team of six children (13-14 years old) developed and conducted an experiment to assess the behaviour of the planarian flatworm, an invertebrate animal model, before, during and after exposure to chemicals. The aim of the project was to engage children in pharmacology and toxicology research. First, the concept that exposure to chemicals can…
ERIC Educational Resources Information Center
Krein, Michael
2011-01-01
After decades of development and use in a variety of application areas, Quantitative Structure Property Relationships (QSPRs) and related descriptor-based statistical learning methods have achieved a level of infamy due to their misuse. The field is rife with past examples of overtrained models, overoptimistic performance assessment, and outright…
Politi, Regina; Rusyn, Ivan; Tropsha, Alexander
2016-01-01
The thyroid hormone receptor (THR) is an important member of the nuclear receptor family that can be activated by endocrine disrupting chemicals (EDCs). Quantitative Structure-Activity Relationship (QSAR) models have been developed to facilitate the prioritization of THR-mediated EDCs for experimental validation. The largest database of binding affinities available at the time of the study for the ligand binding domain (LBD) of THRβ was assembled to generate both continuous and classification QSAR models, with external accuracies of R2=0.55 and CCR=0.76, respectively. In addition, for the first time a QSAR model was developed to predict binding affinities of antagonists inhibiting the interaction of coactivators with the AF-2 domain of THRβ (R2=0.70). Furthermore, molecular docking studies were performed for a set of THRβ ligands (57 agonists and 15 antagonists of the LBD, 210 antagonists of the AF-2 domain, supplemented by putative decoys/non-binders) using several THRβ structures retrieved from the Protein Data Bank. We found that two agonist-bound THRβ conformations could effectively discriminate their corresponding ligands from presumed non-binders. Moreover, one of the agonist conformations could discriminate agonists from antagonists. Finally, we conducted virtual screening of a chemical library compiled by the EPA as part of the Tox21 program to identify potential THRβ-mediated EDCs using both QSAR models and docking. We concluded that the library is unlikely to contain any EDC that would bind to THRβ. Models developed in this study can be employed either to identify environmental chemicals interacting with the THR or, conversely, to eliminate the THR-mediated mechanism of action for chemicals of concern. PMID:25058446
Investigating the effect of chemical stress and resource ...
Modeling exposure and recovery of fish and wildlife populations after stressor mitigation serves as a basis for evaluating population status and remediation success. The Atlantic killifish (Fundulus heteroclitus) is an important and well-studied model organism for understanding the effects of pollutants and other stressors in estuarine and marine ecosystems. Herein, we develop a density dependent matrix population model for Atlantic killifish that analyzes both size-structure and age class-structure of the population so that we could readily incorporate output from a dynamic energy budget (DEB) model currently under development. This population modeling approach emphasizes application in conjunction with field monitoring efforts (e.g., through effects-based monitoring programs) and/or laboratory analysis to link effects due to chemical stress to adverse outcomes in whole organisms and populations. We applied the model using data for killifish exposed to dioxin-like compounds, taken from a previously published study. Specifically, the model was used to investigate population trajectories for Atlantic killifish with dietary exposures to 112, 296, and 875 pg/g of dioxin with effects on fertility and survival rates. All effects were expressed relative to control fish. Further, the population model was employed to examine age and size distributions of a population exposed to resource limitation in addition to chemical stress. For each dietary exposure concentration o
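A stage-structured projection of the Leslie-matrix type used in such population models can be sketched as follows, with a multiplicative stress factor applied to fertility and survival. The vital rates in the test are illustrative, not the killifish/dioxin values from the study.

```python
def project(pop, fertility, survival, years=1, stress=1.0):
    """Project a stage-structured population forward in time.
    pop: abundance per age class; fertility has one rate per class,
    survival one rate per class-to-class transition; stress in (0, 1]
    scales both vital rates to represent chemical exposure."""
    for _ in range(years):
        births = sum(f * stress * n for f, n in zip(fertility, pop))
        # zip truncates at the shorter sequence, so the oldest class dies out
        survivors = [s * stress * n for s, n in zip(survival, pop)]
        pop = [births] + survivors
    return pop
```

Lowering the stress factor reduces both recruitment and survival, so total abundance declines relative to the control trajectory, mirroring how the dietary dioxin exposures were expressed relative to control fish.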
Quantum-chemical insights from deep tensor neural networks
Schütt, Kristof T.; Arbabzadah, Farhad; Chmiela, Stefan; Müller, Klaus R.; Tkatchenko, Alexandre
2017-01-01
Learning from data has led to paradigm shifts in a multitude of disciplines, including web, text and image search, speech recognition, as well as bioinformatics. Can machine learning enable similar breakthroughs in understanding quantum many-body systems? Here we develop an efficient deep learning approach that enables spatially and chemically resolved insights into quantum-mechanical observables of molecular systems. We unify concepts from many-body Hamiltonians with purpose-designed deep tensor neural networks, which leads to size-extensive and uniformly accurate (1 kcal mol−1) predictions in compositional and configurational chemical space for molecules of intermediate size. As an example of chemical relevance, the model reveals a classification of aromatic rings with respect to their stability. Further applications of our model for predicting atomic energies and local chemical potentials in molecules, reliable isomer energies, and molecules with peculiar electronic structure demonstrate the potential of machine learning for revealing insights into complex quantum-chemical systems. PMID:28067221
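The size-extensivity emphasized above, where the molecular energy is a sum of per-atom contributions refined by interactions with neighboring atoms, can be illustrated with a toy model. The embedding, interaction weights, and readout below are arbitrary assumptions, not a trained deep tensor neural network.

```python
import math

def atom_energy(z, neighbors, passes=2, w=0.1):
    """Refine a per-atom coefficient with distance-damped messages from
    neighboring atoms, then read out an energy contribution."""
    c = float(z)  # initial embedding from the nuclear charge
    for _ in range(passes):
        c += sum(w * zj * math.exp(-dist) for zj, dist in neighbors)
    return -0.01 * c

def molecule_energy(atoms):
    """atoms: list of (z, [(z_neighbor, distance), ...]) pairs."""
    return sum(atom_energy(z, nb) for z, nb in atoms)

# Toy H2 fragment: each H atom sees one neighbor at 0.74 (arbitrary units)
h2 = [(1, [(1, 0.74)]), (1, [(1, 0.74)])]
```

Because the total is a sum over atoms, two non-interacting copies of a fragment give exactly twice the energy, which is the size-extensive behavior the abstract highlights.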
Coupling Computer-Aided Process Simulation and ...
A methodology is described for developing a gate-to-gate life cycle inventory (LCI) of a chemical manufacturing process to support the application of life cycle assessment in the design and regulation of sustainable chemicals. The inventories were derived by first applying process design and simulation to develop a process flow diagram describing the energy and basic material flows of the system. Additional techniques developed by the U.S. Environmental Protection Agency for estimating uncontrolled emissions from chemical processing equipment were then applied to obtain a detailed emission profile for the process. Finally, land use for the process was estimated using a simple sizing model. The methodology was applied to a case study of acetic acid production based on the Cativa™ process. The results reveal improvements in the qualitative LCI for acetic acid production compared to commonly used databases and top-down methodologies. The modeling techniques improve the quantitative LCI results for inputs and uncontrolled emissions. With provisions for applying appropriate emission controls, the proposed method can provide an estimate of the LCI that can be used for subsequent life cycle assessments. As part of its mission, the Agency is tasked with overseeing the use of chemicals in commerce. This can include consideration of a chemical's potential impact on health and safety, resource conservation, clean air and climate change, clean water, and sustainable
NASA Astrophysics Data System (ADS)
Long, M. S.; Yantosca, R.; Nielsen, J.; Linford, J. C.; Keller, C. A.; Payer Sulprizio, M.; Jacob, D. J.
2014-12-01
The GEOS-Chem global chemical transport model (CTM), used by a large atmospheric chemistry research community, has been reengineered to serve as a platform for a range of computational atmospheric chemistry science foci and applications. Development included modularization for coupling to general circulation and Earth system models (ESMs) and the adoption of co-processor capable atmospheric chemistry solvers. This was done using an Earth System Modeling Framework (ESMF) interface that operates independently of GEOS-Chem scientific code to permit seamless transition from the GEOS-Chem stand-alone serial CTM to deployment as a coupled ESM module. In this manner, the continual stream of updates contributed by the CTM user community is automatically available for broader applications, which remain state-of-science and directly referenceable to the latest version of the standard GEOS-Chem CTM. These developments are now available as part of the standard version of the GEOS-Chem CTM. The system has been implemented as an atmospheric chemistry module within the NASA GEOS-5 ESM. The coupled GEOS-5/GEOS-Chem system was tested for weak and strong scalability and performance with a tropospheric oxidant-aerosol simulation. Results confirm that the GEOS-Chem chemical operator scales efficiently for any number of processes. Although inclusion of atmospheric chemistry in ESMs is computationally expensive, the excellent scalability of the chemical operator means that the relative cost goes down with increasing number of processes, making fine-scale resolution simulations possible.
Prediction of the effect of formulation on the toxicity of chemicals.
Mistry, Pritesh; Neagu, Daniel; Sanchez-Ruiz, Antonio; Trundle, Paul R; Vessey, Jonathan D; Gosling, John Paul
2017-01-01
Two approaches for the prediction of which of two vehicles will result in lower toxicity for anticancer agents are presented. Machine-learning models are developed using decision tree, random forest and partial least squares methodologies and statistical evidence is presented to demonstrate that they represent valid models. Separately, a clustering method is presented that allows the ordering of vehicles by the toxicity they show for chemically-related compounds.
A THC Simulator for Modeling Fluid-Rock Interactions
NASA Astrophysics Data System (ADS)
Hamidi, Sahar; Galvan, Boris; Heinze, Thomas; Miller, Stephen
2014-05-01
Fluid-rock interactions play an essential role in many earth processes, from a likely influence on earthquake nucleation and aftershocks, to enhanced geothermal systems, carbon capture and storage (CCS), and underground nuclear waste repositories. In THC models, two-way interactions between the different processes (thermal, hydraulic, and chemical) are present. Fluid flow influences the permeability of the rock, especially when chemical reactions are taken into account. On one hand, solute concentration influences fluid properties; on the other, heat can drive further chemical reactions. Estimating heat production from a naturally fractured geothermal system remains a complex problem. Previous works are typically based on a local thermal equilibrium assumption and rarely consider salinity. Dissolved salt affects the hydro- and thermodynamic behavior of the system by changing the hydraulic properties of the circulating fluid. Coupled thermal-hydraulic-chemical (THC) models are important for investigating these processes, but they ultimately need to be coupled to mechanics to yield THMC models. Although similar models currently exist (e.g. PFLOTRAN), our objective here is to develop algorithms for implementation on the Graphics Processing Unit (GPU) computer architecture, to be run on GPU clusters. To that end, we present a two-dimensional numerical simulation of fully coupled non-isothermal, non-reactive solute flow. The thermal part of the simulation models heat transfer for either local thermal equilibrium or nonequilibrium conditions, coupled to non-reactive mass transfer described by a non-linear diffusion/dispersion model. The flow component of the model includes non-linear Darcian flow for either saturated or unsaturated scenarios. For the unsaturated case, we use the Richards approximation for a mixture of liquid and gas phases. Relative permeability and capillary pressure are determined by the van Genuchten relations. 
Permeability of rock is controlled by porosity, which is itself related to effective stress. The theoretical model is solved using explicit finite differences, and runs in parallel mode with OpenMP. The code is fully modular so that any combination of current THC processes, one- and two-phase, can be chosen. Future developments will include dissolution and precipitation of chemical components in addition to chemical erosion.
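The van Genuchten relations mentioned above have standard closed forms; a minimal sketch follows, using the common Mualem-van Genuchten relative-permeability expression. The parameter values are illustrative only and are not calibrated to any real soil or rock.

```python
def van_genuchten_se(h: float, alpha: float, n: float) -> float:
    """Effective saturation Se for suction head h via van Genuchten (1980);
    alpha (1/m) and n are medium-specific fitting parameters, m = 1 - 1/n."""
    if h <= 0.0:                      # no suction: fully saturated
        return 1.0
    m = 1.0 - 1.0 / n
    return (1.0 + (alpha * h) ** n) ** (-m)

def mualem_kr(se: float, n: float) -> float:
    """Mualem-van Genuchten relative permeability from effective saturation."""
    m = 1.0 - 1.0 / n
    return se ** 0.5 * (1.0 - (1.0 - se ** (1.0 / m)) ** m) ** 2

# Illustrative parameters (hypothetical):
se = van_genuchten_se(1.0, alpha=1.0, n=2.0)
kr = mualem_kr(se, n=2.0)
```

At zero suction the medium is fully saturated (Se = 1, kr = 1), and both quantities fall monotonically as suction increases, which is the behavior the unsaturated-flow solver relies on.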
Novel naïve Bayes classification models for predicting the carcinogenicity of chemicals.
Zhang, Hui; Cao, Zhi-Xing; Li, Meng; Li, Yu-Zhi; Peng, Cheng
2016-11-01
Carcinogenicity prediction has become a significant issue for the pharmaceutical industry. The purpose of this investigation was to develop a novel prediction model of the carcinogenicity of chemicals using a naïve Bayes classifier. The established model was validated by internal 5-fold cross-validation and an external test set. The naïve Bayes classifier gave an average overall prediction accuracy of 90 ± 0.8% for the training set and 68 ± 1.9% for the external test set. Moreover, five simple molecular descriptors considered important for the carcinogenicity of chemicals (AlogP, molecular weight (MW), number of H donors, Apol, and the Wiener index) were identified, and several substructures related to carcinogenicity were recognized. Thus, we hope the established naïve Bayes prediction model can be applied to screen early-stage molecules for this potential carcinogenic adverse effect; the five identified molecular descriptors and the substructures of carcinogens should give a better understanding of the carcinogenicity of chemicals, and further provide guidance for medicinal chemists in the design of new candidate drugs and lead optimization, ultimately reducing the attrition rate in later stages of drug development. Copyright © 2016 Elsevier Ltd. All rights reserved.
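A naïve Bayes classifier of the kind described can be sketched in a few lines. The toy below uses Bernoulli likelihoods over binary substructure fingerprints with Laplace smoothing; the fingerprints and labels are invented for illustration and do not reflect the paper's dataset or descriptors.

```python
import math

def train_bernoulli_nb(X, y):
    """Fit a Bernoulli naive Bayes model on binary fingerprints.
    X: list of 0/1 bit-tuples; y: list of 0/1 class labels."""
    n_bits = len(X[0])
    counts = {0: 0, 1: 0}
    bit_counts = {0: [0] * n_bits, 1: [0] * n_bits}
    for bits, label in zip(X, y):
        counts[label] += 1
        for i, b in enumerate(bits):
            bit_counts[label][i] += b
    model = {}
    for c in (0, 1):
        prior = math.log(counts[c] / len(y))
        # P(bit_i = 1 | class c), Laplace-smoothed to avoid zero probabilities
        probs = [(bit_counts[c][i] + 1) / (counts[c] + 2) for i in range(n_bits)]
        model[c] = (prior, probs)
    return model

def predict(model, bits):
    """Return the class with the highest log-posterior score."""
    scores = {}
    for c, (prior, probs) in model.items():
        s = prior
        for b, p in zip(bits, probs):
            s += math.log(p if b else 1.0 - p)
        scores[c] = s
    return max(scores, key=scores.get)

# Invented toy data: bit 0 marks a (hypothetical) structural alert.
fps = [(1, 0), (1, 1), (0, 1), (0, 0)]
labels = [1, 1, 0, 0]                    # 1 = carcinogen (illustrative)
model = train_bernoulli_nb(fps, labels)
```

Real applications would use hundreds of descriptor bits and report cross-validated accuracy, as the abstract describes, but the posterior-scoring machinery is the same.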
New chemical-DSMC method in numerical simulation of axisymmetric rarefied reactive flow
NASA Astrophysics Data System (ADS)
Zakeri, Ramin; Kamali Moghadam, Ramin; Mani, Mahmoud
2017-04-01
The modified quantum kinetic (MQK) chemical reaction model introduced by Zakeri et al. is developed for applicable cases in axisymmetric reactive rarefied gas flows using the direct simulation Monte Carlo (DSMC) method. Although the MQK chemical model introduces some modifications to the quantum kinetic (QK) method, it also employs the general soft sphere collision model and the Stockmayer potential function to properly select collision pairs in the DSMC algorithm and to capture both the attractive and repulsive intermolecular forces in rarefied gas flows. To assess the presented model in the simulation of more complex and applicable reacting flows, air dissociation is first studied in a single cell for equilibrium and non-equilibrium conditions. The MQK results agree well with the analytical and experimental data and accurately predict the characteristics of the rarefied flowfield with chemical reaction. To investigate the accuracy of the MQK chemical model in the simulation of axisymmetric flow, air dissociation is also assessed in hypersonic flow around two geometries: a sphere as a benchmark case and a blunt body (STS-2) as an applied test case. The computed results, including the translational, rotational, and vibrational temperatures, species concentrations along the stagnation line, and the heat flux and pressure coefficient on the surface, are compared with those of other chemical methods such as the QK and total collision energy (TCE) models and with available analytical and experimental data. Generally, the MQK chemical model properly simulates the chemical reactions and predicts flowfield characteristics more accurately than the typical QK model. Although in some cases the MQK results match those of the TCE method, the main point is that the MQK does not need any experimental data or the unrealistic assumption of a specular boundary condition used in the TCE method. 
Another advantage of the MQK model is a significant reduction in computational cost compared with the QK chemical model at the same accuracy, owing to the more appropriate collision model and the consequent decrease in the number of particle collisions.
Analysis of exhaled breath by laser detection
NASA Astrophysics Data System (ADS)
Thrall, Karla D.; Toth, James J.; Sharpe, Steven W.
1996-04-01
The goal of our work is twofold: (1) to develop a portable, rapid, laser-based breath analyzer for monitoring metabolic processes, and (2) to predict these metabolic processes through physiologically based pharmacokinetic (PBPK) modeling. Small infrared-active molecules such as ammonia, carbon monoxide, carbon dioxide, methane, and ethane are present in exhaled breath and can be readily detected by laser absorption spectroscopy. In addition, many of the stable isotopomers of these molecules can be accurately detected, making it possible to follow specific metabolic processes. Potential areas of application for this technology include the diagnosis of certain pathologies (e.g., Helicobacter pylori infection), detection of trauma due to either physical or chemical causes, and monitoring of nutrient uptake (e.g., in malnutrition). To understand the origin of these small molecules and elucidate the metabolic processes associated with them, we are employing physiologically based pharmacokinetic (PBPK) models. A PBPK model is founded on known physiological processes (e.g., blood flow rates, tissue volumes, breathing rate), chemical-specific properties (e.g., tissue solubility coefficients, molecular weight, chemical density), and metabolic processes (tissue site and rate of metabolic biotransformation). Since many of these processes are well understood, a PBPK model can be developed and validated against the more readily available experimental animal data; by extrapolating the parameters to humans, the model can then predict chemical behavior in humans.
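The PBPK idea above, physiological and chemical-specific parameters driving predicted concentrations, can be illustrated with a deliberately minimal single-compartment sketch in which blood concentration declines through metabolic clearance and alveolar exhalation. This is not a full PBPK model, and all parameter values are hypothetical.

```python
def simulate_exhaled(c0_blood, p_blood_air, q_alv, v_d, cl_met, dt=0.01, t_end=5.0):
    """Toy single-compartment kinetics: blood concentration falls through
    metabolic clearance (cl_met) and alveolar exhalation (q_alv); the
    exhaled-air concentration follows the blood:air partition coefficient."""
    c, t, trace = c0_blood, 0.0, []
    for _ in range(round(t_end / dt)):
        c_exhaled = c / p_blood_air          # end-alveolar air concentration
        trace.append((t, c, c_exhaled))
        # forward-Euler step of dC/dt = -(CLmet*C + Qalv*Cexh) / Vd
        c += -(cl_met * c + q_alv * c_exhaled) / v_d * dt
        t += dt
    return trace

# Hypothetical parameters, arbitrary units:
trace = simulate_exhaled(c0_blood=1.0, p_blood_air=10.0, q_alv=5.0,
                         v_d=40.0, cl_met=2.0)
```

A real PBPK model would resolve multiple tissue compartments with measured blood flows and solubility coefficients; the point here is only how exhaled-air concentration tracks blood concentration through the partition coefficient.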
A COMSOL-GEMS interface for modeling coupled reactive-transport geochemical processes
NASA Astrophysics Data System (ADS)
Azad, Vahid Jafari; Li, Chang; Verba, Circe; Ideker, Jason H.; Isgor, O. Burkan
2016-07-01
An interface was developed between the COMSOL Multiphysics™ finite element analysis software and the (geo)chemical modeling platform GEMS for reactive-transport modeling of (geo)chemical processes in variably saturated porous media. The two standalone software packages are managed from the interface, which uses a non-iterative operator-splitting technique to couple the transport (COMSOL) and reaction (GEMS) processes. The interface allows modeling of media with complex chemistry (e.g. cement) using GEMS thermodynamic database formats. Benchmark comparisons show that the developed interface can be used to predict a variety of reactive-transport processes accurately. The full functionality of the interface was demonstrated by modeling transport processes, governed by the extended Nernst-Planck equation, in Class H Portland cement samples in high-pressure, high-temperature autoclaves simulating systems used to store captured carbon dioxide (CO2) in geological reservoirs.
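Non-iterative operator splitting of the kind the interface uses can be sketched schematically: each time step advances transport, then reaction, with no iteration between the two. The example below stands in a 1-D explicit diffusion step for COMSOL's transport role and first-order decay for GEMS's chemistry role; a real coupling would call a full equilibrium solver instead of the decay factor.

```python
import math

def split_step(c, diff, decay, dx, dt):
    """One non-iterative operator-splitting step for 1-D reactive transport.
    Step 1 (transport role): explicit finite-difference diffusion with
    zero-flux boundaries.  Step 2 (chemistry role): first-order decay as a
    stand-in for an equilibrium solver such as GEMS.  Schematic only."""
    n = len(c)
    new = [0.0] * n
    for i in range(n):
        left = c[i - 1] if i > 0 else c[i]        # mirror at boundaries
        right = c[i + 1] if i < n - 1 else c[i]
        new[i] = c[i] + diff * dt / dx ** 2 * (left - 2.0 * c[i] + right)
    # reaction half-step applied cell by cell after transport
    return [ci * math.exp(-decay * dt) for ci in new]

c0 = [1.0, 0.0, 0.0, 0.0]                          # initial pulse of solute
out = split_step(c0, diff=0.1, decay=0.5, dx=1.0, dt=0.1)
```

Because the diffusion step conserves mass under zero-flux boundaries, the total mass after one step is reduced exactly by the decay factor, a convenient sanity check for splitting schemes.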
Grant, Sarah Schmidt; Kawate, Tomohiko; Nag, Partha P.; Silvis, Melanie R.; Gordon, Katherine; Stanley, Sarah A.; Kazyanskaya, Ed; Nietupski, Ray; Golas, Aaron; Fitzgerald, Michael; Cho, Sanghyun; Franzblau, Scott G.; Hung, Deborah T.
2013-01-01
During Mycobacterium tuberculosis infection, a population of bacteria is thought to exist in a non-replicating state, refractory to antibiotics, which may contribute to the need for prolonged antibiotic therapy. The identification of inhibitors of the non-replicating state provides tools that can be used to probe this hypothesis and the physiology of this state. The development of such inhibitors also has the potential to shorten the duration of antibiotic therapy required. Here we describe the development of a novel non-replicating assay amenable to high-throughput chemical screening, coupled with secondary assays that use carbon starvation as the in vitro model. Together these assays identify compounds with activity against replicating and non-replicating M. tuberculosis as well as compounds that inhibit the transition from non-replicating to replicating stages of growth. Using these assays we successfully screened over 300,000 compounds and identified 786 inhibitors of non-replicating M. tuberculosis. In order to understand the relationship among different non-replicating models, we tested 52 of these molecules in a hypoxia model, and four different chemical scaffolds in a stochastic persister model and a streptomycin-dependent model. We found that compounds display varying levels of activity in different models for the non-replicating state, suggesting important differences in bacterial physiology between models. Therefore, chemical tools identified in this assay may be useful for determining the relevance of different non-replicating in vitro models to in vivo M. tuberculosis infection. Given our current limited understanding, molecules that are active across multiple models may represent more promising candidates for further development. PMID:23898841
Cheminformatics-aided pharmacovigilance: application to Stevens-Johnson Syndrome
Low, Yen S; Caster, Ola; Bergvall, Tomas; Fourches, Denis; Zang, Xiaoling; Norén, G Niklas; Rusyn, Ivan; Edwards, Ralph
2016-01-01
Objective Quantitative Structure-Activity Relationship (QSAR) models can predict adverse drug reactions (ADRs), and thus provide early warnings of potential hazards. Timely identification of potential safety concerns could protect patients and aid early diagnosis of ADRs among the exposed. Our objective was to determine whether global spontaneous reporting patterns might allow chemical substructures associated with Stevens-Johnson Syndrome (SJS) to be identified and utilized for ADR prediction by QSAR models. Materials and Methods Using a reference set of 364 drugs having positive or negative reporting correlations with SJS in the VigiBase global repository of individual case safety reports (Uppsala Monitoring Center, Uppsala, Sweden), chemical descriptors were computed from drug molecular structures. Random Forest and Support Vector Machines methods were used to develop QSAR models, which were validated by external 5-fold cross validation. Models were employed for virtual screening of DrugBank to predict SJS actives and inactives, which were corroborated using knowledge bases like VigiBase, ChemoText, and MicroMedex (Truven Health Analytics Inc, Ann Arbor, Michigan). Results We developed QSAR models that could accurately predict if drugs were associated with SJS (area under the curve of 75%–81%). Our 10 most active and inactive predictions were substantiated by SJS reports (or lack thereof) in the literature. Discussion Interpretation of QSAR models in terms of significant chemical descriptors suggested novel SJS structural alerts. Conclusions We have demonstrated that QSAR models can accurately identify SJS active and inactive drugs. Requiring chemical structures only, QSAR models provide effective computational means to flag potentially harmful drugs for subsequent targeted surveillance and pharmacoepidemiologic investigations. PMID:26499102
Baldi, Pierre
2010-01-01
As repositories of chemical molecules continue to expand and become more open, it becomes increasingly important to develop tools to search them efficiently and to assess the statistical significance of chemical similarity scores. Here we develop a general framework for understanding, modeling, predicting, and approximating the distribution of chemical similarity scores and its extreme values in large databases. The framework can be applied to different chemical representations and similarity measures but is demonstrated here using the most common binary fingerprints with the Tanimoto similarity measure. After introducing several probabilistic models of fingerprints, including the Conditional Gaussian Uniform model, we show that the distribution of Tanimoto scores can be approximated by the distribution of the ratio of two correlated Normal random variables associated with the corresponding unions and intersections. This remains true also when the distribution of similarity scores is conditioned on the size of the query molecules in order to derive more fine-grained results and improve chemical retrieval. The corresponding extreme value distributions for the maximum scores are approximated by Weibull distributions. From these various distributions and their analytical forms, Z-scores, E-values, and p-values are derived to assess the significance of similarity scores. In addition, the framework also allows one to predict the value of standard chemical retrieval metrics, such as Sensitivity and Specificity at fixed thresholds, or ROC (Receiver Operating Characteristic) curves at multiple thresholds, and to detect outliers in the form of atypical molecules. Numerous and diverse experiments, carried out in part with large sets of molecules from the ChemDB, show remarkable agreement between theory and empirical results. PMID:20540577
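For reference, the Tanimoto measure on binary fingerprints is simply intersection-over-union of the on-bits. The toy database below is invented for illustration; real systems score millions of molecules and then assess significance as the abstract describes.

```python
def tanimoto(fp_a: set, fp_b: set) -> float:
    """Tanimoto similarity of two binary fingerprints given as sets of
    on-bit indices: |A & B| / |A | B| (defined as 1.0 when both are empty)."""
    union = len(fp_a | fp_b)
    return len(fp_a & fp_b) / union if union else 1.0

# Rank a toy database (invented molecules) against a query fingerprint.
query = {1, 2, 3, 5, 8}
database = {"mol_a": {1, 2, 3, 5, 8}, "mol_b": {2, 3, 5, 9}, "mol_c": {10, 11}}
ranked = sorted(database, key=lambda m: tanimoto(query, database[m]), reverse=True)
```

The distribution of such scores over a large database is what the paper models, in order to turn a raw maximum score into a Z-score, E-value, or p-value.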
Investigation of chemically-reacting supersonic internal flows
NASA Technical Reports Server (NTRS)
Chitsomboon, T.; Tiwari, S. N.
1985-01-01
This report covers work done on the research project Analysis and Computation of Internal Flow Field in a Scramjet Engine. The work is supported by the NASA Langley Research Center (Computational Methods Branch of the High-Speed Aerodynamics Division) through research grant NAG1-423. The governing equations of two-dimensional chemically-reacting flows are presented together with the global two-step chemistry model. The finite-difference algorithm used is illustrated and the method of circumventing the stiffness is discussed. The computer program developed is used to solve two model problems of a premixed chemically-reacting flow. The results obtained are physically reasonable.
As defined by Wikipedia (https://en.wikipedia.org/wiki/Metamodeling), “(a) metamodel or surrogate model is a model of a model, and metamodeling is the process of generating such metamodels.” The goals of metamodeling include, but are not limited to (1) developing func...
The mode of toxic action (MoA) has been recognized as a key determinant of chemical toxicity, but development of predictive MoA classification models in aquatic toxicology has been limited. We developed a Bayesian network model to classify aquatic toxicity MoA using a recently pu...
This presentation will examine the impact of data quality on the construction of QSAR models being developed within the EPA's National Center for Computational Toxicology. We have developed a public-facing platform to provide access to predictive models. As part of the work we ha...
Preliminary Results from Electric Arc Furnace Off-Gas Enthalpy Modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nimbalkar, Sachin U; Thekdi, Arvind; Keiser, James R
2015-01-01
This article describes electric arc furnace (EAF) off-gas enthalpy models developed at Oak Ridge National Laboratory (ORNL) to calculate overall heat availability (sensible and chemical enthalpy) and recoverable heat values (steam or power generation potential) for existing EAF operations and to test ORNL's new EAF waste heat recovery (WHR) concepts: a Regenerative Drop-out Box System and a Fluidized Bed System. The two EAF off-gas enthalpy models described in this paper are (1) the Overall Waste Heat Recovery Model, which calculates total heat availability in off-gases of existing EAF operations, and (2) the Regenerative Drop-out Box System Model, in which hot EAF off-gases alternately pass through one of two refractory heat sinks that store heat and then transfer it to another gaseous medium. These models calculate the sensible and chemical enthalpy of EAF off-gases based on the off-gas chemical composition, temperature, and mass flow rate during tap-to-tap time, and variations in those parameters in terms of actual values over time. The models provide heat transfer analysis for the aforementioned concepts to confirm the overall system and major component sizing (preliminary) and to assess the practicality of the systems. Real-time EAF off-gas composition (e.g., CO, CO2, H2, and H2O), volume flow, and temperature data from one EAF operation were used to test the validity and accuracy of the modeling work. The EAF off-gas data were used to calculate the sensible and chemical enthalpy of the EAF off-gases to generate steam and power. The article provides detailed results from the modeling work that are important to the success of ORNL's EAF WHR project. The EAF WHR project aims to develop and test new concepts and materials that allow cost-effective recovery of sensible and chemical heat from high-temperature gases discharged from EAFs.
IDENTIFYING INDICATORS OF REACTIVITY FOR CHEMICAL REDUCTANTS IN ANOXIC AND ANAEROBIC SEDIMENTS
To develop reaction transport models describing the movement of redox-active organic contaminants through contaminated sediments and aquifers, it is imperative to know the identity and reactivity of chemical reductants in natural sediments and to associate their reactivity with p...
High-Throughput Dietary Exposure Predictions for Chemical Migrants from Food Packaging Materials
United States Environmental Protection Agency researchers have developed a Stochastic Human Exposure and Dose Simulation High-Throughput (SHEDS-HT) model for use in prioritization of chemicals under the ExpoCast program. In this research, new methods were implemented in SHEDS-HT...
ToxRefDB: Classifying ToxCast™ Phase I Chemicals Utilizing Structured Toxicity Information
There is an essential need for highly detailed chemicals classifications within the ToxCast™ research program. In order to develop predictive models and biological signatures utilizing high-throughput screening (HTS) and in vitro genomic data, relevant endpoints and toxicities m...
The Virtual Embryo Project (v-Embryo™)
The v-Embryo is a far reaching new research program at the US EPA to develop a working computer model of a mammalian embryo that can be used to better understand the prenatal risks posed by environmental chemicals and to eventually predict a chemical's potential developmental tox...
Systematic development of reduced reaction mechanisms for dynamic modeling
NASA Technical Reports Server (NTRS)
Frenklach, M.; Kailasanath, K.; Oran, E. S.
1986-01-01
A method for systematically developing a reduced chemical reaction mechanism for dynamic modeling of chemically reactive flows is presented. The method is based on the postulate that if a reduced reaction mechanism faithfully describes the time evolution of both thermal and chain reaction processes characteristic of a more complete mechanism, then the reduced mechanism will describe the chemical processes in a chemically reacting flow with approximately the same degree of accuracy. Here this postulate is tested by producing a series of mechanisms of reduced accuracy, which are derived from a full detailed mechanism for methane-oxygen combustion. These mechanisms were then tested in a series of reactive flow calculations in which a large-amplitude sinusoidal perturbation is applied to a system that is initially quiescent and whose temperature is high enough to start ignition processes. Comparison of the results for systems with and without convective flow shows that this approach produces reduced mechanisms that are useful for calculations of explosions and detonations. Extensions and applicability to flames are discussed.
Reacidification modeling and dose calculation procedures for calcium-carbonate-treated lakes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scheffe, R.D.
1987-01-01
Two dose calculation models and a reacidification model were developed and applied to two Adirondack acid lakes (Woods Lake and Cranberry Pond) that were treated with calcite during May 30-31, 1985 as part of the EPRI-funded Lake Acidification Mitigation Project. The first dose model extended Sverdrup's (1983) Lake Liming model by incorporating chemical equilibrium routines to eliminate empirical components. The model simulates laboratory column water chemistry profiles (spatially and temporally) and dissolution efficiencies fairly well; however, the model predicted conservative dissolution efficiencies for the study lakes. Time-series water chemistry profiles of the lakes suggest that the atmospheric carbon dioxide intrusion rate was far greater than expected and enhanced dissolution efficiency. Accordingly, a second dose model was developed that incorporated ongoing CO2 intrusion and added flexibility in the handling of solid and dissolved species transport. This revised model simulated whole-lake water chemistry throughout the three-week dissolution period. The Acid Lake Reacidification Model (ALaRM) is a general mass-balance model developed for the temporal prediction of the principal chemical species in both the water column and sediment pore water of small lakes and ponds.
Assessment of chemistry models for compressible reacting flows
NASA Astrophysics Data System (ADS)
Lapointe, Simon; Blanquart, Guillaume
2014-11-01
Recent technological advances in propulsion and power devices and renewed interest in the development of next generation supersonic and hypersonic vehicles have increased the need for detailed understanding of turbulence-combustion interactions in compressible reacting flows. In numerical simulations of such flows, accurate modeling of the fuel chemistry is a critical component of capturing the relevant physics. Various chemical models are currently being used in reacting flow simulations. However, the differences between these models and their impacts on the fluid dynamics in the context of compressible flows are not well understood. In the present work, a numerical code is developed to solve the fully coupled compressible conservation equations for reacting flows. The finite volume code is based on the theoretical and numerical framework developed by Oefelein (Prog. Aero. Sci. 42 (2006) 2-37) and employs an all-Mach-number formulation with dual time-stepping and preconditioning. The numerical approach is tested on turbulent premixed flames at high Karlovitz numbers. Different chemical models of varying complexity and computational cost are used and their effects are compared.
Mansouri, K; Grulke, C M; Richard, A M; Judson, R S; Williams, A J
2016-11-01
The increasing availability of large collections of chemical structures and associated experimental data provides an opportunity to build robust QSAR models for applications in different fields. One common concern is the quality of both the chemical structure information and the associated experimental data. Here we describe the development of an automated KNIME workflow to curate and correct errors in the structure and identity of chemicals using the publicly available PHYSPROP physicochemical properties and environmental fate datasets. The workflow first assembles structure-identity pairs using up to four provided chemical identifiers: chemical name, CASRN, SMILES, and MolBlock. Problems detected included errors and mismatches in chemical structure formats and identifiers, as well as structure validation issues such as hypervalency and incorrect stereochemistry descriptions. Subsequently, a machine learning procedure was applied to evaluate the impact of this curation process. The performance of QSAR models built on only the highest-quality subset of the original dataset was compared with that of models built on the larger curated and corrected dataset. The latter showed statistically improved predictive performance. The final workflow was used to curate the full list of PHYSPROP datasets, and is being made publicly available for further usage and integration by the scientific community.
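One concrete check that identifier-curation workflows of this kind commonly include (shown here purely as an illustration; the published workflow's exact checks may differ) is CAS Registry Number check-digit validation: the final digit must equal the sum of the preceding digits, each multiplied by its 1-based position counted from the right, modulo 10.

```python
def casrn_is_valid(casrn: str) -> bool:
    """Validate a CAS Registry Number of the form NNNNNNN-NN-C using its
    check digit: sum(digit * position-from-right) % 10 == check digit."""
    parts = casrn.split("-")
    if len(parts) != 3 or not all(p.isdigit() for p in parts):
        return False
    digits = parts[0] + parts[1]
    check = int(parts[2])
    total = sum(int(d) * i for i, d in enumerate(reversed(digits), start=1))
    return total % 10 == check
```

Catching a mistyped or transposed CASRN this way prevents a structure from being paired with the wrong identity before any QSAR modeling begins.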
The EPA's Office of Research and Development is embarking on a long term project to develop a Multimedia Integrated Modeling System (MIMS). The system will have capabilities to represent the transport and fate of nutrients and chemical stressors over multiple scales. MIMS will ...
Exposure to endocrine disrupting chemicals can affect reproduction and development in both humans and wildlife. We are developing a mechanistic mathematical model of the hypothalamic-pituitary-gonadal (HPG) axis in female fathead minnows to predict doseresponse and time-course (...
Characterization and prediction of chemical functions and weight fractions in consumer products.
Isaacs, Kristin K; Goldsmith, Michael-Rock; Egeghy, Peter; Phillips, Katherine; Brooks, Raina; Hong, Tao; Wambaugh, John F
2016-01-01
Assessing exposures from the thousands of chemicals in commerce requires quantitative information on the chemical constituents of consumer products. Unfortunately, gaps in available composition data prevent assessment of exposure to chemicals in many products. Here we propose filling these gaps via consideration of chemical functional role. We obtained function information for thousands of chemicals from public sources and used a clustering algorithm to assign chemicals into 35 harmonized function categories (e.g., plasticizers, antimicrobials, solvents). We combined these functions with weight fraction data for 4115 personal care products (PCPs) to characterize the composition of 66 different product categories (e.g., shampoos). We analyzed the combined weight fraction/function dataset using machine learning techniques to develop quantitative structure property relationship (QSPR) classifier models for 22 functions and for weight fraction, based on chemical-specific descriptors (including chemical properties). We applied these classifier models to a library of 10196 data-poor chemicals. Our predictions of chemical function and composition will inform exposure-based screening of chemicals in PCPs for combination with hazard data in risk-based evaluation frameworks. As new information becomes available, this approach can be applied to other classes of products and the chemicals they contain in order to provide essential consumer product data for use in exposure-based chemical prioritization.
Biological profiling and dose-response modeling tools ...
Through its ToxCast project, the U.S. EPA has developed a battery of in vitro high throughput screening (HTS) assays designed to assess the potential toxicity of environmental chemicals. At present, over 1800 chemicals have been tested in up to 600 assays, yielding a large number of concentration-response data sets. Standard processing of these data sets involves finding a best-fitting mathematical model and the set of model parameters that specify it. The model parameters include quantities such as the half-maximal activity concentration (or "AC50") that have biological significance and can be used to inform the efficacy or potency of a given chemical with respect to a given assay. All of these data are processed and stored in an online-accessible database and website: http://actor.epa.gov/dashboard2. Results from these in vitro assays are used in a multitude of ways. New pathways and targets can be identified and incorporated into new or existing adverse outcome pathways (AOPs). Pharmacokinetic models such as those implemented in EPA's HTTK R package can be used to translate an in vitro concentration into an in vivo dose; i.e., one can predict the oral equivalent dose that might be expected to activate a specific biological pathway. Such predicted values can then be compared with estimated actual human exposures to prioritize chemicals for further testing. Any quantitative examination should be accompanied by estimation of uncertainty. We are developing met
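Concentration-response fitting of the kind described above commonly uses a Hill model, from which the AC50 is read off directly as the half-maximal concentration. A minimal sketch (the specific parameter values are illustrative, not ToxCast results):

```python
def hill(conc: float, top: float, ac50: float, n: float = 1.0) -> float:
    """Hill concentration-response model often fit to HTS curves:
    `top` is the maximal response, `ac50` the half-maximal concentration,
    and `n` the Hill slope."""
    if conc <= 0.0:
        return 0.0
    return top / (1.0 + (ac50 / conc) ** n)

# By construction, the response at conc == ac50 is exactly half of `top`.
half_max = hill(2.0, top=100.0, ac50=2.0)
```

In practice the model is fit to noisy assay data by nonlinear least squares, and the fitted AC50 values feed potency comparisons and in vitro-to-in vivo extrapolation.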
Katoh, Masakazu; Hamajima, Fumiyasu; Ogasawara, Takahiro; Hata, Ken-ichiro
2013-12-01
Finding in vitro eye irritation testing alternatives to animal tests such as the Draize eye test, which uses rabbits, is essential from the standpoint of animal welfare. We have developed a reconstructed human corneal epithelial model, the LabCyte CORNEA-MODEL, which has a representative corneal epithelium-like structure. Protocol optimization (a pre-validation study) was carried out in order to establish a new alternative method for eye irritancy evaluation with this model. Based on the results of the optimization experiments, the application period for chemicals was set at 1 min for liquid chemicals or 24 h for solid chemicals, and the post-exposure incubation period was set at 24 h for liquids or zero for solids. If the viability was less than 50%, the chemical was judged to be an eye irritant. Sixty-one chemicals were applied in the optimized protocol using the LabCyte CORNEA-MODEL, and the results were evaluated against in vivo data. The predictions of the optimized LabCyte CORNEA-MODEL eye irritation test method were highly correlated with in vivo eye irritation (sensitivity 100%, specificity 80.0%, and accuracy 91.8%). These results suggest that the LabCyte CORNEA-MODEL eye irritation test could be useful as an alternative to the Draize eye test. Copyright © 2013 Elsevier Ltd. All rights reserved.
Indoor Air Nuclear, Biological, and Chemical Health Modeling and Assessment System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stenner, Robert D.; Hadley, Donald L.; Armstrong, Peter R.
2001-03-01
Indoor air quality effects on human health are of increasing concern to public health agencies and building owners. The prevention and treatment of 'sick building' syndrome and the spread of air-borne diseases in hospitals, for example, are well known priorities. However, increasing attention is being directed to the vulnerability of our public buildings/places, public security and national defense facilities to terrorist attack or the accidental release of air-borne biological pathogens, harmful chemicals, or radioactive contaminants. The Indoor Air Nuclear, Biological, and Chemical Health Modeling and Assessment System (IA-NBC-HMAS) was developed to serve as a health impact analysis tool for use in addressing these concerns. The overall goal was to develop a user-friendly, fully functional prototype Health Modeling and Assessment system, which will operate under the PNNL FRAMES system for ease of use and to maximize its integration with other modeling and assessment capabilities accessible within the FRAMES system (e.g., ambient air fate and transport models, water-borne fate and transport models, Physiologically Based Pharmacokinetic models, etc.). The prototype IA-NBC-HMAS is designed to serve as a functional Health Modeling and Assessment system that can be easily tailored to meet specific building analysis needs of a customer. The prototype system was developed and tested using an actual building (i.e., the Churchville Building located at the Aberdeen Proving Ground) and release scenario (i.e., the release and measurement of tracer materials within the building) to ensure realism and practicality in the design and development of the prototype system. A user-friendly "demo" accompanies this report to allow the reader the opportunity for a hands-on review of the prototype system's capability.
Intra-Engine Trace Species Chemistry
NASA Technical Reports Server (NTRS)
Waitz, Ian A.; Lukachko, S. P.; Chobot, A.; Miake-Lye, R. C.; Brown, R.
2002-01-01
Prompted by the needs of downstream plume-wake models, the Massachusetts Institute of Technology (MIT) and Aerodyne Research Incorporated (ARI) initiated a collaborative effort, with funding from the NASA AEAP, to develop tools that would assist in understanding the fundamental drivers of chemical change within the intra-engine exhaust flow path. Efforts have been focused on the development of a modeling methodology that can adequately investigate the complex intra-engine environment. Over the history of this project, our research has increasingly pointed to the intra-engine environment as a possible site for important trace chemical activity. Modeling studies we initiated for the turbine and exhaust nozzle have contributed several important capabilities to the assessment of the atmospheric effects of aviation. These include a more complete understanding of aerosol precursor production, improved initial conditions for plume-wake modeling studies, and a more comprehensive analysis of ground-based test cell and in-flight exhaust measurement data. In addition, establishing a physical understanding of important flow and chemical processes through computational investigations may eventually assist in the design of engines to reduce undesirable species.
Wang, Wenyi; Kim, Marlene T.; Sedykh, Alexander
2015-01-01
Purpose: Experimental Blood-Brain Barrier (BBB) permeability models for drug molecules are expensive and time-consuming. As alternative methods, several traditional Quantitative Structure-Activity Relationship (QSAR) models have been developed previously. In this study, we aimed to improve the predictivity of traditional QSAR BBB permeability models by employing relevant public bio-assay data in the modeling process. Methods: We compiled a BBB permeability database consisting of 439 unique compounds from various resources. The database was split into a modeling set of 341 compounds and a validation set of 98 compounds. A consensus QSAR modeling workflow was employed on the modeling set to develop various QSAR models. A five-fold cross-validation approach was used to validate the developed models, and the resulting models were used to predict the external validation set compounds. Furthermore, we used previously published membrane transporter models to generate relevant transporter profiles for target compounds. The transporter profiles were used as additional biological descriptors to develop hybrid QSAR BBB models. Results: The consensus QSAR models have R2=0.638 for five-fold cross-validation and R2=0.504 for external validation. The consensus model developed by pooling chemical and transporter descriptors showed better predictivity (R2=0.646 for five-fold cross-validation and R2=0.526 for external validation). Moreover, several external bio-assays that correlate with BBB permeability were identified using our automatic profiling tool. Conclusions: The BBB permeability models developed in this study can be useful for early evaluation of new compounds (e.g., new drug candidates). The combination of chemical and biological descriptors shows a promising direction to improve the current traditional QSAR models. PMID:25862462
Fu, Zhiqiang; Chen, Jingwen; Li, Xuehua; Wang, Ya'nan; Yu, Haiying
2016-04-01
The octanol-air partition coefficient (KOA) is needed for assessing the multimedia transport and bioaccumulability of organic chemicals in the environment. As experimental determination of KOA for various chemicals is costly and laborious, development of KOA estimation methods is necessary. We investigated three methods for KOA prediction: conventional quantitative structure-activity relationship (QSAR) models based on molecular structural descriptors, group contribution models based on atom-centered fragments, and a novel model that predicts KOA via the solvation free energy from the air to the octanol phase (ΔGO(0)), with a collection of 939 experimental KOA values for 379 compounds at different temperatures (263.15-323.15 K) as validation or training sets. The developed models were evaluated with the OECD guidelines on QSAR model validation and applicability domain (AD) description. Results showed that although the ΔGO(0) model is theoretically sound and has a broad AD, the prediction accuracy of the model is the poorest. The QSAR models perform better than the group contribution models, and have similar predictability and accuracy to the conventional method that estimates KOA from the octanol-water partition coefficient and Henry's law constant. One QSAR model, which can predict KOA at different temperatures, was recommended for use in assessing the long-range transport potential of chemicals. Copyright © 2016 Elsevier Ltd. All rights reserved.
Furuhama, A; Toida, T; Nishikawa, N; Aoki, Y; Yoshioka, Y; Shiraishi, H
2010-07-01
The KAshinhou Tool for Ecotoxicity (KATE) system, including ecotoxicity quantitative structure-activity relationship (QSAR) models, was developed by the Japanese National Institute for Environmental Studies (NIES) using the database of aquatic toxicity results gathered by the Japanese Ministry of the Environment and the US EPA fathead minnow database. In this system, chemicals can be entered according to their one-dimensional structures and classified by substructure. The QSAR equations for predicting the toxicity of a chemical compound assume a linear correlation between its log P value and its aquatic toxicity. KATE uses a structural domain called C-judgement, defined by the substructures of specified functional groups in the QSAR models. Internal validation by the leave-one-out method confirms that the QSAR equations, with r(2) > 0.7, RMSE
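The linear log P-toxicity relationship that KATE's QSAR equations assume can be illustrated with an ordinary least-squares fit and the r(2) statistic used in its internal validation; the data points below are synthetic, not drawn from the NIES or fathead minnow databases.

```python
import numpy as np

# Illustrative baseline data: hydrophobicity (log P) vs toxicity log(1/LC50)
logp = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
log_inv_lc50 = np.array([1.5, 2.4, 3.2, 4.1, 5.0])

# Fit log(1/LC50) = a*logP + b by ordinary least squares
a, b = np.polyfit(logp, log_inv_lc50, 1)

# Coefficient of determination, the kind of statistic KATE validates against
pred = a * logp + b
ss_res = np.sum((log_inv_lc50 - pred) ** 2)
ss_tot = np.sum((log_inv_lc50 - log_inv_lc50.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
```

A substructure-based domain check like C-judgement would then restrict each fitted equation to the chemical class it was trained on.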
Papoulias, Diana M.; Noltie, Douglas B.; Tillitt, Donald E.
2000-01-01
A model system was characterized which may be used as an in vivo screen for effects of chemicals or environmental mixtures on sexual differentiation and development of reproductive organs and gametes. We evaluated the effects of a model environmental estrogen, ethinyl estradiol (EE2), on the d-rR strain of medaka, Oryzias latipes, using a nano-injection exposure. Gonad histopathology indicated that a single injection of 0.5–2.5 ng EE2/egg can cause phenotypic sex-reversal of genetic males to females. Sex-reversals could be detected as early as 7 days post-hatch. Sex-reversed males had female-typical duct development and the secondary sex characteristics we measured were generally consistent with phenotype, with the exception of a few EE2-exposed XX and XY females which possessed ambiguous anal fins. Using discriminant analysis, we determined that the presence or absence of the secondary sex characteristic, a dorsal fin notch, was a very reliable indicator of gonadal sex. No instances of gonadal intersexes were observed. Ethinyl estradiol also appeared to reduce growth but not condition (weight-at-length) and exposed XX females appeared to have a higher incidence of atretic follicles relative to controls. Our results suggest that estrogenic chemicals may influence sexual differentiation and development and that the medaka model is well suited to assessing these effects.
Computational modeling of the amphibian thyroid axis ...
In vitro screening of chemicals for bioactivity together with computational modeling are beginning to replace animal toxicity testing in support of chemical risk assessment. To facilitate this transition, an amphibian thyroid axis model has been developed to describe thyroid homeostasis during Xenopus laevis pro-metamorphosis. The model simulates the dynamic relationships of normal thyroid biology throughout this critical period of amphibian development and includes molecular initiating events (MIEs) for thyroid axis disruption to allow in silico simulations of hormone levels following chemical perturbations. One MIE that has been formally described using the adverse outcome pathway (AOP) framework is thyroperoxidase (TPO) inhibition. The goal of this study was to refine the model parameters and validate model predictions by generating dose-response and time-course biochemical data following exposure to three TPO inhibitors, methimazole, 6-propylthiouracil and 2-mercaptobenzothiazole. Key model variables including gland and blood thyroid hormone (TH) levels were compared to empirical values measured in biological samples at 2, 4, 7 and 10 days following initiation of exposure at Nieuwkoop and Faber (NF) stage 54 (onset of pro-metamorphosis). The secondary objective of these studies was to relate depleted blood TH levels to delayed metamorphosis, the adverse apical outcome. Delayed metamorphosis was evaluated by continuing exposure with a subset of larvae until a
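The link between TPO inhibition and depleted thyroid hormone (TH) levels can be caricatured with a one-compartment model in which fractional enzyme inhibition scales hormone synthesis. This is a deliberately minimal sketch, not the published amphibian thyroid axis model; all rate constants are hypothetical.

```python
from scipy.integrate import solve_ivp

def th_dynamics(t, y, k_syn, k_elim, inhibition):
    """dTH/dt = synthesis scaled by fractional TPO inhibition
    minus first-order elimination."""
    return k_syn * (1.0 - inhibition) - k_elim * y

k_syn, k_elim = 1.0, 0.1          # hypothetical rate constants
baseline = k_syn / k_elim          # pre-exposure steady state = 10.0

# 50% TPO inhibition starting from the baseline level
sol = solve_ivp(th_dynamics, (0.0, 100.0), [baseline],
                args=(k_syn, k_elim, 0.5))
final_th = sol.y[0, -1]            # approaches the new steady state 5.0
```

Even this toy version reproduces the qualitative behavior the study measures: TH declines toward a lower steady state on a timescale set by the elimination rate, which is why gland and blood TH were sampled at several days post-exposure.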
US EPA - A*Star Partnership - Accelerating the Acceptance of ...
The path for incorporating new alternative methods and technologies into quantitative chemical risk assessment poses a diverse set of scientific challenges. Some of these challenges include development of relevant and predictive test systems and computational models to integrate and extrapolate experimental data, and rapid characterization and acceptance of these systems and models. The series of presentations will highlight a collaborative effort between the U.S. Environmental Protection Agency (EPA) and the Agency for Science, Technology and Research (A*STAR) that is focused on developing and applying experimental and computational models for predicting chemical-induced liver and kidney toxicity, brain angiogenesis, and blood-brain-barrier formation. In addressing some of these challenges, the U.S. EPA and A*STAR collaboration will provide a glimpse of what chemical risk assessments could look like in the 21st century. Presentation on US EPA – A*STAR Partnership at international symposium on Accelerating the acceptance of next-generation sciences and their application to regulatory risk assessment in Singapore.
Identification of consensus biomarkers for predicting non-genotoxic hepatocarcinogens
Huang, Shan-Han; Tung, Chun-Wei
2017-01-01
The assessment of non-genotoxic hepatocarcinogens (NGHCs) currently relies on two-year rodent bioassays. Toxicogenomics biomarkers provide a potential alternative method for the prioritization of NGHCs that could be useful for risk assessment. However, previous studies using inconsistently classified chemicals as the training set and a single microarray dataset identified no consensus biomarkers. In this study, four consensus biomarkers (A2m, Ca3, Cxcl1, and Cyp8b1) were identified from four large-scale microarray datasets of the one-day single maximum tolerated dose and a large set of chemicals without inconsistent classifications. Machine learning techniques were subsequently applied to develop prediction models for NGHCs. The final bagging decision tree models were constructed with an average AUC performance of 0.803 for an independent test. A set of 16 chemicals with controversial classifications were reclassified according to the consensus biomarkers. The developed prediction models and identified consensus biomarkers are expected to be potential alternative methods for prioritization of NGHCs for further experimental validation. PMID:28117354
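The AUC statistic used to evaluate the bagging models is equivalent to the Mann-Whitney probability that a randomly chosen positive is scored above a randomly chosen negative; a small self-contained illustration (the scores and labels below are made up):

```python
import numpy as np

def auc_score(labels, scores):
    """AUC via the Mann-Whitney U statistic: the fraction of
    (positive, negative) pairs where the positive is ranked higher;
    ties count as half a win."""
    labels = np.asarray(labels, dtype=bool)
    scores = np.asarray(scores, dtype=float)
    pos, neg = scores[labels], scores[~labels]
    wins = (pos[:, None] > neg[None, :]).sum() \
         + 0.5 * (pos[:, None] == neg[None, :]).sum()
    return wins / (len(pos) * len(neg))

# Hypothetical classifier scores: four NGHC positives, four negatives
labels = [1, 1, 1, 1, 0, 0, 0, 0]
scores = [0.9, 0.8, 0.6, 0.4, 0.7, 0.3, 0.2, 0.1]
auc = auc_score(labels, scores)  # 14 of 16 pairs correctly ordered
```

An AUC of 0.803, as reported for the independent test, means roughly four of every five such pairs are ordered correctly by the model.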
NASA Astrophysics Data System (ADS)
Johnson, Ryan Federick; Chelliah, Harsha Kumar
2017-01-01
For a range of flow and chemical timescales, numerical simulations of two-dimensional laminar flow over a reacting carbon surface were performed to understand further the complex coupling between heterogeneous and homogeneous reactions. An open-source computational package (OpenFOAM®) was used with previously developed lumped heterogeneous reaction models for carbon surfaces and a detailed homogeneous reaction model for CO oxidation. The influence of finite-rate chemical kinetics was explored by varying the surface temperatures from 1800 to 2600 K, while flow residence time effects were explored by varying the free-stream velocity up to 50 m/s. The reacting boundary layer structure dependence on the residence time was analysed by extracting the ratio of chemical source and species diffusion terms. The important contributions of radical species reactions on overall carbon removal rate, which is often neglected in multi-dimensional simulations, are highlighted. The results provide a framework for future development and validation of lumped heterogeneous reaction models based on multi-dimensional reacting flow configurations.
Thermal and chemical convection in planetary mantles
NASA Technical Reports Server (NTRS)
Dupeyrat, L.; Sotin, C.; Parmentier, E. M.
1995-01-01
Melting of the upper mantle and extraction of melt result in the formation of a less dense depleted mantle. This paper describes a series of two-dimensional models that investigate the effects of chemical buoyancy induced by these density variations. A tracer-particle method has been set up to follow as closely as possible the chemical state of the mantle and to model the chemical buoyant force at each grid point. Each series of models provides the evolution with time of magma production, crustal thickness, surface heat flux, and thermal and chemical state of the mantle. First, models that do not take into account the displacement of plates at the surface of Earth demonstrate that chemical buoyancy has an important effect on the geometry of convection. Subsequent models include horizontal motion of plates 5000 km wide, and recycling of crust is taken into account. For a sufficiently high plate velocity, which depends on the thermal Rayleigh number, the cell's size is strongly coupled with the plate's size. Plate motion forces chemically buoyant material to sink into the mantle. The positive chemical buoyancy then yields upwelling as depleted mantle reaches the interface between the upper and the lower mantle. This process is very efficient in mixing the depleted and undepleted mantle at the scale of the grid spacing, since these zones of upwelling disrupt the large convective flow. At low spreading rates, zones of upwelling develop quickly, melting occurs, and the model predicts intraplate volcanism by melting of subducted crust. At fast spreading rates, depleted mantle also favors the formation of these zones of upwelling, but they are not strong enough to yield partial melting. Their rapid displacement toward the ridge contributes to faster large-scale homogenization.
Sorich, Michael J; McKinnon, Ross A; Miners, John O; Winkler, David A; Smith, Paul A
2004-10-07
This study aimed to evaluate in silico models based on quantum chemical (QC) descriptors derived using the electronegativity equalization method (EEM) and to assess the use of QC properties to predict chemical metabolism by human UDP-glucuronosyltransferase (UGT) isoforms. Various EEM-derived QC molecular descriptors were calculated for known UGT substrates and nonsubstrates. Classification models were developed using support vector machine and partial least squares discriminant analysis. In general, the most predictive models were generated with the support vector machine. Combining QC and 2D descriptors (from previous work) using a consensus approach resulted in a statistically significant improvement in predictivity (to 84%) over both the QC and 2D models and the other methods of combining the descriptors. EEM-derived QC descriptors were shown to be both highly predictive and computationally efficient. It is likely that EEM-derived QC properties will be generally useful for predicting ADMET and physicochemical properties during drug discovery.
Andersen, Mathias Bækbo; Frey, Jared; Pennathur, Sumita; Bruus, Henrik
2011-01-01
We present a combined theoretical and experimental analysis of the solid-liquid interface of fused-silica nanofabricated channels with and without a hydrophilic 3-cyanopropyldimethylchlorosilane (cyanosilane) coating. We develop a model that relaxes the assumption that the surface parameters C(1), C(2), and pK(+) are constant and independent of surface composition. Our theoretical model consists of three parts: (i) a chemical equilibrium model of the bare or coated wall, (ii) a chemical equilibrium model of the buffered bulk electrolyte, and (iii) a self-consistent Gouy-Chapman-Stern triple-layer model of the electrochemical double layer coupling these two equilibrium models. To validate our model, we used both pH-sensitive dye-based capillary filling experiments as well as electro-osmotic current-monitoring measurements. Using our model we predict the dependence of ζ potential, surface charge density, and capillary filling length ratio on ionic strength for different surface compositions, which can be difficult to achieve otherwise. Copyright © 2010 Elsevier Inc. All rights reserved.
Predictive QSAR modeling workflow, model applicability domains, and virtual screening.
Tropsha, Alexander; Golbraikh, Alexander
2007-01-01
Quantitative Structure Activity Relationship (QSAR) modeling has been traditionally applied as an evaluative approach, i.e., with the focus on developing retrospective and explanatory models of existing data. Model extrapolation was considered, if at all, only in a hypothetical sense, in terms of potential modifications of known biologically active chemicals that could improve compounds' activity. This critical review re-examines the strategy and the output of modern QSAR modeling approaches. We provide examples and arguments suggesting that current methodologies may afford robust and validated models capable of accurate prediction of compound properties for molecules not included in the training sets. We discuss a data-analytical modeling workflow developed in our laboratory that incorporates modules for combinatorial QSAR model development (i.e., using all possible binary combinations of available descriptor sets and statistical data modeling techniques), rigorous model validation, and virtual screening of available chemical databases to identify novel biologically active compounds. Our approach places particular emphasis on model validation as well as the need to define model applicability domains in the chemistry space. We present examples of studies where the application of rigorously validated QSAR models to virtual screening identified computational hits that were confirmed by subsequent experimental investigations. The emerging focus of QSAR modeling on target property forecasting brings it forward as a predictive, as opposed to evaluative, modeling approach.
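One common way to operationalize an applicability domain in descriptor space is a distance-to-training-set cutoff. The sketch below uses a k-nearest-neighbour heuristic over an illustrative grid of "training compounds"; it is one of several AD formulations, not necessarily the specific one advocated in the review.

```python
import numpy as np

def ad_threshold(train, k=3, z=1.0):
    """Distance-based applicability domain (AD) cutoff: the mean plus z
    standard deviations of each training compound's mean distance to its
    k nearest training neighbours."""
    d = np.linalg.norm(train[:, None, :] - train[None, :, :], axis=2)
    np.fill_diagonal(d, np.inf)                      # exclude self-distances
    knn_mean = np.sort(d, axis=1)[:, :k].mean(axis=1)
    return knn_mean.mean() + z * knn_mean.std()

def in_domain(train, query, k=3, z=1.0):
    """A query is inside the AD if its mean distance to the k nearest
    training compounds does not exceed the training-set threshold."""
    d = np.sort(np.linalg.norm(train - query, axis=1))[:k].mean()
    return d <= ad_threshold(train, k, z)

# Illustrative 2-D descriptor space: a 6x5 grid of training compounds
train = np.array([[i, j] for i in range(6) for j in range(5)], dtype=float)
```

Predictions for queries flagged as out-of-domain would be discarded during virtual screening rather than reported.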
Overview of chemical vapor infiltration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Besmann, T.M.; Stinton, D.P.; Lowden, R.A.
1993-06-01
Chemical vapor infiltration (CVI) is developing into a commercially important method for the fabrication of continuous filament ceramic composites. Current efforts are focused on the development of an improved understanding of the various processes in CVI and its modeling. New approaches to CVI are being explored, including pressure pulse infiltration and microwave heating. Material development is also proceeding with emphasis on improving the oxidation resistance of the interfacial layer between the fiber and matrix. This paper briefly reviews these subjects, indicating the current state of the science and technology.
Optical detection of chemical warfare agents and toxic industrial chemicals
NASA Astrophysics Data System (ADS)
Webber, Michael E.; Pushkarsky, Michael B.; Patel, C. Kumar N.
2004-12-01
We present an analytical model evaluating the suitability of optical-absorption-based spectroscopic techniques for detection of chemical warfare agents (CWAs) and toxic industrial chemicals (TICs) in ambient air. The sensor performance is modeled by simulating absorption spectra of a sample containing both the target and a multitude of interfering species, as well as appropriate stochastic noise, and determining the target concentrations from the simulated spectra via a least-squares fit (LSF) algorithm. The distribution of the LSF target concentrations determines the sensor sensitivity, probability of false positives (PFP), and probability of false negatives (PFN). The model was applied to a CO2-laser-based photoacoustic (L-PAS) CWA sensor and predicted single-digit ppb sensitivity with very low PFP rates in the presence of a significant amount of interferents. This approach will be useful for assessing sensor performance by developers and users alike; it also provides a methodology for inter-comparison of different sensing technologies.
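The retrieval step of such a model can be sketched in a few lines: given a matrix of absorption cross-sections for the target and interfering species, the concentrations that best explain a noisy simulated spectrum are recovered by linear least squares. All spectra here are synthetic random values, not real CWA cross-sections.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic absorption cross-sections over 50 spectral channels;
# column 0 is the target species, column 1 an interferent.
n_channels = 50
A = rng.uniform(0.1, 1.0, size=(n_channels, 2))

true_conc = np.array([5.0, 20.0])                   # ppb, target / interferent
noise = rng.normal(0.0, 0.01, n_channels)           # stochastic measurement noise
spectrum = A @ true_conc + noise                    # simulated measured spectrum

# Least-squares retrieval of both concentrations from the spectrum
est, *_ = np.linalg.lstsq(A, spectrum, rcond=None)
```

Repeating this retrieval over many noise realizations yields the distribution of estimated target concentrations from which sensitivity, PFP, and PFN are derived in the paper's framework.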
Classification of Chemicals Based On Structured Toxicity ...
Thirty years and millions of dollars' worth of pesticide registration toxicity studies, historically stored as hardcopy and scanned documents, have been digitized into highly standardized and structured toxicity data within the Toxicity Reference Database (ToxRefDB). Toxicity-based classifications of chemicals were performed as a model application of ToxRefDB. These endpoints will ultimately provide the anchoring toxicity information for the development of predictive models and biological signatures utilizing in vitro assay data. Utilizing query and structured data mining approaches, toxicity profiles were uniformly generated for more than 300 chemicals. Based on observation rate, species concordance, and regulatory relevance, individual and aggregated effects have been selected to classify the chemicals, providing a set of predictable endpoints. ToxRefDB exhibits the utility of transforming unstructured toxicity data into structured data and, furthermore, into computable outputs, and serves as a model for applying such data to address modern toxicological problems.
Stryjewska, Agnieszka; Kiepura, Katarzyna; Librowski, Tadeusz; Lochyński, Stanisław
2013-01-01
Industrial biotechnology has been defined as the use and application of biotechnology for the sustainable processing and production of chemicals, materials and fuels. It makes use of biocatalysts such as microbial communities, whole-cell microorganisms or purified enzymes. In the review these processes are described. Drug design is an iterative process which begins when a chemist identifies a compound that displays an interesting biological profile and ends when both the activity profile and the chemical synthesis of the new chemical entity are optimized. Traditional approaches to drug discovery rely on a stepwise synthesis and screening program for large numbers of compounds to optimize activity profiles. Over the past ten to twenty years, scientists have used computer models of new chemical entities to help define activity profiles, geometries and reactivities. This article introduces inter alia the concepts of molecular modelling and contains references for further reading.
NASA Astrophysics Data System (ADS)
Bizzi, S.; Surridge, B.; Lerner, D. N.
2009-04-01
River ecosystems represent complex networks of interacting biological, chemical and geomorphological processes. These processes generate spatial and temporal patterns in biological, chemical and geomorphological variables, and a growing number of these variables are now being used to characterise the status of rivers. However, integrated analyses of these biological-chemical-geomorphological networks have rarely been undertaken, and as a result our knowledge of the underlying processes and how they generate the resulting patterns remains weak. The apparent complexity of the networks involved, and the lack of coherent datasets, represent two key challenges to such analyses. In this paper we describe the application of a novel technique, Structural Equation Modelling (SEM), to the investigation of biological, chemical and geomorphological data collected from rivers across England and Wales. The SEM approach is a multivariate statistical technique enabling simultaneous examination of direct and indirect relationships across a network of variables. Further, SEM allows a-priori conceptual or theoretical models to be tested against available data. This is a significant departure from the solely exploratory analyses which characterise other multivariate techniques. We took biological, chemical and river habitat survey data collected by the Environment Agency for 400 sites in rivers spread across England and Wales, and created a single, coherent dataset suitable for SEM analyses. Biological data cover benthic macroinvertebrates, chemical data relate to a range of standard parameters (e.g. BOD, dissolved oxygen and phosphate concentration), and geomorphological data cover factors such as river typology, substrate material and degree of physical modification. 
We developed a number of a-priori conceptual models, reflecting current research questions or existing knowledge, and tested the ability of these conceptual models to explain the variance and covariance within the dataset. The conceptual models we developed were able to correctly explain the variance and covariance shown by the datasets, proving to be a relevant representation of the processes involved. The models explained 65% of the variance in indices describing benthic macroinvertebrate communities. Dissolved oxygen was of primary importance, but geomorphological factors, including river habitat type and degree of habitat degradation, also had significant explanatory power. The addition of spatial variables, such as latitude or longitude, did not provide additional explanatory power. This suggests that the variables already included in the models effectively represented the eco-regions across which our data were distributed. The models produced new insights into the relative importance of chemical and geomorphological factors for river macroinvertebrate communities. The SEM technique proved a powerful tool for exploring complex biological-chemical-geomorphological networks, for example being able to deal with the co-correlations that are common in rivers due to multiple feedback mechanisms.
Use of Chemical Mixtures to Differentiate Mechanisms of Endocrine Action in a Small Fish Model
Various assays with adult fish have been developed to identify potential endocrine-disrupting chemicals (EDCs) which may cause toxicity via alterations in the hypothalamic-pituitary-gonadal (HPG) axis via different mechanisms/modes of action (MOA). These assays can be sensitive ...
Chemically induced vascular toxicity during embryonic development can result in a wide range of adverse prenatal outcomes. We used information from genetic mouse models linked to phenotypic outcomes and a vascular toxicity knowledge base to construct an embryonic vascular disrupt...
Using in vitro Dose-Response Profiles to Enhance QSAR Modeling of in vivo Toxicity
To develop effective means for rapid toxicity evaluation of environmental chemicals, the Tox21 partnership among the National Toxicology Program (NTP), NIH Chemical Genomics Center, and National Center for Computational Toxicology (NCCT) at the US EPA are conducting a number of ...
New methods are needed to screen thousands of environmental chemicals for toxicity, including developmental neurotoxicity. In vitro, cell-based assays that model key cellular events have been proposed for high throughput screening of chemicals for developmental neurotoxicity. Whi...