Advancing the Adverse Outcome Pathway Framework - an ...
The ability of scientists to conduct whole organism toxicity tests to understand chemical safety has been significantly outpaced by the rapid synthesis of new chemicals. Therefore, to increase efficiencies in chemical risk assessment, scientists are turning to mechanistic-based studies, making greater use of in vitro and in silico methods, to screen for potential environmental and human health hazards. A framework that has gained traction for capturing available knowledge describing the linkage between mechanistic data and apical toxicity endpoints, required for regulatory assessments, is the adverse outcome pathway (AOP). A number of international activities have focused on AOP development and plausible applications to regulatory decision-making. These interactions have prompted dialog between research scientists and regulatory communities to consider how best to use the AOP framework in risk assessment. While expert-facilitated discussions have been instrumental in moving the science of AOPs forward, it was recognized that a survey of the broader scientific community would aid in identifying shortcomings and guiding future initiatives for the AOP framework. To that end, a 'Horizon Scanning' exercise was conducted to solicit questions from the global scientific and regulatory communities concerning the challenges or limitations that must be addressed to realize the full potential of the AOP framework in research and regulatory decision making.
Mechanistic explanation, cognitive systems demarcation, and extended cognition.
van Eck, Dingmar; Looren de Jong, Huib
2016-10-01
Approaches to the Internalism-Externalism controversy in the philosophy of mind often involve both (broadly) metaphysical and explanatory considerations. Whereas originally most emphasis seems to have been placed on metaphysical concerns, recently the explanation angle is getting more attention. Explanatory considerations promise to offer more neutral grounds for cognitive systems demarcation than (broadly) metaphysical ones. However, it has been argued that explanation-based approaches are incapable of determining the plausibility of internalist-based conceptions of cognition vis-à-vis externalist ones. On this perspective, improved metaphysics is the route along which to solve the Internalist-Externalist stalemate. In this paper we challenge this claim. Although we agree that explanation-orientated approaches have indeed so far failed to deliver solid means for cognitive system demarcation, we elaborate a more promising explanation-oriented framework to address this issue. We argue that the mutual manipulability account of constitutive relevance in mechanisms, extended with the criterion of 'fat-handedness', is capable of plausibly addressing the cognitive systems demarcation problem, and thus able to decide on the explanatory traction of Internalist vs. Externalist conceptions, on a case-by-case basis. Our analysis also highlights why some other recent mechanistic takes on the problem of cognitive systems demarcation have been unsuccessful. We illustrate our claims with a case on gestures and learning. Copyright © 2016 Elsevier Ltd. All rights reserved.
Emergence of tissue polarization from synergy of intracellular and extracellular auxin signaling
Wabnik, Krzysztof; Kleine-Vehn, Jürgen; Balla, Jozef; Sauer, Michael; Naramoto, Satoshi; Reinöhl, Vilém; Merks, Roeland M H; Govaerts, Willy; Friml, Jiří
2010-01-01
Plant development is exceptionally flexible as manifested by its potential for organogenesis and regeneration, which are processes involving rearrangements of tissue polarities. Fundamental questions concern how individual cells can polarize in a coordinated manner to integrate into the multicellular context. In canalization models, the signaling molecule auxin acts as a polarizing cue, and feedback on the intercellular auxin flow is key for synchronized polarity rearrangements. We provide a novel mechanistic framework for canalization, based on up-to-date experimental data and minimal, biologically plausible assumptions. Our model combines the intracellular auxin signaling for expression of PINFORMED (PIN) auxin transporters and the theoretical postulation of extracellular auxin signaling for modulation of PIN subcellular dynamics. Computer simulations faithfully and robustly recapitulated the experimentally observed patterns of tissue polarity and asymmetric auxin distribution during formation and regeneration of vascular systems and during the competitive regulation of shoot branching by apical dominance. Additionally, our model generated new predictions that could be experimentally validated, highlighting a mechanistically conceivable explanation for the PIN polarization and canalization of the auxin flow in plants. PMID:21179019
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goodman, Julie, E-mail: jgoodman@gradientcorp.com
Background: The International Agency for Research on Cancer (IARC) recently developed a framework for evaluating mechanistic evidence that includes a list of 10 key characteristics of carcinogens. This framework is useful for identifying and organizing large bodies of literature on carcinogenic mechanisms, but it lacks sufficient guidance for conducting evaluations that fully integrate mechanistic evidence into hazard assessments. Objectives: We summarize the framework, and suggest approaches to strengthen the evaluation of mechanistic evidence using this framework. Discussion: While the framework is useful for organizing mechanistic evidence, its lack of guidance for implementation limits its utility for understanding human carcinogenic potential. Specifically, it does not include explicit guidance for evaluating the biological significance of mechanistic endpoints, inter- and intra-individual variability, or study quality and relevance. It also does not explicitly address how mechanistic evidence should be integrated with other realms of evidence. Because mechanistic evidence is critical to understanding human cancer hazards, we recommend that IARC develop transparent and systematic guidelines for the use of this framework so that mechanistic evidence will be evaluated and integrated in a robust manner, and concurrently with other realms of evidence, to reach a final human cancer hazard conclusion. Conclusions: IARC does not currently provide a standardized approach to evaluating mechanistic evidence. Incorporating the recommendations discussed here will make IARC analyses of mechanistic evidence more transparent, and lead to assessments of cancer hazards that reflect the weight of the scientific evidence and allow for scientifically defensible decision-making. - Highlights: • IARC has a revised framework for evaluating literature on carcinogenic mechanisms. • The framework is based on 10 key characteristics of carcinogens. • IARC should develop transparent and systematic guidelines for using the framework. • It should better address biological significance, study quality, and relevance. • It should better address integrating mechanistic evidence with other evidence.
COLLABORATION ON NHEERL EPIDEMIOLOGY STUDIES
This task will continue ORD's efforts to develop a biologically plausible, quantitative health risk model for particulate matter (PM) based on epidemiological, toxicological, and mechanistic studies using matched exposure assessments. The NERL, in collaboration with the NHEERL, ...
The cultural evolution of fertility decline
Colleran, Heidi
2016-01-01
Cultural evolutionists have long been interested in the problem of why fertility declines as populations develop. By outlining plausible mechanistic links between individual decision-making, information flow in populations and competition between groups, models of cultural evolution offer a novel and powerful approach for integrating multiple levels of explanation of fertility transitions. However, only a modest number of models have been published. Their assumptions often differ from those in other evolutionary approaches to social behaviour, but their empirical predictions are often similar. Here I offer the first overview of cultural evolutionary research on demographic transition, critically compare it with approaches taken by other evolutionary researchers, identify gaps and overlaps, and highlight parallel debates in demography. I suggest that researchers divide their labour between three distinct phases of fertility decline—the origin, spread and maintenance of low fertility—each of which may be driven by different causal processes, at different scales, requiring different theoretical and empirical tools. A comparative, multi-level and mechanistic framework is essential for elucidating both the evolved aspects of our psychology that govern reproductive decision-making, and the social, ecological and cultural contingencies that precipitate and sustain fertility decline. PMID:27022079
The challenge of risk characterization: current practice and future directions.
Gray, G M; Cohen, J T; Graham, J D
1993-01-01
Risk characterization is perhaps the most important part of risk assessment. As currently practiced, risk characterizations do not convey the degree of uncertainty in a risk estimate to risk managers, Congress, the press, and the public. Here, we use a framework put forth by an ad hoc study group of industry and government scientists and academics to critique the risk characterizations contained in two risk assessments of gasoline vapor. After discussing the strengths and weaknesses of each assessment's risk characterization, we detail an alternative approach that conveys estimates in the form of a probability distribution. The distributional approach can make use of all relevant scientific data and knowledge, including alternative data sets and all plausible mechanistic theories of carcinogenesis. As a result, this approach facilitates better public health decisions than current risk characterization procedures. We discuss methodological issues, as well as strengths and weaknesses of the distributional approach. PMID:8020444
Merkle, Jerod A.; Cross, Paul C.; Scurlock, Brandon M.; Cole, Eric K.; Courtemanch, Alyson B.; Dewey, Sarah R.; Kauffman, Matthew J.
2018-01-01
Disease models typically focus on temporal dynamics of infection, while often neglecting environmental processes that determine host movement. In many systems, however, temporal disease dynamics may be slow compared to the scale at which environmental conditions alter host space-use and accelerate disease transmission. Using a mechanistic movement modelling approach, we made space-use predictions of a mobile host (elk [Cervus canadensis] carrying the bacterial disease brucellosis) under environmental conditions that change daily and annually (e.g., plant phenology, snow depth), and we used these predictions to infer how spring phenology influences the risk of brucellosis transmission from elk (through aborted foetuses) to livestock in the Greater Yellowstone Ecosystem. Using data from 288 female elk monitored with GPS collars, we fit step selection functions (SSFs) during the spring abortion season and then implemented a master equation approach to translate SSFs into predictions of daily elk distribution for five plausible winter weather scenarios (from a heavy snow year to an extreme winter drought year). We predicted abortion events by combining elk distributions with empirical estimates of daily abortion rates, spatially varying elk seroprevalence and elk population counts. Our results reveal strong spatial variation in disease transmission risk at daily and annual scales that is strongly governed by variation in host movement in response to spring phenology. For example, in comparison with an average snow year, in years with early snowmelt 64% of the abortions that would otherwise occur on feedgrounds are predicted to shift mainly to public lands and, to a lesser extent, to private lands. Synthesis and applications: Linking mechanistic models of host movement with disease dynamics leads to a novel bridge between movement and disease ecology. Our analysis framework offers new avenues for predicting disease spread, while providing managers tools to proactively mitigate risks posed by mobile disease hosts. More broadly, we demonstrate how mechanistic movement models can provide predictions of ecological conditions that are consistent with climate change but may be more extreme than has been observed historically.
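The pipeline sketched in this abstract (fit step selection functions to telemetry data, then use a master-equation approach to turn selection weights into daily occupancy distributions) can be illustrated with a toy grid model. This is a minimal sketch under invented assumptions, not the authors' fitted model: the habitat covariate, selection coefficient, and 4-neighbour movement kernel are placeholders.

```python
import numpy as np

# Toy landscape: one habitat covariate (e.g., a green-up index) on a small grid.
rng = np.random.default_rng(0)
n = 15                                  # grid is n x n cells
habitat = rng.random((n, n))            # hypothetical phenology covariate

# Assumed SSF coefficient: preference for greener cells (illustrative value only).
beta = 2.0
weights = np.exp(beta * habitat)        # exponential selection weights

def transition_matrix(w):
    """Row-stochastic matrix: moves allowed to the 4-neighbourhood plus staying put."""
    size = w.size
    P = np.zeros((size, size))
    for i in range(n):
        for j in range(n):
            src = i * n + j
            nbrs = [(i, j)]
            for di, dj in [(-1, 0), (1, 0), (0, -1), (0, 1)]:
                if 0 <= i + di < n and 0 <= j + dj < n:
                    nbrs.append((i + di, j + dj))
            ws = np.array([w[a, b] for a, b in nbrs])
            ws /= ws.sum()
            for (a, b), wk in zip(nbrs, ws):
                P[src, a * n + b] = wk
    return P

P = transition_matrix(weights)

# Master-equation step: propagate the daily occupancy distribution forward in time.
p = np.full(n * n, 1.0 / (n * n))       # start from a uniform distribution
for _ in range(30):                     # thirty "days" of redistribution
    p = p @ P

daily_distribution = p.reshape(n, n)
print("cell with highest predicted use:",
      np.unravel_index(daily_distribution.argmax(), (n, n)))
```

In the study itself, the selection weights come from conditional-logistic SSF fits and the predicted distributions are combined with abortion rates and seroprevalence; the sketch only shows the redistribution step.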
Lewis, Sarah J; Gardner, Mike; Higgins, Julian; Holly, Jeff M P; Gaunt, Tom R; Perks, Claire M; Turner, Suzanne D; Rinaldi, Sabina; Thomas, Steve; Harrison, Sean; Lennon, Rosie J; Tan, Vanessa; Borwick, Cath; Emmett, Pauline; Jeffreys, Mona; Northstone, Kate; Mitrou, Giota; Wiseman, Martin; Thompson, Rachel; Martin, Richard M
2017-11-01
Background: Human, animal, and cell experimental studies; human biomarker studies; and genetic studies complement epidemiologic findings and can offer insights into biological plausibility and pathways between exposure and disease, but methods for synthesizing such studies are lacking. We, therefore, developed a methodology for identifying mechanisms and carrying out systematic reviews of mechanistic studies that underpin exposure-cancer associations. Methods: A multidisciplinary team with expertise in informatics, statistics, epidemiology, systematic reviews, cancer biology, and nutrition was assembled. Five 1-day workshops were held to brainstorm ideas; in the intervening periods we carried out searches and applied our methods to a case study to test our ideas. Results: We have developed a two-stage framework, the first stage of which is designed to identify mechanisms underpinning a specific exposure-disease relationship; the second stage is a targeted systematic review of studies on a specific mechanism. As part of the methodology, we also developed an online tool for text mining for mechanism prioritization (TeMMPo) and a new graph for displaying related but heterogeneous data from epidemiologic studies (the Albatross plot). Conclusions: We have developed novel tools for identifying mechanisms and carrying out systematic reviews of mechanistic studies of exposure-disease relationships. In doing so, we have outlined how we have overcome the challenges that we faced and provided researchers with practical guides for conducting mechanistic systematic reviews. Impact: The aforementioned methodology and tools will allow potential mechanisms to be identified and the strength of the evidence underlying a particular mechanism to be assessed. Cancer Epidemiol Biomarkers Prev; 26(11); 1667-75. ©2017 AACR . ©2017 American Association for Cancer Research.
The Modeling Environment for Total Risks studies (MENTOR) system, combined with an extension of the SHEDS (Stochastic Human Exposure and Dose Simulation) methodology, provide a mechanistically consistent framework for conducting source-to-dose exposure assessments of multiple pol...
Matott, L Shawn; Jiang, Zhengzheng; Rabideau, Alan J; Allen-King, Richelle M
2015-01-01
Numerous isotherm expressions have been developed for describing sorption of hydrophobic organic compounds (HOCs), including "dual-mode" approaches that combine nonlinear behavior with a linear partitioning component. Choosing among these alternative expressions for describing a given dataset is an important task that can significantly influence subsequent transport modeling and/or mechanistic interpretation. In this study, a series of numerical experiments were undertaken to identify "best-in-class" isotherms by refitting 10 alternative models to a suite of 13 previously published literature datasets. The corrected Akaike Information Criterion (AICc) was used for ranking these alternative fits and distinguishing between plausible and implausible isotherms for each dataset. The occurrence of multiple plausible isotherms was inversely correlated with dataset "richness", such that datasets with fewer observations and/or a narrow range of aqueous concentrations resulted in a greater number of plausible isotherms. Overall, only the Polanyi-partition dual-mode isotherm was classified as "plausible" across all 13 of the considered datasets, indicating substantial statistical support consistent with current advances in sorption theory. However, these findings are predicated on the use of the AICc measure as an unbiased ranking metric and the adoption of a subjective, but defensible, threshold for separating plausible and implausible isotherms. Copyright © 2015 Elsevier B.V. All rights reserved.
Application of Adverse Outcome Pathways to U.S. EPA’s Endocrine Disruptor Screening Program
Noyes, Pamela D.; Casey, Warren M.; Dix, David J.
2017-01-01
Background: The U.S. EPA’s Endocrine Disruptor Screening Program (EDSP) screens and tests environmental chemicals for potential effects in estrogen, androgen, and thyroid hormone pathways, and it is one of the only regulatory programs designed around chemical mode of action. Objectives: This review describes the EDSP’s use of adverse outcome pathway (AOP) and toxicity pathway frameworks to organize and integrate diverse biological data for evaluating the endocrine activity of chemicals. Using these frameworks helps to establish biologically plausible links between endocrine mechanisms and apical responses when those end points are not measured in the same assay. Results: Pathway frameworks can facilitate a weight of evidence determination of a chemical’s potential endocrine activity, identify data gaps, aid study design, direct assay development, and guide testing strategies. Pathway frameworks also can be used to evaluate the performance of computational approaches as alternatives for low-throughput and animal-based assays and predict downstream key events. In cases where computational methods can be validated based on performance, they may be considered as alternatives to specific assays or end points. Conclusions: A variety of biological systems affect apical end points used in regulatory risk assessments, and without mechanistic data, an endocrine mode of action cannot be determined. Because the EDSP was designed to consider mode of action, toxicity pathway and AOP concepts are a natural fit. Pathway frameworks have diverse applications to endocrine screening and testing. An estrogen pathway example is presented, and similar approaches are being used to evaluate alternative methods and develop predictive models for androgen and thyroid pathways. https://doi.org/10.1289/EHP1304 PMID:28934726
Rational and Mechanistic Perspectives on Reinforcement Learning
ERIC Educational Resources Information Center
Chater, Nick
2009-01-01
This special issue describes important recent developments in applying reinforcement learning models to capture neural and cognitive function. But reinforcement learning, as a theoretical framework, can apply at two very different levels of description: "mechanistic" and "rational." Reinforcement learning is often viewed in mechanistic terms--as…
Deshmukh, Dattatray G; Bangal, Mukund N; Patekar, Mukunda R; Medhane, Vijay J; Mathad, Vijayavitthal Thippannachar
2018-03-01
The present work describes an investigation of the mechanistic pathway for the trimethyl borate mediated amidation of (R)-mandelic acid (3) with 4-nitrophenylethylamine (2) to provide (R)-2-hydroxy-N-[2-(4-nitrophenyl)ethyl]-2-phenylacetamide (4) during mirabegron synthesis. A plausible reaction mechanism is proposed by isolating and elucidating the active α-hydroxy ester intermediate 16 from the reaction mass. The trimethyl borate mediated approach proved to be selective in providing 4 without disturbing the α-hydroxyl group or the stereochemistry of the chiral center, and it is also greener, more economical and more production-friendly than the reported methods. The developed approach is rapid and efficient for the preparation of 4, with an overall yield of 85-87% and around 99.0% purity by HPLC at scale.
Divorce and health: good data in need of better theory.
Sbarra, David A; Coan, James A
2017-02-01
A very large literature links the experiences of marital separation and divorce to risk for a range of poor distal health outcomes, including early death. What is far less clear, however, are the mechanistic pathways that convey this risk. Several plausible mechanisms are identified in the literature, and the central thesis of this paper is that the empirical study of divorce and health will benefit enormously from a renewed reliance on theory to dictate how these mechanisms of action may unfold over time. This review emphasizes the roles of attachment and social baseline theories in making specific mechanistic predictions and highlights the ways in which these perspectives can contribute new empirical knowledge on risk and resilience following marital dissolution. Copyright © 2016 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Brown, Kathleen M.; Anfara, Vincent A., Jr.; Roney, Kathleen
2004-01-01
Utilizing a qualitative, multisite case study design and the theoretical framework of Hoy and Hannum (1997), the design and execution of this research investigates plausible explanations for the difference in student achievement between high performing (HPS) suburban middle schools and low performing (LPS) urban middle schools. Aside from the…
ERIC Educational Resources Information Center
Kaplan, David; Su, Dan
2016-01-01
This article presents findings on the consequences of matrix sampling of context questionnaires for the generation of plausible values in large-scale assessments. Three studies are conducted. Study 1 uses data from PISA 2012 to examine several different forms of missing data imputation within the chained equations framework: predictive mean…
Mechanisms, determination and the metaphysics of neuroscience.
Soom, Patrice
2012-09-01
In this paper, I evaluate recently defended mechanistic accounts of the unity of neuroscience from a metaphysical point of view. Considering the mechanistic framework in general (Sections 2 and 3), I argue that explanations of this kind are essentially reductive (Section 4). The reductive character of mechanistic explanations provides a sufficiency criterion, according to which the mechanism underlying a certain phenomenon is sufficient for the latter. Thus, the concept of supervenience can be used in order to describe the relation between mechanisms and phenomena (Section 5). Against this background, I show that the mechanistic framework is subject to the causal exclusion problem and faces the classical metaphysical options when it comes to the relations obtaining between different levels of mechanisms (Section 6). Finally, an attempt to improve the metaphysics of mechanisms is made (Section 7) and further difficulties are pointed out (Section 8). Copyright © 2012 Elsevier Ltd. All rights reserved.
A framework for predicting impacts on ecosystem services ...
Forbes, Valery E; Salice, Chris J; Birnir, Bjorn; Bruins, Randy J F; Calow, Peter; Ducrot, Virginie; Galic, Nika; Garber, Kristina; Harvey, Bret C; Jager, Henriette; Kanarek, Andrew; Pastorok, Robert; Railsback, Steve F; Rebarber, Richard; Thorbek, Pernille
2017-04-01
Protection of ecosystem services is increasingly emphasized as a risk-assessment goal, but there are wide gaps between current ecological risk-assessment endpoints and potential effects on services provided by ecosystems. The authors present a framework that links common ecotoxicological endpoints to chemical impacts on populations and communities and the ecosystem services that they provide. This framework builds on considerable advances in mechanistic effects models designed to span multiple levels of biological organization and account for various types of biological interactions and feedbacks. For illustration, the authors introduce 2 case studies that employ well-developed and validated mechanistic effects models: the inSTREAM individual-based model for fish populations and the AQUATOX ecosystem model. They also show how dynamic energy budget theory can provide a common currency for interpreting organism-level toxicity. They suggest that a framework based on mechanistic models that predict impacts on ecosystem services resulting from chemical exposure, combined with economic valuation, can provide a useful approach for informing environmental management. The authors highlight the potential benefits of using this framework as well as the challenges that will need to be addressed in future work. Environ Toxicol Chem 2017;36:845-859. © 2017 SETAC. © 2017 SETAC.
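To make the idea of a "common currency" linking exposure to organism-level endpoints concrete, here is a deliberately simplified toxicokinetic-toxicodynamic sketch: an internal concentration tracks external exposure and scales down a growth rate. This is not the inSTREAM, AQUATOX, or dynamic energy budget formulation, only the generic pattern those mechanistic effects models build on; every parameter value and functional form below is an assumption for illustration.

```python
import numpy as np
from scipy.integrate import solve_ivp

ke = 0.2      # elimination rate (1/d), assumed
Cw = 5.0      # constant external concentration, assumed units
EC50 = 10.0   # internal concentration giving 50% growth reduction, assumed
r_max = 0.1   # maximum somatic growth rate (1/d), assumed

def rhs(t, y):
    Ci, W = y                              # scaled internal concentration, body mass
    dCi = ke * (Cw - Ci)                   # one-compartment toxicokinetics
    stress = 1.0 / (1.0 + Ci / EC50)       # toxicodynamic stress factor on growth
    dW = r_max * stress * W                # growth rate reduced by chemical stress
    return [dCi, dW]

sol = solve_ivp(rhs, (0, 100), [0.0, 1.0], t_eval=np.linspace(0, 100, 11))
print("final scaled internal concentration:", round(sol.y[0, -1], 2))
print("final body mass:", round(sol.y[1, -1], 2))
```

Organism-level outputs of this kind are what population and ecosystem models then propagate up to the service-relevant endpoints discussed in the abstract.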
A conceptual framework of computations in mid-level vision
Kubilius, Jonas; Wagemans, Johan; Op de Beeck, Hans P.
2014-01-01
If a picture is worth a thousand words, as an English idiom goes, what should those words—or, rather, descriptors—capture? What format of image representation would be sufficiently rich if we were to reconstruct the essence of images from their descriptors? In this paper, we set out to develop a conceptual framework that would be: (i) biologically plausible in order to provide a better mechanistic understanding of our visual system; (ii) sufficiently robust to apply in practice on realistic images; and (iii) able to tap into underlying structure of our visual world. We bring forward three key ideas. First, we argue that surface-based representations are constructed based on feature inference from the input in the intermediate processing layers of the visual system. Such representations are computed in a largely pre-semantic (prior to categorization) and pre-attentive manner using multiple cues (orientation, color, polarity, variation in orientation, and so on), and explicitly retain configural relations between features. The constructed surfaces may be partially overlapping to compensate for occlusions and are ordered in depth (figure-ground organization). Second, we propose that such intermediate representations could be formed by a hierarchical computation of similarity between features in local image patches and pooling of highly-similar units, and reestimated via recurrent loops according to the task demands. Finally, we suggest to use datasets composed of realistically rendered artificial objects and surfaces in order to better understand a model's behavior and its limitations. PMID:25566044
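The second key idea above (compute similarity between features in local image patches and pool highly similar units into proto-surfaces) can be caricatured in a few lines. The feature vectors, the cosine-similarity measure, and the 0.99 pooling threshold are assumptions made for illustration; the paper proposes a conceptual framework, not this algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)
H, W, F = 8, 8, 4                        # 8x8 grid of image patches, 4 features each
feats = rng.random((H, W, F))
feats[2:6, 2:6] = [0.9, 0.1, 0.8, 0.2]   # a block of near-identical patches (one "surface")

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

labels = -np.ones((H, W), dtype=int)     # proto-surface label per patch
next_label = 0
for i in range(H):
    for j in range(W):
        # pool with the left/up neighbour if its features are highly similar
        for ni, nj in [(i, j - 1), (i - 1, j)]:
            if ni >= 0 and nj >= 0 and cosine(feats[i, j], feats[ni, nj]) > 0.99:
                labels[i, j] = labels[ni, nj]
                break
        if labels[i, j] == -1:
            labels[i, j] = next_label
            next_label += 1

print(labels)                            # the 4x4 block should share a single label
```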
Turchin, Peter; Currie, Thomas E
2016-01-01
The evidence compiled in the target article demonstrates that the assumptions of cultural group selection (CGS) theory are often met, and it is therefore a useful framework for generating plausible hypotheses. However, more can be said about how we can test the predictions of CGS hypotheses against competing explanations using historical, archaeological, and anthropological data.
Simulating human behavior for national security human interactions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bernard, Michael Lewis; Hart, Dereck H.; Verzi, Stephen J.
2007-01-01
This 3-year research and development effort focused on what we believe is a significant technical gap in existing modeling and simulation capabilities: the representation of plausible human cognition and behaviors within a dynamic, simulated environment. Specifically, the intent of the "Simulating Human Behavior for National Security Human Interactions" project was to demonstrate an initial simulated human modeling capability that realistically represents intra- and inter-group interaction behaviors between simulated humans and human-controlled avatars as they respond to their environment. Significant progress was made towards simulating human behaviors through the development of a framework that produces realistic characteristics and movement. The simulated humans were created from models designed to be psychologically plausible by being based on robust psychological research and theory. Progress was also made towards enhancing Sandia National Laboratories' existing cognitive models to support culturally plausible behaviors that are important in representing group interactions. These models were implemented in the modular, interoperable, and commercially supported Umbra® simulation framework.
MECHANISTIC-BASED DISINFECTION AND DISINFECTION BYPRODUCT MODELS
We propose developing a mechanistic-based numerical model for chlorine decay and regulated DBP (THM and HAA) formation derived from (free) chlorination; the model framework will allow future modifications for other DBPs and chloramination. Predicted chlorine residual and DBP r...
Oakes, Benjamin Donald; Mattsson, Lars-Göran; Näsman, Per; Glazunov, Andrés Alayón
2018-06-01
Modern infrastructures are becoming increasingly dependent on electronic systems, leaving them more vulnerable to electrical surges or electromagnetic interference. Electromagnetic disturbances appear in nature, e.g., lightning and solar wind; however, they may also be generated by man-made technology to maliciously damage or disturb electronic equipment. This article presents a systematic risk assessment framework for identifying possible, consequential, and plausible intentional electromagnetic interference (IEMI) attacks on an arbitrary distribution network infrastructure. In the absence of available data on IEMI occurrences, we find that a systems-based risk assessment is more useful than a probabilistic approach. We therefore modify the often applied definition of risk, i.e., a set of triplets containing scenario, probability, and consequence, to a set of quadruplets: scenario, resource requirements, plausibility, and consequence. Probability is "replaced" by resource requirements and plausibility, where the former is the minimum amount and type of equipment necessary to successfully carry out an attack scenario and the latter is a subjective assessment of the extent of the existence of attackers who possess the motivation, knowledge, and resources necessary to carry out the scenario. We apply the concept of intrusion areas and classify electromagnetic source technology according to key attributes. Worst-case scenarios are identified for different quantities of attacker resources. The most plausible and consequential of these are deemed the most important scenarios and should provide useful decision support in a countermeasures effort. Finally, an example of the proposed risk assessment framework, based on notional data, is provided on a hypothetical water distribution network. © 2017 Society for Risk Analysis.
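The article's replacement of the classical risk triplet with a quadruplet (scenario, resource requirements, plausibility, consequence) lends itself to a simple data structure for screening scenarios. The scoring scales, the multiplicative priority score, and the two example scenarios below are invented for illustration and are not taken from the paper.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class IEMIScenario:
    description: str
    resource_requirements: str     # minimum equipment/skill an attacker would need
    plausibility: int              # subjective 1 (unlikely) .. 5 (highly plausible)
    consequence: int               # 1 (minor disturbance) .. 5 (long outage/damage)

    def priority(self) -> int:
        # Simple screening score; a real assessment would weigh these qualitatively.
        return self.plausibility * self.consequence

scenarios: List[IEMIScenario] = [
    IEMIScenario("portable jammer at pump-station fence", "commercial, low cost", 4, 2),
    IEMIScenario("high-power microwave source near control room", "specialist, high cost", 2, 5),
]
for s in sorted(scenarios, key=IEMIScenario.priority, reverse=True):
    print(s.priority(), "-", s.description)
```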
A Framework for Implementing TQM in Higher Education Programs
ERIC Educational Resources Information Center
Venkatraman, Sitalakshmi
2007-01-01
Purpose: This paper aims to provide a TQM framework that stresses continuous improvements in teaching as a plausible means of TQM implementation in higher education programs. Design/methodology/approach: The literature survey of the TQM philosophies and the comparative analysis of TQM adoption in industry versus higher education provide the…
A series of case studies is presented focusing on multimedia/multipathway population exposures to arsenic, employing the Population Based Modeling approach of the MENTOR (Modeling Environment for Total Risks) framework. This framework considers currently five exposure routes: i...
The Adverse Outcome Pathway (AOP) framework has emerged to capitalise on the vast quantity of mechanistic data generated by alternative techniques, as well as advances in systems biology, cheminformatics, and bioinformatics. AOPs provide a scaffold onto which mechanistic data can...
The adverse outcome pathway (AOP) framework is intended to help support greater use of mechanistic toxicology data as a basis for risk assessment and/or regulatory decision-making. While there have been clear advances in the ability to rapidly generate mechanistically-oriented da...
Lau, Max S Y; Gibson, Gavin J; Adrakey, Hola; McClelland, Amanda; Riley, Steven; Zelner, Jon; Streftaris, George; Funk, Sebastian; Metcalf, Jessica; Dalziel, Benjamin D; Grenfell, Bryan T
2017-10-01
In recent years there has been growing availability of individual-level spatio-temporal disease data, particularly due to the use of modern communicating devices with GPS tracking functionality. These detailed data have been proven useful for inferring disease transmission to a more refined level than previously. However, there remains a lack of statistically sound frameworks to model the underlying transmission dynamic in a mechanistic manner. Such a development is particularly crucial for enabling a general epidemic predictive framework at the individual level. In this paper we propose a new statistical framework for mechanistically modelling individual-to-individual disease transmission in a landscape with heterogeneous population density. Our methodology is first tested using simulated datasets, validating our inferential machinery. The methodology is subsequently applied to data that describes a regional Ebola outbreak in Western Africa (2014-2015). Our results show that the methods are able to obtain estimates of key epidemiological parameters that are broadly consistent with the literature, while revealing a significantly shorter distance of transmission. More importantly, in contrast to existing approaches, we are able to perform a more general model prediction that takes into account the susceptible population. Finally, our results show that, given reasonable scenarios, the framework can be an effective surrogate for susceptible-explicit individual models which are often computationally challenging.
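The kind of individual-level, spatially explicit transmission process described above can be illustrated with a minimal forward simulation: susceptible individuals receive infection pressure from infectious neighbours through a distance kernel. This is a sketch under assumed parameter values and an assumed exponential kernel, not the authors' Bayesian inference framework or their Ebola analysis.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 300
coords = rng.random((N, 2)) * 10.0          # individuals scattered over a 10 x 10 km area
beta, delta, gamma = 0.3, 0.5, 0.2          # transmission rate, kernel range (km), recovery rate

status = np.zeros(N, dtype=int)             # 0 = susceptible, 1 = infectious, 2 = removed
status[rng.integers(N)] = 1                 # index case

dist = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
kernel = np.exp(-dist / delta)              # spatial transmission kernel

for day in range(60):
    inf = status == 1
    sus = status == 0
    # per-susceptible infection probability from the summed pressure of infectious hosts
    pressure = beta * kernel[:, inf].sum(axis=1)
    p_inf = 1.0 - np.exp(-pressure)
    new_cases = sus & (rng.random(N) < p_inf)
    recoveries = inf & (rng.random(N) < gamma)
    status[new_cases] = 1
    status[recoveries] = 2

print("final outbreak size:", int((status > 0).sum()), "of", N)
```

The paper's contribution is the statistically sound inference of such kernel and rate parameters from partially observed data; the sketch only shows the generative side of that model class.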
Zhang, Chun; Feng, Peng; Jiao, Ning
2013-10-09
A novel Cu-catalyzed aerobic oxidative esterification reaction of 1,3-diones for the synthesis of α-ketoesters has been developed. This method combines C-C σ-bond cleavage, dioxygen activation and oxidative C-H bond functionalization, and provides a practical, neutral, and mild synthetic approach to α-ketoesters, which are important units in many biologically active compounds and useful precursors in a variety of functional group transformations. A plausible radical process is proposed on the basis of mechanistic studies.
Ruiter, Sander; Sippel, Josefine; Bouwmeester, Manon C; Lommelaars, Tobias; Beekhof, Piet; Hodemaekers, Hennie M; Bakker, Frank; van den Brandhof, Evert-Jan; Pennings, Jeroen L A; van der Ven, Leo T M
2016-11-02
Non-communicable diseases (NCDs) are a major cause of premature mortality. Recent studies show that predispositions for NCDs may arise from early-life exposure to low concentrations of environmental contaminants. This developmental origins of health and disease (DOHaD) paradigm suggests that programming of an embryo can be disrupted, changing the homeostatic set point of biological functions. Epigenetic alterations are a possible underlying mechanism. Here, we investigated the DOHaD paradigm by exposing zebrafish to subtoxic concentrations of the ubiquitous contaminant cadmium during embryogenesis, followed by growth under normal conditions. Prolonged behavioral responses to physical stress and altered antioxidative physiology were observed approximately ten weeks after termination of embryonal exposure, at concentrations that were 50-3200-fold below the direct embryotoxic concentration, and interpreted as altered developmental programming. Literature was explored for possible mechanistic pathways that link embryonic subtoxic cadmium to the observed apical phenotypes, more specifically, the probability of molecular mechanisms induced by cadmium exposure leading to altered DNA methylation and subsequently to the observed apical phenotypes. This was done using the adverse outcome pathway model framework, and assessing key event relationship plausibility by tailored Bradford-Hill analysis. Thus, cadmium interaction with thiols appeared to be the major contributor to late-life effects. Cadmium-thiol interactions may lead to depletion of the methyl donor S-adenosyl-methionine, resulting in methylome alterations, and may, additionally, result in oxidative stress, which may lead to DNA oxidation, and subsequently altered DNA methyltransferase activity. In this way, DNA methylation may be affected at a critical developmental stage, causing the observed apical phenotypes.
We introduce and validate a new precision oncology framework for the systematic prioritization of drugs targeting mechanistic tumor dependencies in individual patients. Compounds are prioritized on the basis of their ability to invert the concerted activity of master regulator proteins that mechanistically regulate tumor cell state, as assessed from systematic drug perturbation assays. We validated the approach on a cohort of 212 gastroenteropancreatic neuroendocrine tumors (GEP-NETs), a rare malignancy originating in the pancreas and gastrointestinal tract.
Drawing a link between habitat change and the production and delivery of ecosystem services is a priority in coastal estuarine ecosystems. Mechanistic modeling tools are highly functional for exploring this link because they allow for the synthesis of multiple ecological and beh...
ERIC Educational Resources Information Center
Russ, Rosemary S.; Scherr, Rachel E.; Hammer, David; Mikeska, Jamie
2008-01-01
Science education reform has long focused on assessing student inquiry, and there has been progress in developing tools specifically with respect to experimentation and argumentation. We suggest the need for attention to another aspect of inquiry, namely "mechanistic reasoning." Scientific inquiry focuses largely on understanding causal…
Bavassi, M Luz; Tagliazucchi, Enzo; Laje, Rodrigo
2013-02-01
Time processing in the few hundred milliseconds range is involved in the human skill of sensorimotor synchronization, like playing music in an ensemble or finger tapping to an external beat. In finger tapping, a mechanistic explanation in biologically plausible terms of how the brain achieves synchronization is still missing despite considerable research. In this work we show that nonlinear effects are important for the recovery of synchronization following a perturbation (a step change in stimulus period), even for perturbation magnitudes smaller than 10% of the period, which is well below the amount of perturbation needed to evoke other nonlinear effects like saturation. We build a nonlinear mathematical model for the error correction mechanism and test its predictions, and further propose a framework that allows us to unify the description of the three common types of perturbations. While previous authors have used two different model mechanisms for fitting different perturbation types, or have fitted different parameter value sets for different perturbation magnitudes, we propose the first unified description of the behavior following all perturbation types and magnitudes as the dynamical response of a compound model with fixed terms and a single set of parameter values. Copyright © 2012 Elsevier B.V. All rights reserved.
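The recovery of synchronization after a step change in stimulus period can be simulated with a very small error-correction map. The correction rule below (a linear term plus a small quadratic nonlinearity) is a generic illustration of the class of models discussed in this abstract, not the authors' fitted model, and all parameter values are assumptions.

```python
import numpy as np

T0, dT = 500.0, 30.0        # baseline period and step-change size (ms)
alpha, beta = 0.4, 0.001    # linear and nonlinear correction gains (assumed)
n_taps, n_step = 40, 10     # the perturbation is applied at tap n_step

stim_period = np.full(n_taps, T0)
stim_period[n_step:] += dT   # step change in stimulus inter-onset interval

e = np.zeros(n_taps)         # asynchrony (tap time minus stimulus onset), in ms
for n in range(n_taps - 1):
    # nonlinear error correction applied to the reproduced interval
    correction = alpha * e[n] + beta * e[n] * abs(e[n])
    produced_interval = stim_period[n] - correction
    e[n + 1] = e[n] + produced_interval - stim_period[n + 1]

# asynchrony just before and after the step: it jumps to about -dT, then decays back
print(np.round(e[n_step - 2:n_step + 10], 1))
```

Fitting such a map to pre- and post-perturbation asynchronies, across perturbation types and magnitudes, is the kind of exercise the unified framework in the paper formalizes.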
Modeling Avoidance in Mood and Anxiety Disorders Using Reinforcement Learning.
Mkrtchian, Anahit; Aylward, Jessica; Dayan, Peter; Roiser, Jonathan P; Robinson, Oliver J
2017-10-01
Serious and debilitating symptoms of anxiety are the most common mental health problem worldwide, accounting for around 5% of all adult years lived with disability in the developed world. Avoidance behavior-avoiding social situations for fear of embarrassment, for instance-is a core feature of such anxiety. However, as for many other psychiatric symptoms the biological mechanisms underlying avoidance remain unclear. Reinforcement learning models provide formal and testable characterizations of the mechanisms of decision making; here, we examine avoidance in these terms. A total of 101 healthy participants and individuals with mood and anxiety disorders completed an approach-avoidance go/no-go task under stress induced by threat of unpredictable shock. We show an increased reliance in the mood and anxiety group on a parameter of our reinforcement learning model that characterizes a prepotent (pavlovian) bias to withhold responding in the face of negative outcomes. This was particularly the case when the mood and anxiety group was under stress. This formal description of avoidance within the reinforcement learning framework provides a new means of linking clinical symptoms with biophysically plausible models of neural circuitry and, as such, takes us closer to a mechanistic understanding of mood and anxiety disorders. Copyright © 2017 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.
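The "prepotent (Pavlovian) bias to withhold responding in the face of negative outcomes" can be written down as a standard go/no-go reinforcement-learning model in a few lines. The parameterization, task contingency, and numerical values below are illustrative assumptions in the spirit of this modeling literature, not the authors' exact model or fits.

```python
import numpy as np

rng = np.random.default_rng(0)
alpha, pi_bias, go_bias, temp = 0.2, 0.6, 0.3, 3.0   # learning rate, Pavlovian bias, go bias, inverse temperature

def p_go(Q, V):
    """Probability of 'go': instrumental action values plus a Pavlovian bias that
    suppresses 'go' when the learned stimulus value V is negative."""
    w_go = Q[0] + go_bias + pi_bias * V
    w_nogo = Q[1]
    return 1.0 / (1.0 + np.exp(-temp * (w_go - w_nogo)))

Q = np.zeros(2)      # instrumental action values: [go, nogo] for one "go-to-avoid-loss" stimulus
V = 0.0              # Pavlovian stimulus value (expected outcome)
for trial in range(100):
    go = rng.random() < p_go(Q, V)
    # assumed contingency: responding ("go") avoids a loss on 80% of trials
    outcome = 0.0 if (go and rng.random() < 0.8) else -1.0
    a = 0 if go else 1
    Q[a] += alpha * (outcome - Q[a])     # instrumental delta-rule update
    V += alpha * (outcome - V)           # Pavlovian value update

print("P(go) after learning:", round(p_go(Q, V), 2))
```

A larger pi_bias makes the simulated agent more prone to withhold responding under threat of loss, which is the kind of parameter difference the study reports between groups.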
Loucks, Eric B; Schuman-Olivier, Zev; Britton, Willoughby B; Fresco, David M; Desbordes, Gaelle; Brewer, Judson A; Fulwiler, Carl
2015-12-01
The purpose of this review is to provide (1) a synopsis on relations of mindfulness with cardiovascular disease (CVD) and major CVD risk factors, and (2) an initial consensus-based overview of mechanisms and theoretical framework by which mindfulness might influence CVD. Initial evidence, often of limited methodological quality, suggests possible impacts of mindfulness on CVD risk factors including physical activity, smoking, diet, obesity, blood pressure, and diabetes regulation. Plausible mechanisms include (1) improved attention control (e.g., ability to hold attention on experiences related to CVD risk, such as smoking, diet, physical activity, and medication adherence), (2) emotion regulation (e.g., improved stress response, self-efficacy, and skills to manage craving for cigarettes, palatable foods, and sedentary activities), and (3) self-awareness (e.g., self-referential processing and awareness of physical sensations due to CVD risk factors). Understanding mechanisms and theoretical framework should improve etiologic knowledge, providing customized mindfulness intervention targets that could enable greater mindfulness intervention efficacy.
Energy and Power Aware Computing Through Management of Computational Entropy
2008-01-01
This research focused on two sub-tasks: (1) assessing the need and planning for a potential "Living Framework Forum" (LFF) software architecture ... probabilistic switching with plausible device realizations to save energy in our patent application [35]. In [35], we showed an introverted switch in
Human Health Effects of Trichloroethylene: Key Findings and Scientific Issues
Jinot, Jennifer; Scott, Cheryl Siegel; Makris, Susan L.; Cooper, Glinda S.; Dzubow, Rebecca C.; Bale, Ambuja S.; Evans, Marina V.; Guyton, Kathryn Z.; Keshava, Nagalakshmi; Lipscomb, John C.; Barone, Stanley; Fox, John F.; Gwinn, Maureen R.; Schaum, John; Caldwell, Jane C.
2012-01-01
Background: In support of the Integrated Risk Information System (IRIS), the U.S. Environmental Protection Agency (EPA) completed a toxicological review of trichloroethylene (TCE) in September 2011, which was the result of an effort spanning > 20 years. Objectives: We summarized the key findings and scientific issues regarding the human health effects of TCE in the U.S. EPA’s toxicological review. Methods: In this assessment we synthesized and characterized thousands of epidemiologic, experimental animal, and mechanistic studies, and addressed several key scientific issues through modeling of TCE toxicokinetics, meta-analyses of epidemiologic studies, and analyses of mechanistic data. Discussion: Toxicokinetic modeling aided in characterizing the toxicological role of the complex metabolism and multiple metabolites of TCE. Meta-analyses of the epidemiologic data strongly supported the conclusions that TCE causes kidney cancer in humans and that TCE may also cause liver cancer and non-Hodgkin lymphoma. Mechanistic analyses support a key role for mutagenicity in TCE-induced kidney carcinogenicity. Recent evidence from studies in both humans and experimental animals point to the involvement of TCE exposure in autoimmune disease and hypersensitivity. Recent avian and in vitro mechanistic studies provided biological plausibility that TCE plays a role in developmental cardiac toxicity, the subject of substantial debate due to mixed results from epidemiologic and rodent studies. Conclusions: TCE is carcinogenic to humans by all routes of exposure and poses a potential human health hazard for noncancer toxicity to the central nervous system, kidney, liver, immune system, male reproductive system, and the developing embryo/fetus. PMID:23249866
Yé, Yazoume; Eisele, Thomas P; Eckert, Erin; Korenromp, Eline; Shah, Jui A; Hershey, Christine L; Ivanovich, Elizabeth; Newby, Holly; Carvajal-Velez, Liliana; Lynch, Michael; Komatsu, Ryuichi; Cibulskis, Richard E; Moore, Zhuzhi; Bhattarai, Achuyt
2017-09-01
Concerted efforts from national and international partners have scaled up malaria control interventions, including insecticide-treated nets, indoor residual spraying, diagnostics, prompt and effective treatment of malaria cases, and intermittent preventive treatment during pregnancy in sub-Saharan Africa (SSA). This scale-up warrants an assessment of its health impact to guide future efforts and investments; however, measuring malaria-specific mortality and the overall impact of malaria control interventions remains challenging. In 2007, Roll Back Malaria's Monitoring and Evaluation Reference Group proposed a theoretical framework for evaluating the impact of full-coverage malaria control interventions on morbidity and mortality in high-burden SSA countries. Recently, several evaluations have contributed new ideas and lessons to strengthen this plausibility design. This paper harnesses that new evaluation experience to expand the framework, with additional features, such as stratification, to examine subgroups most likely to experience improvement if control programs are working; the use of a national platform framework; and analysis of complete birth histories from national household surveys. The refined framework has shown that, despite persisting data challenges, combining multiple sources of data, considering potential contributions from both fundamental and proximate contextual factors, and conducting subnational analyses allows identification of the plausible contributions of malaria control interventions on malaria morbidity and mortality.
Zhang, X; Duan, J; Kesisoglou, F; Novakovic, J; Amidon, G L; Jamei, M; Lukacova, V; Eissing, T; Tsakalozou, E; Zhao, L; Lionberger, R
2017-08-01
On May 19, 2016, the US Food and Drug Administration (FDA) hosted a public workshop, entitled "Mechanistic Oral Absorption Modeling and Simulation for Formulation Development and Bioequivalence Evaluation." The topic of mechanistic oral absorption modeling, which is one of the major applications of physiologically based pharmacokinetic (PBPK) modeling and simulation, focuses on predicting oral absorption by mechanistically integrating gastrointestinal transit, dissolution, and permeation processes, incorporating systems, active pharmaceutical ingredient (API), and the drug product information, into a systemic mathematical whole-body framework. © 2017 The Authors CPT: Pharmacometrics & Systems Pharmacology published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.
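The workshop topic above, mechanistic oral absorption modeling, rests on chaining dissolution, gastrointestinal transit, and permeation into a systemic framework. The toy compartmental model below illustrates that chaining only; all rate constants and the three-compartment intestine are invented, and real PBPK platforms used for bioequivalence work are far more detailed.

```python
import numpy as np
from scipy.integrate import solve_ivp

kd, kt, ka, kel = 2.0, 1.5, 0.8, 0.3   # dissolution, transit, absorption, elimination (1/h), assumed
n_gi = 3                               # number of intestinal transit compartments, assumed

def rhs(t, y):
    solid, *dissolved, central = y
    d_solid = -kd * solid                                   # dissolution of the solid dose
    ddiss = []
    for i, amt in enumerate(dissolved):
        inflow = kd * solid if i == 0 else kt * dissolved[i - 1]
        ddiss.append(inflow - (kt + ka) * amt)              # transit onward + permeation
    d_central = ka * sum(dissolved) - kel * central         # systemic absorption and clearance
    return [d_solid, *ddiss, d_central]

y0 = [100.0] + [0.0] * n_gi + [0.0]    # 100 mg solid dose, nothing yet dissolved or absorbed
sol = solve_ivp(rhs, (0, 24), y0, t_eval=np.linspace(0, 24, 25))
tmax = sol.t[np.argmax(sol.y[-1])]
print(f"Tmax ~ {tmax:.0f} h, peak amount in central compartment ~ {sol.y[-1].max():.1f} mg")
```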
ACSPRI 2014 4th International Social Science Methodology Conference Report
2015-04-01
Validity, trustworthiness and rigour: quality and the idea of qualitative research. Journal of Advanced Nursing, 304-310. Spencer, L., Ritchie, J. ... increasing data quality; the Total Survey Error framework; multi-modal on-line surveying; quality frameworks for assessing qualitative research; and ... provided an overview of the current perspectives on causal claims in qualitative research. Three approaches to generating plausible causal
Kentzoglanakis, Kyriakos; Poole, Matthew
2012-01-01
In this paper, we investigate the problem of reverse engineering the topology of gene regulatory networks from temporal gene expression data. We adopt a computational intelligence approach comprising swarm intelligence techniques, namely particle swarm optimization (PSO) and ant colony optimization (ACO). In addition, the recurrent neural network (RNN) formalism is employed for modeling the dynamical behavior of gene regulatory systems. More specifically, ACO is used for searching the discrete space of network architectures and PSO for searching the corresponding continuous space of RNN model parameters. We propose a novel solution construction process in the context of ACO for generating biologically plausible candidate architectures. The objective is to concentrate the search effort into areas of the structure space that contain architectures which are feasible in terms of their topological resemblance to real-world networks. The proposed framework is initially applied to the reconstruction of a small artificial network that has previously been studied in the context of gene network reverse engineering. Subsequently, we consider an artificial data set with added noise for reconstructing a subnetwork of the genetic interaction network of S. cerevisiae (yeast). Finally, the framework is applied to a real-world data set for reverse engineering the SOS response system of the bacterium Escherichia coli. Results demonstrate the relative advantage of utilizing problem-specific knowledge regarding biologically plausible structural properties of gene networks over conducting a problem-agnostic search in the vast space of network architectures.
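The recurrent neural network (RNN) formalism referenced above models each gene's expression as a sigmoidal response to a weighted sum of regulator expressions, with first-order decay. The sketch below shows only that forward dynamics; the weights, biases, and the 3-gene topology are placeholders, and the paper's ACO search over structures and PSO search over parameters are not reproduced here.

```python
import numpy as np

def simulate_rnn(W, b, decay, x0, dt=0.1, steps=200):
    """Discretized RNN gene dynamics: dx_i/dt = sigmoid(sum_j W_ij x_j + b_i) - decay_i * x_i."""
    x = np.array(x0, dtype=float)
    traj = [x.copy()]
    for _ in range(steps):
        drive = 1.0 / (1.0 + np.exp(-(W @ x + b)))
        x = x + dt * (drive - decay * x)
        traj.append(x.copy())
    return np.array(traj)

# Hypothetical 3-gene network: gene 0 activates gene 1, gene 1 represses gene 2.
W = np.array([[0.0,  0.0, 0.0],
              [4.0,  0.0, 0.0],
              [0.0, -4.0, 0.0]])
b = np.array([0.5, -2.0, 2.0])
decay = np.array([1.0, 1.0, 1.0])

traj = simulate_rnn(W, b, decay, x0=[0.1, 0.1, 0.1])
print("steady-state expression:", np.round(traj[-1], 2))
```

In the reverse-engineering setting, candidate W matrices (network structures) would be scored by how well such simulated trajectories reproduce the measured expression time series.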
Cross-benzoin and Stetter-type reactions mediated by KOtBu-DMF via an electron-transfer process.
Ragno, Daniele; Zaghi, Anna; Di Carmine, Graziano; Giovannini, Pier Paolo; Bortolini, Olga; Fogagnolo, Marco; Molinari, Alessandra; Venturini, Alessandro; Massi, Alessandro
2016-10-18
The condensation of aromatic α-diketones (benzils) with aromatic aldehydes (benzoin-type reaction) and chalcones (Stetter-type reaction) in DMF in the presence of catalytic (25 mol%) KOtBu is reported. Both types of umpolung processes proceed with good efficiency and complete chemoselectivity. On the basis of spectroscopic evidence (MS analysis) of plausible intermediates and literature reports, the occurrence of different ionic pathways has been evaluated to elucidate the mechanism of a model cross-benzoin-like reaction along with a radical route initiated by an electron-transfer process to benzil from the carbamoyl anion derived from DMF. This mechanistic investigation has culminated in a different proposal, supported by calculations and a trapping experiment, based on double electron-transfer to benzil with formation of the corresponding enediolate anion as the key reactive intermediate. A mechanistic comparison between the activation modes of benzils in KOtBu-DMF and KOtBu-DMSO systems is also described.
Informatics approaches in the Biological Characterization of Adverse Outcome Pathways
Adverse Outcome Pathways (AOPs) are a conceptual framework to characterize toxicity pathways by a series of mechanistic steps from a molecular initiating event to population outcomes. This framework helps to direct risk assessment research, for example by aiding in computational ...
Kolokotroni, Eleni; Dionysiou, Dimitra; Veith, Christian; Kim, Yoo-Jin; Franz, Astrid; Grgic, Aleksandar; Bohle, Rainer M.; Stamatakos, Georgios
2016-01-01
The 5-year survival of non-small cell lung cancer patients can be as low as 1% in advanced stages. For patients with resectable disease, the successful choice of preoperative chemotherapy is critical to eliminate micrometastasis and improve operability. In silico experimentation can suggest the optimal treatment protocol for each patient based on their own multiscale data. A determinant for reliable predictions is the a priori estimation of the drugs’ cytotoxic efficacy on cancer cells for a given treatment. In the present work, a mechanistic model of cancer response to treatment is applied for the estimation of a plausible value range of the cell-killing efficacy of various cisplatin-based doublet regimens. Among others, the model incorporates the cancer-related mechanisms of uncontrolled proliferation, population heterogeneity, hypoxia, and treatment resistance. The methodology is based on the provision of tumor volumetric data at two time points, before and after or during treatment. It takes into account the effect of tumor microenvironment and cell repopulation on treatment outcome. A thorough sensitivity analysis based on one-factor-at-a-time and Latin hypercube sampling/partial rank correlation coefficient approaches has established the volume growth rate and the growth fraction at diagnosis as key features for more accurate estimates. The methodology is applied to the retrospective data of thirteen patients with non-small cell lung cancer who received cisplatin in combination with gemcitabine, vinorelbine or docetaxel in the neoadjuvant context. The selection of model input values has been guided by a comprehensive literature survey on cancer-specific proliferation kinetics. Latin hypercube sampling was employed to compensate for patient-specific uncertainties. In conclusion, the present work provides a quantitative framework for the estimation of the in vivo cell-killing ability of various chemotherapies. Correlation studies of such estimates with the molecular profile of patients could serve as a basis for reliable personalized predictions. PMID:27657742
The adverse outcome pathway: A multifaceted framework supporting 21st century toxicology
The adverse outcome pathway (AOP) framework serves as a knowledge assembly, interpretation, and communication tool designed to support the translation of pathway-specific mechanistic data into responses relevant to assessing and managing risks of chemicals to human health and the...
Ruiter, Sander; Sippel, Josefine; Bouwmeester, Manon C.; Lommelaars, Tobias; Beekhof, Piet; Hodemaekers, Hennie M.; Bakker, Frank; van den Brandhof, Evert-Jan; Pennings, Jeroen L. A.; van der Ven, Leo T. M.
2016-01-01
Non-communicable diseases (NCDs) are a major cause of premature mortality. Recent studies show that predispositions for NCDs may arise from early-life exposure to low concentrations of environmental contaminants. This developmental origins of health and disease (DOHaD) paradigm suggests that programming of an embryo can be disrupted, changing the homeostatic set point of biological functions. Epigenetic alterations are a possible underlying mechanism. Here, we investigated the DOHaD paradigm by exposing zebrafish to subtoxic concentrations of the ubiquitous contaminant cadmium during embryogenesis, followed by growth under normal conditions. Prolonged behavioral responses to physical stress and altered antioxidative physiology were observed approximately ten weeks after termination of embryonal exposure, at concentrations that were 50–3200-fold below the direct embryotoxic concentration, and interpreted as altered developmental programming. Literature was explored for possible mechanistic pathways that link embryonic subtoxic cadmium to the observed apical phenotypes, more specifically, the probability of molecular mechanisms induced by cadmium exposure leading to altered DNA methylation and subsequently to the observed apical phenotypes. This was done using the adverse outcome pathway model framework, and assessing key event relationship plausibility by tailored Bradford-Hill analysis. Thus, cadmium interaction with thiols appeared to be the major contributor to late-life effects. Cadmium-thiol interactions may lead to depletion of the methyl donor S-adenosyl-methionine, resulting in methylome alterations, and may, additionally, result in oxidative stress, which may lead to DNA oxidation, and subsequently altered DNA methyltransferase activity. In this way, DNA methylation may be affected at a critical developmental stage, causing the observed apical phenotypes. PMID:27827847
Strupp, Christian; Bomann, Werner; Cohen, Samuel M; Weber, Klaus
2016-12-01
Fluensulfone is a nematicide for agricultural use. Chronic dietary exposure led to bronchiolo-alveolar hyperplasia and bronchiolo-alveolar adenomas in CD-1 mice but not in rats. Genotoxicity could be excluded as a mode of action (MOA). An earlier publication (Strupp, C., Banas, D. A., Cohen, S. M., Gordon, E. B., Jaeger, M., and Weber, K. (2012). Relationship of metabolism and cell proliferation to the mode of action of fluensulfone-induced mouse lung tumors: analysis of their human relevance using the IPCS framework. Toxicol. Sci. 128, 284-294.) reported MOA studies identifying the following key events: increased metabolism of fluensulfone by CYP2f2 in mouse lung Club cells, followed by local proliferation, finally leading to adenoma formation. Human lung microsomes were found not to metabolize fluensulfone. The Joint FAO/WHO Meeting on Pesticide Residues has reviewed the previous data and concluded that the MOA is plausible; however, some areas of uncertainty were identified. This publication provides additional data to address these uncertainties. New cell proliferation studies in mice showed that the MOA is functionally independent of sex. A threshold of cell proliferation in Club cells correlating with the dose response for adenoma formation was shown. CYP2f2 knockout mice did not react to fluensulfone exposure with cell proliferation like wild-type mice, confirming the key role of this enzyme. The collective data for fluensulfone were evaluated according to the International Programme on Chemical Safety (IPCS) Mode of Action Framework, which leads to the conclusion that the mouse-specific lung tumors after fluensulfone are not relevant to humans. © The Author 2016. Published by Oxford University Press on behalf of the Society of Toxicology. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Rational and mechanistic perspectives on reinforcement learning.
Chater, Nick
2009-12-01
This special issue describes important recent developments in applying reinforcement learning models to capture neural and cognitive function. But reinforcement learning, as a theoretical framework, can apply at two very different levels of description: mechanistic and rational. Reinforcement learning is often viewed in mechanistic terms--as describing the operation of aspects of an agent's cognitive and neural machinery. Yet it can also be viewed as a rational level of description, specifically, as describing a class of methods for learning from experience, using minimal background knowledge. This paper considers how rational and mechanistic perspectives differ, and what types of evidence distinguish between them. Reinforcement learning research in the cognitive and brain sciences is often implicitly committed to the mechanistic interpretation. Here the opposite view is put forward: that accounts of reinforcement learning should apply at the rational level, unless there is strong evidence for a mechanistic interpretation. Implications of this viewpoint for reinforcement-based theories in the cognitive and brain sciences are discussed.
Pattern formation in mass conserving reaction-diffusion systems
NASA Astrophysics Data System (ADS)
Brauns, Fridtjof; Halatek, Jacob; Frey, Erwin
We present a rigorous theoretical framework able to generalize and unify pattern formation for quantitative mass conserving reaction-diffusion models. Mass redistribution controls chemical equilibria locally. Separation of diffusive mass redistribution on the level of conserved species provides a general mathematical procedure to decompose complex reaction-diffusion systems into effectively independent functional units, and to reveal the general underlying bifurcation scenarios. We apply this framework to Min protein pattern formation and identify the mechanistic roles of both involved protein species. MinD generates polarity through phase separation, whereas MinE takes the role of a control variable regulating the existence of MinD phases. Hence, polarization and not oscillations is the generic core dynamics of Min proteins in vivo. This establishes an intrinsic mechanistic link between the Min system and a broad class of intracellular pattern forming systems based on bistability and phase separation (wave-pinning). Oscillations are facilitated by MinE redistribution and can be understood mechanistically as relaxation oscillations of the polarization direction.
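As a brief illustration of the class of models addressed, a generic two-component mass-conserving reaction-diffusion system can be sketched as follows (a schematic of the general form only, not the authors' specific Min-system equations):

```latex
% Generic two-component mass-conserving reaction-diffusion (MCRD) sketch:
% u and v are, e.g., membrane-bound and cytosolic forms of one protein species.
\begin{align}
  \partial_t u &= D_u \nabla^2 u + f(u,v), \\
  \partial_t v &= D_v \nabla^2 v - f(u,v).
\end{align}
% With no-flux boundaries the reaction term only interconverts u and v, so
\begin{equation}
  \frac{\mathrm{d}}{\mathrm{d}t}\int_\Omega \bigl(u+v\bigr)\,\mathrm{d}x = 0 ,
\end{equation}
% i.e. total mass is conserved and patterns arise purely from mass redistribution.
```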
Application of the adverse outcome pathway framework - advances and challenges
The adverse outcome pathway (AOP) framework, while not new in concept, has gained attention in recent years as a set of organizing principles and tools that can help facilitate greater use of mechanistic or pathway-based data in risk assessment and regulatory decision-making. Reg...
Yé, Yazoume; Eisele, Thomas P.; Eckert, Erin; Korenromp, Eline; Shah, Jui A.; Hershey, Christine L.; Ivanovich, Elizabeth; Newby, Holly; Carvajal-Velez, Liliana; Lynch, Michael; Komatsu, Ryuichi; Cibulskis, Richard E.; Moore, Zhuzhi; Bhattarai, Achuyt
2017-01-01
Abstract. Concerted efforts from national and international partners have scaled up malaria control interventions, including insecticide-treated nets, indoor residual spraying, diagnostics, prompt and effective treatment of malaria cases, and intermittent preventive treatment during pregnancy in sub-Saharan Africa (SSA). This scale-up warrants an assessment of its health impact to guide future efforts and investments; however, measuring malaria-specific mortality and the overall impact of malaria control interventions remains challenging. In 2007, Roll Back Malaria's Monitoring and Evaluation Reference Group proposed a theoretical framework for evaluating the impact of full-coverage malaria control interventions on morbidity and mortality in high-burden SSA countries. Recently, several evaluations have contributed new ideas and lessons to strengthen this plausibility design. This paper harnesses that new evaluation experience to expand the framework, with additional features, such as stratification, to examine subgroups most likely to experience improvement if control programs are working; the use of a national platform framework; and analysis of complete birth histories from national household surveys. The refined framework has shown that, despite persisting data challenges, combining multiple sources of data, considering potential contributions from both fundamental and proximate contextual factors, and conducting subnational analyses allows identification of the plausible contributions of malaria control interventions on malaria morbidity and mortality. PMID:28990923
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mann, Greg; Koehnke, Jesko; Bent, Andrew F.
The highly conserved domain of unknown function in the cyanobactin superfamily has a novel fold. The protein does not appear to bind the most plausible substrates, leaving questions as to its role. Patellamides are members of the cyanobactin family of ribosomally synthesized and post-translationally modified cyclic peptide natural products, many of which, including some patellamides, are biologically active. A detailed mechanistic understanding of the biosynthetic pathway would enable the construction of a biotechnological ‘toolkit’ to make novel analogues of patellamides that are not found in nature. All but two of the protein domains involved in patellamide biosynthesis have been characterized. The two domains of unknown function (DUFs) are homologous to each other and are found at the C-termini of the multi-domain proteins PatA and PatG. The domain sequence is found in all cyanobactin-biosynthetic pathways characterized to date, implying a functional role in cyanobactin biosynthesis. Here, the crystal structure of the PatG DUF domain is reported and its binding interactions with plausible substrates are investigated.
Adverse outcome pathway (AOP) development I: Strategies and principles
An adverse outcome pathway (AOP) is a conceptual framework that organizes existing knowledge concerning biologically plausible, and empirically-supported, links between molecular-level perturbation of a biological system and an adverse outcome at a level of biological organizatio...
Bequest Motives and the Annuity Puzzle.
Lockwood, Lee M
2012-04-01
Few retirees annuitize any wealth, a fact that has so far defied explanation within the standard framework of forward-looking, expected utility-maximizing agents. Bequest motives seem a natural explanation. Yet the prevailing view is that people with plausible bequest motives should annuitize part of their wealth, and thus that bequest motives cannot explain why most people do not annuitize any wealth. I show, however, that people with plausible bequest motives are likely to be better off not annuitizing any wealth at available rates. The evidence suggests that bequest motives play a central role in limiting the demand for annuities.
Bequest Motives and the Annuity Puzzle
Lockwood, Lee M.
2011-01-01
Few retirees annuitize any wealth, a fact that has so far defied explanation within the standard framework of forward-looking, expected utility-maximizing agents. Bequest motives seem a natural explanation. Yet the prevailing view is that people with plausible bequest motives should annuitize part of their wealth, and thus that bequest motives cannot explain why most people do not annuitize any wealth. I show, however, that people with plausible bequest motives are likely to be better off not annuitizing any wealth at available rates. The evidence suggests that bequest motives play a central role in limiting the demand for annuities. PMID:22822300
Prebiotic NH3 Formation: Insights from Simulations.
Stirling, András; Rozgonyi, Tamás; Krack, Matthias; Bernasconi, Marco
2016-02-15
Simulations of prebiotic NH₃ synthesis from NO₃⁻ and NO₂⁻ on pyrite surfaces under hydrothermal conditions are reported. Ab initio metadynamics calculations have successfully explored the full reaction path which explains earlier experimental observations. We have found that the reaction mechanism can be constructed from stepwise single atom transfers which are compatible with the expected reaction time scales. The roles of the hot-pressurized water and of the pyrite surfaces have been addressed. The mechanistic picture that emerged from the simulations strengthens the theory of chemoautotrophic origin of life by providing plausible reaction pathways for the formation of ammonia within the iron-sulfur-world scenario.
Cerebrovascular Hemodynamics in Women.
Duque, Cristina; Feske, Steven K; Sorond, Farzaneh A
2017-12-01
Sex and gender, as biological and social factors, significantly influence health outcomes. Among the biological factors, sex differences in vascular physiology may be one specific mechanism contributing to the observed differences in clinical presentation, response to treatment, and clinical outcomes in several vascular disorders. This review focuses on the cerebrovascular bed and summarizes the existing literature on sex differences in cerebrovascular hemodynamics to highlight the knowledge deficit that exists in this domain. The available evidence is used to generate mechanistically plausible and testable hypotheses to underscore the unmet need in understanding sex-specific mechanisms as targets for more effective therapeutic and preventive strategies.
Heck, Daniel W; Hilbig, Benjamin E; Moshagen, Morten
2017-08-01
Decision strategies explain how people integrate multiple sources of information to make probabilistic inferences. In the past decade, increasingly sophisticated methods have been developed to determine which strategy explains decision behavior best. We extend these efforts to test psychologically more plausible models (i.e., strategies), including a new, probabilistic version of the take-the-best (TTB) heuristic that implements a rank order of error probabilities based on sequential processing. Within a coherent statistical framework, deterministic and probabilistic versions of TTB and other strategies can be directly compared using model selection by minimum description length or the Bayes factor. In an experiment with inferences from given information, only three of 104 participants were best described by the psychologically plausible, probabilistic version of TTB. As in previous studies, most participants were classified as users of weighted-additive, a strategy that integrates all available information and approximates rational decisions. Copyright © 2017 Elsevier Inc. All rights reserved.
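For readers unfamiliar with the heuristic, a minimal sketch of a deterministic versus probabilistic take-the-best (TTB) decision rule is given below; the cue order and cue-wise error probabilities are illustrative assumptions rather than values estimated in the study.

```python
import random

# Cue validity order (most to least valid) and cue-wise error probabilities are
# illustrative assumptions, not fitted values from the experiment.
CUE_ORDER = [0, 2, 1, 3]
ERROR_RATES = {0: 0.05, 2: 0.10, 1: 0.15, 3: 0.20}

def ttb_choice(option_a, option_b, probabilistic=False, rng=None):
    """Return 'A' or 'B' for two options described by 0/1 cue vectors."""
    rng = rng or random.Random(1)
    for cue in CUE_ORDER:
        if option_a[cue] != option_b[cue]:
            favored = "A" if option_a[cue] > option_b[cue] else "B"
            if probabilistic and rng.random() < ERROR_RATES[cue]:
                return "B" if favored == "A" else "A"  # occasional response error
            return favored                              # first discriminating cue decides
    return rng.choice(["A", "B"])                       # guess if no cue discriminates

print(ttb_choice([1, 0, 1, 0], [1, 1, 0, 0]))                     # deterministic TTB -> 'A'
print(ttb_choice([1, 0, 1, 0], [1, 1, 0, 0], probabilistic=True))
```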
A Biomass-based Model to Estimate the Plausibility of Exoplanet Biosignature Gases
NASA Astrophysics Data System (ADS)
Seager, S.; Bains, W.; Hu, R.
2013-10-01
Biosignature gas detection is one of the ultimate future goals for exoplanet atmosphere studies. We have created a framework for linking biosignature gas detectability to biomass estimates, including atmospheric photochemistry and biological thermodynamics. The new framework is intended to liberate predictive atmosphere models from requiring fixed, Earth-like biosignature gas source fluxes. New biosignature gases can be considered with a check that the biomass estimate is physically plausible. We have validated the models on terrestrial production of NO, H2S, CH4, CH3Cl, and DMS. We have applied the models to propose NH3 as a biosignature gas on a "cold Haber World," a planet with a N2-H2 atmosphere, and to demonstrate why gases such as CH3Cl must have too large of a biomass to be a plausible biosignature gas on planets with Earth or early-Earth-like atmospheres orbiting a Sun-like star. To construct the biomass models, we developed a functional classification of biosignature gases, and found that gases (such as CH4, H2S, and N2O) produced from life that extracts energy from chemical potential energy gradients will always have false positives because geochemistry has the same gases to work with as life does, and gases (such as DMS and CH3Cl) produced for secondary metabolic reasons are far less likely to have false positives but because of their highly specialized origin are more likely to be produced in small quantities. The biomass model estimates are valid to one or two orders of magnitude; the goal is an independent approach to testing whether a biosignature gas is plausible rather than a precise quantification of atmospheric biosignature gases and their corresponding biomasses.
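The logic of linking detectability to biomass can be reduced to an order-of-magnitude consistency check, sketched below with placeholder numbers that are not taken from the paper.

```python
# Order-of-magnitude biomass plausibility check (all numbers are illustrative
# placeholders, not values from the study).
required_flux = 1.0e14        # biosignature gas molecules per m^2 per s needed for detection (assumed)
per_mass_rate = 1.0e12        # molecules produced per second per gram of organism (assumed)
plausible_sigma_max = 100.0   # g of biomass per m^2 taken as a generous upper bound (assumed)

required_sigma = required_flux / per_mass_rate   # biomass surface density needed, g/m^2
print(f"Required biomass surface density: {required_sigma:.1f} g/m^2")
print("Plausible biosignature candidate" if required_sigma <= plausible_sigma_max
      else "Implausibly large biomass required")
```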
Knowledge-based vision and simple visual machines.
Cliff, D; Noble, J
1997-01-01
The vast majority of work in machine vision emphasizes the representation of perceived objects and events: it is these internal representations that incorporate the 'knowledge' in knowledge-based vision or form the 'models' in model-based vision. In this paper, we discuss simple machine vision systems developed by artificial evolution rather than traditional engineering design techniques, and note that the task of identifying internal representations within such systems is made difficult by the lack of an operational definition of representation at the causal mechanistic level. Consequently, we question the nature and indeed the existence of representations posited to be used within natural vision systems (i.e. animals). We conclude that representations argued for on a priori grounds by external observers of a particular vision system may well be illusory, and are at best place-holders for yet-to-be-identified causal mechanistic interactions. That is, applying the knowledge-based vision approach in the understanding of evolved systems (machines or animals) may well lead to theories and models that are internally consistent, computationally plausible, and entirely wrong. PMID:9304684
Identifying collaborative care teams through electronic medical record utilization patterns.
Chen, You; Lorenzi, Nancy M; Sandberg, Warren S; Wolgast, Kelly; Malin, Bradley A
2017-04-01
The goal of this investigation was to determine whether automated approaches can learn patient-oriented care teams via utilization of an electronic medical record (EMR) system. To perform this investigation, we designed a data-mining framework that relies on a combination of latent topic modeling and network analysis to infer patterns of collaborative teams. We applied the framework to the EMR utilization records of over 10 000 employees and 17 000 inpatients at a large academic medical center during a 4-month window in 2010. Next, we conducted an extrinsic evaluation of the patterns to determine the plausibility of the inferred care teams via surveys with knowledgeable experts. Finally, we conducted an intrinsic evaluation to contextualize each team in terms of collaboration strength (via a cluster coefficient) and clinical credibility (via associations between teams and patient comorbidities). The framework discovered 34 collaborative care teams, 27 (79.4%) of which were confirmed as administratively plausible. Of those, 26 teams depicted strong collaborations, with a cluster coefficient > 0.5. There were 119 diagnostic conditions associated with 34 care teams. Additionally, to provide clarity on how the survey respondents arrived at their determinations, we worked with several oncologists to develop an illustrative example of how a certain team functions in cancer care. Inferred collaborative teams are plausible; translating such patterns into optimized collaborative care will require administrative review and integration with management practices. EMR utilization records can be mined for collaborative care patterns in large complex medical centers. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com
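A schematic of the two-stage mining pattern described in this abstract, with a synthetic access matrix standing in for real EMR audit logs, might look as follows; the library choices, matrix sizes, and the top-6 team cutoff are illustrative assumptions.

```python
import numpy as np
import networkx as nx
from sklearn.decomposition import LatentDirichletAllocation

rng = np.random.default_rng(0)

# Toy EMR utilization matrix: rows = patients, columns = employees,
# entries = number of record accesses (synthetic stand-in for real audit logs).
n_patients, n_employees = 200, 40
access = rng.poisson(0.3, size=(n_patients, n_employees))

# Stage 1: latent topic model; each "topic" groups employees who tend to
# access the same patients, i.e. a candidate collaborative care team.
lda = LatentDirichletAllocation(n_components=5, random_state=0).fit(access)
teams = [np.argsort(comp)[-6:] for comp in lda.components_]  # top 6 employees per candidate team

# Stage 2: network check; connect employees who co-access patients and
# measure how tightly each candidate team is knit (the study used a
# cluster coefficient > 0.5 as its strength criterion).
co_access = (access.T @ access) > 0
G = nx.Graph()
G.add_edges_from((i, j) for i in range(n_employees)
                 for j in range(i + 1, n_employees) if co_access[i, j])
for k, team in enumerate(teams):
    strength = nx.average_clustering(G.subgraph(team.tolist()))
    print(f"team {k}: employees {sorted(team.tolist())}, cluster coefficient {strength:.2f}")
```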
Zhu, Xin-Guang; Lynch, Jonathan P; LeBauer, David S; Millar, Andrew J; Stitt, Mark; Long, Stephen P
2016-05-01
A paradigm shift is needed and timely in moving plant modelling from largely isolated efforts to a connected community endeavour that can take full advantage of advances in computer science and in mechanistic understanding of plant processes. Plants in silico (Psi) envisions a digital representation of layered dynamic modules, linking from gene networks and metabolic pathways through to cellular organization, tissue, organ and whole plant development, together with resource capture and use efficiency in dynamic competitive environments, ultimately allowing a mechanistically rich simulation of the plant or of a community of plants in silico. The concept is to integrate models or modules from different layers of organization spanning from genome to phenome to ecosystem in a modular framework allowing the use of modules of varying mechanistic detail representing the same biological process. Developments in high-performance computing, functional knowledge of plants, the internet and open-source version controlled software make achieving the concept realistic. Open source will enhance collaboration and move towards testing and consensus on quantitative theoretical frameworks. Importantly, Psi provides a quantitative knowledge framework where the implications of a discovery at one level, for example, single gene function or developmental response, can be examined at the whole plant or even crop and natural ecosystem levels. © 2015 The Authors Plant, Cell & Environment Published by John Wiley & Sons Ltd.
Mourning dove hunting regulation strategy based on annual harvest statistics and banding data
Otis, D.L.
2006-01-01
Although managers should strive to base game bird harvest management strategies on mechanistic population models, monitoring programs required to build and continuously update these models may not be in place. Alternatively, if estimates of total harvest and harvest rates are available, then population estimates derived from these harvest data can serve as the basis for making hunting regulation decisions based on population growth rates derived from these estimates. I present a statistically rigorous approach for regulation decision-making using a hypothesis-testing framework and an assumed set of 3 hunting regulation alternatives. I illustrate and evaluate the technique with historical data on the mid-continent mallard (Anas platyrhynchos) population. I evaluate the statistical properties of the hypothesis-testing framework using the best available data on mourning doves (Zenaida macroura). I use these results to discuss practical implementation of the technique as an interim harvest strategy for mourning doves until reliable mechanistic population models and associated monitoring programs are developed.
Impaired associative learning in schizophrenia: behavioral and computational studies
Diwadkar, Vaibhav A.; Flaugher, Brad; Jones, Trevor; Zalányi, László; Ujfalussy, Balázs; Keshavan, Matcheri S.
2008-01-01
Associative learning is a central building block of human cognition and in large part depends on mechanisms of synaptic plasticity, memory capacity and fronto–hippocampal interactions. A disorder like schizophrenia is thought to be characterized by altered plasticity, and impaired frontal and hippocampal function. Understanding the expression of this dysfunction through appropriate experimental studies, and understanding the processes that may give rise to impaired behavior through biologically plausible computational models will help clarify the nature of these deficits. We present a preliminary computational model designed to capture learning dynamics in healthy control and schizophrenia subjects. Experimental data was collected on a spatial-object paired-associate learning task. The task evinces classic patterns of negatively accelerated learning in both healthy control subjects and patients, with patients demonstrating lower rates of learning than controls. Our rudimentary computational model of the task was based on biologically plausible assumptions, including the separation of dorsal/spatial and ventral/object visual streams, implementation of rules of learning, the explicit parameterization of learning rates (a plausible surrogate for synaptic plasticity), and learning capacity (a plausible surrogate for memory capacity). Reductions in learning dynamics in schizophrenia were well-modeled by reductions in learning rate and learning capacity. The synergy between experimental research and a detailed computational model of performance provides a framework within which to infer plausible biological bases of impaired learning dynamics in schizophrenia. PMID:19003486
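The two parameters emphasized by the model, a learning rate (surrogate for synaptic plasticity) and a learning capacity (surrogate for memory capacity), can be illustrated with a toy negatively accelerated learning curve; the group parameter values below are invented for illustration, not fitted to the study data.

```python
import numpy as np

def learning_curve(n_trials, rate, capacity):
    """Negatively accelerated learning: performance approaches `capacity`
    at a speed set by `rate` (both are illustrative surrogates, not fitted values)."""
    trials = np.arange(n_trials)
    return capacity * (1.0 - np.exp(-rate * trials))

controls = learning_curve(8, rate=0.6, capacity=1.0)   # assumed healthy-control parameters
patients = learning_curve(8, rate=0.3, capacity=0.8)   # assumed reduced rate and capacity

for t, (c, p) in enumerate(zip(controls, patients)):
    print(f"block {t}: control {c:.2f}  patient {p:.2f}")
```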
Edelman, Gerald M.; Gally, Joseph A.; Baars, Bernard J.
2010-01-01
The Dynamic Core and Global Workspace hypotheses were independently put forward to provide mechanistic and biologically plausible accounts of how brains generate conscious mental content. The Dynamic Core proposes that reentrant neural activity in the thalamocortical system gives rise to conscious experience. Global Workspace reconciles the limited capacity of momentary conscious content with the vast repertoire of long-term memory. In this paper we show the close relationship between the two hypotheses. This relationship allows for a strictly biological account of phenomenal experience and subjectivity that is consistent with mounting experimental evidence. We examine the constraints on causal analyses of consciousness and suggest that there is now sufficient evidence to consider the design and construction of a conscious artifact. PMID:21713129
Aguilar, David; Contel, Maria; Urriolabeitia, Esteban P
2010-08-09
Propargylamines can be obtained from secondary amines and terminal alkynes in chlorinated solvents by a three- and two-component synthesis catalyzed by gold compounds and nanoparticles (Au-NP) under mild conditions. The use of dichloromethane allows for the activation of two C-Cl bonds and a clean transfer of the methylene fragment to the final product. The scope of the reaction as well as the influence of different gold(III) cycloaurated complexes and salts has been investigated. The involvement of gold nanoparticles generated in situ in the process is discussed and a plausible reaction mechanism is proposed on the basis of the data obtained.
Generative mechanistic explanation building in undergraduate molecular and cellular biology
NASA Astrophysics Data System (ADS)
Southard, Katelyn M.; Espindola, Melissa R.; Zaepfel, Samantha D.; Bolger, Molly S.
2017-09-01
When conducting scientific research, experts in molecular and cellular biology (MCB) use specific reasoning strategies to construct mechanistic explanations for the underlying causal features of molecular phenomena. We explored how undergraduate students applied this scientific practice in MCB. Drawing from studies of explanation building among scientists, we created and applied a theoretical framework to explore the strategies students use to construct explanations for 'novel' biological phenomena. Specifically, we explored how students navigated the multi-level nature of complex biological systems using generative mechanistic reasoning. Interviews were conducted with introductory and upper-division biology students at a large public university in the United States. Results of qualitative coding revealed key features of students' explanation building. Students used modular thinking to consider the functional subdivisions of the system, which they 'filled in' to varying degrees with mechanistic elements. They also hypothesised the involvement of mechanistic entities and instantiated abstract schema to adapt their explanations to unfamiliar biological contexts. Finally, we explored the flexible thinking that students used to hypothesise the impact of mutations on multi-leveled biological systems. Results revealed a number of ways that students drew mechanistic connections between molecules, functional modules (sets of molecules with an emergent function), cells, tissues, organisms and populations.
Using foresight methods to anticipate future threats: the case of disease management.
Ma, Sai; Seid, Michael
2006-01-01
We describe a unique foresight framework for health care managers to use in longer-term planning. This framework uses scenario-building to envision plausible alternate futures of the U.S. health care system and links those broad futures to business-model-specific "load-bearing" assumptions. Because the framework we describe simultaneously addresses very broad and very specific issues, it can be easily applied to a broad range of health care issues by using the broad framework and business-specific assumptions for the particular case at hand. We illustrate this method using the case of disease management, pointing out that although the industry continues to grow rapidly, its future also contains great uncertainties.
Multinational Experiment 7. Outcome 3 - Cyber Domain. Objective 3.3: Concept Framework Version 3.0
2012-10-03
experimentation in order to give some parameters for Decision Makers’ actions. A.5 DIFFERENT LEGAL FRAMEWORKS The juridical framework to which we refer, in ... material effects (e.g. psychological impact), economic et al, or, especially in the military field, it may affect Operational Security (OPSEC). 7 ... not expected at all to be run as a mechanistic tool that produces univocal outputs on the base of juridically qualified inputs, making unnecessary
Magnetic field effects in proteins
NASA Astrophysics Data System (ADS)
Jones, Alex R.
2016-06-01
Many animals can sense the geomagnetic field, which appears to aid in behaviours such as migration. The influence of man-made magnetic fields on biology, however, is potentially more sinister, with adverse health effects being claimed from exposure to fields from mobile phones or high-voltage power lines. Do these phenomena have a common, biophysical origin, and is it even plausible that such weak fields can profoundly impact noisy biological systems? Radical pair intermediates are widespread in protein reaction mechanisms, and the radical pair mechanism has risen to prominence as perhaps the most plausible means by which even very weak fields might impact biology. In this New Views article, I will discuss the literature over the past 40 years that has investigated the topic of magnetic field effects in proteins. The lack of reproducible results has cast a shadow over the area. However, magnetic field and spin effects have proven to be useful mechanistic tools for probing radical mechanisms in biology. Moreover, if a magnetic effect on a radical pair mechanism in a protein were to influence a biological system, the conditions necessary for it to do so appear increasingly unlikely to have come about by chance.
Modelling the ecological niche from functional traits
Kearney, Michael; Simpson, Stephen J.; Raubenheimer, David; Helmuth, Brian
2010-01-01
The niche concept is central to ecology but is often depicted descriptively through observing associations between organisms and habitats. Here, we argue for the importance of mechanistically modelling niches based on functional traits of organisms and explore the possibilities for achieving this through the integration of three theoretical frameworks: biophysical ecology (BE), the geometric framework for nutrition (GF) and dynamic energy budget (DEB) models. These three frameworks are fundamentally based on the conservation laws of thermodynamics, describing energy and mass balance at the level of the individual and capturing the prodigious predictive power of the concepts of ‘homeostasis’ and ‘evolutionary fitness’. BE and the GF provide mechanistic multi-dimensional depictions of climatic and nutritional niches, respectively, providing a foundation for linking organismal traits (morphology, physiology, behaviour) with habitat characteristics. In turn, they provide driving inputs and cost functions for mass/energy allocation within the individual as determined by DEB models. We show how integration of the three frameworks permits calculation of activity constraints, vital rates (survival, development, growth, reproduction) and ultimately population growth rates and species distributions. When integrated with contemporary niche theory, functional trait niche models hold great promise for tackling major questions in ecology and evolutionary biology. PMID:20921046
Iyappan, Anandhi; Kawalia, Shweta Bagewadi; Raschka, Tamara; Hofmann-Apitius, Martin; Senger, Philipp
2016-07-08
Neurodegenerative diseases are incurable and debilitating indications with huge social and economic impact, where much is still to be learnt about the underlying molecular events. Mechanistic disease models could offer a knowledge framework to help decipher the complex interactions that occur at molecular and cellular levels. This motivates the need for the development of an approach integrating highly curated and heterogeneous data into a disease model spanning different regulatory data layers. Although several disease models exist, they often do not consider the quality of underlying data. Moreover, even with the current advancements in semantic web technology, we still do not have a cure for complex diseases like Alzheimer's disease. One of the key reasons for this could be the increasing gap between generated data and the derived knowledge. In this paper, we describe an approach, called NeuroRDF, to develop an integrative framework for modeling curated knowledge in the area of complex neurodegenerative diseases. The core of this strategy lies in the usage of well curated and context-specific data for integration into one single semantic web-based framework, RDF. This increases the probability of the derived knowledge being novel and reliable in a specific disease context. This infrastructure integrates highly curated data from databases (Bind, IntAct, etc.), literature (PubMed), and gene expression resources (such as GEO and ArrayExpress). We illustrate the effectiveness of our approach by asking real-world biomedical questions that link these resources to prioritize plausible biomarker candidates. Among the 13 prioritized candidate genes, we identified MIF to be a potential emerging candidate due to its role as a pro-inflammatory cytokine. We additionally report on the effort and challenges faced during generation of such an indication-specific knowledge base comprising curated and quality-controlled data. Although many alternative approaches have been proposed and practiced for modeling diseases, semantic web technology is a flexible and well-established solution for harmonized aggregation. The benefit of this work, namely the use of high-quality and context-specific data, becomes apparent in highlighting previously unattended biomarker candidates around a well-known mechanism, which can be further leveraged for experimental investigation.
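The style of query that such an RDF-based integration enables can be sketched with rdflib; the miniature graph, namespace, and predicate names below are hypothetical stand-ins and do not reproduce the actual NeuroRDF schema.

```python
from rdflib import Graph, Literal, Namespace, RDF

# Hypothetical mini-graph standing in for an integrated, curated RDF store.
EX = Namespace("http://example.org/neuro#")
g = Graph()
g.bind("ex", EX)
for gene, disease, source in [("MIF", "AlzheimersDisease", "GEO"),
                              ("APP", "AlzheimersDisease", "IntAct"),
                              ("MIF", "AlzheimersDisease", "PubMed")]:
    stmt = EX[f"{gene}_{source}"]
    g.add((stmt, RDF.type, EX.Association))
    g.add((stmt, EX.gene, Literal(gene)))
    g.add((stmt, EX.disease, EX[disease]))
    g.add((stmt, EX.evidenceSource, Literal(source)))

# Prioritize candidates by how many independent curated sources support them.
query = """
PREFIX ex: <http://example.org/neuro#>
SELECT ?gene (COUNT(DISTINCT ?source) AS ?nSources) WHERE {
  ?a a ex:Association ;
     ex:gene ?gene ;
     ex:disease ex:AlzheimersDisease ;
     ex:evidenceSource ?source .
} GROUP BY ?gene ORDER BY DESC(?nSources)
"""
for row in g.query(query):
    print(row.gene, row.nSources)
```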
Pharmacometric Models for Characterizing the Pharmacokinetics of Orally Inhaled Drugs.
Borghardt, Jens Markus; Weber, Benjamin; Staab, Alexander; Kloft, Charlotte
2015-07-01
During the last decades, the importance of modeling and simulation in clinical drug development, with the goal to qualitatively and quantitatively assess and understand mechanisms of pharmacokinetic processes, has strongly increased. However, this increase could not equally be observed for orally inhaled drugs. The objectives of this review are to understand the reasons for this gap and to demonstrate the opportunities that mathematical modeling of pharmacokinetics of orally inhaled drugs offers. To achieve these objectives, this review (i) discusses pulmonary physiological processes and their impact on the pharmacokinetics after drug inhalation, (ii) provides a comprehensive overview of published pharmacokinetic models, (iii) categorizes these models into physiologically based pharmacokinetic (PBPK) and (clinical data-derived) empirical models, (iv) explores both their (mechanistic) plausibility, and (v) addresses critical aspects of different pharmacometric approaches pertinent for drug inhalation. In summary, pulmonary deposition, dissolution, and absorption are highly complex processes and may represent the major challenge for modeling and simulation of PK after oral drug inhalation. Challenges in relating systemic pharmacokinetics with pulmonary efficacy may be another factor contributing to the limited number of existing pharmacokinetic models for orally inhaled drugs. Investigations comprising in vitro experiments, clinical studies, and more sophisticated mathematical approaches are considered to be necessary for elucidating these highly complex pulmonary processes. With this additional knowledge, the PBPK approach might gain additional attractiveness. Currently, (semi-)mechanistic modeling offers an alternative to generate and investigate hypotheses and to more mechanistically understand the pulmonary and systemic pharmacokinetics after oral drug inhalation including the impact of pulmonary diseases.
Word Learning as Bayesian Inference
ERIC Educational Resources Information Center
Xu, Fei; Tenenbaum, Joshua B.
2007-01-01
The authors present a Bayesian framework for understanding how adults and children learn the meanings of words. The theory explains how learners can generalize meaningfully from just one or a few positive examples of a novel word's referents, by making rational inductive inferences that integrate prior knowledge about plausible word meanings with…
Testing Adaptive Toolbox Models: A Bayesian Hierarchical Approach
ERIC Educational Resources Information Center
Scheibehenne, Benjamin; Rieskamp, Jorg; Wagenmakers, Eric-Jan
2013-01-01
Many theories of human cognition postulate that people are equipped with a repertoire of strategies to solve the tasks they face. This theoretical framework of a cognitive toolbox provides a plausible account of intra- and interindividual differences in human behavior. Unfortunately, it is often unclear how to rigorously test the toolbox…
Development of reliable pavement models.
DOT National Transportation Integrated Search
2011-05-01
The current report proposes a framework for estimating the reliability of a given pavement structure as analyzed by : the Mechanistic-Empirical Pavement Design Guide (MEPDG). The methodology proposes using a previously fit : response surface, in plac...
Dallmann, André; Ince, Ibrahim; Meyer, Michaela; Willmann, Stefan; Eissing, Thomas; Hempel, Georg
2017-11-01
In recent years, several repositories for anatomical and physiological parameters required for physiologically based pharmacokinetic modeling in pregnant women have been published. While these provide a good basis, some important aspects can be further detailed. For example, they did not account for the variability associated with parameters or lacked key parameters necessary for developing more detailed mechanistic pregnancy physiologically based pharmacokinetic models, such as the composition of pregnancy-specific tissues. The aim of this meta-analysis was to provide an updated and extended database of anatomical and physiological parameters in healthy pregnant women that also accounts for changes in the variability of a parameter throughout gestation and for the composition of pregnancy-specific tissues. A systematic literature search was carried out to collect study data on pregnancy-related changes of anatomical and physiological parameters. For each parameter, a set of mathematical functions was fitted to the data and to the standard deviation observed among the data. The best-performing functions were selected based on numerical and visual diagnostics as well as on physiological plausibility. The literature search yielded 473 studies, 302 of which met the criteria to be further analyzed and compiled in a database. In total, the database encompassed 7729 data points. Although the availability of quantitative data for some parameters remained limited, mathematical functions could be generated for many important parameters. Gaps were filled based on qualitative knowledge and physiologically plausible assumptions. The presented results facilitate the integration of pregnancy-dependent changes in anatomy and physiology into mechanistic population physiologically based pharmacokinetic models. Such models can ultimately provide a valuable tool to investigate the pharmacokinetics during pregnancy in silico and support informed decision making regarding optimal dosing regimens in this vulnerable special population.
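The fitting step described above can be sketched as follows: candidate functions of gestational age are fit to pooled observations of one parameter and ranked by a simple information criterion. The data points and candidate functional forms are synthetic assumptions, not values from the meta-analysis.

```python
import numpy as np
from scipy.optimize import curve_fit

# Toy pooled observations of one physiological parameter (e.g. a tissue volume)
# versus gestational age in weeks -- synthetic values, not data from the meta-analysis.
ga = np.array([8, 12, 16, 20, 24, 28, 32, 36, 40], dtype=float)
obs = np.array([60, 120, 210, 300, 370, 420, 460, 480, 500], dtype=float)

def linear(t, a, b):
    return a + b * t

def gompertz(t, a, b, c):
    return a * np.exp(-b * np.exp(-c * t))

candidates = {"linear": (linear, [0.0, 10.0]), "gompertz": (gompertz, [500.0, 5.0, 0.1])}
for name, (f, p0) in candidates.items():
    popt, _ = curve_fit(f, ga, obs, p0=p0, maxfev=10000)
    rss = np.sum((obs - f(ga, *popt)) ** 2)
    n, k = len(obs), len(popt)
    aic = n * np.log(rss / n) + 2 * k    # simple AIC used to rank candidate functions
    print(f"{name}: AIC = {aic:.1f}, parameters = {np.round(popt, 3)}")
```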
Mechanistic Probes of Zeolitic Imidazolate Framework for Photocatalytic Application
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pattengale, Brian; Yang, Sizhuo; Lee, Sungsik
2017-11-07
In this work, we report a zeolitic imidazolate framework (ZIF-67) with remarkable activity for the hydrogen evolution reaction (HER) of 40,500 μmol H2/g MOF, which is, to the best of our knowledge, the highest activity achieved by any MOF system. This result necessitated assessment of its atomic-scale mechanistic function for HER using advanced spectroscopy techniques including time-resolved optical (OTA) and in situ X-ray absorption (XAS) spectroscopy. Through the correlation of OTA results with catalytic performance, we demonstrated that the electron transfer (ET) rather than energy transfer (ENT) pathway between photosensitizer and ZIF-67 is the key factor that controls the efficiency of HER activity, as HER activity via the ET pathway is 3 orders of magnitude higher than that of the ENT process. Using in situ XAS, we unraveled the spectral features of key intermediate species which are likely responsible for the rate-determining process under turnover conditions. This work represents an original approach to study porous ZIF materials at the molecular level using advanced spectroscopic techniques, providing unprecedented insights into the photoactive nature of ZIF frameworks.
ERIC Educational Resources Information Center
Laszlo, Sarah; Plaut, David C.
2012-01-01
The Parallel Distributed Processing (PDP) framework has significant potential for producing models of cognitive tasks that approximate how the brain performs the same tasks. To date, however, there has been relatively little contact between PDP modeling and data from cognitive neuroscience. In an attempt to advance the relationship between…
Complex Systems and Educational Change: Towards a New Research Agenda
ERIC Educational Resources Information Center
Lemke, Jay L.; Sabelli, Nora H.
2008-01-01
How might we usefully apply concepts and procedures derived from the study of other complex dynamical systems to analyzing systemic change in education? In this article, we begin to define possible agendas for research toward developing systematic frameworks and shared terminology for such a project. We illustrate the plausibility of defining such…
Silicene catalyzed reduction of nitrobenzene to aniline: A mechanistic study
NASA Astrophysics Data System (ADS)
Morrissey, Christopher; He, Haiying
2018-03-01
The reduction of nitrobenzene to aniline has broad applications in the chemical and pharmaceutical industries. The high reaction temperatures and pressures and unavoidable hazardous chemicals of current metal catalysts call for more environmentally friendly non-metal catalysts. In this study, the plausibility of silicene as a potential catalyst for nitrobenzene reduction is investigated with a focus on the distinct reaction mechanism, based on density functional theory. The direct reaction pathway, PhNO2∗ → PhNO∗ → PhNHO∗ → PhNH2O∗ → PhNH2∗, was shown to be distinctly different from the Haber mechanism. The hydroxyl groups remain bound to silicene after aniline is formed, and their removal involves a high activation barrier.
Mander, Bryce A.; Winer, Joseph R.; Jagust, William J.; Walker, Matthew P.
2016-01-01
Sleep disruption appears to be a core component of Alzheimer's disease (AD) and its pathophysiology. Signature abnormalities of sleep emerge before clinical onset of AD. Moreover, insufficient sleep facilitates accumulation of amyloid-β (Aβ), potentially triggering earlier cognitive decline and conversion to AD. Building on such findings, this review has four goals, evaluating: (i) associations and plausible mechanisms linking NREM sleep disruption, Aβ, and AD, (ii) a role for NREM sleep disruption as a novel factor linking cortical Aβ to impaired hippocampus-dependent memory consolidation, (iii) the potential diagnostic utility of NREM sleep disruption as a new biomarker of AD, and (iv) the possibility of sleep as a new treatment target in aging, affording preventative and therapeutic benefits. PMID:27325209
Data-driven development of AOP knowledge
The Adverse Outcome Pathway framework represents a systematic way to organize mechanistic information underlying toxicology, and it is specifically designed to connect early stage molecular perturbations by chemicals and other stressors with adverse outcomes in humans and wildlif...
van Bilsen, Jolanda H M; Sienkiewicz-Szłapka, Edyta; Lozano-Ojalvo, Daniel; Willemsen, Linette E M; Antunes, Celia M; Molina, Elena; Smit, Joost J; Wróblewska, Barbara; Wichers, Harry J; Knol, Edward F; Ladics, Gregory S; Pieters, Raymond H H; Denery-Papini, Sandra; Vissers, Yvonne M; Bavaro, Simona L; Larré, Colette; Verhoeckx, Kitty C M; Roggen, Erwin L
2017-01-01
The introduction of whole new foods in a population may lead to sensitization and food allergy. This constitutes a potential public health problem and a challenge to risk assessors and managers, as the existing understanding of the pathophysiological processes and the currently available biological tools for predicting the risk of food allergy development and the severity of the reaction are not sufficient. There is a substantial body of in vivo and in vitro data describing molecular and cellular events potentially involved in food sensitization. However, these events have not been organized into a sequence of related events that plausibly results in sensitization and that is useful for challenging current hypotheses. The aim of this manuscript was to collect and structure the current mechanistic understanding of sensitization induction to food proteins by applying the concept of the adverse outcome pathway (AOP). The proposed AOP for food sensitization is based on information on molecular and cellular mechanisms and pathways evidenced to be involved in sensitization by food and food proteins, and uses the AOPs for chemical skin sensitization and respiratory sensitization induction as templates. Available mechanistic data on protein respiratory sensitization were included to fill gaps in the understanding of how proteins may affect cells, cell-cell interactions and tissue homeostasis. Analysis revealed several key events (KEs) and biomarkers that may have potential use in testing and assessment of proteins for their sensitizing potential. The application of the AOP concept to structure mechanistic in vivo and in vitro knowledge has made it possible to identify a number of methods, each addressing a specific KE, that provide information about the food allergenic potential of new proteins. When applied in the context of an integrated strategy, these methods may reduce, if not replace, current animal testing approaches. The proposed AOP will be shared on the www.aopwiki.org platform to expand the mechanistic data, improve the confidence in each of the proposed KEs and key event relationships (KERs), and allow for the identification of new, or refinement of established, KEs and KERs.
Crops in silico: A community wide multi-scale computational modeling framework of plant canopies
NASA Astrophysics Data System (ADS)
Srinivasan, V.; Christensen, A.; Borkiewic, K.; Yiwen, X.; Ellis, A.; Panneerselvam, B.; Kannan, K.; Shrivastava, S.; Cox, D.; Hart, J.; Marshall-Colon, A.; Long, S.
2016-12-01
Current crop models predict a looming gap between supply and demand for primary foodstuffs over the next 100 years. While significant yield increases were achieved in major food crops during the early years of the green revolution, the current rates of yield increases are insufficient to meet future projected food demand. Furthermore, with projected reduction in arable land, decrease in water availability, and increasing impacts of climate change on future food production, innovative technologies are required to sustainably improve crop yield. To meet these challenges, we are developing Crops in silico (Cis), a biologically informed, multi-scale, computational modeling framework that can facilitate whole plant simulations of crop systems. The Cis framework is capable of linking models of gene networks, protein synthesis, metabolic pathways, physiology, growth, and development in order to investigate crop response to different climate scenarios and resource constraints. This modeling framework will provide the mechanistic details to generate testable hypotheses toward accelerating directed breeding and engineering efforts to increase future food security. A primary objective for building such a framework is to create synergy among an inter-connected community of biologists and modelers to create a realistic virtual plant. This framework advantageously casts the detailed mechanistic understanding of individual plant processes across various scales in a common scalable framework that makes use of current advances in high performance and parallel computing. We are currently designing a user friendly interface that will make this tool equally accessible to biologists and computer scientists. Critically, this framework will provide the community with much needed tools for guiding future crop breeding and engineering, understanding the emergent implications of discoveries at the molecular level for whole plant behavior, and improved prediction of plant and ecosystem responses to the environment.
NASA Astrophysics Data System (ADS)
Farrell, Kathryn; Oden, J. Tinsley; Faghihi, Danial
2015-08-01
A general adaptive modeling algorithm for selection and validation of coarse-grained models of atomistic systems is presented. A Bayesian framework is developed to address uncertainties in parameters, data, and model selection. Algorithms for computing output sensitivities to parameter variances, model evidence and posterior model plausibilities for given data, and for computing what are referred to as Occam Categories in reference to a rough measure of model simplicity, make up components of the overall approach. Computational results are provided for representative applications.
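The posterior model plausibilities referred to above follow from Bayes' rule applied over the candidate model set; a minimal sketch with made-up log-evidence values:

```python
import numpy as np

# Hypothetical log-evidences log p(D | M_j) for three coarse-grained candidate
# models and a uniform model prior (values are illustrative, not computed here).
log_evidence = np.array([-120.4, -118.9, -125.7])
log_prior = np.log(np.full(3, 1.0 / 3.0))

# Posterior plausibility: rho_j = p(M_j | D), proportional to p(D | M_j) p(M_j)
log_post = log_evidence + log_prior
log_post -= np.logaddexp.reduce(log_post)     # normalize in log space for stability
plausibility = np.exp(log_post)
for j, rho in enumerate(plausibility):
    print(f"model M{j + 1}: posterior plausibility {rho:.3f}")
```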
Reevaluating the conceptual framework for applied research on host-plant resistance.
Stout, Michael J
2013-06-01
Applied research on host-plant resistance to arthropod pests has been guided over the past 60 years by a framework originally developed by Reginald Painter in his 1951 book, Insect Resistance in Crop Plants. Painter divided the "phenomena" of resistance into three "mechanisms," nonpreference (later renamed antixenosis), antibiosis, and tolerance. The weaknesses of this framework are discussed. In particular, this trichotomous framework does not encompass all known mechanisms of resistance, and the antixenosis and antibiosis categories are ambiguous and inseparable in practice. These features have perhaps led to a simplistic approach to understanding arthropod resistance in crop plants. A dichotomous scheme is proposed as a replacement, with a major division between resistance (plant traits that limit injury to the plant) and tolerance (plant traits that reduce amount of yield loss per unit injury), and the resistance category subdivided into constitutive/inducible and direct/indirect subcategories. The most important benefits of adopting this dichotomous scheme are to more closely align the basic and applied literatures on plant resistance and to encourage a more mechanistic approach to studying plant resistance in crop plants. A more mechanistic approach will be needed to develop novel approaches for integrating plant resistance into pest management programs. © 2012 Institute of Zoology, Chinese Academy of Sciences.
EU Framework 6 Project: Predictive Toxicology (PredTox)-overview and outcome
DOE Office of Scientific and Technical Information (OSTI.GOV)
Suter, Laura, E-mail: Laura.suter-dick@roche.com; Schroeder, Susanne; Meyer, Kirstin
2011-04-15
In this publication, we report the outcome of the integrated EU Framework 6 Project: Predictive Toxicology (PredTox), including methodological aspects and overall conclusions. Specific details including data analysis and interpretation are reported in separate articles in this issue. The project, partly funded by the EU, was carried out by a consortium of 15 pharmaceutical companies, 2 SMEs, and 3 universities. The effects of 16 test compounds were characterized using conventional toxicological parameters and 'omics' technologies. The three major observed toxicities, liver hypertrophy, bile duct necrosis and/or cholestasis, and kidney proximal tubular damage, were analyzed in detail. The combined approach of 'omics' and conventional toxicology proved a useful tool for mechanistic investigations and the identification of putative biomarkers. In our hands and in combination with histopathological assessment, target organ transcriptomics was the most prolific approach for the generation of mechanistic hypotheses. Proteomics approaches were relatively time-consuming and required careful standardization. NMR-based metabolomics detected metabolite changes accompanying histopathological findings, providing limited additional mechanistic information. Conversely, targeted metabolite profiling with LC/GC-MS was very useful for the investigation of bile duct necrosis/cholestasis. In general, both proteomics and metabolomics were supportive of other findings. Thus, the outcome of this program indicates that 'omics' technologies can help toxicologists to make better informed decisions during exploratory toxicological studies. The data support that hypotheses on mode of action and the discovery of putative biomarkers are tangible outcomes of integrated 'omics' analysis. Qualification of biomarkers remains challenging, in particular in terms of identification, mechanistic anchoring, appropriate specificity, and sensitivity.
Modeling Environment for Total Risk-4M
MENTOR-4M uses an integrated, mechanistically consistent, source-to-dose modeling framework to quantify simultaneous exposures and doses of individuals and populations to multiple contaminants. It is an implementation of the MENTOR system for exposures to Multiple contaminants fr...
Adverse outcome pathway (AOP) development: Guiding principles and best practices
Adverse outcome pathways (AOPs) represent a conceptual framework that can support greater application of mechanistic data in regulatory decision-making. However, in order for the scientific community to collectively address the daunting challenge of describing relevant toxicologi...
Adverse Outcome Pathways: From Definition to Application
A challenge for both human health and ecological toxicologists is the transparent application of mechanistic (e.g., molecular, biochemical, histological) data to risk assessments. The adverse outcome pathway (AOP) is a conceptual framework designed to meet this need. Specifical...
Veltman, Karin; Huijbregts, Mark A J; Hendriks, A Jan
2010-07-01
Both biotic ligand models (BLM) and bioaccumulation models aim to quantify metal exposure based on mechanistic knowledge, but key factors included in the description of metal uptake differ between the two approaches. Here, we present a quantitative comparison of both approaches and show that BLM and bioaccumulation kinetics can be merged into a common mechanistic framework for metal uptake in aquatic organisms. Our results show that metal-specific absorption efficiencies calculated from BLM-parameters for freshwater fish are highly comparable, i.e. within a factor of 2.4 for silver, cadmium, copper, and zinc, to bioaccumulation-absorption efficiencies for predominantly marine fish. Conditional affinity constants are significantly related to the metal-specific covalent index. Additionally, the affinity constants of calcium, cadmium, copper, sodium, and zinc are significantly comparable across aquatic species, including molluscs, daphnids, and fish. This suggests that affinity constants can be estimated from the covalent index, and constants can be extrapolated across species. A new model is proposed that integrates the combined effect of metal chemodynamics, such as speciation, competition, and ligand affinity, and species characteristics, such as size, on metal uptake by aquatic organisms. An important direction for further research is the quantitative comparison of the proposed model with acute toxicity values for organisms belonging to different size classes.
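A minimal sketch of the biotic-ligand-style competitive binding term described in the preceding abstract: competing cations reduce the fraction of binding sites occupied by the free metal ion, and uptake scales with that fraction. The function name and all constants below are illustrative assumptions, not values from the cited study.

```python
# Sketch (assumed parameter values): competitive binding at the biotic ligand.
def blm_uptake_flux(free_metal_M, competitors_M, logK_metal, logK_competitors, J_max=1.0):
    """Relative metal uptake flux from competitive binding at the biotic ligand.

    free_metal_M     : free metal ion activity (mol/L)
    competitors_M    : dict of competing cation activities (mol/L), e.g. {"Ca": 1e-3}
    logK_metal       : log10 conditional affinity constant of the metal (L/mol)
    logK_competitors : dict of log10 affinity constants for the competitors
    """
    bound_metal = (10 ** logK_metal) * free_metal_M
    bound_competitors = sum(10 ** logK_competitors[c] * competitors_M[c] for c in competitors_M)
    occupied_fraction = bound_metal / (1.0 + bound_metal + bound_competitors)
    return J_max * occupied_fraction

# Example: copper-like uptake suppressed by calcium and protons (made-up values).
flux = blm_uptake_flux(
    free_metal_M=1e-7,
    competitors_M={"Ca": 1e-3, "H": 1e-7},
    logK_metal=7.4,
    logK_competitors={"Ca": 3.5, "H": 5.4},
)
print(f"relative uptake flux: {flux:.3f}")
```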
Bus, James S
2017-06-01
The International Agency for Research on Cancer (IARC) has formulated 10 key characteristics of human carcinogens to incorporate mechanistic data into cancer hazard classifications. The analysis used glyphosate as a case example to examine the robustness of IARC's determination of oxidative stress as "strong" evidence supporting a plausible cancer mechanism in humans. The IARC analysis primarily relied on 14 human/mammalian studies; 19 non-mammalian studies were uninformative of human cancer given the broad spectrum of test species and extensive use of formulations and aquatic testing. The mammalian studies had substantial experimental limitations for informing cancer mechanism including use of: single doses and time points; cytotoxic/toxic test doses; tissues not identified as potential cancer targets; glyphosate formulations or mixtures; technically limited oxidative stress biomarkers. The doses were many orders of magnitude higher than human exposures determined in human biomonitoring studies. The glyphosate case example reveals that the IARC evaluation fell substantially short of "strong" supporting evidence of oxidative stress as a plausible human cancer mechanism, and suggests that other IARC monographs relying on the 10 key characteristics approach should be similarly examined for a lack of robust data integration fundamental to reasonable mode of action evaluations. Copyright © 2017 Elsevier Inc. All rights reserved.
Non-specific effects of vaccines: plausible and potentially important, but implications uncertain.
Pollard, Andrew J; Finn, Adam; Curtis, Nigel
2017-11-01
Non-specific effects (NSE) or heterologous effects of vaccines are proposed to explain observations in some studies that certain vaccines have an impact beyond the direct protection against infection with the specific pathogen for which the vaccines were designed. The importance and implications of such effects remain controversial. There are several known immunological mechanisms which could lead to NSE, since it is widely recognised that the generation of specific immunity is initiated by non-specific innate immune mechanisms that may also have wider effects on adaptive immune function. However, there are no published studies that demonstrate a mechanistic link between such immunological phenomena and clinically relevant NSE in humans. While it is highly plausible that some vaccines do have NSE, their magnitude and duration, and thus importance, remain uncertain. Although the WHO recently concluded that current evidence does not justify changes to immunisation policy, further studies of sufficient size and quality are needed to assess the importance of NSE for all-cause mortality. This could provide insights into vaccine immunobiology with important implications for infant health and survival. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Phthalates impact human health: Epidemiological evidences and plausible mechanism of action.
Benjamin, Sailas; Masai, Eiji; Kamimura, Naofumi; Takahashi, Kenji; Anderson, Robin C; Faisal, Panichikkal Abdul
2017-10-15
Disregarding the rising alarm on the hazardous nature of various phthalates and their metabolites, ruthless usage of phthalates as plasticizer in plastics and as additives in innumerable consumer products continues due to their low cost, attractive properties, and the lack of suitable alternatives. Globally, in silico computational, in vitro mechanistic, in vivo preclinical and limited clinical or epidemiological human studies showed that over a dozen phthalates and their metabolites ingested passively by man from the general environment, foods, drinks, breathing air, and routine household products cause various dysfunctions. Thus, this review addresses the health hazards posed by phthalates on children and adolescents, epigenetic modulation, reproductive toxicity in women and men; insulin resistance and type II diabetes; overweight and obesity, skeletal anomalies, allergy and asthma, cancer, etc., coupled with the description of major phthalates and their general uses, phthalate exposure routes, biomonitoring and risk assessment, special account on endocrine disruption; and finally, a plausible molecular cross-talk with a unique mechanism of action. This clinically focused comprehensive review on the hazards of phthalates would benefit the general population, academia, scientists, clinicians, environmentalists, and law or policy makers to decide whether the usage of phthalates should be continued unchecked, regulated by law, or phased out entirely. Copyright © 2017. Published by Elsevier B.V.
Interval Estimation of Revision Effect on Scale Reliability via Covariance Structure Modeling
ERIC Educational Resources Information Center
Raykov, Tenko
2009-01-01
A didactic discussion of a procedure for interval estimation of change in scale reliability due to revision is provided, which is developed within the framework of covariance structure modeling. The method yields ranges of plausible values for the population gain or loss in reliability of unidimensional composites, which results from deletion or…
Morality Principles for Risk Modelling: Needs and Links with the Origins of Plausible Inference
NASA Astrophysics Data System (ADS)
Solana-Ortega, Alberto; Solana, Vicente
2009-12-01
In comparison with the foundations of probability calculus, the inescapable and controversial issue of how to assign probabilities has only recently become a matter of formal study. The introduction of information as a technical concept was a milestone, but the most promising entropic assignment methods still face unsolved difficulties, manifesting the incompleteness of plausible inference theory. In this paper we examine the situation faced by risk analysts in the critical field of extreme events modelling, where the former difficulties are especially visible, due to scarcity of observational data, the large impact of these phenomena and the obligation to assume professional responsibilities. To respond to the claim for a sound framework to deal with extremes, we propose a metafoundational approach to inference, based on a canon of extramathematical requirements. We highlight their strong moral content, and show how this emphasis in morality, far from being new, is connected with the historic origins of plausible inference. Special attention is paid to the contributions of Caramuel, a contemporary of Pascal, unfortunately ignored in the usual mathematical accounts of probability.
Cañete-Valdeón, José M; Wieringa, Roel; Smallbone, Kieran
2012-12-01
There is a growing interest in mathematical mechanistic modelling as a promising strategy for understanding tumour progression. This approach is accompanied by a methodological change in how research is conducted, in which models help to actively generate hypotheses instead of waiting for general principles to become apparent once sufficient data have accumulated. This paper applies recent research from philosophy of science to uncover three important problems of mechanistic modelling which may compromise its mainstream application, namely: the dilemma of formal and informal descriptions, the need to express degrees of confidence and the need for an argumentation framework. We report experience and research on similar problems from software engineering and provide evidence that the solutions adopted there can be transferred to the biological domain. We hope this paper can provoke new opportunities for further and profitable interdisciplinary research in the field.
A framework for in vitro systems toxicology assessment of e-liquids
Iskandar, Anita R.; Gonzalez-Suarez, Ignacio; Majeed, Shoaib; Marescotti, Diego; Sewer, Alain; Xiang, Yang; Leroy, Patrice; Guedj, Emmanuel; Mathis, Carole; Schaller, Jean-Pierre; Vanscheeuwijck, Patrick; Frentzel, Stefan; Martin, Florian; Ivanov, Nikolai V.; Peitsch, Manuel C.; Hoeng, Julia
2016-01-01
Various electronic nicotine delivery systems (ENDS), of which electronic cigarettes (e-cigs) are the most recognized prototype, have been quickly gaining ground on conventional cigarettes because they are perceived as less harmful. Research assessing the potential effects of ENDS exposure in humans is currently limited and inconclusive. New products are emerging with numerous variations in designs and performance parameters within and across brands. Acknowledging these challenges, we present here a proposed framework for an in vitro systems toxicology assessment of e-liquids and their aerosols, intended to complement the battery of assays for standard toxicity assessments. The proposed framework utilizes high-throughput toxicity assessments of e-liquids and their aerosols, in which the device-to-device variability is minimized, and a systems-level investigation of the cellular mechanisms of toxicity is an integral part. An analytical chemistry investigation is also included as a part of the framework to provide accurate and reliable chemistry data solidifying the toxicological assessment. In its simplest form, the framework comprises three main layers: (1) high-throughput toxicity screening of e-liquids using primary human cell culture systems; (2) toxicity-related mechanistic assessment of selected e-liquids, and (3) toxicity-related mechanistic assessment of their aerosols using organotypic air–liquid interface airway culture systems. A systems toxicology assessment approach is leveraged to enable in-depth analyses of the toxicity-related cellular mechanisms of e-liquids and their aerosols. We present example use cases to demonstrate the suitability of the framework for a robust in vitro assessment of e-liquids and their aerosols. PMID:27117495
Modeling Environment for Total Risk-1A
MENTOR-1A uses an integrated, mechanistically consistent source-to-dose modeling framework to quantify inhalation exposure and dose for individuals and/or populations due to co-occurring air pollutants. It uses the "One Atmosphere" concept to characterize simultaneous exposures t...
Accelerating Adverse Outcome Pathway (AOP) development via computationally predicted AOP networks
The Adverse Outcome Pathway (AOP) framework is increasingly being adopted as a tool for organizing and summarizing the mechanistic information connecting molecular perturbations by environmental stressors with adverse outcomes relevant for ecological and human health outcomes. Ho...
Watching a signaling protein function in real time via 100-ps time-resolved Laue crystallography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schotte, Friedrich; Cho, Hyun Sun; Kaila, Ville R.I.
2012-11-06
To understand how signaling proteins function, it is necessary to know the time-ordered sequence of events that lead to the signaling state. We recently developed on the BioCARS 14-IDB beamline at the Advanced Photon Source the infrastructure required to characterize structural changes in protein crystals with near-atomic spatial resolution and 150-ps time resolution, and have used this capability to track the reversible photocycle of photoactive yellow protein (PYP) following trans-to-cis photoisomerization of its p-coumaric acid (pCA) chromophore over 10 decades of time. The first of four major intermediates characterized in this study is highly contorted, with the pCA carbonyl rotated nearly 90° out of the plane of the phenolate. A hydrogen bond between the pCA carbonyl and the Cys69 backbone constrains the chromophore in this unusual twisted conformation. Density functional theory calculations confirm that this structure is chemically plausible and corresponds to a strained cis intermediate. This unique structure is short-lived (~600 ps), has not been observed in prior cryocrystallography experiments, and is the progenitor of intermediates characterized in previous nanosecond time-resolved Laue crystallography studies. The structural transitions unveiled during the PYP photocycle include trans/cis isomerization, the breaking and making of hydrogen bonds, formation/relaxation of strain, and gated water penetration into the interior of the protein. This mechanistically detailed, near-atomic resolution description of the complete PYP photocycle provides a framework for understanding signal transduction in proteins, and for assessing and validating theoretical/computational approaches in protein biophysics.
Dose-response relationships for environmentally mediated infectious disease transmission models
Eisenberg, Joseph N. S.
2017-01-01
Environmentally mediated infectious disease transmission models provide a mechanistic approach to examining environmental interventions for outbreaks, such as water treatment or surface decontamination. The shift from the classical SIR framework to one incorporating the environment requires codifying the relationship between exposure to environmental pathogens and infection, i.e. the dose–response relationship. Much of the work characterizing the functional forms of dose–response relationships has used statistical fit to experimental data. However, there has been little research examining the consequences of the choice of functional form in the context of transmission dynamics. To this end, we identify four properties of dose–response functions that should be considered when selecting a functional form: low-dose linearity, scalability, concavity, and whether it is a single-hit model. We find that i) middle- and high-dose data do not constrain the low-dose response, and different dose–response forms that are equally plausible given the data can lead to significant differences in simulated outbreak dynamics; ii) the choice of how to aggregate continuous exposure into discrete doses can impact the modeled force of infection; iii) low-dose linear, concave functions allow the basic reproduction number to control global dynamics; and iv) identifiability analysis offers a way to manage multiple sources of uncertainty and leverage environmental monitoring to make inference about infectivity. By applying an environmentally mediated infectious disease model to the 1993 Milwaukee Cryptosporidium outbreak, we demonstrate that environmental monitoring allows for inference regarding the infectivity of the pathogen and thus improves our ability to identify outbreak characteristics such as pathogen strain. PMID:28388665
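To make the dose-response discussion in the preceding abstract concrete, the sketch below evaluates two standard forms from the quantitative microbial risk assessment literature, the single-hit exponential model and the approximate beta-Poisson model. Parameter values are placeholders for illustration, not those fitted in the cited Cryptosporidium analysis.

```python
# Illustrative dose-response forms (assumed parameters).
import numpy as np

def exponential_dr(dose, r):
    """Single-hit exponential model: each organism independently infects with probability r."""
    return 1.0 - np.exp(-r * dose)

def approx_beta_poisson_dr(dose, alpha, n50):
    """Approximate beta-Poisson model parameterized by its median infectious dose N50."""
    beta = n50 / (2.0 ** (1.0 / alpha) - 1.0)
    return 1.0 - (1.0 + dose / beta) ** (-alpha)

doses = np.logspace(-2, 4, 7)
p_exp = exponential_dr(doses, r=0.005)
p_bp = approx_beta_poisson_dr(doses, alpha=0.4, n50=100.0)

for d, pe, pb in zip(doses, p_exp, p_bp):
    print(f"dose={d:9.2f}  exponential={pe:.4f}  beta-Poisson={pb:.4f}")

# Both forms are low-dose linear (P(d) ~ slope * d as d -> 0), but the implied slopes
# differ, which is one way the choice of functional form can alter simulated outbreaks.
```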
Goodpaster, Jason D.; Weber, Adam Z.
2017-01-01
Electrochemical reduction of CO2 using renewable sources of electrical energy holds promise for converting CO2 to fuels and chemicals. Since this process is complex and involves a large number of species and physical phenomena, a comprehensive understanding of the factors controlling product distribution is required. While the most plausible reaction pathway is usually identified from quantum-chemical calculation of the lowest free-energy pathway, this approach can be misleading when coverages of adsorbed species determined for alternative mechanism differ significantly, since elementary reaction rates depend on the product of the rate coefficient and the coverage of species involved in the reaction. Moreover, cathode polarization can influence the kinetics of CO2 reduction. Here, we present a multiscale framework for ab initio simulation of the electrochemical reduction of CO2 over an Ag(110) surface. A continuum model for species transport is combined with a microkinetic model for the cathode reaction dynamics. Free energies of activation for all elementary reactions are determined from density functional theory calculations. Using this approach, three alternative mechanisms for CO2 reduction were examined. The rate-limiting step in each mechanism is **COOH formation at higher negative potentials. However, only via the multiscale simulation was it possible to identify the mechanism that leads to a dependence of the rate of CO formation on the partial pressure of CO2 that is consistent with experiments. Simulations based on this mechanism also describe the dependence of the H2 and CO current densities on cathode voltage that are in strikingly good agreement with experimental observation. PMID:28973926
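The abstract above stresses that an elementary reaction rate is the product of a rate coefficient and the coverages of the adsorbed species involved. The sketch below illustrates only that generic point with a transition-state-theory (Eyring) rate coefficient computed from an assumed activation free energy; the barrier, coverage, and reaction step are hypothetical and not taken from the cited study.

```python
# Hedged sketch: TST rate coefficient times an assumed coverage gives an elementary rate.
import math

KB = 1.380649e-23    # Boltzmann constant, J/K
H  = 6.62607015e-34  # Planck constant, J*s
R  = 8.314462618     # gas constant, J/(mol*K)

def tst_rate_coefficient(delta_g_act_eV, T=298.15):
    """Eyring/TST rate coefficient (1/s) from an activation free energy in eV."""
    delta_g_J_per_mol = delta_g_act_eV * 96485.0
    return (KB * T / H) * math.exp(-delta_g_J_per_mol / (R * T))

# Hypothetical adsorbed-CO2 protonation step with an assumed 0.8 eV barrier,
# occurring on sites covered by the adsorbed reactant at theta = 0.05.
k_step = tst_rate_coefficient(0.8)
theta_reactant = 0.05
rate = k_step * theta_reactant   # turnovers per site per second
print(f"k = {k_step:.3e} 1/s, rate = {rate:.3e} per site per s")
```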
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mander, Bryce A.; Winer, Joseph R.; Jagust, William J.
Sleep disruption appears to be a major component of Alzheimer's disease (AD) and its pathophysiology. Signature abnormalities of sleep emerge before clinical onset of AD. Moreover, insufficient sleep facilitates accumulation of amyloid-β (Aβ), potentially triggering earlier cognitive decline and conversion to AD. Building on such findings, this review has four goals: evaluating (i) associations and plausible mechanisms linking non-rapid-eye-movement (NREM) sleep disruption, Aβ, and AD; (ii) a role for NREM sleep disruption as a novel factor linking cortical Aβ to impaired hippocampus-dependent memory consolidation; (iii) the potential diagnostic utility of NREM sleep disruption as a new biomarker of AD; and (iv) the possibility of sleep as a new treatment target in aging, affording preventative and therapeutic benefits.
Advancing the Adverse Outcome Pathway Framework - an International Horizon Scanning Approach
The ability of scientists to conduct whole organism toxicity tests to understand chemical safety has been significantly outpaced by the rapid synthesis of new chemicals. Therefore, to increase efficiencies in chemical risk assessment, scientists are turning to mechanistic-based ...
Modeling Environment for Total Risk-2E
MENTOR-2E uses an integrated, mechanistically consistent source-to-dose-to-response modeling framework to quantify inhalation exposure and doses resulting from emergency events. It is an implementation of the MENTOR system that is focused towards modeling of the impacts of rele...
Realizing the promise of AOPs: A stakeholder-driven roadmap to the future
The adverse outcome pathway (AOP) framework was developed to serve as a knowledge assembly and communication tool to facilitate translation of mechanistic (e.g., molecular, biochemical, histological) data into adverse apical outcomes meaningful to chemical risk assessment. Althou...
Using IBMs to Investigate Spatially-dependent Processes in Landscape Genetics Theory
Much of landscape and conservation genetics theory has been derived using non-spatialmathematical models. Here, we use a mechanistic, spatially-explicit, eco-evolutionary IBM to examine the utility of this theoretical framework in landscapes with spatial structure. Our analysis...
Anomalous cross-modulation between microwave beams
NASA Astrophysics Data System (ADS)
Ranfagni, Anedio; Mugnai, Daniela; Petrucci, Andrea; Mignani, Roberto; Cacciari, Ilaria
2018-06-01
An anomalous effect in the near field of crossing microwave beams, which consists of an unexpected transfer of modulation from one beam to the other, has found a plausible interpretation within the framework of a locally broken Lorentz invariance. A theoretical approach of this kind deserves to be reconsidered also in the light of further experimental work, including a counter-check of the phenomenon.
Explicit B-spline regularization in diffeomorphic image registration
Tustison, Nicholas J.; Avants, Brian B.
2013-01-01
Diffeomorphic mappings are central to image registration due largely to their topological properties and success in providing biologically plausible solutions to deformation and morphological estimation problems. Popular diffeomorphic image registration algorithms include those characterized by time-varying and constant velocity fields, and symmetrical considerations. Prior information in the form of regularization is used to enforce transform plausibility taking the form of physics-based constraints or through some approximation thereof, e.g., Gaussian smoothing of the vector fields [a la Thirion's Demons (Thirion, 1998)]. In the context of the original Demons' framework, the so-called directly manipulated free-form deformation (DMFFD) (Tustison et al., 2009) can be viewed as a smoothing alternative in which explicit regularization is achieved through fast B-spline approximation. This characterization can be used to provide B-spline “flavored” diffeomorphic image registration solutions with several advantages. Implementation is open source and available through the Insight Toolkit and our Advanced Normalization Tools (ANTs) repository. A thorough comparative evaluation with the well-known SyN algorithm (Avants et al., 2008), implemented within the same framework, and its B-spline analog is performed using open labeled brain data and open source evaluation tools. PMID:24409140
Mechanistic impact of outdoor air pollution on asthma and allergic diseases
Zhang, Qingling; Qiu, Zhiming; Chung, Kian Fan
2015-01-01
Over the past decades, asthma and allergic diseases, such as allergic rhinitis and eczema, have become increasingly common, but the reason for this increased prevalence is still unclear. It has become apparent that genetic variation alone is not sufficient to account for the observed changes; rather, the changing environment, together with alterations in lifestyle and eating habits, are likely to have driven the increase in prevalence, and in some cases, severity of disease. This is particularly highlighted by recent awareness of, and concern about, the exposure to ubiquitous environmental pollutants, including chemicals with oxidant-generating capacities, and their impact on the human respiratory and immune systems. Indeed, several epidemiological studies have identified a variety of risk factors, including ambient pollutant gases and airborne particles, for the prevalence and the exacerbation of allergic diseases. However, the responsible pollutants remain unclear and the causal relationship has not been established. Recent studies of cellular and animal models have suggested several plausible mechanisms, with the most consistent observation being the direct effects of particle components on the generation of reactive oxygen species (ROS) and the resultant oxidative stress and inflammatory responses. This review attempts to highlight the experimental findings, with particular emphasis on several major mechanistic events initiated by exposure to particulate matters (PMs) in the exposure-disease relationship. PMID:25694815
NASA Astrophysics Data System (ADS)
Ellis, John; Garcia, Marcos A. G.; Nanopoulos, Dimitri V.; Olive, Keith A.
2016-05-01
Supersymmetry is the most natural framework for physics above the TeV scale, and the corresponding framework for early-Universe cosmology, including inflation, is supergravity. No-scale supergravity emerges from generic string compactifications and yields a non-negative potential, and is therefore a plausible framework for constructing models of inflation. No-scale inflation yields naturally predictions similar to those of the Starobinsky model based on R + R^2 gravity, with a tilted spectrum of scalar perturbations, n_s ∼ 0.96, and small values of the tensor-to-scalar perturbation ratio r < 0.1, as favoured by Planck and other data on the cosmic microwave background (CMB). Detailed measurements of the CMB may provide insights into the embedding of inflation within string theory as well as its links to collider physics.
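A quick numerical check of the Starobinsky-like predictions quoted above, using the standard large-N slow-roll relations n_s ≈ 1 - 2/N and r ≈ 12/N^2, where N is the number of e-folds (commonly taken in the range of roughly 50-60).

```python
# Starobinsky-model slow-roll estimates for a few e-fold counts.
for N in (50, 55, 60):
    n_s = 1.0 - 2.0 / N
    r = 12.0 / N**2
    print(f"N = {N}: n_s ~ {n_s:.3f}, r ~ {r:.4f}")
# e.g. N = 55 gives n_s ~ 0.964 and r ~ 0.004, consistent with n_s ~ 0.96 and r < 0.1.
```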
DOT National Transportation Integrated Search
2013-06-01
This report summarizes a research project aimed at developing degradation models for bridge decks in the state of Michigan based on durability mechanics. A probabilistic framework to implement local-level mechanistic-based models for predicting the c...
Ecological risk assessors face increasing demands to assess more chemicals, with greater speed and accuracy, and to do so using fewer resources and experimental animals. New approaches in biological and computational sciences are being developed to generate mechanistic informatio...
Ecological risk assessors face increasing demands to assess more chemicals, with greater speed and accuracy, and to do so using fewer resources and experimental animals. New approaches in biological and computational sciences may be able to generate mechanistic information that ...
Laboratory and field procedures used to characterize materials.
DOT National Transportation Integrated Search
2009-01-01
The objective of TxDOT Project 0-5798 is to develop the framework for the development and : implementation of the next level of Mechanistic-Empirical Pavement Design Guide (MEPDG) for TxDOT : (Tex-ME). A very important aspect of this project is to id...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Seager, S.; Bains, W.; Hu, R.
Biosignature gas detection is one of the ultimate future goals for exoplanet atmosphere studies. We have created a framework for linking biosignature gas detectability to biomass estimates, including atmospheric photochemistry and biological thermodynamics. The new framework is intended to liberate predictive atmosphere models from requiring fixed, Earth-like biosignature gas source fluxes. New biosignature gases can be considered with a check that the biomass estimate is physically plausible. We have validated the models on terrestrial production of NO, H2S, CH4, CH3Cl, and DMS. We have applied the models to propose NH3 as a biosignature gas on a 'cold Haber World', a planet with an N2-H2 atmosphere, and to demonstrate why gases such as CH3Cl must have too large of a biomass to be a plausible biosignature gas on planets with Earth or early-Earth-like atmospheres orbiting a Sun-like star. To construct the biomass models, we developed a functional classification of biosignature gases, and found that gases (such as CH4, H2S, and N2O) produced from life that extracts energy from chemical potential energy gradients will always have false positives because geochemistry has the same gases to work with as life does, and gases (such as DMS and CH3Cl) produced for secondary metabolic reasons are far less likely to have false positives but because of their highly specialized origin are more likely to be produced in small quantities. The biomass model estimates are valid to one or two orders of magnitude; the goal is an independent approach to testing whether a biosignature gas is plausible rather than a precise quantification of atmospheric biosignature gases and their corresponding biomasses.
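A generic mass-balance sketch of the plausibility check described above, not the published model's equations: at steady state the biogenic source flux must balance the atmospheric loss flux, and dividing that flux by an assumed per-biomass production rate gives a biomass surface density that can be sanity-checked against terrestrial values. All inputs are placeholder assumptions.

```python
# Illustrative biomass plausibility check (all numbers assumed).
def required_biomass_surface_density(loss_flux_molec_cm2_s, production_rate_mol_g_s):
    """Biomass surface density (g/cm^2) needed to sustain a gas against its loss flux."""
    avogadro = 6.022e23
    source_flux_mol_cm2_s = loss_flux_molec_cm2_s / avogadro  # steady-state source = loss
    return source_flux_mol_cm2_s / production_rate_mol_g_s

# Hypothetical gas destroyed photochemically at 1e11 molecules cm^-2 s^-1 and
# produced biologically at an assumed 1e-9 mol per gram of biomass per second.
sigma = required_biomass_surface_density(1e11, 1e-9)
print(f"required biomass surface density ~ {sigma:.2e} g/cm^2")
# Implausibly large values (relative to terrestrial ecosystems) argue against the
# gas being a viable biosignature in that atmospheric scenario.
```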
Turner, Nancy D; Lloyd, Shannon K
2017-04-01
A role for red and processed meat in the development of colorectal cancer has been proposed based largely on evidence from observational studies in humans, especially in those populations consuming a westernized diet. Determination of causation specifically by red or processed meat is contingent upon identification of plausible mechanisms that lead to colorectal cancer. We conducted a systematic review of the available evidence to determine the availability of plausible mechanistic data linking red and processed meat consumption to colorectal cancer risk. Forty studies using animal models or cell cultures met specified inclusion criteria, most of which were designed to examine the role of heme iron or heterocyclic amines in relation to colon carcinogenesis. Most studies used levels of meat or meat components well in excess of those found in human diets. Although many of the experiments used semi-purified diets designed to mimic the nutrient loads in current westernized diets, most did not include potential biologically active protective compounds present in whole foods. Because of these limitations in the existing literature, there is currently insufficient evidence to confirm a mechanistic link between the intake of red meat as part of a healthy dietary pattern and colorectal cancer risk. Impact statement: Current recommendations to reduce colon cancer include the reduction or elimination of red or processed meats. These recommendations are based on data from epidemiological studies conducted among cultures where meat consumption is elevated and consumption of fruits, vegetables, and whole grains are reduced. This review evaluated experimental data exploring the putative mechanisms whereby red or processed meats may contribute to colon cancer. Most studies used levels of meat or meat-derived compounds that were in excess of those in human diets, even in cultures where meat intake is elevated. Experiments where protective dietary compounds were used to mitigate the extreme levels of meat and meat-derived compounds showed protection against colon cancer, with some essentially negating the impact of meat in the diet. It is essential that better-designed studies be conducted that use relevant concentrations of meat or meat-derived compounds in complex diets representative of the foods consumed by humans.
Mallinckrodt, C H; Lin, Q; Molenberghs, M
2013-01-01
The objective of this research was to demonstrate a framework for drawing inference from sensitivity analyses of incomplete longitudinal clinical trial data via a re-analysis of data from a confirmatory clinical trial in depression. A likelihood-based approach that assumed missing at random (MAR) was the primary analysis. Robustness to departure from MAR was assessed by comparing the primary result to those from a series of analyses that employed varying missing not at random (MNAR) assumptions (selection models, pattern mixture models and shared parameter models) and to MAR methods that used inclusive models. The key sensitivity analysis used multiple imputation assuming that after dropout the trajectory of drug-treated patients was that of placebo treated patients with a similar outcome history (placebo multiple imputation). This result was used as the worst reasonable case to define the lower limit of plausible values for the treatment contrast. The endpoint contrast from the primary analysis was -2.79 (p = .013). In placebo multiple imputation, the result was -2.17. Results from the other sensitivity analyses ranged from -2.21 to -3.87 and were symmetrically distributed around the primary result. Hence, no clear evidence of bias from missing not at random data was found. In the worst reasonable case scenario, the treatment effect was 80% of the magnitude of the primary result. Therefore, it was concluded that a treatment effect existed. The structured sensitivity framework, in which a worst reasonable case result based on a controlled imputation approach with transparent and debatable assumptions was supplemented by a series of plausible alternative models under varying assumptions, was useful in this specific situation and holds promise as a generally useful framework. Copyright © 2012 John Wiley & Sons, Ltd.
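A simplified sketch of the placebo-multiple-imputation idea described above: missing post-dropout outcomes are imputed from a model fit to placebo-arm completers, the endpoint contrast is re-estimated in each completed data set, and the results are combined. The simulated data, the simple mean-difference contrast, and the averaging of contrasts (omitting Rubin's variance rules) are all illustrative simplifications, not the cited trial's data or analysis.

```python
# Toy placebo-based multiple imputation sensitivity analysis (simulated data).
import numpy as np

rng = np.random.default_rng(0)
n = 200
arm = rng.integers(0, 2, n)                          # 0 = placebo, 1 = drug
baseline = rng.normal(25, 4, n)
change = -8.0 - 2.5 * arm + 0.1 * baseline + rng.normal(0, 5, n)
observed = rng.random(n) > (0.10 + 0.15 * arm)       # more dropout in the drug arm
y = np.where(observed, change, np.nan)

def fit_placebo_completers(baseline, y, arm, observed):
    """Linear model of endpoint change on baseline, fit to placebo-arm completers."""
    mask = observed & (arm == 0)
    X = np.column_stack([np.ones(mask.sum()), baseline[mask]])
    beta, *_ = np.linalg.lstsq(X, y[mask], rcond=None)
    resid_sd = np.std(y[mask] - X @ beta)
    return beta, resid_sd

beta, sd = fit_placebo_completers(baseline, y, arm, observed)
contrasts = []
for _ in range(20):                                  # M = 20 imputations
    y_imp = y.copy()
    miss = ~observed
    y_imp[miss] = beta[0] + beta[1] * baseline[miss] + rng.normal(0, sd, miss.sum())
    contrasts.append(y_imp[arm == 1].mean() - y_imp[arm == 0].mean())

print(f"drug-placebo contrast under placebo MI (averaged over imputations): {np.mean(contrasts):.2f}")
```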
Crupi, Vincenzo; Nelson, Jonathan D; Meder, Björn; Cevolani, Gustavo; Tentori, Katya
2018-06-17
Searching for information is critical in many situations. In medicine, for instance, careful choice of a diagnostic test can help narrow down the range of plausible diseases that the patient might have. In a probabilistic framework, test selection is often modeled by assuming that people's goal is to reduce uncertainty about possible states of the world. In cognitive science, psychology, and medical decision making, Shannon entropy is the most prominent and most widely used model to formalize probabilistic uncertainty and the reduction thereof. However, a variety of alternative entropy metrics (Hartley, Quadratic, Tsallis, Rényi, and more) are popular in the social and the natural sciences, computer science, and philosophy of science. Particular entropy measures have been predominant in particular research areas, and it is often an open issue whether these divergences emerge from different theoretical and practical goals or are merely due to historical accident. Cutting across disciplinary boundaries, we show that several entropy and entropy reduction measures arise as special cases in a unified formalism, the Sharma-Mittal framework. Using mathematical results, computer simulations, and analyses of published behavioral data, we discuss four key questions: How do various entropy models relate to each other? What insights can be obtained by considering diverse entropy models within a unified framework? What is the psychological plausibility of different entropy models? What new questions and insights for research on human information acquisition follow? Our work provides several new pathways for theoretical and empirical research, reconciling apparently conflicting approaches and empirical findings within a comprehensive and unified information-theoretic formalism. Copyright © 2018 Cognitive Science Society, Inc.
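The sketch below spells out the unified entropy family referred to above: the Sharma-Mittal entropy of order q and degree r, which reduces to Rényi (r → 1), Tsallis (r → q), and Shannon (q, r → 1) entropies as special cases. The implementation and the example distribution are illustrative, not reproduced from the cited work.

```python
# Sharma-Mittal entropy with its common special cases as limits.
import numpy as np

def sharma_mittal_entropy(p, q, r, eps=1e-12):
    """Sharma-Mittal entropy H_{q,r}(p) = [ (sum_i p_i^q)^((1-r)/(1-q)) - 1 ] / (1-r)."""
    p = np.asarray(p, dtype=float)
    p = p / p.sum()
    if abs(q - 1.0) < eps:
        shannon = -np.sum(p * np.log(p + eps))        # Shannon entropy (nats)
        if abs(r - 1.0) < eps:
            return shannon
        return (np.exp((1.0 - r) * shannon) - 1.0) / (1.0 - r)
    s = np.sum(p ** q)
    if abs(r - 1.0) < eps:
        return np.log(s) / (1.0 - q)                  # Renyi entropy of order q
    return (s ** ((1.0 - r) / (1.0 - q)) - 1.0) / (1.0 - r)

p = [0.5, 0.25, 0.25]
print("Shannon     :", sharma_mittal_entropy(p, q=1.0, r=1.0))
print("Renyi  q=2  :", sharma_mittal_entropy(p, q=2.0, r=1.0))
print("Tsallis q=2 :", sharma_mittal_entropy(p, q=2.0, r=2.0))
```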
NASA Astrophysics Data System (ADS)
Jeuland, Marc; Whittington, Dale
2014-03-01
This article presents a methodology for planning new water resources infrastructure investments and operating strategies in a world of climate change uncertainty. It combines a real options (e.g., options to defer, expand, contract, abandon, switch use, or otherwise alter a capital investment) approach with principles drawn from robust decision-making (RDM). RDM comprises a class of methods that are used to identify investment strategies that perform relatively well, compared to the alternatives, across a wide range of plausible future scenarios. Our proposed framework relies on a simulation model that includes linkages between climate change and system hydrology, combined with sensitivity analyses that explore how economic outcomes of investments in new dams vary with forecasts of changing runoff and other uncertainties. To demonstrate the framework, we consider the case of new multipurpose dams along the Blue Nile in Ethiopia. We model flexibility in design and operating decisions—the selection, sizing, and sequencing of new dams, and reservoir operating rules. Results show that there is no single investment plan that performs best across a range of plausible future runoff conditions. The decision-analytic framework is then used to identify dam configurations that are both robust to poor outcomes and sufficiently flexible to capture high upside benefits if favorable future climate and hydrological conditions should arise. The approach could be extended to explore design and operating features of development and adaptation projects other than dams.
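A toy illustration of the robustness logic described in the preceding abstract: evaluate each candidate configuration across a set of plausible runoff scenarios, compute its regret (shortfall relative to the best option in that scenario), and favor configurations with small worst-case regret. The configurations, scenarios, and payoff numbers are invented for illustration, not outputs of the cited Blue Nile simulation model.

```python
# Minimax-regret screening across hypothetical runoff scenarios (invented payoffs).
import numpy as np

# Net benefits: rows are dam configurations, columns are runoff scenarios (dry, median, wet).
payoffs = np.array([
    [3.0, 12.0, 20.0],   # large dam built early
    [7.0, 11.0, 17.0],   # smaller dam with an option to expand later
    [8.0,  9.0, 12.0],   # staged / deferred investment
])
configs = ["large-early", "small-expandable", "staged-deferred"]

regret = payoffs.max(axis=0) - payoffs        # regret of each config in each scenario
worst_case_regret = regret.max(axis=1)

for name, wr in zip(configs, worst_case_regret):
    print(f"{name:18s} worst-case regret = {wr:.1f}")
print("minimax-regret choice:", configs[int(worst_case_regret.argmin())])
```

In this made-up example the flexible configuration wins because it never falls far behind the scenario-specific optimum, which mirrors the article's point that no single plan dominates across all plausible futures.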
Exploring Conceptual Change in Genetics Using a Multidimensional Interpretive Framework.
ERIC Educational Resources Information Center
Venville, Grady J.; Treagust, David F.
1998-01-01
Changes in grade 10 students' (n=79) conceptions of genes during genetics instruction was studied from multiple perspectives. Ontologically, most students moved from passive to active models of genes. Affectively, students were interested in genetics but unmotivated by microscopic mechanistic explanations; however, teaching approaches were…
In order to increase the uptake and use of high throughput screening data in environmental risk assessment, it is important to establish scientifically credible links between measures of biological pathway perturbation and apical adverse outcomes in humans and wildlife. The adver...
Mechanistic modeling & effectiveness of buffer strips for pesticide regulatory frameworks
USDA-ARS?s Scientific Manuscript database
Vegetative Filter Strips (VFS) have been used as an effective conservation practice in agricultural areas for controlling and mitigating the effects of sediment, nutrient, and pesticide loads into water bodies. In addition to the agricultural sector, another important use of VFS is for controlling plague...
NASA Astrophysics Data System (ADS)
Wallace, Jon Michael
2003-10-01
Reliability prediction of components operating in complex systems has historically been conducted in a statistically isolated manner. Current physics-based, i.e. mechanistic, component reliability approaches focus more on component-specific attributes and mathematical algorithms and not enough on the influence of the system. The result is that significant error can be introduced into the component reliability assessment process. The objective of this study is the development of a framework that infuses the needs and influence of the system into the process of conducting mechanistic-based component reliability assessments. The formulated framework consists of six primary steps. The first three steps, identification, decomposition, and synthesis, are primarily qualitative in nature and employ system reliability and safety engineering principles to construct an appropriate starting point for the component reliability assessment. The following two steps are the most unique. They involve a step to efficiently characterize and quantify the system-driven local parameter space and a subsequent step using this information to guide the reduction of the component parameter space. The local statistical space quantification step is accomplished using two proposed multivariate probability models: Multi-Response First Order Second Moment and Taylor-Based Inverse Transformation. Where existing joint probability models require preliminary distribution and correlation information of the responses, these models combine statistical information of the input parameters with an efficient sampling of the response analyses to produce the multi-response joint probability distribution. Parameter space reduction is accomplished using Approximate Canonical Correlation Analysis (ACCA) employed as a multi-response screening technique. The novelty of this approach is that each individual local parameter and even subsets of parameters representing entire contributing analyses can now be rank ordered with respect to their contribution to not just one response, but the entire vector of component responses simultaneously. The final step of the framework is the actual probabilistic assessment of the component. Although the same multivariate probability tools employed in the characterization step can be used for the component probability assessment, variations of this final step are given to allow for the utilization of existing probabilistic methods such as response surface Monte Carlo and Fast Probability Integration. The overall framework developed in this study is implemented to assess the finite-element based reliability prediction of a gas turbine airfoil involving several failure responses. Results of this implementation are compared to results generated using the conventional 'isolated' approach as well as a validation approach conducted through large sample Monte Carlo simulations. The framework resulted in a considerable improvement to the accuracy of the part reliability assessment and an improved understanding of the component failure behavior. Considerable statistical complexity in the form of joint non-normal behavior was found and accounted for using the framework. Future applications of the framework elements are discussed.
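A hedged sketch of the first-order second-moment (FOSM) idea underlying the multi-response probability step described above: propagate input means and covariances through a linearization of the response functions to obtain an approximate joint mean and covariance of the responses. The response functions and input statistics below are toy placeholders, not the gas-turbine analyses of the study.

```python
# Multi-response FOSM propagation via a finite-difference Jacobian (toy example).
import numpy as np

def multi_response_fosm(g, mu, cov, h=1e-6):
    """Approximate mean vector and covariance matrix of responses g(x) near x = mu."""
    mu = np.asarray(mu, float)
    g0 = np.asarray(g(mu), float)
    jac = np.zeros((g0.size, mu.size))
    for j in range(mu.size):                 # finite-difference Jacobian columns
        dx = np.zeros_like(mu)
        dx[j] = h
        jac[:, j] = (np.asarray(g(mu + dx)) - g0) / h
    return g0, jac @ cov @ jac.T

def responses(x):
    """Two toy component responses (stress-like and temperature-like) of three inputs."""
    load, area, conductivity = x
    return [load / area, 0.1 * load + 300.0 / conductivity]

mu = np.array([1000.0, 4.0, 15.0])
cov = np.diag([50.0**2, 0.2**2, 1.0**2])
mean, response_cov = multi_response_fosm(responses, mu, cov)
print("response means:", mean)
print("response covariance:\n", response_cov)
```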
Laszlo, Sarah; Plaut, David C
2012-03-01
The Parallel Distributed Processing (PDP) framework has significant potential for producing models of cognitive tasks that approximate how the brain performs the same tasks. To date, however, there has been relatively little contact between PDP modeling and data from cognitive neuroscience. In an attempt to advance the relationship between explicit, computational models and physiological data collected during the performance of cognitive tasks, we developed a PDP model of visual word recognition which simulates key results from the ERP reading literature, while simultaneously being able to successfully perform lexical decision, a benchmark task for reading models. Simulations reveal that the model's success depends on the implementation of several neurally plausible features in its architecture which are sufficiently domain-general to be relevant to cognitive modeling more generally. Copyright © 2011 Elsevier Inc. All rights reserved.
Quantum theory as plausible reasoning applied to data obtained by robust experiments.
De Raedt, H; Katsnelson, M I; Michielsen, K
2016-05-28
We review recent work that employs the framework of logical inference to establish a bridge between data gathered through experiments and their objective description in terms of human-made concepts. It is shown that logical inference applied to experiments for which the observed events are independent and for which the frequency distribution of these events is robust with respect to small changes of the conditions under which the experiments are carried out yields, without introducing any concept of quantum theory, the quantum theoretical description in terms of the Schrödinger or the Pauli equation, the Stern-Gerlach or Einstein-Podolsky-Rosen-Bohm experiments. The extraordinary descriptive power of quantum theory then follows from the fact that it is plausible reasoning, that is common sense, applied to reproducible and robust experimental data. © 2016 The Author(s).
Meshless Modeling of Deformable Shapes and their Motion
Adams, Bart; Ovsjanikov, Maks; Wand, Michael; Seidel, Hans-Peter; Guibas, Leonidas J.
2010-01-01
We present a new framework for interactive shape deformation modeling and key frame interpolation based on a meshless finite element formulation. Starting from a coarse nodal sampling of an object’s volume, we formulate rigidity and volume preservation constraints that are enforced to yield realistic shape deformations at interactive frame rates. Additionally, by specifying key frame poses of the deforming shape and optimizing the nodal displacements while targeting smooth interpolated motion, our algorithm extends to a motion planning framework for deformable objects. This allows reconstructing smooth and plausible deformable shape trajectories in the presence of possibly moving obstacles. The presented results illustrate that our framework can handle complex shapes at interactive rates and hence is a valuable tool for animators to realistically and efficiently model and interpolate deforming 3D shapes. PMID:24839614
The adverse outcome pathway (AOP) framework can be used to support the use of mechanistic toxicology data as a basis for risk assessment. For certain risk contexts this includes defining, quantitative linkages between the molecular initiating event (MIE) and subsequent key events...
Old, New, Borrowed, Blue: Reconceptualizing the Systems Framework
ERIC Educational Resources Information Center
Garces-Bacsal, Rhoda Myra
2012-01-01
Ziegler and Phillipson began the target article by citing the mechanistic tradition of finding meaning in the natural world and applying this to various processes of identifying giftedness (Ziegler & Stoeger, 2008)--and demonstrating its ineffectiveness in traditional gifted education. The systems theory is said to allow for a greater…
The Adverse Outcome Pathway (AOP) framework is becoming a widely used tool for organizing and summarizing the mechanistic information connecting molecular perturbations by environmental stressors with adverse ecological and human health outcomes. However, the conventional process...
Sustainability Education's Gift: Learning Patterns and Relationships
ERIC Educational Resources Information Center
Williams, Dilafruz
2008-01-01
The crisis of sustainability can be linked to the traditional forms of schooling driven by mechanistic and technocratic worldviews. Progressing to a more sustainable world requires a fundamental shift in the framework of formal education--its structure, content and process--to include principles of systems thinking and holistic learning. A case…
The Adverse Outcome Pathway (AOP) framework describes the progression of a toxicity pathway from molecular perturbation to population-level outcome in a series of measurable, mechanistic responses. The controlled, computer-readable vocabulary that defines an AOP has the ability t...
Adverse outcome pathways (AOPs) provide a framework that supports greater use of mechanistic data measured at lower levels of biological organization as a basis for regulatory decision-making. However, it is recognized that different types of regulatory applications and decisions...
Vision: a moving hill for spatial updating on the fly.
Stanford, Terrence R
2015-02-02
A recent study reveals a dynamic neural map that provides a continuous representation of remembered visual stimulus locations with respect to constantly changing gaze. This finding suggests a new mechanistic framework for understanding the spatiotemporal dynamics of goal-directed action. Copyright © 2015 Elsevier Ltd. All rights reserved.
Real-time tracking of visually attended objects in virtual environments and its application to LOD.
Lee, Sungkil; Kim, Gerard Jounghyun; Choi, Seungmoon
2009-01-01
This paper presents a real-time framework for computationally tracking objects visually attended by the user while navigating in interactive virtual environments. In addition to the conventional bottom-up (stimulus-driven) saliency map, the proposed framework uses top-down (goal-directed) contexts inferred from the user's spatial and temporal behaviors, and identifies the most plausibly attended objects among candidates in the object saliency map. The computational framework was implemented using GPU, exhibiting high computational performance adequate for interactive virtual environments. A user experiment was also conducted to evaluate the prediction accuracy of the tracking framework by comparing objects regarded as visually attended by the framework to actual human gaze collected with an eye tracker. The results indicated that the accuracy was in the level well supported by the theory of human cognition for visually identifying single and multiple attentive targets, especially owing to the addition of top-down contextual information. Finally, we demonstrate how the visual attention tracking framework can be applied to managing the level of details in virtual environments, without any hardware for head or eye tracking.
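A minimal sketch of the score combination described in the preceding abstract: each candidate object receives a bottom-up saliency score and a top-down context score (e.g., derived from the user's spatial and temporal behavior), and the object with the highest combined score is taken as the currently attended one. The object names, scores, and weights are illustrative assumptions, not values from the cited system.

```python
# Combining bottom-up saliency with top-down context to pick the attended object.
def most_plausibly_attended(objects, w_bottom_up=0.4, w_top_down=0.6):
    """objects: dict name -> (bottom_up_saliency, top_down_context), both in [0, 1]."""
    scores = {
        name: w_bottom_up * bu + w_top_down * td
        for name, (bu, td) in objects.items()
    }
    return max(scores, key=scores.get), scores

candidates = {
    "signpost":   (0.9, 0.2),   # visually salient but task-irrelevant
    "door":       (0.5, 0.8),   # moderately salient, strongly goal-relevant
    "floor_tile": (0.2, 0.1),
}
winner, scores = most_plausibly_attended(candidates)
print(scores, "->", winner)
```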
NASA Astrophysics Data System (ADS)
Miyakawa, Takuya; Tanokura, Masaru
The phytohormone abscisic acid (ABA) plays a key role in the rapid adaptation of plants to environmental stresses such as drought and high salinity. Accumulated ABA in plant cells promotes stomatal closure in guard cells and transcription of stress-tolerant genes. Our understanding of ABA responses has dramatically improved with the discovery of PYR/PYL/RCAR as a soluble ABA receptor and of the inhibitory complex formed by the protein phosphatase PP2C and the protein kinase SnRK2. Moreover, several structural analyses of PYR/PYL/RCAR revealed the mechanistic basis for the regulatory mechanism of ABA signaling, which provides a rational framework for the design of alternative agonists in the future.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Singh, Meenesh R.; Goodpaster, Jason D.; Weber, Adam Z.
Electrochemical reduction of CO2 using renewable sources of electrical energy holds promise for converting CO2 to fuels and chemicals. Since this process is complex and involves a large number of species and physical phenomena, a comprehensive understanding of the factors controlling product distribution is required. While the most plausible reaction pathway is usually identified from quantum-chemical calculation of the lowest free-energy pathway, this approach can be misleading when coverages of adsorbed species determined for alternative mechanism differ significantly, since elementary reaction rates depend on the product of the rate coefficient and the coverage of species involved in the reaction. Moreover, cathode polarization can influence the kinetics of CO2 reduction. Here in this work, we present a multiscale framework for ab initio simulation of the electrochemical reduction of CO2 over an Ag(110) surface. A continuum model for species transport is combined with a microkinetic model for the cathode reaction dynamics. Free energies of activation for all elementary reactions are determined from density functional theory calculations. Using this approach, three alternative mechanisms for CO2 reduction were examined. The rate-limiting step in each mechanism is **COOH formation at higher negative potentials. However, only via the multiscale simulation was it possible to identify the mechanism that leads to a dependence of the rate of CO formation on the partial pressure of CO2 that is consistent with experiments. Additionally, simulations based on this mechanism also describe the dependence of the H2 and CO current densities on cathode voltage that are in strikingly good agreement with experimental observation.
Dose selection based on physiologically based pharmacokinetic (PBPK) approaches.
Jones, Hannah M; Mayawala, Kapil; Poulin, Patrick
2013-04-01
Physiologically based pharmacokinetic (PBPK) models are built using differential equations to describe the physiology/anatomy of different biological systems. Readily available in vitro and in vivo preclinical data can be incorporated into these models to not only estimate pharmacokinetic (PK) parameters and plasma concentration-time profiles, but also to gain mechanistic insight into compound properties. They provide a mechanistic framework to understand and extrapolate PK and dose across in vitro and in vivo systems and across different species, populations and disease states. Using small molecule and large molecule examples from the literature and our own company, we have shown how PBPK techniques can be utilised for human PK and dose prediction. Such approaches have the potential to increase efficiency, reduce the need for animal studies, replace clinical trials and increase PK understanding. Given the mechanistic nature of these models, the future use of PBPK modelling in drug discovery and development is promising, however some limitations need to be addressed to realise its application and utility more broadly.
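As an illustration of the differential-equation structure described above, the following is a minimal flow-limited PBPK sketch; the compartment volumes, blood flows, partition coefficients, and hepatic clearance term are hypothetical placeholders, not values from the cited work.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical parameters for a minimal flow-limited PBPK model (illustrative only)
V_blood, V_liver, V_rest = 5.0, 1.8, 35.0   # L
Q_liver, Q_rest = 90.0, 210.0               # L/h blood flows
Kp_liver, Kp_rest = 2.0, 1.0                # tissue:blood partition coefficients
CL_int = 30.0                               # L/h hepatic clearance term (placeholder)

def pbpk(t, y):
    C_b, C_li, C_re = y                     # blood, liver, rest-of-body concentrations
    C_li_out = C_li / Kp_liver              # venous concentration leaving the liver
    C_re_out = C_re / Kp_rest
    dC_li = (Q_liver * (C_b - C_li_out) - CL_int * C_li_out) / V_liver
    dC_re = Q_rest * (C_b - C_re_out) / V_rest
    dC_b = (Q_liver * C_li_out + Q_rest * C_re_out - (Q_liver + Q_rest) * C_b) / V_blood
    return [dC_b, dC_li, dC_re]

dose = 100.0                                # mg, IV bolus placed in blood at t = 0
sol = solve_ivp(pbpk, (0, 24), [dose / V_blood, 0.0, 0.0], dense_output=True)
print(sol.y[0, -1])                         # blood concentration at 24 h
```

Whole-body PBPK models extend this pattern to a dozen or more perfused tissues, adding absorption, protein binding, and transporter terms as needed.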
Relationship between brain plasticity, learning and foraging performance in honey bees.
Cabirol, Amélie; Cope, Alex J; Barron, Andrew B; Devaud, Jean-Marc
2018-01-01
Brain structure and learning capacities both vary with experience, but the mechanistic link between them is unclear. Here, we investigated whether experience-dependent variability in learning performance can be explained by neuroplasticity in foraging honey bees. The mushroom bodies (MBs) are a brain center necessary for ambiguous olfactory learning tasks such as reversal learning. Using radio frequency identification technology, we assessed the effects of natural variation in foraging activity, and of the age at which bees first foraged, on both reversal learning performance and synaptic connectivity in the MBs. We found that reversal learning performance improved at foraging onset and could decline with greater foraging experience. If bees started foraging before the normal age, as a result of a stress applied to the colony, the decline in learning performance with foraging experience was more severe. Analyses of brain structure in the same bees showed that the total number of synaptic boutons at the MB input decreased when bees started foraging, and then increased with greater foraging intensity. At foraging onset, MB structure is therefore optimized for bees to update learned information, but optimization of MB connectivity deteriorates with foraging effort. In a computational model of the MBs, sparser coding of information at the MB input improved reversal learning performance. We propose, therefore, a plausible mechanistic relationship between experience, neuroplasticity, and cognitive performance in a natural and ecological context.
Courellis, Hristos; Mullen, Tim; Poizner, Howard; Cauwenberghs, Gert; Iversen, John R.
2017-01-01
Quantification of dynamic causal interactions among brain regions constitutes an important component of conducting research and developing applications in experimental and translational neuroscience. Furthermore, cortical networks with dynamic causal connectivity in brain-computer interface (BCI) applications offer a more comprehensive view of brain states implicated in behavior than do individual brain regions. However, models of cortical network dynamics are difficult to generalize across subjects because current electroencephalography (EEG) signal analysis techniques are limited in their ability to reliably localize sources across subjects. We propose an algorithmic and computational framework for identifying cortical networks across subjects in which dynamic causal connectivity is modeled among user-selected cortical regions of interest (ROIs). We demonstrate the strength of the proposed framework using a “reach/saccade to spatial target” cognitive task performed by 10 right-handed individuals. Modeling of causal cortical interactions was accomplished through measurement of cortical activity using EEG, application of independent component clustering to identify cortical ROIs as network nodes, estimation of cortical current density using cortically constrained low resolution electromagnetic brain tomography (cLORETA), multivariate autoregressive (MVAR) modeling of representative cortical activity signals from each ROI, and quantification of the dynamic causal interactions among the identified ROIs using the short-time direct directed transfer function (SdDTF). The resulting cortical network and the computed causal dynamics among its nodes exhibited physiologically plausible behavior, consistent with past results reported in the literature. This physiological plausibility of the results strengthens the framework's applicability in reliably capturing complex brain functionality, which is required by applications, such as diagnostics and BCI. PMID:28566997
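As background for the connectivity step, the basic (non-direct, full-window) directed transfer function from which the SdDTF is derived can be written from the fitted MVAR model; the short-time "direct" variant used above additionally incorporates partial coherence and sliding windows:

$$ \mathbf{X}(t) = \sum_{k=1}^{p} \mathbf{A}_k\,\mathbf{X}(t-k) + \mathbf{E}(t), \qquad \mathbf{H}(f) = \Big(\mathbf{I} - \sum_{k=1}^{p} \mathbf{A}_k e^{-i 2\pi f k \Delta t}\Big)^{-1}, \qquad \mathrm{DTF}_{ij}(f) = \frac{|H_{ij}(f)|^{2}}{\sum_{m}|H_{im}(f)|^{2}}, $$

quantifying the normalized influence of ROI $j$ on ROI $i$ at frequency $f$.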
The Adverse Outcome Pathway (AOP) framework is increasingly being adopted as a tool for organizing and summarizing the mechanistic information connecting molecular perturbations by environmental stressors with adverse outcomes relevant for ecological and human health outcomes. Ho...
The fish short term reproduction assay (FSTRA) is a key component of the USEPA endocrine disruptor screening program (EDSP). The FSTRA considers several mechanistic and apical responses in fathead minnows (Pimephales promelas) to determine whether an unknown chemical is likely to...
The fish short term reproduction assay (FSTRA) is a key component of the USEPA endocrine disruptor screening program (EDSP). The FSTRA considers several mechanistic and apical responses in fathead minnows (Pimephales promelas) to determine whether an unknown chemical is likely t...
The Adverse Outcome Pathway (AOP) framework summarizes key information about mechanistic events leading to an adverse health or ecological outcome. In recent years computationally predicted AOPs (cpAOP) making use of publicly available data have been proposed as a means of accele...
A Beneficial Use Impairment (BUI) common at Great Lakes Areas of Concern (AOCs) is loss of fish and wildlife populations. Consequently, recovery of populations after stressor mitigation serves as a basis for evaluating remediation success. We describe a framework that can be a...
Armstrong, Mitchell R; Senthilnathan, Sethuraman; Balzer, Christopher J; Shan, Bohan; Chen, Liang; Mu, Bin
2017-01-01
Systematic studies of key operating parameters for the sonochemical synthesis of the metal-organic framework (MOF) HKUST-1 (also called CuBTC) were performed, including reaction time, reactor volume, sonication amplitude, sonication tip size, solvent composition, and reactant concentrations, with products analyzed by SEM particle size analysis. Trends in particle size and size distribution show reproducible control of average particle sizes between 1 and 4 μm. These results, together with complementary studies of sonofragmentation and temperature control, were compared with kinetic crystal growth models from the literature to develop a plausible hypothetical mechanism for ultrasound-assisted growth of metal-organic frameworks: a competition between constructive solid-on-solid (SOS) crystal growth and deconstructive sonofragmentation. Copyright © 2016 Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Farrell, Kathryn, E-mail: kfarrell@ices.utexas.edu; Oden, J. Tinsley, E-mail: oden@ices.utexas.edu; Faghihi, Danial, E-mail: danial@ices.utexas.edu
A general adaptive modeling algorithm for selection and validation of coarse-grained models of atomistic systems is presented. A Bayesian framework is developed to address uncertainties in parameters, data, and model selection. Algorithms for computing output sensitivities to parameter variances, model evidence and posterior model plausibilities for given data, and for computing what are referred to as Occam Categories in reference to a rough measure of model simplicity, make up components of the overall approach. Computational results are provided for representative applications.
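For context, the posterior model plausibilities referred to above follow the standard Bayesian form (a generic statement of Bayesian model selection, not the paper's specific algorithmic details):

$$ \pi(M_j \mid D) = \frac{p(D \mid M_j)\,\pi(M_j)}{\sum_k p(D \mid M_k)\,\pi(M_k)}, \qquad p(D \mid M_j) = \int p(D \mid \theta_j, M_j)\,\pi(\theta_j \mid M_j)\,\mathrm{d}\theta_j, $$

where the model evidence $p(D \mid M_j)$ automatically penalizes unnecessary complexity, the property exploited when grouping coarse-grained models into Occam Categories.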
Beemelmanns, Christine; Reissig, Hans-Ulrich
2015-06-01
This comprehensive report recounts the development of a highly diastereoselective samarium diiodide-induced cascade reaction of substituted indolyl ketones. The complexity-generating transformation with SmI2 allows the diastereoselective generation of three stereogenic centers, including one quaternary center, in one step. The obtained tetra- or pentacyclic dihydroindole derivatives are structural motifs of many monoterpene indole alkaloids, and their subsequent transformations opened one of the shortest approaches towards strychnine (14 % overall yield in ten steps, or 10 % overall yield in eight steps). In the course of this report we discuss the influence of substituents on the cyclization step, plausible mechanistic scenarios for the SmI2-induced cascade reaction, diastereoselective reductive amination, and regioselective dehydration protocols towards the pentacyclic core structure of Strychnos alkaloids. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Solomons, Noel W
2013-01-01
Zinc has become a prominent nutrient of clinical and public health interest in the new millennium. Functions and actions for zinc emerge as increasingly ubiquitous in mammalian anatomy, physiology and metabolism. There is undoubtedly an underpinning in fundamental biology for all of the aspects of zinc in human health (clinical and epidemiological) in pediatric and public health practice. Unfortunately, basic science research may not have achieved a full understanding as yet. As a complement to the applied themes in the companion articles, a selection of recent advances in the domains of homeostatic regulation and transport of zinc is presented; these are integrated, in turn, with findings on genetic expression, intracellular signaling, immunity and host defense, and bone growth. The elements include ionic zinc, zinc transporters, metallothioneins, zinc metalloenzymes and zinc finger proteins. In emerging basic research, we find some plausible mechanistic explanations for delayed linear growth with zinc deficiency and increased infectious disease resistance with zinc supplementation. Copyright © 2013 S. Karger AG, Basel.
A Mechanistic Investigation of the Gold(III)-Catalyzed Hydrofurylation of C-C Multiple Bonds.
Hossein Bagi, Amin; Khaledi, Yousef; Ghari, Hossein; Arndt, Sebastian; Hashmi, A Stephen K; Yates, Brian F; Ariafard, Alireza
2016-11-09
The gold-catalyzed direct functionalization of aromatic C-H bonds has attracted interest for constructing organic compounds which have application in pharmaceuticals, agrochemicals, and other important fields. In the literature, two major mechanisms have been proposed for these catalytic reactions: inner-sphere syn-addition and outer-sphere anti-addition (Friedel-Crafts-type mechanism). In this article, the AuCl3-catalyzed hydrofurylation of allenyl ketone, vinyl ketone, ketone, and alcohol substrates is investigated with the aid of density functional theory calculations, and it is found that the corresponding functionalizations are best rationalized in terms of a novel mechanism called "concerted electrophilic ipso-substitution" (CEIS) in which the gold(III)-furyl σ-bond produced by furan auration acts as a nucleophile and attacks the protonated substrate via an outer-sphere mechanism. This unprecedented mechanism needs to be considered as an alternative plausible pathway for gold(III)-catalyzed arene functionalization reactions in future studies.
Brook, Robert D; Franklin, Barry; Cascio, Wayne; Hong, Yuling; Howard, George; Lipsett, Michael; Luepker, Russell; Mittleman, Murray; Samet, Jonathan; Smith, Sidney C; Tager, Ira
2004-06-01
Air pollution is a heterogeneous, complex mixture of gases, liquids, and particulate matter. Epidemiological studies have demonstrated a consistent increased risk for cardiovascular events in relation to both short- and long-term exposure to present-day concentrations of ambient particulate matter. Several plausible mechanistic pathways have been described, including enhanced coagulation/thrombosis, a propensity for arrhythmias, acute arterial vasoconstriction, systemic inflammatory responses, and the chronic promotion of atherosclerosis. The purpose of this statement is to provide healthcare professionals and regulatory agencies with a comprehensive review of the literature on air pollution and cardiovascular disease. In addition, the implications of these findings in relation to public health and regulatory policies are addressed. Practical recommendations for healthcare providers and their patients are outlined. In the final section, suggestions for future research are made to address a number of remaining scientific questions.
Vocation in theology-based nursing theories.
Lundmark, Mikael
2007-11-01
By using the concepts of intrinsicality/extrinsicality as analytic tools, the theology-based nursing theories of Ann Bradshaw and Katie Eriksson are analyzed regarding their explicit and/or implicit understanding of vocation as a motivational factor for nursing. The results show that both theories view intrinsic values as guarantees against reducing nursing practice to mechanistic applications of techniques and as being a way of reinforcing a high ethical standard. The theories explicitly (Bradshaw) or implicitly (Eriksson) advocate a vocational understanding of nursing as being essential for nursing theories. Eriksson's theory has a potential for conceptualizing an understanding of extrinsic and intrinsic motivational factors for nursing but one weakness in the theory could be the risk of slipping over to moral judgments where intrinsic factors are valued as being superior to extrinsic. Bradshaw's theory is more complex and explicit in understanding the concept of vocation and is theologically more plausible, although also more confessional.
Breaking down the gut microbiome composition in multiple sclerosis.
Budhram, Adrian; Parvathy, Seema; Kremenchutzky, Marcelo; Silverman, Michael
2017-04-01
The gut microbiome, which consists of a highly diverse ecologic community of micro-organisms, has increasingly been studied regarding its role in multiple sclerosis (MS) immunopathogenesis. This review critically examines the literature investigating the gut microbiome in MS. A comprehensive search was performed of PubMed databases and ECTRIMS meeting abstracts for literature relating to the gut microbiome in MS. Controlled studies examining the gut microbiome in patients with MS were included for review. Identified studies were predominantly case-control in their design and consistently found differences in the gut microbiome of MS patients compared to controls. We examine plausible mechanistic links between these differences and MS immunopathogenesis, and discuss the therapeutic implications of these findings. Review of the available literature reveals potential immunopathogenic links between the gut microbiome and MS, identifies avenues for therapeutic advancement, and emphasizes the need for further systematic study in this emerging field.
Origin of the Reductive Tricarboxylic Acid (rTCA) Cycle-Type CO2 Fixation: A Perspective
Fujishima, Kosuke
2017-01-01
The reductive tricarboxylic acid (rTCA) cycle is among the most plausible candidates for the first autotrophic metabolism in the earliest life. Extant enzymes fixing CO2 in this cycle contain cofactors at the catalytic centers, but it is unlikely that the protein/cofactor system emerged at once in a prebiotic process. Here, we discuss the feasibility of non-enzymatic cofactor-assisted drive of the rTCA reactions in the primitive Earth environments, particularly focusing on the acetyl-CoA conversion to pyruvate. Based on the energetic and mechanistic aspects of this reaction, we propose that the deep-sea hydrothermal vent environments with active electricity generation in the presence of various sulfide catalysts are a promising setting for it to progress. Our view supports the theory of an autotrophic origin of life from primordial carbon assimilation within a sulfide-rich hydrothermal vent.
Kibdelones: novel anticancer polyketides from a rare Australian actinomycete.
Ratnayake, Ranjala; Lacey, Ernest; Tennant, Shaun; Gill, Jennifer H; Capon, Robert J
2007-01-01
The kibdelones are a novel family of bioactive heterocyclic polyketides produced by a rare soil actinomycete, Kibdelosporangium sp. (MST-108465). Complete relative stereostructures were assigned to kibdelones A-C (1-3), kibdelone B rhamnoside (5), 13-oxokibdelone A (7), and 25-methoxy-24-oxokibdelone C (8) on the basis of detailed spectroscopic analysis and chemical interconversion, as well as mechanistic and biosynthetic considerations. Under mild conditions, kibdelones B (2) and C (3) undergo a facile equilibration to kibdelones A-C (1-3), while kibdelone B rhamnoside (5) equilibrates to a mixture of kibdelone A-C rhamnosides (4-6). A plausible mechanism for this equilibration is proposed and involves air oxidation, quinone/hydroquinone redox transformations, and a choreographed sequence of keto/enol tautomerizations that aromatize ring C via a quinone methide intermediate. Kibdelones exhibit potent and selective cytotoxicity against a panel of human tumor cell lines and display significant antibacterial and nematocidal activity.
The General Adaptation Syndrome: A Foundation for the Concept of Periodization.
Cunanan, Aaron J; DeWeese, Brad H; Wagle, John P; Carroll, Kevin M; Sausaman, Robert; Hornsby, W Guy; Haff, G Gregory; Triplett, N Travis; Pierce, Kyle C; Stone, Michael H
2018-04-01
Recent reviews have attempted to refute the efficacy of applying Selye's general adaptation syndrome (GAS) as a conceptual framework for the training process. Furthermore, the criticisms involved are regularly used as the basis for arguments against the periodization of training. However, these perspectives fail to consider the entirety of Selye's work, the evolution of his model, and the broad applications he proposed. While it is reasonable to critically evaluate any paradigm, critics of the GAS have yet to dismantle the link between stress and adaptation. Disturbance to the state of an organism is the driving force for biological adaptation, which is the central thesis of the GAS model and the primary basis for its application to the athlete's training process. Despite its imprecisions, the GAS has proven to be an instructive framework for understanding the mechanistic process of providing a training stimulus to induce specific adaptations that result in functional enhancements. Pioneers of modern periodization have used the GAS as a framework for the management of stress and fatigue to direct adaptation during sports training. Updates to the periodization concept have retained its founding constructs while explicitly calling for scientifically based, evidence-driven practice suited to the individual. Thus, the purpose of this review is to provide greater clarity on how the GAS serves as an appropriate mechanistic model to conceptualize the periodization of training.
From the exposome to mechanistic understanding of chemical ...
BACKGROUND: Current definitions of the exposome expand beyond the initial idea to consider the totality of exposure and aim to relate to biological effects. While the exposome has been established for human health, its principles can be extended to include broader ecological issues. The assessment of exposure is tightly interlinked with hazard assessment. OBJECTIVES: We explore if mechanistic understanding of the causal links between exposure and adverse effects on human health and the environment can be improved by integrating the exposome approach with the adverse outcome pathway (AOP) concept - a framework to structure and organize the sequence of toxicological events from an initial molecular interaction of a chemical to an adverse outcome. METHODS: This review was informed by a Workshop organized by the Integrated Project EXPOSOME at the UFZ Helmholtz Centre for Environmental Research in Leipzig, Germany. DISCUSSION: The exposome encompasses all chemicals, including exogenous chemicals and endogenous compounds that are produced in response to external factors. By complementing the exposome research with the AOP concept, we can achieve a better mechanistic understanding, weigh the importance of various components of the exposome, and determine primary risk drivers. The ability to interpret multiple exposures and mixture effects at the mechanistic level requires a more holistic approach facilitated by the exposome concept. CONCLUSION: Incorporating the AOP conc
Toxicologists use dose-response data from both in vivo and in vitro experiments to evaluate the effects of chemical contaminants on organisms. Cumulative risk assessments (CRAs) consider the effects of multiple stressors on multiple endpoints, and utilize environmental exposure ...
The fish short term reproduction assay (FSTRA) is a key component of the USEPA endocrine disruptor screening program (EDSP). The FSTRA considers several mechanistic and apical responses in fathead minnows (Pimephales promelas) to determine whether an unknown chemical is likely to...
USDA-ARS?s Scientific Manuscript database
Components of emulsifiable concentrates (ECs) used in pesticide formulations may be emitted to air following application in agricultural use and contribute to ozone formation. A key consideration is the fraction of the ECs that is volatilized. This study is designed to provide a mechanistic model fr...
Embracing Community Ecology in Plant Microbiome Research.
Dini-Andreote, Francisco; Raaijmakers, Jos M
2018-06-01
Community assembly is mediated by selection, dispersal, drift, and speciation. Environmental selection is mostly used to date to explain patterns in plant microbiome assembly, whereas the influence of the other processes remains largely elusive. Recent studies highlight that adopting community ecology concepts provides a mechanistic framework for plant microbiome research. Copyright © 2018 Elsevier Ltd. All rights reserved.
Instructional Design for the 21st Century: Towards a New Conceptual Framework.
ERIC Educational Resources Information Center
Mashhadi, Azam
As a new century approaches it is time to re-assess the foundations on which instructional design currently rests, as well as the "mode of thinking" that it promotes. Traditional theories regarding instructional design have largely been implicitly based on out-moded eighteenth century conceptions of the physical universe (a mechanistic world view)…
Multivariate cross-frequency coupling via generalized eigendecomposition
Cohen, Michael X
2017-01-01
This paper presents a new framework for analyzing cross-frequency coupling in multichannel electrophysiological recordings. The generalized eigendecomposition-based cross-frequency coupling framework (gedCFC) is inspired by source-separation algorithms combined with dynamics of mesoscopic neurophysiological processes. It is unaffected by factors that confound traditional CFC methods—such as non-stationarities, non-sinusoidality, and non-uniform phase angle distributions—attractive properties considering that brain activity is neither stationary nor perfectly sinusoidal. The gedCFC framework opens new opportunities for conceptualizing CFC as network interactions with diverse spatial/topographical distributions. Five specific methods within the gedCFC framework are detailed; these are validated in simulated data and applied in several empirical datasets. gedCFC accurately recovers physiologically plausible CFC patterns embedded in noise that causes traditional CFC methods to perform poorly. The paper also demonstrates that spike-field coherence in multichannel local field potential data can be analyzed using the gedCFC framework, which provides significant advantages over traditional spike-field coherence analyses. Null-hypothesis testing is also discussed. DOI: http://dx.doi.org/10.7554/eLife.21792.001 PMID:28117662
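The core computation behind this family of methods is a generalized eigendecomposition that contrasts a "feature" covariance matrix against a reference covariance. A minimal sketch with synthetic data follows; the channel count, the windowing rule standing in for phase-based selection, and the shrinkage value are placeholders, not the paper's settings.

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)
n_chan, n_time = 16, 5000
data = rng.standard_normal((n_chan, n_time))           # stand-in for EEG/LFP channels x time

# reference covariance R from all time points
R = np.cov(data)

# "feature" covariance S from a subset of samples (standing in for samples taken
# around a particular low-frequency phase or other feature of interest)
idx = np.arange(0, n_time, 7)
S = np.cov(data[:, idx])

# shrinkage regularization keeps R well-conditioned (a common practical step)
gamma = 0.01
R_reg = (1 - gamma) * R + gamma * np.mean(np.linalg.eigvalsh(R)) * np.eye(n_chan)

# generalized eigendecomposition: find w maximizing (w^T S w) / (w^T R w)
evals, evecs = eigh(S, R_reg)
w = evecs[:, -1]                                        # spatial filter with largest ratio
component = w @ data                                    # component time series
```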
Vidali, Veroniki P; Mitsopoulou, Kornilia P; Dakanali, Marianna; Demadis, Konstantinos D; Odysseos, Andreani D; Christou, Yiota A; Couladouros, Elias A
2013-11-01
A novel skeletal rearrangement of bicyclo[3.3.1]nonane-2,4,9-trione (16) to an unprecedented highly functionalized bicyclo[3.3.0]octane system (17), induced by an intramolecular Michael addition, is presented. This novel framework was found to be similarly active to hyperforin (1) against PC-3 cell lines. The mechanism was examined in detail, and a number of cascade transformations are proposed. The reactivity of the Δ(7,10)-double bond was also examined under several conditions to explain the above results.
Modelling Trial-by-Trial Changes in the Mismatch Negativity
Lieder, Falk; Daunizeau, Jean; Garrido, Marta I.; Friston, Karl J.; Stephan, Klaas E.
2013-01-01
The mismatch negativity (MMN) is a differential brain response to violations of learned regularities. It has been used to demonstrate that the brain learns the statistical structure of its environment and predicts future sensory inputs. However, the algorithmic nature of these computations and the underlying neurobiological implementation remain controversial. This article introduces a mathematical framework with which competing ideas about the computational quantities indexed by MMN responses can be formalized and tested against single-trial EEG data. This framework was applied to five major theories of the MMN, comparing their ability to explain trial-by-trial changes in MMN amplitude. Three of these theories (predictive coding, model adjustment, and novelty detection) were formalized by linking the MMN to different manifestations of the same computational mechanism: approximate Bayesian inference according to the free-energy principle. We thereby propose a unifying view on three distinct theories of the MMN. The relative plausibility of each theory was assessed against empirical single-trial MMN amplitudes acquired from eight healthy volunteers in a roving oddball experiment. Models based on the free-energy principle provided more plausible explanations of trial-by-trial changes in MMN amplitude than models representing the two more traditional theories (change detection and adaptation). Our results suggest that the MMN reflects approximate Bayesian learning of sensory regularities, and that the MMN-generating process adjusts a probabilistic model of the environment according to prediction errors. PMID:23436989
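As one concrete, deliberately simplified instance of the kind of approximate Bayesian learning being compared, a Gaussian belief about a sensory regularity is updated by a precision-weighted prediction error; the models evaluated in the study are richer, but share this basic form:

$$ \mu_{t+1} = \mu_t + \frac{\sigma_t^{2}}{\sigma_t^{2} + \sigma_o^{2}}\,\big(x_t - \mu_t\big), $$

so that surprising inputs (large prediction errors $x_t - \mu_t$) produce large belief updates when the current belief is uncertain relative to the observation noise $\sigma_o^{2}$.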
Uninformative Prior Multiple Target Tracking Using Evidential Particle Filters
NASA Astrophysics Data System (ADS)
Worthy, J. L., III; Holzinger, M. J.
Space situational awareness requires the ability to initialize state estimation from short measurements and the reliable association of observations to support the characterization of the space environment. The electro-optical systems used to observe space objects cannot fully characterize the state of an object given a short, unobservable sequence of measurements. Further, it is difficult to associate these short-arc measurements if many such measurements are generated through the observation of a cluster of satellites, debris from a satellite break-up, or from spurious detections of an object. An optimization-based, probabilistic short-arc observation association approach coupled with a Dempster-Shafer-based evidential particle filter in a multiple target tracking framework is developed and proposed to address these problems. The optimization-based approach has been shown in the literature to be computationally efficient and can produce probabilities of association, state estimates, and covariances while accounting for systemic errors. Rigorous application of Dempster-Shafer theory is shown to be effective at enabling ignorance to be properly accounted for in estimation by augmenting probability with belief and plausibility. The proposed multiple hypothesis framework will use a non-exclusive hypothesis formulation of Dempster-Shafer theory to assign belief mass to candidate association pairs and generate tracks based on the belief to plausibility ratio. The proposed algorithm is demonstrated using simulated observations of a GEO satellite breakup scenario.
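To make the belief/plausibility bracket concrete, here is a minimal Dempster-Shafer sketch; the frame of discernment, mass values, and association labels are purely illustrative and not taken from the proposed tracking framework.

```python
# Belief and plausibility from a basic mass assignment m over subsets of a frame.
def belief(m, A):
    # Bel(A) = sum of masses of all focal sets contained in A
    return sum(v for B, v in m.items() if set(B) <= set(A))

def plausibility(m, A):
    # Pl(A) = sum of masses of all focal sets intersecting A
    return sum(v for B, v in m.items() if set(B) & set(A))

# frame: two candidate associations 'a' and 'b'; mass on the full frame encodes ignorance
m = {frozenset({'a'}): 0.5, frozenset({'b'}): 0.2, frozenset({'a', 'b'}): 0.3}

print(belief(m, {'a'}), plausibility(m, {'a'}))   # 0.5 0.8 -> belief <= plausibility
```

A track-generation rule of the kind described above would then compare the belief-to-plausibility ratio of each candidate association against a threshold.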
Reinforcement Learning Using a Continuous Time Actor-Critic Framework with Spiking Neurons
Frémaux, Nicolas; Sprekeler, Henning; Gerstner, Wulfram
2013-01-01
Animals repeat rewarded behaviors, but the physiological basis of reward-based learning has only been partially elucidated. On one hand, experimental evidence shows that the neuromodulator dopamine carries information about rewards and affects synaptic plasticity. On the other hand, the theory of reinforcement learning provides a framework for reward-based learning. Recent models of reward-modulated spike-timing-dependent plasticity have made first steps towards bridging the gap between the two approaches, but faced two problems. First, reinforcement learning is typically formulated in a discrete framework, ill-adapted to the description of natural situations. Second, biologically plausible models of reward-modulated spike-timing-dependent plasticity require precise calculation of the reward prediction error, yet it remains to be shown how this can be computed by neurons. Here we propose a solution to these problems by extending the continuous temporal difference (TD) learning of Doya (2000) to the case of spiking neurons in an actor-critic network operating in continuous time, and with continuous state and action representations. In our model, the critic learns to predict expected future rewards in real time. Its activity, together with actual rewards, conditions the delivery of a neuromodulatory TD signal to itself and to the actor, which is responsible for action choice. In simulations, we show that such an architecture can solve a Morris water-maze-like navigation task, in a number of trials consistent with reported animal performance. We also use our model to solve the acrobot and the cartpole problems, two complex motor control tasks. Our model provides a plausible way of computing reward prediction error in the brain. Moreover, the analytically derived learning rule is consistent with experimental evidence for dopamine-modulated spike-timing-dependent plasticity. PMID:23592970
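The continuous-time TD signal at the heart of this architecture, following Doya's (2000) formulation that the paper extends to spiking networks, takes the form (notation may differ slightly from the paper):

$$ \delta(t) = r(t) - \frac{1}{\tau}\,V(t) + \dot{V}(t), $$

where $V(t)$ is the critic's value estimate and $\tau$ the reward discount time constant; $\delta(t)$ plays the role of the neuromodulatory signal gating plasticity in both the critic and the actor.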
DOE Office of Scientific and Technical Information (OSTI.GOV)
Collier, Virginia E.; Ellebracht, Nathan C.; Lindy, George I.; ...
2015-12-09
The kinetic and mechanistic understanding of cooperatively catalyzed aldol and nitroaldol condensations is probed using a series of mesoporous silicas functionalized with aminosilanes to provide bifunctional acid–base character. Mechanistically, a Hammett analysis is performed to determine the effects of electron-donating and electron-withdrawing groups of para-substituted benzaldehyde derivatives on the catalytic activity of each condensation reaction. This information is also used to discuss the validity of previously proposed catalytic mechanisms and to propose a revised mechanism with plausible reaction intermediates. For both reactions, electron-withdrawing groups increase the observed rates of reaction, though resonance effects play an important, yet subtle, role in the nitroaldol condensation, in which a p-methoxy electron-donating group is also able to stabilize the proposed carbocation intermediate. Additionally, activation energies and pre-exponential factors are calculated via the Arrhenius analysis of two catalysts with similar amine loadings: one catalyst had silanols available for cooperative interactions (acid–base catalysis), while the other was treated with a silanol-capping reagent to prevent such cooperativity (base-only catalysis). The values obtained for activation energies and pre-exponential factors in each reaction are discussed in the context of the proposed mechanisms and the importance of cooperative interactions in each reaction. The catalytic activity decreases for all reactions when the silanols are capped with trimethylsilyl groups, and higher temperatures are required to make accurate rate measurements, emphasizing the vital role the weakly acidic silanols play in the catalytic cycles. The results indicate that loss of acid sites is more detrimental to the catalytic activity of the aldol condensation than the nitroaldol condensation, as evidenced by the significant decrease in the pre-exponential factor for the aldol condensation when silanols are unavailable for cooperative interactions. Cooperative catalysis is evidenced by significant changes in the pre-exponential factor, rather than the activation energy for the aldol condensation.
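The two analyses named above rest on standard relationships, given here for orientation (the substituent constants and fitted values are in the cited work):

$$ \log\!\left(\frac{k_X}{k_H}\right) = \rho\,\sigma_X, \qquad k = A\,\exp\!\left(-\frac{E_a}{RT}\right), $$

where a positive Hammett reaction constant $\rho$ indicates rates accelerated by electron-withdrawing substituents, and the Arrhenius fit separates the activation energy $E_a$ from the pre-exponential factor $A$, the term found here to carry the signature of cooperative acid-base catalysis.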
Levy, Karen; Zimmerman, Julie; Elliott, Mark; Bartram, Jamie; Carlton, Elizabeth; Clasen, Thomas; Dillingham, Rebecca; Eisenberg, Joseph; Guerrant, Richard; Lantagne, Daniele; Mihelcic, James; Nelson, Kara
2016-01-01
Increased precipitation and temperature variability as well as extreme events related to climate change are predicted to affect the availability and quality of water globally. Already heavily burdened with diarrheal diseases due to poor access to water, sanitation and hygiene facilities, communities throughout the developing world lack the adaptive capacity to sufficiently respond to the additional adversity caused by climate change. Studies suggest that diarrhea rates are positively correlated with increased temperature, and show a complex relationship with precipitation. Although climate change will likely increase rates of diarrheal diseases on average, there is a poor mechanistic understanding of the underlying disease transmission processes and substantial uncertainty surrounding current estimates. This makes it difficult to recommend appropriate adaptation strategies. We review the relevant climate-related mechanisms behind transmission of diarrheal disease pathogens and argue that systems-based mechanistic approaches incorporating human, engineered and environmental components are urgently needed. We then review successful systems-based approaches used in other environmental health fields and detail one modeling framework to predict climate change impacts on diarrheal diseases and design adaptation strategies. PMID:26799810
Domain generality vs. modality specificity: The paradox of statistical learning
Frost, Ram; Armstrong, Blair C.; Siegelman, Noam; Christiansen, Morten H.
2015-01-01
Statistical learning is typically considered to be a domain-general mechanism by which cognitive systems discover the underlying distributional properties of the input. Recent studies examining whether there are commonalities in the learning of distributional information across different domains or modalities consistently reveal, however, modality and stimulus specificity. An important question is, therefore, how and why a hypothesized domain-general learning mechanism systematically produces such effects. We offer a theoretical framework according to which statistical learning is not a unitary mechanism, but a set of domain-general computational principles that operate in different modalities and are therefore subject to the specific constraints characteristic of their respective brain regions. This framework offers testable predictions and we discuss its computational and neurobiological plausibility. PMID:25631249
Katharine N. Suding; Sandra Lavorel; F. Stuart Chapin; Johannes H.C. Cornelissen; Sandra Diaz; Eric Garnier; Deborah Goldberg; David U. Hooper; Stephen T. Jackson; Marie-Laure Navas
2008-01-01
Predicting ecosystem responses to global change is a major challenge in ecology. A critical step in that challenge is to understand how changing environmental conditions influence processes across levels of ecological organization. While direct scaling from individual to ecosystem dynamics can lead to robust and mechanistic predictions, new approaches are needed to...
Kurakin, Alexei
2007-01-01
A large body of experimental evidence indicates that the specific molecular interactions and/or chemical conversions depicted as links in the conventional diagrams of cellular signal transduction and metabolic pathways are inherently probabilistic, ambiguous and context-dependent. Being the inevitable consequence of the dynamic nature of protein structure in solution, the ambiguity of protein-mediated interactions and conversions challenges the conceptual adequacy and practical usefulness of the mechanistic assumptions and inferences embodied in the design charts of cellular circuitry. It is argued that the reconceptualization of molecular recognition and cellular organization within the emerging interpretational framework of self-organization, which is expanded here to include such concepts as bounded stochasticity, evolutionary memory, and adaptive plasticity offers a significantly more adequate representation of experimental reality than conventional mechanistic conceptions do. Importantly, the expanded framework of self-organization appears to be universal and scale-invariant, providing conceptual continuity across multiple scales of biological organization, from molecules to societies. This new conceptualization of biological phenomena suggests that such attributes of intelligence as adaptive plasticity, decision-making, and memory are enforced by evolution at different scales of biological organization and may represent inherent properties of living matter. (c) 2007 John Wiley & Sons, Ltd.
In Situ, Time-Resolved, and Mechanistic Studies of Metal–Organic Framework Nucleation and Growth
Van Vleet, Mary J.; Weng, Tingting; Li, Xinyi; ...
2018-03-07
The vast chemical and structural diversity of metal–organic frameworks (MOFs) opens up the exciting possibility of “crystal engineering” MOFs tailored for particular catalytic or separation applications. Yet the process of reaction discovery, optimization, and scale-up of MOF synthesis remains extremely challenging, presenting significant obstacles to the synthetic realization of many otherwise promising MOF structures. Recently, significant new insights into the fundamental processes governing MOF nucleation and growth, as well as the relationship between reaction parameters and synthetic outcome, have been derived using powerful in situ, time-resolved and/or mechanistic studies of MOF crystallization. This Review provides a summary and associated critical analysis of the results of these and other related “direct” studies of MOF nucleation and growth, with a particular emphasis on the recent advances in instrument technologies that have enabled such studies and on the major hypotheses, theories, and models that have been used to explain MOF formation. We conclude with a summary of the major insights that have been gained from the work summarized in this Review, outlining our own perspective on potential fruitful new directions for investigation.
Advanced Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Technical Exchange Meeting
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Curtis
2013-09-01
During FY13, the INL developed an advanced SMR PRA framework which has been described in the report Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Detailed Technical Framework Specification, INL/EXT-13-28974 (April 2013). This framework considers the following areas: probabilistic models to provide information specific to advanced SMRs; representation of specific SMR design issues such as co-located modules and passive safety features; use of modern open-source and readily available analysis methods; internal and external events resulting in impacts to safety; all-hazards considerations; methods to support the identification of design vulnerabilities; and mechanistic and probabilistic data needs to support modeling and tools. In order to describe this framework more fully and obtain feedback on the proposed approaches, the INL hosted a technical exchange meeting during August 2013. This report describes the outcomes of that meeting.
Rohrmeier, Martin A; Cross, Ian
2014-07-01
Humans rapidly learn complex structures in various domains. Findings of above-chance performance of some untrained control groups in artificial grammar learning studies raise questions about the extent to which learning can occur in an untrained, unsupervised testing situation with both correct and incorrect structures. The plausibility of unsupervised online-learning effects was modelled with n-gram, chunking and simple recurrent network models. A novel evaluation framework was applied, which alternates forced binary grammaticality judgments and subsequent learning of the same stimulus. Our results indicate a strong online learning effect for n-gram and chunking models and a weaker effect for simple recurrent network models. Such findings suggest that online learning is a plausible effect of statistical chunk learning that is possible when ungrammatical sequences contain a large proportion of grammatical chunks. Such common effects of continuous statistical learning may underlie statistical and implicit learning paradigms and raise implications for study design and testing methodologies. Copyright © 2014 Elsevier Inc. All rights reserved.
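A minimal illustration of the chunk-strength idea follows; the strings, scoring rule, and update scheme are placeholders rather than the authors' implemented models. Grammaticality is judged from the familiarity of a string's chunks, and the very act of processing test strings updates the chunk statistics, yielding unsupervised online learning.

```python
from collections import Counter

def bigrams(s):
    return [s[i:i + 2] for i in range(len(s) - 1)]

counts = Counter()

def judge_and_learn(s):
    grams = bigrams(s)
    familiarity = sum(counts[g] for g in grams) / len(grams)  # chunk-strength score
    counts.update(grams)                                      # unsupervised online update
    return familiarity

test_items = ["XMXRTV", "XMVRXR", "TVTRXM", "QQZPQZ"]         # hypothetical test strings
print([round(judge_and_learn(s), 2) for s in test_items])     # later scores reflect earlier items
```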
NASA Astrophysics Data System (ADS)
Koutroulis, Aristeidis; Papadimitriou, Lamprini; Grillakis, Manolis; Tsanis, Ioannis
2017-04-01
Recent developments could postpone climate actions within the frame of the global climate deal of the Paris Agreement, making higher-end global warming increasingly plausible. Although not made explicit in the COP21 agreement, water security is fundamental to achieving low-carbon ambitions, so climate and water policies are closely related. Projecting the relationship between global warming, water availability and water stress, through their complex interactions among different sectors and the synergies and trade-offs between adaptation and mitigation actions, is a challenging task under climate change. Here we develop and apply a simple, transparent conceptual framework describing European vulnerability to hydrological drought under the current hydro-climatic and socioeconomic status, as well as projected vulnerability at specific levels of global warming (1.5°C, 2°C and 4°C) following high rates of climatic change (RCP8.5) and considering different levels of adaptation associated with specific socioeconomic pathways (SSP2, SSP3 and SSP5).
NASA Astrophysics Data System (ADS)
Hay, C.; Creveling, J. R.; Huybers, P. J.
2016-12-01
Excursions in the stable carbon isotopic composition of carbonate rocks (δ13Ccarb) can facilitate correlation of Precambrian and Phanerozoic sedimentary successions at a higher temporal resolution than radiometric and biostratigraphic frameworks typically afford. Within the bounds of litho- and biostratigraphic constraints, stratigraphers often correlate isotopic patterns between distant stratigraphic sections through visual alignment of local maxima and minima of isotopic values. The reproducibility of this method can prove challenging and, thus, evaluating the statistical robustness of intrabasinal composite carbon isotope curves, and global correlations to these reference curves, remains difficult. To assess the reproducibility of stratigraphic alignment of δ13Ccarb data, and correlations between carbon isotope excursions, we employ a numerical dynamic time warping methodology that stretches and squeezes the time axis of a record to obtain an optimal correlation (in a least-squares sense) between time-uncertain series of data. In particular, we assess various alignments between series of Early Cambrian δ13Ccarb data with respect to plausible matches. We first show that an alignment of these records obtained visually, and published previously, is broadly reproducible using dynamic time warping. Alternative alignments with similar goodness of fit are also obtainable, and their stratigraphic plausibility is discussed. This approach should be generalizable to an algorithm for the purposes of developing a library of plausible alignments between multiple time-uncertain stratigraphic records.
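For orientation, the dynamic programming at the core of dynamic time warping can be sketched as follows; the squared-difference cost, the unconstrained warping window, and the example δ13C values are illustrative rather than those used in the study.

```python
import numpy as np

def dtw(x, y):
    """Optimal (least-squares-like) alignment cost between two series."""
    n, m = len(x), len(y)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = (x[i - 1] - y[j - 1]) ** 2
            # each point may map to one or more points of the other series
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# hypothetical delta13C series from two sections, each on its own stratigraphic grid
section_a = np.array([2.0, 1.0, -4.0, -8.0, -2.0, 3.0])
section_b = np.array([2.5, -3.5, -7.5, -7.0, -1.0, 2.5, 4.0])
print(dtw(section_a, section_b))
```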
Kahn, Michael G; Callahan, Tiffany J; Barnard, Juliana; Bauck, Alan E; Brown, Jeff; Davidson, Bruce N; Estiri, Hossein; Goerg, Carsten; Holve, Erin; Johnson, Steven G; Liaw, Siaw-Teng; Hamilton-Lopez, Marianne; Meeker, Daniella; Ong, Toan C; Ryan, Patrick; Shang, Ning; Weiskopf, Nicole G; Weng, Chunhua; Zozus, Meredith N; Schilling, Lisa
2016-01-01
Harmonized data quality (DQ) assessment terms, methods, and reporting practices can establish a common understanding of the strengths and limitations of electronic health record (EHR) data for operational analytics, quality improvement, and research. Existing published DQ terms were harmonized to a comprehensive unified terminology with definitions and examples and organized into a conceptual framework to support a common approach to defining whether EHR data is 'fit' for specific uses. DQ publications, informatics and analytics experts, managers of established DQ programs, and operational manuals from several mature EHR-based research networks were reviewed to identify potential DQ terms and categories. Two face-to-face stakeholder meetings were used to vet an initial set of DQ terms and definitions that were grouped into an overall conceptual framework. Feedback received from data producers and users was used to construct a draft set of harmonized DQ terms and categories. Multiple rounds of iterative refinement resulted in a set of terms and organizing framework consisting of DQ categories, subcategories, terms, definitions, and examples. The harmonized terminology and logical framework's inclusiveness was evaluated against ten published DQ terminologies. Existing DQ terms were harmonized and organized into a framework by defining three DQ categories: (1) Conformance (2) Completeness and (3) Plausibility and two DQ assessment contexts: (1) Verification and (2) Validation. Conformance and Plausibility categories were further divided into subcategories. Each category and subcategory was defined with respect to whether the data may be verified with organizational data, or validated against an accepted gold standard, depending on proposed context and uses. The coverage of the harmonized DQ terminology was validated by successfully aligning to multiple published DQ terminologies. Existing DQ concepts, community input, and expert review informed the development of a distinct set of terms, organized into categories and subcategories. The resulting DQ terms successfully encompassed a wide range of disparate DQ terminologies. Operational definitions were developed to provide guidance for implementing DQ assessment procedures. The resulting structure is an inclusive DQ framework for standardizing DQ assessment and reporting. While our analysis focused on the DQ issues often found in EHR data, the new terminology may be applicable to a wide range of electronic health data such as administrative, research, and patient-reported data. A consistent, common DQ terminology, organized into a logical framework, is an initial step in enabling data owners and users, patients, and policy makers to evaluate and communicate data quality findings in a well-defined manner with a shared vocabulary. Future work will leverage the framework and terminology to develop reusable data quality assessment and reporting methods.
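A minimal sketch of how the harmonized structure described above might be encoded in an assessment tool; the category and context names come from the abstract, while the subcategory lists are left empty because they are not enumerated here.

```python
# Categories and assessment contexts named in the harmonized framework above;
# subcategories of Conformance and Plausibility are defined in the paper and
# left as placeholders here.
DQ_CATEGORIES = {
    "Conformance": [],      # further subdivided in the framework
    "Completeness": [],
    "Plausibility": [],     # further subdivided in the framework
}
DQ_CONTEXTS = ("Verification", "Validation")  # checked against internal data vs. a gold standard

def classify_check(category: str, context: str) -> str:
    """Label a data quality check with its harmonized category and context."""
    assert category in DQ_CATEGORIES and context in DQ_CONTEXTS
    return f"{category} / {context}"

print(classify_check("Plausibility", "Validation"))
```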
Varma, Manthena V; El-Kattan, Ayman F
2016-07-01
A large body of evidence suggests hepatic uptake transporters, organic anion-transporting polypeptides (OATPs), are of high clinical relevance in determining the pharmacokinetics of substrate drugs, based on which recent regulatory guidances to industry recommend appropriate assessment of investigational drugs for the potential drug interactions. We recently proposed an extended clearance classification system (ECCS) framework in which the systemic clearance of class 1B and 3B drugs is likely determined by hepatic uptake. The ECCS framework therefore predicts the possibility of drug-drug interactions (DDIs) involving OATPs and the effects of genetic variants of SLCO1B1 early in the discovery and facilitates decision making in the candidate selection and progression. Although OATP-mediated uptake is often the rate-determining process in the hepatic clearance of substrate drugs, metabolic and/or biliary components also contribute to the overall hepatic disposition and, more importantly, to liver exposure. Clinical evidence suggests that alteration in biliary efflux transport or metabolic enzymes associated with genetic polymorphism leads to change in the pharmacodynamic response of statins, for which the pharmacological target resides in the liver. Perpetrator drugs may show inhibitory and/or induction effects on transporters and enzymes simultaneously. It is therefore important to adopt models that frame these multiple processes in a mechanistic sense for quantitative DDI predictions and to deconvolute the effects of individual processes on the plasma and hepatic exposure. In vitro data-informed mechanistic static and physiologically based pharmacokinetic models are proven useful in rationalizing and predicting transporter-mediated DDIs and the complex DDIs involving transporter-enzyme interplay. © 2016, The American College of Clinical Pharmacology.
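One common basic form of a mechanistic static model, for a reversible inhibitor acting on a single elimination pathway, is shown below for orientation; the models discussed above additionally layer induction and inactivation terms, gut and hepatic inhibitor concentrations, and transporter-enzyme interplay:

$$ \frac{\mathrm{AUC}_{\mathrm{inhibited}}}{\mathrm{AUC}_{\mathrm{control}}} \;=\; \frac{1}{\dfrac{f_m}{1 + [I]/K_i} \;+\; \big(1 - f_m\big)}, $$

where $f_m$ is the fraction of the victim drug's clearance mediated by the affected enzyme or transporter and $[I]/K_i$ is the inhibitor concentration relative to its inhibition constant.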
Coady, Katherine K.; Biever, Ronald C.; Denslow, Nancy D.; Gross, Melanie; Guiney, Patrick D.; Holbech, Henrik; Karouna-Renier, Natalie K.; Katsiadaki, Ioanna; Krueger, Hank; Levine, Steven L.; Maack, Gerd; Williams, Mike; Wolf, Jeffrey C.; Ankley, Gerald T.
2017-01-01
In the present study, existing regulatory frameworks and test systems for assessing potential endocrine active chemicals are described, and associated challenges are discussed, along with proposed approaches to address these challenges. Regulatory frameworks vary somewhat across geographies, but all basically evaluate whether a chemical possesses endocrine activity and whether this activity can result in adverse outcomes either to humans or to the environment. Current test systems include in silico, in vitro, and in vivo techniques focused on detecting potential endocrine activity, and in vivo tests that collect apical data to detect possible adverse effects. These test systems are currently designed to robustly assess endocrine activity and/or adverse effects in the estrogen, androgen, and thyroid hormone signaling pathways; however, there are some limitations of current test systems for evaluating endocrine hazard and risk. These limitations include a lack of certainty regarding: 1) adequately sensitive species and life stages; 2) mechanistic endpoints that are diagnostic for endocrine pathways of concern; and 3) the linkage between mechanistic responses and apical, adverse outcomes. Furthermore, some existing test methods are resource intensive with regard to time, cost, and use of animals. However, based on recent experiences, there are opportunities to improve approaches to and guidance for existing test methods and to reduce uncertainty. For example, in vitro high-throughput screening could be used to prioritize chemicals for testing and provide insights as to the most appropriate assays for characterizing hazard and risk. Other recommendations include adding endpoints for elucidating connections between mechanistic effects and adverse outcomes, identifying potentially sensitive taxa for which test methods currently do not exist, and addressing key endocrine pathways of possible concern in addition to those associated with estrogen, androgen, and thyroid signaling.
Rougier, Thibaud; Lassalle, Géraldine; Drouineau, Hilaire; Dumoulin, Nicolas; Faure, Thierry; Deffuant, Guillaume; Rochard, Eric; Lambert, Patrick
2015-01-01
Species can respond to climate change by tracking appropriate environmental conditions in space, resulting in a range shift. Species Distribution Models (SDMs) can help forecast such range shift responses. Both correlative and mechanistic SDMs have been built for only a few species; allis shad (Alosa alosa), an endangered anadromous fish species, is one of them. The main purpose of this study was to provide a framework for joint analyses of correlative and mechanistic SDM projections in order to strengthen conservation measures for species of conservation concern. Guidelines for joint representation and subsequent interpretation of model outputs were defined and applied. The present joint analysis was based on the novel mechanistic model GR3D (Global Repositioning Dynamics of Diadromous fish Distribution), which was parameterized on allis shad and then used to predict its future distribution along the European Atlantic coast under different climate change scenarios (RCP 4.5 and RCP 8.5). We then used a correlative SDM for this species to forecast its distribution across the same geographic area and under the same climate change scenarios. First, projections from correlative and mechanistic models provided congruent trends in probability of habitat suitability and population dynamics. This agreement was preferentially interpreted as referring to the species' vulnerability to climate change; accordingly, climate change could not be listed as a major threat for allis shad. The congruence in predicted range limits between SDM projections was the next point of interest. Where differences were noticed, they required a deeper understanding of the niche modelled by each approach. In this respect, the relative position of the northern range limit between the two methods strongly suggested that a key biological process related to intraspecific variability was potentially lacking in the mechanistic SDM. Based on our knowledge, we hypothesized that local adaptations to cold temperatures deserved more attention not only in modelling but also in conservation planning.
Mathewson, Paul D; Moyer-Horner, Lucas; Beever, Erik A; Briscoe, Natalie J; Kearney, Michael; Yahn, Jeremiah M; Porter, Warren P
2017-03-01
How climate constrains species' distributions through time and space is an important question in the context of conservation planning for climate change. Despite increasing awareness of the need to incorporate mechanism into species distribution models (SDMs), mechanistic modeling of endotherm distributions remains limited in this literature. Using the American pika (Ochotona princeps) as an example, we present a framework whereby mechanism can be incorporated into endotherm SDMs. Pika distribution has repeatedly been found to be constrained by warm temperatures, so we used Niche Mapper, a mechanistic heat-balance model, to convert macroclimate data to pika-specific surface activity time in summer across the western United States. We then explored the difference between using a macroclimate predictor (summer temperature) and using a mechanistic predictor (predicted surface activity time) in SDMs. Both approaches accurately predicted pika presences in current and past climate regimes. However, the activity models predicted 8-19% less habitat loss in response to annual temperature increases of ~3-5 °C predicted in the region by 2070, suggesting that pikas may be able to buffer some climate change effects through behavioral thermoregulation that can be captured by mechanistic modeling. Incorporating mechanism added value to the modeling by providing increased confidence in areas where different modeling approaches agreed and providing a range of outcomes in areas of disagreement. It also provided a more proximate variable relating animal distribution to climate, allowing investigations into how unique habitat characteristics and intraspecific phenotypic variation may allow pikas to exist in areas outside those predicted by generic SDMs. Only a small number of easily obtainable data are required to parameterize this mechanistic model for any endotherm, and its use can improve SDM predictions by explicitly modeling a widely applicable direct physiological effect: climate-imposed restrictions on activity. This more complete understanding is necessary to inform climate adaptation actions, management strategies, and conservation plans. © 2016 John Wiley & Sons Ltd.
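Niche Mapper solves a full biophysical heat balance; purely as a loose illustration of the general idea of converting hourly macroclimate into an activity-time predictor, the toy sketch below counts the hours in which a crude operative temperature stays below a ceiling for surface activity. The offset and ceiling are hypothetical values, not Niche Mapper parameters.

```python
import numpy as np

def surface_activity_hours(hourly_air_temp_c, activity_ceiling_c=25.0,
                           operative_offset_c=3.0):
    """Toy proxy for 'predicted surface activity time': count hours in which a
    crude operative temperature (air temperature plus a fixed radiative offset)
    stays below a ceiling above which the animal is assumed to retreat to cover.
    Both parameters are hypothetical, not Niche Mapper values."""
    operative = np.asarray(hourly_air_temp_c) + operative_offset_c
    return int(np.sum(operative < activity_ceiling_c))

# Example: a warm summer day vs. the same day under +4 degC warming
day = 15 + 10 * np.sin(np.linspace(0, 2 * np.pi, 24))   # hourly temperatures, degC
print(surface_activity_hours(day), surface_activity_hours(day + 4.0))
```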
Goosen, Ryan W.
2016-01-01
The relevance of specific microbial colonisation to colorectal cancer (CRC) disease pathogenesis is increasingly recognised, but our understanding of possible underlying molecular mechanisms that may link colonisation to disease in vivo remains limited. Here, we investigate the relationships between the most commonly studied CRC-associated bacteria (Enterotoxigenic Bacteroides fragilis, pks+ Escherichia coli, Fusobacterium spp., afaC+ E. coli, Enterococcus faecalis & Enteropathogenic E. coli) and altered transcriptomic and methylation profiles of CRC patients, in order to gain insight into the potential contribution of these bacteria in the aetiopathogenesis of CRC. We show that colonisation by E. faecalis and high levels of Fusobacterium is associated with a specific transcriptomic subtype of CRC that is characterised by CpG island methylation, microsatellite instability and a significant increase in inflammatory and DNA damage pathways. Analysis of the significant, bacterially-associated changes in host gene expression, both at the level of individual genes as well as pathways, revealed a transcriptional remodeling that provides a plausible mechanistic link between specific bacterial colonisation and colorectal cancer disease development and progression in this subtype; these included upregulation of REG3A, REG1A and REG1P in the case of high-level colonization by Fusobacterium, and CXCL10 and BMI1 in the case of colonisation by E. faecalis. The enrichment of both E. faecalis and Fusobacterium in this CRC subtype suggests that polymicrobial colonisation of the colonic epithelium may well be an important aspect of colonic tumourigenesis. PMID:27846243
Allen, J. L.; Oberdorster, G.; Morris-Schafer, K.; Wong, C.; Klocke, C.; Sobolewski, M.; Conrad, K.; Mayer-Proschel, M.; Cory-Slechta, D. A.
2016-01-01
Accumulating evidence from both human and animal studies shows that the brain is a target of air pollution. Multiple epidemiological studies have now linked components of air pollution to diagnosis of autism spectrum disorder (ASD), a linkage with plausibility based on the shared mechanisms of inflammation. Additional plausibility appears to be provided by findings from our studies in mice of exposures from postnatal day (PND) 4-7 and 10-13 (human 3rd trimester equivalent) to concentrated ambient ultrafine (UFP) particles, considered the most reactive component of air pollution, at levels consistent with high-traffic areas of major U.S. cities and thus highly relevant to human exposures. These exposures, occurring during a period of marked neuro- and gliogenesis, unexpectedly produced a pattern of developmental neurotoxicity notably similar to multiple hypothesized mechanistic underpinnings of ASD, including its greater impact in males. UFP exposures induced inflammation/microglial activation, reductions in size of the corpus callosum (CC) and associated hypomyelination, aberrant white matter development and/or structural integrity with ventriculomegaly (VM), elevated glutamate and excitatory/inhibitory imbalance, increased amygdala astrocytic activation, and repetitive and impulsive behaviors. Collectively, these findings suggest the human 3rd trimester equivalent as a period of potential vulnerability to neurodevelopmental toxicity of UFP, particularly in males, and point to the possibility that UFP air pollution exposure during periods of rapid neuro- and gliogenesis may be a risk factor not only for ASD, but also for other neurodevelopmental disorders that share features with ASD, such as schizophrenia, attention deficit disorder, and periventricular leukomalacia. PMID:26721665
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Li; Rivera-Ramos, Milton E.; Hernández-Maldonado, Arturo J.
2014-05-28
A Sr2+-SAPO-34 material that displays superior CO2 adsorption selectivity and capacity was characterized via XPS and UV-vis spectroscopy to elucidate the valence state of the strontium cations and the framework silicon environment. Most importantly, the location of the strontium has been estimated from a Rietveld refinement analysis of synchrotron diffraction data. The XPS analysis indicated that the apparent valence state of the strontium is less than 2, an indication of its interaction with the large anionic framework. Furthermore, UV-vis tests pointed to changes in the silicon environment, plausibly related to this valence state or framework faulting. For the refinement, the analysis found that strontium occupied two unique sites: a site Sr1 slightly displaced from the six-membered rings and a site Sr2 positioned at the top or bottom of the eight-membered rings. The latter position favors the interaction of the alkaline earth metal with CO2, probably resulting in an enhanced electric field-quadrupole moment interaction.
Arighi, Cecilia; Shamovsky, Veronica; Masci, Anna Maria; Ruttenberg, Alan; Smith, Barry; Natale, Darren A; Wu, Cathy; D'Eustachio, Peter
2015-01-01
The Protein Ontology (PRO) provides terms for and supports annotation of species-specific protein complexes in an ontology framework that relates them both to their components and to species-independent families of complexes. Comprehensive curation of experimentally known forms and annotations thereof is expected to expose discrepancies, differences, and gaps in our knowledge. We have annotated the early events of innate immune signaling mediated by Toll-Like Receptor 3 and 4 complexes in human, mouse, and chicken. The resulting ontology and annotation data set has allowed us to identify species-specific gaps in experimental data and possible functional differences between species, and to employ inferred structural and functional relationships to suggest plausible resolutions of these discrepancies and gaps.
Li, Michael; Dushoff, Jonathan; Bolker, Benjamin M
2018-07-01
Simple mechanistic epidemic models are widely used for forecasting and parameter estimation of infectious diseases based on noisy case reporting data. Despite the widespread application of models to emerging infectious diseases, we know little about the comparative performance of standard computational-statistical frameworks in these contexts. Here we build a simple stochastic, discrete-time, discrete-state epidemic model with both process and observation error and use it to characterize the effectiveness of different flavours of Bayesian Markov chain Monte Carlo (MCMC) techniques. We use fits to simulated data, where parameters (and future behaviour) are known, to explore the limitations of different platforms and quantify parameter estimation accuracy, forecasting accuracy, and computational efficiency across combinations of modeling decisions (e.g. discrete vs. continuous latent states, levels of stochasticity) and computational platforms (JAGS, NIMBLE, Stan).
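A minimal simulator of the kind of model described (discrete-time, discrete-state SIR with binomial process error and negative-binomial observation error on reported cases) is sketched below; it is an illustrative reconstruction with made-up parameter values, not the authors' code.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate(beta=0.5, gamma=0.2, N=10_000, I0=10, steps=60,
             report_prob=0.4, nb_size=5.0):
    """Discrete-time, discrete-state SIR with binomial process error and
    negative-binomial observation error on newly reported cases.
    All parameter values are illustrative."""
    S, I = N - I0, I0
    reported = []
    for _ in range(steps):
        p_inf = 1.0 - np.exp(-beta * I / N)            # per-susceptible infection prob.
        new_inf = rng.binomial(S, p_inf)               # process error
        new_rec = rng.binomial(I, 1.0 - np.exp(-gamma))
        S, I = S - new_inf, I + new_inf - new_rec
        mean_obs = report_prob * new_inf               # observation model
        reported.append(rng.negative_binomial(nb_size, nb_size / (nb_size + mean_obs))
                        if mean_obs > 0 else 0)
    return np.array(reported)

print(simulate()[:10])
```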
Kahn, Michael G.; Callahan, Tiffany J.; Barnard, Juliana; Bauck, Alan E.; Brown, Jeff; Davidson, Bruce N.; Estiri, Hossein; Goerg, Carsten; Holve, Erin; Johnson, Steven G.; Liaw, Siaw-Teng; Hamilton-Lopez, Marianne; Meeker, Daniella; Ong, Toan C.; Ryan, Patrick; Shang, Ning; Weiskopf, Nicole G.; Weng, Chunhua; Zozus, Meredith N.; Schilling, Lisa
2016-01-01
Objective: Harmonized data quality (DQ) assessment terms, methods, and reporting practices can establish a common understanding of the strengths and limitations of electronic health record (EHR) data for operational analytics, quality improvement, and research. Existing published DQ terms were harmonized to a comprehensive unified terminology with definitions and examples and organized into a conceptual framework to support a common approach to defining whether EHR data is ‘fit’ for specific uses. Materials and Methods: DQ publications, informatics and analytics experts, managers of established DQ programs, and operational manuals from several mature EHR-based research networks were reviewed to identify potential DQ terms and categories. Two face-to-face stakeholder meetings were used to vet an initial set of DQ terms and definitions that were grouped into an overall conceptual framework. Feedback received from data producers and users was used to construct a draft set of harmonized DQ terms and categories. Multiple rounds of iterative refinement resulted in a set of terms and organizing framework consisting of DQ categories, subcategories, terms, definitions, and examples. The harmonized terminology and logical framework’s inclusiveness was evaluated against ten published DQ terminologies. Results: Existing DQ terms were harmonized and organized into a framework by defining three DQ categories: (1) Conformance (2) Completeness and (3) Plausibility and two DQ assessment contexts: (1) Verification and (2) Validation. Conformance and Plausibility categories were further divided into subcategories. Each category and subcategory was defined with respect to whether the data may be verified with organizational data, or validated against an accepted gold standard, depending on proposed context and uses. The coverage of the harmonized DQ terminology was validated by successfully aligning to multiple published DQ terminologies. Discussion: Existing DQ concepts, community input, and expert review informed the development of a distinct set of terms, organized into categories and subcategories. The resulting DQ terms successfully encompassed a wide range of disparate DQ terminologies. Operational definitions were developed to provide guidance for implementing DQ assessment procedures. The resulting structure is an inclusive DQ framework for standardizing DQ assessment and reporting. While our analysis focused on the DQ issues often found in EHR data, the new terminology may be applicable to a wide range of electronic health data such as administrative, research, and patient-reported data. Conclusion: A consistent, common DQ terminology, organized into a logical framework, is an initial step in enabling data owners and users, patients, and policy makers to evaluate and communicate data quality findings in a well-defined manner with a shared vocabulary. Future work will leverage the framework and terminology to develop reusable data quality assessment and reporting methods. PMID:27713905
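As a small illustration of how the three harmonized categories might be operationalized against a tabular EHR extract, the sketch below computes one conformance, one completeness and one plausibility measure; the column names, date format and physiological range are hypothetical examples, not part of the published terminology.

```python
import pandas as pd

def dq_report(df):
    """Toy checks mirroring the harmonized categories: conformance (value matches
    the expected type/format), completeness (missingness), and plausibility
    (values fall in a believable range). Columns and ranges are hypothetical."""
    report = {}
    # Conformance: birth_date parses as a date
    parsed = pd.to_datetime(df["birth_date"], errors="coerce")
    report["conformance.birth_date_parses"] = parsed.notna().mean()
    # Completeness: fraction of non-missing heart rates
    report["completeness.heart_rate"] = df["heart_rate"].notna().mean()
    # Plausibility: heart rate within a physiologically believable range
    hr = df["heart_rate"].dropna()
    report["plausibility.heart_rate_20_250"] = hr.between(20, 250).mean()
    return report

demo = pd.DataFrame({"birth_date": ["1980-02-29", "not a date", "1975-07-01"],
                     "heart_rate": [72, None, 400]})
print(dq_report(demo))
```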
Framework for Uncertainty Assessment - Hanford Site-Wide Groundwater Flow and Transport Modeling
NASA Astrophysics Data System (ADS)
Bergeron, M. P.; Cole, C. R.; Murray, C. J.; Thorne, P. D.; Wurstner, S. K.
2002-05-01
Pacific Northwest National Laboratory is in the process of developing and implementing an uncertainty estimation methodology for use in future site assessments that addresses parameter uncertainty as well as uncertainties related to the groundwater conceptual model. The long-term goals of the effort are development and implementation of an uncertainty estimation methodology for use in future assessments and analyses being made with the Hanford site-wide groundwater model. The basic approach in the framework developed for uncertainty assessment consists of: 1) Alternate conceptual model (ACM) identification to identify and document the major features and assumptions of each conceptual model. The process must also include a periodic review of the existing and proposed new conceptual models as data or understanding become available. 2) ACM development of each identified conceptual model through inverse modeling with historical site data. 3) ACM evaluation to identify which of the conceptual models are plausible and should be included in any subsequent uncertainty assessments. 4) ACM uncertainty assessments will only be carried out for those ACMs determined to be plausible through comparison with historical observations and model structure identification measures. The parameter uncertainty assessment process generally involves: a) Model Complexity Optimization - to identify the important or relevant parameters for the uncertainty analysis; b) Characterization of Parameter Uncertainty - to develop the pdfs for the important uncertain parameters including identification of any correlations among parameters; c) Propagation of Uncertainty - to propagate parameter uncertainties (e.g., by first order second moment methods if applicable or by a Monte Carlo approach) through the model to determine the uncertainty in the model predictions of interest. 5) Estimation of combined ACM and scenario uncertainty by a double sum with each component of the inner sum (an individual CCDF) representing parameter uncertainty associated with a particular scenario and ACM and the outer sum enumerating the various plausible ACM and scenario combinations in order to represent the combined estimate of uncertainty (a family of CCDFs). A final important part of the framework includes identification, enumeration, and documentation of all the assumptions, which include those made during conceptual model development, required by the mathematical model, required by the numerical model, made during the spatial and temporal discretization process, needed to assign the statistical model and associated parameters that describe the uncertainty in the relevant input parameters, and finally those assumptions required by the propagation method. Pacific Northwest National Laboratory is operated for the U.S. Department of Energy under Contract DE-AC06-76RL01830.
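Step c), Monte Carlo propagation of parameter uncertainty to a CCDF of a model prediction, can be sketched as follows; the "model" and the parameter pdfs are placeholders chosen only to illustrate the mechanics, not anything from the Hanford analysis.

```python
import numpy as np

rng = np.random.default_rng(0)

def toy_transport_model(k, porosity, gradient):
    """Stand-in for the groundwater flow/transport model: a Darcy-style
    specific-discharge estimate, used only to illustrate propagation."""
    return k * gradient / porosity

# Characterize parameter uncertainty with pdfs (illustrative choices)
k = rng.lognormal(mean=np.log(10.0), sigma=0.5, size=10_000)   # hydraulic conductivity, m/day
porosity = rng.uniform(0.15, 0.35, size=10_000)
gradient = rng.normal(2e-3, 5e-4, size=10_000)

pred = toy_transport_model(k, porosity, gradient)

# CCDF: probability that the prediction exceeds a given value
x = np.sort(pred)
ccdf = 1.0 - np.arange(1, x.size + 1) / x.size
print("P(prediction > 0.1 m/day) ~", float(np.mean(pred > 0.1)))
```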
"Orange alert": a fluorescent detector for bisphenol A in water environments.
Zhang, Liyun; Er, Jun Cheng; Xu, Wang; Qin, Xian; Samanta, Animesh; Jana, Santanu; Lee, Chi-Lik Ken; Chang, Young-Tae
2014-03-07
Due to the prevalent use of polycarbonate plastics and epoxy resins in packaging materials and paints for ships, there has been a widespread global contamination of environmental water sources with bisphenol A (BPA). BPA, an endocrine disruptor, has been found to cause tremendous health problems. Therefore, there is an urgent need for detecting BPA in a convenient and sensitive manner to ensure water safety. Herein, we develop a fluorescent turn-on BPA probe, named Bisphenol Orange (BPO), which could conveniently detect BPA in a wide variety of real water samples including sea water, drain water and drinking water. BPO shows superior selectivity toward BPA and up to 70-fold increase in fluorescence emission at 580 nm when mixed with BPA in water. Mechanistic studies suggest a plausible water-dependent formation of hydrophobic BPA clusters which favorably trap and restrict the rotation of BPO and recover its inherent fluorescence. Copyright © 2014 Elsevier B.V. All rights reserved.
Antioxidative mechanisms in chlorogenic acid.
Tošović, Jelena; Marković, Svetlana; Dimitrić Marković, Jasmina M; Mojović, Miloš; Milenković, Dejan
2017-12-15
Although chlorogenic acid (5CQA) is an important ingredient of various foods and beverages, mechanisms of its antioxidative action have not been fully clarified. Besides electron spin resonance experiment, this study includes thermodynamic and mechanistic investigations of the hydrogen atom transfer (HAT), radical adduct formation (RAF), sequential proton loss electron transfer (SPLET), and single electron transfer - proton transfer (SET-PT) mechanisms of 5CQA in benzene, ethanol, and water solutions. The calculations were performed using the M06-2X/6-311++G(d,p) level of theory and CPCM solvation model. It was found that SET-PT is not a plausible antioxidative mechanism of 5CQA. RAF pathways are faster, but HAT yields thermodynamically more stable radical products, indicating that in acidic and neutral media 5CQA can take either HAT or RAF pathways. In basic environment (e.g. at physiological pH) SPLET is the likely antioxidative mechanism of 5CQA with extremely high rate. Copyright © 2017 Elsevier Ltd. All rights reserved.
Fe-polyaniline composite nanofiber catalyst for chemoselective hydrolysis of oxime.
Mahato, Sanjit Kumar; Bhaumik, Madhumita; Maji, Arun; Dutta, Abhijit; Maiti, Debabrata; Maity, Arjun
2018-03-01
A facile chemoselective one-pot strategy for the deprotection of oximes has been developed using an Fe0-polyaniline composite nanofiber (Fe0-PANI) as the catalyst. The nanomaterial-based Fe0-PANI catalyst was synthesized via in-situ polymerization of the ANI monomer followed by reductive deposition of Fe0 onto the PANI matrix. The catalyst was characterized by FE-SEM, HR-TEM, BET, XRD, ATR-FTIR, XPS and VSM techniques. The scope of the transformation was studied for aryl, alkyl and heteroaryl ketoximes with excellent chemoselectivity (>99%). Mechanistic investigations suggested the involvement of a cationic intermediate with Fe3+ as the active catalytic species. The substituent effect showed a linear free energy relationship. The activation energy (Ea) was calculated to be 17.46 kJ mol-1 for the acetophenone oxime to acetophenone conversion. The recyclability of the catalyst was demonstrated for up to 10 cycles without any significant loss of efficiency. Based on the preliminary experiments, a plausible mechanism involving a carbocationic intermediate has been proposed. Copyright © 2017 Elsevier Inc. All rights reserved.
Chromatin Remodeling BAF (SWI/SNF) Complexes in Neural Development and Disorders
Sokpor, Godwin; Xie, Yuanbin; Rosenbusch, Joachim; Tuoc, Tran
2017-01-01
The ATP-dependent BRG1/BRM associated factor (BAF) chromatin remodeling complexes are crucial in regulating gene expression by controlling chromatin dynamics. Over the last decade, it has become increasingly clear that during neural development in mammals, distinct ontogenetic stage-specific BAF complexes derived from combinatorial assembly of their subunits are formed in neural progenitors and post-mitotic neural cells. Proper functioning of the BAF complexes plays critical roles in neural development, including the establishment and maintenance of neural fates and functionality. Indeed, recent human exome sequencing and genome-wide association studies have revealed that mutations in BAF complex subunits are linked to neurodevelopmental disorders such as Coffin-Siris syndrome, Nicolaides-Baraitser syndrome, Kleefstra's syndrome spectrum, Hirschsprung's disease, autism spectrum disorder, and schizophrenia. In this review, we focus on the latest insights into the functions of BAF complexes during neural development and the plausible mechanistic basis of how mutations in known BAF subunits are associated with certain neurodevelopmental disorders. PMID:28824374
Climate change and ocean deoxygenation within intensified surface-driven upwelling circulations.
Bakun, Andrew
2017-09-13
Ocean deoxygenation often takes place in proximity to zones of intense upwelling. Associated concerns about amplified ocean deoxygenation arise from an arguable likelihood that coastal upwelling systems in the world's oceans may further intensify as anthropogenic climate change proceeds. Comparative examples discussed include the uniquely intense seasonal Somali Current upwelling, the massive upwelling that occurs quasi-continuously off Namibia and the recently appearing and now annually recurring 'dead zone' off the US State of Oregon. The evident 'transience' in causal dynamics off Oregon is somewhat mirrored in an interannual-scale intermittence in eruptions of anaerobically formed noxious gases off Namibia. A mechanistic scheme draws the three examples towards a common context in which, in addition to the obvious but politically problematic remedy of actually reducing 'greenhouse' gas emissions, the potentially manageable abundance of strongly swimming, finely gill raker-meshed small pelagic fish emerges as a plausible regulating factor. This article is part of the themed issue 'Ocean ventilation and deoxygenation in a warming world'. © 2017 The Author(s).
Heavy Metals, Cardiovascular Disease, and the Unexpected Benefits of Chelation Therapy.
Lamas, Gervasio A; Navas-Acien, Ana; Mark, Daniel B; Lee, Kerry L
2016-05-24
This review summarizes evidence from 2 lines of research previously thought to be unrelated: the unexpectedly positive results of TACT (Trial to Assess Chelation Therapy), and a body of epidemiological data showing that accumulation of biologically active metals, such as lead and cadmium, is an important risk factor for cardiovascular disease. Considering these 2 areas of work together may lead to the identification of new, modifiable risk factors for atherosclerotic cardiovascular disease. We examine the history of chelation up through the report of TACT. We then describe work connecting higher metal levels in the body with the future risk of cardiovascular disease. We conclude by presenting a brief overview of a newly planned National Institutes of Health trial, TACT2, in which we will attempt to replicate the findings of TACT and to establish that removal of toxic metal stores from the body is a plausible mechanistic explanation for the benefits of edetate disodium treatment. Copyright © 2016 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
Emergent neutrality drives phytoplankton species coexistence
Segura, Angel M.; Calliari, Danilo; Kruk, Carla; Conde, Daniel; Bonilla, Sylvia; Fort, Hugo
2011-01-01
The mechanisms that drive species coexistence and community dynamics have long puzzled ecologists. Here, we explain species coexistence, size structure and diversity patterns in a phytoplankton community using a combination of four fundamental factors: organism traits, size-based constraints, hydrology and species competition. Using a ‘microscopic’ Lotka–Volterra competition (MLVC) model (i.e. with explicit recipes to compute its parameters), we provide a mechanistic explanation of species coexistence along a niche axis (i.e. organismic volume). We based our model on empirically measured quantities, minimal ecological assumptions and stochastic processes. In nature, we found aggregated patterns of species biovolume (i.e. clumps) along the volume axis and a peak in species richness. Both patterns were reproduced by the MLVC model. Observed clumps corresponded to niche zones (volumes) where species fitness was highest, or where fitness was equal among competing species. The latter implies the action of equalizing processes, which would suggest emergent neutrality as a plausible mechanism to explain community patterns. PMID:21177680
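A bare-bones illustration of Lotka–Volterra competition along a niche (log-volume) axis with Gaussian niche overlap is given below; it does not reproduce the empirically derived parameter recipes of the MLVC model, and all values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

n = 40
log_vol = np.sort(rng.uniform(0, 6, n))            # niche positions (log organism volume)
K = np.exp(-0.5 * ((log_vol - 3.0) / 2.0) ** 2)    # carrying capacities along the axis
sigma = 0.4                                         # niche width
alpha = np.exp(-((log_vol[:, None] - log_vol[None, :]) ** 2) / (2 * sigma ** 2))

x = np.full(n, 0.01)                                # initial biovolumes
dt, growth = 0.1, 1.0
for _ in range(20_000):                             # Euler integration of LV dynamics
    x += dt * growth * x * (1.0 - alpha @ x / K)
    x = np.clip(x, 0.0, None)

print("surviving species:", int(np.sum(x > 1e-3)))
```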
Carias, Ann M; Allen, Shannon A; Fought, Angela J; Kotnik Halavaty, Katarina; Anderson, Meegan R; Jimenez, Maria L; McRaven, Michael D; Gioia, Casey J; Henning, Tara R; Kersh, Ellen N; Smith, James M; Pereira, Lara E; Butler, Katherine; McNicholl, S Janet M; Hendry, R Michael; Kiser, Patrick F; Veazey, Ronald S; Hope, Thomas J
2016-09-01
Currently, there are mounting data suggesting that HIV-1 acquisition in women can be affected by the use of certain hormonal contraceptives. Moreover, in non-human primate models, endogenous or exogenous progestin-dominant states have been shown to increase acquisition. To gain mechanistic insights into this increased acquisition, we studied how mucosal barrier function and CD4+ T-cell and CD68+ macrophage density and localization changed in the presence of natural progestins or after injection with high-dose DMPA. The presence of natural or injected progestins increased virus penetration of the columnar epithelium and the infiltration of susceptible cells into a thinned squamous epithelium of the vaginal vault, increasing the likelihood of potential virus interactions with target cells. These data suggest that increasing either endogenous or exogenous progestin can alter female reproductive tract barrier properties and provide plausible mechanisms for the increased HIV-1 acquisition risk observed in the presence of elevated progestin levels.
NASA Astrophysics Data System (ADS)
Rani, Anjeeta; Jayaraj, Abhilash; Jayaram, B.; Pannuru, Venkatesu
2016-03-01
In adaptation biology, since the discovery of intracellular osmolytes, these molecules have been found to play a central role in cellular homeostasis and stress response. A number of models using these molecules are now poised to address a wide range of problems in biology. Here, a combination of biophysical measurements and molecular dynamics (MD) simulations is used to examine the effect of trimethylamine-N-oxide (TMAO) on stem bromelain (BM) structure, stability and function. From the analysis of our results, we found that TMAO destabilizes BM hydrophobic pockets and the active site as a result of concerted polar and non-polar interactions, which is strongly evidenced by an MD simulation carried out for 250 ns. This destabilization is enthalpically favourable at higher concentrations of TMAO but entropically unfavourable. To the best of our knowledge, these results constitute the first detailed, unambiguous proof of a destabilizing effect of the most commonly studied osmolyte, TMAO, on the interactions governing the stability of BM, and they present a plausible mechanism of protein unfolding by TMAO.
Defence mechanisms: the role of physiology in current and future environmental protection paradigms
Glover, Chris N
2018-01-01
Abstract Ecological risk assessments principally rely on simplified metrics of organismal sensitivity that do not consider mechanism or biological traits. As such, they are unable to adequately extrapolate from standard laboratory tests to real-world settings, and largely fail to account for the diversity of organisms and environmental variables that occur in natural environments. However, an understanding of how stressors influence organism health can compensate for these limitations. Mechanistic knowledge can be used to account for species differences in basal biological function and variability in environmental factors, including spatial and temporal changes in the chemical, physical and biological milieu. Consequently, physiological understanding of biological function, and how this is altered by stressor exposure, can facilitate proactive, predictive risk assessment. In this perspective article, existing frameworks that utilize physiological knowledge (e.g. biotic ligand models, adverse outcomes pathways and mechanistic effect models), are outlined, and specific examples of how mechanistic understanding has been used to predict risk are highlighted. Future research approaches and data needs for extending the incorporation of physiological information into ecological risk assessments are discussed. Although the review focuses on chemical toxicants in aquatic systems, physical and biological stressors and terrestrial environments are also briefly considered. PMID:29564135
NASA Astrophysics Data System (ADS)
Worman, Stacey; Furbish, David; Fathel, Siobhan
2014-05-01
In arid landscapes, desert shrubs individually and collectively modify how sediment is transported (e.g. by wind, overland flow, and rain-splash). Addressing how desert shrubs modify landscapes on geomorphic timescales therefore necessitates spanning multiple shrub lifetimes and accounting for how processes affecting shrub dynamics on these longer timescales (e.g. fire, grazing, drought, and climate change) may in turn impact sediment transport. To fulfill this need, we present a mechanistic model of the spatiotemporal dynamics of a desert-shrub population that uses a simple accounting framework and tracks individual shrubs as they enter, age, and exit the population (via recruitment, growth, and mortality). Our model is novel insomuch as it (1) features a strong biophysical foundation, (2) mimics well-documented aspects of how shrub populations respond to changes in precipitation, and (3) possesses the process granularity appropriate for use in geomorphic simulations. In a complementary abstract (Fathel et al. 2014), we demonstrate the potential of this biological model by coupling it to a physical model of rain-splash sediment transport: we mechanistically reproduce the empirical observation that the erosion rate of a hillslope decreases as its vegetation coverage increases, and we predict erosion rates under different climate-change scenarios.
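A minimal sketch of this kind of individual-based accounting (recruit, grow, die), with recruitment tied to a stochastic precipitation index, is given below; all rates are hypothetical and are not taken from the model described in the abstract.

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate_shrubs(years=200, base_recruits=5, mortality=0.03,
                    growth_cm_per_yr=1.5):
    """Track individual shrubs as they enter, age/grow, and exit the population.
    Recruitment scales with a stochastic annual precipitation index; all rates
    are hypothetical, chosen only to illustrate the accounting framework."""
    canopy_radius = []                                   # one entry per living shrub (cm)
    for _ in range(years):
        precip_index = rng.gamma(shape=2.0, scale=0.5)             # ~mean 1
        n_new = rng.poisson(base_recruits * precip_index)          # recruitment
        canopy_radius.extend([1.0] * n_new)
        canopy_radius = [r + growth_cm_per_yr for r in canopy_radius   # growth
                         if rng.random() > mortality]                  # mortality
    return canopy_radius

shrubs = simulate_shrubs()
print(len(shrubs), "shrubs; mean canopy radius %.1f cm" % np.mean(shrubs))
```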
Secondary dispersal driven by overland flow in drylands: Review and mechanistic model development.
Thompson, Sally E; Assouline, Shmuel; Chen, Li; Trahktenbrot, Ana; Svoray, Tal; Katul, Gabriel G
2014-01-01
Seed dispersal alters gene flow, reproduction, migration and ultimately spatial organization of dryland ecosystems. Because many seeds in drylands lack adaptations for long-distance dispersal, seed transport by secondary processes such as tumbling in the wind or mobilization in overland flow plays a dominant role in determining where seeds ultimately germinate. Here, recent developments in modeling runoff generation in spatially complex dryland ecosystems are reviewed with the aim of proposing improvements to mechanistic modeling of seed dispersal processes. The objective is to develop a physically-based yet operational framework for determining seed dispersal due to surface runoff, a process that has gained recent experimental attention. A Buoyant OBject Coupled Eulerian - Lagrangian Closure model (BOB-CELC) is proposed to represent seed movement in shallow surface flows. The BOB-CELC is then employed to investigate the sensitivity of seed transport to landscape and storm properties and to the spatial configuration of vegetation patches interspersed within bare earth. The potential to simplify seed transport outcomes by considering the limiting behavior of multiple runoff events is briefly considered, as is the potential for developing highly mechanistic, spatially explicit models that link seed transport, vegetation structure and water movement across multiple generations of dryland plants.
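As a loose, one-dimensional caricature of Lagrangian seed tracking in overland flow (not the BOB-CELC formulation), the sketch below advects seeds downslope at a prescribed sheet-flow velocity and traps them stochastically inside a vegetation patch; the velocity, patch location and trapping rate are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(4)

def track_seeds(n_seeds=1000, dt=1.0, t_end=600.0, u_flow=0.05,
                patch_edges=(3.0, 4.0), trap_prob_per_s=0.2):
    """1-D toy: seeds advected downslope at the sheet-flow velocity (m/s) and
    trapped with a fixed per-second probability while inside a vegetation patch."""
    x = np.zeros(n_seeds)
    trapped = np.zeros(n_seeds, dtype=bool)
    for _ in range(int(t_end / dt)):
        moving = ~trapped
        x[moving] += u_flow * dt
        in_patch = moving & (x >= patch_edges[0]) & (x <= patch_edges[1])
        trapped |= in_patch & (rng.random(n_seeds) < trap_prob_per_s * dt)
    return x

x = track_seeds()
print("fraction of seeds retained by the patch: %.2f" % np.mean((x >= 3.0) & (x <= 4.0)))
```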
Stocker, Elena; Toschkoff, Gregor; Sacher, Stephan; Khinast, Johannes G
2014-11-20
The purpose of this study is to evaluate the use of computer simulations for generating quantitative knowledge as a basis for risk ranking and mechanistic process understanding, as required by ICH Q9 on quality risk management systems. In this specific publication, the main focus is the demonstration of a risk assessment workflow, including a computer simulation for the generation of mechanistic understanding of active tablet coating in a pan coater. Process parameter screening studies are statistically planned under consideration of impacts on a potentially critical quality attribute, i.e., coating mass uniformity. Based on the computer simulation data, a process failure mode and effects analysis (FMEA) of the risk factors is performed. This results in a quantitative criticality assessment of process parameters and the risk priority evaluation of failure modes. The factor for a quantitative reassessment of the criticality and risk priority is the coefficient of variation, which represents the coating mass uniformity. The major conclusion drawn from this work is a successful demonstration of the integration of computer simulation in the risk management workflow, leading to an objective and quantitative risk assessment. Copyright © 2014. Published by Elsevier B.V.
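Because the coefficient of variation (CV) of coating mass is the quantity driving the quantitative reassessment, a toy computation of CV from simulated per-tablet coating masses, mapped to an FMEA-style occurrence score, is sketched below; the thresholds and scores are hypothetical, not those of the study.

```python
import numpy as np

rng = np.random.default_rng(5)

def coating_cv(masses_mg):
    """Coefficient of variation (%) of per-tablet coating mass."""
    masses = np.asarray(masses_mg, dtype=float)
    return 100.0 * masses.std(ddof=1) / masses.mean()

def occurrence_score(cv_percent):
    """Toy mapping of CV to an FMEA occurrence score (thresholds hypothetical)."""
    return 1 if cv_percent < 5 else 5 if cv_percent < 10 else 9

# Simulated per-tablet coating masses (mg) from a hypothetical pan-coating run
masses = rng.normal(loc=12.0, scale=0.9, size=500)
cv = coating_cv(masses)
print(f"CV = {cv:.1f}% -> occurrence score {occurrence_score(cv)}")
```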
Using energy budgets to combine ecology and toxicology in a mammalian sentinel species
NASA Astrophysics Data System (ADS)
Desforges, Jean-Pierre W.; Sonne, Christian; Dietz, Rune
2017-04-01
Process-driven modelling approaches can resolve many of the shortcomings of traditional descriptive and non-mechanistic toxicology. We developed a simple dynamic energy budget (DEB) model for the mink (Mustela vison), a sentinel species in mammalian toxicology, which coupled animal physiology, ecology and toxicology, in order to mechanistically investigate the accumulation and adverse effects of lifelong dietary exposure to persistent environmental toxicants, most notably polychlorinated biphenyls (PCBs). Our novel mammalian DEB model accurately predicted, based on energy allocations to the interconnected metabolic processes of growth, development, maintenance and reproduction, lifelong patterns in mink growth, reproductive performance and dietary accumulation of PCBs as reported in the literature. Our model results were consistent with empirical data from captive and free-ranging studies in mink and other wildlife and suggest that PCB exposure can have significant population-level impacts resulting from targeted effects on fetal toxicity, kit mortality and growth and development. Our approach provides a simple and cross-species framework to explore the mechanistic interactions of physiological processes and ecotoxicology, thus allowing for a deeper understanding and interpretation of stressor-induced adverse effects at all levels of biological organization.
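A stripped-down sketch of coupling a growth/ration budget to dietary PCB accumulation through a one-compartment toxicokinetic balance is shown below; it is not the authors' DEB model, and every parameter is a hypothetical placeholder.

```python
import numpy as np

def mink_pcb_trajectory(years=8, dt_days=1.0, diet_pcb_ug_per_g=0.5,
                        assimilation=0.9, elim_rate_per_day=0.002):
    """One-compartment dietary accumulation: body burden grows with assimilated
    intake (scaled by a ration that tracks body mass) and declines by first-order
    elimination. A von Bertalanffy-like growth curve stands in for the DEB growth
    allocation. All parameters are hypothetical."""
    t = np.arange(0, years * 365, dt_days)
    mass_g = 1800.0 * (1.0 - np.exp(-0.01 * t)) + 100.0          # body mass, g
    ration_g_per_day = 0.15 * mass_g                              # daily food intake
    burden_ug = np.zeros_like(t)
    for i in range(1, t.size):
        intake = assimilation * diet_pcb_ug_per_g * ration_g_per_day[i - 1]
        burden_ug[i] = burden_ug[i - 1] + dt_days * (intake
                        - elim_rate_per_day * burden_ug[i - 1])
    return t, burden_ug / mass_g                                  # ug PCB per g body mass

t, conc = mink_pcb_trajectory()
print("lifetime body concentration at 8 y: %.1f ug/g" % conc[-1])
```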
Mellor, Jonathan E; Levy, Karen; Zimmerman, Julie; Elliott, Mark; Bartram, Jamie; Carlton, Elizabeth; Clasen, Thomas; Dillingham, Rebecca; Eisenberg, Joseph; Guerrant, Richard; Lantagne, Daniele; Mihelcic, James; Nelson, Kara
2016-04-01
Increased precipitation and temperature variability as well as extreme events related to climate change are predicted to affect the availability and quality of water globally. Already heavily burdened with diarrheal diseases due to poor access to water, sanitation and hygiene facilities, communities throughout the developing world lack the adaptive capacity to sufficiently respond to the additional adversity caused by climate change. Studies suggest that diarrhea rates are positively correlated with increased temperature, and show a complex relationship with precipitation. Although climate change will likely increase rates of diarrheal diseases on average, there is a poor mechanistic understanding of the underlying disease transmission processes and substantial uncertainty surrounding current estimates. This makes it difficult to recommend appropriate adaptation strategies. We review the relevant climate-related mechanisms behind transmission of diarrheal disease pathogens and argue that systems-based mechanistic approaches incorporating human, engineered and environmental components are urgently needed. We then review successful systems-based approaches used in other environmental health fields and detail one modeling framework to predict climate change impacts on diarrheal diseases and design adaptation strategies. Copyright © 2016 Elsevier B.V. All rights reserved.
Asha, K S; Bhattacharjee, Rameswar; Mandal, Sukhendu
2016-09-12
A complete transmetalation has been achieved on a barium metal-organic framework (MOF), leading to the isolation of a new Tb-MOF in a single-crystal (SC) to single-crystal (SC) fashion. It leads to the transformation of an anionic framework with cations in the pore to one that is neutral. The mechanistic studies proposed a core-shell metal exchange through dissociation of metal-ligand bonds. This Tb-MOF exhibits enhanced photoluminescence and acts as a selective sensor for phosphate anion in aqueous medium. Thus, this work not only provides a method to functionalize a MOF that can have potential application in sensing but also elucidates the formation mechanism of the resulting MOF. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Hansmann, Max M; Rudolph, Matthias; Rominger, Frank; Hashmi, A Stephen K
2013-02-25
The other side of the mountain: Changing the framework of diyne systems opens up new cyclization modes for dual gold catalysis. Instead of a 5-endo cyclization and gold vinylidenes a 6-endo cyclization gives rise to gold-stabilized carbenes as key intermediates for selective C-H insertions. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Interagency Conflict Assessment Framework: A Pragmatic Tool for Army Design
2010-12-02
Peter Checkland and John Poulter, Learning for Action: A Short Definitive Account of Soft Systems Methodology and its use for Practitioners, Teachers...lend itself to mechanistic analytical methods. Peter Checkland and John Poulter suggest that each approach is neither right nor wrong, rather...their relationships. According to Peter Checkland, what he describes as "rich pictures" are excellent tools for capturing the dynamics of a
A mechanistic modeling and data assimilation framework for Mojave Desert ecohydrology
Ng, Gene-Hua Crystal.; Bedford, David; Miller, David
2014-01-01
This study demonstrates and addresses challenges in coupled ecohydrological modeling in deserts, which arise due to unique plant adaptations, marginal growing conditions, slow net primary production rates, and highly variable rainfall. We consider model uncertainty from both structural and parameter errors and present a mechanistic model for the shrub Larrea tridentata (creosote bush) under conditions found in the Mojave National Preserve in southeastern California (USA). Desert-specific plant and soil features are incorporated into the CLM-CN model by Oleson et al. (2010). We then develop a data assimilation framework using the ensemble Kalman filter (EnKF) to estimate model parameters based on soil moisture and leaf-area index observations. A new implementation procedure, the “multisite loop EnKF,” tackles parameter estimation difficulties found to affect desert ecohydrological applications. Specifically, the procedure iterates through data from various observation sites to alleviate adverse filter impacts from non-Gaussianity in small desert vegetation state values. It also readjusts inconsistent parameters and states through a model spin-up step that accounts for longer dynamical time scales due to infrequent rainfall in deserts. Observation error variance inflation may also be needed to help prevent divergence of estimates from true values. Synthetic test results highlight the importance of adequate observations for reducing model uncertainty, which can be achieved through data quality or quantity.
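The core analysis step of such a framework, a stochastic EnKF update of an augmented state-parameter ensemble against observations, can be sketched generically as below. This is the textbook update, not the paper's multisite loop implementation, and the toy soil-moisture decay model and all numbers are assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)

def enkf_update(ensemble, obs, obs_operator, obs_var):
    """Stochastic EnKF analysis step on an augmented [state, parameter] ensemble.
    ensemble: (n_members, n_vars); obs: (n_obs,); obs_operator: maps one member to
    its predicted observations; obs_var: observation error variance (scalar)."""
    n, _ = ensemble.shape
    predicted = np.array([obs_operator(m) for m in ensemble])        # (n, n_obs)
    x_anom = ensemble - ensemble.mean(axis=0)
    y_anom = predicted - predicted.mean(axis=0)
    cov_xy = x_anom.T @ y_anom / (n - 1)
    cov_yy = y_anom.T @ y_anom / (n - 1) + obs_var * np.eye(obs.size)
    gain = cov_xy @ np.linalg.inv(cov_yy)
    perturbed_obs = obs + rng.normal(0.0, np.sqrt(obs_var), size=(n, obs.size))
    return ensemble + (perturbed_obs - predicted) @ gain.T

# Toy example: estimate an unknown soil-moisture decay rate k from two observations
true_k, theta0 = 0.3, 0.8
obs = np.array([theta0 * np.exp(-true_k * t) for t in (1.0, 2.0)]) + rng.normal(0, 0.01, 2)
ens = np.column_stack([np.full(100, theta0), rng.uniform(0.05, 0.6, 100)])   # [theta0, k]
H = lambda m: m[0] * np.exp(-m[1] * np.array([1.0, 2.0]))
print("posterior k ~= %.2f (truth 0.3)" % enkf_update(ens, obs, H, 0.01 ** 2)[:, 1].mean())
```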
Revilla, Eloy; Wiegand, Thorsten
2008-12-09
The dynamics of spatially structured populations is characterized by within- and between-patch processes. The available theory describes the latter with simple distance-dependent functions that depend on landscape properties such as interpatch distance or patch size. Despite its potential role, we lack a good mechanistic understanding of how the movement of individuals between patches affects the dynamics of these populations. We used the theoretical framework provided by movement ecology to make a direct representation of the processes determining how individuals connect local populations in a spatially structured population of Iberian lynx. Interpatch processes depended on the heterogeneity of the matrix where patches are embedded and the parameters defining individual movement behavior. They were also very sensitive to the dynamic demographic variables limiting the time moving, the within-patch dynamics of available settlement sites (both spatiotemporally heterogeneous) and the response of individuals to the perceived risk while moving. These context-dependent dynamic factors are an inherent part of the movement process, producing connectivities and dispersal kernels whose variability is affected by other demographic processes. Mechanistic representations of interpatch movements, such as the one provided by the movement-ecology framework, permit the dynamic interaction of birth-death processes and individual movement behavior, thus improving our understanding of stochastic spatially structured populations.
Placing biodiversity in ecosystem models without getting lost in translation
NASA Astrophysics Data System (ADS)
Queirós, Ana M.; Bruggeman, Jorn; Stephens, Nicholas; Artioli, Yuri; Butenschön, Momme; Blackford, Jeremy C.; Widdicombe, Stephen; Allen, J. Icarus; Somerfield, Paul J.
2015-04-01
A key challenge to progressing our understanding of biodiversity's role in the sustenance of ecosystem function is the extrapolation of the results of two decades of dedicated empirical research to regional, global and future landscapes. Ecosystem models provide a platform for this progression, potentially offering a holistic view of ecosystems where, guided by the mechanistic understanding of processes and their connection to the environment and biota, large-scale questions can be investigated. While the benefits of depicting biodiversity in such models are widely recognized, its application is limited by difficulties in the transfer of knowledge from small process oriented ecology into macro-scale modelling. Here, we build on previous work, breaking down key challenges of that knowledge transfer into a tangible framework, highlighting successful strategies that both modelling and ecology communities have developed to better interact with one another. We use a benthic and a pelagic case-study to illustrate how aspects of the links between biodiversity and ecosystem process have been depicted in marine ecosystem models (ERSEM and MIRO), from data, to conceptualisation and model development. We hope that this framework may help future interactions between biodiversity researchers and model developers by highlighting concrete solutions to common problems, and in this way contribute to the advance of the mechanistic understanding of the role of biodiversity in marine (and terrestrial) ecosystems.
Current limitations and recommendations to improve testing ...
In this paper existing regulatory frameworks and test systems for assessing potential endocrine-active chemicals are described, and associated challenges discussed, along with proposed approaches to address these challenges. Regulatory frameworks vary somewhat across organizations, but all basically evaluate whether a chemical possesses endocrine activity and whether this activity can result in adverse outcomes either to humans or the environment. Current test systems include in silico, in vitro and in vivo techniques focused on detecting potential endocrine activity, and in vivo tests that collect apical data to detect possible adverse effects. These test systems are currently designed to robustly assess endocrine activity and/or adverse effects in the estrogen, androgen, and thyroid hormonal pathways; however, there are some limitations of current test systems for evaluating endocrine hazard and risk. These limitations include a lack of certainty regarding: 1) adequately sensitive species and life-stages, 2) mechanistic endpoints that are diagnostic for endocrine pathways of concern, and 3) the linkage between mechanistic responses and apical, adverse outcomes. Furthermore, some existing test methods are resource intensive in regard to time, cost, and use of animals. However, based on recent experiences, there are opportunities to improve approaches to, and guidance for, existing test methods, and to reduce uncertainty. For example, in vitro high-throughput screening could be used to prioritize chemicals for testing and provide insights as to the most appropriate assays for characterizing hazard and risk.
NASA Astrophysics Data System (ADS)
Fijani, E.; Chitsazan, N.; Nadiri, A.; Tsai, F. T.; Asghari Moghaddam, A.
2012-12-01
Artificial Neural Networks (ANNs) have been widely used to estimate concentrations of chemicals in groundwater systems. However, estimation uncertainty is rarely discussed in the literature. Uncertainty in ANN output stems from three sources: ANN inputs, ANN parameters (weights and biases), and ANN structure. Uncertainty in ANN inputs may come from input data selection and/or input data error. ANN parameters are naturally uncertain because they are maximum-likelihood estimated. ANN structure is also uncertain because there is no unique ANN model for a given case. Therefore, multiple plausible ANN models generally result for a study. One might ask why good models have to be ignored in favor of the best model in traditional estimation. What is the ANN estimation variance? How do the variances from different ANN models accumulate into the total estimation variance? To answer these questions we propose a Hierarchical Bayesian Model Averaging (HBMA) framework. Instead of choosing one ANN model (the best ANN model) for estimation, HBMA averages the outputs of all plausible ANN models, with model weights based on the evidence of the data. The HBMA therefore avoids overconfidence in the single best ANN model. In addition, HBMA is able to analyze uncertainty propagation through the aggregation of ANN models in a hierarchical framework. This method is applied to estimation of fluoride concentration in the Poldasht plain and the Bazargan plain in Iran, where unusually high fluoride concentrations have caused negative effects on public health. Management of this anomaly requires estimation of the fluoride concentration distribution in the area. The results show that the HBMA provides a knowledge-decision-based framework that facilitates analyzing and quantifying ANN estimation uncertainties from different sources. In addition, HBMA allows comparative evaluation of the realizations for each source of uncertainty by segregating the uncertainty sources in a hierarchical framework. Fluoride concentration estimates obtained with the HBMA method show better agreement with the observation data in the test step because they are not based on a single model when no model has a dominant weight.
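At its simplest, Bayesian model averaging combines the predictions of several plausible models with data-based weights and splits the total variance into within-model and between-model parts. The sketch below illustrates that step with BIC-based weights and invented numbers; it does not reproduce the hierarchical segregation of uncertainty sources that defines HBMA.

```python
import numpy as np

def bma_combine(means, variances, bic):
    """Average predictions from several plausible models.
    means, variances: per-model predictive mean and variance for one location;
    bic: per-model Bayesian information criterion used to weight the models."""
    means, variances, bic = map(np.asarray, (means, variances, bic))
    w = np.exp(-0.5 * (bic - bic.min()))
    w /= w.sum()
    mean = np.sum(w * means)
    within = np.sum(w * variances)                    # averaged model-internal variance
    between = np.sum(w * (means - mean) ** 2)         # spread among model means
    return mean, within + between, w

# Three hypothetical ANN models predicting fluoride concentration (mg/L)
mean, var, w = bma_combine(means=[1.8, 2.3, 2.0],
                           variances=[0.04, 0.09, 0.05],
                           bic=[120.0, 123.0, 121.0])
print("BMA estimate: %.2f +/- %.2f mg/L, weights %s" % (mean, var ** 0.5, np.round(w, 2)))
```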
Witter, Sophie; Falisse, Jean-Benoit; Bertone, Maria Paola; Alonso-Garbayo, Alvaro; Martins, João S; Salehi, Ahmad Shah; Pavignani, Enrico; Martineau, Tim
2015-05-15
Human resources for health are self-evidently critical to running a health service and system. There is, however, a wider set of social issues which is more rarely considered. One area which is hinted at in literature, particularly on fragile and conflict-affected states, but rarely examined in detail, is the contribution which health staff may or do play in relation to the wider state-building processes. This article aims to explore that relationship, developing a conceptual framework to understand what linkages might exist and looking for empirical evidence in the literature to support, refute or adapt those linkages. An open call for contributions to the article was launched through an online community. The group then developed a conceptual framework and explored a variety of literatures (political, economic, historical, public administration, conflict and health-related) to find theoretical and empirical evidence related to the linkages outlined in the framework. Three country case reports were also developed for Afghanistan, Burundi and Timor-Leste, using secondary sources and the knowledge of the group. We find that the empirical evidence for most of the linkages is not strong, which is not surprising, given the complexity of the relationships. Nevertheless, some of the posited relationships are plausible, especially between development of health cadres and a strengthened public administration, which in the long run underlies a number of state-building features. The reintegration of factional health staff post-conflict is also plausibly linked to reconciliation and peace-building. The role of medical staff as part of national elites may also be important. The concept of state-building itself is highly contested, with a rich vein of scepticism about the wisdom or feasibility of this as an external project. While recognizing the inherently political nature of these processes, systems and sub-systems, it remains the case that state-building does occur over time, driven by a combination of internal and external forces and that understanding the role played in it by the health system and health staff, particularly after conflicts and in fragile settings, is an area worth further investigation. This review and framework contribute to that debate.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pan, Jian-Bo; Ji, Nan; Pan, Wen
2014-01-01
Drugs may induce adverse drug reactions (ADRs) when they unexpectedly bind to proteins other than their therapeutic targets. Identification of these undesired protein binding partners, called off-targets, can facilitate toxicity assessment in the early stages of drug development. In this study, a computational framework was introduced for the exploration of idiosyncratic mechanisms underlying analgesic-induced severe adverse drug reactions (SADRs). The putative analgesic-target interactions were predicted by performing reverse docking of analgesics or their active metabolites against human/mammal protein structures in a high-throughput manner. Subsequently, bioinformatics analyses were undertaken to identify ADR-associated proteins (ADRAPs) and pathways. Using the pathways and ADRAPs that this analysis identified, the mechanisms of SADRs such as cardiac disorders were explored. For instance, 53 putative ADRAPs and 24 pathways were linked with cardiac disorders, of which 10 ADRAPs were confirmed by previous experiments. Moreover, it was inferred that pathways such as base excision repair, glycolysis/gluconeogenesis, ErbB signaling, calcium signaling, and phosphatidylinositol signaling likely play pivotal roles in drug-induced cardiac disorders. In conclusion, our framework offers an opportunity to globally understand SADRs at the molecular level, which has been difficult to realize through experiments. It also provides some valuable clues for drug repurposing. - Highlights: • A novel computational framework was developed for mechanistic study of SADRs. • Off-targets of drugs were identified in large scale and in a high-throughput manner. • SADRs like cardiac disorders were systematically explored in molecular networks. • A number of ADR-associated proteins were identified.
Prueitt, Robyn L; Goodman, Julie E
2016-09-01
Exposure to elevated levels of ozone has been associated with a variety of respiratory-related health endpoints in both epidemiology and controlled human exposure studies, including lung function decrements and airway inflammation. A mode of action (MoA) for these effects has not been established, but it has been proposed that they may occur through ozone-induced activation of neural reflexes. We critically reviewed experimental studies of ozone exposure and neural reflex activation and applied the International Programme on Chemical Safety (IPCS) mode-of-action/human relevance framework to evaluate the biological plausibility and human relevance of this proposed MoA. Based on the currently available experimental data, we found that the proposed MoA of neural reflex activation is biologically plausible for the endpoint of ozone-induced lung function decrements at high ozone exposures, but further studies are needed to fill important data gaps regarding the relevance of this MoA at lower exposures. A role for the proposed MoA in ozone-induced airway inflammation is less plausible, as the evidence is conflicting and is also of unclear relevance given the lack of studies conducted at lower exposures. The evidence suggests a different MoA for ozone-induced inflammation that may still be linked to the key events in the proposed MoA, such that neural reflex activation may have some degree of involvement in modulating ozone-induced neutrophil influx, even if it is not a direct role.
Dynamical simulation priors for human motion tracking.
Vondrak, Marek; Sigal, Leonid; Jenkins, Odest Chadwicke
2013-01-01
We propose a simulation-based dynamical motion prior for tracking human motion from video in the presence of physical ground-person interactions. Most tracking approaches to date have focused on efficient inference algorithms and/or learning of prior kinematic motion models; however, few can explicitly account for the physical plausibility of recovered motion. Here, we aim to recover physically plausible motion of a single articulated human subject. Toward this end, we propose a full-body 3D physical simulation-based prior that explicitly incorporates a model of human dynamics into the Bayesian filtering framework. We consider the motion of the subject to be generated by a feedback “control loop” in which Newtonian physics approximates the rigid-body motion dynamics of the human and the environment through the application and integration of interaction forces, motor forces, and gravity. Interaction forces prevent physically impossible hypotheses, enable more appropriate reactions to the environment (e.g., ground contacts), and are produced from detected human-environment collisions. Motor forces actuate the body, ensure that proposed pose transitions are physically feasible, and are generated using a motion controller. For efficient inference in the resulting high-dimensional state space, we utilize an exemplar-based control strategy that reduces the effective search space of motor forces. As a result, we are able to recover physically plausible motion of human subjects from monocular and multiview video. We show, both quantitatively and qualitatively, that our approach performs favorably with respect to Bayesian filtering methods with standard motion priors.
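The full-body simulation prior in the paper cannot be reproduced here, but a toy Python sketch (one-dimensional, with hypothetical dynamics and noise levels) illustrates the general pattern of embedding a physics step, including a simple ground-contact interaction, inside a Bayesian particle filter:

    import numpy as np

    rng = np.random.default_rng(0)
    DT, G = 0.02, 9.81

    def physics_step(state):
        """Toy 'simulation prior': gravity plus an inelastic ground contact at y=0."""
        y, v = state[:, 0], state[:, 1]
        v = v - G * DT
        y = y + v * DT
        contact = y < 0.0
        y[contact] = 0.0
        v[contact] *= -0.5           # interaction force: damped bounce
        return np.stack([y, v], axis=1)

    def particle_filter(observations, n_particles=500, obs_noise=0.05):
        particles = np.zeros((n_particles, 2))
        particles[:, 0] = 2.0 + 0.1 * rng.standard_normal(n_particles)  # initial height
        estimates = []
        for z in observations:
            particles = physics_step(particles)
            particles += 0.01 * rng.standard_normal(particles.shape)    # process noise
            w = np.exp(-0.5 * ((z - particles[:, 0]) / obs_noise) ** 2)
            w /= w.sum()
            estimates.append(w @ particles)                             # posterior mean
            idx = rng.choice(n_particles, n_particles, p=w)             # resample
            particles = particles[idx]
        return np.array(estimates)

    # Hypothetical noisy observations of a falling object
    true_y = np.maximum(2.0 - 0.5 * G * (DT * np.arange(40)) ** 2, 0.0)
    obs = true_y + 0.05 * rng.standard_normal(true_y.size)
    est = particle_filter(obs)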
Functionality limit of classical simulated annealing
NASA Astrophysics Data System (ADS)
Hasegawa, M.
2015-09-01
By analyzing the system dynamics in the landscape paradigm, the optimization function of classical simulated annealing is reviewed on random traveling salesman problems. The properly functioning region of the algorithm is experimentally determined in the size-time plane, and the influence of its boundary on the scalability test is examined in the standard framework of this method. From both results, an empirical choice of temperature length is plausibly explained as a minimum requirement for the algorithm to maintain its scalability within its functionality limit. The study exemplifies the applicability of computational physics analysis to optimization algorithm research.
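For readers unfamiliar with the algorithm under study, a minimal Python sketch of classical simulated annealing on a random traveling salesman instance is given below (geometric cooling and 2-opt moves chosen for illustration; the paper's specific temperature-length schedule and analysis are not reproduced):

    import numpy as np

    rng = np.random.default_rng(1)

    def tour_length(cities, tour):
        return np.linalg.norm(cities[tour] - cities[np.roll(tour, -1)], axis=1).sum()

    def simulated_annealing(cities, t0=1.0, alpha=0.995, steps_per_t=200, t_min=1e-3):
        n = len(cities)
        tour = rng.permutation(n)
        best, best_len = tour.copy(), tour_length(cities, tour)
        t = t0
        while t > t_min:
            for _ in range(steps_per_t):                # "temperature length"
                i, j = sorted(rng.choice(n, 2, replace=False))
                cand = tour.copy()
                cand[i:j + 1] = cand[i:j + 1][::-1]     # 2-opt segment reversal
                delta = tour_length(cities, cand) - tour_length(cities, tour)
                if delta < 0 or rng.random() < np.exp(-delta / t):
                    tour = cand
                    if tour_length(cities, tour) < best_len:
                        best, best_len = tour.copy(), tour_length(cities, tour)
            t *= alpha                                  # geometric cooling
        return best, best_len

    cities = rng.random((30, 2))                        # random uniform TSP instance
    tour, length = simulated_annealing(cities)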
Bertheloot, Jessica; Wu, Qiongli; Cournède, Paul-Henry; Andrieu, Bruno
2011-10-01
Simulating nitrogen economy in crop plants requires formalizing the interactions between soil nitrogen availability, root nitrogen acquisition, distribution between vegetative organs and remobilization towards grains. This study evaluates and analyses the functional-structural and mechanistic model of nitrogen economy, NEMA (Nitrogen Economy Model within plant Architecture), developed for winter wheat (Triticum aestivum) after flowering. NEMA was calibrated for field plants under three nitrogen fertilization treatments at flowering. Model behaviour was investigated and sensitivity to parameter values was analysed. Nitrogen content of all photosynthetic organs and in particular nitrogen vertical distribution along the stem and remobilization patterns in response to fertilization were simulated accurately by the model, from Rubisco turnover modulated by light intercepted by the organ and a mobile nitrogen pool. This pool proved to be a reliable indicator of plant nitrogen status, allowing efficient regulation of nitrogen acquisition by roots, remobilization from vegetative organs and accumulation in grains in response to nitrogen treatments. In our simulations, root capacity to import carbon, rather than carbon availability, limited nitrogen acquisition and ultimately nitrogen accumulation in grains, while Rubisco turnover intensity mostly affected dry matter accumulation in grains. NEMA enabled interpretation of several key patterns usually observed in field conditions and the identification of plausible processes limiting for grain yield, protein content and root nitrogen acquisition that could be targets for plant breeding; however, further understanding requires more mechanistic formalization of carbon metabolism. Its strong physiological basis and its realistic behaviour support its use to gain insights into nitrogen economy after flowering.
Assembling evidence for identifying reservoirs of infection
Viana, Mafalda; Mancy, Rebecca; Biek, Roman; Cleaveland, Sarah; Cross, Paul C.; Lloyd-Smith, James O.; Haydon, Daniel T.
2014-01-01
Many pathogens persist in multihost systems, making the identification of infection reservoirs crucial for devising effective interventions. Here, we present a conceptual framework for classifying patterns of incidence and prevalence, and review recent scientific advances that allow us to study and manage reservoirs simultaneously. We argue that interventions can have a crucial role in enriching our mechanistic understanding of how reservoirs function and should be embedded as quasi-experimental studies in adaptive management frameworks. Single approaches to the study of reservoirs are unlikely to generate conclusive insights whereas the formal integration of data and methodologies, involving interventions, pathogen genetics, and contemporary surveillance techniques, promises to open up new opportunities to advance understanding of complex multihost systems. PMID:24726345
Estimating Allee Dynamics before They Can Be Observed: Polar Bears as a Case Study
Molnár, Péter K.; Lewis, Mark A.; Derocher, Andrew E.
2014-01-01
Allee effects are an important component in the population dynamics of numerous species. Accounting for these Allee effects in population viability analyses generally requires estimates of low-density population growth rates, but such data are unavailable for most species and particularly difficult to obtain for large mammals. Here, we present a mechanistic modeling framework that allows estimating the expected low-density growth rates under a mate-finding Allee effect before the Allee effect occurs or can be observed. The approach relies on representing the mechanisms causing the Allee effect in a process-based model, which can be parameterized and validated from data on the mechanisms rather than data on population growth. We illustrate the approach using polar bears (Ursus maritimus), and estimate their expected low-density growth by linking a mating dynamics model to a matrix projection model. The Allee threshold, defined as the population density below which growth becomes negative, is shown to depend on age-structure, sex ratio, and the life history parameters determining reproduction and survival. The Allee threshold is thus both density- and frequency-dependent. Sensitivity analyses of the Allee threshold show that different combinations of the parameters determining reproduction and survival can lead to differing Allee thresholds, even if these differing combinations imply the same stable-stage population growth rate. The approach further shows how mate-limitation can induce long transient dynamics, even in populations that eventually grow to carrying capacity. Applying the models to the overharvested low-density polar bear population of Viscount Melville Sound, Canada, shows that a mate-finding Allee effect is a plausible mechanism for slow recovery of this population. Our approach is generalizable to any mating system and life cycle, and could aid proactive management and conservation strategies, for example, by providing a priori estimates of minimum conservation targets for rare species or minimum eradication targets for pests and invasive species. PMID:24427306
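A stripped-down, unstructured analogue of the idea (illustrative parameter values, not the authors' polar bear parameterization or their matrix projection model): per-capita growth is multiplied by a mate-finding term, and the Allee threshold is read off as the low density at which net growth changes sign:

    import numpy as np

    def growth_rate(N, r=0.05, K=1000.0, theta=25.0):
        """Per-capita growth with logistic regulation and a mate-finding Allee term.

        theta controls how quickly mates are found at low density; all values
        are illustrative, not estimated for polar bears.
        """
        mating_prob = N / (N + theta)        # probability of finding a mate
        return r * mating_prob * (1.0 - N / K)

    # Locate the Allee threshold: the low density where net growth turns positive
    N = np.linspace(0.1, 1000, 20000)
    g = growth_rate(N) - 0.01                # subtract a background mortality rate
    crossing = np.where(np.diff(np.sign(g)) > 0)[0]   # negative-to-positive crossing
    allee_threshold = N[crossing[0]] if crossing.size else None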
Public health implications of environmental exposures.
De Rosa, C T; Pohl, H R; Williams, M; Ademoyero, A A; Chou, C H; Jones, D E
1998-01-01
The Agency for Toxic Substances and Disease Registry (ATSDR) is a public health agency with responsibility for assessing the public health implications associated with uncontrolled releases of hazardous substances into the environment. The biological effects of low-level exposures are a primary concern in these assessments. One of the tools used by the agency for this purpose is the risk assessment paradigm originally outlined and described by the National Academy of Sciences in 1983. Because of its design and inherent concepts, risk assessment has been variously employed by a number of environmental and public health agencies and programs as a means to organize information, as a decision support tool, and as a working hypothesis for biologically based inference and extrapolation. Risk assessment has also been the subject of significant critical review. The ATSDR recognizes the utility of both the qualitative and quantitative conclusions provided by traditional risk assessment, but the agency uses such estimates only in the broader context of professional judgment, internal and external peer review, and extensive public review and comment. This multifaceted approach is consistent with the Council on Environmental Quality's description and use of risk analysis as an organizing construct based on sound biomedical and other scientific judgment in concert with risk assessment to define plausible exposure ranges of concern rather than a single numerical estimate that may convey an artificial sense of precision. In this approach, biomedical opinion, host factors, mechanistic interpretation, molecular epidemiology, and actual exposure conditions are all critically important in evaluating the significance of environmental exposure to hazardous substances. As such, the ATSDR risk analysis approach is a multidimensional endeavor encompassing not only the components of risk assessment but also the principles of biomedical judgment, risk management, and risk communication. Within this framework of risk analysis, the ATSDR may rely on one or more of a number of interrelated principles and approaches to screen, organize information, set priorities, make decisions, and define future research needs and directions. PMID:9539032
Soriano, Elena; Marco-Contelles, José
2009-08-18
Organometallic chemistry provides powerful tools for the stereocontrolled synthesis of heterocycles and carbocycles. The electrophilic transition metals Pt(II) and Au(I, III) are efficient catalysts in these transitions and promote a variety of organic transformations of unsaturated precursors. These reactions produce functionalized cyclic and acyclic scaffolds for the synthesis of natural and non-natural products efficiently, under mild conditions, and with excellent chemoselectivity. Because these transformations are strongly substrate-dependent, they are versatile and may yield diverse molecular scaffolds. Therefore, synthetic chemists need a mechanistic interpretation to optimize this reaction process and design a new generation of catalysts. However, so far, no intermediate species has been isolated or characterized, so the formulated mechanistic hypotheses have been primarily based on labeling studies or trapping reactions. Recently, theoretical DFT studies have become a useful tool in our research, giving us insights into the key intermediates and into a variety of plausible reaction pathways. In this Account, we present a comprehensive mechanistic overview of transformations promoted by Pt and Au in a non-nucleophilic medium based on quantum-mechanical studies. The calculations are consistent with the experimental observations and provide fundamental insights into the versatility of these reaction processes. The reactivity of these metals results from their peculiar Lewis acid properties: the alkynophilic character of these soft metals and the pi-acid activation of unsaturated groups promotes the intra- or intermolecular attack of a nucleophile. 1,n-Enynes (n = 3-8) are particularly important precursors, and their transformation may yield a variety of cycloadducts depending on the molecular structure. However, the calculations suggest that these different cyclizations would have closely related reaction mechanisms, and we propose a unified mechanistic picture. The intramolecular nucleophilic attack of the double bond on the activated alkyne takes place by an endo-dig or exo-dig pathway to afford a cyclopropyl-metallocarbenoid. Through divergent routes, the cyclopropyl intermediate formed by exo-cyclopropanation could yield the metathesis adduct or bicyclic compounds. The endo-cyclization may be followed by a [1,2]-migration of the propargyl moiety to the internal acetylenic position to afford bicyclic [n.1.0] derivatives. This reaction mechanism is applicable for functional groups ranging from H to carboxylate propargyl substituents (Rautenstrauch reaction). In intramolecular reactions in which a shorter enyne bears a propargyl ester or in intermolecular reactions of an ester with an alkene, the ester preferentially attacks the activated alkyne because of enthalpic (ring strain) and entropic effects. Our calculations can predict the correct stereochemical outcome, which may aid the rational design of further stereoselective syntheses. The alkynes activated by electrophilic species can also react with other nucleophiles, such as aromatic rings. The calculations account for the high endo-selectivity observed and suggest that this transformation takes place through a Friedel-Crafts-type alkenylation mechanism, where the endo-dig cyclization promoted by PtCl(2) may involve a cyclopropylmetallacarbene as intermediate before the formation of the expected Wheland-type intermediate. 
These comparisons of the computational approach with experiment demonstrate the value of theory in the development of a solid mechanistic understanding of these reaction processes.
Dynamic causal modelling: a critical review of the biophysical and statistical foundations.
Daunizeau, J; David, O; Stephan, K E
2011-09-15
The goal of dynamic causal modelling (DCM) of neuroimaging data is to study experimentally induced changes in functional integration among brain regions. This requires (i) biophysically plausible and physiologically interpretable models of neuronal network dynamics that can predict distributed brain responses to experimental stimuli and (ii) efficient statistical methods for parameter estimation and model comparison. These two key components of DCM have been the focus of more than thirty methodological articles since the seminal work of Friston and colleagues published in 2003. In this paper, we provide a critical review of the current state-of-the-art of DCM. We inspect the properties of DCM in relation to the most common neuroimaging modalities (fMRI and EEG/MEG) and the specificity of inference on neural systems that can be made from these data. We then discuss both the plausibility of the underlying biophysical models and the robustness of the statistical inversion techniques. Finally, we discuss potential extensions of the current DCM framework, such as stochastic DCMs, plastic DCMs and field DCMs. Copyright © 2009 Elsevier Inc. All rights reserved.
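The deterministic core of DCM for fMRI is a bilinear state equation in which experimental inputs both drive regions and modulate the coupling between them; a short forward-simulation sketch of that neuronal equation in Python is shown below (two regions, illustrative parameter values, haemodynamic observation model and Bayesian inversion omitted):

    import numpy as np

    def dcm_neuronal(x, u, A, B, C):
        """dx/dt = (A + sum_j u_j * B_j) x + C u  (bilinear DCM state equation)."""
        coupling = A + np.tensordot(u, B, axes=1)   # effective connectivity under input
        return coupling @ x + C @ u

    # Two regions, two inputs (illustrative values only)
    A = np.array([[-1.0, 0.0],
                  [ 0.4, -1.0]])                    # fixed connectivity
    B = np.array([[[0.0, 0.0], [0.0, 0.0]],         # input 1 does not modulate coupling
                  [[0.0, 0.0], [0.6, 0.0]]])        # input 2 strengthens region 1 -> region 2
    C = np.array([[1.0, 0.0],
                  [0.0, 0.0]])                      # input 1 drives region 1

    dt, T = 0.01, 10.0
    x = np.zeros(2)
    states = []
    for t in np.arange(0.0, T, dt):
        u = np.array([1.0 if t < 1.0 else 0.0,      # brief driving stimulus
                      1.0 if 4.0 < t < 8.0 else 0.0])
        x = x + dt * dcm_neuronal(x, u, A, B, C)    # forward Euler integration
        states.append(x.copy())
    states = np.array(states)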
Climate change adaptation in regulated water utilities
NASA Astrophysics Data System (ADS)
Vicuna, S.; Melo, O.; Harou, J. J.; Characklis, G. W.; Ricalde, I.
2017-12-01
Concern about climate change impacts on water supply systems has grown in recent years. However, there are still few examples of pro-active interventions (e.g. infrastructure investment or policy changes) meant to address plausible future changes. Deep uncertainty associated with climate impacts, future demands, and regulatory constraints might explain why utility planning in a range of contexts does not explicitly consider climate change scenarios and potential adaptive responses. Given the importance of water supplies for economic development and the cost and longevity of many water infrastructure investments, large urban water supply systems could suffer from a lack of pro-active climate change adaptation. Water utilities need to balance the potential for high-regret stranded assets on the one side with insufficient supplies leading to potentially severe socio-economic, political and environmental failures on the other, and need to deal with a range of interests and constraints. This work presents initial findings from a project looking at how cities in Chile, the US and the UK are developing regulatory frameworks that incorporate utility planning under uncertainty. Considering for example the city of Santiago, Chile, recent studies have shown that although high scarcity cost scenarios are plausible, pre-emptive investment to guard against possible water supply failures is still remote and not accommodated by current planning practice. A first goal of the project is to compare and contrast regulatory approaches to utility risks considering climate change adaptation measures. Subsequently we plan to develop and propose a custom approach for the city of Santiago based on lessons learned from other contexts. The methodological approach combines institutional assessment of water supply regulatory frameworks with simulation-based decision-making under uncertainty approaches. Here we present initial work comparing the regulatory frameworks in Chile, the UK and the USA, evaluating their ability to incorporate uncertain climate and other changes into long-term infrastructure investment planning. The potential for regulatory and financial adaptive measures is explored, in addition to a discussion on evaluating their appropriateness via various modelling-based intervention decision-making approaches.
Shoda, Lisl Km; Battista, Christina; Siler, Scott Q; Pisetsky, David S; Watkins, Paul B; Howell, Brett A
2017-01-01
Drug-induced liver injury (DILI) remains an adverse event of significant concern for drug development and marketed drugs, and the field would benefit from better tools to identify liver liabilities early in development and/or to mitigate potential DILI risk in otherwise promising drugs. DILIsym software takes a quantitative systems toxicology approach to represent DILI in pre-clinical species and in humans for the mechanistic investigation of liver toxicity. In addition to multiple intrinsic mechanisms of hepatocyte toxicity (i.e., oxidative stress, bile acid accumulation, mitochondrial dysfunction), DILIsym includes the interaction between hepatocytes and cells of the innate immune response in the amplification of liver injury and in liver regeneration. The representation of innate immune responses, detailed here, consolidates much of the available data on the innate immune response in DILI within a single framework and affords the opportunity to systematically investigate the contribution of the innate response to DILI.
When mechanism matters: Bayesian forecasting using models of ecological diffusion
Hefley, Trevor J.; Hooten, Mevin B.; Russell, Robin E.; Walsh, Daniel P.; Powell, James A.
2017-01-01
Ecological diffusion is a theory that can be used to understand and forecast spatio-temporal processes such as dispersal, invasion, and the spread of disease. Hierarchical Bayesian modelling provides a framework to make statistical inference and probabilistic forecasts, using mechanistic ecological models. To illustrate, we show how hierarchical Bayesian models of ecological diffusion can be implemented for large data sets that are distributed densely across space and time. The hierarchical Bayesian approach is used to understand and forecast the growth and geographic spread in the prevalence of chronic wasting disease in white-tailed deer (Odocoileus virginianus). We compare statistical inference and forecasts from our hierarchical Bayesian model to phenomenological regression-based methods that are commonly used to analyse spatial occurrence data. The mechanistic statistical model based on ecological diffusion led to important ecological insights, obviated a commonly ignored type of collinearity, and was the most accurate method for forecasting.
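A one-dimensional finite-difference sketch of ecological diffusion is shown below (illustrative motility and growth values; the hierarchical Bayesian layer used for inference and forecasting is not included). Note that the Laplacian acts on the product of motility and abundance, so low-motility habitat tends to accumulate individuals:

    import numpy as np

    def ecological_diffusion(u0, mu, gamma, dx, dt, steps):
        """Explicit scheme for du/dt = d2(mu*u)/dx2 + gamma*u (1-D, crude reflecting edges)."""
        u = u0.copy()
        for _ in range(steps):
            f = mu * u
            lap = (np.roll(f, 1) - 2 * f + np.roll(f, -1)) / dx ** 2
            lap[0] = (f[1] - f[0]) / dx ** 2
            lap[-1] = (f[-2] - f[-1]) / dx ** 2
            u = u + dt * (lap + gamma * u)
        return u

    # Illustrative setup: 100 cells, lower motility (mu) in a central habitat patch
    x = np.linspace(0.0, 10.0, 100)
    mu = np.where((x > 4) & (x < 6), 0.05, 0.5)          # motility coefficient
    u0 = np.exp(-((x - 2.0) ** 2) / 0.1)                 # initial prevalence bump
    u = ecological_diffusion(u0, mu, gamma=0.02, dx=x[1] - x[0], dt=0.001, steps=5000)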
Physiome-model-based state-space framework for cardiac deformation recovery.
Wong, Ken C L; Zhang, Heye; Liu, Huafeng; Shi, Pengcheng
2007-11-01
To more reliably recover cardiac information from noise-corrupted, patient-specific measurements, it is essential to employ meaningful constraining models and adopt appropriate optimization criteria to couple the models with the measurements. Although biomechanical models have been extensively used for myocardial motion recovery with encouraging results, the passive nature of such constraints limits their ability to fully account for the deformation caused by active forces of the myocytes. To overcome such limitations, we propose to adopt a cardiac physiome model as the prior constraint for cardiac motion analysis. The cardiac physiome model comprises an electric wave propagation model, an electromechanical coupling model, and a biomechanical model, which are connected through cardiac system dynamics for a more complete description of the macroscopic cardiac physiology. Embedded within a multiframe state-space framework, the uncertainties of the model and the patient's measurements are systematically dealt with to arrive at optimal cardiac kinematic estimates and possibly beyond. Experiments have been conducted to compare our proposed cardiac-physiome-model-based framework with the solely biomechanical-model-based framework. The results show that our proposed framework recovers more accurate cardiac deformation from synthetic data and obtains more sensible estimates from real magnetic resonance image sequences. With the active components introduced by the cardiac physiome model, cardiac deformations recovered from the patient's medical images are more physiologically plausible.
Noecker, Cecilia; Eng, Alexander; Srinivasan, Sujatha; Theriot, Casey M; Young, Vincent B; Jansson, Janet K; Fredricks, David N; Borenstein, Elhanan
2016-01-01
Multiple molecular assays now enable high-throughput profiling of the ecology, metabolic capacity, and activity of the human microbiome. However, to date, analyses of such multi-omic data typically focus on statistical associations, often ignoring extensive prior knowledge of the mechanisms linking these various facets of the microbiome. Here, we introduce a comprehensive framework to systematically link variation in metabolomic data with community composition by utilizing taxonomic, genomic, and metabolic information. Specifically, we integrate available and inferred genomic data, metabolic network modeling, and a method for predicting community-wide metabolite turnover to estimate the biosynthetic and degradation potential of a given community. Our framework then compares variation in predicted metabolic potential with variation in measured metabolites' abundances to evaluate whether community composition can explain observed shifts in the community metabolome, and to identify key taxa and genes contributing to the shifts. Focusing on two independent vaginal microbiome data sets, each pairing 16S community profiling with large-scale metabolomics, we demonstrate that our framework successfully recapitulates observed variation in 37% of metabolites. Well-predicted metabolite variation tends to result from disease-associated metabolism. We further identify several disease-enriched species that contribute significantly to these predictions. Interestingly, our analysis also detects metabolites for which the predicted variation negatively correlates with the measured variation, suggesting environmental control points of community metabolism. Applying this framework to gut microbiome data sets reveals similar trends, including prediction of bile acid metabolite shifts. This framework is an important first step toward a system-level multi-omic integration and an improved mechanistic understanding of the microbiome activity and dynamics in health and disease. Studies characterizing both the taxonomic composition and metabolic profile of various microbial communities are becoming increasingly common, yet new computational methods are needed to integrate and interpret these data in terms of known biological mechanisms. Here, we introduce an analytical framework to link species composition and metabolite measurements, using a simple model to predict the effects of community ecology on metabolite concentrations and evaluating whether these predictions agree with measured metabolomic profiles. We find that a surprisingly large proportion of metabolite variation in the vaginal microbiome can be predicted based on species composition (including dramatic shifts associated with disease), identify putative mechanisms underlying these predictions, and evaluate the roles of individual bacterial species and genes. Analysis of gut microbiome data using this framework recovers similar community metabolic trends. This framework lays the foundation for model-based multi-omic integrative studies, ultimately improving our understanding of microbial community metabolism.
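A toy Python sketch of the framework's central comparison (all matrices are hypothetical): community-level metabolic potential scores are formed from taxon abundances and per-taxon net synthesis/degradation capacities, and their variation across samples is then correlated with measured metabolite variation:

    import numpy as np
    from scipy.stats import spearmanr

    rng = np.random.default_rng(2)

    n_samples, n_taxa, n_metabolites = 20, 15, 8
    abundance = rng.random((n_samples, n_taxa))          # 16S-derived relative abundances
    # Net capacity of each taxon to produce (+) or consume (-) each metabolite,
    # as might be inferred from genomic content and metabolic network models
    capacity = rng.normal(0.0, 1.0, (n_taxa, n_metabolites))

    # Community-level metabolic potential per sample and metabolite
    potential = abundance @ capacity

    # Hypothetical measured metabolomic profiles for the same samples
    measured = potential + rng.normal(0.0, 1.5, potential.shape)

    # Evaluate whether composition explains metabolite variation, one metabolite at a time
    for m in range(n_metabolites):
        rho, p = spearmanr(potential[:, m], measured[:, m])
        print(f"metabolite {m}: Spearman rho = {rho:.2f}, p = {p:.3f}")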
Allen, J L; Oberdorster, G; Morris-Schaffer, K; Wong, C; Klocke, C; Sobolewski, M; Conrad, K; Mayer-Proschel, M; Cory-Slechta, D A
2017-03-01
Accumulating evidence from both human and animal studies shows that the brain is a target of air pollution. Multiple epidemiological studies have now linked components of air pollution to diagnosis of autism spectrum disorder (ASD), a linkage with plausibility based on the shared mechanisms of inflammation. Additional plausibility appears to be provided by findings from our studies in mice of exposures from postnatal day (PND) 4-7 and 10-13 (human 3rd trimester equivalent) to concentrated ambient ultrafine (UFP) particles, considered the most reactive component of air pollution, at levels consistent with high-traffic areas of major U.S. cities and thus highly relevant to human exposures. These exposures, occurring during a period of marked neuro- and gliogenesis, unexpectedly produced a pattern of developmental neurotoxicity notably similar to multiple hypothesized mechanistic underpinnings of ASD, including its greater impact in males. UFP exposures induced inflammation/microglial activation, reductions in size of the corpus callosum (CC) and associated hypomyelination, aberrant white matter development and/or structural integrity with ventriculomegaly (VM), elevated glutamate and excitatory/inhibitory imbalance, increased amygdala astrocytic activation, and repetitive and impulsive behaviors. Collectively, these findings suggest the human 3rd trimester equivalent as a period of potential vulnerability to neurodevelopmental toxicity to UFP, particularly in males, and point to the possibility that UFP air pollution exposure during periods of rapid neuro- and gliogenesis may be a risk factor not only for ASD, but also for other neurodevelopmental disorders that share features with ASD, such as schizophrenia, attention deficit disorder, and periventricular leukomalacia. Copyright © 2015 Elsevier B.V. All rights reserved.
Lan, Qing; Zhang, Luoping; Tang, Xiaojiang; Shen, Min; Smith, Martyn T.; Qiu, Chuangyi; Ge, Yichen; Ji, Zhiying; Xiong, Jun; He, Jian; Reiss, Boris; Hao, Zhenyue; Liu, Songwang; Xie, Yuxuan; Guo, Weihong; Purdue, Mark P.; Galvan, Noe; Xin, Kerry X.; Hu, Wei; Beane Freeman, Laura E.; Blair, Aaron E.; Li, Laiyu; Rothman, Nathaniel; Vermeulen, Roel; Huang, Hanlin
2010-01-01
Occupational cohort and case–control studies suggest that trichloroethylene (TCE) exposure may be associated with non-Hodgkin lymphoma (NHL) but findings are not consistent. There is a need for mechanistic studies to evaluate the biologic plausibility of this association. We carried out a cross-sectional molecular epidemiology study of 80 healthy workers that used TCE and 96 comparable unexposed controls in Guangdong, China. Personal exposure measurements were taken over a three-week period before blood collection. Ninety-six percent of workers were exposed to TCE below the current US Occupational Safety and Health Administration Permissible Exposure Limit (100 p.p.m. 8 h time-weighted average), with a mean (SD) of 22.2 (36.0) p.p.m. The total lymphocyte count and each of the major lymphocyte subsets including CD4+ T cells, CD8+ T cells, natural killer (NK) cells and B cells were significantly decreased among the TCE-exposed workers compared with controls (P < 0.05), with evidence of a dose-dependent decline. Further, there was a striking 61% decline in sCD27 plasma level and a 34% decline in sCD30 plasma level among TCE-exposed workers compared with controls. This is the first report that TCE exposure under the current Occupational Safety and Health Administration workplace standard is associated with a decline in all major lymphocyte subsets and sCD27 and sCD30, which play an important role in regulating cellular activity in subsets of T, B and NK cells and are associated with lymphocyte activation. Given that altered immunity is an established risk factor for NHL, these results add to the biologic plausibility that TCE is a possible lymphomagen. PMID:20530238
Calibration and analysis of genome-based models for microbial ecology.
Louca, Stilianos; Doebeli, Michael
2015-10-16
Microbial ecosystem modeling is complicated by the large number of unknown parameters and the lack of appropriate calibration tools. Here we present a novel computational framework for modeling microbial ecosystems, which combines genome-based model construction with statistical analysis and calibration to experimental data. Using this framework, we examined the dynamics of a community of Escherichia coli strains that emerged in laboratory evolution experiments, during which an ancestral strain diversified into two coexisting ecotypes. We constructed a microbial community model comprising the ancestral and the evolved strains, which we calibrated using separate monoculture experiments. Simulations reproduced the successional dynamics in the evolution experiments, and pathway activation patterns observed in microarray transcript profiles. Our approach yielded detailed insights into the metabolic processes that drove bacterial diversification, involving acetate cross-feeding and competition for organic carbon and oxygen. Our framework provides a missing link towards a data-driven mechanistic microbial ecology.
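A heavily simplified sketch of the calibration idea (a two-strain acetate cross-feeding ODE model fitted to hypothetical growth observations with least squares; the genome-based models and statistical machinery in the paper are far richer):

    import numpy as np
    from scipy.integrate import solve_ivp
    from scipy.optimize import least_squares

    def crossfeed(t, y, mu1, mu2, k1, k2, yld):
        """Strain 1 grows on glucose and excretes acetate; strain 2 grows on acetate."""
        n1, n2, glc, ace = y
        g1 = mu1 * glc / (k1 + glc)
        g2 = mu2 * ace / (k2 + ace)
        return [g1 * n1, g2 * n2, -g1 * n1 / yld, 0.3 * g1 * n1 - g2 * n2 / yld]

    def residuals(params, t_obs, n1_obs):
        sol = solve_ivp(crossfeed, (0, t_obs[-1]), [0.01, 0.01, 5.0, 0.0],
                        args=tuple(params), t_eval=t_obs)
        return sol.y[0] - n1_obs

    # Hypothetical optical-density-like observations of strain 1
    t_obs = np.linspace(0, 10, 20)
    true = [0.8, 0.5, 0.5, 0.2, 0.4]
    n1_obs = solve_ivp(crossfeed, (0, 10), [0.01, 0.01, 5.0, 0.0],
                       args=tuple(true), t_eval=t_obs).y[0]
    n1_obs = n1_obs * (1 + 0.05 * np.random.default_rng(3).standard_normal(len(t_obs)))

    fit = least_squares(residuals, x0=[0.5, 0.5, 1.0, 1.0, 0.3],
                        args=(t_obs, n1_obs), bounds=(0.05, 2.0))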
2013-01-01
Background High-throughput profiling of human tissues typically yields gene lists comprising a mix of relevant molecular entities and multiple false positives that obstruct the translation of such results into mechanistic hypotheses. From general probabilistic considerations, gene lists distilled down to their mechanistically relevant components can be far more useful for subsequent experimental design or data interpretation. Results The input candidate gene lists were processed into different tiers of evidence consistency established by enrichment analysis across subsets of the same experiments and across different experiments and platforms. The cut-offs were established empirically through ontological and semantic enrichment; the resultant shortened gene list was re-expanded with the Ingenuity Pathway Assistant tool. The resulting sub-networks provided the basis for generating mechanistic hypotheses that were partially validated by literature search. This approach differs from previous consistency-based studies in that the cut-off on the receiver operating characteristic of the true-false separation process is optimized by flexible selection of the consistency-building procedure. The gene list distilled by this analytic technique and its network representation were termed the Compact Disease Model (CDM). Here we present the CDM signature for the study of early-stage Alzheimer’s disease. The integrated analysis of this gene signature allowed us to identify protein traffic vesicles as prominent players in the pathogenesis of Alzheimer’s. Considering the distances and complexity of protein trafficking in neurons, it is plausible that spontaneous protein misfolding along with a shortage of growth stimulation results in neurodegeneration. Several potentially overlapping scenarios of early-stage Alzheimer pathogenesis are discussed, with an emphasis on the protective effects of the AT-1-mediated antihypertensive response on cytoskeleton remodeling, along with neuronal activation of oncogenes, luteinizing hormone signaling and insulin-related growth regulation, forming a pleiotropic model of its early stages. Alignment with emerging literature confirmed many predictions derived from the early-stage Alzheimer’s disease CDM. Conclusions A flexible approach for high-throughput data analysis, Compact Disease Model generation, allows extraction of meaningful, mechanism-centered gene sets compatible with instant translation of the results into testable hypotheses. PMID:24196233
Noel, Jean-Paul; Blanke, Olaf; Magosso, Elisa; Serino, Andrea
2018-06-01
Interactions between the body and the environment occur within the peripersonal space (PPS), the space immediately surrounding the body. The PPS is encoded by multisensory (audio-tactile, visual-tactile) neurons that possess receptive fields (RFs) anchored on the body and restricted in depth. The extension in depth of PPS neurons' RFs has been documented to change dynamically as a function of the velocity of incoming stimuli, but the underlying neural mechanisms are still unknown. Here, by integrating a psychophysical approach with neural network modeling, we propose a mechanistic explanation behind this inherent dynamic property of PPS. We psychophysically mapped the size of participants' peri-face and peri-trunk space as a function of the velocity of task-irrelevant approaching auditory stimuli. Findings indicated that the peri-trunk space was larger than the peri-face space, and, importantly, as for the neurophysiological delineation of RFs, both of these representations enlarged as the velocity of incoming sound increased. We propose a neural network model to mechanistically interpret these findings: the network includes reciprocal connections between unisensory areas and higher-order multisensory neurons, and it implements neural adaptation to persistent stimulation as a mechanism sensitive to stimulus velocity. The network was capable of replicating the behavioral observations of PPS size remapping and related behavioral proxies of PPS size to neurophysiological measures of multisensory neurons' RF size. We propose that a biologically plausible neural adaptation mechanism embedded within the network encoding for PPS can be responsible for the dynamic alterations in PPS size as a function of the velocity of incoming stimuli. NEW & NOTEWORTHY Interactions between body and environment occur within the peripersonal space (PPS). PPS neurons are highly dynamic, adapting online as a function of body-object interactions. The mechanistic underpinnings of PPS dynamic properties are unexplained. We demonstrate with a psychophysical approach that PPS enlarges as incoming stimulus velocity increases, efficiently preventing contact with faster approaching objects. We present a neurocomputational model of multisensory PPS implementing neural adaptation to persistent stimulation to propose a neurophysiological mechanism underlying this effect.
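A toy rate-model sketch of the proposed ingredient (illustrative time constants and gains, not the authors' network): a unisensory unit with a slow adaptation variable responds more strongly, at any matched distance, to a fast approach than to a slow one, because less adaptation has accumulated:

    import numpy as np

    def auditory_drive(distance, slope=0.5):
        """Input grows as the sound source gets closer (arbitrary units)."""
        return 1.0 / (1.0 + slope * distance)

    def simulate_approach(velocity, d0=3.0, dt=0.01, tau_r=0.05, tau_a=1.0, g_a=1.5):
        """Firing-rate unit with an adaptation variable driven by its own rate."""
        r, a, d = 0.0, 0.0, d0
        rates = []
        while d > 0.0:
            inp = auditory_drive(d)
            r += dt / tau_r * (-r + max(inp - g_a * a, 0.0))   # fast rate dynamics
            a += dt / tau_a * (-a + r)                         # slow adaptation
            d -= velocity * dt
            rates.append((d, r))
        return np.array(rates)

    slow = simulate_approach(velocity=0.3)
    fast = simulate_approach(velocity=1.5)
    # At any matched distance, the fast approach leaves less time for adaptation,
    # so the rate (a proxy for the PPS-related response) is higher than for the slow one.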
Unifying Time to Contact Estimation and Collision Avoidance across Species
Keil, Matthias S.; López-Moliner, Joan
2012-01-01
The τ-function and the η-function are phenomenological models that are widely used in the context of timing interceptive actions and collision avoidance, respectively. Both models were previously considered to be unrelated to each other: τ is a decreasing function that provides an estimation of time-to-contact (ttc) in the early phase of an object approach; in contrast, η has a maximum before ttc. Furthermore, it is not clear how both functions could be implemented at the neuronal level in a biophysically plausible fashion. Here we propose a new framework – the corrected modified Tau function – capable of predicting both τ-type and η-type responses. The outstanding property of our new framework is its resilience to noise. We show that it can be derived from a firing rate equation and, like η, serves to describe the response curves of collision-sensitive neurons. Furthermore, we show that it predicts the psychophysical performance of subjects determining ttc. Our new framework is thus validated successfully against published and novel experimental data. Within the framework, links between τ-type and η-type neurons are established. Therefore, it could possibly serve as a model for explaining the co-occurrence of such neurons in the brain. PMID:22915999
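For reference, the classical optical τ estimate on which the framework builds is the ratio of an object's optical angle to its rate of expansion; a minimal Python sketch of the geometry is given below (the corrected modified Tau function itself is not reproduced here):

    import numpy as np

    def optical_tau(distance, velocity, radius):
        """tau = theta / theta_dot for an object of physical radius approaching at constant speed."""
        theta = 2.0 * np.arctan(radius / distance)                       # optical angle
        theta_dot = 2.0 * radius * velocity / (distance**2 + radius**2)  # rate of expansion
        return theta / theta_dot

    # Illustrative approach: 0.2 m object, 10 m away, closing at 5 m/s (true ttc = 2 s)
    d, v, r = 10.0, 5.0, 0.2
    print(optical_tau(d, v, r))   # close to 2.0; tau approximates ttc well while the object is far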
A decision framework for managing risk to airports from terrorist attack.
Shafieezadeh, Abdollah; Cha, Eun J; Ellingwood, Bruce R
2015-02-01
This article presents an asset-level security risk management framework to assist stakeholders of critical assets with allocating limited budgets for enhancing their safety and security against terrorist attack. The proposed framework models the security system of an asset, considers various threat scenarios, and models the sequential decision framework of attackers during the attack. Its novel contributions are the introduction of the notion of partial neutralization of attackers by defenders, estimation of total loss from successful, partially successful, and unsuccessful actions of attackers at various stages of an attack, and inclusion of the effects of these losses on the choices made by terrorists at various stages of the attack. The application of the proposed method is demonstrated in an example dealing with security risk management of a U.S. commercial airport, in which a set of plausible threat scenarios and risk mitigation options are considered. It is found that a combination of providing blast-resistant cargo containers and a video surveillance system on the airport perimeter fence is the best option based on minimum expected life-cycle cost considering a 10-year service period. © 2014 Society for Risk Analysis.
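A toy expected life-cycle cost comparison in the spirit of the framework is sketched below (all probabilities, losses, costs and option names are hypothetical; the sequential attacker decision model and partial-neutralization logic are not reproduced):

    # Hypothetical screening of risk mitigation options by expected life-cycle cost.
    scenarios = {"cargo_bomb": 1e-3, "perimeter_breach": 2e-3}      # annual probability
    baseline_loss = {"cargo_bomb": 5e8, "perimeter_breach": 2e8}    # loss if successful

    # option: (upfront cost, annual cost, reduction in success probability per scenario)
    options = {
        "do_nothing": (0.0, 0.0, {"cargo_bomb": 0.0, "perimeter_breach": 0.0}),
        "blast_resistant_containers": (2e7, 5e5, {"cargo_bomb": 0.8, "perimeter_breach": 0.0}),
        "perimeter_video_surveillance": (5e6, 1e6, {"cargo_bomb": 0.0, "perimeter_breach": 0.7}),
        "both": (2.5e7, 1.5e6, {"cargo_bomb": 0.8, "perimeter_breach": 0.7}),
    }

    def expected_lifecycle_cost(option, years=10, discount=0.03):
        upfront, annual, reduction = options[option]
        cost = upfront
        for y in range(1, years + 1):
            annual_risk = sum(scenarios[s] * (1 - reduction[s]) * baseline_loss[s]
                              for s in scenarios)
            cost += (annual + annual_risk) / (1 + discount) ** y    # discounted annual cost + risk
        return cost

    best = min(options, key=expected_lifecycle_cost)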
Moore, Shannon R.; Saidel, Gerald M.; Knothe, Ulf; Knothe Tate, Melissa L.
2014-01-01
The link between mechanics and biology in the generation and the adaptation of bone has been well studied in the context of skeletal development and fracture healing. Yet the prediction of tissue genesis within - and the spatiotemporal healing of - postnatal defects necessitates a quantitative evaluation of mechano-biological interactions using experimental and clinical parameters. To address this current gap in knowledge, this study aims to develop a mechanistic mathematical model of tissue genesis using bone morphogenetic protein (BMP) to represent a class of factors that may coordinate bone healing. Specifically, we developed a mechanistic, mathematical model to predict the dynamics of tissue genesis by periosteal progenitor cells within a long bone defect surrounded by periosteum and stabilized via an intramedullary nail. The emergent material properties and mechanical environment associated with nascent tissue genesis influence the strain stimulus sensed by progenitor cells within the periosteum. Using a mechanical finite element model, periosteal surface strains are predicted as a function of emergent, nascent tissue properties. Strains are then input to a mechanistic mathematical model, where mechanical regulation of BMP-2 production mediates rates of cellular proliferation, differentiation and tissue production, to predict healing outcomes. A parametric approach enables the spatial and temporal prediction of endochondral tissue regeneration, assessed as areas of cartilage and mineralized bone, as functions of radial distance from the periosteum and time. Comparing model results to histological outcomes from two previous studies of periosteum-mediated bone regeneration in a common ovine model, it was shown that mechanistic models incorporating mechanical feedback successfully predict patterns (spatial) and trends (temporal) of bone tissue regeneration. The novel model framework presented here integrates a mechanistic feedback system based on the mechanosensitivity of periosteal progenitor cells, which allows for modeling and prediction of tissue regeneration on multiple length and time scales. Through the combination of computational, physical and engineering science approaches, the model platform provides a means to test new hypotheses in silico and to elucidate conditions conducive to endogenous tissue genesis. Next-generation models will serve to unravel intrinsic differences in bone genesis by endochondral and intramembranous mechanisms. PMID:24967742
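A much-reduced sketch of the feedback loop described above (one-compartment ODEs with illustrative parameters; no finite element strain field): the strain stimulus sets BMP-2 production, BMP-2 drives progenitor differentiation and tissue deposition, and the accumulating tissue stiffens the defect and lowers the strain:

    import numpy as np

    def simulate_defect(days=60.0, dt=0.01):
        """Toy mechano-regulated tissue genesis: strain -> BMP-2 -> differentiation -> stiffness -> strain."""
        progenitors, bmp2, tissue = 1.0, 0.0, 0.0       # arbitrary units
        history = []
        for t in np.arange(0.0, days, dt):
            strain = 1.0 / (1.0 + 5.0 * tissue)              # more tissue, stiffer defect, less strain
            bmp2 += dt * (0.5 * strain - 0.2 * bmp2)         # strain-stimulated BMP-2 production and decay
            diff_rate = 0.1 * bmp2 / (0.5 + bmp2)            # BMP-2-mediated differentiation (saturating)
            progenitors += dt * (0.05 * progenitors * (1 - progenitors / 2.0) - diff_rate * progenitors)
            tissue += dt * diff_rate * progenitors           # differentiated cells deposit tissue
            history.append((t, strain, bmp2, tissue))
        return np.array(history)

    traj = simulate_defect()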
Siegler, Jason C; Marshall, Paul W M; Bishop, David; Shaw, Greg; Green, Simon
2016-12-01
A large proportion of the empirical research and reviews investigating the ergogenic potential of sodium bicarbonate (NaHCO3) supplementation has focused predominantly on performance outcomes and only speculates about the underlying mechanisms responsible for any benefit. The aim of this review was to critically evaluate the influence of NaHCO3 supplementation on mechanisms associated with skeletal muscle fatigue as it translates directly to exercise performance. Mechanistic links between skeletal muscle fatigue, proton accumulation (or metabolic acidosis) and NaHCO3 supplementation have been identified to provide a more targeted, evidence-based approach to direct future research, as well as to provide practitioners with a contemporary perspective on the potential applications and limitations of this supplement. The mechanisms identified have been broadly categorised under the sections 'Whole-body Metabolism', 'Muscle Physiology' and 'Motor Pathways', and when possible, the performance outcomes of these studies are contextualized within an integrative framework of whole-body exercise where other factors such as task demand (e.g. large vs. small muscle groups), cardio-pulmonary and neural control mechanisms may outweigh any localised influence of NaHCO3. Finally, the 'Performance Applications' section provides further interpretation for the practitioner, founded on the mechanistic evidence provided in this review and other relevant, applied NaHCO3 performance-related studies.
An, Gary C
2010-01-01
The greatest challenge facing the biomedical research community is the effective translation of basic mechanistic knowledge into clinically effective therapeutics. This challenge is most evident in attempts to understand and modulate "systems" processes/disorders, such as sepsis, cancer, and wound healing. Formulating an investigatory strategy for these issues requires the recognition that these are dynamic processes. Representation of the dynamic behavior of biological systems can aid in the investigation of complex pathophysiological processes by augmenting existing discovery procedures with the integration of disparate information sources and knowledge. This approach is termed Translational Systems Biology. Focusing on the development of computational models capturing the behavior of mechanistic hypotheses provides a tool that bridges gaps in the understanding of a disease process by visualizing "thought experiments" to fill those gaps. Agent-based modeling is a computational method particularly well suited to the translation of mechanistic knowledge into a computational framework. Utilizing agent-based models as a means of dynamic hypothesis representation will be a vital means of describing, communicating, and integrating community-wide knowledge. The transparent representation of hypotheses in this dynamic fashion can form the basis of "knowledge ecologies," where selection between competing hypotheses will apply an evolutionary paradigm to the development of community knowledge.
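A minimal agent-based sketch of dynamic hypothesis representation (a toy grid model of injury and immune-cell agents; the rules and parameters are illustrative and not drawn from any published model):

    import random

    random.seed(0)
    SIZE, STEPS = 20, 50

    # Grid of tissue damage levels; a central injury seeds the simulation
    damage = [[0.0] * SIZE for _ in range(SIZE)]
    for i in range(8, 12):
        for j in range(8, 12):
            damage[i][j] = 1.0

    # Immune-cell agents wander randomly and repair damage they encounter,
    # but also inflict small collateral damage nearby (the encoded "hypothesis")
    agents = [{"x": random.randrange(SIZE), "y": random.randrange(SIZE)} for _ in range(30)]

    for step in range(STEPS):
        for a in agents:
            a["x"] = (a["x"] + random.choice((-1, 0, 1))) % SIZE
            a["y"] = (a["y"] + random.choice((-1, 0, 1))) % SIZE
            x, y = a["x"], a["y"]
            if damage[x][y] > 0:
                damage[x][y] = max(0.0, damage[x][y] - 0.5)          # repair
                nx, ny = (x + 1) % SIZE, (y + 1) % SIZE
                damage[nx][ny] = min(1.0, damage[nx][ny] + 0.05)     # collateral inflammation
        total = sum(sum(row) for row in damage)
        # Tracking total damage over time lets competing rule sets ("hypotheses") be compared

    print(total)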
Changes in Black-legged Tick Population in New England with Future Climate Change
NASA Astrophysics Data System (ADS)
Krishnan, S.; Huber, M.
2015-12-01
Lyme disease is one of the most frequently reported vector-borne diseases in the United States. In the Northeastern United States, vector transmission is maintained in a horizontal transmission cycle between the vector, the black-legged tick, and the vertebrate reservoir hosts, which include white-tailed deer, rodents and other medium- to large-sized mammals. Predicting how vector populations change with future climate change is critical for understanding future disease spread and for developing suitable regional adaptation strategies. For the United States, these predictions have mostly been made using regressions based on field and lab studies, or using spatial suitability studies. However, the relation between tick populations at various life-cycle stages and climate variables is complex, necessitating a mechanistic approach. In this study, we present a framework for driving a mechanistic tick population model with high-resolution regional climate modeling projections. The goal is to estimate changes in black-legged tick populations in New England for the 21st century. The tick population model used is based on the mechanistic approach of Ogden et al. (2005) developed for Canada. Dynamically downscaled climate projections at a 3-km resolution using the Weather Research and Forecasting (WRF) model are used to drive the tick population model.
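A toy fragment of a temperature-driven, stage-structured tick model is sketched below (illustrative development and mortality rates, not the Ogden et al. parameterization), of the kind that downscaled daily temperature series would drive:

    import numpy as np

    def simulate_ticks(daily_temp_c, eggs0=1000.0):
        """Track eggs -> larvae -> nymphs -> adults with temperature-dependent development."""
        stages = np.array([eggs0, 0.0, 0.0, 0.0])
        mortality = np.array([0.002, 0.006, 0.006, 0.004])      # daily per-capita mortality
        for temp in daily_temp_c:
            # Development proceeds only above a threshold, faster when warmer (degree-day style)
            dev_rate = np.clip((temp - 6.0) / 400.0, 0.0, 0.05)
            transfer = dev_rate * stages[:3]
            stages[:3] -= transfer
            stages[1:] += transfer
            stages *= (1.0 - mortality)
        return stages

    # Hypothetical downscaled daily temperatures for one year (sinusoidal climate proxy)
    days = np.arange(365)
    temps = 8.0 + 14.0 * np.sin(2 * np.pi * (days - 100) / 365)
    print(simulate_ticks(temps))        # abundance per stage after one simulated year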
Towards a resource-based habitat approach for spatial modelling of vector-borne disease risks.
Hartemink, Nienke; Vanwambeke, Sophie O; Purse, Bethan V; Gilbert, Marius; Van Dyck, Hans
2015-11-01
Given the veterinary and public health impact of vector-borne diseases, there is a clear need to assess the suitability of landscapes for the emergence and spread of these diseases. Current approaches for predicting disease risks neglect key features of the landscape as components of the functional habitat of vectors or hosts, and hence of the pathogen. Empirical-statistical methods do not explicitly incorporate biological mechanisms, whereas current mechanistic models are rarely spatially explicit; both methods ignore the way animals use the landscape (i.e. movement ecology). We argue that applying a functional concept for habitat, i.e. the resource-based habitat concept (RBHC), can solve these issues. The RBHC offers a framework to identify systematically the different ecological resources that are necessary for the completion of the transmission cycle and to relate these resources to (combinations of) landscape features and other environmental factors. The potential of the RBHC as a framework for identifying suitable habitats for vector-borne pathogens is explored and illustrated with the case of bluetongue virus, a midge-transmitted virus affecting ruminants. The concept facilitates the study of functional habitats of the interacting species (vectors as well as hosts) and provides new insight into spatial and temporal variation in transmission opportunities and exposure that ultimately determine disease risks. It may help to identify knowledge gaps and control options arising from changes in the spatial configuration of key resources across the landscape. The RBHC framework may act as a bridge between existing mechanistic and statistical modelling approaches. © 2014 The Authors. Biological Reviews published by John Wiley & Sons Ltd on behalf of Cambridge Philosophical Society.
Chemiresistive Sensor Arrays from Conductive 2D Metal–Organic Frameworks
Campbell, Michael G.; Liu, Sophie F.; Swager, Timothy M.; ...
2015-10-11
Applications of porous metal–organic frameworks (MOFs) in electronic devices are rare, owing in large part to a lack of MOFs that display electrical conductivity. Here, we describe the use of conductive two-dimensional (2D) MOFs as a new class of materials for chemiresistive sensing of volatile organic compounds (VOCs). We demonstrate that a family of structurally analogous 2D MOFs can be used to construct a cross-reactive sensor array that allows for clear discrimination between different categories of VOCs. Lastly, experimental data show that multiple sensing mechanisms are operative with high degrees of orthogonality, establishing that the 2D MOFs used here are mechanistically unique and offer advantages relative to other known chemiresistor materials.
Assembling evidence for identifying reservoirs of infection.
Viana, Mafalda; Mancy, Rebecca; Biek, Roman; Cleaveland, Sarah; Cross, Paul C; Lloyd-Smith, James O; Haydon, Daniel T
2014-05-01
Many pathogens persist in multihost systems, making the identification of infection reservoirs crucial for devising effective interventions. Here, we present a conceptual framework for classifying patterns of incidence and prevalence, and review recent scientific advances that allow us to study and manage reservoirs simultaneously. We argue that interventions can have a crucial role in enriching our mechanistic understanding of how reservoirs function and should be embedded as quasi-experimental studies in adaptive management frameworks. Single approaches to the study of reservoirs are unlikely to generate conclusive insights, whereas the formal integration of data and methodologies, involving interventions, pathogen genetics, and contemporary surveillance techniques, promises to open up new opportunities to advance understanding of complex multihost systems. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.
Biologically plausible learning in neural networks: a lesson from bacterial chemotaxis.
Shimansky, Yury P
2009-12-01
Learning processes in the brain are usually associated with plastic changes made to optimize the strength of connections between neurons. Although many details related to biophysical mechanisms of synaptic plasticity have been discovered, it is unclear how the concurrent performance of adaptive modifications in a huge number of spatial locations is organized to minimize a given objective function. Since direct experimental observation of even a relatively small subset of such changes is not feasible, computational modeling is an indispensable investigation tool for solving this problem. However, the conventional method of error back-propagation (EBP) employed for optimizing synaptic weights in artificial neural networks is not biologically plausible. This study based on computational experiments demonstrated that such optimization can be performed rather efficiently using the same general method that bacteria employ for moving closer to an attractant or away from a repellent. With regard to neural network optimization, this method consists of regulating the probability of an abrupt change in the direction of synaptic weight modification according to the temporal gradient of the objective function. Neural networks utilizing this method (regulation of modification probability, RMP) can be viewed as analogous to swimming in the multidimensional space of their parameters in the flow of biochemical agents carrying information about the optimality criterion. The efficiency of RMP is comparable to that of EBP, while RMP has several important advantages. Since the biological plausibility of RMP is beyond a reasonable doubt, the RMP concept provides a constructive framework for the experimental analysis of learning in natural neural networks.
Multi-Scale Modeling in Morphogenesis: A Critical Analysis of the Cellular Potts Model
Voss-Böhme, Anja
2012-01-01
Cellular Potts models (CPMs) are used as a modeling framework to elucidate mechanisms of biological development. They allow a spatial resolution below the cellular scale and are applied particularly when problems are studied where multiple spatial and temporal scales are involved. Despite the increasing usage of CPMs in theoretical biology, this model class has received little attention from mathematical theory. To narrow this gap, the CPMs are subjected to a theoretical study here. It is asked to what extent the updating rules establish an appropriate dynamical model of intercellular interactions and what characterizes the principal behavior at different time scales. It is shown that the long-time behavior of a CPM is degenerate in the sense that the cells consecutively die out, independent of the specific interdependence structure that characterizes the model. While CPMs are naturally defined on finite, spatially bounded lattices, possible extensions to spatially unbounded systems are explored to assess to what extent spatio-temporal limit procedures can be applied to describe the emergent behavior at the tissue scale. To elucidate the mechanistic structure of CPMs, the model class is integrated into a general multiscale framework. It is shown that the central role of the surface fluctuations, which subsume several cellular and intercellular factors, entails substantial limitations for a CPM's exploitation both as a mechanistic and as a phenomenological model. PMID:22984409
Cunning, Ross; Muller, Erik B; Gates, Ruth D; Nisbet, Roger M
2017-10-27
Coral reef ecosystems owe their ecological success - and vulnerability to climate change - to the symbiotic metabolism of corals and Symbiodinium spp. The urgency to understand and predict the stability and breakdown of these symbioses (i.e., coral 'bleaching') demands the development and application of theoretical tools. Here, we develop a dynamic bioenergetic model of coral-Symbiodinium symbioses that demonstrates realistic steady-state patterns in coral growth and symbiont abundance across gradients of light, nutrients, and feeding. Furthermore, by including a mechanistic treatment of photo-oxidative stress, the model displays dynamics of bleaching and recovery that can be explained as transitions between alternate stable states. These dynamics reveal that "healthy" and "bleached" states correspond broadly to nitrogen- and carbon-limitation in the system, with transitions between them occurring as integrated responses to multiple environmental factors. Indeed, a suite of complex emergent behaviors reproduced by the model (e.g., bleaching is exacerbated by nutrients and attenuated by feeding) suggests it captures many important attributes of the system; meanwhile, its modular framework and open source R code are designed to facilitate further problem-specific development. We see significant potential for this modeling framework to generate testable hypotheses and predict integrated, mechanistic responses of corals to environmental change, with important implications for understanding the performance and maintenance of symbiotic systems. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
The Universal Plausibility Metric (UPM) & Principle (UPP).
Abel, David L
2009-12-03
Mere possibility is not an adequate basis for asserting scientific plausibility. A precisely defined universal bound is needed beyond which the assertion of plausibility, particularly in life-origin models, can be considered operationally falsified. But can something so seemingly relative and subjective as plausibility ever be quantified? Amazingly, the answer is, "Yes." A method of objectively measuring the plausibility of any chance hypothesis (The Universal Plausibility Metric [UPM]) is presented. A numerical inequality is also provided whereby any chance hypothesis can be definitively falsified when its UPM metric of ξ is < 1 (The Universal Plausibility Principle [UPP]). Both UPM and UPP pre-exist and are independent of any experimental design and data set. No low-probability hypothetical plausibility assertion should survive peer-review without subjection to the UPP inequality standard of formal falsification (ξ < 1).
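One way to read the inequality (an illustrative paraphrase under assumed notation, not Abel's exact definitions): if p is the per-trial probability of the chance hypothesis and R an upper bound on the available probabilistic resources (trials), the expected number of successes must reach at least one for the hypothesis to remain plausible:

```latex
% Illustrative reading only; p, R, and \xi are assumed notation.
\xi \;=\; p \times R \;\ge\; 1 \quad\text{(plausible)},
\qquad
\xi \;<\; 1 \quad\text{(operationally falsified)}
```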
Returning Incidental Findings in Low-Resource Settings: A Case of Rescue?
Mackay, Douglas
2018-05-01
In a carefully argued article, Haley K. Sullivan and Benjamin E. Berkman address the important question of whether investigators have a duty to report incidental findings to research participants in low-resource settings. They suggest that the duty to rescue offers the most plausible justification for the duty to return incidental findings, and they explore the implications of this duty for the context of research in low-resource settings. While I think they make valuable headway on an important problem, in this commentary, I identify a significant difference between the paradigmatic rescue case and the return of incidental findings in low-resource settings. This difference, I suggest, implies that their framework may be too narrow in scope. If investigators (and their sponsors) really wish to fulfill their duty to rescue, they must consider factors that are left out of Sullivan and Berkman's framework. © 2018 The Hastings Center.
Gao, Wen-Yang; Cai, Rong; Pham, Tony
Copper paddlewheel based molecular building blocks (MBBs) are ubiquitous and have been widely employed for the construction of highly porous metal–organic frameworks (MOFs). However, most copper paddlewheel based MOFs fail to retain their structural integrity in the presence of water. This instability is directly correlated to the plausible displacement of coordinating carboxylates in the copper paddlewheel MBB, [Cu₂(O₂C-)₄], by the strongly coordinating water molecules. In this comprehensive study, we illustrate the chemical stability control in the rht-MOF platform via strengthening the coordinating bonds within the triangular inorganic MBB, [Cu₃O(N₄₋ₓ(CH)ₓC-)₃] (x = 0, 1, or 2). Remotely, the chemical stabilization propagated into the paddlewheel MBB to afford isoreticular rht-MOFs with remarkably enhanced water/chemical stabilities compared to the prototypal rht-MOF-1.
A cost-based comparison of quarantine strategies for new emerging diseases.
Mubayi, Anuj; Zaleta, Christopher Kribs; Martcheva, Maia; Castillo-Chávez, Carlos
2010-07-01
A classical epidemiological framework is used to provide a preliminary cost analysis of the effects of quarantine and isolation on the dynamics of infectious diseases for which no treatment or immediate diagnosis tools are available. Within this framework we consider the cost incurred from the implementation of three types of dynamic control strategies. Taking the context of the 2003 SARS outbreak in Hong Kong as an example, we use a simple cost function to compare the total cost of each mixed (quarantine and isolation) control strategy from a public health resource allocation perspective. The goal is to extend existing epi-economics methodology by developing a theoretical framework of dynamic quarantine strategies aimed at emerging diseases, by drawing upon the large body of literature on the dynamics of infectious diseases. We find that the total cost decreases with increases in the quarantine rates past a critical value, regardless of the resource allocation strategy. In the case of a manageable outbreak, resources must be used early to achieve the best results, whereas in the case of an unmanageable outbreak a constant-effort strategy seems best among our limited set of plausible strategies.
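The sketch below conveys the general shape of such an analysis with a toy compartmental model in which infectious individuals are removed into quarantine/isolation at a controllable rate, and the total cost integrates person-days of control and illness; the compartments, parameter values, and cost weights are invented for illustration and are not the model or data used for the Hong Kong SARS case.

```python
import numpy as np
from scipy.integrate import solve_ivp

def siqr(t, y, beta, gamma, q_rate):
    """Toy dynamics: q_rate lumps quarantine/isolation removal of infectious individuals."""
    S, I, Q, R = y
    N = S + I + Q + R
    dS = -beta * S * I / N
    dI = beta * S * I / N - (gamma + q_rate) * I
    dQ = q_rate * I - gamma * Q           # removed individuals recover at rate gamma
    dR = gamma * (I + Q)
    return [dS, dI, dQ, dR]

def total_cost(sol, c_control=5.0, c_illness=20.0):
    """Illustrative cost: per-day weights on person-days spent isolated/quarantined and ill."""
    t = sol.t
    S, I, Q, R = sol.y
    return c_control * np.trapz(Q, t) + c_illness * np.trapz(I, t)

for q_rate in (0.0, 0.05, 0.2):
    sol = solve_ivp(siqr, (0, 180), [6.9e6, 100.0, 0.0, 0.0],
                    args=(0.4, 0.1, q_rate), max_step=1.0)
    print(f"control rate {q_rate:.2f}: total cost ~ {total_cost(sol):.3g}")
```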
Generating Adaptive Behaviour within a Memory-Prediction Framework
Rawlinson, David; Kowadlo, Gideon
2012-01-01
The Memory-Prediction Framework (MPF) and its Hierarchical-Temporal Memory implementation (HTM) have been widely applied to unsupervised learning problems, for both classification and prediction. To date, there has been no attempt to incorporate MPF/HTM in reinforcement learning or other adaptive systems; that is, to use knowledge embodied within the hierarchy to control a system, or to generate behaviour for an agent. This problem is interesting because the human neocortex is believed to play a vital role in the generation of behaviour, and the MPF is a model of the human neocortex. We propose some simple and biologically-plausible enhancements to the Memory-Prediction Framework. These cause it to explore and interact with an external world, while trying to maximize a continuous, time-varying reward function. All behaviour is generated and controlled within the MPF hierarchy. The hierarchy develops from a random initial configuration by interaction with the world and reinforcement learning only. Among other demonstrations, we show that a 2-node hierarchy can learn to successfully play “rocks, paper, scissors” against a predictable opponent. PMID:22272231
Quantifying Direct and Indirect Effects of Elevated CO2 on Ecosystem Response
Fatichi, S.; Leuzinger, S.; Paschalis, A.; Donnellan-Barraclough, A.; Hovenden, M. J.; Langley, J. A.
2015-12-01
Increasing concentrations of atmospheric carbon dioxide are expected to affect carbon assimilation, evapotranspiration (ET) and ultimately plant growth. Direct leaf biochemical effects have been widely investigated, while indirect effects, although documented, are very difficult to quantify in experiments. We hypothesize that the interaction of direct and indirect effects is a possible reason for conflicting results concerning the magnitude of CO2 fertilization effects across different climates and ecosystems. A mechanistic ecohydrological model (Tethys-Chloris) is used to investigate the relative contribution of direct (through plant physiology) and indirect (via stomatal closure and thus soil moisture, and changes in Leaf Area Index, LAI) effects of elevated CO2 across a number of ecosystems. We specifically ask in which ecosystems and climate indirect effects are expected to be largest. Data and boundary conditions from flux-towers and free air CO2 enrichment (FACE) experiments are used to force the model and evaluate its performance. Numerical results suggest that indirect effects of elevated CO2, through water savings and increased LAI, are very significant and sometimes larger than direct effects. Indirect effects tend to be considerably larger in water-limited ecosystems, while direct effects correlate positively with mean air temperature. Increasing CO2 from 375 to 550 ppm causes a total effect on Net Primary Production in the order of 15 to 40% and on ET from 0 to -8%, depending on climate and ecosystem type. The total CO2 effect has a significant negative correlation with the wetness index and positive correlation with vapor pressure deficit. These results provide a more general mechanistic understanding of relatively short-term (less than 20 years) implications of elevated CO2 on ecosystem response and suggest plausible magnitudes for the expected changes.
Rangarajan, Srinivas; Maravelias, Christos T.; Mavrikakis, Manos
2017-11-09
Here, we present a general optimization-based framework for (i) ab initio and experimental data driven mechanistic modeling and (ii) optimal catalyst design of heterogeneous catalytic systems. Both cases are formulated as a nonlinear optimization problem that is subject to a mean-field microkinetic model and thermodynamic consistency requirements as constraints, for which we seek sparse solutions through a ridge (L₂ regularization) penalty. The solution procedure involves an iterative sequence of forward simulation of the differential algebraic equations pertaining to the microkinetic model using a numerical tool capable of handling stiff systems, sensitivity calculations using linear algebra, and gradient-based nonlinear optimization. A multistart approach is used to explore the solution space, and a hierarchical clustering procedure is implemented for statistically classifying potentially competing solutions. An example of methanol synthesis through hydrogenation of CO and CO₂ on a Cu-based catalyst is used to illustrate the framework. The framework is fast, is robust, and can be used to comprehensively explore the model solution and design space of any heterogeneous catalytic system.
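As a much-reduced illustration of the forward-simulation-inside-optimization idea, the sketch below fits the rate constants of a toy A→B→C microkinetic model to synthetic data with an L2 (ridge) penalty; the mechanism, data, and penalty weight are invented and bear no relation to the methanol-synthesis system or the authors' actual formulation (which also enforces thermodynamic consistency and uses multistart and clustering).

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import minimize

def microkinetic(t, y, k1, k2):
    """Toy mechanism A -> B -> C with first-order steps."""
    A, B, C = y
    return [-k1 * A, k1 * A - k2 * B, k2 * B]

t_obs = np.linspace(0.0, 10.0, 20)
# Synthetic "experimental" concentrations of product C, generated with k1=0.8, k2=0.3 plus noise.
true_C = solve_ivp(microkinetic, (0, 10), [1.0, 0.0, 0.0], args=(0.8, 0.3), t_eval=t_obs).y[2]
rng = np.random.default_rng(0)
c_obs = true_C + rng.normal(0.0, 0.01, size=true_C.shape)

def objective(log_k, ridge=1e-3):
    """Sum of squared residuals on C plus an L2 (ridge) penalty on the log rate constants."""
    k1, k2 = np.exp(log_k)
    sim = solve_ivp(microkinetic, (0, 10), [1.0, 0.0, 0.0], args=(k1, k2), t_eval=t_obs).y[2]
    return np.sum((sim - c_obs) ** 2) + ridge * np.sum(log_k ** 2)

fit = minimize(objective, x0=np.log([0.1, 0.1]), method="Nelder-Mead")
print("estimated k1, k2:", np.exp(fit.x))
```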
Ecotoxicological effects of microplastics on biota: a review.
Anbumani, Sadasivam; Kakkar, Poonam
2018-05-01
The ubiquitous presence of microplastics in the environment has drawn the attention of ecotoxicologists to their safety and toxicity. Sources of microplastics in the environment include the disintegration of larger plastic items (secondary microplastics) and personal care products such as liquid soaps, exfoliating scrubbers, and cleaning supplies. Indiscriminate usage of plastics and poor waste disposal management pose a serious concern for ecosystem quality at the global level. The present review focuses on the ecological impact of microplastics on biota at different trophic levels, their uptake, accumulation, and excretion, and their plausible mechanistic toxicity, together with risk assessment approaches. Existing scientific evidence shows that microplastics exposure triggers a wide variety of toxic insults, from feeding disruption to impaired reproductive performance, physical ingestion, disturbances in energy metabolism, changes in liver physiology, and synergistic and/or antagonistic action of other hydrophobic organic contaminants, from lower to higher trophic levels. Thus, microplastic accumulation and its associated adverse effects make risk assessment and legislative action necessary. Subsequent research priorities, agenda, and key issues to be addressed are also acknowledged in the present review.
Haldar, Saikat; Mulani, Fayaj A; Aarthy, Thiagarayaselvam; Dandekar, Devdutta S; Thulasiram, Hirekodathakallu V
2014-10-31
C-seco triterpenoids are a widely bioactive class of natural products with high structural complexity and diversity. The preparative isolation of these molecules with high purity is greatly desirable, although restricted by the complexity of natural extracts. In this article we demonstrate a Medium Pressure Liquid Chromatography (MPLC)-based protocol for the isolation of eight major C-seco triterpenoids of the salannin skeleton from Neem (Azadirachta indica) oil. Successive application of normal-phase pre-packed silica-gel columns for fractionation, followed by reverse-phase separation in an automated MPLC system, expedited the process and furnished highly pure metabolites. Furthermore, the eight isolated triterpenoids, along with five semi-synthesized derivatives, were characterized using ultra-performance liquid chromatography-electrospray ionization-quadrupole/orbitrap-MS/MS spectrometry as a rapid and sensitive identification technique. The structure-fragment relationships were established on the basis of plausible mechanistic pathways for the generation of daughter ions. The MS/MS spectral information of the triterpenoids was further utilized for the identification of the studied molecules in the complex extract of stem and bark tissues from Neem. Copyright © 2014 Elsevier B.V. All rights reserved.
Augustsson, Cecilia; Persson, Egon
2014-11-13
Successful competition of activated factor VII (FVIIa) with zymogen factor VII (FVII) for tissue factor (TF) and loading of the platelet surface with FVIIa are plausible driving forces behind the pharmacological effect of recombinant FVIIa (rFVIIa) in hemophilia patients. Thrombin generation measurements in platelet-rich hemophilia A plasma revealed competition for TF, which potentially could reduce the effective (r)FVIIa:TF complex concentration and thereby attenuate factor Xa production. However, (auto)activation of FVII apparently counteracted the negative effect of zymogen binding; a small impact was observed at endogenous concentrations of FVII and FVIIa but was virtually absent at pharmacological amounts of rFVIIa. Moreover, corrections of the propagation phase in hemophilia A required rFVIIa concentrations above the range where a physiological level of FVII was capable to downregulate thrombin generation. These data strongly suggest that rFVIIa acts independently of TF in hemophilia therapy and that FVII displacement by rFVIIa is a negligible mechanistic component. © 2014 by The American Society of Hematology.
Hambly, Nathan; Shimbori, Chiko; Kolb, Martin
2015-10-01
Idiopathic pulmonary fibrosis (IPF) is a chronic and progressive fibrotic lung disease associated with high morbidity and poor survival. Characterized by substantial disease heterogeneity, the diagnostic considerations, clinical course and treatment response in individual patients can be variable. In the past decade, with the advent of high-throughput proteomic and genomic technologies, our understanding of the pathogenesis of IPF has greatly improved and has led to the recognition of novel treatment targets and numerous putative biomarkers. Molecular biomarkers with mechanistic plausibility are highly desired in IPF, where they have the potential to accelerate drug development, facilitate early detection in susceptible individuals, improve prognostic accuracy and inform treatment recommendations. Although the search for candidate biomarkers remains in its infancy, attractive targets such as MUC5B and MPP7 have already been validated in large cohorts and have demonstrated their potential to improve clinical predictors beyond that of routine clinical practices. The discovery and implementation of future biomarkers will face many challenges, but with strong collaborative efforts among scientists, clinicians and the industry the ultimate goal of personalized medicine may be realized. © 2015 Asian Pacific Society of Respirology.
Hill, Christopher
This project investigated possible mechanisms by which melt-water pulses can induce abrupt change in the Atlantic Meridional Overturning Circulation (AMOC) magnitude. AMOC magnitude is an important ingredient in present day climate. Previous studies have hypothesized abrupt reduction in AMOC magnitude in response to influxes of glacial melt water into the North Atlantic. Notable fresh-water influxes are associated with the terminus of the last ice age. During this period large volumes of melt water accumulated behind retreating ice sheets and subsequently drained rapidly when the ice weakened sufficiently. Rapid draining of glacial lakes into the North Atlantic is a possible origin of a number of paleo-record abrupt climate shifts. These include the Younger-Dryas cooling event and the 8,200 year cooling event. The studies undertaken focused on whether the mechanistic sequence by which glacial melt-water impacts AMOC, which then impacts Northern Hemisphere global mean surface temperature, is dynamically plausible. The work has implications for better understanding past climate stability. The work also has relevance for today’s environment, in which high-latitude ice melting in Greenland appears to be driving fresh water outflows at an accelerating pace.
Phoenix, Chris
2007-01-01
The relative insensitivity of lifespan to environmental factors constitutes compelling evidence that the physiological decline associated with aging derives primarily from the accumulation of intrinsic molecular and cellular side-effects of metabolism. Here we model that accumulation starting from a biologically based interpretation of the way in which those side-effects interact. We first validate this model by showing that it very accurately reproduces the distribution of ages at death seen in typical populations that are well protected from age-independent causes of death. We then exploit the mechanistic basis of this model to explore the impact on lifespans of interventions that combat aging, with an emphasis on interventions that repair (rather than merely retard) the direct molecular or cellular consequences of metabolism and thus prevent them from accumulating to pathogenic levels. Our results strengthen the case that an indefinite extension of healthy and total life expectancy can be achieved by a plausible rate of progress in the development of such therapies, once a threshold level of efficacy of those therapies has been reached. PMID:19424837
Dynamics of myosin-driven skeletal muscle contraction: I. Steady-state force generation.
Lan, Ganhui; Sun, Sean X
2005-06-01
Skeletal muscle contraction is a canonical example of motor-driven force generation. Despite the long history of research in this topic, a mechanistic explanation of the collective myosin force generation is lacking. We present a theoretical model of muscle contraction based on the conformational movements of individual myosins and experimentally measured chemical rate constants. Detailed mechanics of the myosin motor and the geometry of the sarcomere are taken into account. Two possible scenarios of force generation are examined. We find only one of the scenarios can give rise to a plausible contraction mechanism. We propose that the synchrony in muscle contraction is due to a force-dependent ADP release step. Computational results of a half sarcomere with 150 myosin heads can explain the experimentally measured force-velocity relationship and efficiency data. We predict that the number of working myosin motors increases as the load force is increased, thus showing synchrony among myosin motors during muscle contraction. We also find that titin molecules anchoring the thick filament are passive force generators in assisting muscle contraction.
Potjewyd, G; Day, P J; Shangula, S; Margison, G P; Povey, A C
2017-03-01
L-β-N-methylamino-l-alanine (BMAA) is a non-proteinic amino acid that is neurotoxic in vitro and in animals, and is implicated in the causation of amyotrophic lateral sclerosis and parkinsonism-dementia complex (ALS-PDC) on Guam. Given that natural amino acids can be N-nitrosated to form toxic alkylating agents, and given the structural similarity of BMAA to other amino acids, our hypothesis was that N-nitrosation of BMAA might result in a toxic alkylating agent, providing a novel mechanistic hypothesis for BMAA action. We chemically nitrosated BMAA with sodium nitrite to produce nitrosated BMAA (N-BMAA), which was shown to react with the alkyl-trapping agent 4-(p-nitrobenzyl)pyridine, caused DNA strand breaks in vitro, and was toxic to the human neuroblastoma cell line SH-SY5Y under conditions in which BMAA itself was minimally toxic. Our results indicate that N-BMAA is an alkylating agent and toxin, suggesting a plausible and previously unrecognised mechanism for the neurotoxic effects of BMAA. Copyright © 2017 Elsevier B.V. All rights reserved.
Protein Interactome of Muscle Invasive Bladder Cancer
Bhat, Akshay; Heinzel, Andreas; Mayer, Bernd; Perco, Paul; Mühlberger, Irmgard; Husi, Holger; Merseburger, Axel S.; Zoidakis, Jerome; Vlahou, Antonia; Schanstra, Joost P.; Mischak, Harald; Jankowski, Vera
2015-01-01
Muscle invasive bladder carcinoma is a complex, multifactorial disease caused by disruptions and alterations of several molecular pathways that result in heterogeneous phenotypes and variable disease outcome. Combining this disparate knowledge may offer insights for deciphering relevant molecular processes regarding targeted therapeutic approaches guided by molecular signatures allowing improved phenotype profiling. The aim of the study is to characterize muscle invasive bladder carcinoma on a molecular level by incorporating scientific literature screening and signatures from omics profiling. Public domain omics signatures together with molecular features associated with muscle invasive bladder cancer were derived from literature mining to provide 286 unique protein-coding genes. These were integrated into a protein-interaction network to obtain a molecular functional map of the phenotype. This feature map pointed to three novel disease-associated pathways with plausible involvement in bladder cancer, namely Regulation of actin cytoskeleton, Neurotrophin signalling pathway and Endocytosis. Systematic integration approaches allow studying the molecular context of individual features reported as associated with a clinical phenotype and could potentially help to improve the molecular mechanistic description of the disorder. PMID:25569276
Maldonado, Solvey; Findeisen, Rolf
2010-06-01
The modeling, analysis, and design of treatment therapies for bone disorders based on the paradigm of force-induced bone growth and adaptation is a challenging task. Mathematical models provide, in comparison to clinical, medical, and biological approaches, a structured alternative framework for understanding the concurrent effects of the multiple factors involved in bone remodeling. To date, there are few mathematical models describing the complex interactions that arise. However, the resulting models are complex and difficult to analyze, due to the strong nonlinearities appearing in the equations, the wide range of variability of the states, and the uncertainties in parameters. In this work, we focus on analyzing the effects of changes in model structure and parameter/input variations on the overall steady-state behavior using systems-theoretical methods. Based on a briefly reviewed existing model that describes force-induced bone adaptation, the main objective of this work is to analyze the stationary behavior and to identify plausible treatment targets for remodeling-related bone disorders. Identifying plausible targets can help in the development of optimal treatments combining both physical activity and drug medication. Such treatments help to improve/maintain/restore bone strength, which deteriorates under bone disorder conditions, such as estrogen deficiency.
Qamar, A; LeBlanc, K; Semeniuk, O; Reznik, A; Lin, J; Pan, Y; Moewes, A
2017-10-13
We investigated the electronic structure of lead oxide (PbO) - one of the most promising photoconductor materials for direct-conversion x-ray imaging detectors - using soft x-ray emission and absorption spectroscopy. Two structural configurations of thin PbO layers, namely the polycrystalline and the amorphous phase, were studied and compared to the properties of powdered α-PbO and β-PbO samples. In addition, we performed calculations within the framework of density functional theory (DFT) and found excellent agreement between the calculated and the measured absorption and emission spectra, which indicates the high accuracy of our structural models. Our work provides strong evidence that the electronic structure of PbO layers, specifically the width of the band gap and the presence of additional interband and intraband states in both the conduction and valence bands, depends on the deposition conditions. We tested several model structures using DFT simulations to understand the origin of these states. The presence of O vacancies is the most plausible explanation for these additional electronic states. Several other plausible models were ruled out, including interstitial O, dislocated O, and the presence of significant lattice stress in PbO.
Boolean network inference from time series data incorporating prior biological knowledge.
Haider, Saad; Pal, Ranadip
2012-01-01
Numerous approaches exist for the modeling of genetic regulatory networks (GRNs), but the low sampling rates often employed in biological studies prevent the inference of detailed models from experimental data. In this paper, we analyze the issues involved in estimating a model of a GRN from single cell line time series data with limited time points. We present an inference approach for a Boolean Network (BN) model of a GRN from limited transcriptomic or proteomic time series data based on prior biological knowledge of connectivity, constraints on attractor structure, and robust design. We applied our inference approach to 6-time-point transcriptomic data on a Human Mammary Epithelial Cell line (HMEC) after application of Epidermal Growth Factor (EGF) and generated a BN with a plausible biological structure satisfying the data. We further defined and applied a similarity measure to compare synthetic BNs with BNs generated through the proposed approach, constructed from transitions along various paths of the synthetic BNs. We also compared the performance of our algorithm with two existing BN inference algorithms. Through theoretical analysis and simulations, we showed the rarity of arriving at a BN with plausible biological structure from limited time series data when using random connectivity and in the absence of structure in the data. The framework, when applied to experimental data and data generated from synthetic BNs, was able to estimate BNs with high similarity scores. Comparison with existing BN inference algorithms showed the better performance of our proposed algorithm for limited time series data. The proposed framework can also be applied to optimize the connectivity of a GRN from experimental data when prior biological knowledge on regulators is limited or not unique.
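To give a flavour of constraint-based BN inference, the sketch below enumerates the Boolean update rules for a single target gene that are consistent with both a prior list of candidate regulators and a few observed state transitions; the three-gene example, the prior, and the transitions are hypothetical and far smaller than the HMEC application, and the attractor and robustness constraints of the actual method are omitted.

```python
from itertools import product

# Hypothetical prior knowledge: gene "z" is regulated by some subset of {"x", "y"}.
candidate_regulators = ("x", "y")

# Hypothetical binarized time series: successive network states (x, y, z).
transitions = [((1, 0, 0), (1, 0, 1)),   # z switches on when x=1, y=0
               ((0, 1, 1), (0, 1, 0)),   # z switches off when x=0, y=1
               ((1, 1, 0), (1, 1, 1))]   # z switches on when x=1, y=1

def consistent_rules():
    """Enumerate all Boolean functions z' = f(x, y) that reproduce every observed transition."""
    rules = []
    # A Boolean function of two inputs is a truth table over the 4 input combinations.
    for table in product((0, 1), repeat=4):
        f = {inputs: out for inputs, out in zip(product((0, 1), repeat=2), table)}
        if all(f[(s[0], s[1])] == s_next[2] for s, s_next in transitions):
            rules.append(f)
    return rules

for rule in consistent_rules():
    print(rule)   # each surviving truth table is a candidate update rule for z
```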
Mode of action in relevance of rodent liver tumors to human cancer risk.
Holsapple, Michael P; Pitot, Henri C; Cohen, Samuel M; Cohen, Samuel H; Boobis, Alan R; Klaunig, James E; Pastoor, Timothy; Dellarco, Vicki L; Dragan, Yvonne P
2006-01-01
Hazard identification and risk assessment paradigms depend on the presumption of the similarity of rodents to humans, yet species specific responses, and the extrapolation of high-dose effects to low-dose exposures can affect the estimation of human risk from rodent data. As a consequence, a human relevance framework concept was developed by the International Programme on Chemical Safety (IPCS) and International Life Sciences Institute (ILSI) Risk Science Institute (RSI) with the central tenet being the identification of a mode of action (MOA). To perform a MOA analysis, the key biochemical, cellular, and molecular events need to first be established, and the temporal and dose-dependent concordance of each of the key events in the MOA can then be determined. The key events can be used to bridge species and dose for a given MOA. The next step in the MOA analysis is the assessment of biological plausibility for determining the relevance of the specified MOA in an animal model for human cancer risk based on kinetic and dynamic parameters. Using the framework approach, a MOA in animals could not be defined for metal overload. The MOA for phenobarbital (PB)-like P450 inducers was determined to be unlikely in humans after kinetic and dynamic factors were considered. In contrast, after these factors were considered with reference to estrogen, the conclusion was drawn that estrogen-induced tumors were plausible in humans. Finally, it was concluded that the induction of rodent liver tumors by porphyrogenic compounds followed a cytotoxic MOA, and that liver tumors formed as a result of sustained cytotoxicity and regenerative proliferation are considered relevant for evaluating human cancer risk if appropriate metabolism occurs in the animal models and in humans.
Grabaskas, Dave; Brunett, Acacia J.; Bucknor, Matthew
GE Hitachi Nuclear Energy (GEH) and Argonne National Laboratory are currently engaged in a joint effort to modernize and develop probabilistic risk assessment (PRA) techniques for advanced non-light water reactors. At a high level, the primary outcome of this project will be the development of next-generation PRA methodologies that will enable risk-informed prioritization of safety- and reliability-focused research and development, while also identifying gaps that may be resolved through additional research. A subset of this effort is the development of PRA methodologies to conduct a mechanistic source term (MST) analysis for event sequences that could result in the release of radionuclides. The MST analysis seeks to realistically model and assess the transport, retention, and release of radionuclides from the reactor to the environment. The MST methods developed during this project seek to satisfy the requirements of the Mechanistic Source Term element of the ASME/ANS Non-LWR PRA standard. The MST methodology consists of separate analysis approaches for risk-significant and non-risk significant event sequences that may result in the release of radionuclides from the reactor. For risk-significant event sequences, the methodology focuses on a detailed assessment, using mechanistic models, of radionuclide release from the fuel, transport through and release from the primary system, transport in the containment, and finally release to the environment. The analysis approach for non-risk significant event sequences examines the possibility of large radionuclide releases due to events such as re-criticality or the complete loss of radionuclide barriers. This paper provides details on the MST methodology, including the interface between the MST analysis and other elements of the PRA, and provides a simplified example MST calculation for a sodium fast reactor.
Cremer, Jonas; Arnoldini, Markus; Hwa, Terence
2017-06-20
The human gut harbors a dynamic microbial community whose composition bears great importance for the health of the host. Here, we investigate how colonic physiology impacts bacterial growth, which ultimately dictates microbiota composition. Combining measurements of bacterial physiology with analysis of published data on human physiology into a quantitative, comprehensive modeling framework, we show how water flow in the colon, in concert with other physiological factors, determines the abundances of the major bacterial phyla. Mechanistically, our model shows that local pH values in the lumen, which differentially affect the growth of different bacteria, drive changes in microbiota composition. It identifies key factors influencing the delicate regulation of colonic pH, including epithelial water absorption, nutrient inflow, and luminal buffering capacity, and generates testable predictions on their effects. Our findings show that a predictive and mechanistic understanding of microbial ecology in the gut is possible. Such predictive understanding is needed for the rational design of intervention strategies to actively control the microbiota.
Kirk, Devin; Jones, Natalie; Peacock, Stephanie; Phillips, Jessica; Molnár, Péter K; Krkošek, Martin; Luijckx, Pepijn
2018-02-01
The complexity of host-parasite interactions makes it difficult to predict how host-parasite systems will respond to climate change. In particular, host and parasite traits such as survival and virulence may have distinct temperature dependencies that must be integrated into models of disease dynamics. Using experimental data from Daphnia magna and a microsporidian parasite, we fitted a mechanistic model of the within-host parasite population dynamics. Model parameters comprising host aging and mortality, as well as parasite growth, virulence, and equilibrium abundance, were specified by relationships arising from the metabolic theory of ecology. The model effectively predicts host survival, parasite growth, and the cost of infection across temperature while using less than half the parameters compared to modeling temperatures discretely. Our results serve as a proof of concept that linking simple metabolic models with a mechanistic host-parasite framework can be used to predict temperature responses of parasite population dynamics at the within-host level.
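The temperature dependence enters through Boltzmann-Arrhenius scaling of rates, as in the metabolic theory of ecology; the sketch below applies that scaling to an illustrative within-host logistic growth model of parasite burden, with reference rates, activation energies, and the carrying capacity chosen arbitrarily rather than fitted to the Daphnia experiments.

```python
import numpy as np

K_BOLTZ = 8.617e-5  # Boltzmann constant in eV/K

def arrhenius(rate_ref, E, temp_c, temp_ref_c=20.0):
    """Boltzmann-Arrhenius temperature scaling used in the metabolic theory of ecology."""
    T, T_ref = temp_c + 273.15, temp_ref_c + 273.15
    return rate_ref * np.exp((E / K_BOLTZ) * (1.0 / T_ref - 1.0 / T))

def parasite_burden(temp_c, days=30.0, dt=0.1):
    """Toy within-host logistic growth with temperature-scaled parasite growth and host mortality."""
    r = arrhenius(rate_ref=0.3, E=0.65, temp_c=temp_c)    # parasite growth rate (1/day)
    mu = arrhenius(rate_ref=0.02, E=0.45, temp_c=temp_c)  # host background mortality (1/day)
    K = 1e4                                               # equilibrium parasite abundance
    P, survival = 10.0, 1.0
    for _ in range(int(days / dt)):
        P += dt * r * P * (1.0 - P / K)
        survival *= np.exp(-mu * dt)
    return P, survival

for temp in (10, 15, 20, 25):
    P, s = parasite_burden(temp)
    print(f"{temp:>2} C: burden ~ {P:8.1f}, host survival ~ {s:.2f}")
```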
Koutinas, Michalis; Kiparissides, Alexandros; Pistikopoulos, Efstratios N; Mantalaris, Athanasios
2012-01-01
The complexity of the regulatory network and the interactions that occur in the intracellular environment of microorganisms highlight the importance in developing tractable mechanistic models of cellular functions and systematic approaches for modelling biological systems. To this end, the existing process systems engineering approaches can serve as a vehicle for understanding, integrating and designing biological systems and processes. Here, we review the application of a holistic approach for the development of mathematical models of biological systems, from the initial conception of the model to its final application in model-based control and optimisation. We also discuss the use of mechanistic models that account for gene regulation, in an attempt to advance the empirical expressions traditionally used to describe micro-organism growth kinetics, and we highlight current and future challenges in mathematical biology. The modelling research framework discussed herein could prove beneficial for the design of optimal bioprocesses, employing rational and feasible approaches towards the efficient production of chemicals and pharmaceuticals.
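As a toy example of moving from purely empirical growth kinetics toward a mechanistic description that accounts for gene regulation, the sketch below gates a Monod growth term with a simple inducible-enzyme balance; the structure and parameter values are generic assumptions, not any of the specific models reviewed by the authors.

```python
import numpy as np
from scipy.integrate import solve_ivp

def growth_model(t, y, mu_max=0.6, Ks=0.5, Y=0.5, k_expr=0.2, k_deg=0.2, K_ind=0.1):
    """Biomass X grows on substrate S; uptake requires an inducible enzyme E."""
    X, S, E = y
    induction = S / (K_ind + S)            # substrate-dependent gene induction (0..1)
    mu = mu_max * E * S / (Ks + S)         # Monod term gated by the enzyme level E
    dX = mu * X
    dS = -mu * X / Y
    dE = k_expr * induction - k_deg * E    # E relaxes toward the induction level
    return [dX, dS, dE]

sol = solve_ivp(growth_model, (0.0, 24.0), [0.05, 10.0, 0.0], max_step=0.05)
print("final biomass, substrate, enzyme:", [round(float(v), 3) for v in sol.y[:, -1]])
```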
Covalent Organic Frameworks as a Platform for Multidimensional Polymerization.
Bisbey, Ryan P; Dichtel, William R
2017-06-28
The simultaneous polymerization and crystallization of monomers featuring directional bonding designs provides covalent organic frameworks (COFs), which are periodic polymer networks with robust covalent bonds arranged in two- or three-dimensional topologies. The range of properties characterized in COFs has rapidly expanded to include those of interest for heterogeneous catalysis, energy storage and photovoltaic devices, and proton-conducting membranes. Yet many of these applications will require materials quality, morphological control, and synthetic efficiency exceeding the capabilities of contemporary synthetic methods. This level of control will emerge from an improved fundamental understanding of COF nucleation and growth processes. More powerful characterization of structure and defects, improved syntheses guided by mechanistic understanding, and accessing diverse isolated forms, ranging from single crystals to thin films to colloidal suspensions, remain important frontier problems.
Kepler, the Ultimate Aristotelian
Davis, A. E. L.
A comparison is made between Aristotelian and Newtonian versions of the laws of motion. Kepler was successful in proving the two laws of motion of a single planet - to the extent that agreement with a framework of theory constitutes a proof. Of course he invented his framework of causes after the event, to fit motions that had already been quantified - but it may seem that Kepler's mainly mechanistic explanation could have been considered by his contemporaries just as reasonable as Newton's action at a distance. It can now be appreciated that there was a window of less than 50 years before Newton's total synthesis. No-one previously had had the motivation to create a system of "celestial physics" based on a judicious use of Aristotelian principles. Yet this is what Kepler achieved.
Moullin, Joanna C; Sabater-Hernández, Daniel; Fernandez-Llimos, Fernando; Benrimoj, Shalom I
2015-03-14
Implementation science and knowledge translation have developed across multiple disciplines with the common aim of bringing innovations to practice. Numerous implementation frameworks, models, and theories have been developed to target a diverse array of innovations. As such, it is plausible that not all frameworks include the full range of concepts now thought to be involved in implementation. Users face the decision of selecting a single or combining multiple implementation frameworks. To aid this decision, the aim of this review was to assess the comprehensiveness of existing frameworks. A systematic search was undertaken in PubMed to identify implementation frameworks of innovations in healthcare published from 2004 to May 2013. Additionally, titles and abstracts from Implementation Science journal and references from identified papers were reviewed. The orientation, type, and presence of stages and domains, along with the degree of inclusion and depth of analysis of factors, strategies, and evaluations of implementation of included frameworks were analysed. Frameworks were assessed individually and grouped according to their targeted innovation. Frameworks for particular innovations had similar settings, end-users, and 'type' (descriptive, prescriptive, explanatory, or predictive). On the whole, frameworks were descriptive and explanatory more often than prescriptive and predictive. A small number of the reviewed frameworks covered an implementation concept(s) in detail, however, overall, there was limited degree and depth of analysis of implementation concepts. The core implementation concepts across the frameworks were collated to form a Generic Implementation Framework, which includes the process of implementation (often portrayed as a series of stages and/or steps), the innovation to be implemented, the context in which the implementation is to occur (divided into a range of domains), and influencing factors, strategies, and evaluations. The selection of implementation framework(s) should be based not solely on the healthcare innovation to be implemented, but include other aspects of the framework's orientation, e.g., the setting and end-user, as well as the degree of inclusion and depth of analysis of the implementation concepts. The resulting generic structure provides researchers, policy-makers, health administrators, and practitioners a base that can be used as guidance for their implementation efforts.
Automatising the analysis of stochastic biochemical time-series
2015-01-01
Background: Mathematical and computational modelling of biochemical systems has seen a lot of effort devoted to the definition and implementation of high-performance mechanistic simulation frameworks. Within these frameworks it is possible to analyse complex models under a variety of configurations, eventually selecting the best setting of, e.g., parameters for a target system. Motivation: This operational pipeline relies on the ability to interpret the predictions of a model, often represented as simulation time-series. Thus, an efficient data analysis pipeline is crucial to automatise time-series analyses, bearing in mind that errors in this phase might mislead the modeller's conclusions. Results: For this reason we have developed an intuitive framework-independent Python tool to automate analyses common to a variety of modelling approaches. These include assessment of useful non-trivial statistics for simulation ensembles, e.g., estimation of master equations. Intuitive and domain-independent batch scripts will allow the researcher to automatically prepare reports, thus speeding up the usual model-definition, testing and refinement pipeline. PMID:26051821
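As a concrete illustration of the kind of framework-independent ensemble statistics such a tool automates, the sketch below summarises a set of stochastic simulation runs with per-time-point means, standard deviations, and quantiles. This is a minimal Python example written for this document, not the authors' tool; the function name and the synthetic decay data are assumptions.

```python
# Illustrative sketch (not the authors' tool): summarising an ensemble of
# stochastic simulation time-series with basic, framework-independent statistics.
import numpy as np

def summarise_ensemble(trajectories, quantiles=(0.05, 0.5, 0.95)):
    """trajectories: array of shape (n_runs, n_timepoints) for one species."""
    traj = np.asarray(trajectories, dtype=float)
    return {
        "mean": traj.mean(axis=0),                      # ensemble mean per time point
        "std": traj.std(axis=0, ddof=1),                # between-run variability
        "quantiles": np.quantile(traj, quantiles, axis=0),
    }

# Example: 100 noisy realisations of an exponential decay (synthetic stand-in data).
rng = np.random.default_rng(0)
t = np.linspace(0, 10, 50)
runs = np.exp(-0.5 * t) * 100 + rng.normal(0, 3, size=(100, t.size))
stats = summarise_ensemble(runs)
print(stats["mean"][:5])
```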
Bouhaddou, Mehdi; Koch, Rick J.; DiStefano, Matthew S.; Tan, Annie L.; Mertz, Alex E.
2018-01-01
Most cancer cells harbor multiple drivers whose epistasis and interactions with expression context cloud drug and drug combination sensitivity prediction. We constructed a mechanistic computational model that is context-tailored by omics data to capture regulation of stochastic proliferation and death by pan-cancer driver pathways. Simulations and experiments explore how the coordinated dynamics of RAF/MEK/ERK and PI-3K/AKT kinase activities in response to synergistic mitogen or drug combinations control cell fate in a specific cellular context. In this MCF10A cell context, simulations suggest that synergistic ERK and AKT inhibitor-induced death is likely mediated by BIM rather than BAD, which is supported by prior experimental studies. AKT dynamics explain S-phase entry synergy between EGF and insulin, but simulations suggest that stochastic ERK, rather than AKT, dynamics drive cell-to-cell proliferation variability, which in simulations is predictable from pre-stimulus fluctuations in C-Raf/B-Raf levels. Simulations suggest MEK alteration negligibly influences transformation, consistent with clinical data. Tailoring the model to an alternate cell expression and mutation context, a glioma cell line, allows prediction of increased sensitivity of cell death to AKT inhibition. Our model mechanistically interprets context-specific landscapes between driver pathways and cell fates, providing a framework for designing more rational cancer combination therapy. PMID:29579036
NASA Astrophysics Data System (ADS)
Payne, J. F.
2016-12-01
Significant Arctic environmental and socio-economic change has been observed on the North Slope of Alaska, presenting challenges for resident communities and management agencies that need to adapt to future changes that are difficult to model or predict. Continued climate change coupled with new or modified energy development could substantially alter the landscape and ecosystem in the future. The North Slope Science Initiative (NSSI) recognized the value of using a participatory scenarios process to consider plausible future energy and resource development scenarios through the year 2040 to help identify and prioritize research and monitoring needs on the North Slope. The scenarios process engaged diverse stakeholders, including subject matter experts and local knowledge holders. Through identification and ranking of key drivers and uncertainties relevant to the focus of the study, a series of spatially explicit scenarios was developed, analyzed in terms of low, medium and high development activities. Climate change and economic factors were key drivers affecting plausible energy development scenarios. The implications from each of the scenarios were then used to identify important research and monitoring activities and their relevant spatial scales. The scenarios project identified over 40 research and monitoring needs. The top five research needs addressed data gaps and key concerns related to how the scenarios could affect: hunting and trapping on land, health and community well-being, permafrost and hydrology, marine mammal subsistence and potential marine oil spills. The use of a participatory scenarios process was essential for identifying a range of plausible energy and resource development scenarios using a framework that involved a systematic assessment of complex interacting drivers of change, consideration of key uncertainties, and transparency throughout the project.
Modelling the effects of past and future climate on the risk of bluetongue emergence in Europe
Guis, Helene; Caminade, Cyril; Calvete, Carlos; Morse, Andrew P.; Tran, Annelise; Baylis, Matthew
2012-01-01
Vector-borne diseases are among those most sensitive to climate because the ecology of vectors and the development rate of pathogens within them are highly dependent on environmental conditions. Bluetongue (BT), a recently emerged arboviral disease of ruminants in Europe, is often cited as an illustration of climate's impact on disease emergence, although no study has yet tested this association. Here, we develop a framework to quantitatively evaluate the effects of climate on BT's emergence in Europe by integrating high-resolution climate observations and model simulations within a mechanistic model of BT transmission risk. We demonstrate that a climate-driven model explains, in both space and time, many aspects of BT's recent emergence and spread, including the 2006 BT outbreak in northwest Europe, which occurred in the year of highest projected risk since at least 1960. Furthermore, the model provides mechanistic insight into BT's emergence, suggesting that the drivers of emergence across Europe differ between the South and the North. Driven by simulated future climate from an ensemble of 11 regional climate models, the model projects an increase in the future risk of BT emergence across most of Europe, with uncertainty in rate but not in trend. The framework described here is adaptable and applicable to other diseases, where the link between climate and disease transmission risk can be quantified, permitting the evaluation of scale and uncertainty in climate change's impact on the future of such diseases. PMID:21697167
Scenario-neutral Food Security Risk Assessment: A livestock Heat Stress Case Study
NASA Astrophysics Data System (ADS)
Broman, D.; Rajagopalan, B.; Hopson, T. M.
2015-12-01
Food security risk assessments can provide decision-makers with actionable information to identify critical system limitations, and alternatives to mitigate the impacts of future conditions. The majority of current risk assessments have been scenario-led, and results are limited by the scenarios - selected future states of the world's climate system and socioeconomic factors. A generic scenario-neutral framework for food security risk assessments is presented here that uses plausible states of the world without initially assigning likelihoods. Measures of system vulnerabilities are identified and system risk is assessed for these states. This framework has benefited greatly from research in the water and natural resource fields to adapt their planning to provide better risk assessments. To illustrate the utility of this framework we develop a case study using livestock heat stress risk within the pastoral system of West Africa. Heat stress can have a major impact not only on livestock owners, but on the greater food production system, decreasing livestock growth, milk production, and reproduction and, in severe cases, causing death. A heat stress index calculated from daily weather is used as a vulnerability measure and is computed from historic daily weather data at several locations in the study region. To generate plausible states, a stochastic weather generator is developed to generate synthetic weather sequences at each location, consistent with the seasonal climate. A spatial model of monthly and seasonal heat stress provides projections of current and future livestock heat stress measures across the study region, and can incorporate seasonal climate and other external covariates. These models, when linked with empirical thresholds of heat stress risk for specific breeds, offer decision-makers actionable information for use in near-term warning systems as well as for future planning. Future assessments can indicate under which states livestock are at greatest risk of heat stress and, when coupled with assessments of additional measures (e.g., water and fodder availability), can inform alternatives that provide satisfactory performance under a wide range of states (e.g., optimal cattle breed, supplemental feed, increased water access).
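The abstract does not specify which heat stress index is used, so the sketch below takes one commonly cited temperature-humidity index (THI) formulation as a stand-in vulnerability measure. Both the formula choice and the stress-category thresholds are assumptions for illustration only; operational thresholds are breed-specific.

```python
# Hypothetical sketch: a temperature-humidity index (THI) as one possible
# heat-stress vulnerability measure. The formulation and thresholds below
# are assumptions, not the index used in the study.
def thi(temp_c: float, rel_humidity_pct: float) -> float:
    """One commonly used THI formulation (temperature in deg C, RH in percent)."""
    return 0.8 * temp_c + (rel_humidity_pct / 100.0) * (temp_c - 14.4) + 46.4

def stress_category(value: float) -> str:
    # Illustrative, breed-agnostic cut-offs only; real thresholds are breed-specific.
    if value < 72:
        return "low"
    if value < 79:
        return "moderate"
    return "severe"

print(stress_category(thi(35.0, 60.0)))  # hot, humid day -> elevated stress
```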
Sridharan, Devarajan; Steinmetz, Nicholas A; Moore, Tirin; Knudsen, Eric I
2017-01-18
Distinct networks in the forebrain and the midbrain coordinate to control spatial attention. The critical involvement of the superior colliculus (SC)-the central structure in the midbrain network-in visuospatial attention has been shown by four seminal, published studies in monkeys (Macaca mulatta) performing multialternative tasks. However, due to the lack of a mechanistic framework for interpreting behavioral data in such tasks, the nature of the SC's contribution to attention remains unclear. Here we present and validate a novel decision framework for analyzing behavioral data in multialternative attention tasks. We apply this framework to re-examine the behavioral evidence from these published studies. Our model is a multidimensional extension to signal detection theory that distinguishes between two major classes of attentional mechanisms: those that alter the quality of sensory information or "sensitivity," and those that alter the selective gating of sensory information or "choice bias." Model-based simulations and model-based analyses of data from these published studies revealed a converging pattern of results that indicated that choice-bias changes, rather than sensitivity changes, were the primary outcome of SC manipulation. Our results suggest that the SC contributes to attentional performance predominantly by generating a spatial choice bias for stimuli at a selected location, and that this bias operates downstream of forebrain mechanisms that enhance sensitivity. The findings lead to a testable mechanistic framework of how the midbrain and forebrain networks interact to control spatial attention. Attention involves the selection of the most relevant information for differential sensory processing and decision making. While the mechanisms by which attention alters sensory encoding (sensitivity control) are well studied, the mechanisms by which attention alters decisional weighting of sensory evidence (choice-bias control) are poorly understood. Here, we introduce a model of multialternative decision making that distinguishes bias from sensitivity effects in attention tasks. With our model, we simulate experimental data from four seminal studies that microstimulated or inactivated a key attention-related midbrain structure, the superior colliculus (SC). We demonstrate that the experimental effects of SC manipulation are entirely consistent with the SC controlling attention by changing choice bias, thereby shedding new light on how the brain mediates attention. Copyright © 2017 the authors 0270-6474/17/370480-32$15.00/0.
A new mechanistic framework to predict OCS fluxes in soils
NASA Astrophysics Data System (ADS)
Sauze, Joana; Ogee, Jérôme; Launois, Thomas; Kesselmeier, Jürgen; Van Diest, Heidi; Wingate, Lisa
2015-04-01
A better description of the amplitude of photosynthetic and respiratory gross CO2 fluxes at large scales is needed to improve our predictions of the current and future global CO2 cycle. Carbonyl sulfide (COS) is the most abundant sulphur gas in the atmosphere and has been proposed as a new tracer of gross photosynthesis, as the uptake of COS from the atmosphere is dominated by the activity of carbonic anhydrase (CA), an enzyme abundant in leaves that also catalyses CO2 hydration during photosynthesis. However, soils also exchange COS with the atmosphere, and there is growing evidence that this flux must also be accounted for in atmospheric budgets. In this context, a new mechanistic description of soil-atmosphere COS exchange is clearly needed. Soils can take up COS from the atmosphere as the soil biota also contain CA, and COS emissions from soils have also been reported in agricultural fields and anoxic soils. Previous studies have also shown that soil COS fluxes exhibit an optimum with respect to soil water content and soil temperature. Here we propose a new mechanistic framework to predict the fluxes of COS between soils and the atmosphere. We describe the COS soil budget by a first-order reaction-diffusion-production equation, assuming that the hydrolysis of COS by CA is total and irreversible. To describe COS diffusion through the soil matrix, we use different formulations of soil air-filled pore space and temperature, depending on the turbulence level above the soil surface. Using this model we are able to explain the observed presence of an optimum temperature for soil COS uptake and show how this optimum can shift to cooler temperatures in the presence of soil COS emissions. Our model can also explain the observed optimum with soil moisture content previously described in the literature (e.g. Van Diest & Kesselmeier, 2008) as a result of diffusional constraints on COS hydrolysis. These diffusional constraints are also responsible for the response of COS uptake to soil weight and depth observed by Kesselmeier et al. (1999). In order to simulate the exact COS uptake rates and patterns observed on several soils collected from a range of biomes (Van Diest & Kesselmeier, 2008), different CA activities had to be invoked for each soil type, consistent with the expected soil microbial population size and diversity. A better description of the drivers governing soil CA activity and COS emissions from soils is needed before incorporating our new mechanistic model of soil-atmosphere COS uptake in large-scale ecosystem models and COS atmospheric budgets.
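A generic form of the first-order reaction-diffusion-production balance described above can be written as follows; this is a sketch consistent with the text, not necessarily the authors' exact parameterisation. Here C is the COS mixing ratio in soil air, z is depth, D(θ, T) an effective diffusivity depending on air-filled porosity and temperature, k(θ, T) the first-order rate of CA-catalysed hydrolysis, P(T) a temperature-dependent production term, and F_soil the surface flux.

```latex
% Generic reaction-diffusion-production form consistent with the description
% above (a sketch; the authors' exact parameterisation may differ):
\frac{\partial C}{\partial t}
  = \frac{\partial}{\partial z}\!\left( D(\theta, T)\,\frac{\partial C}{\partial z} \right)
  - k(\theta, T)\, C + P(T),
\qquad
F_{\mathrm{soil}} = -\,D(\theta, T)\,\left.\frac{\partial C}{\partial z}\right|_{z=0}
```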
Li, Chuan; Li, Chuang-Jun; Ma, Jie; Chen, Fang-You; Li, Li; Wang, Xiao-Liang; Ye, Fei; Zhang, Dong-Ming
2018-06-15
Magterpenoid A (1), possessing a rare 4,6,11-trioxatricyclo[5.3.1.0¹,⁵]undecane framework with an irregular monoterpenoid moiety, magterpenoid B (2), with an unprecedented 6/6/6/6 polycyclic skeleton, and magterpenoid C (3), a novel terpenoid quinone with a C6-C3 unit, were isolated from the bark of Magnolia officinalis var. biloba. Plausible biogenetic pathways of 1-3 are presented. Compounds 1 and 3 exhibited significant PTP1B inhibitory activities with IC50 values of 1.44 and 0.81 μM, respectively.
Resolving Conflicts Between Syntax and Plausibility in Sentence Comprehension
Andrews, Glenda; Ogden, Jessica E.; Halford, Graeme S.
2017-01-01
Comprehension of plausible and implausible object- and subject-relative clause sentences with and without prepositional phrases was examined. Undergraduates read each sentence then evaluated a statement as consistent or inconsistent with the sentence. Higher acceptance of consistent than inconsistent statements indicated reliance on syntactic analysis. Higher acceptance of plausible than implausible statements reflected reliance on semantic plausibility. There was greater reliance on semantic plausibility and lesser reliance on syntactic analysis for more complex object-relatives and sentences with prepositional phrases than for less complex subject-relatives and sentences without prepositional phrases. Comprehension accuracy and confidence were lower when syntactic analysis and semantic plausibility yielded conflicting interpretations. The conflict effect on comprehension was significant for complex sentences but not for less complex sentences. Working memory capacity predicted resolution of the syntax-plausibility conflict in more and less complex items only when sentences and statements were presented sequentially. Fluid intelligence predicted resolution of the conflict in more and less complex items under sequential and simultaneous presentation. Domain-general processes appear to be involved in resolving syntax-plausibility conflicts in sentence comprehension. PMID:28458748
Counterfactual Plausibility and Comparative Similarity.
Stanley, Matthew L; Stewart, Gregory W; Brigard, Felipe De
2017-05-01
Counterfactual thinking involves imagining hypothetical alternatives to reality. Philosopher David Lewis (1973, 1979) argued that people estimate the subjective plausibility that a counterfactual event might have occurred by comparing an imagined possible world in which the counterfactual statement is true against the current, actual world in which the counterfactual statement is false. Accordingly, counterfactuals considered to be true in possible worlds comparatively more similar to ours are judged as more plausible than counterfactuals deemed true in possible worlds comparatively less similar. Although Lewis did not originally develop his notion of comparative similarity to be investigated as a psychological construct, this study builds upon his idea to empirically investigate comparative similarity as a possible psychological strategy for evaluating the perceived plausibility of counterfactual events. More specifically, we evaluate judgments of comparative similarity between episodic memories and episodic counterfactual events as a factor influencing people's judgments of plausibility in counterfactual simulations, and we also compare it against other factors thought to influence judgments of counterfactual plausibility, such as ease of simulation and prior simulation. Our results suggest that the greater the perceived similarity between the original memory and the episodic counterfactual event, the greater the perceived plausibility that the counterfactual event might have occurred. While similarity between actual and counterfactual events, ease of imagining, and prior simulation of the counterfactual event were all significantly related to counterfactual plausibility, comparative similarity best captured the variance in ratings of counterfactual plausibility. Implications for existing theories on the determinants of counterfactual plausibility are discussed. Copyright © 2016 Cognitive Science Society, Inc.
An Observation-Driven Agent-Based Modeling and Analysis Framework for C. elegans Embryogenesis.
Wang, Zi; Ramsey, Benjamin J; Wang, Dali; Wong, Kwai; Li, Husheng; Wang, Eric; Bao, Zhirong
2016-01-01
With cutting-edge live microscopy and image analysis, biologists can now systematically track individual cells in complex tissues and quantify cellular behavior over extended time windows. Computational approaches that utilize the systematic and quantitative data are needed to understand how cells interact in vivo to give rise to the different cell types and 3D morphology of tissues. An agent-based, minimum descriptive modeling and analysis framework is presented in this paper to study C. elegans embryogenesis. The framework is designed to incorporate the large amounts of experimental observations on cellular behavior and reserve data structures/interfaces that allow regulatory mechanisms to be added as more insights are gained. Observed cellular behaviors are organized into lineage identity, timing and direction of cell division, and path of cell movement. The framework also includes global parameters such as the eggshell and a clock. Division and movement behaviors are driven by statistical models of the observations. Data structures/interfaces are reserved for gene list, cell-cell interaction, cell fate and landscape, and other global parameters until the descriptive model is replaced by a regulatory mechanism. This approach provides a framework to handle the ongoing experiments of single-cell analysis of complex tissues where mechanistic insights lag data collection and need to be validated on complex observations.
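To make the agent-based, observation-driven design concrete, the sketch below shows a minimal cell agent whose division timing and direction come from recorded lineage observations, whose movement is drawn from a simple statistical model, and which reserves a hook for regulatory mechanisms to be added later. Class and field names are hypothetical and the 20-minute division cadence is a placeholder; this is not the authors' code.

```python
# Minimal illustrative sketch (hypothetical names, not the authors' code) of an
# observation-driven cell agent: division timing/direction come from recorded
# lineage data, movement is drawn from a statistical model, and a hook is
# reserved for regulatory mechanisms to be added later.
import random
from dataclasses import dataclass, field

@dataclass
class CellAgent:
    name: str                      # lineage identity, e.g. "ABa"
    position: tuple                # (x, y, z) inside the eggshell
    division_time: float           # from observed lineage timing
    division_axis: tuple           # observed division direction
    fate_hooks: dict = field(default_factory=dict)  # reserved regulatory interface

    def step(self, clock: float, motion_sigma: float = 0.5):
        # Statistical movement model: small random displacement per time step.
        self.position = tuple(p + random.gauss(0.0, motion_sigma) for p in self.position)
        if clock >= self.division_time:
            return self.divide()
        return [self]

    def divide(self):
        # Place the two daughters along the observed division axis.
        ax = self.division_axis
        kids = []
        for sign, suffix in ((+1, "a"), (-1, "p")):
            kids.append(CellAgent(
                name=self.name + suffix,
                position=tuple(p + sign * a for p, a in zip(self.position, ax)),
                division_time=self.division_time + 20.0,  # placeholder cadence
                division_axis=ax,
            ))
        return kids
```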
NASA Astrophysics Data System (ADS)
Hurd, B. H.; Coonrod, J.
2008-12-01
Climate change is expected to alter surface hydrology throughout the arid Western United States, in most cases compressing the period of peak snowmelt and runoff, and in some cases, for example the Rio Grande, limiting total runoff. As such, climate change is widely expected to further stress arid watersheds, particularly in regions where trends in population growth, economic development and environmental regulation are current challenges. Strategies to adapt to such changes are evolving at various institutional levels, including conjunctive management of surface and ground waters. Groundwater resources remain one of the key components of water management strategies aimed at accommodating continued population growth and mitigating the potential for water supply disruptions under climate change. By developing a framework for valuing these resources and for valuing improvements in the information pertaining to their characteristics, this research can assist in prioritizing infrastructure and investment to change and enhance water resource management. The key objectives of this paper are to 1) develop a framework for estimating the value of groundwater resources and improved information, and 2) provide some preliminary estimates of this value and how it responds to plausible scenarios of climate change.
Bayesian Hierarchical Grouping: perceptual grouping as mixture estimation
Froyen, Vicky; Feldman, Jacob; Singh, Manish
2015-01-01
We propose a novel framework for perceptual grouping based on the idea of mixture models, called Bayesian Hierarchical Grouping (BHG). In BHG we assume that the configuration of image elements is generated by a mixture of distinct objects, each of which generates image elements according to some generative assumptions. Grouping, in this framework, means estimating the number and the parameters of the mixture components that generated the image, including estimating which image elements are “owned” by which objects. We present a tractable implementation of the framework, based on the hierarchical clustering approach of Heller and Ghahramani (2005). We illustrate it with examples drawn from a number of classical perceptual grouping problems, including dot clustering, contour integration, and part decomposition. Our approach yields an intuitive hierarchical representation of image elements, giving an explicit decomposition of the image into mixture components, along with estimates of the probability of various candidate decompositions. We show that BHG accounts well for a diverse range of empirical data drawn from the literature. Because BHG provides a principled quantification of the plausibility of grouping interpretations over a wide range of grouping problems, we argue that it provides an appealing unifying account of the elusive Gestalt notion of Prägnanz. PMID:26322548
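The core idea of grouping as mixture estimation can be illustrated with an off-the-shelf Bayesian Gaussian mixture applied to a dot-clustering display, as in the sketch below. This is not the authors' hierarchical BHG implementation; the Dirichlet-process-style prior is used here only as a stand-in that lets the data determine how many mixture components (putative objects) survive and which dots each component "owns".

```python
# Stand-in for the idea of "grouping as mixture estimation" (not BHG itself):
# fit a Bayesian Gaussian mixture to a synthetic dot-clustering display.
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(1)
dots = np.vstack([
    rng.normal([0, 0], 0.3, size=(60, 2)),   # one putative "object"
    rng.normal([3, 1], 0.3, size=(60, 2)),   # another
])

# A small weight-concentration prior lets the data prune unused components.
bgm = BayesianGaussianMixture(n_components=8, weight_concentration_prior=0.05,
                              covariance_type="full", random_state=0).fit(dots)
ownership = bgm.predict(dots)            # which "object" owns each dot
print(np.unique(ownership), bgm.weights_.round(2))
```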
Mechanistic modeling of reactive soil nitrogen emissions across agricultural management practices
NASA Astrophysics Data System (ADS)
Rasool, Q. Z.; Miller, D. J.; Bash, J. O.; Venterea, R. T.; Cooter, E. J.; Hastings, M. G.; Cohan, D. S.
2017-12-01
The global reactive nitrogen (N) budget has increased by a factor of 2-3 from pre-industrial levels. This increase is especially pronounced in highly N-fertilized agricultural regions in summer. The reactive N emissions from soil to atmosphere can be in reduced (NH3) or oxidized (NO, HONO, N2O) forms, depending on complex biogeochemical transformations of soil N reservoirs. Air quality models like CMAQ typically neglect soil emissions of HONO and N2O. Previously, soil NO emissions estimated by models like CMAQ remained parametric and inconsistent with soil NH3 emissions. Thus, there is a need to more mechanistically and consistently represent the soil N processes that lead to reactive N emissions to the atmosphere. Our updated approach estimates soil NO, HONO and N2O emissions by incorporating detailed agricultural fertilizer inputs from EPIC, and CMAQ-modeled N deposition, into the soil N pool. EPIC addresses the nitrification, denitrification and volatilization rates along with soil N pools for agricultural soils. Suitable updates will also be made to account for factors not addressed in EPIC, such as nitrite (NO2-) accumulation. The NO and N2O emissions from nitrification and denitrification are computed mechanistically using the N sub-model of DAYCENT. These mechanistic definitions use soil water content, temperature, NH4+ and NO3- concentrations, gas diffusivity and labile C availability as dependent parameters at various soil layers. Soil HONO emissions, found to be most probable under high NO2- availability, will be based on observed ratios of HONO to NO emissions under different soil moistures, pH values, and soil types. The updated scheme will utilize field-specific soil properties and N inputs across differing manure management practices such as tillage. Comparison of the modeled soil NO emission rates from the new mechanistic and existing schemes against field measurements will be discussed. Our updated framework will help to predict the diurnal and daily variability of different reactive N emissions (NO, HONO, N2O) with soil temperature, moisture and N inputs.
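The sketch below is a deliberately simplified illustration of the general structure of such a scheme, in which gaseous N fluxes scale with mineral-N pools, soil temperature, and water-filled pore space. The rate constants, the moisture response, and the NO/N2O partitioning coefficients are assumptions chosen for illustration; they are not the EPIC or DAYCENT equations.

```python
# Highly simplified, illustrative sketch only -- not the DAYCENT/EPIC equations.
# It shows the general shape of a mechanistic scheme in which N gas fluxes scale
# with mineral-N pools, temperature, and water-filled pore space (WFPS).
import math

def temperature_scalar(t_soil_c: float, q10: float = 2.0, t_ref: float = 20.0) -> float:
    return q10 ** ((t_soil_c - t_ref) / 10.0)

def wfps_scalar(wfps: float) -> float:
    # A single bell-shaped moisture response is a placeholder assumption;
    # real schemes treat nitrification and denitrification moisture responses separately.
    return math.exp(-((wfps - 0.6) ** 2) / (2 * 0.15 ** 2))

def n_gas_flux(nh4: float, no3: float, t_soil_c: float, wfps: float,
               k_nit: float = 0.02, k_den: float = 0.01):
    """Return (NO, N2O) fluxes from first-order turnover of NH4+ and NO3- pools."""
    env = temperature_scalar(t_soil_c) * wfps_scalar(wfps)
    nitrified = k_nit * nh4 * env
    denitrified = k_den * no3 * env
    no_flux = 0.4 * nitrified + 0.1 * denitrified     # partitioning coefficients: assumed
    n2o_flux = 0.02 * nitrified + 0.3 * denitrified   # coefficients illustrative only
    return no_flux, n2o_flux

print(n_gas_flux(nh4=10.0, no3=20.0, t_soil_c=25.0, wfps=0.65))
```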
Towards a neuro-computational account of prism adaptation.
Petitet, Pierre; O'Reilly, Jill X; O'Shea, Jacinta
2017-12-14
Prism adaptation has a long history as an experimental paradigm used to investigate the functional and neural processes that underlie sensorimotor control. In the neuropsychology literature, prism adaptation behaviour is typically explained by reference to a traditional cognitive psychology framework that distinguishes putative functions, such as 'strategic control' versus 'spatial realignment'. This theoretical framework lacks conceptual clarity, quantitative precision and explanatory power. Here, we advocate for an alternative computational framework that offers several advantages: 1) an algorithmic explanatory account of the computations and operations that drive behaviour; 2) expressed in quantitative mathematical terms; 3) embedded within a principled theoretical framework (Bayesian decision theory, state-space modelling); 4) that offers a means to generate and test quantitative behavioural predictions. This computational framework offers a route towards mechanistic neurocognitive explanations of prism adaptation behaviour. Thus it constitutes a conceptual advance compared to the traditional theoretical framework. In this paper, we illustrate how Bayesian decision theory and state-space models offer principled explanations for a range of behavioural phenomena in the field of prism adaptation (e.g. visual capture, magnitude of visual versus proprioceptive realignment, spontaneous recovery and dynamics of adaptation memory). We argue that this explanatory framework can advance understanding of the functional and neural mechanisms that implement prism adaptation behaviour, by enabling quantitative tests of hypotheses that go beyond merely descriptive mapping claims that 'brain area X is (somehow) involved in psychological process Y'. Copyright © 2017 The Authors. Published by Elsevier Ltd.. All rights reserved.
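As one concrete example of the state-space modelling approach advocated above, the sketch below implements a widely used two-rate (fast/slow) adaptation model applied to a prism-exposure schedule. The parameter values are illustrative, and this particular model is offered as an example of the family, not as the authors' specific formulation.

```python
# Sketch of one widely used state-space formulation for sensorimotor adaptation
# (a two-rate fast/slow model); parameter values are illustrative, and this is
# not necessarily the exact model advocated in the paper.
import numpy as np

def two_rate_adaptation(perturbation, a_f=0.60, b_f=0.20, a_s=0.99, b_s=0.02):
    x_fast, x_slow, output = 0.0, 0.0, []
    for p in perturbation:
        x_net = x_fast + x_slow
        error = p - x_net                     # visual error on this trial
        x_fast = a_f * x_fast + b_f * error   # learns fast, forgets fast
        x_slow = a_s * x_slow + b_s * error   # learns slowly, retains
        output.append(x_net)
    return np.array(output)

# 100 prism-exposure trials (constant 10-deg shift) followed by 50 washout trials.
schedule = np.concatenate([np.full(100, 10.0), np.zeros(50)])
adaptation = two_rate_adaptation(schedule)
print(adaptation[[0, 99, 100, 149]].round(2))  # early, late, first washout, late washout
```

The interplay of the fast and slow states is what gives this family of models its quantitative account of phenomena such as spontaneous recovery and the dynamics of adaptation memory mentioned above.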
A unifying framework for quantifying the nature of animal interactions.
Potts, Jonathan R; Mokross, Karl; Lewis, Mark A
2014-07-06
Collective phenomena, whereby agent-agent interactions determine spatial patterns, are ubiquitous in the animal kingdom. On the other hand, movement and space use are also greatly influenced by the interactions between animals and their environment. Despite both types of interaction fundamentally influencing animal behaviour, there has hitherto been no unifying framework for the models proposed in both areas. Here, we construct a general method for inferring population-level spatial patterns from underlying individual movement and interaction processes, a key ingredient in building a statistical mechanics for ecological systems. We show that resource selection functions, as well as several examples of collective motion models, arise as special cases of our framework, thus bringing together resource selection analysis and collective animal behaviour into a single theory. In particular, we focus on combining the various mechanistic models of territorial interactions in the literature with step selection functions, by incorporating interactions into the step selection framework and demonstrating how to derive territorial patterns from the resulting models. We demonstrate the efficacy of our model by application to a population of insectivore birds in the Amazon rainforest. © 2014 The Author(s) Published by the Royal Society. All rights reserved.
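To illustrate how territorial interactions can be folded into the step selection framework, the sketch below weights candidate steps by a movement kernel, a resource covariate, and avoidance of recently deposited foreign scent marks. The covariates, coefficients, and functional forms are hypothetical illustrations, not the fitted model from the Amazonian bird application.

```python
# Illustrative sketch of a step selection function with a territorial-interaction
# term (covariates and coefficients are hypothetical, not the fitted model).
import numpy as np

def step_probabilities(resource, foreign_mark_age, step_lengths,
                       beta_resource=1.5, beta_avoid=-2.0, mean_step=1.0):
    """Weight candidate steps by resources, avoidance of fresh foreign scent
    marks, and an exponential step-length kernel; normalise to probabilities."""
    movement_kernel = np.exp(-step_lengths / mean_step)
    # Fresh marks (small age) are avoided more strongly than old ones.
    interaction = beta_avoid * np.exp(-foreign_mark_age)
    weights = movement_kernel * np.exp(beta_resource * resource + interaction)
    return weights / weights.sum()

p = step_probabilities(resource=np.array([0.2, 0.9, 0.5, 0.1]),
                       foreign_mark_age=np.array([5.0, 0.1, 2.0, 10.0]),
                       step_lengths=np.array([0.5, 0.5, 1.5, 2.0]))
print(p.round(3))   # probability of choosing each candidate step
```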
Edwards, J R; Scully, J A; Brtek, M D
2000-12-01
Research into the changing nature of work requires comprehensive models of work design. One such model is the interdisciplinary framework (M. A. Campion, 1988), which integrates 4 work-design approaches (motivational, mechanistic, biological, perceptual-motor) and links each approach to specific outcomes. Unfortunately, studies of this framework have used methods that disregard measurement error, overlook dimensions within each work-design approach, and treat each approach and outcome separately. This study reanalyzes data from M. A. Campion (1988), using structural equation models that incorporate measurement error, specify multiple dimensions for each work-design approach, and examine the work-design approaches and outcomes jointly. Results show that previous studies underestimate relationships between work-design approaches and outcomes and that dimensions within each approach exhibit relationships with outcomes that differ in magnitude and direction.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robinson, Jerome R.; Gordon, Zachary; Booth, Corwin H.
2014-06-24
Cerium compounds have played vital roles in organic, inorganic, and materials chemistry due to their reversible redox chemistry between trivalent and tetravalent oxidation states. However, attempts to rationally access molecular cerium complexes in both oxidation states have been frustrated by unpredictable reactivity in cerium(III) oxidation chemistry. Such oxidation reactions are limited by steric saturation at the metal ion, which can result in high-energy activation barriers for electron transfer. An alternative approach has been realized using a rare earth/alkali metal/1,1'-BINOLate (REMB) heterobimetallic framework, which uses redox-inactive metals within the secondary coordination sphere to control ligand reorganization. The rational syntheses of functionalized cerium(IV) products and a mechanistic examination of the role of ligand reorganization in cerium(III) oxidation are presented.
Venkatakrishnan, K; Ecsedy, J A
2017-01-01
Clinical pharmacodynamic evaluation is a key component of the "pharmacologic audit trail" in oncology drug development. We posit that its value can and should be greatly enhanced via application of a robust quantitative pharmacology framework informed by biologically mechanistic considerations. Herein, we illustrate examples of intersectional blindspots across the disciplines of quantitative pharmacology and translational science and offer a roadmap aimed at enhancing the caliber of clinical pharmacodynamic research in the development of oncology therapeutics. © 2016 American Society for Clinical Pharmacology and Therapeutics.
NASA Astrophysics Data System (ADS)
Bowling, Daniel
2014-09-01
A central challenge in neuroscience is to understand the relationship between the mechanistic operation of the nervous system and the psychological phenomena we experience every day (e.g., perception, memory, attention, emotion, and consciousness). Supported by revolutionary advances in technology, knowledge of neural mechanisms has grown dramatically over recent decades, but with few exceptions our understanding of how these mechanisms relate to psychological phenomena remains poor.
A formal framework for scenario development in support of environmental decision-making
Mahmoud, M.; Liu, Yajing; Hartmann, H.; Stewart, S.; Wagener, T.; Semmens, D.; Stewart, R.; Gupta, H.; Dominguez, D.; Dominguez, F.; Hulse, D.; Letcher, R.; Rashleigh, Brenda; Smith, C.; Street, R.; Ticehurst, J.; Twery, M.; van, Delden H.; Waldick, R.; White, D.; Winter, L.
2009-01-01
Scenarios are possible future states of the world that represent alternative plausible conditions under different assumptions. Often, scenarios are developed in a context relevant to stakeholders involved in their applications since the evaluation of scenario outcomes and implications can enhance decision-making activities. This paper reviews the state-of-the-art of scenario development and proposes a formal approach to scenario development in environmental decision-making. The discussion of current issues in scenario studies includes advantages and obstacles in utilizing a formal scenario development framework, and the different forms of uncertainty inherent in scenario development, as well as how they should be treated. An appendix for common scenario terminology has been attached for clarity. Major recommendations for future research in this area include proper consideration of uncertainty in scenario studies, in particular in relation to stakeholder-relevant information, construction of scenarios that are more diverse in nature, and sharing of information and resources among the scenario development research community. © 2008 Elsevier Ltd.
NASA Astrophysics Data System (ADS)
Araya, Tirusew; Chen, Chun-cheng; Jia, Man-ke; Johnson, David; Li, Ruiping; Huang, Ying-ping
2017-02-01
Metal-organic frameworks (MOFs), a new class of porous crystalline materials, have attracted attention because of potential applications in environmental remediation. In this work, an Fe-based MOF, FeBTC (BTC = benzene-1,3,5-tricarboxylic acid), was successfully modified with Amberlite IRA-200 resin to yield a novel heterogeneous photocatalyst, A@FeBTC. The modification resulted in higher photocatalytic activity than FeBTC under the same conditions. After 60 min of visible light illumination (λ ≥ 420 nm), 99% of rhodamine B was degraded. The modification lowers the zeta potential, enhancing charge-based selective adsorption and subsequent photocatalytic degradation of cationic dye pollutants. The composite also improved catalyst stability and recyclability by significantly reducing iron leaching. Photoluminescence studies show that introduction of the resin reduces the recombination rate of photogenerated charge carriers, thereby improving the photocatalytic activity of the composite. Finally, a plausible photocatalytic reaction mechanism is proposed.
Toward a Safety Risk-Based Classification of Unmanned Aircraft
NASA Technical Reports Server (NTRS)
Torres-Pomales, Wilfredo
2016-01-01
There is a trend of growing interest and demand for greater access of unmanned aircraft (UA) to the National Airspace System (NAS) as the ongoing development of UA technology has created the potential for significant economic benefits. However, the lack of a comprehensive and efficient UA regulatory framework has constrained the number and kinds of UA operations that can be performed. This report presents initial results of a study aimed at defining a safety-risk-based UA classification as a plausible basis for a regulatory framework for UA operating in the NAS. Much of the study up to this point has been at a conceptual high level. The report includes a survey of contextual topics, analysis of safety risk considerations, and initial recommendations for a risk-based approach to safe UA operations in the NAS. The next phase of the study will develop and leverage deeper clarity and insight into practical engineering and regulatory considerations for ensuring that UA operations have an acceptable level of safety.
Werling, Donna M; Brand, Harrison; An, Joon-Yong; Stone, Matthew R; Zhu, Lingxue; Glessner, Joseph T; Collins, Ryan L; Dong, Shan; Layer, Ryan M; Markenscoff-Papadimitriou, Eirene; Farrell, Andrew; Schwartz, Grace B; Wang, Harold Z; Currall, Benjamin B; Zhao, Xuefang; Dea, Jeanselle; Duhn, Clif; Erdman, Carolyn A; Gilson, Michael C; Yadav, Rachita; Handsaker, Robert E; Kashin, Seva; Klei, Lambertus; Mandell, Jeffrey D; Nowakowski, Tomasz J; Liu, Yuwen; Pochareddy, Sirisha; Smith, Louw; Walker, Michael F; Waterman, Matthew J; He, Xin; Kriegstein, Arnold R; Rubenstein, John L; Sestan, Nenad; McCarroll, Steven A; Neale, Benjamin M; Coon, Hilary; Willsey, A Jeremy; Buxbaum, Joseph D; Daly, Mark J; State, Matthew W; Quinlan, Aaron R; Marth, Gabor T; Roeder, Kathryn; Devlin, Bernie; Talkowski, Michael E; Sanders, Stephan J
2018-05-01
Genomic association studies of common or rare protein-coding variation have established robust statistical approaches to account for multiple testing. Here we present a comparable framework to evaluate rare and de novo noncoding single-nucleotide variants, insertion/deletions, and all classes of structural variation from whole-genome sequencing (WGS). Integrating genomic annotations at the level of nucleotides, genes, and regulatory regions, we define 51,801 annotation categories. Analyses of 519 autism spectrum disorder families did not identify association with any categories after correction for 4,123 effective tests. Without appropriate correction, biologically plausible associations are observed in both cases and controls. Despite excluding previously identified gene-disrupting mutations, coding regions still exhibited the strongest associations. Thus, in autism, the contribution of de novo noncoding variation is probably modest in comparison to that of de novo coding variants. Robust results from future WGS studies will require large cohorts and comprehensive analytical strategies that consider the substantial multiple-testing burden.
NASA Astrophysics Data System (ADS)
McCoy, D.; Burrows, S. M.; Elliott, S.; Frossard, A. A.; Russell, L. M.; Liu, X.; Ogunro, O. O.; Easter, R. C.; Rasch, P. J.
2014-12-01
Remote marine clouds, such as those over the Southern Ocean, are particularly sensitive to variations in the concentration and chemical composition of aerosols that serve as cloud condensation nuclei (CCN). Observational evidence indicates that the organic content of fine marine aerosol is greatly increased during the biologically active season near strong phytoplankton blooms in certain locations, while being nearly constant in other locations. We have recently developed a novel modeling framework that mechanistically links the organic fraction of submicron sea spray to ocean biogeochemistry (Burrows et al., in discussion, ACPD, 2014; Elliott et al., ERL, 2014). Because of its combination of large phytoplankton blooms and high wind speeds, the Southern Ocean is an ideal location for testing our understanding of the processes driving the enrichment of organics in sea spray aerosol. Comparison of the simulated OM fraction with satellite observations shows that OM fraction is a statistically significant predictor of cloud droplet number concentration over the Southern Ocean. This presentation will focus on predictions from our modeling framework for the Southern Ocean, specifically, the predicted geographic gradients and seasonal cycles in the aerosol organic matter and its functional group composition. The timing and location of a Southern Ocean field campaign will determine its utility in observing the effects of highly localized and seasonal phytoplankton blooms on aerosol composition and clouds. Reference cited: Burrows, S. M., Ogunro, O., Frossard, A. A., Russell, L. M., Rasch, P. J., and Elliott, S.: A physically-based framework for modelling the organic fractionation of sea spray aerosol from bubble film Langmuir equilibria, Atmos. Chem. Phys. Discuss., 14, 5375-5443, doi:10.5194/acpd-14-5375-2014, 2014. Elliott, S., Burrows, S. M., Deal, C., Liu, X., Long, M., Ogunro, O., Russell, L. M., and Wingenter O.. "Prospects for simulating macromolecular surfactant chemistry at the ocean-atmosphere boundary." Environmental Research Letters 9, no. 6 (2014): 064012.
Bays, Rebecca B; Zabrucky, Karen M; Gagne, Phill
2012-01-01
In the current study we examined whether prevalence information and imagery encoding influence participants' general plausibility, personal plausibility, belief, and memory ratings for suggested childhood events. Results showed decreases in general and personal plausibility ratings for low prevalence events when encoding instructions were not elaborate; however, instructions to repeatedly imagine suggested events elicited personal plausibility increases for low-prevalence events, evidence that elaborate imagery negated the effect of our prevalence manipulation. We found no evidence of imagination inflation or false memory construction. We discuss critical differences in researchers' manipulations of plausibility and imagery that may influence results of false memory studies in the literature. In future research investigators should focus on the specific nature of encoding instructions when examining the development of false memories.
Walz, Yvonne; Wegmann, Martin; Dech, Stefan; Vounatsou, Penelope; Poda, Jean-Noël; N'Goran, Eliézer K; Utzinger, Jürg; Raso, Giovanna
2015-11-01
Schistosomiasis is the most widespread water-based disease in sub-Saharan Africa. Transmission is governed by the spatial distribution of specific freshwater snails that act as intermediate hosts and human water contact patterns. Remote sensing data have been utilized for spatially explicit risk profiling of schistosomiasis. We investigated the potential of remote sensing to characterize habitat conditions of parasite and intermediate host snails and discuss the relevance for public health. We employed high-resolution remote sensing data, environmental field measurements, and ecological data to model environmental suitability for schistosomiasis-related parasite and snail species. The model was developed for Burkina Faso using a habitat suitability index (HSI). The plausibility of remote sensing habitat variables was validated using field measurements. The established model was transferred to different ecological settings in Côte d'Ivoire and validated against readily available survey data from school-aged children. Environmental suitability for schistosomiasis transmission was spatially delineated and quantified by seven habitat variables derived from remote sensing data. The strengths and weaknesses highlighted by the plausibility analysis showed that temporal dynamic water and vegetation measures were particularly useful to model parasite and snail habitat suitability, whereas the measurement of water surface temperature and topographic variables did not perform appropriately. The transferability of the model showed significant relations between the HSI and infection prevalence in study sites of Côte d'Ivoire. A predictive map of environmental suitability for schistosomiasis transmission can support measures to gain and sustain control. This is particularly relevant as emphasis is shifting from morbidity control to interrupting transmission. Further validation of our mechanistic model needs to be complemented by field data of parasite- and snail-related fitness. Our model provides a useful tool to monitor the development of new hotspots of potential schistosomiasis transmission based on regularly updated remote sensing data.
Franken, Tom P.; Bremen, Peter; Joris, Philip X.
2014-01-01
Coincidence detection by binaural neurons in the medial superior olive underlies sensitivity to interaural time difference (ITD) and interaural correlation (ρ). It is unclear whether this process is akin to a counting of individual coinciding spikes, or rather to a correlation of membrane potential waveforms resulting from converging inputs from each side. We analyzed spike trains of axons of the cat trapezoid body (TB) and auditory nerve (AN) in a binaural coincidence scheme. ITD was studied by delaying “ipsi-” vs. “contralateral” inputs; ρ was studied by using responses to different noises. We varied the number of inputs; the monaural and binaural threshold and the coincidence window duration. We examined physiological plausibility of output “spike trains” by comparing their rate and tuning to ITD and ρ to those of binaural cells. We found that multiple inputs are required to obtain a plausible output spike rate. In contrast to previous suggestions, monaural threshold almost invariably needed to exceed binaural threshold. Elevation of the binaural threshold to values larger than 2 spikes caused a drastic decrease in rate for a short coincidence window. Longer coincidence windows allowed a lower number of inputs and higher binaural thresholds, but decreased the depth of modulation. Compared to AN fibers, TB fibers allowed higher output spike rates for a low number of inputs, but also generated more monaural coincidences. We conclude that, within the parameter space explored, the temporal patterns of monaural fibers require convergence of multiple inputs to achieve physiological binaural spike rates; that monaural coincidences have to be suppressed relative to binaural ones; and that the neuron has to be sensitive to single binaural coincidences of spikes, for a number of excitatory inputs per side of 10 or less. These findings suggest that the fundamental operation in the mammalian binaural circuit is coincidence counting of single binaural input spikes. PMID:24822037
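A simplified, binning-based version of the coincidence-counting scheme described above is sketched below: spikes from all "ipsilateral" and ITD-shifted "contralateral" input fibres are pooled into short coincidence windows, and an output spike is emitted when the binaural count reaches its threshold or a single side alone reaches the (higher) monaural threshold. The window length, thresholds, and Poisson inputs are assumptions for illustration, not the study's analysis code.

```python
# Simplified, binning-based sketch of a coincidence-counting scheme of the kind
# described above (window, thresholds, and inputs here are hypothetical).
import numpy as np

def coincidence_output(ipsi_trains, contra_trains, window=0.05e-3, duration=1.0,
                       theta_binaural=2, theta_monaural=4, itd=0.0):
    """Count spikes from all ipsi and (ITD-shifted) contra fibres in each
    coincidence window; emit an output spike when either threshold is met."""
    edges = np.arange(0.0, duration, window)
    ipsi = np.histogram(np.concatenate(ipsi_trains), bins=edges)[0]
    contra = np.histogram(np.concatenate(contra_trains) + itd, bins=edges)[0]
    binaural = (ipsi + contra) >= theta_binaural
    monaural = (ipsi >= theta_monaural) | (contra >= theta_monaural)
    return np.count_nonzero(binaural | monaural)   # output spike count

rng = np.random.default_rng(2)

def make_inputs(n_fibres):
    # Poisson-like spike trains over 1 s, ~100 spikes per fibre (synthetic stand-ins).
    return [np.sort(rng.uniform(0.0, 1.0, rng.poisson(100))) for _ in range(n_fibres)]

print(coincidence_output(make_inputs(6), make_inputs(6), itd=0.0))
```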
NASA Astrophysics Data System (ADS)
Horecka, Hannah M.; Thomas, Andrew C.; Weatherbee, Ryan A.
2014-05-01
The Gulf of Maine experiences annual closures of shellfish harvesting due to the accumulation of toxins produced by dinoflagellates of the genus Alexandrium. Factors controlling the timing, location, and magnitude of these events in eastern Maine remain poorly understood. Previous work identified possible linkages between interannual variability of oceanographic variables and shellfish toxicity along the western Maine coastline but no such linkages were evident along the eastern Maine coast in the vicinity of Cobscook Bay, where strong tidal mixing tends to reduce seasonal variability in oceanographic properties. Using 21 years (1985-2005) of shellfish toxicity data, interannual variability in two metrics of annual toxicity, maximum magnitude and total annual toxicity, from stations in the Cobscook Bay region are examined for relationships to a suite of available environmental variables. Consistent with earlier work, no (or only weak) correlations were found between toxicity and oceanographic variables, even those very proximate to the stations such as local sea surface temperature. Similarly no correlations were evident between toxicity and air temperature, precipitation or relative humidity. The data suggest possible connections to local river discharge, but plausible mechanisms are not obvious. Correlations between toxicity and two variables indicative of local meteorological conditions, dew point and atmospheric pressure, both suggest a link between increased toxicity in these eastern Maine stations and weather conditions characterized by clearer skies/drier air (or less stormy/humid conditions). As no correlation of opposite sign was evident between toxicity and local precipitation, one plausible link is through light availability and its positive impact on phytoplankton production in this persistently foggy section of coast. These preliminary findings point to both the value of maintaining long-term shellfish toxicity sampling and a need for inclusion of weather variability in future modeling studies aimed at development of a more mechanistic understanding of factors controlling interannual differences in eastern Gulf of Maine shellfish toxicity.
How many kinds of reasoning? Inference, probability, and natural language semantics.
Lassiter, Daniel; Goodman, Noah D
2015-03-01
The "new paradigm" unifying deductive and inductive reasoning in a Bayesian framework (Oaksford & Chater, 2007; Over, 2009) has been claimed to be falsified by results which show sharp differences between reasoning about necessity vs. plausibility (Heit & Rotello, 2010; Rips, 2001; Rotello & Heit, 2009). We provide a probabilistic model of reasoning with modal expressions such as "necessary" and "plausible" informed by recent work in formal semantics of natural language, and show that it predicts the possibility of non-linear response patterns which have been claimed to be problematic. Our model also makes a strong monotonicity prediction, while two-dimensional theories predict the possibility of reversals in argument strength depending on the modal word chosen. Predictions were tested using a novel experimental paradigm that replicates the previously-reported response patterns with a minimal manipulation, changing only one word of the stimulus between conditions. We found a spectrum of reasoning "modes" corresponding to different modal words, and strong support for our model's monotonicity prediction. This indicates that probabilistic approaches to reasoning can account in a clear and parsimonious way for data previously argued to falsify them, as well as new, more fine-grained, data. It also illustrates the importance of careful attention to the semantics of language employed in reasoning experiments. Copyright © 2014 Elsevier B.V. All rights reserved.
Carroll, Rebecca; Meis, Markus; Schulte, Michael; Vormann, Matthias; Kießling, Jürgen; Meister, Hartmut
2015-02-01
To report the development of a standardized German version of a reading span test (RST) with a dual-task design. Special attention was paid to psycholinguistic control of the test items and time-sensitive scoring. We aim to establish our RST version for use in determining an individual's working memory capacity in the framework of hearing research in German contexts. RST stimuli were controlled and pretested for psycholinguistic factors. The RST task was to read sentences, quickly determine their plausibility, and later recall certain words to determine a listener's individual reading span. RST results were correlated with outcomes of additional sentence-in-noise tests measured in an aided and an unaided listening condition, each at two reception thresholds. Item plausibility was pre-determined by 28 native German-speaking participants. An additional 62 listeners (45-86 years, M = 69.8) with mild-to-moderate hearing loss were tested for speech intelligibility and reading span in a multicenter study. The reading span test significantly correlated with speech intelligibility at both speech reception thresholds in the aided listening condition. Our German RST is standardized with respect to psycholinguistic construction principles of the stimuli, and is a cognitive correlate of intelligibility in a German matrix speech-in-noise test.
NASA Astrophysics Data System (ADS)
Koch, Jonas; Nowak, Wolfgang
2013-04-01
At many hazardous waste sites and accidental spills, dense non-aqueous phase liquids (DNAPLs) such as TCE, PCE, or TCA have been released into the subsurface. Once a DNAPL is released into the subsurface, it serves as a persistent source of dissolved-phase contamination. In chronological order, the DNAPL migrates through the porous medium and penetrates the aquifer, it forms a complex pattern of immobile DNAPL saturation, it dissolves into the groundwater and forms a contaminant plume, and it slowly depletes and biodegrades in the long term. In industrial countries the number of such contaminated sites is tremendously high, to the point that a ranking from most risky to least risky is advisable. Such a ranking helps to decide whether a site needs to be remediated or may be left to natural attenuation. Both the ranking and the designing of proper remediation or monitoring strategies require a good understanding of the relevant physical processes and their inherent uncertainty. To this end, we conceptualize a probabilistic simulation framework that estimates probability density functions of mass discharge, source depletion time, and critical concentration values at crucial target locations. Furthermore, it supports the inference of contaminant source architectures from arbitrary site data. As an essential novelty, the mutual dependencies of the key parameters and interacting physical processes are taken into account throughout the whole simulation. In an uncertain and heterogeneous subsurface setting, we identify three key parameter fields: the local velocities, the hydraulic permeabilities and the DNAPL phase saturations. Obviously, these parameters depend on each other during DNAPL infiltration, dissolution and depletion. In order to highlight the importance of these mutual dependencies and interactions, we present results of several model setups where we vary the physical and stochastic dependencies of the input parameters and simulated processes. Under these changes, the probability density functions demonstrate strong statistical shifts in their expected values and in their uncertainty. Considering the uncertainties of all key parameters but neglecting their interactions overestimates the output uncertainty. However, consistently using all available physical knowledge when assigning input parameters and simulating all relevant interactions of the involved processes reduces the output uncertainty significantly back down to useful and plausible ranges. When using our framework in an inverse setting, omitting a parameter dependency within a crucial physical process would lead to physically meaningless identified parameters. Thus, we conclude that the additional complexity we propose is both necessary and adequate. Overall, our framework provides a tool for reliable and plausible prediction, risk assessment, and model-based decision support for DNAPL-contaminated sites.
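The role of parameter dependencies can be illustrated with a toy Monte Carlo sketch like the one below, in which a crude proxy for source depletion time is derived from log-permeability and log-saturation sampled either independently or with a positive correlation. The distributions, the correlation value, and the proxy itself are assumptions chosen only to show how neglecting dependencies inflates output uncertainty; none of this reproduces the study's simulation framework.

```python
# Toy Monte Carlo sketch (not the study's model) illustrating the point above:
# treating permeability and DNAPL saturation as independent versus correlated
# changes the spread of a derived quantity such as source depletion time.
import numpy as np

rng = np.random.default_rng(4)
n = 50_000
mean, sd = np.array([-23.0, -1.5]), np.array([1.0, 0.4])   # ln K [m^2], ln saturation (assumed)

def depletion_time(ln_k, ln_s):
    k, s = np.exp(ln_k), np.exp(ln_s)
    return s / k    # toy proxy: more DNAPL and lower permeability -> slower depletion

for rho, label in ((0.0, "independent"), (0.7, "correlated")):
    cov = np.array([[sd[0]**2, rho*sd[0]*sd[1]], [rho*sd[0]*sd[1], sd[1]**2]])
    ln_k, ln_s = rng.multivariate_normal(mean, cov, size=n).T
    t = depletion_time(ln_k, ln_s)
    print(label, f"CV of depletion-time proxy: {t.std() / t.mean():.2f}")
```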
Modelling default and likelihood reasoning as probabilistic
NASA Technical Reports Server (NTRS)
Buntine, Wray
1990-01-01
A probabilistic analysis of plausible reasoning about defaults and about likelihood is presented. 'Likely' and 'by default' are in fact treated as duals in the same sense as 'possibility' and 'necessity'. To model these four forms probabilistically, a logic QDP and its quantitative counterpart DP are derived that allow qualitative and corresponding quantitative reasoning. Consistency and consequence results for subsets of the logics are given that require at most a quadratic number of satisfiability tests in the underlying propositional logic. The quantitative logic shows how to track the propagation error inherent in these reasoning forms. The methodology and sound framework of the system highlight their approximate nature, the dualities, and the need for complementary reasoning about relevance.
Photostability of 2D Organic-Inorganic Hybrid Perovskites
Wei, Yi; Audebert, Pierre; Galmiche, Laurent; Lauret, Jean-Sébastien; Deleporte, Emmanuelle
2014-01-01
We analyze the behavior of a series of newly synthesized (R-NH3)2PbX4 perovskites and, in particular, discuss the possible causes of their degradation under UV illumination. Experimental results show that the degradation process depends strongly on the molecular components: not only the inorganic part but also the chemical structure of the organic moieties plays an important role in the bleaching and photochemical reactions that tend to destroy the perovskites' luminescent framework. In addition, we find that the spatial arrangement in the crystal also influences photostability. Following these trends, we propose a plausible mechanism for the photodegradation of the films and also introduce options for improved stability. PMID:28788706
A Mechanistic-Based Healing Model for Self-Healing Glass Seals Used in Solid Oxide Fuel Cells
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xu, Wei; Sun, Xin; Stephens, Elizabeth V.
The use of self-healing glass as a hermetic seal is a recent advancement in sealing technology development for planar solid oxide fuel cells (SOFCs). Because of its capability of restoring mechanical properties at elevated temperatures, the self-healing glass seal is expected to provide high reliability in maintaining the long-term structural integrity and functionality of SOFCs. In order to accommodate the design and to evaluate the effectiveness of such engineering seals under various thermo-mechanical operating conditions, a computational modeling framework needs to be developed to accurately capture and predict the healing behavior of the glass material. In the present work, a mechanistic-based two-stage model was developed to study the stress- and temperature-dependent crack healing of self-healing glass materials. The model was first calibrated by experimental measurements combined with kinetic Monte Carlo (kMC) simulation results and then implemented into finite element analysis (FEA). The effects of various factors, i.e., stress, temperature, and crack morphology, on the healing behavior of the glass were investigated and discussed.
PI(5)P Regulates Autophagosome Biogenesis
Vicinanza, Mariella; Korolchuk, Viktor I.; Ashkenazi, Avraham; Puri, Claudia; Menzies, Fiona M.; Clarke, Jonathan H.; Rubinsztein, David C.
2015-01-01
Phosphatidylinositol 3-phosphate (PI(3)P), the product of class III PI3K VPS34, recruits specific autophagic effectors, like WIPI2, during the initial steps of autophagosome biogenesis and thereby regulates canonical autophagy. However, mammalian cells can produce autophagosomes through enigmatic noncanonical VPS34-independent pathways. Here we show that PI(5)P can regulate autophagy via PI(3)P effectors and thereby identify a mechanistic explanation for forms of noncanonical autophagy. PI(5)P synthesis by the phosphatidylinositol 5-kinase PIKfyve was required for autophagosome biogenesis, and increased levels of PI(5)P stimulated autophagy and reduced the levels of autophagic substrates. Inactivation of VPS34 impaired recruitment of WIPI2 and DFCP1 to autophagic precursors, reduced ATG5-ATG12 conjugation, and compromised autophagosome formation. However, these phenotypes were rescued by PI(5)P in VPS34-inactivated cells. These findings provide a mechanistic framework for alternative VPS34-independent autophagy-initiating pathways, like glucose starvation, and unravel a cytoplasmic function for PI(5)P, which previously has been linked predominantly to nuclear roles. PMID:25578879
Modelling and observing the role of wind in Anopheles population dynamics around a reservoir.
Endo, Noriko; Eltahir, Elfatih A B
2018-01-25
Wind conditions, as well as other environmental conditions, are likely to influence malaria transmission through the behaviours of Anopheles mosquitoes, especially around water-resource reservoirs. Wind-induced waves in a reservoir impose mortality on aquatic-stage mosquitoes. Mosquitoes' host-seeking activity is also influenced by wind through the dispersion of CO2. However, no malaria transmission model exists to date that simulates those impacts of wind mechanistically. A modelling framework is developed for simulating three important effects of wind on the behaviours of mosquitoes: attraction of adult mosquitoes through dispersion of CO2 (CO2 attraction), advection of adult mosquitoes (advection), and aquatic-stage mortality due to wind-induced surface waves (waves). The framework was incorporated in a mechanistic malaria transmission simulator, HYDREMATS. The performance of the extended simulator was compared with the observed population dynamics of Anopheles mosquitoes at a village adjacent to the Koka Reservoir in Ethiopia. The observed population dynamics of the Anopheles mosquitoes were reproduced with reasonable accuracy by HYDREMATS when the representation of the wind effects was included; without the wind model, HYDREMATS failed to do so. Offshore wind explained an increase in the Anopheles population that cannot be expected from other environmental conditions alone. Around large water bodies such as reservoirs, the role of wind in the dynamics of the Anopheles population, and hence in malaria transmission, can be significant. Modelling the impacts of wind on the behaviours of Anopheles mosquitoes aids in reproducing the seasonality of malaria transmission and in estimating the risk of malaria around reservoirs.
A mechanistic hypothesis of the factors that enhance vulnerability to nicotine use in females
O'Dell, Laura E.; Torres, Oscar V.
2013-01-01
Women are more vulnerable to tobacco use than men. This review proposes a unifying hypothesis that females experience greater rewarding effects of nicotine and more intense stress produced by withdrawal than males. We also provide a neural framework whereby estrogen promotes greater rewarding effects of nicotine in females via enhanced dopamine release in the nucleus accumbens (NAcc). During withdrawal, we suggest that corticotropin-releasing factor (CRF) stress systems are sensitized and promote a greater suppression of dopamine release in the NAcc of females versus males. Taken together, females display enhanced nicotine reward via estrogen and amplified effects of withdrawal via stress systems. Although this framework focuses on sex differences in adult rats, it is also applied to adolescent females, who display enhanced rewarding effects of nicotine but reduced effects of withdrawal from this drug. Since females experience strong rewarding effects of nicotine, a clinical implication of our hypothesis is that specific strategies to prevent smoking initiation among females are critical. Also, anxiolytic medications may be more effective in females who experience intense stress during withdrawal. Furthermore, medications that target withdrawal should not be applied in a unilateral manner across age and sex, given that nicotine withdrawal is lower during adolescence. This review highlights key factors that promote nicotine use in females, and future studies on sex-dependent interactions of stress and reward systems are needed to test our mechanistic hypotheses. Future studies in this area will have important translational value toward reducing health disparities produced by nicotine use in females. PMID:23684991
Aladjov, Hristo; Ankley, Gerald; Byrne, Hugh J.; de Knecht, Joop; Heinzle, Elmar; Klambauer, Günter; Landesmann, Brigitte; Luijten, Mirjam; MacKay, Cameron; Maxwell, Gavin; Meek, M. E. (Bette); Paini, Alicia; Perkins, Edward; Sobanski, Tomasz; Villeneuve, Dan; Waters, Katrina M.; Whelan, Maurice
2017-01-01
Efforts are underway to transform regulatory toxicology and chemical safety assessment from a largely empirical science based on direct observation of apical toxicity outcomes in whole organism toxicity tests to a predictive one in which outcomes and risk are inferred from accumulated mechanistic understanding. The adverse outcome pathway (AOP) framework provides a systematic approach for organizing knowledge that may support such inference. Likewise, computational models of biological systems at various scales provide another means and platform to integrate current biological understanding to facilitate inference and extrapolation. We argue that the systematic organization of knowledge into AOP frameworks can inform and help direct the design and development of computational prediction models that can further enhance the utility of mechanistic and in silico data for chemical safety assessment. This concept was explored as part of a workshop on AOP-Informed Predictive Modeling Approaches for Regulatory Toxicology held September 24–25, 2015. Examples of AOP-informed model development and its application to the assessment of chemicals for skin sensitization and multiple modes of endocrine disruption are provided. The role of problem formulation, not only as a critical phase of risk assessment, but also as guide for both AOP and complementary model development is described. Finally, a proposal for actively engaging the modeling community in AOP-informed computational model development is made. The contents serve as a vision for how AOPs can be leveraged to facilitate development of computational prediction models needed to support the next generation of chemical safety assessment. PMID:27994170
A Green Solvent Induced DNA Package
NASA Astrophysics Data System (ADS)
Satpathi, Sagar; Sengupta, Abhigyan; Hridya, V. M.; Gavvala, Krishna; Koninti, Raj Kumar; Roy, Bibhisan; Hazra, Partha
2015-03-01
Mechanistic details of DNA compaction are an essential blueprint for gene regulation in living organisms. Many in vitro studies have been carried out using various compaction agents. However, these compacting agents may exert cytotoxic effects on cells. To minimize this problem, several studies have been performed, but a green solvent, i.e., a room-temperature ionic liquid, has not previously been explored as a DNA compaction agent. To the best of our knowledge, this is the first report showing that guanidinium tris(pentafluoroethyl)trifluorophosphate (Gua-IL) acts as a DNA compacting agent. The compaction ability of Gua-IL has been verified by different spectroscopic techniques, such as steady-state emission, circular dichroism, dynamic light scattering, and UV melting. Notably, we have extensively probed this compaction by Gua-IL through field emission scanning electron microscopy (FE-SEM) and fluorescence microscopy images. We also discuss the plausible mechanism of DNA compaction by Gua-IL. Our results suggest that Gua-IL forms a micelle-like self-aggregate above a certain concentration (>=1 mM), which instigates the compaction process. This study divulges the specific details of the DNA compaction mechanism by a new class of compaction agent, which is highly biodegradable and eco-friendly in nature.
Promiscuity in the Enzymatic Catalysis of Phosphate and Sulfate Transfer
2016-01-01
The enzymes that facilitate phosphate and sulfate hydrolysis are among the most proficient natural catalysts known to date. Interestingly, a large number of these enzymes are promiscuous catalysts that exhibit both phosphatase and sulfatase activities in the same active site and, on top of that, have also been demonstrated to efficiently catalyze the hydrolysis of other additional substrates with varying degrees of efficiency. Understanding the factors that underlie such multifunctionality is crucial both for understanding functional evolution in enzyme superfamilies and for the development of artificial enzymes. In this Current Topic, we have primarily focused on the structural and mechanistic basis for catalytic promiscuity among enzymes that facilitate both phosphoryl and sulfuryl transfer in the same active site, while comparing this to how catalytic promiscuity manifests in other promiscuous phosphatases. We have also drawn on the large number of experimental and computational studies of selected model systems in the literature to explore the different features driving the catalytic promiscuity of such enzymes. Finally, on the basis of this comparative analysis, we probe the plausible origins and determinants of catalytic promiscuity in enzymes that catalyze phosphoryl and sulfuryl transfer. PMID:27187273
Increasing elevation of fire in the Sierra Nevada and implications for forest change
Schwartz, Mark W.; Butt, Nathalie; Dolanc, Christopher R.; Holguin, Andrew; Moritz, Max A.; North, Malcolm P.; Safford, Hugh D.; Stephenson, Nathan L.; Thorne, James H.; van Mantgem, Phillip J.
2015-01-01
Fire in high-elevation forest ecosystems can have severe impacts on forest structure, function, and biodiversity. Using a 105-year data set, we found an increasing elevational extent of fires in the Sierra Nevada and pose five hypotheses to explain this pattern. Beyond the recognized pattern of increasing fire frequency in the Sierra Nevada since the late 20th century, we find that the upper elevational extent of those fires has also been increasing. Factors such as fire-season climate and fuel build-up are recognized potential drivers of changes in fire regimes. Patterns of warming climate and increasing stand density are consistent with both the direction and magnitude of the increasing elevation of wildfire. Reduction in high-elevation wildfire suppression and increasing ignition frequencies may also contribute to the observed pattern. Historical biases in fire reporting are recognized, but are not likely to explain the observed patterns. The four plausible mechanistic hypotheses (changes in fire management, climate, fuels, and ignitions) are not mutually exclusive and likely have synergistic interactions that may explain the observed changes. Irrespective of mechanism, the observed pattern of increasing occurrence of fire in these subalpine forests may have significant impacts on their resilience to changing climatic conditions.
Studies of thin water films and relevance to the heterogeneous nucleation of ice
NASA Astrophysics Data System (ADS)
Ochshorn, Eli
The research that I will present in this dissertation concerns qualitative factors relevant to thin water films and ice nucleation. The immediate goal is not to develop a precise quantitative theory of ice nucleation. Instead, the focus is on characterizing some molecular properties (e.g., bond strengths, bond orientations, range of surface effects, etc.) of freezing catalysts and interfacial water over a range of temperatures relevant to the ice nucleation process (i.e., 20 to -20 °C). From this, we can evaluate the plausibility of different mechanistic freezing hypotheses through comparison with experiment. In all studies, I use Fourier transform infrared spectroscopy to study the intermolecular details of water and a surface species of interest. The dissertation is arranged with an introductory chapter, which primarily serves to place the research within the context of the field, then three chapters containing original research, each of which is a self-contained study that has either already been published or is currently under consideration for publication in a peer-reviewed journal. Finally, an appendix at the end provides some additional details that have not been included in the articles.
Interactive natural language acquisition in a multi-modal recurrent neural architecture
NASA Astrophysics Data System (ADS)
Heinrich, Stefan; Wermter, Stefan
2018-01-01
For the complex human brain that enables us to communicate in natural language, we have gained a good understanding of the principles underlying language acquisition and processing, knowledge about sociocultural conditions, and insights into activity patterns in the brain. However, we do not yet understand the behavioural and mechanistic characteristics of natural language processing, or how mechanisms in the brain allow us to acquire and process language. Bridging insights from behavioural psychology and neuroscience, the goal of this paper is to contribute a computational understanding of the characteristics that favour language acquisition. Accordingly, we provide concepts and refinements in cognitive modelling regarding principles and mechanisms in the brain and propose a neurocognitively plausible model for embodied language acquisition from real-world interaction of a humanoid robot with its environment. In particular, the architecture consists of a continuous-time recurrent neural network in which different parts have different leakage characteristics and thus operate on multiple timescales for every modality, with the higher-level nodes of all modalities associated into cell assemblies. The model is capable of learning language production grounded in both temporally dynamic somatosensation and vision, and it features hierarchical concept abstraction, concept decomposition, multi-modal integration, and self-organisation of latent representations.
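As a rough illustration of the multiple-timescale idea, the sketch below implements a generic leaky continuous-time recurrent network in which one group of units integrates quickly and another slowly. The group sizes, time constants, and random weights are assumptions for illustration, not the published architecture.

```python
import numpy as np

def ctrnn_step(h, x, w_rec, w_in, tau, dt=1.0):
    """One Euler step of a leaky CTRNN: tau * dh/dt = -h + w_rec @ tanh(h) + w_in @ x."""
    return h + dt * (-h + w_rec @ np.tanh(h) + w_in @ x) / tau

rng = np.random.default_rng(0)
n_fast, n_slow, n_in = 20, 10, 5
n = n_fast + n_slow
tau = np.concatenate([np.full(n_fast, 2.0),     # fast units (e.g., sensory timescale)
                      np.full(n_slow, 50.0)])   # slow units (e.g., contextual timescale)
w_rec = rng.normal(0.0, 0.1, (n, n))
w_in = rng.normal(0.0, 0.1, (n, n_in))

h = np.zeros(n)
for _ in range(200):                            # drive the network with random input
    h = ctrnn_step(h, rng.normal(size=n_in), w_rec, w_in, tau)

print("fast-unit activity std:", np.std(h[:n_fast]).round(3),
      "| slow-unit activity std:", np.std(h[n_fast:]).round(3))
```

The fast units track the fluctuating input while the slow units smooth over it, which is the basic mechanism by which differing leakage lets one network hold both rapidly changing sensory signals and slowly changing context.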
Rothnie, Alice; Clarke, Anthony R.; Kuzmic, Petr; Cameron, Angus; Smith, Corinne J.
2011-01-01
An essential stage in endocytic coated vesicle recycling is the dissociation of clathrin from the vesicle coat by the molecular chaperone, 70-kDa heat-shock cognate protein (Hsc70), and the J-domain-containing protein, auxilin, in an ATP-dependent process. We present a detailed mechanistic analysis of clathrin disassembly catalyzed by Hsc70 and auxilin, using loss of perpendicular light scattering to monitor the process. We report that a single auxilin per clathrin triskelion is required for maximal rate of disassembly, that ATP is hydrolyzed at the same rate that disassembly occurs, and that three ATP molecules are hydrolyzed per clathrin triskelion released. Stopped-flow measurements revealed a lag phase in which the scattering intensity increased owing to association of Hsc70 with clathrin cages followed by serial rounds of ATP hydrolysis prior to triskelion removal. Global fit of stopped-flow data to several physically plausible mechanisms showed the best fit to a model in which sequential hydrolysis of three separate ATP molecules is required for the eventual release of a triskelion from the clathrin–auxilin cage. PMID:21482805
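The qualitative signature of the proposed mechanism, a lag followed by release after three sequential hydrolysis events, can be reproduced with a minimal linear kinetic chain. The sketch below uses an arbitrary per-step rate constant and simple Euler integration; it illustrates sequential kinetics in general and is not a fit to the stopped-flow data.

```python
import numpy as np

k = 0.5                               # s^-1, assumed rate for each hydrolysis step
dt, t_end = 0.01, 20.0
s = np.array([1.0, 0.0, 0.0, 0.0])    # cage, +1 ATP, +2 ATP, triskelion released

released = []
for _ in range(int(t_end / dt)):
    ds = np.array([-k * s[0],
                   k * s[0] - k * s[1],
                   k * s[1] - k * s[2],
                   k * s[2]])
    s = s + dt * ds
    released.append(s[3])

# Three sequential steps give a sigmoidal (lagged) release curve rather than a
# single exponential, mirroring the lag phase seen in light-scattering traces.
print(f"fraction released at t = 1 s:  {released[int(1/dt) - 1]:.3f}")
print(f"fraction released at t = 10 s: {released[int(10/dt) - 1]:.3f}")
```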
Rhomberg, Lorenz R.; Goodman, Julie E.; Prueitt, Robyn L.
2013-01-01
Styrene was listed as “reasonably anticipated to be a human carcinogen” in the twelfth edition of the National Toxicology Program's Report on Carcinogens based on what we contend are erroneous findings of limited evidence of carcinogenicity in humans, sufficient evidence of carcinogenicity in experimental animals, and supporting mechanistic data. The epidemiology studies show no consistent increased incidence of, or mortality from, any type of cancer. In animal studies, increased incidence rates of mostly benign tumors have been observed only in certain strains of one species (mice) and at one tissue site (lung). The lack of concordance of tumor incidence and tumor type among animals (even within the same species) and humans indicates that there has been no particular cancer consistently observed among all available studies. The only plausible mechanism for styrene-induced carcinogenesis—a non-genotoxic mode of action that is specific to the mouse lung—is not relevant to humans. As a whole, the evidence does not support the characterization of styrene as “reasonably anticipated to be a human carcinogen,” and styrene should not be listed in the Report on Carcinogens. PMID:23335843
Lavado, Nieves; García de la Concepción, Juan; Babiano, Reyes; Cintas, Pedro
2018-03-15
The condensation of cyanamide and glyoxal, two well-known prebiotic monomers, in an aqueous phase has been investigated in great detail, demonstrating the formation of oligomeric species of varied structure, though consistent with generalizable patterns. This chemistry involving structurally simple substances also illustrates the possibility of building molecular complexity under prebiotically plausible conditions, not only on Earth, but also in extraterrestrial scenarios. We show that cyanamide-glyoxal reactions in water lead to mixtures comprising both acyclic and cyclic fragments, largely based on fused five- and six-membered rings, which can be predicted by computation. Remarkably, such a mixture could be identified using high-resolution electrospray ionization (ESI) mass spectrometry and spectroscopic methods. A few mechanistic pathways can be postulated, most involving the intermediacy of glyoxal cyanoimine and further chain growth, thus increasing the diversity of the observed products. This rationale is supported by theoretical analyses with clear-cut identification of all of the stationary points and transition-state structures. The properties and structural differences of oligomers obtained under thermodynamic conditions in water as opposed to those isolated by precipitation from organic media are also discussed. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Modelling Delta-Notch perturbations during zebrafish somitogenesis.
Murray, Philip J; Maini, Philip K; Baker, Ruth E
2013-01-15
The discovery over the last 15 years of molecular clocks and gradients in the pre-somitic mesoderm of numerous vertebrate species has added significant weight to Cooke and Zeeman's 'clock and wavefront' model of somitogenesis, in which a travelling wavefront determines the spatial position of somite formation and the somitogenesis clock controls periodicity (Cooke and Zeeman, 1976). However, recent high-throughput measurements of spatiotemporal patterns of gene expression in different zebrafish mutant backgrounds allow further quantitative evaluation of the clock and wavefront hypothesis. In this study we describe how our recently proposed model, in which oscillator coupling drives the propagation of an emergent wavefront, can be used to provide mechanistic and testable explanations for the following observed phenomena in zebrafish embryos: (a) the variation in somite measurements across a number of zebrafish mutants; (b) the delayed formation of somites and the formation of 'salt and pepper' patterns of gene expression upon disruption of oscillator coupling; and (c) spatial correlations in the 'salt and pepper' patterns in Delta-Notch mutants. In light of our results, we propose a number of plausible experiments that could be used to further test the model. Copyright © 2012 Elsevier Inc. All rights reserved.
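A minimal way to see how local coupling keeps neighbouring cells in phase, and how removing it produces salt-and-pepper-like desynchronisation, is a ring of Kuramoto-type phase oscillators with noise. The sketch below is a generic illustration with assumed parameters (ring geometry, noise level, coupling strength), not the published somitogenesis model.

```python
import numpy as np

def simulate(k, n_cells=60, steps=4000, dt=0.01, omega=2*np.pi/30.0, noise=0.3, seed=1):
    """Ring of phase oscillators with nearest-neighbour coupling k and phase noise."""
    rng = np.random.default_rng(seed)
    theta = rng.normal(0.0, 0.1, n_cells)              # start nearly synchronised
    for _ in range(steps):
        left, right = np.roll(theta, 1), np.roll(theta, -1)
        coupling = 0.5 * k * (np.sin(left - theta) + np.sin(right - theta))
        theta = theta + dt * (omega + coupling) + np.sqrt(dt) * noise * rng.normal(size=n_cells)
    return theta

order = lambda th: np.abs(np.mean(np.exp(1j * th)))    # Kuramoto order parameter

print(f"coupled (k=2):   order parameter = {order(simulate(k=2.0)):.2f}")
print(f"uncoupled (k=0): order parameter = {order(simulate(k=0.0)):.2f}")
```

With coupling intact the order parameter stays high; setting it to zero, a crude stand-in for Delta-Notch disruption, lets noise scatter the phases into a spatially mixed pattern.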
Julien, Elizabeth; Boobis, Alan R; Olin, Stephen S
2009-09-01
The ILSI Research Foundation convened a cross-disciplinary working group to examine current approaches for assessing dose-response and identifying safe levels of intake or exposure for four categories of bioactive agents: food allergens, nutrients, pathogenic microorganisms, and environmental chemicals. This effort generated a common analytical framework, the Key Events Dose-Response Framework (KEDRF), for systematically examining key events that occur between the initial dose of a bioactive agent and the effect of concern. Individual key events are considered with regard to factors that influence the dose-response relationship and factors that underlie variability in that relationship. This approach illuminates the connection between the processes occurring at the level of fundamental biology and the outcomes observed at the individual and population levels. Thus, it promotes an evidence-based approach for using mechanistic data to reduce reliance on default assumptions, to quantify variability, and to better characterize biological thresholds. This paper provides an overview of the KEDRF and introduces a series of four companion papers that illustrate initial application of the approach to a range of bioactive agents.
Integrated presentation of ecological risk from multiple stressors
NASA Astrophysics Data System (ADS)
Goussen, Benoit; Price, Oliver R.; Rendal, Cecilie; Ashauer, Roman
2016-10-01
Current environmental risk assessments (ERA) do not account explicitly for ecological factors (e.g. species composition, temperature or food availability) and multiple stressors. Assessing mixtures of chemical and ecological stressors is needed as well as accounting for variability in environmental conditions and uncertainty of data and models. Here we propose a novel probabilistic ERA framework to overcome these limitations, which focusses on visualising assessment outcomes by constructing and interpreting prevalence plots as a quantitative prediction of risk. Key components include environmental scenarios that integrate exposure and ecology, and ecological modelling of relevant endpoints to assess the effect of a combination of stressors. Our illustrative results demonstrate the importance of regional differences in environmental conditions and the confounding interactions of stressors. Using this framework and prevalence plots provides a risk-based approach that combines risk assessment and risk management in a meaningful way and presents a truly mechanistic alternative to the threshold approach. Even whilst research continues to improve the underlying models and data, regulators and decision makers can already use the framework and prevalence plots. The integration of multiple stressors, environmental conditions and variability makes ERA more relevant and realistic.
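One plausible reading of a prevalence-style output is the fraction of probabilistic model runs, per environmental scenario, in which an ecological endpoint falls below a critical value. The sketch below illustrates that reading with entirely synthetic scenarios, a toy effect model, and assumed parameter values; it is not the framework's actual model chain.

```python
import numpy as np

rng = np.random.default_rng(7)
n_runs = 2000
scenarios = {"warm/low-food": (22.0, 0.4), "cool/high-food": (12.0, 0.9)}

def population_growth(temp, food, exposure):
    """Toy ecological endpoint: growth rate reduced by chemical exposure and food limitation."""
    base = 0.05 + 0.002 * (temp - 15.0)
    return base * food * np.exp(-exposure / 5.0)

for name, (temp, food) in scenarios.items():
    exposure = rng.lognormal(mean=0.5, sigma=0.8, size=n_runs)   # uncertain exposure
    growth = population_growth(temp, food, exposure)
    prevalence = np.mean(growth < 0.02)    # fraction of runs below a critical endpoint
    print(f"{name}: predicted prevalence of unacceptable effect = {prevalence:.2f}")
```

Plotting such prevalences across scenarios and exposure levels gives a risk surface rather than a single pass/fail threshold, which is the spirit of the approach described above.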
Benigni, Romualdo; Bossa, Cecilia
2008-01-01
In the past decades, chemical carcinogenicity has been the object of mechanistic studies that have been translated into valuable experimental (e.g., the Salmonella assays system) and theoretical (e.g., compilations of structure alerts for chemical carcinogenicity) models. These findings remain the basis of the science and regulation of mutagens and carcinogens. Recent advances in the organization and treatment of large databases consisting of both biological and chemical information nowadays allows for a much easier and more refined view of data. This paper reviews recent analyses on the predictive performance of various lists of structure alerts, including a new compilation of alerts that combines previous work in an optimized form for computer implementation. The revised compilation is part of the Toxtree 1.50 software (freely available from the European Chemicals Bureau website). The use of structural alerts for the chemical biological profiling of a large database of Salmonella mutagenicity results is also reported. Together with being a repository of the science on the chemical biological interactions at the basis of chemical carcinogenicity, the SAs have a crucial role in practical applications for risk assessment, for: (a) description of sets of chemicals; (b) preliminary hazard characterization; (c) formation of categories for e.g., regulatory purposes; (d) generation of subsets of congeneric chemicals to be analyzed subsequently with QSAR methods; (e) priority setting. An important aspect of SAs as predictive toxicity tools is that they derive directly from mechanistic knowledge. The crucial role of mechanistic knowledge in the process of applying (Q)SAR considerations to risk assessment should be strongly emphasized. Mechanistic knowledge provides a ground for interaction and dialogue between model developers, toxicologists and regulators, and permits the integration of the (Q)SAR results into a wider regulatory framework, where different types of evidence and data concur or complement each other as a basis for making decisions and taking actions.
Huber, John H; Childs, Marissa L; Caldwell, Jamie M; Mordecai, Erin A
2018-05-01
Dengue, chikungunya, and Zika virus epidemics transmitted by Aedes aegypti mosquitoes have recently (re)emerged and spread throughout the Americas, Southeast Asia, the Pacific Islands, and elsewhere. Understanding how environmental conditions affect epidemic dynamics is critical for predicting and responding to the geographic and seasonal spread of disease. Specifically, we lack a mechanistic understanding of how seasonal variation in temperature affects epidemic magnitude and duration. Here, we develop a dynamic disease transmission model for dengue virus and Aedes aegypti mosquitoes that integrates mechanistic, empirically parameterized, and independently validated mosquito and virus trait thermal responses under seasonally varying temperatures. We examine the influence of seasonal temperature mean, variation, and temperature at the start of the epidemic on disease dynamics. We find that at both constant and seasonally varying temperatures, warmer temperatures at the start of epidemics promote more rapid epidemics due to faster burnout of the susceptible population. By contrast, intermediate temperatures (24-25°C) at epidemic onset produced the largest epidemics in both constant and seasonally varying temperature regimes. When seasonal temperature variation was low, 25-35°C annual average temperatures produced the largest epidemics, but this range shifted to cooler temperatures as seasonal temperature variation increased (analogous to previous results for diurnal temperature variation). Tropical and sub-tropical cities such as Rio de Janeiro, Fortaleza, and Salvador, Brazil; Cali, Cartagena, and Barranquilla, Colombia; Delhi, India; Guangzhou, China; and Manila, Philippines have mean annual temperatures and seasonal temperature ranges that produced the largest epidemics. However, more temperate cities like Shanghai, China had high epidemic suitability because large seasonal variation offset moderate annual average temperatures. By accounting for seasonal variation in temperature, the model provides a baseline for mechanistically understanding environmental suitability for virus transmission by Aedes aegypti. Overlaying the impact of human activities and socioeconomic factors onto this mechanistic temperature-dependent framework is critical for understanding likelihood and magnitude of outbreaks.
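The qualitative point that seasonal temperature forcing changes epidemic size can be illustrated with a deliberately simplified sketch: a Briere-type thermal response drives the transmission rate of a plain SIR model under a sinusoidal annual temperature cycle. The thermal-response constants, recovery rate, and SIR structure are all assumptions for illustration and are not the published parameterisation or model.

```python
import numpy as np

def briere(T, c=2.5e-4, T0=14.0, Tm=40.0):
    """Briere thermal response, zero outside (T0, Tm); constants are assumptions."""
    out = c * T * (T - T0) * np.sqrt(np.clip(Tm - T, 0.0, None))
    return np.where((T > T0) & (T < Tm), out, 0.0)

def epidemic_size(T_mean, T_amp, days=365, dt=0.1, gamma=1/7):
    S, I = 0.999, 0.001
    for step in range(int(days / dt)):
        t = step * dt
        T = T_mean + T_amp * np.sin(2 * np.pi * t / 365.0)
        beta = briere(T)                       # temperature-dependent transmission rate
        dS = -beta * S * I
        dI = beta * S * I - gamma * I
        S, I = S + dt * dS, I + dt * dI
    return 1.0 - S                             # final epidemic size

for T_mean in (20, 25, 30):
    print(f"mean {T_mean} C, amplitude 5 C -> epidemic size {epidemic_size(T_mean, 5.0):.2f}")
```

Even this toy version shows cooler means giving small outbreaks and warmer means giving larger, faster-burning epidemics, the pattern the mechanistic model above explores in far greater detail.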
Kayala, Matthew A; Baldi, Pierre
2012-10-22
Proposing reasonable mechanisms and predicting the course of chemical reactions is important to the practice of organic chemistry. Approaches to reaction prediction have historically used obfuscating representations and manually encoded patterns or rules. Here we present ReactionPredictor, a machine learning approach to reaction prediction that models elementary, mechanistic reactions as interactions between approximate molecular orbitals (MOs). A training data set of productive reactions known to occur at reasonable rates and yields and verified by inclusion in the literature or textbooks is derived from an existing rule-based system and expanded upon with manual curation from graduate level textbooks. Using this training data set of complex polar, hypervalent, radical, and pericyclic reactions, a two-stage machine learning prediction framework is trained and validated. In the first stage, filtering models trained at the level of individual MOs are used to reduce the space of possible reactions to consider. In the second stage, ranking models over the filtered space of possible reactions are used to order the reactions such that the productive reactions are the top ranked. The resulting model, ReactionPredictor, perfectly ranks polar reactions 78.1% of the time and recovers all productive reactions 95.7% of the time when allowing for small numbers of errors. Pericyclic and radical reactions are perfectly ranked 85.8% and 77.0% of the time, respectively, rising to >93% recovery for both reaction types with a small number of allowed errors. Decisions about which of the polar, pericyclic, or radical reaction type ranking models to use can be made with >99% accuracy. Finally, for multistep reaction pathways, we implement the first mechanistic pathway predictor using constrained tree-search to discover a set of reasonable mechanistic steps from given reactants to given products. Webserver implementations of both the single step and pathway versions of ReactionPredictor are available via the chemoinformatics portal http://cdb.ics.uci.edu/.
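The two-stage "filter then rank" structure can be sketched generically as below. The features and labels are synthetic stand-ins for MO-pair descriptors, and the particular models (logistic regression filter, gradient-boosting ranker) are assumptions chosen for illustration, not the models used in the work described above.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(3)
X = rng.normal(size=(2000, 12))                 # stand-in descriptors of candidate reactions
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=2000) > 1.0).astype(int)

# Stage 1: a fast filter keeps only candidates with non-negligible probability.
stage1 = LogisticRegression(max_iter=1000).fit(X, y)
keep = stage1.predict_proba(X)[:, 1] > 0.05

# Stage 2: a stronger model ranks the filtered candidates.
stage2 = GradientBoostingClassifier().fit(X[keep], y[keep])
scores = stage2.predict_proba(X[keep])[:, 1]
order = np.argsort(-scores)
top = y[keep][order[:50]]

print(f"filtered {keep.sum()} of {len(y)} candidates; "
      f"productive fraction in top-50 ranked: {top.mean():.2f}")
```

The design rationale is the same as described above: the cheap first stage prunes the combinatorial space so that the more expensive ranking model only has to order a manageable set of plausible candidates.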
NASA Astrophysics Data System (ADS)
Ghimire, B.; Riley, W. J.; Koven, C. D.; Randerson, J. T.; Mu, M.; Kattge, J.; Rogers, A.; Reich, P. B.
2014-12-01
In many ecosystems, nitrogen is the most limiting nutrient for plant growth and productivity. However, mechanistic representation of nitrogen uptake linked to root traits, and of functional nitrogen allocation among the leaf enzymes involved in respiration and photosynthesis, is currently lacking in Earth System models. The linkage between nitrogen availability and plant productivity is simplistically represented by potential photosynthesis rates, which are subsequently downregulated depending on nitrogen supply and other nitrogen consumers in the model (e.g., nitrification). This type of potential photosynthesis rate calculation is problematic for several reasons. Firstly, plants do not photosynthesize at potential rates and then downregulate. Secondly, there is considerable subjectivity in the meaning of potential photosynthesis rates. Thirdly, there is a lack of understanding of how to model these potential photosynthesis rates in a changing climate. In addition to these model structural issues in representing photosynthesis rates, the role of plant roots in nutrient acquisition has been largely ignored in Earth System models. For example, in CLM4.5, nitrogen uptake is linked to leaf-level processes (primarily productivity) rather than to the root-scale processes involved in nitrogen uptake. We present a new plant model for CLM with an improved mechanistic representation of plant nitrogen uptake based on root-scale Michaelis-Menten kinetics, and with stronger linkages between leaf nitrogen and plant productivity inferred from relationships observed in global databases of plant traits (including the TRY database and several individual studies). We also incorporate an improved representation of plant leaf nitrogen allocation, especially in tropical regions, where CLM4.5 simulations significantly over-predict plant growth and productivity. We evaluate our improved global model simulations using the International Land Model Benchmarking (ILAMB) framework. We conclude that mechanistic representation of leaf-level nitrogen allocation and a theoretically consistent treatment of competition with belowground consumers lead to overall improvements in CLM4.5's global carbon cycling predictions.
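A root-scale Michaelis-Menten uptake term of the general kind mentioned above can be written as a small function; the parameter values below are placeholders for illustration, not the values used in the model.

```python
def n_uptake(soil_n, root_biomass, v_max=2.0e-7, k_m=0.5):
    """Michaelis-Menten nitrogen uptake: Vmax * B_root * [N] / (Km + [N]).

    soil_n       -- mineral nitrogen concentration (gN m-3), illustrative units
    root_biomass -- fine-root biomass (g m-2), illustrative units
    v_max, k_m   -- assumed kinetic parameters
    """
    return v_max * root_biomass * soil_n / (k_m + soil_n)

for soil_n in (0.05, 0.5, 5.0):     # spanning strongly limiting to near-saturating
    print(f"[N]={soil_n:>4} -> uptake {n_uptake(soil_n, root_biomass=300.0):.2e}")
```

The saturating form is what ties uptake to root traits and soil nitrogen directly, instead of back-calculating it from a potential photosynthesis rate.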
Bayesian Group Bridge for Bi-level Variable Selection.
Mallick, Himel; Yi, Nengjun
2017-06-01
A Bayesian bi-level variable selection method (BAGB: Bayesian Analysis of Group Bridge) is developed for regularized regression and classification. This new development is motivated by grouped data, where generic variables can be divided into multiple groups, with variables in the same group being mechanistically related or statistically correlated. As an alternative to frequentist group variable selection methods, BAGB incorporates structural information among predictors through a group-wise shrinkage prior. Posterior computation proceeds via an efficient MCMC algorithm. In addition to the usual ease-of-interpretation of hierarchical linear models, the Bayesian formulation produces valid standard errors, a feature that is notably absent in the frequentist framework. Empirical evidence of the attractiveness of the method is illustrated by extensive Monte Carlo simulations and real data analysis. Finally, several extensions of this new approach are presented, providing a unified framework for bi-level variable selection in general models with flexible penalties.
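For reference, the frequentist group bridge objective that motivates the Bayesian formulation has the form below, with groups g = 1, ..., G, group weights c_g, and a concave bridge exponent; the exact prior specification used by BAGB is not reproduced here.

$$\hat{\beta} \;=\; \arg\min_{\beta}\; \lVert y - X\beta \rVert_2^2 \;+\; \lambda \sum_{g=1}^{G} c_g \Big( \sum_{j \in g} \lvert \beta_j \rvert \Big)^{\gamma}, \qquad 0 < \gamma < 1 .$$

The outer power with $\gamma < 1$ can zero out entire groups while the inner $\ell_1$ sum still allows sparsity within a retained group, which is what makes bi-level (group and within-group) selection possible.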
Toxicology ontology perspectives.
Hardy, Barry; Apic, Gordana; Carthew, Philip; Clark, Dominic; Cook, David; Dix, Ian; Escher, Sylvia; Hastings, Janna; Heard, David J; Jeliazkova, Nina; Judson, Philip; Matis-Mitchell, Sherri; Mitic, Dragana; Myatt, Glenn; Shah, Imran; Spjuth, Ola; Tcheremenskaia, Olga; Toldo, Luca; Watson, David; White, Andrew; Yang, Chihae
2012-01-01
The field of predictive toxicology requires the development of open, public, computable, standardized toxicology vocabularies and ontologies to support the applications required by in silico, in vitro, and in vivo toxicology methods and related analysis and reporting activities. In this article we review ontology developments based on a set of perspectives showing how ontologies are being used in predictive toxicology initiatives and applications. Perspectives on resources and initiatives reviewed include OpenTox, eTOX, Pistoia Alliance, ToxWiz, Virtual Liver, EU-ADR, BEL, ToxML, and Bioclipse. We also review existing ontology developments in neighboring fields that can contribute to establishing an ontological framework for predictive toxicology. A significant set of resources is already available to provide a foundation for an ontological framework for 21st century mechanistic-based toxicology research. Ontologies such as ToxWiz provide a basis for application to toxicology investigations, whereas other ontologies under development in the biological, chemical, and biomedical communities could be incorporated in an extended future framework. OpenTox has provided a semantic web framework for the implementation of such ontologies into software applications and linked data resources. Bioclipse developers have shown the benefit of interoperability obtained through ontology by being able to link their workbench application with remote OpenTox web services. Although these developments are promising, an increased international coordination of efforts is greatly needed to develop a more unified, standardized, and open toxicology ontology framework.
A Conceptual Framework for Understanding Unintended Prolonged Opioid Use.
Hooten, W Michael; Brummett, Chad M; Sullivan, Mark D; Goesling, Jenna; Tilburt, Jon C; Merlin, Jessica S; St Sauver, Jennifer L; Wasan, Ajay D; Clauw, Daniel J; Warner, David O
2017-12-01
An urgent need exists to better understand the transition from short-term opioid use to unintended prolonged opioid use (UPOU). The purpose of this work is to propose a conceptual framework for understanding UPOU that posits the influence of 3 principal domains that include the characteristics of (1) individual patients, (2) the practice environment, and (3) opioid prescribers. Although no standardized method exists for developing a conceptual framework, the process often involves identifying corroborative evidence, leveraging expert opinion to identify factors for inclusion in the framework, and developing a graphic depiction of the relationships between the various factors and the clinical problem of interest. Key patient characteristics potentially associated with UPOU include (1) medical and mental health conditions; (2) pain etiology; (3) individual affective, behavioral, and neurophysiologic reactions to pain and opioids; and (4) sociodemographic factors. Also, UPOU could be influenced by structural and health care policy factors: (1) the practice environment, including the roles of prescribing clinicians, adoption of relevant practice guidelines, and clinician incentives or disincentives, and (2) the regulatory environment. Finally, characteristics inherent to clinicians that could influence prescribing practices include (1) training in pain management and opioid use; (2) personal attitudes, knowledge, and beliefs regarding the risks and benefits of opioids; and (3) professionalism. As the gatekeeper to opioid access, the behavior of prescribing clinicians directly mediates UPOU, with the 3 domains interacting to determine this behavior. This proposed conceptual framework could guide future research on the topic and allow plausible hypothesis-based interventions to reduce UPOU. Copyright © 2017 Mayo Foundation for Medical Education and Research. Published by Elsevier Inc. All rights reserved.
A causal analysis framework for land-use change and the potential role of bioenergy policy
Efroymson, Rebecca A.; Kline, Keith L.; Angelsen, Arild; ...
2016-10-05
Here we propose a causal analysis framework to increase the reliability of land-use change (LUC) models and the accuracy of net greenhouse gas (GHG) emissions calculations for biofuels. The health-sciences-inspired framework is used here to determine probable causes of LUC, with an emphasis on bioenergy and deforestation. Calculations of net GHG emissions for LUC are critical in determining whether a fuel qualifies as a biofuel or advanced biofuel category under national (U.S., U.K.), state (California), and European Union regulations. Biofuel policymakers and scientists continue to discuss whether presumed indirect land-use change (ILUC) estimates, which often involve deforestation, should be included in GHG accounting for biofuel pathways. Current estimates of ILUC for bioenergy rely largely on economic simulation models that focus on causal pathways involving global commodity trade and use coarse land cover data with simple land classification systems. ILUC estimates are highly uncertain, partly because changes are not clearly defined and key causal links are not sufficiently included in the models. The proposed causal analysis framework begins with a definition of the change that has occurred and proceeds to a strength-of-evidence approach based on types of epidemiological evidence including plausibility of the relationship, completeness of the causal pathway, spatial co-occurrence, time order, analogous agents, simulation model results, and quantitative agent response relationships. Lastly, we discuss how LUC may be allocated among probable causes for policy purposes and how the application of the framework has the potential to increase the validity of LUC models and resolve ILUC and biofuel controversies.
Exploratory Application of Neuropharmacometabolomics in Severe Childhood Traumatic Brain Injury.
Hagos, Fanuel T; Empey, Philip E; Wang, Pengcheng; Ma, Xiaochao; Poloyac, Samuel M; Bayır, Hülya; Kochanek, Patrick M; Bell, Michael J; Clark, Robert S B
2018-05-07
To employ metabolomics-based pathway and network analyses to evaluate the cerebrospinal fluid metabolome after severe traumatic brain injury in children and the capacity of combination therapy with probenecid and N-acetylcysteine to impact glutathione-related and other pathways and networks, relative to placebo treatment. Analysis of cerebrospinal fluid obtained from children enrolled in an Institutional Review Board-approved, randomized, placebo-controlled trial of a combination of probenecid and N-acetylcysteine after severe traumatic brain injury (Trial Registration NCT01322009). Thirty-six-bed PICU in a university-affiliated children's hospital. Twelve children 2-18 years old after severe traumatic brain injury and five age-matched control subjects. Probenecid (25 mg/kg) and N-acetylcysteine (140 mg/kg) or placebo administered via naso/orogastric tube. The cerebrospinal fluid metabolome was analyzed in samples from traumatic brain injury patients 24 hours after the first dose of drugs or placebo and control subjects. Feature detection, retention time, alignment, annotation, and principal component analysis and statistical analysis were conducted using XCMS-online. The software "mummichog" was used for pathway and network analyses. A two-component principal component analysis revealed clustering of each of the groups, with distinct metabolomics signatures. Several novel pathways with plausible mechanistic involvement in traumatic brain injury were identified. A combination of metabolomics and pathway/network analyses showed that seven glutathione-centered pathways and two networks were enriched in the cerebrospinal fluid of traumatic brain injury patients treated with probenecid and N-acetylcysteine versus placebo-treated patients. Several additional pathways/networks consisting of components that are known substrates of probenecid-inhibitable transporters were also identified, providing additional mechanistic validation. This proof-of-concept neuropharmacometabolomics assessment reveals alterations in known and previously unidentified metabolic pathways and supports therapeutic target engagement of the combination of probenecid and N-acetylcysteine treatment after severe traumatic brain injury in children.
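The group-separation step reported above rests on an ordinary two-component PCA of feature intensities. The sketch below shows that step on purely synthetic data with hypothetical group offsets; it is not the study's XCMS-online or mummichog pipeline.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(11)
groups = {"control": 0.0, "placebo": 1.5, "pro/NAC": 3.0}   # synthetic mean offsets
X, labels = [], []
for name, shift in groups.items():
    X.append(rng.normal(loc=shift, scale=1.0, size=(6, 500)))  # 6 samples x 500 features
    labels += [name] * 6
X = np.vstack(X)

scores = PCA(n_components=2).fit_transform(X)    # two-component scores for each sample
for name in groups:
    idx = [i for i, lab in enumerate(labels) if lab == name]
    centroid = scores[idx].mean(axis=0)
    print(f"{name:>8}: PC1 = {centroid[0]:7.2f}, PC2 = {centroid[1]:7.2f}")
```

Distinct centroids in the score space correspond to the clustering of groups described above; pathway enrichment is then applied to the features that drive that separation.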
Abderrazak, Amna; Syrovets, Tatiana; Couchie, Dominique; El Hadri, Khadija; Friguet, Bertrand; Simmet, Thomas; Rouis, Mustapha
2015-01-01
IL-1β production is critically regulated by cytosolic molecular complexes, termed inflammasomes. Different inflammasome complexes have been described to date. While all inflammasomes recognize certain pathogens, the distinctive feature of the NLRP3 inflammasome is that it is activated by many diverse stimuli, making NLRP3 the most versatile and, importantly, also the most clinically implicated inflammasome. However, NLRP3 activation has remained the most enigmatic. It is not plausible that the intracellular NLRP3 receptor detects all of its many and diverse triggers through direct interactions; instead, NLRP3 is thought to respond to certain generic cellular stress signals induced by the multitude of molecules that trigger its activation. An ever-increasing number of studies link the sensing of cellular stress signals to a direct pathophysiological role of NLRP3 activation in a wide range of autoinflammatory and autoimmune disorders, and thus provide a novel mechanistic rationale for how molecules trigger and sustain sterile inflammatory diseases. Vast interest has therefore arisen in unravelling how NLRP3 becomes activated, since mechanistic insight is the prerequisite for knowledge-based development of therapeutic intervention strategies that specifically target NLRP3-triggered IL-1β production. In this review, we update knowledge on NLRP3 inflammasome assembly and activation and on the pyrin domain in NLRP3, which could represent a drug target for treating sterile inflammatory diseases. We report mutations in NLRP3 that have been found to be associated with certain diseases. In addition, we review the functional link between the NLRP3 inflammasome, the regulator of cellular redox status Trx/TXNIP complex, endoplasmic reticulum stress, and the pathogenesis of diseases such as type 2 diabetes. Finally, we provide data on the NLRP3 inflammasome as a critical regulator involved in the pathogenesis of obesity and cardiovascular diseases. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.
Elastic Wave Imaging of in-Situ Bio-Alterations in a Contaminated Aquifer
NASA Astrophysics Data System (ADS)
Jaiswal, P.; Raj, R.; Atekwana, E. A.; Briand, B.; Alam, I.
2014-12-01
We present a pioneering report on the utility of seismic methods in imaging bio-induced elastic property changes within a contaminated aquifer. To understand the physical properties of contaminated soil, we acquired a 48-meter-long multichannel seismic profile over the Norman landfill leachate plume in Norman, Oklahoma, USA. We estimated the P- and S-wave velocities using full-waveform inversion of the transmitted arrivals and the ground-roll coda, respectively. The resulting S-wave model showed a distinct velocity anomaly (~10% over background) within the water table fluctuation zone bounded by the historical minimum and maximum groundwater table. In comparison, the P-wave velocity anomaly within the same zone was negligible. Environmental Scanning Electron Microscope (ESEM) images of samples from a core located along the seismic profile clearly show the presence of biofilms in the water table fluctuation zone and their absence both above and below the fluctuation zone. Elemental chemistry further indicates that the sediment composition throughout the core is fairly constant. We conclude that the S-wave velocity anomaly is due to biofilms. As a next step, we develop mechanistic modeling to gain insight into the petro-physical behavior of biofilm-bearing sediments. Preliminary results suggest that a plausible model is biofilms acting as contact cement between sediment grains. The biofilm cement can be placed in two ways: (i) superficial non-contact deposition on sediment grains, and (ii) deposition at grain contacts. Both models explain the P- and S-wave velocity structure at reasonable (~5-10%) biofilm saturation and are equivocally supported by the ESEM images. Ongoing attenuation modeling from full-waveform inversion, and its mechanistic realization, may be able to further discriminate between the two cement models. Our study strongly suggests that, as opposed to traditional P-wave seismic, S-wave acquisition and imaging can be a more powerful tool for in-situ imaging of biofilm formation in field settings, with significant implications for bioremediation and microbial enhanced oil recovery monitoring.
Pilgrims sailing the Titanic: plausibility effects on memory for misinformation.
Hinze, Scott R; Slaten, Daniel G; Horton, William S; Jenkins, Ryan; Rapp, David N
2014-02-01
People rely on information they read even when it is inaccurate (Marsh, Meade, & Roediger, Journal of Memory and Language 49:519-536, 2003), but how ubiquitous is this phenomenon? In two experiments, we investigated whether this tendency to encode and rely on inaccuracies from text might be influenced by the plausibility of misinformation. In Experiment 1, we presented stories containing inaccurate plausible statements (e.g., "The Pilgrims' ship was the Godspeed"), inaccurate implausible statements (e.g., . . . the Titanic), or accurate statements (e.g., . . . the Mayflower). On a subsequent test of general knowledge, participants relied significantly less on implausible than on plausible inaccuracies from the texts but continued to rely on accurate information. In Experiment 2, we replicated these results with the addition of a think-aloud procedure to elicit information about readers' noticing and evaluative processes for plausible and implausible misinformation. Participants indicated more skepticism and less acceptance of implausible than of plausible inaccuracies. In contrast, they often failed to notice, completely ignored, and at times even explicitly accepted the misinformation provided by plausible lures. These results offer insight into the conditions under which reliance on inaccurate information occurs and suggest potential mechanisms that may underlie reported misinformation effects.
Phillips, Lawrence; Pearl, Lisa
2015-11-01
The informativity of a computational model of language acquisition is directly related to how closely it approximates the actual acquisition task, sometimes referred to as the model's cognitive plausibility. We suggest that though every computational model necessarily idealizes the modeled task, an informative language acquisition model can aim to be cognitively plausible in multiple ways. We discuss these cognitive plausibility checkpoints generally and then apply them to a case study in word segmentation, investigating a promising Bayesian segmentation strategy. We incorporate cognitive plausibility by using an age-appropriate unit of perceptual representation, evaluating the model output in terms of its utility, and incorporating cognitive constraints into the inference process. Our more cognitively plausible model shows a beneficial effect of cognitive constraints on segmentation performance. One interpretation of this effect is as a synergy between the naive theories of language structure that infants may have and the cognitive constraints that limit the fidelity of their inference processes, where less accurate inference approximations are better when the underlying assumptions about how words are generated are less accurate. More generally, these results highlight the utility of incorporating cognitive plausibility more fully into computational models of language acquisition. Copyright © 2015 Cognitive Science Society, Inc.
Zhu, Mingyan; Kim, Myung Hee; Lee, Sanghee; Bae, Su Jung; Kim, Seong Hwan; Park, Seung Bum
2010-12-23
A novel benzopyran-fused molecular framework 7ai was discovered as a specific inhibitor of RANKL-induced osteoclastogenesis using a cell-based TRAP activity assay from drug-like small-molecule libraries constructed by diversity-oriented synthesis. Its inhibitory activity was confirmed by in vitro evaluations including specific inhibition of RANKL-induced ERK phosphorylation and NF-κB transcriptional activation. 7ai can serve as a specific small-molecule modulator for mechanistic studies of RANKL-induced osteoclast differentiation as well as a potential lead for the development of antiresorptive drugs.
NASA Astrophysics Data System (ADS)
Muthukrishnan, A.; Sangaranarayanan, M. V.
2007-10-01
The reduction of the carbon-fluorine bond in 4-fluorobenzonitrile, with acetonitrile as the solvent, is analyzed using convolution potential sweep voltammetry, and the dependence of the transfer coefficient on potential is investigated within the framework of the Marcus-Hush quadratic activation-driving force theory. The validity of a stepwise mechanism is inferred from solvent reorganization energy estimates as well as bond length calculations using the B3LYP/6-31G(d) method. A novel method of estimating the standard reduction potential of 4-fluorobenzonitrile in acetonitrile is proposed.
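For orientation, the quadratic activation-driving force law underlying the potential dependence of the transfer coefficient can be written in its textbook Marcus-Hush form (the generic relation, not the authors' specific parameterization):

\[
\Delta G^{\ddagger} = \frac{\lambda}{4}\left(1 + \frac{\Delta G^{0}}{\lambda}\right)^{2},
\qquad
\alpha = \frac{\partial \Delta G^{\ddagger}}{\partial \Delta G^{0}}
       = \frac{1}{2}\left(1 + \frac{\Delta G^{0}}{\lambda}\right)
       \approx \frac{1}{2}\left(1 + \frac{F\,(E - E^{0})}{\lambda}\right),
\]

where \(\lambda\) is the total (solvent plus inner-sphere) reorganization energy and \(E^{0}\) the standard potential; sign conventions for \(\Delta G^{0} = F(E - E^{0})\) depend on the direction of the electrode reaction, so it is the linear drift of \(\alpha\) with potential, rather than its sign, that is the diagnostic of the quadratic law.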
NASA Astrophysics Data System (ADS)
Zhang, Le; Zhang, Shaoxiang
2017-03-01
A body of research [1-7] has already shown that epigenetic reprogramming plays a critical role in maintaining the normal development of embryos. However, the mechanistic quantitation of the epigenetic interactions between sperm and oocytes and the related impact on embryo development are still not clear [6,7]. In this study, Wang et al. [8] develop a modeling framework that addresses this question by integrating game theory and the latest discoveries of the epigenetic control of embryo development.
Mechanisms of Graft Rejection and Immune Regulation after Lung Transplant.
Gauthier, Jason M; Li, Wenjun; Hsiao, Hsi-Min; Takahashi, Tsuyoshi; Arefanian, Saeed; Krupnick, Alexander S; Gelman, Andrew E; Kreisel, Daniel
2017-09-01
Outcomes after lung transplant lag behind those of other solid-organ transplants. A better understanding of the pathways that contribute to rejection and tolerance after lung transplant will be required to develop new therapeutic strategies that take into account the unique immunological features of lungs. Mechanistic immunological investigations in an orthotopic transplant model in the mouse have shed new light on immune responses after lung transplant. Here, we highlight that interactions between immune cells within pulmonary grafts shape their fate. These observations set lungs apart from other organs and help provide the conceptual framework for the development of lung-specific immunosuppression.
The future trajectory of adverse outcome pathways: a commentary.
Sewell, Fiona; Gellatly, Nichola; Beaumont, Maria; Burden, Natalie; Currie, Richard; de Haan, Lolke; Hutchinson, Thomas H; Jacobs, Miriam; Mahony, Catherine; Malcomber, Ian; Mehta, Jyotigna; Whale, Graham; Kimber, Ian
2018-04-01
The advent of adverse outcome pathways (AOPs) has provided a new lexicon for the description of mechanistic toxicology, and a renewed enthusiasm for exploring modes of action resulting in adverse health and environmental effects. In addition, AOPs have been used successfully as a framework for the design and development of non-animal approaches to toxicity testing. Although the value of AOPs is widely recognised, there remain challenges and opportunities associated with their use in practice. The purpose of this article is to consider specifically how the future trajectory of AOPs may provide a basis for addressing some of those challenges and opportunities.
Rao, Rohit T; Scherholz, Megerle L; Hartmanshenn, Clara; Bae, Seul-A; Androulakis, Ioannis P
2017-12-05
The use of models in biology has become particularly relevant as it enables investigators to develop a mechanistic framework for understanding the operating principles of living systems as well as in quantitatively predicting their response to both pathological perturbations and pharmacological interventions. This application has resulted in a synergistic convergence of systems biology and pharmacokinetic-pharmacodynamic modeling techniques that has led to the emergence of quantitative systems pharmacology (QSP). In this review, we discuss how the foundational principles of chemical process systems engineering inform the progressive development of more physiologically-based systems biology models.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sezen, Halil; Aldemir, Tunc; Denning, R.
Probabilistic risk assessment of nuclear power plants initially focused on events initiated by internal faults at the plant, rather than external hazards such as earthquakes and flooding. Although the importance of external hazards risk analysis is now well recognized, the methods for analyzing low-probability external hazards rely heavily on the subjective judgment of specialists, often resulting in substantial conservatism. This research developed a framework to integrate the risk of seismic and flooding events using realistic structural models and simulation of the response of nuclear structures. The results of four application case studies are presented.
Dualism and its importance for medicine.
Switankowsky, I
2000-11-01
Cartesian dualism has been viewed by medical theorists as one of the chief causes of a reductionist/mechanistic treatment of the patient. Although I aver that Cartesian dualism is one culprit for the misapprehension of the genuine treatment of patients in terms of both mind and body, I argue that interactive dualism, which stresses the interaction of mind and body, is essential to treating patients with dignity and compassion. Thus, adequate medical care that is humanistic in nature is difficult (if not impossible) to achieve without physicians adhering to a dualistic framework in which both the body and the person are treated during illness.
Adults' memories of childhood: true and false reports.
Qin, Jianjian; Ogle, Christin M; Goodman, Gail S
2008-12-01
In 3 experiments, the authors examined factors that, according to the source-monitoring framework, might influence false memory formation and true/false memory discernment. In Experiment 1, combined effects of warning and visualization on false childhood memory formation were examined, as were individual differences in true and false childhood memories. Combining warnings and visualization led to the lowest false memory and highest true memory. Several individual difference factors (e.g., parental fearful attachment style) predicted false recall. In addition, true and false childhood memories differed (e.g., in amount of information). Experiment 2 examined relations between Deese/Roediger-McDermott task performance and false childhood memories. Deese/Roediger-McDermott performance (e.g., intrusion of unrelated words in free recall) was associated with false childhood memory, suggesting liberal response criteria in source decisions as a common underlying mechanism. Experiment 3 investigated adults' abilities to discern true and false childhood memory reports (e.g., by detecting differences in amount of information as identified in Experiment 1). Adults who were particularly successful in discerning such reports indicated reliance on event plausibility. Overall, the source-monitoring framework provided a viable explanatory framework. Implications for theory and clinical and forensic interviews are discussed. PsycINFO Database Record (c) 2008 APA, all rights reserved.
Predictive representations can link model-based reinforcement learning to model-free mechanisms.
Russek, Evan M; Momennejad, Ida; Botvinick, Matthew M; Gershman, Samuel J; Daw, Nathaniel D
2017-09-01
Humans and animals are capable of evaluating actions by considering their long-run future rewards through a process described using model-based reinforcement learning (RL) algorithms. The mechanisms by which neural circuits perform the computations prescribed by model-based RL remain largely unknown; however, multiple lines of evidence suggest that neural circuits supporting model-based behavior are structurally homologous to and overlapping with those thought to carry out model-free temporal difference (TD) learning. Here, we lay out a family of approaches by which model-based computation may be built upon a core of TD learning. The foundation of this framework is the successor representation, a predictive state representation that, when combined with TD learning of value predictions, can produce a subset of the behaviors associated with model-based learning, while requiring less decision-time computation than dynamic programming. Using simulations, we delineate the precise behavioral capabilities enabled by evaluating actions using this approach, and compare them to those demonstrated by biological organisms. We then introduce two new algorithms that build upon the successor representation while progressively mitigating its limitations. Because this framework can account for the full range of observed putatively model-based behaviors while still utilizing a core TD framework, we suggest that it represents a neurally plausible family of mechanisms for model-based evaluation.
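A minimal tabular sketch of the core computational idea (a successor representation learned by temporal-difference updates, with values read out as M·w) is given below; the five-state chain, the policy, and the learning rates are illustrative choices, not the simulations reported in the paper.

```python
import numpy as np

n_states, gamma, alpha = 5, 0.95, 0.1
M = np.zeros((n_states, n_states))      # successor representation: discounted expected state occupancies
w = np.zeros(n_states)                  # learned per-state reward weights
w_true = np.zeros(n_states); w_true[-1] = 1.0   # toy task: reward only in the final state

rng = np.random.default_rng(0)
I = np.eye(n_states)
for episode in range(500):
    s = 0
    while s < n_states - 1:
        s_next = s + 1 if rng.random() < 0.9 else max(s - 1, 0)   # noisy "move right" policy
        # TD update of the SR row for s: target = 1(s) + gamma * M[s']
        M[s] += alpha * (I[s] + gamma * M[s_next] - M[s])
        w[s] += alpha * (w_true[s] - w[s])    # incremental update of the reward weight for s
        s = s_next
    M[s] = I[s]                               # terminal state occupies only itself
    w[s] += alpha * (w_true[s] - w[s])

V = M @ w    # model-based-like values read out from a cached predictive representation
print(np.round(V, 3))
```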
2018-01-01
Qualitative risk assessment frameworks, such as the Productivity Susceptibility Analysis (PSA), have been developed to rapidly evaluate the risks of fishing to marine populations and prioritize management and research among species. Despite being applied to over 1,000 fish populations, and despite an ongoing debate about the most appropriate method to convert biological and fishery characteristics into an overall measure of risk, the assumptions and predictive capacity of these approaches have not been evaluated. Several interpretations of the PSA were mapped to a conventional age-structured fisheries dynamics model to evaluate the performance of the approach under a range of assumptions regarding exploitation rates and measures of biological risk. The results demonstrate that the underlying assumptions of these qualitative risk-based approaches are inappropriate, and the expected performance is poor for a wide range of conditions. The information required to score a fishery using a PSA-type approach is comparable to that required to populate an operating model and evaluate the population dynamics within a simulation framework. In addition to providing a more credible characterization of complex system dynamics, the operating model approach is transparent, reproducible, and can evaluate alternative management strategies over a range of plausible hypotheses for the system. PMID:29856869
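For orientation, the sketch below shows the kind of equilibrium per-recruit calculation that even a very simple age-structured operating model supports and that a PSA score cannot provide; the growth, maturity, selectivity, and mortality values are placeholders, not parameters from the study.

```python
import numpy as np

def spawning_biomass_per_recruit(F, M=0.2, a_max=20, a_mat=4, a_sel=3, k=0.3, Linf=100.0):
    """Equilibrium spawners-per-recruit for a simple age-structured model with
    von Bertalanffy growth and knife-edge maturity and selectivity."""
    ages = np.arange(1, a_max + 1)
    length = Linf * (1 - np.exp(-k * ages))
    weight = 1e-5 * length ** 3                      # allometric length-weight relation
    mature = (ages >= a_mat).astype(float)
    sel = (ages >= a_sel).astype(float)
    Z = M + F * sel                                  # total mortality at age
    surv = np.concatenate(([1.0], np.exp(-np.cumsum(Z[:-1]))))   # survivorship to age
    return np.sum(surv * weight * mature)

sbr0 = spawning_biomass_per_recruit(0.0)
for F in [0.0, 0.1, 0.2, 0.4]:
    depletion = spawning_biomass_per_recruit(F) / sbr0
    print(f"F = {F:.1f} -> spawning biomass per recruit relative to unfished: {depletion:.2f}")
```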
Predictive representations can link model-based reinforcement learning to model-free mechanisms
Botvinick, Matthew M.
2017-01-01
Humans and animals are capable of evaluating actions by considering their long-run future rewards through a process described using model-based reinforcement learning (RL) algorithms. The mechanisms by which neural circuits perform the computations prescribed by model-based RL remain largely unknown; however, multiple lines of evidence suggest that neural circuits supporting model-based behavior are structurally homologous to and overlapping with those thought to carry out model-free temporal difference (TD) learning. Here, we lay out a family of approaches by which model-based computation may be built upon a core of TD learning. The foundation of this framework is the successor representation, a predictive state representation that, when combined with TD learning of value predictions, can produce a subset of the behaviors associated with model-based learning, while requiring less decision-time computation than dynamic programming. Using simulations, we delineate the precise behavioral capabilities enabled by evaluating actions using this approach, and compare them to those demonstrated by biological organisms. We then introduce two new algorithms that build upon the successor representation while progressively mitigating its limitations. Because this framework can account for the full range of observed putatively model-based behaviors while still utilizing a core TD framework, we suggest that it represents a neurally plausible family of mechanisms for model-based evaluation. PMID:28945743
Svoboda, David; Ulman, Vladimir
2017-01-01
The proper analysis of biological microscopy images is an important and complex task that requires verification of all steps involved in the process, including image segmentation and tracking algorithms. It is generally better to verify algorithms with computer-generated ground truth datasets, which, compared to manually annotated data, nowadays have reached high quality and can be produced in large quantities even for 3D time-lapse image sequences. Here, we propose a novel framework, called MitoGen, which is capable of generating ground truth datasets with fully 3D time-lapse sequences of synthetic fluorescence-stained cell populations. MitoGen shows biologically justified cell motility, shape and texture changes as well as cell divisions. Standard fluorescence microscopy phenomena, such as photobleaching, blur with a real point spread function (PSF), and several types of noise, are simulated to obtain realistic images. The MitoGen framework is scalable in both space and time. MitoGen generates visually plausible data that shows good agreement with real data in terms of image descriptors and mean square displacement (MSD) trajectory analysis. Additionally, it is shown in this paper that four publicly available segmentation and tracking algorithms exhibit similar performance on both real and MitoGen-generated data. The implementation of MitoGen is freely available.
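The sketch below is not MitoGen itself (whose implementation is available from the authors), but a minimal numpy/scipy illustration of the listed imaging phenomena: blob-like cells, PSF blur, photobleaching, and shot plus read noise. All sizes, intensities, and noise levels are invented for illustration.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(1)

def synthetic_frame(shape=(256, 256), n_cells=30, psf_sigma=2.0,
                    bleach=1.0, background=20.0):
    """One synthetic fluorescence frame: bright circular blobs for cells,
    Gaussian stand-in for the PSF, photobleaching scaling, Poisson shot noise
    and Gaussian read noise."""
    img = np.zeros(shape)
    yy, xx = np.ogrid[:shape[0], :shape[1]]
    for _ in range(n_cells):
        y, x = rng.integers(20, shape[0] - 20), rng.integers(20, shape[1] - 20)
        r = rng.integers(5, 12)
        img[(yy - y) ** 2 + (xx - x) ** 2 <= r ** 2] += rng.uniform(80, 150)
    img = gaussian_filter(img, psf_sigma)            # blur by an idealized PSF
    expected = bleach * img + background             # photobleaching + background level
    noisy = rng.poisson(expected) + rng.normal(0, 3, shape)   # shot + read noise
    return noisy

# simple exponential photobleaching over a short synthetic time-lapse
frames = [synthetic_frame(bleach=np.exp(-0.05 * t)) for t in range(5)]
print(len(frames), frames[0].shape)
```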
Advancing the adverse outcome pathway framework and its ...
Regulatory agencies worldwide are confronted with the challenging task of assessing the risks of thousands of chemicals to protect both human health and the environment. Traditional toxicity testing largely relies on apical endpoints from whole animal studies, which, in addition to raising ethical concerns, is costly and time prohibitive. As a result, the utility of mechanism-based in silico, in vitro, and in vivo approaches to support chemical safety evaluations has increasingly been explored. An approach that has gained traction for capturing available knowledge describing the linkage between mechanistic data and apical toxicity endpoints, required for regulatory assessments, is the adverse outcome pathway (AOP) framework. A number of international workshops and expert meetings have been held over the past years focusing on the AOP framework and its applications to chemical risk assessment. Although these interactions have illustrated the necessity of expert guidance in moving the science of AOPs and their applications forward, there is also the recognition that a broader survey of the scientific community could be useful in guiding future initiatives in the AOP arena. To that end, a Horizon Scanning exercise was conducted to solicit questions from the global scientific community concerning the challenges or limitations that must be addressed in order to realize the full potential of the AOP framework in research and regulatory decision making. Over a 4 month ques
A New Biogeochemical Computational Framework Integrated within the Community Land Model
NASA Astrophysics Data System (ADS)
Fang, Y.; Li, H.; Liu, C.; Huang, M.; Leung, L.
2012-12-01
Terrestrial biogeochemical processes, particularly carbon cycle dynamics, have been shown to significantly influence regional and global climate changes. Modeling terrestrial biogeochemical processes within the land component of Earth System Models such as the Community Land Model (CLM), however, faces three major challenges: 1) extensive efforts in modifying model structures and rewriting computer programs to incorporate biogeochemical processes of increasing complexity, 2) expensive computational cost to solve the governing equations due to numerical stiffness arising from large variations in the rates of biogeochemical processes, and 3) lack of an efficient framework to systematically evaluate various mathematical representations of biogeochemical processes. To address these challenges, we introduce a new computational framework to incorporate biogeochemical processes into CLM, which consists of a new biogeochemical module with a generic algorithm and a reaction database. New and updated biogeochemical processes can be incorporated into CLM without significant code modification. To address the stiffness issue, algorithms and criteria will be developed to identify fast processes, which will be replaced with algebraic equations and decoupled from slow processes. This framework can serve as a generic and user-friendly platform to test different mechanistic process representations and datasets and to gain new insight into the behavior of terrestrial ecosystems in response to climate change in a systematic way.
A formal model of interpersonal inference
Moutoussis, Michael; Trujillo-Barreto, Nelson J.; El-Deredy, Wael; Dolan, Raymond J.; Friston, Karl J.
2014-01-01
Introduction: We propose that active Bayesian inference—a general framework for decision-making—can equally be applied to interpersonal exchanges. Social cognition, however, entails special challenges. We address these challenges through a novel formulation of a formal model and demonstrate its psychological significance. Method: We review relevant literature, especially with regards to interpersonal representations, formulate a mathematical model and present a simulation study. The model accommodates normative models from utility theory and places them within the broader setting of Bayesian inference. Crucially, we endow people's prior beliefs, into which utilities are absorbed, with preferences of self and others. The simulation illustrates the model's dynamics and furnishes elementary predictions of the theory. Results: (1) Because beliefs about self and others inform both the desirability and plausibility of outcomes, in this framework interpersonal representations become beliefs that have to be actively inferred. This inference, akin to “mentalizing” in the psychological literature, is based upon the outcomes of interpersonal exchanges. (2) We show how some well-known social-psychological phenomena (e.g., self-serving biases) can be explained in terms of active interpersonal inference. (3) Mentalizing naturally entails Bayesian updating of how people value social outcomes. Crucially this includes inference about one's own qualities and preferences. Conclusion: We inaugurate a Bayes optimal framework for modeling intersubject variability in mentalizing during interpersonal exchanges. Here, interpersonal representations are endowed with explicit functional and affective properties. We suggest the active inference framework lends itself to the study of psychiatric conditions where mentalizing is distorted. PMID:24723872
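As a toy illustration of the ingredients involved (beliefs about another agent updated from observed exchanges and then used to value future interactions), the discrete sketch below uses a three-type partner model; it is a stand-in for, not a reproduction of, the paper's active inference formulation, and all payoffs and type probabilities are invented.

```python
import numpy as np

# Hypotheses about the partner's "prosociality": probability they act cooperatively
types = np.array([0.2, 0.5, 0.8])
prior = np.array([1 / 3, 1 / 3, 1 / 3])

def update(belief, cooperated):
    """Bayesian belief update over partner types after observing one exchange."""
    likelihood = types if cooperated else 1 - types
    posterior = belief * likelihood
    return posterior / posterior.sum()

beliefs = prior
for outcome in [True, True, False, True]:      # observed cooperative / selfish moves
    beliefs = update(beliefs, outcome)
    print(np.round(beliefs, 3))

# Expected value of engaging again, weighting payoffs by the inferred partner type
payoff_if_coop, payoff_if_defect = 2.0, -1.0
expected_value = np.sum(beliefs * (types * payoff_if_coop + (1 - types) * payoff_if_defect))
print("expected value of another exchange:", round(float(expected_value), 3))
```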
A framework for the etiology of running-related injuries.
Bertelsen, M L; Hulme, A; Petersen, J; Brund, R K; Sørensen, H; Finch, C F; Parner, E T; Nielsen, R O
2017-11-01
The etiology of running-related injury is important to consider as the effectiveness of a given running-related injury prevention intervention is dependent on whether etiologic factors are readily modifiable and consistent with a biologically plausible causal mechanism. Therefore, the purpose of the present article was to present an evidence-informed conceptual framework outlining the multifactorial nature of running-related injury etiology. In the framework, four mutually exclusive parts are presented: (a) Structure-specific capacity when entering a running session; (b) structure-specific cumulative load per running session; (c) reduction in the structure-specific capacity during a running session; and (d) exceeding the structure-specific capacity. The framework can then be used to inform the design of future running-related injury prevention studies, including the formation of research questions and hypotheses, as well as the monitoring of participation-related and non-participation-related exposures. In addition, future research applications should focus on addressing how changes in one or more exposures influence the risk of running-related injury. This necessitates the investigation of how different factors affect the structure-specific load and/or the load capacity, and the dose-response relationship between running participation and injury risk. Ultimately, this direction allows researchers to move beyond traditional risk factor identification to produce research findings that are not only reliably reported in terms of the observed cause-effect association, but also translatable in practice. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Plausibility Judgments in Conceptual Change and Epistemic Cognition
ERIC Educational Resources Information Center
Lombardi, Doug; Nussbaum, E. Michael; Sinatra, Gale M.
2016-01-01
Plausibility judgments rarely have been addressed empirically in conceptual change research. Recent research, however, suggests that these judgments may be pivotal to conceptual change about certain topics where a gap exists between what scientists and laypersons find plausible. Based on a philosophical and empirical foundation, this article…
Source Effects and Plausibility Judgments When Reading about Climate Change
ERIC Educational Resources Information Center
Lombardi, Doug; Seyranian, Viviane; Sinatra, Gale M.
2014-01-01
Gaps between what scientists and laypeople find plausible may act as a barrier to learning complex and/or controversial socioscientific concepts. For example, individuals may consider scientific explanations that human activities are causing current climate change as implausible. This plausibility judgment may be due-in part-to individuals'…
Plausibility and Perspective Influence the Processing of Counterfactual Narratives
ERIC Educational Resources Information Center
Ferguson, Heather J.; Jayes, Lewis T.
2018-01-01
Previous research has established that readers' eye movements are sensitive to the difficulty with which a word is processed. One important factor that influences processing is the fit of a word within the wider context, including its plausibility. Here we explore the influence of plausibility in counterfactual language processing. Counterfactuals…
DOE Office of Scientific and Technical Information (OSTI.GOV)
DeRosa, C.T.; Choudhury, H.; Schoeny, R.S.
Risk assessment can be thought of as a conceptual approach to bridge the gap between the available data and the ultimate goal of characterizing the risk or hazard associated with a particular environmental problem. To lend consistency to and to promote quality in the process, the US Environmental Protection Agency (EPA) published Guidelines for Risk Assessment of Carcinogenicity, Developmental Toxicity, Germ Cell Mutagenicity and Exposure Assessment, and Risk Assessment of Chemical Mixtures. The guidelines provide a framework for organizing the information, evaluating data, and for carrying out the risk assessment in a scientifically plausible manner. In the absence of sufficient scientific information or when abundant data are available, the guidelines provide alternative methodologies that can be employed in the risk assessment. 4 refs., 3 figs., 2 tabs.
Modelling default and likelihood reasoning as probabilistic reasoning
NASA Technical Reports Server (NTRS)
Buntine, Wray
1990-01-01
A probabilistic analysis of plausible reasoning about defaults and about likelihood is presented. 'Likely' and 'by default' are in fact treated as duals in the same sense as possibility and necessity. To model these four forms probabilistically, a qualitative default probabilistic (QDP) logic and its quantitative counterpart DP are derived that allow qualitative and corresponding quantitative reasoning. Consistency and consequent results for subsets of the logics are given that require at most a quadratic number of satisfiability tests in the underlying propositional logic. The quantitative logic shows how to track the propagation error inherent in these reasoning forms. The methodology and sound framework of the system highlight their approximate nature, the dualities, and the need for complementary reasoning about relevance.
Early behavioral intervention, brain plasticity, and the prevention of autism spectrum disorder.
Dawson, Geraldine
2008-01-01
Advances in the fields of cognitive and affective developmental neuroscience, developmental psychopathology, neurobiology, genetics, and applied behavior analysis have contributed to a more optimistic outcome for individuals with autism spectrum disorder (ASD). These advances have led to new methods for early detection and more effective treatments. For the first time, prevention of ASD is plausible. Prevention will entail detecting infants at risk before the full syndrome is present and implementing treatments designed to alter the course of early behavioral and brain development. This article describes a developmental model of risk, risk processes, symptom emergence, and adaptation in ASD that offers a framework for understanding early brain plasticity in ASD and its role in prevention of the disorder.
Modeling discrete and rhythmic movements through motor primitives: a review.
Degallier, Sarah; Ijspeert, Auke
2010-10-01
Rhythmic and discrete movements are frequently considered separately in motor control, probably because different techniques are commonly used to study and model them. Yet the increasing interest in finding a comprehensive model for movement generation requires bridging the different perspectives arising from the study of those two types of movements. In this article, we consider discrete and rhythmic movements within the framework of motor primitives, i.e., of modular generation of movements. In this way we hope to gain an insight into the functional relationships between discrete and rhythmic movements and thus into a suitable representation for both of them. Within this framework we can define four possible categories of modeling for discrete and rhythmic movements depending on the required command signals and on the spinal processes involved in the generation of the movements. These categories are first discussed in terms of biological concepts such as force fields and central pattern generators and then illustrated by several mathematical models based on dynamical system theory. A discussion on the plausibility of these models concludes the work.
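The sketch below illustrates the two primitive types in the simplest dynamical-systems terms: a critically damped point attractor for a discrete movement and a Hopf-style limit-cycle oscillator for a rhythmic one. It is a generic illustration with arbitrary gains and frequencies, not one of the specific models reviewed.

```python
import numpy as np

def simulate(T=10.0, dt=0.001, goal=1.0, mu=0.25, omega=2 * np.pi):
    """Euler integration of a discrete primitive (point attractor toward `goal`)
    and a rhythmic primitive (Hopf-style limit cycle of radius sqrt(mu))."""
    n = int(T / dt)
    x = np.zeros(n); v = np.zeros(n)        # discrete movement state (position, velocity)
    a, b = 0.01, 0.0                        # rhythmic oscillator state
    rhythmic = np.zeros(n)
    k = 25.0                                # attractor stiffness
    for i in range(1, n):
        # discrete: second-order, critically damped system converging to the goal
        acc = k * (goal - x[i - 1]) - 2 * np.sqrt(k) * v[i - 1]
        v[i] = v[i - 1] + dt * acc
        x[i] = x[i - 1] + dt * v[i]
        # rhythmic: Hopf normal form, trajectories settle onto a stable limit cycle
        r2 = a * a + b * b
        da = (mu - r2) * a - omega * b
        db = (mu - r2) * b + omega * a
        a, b = a + dt * da, b + dt * db
        rhythmic[i] = a
    return x, rhythmic

discrete_traj, rhythmic_traj = simulate()
print(round(discrete_traj[-1], 3), rhythmic_traj[-5:].round(3))
```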
GeNN: a code generation framework for accelerated brain simulations
NASA Astrophysics Data System (ADS)
Yavuz, Esin; Turner, James; Nowotny, Thomas
2016-01-01
Large-scale numerical simulations of detailed brain circuit models are important for identifying hypotheses on brain functions and testing their consistency and plausibility. An ongoing challenge for simulating realistic models is, however, computational speed. In this paper, we present the GeNN (GPU-enhanced Neuronal Networks) framework, which aims to facilitate the use of graphics accelerators for computational models of large-scale neuronal networks to address this challenge. GeNN is an open source library that generates code to accelerate the execution of network simulations on NVIDIA GPUs, through a flexible and extensible interface, which does not require in-depth technical knowledge from the users. We present performance benchmarks showing that 200-fold speedup compared to a single core of a CPU can be achieved for a network of one million conductance based Hodgkin-Huxley neurons but that for other models the speedup can differ. GeNN is available for Linux, Mac OS X and Windows platforms. The source code, user manual, tutorials, Wiki, in-depth example projects and all other related information can be found on the project website http://genn-team.github.io/genn/.
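GeNN itself generates CUDA/C++ from user model descriptions; the plain Python sketch below is not GeNN code, but it shows the kind of per-neuron conductance-based (Hodgkin-Huxley) state update that such generated kernels parallelize, here vectorized over a population with forward Euler and standard textbook constants.

```python
import numpy as np

def hh_step(V, m, h, n, I_ext, dt=0.01):
    """One forward-Euler update of a standard Hodgkin-Huxley point neuron (mV, ms, uA/cm^2)."""
    a_m = 0.1 * (V + 40.0) / (1.0 - np.exp(-(V + 40.0) / 10.0))
    b_m = 4.0 * np.exp(-(V + 65.0) / 18.0)
    a_h = 0.07 * np.exp(-(V + 65.0) / 20.0)
    b_h = 1.0 / (1.0 + np.exp(-(V + 35.0) / 10.0))
    a_n = 0.01 * (V + 55.0) / (1.0 - np.exp(-(V + 55.0) / 10.0))
    b_n = 0.125 * np.exp(-(V + 65.0) / 80.0)
    I_Na = 120.0 * m ** 3 * h * (V - 50.0)     # sodium current
    I_K = 36.0 * n ** 4 * (V + 77.0)           # potassium current
    I_L = 0.3 * (V + 54.387)                   # leak current
    dV = (I_ext - I_Na - I_K - I_L) / 1.0      # membrane capacitance 1 uF/cm^2
    return (V + dt * dV,
            m + dt * (a_m * (1 - m) - b_m * m),
            h + dt * (a_h * (1 - h) - b_h * h),
            n + dt * (a_n * (1 - n) - b_n * n))

# The same update applied to a whole population at once (vectorized here; on a GPU,
# each neuron's state would typically be advanced by one thread per time step).
N = 1000
V = np.full(N, -65.0); m = np.full(N, 0.05); h = np.full(N, 0.6); n = np.full(N, 0.32)
for _ in range(2000):                          # 20 ms at dt = 0.01 ms
    V, m, h, n = hh_step(V, m, h, n, I_ext=10.0)
print(V[:5].round(2))
```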
GeNN: a code generation framework for accelerated brain simulations.
Yavuz, Esin; Turner, James; Nowotny, Thomas
2016-01-07
Large-scale numerical simulations of detailed brain circuit models are important for identifying hypotheses on brain functions and testing their consistency and plausibility. An ongoing challenge for simulating realistic models is, however, computational speed. In this paper, we present the GeNN (GPU-enhanced Neuronal Networks) framework, which aims to facilitate the use of graphics accelerators for computational models of large-scale neuronal networks to address this challenge. GeNN is an open source library that generates code to accelerate the execution of network simulations on NVIDIA GPUs, through a flexible and extensible interface, which does not require in-depth technical knowledge from the users. We present performance benchmarks showing that 200-fold speedup compared to a single core of a CPU can be achieved for a network of one million conductance based Hodgkin-Huxley neurons but that for other models the speedup can differ. GeNN is available for Linux, Mac OS X and Windows platforms. The source code, user manual, tutorials, Wiki, in-depth example projects and all other related information can be found on the project website http://genn-team.github.io/genn/.
GeNN: a code generation framework for accelerated brain simulations
Yavuz, Esin; Turner, James; Nowotny, Thomas
2016-01-01
Large-scale numerical simulations of detailed brain circuit models are important for identifying hypotheses on brain functions and testing their consistency and plausibility. An ongoing challenge for simulating realistic models is, however, computational speed. In this paper, we present the GeNN (GPU-enhanced Neuronal Networks) framework, which aims to facilitate the use of graphics accelerators for computational models of large-scale neuronal networks to address this challenge. GeNN is an open source library that generates code to accelerate the execution of network simulations on NVIDIA GPUs, through a flexible and extensible interface, which does not require in-depth technical knowledge from the users. We present performance benchmarks showing that 200-fold speedup compared to a single core of a CPU can be achieved for a network of one million conductance based Hodgkin-Huxley neurons but that for other models the speedup can differ. GeNN is available for Linux, Mac OS X and Windows platforms. The source code, user manual, tutorials, Wiki, in-depth example projects and all other related information can be found on the project website http://genn-team.github.io/genn/. PMID:26740369
Conroy, M.J.; Runge, M.C.; Nichols, J.D.; Stodola, K.W.; Cooper, R.J.
2011-01-01
The broad physical and biological principles behind climate change and its potential large-scale ecological impacts on biota are fairly well understood, although likely responses of biotic communities at fine spatio-temporal scales are not, limiting the ability of conservation programs to respond effectively to climate change outside the range of human experience. Much of the climate debate has focused on attempts to resolve key uncertainties in a hypothesis-testing framework. However, conservation decisions cannot await resolution of these scientific issues and instead must proceed in the face of uncertainty. We suggest that conservation should proceed in an adaptive management framework, in which decisions are guided by predictions under multiple, plausible hypotheses about climate impacts. Under this plan, monitoring is used to evaluate the response of the system to climate drivers, and management actions (perhaps experimental) are used to confront testable predictions with data, in turn providing feedback for future decision making. We illustrate these principles with the problem of mitigating the effects of climate change on terrestrial bird communities in the southern Appalachian Mountains, USA. © 2010 Elsevier Ltd.
Implementation of the Leaching Environmental Assessment ...
LEAF provides a uniform and integrated approach for evaluating leaching from solid materials (e.g., waste, treated wastes such as those treated by solidification/stabilization, secondary materials such as blast furnace slags, energy residuals such as coal fly ash, soil, sediments, and mining and mineral processing wastes). Assessment using LEAF applies a stepwise approach that considers the leaching behavior of COPCs in response to the chemical and physical factors and material properties that control release across a range of plausible field conditions (US EPA, 2010). The framework provides the flexibility to tailor testing to site conditions and to select the extent of testing based on assessment objectives and the level of detailed information needed to support decision-making. The main focus is to discuss the implementation of LEAF in the US and the How To Guide that has recently been completed, to present the How To Guide for the implementation of the leaching environmental assessment framework to an international audience already familiar with comparable leaching tests in use in Europe, and to meet with European colleagues regarding their interest in expanding the methods to include organics.
NASA Astrophysics Data System (ADS)
Liu, Y.; Gupta, H.; Wagener, T.; Stewart, S.; Mahmoud, M.; Hartmann, H.; Springer, E.
2007-12-01
Some of the most challenging issues facing contemporary water resources management are those typified by complex coupled human-environmental systems with poorly characterized uncertainties. In other words, major decisions regarding water resources have to be made in the face of substantial uncertainty and complexity. It has been suggested that integrated models can be used to coherently assemble information from a broad set of domains, and can therefore serve as an effective means for tackling the complexity of environmental systems. Further, well-conceived scenarios can effectively inform decision making, particularly when high complexity and poorly characterized uncertainties make the problem intractable via traditional uncertainty analysis methods. This presentation discusses the integrated modeling framework adopted by SAHRA, an NSF Science & Technology Center, to investigate stakeholder-driven water sustainability issues within the semi-arid southwestern US. The multi-disciplinary, multi-resolution modeling framework incorporates a formal scenario approach to analyze the impacts of plausible (albeit uncertain) alternative futures to support adaptive management of water resources systems. Some of the major challenges involved in, and lessons learned from, this effort will be discussed.
Crystal structure and cation exchanging properties of a novel open framework phosphate of Ce (IV)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bevara, Samatha; Achary, S. N., E-mail: sachary@barc.gov.in; Tyagi, A. K.
2016-05-23
Herein we report the preparation, crystal structure and ion exchange properties of a new phosphate of tetravalent cerium, K2Ce(PO4)2. A monoclinic structure with a framework-type arrangement of Ce(PO4)6 units, formed by CeO8 square antiprisms and PO4 tetrahedra, is assigned to K2Ce(PO4)2. The K+ ions occupy the channels formed by the Ce(PO4)6 units and provide overall charge neutrality. The unique channel-type arrangement of the K+ ions makes them exchangeable with other cations. The ion exchange properties of K2Ce(PO4)2 have been investigated by equilibrating with a solution of 90Sr followed by radiometric analysis. Under optimum conditions, significant exchange of K+ with Sr2+, with Kd ~ 8000 mL/g, is observed. The details of the crystal structure and ion exchange properties are explained and a plausible mechanism for ion exchange is presented.
Bramley, Neil R; Lagnado, David A; Speekenbrink, Maarten
2015-05-01
Interacting with a system is key to uncovering its causal structure. A computational framework for interventional causal learning has been developed over the last decade, but how real causal learners might achieve or approximate the computations entailed by this framework is still poorly understood. Here we describe an interactive computer task in which participants were incentivized to learn the structure of probabilistic causal systems through free selection of multiple interventions. We develop models of participants' intervention choices and online structure judgments, using expected utility gain, probability gain, and information gain and introducing plausible memory and processing constraints. We find that successful participants are best described by a model that acts to maximize information (rather than expected score or probability of being correct); that forgets much of the evidence received in earlier trials; but that mitigates this by being conservative, preferring structures consistent with earlier stated beliefs. We explore 2 heuristics that partly explain how participants might be approximating these models without explicitly representing or updating a hypothesis space. (c) 2015 APA, all rights reserved.
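A toy version of the central computation (scoring candidate interventions by their expected information gain over a small hypothesis space of causal structures) is sketched below; the two-variable structures, causal strength, and base rate are invented for illustration and are far simpler than the systems used in the task.

```python
import numpy as np
from itertools import product

# Candidate structures over binary variables (A, B); strength and base rate are illustrative.
p, base = 0.8, 0.2
structures = ["A->B", "B->A", "independent"]
prior = {s: 1 / 3 for s in structures}

def joint(structure, do_A):
    """P(a, b) under the intervention do(A = do_A); intervening on A cuts any edge into A."""
    table = {}
    for a, b in product([0, 1], repeat=2):
        pa = 1.0 if a == do_A else 0.0
        if structure == "A->B":
            pb = p if a else base
        else:                      # "B->A" or "independent": B keeps its base rate
            pb = base
        table[(a, b)] = pa * (pb if b else 1 - pb)
    return table

def entropy(probs):
    probs = np.array([q for q in probs if q > 0])
    return float(-(probs * np.log2(probs)).sum())

def expected_info_gain(do_A):
    """Expected reduction in uncertainty about the structure after do(A = do_A)."""
    H_prior = entropy(prior.values())
    gain = 0.0
    for a, b in product([0, 1], repeat=2):
        p_data = sum(prior[s] * joint(s, do_A)[(a, b)] for s in structures)
        if p_data == 0:
            continue
        posterior = [prior[s] * joint(s, do_A)[(a, b)] / p_data for s in structures]
        gain += p_data * (H_prior - entropy(posterior))
    return gain

for val in [0, 1]:
    print(f"do(A={val}): expected information gain = {expected_info_gain(val):.3f} bits")
```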
NASA Astrophysics Data System (ADS)
Ghimire, B.; Riley, W. J.; Koven, C.
2013-12-01
Nitrogen is the most important nutrient limiting plant carbon assimilation and growth, and is required for production of photosynthetic enzymes, growth and maintenance respiration, and maintaining cell structure. The forecasted rise in plant-available nitrogen through atmospheric nitrogen deposition and the release of locked soil nitrogen by permafrost thaw in high latitude ecosystems is likely to result in an increase in plant productivity. However, a mechanistic representation of plant nitrogen dynamics is lacking in earth system models. Most earth system models ignore the dynamic nature of plant nutrient uptake and allocation, and further lack tight coupling of below- and above-ground processes. In these models, the increase in nitrogen uptake does not translate to a corresponding increase in photosynthesis parameters, such as maximum Rubisco capacity and electron transfer rate. We present an improved modeling framework implemented in the Community Land Model version 4.5 (CLM4.5) for dynamic plant nutrient uptake and allocation to different plant parts, including leaf enzymes. This modeling framework relies on imposing a more realistic flexible carbon to nitrogen stoichiometric ratio for different plant parts. The model mechanistically responds to plant nitrogen uptake and leaf allocation through changes in photosynthesis parameters. We produce global simulations and examine the impacts of the improved nitrogen cycling. The improved model is evaluated against multiple observations including the TRY database of global plant traits, nitrogen fertilization observations and 15N tracer studies. Global simulations with this new version of CLM4.5 showed better agreement with the observations than the default CLM4.5-CN model, and captured the underlying mechanisms associated with the plant nitrogen cycle.
NASA Astrophysics Data System (ADS)
Malek, Keyvan; Stöckle, Claudio; Chinnayakanahalli, Kiran; Nelson, Roger; Liu, Mingliang; Rajagopalan, Kirti; Barik, Muhammad; Adam, Jennifer C.
2017-08-01
Food supply is affected by a complex nexus of land, atmosphere, and human processes, including short- and long-term stressors (e.g., drought and climate change, respectively). A simulation platform that captures these complex elements can be used to inform policy and best management practices to promote sustainable agriculture. We have developed a tightly coupled framework using the macroscale variable infiltration capacity (VIC) hydrologic model and the CropSyst agricultural model. A mechanistic irrigation module was also developed for inclusion in this framework. Because VIC-CropSyst combines two widely used and mechanistic models (for crop phenology, growth, management, and macroscale hydrology), it can provide realistic and hydrologically consistent simulations of water availability, crop water requirements for irrigation, and agricultural productivity for both irrigated and dryland systems. This allows VIC-CropSyst to provide managers and decision makers with reliable information on regional water stresses and their impacts on food production. Additionally, VIC-CropSyst is being used in conjunction with socioeconomic models, river system models, and atmospheric models to simulate feedback processes between regional water availability, agricultural water management decisions, and land-atmosphere interactions. The performance of VIC-CropSyst was evaluated on both regional (over the US Pacific Northwest) and point scales. Point-scale evaluation involved using two flux tower sites located in agricultural fields in the US (Nebraska and Illinois). The agreement between recorded and simulated evapotranspiration (ET), applied irrigation water, soil moisture, leaf area index (LAI), and yield indicated that, although the model is intended to work on regional scales, it also captures field-scale processes in agricultural areas.
Wittwehr, Clemens; Aladjov, Hristo; Ankley, Gerald; Byrne, Hugh J; de Knecht, Joop; Heinzle, Elmar; Klambauer, Günter; Landesmann, Brigitte; Luijten, Mirjam; MacKay, Cameron; Maxwell, Gavin; Meek, M E Bette; Paini, Alicia; Perkins, Edward; Sobanski, Tomasz; Villeneuve, Dan; Waters, Katrina M; Whelan, Maurice
2017-02-01
Efforts are underway to transform regulatory toxicology and chemical safety assessment from a largely empirical science based on direct observation of apical toxicity outcomes in whole organism toxicity tests to a predictive one in which outcomes and risk are inferred from accumulated mechanistic understanding. The adverse outcome pathway (AOP) framework provides a systematic approach for organizing knowledge that may support such inference. Likewise, computational models of biological systems at various scales provide another means and platform to integrate current biological understanding to facilitate inference and extrapolation. We argue that the systematic organization of knowledge into AOP frameworks can inform and help direct the design and development of computational prediction models that can further enhance the utility of mechanistic and in silico data for chemical safety assessment. This concept was explored as part of a workshop on AOP-Informed Predictive Modeling Approaches for Regulatory Toxicology held September 24-25, 2015. Examples of AOP-informed model development and its application to the assessment of chemicals for skin sensitization and multiple modes of endocrine disruption are provided. The role of problem formulation, not only as a critical phase of risk assessment, but also as guide for both AOP and complementary model development is described. Finally, a proposal for actively engaging the modeling community in AOP-informed computational model development is made. The contents serve as a vision for how AOPs can be leveraged to facilitate development of computational prediction models needed to support the next generation of chemical safety assessment. © The Author 2016. Published by Oxford University Press on behalf of the Society of Toxicology.
2011-01-01
Animal models of psychiatric disorders are usually discussed with regard to three criteria first elaborated by Willner: face, predictive and construct validity. Here, we trace the history of these concepts and then try to redraw and refine these criteria, using the framework of the diathesis model of depression that has been proposed by several authors. We thus propose a set of five major criteria (with sub-categories for some of them): homological validity (including species validity and strain validity), pathogenic validity (including ontopathogenic validity and triggering validity), mechanistic validity, face validity (including ethological and biomarker validity) and predictive validity (including induction and remission validity). Homological validity requires that an adequate species and strain be chosen: considering species validity, primates will be considered to have a higher score than drosophila, and considering strains, a high stress reactivity in a strain scores higher than a low stress reactivity in another strain. Pathogenic validity corresponds to the fact that, in order to shape pathological characteristics, the organism has been manipulated both during the developmental period (for example, maternal separation: ontopathogenic validity) and during adulthood (for example, stress: triggering validity). Mechanistic validity corresponds to the fact that the cognitive (for example, cognitive bias) or biological mechanisms (such as dysfunction of the hormonal stress axis regulation) underlying the disorder are identical in both humans and animals. Face validity corresponds to the observable behavioral (ethological validity) or biological (biomarker validity) outcomes: for example anhedonic behavior (ethological validity) or elevated corticosterone (biomarker validity). Finally, predictive validity corresponds to the identity of the relationship between the triggering factor and the outcome (induction validity) and between the effects of the treatments on the two organisms (remission validity). The relevance of this framework is then discussed regarding various animal models of depression. PMID:22738250
How do Changes in Hydro-Climate Conditions Alter the Risk of Infection With Fasciolosis?
NASA Astrophysics Data System (ADS)
Beltrame, L.; Dunne, T.; Rose, H.; Walker, J.; Morgan, E.; Vickerman, P.; Wagener, T.
2017-12-01
Fasciolosis is a widespread parasitic disease of livestock and is emerging as a major zoonosis. Since the parasite and its intermediate host live and develop in the environment, risk of infection is directly affected by climatic-environmental conditions. Changes in disease prevalence, seasonality and distribution have been reported in recent years and attributed to altered temperature and rainfall patterns, raising concerns about the effects of climate change in the future. Therefore, it is urgent to understand how changes in climate-environmental drivers may alter the dynamics of disease risk in a quantitative way, to guide parasite control strategies and interventions in the coming decades. In previous work, we developed and tested a novel mechanistic hydro-epidemiological model for Fasciolosis, which explicitly represents the parasite life-cycle in connection with key environmental processes, allowing us to capture the impact of previously unseen conditions. In this study, we use the new mechanistic model to assess the sensitivity of infection rates to changes in climate-environmental factors. This is challenging as processes underlying disease transmission are complex and interacting, and may have contrasting effects on the parasite life-cycle stages. To this end, we set up a sensitivity analysis framework to investigate in a structured way which factors play a key role in controlling the magnitude, timing and spread of infection, and how the sensitivity of disease risk varies in time and space. Moreover, we define synthetic scenarios to explore the space of possible variability of the hydro-climate drivers and investigate conditions that lead to critical levels of infection. The study shows how the new model combined with the sensitivity analysis framework can support decision-making, providing useful information for disease management.
2011-01-01
Background Bacteria have evolved a rich set of mechanisms for sensing and adapting to adverse conditions in their environment. These are crucial for their survival, which requires them to react to extracellular stresses such as heat shock, ethanol treatment or phage infection. Here we focus on studying the phage shock protein (Psp) stress response in Escherichia coli induced by a phage infection or other damage to the bacterial membrane. This system has not yet been theoretically modelled or analysed in silico. Results We develop a model of the Psp response system, and illustrate how such models can be constructed and analyzed in light of available sparse and qualitative information in order to generate novel biological hypotheses about their dynamical behaviour. We analyze this model using tools from Petri-net theory and study its dynamical range that is consistent with currently available knowledge by conditioning model parameters on the available data in an approximate Bayesian computation (ABC) framework. Within this ABC approach we analyze stochastic and deterministic dynamics. This analysis allows us to identify different types of behaviour and these mechanistic insights can in turn be used to design new, more detailed and time-resolved experiments. Conclusions We have developed the first mechanistic model of the Psp response in E. coli. This model allows us to predict the possible qualitative stochastic and deterministic dynamic behaviours of key molecular players in the stress response. Our inferential approach can be applied to stress response and signalling systems more generally: in the ABC framework we can condition mathematical models on qualitative data in order to delimit e.g. parameter ranges or the qualitative system dynamics in light of available end-point or qualitative information. PMID:21569396
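The sketch below is not the Psp model itself; it illustrates the general inferential move the abstract describes, namely approximate Bayesian computation in which parameter draws are accepted when the simulated dynamics match qualitative criteria (negligible before a stimulus, induced afterwards, transient) rather than a quantitative distance. The kinetics, priors, and thresholds are all placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(k_on, k_off, T=100, dt=0.1):
    """Toy two-species dynamics standing in for a stress-response module:
    signal S is switched on by a pulse between t=20 and t=30, response R is induced by S."""
    n = int(T / dt)
    S, R = 0.0, 0.0
    traj = np.zeros(n)
    for i in range(n):
        t = i * dt
        stimulus = 1.0 if 20 <= t < 30 else 0.0
        S += dt * (stimulus - 0.1 * S)
        R += dt * (k_on * S - k_off * R)
        traj[i] = R
    return traj

def qualitative_ok(traj, dt=0.1):
    """Accept parameters only if the response is (i) negligible before the stimulus,
    (ii) clearly induced afterwards, and (iii) transient (relaxes well below its peak)."""
    pre = traj[: int(20 / dt)]
    post = traj[int(20 / dt):]
    return pre.max() < 0.05 and post.max() > 1.0 and post[-1] < 0.5 * post.max()

# ABC rejection: sample parameters from priors, keep those matching the qualitative data
accepted = []
for _ in range(5000):
    k_on, k_off = rng.uniform(0, 2), rng.uniform(0, 2)
    if qualitative_ok(simulate(k_on, k_off)):
        accepted.append((k_on, k_off))
print(len(accepted), "accepted;",
      np.array(accepted).mean(axis=0).round(2) if accepted else "none")
```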
The ecological impacts of nighttime light pollution: a mechanistic appraisal.
Gaston, Kevin J; Bennie, Jonathan; Davies, Thomas W; Hopkins, John
2013-11-01
The ecological impacts of nighttime light pollution have been a longstanding source of concern, accentuated by realized and projected growth in electrical lighting. As human communities and lighting technologies develop, artificial light increasingly modifies natural light regimes by encroaching on dark refuges in space, in time, and across wavelengths. A wide variety of ecological implications of artificial light have been identified. However, the primary research to date is largely focused on the disruptive influence of nighttime light on higher vertebrates, and while comprehensive reviews have been compiled along taxonomic lines and within specific research domains, the subject is in need of synthesis within a common mechanistic framework. Here we propose such a framework that focuses on the cross-factoring of the ways in which artificial lighting alters natural light regimes (spatially, temporally, and spectrally), and the ways in which light influences biological systems, particularly the distinction between light as a resource and light as an information source. We review the evidence for each of the combinations of this cross-factoring. As artificial lighting alters natural patterns of light in space, time and across wavelengths, natural patterns of resource use and information flows may be disrupted, with downstream effects to the structure and function of ecosystems. This review highlights: (i) the potential influence of nighttime lighting at all levels of biological organisation (from cell to ecosystem); (ii) the significant impact that even low levels of nighttime light pollution can have; and (iii) the existence of major research gaps, particularly in terms of the impacts of light at population and ecosystem levels, identification of intensity thresholds, and the spatial extent of impacts in the vicinity of artificial lights. © 2013 The Authors. Biological Reviews © 2013 Cambridge Philosophical Society.
Sex Differences in Animal Models: Focus on Addiction
Becker, Jill B.
2016-01-01
The purpose of this review is to discuss ways to think about and study sex differences in preclinical animal models. We use the framework of addiction, in which animal models have excellent face and construct validity, to illustrate the importance of considering sex differences. There are four types of sex differences: qualitative, quantitative, population, and mechanistic. A better understanding of the ways males and females can differ will help scientists design experiments to characterize better the presence or absence of sex differences in new phenomena that they are investigating. We have outlined major quantitative, population, and mechanistic sex differences in the addiction domain using a heuristic framework of the three established stages of the addiction cycle: binge/intoxication, withdrawal/negative affect, and preoccupation/anticipation. Female rats, in general, acquire the self-administration of drugs and alcohol more rapidly, escalate their drug taking with extended access more rapidly, show more motivational withdrawal, and (where tested in animal models of “craving”) show greater reinstatement. The one exception is that female rats show less motivational withdrawal to alcohol. The bases for these quantitative sex differences appear to be both organizational, in that estradiol-treated neonatal animals show the male phenotype, and activational, in that the female phenotype depends on the effects of gonadal hormones. In animals, differences within the estrous cycle can be observed but are relatively minor. Such hormonal effects seem to be most prevalent during the acquisition of drug taking and less influential once compulsive drug taking is established and are linked largely to progesterone and estradiol. This review emphasizes not only significant differences in the phenotypes of females and males in the domain of addiction but emphasizes the paucity of data to date in our understanding of those differences. PMID:26772794
Semantic and Plausibility Preview Benefit Effects in English: Evidence from Eye Movements
Schotter, Elizabeth R.; Jia, Annie
2016-01-01
Theories of preview benefit in reading hinge on integration across saccades and the idea that preview benefit is greater the more similar the preview and target are. Schotter (2013) reported preview benefit from a synonymous preview, but it is unclear whether this effect occurs because of similarity between the preview and target (integration), or because of contextual fit of the preview—synonyms satisfy both accounts. Studies in Chinese have found evidence for preview benefit for words that are unrelated to the target, but are contextually plausible (Yang, Li, Wang, Slattery, & Rayner, 2014; Yang, Wang, Tong, & Rayner, 2012), which is incompatible with an integration account but supports a contextual fit account. Here, we used plausible and implausible unrelated previews in addition to plausible synonym, antonym, and identical previews to further investigate these accounts for readers of English. Early reading measures were shorter for all plausible preview conditions compared to the implausible preview condition. In later reading measures, a benefit for the plausible unrelated preview condition was not observed. In a second experiment, we asked questions that probed whether the reader encoded the preview or target. Readers were more likely to report the preview when they had skipped the word and not regressed to it, and when the preview was plausible. Thus, under certain circumstances, the preview word is processed to a high level of representation (i.e., semantic plausibility) regardless of its relationship to the target, but its influence on reading is relatively short-lived, being replaced by the target word, when fixated. PMID:27123754
Günther, Fritz; Marelli, Marco
2016-01-01
Noun compounds, consisting of two nouns (the head and the modifier) that are combined into a single concept, differ in terms of their plausibility: school bus is a more plausible compound than saddle olive. The present study investigates which factors influence the plausibility of attested and novel noun compounds. Distributional Semantic Models (DSMs) are used to obtain formal (vector) representations of word meanings, and compositional methods in DSMs are employed to obtain such representations for noun compounds. From these representations, different plausibility measures are computed. Three of those measures contribute in predicting the plausibility of noun compounds: The relatedness between the meaning of the head noun and the compound (Head Proximity), the relatedness between the meaning of modifier noun and the compound (Modifier Proximity), and the similarity between the head noun and the modifier noun (Constituent Similarity). We find non-linear interactions between Head Proximity and Modifier Proximity, as well as between Modifier Proximity and Constituent Similarity. Furthermore, Constituent Similarity interacts non-linearly with the familiarity with the compound. These results suggest that a compound is perceived as more plausible if it can be categorized as an instance of the category denoted by the head noun, if the contribution of the modifier to the compound meaning is clear but not redundant, and if the constituents are sufficiently similar in cases where this contribution is not clear. Furthermore, compounds are perceived to be more plausible if they are more familiar, but mostly for cases where the relation between the constituents is less clear. PMID:27732599
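The three measures can be written directly as cosine similarities between constituent vectors and a composed compound vector. The sketch below uses random vectors and plain additive composition purely to show the computation; the study itself derives vectors from corpus data and uses trained compositional models, so the numbers produced here are not meaningful.

```python
import numpy as np

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Toy distributional vectors; real DSM vectors would come from corpus co-occurrences
# or an embedding model rather than random draws.
rng = np.random.default_rng(0)
vectors = {w: rng.normal(size=50) for w in ["school", "bus", "saddle", "olive"]}

def plausibility_measures(modifier, head):
    compound = vectors[modifier] + vectors[head]          # simple additive composition
    return {
        "head_proximity": cosine(vectors[head], compound),
        "modifier_proximity": cosine(vectors[modifier], compound),
        "constituent_similarity": cosine(vectors[modifier], vectors[head]),
    }

print(plausibility_measures("school", "bus"))
print(plausibility_measures("saddle", "olive"))
```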
Lustig, Audrey; Worner, Susan P; Pitt, Joel P W; Doscher, Crile; Stouffer, Daniel B; Senay, Senait D
2017-10-01
Natural and human-induced events are continuously altering the structure of our landscapes and as a result impacting the spatial relationships between individual landscape elements and the species living in the area. Yet, only recently has the influence of the surrounding landscape on invasive species spread started to be considered. The scientific community increasingly recognizes the need for a broader modeling framework that focuses on cross-study comparisons at different spatiotemporal scales. Using two illustrative examples, we introduce a general modeling framework that allows for a systematic investigation of the effect of habitat change on invasive species establishment and spread. The essential parts of the framework are (i) a mechanistic spatially explicit model (a modular dispersal framework, MDIG) that allows population dynamics and dispersal to be modeled in a geographical information system (GIS), (ii) a landscape generator that allows replicated landscape patterns with partially controllable spatial properties to be generated, and (iii) landscape metrics that depict the essential aspects of the landscape with which dispersal and demographic processes interact. The modeling framework provides functionality for a wide variety of applications ranging from predictions of the spatiotemporal spread of real species and comparison of potential management strategies, to theoretical investigation of the effect of habitat change on population dynamics. Such a framework makes it possible to quantify how small-grain landscape characteristics, such as habitat size and habitat connectivity, interact with life-history traits to determine the dynamics of invasive species spread in fragmented landscapes. As such, it will give deeper insights into species traits and landscape features that lead to establishment and spread success and may be key to preventing new incursions and to developing efficient monitoring, surveillance, control or eradication programs.
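A minimal stand-in for the framework's three parts (a landscape generator with controllable spatial autocorrelation, a stochastic dispersal/establishment process, and a summary metric over replicated landscapes) is sketched below in plain numpy/scipy; MDIG itself runs within a GIS, and this toy model is not its implementation. All grid sizes, fractions, and probabilities are illustrative.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(2)

def neutral_landscape(shape=(100, 100), habitat_fraction=0.4, autocorr=3.0):
    """Replicated random landscapes: smooth a white-noise field and threshold it
    so that `habitat_fraction` of cells are suitable habitat."""
    field = gaussian_filter(rng.normal(size=shape), autocorr)
    cutoff = np.quantile(field, 1 - habitat_fraction)
    return field > cutoff

def spread(habitat, steps=30, p_colonize=0.3):
    """Very simple stochastic spread: occupied cells colonize suitable 4-neighbours."""
    occ = np.zeros_like(habitat, dtype=bool)
    occ[habitat.shape[0] // 2, habitat.shape[1] // 2] = True   # single introduction point
    for _ in range(steps):
        neigh = np.zeros_like(occ)
        neigh[1:, :] |= occ[:-1, :]; neigh[:-1, :] |= occ[1:, :]
        neigh[:, 1:] |= occ[:, :-1]; neigh[:, :-1] |= occ[:, 1:]
        newly = neigh & habitat & ~occ & (rng.random(occ.shape) < p_colonize)
        occ |= newly
    return occ

# Compare spread extent across replicated landscapes with different clumpiness
for autocorr in [1.0, 5.0]:
    extents = [spread(neutral_landscape(autocorr=autocorr)).sum() for _ in range(10)]
    print(f"autocorrelation {autocorr}: mean occupied cells = {np.mean(extents):.0f}")
```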
ERIC Educational Resources Information Center
Gauld, Colin
1998-01-01
Reports that many students do not believe Newton's law of action and reaction and suggests ways in which its plausibility might be enhanced. Reviews how this law has been made more plausible over time by Newton and those who succeeded him. Contains 25 references. (DDR)
Plausibility Reappraisals and Shifts in Middle School Students' Climate Change Conceptions
ERIC Educational Resources Information Center
Lombardi, Doug; Sinatra, Gale M.; Nussbaum, E. Michael
2013-01-01
Plausibility is a central but under-examined topic in conceptual change research. Climate change is an important socio-scientific topic; however, many view human-induced climate change as implausible. When learning about climate change, students need to make plausibility judgments but they may not be sufficiently critical or reflective. The…
NASA Astrophysics Data System (ADS)
Gallice, A.
2015-12-01
Stream temperature controls important aspects of the riverine habitat, such as the rate of spawning or death of many fish species, or the concentration of numerous dissolved substances. In the current context of accelerating climate change, the future evolution of stream temperature is regarded as uncertain, particularly in the Alps. This uncertainty fostered the development of many prediction models, which are usually classified into two categories: mechanistic models and statistical models. Based on the numerical resolution of physical conservation laws, mechanistic models are generally considered to provide more reliable long-term estimates than regression models. However, despite their physical basis, these models are observed to differ quite significantly in some aspects of their implementation, notably (1) the routing of water in the river channel and (2) the estimation of the temperature of groundwater discharging into the stream. For each of these two aspects, we considered several of the standard modeling approaches reported in the literature and implemented them in a new modular framework. The latter is based on the spatially-distributed snow model Alpine3D, which is essentially used in the framework to compute the amount of water infiltrating in the upper soil layer. Starting from there, different methods can be selected for the computation of the water and energy fluxes in the hillslopes and in the river network. We relied on this framework to compare the various methodologies for river channel routing and groundwater temperature modeling. We notably assessed the impact of each of these approaches on the long-term stream temperature predictions of the model under a typical climate change scenario. The case study was conducted over a high Alpine catchment in Switzerland, whose hydrological and thermal regimes are expected to be markedly affected by climate change. The results show that the various modeling approaches lead to significant differences in the model predictions, and that these differences may be larger than the uncertainties in future air temperature. It is also shown that the temperature of groundwater discharging into the stream has a marked impact on the modeled stream temperature at the catchment outlet.
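The modular idea, swapping one submodel (here, groundwater temperature) while keeping the rest of the chain fixed, can be sketched as follows. This is a conceptual toy, not the Alpine3D-based framework; the mixing ratio and the two groundwater submodels are placeholders.

```python
# Conceptual sketch of the modular idea (not the Alpine3D framework itself):
# the groundwater-temperature submodel is swapped without touching the rest
# of the stream-temperature computation. Parameter values are illustrative.
def gw_temp_constant(air_temp_series, t_const=5.0):
    return [t_const for _ in air_temp_series]

def gw_temp_lagged_air(air_temp_series, window=30):
    # Moving average of air temperature as a proxy for shallow groundwater.
    out = []
    for i in range(len(air_temp_series)):
        lo = max(0, i - window + 1)
        out.append(sum(air_temp_series[lo:i + 1]) / (i + 1 - lo))
    return out

def stream_temp(air_temp_series, gw_model, mixing_ratio=0.3):
    gw = gw_model(air_temp_series)
    # Crude mixing of groundwater inflow and atmospherically driven water.
    return [mixing_ratio * g + (1 - mixing_ratio) * a
            for a, g in zip(air_temp_series, gw)]

air = [2 + 10 * (d / 365) for d in range(365)]  # toy warming trend
print(stream_temp(air, gw_temp_constant)[-1], stream_temp(air, gw_temp_lagged_air)[-1])
```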
Dual Nature of Translational Control by Regulatory BC RNAs
Eom, Taesun; Berardi, Valerio; Zhong, Jun; Risuleo, Gianfranco; Tiedge, Henri
2011-01-01
In higher eukaryotes, increasing evidence suggests, gene expression is to a large degree controlled by RNA. Regulatory RNAs have been implicated in the management of neuronal function and plasticity in mammalian brains. However, much of the molecular-mechanistic framework that enables neuronal regulatory RNAs to control gene expression remains poorly understood. Here, we establish molecular mechanisms that underlie the regulatory capacity of neuronal BC RNAs in the translational control of gene expression. We report that regulatory BC RNAs employ a two-pronged approach in translational control. One of two distinct repression mechanisms is mediated by C-loop motifs in BC RNA 3′ stem-loop domains. These C-loops bind to eIF4B and prevent the factor's interaction with 18S rRNA of the small ribosomal subunit. In the second mechanism, the central A-rich domains of BC RNAs target eIF4A, specifically inhibiting its RNA helicase activity. Thus, BC RNAs repress translation initiation in a bimodal mechanistic approach. As BC RNA functionality has evolved independently in rodent and primate lineages, our data suggest that BC RNA translational control was necessitated and implemented during mammalian phylogenetic development of complex neural systems. PMID:21930783
Nanoparticle-mediated growth factor delivery systems: A new way to treat Alzheimer's disease.
Lauzon, Marc-Antoine; Daviau, Alex; Marcos, Bernard; Faucheux, Nathalie
2015-05-28
The number of people diagnosed with Alzheimer's disease (AD) is increasing steadily as the world population ages, thus creating a huge socio-economic burden. Current treatments have only transient effects and concentrate on a single aspect of AD. There is much evidence suggesting that growth factors (GFs) have great therapeutic potential and can act on all AD hallmarks. Because GFs are prone to denaturation and clearance, a delivery system is required to ensure protection and sustained delivery. This review provides information about the latest advances in the development of GF delivery systems (GFDS) targeting the brain in terms of in vitro and in vivo effects in the context of AD and discusses new strategies designed to increase the availability and the specificity of GFs to the brain. This paper also discusses, on a mechanistic level, the different delivery hurdles encountered by the carrier or the GF itself from its injection site up to the brain tissue. The major mass transport phenomena influencing the delivery systems targeting the brain are addressed and insights are given about how mechanistic mathematical frameworks can be developed to use and optimize them. Copyright © 2015. Published by Elsevier B.V.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Preston, R.J.
The traveler attended the 1st International Conference on Biological Dosimetry in Madrid, Spain. This conference was organized to provide information to a general audience of biologists, physicists, radiotherapists, industrial hygiene personnel and individuals from related fields on the current ability of cytogenetic analysis to provide estimates of radiation dose in cases of occupational or environmental exposure. There is a growing interest in Spain in biological dosimetry because of the increased use of radiation sources for medical and occupational uses, and with this the anticipated and actual increase in numbers of overexposures. The traveler delivered the introductory lecture on "Biological Dosimetry: Mechanistic Concepts", which was intended to provide a framework by which the more applied lectures could be interpreted in a mechanistic way. A second component of the trip was to provide advice with regard to several recent cases of overexposure that had been or were being assessed by the Radiopathology and Radiotherapy Department of the Hospital General "Gregorio Maranon" in Madrid. The traveler had provided information on several of these, and had analyzed cells from some exposed or purportedly exposed individuals. The members of the biological dosimetry group were referred to individuals at REACTS at Oak Ridge Associated Universities for advice on follow-up treatment.
Marshall, Jill A; Roering, Joshua J; Bartlein, Patrick J; Gavin, Daniel G; Granger, Darryl E; Rempel, Alan W; Praskievicz, Sarah J; Hales, Tristram C
2015-11-01
Understanding climatic influences on the rates and mechanisms of landscape erosion is an unresolved problem in Earth science that is important for quantifying soil formation rates, sediment and solute fluxes to oceans, and atmospheric CO2 regulation by silicate weathering. Glaciated landscapes record the erosional legacy of glacial intervals through moraine deposits and U-shaped valleys, whereas more widespread unglaciated hillslopes and rivers lack obvious climate signatures, hampering mechanistic theory for how climate sets fluxes and form. Today, periglacial processes in high-elevation settings promote vigorous bedrock-to-regolith conversion and regolith transport, but the extent to which frost processes shaped vast swaths of low- to moderate-elevation terrain during past climate regimes is not well established. By combining a mechanistic frost weathering model with a regional Last Glacial Maximum (LGM) climate reconstruction derived from a paleo-Earth System Model, paleovegetation data, and a paleoerosion archive, we propose that frost-driven sediment production was pervasive during the LGM in our unglaciated Pacific Northwest study site, coincident with a 2.5 times increase in erosion relative to modern rates. Our findings provide a novel framework to quantify how climate modulates sediment production over glacial-interglacial cycles in mid-latitude unglaciated terrain.
Integrated presentation of ecological risk from multiple stressors
Goussen, Benoit; Price, Oliver R.; Rendal, Cecilie; Ashauer, Roman
2016-01-01
Current environmental risk assessments (ERA) do not account explicitly for ecological factors (e.g. species composition, temperature or food availability) and multiple stressors. Assessing mixtures of chemical and ecological stressors is needed as well as accounting for variability in environmental conditions and uncertainty of data and models. Here we propose a novel probabilistic ERA framework to overcome these limitations, which focusses on visualising assessment outcomes by constructing and interpreting prevalence plots as a quantitative prediction of risk. Key components include environmental scenarios that integrate exposure and ecology, and ecological modelling of relevant endpoints to assess the effect of a combination of stressors. Our illustrative results demonstrate the importance of regional differences in environmental conditions and the confounding interactions of stressors. Using this framework and prevalence plots provides a risk-based approach that combines risk assessment and risk management in a meaningful way and presents a truly mechanistic alternative to the threshold approach. Even whilst research continues to improve the underlying models and data, regulators and decision makers can already use the framework and prevalence plots. The integration of multiple stressors, environmental conditions and variability makes ERA more relevant and realistic. PMID:27782171
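A prevalence value of this kind can be read as the fraction of sampled environmental scenarios in which a modeled endpoint crosses an effect threshold. The sketch below is a toy Monte Carlo illustration of that reading; the effect model, parameter ranges and threshold are invented, not those of the proposed framework.

```python
# Toy sketch of the prevalence idea: the fraction of sampled environmental
# scenarios in which a modeled population endpoint drops below a threshold.
# The effect model and parameter ranges here are placeholders, not those of
# the cited framework.
import random

def population_endpoint(temperature, food, chemical_dose):
    # Placeholder effect model: growth reduced by dose, modified by ecology.
    growth = 1.0 + 0.02 * (temperature - 15) + 0.5 * food
    return growth * (1.0 - min(1.0, chemical_dose / 10.0))

def prevalence_of_risk(n=10000, threshold=0.8, region_temp=(8, 18)):
    hits = 0
    for _ in range(n):
        t = random.uniform(*region_temp)
        food = random.uniform(0.1, 1.0)
        dose = random.lognormvariate(0.0, 1.0)
        if population_endpoint(t, food, dose) < threshold:
            hits += 1
    return hits / n

print("prevalence of risk:", prevalence_of_risk())
```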
NASA Astrophysics Data System (ADS)
Lombardi, D.
2011-12-01
Plausibility judgments, although well represented in conceptual change theories (see, for example, Chi, 2005; diSessa, 1993; Dole & Sinatra, 1998; Posner et al., 1982), have received little empirical attention until our recent work investigating teachers' and students' understanding of and perceptions about human-induced climate change (Lombardi & Sinatra, 2010, 2011). In our first study with undergraduate students, we found that greater plausibility perceptions of human-induced climate change accounted for significantly greater understanding of weather and climate distinctions after instruction, even after accounting for students' prior knowledge (Lombardi & Sinatra, 2010). In a follow-up study with inservice science and preservice elementary teachers, we showed that anger about the topic of climate change and teaching about climate change was significantly related to implausible perceptions about human-induced climate change (Lombardi & Sinatra, 2011). Results from our recent studies helped to inform our development of a model of the role of plausibility judgments in conceptual change situations. The model applies to situations involving cognitive dissonance, where background knowledge conflicts with an incoming message. In such situations, we define plausibility as a judgment on the relative potential truthfulness of incoming information compared to one's existing mental representations (Rescher, 1976). Students may not consciously think when making plausibility judgments, expending only minimal mental effort in what is referred to as an automatic cognitive process (Stanovich, 2009). However, well-designed instruction could facilitate students' reappraisal of plausibility judgments in more effortful and conscious cognitive processing. Critical evaluation specifically may be one effective method to promote plausibility reappraisal in a classroom setting (Lombardi & Sinatra, in progress). In science education, critical evaluation involves the analysis of how evidentiary data support a hypothesis and its alternatives. The presentation will focus on how instruction promoting critical evaluation can encourage individuals to reappraise their plausibility judgments and initiate knowledge reconstruction. In a recent pilot study, teachers experienced an instructional scaffold promoting critical evaluation of two competing climate change theories (i.e., human-induced and increasing solar irradiance) and significantly changed both their plausibility judgments and perceptions of correctness toward the scientifically-accepted model of human-induced climate change. A comparison group of teachers who did not experience the critical evaluation activity showed no significant change. The implications of these studies for future research and instruction will be discussed in the presentation, including effective ways to increase students' and teachers' ability to be critically evaluative and reappraise their plausibility judgments. With controversial science issues, such as climate change, such abilities may be necessary to facilitate conceptual change.
The gut microbiota and obesity: from correlation to causality.
Zhao, Liping
2013-09-01
The gut microbiota has been linked with chronic diseases such as obesity in humans. However, the demonstration of causality between constituents of the microbiota and specific diseases remains an important challenge in the field. In this Opinion article, using Koch's postulates as a conceptual framework, I explore the chain of causation from alterations in the gut microbiota, particularly of the endotoxin-producing members, to the development of obesity in both rodents and humans. I then propose a strategy for identifying the causative agents of obesity in the human microbiota through a combination of microbiome-wide association studies, mechanistic analysis of host responses and the reproduction of diseases in gnotobiotic animals.
Gene Profiling in Experimental Models of Eye Growth: Clues to Myopia Pathogenesis
Stone, Richard A.; Khurana, Tejvir S.
2010-01-01
To understand the complex regulatory pathways that underlie the development of refractive errors, expression profiling has evaluated gene expression in ocular tissues of well-characterized experimental models that alter postnatal eye growth and induce refractive errors. Derived from a variety of platforms (e.g. differential display, spotted microarrays or Affymetrix GeneChips), gene expression patterns are now being identified in species that include chicken, mouse and primate. Reconciling available results is hindered by varied experimental designs and analytical/statistical features. Continued application of these methods offers promise to provide the much-needed mechanistic framework to develop therapies to normalize refractive development in children. PMID:20363242
NASA Astrophysics Data System (ADS)
Schindler, Stefan; Danzer, Michael A.
2017-03-01
Aiming at a long-term stable and safe operation of rechargeable lithium-ion cells, elementary design aspects and degradation phenomena have to be considered depending on the specific application. Among the degrees of freedom in cell design, electrode balancing is of particular interest and has a distinct effect on useable capacity and voltage range. Concerning intrinsic degradation modes, understanding the underlying electrochemical processes and tracing the overall degradation history are the most crucial tasks. In this study, a model-based, minimal-parameter framework for combined elucidation of electrode balancing and degradation pathways in commercial lithium-ion cells is introduced. The framework rests upon the simulation of full cell voltage profiles from the superposition of equivalent, artificially degraded half-cell profiles and allows aging contributions from loss of available lithium and from loss of active material in both electrodes to be separated. A physically meaningful coupling between thermodynamic and kinetic degradation modes based on the correlation between altered impedance features and loss of available lithium as well as loss of active material is proposed and validated by a low temperature degradation profile examined in one of our recent publications. The coupled framework is able to determine the electrode balancing within an error range of < 1% and the projected cell degradation is qualitatively and quantitatively in line with experimental observations.
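The superposition idea, a full-cell voltage curve built from two half-cell open-circuit potentials whose capacity scaling and offset emulate loss of active material and loss of cyclable lithium, can be sketched as follows. The OCV functions and parameter values are smooth placeholders, not the measured half-cell data or fitting procedure of the study.

```python
# Minimal sketch of the superposition idea: a full-cell voltage curve built
# from two half-cell open-circuit potentials whose capacity scaling and offset
# emulate loss of active material and loss of cyclable lithium. The OCV
# functions below are smooth placeholders, not measured half-cell data.
import numpy as np

def ocv_cathode(soc):   # placeholder positive-electrode potential vs. Li
    return 4.2 - 0.7 * soc - 0.1 * np.exp(5 * (soc - 1))

def ocv_anode(soc):     # placeholder negative-electrode potential vs. Li
    return 0.1 + 0.4 * np.exp(-8 * soc) + 0.05 * soc

def full_cell_voltage(q, q_pos, q_neg, offset_neg):
    """q: discharged capacity; q_pos/q_neg: usable half-cell capacities;
    offset_neg: balancing shift emulating loss of cyclable lithium."""
    soc_pos = q / q_pos                       # cathode lithiates on discharge
    soc_neg = 1.0 - (q + offset_neg) / q_neg  # anode delithiates on discharge
    return ocv_cathode(soc_pos) - ocv_anode(soc_neg)

q = np.linspace(0, 1.8, 200)  # Ah, illustrative
fresh = full_cell_voltage(q, q_pos=2.0, q_neg=2.2, offset_neg=0.05)
aged = full_cell_voltage(q, q_pos=1.8, q_neg=2.0, offset_neg=0.15)  # degraded electrodes
print(float(fresh[-1]), float(aged[-1]))
```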
Transformers: the changing phases of low-dimensional vanadium oxide bronzes.
Marley, Peter M; Horrocks, Gregory A; Pelcher, Kate E; Banerjee, Sarbajit
2015-03-28
In this feature article, we explore the electronic and structural phase transformations of ternary vanadium oxides with the composition MxV2O5 where M is an intercalated cation. The periodic arrays of intercalated cations ordered along quasi-1D tunnels or layered between 2D sheets of the V2O5 framework induce partial reduction of the framework vanadium atoms giving rise to charge ordering patterns that are specific to the metal M and stoichiometry x. This periodic charge ordering makes these materials remarkably versatile platforms for studying electron correlation and underpins the manifestation of phenomena such as colossal metal-insulator transitions, quantized charge corrals, and superconductivity. We describe current mechanistic understanding of these emergent phenomena with a particular emphasis on the benefits derived from scaling these materials to nanostructured dimensions wherein precise ordering of cations can be obtained and phase relationships can be derived that are entirely inaccessible in the bulk. In particular, structural transformations induced by intercalation are dramatically accelerated due to the shorter diffusion path lengths at nanometer-sized dimensions, which cause a dramatic reduction of kinetic barriers to phase transformations and facilitate interconversion between the different frameworks. We conclude by summarizing numerous technological applications that have become feasible due to recent advances in controlling the structural chemistry and both electronic and structural phase transitions in these versatile frameworks.
Banerjee, Debasis; Wang, Hao; Plonka, Anna M; Emge, Thomas J; Parise, John B; Li, Jing
2016-08-08
Gate-opening is a unique and interesting phenomenon commonly observed in flexible porous frameworks, where the pore characteristics and/or crystal structures change in response to external stimuli such as adding or removing guest molecules. For gate-opening that is induced by gas adsorption, the pore-opening pressure often varies for different adsorbate molecules and, thus, can be applied to selectively separate a gas mixture. The detailed understanding of this phenomenon is of fundamental importance to the design of industrially applicable gas-selective sorbents, but it remains under-investigated due to the lack of direct structural evidence for such systems. We report a mechanistic study of the gas-induced gate-opening process of a microporous metal-organic framework, [Mn(ina)2] (ina=isonicotinate), associated with commensurate adsorption, by a combination of several analytical techniques including single crystal X-ray diffraction, in situ powder X-ray diffraction coupled with differential scanning calorimetry (XRD-DSC), and gas adsorption-desorption methods. Our study reveals that the pronounced and reversible gate opening/closing phenomena observed in [Mn(ina)2] are coupled with a structural transition that involves rotation of the organic linker molecules as a result of interaction of the framework with adsorbed gas molecules including carbon dioxide and propane. The onset pressure to open the gate correlates with the extent of such interaction. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
High temperature, oxygen, and performance: Insights from reptiles and amphibians.
Gangloff, Eric J; Telemeco, Rory S
2018-04-25
Much recent theoretical and empirical work has sought to describe the physiological mechanisms underlying thermal tolerance in animals. Leading hypotheses can be broadly divided into two categories that primarily differ in organizational scale: 1) high temperature directly reduces the function of subcellular machinery, such as enzymes and cell membranes, or 2) high temperature disrupts system-level interactions, such as mismatches in the supply and demand of oxygen, prior to having any direct negative effect on the subcellular machinery. Nonetheless, a general framework describing the contexts under which either subcellular component or organ system failure limits organisms at high temperatures remains elusive. With this commentary, we leverage decades of research on the physiology of ectothermic tetrapods (amphibians and non-avian reptiles) to address these hypotheses. Available data suggest both mechanisms are important. Thus, we expand previous work and propose the Hierarchical Mechanisms of Thermal Limitation (HMTL) hypothesis, which explains how subcellular and organ system failures interact to limit performance and set tolerance limits at high temperatures. We further integrate this framework with the thermal performance curve paradigm commonly used to predict the effects of thermal environments on performance and fitness. The HMTL framework appears to successfully explain diverse observations in reptiles and amphibians and makes numerous predictions that remain untested. We hope that this framework spurs further research in diverse taxa and facilitates mechanistic forecasts of biological responses to climate change.
Adaptive invasive species distribution models: A framework for modeling incipient invasions
Uden, Daniel R.; Allen, Craig R.; Angeler, David G.; Corral, Lucia; Fricke, Kent A.
2015-01-01
The utilization of species distribution model(s) (SDM) for approximating, explaining, and predicting changes in species’ geographic locations is increasingly promoted for proactive ecological management. Although frameworks for modeling non-invasive species distributions are relatively well developed, their counterparts for invasive species—which may not be at equilibrium within recipient environments and often exhibit rapid transformations—are lacking. Additionally, adaptive ecological management strategies address the causes and effects of biological invasions and other complex issues in social-ecological systems. We conducted a review of biological invasions, species distribution models, and adaptive practices in ecological management, and developed a framework for adaptive, niche-based, invasive species distribution model (iSDM) development and utilization. This iterative, 10-step framework promotes consistency and transparency in iSDM development, allows for changes in invasive drivers and filters, integrates mechanistic and correlative modeling techniques, balances the avoidance of type 1 and type 2 errors in predictions, encourages the linking of monitoring and management actions, and facilitates incremental improvements in models and management across space, time, and institutional boundaries. These improvements are useful for advancing coordinated invasive species modeling, management and monitoring from local scales to the regional, continental and global scales at which biological invasions occur and harm native ecosystems and economies, as well as for anticipating and responding to biological invasions under continuing global change.
Liver Enzymes and Bone Mineral Density in the General Population.
Breitling, Lutz Philipp
2015-10-01
Liver enzyme serum levels within and just above the normal range are strong predictors of incident morbidity and mortality in the general population. However, despite the close links between hepatic pathology and impaired bone health, the association of liver enzymes with osteoporosis has hardly been investigated. The aim of the present study was to clarify whether serum liver enzyme levels in the general population are associated with bone mineral density. This was an observational, cross-sectional study. Participants and Main Outcome: Data on 13 849 adult participants of the Third National Health and Nutrition Examination Survey were used to quantify the independent associations of γ-glutamyltransferase, alanine transaminase, and aspartate transaminase with femoral neck bone mineral density assessed by dual-energy x-ray absorptiometry. In multiple regression models adjusting for numerous confounding variables, γ-glutamyltransferase showed a weak inverse association with bone mineral density (P = .0063). There also was limited evidence of a nonmonotonous relationship with alanine transaminase, with peak bone mineral density in the second quartile of enzyme activity (P = .0039). No association was found for aspartate transaminase. Although mechanistically plausible associations were found in the present study, the rather weak nature of these patterns renders it unlikely that liver enzyme levels could be of substantial use for osteoporosis risk stratification in the general population.
Efficient high light acclimation involves rapid processes at multiple mechanistic levels.
Dietz, Karl-Josef
2015-05-01
Like no other chemical or physical parameter, the natural light environment of plants changes with high speed and jumps of enormous intensity. To cope with this variability, photosynthetic organisms have evolved sensing and response mechanisms that allow efficient acclimation. Most signals originate from the chloroplast itself. In addition to very fast photochemical regulation, intensive molecular communication is realized within the photosynthesizing cell, optimizing the acclimation process. Current research has opened up new perspectives on plausible but mostly unexpected complexity in signalling events, crosstalk, and process adjustments. Within seconds and minutes, redox states, levels of reactive oxygen species, metabolites, and hormones change and transmit information to the cytosol, modifying metabolic activity, gene expression, translation activity, and alternative splicing events. Signalling pathways on an intermediate time scale of several minutes to a few hours pave the way for long-term acclimation. Thereby, a new steady state of the transcriptome, proteome, and metabolism is realized within rather short time periods irrespective of the previous acclimation history to shade or sun conditions. This review provides a time line of events during six hours in the 'stressful' life of a plant. © The Author 2015. Published by Oxford University Press on behalf of the Society for Experimental Biology. All rights reserved. For permissions, please email: journals.permissions@oup.com.
Philipp, Bodo; Hoff, Malte; Germa, Florence; Schink, Bernhard; Beimborn, Dieter; Mersch-Sundermann, Volker
2007-02-15
Prediction of the biodegradability of organic compounds is an ecologically desirable and economically feasible tool for estimating the environmental fate of chemicals. We combined quantitative structure-activity relationships (QSAR) with the systematic collection of biochemical knowledge to establish rules for the prediction of aerobic biodegradation of N-heterocycles. Validated biodegradation data of 194 N-heterocyclic compounds were analyzed using the MULTICASE method, which delivered two QSAR models based on 17 activating (QSAR 1) and on 16 inactivating molecular fragments (QSAR 2), which were statistically significantly linked to efficient or poor biodegradability, respectively. The percentages of correct classifications were over 99% for both models, and cross-validation resulted in 67.9% (QSAR 1) and 70.4% (QSAR 2) correct predictions. Biochemical interpretation of the activating and inactivating characteristics of the molecular fragments delivered plausible mechanistic interpretations and enabled us to establish the following biodegradation rules: (1) Target sites for amidohydrolases and for cytochrome P450 monooxygenases enhance biodegradation of nonaromatic N-heterocycles. (2) Target sites for molybdenum hydroxylases enhance biodegradation of aromatic N-heterocycles. (3) Target sites for hydration by an urocanase-like mechanism enhance biodegradation of imidazoles. Our complementary approach represents a feasible strategy for generating concrete rules for the prediction of biodegradability of organic compounds.
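The general shape of a fragment-based screen, matching activating and inactivating substructures and combining the counts into a call, can be illustrated with RDKit. The SMARTS patterns and the simple voting rule below are invented stand-ins; they are not the MULTICASE fragments or statistics reported here.

```python
# Toy illustration of a fragment-based biodegradability screen using RDKit
# substructure matching. The SMARTS patterns and the voting rule are
# illustrative stand-ins, not the MULTICASE fragments or statistics.
from rdkit import Chem

ACTIVATING = [Chem.MolFromSmarts(s) for s in ["C(=O)N", "[OX2H]"]]          # e.g. amide, hydroxyl
INACTIVATING = [Chem.MolFromSmarts(s) for s in ["c1ccncc1", "[F,Cl,Br,I]"]]  # e.g. pyridine ring, halogen

def predicted_biodegradable(smiles):
    mol = Chem.MolFromSmiles(smiles)
    if mol is None:
        raise ValueError(f"could not parse {smiles}")
    activating = sum(mol.HasSubstructMatch(p) for p in ACTIVATING)
    inactivating = sum(mol.HasSubstructMatch(p) for p in INACTIVATING)
    return activating > inactivating   # crude vote, for illustration only

print(predicted_biodegradable("OCCn1ccnc1"))   # hydroxyethyl imidazole (toy example)
```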
Brain mechanisms for perceptual and reward-related decision-making.
Deco, Gustavo; Rolls, Edmund T; Albantakis, Larissa; Romo, Ranulfo
2013-04-01
Phenomenological models of decision-making, including the drift-diffusion and race models, are compared with mechanistic, biologically plausible models, such as integrate-and-fire attractor neuronal network models. The attractor network models show how decision confidence is an emergent property; and make testable predictions about the neural processes (including neuronal activity and fMRI signals) involved in decision-making which indicate that the medial prefrontal cortex is involved in reward value-based decision-making. Synaptic facilitation in these models can help to account for sequential vibrotactile decision-making, and for how postponed decision-related responses are made. The randomness in the neuronal spiking-related noise that makes the decision-making probabilistic is shown to be increased by the graded firing rate representations found in the brain, to be decreased by the diluted connectivity, and still to be significant in biologically large networks with thousands of synapses onto each neuron. The stability of these systems is shown to be influenced in different ways by glutamatergic and GABAergic efficacy, leading to a new field of dynamical neuropsychiatry with applications to understanding schizophrenia and obsessive-compulsive disorder. The noise in these systems is shown to be advantageous, and to apply to similar attractor networks involved in short-term memory, long-term memory, attention, and associative thought processes. Copyright © 2012 Elsevier Ltd. All rights reserved.
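As a point of reference for the phenomenological side of this comparison, a drift-diffusion trial can be simulated in a few lines; the attractor network models discussed above are not reproduced here, and the drift, noise and threshold values are illustrative.

```python
# Minimal drift-diffusion sketch of two-choice decision-making, the simplest of
# the phenomenological models contrasted above with attractor networks.
# Parameters are illustrative.
import random

def ddm_trial(drift=0.2, noise=1.0, threshold=1.5, dt=0.001):
    """Accumulate noisy evidence until one of two bounds is hit.
    Returns (choice, reaction_time_in_seconds)."""
    x, t = 0.0, 0.0
    while abs(x) < threshold:
        x += drift * dt + noise * (dt ** 0.5) * random.gauss(0, 1)
        t += dt
    return (1 if x > 0 else 0), t

trials = [ddm_trial() for _ in range(2000)]
accuracy = sum(c for c, _ in trials) / len(trials)
mean_rt = sum(rt for _, rt in trials) / len(trials)
print(f"accuracy={accuracy:.2f}, mean RT={mean_rt:.2f}s")
```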
Vanyukov, Michael M.; Tarter, Ralph E.; Kirillova, Galina P.; Kirisci, Levent; Reynolds, Maureen D.; Kreek, Mary Jeanne; Conway, Kevin P.; Maher, Brion S.; Iacono, William G.; Bierut, Laura; Neale, Michael C.; Clark, Duncan B.; Ridenour, Ty A.
2013-01-01
Background: Two competing concepts address the development of involvement with psychoactive substances: the “gateway hypothesis” (GH) and common liability to addiction (CLA). Method: The literature on theoretical foundations and empirical findings related to both concepts is reviewed. Results: The data suggest that drug use initiation sequencing, the core GH element, is variable and opportunistic rather than uniform and developmentally deterministic. The association between risks for use of different substances, if any, can be more readily explained by common underpinnings than by specific staging. In contrast, the CLA concept is grounded in genetic theory and supported by data identifying common sources of variation in the risk for specific addictions. This commonality has identifiable neurobiological substrate and plausible evolutionary explanations. Conclusions: Whereas the “gateway” hypothesis does not specify mechanistic connections between “stages”, and does not extend to the risks for addictions, the concept of common liability to addictions incorporates sequencing of drug use initiation as well as extends to related addictions and their severity, provides a parsimonious explanation of substance use and addiction co-occurrence, and establishes a theoretical and empirical foundation to research in etiology, quantitative risk and severity measurement, as well as targeted non-drug-specific prevention and early intervention. PMID:22261179
NASA Astrophysics Data System (ADS)
Jiang, Xuan-Feng; Huang, Hui; Chai, Yun-Feng; Lohr, Tracy Lynn; Yu, Shu-Yan; Lai, Wenzhen; Pan, Yuan-Jiang; Delferro, Massimiliano; Marks, Tobin J.
2017-02-01
Developing homogeneous catalysts that convert CS2 and COS pollutants into environmentally benign products is important for both fundamental catalytic research and applied environmental science. Here we report a series of air-stable dimeric Pd complexes that mediate the facile hydrolytic cleavage of both CS2 carbon-sulfur bonds at 25 °C to produce CO2 and trimeric Pd complexes. Oxidation of the trimeric complexes with HNO3 regenerates the dimeric starting complexes with the release of SO2 and NO2. Isotopic labelling confirms that the carbon and oxygen atoms of CO2 originate from CS2 and H2O, respectively, and reaction intermediates were observed by gas-phase and electrospray ionization mass spectrometry, as well as by Fourier transform infrared spectroscopy. We also propose a plausible mechanistic scenario based on the experimentally observed intermediates. The mechanism involves intramolecular attack by a nucleophilic Pd-OH moiety on the carbon atom of coordinated µ-OCS2, which on deprotonation cleaves one C-S bond and simultaneously forms a C-O bond. Coupled C-S cleavage and CO2 release to yield [(bpy)3Pd3(µ3-S)2](NO3)2 (bpy, 2,2′-bipyridine) provides the thermodynamic driving force for the reaction.
A side-effect free method for identifying cancer drug targets.
Ashraf, Md Izhar; Ong, Seng-Kai; Mujawar, Shama; Pawar, Shrikant; More, Pallavi; Paul, Somnath; Lahiri, Chandrajit
2018-04-27
Identifying effective drug targets, with little or no side effects, remains an ever challenging task. A potential pitfall is failing to uncover the correct drug targets due to the side effects of pleiotropic genes, which might lead potential drugs to be deemed illicit and withdrawn. Simplifying disease complexity for the investigation of mechanistic aspects and the identification of effective drug targets has been done through several approaches to protein interactome analysis. Of these, centrality measures have always gained importance in identifying candidate drug targets. Here, we put forward an integrated method of analysing a complex network of cancer and depict the importance of k-core, functional connectivity and centrality (KFC) for identifying effective drug targets. Essentially, we have extracted the proteins involved in the pathways leading to cancer from pathway databases that list real experimental datasets. The interactions between these proteins were mapped to build an interactome. Integrative analyses of the interactome enabled us to unearth plausible reasons for drugs being withdrawn, thereby giving pharmaceutical industries future scope to potentially avoid them (e.g. ESR1, HDAC2, F2, PLG, PPARA, RXRA, etc.). Based upon our KFC criteria, we have shortlisted ten proteins (GRB2, FYN, PIK3R1, CBL, JAK2, LCK, LYN, SYK, JAK1 and SOCS3) as effective candidates for drug development.
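The network side of such a ranking, combining k-core membership with a centrality score over an interactome, can be sketched with networkx. The edge list below is fabricated for illustration, and the published KFC criteria also use functional connectivity, which is not reproduced here.

```python
# Toy sketch of ranking proteins by k-core membership and centrality with
# networkx. The edge list is fabricated for illustration; the published KFC
# criteria also incorporate functional connectivity, which is not shown here.
import networkx as nx

edges = [("GRB2", "FYN"), ("GRB2", "PIK3R1"), ("FYN", "LCK"), ("LCK", "LYN"),
         ("LYN", "SYK"), ("SYK", "GRB2"), ("JAK1", "JAK2"), ("JAK2", "SOCS3"),
         ("JAK2", "GRB2"), ("CBL", "GRB2"), ("CBL", "SYK")]
G = nx.Graph(edges)

core = nx.core_number(G)             # k-core index per node
betw = nx.betweenness_centrality(G)  # centrality per node

ranked = sorted(G.nodes, key=lambda n: (core[n], betw[n]), reverse=True)
print(ranked[:5])
```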
Divorce and Death: A Meta-Analysis and Research Agenda for Clinical, Social, and Health Psychology.
Sbarra, David A; Law, Rita W; Portley, Robert M
2011-09-01
Divorce is a relatively common stressful life event that is purported to increase risk for all-cause mortality. One problem in the literature on divorce and health is that it is fragmented and spread across many disciplines; most prospective studies of mortality are based in epidemiology and sociology, whereas most mechanistic studies are based in psychology. This review integrates research on divorce and death via meta-analysis and outlines a research agenda for better understanding the potential mechanisms linking marital dissolution and risk for all-cause mortality. Random effects meta-analysis with a sample of 32 prospective studies (involving more than 6.5 million people, 160,000 deaths, and over 755,000 divorces in 11 different countries) revealed a significant increase in risk for early death among separated/divorced adults in comparison to their married counterparts. Men and younger adults evidenced significantly greater risk for early death following marital separation/divorce than did women and older adults. Quantification of the overall effect size linking marital separation/divorce to risk for early death reveals a number of important research questions, and this article discusses what remains to be learned about four plausible mechanisms of action: social selection, resource disruptions, changes in health behaviors, and chronic psychological distress. © Association for Psychological Science 2011.
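For readers unfamiliar with the pooling step, a minimal DerSimonian-Laird random-effects calculation over log hazard ratios looks like the sketch below; the three studies and their standard errors are toy numbers, not the 32 studies summarized above.

```python
# Minimal DerSimonian-Laird random-effects pooling of log hazard ratios.
# The three studies below are toy data, not the 32 studies in the meta-analysis.
import math

log_hr = [math.log(1.30), math.log(1.15), math.log(1.45)]   # per-study effects
se = [0.10, 0.08, 0.15]                                     # their standard errors

w_fixed = [1 / s**2 for s in se]
theta_f = sum(w * y for w, y in zip(w_fixed, log_hr)) / sum(w_fixed)
q = sum(w * (y - theta_f)**2 for w, y in zip(w_fixed, log_hr))
c = sum(w_fixed) - sum(w**2 for w in w_fixed) / sum(w_fixed)
tau2 = max(0.0, (q - (len(log_hr) - 1)) / c)                # between-study variance

w_rand = [1 / (s**2 + tau2) for s in se]
theta_r = sum(w * y for w, y in zip(w_rand, log_hr)) / sum(w_rand)
se_r = (1 / sum(w_rand)) ** 0.5
print(f"pooled HR = {math.exp(theta_r):.2f} "
      f"(95% CI {math.exp(theta_r - 1.96*se_r):.2f}-{math.exp(theta_r + 1.96*se_r):.2f})")
```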
Bergman, Ebba; Matsson, Elin M; Hedeland, Mikael; Bondesson, Ulf; Knutson, Lars; Lennernäs, Hans
2010-09-01
The effect of a single intrajejunal dose of gemfibrozil (600 mg) on the plasma pharmacokinetics and biliary excretion of a single intrajejunal dose of rosuvastatin (20 mg) was investigated by using a multichannel catheter positioned in the distal duodenum-proximal jejunum in 8 healthy volunteers. Bile and plasma samples were collected every 20 minutes for 200 minutes, with additional plasma samples being drawn for up to 48 hours. Gemfibrozil did not affect the bioavailability of rosuvastatin, although it increased the apparent absorption phase during the initial 200 minutes (AUC(plasma,200min)) by 1.56-fold (95% confidence interval, 1.14-2.15). The interaction was less pronounced in this single-dose study than in a previous report when gemfibrozil was administered repeatedly; nevertheless, the interaction coincided with the highest exposure to gemfibrozil. The plausible reason why the interaction in this investigation was only minor is the low exposure to gemfibrozil (and its metabolites), suggesting that the total plasma concentration of gemfibrozil needs to be above 20 µM to affect the disposition of rosuvastatin. This study demonstrates the value of monitoring the plasma pharmacokinetics of the inhibitor, and not only the drug under investigation, to improve the mechanistic interpretation.
Jones, Michael L.; Shuter, Brian J.; Zhao, Yingming; Stockwell, Jason D.
2006-01-01
Future changes to climate in the Great Lakes may have important consequences for fisheries. Evidence suggests that Great Lakes air and water temperatures have risen and the duration of ice cover has lessened during the past century. Global circulation models (GCMs) suggest future warming and increases in precipitation in the region. We present new evidence that water temperatures have risen in Lake Erie, particularly during summer and winter in the period 1965-2000. GCM forecasts coupled with physical models suggest lower annual runoff, less ice cover, and lower lake levels in the future, but the certainty of these forecasts is low. Assessment of the likely effects of climate change on fish stocks will require an integrative approach that considers several components of habitat rather than water temperature alone. We recommend using mechanistic models that couple habitat conditions to population demographics to explore integrated effects of climate-caused habitat change and illustrate this approach with a model for Lake Erie walleye (Sander vitreum). We show that the combined effect on walleye populations of plausible changes in temperature, river hydrology, lake levels, and light penetration can be quite different from that which would be expected based on consideration of only a single factor.
Negotiating plausibility: intervening in the future of nanotechnology.
Selin, Cynthia
2011-12-01
The national-level scenarios project NanoFutures focuses on the social, political, economic, and ethical implications of nanotechnology, and was initiated by the Center for Nanotechnology in Society at Arizona State University (CNS-ASU). The project involves novel methods for the development of plausible visions of nanotechnology-enabled futures, elucidates public preferences for various alternatives, and, using such preferences, helps refine future visions for research and outreach. In doing so, the NanoFutures project aims to address a central question: how to deliberate the social implications of an emergent technology whose outcomes are not known. The solution pursued by the NanoFutures project is twofold. First, NanoFutures limits speculation about the technology to plausible visions. This ambition introduces a host of concerns about the limits of prediction, the nature of plausibility, and how to establish plausibility. Second, it subjects these visions to democratic assessment by a range of stakeholders, thus raising methodological questions as to who the relevant stakeholders are and how to activate different communities so as to engage the far future. This article makes the dilemmas posed by decisions about such methodological issues transparent and therefore articulates the role of plausibility in anticipatory governance.
Ecological multiplex interactions determine the role of species for parasite spread amplification
Stella, Massimo; Selakovic, Sanja; Antonioni, Alberto
2018-01-01
Despite their potential interplay, multiple routes of many disease transmissions are often investigated separately. As a unifying framework for understanding parasite spread through interdependent transmission paths, we present the ‘ecomultiplex’ model, where the multiple transmission paths among a diverse community of interacting hosts are represented as a spatially explicit multiplex network. We adopt this framework for designing and testing potential control strategies for Trypanosoma cruzi spread in two empirical host communities. We show that the ecomultiplex model is an efficient and low data-demanding method to identify which species enhances parasite spread and should thus be a target for control strategies. We also find that the interplay between predator-prey and host-parasite interactions leads to a phenomenon of parasite amplification, in which top predators facilitate T. cruzi spread, offering a mechanistic interpretation of previous empirical findings. Our approach can provide novel insights in understanding and controlling parasite spreading in real-world complex systems. PMID:29683427
Studying Two-Dimensional Zeolites with the Tools of Surface Science: MFI Nanosheets on Au(111)
J. D. Kestell; Zhong, J. Q.; Shete, M.; ...
2016-07-26
While surface science has provided fundamental insights into a variety of materials, the most widely used catalysts in industry, namely zeolites, still remain a challenge. The recent preparation of two-dimensional versions of MFI zeolite frameworks and the possibility of their deposition on electrically conductive supports provides for the first time a viable strategy to perform detailed studies on industrially relevant zeolites using the vast toolkit of surface science. In this work we demonstrate the use of infrared reflection absorption spectroscopy (IRRAS) and synchrotron-based x-ray photoelectron spectroscopy (XPS) to study these materials. Furthermore, polarization modulation IRRAS is used to study the adsorption of methanol and its effect on phonon vibrations of the zeolite framework. The possibility of using surface science methods, in particular under ambient pressure conditions, for the study of well-defined zeolites and other microporous structures opens new avenues to understand structural and mechanistic aspects of these materials as catalysts, adsorbents and molecular sieves.
Lv, Kai; Yang, Chu-Ting; Liu, Yi; Hu, Sheng; Wang, Xiao-Lin
2018-01-01
To aid the design of a hierarchically porous unconventional metal-phosphonate framework (HP-UMPF) for practical radioanalytical separation, a systematic investigation of the hydrolytic stability of the bulk phase against acidic corrosion has been carried out for an archetypical HP-UMPF. Bulk dissolution results suggest that aqueous acidity has a more paramount effect on incongruent leaching than temperature, and the kinetic stability reaches equilibrium by way of an accumulation of partially leached species on the corrosion conduits. A variation of particle morphology, hierarchical porosity and backbone composition upon corrosion reveals that these frameworks are hydrolytically resilient without suffering any great degradation of porous texture, although large aggregates crack into sporadic fractures while the nucleophilic attack of inorganic layers causes the leaching of tin and phosphorus. The remaining selectivity of these HP-UMPFs is dictated by a balance between the elimination of free phosphonate and the exposure of confined phosphonates, thus allowing real-time tailoring of radionuclide sequestration. Moreover, a plausible degradation mechanism has been proposed for the triple progressive dissolution of three-level hierarchical porous structures to elucidate the resultant reactivity. These HP-UMPFs are compared with benchmark metal-organic frameworks (MOFs) to obtain a rough grading of hydrolytic stability, and two feasible approaches intended for real-life separation protocols are suggested for enhancing their hydrolytic stability. PMID:29538348
An action potential-driven model of soleus muscle activation dynamics for locomotor-like movements
NASA Astrophysics Data System (ADS)
Kim, Hojeong; Sandercock, Thomas G.; Heckman, C. J.
2015-08-01
Objective. The goal of this study was to develop a physiologically plausible, computationally robust model for muscle activation dynamics (A(t)) under physiologically relevant excitation and movement. Approach. The interaction of excitation and movement on A(t) was investigated by comparing the force production between a cat soleus muscle and its Hill-type model. For capturing A(t) under excitation and movement variation, a modular modeling framework was proposed comprising three compartments: (1) spikes-to-[Ca2+]; (2) [Ca2+]-to-A; and (3) A-to-force transformation. The individual signal transformations were modeled based on physiological factors so that the parameter values could be determined separately for individual modules directly from experimental data. Main results. The strong dependency of A(t) on excitation frequency and muscle length was found during both isometric and dynamically-moving contractions. The identified dependencies of A(t) under the static and dynamic conditions could be incorporated in the modular modeling framework by modulating the model parameters as a function of movement input. The new modeling approach was also applicable to cat soleus muscles producing waveforms independent of those used to set the model parameters. Significance. This study provides a modeling framework for spike-driven muscle responses during movement that is suitable not only for insights into molecular mechanisms underlying muscle behaviors but also for large-scale simulations.
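The three-compartment wiring (spikes to [Ca2+], [Ca2+] to activation, activation to force) can be shown as a skeleton in which each stage is reduced to a first-order placeholder; the calibrated, length-dependent transformations of the published model are not reproduced, and all time constants below are invented.

```python
# Skeleton of the three-compartment structure described above
# (spikes -> [Ca2+] -> activation -> force). Each stage is reduced to a
# first-order placeholder so the modular wiring is clear; the calibrated,
# length-dependent transformations of the published model are not reproduced.
def simulate(spike_times, duration=1.0, dt=0.001,
             tau_ca=0.03, tau_act=0.06, fmax=1.0):
    n = int(duration / dt)
    spikes = [0.0] * n
    for t in spike_times:
        spikes[int(t / dt)] = 1.0
    ca, act, force = 0.0, 0.0, []
    for i in range(n):
        ca += dt * (-ca / tau_ca) + spikes[i]             # module 1: spikes -> [Ca2+]
        act += dt * ((ca / (ca + 1.0) - act) / tau_act)   # module 2: [Ca2+] -> A(t)
        force.append(fmax * act)                          # module 3: A -> force (isometric)
    return force

f = simulate(spike_times=[0.1 + 0.05 * k for k in range(10)])
print(max(f))
```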
Exposure Render: An Interactive Photo-Realistic Volume Rendering Framework
Kroes, Thomas; Post, Frits H.; Botha, Charl P.
2012-01-01
The field of volume visualization has undergone rapid development during the past years, both due to advances in suitable computing hardware and due to the increasing availability of large volume datasets. Recent work has focused on increasing the visual realism in Direct Volume Rendering (DVR) by integrating a number of visually plausible but often effect-specific rendering techniques, for instance modeling of light occlusion and depth of field. Besides yielding more attractive renderings, especially the more realistic lighting has a positive effect on perceptual tasks. Although these new rendering techniques yield impressive results, they exhibit limitations in terms of their flexibility and their performance. Monte Carlo ray tracing (MCRT), coupled with physically based light transport, is the de-facto standard for synthesizing highly realistic images in the graphics domain, although usually not from volumetric data. Due to the stochastic sampling of MCRT algorithms, numerous effects can be achieved in a relatively straightforward fashion. For this reason, we have developed a practical framework that applies MCRT techniques also to direct volume rendering (DVR). With this work, we demonstrate that a host of realistic effects, including physically based lighting, can be simulated in a generic and flexible fashion, leading to interactive DVR with improved realism. In the hope that this improved approach to DVR will see more use in practice, we have made available our framework under a permissive open source license. PMID:22768292
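To make the stochastic-sampling idea concrete, the sketch below estimates transmittance through a procedural density field with Woodcock (delta) tracking, one common Monte Carlo building block for volumetric light transport. It is a toy illustration, not code from the Exposure Render framework, and the density field and majorant are invented.

```python
# Minimal sketch of stochastic volume sampling in the spirit of Monte Carlo
# ray tracing for DVR: Woodcock (delta) tracking estimates free-flight
# distances through a procedural density field. No lighting, camera model,
# or GPU code from the actual framework is reproduced.
import math, random

def density(x, y, z):                     # procedural "volume": a soft sphere
    r = math.sqrt(x*x + y*y + z*z)
    return max(0.0, 1.0 - r)              # density in [0, 1]

SIGMA_MAX = 1.0                           # majorant extinction

def transmittance(origin, direction, max_dist, n_samples=256):
    """Average of binary Woodcock-tracking outcomes = transmittance estimate."""
    survived = 0
    for _ in range(n_samples):
        t = 0.0
        while True:
            t -= math.log(random.random()) / SIGMA_MAX    # tentative free flight
            if t >= max_dist:
                survived += 1
                break
            p = [o + t * d for o, d in zip(origin, direction)]
            if random.random() < density(*p) / SIGMA_MAX:  # real collision
                break
    return survived / n_samples

print(transmittance(origin=(-2, 0, 0), direction=(1, 0, 0), max_dist=4.0))
```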
An, Gary; Christley, Scott
2012-01-01
Given the panoply of system-level diseases that result from disordered inflammation, such as sepsis, atherosclerosis, cancer, and autoimmune disorders, understanding and characterizing the inflammatory response is a key target of biomedical research. Untangling the complex behavioral configurations associated with a process as ubiquitous as inflammation represents a prototype of the translational dilemma: the ability to translate mechanistic knowledge into effective therapeutics. A critical failure point in the current research environment is a throughput bottleneck at the level of evaluating hypotheses of mechanistic causality; these hypotheses represent the key step toward the application of knowledge for therapy development and design. Addressing the translational dilemma will require utilizing the ever-increasing power of computers and computational modeling to increase the efficiency of the scientific method in the identification and evaluation of hypotheses of mechanistic causality. More specifically, development needs to focus on facilitating the ability of non-computer trained biomedical researchers to utilize and instantiate their knowledge in dynamic computational models. This is termed "dynamic knowledge representation." Agent-based modeling is an object-oriented, discrete-event, rule-based simulation method that is well suited for biomedical dynamic knowledge representation. Agent-based modeling has been used in the study of inflammation at multiple scales. The ability of agent-based modeling to encompass multiple scales of biological process as well as spatial considerations, coupled with an intuitive modeling paradigm, suggest that this modeling framework is well suited for addressing the translational dilemma. This review describes agent-based modeling, gives examples of its applications in the study of inflammation, and introduces a proposed general expansion of the use of modeling and simulation to augment the generation and evaluation of knowledge by the biomedical research community at large.
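A minimal flavor of rule-based, agent-style dynamic knowledge representation is sketched below: grid cells carry a local damage state, a diffusing signal is derived from neighbors, and simple rules propagate or resolve the response. The rules and parameters are invented for illustration and are not taken from the inflammation models cited above.

```python
# Toy sketch of the agent-based, rule-based paradigm described above: cells on
# a grid carry a local "damage" state, sense a pro-inflammatory signal from
# their neighbors, and recover when the signal subsides. Rules and parameters
# are illustrative, not taken from the cited inflammation models.
import random

SIZE, STEPS = 20, 50
damage = [[1.0 if random.random() < 0.05 else 0.0 for _ in range(SIZE)] for _ in range(SIZE)]

def neighbors(i, j):
    return [((i + di) % SIZE, (j + dj) % SIZE)
            for di in (-1, 0, 1) for dj in (-1, 0, 1) if (di, dj) != (0, 0)]

for _ in range(STEPS):
    signal = [[sum(damage[a][b] for a, b in neighbors(i, j)) / 8 for j in range(SIZE)]
              for i in range(SIZE)]
    for i in range(SIZE):
        for j in range(SIZE):
            if signal[i][j] > 0.4:   # rule: strong local signal propagates damage
                damage[i][j] = min(1.0, damage[i][j] + 0.3)
            else:                    # rule: cells recover when the signal is low
                damage[i][j] = max(0.0, damage[i][j] - 0.1)

print("damaged cells:", sum(cell > 0.5 for row in damage for cell in row))
```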
NASA Technical Reports Server (NTRS)
Salby, Murry
1998-01-01
A 3-dimensional model was developed to support mechanistic studies. The model solves the global primitive equations in isentropic coordinates, which directly characterize diabatic processes forcing the Brewer-Dobson circulation of the middle atmosphere. Its numerical formulation is based on Hough harmonics, which partition horizontal motion into its rotational and divergent components. These computational features, along with others, enable 3D integrations to be performed practically on RISC computer architecture, on which they can be iterated to support mechanistic studies. The model conserves potential vorticity quite accurately under adiabatic conditions. Forced by observed tropospheric structure, in which integrations are anchored, the model generates a diabatic circulation that is consistent with satellite observations of tracer behavior and diabatic cooling rates. The model includes a basic but fairly complete treatment of gas-phase photochemistry that represents some 20 chemical species and 50 governing reactions with diurnally-varying shortwave absorption. The model thus provides a reliable framework to study transport and underlying diabatic processes, which can then be compared against chemical and dynamical structure observed and in GCM integrations. Integrations with the Langley GCM were performed to diagnose feedback between simulated convection and the tropical circulation. These were studied in relation to tropospheric properties controlling moisture convergence and environmental conditions supporting deep convection, for comparison against mechanistic integrations of wave CISK that successfully reproduce the Madden-Julian Oscillation (MJO) of the tropical circulation. These comparisons were aimed at identifying and ultimately improving aspects of the convective simulation, with the objective of recovering a successful simulation of the MJO in the Langley GCM, behavior that should be important to budgets of upper-tropospheric water vapor and chemical species.
NASA Astrophysics Data System (ADS)
Soderholm, L.; Mitchell, J. F.
2016-05-01
Synthesis of inorganic extended solids is a critical starting point from which real-world functional materials and their consequent technologies originate. However, unlike the rich mechanistic foundation of organic synthesis, with its underlying rules of assembly (e.g., functional groups and their reactivities), the synthesis of inorganic materials lacks an underpinning of such robust organizing principles. In the latter case, any such rules must account for the diversity of chemical species and bonding motifs inherent to inorganic materials and the potential impact of mass transport on kinetics, among other considerations. Without such assembly rules, there is less understanding, less predictive power, and ultimately less control of properties. Despite such hurdles, developing a mechanistic understanding for synthesis of inorganic extended solids would dramatically impact the range of new material discoveries and resulting new functionalities, warranting a broad call to explore what is possible. Here we discuss our recent approaches toward a mechanistic framework for the synthesis of bulk inorganic extended solids, in which either embryonic atomic correlations or fully developed phases in solutions or melts can be identified and tracked during product selection and crystallization. The approach hinges on the application of high-energy x-rays, with their penetrating power and large Q-range, to explore reaction pathways in situ. We illustrate this process using two examples: directed assembly of Zr clusters in aqueous solution and total phase awareness during crystallization from K-Cu-S melts. These examples provide a glimpse of what we see as a larger vision, in which large scale simulations, data-driven science, and in situ studies of atomic correlations combine to accelerate materials discovery and synthesis, based on the assembly of well-defined, prenucleated atomic correlations.
NASA Astrophysics Data System (ADS)
Marçais, J.; Gupta, H. V.; De Dreuzy, J. R.; Troch, P. A. A.
2016-12-01
Geomorphological structure and geological heterogeneity of hillslopes are major controls on runoff responses. The diversity of hillslopes (morphological shapes and geological structures) on one hand, and the highly nonlinear runoff response mechanism on the other, make it difficult to transpose what has been learnt at one specific hillslope to another. Making reliable predictions of runoff generation or river flow for a given hillslope is therefore a challenge. Classic model calibration (based on inverse-problem techniques) must be repeated for each specific hillslope and requires calibration data; when thousands of hillslopes are involved, this approach is not always feasible. Here we propose a novel modeling framework that couples process-based models with a data-driven approach. First, we develop a mechanistic model, based on the hillslope-storage Boussinesq equations (Troch et al. 2003), able to simulate nonlinear runoff responses to rainfall at the hillslope scale. Second, we set up a model database representing thousands of uncalibrated simulations. These simulations investigate different hillslope shapes (real ones obtained by analyzing a 5 m digital elevation model of Brittany, and synthetic ones), different hillslope geological structures (i.e., different parametrizations), and different hydrologic forcing terms (i.e., different infiltration chronicles). We then use this model library to train a machine learning model on the physically based database. Machine learning performance is assessed in a classic validation phase (testing on new hillslopes and comparing machine learning outputs with mechanistic outputs). Finally, we use the machine learning model to identify the hillslope properties that control runoff. This methodology will be further tested by combining synthetic datasets with real ones.
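A minimal sketch of the surrogate-modelling step described above: train a machine learning model on a database of uncalibrated mechanistic simulations, validate it on held-out "new" hillslopes, and inspect which properties control runoff. The descriptor names, the toy response standing in for the hillslope-storage Boussinesq model, and the random-forest choice are assumptions for illustration, not the authors' implementation.

```python
# Train a data-driven surrogate on a database of mechanistic simulations.
# All descriptors, the toy response function and parameter values are invented.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000                                   # number of simulations in the database
X = np.column_stack([
    rng.uniform(0.5, 3.0, n),              # hillslope convergence / shape index
    rng.uniform(0.05, 0.4, n),             # mean slope [-]
    rng.lognormal(-4, 1, n),               # hydraulic conductivity [m/s]
    rng.uniform(10, 80, n),                # storm depth [mm]
])
# Toy nonlinear "mechanistic" response used only to populate the database:
y = X[:, 1] * X[:, 3] ** 1.5 / (1.0 + X[:, 2] * 1e4) + rng.normal(0, 0.1, n)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
surrogate = RandomForestRegressor(n_estimators=300, random_state=0)
surrogate.fit(X_train, y_train)

print("R^2 on unseen hillslopes:", surrogate.score(X_test, y_test))
print("relative importance of hillslope properties:",
      surrogate.feature_importances_)
```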
Disruptive behaviour in the perioperative setting: a contemporary review.
Villafranca, Alexander; Hamlin, Colin; Enns, Stephanie; Jacobsohn, Eric
2017-02-01
Disruptive behaviour, which we define as behaviour that does not show others an adequate level of respect and causes victims or witnesses to feel threatened, is a concern in the operating room. This review summarizes the current literature on disruptive behaviour as it applies to the perioperative domain. Searches of MEDLINE®, Scopus™, and Google Books identified articles and monographs of interest, with backreferencing used as a supplemental strategy. Much of the data comes from studies outside the operating room and has significant methodological limitations. Disruptive behaviour has intrapersonal, interpersonal, and organizational causes. While fewer than 10% of clinicians display disruptive behaviour, up to 98% of clinicians report witnessing disruptive behaviour in the last year, 70% report being treated with incivility, and 36% report being bullied. This type of conduct can have many negative ramifications for clinicians, students, and institutions. Although the evidence regarding patient outcomes is primarily based on clinician perceptions, anecdotes, and expert opinion, this evidence supports the contention of an increase in morbidity and mortality. The plausible mechanism for this increase is social undermining of teamwork, communication, clinical decision-making, and technical performance. The behavioural responses of those who are exposed to such conduct can positively or adversely moderate the consequences of disruptive behaviour. All operating room professions are involved, with the rank order (from high to low) being surgeons, nurses, anesthesiologists, and "others". The optimal approaches to the prevention and management of disruptive behaviour are uncertain, but they include preventative and professional development courses, training in soft skills and teamwork, institutional efforts to optimize the workplace, clinician contracts outlining the clinician's (and institution's) responsibilities, institutional policies that are monitored and enforced, regular performance feedback, and clinician coaching/remediation as required. Disruptive behaviour remains a part of operating room culture, with many associated deleterious effects. There is a widely accepted view that disruptive behaviour can lead to increased patient morbidity and mortality. This is mechanistically plausible, but more rigorous studies are required to confirm the effects and estimate their magnitude. An important measure that individual clinicians can take is to monitor and control their own behaviour, including their responses to disruptive behaviour.
Computational Complexity and Human Decision-Making.
Bossaerts, Peter; Murawski, Carsten
2017-12-01
The rationality principle postulates that decision-makers always choose the best action available to them. It underlies most modern theories of decision-making. The principle does not take into account the difficulty of finding the best option. Here, we propose that computational complexity theory (CCT) provides a framework for defining and quantifying the difficulty of decisions. We review evidence showing that human decision-making is affected by computational complexity. Building on this evidence, we argue that most models of decision-making, and metacognition, are intractable from a computational perspective. To be plausible, future theories of decision-making will need to take into account both the resources required for implementing the computations implied by the theory, and the resource constraints imposed on the decision-maker by biology. Copyright © 2017 Elsevier Ltd. All rights reserved.
Knowledge Acquisition of Generic Queries for Information Retrieval
Seol, Yoon-Ho; Johnson, Stephen B.; Cimino, James J.
2002-01-01
Several studies have identified clinical questions posed by health care professionals to understand the nature of information needs during clinical practice. To support access to digital information sources, it is necessary to integrate the information needs with a computer system. We have developed a conceptual guidance approach in information retrieval, based on a knowledge base that contains the patterns of information needs. The knowledge base uses a formal representation of clinical questions based on the UMLS knowledge sources, called the Generic Query model. To improve the coverage of the knowledge base, we investigated a method for extracting plausible clinical questions from the medical literature. This poster presents the Generic Query model, shows how it is used to represent the patterns of clinical questions, and describes the framework used to extract knowledge from the medical literature.
Comparing the epistemological underpinnings of students' and scientists' reasoning about conclusions
NASA Astrophysics Data System (ADS)
Hogan, Kathleen; Maglienti, Mark
2001-08-01
This study examined the criteria that middle school students, nonscientist adults, technicians, and scientists used to rate the validity of conclusions drawn by hypothetical students from a set of evidence. The groups' criteria for evaluating conclusions were considered to be dimensions of their epistemological frameworks regarding how knowledge claims are justified, and as such are integral to their scientific reasoning. Quantitative and qualitative analyses revealed that the responses of students and nonscientists differed from the responses of technicians and scientists, with the major difference being the groups' relative emphasis on criteria of empirical consistency or plausibility of the conclusions. We argue that the sources of the groups' differing epistemic criteria rest in their different spheres of cultural practice, and explore implications of this perspective for science teaching and learning.
A non-oscillatory energy-splitting method for the computation of compressible multi-fluid flows
NASA Astrophysics Data System (ADS)
Lei, Xin; Li, Jiequan
2018-04-01
This paper proposes a new non-oscillatory energy-splitting conservative algorithm for computing multi-fluid flows in the Eulerian framework. In comparison with existing multi-fluid algorithms in the literature, it is shown that the mass fraction model with the isobaric hypothesis is a plausible choice for designing numerical methods for multi-fluid flows. We then construct a conservative Godunov-based scheme with a high-order accurate extension using the generalized Riemann problem solver, through a detailed analysis of the kinetic energy exchange when fluids are mixed under the hypothesis of isobaric equilibrium. Numerical experiments are carried out for the shock-interface and shock-bubble interaction problems, which display the excellent performance of this type of scheme and demonstrate that nonphysical oscillations around material interfaces are substantially suppressed.
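For orientation, a schematic one-dimensional form of the mass-fraction model referred to above, written here for two ideal-gas components; the mixture closure shown assumes pressure and temperature equilibrium between the components and is one common choice, not taken from the paper itself.

```latex
\begin{aligned}
&\partial_t \rho + \partial_x (\rho u) = 0, \qquad
 \partial_t (\rho u) + \partial_x (\rho u^2 + p) = 0, \\
&\partial_t E + \partial_x \big[(E + p)\,u\big] = 0, \qquad
 \partial_t (\rho Y) + \partial_x (\rho Y u) = 0, \\
&E = \rho e + \tfrac{1}{2}\rho u^2, \qquad
 p = \big(\gamma(Y) - 1\big)\,\rho e, \qquad
 \gamma(Y) = \frac{Y c_{p,1} + (1 - Y)\,c_{p,2}}{Y c_{v,1} + (1 - Y)\,c_{v,2}},
\end{aligned}
```

where Y is the mass fraction of one component; the energy-splitting construction is then designed so that the kinetic-energy exchange during mixing does not generate spurious pressure oscillations at material interfaces.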
ERIC Educational Resources Information Center
Staub, Adrian; Rayner, Keith; Pollatsek, Alexander; Hyona, Jukka; Majewski, Helen
2007-01-01
Readers' eye movements were monitored as they read sentences containing noun-noun compounds that varied in frequency (e.g., elevator mechanic, mountain lion). The left constituent of the compound was either plausible or implausible as a head noun at the point at which it appeared, whereas the compound as a whole was always plausible. When the head…
NASA Astrophysics Data System (ADS)
Barefoot, E. A.; Nittrouer, J. A.; Foreman, B.; Moodie, A. J.; Dickens, G. R.
2017-12-01
The Paleocene-Eocene Thermal Maximum (PETM) was a period of rapid climatic change when global temperatures increased by 5-8 °C in as little as 5 ka. It has been hypothesized that by drastically enhancing the hydrologic cycle, this temperature change significantly perturbed landscape dynamics over the ensuing 200 ka. Much of the evidence documenting hydrological variability derives from studies of the stratigraphic record, which is interpreted to encode a system-clearing event in fluvial systems worldwide during and after the PETM. For example, in the Piceance Basin of Western Colorado, it is hypothesized that intensification of monsoons due to PETM warming caused an increase in sediment flux to the basin. The resulting stratigraphy records a modulation of the sedimentation rate, where the PETM interval is represented by a laterally extensive sheet sand positioned between units dominated by floodplain muds. The temporal interval, the sediment provenance history, as well as the tectonic history of the PETM in the Piceance Basin are all well-constrained, leaving climate as the most significant allogenic forcing in the Piceance Basin during the PETM. However, the precise nature of the landscape changes that link climate forcing by the PETM to modulation of the sedimentation rate in this basin remains to be demonstrated. Here, we present a simple stratigraphic numerical model coupled with a conceptual source-to-sink framework to test the impact of a suite of changing upstream boundary conditions on the fluvial system. In the model, climate-related variables force changes in flow characteristics such as sediment transport, slope, and velocity, which determine the resultant floodplain stratigraphy. The model is based on mathematical relations that link bankfull geometry and water discharge, impacting the lateral migration rate of the channel, sediment transport rate, and avulsion frequency, thereby producing a cross-section of basin stratigraphy. In this way, we simulate a raft of plausible, and mutually exclusive, climate-change scenarios for the case study of the Piceance Basin during the PETM, which may be compared to the stratigraphic record through field observation. The method described here represents a step towards connecting the impacts of global climate change to fluvial systems and sedimentation dynamics.
Effects of plausibility on structural priming.
Christianson, Kiel; Luke, Steven G; Ferreira, Fernanda
2010-03-01
We report a replication and extension of Ferreira (2003), in which it was observed that native adult English speakers misinterpret passive sentences that relate implausible but not impossible semantic relationships (e.g., The angler was caught by the fish) significantly more often than they do plausible passives or plausible or implausible active sentences. In the experiment reported here, participants listened to the same plausible and implausible passive and active sentences as in Ferreira (2003), answered comprehension questions, and then orally described line drawings of simple transitive actions. The descriptions were analyzed as a measure of structural priming (Bock, 1986). Question accuracy data replicated Ferreira (2003). Production data yielded an interaction: Passive descriptions were produced more often after plausible passives and implausible actives. We interpret these results as indicative of a language processor that proceeds along differentiated morphosyntactic and semantic routes. The processor may end up adjudicating between conflicting outputs from these routes by settling on a "good enough" representation that is not completely faithful to the input.
The neural optimal control hierarchy for motor control
NASA Astrophysics Data System (ADS)
DeWolf, T.; Eliasmith, C.
2011-10-01
Our empirical, neuroscientific understanding of biological motor systems has been rapidly growing in recent years. However, this understanding has not been systematically mapped to a quantitative characterization of motor control based in control theory. Here, we attempt to bridge this gap by describing the neural optimal control hierarchy (NOCH), which can serve as a foundation for biologically plausible models of neural motor control. The NOCH has been constructed by taking recent control theoretic models of motor control, analyzing the required processes, generating neurally plausible equivalent calculations and mapping them on to the neural structures that have been empirically identified to form the anatomical basis of motor control. We demonstrate the utility of the NOCH by constructing a simple model based on the identified principles and testing it in two ways. First, we perturb specific anatomical elements of the model and compare the resulting motor behavior with clinical data in which the corresponding area of the brain has been damaged. We show that damaging the assigned functions of the basal ganglia and cerebellum can cause the movement deficiencies seen in patients with Huntington's disease and cerebellar lesions. Second, we demonstrate that single spiking neuron data from our model's motor cortical areas explain major features of single-cell responses recorded from the same primate areas. We suggest that together these results show how NOCH-based models can be used to unify a broad range of data relevant to biological motor control in a quantitative, control theoretic framework.
Grossi, Enzo
2005-09-27
The concept of risk has pervaded the medical literature in recent decades and has become a familiar topic, and the concept of probability, linked to a binary logic approach, is commonly applied in epidemiology and clinical medicine. The application of probability theory to groups of individuals is quite straightforward but can pose communication challenges at the individual level. Few articles, however, have tried to focus on the concept of "risk" at the individual subject level rather than at the population level. The author reviews the conceptual framework that led to the use of probability theory in the medical field at a time when the principal causes of death were acute diseases, often of infectious origin. In the present scenario, in which chronic degenerative diseases dominate and there are smooth transitions between health and disease, the use of fuzzy logic rather than binary logic would be more appropriate. Fuzzy logic, in which more than two truth-value assignments are allowed, overcomes the trap of probability theory when dealing with uncertain outcomes, thereby making the meaning of a prognostic statement easier for the patient to understand. At the individual subject level, recourse to the term plausibility, rooted in fuzzy logic, would help the physician communicate with the patient more effectively than the term probability, rooted in binary logic. This would represent a clear advantage for the transfer of medical evidence to individual subjects.
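To make the binary-versus-fuzzy contrast concrete, a small illustrative sketch (not taken from the article): a crisp classification assigns 0 or 1 at a threshold, whereas a fuzzy membership function assigns a graded degree of "being at risk". The risk scores, threshold, and ramp shape below are arbitrary examples.

```python
# Illustrative contrast between binary (crisp) and fuzzy characterizations
# of "high risk"; threshold and membership shape are arbitrary assumptions.
def crisp_high_risk(score: float, threshold: float = 20.0) -> int:
    """Binary logic: a patient either is (1) or is not (0) 'high risk'."""
    return 1 if score >= threshold else 0

def fuzzy_high_risk(score: float, low: float = 10.0, high: float = 30.0) -> float:
    """Fuzzy logic: graded membership in 'high risk', between 0 and 1."""
    if score <= low:
        return 0.0
    if score >= high:
        return 1.0
    return (score - low) / (high - low)   # linear ramp between the two anchors

for s in (9, 15, 21, 35):                 # e.g., a 10-year risk score in %
    print(s, crisp_high_risk(s), round(fuzzy_high_risk(s), 2))
```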
Altered morphology of the nucleus accumbens in persistent developmental stuttering.
Neef, Nicole E; Bütfering, Christoph; Auer, Tibor; Metzger, F Luise; Euler, Harald A; Frahm, Jens; Paulus, Walter; Sommer, Martin
2018-03-01
Neuroimaging studies in persistent developmental stuttering repeatedly report altered basal ganglia functions. Together with thalamus and cerebellum, these structures mediate sensorimotor functions and thus represent a plausible link between stuttering and neuroanatomy. However, stuttering is a complex, multifactorial disorder. Besides sensorimotor functions, emotional and social-motivational factors constitute major aspects of the disorder. Here, we investigated cortical and subcortical gray matter regions to study whether persistent developmental stuttering is also linked to alterations of limbic structures. The study included 33 right-handed participants who stutter and 34 right-handed control participants matched for sex, age, and education. Structural images were acquired using magnetic resonance imaging to estimate volumetric characteristics of the nucleus accumbens, hippocampus, amygdala, pallidum, putamen, caudate nucleus, and thalamus. Volumetric comparisons and vertex-based shape comparisons revealed structural differences. The right nucleus accumbens was larger in participants who stutter compared to controls. Recent theories of basal ganglia functions suggest that the nucleus accumbens is a motivation-to-movement interface. A speaker intends to reach communicative goals, but stuttering can derail these efforts. It is therefore highly plausible to find alterations in the motivation-to-movement interface in stuttering. While behavioral studies of stuttering sought to find links between the limbic and sensorimotor system, we provide the first neuroimaging evidence of alterations in the limbic system. Thus, our findings might initialize a unified neurobiological framework of persistent developmental stuttering that integrates sensorimotor and social-motivational neuroanatomical circuitries. Copyright © 2017 Elsevier Inc. All rights reserved.
Díaz, Sandra; Cáceres, Daniel M.; Trainor, Sarah F.; Pérez-Harguindeguy, Natalia; Bret-Harte, M. Syndonia; Finegan, Bryan; Peña-Claros, Marielos; Poorter, Lourens
2011-01-01
The crucial role of biodiversity in the links between ecosystems and societies has been repeatedly highlighted both as source of wellbeing and as a target of human actions, but not all aspects of biodiversity are equally important to different ecosystem services. Similarly, different social actors have different perceptions of and access to ecosystem services, and therefore, they have different wants and capacities to select directly or indirectly for particular biodiversity and ecosystem characteristics. Their choices feed back onto the ecosystem services provided to all parties involved and in turn, affect future decisions. Despite this recognition, the research communities addressing biodiversity, ecosystem services, and human outcomes have yet to develop frameworks that adequately treat the multiple dimensions and interactions in the relationship. Here, we present an interdisciplinary framework for the analysis of relationships between functional diversity, ecosystem services, and human actions that is applicable to specific social environmental systems at local scales. We connect the mechanistic understanding of the ecological role of diversity with its social relevance: ecosystem services. The framework permits connections between functional diversity components and priorities of social actors using land use decisions and ecosystem services as the main links between these ecological and social components. We propose a matrix-based method that provides a transparent and flexible platform for quantifying and integrating social and ecological information and negotiating potentially conflicting land uses among multiple social actors. We illustrate the applicability of our framework by way of land use examples from temperate to subtropical South America, an area of rapid social and ecological change. PMID:21220325
The Plausibility of a String Quartet Performance in Virtual Reality.
Bergstrom, Ilias; Azevedo, Sergio; Papiotis, Panos; Saldanha, Nuno; Slater, Mel
2017-04-01
We describe an experiment that explores the contribution of auditory and other features to the illusion of plausibility in a virtual environment that depicts the performance of a string quartet. 'Plausibility' refers to the component of presence that is the illusion that the perceived events in the virtual environment are really happening. The features studied were: Gaze (the musicians ignored the participant, the musicians sometimes looked towards and followed the participant's movements), Sound Spatialization (Mono, Stereo, Spatial), Auralization (no sound reflections, reflections corresponding to a room larger than the one perceived, reflections that exactly matched the virtual room), and Environment (no sound from outside of the room, birdsong and wind corresponding to the outside scene). We adopted the methodology based on color matching theory, where 20 participants were first able to assess their feeling of plausibility in the environment with each of the four features at their highest setting. Then five times participants started from a low setting on all features and were able to make transitions from one system configuration to another until they matched their original feeling of plausibility. From these transitions a Markov transition matrix was constructed, and also probabilities of a match conditional on feature configuration. The results show that Environment and Gaze were individually the most important factors influencing the level of plausibility. The highest probability transitions were to improve Environment and Gaze, and then Auralization and Spatialization. We present this work as both a contribution to the methodology of assessing presence without questionnaires, and showing how various aspects of a musical performance can influence plausibility.
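A minimal sketch of how a Markov transition matrix can be estimated from recorded configuration-to-configuration transitions of the kind described above; the configuration labels and data are made up, and the row-normalized counting scheme is the generic maximum-likelihood estimate rather than necessarily the authors' exact procedure.

```python
# Estimate a Markov transition matrix from observed transitions between
# system configurations (e.g., feature settings in a plausibility-matching
# experiment). Labels and transitions below are illustrative only.
from collections import defaultdict

transitions = [("E0G0A0S0", "E1G0A0S0"), ("E1G0A0S0", "E1G1A0S0"),
               ("E1G1A0S0", "E1G1A1S0"), ("E0G0A0S0", "E0G1A0S0")]

counts = defaultdict(lambda: defaultdict(int))
for src, dst in transitions:
    counts[src][dst] += 1

states = sorted({s for pair in transitions for s in pair})
P = {s: {t: 0.0 for t in states} for s in states}
for src, row in counts.items():
    total = sum(row.values())
    for dst, n in row.items():
        P[src][dst] = n / total        # row-normalized transition probabilities

for src in states:
    print(src, P[src])
```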
What if? Neural activity underlying semantic and episodic counterfactual thinking.
Parikh, Natasha; Ruzic, Luka; Stewart, Gregory W; Spreng, R Nathan; De Brigard, Felipe
2018-05-25
Counterfactual thinking (CFT) is the process of mentally simulating alternative versions of known facts. In the past decade, cognitive neuroscientists have begun to uncover the neural underpinnings of CFT, particularly episodic CFT (eCFT), which activates regions in the default network (DN) also activated by episodic memory (eM) recall. However, the engagement of DN regions is different for distinct kinds of eCFT. More plausible counterfactuals and counterfactuals about oneself show stronger activity in DN regions compared to implausible and other- or object-focused counterfactuals. The current study sought to identify a source for this difference in DN activity. Specifically, self-focused counterfactuals may also be more plausible, suggesting that DN core regions are sensitive to the plausibility of a simulation. On the other hand, plausible and self-focused counterfactuals may involve more episodic information than implausible and other-focused counterfactuals, which would imply DN sensitivity to episodic information. In the current study, we compared episodic and semantic counterfactuals generated to be plausible or implausible against episodic and semantic memory reactivation using fMRI. Taking multivariate and univariate approaches, we found that the DN is engaged more during episodic simulations, including eM and all eCFT, than during semantic simulations. Semantic simulations engaged more inferior temporal and lateral occipital regions. The only region that showed strong plausibility effects was the hippocampus, which was significantly engaged for implausible CFT but not for plausible CFT, suggestive of binding more disparate information. Consequences of these findings for the cognitive neuroscience of mental simulation are discussed. Published by Elsevier Inc.
Schmid, Annina B; Coppieters, Michel W
2011-12-01
A high prevalence of dual nerve disorders is frequently reported. How a secondary nerve disorder may develop following a primary nerve disorder remains largely unknown. Although still frequently cited, most explanatory theories were formulated many years ago. Considering recent advances in neuroscience, it is uncertain whether these theories still reflect current expert opinion. A Delphi study was conducted to update views on potential mechanisms underlying dual nerve disorders. In three rounds, seventeen international experts in the field of peripheral nerve disorders were asked to list possible mechanisms and rate their plausibility. Mechanisms with a median plausibility rating of ≥7 out of 10 were considered highly plausible. The experts identified fourteen mechanisms associated with a first nerve disorder that may predispose to the development of another nerve disorder. Of these fourteen mechanisms, nine have not previously been linked to double crush. Four mechanisms were considered highly plausible (impaired axonal transport, ion channel up or downregulation, inflammation in the dorsal root ganglia and neuroma-in-continuity). Eight additional mechanisms were listed which are not triggered by a primary nerve disorder, but may render the nervous system more vulnerable to multiple nerve disorders, such as systemic diseases and neurotoxic medication. Even though many mechanisms were classified as plausible or highly plausible, overall plausibility ratings varied widely. Experts indicated that a wide range of mechanisms has to be considered to better understand dual nerve disorders. Previously listed theories cannot be discarded, but may be insufficient to explain the high prevalence of dual nerve disorders. Copyright © 2011 Elsevier Ltd. All rights reserved.
Fu, Si-Yao; Yang, Guo-Sheng; Kuai, Xin-Kai
2012-01-01
In this paper, we present a quantitative, highly structured cortex-simulated model, which can be simply described as feedforward, hierarchical simulation of ventral stream of visual cortex using biologically plausible, computationally convenient spiking neural network system. The motivation comes directly from recent pioneering works on detailed functional decomposition analysis of the feedforward pathway of the ventral stream of visual cortex and developments on artificial spiking neural networks (SNNs). By combining the logical structure of the cortical hierarchy and computing power of the spiking neuron model, a practical framework has been presented. As a proof of principle, we demonstrate our system on several facial expression recognition tasks. The proposed cortical-like feedforward hierarchy framework has the merit of capability of dealing with complicated pattern recognition problems, suggesting that, by combining the cognitive models with modern neurocomputational approaches, the neurosystematic approach to the study of cortex-like mechanism has the potential to extend our knowledge of brain mechanisms underlying the cognitive analysis and to advance theoretical models of how we recognize face or, more specifically, perceive other people's facial expression in a rich, dynamic, and complex environment, providing a new starting point for improved models of visual cortex-like mechanism. PMID:23193391
A stochastic differential equation model of diurnal cortisol patterns
NASA Technical Reports Server (NTRS)
Brown, E. N.; Meehan, P. M.; Dempster, A. P.
2001-01-01
Circadian modulation of episodic bursts is recognized as the normal physiological pattern of diurnal variation in plasma cortisol levels. The primary physiological factors underlying these diurnal patterns are the ultradian timing of secretory events, circadian modulation of the amplitude of secretory events, infusion of the hormone from the adrenal gland into the plasma, and clearance of the hormone from the plasma by the liver. Each measured plasma cortisol level has an error arising from the cortisol immunoassay. We demonstrate that all of these physiological factors can be succinctly summarized in a single stochastic differential equation plus measurement error model and show that physiologically consistent ranges of the model parameters can be determined from published reports. We summarize the model parameters in terms of the multivariate Gaussian probability density and establish the plausibility of the model with a series of simulation studies. Our framework makes possible a sensitivity analysis in which all model parameters are allowed to vary simultaneously. The model offers an approach for simultaneously representing cortisol's ultradian, circadian, and kinetic properties. Our modeling paradigm provides a framework for simulation studies and data analysis that should be readily adaptable to the analysis of other endocrine hormone systems.
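A schematic version of the kind of model described, written in generic notation as an assumption rather than the authors' exact equations: first-order clearance of plasma cortisol driven by circadian-modulated ultradian secretory bursts, observed with assay error.

```latex
\begin{aligned}
dX(t) &= -\lambda\,X(t)\,dt + A(t)\,dN(t), \\
A(t)  &= a_0\Big[1 + a_1\cos\!\big(\tfrac{2\pi}{24}\,(t-\phi)\big)\Big], \qquad
y_k = X(t_k) + \varepsilon_k, \quad \varepsilon_k \sim \mathcal{N}(0,\sigma^2),
\end{aligned}
```

where X(t) is the plasma cortisol concentration, lambda the clearance rate, N(t) a point process giving the ultradian timing of secretory events, A(t) the circadian-modulated secretory amplitude, and epsilon_k the immunoassay measurement error.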
Testing adaptive toolbox models: a Bayesian hierarchical approach.
Scheibehenne, Benjamin; Rieskamp, Jörg; Wagenmakers, Eric-Jan
2013-01-01
Many theories of human cognition postulate that people are equipped with a repertoire of strategies to solve the tasks they face. This theoretical framework of a cognitive toolbox provides a plausible account of intra- and interindividual differences in human behavior. Unfortunately, it is often unclear how to rigorously test the toolbox framework. How can a toolbox model be quantitatively specified? How can the number of toolbox strategies be limited to prevent uncontrolled strategy sprawl? How can a toolbox model be formally tested against alternative theories? The authors show how these challenges can be met by using Bayesian inference techniques. By means of parameter recovery simulations and the analysis of empirical data across a variety of domains (i.e., judgment and decision making, children's cognitive development, function learning, and perceptual categorization), the authors illustrate how Bayesian inference techniques allow toolbox models to be quantitatively specified, strategy sprawl to be contained, and toolbox models to be rigorously tested against competing theories. The authors demonstrate that their approach applies at the individual level but can also be generalized to the group level with hierarchical Bayesian procedures. The suggested Bayesian inference techniques represent a theoretical and methodological advancement for toolbox theories of cognition and behavior.
Hierarchical Aligned Cluster Analysis for Temporal Clustering of Human Motion.
Zhou, Feng; De la Torre, Fernando; Hodgins, Jessica K
2013-03-01
Temporal segmentation of human motion into plausible motion primitives is central to understanding and building computational models of human motion. Several issues contribute to the challenge of discovering motion primitives: the exponential nature of all possible movement combinations, the variability in the temporal scale of human actions, and the complexity of representing articulated motion. We pose the problem of learning motion primitives as one of temporal clustering, and derive an unsupervised hierarchical bottom-up framework called hierarchical aligned cluster analysis (HACA). HACA finds a partition of a given multidimensional time series into m disjoint segments such that each segment belongs to one of k clusters. HACA combines kernel k-means with the generalized dynamic time alignment kernel to cluster time series data. Moreover, it provides a natural framework to find a low-dimensional embedding for time series. HACA is efficiently optimized with a coordinate descent strategy and dynamic programming. Experimental results on motion capture and video data demonstrate the effectiveness of HACA for segmenting complex motions and as a visualization tool. We also compare the performance of HACA to state-of-the-art algorithms for temporal clustering on data of a honey bee dance. The HACA code is available online.
Adil, Karim; Belmabkhout, Youssef; Pillai, Renjith S; Cadiau, Amandine; Bhatt, Prashant M; Assen, Ayalew H; Maurin, Guillaume; Eddaoudi, Mohamed
2017-06-06
The separation of related molecules with similar physical/chemical properties is of prime industrial importance and practically entails a substantial energy penalty, typically necessitating the operation of energy-demanding low temperature fractional distillation techniques. Certainly research efforts, in academia and industry alike, are ongoing with the main aim to develop advanced functional porous materials to be adopted as adsorbents for the effective and energy-efficient separation of various important commodities. Of special interest is the subclass of metal-organic frameworks (MOFs) with pore aperture sizes below 5-7 Å, namely ultra-microporous MOFs, which in contrast to conventional zeolites and activated carbons show great prospects for addressing key challenges in separations pertaining to energy and environmental sustainability, specifically materials for carbon capture and separation of olefin/paraffin, acetylene/ethylene, linear/branched alkanes, xenon/krypton, etc. In this tutorial review we discuss the latest developments in ultra-microporous MOF adsorbents and their use as separating agents via thermodynamics and/or kinetics and molecular sieving. Appreciably, we provide insights into the distinct microscopic mechanisms governing the resultant separation performances, and suggest a plausible correlation between the inherent structural features/topology of MOFs and the associated gas/vapour separation performance.
Construction of Gene Regulatory Networks Using Recurrent Neural Networks and Swarm Intelligence.
Khan, Abhinandan; Mandal, Sudip; Pal, Rajat Kumar; Saha, Goutam
2016-01-01
We have proposed a methodology for the reverse engineering of biologically plausible gene regulatory networks from temporal genetic expression data. We have used established information and the fundamental mathematical theory for this purpose. We have employed the Recurrent Neural Network formalism to extract the underlying dynamics present in the time series expression data accurately. We have introduced a new hybrid swarm intelligence framework for the accurate training of the model parameters. The proposed methodology was first applied to a small artificial network, and the results obtained suggest that it can produce the best results available in the contemporary literature, to the best of our knowledge. Subsequently, we have implemented our proposed framework on experimental (in vivo) datasets. Finally, we have investigated two medium-sized genetic networks (in silico) extracted from GeneNetWeaver, to understand how the proposed algorithm scales up with network size. Additionally, we have implemented our proposed algorithm with half the number of time points. The results indicate that a 50% reduction in the number of time points does not significantly affect the accuracy of the proposed methodology, with a maximum deterioration of just over 15% in the worst case.
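For reference, the Recurrent Neural Network formalism commonly used for gene regulatory network inference (shown here in its general form, not necessarily the exact parameterization used in this work) models the expression x_i of each gene as

```latex
\frac{dx_i}{dt} \;=\; \frac{k_i}{1 + \exp\!\Big(-\big(\sum_{j} w_{ij}\,x_j + \beta_i\big)\Big)} \;-\; \lambda_i\,x_i ,
```

where w_ij encodes the regulatory influence of gene j on gene i (activation if positive, repression if negative), beta_i is a basal bias, k_i a maximal expression rate, and lambda_i a degradation rate; the swarm-intelligence step then searches for parameter sets (w, beta, k, lambda) that reproduce the observed time series.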
Modelling the failure behaviour of wind turbines
NASA Astrophysics Data System (ADS)
Faulstich, S.; Berkhout, V.; Mayer, J.; Siebenlist, D.
2016-09-01
Modelling the failure behaviour of wind turbines is an essential part of offshore wind farm simulation software, as it leads to optimized decision making when specifying the resources needed for the operation and maintenance of wind farms. To optimize O&M strategies, a thorough understanding of a wind turbine's failure behaviour is vital and is therefore being developed at Fraunhofer IWES. In this article, the failure models of existing offshore O&M tools are first presented to show the state of the art, and the strengths and weaknesses of the respective models are briefly discussed. A conceptual framework for modelling different failure mechanisms of wind turbines is then presented. This framework takes into account the different wind turbine subsystems and structures as well as the failure modes of a component, applying several influencing factors that represent wear and break failure mechanisms. A failure function is set up for the rotor blade as an exemplary component, and simulation results are compared to a constant failure rate and to empirical wind turbine fleet data as a reference. The comparison and the breakdown of specific failure categories demonstrate the overall plausibility of the model.
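A minimal sketch of a failure function of the kind outlined above, combining an age-dependent wear term (Weibull hazard) with a constant break term, both scaled by influencing factors; the factor names and parameter values are invented for illustration and do not reproduce the Fraunhofer IWES model.

```python
# Illustrative failure-rate function for a component (e.g., a rotor blade):
# wear-out (Weibull hazard) plus a constant "break" rate, modulated by
# influencing factors. All names and numbers are assumptions for illustration.
def failure_rate(age_years: float,
                 wear_scale: float = 25.0,   # Weibull scale (characteristic life, years)
                 wear_shape: float = 2.5,    # Weibull shape (>1 implies wear-out)
                 break_rate: float = 0.05,   # constant random-failure rate [1/year]
                 site_turbulence: float = 1.0,
                 load_factor: float = 1.0) -> float:
    """Annual failure rate combining wear and break failure mechanisms."""
    wear_hazard = (wear_shape / wear_scale) * (age_years / wear_scale) ** (wear_shape - 1)
    influence = site_turbulence * load_factor   # multiplicative influencing factors
    return influence * (wear_hazard + break_rate)

for age in (1, 5, 10, 20):
    print(age, round(failure_rate(age, site_turbulence=1.2), 4))
```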
A cortical framework for invariant object categorization and recognition.
Rodrigues, João; Hans du Buf, J M
2009-08-01
In this paper we present a new model for invariant object categorization and recognition. It is based on explicit multi-scale features: lines, edges and keypoints are extracted from responses of simple, complex and end-stopped cells in cortical area V1, and keypoints are used to construct saliency maps for Focus-of-Attention. The model is a functional but dichotomous one, because keypoints are employed to model the "where" data stream, with dynamic routing of features from V1 to higher areas to obtain translation, rotation and size invariance, whereas lines and edges are employed in the "what" stream for object categorization and recognition. Furthermore, both the "where" and "what" pathways are dynamic in that information at coarse scales is employed first, after which information at progressively finer scales is added in order to refine the processes, i.e., both the dynamic feature routing and the categorization level. The construction of group and object templates, which are thought to be available in the prefrontal cortex with "what" and "where" components in PF46d and PF46v, is also illustrated. The model was tested in the framework of an integrated and biologically plausible architecture.
Engdahl, N.B.; Vogler, E.T.; Weissmann, G.S.
2010-01-01
River-aquifer exchange is considered within a transition probability framework along the Rio Grande in Albuquerque, New Mexico, to provide a stochastic estimate of aquifer heterogeneity and river loss. Six plausible hydrofacies configurations were determined using categorized drill core and wetland survey data processed through the TPROGS geostatistical package. A base case homogeneous model was also constructed for comparison. River loss was simulated for low, moderate, and high Rio Grande stages and several different riverside drain stage configurations. Heterogeneity effects were quantified by determining the mean and variance of the K field for each realization compared to the root-mean-square (RMS) error of the observed groundwater head data. Simulation results showed that the heterogeneous models produced smaller estimates of loss than the homogeneous approximation. Differences between heterogeneous and homogeneous model results indicate that the use of a homogeneous K in a regional-scale model may result in an overestimation of loss but comparable RMS error. We find that the simulated river loss is dependent on the aquifer structure and is most sensitive to the volumetric proportion of fines within the river channel. Copyright 2010 by the American Geophysical Union.
Finding optimal vaccination strategies under parameter uncertainty using stochastic programming.
Tanner, Matthew W; Sattenspiel, Lisa; Ntaimo, Lewis
2008-10-01
We present a stochastic programming framework for finding the optimal vaccination policy for controlling infectious disease epidemics under parameter uncertainty. Stochastic programming is a popular framework for including the effects of parameter uncertainty in a mathematical optimization model. The problem is initially formulated to find the minimum cost vaccination policy under a chance-constraint. The chance-constraint requires that the probability that R(*)
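The abstract is truncated at the chance constraint; for orientation, a generic chance-constrained formulation of this type of problem (our schematic notation, not the paper's exact model) is

```latex
\min_{x \ge 0} \; c^{\top} x
\quad \text{s.t.} \quad
\Pr_{\xi}\!\big[\, R^{*}(x,\xi) \le 1 \,\big] \;\ge\; 1 - \alpha ,
```

where x is the vector of vaccination coverage levels by group, c their costs, xi the uncertain epidemiological parameters, R*(x, xi) the post-vaccination reproduction number, and alpha the allowed probability that the epidemic threshold is exceeded.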
Constructing Precisely Computing Networks with Biophysical Spiking Neurons.
Schwemmer, Michael A; Fairhall, Adrienne L; Denéve, Sophie; Shea-Brown, Eric T
2015-07-15
While spike timing has been shown to carry detailed stimulus information at the sensory periphery, its possible role in network computation is less clear. Most models of computation by neural networks are based on population firing rates. In equivalent spiking implementations, firing is assumed to be random such that averaging across populations of neurons recovers the rate-based approach. Recently, however, Denéve and colleagues have suggested that the spiking behavior of neurons may be fundamental to how neuronal networks compute, with precise spike timing determined by each neuron's contribution to producing the desired output (Boerlin and Denéve, 2011; Boerlin et al., 2013). By postulating that each neuron fires to reduce the error in the network's output, it was demonstrated that linear computations can be performed by networks of integrate-and-fire neurons that communicate through instantaneous synapses. This left open, however, the possibility that realistic networks, with conductance-based neurons with subthreshold nonlinearity and the slower timescales of biophysical synapses, may not fit into this framework. Here, we show how the spike-based approach can be extended to biophysically plausible networks. We then show that our network reproduces a number of key features of cortical networks including irregular and Poisson-like spike times and a tight balance between excitation and inhibition. Lastly, we discuss how the behavior of our model scales with network size or with the number of neurons "recorded" from a larger computing network. These results significantly increase the biological plausibility of the spike-based approach to network computation. We derive a network of neurons with standard spike-generating currents and synapses with realistic timescales that computes based upon the principle that the precise timing of each spike is important for the computation. We then show that our network reproduces a number of key features of cortical networks including irregular, Poisson-like spike times, and a tight balance between excitation and inhibition. These results significantly increase the biological plausibility of the spike-based approach to network computation, and uncover how several components of biological networks may work together to efficiently carry out computation. Copyright © 2015 the authors 0270-6474/15/3510112-23$15.00/0.
Effect of topological patterning on self-rolling of nanomembranes.
Chen, Cheng; Song, Pengfei; Meng, Fanchao; Ou, Pengfei; Liu, Xinyu; Song, Jun
2018-08-24
The effects of topological patterning (i.e., grating and rectangular patterns) on the self-rolling behaviors of heteroepitaxial strained nanomembranes have been systematically studied. An analytical modeling framework, validated through finite-element simulations, has been formulated to predict the resultant curvature of the patterned nanomembrane as the pattern thickness and density vary. The effectiveness of the grating pattern in regulating the rolling direction of the nanomembrane has been demonstrated and quantitatively assessed. Further to the rolling of nanomembranes, a route to achieve predictive design of helical structures has been proposed and showcased. The present study provides new knowledge and mechanistic guidance towards predictive control and tuning of roll-up nanostructures via topological patterning.
Ramos, Kenneth S; Steffen, Marlene C; Falahatpisheh, M H; Nanez, Adrian
2007-06-01
As the postgenomic era continues to unfold, a new wave of scientific investigation is upon us focusing on the application of genomic technologies to study the meanings encrypted on the DNA code and the responses of living organisms to changes in their environment. Recent functional genomics studies in this laboratory have focused on the role of the aryl hydrocarbon receptor, a ubiquitous transcription factor, in genetic programming during renal development. Also of interest is the application of genomics investigations to the study of chronic medical conditions associated with early life exposures to environmental contaminants. Molecular evidence is discussed in this review within the framework of human molecular medicine.
Learning and cognition in insects.
Giurfa, Martin
2015-01-01
Insects possess small brains but exhibit sophisticated behavioral performances. Recent works have reported the existence of unsuspected cognitive capabilities in various insect species, which go beyond the traditionally studied framework of simple associative learning. In this study, I focus on capabilities such as attention, social learning, individual recognition, concept learning, and metacognition, and discuss their presence and mechanistic bases in insects. I analyze whether these behaviors can be explained on the basis of elemental associative learning or, on the contrary, require higher-order explanations. In doing this, I highlight experimental challenges and suggest future directions for investigating the neurobiology of higher-order learning in insects, with the goal of uncovering the neural architectures underlying cognitive processing. © 2015 John Wiley & Sons, Ltd.
Zhang, Le; Zhang, Shaoxiang
2017-03-01
A body of research [1-7] has already shown that epigenetic reprogramming plays a critical role in maintaining the normal development of embryos. However, the mechanistic quantitation of the epigenetic interactions between sperm and oocytes and their impact on embryo development remains unclear [6,7]. In this study, Wang et al. [8] develop a modeling framework that addresses this question by integrating game theory with the latest discoveries on the epigenetic control of embryo development. Copyright © 2017 Elsevier B.V. All rights reserved.
Systems Biology Approach Reveals a Calcium-Dependent Mechanism for Basal Toxicity in Daphnia magna.
Antczak, Philipp; White, Thomas A; Giri, Anirudha; Michelangeli, Francesco; Viant, Mark R; Cronin, Mark T D; Vulpe, Chris; Falciani, Francesco
2015-09-15
The expanding diversity and ever increasing amounts of man-made chemicals discharged to the environment pose largely unknown hazards to ecosystem and human health. The concept of adverse outcome pathways (AOPs) emerged as a comprehensive framework for risk assessment. However, the limited mechanistic information available for most chemicals and a lack of biological pathway annotation in many species represent significant challenges to effective implementation of this approach. Here, a systems level, multistep modeling strategy demonstrates how to integrate information on chemical structure with mechanistic insight from genomic studies, and phenotypic effects to define a putative adverse outcome pathway. Results indicated that transcriptional changes indicative of intracellular calcium mobilization were significantly overrepresented in Daphnia magna (DM) exposed to sublethal doses of presumed narcotic chemicals with log Kow ≥ 1.8. Treatment of DM with a calcium ATPase pump inhibitor substantially recapitulated the common transcriptional changes. We hypothesize that calcium mobilization is a potential key molecular initiating event in DM basal (narcosis) toxicity. Heart beat rate analysis and metabolome analysis indicated sublethal effects consistent with perturbations of calcium preceding overt acute toxicity. Together, the results indicate that altered calcium homeostasis may be a key early event in basal toxicity or narcosis induced by lipophilic compounds.
Cornelius, Carolin; Dinkova-Kostova, Albena T.; Calabrese, Edward J.; Mattson, Mark P.
2010-01-01
Despite the capacity of chaperones and other homeostatic components to restore folding equilibrium, cells appear poorly adapted for chronic oxidative stress that increases in cancer and in metabolic and neurodegenerative diseases. Modulation of endogenous cellular defense mechanisms represents an innovative approach to therapeutic intervention in diseases causing chronic tissue damage, such as in neurodegeneration. This article introduces the concept of hormesis and its applications to the field of neuroprotection. It is argued that the hormetic dose response provides the central underpinning of neuroprotective responses, providing a framework for explaining the common quantitative features of their dose–response relationships, their mechanistic foundations, and their relationship to the concept of biological plasticity, as well as providing a key insight for improving the accuracy of the therapeutic dose of pharmaceutical agents within the highly heterogeneous human population. This article describes in mechanistic detail how hormetic dose responses are mediated for endogenous cellular defense pathways, including sirtuin and Nrf2 and related pathways that integrate adaptive stress responses in the prevention of neurodegenerative diseases. Particular attention is given to the emerging role of nitric oxide, carbon monoxide, and hydrogen sulfide gases in hormetic-based neuroprotection and their relationship to membrane radical dynamics and mitochondrial redox signaling. PMID:20446769
Adverse outcome pathway (AOP) development and evaluation ...
The Adverse Outcome Pathway provides a construct for assembling mechanistic information at different levels of biological organization in a form designed to support regulatory decision making. In particular, it frames the link between molecular and cellular events that can be measured in high throughput toxicity testing and the organism or population-level events that are commonly relevant in defining risk. Recognizing the importance of this emerging framework, the Organisation for Economic Co-operation and Development (OECD) launched a program to support the development, documentation and consideration of AOPs by the international community in 2012 (http://www.oecd.org/chemicalsafety/testing/adverse-outcome-pathways-molecular-screening-and-toxicogenomics.htm). In 2014, a handbook (https://aopkb.org/common/AOP_Handbook.pdf) was developed to guide users in the documentation and evaluation of AOPs and their entry into an official knowledgebase. The handbook draws on longstanding experience in consideration of mechanistic data (e.g., mode of action analysis) to inform risk assessment. To further assist users, a training program was developed by members of the OECD Extended Advisory Group to teach users the basic principles of AOP development and the best practices as outlined in the OECD AOP handbook. Training sessions began in early 2015, and this course will provide training for interested SOT scientists. Following this course, all participants will be familiar w
Bouffler, S D; Peters, S; Gilvin, P; Slack, K; Markiewicz, E; Quinlan, R A; Gillan, J; Coster, M; Barnard, S; Rothkamm, K; Ainsbury, E
2015-06-01
The recommendation from the International Commission on Radiological Protection that the occupational equivalent dose limit for the lens of the eye should be reduced to 20 mSv year(-1), averaged over 5 years with no year exceeding 50 mSv, has stimulated a discussion on the practicalities of implementation of this revised dose limit, and the most appropriate risk and protection framework to adopt. This brief paper provides an overview of some of the drivers behind the move to a lower recommended dose limit. The issue of implementation in the medical sector in the UK has been addressed through a small-scale survey of doses to the lens of the eye amongst interventional cardiologists and radiologists. In addition, a mechanistic study of early and late post-irradiation changes in the lens of the eye in in-vivo-exposed mice is outlined. Surveys and studies such as those described can contribute to a deeper understanding of fundamental and practical issues, and therefore contribute to a robust evidence base for ensuring adequate protection of the eye while avoiding undesirable restrictions to working practices.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wittwehr, Clemens; Aladjov, Hristo; Ankley, Gerald
Efforts are underway to transform regulatory toxicology and chemical safety assessment from a largely empirical science based on direct observation of apical toxicity outcomes in whole organism toxicity tests to a predictive one in which outcomes and risk are inferred from accumulated mechanistic understanding. The adverse outcome pathway (AOP) framework has emerged as a systematic approach for organizing knowledge that supports such inference. We argue that this systematic organization of knowledge can inform and help direct the design and development of computational prediction models that can further enhance the utility of mechanistic and in silico data for chemical safety assessment. Examples of AOP-informed model development and its application to the assessment of chemicals for skin sensitization and multiple modes of endocrine disruption are provided. The role of problem formulation, not only as a critical phase of risk assessment, but also as a guide for both AOP and complementary model development, is described. Finally, a proposal for actively engaging the modeling community in AOP-informed computational model development is made. The contents serve as a vision for how AOPs can be leveraged to facilitate development of computational prediction models needed to support the next generation of chemical safety assessment.
Mechanistic and quantitative insight into cell surface targeted molecular imaging agent design.
Zhang, Liang; Bhatnagar, Sumit; Deschenes, Emily; Thurber, Greg M
2016-05-05
Molecular imaging agent design involves simultaneously optimizing multiple probe properties. While several desired characteristics are straightforward, including high affinity and low non-specific background signal, in practice there are quantitative trade-offs between these properties. These include plasma clearance, where fast clearance lowers background signal but can reduce target uptake, and binding, where high affinity compounds sometimes suffer from lower stability or increased non-specific interactions. Further complicating probe development, many of the optimal parameters vary depending on both target tissue and imaging agent properties, making empirical approaches or previous experience difficult to translate. Here, we focus on low molecular weight compounds targeting extracellular receptors, which have some of the highest contrast values for imaging agents. We use a mechanistic approach to provide a quantitative framework for weighing trade-offs between molecules. Our results show that specific target uptake is well-described by quantitative simulations for a variety of targeting agents, whereas non-specific background signal is more difficult to predict. Two in vitro experimental methods for estimating background signal in vivo are compared - non-specific cellular uptake and plasma protein binding. Together, these data provide a quantitative method to guide probe design and focus animal work for more cost-effective and time-efficient development of molecular imaging agents.
Marshall, Jill A.; Roering, Joshua J.; Bartlein, Patrick J.; Gavin, Daniel G.; Granger, Darryl E.; Rempel, Alan W.; Praskievicz, Sarah J.; Hales, Tristram C.
2015-01-01
Understanding climatic influences on the rates and mechanisms of landscape erosion is an unresolved problem in Earth science that is important for quantifying soil formation rates, sediment and solute fluxes to oceans, and atmospheric CO2 regulation by silicate weathering. Glaciated landscapes record the erosional legacy of glacial intervals through moraine deposits and U-shaped valleys, whereas more widespread unglaciated hillslopes and rivers lack obvious climate signatures, hampering mechanistic theory for how climate sets fluxes and form. Today, periglacial processes in high-elevation settings promote vigorous bedrock-to-regolith conversion and regolith transport, but the extent to which frost processes shaped vast swaths of low- to moderate-elevation terrain during past climate regimes is not well established. By combining a mechanistic frost weathering model with a regional Last Glacial Maximum (LGM) climate reconstruction derived from a paleo-Earth System Model, paleovegetation data, and a paleoerosion archive, we propose that frost-driven sediment production was pervasive during the LGM in our unglaciated Pacific Northwest study site, coincident with a 2.5 times increase in erosion relative to modern rates. Our findings provide a novel framework to quantify how climate modulates sediment production over glacial-interglacial cycles in mid-latitude unglaciated terrain. PMID:26702434
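For readers unfamiliar with mechanistic frost weathering models, one widely used formulation is a frost-cracking-intensity integral of the kind associated with earlier work by Anderson and by Hales and Roering; the study's exact implementation may differ, so this is only an orienting sketch:

\[
\mathrm{FCI} \;=\; \int_{0}^{t_{\mathrm{yr}}} \int_{0}^{z_{\max}}
\left|\frac{\partial T(z,t)}{\partial z}\right| \, w\!\left(T(z,t)\right)\, dz\, dt,
\qquad
w(T) =
\begin{cases}
1, & T_{\mathrm{low}} \le T \le T_{\mathrm{high}},\\
0, & \text{otherwise},
\end{cases}
\]

where the weighting restricts the integral to depths and times at which rock temperature lies within a "frost-cracking window" (commonly taken as roughly -8 °C to -3 °C), so that predicted sediment production scales with the time-integrated thermal gradient driving water migration toward growing ice lenses.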
Razavi, Sonia M; Gonzalez, Marcial; Cuitiño, Alberto M
2015-04-30
We propose a general framework for determining optimal relationships for the tensile strength of doubly convex tablets under diametrical compression. This approach is based on the observation that tensile strength is directly proportional to the breaking force and inversely proportional to a non-linear function of geometric parameters and material properties. This generalization reduces to the analytical expression commonly used for flat-faced tablets, i.e., the Hertz solution, and to the empirical relationship currently used in the pharmaceutical industry for convex-faced tablets, i.e., Pitt's equation. Under proper parametrization, the optimal tensile strength relationship can be determined from experimental results by minimizing a figure of merit of choice. This optimization is performed under the first-order approximation that a flat-faced tablet and a doubly curved tablet have the same tensile strength if they have the same relative density and are made of the same powder under equivalent manufacturing conditions. Furthermore, we provide a set of recommendations and best practices for assessing the performance of optimal tensile strength relationships in general. Based on these guidelines, we identify two new models, namely the general and mechanistic models, which are effective and predictive alternatives to the tensile strength relationship currently used in the pharmaceutical industry. Copyright © 2015 Elsevier B.V. All rights reserved.
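For orientation, the two limiting expressions referenced in the abstract are usually quoted as follows; symbols here are the conventional ones, and the original sources should be consulted for the precise definitions used in the study:

\[
\sigma_t \;=\; \frac{2F}{\pi D t}
\quad\text{(flat-faced tablets, Hertz/Brazilian solution)},
\]
\[
\sigma_t \;=\; \frac{10F}{\pi D^{2}}
\left(2.84\,\frac{t}{D} \;-\; 0.126\,\frac{t}{W} \;+\; 3.15\,\frac{W}{D} \;+\; 0.01\right)^{-1}
\quad\text{(convex-faced tablets, Pitt's equation)},
\]

where \(F\) is the breaking force, \(D\) the tablet diameter, \(t\) the overall tablet thickness, and \(W\) the thickness of the central cylindrical band.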
High-throughput literature mining to support read-across ...
Building scientific confidence in the development and evaluation of read-across remains an ongoing challenge. Approaches include establishing systematic frameworks to identify sources of uncertainty and ways to address them. One source of uncertainty is related to characterizing biological similarity. Many research efforts are underway, such as structuring mechanistic data in adverse outcome pathways and investigating the utility of high throughput (HT)/high content (HC) screening data. A largely untapped resource for read-across to date is the biomedical literature. This information has the potential to support read-across by facilitating the identification of valid source analogues with similar biological and toxicological profiles, as well as by providing the mechanistic understanding for any prediction made. A key challenge in using biomedical literature is to convert and translate its unstructured form into a computable format that can be linked to chemical structure. We developed a novel text-mining strategy to represent literature information for read-across. Keywords were used to organize literature into toxicity signatures at the chemical level. These signatures were integrated with HT in vitro data and curated chemical structures. A rule-based algorithm assessed the strength of the literature relationship, providing a mechanism to rank and visualize the signature as literature ToxPIs (LitToxPIs). LitToxPIs were developed for over 6,000 chemicals for a varie
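The keyword-to-signature step can be pictured with the short sketch below. The keyword list, endpoint categories, and normalization are invented for illustration; the published rule-based LitToxPI algorithm is not reproduced here.

```python
# Schematic of turning per-chemical literature hits into keyword-based toxicity
# "signatures". Keywords, categories, and scoring are hypothetical.
from collections import Counter

TOXICITY_KEYWORDS = {                      # hypothetical keyword -> endpoint map
    "hepatotoxicity": "liver",
    "steatosis": "liver",
    "estrogenic": "endocrine",
    "thyroid": "endocrine",
    "neurotoxic": "neuro",
}

def literature_signature(abstracts):
    """Count endpoint-level keyword hits across a chemical's abstracts."""
    counts = Counter()
    for text in abstracts:
        text = text.lower()
        for keyword, endpoint in TOXICITY_KEYWORDS.items():
            if keyword in text:
                counts[endpoint] += 1
    total = sum(counts.values())
    # normalized slice sizes, loosely analogous to one ring of a ToxPi-style figure
    return {endpoint: n / total for endpoint, n in counts.items()} if total else {}

docs = ["Reported hepatotoxicity and steatosis in rodents ...",
        "Weakly estrogenic activity in vitro ..."]
print(literature_signature(docs))   # {'liver': 0.666..., 'endocrine': 0.333...}
```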
Ye, Han; Zhou, Jiadong; Er, Dequan; Price, Christopher C; Yu, Zhongyuan; Liu, Yumin; Lowengrub, John; Lou, Jun; Liu, Zheng; Shenoy, Vivek B
2017-12-26
Vertical stacking of monolayers via van der Waals (vdW) interaction opens promising routes toward engineering physical properties of two-dimensional (2D) materials and designing atomically thin devices. However, due to the lack of mechanistic understanding, challenges remain in the controlled fabrication of these structures via scalable methods such as chemical vapor deposition (CVD) onto substrates. In this paper, we develop a general multiscale model to describe the size evolution of 2D layers and predict the necessary growth conditions for vertical (initial + subsequent layers) versus in-plane lateral (monolayer) growth. An analytic thermodynamic criterion is established for subsequent layer growth that depends on the sizes of both layers, the vdW interaction energies, and the edge energy of 2D layers. Considering the time-dependent growth process, we find that temperature and adatom flux from vapor are the primary criteria affecting the self-assembled growth. The proposed model clearly demonstrates the distinct roles of thermodynamic and kinetic mechanisms governing the final structure. Our model agrees with experimental observations of various monolayer and bilayer transition metal dichalcogenides grown by CVD and provides a predictive framework to guide the fabrication of vertically stacked 2D materials.
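As a purely schematic illustration of the kind of thermodynamic criterion described (not the paper's exact expression), one can compare the free-energy change of adding material to a second layer on top of the first against adding it to the edge of the first layer on the substrate:

\[
\Delta G \;\approx\;
\left(E_{\mathrm{vdW}}^{\,1/\mathrm{sub}} - E_{\mathrm{vdW}}^{\,2/1}\right)\Delta A
\;+\; \gamma_{\mathrm{edge}}\left(\Delta P_{2} - \Delta P_{1}\right),
\]

where the first term captures the difference in van der Waals adhesion energy per unit area (first layer on substrate versus second layer on first layer), the second term the difference in edge-energy cost of extending each layer's perimeter, and vertical (bilayer) growth is favored when \(\Delta G < 0\). Because the perimeter-to-area ratio depends on layer size, such a criterion naturally becomes size dependent, consistent with the size dependence stated in the abstract.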
The stochastic system approach for estimating dynamic treatments effect.
Commenges, Daniel; Gégout-Petit, Anne
2015-10-01
The problem of assessing the effect of a treatment on a marker in observational studies raises the difficulty that attribution of the treatment may depend on the observed marker values. As an example, we focus on the analysis of the effect of highly active antiretroviral therapy (HAART) on CD4 counts. This problem has been treated using marginal structural models relying on the counterfactual/potential response formalism. Another approach to causality is based on dynamical models, and causal influence has been formalized in the framework of the Doob-Meyer decomposition of stochastic processes. Causal inference, however, requires assumptions that we detail in this paper; we call this approach to causality the "stochastic system" approach. We first treat this problem in discrete time, then in continuous time. This approach allows biological knowledge to be incorporated naturally. When working in continuous time, the mechanistic approach involves distinguishing the model for the system from the model for the observations. Indeed, biological systems live in continuous time, and mechanisms can be expressed in the form of a system of differential equations, while observations are taken at discrete times. Inference in mechanistic models is challenging, particularly from a numerical point of view, but these models can yield much richer and more reliable results.
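To make the system/observation distinction concrete, the toy sketch below integrates a continuous-time marker model and samples it at discrete clinic visits. The linear ODE, the parameter values, and the visit schedule are illustrative assumptions, not the model of the cited analysis.

```python
# Toy continuous-time system with discrete-time, noisy observations.
# Model form, parameters, and visit schedule are illustrative only.
import random

def cd4_trajectory(lam=20.0, mu=0.05, beta=15.0, t_start_haart=2.0,
                   x0=350.0, dt=0.01, t_end=10.0):
    """Integrate dX/dt = lam - mu*X + beta*A(t), with A(t)=1 once treatment starts."""
    ts, xs = [], []
    x, t = x0, 0.0
    while t <= t_end:
        a = 1.0 if t >= t_start_haart else 0.0
        x += (lam - mu * x + beta * a) * dt
        ts.append(t); xs.append(x)
        t += dt
    return ts, xs

def observe(ts, xs, visit_every=0.5, sd=25.0):
    """The marker is only measured at clinic visits, with measurement noise."""
    obs, next_visit = [], 0.0
    for t, x in zip(ts, xs):
        if t >= next_visit:
            obs.append((t, x + random.gauss(0.0, sd)))
            next_visit += visit_every
    return obs

ts, xs = cd4_trajectory()
for t, y in observe(ts, xs)[:6]:
    print(f"visit at t={t:5.2f} (arbitrary time units): observed CD4 ~ {y:6.1f}")
```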
How adverse outcome pathways can aid the development and ...
Efforts are underway to transform regulatory toxicology and chemical safety assessment from a largely empirical science based on direct observation of apical toxicity outcomes in whole organism toxicity tests to a predictive one in which outcomes and risk are inferred from accumulated mechanistic understanding. The adverse outcome pathway (AOP) framework has emerged as a systematic approach for organizing knowledge that supports such inference. We argue that this systematic organization of knowledge can inform and help direct the design and development of computational prediction models that can further enhance the utility of mechanistic and in silico data for chemical safety assessment. Examples of AOP-informed model development and its application to the assessment of chemicals for skin sensitization and multiple modes of endocrine disruption are provided. The role of problem formulation, not only as a critical phase of risk assessment, but also as a guide for both AOP and complementary model development, is described. Finally, a proposal for actively engaging the modeling community in AOP-informed computational model development is made. The contents serve as a vision for how AOPs can be leveraged to facilitate development of computational prediction models needed to support the next generation of chemical safety assessment. The present manuscript reports on expert opinion and case studies that came out of a European Commission, Joint Research Centre-sponsored work
Mechanistic links between cellular trade-offs, gene expression, and growth.
Weiße, Andrea Y; Oyarzún, Diego A; Danos, Vincent; Swain, Peter S
2015-03-03
Intracellular processes rarely work in isolation but continually interact with the rest of the cell. In microbes, for example, we now know that gene expression across the whole genome typically changes with growth rate. The mechanisms driving such global regulation, however, are not well understood. Here we consider three trade-offs that, because of limitations in levels of cellular energy, free ribosomes, and proteins, are faced by all living cells and we construct a mechanistic model that comprises these trade-offs. Our model couples gene expression with growth rate and growth rate with a growing population of cells. We show that the model recovers Monod's law for the growth of microbes and two other empirical relationships connecting growth rate to the mass fraction of ribosomes. Further, we can explain growth-related effects in dosage compensation by paralogs and predict host-circuit interactions in synthetic biology. Simulating competitions between strains, we find that the regulation of metabolic pathways may have evolved not to match expression of enzymes to levels of extracellular substrates in changing environments but rather to balance a trade-off between exploiting one type of nutrient over another. Although coarse-grained, the trade-offs that the model embodies are fundamental, and, as such, our modeling framework has potentially wide application, including in both biotechnology and medicine.
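For reference, the empirical relationships that the model is said to recover are commonly written as shown below; these are the standard textbook forms, not the specific parametrization used in the paper:

\[
\lambda(s) \;=\; \lambda_{\max}\,\frac{s}{K_s + s}
\quad\text{(Monod's law)},
\qquad
\phi_R \;\approx\; \phi_R^{0} \;+\; \frac{\lambda}{\gamma_t}
\quad\text{(ribosomal growth law)},
\]

where \(\lambda\) is the specific growth rate, \(s\) the extracellular substrate concentration, \(K_s\) the half-saturation constant, \(\phi_R\) the ribosomal mass fraction, \(\phi_R^{0}\) its offset at zero growth, and \(\gamma_t\) an effective translational capacity.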
How Peircean semiotic philosophy connects Western science with Eastern emptiness ontology.
Brier, Søren
2017-12-01
In recent articles in this journal I have discussed why a traditional physicalist and mechanist view of science, as well as an info-computationalist one, cannot fulfil the goal of building a transdisciplinary science across Snow's two cultures. There seems to be no path leading from mechanistic physicalism to views that encompass phenomenological theories of experiential consciousness and meaning-based cognition and communication. I have suggested, as an alternative, the Cybersemiotic framework's integration of Peirce's semiotics and Luhmann's autopoietic system theory. The present article considers in greater depth the ontological developments necessary to make this possible. It shows how Peirce avoids both materialism and German idealism by building on a concept of emptiness similar to that of modern quantum field theory, positing an indeterministic objective chance that feeds into an evolutionary philosophy of knowing, itself grounded in pure mathematics and phenomenology and combined with empirically executed fallibilism. Furthermore, he created a new metaphysics in the form of a synechist, triadic process philosophy. This was integrated into the transcendentalist process view of science and spirituality that Emerson developed from Western Unitarianism (agapism), featuring a metaphysics of emptiness and spontaneity (tychism) that is also essential to the Eastern philosophies of Buddhism and Vedanta. Copyright © 2017 Elsevier Ltd. All rights reserved.
Mohammadhassanzadeh, Hossein; Van Woensel, William; Abidi, Samina Raza; Abidi, Syed Sibte Raza
2017-01-01
Capturing complete medical knowledge is challenging, often due to incomplete patient Electronic Health Records (EHR), but also because of valuable, tacit medical knowledge hidden away in physicians' experiences. To extend the coverage of incomplete medical knowledge-based systems beyond their deductive closure, and thus enhance their decision-support capabilities, we argue that innovative, multi-strategy reasoning approaches should be applied. In particular, plausible reasoning mechanisms apply patterns from human thought processes, such as generalization, similarity and interpolation, based on attributional, hierarchical, and relational knowledge. Plausible reasoning mechanisms include inductive reasoning, which generalizes the commonalities among the data to induce new rules, and analogical reasoning, which is guided by data similarities to infer new facts. By further leveraging rich, biomedical Semantic Web ontologies to represent medical knowledge, both known and tentative, we increase the accuracy and expressivity of plausible reasoning, and cope with issues such as data heterogeneity, inconsistency and interoperability. In this paper, we present a Semantic Web-based, multi-strategy reasoning approach, which integrates deductive and plausible reasoning and exploits Semantic Web technology to solve complex clinical decision support queries. We evaluated our system using a real-world medical dataset of patients with hepatitis, from which we randomly removed different percentages of data (5%, 10%, 15%, and 20%) to reflect scenarios with increasing amounts of incomplete medical knowledge. To increase the reliability of the results, we generated 5 independent datasets for each percentage of missing values, which resulted in 20 experimental datasets (in addition to the original dataset). The results show that plausibly inferred knowledge extends the coverage of the knowledge base by, on average, 2%, 7%, 12%, and 16% for datasets with, respectively, 5%, 10%, 15%, and 20% missing values. This expansion in knowledge base coverage allowed complex disease diagnostic queries that were previously unresolvable to be solved, without losing the correctness of the answers. However, compared to deductive reasoning, data-intensive plausible reasoning mechanisms incur a significant performance overhead. First, we observed that plausible reasoning approaches, by generating tentative inferences and leveraging the domain knowledge of experts, allow us to extend the coverage of medical knowledge bases, resulting in improved clinical decision support. Second, by leveraging OWL ontological knowledge, we are able to increase the expressivity and accuracy of plausible reasoning methods. Third, our approach is applicable to clinical decision support systems for a range of chronic diseases.
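A stripped-down illustration of the analogical step, similarity-guided inference of a missing value from comparable records, is sketched below. The attributes, records, and scoring rule are invented for illustration; the paper's OWL-based Semantic Web pipeline is not reproduced.

```python
# Toy analogical ("similarity-guided") inference: fill a missing attribute of a
# patient record from its most similar complete records. Data are fabricated.

def similarity(a, b):
    """Fraction of matching values over attributes observed in both records."""
    shared = [k for k in a if k in b and a[k] is not None and b[k] is not None]
    if not shared:
        return 0.0
    return sum(a[k] == b[k] for k in shared) / len(shared)

def plausible_fill(incomplete, knowledge_base, attribute, k=3):
    """Tentatively infer `attribute` by majority vote of the k most similar records."""
    candidates = [r for r in knowledge_base if r.get(attribute) is not None]
    candidates.sort(key=lambda r: similarity(incomplete, r), reverse=True)
    votes = [r[attribute] for r in candidates[:k]]
    return max(set(votes), key=votes.count) if votes else None

kb = [
    {"fatigue": True,  "jaundice": True,  "alt_high": True,  "diagnosis": "hepatitis"},
    {"fatigue": True,  "jaundice": False, "alt_high": True,  "diagnosis": "hepatitis"},
    {"fatigue": False, "jaundice": False, "alt_high": False, "diagnosis": "healthy"},
]
patient = {"fatigue": True, "jaundice": True, "alt_high": None, "diagnosis": None}
print(plausible_fill(patient, kb, "diagnosis"))   # tentative, defeasible inference
```

Inferences produced this way are tentative by construction, which is why the approach described above pairs them with deductive reasoning and expert domain knowledge.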
Fatichi, Simone; Manzoni, Stefano; Or, Dani; Paschalis, Athanasios
2016-04-01
The potential of a given ecosystem to store and release carbon is inherently linked to soil biogeochemical processes. These processes are deeply connected to the water, energy, and vegetation dynamics above and belowground. Recently, it has been advocated that a mechanistic representation of soil biogeochemistry requires: (i) partitioning of soil organic carbon (SOC) pools according to their functional role; (ii) an explicit representation of microbial dynamics; and (iii) coupling of carbon and nutrient cycles. While some of these components have been introduced in specialized models, they have rarely been implemented in terrestrial biosphere models and tested in real cases. In this study, we combine a new soil biogeochemistry model with an existing model of land-surface hydrology and vegetation dynamics (T&C). Specifically, the soil biogeochemistry component explicitly separates different litter pools and distinguishes SOC into particulate, dissolved, and mineral-associated fractions. Extracellular enzymes and microbial pools are explicitly represented, differentiating the functional roles of bacteria and of saprotrophic and mycorrhizal fungi. Microbial activity depends on temperature, soil moisture, and litter or SOC stoichiometry. The activity of macrofauna is also modeled. Nutrient dynamics include the cycles of nitrogen, phosphorus, and potassium. The model accounts for feedbacks between nutrient limitations and plant growth, as well as for plant stoichiometric flexibility. In turn, litter input is a function of the simulated vegetation dynamics. Root exudation and export to mycorrhizae are computed based on a nutrient uptake cost function. The combined model is tested for its ability to reproduce respiration dynamics and the nitrogen cycle at a few sites where data were available to assess the plausibility of results across a range of metrics. For instance, in a Swiss grassland ecosystem, fine-root, bacterial, fungal, and macrofaunal respiration account for 40%, 23%, 33%, and 4% of total belowground respiration, respectively. Root exudation and carbon export to mycorrhizae represent about 7% of plant net primary production. The model allows exploring the temporal dynamics of respiration fluxes from the different ecosystem components and designing virtual experiments on the controls exerted by environmental variables and/or soil microbes and mycorrhizal associations on soil carbon storage, plant growth, and nutrient leaching.
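A generic, microbially explicit decomposition scheme of the kind referenced above, written here only to fix ideas (the T&C soil biogeochemistry component is considerably richer), couples a SOC pool to a microbial biomass pool \(B\):

\[
\frac{d\,\mathrm{SOC}}{dt} = I_{\mathrm{litter}} - V_{\max}\,B\,\frac{\mathrm{SOC}}{K_m + \mathrm{SOC}}\,f(T)\,f(\theta),
\qquad
\frac{dB}{dt} = \varepsilon\,V_{\max}\,B\,\frac{\mathrm{SOC}}{K_m + \mathrm{SOC}}\,f(T)\,f(\theta) - k_d\,B,
\]

where \(I_{\mathrm{litter}}\) is litter input from the vegetation model, \(\varepsilon\) a carbon-use efficiency, \(k_d\) a microbial turnover rate, and \(f(T)\), \(f(\theta)\) temperature and soil-moisture response functions (e.g., \(f(T)=Q_{10}^{(T-T_{\mathrm{ref}})/10}\)). Extending such a scheme to multiple litter, SOC, enzyme, and microbial functional pools, and to nitrogen, phosphorus, and potassium cycling, yields models of the type described in the abstract.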
State of the science: chronic periodontitis and systemic health.
Otomo-Corgel, Joan; Pucher, Jeffery J; Rethman, Michael P; Reynolds, Mark A
2012-09-01
Inflammatory periodontal diseases exhibit an association with multiple systemic conditions. Currently, there is a lack of consensus among experts on the nature of these associations and confusion among health care providers and the public on how to interpret this rapidly growing body of science. This article overviews the current evidence linking periodontal diseases to diabetes, cardiovascular disease, osteoporosis, preterm/low birth weight babies, respiratory diseases, and rheumatoid arthritis. Evidence is summarized and critically reviewed from systematic reviews, primary clinical trials, and mechanistic studies retrieved in searches of the PubMed electronic database, and the available data provide the basis for applied practical clinical recommendations. Surrogate markers for chronic periodontitis, such as tooth loss, show relatively consistent but weak associations with multiple systemic conditions. Despite biological plausibility, shorter-term interventional trials have generally not supported unambiguous cause-and-effect relationships. Nevertheless, the effective treatment of periodontal infections is important to achieve oral health goals, as well as to reduce the systemic risks of chronic local inflammation and bacteremias. With pregnancy as a possible exception, the local and systemic effects of periodontal infections and inflammation are usually exerted over many years, typically among those who are middle-aged or older. It follows that numerous epidemiological associations linking chronic periodontitis to age-associated and biologically complex conditions such as diabetes, cardiovascular disease, osteoporosis, respiratory diseases, rheumatoid arthritis, certain cancers, erectile dysfunction, kidney disease, and dementia have been reported. In the coming years, it seems likely that additional associations will be reported, despite adjustments for known genetic, behavioral, and environmental confounders. Determining cause-and-effect mechanisms is more complicated, especially in circumstances where systemic effects may be subtle. Copyright © 2012 Elsevier Inc. All rights reserved.
Bale, Ambuja S; Lee, Janice S
2016-01-01
The purpose of this article is to briefly review the published literature on the developmental neurotoxic effects, including potential mechanisms, of four butanols (n-butanol, sec-butanol, tert-butanol, and isobutanol), and to identify data gaps and research needs for evaluation of human health risks in this area. Exposure potential for these four butanols is considerable, given the high production volume (>1 billion lb) of n- and tert-butanol and the moderate production volumes (100-500 million lb) of sec- and isobutanol. With the impetus to derive cleaner gasoline blends, butanols are being considered for use as fuel oxygenates. Notable signs of neurotoxicity and developmental neurotoxicity have been observed in some studies in which laboratory animals (rodents) were gestationally exposed to n- or tert-butanol. Mechanistic data relevant to the observed developmental neurotoxicity endpoints were also reviewed to hypothesize potential mechanisms underlying these outcomes. Data from the related and highly characterized alcohol, ethanol, were included to examine consistencies between this compound and the four butanols. It is widely known that alcohols, including butanols, interact with several ion channels and modulate the function of these targets following both acute and chronic exposures. In addition, n- and sec-butanol have been demonstrated to inhibit fetal rat brain astroglial cell proliferation. Further, rat pups exposed to n-butanol in utero were reported to have significant increases in brain levels of dopamine and serotonin, whereas decreases in serotonin levels were noted with gestational exposure to tert-butanol. tert-Butanol was reported to inhibit muscarinic receptor-stimulated phosphoinositide metabolism, which has been hypothesized to be a possible target for the neurotoxic effects of ethanol during brain development. The mechanistic data for the butanols support the developmental neurotoxicity observed in some of the rodent studies. However, careful studies evaluating the neurobehavior of developing pups in sensitive strains, as well as characterizing the plausible mechanisms involved, need to be conducted in order to further elucidate the neurodevelopmental effects of butanols for risk evaluation. Published by Elsevier Inc.