Phenoscape: Identifying Candidate Genes for Evolutionary Phenotypes
Edmunds, Richard C.; Su, Baofeng; Balhoff, James P.; Eames, B. Frank; Dahdul, Wasila M.; Lapp, Hilmar; Lundberg, John G.; Vision, Todd J.; Dunham, Rex A.; Mabee, Paula M.; Westerfield, Monte
2016-01-01
Phenotypes resulting from mutations in genetic model organisms can help reveal candidate genes for evolutionarily important phenotypic changes in related taxa. Although testing candidate gene hypotheses experimentally in nonmodel organisms is typically difficult, ontology-driven information systems can help generate testable hypotheses about developmental processes in experimentally tractable organisms. Here, we tested candidate gene hypotheses suggested by expert use of the Phenoscape Knowledgebase, specifically looking for genes that are candidates responsible for evolutionarily interesting phenotypes in the ostariophysan fishes that bear resemblance to mutant phenotypes in zebrafish. For this, we searched ZFIN for genetic perturbations that result in either loss of basihyal element or loss of scales phenotypes, because these are the ancestral phenotypes observed in catfishes (Siluriformes). We tested the identified candidate genes by examining their endogenous expression patterns in the channel catfish, Ictalurus punctatus. The experimental results were consistent with the hypotheses that these features evolved through disruption in developmental pathways at, or upstream of, brpf1 and eda/edar for the ancestral losses of basihyal element and scales, respectively. These results demonstrate that ontological annotations of the phenotypic effects of genetic alterations in model organisms, when aggregated within a knowledgebase, can be used effectively to generate testable, and useful, hypotheses about evolutionary changes in morphology. PMID:26500251
Trevors, J T
2010-06-01
Methods to research the origin of microbial life are limited. However, microorganisms were the first organisms on the Earth capable of cell growth and division, and interactions with their environment, other microbial cells, and eventually with diverse eukaryotic organisms. The origin of microbial life and the supporting scientific evidence are both an enigma and a scientific priority. Numerous hypotheses have been proposed, scenarios imagined, speculations presented in papers, insights shared, and assumptions made without supporting experimentation, which have led to limited progress in understanding the origin of microbial life. The use of the human imagination to envision the origin of life events, without the supporting experimentation, observation and independently replicated experiments required for science, is a significant constraint. The challenge remains how to better understand the origin of microbial life using observations and experimental methods as opposed to speculation, assumptions, scenarios, envisioning events and un-testable hypotheses. This is not an easy challenge, as experimental design and plausible hypothesis testing are difficult. Since past approaches have been inconclusive in providing evidence for the origin of microbial life mechanisms and the manner in which genetic instructions were encoded into DNA/RNA, it is reasonable and logical to propose that progress will be made when testable, plausible hypotheses and methods are used in origin of microbial life research, and the experimental observations are, or are not, reproduced in independent laboratories. These perspectives will be discussed in this article, as well as the possibility that a pre-biotic film preceded a microbial biofilm as a possible micro-location for the origin of microbial cells capable of growth and division. 2010 Elsevier B.V. All rights reserved.
Integrated PK-PD and agent-based modeling in oncology.
Wang, Zhihui; Butner, Joseph D; Cristini, Vittorio; Deisboeck, Thomas S
2015-04-01
Mathematical modeling has become a valuable tool that strives to complement conventional biomedical research modalities in order to predict experimental outcome, generate new medical hypotheses, and optimize clinical therapies. Two specific approaches, pharmacokinetic-pharmacodynamic (PK-PD) modeling, and agent-based modeling (ABM), have been widely applied in cancer research. While they have made important contributions on their own (e.g., PK-PD in examining chemotherapy drug efficacy and resistance, and ABM in describing and predicting tumor growth and metastasis), only a few groups have started to combine both approaches together in an effort to gain more insights into the details of drug dynamics and the resulting impact on tumor growth. In this review, we focus our discussion on some of the most recent modeling studies building on a combined PK-PD and ABM approach that have generated experimentally testable hypotheses. Some future directions are also discussed.
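The kind of coupling this review discusses can be illustrated with a minimal sketch: a one-compartment PK model supplies a time-varying drug concentration, which sets the kill probability of tumor-cell agents on a lattice. This is an assumed toy construction for illustration only (the function names, lattice rules, and parameter values are invented and are not taken from any of the reviewed studies).

```python
# Toy PK-PD + agent-based coupling (illustrative assumptions throughout).
import random

import numpy as np

# --- PK: one-compartment model with first-order elimination ---
def drug_concentration(t, dose=100.0, volume=10.0, k_el=0.2):
    """Plasma concentration at time t (h) after a single bolus dose."""
    return (dose / volume) * np.exp(-k_el * t)

# --- PD: concentration-dependent kill probability (Emax model) ---
def kill_probability(conc, emax=0.4, ec50=2.0):
    return emax * conc / (ec50 + conc)

# --- ABM: tumor cells on a lattice either die (drug) or divide into empty neighbors ---
def step(grid, conc, p_divide=0.1):
    n = grid.shape[0]
    new = grid.copy()
    for i in range(n):
        for j in range(n):
            if grid[i, j] == 1:
                if random.random() < kill_probability(conc):
                    new[i, j] = 0                      # drug-induced death
                elif random.random() < p_divide:
                    di, dj = random.choice([(-1, 0), (1, 0), (0, -1), (0, 1)])
                    ni, nj = (i + di) % n, (j + dj) % n
                    if new[ni, nj] == 0:
                        new[ni, nj] = 1                # division into an empty site
    return new

grid = np.zeros((50, 50), dtype=int)
grid[24:26, 24:26] = 1                                 # small initial tumor
for t in range(100):                                   # 100 hourly steps
    grid = step(grid, drug_concentration(t))
    if t % 20 == 0:
        print(f"t={t:3d} h  cells={grid.sum():5d}  conc={drug_concentration(t):.2f}")
```

In the sketch, the PK side only feeds a concentration into the PD rule; richer combined models of the type reviewed above let the agent population feed back on drug distribution as well.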
Soy-Based Therapeutic Baby Formulas: Testable Hypotheses Regarding the Pros and Cons.
Westmark, Cara J
2016-01-01
Soy-based infant formulas have been consumed in the United States since 1909, and currently constitute a significant portion of the infant formula market. There are efforts underway to generate genetically modified soybeans that produce therapeutic agents of interest with the intent to deliver those agents in a soy-based infant formula platform. The threefold purpose of this review article is to first discuss the pros and cons of soy-based infant formulas, then present testable hypotheses to discern the suitability of a soy platform for drug delivery in babies, and finally start a discussion to inform public policy on this important area of infant nutrition.
“Feature Detection” vs. “Predictive Coding” Models of Plant Behavior
Calvo, Paco; Baluška, František; Sims, Andrew
2016-01-01
In this article we consider the possibility that plants exhibit anticipatory behavior, a mark of intelligence. If plants are able to anticipate and respond accordingly to varying states of their surroundings, as opposed to merely responding online to environmental contingencies, then such capacity may be in principle testable, and subject to empirical scrutiny. Our main thesis is that adaptive behavior can only take place by way of a mechanism that predicts the environmental sources of sensory stimulation. We propose to test for anticipation in plants experimentally by contrasting two empirical hypotheses: “feature detection” and “predictive coding.” We spell out what these contrasting hypotheses consist of by way of illustration from the animal literature, and consider how to transfer the rationale involved to the plant literature. PMID:27757094
Multidisciplinary approaches to understanding collective cell migration in developmental biology.
Schumacher, Linus J; Kulesa, Paul M; McLennan, Rebecca; Baker, Ruth E; Maini, Philip K
2016-06-01
Mathematical models are becoming increasingly integrated with experimental efforts in the study of biological systems. Collective cell migration in developmental biology is a particularly fruitful application area for the development of theoretical models to predict the behaviour of complex multicellular systems with many interacting parts. In this context, mathematical models provide a tool to assess the consistency of experimental observations with testable mechanistic hypotheses. In this review, we showcase examples from recent years of multidisciplinary investigations of neural crest cell migration. The neural crest model system has been used to study how collective migration of cell populations is shaped by cell-cell interactions, cell-environmental interactions and heterogeneity between cells. The wide range of emergent behaviours exhibited by neural crest cells in different embryonal locations and in different organisms helps us chart out the spectrum of collective cell migration. At the same time, this diversity in migratory characteristics highlights the need to reconcile or unify the array of currently hypothesized mechanisms through the next generation of experimental data and generalized theoretical descriptions. © 2016 The Authors.
Twelve testable hypotheses on the geobiology of weathering
S.L. Brantley; J.P. Megonigal; F.N. Scatena; Z. Balogh-Brunstad; R.T. Barnes; M.A. Bruns; P. Van Cappellen; K. Dontsova; H.E. Hartnett; A.S. Hartshorn; A. Heimsath; E. Herndon; L. Jin; C.K. Keller; J.R. Leake; W.H. McDowell; F.C. Meinzer; T.J. Mozdzer; S. Petsch; J. Pett-Ridge; K.S. Pregitzer; P.A. Raymond; C.S. Riebe; K. Shumaker; A. Sutton-Grier; R. Walter; K. Yoo
2011-01-01
Critical Zone (CZ) research investigates the chemical, physical, and biological processes that modulate the Earth's surface. Here, we advance 12 hypotheses that must be tested to improve our understanding of the CZ: (1) Solar-to-chemical conversion of energy by plants regulates flows of carbon, water, and nutrients through plant-microbe soil networks, thereby...
Colquhoun, Heather L; Carroll, Kelly; Eva, Kevin W; Grimshaw, Jeremy M; Ivers, Noah; Michie, Susan; Sales, Anne; Brehaut, Jamie C
2017-09-29
Audit and feedback (A&F) is a common strategy for helping health providers to implement evidence into practice. Despite being extensively studied, health care A&F interventions remain variably effective, with overall effect sizes that have not improved since 2003. Contributing to this stagnation is the fact that most health care A&F interventions have largely been designed without being informed by theoretical understanding from the behavioral and social sciences. To determine if the trend can be improved, the objective of this study was to develop a list of testable, theory-informed hypotheses about how to design more effective A&F interventions. Using purposive sampling, semi-structured 60-90-min telephone interviews were conducted with experts in theories related to A&F from a range of fields (e.g., cognitive, health and organizational psychology, medical decision-making, economics). Guided by detailed descriptions of A&F interventions from the health care literature, interviewees described how they would approach the problem of designing improved A&F interventions. Specific, theory-informed hypotheses about the conditions for effective design and delivery of A&F interventions were elicited from the interviews. The resulting hypotheses were assigned by three coders working independently into themes, and categories of themes, in an iterative process. We conducted 28 interviews and identified 313 theory-informed hypotheses, which were placed into 30 themes. The 30 themes included hypotheses related to the following five categories: A&F recipient (seven themes), content of the A&F (ten themes), process of delivery of the A&F (six themes), behavior that was the focus of the A&F (three themes), and other (four themes). We have identified a set of testable, theory-informed hypotheses from a broad range of behavioral and social science that suggest conditions for more effective A&F interventions. This work demonstrates the breadth of perspectives about A&F from non-healthcare-specific disciplines in a way that yields testable hypotheses for healthcare A&F interventions. These results will serve as the foundation for further work seeking to set research priorities among the A&F research community.
Binding and Scope Dependencies with "Floating Quantifiers" in Japanese
ERIC Educational Resources Information Center
Mukai, Emi
2012-01-01
The primary concern of this thesis is how we can achieve rigorous testability when we set the properties of the Computational System (hypothesized to be at the center of the language faculty) as our object of inquiry and informant judgments as a tool to construct and/or evaluate our hypotheses concerning the properties of the Computational System.…
STOP using just GO: a multi-ontology hypothesis generation tool for high throughput experimentation
2013-01-01
Background: Gene Ontology (GO) enrichment analysis remains one of the most common methods for hypothesis generation from high throughput datasets. However, we believe that researchers strive to test other hypotheses that fall outside of GO. Here, we developed and evaluated a tool for hypothesis generation from gene or protein lists using ontological concepts present in manually curated text that describes those genes and proteins. Results: As a consequence we have developed the method Statistical Tracking of Ontological Phrases (STOP) that expands the realm of testable hypotheses in gene set enrichment analyses by integrating automated annotations of genes to terms from over 200 biomedical ontologies. While not as precise as manually curated terms, we find that the additional enriched concepts have value when coupled with traditional enrichment analyses using curated terms. Conclusion: Multiple ontologies have been developed for gene and protein annotation; by using a dataset of both manually curated GO terms and automatically recognized concepts from curated text, we can expand the realm of hypotheses that can be discovered. The web application STOP is available at http://mooneygroup.org/stop/. PMID:23409969
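A compact sketch of the over-representation calculation that underlies enrichment tools of this kind: for each ontology term, a hypergeometric test asks whether the term annotates the gene list more often than expected by chance, followed by a simple multiple-testing adjustment. The gene identifiers and term-to-gene annotations below are made up for demonstration and are not STOP's data or API.

```python
# Hypergeometric term enrichment over a toy annotation set.
from scipy.stats import hypergeom

background = {f"gene{i}" for i in range(1000)}          # all assayed genes
hits = {f"gene{i}" for i in range(40)}                  # genes of interest

annotations = {                                          # hypothetical ontology terms
    "GO:0006955 immune response": {f"gene{i}" for i in range(0, 30)},
    "HP:0001250 seizure":         {f"gene{i}" for i in range(500, 530)},
}

results = []
for term, genes in annotations.items():
    genes = genes & background
    k = len(genes & hits)                                # annotated genes among the hits
    p = hypergeom.sf(k - 1, len(background), len(genes), len(hits))   # P(X >= k)
    results.append((term, k, p))

m = len(results)
for rank, (term, k, p) in enumerate(sorted(results, key=lambda r: r[2]), 1):
    bh = min(p * m / rank, 1.0)                          # BH-style adjustment (no monotonicity step)
    print(f"{term}: overlap={k}, p={p:.2e}, adjusted p={bh:.2e}")
```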
Moses Lake Fishery Restoration Project : FY 1999 Annual Report.
DOE Office of Scientific and Technical Information (OSTI.GOV)
None given
2000-12-01
The Moses Lake Project consists of 3 phases. Phase 1 is the assessment of all currently available physical and biological information, the collection of baseline biological data, the formulation of testable hypotheses, and the development of a detailed study plan to test the hypotheses. Phase 2 is dedicated to the implementation of the study plan including data collection, hypotheses testing, and the formulation of a management plan. Phase 3 of the project is the implementation of the management plan, monitoring and evaluation of the implemented recommendations. The project intends to restore the failed recreational fishery for panfish species (black crappie, bluegill and yellow perch) in Moses Lake as off site mitigation for lost recreational fishing opportunities for anadromous species in the upper Columbia River. This report summarizes the results of Phase 1 investigations and presents the study plan directed at initiating Phase 2 of the project. Phase 1 of the project culminates with the formulation of testable hypotheses directed at investigating possible limiting factors to the production of panfish in Moses Lake. The limiting factors to be investigated will include water quality, habitat quantity and quality, food limitations, competition, recruitment, predation, over harvest, environmental requirements, and the physical and chemical limitations of the system in relation to the fishes.
Ruiz, Patricia; Perlina, Ally; Mumtaz, Moiz; Fowler, Bruce A
2016-07-01
A number of epidemiological studies have identified statistical associations between persistent organic pollutants (POPs) and metabolic diseases, but testable hypotheses regarding underlying molecular mechanisms to explain these linkages have not been published. We assessed the underlying mechanisms of POPs that have been associated with metabolic diseases; three well-known POPs [2,3,7,8-tetrachlorodibenzodioxin (TCDD), 2,2´,4,4´,5,5´-hexachlorobiphenyl (PCB 153), and 4,4´-dichlorodiphenyldichloroethylene (p,p´-DDE)] were studied. We used advanced database search tools to delineate testable hypotheses and to guide laboratory-based research studies into underlying mechanisms by which this POP mixture could produce or exacerbate metabolic diseases. For our searches, we used proprietary systems biology software (MetaCore™/MetaDrug™) to conduct advanced search queries for the underlying interactions database, followed by directional network construction to identify common mechanisms for these POPs within two or fewer interaction steps downstream of their primary targets. These common downstream pathways belong to various cytokine and chemokine families with experimentally well-documented causal associations with type 2 diabetes. Our systems biology approach allowed identification of converging pathways leading to activation of common downstream targets. To our knowledge, this is the first study to propose an integrated global set of step-by-step molecular mechanisms for a combination of three common POPs using a systems biology approach, which may link POP exposure to diseases. Experimental evaluation of the proposed pathways may lead to development of predictive biomarkers of the effects of POPs, which could translate into disease prevention and effective clinical treatment strategies. Ruiz P, Perlina A, Mumtaz M, Fowler BA. 2016. A systems biology approach reveals converging molecular mechanisms that link different POPs to common metabolic diseases. Environ Health Perspect 124:1034-1041; http://dx.doi.org/10.1289/ehp.1510308.
Serotonergic Psychedelics: Experimental Approaches for Assessing Mechanisms of Action.
Canal, Clinton E
2018-03-13
Recent, well-controlled - albeit small-scale - clinical trials show that serotonergic psychedelics, including psilocybin and lysergic acid diethylamide, possess great promise for treating psychiatric disorders, including treatment-resistant depression. Additionally, fresh results from a deluge of clinical neuroimaging studies are unveiling the dynamic effects of serotonergic psychedelics on functional activity within, and connectivity across, discrete neural systems. These observations have led to testable hypotheses regarding neural processing mechanisms that contribute to psychedelic effects and therapeutic benefits. Despite these advances and a plethora of preclinical and clinical observations supporting a central role for brain serotonin 5-HT2A receptors in producing serotonergic psychedelic effects, lingering and new questions about mechanisms abound. These chiefly pertain to molecular neuropharmacology. This chapter is devoted to illuminating and discussing such questions in the context of preclinical experimental approaches for studying mechanisms of action of serotonergic psychedelics, classic and new.
Linking Microbiota to Human Diseases: A Systems Biology Perspective.
Wu, Hao; Tremaroli, Valentina; Bäckhed, Fredrik
2015-12-01
The human gut microbiota encompasses a densely populated ecosystem that provides essential functions for host development, immune maturation, and metabolism. Alterations to the gut microbiota have been observed in numerous diseases, including human metabolic diseases such as obesity, type 2 diabetes (T2D), and irritable bowel syndrome, and some animal experiments have suggested causality. However, few studies have validated causality in humans and the underlying mechanisms remain largely to be elucidated. We discuss how systems biology approaches combined with new experimental technologies may disentangle some of the mechanistic details in the complex interactions of diet, microbiota, and host metabolism and may provide testable hypotheses for advancing our current understanding of human-microbiota interaction. Copyright © 2015 Elsevier Ltd. All rights reserved.
Modelling toehold-mediated RNA strand displacement.
Šulc, Petr; Ouldridge, Thomas E; Romano, Flavio; Doye, Jonathan P K; Louis, Ard A
2015-03-10
We study the thermodynamics and kinetics of an RNA toehold-mediated strand displacement reaction with a recently developed coarse-grained model of RNA. Strand displacement, during which a single strand displaces a different strand previously bound to a complementary substrate strand, is an essential mechanism in active nucleic acid nanotechnology and has also been hypothesized to occur in vivo. We study the rate of displacement reactions as a function of the length of the toehold and temperature and make two experimentally testable predictions: that the displacement is faster if the toehold is placed at the 5' end of the substrate; and that the displacement slows down with increasing temperature for longer toeholds. Copyright © 2015 Biophysical Society. Published by Elsevier Inc. All rights reserved.
A systems framework for identifying candidate microbial assemblages for disease management
USDA-ARS?s Scientific Manuscript database
Network models of soil and plant microbiomes present new opportunities for enhancing disease management, but also challenges for interpretation. We present a framework for interpreting microbiome networks, illustrating how the observed structure of networks can be used to generate testable hypothese...
Leveraging Rigorous Local Evaluations to Understand Contradictory Findings
ERIC Educational Resources Information Center
Boulay, Beth; Martin, Carlos; Zief, Susan; Granger, Robert
2013-01-01
Contradictory findings from "well-implemented" rigorous evaluations invite researchers to identify the differences that might explain the contradictions, helping to generate testable hypotheses for new research. This panel will examine efforts to ensure that the large number of local evaluations being conducted as part of four…
The Process of Mentoring Pregnant Adolescents: An Exploratory Study.
ERIC Educational Resources Information Center
Blinn-Pike, Lynn; Kuschel, Diane; McDaniel, Annette; Mingus, Suzanne; Mutti, Megan Poole
1998-01-01
The process that occurs in relationships between volunteer adult mentors and pregnant adolescent "mentees" is described empirically; testable hypotheses based on findings concerning the mentor role are proposed. Case records from 20 mentors are analyzed; findings regarding mentors' roles are discussed. Criteria for conceptualizing quasi-parenting…
Researching the Study Abroad Experience
ERIC Educational Resources Information Center
McLeod, Mark; Wainwright, Philip
2009-01-01
The authors propose a paradigm for rigorous scientific assessment of study abroad programs, with the focus being on how study abroad experiences affect psychological constructs as opposed to looking solely at study-abroad-related outcomes. Social learning theory is used as a possible theoretical basis for making testable hypotheses and guiding…
A Theory of Work Adjustment. Minnesota Studies in Vocational Rehabilitation, 15.
ERIC Educational Resources Information Center
Dawis, Rene V.; and others
A theory of work adjustment which may contribute to the development of a science of the psychology of occupational behavior is proposed. It builds on the basic psychological concepts of stimulus, response, and reinforcement, and provides a research paradigm for generating testable hypotheses. It was derived from early research efforts of the…
ERIC Educational Resources Information Center
Maul, Andrew
2015-01-01
Briggs and Peck [in "Using Learning Progressions to Design Vertical Scales That Support Coherent Inferences about Student Growth"] call for greater care in the conceptualization of the target attributes of students, or "what it is that is growing from grade to grade." In particular, they argue that learning progressions can…
Modules, Theories, or Islands of Expertise? Domain Specificity in Socialization
ERIC Educational Resources Information Center
Gelman, Susan A.
2010-01-01
The domain-specific approach to socialization processes presented by J. E. Grusec and M. Davidov (this issue) provides a compelling framework for integrating and interpreting a large and disparate body of research findings, and it generates a wealth of testable new hypotheses. At the same time, it introduces core theoretical questions regarding…
Phases in the Adoption of Educational Innovations in Teacher Training Institutions.
ERIC Educational Resources Information Center
Hall, Gene E.
An attempt has been made to categorize phenomena observed as 20 teacher training institutions have adopted innovations and to extrapolate from these findings key concepts and principles that could form the basis for developing empirically testable hypotheses and could be of some immediate utility to those involved in innovation adoption. The…
Thinking about Evolution: Combinatorial Play as a Strategy for Exercising Scientific Creativity
ERIC Educational Resources Information Center
Wingate, Richard J. T.
2011-01-01
An enduring focus in education on how scientists formulate experiments and "do science" in the laboratory has excluded a vital element of scientific practice: the creative and imaginative thinking that generates models and testable hypotheses. In this case study, final-year biomedical sciences university students were invited to create and justify…
Hallow, K M; Gebremichael, Y
2017-06-01
Renal function plays a central role in cardiovascular, kidney, and multiple other diseases, and many existing and novel therapies act through renal mechanisms. Even with decades of accumulated knowledge of renal physiology, pathophysiology, and pharmacology, the dynamics of renal function remain difficult to understand and predict, often resulting in unexpected or counterintuitive therapy responses. Quantitative systems pharmacology modeling of renal function integrates this accumulated knowledge into a quantitative framework, allowing evaluation of competing hypotheses, identification of knowledge gaps, and generation of new experimentally testable hypotheses. Here we present a model of renal physiology and control mechanisms involved in maintaining sodium and water homeostasis. This model represents the core renal physiological processes involved in many research questions in drug development. The model runs in R and the code is made available. In a companion article, we present a case study using the model to explore mechanisms and pharmacology of salt-sensitive hypertension. © 2017 The Authors CPT: Pharmacometrics & Systems Pharmacology published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.
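As a toy illustration of the kind of feedback loop the abstract describes (not the published model, which is far more detailed and is distributed as R code), the sketch below closes a single pressure-natriuresis loop: sodium intake raises extracellular volume and pressure, and pressure raises sodium excretion until intake and excretion balance. All equations and parameter values are arbitrary assumptions.

```python
# Minimal pressure-natriuresis loop (illustrative parameters only).
def simulate(days=30.0, dt=0.001, na_intake=150.0):       # intake in mmol/day
    na = 2000.0                                            # total extracellular Na (mmol)
    for _ in range(int(days / dt)):
        volume = na / 140.0                                # litres, assuming ~140 mmol/L plasma Na
        pressure = 70.0 + 2.0 * volume                     # mean arterial pressure (mmHg), linear stand-in
        excretion = 15.0 * max(pressure - 90.0, 0.0)       # pressure natriuresis (mmol/day)
        na += (na_intake - excretion) * dt                 # sodium balance
    return volume, pressure, excretion

for intake in (100.0, 150.0, 250.0):
    v, p, e = simulate(na_intake=intake)
    print(f"intake={intake:5.0f} mmol/day -> ECF volume={v:.1f} L, MAP={p:.1f} mmHg, excretion={e:.0f} mmol/day")
```

Even this caricature reproduces the qualitative behavior that higher chronic sodium intake settles at a higher steady-state pressure, which is the kind of relationship the full quantitative systems pharmacology model interrogates mechanistically.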
How and why does the immunological synapse form? Physical chemistry meets cell biology.
Chakraborty, Arup K
2002-03-05
During T lymphocyte (T cell) recognition of an antigen, a highly organized and specific pattern of membrane proteins forms in the junction between the T cell and the antigen-presenting cell (APC). This specialized cell-cell junction is called the immunological synapse. It is several micrometers large and forms over many minutes. A plethora of experiments are being performed to study the mechanisms that underlie synapse formation and the way in which information transfer occurs across the synapse. The wealth of experimental data that is beginning to emerge must be understood within a mechanistic framework if it is to prove useful in developing modalities to control the immune response. Quantitative models can complement experiments in the quest for such a mechanistic understanding by suggesting experimentally testable hypotheses. Here, a quantitative synapse assembly model is described. The model uses concepts developed in physical chemistry and cell biology and is able to predict the spatiotemporal evolution of cell shape and receptor protein patterns observed during synapse formation. Attention is directed to how the juxtaposition of model predictions and experimental data has led to intriguing hypotheses regarding the role of null and self peptides during synapse assembly, as well as correlations between T cell effector functions and the robustness of synapse assembly. We remark on some ways in which synergistic experiments and modeling studies can improve current models, and we take steps toward a better understanding of information transfer across the T cell-APC junction.
Beyond the bucket: testing the effect of experimental design on rate and sequence of decay
NASA Astrophysics Data System (ADS)
Gabbott, Sarah; Murdock, Duncan; Purnell, Mark
2016-04-01
Experimental decay has revealed the potential for profound biases in our interpretations of exceptionally preserved fossils, with non-random sequences of character loss distorting the position of fossil taxa in phylogenetic trees. By characterising these sequences we can rewind this distortion and make better-informed interpretations of the affinity of enigmatic fossil taxa. Equally, rate of character loss is crucial for estimating the preservation potential of phylogenetically informative characters, and revealing the mechanisms of preservation themselves. However, experimental decay has been criticised for poorly modeling 'real' conditions, and dismissed as unsophisticated 'bucket science'. Here we test the effect of differing experimental parameters on the rate and sequence of decay. By doing so, we can test the assumption that the results of decay experiments are applicable to informing interpretations of exceptionally preserved fossils from diverse preservational settings. The results of our experiments demonstrate the validity of using the sequence of character loss as a phylogenetic tool, and shed light on the extent to which environment must be considered before making decay-informed interpretations, or reconstructing taphonomic pathways. With careful consideration of experimental design, driven by testable hypotheses, decay experiments are robust and informative - experimental taphonomy needn't kick the bucket just yet.
Origin and Proliferation of Multiple-Drug Resistance in Bacterial Pathogens
Chang, Hsiao-Han; Cohen, Ted; Grad, Yonatan H.; Hanage, William P.; O'Brien, Thomas F.
2015-01-01
Many studies report the high prevalence of multiply drug-resistant (MDR) strains. Because MDR infections are often significantly harder and more expensive to treat, they represent a growing public health threat. However, for different pathogens, different underlying mechanisms are traditionally used to explain these observations, and it is unclear whether each bacterial taxon has its own mechanism(s) for multidrug resistance or whether there are common mechanisms between distantly related pathogens. In this review, we provide a systematic overview of the causes of the excess of MDR infections and define testable predictions made by each hypothetical mechanism, including experimental, epidemiological, population genomic, and other tests of these hypotheses. Better understanding the cause(s) of the excess of MDR is the first step to rational design of more effective interventions to prevent the origin and/or proliferation of MDR. PMID:25652543
ERIC Educational Resources Information Center
Kulczynska, Agnieszka; Johnson, Reed; Frost, Tony; Margerum, Lawrence D.
2011-01-01
An advanced undergraduate laboratory project is described that integrates inorganic, analytical, physical, and biochemical techniques to reveal differences in binding between cationic metal complexes and anionic DNA (herring testes). Students were guided to formulate testable hypotheses based on the title question and a list of different metal…
The part of cognitive science that is philosophy.
Dennett, Daniel C
2009-04-01
There is much good work for philosophers to do in cognitive science if they adopt the constructive attitude that prevails in science, work toward testable hypotheses, and take on the task of clarifying the relationship between the scientific concepts and the everyday concepts with which we conduct our moral lives. Copyright © 2009 Cognitive Science Society, Inc.
Surface fire effects on conifer and hardwood crowns--applications of an integral plume model
Matthew Dickinson; Anthony Bova; Kathleen Kavanagh; Antoine Randolph; Lawrence Band
2009-01-01
An integral plume model was applied to the problems of tree death from canopy injury in dormant-season hardwoods and branch embolism in Douglas fir (Pseudotsuga menziesii) crowns. Our purpose was to generate testable hypotheses. We used the integral plume models to relate crown injury to bole injury and to explore the effects of variation in fire...
Objections to routine clinical outcomes measurement in mental health services: any evidence so far?
MacDonald, Alastair J D; Trauer, Tom
2010-12-01
Routine clinical outcomes measurement (RCOM) is gaining importance in mental health services. The aim was to examine whether criticisms published in advance of the development of RCOM have been borne out by data now available from such a programme. This was an observational study of routine ratings using HoNOS65+ at inception/admission and again at discharge in an old age psychiatry service from 1997 to 2008. Testable hypotheses were generated from each criticism amenable to empirical examination. Inter-rater reliability estimates were applied to observed differences in scores between community and ward patients using resampling. Five thousand one hundred eighty community inceptions and 862 admissions had HoNOS65+ ratings at referral/admission and discharge. We could find no evidence of gaming (artificially worse scores at inception and better at discharge), selection, attrition or detection bias, and ratings were consistent with diagnosis and level of service. Anticipated low levels of inter-rater reliability did not vitiate differences between levels of service. Although only hypotheses testable from within RCOM data were examined, and only 46% of eligible episodes had complete outcomes data, no evidence of the alleged biases was found. RCOM seems valid and practical in mental health services.
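The resampling idea mentioned above, applying an inter-rater reliability estimate to an observed group difference, can be sketched as follows on simulated change scores. The data, group sizes, and noise level are invented for illustration; this is not the study's analysis code.

```python
# Bootstrap a community-versus-ward difference under simulated rater noise.
import numpy as np

rng = np.random.default_rng(0)
community = rng.normal(loc=4.0, scale=3.0, size=500)     # simulated HoNOS65+ change scores
ward = rng.normal(loc=6.0, scale=3.5, size=200)

def noisy_diff(a, b, rater_sd=2.0):
    """Resample both groups with replacement and perturb each rating with rater disagreement noise."""
    a_star = rng.choice(a, size=a.size) + rng.normal(0.0, rater_sd, a.size)
    b_star = rng.choice(b, size=b.size) + rng.normal(0.0, rater_sd, b.size)
    return b_star.mean() - a_star.mean()

diffs = np.array([noisy_diff(community, ward) for _ in range(5000)])
lo, hi = np.percentile(diffs, [2.5, 97.5])
print(f"ward - community difference: {diffs.mean():.2f} (95% CI {lo:.2f} to {hi:.2f})")
```

If the confidence interval excludes zero even after rater noise is injected, the group difference is unlikely to be an artefact of imperfect inter-rater reliability, which is the logic the study applies to its observed data.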
The evolutionary psychology of hunger.
Al-Shawaf, Laith
2016-10-01
An evolutionary psychological perspective suggests that emotions can be understood as coordinating mechanisms whose job is to regulate various psychological and physiological programs in the service of solving an adaptive problem. This paper suggests that it may also be fruitful to approach hunger from this coordinating mechanism perspective. To this end, I put forward an evolutionary task analysis of hunger, generating novel a priori hypotheses about the coordinating effects of hunger on psychological processes such as perception, attention, categorization, and memory. This approach appears empirically fruitful in that it yields a bounty of testable new hypotheses. Copyright © 2016 Elsevier Ltd. All rights reserved.
Raja, Kalpana; Patrick, Matthew; Gao, Yilin; Madu, Desmond; Yang, Yuyang
2017-01-01
In the past decade, the volume of “omics” data generated by the different high-throughput technologies has expanded exponentially. The managing, storing, and analyzing of this big data have been a great challenge for the researchers, especially when moving towards the goal of generating testable data-driven hypotheses, which has been the promise of the high-throughput experimental techniques. Different bioinformatics approaches have been developed to streamline the downstream analyzes by providing independent information to interpret and provide biological inference. Text mining (also known as literature mining) is one of the commonly used approaches for automated generation of biological knowledge from the huge number of published articles. In this review paper, we discuss the recent advancement in approaches that integrate results from omics data and information generated from text mining approaches to uncover novel biomedical information. PMID:28331849
Advancing Biological Understanding and Therapeutics Discovery with Small Molecule Probes
Schreiber, Stuart L.; Kotz, Joanne D.; Li, Min; Aubé, Jeffrey; Austin, Christopher P.; Reed, John C.; Rosen, Hugh; White, E. Lucile; Sklar, Larry A.; Lindsley, Craig W.; Alexander, Benjamin R.; Bittker, Joshua A.; Clemons, Paul A.; de Souza, Andrea; Foley, Michael A.; Palmer, Michelle; Shamji, Alykhan F.; Wawer, Mathias J.; McManus, Owen; Wu, Meng; Zou, Beiyan; Yu, Haibo; Golden, Jennifer E.; Schoenen, Frank J.; Simeonov, Anton; Jadhav, Ajit; Jackson, Michael R.; Pinkerton, Anthony B.; Chung, Thomas D.Y.; Griffin, Patrick R.; Cravatt, Benjamin F.; Hodder, Peter S.; Roush, William R.; Roberts, Edward; Chung, Dong-Hoon; Jonsson, Colleen B.; Noah, James W.; Severson, William E.; Ananthan, Subramaniam; Edwards, Bruce; Oprea, Tudor I.; Conn, P. Jeffrey; Hopkins, Corey R.; Wood, Michael R.; Stauffer, Shaun R.; Emmitte, Kyle A.
2015-01-01
Small-molecule probes can illuminate biological processes and aid in the assessment of emerging therapeutic targets by perturbing biological systems in a manner distinct from other experimental approaches. Despite the tremendous promise of chemical tools for investigating biology and disease, small-molecule probes were unavailable for most targets and pathways as recently as a decade ago. In 2005, the U.S. National Institutes of Health launched the decade-long Molecular Libraries Program with the intent of innovating in and broadening access to small-molecule science. This Perspective describes how novel small-molecule probes identified through the program are enabling the exploration of biological pathways and therapeutic hypotheses not otherwise testable. These experiences illustrate how small-molecule probes can help bridge the chasm between biological research and the development of medicines, but also highlight the need to innovate the science of therapeutic discovery. PMID:26046436
Canales, Javier; Moyano, Tomás C.; Villarroel, Eva; Gutiérrez, Rodrigo A.
2014-01-01
Nitrogen (N) is an essential macronutrient for plant growth and development. Plants adapt to changes in N availability partly by changes in global gene expression. We integrated publicly available root microarray data under contrasting nitrate conditions to identify new genes and functions important for adaptive nitrate responses in Arabidopsis thaliana roots. Overall, more than 2000 genes exhibited changes in expression in response to nitrate treatments in Arabidopsis thaliana root organs. Global regulation of gene expression by nitrate depends largely on the experimental context. However, despite significant differences from experiment to experiment in the identity of regulated genes, there is a robust nitrate response of specific biological functions. Integrative gene network analysis uncovered relationships between nitrate-responsive genes and 11 highly co-expressed gene clusters (modules). Four of these gene network modules have robust nitrate-responsive functions such as transport, signaling, and metabolism. Network analysis suggested that G2-like transcription factors are key regulatory factors controlling transport and signaling functions. Our meta-analysis highlights the role of biological processes not studied before in the context of the nitrate response, such as root hair development, and provides testable hypotheses to advance our understanding of nitrate responses in plants. PMID:24570678
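A small sketch of the module-detection step this kind of meta-analysis relies on: genes are clustered by co-expression (here, average-linkage clustering on 1 - correlation), and each module can then be tested for enrichment of nitrate-responsive genes. The expression matrix below is simulated; it is not the Arabidopsis data, and the clustering choices are illustrative assumptions rather than the authors' pipeline.

```python
# Co-expression module detection on a simulated expression matrix.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import squareform

rng = np.random.default_rng(0)
n_genes, n_samples = 200, 30
expr = rng.normal(size=(n_genes, n_samples))
expr[:50] += rng.normal(size=n_samples)                  # plant a correlated block of 50 genes

corr = np.corrcoef(expr)                                 # gene-by-gene correlation
dist = squareform(1.0 - corr, checks=False)              # dissimilarity = 1 - correlation
modules = fcluster(linkage(dist, method="average"), t=5, criterion="maxclust")

for m in sorted(set(modules)):
    members = np.where(modules == m)[0]
    print(f"module {m}: {members.size} genes (e.g., gene indices {members[:5].tolist()})")
```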
Mathematical modeling of the female reproductive system: from oocyte to delivery.
Clark, Alys R; Kruger, Jennifer A
2017-01-01
From ovulation to delivery, and through the menstrual cycle, the female reproductive system undergoes many dynamic changes to provide an optimal environment for the embryo to implant, and to develop successfully. It is difficult ethically and practically to observe the system over the timescales involved in growth and development (often hours to days). Even in carefully monitored conditions clinicians and biologists can only see snapshots of the development process. Mathematical models are emerging as a key means to supplement our knowledge of the reproductive process, and to tease apart complexity in the reproductive system. These models have been used successfully to test existing hypotheses regarding the mechanisms of female infertility and pathological fetal development, and also to provide new experimentally testable hypotheses regarding the process of development. This new knowledge has allowed for improvements in assisted reproductive technologies and is moving toward translation to clinical practice via multiscale assessments of the dynamics of ovulation, development in pregnancy, and the timing and mechanics of delivery. WIREs Syst Biol Med 2017, 9:e1353. doi: 10.1002/wsbm.1353 For further resources related to this article, please visit the WIREs website. © 2016 Wiley Periodicals, Inc.
Structural Genomics of Bacterial Virulence Factors
2006-05-01
…positioned in the unit cell by Molecular Replacement (Protein Data Bank (PDB) ID code 1acc) using MOLREP, and refined with REFMAC version 5.0 (ref. 24) … increase our understanding of the molecular mechanisms of pathogenicity, putting us in a stronger position to anticipate and react to emerging … term, the accumulated structural information will generate important and testable hypotheses that will increase our understanding of the molecular …
ERIC Educational Resources Information Center
Maestripieri, Dario
2005-01-01
Comparative behavioral research is important for a number of reasons and can contribute to the understanding of human behavior and development in many different ways. Research with animal models of human behavior and development can be a source not only of general principles and testable hypotheses but also of empirical information that may be…
Testability of evolutionary game dynamics based on experimental economics data
NASA Astrophysics Data System (ADS)
Wang, Yijia; Chen, Xiaojie; Wang, Zhijian
2017-11-01
Understanding the dynamic processes of a real game system requires an appropriate dynamics model, and rigorously testing a dynamics model is nontrivial. In our methodological research, we develop an approach to testing the validity of game dynamics models that considers the dynamic patterns of angular momentum and speed as measurement variables. Using Rock-Paper-Scissors (RPS) games as an example, we illustrate the geometric patterns in the experiment data. We then derive the related theoretical patterns from a series of typical dynamics models. By testing the goodness-of-fit between the experimental and theoretical patterns, we show that the validity of these models can be evaluated quantitatively. Our approach establishes a link between dynamics models and experimental systems, which is, to the best of our knowledge, the most effective and rigorous strategy for ascertaining the testability of evolutionary game dynamics models.
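The two measurement variables named above, angular momentum and speed of the social-state trajectory, can be computed as in the sketch below. Here the trajectory comes from standard replicator dynamics with a zero-sum RPS payoff matrix purely for illustration; in the paper's setting the same quantities would be computed both from experimental state sequences and from each candidate dynamics model, and the patterns compared.

```python
# Angular momentum and speed of a simulated RPS social-state trajectory.
import numpy as np

A = np.array([[0.0, -1.0, 1.0],
              [1.0, 0.0, -1.0],
              [-1.0, 1.0, 0.0]])                         # zero-sum RPS payoff matrix

def replicator_step(x, dt=0.01):
    fitness = A @ x
    return x + dt * x * (fitness - x @ fitness)          # replicator dynamics, Euler step

x = np.array([0.6, 0.3, 0.1])
traj = [x]
for _ in range(20000):
    x = replicator_step(x)
    traj.append(x)
traj = np.array(traj)

center = np.array([1 / 3, 1 / 3, 1 / 3])                 # interior equilibrium
r = traj[:-1] - center                                   # position relative to the centroid
v = np.diff(traj, axis=0)                                # per-step velocity
angular_momentum = np.cross(r, v)                        # cyclic motion shows up as a nonzero mean
speed = np.linalg.norm(v, axis=1)
print("mean angular momentum magnitude:", np.linalg.norm(angular_momentum.mean(axis=0)))
print("mean speed per step:", speed.mean())
```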
Conceptual frameworks and methods for advancing invasion ecology.
Heger, Tina; Pahl, Anna T; Botta-Dukát, Zoltan; Gherardi, Francesca; Hoppe, Christina; Hoste, Ivan; Jax, Kurt; Lindström, Leena; Boets, Pieter; Haider, Sylvia; Kollmann, Johannes; Wittmann, Meike J; Jeschke, Jonathan M
2013-09-01
Invasion ecology has much advanced since its early beginnings. Nevertheless, explanation, prediction, and management of biological invasions remain difficult. We argue that progress in invasion research can be accelerated by, first, pointing out difficulties this field is currently facing and, second, looking for measures to overcome them. We see basic and applied research in invasion ecology confronted with difficulties arising from (A) societal issues, e.g., disparate perceptions of invasive species; (B) the peculiarity of the invasion process, e.g., its complexity and context dependency; and (C) the scientific methodology, e.g., imprecise hypotheses. To overcome these difficulties, we propose three key measures: (1) a checklist for definitions to encourage explicit definitions; (2) implementation of a hierarchy of hypotheses (HoH), where general hypotheses branch into specific and precisely testable hypotheses; and (3) platforms for improved communication. These measures may significantly increase conceptual clarity and enhance communication, thus advancing invasion ecology.
Debates—Hypothesis testing in hydrology: Introduction
NASA Astrophysics Data System (ADS)
Blöschl, Günter
2017-03-01
This paper introduces the papers in the "Debates—Hypothesis testing in hydrology" series. The four articles in the series discuss whether and how the process of testing hypotheses leads to progress in hydrology. Repeated experiments with controlled boundary conditions are rarely feasible in hydrology. Research is therefore not easily aligned with the classical scientific method of testing hypotheses. Hypotheses in hydrology are often enshrined in computer models which are tested against observed data. Testability may be limited due to model complexity and data uncertainty. All four articles suggest that hypothesis testing has contributed to progress in hydrology and is needed in the future. However, the procedure is usually not as systematic as the philosophy of science suggests. A greater emphasis on a creative reasoning process on the basis of clues and explorative analyses is therefore needed.
It takes two to talk: a second-person neuroscience approach to language learning.
Syal, Supriya; Anderson, Adam K
2013-08-01
Language is a social act. We have previously argued that language remains embedded in sociality because the motivation to communicate exists only within a social context. Schilbach et al. underscore the importance of studying linguistic behavior from within the motivated, socially interactive frame in which it is learnt and used, as well as provide testable hypotheses for a participatory, second-person neuroscience approach to language learning.
Empirical approaches to the study of language evolution.
Fitch, W Tecumseh
2017-02-01
The study of language evolution, and human cognitive evolution more generally, has often been ridiculed as unscientific, but in fact it differs little from many other disciplines that investigate past events, such as geology or cosmology. Well-crafted models of language evolution make numerous testable hypotheses, and if the principles of strong inference (simultaneous testing of multiple plausible hypotheses) are adopted, there is an increasing amount of relevant data allowing empirical evaluation of such models. The articles in this special issue provide a concise overview of current models of language evolution, emphasizing the testable predictions that they make, along with overviews of the many sources of data available to test them (emphasizing comparative, neural, and genetic data). The key challenge facing the study of language evolution is not a lack of data, but rather a weak commitment to hypothesis-testing approaches and strong inference, exacerbated by the broad and highly interdisciplinary nature of the relevant data. This introduction offers an overview of the field, and a summary of what needed to evolve to provide our species with language-ready brains. It then briefly discusses different contemporary models of language evolution, followed by an overview of different sources of data to test these models. I conclude with my own multistage model of how different components of language could have evolved.
Vesselinova, Neda; Alexandrov, Boian; Wall, Michael E.
2016-11-08
We present a dynamical model of drug accumulation in bacteria. The model captures key features in experimental time courses on ofloxacin accumulation: initial uptake; two-phase response; and long-term acclimation. In combination with experimental data, the model provides estimates of import and export rates in each phase, the time of entry into the second phase, and the decrease of internal drug during acclimation. Global sensitivity analysis, local sensitivity analysis, and Bayesian sensitivity analysis of the model provide information about the robustness of these estimates, and about the relative importance of different parameters in determining the features of the accumulation time courses in three different bacterial species: Escherichia coli, Staphylococcus aureus, and Pseudomonas aeruginosa. The results lead to experimentally testable predictions of the effects of membrane permeability, drug efflux and trapping (e.g., by DNA binding) on drug accumulation. A key prediction is that a sudden increase in ofloxacin accumulation in both E. coli and S. aureus is accompanied by a decrease in membrane permeability.
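A minimal sketch of the kind of dynamics the abstract describes: first-order import and export of drug, with export stepping up at a phase-transition time to produce a two-phase response. The equations, rate constants, and switch time are illustrative assumptions, not the authors' fitted model.

```python
# Piecewise first-order uptake/efflux model of intracellular drug accumulation.
import numpy as np

def accumulation(t, c_out=1.0, k_in=1.0, k_out1=0.2, k_out2=0.8, t_switch=10.0):
    """Internal drug concentration over a time grid t (minutes), Euler integration."""
    c = np.zeros_like(t)
    for i in range(1, t.size):
        dt = t[i] - t[i - 1]
        k_out = k_out1 if t[i] < t_switch else k_out2     # phase 2: stronger efflux/acclimation
        c[i] = c[i - 1] + dt * (k_in * c_out - k_out * c[i - 1])
    return c

t = np.linspace(0.0, 60.0, 601)                           # 0.1-min grid
c = accumulation(t)
for minute in (5, 10, 20, 60):
    print(f"t={minute:2d} min  internal drug={c[minute * 10]:.2f}")
```

Fitting such rate constants to measured time courses, and then asking how the fit changes when each parameter is perturbed, is the role the sensitivity analyses play in the study above.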
Creation of a Mouse with Stress-Induced Dystonia: Control of an ATPase Chaperone
2013-04-01
…was successful, and a mouse with the desired dystonic symptoms was obtained. It has two mutations, one a dominantly inherited gene with 100… the hallmark of dystonia. Subject terms: dystonia, genetically modified mice, stress, gene mutations, animal model of disease. …there are a variety of hypotheses that should be testable if there were a realistic animal model. Mice with mutations in genes known to cause dystonia…
Multi-omics approach identifies molecular mechanisms of plant-fungus mycorrhizal interaction
Larsen, Peter E.; Sreedasyam, Avinash; Trivedi, Geetika; ...
2016-01-19
In mycorrhizal symbiosis, plant roots form close, mutually beneficial interactions with soil fungi. Before this mycorrhizal interaction can be established, however, plant roots must be capable of detecting potential beneficial fungal partners and initiating the gene expression patterns necessary to begin symbiosis. To predict plant root–mycorrhizal fungi sensor systems, we analyzed in vitro experiments of Populus tremuloides (aspen tree) and Laccaria bicolor (mycorrhizal fungi) interaction and leveraged over 200 previously published transcriptomic experimental data sets, 159 experimentally validated plant transcription factor binding motifs, and more than 120-thousand experimentally validated protein-protein interactions to generate models of pre-mycorrhizal sensor systems in aspen root. These sensor mechanisms link extracellular signaling molecules with gene regulation through a network comprised of membrane receptors, signal cascade proteins, transcription factors, and transcription factor binding DNA motifs. Modeling predicted four pre-mycorrhizal sensor complexes in aspen that interact with fifteen transcription factors to regulate the expression of 1184 genes in response to extracellular signals synthesized by Laccaria. Predicted extracellular signaling molecules include common signaling molecules such as phenylpropanoids, salicylate, and jasmonic acid. Lastly, this multi-omic computational modeling approach for predicting the complex sensory networks yielded specific, testable biological hypotheses for mycorrhizal interaction signaling compounds, sensor complexes, and mechanisms of gene regulation.
Slot-like capacity and resource-like coding in a neural model of multiple-item working memory.
Standage, Dominic; Pare, Martin
2018-06-27
For the past decade, research on the storage limitations of working memory has been dominated by two fundamentally different hypotheses. On the one hand, the contents of working memory may be stored in a limited number of `slots', each with a fixed resolution. On the other hand, any number of items may be stored, but with decreasing resolution. These two hypotheses have been invaluable in characterizing the computational structure of working memory, but neither provides a complete account of the available experimental data, nor speaks to the neural basis of the limitations it characterizes. To address these shortcomings, we simulated a multiple-item working memory task with a cortical network model, the cellular resolution of which allowed us to quantify the coding fidelity of memoranda as a function of memory load, as measured by the discriminability, regularity and reliability of simulated neural spiking. Our simulations account for a wealth of neural and behavioural data from human and non-human primate studies, and they demonstrate that feedback inhibition lowers both capacity and coding fidelity. Because the strength of inhibition scales with the number of items stored by the network, increasing this number progressively lowers fidelity until capacity is reached. Crucially, the model makes specific, testable predictions for neural activity on multiple-item working memory tasks.
Simulating Cancer Growth with Multiscale Agent-Based Modeling
Wang, Zhihui; Butner, Joseph D.; Kerketta, Romica; Cristini, Vittorio; Deisboeck, Thomas S.
2014-01-01
There have been many techniques developed in recent years to in silico model a variety of cancer behaviors. Agent-based modeling is a specific discrete-based hybrid modeling approach that allows simulating the role of diversity in cell populations as well as within each individual cell; it has therefore become a powerful modeling method widely used by computational cancer researchers. Many aspects of tumor morphology including phenotype-changing mutations, the adaptation to microenvironment, the process of angiogenesis, the influence of extracellular matrix, reactions to chemotherapy or surgical intervention, the effects of oxygen and nutrient availability, and metastasis and invasion of healthy tissues have been incorporated and investigated in agent-based models. In this review, we introduce some of the most recent agent-based models that have provided insight into the understanding of cancer growth and invasion, spanning multiple biological scales in time and space, and we further describe several experimentally testable hypotheses generated by those models. We also discuss some of the current challenges of multiscale agent-based cancer models. PMID:24793698
Burger, Gerhard A.; Danen, Erik H. J.; Beltman, Joost B.
2017-01-01
Epithelial–mesenchymal transition (EMT), the process by which epithelial cells can convert into motile mesenchymal cells, plays an important role in development and wound healing but is also involved in cancer progression. It is increasingly recognized that EMT is a dynamic process involving multiple intermediate or “hybrid” phenotypes rather than an “all-or-none” process. However, the role of EMT in various cancer hallmarks, including metastasis, is debated. Given the complexity of EMT regulation, computational modeling has proven to be an invaluable tool for cancer research, i.e., to resolve apparent conflicts in experimental data and to guide experiments by generating testable hypotheses. In this review, we provide an overview of computational modeling efforts that have been applied to regulation of EMT in the context of cancer progression and its associated tumor characteristics. Moreover, we identify possibilities to bridge different modeling approaches and point out outstanding questions in which computational modeling can contribute to advance our understanding of pathological EMT. PMID:28824874
Modeling T-cell activation using gene expression profiling and state-space models.
Rangel, Claudia; Angus, John; Ghahramani, Zoubin; Lioumi, Maria; Sotheran, Elizabeth; Gaiba, Alessia; Wild, David L; Falciani, Francesco
2004-06-12
We have used state-space models to reverse engineer transcriptional networks from highly replicated gene expression profiling time series data obtained from a well-established model of T-cell activation. State space models are a class of dynamic Bayesian networks that assume that the observed measurements depend on some hidden state variables that evolve according to Markovian dynamics. These hidden variables can capture effects that cannot be measured in a gene expression profiling experiment, e.g. genes that have not been included in the microarray, levels of regulatory proteins, the effects of messenger RNA and protein degradation, etc. Bootstrap confidence intervals are developed for parameters representing 'gene-gene' interactions over time. Our models represent the dynamics of T-cell activation and provide a methodology for the development of rational and experimentally testable hypotheses. Supplementary data and Matlab computer source code will be made available on the web at the URL given below. http://public.kgi.edu/~wild/LDS/index.htm
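For readers unfamiliar with state-space models, the following minimal Python sketch shows the general idea behind the class of models described above: hidden regulatory states evolve under linear Markovian dynamics and generate noisy expression observations, and a Kalman filter recovers the hidden trajectory. The dimensions, dynamics matrix, and noise levels are invented; the published analysis uses a richer model with bootstrap confidence intervals.

```python
# Minimal linear-Gaussian state-space sketch (not the authors' implementation):
# hidden states drive observed gene expression; a Kalman filter estimates them.
import numpy as np

rng = np.random.default_rng(0)
T, n_hidden, n_genes = 50, 2, 5          # time points, hidden factors, observed genes
A = np.array([[0.9, 0.1], [0.0, 0.8]])   # hidden-state dynamics (assumed)
C = rng.normal(size=(n_genes, n_hidden)) # loading of hidden states onto genes
Q, R = 0.05 * np.eye(n_hidden), 0.1 * np.eye(n_genes)

# Simulate hidden states and observations
x = np.zeros((T, n_hidden)); y = np.zeros((T, n_genes))
for t in range(1, T):
    x[t] = A @ x[t - 1] + rng.multivariate_normal(np.zeros(n_hidden), Q)
    y[t] = C @ x[t] + rng.multivariate_normal(np.zeros(n_genes), R)

# Kalman filter to recover the hidden trajectory
xf = np.zeros((T, n_hidden)); P = np.eye(n_hidden)
for t in range(1, T):
    xp = A @ xf[t - 1]; Pp = A @ P @ A.T + Q              # predict
    K = Pp @ C.T @ np.linalg.inv(C @ Pp @ C.T + R)        # Kalman gain
    xf[t] = xp + K @ (y[t] - C @ xp); P = (np.eye(n_hidden) - K @ C) @ Pp

print("correlation of true vs filtered state 0:",
      np.corrcoef(x[1:, 0], xf[1:, 0])[0, 1])
```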
[Mechanisms of action of voltage-gated sodium channel ligands].
Tikhonov, D B
2007-05-01
The voltage-gated sodium channels play a key role in the generation of the action potential in excitable cells. Sodium channels are targeted by a number of modulating ligands. Despite numerous studies, the mechanisms of action of many ligands are still unknown. The main cause of the problem is the absence of an experimentally determined channel structure. Sodium channels belong to the superfamily of P-loop channels, which also includes potassium and calcium channels and the channels of ionotropic glutamate receptors. Crystallization of several potassium channels has opened the possibility of analyzing the structure of other members of the superfamily using the homology modeling approach. The present study summarizes the results of several recent modelling studies of such sodium channel ligands as tetrodotoxin, batrachotoxin and local anesthetics. Comparison of available experimental data with X-ray structures of potassium channels has provided a new level of understanding of the mechanisms of action of sodium channel ligands and has allowed several testable hypotheses to be proposed.
Hicks, Amanda; Hogan, William R.; Rutherford, Michael; Malin, Bradley; Xie, Mengjun; Fellbaum, Christiane; Yin, Zhijun; Fabbri, Daniel; Hanna, Josh; Bian, Jiang
2015-01-01
The Institute of Medicine (IOM) recommends that health care providers collect data on gender identity. If these data are to be useful, they should utilize terms that characterize gender identity in a manner that is 1) sensitive to transgender and gender non-binary individuals (trans* people) and 2) semantically structured to render associated data meaningful to the health care professionals. We developed a set of tools and approaches for analyzing Twitter data as a basis for generating hypotheses on language used to identify gender and discuss gender-related issues across regions and population groups. We offer sample hypotheses regarding regional variations in the usage of certain terms such as ‘genderqueer’, ‘genderfluid’, and ‘neutrois’ and their usefulness as terms on intake forms. While these hypotheses cannot be directly validated with Twitter data alone, our data and tools help to formulate testable hypotheses and design future studies regarding the adequacy of gender identification terms on intake forms. PMID:26958196
Phylogenetic perspectives on noise-induced fear and annoyance
NASA Astrophysics Data System (ADS)
Bowles, Ann
2003-04-01
Negative human responses to noise are typically interpreted in terms of human psychological, cognitive, or social processes. However, it may be useful to frame hypotheses about human responses in terms of evolutionary history, during which negative responses have been part of a suite of adaptations to a variable sound environment. By comparing the responses of a range of nonhuman animals to various types of noise, it is possible to develop hypotheses about the ecology of human responses. Examples of noise-related phenomena that could be explained usefully from this perspective include the Schulz curve, noise-induced physical stress, acute fear responses induced by transient noise, and the relationship between temperament and noise-induced annoyance. Responses of animals from a range of taxa will be described and their behavior interpreted in terms of their life-history strategies. With this perspective, some testable hypotheses about noise-induced fear and annoyance will be suggested.
Causal Reasoning on Biological Networks: Interpreting Transcriptional Changes
NASA Astrophysics Data System (ADS)
Chindelevitch, Leonid; Ziemek, Daniel; Enayetallah, Ahmed; Randhawa, Ranjit; Sidders, Ben; Brockel, Christoph; Huang, Enoch
Over the past decade gene expression data sets have been generated at an increasing pace. In addition to ever increasing data generation, the biomedical literature is growing exponentially. The PubMed database (Sayers et al., 2010) comprises more than 20 million citations as of October 2010. The goal of our method is the prediction of putative upstream regulators of observed expression changes based on a set of over 400,000 causal relationships. The resulting putative regulators constitute directly testable hypotheses for follow-up.
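A hedged sketch of the kind of scoring such a causal-reasoning method might perform is shown below: candidate upstream regulators from a signed causal edge list are ranked by how consistently their downstream effects match observed expression changes. The edge list, observations, and the simple binomial p-value are illustrative assumptions, not the method's actual statistics.

```python
# Hedged sketch of causal upstream-regulator scoring: given signed causal edges
# regulator -> target and observed up/down changes, score each candidate regulator
# by how many targets it explains consistently, with a simple binomial p-value.
from collections import defaultdict
from math import comb

causal_edges = [               # (regulator, target, sign) -- hypothetical examples
    ("TF_A", "g1", +1), ("TF_A", "g2", -1), ("TF_A", "g3", +1),
    ("TF_B", "g1", -1), ("TF_B", "g4", +1),
]
observed = {"g1": +1, "g2": -1, "g3": +1, "g4": -1}   # measured expression changes

def binom_tail(k, n, p=0.5):
    """P(X >= k) for X ~ Binomial(n, p); null = signs matched by chance."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

scores = defaultdict(lambda: [0, 0])                  # regulator -> [consistent, total]
for reg, target, sign in causal_edges:
    if target in observed:
        scores[reg][1] += 1
        scores[reg][0] += int(sign == observed[target])  # regulator assumed active (+)

for reg, (k, n) in scores.items():
    print(f"{reg}: {k}/{n} targets consistent, p = {binom_tail(k, n):.3f}")
```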
White-nose syndrome initiates a cascade of physiologic disturbances in the hibernating bat host
Verant, Michelle L.; Meteyer, Carol U.; Speakman, John R.; Cryan, Paul M.; Lorch, Jeffrey M.; Blehert, David S.
2014-01-01
Integrating these novel findings on the physiological changes that occur in early-stage WNS with those previously documented in late-stage infections, we propose a multi-stage disease progression model that mechanistically describes the pathologic and physiologic effects underlying mortality of WNS in hibernating bats. This model identifies testable hypotheses for better understanding this disease, knowledge that will be critical for defining effective disease mitigation strategies aimed at reducing morbidity and mortality that results from WNS.
Dini-Andreote, Francisco; Stegen, James C; van Elsas, Jan Dirk; Salles, Joana Falcão
2015-03-17
Ecological succession and the balance between stochastic and deterministic processes are two major themes within microbial ecology, but these conceptual domains have mostly developed independently of each other. Here we provide a framework that integrates shifts in community assembly processes with microbial primary succession to better understand mechanisms governing the stochastic/deterministic balance. Synthesizing previous work, we devised a conceptual model that links ecosystem development to alternative hypotheses related to shifts in ecological assembly processes. Conceptual model hypotheses were tested by coupling spatiotemporal data on soil bacterial communities with environmental conditions in a salt marsh chronosequence spanning 105 years of succession. Analyses within successional stages showed community composition to be initially governed by stochasticity, but as succession proceeded, there was a progressive increase in deterministic selection correlated with increasing sodium concentration. Analyses of community turnover among successional stages--which provide a larger spatiotemporal scale relative to within-stage analyses--revealed that changes in the concentration of soil organic matter were the main predictor of the type and relative influence of determinism. Taken together, these results suggest scale-dependency in the mechanisms underlying selection. To better understand mechanisms governing these patterns, we developed an ecological simulation model that revealed how changes in selective environments cause shifts in the stochastic/deterministic balance. Finally, we propose an extended--and experimentally testable--conceptual model integrating ecological assembly processes with primary and secondary succession. This framework provides a priori hypotheses for future experiments, thereby facilitating a systematic approach to understand assembly and succession in microbial communities across ecosystems.
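To make the stochastic/deterministic contrast concrete, the toy simulation below (an editorial illustration, not the authors' ecological simulation model) draws replicate communities from a regional pool with a tunable selection strength; stronger selection makes replicate communities converge, i.e., assembly becomes more deterministic.

```python
# Illustrative simulation: local communities are drawn from a regional species pool
# with per-taxon weights set by a fitness term scaled by a "selection strength".
# Divergence among replicates is summarised by mean pairwise Jaccard distance.
import numpy as np
from itertools import combinations

rng = np.random.default_rng(2)
pool = 200                                    # regional species pool size
fitness = rng.normal(size=pool)               # assumed environmental fitness per taxon

def assemble(selection, n_reps=10, richness=40):
    w = np.exp(selection * fitness)
    w /= w.sum()
    return [set(rng.choice(pool, size=richness, replace=False, p=w))
            for _ in range(n_reps)]

def mean_jaccard_distance(comms):
    d = [1 - len(a & b) / len(a | b) for a, b in combinations(comms, 2)]
    return float(np.mean(d))

for s in (0.0, 1.0, 3.0):
    print(f"selection={s}: mean pairwise dissimilarity "
          f"{mean_jaccard_distance(assemble(s)):.2f}")
```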
Testing two principles of the Health Action Process Approach in individuals with type 2 diabetes.
Lippke, Sonia; Plotnikoff, Ronald C
2014-01-01
The Health Action Process Approach (HAPA) proposes principles that can be translated into testable hypotheses. This is one of the first studies to have explicitly tested HAPA's first 2 principles, which are (1) the health behavior change process can be subdivided into motivation and volition, and (2) volition can be grouped into intentional and action stages. The 3 stage groups are labeled preintenders, intenders, and actors. The hypotheses of the HAPA model were investigated in a sample of 1,193 individuals with Type 2 diabetes. Study participants completed a questionnaire assessing the HAPA variables. The hypotheses were evaluated by examining mean differences of test variables and by the use of multigroup structural equation modeling (MSEM). Findings support the HAPA's 2 principles and 3 distinct stages. The 3 HAPA stages were significantly different in several stage-specific variables, and discontinuity patterns were found in terms of nonlinear trends across means. In terms of predicting goals, action planning, and behavior, differences transpired between the 2 motivational stages (preintenders and intenders), and between the 2 volitional stages (intenders and actors). Results indicate implications for supporting behavior change processes, depending on which stage a person is in: All individuals should be helped to increase self-efficacy. Preintenders and intenders require interventions targeting outcome expectancies. Actors benefit from an improvement in action planning to maintain and increase their previous behavior. Overall, the first 2 principles of the HAPA were supported and some evidence for the other principles was found. Future research should experimentally test these conclusions. 2014 APA, all rights reserved
Buckler, Andrew J; Liu, Tiffany Ting; Savig, Erica; Suzek, Baris E; Ouellette, M; Danagoulian, J; Wernsing, G; Rubin, Daniel L; Paik, David
2013-08-01
A widening array of novel imaging biomarkers is being developed using ever more powerful clinical and preclinical imaging modalities. These biomarkers have demonstrated effectiveness in quantifying biological processes as they occur in vivo and in the early prediction of therapeutic outcomes. However, quantitative imaging biomarker data and knowledge are not standardized, representing a critical barrier to accumulating medical knowledge based on quantitative imaging data. We use an ontology to represent, integrate, and harmonize heterogeneous knowledge across the domain of imaging biomarkers. This advances the goal of developing applications to (1) improve precision and recall of storage and retrieval of quantitative imaging-related data using standardized terminology; (2) streamline the discovery and development of novel imaging biomarkers by normalizing knowledge across heterogeneous resources; (3) effectively annotate imaging experiments thus aiding comprehension, re-use, and reproducibility; and (4) provide validation frameworks through rigorous specification as a basis for testable hypotheses and compliance tests. We have developed the Quantitative Imaging Biomarker Ontology (QIBO), which currently consists of 488 terms spanning the following upper classes: experimental subject, biological intervention, imaging agent, imaging instrument, image post-processing algorithm, biological target, indicated biology, and biomarker application. We have demonstrated that QIBO can be used to annotate imaging experiments with standardized terms in the ontology and to generate hypotheses for novel imaging biomarker-disease associations. Our results established the utility of QIBO in enabling integrated analysis of quantitative imaging data.
Scientific realism and wishful thinking in soil hydrology
NASA Astrophysics Data System (ADS)
Flühler, H.
2009-04-01
In our field we often learn - or could have learned - more from failures than from successes, provided we had postulated testable hypotheses to be accepted or rejected. In soil hydrology, hypotheses are testable if independent information quantifying the pertinent system features is at hand. This view on how to operate is an idealized concept of how we could or should have worked. In reality, the path to success is more tortuous and we usually progress differently, obeying other professional musts. Although we missed some shortcuts over the past few decades, we definitely made significant progress in understanding vadose zone processes, but we could have advanced our system understanding faster by more rigorously questioning the fundamental assumptions. I will try to illustrate the tortuous path of learning and identify some causes of the slowed-down learning curve. In the pioneering phase of vadose zone research many models have been mapped in our minds and implemented on our computers. Many of them are now well established, powerful and represent the state-of-the-art even when they do not work. Some of them are based on erroneous or misleading concepts. Even when based on adequate concepts, they might have been applied in the wrong context, or inadequate models may have led to apparent success. I address this process of collective learning with the intention that we spend more time and effort to find the right question instead of improving tools which are questionably suitable for solving the main problems.
NASA Technical Reports Server (NTRS)
Chen, Chung-Hsing
1992-01-01
In this thesis, a behavioral-level testability analysis approach is presented. This approach is based on analyzing the circuit behavioral description (similar to a C program) to estimate its testability by identifying controllable and observable circuit nodes. This information can be used by a test generator to gain better access to internal circuit nodes and to reduce its search space. The results of the testability analyzer can also be used to select test points or partial scan flip-flops in the early design phase. Based on selection criteria, a novel Synthesis for Testability approach called Test Statement Insertion (TSI) is proposed, which modifies the circuit behavioral description directly. Test Statement Insertion can also be used to modify the circuit structural description to improve its testability. As a result, the Synthesis for Testability methodology can be combined with an existing behavioral synthesis tool to produce more testable circuits.
Geophysical Evolution of Ch Asteroids and Testable Hypotheses for Future Missions
NASA Astrophysics Data System (ADS)
Castillo, J. C.
2017-12-01
The main population of asteroids related to meteorites in the collections remains to be explored in situ. Ch asteroids are the only midsized asteroids that display a signature of hydration (besides Pallas) and the spectral connection between Ch asteroids and CM chondrites suggests that the former represent potential parent bodies for the latter. This class of asteroids is particularly interesting because it hosts many objects 100-200 km in size, which are believed to belong to a primordial population of planetesimals. This presentation will explore multiple evolution pathways for Ch-asteroids leading to possible hypotheses on the geological, petrological, and geophysical properties that a disrupted parent body would present to a future mission. This work is being carried out at the Jet Propulsion Laboratory, California Institute of Technology, under contract to NASA.
Interrogating selectivity in catalysis using molecular vibrations
NASA Astrophysics Data System (ADS)
Milo, Anat; Bess, Elizabeth N.; Sigman, Matthew S.
2014-03-01
The delineation of molecular properties that underlie reactivity and selectivity is at the core of physical organic chemistry, and this knowledge can be used to inform the design of improved synthetic methods or identify new chemical transformations. For this reason, the mathematical representation of properties affecting reactivity and selectivity trends, that is, molecular parameters, is paramount. Correlations produced by equating these molecular parameters with experimental outcomes are often defined as free-energy relationships and can be used to evaluate the origin of selectivity and to generate new, experimentally testable hypotheses. The premise behind successful correlations of this type is that a systematically perturbed molecular property affects a transition-state interaction between the catalyst, substrate and any reaction components involved in the determination of selectivity. Classic physical organic molecular descriptors, such as Hammett, Taft or Charton parameters, seek to independently probe isolated electronic or steric effects. However, these parameters cannot address simultaneous, non-additive variations to more than one molecular property, which limits their utility. Here we report a parameter system based on the vibrational response of a molecule to infrared radiation that can be used to mathematically model and predict selectivity trends for reactions with interlinked steric and electronic effects at positions of interest. The disclosed parameter system is mechanistically derived and should find broad use in the study of chemical and biological systems.
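The parameterization strategy can be pictured with a small regression sketch like the one below, in which a selectivity measure is modeled as a linear function of vibrational descriptors; the numbers are invented placeholders, and the actual workflow relies on DFT-computed vibrations and careful validation.

```python
# Hedged illustration of vibration-based parameterization: regress a selectivity
# measure (e.g., a free-energy difference from enantiomeric ratios) on vibrational
# descriptors such as a stretching frequency and its IR intensity. All values are
# invented placeholders.
import numpy as np

# hypothetical training data: [frequency (cm^-1), intensity (a.u.)] per substituent
X = np.array([[1715.0, 120.0], [1722.0, 95.0], [1709.0, 140.0],
              [1730.0, 80.0], [1718.0, 110.0]])
ddG = np.array([1.10, 0.85, 1.30, 0.60, 1.00])      # kcal/mol, invented

Xs = (X - X.mean(axis=0)) / X.std(axis=0)           # normalise descriptors
A = np.column_stack([np.ones(len(Xs)), Xs])         # intercept + descriptors
coef, *_ = np.linalg.lstsq(A, ddG, rcond=None)
pred = A @ coef
print("coefficients (intercept, freq, intensity):", np.round(coef, 3))
print("R^2:", round(1 - ((ddG - pred) ** 2).sum() / ((ddG - ddG.mean()) ** 2).sum(), 3))
```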
Cruz-Morales, Pablo; Ramos-Aboites, Hilda E; Licona-Cassani, Cuauhtémoc; Selem-Mójica, Nelly; Mejía-Ponce, Paulina M; Souza-Saldívar, Valeria; Barona-Gómez, Francisco
2017-09-01
Desferrioxamines are hydroxamate siderophores widely conserved in both aquatic and soil-dwelling Actinobacteria. While the genetic and enzymatic bases of siderophore biosynthesis and their transport in model families of this phylum are well understood, evolutionary studies are lacking. Here, we perform a comprehensive desferrioxamine-centric (des genes) phylogenomic analysis, which includes the genomes of six novel strains isolated from an iron- and phosphorus-depleted oasis in the Chihuahuan desert of Mexico. Our analyses reveal previously unnoticed desferrioxamine evolutionary patterns, involving both biosynthetic and transport genes, likely to be related to desferrioxamine chemical diversity. The identified patterns were used to postulate experimentally testable hypotheses after phenotypic characterization, including profiling of siderophore production and growth stimulation of co-cultures under iron deficiency. Based on our results, we propose a novel des gene, which we term desG, as responsible for incorporation of phenylacetyl moieties during biosynthesis of previously reported arylated desferrioxamines. Moreover, a genomic-based classification of the siderophore-binding proteins responsible for specific and generalist siderophore assimilation is postulated. This report provides a much-needed evolutionary framework, with specific insights supported by experimental data, to direct the future ecological and functional analysis of desferrioxamines in the environment. © FEMS 2017.
2016-01-01
Asters nucleated by Microtubule (MT) organizing centers (MTOCs) converge on chromosomes during spindle assembly in mouse oocytes undergoing meiosis I. Time-lapse imaging suggests that this centripetal motion is driven by a biased ‘search-and-capture’ mechanism. Here, we develop a model of a random walk in a drift field to test the nature of the bias and the spatio-temporal dynamics of the search process. The model is used to optimize the spatial field of drift in simulations, by comparison to experimental motility statistics. In a second step, this optimized gradient is used to determine the location of immobilized dynein motors and MT polymerization parameters, since these are hypothesized to generate the gradient of forces needed to move MTOCs. We compare these scenarios to self-organized mechanisms by which asters have been hypothesized to find the cell center: MT pushing at the cell boundary and clustering motor complexes. By minimizing the error between simulation outputs and experiments, we find that a model of “pulling” by a gradient of dynein motors alone can drive the centripetal motility. Interestingly, models of passive MT-based “pushing” at the cortex, clustering by cross-linking motors, and MT dynamic-instability gradients, by themselves, do not result in the observed motility. The model predicts the sensitivity of the results to motor density and stall force, but not to MTs per aster. A hybrid model combining a chromatin-centered immobilized dynein gradient, diffusible minus-end directed clustering motors and pushing at the cell cortex is required to comprehensively explain the available data. The model makes experimentally testable predictions of a spatial bias and self-organized mechanisms by which MT asters can find the center of a large cell. PMID:27706163
A four-component model of the action potential in mouse detrusor smooth muscle cell
Brain, Keith L.; Young, John S.; Manchanda, Rohit
2018-01-01
Background and hypothesis: Detrusor smooth muscle cells (DSMCs) of the urinary bladder are electrically connected to one another via gap junctions and form a three dimensional syncytium. DSMCs exhibit spontaneous electrical activity, including passive depolarizations and action potentials. The shapes of spontaneous action potentials (sAPs) observed from a single DSM cell can vary widely. The biophysical origins of this variability, and the precise components which contribute to the complex shapes observed are not known. To address these questions, the basic components which constitute the sAPs were investigated. We hypothesized that linear combinations of scaled versions of these basic components can produce sAP shapes observed in the syncytium. Methods and results: The basic components were identified as spontaneous evoked junction potentials (sEJP), native AP (nAP), slow after hyperpolarization (sAHP) and very slow after hyperpolarization (vsAHP). The experimental recordings were grouped into two sets: a training data set and a testing data set. A training set was used to estimate the components, and a test set to evaluate the efficiency of the estimated components. We found that a linear combination of the identified components when appropriately amplified and time shifted replicated various AP shapes to a high degree of similarity, as quantified by the root mean square error (RMSE) measure. Conclusions: We conclude that the four basic components—sEJP, nAP, sAHP, and vsAHP—identified and isolated in this work are necessary and sufficient to replicate all varieties of the sAPs recorded experimentally in DSMCs. This model has the potential to generate testable hypotheses that can help identify the physiological processes underlying various features of the sAPs. Further, this model also provides a means to classify the sAPs into various shape classes. PMID:29351282
A four-component model of the action potential in mouse detrusor smooth muscle cell.
Padmakumar, Mithun; Brain, Keith L; Young, John S; Manchanda, Rohit
2018-01-01
Detrusor smooth muscle cells (DSMCs) of the urinary bladder are electrically connected to one another via gap junctions and form a three dimensional syncytium. DSMCs exhibit spontaneous electrical activity, including passive depolarizations and action potentials. The shapes of spontaneous action potentials (sAPs) observed from a single DSM cell can vary widely. The biophysical origins of this variability, and the precise components which contribute to the complex shapes observed are not known. To address these questions, the basic components which constitute the sAPs were investigated. We hypothesized that linear combinations of scaled versions of these basic components can produce sAP shapes observed in the syncytium. The basic components were identified as spontaneous evoked junction potentials (sEJP), native AP (nAP), slow after hyperpolarization (sAHP) and very slow after hyperpolarization (vsAHP). The experimental recordings were grouped into two sets: a training data set and a testing data set. A training set was used to estimate the components, and a test set to evaluate the efficiency of the estimated components. We found that a linear combination of the identified components when appropriately amplified and time shifted replicated various AP shapes to a high degree of similarity, as quantified by the root mean square error (RMSE) measure. We conclude that the four basic components-sEJP, nAP, sAHP, and vsAHP-identified and isolated in this work are necessary and sufficient to replicate all varieties of the sAPs recorded experimentally in DSMCs. This model has the potential to generate testable hypotheses that can help identify the physiological processes underlying various features of the sAPs. Further, this model also provides a means to classify the sAPs into various shape classes.
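The decomposition idea is straightforward to prototype. The sketch below builds four generic component-like waveforms, composes a synthetic sAP from them, fits the amplitudes by least squares, and reports RMSE; waveform shapes, onsets, and amplitudes are invented and only illustrate the fitting logic, with time shifts held fixed for brevity.

```python
# Minimal sketch of the four-component decomposition: approximate a recorded sAP
# as a linear combination of basis waveforms (sEJP-, nAP-, sAHP-, vsAHP-like).
# Shapes, onsets, and the time base are invented for illustration.
import numpy as np

t = np.linspace(0, 2.0, 400)                           # seconds (assumed)
def bump(t0, rise, decay):                             # generic asymmetric waveform
    u = np.clip(t - t0, 0, None)
    return (1 - np.exp(-u / rise)) * np.exp(-u / decay)

components = np.column_stack([
    bump(0.10, 0.02, 0.10),    # sEJP-like
    bump(0.15, 0.005, 0.02),   # nAP-like (fast)
    -bump(0.20, 0.05, 0.30),   # sAHP-like (hyperpolarizing)
    -bump(0.20, 0.30, 1.00),   # vsAHP-like (very slow)
])

rng = np.random.default_rng(3)
true_amp = np.array([4.0, 25.0, 3.0, 1.5])             # mV, invented
recording = components @ true_amp + rng.normal(0, 0.3, size=t.size)

amp, *_ = np.linalg.lstsq(components, recording, rcond=None)
rmse = np.sqrt(np.mean((components @ amp - recording) ** 2))
print("fitted amplitudes (mV):", np.round(amp, 2), " RMSE (mV):", round(rmse, 3))
```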
Khetan, Neha; Athale, Chaitanya A
2016-10-01
Asters nucleated by Microtubule (MT) organizing centers (MTOCs) converge on chromosomes during spindle assembly in mouse oocytes undergoing meiosis I. Time-lapse imaging suggests that this centripetal motion is driven by a biased 'search-and-capture' mechanism. Here, we develop a model of a random walk in a drift field to test the nature of the bias and the spatio-temporal dynamics of the search process. The model is used to optimize the spatial field of drift in simulations, by comparison to experimental motility statistics. In a second step, this optimized gradient is used to determine the location of immobilized dynein motors and MT polymerization parameters, since these are hypothesized to generate the gradient of forces needed to move MTOCs. We compare these scenarios to self-organized mechanisms by which asters have been hypothesized to find the cell center: MT pushing at the cell boundary and clustering motor complexes. By minimizing the error between simulation outputs and experiments, we find that a model of "pulling" by a gradient of dynein motors alone can drive the centripetal motility. Interestingly, models of passive MT-based "pushing" at the cortex, clustering by cross-linking motors, and MT dynamic-instability gradients, by themselves, do not result in the observed motility. The model predicts the sensitivity of the results to motor density and stall force, but not to MTs per aster. A hybrid model combining a chromatin-centered immobilized dynein gradient, diffusible minus-end directed clustering motors and pushing at the cell cortex is required to comprehensively explain the available data. The model makes experimentally testable predictions of a spatial bias and self-organized mechanisms by which MT asters can find the center of a large cell.
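A minimal version of a random walk in a radial drift field can be written in a few lines, as below; the drift profile, diffusion coefficient, and geometry are illustrative assumptions rather than the optimized gradient described in the paper.

```python
# Sketch of a random walk in a radial drift field: at each step an MTOC takes a
# diffusive step plus a drift step directed toward the chromatin centre, with
# drift magnitude decaying with distance. Parameter values are illustrative.
import numpy as np

rng = np.random.default_rng(4)
center = np.zeros(2)                             # chromatin position (um)
pos = np.array([20.0, 0.0])                      # initial MTOC position (um)
dt, D, v0, length_scale = 1.0, 0.05, 0.3, 15.0   # s, um^2/s, um/s, um

track = [pos.copy()]
for _ in range(200):
    r_vec = center - pos
    r = np.linalg.norm(r_vec)
    drift = v0 * np.exp(-r / length_scale) * (r_vec / r if r > 0 else 0.0)
    pos = pos + drift * dt + np.sqrt(2 * D * dt) * rng.normal(size=2)
    track.append(pos.copy())

dist = [np.linalg.norm(p - center) for p in track]
print(f"distance to centre: start {dist[0]:.1f} um, end {dist[-1]:.1f} um")
```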
Hypothesis testing and earthquake prediction.
Jackson, D D
1996-04-30
Requirements for testing include advance specification of the conditional rate density (probability per unit time, area, and magnitude) or, alternatively, probabilities for specified intervals of time, space, and magnitude. Here I consider testing fully specified hypotheses, with no parameter adjustments or arbitrary decisions allowed during the test period. Because it may take decades to validate prediction methods, it is worthwhile to formulate testable hypotheses carefully in advance. Earthquake prediction generally implies that the probability will be temporarily higher than normal. Such a statement requires knowledge of "normal behavior"--that is, it requires a null hypothesis. Hypotheses can be tested in three ways: (i) by comparing the number of actual earthquakes to the number predicted, (ii) by comparing the likelihood score of actual earthquakes to the predicted distribution, and (iii) by comparing the likelihood ratio to that of a null hypothesis. The first two tests are purely self-consistency tests, while the third is a direct comparison of two hypotheses. Predictions made without a statement of probability are very difficult to test, and any test must be based on the ratio of earthquakes in and out of the forecast regions.
Hypothesis testing and earthquake prediction.
Jackson, D D
1996-01-01
Requirements for testing include advance specification of the conditional rate density (probability per unit time, area, and magnitude) or, alternatively, probabilities for specified intervals of time, space, and magnitude. Here I consider testing fully specified hypotheses, with no parameter adjustments or arbitrary decisions allowed during the test period. Because it may take decades to validate prediction methods, it is worthwhile to formulate testable hypotheses carefully in advance. Earthquake prediction generally implies that the probability will be temporarily higher than normal. Such a statement requires knowledge of "normal behavior"--that is, it requires a null hypothesis. Hypotheses can be tested in three ways: (i) by comparing the number of actual earthquakes to the number predicted, (ii) by comparing the likelihood score of actual earthquakes to the predicted distribution, and (iii) by comparing the likelihood ratio to that of a null hypothesis. The first two tests are purely self-consistency tests, while the third is a direct comparison of two hypotheses. Predictions made without a statement of probability are very difficult to test, and any test must be based on the ratio of earthquakes in and out of the forecast regions. PMID:11607663
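The three tests enumerated above can be illustrated numerically with a spatially binned Poisson forecast, as in the hedged sketch below; the forecast rates, null model, and observed counts are invented, and a real evaluation would work with rate densities over time, space, and magnitude.

```python
# Simplified numerical illustration of the count, likelihood, and likelihood-ratio
# tests, using invented per-bin Poisson rates for one test period.
import numpy as np
from scipy.stats import poisson

forecast = np.array([0.5, 1.2, 0.1, 2.0, 0.2])   # expected quakes per bin (hypothesis)
null     = np.array([0.8, 0.8, 0.8, 0.8, 0.8])   # "normal behavior" null model
observed = np.array([1,   2,   0,   1,   0])     # actual counts per bin

# (i) count test: is the total number of earthquakes consistent with the forecast?
n_obs, n_exp = observed.sum(), forecast.sum()
p_n = min(poisson.cdf(n_obs, n_exp), poisson.sf(n_obs - 1, n_exp))
print(f"count test: observed {n_obs}, expected {n_exp:.1f}, one-sided p = {p_n:.3f}")

# (ii) likelihood test: joint log-likelihood of the observed counts under the forecast
logL_forecast = poisson.logpmf(observed, forecast).sum()
print(f"likelihood test: log-likelihood under forecast = {logL_forecast:.2f}")

# (iii) likelihood-ratio test: forecast versus the null hypothesis
logL_null = poisson.logpmf(observed, null).sum()
print(f"likelihood ratio (forecast - null) = {logL_forecast - logL_null:.2f}")
```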
What can we learn from a two-brain approach to verbal interaction?
Schoot, Lotte; Hagoort, Peter; Segaert, Katrien
2016-09-01
Verbal interaction is one of the most frequent social interactions humans encounter on a daily basis. In the current paper, we zoom in on what the multi-brain approach has contributed, and can contribute in the future, to our understanding of the neural mechanisms supporting verbal interaction. Indeed, since verbal interaction can only exist between individuals, it seems intuitive to focus analyses on inter-individual neural markers, i.e. between-brain neural coupling. To date, however, there is a severe lack of theoretically-driven, testable hypotheses about what between-brain neural coupling actually reflects. In this paper, we develop a testable hypothesis in which between-pair variation in between-brain neural coupling is of key importance. Based on theoretical frameworks and empirical data, we argue that the level of between-brain neural coupling reflects speaker-listener alignment at different levels of linguistic and extra-linguistic representation. We discuss the possibility that between-brain neural coupling could inform us about the highest level of inter-speaker alignment: mutual understanding. Copyright © 2016 Elsevier Ltd. All rights reserved.
Simulating cancer growth with multiscale agent-based modeling.
Wang, Zhihui; Butner, Joseph D; Kerketta, Romica; Cristini, Vittorio; Deisboeck, Thomas S
2015-02-01
There have been many techniques developed in recent years to in silico model a variety of cancer behaviors. Agent-based modeling is a specific discrete-based hybrid modeling approach that allows simulating the role of diversity in cell populations as well as within each individual cell; it has therefore become a powerful modeling method widely used by computational cancer researchers. Many aspects of tumor morphology including phenotype-changing mutations, the adaptation to microenvironment, the process of angiogenesis, the influence of extracellular matrix, reactions to chemotherapy or surgical intervention, the effects of oxygen and nutrient availability, and metastasis and invasion of healthy tissues have been incorporated and investigated in agent-based models. In this review, we introduce some of the most recent agent-based models that have provided insight into the understanding of cancer growth and invasion, spanning multiple biological scales in time and space, and we further describe several experimentally testable hypotheses generated by those models. We also discuss some of the current challenges of multiscale agent-based cancer models. Copyright © 2014 Elsevier Ltd. All rights reserved.
Conroy, M.J.; Runge, M.C.; Nichols, J.D.; Stodola, K.W.; Cooper, R.J.
2011-01-01
The broad physical and biological principles behind climate change and its potential large-scale ecological impacts on biota are fairly well understood, although likely responses of biotic communities at fine spatio-temporal scales are not, limiting the ability of conservation programs to respond effectively to climate change outside the range of human experience. Much of the climate debate has focused on attempts to resolve key uncertainties in a hypothesis-testing framework. However, conservation decisions cannot await resolution of these scientific issues and instead must proceed in the face of uncertainty. We suggest that conservation should proceed in an adaptive management framework, in which decisions are guided by predictions under multiple, plausible hypotheses about climate impacts. Under this plan, monitoring is used to evaluate the response of the system to climate drivers, and management actions (perhaps experimental) are used to confront testable predictions with data, in turn providing feedback for future decision making. We illustrate these principles with the problem of mitigating the effects of climate change on terrestrial bird communities in the southern Appalachian Mountains, USA. © 2010 Elsevier Ltd.
A Global Classification System for Catchment Hydrology
NASA Astrophysics Data System (ADS)
Woods, R. A.
2004-05-01
It is a shocking state of affairs - there is no underpinning scientific taxonomy of catchments. There are widely used global classification systems for climate, river morphology, lakes and wetlands, but for river catchments there exists only a plethora of inconsistent, incomplete regional schemes. By proceeding without a common taxonomy for catchments, freshwater science has missed one of its key developmental stages, and has leapt from definition of phenomena to experiments, theories and models, without the theoretical framework of a classification. I propose the development of a global hierarchical classification system for physical aspects of river catchments, to help underpin physical science in the freshwater environment and provide a solid foundation for classification of river ecosystems. Such a classification scheme can open completely new vistas in hydrology: for example it will be possible to (i) rationally transfer experimental knowledge of hydrological processes between basins anywhere in the world, provided they belong to the same class; (ii) perform meaningful meta-analyses in order to reconcile studies that show inconsistent results; and (iii) generate new testable hypotheses which involve locations worldwide.
Ditlev, Jonathon A; Mayer, Bruce J; Loew, Leslie M
2013-02-05
Mathematical modeling has established its value for investigating the interplay of biochemical and mechanical mechanisms underlying actin-based motility. Because of the complex nature of actin dynamics and its regulation, many of these models are phenomenological or conceptual, providing a general understanding of the physics at play. But the wealth of carefully measured kinetic data on the interactions of many of the players in actin biochemistry cries out for the creation of more detailed and accurate models that could permit investigators to dissect interdependent roles of individual molecular components. Moreover, no human mind can assimilate all of the mechanisms underlying complex protein networks; so an additional benefit of a detailed kinetic model is that the numerous binding proteins, signaling mechanisms, and biochemical reactions can be computationally organized in a fully explicit, accessible, visualizable, and reusable structure. In this review, we will focus on how comprehensive and adaptable modeling allows investigators to explain experimental observations and develop testable hypotheses on the intracellular dynamics of the actin cytoskeleton. Copyright © 2013 Biophysical Society. Published by Elsevier Inc. All rights reserved.
Ditlev, Jonathon A.; Mayer, Bruce J.; Loew, Leslie M.
2013-01-01
Mathematical modeling has established its value for investigating the interplay of biochemical and mechanical mechanisms underlying actin-based motility. Because of the complex nature of actin dynamics and its regulation, many of these models are phenomenological or conceptual, providing a general understanding of the physics at play. But the wealth of carefully measured kinetic data on the interactions of many of the players in actin biochemistry cries out for the creation of more detailed and accurate models that could permit investigators to dissect interdependent roles of individual molecular components. Moreover, no human mind can assimilate all of the mechanisms underlying complex protein networks; so an additional benefit of a detailed kinetic model is that the numerous binding proteins, signaling mechanisms, and biochemical reactions can be computationally organized in a fully explicit, accessible, visualizable, and reusable structure. In this review, we will focus on how comprehensive and adaptable modeling allows investigators to explain experimental observations and develop testable hypotheses on the intracellular dynamics of the actin cytoskeleton. PMID:23442903
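As a token of the kind of explicit kinetic detail the review advocates, the sketch below integrates a single barbed-end elongation equation with textbook-order rate constants; it is an editorial illustration, not a model from the review, and real networks couple dozens of such reactions and regulators.

```python
# Deliberately small kinetic sketch: barbed-end actin elongation with conservation
# of total actin, integrated by a simple Euler loop. Parameter values are
# textbook-order estimates, not taken from any specific model in the review.
import numpy as np

k_on, k_off = 11.6, 1.4         # subunits/(uM*s) and subunits/s at barbed ends (approx.)
n_ends = 0.005                  # uM of growing barbed ends (assumed)
total_actin = 10.0              # uM total monomer + polymer
F, dt = 0.0, 0.01               # polymerised actin (uM), time step (s)

time = np.arange(0, 60, dt)
trace = []
for _ in time:
    G = total_actin - F                         # free monomer pool
    dF = (k_on * G - k_off) * n_ends * dt       # net subunit addition
    F = max(F + dF, 0.0)
    trace.append(F)

critical_conc = k_off / k_on
print(f"polymer at 60 s: {trace[-1]:.2f} uM; predicted critical conc: {critical_conc:.2f} uM")
```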
Hu, Xiao-Min; Chen, Jiang-Shan; Liu, Bi-Heng; Guo, Yu; Huang, Yun-Feng; Zhou, Zong-Quan; Han, Yong-Jian; Li, Chuan-Feng; Guo, Guang-Can
2016-10-21
The physical impact and the testability of the Kochen-Specker (KS) theorem is debated because of the fact that perfect compatibility in a single quantum system cannot be achieved in practical experiments with finite precision. Here, we follow the proposal of A. Cabello and M. T. Cunha [Phys. Rev. Lett. 106, 190401 (2011)], and present a compatibility-loophole-free experimental violation of an inequality of noncontextual theories by two spatially separated entangled qutrits. A maximally entangled qutrit-qutrit state with a fidelity as high as 0.975±0.001 is prepared and distributed to separated spaces, and these two photons are then measured locally, providing the compatibility requirement. The results show that the inequality for noncontextual theory is violated by 31 standard deviations. Our experiments pave the way to close the debate about the testability of the KS theorem. In addition, the method to generate high-fidelity and high-dimension entangled states will provide significant advantages in high-dimension quantum encoding and quantum communication.
Wagenmakers, Eric-Jan; Farrell, Simon; Ratcliff, Roger
2005-01-01
Recently, G. C. Van Orden, J. G. Holden, and M. T. Turvey (2003) proposed to abandon the conventional framework of cognitive psychology in favor of the framework of nonlinear dynamical systems theory. Van Orden et al. presented evidence that “purposive behavior originates in self-organized criticality” (p. 333). Here, the authors show that Van Orden et al.’s analyses do not test their hypotheses. Further, the authors argue that a confirmation of Van Orden et al.’s hypotheses would not have constituted firm evidence in support of their framework. Finally, the absence of a specific model for how self-organized criticality produces the observed behavior makes it very difficult to derive testable predictions. The authors conclude that the proposed paradigm shift is presently unwarranted. PMID:15702966
Mäs, Michael; Flache, Andreas
2013-01-01
Explanations of opinion bi-polarization hinge on the assumption of negative influence, individuals' striving to amplify differences to disliked others. However, empirical evidence for negative influence is inconclusive, which motivated us to search for an alternative explanation. Here, we demonstrate that bi-polarization can be explained without negative influence, drawing on theories that emphasize the communication of arguments as central mechanism of influence. Due to homophily, actors interact mainly with others whose arguments will intensify existing tendencies for or against the issue at stake. We develop an agent-based model of this theory and compare its implications to those of existing social-influence models, deriving testable hypotheses about the conditions of bi-polarization. Hypotheses were tested with a group-discussion experiment (N = 96). Results demonstrate that argument exchange can entail bi-polarization even when there is no negative influence.
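The mechanism can be illustrated with a compact agent-based sketch in the spirit of argument-communication models (not the authors' exact specification): agents hold small sets of pro and con arguments, choose interaction partners homophilously, and adopt communicated arguments, which can drive opinions toward opposite extremes without any negative influence.

```python
# Minimal argument-communication sketch: opinion = pro-minus-con balance of an
# agent's remembered arguments; homophilous interaction plus argument adoption can
# push subgroups toward opposite extremes. Parameters are illustrative.
import random

random.seed(5)
N_AGENTS, N_ARGS, MEMORY, STEPS = 40, 20, 6, 4000
ARGS = list(range(N_ARGS))                       # argument ids; even = pro, odd = con
sign = lambda a: 1 if a % 2 == 0 else -1
opinion = lambda mem: sum(sign(a) for a in mem)  # pro-minus-con balance

agents = [random.sample(ARGS, MEMORY) for _ in range(N_AGENTS)]

for _ in range(STEPS):
    i = random.randrange(N_AGENTS)
    # homophily: partners with similar opinions are more likely to be chosen
    weights = [0.0 if j == i else 1.0 / (1 + abs(opinion(agents[i]) - opinion(agents[j])))
               for j in range(N_AGENTS)]
    j = random.choices(range(N_AGENTS), weights=weights)[0]
    arg = random.choice(agents[j])               # j communicates one of its arguments
    if arg not in agents[i]:
        agents[i][random.randrange(MEMORY)] = arg   # i adopts it, forgetting another

print("final opinions:", sorted(opinion(m) for m in agents))
# clusters near +MEMORY and -MEMORY indicate bi-polarization (parameter-dependent)
```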
Evolutionary Perspectives on Genetic and Environmental Risk Factors for Psychiatric Disorders.
Keller, Matthew C
2018-05-07
Evolutionary medicine uses evolutionary theory to help elucidate why humans are vulnerable to disease and disorders. I discuss two different types of evolutionary explanations that have been used to help understand human psychiatric disorders. First, a consistent finding is that psychiatric disorders are moderately to highly heritable, and many, such as schizophrenia, are also highly disabling and appear to decrease Darwinian fitness. Models used in evolutionary genetics to understand why genetic variation exists in fitness-related traits can be used to understand why risk alleles for psychiatric disorders persist in the population. The usual explanation for species-typical adaptations-natural selection-is less useful for understanding individual differences in genetic risk to disorders. Rather, two other types of models, mutation-selection-drift and balancing selection, offer frameworks for understanding why genetic variation in risk to psychiatric (and other) disorders exists, and each makes predictions that are now testable using whole-genome data. Second, species-typical capacities to mount reactions to negative events are likely to have been crafted by natural selection to minimize fitness loss. The pain reaction to tissue damage is almost certainly such an example, but it has been argued that the capacity to experience depressive symptoms such as sadness, anhedonia, crying, and fatigue in the face of adverse life situations may have been crafted by natural selection as well. I review the rationale and strength of evidence for this hypothesis. Evolutionary hypotheses of psychiatric disorders are important not only for offering explanations for why psychiatric disorders exist, but also for generating new, testable hypotheses and understanding how best to design studies and analyze data.
From Cookbook to Experimental Design
ERIC Educational Resources Information Center
Flannagan, Jenny Sue; McMillan, Rachel
2009-01-01
Developing expertise, whether from cook to chef or from student to scientist, occurs over time and requires encouragement, guidance, and support. One key goal of an elementary science program should be to move students toward expertise in their ability to design investigative questions. The ability to design a testable question is difficult for…
Links between Parents' Epistemological Stance and Children's Evidence Talk
ERIC Educational Resources Information Center
Luce, Megan R.; Callanan, Maureen A.; Smilovic, Sarah
2013-01-01
Recent experimental research highlights young children's selectivity in learning from others. Little is known, however, about the patterns of information that children actually encounter in conversations with adults. This study investigated variation in parents' tendency to focus on testable evidence as a way to answer science-related questions…
NASA Astrophysics Data System (ADS)
Bunn, Henry T.; Pickering, Travis Rayne
2010-11-01
The world's first archaeological traces from 2.6 million years ago (Ma) at Gona, in Ethiopia, include sharp-edged cutting tools and cut-marked animal bones, which indicate consumption of skeletal muscle by early hominin butchers. From that point, evidence of hominin meat-eating becomes increasingly common throughout the Pleistocene archaeological record. Thus, the substantive debate about hominin meat-eating now centers on mode(s) of carcass resource acquisition. Two prominent hypotheses suggest, alternatively, (1) that early Homo hunted ungulate prey by running them to physiological failure and then dispatching them, or (2) that early Homo was relegated to passively scavenging carcass residues abandoned by carnivore predators. Various paleontologically testable predictions can be formulated for both hypotheses. Here we test four predictions concerning age-frequency distributions for bovids that contributed carcass remains to the 1.8 Ma old FLK 22 Zinjanthropus (FLK Zinj, Olduvai Gorge, Tanzania) fauna, which zooarchaeological and taphonomic data indicate was formed predominantly by early Homo. In all but one case, the bovid mortality data from FLK Zinj violate test predictions of the endurance running-hunting and passive scavenging hypotheses. When combined with other taphonomic data, these results falsify both hypotheses, and lead to the hypothesis that early Homo operated successfully as an ambush predator.
Modelling protein functional domains in signal transduction using Maude
NASA Technical Reports Server (NTRS)
Sriram, M. G.
2003-01-01
Modelling of protein-protein interactions in signal transduction is receiving increased attention in computational biology. This paper describes recent research in the application of Maude, a symbolic language founded on rewriting logic, to the modelling of functional domains within signalling proteins. Protein functional domains (PFDs) are a critical focus of modern signal transduction research. In general, Maude models can simulate biological signalling networks and produce specific testable hypotheses at various levels of abstraction. Developing symbolic models of signalling proteins containing functional domains is important because of the potential to generate analyses of complex signalling networks based on structure-function relationships.
PhyloDet: a scalable visualization tool for mapping multiple traits to large evolutionary trees
Lee, Bongshin; Nachmanson, Lev; Robertson, George; Carlson, Jonathan M.; Heckerman, David
2009-01-01
Summary: Evolutionary biologists are often interested in finding correlations among biological traits across a number of species, as such correlations may lead to testable hypotheses about the underlying function. Because some species are more closely related than others, computing and visualizing these correlations must be done in the context of the evolutionary tree that relates species. In this note, we introduce PhyloDet (short for PhyloDetective), an evolutionary tree visualization tool that enables biologists to visualize multiple traits mapped to the tree. Availability: http://research.microsoft.com/cue/phylodet/ Contact: bongshin@microsoft.com. PMID:19633096
The need for theory to guide concussion research.
Molfese, Dennis L
2015-01-01
Although research into concussion has greatly expanded over the past decade, progress in identifying the mechanisms and consequences of head injury and recovery are largely absent. Instead, data are accumulated without the guidance of a systematic theory to direct research questions or generate testable hypotheses. As part of this special issue on sports concussion, I advance a theory that emphasizes changes in spatial and temporal distributions of the brain's neural networks during normal learning and the disruptions of these networks following injury. Specific predictions are made regarding both the development of the network as well as its breakdown following injury.
QUANTITATIVE TESTS OF ELMS AS INTERMEDIATE N PEELING-BALLOONING MODES
DOE Office of Scientific and Technical Information (OSTI.GOV)
LAO,LL; SNYDER,PB; LEONARD,AW
2003-03-01
Several testable features of the working model of edge localized modes (ELMs) as intermediate toroidal mode number peeling-ballooning modes are evaluated quantitatively using DIII-D and JT-60U experimental data and the ELITE MHD stability code. These include the hypotheses that ELM sizes are related to the radial widths of the unstable MHD modes, that the unstable modes have a strong ballooning character localized in the outboard bad-curvature region, and that ELM size generally becomes smaller at high edge collisionality. ELMs are triggered when the growth rates of the unstable MHD modes become significantly large. These testable features are consistent with many ELM observations in DIII-D and JT-60U discharges.
Lee, Insuk; Li, Zhihua; Marcotte, Edward M.
2007-01-01
Background: Probabilistic functional gene networks are powerful theoretical frameworks for integrating heterogeneous functional genomics and proteomics data into objective models of cellular systems. Such networks provide syntheses of millions of discrete experimental observations, spanning DNA microarray experiments, physical protein interactions, genetic interactions, and comparative genomics; the resulting networks can then be easily applied to generate testable hypotheses regarding specific gene functions and associations. Methodology/Principal Findings: We report a significantly improved version (v. 2) of a probabilistic functional gene network [1] of the baker's yeast, Saccharomyces cerevisiae. We describe our optimization methods and illustrate their effects in three major areas: the reduction of functional bias in network training reference sets, the application of a probabilistic model for calculating confidences in pair-wise protein physical or genetic interactions, and the introduction of simple thresholds that eliminate many false positive mRNA co-expression relationships. Using the network, we predict and experimentally verify the function of the yeast RNA binding protein Puf6 in 60S ribosomal subunit biogenesis. Conclusions/Significance: YeastNet v. 2, constructed using these optimizations together with additional data, shows significant reduction in bias and improvements in precision and recall, in total covering 102,803 linkages among 5,483 yeast proteins (95% of the validated proteome). YeastNet is available from http://www.yeastnet.org. PMID:17912365
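The probabilistic integration underlying such networks is often described as summing log-likelihood scores over evidence types; the sketch below illustrates that arithmetic with invented counts and should not be read as YeastNet's actual training procedure, which also corrects for correlated evidence.

```python
# Hedged sketch of log-likelihood-score (LLS) integration of heterogeneous evidence
# for a single gene pair; the counts and priors are invented placeholders.
from math import log

def lls(linked_pos, linked_neg, prior_pos, prior_neg):
    """log( (P(L|E)/P(~L|E)) / (P(L)/P(~L)) ) for one evidence type."""
    return log((linked_pos / linked_neg) / (prior_pos / prior_neg))

# evidence type -> (gold-standard-linked hits, unlinked hits) among pairs it supports
evidence = {"coexpression": (120, 900), "two-hybrid": (40, 60), "synteny": (15, 70)}
prior = (5_000, 995_000)           # linked vs unlinked pairs genome-wide (invented)

scores = {name: lls(pos, neg, *prior) for name, (pos, neg) in evidence.items()}
pair_score = sum(scores.values())  # naive integration: sum LLS over independent evidence
print({k: round(v, 2) for k, v in scores.items()}, "integrated:", round(pair_score, 2))
```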
Choice Experiments to Quantify Preferences for Health and Healthcare: State of the Practice.
Mühlbacher, Axel; Johnson, F Reed
2016-06-01
Stated-preference methods increasingly are used to quantify preferences in health economics, health technology assessment, benefit-risk analysis and health services research. The objective of stated-preference studies is to acquire information about trade-off preferences among treatment outcomes, prioritization of clinical decision criteria, likely uptake or adherence to healthcare products and acceptability of healthcare services or policies. A widely accepted approach to eliciting preferences is discrete-choice experiments. Patient, physician, insurant or general-public respondents choose among constructed, experimentally controlled alternatives described by decision-relevant features or attributes. Attributes can represent complete health states, sets of treatment outcomes or characteristics of a healthcare system. The observed pattern of choice reveals how different respondents or groups of respondents implicitly weigh, value and assess different characteristics of treatments, products or services. An important advantage of choice experiments is their foundation in microeconomic utility theory. This conceptual framework provides tests of internal validity, guidance for statistical analysis of latent preference structures, and testable behavioural hypotheses. Choice experiments require expertise in survey-research methods, random-utility theory, experimental design and advanced statistical analysis. This paper should be understood as an introduction to setting up a basic experiment rather than an exhaustive critique of the latest findings and procedures. Where appropriate, we have identified topics of active research where a broad consensus has not yet been established.
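The random-utility foundation mentioned above can be made concrete with a small simulated discrete-choice experiment and a conditional logit fit, as in the sketch below; attributes, coefficients, and the experimental design are invented for illustration.

```python
# Small simulated discrete-choice experiment with a conditional (McFadden) logit
# fit, illustrating the random-utility framework; all values are invented.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(6)
true_beta = np.array([1.2, -0.8])             # part-worths: efficacy, risk (assumed)
n_tasks, n_alts = 400, 3

X = rng.normal(size=(n_tasks, n_alts, 2))     # attribute levels per alternative
util = X @ true_beta + rng.gumbel(size=(n_tasks, n_alts))
choice = util.argmax(axis=1)                  # respondent picks highest utility

def neg_loglik(beta):
    v = X @ beta                                               # systematic utility
    logp = v - np.log(np.exp(v).sum(axis=1, keepdims=True))    # softmax log-prob
    return -logp[np.arange(n_tasks), choice].sum()

fit = minimize(neg_loglik, x0=np.zeros(2), method="BFGS")
print("estimated part-worths:", np.round(fit.x, 2), "(true:", true_beta, ")")
```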
Traditional fire-use, landscape transition, and the legacies of social theory past.
Coughlan, Michael R
2015-12-01
Fire-use and the scale and character of its effects on landscapes remain hotly debated in the paleo- and historical-fire literature. Since the second half of the nineteenth century, anthropology and geography have played important roles in providing theoretical propositions and testable hypotheses for advancing understandings of the ecological role of human-fire-use in landscape histories. This article reviews some of the most salient and persistent theoretical propositions and hypotheses concerning the role of humans in historical fire ecology. The review discusses this history in light of current research agendas, such as those offered by pyrogeography. The review suggests that a more theoretically cognizant historical fire ecology should strive to operationalize transdisciplinary theory capable of addressing the role of human variability in the evolutionary history of landscapes. To facilitate this process, researchers should focus attention on integrating more current human ecology theory into transdisciplinary research agendas.
Mäs, Michael; Flache, Andreas
2013-01-01
Explanations of opinion bi-polarization hinge on the assumption of negative influence, individuals’ striving to amplify differences to disliked others. However, empirical evidence for negative influence is inconclusive, which motivated us to search for an alternative explanation. Here, we demonstrate that bi-polarization can be explained without negative influence, drawing on theories that emphasize the communication of arguments as central mechanism of influence. Due to homophily, actors interact mainly with others whose arguments will intensify existing tendencies for or against the issue at stake. We develop an agent-based model of this theory and compare its implications to those of existing social-influence models, deriving testable hypotheses about the conditions of bi-polarization. Hypotheses were tested with a group-discussion experiment (N = 96). Results demonstrate that argument exchange can entail bi-polarization even when there is no negative influence. PMID:24312164
Postmarketing surveillance: perspectives of a journal editor.
Gelenberg, A J
1993-01-01
In the absence of a systematic monitoring program for drugs newly approved by the Food and Drug Administration (FDA), reports in clinical journals provide a legitimate forum for disseminating information about unexpected pharmacologic events. A journal editor bears the responsibility for publishing educated clinical observations that meet standards of scientific rigor while not giving premature credibility to chance and dubious reports of side effects of new drugs. Often this responsibility involves overcoming the fear of bad publicity and withstanding pressures from pharmaceutical companies to print only positive information about new products. Published preliminary observations may contribute to the problem of product liability, but they also generate testable hypotheses and healthy debate. If hypotheses later prove to be incorrect, they can be refuted by systematic studies and clarified in reviews and editorials. Our goal of effective education will be reached not by self-censorship but by scientific openness.
Thomas, Jennifer J; Lawson, Elizabeth A; Micali, Nadia; Misra, Madhusmita; Deckersbach, Thilo; Eddy, Kamryn T
2017-08-01
DSM-5 defined avoidant/restrictive food intake disorder (ARFID) as a failure to meet nutritional needs leading to low weight, nutritional deficiency, dependence on supplemental feedings, and/or psychosocial impairment. We summarize what is known about ARFID and introduce a three-dimensional model to inform research. Because ARFID prevalence, risk factors, and maintaining mechanisms are not known, prevailing treatment approaches are based on clinical experience rather than data. Furthermore, most ARFID research has focused on children, rather than adolescents or adults. We hypothesize a three-dimensional model wherein neurobiological abnormalities in sensory perception, homeostatic appetite, and negative valence systems underlie the three primary ARFID presentations of sensory sensitivity, lack of interest in eating, and fear of aversive consequences, respectively. Now that ARFID has been defined, studies investigating risk factors, prevalence, and pathophysiology are needed. Our model suggests testable hypotheses about etiology and highlights cognitive-behavioral therapy as one possible treatment.
Sexual and Emotional Infidelity: Evolved Gender Differences in Jealousy Prove Robust and Replicable.
Buss, David M
2018-03-01
Infidelity poses threats to high-investment mating relationships. Because of gender differences in some aspects of reproductive biology, such as internal female fertilization, the nature of these threats differs for men and women. Men, but not women, for example, have recurrently faced the problem of uncertainty in their genetic parenthood. Jealousy is an emotion hypothesized to have evolved to combat these threats. The 1992 article Sex Differences in Jealousy: Evolution, Physiology, and Psychology reported three empirical studies using two different methods, forced-choice and physiological experiments. Results supported the evolution-based hypotheses. The article became highly cited for several reasons. It elevated the status of jealousy as an important emotion to be explained by any comprehensive theory of human emotions. Subsequent meta-analyses robustly supported the evolutionary hypotheses. Moreover, the work supported the evolutionary meta-theory of gender differences, which posits differences only in domains in which the sexes have recurrently faced distinct adaptive problems. It also heralded the newly emerging field of evolutionary psychology as a useful perspective that possesses the scientific virtues of testability, falsifiability, and heuristic value in discovering previously unknown psychological phenomena.
Taking Bioinformatics to Systems Medicine.
van Kampen, Antoine H C; Moerland, Perry D
2016-01-01
Systems medicine promotes a range of approaches and strategies to study human health and disease at a systems level with the aim of improving the overall well-being of (healthy) individuals, and preventing, diagnosing, or curing disease. In this chapter we discuss how bioinformatics critically contributes to systems medicine. First, we explain the role of bioinformatics in the management and analysis of data. In particular we show the importance of publicly available biological and clinical repositories to support systems medicine studies. Second, we discuss how the integration and analysis of multiple types of omics data through integrative bioinformatics may facilitate the determination of more predictive and robust disease signatures, lead to a better understanding of (patho)physiological molecular mechanisms, and facilitate personalized medicine. Third, we focus on network analysis and discuss how gene networks can be constructed from omics data and how these networks can be decomposed into smaller modules. We discuss how the resulting modules can be used to generate experimentally testable hypotheses, provide insight into disease mechanisms, and lead to predictive models. Throughout, we provide several examples demonstrating how bioinformatics contributes to systems medicine and discuss future challenges in bioinformatics that need to be addressed to enable the advancement of systems medicine.
A bioinformatics expert system linking functional data to anatomical outcomes in limb regeneration
Lobo, Daniel; Feldman, Erica B.; Shah, Michelle; Malone, Taylor J.
2014-01-01
Amphibians and molting arthropods have the remarkable capacity to regenerate amputated limbs, as described by an extensive literature of experimental cuts, amputations, grafts, and molecular techniques. Despite a rich history of experimental effort, no comprehensive mechanistic model exists that can account for the pattern regulation observed in these experiments. While bioinformatics algorithms have revolutionized the study of signaling pathways, no such tools have heretofore been available to assist scientists in formulating testable models of large‐scale morphogenesis that match published data in the limb regeneration field. Major barriers to an algorithmic approach are the lack of formal descriptions for experimental regenerative information and the absence of a repository to centralize storage and mining of functional data on limb regeneration. Establishing a new bioinformatics of shape would significantly accelerate the discovery of key insights into the mechanisms that implement complex regeneration. Here, we describe a novel mathematical ontology for limb regeneration to unambiguously encode phenotype, manipulation, and experiment data. Based on this formalism, we present the first centralized formal database of published limb regeneration experiments together with a user‐friendly expert system tool to facilitate its access and mining. These resources are freely available for the community and will assist both human biologists and artificial intelligence systems to discover testable, mechanistic models of limb regeneration. PMID:25729585
Zhou, Shaona; Han, Jing; Koenig, Kathleen; Raplinger, Amy; Pi, Yuan; Li, Dan; Xiao, Hua; Fu, Zhao; Bao, Lei
2016-03-01
Scientific reasoning is an important component under the cognitive strand of the 21st century skills and is highly emphasized in the new science education standards. This study focuses on the assessment of student reasoning in control of variables (COV), which is a core sub-skill of scientific reasoning. The main research question is to investigate the extent to which the existence of experimental data in questions impacts student reasoning and performance. This study also explores the effects of task contexts on student reasoning as well as students' abilities to distinguish between testability and causal influences of variables in COV experiments. Data were collected from students in both the USA and China. Students randomly received one of two test versions, one with experimental data and one without. The results show that students from both populations (1) perform better when experimental data are not provided, (2) perform better in physics contexts than in real-life contexts, and (3) tend to equate non-influential variables to non-testable variables. In addition, based on the analysis of both quantitative and qualitative data, a possible progression of developmental levels of student reasoning in control of variables is proposed, which can be used to inform future development of assessment and instruction.
Zhou, Shaona; Han, Jing; Koenig, Kathleen; Raplinger, Amy; Pi, Yuan; Li, Dan; Xiao, Hua; Fu, Zhao
2015-01-01
Scientific reasoning is an important component under the cognitive strand of the 21st century skills and is highly emphasized in the new science education standards. This study focuses on the assessment of student reasoning in control of variables (COV), which is a core sub-skill of scientific reasoning. The main research question is to investigate the extent to which the existence of experimental data in questions impacts student reasoning and performance. This study also explores the effects of task contexts on student reasoning as well as students’ abilities to distinguish between testability and causal influences of variables in COV experiments. Data were collected from students in both the USA and China. Students randomly received one of two test versions, one with experimental data and one without. The results show that students from both populations (1) perform better when experimental data are not provided, (2) perform better in physics contexts than in real-life contexts, and (3) tend to equate non-influential variables to non-testable variables. In addition, based on the analysis of both quantitative and qualitative data, a possible progression of developmental levels of student reasoning in control of variables is proposed, which can be used to inform future development of assessment and instruction. PMID:26949425
Models of cooperative dynamics from biomolecules to magnets
NASA Astrophysics Data System (ADS)
Mobley, David Lowell
This work details application of computer models to several biological systems (prion diseases and Alzheimer's disease) and a magnetic system. These share some common themes, which are discussed. Here, simple lattice-based models are applied to aggregation of misfolded protein in prion diseases like Mad Cow disease. These can explain key features of the diseases. The modeling is based on aggregation being essential in establishing the time-course of infectivity. Growth of initial aggregates is assumed to dominate the experimentally observed lag phase. Subsequent fission, regrowth, and fission set apart the exponential doubling phase in disease progression. We explore several possible modes of growth for 2-D aggregates and suggest the model providing the best explanation for the experimental data. We develop testable predictions from this model. Like prion disease, Alzheimer's disease (AD) is an amyloid disease characterized by large aggregates in the brain. However, evidence increasingly points away from these as the toxic agent and towards oligomers of the Abeta peptide. We explore one possible toxicity mechanism---insertion of Abeta into cell membranes and formation of harmful ion channels. We find that mutations in this peptide which cause familial Alzheimer's disease (FAD) also affect the insertion of this peptide into membranes in a fairly consistent way, suggesting that this toxicity mechanism may be relevant biologically. We find a particular inserted configuration which may be especially harmful and develop testable predictions to verify whether or not this is the case. Nucleation is an essential feature of our models for prion disease, in that it protects normal, healthy individuals from getting prion disease. Nucleation is important in many other areas, and we modify our lattice-based nucleation model to apply to a hysteretic magnetic system where nucleation has been suggested to be important. From a simple model, we find qualitative agreement with experiment, and make testable experimental predictions concerning time-dependence and temperature-dependence of the major hysteresis loop and reversal curves which have been experimentally verified. We argue why this model may be suitable for systems like these and explain implications for Ising-like models. We suggest implications for future modeling work. Finally, we present suggestions for future work in all three areas.
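As a loose illustration of the growth-and-fission picture described above (not the thesis's actual lattice model), the following sketch grows aggregates at a constant rate and splits them once they exceed an assumed fission size, producing a lag phase followed by roughly exponential doubling of aggregate number. The rates and thresholds are arbitrary assumptions.

```python
# Illustrative aggregate growth-and-fission sketch (assumed rates and sizes):
# aggregates grow at a constant rate and split in two once they exceed a
# fission size, giving a lag phase followed by exponential doubling.
GROWTH_RATE = 1.0      # monomers added per aggregate per time step (assumed)
FISSION_SIZE = 100     # size at which an aggregate splits (assumed)
STEPS = 400

aggregates = [10.0]    # start from a single small seed aggregate
history = []
for t in range(STEPS):
    new = []
    for size in aggregates:
        size += GROWTH_RATE
        if size >= FISSION_SIZE:
            new.extend([size / 2, size / 2])   # fission into two daughter aggregates
        else:
            new.append(size)
    aggregates = new
    history.append(len(aggregates))

# The aggregate count stays at 1 during the "lag phase", then roughly doubles
# every FISSION_SIZE / (2 * GROWTH_RATE) steps once fission begins.
print(history[::50])
```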
Morris, Melody K.; Saez-Rodriguez, Julio; Clarke, David C.; Sorger, Peter K.; Lauffenburger, Douglas A.
2011-01-01
Predictive understanding of cell signaling network operation based on general prior knowledge but consistent with empirical data in a specific environmental context is a current challenge in computational biology. Recent work has demonstrated that Boolean logic can be used to create context-specific network models by training proteomic pathway maps to dedicated biochemical data; however, the Boolean formalism is restricted to characterizing protein species as either fully active or inactive. To advance beyond this limitation, we propose a novel form of fuzzy logic sufficiently flexible to model quantitative data but also sufficiently simple to efficiently construct models by training pathway maps on dedicated experimental measurements. Our new approach, termed constrained fuzzy logic (cFL), converts a prior knowledge network (obtained from literature or interactome databases) into a computable model that describes graded values of protein activation across multiple pathways. We train a cFL-converted network to experimental data describing hepatocytic protein activation by inflammatory cytokines and demonstrate the application of the resultant trained models for three important purposes: (a) generating experimentally testable biological hypotheses concerning pathway crosstalk, (b) establishing capability for quantitative prediction of protein activity, and (c) prediction and understanding of the cytokine release phenotypic response. Our methodology systematically and quantitatively trains a protein pathway map summarizing curated literature to context-specific biochemical data. This process generates a computable model yielding successful prediction of new test data and offering biological insight into complex datasets that are difficult to fully analyze by intuition alone. PMID:21408212
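The following sketch illustrates the general idea of a graded, fuzzy-logic-style update of the kind the abstract describes: a normalized Hill function maps upstream activity in [0, 1] to downstream activation, and AND/OR gates are evaluated as min/max. The gate structure, node names, and parameter values are invented for illustration; this is not the trained hepatocyte model.

```python
# Sketch of a graded (fuzzy-logic-style) pathway update. A normalized Hill
# function maps an upstream activity in [0, 1] to a downstream activation
# level; AND/OR combinations use min/max. All parameters are illustrative.
def hill(x, k=0.5, n=3):
    """Normalized Hill transfer function on [0, 1]: hill(0) = 0, hill(1) = 1."""
    return (x ** n / (k ** n + x ** n)) * (k ** n + 1.0)

def AND(*activities):
    return min(activities)

def OR(*activities):
    return max(activities)

# Hypothetical two-input node: downstream activation requires input_a AND
# (input_b OR a basal signal). Node names are made up for illustration.
input_a, input_b, basal = 0.8, 0.2, 0.1
downstream = hill(AND(input_a, OR(input_b, basal)))
print(round(downstream, 3))
```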
1981-03-31
logic testing element and a concomitant testability criterion ideally suited to dynamic circuit applications and appropriate for automatic computer ... making connections automatically. PF is an experimental feature which provides users with only four different chip sizes (full, half, quarter, and eighth) ... initial solution is found constructively which is improved by pair-wise swapping. Results show, however, that the constructive initial sorter, which ...
Delay test generation for synchronous sequential circuits
NASA Astrophysics Data System (ADS)
Devadas, Srinivas
1989-05-01
We address the problem of generating tests for delay faults in non-scan synchronous sequential circuits. Delay test generation for sequential circuits is a considerably more difficult problem than delay testing of combinational circuits and has received much less attention. In this paper, we present a method for generating test sequences to detect delay faults in sequential circuits using the stuck-at fault sequential test generator STALLION. The method is complete in that it will generate a delay test sequence for a targeted fault given sufficient CPU time, if such a sequence exists. We term faults for which no delay test sequence exists, under our test methodology, sequentially delay redundant. We describe means of eliminating sequential delay redundancies in logic circuits. We present a partial-scan methodology for enhancing the testability of difficult-to-test or untestable sequential circuits, wherein a small number of flip-flops are selected and made controllable/observable. The selection process guarantees the elimination of all sequential delay redundancies. We show that an intimate relationship exists between state assignment and delay testability of a sequential machine. We describe a state assignment algorithm for the synthesis of sequential machines with maximal delay fault testability. Preliminary experimental results using the test generation, partial-scan, and synthesis algorithms are presented.
NASA Astrophysics Data System (ADS)
Wieder, William R.; Knowles, John F.; Blanken, Peter D.; Swenson, Sean C.; Suding, Katharine N.
2017-04-01
Abiotic factors structure plant community composition and ecosystem function across many different spatial scales. Often, such variation is considered at regional or global scales, but here we ask whether ecosystem-scale simulations can be used to better understand landscape-level variation that might be particularly important in complex terrain, such as high-elevation mountains. We performed ecosystem-scale simulations by using the Community Land Model (CLM) version 4.5 to better understand how the increased length of growing seasons may impact carbon, water, and energy fluxes in an alpine tundra landscape. The model was forced with meteorological data and validated with observations from the Niwot Ridge Long Term Ecological Research Program site. Our results demonstrate that CLM is capable of reproducing the observed carbon, water, and energy fluxes for discrete vegetation patches across this heterogeneous ecosystem. We subsequently accelerated snowmelt and increased spring and summer air temperatures in order to simulate potential effects of climate change in this region. We found that vegetation communities that were characterized by different snow accumulation dynamics showed divergent biogeochemical responses to a longer growing season. Contrary to expectations, wet meadow ecosystems showed the strongest decreases in plant productivity under extended summer scenarios because of disruptions in hydrologic connectivity. These findings illustrate how Earth system models such as CLM can be used to generate testable hypotheses about the shifting nature of energy, water, and nutrient limitations across space and through time in heterogeneous landscapes; these hypotheses may ultimately guide further experimental work and model development.
Yu, Chenggang; Boutté, Angela; Yu, Xueping; Dutta, Bhaskar; Feala, Jacob D; Schmid, Kara; Dave, Jitendra; Tawa, Gregory J; Wallqvist, Anders; Reifman, Jaques
2015-02-01
The multifactorial nature of traumatic brain injury (TBI), especially the complex secondary tissue injury involving intertwined networks of molecular pathways that mediate cellular behavior, has confounded attempts to elucidate the pathology underlying the progression of TBI. Here, systems biology strategies are exploited to identify novel molecular mechanisms and protein indicators of brain injury. To this end, we performed a meta-analysis of four distinct high-throughput gene expression studies involving different animal models of TBI. By using canonical pathways and a large human protein-interaction network as a scaffold, we separately overlaid the gene expression data from each study to identify molecular signatures that were conserved across the different studies. At 24 hr after injury, the significantly activated molecular signatures were nonspecific to TBI, whereas the significantly suppressed molecular signatures were specific to the nervous system. In particular, we identified a suppressed subnetwork consisting of 58 highly interacting, coregulated proteins associated with synaptic function. We selected three proteins from this subnetwork, postsynaptic density protein 95, nitric oxide synthase 1, and disrupted in schizophrenia 1, and hypothesized that their abundance would be significantly reduced after TBI. In a penetrating ballistic-like brain injury rat model of severe TBI, Western blot analysis confirmed our hypothesis. In addition, our analysis recovered 12 previously identified protein biomarkers of TBI. The results suggest that systems biology may provide an efficient, high-yield approach to generate testable hypotheses that can be experimentally validated to identify novel mechanisms of action and molecular indicators of TBI. © 2014 The Authors. Journal of Neuroscience Research Published by Wiley Periodicals, Inc.
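A minimal sketch of the subnetwork-extraction step described above is given below: starting from a protein-interaction edge list and a set of coordinately suppressed genes, take the induced subgraph and keep its largest connected component as a candidate module. The toy edge list is a placeholder (only DLG4/PSD-95, NOS1, and DISC1 are taken from the abstract); it is not the actual meta-analysis network.

```python
import networkx as nx

# Sketch of extracting a coregulated subnetwork from a protein-interaction
# network. The edge list and the "suppressed" gene set are toy placeholders.
interactions = [
    ("DLG4", "NOS1"), ("DLG4", "DISC1"), ("NOS1", "DISC1"),
    ("DLG4", "GRIN2B"), ("GRIN2B", "CAMK2A"), ("TP53", "MDM2"),
]
suppressed = {"DLG4", "NOS1", "DISC1", "GRIN2B", "TP53"}

G = nx.Graph(interactions)
sub = G.subgraph(suppressed)                      # induced subgraph of suppressed genes
modules = sorted(nx.connected_components(sub), key=len, reverse=True)
print(modules[0])   # largest coregulated module, e.g. {'DLG4', 'NOS1', 'DISC1', 'GRIN2B'}
```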
Shaping Gene Expression by Landscaping Chromatin Architecture: Lessons from a Master.
Sartorelli, Vittorio; Puri, Pier Lorenzo
2018-05-19
Since its discovery as a skeletal muscle-specific transcription factor able to reprogram somatic cells into differentiated myofibers, MyoD has provided an instructive model to understand how transcription factors regulate gene expression. Reciprocally, studies of other transcriptional regulators have provided testable hypotheses to further understand how MyoD activates transcription. Using MyoD as a reference, in this review, we discuss the similarities and differences in the regulatory mechanisms employed by tissue-specific transcription factors to access DNA and regulate gene expression by cooperatively shaping the chromatin landscape within the context of cellular differentiation. Copyright © 2018 Elsevier Inc. All rights reserved.
NDR proteins: lessons learned from Arabidopsis and animal cells prompt a testable hypothesis.
Mudgil, Yashwanti; Jones, Alan M
2010-08-01
N-myc Down Regulated (NDR) genes were discovered more than fifteen years ago. Indirect evidence supports a role in tumor progression and cellular differentiation, but their biochemical function is still unknown. Our detailed analyses of Arabidopsis NDL proteins show their involvement in altering auxin transport, local auxin gradients and expression level of auxin transport proteins. Animal NDL proteins may be involved in membrane recycling of E-cadherin and may act as an effector for the small GTPase. In light of these findings, we hypothesize that NDL proteins regulate vesicular trafficking of auxin transport facilitator PIN proteins by biochemically altering the local lipid environment of PIN proteins.
Cerebrovascular Hemodynamics in Women.
Duque, Cristina; Feske, Steven K; Sorond, Farzaneh A
2017-12-01
Sex and gender, as biological and social factors, significantly influence health outcomes. Among the biological factors, sex differences in vascular physiology may be one specific mechanism contributing to the observed differences in clinical presentation, response to treatment, and clinical outcomes in several vascular disorders. This review focuses on the cerebrovascular bed and summarizes the existing literature on sex differences in cerebrovascular hemodynamics to highlight the knowledge deficit that exists in this domain. The available evidence is used to generate mechanistically plausible and testable hypotheses to underscore the unmet need in understanding sex-specific mechanisms as targets for more effective therapeutic and preventive strategies.
NASA Technical Reports Server (NTRS)
Leonard, J. I.
1985-01-01
Theoretical responses to weightlessness are summarized. The studies include development and validation of a model of erythropoiesis regulation, analysis of the behavior of erythropoiesis under a variety of conditions, simulations of bed rest and space flight, and an evaluation of ground-based animal studies which were conducted as analogs of zero-g. A review of all relevant space flight findings and a set of testable hypotheses which attempt to explain how red cell mass decreases in space flight are presented. An additional document describes details of the mathematical model used in these studies.
What to expect from an evolutionary hypothesis for a human disease: The case of type 2 diabetes.
Watve, Milind; Diwekar-Joshi, Manawa
2016-10-01
Evolutionary medicine has a promise to bring in a conceptual revolution in medicine. However, as yet the field does not have the same theoretical rigour as that of many other fields in evolutionary studies. We discuss here with reference to type 2 diabetes mellitus (T2DM) what role an evolutionary hypothesis should play in the development of thinking in medicine. Starting with the thrifty gene hypothesis, evolutionary thinking in T2DM has undergone several transitions, modifications and refinements of the thrift family of hypotheses. In addition, alternative hypotheses independent of thrift have also been suggested. However, most hypotheses look at partial pictures and make selective use of supportive data, ignoring inconvenient truths. Most hypotheses look at a superficial picture and avoid getting into the intricacies of underlying molecular, neuronal and physiological processes. Very few hypotheses have suggested clinical implications and none of them have been tested with randomized clinical trials. In the meantime, the concepts in the pathophysiology of T2DM are undergoing radical changes and evolutionary hypotheses need to take them into account. We suggest an approach and a set of criteria to evaluate the relative merits of the alternative hypotheses. A number of hypotheses are likely to fail when critically evaluated against these criteria. It is possible that more than one selective process is at work in the evolution of propensity to T2DM, but the intercompatibility of the alternative selective forces and their relative contribution needs to be examined. The approach we describe could potentially lead to a sound evolutionary theory that is clinically useful and testable by randomized controlled clinical trials. Copyright © 2016 Elsevier GmbH. All rights reserved.
Wing, Steve; Richardson, David B; Hoffmann, Wolfgang
2011-04-01
In April 2010, the U.S. Nuclear Regulatory Commission asked the National Academy of Sciences to update a 1990 study of cancer risks near nuclear facilities. Prior research on this topic has suffered from problems in hypothesis formulation and research design. We review epidemiologic principles used in studies of generic exposure-response associations and in studies of specific sources of exposure. We then describe logical problems with assumptions, formation of testable hypotheses, and interpretation of evidence in previous research on cancer risks near nuclear facilities. Advancement of knowledge about cancer risks near nuclear facilities depends on testing specific hypotheses grounded in physical and biological mechanisms of exposure and susceptibility while considering sample size and ability to adequately quantify exposure, ascertain cancer cases, and evaluate plausible confounders. Next steps in advancing knowledge about cancer risks near nuclear facilities require studies of childhood cancer incidence, focus on in utero and early childhood exposures, use of specific geographic information, and consideration of pathways for transport and uptake of radionuclides. Studies of cancer mortality among adults, cancers with long latencies, large geographic zones, and populations that reside at large distances from nuclear facilities are better suited for public relations than for scientific purposes.
Social calls provide novel insights into the evolution of vocal learning
Sewall, Kendra B.; Young, Anna M.; Wright, Timothy F.
2016-01-01
Learned song is among the best-studied models of animal communication. In oscine songbirds, where learned song is most prevalent, it is used primarily for intrasexual selection and mate attraction. Learning of a different class of vocal signals, known as contact calls, is found in a diverse array of species, where they are used to mediate social interactions among individuals. We argue that call learning provides a taxonomically rich system for studying testable hypotheses for the evolutionary origins of vocal learning. We describe and critically evaluate four nonmutually exclusive hypotheses for the origin and current function of vocal learning of calls, which propose that call learning (1) improves auditory detection and recognition, (2) signals local knowledge, (3) signals group membership, or (4) allows for the encoding of more complex social information. We propose approaches to testing these four hypotheses but emphasize that all of them share the idea that social living, not sexual selection, is a central driver of vocal learning. Finally, we identify future areas for research on call learning that could provide new perspectives on the origins and mechanisms of vocal learning in both animals and humans. PMID:28163325
Schmalzl, Laura; Powers, Chivon; Henje Blom, Eva
2015-01-01
During recent decades numerous yoga-based practices (YBP) have emerged in the West, with their aims ranging from fitness gains to therapeutic benefits and spiritual development. Yoga is also beginning to spark growing interest within the scientific community, and yoga-based interventions have been associated with measurable changes in physiological parameters, perceived emotional states, and cognitive functioning. YBP typically involve a combination of postures or movement sequences, conscious regulation of the breath, and various techniques to improve attentional focus. However, so far little if any research has attempted to deconstruct the role of these different component parts in order to better understand their respective contribution to the effects of YBP. A clear operational definition of yoga-based therapeutic interventions for scientific purposes, as well as a comprehensive theoretical framework from which testable hypotheses can be formulated, is therefore needed. Here we propose such a framework, and outline the bottom-up neurophysiological and top-down neurocognitive mechanisms hypothesized to be at play in YBP. PMID:26005409
The origin and early evolution of vascular plant shoots and leaves.
Harrison, C Jill; Morris, Jennifer L
2018-02-05
The morphology of plant fossils from the Rhynie chert has generated longstanding questions about vascular plant shoot and leaf evolution, for instance, which morphologies were ancestral within land plants, when did vascular plants first arise and did leaves have multiple evolutionary origins? Recent advances combining insights from molecular phylogeny, palaeobotany and evo-devo research address these questions and suggest the sequence of morphological innovation during vascular plant shoot and leaf evolution. The evidence pinpoints testable developmental and genetic hypotheses relating to the origin of branching and indeterminate shoot architectures prior to the evolution of leaves, and demonstrates underestimation of polyphyly in the evolution of leaves from branching forms in 'telome theory' hypotheses of leaf evolution. This review discusses fossil, developmental and genetic evidence relating to the evolution of vascular plant shoots and leaves in a phylogenetic framework. This article is part of a discussion meeting issue 'The Rhynie cherts: our earliest terrestrial ecosystem revisited'. © 2017 The Authors.
Schmalzl, Laura; Powers, Chivon; Henje Blom, Eva
2015-01-01
During recent decades numerous yoga-based practices (YBP) have emerged in the West, with their aims ranging from fitness gains to therapeutic benefits and spiritual development. Yoga is also beginning to spark growing interest within the scientific community, and yoga-based interventions have been associated with measurable changes in physiological parameters, perceived emotional states, and cognitive functioning. YBP typically involve a combination of postures or movement sequences, conscious regulation of the breath, and various techniques to improve attentional focus. However, so far little if any research has attempted to deconstruct the role of these different component parts in order to better understand their respective contribution to the effects of YBP. A clear operational definition of yoga-based therapeutic interventions for scientific purposes, as well as a comprehensive theoretical framework from which testable hypotheses can be formulated, is therefore needed. Here we propose such a framework, and outline the bottom-up neurophysiological and top-down neurocognitive mechanisms hypothesized to be at play in YBP.
Bogenschutz, Michael P; Pommy, Jessica M
2012-01-01
Alcohol and drug addiction are major public health problems, and existing treatments are only moderately effective. Although there has been interest for over half a century in the therapeutic use of classic hallucinogens to treat addictions, clinical research with these drugs was halted at an early stage in the early 1970s, leaving many fundamental questions unanswered. In the past two decades, clinical research on classic hallucinogens has resumed, although addiction treatment trials are only now beginning. The purpose of this paper is to provide a targeted review of the research most relevant to the therapeutic potential of hallucinogens, and to integrate this information with current thinking about addiction and recovery. On the basis of this information, we present a heuristic model which organizes a number of hypotheses that may be tested in future research. We conclude that existing evidence provides a convincing rationale for further research on the effects of classic hallucinogens in the treatment of addiction. Copyright © 2012 John Wiley & Sons, Ltd.
The origin and early evolution of vascular plant shoots and leaves
2018-01-01
The morphology of plant fossils from the Rhynie chert has generated longstanding questions about vascular plant shoot and leaf evolution, for instance, which morphologies were ancestral within land plants, when did vascular plants first arise and did leaves have multiple evolutionary origins? Recent advances combining insights from molecular phylogeny, palaeobotany and evo–devo research address these questions and suggest the sequence of morphological innovation during vascular plant shoot and leaf evolution. The evidence pinpoints testable developmental and genetic hypotheses relating to the origin of branching and indeterminate shoot architectures prior to the evolution of leaves, and demonstrates underestimation of polyphyly in the evolution of leaves from branching forms in ‘telome theory’ hypotheses of leaf evolution. This review discusses fossil, developmental and genetic evidence relating to the evolution of vascular plant shoots and leaves in a phylogenetic framework. This article is part of a discussion meeting issue ‘The Rhynie cherts: our earliest terrestrial ecosystem revisited’. PMID:29254961
Hartog, Iris; Scherer-Rath, Michael; Kruizinga, Renske; Netjes, Justine; Henriques, José; Nieuwkerk, Pythia; Sprangers, Mirjam; van Laarhoven, Hanneke
2017-09-01
Falling seriously ill is often experienced as a life event that causes conflict with people's personal goals and expectations in life and evokes existential questions. This article presents a new humanities approach to the way people make meaning of such events and how this influences their quality of life. Incorporating theories on contingency, narrative identity, and quality of life, we developed a theoretical model entailing the concepts life event, worldview, ultimate life goals, experience of contingency, narrative meaning making, narrative integration, and quality of life. We formulate testable hypotheses and describe the self-report questionnaire that was developed based on the model.
Planning and Studying Improvement in Patient Care: The Use of Theoretical Perspectives
Grol, Richard PTM; Bosch, Marije C; Hulscher, Marlies EJL; Eccles, Martin P; Wensing, Michel
2007-01-01
A consistent finding in articles on quality improvement in health care is that change is difficult to achieve. According to the research literature, the majority of interventions are targeted at health care professionals. But success in achieving change may be influenced by factors other than those relating to individual professionals, and theories may help explain whether change is possible. This article argues for a more systematic use of theories in planning and evaluating quality-improvement interventions in clinical practice. It demonstrates how different theories can be used to generate testable hypotheses regarding factors that influence the implementation of change, and it shows how different theoretical assumptions lead to different quality-improvement strategies. PMID:17319808
Creativity, information, and consciousness: The information dynamics of thinking.
Wiggins, Geraint A
2018-05-07
This paper presents a theory of the basic operation of mind, Information Dynamics of Thinking, which is intended for computational implementation and thence empirical testing. It is based on the information theory of Shannon, and treats the mind/brain as an information processing organ that aims to be information-efficient, in that it predicts its world, so as to use information efficiently, and regularly re-represents it, so as to store information efficiently. The theory is presented in context of a background review of various research areas that impinge upon its development. Consequences of the theory and testable hypotheses arising from it are discussed. Copyright © 2018. Published by Elsevier B.V.
Panarchy: theory and application
Allen, Craig R.; Angeler, David G.; Garmestani, Ahjond S.; Gunderson, Lance H.; Holling, Crawford S.
2014-01-01
The concept of panarchy provides a framework that characterizes complex systems of people and nature as dynamically organized and structured within and across scales of space and time. It has been more than a decade since the introduction of panarchy. Over this period, its invocation in peer-reviewed literature has been steadily increasing, but its use remains primarily descriptive and abstract. Here, we discuss the use of the concept in the literature to date, highlight where the concept may be useful, and discuss limitations to the broader applicability of panarchy theory for research in the ecological and social sciences. Finally, we forward a set of testable hypotheses to evaluate key propositions that follow from panarchy theory.
Modules, theories, or islands of expertise? Domain specificity in socialization.
Gelman, Susan A
2010-01-01
The domain-specific approach to socialization processes presented by J. E. Grusec and M. Davidov (this issue) provides a compelling framework for integrating and interpreting a large and disparate body of research findings, and it generates a wealth of testable new hypotheses. At the same time, it introduces core theoretical questions regarding the nature of social interactions, from the perspective of both children and their caregivers. This commentary draws on the literature regarding domain specificity in cognitive development, applauds what is innovative and exciting about applying a domain-specific approach to socialization processes, and points to questions for future research. Foremost among these is what is meant by "domain specificity."
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bray, O.H.
This paper describes a natural language based, semantic information modeling methodology and explores its use and value in clarifying and comparing political science theories and frameworks. As an example, the paper uses this methodology to clarify and compare some of the basic concepts and relationships in the realist (e.g. Waltz) and the liberal (e.g. Rosenau) paradigms for international relations. The methodology can provide three types of benefits: (1) it can clarify and make explicit exactly what is meant by a concept; (2) it can often identify unanticipated implications and consequences of concepts and relationships; and (3) it can help in identifying and operationalizing testable hypotheses.
2014-01-01
Background: Cis-regulatory modules (CRMs), or the DNA sequences required for regulating gene expression, play the central role in biological research on transcriptional regulation in metazoan species. Nowadays, the systematic understanding of CRMs still mainly resorts to computational methods due to the time-consuming and small-scale nature of experimental methods. But the accuracy and reliability of different CRM prediction tools are still unclear. Without comparative cross-analysis of the results and combinatorial consideration with extra experimental information, there is no easy way to assess the confidence of the predicted CRMs. This limits the genome-wide understanding of CRMs. Description: It is known that transcription factor binding and epigenetic profiles tend to determine functions of CRMs in gene transcriptional regulation. Thus integration of the genome-wide epigenetic profiles with systematically predicted CRMs can greatly help researchers evaluate and decipher the prediction confidence and possible transcriptional regulatory functions of these potential CRMs. However, these data are still fragmentary in the literature. Here we performed the computational genome-wide screening for potential CRMs using different prediction tools and constructed the pioneer database, cisMEP (cis-regulatory module epigenetic profile database), to integrate these computationally identified CRMs with genomic epigenetic profile data. cisMEP collects the literature-curated TFBS location data and nine genres of epigenetic data for assessing the confidence of these potential CRMs and deciphering the possible CRM functionality. Conclusions: cisMEP aims to provide a user-friendly interface for researchers to assess the confidence of different potential CRMs and to understand the functions of CRMs through experimentally-identified epigenetic profiles. The deposited potential CRMs and experimental epigenetic profiles for confidence assessment provide experimentally testable hypotheses for the molecular mechanisms of metazoan gene regulation. We believe that the information deposited in cisMEP will greatly facilitate the comparative usage of different CRM prediction tools and will help biologists to study the modular regulatory mechanisms between different TFs and their target genes. PMID:25521507
Brook, Bindi S.
2017-01-01
The chemokine receptor CCR7 drives leukocyte migration into and within lymph nodes (LNs). It is activated by chemokines CCL19 and CCL21, which are scavenged by the atypical chemokine receptor ACKR4. CCR7-dependent navigation is determined by the distribution of extracellular CCL19 and CCL21, which form concentration gradients at specific microanatomical locations. The mechanisms underpinning the establishment and regulation of these gradients are poorly understood. In this article, we have incorporated multiple biochemical processes describing the CCL19–CCL21–CCR7–ACKR4 network into our model of LN fluid flow to establish a computational model to investigate intranodal chemokine gradients. Importantly, the model recapitulates CCL21 gradients observed experimentally in B cell follicles and interfollicular regions, building confidence in its ability to accurately predict intranodal chemokine distribution. Parameter variation analysis indicates that the directionality of these gradients is robust, but their magnitude is sensitive to these key parameters: chemokine production, diffusivity, matrix binding site availability, and CCR7 abundance. The model indicates that lymph flow shapes intranodal CCL21 gradients, and that CCL19 is functionally important at the boundary between B cell follicles and the T cell area. It also predicts that ACKR4 in LNs prevents CCL19/CCL21 accumulation in efferent lymph, but does not control intranodal gradients. Instead, it attributes the disrupted interfollicular CCL21 gradients observed in Ackr4-deficient LNs to ACKR4 loss upstream. Our novel approach has therefore generated new testable hypotheses and alternative interpretations of experimental data. Moreover, it acts as a framework to investigate gradients at other locations, including those that cannot be visualized experimentally or involve other chemokines. PMID:28807994
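The sketch below shows, in one dimension, how a chemokine gradient of the general kind modeled above can be computed numerically from production at a boundary, diffusion, decay, and advection standing in for lymph flow. The geometry, boundary conditions, and parameter values are arbitrary assumptions, not those of the published lymph-node model.

```python
import numpy as np

# 1D sketch of a chemokine profile shaped by boundary production, diffusion,
# decay, and advection (a crude stand-in for lymph flow). Geometry, boundary
# conditions, and parameter values are illustrative only.
NX, L = 100, 1.0                          # grid points, domain length (arbitrary units)
dx = L / (NX - 1)
D, v, k = 1e-3, 5e-3, 0.5                 # diffusivity, flow speed, decay rate (assumed)
dt = 0.2 * dx * dx / D                    # time step respecting diffusive stability

c = np.zeros(NX)
c[0] = 1.0                                # fixed source concentration at x = 0
for _ in range(50000):                    # march to an approximate steady state
    lap = (np.roll(c, -1) - 2 * c + np.roll(c, 1)) / dx**2
    adv = (c - np.roll(c, 1)) / dx        # upwind derivative (flow in +x direction)
    c_new = c + dt * (D * lap - v * adv - k * c)
    c_new[0], c_new[-1] = 1.0, c_new[-2]  # fixed source, zero-gradient outflow
    c = c_new

print(np.round(c[::10], 3))               # a gradient decaying monotonically away from the source
```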
Evidence-based Sensor Tasking for Space Domain Awareness
NASA Astrophysics Data System (ADS)
Jaunzemis, A.; Holzinger, M.; Jah, M.
2016-09-01
Space Domain Awareness (SDA) is the actionable knowledge required to predict, avoid, deter, operate through, recover from, and/or attribute cause to the loss and/or degradation of space capabilities and services. A main purpose for SDA is to provide decision-making processes with a quantifiable and timely body of evidence of behavior(s) attributable to specific space threats and/or hazards. To fulfill the promise of SDA, it is necessary for decision makers and analysts to pose specific hypotheses that may be supported or refuted by evidence, some of which may only be collected using sensor networks. While Bayesian inference may support some of these decision making needs, it does not adequately capture ambiguity in supporting evidence; i.e., it struggles to rigorously quantify 'known unknowns' for decision makers. Over the past 40 years, evidential reasoning approaches such as Dempster Shafer theory have been developed to address problems with ambiguous bodies of evidence. This paper applies mathematical theories of evidence using Dempster Shafer expert systems to address the following critical issues: 1) How decision makers can pose critical decision criteria as rigorous, testable hypotheses, 2) How to interrogate these hypotheses to reduce ambiguity, and 3) How to task a network of sensors to gather evidence for multiple competing hypotheses. This theory is tested using a simulated sensor tasking scenario balancing search versus track responsibilities.
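As a concrete illustration of the evidential-reasoning machinery referred to above, the sketch below applies Dempster's rule of combination to two bodies of evidence over a toy frame of discernment. The hypotheses ("maneuver" vs. "no_maneuver") and the mass assignments are invented; they are not taken from the paper's sensor-tasking scenario.

```python
from itertools import product

# Sketch of Dempster's rule of combination over a toy frame of discernment.
# The hypotheses and mass values are invented for illustration only.
frame = frozenset({"maneuver", "no_maneuver"})

# Two bodies of evidence (e.g., from two sensors); mass is assigned to subsets
# of the frame, including the full frame, which represents ambiguity.
m1 = {frozenset({"maneuver"}): 0.6, frame: 0.4}
m2 = {frozenset({"no_maneuver"}): 0.3, frame: 0.7}

def combine(m1, m2):
    """Dempster's rule: intersect focal elements, renormalize by (1 - conflict)."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

for focal, mass in combine(m1, m2).items():
    print(set(focal), round(mass, 3))
```

Note how mass left on the full frame quantifies the remaining ambiguity, the "known unknowns" the abstract emphasizes as poorly captured by a purely Bayesian treatment.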
The cancer Warburg effect may be a testable example of the minimum entropy production rate principle
NASA Astrophysics Data System (ADS)
Marín, Dolores; Sabater, Bartolomé
2017-04-01
Cancer cells consume more glucose by glycolytic fermentation to lactate than by respiration, a characteristic known as the Warburg effect. In contrast with the 36 moles of ATP produced by respiration, fermentation produces two moles of ATP per mole of glucose consumed, which poses a puzzle with regard to the function of the Warburg effect. The production of free energy (ΔG), enthalpy (ΔH), and entropy (ΔS) per mole linearly varies with the fraction (x) of glucose consumed by fermentation, which is frequently estimated to be around 0.9. Hence, calculation shows that, with respect to pure respiration, the predominant fermentative metabolism decreases the production of entropy per mole of glucose consumed in cancer cells by around 10%. We hypothesize that increased fermentation could allow cancer cells to accomplish the Prigogine theorem of the trend to minimize the rate of production of entropy. According to the theorem, open cellular systems near the steady state could evolve to minimize the rates of entropy production that may be reached by modified replicating cells producing entropy at a low rate. Remarkably, at CO2 concentrations above 930 ppm, glucose respiration produces less entropy than fermentation, which suggests experimental tests to validate the hypothesis of minimization of the rate of entropy production through the Warburg effect.
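The linear dependence on the fermentative fraction x stated above can be written out explicitly; the following is a sketch of that bookkeeping, where ΔS_ferm and ΔS_resp denote the (unspecified here) per-mole entropy productions of pure fermentation and pure respiration.

```latex
% Per mole of glucose, with fraction x consumed by fermentation (linear mixing,
% as stated in the abstract):
\Delta S(x) = x\,\Delta S_{\text{ferm}} + (1 - x)\,\Delta S_{\text{resp}}
% Relative change with respect to pure respiration:
\frac{\Delta S(x) - \Delta S_{\text{resp}}}{\Delta S_{\text{resp}}}
  = x\,\frac{\Delta S_{\text{ferm}} - \Delta S_{\text{resp}}}{\Delta S_{\text{resp}}}
% With x \approx 0.9, the reported ~10% decrease would correspond to
% \Delta S_{\text{ferm}} being roughly 11% smaller than \Delta S_{\text{resp}}.
```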
Contreras-López, Orlando; Moyano, Tomás C; Soto, Daniela C; Gutiérrez, Rodrigo A
2018-01-01
The rapid increase in the availability of transcriptomics data generated by RNA sequencing represents both a challenge and an opportunity for biologists without bioinformatics training. The challenge is handling, integrating, and interpreting these data sets. The opportunity is to use this information to generate testable hypotheses to understand molecular mechanisms controlling gene expression and biological processes (Fig. 1). A successful strategy to generate tractable hypotheses from transcriptomics data has been to build undirected network graphs based on patterns of gene co-expression. Many examples of new hypotheses derived from network analyses can be found in the literature, spanning different organisms including plants and specific fields such as root developmental biology. In order to make the process of constructing a gene co-expression network more accessible to biologists, here we provide step-by-step instructions using published RNA-seq experimental data obtained from a public database. Similar strategies have been used in previous studies to advance root developmental biology. This guide includes basic instructions for the operation of widely used open source platforms such as Bio-Linux, R, and Cytoscape. Even though the data we used in this example were obtained from Arabidopsis thaliana, the workflow developed in this guide can be easily adapted to work with RNA-seq data from any organism.
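The chapter itself walks through Bio-Linux, R, and Cytoscape; as a language-agnostic illustration of the core co-expression step, the Python sketch below computes pairwise Pearson correlations across samples and keeps gene pairs above a threshold as network edges. The toy expression matrix, gene names, and the |r| ≥ 0.8 cutoff are placeholders.

```python
import numpy as np
import pandas as pd

# Sketch of building a gene co-expression network from an expression matrix
# (genes x samples). The toy data and the |r| >= 0.8 cutoff are placeholders.
rng = np.random.default_rng(0)
base = rng.normal(size=8)
expr = pd.DataFrame(
    {
        "geneA": base + rng.normal(scale=0.1, size=8),
        "geneB": base + rng.normal(scale=0.1, size=8),   # co-expressed with geneA
        "geneC": rng.normal(size=8),                      # unrelated gene
    }
).T                                                       # rows = genes, columns = samples

corr = expr.T.corr(method="pearson")                      # gene-by-gene correlation matrix
edges = [
    (g1, g2, round(corr.loc[g1, g2], 2))
    for i, g1 in enumerate(corr.index)
    for g2 in corr.index[i + 1:]
    if abs(corr.loc[g1, g2]) >= 0.8
]
print(edges)   # e.g. [('geneA', 'geneB', 0.99)] -> export as an edge list for Cytoscape
```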
Slowly switching between environments facilitates reverse evolution in small populations.
Tan, Longzhi; Gore, Jeff
2012-10-01
Natural populations must constantly adapt to ever-changing environmental conditions. A particularly interesting question is whether such adaptations can be reversed by returning the population to an ancestral environment. Such evolutionary reversals have been observed in both natural and laboratory populations. However, the factors that determine the reversibility of evolution are still under debate. The time scales of environmental change vary over a wide range, but little is known about how the rate of environmental change influences the reversibility of evolution. Here, we demonstrate computationally that slowly switching between environments increases the reversibility of evolution for small populations that are subject to only modest clonal interference. For small populations, slow switching reduces the mean number of mutations acquired in a new environment and also increases the probability of reverse evolution at each of these "genetic distances." As the population size increases, slow switching no longer reduces the genetic distance, thus decreasing the evolutionary reversibility. We confirm this effect using both a phenomenological model of clonal interference and also a Wright-Fisher stochastic simulation that incorporates genetic diversity. Our results suggest that the rate of environmental change is a key determinant of the reversibility of evolution, and provides testable hypotheses for experimental evolution. © 2012 The Author(s). Evolution© 2012 The Society for the Study of Evolution.
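A minimal Wright-Fisher sketch in the spirit of the simulations described above is shown below: a single locus with two alleles, selection whose sign flips when the environment switches, and binomial resampling for drift. Population size, selection strength, switching time, and the initial frequency are illustrative assumptions; the published model additionally tracks multiple mutations and clonal interference.

```python
import random

# Minimal one-locus Wright-Fisher sketch: a mutation is favored in the new
# environment and disfavored after the switch back to the ancestral one.
# All parameter values and the initial frequency are illustrative.
N, S = 200, 0.05                       # population size, selection strength (assumed)
SWITCH_AT, GENERATIONS = 100, 200      # one return to the ancestral environment
random.seed(2)

freq = 0.1                             # start from an already-established mutant lineage
trajectory = []
for gen in range(GENERATIONS):
    s = S if gen < SWITCH_AT else -S                        # environment flips selection
    p = freq * (1 + s) / (freq * (1 + s) + (1 - freq))      # expected frequency after selection
    freq = sum(random.random() < p for _ in range(N)) / N   # binomial drift (WF sampling)
    trajectory.append(freq)

# Frequency just before the environment switches back vs. at the end of the run:
print(round(trajectory[SWITCH_AT - 1], 2), round(trajectory[-1], 2))
```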
Marín, Dolores; Sabater, Bartolomé
2017-04-28
Cancer cells consume more glucose by glycolytic fermentation to lactate than by respiration, a characteristic known as the Warburg effect. In contrast with the 36 moles of ATP produced by respiration, fermentation produces two moles of ATP per mole of glucose consumed, which poses a puzzle with regard to the function of the Warburg effect. The production of free energy (ΔG), enthalpy (ΔH), and entropy (ΔS) per mole linearly varies with the fraction (x) of glucose consumed by fermentation, which is frequently estimated to be around 0.9. Hence, calculation shows that, with respect to pure respiration, the predominant fermentative metabolism decreases the production of entropy per mole of glucose consumed in cancer cells by around 10%. We hypothesize that increased fermentation could allow cancer cells to accomplish the Prigogine theorem of the trend to minimize the rate of production of entropy. According to the theorem, open cellular systems near the steady state could evolve to minimize the rates of entropy production that may be reached by modified replicating cells producing entropy at a low rate. Remarkably, at CO2 concentrations above 930 ppm, glucose respiration produces less entropy than fermentation, which suggests experimental tests to validate the hypothesis of minimization of the rate of entropy production through the Warburg effect.
The Contribution of Psychosocial Stress to the Obesity Epidemic
Siervo, M.; Wells, J. C. K.; Cizza, G.
2009-01-01
The Thrifty Gene hypothesis theorizes that during evolution a set of genes has been selected to ensure survival in environments with limited food supply and marked seasonality. Contemporary environments have predictable and unlimited food availability, an attenuated seasonality due to artificial lighting, indoor heating during the winter and air conditioning during the summer, and promote sedentariness and overeating. In this setting the thrifty genes are constantly activated to enhance energy storage. Psychosocial stress and sleep deprivation are other features of modern societies. Stress-induced hypercortisolemia in the setting of unlimited food supply promotes adiposity. Modern man is becoming obese because these ancient mechanisms are efficiently promoting a positive energy balance. We propose that in today’s plentifully provisioned societies, where sedentariness and mental stress have become typical traits, chronic activation of the neuroendocrine systems may contribute to the increased prevalence of obesity. We suggest that some of the yet unidentified thrifty genes may be linked to highly conserved energy sensing mechanisms (AMP kinase, mTOR kinase). These hypotheses are testable. Rural societies that are becoming rapidly industrialized and are witnessing a dramatic increase in obesity may provide a historical opportunity to conduct epidemiological studies of the thrifty genotype. In experimental settings, the effects of various forms of psychosocial stress in increasing metabolic efficiency and gene expression can be further tested. PMID:19156597
Dong, Junzi; Colburn, H. Steven
2016-01-01
In multisource, “cocktail party” sound environments, human and animal auditory systems can use spatial cues to effectively separate and follow one source of sound over competing sources. While mechanisms to extract spatial cues such as interaural time differences (ITDs) are well understood in precortical areas, how such information is reused and transformed in higher cortical regions to represent segregated sound sources is not clear. We present a computational model describing a hypothesized neural network that spans spatial cue detection areas and the cortex. This network is based on recent physiological findings that cortical neurons selectively encode target stimuli in the presence of competing maskers based on source locations (Maddox et al., 2012). We demonstrate that key features of cortical responses can be generated by the model network, which exploits spatial interactions between inputs via lateral inhibition, enabling the spatial separation of target and interfering sources while allowing monitoring of a broader acoustic space when there is no competition. We present the model network along with testable experimental paradigms as a starting point for understanding the transformation and organization of spatial information from midbrain to cortex. This network is then extended to suggest engineering solutions that may be useful for hearing-assistive devices in solving the cocktail party problem. PMID:26866056
Dong, Junzi; Colburn, H Steven; Sen, Kamal
2016-01-01
In multisource, "cocktail party" sound environments, human and animal auditory systems can use spatial cues to effectively separate and follow one source of sound over competing sources. While mechanisms to extract spatial cues such as interaural time differences (ITDs) are well understood in precortical areas, how such information is reused and transformed in higher cortical regions to represent segregated sound sources is not clear. We present a computational model describing a hypothesized neural network that spans spatial cue detection areas and the cortex. This network is based on recent physiological findings that cortical neurons selectively encode target stimuli in the presence of competing maskers based on source locations (Maddox et al., 2012). We demonstrate that key features of cortical responses can be generated by the model network, which exploits spatial interactions between inputs via lateral inhibition, enabling the spatial separation of target and interfering sources while allowing monitoring of a broader acoustic space when there is no competition. We present the model network along with testable experimental paradigms as a starting point for understanding the transformation and organization of spatial information from midbrain to cortex. This network is then extended to suggest engineering solutions that may be useful for hearing-assistive devices in solving the cocktail party problem.
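A highly simplified, feedforward sketch of the lateral-inhibition idea described above follows: units tuned to azimuthal locations are driven by a target and a competing masker, and each unit is suppressed by activity at other locations. Tuning widths, inhibition strength, and source positions are invented for illustration and are not the published cortical network model.

```python
import numpy as np

# Minimal feedforward sketch of spatially tuned units with lateral inhibition:
# each unit is excited by sound energy near its preferred azimuth and suppressed
# by activity at other locations. All values are illustrative assumptions.
locations = np.linspace(-90, 90, 37)                    # preferred azimuths (deg)

def tuned_drive(azimuth, gain=1.0, width=20.0):
    return gain * np.exp(-0.5 * ((locations - azimuth) / width) ** 2)

def cortical_rates(drive, w_inhib=0.04):
    inhibition = w_inhib * (drive.sum() - drive)        # inhibition from all *other* units
    return np.maximum(drive - inhibition, 0.0)

target_only = cortical_rates(tuned_drive(-45))
competing = cortical_rates(tuned_drive(-45) + tuned_drive(+45, gain=0.8))

print(locations[np.argmax(competing)])                  # response dominated by the ~ -45 deg target
near_target = np.abs(locations + 45) <= 30
print((target_only[near_target] > 0).sum(), (competing[near_target] > 0).sum())
# the target's active neighborhood narrows when a competing source is present,
# while a broader region stays active when there is no competition
```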
Critical Roles of the Direct GABAergic Pallido-cortical Pathway in Controlling Absence Seizures
Li, Min; Ma, Tao; Wu, Shengdun; Ma, Jingling; Cui, Yan; Xia, Yang; Xu, Peng; Yao, Dezhong
2015-01-01
The basal ganglia (BG), serving as an intermediate bridge between the cerebral cortex and thalamus, are believed to play crucial roles in controlling absence seizure activities generated by the pathological corticothalamic system. Inspired by recent experiments, here we systematically investigate the contribution of a novel identified GABAergic pallido-cortical pathway, projecting from the globus pallidus externa (GPe) in the BG to the cerebral cortex, to the control of absence seizures. By computational modelling, we find that both increasing the activation of GPe neurons and enhancing the coupling strength of the inhibitory pallido-cortical pathway can suppress the bilaterally synchronous 2–4 Hz spike and wave discharges (SWDs) during absence seizures. Appropriate tuning of several GPe-related pathways may also trigger the SWD suppression, through modulating the activation level of GPe neurons. Furthermore, we show that the previously discovered bidirectional control of absence seizures due to the competition between other two BG output pathways also exists in our established model. Importantly, such bidirectional control is shaped by the coupling strength of this direct GABAergic pallido-cortical pathway. Our work suggests that the novel identified pallido-cortical pathway has a functional role in controlling absence seizures and the presented results might provide testable hypotheses for future experimental studies. PMID:26496656
Early developmental gene enhancers affect subcortical volumes in the adult human brain.
Becker, Martin; Guadalupe, Tulio; Franke, Barbara; Hibar, Derrek P; Renteria, Miguel E; Stein, Jason L; Thompson, Paul M; Francks, Clyde; Vernes, Sonja C; Fisher, Simon E
2016-05-01
Genome-wide association screens aim to identify common genetic variants contributing to the phenotypic variability of complex traits, such as human height or brain morphology. The identified genetic variants are mostly within noncoding genomic regions and the biology of the genotype-phenotype association typically remains unclear. In this article, we propose a complementary targeted strategy to reveal the genetic underpinnings of variability in subcortical brain volumes, by specifically selecting genomic loci that are experimentally validated forebrain enhancers, active in early embryonic development. We hypothesized that genetic variation within these enhancers may affect the development and ultimately the structure of subcortical brain regions in adults. We tested whether variants in forebrain enhancer regions showed an overall enrichment of association with volumetric variation in subcortical structures of >13,000 healthy adults. We observed significant enrichment of genomic loci that affect the volume of the hippocampus within forebrain enhancers (empirical P = 0.0015), a finding which robustly passed the adjusted threshold for testing of multiple brain phenotypes (cutoff of P < 0.0083 at an alpha of 0.05). In analyses of individual single nucleotide polymorphisms (SNPs), we identified an association upstream of the ID2 gene with rs7588305 and variation in hippocampal volume. This SNP-based association survived multiple-testing correction for the number of SNPs analyzed but not for the number of subcortical structures. Targeting known regulatory regions offers a way to understand the underlying biology that connects genotypes to phenotypes, particularly in the context of neuroimaging genetics. This biology-driven approach generates testable hypotheses regarding the functional biology of identified associations. Hum Brain Mapp 37:1788-1800, 2016. © 2016 Wiley Periodicals, Inc.
Modeling integrated cellular machinery using hybrid Petri-Boolean networks.
Berestovsky, Natalie; Zhou, Wanding; Nagrath, Deepak; Nakhleh, Luay
2013-01-01
The behavior and phenotypic changes of cells are governed by a cellular circuitry that represents a set of biochemical reactions. Based on biological functions, this circuitry is divided into three types of networks, each encoding for a major biological process: signal transduction, transcription regulation, and metabolism. This division has generally enabled taming computational complexity dealing with the entire system, allowed for using modeling techniques that are specific to each of the components, and achieved separation of the different time scales at which reactions in each of the three networks occur. Nonetheless, with this division comes loss of information and power needed to elucidate certain cellular phenomena. Within the cell, these three types of networks work in tandem, and each produces signals and/or substances that are used by the others to process information and operate normally. Therefore, computational techniques for modeling integrated cellular machinery are needed. In this work, we propose an integrated hybrid model (IHM) that combines Petri nets and Boolean networks to model integrated cellular networks. Coupled with a stochastic simulation mechanism, the model simulates the dynamics of the integrated network, and can be perturbed to generate testable hypotheses. Our model is qualitative and is mostly built upon knowledge from the literature and requires fine-tuning of very few parameters. We validated our model on two systems: the transcriptional regulation of glucose metabolism in human cells, and cellular osmoregulation in S. cerevisiae. The model produced results that are in very good agreement with experimental data, and produces valid hypotheses. The abstract nature of our model and the ease of its construction makes it a very good candidate for modeling integrated networks from qualitative data. The results it produces can guide the practitioner to zoom into components and interconnections and investigate them using such more detailed mathematical models.
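As a rough illustration of the kind of coupling an integrated hybrid model describes, the toy sketch below updates a two-node Boolean regulatory layer synchronously while a small Petri-net layer consumes and produces tokens whenever its transition is enabled, with stochastic firing. The node names, rules, and stoichiometry are invented for illustration and are not taken from the paper.

    import random

    # Boolean layer: two hypothetical regulators; TF switches Enzyme on with a one-step delay.
    state = {"TF": True, "Enzyme": False}
    rules = {
        "TF": lambda s: s["TF"],          # self-sustaining input signal
        "Enzyme": lambda s: s["TF"],      # Enzyme expressed while TF is on
    }

    # Petri-net layer: token counts for a hypothetical substrate and product.
    places = {"substrate": 5, "product": 0}

    def fire_metabolic_step(places, enzyme_on):
        """Fire the conversion transition if the Boolean layer says the enzyme is present."""
        if enzyme_on and places["substrate"] >= 1:
            places["substrate"] -= 1
            places["product"] += 1

    random.seed(1)
    for step in range(8):
        # Synchronous Boolean update of the regulatory layer.
        state = {node: rule(state) for node, rule in rules.items()}
        # Stochastic firing: the enabled transition fires with some probability per step.
        if random.random() < 0.8:
            fire_metabolic_step(places, state["Enzyme"])
        print(step, state, places)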
Rooster feathering, androgenic alopecia, and hormone dependent tumor growth: What is in common?
Mayer, Julie Ann; Chuong, Cheng-Ming; Widelitz, Randall
2015-01-01
Different epithelial organs form as a result of epithelial-mesenchymal interactions and share a common theme modulated by variations (Chuong, ed., Molecular Basis of Epithelial Appendage Morphogenesis, 1998). One of the major modulators is the sex hormone pathway that acts on the prototype signaling pathway to alter organ phenotypes. Here we focus on how the sex hormone pathway interfaces with epithelial morphogenesis-related signaling pathways. We first survey these sex hormone regulated morphogenetic processes in various epithelial organs. Sexual dimorphism of hairs and feathers has implications in sexual selection. Diseases of these pathways result in androgenic alopecia, hirsutism, henny feathering, etc. The growth and development of mammary glands, prostate glands and external genitalia essential for reproductive function are also dependent on sex hormones. Diseases affecting these organs include congenital anomalies and hormone-dependent breast and prostate cancers. To study the role of sex hormones in new growth in the context of systems biology/pathology, an in vivo model in which organ formation starts from stem cells is essential. With recent developments (Yu et al., The morphogenesis of feathers. Nature 420:308–312, 2002), the growth of tail feathers in roosters and hens has become a testable model in which experimental manipulations are possible. We show exemplary data of differences in their growth rate, proliferative cell population and signaling molecule expression. Working hypotheses are proposed on how the sex hormone pathways may interact with growth pathways. It is now possible to test these hypotheses using the chicken model to learn fundamental mechanisms on how sex hormones affect organogenesis, epithelial organ cycling, and growth-related tumorigenesis. PMID:15617560
A model study on the circuit mechanism underlying decision-making in Drosophila.
Wu, Zhihua; Guo, Aike
2011-05-01
Previous elegant experiments in a flight simulator showed that conditioned Drosophila is able to make a clear-cut decision to avoid potential danger. When confronted with conflicting visual cues, the relative saliency of two competing cues is found to be a sensory ruler for flies to judge which cue should be used for decision-making. Further genetic manipulations and immunohistological analysis revealed that the dopamine system and mushroom bodies are indispensable for such a clear-cut or nonlinear decision. The neural circuit mechanism, however, is far from being clear. In this paper, we adopt a computational modeling approach to investigate how different brain areas and the dopamine system work together to drive a fly to make a decision. By developing a systems-level neural network, a two-pathway circuit is proposed. Besides a direct pathway from a feature binding area to the motor center, another connects two areas via the mushroom body, a target of dopamine release. A raised dopamine level is hypothesized to be induced by complex choice tasks and to enhance lateral inhibition and steepen the units' response gain in the mushroom body. Simulations show that training helps to assign values to formerly neutral features. For a circuit model with a blocked mushroom body, the direct pathway passes all alternatives to the motor center without changing original values, giving rise to a simple choice characterized by a linear choice curve. With respect to an intact circuit, enhanced lateral inhibition dependent on dopamine critically promotes competition between alternatives, turning the linear- into nonlinear choice behavior. Results account well for experimental data, supporting the reasonableness of model working hypotheses. Several testable predictions are made for future studies. Copyright © 2011 Elsevier Ltd. All rights reserved.
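A minimal way to see how dopamine-dependent lateral inhibition and steeper gain could turn a linear choice curve into a nonlinear, winner-take-all one is to compare a direct value-passing readout with a competitive stage, as in the hypothetical sketch below. The gain and inhibition parameters are purely illustrative and do not reproduce the published circuit model.

    import numpy as np

    def choice_probability(saliency_a, saliency_b, gain=1.0, inhibition=0.0):
        """Probability of choosing cue A given two competing saliencies.

        gain steepens the units' responses; inhibition lets the stronger unit
        suppress the weaker one before the motor readout (both hypothetical).
        """
        a_eff = max(saliency_a - inhibition * saliency_b, 0.0)
        b_eff = max(saliency_b - inhibition * saliency_a, 0.0)
        ea, eb = np.exp(gain * a_eff), np.exp(gain * b_eff)
        return ea / (ea + eb)

    for s in np.linspace(0.0, 2.0, 9):
        linear = choice_probability(s, 1.0, gain=1.0, inhibition=0.0)   # "mushroom body blocked"
        nonlin = choice_probability(s, 1.0, gain=4.0, inhibition=0.6)   # "intact, raised dopamine"
        print(f"relative saliency {s:4.2f}  direct pathway {linear:.2f}  competitive circuit {nonlin:.2f}")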
Architectural Analysis of Dynamically Reconfigurable Systems
NASA Technical Reports Server (NTRS)
Lindvall, Mikael; Godfrey, Sally; Ackermann, Chris; Ray, Arnab; Yonkwa, Lyly
2010-01-01
Topics include: the problem (increased flexibility of architectural styles decreases analyzability, behavior emerges and varies depending on the configuration, does the resulting system run according to the intended design, and architectural decisions can impede or facilitate testing); top down approach to architecture analysis, detection of defects and deviations, and architecture and its testability; currently targeted projects GMSEC and CFS; analyzing software architectures; analyzing runtime events; actual architecture recognition; GMPUB in Dynamic SAVE; sample output from new approach; taking message timing delays into account; CFS examples of architecture and testability; some recommendations for improved testability; CFS examples of abstract interfaces and testability; and CFS example of opening some internal details.
Boolean Minimization and Algebraic Factorization Procedures for Fully Testable Sequential Machines
1989-09-01
Devadas, Srinivas; Keutzer, Kurt. Research supported in part by the Defense Advanced Research Projects Agency under contract number N00014-87-K-0825. Author information: Department of Electrical Engineering and Computer Science, Massachusetts Institute of Technology, Cambridge, MA 02139; (617) 253-0292.
Advanced Diagnostic and Prognostic Testbed (ADAPT) Testability Analysis Report
NASA Technical Reports Server (NTRS)
Ossenfort, John
2008-01-01
As system designs become more complex, determining the best locations to add sensors and test points for the purpose of testing and monitoring these designs becomes more difficult. Not only must the designer take into consideration all real and potential faults of the system, he or she must also find efficient ways of detecting and isolating those faults. Because sensors and cabling take up valuable space and weight on a system, and given constraints on bandwidth and power, it is even more difficult to add sensors into these complex designs after the design has been completed. As a result, a number of software tools have been developed to assist the system designer in proper placement of these sensors during the system design phase of a project. One of the key functions provided by many of these software programs is a testability analysis of the system essentially an evaluation of how observable the system behavior is using available tests. During the design phase, testability metrics can help guide the designer in improving the inherent testability of the design. This may include adding, removing, or modifying tests; breaking up feedback loops, or changing the system to reduce fault propagation. Given a set of test requirements, the analysis can also help to verify that the system will meet those requirements. Of course, a testability analysis requires that a software model of the physical system is available. For the analysis to be most effective in guiding system design, this model should ideally be constructed in parallel with these efforts. The purpose of this paper is to present the final testability results of the Advanced Diagnostic and Prognostic Testbed (ADAPT) after the system model was completed. The tool chosen to build the model and to perform the testability analysis with is the Testability Engineering and Maintenance System Designer (TEAMS-Designer). The TEAMS toolset is intended to be a solution to span all phases of the system, from design and development through health management and maintenance. TEAMS-Designer is the model-building and testability analysis software in that suite.
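Testability metrics of the kind such tools report can be illustrated with a fault-test dependency matrix (often called a D-matrix): detection coverage is the fraction of failure modes hit by at least one available test, and failure modes with identical test signatures form ambiguity groups that cannot be isolated from one another. The toy matrix below is hypothetical and only shows the bookkeeping, not the TEAMS-Designer algorithm.

    import numpy as np

    # Hypothetical D-matrix: rows = failure modes, columns = tests; 1 means the test detects the fault.
    d_matrix = np.array([
        [1, 0, 0, 1],
        [1, 0, 0, 1],   # same signature as fault 0 -> ambiguity group
        [0, 1, 0, 0],
        [0, 0, 0, 0],   # undetectable with the current sensor/test suite
        [0, 1, 1, 0],
    ])

    detected = d_matrix.any(axis=1)
    coverage = detected.mean()

    # Group faults by identical test signatures to find isolation ambiguities.
    groups = {}
    for fault, signature in enumerate(map(tuple, d_matrix)):
        groups.setdefault(signature, []).append(fault)
    ambiguity_groups = [faults for faults in groups.values() if len(faults) > 1]

    print(f"fault detection coverage: {coverage:.0%}")
    print("undetected faults:", [i for i, d in enumerate(detected) if not d])
    print("ambiguity groups (not isolable from each other):", ambiguity_groups)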
A Unified Approach to the Synthesis of Fully Testable Sequential Machines
1989-10-01
Devadas, Srinivas; Keutzer, Kurt. Research supported in part by the Defense Advanced Research Projects Agency under contract N00014-87-K-0825. Author information: Department of Electrical Engineering and Computer Science, Massachusetts Institute of Technology.
Factors That Affect Software Testability
NASA Technical Reports Server (NTRS)
Voas, Jeffrey M.
1991-01-01
Software faults that infrequently affect software's output are dangerous. When a software fault causes frequent software failures, testing is likely to reveal the fault before the software is released; when the fault remains undetected during testing, it can cause disaster after the software is installed. A technique for predicting whether a particular piece of software is likely to reveal faults within itself during testing is found in [Voas91b]. A piece of software that is likely to reveal faults within itself during testing is said to have high testability. A piece of software that is not likely to reveal faults within itself during testing is said to have low testability. It is preferable to design software with higher testabilities from the outset, i.e., create software with as high a degree of testability as possible to avoid the problems of having undetected faults that are associated with low testability. Information loss is a phenomenon that occurs during program execution that increases the likelihood that a fault will remain undetected. In this paper, I identify two broad classes of information loss, define them, and suggest ways of predicting the potential for information loss to occur. I do this in order to decrease the likelihood that faults will remain undetected during testing.
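The notion of information loss above can be made concrete with a fault that seldom reaches the output because a downstream computation collapses many internal states onto one value. The following hypothetical sketch estimates, by random sampling, how often an injected off-by-one fault actually changes the observable result; a low propagation rate corresponds to low testability in the sense described above.

    import random

    def correct(x):
        # Downstream clamp discards information: every x >= 100 maps to the same output.
        return min(x, 100)

    def faulty(x):
        # Injected fault: off-by-one error upstream of the clamp.
        return min(x + 1, 100)

    random.seed(0)
    trials = 100_000
    propagated = sum(correct(x) != faulty(x)
                     for x in (random.randrange(10_000) for _ in range(trials)))

    # With inputs drawn from 0..9999 the fault is visible only when x < 100,
    # so random testing rarely exposes it: this program has low testability.
    print(f"fault visible at the output in {propagated / trials:.2%} of random inputs")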
Wing, Steve; Richardson, David B.; Hoffmann, Wolfgang
2011-01-01
Background In April 2010, the U.S. Nuclear Regulatory Commission asked the National Academy of Sciences to update a 1990 study of cancer risks near nuclear facilities. Prior research on this topic has suffered from problems in hypothesis formulation and research design. Objectives We review epidemiologic principles used in studies of generic exposure–response associations and in studies of specific sources of exposure. We then describe logical problems with assumptions, formation of testable hypotheses, and interpretation of evidence in previous research on cancer risks near nuclear facilities. Discussion Advancement of knowledge about cancer risks near nuclear facilities depends on testing specific hypotheses grounded in physical and biological mechanisms of exposure and susceptibility while considering sample size and ability to adequately quantify exposure, ascertain cancer cases, and evaluate plausible confounders. Conclusions Next steps in advancing knowledge about cancer risks near nuclear facilities require studies of childhood cancer incidence, focus on in utero and early childhood exposures, use of specific geographic information, and consideration of pathways for transport and uptake of radionuclides. Studies of cancer mortality among adults, cancers with long latencies, large geographic zones, and populations that reside at large distances from nuclear facilities are better suited for public relations than for scientific purposes. PMID:21147606
Tomášek, Oldřich; Gabrielová, Barbora; Kačer, Petr; Maršík, Petr; Svobodová, Jana; Syslová, Kamila; Vinkler, Michal; Albrecht, Tomáš
2016-01-01
Several recent hypotheses consider oxidative stress to be a primary constraint ensuring honesty of condition-dependent carotenoid-based signalling. The key testable difference between these hypotheses is the assumed importance of carotenoids for redox homeostasis, with carotenoids being either antioxidant, pro-oxidant or unimportant. We tested the role of carotenoids in redox balance and sexual signalling by exposing adult male zebra finches (Taeniopygia guttata) to oxidative challenge (diquat dibromide) and manipulating carotenoid intake. As the current controversy over the importance of carotenoids as antioxidants could stem from the hydrophilic basis of commonly-used antioxidant assays, we used the novel measure of in vivo lipophilic antioxidant capacity. Oxidative challenge reduced beak pigmentation but elicited an increase in antioxidant capacity suggesting resource reallocation from signalling to redox homeostasis. Carotenoids counteracted the effect of oxidative challenge on lipophilic (but not hydrophilic) antioxidant capacity, thereby supporting carotenoid antioxidant function in vivo. This is inconsistent with hypotheses proposing that signalling honesty is maintained through either ROS-induced carotenoid degradation or the pro-oxidant effect of high levels of carotenoid-cleavage products acting as a physiological handicap. Our data further suggest that assessment of lipophilic antioxidant capacity is necessary to fully understand the role of redox processes in ecology and evolution. PMID:27000655
Exploring Gusev Crater with Spirit: Review of science objectives and testable hypotheses
Cabrol, N.A.; Grin, E.A.; Carr, M.H.; Sutter, B.; Moore, Johnnie N.; Farmer, J.D.; Greeley, R.; Kuzmin, R.O.; DesMarais, D.J.; Kramer, M.G.; Newsom, H.; Barber, C.; Thorsos, I.; Tanaka, K.L.; Barlow, N.G.; Fike, D.A.; Urquhart, M.L.; Grigsby, B.; Grant, F.D.; de Goursac, O.
2003-01-01
Gusev Crater was selected as the landing site for the Mars Exploration Rover (MER) Spirit mission. Located at the outlet of Ma'adim Vallis and 250 km south of the volcano Apollinaris Patera, Gusev is an outstanding site to achieve the goals of the MER mission. The crater could have collected sediments from a variety of sources during its 3.9 Ga history, including fluvial, lacustrine, volcanic, glacial, impact, regional and local aeolian, and global air falls. It is a unique site to investigate the past history of water on Mars, climate and geological changes, and the potential habitability of the planet, which are central science objectives of the MER mission. Because of its complex history and potential diversity, Gusev will allow the testing of a large spectrum of hypotheses with the complete suite of MER instruments. Evidence consistent with long-lived lake episodes exists in the landing ellipse area. Such deposits might offer a unique opportunity to study, for the first time, Martian aqueous sediments and minerals formed in situ in their geological context. We review the geological history and diversity of the landing site, the science hypotheses that can be tested during the MER mission, and the relevance of Gusev to the MER mission objectives and payload. Copyright 2003 by the American Geophysical Union.
Gilbert, Jack A; O'Dor, Ronald; King, Nicholas; Vogel, Timothy M
2011-06-14
Scientific discovery is incremental. The Merriam-Webster definition of 'Scientific Method' is "principles and procedures for the systematic pursuit of knowledge involving the recognition and formulation of a problem, the collection of data through observation and experiment, and the formulation and testing of hypotheses". Scientists are taught to be excellent observers, as observations create questions, which in turn generate hypotheses. After centuries of science, we tend to assume that we have enough observations to drive science, and enable the small steps and giant leaps which lead to theories and subsequent testable hypotheses. One excellent example of this is Charles Darwin's Voyage of the Beagle, which was essentially an opportunistic survey of biodiversity. Today, obtaining funding for even small-scale surveys of life on Earth is difficult; but few argue the importance of the theory that was generated by Darwin from his observations made during this epic journey. However, these observations, even combined with the parallel work of Alfred Russel Wallace at around the same time, have still not generated an indisputable 'law of biology'. The fact that evolution remains a 'theory', at least to the general public, suggests that surveys for new data need to be taken to a new level.
NASA Astrophysics Data System (ADS)
Boger, R. A.; Low, R.; Paull, S.; Anyamba, A.; Soebiyanto, R. P.
2017-12-01
Temperature and precipitation are important drivers of mosquito population dynamics, and a growing set of models have been proposed to characterize these relationships. Validation of these models, and development of broader theories across mosquito species and regions could nonetheless be improved by comparing observations from a global dataset of mosquito larvae with satellite-based measurements of meteorological variables. Citizen science data can be particularly useful for two such aspects of research into the meteorological drivers of mosquito populations: i) Broad-scale validation of mosquito distribution models and ii) Generation of quantitative hypotheses regarding changes to mosquito abundance and phenology across scales. The recently released GLOBE Observer Mosquito Habitat Mapper (GO-MHM) app engages citizen scientists in identifying vector taxa, mapping breeding sites and decommissioning non-natural habitats, and provides a potentially useful new tool for validating mosquito ubiquity projections based on the analysis of remotely sensed environmental data. Our early work with GO-MHM data focuses on two objectives: validating citizen science reports of Aedes aegypti distribution through comparison with accepted scientific data sources, and exploring the relationship between extreme temperature and precipitation events and subsequent observations of mosquito larvae. Ultimately the goal is to develop testable hypotheses regarding the shape and character of this relationship between mosquito species and regions.
Gas Replacements for GFP to Track Microbial Dynamics in Soils and Sediments
NASA Astrophysics Data System (ADS)
Cheng, Hsiao-Ying; Silberg, Jonathan; Masiello, Caroline
2016-04-01
Metagenomic analyses offer unprecedented views of soil microbial communities, and additionally provide a host of testable hypotheses about the biological mechanisms driving global biogeochemical fluxes. Outside the biogeosciences, hypotheses generated by metagenomics are often tested using biosensors, microbes programmed to respond in a detectable way to either changes in their metabolism or changes in the environment. A very large number of microbial behaviors can be monitored using biosensors, but these sensors typically report in ways that are undetectable in soils, e.g. by releasing green fluorescent protein (GFP). We are building a new class of biosensors that report by releasing easily-detected gases. We will provide an overview of the potential uses of gas-reporting biosensors in geobiology, and will report on the current development of these sensors. One goal in the development of these sensors is to make tractable the testing of gene expression hypotheses derived from metagenomics data. Examples of processes that could be tracked non-invasively with gas sensors include coordination of biofilm formation, nitrification, rhizobial infection of plant roots, and at least some forms of methanogenesis, all of which are managed by the easily-engineered acyl homoserine lactone cell-cell communication system. Another relatively simple process to track with gas sensors is horizontal gene transfer. We will report on the progress of these proof-of-concept examples.
Research by retrieving experiments.
Blagosklonny, Mikhail V
2007-06-01
Newton did not discover that apples fall: the information was available prior to his gravitational hypothesis. Hypotheses can be tested not only by performing experiments but also by retrieving experiments from the literature (via PubMed, for example). Here I show how disconnected facts from known data, if properly connected, can generate novel predictions testable in turn by other published data. With examples from cell cycle, aging, cancer and other fields of biology and medicine, I discuss how new knowledge was and will be derived from old information. Millions of experiments have been already performed to test unrelated hypotheses and the results of those experiments are available to 'test' your hypotheses too. But most data (99% by some estimates) remain unpublished, because they were negative, seemed of low priority, or did not fit the story. Yet for other investigators those data may be valuable. The well-known story of Franklin and Watson is a case in point. By making preliminary data widely available, 'data-owners' will benefit most, receiving the credit for otherwise unused results. If posted (pre-published) on searchable databases, these data may fuel thousands of projects without the need for repetitive experiments. Enormous 'pre-published' databases coupled with Google-like search engines can change the structure of scientific research, and shrinking funding will make this inevitable.
Are evolutionary hypotheses for motion sickness "just-so" stories?
Oman, Charles M
2012-01-01
Vertebrates have evolved rapidly conditionable nausea and vomiting reflexes mediated by gut and brainstem receptors, clearly as a defense against neurotoxin ingestion. In 1977 Treisman proposed that sensory orientation linkages to emetic centers evolved for the same reason, and that motion sickness was an accidental byproduct. It was an "adaptationist" explanation for motion sickness, since it assumed that evolution has shaped all phenotypic traits for survival advantage. Treisman's "poison" theory is plausible, and frequently cited as the accepted scientific explanation for motion sickness. However, alternative explanations have been proposed. The creation of hypotheses is an essential part of science - provided they are testable. This paper reviews the evidence for the Poison theory and several other adaptationist explanations. These hypotheses are certainly not "just-so stories", but supporting evidence is equivocal, and contradictory evidence exists. Parsimony suggests an alternative "pluralistic" view: The vertebrate reticular formation maintains oxygenated blood flow to the brain, discriminates unexpected sensory stimuli - including postural disturbances - and detects and expels ingested neurotoxins. The three systems share neuroarchitectural elements but normally function independently. Brainstem sensory conflict neurons normally discriminate brief postural disturbances, but can be abnormally stimulated during prolonged passive transport (e.g. by boat, beginning about 150-200 generations ago). Sensory conflict signals cross-couple into the neurotoxin expulsion and avoidance system, producing an arguably maladaptive emetic phenotype.
Permo-Triassic vertebrate extinctions: A program
NASA Technical Reports Server (NTRS)
Olson, E. C.
1988-01-01
Since the time of the Authors' study on this subject, a great deal of new information has become available. Concepts of the nature of extinctions have changed materially. The Authors' conclusion that a catastrophic event was not responsible for the extinction of vertebrates has been modified to the extent that hypotheses involving either the impact of a massive extra-terrestrial body or volcanism provide plausible but not currently fully testable hypotheses. The stated changes resulted in a rapid decrease in organic diversity, as the ratio of origins of taxa to extinctions shifted from strongly positive to negative, with momentary equilibrium being reached at about the Permo-Triassic boundary. The proximate causes of the changes in the terrestrial biota appear to lie in two primary factors: (1) strong climatic changes (global mean temperatures, temperature ranges, humidity) and (2) susceptibility of the dominant vertebrates (large dicynodonts) and the Glossopteris flora to disruption of the equilibrium of the world ecosystem. The following proximate causes have been proposed: (1) rhythmic fluctuations in solar radiation, (2) tectonic events as Pangea assembled, altering land-ocean relationships, patterns of wind and water circulation and continental physiography, (3) volcanism, and (4) changes subsequent to impacts of one or more massive extra-terrestrial objects, bodies or comets. These hypotheses are discussed.
Social response to technological disaster: the accident at Three Mile Island
DOE Office of Scientific and Technical Information (OSTI.GOV)
Richardson, B.B.
1984-01-01
Until recently the sociological study of man-environment relations under extreme circumstances has been restricted to natural hazards (e.g., floods, hurricanes, tornadoes). Technological disasters are becoming more commonplace (e.g., Times Beach, MO, Love Canal, TMI-2) and are growing as potential sources of impact upon human populations. However, theory regarding the social impact of such disasters has not been developed. While research on natural disasters is in part applicable to technological disasters, theory adapted from environmental sociology and psychology is also utilized to develop a theory of social response to extreme environmental events produced by technology. Hypotheses are developed in the form of an empirically testable model based on the literature reviewed.
Reframing landscape fragmentation's effects on ecosystem services.
Mitchell, Matthew G E; Suarez-Castro, Andrés F; Martinez-Harms, Maria; Maron, Martine; McAlpine, Clive; Gaston, Kevin J; Johansen, Kasper; Rhodes, Jonathan R
2015-04-01
Landscape structure and fragmentation have important effects on ecosystem services, with a common assumption being that fragmentation reduces service provision. This is based on fragmentation's expected effects on ecosystem service supply, but ignores how fragmentation influences the flow of services to people. Here we develop a new conceptual framework that explicitly considers the links between landscape fragmentation, the supply of services, and the flow of services to people. We argue that fragmentation's effects on ecosystem service flow can be positive or negative, and use our framework to construct testable hypotheses about the effects of fragmentation on final ecosystem service provision. Empirical efforts to apply and test this framework are critical to improving landscape management for multiple ecosystem services. Copyright © 2015 Elsevier Ltd. All rights reserved.
Domain generality vs. modality specificity: The paradox of statistical learning
Frost, Ram; Armstrong, Blair C.; Siegelman, Noam; Christiansen, Morten H.
2015-01-01
Statistical learning is typically considered to be a domain-general mechanism by which cognitive systems discover the underlying distributional properties of the input. Recent studies examining whether there are commonalities in the learning of distributional information across different domains or modalities consistently reveal, however, modality and stimulus specificity. An important question is, therefore, how and why a hypothesized domain-general learning mechanism systematically produces such effects. We offer a theoretical framework according to which statistical learning is not a unitary mechanism, but a set of domain-general computational principles, that operate in different modalities and therefore are subject to the specific constraints characteristic of their respective brain regions. This framework offers testable predictions and we discuss its computational and neurobiological plausibility. PMID:25631249
Transcriptional Regulatory Network Analysis of MYB Transcription Factor Family Genes in Rice.
Smita, Shuchi; Katiyar, Amit; Chinnusamy, Viswanathan; Pandey, Dev M; Bansal, Kailash C
2015-01-01
The MYB transcription factor (TF) family is one of the largest TF families and regulates defense responses to various stresses, hormone signaling, as well as many metabolic and developmental processes in plants. Understanding these regulatory hierarchies of gene expression networks in response to developmental and environmental cues is a major challenge due to the complex interactions between the genetic elements. Correlation analyses are useful for unraveling co-regulated gene pairs governing biological processes as well as for identifying new candidate hub genes involved in these complex processes. High-throughput expression profiling data are highly useful for the construction of co-expression networks. In the present study, we utilized transcriptome data for comprehensive regulatory network studies of MYB TFs by "top-down" and "guide-gene" approaches. More than 50% of OsMYBs were strongly correlated across 50 experimental conditions, with 51 hub genes identified via the "top-down" approach. Further, clusters were identified using Markov Clustering (MCL). To maximize the clustering performance, parameter evaluation of the MCL inflation score (I) was performed in terms of enriched GO categories by measuring the F-score. Comparison of co-expressed clusters with clades from the phylogenetic analysis signifies their evolutionarily conserved co-regulatory roles. We utilized a compendium of known interactions and biological roles together with Gene Ontology enrichment analysis to hypothesize the functions of co-expressed OsMYBs. In the second part, transcriptional regulatory network analysis by the "guide-gene" approach revealed 40 putative targets of 26 OsMYB TF hubs with high correlation values, utilizing 815 microarray datasets. Enrichment of MYB-binding cis-elements in the promoter regions of the putative targets, functional co-occurrence, as well as nuclear localization support our findings. In particular, enrichment of MYB-binding regions involved in drought inducibility implies a regulatory role in the drought response in rice. Thus, the co-regulatory network analysis facilitated the identification of complex OsMYB regulatory networks, and candidate target regulon genes of selected guide MYB genes. The results contribute to candidate gene screening and to experimentally testable hypotheses for potential regulatory MYB TFs and their targets under stress conditions.
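The "top-down" step described above boils down to computing pairwise expression correlations, keeping strongly correlated gene pairs as edges, and ranking genes by connectivity to nominate hubs. A generic sketch of that workflow on a made-up expression matrix follows; the threshold, gene count, and condition count are placeholders, not the values used in the study.

    import numpy as np

    rng = np.random.default_rng(42)

    # Hypothetical expression matrix: 30 genes x 50 experimental conditions.
    n_genes, n_conditions = 30, 50
    expression = rng.normal(size=(n_genes, n_conditions))
    expression[1] = expression[0] + 0.1 * rng.normal(size=n_conditions)   # plant one correlated pair

    # Pairwise Pearson correlations between gene expression profiles.
    corr = np.corrcoef(expression)

    # Keep strongly co-expressed pairs as edges (placeholder cut-off).
    threshold = 0.8
    adjacency = (np.abs(corr) >= threshold) & ~np.eye(n_genes, dtype=bool)

    # Hub candidates: genes with the largest number of co-expression partners.
    degree = adjacency.sum(axis=1)
    hubs = np.argsort(degree)[::-1][:5]
    for g in hubs:
        print(f"gene {g}: {degree[g]} co-expression partners")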
Recent Advances in Tourette Syndrome
Bloch, Michael; State, Matthew; Pittenger, Christopher
2014-01-01
Purpose of review The purpose of this review is to consider the recent literature pertaining to the neurobiology, genetics and treatment of Tourette syndrome (TS). Recent findings Over the last several years, both neuropathological and genetic findings have further focused attention on long-standing hypotheses regarding the role of the basal ganglia in the etiology of tics and TS. Moreover, while the field awaits the results the first large-scale genetic studies, recent findings have already mirrored developments in the neuropsychiatric genetics literature more broadly, highlighting the value of the study of rare variation and the overlap of risks among seemingly disparate diagnostic categories. Finally, treatment studies have underscored the importance of cognitive-behavioral as well as pharmacological interventions for the treatment of tic disorders. Summary Recent findings have led to novel, testable hypotheses regarding the molecular and cellular mechanisms underlying TS. These, in turn, have begun to provide new avenues to conceptualizing treatment strategies. While the development of additional medication options is a pressing need, recent data has demonstrated both the safety and efficacy of non-pharmacological approaches. PMID:21386676
Nauta, Margaret M
2010-01-01
This article celebrates the 50th anniversary of the introduction of John L. Holland's (1959) theory of vocational personalities and work environments by describing the theory's development and evolution, its instrumentation, and its current status. Hallmarks of Holland's theory are its empirical testability and its user-friendliness. By constructing measures for operationalizing the theory's constructs, Holland and his colleagues helped ensure that the theory could be implemented in practice on a widespread basis. Empirical data offer considerable support for the existence of Holland's RIASEC types and their ordering among persons and environments. Although Holland's congruence hypotheses have received empirical support, congruence appears to have modest predictive power. Mixed support exists for Holland's hypotheses involving the secondary constructs of differentiation, consistency, and vocational identity. Evidence of the continued impact of Holland's theory on the field of counseling psychology, particularly in the area of interest assessment, can be seen from its frequent implementation in practice and its use by scholars. Ideas for future research and practice using Holland's theory are suggested.
Extended Testability Analysis Tool
NASA Technical Reports Server (NTRS)
Melcher, Kevin; Maul, William A.; Fulton, Christopher
2012-01-01
The Extended Testability Analysis (ETA) Tool is a software application that supports fault management (FM) by performing testability analyses on the fault propagation model of a given system. Fault management includes the prevention of faults through robust design margins and quality assurance methods, or the mitigation of system failures. Fault management requires an understanding of the system design and operation, potential failure mechanisms within the system, and the propagation of those potential failures through the system. The purpose of the ETA Tool software is to process the testability analysis results from a commercial software program called TEAMS Designer in order to provide a detailed set of diagnostic assessment reports. The ETA Tool is a command-line process with several user-selectable report output options. The ETA Tool also extends the COTS testability analysis and enables variation studies with sensor sensitivity impacts on system diagnostics and component isolation using a single testability output. The ETA Tool can also provide extended analyses from a single set of testability output files. The following analysis reports are available to the user: (1) the Detectability Report provides a breakdown of how each tested failure mode was detected, (2) the Test Utilization Report identifies all the failure modes that each test detects, (3) the Failure Mode Isolation Report demonstrates the system's ability to discriminate between failure modes, (4) the Component Isolation Report demonstrates the system's ability to discriminate between failure modes relative to the components containing the failure modes, (5) the Sensor Sensitivity Analysis Report shows the diagnostic impact due to loss of sensor information, and (6) the Effect Mapping Report identifies failure modes that result in specified system-level effects.
Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool
NASA Technical Reports Server (NTRS)
Maul, William A.; Fulton, Christopher E.
2011-01-01
This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool was proven to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation - also provided with the ETA Tool software release package - that were used to generate the reports presented in the manual.
2013-01-01
Background High-throughput profiling of human tissues typically yields gene lists comprising a mix of relevant molecular entities and multiple false positives that obstruct the translation of such results into mechanistic hypotheses. From general probabilistic considerations, gene lists distilled for the mechanistically relevant components can be far more useful for subsequent experimental design or data interpretation. Results The input candidate gene lists were processed into different tiers of evidence consistency established by enrichment analysis across subsets of the same experiments and across different experiments and platforms. The cut-offs were established empirically through ontological and semantic enrichment; the resultant shortened gene list was re-expanded with the Ingenuity Pathway Assistant tool. The resulting sub-networks provided the basis for generating mechanistic hypotheses that were partially validated by literature search. This approach differs from previous consistency-based studies in that the cut-off on the Receiver Operating Characteristic of the true-false separation process is optimized by flexible selection of the consistency-building procedure. The gene list distilled by this analytic technique and its network representation were termed the Compact Disease Model (CDM). Here we present the CDM signature for the study of early-stage Alzheimer's disease. The integrated analysis of this gene signature allowed us to identify protein traffic vesicles as prominent players in the pathogenesis of Alzheimer's disease. Considering the distances and complexity of protein trafficking in neurons, it is plausible that spontaneous protein misfolding along with a shortage of growth stimulation results in neurodegeneration. Several potentially overlapping scenarios of early-stage Alzheimer pathogenesis have been discussed, with an emphasis on the protective effects of the AT-1-mediated antihypertensive response on cytoskeleton remodeling, along with neuronal activation of oncogenes, luteinizing hormone signaling and insulin-related growth regulation, forming a pleiotropic model of its early stages. Alignment with emerging literature confirmed many predictions derived from the early-stage Alzheimer's disease CDM. Conclusions A flexible approach for high-throughput data analysis, Compact Disease Model generation, allows extraction of meaningful, mechanism-centered gene sets compatible with instant translation of the results into testable hypotheses. PMID:24196233
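Consistency tiers of the kind described above rest on routine enrichment statistics: given a candidate gene list, one asks whether it contains more members of an annotation category than expected by chance, typically with a hypergeometric test. The sketch below shows that calculation with hypothetical counts; it is not the authors' pipeline.

    from scipy.stats import hypergeom

    # Hypothetical counts for one annotation category (e.g. a vesicle-transport term).
    genome_size = 20_000        # background genes
    category_size = 300         # genes annotated to the category
    candidate_list = 150        # genes in the distilled candidate list
    overlap = 12                # candidate genes that carry the annotation

    # P(observing at least `overlap` annotated genes in a random list of the same size).
    p_enrichment = hypergeom.sf(overlap - 1, genome_size, category_size, candidate_list)
    fold = (overlap / candidate_list) / (category_size / genome_size)
    print(f"fold enrichment = {fold:.1f}, hypergeometric P = {p_enrichment:.2e}")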
Current challenges in fundamental physics
NASA Astrophysics Data System (ADS)
Egana Ugrinovic, Daniel
The discovery of the Higgs boson at the Large Hadron Collider completed the Standard Model of particle physics. The Standard Model is a remarkably successful theory of fundamental physics, but it suffers from severe problems. It does not provide an explanation for the origin or stability of the electroweak scale nor for the origin and structure of flavor and CP violation. It predicts vanishing neutrino masses, in disagreement with experimental observations. It also fails to explain the matter-antimatter asymmetry of the universe, and it does not provide a particle candidate for dark matter. In this thesis we provide experimentally testable solutions for most of these problems and we study their phenomenology.
Gait control in a soft robot by sensing interactions with the environment using self-deformation.
Umedachi, Takuya; Kano, Takeshi; Ishiguro, Akio; Trimmer, Barry A
2016-12-01
All animals use mechanosensors to help them move in complex and changing environments. With few exceptions, these sensors are embedded in soft tissues that deform in normal use such that sensory feedback results from the interaction of an animal with its environment. Useful information about the environment is expected to be embedded in the mechanical responses of the tissues during movements. To explore how such sensory information can be used to control movements, we have developed a soft-bodied crawling robot inspired by a highly tractable animal model, the tobacco hornworm Manduca sexta . This robot uses deformations of its body to detect changes in friction force on a substrate. This information is used to provide local sensory feedback for coupled oscillators that control the robot's locomotion. The validity of the control strategy is demonstrated with both simulation and a highly deformable three-dimensionally printed soft robot. The results show that very simple oscillators are able to generate propagating waves and crawling/inching locomotion through the interplay of deformation in different body parts in a fully decentralized manner. Additionally, we confirmed numerically and experimentally that the gait pattern can switch depending on the surface contact points. These results are expected to help in the design of adaptable, robust locomotion control systems for soft robots and also suggest testable hypotheses about how soft animals use sensory feedback.
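The decentralized control idea, phase oscillators whose progression is slowed or advanced by the friction each body segment senses through its own deformation, can be sketched in a few lines. The coupling form and constants below are illustrative assumptions and do not reproduce the published controller or robot parameters.

    import numpy as np

    rng = np.random.default_rng(3)
    n_segments = 5
    phase = rng.uniform(0, 2 * np.pi, n_segments)   # one oscillator per body segment
    omega = 2 * np.pi * 0.5                          # intrinsic frequency, 0.5 Hz (assumed)
    sigma = 1.5                                      # strength of local sensory feedback (assumed)
    dt = 0.01

    def sensed_friction(phase):
        # Hypothetical local sensor: friction force grows while a segment pushes on the ground.
        return np.clip(np.sin(phase), 0.0, None)

    for step in range(2000):
        feedback = sensed_friction(phase)
        # Each oscillator slows down while its own segment is loaded, so the wave of
        # contraction waits for the substrate interaction to finish (decentralized rule).
        phase += dt * (omega - sigma * feedback * np.sin(phase))

    print("final phases (rad):", np.round(phase % (2 * np.pi), 2))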
The relation between prior knowledge and students' collaborative discovery learning processes
NASA Astrophysics Data System (ADS)
Gijlers, Hannie; de Jong, Ton
2005-03-01
In this study we investigate how prior knowledge influences knowledge development during collaborative discovery learning. Fifteen dyads of students (pre-university education, 15-16 years old) worked on a discovery learning task in the physics field of kinematics. The (face-to-face) communication between students was recorded and the interaction with the environment was logged. Based on students' individual judgments of the truth-value and testability of a series of domain-specific propositions, a detailed description of the knowledge configuration for each dyad was created before they entered the learning environment. Qualitative analyses of two dialogues illustrated that prior knowledge influences the discovery learning processes, and knowledge development in a pair of students. Assessments of student and dyad definitional (domain-specific) knowledge, generic (mathematical and graph) knowledge, and generic (discovery) skills were related to the students' dialogue in different discovery learning processes. Results show that a high level of definitional prior knowledge is positively related to the proportion of communication regarding the interpretation of results. Heterogeneity with respect to generic prior knowledge was positively related to the number of utterances made in the discovery process categories hypotheses generation and experimentation. Results of the qualitative analyses indicated that collaboration between extremely heterogeneous dyads is difficult when the high achiever is not willing to scaffold information and work in the low achiever's zone of proximal development.
VirtualPlant: A Software Platform to Support Systems Biology Research
Katari, Manpreet S.; Nowicki, Steve D.; Aceituno, Felipe F.; Nero, Damion; Kelfer, Jonathan; Thompson, Lee Parnell; Cabello, Juan M.; Davidson, Rebecca S.; Goldberg, Arthur P.; Shasha, Dennis E.; Coruzzi, Gloria M.; Gutiérrez, Rodrigo A.
2010-01-01
Data generation is no longer the limiting factor in advancing biological research. In addition, data integration, analysis, and interpretation have become key bottlenecks and challenges that biologists conducting genomic research face daily. To enable biologists to derive testable hypotheses from the increasing amount of genomic data, we have developed the VirtualPlant software platform. VirtualPlant enables scientists to visualize, integrate, and analyze genomic data from a systems biology perspective. VirtualPlant integrates genome-wide data concerning the known and predicted relationships among genes, proteins, and molecules, as well as genome-scale experimental measurements. VirtualPlant also provides visualization techniques that render multivariate information in visual formats that facilitate the extraction of biological concepts. Importantly, VirtualPlant helps biologists who are not trained in computer science to mine lists of genes, microarray experiments, and gene networks to address questions in plant biology, such as: What are the molecular mechanisms by which internal or external perturbations affect processes controlling growth and development? We illustrate the use of VirtualPlant with three case studies, ranging from querying a gene of interest to the identification of gene networks and regulatory hubs that control seed development. Whereas the VirtualPlant software was developed to mine Arabidopsis (Arabidopsis thaliana) genomic data, its data structures, algorithms, and visualization tools are designed in a species-independent way. VirtualPlant is freely available at www.virtualplant.org. PMID:20007449
NOXclass: prediction of protein-protein interaction types.
Zhu, Hongbo; Domingues, Francisco S; Sommer, Ingolf; Lengauer, Thomas
2006-01-19
Structural models determined by X-ray crystallography play a central role in understanding protein-protein interactions at the molecular level. Interpretation of these models requires the distinction between non-specific crystal packing contacts and biologically relevant interactions. This has been investigated previously and classification approaches have been proposed. However, less attention has been devoted to distinguishing different types of biological interactions. These interactions are classified as obligate and non-obligate according to the effect of the complex formation on the stability of the protomers. So far no automatic classification methods for distinguishing obligate, non-obligate and crystal packing interactions have been made available. Six interface properties have been investigated on a dataset of 243 protein interactions. The six properties have been combined using a support vector machine algorithm, resulting in NOXclass, a classifier for distinguishing obligate, non-obligate and crystal packing interactions. We achieve an accuracy of 91.8% for the classification of these three types of interactions using a leave-one-out cross-validation procedure. NOXclass allows the interpretation and analysis of protein quaternary structures. In particular, it generates testable hypotheses regarding the nature of protein-protein interactions, when experimental results are not available. We expect this server will benefit the users of protein structural models, as well as protein crystallographers and NMR spectroscopists. A web server based on the method and the datasets used in this study are available at http://noxclass.bioinf.mpi-inf.mpg.de/.
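Operationally, a classifier of this kind maps each interface to a short vector of properties and trains a support vector machine on labelled examples, assessing it by leave-one-out cross-validation as described above. The sketch below mimics that setup on synthetic feature vectors; the features, class means, and labels are fabricated for illustration and are not the NOXclass data or parameters.

    import numpy as np
    from sklearn.model_selection import LeaveOneOut, cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(7)

    # Synthetic stand-in for interface property vectors (6 properties per interface),
    # with labels 0 = crystal packing, 1 = non-obligate, 2 = obligate.
    n_per_class = 40
    X = np.vstack([rng.normal(loc=mu, scale=1.0, size=(n_per_class, 6)) for mu in (0.0, 1.0, 2.0)])
    y = np.repeat([0, 1, 2], n_per_class)

    model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
    accuracy = cross_val_score(model, X, y, cv=LeaveOneOut()).mean()
    print(f"leave-one-out accuracy on synthetic data: {accuracy:.1%}")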
Buckley, Thomas N; Adams, Mark A
2011-01-01
Leaf respiration continues in the light but at a reduced rate. This inhibition is highly variable, and the mechanisms are poorly known, partly due to the lack of a formal model that can generate testable hypotheses. We derived an analytical model for non-photorespiratory CO₂ release by solving steady-state supply/demand equations for ATP, NADH and NADPH, coupled to a widely used photosynthesis model. We used this model to evaluate causes for suppression of respiration by light. The model agrees with many observations, including highly variable suppression at saturating light, greater suppression in mature leaves, reduced assimilatory quotient (ratio of net CO₂ and O₂ exchange) concurrent with nitrate reduction and a Kok effect (discrete change in quantum yield at low light). The model predicts engagement of non-phosphorylating pathways at moderate to high light, or concurrent with processes that yield ATP and NADH, such as fatty acid or terpenoid synthesis. Suppression of respiration is governed largely by photosynthetic adenylate balance, although photorespiratory NADH may contribute at sub-saturating light. Key questions include the precise diel variation of anabolism and the ATP : 2e⁻ ratio for photophosphorylation. Our model can focus experimental research and is a step towards a fully process-based model of CO₂ exchange. © 2010 Blackwell Publishing Ltd.
The effective application of a discrete transition model to explore cell-cycle regulation in yeast
2013-01-01
Background Bench biologists often do not take part in the development of computational models for their systems, and therefore, they frequently employ them as "black-boxes". Our aim was to construct and test a model that does not depend on the availability of quantitative data, and can be directly used without a need for intensive computational background. Results We present a discrete transition model. We used the cell-cycle in budding yeast as a paradigm for a complex network, demonstrating phenomena such as sequential protein expression and activity, and cell-cycle oscillation. The structure of the network was validated by its response to computational perturbations such as mutations, and its response to mating-pheromone or nitrogen depletion. The model has a strong predictive capability, demonstrating how the activity of a specific transcription factor, Hcm1, is regulated, and what determines commitment of cells to enter and complete the cell-cycle. Conclusion The model presented herein is intuitive, yet is expressive enough to elucidate the intrinsic structure and qualitative behavior of large and complex regulatory networks. Moreover our model allowed us to examine multiple hypotheses in a simple and intuitive manner, giving rise to testable predictions. This methodology can be easily integrated as a useful approach for the study of networks, enriching experimental biology with computational insights. PMID:23915717
Comparison of Three Ionic Liquid-Tolerant Cellulases by Molecular Dynamics
Jaeger, Vance; Burney, Patrick; Pfaendtner, Jim
2015-01-01
We have employed molecular dynamics to investigate the differences in ionic liquid tolerance among three distinct family 5 cellulases from Trichoderma viride, Thermotoga maritima, and Pyrococcus horikoshii. Simulations of the three cellulases were conducted at a range of temperatures in various binary mixtures of the ionic liquid 1-ethyl-3-methyl-imidazolium acetate with water. Our analysis demonstrates that the effects of ionic liquids on the enzymes vary in each individual case from local structural disturbances to loss of much of one enzyme's secondary structure. Enzymes with more negatively charged surfaces tend to resist destabilization by ionic liquids. Specific and unique structural changes in the enzymes are induced by the presence of ionic liquids. Disruption of the secondary structure, changes in dynamical motion, and local changes in the binding pocket are observed in less tolerant enzymes. Ionic-liquid-induced denaturation of one of the enzymes is indicated over the 500 ns timescale. In contrast, the most tolerant cellulase behaves similarly in water and in ionic-liquid-containing mixtures. Unlike the heuristic approaches that attempt to predict enzyme stability using macroscopic properties, molecular dynamics allows us to predict specific atomic-level structural and dynamical changes in an enzyme's behavior induced by ionic liquids and other mixed solvents. Using these insights, we propose specific experimentally testable hypotheses regarding the origin of activity loss for each of the systems investigated in this study. PMID:25692593
Ali, Ashehad A.; Medlyn, Belinda E.; Aubier, Thomas G.; ...
2015-10-06
Differential species responses to atmospheric CO₂ concentration (Ca) could lead to quantitative changes in competition among species and community composition, with flow-on effects for ecosystem function. However, there has been little theoretical analysis of how elevated Ca (eCa) will affect plant competition, or how the composition of plant communities might change. Such theoretical analysis is needed for developing testable hypotheses to frame experimental research. Here, we investigated theoretically how plant competition might change under eCa by implementing two alternative competition theories, resource use theory and resource capture theory, in a plant carbon and nitrogen cycling model. The model makes several novel predictions for the impact of eCa on plant community composition. Using resource use theory, the model predicts that eCa is unlikely to change species dominance in competition, but is likely to increase coexistence among species. Using resource capture theory, the model predicts that eCa may increase community evenness. Collectively, both theories suggest that eCa will favor coexistence and hence that species diversity should increase with eCa. Our theoretical analysis thus leads to a novel hypothesis for the impact of eCa on plant community composition, one with the potential to help guide the design and interpretation of eCa experiments.
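As a sketch of how resource use theory can be made concrete, the snippet below computes each species' equilibrium resource requirement R* under Monod growth and asks whether an assumed increase in nitrogen-use efficiency under eCa changes which species wins; the parameter values and the way eCa enters are illustrative assumptions, not the published carbon-nitrogen model.

# Resource use (R*) theory: the species able to draw the limiting resource
# (here nitrogen) down to the lowest equilibrium level R* excludes competitors.
def r_star(mu_max, half_sat, loss):
    # Resource level at which Monod growth balances loss: mu_max*R/(k+R) = loss.
    return loss * half_sat / (mu_max - loss)

species = {"A": dict(mu_max=1.0, half_sat=2.0, loss=0.1),
           "B": dict(mu_max=0.8, half_sat=1.0, loss=0.1)}

for label, nue_gain in [("ambient Ca", 1.0), ("elevated Ca", 1.3)]:
    # Assume (illustratively) that eCa raises growth per unit nitrogen for both species.
    stars = {name: r_star(p["mu_max"] * nue_gain, p["half_sat"], p["loss"])
             for name, p in species.items()}
    winner = min(stars, key=stars.get)
    print(label, {k: round(v, 3) for k, v in stars.items()}, "-> winner:", winner)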
Improving accuracy and power with transfer learning using a meta-analytic database.
Schwartz, Yannick; Varoquaux, Gaël; Pallier, Christophe; Pinel, Philippe; Poline, Jean-Baptiste; Thirion, Bertrand
2012-01-01
Typical cohorts in brain imaging studies are not large enough for systematic testing of all the information contained in the images. To build testable working hypotheses, investigators thus rely on analysis of previous work, sometimes formalized in a so-called meta-analysis. In brain imaging, this approach underlies the specification of regions of interest (ROIs) that are usually selected on the basis of the coordinates of previously detected effects. In this paper, we propose to use a database of images, rather than coordinates, and frame the problem as transfer learning: learning a discriminant model on a reference task to apply it to a different but related new task. To facilitate statistical analysis of small cohorts, we use a sparse discriminant model that selects predictive voxels on the reference task and thus provides a principled procedure to define ROIs. The benefits of our approach are twofold. First it uses the reference database for prediction, i.e., to provide potential biomarkers in a clinical setting. Second it increases statistical power on the new task. We demonstrate on a set of 18 pairs of functional MRI experimental conditions that our approach gives good prediction. In addition, on a specific transfer situation involving different scanners at different locations, we show that voxel selection based on transfer learning leads to higher detection power on small cohorts.
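A minimal sketch of the transfer-learning logic (sparse voxel selection on a large reference task, then analysis of a small new cohort restricted to the selected voxels) on synthetic data; the data, penalty, and sample sizes are placeholders rather than the fMRI database or the exact sparse discriminant model used in the paper.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_ref, n_new, n_voxels = 200, 20, 500
X_ref = rng.normal(size=(n_ref, n_voxels))
y_ref = (X_ref[:, :10].sum(axis=1) + rng.normal(size=n_ref)) > 0   # 10 informative voxels
X_new = rng.normal(size=(n_new, n_voxels))
y_new = (X_new[:, :10].sum(axis=1) + rng.normal(size=n_new)) > 0

# Step 1: a sparse (L1) model on the reference task defines a data-driven ROI.
ref_model = LogisticRegression(penalty="l1", solver="liblinear", C=0.5).fit(X_ref, y_ref)
roi = np.flatnonzero(ref_model.coef_.ravel())

# Step 2: the small new cohort is analyzed using only the selected voxels.
new_model = LogisticRegression().fit(X_new[:, roi], y_new)
print(f"{roi.size} voxels selected; new-task training accuracy: "
      f"{new_model.score(X_new[:, roi], y_new):.2f}")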
A two-hypothesis approach to establishing a life detection/biohazard protocol for planetary samples
NASA Astrophysics Data System (ADS)
Conley, Catharine; Steele, Andrew
2016-07-01
The COSPAR policy on performing a biohazard assessment on samples brought from Mars to Earth is framed in the context of a concern for false-positive results. However, as noted during the 2012 Workshop for Life Detection in Samples from Mars (Kminek et al., 2014), a more significant concern for planetary samples brought to Earth is false-negative results, because an undetected biohazard could increase risk to the Earth. This is the reason that stringent contamination control must be a high priority for all Category V Restricted Earth Return missions. A useful conceptual framework for addressing these concerns involves two complementary 'null' hypotheses: testing both of them, together, would allow statistical and community confidence to be developed regarding one or the other conclusion. As noted above, false negatives are of primary concern for safety of the Earth, so the 'Earth Safety null hypothesis' -- that must be disproved to assure low risk to the Earth from samples introduced by Category V Restricted Earth Return missions -- is 'There is native life in these samples.' False positives are of primary concern for Astrobiology, so the 'Astrobiology null hypothesis' -- that must be disproved in order to demonstrate the existence of extraterrestrial life -- is 'There is no life in these samples.' The presence of Earth contamination would render both of these hypotheses more difficult to disprove. Both of these hypotheses can be tested following a strict science protocol: analyse, interpret, test the hypotheses, and repeat. The science measurements undertaken are then done in an iterative fashion that responds to discovery, with both hypotheses testable from interpretation of the scientific data. This is a robust, community-involved activity that ensures maximum science return with minimal sample use.
Predicting Predator Recognition in a Changing World.
Carthey, Alexandra J R; Blumstein, Daniel T
2018-02-01
Through natural as well as anthropogenic processes, prey can lose historically important predators and gain novel ones. Both predator gain and loss frequently have deleterious consequences. While numerous hypotheses explain the response of individuals to novel and familiar predators, we lack a unifying conceptual model that predicts the fate of prey following the introduction of a novel or a familiar (reintroduced) predator. Using the concept of eco-evolutionary experience, we create a new framework that allows us to predict whether prey will recognize and be able to discriminate predator cues from non-predator cues and, moreover, the likely persistence outcomes for 11 different predator-prey interaction scenarios. This framework generates useful and testable predictions for ecologists, conservation scientists, and decision-makers. Copyright © 2017 Elsevier Ltd. All rights reserved.
Causes and consequences of reduced blood volume in space flight - A multi-discipline modeling study
NASA Technical Reports Server (NTRS)
Leonard, J. I.
1983-01-01
A group of mathematical models of various physiological systems have been developed and applied to studying problems associated with adaptation to weightlessness. One biomedical issue which could be addressed by at least three of these models from varying perspectives was the reduction in blood volume that universally occurs in astronauts. Accordingly, models of fluid-electrolyte, erythropoiesis, and cardiovascular regulation were employed to study the causes and consequences of blood volume loss during space flight. This analysis confirms the notion that alterations of blood volume are central to an understanding of adaptation to prolonged space flight. More importantly, the modeling studies resulted in specific hypotheses accounting for plasma volume and red cell mass losses and testable predictions concerning the behavior of the circulatory system.
Graduate students' teaching experiences improve their methodological research skills.
Feldon, David F; Peugh, James; Timmerman, Briana E; Maher, Michelle A; Hurst, Melissa; Strickland, Denise; Gilmore, Joanna A; Stiegelmeyer, Cindy
2011-08-19
Science, technology, engineering, and mathematics (STEM) graduate students are often encouraged to maximize their engagement with supervised research and minimize teaching obligations. However, the process of teaching students engaged in inquiry provides practice in the application of important research skills. Using a performance rubric, we compared the quality of methodological skills demonstrated in written research proposals for two groups of early career graduate students (those with both teaching and research responsibilities and those with only research responsibilities) at the beginning and end of an academic year. After statistically controlling for preexisting differences between groups, students who both taught and conducted research demonstrate significantly greater improvement in their abilities to generate testable hypotheses and design valid experiments. These results indicate that teaching experience can contribute substantially to the improvement of essential research skills.
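The analysis logic, comparing end-of-year rubric scores between groups while statistically controlling for beginning-of-year scores, can be sketched as an ANCOVA-style regression; the scores below are fabricated placeholders and the rubric dimensions are not modeled.

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical rubric scores: "pre" at the start of the year, "post" at the end.
df = pd.DataFrame({
    "pre":   [2.1, 2.4, 1.9, 2.8, 2.2, 2.5, 2.0, 2.7, 2.3, 2.6],
    "post":  [2.6, 2.9, 2.4, 3.5, 2.7, 3.4, 2.8, 3.6, 2.9, 3.5],
    "group": ["research"] * 5 + ["teach_and_research"] * 5,
})

# Post-score regressed on pre-score plus group: the group coefficient is the
# adjusted (covariate-controlled) difference in improvement.
model = smf.ols("post ~ pre + C(group)", data=df).fit()
print(model.summary().tables[1])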
A unifying model of the role of the infralimbic cortex in extinction and habits
Taylor, Jane R.; Chandler, L. Judson
2014-01-01
The infralimbic prefrontal cortex (IL) has been shown to be critical for the regulation of flexible behavior, but its precise function remains unclear. This region has been shown to be critical for the acquisition, consolidation, and expression of extinction learning, leading many to hypothesize that IL suppresses behavior as part of a “stop” network. However, this framework is at odds with IL function in habitual behavior in which the IL has been shown to be required for the expression and acquisition of ongoing habitual behavior. Here, we will review the current state of knowledge of IL anatomy and function in behavioral flexibility and provide a testable framework for a single IL mechanism underlying its function in both extinction and habit learning. PMID:25128534
The use of models to predict potential contamination aboard orbital vehicles
NASA Technical Reports Server (NTRS)
Boraas, Martin E.; Seale, Dianne B.
1989-01-01
A model of fungal growth on air-exposed, nonnutritive solid surfaces, developed for use aboard orbital vehicles, is presented. A unique feature of this testable model is that the development of a fungal mycelium can facilitate its own growth by condensation of water vapor from its environment directly onto fungal hyphae. The fungal growth rate is limited by the rate of supply of volatile nutrients, and fungal biomass is limited by either the supply of nonvolatile nutrients or by metabolic loss processes. The model discussed is structurally simple, but its dynamics can be quite complex. Biofilm accumulation can vary from a simple linear increase to sustained exponential growth, depending on the values of the environmental variables and model parameters. The results of the model are consistent with data from aquatic biofilm studies, insofar as the two types of systems are comparable. It is shown that the model presented is experimentally testable and provides a platform for the interpretation of observational data that may be directly relevant to the question of growth of organisms aboard the proposed Space Station.
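A minimal sketch of the qualitative behavior described above, with biomass growth limited by the supply rate of volatile nutrients and opposed by a metabolic loss term; the equations and parameter values are illustrative assumptions, not the published model.

import numpy as np
from scipy.integrate import solve_ivp

supply = 0.5    # volatile nutrient supply rate (arbitrary units per day)
yield_ = 0.8    # biomass produced per unit nutrient assimilated
loss = 0.05     # metabolic loss rate (per day)
k = 0.2         # half-saturation constant for nutrient uptake

def dstate(t, s):
    nutrient, biomass = s
    uptake = biomass * nutrient / (k + nutrient)
    # Nutrient pool fed by supply and drained by uptake; biomass grows from
    # assimilated nutrient and decays through metabolic loss.
    return [supply - uptake, yield_ * uptake - loss * biomass]

sol = solve_ivp(dstate, (0, 60), [0.0, 0.01], t_eval=np.linspace(0, 60, 7))
for t, b in zip(sol.t, sol.y[1]):
    print(f"day {t:4.0f}: biomass = {b:6.3f}")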
LSI/VLSI design for testability analysis and general approach
NASA Technical Reports Server (NTRS)
Lam, A. Y.
1982-01-01
The incorporation of testability characteristics into large-scale digital design is not only necessary for, but also pertinent to, effective device testing and the enhancement of device reliability. There are at least three major DFT techniques, namely the self-checking, LSSD, and partitioning techniques, each of which can be incorporated into a logic design to achieve a specific set of testability and reliability requirements. A detailed analysis of the design theory, implementation, fault coverage, hardware requirements, application limitations, etc., of each of these techniques is also presented.
Wynne-Edwards, K E
2001-01-01
Hormone disruption is a major, underappreciated component of the plant chemical arsenal, and the historical coevolution between hormone-disrupting plants and herbivores will have both increased the susceptibility of carnivores and diversified the sensitivities of herbivores to man-made endocrine disruptors. Here I review diverse evidence of the influence of plant secondary compounds on vertebrate reproduction, including human reproduction. Three of the testable hypotheses about the evolutionary responses of vertebrate herbivores to hormone-disrupting challenges from their diet are developed. Specifically, the hypotheses are that a) vertebrate herbivores will express steroid hormone receptors in the buccal cavity and/or the vomeronasal organ; b) absolute sex steroid concentrations will be lower in carnivores than in herbivores; and c) herbivore steroid receptors should be more diverse in their binding affinities than carnivore lineages. The argument developed in this review, if empirically validated by support for the specific hypotheses, suggests that a) carnivores will be more susceptible than herbivores to endocrine-disrupting compounds of anthropogenic origin entering their bodies, and b) diverse herbivore lineages will be variably susceptible to any given natural or synthetic contaminant. As screening methods for hormone-disrupting potential are compared and adopted, comparative endocrine physiology research is urgently needed to develop models that predict the broad applicability of those screening results in diverse vertebrate species. PMID:11401754
Merks, Roeland M H; Guravage, Michael; Inzé, Dirk; Beemster, Gerrit T S
2011-02-01
Plant organs, including leaves and roots, develop by means of a multilevel cross talk between gene regulation, patterned cell division and cell expansion, and tissue mechanics. The multilevel regulatory mechanisms complicate classic molecular genetics or functional genomics approaches to biological development, because these methodologies implicitly assume a direct relation between genes and traits at the level of the whole plant or organ. Instead, understanding gene function requires insight into the roles of gene products in regulatory networks, the conditions of gene expression, etc. This interplay is impossible to understand intuitively. Mathematical and computer modeling allows researchers to design new hypotheses and produce experimentally testable insights. However, the required mathematics and programming experience makes modeling poorly accessible to experimental biologists. Problem-solving environments provide biologically intuitive in silico objects ("cells", "regulation networks") required for setting up a simulation and present those to the user in terms of familiar, biological terminology. Here, we introduce the cell-based computer modeling framework VirtualLeaf for plant tissue morphogenesis. The current version defines a set of biologically intuitive C++ objects, including cells, cell walls, and diffusing and reacting chemicals, that provide useful abstractions for building biological simulations of developmental processes. We present a step-by-step introduction to building models with VirtualLeaf, providing basic example models of leaf venation and meristem development. VirtualLeaf-based models provide a means for plant researchers to analyze the function of developmental genes in the context of the biophysics of growth and patterning. VirtualLeaf is an ongoing open-source software project (http://virtualleaf.googlecode.com) that runs on Windows, Mac, and Linux.
The dynamics of hurricane balls
NASA Astrophysics Data System (ADS)
Andersen, W. L.; Werner, Steven
2015-09-01
We examine the theory of the hurricane balls toy. This toy consists of two steel balls welded together, which are sent spinning on a horizontal surface somewhat like a top. Unlike a top, at high frequency the symmetry axis approaches a limiting inclination that is not perpendicular to the surface. We calculate (and experimentally verify) the limiting inclinations for three toy geometries. We find that at high frequencies, hurricane balls provide an easily realized and testable example of the Poinsot theory of freely rotating symmetrical bodies.
Beyond Critical Exponents in Neuronal Avalanches
NASA Astrophysics Data System (ADS)
Friedman, Nir; Butler, Tom; Deville, Robert; Beggs, John; Dahmen, Karin
2011-03-01
Neurons form a complex network in the brain, where they interact with one another by firing electrical signals. Neurons firing can trigger other neurons to fire, potentially causing avalanches of activity in the network. In many cases these avalanches have been found to be scale independent, similar to critical phenomena in diverse systems such as magnets and earthquakes. We discuss models for neuronal activity that allow for the extraction of testable, statistical predictions. We compare these models to experimental results, and go beyond critical exponents.
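One way to move beyond a by-eye claim of scale independence is to estimate the avalanche-size exponent by maximum likelihood and compare the fitted power law with data; the sketch below draws synthetic avalanche sizes and applies the standard continuous-power-law estimator, so the numbers are illustrative only.

import numpy as np

rng = np.random.default_rng(1)
s_min, tau_true, n = 1.0, 1.5, 5000
# Inverse-transform sampling from a continuous power law P(S > s) ~ (s/s_min)^-(tau-1).
sizes = s_min * (1 - rng.random(n)) ** (-1.0 / (tau_true - 1.0))

# Maximum-likelihood (Hill-type) estimator of the exponent for a continuous power law.
tau_hat = 1.0 + n / np.sum(np.log(sizes / s_min))
print(f"estimated exponent tau = {tau_hat:.3f} (true value {tau_true})")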
Ecological resistance in urban streams: the role of natural and legacy attributes
Utz, Ryan M.; Hopkins, Kristina G.; Beesley, Leah; Booth, Derek B.; Hawley, Robert J.; Baker, Matthew E.; Freeman, Mary C.; Jones, Krista L.
2016-01-01
Urbanization substantially changes the physicochemical and biological characteristics of streams. The trajectory of negative effect is broadly similar around the world, but the nature and magnitude of ecological responses to urban growth differ among locations. Some heterogeneity in response arises from differences in the level of urban development and attributes of urban water management. However, the heterogeneity also may arise from variation in hydrologic, biological, and physicochemical templates that shaped stream ecosystems before urban development. We present a framework to develop hypotheses that predict how natural watershed and channel attributes in the pre-urban-development state may confer ecological resistance to urbanization. We present 6 testable hypotheses that explore the expression of such attributes under our framework: 1) greater water storage capacity mitigates hydrologic regime shifts, 2) coarse substrates and a balance between erosive forces and sediment supply buffer morphological changes, 3) naturally high ionic concentrations and pH pre-adapt biota to water-quality stress, 4) metapopulation connectivity results in retention of species richness, 5) high functional redundancy buffers trophic function from species loss, and 6) landuse history mutes or reverses the expected trajectory of eutrophication. Data from past comparative analyses support these hypotheses, but rigorous testing will require targeted investigations that account for confounding or interacting factors, such as diversity in urban infrastructure attributes. Improved understanding of the susceptibility or resistance of stream ecosystems could substantially strengthen conservation, management, and monitoring efforts in urban streams. We hope that these preliminary, conceptual hypotheses will encourage others to explore these ideas further and generate additional explanations for the heterogeneity observed in urban streams.
Module generation for self-testing integrated systems
NASA Astrophysics Data System (ADS)
Vanriessen, Ronald Pieter
Hardware used for self-test in VLSI (Very Large Scale Integrated) systems is reviewed, and an architecture to control the test hardware in an integrated system is presented. Because of the increase in test times, the use of self-test techniques has become practically and economically viable for VLSI systems. Besides the reduction in test times and costs, self-test also provides testing at operational speeds. Therefore, a suitable combination of scan path and macro-specific (self) tests is required to reduce test times and costs. An expert system that can be used in a silicon compilation environment is presented. The approach requires a minimum of testability knowledge from a system designer. A user-friendly interface is described for specifying and modifying testability requirements by a testability expert. A reason-directed backtracking mechanism is used to resolve selection failures. Both the hierarchical testable architecture and the design-for-testability expert system are used in a self-test compiler, which is defined as a software tool that selects an appropriate test method for every macro in a design. The hardware to control a macro test is included in the design automatically. As an example, the integration of the self-test compiler in the silicon compilation system PIRAMID is described, along with the design of a demonstrator circuit by the self-test compiler. This circuit consists of two self-testable macros. Control of the self-test hardware is carried out via the test access port of the boundary scan standard.
The evolution of social and semantic networks in epistemic communities
NASA Astrophysics Data System (ADS)
Margolin, Drew Berkley
This study describes and tests a model of scientific inquiry as an evolving, organizational phenomenon. Arguments are derived from organizational ecology and evolutionary theory. The empirical subject of study is an epistemic community of scientists publishing on a research topic in physics: the string theoretic concept of "D-branes." The study uses evolutionary theory as a means of predicting change in the way members of the community choose concepts to communicate acceptable knowledge claims. It is argued that the pursuit of new knowledge is risky, because the reliability of a novel knowledge claim cannot be verified until after substantial resources have been invested. Using arguments from both philosophy of science and organizational ecology, it is suggested that scientists can mitigate and sensibly share the risks of knowledge discovery within the community by articulating their claims in legitimate forms, i.e., forms that are testable within and relevant to the community. Evidence from empirical studies of semantic usage suggests that the legitimacy of a knowledge claim is influenced by the characteristics of the concepts in which it is articulated. A model of conceptual retention, variation, and selection is then proposed for predicting the usage of concepts and conceptual co-occurrences in the future publications of the community, based on its past. Results substantially supported hypothesized retention and selection mechanisms. Future concept usage was predictable from previous concept usage, but was limited by conceptual carrying capacity as predicted by density dependence theory. Also as predicted, retention was stronger when the community showed a more cohesive social structure. Similarly, concepts that showed structural signatures of high testability and relevance were more likely to be selected after previous usage frequency was controlled for. By contrast, hypotheses for variation mechanisms were not supported. Surprisingly, concepts whose structural position suggested they would be easiest to discover through search processes were used less frequently, once previous usage frequency was controlled for. The study also makes a theoretical contribution by suggesting ways that evolutionary theory can be used to integrate findings from the study of science with insights from organizational communication. A variety of concrete directions for future studies of social and semantic network evolution are also proposed.
Curiosity at Vera Rubin Ridge: Testable Hypotheses, First Results, and Implications for Habitability
NASA Astrophysics Data System (ADS)
Fraeman, A.; Bedford, C.; Bridges, J.; Edgar, L. A.; Hardgrove, C.; Horgan, B. H. N.; Gabriel, T. S. J.; Grotzinger, J. P.; Gupta, S.; Johnson, J. R.; Rampe, E. B.; Morris, R. V.; Salvatore, M. R.; Schwenzer, S. P.; Stack, K.; Pinet, P. C.; Rubin, D. M.; Weitz, C. M.; Wellington, D. F.; Wiens, R. C.; Williams, A. J.; Vasavada, A. R.
2017-12-01
As of sol 1756, Curiosity was 250 meters from ascending Vera Rubin Ridge, a unique geomorphic feature preserved in the lower foothills of Aeolis Mons (informally known as Mt. Sharp) that is distinguishable from orbit. Vera Rubin Ridge (previously termed the Hematite Ridge) is characterized by a higher thermal inertia than the surrounding terrain, is comparatively resistant to erosion, and is capped with a hematite-bearing layer that is visible in 18 m/pixel CRISM data. A key hypothesis associated with this unit is that it represents a redox interface where ferrous iron oxidized and precipitated either as hematite or another ferric precursor. The Curiosity integrated payload is being used to determine the depositional environment(s), stratigraphic context and geochemical conditions associated with this interface, all of which will provide key insights into its past habitability potential and the relative timing of processes. Specifically, analysis of Curiosity data will address four major questions related to the history and evolution of ridge-forming strata: (1) What is the stratigraphic relationship between the units in the ridge and the Mt. Sharp group (see Grotzinger et al., 2015)? (2) What primary and secondary geologic processes deposited and modified the ridge units over time? (3) What is the nature and timing of the hematite precipitation environment, and how does it relate to similar oxidized phases in the Murray formation? (4) What are the implications for habitability and the preservation of organic molecules? Initial results of a systematic imaging campaign along the contact between the lower portion of the ridge and the Murray formation have revealed dm-scale cross bedding within the ridge stratigraphy, which provides clues about the depositional environments; these can be compared to suites of sedimentary structures within the adjacent Murray formation. Long distance ChemCam passive and Mastcam multispectral data show that hematite and likely other ferric phases are present in the upper ridge, consistent with orbital data. Curiosity will continue to take systematic observations that draw upon testable hypotheses about the ridge environments as the rover ascends Vera Rubin Ridge.
Gervasi, Stephanie S; Stephens, Patrick R; Hua, Jessica; Searle, Catherine L; Xie, Gisselle Yang; Urbina, Jenny; Olson, Deanna H; Bancroft, Betsy A; Weis, Virginia; Hammond, John I; Relyea, Rick A; Blaustein, Andrew R
2017-01-01
Variation in host responses to pathogens can have cascading effects on populations and communities when some individuals or groups of individuals display disproportionate vulnerability to infection or differ in their competence to transmit infection. The fungal pathogen, Batrachochytrium dendrobatidis (Bd) has been detected in almost 700 different amphibian species and is implicated in numerous global amphibian population declines. Identifying key hosts in the amphibian-Bd system-those who are at greatest risk or who pose the greatest risk for others-is challenging due in part to many extrinsic environmental factors driving spatiotemporal Bd distribution and context-dependent host responses to Bd in the wild. One way to improve predictive risk models and generate testable mechanistic hypotheses about vulnerability is to complement what we know about the spatial epidemiology of Bd with data collected through comparative experimental studies. We used standardized pathogen challenges to quantify amphibian survival and infection trajectories across 20 post-metamorphic North American species raised from eggs. We then incorporated trait-based models to investigate the predictive power of phylogenetic history, habitat use, and ecological and life history traits in explaining responses to Bd. True frogs (Ranidae) displayed the lowest infection intensities, whereas toads (Bufonidae) generally displayed the greatest levels of mortality after Bd exposure. Affiliation with ephemeral aquatic habitat and breadth of habitat use were strong predictors of vulnerability to and intensity of infection and several other traits including body size, lifespan, age at sexual maturity, and geographic range also appeared in top models explaining host responses to Bd. Several of the species examined are highly understudied with respect to Bd such that this study represents the first experimental susceptibility data. Combining insights gained from experimental studies with observations of landscape-level disease prevalence may help explain current and predict future pathogen dynamics in the Bd system.
Cvicek, Vaclav; Goddard, William A.; Abrol, Ravinder
2016-01-01
The understanding of G-protein coupled receptors (GPCRs) is undergoing a revolution due to increased information about their signaling and the experimental determination of structures for more than 25 receptors. The availability of at least one receptor structure for each of the GPCR classes, well separated in sequence space, enables an integrated superfamily-wide analysis to identify signatures involving the role of conserved residues, conserved contacts, and downstream signaling in the context of receptor structures. In this study, we align the transmembrane (TM) domains of all experimental GPCR structures to maximize the conserved inter-helical contacts. The resulting superfamily-wide GpcR Sequence-Structure (GRoSS) alignment of the TM domains for all human GPCR sequences is sufficient to generate a phylogenetic tree that correctly distinguishes all different GPCR classes, suggesting that the class-level differences in the GPCR superfamily are encoded at least partly in the TM domains. The inter-helical contacts conserved across all GPCR classes describe the evolutionarily conserved GPCR structural fold. The corresponding structural alignment of the inactive and active conformations, available for a few GPCRs, identifies activation hot-spot residues in the TM domains that get rewired upon activation. Many GPCR mutations, known to alter receptor signaling and cause disease, are located at these conserved contact and activation hot-spot residue positions. The GRoSS alignment places the chemosensory receptor subfamilies for bitter taste (TAS2R) and pheromones (Vomeronasal, VN1R) in the rhodopsin family, known to contain the chemosensory olfactory receptor subfamily. The GRoSS alignment also enables the quantification of the structural variability in the TM regions of experimental structures, useful for homology modeling and structure prediction of receptors. Furthermore, this alignment identifies structurally and functionally important residues in all human GPCRs. These residues can be used to make testable hypotheses about the structural basis of receptor function and about the molecular basis of disease-associated single nucleotide polymorphisms. PMID:27028541
Refinement of Representation Theorems for Context-Free Languages
NASA Astrophysics Data System (ADS)
Fujioka, Kaoru
In this paper, we obtain refinements of representation theorems for context-free languages by using Dyck languages, insertion systems, strictly locally testable languages, and morphisms. For instance, we improve the Chomsky-Schützenberger representation theorem and show that each context-free language L can be represented in the form L = h(D ∩ R), where D is a Dyck language, R is a strictly 3-testable language, and h is a morphism. A similar representation for context-free languages can be obtained using insertion systems of weight (3, 0) and strictly 4-testable languages.
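The two ingredients of such representations can be sketched as membership checkers: a Dyck-language check over two bracket pairs and a strictly k-testable check defined by allowed prefixes, suffixes, and interior factors. These checkers illustrate the definitions only and do not reproduce the construction (the particular D, R, and morphism h) used in the theorem.

def in_dyck(word, pairs=(("a", "A"), ("b", "B"))):
    # Dyck language over two bracket pairs: 'a'/'b' open, 'A'/'B' close the matching bracket.
    opens = {o: c for o, c in pairs}
    closes = {c: o for o, c in pairs}
    stack = []
    for ch in word:
        if ch in opens:
            stack.append(ch)
        elif ch in closes:
            if not stack or stack.pop() != closes[ch]:
                return False
        else:
            return False
    return not stack

def strictly_k_testable(word, k, prefixes, suffixes, factors):
    # One common formulation: membership depends only on the length-(k-1) prefix
    # and suffix and the set of length-k factors of the word.
    if len(word) < k:
        return word in prefixes and word in suffixes
    return (word[:k - 1] in prefixes and word[-(k - 1):] in suffixes
            and all(word[i:i + k] in factors for i in range(len(word) - k + 1)))

print(in_dyck("abBA"), in_dyck("aAbB"), in_dyck("aBbA"))               # True True False
print(strictly_k_testable("abab", 2, {"a"}, {"b"}, {"ab", "ba"}),
      strictly_k_testable("abba", 2, {"a"}, {"b"}, {"ab", "ba"}))      # True False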
An empirical comparison of a dynamic software testability metric to static cyclomatic complexity
NASA Technical Reports Server (NTRS)
Voas, Jeffrey M.; Miller, Keith W.; Payne, Jeffrey E.
1993-01-01
This paper compares the dynamic testability prediction technique termed 'sensitivity analysis' to the static testability technique termed cyclomatic complexity. The application that we chose in this empirical study is a CASE generated version of a B-737 autoland system. For the B-737 system we analyzed, we isolated those functions that we predict are more prone to hide errors during system/reliability testing. We also analyzed the code with several other well-known static metrics. This paper compares and contrasts the results of sensitivity analysis to the results of the static metrics.
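For reference, the static metric used in the comparison is straightforward to compute from a control-flow graph as V(G) = E - N + 2P; the graph below is a hypothetical routine with one branch and one loop, not part of the B-737 autoland code.

def cyclomatic_complexity(edges, nodes, components=1):
    # V(G) = E - N + 2P for E edges, N nodes, P connected components.
    return len(edges) - len(nodes) + 2 * components

# Control-flow graph of a toy routine with one if/else and one loop (hypothetical).
nodes = {"entry", "cond", "then", "else", "loop", "exit"}
edges = [("entry", "cond"), ("cond", "then"), ("cond", "else"),
         ("then", "loop"), ("else", "loop"),
         ("loop", "cond"), ("loop", "exit")]
print("V(G) =", cyclomatic_complexity(edges, nodes))   # 7 - 6 + 2 = 3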
Currie, Thomas E; Mace, Ruth
2011-04-12
Traditional investigations of the evolution of human social and political institutions trace their ancestry back to nineteenth century social scientists such as Herbert Spencer, and have concentrated on the increase in socio-political complexity over time. More recent studies of cultural evolution have been explicitly informed by Darwinian evolutionary theory and focus on the transmission of cultural traits between individuals. These two approaches to investigating cultural change are often seen as incompatible. However, we argue that many of the defining features and assumptions of 'Spencerian' cultural evolutionary theory represent testable hypotheses that can and should be tackled within a broader 'Darwinian' framework. In this paper we apply phylogenetic comparative techniques to data from Austronesian-speaking societies of Island South-East Asia and the Pacific to test hypotheses about the mode and tempo of human socio-political evolution. We find support for three ideas often associated with Spencerian cultural evolutionary theory: (i) political organization has evolved through a regular sequence of forms, (ii) increases in hierarchical political complexity have been more common than decreases, and (iii) political organization has co-evolved with the wider presence of hereditary social stratification.
Tramacere, Antonella; Pievani, Telmo; Ferrari, Pier F
2017-08-01
Considering the properties of mirror neurons (MNs) in terms of development and phylogeny, we offer a novel, unifying, and testable account of their evolution according to the available data and try to unify apparently discordant research, including the plasticity of MNs during development, their adaptive value and their phylogenetic relationships and continuity. We hypothesize that the MN system reflects a set of interrelated traits, each with an independent natural history due to unique selective pressures, and propose that there are at least three evolutionarily significant trends that gave rise to three subtypes: hand visuomotor, mouth visuomotor, and audio-vocal. Specifically, we put forward a mosaic evolution hypothesis, which posits that different types of MNs may have evolved at different rates within and among species. This evolutionary hypothesis represents an alternative to both adaptationist and associative models. Finally, the review offers a strong heuristic potential in predicting the circumstances under which specific variations and properties of MNs are expected. Such predictive value is critical to test new hypotheses about MN activity and its plastic changes, depending on the species, the neuroanatomical substrates, and the ecological niche. © 2016 Cambridge Philosophical Society.
A systematic survey of lipids across mouse tissues
Jain, Mohit; Ngoy, Soeun; Sheth, Sunil A.; Swanson, Raymond A.; Rhee, Eugene P.; Liao, Ronglih; Clish, Clary B.; Mootha, Vamsi K.
2014-01-01
Lipids are a diverse collection of macromolecules essential for normal physiology, but the tissue distribution and function for many individual lipid species remain unclear. Here, we report a mass spectrometry survey of lipid abundance across 18 mouse tissues, detecting ∼1,000 mass spectrometry features, of which we identify 179 lipids from the glycerolipids, glycerophospholipids, lysophospholipids, acylcarnitines, sphingolipids, and cholesteryl ester classes. Our data reveal tissue-specific organization of lipids and can be used to generate testable hypotheses. For example, our data indicate that circulating triglycerides positively and negatively associated with future diabetes in humans are enriched in mouse adipose tissue and liver, respectively, raising hypotheses regarding the tissue origins of these diabetes-associated lipids. We also integrate our tissue lipid data with gene expression profiles to predict a number of substrates of lipid-metabolizing enzymes, highlighting choline phosphotransferases and sterol O-acyltransferases. Finally, we identify several tissue-specific lipids not present in plasma under normal conditions that may be of interest as biomarkers of tissue injury, and we show that two of these lipids are released into blood following ischemic brain injury in mice. This resource complements existing compendia of tissue gene expression and may be useful for integrative physiology and lipid biology. PMID:24518676
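A minimal sketch of the integration step described above: correlating a lipid's abundance profile across tissues with the expression profiles of candidate lipid-metabolizing enzymes to nominate enzyme-substrate pairs. The tissue values and the enzyme and lipid names are placeholders, not the published measurements.

import numpy as np
from scipy.stats import spearmanr

tissues = ["liver", "heart", "brain", "adipose", "kidney", "muscle"]
lipid_abundance = {"TAG(52:2)": [8.1, 2.3, 1.1, 9.4, 2.0, 1.5]}       # hypothetical lipid profile
enzyme_expression = {
    "Chpt1_like": [7.9, 2.1, 1.3, 9.1, 2.2, 1.8],                     # hypothetical enzyme profiles
    "Soat_like":  [1.0, 6.5, 5.9, 1.2, 6.1, 5.5],
}

# A high rank correlation across tissues nominates the enzyme as a candidate
# acting on (or producing) the lipid, to be tested experimentally.
for lipid, profile in lipid_abundance.items():
    for enzyme, expr in enzyme_expression.items():
        rho, p = spearmanr(profile, expr)
        print(f"{lipid} vs {enzyme}: rho = {rho:.2f}, p = {p:.3f}")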
The arsenic exposure hypothesis for Alzheimer disease.
Gong, Gordon; O'Bryant, Sid E
2010-01-01
Prior research has shown that arsenic exposure induces changes that coincide with most of the developmental, biochemical, pathologic, and clinical features of Alzheimer disease (AD) and associated disorders. On the basis of this literature, we propose the Arsenic Exposure Hypothesis for AD that is inclusive of and cooperative with the existing hypotheses. Arsenic toxicity induces hyperphosphorylation of protein tau and overtranscription of the amyloid precursor protein, which are involved in the formation of neurofibrillary tangles and brain amyloid plaques, consistent with the amyloid hypothesis of AD. Arsenic exposure has been associated with cardiovascular diseases and associated risk factors, which is in agreement with the vascular hypothesis of AD. Arsenic exposure invokes brain inflammatory responses, which resonates with the inflammatory hypotheses of AD. Arsenic exposure has been linked to reduced memory and intellectual abilities in children and adolescents, which provides a biologic basis for the developmental origin of health and disease hypothesis for AD. Arsenic and its metabolites generate free radicals causing oxidative stress and neuronal death, which fits the existing oxidative stress hypothesis. Taken together, the arsenic exposure hypothesis for AD provides a parsimonious testable hypothesis for the development and progression of this devastating disease at least for some subsets of individuals.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Loehle, C.
1994-05-01
The three great myths, which form a sort of triumvirate of misunderstanding, are the Eureka! myth, the hypothesis myth, and the measurement myth. These myths are prevalent among scientists as well as among observers of science. The Eureka! myth asserts that discovery occurs as a flash of insight, and as such is not subject to investigation. This leads to the perception that discovery or deriving a hypothesis is a moment or event rather than a process. Events are singular and not subject to description. The hypothesis myth asserts that proper science is motivated by testing hypotheses, and that if something is not experimentally testable then it is not scientific. This myth leads to absurd posturing by some workers conducting empirical descriptive studies, who dress up their study with a "hypothesis" to obtain funding or get it published. Methods papers are often rejected because they do not address a specific scientific problem. The fact is that many of the great breakthroughs in science involve methods and not hypotheses, or arise from largely descriptive studies. Those captured by this myth also try to block funding for those developing methods. The third myth is the measurement myth, which holds that determining what to measure is straightforward, so one doesn't need a lot of introspection to do science. As one ecologist put it to me, "Don't give me any of that philosophy junk, just let me out in the field. I know what to measure." These myths lead to difficulties for scientists who must face peer review to obtain funding and to get published. These myths also inhibit the study of science as a process. Finally, these myths inhibit creativity and suppress innovation. In this paper I first explore these myths in more detail and then propose a new model of discovery that opens the supposedly miraculous process of discovery to closer scrutiny.
NASA Astrophysics Data System (ADS)
Xu, Jin-Shi; Li, Chuan-Feng; Guo, Guang-Can
2016-11-01
In 1935, Einstein, Podolsky and Rosen published their influential paper proposing a now famous paradox (the EPR paradox) that threw doubt on the completeness of quantum mechanics. Two fundamental concepts, entanglement and steering, were introduced in Schrödinger's response to the EPR paper; both reflect the nonlocal nature of quantum mechanics. In 1964, John Bell obtained an experimentally testable inequality whose violation contradicts the prediction of local hidden variable models and agrees with that of quantum mechanics. Since then, great efforts have been made to experimentally investigate the nonlocal features of quantum mechanics, and many distinctive quantum properties have been observed. In this work, along with a discussion of the development of quantum nonlocality, we focus on our recent experimental efforts in investigating quantum correlations and their applications with optical systems, including the study of the entanglement-assisted entropic uncertainty principle, Einstein-Podolsky-Rosen steering, and the dynamics of quantum correlations.
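The experimentally testable inequality referred to here is often used in its CHSH form, |S| ≤ 2 for any local hidden variable model, whereas quantum mechanics predicts up to 2√2 for the singlet state; the sketch below evaluates the quantum prediction E(a, b) = -cos(a - b) at the standard measurement angles.

import numpy as np

def correlation(angle_a, angle_b):
    # Quantum prediction for the singlet state: E(a, b) = -cos(a - b).
    return -np.cos(angle_a - angle_b)

a, a_prime = 0.0, np.pi / 2
b, b_prime = np.pi / 4, 3 * np.pi / 4

S = (correlation(a, b) - correlation(a, b_prime)
     + correlation(a_prime, b) + correlation(a_prime, b_prime))
print(f"|S| = {abs(S):.3f} (local bound 2, quantum bound {2 * np.sqrt(2):.3f})")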
Econometrics of exhaustible resource supply: a theory and an application
DOE Office of Scientific and Technical Information (OSTI.GOV)
Epple, D.
1983-01-01
This report takes a major step toward developing a fruitful approach to empirical analysis of resource supply. It is the first empirical application of resource theory that has successfully integrated the effects of depletion of nonrenewable resources with the effects of uncertainty about future costs and prices on supply behavior. Thus, the model is a major improvement over traditional engineering-optimization models that assume complete certainty, and over traditional econometric models that are only implicitly related to the theory of resource supply. The model is used to test hypotheses about interdependence of oil and natural gas discoveries, depletion, ultimate recovery, and the role of price expectations. This paper demonstrates the feasibility of using exhaustible resource theory in the development of empirically testable models. 19 refs., 1 fig., 5 tabs.
Brain Evolution and Human Neuropsychology: The Inferential Brain Hypothesis
Koscik, Timothy R.; Tranel, Daniel
2013-01-01
Collaboration between human neuropsychology and comparative neuroscience has generated invaluable contributions to our understanding of human brain evolution and function. Further cross-talk between these disciplines has the potential to continue to revolutionize these fields. Modern neuroimaging methods could be applied in a comparative context, yielding exciting new data with the potential of providing insight into brain evolution. Conversely, incorporating an evolutionary base into the theoretical perspectives from which we approach human neuropsychology could lead to novel hypotheses and testable predictions. In the spirit of these objectives, we present here a new theoretical proposal, the Inferential Brain Hypothesis, whereby the human brain is thought to be characterized by a shift from perceptual processing to inferential computation, particularly within the social realm. This shift is believed to be a driving force for the evolution of the large human cortex. PMID:22459075
Injured Brains and Adaptive Networks: The Benefits and Costs of Hyperconnectivity.
Hillary, Frank G; Grafman, Jordan H
2017-05-01
A common finding in human functional brain-imaging studies is that damage to neural systems paradoxically results in enhanced functional connectivity between network regions, a phenomenon commonly referred to as 'hyperconnectivity'. Here, we describe the various ways that hyperconnectivity operates to benefit a neural network following injury while simultaneously negotiating the trade-off between metabolic cost and communication efficiency. Hyperconnectivity may be optimally expressed by increasing connections through the most central and metabolically efficient regions (i.e., hubs). While adaptive in the short term, we propose that chronic hyperconnectivity may leave network hubs vulnerable to secondary pathological processes over the life span due to chronically elevated metabolic stress. We conclude by offering novel, testable hypotheses for advancing our understanding of the role of hyperconnectivity in systems-level brain plasticity in neurological disorders. Copyright © 2017 Elsevier Ltd. All rights reserved.
Exploring Culturally Based Intrafamilial Stressors Among Latino Adolescents
Cordova, David; Ciofu, Amanda; Cervantes, Richard
2014-01-01
Despite the profound impact that intrafamilial stressors, including parent–adolescent acculturation discrepancies, may have on Latino adolescent behavioral and mental health, this line of research remains underdeveloped. The purpose of this study is to obtain rich descriptions from Latino adolescents of the most salient intrafamilial stressors. The authors employ focus group methodology with a grounded theory approach. A total of 25 focus groups were conducted with 170 Latino adolescents in the Northeast and Southwest United States. Findings indicate that Latino adolescents experience significant stressors related to parent–adolescent acculturation discrepancies. From this qualitative study the authors derive a series of testable hypotheses aimed at fully understanding the role of parent–adolescent acculturation discrepancies on Latino adolescent behavioral and mental health and informing the development of culturally responsive preventive interventions for this population. PMID:25530653
The biology and polymer physics underlying large‐scale chromosome organization
2017-01-01
Chromosome large‐scale organization is a beautiful example of the interplay between physics and biology. DNA molecules are polymers and thus belong to the class of molecules for which physicists have developed models and formulated testable hypotheses to understand their arrangement and dynamic properties in solution, based on the principles of polymer physics. Biologists documented and discovered the biochemical basis for the structure, function and dynamic spatial organization of chromosomes in cells. The underlying principles of chromosome organization have recently been revealed in unprecedented detail using high‐resolution chromosome capture technology that can simultaneously detect chromosome contact sites throughout the genome. These independent lines of investigation have now converged on a model in which DNA loops, generated by the loop extrusion mechanism, are the basic organizational and functional units of the chromosome. PMID:29105235
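The loop extrusion mechanism invoked above can be caricatured in a few lines: an extruder loads at a random position and its two legs translocate outward until they reach boundary elements, defining a loop. The lattice size, boundary spacing, and the deterministic stopping rule are illustrative simplifications, not a faithful polymer model.

import random

random.seed(0)
n_sites = 1000
boundaries = set(range(0, n_sites, 120))         # hypothetical CTCF-like boundary elements

def extrude_loop():
    # Load both legs at one site, then move them outward until each hits a boundary.
    left = right = random.randrange(n_sites)
    while left > 0 and left not in boundaries:
        left -= 1
    while right < n_sites - 1 and right not in boundaries:
        right += 1
    return left, right

loops = [extrude_loop() for _ in range(5)]
print([(lo, hi, hi - lo) for lo, hi in loops])   # (start, end, loop size)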
Testability analysis on a hydraulic system in a certain equipment based on simulation model
NASA Astrophysics Data System (ADS)
Zhang, Rui; Cong, Hua; Liu, Yuanhong; Feng, Fuzhou
2018-03-01
To address the structural complexity of hydraulic systems and the shortage of fault statistics, a multi-valued testability analysis method based on a simulation model is proposed. Using an AMESim simulation model, the method injects simulated faults and records the variation of test parameters, such as pressure and flow rate, at each test point relative to normal operating conditions. A multi-valued fault-test dependency matrix is thereby established, from which the fault detection rate (FDR) and fault isolation rate (FIR) are calculated. The testability and fault diagnosis capability of the system are then analyzed and evaluated, reaching only 54% (FDR) and 23% (FIR). To improve the testability of the system, the number and positions of the test points are optimized. Results show that the proposed test placement scheme can reduce the difficulty, inefficiency, and high cost of system maintenance.
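The two figures of merit can be computed directly from a fault-test dependency matrix; in the sketch below a fault counts as detected if at least one test responds to it and as isolated if its test signature is unique among detected faults (definitions vary, so this is one common convention), and the matrix itself is a toy example rather than the hydraulic system.

import numpy as np

# Rows = faults, columns = test points (1 = the test responds to the fault).
D = np.array([[1, 0, 1, 0],
              [1, 0, 1, 0],   # same signature as fault 0 -> detected but not isolable
              [0, 1, 0, 0],
              [0, 0, 0, 0],   # never detected
              [0, 1, 1, 0]])

detected = D.any(axis=1)
signatures = [tuple(row) for row in D]
isolated = [det and signatures.count(sig) == 1 for det, sig in zip(detected, signatures)]

fdr = detected.mean()
fir = sum(isolated) / detected.sum()
print(f"FDR = {fdr:.0%}, FIR = {fir:.0%}")   # FDR = 80%, FIR = 50% for this toy matrix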
Rozier, Kelvin; Bondarenko, Vladimir E
2017-05-01
The β1- and β2-adrenergic signaling systems play different roles in the functioning of cardiac cells. Experimental data show that the activation of the β1-adrenergic signaling system produces significant inotropic, lusitropic, and chronotropic effects in the heart, whereas the effects of the β2-adrenergic signaling system are less apparent. In this paper, a comprehensive compartmentalized experimentally based mathematical model of the combined β1- and β2-adrenergic signaling systems in mouse ventricular myocytes is developed to simulate the experimental findings and make testable predictions of the behavior of the cardiac cells under different physiological conditions. Simulations describe the dynamics of major signaling molecules in different subcellular compartments; kinetics and magnitudes of phosphorylation of ion channels, transporters, and Ca²⁺ handling proteins; modifications of action potential shape and duration; and [Ca²⁺]i and [Na⁺]i dynamics upon stimulation of β1- and β2-adrenergic receptors (β1- and β2-ARs). The model reveals physiological conditions under which β2-ARs do not produce significant physiological effects and those under which their effects can be measured experimentally. Simulations demonstrated that stimulation of β2-ARs with isoproterenol caused a marked increase in the magnitude of the L-type Ca²⁺ current, [Ca²⁺]i transient, and phosphorylation of phospholamban only upon additional application of pertussis toxin or inhibition of phosphodiesterases of types 3 and 4. The model also made testable predictions of the changes in magnitudes of [Ca²⁺]i and [Na⁺]i fluxes, the rate of decay of [Na⁺]i concentration upon both combined and separate stimulation of β1- and β2-ARs, and the contribution of phosphorylation of PKA targets to the changes in the action potential and [Ca²⁺]i transient. Copyright © 2017 the American Physiological Society.
Larval transport modeling of deep-sea invertebrates can aid the search for undiscovered populations.
Yearsley, Jon M; Sigwart, Julia D
2011-01-01
Many deep-sea benthic animals occur in patchy distributions separated by thousands of kilometres, yet because deep-sea habitats are remote, little is known about their larval dispersal. Our novel method simulates dispersal by combining data from the Argo array of autonomous oceanographic probes, deep-sea ecological surveys, and comparative invertebrate physiology. The predicted particle tracks allow quantitative, testable predictions about the dispersal of benthic invertebrate larvae in the south-west Pacific. In a test case presented here, using non-feeding, non-swimming (lecithotrophic trochophore) larvae of polyplacophoran molluscs (chitons), we show that the likely dispersal pathways in a single generation are significantly shorter than the distances between the three known population centres in our study region. The large-scale density of chiton populations throughout our study region is potentially much greater than present survey data suggest, with intermediate 'stepping stone' populations yet to be discovered. We present a new method that is broadly applicable to studies of the dispersal of deep-sea organisms. This test case demonstrates the power and potential applications of our new method, in generating quantitative, testable hypotheses at multiple levels to solve the mismatch between observed and expected distributions: probabilistic predictions of locations of intermediate populations, potential alternative dispersal mechanisms, and expected population genetic structure. The global Argo data have never previously been used to address benthic biology, and our method can be applied to any non-swimming larvae of the deep-sea, giving information upon dispersal corridors and population densities in habitats that remain intrinsically difficult to assess.
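A minimal sketch of the Lagrangian particle tracking that underlies such dispersal simulations: passive particles are advected in a horizontal velocity field with a random-walk term for unresolved motion. The velocity field, diffusivity, and durations are placeholders, not the Argo-derived fields or larval durations used in the study.

import numpy as np

rng = np.random.default_rng(0)
n_particles, n_days, dt = 200, 30, 1.0           # dt in days
x = np.zeros(n_particles)                        # km, all particles start at the source population
y = np.zeros(n_particles)

def velocity(x, y):
    # Idealized eastward jet with a weak meandering meridional component (km/day).
    return 5.0 + 0.01 * y, 0.5 * np.sin(x / 50.0)

kappa = 10.0                                     # eddy diffusivity, km^2/day
for _ in range(n_days):
    u, v = velocity(x, y)
    x += u * dt + np.sqrt(2 * kappa * dt) * rng.normal(size=n_particles)
    y += v * dt + np.sqrt(2 * kappa * dt) * rng.normal(size=n_particles)

print(f"median along-stream displacement after {n_days} days: {np.median(x):.0f} km")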
Lift and drag in three-dimensional steady viscous and compressible flow
NASA Astrophysics Data System (ADS)
Liu, L. Q.; Wu, J. Z.; Su, W. D.; Kang, L. L.
2017-11-01
In a recent paper, Liu, Zhu, and Wu ["Lift and drag in two-dimensional steady viscous and compressible flow," J. Fluid Mech. 784, 304-341 (2015)] present a force theory for a body in a two-dimensional, viscous, compressible, and steady flow. In this companion paper, we do the same for three-dimensional flows. Using the fundamental solution of the linearized Navier-Stokes equations, we improve the force formula for incompressible flows originally derived by Goldstein in 1931 and summarized by Milne-Thomson in 1968, both being far from complete, to its perfect final form, which is further proved to be universally true from subsonic to supersonic flows. We call this result the unified force theorem, which states that the forces are always determined by the vector circulation Γϕ of longitudinal velocity and the scalar inflow Qψ of transverse velocity. Since this theorem is not directly observable either experimentally or computationally, a testable version is also derived, which, however, holds only in the linear far field. We name this version the testable unified force formula. After that, a general principle to increase the lift-drag ratio is proposed.
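Schematically, and only as a reminder of the classical structure the theorem generalizes, the far-field force relations can be sketched as below. This 2-D-style scalar form is consistent with the abstract's statement that lift and drag are set by Γφ and Qψ, but it is not the paper's exact three-dimensional expression.

```latex
% Schematic far-field sketch: lift set by the circulation of the longitudinal
% velocity, drag by the inflow of the transverse velocity (rho = free-stream
% density, U = oncoming flow speed). Illustrative only; see the cited papers
% for the exact 3-D vector expressions.
\begin{equation}
  L \simeq \rho\, U\, \Gamma_{\phi}, \qquad D \simeq \rho\, U\, Q_{\psi}.
\end{equation}
```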
Left-right asymmetry specification in amphioxus: review and prospects.
Soukup, Vladimir
2017-01-01
Extant bilaterally symmetrical animals usually show asymmetry in the arrangement of their inner organs. However, the exaggerated left-right (LR) asymmetry in amphioxus represents a true peculiarity among them. The amphioxus larva shows completely disparate fates of left and right body sides, so that organs associated with pharynx are either positioned exclusively on the left or on the right side. Moreover, segmented paraxial structures such as muscle blocks and their neuronal innervation show offset arrangement between the sides making it difficult to propose any explanation or adaptivity to larval and adult life. First LR asymmetries can be traced back to an early embryonic period when morphological asymmetries are preceded by molecular asymmetries driven by the action of the Nodal signaling pathway. This review sums up recent advances in understanding LR asymmetry specification in amphioxus and proposes upstream events that may regulate asymmetric Nodal signaling. These events include the presence of the vertebrate-like LR organizer and a cilia-driven fluid flow that may be involved in the breaking of bilateral symmetry. The upstream pathways comprising the ion flux, Delta/Notch, Wnt/β-catenin and Wnt/PCP are hypothesized to regulate both formation of the LR organizer and expression of the downstream Nodal signaling pathway genes. These suggestions are in line with what we know from vertebrate and ambulacrarian LR axis specification and are directly testable by experimental manipulations. Thanks to the phylogenetic position of amphioxus, the proposed mechanisms may be helpful in understanding the evolution of LR axis specification across deuterostomes.
A transcriptional serenAID: the role of noncoding RNAs in class switch recombination
Yewdell, William T.; Chaudhuri, Jayanta
2017-01-01
During an immune response, activated B cells may undergo class switch recombination (CSR), a molecular rearrangement that allows B cells to switch from expressing IgM and IgD to a secondary antibody heavy chain isotype such as IgG, IgA or IgE. Secondary antibody isotypes provide the adaptive immune system with distinct effector functions to optimally combat various pathogens. CSR occurs between repetitive DNA elements within the immunoglobulin heavy chain (Igh) locus, termed switch (S) regions and requires the DNA-modifying enzyme activation-induced cytidine deaminase (AID). AID-mediated DNA deamination within S regions initiates the formation of DNA double-strand breaks, which serve as biochemical beacons for downstream DNA repair pathways that coordinate the ligation of DNA breaks. Myriad factors contribute to optimal AID targeting; however, many of these factors also localize to genomic regions outside of the Igh locus. Thus, a current challenge is to explain the specific targeting of AID to the Igh locus. Recent studies have implicated noncoding RNAs in CSR, suggesting a provocative mechanism that incorporates Igh-specific factors to enable precise AID targeting. Here, we chronologically recount the rich history of noncoding RNAs functioning in CSR to provide a comprehensive context for recent and future discoveries. We present a model for the RNA-guided targeting of AID that attempts to integrate historical and recent findings, and highlight potential caveats. Lastly, we discuss testable hypotheses ripe for current experimentation, and explore promising ideas for future investigations. PMID:28535205
Rajeev, Lara; Luning, Eric G; Dehal, Paramvir S; Price, Morgan N; Arkin, Adam P; Mukhopadhyay, Aindrila
2011-10-12
Two component regulatory systems are the primary form of signal transduction in bacteria. Although genomic binding sites have been determined for several eukaryotic and bacterial transcription factors, comprehensive identification of gene targets of two component response regulators remains challenging due to the lack of knowledge of the signals required for their activation. We focused our study on Desulfovibrio vulgaris Hildenborough, a sulfate reducing bacterium that encodes unusually diverse and largely uncharacterized two component signal transduction systems. We report the first systematic mapping of the genes regulated by all transcriptionally acting response regulators in a single bacterium. Our results enabled functional predictions for several response regulators and include key processes of carbon, nitrogen and energy metabolism, cell motility and biofilm formation, and responses to stresses such as nitrite, low potassium and phosphate starvation. Our study also led to the prediction of new genes and regulatory networks, which found corroboration in a compendium of transcriptome data available for D. vulgaris. For several regulators we predicted and experimentally verified the binding site motifs, most of which were discovered as part of this study. The gene targets identified for the response regulators allowed strong functional predictions to be made for the corresponding two component systems. By tracking the D. vulgaris regulators and their motifs outside the Desulfovibrio spp. we provide testable hypotheses regarding the functions of orthologous regulators in other organisms. The in vitro array based method optimized here is generally applicable for the study of such systems in all organisms.
Representing high throughput expression profiles via perturbation barcodes reveals compound targets.
Filzen, Tracey M; Kutchukian, Peter S; Hermes, Jeffrey D; Li, Jing; Tudor, Matthew
2017-02-01
High throughput mRNA expression profiling can be used to characterize the response of cell culture models to perturbations such as pharmacologic modulators and genetic perturbations. As profiling campaigns expand in scope, it is important to homogenize, summarize, and analyze the resulting data in a manner that captures significant biological signals in spite of various noise sources such as batch effects and stochastic variation. We used the L1000 platform for large-scale profiling of 978 representative genes across thousands of compound treatments. Here, a method is described that uses deep learning techniques to convert the expression changes of the landmark genes into a perturbation barcode that reveals important features of the underlying data, performing better than the raw data in revealing important biological insights. The barcode captures compound structure and target information, and predicts a compound's high throughput screening promiscuity, to a higher degree than the original data measurements, indicating that the approach uncovers underlying factors of the expression data that are otherwise entangled or masked by noise. Furthermore, we demonstrate that visualizations derived from the perturbation barcode can be used to more sensitively assign functions to unknown compounds through a guilt-by-association approach, which we use to predict and experimentally validate the activity of compounds on the MAPK pathway. The demonstrated application of deep metric learning to large-scale chemical genetics projects highlights the utility of this and related approaches to the extraction of insights and testable hypotheses from big, sometimes noisy data.
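For intuition only, the snippet below builds a crude "barcode" by random projection and sign binarization of a landmark-gene expression-change vector. This is a deliberately simple stand-in: the paper's barcodes come from deep metric learning, and the profiles and dimensions here are synthetic.

```python
# Simplistic stand-in for a perturbation "barcode": project 978 landmark-gene
# expression changes into a low-dimensional space and binarize. The paper uses
# deep metric learning; this random-projection hash is only a conceptual toy.
import numpy as np

rng = np.random.default_rng(1)
n_landmarks, n_bits = 978, 64
projection = rng.normal(size=(n_landmarks, n_bits))   # fixed random projection

def barcode(expression_change):
    """Map a vector of landmark-gene expression changes to a 64-bit sign code."""
    return (expression_change @ projection > 0).astype(np.uint8)

def hamming(a, b):
    return int(np.sum(a != b))

profile_a = rng.normal(size=n_landmarks)               # synthetic profiles
profile_b = profile_a + rng.normal(scale=0.1, size=n_landmarks)
print("distance between similar perturbations:",
      hamming(barcode(profile_a), barcode(profile_b)))
```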
Representing high throughput expression profiles via perturbation barcodes reveals compound targets
Kutchukian, Peter S.; Li, Jing; Tudor, Matthew
2017-01-01
High throughput mRNA expression profiling can be used to characterize the response of cell culture models to perturbations such as pharmacologic modulators and genetic perturbations. As profiling campaigns expand in scope, it is important to homogenize, summarize, and analyze the resulting data in a manner that captures significant biological signals in spite of various noise sources such as batch effects and stochastic variation. We used the L1000 platform for large-scale profiling of 978 representative genes across thousands of compound treatments. Here, a method is described that uses deep learning techniques to convert the expression changes of the landmark genes into a perturbation barcode that reveals important features of the underlying data, performing better than the raw data in revealing important biological insights. The barcode captures compound structure and target information, and predicts a compound’s high throughput screening promiscuity, to a higher degree than the original data measurements, indicating that the approach uncovers underlying factors of the expression data that are otherwise entangled or masked by noise. Furthermore, we demonstrate that visualizations derived from the perturbation barcode can be used to more sensitively assign functions to unknown compounds through a guilt-by-association approach, which we use to predict and experimentally validate the activity of compounds on the MAPK pathway. The demonstrated application of deep metric learning to large-scale chemical genetics projects highlights the utility of this and related approaches to the extraction of insights and testable hypotheses from big, sometimes noisy data. PMID:28182661
Fabina, Nicholas S; Putnam, Hollie M; Franklin, Erik C; Stat, Michael; Gates, Ruth D
2013-11-01
Climate change-driven stressors threaten the persistence of coral reefs worldwide. Symbiotic relationships between scleractinian corals and photosynthetic endosymbionts (genus Symbiodinium) are the foundation of reef ecosystems, and these associations are differentially impacted by stress. Here, we couple empirical data from the coral reefs of Moorea, French Polynesia, with a network-theoretic modeling approach to evaluate how patterns in coral-Symbiodinium associations influence community stability under climate change. To introduce the effect of climate perturbations, we simulate local 'extinctions' that represent either the loss of coral species or the ability to engage in symbiotic interactions. Community stability is measured by determining the duration and number of species that persist through the simulated extinctions. Our results suggest that four factors greatly increase coral-Symbiodinium community stability in response to global changes: (i) the survival of generalist hosts and symbionts maximizes potential symbiotic unions; (ii) elevated symbiont diversity provides redundant or complementary symbiotic functions; (iii) compatible symbiotic assemblages create the potential for local recolonization; and (iv) the persistence of certain traits associated with symbiotic diversity and redundancy. Symbiodinium may facilitate coral persistence through novel environmental regimes, but this capacity is mediated by symbiotic specificity, association patterns, and the functional performance of the symbionts. Our model-based approach identifies general trends and testable hypotheses in coral-Symbiodinium community responses. Future studies should consider similar methods when community size and/or environmental complexity preclude experimental approaches. © 2013 John Wiley & Sons Ltd.
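The style of extinction simulation described can be sketched on a toy bipartite host-symbiont network: remove hosts, then count how many hosts and symbiont types persist. Both the associations and the removal rule below are invented for illustration; the study parameterizes its network with empirical data from Moorea.

```python
# Toy extinction simulation on a synthetic host-symbiont association network.
# The published model is parameterized with empirical data; here the links and
# the extinction rule are illustrative only.
import random

random.seed(2)
hosts = [f"coral_{i}" for i in range(10)]
symbionts = [f"sym_{j}" for j in range(6)]
# generalists get many partners, specialists few (random toy assignment)
links = {h: set(random.sample(symbionts, random.choice([1, 2, 4]))) for h in hosts}

def simulate_extinctions(links, n_removals=5):
    surviving = dict(links)
    for h in random.sample(list(surviving), n_removals):
        del surviving[h]                      # local "extinction" of a host
    persisting_symbionts = set().union(*surviving.values()) if surviving else set()
    return len(surviving), len(persisting_symbionts)

print("hosts, symbiont types persisting:", simulate_extinctions(links))
```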
Modeling the attenuation and failure of action potentials in the dendrites of hippocampal neurons.
Migliore, M
1996-01-01
We modeled two different mechanisms, a shunting conductance and a slow sodium inactivation, to test whether they could modulate the active propagation of a train of action potentials in a dendritic tree. Computer simulations, using a compartmental model of a pyramidal neuron, suggest that each of these two mechanisms could account for the activity-dependent attenuation and failure of the action potentials in the dendrites during the train. Each mechanism is shown to be in good qualitative agreement with experimental findings on somatic or dendritic stimulation and on the effects of hyperpolarization. The conditions under which branch point failures can be observed, and a few experimentally testable predictions, are presented and discussed. PMID:8913580
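A toy illustration of the activity-dependent attenuation mechanism (slow sodium inactivation) is sketched below: spike amplitude is taken to scale with the fraction of non-inactivated Na channels, which drops with each spike and recovers slowly between spikes. All parameters are arbitrary, and this is not the compartmental pyramidal-neuron model of the paper.

```python
# Toy illustration of activity-dependent spike attenuation via slow Na-channel
# inactivation: each spike removes a fraction of available channels, which
# recover slowly between spikes. Parameters are arbitrary, not from the paper.
import numpy as np

def spike_train_amplitudes(n_spikes=20, isi_ms=20.0, use_per_spike=0.15,
                           tau_recovery_ms=300.0):
    h = 1.0                      # fraction of non-inactivated Na channels
    amplitudes = []
    for _ in range(n_spikes):
        amplitudes.append(100.0 * h)             # spike amplitude ~ available Na (toy units)
        h *= (1.0 - use_per_spike)               # slow inactivation accumulates
        h = 1.0 - (1.0 - h) * np.exp(-isi_ms / tau_recovery_ms)   # partial recovery
    return np.array(amplitudes)

amps = spike_train_amplitudes()
print("first and last spike amplitude (toy units):", amps[0], round(amps[-1], 1))
```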
Generating Testable Questions in the Science Classroom: The BDC Model
ERIC Educational Resources Information Center
Tseng, ChingMei; Chen, Shu-Bi Shu-Bi; Chang, Wen-Hua
2015-01-01
Guiding students to generate testable scientific questions is essential in the inquiry classroom, but it is not easy. The purpose of the BDC ("Big Idea, Divergent Thinking, and Convergent Thinking") instructional model is to scaffold students' inquiry learning. We illustrate the use of this model with an example lesson, designed…
Easily Testable PLA-Based Finite State Machines
1989-03-01
…faults of types 1, 4 and 5 can be guaranteed to be testable via … PLATYPUS [20]. Then, justification paths are obtained from the STG using simple logic … a vector pair that is generated by the first corrupted next-state lines is found, if such a vector pair exists, using PLATYPUS [20].
1983-11-01
…compound operations, with status. (h) Pre-programmed CRC and double-precision multiply/divide algorithms. (i) Double-length accumulator with full…
Schneider, Sven; Diehl, Katharina
2016-05-01
The popularity of electronic cigarettes (e-cigarettes) among adolescents is growing worldwide. A more accurate model than the much discussed but inadequate Gateway Hypothesis is needed to explain some adolescents' initial preference for e-cigarettes over tobacco cigarettes, as well as any transition from e-cigarettes to tobacco smoking. Our aim was to summarize the diffuse fear that adolescents will be indirectly encouraged to begin smoking tobacco via the use of e-cigarettes and to systematize the disparate causal hypotheses used thus far in relevant literature. We summarized the vague and fragmented hypotheses formulated thus far in literature on both trajectories from abstinence to e-cigarette use and from there to tobacco smoking into a set of empirically testable hypotheses and organized them into a comprehensive model. Our results indicate that the perceived health risks, specific product characteristics (such as taste, price and inconspicuous use), and higher levels of acceptance among peers and others potentially make e-cigarettes initially more attractive to adolescents than tobacco cigarettes. Later, increasing familiarity with nicotine could lead to the reevaluation of both electronic and tobacco cigarettes and subsequently to a potential transition to tobacco smoking. The suggested "catalyst model" takes variations in the nicotine content of e-cigarettes as well as the dual use of different substances into account. Our model provides causal hypotheses for the initiation of e-cigarette use and for the potential transition to tobacco smoking which, after being tested in empirical studies, could lead to the formulation of concrete recommendations for healthcare intervention and prevention measures. We developed a model that provides causal hypotheses for the initiation of e-cigarette use and for the potential transition to tobacco smoking. © The Author 2015. Published by Oxford University Press on behalf of the Society for Research on Nicotine and Tobacco. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Insights into Mechanisms of Chronic Neurodegeneration
Diack, Abigail B.; Alibhai, James D.; Barron, Rona; Bradford, Barry; Piccardo, Pedro; Manson, Jean C.
2016-01-01
Chronic neurodegenerative diseases such as Alzheimer’s disease (AD), Parkinson’s disease (PD), and prion diseases are characterised by the accumulation of abnormal conformers of a host encoded protein in the central nervous system. The process leading to neurodegeneration is still poorly defined and thus development of early intervention strategies is challenging. Unique amongst these diseases are Transmissible Spongiform Encephalopathies (TSEs) or prion diseases, which have the ability to transmit between individuals. The infectious nature of these diseases has permitted in vivo and in vitro modelling of the time course of the disease process in a highly reproducible manner, thus early events can be defined. Recent evidence has demonstrated that the cell-to-cell spread of protein aggregates by a “prion-like mechanism” is common among the protein misfolding diseases. Thus, the TSE models may provide insights into disease mechanisms and testable hypotheses for disease intervention, applicable to a number of these chronic neurodegenerative diseases. PMID:26771599
Ethnic Enclaves and the Earnings of Immigrants
Xie, Yu; Gough, Margaret
2011-01-01
A large literature in sociology concerns the implications of immigrants’ participation in ethnic enclaves for their economic and social well-being. The “enclave thesis” speculates that immigrants benefit from working in ethnic enclaves. Previous research concerning the effects of enclave participation on immigrants’ economic outcomes has come to mixed conclusions as to whether enclave effects are positive or negative. In this article, we seek to extend and improve upon past work by formulating testable hypotheses based on the enclave thesis and testing them with data from the 2003 New Immigrant Survey (NIS), employing both residence-based and workplace-based measures of the ethnic enclave. We compare the economic outcomes of immigrants working in ethnic enclaves with those of immigrants working in the mainstream economy. Our research yields minimal support for the enclave thesis. Our results further indicate that for some immigrant groups, ethnic enclave participation actually has a negative effect on economic outcomes. PMID:21863367
Sea ice microorganisms: environmental constraints and extracellular responses.
Ewert, Marcela; Deming, Jody W
2013-03-28
Inherent to sea ice, like other high latitude environments, is the strong seasonality driven by changes in insolation throughout the year. Sea-ice organisms are exposed to shifting, sometimes limiting, conditions of temperature and salinity. An array of adaptations to survive these and other challenges has been acquired by those organisms that inhabit the ice. One key adaptive response is the production of extracellular polymeric substances (EPS), which play multiple roles in the entrapment, retention and survival of microorganisms in sea ice. In this concept paper we consider two main areas of sea-ice microbiology: the physico-chemical properties that define sea ice as a microbial habitat, imparting particular advantages and limits; and extracellular responses elicited in microbial inhabitants as they exploit or survive these conditions. Emphasis is placed on protective strategies used in the face of fluctuating and extreme environmental conditions in sea ice. Gaps in knowledge and testable hypotheses are identified for future research.
Similarities and differences between the Wnt and reelin pathways in the forming brain.
Reiner, Orly; Sapir, Tamar
2005-01-01
One of the key features in development is the reutilization of successful signaling pathways. Here, we emphasize the involvement of the Wnt pathway, one of the five kinds of signal transduction pathway predominating early embryonic development of all animals, in regulating the formation of brain structure. We discuss the interrelationships between the Wnt and reelin pathways in the regulation of cortical layering. We summarize data emphasizing key molecules, which, when mutated, result in abnormal brain development. This integrated view, which is based on conservation of pathways, reveals the relative position of participants in the pathway, points to control mechanisms, and allows raising testable working hypotheses. Nevertheless, although signaling pathways are highly conserved from flies to humans, the overall morphology is not. We propose that future studies directed at understanding of diversification will provide fruitful insights on mammalian brain formation.
The Puzzle of Male Chronophilias.
Seto, Michael C
2017-01-01
In this article, I return to the idea that pedophilia, a sexual interest in prepubescent children, can be considered a sexual orientation for age, in conjunction with the much more widely acknowledged and discussed sexual orientation for gender. Here, I broaden the scope to consider other chronophilias, referring to paraphilias for age/maturity categories other than young sexually mature adults. The puzzle of chronophilias includes questions about etiology and course, how chronophilias are related to each other, and what they can tell us about how human (male) sexuality is organized. In this article, I briefly review research on nepiophilia (infant/toddlers), pedophilia (prepubescent children), hebephilia (pubescent children), ephebophilia (postpubescent, sexually maturing adolescents), teleiophilia (young sexually mature adults, typically 20s and 30s), mesophilia (middle-aged adults, typically 40s and 50s), and gerontophilia (elderly adults, typically 60s and older) in the context of a multidimensional sexual orientations framework. Relevant research, limitations, and testable hypotheses for future work are identified.
Strength and Vulnerability Integration (SAVI): A Model of Emotional Well-Being Across Adulthood
Charles, Susan Turk
2010-01-01
The following paper presents the theoretical model of Strength and Vulnerability Integration (SAVI) to explain factors that influence emotion regulation and emotional well-being across adulthood. The model posits that trajectories of adult development are marked by age-related enhancement in the use of strategies that serve to avoid or limit exposure to negative stimuli, but age-related vulnerabilities in situations that elicit high levels of sustained emotional arousal. When older adults avoid or reduce exposure to emotional distress, they often respond better than younger adults; when they experience high levels of sustained emotional arousal, however, age-related advantages in emotional well-being are attenuated, and older adults are hypothesized to have greater difficulties returning to homeostasis. SAVI provides a testable model to understand the literature on emotion and aging and to predict trajectories of emotional experience across the adult life span. PMID:21038939
Strength and vulnerability integration: a model of emotional well-being across adulthood.
Charles, Susan Turk
2010-11-01
The following article presents the theoretical model of strength and vulnerability integration (SAVI) to explain factors that influence emotion regulation and emotional well-being across adulthood. The model posits that trajectories of adult development are marked by age-related enhancement in the use of strategies that serve to avoid or limit exposure to negative stimuli but by age-related vulnerabilities in situations that elicit high levels of sustained emotional arousal. When older adults avoid or reduce exposure to emotional distress, they often respond better than younger adults; when they experience high levels of sustained emotional arousal, however, age-related advantages in emotional well-being are attenuated, and older adults are hypothesized to have greater difficulties returning to homeostasis. SAVI provides a testable model to understand the literature on emotion and aging and to predict trajectories of emotional experience across the adult life span.
Modelling the molecular mechanisms of aging
Mc Auley, Mark T.; Guimera, Alvaro Martinez; Hodgson, David; Mcdonald, Neil; Mooney, Kathleen M.; Morgan, Amy E.
2017-01-01
The aging process is driven at the cellular level by random molecular damage that slowly accumulates with age. Although cells possess mechanisms to repair or remove damage, they are not 100% efficient and their efficiency declines with age. There are many molecular mechanisms involved and exogenous factors such as stress also contribute to the aging process. The complexity of the aging process has stimulated the use of computational modelling in order to increase our understanding of the system, test hypotheses and make testable predictions. As many different mechanisms are involved, a wide range of models have been developed. This paper gives an overview of the types of models that have been developed, the range of tools used, modelling standards and discusses many specific examples of models that have been grouped according to the main mechanisms that they address. We conclude by discussing the opportunities and challenges for future modelling in this field. PMID:28096317
An application of statistics to comparative metagenomics
Rodriguez-Brito, Beltran; Rohwer, Forest; Edwards, Robert A
2006-01-01
Background Metagenomics, sequence analyses of genomic DNA isolated directly from the environments, can be used to identify organisms and model community dynamics of a particular ecosystem. Metagenomics also has the potential to identify significantly different metabolic potential in different environments. Results Here we use a statistical method to compare curated subsystems, to predict the physiology, metabolism, and ecology from metagenomes. This approach can be used to identify those subsystems that are significantly different between metagenome sequences. Subsystems that were overrepresented in the Sargasso Sea and Acid Mine Drainage metagenome when compared to non-redundant databases were identified. Conclusion The methodology described herein applies statistics to the comparisons of metabolic potential in metagenomes. This analysis reveals those subsystems that are more, or less, represented in the different environments that are compared. These differences in metabolic potential lead to several testable hypotheses about physiology and metabolism of microbes from these ecosystems. PMID:16549025
An application of statistics to comparative metagenomics.
Rodriguez-Brito, Beltran; Rohwer, Forest; Edwards, Robert A
2006-03-20
Metagenomics, sequence analyses of genomic DNA isolated directly from the environments, can be used to identify organisms and model community dynamics of a particular ecosystem. Metagenomics also has the potential to identify significantly different metabolic potential in different environments. Here we use a statistical method to compare curated subsystems, to predict the physiology, metabolism, and ecology from metagenomes. This approach can be used to identify those subsystems that are significantly different between metagenome sequences. Subsystems that were overrepresented in the Sargasso Sea and Acid Mine Drainage metagenome when compared to non-redundant databases were identified. The methodology described herein applies statistics to the comparisons of metabolic potential in metagenomes. This analysis reveals those subsystems that are more, or less, represented in the different environments that are compared. These differences in metabolic potential lead to several testable hypotheses about physiology and metabolism of microbes from these ecosystems.
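The kind of subsystem comparison described above can be illustrated with a simple 2x2 test on hypothetical counts: reads assigned to one subsystem versus all other reads, in each of two metagenomes. The counts below are invented, and Fisher's exact test is used here only as a generic stand-in for the paper's statistical method.

```python
# Illustrative 2x2 comparison of one subsystem's representation in two
# metagenomes (hypothetical counts, not data from the paper).
from scipy.stats import fisher_exact

sargasso = [120, 9880]    # [reads in subsystem, reads elsewhere], hypothetical
acid_mine = [40, 9960]    # hypothetical

odds_ratio, p_value = fisher_exact([sargasso, acid_mine])
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3g}")
```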
The biology and polymer physics underlying large-scale chromosome organization.
Sazer, Shelley; Schiessel, Helmut
2018-02-01
Chromosome large-scale organization is a beautiful example of the interplay between physics and biology. DNA molecules are polymers and thus belong to the class of molecules for which physicists have developed models and formulated testable hypotheses to understand their arrangement and dynamic properties in solution, based on the principles of polymer physics. Biologists documented and discovered the biochemical basis for the structure, function and dynamic spatial organization of chromosomes in cells. The underlying principles of chromosome organization have recently been revealed in unprecedented detail using high-resolution chromosome capture technology that can simultaneously detect chromosome contact sites throughout the genome. These independent lines of investigation have now converged on a model in which DNA loops, generated by the loop extrusion mechanism, are the basic organizational and functional units of the chromosome. © 2017 The Authors. Traffic published by John Wiley & Sons Ltd.
Flett, Gordon L; Hewitt, Paul L
2006-07-01
This article reviews the concepts of positive and negative perfectionism and the dual process model of perfectionism outlined by Slade and Owens (1998). The authors acknowledge that the dual process model represents a conceptual advance in the study of perfectionism and that Slade and Owens should be commended for identifying testable hypotheses and future research directions. However, the authors take issue with the notion that there are two types of perfectionism, with one type of perfectionism representing a "normal" or "healthy" form of perfectionism. They suggest that positive perfectionism is motivated, at least in part, by an avoidance orientation and fear of failure, and recent attempts to define and conceptualize positive perfectionism may have blurred the distinction between perfectionism and conscientiousness. Research findings that question the adaptiveness of positive forms of perfectionism are highlighted, and key issues for future research are identified.
Quantitative Measurements of Autobiographical Memory Content
Mainetti, Matteo; Ascoli, Giorgio A.
2012-01-01
Autobiographical memory (AM), subjective recollection of past experiences, is fundamental in everyday life. Nevertheless, characterization of the spontaneous occurrence of AM, as well as of the number and types of recollected details, remains limited. The CRAM (Cue-Recalled Autobiographical Memory) test (http://cramtest.info) adapts and combines the cue-word method with an assessment that collects counts of details recalled from different life periods. The SPAM (Spontaneous Probability of Autobiographical Memories) protocol samples introspection during everyday activity, recording memory duration and frequency. These measures provide detailed, naturalistic accounts of AM content and frequency, quantifying essential dimensions of recollection. AM content (∼20 details/recollection) decreased with the age of the episode, but less drastically than the probability of reporting remote compared to recent memories. AM retrieval was frequent (∼20/hour), each memory lasting ∼30 seconds. Testable hypotheses of the specific content retrieved in a fixed time from given life periods are presented. PMID:23028629
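Taken at face value, the reported rates imply a substantial share of waking time spent in recollection; a back-of-the-envelope figure follows, using only the approximate values quoted above.

```latex
% Rough implication of the reported figures (~20 retrievals/hour, ~30 s each):
\begin{equation}
  20\ \tfrac{\text{retrievals}}{\text{hour}} \times 30\ \tfrac{\text{s}}{\text{retrieval}}
  = 600\ \tfrac{\text{s}}{\text{hour}} \approx 10\ \text{min per hour}.
\end{equation}
```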
Simpson, Eleanor H.; Kellendonk, Christoph
2016-01-01
The dopamine hypothesis of schizophrenia is supported by a large number of imaging studies that have identified an increase in dopamine binding at the D2 receptor selectively in the striatum. Here we review a decade of work using a regionally restricted and temporally regulated transgenic mouse model to investigate the behavioral, molecular, electrophysiological, and anatomical consequences of selective D2 receptor upregulation in the striatum. These studies have identified new and potentially important biomarkers at the circuit and molecular level that can now be explored in patients with schizophrenia. They provide an example of how animal models and their detailed level of neurobiological analysis allow a deepening of our understanding of the relationship between neuronal circuit function and symptoms of schizophrenia, and as a consequence generate new hypotheses that are testable in patients. PMID:27720388
Rice-arsenate interactions in hydroponics: a three-gene model for tolerance.
Norton, Gareth J; Nigar, Meher; Williams, Paul N; Dasgupta, Tapash; Meharg, Andrew A; Price, Adam H
2008-01-01
In this study, the genetic mapping of the tolerance of root growth to 13.3 μM arsenate [As(V)] using the Bala×Azucena population is improved, and candidate genes for further study are identified. A remarkable three-gene model of tolerance is advanced, which appears to involve epistatic interaction between three major genes, two on chromosome 6 and one on chromosome 10. Any combination of two of these genes inherited from the tolerant parent leads to the plant having tolerance. Lists of potential positional candidate genes are presented. These are then refined using whole genome transcriptomics data and bioinformatics. Physiological evidence is also provided that genes related to phosphate transport are unlikely to be behind the genetic loci conferring tolerance. These results offer testable hypotheses for genes related to As(V) tolerance that might offer strategies for mitigating arsenic (As) accumulation in consumed rice.
Rice–arsenate interactions in hydroponics: a three-gene model for tolerance
Norton, Gareth J.; Nigar, Meher; Dasgupta, Tapash; Meharg, Andrew A.; Price, Adam H.
2008-01-01
In this study, the genetic mapping of the tolerance of root growth to 13.3 μM arsenate [As(V)] using the Bala×Azucena population is improved, and candidate genes for further study are identified. A remarkable three-gene model of tolerance is advanced, which appears to involve epistatic interaction between three major genes, two on chromosome 6 and one on chromosome 10. Any combination of two of these genes inherited from the tolerant parent leads to the plant having tolerance. Lists of potential positional candidate genes are presented. These are then refined using whole genome transcriptomics data and bioinformatics. Physiological evidence is also provided that genes related to phosphate transport are unlikely to be behind the genetic loci conferring tolerance. These results offer testable hypotheses for genes related to As(V) tolerance that might offer strategies for mitigating arsenic (As) accumulation in consumed rice. PMID:18453529
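The reported epistatic rule, tolerance whenever the tolerant-parent allele is inherited at any two of the three loci (two on chromosome 6, one on chromosome 10), can be written directly as a small predicate. The genotype encoding below is of course only illustrative.

```python
# The reported three-gene rule: a plant is tolerant when it carries the
# tolerant-parent allele at any two (or all three) of the three loci.
def is_tolerant(locus_chr6a, locus_chr6b, locus_chr10):
    """Each argument is True if the tolerant-parent allele is present at that locus."""
    return sum([locus_chr6a, locus_chr6b, locus_chr10]) >= 2

for genotype in [(True, True, False), (True, False, False), (True, False, True)]:
    print(genotype, "->", "tolerant" if is_tolerant(*genotype) else "sensitive")
```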
Cultural prototypes and dimensions of honor.
Cross, Susan E; Uskul, Ayse K; Gerçek-Swing, Berna; Sunbay, Zeynep; Alözkan, Cansu; Günsoy, Ceren; Ataca, Bilge; Karakitapoglu-Aygün, Zahide
2014-02-01
Research evidence and theoretical accounts of honor point to differing definitions of the construct in differing cultural contexts. The current studies address the question "What is honor?" using a prototype approach in Turkey and the Northern United States. Studies 1a/1b revealed substantial differences in the specific features generated by members of the two groups, but Studies 2 and 3 revealed cultural similarities in the underlying dimensions of self-respect, moral behavior, and social status/respect. Ratings of the centrality and personal importance of these factors were similar across the two groups, but their association with other relevant constructs differed. The tripartite nature of honor uncovered in these studies helps observers and researchers alike understand how diverse responses to situations can be attributed to honor. Inclusion of a prototype analysis into the literature on honor cultures can provide enhanced coverage of the concept that may lead to testable hypotheses and new theoretical developments.
Leveraging ecological theory to guide natural product discovery.
Smanski, Michael J; Schlatter, Daniel C; Kinkel, Linda L
2016-03-01
Technological improvements have accelerated natural product (NP) discovery and engineering to the point that systematic genome mining for new molecules is on the horizon. NP biosynthetic potential is not equally distributed across organisms, environments, or microbial life histories, but instead is enriched in a number of prolific clades. Also, NPs are not equally abundant in nature; some are quite common and others markedly rare. Armed with this knowledge, random 'fishing expeditions' for new NPs are increasingly harder to justify. Understanding the ecological and evolutionary pressures that drive the non-uniform distribution of NP biosynthesis provides a rational framework for the targeted isolation of strains enriched in new NP potential. Additionally, ecological theory leads to testable hypotheses regarding the roles of NPs in shaping ecosystems. Here we review several recent strain prioritization practices and discuss the ecological and evolutionary underpinnings for each. Finally, we offer perspectives on leveraging microbial ecology and evolutionary biology for future NP discovery.
Functional Interdependence Theory: An Evolutionary Account of Social Situations.
Balliet, Daniel; Tybur, Joshua M; Van Lange, Paul A M
2017-11-01
Social interactions are characterized by distinct forms of interdependence, each of which has unique effects on how behavior unfolds within the interaction. Despite this, little is known about the psychological mechanisms that allow people to detect and respond to the nature of interdependence in any given interaction. We propose that interdependence theory provides clues regarding the structure of interdependence in the human ancestral past. In turn, evolutionary psychology offers a framework for understanding the types of information processing mechanisms that could have been shaped under these recurring conditions. We synthesize and extend these two perspectives to introduce a new theory: functional interdependence theory (FIT). FIT can generate testable hypotheses about the function and structure of the psychological mechanisms for inferring interdependence. This new perspective offers insight into how people initiate and maintain cooperative relationships, select social partners and allies, and identify opportunities to signal social motives.
The ancestral flower of angiosperms and its early diversification
Sauquet, Hervé; von Balthazar, Maria; Magallón, Susana; Doyle, James A.; Endress, Peter K.; Bailes, Emily J.; Barroso de Morais, Erica; Bull-Hereñu, Kester; Carrive, Laetitia; Chartier, Marion; Chomicki, Guillaume; Coiro, Mario; Cornette, Raphaël; El Ottra, Juliana H. L.; Epicoco, Cyril; Foster, Charles S. P.; Jabbour, Florian; Haevermans, Agathe; Haevermans, Thomas; Hernández, Rebeca; Little, Stefan A.; Löfstrand, Stefan; Luna, Javier A.; Massoni, Julien; Nadot, Sophie; Pamperl, Susanne; Prieu, Charlotte; Reyes, Elisabeth; dos Santos, Patrícia; Schoonderwoerd, Kristel M.; Sontag, Susanne; Soulebeau, Anaëlle; Staedler, Yannick; Tschan, Georg F.; Wing-Sze Leung, Amy; Schönenberger, Jürg
2017-01-01
Recent advances in molecular phylogenetics and a series of important palaeobotanical discoveries have revolutionized our understanding of angiosperm diversification. Yet, the origin and early evolution of their most characteristic feature, the flower, remains poorly understood. In particular, the structure of the ancestral flower of all living angiosperms is still uncertain. Here we report model-based reconstructions for ancestral flowers at the deepest nodes in the phylogeny of angiosperms, using the largest data set of floral traits ever assembled. We reconstruct the ancestral angiosperm flower as bisexual and radially symmetric, with more than two whorls of three separate perianth organs each (undifferentiated tepals), more than two whorls of three separate stamens each, and more than five spirally arranged separate carpels. Although uncertainty remains for some of the characters, our reconstruction allows us to propose a new plausible scenario for the early diversification of flowers, leading to new testable hypotheses for future research on angiosperms. PMID:28763051
The Fate of the Method of 'Paradigms' in Paleobiology.
Rudwick, Martin J S
2017-11-02
An earlier article described the mid-twentieth century origins of the method of "paradigms" in paleobiology, as a way of making testable hypotheses about the functional morphology of extinct organisms. The present article describes the use of "paradigms" through the 1970s and, briefly, to the end of the century. After I had proposed the paradigm method to help interpret the ecological history of brachiopods, my students developed it in relation to that and other invertebrate phyla, notably in Euan Clarkson's analysis of vision in trilobites. David Raup's computer-aided "theoretical morphology" was then combined with my functional or adaptive emphasis, in Adolf Seilacher's tripartite "constructional morphology." Stephen Jay Gould, who had strongly endorsed the method, later switched to criticizing the "adaptationist program" he claimed it embodied. Although the explicit use of paradigms in paleobiology had declined by the end of the century, the method was tacitly subsumed into functional morphology as "biomechanics."
Sea Ice Microorganisms: Environmental Constraints and Extracellular Responses
Ewert, Marcela; Deming, Jody W.
2013-01-01
Inherent to sea ice, like other high latitude environments, is the strong seasonality driven by changes in insolation throughout the year. Sea-ice organisms are exposed to shifting, sometimes limiting, conditions of temperature and salinity. An array of adaptations to survive these and other challenges has been acquired by those organisms that inhabit the ice. One key adaptive response is the production of extracellular polymeric substances (EPS), which play multiple roles in the entrapment, retention and survival of microorganisms in sea ice. In this concept paper we consider two main areas of sea-ice microbiology: the physico-chemical properties that define sea ice as a microbial habitat, imparting particular advantages and limits; and extracellular responses elicited in microbial inhabitants as they exploit or survive these conditions. Emphasis is placed on protective strategies used in the face of fluctuating and extreme environmental conditions in sea ice. Gaps in knowledge and testable hypotheses are identified for future research. PMID:24832800
Ecological Suitability and Spatial Distribution of Five Anopheles Species in Amazonian Brazil
McKeon, Sascha N.; Schlichting, Carl D.; Povoa, Marinete M.; Conn, Jan E.
2013-01-01
Seventy-six sites characterized in Amazonian Brazil revealed distinct habitat diversification by examining the environmental factors associated with the distribution and abundance of five anopheline species (Diptera: Culicidae) in the subgenus Nyssorhynchus. These included three members of the Albitarsis Complex, Anopheles oryzalimnetes, Anopheles marajoara, Anopheles janconnae; Anopheles triannulatus, and Anopheles goeldii. Anopheles janconnae abundance had a positive correlation to water flow and a negative relationship to sun exposure. Abundance of An. oryzalimnetes was associated with water chemistry. Anopheles goeldii larvae were abundant in shaded, more saline waters. Anopheles marajoara and An. triannulatus were negatively associated with available resources, although An. marajoara also showed several local correlations. These analyses suggest An. triannulatus is a habitat generalist, An. oryzalimnetes and An. janconnae are specialists, and An. marajoara and An. goeldii could not be easily classified either way. Correlations described herein provide testable hypotheses for future research and identifying habitats for vector control. PMID:23546804
Segmental folding of chromosomes: a basis for structural and regulatory chromosomal neighborhoods?
Nora, Elphège P; Dekker, Job; Heard, Edith
2013-09-01
We discuss here a series of testable hypotheses concerning the role of chromosome folding into topologically associating domains (TADs). Several lines of evidence suggest that segmental packaging of chromosomal neighborhoods may underlie features of chromatin that span large domains, such as heterochromatin blocks, association with the nuclear lamina and replication timing. By defining which DNA elements preferentially contact each other, the segmentation of chromosomes into TADs may also underlie many properties of long-range transcriptional regulation. Several observations suggest that TADs can indeed provide a structural basis to regulatory landscapes, by controlling enhancer sharing and allocation. We also discuss how TADs may shape the evolution of chromosomes, by causing maintenance of synteny over large chromosomal segments. Finally we suggest a series of experiments to challenge these ideas and provide concrete examples illustrating how they could be practically applied. © 2013 The Authors. Bioessays published by WILEY Periodicals, Inc.
Segmental folding of chromosomes: A basis for structural and regulatory chromosomal neighborhoods?
Nora, Elphège P; Dekker, Job; Heard, Edith
2013-01-01
We discuss here a series of testable hypotheses concerning the role of chromosome folding into topologically associating domains (TADs). Several lines of evidence suggest that segmental packaging of chromosomal neighborhoods may underlie features of chromatin that span large domains, such as heterochromatin blocks, association with the nuclear lamina and replication timing. By defining which DNA elements preferentially contact each other, the segmentation of chromosomes into TADs may also underlie many properties of long-range transcriptional regulation. Several observations suggest that TADs can indeed provide a structural basis to regulatory landscapes, by controlling enhancer sharing and allocation. We also discuss how TADs may shape the evolution of chromosomes, by causing maintenance of synteny over large chromosomal segments. Finally we suggest a series of experiments to challenge these ideas and provide concrete examples illustrating how they could be practically applied. PMID:23832846
Assisted reproduction with gametes and embryos: what research is needed and fundable?
Seidel, George E
2016-01-01
Principles for selecting future research projects include interests of investigators, fundability, potential applications, ethical considerations, being able to formulate testable hypotheses and choosing the best models, including selection of the most appropriate species. The following 10 areas of assisted reproduction seem especially appropriate for further research: efficacious capacitation of bovine spermatozoa in vitro; improved in vitro bovine oocyte maturation; decreasing variability and increasing efficacy of bovine superovulation; improved fertility of sexed semen; improving equine IVF; improving cryopreservation of rooster spermatozoa; understanding differences between males in success of sperm cryopreservation and reasons for success in competitive fertilisation; mechanisms of reprogramming somatic cell nuclei after nuclear transfer; regulation of differentiation of ovarian primordial follicles; and means by which spermatozoa maintain fertility during storage in the epididymis. Issues are species specific for several of these topics, in most cases because the biology is species specific.
New streams and springs after the 2014 Mw6.0 South Napa earthquake.
Wang, Chi-Yuen; Manga, Michael
2015-07-09
Many streams and springs, which were dry or nearly dry before the 2014 Mw6.0 South Napa earthquake, started to flow after the earthquake. A United States Geological Survey stream gauge also registered a coseismic increase in discharge. Public interest was heightened by a state of extreme drought in California. Since the new flows were not contaminated by pre-existing surface water, their composition allowed unambiguous identification of their origin. Following the earthquake we repeatedly surveyed the new flows, collecting data to test hypotheses about their origin. We show that the new flows originated from groundwater in nearby mountains released by the earthquake. The estimated total amount of new water is ∼10⁶ m³, about 1/40 of the annual water use in the Napa-Sonoma area. Our model also makes a testable prediction of a post-seismic decrease of seismic velocity in the shallow crust of the affected region.
Making the Most of Omics for Symbiosis Research
Chaston, J.; Douglas, A.E.
2012-01-01
Omics, including genomics, proteomics and metabolomics, enable us to explain symbioses in terms of the underlying molecules and their interactions. The central task is to transform molecular catalogs of genes, metabolites etc. into a dynamic understanding of symbiosis function. We review four exemplars of omics studies that achieve this goal, through defined biological questions relating to metabolic integration and regulation of animal-microbial symbioses, the genetic autonomy of bacterial symbionts, and symbiotic protection of animal hosts from pathogens. As omic datasets become increasingly complex, computationally-sophisticated downstream analyses are essential to reveal interactions not evident to visual inspection of the data. We discuss two approaches, phylogenomics and transcriptional clustering, that can divide the primary output of omics studies – long lists of factors – into manageable subsets, and we describe how they have been applied to analyze large datasets and generate testable hypotheses. PMID:22983030
The hazards of hazard identification in environmental epidemiology.
Saracci, Rodolfo
2017-08-09
Hazard identification is a major scientific challenge, notably for environmental epidemiology, and is often surrounded, as the recent case of glyphosate shows, by debate arising in the first place from the inherently problematic nature of many components of the identification process. Particularly relevant in this respect are components less amenable to logical or mathematical formalization and essentially dependent on scientists' judgment. Four such potentially hazardous components that are capable of distorting the correct process of hazard identification are reviewed and discussed from an epidemiologist's perspective: (1) lexical mix-up of hazard and risk; (2) scientific questions as distinct from testable hypotheses, and implications for the hierarchy of strength of evidence obtainable from different types of study designs; (3) assumptions in prior beliefs and model choices; and (4) conflicts of interest. Four suggestions are put forward to strengthen a process that remains in several aspects judgmental, but not arbitrary, in nature.
Stephenson, Chris P; Baguley, Ian J
2018-02-01
Functional Neurological Symptom Disorder (FND) is a relatively common neurological condition, accounting for approximately 3-6% of neurologist referrals. FND is considered a transient disorder of neuronal function, sometimes linked to physical trauma and psychological stress. Despite this, chronic disability is common, for example, around 40% of adults with motor FND have permanent disability. Building on current theoretical models, this paper proposes that microglial dysfunction could perpetuate functional changes within acute motor FND, thus providing a pathophysiological mechanism underlying the chronic stage of the motor FND phenotypes seen clinically. Core to our argument is microglia's dual role in modulating neuroimmunity and their control of synaptic plasticity, which places them at a pathophysiological nexus wherein coincident physical trauma and psychological stress could cause long-term change in neuronal networks without producing macroscopic structural abnormality. This model proposes a range of hypotheses that are testable with current technologies. Copyright © 2017. Published by Elsevier Ltd.
NASA Astrophysics Data System (ADS)
Chakdar, Shreyashi
The Standard Model of particle physics is assumed to be a low-energy effective theory, with new physics theoretically motivated to appear around the TeV scale. The thesis presents theories with new physics beyond the Standard Model at the TeV scale that are testable at colliders. The work in chapters 2, 3 and 5 of this thesis presents models incorporating different approaches to enlarging the Standard Model gauge group to a grand unified symmetry, with each model presenting its unique signatures at colliders. The study of leptoquark gauge bosons in the TopSU(5) model in chapter 2 showed that their discovery mass range extends up to 1.5 TeV at the 14 TeV LHC with a luminosity of 100 fb⁻¹. In chapter 3 we studied the collider phenomenology of TeV-scale mirror fermions in the Left-Right Mirror model, finding that the reach for the mirror quarks goes up to 750 GeV at the 14 TeV LHC with 300 fb⁻¹ of luminosity. In chapter 4 we have enlarged the bosonic symmetry to a fermi-bose symmetry, e.g. supersymmetry, and have shown that SUSY with non-universalities in gaugino or scalar masses within a high-scale SUGRA setup can still be accessible at the 14 TeV LHC. In chapter 5, we performed a study with respect to the e⁺e⁻ collider and find that precise measurements of the Higgs boson mass splittings up to ∼100 MeV may be possible with high luminosity at the International Linear Collider (ILC). In chapter 6 we have shown that the experimental data on neutrino masses and mixings are consistent with the proposed 4/5-parameter Dirac neutrino models, yielding a solution for the neutrino masses with inverted mass hierarchy and a large CP-violating phase δ, and thus can be tested experimentally. Chapter 7 of the thesis incorporates a warm dark matter candidate in the context of a two-Higgs-doublet model. The model has several testable consequences at colliders, with the charged scalar and pseudoscalar being in the few-hundred-GeV mass range. This thesis presents an endeavor to study beyond-Standard-Model physics at the TeV scale with testable signals at colliders.
Eye Examination Testability in Children with Autism and in Typical Peers
Coulter, Rachel Anastasia; Bade, Annette; Tea, Yin; Fecho, Gregory; Amster, Deborah; Jenewein, Erin; Rodena, Jacqueline; Lyons, Kara Kelley; Mitchell, G. Lynn; Quint, Nicole; Dunbar, Sandra; Ricamato, Michele; Trocchio, Jennie; Kabat, Bonnie; Garcia, Chantel; Radik, Irina
2015-01-01
Purpose To compare testability of vision and eye tests in an examination protocol of 9- to 17-year-old patients with autism spectrum disorder (ASD) to typically developing (TD) peers. Methods In a prospective pilot study, 61 children and adolescents (34 with ASD and 27 who were TD) aged 9 to 17 years completed an eye examination protocol including tests of visual acuity, refraction, convergence (eye teaming), stereoacuity (depth perception), ocular motility, and ocular health. Patients who required new refractive correction were retested after wearing their updated spectacle prescription for 1 month. The specialized protocol incorporated visual, sensory, and communication supports. A psychologist determined group status/eligibility using DSM-IV-TR (Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition, Text Revision) criteria by review of previous evaluations and parent responses on the Social Communication Questionnaire. Before the examination, parents provided information regarding patients’ sex, race, ethnicity, and, for ASD patients, verbal communication level (nonverbal, uses short words, verbal). Parents indicated whether the patient wore a refractive correction, whether the patient had ever had an eye examination, and the age at the last examination. Chi-square tests compared testability results for TD and ASD groups. Results Typically developing and ASD groups did not differ by age (p = 0.54), sex (p = 0.53), or ethnicity (p = 0.22). Testability was high on most tests (TD, 100%; ASD, 88 to 100%), except for intraocular pressure (IOP), which was reduced for both the ASD (71%) and the TD (89%) patients. Among ASD patients, IOP testability varied greatly with verbal communication level (p < 0.001). Although IOP measurements were completed on all verbal patients, only 37.5% of nonverbal and 44.4% of ASD patients who used short words were successful. Conclusions Patients with ASD can complete most vision and eye tests within an examination protocol. Testability of IOPs is reduced, particularly for nonverbal patients and patients who use short words to communicate. PMID:25415280
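For illustration, the group comparison of IOP testability can be expressed as a contingency-table test. The per-group counts below are hypothetical values chosen only to be consistent with the reported percentages (37.5%, 44.4%, 100%); the actual group sizes are not given in this excerpt.

```python
# Hypothetical counts consistent with the reported IOP testability percentages
# for the three ASD verbal-communication groups; illustrative only.
from scipy.stats import chi2_contingency

#              [IOP completed, IOP not completed]
nonverbal   = [3, 5]     # 37.5% successful (hypothetical N)
short_words = [4, 5]     # 44.4% successful (hypothetical N)
verbal      = [17, 0]    # 100% successful (hypothetical N)

chi2, p, dof, expected = chi2_contingency([nonverbal, short_words, verbal])
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
```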
Detecting Rotational Superradiance in Fluid Laboratories
NASA Astrophysics Data System (ADS)
Cardoso, Vitor; Coutant, Antonin; Richartz, Mauricio; Weinfurtner, Silke
2016-12-01
Rotational superradiance was predicted theoretically decades ago, and is chiefly responsible for a number of important effects and phenomenology in black-hole physics. However, rotational superradiance has never been observed experimentally. Here, with the aim of probing superradiance in the lab, we investigate the behavior of sound and surface waves in fluids resting in a circular basin at the center of which a rotating cylinder is placed. We show that with a suitable choice for the material of the cylinder, surface and sound waves are amplified. Two types of instabilities are studied: one sets in whenever superradiant modes are confined near the rotating cylinder and the other, which does not rely on confinement, corresponds to a local excitation of the cylinder. Our findings are experimentally testable in existing fluid laboratories and, hence, offer experimental exploration and comparison of dynamical instabilities arising from rapidly rotating boundary layers in astrophysical as well as in fluid dynamical systems.
A Framework for Evidence-Based Licensure of Adaptive Autonomous Systems
2016-03-01
insights gleaned to DoD. The autonomy community has identified significant challenges associated with test, evaluation, verification and validation of ... licensure as a test, evaluation, verification, and validation (TEVV) framework that can address these challenges. IDA found that traditional ... language requirements to testable (preferably machine-testable) specifications; design of architectures that treat development and verification of ...
NASA Astrophysics Data System (ADS)
McManamay, R.; Allen, M. R.; Piburn, J.; Sanyal, J.; Stewart, R.; Bhaduri, B. L.
2017-12-01
Characterizing interdependencies among land-energy-water sectors, their vulnerabilities, and tipping points is challenging, especially if all sectors are simultaneously considered. Because such holistic system behavior is uncertain, largely unmodeled, and in need of testable hypotheses of system drivers, these dynamics are conducive to exploratory analytics of spatiotemporal patterns, powered by tools such as Dynamic Time Warping (DTW). Here, we conduct a retrospective analysis (1950 - 2010) of temporal trends in land use, energy use, and water use within US counties to identify commonalities in resource consumption and adaptation strategies to resource limitations. We combine existing and derived data from statistical downscaling to synthesize a temporally comprehensive land-energy-water dataset at the US county level and apply DTW and subsequent hierarchical clustering to examine similar temporal trends in resource typologies for land, energy, and water sectors. As expected, we observed tradeoffs among water uses (e.g., public supply vs. irrigation) and land uses (e.g., urban vs. agricultural). Strong associations between clusters across sectors reveal tight system interdependencies, whereas weak associations suggest unique behaviors and potential for human adaptations towards disruptive technologies and less resource-dependent population growth. Our framework is useful for exploring complex human-environmental system dynamics and generating hypotheses to guide subsequent energy-water-nexus research.
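A minimal sketch of the analysis pattern described above: compute pairwise DTW distances between county-level resource-use time series, then cluster counties hierarchically into typologies. The series below are random placeholders, not the actual land-energy-water data, and the number of clusters is arbitrary.

```python
# DTW distance matrix + hierarchical clustering on placeholder time series.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

def dtw_distance(a, b):
    """Classic O(len(a)*len(b)) dynamic-programming DTW distance."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return cost[n, m]

rng = np.random.default_rng(0)
series = rng.random((20, 61)).cumsum(axis=1)      # 20 "counties", 61 annual values (1950-2010)

n = len(series)
dist = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        dist[i, j] = dist[j, i] = dtw_distance(series[i], series[j])

Z = linkage(squareform(dist), method="average")   # hierarchical clustering on DTW distances
labels = fcluster(Z, t=4, criterion="maxclust")   # e.g. four resource-use typologies
print(labels)
```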
Drivers and mechanisms of tree mortality in moist tropical forests.
McDowell, Nate; Allen, Craig D; Anderson-Teixeira, Kristina; Brando, Paulo; Brienen, Roel; Chambers, Jeff; Christoffersen, Brad; Davies, Stuart; Doughty, Chris; Duque, Alvaro; Espirito-Santo, Fernando; Fisher, Rosie; Fontes, Clarissa G; Galbraith, David; Goodsman, Devin; Grossiord, Charlotte; Hartmann, Henrik; Holm, Jennifer; Johnson, Daniel J; Kassim, Abd Rahman; Keller, Michael; Koven, Charlie; Kueppers, Lara; Kumagai, Tomo'omi; Malhi, Yadvinder; McMahon, Sean M; Mencuccini, Maurizio; Meir, Patrick; Moorcroft, Paul; Muller-Landau, Helene C; Phillips, Oliver L; Powell, Thomas; Sierra, Carlos A; Sperry, John; Warren, Jeff; Xu, Chonggang; Xu, Xiangtao
2018-02-16
Tree mortality rates appear to be increasing in moist tropical forests (MTFs) with significant carbon cycle consequences. Here, we review the state of knowledge regarding MTF tree mortality, create a conceptual framework with testable hypotheses regarding the drivers, mechanisms and interactions that may underlie increasing MTF mortality rates, and identify the next steps for improved understanding and reduced prediction uncertainty. Increasing mortality rates are associated with rising temperature and vapor pressure deficit, liana abundance, drought, wind events, fire and, possibly, CO2 fertilization-induced increases in stand thinning or acceleration of trees reaching larger, more vulnerable heights. The majority of these mortality drivers may kill trees in part through carbon starvation and hydraulic failure. The relative importance of each driver is unknown. High species diversity may buffer MTFs against large-scale mortality events, but recent and expected trends in mortality drivers give reason for concern regarding increasing mortality within MTFs. Models of tropical tree mortality are advancing the representation of hydraulics, carbon and demography, but require more empirical knowledge regarding the most common drivers and their subsequent mechanisms. We outline critical datasets and model developments required to test hypotheses regarding the underlying causes of increasing MTF mortality rates, and improve prediction of future mortality under climate change. No claim to original US government works New Phytologist © 2018 New Phytologist Trust.
Is titin a 'winding filament'? A new twist on muscle contraction.
Nishikawa, Kiisa C; Monroy, Jenna A; Uyeno, Theodore E; Yeo, Sang Hoon; Pai, Dinesh K; Lindstedt, Stan L
2012-03-07
Recent studies have demonstrated a role for the elastic protein titin in active muscle, but the mechanisms by which titin plays this role remain to be elucidated. In active muscle, Ca(2+)-binding has been shown to increase titin stiffness, but the observed increase is too small to explain the increased stiffness of parallel elastic elements upon muscle activation. We propose a 'winding filament' mechanism for titin's role in active muscle. First, we hypothesize that Ca(2+)-dependent binding of titin's N2A region to thin filaments increases titin stiffness by preventing low-force straightening of proximal immunoglobulin domains that occurs during passive stretch. This mechanism explains the difference in length dependence of force between skeletal myofibrils and cardiac myocytes. Second, we hypothesize that cross-bridges serve not only as motors that pull thin filaments towards the M-line, but also as rotors that wind titin on the thin filaments, storing elastic potential energy in PEVK during force development and active stretch. Energy stored during force development can be recovered during active shortening. The winding filament hypothesis accounts for force enhancement during stretch and force depression during shortening, and provides testable predictions that will encourage new directions for research on mechanisms of muscle contraction.
Is titin a ‘winding filament’? A new twist on muscle contraction
Nishikawa, Kiisa C.; Monroy, Jenna A.; Uyeno, Theodore E.; Yeo, Sang Hoon; Pai, Dinesh K.; Lindstedt, Stan L.
2012-01-01
Recent studies have demonstrated a role for the elastic protein titin in active muscle, but the mechanisms by which titin plays this role remain to be elucidated. In active muscle, Ca2+-binding has been shown to increase titin stiffness, but the observed increase is too small to explain the increased stiffness of parallel elastic elements upon muscle activation. We propose a ‘winding filament’ mechanism for titin's role in active muscle. First, we hypothesize that Ca2+-dependent binding of titin's N2A region to thin filaments increases titin stiffness by preventing low-force straightening of proximal immunoglobulin domains that occurs during passive stretch. This mechanism explains the difference in length dependence of force between skeletal myofibrils and cardiac myocytes. Second, we hypothesize that cross-bridges serve not only as motors that pull thin filaments towards the M-line, but also as rotors that wind titin on the thin filaments, storing elastic potential energy in PEVK during force development and active stretch. Energy stored during force development can be recovered during active shortening. The winding filament hypothesis accounts for force enhancement during stretch and force depression during shortening, and provides testable predictions that will encourage new directions for research on mechanisms of muscle contraction. PMID:21900329
Alarcón, Tomás; Marches, Radu; Page, Karen M
2006-05-07
We formulate models of the mechanism(s) by which B cell lymphoma cells stimulated with an antibody specific to the B cell receptor (IgM) become quiescent or apoptotic. In particular, we aim to reproduce experimental results by Marches et al. according to which the fate of the targeted cells (Daudi) depends on the levels of expression of p21(Waf1) (p21) cell-cycle inhibitor. A simple model is formulated in which the basic ingredients are p21 and caspase activity, and their mutual inhibition. We show that this model does not reproduce the experimental results and that further refinement is needed. A second model successfully reproduces the experimental observations, for a given set of parameter values, indicating a critical role for Myc in the fate decision process. We use bifurcation analysis and objective sensitivity analysis to assess the robustness of our results. Importantly, this analysis yields experimentally testable predictions on the role of Myc, which could have therapeutic implications.
Superstitiousness in obsessive-compulsive disorder
Brugger, Peter; Viaud-Delmon, Isabelle
2010-01-01
It has been speculated that superstitiousness and obsessive-compulsive disorder (OCD) exist along a continuum. The distinction between superstitious behavior and superstitious belief, however, is crucial for any theoretical account of claimed associations between superstitiousness and OCD. By demonstrating that there is a dichotomy between behavior and belief, which is experimentally testable, we can differentiate superstitious behavior from superstitious belief, or magical ideation. Different brain circuits are responsible for these two forms of superstitiousness; thus, determining which type of superstition is prominent in the symptomatology of an individual patient may inform us about the primarily affected neurocognitive systems. PMID:20623929
Authors’ response: mirror neurons: tests and testability.
Catmur, Caroline; Press, Clare; Cook, Richard; Bird, Geoffrey; Heyes, Cecilia
2014-04-01
Commentators have tended to focus on the conceptual framework of our article, the contrast between genetic and associative accounts of mirror neurons, and to challenge it with additional possibilities rather than empirical data. This makes the empirically focused comments especially valuable. The mirror neuron debate is replete with ideas; what it needs now are system-level theories and careful experiments – tests and testability.
A simple theoretical framework for understanding heterogeneous differentiation of CD4+ T cells
2012-01-01
Background CD4+ T cells have several subsets of functional phenotypes, which play critical yet diverse roles in the immune system. Pathogen-driven differentiation of these subsets of cells is often heterogeneous in terms of the induced phenotypic diversity. In vitro recapitulation of heterogeneous differentiation under homogeneous experimental conditions indicates some highly regulated mechanisms by which multiple phenotypes of CD4+ T cells can be generated from a single population of naïve CD4+ T cells. Therefore, conceptual understanding of induced heterogeneous differentiation will shed light on the mechanisms controlling the response of populations of CD4+ T cells under physiological conditions. Results We present a simple theoretical framework to show how heterogeneous differentiation in a two-master-regulator paradigm can be governed by a signaling network motif common to all subsets of CD4+ T cells. With this motif, a population of naïve CD4+ T cells can integrate the signals from their environment to generate a functionally diverse population with robust commitment of individual cells. Notably, two positive feedback loops in this network motif govern three bistable switches, which in turn, give rise to three types of heterogeneous differentiated states, depending upon particular combinations of input signals. We provide three prototype models illustrating how to use this framework to explain experimental observations and make specific testable predictions. Conclusions The process in which several types of T helper cells are generated simultaneously to mount complex immune responses upon pathogenic challenges can be highly regulated, and a simple signaling network motif can be responsible for generating all possible types of heterogeneous populations with respect to a pair of master regulators controlling CD4+ T cell differentiation. The framework provides a mathematical basis for understanding the decision-making mechanisms of CD4+ T cells, and it can be helpful for interpreting experimental results. Mathematical models based on the framework make specific testable predictions that may improve our understanding of this differentiation system. PMID:22697466
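The sketch below is a hedged, minimal ODE caricature of the two-master-regulator motif discussed above: each regulator auto-activates and inhibits the other, so identical input signals can commit cells with slightly different initial biases to opposite fates (bistability). The equations, Hill parameters, and signal strengths are illustrative choices, not the model from the paper.

```python
# Minimal two-master-regulator motif with auto-activation and mutual inhibition.
import numpy as np
from scipy.integrate import solve_ivp

def hill(x, k, n):
    return x**n / (k**n + x**n)

def motif(t, y, s1, s2):
    x1, x2 = y
    dx1 = s1 + 2.0 * hill(x1, 1.0, 4) * (1 - hill(x2, 1.0, 4)) - x1
    dx2 = s2 + 2.0 * hill(x2, 1.0, 4) * (1 - hill(x1, 1.0, 4)) - x2
    return [dx1, dx2]

# Same input signals (s1 = s2 = 0.2); different initial biases reach opposite stable states.
for x0 in ([1.5, 0.5], [0.5, 1.5]):
    sol = solve_ivp(motif, (0, 50), x0, args=(0.2, 0.2))
    print("initial", x0, "-> final", np.round(sol.y[:, -1], 2))
```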
Hypotheses for Near-Surface Exchange of Methane on Mars.
Hu, Renyu; Bloom, A Anthony; Gao, Peter; Miller, Charles E; Yung, Yuk L
2016-07-01
The Curiosity rover recently detected a background of 0.7 ppb and spikes of 7 ppb of methane on Mars. This in situ measurement reorients our understanding of the martian environment and its potential for life, as the current theories do not entail any geological source or sink of methane that varies sub-annually. In particular, the 10-fold elevation during the southern winter indicates episodic sources of methane that are yet to be discovered. Here we suggest a near-surface reservoir could explain this variability. Using the temperature and humidity measurements from the rover, we find that perchlorate salts in the regolith deliquesce to form liquid solutions, and deliquescence progresses to deeper subsurface in the season of the methane spikes. We therefore formulate the following three testable hypotheses. The first scenario is that the regolith in Gale Crater adsorbs methane when dry and releases this methane to the atmosphere upon deliquescence. The adsorption energy needs to be 36 kJ mol(-1) to explain the magnitude of the methane spikes, higher than existing laboratory measurements. The second scenario is that microorganisms convert organic matter in the soil to methane when they are in liquid solutions. This scenario does not require regolith adsorption but entails extant life on Mars. The third scenario is that deep subsurface aquifers produce the bursts of methane. Continued in situ measurements of methane and water, as well as laboratory studies of adsorption and deliquescence, will test these hypotheses and inform the existence of the near-surface reservoir and its exchange with the atmosphere. Mars-Methane-Astrobiology-Regolith. Astrobiology 16, 539-550.
Hypotheses for Near-Surface Exchange of Methane on Mars
NASA Astrophysics Data System (ADS)
Hu, Renyu; Bloom, A. Anthony; Gao, Peter; Miller, Charles E.; Yung, Yuk L.
2016-07-01
The Curiosity rover recently detected a background of 0.7 ppb and spikes of 7 ppb of methane on Mars. This in situ measurement reorients our understanding of the martian environment and its potential for life, as the current theories do not entail any geological source or sink of methane that varies sub-annually. In particular, the 10-fold elevation during the southern winter indicates episodic sources of methane that are yet to be discovered. Here we suggest a near-surface reservoir could explain this variability. Using the temperature and humidity measurements from the rover, we find that perchlorate salts in the regolith deliquesce to form liquid solutions, and deliquescence progresses to deeper subsurface in the season of the methane spikes. We therefore formulate the following three testable hypotheses. The first scenario is that the regolith in Gale Crater adsorbs methane when dry and releases this methane to the atmosphere upon deliquescence. The adsorption energy needs to be 36 kJ mol-1 to explain the magnitude of the methane spikes, higher than existing laboratory measurements. The second scenario is that microorganisms convert organic matter in the soil to methane when they are in liquid solutions. This scenario does not require regolith adsorption but entails extant life on Mars. The third scenario is that deep subsurface aquifers produce the bursts of methane. Continued in situ measurements of methane and water, as well as laboratory studies of adsorption and deliquescence, will test these hypotheses and inform the existence of the near-surface reservoir and its exchange with the atmosphere.
Artificial Intelligence Applications to Testability.
1984-10-01
general software assistant; examining testability utilization of it should wait a few years until the software assistant is a well-defined product ... ago. It provides a single host which satisfies the needs of developers, product developers, and end users. As shown in table 5.10-2, it also provides ... follows a trend towards more user-oriented design approaches to interactive computer systems. The implicit goal in this trend is the ...
A Predictive Model of the Oxygen and Heme Regulatory Network in Yeast
Kundaje, Anshul; Xin, Xiantong; Lan, Changgui; Lianoglou, Steve; Zhou, Mei; Zhang, Li; Leslie, Christina
2008-01-01
Deciphering gene regulatory mechanisms through the analysis of high-throughput expression data is a challenging computational problem. Previous computational studies have used large expression datasets in order to resolve fine patterns of coexpression, producing clusters or modules of potentially coregulated genes. These methods typically examine promoter sequence information, such as DNA motifs or transcription factor occupancy data, in a separate step after clustering. We needed an alternative and more integrative approach to study the oxygen regulatory network in Saccharomyces cerevisiae using a small dataset of perturbation experiments. Mechanisms of oxygen sensing and regulation underlie many physiological and pathological processes, and only a handful of oxygen regulators have been identified in previous studies. We used a new machine learning algorithm called MEDUSA to uncover detailed information about the oxygen regulatory network using genome-wide expression changes in response to perturbations in the levels of oxygen, heme, Hap1, and Co2+. MEDUSA integrates mRNA expression, promoter sequence, and ChIP-chip occupancy data to learn a model that accurately predicts the differential expression of target genes in held-out data. We used a novel margin-based score to extract significant condition-specific regulators and assemble a global map of the oxygen sensing and regulatory network. This network includes both known oxygen and heme regulators, such as Hap1, Mga2, Hap4, and Upc2, as well as many new candidate regulators. MEDUSA also identified many DNA motifs that are consistent with previous experimentally identified transcription factor binding sites. Because MEDUSA's regulatory program associates regulators to target genes through their promoter sequences, we directly tested the predicted regulators for OLE1, a gene specifically induced under hypoxia, by experimental analysis of the activity of its promoter. In each case, deletion of the candidate regulator resulted in the predicted effect on promoter activity, confirming that several novel regulators identified by MEDUSA are indeed involved in oxygen regulation. MEDUSA can reveal important information from a small dataset and generate testable hypotheses for further experimental analysis. Supplemental data are included. PMID:19008939
NASA Astrophysics Data System (ADS)
Blacksberg, Jordana; Mahjoub, Ahmed; Poston, Michael; Brown, Mike; Eiler, John; Ehlmann, Bethany; Hand, Kevin; Carlson, Robert W.; Hodyss, Robert; Wong, Ian
2015-11-01
We present an experimental study aimed at exploring the hypothesis suggested by recent dynamical models - that the Jupiter Trojan asteroids originated in the outer solar system, were scattered by the same instability responsible for the radical rearrangement of the giant planets, and were subsequently captured in their current location (e.g. Morbidelli et al., 2005, Nesvorny et al., 2013). We seek to identify spectroscopic, chemical and isotopic properties that can tie the Trojan populations to these evolutionary pathways, providing experimental support for dynamical models, and providing testable hypotheses that can feed into the design of experiments that might be performed on potential future missions to these and other primitive bodies. We present the results of experiments devised to explore the hypothesis that Kuiper Belt Objects (KBOs) represent the parent populations of the Trojan asteroids. Numerous thin ice films composed of select solar system volatiles (H2O, H2S, CH3OH, NH3) were grown in various mixtures to simulate compositional changes of icy bodies as a function of volatility and radial distance of formation from the Sun. Subsequent processing of these icy bodies was simulated using electron irradiation and heating. Visible reflectance spectra show significant reddening when H2S is present. Mid-infrared spectra confirm the formation of non-volatile sulfur-containing molecules in the products of H2S-containing ices. These experiments suggest that the presence of specific sulfur-bearing chemical species may play an important role in the colors of both the KBOs and Trojans today. Finally, we discuss the role of the silicate component expected on the surface of the Trojan asteroids (Emery et al., 2006), and the implications of a surface composed of silicates in intimate contact with the nonvolatile organic residues generated by ice irradiation. This work has been supported by the Keck Institute for Space Studies (KISS). The research described here was carried out at the Jet Propulsion Laboratory, Caltech, under a contract with the National Aeronautics and Space Administration (NASA) and at the Caltech Division of Geological and Planetary Sciences.
Computer simulation analysis of normal and abnormal development of the mammalian diaphragm
Fisher, Jason C; Bodenstein, Lawrence
2006-01-01
Background Congenital diaphragmatic hernia (CDH) is a birth defect with significant morbidity and mortality. Knowledge of diaphragm morphogenesis and the aberrations leading to CDH is limited. Although classical embryologists described the diaphragm as arising from the septum transversum, pleuroperitoneal folds (PPF), esophageal mesentery and body wall, animal studies suggest that the PPF is the major, if not sole, contributor to the muscular diaphragm. Recently, a posterior defect in the PPF has been identified when the teratogen nitrofen is used to induce CDH in fetal rodents. We describe use of a cell-based computer modeling system (Nudge++™) to study diaphragm morphogenesis. Methods and results Key diaphragmatic structures were digitized from transverse serial sections of paraffin-embedded mouse embryos at embryonic days 11.5 and 13. Structure boundaries and simulated cells were combined in the Nudge++™ software. Model cells were assigned putative behavioral programs, and these programs were progressively modified to produce a diaphragm consistent with the observed anatomy in rodents. Homology between our model and recent anatomical observations occurred under the following simulation conditions: (1) cell mitoses are restricted to the edge of growing tissue; (2) cells near the chest wall remain mitotically active; (3) mitotically active non-edge cells migrate toward the chest wall; and (4) movement direction depends on clonal differentiation between anterior and posterior PPF cells. Conclusion With the PPF as the sole source of mitotic cells, an early defect in the PPF evolves into a posteromedial diaphragm defect, similar to that of the rodent nitrofen CDH model. A posterolateral defect, as occurs in human CDH, would be more readily recreated by invoking other cellular contributions. Our results suggest that recent reports of PPF-dominated diaphragm morphogenesis in the rodent may not be strictly applicable to man. The ability to recreate a CDH defect using a combination of experimental data and testable hypotheses gives impetus to simulation modeling as an adjunct to experimental analysis of diaphragm morphogenesis. PMID:16483386
Abdo, Nour; Xia, Menghang; Brown, Chad C.; Kosyk, Oksana; Huang, Ruili; Sakamuru, Srilatha; Zhou, Yi-Hui; Jack, John R.; Gallins, Paul; Xia, Kai; Li, Yun; Chiu, Weihsueh A.; Motsinger-Reif, Alison A.; Austin, Christopher P.; Tice, Raymond R.
2015-01-01
Background: Understanding of human variation in toxicity to environmental chemicals remains limited, so human health risk assessments still largely rely on a generic 10-fold factor (10^(1/2) each for toxicokinetics and toxicodynamics) to account for sensitive individuals or subpopulations. Objectives: We tested a hypothesis that population-wide in vitro cytotoxicity screening can rapidly inform both the magnitude of and molecular causes for interindividual toxicodynamic variability. Methods: We used 1,086 lymphoblastoid cell lines from the 1000 Genomes Project, representing nine populations from five continents, to assess variation in cytotoxic response to 179 chemicals. Analysis included assessments of population variation and heritability, and genome-wide association mapping, with attention to phenotypic relevance to human exposures. Results: For about half the tested compounds, cytotoxic response in the 1% most “sensitive” individual occurred at concentrations within a factor of 10^(1/2) (i.e., approximately 3) of that in the median individual; however, for some compounds, this factor was > 10. Genetic mapping suggested important roles for variation in membrane and transmembrane genes, with a number of chemicals showing association with SNP rs13120371 in the solute carrier SLC7A11, previously implicated in chemoresistance. Conclusions: This experimental approach fills critical gaps unaddressed by recent large-scale toxicity testing programs, providing quantitative, experimentally based estimates of human toxicodynamic variability, and also testable hypotheses about mechanisms contributing to interindividual variation. Citation: Abdo N, Xia M, Brown CC, Kosyk O, Huang R, Sakamuru S, Zhou YH, Jack JR, Gallins P, Xia K, Li Y, Chiu WA, Motsinger-Reif AA, Austin CP, Tice RR, Rusyn I, Wright FA. 2015. Population-based in vitro hazard and concentration–response assessment of chemicals: the 1000 Genomes high-throughput screening study. Environ Health Perspect 123:458–466; http://dx.doi.org/10.1289/ehp.1408775 PMID:25622337
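The worked sketch below illustrates the variability metric described above: the factor separating the 1% most sensitive individual's effective concentration from the median individual's, compared against the default 10^(1/2) assumption. The lognormal spread is an illustrative placeholder, not fitted to the study's data.

```python
# Toxicodynamic variability factor from a hypothetical per-individual EC10 distribution.
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical EC10 values (uM) for one chemical across 1,086 cell lines
ec10 = rng.lognormal(mean=np.log(10.0), sigma=0.45, size=1086)

median_ec10 = np.median(ec10)
sensitive_ec10 = np.percentile(ec10, 1)      # 1% most sensitive individual
tdvf = median_ec10 / sensitive_ec10          # toxicodynamic variability factor

print(f"TDVF_01 = {tdvf:.2f}  (default assumption is 10**0.5 = {10**0.5:.2f})")
```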
Computer simulation analysis of normal and abnormal development of the mammalian diaphragm.
Fisher, Jason C; Bodenstein, Lawrence
2006-02-17
Congenital diaphragmatic hernia (CDH) is a birth defect with significant morbidity and mortality. Knowledge of diaphragm morphogenesis and the aberrations leading to CDH is limited. Although classical embryologists described the diaphragm as arising from the septum transversum, pleuroperitoneal folds (PPF), esophageal mesentery and body wall, animal studies suggest that the PPF is the major, if not sole, contributor to the muscular diaphragm. Recently, a posterior defect in the PPF has been identified when the teratogen nitrofen is used to induce CDH in fetal rodents. We describe use of a cell-based computer modeling system (Nudge++) to study diaphragm morphogenesis. Key diaphragmatic structures were digitized from transverse serial sections of paraffin-embedded mouse embryos at embryonic days 11.5 and 13. Structure boundaries and simulated cells were combined in the Nudge++ software. Model cells were assigned putative behavioral programs, and these programs were progressively modified to produce a diaphragm consistent with the observed anatomy in rodents. Homology between our model and recent anatomical observations occurred under the following simulation conditions: (1) cell mitoses are restricted to the edge of growing tissue; (2) cells near the chest wall remain mitotically active; (3) mitotically active non-edge cells migrate toward the chest wall; and (4) movement direction depends on clonal differentiation between anterior and posterior PPF cells. With the PPF as the sole source of mitotic cells, an early defect in the PPF evolves into a posteromedial diaphragm defect, similar to that of the rodent nitrofen CDH model. A posterolateral defect, as occurs in human CDH, would be more readily recreated by invoking other cellular contributions. Our results suggest that recent reports of PPF-dominated diaphragm morphogenesis in the rodent may not be strictly applicable to man. The ability to recreate a CDH defect using a combination of experimental data and testable hypotheses gives impetus to simulation modeling as an adjunct to experimental analysis of diaphragm morphogenesis.
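As a hedged illustration of simulation condition (1) described above, the sketch below grows tissue on a lattice where only "edge" cells (those with an empty neighboring site) can divide. The geometry, conditions (2) through (4), and all parameters of the actual Nudge++ model are not reproduced here.

```python
# Minimal agent-based growth sketch: edge-restricted mitosis on a 2D lattice.
import numpy as np

rng = np.random.default_rng(4)
grid = np.zeros((40, 40), dtype=bool)
grid[18:22, 0:2] = True                      # small seed of tissue at the left ("body wall")

def empty_neighbors(grid, i, j):
    nbrs = ((i + 1, j), (i - 1, j), (i, j + 1), (i, j - 1))
    return [(a, b) for a, b in nbrs
            if 0 <= a < grid.shape[0] and 0 <= b < grid.shape[1] and not grid[a, b]]

for step in range(60):
    occupied = list(zip(*np.nonzero(grid)))
    for k in rng.permutation(len(occupied)):
        i, j = occupied[k]
        free = empty_neighbors(grid, i, j)
        if free:                             # edge cell: divide into a random empty neighbor
            a, b = free[rng.integers(len(free))]
            grid[a, b] = True

print("cells after 60 growth steps:", int(grid.sum()))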
Tong, J H S; Hawi, Z; Dark, C; Cummins, T D R; Johnson, B P; Newman, D P; Lau, R; Vance, A; Heussler, H S; Matthews, N; Bellgrove, M A; Pang, K C
2016-11-01
Attention deficit hyperactivity disorder (ADHD) is a highly heritable psychiatric condition with negative lifetime outcomes. Uncovering its genetic architecture should yield important insights into the neurobiology of ADHD and assist development of novel treatment strategies. Twenty years of candidate gene investigations and more recently genome-wide association studies have identified an array of potential association signals. In this context, separating the likely true from false associations ('the wheat' from 'the chaff') will be crucial for uncovering the functional biology of ADHD. Here, we defined a set of 2070 DNA variants that showed evidence of association with ADHD (or were in linkage disequilibrium). More than 97% of these variants were noncoding, and were prioritised for further exploration using two tools-genome-wide annotation of variants (GWAVA) and Combined Annotation-Dependent Depletion (CADD)-that were recently developed to rank variants based upon their likely pathogenicity. Capitalising on recent efforts such as the Encyclopaedia of DNA Elements and US National Institutes of Health Roadmap Epigenomics Projects to improve understanding of the noncoding genome, we subsequently identified 65 variants to which we assigned functional annotations, based upon their likely impact on alternative splicing, transcription factor binding and translational regulation. We propose that these 65 variants, which possess not only a high likelihood of pathogenicity but also readily testable functional hypotheses, represent a tractable shortlist for future experimental validation in ADHD. Taken together, this study brings into sharp focus the likely relevance of noncoding variants for the genetic risk associated with ADHD, and more broadly suggests a bioinformatics approach that should be relevant to other psychiatric disorders.
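The snippet below sketches the prioritization step described above: rank candidate noncoding variants by pathogenicity-style scores (e.g. CADD scaled scores, GWAVA scores) and keep a shortlist for functional follow-up. The variant identifiers and scores are made up, and the cutoffs are arbitrary illustrations rather than the thresholds used in the study.

```python
# Illustrative score-based shortlisting of candidate noncoding variants.
variants = [
    {"rsid": "rs0000001", "cadd_phred": 18.2, "gwava": 0.62},
    {"rsid": "rs0000002", "cadd_phred": 4.1,  "gwava": 0.21},
    {"rsid": "rs0000003", "cadd_phred": 22.7, "gwava": 0.71},
    {"rsid": "rs0000004", "cadd_phred": 11.5, "gwava": 0.48},
]

CADD_CUTOFF, GWAVA_CUTOFF = 15.0, 0.5   # arbitrary illustrative thresholds

shortlist = [v for v in variants
             if v["cadd_phred"] >= CADD_CUTOFF and v["gwava"] >= GWAVA_CUTOFF]
shortlist.sort(key=lambda v: v["cadd_phred"], reverse=True)

for v in shortlist:
    print(v["rsid"], v["cadd_phred"], v["gwava"])
```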
Estimating outflow facility through pressure dependent pathways of the human eye
Gardiner, Bruce S.
2017-01-01
We develop and test a new theory for pressure dependent outflow from the eye. The theory comprises three main parameters: (i) a constant hydraulic conductivity, (ii) an exponential decay constant and (iii) a no-flow intraocular pressure, from which the total pressure dependent outflow, average outflow facilities and local outflow facilities for the whole eye may be evaluated. We use a new notation to specify precisely the meaning of model parameters and so model outputs. Drawing on a range of published data, we apply the theory to animal eyes, enucleated eyes and in vivo human eyes, and demonstrate how to evaluate model parameters. It is shown that the theory can fit high quality experimental data remarkably well. The new theory predicts that outflow facilities and total pressure dependent outflow for the whole eye are more than twice as large as estimates based on the Goldman equation and fluorometric analysis of anterior aqueous outflow. It appears likely that this discrepancy can be largely explained by pseudofacility and aqueous flow through the retinal pigmented epithelium, while any residual discrepancy may be due to pathological processes in aged eyes. The model predicts that if the hydraulic conductivity is too small, or the exponential decay constant is too large, then intraocular eye pressure may become unstable when subjected to normal circadian changes in aqueous production. The model also predicts relationships between variables that may be helpful when planning future experiments, and the model generates many novel testable hypotheses. With additional research, the analysis described here may find application in the differential diagnosis, prognosis and monitoring of glaucoma. PMID:29261696
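The sketch below shows one way the three parameters named above (a hydraulic conductivity constant, an exponential decay constant, and a no-flow intraocular pressure) might combine. It assumes a local outflow facility that decays exponentially with pressure above the no-flow pressure; the paper's exact functional form may differ, and all parameter values are placeholders.

```python
# Hedged sketch of pressure-dependent outflow under an ASSUMED exponential-decay facility.
import numpy as np

C0   = 0.30   # assumed facility constant, uL/min/mmHg (placeholder)
beta = 0.05   # assumed exponential decay constant, 1/mmHg (placeholder)
P0   = 7.0    # assumed no-flow intraocular pressure, mmHg (placeholder)

def total_outflow(P):
    """Closed-form integral of the assumed local facility C0*exp(-beta*(p-P0)) from P0 to P."""
    return (C0 / beta) * (1.0 - np.exp(-beta * (P - P0)))

for P in (12, 15, 18, 21):
    Q = total_outflow(P)
    print(f"P = {P:2d} mmHg: Q = {Q:.2f} uL/min, average facility = {Q / (P - P0):.3f} uL/min/mmHg")
```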
2011-01-01
Background Two component regulatory systems are the primary form of signal transduction in bacteria. Although genomic binding sites have been determined for several eukaryotic and bacterial transcription factors, comprehensive identification of gene targets of two component response regulators remains challenging due to the lack of knowledge of the signals required for their activation. We focused our study on Desulfovibrio vulgaris Hildenborough, a sulfate reducing bacterium that encodes unusually diverse and largely uncharacterized two component signal transduction systems. Results We report the first systematic mapping of the genes regulated by all transcriptionally acting response regulators in a single bacterium. Our results enabled functional predictions for several response regulators and include key processes of carbon, nitrogen and energy metabolism, cell motility and biofilm formation, and responses to stresses such as nitrite, low potassium and phosphate starvation. Our study also led to the prediction of new genes and regulatory networks, which found corroboration in a compendium of transcriptome data available for D. vulgaris. For several regulators we predicted and experimentally verified the binding site motifs, most of which were discovered as part of this study. Conclusions The gene targets identified for the response regulators allowed strong functional predictions to be made for the corresponding two component systems. By tracking the D. vulgaris regulators and their motifs outside the Desulfovibrio spp. we provide testable hypotheses regarding the functions of orthologous regulators in other organisms. The in vitro array based method optimized here is generally applicable for the study of such systems in all organisms. PMID:21992415
Using next generation transcriptome sequencing to predict an ectomycorrhizal metabolome.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Larsen, P. E.; Sreedasyam, A.; Trivedi, G
Mycorrhizae, symbiotic interactions between soil fungi and tree roots, are ubiquitous in terrestrial ecosystems. The fungi contribute phosphorus, nitrogen and mobilized nutrients from organic matter in the soil and in return the fungus receives photosynthetically-derived carbohydrates. This union of plant and fungal metabolisms is the mycorrhizal metabolome. Understanding this symbiotic relationship at a molecular level provides important contributions to the understanding of forest ecosystems and global carbon cycling. We generated next generation short-read transcriptomic sequencing data from fully-formed ectomycorrhizae between Laccaria bicolor and aspen (Populus tremuloides) roots. The transcriptomic data was used to identify statistically significantly expressed gene models using a bootstrap-style approach, and these expressed genes were mapped to specific metabolic pathways. Integration of expressed genes that code for metabolic enzymes and the set of expressed membrane transporters generates a predictive model of the ectomycorrhizal metabolome. The generated model of mycorrhizal metabolome predicts that the specific compounds glycine, glutamate, and allantoin are synthesized by L. bicolor and that these compounds or their metabolites may be used for the benefit of aspen in exchange for the photosynthetically-derived sugars fructose and glucose. The analysis illustrates an approach to generate testable biological hypotheses to investigate the complex molecular interactions that drive ectomycorrhizal symbiosis. These models are consistent with experimental environmental data and provide insight into the molecular exchange processes for organisms in this complex ecosystem. The method used here for predicting metabolomic models of mycorrhizal systems from deep RNA sequencing data can be generalized and is broadly applicable to transcriptomic data derived from complex systems.
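The sketch below gives a hedged impression of a bootstrap-style call of "expressed" gene models from short-read counts, in the spirit of the approach described above; the paper's exact procedure may differ, and the counts and noise assumption are placeholders.

```python
# Bootstrap-style expression call from hypothetical per-gene read counts.
import numpy as np

rng = np.random.default_rng(2)
counts = rng.negative_binomial(n=2, p=0.2, size=5000)   # hypothetical per-gene read counts

def bootstrap_threshold(counts, q=0.95, n_boot=1000):
    """Bootstrap the upper quantile of a background formed from the lowest-count genes."""
    background = np.sort(counts)[: len(counts) // 2]     # ASSUME bottom half approximates noise
    stats = [np.quantile(rng.choice(background, size=len(background), replace=True), q)
             for _ in range(n_boot)]
    return float(np.mean(stats))

threshold = bootstrap_threshold(counts)
expressed = counts > threshold
print(f"threshold ~ {threshold:.1f} reads; {expressed.sum()} of {len(counts)} gene models called expressed")
```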
Modulation of hippocampal rhythms by subthreshold electric fields and network topology
Berzhanskaya, Julia; Chernyy, Nick; Gluckman, Bruce J.; Schiff, Steven J.; Ascoli, Giorgio A.
2012-01-01
Theta (4–12 Hz) and gamma (30–80 Hz) rhythms are considered important for cortical and hippocampal function. Although several neuron types are implicated in rhythmogenesis, the exact cellular mechanisms remain unknown. Subthreshold electric fields provide a flexible, area-specific tool to modulate neural activity and directly test functional hypotheses. Here we present experimental and computational evidence of the interplay among hippocampal synaptic circuitry, neuronal morphology, external electric fields, and network activity. Electrophysiological data are used to constrain and validate an anatomically and biophysically realistic model of area CA1 containing pyramidal cells and two interneuron types: dendritic- and perisomatic-targeting. We report two lines of results: addressing the network structure capable of generating theta-modulated gamma rhythms, and demonstrating electric field effects on those rhythms. First, theta-modulated gamma rhythms require specific inhibitory connectivity. In one configuration, GABAergic axo-dendritic feedback on pyramidal cells is only effective in proximal but not distal layers. An alternative configuration requires two distinct perisomatic interneuron classes, one exclusively receiving excitatory contacts, the other additionally targeted by inhibition. These observations suggest novel roles for particular classes of oriens and basket cells. The second major finding is that subthreshold electric fields robustly alter the balance between different rhythms. Independent of network configuration, positive electric fields decrease, while negative fields increase the theta/gamma ratio. Moreover, electric fields differentially affect average theta frequency depending on specific synaptic connectivity. These results support the testable prediction that subthreshold electric fields can alter hippocampal rhythms, suggesting new approaches to explore their cognitive functions and underlying circuitry. PMID:23053863
Harris, Jenine K; Erwin, Paul C; Smith, Carson; Brownson, Ross C
2015-01-01
Evidence-based decision making (EBDM) is the process, in local health departments (LHDs) and other settings, of translating the best available scientific evidence into practice. Local health departments are more likely to be successful if they use evidence-based strategies. However, EBDM and use of evidence-based strategies by LHDs are not widespread. Drawing on diffusion of innovations theory, we sought to understand how LHD directors and program managers perceive the relative advantage, compatibility, simplicity, and testability of EBDM. Directors and managers of programs in chronic disease, environmental health, and infectious disease from LHDs nationwide completed a survey including demographic information and questions about diffusion attributes (advantage, compatibility, simplicity, and testability) related to EBDM. Bivariate inferential tests were used to compare responses between directors and managers and to examine associations between participant characteristics and diffusion attributes. Relative advantage and compatibility scores were high for directors and managers, whereas simplicity and testability scores were lower. Although health department directors and managers of programs in chronic disease generally had higher scores than other groups, there were few significant or large differences between directors and managers across the diffusion attributes. Larger jurisdiction population size was associated with higher relative advantage and compatibility scores for both directors and managers. Overall, directors and managers were in strong agreement on the relative advantage of an LHD using EBDM, with directors in stronger agreement than managers. Perceived relative advantage has been demonstrated to be the most important factor in the rate of innovation adoption, suggesting an opportunity for directors to speed EBDM adoption. However, lower average scores across all groups for simplicity and testability may be hindering EBDM adoption. Recommended strategies for increasing perceived EBDM simplicity and testability are provided.
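As a hedged illustration of the bivariate comparisons described above, the snippet below compares directors' and managers' scores on one diffusion attribute (relative advantage). The scores are fabricated placeholders, and the test choice (Mann-Whitney U) is an illustration rather than the study's exact method.

```python
# Illustrative two-group comparison of a diffusion-attribute score.
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(5)
directors = np.clip(rng.normal(4.4, 0.5, 150), 1, 5)   # hypothetical 1-5 Likert scores
managers  = np.clip(rng.normal(4.2, 0.6, 400), 1, 5)

stat, p = mannwhitneyu(directors, managers, alternative="two-sided")
print(f"U = {stat:.0f}, p = {p:.3f}")
```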
Kidane, Yared H; Lawrence, Christopher; Murali, T M
2013-10-07
Fungi are the second most abundant type of human pathogens. Invasive fungal pathogens are leading causes of life-threatening infections in clinical settings. Toxicity to the host and drug-resistance are two major deleterious issues associated with existing antifungal agents. Increasing a host's tolerance and/or immunity to fungal pathogens has potential to alleviate these problems. A host's tolerance may be improved by modulating the immune system such that it responds more rapidly and robustly in all facets, ranging from the recognition of pathogens to their clearance from the host. An understanding of biological processes and genes that are perturbed during attempted fungal exposure, colonization, and/or invasion will help guide the identification of endogenous immunomodulators and/or small molecules that activate host-immune responses such as specialized adjuvants. In this study, we present computational techniques and approaches using publicly available transcriptional data sets, to predict immunomodulators that may act against multiple fungal pathogens. Our study analyzed data sets derived from host cells exposed to five fungal pathogens, namely, Alternaria alternata, Aspergillus fumigatus, Candida albicans, Pneumocystis jirovecii, and Stachybotrys chartarum. We observed statistically significant associations between host responses to A. fumigatus and C. albicans. Our analysis identified biological processes that were consistently perturbed by these two pathogens. These processes contained both immune response-inducing genes such as MALT1, SERPINE1, ICAM1, and IL8, and immune response-repressing genes such as DUSP8, DUSP6, and SPRED2. We hypothesize that these genes belong to a pool of common immunomodulators that can potentially be activated or suppressed (agonized or antagonized) in order to render the host more tolerant to infections caused by A. fumigatus and C. albicans. Our computational approaches and methodologies described here can now be applied to newly generated or expanded data sets for further elucidation of additional drug targets. Moreover, identified immunomodulators may be used to generate experimentally testable hypotheses that could help in the discovery of broad-spectrum immunotherapeutic interventions. All of our results are available at the following supplementary website: http://bioinformatics.cs.vt.edu/~murali/supplements/2013-kidane-bmc.
2013-01-01
Background Fungi are the second most abundant type of human pathogens. Invasive fungal pathogens are leading causes of life-threatening infections in clinical settings. Toxicity to the host and drug-resistance are two major deleterious issues associated with existing antifungal agents. Increasing a host’s tolerance and/or immunity to fungal pathogens has potential to alleviate these problems. A host’s tolerance may be improved by modulating the immune system such that it responds more rapidly and robustly in all facets, ranging from the recognition of pathogens to their clearance from the host. An understanding of biological processes and genes that are perturbed during attempted fungal exposure, colonization, and/or invasion will help guide the identification of endogenous immunomodulators and/or small molecules that activate host-immune responses such as specialized adjuvants. Results In this study, we present computational techniques and approaches using publicly available transcriptional data sets, to predict immunomodulators that may act against multiple fungal pathogens. Our study analyzed data sets derived from host cells exposed to five fungal pathogens, namely, Alternaria alternata, Aspergillus fumigatus, Candida albicans, Pneumocystis jirovecii, and Stachybotrys chartarum. We observed statistically significant associations between host responses to A. fumigatus and C. albicans. Our analysis identified biological processes that were consistently perturbed by these two pathogens. These processes contained both immune response-inducing genes such as MALT1, SERPINE1, ICAM1, and IL8, and immune response-repressing genes such as DUSP8, DUSP6, and SPRED2. We hypothesize that these genes belong to a pool of common immunomodulators that can potentially be activated or suppressed (agonized or antagonized) in order to render the host more tolerant to infections caused by A. fumigatus and C. albicans. Conclusions Our computational approaches and methodologies described here can now be applied to newly generated or expanded data sets for further elucidation of additional drug targets. Moreover, identified immunomodulators may be used to generate experimentally testable hypotheses that could help in the discovery of broad-spectrum immunotherapeutic interventions. All of our results are available at the following supplementary website: http://bioinformatics.cs.vt.edu/~murali/supplements/2013-kidane-bmc PMID:24099000
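The sketch below illustrates one way to test whether the host-response gene sets for two pathogens overlap more than expected by chance (a hypergeometric test), in the spirit of the cross-pathogen comparison described above. All set sizes are hypothetical.

```python
# Hypergeometric test for overlap between two perturbed gene sets.
from scipy.stats import hypergeom

N = 20000      # genes assayed (hypothetical)
n_af = 800     # genes perturbed by A. fumigatus exposure (hypothetical)
n_ca = 650     # genes perturbed by C. albicans exposure (hypothetical)
overlap = 120  # genes perturbed by both (hypothetical)

# P(overlap >= observed) when drawing n_ca genes from N with n_af "successes"
p = hypergeom.sf(overlap - 1, N, n_af, n_ca)
print(f"overlap = {overlap}, expected ~ {n_af * n_ca / N:.1f}, p = {p:.2e}")
```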
Percolation mechanism drives actin gels to the critically connected state
NASA Astrophysics Data System (ADS)
Lee, Chiu Fan; Pruessner, Gunnar
2016-05-01
Cell motility and tissue morphogenesis depend crucially on the dynamic remodeling of actomyosin networks. An actomyosin network consists of an actin polymer network connected by cross-linker proteins and motor protein myosins that generate internal stresses on the network. A recent discovery shows that for a range of experimental parameters, actomyosin networks contract to clusters with a power-law size distribution [J. Alvarado, Nat. Phys. 9, 591 (2013), 10.1038/nphys2715]. Here, we argue that actomyosin networks can exhibit a robust critical signature without fine-tuning because the dynamics of the system can be mapped onto a modified version of percolation with trapping (PT), which is known to show critical behavior belonging to the static percolation universality class without the need for fine-tuning of a control parameter. We further employ our PT model to generate experimentally testable predictions.
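The minimal Monte Carlo sketch below shows the kind of critical signature discussed above: cluster sizes in ordinary 2D site percolation near threshold spread over many decades with a power-law-like tail. This illustrates the percolation analogy only; it is not the percolation-with-trapping model of the paper.

```python
# Cluster-size distribution of 2D site percolation near the critical threshold.
import numpy as np
from scipy.ndimage import label

rng = np.random.default_rng(3)
p_c = 0.5927                                 # site-percolation threshold on the square lattice
grid = rng.random((512, 512)) < p_c          # occupy sites with probability ~p_c

labels, n_clusters = label(grid)             # 4-connected clusters by default
sizes = np.bincount(labels.ravel())[1:]      # drop the background label 0

# Histogram of cluster sizes on logarithmic bins (broad, power-law-like tail expected)
bins = np.logspace(0, np.log10(sizes.max()), 20)
hist, edges = np.histogram(sizes, bins=bins)
for lo, hi, c in zip(edges[:-1], edges[1:], hist):
    if c:
        print(f"size {lo:8.0f}-{hi:8.0f}: {c} clusters")
```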
NASA Astrophysics Data System (ADS)
Wang, Jun-Wei; Zhou, Tian-Shou
2009-12-01
In this paper, we develop a new mathematical model for the mammalian circadian clock, which incorporates both transcriptional/translational feedback loops (TTFLs) and a cAMP-mediated feedback loop. The model shows that TTFLs and cAMP signalling cooperatively drive the circadian rhythms. It reproduces typical experimental observations with qualitative similarities, e.g. circadian oscillations in constant darkness and entrainment to light-dark cycles. In addition, it can explain the phenotypes of cAMP-mutant and Rev-erbα-/--mutant mice, and help us make an experimentally-testable prediction: oscillations may be rescued when arrhythmic mice with constitutively low concentrations of cAMP are crossed with Rev-erbα-/- mutant mice. The model enhances our understanding of the mammalian circadian clockwork from the viewpoint of the entire cell.
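As a hedged stand-in for the far more detailed TTFL plus cAMP model described above, the sketch below integrates a minimal Goodwin-type transcription-translation feedback loop that produces sustained oscillations. The equations and parameters are generic textbook choices, not those of the paper.

```python
# Minimal Goodwin-type oscillator: mRNA -> protein -> nuclear repressor -> transcription.
import numpy as np
from scipy.integrate import solve_ivp

def goodwin(t, y, n=10, k=1.0):
    m, p, r = y                      # mRNA, protein, nuclear repressor
    dm = k / (1.0 + r**n) - 0.2 * m  # transcription repressed by r
    dp = m - 0.2 * p                 # translation
    dr = p - 0.2 * r                 # nuclear accumulation of the repressor
    return [dm, dp, dr]

sol = solve_ivp(goodwin, (0, 200), [0.1, 0.1, 0.1], max_step=0.1)
m = sol.y[0]
# Rough period estimate from successive local maxima of the mRNA trace
peaks = np.where((m[1:-1] > m[:-2]) & (m[1:-1] > m[2:]))[0] + 1
if len(peaks) > 1:
    print("approximate period:", np.round(np.diff(sol.t[peaks]).mean(), 1), "time units (arbitrary)")
else:
    print("no sustained oscillation detected for these parameters")
```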
Robertson, Scott; Leonhardt, Ulf
2014-11-01
Hawking radiation has become experimentally testable thanks to the many analog systems which mimic the effects of the event horizon on wave propagation. These systems are typically dominated by dispersion and give rise to a numerically soluble and stable ordinary differential equation only if the rest-frame dispersion relation Ω^{2}(k) is a polynomial of relatively low degree. Here we present a new method for the calculation of wave scattering in a one-dimensional medium of arbitrary dispersion. It views the wave equation as an integral equation in Fourier space, which can be solved using standard and efficient numerical techniques.
Proposed experiment to test fundamentally binary theories
NASA Astrophysics Data System (ADS)
Kleinmann, Matthias; Vértesi, Tamás; Cabello, Adán
2017-09-01
Fundamentally binary theories are nonsignaling theories in which measurements of many outcomes are constructed by selecting from binary measurements. They constitute a sensible alternative to quantum theory and have never been directly falsified by any experiment. Here we show that fundamentally binary theories are experimentally testable with current technology. For that, we identify a feasible Bell-type experiment on pairs of entangled qutrits. In addition, we prove that, for any n, quantum n-ary correlations are not fundamentally (n-1)-ary. For that, we introduce a family of inequalities that hold for fundamentally (n-1)-ary theories but are violated by quantum n-ary correlations.
Emergent quantum mechanics without wavefunctions
NASA Astrophysics Data System (ADS)
Mesa Pascasio, J.; Fussy, S.; Schwabl, H.; Grössing, G.
2016-03-01
We present our model of an Emergent Quantum Mechanics which can be characterized by “realism without pre-determination”. This is illustrated by our analytic description and corresponding computer simulations of Bohmian-like “surreal” trajectories, which are obtained classically, i.e. without the use of any quantum mechanical tool such as wavefunctions. However, these trajectories do not necessarily represent ontological paths of particles but rather mappings of the probability density flux in a hydrodynamical sense. Modelling emergent quantum mechanics in a high-low intensity double-slit scenario gives rise to the “quantum sweeper effect” with a characteristic intensity pattern. This phenomenon should be experimentally testable via weak measurement techniques.
Subluxation: dogma or science?
Keating, Joseph C; Charlton, Keith H; Grod, Jaroslaw P; Perle, Stephen M; Sikorski, David; Winterstein, James F
2005-01-01
Subluxation syndrome is a legitimate, potentially testable, theoretical construct for which there is little experimental evidence. Acceptable as hypothesis, the widespread assertion of the clinical meaningfulness of this notion brings ridicule from the scientific and health care communities and confusion within the chiropractic profession. We believe that an evidence-orientation among chiropractors requires that we distinguish between subluxation dogma vs. subluxation as the potential focus of clinical research. We lament efforts to generate unity within the profession through consensus statements concerning subluxation dogma, and believe that cultural authority will continue to elude us so long as we assert dogma as though it were validated clinical theory. PMID:16092955
NASA Astrophysics Data System (ADS)
Tu, K. M.; Matubayasi, N.; Liang, K. K.; Todorov, I. T.; Chan, S. L.; Chau, P.-L.
2012-08-01
We placed halothane, a general anaesthetic, inside palmitoyloleoylphosphatidylcholine (POPC) bilayers and performed molecular dynamics simulations at atmospheric and raised pressures. We demonstrated that halothane aggregated inside POPC membranes at 20 MPa but not at 40 MPa. The pressure range of aggregation matches that of pressure reversal in whole animals, and strongly suggests that this could be the mechanism for this effect. Combining these results with previous experimental data, we describe a testable hypothesis of how aggregation of general anaesthetics at high pressure can lead to pressure reversal, the effect whereby these drugs lose their efficacy at high pressure.
Broken SU(3) antidecuplet for Θ+ and Ξ3/2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pakvasa, Sandip; Suzuki, Mahiko
2004-05-05
If the narrow exotic baryon resonances Θ+(1540) and Ξ3/2 are members of the J^P = 1/2^+ antidecuplet with N*(1710), the octet-antidecuplet mixing is required not only by the mass spectrum but also by the decay pattern of N*(1710). This casts doubt on the validity of the Θ+ mass prediction by the chiral soliton model. While all pieces of the existing experimental information point to a small octet-decuplet mixing, the magnitude of mixing required by the mass spectrum is not consistent with the value needed to account for the hadronic decay rates. The discrepancy is not resolved even after the large experimental uncertainty is taken into consideration. We fail to find an alternative SU(3) assignment even with different spin-parity assignment. When we extend the analysis to mixing with a higher SU(3) multiplet, we find one experimentally testable scenario in the case of mixing with a 27-plet.
Testability Design Rating System: Testability Handbook. Volume 1
1992-02-01
Excerpted front matter (table-of-contents and acronym-list fragments): 4.7.5 Summary of False BIT Alarms (FBA); 4.7.6 Smart BIT Technique (reference: RADC-TR-85-198). "Smart" BIT is a term given to BIT circuitry in a system LRU which includes dedicated processor/memory ...
Smart substrates: Making multi-chip modules smarter
NASA Astrophysics Data System (ADS)
Wunsch, T. F.; Treece, R. K.
1995-05-01
A novel multi-chip module (MCM) design and manufacturing methodology which utilizes active CMOS circuits in what is normally a passive substrate realizes the 'smart substrate' for use in highly testable, high reliability MCMs. The active devices are used to test the bare substrate, diagnose assembly errors or integrated circuit (IC) failures that require rework, and improve the testability of the final MCM assembly. A static random access memory (SRAM) MCM has been designed and fabricated in Sandia's Microelectronics Development Laboratory in order to demonstrate the technical feasibility of this concept and to examine design and manufacturing issues which will ultimately determine the economic viability of this approach. The smart substrate memory MCM represents a first in MCM packaging. At the time the first modules were fabricated, no other company or MCM vendor had incorporated active devices in the substrate to improve manufacturability and testability, and thereby improve MCM reliability and reduce cost.
Morris, Melody K; Shriver, Zachary; Sasisekharan, Ram; Lauffenburger, Douglas A
2012-03-01
Mathematical models have substantially improved our ability to predict the response of a complex biological system to perturbation, but their use is typically limited by difficulties in specifying model topology and parameter values. Additionally, incorporating entities across different biological scales ranging from molecular to organismal in the same model is not trivial. Here, we present a framework called "querying quantitative logic models" (Q2LM) for building and asking questions of constrained fuzzy logic (cFL) models. cFL is a recently developed modeling formalism that uses logic gates to describe influences among entities, with transfer functions to describe quantitative dependencies. Q2LM does not rely on dedicated data to train the parameters of the transfer functions, and it permits straightforward incorporation of entities at multiple biological scales. The Q2LM framework can be employed to ask questions such as: Which therapeutic perturbations accomplish a designated goal, and under what environmental conditions will these perturbations be effective? We demonstrate the utility of this framework for generating testable hypotheses in two examples: (i) an intracellular signaling network model; and (ii) a model for pharmacokinetics and pharmacodynamics of cell-cytokine interactions; in the latter, we validate hypotheses concerning molecular design of granulocyte colony stimulating factor. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
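The snippet below is a minimal sketch of the constrained fuzzy logic idea described above: normalized Hill transfer functions quantify each influence, and logic gates combine them (here AND as a product, OR as a maximum). The toy network, parameters, and gate choices are illustrative, not those of Q2LM.

```python
# Toy constrained-fuzzy-logic-style network with Hill transfer functions and logic gates.
def hill(x, ec50=0.5, n=3):
    """Normalized Hill transfer function mapping a [0,1] input to a [0,1] activation."""
    return x**n / (ec50**n + x**n) * (ec50**n + 1.0)  # scaled so hill(1.0) == 1.0

def AND(*inputs):
    out = 1.0
    for x in inputs:
        out *= x
    return out

def OR(*inputs):
    return max(inputs)

# Toy network: a drug D inhibits kinase K; ligand L activates K; K activates output O.
def output_activity(ligand, drug):
    k = AND(hill(ligand), 1.0 - hill(drug))   # K requires ligand AND absence of drug
    return hill(k)

for drug in (0.0, 0.25, 0.5, 1.0):
    print(f"drug = {drug:.2f} -> output = {output_activity(1.0, drug):.2f}")
```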
Parvovirus B19 Infection in Children With Arterial Ischemic Stroke.
Fullerton, Heather J; Luna, Jorge M; Wintermark, Max; Hills, Nancy K; Tokarz, Rafal; Li, Ying; Glaser, Carol; DeVeber, Gabrielle A; Lipkin, W Ian; Elkind, Mitchell S V
2017-10-01
Case-control studies suggest that acute infection transiently increases the risk of childhood arterial ischemic stroke. We hypothesized that an unbiased pathogen discovery approach utilizing MassTag-polymerase chain reaction would identify pathogens in the blood of childhood arterial ischemic stroke cases. The multicenter international VIPS study (Vascular Effects of Infection in Pediatric Stroke) enrolled arterial ischemic stroke cases, and stroke-free controls, aged 29 days through 18 years. Parental interview included questions on recent infections. In this pilot study, we used MassTag-polymerase chain reaction to test the plasma of the first 161 cases and 34 controls enrolled for a panel of 28 common bacterial and viral pathogens. Pathogen DNA was detected in no controls and 14 cases (8.7%): parvovirus B19 (n=10), herpesvirus 6 (n=2), adenovirus (n=1), and rhinovirus 6C (n=1). Parvovirus B19 infection was confirmed by serologies in all 10; infection was subclinical in 8. Four cases with parvovirus B19 had underlying congenital heart disease, whereas another 5 had a distinct arteriopathy involving a long-segment stenosis of the distal internal carotid and proximal middle cerebral arteries. Using MassTag-polymerase chain reaction, we detected parvovirus B19-a virus known to infect erythrocytes and endothelial cells-in some cases of childhood arterial ischemic stroke. This approach can generate new, testable hypotheses about childhood stroke pathogenesis. © 2017 American Heart Association, Inc.
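Using the detection counts reported above (14 of 161 cases versus 0 of 34 controls), the case-control comparison such a pilot supports can be sketched with an exact test; the two-sided alternative is an assumption.

    from scipy.stats import fisher_exact

    # Pathogen DNA detected in 14 of 161 cases and 0 of 34 controls (as reported).
    table = [[14, 161 - 14],
             [0, 34]]
    _, p_value = fisher_exact(table, alternative="two-sided")
    print(f"Fisher exact p = {p_value:.3f}")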
Rothwell, Gar W; Wyatt, Sarah E; Tomescu, Alexandru M F
2014-06-01
Paleontology yields essential evidence for inferring not only the pattern of evolution, but also the genetic basis of evolution within an ontogenetic framework. Plant fossils provide evidence for the pattern of plant evolution in the form of transformational series of structure through time. Developmentally diagnostic structural features that serve as "fingerprints" of regulatory genetic pathways also are preserved by plant fossils, and here we provide examples of how those fingerprints can be used to infer the mechanisms by which plant form and development have evolved. When coupled with an understanding of variations and systematic distributions of specific regulatory genetic pathways, this approach provides an avenue for testing evolutionary hypotheses at the organismal level that is analogous to employing bioinformatics to explore genetics at the genomic level. The positions where specific genes, gene families, and developmental regulatory mechanisms first appear in phylogenies are correlated with the positions where fossils with the corresponding structures occur on the tree, thereby yielding testable hypotheses that extend our understanding of the role of developmental changes in the evolution of the body plans of vascular plant sporophytes. As a result, we now have new and powerful methodologies for characterizing major evolutionary changes in morphology, anatomy, and physiology that have resulted from combinations of genetic regulatory changes and that have produced the synapomorphies by which we recognize major clades of plants. © 2014 Botanical Society of America, Inc.
Cyclin A2 promotes DNA repair in the brain during both development and aging.
Gygli, Patrick E; Chang, Joshua C; Gokozan, Hamza N; Catacutan, Fay P; Schmidt, Theresa A; Kaya, Behiye; Goksel, Mustafa; Baig, Faisal S; Chen, Shannon; Griveau, Amelie; Michowski, Wojciech; Wong, Michael; Palanichamy, Kamalakannan; Sicinski, Piotr; Nelson, Randy J; Czeisler, Catherine; Otero, José J
2016-07-01
Various stem cell niches of the brain have differential requirements for Cyclin A2. Cyclin A2 loss results in marked cerebellar dysmorphia, whereas forebrain growth is retarded during early embryonic development yet achieves normal size at birth. To understand the differential requirements of distinct brain regions for Cyclin A2, we utilized neuroanatomical, transgenic mouse, and mathematical modeling techniques to generate testable hypotheses that provide insight into how Cyclin A2 loss results in compensatory forebrain growth during late embryonic development. Using unbiased measurements of the forebrain stem cell niche, we parameterized a mathematical model whereby logistic growth instructs progenitor cells as to the cell-types of their progeny. Our data were consistent with prior findings that progenitors proliferate along an auto-inhibitory growth curve. The growth retardation in CCNA2-null brains corresponded to cell cycle lengthening, imposing a developmental delay. We hypothesized that Cyclin A2 regulates DNA repair and that CCNA2-null progenitors thus experienced a lengthened cell cycle. We demonstrate that CCNA2-null progenitors suffer abnormal DNA repair, and implicate Cyclin A2 in double-strand break repair. Cyclin A2's DNA repair functions are conserved among cell lines, neural progenitors, and hippocampal neurons. We further demonstrate that neuronal CCNA2 ablation results in learning and memory deficits in aged mice.
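The auto-inhibitory (logistic) growth curve mentioned above can be sketched numerically; the growth rates, carrying capacity, and starting population below are illustrative assumptions, not the fitted parameters of the paper's niche model.

    import numpy as np
    from scipy.integrate import odeint

    def logistic(n, t, r, k):
        # dN/dt = r * N * (1 - N/K): proliferation slows as the niche fills.
        return r * n * (1.0 - n / k)

    t = np.linspace(0.0, 10.0, 101)                              # developmental time, arbitrary units
    wild_type = odeint(logistic, 1e3, t, args=(1.2, 1e5)).ravel()
    mutant = odeint(logistic, 1e3, t, args=(0.8, 1e5)).ravel()   # longer cell cycle -> smaller growth rate
    print(f"final niche sizes: wild type {wild_type[-1]:.0f}, Ccna2-null {mutant[-1]:.0f}")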
Vázquez-Lobo, Alejandra; Carlsbecker, Annelie; Vergara-Silva, Francisco; Alvarez-Buylla, Elena R; Piñero, Daniel; Engström, Peter
2007-01-01
The identity of genes causally implicated in the development and evolutionary origin of reproductive characters in gymnosperms is largely unknown. Working within the framework of plant evolutionary developmental biology, here we have cloned, sequenced, performed phylogenetic analyses upon and tested the expression patterns of LEAFY/FLORICAULA and NEEDLY orthologs in reproductive structures from selected species of the conifer genera Picea, Podocarpus, and Taxus. Contrary to expectations based on previous assessments, expression of LFY/FLO and NLY in cones of these taxa was found to occur simultaneously in a single reproductive axis, initially overlapping but later in mutually exclusive primordia and/or groups of developing cells in both female and male structures. These observations directly affect the status of the "mostly male theory" for the origin of the angiosperm flower. On the other hand, comparative spatiotemporal patterns of the expression of these genes suggest a complex genetic regulatory network of cone development, as well as a scheme of functional divergence for LFY/FLO with respect to NLY homologs in gymnosperms, both with clear heterochronic aspects. Results presented in this study contribute to the understanding of the molecular-genetic basis of morphological evolution in conifer cones, and may aid in establishing a foundation for gymnosperm-specific, testable evo-devo hypotheses.
Use of direct gradient analysis to uncover biological hypotheses in 16S survey data and beyond.
Erb-Downward, John R; Sadighi Akha, Amir A; Wang, Juan; Shen, Ning; He, Bei; Martinez, Fernando J; Gyetko, Margaret R; Curtis, Jeffrey L; Huffnagle, Gary B
2012-01-01
This study investigated the use of direct gradient analysis of bacterial 16S pyrosequencing surveys to identify relevant bacterial community signals in the midst of a "noisy" background, and to facilitate hypothesis-testing both within and beyond the realm of ecological surveys. The results, utilizing 3 different real world data sets, demonstrate the utility of adding direct gradient analysis to any analysis that draws conclusions from indirect methods such as Principal Component Analysis (PCA) and Principal Coordinates Analysis (PCoA). Direct gradient analysis produces testable models, and can identify significant patterns in the midst of noisy data. Additionally, we demonstrate that direct gradient analysis can be used with other kinds of multivariate data sets, such as flow cytometric data, to identify differentially expressed populations. The results of this study demonstrate the utility of direct gradient analysis in microbial ecology and in other areas of research where large multivariate data sets are involved.
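A minimal redundancy-analysis (direct gradient analysis) sketch is given below: the community matrix is first regressed on the explanatory gradient, and the ordination is computed on the fitted values rather than on the raw data as in PCA. The simulated abundance table and gradient are placeholders, not the study's 16S data.

    import numpy as np

    rng = np.random.default_rng(0)
    gradient = rng.normal(size=(30, 1))                  # one explanatory variable per sample
    abundance = gradient @ rng.normal(size=(1, 50)) + rng.normal(scale=2.0, size=(30, 50))

    # Step 1: regress each taxon on the gradient (the "direct" constraint).
    design = np.hstack([np.ones((30, 1)), gradient])
    beta, *_ = np.linalg.lstsq(design, abundance, rcond=None)
    fitted = design @ beta
    fitted -= fitted.mean(axis=0)

    # Step 2: ordinate the constrained (fitted) values; the leading axis is the RDA axis.
    _, s, _ = np.linalg.svd(fitted, full_matrices=False)
    centered = abundance - abundance.mean(axis=0)
    print(f"variance explained by the constrained axis: {s[0]**2 / (centered**2).sum():.2f}")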
New streams and springs after the 2014 Mw6.0 South Napa earthquake
Wang, Chi-Yuen; Manga, Michael
2015-01-01
Many streams and springs, which were dry or nearly dry before the 2014 Mw6.0 South Napa earthquake, started to flow after the earthquake. A United States Geological Survey stream gauge also registered a coseismic increase in discharge. Public interest was heightened by a state of extreme drought in California. Since the new flows were not contaminated by pre-existing surface water, their composition allowed unambiguous identification of their origin. Following the earthquake we repeatedly surveyed the new flows, collecting data to test hypotheses about their origin. We show that the new flows originated from groundwater in nearby mountains released by the earthquake. The estimated total amount of new water is ∼10^6 m³, about 1/40 of the annual water use in the Napa–Sonoma area. Our model also makes a testable prediction of a post-seismic decrease of seismic velocity in the shallow crust of the affected region. PMID:26158898
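An estimate of this kind is typically obtained by integrating the excess (post-seismic minus baseline) discharge over time; the sketch below uses an invented decaying hydrograph, not the USGS gauge record, and all values are assumptions.

    import numpy as np

    days = np.arange(0, 120)                                  # days after the earthquake
    baseline = 0.002                                          # pre-seismic discharge, m^3/s (assumed)
    discharge = baseline + 0.3 * np.exp(-days / 30.0)         # hypothetical decaying post-seismic flow

    excess = np.clip(discharge - baseline, 0.0, None)         # m^3/s above baseline
    total_m3 = float(np.sum(excess) * 86400.0)                # one-day steps, integrated in seconds
    print(f"estimated new water: {total_m3:.1e} m^3")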
Hook, Paul W; McClymont, Sarah A; Cannon, Gabrielle H; Law, William D; Morton, A Jennifer; Goff, Loyal A; McCallion, Andrew S
2018-03-01
Genetic variation modulating risk of sporadic Parkinson disease (PD) has been primarily explored through genome-wide association studies (GWASs). However, as in many other common genetic diseases, the genes affected by these risk variants remain largely unknown. Here, we used single-cell RNA-seq to characterize dopaminergic (DA) neuron populations in the mouse brain at embryonic and early postnatal time points. These data facilitated unbiased identification of DA neuron subpopulations through their unique transcriptional profiles, including a postnatal neuroblast population and substantia nigra (SN) DA neurons. We use these population-specific data to develop a scoring system to prioritize candidate genes in all 49 GWAS intervals implicated in PD risk, including genes with known PD associations and many with extensive supporting literature. As proof of principle, we confirm that the nigrostriatal pathway is compromised in Cplx1-null mice. Ultimately, this systematic approach establishes biologically pertinent candidates and testable hypotheses for sporadic PD, informing a new era of PD genetic research. Copyright © 2018 American Society of Human Genetics. All rights reserved.
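A toy version of such a prioritization score is sketched below: genes within a GWAS interval are ranked by how specifically they are expressed in SN DA neurons relative to the other single-cell-defined populations. The expression values, gene names, and the specificity formula are illustrative assumptions, not the authors' scoring system.

    import pandas as pd

    # Mean expression per gene (rows) in each single-cell-defined population (columns); toy values.
    expr = pd.DataFrame(
        {"SN_DA": [9.0, 0.5, 4.0], "neuroblast": [1.0, 0.4, 3.8], "other": [0.8, 0.6, 4.1]},
        index=["geneA", "geneB", "geneC"],        # hypothetical genes within one GWAS interval
    )

    # Specificity: fraction of each gene's total expression contributed by SN DA neurons.
    score = expr["SN_DA"] / expr.sum(axis=1)
    print(score.sort_values(ascending=False))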
Greenwood, Pamela M; Blumberg, Eric J; Scheldrup, Melissa R
2018-03-01
A comprehensive explanation is lacking for the broad array of cognitive effects modulated by transcranial direct current stimulation (tDCS). We advanced the testable hypothesis that tDCS to the default mode network (DMN) increases processing of goals and stored information at the expense of external events. We further hypothesized that tDCS to the dorsal attention network (DAN) increases processing of external events at the expense of goals and stored information. A literature search (PsychINFO) identified 42 empirical studies and 3 meta-analyses examining effects of prefrontal and/or parietal tDCS on tasks that selectively required external and/or internal processing. Most, though not all, of the studies that met our search criteria supported our hypothesis. Three meta-analyses supported our hypothesis. The hypothesis we advanced provides a framework for the design and interpretation of results in light of the role of large-scale intrinsic networks that govern attention. Copyright © 2017 Elsevier Ltd. All rights reserved.
Gao, Yu; Fangel, Jonatan U; Willats, William G T; Vivier, Melané A; Moore, John P
2016-11-05
The effectiveness of enzyme-mediated maceration in red winemaking relies on the use of an optimum combination of specific enzymes. A lack of information on the relevant enzyme activities and the corresponding polysaccharide-rich berry cell wall structure is a major limitation. This study used different combinations of purified recombinant pectinases with cell wall profiling tools to follow the deconstruction process during winemaking. Multivariate data analysis of the glycan microarray (CoMPP) and gas chromatography (GC) results revealed that pectin lyase performed almost as effectively in de-pectination as certain commercial enzyme mixtures. Surprisingly, the combination of endo-polygalacturonase and pectin-methyl-esterase only unraveled the cell walls without de-pectination. Datasets from the various combinations used confirmed pectin-rich and xyloglucan-rich layers within the grape pomace. These data support a proposed grape cell wall model which can serve as a foundation to evaluate testable hypotheses in future studies aimed at developing tailor-made enzymes for winemaking scenarios. Copyright © 2016 Elsevier Ltd. All rights reserved.
Genetics and epigenetics of rheumatoid arthritis
Viatte, Sebastien; Plant, Darren; Raychaudhuri, Soumya
2013-01-01
Investigators have made key advances in rheumatoid arthritis (RA) genetics in the past 10 years. Although genetic studies have had limited influence on clinical practice and drug discovery, they are currently generating testable hypotheses to explain disease pathogenesis. Firstly, we review here the major advances in identifying RA genetic susceptibility markers both within and outside of the MHC. Understanding how genetic variants translate into pathogenic mechanisms and ultimately into phenotypes remains a mystery for most of the polymorphisms that confer susceptibility to RA, but functional data are emerging. Interplay between environmental and genetic factors is poorly understood and in need of further investigation. Secondly, we review current knowledge of the role of epigenetics in RA susceptibility. Differences in the epigenome could represent one of the ways in which environmental exposures translate into phenotypic outcomes. The best understood epigenetic phenomena include post-translational histone modifications and DNA methylation events, both of which have critical roles in gene regulation. Epigenetic studies in RA represent a new area of research with the potential to answer unsolved questions. PMID:23381558
Filho, Edson; Bertollo, Maurizio; Robazza, Claudio; Comani, Silvia
2015-01-01
Since the discovery of the mirror neuron system in the 1980s, little, if any, research has been devoted to the study of interactive motor tasks (Goldman, 2012). Scientists interested in the neuropsychophysiological markers of joint motor action have relied on observation paradigms and passive tasks rather than dynamic paradigms and interactive tasks (Konvalinka and Roepstorff, 2012). Within this research scenario, we introduce a novel research paradigm that uses cooperative juggling as a platform to capture peripheral (e.g., skin conductance, breathing and heart rates, electromyographic signals) and central neuropsychophysiological (e.g., functional connectivity within and between brains) markers underlying the notion of team mental models (TMM). We discuss the epistemological and theoretical grounds of a cooperative juggling paradigm, and propose testable hypotheses on neuropsychophysiological markers underlying TMM. Furthermore, we present key methodological concerns that may influence peripheral responses as well as single and hyperbrain network configurations during joint motor action. Preliminary findings of the paradigm are highlighted. We conclude by delineating avenues for future research.
The evolution of speech: a comparative review.
Fitch
2000-07-01
The evolution of speech can be studied independently of the evolution of language, with the advantage that most aspects of speech acoustics, physiology and neural control are shared with animals, and thus open to empirical investigation. At least two changes were necessary prerequisites for modern human speech abilities: (1) modification of vocal tract morphology, and (2) development of vocal imitative ability. Despite an extensive literature, attempts to pinpoint the timing of these changes using fossil data have proven inconclusive. However, recent comparative data from nonhuman primates have shed light on the ancestral use of formants (a crucial cue in human speech) to identify individuals and gauge body size. Second, comparative analysis of the diverse vertebrates that have evolved vocal imitation (humans, cetaceans, seals and birds) provides several distinct, testable hypotheses about the adaptive function of vocal mimicry. These developments suggest that, for understanding the evolution of speech, comparative analysis of living species provides a viable alternative to fossil data. However, the neural basis for vocal mimicry and for mimesis in general remains unknown.
Fast gene ontology based clustering for microarray experiments.
Ovaska, Kristian; Laakso, Marko; Hautaniemi, Sampsa
2008-11-21
Analysis of a microarray experiment often results in a list of hundreds of disease-associated genes. In order to suggest common biological processes and functions for these genes, Gene Ontology annotations with statistical testing are widely used. However, these analyses can produce a very large number of significantly altered biological processes. Thus, it is often challenging to interpret GO results and identify novel testable biological hypotheses. We present fast software for advanced gene annotation using semantic similarity for Gene Ontology terms combined with clustering and heat map visualisation. The methodology allows rapid identification of genes sharing the same Gene Ontology cluster. Our R-based, open-source semantic similarity package has a speed advantage of over 2000-fold compared to existing implementations. From the resulting hierarchical clustering dendrogram, genes sharing a GO term can be identified, and their differences in the gene expression patterns can be seen from the heat map. These methods facilitate advanced annotation of genes resulting from data analysis.
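The clustering step can be sketched as follows: given a pairwise semantic-similarity matrix for GO terms (here a small invented matrix rather than the package's own similarity measure), convert it to a distance matrix and cluster hierarchically. The sketch uses Python and scipy, not the authors' R package.

    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.spatial.distance import squareform

    terms = ["GO:0006915", "GO:0008219", "GO:0006954", "GO:0002376"]   # example GO IDs
    similarity = np.array([[1.0, 0.9, 0.2, 0.3],
                           [0.9, 1.0, 0.25, 0.3],
                           [0.2, 0.25, 1.0, 0.8],
                           [0.3, 0.3, 0.8, 1.0]])                      # toy semantic similarities

    distance = 1.0 - similarity                       # similarity -> distance (diagonal becomes 0)
    tree = linkage(squareform(distance), method="average")
    clusters = fcluster(tree, t=0.5, criterion="distance")
    print(dict(zip(terms, clusters)))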
The ORF1 Protein Encoded by LINE-1: Structure and Function During L1 Retrotransposition
Martin, Sandra L.
2006-01-01
LINE-1, or L1, is an autonomous non-LTR retrotransposon in mammals. Retrotransposition requires the function of the two L1-encoded polypeptides, ORF1p and ORF2p. Early recognition of regions of homology between the predicted amino acid sequence of ORF2 and known endonuclease and reverse transcriptase enzymes led to testable hypotheses regarding the function of ORF2p in retrotransposition. As predicted, ORF2p has been demonstrated to have both endonuclease and reverse transcriptase activities. In contrast, no homologs of known function have contributed to our understanding of the function of ORF1p during retrotransposition. Nevertheless, significant advances have been made such that we now know that ORF1p is a high affinity RNA binding protein that forms a ribonucleoprotein particle together with L1 RNA. Furthermore, ORF1p is a nucleic acid chaperone and this nucleic acid chaperone activity is required for L1 retrotransposition. PMID:16877816
Gender and Physics: a Theoretical Analysis
NASA Astrophysics Data System (ADS)
Rolin, Kristina
This article argues that the objections raised by Koertge (1998), Gross and Levitt (1994), and Weinberg (1996) against feminist scholarship on gender and physics are unwarranted. The objections are that feminist science studies perpetuate gender stereotypes, are irrelevant to the content of physics, or promote epistemic relativism. In the first part of this article I argue that the concept of gender, as it has been developed in feminist theory, is a key to understanding why the first objection is misguided. Instead of reinforcing gender stereotypes, feminist science studies scholars can formulate empirically testable hypotheses regarding local and contested beliefs about gender. In the second part of this article I argue that a social analysis of scientific knowledge is a key to understanding why the second and the third objections are misguided. The concept of gender is relevant for understanding the social practice of physics, and the social practice of physics can be of epistemic importance. Instead of advancing epistemic relativism, feminist science studies scholars can make important contributions to a subfield of philosophy called social epistemology.
Five potential consequences of climate change for invasive species.
Hellmann, Jessica J; Byers, James E; Bierwagen, Britta G; Dukes, Jeffrey S
2008-06-01
Scientific and societal unknowns make it difficult to predict how global environmental changes such as climate change and biological invasions will affect ecological systems. In the long term, these changes may have interacting effects and compound the uncertainty associated with each individual driver. Nonetheless, invasive species are likely to respond in ways that should be qualitatively predictable, and some of these responses will be distinct from those of native counterparts. We used the stages of invasion known as the "invasion pathway" to identify 5 nonexclusive consequences of climate change for invasive species: (1) altered transport and introduction mechanisms, (2) establishment of new invasive species, (3) altered impact of existing invasive species, (4) altered distribution of existing invasive species, and (5) altered effectiveness of control strategies. We then used these consequences to identify testable hypotheses about the responses of invasive species to climate change and provide suggestions for invasive-species management plans. The 5 consequences also emphasize the need for enhanced environmental monitoring and expanded coordination among entities involved in invasive-species management.
Physical concepts in the development of constitutive equations
NASA Technical Reports Server (NTRS)
Cassenti, B. N.
1985-01-01
Proposed viscoplastic material models include in their formulation observed material response but do not generally incorporate principles from thermodynamics, statistical mechanics, and quantum mechanics. Numerous hypotheses for material response have been made based on first principles, and many of these hypotheses have been tested experimentally. The proposed viscoplastic theories must therefore be checked against these hypotheses and their experimental basis. The physics of thermodynamics, statistical mechanics and quantum mechanics, and the effects of defects, are reviewed for their application to the development of constitutive laws.
Lee, Joy L; DeCamp, Matthew; Dredze, Mark; Chisolm, Margaret S; Berger, Zackary D
2014-10-15
Twitter is home to many health professionals who send messages about a variety of health-related topics. Amid concerns about physicians posting inappropriate content online, more in-depth knowledge about these messages is needed to understand health professionals' behavior on Twitter. Our goal was to characterize the content of Twitter messages, specifically focusing on health professionals and their tweets relating to health. We performed an in-depth content analysis of 700 tweets. Qualitative content analysis was conducted on tweets by health users on Twitter. The primary objective was to describe the general type of content (ie, health-related versus non-health related) on Twitter authored by health professionals and further to describe health-related tweets on the basis of the type of statement made. Specific attention was given to whether a tweet was personal (as opposed to professional) or made a claim that users would expect to be supported by some level of medical evidence (ie, a "testable" claim). A secondary objective was to compare content types among different users, including patients, physicians, nurses, health care organizations, and others. Health-related users are posting a wide range of content on Twitter. Among health-related tweets, 53.2% (184/346) contained a testable claim. Of health-related tweets by providers, 17.6% (61/346) were personal in nature; 61% (59/96) made testable statements. While organizations and businesses use Twitter to promote their services and products, patient advocates are using this tool to share their personal experiences with health. Twitter users in health-related fields tweet about both testable claims and personal experiences. Future work should assess the relationship between testable tweets and the actual level of evidence supporting them, including how Twitter users-especially patients-interpret the content of tweets posted by health providers.
Deep venous thrombosis: The valve cusp hypoxia thesis and its incompatibility with modern orthodoxy.
Malone, P Colm; Agutter, Paul S
2016-01-01
The valve cusp hypoxia thesis (VCHT) of the aetiology of deep venous thrombosis (DVT) was adumbrated in this journal in 1977 and fully articulated in 2008, the original hypothesis having been strongly corroborated by experiments published in 1981 and 1984. It presents a unitary account of the pathogenesis of venous thrombosis and embolism that is rooted in the pathophysiological tradition of Hunter, Virchow, Lister, Welch and Aschoff, a tradition traceable back to Harvey. In this paper we summarise the thesis in its mature form, consider its compatibility with recent advances in the DVT field, and ask why it has not yet been assimilated into the mainstream literature, which during the past half century has been dominated by a haematology-orientated 'consensus model'. We identify and discuss seven ways in which the VCHT is incompatible with these mainstream beliefs about the aetiology of venous thrombosis, drawing attention to: (1) the spurious nature of 'Virchow's triad'; (2) the crucial differences between 'venous thrombus' and 'clot'; the facts that (3) venous thrombi form in the venous valve pockets (VVPs), (4) DVT is not a primarily haematological condition, (5) the so-called 'thrombophilias' are not thrombogenic per se; (6) the conflict between the single unitary aetiology of DVT and the tacit assumption that the condition is 'multicausal'; (7) the inability of anticoagulants to prevent the initiation of venous thrombogenesis, though they do prevent the growth of thrombi to clinically significant size. In discussing point (7), we show that the VCHT indicates new approaches to mechanical prophylaxis against DVT. These approaches are then formulated as experimentally testable hypotheses, and we suggest methods for testing them preclinically using animal trials. Copyright © 2015 Elsevier Ltd. All rights reserved.
Investigating Evolutionary Conservation of Dendritic Cell Subset Identity and Functions
Vu Manh, Thien-Phong; Bertho, Nicolas; Hosmalin, Anne; Schwartz-Cornil, Isabelle; Dalod, Marc
2015-01-01
Dendritic cells (DCs) were initially defined as mononuclear phagocytes with a dendritic morphology and an exquisite efficiency for naïve T-cell activation. DC encompass several subsets initially identified by their expression of specific cell surface molecules and later shown to excel in distinct functions and to develop under the instruction of different transcription factors or cytokines. Very few cell surface molecules are expressed in a specific manner on any immune cell type. Hence, to identify cell types, the sole use of a small number of cell surface markers in classical flow cytometry can be deceiving. Moreover, the markers currently used to define mononuclear phagocyte subsets vary depending on the tissue and animal species studied and even between laboratories. This has led to confusion in the definition of DC subset identity and in their attribution of specific functions. There is a strong need to identify a rigorous and consensus way to define mononuclear phagocyte subsets, with precise guidelines potentially applicable throughout tissues and species. We will discuss the advantages, drawbacks, and complementarities of different methodologies: cell surface phenotyping, ontogeny, functional characterization, and molecular profiling. We will advocate that gene expression profiling is a very rigorous, largely unbiased and accessible method to define the identity of mononuclear phagocyte subsets, which strengthens and refines surface phenotyping. It is uniquely powerful to yield new, experimentally testable, hypotheses on the ontogeny or functions of mononuclear phagocyte subsets, their molecular regulation, and their evolutionary conservation. We propose defining cell populations based on a combination of cell surface phenotyping, expression analysis of hallmark genes, and robust functional assays, in order to reach a consensus and integrate faster the huge but scattered knowledge accumulated by different laboratories on different cell types, organs, and species. PMID:26082777
Anger and its control in Graeco-Roman and modern psychology.
Schimmel, S
1979-11-01
Modern psychologists have studied the phenomena of anger and hostility with diverse methodologies and from a variety of theoretical orientations. The close relationships between anger and aggression, psychosomatic disorder and personal unhappiness, make the understanding and control of anger an important individual and social goal. For all of its sophistication and accomplishment, however, most of the modern research demonstrates, to its disadvantage, a lack of historical perspective with respect to the analysis and treatment of anger, whether normal or pathological. This attitude has deprived psychology of a rich source of empirical observations, intriguing, testable hypotheses, and ingenious techniques of treatment. Of the literature that has been neglected, the analyses of the emotion of anger in the writings of Greek and Roman moral philosophers, particularly Aristotle (4th century B.C.), Seneca (1st century A.D.) and Plutarch (early 2nd century A.D.) are of particular interest. Although modern analyses and methods of treatment are in some ways more refined and more quantitatively precise, and are often subjected to validation and modification by empirical-experimental tests, scientific psychology has, to date, contributed relatively little to the understanding and control of anger that is novel except for research on its physiological dimensions. We can still benefit from the insight, prescriptions and procedures of the classicists, who in some respects offer more powerful methods of control than the most recently published works. Naturally, the modern psychotherapist or behavior therapist can and must go beyond the ancients, as is inherent in all scientific and intellectual progress, but there are no scientific or rational grounds for ignoring them as has been done for 75 years.
The experience of agency: an interplay between prediction and postdiction
Synofzik, Matthis; Vosgerau, Gottfried; Voss, Martin
2013-01-01
The experience of agency, i.e., the registration that I am the initiator of my actions, is a basic and constant underpinning of our interaction with the world. Whereas several accounts have underlined predictive processes as the central mechanism (e.g., the comparator model by C. Frith), others emphasized postdictive inferences (e.g., post-hoc inference account by D. Wegner). Based on increasing evidence that both predictive and postdictive processes contribute to the experience of agency, we here present a unifying but at the same time parsimonious approach that reconciles these accounts: predictive and postdictive processes are both integrated by the brain according to the principles of optimal cue integration. According to this framework, predictive and postdictive processes each serve as authorship cues that are continuously integrated and weighted depending on their availability and reliability in a given situation. Both sensorimotor and cognitive signals can serve as predictive cues (e.g., internal predictions based on an efference copy of the motor command or cognitive anticipations based on priming). Similarly, other sensorimotor and cognitive cues can each serve as post-hoc cues (e.g., visual feedback of the action or the affective valence of the action outcome). Integration and weighting of these cues might not only differ between contexts and individuals, but also between different subject and disease groups. For example, schizophrenia patients with delusions of influence seem to rely less on (probably imprecise) predictive motor signals of the action and more on post-hoc action cues such as visual feedback and, possibly, the affective valence of the action outcome. Thus, the framework of optimal cue integration offers a promising approach that directly stimulates a wide range of experimentally testable hypotheses on agency processing in different subject groups. PMID:23508565
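Optimal cue integration has a simple closed form: each cue is weighted by its reliability (the inverse of its variance). The sketch below combines one predictive and one postdictive agency cue; all numbers are illustrative assumptions, not estimates from the cited studies.

    import numpy as np

    def integrate_cues(estimates, variances):
        # Reliability-weighted average: a cue's weight is proportional to 1/variance,
        # so noisy cues contribute less to the final agency judgment.
        weights = 1.0 / np.asarray(variances, dtype=float)
        weights /= weights.sum()
        return float(np.dot(weights, estimates)), weights

    # Agency evidence on a 0-1 scale: prediction (efference copy) vs. postdiction (visual feedback).
    # A noisy prediction, as hypothesized in delusions of influence, shifts weight to the post-hoc cue.
    estimate, w = integrate_cues([0.9, 0.4], [0.30, 0.05])
    print(f"integrated agency estimate: {estimate:.2f}, cue weights: {np.round(w, 2)}")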
Predicting Adverse Drug Effects from Literature- and Database-Mined Assertions.
La, Mary K; Sedykh, Alexander; Fourches, Denis; Muratov, Eugene; Tropsha, Alexander
2018-06-06
Given that adverse drug effects (ADEs) have led to post-market patient harm and subsequent drug withdrawal, failure of candidate agents in the drug development process, and other negative outcomes, it is essential to attempt to forecast ADEs and other relevant drug-target-effect relationships as early as possible. Current pharmacologic data sources, providing multiple complementary perspectives on the drug-target-effect paradigm, can be integrated to facilitate the inference of relationships between these entities. This study aims to identify both existing and unknown relationships between chemicals (C), protein targets (T), and ADEs (E) based on evidence in the literature. Cheminformatics and data mining approaches were employed to integrate and analyze publicly available clinical pharmacology data and literature assertions interrelating drugs, targets, and ADEs. Based on these assertions, a C-T-E relationship knowledge base was developed. Known pairwise relationships between chemicals, targets, and ADEs were collected from several pharmacological and biomedical data sources. These relationships were curated and integrated according to Swanson's paradigm to form C-T-E triangles. Missing C-E edges were then inferred as C-E relationships. Unreported associations between drugs, targets, and ADEs were inferred, and inferences were prioritized as testable hypotheses. Several C-E inferences, including testosterone → myocardial infarction, were identified using inferences based on the literature sources published prior to confirmatory case reports. Timestamping approaches confirmed the predictive ability of this inference strategy on a larger scale. The presented workflow, based on free-access databases and an association-based inference scheme, provided novel C-E relationships that have been validated post hoc in case reports. With refinement of prioritization schemes for the generated C-E inferences, this workflow may provide an effective computational method for the early detection of potential drug candidate ADEs that can be followed by targeted experimental investigations.
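The triangle-closing step can be sketched in a few lines: join chemical-to-target and target-to-effect assertions and keep the chemical-effect pairs that are not already reported. The assertions below are invented placeholders, not records from the authors' knowledge base.

    # Known pairwise assertions (toy examples, not curated data).
    chem_to_target = {"drugX": {"targetA", "targetB"}, "drugY": {"targetB"}}
    target_to_effect = {"targetA": {"effect1"}, "targetB": {"effect2"}}
    known_chem_effect = {("drugX", "effect2")}            # already reported in the literature

    # Close the C-T-E triangle: infer chemical-effect pairs via shared targets.
    inferred = set()
    for chem, targets in chem_to_target.items():
        for target in targets:
            for effect in target_to_effect.get(target, set()):
                if (chem, effect) not in known_chem_effect:
                    inferred.add((chem, effect, target))

    for chem, effect, via in sorted(inferred):
        print(f"hypothesis: {chem} -> {effect} (via {via})")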
A Multilayer Network Approach for Guiding Drug Repositioning in Neglected Diseases.
Berenstein, Ariel José; Magariños, María Paula; Chernomoretz, Ariel; Agüero, Fernán
2016-01-01
Drug development for neglected diseases has been historically hampered due to lack of market incentives. The advent of public domain resources containing chemical information from high throughput screenings is changing the landscape of drug discovery for these diseases. In this work we took advantage of data from extensively studied organisms like human, mouse, E. coli and yeast, among others, to develop a novel integrative network model to prioritize and identify candidate drug targets in neglected pathogen proteomes, and bioactive drug-like molecules. We modeled genomic (proteins) and chemical (bioactive compounds) data as a multilayer weighted network graph that takes advantage of bioactivity data across 221 species, chemical similarities between 1.7 × 10^5 compounds and several functional relations among 1.67 × 10^5 proteins. These relations comprised orthology, sharing of protein domains, and shared participation in defined biochemical pathways. We showcase the application of this network graph to the problem of prioritization of new candidate targets, based on the information available in the graph for known compound-target associations. We validated this strategy by performing a cross validation procedure for known mouse and Trypanosoma cruzi targets and showed that our approach outperforms classic alignment-based approaches. Moreover, our model provides additional flexibility as two different network definitions could be considered, finding in both cases qualitatively different but sensible candidate targets. We also showcase the application of the network to suggest targets for orphan compounds that are active against Plasmodium falciparum in high-throughput screens. In this case our approach provided a reduced prioritization list of target proteins for the query molecules and showed the ability to propose new testable hypotheses for each compound. Moreover, we found that some predictions highlighted by our network model were supported by independent experimental validations as found post-facto in the literature.
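The prioritization idea, letting evidence from known compound-target pairs flow across weighted protein-protein relations such as orthology or shared domains, can be sketched with networkx. The node names, edge weights, and one-hop scoring rule below are hypothetical simplifications of the multilayer model.

    import networkx as nx

    g = nx.Graph()
    # Protein layer: functional relations (orthology, shared domains, shared pathways).
    g.add_edge("pathogen_protA", "mouse_protX", weight=0.9)    # orthology
    g.add_edge("pathogen_protB", "mouse_protX", weight=0.4)    # shared domain
    g.add_edge("pathogen_protB", "yeast_protY", weight=0.7)    # shared pathway

    # Proteins with known compound-target evidence in model organisms.
    known_targets = {"mouse_protX": 1.0, "yeast_protY": 0.5}

    # Score pathogen proteins by the weighted evidence flowing from known targets.
    scores = {node: sum(g[node][nbr]["weight"] * known_targets.get(nbr, 0.0)
                        for nbr in g.neighbors(node))
              for node in g.nodes if node.startswith("pathogen_")}

    for protein, score in sorted(scores.items(), key=lambda kv: -kv[1]):
        print(f"candidate target {protein}: {score:.2f}")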
Flight control system design factors for applying automated testing techniques
NASA Technical Reports Server (NTRS)
Sitz, Joel R.; Vernon, Todd H.
1990-01-01
The principal design features and operational experiences of the X-29 forward-swept-wing aircraft and F-18 high alpha research vehicle (HARV) automated test systems are discussed. It is noted that operational experiences in developing and using these automated testing techniques have highlighted the need for incorporating target system features to improve testability. Improved target system testability can be accomplished with the addition of nonreal-time and real-time features. Online access to target system implementation details, unobtrusive real-time access to internal user-selectable variables, and proper software instrumentation are all desirable features of the target system. Also, test system and target system design issues must be addressed during the early stages of the target system development. Processing speeds of up to 20 million instructions/s and the development of high-bandwidth reflective memory systems have improved the ability to integrate the target system and test system for the application of automated testing techniques. It is concluded that new methods of designing testability into the target systems are required.
Abu Bakar, Nurul Farhana; Chen, Ai-Hong
2014-02-01
Children with learning disabilities might have difficulties communicating effectively and giving reliable responses as required in various visual function testing procedures. The purpose of this study was to compare the testability of visual acuity using the modified Early Treatment Diabetic Retinopathy Study (ETDRS) and Cambridge Crowding Cards, stereo acuity using Lang Stereo test II and Butterfly stereo tests and colour perception using Colour Vision Test Made Easy (CVTME) and Ishihara's Test for Colour Deficiency (Ishihara Test) between children in mainstream classes and children with learning disabilities in special education classes in government primary schools. A total of 100 primary school children (50 children from mainstream classes and 50 children from special education classes) matched in age were recruited in this cross-sectional comparative study. The testability was determined by the percentage of children who were able to give reliable responses as required by the respective tests. 'Unable to test' was defined as inappropriate response or uncooperative despite best efforts of the screener. The testability of the modified ETDRS, Butterfly stereo test and Ishihara test for the respective visual function tests was found to be lower among children in special education classes (P < 0.001), but not that of the Cambridge Crowding Cards, Lang Stereo test II and CVTME. Non-verbal or "matching" approaches were found to be superior for testing visual functions in children with learning disabilities. Modifications of vision testing procedures are essential for children with learning disabilities.
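For any one test, comparing testability between the two classes reduces to comparing two proportions. The abstract reports only the significance level, so the 2×2 counts below are hypothetical placeholders used to show the calculation.

    from scipy.stats import fisher_exact

    # Hypothetical counts of children able vs. unable to complete the modified ETDRS.
    mainstream = [49, 1]            # 50 children in mainstream classes
    special_ed = [35, 15]           # 50 children in special education classes

    _, p_value = fisher_exact([mainstream, special_ed])
    print(f"testability {mainstream[0] / 50:.0%} vs {special_ed[0] / 50:.0%}, p = {p_value:.4f}")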
Romantic love modulates women's identification of men's body odors.
Lundström, Johan N; Jones-Gotman, Marilyn
2009-02-01
Romantic love is one of our most potent and powerful emotions, but very little is known with respect to the hormonal and psychological mechanisms in play. Romantic love is thought to help intimate partners stay committed to each other and two mechanisms have been proposed to mediate this commitment: increased attention towards one's partner or deflected attention away from other potential partners. Both mechanisms find support in the literature. We explored the potential influence of each of these mechanisms by assessing women's ability to identify (ID) body odors originating from their boyfriend, a same-sex friend, and an opposite-sex friend and the relationship between this ability and the degree of romantic love expressed towards their boyfriend. We hypothesized that an increase in attention towards one's partner would render a positive correlation between ID of a boyfriend's body odor and degree of romantic love; conversely, we hypothesized that attention deflected away from other potential partners would render a negative correlation between ID of an opposite-sex friend's body odor and degree of romantic love for the boyfriend. Our results supported the deflection theory as we found a negative correlation between the degree of romantic love for the subjects' boyfriends and their ability to ID the body odor of an opposite-sex friend but not of their boyfriend or same-sex friend. Our results indicate that romantic love deflects attention away from potential new partners rather than towards the present partner. These changes are likely mediated by circulating neuropeptides and a testable model is suggested.
Testability, Test Automation and Test Driven Development for the Trick Simulation Toolkit
NASA Technical Reports Server (NTRS)
Penn, John
2014-01-01
This paper describes the adoption of a Test Driven Development approach and a Continuous Integration System in the development of the Trick Simulation Toolkit, a generic simulation development environment for creating high fidelity training and engineering simulations at the NASA Johnson Space Center and many other NASA facilities. It describes the approach, and the significant benefits seen, such as fast, thorough and clear test feedback every time code is checked into the code repository. It also describes an approach that encourages development of code that is testable and adaptable.
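The test-first pattern described above can be illustrated with a minimal Python unit test; the clamp utility is a hypothetical stand-in, and this is not Trick's own test framework, only the shape of the workflow (write the failing test, then the code, then let the continuous integration system run it on every check-in).

    import unittest

    def clamp(value, low, high):
        # Small utility written only after the tests below were in place (test-driven order).
        return max(low, min(value, high))

    class ClampTest(unittest.TestCase):
        def test_value_within_range_is_unchanged(self):
            self.assertEqual(clamp(5.0, 0.0, 10.0), 5.0)

        def test_values_are_limited_to_bounds(self):
            self.assertEqual(clamp(-3.0, 0.0, 10.0), 0.0)
            self.assertEqual(clamp(42.0, 0.0, 10.0), 10.0)

    if __name__ == "__main__":
        unittest.main()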
What makes Darwinian hydrology "Darwinian"? Asking a different kind of question about landscapes
NASA Astrophysics Data System (ADS)
Harman, C.; Troch, P. A.
2014-02-01
There have been repeated calls for a Darwinian approach to hydrologic science, or for a synthesis of Darwinian and Newtonian approaches, to deepen understanding of the hydrologic system in the larger landscape context, and so develop a better basis for predictions now and in an uncertain future. But what exactly makes a Darwinian approach to hydrology "Darwinian"? While there have now been a number of discussions of Darwinian approaches, many referencing Harte (2002), the term is potentially a source of confusion because its connections to Darwin remain allusive rather than explicit. Here we suggest that the Darwinian approach to hydrology follows the example of Charles Darwin by focusing attention on the patterns of variation in populations and seeking hypotheses that explain these patterns in terms of the mechanisms and conditions that determine their historical development. These hypotheses do not simply catalog patterns or predict them statistically - they connect the present structure with processes operating in the past. Nor are they explanations presented without independent evidence or critical analysis - Darwin's hypotheses about the mechanisms underlying present-day variation could be independently tested and validated. With a Darwinian framework in mind, it is easy to see that a great deal of hydrologic research has already been done that contributes to a Darwinian hydrology - whether deliberately or not. We discuss some practical and philosophical issues with this approach to hydrologic science: how are explanatory hypotheses generated? What constitutes a good hypothesis? How are hypotheses tested? "Historical" sciences - including paleohydrology - have long grappled with these questions, as must a Darwinian hydrologic science. We can draw on Darwin's own example for some answers, though there are ongoing debates about the philosophical nature of his methods and reasoning. Darwin used a range of methods of historical reasoning to develop explanatory hypotheses: extrapolating mechanisms, space for time substitution, and looking for signatures of history. Some of these are already in use, while others are not and could be used to develop new insights. He sought explanatory hypotheses that intelligibly unified disparate facts, were testable against evidence, and had fertile implications for further research. He provided evidence to support his hypotheses by deducing corollary conditions ("if explanation A is true, then B will also be true") and comparing these to observations. While a synthesis of the Darwinian and Newtonian approaches remains a goal, the Darwinian approach to hydrologic science has significant value of its own. The Darwinian hydrology that has been conducted already has not been coordinated or linked into a general body of theory and knowledge, but the time is coming when this will be possible.
How scientific experiments are designed: Problem solving in a knowledge-rich, error-rich environment
NASA Astrophysics Data System (ADS)
Baker, Lisa M.
While theory formation and the relation between theory and data has been investigated in many studies of scientific reasoning, researchers have focused less attention on reasoning about experimental design, even though the experimental design process makes up a large part of real-world scientists' reasoning. The goal of this thesis was to provide a cognitive account of the scientific experimental design process by analyzing experimental design as problem-solving behavior (Newell & Simon, 1972). Three specific issues were addressed: the effect of potential error on experimental design strategies, the role of prior knowledge in experimental design, and the effect of characteristics of the space of alternate hypotheses on alternate hypothesis testing. A two-pronged in vivo/in vitro research methodology was employed, in which transcripts of real-world scientific laboratory meetings were analyzed as well as undergraduate science and non-science majors' design of biology experiments in the psychology laboratory. It was found that scientists use a specific strategy to deal with the possibility of error in experimental findings: they include "known" control conditions in their experimental designs both to determine whether error is occurring and to identify sources of error. The known controls strategy had not been reported in earlier studies with science-like tasks, in which participants' responses to error had consisted of replicating experiments and discounting results. With respect to prior knowledge: scientists and undergraduate students drew on several types of knowledge when designing experiments, including theoretical knowledge, domain-specific knowledge of experimental techniques, and domain-general knowledge of experimental design strategies. Finally, undergraduate science students generated and tested alternates to their favored hypotheses when the space of alternate hypotheses was constrained and searchable. This result may help explain findings of confirmation bias in earlier studies using science-like tasks, in which characteristics of the alternate hypothesis space may have made it unfeasible for participants to generate and test alternate hypotheses. In general, scientists and science undergraduates were found to engage in a systematic experimental design process that responded to salient features of the problem environment, including the constant potential for experimental error, availability of alternate hypotheses, and access to both theoretical knowledge and knowledge of experimental techniques.
The role of beta-endorphin in the pathophysiology of major depression.
Hegadoren, K M; O'Donnell, T; Lanius, R; Coupland, N J; Lacaze-Masmonteil, N
2009-10-01
A role for beta-endorphin (beta-END) in the pathophysiology of major depressive disorder (MDD) is suggested by both animal research and studies examining clinical populations. The major etiological theories of depression include brain regions and neural systems that interact with opioid systems and beta-END. Recent preclinical data have demonstrated multiple roles for beta-END in the regulation of complex homeostatic and behavioural processes that are affected during a depressive episode. Additionally, beta-END inputs to regulatory pathways involving feeding behaviours, motivation, and specific types of motor activity have important implications in defining the biological foundations for specific depressive symptoms. Early research linking beta-END to MDD did so in the context of the hypothalamic-pituitary-adrenal (HPA) axis activity, where it was suggested that HPA axis dysregulation may account for depressive symptoms in some individuals. The primary aims of this paper are to use both preclinical and clinical research (a) to critically review data that explores potential roles for beta-END in the pathophysiology of MDD and (b) to highlight gaps in the literature that limit further development of etiological theories of depression and testable hypotheses. In addition to examining methodological and theoretical challenges of past clinical studies, we summarize studies that have investigated basal beta-END levels in MDD and that have used challenge tests to examine beta-END responses to a variety of experimental paradigms. A brief description of the synthesis, location in the CNS and behavioural pharmacology of this neuropeptide is also provided to frame this discussion. Given the lack of clinical improvement observed with currently available antidepressants in a significant proportion of depressed individuals, it is imperative that novel mechanisms be investigated for antidepressant potential. We conclude that the renewed interest in elucidating the role of beta-END in the pathophysiology of MDD must be paralleled by consensus building within the research community around the heterogeneity inherent in mood disorders, standardization of experimental protocols, improved discrimination of POMC products in analytical techniques and consistent attention paid to important confounds like age and gender.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vadhavkar, Nikhil; Pham, Christopher; Georgescu, Walter
In contrast to the classic view of static DNA double-strand breaks (DSBs) being repaired at the site of damage, we hypothesize that DSBs move and merge with each other over large distances (μm). As X-ray dose increases, the probability of having DSB clusters increases, as does the probability of misrepair and cell death. Experimental work characterizing the X-ray dose dependence of radiation-induced foci (RIF) in nonmalignant human mammary epithelial cells (MCF10A) is used here to validate a DSB clustering model. We then use the principles of the local effect model (LEM) to predict the yield of DSBs at the submicron level. Two mechanisms for DSB clustering, namely random coalescence of DSBs versus active movement of DSBs into repair domains, are compared and tested. Simulations that best predicted both RIF dose dependence and cell survival after X-ray irradiation favored the repair domain hypothesis, suggesting the nucleus is divided into an array of regularly spaced repair domains of ~1.55 μm sides. Applying the same approach to high-linear energy transfer (LET) ion tracks, we are able to predict experimental RIF/μm along tracks with an overall relative error of 12%, for LET ranging between 30 and 350 keV/μm and for three different ions. Finally, cell death was predicted by assuming an exponential dependence on the total number of DSBs and of all possible combinations of paired DSBs within each simulated RIF. Relative biological effectiveness (RBE) predictions for cell survival of MCF10A exposed to high-LET radiation showed an LET dependence that matches previous experimental results for similar cell types. Overall, this work suggests that microdosimetric properties of ion tracks at the submicron level are sufficient to explain both RIF data and survival curves for any LET, similarly to the LEM assumption. Conversely, the high-LET cell death mechanism does not have to invoke the linear-quadratic dose formalism as done in the LEM. In addition, the size of the repair domains derived in our model is based on experimental RIF and is three times larger than the hypothetical LEM voxel used to fit survival curves. Our model is therefore an alternative to previous approaches that provides a testable biological mechanism (i.e., RIF). In addition, we propose that DSB pairing will help develop more accurate alternatives to the linear no-threshold (LNT) cancer risk model currently used for regulating exposure to very low levels of ionizing radiation.
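A toy Monte Carlo captures the repair-domain picture: DSBs are scattered in a nucleus tiled by ~1.55 μm domains, foci are counted as occupied domains, and survival falls exponentially with the number of DSBs plus the number of within-domain DSB pairs. The DSB yield per Gy, the alpha/beta coefficients, and the nucleus size are illustrative assumptions, not the paper's fitted parameters.

    import numpy as np

    rng = np.random.default_rng(1)
    nucleus_um = 12.4                    # cube side, a multiple of the domain size (assumed)
    domain_um = 1.55                     # repair-domain side, from the abstract
    dsb_per_gy = 35.0                    # assumed DSB yield per Gy per nucleus

    def simulate(dose_gy, alpha=0.005, beta=0.05):
        n_dsb = rng.poisson(dsb_per_gy * dose_gy)
        positions = rng.uniform(0.0, nucleus_um, size=(n_dsb, 3))
        domains = (positions // domain_um).astype(int)
        # Foci = domains holding at least one DSB; lethal pairings form only within a domain.
        _, counts = np.unique(domains, axis=0, return_counts=True)
        pairs = float(np.sum(counts * (counts - 1) / 2.0))
        return len(counts), np.exp(-alpha * n_dsb - beta * pairs)

    for dose in (0.5, 2.0, 8.0):
        foci, survival = simulate(dose)
        print(f"{dose:4.1f} Gy: {foci:3d} RIF, predicted survival {survival:.3f}")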
Bayesian Network Webserver: a comprehensive tool for biological network modeling.
Ziebarth, Jesse D; Bhattacharya, Anindya; Cui, Yan
2013-11-01
The Bayesian Network Webserver (BNW) is a platform for comprehensive network modeling of systems genetics and other biological datasets. It allows users to quickly and seamlessly upload a dataset, learn the structure of the network model that best explains the data and use the model to understand relationships between network variables. Many datasets, including those used to create genetic network models, contain both discrete (e.g. genotype) and continuous (e.g. gene expression traits) variables, and BNW allows for modeling hybrid datasets. Users of BNW can incorporate prior knowledge during structure learning through an easy-to-use structural constraint interface. After structure learning, users are immediately presented with an interactive network model, which can be used to make testable hypotheses about network relationships. BNW, including a downloadable structure learning package, is available at http://compbio.uthsc.edu/BNW. (The BNW interface for adding structural constraints uses HTML5 features that are not supported by current version of Internet Explorer. We suggest using other browsers (e.g. Google Chrome or Mozilla Firefox) when accessing BNW). ycui2@uthsc.edu. Supplementary data are available at Bioinformatics online.
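BNW itself is a web service, but the workflow it automates (score-based structure learning on an uploaded dataset, followed by inspection of the learned dependencies) can be sketched offline. The example below assumes the pgmpy package and an invented three-variable dataset; it is an analogue of the BNW workflow, not its actual implementation.

```python
import pandas as pd
from pgmpy.estimators import HillClimbSearch, BicScore

# Hypothetical discretized systems-genetics data: one genotype and two trait variables.
data = pd.DataFrame({
    "genotype": [0, 1, 1, 0, 1, 0, 1, 0],
    "expr_A":   [0, 1, 1, 0, 1, 0, 1, 1],
    "trait_B":  [0, 1, 0, 0, 1, 0, 1, 1],
})

# Score-based structure learning, analogous in spirit to what BNW runs on upload.
dag = HillClimbSearch(data).estimate(scoring_method=BicScore(data))
print(dag.edges())  # candidate dependencies that become testable hypotheses
```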
Crowell, Sheila E.; Beauchaine, Theodore P.; Linehan, Marsha M.
2009-01-01
Over the past several decades, research has focused increasingly on developmental precursors to psychological disorders that were previously assumed to emerge only in adulthood. This change in focus follows from the recognition that complex transactions between biological vulnerabilities and psychosocial risk factors shape emotional and behavioral development beginning at conception. To date, however, empirical research on the development of borderline personality is extremely limited. Indeed, in the decade since M. M. Linehan initially proposed a biosocial model of the development of borderline personality disorder, there have been few attempts to test the model among at-risk youth. In this review, diverse literatures are reviewed that can inform understanding of the ontogenesis of borderline pathology, and testable hypotheses are proposed to guide future research with at-risk children and adolescents. One probable pathway is identified that leads to borderline personality disorder; it begins with early vulnerability, expressed initially as impulsivity and followed by heightened emotional sensitivity. These vulnerabilities are potentiated across development by environmental risk factors that give rise to more extreme emotional, behavioral, and cognitive dysregulation. PMID:19379027
The Comparative Toxicogenomics Database: update 2017.
Davis, Allan Peter; Grondin, Cynthia J; Johnson, Robin J; Sciaky, Daniela; King, Benjamin L; McMorran, Roy; Wiegers, Jolene; Wiegers, Thomas C; Mattingly, Carolyn J
2017-01-04
The Comparative Toxicogenomics Database (CTD; http://ctdbase.org/) provides information about interactions between chemicals and gene products, and their relationships to diseases. Core CTD content (chemical-gene, chemical-disease and gene-disease interactions manually curated from the literature) are integrated with each other as well as with select external datasets to generate expanded networks and predict novel associations. Today, core CTD includes more than 30.5 million toxicogenomic connections relating chemicals/drugs, genes/proteins, diseases, taxa, Gene Ontology (GO) annotations, pathways, and gene interaction modules. In this update, we report a 33% increase in our core data content since 2015, describe our new exposure module (that harmonizes exposure science information with core toxicogenomic data) and introduce a novel dataset of GO-disease inferences (that identify common molecular underpinnings for seemingly unrelated pathologies). These advancements centralize and contextualize real-world chemical exposures with molecular pathways to help scientists generate testable hypotheses in an effort to understand the etiology and mechanisms underlying environmentally influenced diseases. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
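The inference step described above, linking curated chemical-gene and gene-disease interactions through shared genes to predict novel chemical-disease associations, reduces to a simple join. The records below are invented placeholders, not CTD content.

```python
# Toy CTD-style inference: join curated chemical-gene and gene-disease interactions
# on the shared gene to propose candidate chemical-disease associations.
chemical_gene = [("chemicalX", "GENE1"), ("chemicalY", "GENE2")]
gene_disease = [("GENE1", "diseaseA"), ("GENE2", "diseaseB"), ("GENE1", "diseaseC")]

gene_to_diseases = {}
for gene, disease in gene_disease:
    gene_to_diseases.setdefault(gene, set()).add(disease)

inferred = {(chem, d) for chem, gene in chemical_gene for d in gene_to_diseases.get(gene, ())}
print(sorted(inferred))  # candidate chemical-disease hypotheses for follow-up
```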
Designing for competence: spaces that enhance collaboration readiness in healthcare.
Lamb, Gerri; Shraiky, James
2013-09-01
Many universities in the United States are investing in classrooms and campuses designed to increase collaboration and teamwork among the health professions. To date, we know little about whether these learning spaces are having the intended impact on student performance. Recent advances in the identification of interprofessional teamwork competencies provide a much-needed step toward a defined outcome metric. Rigorous study of the relationship between design and student competence in collaboration also requires clear specification of design concepts and development of testable frameworks. Such theory-based evaluation is crucial for design to become an integral part of interprofessional education strategies and initiatives. Current classroom and campus designs were analyzed for common themes and features in collaborative spaces as a starting place for specification of design concepts and model development. Four major themes were identified: flexibility, visual transparency/proximity, technology and environmental infrastructure. Potential models linking this preliminary set of design concepts to student competencies are proposed and used to generate hypotheses for future study of the impact of collaborative design spaces on student outcomes.
Combined neurostimulation and neuroimaging in cognitive neuroscience: past, present, and future.
Bestmann, Sven; Feredoes, Eva
2013-08-01
Modern neurostimulation approaches in humans provide controlled inputs into the operations of cortical regions, with highly specific behavioral consequences. This enables causal structure-function inferences, and in combination with neuroimaging, has provided novel insights into the basic mechanisms of action of neurostimulation on distributed networks. For example, more recent work has established the capacity of transcranial magnetic stimulation (TMS) to probe causal interregional influences, and their interaction with cognitive state changes. Combinations of neurostimulation and neuroimaging now face the challenge of integrating the known physiological effects of neurostimulation with theoretical and biological models of cognition, for example, when theoretical stalemates between opposing cognitive theories need to be resolved. This will be driven by novel developments, including biologically informed computational network analyses for predicting the impact of neurostimulation on brain networks, as well as novel neuroimaging and neurostimulation techniques. Such future developments may offer an expanded set of tools with which to investigate structure-function relationships, and to formulate and reconceptualize testable hypotheses about complex neural network interactions and their causal roles in cognition. © 2013 New York Academy of Sciences.
Paleolithic vs. modern diets--selected pathophysiological implications.
Eaton, S B; Eaton, S B
2000-04-01
The nutritional patterns of Paleolithic humans influenced genetic evolution during the time segment within which defining characteristics of contemporary humans were selected. Our genome can have changed little since the beginnings of agriculture, so, genetically, humans remain Stone Agers--adapted for a Paleolithic dietary regimen. Such diets were based chiefly on wild game, fish and uncultivated plant foods. They provided abundant protein; a fat profile much different from that of affluent Western nations; high fibre; carbohydrate from fruits and vegetables (and some honey) but not from cereals, refined sugars and dairy products; high levels of micronutrients and probably of phytochemicals as well. Differences between contemporary and ancestral diets have many pathophysiological implications. This review addresses phytochemicals and cancer; calcium, physical exertion, bone mineral density and bone structural geometry; dietary protein, potassium, renal acid secretion and urinary calcium loss; and finally sarcopenia, adiposity, insulin receptors and insulin resistance. While not, yet, a basis for formal recommendations, awareness of Paleolithic nutritional patterns should generate novel, testable hypotheses grounded in evolutionary theory and it should dispel complacency regarding currently accepted nutritional tenets.
Where do mirror neurons come from?
Heyes, Cecilia
2010-03-01
Debates about the evolution of the 'mirror neuron system' imply that it is an adaptation for action understanding. Alternatively, mirror neurons may be a byproduct of associative learning. Here I argue that the adaptation and associative hypotheses both offer plausible accounts of the origin of mirror neurons, but the associative hypothesis has three advantages. First, it provides a straightforward, testable explanation for the differences between monkeys and humans that have led some researchers to question the existence of a mirror neuron system. Second, it is consistent with emerging evidence that mirror neurons contribute to a range of social cognitive functions, but do not play a dominant, specialised role in action understanding. Finally, the associative hypothesis is supported by recent data showing that, even in adulthood, the mirror neuron system can be transformed by sensorimotor learning. The associative account implies that mirror neurons come from sensorimotor experience, and that much of this experience is obtained through interaction with others. Therefore, if the associative account is correct, the mirror neuron system is a product, as well as a process, of social interaction. (c) 2009 Elsevier Ltd. All rights reserved.
Kim, Betty E; Seligman, Darryl; Kable, Joseph W
2012-01-01
Recent work has shown that visual fixations reflect and influence trial-to-trial variability in people's preferences between goods. Here we extend this principle to attribute weights during decision making under risk. We measured eye movements while people chose between two risky gambles or bid on a single gamble. Consistent with previous work, we found that people exhibited systematic preference reversals between choices and bids. For two gambles matched in expected value, people systematically chose the higher probability option but provided a higher bid for the option that offered the greater amount to win. This effect was accompanied by a shift in fixations of the two attributes, with people fixating on probabilities more during choices and on amounts more during bids. Our results suggest that the construction of value during decision making under risk depends on task context partly because the task differentially directs attention at probabilities vs. amounts. Since recent work demonstrates that neural correlates of value vary with visual fixations, our results also suggest testable hypotheses regarding how task context modulates the neural computation of value to generate preference reversals.
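One way to make the proposed mechanism concrete is a toy attention-weighted value model in which the weight on the probability attribute tracks the share of fixations it receives. The weights and gambles below are invented; this is a sketch of the idea, not the authors' fitted model.

```python
def subjective_value(prob, amount, w_prob):
    """Toy attention-weighted value: w_prob stands in for the share of fixation
    time on the probability attribute (illustrative, not the authors' model)."""
    return w_prob * prob + (1 - w_prob) * (amount / 100.0)  # amount rescaled to 0-1

p_bet = (0.8, 25)        # high probability, low amount (expected value 20)
dollar_bet = (0.25, 80)  # low probability, high amount (expected value 20)

for context, w in [("choice (fixations favor probabilities)", 0.7),
                   ("bid (fixations favor amounts)", 0.3)]:
    vp = subjective_value(*p_bet, w)
    vd = subjective_value(*dollar_bet, w)
    favored = "P-bet" if vp > vd else "$-bet"
    print(f"{context}: V(P-bet)={vp:.2f}, V($-bet)={vd:.2f} -> favors {favored}")
```

With these assumed weights the model picks the high-probability gamble in the choice context and the high-amount gamble in the bid context, reproducing the preference-reversal pattern described above.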
Human Facial Expressions as Adaptations:Evolutionary Questions in Facial Expression Research
SCHMIDT, KAREN L.; COHN, JEFFREY F.
2007-01-01
The importance of the face in social interaction and social intelligence is widely recognized in anthropology. Yet the adaptive functions of human facial expression remain largely unknown. An evolutionary model of human facial expression as behavioral adaptation can be constructed, given the current knowledge of the phenotypic variation, ecological contexts, and fitness consequences of facial behavior. Studies of facial expression are available, but results are not typically framed in an evolutionary perspective. This review identifies the relevant physical phenomena of facial expression and integrates the study of this behavior with the anthropological study of communication and sociality in general. Anthropological issues with relevance to the evolutionary study of facial expression include: facial expressions as coordinated, stereotyped behavioral phenotypes, the unique contexts and functions of different facial expressions, the relationship of facial expression to speech, the value of facial expressions as signals, and the relationship of facial expression to social intelligence in humans and in nonhuman primates. Human smiling is used as an example of adaptation, and testable hypotheses concerning the human smile, as well as other expressions, are proposed. PMID:11786989
Characterizing behavioural ‘characters’: an evolutionary framework
Araya-Ajoy, Yimen G.; Dingemanse, Niels J.
2014-01-01
Biologists often study phenotypic evolution assuming that phenotypes consist of a set of quasi-independent units that have been shaped by selection to accomplish a particular function. In the evolutionary literature, such quasi-independent functional units are called ‘evolutionary characters’, and a framework based on evolutionary principles has been developed to characterize them. This framework mainly focuses on ‘fixed’ characters, i.e. those that vary exclusively between individuals. In this paper, we introduce multi-level variation and thereby expand the framework to labile characters, focusing on behaviour as a worked example. We first propose a concept of ‘behavioural characters’ based on the original evolutionary character concept. We then detail how integration of variation between individuals (cf. ‘personality’) and within individuals (cf. ‘individual plasticity’) into the framework gives rise to a whole suite of novel testable predictions about the evolutionary character concept. We further propose a corresponding statistical methodology to test whether observed behaviours should be considered expressions of a hypothesized evolutionary character. We illustrate the application of our framework by characterizing the behavioural character ‘aggressiveness’ in wild great tits, Parus major. PMID:24335984
Developmental Perspectives on Oxytocin and Vasopressin
Hammock, Elizabeth A D
2015-01-01
The related neuropeptides oxytocin and vasopressin are involved in species-typical behavior, including social recognition behavior, maternal behavior, social bonding, communication, and aggression. A wealth of evidence from animal models demonstrates significant modulation of adult social behavior by both of these neuropeptides and their receptors. Over the last decade, there has been a flood of studies in humans also implicating a role for these neuropeptides in human social behavior. Despite popular assumptions that oxytocin is a molecule of social bonding in the infant brain, less mechanistic research emphasis has been placed on the potential role of these neuropeptides in the developmental emergence of the neural substrates of behavior. This review summarizes what is known and assumed about the developmental influence of these neuropeptides and outlines the important unanswered questions and testable hypotheses. There is tremendous translational need to understand the functions of these neuropeptides in mammalian experience-dependent development of the social brain. The activity of oxytocin and vasopressin during development should inform our understanding of individual, sex, and species differences in social behavior later in life. PMID:24863032
Matisoo-Smith, Elizabeth; Gosling, Anna L
2018-05-01
The Pacific region has had a complex human history. It has been subject to multiple major human dispersal and colonisation events, including some of the earliest Out-of-Africa migrations, the so-called Austronesian expansion of people out of Island Southeast Asia, and the more recent arrival of Europeans. Despite models of island isolation, evidence suggests significant levels of interconnectedness that vary in direction and frequency over time. The Pacific Ocean covers a vast area and its islands provide an array of different physical environments with variable pathogen loads and subsistence opportunities. These diverse environments likely caused Pacific peoples to adapt (both genetically and culturally) in unique ways. Differences in genetic background, in combination with adaptation, likely affect their susceptibility to non-communicable diseases. Here we provide an overview of some of the key issues in the natural and human history of the Pacific region which are likely to impact human health. We argue that understanding the evolutionary and cultural history of Pacific peoples is essential for the generation of testable hypotheses surrounding potential causes of elevated disease susceptibility among Pacific peoples.
NASA Astrophysics Data System (ADS)
Patel, Namu; Patankar, Neelesh A.
2017-11-01
Aquatic locomotion relies on feedback loops to generate the flexural muscle moment needed to attain the reference shape. Experimentalists have consistently reported a difference between the electromyogram (EMG) and curvature wave speeds. The EMG wave speed has been found to correlate with the cross-sectional moment wave. The correlation, however, remains unexplained. Using feedback dependent controller models, we demonstrate two scenarios - one at higher passive elastic stiffness and another at lower passive elastic stiffness of the body. The former case becomes equivalent to the penalty type mathematical model for swimming used in prior literature and it does not reproduce the neuromechanical wave speed discrepancy. The latter case at lower elastic stiffness does reproduce the wave speed discrepancy and appears to be biologically most relevant. These findings are applied to develop testable hypotheses about control mechanisms that animals might be using during low and high Reynolds number swimming. This work is supported by NSF Grants DMS-1547394, CBET-1066575, ACI-1460334, and IOS-1456830. Travel for NP is supported by Institute for Defense Analyses.
Rethinking intractable conflict: the perspective of dynamical systems.
Vallacher, Robin R; Coleman, Peter T; Nowak, Andrzej; Bui-Wrzosinska, Lan
2010-01-01
Intractable conflicts are demoralizing. Beyond destabilizing the families, communities, or international regions in which they occur, they tend to perpetuate the very conditions of misery and hate that contributed to them in the first place. Although the common factors and processes associated with intractable conflicts have been identified through research, they represent an embarrassment of riches for theory construction. Thus, the current task in this area is integrating these diverse factors into an account that provides a coherent perspective yet allows for prediction and a basis for conflict resolution in specific conflict settings. We suggest that the perspective of dynamical systems provides such an account. This article outlines the key concepts and hypotheses associated with this approach. It is organized around a set of basic questions concerning intractable conflict for which the dynamical perspective offers fresh insight and testable propositions. The questions and answers are intended to provide readers with basic concepts and principles of complexity and dynamical systems that are useful for rethinking the nature of intractable conflict and the means by which such conflict can be transformed. Copyright 2010 APA, all rights reserved.
Is Psychoanalysis a Folk Psychology?
Arminjon, Mathieu
2013-01-01
Even as the neuro-psychoanalytic field has matured, the epistemological status of Freudian interpretations still remains problematic from a naturalist point of view. As a result of the resurgence of hermeneutics, the claim has been made that psychoanalysis is an extension of folk psychology. For these “extensionists,” asking psychoanalysis to prove its interpretations would be as absurd as demanding proof of the scientific accuracy of folk psychology. I propose to show how Dennett’s theory of the intentional stance allows us to defend an extensionist position while sparing us certain hermeneutic difficulties. In conclusion, I will consider how the experiments of Shevrin et al. (1996) could turn extensionist conceptual considerations into experimentally testable issues. PMID:23525879
NASA Astrophysics Data System (ADS)
Ao, Ping
2011-03-01
There has been tremendous progress in cancer research. However, it appears that the currently dominant framework of regarding cancer as a disease of the genome has led to an impasse. Naturally, questions have been asked as to whether it is possible to develop alternative frameworks that can connect both to mutations and other genetic/genomic effects and to environmental factors. Furthermore, such a framework could be made quantitative, with experimentally testable predictions. In this talk, I will present a positive answer to this call. I will explain our construction of an endogenous network theory based on molecular-cellular agencies as dynamical variables. Such a cancer theory explicitly demonstrates a profound connection to many fundamental concepts in physics, such as stochastic non-equilibrium processes, the "energy" landscape, metastability, etc. It suggests that beneath cancer's daunting complexity may lie a simplicity that gives grounds for hope. The rationales behind such a theory, its predictions, and its initial experimental verifications will be presented. Supported by the US NIH and China NSF.
O'Malley, Maureen A
2018-06-01
Since the 1940s, microbiologists, biochemists and population geneticists have experimented with the genetic mechanisms of microorganisms in order to investigate evolutionary processes. These evolutionary studies of bacteria and other microorganisms gained some recognition from the standard-bearers of the modern synthesis of evolutionary biology, especially Theodosius Dobzhansky and Ledyard Stebbins. A further period of post-synthesis bacterial evolutionary research occurred between the 1950s and 1980s. These experimental analyses focused on the evolution of population and genetic structure, the adaptive gain of new functions, and the evolutionary consequences of competition dynamics. This large body of research aimed to make evolutionary theory testable and predictive, by giving it mechanistic underpinnings. Although evolutionary microbiologists promoted bacterial experiments as methodologically advantageous and a source of general insight into evolution, they also acknowledged the biological differences of bacteria. My historical overview concludes with reflections on what bacterial evolutionary research achieved in this period, and its implications for the still-developing modern synthesis.
Regulation of multispanning membrane protein topology via post-translational annealing.
Van Lehn, Reid C; Zhang, Bin; Miller, Thomas F
2015-09-26
The canonical mechanism for multispanning membrane protein topogenesis suggests that protein topology is established during cotranslational membrane integration. However, this mechanism is inconsistent with the behavior of EmrE, a dual-topology protein for which the mutation of positively charged loop residues, even close to the C-terminus, leads to dramatic shifts in its topology. We use coarse-grained simulations to investigate the Sec-facilitated membrane integration of EmrE and its mutants on realistic biological timescales. This work reveals a mechanism for regulating membrane-protein topogenesis, in which initially misintegrated configurations of the proteins undergo post-translational annealing to reach fully integrated multispanning topologies. The energetic barriers associated with this post-translational annealing process enforce kinetic pathways that dictate the topology of the fully integrated proteins. The proposed mechanism agrees well with the experimentally observed features of EmrE topogenesis and provides a range of experimentally testable predictions regarding the effect of translocon mutations on membrane protein topogenesis.
Dynamic allostery of protein alpha helical coiled-coils
Hawkins, Rhoda J; McLeish, Tom C.B
2005-01-01
Alpha helical coiled-coils appear in many important allosteric proteins such as the dynein molecular motor and bacteria chemotaxis transmembrane receptors. As a mechanism for transmitting the information of ligand binding to a distant site across an allosteric protein, an alternative to conformational change in the mean static structure is an induced change in the pattern of the internal dynamics of the protein. We explore how ligand binding may change the intramolecular vibrational free energy of a coiled-coil, using parameterized coarse-grained models, treating the case of dynein in detail. The models predict that coupling of slide, bend and twist modes of the coiled-coil transmits an allosteric free energy of ∼2kBT, consistent with experimental results. A further prediction is a quantitative increase in the effective stiffness of the coiled-coil without any change in inherent flexibility of the individual helices. The model provides a possible and experimentally testable mechanism for transmission of information through the alpha helical coiled-coil of dynein. PMID:16849225
Tailor, Vijay; Glaze, Selina; Unwin, Hilary; Bowman, Richard; Thompson, Graham; Dahlmann-Noor, Annegret
2016-10-01
Children and adults with neurological impairments are often not able to access conventional perimetry; however, information about the visual field is valuable. A new technology, saccadic vector optokinetic perimetry (SVOP), may have improved accessibility, but its accuracy has not been evaluated. We aimed to explore accessibility, testability and accuracy of SVOP in children with neurodisability or isolated visual pathway deficits. Cohort study; recruitment October 2013-May 2014, at children's eye clinics at a tertiary referral centre and a regional Child Development Centre; full orthoptic assessment, SVOP (central 30° of the visual field) and confrontation visual fields (CVF). Group 1: age 1-16 years, neurodisability (n=16), group 2: age 10-16 years, confirmed or suspected visual field defect (n=21); group 2 also completed Goldmann visual field testing (GVFT). Group 1: testability with a full 40-point test protocol is 12.5%; with reduced test protocols, testability is 100%, but plots may be clinically meaningless. Children (44%) and parents/carers (62.5%) find the test easy. SVOP and CVF agree in 50%. Group 2: testability is 62% for the 40-point protocol, and 90.5% for reduced protocols. Corneal changes in childhood glaucoma interfere with SVOP testing. All children and parents/carers find SVOP easy. Overall agreement with GVFT is 64.7%. While SVOP is highly accessible to children, many cannot complete a full 40-point test. Agreement with current standard tests is moderate to poor. Abnormal saccades cause an apparent non-specific visual field defect. In children with glaucoma or nystagmus SVOP calibration often fails. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
Werner, Jan; Griebeler, Eva Maria
2013-01-01
It has been hypothesized that a high reproductive output contributes to the unique gigantism in large dinosaur taxa. In order to infer more information on dinosaur reproduction, we established allometries between body mass and different reproductive traits (egg mass, clutch mass, annual clutch mass) for extant phylogenetic brackets (birds, crocodiles and tortoises) of extinct non-avian dinosaurs. Allometries were applied to nine non-avian dinosaur taxa (theropods, hadrosaurs, and sauropodomorphs) for which fossil estimates on relevant traits are currently available. We found that the reproductive traits of most dinosaurs conformed to similar-sized or scaled-up extant reptiles or birds. The reproductive traits of theropods, which are considered more bird-like, were indeed consistent with birds, while the traits of sauropodomorphs conformed better to reptiles. Reproductive traits of hadrosaurs corresponded to both reptiles and birds. Excluding Massospondylus carinatus , all dinosaurs studied had an intermediary egg to body mass relationship to reptiles and birds. In contrast, dinosaur clutch masses fitted with either the masses predicted from allometries of birds (theropods) or to the masses of reptiles (all other taxa). Theropods studied had probably one clutch per year. For sauropodomorphs and hadrosaurs, more than one clutch per year was predicted. Contrary to current hypotheses, large dinosaurs did not have exceptionally high annual egg numbers (AEN). Independent of the extant model, the estimated dinosaur AEN did not exceed 850 eggs (75,000 kg sauropod) for any of the taxa studied. This estimated maximum is probably an overestimation due to unrealistic assumptions. According to most AEN estimations, the dinosaurs studied laid less than 200 eggs per year. Only some AEN estimates obtained for medium to large sized sauropods were higher (200-400 eggs). Our results provide new (testable) hypotheses, especially for reproductive traits that are insufficiently documented or lacking from the fossil record. This contributes to the understanding of their evolution. PMID:23991160
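The allometries referred to above are power laws fitted on log-log axes. A minimal sketch of that fitting-and-extrapolation step is shown below; the body-mass and clutch-mass numbers are invented for illustration and are not the extant-bracket data used in the study.

```python
import numpy as np

# Invented example data: body mass (kg) and clutch mass (kg) for a few extant species.
body_mass = np.array([2.5, 35.0, 180.0, 950.0])
clutch_mass = np.array([0.09, 0.7, 2.1, 6.5])

# An allometry of the form clutch_mass = a * body_mass**b is fitted on log-log axes.
b, log_a = np.polyfit(np.log10(body_mass), np.log10(clutch_mass), 1)
a = 10 ** log_a
print(f"clutch_mass ≈ {a:.3f} * body_mass^{b:.2f}")

# Extrapolation to a hypothetical 30,000 kg sauropodomorph (illustrative only).
print(f"predicted clutch mass: {a * 30000**b:.1f} kg")
```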
ERIC Educational Resources Information Center
MILLER, WILLIAM CHARLES, III
THIS EXPERIMENTAL STUDY EXAMINED THE HYPOTHESES THAT FILM MOTION INCREASES AUDIENCE EMOTIONAL INVOLVEMENT, INCREASES POSITIVE ATTITUDE RESPONSE TO THE FILM AND DOES NOT AFFECT AUDIENCE INFORMATION RETENTION. OTHER HYPOTHESES WERE THAT THE GALVANIC SKIN RESPONSE (GSR) IS USEFUL FOR EVALUATING FILM AUDIENCE EMOTIONAL INVOLVEMENT, THAT AUDIENCE…
Bossong, Matthijs G; Niesink, Raymond J M
2010-11-01
Cannabis use during adolescence increases the risk of developing psychotic disorders later in life. However, the neurobiological processes underlying this relationship are unknown. This review reports the results of a literature search comprising various neurobiological disciplines, ultimately converging into a model that might explain the neurobiology of cannabis-induced schizophrenia. The article briefly reviews current insights into brain development during adolescence. In particular, the role of the excitatory neurotransmitter glutamate in experience-dependent maturation of specific cortical circuitries is examined. The review also covers recent hypotheses regarding disturbances in strengthening and pruning of synaptic connections in the prefrontal cortex, and the link with latent psychotic disorders. In the present model, cannabis-induced schizophrenia is considered to be a distortion of normal late postnatal brain maturation. Distortion of glutamatergic transmission during critical periods may disturb prefrontal neurocircuitry in specific brain areas. Our model postulates that adolescent exposure to Δ9-tetrahydrocannabinol (THC), the primary psychoactive substance in cannabis, transiently disturbs physiological control of the endogenous cannabinoid system over glutamate and GABA release. As a result, THC may adversely affect adolescent experience-dependent maturation of neural circuitries within prefrontal cortical areas. Depending on dose, exact time window and duration of exposure, this may ultimately lead to the development of psychosis or schizophrenia. The proposed model provides testable hypotheses which can be addressed in future studies, including animal experiments, reanalysis of existing epidemiological data, and prospective epidemiological studies in which the role of the dose-time-effect relationship should be central. Copyright © 2010 Elsevier Ltd. All rights reserved.
Quanbeck, Stephanie M.; Brachova, Libuse; Campbell, Alexis A.; Guan, Xin; Perera, Ann; He, Kun; Rhee, Seung Y.; Bais, Preeti; Dickerson, Julie A.; Dixon, Philip; Wohlgemuth, Gert; Fiehn, Oliver; Barkan, Lenore; Lange, Iris; Lange, B. Markus; Lee, Insuk; Cortes, Diego; Salazar, Carolina; Shuman, Joel; Shulaev, Vladimir; Huhman, David V.; Sumner, Lloyd W.; Roth, Mary R.; Welti, Ruth; Ilarslan, Hilal; Wurtele, Eve S.; Nikolau, Basil J.
2012-01-01
Metabolomics is the methodology that identifies and measures global pools of small molecules (of less than about 1,000 Da) of a biological sample, which are collectively called the metabolome. Metabolomics can therefore reveal the metabolic outcome of a genetic or environmental perturbation of a metabolic regulatory network, and thus provide insights into the structure and regulation of that network. Because of the chemical complexity of the metabolome and limitations associated with individual analytical platforms for determining the metabolome, it is currently difficult to capture the complete metabolome of an organism or tissue, which is in contrast to genomics and transcriptomics. This paper describes the analysis of Arabidopsis metabolomics data sets acquired by a consortium that includes five analytical laboratories, bioinformaticists, and biostatisticians, which aims to develop and validate metabolomics as a hypothesis-generating functional genomics tool. The consortium is determining the metabolomes of Arabidopsis T-DNA mutant stocks, grown in standardized controlled environment optimized to minimize environmental impacts on the metabolomes. Metabolomics data were generated with seven analytical platforms, and the combined data is being provided to the research community to formulate initial hypotheses about genes of unknown function (GUFs). A public database (www.PlantMetabolomics.org) has been developed to provide the scientific community with access to the data along with tools to allow for its interactive analysis. Exemplary datasets are discussed to validate the approach, which illustrate how initial hypotheses can be generated from the consortium-produced metabolomics data, integrated with prior knowledge to provide a testable hypothesis concerning the functionality of GUFs. PMID:22645570
Janky, Rekin's; van Helden, Jacques
2008-01-23
The detection of conserved motifs in promoters of orthologous genes (phylogenetic footprints) has become a common strategy to predict cis-acting regulatory elements. Several software tools are routinely used to raise hypotheses about regulation. However, these tools are generally used as black boxes, with default parameters. A systematic evaluation of optimal parameters for a footprint discovery strategy can bring a sizeable improvement to the predictions. We evaluate the performances of a footprint discovery approach based on the detection of over-represented spaced motifs. This method is particularly suitable for (but not restricted to) Bacteria, since such motifs are typically bound by factors containing a Helix-Turn-Helix domain. We evaluated footprint discovery in 368 Escherichia coli K12 genes with annotated sites, under 40 different combinations of parameters (taxonomical level, background model, organism-specific filtering, operon inference). Motifs are assessed both at the levels of correctness and significance. We further report a detailed analysis of 181 bacterial orthologs of the LexA repressor. Distinct motifs are detected at various taxonomical levels, including the 7 previously characterized taxon-specific motifs. In addition, we highlight a significantly stronger conservation of half-motifs in Actinobacteria, relative to Firmicutes, suggesting an intermediate state in specificity switching between the two Gram-positive phyla, and thereby revealing the on-going evolution of LexA auto-regulation. The footprint discovery method proposed here shows excellent results with E. coli and can readily be extended to predict cis-acting regulatory signals and propose testable hypotheses in bacterial genomes for which nothing is known about regulation.
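The core of a dyad (spaced motif) over-representation test is a comparison of observed occurrences against a background expectation, for example with a binomial tail probability. The sketch below assumes SciPy; the counts and background frequency are invented, not taken from the E. coli or LexA analyses.

```python
from scipy.stats import binom

def motif_overrepresentation_pvalue(occurrences, sequences, expected_freq):
    """P(X >= occurrences) under a background (binomial) model in which each of
    `sequences` promoter sets contains the spaced motif with probability
    `expected_freq`. The numbers used below are invented for illustration."""
    return binom.sf(occurrences - 1, sequences, expected_freq)

# A dyad found in 14 of 181 orthologous promoter sets, expected in ~2% of them by chance:
print(motif_overrepresentation_pvalue(14, 181, 0.02))
```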
Rethinking energy in parkinsonian motor symptoms: a potential role for neural metabolic deficits
Amano, Shinichi; Kegelmeyer, Deborah; Hong, S. Lee
2015-01-01
Parkinson’s disease (PD) is characterized as a chronic and progressive neurodegenerative disorder that results in a variety of debilitating symptoms, including bradykinesia, resting tremor, rigidity, and postural instability. Research spanning several decades has emphasized basal ganglia dysfunction, predominantly resulting from dopaminergic (DA) cell loss, as the primary cause of the aforementioned parkinsonian features. But why those particular features manifest themselves remains an enigma. The goal of this paper is to develop a theoretical framework in which parkinsonian motor features are the behavioral consequence of a long-term adaptation to an inability (inflexibility or lack of capacity) to meet energetic demands, due to neural metabolic deficits arising from the mitochondrial dysfunction associated with PD. Here, we discuss neurophysiological changes that are generally associated with PD, such as selective degeneration of DA neurons in the substantia nigra pars compacta (SNc), in conjunction with metabolic and mitochondrial dysfunction. We then characterize the cardinal motor symptoms of PD, bradykinesia, resting tremor, rigidity and gait disturbance, reviewing the literature to demonstrate how these motor patterns are actually energy efficient from a metabolic perspective. We will also develop three testable hypotheses: (1) neural metabolic deficits precede the increased rate of neurodegeneration and onset of behavioral symptoms in PD; (2) the motor behavior of persons with PD is more sensitive to changes in metabolic/bioenergetic state; and (3) improvement of metabolic function could lead to better motor performance in persons with PD. These hypotheses are designed to introduce a novel viewpoint that can elucidate the connections between metabolic, neural and motor function in PD. PMID:25610377
Non-degradative Ubiquitination of Protein Kinases
Ball, K. Aurelia; Johnson, Jeffrey R.; Lewinski, Mary K.; Guatelli, John; Verschueren, Erik; Krogan, Nevan J.; Jacobson, Matthew P.
2016-01-01
Growing evidence supports other regulatory roles for protein ubiquitination in addition to serving as a tag for proteasomal degradation. In contrast to other common post-translational modifications, such as phosphorylation, little is known about how non-degradative ubiquitination modulates protein structure, dynamics, and function. Due to the wealth of knowledge concerning protein kinase structure and regulation, we examined kinase ubiquitination using ubiquitin remnant immunoaffinity enrichment and quantitative mass spectrometry to identify ubiquitinated kinases and the sites of ubiquitination in Jurkat and HEK293 cells. We find that, unlike phosphorylation, ubiquitination most commonly occurs in structured domains, and on the kinase domain, ubiquitination is concentrated in regions known to be important for regulating activity. We hypothesized that ubiquitination, like other post-translational modifications, may alter the conformational equilibrium of the modified protein. We chose one human kinase, ZAP-70, to simulate using molecular dynamics with and without a monoubiquitin modification. In Jurkat cells, ZAP-70 is ubiquitinated at several sites that are not sensitive to proteasome inhibition and thus may have other regulatory roles. Our simulations show that ubiquitination influences the conformational ensemble of ZAP-70 in a site-dependent manner. When monoubiquitinated at K377, near the C-helix, the active conformation of the ZAP-70 C-helix is disrupted. In contrast, when monoubiquitinated at K476, near the kinase hinge region, an active-like ZAP-70 C-helix conformation is stabilized. These results lead to testable hypotheses that ubiquitination directly modulates kinase activity, and that ubiquitination is likely to alter structure, dynamics, and function in other protein classes as well. PMID:27253329
NASA Astrophysics Data System (ADS)
John, Timm; Svensen, Henrik; Weyer, Stefan; Polozov, Alexander; Planke, Sverre
2010-05-01
The Siberian iron-bearing phreatomagmatic pipes represent a world-class Fe-ore deposit, and 5-6 pipes are currently mined in eastern Siberia. The pipes formed within the vast Tunguska Basin, cutting thick accumulations of carbonates (dolostones) and evaporites (anhydrite, halite, dolostone). These sediments were intruded by the sub-volcanic part of the Siberian Traps at 252 Ma, and sills and dykes are abundant throughout the basin. The pipes formed during sediment-magma interactions in the deep parts of the basin, and the degassing is believed to have triggered the end-Permian environmental crisis. A major problem with understanding the pipe formation is the source of the iron. Available hypotheses state that the iron was leached from a Fe-enriched magmatic melt that incorporated dolostones. It is currently unclear how the magmatic, hydrothermal, and sedimentary processes interacted to form the deposits, as there are no actual constraints to pin down the iron source. We hypothesize two end-member scenarios to account for the magnetite enrichment and deposition, which are testable by analyzing the Fe-isotopes of magnetite: 1) iron sourced from dolerite magma through leaching and metasomatism by chloride brines; 2) leaching of iron from sedimentary rocks (shale, dolostone) during magma-sediment interactions. We focus on understanding the Fe-isotopic architecture of the pipes in order to constrain the source of the Fe and the mechanism that caused this significant Fe redistribution. We further evaluate possible fractionation during the fast metasomatic ore-forming processes that took place soon after pipe formation.
Parent-offspring conflict over family size in current China.
Liu, Jianghua; Duan, Chongli; Lummaa, Virpi
2017-05-06
In China, the recent replacement of the one-child policy with a two-child policy could potentially change family ecology-parents may switch investment from exclusively one child to two. The parent-offspring conflict theory provides testable hypotheses concerning possible firstborn opposition toward further reproduction of their mother, and who wins the conflict. We tested the hypotheses that if there is any opposition, it will differ between sexes, weaken with offspring age and family resource availability, and affect maternal reproductive decision-making. Using survey data of 531 non-pregnant mothers of only one child from Xi'an (China), logistic regression was used to examine effects of age, family income, and sex on the attitudes of firstborn children toward having a sibling; ordinal regression was used to investigate how such attitudes affect maternal intention to reproduce again. Firstborns' unsupportive attitude toward their mothers' further reproduction weakened with age and was overall more frequent in low-income families. Sons' unsupportive tendency displayed a somewhat U-shaped relationship, whereas daughters' weakened with family income; consequently, sons were more likely than daughters to be unsupportive in high-income families, suggesting a tendency to be more demanding. Forty-nine percent of mothers supported by their firstborns intended to reproduce again, whilst only 9% of mothers not supported by firstborns had such an intention. Our study contributes to evolutionary literature on parent-offspring conflict and its influence on female reproductive strategy in modern human societies, and has also important implications for understanding fertility patterns and conducting interventions in family conflict in China. © 2016 Wiley Periodicals, Inc.
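The modelling approach described, regressing the firstborn's attitude on age, family income and sex, can be sketched on simulated data. The snippet below uses scikit-learn's regularized logistic regression as a stand-in for the logistic and ordinal models reported in the paper, and the simulated effect sizes are assumptions chosen only to mimic the qualitative pattern in the abstract.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 200  # simulated stand-in for the 531 surveyed mothers

child_age = rng.integers(2, 17, size=n)   # firstborn age in years
income = rng.integers(1, 4, size=n)       # 1 = low, 3 = high family income
is_son = rng.integers(0, 2, size=n)       # 1 = son, 0 = daughter

# Simulate the pattern reported above: opposition weakens with age and family income.
logit = 1.5 - 0.2 * child_age - 0.5 * income + 0.2 * is_son
unsupportive = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([child_age, income, is_son])
coef = LogisticRegression().fit(X, unsupportive).coef_[0]
print(dict(zip(["child_age", "income", "is_son"], coef.round(2))))
```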
Ilukor, John; Birner, Regina; Nielsen, Thea
2015-11-01
Providing adequate animal health services to smallholder farmers in developing countries has remained a challenge, in spite of various reform efforts during the past decades. Past reforms focused on market failures in deciding what the public sector, the private sector, and the "third sector" (the community-based sector) should do with regard to providing animal health services. However, such frameworks have paid limited attention to the governance challenges inherent in the provision of animal health services. This paper presents a framework for analyzing institutional arrangements for providing animal health services that focuses not only on market failures, but also on governance challenges, such as elite capture and absenteeism of staff. As an analytical basis, Williamson's discriminating alignment hypothesis is applied to assess the cost-effectiveness of different institutional arrangements for animal health services in view of both market failures and governance challenges. This framework is used to generate testable hypotheses on the appropriateness of different institutional arrangements for providing animal health services, depending on context-specific circumstances. Data from Uganda and Kenya on clinical veterinary services are used to provide an empirical test of these hypotheses and to demonstrate application of Williamson's transaction cost theory to veterinary service delivery. The paper concludes that strong public sector involvement, especially in building and strengthening a synergistic relation-based referral arrangement between paraprofessionals and veterinarians, is imperative in improving animal health service delivery in developing countries. Copyright © 2015 Elsevier B.V. All rights reserved.
Twelve testable hypotheses on the geobiology of weathering.
Brantley, S L; Megonigal, J P; Scatena, F N; Balogh-Brunstad, Z; Barnes, R T; Bruns, M A; Van Cappellen, P; Dontsova, K; Hartnett, H E; Hartshorn, A S; Heimsath, A; Herndon, E; Jin, L; Keller, C K; Leake, J R; McDowell, W H; Meinzer, F C; Mozdzer, T J; Petsch, S; Pett-Ridge, J; Pregitzer, K S; Raymond, P A; Riebe, C S; Shumaker, K; Sutton-Grier, A; Walter, R; Yoo, K
2011-03-01
Critical Zone (CZ) research investigates the chemical, physical, and biological processes that modulate the Earth's surface. Here, we advance 12 hypotheses that must be tested to improve our understanding of the CZ: (1) Solar-to-chemical conversion of energy by plants regulates flows of carbon, water, and nutrients through plant-microbe soil networks, thereby controlling the location and extent of biological weathering. (2) Biological stoichiometry drives changes in mineral stoichiometry and distribution through weathering. (3) On landscapes experiencing little erosion, biology drives weathering during initial succession, whereas weathering drives biology over the long term. (4) In eroding landscapes, weathering-front advance at depth is coupled to surface denudation via biotic processes. (5) Biology shapes the topography of the Critical Zone. (6) The impact of climate forcing on denudation rates in natural systems can be predicted from models incorporating biogeochemical reaction rates and geomorphological transport laws. (7) Rising global temperatures will increase carbon losses from the Critical Zone. (8) Rising atmospheric P(CO2) will increase rates and extents of mineral weathering in soils. (9) Riverine solute fluxes will respond to changes in climate primarily due to changes in water fluxes and secondarily through changes in biologically mediated weathering. (10) Land use change will impact Critical Zone processes and exports more than climate change. (11) In many severely altered settings, restoration of hydrological processes is possible in decades or less, whereas restoration of biodiversity and biogeochemical processes requires longer timescales. (12) Biogeochemical properties impart thresholds or tipping points beyond which rapid and irreversible losses of ecosystem health, function, and services can occur. © 2011 Blackwell Publishing Ltd.
Towards a theory of PACS deployment: an integrative PACS maturity framework.
van de Wetering, Rogier; Batenburg, Ronald
2014-06-01
Owing to the large financial investments that go along with picture archiving and communication system (PACS) deployments, and to inconsistent PACS performance evaluations, there is a pressing need for a better understanding of the implications of PACS deployment in hospitals. We claim that there is a gap in the research field, both theoretical and empirical, in explaining the success of PACS deployment and maturity in hospitals. Theoretical principles relevant to PACS performance, maturity and alignment are reviewed from a system and complexity perspective. A conceptual model to explain PACS performance and a set of testable hypotheses are then developed. Then, structural equation modeling (SEM), i.e. causal modeling, is applied to validate the model and hypotheses based on a research sample of 64 hospitals that use PACS, i.e. 70% of all hospitals in the Netherlands. Outcomes of the SEM analyses substantiate that the measurements of all constructs are reliable and valid. PACS alignment, modeled as a higher-order construct of five complementary organizational dimensions and maturity levels, has a significant positive impact on PACS performance. This result is robust and stable for various sub-samples and segments. This paper presents a conceptual model that explains how alignment in deploying PACS in hospitals is positively related to the perceived performance of PACS. The conceptual model is extended with tools such as checklists to systematically identify improvement areas for hospitals in the PACS domain. The holistic approach towards PACS alignment and maturity provides a framework for clinical practice.
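A full SEM with a latent higher-order alignment construct is beyond a short sketch, but the structural path it tests (alignment positively affecting perceived PACS performance) can be illustrated with a deliberately simplified composite-score regression on simulated data. All numbers below, including the assumed effect size and the five-dimension scoring, are invented.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 64  # the study analysed 64 Dutch hospitals; these values are simulated

# Five organizational alignment dimensions scored on a 1-5 maturity scale.
dimensions = rng.uniform(1, 5, size=(n, 5))
alignment = dimensions.mean(axis=1)                      # crude composite, not a latent construct
performance = 0.6 * alignment + rng.normal(0, 0.5, n)    # assumed positive effect plus noise

# Ordinary least squares as a simplified stand-in for the structural path
# "alignment -> perceived PACS performance" that the paper tests with SEM.
slope, intercept = np.polyfit(alignment, performance, 1)
r = np.corrcoef(alignment, performance)[0, 1]
print(f"path estimate ≈ {slope:.2f}, r = {r:.2f}")
```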
Hypotheses for a Near-Surface Reservoir of Methane and Its Release on Mars
NASA Astrophysics Data System (ADS)
Hu, R.; Bloom, A. A.; Gao, P.; Miller, C. E.; Yung, Y. L.
2015-12-01
The Curiosity rover recently detected a background of 0.7 ppb and spikes of 7 ppb of methane on Mars. This in situ measurement reorients our understanding of the Martian environment and its potential for life, as the current theories do not entail any active source or sink of methane. In particular, the 10-fold elevation during the southern winter indicates episodic sources of methane that are yet to be discovered. Using the temperature and humidity measurements from the rover, we find that perchlorate salts in the regolith deliquesce to form liquid solutions, and deliquescence progresses deeper into the subsurface during the season of the methane spikes. We therefore formulate the following three testable hypotheses as an attempt to explain the apparent variability of the atmospheric methane abundance. The first scenario is that the regolith in Gale Crater adsorbs methane when dry and releases this methane to the atmosphere upon deliquescence. The adsorption energy needs to be 36 kJ mol^-1 to explain the magnitude of the methane spikes, higher than laboratory measurements. The second scenario is that microorganisms exist and convert organic matter in the soil to methane when they are in liquid solutions. This scenario does not require regolith adsorption. The third scenario is that deep subsurface aquifers sealed by ice or clathrate produce bursts of methane as a result of freezing and thawing of the permafrost, as in the terrestrial Arctic tundra. Continued monitoring of methane by Curiosity will test the existence of the near-surface reservoir and its exchange with the atmosphere.
Statistical mechanics of monatomic liquids
NASA Astrophysics Data System (ADS)
Wallace, Duane C.
1997-10-01
Two key experimental properties of elemental liquids, together with an analysis of the condensed-system potential-energy surface, lead us logically to the dynamical theory of monatomic liquids. Experimentally, the ion motional specific heat is approximately 3Nk for N ions, implying the normal modes of motion are approximately 3N independent harmonic oscillators. This implies the potential surface contains nearly harmonic valleys. The equilibrium configuration at the bottom of each valley is a "structure." Structures are crystalline or amorphous, and amorphous structures can have a remnant of local crystal symmetry, or can be random. The random structures are by far the most numerous, and hence dominate the statistical mechanics of the liquid state, and their macroscopic properties are uniform over the structure class, for large-N systems. The Hamiltonian for any structural valley is the static structure potential, a sum of harmonic normal modes, and an anharmonic correction. Again from experiment, the constant-density entropy of melting contains a universal disordering contribution of NkΔ, suggesting the random structural valleys are of universal number w^N, where ln w = Δ. Our experimental estimate for Δ is 0.80. In quasiharmonic approximation, the liquid theory for entropy agrees with experiment, for all currently analyzable experimental data at elevated temperatures, to within 1-2% of the total entropy. Further testable predictions of the theory are mentioned.
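The connection between the quoted constants follows from counting the random structural valleys; in the notation of the abstract:

```latex
S_{\mathrm{config}} = k \ln\!\bigl(w^{N}\bigr) = N k \ln w = N k \Delta ,
\qquad \Delta \approx 0.80 \;\Longrightarrow\; w = e^{\Delta} \approx 2.2 .
```

That is, the universal disordering entropy of NkΔ corresponds to roughly 2.2 accessible random structures per ion, on the reading that the number of random valleys scales as w^N.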
Multidisciplinary analysis and design of printed wiring boards
NASA Astrophysics Data System (ADS)
Fulton, Robert E.; Hughes, Joseph L.; Scott, Waymond R., Jr.; Umeagukwu, Charles; Yeh, Chao-Pin
1991-04-01
Modern printed wiring board design depends on electronic prototyping using computer-based simulation and design tools. Existing electrical computer-aided design (ECAD) tools emphasize circuit connectivity with only rudimentary analysis capabilities. This paper describes a prototype integrated PWB design environment denoted Thermal Structural Electromagnetic Testability (TSET) being developed at Georgia Tech in collaboration with companies in the electronics industry. TSET provides design guidance based on enhanced electrical and mechanical CAD capabilities, including electromagnetic modeling, testability analysis, thermal management, and solid mechanics analysis. TSET development is based on a strong analytical and theoretical science base and incorporates an integrated information framework and a common database design based on a systematic structured methodology.
Bayesian naturalness, simplicity, and testability applied to the B ‑ L MSSM GUT
NASA Astrophysics Data System (ADS)
Fundira, Panashe; Purves, Austin
2018-04-01
Recent years have seen increased use of Bayesian model comparison to quantify notions such as naturalness, simplicity, and testability, especially in the area of supersymmetric model building. After demonstrating that Bayesian model comparison can resolve a paradox that has been raised in the literature concerning the naturalness of the proton mass, we apply Bayesian model comparison to GUTs, an area to which it has not been applied before. We find that the GUTs are substantially favored over the nonunifying puzzle model. Of the GUTs we consider, the B ‑ L MSSM GUT is the most favored, but the MSSM GUT is almost equally favored.
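The mechanics of Bayesian model comparison invoked above can be illustrated with a toy one-parameter calculation (not the paper's GUT analysis; the models, priors, and data here are invented): a model that concentrates its prior where the data fall earns a larger evidence, and hence a favorable Bayes factor, than one that spreads its prior over a huge parameter volume.

```python
# Toy illustration of Bayesian model comparison (not the B-L MSSM GUT analysis itself):
# the evidence for each model is the likelihood averaged over its prior, so a model with
# a narrower prior ("simpler", more testable) is favored when it fits comparably well.
import numpy as np

def evidence(loglike, prior_lo, prior_hi, n=20000):
    """Marginal likelihood for a flat prior on a single parameter theta."""
    theta = np.linspace(prior_lo, prior_hi, n)
    like = np.exp(loglike(theta))
    return np.trapz(like, theta) / (prior_hi - prior_lo)

# Hypothetical data point and Gaussian likelihood, for illustration only.
obs, sigma = 1.0, 0.2
loglike = lambda theta: -0.5 * ((obs - theta) / sigma) ** 2

Z_narrow = evidence(loglike, 0.5, 1.5)    # "predictive" model: tight prior
Z_wide = evidence(loglike, -50.0, 50.0)   # "unfocused" model: huge prior volume
print("Bayes factor (narrow/wide):", Z_narrow / Z_wide)  # >> 1: the narrow model is favored
```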
Two fundamental questions about protein evolution.
Penny, David; Zhong, Bojian
2015-12-01
Two basic questions are considered that approach protein evolution from different directions: the problems arising from using Markov models for the deeper divergences, and then the origin of proteins themselves. The real problem for the first question (going backwards in time) is that Markov models of sequence evolution necessarily lose information exponentially at deeper divergences, and several testable methods are suggested that should help resolve these divergences. For the second question (coming forwards in time) a problem is that most models for the origin of protein synthesis do not give a role to the very earliest stages of the process. From our knowledge of the importance of replication accuracy in limiting the length of a coding molecule, a testable hypothesis is proposed. The length of the code, the code itself, and tRNAs would all have prior roles in increasing the accuracy of RNA replication; thus proteins would have been formed only after the tRNAs and the length of the triplet code were already established. Both questions lead to testable predictions. Copyright © 2014 Elsevier B.V. and Société Française de Biochimie et Biologie Moléculaire (SFBBM). All rights reserved.
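The exponential loss of information in Markov substitution models can be made concrete with the simplest such model; the sketch below assumes Jukes-Cantor (JC69), which is not necessarily the model the authors analyze, and shows the phylogenetic signal decaying toward the random-matching baseline of 1/4 as divergence deepens.

```python
# Minimal illustration (assumed Jukes-Cantor model, not the authors' analysis) of why
# Markov substitution models lose signal exponentially with depth: the probability that
# a site still matches its ancestral state decays toward the random expectation of 1/4.
import numpy as np

def p_identical(d):
    """P(site unchanged) after d expected substitutions per site under JC69."""
    return 0.25 + 0.75 * np.exp(-4.0 * d / 3.0)

for d in [0.1, 0.5, 1.0, 2.0, 4.0]:
    excess = p_identical(d) - 0.25          # signal above random matching
    print(f"d = {d:4.1f}  P(match) = {p_identical(d):.3f}  signal = {excess:.3f}")
```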
Guimaraes, Sandra; Fernandes, Tiago; Costa, Patrício; Silva, Eduardo
2018-06-01
To determine normative values for the tumbling E optotype and its feasibility for visual acuity (VA) assessment in children aged 3-4 years. A cross-sectional study of 1756 children who were invited to participate in a comprehensive non-invasive eye exam. Uncorrected monocular VA with the crowded tumbling E, together with a comprehensive ophthalmological examination, was assessed. Testability rates for the whole population, and VA of the healthy children by age subgroup, gender, school type and the order in which the ophthalmological examination was performed, were evaluated. The overall testability rate was 95% (92% and 98% for children aged 3 and 4 years, respectively). The mean VA of the first-day assessment (first-VA) and the best VA over the 2 days' assessments (best-VA) were 0.14 logMAR (95% CI 0.14 to 0.15) (decimal=0.72, 95% CI 0.71 to 0.73) and 0.13 logMAR (95% CI 0.13 to 0.14) (decimal=0.74, 95% CI 0.73 to 0.74), respectively. Analysis by age showed differences between groups in first-VA (F(3,1146)=10.0; p<0.001; η2=0.026) and best-VA (F(3,1155)=8.8; p<0.001; η2=0.022). Our normative values were very highly correlated with the previously reported HOTV Amblyopia Treatment Study (HOTV-ATS) values (first-VA, r=0.97; best-VA, r=0.99), with a consistent 0.8 to 0.7 line overestimation for HOTV-ATS, as described in the literature. The overall false-positive referral rate was 1.3%, and was especially low for anisometropias of ≥2 logMAR lines (0.17%). An interocular difference of ≥1 logMAR line was not associated with age (p=0.195). This is the first normative dataset for European Caucasian children with the single crowded tumbling E in healthy eyes, and the largest study comparing testability at 3 and 4 years of age. Testability rates are higher than reported in the literature for other optotypes, especially in children aged 3 years, where we found 5%-11% better testability rates. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
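The logMAR and decimal acuities quoted above are related by decimal VA = 10^(-logMAR); a two-line conversion (a sketch, not the study's analysis code) reproduces the reported means.

```python
# logMAR <-> decimal visual acuity conversion; checks the means quoted in the abstract.
from math import log10

def logmar_to_decimal(logmar: float) -> float:
    return 10 ** (-logmar)

def decimal_to_logmar(decimal_va: float) -> float:
    return -log10(decimal_va)

print(round(logmar_to_decimal(0.14), 2))  # 0.72, matching the reported first-VA mean
print(round(logmar_to_decimal(0.13), 2))  # 0.74, matching the reported best-VA mean
```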
A computational approach to negative priming
NASA Astrophysics Data System (ADS)
Schrobsdorff, H.; Ihrke, M.; Kabisch, B.; Behrendt, J.; Hasselhorn, M.; Herrmann, J. Michael
2007-09-01
Priming is characterized by a sensitivity of reaction times to the sequence of stimuli in psychophysical experiments. The reduction of the reaction time observed in positive priming is well-known and experimentally understood (Scarborough et al., J. Exp. Psychol.: Hum. Percept. Perform., 3, pp. 1-17, 1977). Negative priming—the opposite effect—is experimentally less tangible (Fox, Psychonom. Bull. Rev., 2, pp. 145-173, 1995). The effect depends on subtle parameter changes (such as the response-stimulus interval), and findings vary across studies. The sensitivity of the negative priming effect bears great potential for applications in research in fields such as memory, selective attention, and ageing effects. We develop and analyse a computational realization, CISAM, of a recent psychological model for action decision making, the ISAM (Kabisch, PhD thesis, Friedrich-Schiller-Universität, 2003), which is sensitive to priming conditions. With the dynamical systems approach of the CISAM, we show that a single adaptive threshold mechanism is sufficient to explain both positive and negative priming effects. This is achieved by comparing results obtained by the computational modelling with experimental data from our laboratory. The implementation provides a rich base from which testable predictions can be derived, e.g. with respect to hitherto untested stimulus combinations (e.g. single-object trials).
Testability of evolutionary game dynamics based on experimental economics data
NASA Astrophysics Data System (ADS)
Wang, Yijia; Chen, Xiaojie; Wang, Zhijian
In order to better understand the dynamic processes of a real game system, we need an appropriate dynamics model, yet evaluating the validity of such a model is not a trivial task. Here, we demonstrate an approach that uses macroscopic dynamical patterns of angular momentum and speed as the measurement variables to evaluate the validity of various dynamics models. Using data from real-time Rock-Paper-Scissors (RPS) game experiments, we obtain the experimental dynamic patterns, and then derive the corresponding theoretical dynamic patterns from a series of typical dynamics models. By testing the goodness-of-fit between the experimental and theoretical patterns, the validity of the models can be evaluated. One result of our case study is that, among all the nonparametric models tested, the best-known Replicator dynamics model performs almost worst, while the Projection dynamics model performs best. Besides providing new empirical macroscopic patterns of social dynamics, we demonstrate that the approach can be an effective and rigorous tool for testing game dynamics models. Fundamental Research Funds for the Central Universities (SSEYI2014Z) and the National Natural Science Foundation of China (Grants No. 61503062).
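One way to picture the angular-momentum measurement described above is to compute the mean cross product between the population state's displacement from its time average and its per-step velocity on the R-P-S simplex. The sketch below uses a synthetic cycling trajectory; the variable names and the noisy drift rule are illustrative assumptions, not the authors' experimental data or models.

```python
# A hedged sketch of measuring cycling ("angular momentum") of the population state
# on the Rock-Paper-Scissors simplex. The trajectory is synthetic and illustrative.
import numpy as np

rng = np.random.default_rng(0)
T = 500
x = np.empty((T, 3))
x[0] = [1/3, 1/3, 1/3]
for t in range(T - 1):                      # noisy rotational drift among the strategies
    drift = 0.05 * (np.roll(x[t], 1) - x[t]) + 0.01 * rng.normal(size=3)
    x[t + 1] = np.clip(x[t] + drift, 1e-6, None)
    x[t + 1] /= x[t + 1].sum()              # stay on the simplex

center = x.mean(axis=0)
L = np.cross(x[:-1] - center, np.diff(x, axis=0))   # per-step angular momentum
print("mean angular momentum vector:", L.mean(axis=0))
# A persistently nonzero mean indicates directed cycling, which can then be compared
# with the cycling predicted by Replicator, Projection, or other dynamics models.
```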
NASA Astrophysics Data System (ADS)
Amoroso, Richard L.
A brief introductory survey of Unified Field Mechanics (UFM) is given from the perspective of a Holographic Anthropic Multiverse cosmology in 12 'continuous-state' dimensions. The paradigm with many new parameters is cast in a scale-invariant conformal covariant Dirac polarized vacuum utilizing extended HD forms of the de Broglie-Bohm and Cramer interpretations of quantum theory. The model utilizes a unique form of M-Theory based in part on the original hadronic form of string theory that had a variable string tension, T_S, and included a tachyon. The model is experimentally testable, thus putatively able to demonstrate the existence of large-scale additional dimensionality (LSXD), test for QED violating tight-bound state spectral lines in hydrogen 'below' the lowest Bohr orbit, and surmount the quantum uncertainty principle utilizing a hyperincursive Sagnac Effect resonance hierarchy.
Eventful horizons: String theory in de Sitter and anti-de Sitter
NASA Astrophysics Data System (ADS)
Kleban, Matthew Benjamin
String theory purports to be a theory of quantum gravity. As such, it should have much to say about the deep mysteries surrounding the very early stages of our universe. For this reason, although the theory is notoriously difficult to directly test, data from experimental cosmology may provide a way to probe the high energy physics of string theory. In the first part of this thesis, I will address the important issue of the testability of string theory using observations of the cosmic microwave background radiation. In the second part, I will study some formal difficulties that arise in attempting to understand string theory in de Sitter spacetime. In the third part, I will study the singularity of an eternal anti de Sitter Schwarzschild black hole, using the AdS/CFT correspondence.
A Heavy Higgs Boson from Flavor and Electroweak Symmetry Unification
NASA Astrophysics Data System (ADS)
Fabbrichesi, Marco
2005-08-01
We present a unified picture of flavor and electroweak symmetry breaking based on a nonlinear sigma model spontaneously broken at the TeV scale. Flavor and Higgs bosons arise as pseudo-Goldstone modes. Explicit collective symmetry breaking yields stable vacuum expectation values and masses protected at one loop by the little-Higgs mechanism. The coupling to the fermions generates well-defined mass textures--according to a U(1) global flavor symmetry--that correctly reproduce the mass hierarchies and mixings of quarks and leptons. The model is more constrained than usual little-Higgs models because of bounds on weak and flavor physics. The main experimental signatures testable at the LHC are a rather large mass m
Kalman filter control of a model of spatiotemporal cortical dynamics
Schiff, Steven J; Sauer, Tim
2007-01-01
Recent advances in Kalman filtering to estimate system state and parameters in nonlinear systems have offered the potential to apply such approaches to spatiotemporal nonlinear systems. We here adapt the nonlinear method of unscented Kalman filtering to observe the state and estimate parameters in a computational spatiotemporal excitable system that serves as a model for cerebral cortex. We demonstrate the ability to track spiral wave dynamics, and to use an observer system to calculate control signals delivered through applied electrical fields. We demonstrate how this strategy can control the frequency of such a system, or quench the wave patterns, while minimizing the energy required for such results. These findings are readily testable in experimental applications, and have the potential to be applied to the treatment of human disease. PMID:18310806
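At the core of the unscented Kalman filter is the unscented transform, which propagates a Gaussian state estimate through a nonlinearity via a small set of sigma points. The sketch below is a minimal, self-contained illustration of that step only; the observation nonlinearity and the numbers are assumptions, not the paper's cortical model.

```python
# Minimal sketch of the unscented transform that underlies unscented Kalman filtering
# (illustrative only; the paper applies the full filter to a spatiotemporal cortical model).
import numpy as np

def unscented_transform(mean, cov, f, alpha=1.0, beta=2.0, kappa=0.0):
    """Propagate a Gaussian (mean, cov) through a nonlinearity f via sigma points."""
    n = len(mean)
    lam = alpha ** 2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * cov)               # matrix square root
    sigmas = np.vstack([mean, mean + S.T, mean - S.T])    # 2n + 1 sigma points
    wm = np.full(2 * n + 1, 1.0 / (2.0 * (n + lam)))
    wc = wm.copy()
    wm[0] = lam / (n + lam)
    wc[0] = lam / (n + lam) + (1.0 - alpha ** 2 + beta)
    y = np.array([f(s) for s in sigmas])                  # push each point through f
    y_mean = wm @ y
    dy = y - y_mean
    y_cov = (wc[:, None] * dy).T @ dy
    return y_mean, y_cov

# Example: a saturating observation nonlinearity, loosely evocative of an excitable medium.
m = np.array([0.5, -0.2])
P = np.array([[0.10, 0.02],
              [0.02, 0.05]])
print(unscented_transform(m, P, np.tanh))
```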
Constraining the loop quantum gravity parameter space from phenomenology
NASA Astrophysics Data System (ADS)
Brahma, Suddhasattwa; Ronco, Michele
2018-03-01
Development of quantum gravity theories rarely takes inputs from experimental physics. In this letter, we take a small step towards correcting this by establishing a paradigm for incorporating putative quantum corrections, arising from canonical quantum gravity (QG) theories, in deriving falsifiable modified dispersion relations (MDRs) for particles on a deformed Minkowski space-time. This allows us to differentiate and, hopefully, pick between several quantization choices via testable, state-of-the-art phenomenological predictions. Although a few explicit examples from loop quantum gravity (LQG) (such as the regularization scheme used or the representation of the gauge group) are shown here to establish the claim, our framework is more general and is capable of addressing other quantization ambiguities within LQG and also those arising from other similar QG approaches.
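For orientation, a generic Planck-scale-corrected dispersion relation of the kind referred to above can be parametrized as follows; this is an illustrative form, not the specific relations derived in the letter, with α and n standing in for the quantization-dependent corrections.

```latex
% Illustrative Planck-scale-corrected dispersion relation (not the paper's specific result):
E^{2} \;=\; p^{2} + m^{2} + \alpha\,\frac{p^{\,2+n}}{E_{\mathrm{Pl}}^{\,n}},
\qquad n \ge 1, \quad |\alpha| = \mathcal{O}(1),
% which, for photons, implies an energy-dependent group velocity
v_{g} \;\simeq\; 1 + \frac{(n+1)\,\alpha}{2}\left(\frac{E}{E_{\mathrm{Pl}}}\right)^{\!n},
% the type of deviation that time-of-flight observations of distant
% astrophysical sources can bound.
```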
NASA Astrophysics Data System (ADS)
Päßler, Jan-Filip; Jarochowska, Emilia; Bestmann, Michel; Munnecke, Axel
2018-02-01
Although carbonate-precipitating cyanobacteria are ubiquitous in aquatic ecosystems today, the criteria used to identify them in the geological record are subjective and rarely testable. Differences in the mode of biomineralization between cyanobacteria and eukaryotes, i.e. biologically induced mineralization (BIM) vs. biologically controlled mineralization (BCM), result in different crystallographic structures which might be used as a criterion to test cyanobacterial affinities. Cyanobacteria are often used as a ‘wastebasket taxon’, to which various microfossils are assigned. The lack of a testable criterion for the identification of cyanobacteria may bias their fossil record severely. We employed electron backscatter diffraction (EBSD) to investigate the structure of calcareous skeletons in two microproblematica widespread in Palaeozoic marine ecosystems: Rothpletzella, hypothesized to be a cyanobacterium, and an incertae sedis microorganism Allonema. We used a calcareous trilobite shell as a BCM reference. The mineralized structure of Allonema has a simple single-layered structure of acicular crystals perpendicular to the surface of the organism. The c-axes of these crystals are parallel to the elongation and thereby normal to the surface of the organism. EBSD pole figures and misorientation axes distribution reveal a fibre texture around the c-axis with a small degree of variation (up to 30°), indicating a highly ordered structure. A comparable pattern was found in the trilobite shell. This structure allows excluding biologically induced mineralization as the mechanism of shell formation in Allonema. In Rothpletzella, the c-axes of the microcrystalline sheath show a broader clustering compared to Allonema, but still reveal crystals tending to be perpendicular to the surface of the organism. The misorientation axes of adjacent crystals show an approximately random distribution. Rothpletzella also shares morphological similarities with extant cyanobacteria. We propose that the occurrence of a strong misorientation relationship between adjacent crystals with misorientation axes clustering around the c-axis can be used as a proxy for the degree of control exerted by an organism on its mineralized structures. Therefore, precisely constrained distributions of misorientations (misorientation angle and misorientation axis) may be used to identify BCM in otherwise problematic fossils and can be used to ground-truth the cyanobacterial affinities commonly proposed for problematic extinct organisms.
Is Ecosystem-Atmosphere Observation in Long-Term Networks actually Science?
NASA Astrophysics Data System (ADS)
Schmid, H. P. E.
2015-12-01
Science uses observations to build knowledge by testable explanations and predictions. The "scientific method" requires controlled systematic observation to examine questions, hypotheses and predictions. Thus, enquiry along the scientific method responds to questions of the type "what if …?" In contrast, long-term observation programs follow a different strategy: we commonly take great care to minimize our influence on the environment of our measurements, with the aim to maximize their external validity. We observe what we think are key variables for ecosystem-atmosphere exchange and ask questions such as "what happens next?" or "how did this happen?" This apparent deviation from the scientific method begs the question whether any explanations we come up with for the phenomena we observe are actually contributing to testable knowledge, or whether their value remains purely anecdotal. Here, we present examples to argue that, under certain conditions, data from long-term observations and observation networks can have equivalent or even higher scientific validity than controlled experiments. Internal validity is particularly enhanced if observations are combined with modeling. Long-term observations of ecosystem-atmosphere fluxes identify trends and temporal scales of variability. Observation networks reveal spatial patterns and variations, and long-term observation networks combine both aspects. A necessary condition for such observations to gain validity beyond the anecdotal is the requirement that the data are comparable: a comparison of two measured values, separated in time or space, must inform us objectively whether (e.g.) one value is larger than the other. In turn, a necessary condition for the comparability of data is the compatibility of the sensors and procedures used to generate them. Compatibility ensures that we compare "apples to apples": that measurements conducted in identical conditions give the same values (within suitable uncertainty intervals). In principle, a useful tool to achieve comparability and compatibility is the standardization of sensors and methods. However, due to the diversity of ecosystems and settings, standardization in ecosystem-atmosphere exchange is difficult. We discuss some of the challenges and pitfalls of standardization across networks.
Teaching Rival Hypotheses in Experimental Psychology.
ERIC Educational Resources Information Center
Howard, George S.; Engelhardt, Jean L.
1984-01-01
Students critiqued research contained in the book "Rival Hypotheses" (Huck and Sandler, 1979), which contains studies dealing with knowledge claims of practical importance, e.g., the evidence that saccharin causes cancer. (RM)
Kell, Douglas B.; Oliver, Stephen G.
2014-01-01
One approach to experimental science involves creating hypotheses, then testing them by varying one or more independent variables, and assessing the effects of this variation on the processes of interest. We use this strategy to compare the intellectual status and available evidence for two models or views of mechanisms of transmembrane drug transport into intact biological cells. One (BDII) asserts that lipoidal phospholipid Bilayer Diffusion Is Important, while a second (PBIN) proposes that in normal intact cells Phospholipid Bilayer diffusion Is Negligible (i.e., may be neglected quantitatively), because evolution selected against it, and with transmembrane drug transport being effected by genetically encoded proteinaceous carriers or pores, whose “natural” biological roles, and substrates are based in intermediary metabolism. Despite a recent review elsewhere, we can find no evidence able to support BDII as we can find no experiments in intact cells in which phospholipid bilayer diffusion was either varied independently or measured directly (although there are many papers where it was inferred by seeing a covariation of other dependent variables). By contrast, we find an abundance of evidence showing cases in which changes in the activities of named and genetically identified transporters led to measurable changes in the rate or extent of drug uptake. PBIN also has considerable predictive power, and accounts readily for the large differences in drug uptake between tissues, cells and species, in accounting for the metabolite-likeness of marketed drugs, in pharmacogenomics, and in providing a straightforward explanation for the late-stage appearance of toxicity and of lack of efficacy during drug discovery programmes despite macroscopically adequate pharmacokinetics. Consequently, the view that Phospholipid Bilayer diffusion Is Negligible (PBIN) provides a starting hypothesis for assessing cellular drug uptake that is much better supported by the available evidence, and is both more productive and more predictive. PMID:25400580
The evolutionary logic of sepsis.
Rózsa, Lajos; Apari, Péter; Sulyok, Mihály; Tappe, Dennis; Bodó, Imre; Hardi, Richárd; Müller, Viktor
2017-11-01
The recently proposed Microbiome Mutiny Hypothesis posits that members of the human microbiome obtain information about the host individuals' health status and, when host survival is compromised, switch to an intensive exploitation strategy to maximize residual transmission. In animals and humans, sepsis is an acute systemic reaction to microbes invading the normally sterile body compartments. When induced by formerly mutualistic or neutral microbes, possibly in response to declining host health, sepsis appears to fit the 'microbiome mutiny' scenario except for its apparent failure to enhance transmission of the causative organisms. We propose that the ability of certain species of the microbiome to induce sepsis is not a fortuitous side effect of within-host replication, but rather it might, in some cases, be the result of their adaptive evolution. Whenever host health declines, inducing sepsis can be adaptive for those members of the healthy human microbiome that are capable of colonizing the future cadaver and spread by cadaver-borne transmission. We hypothesize that such microbes might exhibit switches along the 'mutualist - lethal pathogen - decomposer - mutualist again' scenario, implicating a previously unsuspected, surprising level of phenotypic plasticity. This hypothesis predicts that those species of the healthy microbiome that are recurring causative agents of sepsis can participate in the decomposition of cadavers, and can be transmitted as soil-borne or water-borne infections. Furthermore, in individual sepsis cases, the same microbial clones that dominate the systemic infection that precipitates sepsis, should also be present in high concentration during decomposition following death: this prediction is testable by molecular fingerprinting in experimentally induced animal models. Sepsis is a leading cause of human death worldwide. If further research confirms that some cases of sepsis indeed involve the 'mutiny' (facultative phenotypic switching) of normal members of the microbiome, then new strategies could be devised to prevent or treat sepsis by interfering with this process. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
Condition-dependent functional connectivity: syntax networks in bilinguals
Dodel, Silke; Golestani, Narly; Pallier, Christophe; ElKouby, Vincent; Le Bihan, Denis; Poline, Jean-Baptiste
2005-01-01
This paper introduces a method to study the variation of brain functional connectivity networks with respect to experimental conditions in fMRI data. It is related to the psychophysiological interaction technique introduced by Friston et al. and extends to networks of correlation modulation (CM networks). Extended networks containing several dozens of nodes are determined in which the links correspond to consistent correlation modulation across subjects. In addition, we assess inter-subject variability and determine networks in which the condition-dependent functional interactions can be explained by a subject-dependent variable. We applied the technique to data from a study on syntactical production in bilinguals and analysed functional interactions differentially across tasks (word reading or sentence production) and across languages. We find an extended network of consistent functional interaction modulation across tasks, whereas the network comparing languages shows fewer links. Interestingly, there is evidence for a specific network in which the differences in functional interaction across subjects can be explained by differences in the subjects' syntactical proficiency. Specifically, we find that regions, including ones that have previously been shown to be involved in syntax and in language production, such as the left inferior frontal gyrus, putamen, insula, precentral gyrus, as well as the supplementary motor area, are more functionally linked during sentence production in the second, compared with the first, language in syntactically more proficient bilinguals than in syntactically less proficient ones. Our approach extends conventional activation analyses to the notion of networks, emphasizing functional interactions between regions independently of whether or not they are activated. On the one hand, it gives rise to testable hypotheses and allows an interpretation of the results in terms of the previous literature, and on the other hand, it provides a basis for studying the structure of functional interactions as a whole, and hence represents a further step towards the notion of large-scale networks in functional imaging. PMID:16087437
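The correlation-modulation idea can be sketched in a few lines: for each subject, the link between two regions is the difference of their time-series correlations across two conditions, and an edge enters the CM network if that modulation is consistent across subjects. The data, group size, and threshold below are synthetic assumptions, not the study's.

```python
# A hedged sketch of the correlation-modulation (CM) idea: per subject, the link between
# two regions is the difference of their time-series correlations in condition A vs B;
# an edge enters the CM network if the modulation is consistent across subjects.
import numpy as np
from scipy import stats

def corr_modulation(ts_a, ts_b):
    """ts_a, ts_b: (time, regions) arrays for conditions A and B of one subject."""
    return np.corrcoef(ts_a, rowvar=False) - np.corrcoef(ts_b, rowvar=False)

rng = np.random.default_rng(1)
n_subj, T, R = 12, 120, 5
cm = np.array([corr_modulation(rng.normal(size=(T, R)), rng.normal(size=(T, R)))
               for _ in range(n_subj)])              # shape: (subjects, R, R)

edges = []                                           # consistent CM links across subjects
for i in range(R):
    for j in range(i + 1, R):
        t, p = stats.ttest_1samp(cm[:, i, j], popmean=0.0)
        if p < 0.05:
            edges.append((i, j, round(float(t), 2)))
print(edges)   # with pure-noise data this list will usually be (close to) empty
```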
Schmickl, Thomas; Karsai, Istvan
2014-01-01
We develop a model to produce plausible patterns of task partitioning in the ponerine ant Ectatomma ruidum based on the availability of living prey and prey corpses. The model is based on the organizational capabilities of a “common stomach” through which the colony utilizes the availability of a natural (food) substance as a major communication channel to regulate the income and expenditure of the very same substance. This communication channel also has a central role in regulating task partitioning of collective hunting behavior in a supply-and-demand-driven manner. Our model shows that task partitioning of the collective hunting behavior in E. ruidum can be explained by regulation due to a common stomach system. The saturation of the common stomach provides accessible information to individual ants so that they can adjust their hunting behavior accordingly by engaging in or abandoning stinging or transporting tasks. The common stomach is able to establish and to keep stabilized an effective mix of workforce to exploit the prey population and to transport food into the nest. This system is also able to react to external perturbations in a decentralized, homeostatic way, such as to changes in the prey density or to accumulation of food in the nest. Under stable conditions the system develops towards an equilibrium concerning colony size and prey density. Our model shows that organization of work through a common stomach system can allow Ectatomma ruidum to collectively forage for food in a robust, reactive and reliable way. The model is compared to previously published models that followed a different modeling approach. Based on our model analysis we also suggest a series of experiments for which our model gives plausible predictions. These predictions are used to formulate a set of testable hypotheses that should be investigated empirically in future experimentation. PMID:25493558
White-nose syndrome initiates a cascade of physiologic disturbances in the hibernating bat host.
Verant, Michelle L; Meteyer, Carol U; Speakman, John R; Cryan, Paul M; Lorch, Jeffrey M; Blehert, David S
2014-12-09
The physiological effects of white-nose syndrome (WNS) in hibernating bats and ultimate causes of mortality from infection with Pseudogymnoascus (formerly Geomyces) destructans are not fully understood. Increased frequency of arousal from torpor described among hibernating bats with late-stage WNS is thought to accelerate depletion of fat reserves, but the physiological mechanisms that lead to these alterations in hibernation behavior have not been elucidated. We used the doubly labeled water (DLW) method and clinical chemistry to evaluate energy use, body composition changes, and blood chemistry perturbations in hibernating little brown bats (Myotis lucifugus) experimentally infected with P. destructans to better understand the physiological processes that underlie mortality from WNS. These data indicated that fat energy utilization, as demonstrated by changes in body composition, was two-fold higher for bats with WNS compared to negative controls. These differences were apparent in early stages of infection when torpor-arousal patterns were equivalent between infected and non-infected animals, suggesting that P. destructans has complex physiological impacts on its host prior to onset of clinical signs indicative of late-stage infections. Additionally, bats with mild to moderate skin lesions associated with early-stage WNS demonstrated a chronic respiratory acidosis characterized by significantly elevated dissolved carbon dioxide, acidemia, and elevated bicarbonate. Potassium concentrations were also significantly higher among infected bats, but sodium, chloride, and other hydration parameters were equivalent to controls. Integrating these novel findings on the physiological changes that occur in early-stage WNS with those previously documented in late-stage infections, we propose a multi-stage disease progression model that mechanistically describes the pathologic and physiologic effects underlying mortality of WNS in hibernating bats. This model identifies testable hypotheses for better understanding this disease, knowledge that will be critical for defining effective disease mitigation strategies aimed at reducing morbidity and mortality that results from WNS.
Kapoor, Abhijeet; Shandilya, Manish; Kundu, Suman
2011-01-01
Human dopamine β-hydroxylase (DBH) is an important therapeutic target for complex traits. Several single nucleotide polymorphisms (SNPs) have also been identified in DBH with potential adverse physiological effects. However, difficulty in obtaining diffractable crystals and the lack of a suitable template for modeling the protein have ensured that neither a crystallographic three-dimensional structure nor a computational model for the enzyme is available to aid rational drug design, prediction of the functional significance of SNPs or analytical protein engineering. Adequate biochemical information regarding human DBH, structural coordinates for peptidylglycine alpha-hydroxylating monooxygenase and computational data from a partial model of rat DBH were used along with logical manual intervention in a novel way to build an in silico model of human DBH. The model provides structural insight into the active site, metal coordination, subunit interface, substrate recognition and inhibitor binding. It reveals that the DOMON domain potentially promotes tetramerization, while the substrate dopamine and a potential therapeutic inhibitor, nepicastat, are stabilized in the active site through multiple hydrogen bonding. The functional significance of several exonic SNPs could be described from a structural analysis of the model. The model confirms that SNPs resulting in Ala318Ser or Leu317Pro mutations may not influence enzyme activity, while Gly482Arg might actually do so, being in the proximity of the active site. Arg549Cys may cause abnormal oligomerization through non-native disulfide bond formation. Other SNPs like Glu181, Glu250, Lys239 and Asp290 could potentially inhibit tetramerization, thus affecting function. The first three-dimensional model of the full-length human DBH protein was obtained in a novel manner, with a set of experimental data as a guideline for the consistency of the in silico prediction. Preliminary physicochemical tests validated the model. The model confirms, rationalizes and provides a structural basis for several biochemical observations, and suggests testable hypotheses regarding function. It provides a reasonable template for drug design as well.
In vitro screening for population variability in toxicity of pesticide-containing mixtures
Abdo, Nour; Wetmore, Barbara A.; Chappell, Grace A.; Shea, Damian; Wright, Fred A.; Rusyna, Ivan
2016-01-01
Population-based human in vitro models offer exceptional opportunities for evaluating the potential hazard and mode of action of chemicals, as well as variability in responses to toxic insults among individuals. This study was designed to test the hypothesis that comparative population genomics with efficient in vitro experimental design can be used for evaluation of the potential for hazard, mode of action, and the extent of population variability in responses to chemical mixtures. We selected 146 lymphoblast cell lines from 4 ancestrally and geographically diverse human populations based on the availability of genome sequence and basal RNA-seq data. Cells were exposed to two pesticide mixtures – an environmental surface water sample comprised primarily of organochlorine pesticides and a laboratory-prepared mixture of 36 currently used pesticides – in concentration response and evaluated for cytotoxicity. On average, the two mixtures exhibited a similar range of in vitro cytotoxicity and showed considerable inter-individual variability across screened cell lines. However, when in vitro-to-in vivo extrapolation (IVIVE) coupled with reverse dosimetry was employed to convert the in vitro cytotoxic concentrations to oral equivalent doses and compared to the upper bound of predicted human exposure, we found that a nominally more cytotoxic chlorinated pesticide mixture is expected to have a greater margin of safety (more than 5 orders of magnitude) as compared to the current use pesticide mixture (less than 2 orders of magnitude) due primarily to differences in exposure predictions. Multivariate genome-wide association mapping revealed an association between the toxicity of the current use pesticide mixture and the polymorphism rs1947825 in C17orf54. We conclude that a combination of in vitro human population-based cytotoxicity screening followed by dosimetric adjustment and comparative population genomics analyses enables quantitative evaluation of human health hazard from complex environmental mixtures. Additionally, such an approach yields testable hypotheses regarding potential toxicity mechanisms. PMID:26386728
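The reverse-dosimetry arithmetic behind the margin-of-safety comparison is simple to state; the sketch below uses entirely hypothetical numbers (concentrations, Css values, exposures) to show how a more cytotoxic mixture can still carry the larger margin when its predicted exposure is far lower.

```python
# Sketch of the IVIVE / reverse-dosimetry arithmetic (all numbers hypothetical): an in
# vitro cytotoxic concentration is converted to an oral equivalent dose (OED) using a
# modeled steady-state plasma concentration (Css) per unit dose, then compared with an
# upper-bound exposure estimate to give a margin of safety.
def oral_equivalent_dose(in_vitro_conc_uM: float, css_uM_per_mg_kg_day: float) -> float:
    """OED in mg/kg/day that would produce the in vitro concentration in plasma."""
    return in_vitro_conc_uM / css_uM_per_mg_kg_day

def margin_of_safety(oed_mg_kg_day: float, exposure_mg_kg_day: float) -> float:
    return oed_mg_kg_day / exposure_mg_kg_day

# Hypothetical comparison: more cytotoxic in vitro, yet larger margin, because the
# predicted exposure is orders of magnitude lower.
oed_mixture_1 = oral_equivalent_dose(5.0, css_uM_per_mg_kg_day=2.0)    # 2.5 mg/kg/day
oed_mixture_2 = oral_equivalent_dose(50.0, css_uM_per_mg_kg_day=2.0)   # 25.0 mg/kg/day
print(margin_of_safety(oed_mixture_1, exposure_mg_kg_day=1e-5))  # ~2.5e5
print(margin_of_safety(oed_mixture_2, exposure_mg_kg_day=1.0))   # ~25
```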
The Microbe-Free Plant: Fact or Artifact?
Partida-Martínez, Laila P.; Heil, Martin
2011-01-01
Plant–microbe interactions are ubiquitous. Plants are threatened by pathogens, but they are even more commonly engaged in neutral or mutualistic interactions with microbes: belowground microbial plant associates are mycorrhizal fungi, Rhizobia, and plant-growth promoting rhizosphere bacteria, aboveground plant parts are colonized by internally living bacteria and fungi (endophytes) and by microbes in the phyllosphere (epiphytes). We emphasize here that a completely microbe-free plant is an exotic exception rather than the biologically relevant rule. The complex interplay of such microbial communities with the host–plant affects multiple vital parameters such as plant nutrition, growth rate, resistance to biotic and abiotic stressors, and plant survival and distribution. The mechanisms involved reach from direct ones such as nutrient acquisition, the production of plant hormones, or direct antibiosis, to indirect ones that are mediated by effects on host resistance genes or via interactions at higher trophic levels. Plant-associated microbes are heterotrophic and cause costs to their host plant, whereas the benefits depend on the current environment. Thus, the outcome of the interaction for the plant host is highly context dependent. We argue that considering the microbe-free plant as the “normal” or control stage significantly impairs research into important phenomena such as (1) phenotypic and epigenetic plasticity, (2) the “normal” ecological outcome of a given interaction, and (3) the evolution of plants. For the future, we suggest cultivation-independent screening methods using direct PCR from plant tissue of more than one fungal and bacterial gene to collect data on the true microbial diversity in wild plants. The patterns found could be correlated to host species and environmental conditions, in order to formulate testable hypotheses on the biological roles of plant endophytes in nature. Experimental approaches should compare different host–endophyte combinations under various relevant environmental conditions and study at the genetic, epigenetic, transcriptional, and physiological level the parameters that cause the interaction to shift along the mutualism–parasitism continuum. PMID:22639622
Dissecting Leishmania infantum Energy Metabolism - A Systems Perspective
Subramanian, Abhishek; Jhawar, Jitesh; Sarkar, Ram Rup
2015-01-01
Leishmania infantum, causative agent of visceral leishmaniasis in humans, illustrates a complex lifecycle pertaining to two extreme environments, namely, the gut of the sandfly vector and human macrophages. Leishmania is capable of dynamically adapting and tactically switching between these critically hostile situations. The possible metabolic routes ventured by the parasite to achieve this exceptional adaptation to its varying environments are still poorly understood. In this study, we present an extensively reconstructed energy metabolism network of Leishmania infantum as an attempt to identify certain strategic metabolic routes preferred by the parasite to optimize its survival in such dynamic environments. The reconstructed network consists of 142 genes encoding for enzymes performing 237 reactions distributed across five distinct model compartments. We annotated the subcellular locations of different enzymes and their reactions on the basis of strong literature evidence and sequence-based detection of cellular localization signal within a protein sequence. To explore the diverse features of parasite metabolism the metabolic network was implemented and analyzed as a constraint-based model. Using a systems-based approach, we also put forth an extensive set of lethal reaction knockouts; some of which were validated using published data on Leishmania species. Performing a robustness analysis, the model was rigorously validated and tested for the secretion of overflow metabolites specific to Leishmania under varying extracellular oxygen uptake rate. Further, the fate of important non-essential amino acids in L. infantum metabolism was investigated. Stage-specific scenarios of L. infantum energy metabolism were incorporated in the model and key metabolic differences were outlined. Analysis of the model revealed the essentiality of glucose uptake, succinate fermentation, glutamate biosynthesis and an active TCA cycle as driving forces for parasite energy metabolism and its optimal growth. Finally, through our in silico knockout analysis, we could identify possible therapeutic targets that provide experimentally testable hypotheses. PMID:26367006
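The constraint-based analysis described above rests on flux-balance analysis: maximize an objective flux subject to steady-state mass balance S v = 0 and bounds on each reaction. The toy network below (four invented reactions, not part of the 237-reaction reconstruction) shows the mechanics, including a crude knockout test.

```python
# Toy flux-balance analysis (FBA) in the spirit of the constraint-based model described
# above: maximize a biomass flux subject to S v = 0 and flux bounds. The 4-reaction
# network is invented purely to show the mechanics.
import numpy as np
from scipy.optimize import linprog

#            uptake  A->B  B->biomass  A->byproduct
S = np.array([[ 1.0, -1.0,  0.0, -1.0],   # metabolite A balance
              [ 0.0,  1.0, -1.0,  0.0]])  # metabolite B balance
bounds = [(0, 10), (0, None), (0, None), (0, None)]   # uptake capped at 10
c = np.array([0.0, 0.0, -1.0, 0.0])                   # maximize biomass flux v3

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
print("optimal fluxes:", res.x)            # expected: [10, 10, 10, 0]

# A crude "knockout": force the A->B reaction to zero and re-optimize; biomass
# production collapses, flagging that reaction as essential in this toy network.
ko_bounds = list(bounds)
ko_bounds[1] = (0, 0)
ko = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=ko_bounds, method="highs")
print("knockout growth:", -ko.fun)
```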
Reliability/maintainability/testability design for dormancy
NASA Astrophysics Data System (ADS)
Seman, Robert M.; Etzl, Julius M.; Purnell, Arthur W.
1988-05-01
This document has been prepared as a tool for designers of dormant military equipment and systems. The purpose of this handbook is to provide design engineers with Reliability/Maintainability/Testability design guidelines for systems which spend significant portions of their life cycle in a dormant state. The dormant state is defined as a nonoperating mode where a system experiences very little or no electrical stress. The guidelines in this report present design criteria in the following categories: (1) Part Selection and Control; (2) Derating Practices; (3) Equipment/System Packaging; (4) Transportation and Handling; (5) Maintainability Design; (6) Testability Design; (7) Evaluation Methods for In-Plant and Field Evaluation; and (8) Product Performance Agreements. Wherever applicable, design guidelines for operating systems were included with the dormant design guidelines. This was done in an effort to produce design guidelines for a more complete life cycle. Although dormant systems spend significant portions of their life cycle in a nonoperating mode, the designer must design the system for the complete life cycle, including nonoperating as well as operating modes. The guidelines are primarily intended for use in the design of equipment composed of electronic parts and components. However, they can also be used for the design of systems which encompass both electronic and nonelectronic parts, as well as for the modification of existing systems.
Huang, Dan; Chen, Xuejuan; Gong, Qi; Yuan, Chaoqun; Ding, Hui; Bai, Jing; Zhu, Hui; Fu, Zhujun; Yu, Rongbin; Liu, Hu
2016-01-01
This survey was conducted to determine the testability, distribution and associations of ocular biometric parameters in Chinese preschool children. Ocular biometric examinations, including the axial length (AL) and corneal radius of curvature (CR), were conducted on 1,688 3-year-old subjects by using an IOLMaster in August 2015. Anthropometric parameters, including height and weight, were measured according to a standardized protocol, and body mass index (BMI) was calculated. The testability was 93.7% for the AL and 78.6% for the CR overall, and both measures improved with age. Girls performed slightly better in AL measurements (P = 0.08), and the difference in CR was statistically significant (P < 0.05). The AL distribution was normal in girls (P = 0.12), whereas it was not in boys (P < 0.05). For CR1, all subgroups presented normal distributions (P = 0.16 for boys; P = 0.20 for girls), but the distribution varied when the subgroups were combined (P < 0.05). CR2 presented a normal distribution (P = 0.11), whereas the AL/CR ratio was abnormal (P < 0.001). Boys exhibited a significantly longer AL, a greater CR and a greater AL/CR ratio than girls (all P < 0.001). PMID:27384307
Spiegel, Brennan M.R.; Chey, William D.; Chang, Lin
2010-01-01
Some studies indicate that small intestinal bacterial overgrowth (SIBO), as measured by hydrogen breath tests (HBT), is more prevalent in patients with irritable bowel syndrome (IBS) vs. matched controls without IBS. Although the data are conflicting, this observation has led to the hypothesis that SIBO may be a primary cause of IBS. Yet, it remains unclear whether SIBO is truly fundamental to the pathophysiology of IBS, or is instead a mere epiphenomenon or bystander of something else altogether. We hypothesize that SIBO might be a byproduct of the disproportionate use of proton pump inhibitors (PPIs) in IBS, as follows: (1) IBS patients are more likely than controls to receive PPI therapy; (2) PPI therapy may promote varying forms of SIBO by eliminating gastric acid; and (3) existing studies linking SIBO to IBS have not adjusted for or excluded the use of PPI therapy. When linked together, these premises form the basis for a simple and testable hypothesis: the relationship between SIBO and IBS may be confounded by PPIs. Our article explores these premises, lays out the argument supporting this “PPI hypothesis,” discusses potential implications, and outlines next steps to further investigate this possibility. PMID:19086951
Sexual imprinting: what strategies should we expect to see in nature?
Chaffee, Dalton W; Griffin, Hayes; Gilman, R Tucker
2013-12-01
Sexual imprinting occurs when juveniles learn mate preferences by observing the phenotypes of other members of their populations, and it is ubiquitous in nature. Imprinting strategies, that is which individuals and phenotypes are observed and how strong preferences become, vary among species. Imprinting can affect trait evolution and the probability of speciation, and different imprinting strategies are expected to have different effects. However, little is known about how and why different imprinting strategies evolve, or which strategies we should expect to see in nature. We used a mathematical model to study how the evolution of sexual imprinting depends on (1) imprinting costs and (2) the sex-specific fitness effects of the phenotype on which individuals imprint. We found that even small fixed costs prevent the evolution of sexual imprinting, but small relative costs do not. When imprinting does evolve, we identified the conditions under which females should evolve to imprint on their fathers, their mothers, or on other members of their populations. Our results provide testable hypotheses for empirical work and help to explain the conditions under which sexual imprinting might evolve to promote speciation. © 2013 The Author(s). Evolution © 2013 The Society for the Study of Evolution.
Wertz, Annie E; Moya, Cristina
2018-05-30
Despite a shared recognition that the design of the human mind and the design of human culture are tightly linked, researchers in the evolutionary social sciences tend to specialize in understanding one at the expense of the other. The disciplinary boundaries roughly correspond to research traditions that focus more on natural selection and those that focus more on cultural evolution. In this paper, we articulate how two research traditions within the evolutionary social sciences-evolutionary psychology and cultural evolution-approach the study of design. We focus our analysis on the design of cognitive mechanisms that are the result of the interplay of genetic and cultural evolution. We aim to show how the approaches of these two research traditions can complement each other, and provide a framework for developing a wider range of testable hypotheses about cognitive design. To do so, we provide concrete illustrations of how this integrated approach can be used to interrogate cognitive design using examples from our own work on plant and symbolic group boundary cognition. We hope this recognition of different pathways to design will broaden the hypothesis space in the evolutionary social sciences and encourage methodological pluralism in the investigation of the mind. Copyright © 2018 Elsevier B.V. All rights reserved.
Poudel, R; Jumpponen, A; Schlatter, D C; Paulitz, T C; Gardener, B B McSpadden; Kinkel, L L; Garrett, K A
2016-10-01
Network models of soil and plant microbiomes provide new opportunities for enhancing disease management, but also challenges for interpretation. We present a framework for interpreting microbiome networks, illustrating how observed network structures can be used to generate testable hypotheses about candidate microbes affecting plant health. The framework includes four types of network analyses. "General network analysis" identifies candidate taxa for maintaining an existing microbial community. "Host-focused analysis" includes a node representing a plant response such as yield, identifying taxa with direct or indirect associations with that node. "Pathogen-focused analysis" identifies taxa with direct or indirect associations with taxa known a priori as pathogens. "Disease-focused analysis" identifies taxa associated with disease. Positive direct or indirect associations with desirable outcomes, or negative associations with undesirable outcomes, indicate candidate taxa. Network analysis provides characterization not only of taxa with direct associations with important outcomes such as disease suppression, biofertilization, or expression of plant host resistance, but also taxa with indirect associations via their association with other key taxa. We illustrate the interpretation of network structure with analyses of microbiomes in the oak phyllosphere, and in wheat rhizosphere and bulk soil associated with the presence or absence of infection by Rhizoctonia solani.
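A minimal version of the "host-focused analysis" described above can be expressed as a small association graph containing a plant-response node; taxa connected to it directly, or indirectly through other taxa, become candidates. The edges, signs, and taxon names below are invented for illustration.

```python
# A hedged sketch of host- and pathogen-focused network analysis on a tiny invented
# association network with a plant-response node ("yield").
import networkx as nx

G = nx.Graph()
G.add_edge("TaxonA", "yield", sign=+1)       # direct positive association with yield
G.add_edge("TaxonB", "TaxonA", sign=+1)      # indirect, via TaxonA
G.add_edge("TaxonC", "pathogen", sign=-1)    # pathogen-focused: negative with pathogen
G.add_edge("pathogen", "yield", sign=-1)

direct = [n for n in G.neighbors("yield") if n != "pathogen"]
indirect = [n for n in G.nodes
            if n not in direct + ["yield", "pathogen"] and nx.has_path(G, n, "yield")]
print("direct candidates:", direct)      # taxa directly associated with the response
print("indirect candidates:", indirect)  # taxa associated via other key taxa
```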
Global Change And Water Availability And Quality: Challenges Ahead
NASA Astrophysics Data System (ADS)
Larsen, M. C.; Ryker, S. J.
2012-12-01
The United States is in the midst of a continental-scale, multi-year water-resources experiment, in which society has not defined testable hypotheses or set the duration and scope of the experiment. What are we doing? We are expanding population at two to three times the national growth rate in our most water-scarce states, in the southwest, where water stress is already great and modeling predicts decreased streamflow by the middle of this century. We are expanding irrigated agriculture from the west into the east, particularly to the southeastern states, where increased competition for ground and surface water has urban, agricultural, and environmental interests at odds, and increasingly, in court. We are expanding our consumption of pharmaceutical and personal care products to historic high levels and disposing of them in surface and groundwater, through sewage treatment plants and individual septic systems that were not designed to treat them. These and other examples of our national-scale experiment are likely to continue well into the 21st century. This experiment and related challenges will continue and likely intensify as non-climatic and climatic factors, such as predicted rising temperature and changes in the distribution of precipitation in time and space, continue to develop.
Computational Approaches to Drug Repurposing and Pharmacology
Hodos, Rachel A; Kidd, Brian A; Khader, Shameer; Readhead, Ben P; Dudley, Joel T
2016-01-01
Data in the biological, chemical, and clinical domains are accumulating at ever-increasing rates and have the potential to accelerate and inform drug development in new ways. Challenges and opportunities now lie in developing analytic tools to transform these often complex and heterogeneous data into testable hypotheses and actionable insights. This is the aim of computational pharmacology, which uses in silico techniques to better understand and predict how drugs affect biological systems, which can in turn improve clinical use, avoid unwanted side effects, and guide selection and development of better treatments. One exciting application of computational pharmacology is drug repurposing: finding new uses for existing drugs. Already yielding many promising candidates, this strategy has the potential to improve the efficiency of the drug development process and reach patient populations with previously unmet needs such as those with rare diseases. While current techniques in computational pharmacology and drug repurposing often focus on just a single data modality such as gene expression or drug-target interactions, we argue that methods such as matrix factorization that can integrate data within and across diverse data types have the potential to improve predictive performance and provide a fuller picture of a drug's pharmacological action. PMID:27080087
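The matrix-factorization idea mentioned above can be sketched on a tiny synthetic drug-by-target matrix: factorize the observed associations and read high reconstructed scores at unobserved entries as repurposing hypotheses. The matrix, rank, and library choice below are illustrative assumptions.

```python
# Illustrative sketch of matrix factorization for repurposing: factorize a (drugs x
# targets) association matrix and score unobserved pairs from the reconstruction.
import numpy as np
from sklearn.decomposition import NMF

X = np.array([[1, 1, 0, 0],     # drug 1 hits targets t1, t2
              [1, 0, 0, 0],     # drug 2: an incomplete profile
              [0, 0, 1, 1],     # drug 3 hits t3, t4
              [0, 0, 1, 0]], dtype=float)

model = NMF(n_components=2, init="nndsvda", random_state=0, max_iter=1000)
W = model.fit_transform(X)            # drug factors
H = model.components_                 # target factors
scores = W @ H                        # reconstructed association scores
print(np.round(scores, 2))
# Zero entries with high reconstructed scores (e.g. drug 2 / target t2) are the kind
# of testable repurposing hypotheses this analysis surfaces.
```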
Evolutionary medicine: update on the relevance to family practice.
Naugler, Christopher T
2008-09-01
To review the relevance of evolutionary medicine to family practice and family physician training. Articles were located through a MEDLINE search, using the key words evolution, Darwin, and adaptation. Most references presented level III evidence (expert opinion), while a minority provided level II evidence (epidemiologic studies). Evolutionary medicine deals with the interplay of biology and the environment in the understanding of human disease. Yet medical schools have virtually ignored the need for family physicians to have more than a cursory knowledge of this topic. A review of the main trends in this field most relevant to family practice revealed that a basic knowledge of evolutionary medicine might help in explaining the causation of diseases to patients. Evolutionary medicine has also proven key to explaining the reasons for the development of antibiotic resistance and has the potential to explain cancer pathogenesis. As an organizing principle, this field also has potential in the teaching of family medicine. Evolutionary medicine should be studied further and incorporated into medical training and practice. Its practical utility will be proven through the generation of testable hypotheses and their application in relation to disease causation and possible prevention.
Trajectories of women's abortion-related care: A conceptual framework.
Coast, Ernestina; Norris, Alison H; Moore, Ann M; Freeman, Emily
2018-03-01
We present a new conceptual framework for studying trajectories to obtaining abortion-related care. It assembles for the first time all of the known factors influencing a trajectory and encourages readers to consider the ways these macro- and micro-level factors operate in multiple and sometimes conflicting ways. Based on presentation to and feedback from abortion experts (researchers, providers, funders, policymakers and advisors, advocates) (n = 325) between 03/06/2014 and 22/08/2015, and a systematic mapping of peer-reviewed literature (n = 424) published between 01/01/2011 and 30/10/2017, our framework synthesises the factors shaping abortion trajectories, grouped into three domains: abortion-specific experiences, individual contexts, and (inter)national and sub-national contexts. Our framework includes time-dependent processes involved in an individual trajectory, starting with timing of pregnancy awareness. This framework can be used to guide testable hypotheses about enabling and inhibiting influences on care-seeking behaviour and consideration about how abortion trajectories might be influenced by policy or practice. Research based on understanding of trajectories has the potential to improve women's experiences and outcomes of abortion-related care. Copyright © 2018 The Author(s). Published by Elsevier Ltd.. All rights reserved.
Genes, Environments, and Sex Differences in Alcohol Research.
Salvatore, Jessica E; Cho, Seung Bin; Dick, Danielle M
2017-07-01
The study of sex differences has been identified as one way to enhance scientific reproducibility, and the National Institutes of Health (NIH) have implemented a new policy to encourage the explicit examination of sex differences. Our goal here is to address sex differences in behavioral genetic research on alcohol outcomes. We review sex differences for alcohol outcomes and whether the source and magnitude of genetic influences on alcohol consumption and alcohol use disorder (AUD) are the same across sexes; describe common research designs for studying sex-specific gene-by-environment interaction (G × E) effects; and discuss the role of statistical power and theory when testing sex-specific genetic effects. There are robust sex differences for many alcohol outcomes. The weight of evidence suggests that the source and magnitude of genetic influences on alcohol consumption and AUD are the same across sexes. Whether there are sex-specific G × E effects has received less attention to date. The new NIH policy necessitates a systematic approach for studying sex-specific genetic effects in alcohol research. Researchers are encouraged to report power for tests of these effects and to use theory to develop testable hypotheses, especially for studies of G × E.
DuBois, Debra C; Piel, William H; Jusko, William J
2008-01-01
High-throughput data collection using gene microarrays has great potential as a method for addressing the pharmacogenomics of complex biological systems. Similarly, mechanism-based pharmacokinetic/pharmacodynamic modeling provides a tool for formulating quantitative testable hypotheses concerning the responses of complex biological systems. As the response of such systems to drugs generally entails cascades of molecular events in time, a time series design provides the best approach to capturing the full scope of drug effects. A major problem in using microarrays for high-throughput data collection is sorting through the massive amount of data in order to identify probe sets and genes of interest. Due to its inherent redundancy, a rich time series containing many time points and multiple samples per time point allows for the use of less stringent criteria of expression, expression change and data quality for initial filtering of unwanted probe sets. The remaining probe sets can then become the focus of more intense scrutiny by other methods, including temporal clustering, functional clustering and pharmacokinetic/pharmacodynamic modeling, which provide additional ways of identifying the probes and genes of pharmacological interest. PMID:15212590
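The lenient initial filtering the passage describes, made possible by the redundancy of a rich time series, can be sketched as follows; the thresholds, array shapes, and synthetic data are illustrative assumptions, not the authors' pipeline.

```python
# A minimal sketch of a lenient first-pass filter over a microarray time series:
# keep probe sets that are expressed above background and show any appreciable change
# over time, deferring stringent testing to later clustering/modeling steps.
import numpy as np

rng = np.random.default_rng(0)
n_probes, n_times, n_reps = 1000, 8, 3
data = rng.lognormal(mean=4.0, sigma=1.0, size=(n_probes, n_times, n_reps))

mean_profile = data.mean(axis=2)                       # average over replicates
expressed = (mean_profile > 50).any(axis=1)            # above an assumed background level
fold_change = mean_profile.max(axis=1) / mean_profile.min(axis=1)
changing = fold_change > 1.5                           # any modest temporal change

keep = expressed & changing
print(f"{keep.sum()} of {n_probes} probe sets retained for closer scrutiny")
```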
Lemons, Michele L.
2012-01-01
Inquiry-based projects promote discovery and retention of key concepts, increase student engagement, and stimulate interest in research. Described here are a series of lab exercises within an undergraduate upper level neuroscience course that train students to design, execute and analyze their own hypothesis-driven research project. Prior to developing their own projects, students learn several research techniques including aseptic cell culture, cell line maintenance, immunocytochemistry and fluorescent microscopy. Working in groups, students choose how to use these techniques to characterize and identify a “mystery” cell line. Each lab group is given a unique cell line with either a neural, astrocyte, or Schwann cell origin. Working together, students plan and execute experiments to determine the cellular origin and other unique characteristics of their mystery cell line. Students generate testable hypotheses, design interpretable experiments, generate and analyze data, and report their findings in both oral and written formats. Students receive instructor and peer feedback throughout the entire project. In summary, these labs train students the process of scientific research. This series of lab exercises received very strong positive feedback from the students. Reflections on student feedback and plans for future improvements are discussed. PMID:23504583
A Scientific Cognitive-Behavioral Model of Tinnitus: Novel Conceptualizations of Tinnitus Distress
McKenna, Laurence; Handscomb, Lucy; Hoare, Derek J.; Hall, Deborah A.
2014-01-01
The importance of psychological factors in tinnitus distress has been formally recognized for almost three decades. The psychological understanding of why tinnitus can be a distressing condition posits that it becomes problematic when it acquires an emotive significance through cognitive processes. Principal therapeutic efforts are directed at reducing or removing the cognitive (and behavioral) obstacles to habituation. Here, the evidence relevant to a new psychological model of tinnitus is critically reviewed. The model posits that patients’ interpretations of tinnitus and the changes in behavior that result are given a central role in creating and maintaining distress. The importance of selective attention and the possibility that this leads to distorted perception of tinnitus is highlighted. From this body of evidence, we propose a coherent cognitive-behavioral model of tinnitus distress that is more in keeping with contemporary psychological theories of clinical problems (particularly that of insomnia) and which postulates a number of behavioral processes that are seen as cognitively mediated. This new model provides testable hypotheses to guide future research to unravel the complex mechanisms underpinning tinnitus distress. It is also well suited to define individual symptomatology and to provide a framework for the delivery of cognitive-behavioral therapy. PMID:25339938
A parallel implementation of an off-lattice individual-based model of multicellular populations
NASA Astrophysics Data System (ADS)
Harvey, Daniel G.; Fletcher, Alexander G.; Osborne, James M.; Pitt-Francis, Joe
2015-07-01
As computational models of multicellular populations include ever more detailed descriptions of biophysical and biochemical processes, the computational cost of simulating such models limits their ability to generate novel scientific hypotheses and testable predictions. While developments in microchip technology continue to increase the power of individual processors, parallel computing offers an immediate increase in available processing power. To make full use of parallel computing technology, it is necessary to develop specialised algorithms. To this end, we present a parallel algorithm for a class of off-lattice individual-based models of multicellular populations. The algorithm divides the spatial domain between computing processes and comprises communication routines that ensure the model is correctly simulated on multiple processors. The parallel algorithm is shown to accurately reproduce the results of a deterministic simulation performed using a pre-existing serial implementation. We test the scaling of computation time, memory use and load balancing as more processes are used to simulate a cell population of fixed size. We find approximate linear scaling of both speed-up and memory consumption on up to 32 processor cores. Dynamic load balancing is shown to provide speed-up for non-regular spatial distributions of cells in the case of a growing population.
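A minimal sketch of the kind of spatial domain decomposition described above, assuming a strip-wise split along the x-axis and a fixed cell-cell interaction radius; the function and parameter names are illustrative and this is not the authors' parallel implementation.

    import numpy as np

    def decompose_domain(positions, n_procs, interaction_radius):
        """Assign cells to processes by slicing the x-axis into equal-width strips.

        Returns, for each process, the indices of cells it owns and the indices of
        'halo' cells just outside its strip that must be received from neighbours
        because they lie within one interaction radius of the strip boundary.
        """
        x = positions[:, 0]
        edges = np.linspace(x.min(), x.max(), n_procs + 1)
        owned, halo = [], []
        for p in range(n_procs):
            lo, hi = edges[p], edges[p + 1]
            mine = (x >= lo) & (x <= hi) if p == n_procs - 1 else (x >= lo) & (x < hi)
            near = (~mine) & (x >= lo - interaction_radius) & (x < hi + interaction_radius)
            owned.append(np.where(mine)[0])
            halo.append(np.where(near)[0])
        return owned, halo

    rng = np.random.default_rng(1)
    cells = rng.uniform(0.0, 10.0, size=(500, 2))   # 500 cells in a 10 x 10 domain
    owned, halo = decompose_domain(cells, n_procs=4, interaction_radius=0.5)
    for p, (o, h) in enumerate(zip(owned, halo)):
        print(f"process {p}: owns {len(o)} cells, receives {len(h)} halo cells")

In a real distributed run, each process would advance only the cells it owns and exchange the halo cells with its neighbours at every time step.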
Covariations in ecological scaling laws fostered by community dynamics.
Zaoli, Silvia; Giometto, Andrea; Maritan, Amos; Rinaldo, Andrea
2017-10-03
Scaling laws in ecology, intended both as functional relationships among ecologically relevant quantities and the probability distributions that characterize their occurrence, have long attracted the interest of empiricists and theoreticians. Empirical evidence exists of power laws associated with the number of species inhabiting an ecosystem, their abundances, and traits. Although their functional form appears to be ubiquitous, empirical scaling exponents vary with ecosystem type and resource supply rate. The idea that ecological scaling laws are linked has been entertained before, but the full extent of macroecological pattern covariations, the role of the constraints imposed by finite resource supply, and a comprehensive empirical verification are still unexplored. Here, we propose a theoretical scaling framework that predicts the linkages of several macroecological patterns related to species' abundances and body sizes. We show that such a framework is consistent with the stationary-state statistics of a broad class of resource-limited community dynamics models, regardless of parameterization and model assumptions. We verify predicted theoretical covariations by contrasting empirical data and provide testable hypotheses for yet unexplored patterns. We thus place the observed variability of ecological scaling exponents into a coherent statistical framework where patterns in ecology embed constrained fluctuations.
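As a generic illustration of estimating an empirical scaling exponent (not the authors' covariation framework), the sketch below fits a power law y = c·x^b by least squares on log-log axes; the data are synthetic.

    import numpy as np

    def scaling_exponent(x, y):
        """Estimate the exponent b in y ~ c * x**b by least squares on log-log axes."""
        b, log_c = np.polyfit(np.log(x), np.log(y), deg=1)
        return b, np.exp(log_c)

    # Toy data drawn from y = 2 * x**0.75 with multiplicative noise
    rng = np.random.default_rng(2)
    x = np.logspace(0, 4, 60)
    y = 2.0 * x**0.75 * rng.lognormal(sigma=0.1, size=x.size)
    b, c = scaling_exponent(x, y)
    print(f"estimated exponent b = {b:.3f}, prefactor c = {c:.3f}")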
Barker, Jessica L.; Bronstein, Judith L.
2016-01-01
Exploitation in cooperative interactions both within and between species is widespread. Although it is assumed to be costly to be exploited, mechanisms to control exploitation are surprisingly rare, making the persistence of cooperation a fundamental paradox in evolutionary biology and ecology. Focusing on between-species cooperation (mutualism), we hypothesize that the temporal sequence in which exploitation occurs relative to cooperation affects its net costs and argue that this can help explain when and where control mechanisms are observed in nature. Our principal prediction is that when exploitation occurs late relative to cooperation, there should be little selection to limit its effects (analogous to “tolerated theft” in human cooperative groups). Although we focus on cases in which mutualists and exploiters are different individuals (of the same or different species), our inferences can readily be extended to cases in which individuals exhibit mixed cooperative-exploitative strategies. We demonstrate that temporal structure should be considered alongside spatial structure as an important process affecting the evolution of cooperation. We also provide testable predictions to guide future empirical research on interspecific as well as intraspecific cooperation. PMID:26841169
Howe, E.A.; de Souza, A.; Lahr, D.L.; Chatwin, S.; Montgomery, P.; Alexander, B.R.; Nguyen, D.-T.; Cruz, Y.; Stonich, D.A.; Walzer, G.; Rose, J.T.; Picard, S.C.; Liu, Z.; Rose, J.N.; Xiang, X.; Asiedu, J.; Durkin, D.; Levine, J.; Yang, J.J.; Schürer, S.C.; Braisted, J.C.; Southall, N.; Southern, M.R.; Chung, T.D.Y.; Brudz, S.; Tanega, C.; Schreiber, S.L.; Bittker, J.A.; Guha, R.; Clemons, P.A.
2015-01-01
BARD, the BioAssay Research Database (https://bard.nih.gov/) is a public database and suite of tools developed to provide access to bioassay data produced by the NIH Molecular Libraries Program (MLP). Data from 631 MLP projects were migrated to a new structured vocabulary designed to capture bioassay data in a formalized manner, with particular emphasis placed on the description of assay protocols. New data can be submitted to BARD with a user-friendly set of tools that assist in the creation of appropriately formatted datasets and assay definitions. Data published through the BARD application program interface (API) can be accessed by researchers using web-based query tools or a desktop client. Third-party developers wishing to create new tools can use the API to produce stand-alone tools or new plug-ins that can be integrated into BARD. The entire BARD suite of tools therefore supports three classes of researcher: those who wish to publish data, those who wish to mine data for testable hypotheses, and those in the developer community who wish to build tools that leverage this carefully curated chemical biology resource. PMID:25477388
A General, Synthetic Model for Predicting Biodiversity Gradients from Environmental Geometry.
Gross, Kevin; Snyder-Beattie, Andrew
2016-10-01
Latitudinal and elevational biodiversity gradients fascinate ecologists, and have inspired dozens of explanations. The geometry of the abiotic environment is sometimes thought to contribute to these gradients, yet evaluations of geometric explanations are limited by a fragmented understanding of the diversity patterns they predict. This article presents a mathematical model that synthesizes multiple pathways by which environmental geometry can drive diversity gradients. The model characterizes species ranges by their environmental niches and limits on range sizes and places those ranges onto the simplified geometries of a sphere or cone. The model predicts nuanced and realistic species-richness gradients, including latitudinal diversity gradients with tropical plateaus and mid-latitude inflection points and elevational diversity gradients with low-elevation diversity maxima. The model also illustrates the importance of a mid-environment effect that augments species richness at locations with intermediate environments. Model predictions match multiple empirical biodiversity gradients, depend on ecological traits in a testable fashion, and formally synthesize elements of several geometric models. Together, these results suggest that previous assessments of geometric hypotheses should be reconsidered and that environmental geometry may play a deeper role in driving biodiversity gradients than is currently appreciated.
Ezawa, Tatsuhiro; Saito, Katsuharu
2018-04-27
Contents: I. Introduction; II. Foraging for phosphate; III. Fine-tuning of phosphate homeostasis; IV. The frontiers: phosphate translocation and export; V. Conclusions and outlook. SUMMARY: Arbuscular mycorrhizal fungi form symbiotic associations with most land plants and deliver mineral nutrients, in particular phosphate, to the host. Therefore, understanding the mechanisms of phosphate acquisition and delivery in the fungi is critical for full appreciation of the mutualism in this association. Here, we provide updates on physical, chemical, and biological strategies of the fungi for phosphate acquisition, including interactions with phosphate-solubilizing bacteria, and those on the regulatory mechanisms of phosphate homeostasis based on resurveys of published genome sequences and a transcriptome with reference to the latest findings in a model fungus. For the mechanisms underlying phosphate translocation and export to the host, which are major research frontiers in this field, not only recent advances but also testable hypotheses are proposed. Lastly, we briefly discuss applicability of the latest tools to gene silencing in the fungi, which will be breakthrough techniques for comprehensive understanding of the molecular basis of fungal phosphate metabolism. © 2018 The Authors. New Phytologist © 2018 New Phytologist Trust.
Carpiano, Richard M
2006-01-01
Within the past several years, a considerable body of research on social capital has emerged in public health. Although offering the potential for new insights into how community factors impact health and well being, this research has received criticism for being undertheorized and methodologically flawed. In an effort to address some of these limitations, this paper applies Pierre Bourdieu's (1986) [Bourdieu, P. (1986). Handbook of theory and research for the sociology of education (pp. 241-258). New York: Greenwood] social capital theory to create a conceptual model of neighborhood socioeconomic processes, social capital (resources inhered within social networks), and health. After briefly reviewing the social capital conceptualizations of Bourdieu and Putnam, I attempt to integrate these authors' theories to better understand how social capital might operate within neighborhoods or local areas. Next, I describe a conceptual model that incorporates this theoretical integration of social capital into a framework of neighborhood social processes as health determinants. Discussion focuses on the utility of this Bourdieu-based neighborhood social capital theory and model for examining several under-addressed issues of social capital in the neighborhood effects literature and generating specific, empirically testable hypotheses for future research.
NASA Astrophysics Data System (ADS)
D'Ausilio, Alessandro; Bartoli, Eleonora; Maffongelli, Laura
2015-03-01
We are grateful to all commentators for their insightful commentaries and observations that enrich our proposal. One of our aims was indeed to bridge the gap between fields of research that, progressing independently, are facing similar issues regarding the neural representation of motor knowledge. In this respect, we were pleased to receive feedback from eminent researchers on both the mirror neuron as well as the motor control fields. Their expertise covers animal and human neurophysiology, as well as the computational modeling of neural and behavioral processes. Given their heterogeneous cultural perspectives and research approaches, a number of important open questions were raised. For simplicity we separated these issues into four sections. In the first section we present methodological aspects regarding how synergies can be measured in paradigms investigating the human mirror system. The second section regards the fundamental definition of what exactly synergies might be. The third concerns how synergies can generate testable predictions in mirror neuron research. Finally, the fourth section deals with the ultimate question regarding the function of the mirror neuron system.
Formalizing an integrative, multidisciplinary cancer therapy discovery workflow
McGuire, Mary F.; Enderling, Heiko; Wallace, Dorothy I.; Batra, Jaspreet; Jordan, Marie; Kumar, Sushil; Panetta, John C.; Pasquier, Eddy
2014-01-01
Although many clinicians and researchers work to understand cancer, there has been limited success to effectively combine forces and collaborate over time, distance, data and budget constraints. Here we present a workflow template for multidisciplinary cancer therapy that was developed during the 2nd Annual Workshop on Cancer Systems Biology sponsored by Tufts University, Boston, MA in July 2012. The template was applied to the development of a metronomic therapy backbone for neuroblastoma. Three primary groups were identified: clinicians, biologists, and scientists (mathematicians, computer scientists, physicists and engineers). The workflow described their integrative interactions; parallel or sequential processes; data sources and computational tools at different stages as well as the iterative nature of therapeutic development from clinical observations to in vitro, in vivo, and clinical trials. We found that theoreticians in dialog with experimentalists could develop calibrated and parameterized predictive models that inform and formalize sets of testable hypotheses, thus speeding up discovery and validation while reducing laboratory resources and costs. The developed template outlines an interdisciplinary collaboration workflow designed to systematically investigate the mechanistic underpinnings of a new therapy and validate that therapy to advance development and clinical acceptance. PMID:23955390
D’Esposito, Mark
2017-01-01
Recent work has established that visual working memory is subject to serial dependence: current information in memory blends with that from the recent past as a function of their similarity. This tuned temporal smoothing likely promotes the stability of memory in the face of noise and occlusion. Serial dependence accumulates over several seconds in memory and deteriorates with increased separation between trials. While this phenomenon has been extensively characterized in behavior, its neural mechanism is unknown. In the present study, we investigate the circuit-level origins of serial dependence in a biophysical model of cortex. We explore two distinct kinds of mechanisms: stable persistent activity during the memory delay period and dynamic “activity-silent” synaptic plasticity. We find that networks endowed with both strong reverberation to support persistent activity and dynamic synapses can closely reproduce behavioral serial dependence. Specifically, elevated activity drives synaptic augmentation, which biases activity on the subsequent trial, giving rise to a spatiotemporally tuned shift in the population response. Our hybrid neural model is a theoretical advance beyond abstract mathematical characterizations, offers testable hypotheses for physiological research, and demonstrates the power of biological insights to provide a quantitative explanation of human behavior. PMID:29244810
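A deliberately simplified scalar caricature of the proposed mechanism, in which each trial leaves a decaying synaptic trace that attracts the next report toward the previous stimulus with Gaussian similarity tuning; all parameter values are illustrative and this is not the biophysical network model itself. The resulting bias has the derivative-of-Gaussian shape typical of behavioral serial dependence.

    import numpy as np

    def simulate_serial_dependence(stimuli, gain=0.15, tuning_width=20.0, tau=10.0, iti=3.0):
        """Toy model: report = stimulus + tuned attraction toward the previous stimulus.

        The attraction decays with the inter-trial interval (synaptic trace with
        time constant tau, in seconds) and with dissimilarity between successive
        stimuli (Gaussian tuning in stimulus units, e.g. degrees).
        """
        reports, prev = [], None
        for s in stimuli:
            if prev is None:
                bias = 0.0
            else:
                delta = prev - s
                bias = gain * np.exp(-iti / tau) * delta * np.exp(-(delta**2) / (2 * tuning_width**2))
            reports.append(s + bias)
            prev = s
        return np.array(reports)

    rng = np.random.default_rng(3)
    stims = rng.uniform(0, 180, size=10)              # orientations in degrees
    print(np.round(simulate_serial_dependence(stims) - stims, 2))   # per-trial bias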
Antenna Mechanism of Length Control of Actin Cables
Mohapatra, Lishibanya; Goode, Bruce L.; Kondev, Jane
2015-01-01
Actin cables are linear cytoskeletal structures that serve as tracks for myosin-based intracellular transport of vesicles and organelles in both yeast and mammalian cells. In a yeast cell undergoing budding, cables are in constant dynamic turnover yet some cables grow from the bud neck toward the back of the mother cell until their length roughly equals the diameter of the mother cell. This raises the question: how is the length of these cables controlled? Here we describe a novel molecular mechanism for cable length control inspired by recent experimental observations in cells. This “antenna mechanism” involves three key proteins: formins, which polymerize actin, Smy1 proteins, which bind formins and inhibit actin polymerization, and myosin motors, which deliver Smy1 to formins, leading to a length-dependent actin polymerization rate. We compute the probability distribution of cable lengths as a function of several experimentally tuneable parameters such as the formin-binding affinity of Smy1 and the concentration of myosin motors delivering Smy1. These results provide testable predictions of the antenna mechanism of actin-cable length control. PMID:26107518
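The length dependence at the heart of the antenna mechanism can be caricatured with a toy stochastic simulation in which the assembly probability falls with cable length while disassembly is length-independent; the rates and the functional form k_on/(1 + c·L) are illustrative assumptions, not the authors' fitted model.

    import numpy as np

    def simulate_cable_lengths(n_cables=2000, n_steps=5000, k_on=1.0, k_off=0.5,
                               inhibition_per_unit=0.2, dt=0.1, rng=None):
        """Toy antenna-mechanism simulation (parameters are illustrative).

        Assembly probability per step falls with length L as k_on / (1 + c*L),
        mimicking length-dependent delivery of a formin inhibitor by myosin;
        disassembly occurs at a length-independent rate k_off.
        """
        rng = rng or np.random.default_rng(4)
        L = np.zeros(n_cables)
        for _ in range(n_steps):
            p_grow = k_on / (1.0 + inhibition_per_unit * L) * dt
            p_shrink = k_off * dt
            L += (rng.random(n_cables) < p_grow).astype(float)
            L -= (rng.random(n_cables) < p_shrink).astype(float)
            np.clip(L, 0, None, out=L)
        return L

    lengths = simulate_cable_lengths()
    print(f"mean length {lengths.mean():.1f} subunits, CV {lengths.std()/lengths.mean():.2f}")

The peaked length histogram produced by this toy model illustrates the qualitative prediction that length-dependent inhibition of assembly sets a characteristic cable length.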
NASA Astrophysics Data System (ADS)
Chun, E. J.; Cvetič, G.; Dev, P. S. B.; Drewes, M.; Fong, C. S.; Garbrecht, B.; Hambye, T.; Harz, J.; Hernández, P.; Kim, C. S.; Molinaro, E.; Nardi, E.; Racker, J.; Rius, N.; Zamora-Saa, J.
2018-02-01
The focus of this paper lies on the possible experimental tests of leptogenesis scenarios. We consider both leptogenesis generated from oscillations, as well as leptogenesis from out-of-equilibrium decays. As the Akhmedov-Rubakov-Smirnov (ARS) mechanism allows for heavy neutrinos in the GeV range, this opens up a plethora of possible experimental tests, e.g. at neutrino oscillation experiments, neutrinoless double beta decay, and direct searches for neutral heavy leptons at future facilities. In contrast, testing leptogenesis from out-of-equilibrium decays is a quite difficult task. We comment on the necessary conditions for having successful leptogenesis at the TeV-scale. We further discuss possible realizations and their model specific testability in extended seesaw models, models with extended gauge sectors, and supersymmetric leptogenesis. Not being able to test high-scale leptogenesis directly, we present a way to falsify such scenarios by focusing on their washout processes. This is discussed specifically for the left-right symmetric model and the observation of a heavy WR, as well as model independently when measuring ΔL = 2 washout processes at the LHC or neutrinoless double beta decay.
A spiking neural network model of 3D perception for event-based neuromorphic stereo vision systems
Osswald, Marc; Ieng, Sio-Hoi; Benosman, Ryad; Indiveri, Giacomo
2017-01-01
Stereo vision is an important feature that enables machine vision systems to perceive their environment in 3D. While machine vision has spawned a variety of software algorithms to solve the stereo-correspondence problem, their implementation and integration in small, fast, and efficient hardware vision systems remains a difficult challenge. Recent advances made in neuromorphic engineering offer a possible solution to this problem, with the use of a new class of event-based vision sensors and neural processing devices inspired by the organizing principles of the brain. Here we propose a radically novel model that solves the stereo-correspondence problem with a spiking neural network that can be directly implemented with massively parallel, compact, low-latency and low-power neuromorphic engineering devices. We validate the model with experimental results, highlighting features that are in agreement with both computational neuroscience stereo vision theories and experimental findings. We demonstrate its features with a prototype neuromorphic hardware system and provide testable predictions on the role of spike-based representations and temporal dynamics in biological stereo vision processing systems. PMID:28079187
Stereoacuity of preschool children with and without vision disorders.
Ciner, Elise B; Ying, Gui-Shuang; Kulp, Marjean Taylor; Maguire, Maureen G; Quinn, Graham E; Orel-Bixler, Deborah; Cyert, Lynn A; Moore, Bruce; Huang, Jiayan
2014-03-01
To evaluate associations between stereoacuity and presence, type, and severity of vision disorders in Head Start preschool children and determine testability and levels of stereoacuity by age in children without vision disorders. Stereoacuity of children aged 3 to 5 years (n = 2898) participating in the Vision in Preschoolers (VIP) Study was evaluated using the Stereo Smile II test during a comprehensive vision examination. This test uses a two-alternative forced-choice paradigm with four stereoacuity levels (480 to 60 seconds of arc). Children were classified by the presence (n = 871) or absence (n = 2027) of VIP Study-targeted vision disorders (amblyopia, strabismus, significant refractive error, or unexplained reduced visual acuity), including type and severity. Median stereoacuity between groups and among severity levels of vision disorders was compared using Wilcoxon rank sum and Kruskal-Wallis tests. Testability and stereoacuity levels were determined for children without VIP Study-targeted disorders overall and by age. Children with VIP Study-targeted vision disorders had significantly worse median stereoacuity than that of children without vision disorders (120 vs. 60 seconds of arc, p < 0.001). Children with the most severe vision disorders had worse stereoacuity than that of children with milder disorders (median 480 vs. 120 seconds of arc, p < 0.001). Among children without vision disorders, testability was 99.6% overall, increasing with age to 100% for 5-year-olds (p = 0.002). Most of the children without vision disorders (88%) had stereoacuity at the two best disparities (60 or 120 seconds of arc); the percentage increasing with age (82% for 3-, 89% for 4-, and 92% for 5-year-olds; p < 0.001). The presence of any VIP Study-targeted vision disorder was associated with significantly worse stereoacuity in preschool children. Severe vision disorders were more likely associated with poorer stereopsis than milder or no vision disorders. Testability was excellent at all ages. These results support the validity of the Stereo Smile II for assessing random-dot stereoacuity in preschool children.
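The group comparisons described above correspond to Wilcoxon rank-sum (Mann-Whitney) tests; the sketch below shows how such a comparison can be run in Python on illustrative stereoacuity values, which are not VIP Study data.

    import numpy as np
    from scipy.stats import mannwhitneyu

    # Illustrative stereoacuity values in seconds of arc (lower is better); not VIP Study data.
    rng = np.random.default_rng(5)
    levels = np.array([60, 120, 240, 480])
    no_disorder = rng.choice(levels, size=200, p=[0.70, 0.18, 0.08, 0.04])
    disorder = rng.choice(levels, size=90, p=[0.30, 0.30, 0.20, 0.20])

    print(f"medians: disorder {np.median(disorder):.0f}, none {np.median(no_disorder):.0f} sec of arc")
    stat, p = mannwhitneyu(disorder, no_disorder, alternative='greater')   # Wilcoxon rank-sum test
    print(f"Mann-Whitney U = {stat:.0f}, one-sided p = {p:.3g}")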
DeCamp, Matthew; Dredze, Mark; Chisolm, Margaret S; Berger, Zackary D
2014-01-01
Background: Twitter is home to many health professionals who send messages about a variety of health-related topics. Amid concerns about physicians posting inappropriate content online, more in-depth knowledge about these messages is needed to understand health professionals’ behavior on Twitter. Objective: Our goal was to characterize the content of Twitter messages, specifically focusing on health professionals and their tweets relating to health. Methods: We performed an in-depth content analysis of 700 tweets. Qualitative content analysis was conducted on tweets by health users on Twitter. The primary objective was to describe the general type of content (ie, health-related versus non-health related) on Twitter authored by health professionals and further to describe health-related tweets on the basis of the type of statement made. Specific attention was given to whether a tweet was personal (as opposed to professional) or made a claim that users would expect to be supported by some level of medical evidence (ie, a “testable” claim). A secondary objective was to compare content types among different users, including patients, physicians, nurses, health care organizations, and others. Results: Health-related users are posting a wide range of content on Twitter. Among health-related tweets, 53.2% (184/346) contained a testable claim. Of health-related tweets by providers, 17.6% (61/346) were personal in nature; 61% (59/96) made testable statements. While organizations and businesses use Twitter to promote their services and products, patient advocates are using this tool to share their personal experiences with health. Conclusions: Twitter users in health-related fields tweet about both testable claims and personal experiences. Future work should assess the relationship between testable tweets and the actual level of evidence supporting them, including how Twitter users—especially patients—interpret the content of tweets posted by health providers. PMID:25591063
Darwinian hydrology: can the methodology Charles Darwin pioneered help hydrologic science?
NASA Astrophysics Data System (ADS)
Harman, C.; Troch, P. A.
2013-05-01
There have been repeated calls for a Darwinian approach to hydrologic science or for a synthesis of Darwinian and Newtonian approaches, to deepen understanding of the hydrologic system in the larger landscape context, and so develop a better basis for predictions now and in an uncertain future. But what exactly makes a Darwinian approach to hydrology "Darwinian"? While there have now been a number of discussions of Darwinian approaches, many referencing Harte (2002), the term is potentially a source of confusion while its connections to Darwin remain allusive rather than explicit. Here we discuss the methods that Charles Darwin pioneered to understand a variety of complex systems in terms of their historical processes of change. We suggest that the Darwinian approach to hydrology follows his lead by focusing attention on the patterns of variation in populations, seeking hypotheses that explain these patterns in terms of the mechanisms and conditions that determine their historical development, using deduction and modeling to derive consequent hypotheses that follow from a proposed explanation, and critically testing these hypotheses against new observations. It is not sufficient to catalogue the patterns or predict them statistically. Nor is it sufficient for the explanations to amount to a "just-so" story not subject to critical analysis. Darwin's theories linked present-day variation to mechanisms that operated over history, and could be independently tested and falsified by comparing new observations to the predictions of corollary hypotheses they generated. With a Darwinian framework in mind it is easy to see that a great deal of hydrologic research has already been done that contributes to a Darwinian hydrology - whether deliberately or not. The various heuristic methods that Darwin used to develop explanatory theories - extrapolating mechanisms, space for time substitution, and looking for signatures of history - have direct application in hydrologic science. Some are already in use, while others are not and could be used to develop new insights. Darwin sought explanatory theories that intelligibly connected disparate facts, that were testable and falsifiable, and that had fertile implications for further research. While a synthesis of the Darwinian and Newtonian approaches remains a goal, the Darwinian approach to hydrologic science has significant value of its own. The Darwinian hydrology that has been conducted already has not been coordinated or linked into a general body of theory and knowledge, but the time is coming when this will be possible.
The diffusion decision model: theory and data for two-choice decision tasks.
Ratcliff, Roger; McKoon, Gail
2008-04-01
The diffusion decision model allows detailed explanations of behavior in two-choice discrimination tasks. In this article, the model is reviewed to show how it translates behavioral data (accuracy, mean response times, and response time distributions) into components of cognitive processing. Three experiments are used to illustrate experimental manipulations of three components: stimulus difficulty affects the quality of information on which a decision is based; instructions emphasizing either speed or accuracy affect the criterial amounts of information that a subject requires before initiating a response; and the relative proportions of the two stimuli affect biases in drift rate and starting point. The experiments also illustrate the strong constraints that ensure the model is empirically testable and potentially falsifiable. The broad range of applications of the model is also reviewed, including research in the domains of aging and neurophysiology.
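A minimal simulation of the basic two-boundary diffusion process (drift rate v, boundary separation a, relative starting point z, within-trial noise s) is sketched below; non-decision time and across-trial variability are omitted, and all parameter values are illustrative only.

    import numpy as np

    def simulate_ddm(n_trials=2000, v=0.25, a=1.0, z=0.5, s=1.0, dt=0.001, max_t=5.0, rng=None):
        """Simulate choices and decision times from a basic two-boundary diffusion model.

        v: drift rate; a: boundary separation; z: starting point as a fraction of a;
        s: within-trial noise. Returns choices (1 = upper boundary) and decision times.
        """
        rng = rng or np.random.default_rng(6)
        x = np.full(n_trials, z * a)           # accumulated evidence per trial
        rt = np.full(n_trials, np.nan)
        choice = np.full(n_trials, -1)
        active = np.ones(n_trials, dtype=bool)
        for step in range(1, int(max_t / dt) + 1):
            x[active] += v * dt + s * np.sqrt(dt) * rng.standard_normal(active.sum())
            hit_up = active & (x >= a)
            hit_lo = active & (x <= 0.0)
            finished = hit_up | hit_lo
            rt[finished] = step * dt
            choice[hit_up], choice[hit_lo] = 1, 0
            active &= ~finished
            if not active.any():
                break
        return choice, rt

    choice, rt = simulate_ddm()
    done = choice >= 0
    print(f"P(upper boundary) = {choice[done].mean():.3f}, mean decision time = {rt[done].mean():.3f} s")

Raising v mimics easier stimuli (faster, more accurate responses), widening a mimics accuracy instructions, and shifting z mimics the starting-point bias induced by unequal stimulus proportions.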
The Law of Self-Acting Machines and Irreversible Processes with Reversible Replicas
NASA Astrophysics Data System (ADS)
Valev, Pentcho
2002-11-01
Clausius and Kelvin saved Carnot theorem and developed the second law by assuming that Carnot machines can work in the absence of an operator and that all the irreversible processes have reversible replicas. The former assumption restored Carnot theorem as an experience of mankind whereas the latter generated "the law of ever increasing entropy". Both assumptions are wrong so it makes sense to return to Carnot theorem (or some equivalent) and test it experimentally. Two testable paradigms - the system performing two types of reversible work and the system in dynamical equilibrium - suggest that perpetuum mobile of the second kind in the presence of an operator is possible. The deviation from the second law prediction, expressed as difference between partial derivatives in a Maxwell relation, measures the degree of structural-functional evolution for the respective system.
Hayes, Mark A.; Cryan, Paul M.; Wunder, Michael B.
2015-01-01
Understanding seasonal distribution and movement patterns of animals that migrate long distances is an essential part of monitoring and conserving their populations. Compared to migratory birds and other more conspicuous migrants, we know very little about the movement patterns of many migratory bats. Hoary bats (Lasiurus cinereus), a cryptic, wide-ranging, long-distance migrant, comprise a substantial proportion of the tens to hundreds of thousands of bat fatalities estimated to occur each year at wind turbines in North America. We created seasonally-dynamic species distribution models (SDMs) from 2,753 museum occurrence records collected over five decades in North America to better understand the seasonal geographic distributions of hoary bats. We used 5 SDM approaches: logistic regression, multivariate adaptive regression splines, boosted regression trees, random forest, and maximum entropy and consolidated outputs to generate ensemble maps. These maps represent the first formal hypotheses for sex- and season-specific hoary bat distributions. Our results suggest that North American hoary bats winter in regions with relatively long growing seasons where temperatures are moderated by proximity to oceans, and then move to the continental interior for the summer. SDMs suggested that hoary bats are most broadly distributed in autumn—the season when they are most susceptible to mortality from wind turbines; this season contains the greatest overlap between potentially suitable habitat and wind energy facilities. Comparing wind-turbine fatality data to model outputs could test many predictions, such as ‘risk from turbines is highest in habitats between hoary bat summering and wintering grounds’. Although future field studies are needed to validate the SDMs, this study generated well-justified and testable hypotheses of hoary bat migration patterns and seasonal distribution. PMID:26208098
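An unweighted ensemble of predicted occurrence probabilities, in the spirit of the consolidation step described above, can be sketched with scikit-learn classifiers standing in for three of the five SDM approaches (multivariate adaptive regression splines and maximum entropy have no direct scikit-learn equivalent); the data here are synthetic, not the museum occurrence records.

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    # Synthetic presence/background data standing in for occurrence records plus covariates
    X, y = make_classification(n_samples=1500, n_features=6, n_informative=4, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

    models = {
        "logistic": LogisticRegression(max_iter=1000),
        "boosted_trees": GradientBoostingClassifier(random_state=0),
        "random_forest": RandomForestClassifier(n_estimators=300, random_state=0),
    }
    probs = []
    for name, model in models.items():
        model.fit(X_train, y_train)
        p = model.predict_proba(X_test)[:, 1]
        probs.append(p)
        print(f"{name}: mean predicted suitability = {p.mean():.3f}")

    ensemble = np.mean(probs, axis=0)          # unweighted ensemble value per location
    print(f"ensemble: mean predicted suitability = {ensemble.mean():.3f}")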
Using biological markets principles to examine patterns of grooming exchange in Macaca thibetana.
Balasubramaniam, K N; Berman, C M; Ogawa, H; Li, J
2011-12-01
Biological markets principles offer testable hypotheses to explain variation in grooming exchange patterns among nonhuman primates. They predict that when within-group contest competition (WGC) is high and dominance hierarchies steep, grooming interchange with other "commodity" behaviors (such as agonistic support) should prevail. In contrast, when WGC is low and gradients shallow, market theory predicts that grooming reciprocity should prevail. We tested these predictions in a wild, provisioned Tibetan macaque (Macaca thibetana) group across six time periods during which the group had been subjected to varying degrees of range restriction. Data on female-female aggression, grooming, and support were collected using all-occurrences and focal animal sampling techniques, and analyzed using ANCOVA methods and correlation analyses. We found that hierarchical steepness varied significantly across periods, but did not correlate with two indirect indicators of WGC (group size and range restriction) in predicted directions. Contrary to expectations, we found a negative correlation between steepness and group size, perhaps because the responses of group members to external risks (i.e. prolonged and unavoidable exposure to humans) may have overshadowed the effects of WGC. As predicted, grooming reciprocity was significant in each period and negatively correlated with steepness, even after we controlled group size, kinship, rank differences, and proximity. In contrast, there was no evidence for grooming interchange with agonistic support or for a positive relationship between interchange and steepness. We hypothesize that stressful conditions and/or the presence of stable hierarchies during each period may have led to a greater market demand for grooming than support. We suggest that future studies testing these predictions consider more direct measures of WGC and commodities in addition to support, such as feeding tolerance and access to infants. © 2011 Wiley Periodicals, Inc.
Whiting, James R; Magalhaes, Isabel S; Singkam, Abdul R; Robertson, Shaun; D'Agostino, Daniele; Bradley, Janette E; MacColl, Andrew D C
2018-06-20
Understanding how wild immune variation covaries with other traits can reveal how costs and trade-offs shape immune evolution in the wild. Divergent life history strategies may increase or alleviate immune costs, helping shape immune variation in a consistent, testable way. Contrasting hypotheses suggest that shorter life histories may alleviate costs by offsetting them against increased mortality; or increase the effect of costs if immune responses are traded off against development or reproduction. We investigated the evolutionary relationship between life history and immune responses within an island radiation of three-spined stickleback, with discrete populations of varying life histories and parasitism. We sampled two short-lived, two long-lived and an anadromous population using qPCR to quantify current immune profile and RAD-seq data to study the distribution of immune variants within our assay genes and across the genome. Short-lived populations exhibited significantly increased expression of all assay genes, which was accompanied by a strong association with population-level variation in local alleles and divergence in a gene that may be involved in complement pathways. In addition, divergence around the eda gene in anadromous fish is likely associated with increased inflammation. A wider analysis of 15 populations across the island revealed that immune genes across the genome show evidence of having diverged alongside life history strategies. Parasitism and reproductive investment were also important sources of variation for expression, highlighting the caution required when assaying immune responses in the wild. These results provide strong, gene-based support for current hypotheses linking life history and immune variation across multiple populations of a vertebrate model. This article is protected by copyright. All rights reserved.
NASA Astrophysics Data System (ADS)
Owen, J. J.; Dietrich, W. E.; Nishiizumi, K.; Bellugi, D.; Amundson, R.
2008-12-01
Modeling the development of hillslopes using mass balance equations has generated many testable hypotheses related to morphology, process rates, and soil properties; however, it is only relatively recently that techniques for constraining these models (such as cosmogenic radionuclides) have become commonplace. As such, many hypotheses related to the effects of boundary conditions or climate on process rates and soil properties have been left untested. We selected pairs of hillslopes along a precipitation gradient in northern Chile (24°-30° S) which were either bounded by actively eroding (bedrock-bedded) channels or by stable or aggradational landforms (pediments, colluvial aprons, valley bottoms). For each hillslope we measured soil properties, atmospheric deposition rates, and bedrock denudation rates. We observe significant changes in soil properties with climate: there is a shift from thick, weathered soils in the semiarid south, to the near absence of soil in the arid middle, to salt-rich soils in the hyperarid north. Coincident with these are dramatic changes in the types and rates of processes acting on the soils. We found relatively quick, biotically-driven soil formation and transport in the south, and very slow, salt-driven processes in the north. Additionally, we observe systematic differences between hillslopes with different boundary conditions within the same climate zone, such as thicker soils, gentler slopes, and slower erosion rates on hillslopes with a non-eroding boundary versus an eroding boundary. These support general predictions based on hillslope soil mass balance equations and geomorphic transport laws. Using parameters derived from our field data, we attempt to use a mass balance model of hillslope development to explore the effect of changing boundary conditions and/or shifting climate.
MetNet: Software to Build and Model the Biogenetic Lattice of Arabidopsis
Wurtele, Eve Syrkin; Li, Jie; Diao, Lixia; ...
2003-01-01
MetNet (http://www.botany.iastate.edu/∼mash/metnetex/metabolicnetex.html) is publicly available software in development for analysis of genome-wide RNA, protein and metabolite profiling data. The software is designed to enable the biologist to visualize, statistically analyse and model a metabolic and regulatory network map of Arabidopsis, combined with gene expression profiling data. It contains a JAVA interface to an interactions database (MetNetDB) containing information on regulatory and metabolic interactions derived from a combination of web databases (TAIR, KEGG, BRENDA) and input from biologists in their area of expertise. FCModeler captures input from MetNetDB in a graphical form. Sub-networks can be identified and interpreted using simple fuzzy cognitive maps. FCModeler is intended to develop and evaluate hypotheses, and provide a modelling framework for assessing the large amounts of data captured by high-throughput gene expression experiments. FCModeler and MetNetDB are currently being extended to three-dimensional virtual reality display. The MetNet map, together with gene expression data, can be viewed using multivariate graphics tools in GGobi linked with the data analytic tools in R. Users can highlight different parts of the metabolic network and see the relevant expression data highlighted in other data plots. Multi-dimensional expression data can be rotated through different dimensions. Statistical analysis can be computed alongside the visual. MetNet is designed to provide a framework for the formulation of testable hypotheses regarding the function of specific genes, and in the long term provide the basis for identification of metabolic and regulatory networks that control plant composition and development.
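Sub-network interpretation with fuzzy cognitive maps, as performed by FCModeler, can be illustrated generically: node activations are iterated through a signed weight matrix and squashed to [0, 1]. The node names and weights below are hypothetical and are not drawn from MetNetDB.

    import numpy as np

    def run_fcm(weights, initial_state, n_iters=25, lam=1.0):
        """Iterate a fuzzy cognitive map: state <- sigmoid(W @ state), elementwise.

        weights[i, j] is the signed influence of node j on node i, in [-1, 1].
        """
        state = np.asarray(initial_state, dtype=float)
        for _ in range(n_iters):
            state = 1.0 / (1.0 + np.exp(-lam * (weights @ state)))
        return state

    # Hypothetical 4-node map: gene -> enzyme -> metabolite, with feedback inhibition
    nodes = ["gene", "enzyme", "metabolite", "inhibitor"]
    W = np.array([
        [0.0, 0.0, 0.0, -0.8],   # inhibitor represses the gene
        [0.9, 0.0, 0.0,  0.0],   # gene activates the enzyme
        [0.0, 0.7, 0.0,  0.0],   # enzyme produces the metabolite
        [0.0, 0.0, 0.6,  0.0],   # metabolite induces the inhibitor
    ])
    final = run_fcm(W, initial_state=[1.0, 0.0, 0.0, 0.0])
    print(dict(zip(nodes, np.round(final, 2))))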
Murphy, Bridget F; Thompson, Michael B
2011-07-01
Squamate reptiles (lizards and snakes) offer a unique model system for testing hypotheses about the evolutionary transition from oviparity (egg-laying) to viviparity (live-bearing) in amniote vertebrates. The evolution of squamate viviparity has occurred remarkably frequently (>108 times) and has resulted in major changes in reproductive physiology. Such frequent changes in reproductive strategy pose two questions: (1) what are the molecular mechanisms responsible for the evolution of squamate viviparity? (2) Are these molecular mechanisms the same for separate origins of viviparity? Molecular approaches, such as RT-PCR, in situ hybridisation, Western blotting and immunofluorescence, have been invaluable for identifying genes and proteins that are involved in squamate placental development, materno-foetal immunotolerance, placental transport, placental angiogenesis, hormone synthesis and hormone receptor expression. However, the candidate-gene or -protein approach that has been used until now does not allow for de novo gene/protein discovery; results to date suggest that the reproductive physiologies of mammals and squamate reptiles are very similar, but this conclusion may simply be due to a limited capacity to study the subset of genes and proteins that are unique to reptiles. Progress has also been slowed by the lack of appropriate molecular and genomic resources for squamate reptiles. The advent of next-generation sequencing provides a relatively inexpensive way to conduct rapid high-throughput sequencing of genomes and transcriptomes. We discuss the potential use of next-generation sequencing technologies to analyse differences in gene expression between oviparous and viviparous squamates, provide important sequence information for reptiles, and generate testable hypotheses for the evolution of viviparity.
Pro-sexual and androgen enhancing effects of Tribulus terrestris L.: Fact or Fiction.
Neychev, Vladimir; Mitev, Vanyo
2016-02-17
Historically, aphrodisiacs have had a reputation for making sex more achievable and satisfying. It has been long believed that Tribulus terrestris L. (TT), an annual plant of the family Zygophyllaceae, possesses aphrodisiac properties purportedly attributed to its ability to influence levels or mimic function of sex hormones. Due to these appealing beliefs, the popularity of medicinal products from TT is expanding at a remarkable pace among consumers who are attempting to enhance their sexual health. However, reliable scientific evidence supporting these purported bioactivities is scant and far from conclusive. To critically analyze and update the evidence supporting a role for TT as an aphrodisiac and to reappraise the widely believed view of TT as an androgen enhancing botanical supplement. An extensive review of the literature was carried out based on systematic search of major scientific databases (PubMed, Elsevier, Springer Link, Google Scholar, Medline Plus, and Web of Science) for studies of phytochemical, pharmacological and traditional uses of TT published between 1968 and 2015. In addition, the reference lists of the available articles were reviewed, and relevant studies, including material in journals which are not indexed internationally, were reviewed. Analysis of phytochemical and pharmacological studies in humans and animals revealed an important role for TT in treating erectile dysfunction and sexual desire problems; however, empirical evidence to support the hypothesis that these desirable effects are due to androgen enhancing properties of TT is, at best, inconclusive, and analysis of empirical evidence from a comprehensive review of available literature proved this hypothesis wrong. While the mechanisms underlying TT aphrodisiac activity remain largely unknown, there is emerging compelling evidence from experimental studies in animals for possible endothelium and nitric oxide-dependent mechanisms underlying TT aphrodisiac and pro-erectile activities. It is becoming increasingly clear that the deep-seated traditional view of TT bioactivity focused exclusively on its androgen enhancing properties is outdated and incapable of accommodating the emerging evidence from recent clinical and experimental studies pointing toward new and, perhaps, more plausible modes of action. Novel paradigms guiding the development of new testable hypotheses for TT aphrodisiac properties are needed to stimulate further investigations into potential biological mechanisms in which many apparently conflicting observations can be reconciled. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Zou, Cunlu; Ladroue, Christophe; Guo, Shuixia; Feng, Jianfeng
2010-06-21
Reverse-engineering approaches such as Bayesian network inference, ordinary differential equations (ODEs) and information theory are widely applied to deriving causal relationships among different elements such as genes, proteins, metabolites, neurons, brain areas and so on, based upon multi-dimensional spatial and temporal data. There are several well-established reverse-engineering approaches to explore causal relationships in a dynamic network, such as ordinary differential equations (ODE), Bayesian networks, information theory and Granger Causality. Here we focused on Granger causality both in the time and frequency domain and in local and global networks, and applied our approach to experimental data (genes and proteins). For a small gene network, Granger causality outperformed all the other three approaches mentioned above. A global protein network of 812 proteins was reconstructed, using a novel approach. The obtained results fitted well with known experimental findings and predicted many experimentally testable results. In addition to interactions in the time domain, interactions in the frequency domain were also recovered. The results on the proteomic data and gene data confirm that Granger causality is a simple and accurate approach to recover the network structure. Our approach is general and can be easily applied to other types of temporal data.
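A minimal pairwise, time-domain Granger test (not the authors' full local/global or frequency-domain machinery) compares an autoregression of y on its own lags with one that also includes lags of x, using an F-test on the reduction in residual variance; the lag order and the toy data are illustrative.

    import numpy as np
    from scipy import stats

    def granger_f_test(x, y, lags=2):
        """F-test of whether lags of x improve prediction of y beyond y's own lags."""
        x, y = np.asarray(x, float), np.asarray(y, float)
        n = len(y) - lags
        Y = y[lags:]
        own = np.column_stack([y[lags - k:-k] for k in range(1, lags + 1)])
        both = np.column_stack([own] + [x[lags - k:-k] for k in range(1, lags + 1)])

        def rss(design):
            A = np.hstack([np.ones((n, 1)), design])        # intercept plus lagged regressors
            beta = np.linalg.lstsq(A, Y, rcond=None)[0]
            return np.sum((Y - A @ beta) ** 2)

        rss_restricted, rss_full = rss(own), rss(both)
        df1, df2 = lags, n - 2 * lags - 1
        F = ((rss_restricted - rss_full) / df1) / (rss_full / df2)
        return F, stats.f.sf(F, df1, df2)

    # Toy series in which x drives y with a one-step delay
    rng = np.random.default_rng(7)
    x = rng.standard_normal(500)
    y = 0.8 * np.roll(x, 1) + 0.5 * rng.standard_normal(500)
    print("x -> y: F = %.1f, p = %.3g" % granger_f_test(x, y))
    print("y -> x: F = %.1f, p = %.3g" % granger_f_test(y, x))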
Gene Function Hypotheses for the Campylobacter jejuni Glycome Generated by a Logic-Based Approach
Sternberg, Michael J.E.; Tamaddoni-Nezhad, Alireza; Lesk, Victor I.; Kay, Emily; Hitchen, Paul G.; Cootes, Adrian; van Alphen, Lieke B.; Lamoureux, Marc P.; Jarrell, Harold C.; Rawlings, Christopher J.; Soo, Evelyn C.; Szymanski, Christine M.; Dell, Anne; Wren, Brendan W.; Muggleton, Stephen H.
2013-01-01
Increasingly, experimental data on biological systems are obtained from several sources and computational approaches are required to integrate this information and derive models for the function of the system. Here, we demonstrate the power of a logic-based machine learning approach to propose hypotheses for gene function integrating information from two diverse experimental approaches. Specifically, we use inductive logic programming that automatically proposes hypotheses explaining the empirical data with respect to logically encoded background knowledge. We study the capsular polysaccharide biosynthetic pathway of the major human gastrointestinal pathogen Campylobacter jejuni. We consider several key steps in the formation of capsular polysaccharide consisting of 15 genes of which 8 have assigned function, and we explore the extent to which functions can be hypothesised for the remaining 7. Two sources of experimental data provide the information for learning—the results of knockout experiments on the genes involved in capsule formation and the absence/presence of capsule genes in a multitude of strains of different serotypes. The machine learning uses the pathway structure as background knowledge. We propose assignments of specific genes to five previously unassigned reaction steps. For four of these steps, there was an unambiguous optimal assignment of gene to reaction, and to the fifth, there were three candidate genes. Several of these assignments were consistent with additional experimental results. We therefore show that the logic-based methodology provides a robust strategy to integrate results from different experimental approaches and propose hypotheses for the behaviour of a biological system. PMID:23103756
HyQue: evaluating hypotheses using Semantic Web technologies.
Callahan, Alison; Dumontier, Michel; Shah, Nigam H
2011-05-17
Key to the success of e-Science is the ability to computationally evaluate expert-composed hypotheses for validity against experimental data. Researchers face the challenge of collecting, evaluating and integrating large amounts of diverse information to compose and evaluate a hypothesis. Confronted with rapidly accumulating data, researchers currently do not have the software tools to undertake the required information integration tasks. We present HyQue, a Semantic Web tool for querying scientific knowledge bases with the purpose of evaluating user submitted hypotheses. HyQue features a knowledge model to accommodate diverse hypotheses structured as events and represented using Semantic Web languages (RDF/OWL). Hypothesis validity is evaluated against experimental and literature-sourced evidence through a combination of SPARQL queries and evaluation rules. Inference over OWL ontologies (for type specifications, subclass assertions and parthood relations) and retrieval of facts stored as Bio2RDF linked data provide support for a given hypothesis. We evaluate hypotheses of varying levels of detail about the genetic network controlling galactose metabolism in Saccharomyces cerevisiae to demonstrate the feasibility of deploying such semantic computing tools over a growing body of structured knowledge in Bio2RDF. HyQue is a query-based hypothesis evaluation system that can currently evaluate hypotheses about the galactose metabolism in S. cerevisiae. Hypotheses as well as the supporting or refuting data are represented in RDF and directly linked to one another allowing scientists to browse from data to hypothesis and vice versa. HyQue hypotheses and data are available at http://semanticscience.org/projects/hyque.
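As a rough illustration of the query-driven evidence retrieval described here, the following sketch runs a SPARQL query over a small RDF graph with rdflib. The vocabulary (ex:hasEvent, ex:supports) and the file name are invented for the example and are not HyQue's actual schema.

```python
from rdflib import Graph

# Hypothetical RDF file linking a hypothesis, its events, and supporting evidence.
g = Graph()
g.parse("hypothesis_and_evidence.ttl", format="turtle")

query = """
PREFIX ex: <http://example.org/>
SELECT ?event ?evidence
WHERE {
    ?hypothesis ex:hasEvent ?event .
    ?evidence   ex:supports ?event .
}
"""

for row in g.query(query):
    print(f"event {row.event} is supported by {row.evidence}")
```

A real deployment would combine such queries with evaluation rules and OWL inference, as the abstract describes.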
On testing VLSI chips for the big Viterbi decoder
NASA Technical Reports Server (NTRS)
Hsu, I. S.
1989-01-01
A general technique that can be used in testing very large scale integrated (VLSI) chips for the Big Viterbi Decoder (BVD) system is described. The test technique is divided into functional testing and fault-coverage testing. The purpose of functional testing is to verify that the design works functionally. Functional test vectors are converted from outputs of software simulations which simulate the BVD functionally. Fault-coverage testing is used to detect and, in some cases, to locate faulty components caused by bad fabrication. This type of testing is useful in screening out bad chips. Finally, design for testability, which is included in the BVD VLSI chip design, is described in considerable detail. Both the observability and controllability of a VLSI chip are greatly enhanced by including design-for-testability features.
Model-Based Testability Assessment and Directed Troubleshooting of Shuttle Wiring Systems
NASA Technical Reports Server (NTRS)
Deb, Somnath; Domagala, Chuck; Shrestha, Roshan; Malepati, Venkatesh; Cavanaugh, Kevin; Patterson-Hine, Ann; Sanderfer, Dwight; Cockrell, Jim; Norvig, Peter (Technical Monitor)
2000-01-01
We have recently completed a pilot study on the Space Shuttle wiring system commissioned by the Wiring Integrity Research (WIRe) team at NASA Ames Research Center. As the space shuttle ages, it is experiencing wiring degradation problems including arcing, chafing, insulation breakdown and broken conductors. A systematic and comprehensive test process is required to thoroughly test and quality assure (QA) the wiring systems. The NASA WIRe team recognized the value of a formal model-based analysis for risk assessment and fault-coverage analysis. However, wiring systems are complex and involve over 50,000 wire segments. Therefore, NASA commissioned this pilot study with Qualtech Systems, Inc. (QSI) to explore means of automatically extracting high-fidelity multi-signal models from the wiring information database for use with QSI's Testability Engineering and Maintenance System (TEAMS) tool.
Higher-order Fourier analysis over finite fields and applications
NASA Astrophysics Data System (ADS)
Hatami, Pooya
Higher-order Fourier analysis is a powerful tool in the study of problems in additive and extremal combinatorics, for instance the study of arithmetic progressions in primes, where traditional Fourier analysis falls short. In recent years, higher-order Fourier analysis has found multiple applications in computer science in fields such as property testing and coding theory. In this thesis, we develop new tools within this theory with several new applications such as a characterization theorem in algebraic property testing. One of our main contributions is a strong near-equidistribution result for regular collections of polynomials. The densities of small linear structures in subsets of Abelian groups can be expressed as certain analytic averages involving linear forms. Higher-order Fourier analysis examines such averages by approximating the indicator function of a subset by a function of a bounded number of polynomials. Then, to approximate the average, it suffices to know the joint distribution of the polynomials applied to the linear forms. We prove a near-equidistribution theorem that describes these distributions for the group F_p^n when p is a fixed prime. This fundamental fact was previously known only under various extra assumptions about the linear forms or the field size. We use this near-equidistribution theorem to settle a conjecture of Gowers and Wolf on the true complexity of systems of linear forms. Our next application is towards a characterization of testable algebraic properties. We prove that every locally characterized affine-invariant property of functions f : F_p^n → R with n ∈ N is testable. In fact, we prove that any such property P is proximity-obliviously testable. More generally, we show that any affine-invariant property that is closed under subspace restrictions and has "bounded complexity" is testable. We also prove that any property that can be described as the property of decomposing into a known structure of low-degree polynomials is locally characterized and is, hence, testable. We discuss several notions of regularity which allow us to deduce algorithmic versions of various regularity lemmas for polynomials by Green and Tao and by Kaufman and Lovett. We show that our algorithmic regularity lemmas for polynomials imply algorithmic versions of several results relying on regularity, such as decoding Reed-Muller codes beyond the list decoding radius (for certain structured errors), and prescribed polynomial decompositions. Finally, motivated by the definition of Gowers norms, we investigate norms defined by different systems of linear forms. We give necessary conditions on the structure of systems of linear forms that define norms. We prove that such norms can be one of only two types, and assuming that |F_p| is sufficiently large, they essentially are equivalent to either a Gowers norm or L_p norms.
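For reference, the Gowers uniformity norms mentioned above have the following standard definition (quoted from the general literature rather than from this thesis): for a function f : F_p^n → C and an integer k ≥ 1,

```latex
\| f \|_{U^k}^{2^k}
  = \mathbb{E}_{x,\,h_1,\dots,h_k \in \mathbb{F}_p^n}
    \prod_{\omega \in \{0,1\}^k} \mathcal{C}^{|\omega|}
    f\!\left(x + \omega_1 h_1 + \dots + \omega_k h_k\right),
```

where C denotes complex conjugation and |ω| is the number of ones in ω.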
Bosdriesz, Evert; Magnúsdóttir, Stefanía; Bruggeman, Frank J; Teusink, Bas; Molenaar, Douwe
2015-06-01
Microorganisms rely on binding-protein assisted, active transport systems to scavenge for scarce nutrients. Several advantages of using binding proteins in such uptake systems have been proposed. However, a systematic, rigorous and quantitative analysis of the function of binding proteins is lacking. By combining knowledge of selection pressure and physiochemical constraints, we derive kinetic, thermodynamic, and stoichiometric properties of binding-protein dependent transport systems that enable a maximal import activity per amount of transporter. Under the hypothesis that this maximal specific activity of the transport complex is the selection objective, binding protein concentrations should exceed the concentration of both the scarce nutrient and the transporter. This increases the encounter rate of transporter with loaded binding protein at low substrate concentrations, thereby enhancing the affinity and specific uptake rate. These predictions are experimentally testable, and a number of observations confirm them. © 2015 FEBS.
Implementation of a quantum random number generator based on the optimal clustering of photocounts
NASA Astrophysics Data System (ADS)
Balygin, K. A.; Zaitsev, V. I.; Klimov, A. N.; Kulik, S. P.; Molotkov, S. N.
2017-10-01
To implement quantum random number generators, it is fundamentally important to have a mathematically provable and experimentally testable measurement process for the system from which the initial random sequence is generated; this ensures that the randomness is genuinely quantum in origin. A quantum random number generator has been implemented using the detection of quasi-single-photon radiation by a silicon photomultiplier (SiPM) matrix, which makes it possible to reliably reach Poisson statistics of photocounts. The choice of an optimal clustering of photocounts for the initial sequence of photodetection events, together with a method for extracting a random sequence of 0's and 1's that is polynomial in the sequence length, made it possible to reach an output rate of 64 Mbit/s for the provably random sequence.
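As a generic illustration of turning photodetection events into unbiased bits (this is the classic von Neumann debiasing step, not the polynomial-time extractor used in this work), the procedure can be sketched as follows; the threshold and the simulated counts are arbitrary placeholders.

```python
import random

def binarize(counts, threshold):
    """Map each photocount to a raw bit: 1 if the count exceeds the threshold, else 0."""
    return [1 if c > threshold else 0 for c in counts]

def von_neumann_extract(bits):
    """Debias raw bits: read non-overlapping pairs, emit 0 for '01', 1 for '10', drop '00'/'11'."""
    return [a for a, b in zip(bits[::2], bits[1::2]) if a != b]

# Toy photocount stream standing in for SiPM detection events.
counts = [random.randint(0, 5) for _ in range(1000)]
raw_bits = binarize(counts, threshold=2)
random_bits = von_neumann_extract(raw_bits)
```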
Woolley, Thomas E; Gaffney, Eamonn A; Goriely, Alain
2017-07-01
If the plasma membrane of a cell is able to delaminate locally from its actin cortex, a cellular bleb can be produced. Blebs are pressure-driven protrusions, which are noteworthy for their ability to produce cellular motion. Starting from a general continuum mechanics description, we restrict ourselves to considering cell and bleb shapes that maintain approximately spherical forms. From this assumption, we obtain a tractable algebraic system for bleb formation. By including cell-substrate adhesions, we can model blebbing cell motility. Further, by considering mechanically isolated blebbing events, which are randomly distributed over the cell, we can derive equations linking the macroscopic migration characteristics to the microscopic structural parameters of the cell. This multiscale modeling framework is then used to provide parameter estimates, which are in agreement with current experimental data. In summary, the construction of the mathematical model provides testable relationships between the bleb size and cell motility.
Unified TeV scale picture of baryogenesis and dark matter.
Babu, K S; Mohapatra, R N; Nasri, Salah
2007-04-20
We present a simple extension of the minimal supersymmetric standard model which provides a unified picture of cosmological baryon asymmetry and dark matter. Our model introduces a gauge singlet field N and a color triplet field X which couple to the right-handed quark fields. The out-of-equilibrium decay of the Majorana fermion N mediated by the exchange of the scalar field X generates adequate baryon asymmetry for M_N of approximately 100 GeV and M_X of approximately a TeV. The scalar partner of N (denoted N1) is naturally the lightest SUSY particle as it has no gauge interactions and plays the role of dark matter. The model is experimentally testable in (i) neutron-antineutron oscillations with a transition time estimated to be around 10^10 s, (ii) discovery of colored particles X at LHC with mass of order TeV, and (iii) direct dark matter detection with a predicted cross section in the observable range.
Brains studying brains: look before you think in vision
NASA Astrophysics Data System (ADS)
Zhaoping, Li
2016-06-01
Using our own brains to study our brains is extraordinary. For example, in vision this makes us naturally blind to our own blindness, since our impression of seeing our world clearly is consistent with our ignorance of what we do not see. Our brain employs its ‘conscious’ part to reason and make logical deductions using familiar rules and past experience. However, human vision employs many ‘subconscious’ brain parts that follow rules alien to our intuition. Our blindness to our unknown unknowns and our presumptive intuitions easily lead us astray in asking and formulating theoretical questions, as witnessed in many unexpected and counter-intuitive difficulties and failures encountered by generations of scientists. We should therefore pay a more than usual amount of attention and respect to experimental data when studying our brain. I show that this can be productive by reviewing two vision theories that have provided testable predictions and surprising insights.
Sequential pattern formation governed by signaling gradients
NASA Astrophysics Data System (ADS)
Jörg, David J.; Oates, Andrew C.; Jülicher, Frank
2016-10-01
Rhythmic and sequential segmentation of the embryonic body plan is a vital developmental patterning process in all vertebrate species. However, a theoretical framework capturing the emergence of dynamic patterns of gene expression from the interplay of cell oscillations with tissue elongation and shortening and with signaling gradients is still missing. Here we show that a set of coupled genetic oscillators in an elongating tissue that is regulated by diffusing and advected signaling molecules can account for segmentation as a self-organized patterning process. This system can form a finite number of segments and the dynamics of segmentation and the total number of segments formed depend strongly on kinetic parameters describing tissue elongation and signaling molecules. The model accounts for existing experimental perturbations to signaling gradients, and makes testable predictions about novel perturbations. The variety of different patterns formed in our model can account for the variability of segmentation between different animal species.
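A drastically reduced toy of this picture, included here only to make the mechanism concrete: a row of uncoupled phase oscillators freezes behind a moving front, leaving a static stripe pattern. The published model additionally includes oscillator coupling, tissue elongation, and advected signaling gradients; the parameters below are arbitrary.

```python
import numpy as np

n_cells, steps, dt = 200, 4000, 0.01
x = np.arange(n_cells)
theta = np.zeros(n_cells)        # oscillator phase in each cell
front_speed = 5.0                # arbitrary wavefront speed (cells per time unit)
omega = 2.0                      # arbitrary oscillation frequency (rad per time unit)

for step in range(steps):
    t = step * dt
    oscillating = x > front_speed * t   # cells ahead of the front keep cycling
    theta[oscillating] += omega * dt    # cells behind the front stay frozen

pattern = np.sin(theta)                 # frozen phases read out as a spatial stripe pattern
```

The stripe wavelength in this toy is set by the ratio of front speed to oscillation frequency, echoing the abstract's point that segment number depends strongly on the kinetic parameters.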
Hypercharged dark matter and direct detection as a probe of reheating.
Feldstein, Brian; Ibe, Masahiro; Yanagida, Tsutomu T
2014-03-14
The lack of new physics at the LHC so far weakens the argument for TeV scale thermal dark matter. On the other hand, heavier, nonthermal dark matter is generally difficult to test experimentally. Here we consider the interesting and generic case of hypercharged dark matter, which can allow for heavy dark matter masses without spoiling testability. Planned direct detection experiments will be able to see a signal for masses up to an incredible 10^10 GeV, and this can further serve to probe the reheating temperature up to about 10^9 GeV, as determined by the nonthermal dark matter relic abundance. The Z-mediated nature of the dark matter scattering may be determined in principle by comparing scattering rates on different detector nuclei, which in turn can reveal the dark matter mass. We will discuss the extent to which future experiments may be able to make such a determination.
Brains studying brains: look before you think in vision.
Zhaoping, Li
2016-05-11
Using our own brains to study our brains is extraordinary. For example, in vision this makes us naturally blind to our own blindness, since our impression of seeing our world clearly is consistent with our ignorance of what we do not see. Our brain employs its 'conscious' part to reason and make logical deductions using familiar rules and past experience. However, human vision employs many 'subconscious' brain parts that follow rules alien to our intuition. Our blindness to our unknown unknowns and our presumptive intuitions easily lead us astray in asking and formulating theoretical questions, as witnessed in many unexpected and counter-intuitive difficulties and failures encountered by generations of scientists. We should therefore pay a more than usual amount of attention and respect to experimental data when studying our brain. I show that this can be productive by reviewing two vision theories that have provided testable predictions and surprising insights.
Quantifying quantum coherence with quantum Fisher information.
Feng, X N; Wei, L F
2017-11-14
Quantum coherence is one of the oldest yet most important concepts in quantum mechanics, and it is now regarded as a necessary resource for quantum information processing and quantum metrology. However, the question of how to quantify quantum coherence has only recently received attention (see, e.g., Baumgratz et al., Phys. Rev. Lett. 113, 140401 (2014)). In this paper we verify that the well-known quantum Fisher information (QFI) can be used to quantify quantum coherence, as it satisfies monotonicity under the typical incoherent operations and convexity under mixing of quantum states. Unlike most purely axiomatic approaches, quantifying quantum coherence by the QFI is experimentally testable, as the bound of the QFI is practically measurable. The validity of the proposal is demonstrated with the typical phase-damping and depolarizing evolution processes of a generic single-qubit state, and also by comparison with other previously proposed quantification methods.
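For context, the quantum Fisher information invoked here has a standard spectral form (a textbook expression quoted for reference, not taken from this paper): for a state ρ = Σ_i λ_i |i⟩⟨i| and a generator A,

```latex
F_Q(\rho, A) \;=\; 2 \sum_{\substack{i,j \\ \lambda_i + \lambda_j > 0}}
\frac{(\lambda_i - \lambda_j)^2}{\lambda_i + \lambda_j}\,
\bigl|\langle i | A | j \rangle\bigr|^2 .
```

For a single qubit this reduces to a simple function of the Bloch vector, which is what makes the bound practically measurable.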
Linking short-term responses to ecologically-relevant outcomes
An opportunity to participate in collaborative, integrative laboratory, field, and modelling efforts to characterize molecular-to-organismal responses and to make quantitative, testable predictions of population-level outcomes
HyQue: evaluating hypotheses using Semantic Web technologies
2011-01-01
Background: Key to the success of e-Science is the ability to computationally evaluate expert-composed hypotheses for validity against experimental data. Researchers face the challenge of collecting, evaluating and integrating large amounts of diverse information to compose and evaluate a hypothesis. Confronted with rapidly accumulating data, researchers currently do not have the software tools to undertake the required information integration tasks. Results: We present HyQue, a Semantic Web tool for querying scientific knowledge bases with the purpose of evaluating user submitted hypotheses. HyQue features a knowledge model to accommodate diverse hypotheses structured as events and represented using Semantic Web languages (RDF/OWL). Hypothesis validity is evaluated against experimental and literature-sourced evidence through a combination of SPARQL queries and evaluation rules. Inference over OWL ontologies (for type specifications, subclass assertions and parthood relations) and retrieval of facts stored as Bio2RDF linked data provide support for a given hypothesis. We evaluate hypotheses of varying levels of detail about the genetic network controlling galactose metabolism in Saccharomyces cerevisiae to demonstrate the feasibility of deploying such semantic computing tools over a growing body of structured knowledge in Bio2RDF. Conclusions: HyQue is a query-based hypothesis evaluation system that can currently evaluate hypotheses about the galactose metabolism in S. cerevisiae. Hypotheses as well as the supporting or refuting data are represented in RDF and directly linked to one another allowing scientists to browse from data to hypothesis and vice versa. HyQue hypotheses and data are available at http://semanticscience.org/projects/hyque. PMID:21624158
Inferior olive mirrors joint dynamics to implement an inverse controller.
Alvarez-Icaza, Rodrigo; Boahen, Kwabena
2012-10-01
To produce smooth and coordinated motion, our nervous systems need to generate precisely timed muscle activation patterns that, due to axonal conduction delay, must be generated in a predictive and feedforward manner. Kawato proposed that the cerebellum accomplishes this by acting as an inverse controller that modulates descending motor commands to predictively drive the spinal cord such that the musculoskeletal dynamics are canceled out. This and other cerebellar theories do not, however, account for the rich biophysical properties expressed by the olivocerebellar complex's various cell types, making these theories difficult to verify experimentally. Here we propose that a multizonal microcomplex's (MZMC) inferior olivary neurons use their subthreshold oscillations to mirror a musculoskeletal joint's underdamped dynamics, thereby achieving inverse control. We used control theory to map a joint's inverse model onto an MZMC's biophysics, and we used biophysical modeling to confirm that inferior olivary neurons can express the dynamics required to mirror biomechanical joints. We then combined both techniques to predict how experimentally injecting current into the inferior olive would affect overall motor output performance. We found that this experimental manipulation unmasked a joint's natural dynamics, as observed by motor output ringing at the joint's natural frequency, with amplitude proportional to the amount of current. These results support the proposal that the cerebellum (in particular an MZMC) is an inverse controller; the results also provide a biophysical implementation for this controller and allow one to make an experimentally testable prediction.
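The inverse-control idea can be made concrete with a textbook second-order joint model (this is the generic computed-torque form, not the authors' full MZMC biophysics): if the joint obeys J·θ'' + b·θ' + k·θ = τ, an inverse controller generates the feedforward torque directly from the desired trajectory θ_d(t),

```latex
\tau(t) \;=\; J\,\ddot{\theta}_d(t) \;+\; b\,\dot{\theta}_d(t) \;+\; k\,\theta_d(t),
```

so that, with accurate parameters, the commanded motion cancels the joint's own underdamped dynamics.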
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gill, R.
Epidemiological studies have linked consumption of n-3 PUFAs with a variety of health benefits, particularly with respect to putative anti-inflammatory effects. Unfortunately, many of these results remain somewhat controversial because in most instances there has not been a linkage to specific molecular mechanisms. For instance, dietary exposure to low levels of mercury has been shown to be damaging to neural development, but concomitant ingestion of n-3 PUFAs, as occurs during consumption of fish, has been shown to counteract the detrimental effects. As the mechanisms mediating the neurotoxicity of environmental mercury are not fully delineated, it is difficult to conceptualize a testable molecular mechanism explaining how n-3 PUFAs negate its neurotoxic effects. However, environmental exposure to mercury also has been linked to increased autoimmunity. By way of a molecular understanding of this immuno-toxic association, disruption of CD95 signaling is well established as a triggering factor for autoimmunity, and we have previously shown that environmentally relevant in vitro and dietary exposures to mercury interfere with CD95 signaling. In particular we have shown that activation of caspase 8, as well as downstream activation of caspase 3, in response to CD95 agonist stimulation is depressed by mercury. More recently we have shown in vitro that the n-3 PUFA docosahexaenoic acid counteracts the negative effect of mercury on CD95 signaling by restoring caspase activity. We hypothesized that concomitant ingestion of n-3 PUFAs with mercury might be protective from the immuno-toxic effects of mercury, as it is with mercury's neuro-toxic effects, and in the case of immuno-toxicity this would be related to restoration of CD95 signal strength. We now show that dietary ingestion of n-3 PUFAs generally promotes CD95 signaling by upregulating caspase 8 activation. Apart from accounting for the ability of n-3 PUFAs to specifically counteract autoimmune sequelae of mercury exposure, this novel finding for the first time suggests a testable molecular mechanism explaining the overall anti-inflammatory properties of n-3 PUFAs. Highlights: • Dietary n-3 PUFAs counter Hg²⁺ immunotoxicity. • Hg²⁺ interference with SEB-mediated signal transduction is ameliorated by n-3 PUFA rich diets. • Dietary n-3 PUFAs augment SEB-mediated activation of caspase 8 in vivo.
Hartmann, Ernest
2010-01-01
There is a widespread consensus that emotion is important in dreams, deriving from both biological and psychological studies. However, the emphasis on examining emotions explicitly mentioned in dreams is misplaced. The dream is basically made of imagery. The focus of our group has been on relating the dream imagery to the dreamer's underlying emotion. What is most important is the underlying emotion: the emotion of the dreamer, not the emotion in the dream. This chapter discusses many studies relating the dream, especially the central image of the dream, to the dreamer's underlying emotion. Focusing on the underlying emotion leads to a coherent and testable view of the nature of dreaming. It also helps to clarify some important puzzling features of the literature on dreams, such as why the clinical literature is different in so many ways from the experimental literature, especially the laboratory-based experimental literature. Based on central image intensity and the associated underlying emotion, we can identify a hierarchy of dreams, from the highest-intensity, "big dreams," to the lowest-intensity dreams from laboratory awakenings. Copyright © 2010 Elsevier Inc. All rights reserved.
Measurement uncertainty relations: characterising optimal error bounds for qubits
NASA Astrophysics Data System (ADS)
Bullock, T.; Busch, P.
2018-07-01
In standard formulations of the uncertainty principle, two fundamental features are typically cast as impossibility statements: two noncommuting observables cannot in general both be sharply defined (for the same state), nor can they be measured jointly. The pioneers of quantum mechanics were acutely aware and puzzled by this fact, and it motivated Heisenberg to seek a mitigation, which he formulated in his seminal paper of 1927. He provided intuitive arguments to show that the values of, say, the position and momentum of a particle can at least be unsharply defined, and they can be measured together provided some approximation errors are allowed. Only now, nine decades later, a working theory of approximate joint measurements is taking shape, leading to rigorous and experimentally testable formulations of associated error tradeoff relations. Here we briefly review this new development, explaining the concepts and steps taken in the construction of optimal joint approximations of pairs of incompatible observables. As a case study, we deduce measurement uncertainty relations for qubit observables using two distinct error measures. We provide an operational interpretation of the error bounds and discuss some of the first experimental tests of such relations.
CrossCheck: an open-source web tool for high-throughput screen data analysis.
Najafov, Jamil; Najafov, Ayaz
2017-07-19
Modern high-throughput screening methods allow researchers to generate large datasets that potentially contain important biological information. However, oftentimes, picking relevant hits from such screens and generating testable hypotheses requires training in bioinformatics and the skills to efficiently perform database mining. There are currently no tools available to the general public that allow users to cross-reference their screen datasets with published screen datasets. To this end, we developed CrossCheck, an online platform for high-throughput screen data analysis. CrossCheck is a centralized database that allows effortless comparison of the user-entered list of gene symbols with 16,231 published datasets. These datasets include published data from genome-wide RNAi and CRISPR screens, interactome proteomics and phosphoproteomics screens, cancer mutation databases, low-throughput studies of major cell signaling mediators, such as kinases, E3 ubiquitin ligases and phosphatases, and gene ontological information. Moreover, CrossCheck includes a novel database of predicted protein kinase substrates, which was developed using proteome-wide consensus motif searches. CrossCheck dramatically simplifies high-throughput screen data analysis and enables researchers to dig deep into the published literature and streamline data-driven hypothesis generation. CrossCheck is freely accessible as a web-based application at http://proteinguru.com/crosscheck.
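One simple way to score the overlap between a user's hit list and a published dataset, in the spirit of such cross-referencing tools, is a hypergeometric enrichment test. The sketch below is only an illustration of that idea (CrossCheck's own scoring is not specified here), and the gene lists and dataset names are placeholders.

```python
from scipy.stats import hypergeom

def overlap_enrichment(user_genes, dataset_genes, background_size):
    """Count shared genes and return the hypergeometric P(X >= k) for that overlap."""
    user, dataset = set(user_genes), set(dataset_genes)
    k = len(user & dataset)
    p = hypergeom.sf(k - 1, background_size, len(dataset), len(user))
    return k, p

published = {                     # placeholder datasets, not CrossCheck's actual contents
    "rnai_screen_A": {"TP53", "ATM", "CHEK2", "BRCA1"},
    "crispr_screen_B": {"MTOR", "RPTOR", "AKT1"},
}
hits = ["TP53", "BRCA1", "AKT1", "MYC"]   # placeholder user-entered gene symbols

for name, genes in published.items():
    k, p = overlap_enrichment(hits, genes, background_size=20000)
    print(name, k, round(p, 4))
```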
Plichta, Michael M.; Scheres, Anouk
2013-01-01
A review of the existing functional magnetic resonance imaging (fMRI) studies on reward anticipation in patients with attention-deficit/hyperactivity disorder (ADHD) is provided. Meta-analysis showed a significant medium effect size (Cohen’s d = 0.48–0.58) in terms of ventral–striatal (VS)-hyporesponsiveness in ADHD. Studies on VS-responsiveness and trait impulsivity in the healthy population demonstrate the opposite relationship, i.e. impulsivity-scores positively correlated with VS activation during reward processing. Against the background that ADHD may represent an extreme on a continuum of normal variability, the question arises as to how these contrasting findings can be integrated. We discuss three theoretical approaches, each of which integrates the opposing findings: (1) an inverted-u-shape model; (2) a (genetic) moderator model; and (3) the “unrelated model”. We conclude that at the present stage the number of existing studies in the healthy population as well as in ADHD groups is too small for a final answer. Therefore, our presented integrative approaches should be understood as an attempt to frame future research directions by generating testable hypotheses and giving practical suggestions for future studies. PMID:23928090
Expert knowledge as a foundation for the management of secretive species and their habitat
Drew, C. Ashton; Collazo, Jaime
2012-01-01
In this chapter, we share lessons learned during the elicitation and application of expert knowledge in the form of a belief network model for the habitat of a waterbird, the King Rail (Rallus elegans). A belief network is a statistical framework used to graphically represent and evaluate hypothesized cause and effect relationships among variables. Our model was a pilot project to explore the value of such a model as a tool to help the US Fish and Wildlife Service (USFWS) conserve species that lack sufficient empirical data to guide management decisions. Many factors limit the availability of empirical data that can support landscape-scale conservation planning. Globally, most species simply have not yet been subject to empirical study (Wilson 2000). Even for well-studied species, data are often restricted to specific geographic extents, to particular seasons, or to specific segments of a species’ life history. The USFWS mandates that the agency’s conservation actions (1) be coordinated across regional landscapes, (2) be founded on the best available science (with testable assumptions), and (3) support adaptive management through monitoring and assessment of action outcomes. Given limits on the available data, the concept of “best available science” in the context of conservation planning generally includes a mix of empirical data and expert knowledge (Sullivan et al. 2006).
Saqib, Hafiz Sohaib Ahmed; You, Minsheng
2017-01-01
Conservation biological control emphasizes natural and other non-crop vegetation as a source of natural enemies to focal crops. There is an unmet need for better methods to identify the types of vegetation that are optimal to support specific natural enemies that may colonize the crops. Here we explore the commonality of the spider assemblage—considering abundance and diversity (H)—in brassica crops with that of adjacent non-crop and non-brassica crop vegetation. We employ spatial-based multivariate ordination approaches, hierarchical clustering and spatial eigenvector analysis. The small-scale mixed cropping and high disturbance frequency of southern Chinese vegetation farming offered a setting to test the role of alternate vegetation for spider conservation. Our findings indicate that spider families differ markedly in occurrence with respect to vegetation type. Grassy field margins, non-crop vegetation, taro and sweetpotato harbour spider morphospecies and functional groups that are also present in brassica crops. In contrast, pumpkin and litchi contain spiders not found in brassicas, and so may have little benefit for conservation biological control services for brassicas. Our findings also illustrate the utility of advanced statistical approaches for identifying spatial relationships between natural enemies and the land uses most likely to offer alternative habitats for conservation biological control efforts, generating testable hypotheses for future studies. PMID:29085741
How cognitive heuristics can explain social interactions in spatial movement.
Seitz, Michael J; Bode, Nikolai W F; Köster, Gerta
2016-08-01
The movement of pedestrian crowds is a paradigmatic example of collective motion. The precise nature of individual-level behaviours underlying crowd movements has been subject to a lively debate. Here, we propose that pedestrians follow simple heuristics rooted in cognitive psychology, such as 'stop if another step would lead to a collision' or 'follow the person in front'. In other words, our paradigm explicitly models individual-level behaviour as a series of discrete decisions. We show that our cognitive heuristics produce realistic emergent crowd phenomena, such as lane formation and queuing behaviour. Based on our results, we suggest that pedestrians follow different cognitive heuristics that are selected depending on the context. This differs from the widely used approach of capturing changes in behaviour via model parameters and leads to testable hypotheses on changes in crowd behaviour for different motivation levels. For example, we expect that rushed individuals more often evade to the side and thus display distinct emergent queue formations in front of a bottleneck. Our heuristics can be ranked according to the cognitive effort that is required to follow them. Therefore, our model establishes a direct link between behavioural responses and cognitive effort and thus facilitates a novel perspective on collective behaviour. © 2016 The Author(s).
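The two heuristics quoted above translate almost directly into a per-step decision rule. The sketch below is one possible reading of that rule with invented parameters; it is not the authors' calibrated model.

```python
import numpy as np

def next_position(position, heading, others, leader=None,
                  step_len=0.4, min_dist=0.5):
    """One discrete locomotion decision:
    'follow the person in front' if a leader is visible, then
    'stop if another step would lead to a collision'."""
    if leader is not None:
        direction = leader - position
        heading = direction / np.linalg.norm(direction)
    candidate = position + step_len * heading
    if any(np.linalg.norm(candidate - other) < min_dist for other in others):
        return position            # stay put: the step would cause a collision
    return candidate               # otherwise take the step

# Toy call with invented coordinates.
pos = np.array([0.0, 0.0])
print(next_position(pos, np.array([1.0, 0.0]),
                    others=[np.array([0.45, 0.0])],
                    leader=np.array([3.0, 0.0])))
```

Swapping in a different rule (for example, evading to the side instead of stopping) is how such models generate the testable, context-dependent predictions described above.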
When cooperation begets cooperation: the role of key individuals in galvanizing support.
McAuliffe, Katherine; Wrangham, Richard; Glowacki, Luke; Russell, Andrew F
2015-12-05
Life abounds with examples of conspecifics actively cooperating to a common end, despite conflicts of interest being expected concerning how much each individual should contribute. Mathematical models typically find that such conflict can be resolved by partial-response strategies, leading investors to contribute relatively equitably. Using a case study approach, we show that such model expectations can be contradicted in at least four disparate contexts: (i) bi-parental care; (ii) cooperative breeding; (iii) cooperative hunting; and (iv) human cooperation. We highlight that: (a) marked variation in contributions is commonplace; and (b) individuals can often respond positively rather than negatively to the contributions of others. Existing models have surprisingly limited power in explaining these phenomena. Here, we propose that, although among-individual variation in cooperative contributions will be influenced by differential costs and benefits, there is likely to be a strong genetic or epigenetic component. We then suggest that selection can maintain high investors (key individuals) when their contributions promote support by increasing the benefits and/or reducing the costs for others. Our intentions are to raise awareness in--and provide testable hypotheses of--two of the most poorly understood, yet integral, questions regarding cooperative ventures: why do individuals vary in their contributions and when does cooperation beget cooperation? © 2015 The Author(s).
How cognitive heuristics can explain social interactions in spatial movement
Köster, Gerta
2016-01-01
The movement of pedestrian crowds is a paradigmatic example of collective motion. The precise nature of individual-level behaviours underlying crowd movements has been subject to a lively debate. Here, we propose that pedestrians follow simple heuristics rooted in cognitive psychology, such as ‘stop if another step would lead to a collision’ or ‘follow the person in front’. In other words, our paradigm explicitly models individual-level behaviour as a series of discrete decisions. We show that our cognitive heuristics produce realistic emergent crowd phenomena, such as lane formation and queuing behaviour. Based on our results, we suggest that pedestrians follow different cognitive heuristics that are selected depending on the context. This differs from the widely used approach of capturing changes in behaviour via model parameters and leads to testable hypotheses on changes in crowd behaviour for different motivation levels. For example, we expect that rushed individuals more often evade to the side and thus display distinct emergent queue formations in front of a bottleneck. Our heuristics can be ranked according to the cognitive effort that is required to follow them. Therefore, our model establishes a direct link between behavioural responses and cognitive effort and thus facilitates a novel perspective on collective behaviour. PMID:27581483
Lin, Frank Po-Yen; Pokorny, Adrian; Teng, Christina; Epstein, Richard J
2017-07-31
Vast amounts of clinically relevant text-based variables lie undiscovered and unexploited in electronic medical records (EMR). To exploit this untapped resource, and thus facilitate the discovery of informative covariates from unstructured clinical narratives, we have built a novel computational pipeline termed Text-based Exploratory Pattern Analyser for Prognosticator and Associator discovery (TEPAPA). This pipeline combines semantic-free natural language processing (NLP), regular expression induction, and statistical association testing to identify conserved text patterns associated with outcome variables of clinical interest. When we applied TEPAPA to a cohort of head and neck squamous cell carcinoma patients, plausible concepts known to be correlated with human papilloma virus (HPV) status were identified from the EMR text, including site of primary disease, tumour stage, pathologic characteristics, and treatment modalities. Similarly, correlates of other variables (including gender, nodal status, recurrent disease, smoking and alcohol status) were also reliably recovered. Using highly-associated patterns as covariates, a patient's HPV status was classifiable using a bootstrap analysis with a mean area under the ROC curve of 0.861, suggesting its predictive utility in supporting EMR-based phenotyping tasks. These data support using this integrative approach to efficiently identify disease-associated factors from unstructured EMR narratives, and thus to efficiently generate testable hypotheses.
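The pattern/outcome association step described here can be pictured as a 2x2 test per text pattern. The following is a simplified stand-in (Fisher's exact test on a regular-expression match), not TEPAPA's pattern-induction procedure, and the example notes and labels are invented placeholders.

```python
import re
from scipy.stats import fisher_exact

def pattern_association(notes, outcomes, pattern):
    """Test the association between presence of a text pattern and a binary outcome."""
    present = [bool(re.search(pattern, note, flags=re.IGNORECASE)) for note in notes]
    a = sum(p and y for p, y in zip(present, outcomes))          # pattern present, outcome positive
    b = sum(p and not y for p, y in zip(present, outcomes))      # pattern present, outcome negative
    c = sum(not p and y for p, y in zip(present, outcomes))      # pattern absent, outcome positive
    d = sum(not p and not y for p, y in zip(present, outcomes))  # pattern absent, outcome negative
    return fisher_exact([[a, b], [c, d]])

# Invented mini-corpus standing in for EMR free text.
notes = ["tonsil primary, p16 positive", "larynx primary, heavy smoker",
         "base of tongue primary, p16 positive", "hypopharynx primary, p16 negative"]
hpv_status = [True, False, True, False]
odds_ratio, p_value = pattern_association(notes, hpv_status, r"p16 positive")
```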
The challenges to inferring the regulators of biodiversity in deep time.
Ezard, Thomas H G; Quental, Tiago B; Benton, Michael J
2016-04-05
Attempts to infer the ecological drivers of macroevolution in deep time have long drawn inspiration from work on extant systems, but long-term evolutionary and geological changes complicate the simple extrapolation of such theory. Recent efforts to incorporate a more informed ecology into macroevolution have moved beyond the descriptive, seeking to isolate generating mechanisms and produce testable hypotheses of how groups of organisms usurp each other or coexist over vast timespans. This theme issue aims to exemplify this progress, providing a series of case studies of how novel modelling approaches are helping infer the regulators of biodiversity in deep time. In this Introduction, we explore the challenges of these new approaches. First, we discuss how our choices of taxonomic units have implications for the conclusions drawn. Second, we emphasize the need to embrace the interdependence of biotic and abiotic changes, because no living organism ignores its environment. Third, in the light of parts 1 and 2, we discuss the set of dynamic signatures that we might expect to observe in the fossil record. Finally, we ask whether these dynamics represent the most ecologically informative foci for research efforts aimed at inferring the regulators of biodiversity in deep time. The papers in this theme issue contribute in each of these areas. © 2016 The Author(s).
Application of Adverse Outcome Pathways (AOPs) in Human ...
The adverse outcome pathway (AOP) framework was developed to help organize and disseminate existing knowledge concerning the means through which specific perturbations of biological pathways can lead to adverse outcomes considered relevant to risk-based regulatory decision-making. Because many fundamental molecular and cellular pathways are conserved across taxa, data from assays that screen chemicals for their ability to interact with specific biomolecular targets can often be credibly applied to a broad range of species, even if the apical outcomes of those perturbations may differ. Information concerning the different trajectories of adversity that molecular initiating events may take in different taxa, life stages, and sexes of organisms can be captured in the form of an AOP network. As an example, AOPs documenting divergent consequences of thyroid peroxidase (TPO) and deiodinase (DIO) inhibition in mammals, amphibians, and fish have been developed. These AOPs provide the foundation for using data from common in vitro assays for TPO or DIO activity to inform both human health and ecological risk assessments. They also provide the foundation for an integrated approach to testing and assessment, where available information and biological understanding can be integrated in order to formulate plausible and testable hypotheses which can be used to target in vivo testing on the endpoints of greatest concern. Application of this AOP knowledge in several different r
Collective thermoregulation in bee clusters
Ocko, Samuel A.; Mahadevan, L.
2014-01-01
Swarming is an essential part of honeybee behaviour, wherein thousands of bees cling onto each other to form a dense cluster that may be exposed to the environment for several days. This cluster has the ability to maintain its core temperature actively without a central controller. We suggest that the swarm cluster is akin to an active porous structure whose functional requirement is to adjust to outside conditions by varying its porosity to control its core temperature. Using a continuum model that takes the form of a set of advection–diffusion equations for heat transfer in a mobile porous medium, we show that the equalization of an effective ‘behavioural pressure’, which propagates information about the ambient temperature through variations in density, leads to effective thermoregulation. Our model extends and generalizes previous models by focusing the question of mechanism on the form and role of the behavioural pressure, and allows us to explain the vertical asymmetry of the cluster (as a consequence of buoyancy-driven flows), the ability of the cluster to overpack at low ambient temperatures without breaking up at high ambient temperatures, and the relative insensitivity to large variations in the ambient temperature. Our theory also makes testable hypotheses for the response of the cluster to external temperature inhomogeneities and suggests strategies for biomimetic thermoregulation. PMID:24335563
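For orientation, heat transport in a mobile porous medium of the kind described here is generically governed by an advection-diffusion balance of the form below; this is the generic template only, not the specific behavioural-pressure model of the paper.

```latex
\frac{\partial T}{\partial t} + \mathbf{u}\cdot\nabla T
  \;=\; \nabla\!\cdot\!\bigl(\kappa(\rho)\,\nabla T\bigr) + q(\rho, T),
```

where T is temperature, u the local flow, κ an effective conductivity that depends on the local bee density ρ, and q the metabolic heat production.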
Defining the Construct of Synthetic Androgen Intoxication: An Application of General Brain Arousal.
Hildebrandt, Tom; Heywood, Ashley; Wesley, Daniel; Schulz, Kurt
2018-01-01
Synthetic androgens (i.e., anabolic-androgenic steroids) are the primary component of the majority of problematic appearance and performance enhancing drug (APED) use. Despite evidence that these substances are associated with increased risk for aggression, violence, body image disturbances, and polypharmacy and can develop a pattern of chronic use consistent with drug dependence, there are no formal definitions of androgen intoxication. Consequently, the purpose of this paper is to establish a testable theory of androgen intoxication. We present evidence and theorize that synthetic androgen intoxication can be defined by a pattern of poor self-regulation characterized by increased propensity for a range of behaviors (e.g., aggression, sex, drug seeking, exercise, etc.) via androgen mediated effects on general brain arousal. This theory posits that androgens reduce the threshold for emotional reactivity, motor response, and alertness to sensory stimuli and disrupt inhibitory control over the behaviors associated with synthetic androgen use. These changes result from alterations to basic neurocircuitry that amplify limbic activation and reduce top-down cortical control. The implications for this definition are to inform APED specific hypotheses about the behavioral and psychological effects of APED use and provide a basis for establishing clinical, legal, and public health guidelines to address the use and misuse of these substances.
Model averaging, optimal inference, and habit formation
FitzGerald, Thomas H. B.; Dolan, Raymond J.; Friston, Karl J.
2014-01-01
Postulating that the brain performs approximate Bayesian inference generates principled and empirically testable models of neuronal function—the subject of much current interest in neuroscience and related disciplines. Current formulations address inference and learning under some assumed and particular model. In reality, organisms are often faced with an additional challenge—that of determining which model or models of their environment are the best for guiding behavior. Bayesian model averaging—which says that an agent should weight the predictions of different models according to their evidence—provides a principled way to solve this problem. Importantly, because model evidence is determined by both the accuracy and complexity of the model, optimal inference requires that these be traded off against one another. This means an agent's behavior should show an equivalent balance. We hypothesize that Bayesian model averaging plays an important role in cognition, given that it is both optimal and realizable within a plausible neuronal architecture. We outline model averaging and how it might be implemented, and then explore a number of implications for brain and behavior. In particular, we propose that model averaging can explain a number of apparently suboptimal phenomena within the framework of approximate (bounded) Bayesian inference, focusing particularly upon the relationship between goal-directed and habitual behavior. PMID:25018724
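The averaging rule referred to here is the standard one: predictions are weighted by posterior model probabilities, which are in turn proportional to model evidence,

```latex
p(y \mid \mathcal{D}) \;=\; \sum_{m} p(y \mid m, \mathcal{D})\, p(m \mid \mathcal{D}),
\qquad
p(m \mid \mathcal{D}) \;\propto\; p(\mathcal{D} \mid m)\, p(m).
```

Because the evidence p(D|m) penalizes complexity as well as rewarding accuracy, the weighting embodies the accuracy/complexity trade-off discussed in the abstract.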
Inoculation Stress Hypothesis of Environmental Enrichment
Crofton, Elizabeth J.; Zhang, Yafang; Green, Thomas A.
2014-01-01
One hallmark of psychiatric conditions is the vast continuum of individual differences in susceptibility vs. resilience resulting from the interaction of genetic and environmental factors. The environmental enrichment paradigm is an animal model that is useful for studying a range of psychiatric conditions, including protective phenotypes in addiction and depression models. The major question is how environmental enrichment, a non-drug and non-surgical manipulation, can produce such robust individual differences in such a wide range of behaviors. This paper draws from a variety of published sources to outline a coherent hypothesis of inoculation stress as a factor producing the protective enrichment phenotypes. The basic tenet suggests that chronic mild stress from living in a complex environment and interacting non-aggressively with conspecifics can inoculate enriched rats against subsequent stressors and/or drugs of abuse. This paper reviews the enrichment phenotypes, mulls the fundamental nature of environmental enrichment vs. isolation, discusses the most appropriate control for environmental enrichment, and challenges the idea that cortisol/corticosterone equals stress. The intent of the inoculation stress hypothesis of environmental enrichment is to provide a scaffold with which to build testable hypotheses for the elucidation of the molecular mechanisms underlying these protective phenotypes and thus provide new therapeutic targets to treat psychiatric/neurological conditions. PMID:25449533
Inoculation stress hypothesis of environmental enrichment.
Crofton, Elizabeth J; Zhang, Yafang; Green, Thomas A
2015-02-01
One hallmark of psychiatric conditions is the vast continuum of individual differences in susceptibility vs. resilience resulting from the interaction of genetic and environmental factors. The environmental enrichment paradigm is an animal model that is useful for studying a range of psychiatric conditions, including protective phenotypes in addiction and depression models. The major question is how environmental enrichment, a non-drug and non-surgical manipulation, can produce such robust individual differences in such a wide range of behaviors. This paper draws from a variety of published sources to outline a coherent hypothesis of inoculation stress as a factor producing the protective enrichment phenotypes. The basic tenet suggests that chronic mild stress from living in a complex environment and interacting non-aggressively with conspecifics can inoculate enriched rats against subsequent stressors and/or drugs of abuse. This paper reviews the enrichment phenotypes, mulls the fundamental nature of environmental enrichment vs. isolation, discusses the most appropriate control for environmental enrichment, and challenges the idea that cortisol/corticosterone equals stress. The intent of the inoculation stress hypothesis of environmental enrichment is to provide a scaffold with which to build testable hypotheses for the elucidation of the molecular mechanisms underlying these protective phenotypes and thus provide new therapeutic targets to treat psychiatric/neurological conditions. Copyright © 2014 Elsevier Ltd. All rights reserved.
Emphasizing the process of science using demonstrations in conceptual chemistry
NASA Astrophysics Data System (ADS)
Lutz, Courtney A.
The purpose of this project was to teach students a method for employing the process of science in a conceptual chemistry classroom when observing a demonstration of a discrepant event. Students observed six demonstrations throughout a trimester study of chemistry and responded to each demonstration by asking as many questions as they could think of, choosing one testable question and generating as many hypotheses as possible to answer it, and choosing one hypothesis and making predictions about the results expected if it were tested. Students were evaluated on their curiosity, confidence, knowledge of the process of science, and knowledge of the nature of science before and after the six demonstrations. Many students showed improvement in using or mastery of the process of science within the context of conceptual chemistry after six intensive experiences with it. Results of the study also showed students gained confidence in their scientific abilities after completing one trimester of conceptual chemistry. Curiosity and knowledge of the nature of science did not show statistically significant improvement according to the assessment tool. This may have been due to the scope of the demonstration-and-response activities, which focused on process-of-science methodology rather than on knowledge of the nature of science, or to the constraints of the assessment tool.
Defining the Construct of Synthetic Androgen Intoxication: An Application of General Brain Arousal
Hildebrandt, Tom; Heywood, Ashley; Wesley, Daniel; Schulz, Kurt
2018-01-01
Synthetic androgens (i.e., anabolic-androgenic steroids) are the primary component of the majority of problematic appearance and performance enhancing drug (APED) use. Despite evidence that these substances are associated with increased risk for aggression, violence, body image disturbances, and polypharmacy and can develop a pattern of chronic use consistent with drug dependence, there are no formal definitions of androgen intoxication. Consequently, the purpose of this paper is to establish a testable theory of androgen intoxication. We present evidence and theorize that synthetic androgen intoxication can be defined by a pattern of poor self-regulation characterized by increased propensity for a range of behaviors (e.g., aggression, sex, drug seeking, exercise, etc.) via androgen mediated effects on general brain arousal. This theory posits that androgens reduce the threshold for emotional reactivity, motor response, and alertness to sensory stimuli and disrupt inhibitory control over the behaviors associated with synthetic androgen use. These changes result from alterations to basic neurocircuitry that amplify limbic activation and reduce top-down cortical control. The implications for this definition are to inform APED specific hypotheses about the behavioral and psychological effects of APED use and provide a basis for establishing clinical, legal, and public health guidelines to address the use and misuse of these substances. PMID:29651261
Application of adverse outcome pathways (AOPs) in human ...
The adverse outcome pathway (AOP) framework was developed to help organize and disseminate existing knowledge concerning the means through which specific perturbations of biological pathways can lead to adverse outcomes considered relevant to risk-based regulatory decision-making. Because many fundamental molecular and cellular pathways are conserved across taxa, data from assays that screen chemicals for their ability to interact with specific biomolecular targets can often be credibly applied to a broad range of species, even if the apical outcomes of those perturbations may differ. Information concerning the different trajectories of adversity that molecular initiating events may take in different taxa, life stages, and sexes of organisms can be captured in the form of an AOP network. As an example, AOPs documenting divergent consequences of thyroid peroxidase (TPO) and deiodinase (DIO) inhibition in mammals, amphibians, and fish have been developed. These AOPs provide the foundation for using data from common in vitro assays for TPO or DIO activity to inform both human health and ecological risk assessments. They also provide the foundation for an integrated approach to testing and assessment, where available information and biological understanding can be integrated in order to formulate plausible and testable hypotheses which can be used to target in vivo testing on the endpoints of greatest concern. Application of this AOP knowledge in several different r
Arbilly, Michal; Lotem, Arnon
2017-10-25
Anthropomorphism, the attribution of human cognitive processes and emotional states to animals, is commonly viewed as non-scientific and potentially misleading. This is mainly because apparent similarity to humans can usually be explained by alternative, simpler mechanisms in animals, and because there is no explanatory power in analogies to human phenomena when these phenomena are not well understood. Yet, because it is also difficult to preclude real similarity and continuity in the evolution of humans' and animals' cognitive abilities, it may not be productive to completely ignore our understanding of human behaviour when thinking about animals. Here we propose that in applying a functional approach to the evolution of cognitive mechanisms, human cognition may be used to broaden our theoretical thinking and to generate testable hypotheses. Our goal is not to 'elevate' animals, but rather to find the minimal set of mechanistic principles that may explain 'advanced' cognitive abilities in humans, and consider under what conditions these mechanisms were likely to enhance fitness and to evolve in animals. We illustrate this approach, from relatively simple emotional states, to more advanced mechanisms, involved in planning and decision-making, episodic memory, metacognition, theory of mind, and consciousness. © 2017 The Author(s).
Is prenatal smoking associated with a developmental pattern of conduct problems in young boys?
Wakschlag, Lauren S; Pickett, Kate E; Kasza, Kristen E; Loeber, Rolf
2006-04-01
Prenatal smoking is robustly associated with increased risk of conduct problems in offspring. Observational studies that provide detailed phenotypic description are critical for generating testable hypotheses about underlying processes through which the effects of prenatal smoking may operate. To this end, we use a developmental framework to examine the association of exposure with (1) oppositional defiant disorder and attention-deficit/hyperactivity disorder in young boys and (2) the pattern of delinquent behavior at adolescence. Using diagnostic measures and repeated measures of delinquency, we compare exposed and nonexposed boys from the youngest cohort of the Pittsburgh Youth Study (N = 448). Exposed boys were significantly more likely to (1) develop oppositional defiant disorder and comorbid oppositional defiant disorder-attention-deficit/hyperactivity disorder but not attention-deficit/hyperactivity disorder alone and (2) to have an earlier onset of significant delinquent behavior. The early emergence and developmental coherence of exposure-related conduct problems is striking and is consistent with a behavioral teratological model. Phenotypically, exposure-related conduct problems appear to be characterized by socially resistant and impulsively aggressive behavior. Whether prenatal smoking plays an etiological role in or is a risk marker for the development of conduct problems, exposed offspring are at increased risk of an early-starter pathway to conduct problems.
NASA Astrophysics Data System (ADS)
Douglas, M. M.; Bunn, S. E.; Davies, P. M.
2005-05-01
The tropical rivers of northern Australia are internationally recognised for their high ecological and cultural values. They have largely unmodified flow regimes and are comparatively free of the impacts associated with intensive land use. However, there is growing demand for agricultural development and existing pressures, such as weeds and feral animals, threaten their ecological integrity. Using the international literature to provide a conceptual framework and drawing on limited published and unpublished data on rivers in northern Australia, we have derived five general principles about food webs and related ecosystem processes that both characterise tropical rivers of northern Australia and have important implications for their management. These are: (1) Seasonal hydrology is a strong driver of ecosystem processes and food web structure; (2) Hydrological connectivity is largely intact and underpins important terrestrial-aquatic food web subsidies; (3) River and wetland food webs are strongly dependent on algal production; (4) A few common macroconsumer species have a strong influence on benthic food webs; (5) Omnivory is widespread and food chains are short. These principles have implications for the management and protection of tropical rivers and wetlands of northern Australia and provide a framework for the formation of testable hypotheses in future research programs.
Ahrens, Kym R.; DuBois, David Lane; Garrison, Michelle; Spencer, Renee; Richardson, Laura P.; Lozano, Paula
2012-01-01
Foster youth are at risk of poor adult outcomes. Research on the role of mentoring relationships for this population suggests the value of strategies that increase their access to adult sources of support, both while in foster care and as they reach adulthood. We conducted semi-structured, individual qualitative interviews with 23 former foster youth ages 18-25 regarding their relationships with supportive non-parental adults. We sought to identify factors that influence the formation, quality, and duration of these relationships and to develop testable hypotheses for intervention strategies. Findings suggest several themes related to relationship formation with non-parental adults, including barriers (e.g., youth's fears of being hurt) and facilitators (e.g., patience from the adult). Distinct themes were also identified relating to the ongoing development and longevity of these relationships. Youth also described multiple types of support and positive contributions to their development. Proposed intervention strategies include systematic incorporation of important non-parental adults into transition planning, enhanced training and matching procedures within formal mentoring programs, assistance for youth to strengthen their interpersonal awareness and skills, and the targeting of specific periods of need when linking youth to sources of adult support. Recommended research includes the development, pilot-testing, and evaluation of proposed strategies. PMID:22661797
The missing biology in land carbon models (Invited)
NASA Astrophysics Data System (ADS)
Prentice, I. C.; Cornwell, W.; Dong, N.; Maire, V.; Wang, H.; Wright, I.
2013-12-01
Models of terrestrial carbon cycling give divergent results, and recent developments - notably the inclusion of nitrogen-carbon cycle coupling - have apparently made matters worse. More extensive benchmarking of models would be highly desirable, but is not a panacea. Problems with current models include overparameterization (assigning separate sets of parameter values for each plant functional type can easily obscure more fundamental model limitations), and the widespread persistence of incorrect paradigms to describe plant responses to environment. Next-generation models require a more sound basis in observations and theory. A possible way forward will be outlined. It will be shown how the principle of optimization by natural selection can yield testable, general hypotheses about plant function. A specific optimality hypothesis about the control of CO2 drawdown versus water loss by leaves will be shown to yield global and quantitatively verifiable predictions of plant behaviour as demonstrated in field gas-exchange measurements across species from different environments, and in the global pattern of stable carbon isotope discrimination by plants. Combined with the co-limitation hypothesis for the control of photosynthetic capacity and an economic approach to the costs of nutrient acquisition, this hypothesis provides a potential foundation for a comprehensive predictive understanding of the controls of primary production on land.
Sá-Pinto, Alexandra; Branco, Madalena S.; Alexandrino, Paulo B.; Fontaine, Michaël C.; Baird, Stuart J. E.
2012-01-01
Knowledge of the scale of dispersal and the mechanisms governing gene flow in marine environments remains fragmentary despite being essential for understanding evolution of marine biota and to design management plans. We use the limpets Patella ulyssiponensis and Patella rustica as models for identifying factors affecting gene flow in marine organisms across the North-East Atlantic and the Mediterranean Sea. A set of allozyme loci and a fragment of the mitochondrial gene cytochrome C oxidase subunit I were screened for genetic variation through starch gel electrophoresis and DNA sequencing, respectively. An approach combining clustering algorithms with clinal analyses was used to test for the existence of barriers to gene flow and estimate their geographic location and abruptness. Sharp breaks in the genetic composition of individuals were observed in the transitions between the Atlantic and the Mediterranean and across southern Italian shores. An additional break within the Atlantic cluster separates samples from the Alboran Sea and Atlantic African shores from those of the Iberian Atlantic shores. The geographic congruence of the genetic breaks detected in these two limpet species strongly supports the existence of transpecific barriers to gene flow in the Mediterranean Sea and Northeastern Atlantic. This leads to testable hypotheses regarding factors restricting gene flow across the study area. PMID:23239977
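The abstract does not give the clinal model used; as a generic illustration of what a clinal analysis of this kind estimates, a logistic cline can be fitted to allele frequencies along a transect to recover a barrier's location (centre) and abruptness (width). The sketch below is hypothetical: the distances, frequencies, and parameterization are not from the limpet study.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical clinal analysis: fit a logistic cline to allele frequency along a
# transect to estimate a barrier's position (centre) and abruptness (width).
# Distances and frequencies are invented, not the limpet data.
def cline(d, centre, width):
    # Standard sigmoid cline; maximum slope at the centre is 1/width.
    return 1.0 / (1.0 + np.exp(-4.0 * (d - centre) / width))

distance_km = np.array([0, 100, 200, 300, 400, 500, 600, 700, 800], dtype=float)
allele_freq = np.array([0.04, 0.05, 0.10, 0.30, 0.55, 0.85, 0.93, 0.96, 0.97])

(centre, width), _ = curve_fit(cline, distance_km, allele_freq, p0=[400.0, 150.0])
print(f"estimated cline centre ~{centre:.0f} km, width ~{width:.0f} km")
```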
Yang, Ines; Woltemate, Sabrina; Piazuelo, M. Blanca; Bravo, Luis E.; Yepez, Maria Clara; Romero-Gallo, Judith; Delgado, Alberto G.; Wilson, Keith T.; Peek, Richard M.; Correa, Pelayo; Josenhans, Christine; Fox, James G.; Suerbaum, Sebastian
2016-01-01
Inhabitants of Túquerres in the Colombian Andes have a 25-fold higher risk of gastric cancer than inhabitants of the coastal town Tumaco, despite similar H. pylori prevalences. The gastric microbiota was recently shown in animal models to accelerate the development of H. pylori-induced precancerous lesions. 20 individuals from each town, matched for age and sex, were selected, and gastric microbiota analyses were performed by deep sequencing of amplified 16S rDNA. In parallel, analyses of H. pylori status, carriage of the cag pathogenicity island and assignment of H. pylori to phylogeographic groups were performed to test for correlations between H. pylori strain properties and microbiota composition. The gastric microbiota composition was highly variable between individuals, but showed a significant correlation with the town of origin. Multiple OTUs were detected exclusively in either Tumaco or Túquerres. Two operational taxonomic units (OTUs), Leptotrichia wadei and a Veillonella sp., were significantly more abundant in Túquerres, and 16 OTUs, including a Staphylococcus sp. were significantly more abundant in Tumaco. There was no significant correlation of H. pylori phylogeographic population or carriage of the cagPAI with microbiota composition. From these data, testable hypotheses can be generated and examined in suitable animal models and prospective clinical trials. PMID:26729566
When cooperation begets cooperation: the role of key individuals in galvanizing support
McAuliffe, Katherine; Wrangham, Richard; Glowacki, Luke; Russell, Andrew F.
2015-01-01
Life abounds with examples of conspecifics actively cooperating to a common end, despite conflicts of interest being expected concerning how much each individual should contribute. Mathematical models typically find that such conflict can be resolved by partial-response strategies, leading investors to contribute relatively equitably. Using a case study approach, we show that such model expectations can be contradicted in at least four disparate contexts: (i) bi-parental care; (ii) cooperative breeding; (iii) cooperative hunting; and (iv) human cooperation. We highlight that: (a) marked variation in contributions is commonplace; and (b) individuals can often respond positively rather than negatively to the contributions of others. Existing models have surprisingly limited power in explaining these phenomena. Here, we propose that, although among-individual variation in cooperative contributions will be influenced by differential costs and benefits, there is likely to be a strong genetic or epigenetic component. We then suggest that selection can maintain high investors (key individuals) when their contributions promote support by increasing the benefits and/or reducing the costs for others. Our intentions are to raise awareness in—and provide testable hypotheses of—two of the most poorly understood, yet integral, questions regarding cooperative ventures: why do individuals vary in their contributions and when does cooperation beget cooperation? PMID:26503685
A parsimonious modular approach to building a mechanistic belowground carbon and nitrogen model
NASA Astrophysics Data System (ADS)
Abramoff, Rose Z.; Davidson, Eric A.; Finzi, Adrien C.
2017-09-01
Soil decomposition models range from simple empirical functions to those that represent physical, chemical, and biological processes. Here we develop a parsimonious, modular C and N cycle model, the Dual Arrhenius Michaelis-Menten-Microbial Carbon and Nitrogen Physiology (DAMM-MCNiP), that generates testable hypotheses regarding the effect of temperature, moisture, and substrate supply on C and N cycling. We compared this model to DAMM alone and an empirical model of heterotrophic respiration based on Harvard Forest data. We show that while different model structures explain similar amounts of variation in respiration, they differ in their ability to infer processes that affect C flux. We applied DAMM-MCNiP to explain an observed seasonal hysteresis in the relationship between respiration and temperature and show using an exudation simulation that the strength of the priming effect depended on the stoichiometry of the inputs. Low C:N inputs stimulated priming of soil organic matter decomposition, but high C:N inputs were preferentially utilized by microbes as a C source with limited priming. The simplicity of DAMM-MCNiP's simultaneous representations of temperature, moisture, substrate supply, enzyme activity, and microbial growth processes is unique among microbial physiology models and is sufficiently parsimonious that it could be incorporated into larger-scale models of C and N cycling.
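The abstract does not reproduce the model equations; the core of the published DAMM formulation that DAMM-MCNiP builds on is an Arrhenius maximum rate multiplied by Michaelis-Menten terms for soluble substrate and oxygen, each modulated by soil moisture. The sketch below is a minimal, illustrative version of that core only; the parameter values, soluble fraction, and diffusion exponents are placeholders, not the calibrated DAMM-MCNiP parameters.

```python
import numpy as np

# Minimal, illustrative sketch of a dual Arrhenius / Michaelis-Menten flux core.
# Parameter values are placeholders, not calibrated values from the paper.
R_GAS = 8.314e-3  # kJ mol^-1 K^-1

def damm_flux(temp_k, theta, s_total,
              alpha=5.4e13, ea=72.0,          # Arrhenius pre-factor, activation energy (kJ/mol)
              km_s=1.0e-6, km_o2=0.121,       # half-saturation constants (assumed)
              p_sol=4.0e-4, porosity=0.6):    # soluble substrate fraction, total porosity (assumed)
    """Heterotrophic CO2 flux as a function of temperature, moisture, and substrate."""
    vmax = alpha * np.exp(-ea / (R_GAS * temp_k))        # Arrhenius temperature response
    sol_c = p_sol * s_total * theta**3                   # substrate reaching enzymes via water films
    o2 = 0.209 * (porosity - theta)**(4.0 / 3.0)         # O2 diffusing through air-filled pores
    return vmax * (sol_c / (km_s + sol_c)) * (o2 / (km_o2 + o2))

# Flux peaks at intermediate moisture: dry soil limits substrate, wet soil limits O2.
for theta in (0.15, 0.30, 0.50):
    print(f"theta={theta:.2f}  flux={damm_flux(293.15, theta, s_total=0.05):.3e}")
```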
Experimenter Effects on Cardiovascular Reactivity and Task Performance during Mental Stress Testing
ERIC Educational Resources Information Center
Siegwarth, Nicole; Larkin, Kevin T.; Kemmner, Christine
2012-01-01
Experimenter effects have long been hypothesized to influence participants' responses to mental stress testing. To explore the influence of experimenter warmth on responses to two mental stress tasks (mental arithmetic, mirror tracing), 32 young women participated in a single 45-min experimental session. Participants were randomized into warm…
A SEU-Hard Flip-Flop for Antifuse FPGAs
NASA Technical Reports Server (NTRS)
Katz, R.; Wang, J. J.; McCollum, J.; Cronquist, B.; Chan, R.; Yu, D.; Kleyner, I.; Day, John H. (Technical Monitor)
2001-01-01
A single event upset (SEU)-hardened flip-flop has been designed and developed for antifuse Field Programmable Gate Array (FPGA) application. Design and application issues, testability, test methods, simulation, and results are discussed.
The changing features of the body-mind problem.
Agassi, Joseph
2007-01-01
The body-mind problem invites scientific study, since mental events are repeated and repeatable and invite testable explanations. They seemed troublesome because of the classical theory of substance that failed to solve its own central problems. These are soluble with the aid of the theory of the laws of nature, particularly in its emergentist version [Bunge, M., 1980. The Body-mind Problem, Pergamon, Oxford] that invites refutable explanations [Popper, K.R., 1959. The Logic of Scientific Discovery, Hutchinson, London]. The view of mental properties as emergent is a modification of the two chief classical views, materialism and dualism. As this view invites testable explanations of events of the inner world, it is better than the quasi-behaviorist view of self-awareness as computer-style self-monitoring [Minsky, M., Laske, O., 1992. A conversation with Marvin Minsky. AI Magazine 13 (3), 31-45].
Design for testability and diagnosis at the system-level
NASA Technical Reports Server (NTRS)
Simpson, William R.; Sheppard, John W.
1993-01-01
The growing complexity of full-scale systems has surpassed the capabilities of most simulation software to provide detailed models or gate-level failure analyses. The process of system-level diagnosis approaches the fault-isolation problem in a manner that differs significantly from the traditional and exhaustive failure mode search. System-level diagnosis is based on a functional representation of the system. For example, one can exercise one portion of a radar algorithm (the Fast Fourier Transform (FFT) function) by injecting several standard input patterns and comparing the results to standardized output results. An anomalous output would point to one of several items (including the FFT circuit) without specifying the gate or failure mode. For system-level repair, identifying an anomalous chip is sufficient. We describe here an information theoretic and dependency modeling approach that discards much of the detailed physical knowledge about the system and analyzes its information flow and functional interrelationships. The approach relies on group and flow associations and, as such, is hierarchical. Its hierarchical nature allows the approach to be applicable to any level of complexity and to any repair level. This approach has been incorporated in a product called STAMP (System Testability and Maintenance Program) which was developed and refined through more than 10 years of field-level applications to complex system diagnosis. The results have been outstanding, even spectacular in some cases. In this paper we describe system-level testability, system-level diagnoses, and the STAMP analysis approach, as well as a few STAMP applications.
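The abstract describes the approach only at a high level; as a toy illustration of diagnosis from functional dependencies rather than gate-level models, the sketch below isolates suspect components from the pass/fail outcomes of functional tests. The components, tests, and coverage sets are invented and are not taken from STAMP.

```python
# Toy dependency-model diagnosis in the spirit of the approach described above.
DEPENDS_ON = {
    "fft_pattern_test": {"adc", "fft_circuit", "memory"},   # end-to-end functional test
    "adc_loopback":     {"adc"},
    "memory_bist":      {"memory"},
}

def isolate(outcomes):
    """Single-fault isolation: a suspect must be covered by every failing test
    and exonerated by no passing test."""
    suspects = set().union(*DEPENDS_ON.values())
    for test, passed in outcomes.items():
        if passed:
            suspects -= DEPENDS_ON[test]   # everything a passing test exercises is exonerated
        else:
            suspects &= DEPENDS_ON[test]   # the fault must lie in what the failing test covers
    return suspects

# An anomalous FFT output with passing ADC and memory tests points at the FFT
# circuit without specifying a gate-level failure mode.
print(isolate({"fft_pattern_test": False, "adc_loopback": True, "memory_bist": True}))
```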
A mathematical model of physiological processes and its application to the study of aging
NASA Technical Reports Server (NTRS)
Hibbs, A. R.; Walford, R. L.
1989-01-01
The behavior of a physiological system which, after displacement, returns by homeostatic mechanisms to its original condition can be described by a simple differential equation in which the "recovery time" is a parameter. Two such systems, which influence one another, can be linked mathematically by the use of "coupling" or "feedback" coefficients. These concepts are the basis for many mathematical models of physiological behavior, and we describe the general nature of such models. Next, we introduce the concept of a "fatal limit" for the displacement of a physiological system, and show how measures of such limits can be included in mathematical models. We show how the numerical values of such limits depend on the values of other system parameters, i.e., recovery times and coupling coefficients, and suggest ways of measuring all these parameters experimentally, for example by monitoring changes induced by X-irradiation. Next, we discuss age-related changes in these parameters, and show how the parameters of mortality statistics, such as the famous Gompertz parameters, can be derived from experimentally measurable changes. Concepts of onset-of-aging, critical or fatal limits, equilibrium value (homeostasis), recovery times and coupling constants are involved. Illustrations are given using published data from mouse and rat populations. We believe that this method of deriving survival patterns from a model that is experimentally testable is unique.
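The paper's equations are not reproduced in the abstract; a minimal sketch of the kind of model described, two homeostatic variables with their own recovery times, coupling coefficients, and an arbitrary fatal limit on displacement, is given below. All numerical values are illustrative placeholders, not parameters from the paper.

```python
# Illustrative sketch: two coupled homeostatic variables, each relaxing to
# equilibrium (here 0) with its own recovery time, linked by coupling
# coefficients, with an arbitrary "fatal limit" on displacement.
def time_to_fatal_limit(x0=1.0, y0=0.0, tau_x=2.0, tau_y=5.0,
                        c_xy=0.3, c_yx=0.1, fatal_limit=3.0,
                        dt=0.01, t_max=50.0):
    x, y, t = x0, y0, 0.0
    while t < t_max:
        dx = (-x / tau_x + c_xy * y) * dt    # recovery toward equilibrium plus coupling
        dy = (-y / tau_y + c_yx * x) * dt
        x, y, t = x + dx, y + dy, t + dt
        if abs(x) > fatal_limit or abs(y) > fatal_limit:
            return t                         # displacement crossed the fatal limit
    return None                              # system recovered within the horizon

print(time_to_fatal_limit())                    # modest displacement: recovers (None)
print(time_to_fatal_limit(x0=2.9, c_xy=1.5))    # strong coupling destabilizes the pair
```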
Sources, Sinks, and Model Accuracy
Spatial demographic models are a necessary tool for understanding how to manage landscapes sustainably for animal populations. These models, therefore, must offer precise and testable predictions about animal population dynamics and how animal demographic parameters respond to ...
A critique of the hypothesis, and a defense of the question, as a framework for experimentation.
Glass, David J
2010-07-01
Scientists are often steered by common convention, funding agencies, and journal guidelines into a hypothesis-driven experimental framework, despite Isaac Newton's dictum that hypotheses have no place in experimental science. Some may think that Newton's cautionary note, which was in keeping with an experimental approach espoused by Francis Bacon, is inapplicable to current experimental method since, in accord with the philosopher Karl Popper, modern-day hypotheses are framed to serve as instruments of falsification, as opposed to verification. But Popper's "critical rationalist" framework too is problematic. It has been accused of being: inconsistent on philosophical grounds; unworkable for modern "large science," such as systems biology; inconsistent with the actual goal of experimental science, which is verification and not falsification; and harmful to the process of discovery as a practical matter. A criticism of the hypothesis as a framework for experimentation is offered. Presented is an alternative framework-the query/model approach-which many scientists may discover is the framework they are actually using, despite being required to give lip service to the hypothesis.
Assessing Pupils' Skills in Experimentation
ERIC Educational Resources Information Center
Hammann, Marcus; Phan, Thi Thanh Hoi; Ehmer, Maike; Grimm, Tobias
2008-01-01
This study is concerned with different forms of assessment of pupils' skills in experimentation. The findings of three studies are reported. Study 1 investigates whether it is possible to develop reliable multiple-choice tests for the skills of forming hypotheses, designing experiments and analysing experimental data. Study 2 compares scores from…
Work-Centered Technology Development (WTD)
2005-03-01
theoretical, testable, inductive, and repeatable foundations of science. Theoretical foundations include notions such as statistical versus analytical...
Writing testable software requirements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Knirk, D.
1997-11-01
This tutorial identifies common problems in analyzing problem requirements and in constructing a written specification of what the software is to do. It deals with two main problem areas: identifying and describing problem requirements, and analyzing and describing behavior specifications.
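The tutorial's own examples are not included in the abstract; one common way to make a requirement testable is to restate it with a measurable threshold that can be checked automatically. The requirement wording, threshold, and latency figures below are invented for illustration and are not taken from the tutorial.

```python
import math

# Invented example of restating a vague requirement as a testable one.
# Vague:    "The system shall respond quickly to user queries."
# Testable: "For the reference query set, the 95th-percentile response time
#            shall not exceed 200 ms."  (wording and threshold are hypothetical)

def p95(samples_ms):
    """Nearest-rank 95th percentile."""
    ordered = sorted(samples_ms)
    return ordered[math.ceil(0.95 * len(ordered)) - 1]

def test_req_perf_001():
    measured_ms = [112, 130, 95, 180, 150, 141, 99, 175, 160, 120]  # illustrative data
    assert p95(measured_ms) <= 200, "REQ-PERF-001 violated"

test_req_perf_001()
```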
All pure bipartite entangled states can be self-tested
Coladangelo, Andrea; Goh, Koon Tong; Scarani, Valerio
2017-01-01
Quantum technologies promise advantages over their classical counterparts in the fields of computation, security and sensing. It is thus desirable that classical users are able to obtain guarantees on quantum devices, even without any knowledge of their inner workings. That such classical certification is possible at all is remarkable: it is a consequence of the violation of Bell inequalities by entangled quantum systems. Device-independent self-testing refers to the most complete such certification: it enables a classical user to uniquely identify the quantum state shared by uncharacterized devices by simply inspecting the correlations of measurement outcomes. Self-testing was first demonstrated for the singlet state and a few other examples of self-testable states were reported in recent years. Here, we address the long-standing open question of whether every pure bipartite entangled state is self-testable. We answer it affirmatively by providing explicit self-testing correlations for all such states. PMID:28548093
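The paper's self-testing correlations are not reproduced in the abstract; as background for the Bell-inequality violation it builds on, the short check below verifies numerically that singlet correlations with the textbook CHSH-optimal measurement settings reach the Tsirelson bound of 2√2. The angles are the standard choices, not the construction from the paper.

```python
import numpy as np

# Numerical check that singlet correlations reach the Tsirelson bound 2*sqrt(2).
def spin(theta):
    """Measurement observable cos(theta)*Z + sin(theta)*X in the X-Z plane."""
    z = np.array([[1.0, 0.0], [0.0, -1.0]])
    x = np.array([[0.0, 1.0], [1.0, 0.0]])
    return np.cos(theta) * z + np.sin(theta) * x

def corr(a, b, state):
    return float(state @ np.kron(spin(a), spin(b)) @ state)

singlet = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2)   # (|01> - |10>)/sqrt(2)
a0, a1 = 0.0, np.pi / 2
b0, b1 = np.pi / 4, -np.pi / 4

chsh = corr(a0, b0, singlet) + corr(a0, b1, singlet) \
     + corr(a1, b0, singlet) - corr(a1, b1, singlet)
print(abs(chsh), 2 * np.sqrt(2))   # both ~2.828
```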
2011-01-01
Background Although many biological databases are applying semantic web technologies, meaningful biological hypothesis testing cannot be easily achieved. Database-driven high throughput genomic hypothesis testing requires both the capability to obtain semantically relevant experimental data and the capability to perform appropriate statistical tests on the retrieved data. Tissue Microarray (TMA) data are semantically rich and contain many biologically important hypotheses waiting for high throughput conclusions. Methods An application-specific ontology was developed for managing TMA and DNA microarray databases by semantic web technologies. Data were represented as Resource Description Framework (RDF) according to the framework of the ontology. Applications for hypothesis testing (Xperanto-RDF) for TMA data were designed and implemented by (1) formulating the syntactic and semantic structures of the hypotheses derived from TMA experiments, (2) formulating SPARQL queries to reflect the semantic structures of the hypotheses, and (3) performing statistical tests with the result sets returned by the SPARQL queries. Results When a user designs a hypothesis in Xperanto-RDF and submits it, the hypothesis can be tested against TMA experimental data stored in Xperanto-RDF. When we evaluated four previously validated hypotheses as an illustration, all the hypotheses were supported by Xperanto-RDF. Conclusions We demonstrated the utility of high throughput biological hypothesis testing. We believe that such preliminary investigation can be beneficial before performing highly controlled experiments. PMID:21342584
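Xperanto-RDF's ontology and query templates are not shown in the abstract; the general pattern it describes (retrieve hypothesis-relevant groups with a SPARQL query over RDF, then run a statistical test on the result set) might be sketched as below. The namespace, predicates, sample data, and choice of test are invented and are not the Xperanto-RDF schema.

```python
from rdflib import Graph, Literal, Namespace
from scipy.stats import mannwhitneyu

# Generic query-then-test sketch over an invented mini RDF graph.
EX = Namespace("http://example.org/tma#")
g = Graph()
cores = [("s1", "high", 7.2), ("s2", "high", 6.8), ("s3", "high", 7.9),
         ("s4", "low", 3.1), ("s5", "low", 4.0), ("s6", "low", 2.7)]
for core_id, grade, score in cores:
    g.add((EX[core_id], EX.tumorGrade, Literal(grade)))
    g.add((EX[core_id], EX.stainingScore, Literal(score)))

QUERY = """
PREFIX ex: <http://example.org/tma#>
SELECT ?grade ?score WHERE { ?s ex:tumorGrade ?grade ; ex:stainingScore ?score . }
"""
rows = list(g.query(QUERY))
high = [float(r.score) for r in rows if str(r.grade) == "high"]
low  = [float(r.score) for r in rows if str(r.grade) == "low"]

# Hypothesis: staining score differs between high- and low-grade cores.
stat, p = mannwhitneyu(high, low, alternative="two-sided")
print(f"U = {stat}, p = {p:.3f}")
```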
Phase 1 Space Fission Propulsion Energy Source Design
NASA Technical Reports Server (NTRS)
Houts, Mike; VanDyke, Melissa; Godfroy, Tom; Pedersen, Kevin; Martin, James; Dickens, Ricky; Salvail, Pat; Hrbud, Ivana; Carter, Robert; Rodgers, Stephen L. (Technical Monitor)
2002-01-01
Fission technology can enable rapid, affordable access to any point in the solar system. If fission propulsion systems are to be developed to their full potential, however, near-term customers must be identified and initial fission systems successfully developed, launched, and operated. Studies conducted in fiscal year 2001 (IISTP, 2001) show that fission electric propulsion (FEP) systems with a specific mass at or below 50 kg/kWjet could enhance or enable numerous robotic outer solar system missions of interest. At the required specific mass, it is possible to develop safe, affordable systems that meet mission requirements. To help select the system design to pursue, eight evaluation criteria were identified: system integration, safety, reliability, testability, specific mass, cost, schedule, and programmatic risk. A top-level comparison of four potential concepts was performed: a Testable, Passive, Redundant Reactor (TPRR), a Testable Multi-Cell In-Core Thermionic Reactor (TMCT), a Direct Gas Cooled Reactor (DGCR), and a Pumped Liquid Metal Reactor (PLMR). Development of any of the four systems appears feasible. However, for power levels up to at least 500 kWt (enabling electric power levels of 125-175 kWe, given 25-35% power conversion efficiency) the TPRR has advantages related to several criteria and is competitive with respect to all. Hardware-based research and development has further increased confidence in the TPRR approach. Successful development and utilization of a "Phase I" fission electric propulsion system will enable advanced Phase 2 and Phase 3 systems capable of providing rapid, affordable access to any point in the solar system.
Pediatric Amblyopia Risk Investigation Study (PARIS).
Savage, Howard I; Lee, Hester H; Zaetta, Deneen; Olszowy, Ronald; Hamburger, Ellie; Weissman, Mark; Frick, Kevin
2005-12-01
To assess the learning curve, testability, and reliability of vision screening modalities administered by pediatric health extenders. Prospective masked clinical trial. Two hundred subjects aged 3 to 6 underwent timed screening for amblyopia by physician extenders, including LEA visual acuity (LEA), stereopsis (RDE), and noncycloplegic autorefraction (NCAR). Patients returned for a comprehensive diagnostic eye examination performed by an ophthalmologist or optometrist. Average screening time was 5.4 +/- 1.6 minutes (LEA), 1.9 +/- 0.9 minutes (RDE), and 1.7 +/- 1.0 minutes (NCAR). Test time for NCAR and RDE fell by 40% during the study period. Overall testability was 92% (LEA), 96% (RDE), and 94% (NCAR). Testability among 3-year-olds was 73% (LEA), 96% (RDE), and 89% (NCAR). Reliability of LEA was moderate (r = .59). Reliability of NCAR was high for astigmatism (Cyl) (r = .89), moderate for spherical equivalent (SE) (r = .66), and low for anisometropia (ANISO) (r = .38). Correlation of cycloplegic autorefraction (CAR) with gold standard cycloplegic retinoscopic refraction (CRR) was very high for SE (.85), CYL (.77), and moderate for ANISO (.48). With NCAR, physician extenders can quickly and reliably detect astigmatism and spherical refractive error in one-third the time it takes to obtain visual acuity. LEA has a lower initial cost, but is time consuming, moderately reliable, and more difficult for 3-year-olds. Shorter examination time and higher reliability may make NCAR a more efficient screening tool for refractive amblyopia in younger children. Future study is needed to determine the sensitivity and specificity of NCAR and other screening methods in detecting amblyopia and amblyopia risk factors.
Keenan, Kevin G; Valero-Cuevas, Francisco J
2007-09-01
Computational models of motor-unit populations are the objective implementations of the hypothesized mechanisms by which neural and muscle properties give rise to electromyograms (EMGs) and force. However, the variability/uncertainty of the parameters used in these models--and how they affect predictions--confounds assessing these hypothesized mechanisms. We perform a large-scale computational sensitivity analysis on the state-of-the-art computational model of surface EMG, force, and force variability by combining a comprehensive review of published experimental data with Monte Carlo simulations. To exhaustively explore model performance and robustness, we ran numerous iterative simulations each using a random set of values for nine commonly measured motor neuron and muscle parameters. Parameter values were sampled across their reported experimental ranges. Convergence after 439 simulations found that only 3 simulations met our two fitness criteria: approximating the well-established experimental relations for the scaling of EMG amplitude and force variability with mean force. An additional 424 simulations preferentially sampling the neighborhood of those 3 valid simulations converged to reveal 65 additional sets of parameter values for which the model predictions approximate the experimentally known relations. We find the model is not sensitive to muscle properties but very sensitive to several motor neuron properties--especially peak discharge rates and recruitment ranges. Therefore to advance our understanding of EMG and muscle force, it is critical to evaluate the hypothesized neural mechanisms as implemented in today's state-of-the-art models of motor unit function. We discuss experimental and analytical avenues to do so as well as new features that may be added in future implementations of motor-unit models to improve their experimental validity.
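The motor-unit population model itself is far too large to show here; the sampling-and-filtering loop the abstract describes (draw parameter sets uniformly across reported experimental ranges, keep only those whose predictions satisfy the fitness criteria) can be sketched schematically. The parameter names, ranges, surrogate model, and acceptance criteria below are placeholders, not those of the study.

```python
import random

# Schematic Monte Carlo screening loop; everything numeric here is a placeholder.
RANGES = {
    "peak_discharge_rate_hz": (20.0, 50.0),
    "recruitment_range_fold": (10.0, 120.0),
    "num_motor_units":        (120.0, 300.0),
}

def sample_params():
    """Draw one parameter set uniformly across the assumed experimental ranges."""
    return {name: random.uniform(lo, hi) for name, (lo, hi) in RANGES.items()}

def surrogate_model(p):
    """Stand-in for the full model: crude EMG-force and force-variability slopes."""
    emg_slope = p["peak_discharge_rate_hz"] / 35.0
    cv_slope = 25.0 / p["recruitment_range_fold"]
    return emg_slope, cv_slope

def meets_fitness_criteria(emg_slope, cv_slope):
    # Placeholder for the experimentally known scaling of EMG amplitude and
    # force variability with mean force.
    return 0.8 <= emg_slope <= 1.2 and 0.2 <= cv_slope <= 0.5

random.seed(1)
valid = [p for p in (sample_params() for _ in range(1000))
         if meets_fitness_criteria(*surrogate_model(p))]
print(f"{len(valid)} of 1000 sampled parameter sets met the fitness criteria")
```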
Testing the inhibitory cascade model in Mesozoic and Cenozoic mammaliaforms
2013-01-01
Background Much of the current research in the growing field of evolutionary development concerns relating developmental pathways to large-scale patterns of morphological evolution, with developmental constraints on variation, and hence diversity, a field of particular interest. Tooth morphology offers an excellent model system for such ‘evo-devo’ studies, because teeth are well preserved in the fossil record, and are commonly used in phylogenetic analyses and as ecological proxies. Moreover, tooth development is relatively well studied, and has provided several testable hypotheses of developmental influences on macroevolutionary patterns. The recently-described Inhibitory Cascade (IC) Model provides just such a hypothesis for mammalian lower molar evolution. Derived from experimental data, the IC Model suggests that a balance between mesenchymal activators and molar-derived inhibitors determines the size of the immediately posterior molar, predicting firstly that molars either decrease in size along the tooth row, or increase in size, or are all of equal size, and secondly that the second lower molar should occupy one third of lower molar area. Here, we tested the IC Model in a large selection of taxa from diverse extant and fossil mammalian groups, ranging from the Middle Jurassic (~176 to 161 Ma) to the Recent. Results Results show that most taxa (~65%) fell within the predicted areas of the Inhibitory Cascade Model. However, members of several extinct groups fell into the regions where m2 was largest, or rarely, smallest, including the majority of the polyphyletic “condylarths”. Most Mesozoic mammals fell near the centre of the space with equality of size in all three molars. The distribution of taxa was significantly clustered by diet and by phylogenetic group. Conclusions Overall, the IC Model was supported as a plesiomorphic developmental system for Mammalia, suggesting that mammal tooth size has been subjected to this developmental constraint at least since the divergence of australosphenidans and boreosphenidans approximately 180 Ma. Although exceptions exist, including many ‘condylarths’, these are most likely to be secondarily derived states, rather than alternative ancestral developmental models for Mammalia. PMID:23565593
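The two quantitative predictions of the IC Model stated above can be written as a simple check of whether a tooth row falls in the predicted region; the sketch below does so for illustrative molar areas and an arbitrary tolerance on the one-third prediction.

```python
# Check of the two IC Model predictions stated above: molar areas change
# monotonically (or are equal) along the row, and m2 occupies roughly one third
# of total lower molar area. Example areas and the tolerance are invented.
def fits_ic_model(m1, m2, m3, tol=0.05):
    monotonic = (m1 >= m2 >= m3) or (m1 <= m2 <= m3)
    m2_share = m2 / (m1 + m2 + m3)
    return monotonic and abs(m2_share - 1.0 / 3.0) <= tol, round(m2_share, 3)

print(fits_ic_model(12.0, 10.5, 9.0))   # decreasing row, m2 share ~1/3 -> (True, 0.333)
print(fits_ic_model(8.0, 14.0, 8.0))    # m2 largest: outside the predicted space -> (False, 0.467)
```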
Iron isotope composition of depleted MORB
NASA Astrophysics Data System (ADS)
Labidi, J.; Sio, C. K. I.; Shahar, A.
2015-12-01
In terrestrial basalts, iron isotope ratios are observed to weakly fractionate as a function of olivine and pyroxene crystallization. However, a ~0.1‰ difference between chondrites and MORB had been reported (Dauphas et al. 2009, Teng et al. 2013 and ref. therein). This observation could illustrate an isotope fractionation occurring during partial melting, as a function of the Fe valence in melt versus crystals. Here, we present high-precision Fe isotopic data measured by MC-ICP-MS on well-characterized samples from the Pacific-Antarctic Ridge (PAR, n=9) and from the Garrett Transform Fault (n=8). These samples allow exploring the Fe isotope fractionation between melt and magnetite, and the role of partial melting on Fe isotope fractionation. Our average δ56Fe value is +0.095±0.013‰ (95% confidence, n=17), indistinguishable from a previous estimate of +0.105±0.006‰ (95% confidence, n=43, see ref. 2). Our δ56Fe values correlate weakly with MgO contents, and correlate positively with K/Ti ratios. PAC1 DR10 shows the largest Ti and Fe depletion after titanomagnetite fractionation, with a δ56Fe value of +0.076±0.036‰. This is ~0.05‰ below other samples at a given MgO. This may illustrate a significant Fe isotope fractionation between the melt and titanomagnetite, in agreement with experimental determination (Shahar et al. 2008). GN09-02, the most incompatible-element depleted sample, has a δ56Fe value of 0.037±0.020‰. This is the lowest high-precision δ56Fe value recorded for a MORB worldwide. This basalt displays an incompatible-element depletion consistent with re-melting beneath the transform fault of mantle source that was depleted during a first melting event, beneath the ridge axis (Wendt et al. 1999). The Fe isotope observation could indicate that its mantle source underwent 56Fe depletion after a first melting event. It could alternatively indicate a lower Fe isotope fractionation during re-melting, if the source was depleted of its Fe3+, likely producing a relatively reduced melt. These hypotheses are testable, and will be discussed in detail at the conference.