Vesselinova, Neda; Alexandrov, Boian; Wall, Michael E.
2016-11-08
We present a dynamical model of drug accumulation in bacteria. The model captures key features in experimental time courses on ofloxacin accumulation: initial uptake; two-phase response; and long-term acclimation. In combination with experimental data, the model provides estimates of import and export rates in each phase, the time of entry into the second phase, and the decrease of internal drug during acclimation. Global sensitivity analysis, local sensitivity analysis, and Bayesian sensitivity analysis of the model provide information about the robustness of these estimates, and about the relative importance of different parameters in determining the features of the accumulation time courses in three different bacterial species: Escherichia coli, Staphylococcus aureus, and Pseudomonas aeruginosa. The results lead to experimentally testable predictions of the effects of membrane permeability, drug efflux and trapping (e.g., by DNA binding) on drug accumulation. A key prediction is that a sudden increase in ofloxacin accumulation in both E. coli and S. aureus is accompanied by a decrease in membrane permeability.
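The two-phase behavior described here is easy to illustrate with a piecewise first-order uptake/efflux equation. The sketch below is our illustration, not the paper's fitted model; the rate constants, switch time, and function names are invented:

```python
# Minimal sketch of a two-phase accumulation model: first-order import and
# export with rate constants that switch at the phase-entry time. All
# parameter values are invented for illustration.
import numpy as np
from scipy.integrate import solve_ivp

K_IN = (1.0, 3.0)    # import rates in phase 1 and phase 2 (a.u./min)
K_OUT = (0.5, 0.2)   # export rates in phase 1 and phase 2 (1/min)
T_SWITCH = 30.0      # time of entry into the second phase (min)

def accumulation(t, c):
    phase = 0 if t < T_SWITCH else 1
    return [K_IN[phase] - K_OUT[phase] * c[0]]

sol = solve_ivp(accumulation, (0.0, 120.0), [0.0],
                t_eval=np.linspace(0.0, 120.0, 5))
for t, c in zip(sol.t, sol.y[0]):
    print(f"t = {t:5.1f} min: internal drug = {c:5.2f} a.u.")
```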
Hu, Xiao-Min; Chen, Jiang-Shan; Liu, Bi-Heng; Guo, Yu; Huang, Yun-Feng; Zhou, Zong-Quan; Han, Yong-Jian; Li, Chuan-Feng; Guo, Guang-Can
2016-10-21
The physical impact and the testability of the Kochen-Specker (KS) theorem are debated because perfect compatibility in a single quantum system cannot be achieved in practical experiments with finite precision. Here, we follow the proposal of A. Cabello and M. T. Cunha [Phys. Rev. Lett. 106, 190401 (2011)] and present a compatibility-loophole-free experimental violation of an inequality of noncontextual theories by two spatially separated entangled qutrits. A maximally entangled qutrit-qutrit state with a fidelity as high as 0.975±0.001 is prepared and distributed to separated spaces, and these two photons are then measured locally, satisfying the compatibility requirement. The results show that the inequality for noncontextual theories is violated by 31 standard deviations. Our experiments pave the way to closing the debate about the testability of the KS theorem. In addition, the method for generating high-fidelity, high-dimensional entangled states will provide significant advantages in high-dimensional quantum encoding and quantum communication.
1981-03-31
logic testing element and a concomitant testability criterion ideally suited to dynamic circuit applications and appropriate for automatic computer...making connections automatically. PF is an experimental feature which provides users with only four different chip sizes (full, half, quarter, and eighth...an initial solution is found constructively, which is improved by pair-wise swapping. Results show, however, that the constructive initial sorter, which
Current challenges in fundamental physics
NASA Astrophysics Data System (ADS)
Egana Ugrinovic, Daniel
The discovery of the Higgs boson at the Large Hadron Collider completed the Standard Model of particle physics. The Standard Model is a remarkably successful theory of fundamental physics, but it suffers from severe problems. It does not provide an explanation for the origin or stability of the electroweak scale nor for the origin and structure of flavor and CP violation. It predicts vanishing neutrino masses, in disagreement with experimental observations. It also fails to explain the matter-antimatter asymmetry of the universe, and it does not provide a particle candidate for dark matter. In this thesis we provide experimentally testable solutions for most of these problems and we study their phenomenology.
Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool
NASA Technical Reports Server (NTRS)
Maul, William A.; Fulton, Christopher E.
2011-01-01
This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool proved to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation, also provided with the ETA Tool software release package, that were used to generate the reports presented in the manual.
Extended Testability Analysis Tool
NASA Technical Reports Server (NTRS)
Melcher, Kevin; Maul, William A.; Fulton, Christopher
2012-01-01
The Extended Testability Analysis (ETA) Tool is a software application that supports fault management (FM) by performing testability analyses on the fault propagation model of a given system. Fault management includes the prevention of faults through robust design margins and quality assurance methods, or the mitigation of system failures. Fault management requires an understanding of the system design and operation, potential failure mechanisms within the system, and the propagation of those potential failures through the system. The purpose of the ETA Tool software is to process the testability analysis results from a commercial software program called TEAMS Designer in order to provide a detailed set of diagnostic assessment reports. The ETA Tool is a command-line process with several user-selectable report output options. The ETA Tool also extends the COTS testability analysis and enables variation studies with sensor sensitivity impacts on system diagnostics and component isolation using a single testability output. The ETA Tool can also provide extended analyses from a single set of testability output files. The following analysis reports are available to the user: (1) the Detectability Report provides a breakdown of how each tested failure mode was detected, (2) the Test Utilization Report identifies all the failure modes that each test detects, (3) the Failure Mode Isolation Report demonstrates the system's ability to discriminate between failure modes, (4) the Component Isolation Report demonstrates the system's ability to discriminate between failure modes relative to the components containing the failure modes, (5) the Sensor Sensitivity Analysis Report shows the diagnostic impact due to loss of sensor information, and (6) the Effect Mapping Report identifies failure modes that result in specified system-level effects.
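Analyses of this kind ultimately reduce to operations on a fault-to-test dependency matrix. A toy sketch of how detectability coverage and ambiguity groups fall out of such a matrix follows; it is not the ETA Tool's or TEAMS Designer's code, and the failure modes, tests, and matrix entries are invented:

```python
# Toy fault-to-test dependency matrix ("D-matrix") analysis illustrating the
# kind of detectability and isolation metrics described above. This is not
# the ETA Tool's code; the failure modes, tests, and matrix are invented.
D = {
    "valve_stuck":   (1, 0, 1),
    "sensor_bias":   (1, 0, 1),   # same signature as valve_stuck: not isolable
    "pump_degraded": (0, 1, 0),
    "line_blockage": (0, 0, 0),   # detected by no test
}

detected = [fault for fault, sig in D.items() if any(sig)]
print(f"detectability: {len(detected)}/{len(D)} failure modes detected")

# Failure modes sharing an identical test signature form an ambiguity group:
# the available tests cannot discriminate between them.
groups = {}
for fault, sig in D.items():
    groups.setdefault(sig, []).append(fault)
for sig, faults in groups.items():
    if len(faults) > 1:
        print("ambiguity group:", ", ".join(faults))
```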
Zhou, Shaona; Han, Jing; Koenig, Kathleen; Raplinger, Amy; Pi, Yuan; Li, Dan; Xiao, Hua; Fu, Zhao; Bao, Lei
2016-03-01
Scientific reasoning is an important component of the cognitive strand of 21st-century skills and is highly emphasized in the new science education standards. This study focuses on the assessment of student reasoning in control of variables (COV), which is a core sub-skill of scientific reasoning. The main research question is to investigate the extent to which the existence of experimental data in questions impacts student reasoning and performance. This study also explores the effects of task contexts on student reasoning as well as students' abilities to distinguish between testability and causal influences of variables in COV experiments. Data were collected with students from both the USA and China. Students randomly received one of two test versions, one with experimental data and one without. The results show that students from both populations (1) perform better when experimental data are not provided, (2) perform better in physics contexts than in real-life contexts, and (3) tend to equate non-influential variables with non-testable variables. In addition, based on the analysis of both quantitative and qualitative data, a possible progression of developmental levels of student reasoning in control of variables is proposed, which can be used to inform future development of assessment and instruction.
Testability of evolutionary game dynamics based on experimental economics data
NASA Astrophysics Data System (ADS)
Wang, Yijia; Chen, Xiaojie; Wang, Zhijian
2017-11-01
Understanding the dynamic processes of a real game system requires an appropriate dynamics model, and rigorously testing a dynamics model is nontrivial. In our methodological research, we develop an approach to testing the validity of game dynamics models that considers the dynamic patterns of angular momentum and speed as measurement variables. Using Rock-Paper-Scissors (RPS) games as an example, we illustrate the geometric patterns in the experimental data. We then derive the related theoretical patterns from a series of typical dynamics models. By testing the goodness-of-fit between the experimental and theoretical patterns, we show that the validity of these models can be evaluated quantitatively. Our approach establishes a link between dynamics models and experimental systems, which is, to the best of our knowledge, the most effective and rigorous strategy for ascertaining the testability of evolutionary game dynamics models.
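The cycling that such measurement variables capture can be illustrated with a generic estimator: project the three-strategy population state into the plane and average the signed cross product of position about the centroid with displacement. The projection, synthetic trajectory, and function names below are our illustration, not the paper's estimator:

```python
# Sketch of an angular-momentum-style statistic for cyclic motion in the
# RPS strategy simplex. Illustrative only.
import numpy as np

def simplex_to_plane(x):
    """Map (x_R, x_P, x_S), summing to 1, onto 2-D coordinates."""
    x = np.asarray(x, dtype=float)
    return np.stack([x[..., 1] + 0.5 * x[..., 2],
                     (np.sqrt(3) / 2) * x[..., 2]], axis=-1)

def mean_angular_momentum(states):
    pts = simplex_to_plane(states)
    r = pts - simplex_to_plane([1 / 3, 1 / 3, 1 / 3])
    dr = np.diff(pts, axis=0)
    # z-component of r x dr; its sign gives the cycling direction
    return np.mean(r[:-1, 0] * dr[:, 1] - r[:-1, 1] * dr[:, 0])

# Synthetic trajectory cycling around the simplex centroid.
t = np.linspace(0, 8 * np.pi, 400)
states = np.stack([1 / 3 + 0.1 * np.cos(t),
                   1 / 3 + 0.1 * np.cos(t - 2 * np.pi / 3),
                   1 / 3 + 0.1 * np.cos(t + 2 * np.pi / 3)], axis=1)
print(f"mean angular momentum: {mean_angular_momentum(states):+.3e}")
```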
Models of cooperative dynamics from biomolecules to magnets
NASA Astrophysics Data System (ADS)
Mobley, David Lowell
This work details the application of computer models to several biological systems (prion diseases and Alzheimer's disease) and a magnetic system. These share some common themes, which are discussed. Here, simple lattice-based models are applied to aggregation of misfolded protein in prion diseases like Mad Cow disease. These can explain key features of the diseases. The modeling is based on aggregation being essential in establishing the time course of infectivity. Growth of initial aggregates is assumed to dominate the experimentally observed lag phase. Subsequent fission, regrowth, and fission set apart the exponential doubling phase in disease progression. We explore several possible modes of growth for 2-D aggregates and suggest the model that best explains the experimental data. We develop testable predictions from this model. Like prion disease, Alzheimer's disease (AD) is an amyloid disease characterized by large aggregates in the brain. However, evidence increasingly points away from these as the toxic agent and towards oligomers of the Abeta peptide. We explore one possible toxicity mechanism: insertion of Abeta into cell membranes and formation of harmful ion channels. We find that mutations in this peptide which cause familial Alzheimer's disease (FAD) also affect the insertion of this peptide into membranes in a fairly consistent way, suggesting that this toxicity mechanism may be relevant biologically. We find a particular inserted configuration which may be especially harmful and develop testable predictions to verify whether or not this is the case. Nucleation is an essential feature of our models for prion disease, in that it protects normal, healthy individuals from getting prion disease. Nucleation is important in many other areas, and we modify our lattice-based nucleation model to apply to a hysteretic magnetic system where nucleation has been suggested to be important. From a simple model, we find qualitative agreement with experiment and make testable experimental predictions concerning the time dependence and temperature dependence of the major hysteresis loop and reversal curves, which have been experimentally verified. We argue why this model may be suitable for systems like these and explain implications for Ising-like models. We suggest implications for future modeling work. Finally, we present suggestions for future work in all three areas.
The dynamics of hurricane balls
NASA Astrophysics Data System (ADS)
Andersen, W. L.; Werner, Steven
2015-09-01
We examine the theory of the hurricane balls toy. This toy consists of two steel balls, welded together, that are sent spinning on a horizontal surface somewhat like a top. Unlike a top, at high frequency the symmetry axis approaches a limiting inclination that is not perpendicular to the surface. We calculate (and experimentally verify) the limiting inclinations for three toy geometries. We find that at high frequencies, hurricane balls provide an easily realized and testable example of the Poinsot theory of freely rotating symmetrical bodies.
Artificial Intelligence Applications to Testability.
1984-10-01
general software assistant; examining testability utilization of it should wait a few years until the software assistant is a well-defined product...ago. It provides a single host which satisfies the needs of developers, product developers, and end users. As shown in table 5.10-2, it also provides...follows a trend towards more user-oriented design approaches to interactive computer systems. The implicit goal in this trend is the
Trevors, J T
2010-06-01
Methods to research the origin of microbial life are limited. However, microorganisms were the first organisms on the Earth capable of cell growth and division, and of interactions with their environment, other microbial cells, and eventually diverse eukaryotic organisms. The origin of microbial life and the supporting scientific evidence are both an enigma and a scientific priority. Numerous hypotheses have been proposed, scenarios imagined, speculations presented in papers, insights shared, and assumptions made without supporting experimentation, which has led to limited progress in understanding the origin of microbial life. The use of the human imagination to envision origin-of-life events, without the supporting experimentation, observation, and independently replicated experiments required for science, is a significant constraint. The challenge remains how to better understand the origin of microbial life using observations and experimental methods as opposed to speculation, assumptions, scenarios, envisioned events, and un-testable hypotheses. This is not an easy challenge, as experimental design and plausible hypothesis testing are difficult. Since past approaches have been inconclusive in providing evidence for the mechanisms of the origin of microbial life and the manner in which genetic instructions were encoded into DNA/RNA, it is reasonable and logical to propose that progress will be made when testable, plausible hypotheses and methods are used in origin-of-microbial-life research, and the experimental observations are, or are not, reproduced in independent laboratories. These perspectives are discussed in this article, as well as the possibility that a pre-biotic film preceded a microbial biofilm as a possible micro-location for the origin of microbial cells capable of growth and division.
The use of models to predict potential contamination aboard orbital vehicles
NASA Technical Reports Server (NTRS)
Boraas, Martin E.; Seale, Dianne B.
1989-01-01
A model of fungal growth on air-exposed, nonnutritive solid surfaces, developed for use aboard orbital vehicles, is presented. A unique feature of this testable model is that a developing fungal mycelium can facilitate its own growth by condensing water vapor from its environment directly onto fungal hyphae. The fungal growth rate is limited by the rate of supply of volatile nutrients, and fungal biomass is limited by either the supply of nonvolatile nutrients or metabolic loss processes. The model discussed is structurally simple, but its dynamics can be quite complex. Biofilm accumulation can vary from a simple linear increase to sustained exponential growth, depending on the values of the environmental variables and model parameters. The results of the model are consistent with data from aquatic biofilm studies, insofar as the two types of systems are comparable. It is shown that the model presented is experimentally testable and provides a platform for the interpretation of observational data that may be directly relevant to the question of growth of organisms aboard the proposed Space Station.
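The linear-versus-exponential regimes can be made concrete with a caricature of supply-limited growth; the rate law and all constants below are invented for illustration and are not the paper's model:

```python
# Toy supply-limited surface-growth law: uptake is proportional to biomass
# while volatile supply is plentiful (exponential phase) and capped by the
# supply rate otherwise (linear phase). All constants are invented.
import numpy as np
from scipy.integrate import solve_ivp

S_VOL = 2.0   # volatile-nutrient supply rate (a.u./day)
MU = 0.8      # intrinsic growth rate (1/day)
LOSS = 0.05   # metabolic loss rate (1/day)

def biofilm(t, y):
    uptake = min(MU * y[0], S_VOL)
    return [uptake - LOSS * y[0]]

sol = solve_ivp(biofilm, (0.0, 60.0), [0.01], t_eval=np.linspace(0.0, 60.0, 7))
for t, b in zip(sol.t, sol.y[0]):
    print(f"day {t:4.1f}: biomass {b:7.3f} a.u.")
```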
Advanced Diagnostic and Prognostic Testbed (ADAPT) Testability Analysis Report
NASA Technical Reports Server (NTRS)
Ossenfort, John
2008-01-01
As system designs become more complex, determining the best locations to add sensors and test points for the purpose of testing and monitoring these designs becomes more difficult. Not only must the designer take into consideration all real and potential faults of the system, he or she must also find efficient ways of detecting and isolating those faults. Because sensors and cabling take up valuable space and weight on a system, and given constraints on bandwidth and power, it is even more difficult to add sensors into these complex designs after the design has been completed. As a result, a number of software tools have been developed to assist the system designer in proper placement of these sensors during the system design phase of a project. One of the key functions provided by many of these software programs is a testability analysis of the system: essentially, an evaluation of how observable the system's behavior is using the available tests. During the design phase, testability metrics can help guide the designer in improving the inherent testability of the design. This may include adding, removing, or modifying tests; breaking up feedback loops; or changing the system to reduce fault propagation. Given a set of test requirements, the analysis can also help to verify that the system will meet those requirements. Of course, a testability analysis requires that a software model of the physical system be available. For the analysis to be most effective in guiding system design, this model should ideally be constructed in parallel with these efforts. The purpose of this paper is to present the final testability results of the Advanced Diagnostic and Prognostic Testbed (ADAPT) after the system model was completed. The tool chosen to build the model and to perform the testability analysis is the Testability Engineering and Maintenance System Designer (TEAMS-Designer). The TEAMS toolset is intended to be a solution spanning all phases of the system, from design and development through health management and maintenance. TEAMS-Designer is the model-building and testability analysis software in that suite.
NASA Technical Reports Server (NTRS)
Chen, Chung-Hsing
1992-01-01
In this thesis, a behavioral-level testability analysis approach is presented. This approach is based on analyzing the circuit behavioral description (similar to a C program) to estimate its testability by identifying controllable and observable circuit nodes. This information can be used by a test generator to gain better access to internal circuit nodes and to reduce its search space. The results of the testability analyzer can also be used to select test points or partial-scan flip-flops in the early design phase. Based on selection criteria, a novel Synthesis for Testability approach called Test Statement Insertion (TSI) is proposed, which modifies the circuit behavioral description directly. Test Statement Insertion can also be used to modify the circuit structural description to improve its testability. As a result, the Synthesis for Testability methodology can be combined with an existing behavioral synthesis tool to produce more testable circuits.
Module generation for self-testing integrated systems
NASA Astrophysics Data System (ADS)
Vanriessen, Ronald Pieter
Hardware used for self test in VLSI (Very Large Scale Integrated) systems is reviewed, and an architecture to control the test hardware in an integrated system is presented. Because of the increase in test times, the use of self-test techniques has become practically and economically viable for VLSI systems. Besides the reduction in test times and costs, self test also provides testing at operational speeds. Therefore, a suitable combination of scan-path and macro-specific (self) tests is required to reduce test times and costs. An expert system that can be used in a silicon compilation environment is presented. The approach requires a minimum of testability knowledge from a system designer. A user-friendly interface is described for specifying and modifying testability requirements by a testability expert. A reason-directed backtracking mechanism is used to solve selection failures. Both the hierarchical testable architecture and the design-for-testability expert system are used in a self-test compiler. The definition of a self-test compiler is given: a self-test compiler is a software tool that selects an appropriate test method for every macro in a design. The hardware to control a macro test is included in the design automatically. As an example, the integration of the self-test compiler in the silicon compilation system PIRAMID is described. The design of a demonstrator circuit by the self-test compiler is also described. This circuit consists of two self-testable macros. Control of the self-test hardware is carried out via the test access port of the boundary scan standard.
A simple theoretical framework for understanding heterogeneous differentiation of CD4+ T cells
2012-01-01
Background CD4+ T cells have several subsets of functional phenotypes, which play critical yet diverse roles in the immune system. Pathogen-driven differentiation of these subsets of cells is often heterogeneous in terms of the induced phenotypic diversity. In vitro recapitulation of heterogeneous differentiation under homogeneous experimental conditions indicates some highly regulated mechanisms by which multiple phenotypes of CD4+ T cells can be generated from a single population of naïve CD4+ T cells. Therefore, conceptual understanding of induced heterogeneous differentiation will shed light on the mechanisms controlling the response of populations of CD4+ T cells under physiological conditions. Results We present a simple theoretical framework to show how heterogeneous differentiation in a two-master-regulator paradigm can be governed by a signaling network motif common to all subsets of CD4+ T cells. With this motif, a population of naïve CD4+ T cells can integrate the signals from their environment to generate a functionally diverse population with robust commitment of individual cells. Notably, two positive feedback loops in this network motif govern three bistable switches, which in turn, give rise to three types of heterogeneous differentiated states, depending upon particular combinations of input signals. We provide three prototype models illustrating how to use this framework to explain experimental observations and make specific testable predictions. Conclusions The process in which several types of T helper cells are generated simultaneously to mount complex immune responses upon pathogenic challenges can be highly regulated, and a simple signaling network motif can be responsible for generating all possible types of heterogeneous populations with respect to a pair of master regulators controlling CD4+ T cell differentiation. The framework provides a mathematical basis for understanding the decision-making mechanisms of CD4+ T cells, and it can be helpful for interpreting experimental results. Mathematical models based on the framework make specific testable predictions that may improve our understanding of this differentiation system.
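One ingredient of such a motif, mutual inhibition between the two master regulators, already produces the bistable commitment described above. A minimal sketch with invented parameters follows (the paper's full motif also includes auto-activation loops and signal inputs):

```python
# Classic mutual-inhibition toggle switch: two master regulators repress
# each other, yielding two stable fates. Parameters are invented; this is
# one ingredient of the full motif, not the paper's model.
from scipy.integrate import solve_ivp

def toggle(t, z, s=0.05, b=2.0, n=4):
    x, y = z
    dx = s + b / (1 + y**n) - x   # X production repressed by Y
    dy = s + b / (1 + x**n) - y   # Y production repressed by X
    return [dx, dy]

# Nearly identical initial conditions commit to opposite fates.
for x0, y0 in [(1.05, 1.00), (1.00, 1.05)]:
    sol = solve_ivp(toggle, (0.0, 100.0), [x0, y0])
    x, y = sol.y[:, -1]
    print(f"start ({x0:.2f}, {y0:.2f}) -> X = {x:.2f}, Y = {y:.2f}")
```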
Linking Microbiota to Human Diseases: A Systems Biology Perspective.
Wu, Hao; Tremaroli, Valentina; Bäckhed, Fredrik
2015-12-01
The human gut microbiota encompasses a densely populated ecosystem that provides essential functions for host development, immune maturation, and metabolism. Alterations to the gut microbiota have been observed in numerous diseases, including human metabolic diseases such as obesity, type 2 diabetes (T2D), and irritable bowel syndrome, and some animal experiments have suggested causality. However, few studies have validated causality in humans and the underlying mechanisms remain largely to be elucidated. We discuss how systems biology approaches combined with new experimental technologies may disentangle some of the mechanistic details in the complex interactions of diet, microbiota, and host metabolism and may provide testable hypotheses for advancing our current understanding of human-microbiota interaction.
From Cookbook to Experimental Design
ERIC Educational Resources Information Center
Flannagan, Jenny Sue; McMillan, Rachel
2009-01-01
Developing expertise, whether from cook to chef or from student to scientist, occurs over time and requires encouragement, guidance, and support. One key goal of an elementary science program should be to move students toward expertise in their ability to design investigative questions. The ability to design a testable question is difficult for…
Links between Parents' Epistemological Stance and Children's Evidence Talk
ERIC Educational Resources Information Center
Luce, Megan R.; Callanan, Maureen A.; Smilovic, Sarah
2013-01-01
Recent experimental research highlights young children's selectivity in learning from others. Little is known, however, about the patterns of information that children actually encounter in conversations with adults. This study investigated variation in parents' tendency to focus on testable evidence as a way to answer science-related questions…
QUANTITATIVE TESTS OF ELMS AS INTERMEDIATE N PEELING-BALLOONING MODES
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lao, L.L.; Snyder, P.B.; Leonard, A.W.
2003-03-01
Several testable features of the working model of edge localized modes (ELMs) as intermediate toroidal mode number peeling-ballooning modes are evaluated quantitatively using DIII-D and JT-60U experimental data and the ELITE MHD stability code. These include the hypotheses that ELM sizes are related to the radial widths of the unstable MHD modes, that the unstable modes have a strong ballooning character localized in the outboard bad-curvature region, and that ELM size generally becomes smaller at high edge collisionality. ELMs are triggered when the growth rates of the unstable MHD modes become significantly large. These testable features are consistent with many ELM observations in DIII-D and JT-60U discharges.
Lee, Joy L; DeCamp, Matthew; Dredze, Mark; Chisolm, Margaret S; Berger, Zackary D
2014-10-15
Twitter is home to many health professionals who send messages about a variety of health-related topics. Amid concerns about physicians posting inappropriate content online, more in-depth knowledge about these messages is needed to understand health professionals' behavior on Twitter. Our goal was to characterize the content of Twitter messages, specifically focusing on health professionals and their tweets relating to health. We performed an in-depth content analysis of 700 tweets. Qualitative content analysis was conducted on tweets by health users on Twitter. The primary objective was to describe the general type of content (ie, health-related versus non-health related) on Twitter authored by health professionals and further to describe health-related tweets on the basis of the type of statement made. Specific attention was given to whether a tweet was personal (as opposed to professional) or made a claim that users would expect to be supported by some level of medical evidence (ie, a "testable" claim). A secondary objective was to compare content types among different users, including patients, physicians, nurses, health care organizations, and others. Health-related users are posting a wide range of content on Twitter. Among health-related tweets, 53.2% (184/346) contained a testable claim. Of health-related tweets by providers, 17.6% (61/346) were personal in nature; 61% (59/96) made testable statements. While organizations and businesses use Twitter to promote their services and products, patient advocates are using this tool to share their personal experiences with health. Twitter users in health-related fields tweet about both testable claims and personal experiences. Future work should assess the relationship between testable tweets and the actual level of evidence supporting them, including how Twitter users-especially patients-interpret the content of tweets posted by health providers.
Phenoscape: Identifying Candidate Genes for Evolutionary Phenotypes
Edmunds, Richard C.; Su, Baofeng; Balhoff, James P.; Eames, B. Frank; Dahdul, Wasila M.; Lapp, Hilmar; Lundberg, John G.; Vision, Todd J.; Dunham, Rex A.; Mabee, Paula M.; Westerfield, Monte
2016-01-01
Phenotypes resulting from mutations in genetic model organisms can help reveal candidate genes for evolutionarily important phenotypic changes in related taxa. Although testing candidate gene hypotheses experimentally in nonmodel organisms is typically difficult, ontology-driven information systems can help generate testable hypotheses about developmental processes in experimentally tractable organisms. Here, we tested candidate gene hypotheses suggested by expert use of the Phenoscape Knowledgebase, specifically looking for genes that are candidates responsible for evolutionarily interesting phenotypes in the ostariophysan fishes that bear resemblance to mutant phenotypes in zebrafish. For this, we searched ZFIN for genetic perturbations that result in either loss of basihyal element or loss of scales phenotypes, because these are the ancestral phenotypes observed in catfishes (Siluriformes). We tested the identified candidate genes by examining their endogenous expression patterns in the channel catfish, Ictalurus punctatus. The experimental results were consistent with the hypotheses that these features evolved through disruption in developmental pathways at, or upstream of, brpf1 and eda/edar for the ancestral losses of basihyal element and scales, respectively. These results demonstrate that ontological annotations of the phenotypic effects of genetic alterations in model organisms, when aggregated within a knowledgebase, can be used effectively to generate testable, and useful, hypotheses about evolutionary changes in morphology.
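The underlying query pattern, matching a target phenotype against model-organism mutant annotations, can be shown with a toy dictionary. The gene symbols follow the abstract, but the annotation records below are invented stand-ins for actual ZFIN/Phenoscape data:

```python
# Toy candidate-gene query: find model-organism genes whose mutant
# phenotypes match an evolutionary character state. Annotations are a
# tiny invented subset, not real ZFIN/Phenoscape records.
zfin_phenotypes = {
    "eda":   {"scale absent", "fin reduced"},
    "edar":  {"scale absent"},
    "brpf1": {"basihyal absent", "jaw shape abnormal"},
    "sox9a": {"cartilage reduced"},
}

def candidates(target_phenotype):
    return sorted(g for g, p in zfin_phenotypes.items() if target_phenotype in p)

print("loss of scales   ->", candidates("scale absent"))
print("loss of basihyal ->", candidates("basihyal absent"))
```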
A bioinformatics expert system linking functional data to anatomical outcomes in limb regeneration
Lobo, Daniel; Feldman, Erica B.; Shah, Michelle; Malone, Taylor J.
2014-01-01
Amphibians and molting arthropods have the remarkable capacity to regenerate amputated limbs, as described by an extensive literature of experimental cuts, amputations, grafts, and molecular techniques. Despite a rich history of experimental effort, no comprehensive mechanistic model exists that can account for the pattern regulation observed in these experiments. While bioinformatics algorithms have revolutionized the study of signaling pathways, no such tools have heretofore been available to assist scientists in formulating testable models of large-scale morphogenesis that match published data in the limb regeneration field. Major barriers preventing an algorithmic approach are the lack of formal descriptions for experimental regenerative information and of a repository to centralize storage and mining of functional data on limb regeneration. Establishing a new bioinformatics of shape would significantly accelerate the discovery of key insights into the mechanisms that implement complex regeneration. Here, we describe a novel mathematical ontology for limb regeneration to unambiguously encode phenotype, manipulation, and experiment data. Based on this formalism, we present the first centralized formal database of published limb regeneration experiments together with a user-friendly expert system tool to facilitate its access and mining. These resources are freely available for the community and will assist both human biologists and artificial intelligence systems to discover testable, mechanistic models of limb regeneration.
Eye Examination Testability in Children with Autism and in Typical Peers
Coulter, Rachel Anastasia; Bade, Annette; Tea, Yin; Fecho, Gregory; Amster, Deborah; Jenewein, Erin; Rodena, Jacqueline; Lyons, Kara Kelley; Mitchell, G. Lynn; Quint, Nicole; Dunbar, Sandra; Ricamato, Michele; Trocchio, Jennie; Kabat, Bonnie; Garcia, Chantel; Radik, Irina
2015-01-01
Purpose To compare testability of vision and eye tests in an examination protocol of 9- to 17-year-old patients with autism spectrum disorder (ASD) to typically developing (TD) peers. Methods In a prospective pilot study, 61 children and adolescents (34 with ASD and 27 who were TD) aged 9 to 17 years completed an eye examination protocol including tests of visual acuity, refraction, convergence (eye teaming), stereoacuity (depth perception), ocular motility, and ocular health. Patients who required new refractive correction were retested after wearing their updated spectacle prescription for 1 month. The specialized protocol incorporated visual, sensory, and communication supports. A psychologist determined group status/eligibility using DSM-IV-TR (Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition, Text Revision) criteria by review of previous evaluations and parent responses on the Social Communication Questionnaire. Before the examination, parents provided information regarding patients' sex, race, ethnicity, and, for ASD patients, verbal communication level (nonverbal, uses short words, verbal). Parents indicated whether the patient wore a refractive correction, whether the patient had ever had an eye examination, and the age at the last examination. Chi-square tests compared testability results for TD and ASD groups. Results Typically developing and ASD groups did not differ by age (p = 0.54), sex (p = 0.53), or ethnicity (p = 0.22). Testability was high on most tests (TD, 100%; ASD, 88 to 100%), except for intraocular pressure (IOP), which was reduced for both the ASD (71%) and the TD (89%) patients. Among ASD patients, IOP testability varied greatly with verbal communication level (p < 0.001). Although IOP measurements were completed on all verbal patients, only 37.5% of nonverbal and 44.4% of ASD patients who used short words were successful. Conclusions Patients with ASD can complete most vision and eye tests within an examination protocol. Testability of IOPs is reduced, particularly for nonverbal patients and patients who use short words to communicate.
Delay test generation for synchronous sequential circuits
NASA Astrophysics Data System (ADS)
Devadas, Srinivas
1989-05-01
We address the problem of generating tests for delay faults in non-scan synchronous sequential circuits. Delay test generation for sequential circuits is a considerably more difficult problem than delay testing of combinational circuits and has received much less attention. In this paper, we present a method for generating test sequences to detect delay faults in sequential circuits using the stuck-at fault sequential test generator STALLION. The method is complete in that it will generate a delay test sequence for a targeted fault given sufficient CPU time, if such a sequence exists. We term faults for which no delay test sequence exists, under our test methodology, sequentially delay redundant. We describe means of eliminating sequential delay redundancies in logic circuits. We present a partial-scan methodology for enhancing the testability of difficult-to-test or untestable sequential circuits, wherein a small number of flip-flops are selected and made controllable/observable. The selection process guarantees the elimination of all sequential delay redundancies. We show that an intimate relationship exists between state assignment and delay testability of a sequential machine. We describe a state assignment algorithm for the synthesis of sequential machines with maximal delay fault testability. Preliminary experimental results using the test generation, partial-scan, and synthesis algorithms are presented.
Harris, Jenine K; Erwin, Paul C; Smith, Carson; Brownson, Ross C
2015-01-01
Evidence-based decision making (EBDM) is the process, in local health departments (LHDs) and other settings, of translating the best available scientific evidence into practice. Local health departments are more likely to be successful if they use evidence-based strategies. However, EBDM and use of evidence-based strategies by LHDs are not widespread. Drawing on diffusion of innovations theory, we sought to understand how LHD directors and program managers perceive the relative advantage, compatibility, simplicity, and testability of EBDM. Directors and managers of programs in chronic disease, environmental health, and infectious disease from LHDs nationwide completed a survey including demographic information and questions about diffusion attributes (advantage, compatibility, simplicity, and testability) related to EBDM. Bivariate inferential tests were used to compare responses between directors and managers and to examine associations between participant characteristics and diffusion attributes. Relative advantage and compatibility scores were high for directors and managers, whereas simplicity and testability scores were lower. Although health department directors and managers of programs in chronic disease generally had higher scores than other groups, there were few significant or large differences between directors and managers across the diffusion attributes. Larger jurisdiction population size was associated with higher relative advantage and compatibility scores for both directors and managers. Overall, directors and managers were in strong agreement on the relative advantage of an LHD using EBDM, with directors in stronger agreement than managers. Perceived relative advantage has been demonstrated to be the most important factor in the rate of innovation adoption, suggesting an opportunity for directors to speed EBDM adoption. However, lower average scores across all groups for simplicity and testability may be hindering EBDM adoption. Recommended strategies for increasing perceived EBDM simplicity and testability are provided.
Measurement uncertainty relations: characterising optimal error bounds for qubits
NASA Astrophysics Data System (ADS)
Bullock, T.; Busch, P.
2018-07-01
In standard formulations of the uncertainty principle, two fundamental features are typically cast as impossibility statements: two noncommuting observables cannot in general both be sharply defined (for the same state), nor can they be measured jointly. The pioneers of quantum mechanics were acutely aware and puzzled by this fact, and it motivated Heisenberg to seek a mitigation, which he formulated in his seminal paper of 1927. He provided intuitive arguments to show that the values of, say, the position and momentum of a particle can at least be unsharply defined, and they can be measured together provided some approximation errors are allowed. Only now, nine decades later, a working theory of approximate joint measurements is taking shape, leading to rigorous and experimentally testable formulations of associated error tradeoff relations. Here we briefly review this new development, explaining the concepts and steps taken in the construction of optimal joint approximations of pairs of incompatible observables. As a case study, we deduce measurement uncertainty relations for qubit observables using two distinct error measures. We provide an operational interpretation of the error bounds and discuss some of the first experimental tests of such relations.
Multidisciplinary analysis and design of printed wiring boards
NASA Astrophysics Data System (ADS)
Fulton, Robert E.; Hughes, Joseph L.; Scott, Waymond R., Jr.; Umeagukwu, Charles; Yeh, Chao-Pin
1991-04-01
Modern printed wiring board design depends on electronic prototyping using computer-based simulation and design tools. Existing electrical computer-aided design (ECAD) tools emphasize circuit connectivity with only rudimentary analysis capabilities. This paper describes a prototype integrated PWB design environment, denoted Thermal Structural Electromagnetic Testability (TSET), being developed at Georgia Tech in collaboration with companies in the electronics industry. TSET provides design guidance based on enhanced electrical and mechanical CAD capabilities, including electromagnetic modeling, testability analysis, thermal management, and solid mechanics analysis. TSET development is based on a strong analytical and theoretical science base and incorporates an integrated information framework and a common database design based on a systematic structured methodology.
Multidisciplinary approaches to understanding collective cell migration in developmental biology.
Schumacher, Linus J; Kulesa, Paul M; McLennan, Rebecca; Baker, Ruth E; Maini, Philip K
2016-06-01
Mathematical models are becoming increasingly integrated with experimental efforts in the study of biological systems. Collective cell migration in developmental biology is a particularly fruitful application area for the development of theoretical models to predict the behaviour of complex multicellular systems with many interacting parts. In this context, mathematical models provide a tool to assess the consistency of experimental observations with testable mechanistic hypotheses. In this review, we showcase examples from recent years of multidisciplinary investigations of neural crest cell migration. The neural crest model system has been used to study how collective migration of cell populations is shaped by cell-cell interactions, cell-environmental interactions and heterogeneity between cells. The wide range of emergent behaviours exhibited by neural crest cells in different embryonal locations and in different organisms helps us chart out the spectrum of collective cell migration. At the same time, this diversity in migratory characteristics highlights the need to reconcile or unify the array of currently hypothesized mechanisms through the next generation of experimental data and generalized theoretical descriptions.
Raja, Kalpana; Patrick, Matthew; Gao, Yilin; Madu, Desmond; Yang, Yuyang
2017-01-01
In the past decade, the volume of “omics” data generated by the different high-throughput technologies has expanded exponentially. Managing, storing, and analyzing this big data have been a great challenge for researchers, especially when moving towards the goal of generating testable data-driven hypotheses, which has been the promise of the high-throughput experimental techniques. Different bioinformatics approaches have been developed to streamline the downstream analyses by providing independent information to interpret and provide biological inference. Text mining (also known as literature mining) is one of the commonly used approaches for automated generation of biological knowledge from the huge number of published articles. In this review paper, we discuss the recent advancements in approaches that integrate results from omics data and information generated from text-mining approaches to uncover novel biomedical information.
Woolley, Thomas E; Gaffney, Eamonn A; Goriely, Alain
2017-07-01
If the plasma membrane of a cell is able to delaminate locally from its actin cortex, a cellular bleb can be produced. Blebs are pressure-driven protrusions, which are noteworthy for their ability to produce cellular motion. Starting from a general continuum mechanics description, we restrict ourselves to considering cell and bleb shapes that maintain approximately spherical forms. From this assumption, we obtain a tractable algebraic system for bleb formation. By including cell-substrate adhesions, we can model blebbing cell motility. Further, by considering mechanically isolated blebbing events, which are randomly distributed over the cell, we can derive equations linking the macroscopic migration characteristics to the microscopic structural parameters of the cell. This multiscale modeling framework is then used to provide parameter estimates, which are in agreement with current experimental data. In summary, the construction of the mathematical model provides testable relationships between the bleb size and cell motility.
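The link from microscopic bleb parameters to macroscopic migration can be sanity-checked with a toy random walk: isolated bleb events of step d at rate ν give an effective diffusion coefficient of νd²/4 in two dimensions. This relation is a standard random-walk result used here for illustration, not the paper's derived equations:

```python
# Toy check that isolated bleb events acting as a 2-D random walk give an
# effective diffusion coefficient D = nu * d^2 / 4. Rate and step size are
# invented, not the paper's parameter estimates.
import numpy as np

rng = np.random.default_rng(1)
NU = 0.5       # bleb events per minute
STEP = 2.0     # net cell displacement per bleb (um)
T_END = 400.0  # observation time (min)

def mean_square_displacement(n_cells=2000):
    msd = 0.0
    for _ in range(n_cells):
        n_events = rng.poisson(NU * T_END)
        angles = rng.uniform(0.0, 2.0 * np.pi, n_events)
        net = STEP * np.array([np.cos(angles).sum(), np.sin(angles).sum()])
        msd += net @ net
    return msd / n_cells

D_sim = mean_square_displacement() / (4.0 * T_END)
print(f"simulated D = {D_sim:.3f} um^2/min; theory = {NU * STEP**2 / 4:.3f}")
```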
Architectural Analysis of Dynamically Reconfigurable Systems
NASA Technical Reports Server (NTRS)
Lindvall, Mikael; Godfrey, Sally; Ackermann, Chris; Ray, Arnab; Yonkwa, Lyly
2010-01-01
Topics include: the problem (increased flexibility of architectural styles decreases analyzability; behavior emerges and varies depending on the configuration; does the resulting system run according to the intended design; and architectural decisions can impede or facilitate testing); top-down approach to architecture analysis, detection of defects and deviations, and architecture and its testability; currently targeted projects GMSEC and CFS; analyzing software architectures; analyzing runtime events; actual architecture recognition; GMPUB in Dynamic SAVE; sample output from new approach; taking message timing delays into account; CFS examples of architecture and testability; some recommendations for improved testability; CFS examples of abstract interfaces and testability; and CFS example of opening some internal details.
Regulation of multispanning membrane protein topology via post-translational annealing.
Van Lehn, Reid C; Zhang, Bin; Miller, Thomas F
2015-09-26
The canonical mechanism for multispanning membrane protein topogenesis suggests that protein topology is established during cotranslational membrane integration. However, this mechanism is inconsistent with the behavior of EmrE, a dual-topology protein for which the mutation of positively charged loop residues, even close to the C-terminus, leads to dramatic shifts in its topology. We use coarse-grained simulations to investigate the Sec-facilitated membrane integration of EmrE and its mutants on realistic biological timescales. This work reveals a mechanism for regulating membrane-protein topogenesis, in which initially misintegrated configurations of the proteins undergo post-translational annealing to reach fully integrated multispanning topologies. The energetic barriers associated with this post-translational annealing process enforce kinetic pathways that dictate the topology of the fully integrated proteins. The proposed mechanism agrees well with the experimentally observed features of EmrE topogenesis and provides a range of experimentally testable predictions regarding the effect of translocon mutations on membrane protein topogenesis.
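The barrier-controlled annealing picture can be caricatured with two-state Arrhenius kinetics; the barrier heights, attempt rate, time window, and labels below are invented to show how a barrier shift changes the topology outcome, and are not taken from the paper's simulations:

```python
# Two-state caricature of post-translational annealing: misintegrated
# chains convert to the final multispanning topology over a free-energy
# barrier. All numbers are invented for illustration.
import numpy as np

ATTEMPT_RATE = 1e3   # assumed attempt frequency (1/s)
T_OBS = 600.0        # post-translational relaxation window (s)

def annealed_fraction(barrier_kT):
    k = ATTEMPT_RATE * np.exp(-barrier_kT)   # Arrhenius rate, barrier in kT
    return 1.0 - np.exp(-k * T_OBS)

for label, barrier in [("low barrier (hypothetical wild type)", 10.0),
                       ("raised barrier (hypothetical charge mutant)", 16.0)]:
    print(f"{label}: {annealed_fraction(barrier):5.1%} reach final topology")
```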
Dynamic allostery of protein alpha helical coiled-coils
Hawkins, Rhoda J; McLeish, Tom C.B
2005-01-01
Alpha helical coiled-coils appear in many important allosteric proteins such as the dynein molecular motor and bacterial chemotaxis transmembrane receptors. As a mechanism for transmitting the information of ligand binding to a distant site across an allosteric protein, an alternative to conformational change in the mean static structure is an induced change in the pattern of the internal dynamics of the protein. We explore how ligand binding may change the intramolecular vibrational free energy of a coiled-coil, using parameterized coarse-grained models, treating the case of dynein in detail. The models predict that coupling of slide, bend, and twist modes of the coiled-coil transmits an allosteric free energy of ∼2 kBT, consistent with experimental results. A further prediction is a quantitative increase in the effective stiffness of the coiled-coil without any change in the inherent flexibility of the individual helices. The model provides a possible and experimentally testable mechanism for transmission of information through the alpha helical coiled-coil of dynein.
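In the classical harmonic approximation, this kind of dynamic allosteric free energy is kBT times the summed log-ratios of mode frequencies before and after binding, ΔG = kBT Σ ln(ω'/ω). A sketch with invented frequencies:

```python
# Classical harmonic bookkeeping for dynamic allostery: if binding shifts
# mode frequencies from w to w', the vibrational free-energy change is
# kT * sum(ln(w'/w)). The four frequencies below are invented.
import numpy as np

w_free = np.array([1.0, 1.3, 2.1, 3.4])               # slide/bend/twist modes (a.u.)
w_bound = w_free * np.array([1.8, 1.4, 1.1, 1.05])    # stiffened by ligand binding

dG = np.sum(np.log(w_bound / w_free))   # in units of kB*T
print(f"allosteric vibrational free energy ~ {dG:.2f} kBT")
```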
Boolean Minimization and Algebraic Factorization Procedures for Fully Testable Sequential Machines
1989-09-01
Srinivas Devadas and Kurt Keutzer. Abstract: In this...Projects Agency under contract number N00014-87-K-0825. Author Information: Devadas: Department of Electrical Engineering and Computer Science, Room 36..., MA 02139; (617) 253-0292.
Eventful horizons: String theory in de Sitter and anti-de Sitter
NASA Astrophysics Data System (ADS)
Kleban, Matthew Benjamin
String theory purports to be a theory of quantum gravity. As such, it should have much to say about the deep mysteries surrounding the very early stages of our universe. For this reason, although the theory is notoriously difficult to directly test, data from experimental cosmology may provide a way to probe the high energy physics of string theory. In the first part of this thesis, I will address the important issue of the testability of string theory using observations of the cosmic microwave background radiation. In the second part, I will study some formal difficulties that arise in attempting to understand string theory in de Sitter spacetime. In the third part, I will study the singularity of an eternal anti de Sitter Schwarzschild black hole, using the AdS/CFT correspondence.
A Unified Approach to the Synthesis of Fully Testable Sequential Machines
1989-10-01
Srinivas Devadas and Kurt Keutzer. Abstract: In this paper we attempt to...research was supported in part by the Defense Advanced Research Projects Agency under contract N00014-87-K-0825. Author Information: Devadas: Department of Electrical Engineering and Computer Science, Massachusetts Institute of Technology.
Factors That Affect Software Testability
NASA Technical Reports Server (NTRS)
Voas, Jeffrey M.
1991-01-01
Software faults that infrequently affect software's output are dangerous. When a software fault causes frequent software failures, testing is likely to reveal the fault before the software is released; when the fault remains undetected during testing, it can cause disaster after the software is installed. A technique for predicting whether a particular piece of software is likely to reveal faults within itself during testing is found in [Voas91b]. A piece of software that is likely to reveal faults within itself during testing is said to have high testability. A piece of software that is not likely to reveal faults within itself during testing is said to have low testability. It is preferable to design software with higher testability from the outset, i.e., to create software with as high a degree of testability as possible to avoid the problems of undetected faults that are associated with low testability. Information loss is a phenomenon that occurs during program execution that increases the likelihood that a fault will remain undetected. In this paper, I identify two broad classes of information loss, define them, and suggest ways of predicting the potential for information loss to occur. I do this in order to decrease the likelihood that faults will remain undetected during testing.
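Information loss can be probed empirically by perturbing an internal value and counting how often the output changes. The sketch below is a generic mutation-style estimate in the spirit of this work, not Voas's actual technique or code; the toy program and its constants are invented:

```python
# Monte Carlo probe of information loss: perturb an internal value and
# count how often the perturbation survives to the output. The toy program
# discards information at the integer division, so most internal
# corruption never becomes visible.
import random

def program(x, perturb=False):
    a = x * x
    if perturb:
        a += random.choice([-1, 1])   # simulate a corrupted internal state
    return (a // 10) % 4              # // discards low-order information

random.seed(0)
TRIALS = 10_000
visible = sum(program(x) != program(x, perturb=True)
              for x in (random.randint(0, 100) for _ in range(TRIALS)))
print(f"perturbation visible at output in {visible / TRIALS:.1%} of runs")
```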
Heating of trapped ultracold atoms by collapse dynamics
NASA Astrophysics Data System (ADS)
Laloë, Franck; Mullin, William J.; Pearle, Philip
2014-11-01
The continuous spontaneous localization (CSL) theory alters the Schrödinger equation. It describes wave-function collapse as a dynamical process instead of an ill-defined postulate, thereby providing macroscopic uniqueness and solving the so-called measurement problem of standard quantum theory. CSL contains a parameter λ giving the collapse rate of an isolated nucleon in a superposition of two spatially separated states and, more generally, characterizing the collapse time for any physical situation. CSL is experimentally testable, since it predicts some behavior different from that predicted by standard quantum theory. One example is the narrowing of wave functions, which results in energy imparted to particles. Here we consider energy given to trapped ultracold atoms. Since these are the coldest samples under experimental investigation, it is worth inquiring how they are affected by the CSL heating mechanism. We examine the CSL heating of a Bose-Einstein condensate (BEC) in contact with its thermal cloud. Of course, other mechanisms also provide heat and also particle loss. From varied data on optically trapped cesium BECs, we present an energy audit for known heating and loss mechanisms. The result provides an upper limit on CSL heating and thereby an upper limit on the parameter λ. We obtain λ ≲ 1(±1) × 10^-7 s^-1.
USDA-ARS?s Scientific Manuscript database
Progress in studying the biology of Trichinella spp. was greatly advanced with the publication and analysis of the draft genome sequence of T. spiralis. Those data provide a basis for constructing testable hypotheses concerning parasite physiology, immunology, and genetics. They also provide tools...
A computational approach to negative priming
NASA Astrophysics Data System (ADS)
Schrobsdorff, H.; Ihrke, M.; Kabisch, B.; Behrendt, J.; Hasselhorn, M.; Herrmann, J. Michael
2007-09-01
Priming is characterized by a sensitivity of reaction times to the sequence of stimuli in psychophysical experiments. The reduction of the reaction time observed in positive priming is well-known and experimentally understood (Scarborough et al., J. Exp. Psychol.: Hum. Percept. Perform., 3, pp. 1-17, 1977). Negative priming—the opposite effect—is experimentally less tangible (Fox, Psychonom. Bull. Rev., 2, pp. 145-173, 1995). The dependence on subtle parameter changes (such as the response-stimulus interval) usually varies. The sensitivity of the negative priming effect bears great potential for applications in research in fields such as memory, selective attention, and ageing effects. We develop and analyse a computational realization, CISAM, of a recent psychological model for action decision making, the ISAM (Kabisch, PhD thesis, Friedrich-Schiller-Universität, 2003), which is sensitive to priming conditions. With the dynamical systems approach of the CISAM, we show that a single adaptive threshold mechanism is sufficient to explain both positive and negative priming effects. This is achieved by comparing results obtained by the computational modelling with experimental data from our laboratory. The implementation provides a rich base from which testable predictions can be derived, e.g. with respect to hitherto untested stimulus combinations (e.g. single-object trials).
Testability of evolutionary game dynamics based on experimental economics data
NASA Astrophysics Data System (ADS)
Wang, Yijia; Chen, Xiaojie; Wang, Zhijian
To better understand the dynamic processes of a real game system, an appropriate dynamics model is needed, so evaluating the validity of a candidate model is not a trivial task. Here, we demonstrate an approach that takes the dynamical macroscopic patterns of angular momentum and speed as the measurement variables to evaluate the validity of various dynamics models. Using data from real-time Rock-Paper-Scissors (RPS) game experiments, we obtain the experimental dynamic patterns and then derive the corresponding theoretical dynamic patterns from a series of typical dynamics models. By testing the goodness-of-fit between the experimental and theoretical patterns, the validity of the models can be evaluated. One result of our case study is that, among all the nonparametric models tested, the well-known Replicator dynamics model performs almost the worst, while the Projection dynamics model performs best. Besides providing new empirical macroscopic patterns of social dynamics, we demonstrate that the approach can be an effective and rigorous tool for testing game dynamics models. Fundamental Research Funds for the Central Universities (SSEYI2014Z) and the National Natural Science Foundation of China (Grants No. 61503062).
Origin and Proliferation of Multiple-Drug Resistance in Bacterial Pathogens
Chang, Hsiao-Han; Cohen, Ted; Grad, Yonatan H.; Hanage, William P.; O'Brien, Thomas F.
2015-01-01
SUMMARY Many studies report the high prevalence of multiply drug-resistant (MDR) strains. Because MDR infections are often significantly harder and more expensive to treat, they represent a growing public health threat. However, for different pathogens, different underlying mechanisms are traditionally used to explain these observations, and it is unclear whether each bacterial taxon has its own mechanism(s) for multidrug resistance or whether there are common mechanisms between distantly related pathogens. In this review, we provide a systematic overview of the causes of the excess of MDR infections and define testable predictions made by each hypothetical mechanism, including experimental, epidemiological, population genomic, and other tests of these hypotheses. Better understanding the cause(s) of the excess of MDR is the first step to rational design of more effective interventions to prevent the origin and/or proliferation of MDR. PMID:25652543
Using Backward Design in Education Research: A Research Methods Essay
Jensen, Jamie L.; Bailey, Elizabeth G.; Kummer, Tyler A.; Weber, K. Scott
2017-01-01
Education research within the STEM disciplines applies a scholarly approach to teaching and learning, with the intent of better understanding how people learn and of improving pedagogy at the undergraduate level. Most of the professionals practicing in this field have ‘crossed over’ from other disciplinary fields and thus have faced challenges in becoming experts in a new discipline. In this article, we offer a novel framework for approaching education research design called Backward Design in Education Research. It is patterned on backward curricular design and provides a three-step, systematic approach to designing education projects: 1) Define a research question that leads to a testable causal hypothesis based on a theoretical rationale; 2) Choose or design the assessment instruments to test the research hypothesis; and 3) Develop an experimental protocol that will be effective in testing the research hypothesis. This approach provides a systematic method to develop and carry out evidence-based research design. PMID:29854045
LSI/VLSI design for testability analysis and general approach
NASA Technical Reports Server (NTRS)
Lam, A. Y.
1982-01-01
The incorporation of testability characteristics into large-scale digital design is necessary for effective device testing and for enhancing device reliability. There are at least three major design-for-testability (DFT) techniques, namely self-checking, LSSD, and partitioning, each of which can be incorporated into a logic design to achieve a specific set of testability and reliability requirements. Detailed analyses of the design theory, implementation, fault coverage, hardware requirements, application limitations, etc., of each of these techniques are presented.
Antenna Mechanism of Length Control of Actin Cables
Mohapatra, Lishibanya; Goode, Bruce L.; Kondev, Jane
2015-01-01
Actin cables are linear cytoskeletal structures that serve as tracks for myosin-based intracellular transport of vesicles and organelles in both yeast and mammalian cells. In a yeast cell undergoing budding, cables are in constant dynamic turnover yet some cables grow from the bud neck toward the back of the mother cell until their length roughly equals the diameter of the mother cell. This raises the question: how is the length of these cables controlled? Here we describe a novel molecular mechanism for cable length control inspired by recent experimental observations in cells. This “antenna mechanism” involves three key proteins: formins, which polymerize actin, Smy1 proteins, which bind formins and inhibit actin polymerization, and myosin motors, which deliver Smy1 to formins, leading to a length-dependent actin polymerization rate. We compute the probability distribution of cable lengths as a function of several experimentally tuneable parameters such as the formin-binding affinity of Smy1 and the concentration of myosin motors delivering Smy1. These results provide testable predictions of the antenna mechanism of actin-cable length control. PMID:26107518
A spiking neural network model of 3D perception for event-based neuromorphic stereo vision systems
Osswald, Marc; Ieng, Sio-Hoi; Benosman, Ryad; Indiveri, Giacomo
2017-01-01
Stereo vision is an important feature that enables machine vision systems to perceive their environment in 3D. While machine vision has spawned a variety of software algorithms to solve the stereo-correspondence problem, their implementation and integration in small, fast, and efficient hardware vision systems remains a difficult challenge. Recent advances made in neuromorphic engineering offer a possible solution to this problem, with the use of a new class of event-based vision sensors and neural processing devices inspired by the organizing principles of the brain. Here we propose a radically novel model that solves the stereo-correspondence problem with a spiking neural network that can be directly implemented with massively parallel, compact, low-latency and low-power neuromorphic engineering devices. We validate the model with experimental results, highlighting features that are in agreement with both computational neuroscience stereo vision theories and experimental findings. We demonstrate its features with a prototype neuromorphic hardware system and provide testable predictions on the role of spike-based representations and temporal dynamics in biological stereo vision processing systems. PMID:28079187
Beyond Critical Exponents in Neuronal Avalanches
NASA Astrophysics Data System (ADS)
Friedman, Nir; Butler, Tom; Deville, Robert; Beggs, John; Dahmen, Karin
2011-03-01
Neurons form a complex network in the brain, where they interact with one another by firing electrical signals. Neurons firing can trigger other neurons to fire, potentially causing avalanches of activity in the network. In many cases these avalanches have been found to be scale independent, similar to critical phenomena in diverse systems such as magnets and earthquakes. We discuss models for neuronal activity that allow for the extraction of testable, statistical predictions. We compare these models to experimental results, and go beyond critical exponents.
Reliability/maintainability/testability design for dormancy
NASA Astrophysics Data System (ADS)
Seman, Robert M.; Etzl, Julius M.; Purnell, Arthur W.
1988-05-01
This document has been prepared as a tool for designers of dormant military equipment and systems. The purpose of this handbook is to provide design engineers with Reliability/Maintainability/Testability design guidelines for systems which spend significant portions of their life cycle in a dormant state. The dormant state is defined as a nonoperating mode where a system experiences very little or no electrical stress. The guidelines in this report present design criteria in the following categories: (1) Part Selection and Control; (2) Derating Practices; (3) Equipment/System Packaging; (4) Transportation and Handling; (5) Maintainability Design; (6) Testability Design; (7) Evaluation Methods for In-Plant and Field Evaluation; and (8) Product Performance Agreements. Wherever applicable, design guidelines for operating systems were included with the dormant design guidelines. This was done in an effort to produce design guidelines for a more complete life cycle. Although dormant systems spend significant portions of their life cycle in a nonoperating mode, the designer must design the system for the complete life cycle, including nonoperating as well as operating modes. The guidelines are primarily intended for use in the design of equipment composed of electronic parts and components. However, they can also be used for the design of systems which encompass both electronic and nonelectronic parts, as well as for the modification of existing systems.
Refinement of Representation Theorems for Context-Free Languages
NASA Astrophysics Data System (ADS)
Fujioka, Kaoru
In this paper, we obtain some refinements of representation theorems for context-free languages by using Dyck languages, insertion systems, strictly locally testable languages, and morphisms. For instance, we improve the Chomsky-Schützenberger representation theorem and show that each context-free language L can be represented in the form L = h (D ∩ R), where D is a Dyck language, R is a strictly 3-testable language, and h is a morphism. A similar representation for context-free languages can be obtained using insertion systems of weight (3, 0) and strictly 4-testable languages.
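The improved representation can be restated in display form (this restates the claim from the abstract, with no additions):

```latex
% Improved Chomsky-Schutzenberger representation, as stated in the abstract:
% every context-free language L can be written as
L = h(D \cap R)
% with D a Dyck language, R a strictly 3-testable language, and h a morphism;
% an analogous representation uses insertion systems of weight (3,0)
% together with strictly 4-testable languages.
```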
An empirical comparison of a dynamic software testability metric to static cyclomatic complexity
NASA Technical Reports Server (NTRS)
Voas, Jeffrey M.; Miller, Keith W.; Payne, Jeffrey E.
1993-01-01
This paper compares the dynamic testability prediction technique termed 'sensitivity analysis' to the static testability technique termed cyclomatic complexity. The application that we chose in this empirical study is a CASE generated version of a B-737 autoland system. For the B-737 system we analyzed, we isolated those functions that we predict are more prone to hide errors during system/reliability testing. We also analyzed the code with several other well-known static metrics. This paper compares and contrasts the results of sensitivity analysis to the results of the static metrics.
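For readers unfamiliar with the static metric named here, cyclomatic complexity has a standard control-flow-graph definition (textbook material, not quoted from the paper):

```latex
% Cyclomatic complexity of a program's control-flow graph G:
% E = number of edges, N = number of nodes, P = number of connected components.
V(G) = E - N + 2P
```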
Unified TeV scale picture of baryogenesis and dark matter.
Babu, K S; Mohapatra, R N; Nasri, Salah
2007-04-20
We present a simple extension of the minimal supersymmetric standard model which provides a unified picture of cosmological baryon asymmetry and dark matter. Our model introduces a gauge singlet field N and a color triplet field X which couple to the right-handed quark fields. The out-of-equilibrium decay of the Majorana fermion N mediated by the exchange of the scalar field X generates adequate baryon asymmetry for M_N ~ 100 GeV and M_X ~ TeV. The scalar partner of N (denoted N_1) is naturally the lightest SUSY particle as it has no gauge interactions and plays the role of dark matter. The model is experimentally testable in (i) neutron-antineutron oscillations with a transition time estimated to be around 10^10 s, (ii) discovery of colored particles X at LHC with mass of order TeV, and (iii) direct dark matter detection with a predicted cross section in the observable range.
Brains studying brains: look before you think in vision
NASA Astrophysics Data System (ADS)
Zhaoping, Li
2016-06-01
Using our own brains to study our brains is extraordinary. For example, in vision this makes us naturally blind to our own blindness, since our impression of seeing our world clearly is consistent with our ignorance of what we do not see. Our brain employs its ‘conscious’ part to reason and make logical deductions using familiar rules and past experience. However, human vision employs many ‘subconscious’ brain parts that follow rules alien to our intuition. Our blindness to our unknown unknowns and our presumptive intuitions easily lead us astray in asking and formulating theoretical questions, as witnessed in many unexpected and counter-intuitive difficulties and failures encountered by generations of scientists. We should therefore pay a more than usual amount of attention and respect to experimental data when studying our brain. I show that this can be productive by reviewing two vision theories that have provided testable predictions and surprising insights.
Synchronous versus asynchronous modeling of gene regulatory networks.
Garg, Abhishek; Di Cara, Alessandro; Xenarios, Ioannis; Mendoza, Luis; De Micheli, Giovanni
2008-09-01
In silico modeling of gene regulatory networks has gained some momentum recently due to increased interest in analyzing the dynamics of biological systems. This has been further facilitated by the increasing availability of experimental data on gene-gene, protein-protein and gene-protein interactions. The two dynamical properties that are often experimentally testable are perturbations and stable steady states. Although a lot of work has been done on the identification of steady states, not much work has been reported on in silico modeling of cellular differentiation processes. In this manuscript, we provide algorithms based on reduced ordered binary decision diagrams (ROBDDs) for Boolean modeling of gene regulatory networks. Algorithms for synchronous and asynchronous transition models have been proposed and their corresponding computational properties have been analyzed. These algorithms allow users to compute cyclic attractors of large networks that are currently not feasible using existing software. Hereby we provide a framework to analyze the effect of multiple gene perturbation protocols, and their effect on cell differentiation processes. These algorithms were validated on the T-helper model showing the correct steady state identification and Th1-Th2 cellular differentiation process. The software binaries for Windows and Linux platforms can be downloaded from http://si2.epfl.ch/~garg/genysis.html.
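The synchronous/asynchronous distinction is easy to state concretely. The sketch below uses a toy three-gene network (not the T-helper model, and without the paper's ROBDD machinery) to enumerate successors under both schemes; synchronous update gives each state exactly one successor, while asynchronous update can give several:

```python
# Toy Boolean network (hypothetical rules, not the paper's model): contrast
# synchronous and asynchronous update semantics.
from itertools import product

rules = [
    lambda s: s[2],               # a' = c
    lambda s: s[0] and not s[2],  # b' = a AND NOT c
    lambda s: not s[1],           # c' = NOT b
]

def sync_step(state):
    """All genes update simultaneously: exactly one successor per state."""
    return tuple(int(f(state)) for f in rules)

def async_steps(state):
    """One gene updates at a time: up to len(rules) successors per state."""
    succs = set()
    for i, f in enumerate(rules):
        s = list(state)
        s[i] = int(f(state))
        succs.add(tuple(s))
    return succs

for s in product((0, 1), repeat=3):
    print(s, '-> sync', sync_step(s), '| async', sorted(async_steps(s)))
```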
NASA Astrophysics Data System (ADS)
Xu, Jin-Shi; Li, Chuan-Feng; Guo, Guang-Can
2016-11-01
In 1935, Einstein, Podolsky and Rosen published their influential paper proposing a now famous paradox (the EPR paradox) that threw doubt on the completeness of quantum mechanics. Two fundamental concepts, entanglement and steering, both of which reflect the nonlocal nature of quantum mechanics, were introduced in Schrödinger's response to the EPR paper. In 1964, John Bell obtained an experimentally testable inequality whose violation contradicts the prediction of local hidden variable models and agrees with that of quantum mechanics. Since then, great efforts have been made to experimentally investigate the nonlocal features of quantum mechanics, and many distinctive quantum properties have been observed. In this work, along with a discussion of the development of quantum nonlocality, we focus on our recent experimental efforts to investigate quantum correlations and their applications with optical systems, including the study of the entanglement-assisted entropic uncertainty principle, Einstein-Podolsky-Rosen steering, and the dynamics of quantum correlations.
Testability analysis on a hydraulic system in a certain equipment based on simulation model
NASA Astrophysics Data System (ADS)
Zhang, Rui; Cong, Hua; Liu, Yuanhong; Feng, Fuzhou
2018-03-01
To address the structural complexity of hydraulic systems and the shortage of fault-statistics information, a multi-valued testability analysis method based on a simulation model is proposed. Using a simulation model built in AMESim, the method injects simulated faults and records the variation of test parameters, such as pressure and flow rate, at each test point relative to normal conditions, thereby establishing a multi-valued fault-test dependency matrix. The fault detection rate (FDR) and fault isolation rate (FIR) are then calculated from the dependency matrix. Finally, the testability and fault diagnosis capability of the system are analyzed and evaluated; they reach only 54% (FDR) and 23% (FIR). To improve the testability of the system, the number and position of the test points are optimized. Results show the proposed test placement scheme can address the difficulty, inefficiency, and high cost of maintaining the system.
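The FDR/FIR computation from a dependency matrix can be made concrete. The sketch below uses a hypothetical binary 4-fault-by-3-test matrix (the paper's matrix is multi-valued and larger); a fault counts as detected if any test reacts to it, and as isolable if its test signature is unique (definitions of FIR vary; some divide by the number of detected faults only):

```python
# Minimal FDR/FIR sketch over a hypothetical fault-test dependency matrix.
import numpy as np

# D[i, j] = 1 if fault i changes the reading at test point j.
D = np.array([[1, 0, 1],
              [1, 0, 1],   # same signature as fault 0: detectable, not isolable
              [0, 1, 0],
              [0, 0, 0]])  # silent fault: never detected

detected = D.any(axis=1)                      # detected if any test reacts
signatures = [tuple(row) for row in D]
isolated = [d and signatures.count(sig) == 1  # unique signature => isolable
            for d, sig in zip(detected, signatures)]

print('FDR =', detected.mean())               # 0.75
print('FIR =', sum(isolated) / len(D))        # 0.25
```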
Rozier, Kelvin; Bondarenko, Vladimir E
2017-05-01
The β1- and β2-adrenergic signaling systems play different roles in the functioning of cardiac cells. Experimental data show that the activation of the β1-adrenergic signaling system produces significant inotropic, lusitropic, and chronotropic effects in the heart, whereas the effects of the β2-adrenergic signaling system are less apparent. In this paper, a comprehensive compartmentalized, experimentally based mathematical model of the combined β1- and β2-adrenergic signaling systems in mouse ventricular myocytes is developed to simulate the experimental findings and make testable predictions of the behavior of the cardiac cells under different physiological conditions. Simulations describe the dynamics of major signaling molecules in different subcellular compartments; the kinetics and magnitudes of phosphorylation of ion channels, transporters, and Ca2+-handling proteins; modifications of action potential shape and duration; and [Ca2+]i and [Na+]i dynamics upon stimulation of β1- and β2-adrenergic receptors (β1- and β2-ARs). The model reveals physiological conditions under which β2-ARs do not produce significant physiological effects and conditions under which their effects can be measured experimentally. Simulations demonstrated that stimulation of β2-ARs with isoproterenol caused a marked increase in the magnitude of the L-type Ca2+ current, the [Ca2+]i transient, and phosphorylation of phospholamban only upon additional application of pertussis toxin or inhibition of phosphodiesterases of types 3 and 4. The model also made testable predictions of the changes in magnitudes of [Ca2+]i and [Na+]i fluxes, the rate of decay of the [Na+]i concentration upon both combined and separate stimulation of β1- and β2-ARs, and the contribution of phosphorylation of PKA targets to the changes in the action potential and [Ca2+]i transient.
The Simple Theory of Public Library Services.
ERIC Educational Resources Information Center
Newhouse, Joseph P.
A simple normative theory applicable to public library services was developed as a tool to aid libraries in answering the question: which books should be bought by the library? Although developed for normative purposes, the theory generates testable predictions. It is relevant to measuring benefits from services which are provided publicly because…
Lift and drag in three-dimensional steady viscous and compressible flow
NASA Astrophysics Data System (ADS)
Liu, L. Q.; Wu, J. Z.; Su, W. D.; Kang, L. L.
2017-11-01
In a recent paper, Liu, Zhu, and Wu ["Lift and drag in two-dimensional steady viscous and compressible flow," J. Fluid Mech. 784, 304-341 (2015)] present a force theory for a body in a two-dimensional, viscous, compressible, and steady flow. In this companion paper, we do the same for three-dimensional flows. Using the fundamental solution of the linearized Navier-Stokes equations, we improve the force formula for incompressible flows originally derived by Goldstein in 1931 and summarized by Milne-Thomson in 1968, both being far from complete, to its perfect final form, which is further proved to be universally true from subsonic to supersonic flows. We call this result the unified force theorem, which states that the forces are always determined by the vector circulation Γ_φ of longitudinal velocity and the scalar inflow Q_ψ of transverse velocity. Since this theorem is not directly observable either experimentally or computationally, a testable version is also derived, which, however, holds only in the linear far field. We name this version the testable unified force formula.
Modeling the attenuation and failure of action potentials in the dendrites of hippocampal neurons.
Migliore, M
1996-01-01
We modeled two different mechanisms, a shunting conductance and a slow sodium inactivation, to test whether they could modulate the active propagation of a train of action potentials in a dendritic tree. Computer simulations, using a compartmental model of a pyramidal neuron, suggest that each of these two mechanisms could account for the activity-dependent attenuation and failure of the action potentials in the dendrites during the train. Each mechanism is shown to be in good qualitative agreement with experimental findings on somatic or dendritic stimulation and on the effects of hyperpolarization. The conditions under which branch point failures can be observed, and a few experimentally testable predictions, are presented and discussed. PMID:8913580
All pure bipartite entangled states can be self-tested
Coladangelo, Andrea; Goh, Koon Tong; Scarani, Valerio
2017-01-01
Quantum technologies promise advantages over their classical counterparts in the fields of computation, security and sensing. It is thus desirable that classical users are able to obtain guarantees on quantum devices, even without any knowledge of their inner workings. That such classical certification is possible at all is remarkable: it is a consequence of the violation of Bell inequalities by entangled quantum systems. Device-independent self-testing refers to the most complete such certification: it enables a classical user to uniquely identify the quantum state shared by uncharacterized devices by simply inspecting the correlations of measurement outcomes. Self-testing was first demonstrated for the singlet state and a few other examples of self-testable states were reported in recent years. Here, we address the long-standing open question of whether every pure bipartite entangled state is self-testable. We answer it affirmatively by providing explicit self-testing correlations for all such states. PMID:28548093
A Theory of Work Adjustment. Minnesota Studies in Vocational Rehabilitation, 15.
ERIC Educational Resources Information Center
Dawis, Rene V.; and others
A theory of work adjustment which may contribute to the development of a science of the psychology of occupational behavior is proposed. It builds on the basic psychological concepts of stimulus, response, and reinforcement, and provides a research paradigm for generating testable hypotheses. It was derived from early research efforts of the…
Modules, Theories, or Islands of Expertise? Domain Specificity in Socialization
ERIC Educational Resources Information Center
Gelman, Susan A.
2010-01-01
The domain-specific approach to socialization processes presented by J. E. Grusec and M. Davidov (this issue) provides a compelling framework for integrating and interpreting a large and disparate body of research findings, and it generates a wealth of testable new hypotheses. At the same time, it introduces core theoretical questions regarding…
On Testability of Missing Data Mechanisms in Incomplete Data Sets
ERIC Educational Resources Information Center
Raykov, Tenko
2011-01-01
This article is concerned with the question of whether the missing data mechanism routinely referred to as missing completely at random (MCAR) is statistically examinable via a test for lack of distributional differences between groups with observed and missing data, and related consequences. A discussion is initially provided, from a formal logic…
A Global Classification System for Catchment Hydrology
NASA Astrophysics Data System (ADS)
Woods, R. A.
2004-05-01
It is a shocking state of affairs: there is no underpinning scientific taxonomy of catchments. There are widely used global classification systems for climate, river morphology, lakes and wetlands, but for river catchments there exists only a plethora of inconsistent, incomplete regional schemes. By proceeding without a common taxonomy for catchments, freshwater science has missed one of its key developmental stages, and has leapt from definition of phenomena to experiments, theories and models, without the theoretical framework of a classification. I propose the development of a global hierarchical classification system for physical aspects of river catchments, to help underpin physical science in the freshwater environment and provide a solid foundation for classification of river ecosystems. Such a classification scheme can open completely new vistas in hydrology: for example, it will be possible to (i) rationally transfer experimental knowledge of hydrological processes between basins anywhere in the world, provided they belong to the same class; (ii) perform meaningful meta-analyses in order to reconcile studies that show inconsistent results; and (iii) generate new testable hypotheses which involve locations worldwide.
Behavioral Stochastic Resonance
NASA Astrophysics Data System (ADS)
Freund, Jan A.; Schimansky-Geier, Lutz; Beisner, Beatrix; Neiman, Alexander; Russell, David F.; Yakusheva, Tatyana; Moss, Frank
2001-03-01
Zooplankton emit weak electric fields into the surrounding water that originate from their own muscular activities associated with swimming and feeding. Juvenile paddlefish prey upon single zooplankton by detecting and tracking these weak electric signatures. The passive electric sense in the fish is provided by an elaborate array of electroreceptors, the ampullae of Lorenzini, spread over the surface of an elongated rostrum. We have previously shown that the fish use stochastic resonance to enhance prey capture near the detection threshold of their sensory system. But stochastic resonance requires an external source of electrical noise in order to function. The required noise can be provided by a swarm of plankton, for example Daphnia. Thus juvenile paddlefish can detect and attack single Daphnia as outliers in the vicinity of the swarm by making use of noise from the swarm itself. From the power spectral density of the noise plus the weak signal from a single Daphnia we calculate the signal-to-noise ratio and the Fisher information at the surface of the paddlefish's rostrum. The results predict a specific attack pattern for the paddlefish that appears to be experimentally testable.
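The swarm-noise analysis lends itself to a small numerical illustration. The sketch below uses synthetic data; the 8 Hz "Daphnia" line and all amplitudes are assumptions for illustration, not values from the study. It estimates a signal-to-noise ratio from a periodogram:

```python
# Estimate SNR of a weak periodic signal buried in noise from its
# power spectral density (synthetic data; all parameters hypothetical).
import numpy as np

fs, T = 1000.0, 10.0                       # sample rate (Hz), duration (s)
t = np.arange(0, T, 1 / fs)
f_sig = 8.0                                # assumed signal frequency
x = 0.2 * np.sin(2 * np.pi * f_sig * t) + np.random.randn(t.size)  # signal + noise

psd = np.abs(np.fft.rfft(x)) ** 2 / (fs * t.size)   # one-sided periodogram
freqs = np.fft.rfftfreq(t.size, 1 / fs)

k = np.argmin(np.abs(freqs - f_sig))       # bin containing the signal
noise_floor = np.median(psd)               # robust estimate of the noise level
print('SNR at %.1f Hz: %.1f' % (f_sig, psd[k] / noise_floor))
```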
Neutrino mass as the probe of intermediate mass scales
DOE Office of Scientific and Technical Information (OSTI.GOV)
Senjanovic, G.
1980-01-01
A discussion of the calculability of neutrino mass is presented. The possibility of neutrinos being either Dirac or Majorana particles is analyzed in detail. Arguments are offered in favor of the Majorana case: the smallness of neutrino mass is linked to the maximality of parity violation in weak interactions. It is shown how the measured value of neutrino mass would probe the existence of an intermediate mass scale, presumably in the TeV region, at which parity is supposed to become a good symmetry. Experimental consequences of the proposed scheme are discussed, in particular the neutrinoless double β decay, where observation would provide a crucial test of the model, and rare muon decays such as μ → eγ and μ → e e ē. Finally, the embedding of this model in an O(10) grand unified theory is analyzed, with the emphasis on the implications for intermediate mass scales that it offers. It is concluded that the proposed scheme provides a distinct and testable alternative for understanding the smallness of neutrino mass. 4 figures.
Generating Testable Questions in the Science Classroom: The BDC Model
ERIC Educational Resources Information Center
Tseng, ChingMei; Chen, Shu-Bi; Chang, Wen-Hua
2015-01-01
Guiding students to generate testable scientific questions is essential in the inquiry classroom, but it is not easy. The purpose of the BDC ("Big Idea, Divergent Thinking, and Convergent Thinking") instructional model is to scaffold students' inquiry learning. We illustrate the use of this model with an example lesson, designed…
Easily Testable PLA-Based Finite State Machines
1989-03-01
...using PLATYPUS [20]. Then, justification paths are obtained from the STG using simple logic... Faults of type 1, 4 and 5 can be guaranteed to be testable via... A vector pair that is generated by the first corrupted next-state line is found, if such a vector pair exists, using PLATYPUS [20].
1983-11-01
compound operations, with status. (h) Pre-programmed CRC and double-precision multiply/divide algorithms. (i) Double-length accumulator with full...
NASA Astrophysics Data System (ADS)
Chakdar, Shreyashi
The Standard Model of particle physics is assumed to be a low-energy effective theory, with new physics theoretically motivated to lie around the TeV scale. This thesis presents theories with new physics beyond the Standard Model at the TeV scale that are testable at colliders. The work in chapters 2, 3 and 5 presents models incorporating different approaches to enlarging the Standard Model gauge group to a grand unified symmetry, with each model offering unique collider signatures. The study of leptoquark gauge bosons in the TopSU(5) model in chapter 2 showed that their discovery mass range extends up to 1.5 TeV at the 14 TeV LHC with a luminosity of 100 fb^-1. In chapter 3 we studied the collider phenomenology of TeV-scale mirror fermions in the Left-Right Mirror model, finding that the reach for mirror quarks extends up to 750 GeV at the 14 TeV LHC with 300 fb^-1 of luminosity. In chapter 4 we enlarged the bosonic symmetry to a Fermi-Bose symmetry, i.e., supersymmetry, and showed that SUSY with non-universalities in gaugino or scalar masses within a high-scale SUGRA setup can still be accessible at the 14 TeV LHC. In chapter 5, we performed a study relevant to the e+e- collider and found that precise measurements of the Higgs boson mass splittings, down to ~ 100 MeV, may be possible with high luminosity at the International Linear Collider (ILC). In chapter 6 we showed that the experimental data on neutrino masses and mixings are consistent with the proposed 4/5-parameter Dirac neutrino models, yielding a solution for the neutrino masses with inverted mass hierarchy and a large CP-violating phase δ, and thus testable experimentally. Chapter 7 of the thesis incorporates a warm dark matter candidate in the context of a two-Higgs-doublet model. The model has several testable consequences at colliders, with the charged scalar and pseudoscalar being in the few-hundred-GeV mass range. This thesis presents an endeavor to study beyond-Standard-Model physics at the TeV scale with testable signals at colliders.
Detecting Rotational Superradiance in Fluid Laboratories
NASA Astrophysics Data System (ADS)
Cardoso, Vitor; Coutant, Antonin; Richartz, Mauricio; Weinfurtner, Silke
2016-12-01
Rotational superradiance was predicted theoretically decades ago, and is chiefly responsible for a number of important effects and phenomenology in black-hole physics. However, rotational superradiance has never been observed experimentally. Here, with the aim of probing superradiance in the lab, we investigate the behavior of sound and surface waves in fluids resting in a circular basin at the center of which a rotating cylinder is placed. We show that with a suitable choice for the material of the cylinder, surface and sound waves are amplified. Two types of instabilities are studied: one sets in whenever superradiant modes are confined near the rotating cylinder and the other, which does not rely on confinement, corresponds to a local excitation of the cylinder. Our findings are experimentally testable in existing fluid laboratories and, hence, offer experimental exploration and comparison of dynamical instabilities arising from rapidly rotating boundary layers in astrophysical as well as in fluid dynamical systems.
A Framework for Evidence-Based Licensure of Adaptive Autonomous Systems
2016-03-01
insights gleaned to DoD. The autonomy community has identified significant challenges associated with test, evaluation, verification, and validation of... licensure as a test, evaluation, verification, and validation (TEVV) framework that can address these challenges. IDA found that traditional... language requirements to testable (preferably machine-testable) specifications; design of architectures that treat development and verification of...
Alarcón, Tomás; Marches, Radu; Page, Karen M
2006-05-07
We formulate models of the mechanism(s) by which B cell lymphoma cells stimulated with an antibody specific to the B cell receptor (IgM) become quiescent or apoptotic. In particular, we aim to reproduce experimental results by Marches et al., according to which the fate of the targeted cells (Daudi) depends on the level of expression of the p21(Waf1) (p21) cell-cycle inhibitor. A simple model is formulated in which the basic ingredients are p21 and caspase activity, and their mutual inhibition. We show that this model does not reproduce the experimental results and that further refinement is needed. A second model successfully reproduces the experimental observations for a given set of parameter values, indicating a critical role for Myc in the fate decision process. We use bifurcation analysis and objective sensitivity analysis to assess the robustness of our results. Importantly, this analysis yields experimentally testable predictions on the role of Myc, which could have therapeutic implications.
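The first model's core motif, mutual inhibition between p21 and caspase activity, can be sketched as a toy dynamical system. The equations, rates, and exponents below are hypothetical illustrations, not the authors' model; with strong enough cross-repression the motif is bistable, so initial conditions decide which of the two outcomes wins:

```python
# Toy Hill-type mutual inhibition between p21 (p) and caspase activity (c).
# All parameters are hypothetical; this is not the paper's model.
def step(p, c, dt=0.01, n=4):
    dp = 2.0 / (1.0 + c**n) - p   # p21: production repressed by caspase, linear decay
    dc = 2.0 / (1.0 + p**n) - c   # caspase: production repressed by p21, linear decay
    return p + dt * dp, c + dt * dc

for p0, c0 in [(1.5, 0.1), (0.1, 1.5)]:   # two different initial conditions
    p, c = p0, c0
    for _ in range(5000):                  # Euler-integrate to t = 50
        p, c = step(p, c)
    print(f"start p21={p0}, caspase={c0} -> p21={p:.2f}, caspase={c:.2f}")
```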
2008-12-01
1979; Wasserman and Faust, 1994). SNA thus relies heavily on graph theory to make predictions about network structure and thus social behavior... becomes a tool for increasing the specificity of theory, thinking through the theoretical implications, and generating testable predictions. In... to summarize Construct and its roots in constructural sociological theory. We discover that the (LPM) provides a mathematical bridge between...
Inferior olive mirrors joint dynamics to implement an inverse controller.
Alvarez-Icaza, Rodrigo; Boahen, Kwabena
2012-10-01
To produce smooth and coordinated motion, our nervous systems need to generate precisely timed muscle activation patterns that, due to axonal conduction delay, must be generated in a predictive and feedforward manner. Kawato proposed that the cerebellum accomplishes this by acting as an inverse controller that modulates descending motor commands to predictively drive the spinal cord such that the musculoskeletal dynamics are canceled out. This and other cerebellar theories do not, however, account for the rich biophysical properties expressed by the olivocerebellar complex's various cell types, making these theories difficult to verify experimentally. Here we propose that a multizonal microcomplex's (MZMC) inferior olivary neurons use their subthreshold oscillations to mirror a musculoskeletal joint's underdamped dynamics, thereby achieving inverse control. We used control theory to map a joint's inverse model onto an MZMC's biophysics, and we used biophysical modeling to confirm that inferior olivary neurons can express the dynamics required to mirror biomechanical joints. We then combined both techniques to predict how experimentally injecting current into the inferior olive would affect overall motor output performance. We found that this experimental manipulation unmasked a joint's natural dynamics, as observed by motor output ringing at the joint's natural frequency, with amplitude proportional to the amount of current. These results support the proposal that the cerebellum, in particular an MZMC, is an inverse controller; the results also provide a biophysical implementation for this controller and allow one to make an experimentally testable prediction.
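The underdamped joint dynamics the MZMC is proposed to mirror can be illustrated with a toy simulation. The sketch below uses hypothetical inertia, damping, and stiffness values, not the authors' biophysical model; a brief pulse of drive makes the joint "ring" at its natural frequency, the signature the paper predicts is unmasked by current injection:

```python
# Toy underdamped joint: m*theta'' + b*theta' + k*theta = u(t).
# All parameters are hypothetical; this is not the authors' model.
import numpy as np

m, b, k = 1.0, 0.4, 25.0                 # inertia, damping, stiffness (assumed)
wn = np.sqrt(k / m)                      # natural frequency, 5 rad/s here
dt, theta, omega = 1e-3, 0.0, 0.0
trace = []
for i in range(20000):                   # 20 s of simulated time
    u = 1.0 if i < 200 else 0.0          # brief torque pulse (like injected current)
    alpha = (u - b * omega - k * theta) / m
    omega += dt * alpha                  # explicit Euler integration
    theta += dt * omega
    trace.append(theta)

# Without inverse control the joint rings near wn and decays with tau = 2m/b = 5 s.
print(f"natural frequency {wn:.1f} rad/s; peak early {max(trace[:4000]):.4f}, "
      f"peak late {max(trace[16000:]):.4f}")
```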
Superstitiousness in obsessive-compulsive disorder
Brugger, Peter; Viaud-Delmon, Isabelle
2010-01-01
It has been speculated that superstitiousness and obsessive-compulsive disorder (OCD) exist along a continuum. The distinction between superstitious behavior and superstitious belief, however, is crucial for any theoretical account of claimed associations between superstitiousness and OCD. By demonstrating that there is a dichotomy between behavior and belief, which is experimentally testable, we can differentiate superstitious behavior from superstitious belief, or magical ideation. Different brain circuits are responsible for these two forms of superstitiousness; thus, determining which type of superstition is prominent in the symptomatology of an individual patient may inform us about the primarily affected neurocognitive systems. PMID:20623929
Design for testability and diagnosis at the system-level
NASA Technical Reports Server (NTRS)
Simpson, William R.; Sheppard, John W.
1993-01-01
The growing complexity of full-scale systems has surpassed the capabilities of most simulation software to provide detailed models or gate-level failure analyses. The process of system-level diagnosis approaches the fault-isolation problem in a manner that differs significantly from the traditional and exhaustive failure mode search. System-level diagnosis is based on a functional representation of the system. For example, one can exercise one portion of a radar algorithm (the Fast Fourier Transform (FFT) function) by injecting several standard input patterns and comparing the results to standardized output results. An anomalous output would point to one of several items (including the FFT circuit) without specifying the gate or failure mode. For system-level repair, identifying an anomalous chip is sufficient. We describe here an information theoretic and dependency modeling approach that discards much of the detailed physical knowledge about the system and analyzes its information flow and functional interrelationships. The approach relies on group and flow associations and, as such, is hierarchical. Its hierarchical nature allows the approach to be applicable to any level of complexity and to any repair level. This approach has been incorporated in a product called STAMP (System Testability and Maintenance Program) which was developed and refined through more than 10 years of field-level applications to complex system diagnosis. The results have been outstanding, even spectacular in some cases. In this paper we describe system-level testability, system-level diagnoses, and the STAMP analysis approach, as well as a few STAMP applications.
Authors’ response: mirror neurons: tests and testability.
Catmur, Caroline; Press, Clare; Cook, Richard; Bird, Geoffrey; Heyes, Cecilia
2014-04-01
Commentators have tended to focus on the conceptual framework of our article, the contrast between genetic and associative accounts of mirror neurons, and to challenge it with additional possibilities rather than empirical data. This makes the empirically focused comments especially valuable. The mirror neuron debate is replete with ideas; what it needs now are system-level theories and careful experiments – tests and testability.
Simulating Cancer Growth with Multiscale Agent-Based Modeling
Wang, Zhihui; Butner, Joseph D.; Kerketta, Romica; Cristini, Vittorio; Deisboeck, Thomas S.
2014-01-01
There have been many techniques developed in recent years to in silico model a variety of cancer behaviors. Agent-based modeling is a specific discrete-based hybrid modeling approach that allows simulating the role of diversity in cell populations as well as within each individual cell; it has therefore become a powerful modeling method widely used by computational cancer researchers. Many aspects of tumor morphology including phenotype-changing mutations, the adaptation to microenvironment, the process of angiogenesis, the influence of extracellular matrix, reactions to chemotherapy or surgical intervention, the effects of oxygen and nutrient availability, and metastasis and invasion of healthy tissues have been incorporated and investigated in agent-based models. In this review, we introduce some of the most recent agent-based models that have provided insight into the understanding of cancer growth and invasion, spanning multiple biological scales in time and space, and we further describe several experimentally testable hypotheses generated by those models. We also discuss some of the current challenges of multiscale agent-based cancer models. PMID:24793698
Burger, Gerhard A.; Danen, Erik H. J.; Beltman, Joost B.
2017-01-01
Epithelial–mesenchymal transition (EMT), the process by which epithelial cells can convert into motile mesenchymal cells, plays an important role in development and wound healing but is also involved in cancer progression. It is increasingly recognized that EMT is a dynamic process involving multiple intermediate or “hybrid” phenotypes rather than an “all-or-none” process. However, the role of EMT in various cancer hallmarks, including metastasis, is debated. Given the complexity of EMT regulation, computational modeling has proven to be an invaluable tool for cancer research, i.e., to resolve apparent conflicts in experimental data and to guide experiments by generating testable hypotheses. In this review, we provide an overview of computational modeling efforts that have been applied to regulation of EMT in the context of cancer progression and its associated tumor characteristics. Moreover, we identify possibilities to bridge different modeling approaches and point out outstanding questions in which computational modeling can contribute to advance our understanding of pathological EMT. PMID:28824874
The effect of analytic and experiential modes of thought on moral judgment.
Kvaran, Trevor; Nichols, Shaun; Sanfey, Alan
2013-01-01
According to dual-process theories, moral judgments are the result of two competing processes: a fast, automatic, affect-driven process and a slow, deliberative, reason-based process. Accordingly, these models make clear and testable predictions about the influence of each system. Although a small number of studies have attempted to examine each process independently in the context of moral judgment, no study has yet tried to experimentally manipulate both processes within a single study. In this chapter, a well-established "mode-of-thought" priming technique was used to place participants in either an experiential/emotional or analytic mode while completing a task in which participants provide judgments about a series of moral dilemmas. We predicted that individuals primed analytically would make more utilitarian responses than control participants, while emotional priming would lead to less utilitarian responses. Support was found for both of these predictions. Implications of these findings for dual-process theories of moral judgment will be discussed.
Modeling T-cell activation using gene expression profiling and state-space models.
Rangel, Claudia; Angus, John; Ghahramani, Zoubin; Lioumi, Maria; Sotheran, Elizabeth; Gaiba, Alessia; Wild, David L; Falciani, Francesco
2004-06-12
We have used state-space models to reverse engineer transcriptional networks from highly replicated gene expression profiling time series data obtained from a well-established model of T-cell activation. State space models are a class of dynamic Bayesian networks that assume that the observed measurements depend on some hidden state variables that evolve according to Markovian dynamics. These hidden variables can capture effects that cannot be measured in a gene expression profiling experiment, e.g. genes that have not been included in the microarray, levels of regulatory proteins, the effects of messenger RNA and protein degradation, etc. Bootstrap confidence intervals are developed for parameters representing 'gene-gene' interactions over time. Our models represent the dynamics of T-cell activation and provide a methodology for the development of rational and experimentally testable hypotheses. Supplementary data and Matlab computer source code will be made available on the web at the URL given below. http://public.kgi.edu/~wild/LDS/index.htm
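The class of model described here has a simple generative form. The sketch below simulates a toy linear-Gaussian state-space system; the dimensions, dynamics matrix, and noise levels are assumed for illustration and are not the paper's fitted T-cell model. Hidden states evolve with Markovian dynamics and generate the observed expression levels:

```python
# Toy linear-Gaussian state-space simulation (hypothetical parameters):
# hidden states x_t follow Markovian dynamics; observations y_t = C x_t + noise.
import numpy as np

rng = np.random.default_rng(0)
A = np.array([[0.9, 0.1], [0.0, 0.8]])     # hidden-state dynamics (assumed)
C = rng.normal(size=(5, 2))                # maps 2 hidden factors to 5 "genes"
Q, R = 0.05, 0.1                           # process / observation noise variances

x = np.zeros(2)
ys = []
for t in range(50):                        # one simulated expression time series
    kick = 1.0 if t == 0 else 0.0          # stimulation at t = 0 (e.g. activation)
    x = A @ x + rng.normal(scale=np.sqrt(Q), size=2) + kick
    ys.append(C @ x + rng.normal(scale=np.sqrt(R), size=5))

Y = np.array(ys)                           # 50 time points x 5 genes
print(Y.shape, Y[:3].round(2))
```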
Cheng, Ryan R; Hawk, Alexander T; Makarov, Dmitrii E
2013-02-21
Recent experiments showed that the reconfiguration dynamics of unfolded proteins are often adequately described by simple polymer models. In particular, the Rouse model with internal friction (RIF) captures internal friction effects as observed in single-molecule fluorescence correlation spectroscopy (FCS) studies of a number of proteins. Here we use RIF, and its non-free draining analog, Zimm model with internal friction, to explore the effect of internal friction on the rate with which intramolecular contacts can be formed within the unfolded chain. Unlike the reconfiguration times inferred from FCS experiments, which depend linearly on the solvent viscosity, the first passage times to form intramolecular contacts are shown to display a more complex viscosity dependence. We further describe scaling relationships obeyed by contact formation times in the limits of high and low internal friction. Our findings provide experimentally testable predictions that can serve as a framework for the analysis of future studies of contact formation in proteins.
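The linear solvent-viscosity dependence of the reconfiguration time mentioned above is commonly written with an additive internal-friction term; a sketch of that relation, with notation assumed for illustration rather than taken from this paper:

```latex
% Reconfiguration time in RIF-type models: a solvent contribution linear in
% the solvent viscosity eta_s plus a viscosity-independent internal-friction
% timescale tau_int (notation assumed, not quoted from the paper).
\tau_{\mathrm{rec}}(\eta_s) = \alpha\,\eta_s + \tau_{\mathrm{int}}
```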
[Mechanisms of action of voltage-gated sodium channel ligands].
Tikhonov, D B
2007-05-01
The voltage-gated sodium channels play a key role in the generation of action potentials in excitable cells. Sodium channels are targeted by a number of modulating ligands. Despite numerous studies, the mechanisms of action of many ligands are still unknown. The main cause of the problem is the absence of an experimentally determined channel structure. Sodium channels belong to the superfamily of P-loop channels, which also includes potassium and calcium channels and the channels of ionotropic glutamate receptors. Crystallization of several potassium channels has opened a possibility to analyze the structure of other members of the superfamily using the homology modeling approach. The present study summarizes the results of several recent modelling studies of such sodium channel ligands as tetrodotoxin, batrachotoxin and local anesthetics. Comparison of available experimental data with X-ray structures of potassium channels has provided a new level of understanding of the mechanisms of action of sodium channel ligands and has allowed several testable hypotheses to be proposed.
Reilly, John J; Wells, Jonathan C K
2005-12-01
The WHO recommends exclusive breast-feeding for the first 6 months of life. At present, <2 % of mothers who breast-feed in the UK do so exclusively for 6 months. We propose the testable hypothesis that this is because many mothers do not provide sufficient breast milk to feed a 6-month-old baby adequately. We review recent evidence on energy requirements during infancy, and energy transfer from mother to baby, and consider the adequacy of exclusive breast-feeding to age 6 months for mothers and babies in the developed world. Evidence from our recent systematic review suggests that mean metabolisable energy intake in exclusively breast-fed infants at 6 months is 2.2-2.4 MJ/d (525-574 kcal/d), and mean energy requirement approximately 2.6-2.7 MJ/d (632-649 kcal/d), leading to a gap between the energy provided by milk and energy needs by 6 months for many babies. Our hypothesis is consistent with other evidence, and with evolutionary considerations, and we briefly review this other evidence. The hypothesis would be testable in a longitudinal study of infant energy balance using stable-isotope techniques, which are both practical and valid.
What is a delusion? Epistemological dimensions.
Leeser, J; O'Donohue, W
1999-11-01
Although the Diagnostic and Statistical Manual of Mental Disorders (American Psychiatric Association, 1994) clearly indicates delusions have an epistemic dimension, it fails to accurately identify the epistemic properties of delusions. The authors explicate the regulative causes of belief revision for rational agents and argue that delusions are unresponsive to these. They argue that delusions are (a) protected beliefs made unfalsifiable either in principle or because the agent refuses to admit anything as a potential falsifier; (b) the protected belief is not typically considered a "properly basic" belief; (c) the belief is not of the variety of protected scientific beliefs; (d) in response to an apparent falsification, the subject posits not a simple, testable explanation for the inconsistency but one that is more complicated, less testable, and provides no new corroborations; (e) the subject has a strong emotional attachment to the belief; and (f) the belief is typically supported by (or originates from) trivial occurrences that are interpreted by the subject as highly unusual, significant, having personal reference, or some combination of these.
2014-01-01
Background: Cis-regulatory modules (CRMs), the DNA sequences required for regulating gene expression, play a central role in research on transcriptional regulation in metazoan species. The systematic study of CRMs still relies mainly on computational methods, because experimental methods are time-consuming and small-scale. But the accuracy and reliability of different CRM prediction tools remain unclear. Without comparative cross-analysis of the results and combined consideration of extra experimental information, there is no easy way to assess the confidence of the predicted CRMs. This limits the genome-wide understanding of CRMs.
Description: Transcription factor binding and epigenetic profiles tend to determine the functions of CRMs in gene transcriptional regulation. Integrating genome-wide epigenetic profiles with systematically predicted CRMs can therefore greatly help researchers evaluate the prediction confidence and decipher the possible transcriptional regulatory functions of these potential CRMs. However, these data are still fragmentary in the literature. Here we performed a computational genome-wide screen for potential CRMs using different prediction tools and constructed the pioneer database, cisMEP (cis-regulatory module epigenetic profile database), to integrate these computationally identified CRMs with genomic epigenetic profile data. cisMEP collects literature-curated TFBS location data and nine genres of epigenetic data for assessing the confidence of these potential CRMs and deciphering their possible functionality.
Conclusions: cisMEP aims to provide a user-friendly interface for researchers to assess the confidence of different potential CRMs and to understand the functions of CRMs through experimentally-identified epigenetic profiles. The deposited potential CRMs and experimental epigenetic profiles for confidence assessment provide experimentally testable hypotheses for the molecular mechanisms of metazoan gene regulation. We believe that the information deposited in cisMEP will greatly facilitate the comparative usage of different CRM prediction tools and will help biologists to study the modular regulatory mechanisms between different TFs and their target genes. PMID:25521507
Percolation mechanism drives actin gels to the critically connected state
NASA Astrophysics Data System (ADS)
Lee, Chiu Fan; Pruessner, Gunnar
2016-05-01
Cell motility and tissue morphogenesis depend crucially on the dynamic remodeling of actomyosin networks. An actomyosin network consists of an actin polymer network connected by cross-linker proteins and motor protein myosins that generate internal stresses on the network. A recent discovery shows that for a range of experimental parameters, actomyosin networks contract to clusters with a power-law size distribution [J. Alvarado, Nat. Phys. 9, 591 (2013), 10.1038/nphys2715]. Here, we argue that actomyosin networks can exhibit a robust critical signature without fine-tuning because the dynamics of the system can be mapped onto a modified version of percolation with trapping (PT), which is known to show critical behavior belonging to the static percolation universality class without the need for fine-tuning of a control parameter. We further employ our PT model to generate experimentally testable predictions.
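For readers unfamiliar with the critical signature invoked here, the sketch below samples ordinary site percolation near threshold and histograms cluster sizes; a power-law size distribution is the behaviour the PT mapping predicts for actomyosin clusters. This is plain percolation, not the authors' modified percolation-with-trapping dynamics.

    import numpy as np
    from scipy.ndimage import label

    rng = np.random.default_rng(1)
    n, p = 256, 0.5927                           # p near the square-lattice site threshold

    occupied = rng.random((n, n)) < p
    clusters, _ = label(occupied)                # connected components (4-neighbour)
    sizes = np.bincount(clusters.ravel())[1:]    # cluster sizes, dropping the background bin

    bins = np.logspace(0, np.log10(sizes.max()), 20)
    hist, _ = np.histogram(sizes, bins=bins)
    print(hist)   # roughly power-law decay of cluster sizes at criticality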
NASA Astrophysics Data System (ADS)
Wang, Jun-Wei; Zhou, Tian-Shou
2009-12-01
In this paper, we develop a new mathematical model for the mammalian circadian clock, which incorporates both transcriptional/translational feedback loops (TTFLs) and a cAMP-mediated feedback loop. The model shows that TTFLs and cAMP signalling cooperatively drive the circadian rhythms. It reproduces typical experimental observations with qualitative similarities, e.g. circadian oscillations in constant darkness and entrainment to light-dark cycles. In addition, it can explain the phenotypes of cAMP-mutant and Rev-erbα-/--mutant mice, and help us make an experimentally-testable prediction: oscillations may be rescued when arrhythmic mice with constitutively low concentrations of cAMP are crossed with Rev-erbα-/- mutant mice. The model enhances our understanding of the mammalian circadian clockwork from the viewpoint of the entire cell.
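As a much-reduced stand-in for the TTFL core of such a model, the sketch below integrates a Goodwin-type negative feedback loop, which sustains oscillations for sufficiently steep repression. The published model couples several feedback loops to cAMP signalling and is far larger; all parameters here are invented.

    import numpy as np
    from scipy.integrate import solve_ivp

    def goodwin(t, y, n=10):
        m, p, r = y                       # mRNA, protein, nuclear repressor
        dm = 1.0 / (1 + r**n) - 0.2 * m   # repressed transcription and decay
        dp = 0.5 * m - 0.2 * p
        dr = 0.5 * p - 0.2 * r
        return [dm, dp, dr]

    sol = solve_ivp(goodwin, (0, 200), [0.1, 0.1, 0.1], max_step=0.5)
    print(sol.y[0].min(), sol.y[0].max())  # mRNA swings: the loop oscillates for large n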
DeWitt, S.; Hahn, N.; Zavadil, K.; ...
2015-12-30
Here a new model of electrodeposition and electrodissolution is developed and applied to the evolution of Mg deposits during anode cycling. The model captures Butler-Volmer kinetics, facet evolution, the spatially varying potential in the electrolyte, and the time-dependent electrolyte concentration. The model utilizes a diffuse interface approach, employing the phase field and smoothed boundary methods. Scanning electron microscope (SEM) images of magnesium deposited on a gold substrate show the formation of faceted deposits, often in the form of hexagonal prisms. Orientation-dependent reaction rate coefficients were parameterized using the experimental SEM images. Three-dimensional simulations of the growth of magnesium deposits yield deposit morphologies consistent with the experimental results. The simulations predict that the deposits become narrower and taller as the current density increases due to the depletion of the electrolyte concentration near the sides of the deposits. Increasing the distance between the deposits leads to increased depletion of the electrolyte surrounding the deposit. Two models relating the orientation-dependence of the deposition and dissolution reactions are presented. Finally, the morphology of the Mg deposit after one deposition-dissolution cycle is significantly different between the two orientation-dependence models, providing testable predictions that suggest the underlying physical mechanisms governing morphology evolution during deposition and dissolution.
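The reaction kinetics named in this abstract have a standard closed form; the sketch below evaluates the Butler-Volmer current density as a function of overpotential. The exchange current density and transfer coefficients are placeholders, not the fitted Mg/electrolyte values.

    import numpy as np

    F, R, T = 96485.0, 8.314, 298.15    # C/mol, J/(mol K), K

    def butler_volmer(eta, j0=1.0, alpha_a=0.5, alpha_c=0.5):
        """Net current density at overpotential eta (V); j0 is illustrative."""
        return j0 * (np.exp(alpha_a * F * eta / (R * T))
                     - np.exp(-alpha_c * F * eta / (R * T)))

    for eta in (-0.05, 0.0, 0.05):      # cathodic, equilibrium, anodic
        print(f"{eta:+.2f} V -> {butler_volmer(eta):+.2f}")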
NASA Astrophysics Data System (ADS)
Mitchell, Michael R.; Leibler, Stanislas
2018-05-01
The abundance of available static protein structural data makes the more effective analysis and interpretation of this data a valuable tool to supplement the experimental study of protein mechanics. Structural displacements can be difficult to analyze and interpret. Previously, we showed that strains provide a more natural and interpretable representation of protein deformations, revealing mechanical coupling between spatially distinct sites of allosteric proteins. Here, we demonstrate that other transformations of displacements yield additional insights. We calculate the divergence and curl of deformations of the transmembrane channel KcsA. Additionally, we introduce quantities analogous to bend, splay, and twist deformation energies of nematic liquid crystals. These transformations enable the decomposition of displacements into different modes of deformation, helping to characterize the type of deformation a protein undergoes. We apply these calculations to study the filter and gating regions of KcsA. We observe a continuous path of rotational deformations physically coupling these two regions, and, we propose, underlying the allosteric interaction between these regions. Bend, splay, and twist distinguish KcsA gate opening, filter opening, and filter-gate coupling, respectively. In general, physically meaningful representations of deformations (like strain, curl, bend, splay, and twist) can make testable predictions and yield insights into protein mechanics, augmenting experimental methods and more fully exploiting available structural data.
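The divergence and curl calculations described above are straightforward once a displacement field is in hand; the toy sketch below applies them to a synthetic field on a regular grid (a rotation plus mild extension), standing in for the per-residue fields derived from pairs of crystal structures.

    import numpy as np

    n = 16
    x, y, z = np.meshgrid(*(np.linspace(-1, 1, n),) * 3, indexing="ij")
    u = np.stack([y, -x, 0.1 * z])       # rotational field plus mild extension

    h = 2.0 / (n - 1)
    g = [np.gradient(u[i], h, h, h) for i in range(3)]   # g[i][j] = du_i/dx_j

    div = g[0][0] + g[1][1] + g[2][2]    # local volume change
    curl_z = g[1][0] - g[0][1]           # local rotation about z
    print(div.mean(), curl_z.mean())     # expect ~0.1 (extension) and ~-2 (rotation)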
Stalk model of membrane fusion: solution of energy crisis.
Kozlovsky, Yonathan; Kozlov, Michael M
2002-01-01
Membrane fusion proceeds via formation of intermediate nonbilayer structures. The stalk model of the fusion intermediate is commonly recognized to account for the major phenomenology of the fusion process. However, in its current form, the stalk model poses a challenge. On one hand, it is able to describe qualitatively the modulation of the fusion reaction by the lipid composition of the membranes. On the other, it predicts very large values of the stalk energy, so that the related energy barrier for fusion cannot be overcome by membranes within a biologically reasonable span of time. We suggest a new structure for the fusion stalk, which resolves the energy crisis of the model. Our approach is based on a combined deformation of the stalk membrane including bending of the membrane surface and tilt of the hydrocarbon chains of lipid molecules. We demonstrate that the energy of the fusion stalk is a few times smaller than previously predicted, so that stalks are feasible in real systems. We account quantitatively for the experimental results on the dependence of the fusion reaction on the lipid composition of different membrane monolayers. We analyze the dependence of the stalk energy on the distance between the fusing membranes and provide experimentally testable predictions for the structural features of the stalk intermediates. PMID:11806930
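For orientation, the combined bending-plus-tilt elastic energy per monolayer underlying this kind of analysis is usually written in the Hamm-Kozlov form (a textbook expression, not necessarily the exact functional minimized in this paper):

    F = \int \left[ \frac{\kappa}{2} \left( \nabla\cdot\mathbf{n} - J_s \right)^{2} + \frac{\kappa_t}{2}\,\mathbf{t}^{2} \right] \mathrm{d}A

where n is the lipid director, t the tilt vector, J_s the monolayer spontaneous curvature, and κ and κ_t the bending and tilt moduli. Allowing nonzero tilt is what relaxes the prohibitive bending energy of earlier, bending-only stalk models.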
Cruz-Morales, Pablo; Ramos-Aboites, Hilda E; Licona-Cassani, Cuauhtémoc; Selem-Mójica, Nelly; Mejía-Ponce, Paulina M; Souza-Saldívar, Valeria; Barona-Gómez, Francisco
2017-09-01
Desferrioxamines are hydroxamate siderophores widely conserved in both aquatic and soil-dwelling Actinobacteria. While the genetic and enzymatic bases of siderophore biosynthesis and transport in model families of this phylum are well understood, evolutionary studies are lacking. Here, we perform a comprehensive desferrioxamine-centric (des genes) phylogenomic analysis, which includes the genomes of six novel strains isolated from an iron- and phosphorus-depleted oasis in the Chihuahuan desert of Mexico. Our analyses reveal previously unnoticed desferrioxamine evolutionary patterns, involving both biosynthetic and transport genes, likely to be related to the chemical diversity of desferrioxamines. The identified patterns were used to postulate experimentally testable hypotheses after phenotypic characterization, including profiling of siderophore production and growth stimulation of co-cultures under iron deficiency. Based on our results, we propose a novel des gene, which we term desG, as responsible for the incorporation of phenylacetyl moieties during biosynthesis of previously reported arylated desferrioxamines. Moreover, a genomics-based classification of the siderophore-binding proteins responsible for specific and generalist siderophore assimilation is postulated. This report provides a much-needed evolutionary framework, with specific insights supported by experimental data, to direct the future ecological and functional analysis of desferrioxamines in the environment. © FEMS 2017.
Canales, Javier; Moyano, Tomás C.; Villarroel, Eva; Gutiérrez, Rodrigo A.
2014-01-01
Nitrogen (N) is an essential macronutrient for plant growth and development. Plants adapt to changes in N availability partly by changes in global gene expression. We integrated publicly available root microarray data under contrasting nitrate conditions to identify new genes and functions important for adaptive nitrate responses in Arabidopsis thaliana roots. Overall, more than 2000 genes exhibited changes in expression in response to nitrate treatments in Arabidopsis thaliana root organs. Global regulation of gene expression by nitrate depends largely on the experimental context. However, despite significant differences from experiment to experiment in the identity of regulated genes, there is a robust nitrate response of specific biological functions. Integrative gene network analysis uncovered relationships between nitrate-responsive genes and 11 highly co-expressed gene clusters (modules). Four of these gene network modules have robust nitrate-responsive functions such as transport, signaling, and metabolism. Network analysis suggested that G2-like transcription factors are key regulatory factors controlling transport and signaling functions. Our meta-analysis highlights the role of biological processes not studied before in the context of the nitrate response, such as root hair development, and provides testable hypotheses to advance our understanding of nitrate responses in plants. PMID:24570678
Mathematical modeling of the female reproductive system: from oocyte to delivery.
Clark, Alys R; Kruger, Jennifer A
2017-01-01
From ovulation to delivery, and through the menstrual cycle, the female reproductive system undergoes many dynamic changes to provide an optimal environment for the embryo to implant, and to develop successfully. It is difficult ethically and practically to observe the system over the timescales involved in growth and development (often hours to days). Even in carefully monitored conditions clinicians and biologists can only see snapshots of the development process. Mathematical models are emerging as a key means to supplement our knowledge of the reproductive process, and to tease apart complexity in the reproductive system. These models have been used successfully to test existing hypotheses regarding the mechanisms of female infertility and pathological fetal development, and also to provide new experimentally testable hypotheses regarding the process of development. This new knowledge has allowed for improvements in assisted reproductive technologies and is moving toward translation to clinical practice via multiscale assessments of the dynamics of ovulation, development in pregnancy, and the timing and mechanics of delivery. WIREs Syst Biol Med 2017, 9:e1353. doi: 10.1002/wsbm.1353 For further resources related to this article, please visit the WIREs website. © 2016 Wiley Periodicals, Inc.
Taking Bioinformatics to Systems Medicine.
van Kampen, Antoine H C; Moerland, Perry D
2016-01-01
Systems medicine promotes a range of approaches and strategies to study human health and disease at a systems level with the aim of improving the overall well-being of (healthy) individuals, and preventing, diagnosing, or curing disease. In this chapter we discuss how bioinformatics critically contributes to systems medicine. First, we explain the role of bioinformatics in the management and analysis of data. In particular we show the importance of publicly available biological and clinical repositories to support systems medicine studies. Second, we discuss how the integration and analysis of multiple types of omics data through integrative bioinformatics may facilitate the determination of more predictive and robust disease signatures, lead to a better understanding of (patho)physiological molecular mechanisms, and facilitate personalized medicine. Third, we focus on network analysis and discuss how gene networks can be constructed from omics data and how these networks can be decomposed into smaller modules. We discuss how the resulting modules can be used to generate experimentally testable hypotheses, provide insight into disease mechanisms, and lead to predictive models. Throughout, we provide several examples demonstrating how bioinformatics contributes to systems medicine and discuss future challenges in bioinformatics that need to be addressed to enable the advancement of systems medicine.
Phase 1 Space Fission Propulsion Energy Source Design
NASA Technical Reports Server (NTRS)
Houts, Mike; VanDyke, Melissa; Godfroy, Tom; Pedersen, Kevin; Martin, James; Dickens, Ricky; Salvail, Pat; Hrbud, Ivana; Carter, Robert; Rodgers, Stephen L. (Technical Monitor)
2002-01-01
Fission technology can enable rapid, affordable access to any point in the solar system. If fission propulsion systems are to be developed to their full potential, however, near-term customers must be identified and initial fission systems successfully developed, launched, and operated. Studies conducted in fiscal year 2001 (IISTP, 2001) show that fission electric propulsion (FEP) systems with a specific mass at or below 50 kg/kWjet could enhance or enable numerous robotic outer solar system missions of interest. At the required specific mass, it is possible to develop safe, affordable systems that meet mission requirements. To help select the system design to pursue, eight evaluation criteria were identified: system integration, safety, reliability, testability, specific mass, cost, schedule, and programmatic risk. A top-level comparison of four potential concepts was performed: a Testable, Passive, Redundant Reactor (TPRR), a Testable Multi-Cell In-Core Thermionic Reactor (TMCT), a Direct Gas Cooled Reactor (DGCR), and a Pumped Liquid Metal Reactor (PLMR). Development of any of the four systems appears feasible. However, for power levels up to at least 500 kWt (enabling electric power levels of 125-175 kWe, given 25-35% power conversion efficiency) the TPRR has advantages related to several criteria and is competitive with respect to all. Hardware-based research and development has further increased confidence in the TPRR approach. Successful development and utilization of a Phase 1 fission electric propulsion system will enable advanced Phase 2 and Phase 3 systems capable of providing rapid, affordable access to any point in the solar system.
1982-10-01
e.g., providing voters in TMR systems and detection-switching requirements in standby-sparing systems. The application of mathematical theory of ... and time redundancy required for error detection and correction, are interrelated. Mathematical modeling, when applied to fault tolerant systems, can ...
Robertson, Scott; Leonhardt, Ulf
2014-11-01
Hawking radiation has become experimentally testable thanks to the many analog systems which mimic the effects of the event horizon on wave propagation. These systems are typically dominated by dispersion and give rise to a numerically soluble and stable ordinary differential equation only if the rest-frame dispersion relation Ω^{2}(k) is a polynomial of relatively low degree. Here we present a new method for the calculation of wave scattering in a one-dimensional medium of arbitrary dispersion. It views the wave equation as an integral equation in Fourier space, which can be solved using standard and efficient numerical techniques.
Proposed experiment to test fundamentally binary theories
NASA Astrophysics Data System (ADS)
Kleinmann, Matthias; Vértesi, Tamás; Cabello, Adán
2017-09-01
Fundamentally binary theories are nonsignaling theories in which measurements of many outcomes are constructed by selecting from binary measurements. They constitute a sensible alternative to quantum theory and have never been directly falsified by any experiment. Here we show that fundamentally binary theories are experimentally testable with current technology. For that, we identify a feasible Bell-type experiment on pairs of entangled qutrits. In addition, we prove that, for any n, quantum n-ary correlations are not fundamentally (n-1)-ary. For that, we introduce a family of inequalities that hold for fundamentally (n-1)-ary theories but are violated by quantum n-ary correlations.
Emergent quantum mechanics without wavefunctions
NASA Astrophysics Data System (ADS)
Mesa Pascasio, J.; Fussy, S.; Schwabl, H.; Grössing, G.
2016-03-01
We present our model of an Emergent Quantum Mechanics which can be characterized by “realism without pre-determination”. This is illustrated by our analytic description and corresponding computer simulations of Bohmian-like “surreal” trajectories, which are obtained classically, i.e. without the use of any quantum mechanical tool such as wavefunctions. However, these trajectories do not necessarily represent ontological paths of particles but rather mappings of the probability density flux in a hydrodynamical sense. Modelling emergent quantum mechanics in a high-low intensity double-slit scenario gives rise to the “quantum sweeper effect” with a characteristic intensity pattern. This phenomenon should be experimentally testable via weak measurement techniques.
Subluxation: dogma or science?
Keating, Joseph C; Charlton, Keith H; Grod, Jaroslaw P; Perle, Stephen M; Sikorski, David; Winterstein, James F
2005-01-01
Subluxation syndrome is a legitimate, potentially testable, theoretical construct for which there is little experimental evidence. Acceptable as hypothesis, the widespread assertion of the clinical meaningfulness of this notion brings ridicule from the scientific and health care communities and confusion within the chiropractic profession. We believe that an evidence-orientation among chiropractors requires that we distinguish between subluxation dogma vs. subluxation as the potential focus of clinical research. We lament efforts to generate unity within the profession through consensus statements concerning subluxation dogma, and believe that cultural authority will continue to elude us so long as we assert dogma as though it were validated clinical theory. PMID:16092955
NASA Astrophysics Data System (ADS)
Tu, K. M.; Matubayasi, N.; Liang, K. K.; Todorov, I. T.; Chan, S. L.; Chau, P.-L.
2012-08-01
We placed halothane, a general anaesthetic, inside palmitoyloleoylphosphatidylcholine (POPC) bilayers and performed molecular dynamics simulations at atmospheric and raised pressures. We demonstrated that halothane aggregated inside POPC membranes at 20 MPa but not at 40 MPa. The pressure range of aggregation matches that of pressure reversal in whole animals, and strongly suggests that this could be the mechanism for this effect. Combining these results with previous experimental data, we describe a testable hypothesis of how aggregation of general anaesthetics at high pressure can lead to pressure reversal, the effect whereby these drugs lose their efficacy at high pressure.
Han, Pu; Deem, Michael W
2017-02-01
CRISPR is a newly discovered prokaryotic immune system. Bacteria and archaea with this system incorporate genetic material from invading viruses into their genomes, providing protection against future infection by similar viruses. The condition for coexistence of prokaryotes and viruses is an interesting problem in evolutionary biology. In this work, we show an intriguing phase diagram of the virus extinction probability, which is more complex than that of the classical predator-prey model. As the CRISPR incorporates genetic material, viruses are under pressure to evolve to escape recognition by CRISPR. When bacteria have a small rate of deleting spacers, a new parameter region in which bacteria and viruses can coexist arises, and it leads to a more complex coexistence pattern for bacteria and viruses. For example, when the virus mutation rate is low, the virus extinction probability changes non-monotonically with the bacterial exposure rate. The virus-bacteria coevolution not only alters the virus extinction probability, but also changes the bacterial population structure. Additionally, we show that recombination is a successful strategy for viruses to escape from CRISPR recognition when viruses have multiple proto-spacers, providing support for a recombination-mediated escape mechanism suggested experimentally. Finally, we suggest that the re-entrant phase diagram, in which phages can progress through three phases of extinction and two phases of abundance at low spacer deletion rates as a function of exposure rate to bacteria, is an experimentally testable phenomenon. © 2017 The Author(s).
Broken SU(3) antidecuplet for Θ+ and Ξ3/2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pakvasa, Sandip; Suzuki, Mahiko
2004-05-05
If the narrow exotic baryon resonances Θ+(1540) and Ξ3/2 are members of the J^P = 1/2^+ antidecuplet with N*(1710), the octet-antidecuplet mixing is required not only by the mass spectrum but also by the decay pattern of N*(1710). This casts doubt on the validity of the Θ+ mass prediction by the chiral soliton model. While all pieces of the existing experimental information point to a small octet-antidecuplet mixing, the magnitude of mixing required by the mass spectrum is not consistent with the value needed to account for the hadronic decay rates. The discrepancy is not resolved even after the large experimental uncertainty is taken into consideration. We fail to find an alternative SU(3) assignment even with different spin-parity assignment. When we extend the analysis to mixing with a higher SU(3) multiplet, we find one experimentally testable scenario in the case of mixing with a 27-plet.
Testability Design Rating System: Testability Handbook. Volume 1
1992-02-01
4.7.5 Summary of False BIT Alarms (FBA); 4.7.6 Smart BIT Technique (reference: RADC-TR-85-198). "Smart" BIT is a term given to BIT circuitry in a system LRU which includes dedicated processor/memory ...
Smart substrates: Making multi-chip modules smarter
NASA Astrophysics Data System (ADS)
Wunsch, T. F.; Treece, R. K.
1995-05-01
A novel multi-chip module (MCM) design and manufacturing methodology which utilizes active CMOS circuits in what is normally a passive substrate realizes the 'smart substrate' for use in highly testable, high-reliability MCMs. The active devices are used to test the bare substrate, diagnose assembly errors or integrated circuit (IC) failures that require rework, and improve the testability of the final MCM assembly. A static random access memory (SRAM) MCM has been designed and fabricated in Sandia's Microelectronics Development Laboratory in order to demonstrate the technical feasibility of this concept and to examine design and manufacturing issues which will ultimately determine the economic viability of this approach. The smart substrate memory MCM represents a first in MCM packaging. At the time the first modules were fabricated, no other company or MCM vendor had incorporated active devices in the substrate to improve manufacturability and testability, and thereby improve MCM reliability and reduce cost.
Merks, Roeland M H; Guravage, Michael; Inzé, Dirk; Beemster, Gerrit T S
2011-02-01
Plant organs, including leaves and roots, develop by means of a multilevel cross talk between gene regulation, patterned cell division and cell expansion, and tissue mechanics. The multilevel regulatory mechanisms complicate classic molecular genetics or functional genomics approaches to biological development, because these methodologies implicitly assume a direct relation between genes and traits at the level of the whole plant or organ. Instead, understanding gene function requires insight into the roles of gene products in regulatory networks, the conditions of gene expression, etc. This interplay is impossible to understand intuitively. Mathematical and computer modeling allows researchers to design new hypotheses and produce experimentally testable insights. However, the required mathematics and programming experience makes modeling poorly accessible to experimental biologists. Problem-solving environments provide biologically intuitive in silico objects ("cells", "regulation networks") required for setting up a simulation and present those to the user in terms of familiar, biological terminology. Here, we introduce the cell-based computer modeling framework VirtualLeaf for plant tissue morphogenesis. The current version defines a set of biologically intuitive C++ objects, including cells, cell walls, and diffusing and reacting chemicals, that provide useful abstractions for building biological simulations of developmental processes. We present a step-by-step introduction to building models with VirtualLeaf, providing basic example models of leaf venation and meristem development. VirtualLeaf-based models provide a means for plant researchers to analyze the function of developmental genes in the context of the biophysics of growth and patterning. VirtualLeaf is an ongoing open-source software project (http://virtualleaf.googlecode.com) that runs on Windows, Mac, and Linux.
Jo, Sunhwan; Bahar, Ivet; Roux, Benoît
2014-01-01
Biomolecular conformational transitions are essential to biological functions. Most experimental methods report on the long-lived functional states of biomolecules, but information about the transition pathways between these stable states is generally scarce. Such transitions involve short-lived conformational states that are difficult to detect experimentally. For this reason, computational methods are needed to produce plausible hypothetical transition pathways that can then be probed experimentally. Here we propose a simple and computationally efficient method, called ANMPathway, for constructing a physically reasonable pathway between two endpoints of a conformational transition. We adopt a coarse-grained representation of the protein and construct a two-state potential by combining two elastic network models (ENMs) representative of the experimental structures resolved for the endpoints. The two-state potential has a cusp hypersurface in the configuration space where the energies from both the ENMs are equal. We first search for the minimum energy structure on the cusp hypersurface and then treat it as the transition state. The continuous pathway is subsequently constructed by following the steepest descent energy minimization trajectories starting from the transition state on each side of the cusp hypersurface. Application to several systems of broad biological interest such as adenylate kinase, ATP-driven calcium pump SERCA, leucine transporter and glutamate transporter shows that ANMPathway yields results in good agreement with those from other similar methods and with data obtained from all-atom molecular dynamics simulations, in support of the utility of this simple and efficient approach. Notably the method provides experimentally testable predictions, including the formation of non-native contacts during the transition which we were able to detect in two of the systems we studied. An open-access web server has been created to deliver ANMPathway results. PMID:24699246
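A toy version of the ANMPathway construction: two quadratic basins stand in for the endpoint elastic network models, and a constrained search finds the lowest-energy point on the cusp E1 = E2, the putative transition state. Real ANMs act on all residue coordinates; here x is two-dimensional and all stiffnesses are invented.

    import numpy as np
    from scipy.optimize import minimize

    x1, x2 = np.array([0.0, 0.0]), np.array([3.0, 1.0])   # endpoint "structures"
    K1, K2 = np.diag([1.0, 2.0]), np.diag([2.0, 1.0])     # illustrative stiffnesses

    E1 = lambda x: 0.5 * (x - x1) @ K1 @ (x - x1)
    E2 = lambda x: 0.5 * (x - x2) @ K2 @ (x - x2) + 0.5   # basin-2 energy offset

    res = minimize(E1, x0=(x1 + x2) / 2,                  # minimize E1 on the cusp
                   constraints={"type": "eq", "fun": lambda x: E1(x) - E2(x)})
    print("transition-state estimate:", res.x, "energy:", E1(res.x))

Steepest-descent paths started on either side of this point would then trace the continuous transition pathway.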
It takes two to talk: a second-person neuroscience approach to language learning.
Syal, Supriya; Anderson, Adam K
2013-08-01
Language is a social act. We have previously argued that language remains embedded in sociality because the motivation to communicate exists only within a social context. Schilbach et al. underscore the importance of studying linguistic behavior from within the motivated, socially interactive frame in which it is learnt and used, as well as provide testable hypotheses for a participatory, second-person neuroscience approach to language learning.
Empirical approaches to the study of language evolution.
Fitch, W Tecumseh
2017-02-01
The study of language evolution, and human cognitive evolution more generally, has often been ridiculed as unscientific, but in fact it differs little from many other disciplines that investigate past events, such as geology or cosmology. Well-crafted models of language evolution make numerous testable hypotheses, and if the principles of strong inference (simultaneous testing of multiple plausible hypotheses) are adopted, there is an increasing amount of relevant data allowing empirical evaluation of such models. The articles in this special issue provide a concise overview of current models of language evolution, emphasizing the testable predictions that they make, along with overviews of the many sources of data available to test them (emphasizing comparative, neural, and genetic data). The key challenge facing the study of language evolution is not a lack of data, but rather a weak commitment to hypothesis-testing approaches and strong inference, exacerbated by the broad and highly interdisciplinary nature of the relevant data. This introduction offers an overview of the field, and a summary of what needed to evolve to provide our species with language-ready brains. It then briefly discusses different contemporary models of language evolution, followed by an overview of different sources of data to test these models. I conclude with my own multistage model of how different components of language could have evolved.
NASA Space Flight Vehicle Fault Isolation Challenges
NASA Technical Reports Server (NTRS)
Bramon, Christopher; Inman, Sharon K.; Neeley, James R.; Jones, James V.; Tuttle, Loraine
2016-01-01
The Space Launch System (SLS) is the new NASA heavy lift launch vehicle and is scheduled for its first mission in 2017. The goal of the first mission, which will be uncrewed, is to demonstrate the integrated system performance of the SLS rocket and spacecraft before a crewed flight in 2021. SLS has many of the same logistics challenges as any other large scale program. Common logistics concerns for SLS include integration of discrete programs geographically separated, multiple prime contractors with distinct and different goals, schedule pressures and funding constraints. However, SLS also faces unique challenges. The new program is a confluence of new hardware and heritage, with heritage hardware constituting seventy-five percent of the program. This unique approach to design makes logistics concerns such as testability of the integrated flight vehicle especially problematic. The cost of fully automated diagnostics can be completely justified for a large fleet, but not so for a single flight vehicle. Fault detection is mandatory to assure the vehicle is capable of a safe launch, but fault isolation is another issue. SLS has considered various methods for fault isolation which can provide a reasonable balance between adequacy, timeliness and cost. This paper will address the analyses and decisions the NASA Logistics engineers are making to mitigate risk while providing a reasonable testability solution for fault isolation.
Integrated PK-PD and agent-based modeling in oncology.
Wang, Zhihui; Butner, Joseph D; Cristini, Vittorio; Deisboeck, Thomas S
2015-04-01
Mathematical modeling has become a valuable tool that strives to complement conventional biomedical research modalities in order to predict experimental outcome, generate new medical hypotheses, and optimize clinical therapies. Two specific approaches, pharmacokinetic-pharmacodynamic (PK-PD) modeling, and agent-based modeling (ABM), have been widely applied in cancer research. While they have made important contributions on their own (e.g., PK-PD in examining chemotherapy drug efficacy and resistance, and ABM in describing and predicting tumor growth and metastasis), only a few groups have started to combine both approaches together in an effort to gain more insights into the details of drug dynamics and the resulting impact on tumor growth. In this review, we focus our discussion on some of the most recent modeling studies building on a combined PK-PD and ABM approach that have generated experimentally testable hypotheses. Some future directions are also discussed.
Serotonergic Psychedelics: Experimental Approaches for Assessing Mechanisms of Action.
Canal, Clinton E
2018-03-13
Recent, well-controlled - albeit small-scale - clinical trials show that serotonergic psychedelics, including psilocybin and lysergic acid diethylamide, possess great promise for treating psychiatric disorders, including treatment-resistant depression. Additionally, fresh results from a deluge of clinical neuroimaging studies are unveiling the dynamic effects of serotonergic psychedelics on functional activity within, and connectivity across, discrete neural systems. These observations have led to testable hypotheses regarding neural processing mechanisms that contribute to psychedelic effects and therapeutic benefits. Despite these advances and a plethora of preclinical and clinical observations supporting a central role for brain serotonin 5-HT2A receptors in producing serotonergic psychedelic effects, lingering and new questions about mechanisms abound. These chiefly pertain to molecular neuropharmacology. This chapter is devoted to illuminating and discussing such questions in the context of preclinical experimental approaches for studying mechanisms of action of serotonergic psychedelics, classic and new.
Simulating cancer growth with multiscale agent-based modeling.
Wang, Zhihui; Butner, Joseph D; Kerketta, Romica; Cristini, Vittorio; Deisboeck, Thomas S
2015-02-01
There have been many techniques developed in recent years to in silico model a variety of cancer behaviors. Agent-based modeling is a specific discrete-based hybrid modeling approach that allows simulating the role of diversity in cell populations as well as within each individual cell; it has therefore become a powerful modeling method widely used by computational cancer researchers. Many aspects of tumor morphology including phenotype-changing mutations, the adaptation to microenvironment, the process of angiogenesis, the influence of extracellular matrix, reactions to chemotherapy or surgical intervention, the effects of oxygen and nutrient availability, and metastasis and invasion of healthy tissues have been incorporated and investigated in agent-based models. In this review, we introduce some of the most recent agent-based models that have provided insight into the understanding of cancer growth and invasion, spanning multiple biological scales in time and space, and we further describe several experimentally testable hypotheses generated by those models. We also discuss some of the current challenges of multiscale agent-based cancer models. Copyright © 2014 Elsevier Ltd. All rights reserved.
Multimodal transport and dispersion of organelles in narrow tubular cells
NASA Astrophysics Data System (ADS)
Mogre, Saurabh S.; Koslover, Elena F.
2018-04-01
Intracellular components explore the cytoplasm via active motor-driven transport in conjunction with passive diffusion. We model the motion of organelles in narrow tubular cells using analytical techniques and numerical simulations to study the efficiency of different transport modes in achieving various cellular objectives. Our model describes length and time scales over which each transport mode dominates organelle motion, along with various metrics to quantify exploration of intracellular space. For organelles that search for a specific target, we obtain the average capture time for given transport parameters and show that diffusion and active motion contribute to target capture in the biologically relevant regime. Because many organelles have been found to tether to microtubules when not engaged in active motion, we study the interplay between immobilization due to tethering and increased probability of active transport. We derive parameter-dependent conditions under which tethering enhances long-range transport and improves the target capture time. These results shed light on the optimization of intracellular transport machinery and provide experimentally testable predictions for the effects of transport regulation mechanisms such as tethering.
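A minimal Monte Carlo sketch of the target-capture problem posed above: a particle in a one-dimensional "tube" alternates between diffusion and motor-driven runs, with an absorbing target at one end. All rates, speeds and lengths are invented; the paper's analysis is largely analytical.

    import numpy as np

    rng = np.random.default_rng(2)
    L, D, v = 10.0, 0.1, 1.0           # tube length, diffusivity, run speed
    k_on, k_off, dt = 0.5, 0.5, 5e-3   # mode-switching rates, time step

    def capture_time():
        x, t, active, sgn = L / 2, 0.0, False, 1
        while x > 0:                               # absorbing target at x = 0
            if rng.random() < (k_off if active else k_on) * dt:
                active = not active
                sgn = rng.choice((-1, 1))          # pick a new run direction
            x += v * sgn * dt if active else np.sqrt(2 * D * dt) * rng.normal()
            x = min(x, L)                          # reflecting far end
            t += dt
        return t

    print(np.mean([capture_time() for _ in range(10)]))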
Conroy, M.J.; Runge, M.C.; Nichols, J.D.; Stodola, K.W.; Cooper, R.J.
2011-01-01
The broad physical and biological principles behind climate change and its potential large-scale ecological impacts on biota are fairly well understood, although likely responses of biotic communities at fine spatio-temporal scales are not, limiting the ability of conservation programs to respond effectively to climate change outside the range of human experience. Much of the climate debate has focused on attempts to resolve key uncertainties in a hypothesis-testing framework. However, conservation decisions cannot await resolution of these scientific issues and instead must proceed in the face of uncertainty. We suggest that conservation should proceed in an adaptive management framework, in which decisions are guided by predictions under multiple, plausible hypotheses about climate impacts. Under this plan, monitoring is used to evaluate the response of the system to climate drivers, and management actions (perhaps experimental) are used to confront testable predictions with data, in turn providing feedback for future decision making. We illustrate these principles with the problem of mitigating the effects of climate change on terrestrial bird communities in the southern Appalachian Mountains, USA. © 2010 Elsevier Ltd.
Ditlev, Jonathon A; Mayer, Bruce J; Loew, Leslie M
2013-02-05
Mathematical modeling has established its value for investigating the interplay of biochemical and mechanical mechanisms underlying actin-based motility. Because of the complex nature of actin dynamics and its regulation, many of these models are phenomenological or conceptual, providing a general understanding of the physics at play. But the wealth of carefully measured kinetic data on the interactions of many of the players in actin biochemistry cries out for the creation of more detailed and accurate models that could permit investigators to dissect interdependent roles of individual molecular components. Moreover, no human mind can assimilate all of the mechanisms underlying complex protein networks; so an additional benefit of a detailed kinetic model is that the numerous binding proteins, signaling mechanisms, and biochemical reactions can be computationally organized in a fully explicit, accessible, visualizable, and reusable structure. In this review, we will focus on how comprehensive and adaptable modeling allows investigators to explain experimental observations and develop testable hypotheses on the intracellular dynamics of the actin cytoskeleton. Copyright © 2013 Biophysical Society. Published by Elsevier Inc. All rights reserved.
Describing Myxococcus xanthus Aggregation Using Ostwald Ripening Equations for Thin Liquid Films
Bahar, Fatmagül; Pratt-Szeliga, Philip C.; Angus, Stuart; Guo, Jiaye; Welch, Roy D.
2014-01-01
When starved, a swarm of millions of Myxococcus xanthus cells coordinate their movement from outward swarming to inward coalescence. The cells then execute a synchronous program of multicellular development, arranging themselves into dome-shaped aggregates. Over the course of development, about half of the initial aggregates disappear, while others persist and mature into fruiting bodies. This work seeks to develop a quantitative model for aggregation that accurately simulates which aggregates will disappear and which will persist. We analyzed time-lapse movies of M. xanthus development, modeled aggregation using the equations that describe Ostwald ripening of droplets in thin liquid films, and predicted the disappearance and persistence of aggregates with an average accuracy of 85%. We then experimentally validated a prediction that is fundamental to this model by tracking individual fluorescent cells as they moved between aggregates and demonstrating that cell movement towards and away from aggregates correlates with aggregate disappearance. Describing development through this model may limit the number and type of molecular genetic signals needed to complete M. xanthus development, and it provides numerous additional testable predictions. PMID:25231319
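The persist-or-disappear dynamics described above can be caricatured with a Lifshitz-Slyozov-style ripening rule in which aggregates below a critical radius shrink and larger ones grow. The sketch below is only that caricature, with invented rates; it is not the thin-film equations fitted in the paper.

    import numpy as np

    rng = np.random.default_rng(3)
    R = rng.uniform(0.5, 1.5, size=50)      # initial aggregate radii (arb. units)
    dt = 1e-3
    for _ in range(20000):
        alive = R > 0
        if not alive.any():
            break
        Rc = R[alive].mean()                # critical radius ~ current mean radius
        dR = np.zeros_like(R)
        dR[alive] = (R[alive] - Rc) / R[alive] ** 2   # shrink below Rc, grow above
        R = np.maximum(R + dt * dR, 0.0)    # radius 0 = aggregate has disappeared

    print(int((R > 0).sum()), "of 50 aggregates persist")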
Flight control system design factors for applying automated testing techniques
NASA Technical Reports Server (NTRS)
Sitz, Joel R.; Vernon, Todd H.
1990-01-01
The principal design features and operational experiences of the X-29 forward-swept-wing aircraft and F-18 high alpha research vehicle (HARV) automated test systems are discussed. It is noted that operational experiences in developing and using these automated testing techniques have highlighted the need for incorporating target system features to improve testability. Improved target system testability can be accomplished with the addition of nonreal-time and real-time features. Online access to target system implementation details, unobtrusive real-time access to internal user-selectable variables, and proper software instrumentation are all desirable features of the target system. Also, test system and target system design issues must be addressed during the early stages of the target system development. Processing speeds of up to 20 million instructions/s and the development of high-bandwidth reflective memory systems have improved the ability to integrate the target system and test system for the application of automated testing techniques. It is concluded that new methods of designing testability into the target systems are required.
Beyond the bucket: testing the effect of experimental design on rate and sequence of decay
NASA Astrophysics Data System (ADS)
Gabbott, Sarah; Murdock, Duncan; Purnell, Mark
2016-04-01
Experimental decay has revealed the potential for profound biases in our interpretations of exceptionally preserved fossils, with non-random sequences of character loss distorting the position of fossil taxa in phylogenetic trees. By characterising these sequences we can rewind this distortion and make better-informed interpretations of the affinity of enigmatic fossil taxa. Equally, the rate of character loss is crucial for estimating the preservation potential of phylogenetically informative characters, and revealing the mechanisms of preservation themselves. However, experimental decay has been criticised for poorly modelling 'real' conditions, and dismissed as unsophisticated 'bucket science'. Here we test the effect of differing experimental parameters on the rate and sequence of decay. By doing so, we can test the assumption that the results of decay experiments are applicable to informing interpretations of exceptionally preserved fossils from diverse preservational settings. The results of our experiments demonstrate the validity of using the sequence of character loss as a phylogenetic tool, and shed light on the extent to which environment must be considered before making decay-informed interpretations, or reconstructing taphonomic pathways. With careful consideration of experimental design, driven by testable hypotheses, decay experiments are robust and informative - experimental taphonomy needn't kick the bucket just yet.
Inter-Universal Quantum Entanglement
NASA Astrophysics Data System (ADS)
Robles-Pérez, S. J.; González-Díaz, P. F.
2015-01-01
The boundary conditions to be imposed on the quantum state of the whole multiverse could be such that the universes would be created in entangled pairs. Then, interuniversal entanglement would provide us with a vacuum energy for each single universe that might be fitted with observational data, making testable not only the multiverse proposal but also the boundary conditions of the multiverse. Furthermore, the second law of the entanglement thermodynamics would enhance the expansion of the single universes.
All biology is computational biology.
Markowetz, Florian
2017-03-01
Here, I argue that computational thinking and techniques are so central to the quest of understanding life that today all biology is computational biology. Computational biology brings order into our understanding of life, it makes biological concepts rigorous and testable, and it provides a reference map that holds together individual insights. The next modern synthesis in biology will be driven by mathematical, statistical, and computational methods being absorbed into mainstream biological training, turning biology into a quantitative science.
Abu Bakar, Nurul Farhana; Chen, Ai-Hong
2014-02-01
Children with learning disabilities might have difficulties communicating effectively and giving reliable responses as required in various visual function testing procedures. The purpose of this study was to compare the testability of visual acuity using the modified Early Treatment Diabetic Retinopathy Study (ETDRS) chart and Cambridge Crowding Cards, stereo acuity using the Lang Stereo test II and Butterfly stereo tests, and colour perception using the Colour Vision Test Made Easy (CVTME) and Ishihara's Test for Colour Deficiency (Ishihara Test) between children in mainstream classes and children with learning disabilities in special education classes in government primary schools. A total of 100 primary school children (50 children from mainstream classes and 50 children from special education classes) matched in age were recruited in this cross-sectional comparative study. Testability was determined by the percentage of children who were able to give a reliable response as required by the respective tests. 'Unable to test' was defined as an inappropriate response or uncooperativeness despite the best efforts of the screener. The testability of the modified ETDRS, the Butterfly stereo test and the Ishihara test was found to be lower among children in special education classes (P < 0.001), but not that of the Cambridge Crowding Cards, Lang Stereo test II and CVTME. Non-verbal or "matching" approaches were found to be superior for testing visual functions in children with learning disabilities. Modifications of vision testing procedures are essential for children with learning disabilities.
MoCha: Molecular Characterization of Unknown Pathways.
Lobo, Daniel; Hammelman, Jennifer; Levin, Michael
2016-04-01
Automated methods for the reverse-engineering of complex regulatory networks are paving the way for the inference of mechanistic comprehensive models directly from experimental data. These novel methods can infer not only the relations and parameters of the known molecules defined in their input datasets, but also unknown components and pathways identified as necessary by the automated algorithms. Identifying the molecular nature of these unknown components is a crucial step for making testable predictions and experimentally validating the models, yet no specific and efficient tools exist to aid in this process. To this end, we present here MoCha (Molecular Characterization), a tool optimized for the search of unknown proteins and their pathways from a given set of known interacting proteins. MoCha uses the comprehensive dataset of protein-protein interactions provided by the STRING database, which currently includes more than a billion interactions from over 2,000 organisms. MoCha is highly optimized, performing typical searches within seconds. We demonstrate the use of MoCha with the characterization of unknown components from reverse-engineered models from the literature. MoCha is useful for working on network models by hand or as a downstream step of a model inference engine workflow and represents a valuable and efficient tool for the characterization of unknown pathways using known data from thousands of organisms. MoCha and its source code are freely available online under the GPLv3 license.
Lee, Insuk; Li, Zhihua; Marcotte, Edward M.
2007-01-01
Background: Probabilistic functional gene networks are powerful theoretical frameworks for integrating heterogeneous functional genomics and proteomics data into objective models of cellular systems. Such networks provide syntheses of millions of discrete experimental observations, spanning DNA microarray experiments, physical protein interactions, genetic interactions, and comparative genomics; the resulting networks can then be easily applied to generate testable hypotheses regarding specific gene functions and associations.
Methodology/Principal Findings: We report a significantly improved version (v. 2) of a probabilistic functional gene network [1] of the baker's yeast, Saccharomyces cerevisiae. We describe our optimization methods and illustrate their effects in three major areas: the reduction of functional bias in network training reference sets, the application of a probabilistic model for calculating confidences in pair-wise protein physical or genetic interactions, and the introduction of simple thresholds that eliminate many false positive mRNA co-expression relationships. Using the network, we predict and experimentally verify the function of the yeast RNA binding protein Puf6 in 60S ribosomal subunit biogenesis.
Conclusions/Significance: YeastNet v. 2, constructed using these optimizations together with additional data, shows significant reduction in bias and improvements in precision and recall, in total covering 102,803 linkages among 5,483 yeast proteins (95% of the validated proteome). YeastNet is available from http://www.yeastnet.org. PMID:17912365
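The pair-wise confidence calculation referred to above is, at heart, a log-likelihood score comparing how often an evidence type links gold-standard functional pairs versus unlinked pairs. The sketch below shows the arithmetic with invented counts.

    import numpy as np

    pos_with, pos_total = 400, 1000      # gold-standard linked pairs with the evidence
    neg_with, neg_total = 2000, 100000   # unlinked pairs with the same evidence

    lls = np.log((pos_with / pos_total) / (neg_with / neg_total))
    print(round(lls, 2))   # higher score = more informative evidence source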
Choice Experiments to Quantify Preferences for Health and Healthcare: State of the Practice.
Mühlbacher, Axel; Johnson, F Reed
2016-06-01
Stated-preference methods increasingly are used to quantify preferences in health economics, health technology assessment, benefit-risk analysis and health services research. The objective of stated-preference studies is to acquire information about trade-off preferences among treatment outcomes, prioritization of clinical decision criteria, likely uptake or adherence to healthcare products and acceptability of healthcare services or policies. A widely accepted approach to eliciting preferences is discrete-choice experiments. Patient, physician, insurant or general-public respondents choose among constructed, experimentally controlled alternatives described by decision-relevant features or attributes. Attributes can represent complete health states, sets of treatment outcomes or characteristics of a healthcare system. The observed pattern of choice reveals how different respondents or groups of respondents implicitly weigh, value and assess different characteristics of treatments, products or services. An important advantage of choice experiments is their foundation in microeconomic utility theory. This conceptual framework provides tests of internal validity, guidance for statistical analysis of latent preference structures, and testable behavioural hypotheses. Choice experiments require expertise in survey-research methods, random-utility theory, experimental design and advanced statistical analysis. This paper should be understood as an introduction to setting up a basic experiment rather than an exhaustive critique of the latest findings and procedures. Where appropriate, we have identified topics of active research where a broad consensus has not yet been established.
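A minimal sketch of the random-utility foundation mentioned above, using the standard conditional (McFadden) logit rather than any particular study's specification: respondent n derives utility

    U_{nj} = x_{nj}^{\top}\beta + \varepsilon_{nj}

from alternative j with attribute vector x_{nj}, and with i.i.d. extreme-value errors the probability of choosing j from choice set C is

    P_{nj} = \frac{\exp(x_{nj}^{\top}\beta)}{\sum_{k \in C} \exp(x_{nk}^{\top}\beta)}

so observed choice frequencies identify the attribute weights \beta, which is what makes the elicited preferences testable within the theory.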
A one-dimensional statistical mechanics model for nucleosome positioning on genomic DNA.
Tesoro, S; Ali, I; Morozov, A N; Sulaiman, N; Marenduzzo, D
2016-02-12
The first level of folding of DNA in eukaryotes is provided by the so-called '10 nm chromatin fibre', where DNA wraps around histone proteins (∼10 nm in size) to form nucleosomes, which go on to create a zig-zagging bead-on-a-string structure. In this work we present a one-dimensional statistical mechanics model to study nucleosome positioning within one such 10 nm fibre. We focus on the case of genomic sheep DNA, and we start from effective potentials valid at infinite dilution and determined from high-resolution in vitro salt dialysis experiments. We study positioning within a polynucleosome chain, and compare the results for genomic DNA to that obtained in the simplest case of homogeneous DNA, where the problem can be mapped to a Tonks gas. First, we consider the simple, analytically solvable, case where nucleosomes are assumed to be point-like. Then, we perform numerical simulations to gauge the effect of their finite size on the nucleosomal distribution probabilities. Finally we compare nucleosome distributions and simulated nuclease digestion patterns for the two cases (homogeneous and sheep DNA), thereby providing testable predictions of the effect of sequence on experimentally observable quantities in experiments on polynucleosome chromatin fibres reconstituted in vitro.
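For reference, the Tonks-gas mapping invoked for homogeneous DNA treats nucleosomes as hard rods of footprint a at line density \rho; the textbook equation of state (a sketch of the mapping, not the paper's genomic calculation) is

    \beta P = \frac{\rho}{1 - \rho a}

which reduces to an ideal gas for point-like particles (a \to 0) and diverges as the fibre saturates with nucleosomes, capturing the excluded-volume effects probed in the simulations.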
Testability, Test Automation and Test Driven Development for the Trick Simulation Toolkit
NASA Technical Reports Server (NTRS)
Penn, John
2014-01-01
This paper describes the adoption of a Test Driven Development approach and a Continuous Integration System in the development of the Trick Simulation Toolkit, a generic simulation development environment for creating high fidelity training and engineering simulations at the NASA Johnson Space Center and many other NASA facilities. It describes the approach, and the significant benefits seen, such as fast, thorough and clear test feedback every time code is checked into the code repository. It also describes an approach that encourages development of code that is testable and adaptable.
Ruiz, Patricia; Perlina, Ally; Mumtaz, Moiz; Fowler, Bruce A
2016-07-01
A number of epidemiological studies have identified statistical associations between persistent organic pollutants (POPs) and metabolic diseases, but testable hypotheses regarding underlying molecular mechanisms to explain these linkages have not been published. We assessed the underlying mechanisms of POPs that have been associated with metabolic diseases; three well-known POPs [2,3,7,8-tetrachlorodibenzodioxin (TCDD), 2,2´,4,4´,5,5´-hexachlorobiphenyl (PCB 153), and 4,4´-dichlorodiphenyldichloroethylene (p,p´-DDE)] were studied. We used advanced database search tools to delineate testable hypotheses and to guide laboratory-based research studies into underlying mechanisms by which this POP mixture could produce or exacerbate metabolic diseases. For our searches, we used proprietary systems biology software (MetaCore™/MetaDrug™) to conduct advanced search queries for the underlying interactions database, followed by directional network construction to identify common mechanisms for these POPs within two or fewer interaction steps downstream of their primary targets. These common downstream pathways belong to various cytokine and chemokine families with experimentally well-documented causal associations with type 2 diabetes. Our systems biology approach allowed identification of converging pathways leading to activation of common downstream targets. To our knowledge, this is the first study to propose an integrated global set of step-by-step molecular mechanisms for a combination of three common POPs using a systems biology approach, which may link POP exposure to diseases. Experimental evaluation of the proposed pathways may lead to development of predictive biomarkers of the effects of POPs, which could translate into disease prevention and effective clinical treatment strategies. Ruiz P, Perlina A, Mumtaz M, Fowler BA. 2016. A systems biology approach reveals converging molecular mechanisms that link different POPs to common metabolic diseases. Environ Health Perspect 124:1034-1041; http://dx.doi.org/10.1289/ehp.1510308.
Testing for ontological errors in probabilistic forecasting models of natural systems
Marzocchi, Warner; Jordan, Thomas H.
2014-01-01
Probabilistic forecasting models describe the aleatory variability of natural systems as well as our epistemic uncertainty about how the systems work. Testing a model against observations exposes ontological errors in the representation of a system and its uncertainties. We clarify several conceptual issues regarding the testing of probabilistic forecasting models for ontological errors: the ambiguity of the aleatory/epistemic dichotomy, the quantification of uncertainties as degrees of belief, the interplay between Bayesian and frequentist methods, and the scientific pathway for capturing predictability. We show that testability of the ontological null hypothesis derives from an experimental concept, external to the model, that identifies collections of data, observed and not yet observed, that are judged to be exchangeable when conditioned on a set of explanatory variables. These conditional exchangeability judgments specify observations with well-defined frequencies. Any model predicting these behaviors can thus be tested for ontological error by frequentist methods; e.g., using P values. In the forecasting problem, prior predictive model checking, rather than posterior predictive checking, is desirable because it provides more severe tests. We illustrate experimental concepts using examples from probabilistic seismic hazard analysis. Severe testing of a model under an appropriate set of experimental concepts is the key to model validation, in which we seek to know whether a model replicates the data-generating process well enough to be sufficiently reliable for some useful purpose, such as long-term seismic forecasting. Pessimistic views of system predictability fail to recognize the power of this methodology in separating predictable behaviors from those that are not. PMID:25097265
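To make the frequentist testing step concrete, a minimal sketch in Python (the forecast rate and observed count are hypothetical, not taken from the paper):

    from scipy import stats

    # Hypothetical forecast: expected number of events in the test window,
    # derived from a probabilistic model of an exchangeable data collection.
    forecast_rate = 4.2
    observed = 9  # hypothetical observed count

    # Under the ontological null hypothesis, the observed count is a draw from
    # the forecast (here Poisson) distribution; a small tail probability flags
    # ontological error in the model.
    p_value = stats.poisson.sf(observed - 1, forecast_rate)  # P(X >= observed)
    print("one-sided P value: %.4f" % p_value)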
Abu Bakar, Nurul Farhana; Chen, Ai-Hong
2014-01-01
Context: Children with learning disabilities might have difficulty communicating effectively and giving reliable responses as required in various visual function testing procedures. Aims: The purpose of this study was to compare the testability of visual acuity using the modified Early Treatment Diabetic Retinopathy Study (ETDRS) chart and Cambridge Crowding Cards, stereo acuity using the Lang Stereo test II and Butterfly stereo tests, and colour perception using the Colour Vision Test Made Easy (CVTME) and Ishihara's Test for Colour Deficiency (Ishihara Test) between children in mainstream classes and children with learning disabilities in special education classes in government primary schools. Materials and Methods: A total of 100 primary school children (50 children from mainstream classes and 50 children from special education classes) matched in age were recruited in this cross-sectional comparative study. Testability was determined by the percentage of children who were able to give reliable responses as required by the respective tests. ‘Unable to test’ was defined as an inappropriate response or a lack of cooperation despite the best efforts of the screener. Results: The testability of the modified ETDRS, Butterfly stereo test and Ishihara test was lower among children in special education classes (P < 0.001), but that of the Cambridge Crowding Cards, Lang Stereo test II and CVTME was not. Conclusion: Non-verbal or “matching” approaches were found to be superior for testing visual functions in children with learning disabilities. Modifications of vision testing procedures are essential for children with learning disabilities. PMID:24008790
Modelling toehold-mediated RNA strand displacement.
Šulc, Petr; Ouldridge, Thomas E; Romano, Flavio; Doye, Jonathan P K; Louis, Ard A
2015-03-10
We study the thermodynamics and kinetics of an RNA toehold-mediated strand displacement reaction with a recently developed coarse-grained model of RNA. Strand displacement, during which a single strand displaces a different strand previously bound to a complementary substrate strand, is an essential mechanism in active nucleic acid nanotechnology and has also been hypothesized to occur in vivo. We study the rate of displacement reactions as a function of the length of the toehold and temperature and make two experimentally testable predictions: that the displacement is faster if the toehold is placed at the 5' end of the substrate; and that the displacement slows down with increasing temperature for longer toeholds. Copyright © 2015 Biophysical Society. Published by Elsevier Inc. All rights reserved.
Is Psychoanalysis a Folk Psychology?
Arminjon, Mathieu
2013-01-01
Even as the neuro-psychoanalytic field has matured, the epistemological status of Freudian interpretations still remains problematic from a naturalist point of view. As a result of the resurgence of hermeneutics, the claim has been made that psychoanalysis is an extension of folk psychology. For these “extensionists,” asking psychoanalysis to prove its interpretations would be as absurd as demanding proofs of the scientific accuracy of folk psychology. I propose to show how Dennett’s theory of the intentional stance allows us to defend an extensionist position while sparing us certain hermeneutic difficulties. In conclusion, I will consider how the experiments of Shevrin et al. (1996) could turn extensionist conceptual considerations into experimentally testable issues. PMID:23525879
NASA Astrophysics Data System (ADS)
Ao, Ping
2011-03-01
There has been tremendous progress in cancer research. However, it appears that the current dominant cancer research framework, which regards cancer as a disease of the genome, leads to an impasse. Naturally, questions have been asked as to whether it is possible to develop alternative frameworks that can connect both to mutations and other genetic/genomic effects and to environmental factors. Furthermore, such a framework could be made quantitative, with experimentally testable predictions. In this talk, I will present a positive answer to this calling. I will explain our construction of an endogenous network theory based on molecular-cellular agencies as dynamical variables. Such a cancer theory explicitly demonstrates a profound connection to many fundamental concepts in physics, such as stochastic non-equilibrium processes, the ``energy'' landscape, metastability, etc. It suggests that beneath cancer's daunting complexity may lie a simplicity that gives grounds for hope. The rationales behind this theory, its predictions, and its initial experimental verifications will be presented. Supported by USA NIH and China NSF.
O'Malley, Maureen A
2018-06-01
Since the 1940s, microbiologists, biochemists and population geneticists have experimented with the genetic mechanisms of microorganisms in order to investigate evolutionary processes. These evolutionary studies of bacteria and other microorganisms gained some recognition from the standard-bearers of the modern synthesis of evolutionary biology, especially Theodosius Dobzhansky and Ledyard Stebbins. A further period of post-synthesis bacterial evolutionary research occurred between the 1950s and 1980s. These experimental analyses focused on the evolution of population and genetic structure, the adaptive gain of new functions, and the evolutionary consequences of competition dynamics. This large body of research aimed to make evolutionary theory testable and predictive, by giving it mechanistic underpinnings. Although evolutionary microbiologists promoted bacterial experiments as methodologically advantageous and a source of general insight into evolution, they also acknowledged the biological differences of bacteria. My historical overview concludes with reflections on what bacterial evolutionary research achieved in this period, and its implications for the still-developing modern synthesis.
A Quantitative Geochemical Target for Modeling the Formation of the Earth and Moon
NASA Technical Reports Server (NTRS)
Boyce, Jeremy W.; Barnes, Jessica J.; McCubbin, Francis M.
2017-01-01
The past decade has been one of geochemical, isotopic, and computational advances that are bringing the laboratory measurement and computational modeling neighborhoods of the Earth-Moon community to ever closer proximity. We are now, however, in the position to become even better neighbors: modelers can generate testable hypotheses for geochemists, and geochemists can provide quantitative targets for modelers. Here we present a robust example of the latter based on Cl isotope measurements of mare basalts.
Tailor, Vijay; Glaze, Selina; Unwin, Hilary; Bowman, Richard; Thompson, Graham; Dahlmann-Noor, Annegret
2016-10-01
Children and adults with neurological impairments are often not able to access conventional perimetry; however, information about the visual field is valuable. A new technology, saccadic vector optokinetic perimetry (SVOP), may have improved accessibility, but its accuracy has not been evaluated. We aimed to explore accessibility, testability and accuracy of SVOP in children with neurodisability or isolated visual pathway deficits. Cohort study; recruitment October 2013-May 2014, at children's eye clinics at a tertiary referral centre and a regional Child Development Centre; full orthoptic assessment, SVOP (central 30° of the visual field) and confrontation visual fields (CVF). Group 1: age 1-16 years, neurodisability (n=16), group 2: age 10-16 years, confirmed or suspected visual field defect (n=21); group 2 also completed Goldmann visual field testing (GVFT). Group 1: testability with a full 40-point test protocol is 12.5%; with reduced test protocols, testability is 100%, but plots may be clinically meaningless. Children (44%) and parents/carers (62.5%) find the test easy. SVOP and CVF agree in 50%. Group 2: testability is 62% for the 40-point protocol, and 90.5% for reduced protocols. Corneal changes in childhood glaucoma interfere with SVOP testing. All children and parents/carers find SVOP easy. Overall agreement with GVFT is 64.7%. While SVOP is highly accessible to children, many cannot complete a full 40-point test. Agreement with current standard tests is moderate to poor. Abnormal saccades cause an apparent non-specific visual field defect. In children with glaucoma or nystagmus SVOP calibration often fails. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
NASA Astrophysics Data System (ADS)
Tozzi, Arturo; Peters, James F.
2018-03-01
The paper by Ramstead et al. [1] [in this issue] reminds us of the efforts of eminent scientists such as Whitehead and Gödel. After having produced influential manuscripts, they turned to more philosophical issues, understanding the need for a larger formalization of their bounteous scientific results [2,3]. In a similar way, the successful free-energy principle has been generalized in order to encompass not only the brain activity of the original formulation, but also the whole spectrum of life [1]. The final result is of prominent importance because, in touch with Quine's naturalized epistemology [4] and Badiou's account of set theory [5], it provides philosophical significance to otherwise purely scientific matters. The free energy principle becomes a novel paradigm that attempts to explain general physical/biological mechanisms in the light of a novel scientific ontology, the "variational neuroethology". The latter, seemingly grounded in a recursive multilevel reductionistic/emergentistic approach à la Bechtel [6], also has its roots in a rationalistic top-down approach that, starting from general mathematical/physical concepts (von Helmholtz's free energy), formulates experimentally testable (and falsifiable) theories.
Gönner, Lorenz; Vitay, Julien; Hamker, Fred H.
2017-01-01
Hippocampal place-cell sequences observed during awake immobility often represent previous experience, suggesting a role in memory processes. However, recent reports of goals being overrepresented in sequential activity suggest a role in short-term planning, although a detailed understanding of the origins of hippocampal sequential activity and of its functional role is still lacking. In particular, it is unknown which mechanism could support efficient planning by generating place-cell sequences biased toward known goal locations, in an adaptive and constructive fashion. To address these questions, we propose a model of spatial learning and sequence generation as interdependent processes, integrating cortical contextual coding, synaptic plasticity and neuromodulatory mechanisms into a map-based approach. Following goal learning, sequential activity emerges from continuous attractor network dynamics biased by goal memory inputs. We apply Bayesian decoding on the resulting spike trains, allowing a direct comparison with experimental data. Simulations show that this model (1) explains the generation of never-experienced sequence trajectories in familiar environments, without requiring virtual self-motion signals, (2) accounts for the bias in place-cell sequences toward goal locations, (3) highlights their utility in flexible route planning, and (4) provides specific testable predictions. PMID:29075187
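As background for the decoding step, a minimal sketch of the standard Poisson population decoder commonly applied to place-cell spike counts (synthetic Gaussian tuning curves; not the authors' code):

    import numpy as np

    rng = np.random.default_rng(0)
    positions = np.linspace(0.0, 1.0, 100)   # candidate positions on a 1D track
    centers = np.linspace(0.0, 1.0, 30)      # hypothetical place-field centers
    # Gaussian tuning curves: peak rate 20 Hz, field width 0.05 on a unit track
    tuning = 20.0 * np.exp(-(positions[:, None] - centers[None, :]) ** 2 / (2 * 0.05 ** 2))

    tau = 0.05                                # decoding time bin (s)
    true_idx = 42
    counts = rng.poisson(tuning[true_idx] * tau)  # simulated spike counts in one bin

    # Log posterior under an independent-Poisson likelihood and a flat spatial prior
    log_post = (counts[None, :] * np.log(tuning * tau + 1e-12) - tuning * tau).sum(axis=1)
    post = np.exp(log_post - log_post.max())
    post /= post.sum()
    print("decoded:", positions[np.argmax(post)], "true:", positions[true_idx])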
Escape rate for nonequilibrium processes dominated by strong non-detailed balance force
NASA Astrophysics Data System (ADS)
Tang, Ying; Xu, Song; Ao, Ping
2018-02-01
Quantifying the escape rate from a meta-stable state is essential to understanding a wide range of dynamical processes. Kramers' classical rate formula is the product of an exponential function of the potential barrier height and a pre-factor related to the friction coefficient. Although many applications of the rate formula have focused on the exponential term, the prefactor can have a significant effect on the escape rate in certain parameter regions, such as the overdamped limit and the underdamped limit. There has been continued interest in understanding the effect of non-detailed balance on the escape rate; however, how the prefactor behaves under strong non-detailed-balance force remains elusive. In this work, we find that the escape rate formula has a vanishing prefactor with decreasing friction strength in the strong non-detailed-balance limit. We obtain analytical solutions in specific examples and provide a derivation for more general cases. We further verify the result by simulations and propose a testable experimental system of a charged Brownian particle in an electromagnetic field. Our study demonstrates that special care is required when estimating the effect of the prefactor on the escape rate when non-detailed-balance force dominates.
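For orientation, the overdamped form of the classical result discussed above (standard notation; the paper's contribution concerns how this prefactor deforms under strong non-detailed-balance forcing):

    k \approx \frac{\omega_0 \, \omega_b}{2\pi\gamma} \, e^{-\Delta U / k_B T}

where \omega_0 and \omega_b are the curvature frequencies of the well and the barrier, \gamma is the friction coefficient, and \Delta U the barrier height; the exponential carries the barrier dependence while the prefactor carries the friction dependence.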
Mapping the landscape of metabolic goals of a cell
Zhao, Qi; Stettner, Arion I.; Reznik, Ed; ...
2016-05-23
Here, genome-scale flux balance models of metabolism provide testable predictions of all metabolic rates in an organism, by assuming that the cell is optimizing a metabolic goal known as the objective function. We introduce an efficient inverse flux balance analysis (invFBA) approach, based on linear programming duality, to characterize the space of possible objective functions compatible with measured fluxes. After testing our algorithm on simulated E. coli data and time-dependent S. oneidensis fluxes inferred from gene expression, we apply our inverse approach to flux measurements in long-term evolved E. coli strains, revealing objective functions that provide insight into metabolic adaptation trajectories.
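As a point of reference, the forward FBA problem that invFBA inverts is a linear program; a toy sketch under stated assumptions (a hypothetical three-reaction network, not the paper's genome-scale models):

    import numpy as np
    from scipy.optimize import linprog

    # Hypothetical 2-metabolite, 3-reaction network: uptake -> A -> B -> biomass
    S = np.array([[1.0, -1.0, 0.0],    # metabolite A: produced by R1, consumed by R2
                  [0.0, 1.0, -1.0]])   # metabolite B: produced by R2, consumed by R3
    c = np.array([0.0, 0.0, -1.0])     # maximize biomass flux v3 (linprog minimizes)
    bounds = [(0.0, 10.0)] * 3         # flux capacity bounds

    res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
    print("optimal fluxes:", res.x)    # steady state Sv = 0 forces v1 = v2 = v3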
Shaping Gene Expression by Landscaping Chromatin Architecture: Lessons from a Master.
Sartorelli, Vittorio; Puri, Pier Lorenzo
2018-05-19
Since its discovery as a skeletal muscle-specific transcription factor able to reprogram somatic cells into differentiated myofibers, MyoD has provided an instructive model to understand how transcription factors regulate gene expression. Reciprocally, studies of other transcriptional regulators have provided testable hypotheses to further understand how MyoD activates transcription. Using MyoD as a reference, in this review, we discuss the similarities and differences in the regulatory mechanisms employed by tissue-specific transcription factors to access DNA and regulate gene expression by cooperatively shaping the chromatin landscape within the context of cellular differentiation. Copyright © 2018 Elsevier Inc. All rights reserved.
Developing Cognitive Models for Social Simulation from Survey Data
NASA Astrophysics Data System (ADS)
Alt, Jonathan K.; Lieberman, Stephen
The representation of human behavior and cognition continues to challenge the modeling and simulation community. The use of survey and polling instruments to inform belief states, issue stances and action choice models provides a compelling means of developing models and simulations with empirical data. Using these types of data to populate social simulations can greatly enhance the feasibility of validation efforts, the reusability of social and behavioral modeling frameworks, and the testable reliability of simulations. We provide a case study demonstrating these effects, document the use of survey data to develop cognitive models, and suggest future paths forward for social and behavioral modeling.
Statistical mechanics of monatomic liquids
NASA Astrophysics Data System (ADS)
Wallace, Duane C.
1997-10-01
Two key experimental properties of elemental liquids, together with an analysis of the condensed-system potential-energy surface, lead us logically to the dynamical theory of monatomic liquids. Experimentally, the ion motional specific heat is approximately 3Nk for N ions, implying the normal modes of motion are approximately 3N independent harmonic oscillators. This implies the potential surface contains nearly harmonic valleys. The equilibrium configuration at the bottom of each valley is a ``structure.'' Structures are crystalline or amorphous, and amorphous structures can have a remnant of local crystal symmetry, or can be random. The random structures are by far the most numerous, and hence dominate the statistical mechanics of the liquid state, and their macroscopic properties are uniform over the structure class, for large-N systems. The Hamiltonian for any structural valley is the static structure potential, a sum of harmonic normal modes, and an anharmonic correction. Again from experiment, the constant-density entropy of melting contains a universal disordering contribution of NkΔ, suggesting the random structural valleys are of universal number w^N, where ln w = Δ. Our experimental estimate for Δ is 0.80. In quasiharmonic approximation, the liquid theory for entropy agrees with experiment, for all currently analyzable experimental data at elevated temperatures, to within 1-2% of the total entropy. Further testable predictions of the theory are mentioned.
Elementary signaling modes predict the essentiality of signal transduction network components
2011-01-01
Background Understanding how signals propagate through signaling pathways and networks is a central goal in systems biology. Quantitative dynamic models help to achieve this understanding, but are difficult to construct and validate because of the scarcity of known mechanistic details and kinetic parameters. Structural and qualitative analysis is emerging as a feasible and useful alternative for interpreting signal transduction. Results In this work, we present an integrative computational method for evaluating the essentiality of components in signaling networks. This approach expands an existing signaling network to a richer representation that incorporates the positive or negative nature of interactions and the synergistic behaviors among multiple components. Our method simulates both knockout and constitutive activation of components as node disruptions, and takes into account the possible cascading effects of a node's disruption. We introduce the concept of elementary signaling mode (ESM), as the minimal set of nodes that can perform signal transduction independently. Our method ranks the importance of signaling components by the effects of their perturbation on the ESMs of the network. Validation on several signaling networks describing the immune response of mammals to bacteria, guard cell abscisic acid signaling in plants, and T cell receptor signaling shows that this method can effectively uncover the essentiality of components mediating a signal transduction process and results in strong agreement with the results of Boolean (logical) dynamic models and experimental observations. Conclusions This integrative method is an efficient procedure for exploratory analysis of large signaling and regulatory networks where dynamic modeling or experimental tests are impractical. Its results serve as testable predictions, provide insights into signal transduction and regulatory mechanisms and can guide targeted computational or experimental follow-up studies. The source codes for the algorithms developed in this study can be found at http://www.phys.psu.edu/~ralbert/ESM. PMID:21426566
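A highly simplified illustration of the ranking idea (a crude path-based proxy on a hypothetical network; actual ESMs additionally encode inhibitory edges and synergistic composite nodes):

    import networkx as nx

    # Hypothetical signaling network: receptor R feeds a response node via A, B, C
    G = nx.DiGraph()
    G.add_edges_from([("R", "A"), ("R", "B"), ("A", "C"), ("B", "C"), ("C", "out")])

    base = list(nx.all_simple_paths(G, "R", "out"))  # independent signal-carrying routes
    for node in ["A", "B", "C"]:
        H = G.copy()
        H.remove_node(node)                          # simulate a knockout
        kept = list(nx.all_simple_paths(H, "R", "out"))
        print(node, "fraction of source-sink routes lost:", 1 - len(kept) / len(base))

Here node C, shared by every route, scores as essential, while A and B are individually dispensable, which is the qualitative pattern the ESM ranking formalizes.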
Soy-Based Therapeutic Baby Formulas: Testable Hypotheses Regarding the Pros and Cons.
Westmark, Cara J
2016-01-01
Soy-based infant formulas have been consumed in the United States since 1909, and currently constitute a significant portion of the infant formula market. There are efforts underway to generate genetically modified soybeans that produce therapeutic agents of interest with the intent to deliver those agents in a soy-based infant formula platform. The threefold purpose of this review article is to first discuss the pros and cons of soy-based infant formulas, then present testable hypotheses to discern the suitability of a soy platform for drug delivery in babies, and finally start a discussion to inform public policy on this important area of infant nutrition.
Bayesian naturalness, simplicity, and testability applied to the B ‑ L MSSM GUT
NASA Astrophysics Data System (ADS)
Fundira, Panashe; Purves, Austin
2018-04-01
Recent years have seen increased use of Bayesian model comparison to quantify notions such as naturalness, simplicity, and testability, especially in the area of supersymmetric model building. After demonstrating that Bayesian model comparison can resolve a paradox that has been raised in the literature concerning the naturalness of the proton mass, we apply Bayesian model comparison to GUTs, an area to which it has not been applied before. We find that the GUTs are substantially favored over the nonunifying puzzle model. Of the GUTs we consider, the B ‑ L MSSM GUT is the most favored, but the MSSM GUT is almost equally favored.
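For reference, the comparison rests on the standard Bayes factor (a sketch; the paper's specific priors and likelihoods are not reproduced here):

    K_{12} = \frac{p(D \mid M_1)}{p(D \mid M_2)}, \qquad
    p(D \mid M_i) = \int p(D \mid \theta, M_i)\, p(\theta \mid M_i)\, d\theta

Because the evidence integral penalizes models that spread prior mass over parameter regions the data disfavor, fine-tuned models are automatically disfavored, which is how naturalness and simplicity acquire a quantitative meaning.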
Two fundamental questions about protein evolution.
Penny, David; Zhong, Bojian
2015-12-01
Two basic questions are considered that approach protein evolution from different directions: the problems arising from using Markov models for the deeper divergences, and the origin of proteins themselves. The real problem for the first question (going backwards in time) is that Markov models of sequence evolution must lose information exponentially at deeper divergences, and several testable methods are suggested that should help resolve these deeper divergences. For the second question (coming forwards in time), a problem is that most models for the origin of protein synthesis do not give a role to the very earliest stages of the process. From our knowledge of the importance of replication accuracy in limiting the length of a coding molecule, a testable hypothesis is proposed. The length of the code, the code itself, and tRNAs would all have prior roles in increasing the accuracy of RNA replication; thus proteins would have been formed only after the tRNAs and the length of the triplet code were already established. Both questions lead to testable predictions. Copyright © 2014 Elsevier B.V. and Société Française de Biochimie et Biologie Moléculaire (SFBBM). All rights reserved.
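The exponential information loss behind the first question can be illustrated with the simplest substitution model, Jukes-Cantor (a textbook example, not the methods proposed in the paper): the probability that a site shows the same nucleotide after evolving for time t at total substitution rate \mu is

    P_{\mathrm{same}}(t) = \frac{1}{4} + \frac{3}{4}\, e^{-4\mu t / 3}

which decays toward the uninformative value 1/4, so the phylogenetic signal distinguishing alternative deep branchings shrinks exponentially with divergence depth.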
Guimaraes, Sandra; Fernandes, Tiago; Costa, Patrício; Silva, Eduardo
2018-06-01
To determine normative values for the tumbling E optotype and its feasibility for visual acuity (VA) assessment in children aged 3-4 years. A cross-sectional study of 1756 children who were invited to participate in a comprehensive non-invasive eye exam. Uncorrected monocular VA with the crowded tumbling E was assessed as part of a comprehensive ophthalmological examination. Testability rates of the whole population, and VA of the healthy children for different age subgroups, gender, school type and the order of testing in which the ophthalmological examination was performed, were evaluated. The overall testability rate was 95% (92% and 98% for children aged 3 and 4 years, respectively). The mean VA of the first-day assessment (first-VA) and best-VA over 2 days' assessments was 0.14 logMAR (95% CI 0.14 to 0.15) (decimal=0.72, 95% CI 0.71 to 0.73) and 0.13 logMAR (95% CI 0.13 to 0.14) (decimal=0.74, 95% CI 0.73 to 0.74). Analysis with age showed differences between groups in first-VA (F(3,1146)=10.0; p<0.001; η2=0.026) and best-VA (F(3,1155)=8.8; p<0.001; η2=0.022). Our normative values were very highly correlated with the previously reported HOTV Amblyopia Treatment Study (HOTV-ATS) values (first-VA, r=0.97; best-VA, r=0.99), with a consistent overestimation of 0.7 to 0.8 lines for HOTV-ATS, as described in the literature. Overall false-positive referral was 1.3%, and was especially low for anisometropias of ≥2 logMAR lines (0.17%). An interocular difference of ≥1 logMAR line was not associated with age (p=0.195). This is the first normative study for European Caucasian children with the single crowded tumbling E in healthy eyes and the largest study comparing testability at 3 and 4 years of age. Testability rates were higher than those found in the literature with other optotypes, especially in children aged 3 years, where we found 5%-11% better testability rates. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
A Field Trip to the Archaean in Search of Darwin's Warm Little Pond.
Damer, Bruce
2016-05-25
Charles Darwin's original intuition that life began in a "warm little pond" has for the last three decades been eclipsed by a focus on marine hydrothermal vents as a venue for abiogenesis. However, thermodynamic barriers to polymerization of key molecular building blocks and the difficulty of forming stable membranous compartments in seawater suggest that Darwin's original insight should be reconsidered. I will introduce the terrestrial origin of life hypothesis, which combines field observations and laboratory results to provide a novel and testable model in which life begins as protocells assembling in inland fresh water hydrothermal fields. Hydrothermal fields are associated with volcanic landmasses resembling Hawaii and Iceland today and could plausibly have existed on similar land masses rising out of Earth's first oceans. I will report on a field trip to the living and ancient stromatolite fossil localities of Western Australia, which provided key insights into how life may have emerged in Archaean, fluctuating fresh water hydrothermal pools, geological evidence for which has recently been discovered. Laboratory experimentation and fieldwork are providing mounting evidence that such sites have properties that are conducive to polymerization reactions and generation of membrane-bounded protocells. I will build on the previously developed coupled phases scenario, unifying the chemical and geological frameworks and proposing that a hydrogel of stable, communally supported protocells will emerge as a candidate Woese progenote, the distant common ancestor of microbial communities so abundant in the earliest fossil record.
Improving accuracy and power with transfer learning using a meta-analytic database.
Schwartz, Yannick; Varoquaux, Gaël; Pallier, Christophe; Pinel, Philippe; Poline, Jean-Baptiste; Thirion, Bertrand
2012-01-01
Typical cohorts in brain imaging studies are not large enough for systematic testing of all the information contained in the images. To build testable working hypotheses, investigators thus rely on analysis of previous work, sometimes formalized in a so-called meta-analysis. In brain imaging, this approach underlies the specification of regions of interest (ROIs) that are usually selected on the basis of the coordinates of previously detected effects. In this paper, we propose to use a database of images, rather than coordinates, and frame the problem as transfer learning: learning a discriminant model on a reference task to apply it to a different but related new task. To facilitate statistical analysis of small cohorts, we use a sparse discriminant model that selects predictive voxels on the reference task and thus provides a principled procedure to define ROIs. The benefits of our approach are twofold. First it uses the reference database for prediction, i.e., to provide potential biomarkers in a clinical setting. Second it increases statistical power on the new task. We demonstrate on a set of 18 pairs of functional MRI experimental conditions that our approach gives good prediction. In addition, on a specific transfer situation involving different scanners at different locations, we show that voxel selection based on transfer learning leads to higher detection power on small cohorts.
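A minimal sketch of the transfer idea under stated assumptions (synthetic arrays stand in for fMRI images, and an l1-penalized logistic model stands in for the authors' sparse discriminant model):

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    # Stand-ins for voxel data: a large reference cohort and a small new cohort
    X_ref = rng.normal(size=(200, 5000))
    y_ref = (X_ref[:, :10].sum(axis=1) > 0).astype(int)   # signal in the first 10 "voxels"
    X_new = rng.normal(size=(20, 5000))
    y_new = (X_new[:, :10].sum(axis=1) > 0).astype(int)   # related new task, same regions

    # Sparse model on the reference task selects predictive voxels (a data-driven ROI)
    ref = LogisticRegression(penalty="l1", solver="liblinear", C=0.5).fit(X_ref, y_ref)
    roi = np.flatnonzero(ref.coef_)
    print(roi.size, "voxels selected as ROI")

    # Fitting only the selected voxels on the small cohort increases statistical power
    new = LogisticRegression().fit(X_new[:, roi], y_new)
    print("small-cohort training accuracy:", new.score(X_new[:, roi], y_new))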
“Feature Detection” vs. “Predictive Coding” Models of Plant Behavior
Calvo, Paco; Baluška, František; Sims, Andrew
2016-01-01
In this article we consider the possibility that plants exhibit anticipatory behavior, a mark of intelligence. If plants are able to anticipate and respond accordingly to varying states of their surroundings, as opposed to merely responding online to environmental contingencies, then such capacity may be in principle testable, and subject to empirical scrutiny. Our main thesis is that adaptive behavior can only take place by way of a mechanism that predicts the environmental sources of sensory stimulation. We propose to test for anticipation in plants experimentally by contrasting two empirical hypotheses: “feature detection” and “predictive coding.” We spell out what these contrasting hypotheses consist of by way of illustration from the animal literature, and consider how to transfer the rationale involved to the plant literature. PMID:27757094
NASA Astrophysics Data System (ADS)
Amoroso, Richard L.
A brief introductory survey of Unified Field Mechanics (UFM) is given from the perspective of a Holographic Anthropic Multiverse cosmology in 12 `continuous-state' dimensions. The paradigm with many new parameters is cast in a scale-invariant conformal covariant Dirac polarized vacuum utilizing extended HD forms of the de Broglie-Bohm and Cramer interpretations of quantum theory. The model utilizes a unique form of M-Theory based in part on the original hadronic form of string theory that had a variable string tension, TS and included a tachyon. The model is experimentally testable, thus putatively able to demonstrate the existence of large-scale additional dimensionality (LSXD), test for QED violating tight-bound state spectral lines in hydrogen `below' the lowest Bohr orbit, and surmount the quantum uncertainty principle utilizing a hyperincursive Sagnac Effect resonance hierarchy.
a Heavy Higgs Boson from Flavor and Electroweak Symmetry Unification
NASA Astrophysics Data System (ADS)
Fabbrichesi, Marco
2005-08-01
We present a unified picture of flavor and electroweak symmetry breaking based on a nonlinear sigma model spontaneously broken at the TeV scale. Flavor and Higgs bosons arise as pseudo-Goldstone modes. Explicit collective symmetry breaking yields stable vacuum expectation values and masses protected at one loop by the little-Higgs mechanism. The coupling to the fermions generates well-defined mass textures--according to a U(1) global flavor symmetry--that correctly reproduce the mass hierarchies and mixings of quarks and leptons. The model is more constrained than usual little-Higgs models because of bounds on weak and flavor physics. The main experimental signatures testable at the LHC are a rather large mass m
Kalman filter control of a model of spatiotemporal cortical dynamics
Schiff, Steven J; Sauer, Tim
2007-01-01
Recent advances in Kalman filtering to estimate system state and parameters in nonlinear systems have offered the potential to apply such approaches to spatiotemporal nonlinear systems. We here adapt the nonlinear method of unscented Kalman filtering to observe the state and estimate parameters in a computational spatiotemporal excitable system that serves as a model for cerebral cortex. We demonstrate the ability to track spiral wave dynamics, and to use an observer system to calculate control signals delivered through applied electrical fields. We demonstrate how this strategy can control the frequency of such a system, or quench the wave patterns, while minimizing the energy required for such results. These findings are readily testable in experimental applications, and have the potential to be applied to the treatment of human disease. PMID:18310806
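For reference, the linear Kalman filter recursion that the unscented variant generalizes (standard notation; the unscented filter replaces the linear propagation below with sigma-point propagation through the nonlinear cortical model):

    \hat{x}_{k|k-1} = F \hat{x}_{k-1}, \qquad P_{k|k-1} = F P_{k-1} F^{\top} + Q
    K_k = P_{k|k-1} H^{\top} \big( H P_{k|k-1} H^{\top} + R \big)^{-1}
    \hat{x}_k = \hat{x}_{k|k-1} + K_k \big( y_k - H \hat{x}_{k|k-1} \big), \qquad P_k = (I - K_k H)\, P_{k|k-1}

where Q and R are the process and measurement noise covariances; the estimated state then feeds the controller that computes the applied electric field.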
Constraining the loop quantum gravity parameter space from phenomenology
NASA Astrophysics Data System (ADS)
Brahma, Suddhasattwa; Ronco, Michele
2018-03-01
Development of quantum gravity theories rarely takes inputs from experimental physics. In this letter, we take a small step towards correcting this by establishing a paradigm for incorporating putative quantum corrections, arising from canonical quantum gravity (QG) theories, in deriving falsifiable modified dispersion relations (MDRs) for particles on a deformed Minkowski space-time. This allows us to differentiate and, hopefully, pick between several quantization choices via testable, state-of-the-art phenomenological predictions. Although a few explicit examples from loop quantum gravity (LQG) (such as the regularization scheme used or the representation of the gauge group) are shown here to establish the claim, our framework is more general and is capable of addressing other quantization ambiguities within LQG and also those arising from other similar QG approaches.
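Schematically, the falsifiable output of such an analysis is a modified dispersion relation with Planck-suppressed corrections; a generic leading-order form (illustrative, with the dimensionless coefficient \alpha carrying the quantization-choice dependence):

    E^2 \simeq p^2 + m^2 + \alpha\, \frac{p^3}{E_{\mathrm{Pl}}} + \mathcal{O}\big(E_{\mathrm{Pl}}^{-2}\big)

Different regularization schemes or gauge-group representations predict different signs and magnitudes of \alpha, which time-of-flight observations of high-energy astrophysical photons can in principle constrain.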
Advanced Launch System Multi-Path Redundant Avionics Architecture Analysis and Characterization
NASA Technical Reports Server (NTRS)
Baker, Robert L.
1993-01-01
The objective of the Multi-Path Redundant Avionics Suite (MPRAS) program is the development of a set of avionic architectural modules which will be applicable to the family of launch vehicles required to support the Advanced Launch System (ALS). To enable ALS cost/performance requirements to be met, the MPRAS must support autonomy, maintenance, and testability capabilities which exceed those present in conventional launch vehicles. The multi-path redundant, or fault-tolerant, characteristics of the MPRAS are necessary to offset a reduction in avionics reliability due to the increased complexity needed to support these new cost reduction and performance capabilities, and to meet avionics reliability requirements which will provide cost-effective reductions in overall ALS recurring costs. A complex, real-time distributed computing system is needed to meet the ALS avionics system requirements. General Dynamics, Boeing Aerospace, and C.S. Draper Laboratory have proposed system architectures as candidates for the ALS MPRAS. The purpose of this document is to report the results of independent performance and reliability characterization and assessment analyses of each proposed candidate architecture, as well as qualitative assessments of testability, maintainability, and fault tolerance mechanisms. These independent analyses were conducted as part of the MPRAS Part 2 program and were carried out under NASA Langley Research Contract NAS1-17964, Task Assignment 28.
The Hubble Web: The Dark Matter Problem and Cosmic Strings
NASA Astrophysics Data System (ADS)
Alexander, Stephon
2009-07-01
I propose a reinterpretation of cosmic dark matter in which a rigid network of cosmic strings formed at the end of inflation. The cosmic strings fulfill three functions: At recombination they provide an accretion mechanism for virializing baryonic and warm dark matter into disks. These cosmic strings survive as configurations which thread spiral and elliptical galaxies leading to the observed flatness of rotation curves and the Tully-Fisher relation. We find a relationship between the rotational velocity of the galaxy and the string tension and discuss the testability of this model.
Whitmore, Leanne S.; Davis, Ryan W.; McCormick, Robert L.; ...
2016-09-15
Screening a large number of biologically derived molecules for potential fuel compounds without recourse to experimental testing is important in identifying understudied yet valuable molecules. Experimental testing, although a valuable standard for measuring fuel properties, has several major limitations, including the requirement for quantities large enough to test, considerable expense, and a large amount of time. This paper discusses the development of a general-purpose fuel property tool, using machine learning, whose outcome is to screen molecules for desirable fuel properties. BioCompoundML adopts a general methodology, requiring as input only a list of training compounds (with identifiers and measured values) and a list of testing compounds (with identifiers). For the training data, BioCompoundML collects open data from the National Center for Biotechnology Information, incorporates user-provided features, imputes missing values, performs feature reduction, builds a classifier, and clusters compounds. BioCompoundML then collects data for the testing compounds, predicts class membership, and determines whether compounds are found in the range of variability of the training data set. We demonstrate this tool using three different fuel properties: research octane number (RON), threshold soot index (TSI), and melting point (MP). Here we provide measures of its success with these properties using randomized train/test measurements: average accuracy is 88% in RON, 85% in TSI, and 94% in MP; average precision is 88% in RON, 88% in TSI, and 95% in MP; and average recall is 88% in RON, 82% in TSI, and 97% in MP. The receiver operator characteristics (area under the curve) were estimated at 0.88 in RON, 0.86 in TSI, and 0.87 in MP. We also measured the success of BioCompoundML by sending 16 compounds for direct RON determination. Finally, we provide a screen of 1977 hydrocarbons/oxygenates within the 8696 compounds in MetaCyc, identifying compounds with high predictive strength for high or low RON.
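A generic sketch of this style of screening workflow (synthetic descriptors and a stand-in label; BioCompoundML's actual NCBI feature collection, imputation, and feature reduction are not reproduced):

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import accuracy_score, precision_score, recall_score, roc_auc_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    X = rng.normal(size=(300, 40))                   # hypothetical molecular descriptors
    y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)    # stand-in for a high/low RON class

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

    pred = clf.predict(X_te)
    print("accuracy:", accuracy_score(y_te, pred))
    print("precision:", precision_score(y_te, pred))
    print("recall:", recall_score(y_te, pred))
    print("AUC:", roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))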
Soy-Based Therapeutic Baby Formulas: Testable Hypotheses Regarding the Pros and Cons
Westmark, Cara J.
2017-01-01
Soy-based infant formulas have been consumed in the United States since 1909, and currently constitute a significant portion of the infant formula market. There are efforts underway to generate genetically modified soybeans that produce therapeutic agents of interest with the intent to deliver those agents in a soy-based infant formula platform. The threefold purpose of this review article is to first discuss the pros and cons of soy-based infant formulas, then present testable hypotheses to discern the suitability of a soy platform for drug delivery in babies, and finally start a discussion to inform public policy on this important area of infant nutrition. PMID:28149839
Structural similitude and design of scaled down laminated models
NASA Technical Reports Server (NTRS)
Simitses, G. J.; Rezaeepazhand, J.
1993-01-01
The excellent mechanical properties of laminated composite structures make them prime candidates for a wide variety of applications in aerospace, mechanical and other branches of engineering. The enormous design flexibility of advanced composites is obtained at the cost of a large number of design parameters. Due to the complexity of the systems and the lack of complete design-based information, designers tend to be conservative in their designs. Furthermore, any new design is extensively evaluated experimentally until it achieves the necessary reliability, performance and safety. However, the experimental evaluation of composite structures is costly and time consuming. Consequently, it is extremely useful if a full-scale structure can be replaced by a similar scaled-down model which is much easier to work with. Furthermore, a dramatic reduction in cost and time can be achieved if available experimental data for a specific structure can be used to predict the behavior of a group of similar systems. This study investigates problems associated with the design of scaled models. Such a study is important since it provides the necessary scaling laws, and the factors which affect the accuracy of the scale models. Similitude theory is employed to develop the necessary similarity conditions (scaling laws). Scaling laws provide the relationship between a full-scale structure and its scale model, and can be used to extrapolate the experimental data of a small, inexpensive, and testable model into design information for a large prototype. Due to the large number of design parameters, the identification of the principal scaling laws by the conventional method (dimensional analysis) is tedious. Similitude theory based on the governing equations of the structural system is more direct and simpler in execution. The difficulty of making completely similar scale models often leads to accepting a certain type of distortion from exact duplication of the prototype (partial similarity). Both complete and partial similarity are discussed. The procedure consists of systematically observing the effect of each parameter and the corresponding scaling laws. Then acceptable intervals and limitations for these parameters and scaling laws are discussed. In each case, a set of valid scaling factors and corresponding response scaling laws that accurately predict the response of prototypes from experimental models is introduced. The examples used include rectangular laminated plates under destabilizing loads, applied individually, vibrational characteristics of the same plates, as well as cylindrical bending of beam-plates.
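A textbook special case of such a scaling law (complete similarity with the same material; a sketch, not the paper's partial-similarity analysis): the transverse vibration frequency of an isotropic plate of thickness h and side a scales as

    \omega \propto \frac{h}{a^2} \sqrt{\frac{E}{\rho (1 - \nu^2)}}

so if every dimension of the model is \lambda times that of the prototype (h_m = \lambda h_p, a_m = \lambda a_p), then \omega_m = \omega_p / \lambda: a half-scale model vibrates at twice the prototype frequency, and measured model responses convert directly into prototype predictions.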
Huang, Dan; Chen, Xuejuan; Gong, Qi; Yuan, Chaoqun; Ding, Hui; Bai, Jing; Zhu, Hui; Fu, Zhujun; Yu, Rongbin; Liu, Hu
2016-01-01
This survey was conducted to determine the testability, distribution and associations of ocular biometric parameters in Chinese preschool children. Ocular biometric examinations, including the axial length (AL) and corneal radius of curvature (CR), were conducted on 1,688 3-year-old subjects by using an IOLMaster in August 2015. Anthropometric parameters, including height and weight, were measured according to a standardized protocol, and body mass index (BMI) was calculated. The testability was 93.7% for the AL and 78.6% for the CR overall, and both measures improved with age. Girls performed slightly better in AL measurements (P = 0.08), and the difference in CR was statistically significant (P < 0.05). The AL distribution was normal in girls (P = 0.12), whereas it was not in boys (P < 0.05). For CR1, all subgroups presented normal distributions (P = 0.16 for boys; P = 0.20 for girls), but the distribution varied when the subgroups were combined (P < 0.05). CR2 presented a normal distribution (P = 0.11), whereas the AL/CR ratio was abnormal (P < 0.001). Boys exhibited a significantly longer AL, a greater CR and a greater AL/CR ratio than girls (all P < 0.001). PMID:27384307
Modelling the spread of innovation in wild birds.
Shultz, Thomas R; Montrey, Marcel; Aplin, Lucy M
2017-06-01
We apply three plausible algorithms in agent-based computer simulations to recent experiments on social learning in wild birds. Although some of the phenomena are simulated by all three learning algorithms, several manifestations of social conformity bias are simulated by only the approximate majority (AM) algorithm, which has roots in chemistry, molecular biology and theoretical computer science. The simulations generate testable predictions and provide several explanatory insights into the diffusion of innovation through a population. The AM algorithm's success raises the possibility of its usefulness in studying group dynamics more generally, in several different scientific domains. Our differential-equation model matches simulation results and provides mathematical insights into the dynamics of these algorithms. © 2017 The Author(s).
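For concreteness, a minimal simulation of the three-state approximate-majority (AM) dynamics written as a population protocol (a generic textbook formulation of AM, not the paper's agent-based bird-learning implementation):

    import random

    random.seed(0)
    pop = ["A"] * 60 + ["B"] * 40              # two competing behaviors, 60/40 split

    steps = 0
    while len(set(pop)) > 1:                   # run until consensus on one behavior
        i, j = random.sample(range(len(pop)), 2)   # ordered initiator/responder pair
        si, sj = pop[i], pop[j]
        if si in ("A", "B") and sj not in (si, "U"):
            pop[j] = "U"                       # opposing behavior is erased to undecided
        elif si in ("A", "B") and sj == "U":
            pop[j] = si                        # undecided responder adopts initiator's behavior
        steps += 1

    print("consensus:", pop[0], "after", steps, "pairwise interactions")

The protocol converges rapidly to the initial majority with high probability, which is the conformity-like amplification the simulations exploit.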
NASA Astrophysics Data System (ADS)
Chun, E. J.; Cvetič, G.; Dev, P. S. B.; Drewes, M.; Fong, C. S.; Garbrecht, B.; Hambye, T.; Harz, J.; Hernández, P.; Kim, C. S.; Molinaro, E.; Nardi, E.; Racker, J.; Rius, N.; Zamora-Saa, J.
2018-02-01
The focus of this paper lies on the possible experimental tests of leptogenesis scenarios. We consider both leptogenesis generated from oscillations, as well as leptogenesis from out-of-equilibrium decays. As the Akhmedov-Rubakov-Smirnov (ARS) mechanism allows for heavy neutrinos in the GeV range, this opens up a plethora of possible experimental tests, e.g. at neutrino oscillation experiments, neutrinoless double beta decay, and direct searches for neutral heavy leptons at future facilities. In contrast, testing leptogenesis from out-of-equilibrium decays is a quite difficult task. We comment on the necessary conditions for having successful leptogenesis at the TeV-scale. We further discuss possible realizations and their model specific testability in extended seesaw models, models with extended gauge sectors, and supersymmetric leptogenesis. Not being able to test high-scale leptogenesis directly, we present a way to falsify such scenarios by focusing on their washout processes. This is discussed specifically for the left-right symmetric model and the observation of a heavy WR, as well as model independently when measuring ΔL = 2 washout processes at the LHC or neutrinoless double beta decay.
Stereoacuity of preschool children with and without vision disorders.
Ciner, Elise B; Ying, Gui-Shuang; Kulp, Marjean Taylor; Maguire, Maureen G; Quinn, Graham E; Orel-Bixler, Deborah; Cyert, Lynn A; Moore, Bruce; Huang, Jiayan
2014-03-01
To evaluate associations between stereoacuity and presence, type, and severity of vision disorders in Head Start preschool children and determine testability and levels of stereoacuity by age in children without vision disorders. Stereoacuity of children aged 3 to 5 years (n = 2898) participating in the Vision in Preschoolers (VIP) Study was evaluated using the Stereo Smile II test during a comprehensive vision examination. This test uses a two-alternative forced-choice paradigm with four stereoacuity levels (480 to 60 seconds of arc). Children were classified by the presence (n = 871) or absence (n = 2027) of VIP Study-targeted vision disorders (amblyopia, strabismus, significant refractive error, or unexplained reduced visual acuity), including type and severity. Median stereoacuity between groups and among severity levels of vision disorders was compared using Wilcoxon rank sum and Kruskal-Wallis tests. Testability and stereoacuity levels were determined for children without VIP Study-targeted disorders overall and by age. Children with VIP Study-targeted vision disorders had significantly worse median stereoacuity than that of children without vision disorders (120 vs. 60 seconds of arc, p < 0.001). Children with the most severe vision disorders had worse stereoacuity than that of children with milder disorders (median 480 vs. 120 seconds of arc, p < 0.001). Among children without vision disorders, testability was 99.6% overall, increasing with age to 100% for 5-year-olds (p = 0.002). Most of the children without vision disorders (88%) had stereoacuity at the two best disparities (60 or 120 seconds of arc); the percentage increasing with age (82% for 3-, 89% for 4-, and 92% for 5-year-olds; p < 0.001). The presence of any VIP Study-targeted vision disorder was associated with significantly worse stereoacuity in preschool children. Severe vision disorders were more likely associated with poorer stereopsis than milder or no vision disorders. Testability was excellent at all ages. These results support the validity of the Stereo Smile II for assessing random-dot stereoacuity in preschool children.
The Assurance Challenges of Advanced Packaging Technologies for Electronics
NASA Technical Reports Server (NTRS)
Sampson, Michael J.
2010-01-01
Advances in microelectronic parts performance are driving towards finer feature sizes, three-dimensional geometries and ever-increasing numbers of transistor equivalents, resulting in increased die sizes and interconnection (I/O) counts. The resultant packaging necessary to provide assemble-ability, environmental protection, testability and interconnection to the circuit board for the active die creates major challenges, particularly for space applications. Traditionally, NASA has used hermetically packaged microcircuits whenever available, but the new demands make hermetic packaging less and less practical at the same time as more and more expensive. Some part types of great interest to NASA designers are currently only available in non-hermetic packaging. It is a far more complex quality and reliability assurance challenge to gain confidence in the long-term survivability and effectiveness of non-hermetic packages than of hermetic ones. Although they may provide more rugged environmental protection than the familiar Plastic Encapsulated Microcircuits (PEMs), the non-hermetic Ceramic Column Grid Array (CCGA) packages that are the focus of this presentation present a unique combination of challenges to assessing their suitability for spaceflight use. The presentation will discuss the bases for these challenges, some examples of the techniques proposed to mitigate them, and a proposed approach to a US MIL specification class for non-hermetic microcircuits suitable for space application, Class Y, to be incorporated into MIL-PRF-38535. It has recently emerged that some major packaging suppliers are offering hermetic area array packages that may offer alternatives to the non-hermetic CCGA styles but have their own inspectability and testability issues, which will be briefly discussed in the presentation.
The diffusion decision model: theory and data for two-choice decision tasks.
Ratcliff, Roger; McKoon, Gail
2008-04-01
The diffusion decision model allows detailed explanations of behavior in two-choice discrimination tasks. In this article, the model is reviewed to show how it translates behavioral data (accuracy, mean response times, and response time distributions) into components of cognitive processing. Three experiments are used to illustrate experimental manipulations of three components: stimulus difficulty affects the quality of information on which a decision is based; instructions emphasizing either speed or accuracy affect the criterial amounts of information that a subject requires before initiating a response; and the relative proportions of the two stimuli affect biases in drift rate and starting point. The experiments also illustrate the strong constraints that ensure the model is empirically testable and potentially falsifiable. The broad range of applications of the model is also reviewed, including research in the domains of aging and neurophysiology.
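A minimal sketch of simulating the core diffusion process (generic parameter values; the full model adds across-trial variability in drift rate, starting point, and nondecision time):

    import numpy as np

    def ddm_trial(drift=0.3, bound=1.0, start=0.5, noise=1.0, dt=0.001, rng=None):
        """Simulate one two-choice trial; returns (choice, decision time in s)."""
        rng = rng or np.random.default_rng()
        x, t = start, 0.0
        while 0.0 < x < bound:                 # accumulate evidence to either boundary
            x += drift * dt + noise * np.sqrt(dt) * rng.normal()
            t += dt
        return (1 if x >= bound else 0), t

    rng = np.random.default_rng(0)
    trials = [ddm_trial(rng=rng) for _ in range(2000)]
    choices = np.array([c for c, _ in trials])
    rts = np.array([t for _, t in trials])
    print("P(upper boundary):", choices.mean(), " mean decision time:", rts.mean())

Raising the drift rate increases accuracy and speeds responses, while widening the boundary separation trades speed for accuracy, mirroring the experimental manipulations described above.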
The Law of Self-Acting Machines and Irreversible Processes with Reversible Replicas
NASA Astrophysics Data System (ADS)
Valev, Pentcho
2002-11-01
Clausius and Kelvin saved Carnot's theorem and developed the second law by assuming that Carnot machines can work in the absence of an operator and that all irreversible processes have reversible replicas. The former assumption restored Carnot's theorem as an experience of mankind, whereas the latter generated "the law of ever increasing entropy". Both assumptions are wrong, so it makes sense to return to Carnot's theorem (or some equivalent) and test it experimentally. Two testable paradigms, the system performing two types of reversible work and the system in dynamical equilibrium, suggest that a perpetuum mobile of the second kind in the presence of an operator is possible. The deviation from the second-law prediction, expressed as the difference between partial derivatives in a Maxwell relation, measures the degree of structural-functional evolution for the respective system.
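The abstract does not say which Maxwell relation is intended; as one concrete example (an assumption made here purely for illustration), the relation that follows from the Helmholtz free energy F(T, V) reads

\[
\left(\frac{\partial S}{\partial V}\right)_{T} \;=\; \left(\frac{\partial P}{\partial T}\right)_{V},
\]

so the proposed measure of deviation would be a nonzero difference \(\Delta = (\partial S/\partial V)_T - (\partial P/\partial T)_V\), a quantity that equals zero whenever the second law holds.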
Mechanics of undulatory swimming in a frictional fluid.
Ding, Yang; Sharpe, Sarah S; Masse, Andrew; Goldman, Daniel I
2012-01-01
The sandfish lizard (Scincus scincus) swims within granular media (sand) using axial body undulations to propel itself without the use of limbs. In previous work we predicted average swimming speed by developing a numerical simulation that incorporated experimentally measured biological kinematics into a multibody sandfish model. The model was coupled to an experimentally validated soft sphere discrete element method simulation of the granular medium. In this paper, we use the simulation to study the detailed mechanics of undulatory swimming in a "granular frictional fluid" and compare the predictions to our previously developed resistive force theory (RFT) which models sand-swimming using empirically determined granular drag laws. The simulation reveals that the forward speed of the center of mass (CoM) oscillates about its average speed in antiphase with head drag. The coupling between overall body motion and body deformation results in a non-trivial pattern in the magnitude of lateral displacement of the segments along the body. The actuator torque and segment power are maximal near the center of the body and decrease to zero toward the head and the tail. Approximately 30% of the net swimming power is dissipated in head drag. The power consumption is proportional to the frequency in the biologically relevant range, which confirms that frictional forces dominate during sand-swimming by the sandfish. Comparison of the segmental forces measured in simulation with the force on a laterally oscillating rod reveals that a granular hysteresis effect causes the overestimation of the body thrust forces in the RFT. Our models provide detailed testable predictions for biological locomotion in a granular environment.
Thalamocortical mechanisms for integrating musical tone and rhythm
Musacchia, Gabriella; Large, Edward
2014-01-01
Studies over several decades have identified many of the neuronal substrates of music perception by pursuing pitch and rhythm perception separately. Here, we address the question of how these mechanisms interact, starting with the observation that the peripheral pathways of the so-called “Core” and “Matrix” thalamocortical system provide the anatomical bases for tone and rhythm channels. We then examine the hypothesis that these specialized inputs integrate tonal content within rhythm context in auditory cortex using classical types of “driving” and “modulatory” mechanisms. This hypothesis provides a framework for deriving testable predictions about the early stages of music processing. Furthermore, because thalamocortical circuits are shared by speech and music processing, such a model provides concrete implications for how music experience contributes to the development of robust speech encoding mechanisms. PMID:24103509
NASA Astrophysics Data System (ADS)
Lucia, Umberto; Ponzetto, Antonio; Deisboeck, Thomas S.
2016-01-01
To investigate biosystems, we propose a new thermodynamic concept that analyses ion, mass and energy flows across the cell membrane. This paradigm-shifting approach has wide applicability to medically relevant topics, including advancing cancer treatment. To support this claim, we revisit the ‘Norton-Simon’ hypothesis, evolving it from an already important anti-cancer hypothesis into a thermodynamic theorem in medicine. We confirm that an increase in proliferation and a reduction in apoptosis trigger a maximum of ATP consumption by the tumor cell. Moreover, we find that positive, membrane-crossing ions lead to a decrease in the energy used by the tumor, supporting the notion of their growth-inhibitory effect, while negative ions apparently increase the cancer’s consumption of energy, hence reflecting a growth-promoting impact. Our results not only represent a thermodynamic proof of the original Norton-Simon hypothesis but, more concretely, they also advance the clinically intriguing and experimentally testable diagnostic hypothesis that observing an increase in negative ions inside a cell in vitro, and inside a diseased tissue in vivo, may indicate growth or recurrence of a tumor. We conclude by providing theoretical evidence that applying electromagnetic field therapy early in the treatment cycle may maximize its anti-cancer efficacy.
Nonequilibrium transport in the pseudospin-1 Dirac-Weyl system
NASA Astrophysics Data System (ADS)
Wang, Cheng-Zhen; Xu, Hong-Ya; Huang, Liang; Lai, Ying-Cheng
2017-09-01
Recently, solid-state materials hosting pseudospin-1 quasiparticles have attracted a great deal of attention. In these materials, the energy band contains a pair of Dirac cones and a flatband through the connecting point of the cones. Because it "cages" carriers with zero group velocity, the flatband itself has zero conductivity. However, in a nonequilibrium situation where a constant electric field is suddenly switched on, the flatband can enhance the resulting current in both the linear and nonlinear response regimes through distinct physical mechanisms. Using the (2+1)-dimensional pseudospin-1 Dirac-Weyl system as a concrete setting, we demonstrate that, in the weak-field regime, the interband current is about twice as large as that of the pseudospin-1/2 system, due to the interplay between the flatband and the negative band, with the scaling behavior determined by the Kubo formula. In the strong-field regime, the intraband current is √2 times larger than that in the pseudospin-1/2 system, due to the additional contribution from particles residing in the flatband. In this case, the current and field follow the scaling law associated with Landau-Zener tunneling. These results provide a better understanding of the role of the flatband in nonequilibrium transport and are experimentally testable using electronic or photonic systems.
Contreras-López, Orlando; Moyano, Tomás C; Soto, Daniela C; Gutiérrez, Rodrigo A
2018-01-01
The rapid increase in the availability of transcriptomics data generated by RNA sequencing represents both a challenge and an opportunity for biologists without bioinformatics training. The challenge is handling, integrating, and interpreting these data sets. The opportunity is to use this information to generate testable hypotheses to understand molecular mechanisms controlling gene expression and biological processes (Fig. 1). A successful strategy to generate tractable hypotheses from transcriptomics data has been to build undirected network graphs based on patterns of gene co-expression. Many examples of new hypotheses derived from network analyses can be found in the literature, spanning different organisms including plants and specific fields such as root developmental biology. In order to make the process of constructing a gene co-expression network more accessible to biologists, here we provide step-by-step instructions using published RNA-seq experimental data obtained from a public database. Similar strategies have been used in previous studies to advance root developmental biology. This guide includes basic instructions for the operation of widely used open source platforms such as Bio-Linux, R, and Cytoscape. Even though the data we used in this example were obtained from Arabidopsis thaliana, the workflow developed in this guide can be easily adapted to work with RNA-seq data from any organism.
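The guide itself works with Bio-Linux, R, and Cytoscape; as a compact sketch of the same idea in Python, the snippet below turns a normalized count matrix into a thresholded co-expression edge list (the file names and the 0.9 cutoff are placeholders, not values from the guide).

```python
import numpy as np
import pandas as pd

# Hypothetical input: rows = genes, columns = samples (normalized counts)
expr = pd.read_csv("normalized_counts.csv", index_col=0)

corr = expr.T.corr(method="pearson")        # gene-by-gene correlation matrix
np.fill_diagonal(corr.values, 0.0)          # ignore self-correlations

threshold = 0.9                             # tunable co-expression cutoff
i, j = np.where(np.triu(corr.values, k=1) >= threshold)
edges = pd.DataFrame({"source": corr.index[i],
                      "target": corr.columns[j],
                      "weight": corr.values[i, j]})
edges.to_csv("coexpression_edges.csv", index=False)
```

The resulting edge list can be imported into Cytoscape as a network table, mirroring the workflow described in the guide.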
Slot-like capacity and resource-like coding in a neural model of multiple-item working memory.
Standage, Dominic; Pare, Martin
2018-06-27
For the past decade, research on the storage limitations of working memory has been dominated by two fundamentally different hypotheses. On the one hand, the contents of working memory may be stored in a limited number of 'slots', each with a fixed resolution. On the other hand, any number of items may be stored, but with decreasing resolution. These two hypotheses have been invaluable in characterizing the computational structure of working memory, but neither provides a complete account of the available experimental data, nor speaks to the neural basis of the limitations it characterizes. To address these shortcomings, we simulated a multiple-item working memory task with a cortical network model, the cellular resolution of which allowed us to quantify the coding fidelity of memoranda as a function of memory load, as measured by the discriminability, regularity and reliability of simulated neural spiking. Our simulations account for a wealth of neural and behavioural data from human and non-human primate studies, and they demonstrate that feedback inhibition lowers both capacity and coding fidelity. Because the strength of inhibition scales with the number of items stored by the network, increasing this number progressively lowers fidelity until capacity is reached. Crucially, the model makes specific, testable predictions for neural activity on multiple-item working memory tasks.
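As a toy contrast between the two hypotheses (a schematic sketch only; the functional forms and numbers are arbitrary, and this is not the article's spiking network model):

```python
import numpy as np

def slot_prediction(load, K=3, sd_item=12.0):
    """Slot hypothesis: up to K items stored at fixed resolution;
    a probed item beyond capacity yields a random guess."""
    p_stored = min(1.0, K / load)
    return p_stored, sd_item            # (P(item stored), recall SD if stored)

def resource_prediction(load, J_total=40.0):
    """Resource hypothesis: all items stored, but precision J = J_total/load,
    so recall SD grows with the square root of load."""
    return 1.0, 1.0 / np.sqrt(J_total / load)

for n in (1, 2, 4, 8):
    print(n, slot_prediction(n), resource_prediction(n))
```

The slot model predicts constant recall error up to capacity and guessing beyond it, while the resource model predicts error growing smoothly with load; per its title, the network model here produces a hybrid, with resource-like degradation of fidelity up to a slot-like capacity limit.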
Slowly switching between environments facilitates reverse evolution in small populations.
Tan, Longzhi; Gore, Jeff
2012-10-01
Natural populations must constantly adapt to ever-changing environmental conditions. A particularly interesting question is whether such adaptations can be reversed by returning the population to an ancestral environment. Such evolutionary reversals have been observed in both natural and laboratory populations. However, the factors that determine the reversibility of evolution are still under debate. The time scales of environmental change vary over a wide range, but little is known about how the rate of environmental change influences the reversibility of evolution. Here, we demonstrate computationally that slowly switching between environments increases the reversibility of evolution for small populations that are subject to only modest clonal interference. For small populations, slow switching reduces the mean number of mutations acquired in a new environment and also increases the probability of reverse evolution at each of these "genetic distances." As the population size increases, slow switching no longer reduces the genetic distance, thus decreasing the evolutionary reversibility. We confirm this effect using both a phenomenological model of clonal interference and also a Wright-Fisher stochastic simulation that incorporates genetic diversity. Our results suggest that the rate of environmental change is a key determinant of the reversibility of evolution, and provides testable hypotheses for experimental evolution. © 2012 The Author(s). Evolution © 2012 The Society for the Study of Evolution.
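A minimal single-locus caricature of environment switching (not the authors' clonal-interference or genetic-distance model; the function name and all parameters here are invented for illustration) can be written as a Wright-Fisher simulation:

```python
import numpy as np

rng = np.random.default_rng(1)

def switching_wf(N=100, s=0.05, tau=100, epochs=4, p0=0.5):
    """Wright-Fisher sketch: an allele favored (+s) in environment A and
    disfavored (-s) in environment B, with environments alternating every
    tau generations. Returns the allele-frequency trajectory."""
    p, traj = p0, [p0]
    for epoch in range(epochs):
        s_now = s if epoch % 2 == 0 else -s
        for _ in range(tau):
            w = p * (1 + s_now) / (p * (1 + s_now) + (1 - p))  # selection
            p = rng.binomial(N, w) / N                         # drift
            traj.append(p)
    return np.array(traj)

slow = switching_wf(tau=400, epochs=2)    # slow switching
fast = switching_wf(tau=25, epochs=32)    # fast switching, same total time
print(slow[-1], fast[-1])
```

Comparing final frequencies across many replicates at different tau and N is the kind of computation the paper scales up, with mutation and clonal interference added.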
The Contribution of Psychosocial Stress to the Obesity Epidemic
Siervo, M.; Wells, J. C. K.; Cizza, G.
2009-01-01
The Thrifty Gene hypothesis theorizes that during evolution a set of genes has been selected to ensure survival in environments with limited food supply and marked seasonality. Contemporary environments have predictable and unlimited food availability, an attenuated seasonality due to artificial lighting, indoor heating during the winter and air conditioning during the summer, and promote sedentariness and overeating. In this setting the thrifty genes are constantly activated to enhance energy storage. Psychosocial stress and sleep deprivation are other features of modern societies. Stress-induced hypercortisolemia in the setting of unlimited food supply promotes adiposity. Modern man is becoming obese because these ancient mechanisms are efficiently promoting a positive energy balance. We propose that in today’s plentifully provisioned societies, where sedentariness and mental stress have become typical traits, chronic activation of the neuroendocrine systems may contribute to the increased prevalence of obesity. We suggest that some of the yet unidentified thrifty genes may be linked to highly conserved energy sensing mechanisms (AMP kinase, mTOR kinase). These hypotheses are testable. Rural societies that are becoming rapidly industrialized and are witnessing a dramatic increase in obesity may provide a historical opportunity to conduct epidemiological studies of the thrifty genotype. In experimental settings, the effects of various forms of psychosocial stress in increasing metabolic efficiency and gene expression can be further tested. PMID:19156597
Fields, Chris
2011-01-01
The perception of persisting visual objects is mediated by transient intermediate representations, object files, that are instantiated in response to some, but not all, visual trajectories. The standard object file concept does not, however, provide a mechanism sufficient to account for all experimental data on visual object persistence, object tracking, and the ability to perceive spatially disconnected stimuli as continuously existing objects. Based on relevant anatomical, functional, and developmental data, a functional model is constructed that bases visual object individuation on the recognition of temporal sequences of apparent center-of-mass positions that are specifically identified as trajectories by dedicated “trajectory recognition networks” downstream of the medial–temporal motion-detection area. This model is shown to account for a wide range of data, and to generate a variety of testable predictions. Individual differences in the recognition, abstraction, and encoding of trajectory information are expected to generate distinct object persistence judgments and object recognition abilities. Dominance of trajectory information over feature information in stored object tokens during early infancy, in particular, is expected to disrupt the ability to re-identify human and other individuals across perceptual episodes, and lead to developmental outcomes with characteristics of autism spectrum disorders. PMID:21716599
Avdoshenko, Stanislav M; Das, Atanu; Satija, Rohit; Papoian, Garegin A; Makarov, Dmitrii E
2017-03-21
A long time ago, Kuhn predicted that long polymers should approach a limit where their global motion is controlled by solvent friction alone, with ruggedness of their energy landscapes having no consequences for their dynamics. In contrast, internal friction effects are important for polymers of modest length. Internal friction in proteins, in particular, affects how fast they fold or find their binding targets and, as such, has attracted much recent attention. Here we explore the molecular origins of internal friction in unfolded proteins using atomistic simulations, coarse-grained models and analytic theory. We show that the characteristic internal friction timescale is directly proportional to the timescale of hindered dihedral rotations within the polypeptide chain, with a proportionality coefficient b that is independent of the chain length. Such chain length independence of b provides experimentally testable evidence that internal friction arises from concerted, crankshaft-like dihedral rearrangements. In accord with phenomenological models of internal friction, we find the global reconfiguration timescale of a polypeptide to be the sum of solvent friction and internal friction timescales. At the same time, the time evolution of inter-monomer distances within polypeptides deviates both from the predictions of those models and from a simple, one-dimensional diffusion model.
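In symbols (a direct restatement of the two quoted relations, with the timescale names chosen here for readability):

\[
\tau_{\mathrm{reconf}} \;=\; \tau_{\mathrm{solvent}} \;+\; \tau_{\mathrm{int}},
\qquad
\tau_{\mathrm{int}} \;=\; b\,\tau_{\mathrm{dihedral}},
\]

where the chain-length independence of the coefficient b is the experimentally testable signature of concerted, crankshaft-like dihedral rearrangements.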
Critical Roles of the Direct GABAergic Pallido-cortical Pathway in Controlling Absence Seizures
Li, Min; Ma, Tao; Wu, Shengdun; Ma, Jingling; Cui, Yan; Xia, Yang; Xu, Peng; Yao, Dezhong
2015-01-01
The basal ganglia (BG), serving as an intermediate bridge between the cerebral cortex and thalamus, are believed to play crucial roles in controlling absence seizure activities generated by the pathological corticothalamic system. Inspired by recent experiments, here we systematically investigate the contribution of a novel identified GABAergic pallido-cortical pathway, projecting from the globus pallidus externa (GPe) in the BG to the cerebral cortex, to the control of absence seizures. By computational modelling, we find that both increasing the activation of GPe neurons and enhancing the coupling strength of the inhibitory pallido-cortical pathway can suppress the bilaterally synchronous 2–4 Hz spike and wave discharges (SWDs) during absence seizures. Appropriate tuning of several GPe-related pathways may also trigger the SWD suppression, through modulating the activation level of GPe neurons. Furthermore, we show that the previously discovered bidirectional control of absence seizures due to the competition between other two BG output pathways also exists in our established model. Importantly, such bidirectional control is shaped by the coupling strength of this direct GABAergic pallido-cortical pathway. Our work suggests that the novel identified pallido-cortical pathway has a functional role in controlling absence seizures and the presented results might provide testable hypotheses for future experimental studies. PMID:26496656
Dini-Andreote, Francisco; Stegen, James C; van Elsas, Jan Dirk; Salles, Joana Falcão
2015-03-17
Ecological succession and the balance between stochastic and deterministic processes are two major themes within microbial ecology, but these conceptual domains have mostly developed independent of each other. Here we provide a framework that integrates shifts in community assembly processes with microbial primary succession to better understand mechanisms governing the stochastic/deterministic balance. Synthesizing previous work, we devised a conceptual model that links ecosystem development to alternative hypotheses related to shifts in ecological assembly processes. Conceptual model hypotheses were tested by coupling spatiotemporal data on soil bacterial communities with environmental conditions in a salt marsh chronosequence spanning 105 years of succession. Analyses within successional stages showed community composition to be initially governed by stochasticity, but as succession proceeded, there was a progressive increase in deterministic selection correlated with increasing sodium concentration. Analyses of community turnover among successional stages (which provide a larger spatiotemporal scale relative to within-stage analyses) revealed that changes in the concentration of soil organic matter were the main predictor of the type and relative influence of determinism. Taken together, these results suggest scale-dependency in the mechanisms underlying selection. To better understand mechanisms governing these patterns, we developed an ecological simulation model that revealed how changes in selective environments cause shifts in the stochastic/deterministic balance. Finally, we propose an extended, and experimentally testable, conceptual model integrating ecological assembly processes with primary and secondary succession. This framework provides a priori hypotheses for future experiments, thereby facilitating a systematic approach to understand assembly and succession in microbial communities across ecosystems.
Colquhoun, Heather L; Carroll, Kelly; Eva, Kevin W; Grimshaw, Jeremy M; Ivers, Noah; Michie, Susan; Sales, Anne; Brehaut, Jamie C
2017-09-29
Audit and feedback (A&F) is a common strategy for helping health providers to implement evidence into practice. Despite being extensively studied, health care A&F interventions remain variably effective, with overall effect sizes that have not improved since 2003. Contributing to this stagnation is the fact that most health care A&F interventions have largely been designed without being informed by theoretical understanding from the behavioral and social sciences. To determine if the trend can be improved, the objective of this study was to develop a list of testable, theory-informed hypotheses about how to design more effective A&F interventions. Using purposive sampling, semi-structured 60-90-min telephone interviews were conducted with experts in theories related to A&F from a range of fields (e.g., cognitive, health and organizational psychology, medical decision-making, economics). Guided by detailed descriptions of A&F interventions from the health care literature, interviewees described how they would approach the problem of designing improved A&F interventions. Specific, theory-informed hypotheses about the conditions for effective design and delivery of A&F interventions were elicited from the interviews. Three coders working independently assigned the resulting hypotheses to themes, and categories of themes, in an iterative process. We conducted 28 interviews and identified 313 theory-informed hypotheses, which were placed into 30 themes. The 30 themes included hypotheses related to the following five categories: A&F recipient (seven themes), content of the A&F (ten themes), process of delivery of the A&F (six themes), behavior that was the focus of the A&F (three themes), and other (four themes). We have identified a set of testable, theory-informed hypotheses from a broad range of behavioral and social sciences that suggest conditions for more effective A&F interventions. This work demonstrates the breadth of perspectives about A&F from non-healthcare-specific disciplines in a way that yields testable hypotheses for healthcare A&F interventions. These results will serve as the foundation for further work seeking to set research priorities among the A&F research community.
Exploring Operational Test and Evaluation of Unmanned Aircraft Systems: A Qualitative Case Study
NASA Astrophysics Data System (ADS)
Saliceti, Jose A.
The purpose of this qualitative case study was to explore and identify strategies that may potentially remedy operational test and evaluation procedures used to evaluate Unmanned Aircraft Systems (UAS) technology. The sample for analysis consisted of organizations testing and evaluating UASs (e.g., U.S. Air Force, U.S. Navy, U.S. Army, U.S. Marine Corps, U.S. Coast Guard, and Customs and Border Protection). A purposeful sampling technique was used to select 15 subject matter experts in the field of operational test and evaluation of UASs. A questionnaire was provided to participants to support a descriptive and robust study. Analysis of responses revealed themes related to each research question. Findings revealed that operational testers utilized requirements documents to extrapolate measures for testing UAS technology and to develop critical operational issues. The requirements documents were (a) developed without the contribution of stakeholders and operational testers, (b) developed with vague or unrealistic measures, and (c) developed without a systematic method to derive requirements from mission tasks. Four approaches are recommended to develop testable operational requirements and assist operational testers: (a) use a mission task analysis tool to derive requirements for mission-essential tasks for the system, (b) exercise collaboration among stakeholders and testers to ensure testable operational requirements based on mission tasks, (c) ensure testable measures are used in requirements documents, and (d) create a repository list of critical operational issues by mission area. The preparation of operational test and evaluation processes for UAS technology is not uniform across testers. The processes in place are not standardized; thus, test plan preparation and reporting differ among participants. A standard method should be used when preparing and reporting on UAS technology tests. Using a systematic process, such as mission-based test design, resonated among participants as an analytical method to link UAS mission tasks and measures of performance to the capabilities of the system under test when developing operational test plans. Further research should examine systems engineering designs for a requirements traceability matrix of mission tasks and subtasks, using an analysis tool that adequately evaluates UASs with an acceptable level of confidence in the results.
Zou, Cunlu; Ladroue, Christophe; Guo, Shuixia; Feng, Jianfeng
2010-06-21
Reverse-engineering approaches such as Bayesian network inference, ordinary differential equations (ODEs), information theory and Granger causality are widely applied to deriving causal relationships among different elements such as genes, proteins, metabolites, neurons and brain areas, based upon multi-dimensional spatial and temporal data. Here we focused on Granger causality in both the time and frequency domains and in local and global networks, and applied our approach to experimental data (genes and proteins). For a small gene network, Granger causality outperformed the other three approaches. A global protein network of 812 proteins was reconstructed using a novel approach. The obtained results fitted well with known experimental findings and predicted many experimentally testable results. In addition to interactions in the time domain, interactions in the frequency domain were also recovered. The results on the proteomic and gene data confirm that Granger causality is a simple and accurate approach to recover the network structure. Our approach is general and can be easily applied to other types of temporal data.
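A time-domain pairwise Granger test of the kind discussed here reduces to comparing two nested autoregressions with an F-test. The sketch below is a generic illustration, not the authors' implementation; the lag order p and the toy data are arbitrary.

```python
import numpy as np
from scipy import stats

def granger_f(x, y, p=2):
    """F-test for 'x Granger-causes y': compare an AR(p) model of y
    against the same model augmented with p lags of x."""
    T = len(y)
    Y = y[p:]
    lag_y = np.column_stack([y[p - k:T - k] for k in range(1, p + 1)])
    lag_x = np.column_stack([x[p - k:T - k] for k in range(1, p + 1)])
    X_r = np.column_stack([np.ones(T - p), lag_y])   # restricted model
    X_f = np.column_stack([X_r, lag_x])              # full model
    rss = lambda X: np.sum((Y - X @ np.linalg.lstsq(X, Y, rcond=None)[0]) ** 2)
    rss_r, rss_f = rss(X_r), rss(X_f)
    dof = (T - p) - (2 * p + 1)
    F = ((rss_r - rss_f) / p) / (rss_f / dof)
    return F, stats.f.sf(F, p, dof)                  # (F statistic, p-value)

rng = np.random.default_rng(0)
x = rng.standard_normal(500)
y = 0.8 * np.roll(x, 1) + 0.5 * rng.standard_normal(500)  # x drives y, lag 1
print(granger_f(x, y))   # large F, tiny p-value
print(granger_f(y, x))   # no significant reverse direction expected
```

Frequency-domain Granger causality, also used in the paper, decomposes this dependence across frequencies via the spectral transfer function rather than a single F-test.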
Multi-omics approach identifies molecular mechanisms of plant-fungus mycorrhizal interaction
Larsen, Peter E.; Sreedasyam, Avinash; Trivedi, Geetika; ...
2016-01-19
In mycorrhizal symbiosis, plant roots form close, mutually beneficial interactions with soil fungi. Before this mycorrhizal interaction can be established, however, plant roots must be capable of detecting potential beneficial fungal partners and initiating the gene expression patterns necessary to begin symbiosis. To predict plant root-mycorrhizal fungi sensor systems, we analyzed in vitro experiments of Populus tremuloides (aspen tree) and Laccaria bicolor (mycorrhizal fungi) interaction and leveraged over 200 previously published transcriptomic experimental data sets, 159 experimentally validated plant transcription factor binding motifs, and more than 120 thousand experimentally validated protein-protein interactions to generate models of pre-mycorrhizal sensor systems in aspen root. These sensor mechanisms link extracellular signaling molecules with gene regulation through a network comprised of membrane receptors, signal cascade proteins, transcription factors, and transcription factor binding DNA motifs. Modeling predicted four pre-mycorrhizal sensor complexes in aspen that interact with fifteen transcription factors to regulate the expression of 1184 genes in response to extracellular signals synthesized by Laccaria. Predicted extracellular signaling molecules include common signaling molecules such as phenylpropanoids, salicylate, and jasmonic acid. Lastly, this multi-omic computational modeling approach for predicting complex sensory networks yielded specific, testable biological hypotheses for mycorrhizal interaction signaling compounds, sensor complexes, and mechanisms of gene regulation.
NASA Technical Reports Server (NTRS)
Taylor, S. R.
1984-01-01
The concept that the Moon was fissioned from the Earth after core separation is the most readily testable hypothesis of lunar origin, since direct comparisons of lunar and terrestrial compositions can be made. Differences found in such comparisons introduce so many ad hoc adjustments to the fission hypothesis that it becomes untestable. Further constraints may be obtained from attempting to date the volatile-refractory element fractionation. The combination of chemical and isotopic problems suggests that the fission hypothesis is no longer viable, and separate terrestrial and lunar accretion from a population of fractionated precursor planetesimals provides a more reasonable explanation.
A transcriptional serenAID: the role of noncoding RNAs in class switch recombination
Yewdell, William T.; Chaudhuri, Jayanta
2017-01-01
During an immune response, activated B cells may undergo class switch recombination (CSR), a molecular rearrangement that allows B cells to switch from expressing IgM and IgD to a secondary antibody heavy chain isotype such as IgG, IgA or IgE. Secondary antibody isotypes provide the adaptive immune system with distinct effector functions to optimally combat various pathogens. CSR occurs between repetitive DNA elements within the immunoglobulin heavy chain (Igh) locus, termed switch (S) regions and requires the DNA-modifying enzyme activation-induced cytidine deaminase (AID). AID-mediated DNA deamination within S regions initiates the formation of DNA double-strand breaks, which serve as biochemical beacons for downstream DNA repair pathways that coordinate the ligation of DNA breaks. Myriad factors contribute to optimal AID targeting; however, many of these factors also localize to genomic regions outside of the Igh locus. Thus, a current challenge is to explain the specific targeting of AID to the Igh locus. Recent studies have implicated noncoding RNAs in CSR, suggesting a provocative mechanism that incorporates Igh-specific factors to enable precise AID targeting. Here, we chronologically recount the rich history of noncoding RNAs functioning in CSR to provide a comprehensive context for recent and future discoveries. We present a model for the RNA-guided targeting of AID that attempts to integrate historical and recent findings, and highlight potential caveats. Lastly, we discuss testable hypotheses ripe for current experimentation, and explore promising ideas for future investigations. PMID:28535205
Gillison, Andrew N; Asner, Gregory P; Fernandes, Erick C M; Mafalacusser, Jacinto; Banze, Aurélio; Izidine, Samira; da Fonseca, Ambrósio R; Pacate, Hermenegildo
2016-07-15
Sustainable biodiversity and land management require a cost-effective means of forecasting landscape response to environmental change. Conventional species-based, regional biodiversity assessments are rarely adequate for policy planning and decision making. We show how new ground and remotely-sensed survey methods can be coordinated to help elucidate and predict relationships between biodiversity, land use and soil properties along complex biophysical gradients that typify many similar landscapes worldwide. In the lower Zambezi valley, Mozambique we used environmental, gradient-directed transects (gradsects) to sample vascular plant species, plant functional types, vegetation structure, soil properties and land-use characteristics. Soil fertility indices were derived using novel multidimensional scaling of soil properties. To facilitate spatial analysis, we applied a probabilistic remote sensing approach, analyzing Landsat 7 satellite imagery to map photosynthetically active and inactive vegetation and bare soil along each gradsect. Despite the relatively low sample number, we found highly significant correlations between single and combined sets of specific plant, soil and remotely sensed variables that permitted testable spatial projections of biodiversity and soil fertility across the regional land-use mosaic. This integrative and rapid approach provides a low-cost, high-return and readily transferable methodology that permits the ready identification of testable biodiversity indicators for adaptive management of biodiversity and potential agricultural productivity. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Blacksberg, Jordana; Mahjoub, Ahmed; Poston, Michael; Brown, Mike; Eiler, John; Ehlmann, Bethany; Hand, Kevin; Carlson, Robert W.; Hodyss, Robert; Wong, Ian
2015-11-01
We present an experimental study aimed at exploring the hypothesis suggested by recent dynamical models - that the Jupiter Trojan asteroids originated in the outer solar system, were scattered by the same instability responsible for the radical rearrangement of the giant planets, and were subsequently captured in their current location (e.g., Morbidelli et al., 2005; Nesvorny et al., 2013). We seek to identify spectroscopic, chemical and isotopic properties that can tie the Trojan populations to these evolutionary pathways, providing experimental support for dynamical models and testable hypotheses that can feed into the design of experiments that might be performed on potential future missions to these and other primitive bodies. We present the results of experiments devised to explore the hypothesis that Kuiper Belt Objects (KBOs) represent the parent populations of the Trojan asteroids. Numerous thin ice films composed of select solar system volatiles (H2O, H2S, CH3OH, NH3) were grown in various mixtures to simulate compositional changes of icy bodies as a function of volatility and radial distance of formation from the Sun. Subsequent processing of these icy bodies was simulated using electron irradiation and heating. Visible reflectance spectra show significant reddening when H2S is present. Mid-infrared spectra confirm the formation of non-volatile sulfur-containing molecules in the products of H2S-containing ices. These experiments suggest that the presence of specific sulfur-bearing chemical species may play an important role in the colors of both the KBOs and Trojans today. Finally, we discuss the role of the silicate component expected on the surface of the Trojan asteroids (Emery et al., 2006), and the implications of a surface composed of silicates in intimate contact with the nonvolatile organic residues generated by ice irradiation. This work has been supported by the Keck Institute for Space Studies (KISS). The research described here was carried out at the Jet Propulsion Laboratory, Caltech, under a contract with the National Aeronautics and Space Administration (NASA) and at the Caltech Division of Geological and Planetary Sciences.
On testing VLSI chips for the big Viterbi decoder
NASA Technical Reports Server (NTRS)
Hsu, I. S.
1989-01-01
A general technique that can be used in testing very large scale integrated (VLSI) chips for the Big Viterbi Decoder (BVD) system is described. The test technique is divided into functional testing and fault-coverage testing. The purpose of functional testing is to verify that the design works functionally. Functional test vectors are converted from outputs of software simulations which simulate the BVD functionally. Fault-coverage testing is used to detect and, in some cases, to locate faulty components caused by fabrication defects. This type of testing is useful in screening out bad chips. Finally, design for testability, which is included in the BVD VLSI chip design, is described in considerable detail. Both the observability and controllability of a VLSI chip are greatly enhanced by including design-for-testability features.
Model-Based Testability Assessment and Directed Troubleshooting of Shuttle Wiring Systems
NASA Technical Reports Server (NTRS)
Deb, Somnath; Domagala, Chuck; Shrestha, Roshan; Malepati, Venkatesh; Cavanaugh, Kevin; Patterson-Hine, Ann; Sanderfer, Dwight; Cockrell, Jim; Norvig, Peter (Technical Monitor)
2000-01-01
We have recently completed a pilot study on the Space Shuttle wiring system commissioned by the Wiring Integrity Research (WIRe) team at NASA Ames Research Center. As the Space Shuttle ages, it is experiencing wiring degradation problems including arcing, chafing, insulation breakdown and broken conductors. A systematic and comprehensive test process is required to thoroughly test and quality-assure (QA) the wiring systems. The NASA WIRe team recognized the value of a formal model-based analysis for risk assessment and fault-coverage analysis. However, wiring systems are complex and involve over 50,000 wire segments. Therefore, NASA commissioned this pilot study with Qualtech Systems, Inc. (QSI) to explore means of automatically extracting high-fidelity multi-signal models from a wiring information database for use with QSI's Testability Engineering and Maintenance System (TEAMS) tool.
Higher-order Fourier analysis over finite fields and applications
NASA Astrophysics Data System (ADS)
Hatami, Pooya
Higher-order Fourier analysis is a powerful tool in the study of problems in additive and extremal combinatorics, for instance the study of arithmetic progressions in primes, where traditional Fourier analysis falls short. In recent years, higher-order Fourier analysis has found multiple applications in computer science in fields such as property testing and coding theory. In this thesis, we develop new tools within this theory with several new applications, such as a characterization theorem in algebraic property testing. One of our main contributions is a strong near-equidistribution result for regular collections of polynomials. The densities of small linear structures in subsets of Abelian groups can be expressed as certain analytic averages involving linear forms. Higher-order Fourier analysis examines such averages by approximating the indicator function of a subset by a function of a bounded number of polynomials. Then, to approximate the average, it suffices to know the joint distribution of the polynomials applied to the linear forms. We prove a near-equidistribution theorem that describes these distributions for the group F_p^n when p is a fixed prime. This fundamental fact was previously known only under various extra assumptions about the linear forms or the field size. We use this near-equidistribution theorem to settle a conjecture of Gowers and Wolf on the true complexity of systems of linear forms. Our next application is towards a characterization of testable algebraic properties. We prove that every locally characterized affine-invariant property of functions f : F_p^n → R with n ∈ N is testable. In fact, we prove that any such property P is proximity-obliviously testable. More generally, we show that any affine-invariant property that is closed under subspace restrictions and has "bounded complexity" is testable. We also prove that any property that can be described as the property of decomposing into a known structure of low-degree polynomials is locally characterized and is, hence, testable. We discuss several notions of regularity which allow us to deduce algorithmic versions of various regularity lemmas for polynomials by Green and Tao and by Kaufman and Lovett. We show that our algorithmic regularity lemmas for polynomials imply algorithmic versions of several results relying on regularity, such as decoding Reed-Muller codes beyond the list-decoding radius (for certain structured errors), and prescribed polynomial decompositions. Finally, motivated by the definition of Gowers norms, we investigate norms defined by different systems of linear forms. We give necessary conditions on the structure of systems of linear forms that define norms. We prove that such norms can be one of only two types, and, assuming that |F_p| is sufficiently large, they are essentially equivalent to either a Gowers norm or an L_p norm.
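For reference, the Gowers uniformity norm mentioned throughout is defined, for a real-valued function f on F_p^n (factors with |S| odd are conjugated in the complex-valued case), by

\[
\|f\|_{U^k}^{2^k} \;=\; \mathbb{E}_{x,\,h_1,\dots,h_k \in \mathbb{F}_p^n}\;\; \prod_{S \subseteq \{1,\dots,k\}} f\Big(x + \textstyle\sum_{i \in S} h_i\Big).
\]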
Bosdriesz, Evert; Magnúsdóttir, Stefanía; Bruggeman, Frank J; Teusink, Bas; Molenaar, Douwe
2015-06-01
Microorganisms rely on binding-protein assisted, active transport systems to scavenge for scarce nutrients. Several advantages of using binding proteins in such uptake systems have been proposed. However, a systematic, rigorous and quantitative analysis of the function of binding proteins is lacking. By combining knowledge of selection pressure and physicochemical constraints, we derive kinetic, thermodynamic, and stoichiometric properties of binding-protein dependent transport systems that enable a maximal import activity per amount of transporter. Under the hypothesis that this maximal specific activity of the transport complex is the selection objective, binding protein concentrations should exceed the concentration of both the scarce nutrient and the transporter. This increases the encounter rate of transporter with loaded binding protein at low substrate concentrations, thereby enhancing the affinity and specific uptake rate. These predictions are experimentally testable, and a number of observations confirm them. © 2015 FEBS.
Implementation of a quantum random number generator based on the optimal clustering of photocounts
NASA Astrophysics Data System (ADS)
Balygin, K. A.; Zaitsev, V. I.; Klimov, A. N.; Kulik, S. P.; Molotkov, S. N.
2017-10-01
To implement quantum random number generators, it is fundamentally important to have a mathematically provable and experimentally testable measurement process for the system from which the initial random sequence is generated; this ensures that the randomness indeed has a quantum nature. A quantum random number generator has been implemented with the use of the detection of quasi-single-photon radiation by a silicon photomultiplier (SiPM) matrix, which makes it possible to reliably reach the Poisson statistics of photocounts. The choice and use of the optimal clustering of photocounts for the initial sequence of photodetection events, together with a method of extraction of a random sequence of 0's and 1's that is polynomial in the length of the sequence, have made it possible to reach an output rate of 64 Mbit/s for the final random sequence.
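The paper's extractor is a specific polynomial-time procedure tied to the optimal clustering; purely as an illustration of deriving near-unbiased bits from Poisson-distributed photocounts, here is a classic von Neumann-style sketch (the simulated counts and every parameter are invented, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulated photocounts per time bin (Poisson statistics, as from a SiPM)
counts = rng.poisson(lam=1.3, size=100_000)

# Raw bits: compare successive counts, discarding ties
a, b = counts[0::2], counts[1::2]
keep = a != b
raw = (a[keep] > b[keep]).astype(np.uint8)

# Von Neumann debiasing: map pairs 01 -> 0, 10 -> 1; drop 00 and 11
pairs = raw[: len(raw) // 2 * 2].reshape(-1, 2)
mask = pairs[:, 0] != pairs[:, 1]
bits = pairs[mask, 0]
print(len(bits), float(bits.mean()))   # near-unbiased output bits
```

The price of this simple scheme is throughput, since most raw bits are discarded, which is why a high-rate generator like the one reported here needs a more efficient, provably sound extraction step.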
Advancing Biological Understanding and Therapeutics Discovery with Small Molecule Probes
Schreiber, Stuart L.; Kotz, Joanne D.; Li, Min; Aubé, Jeffrey; Austin, Christopher P.; Reed, John C.; Rosen, Hugh; White, E. Lucile; Sklar, Larry A.; Lindsley, Craig W.; Alexander, Benjamin R.; Bittker, Joshua A.; Clemons, Paul A.; de Souza, Andrea; Foley, Michael A.; Palmer, Michelle; Shamji, Alykhan F.; Wawer, Mathias J.; McManus, Owen; Wu, Meng; Zou, Beiyan; Yu, Haibo; Golden, Jennifer E.; Schoenen, Frank J.; Simeonov, Anton; Jadhav, Ajit; Jackson, Michael R.; Pinkerton, Anthony B.; Chung, Thomas D.Y.; Griffin, Patrick R.; Cravatt, Benjamin F.; Hodder, Peter S.; Roush, William R.; Roberts, Edward; Chung, Dong-Hoon; Jonsson, Colleen B.; Noah, James W.; Severson, William E.; Ananthan, Subramaniam; Edwards, Bruce; Oprea, Tudor I.; Conn, P. Jeffrey; Hopkins, Corey R.; Wood, Michael R.; Stauffer, Shaun R.; Emmitte, Kyle A.
2015-01-01
Small-molecule probes can illuminate biological processes and aid in the assessment of emerging therapeutic targets by perturbing biological systems in a manner distinct from other experimental approaches. Despite the tremendous promise of chemical tools for investigating biology and disease, small-molecule probes were unavailable for most targets and pathways as recently as a decade ago. In 2005, the U.S. National Institutes of Health launched the decade-long Molecular Libraries Program with the intent of innovating in and broadening access to small-molecule science. This Perspective describes how novel small-molecule probes identified through the program are enabling the exploration of biological pathways and therapeutic hypotheses not otherwise testable. These experiences illustrate how small-molecule probes can help bridge the chasm between biological research and the development of medicines, but also highlight the need to innovate the science of therapeutic discovery. PMID:26046436
Sequential pattern formation governed by signaling gradients
NASA Astrophysics Data System (ADS)
Jörg, David J.; Oates, Andrew C.; Jülicher, Frank
2016-10-01
Rhythmic and sequential segmentation of the embryonic body plan is a vital developmental patterning process in all vertebrate species. However, a theoretical framework capturing the emergence of dynamic patterns of gene expression from the interplay of cell oscillations with tissue elongation and shortening and with signaling gradients is still missing. Here we show that a set of coupled genetic oscillators in an elongating tissue that is regulated by diffusing and advected signaling molecules can account for segmentation as a self-organized patterning process. This system can form a finite number of segments, and the dynamics of segmentation and the total number of segments formed depend strongly on kinetic parameters describing tissue elongation and signaling molecules. The model accounts for existing experimental perturbations to signaling gradients, and makes testable predictions about novel perturbations. The variety of different patterns formed in our model can account for the variability of segmentation between different animal species.
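A bare-bones caricature of such self-organized segmentation (not the authors' model; the gradient shape, front speed, and all numbers are invented) is a row of phase oscillators frozen by an advancing front:

```python
import numpy as np

# Row of cellular oscillators; frequency decays as the front approaches,
# and phases freeze once the front passes, leaving a striped pattern.
n_cells, dt, steps = 200, 0.01, 2600
omega0, v_front, ell = 2 * np.pi, 10.0, 20.0   # base freq, front speed, gradient
x = np.arange(n_cells, dtype=float)
phase = np.zeros(n_cells)
frozen = np.zeros(n_cells, dtype=bool)

for step in range(steps):
    front = v_front * step * dt
    d = x - front                                # distance ahead of the front
    omega = omega0 * (1.0 - np.exp(-np.clip(d, 0.0, None) / ell))
    phase[~frozen] += omega[~frozen] * dt
    frozen |= d <= 0.0                           # passed cells stop oscillating

pattern = np.sin(phase)                          # frozen stripes along the axis
print(int(np.sum(np.diff(np.sign(pattern)) != 0)), "stripe boundaries")
```

In this cartoon the segment length is set by the front speed times the oscillation period, which is the sense in which kinetic parameters of elongation and signaling control the number of segments formed.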
Hypercharged dark matter and direct detection as a probe of reheating.
Feldstein, Brian; Ibe, Masahiro; Yanagida, Tsutomu T
2014-03-14
The lack of new physics at the LHC so far weakens the argument for TeV scale thermal dark matter. On the other hand, heavier, nonthermal dark matter is generally difficult to test experimentally. Here we consider the interesting and generic case of hypercharged dark matter, which can allow for heavy dark matter masses without spoiling testability. Planned direct detection experiments will be able to see a signal for masses up to an incredible 10^10 GeV, and this can further serve to probe the reheating temperature up to about 10^9 GeV, as determined by the nonthermal dark matter relic abundance. The Z-mediated nature of the dark matter scattering may be determined in principle by comparing scattering rates on different detector nuclei, which in turn can reveal the dark matter mass. We will discuss the extent to which future experiments may be able to make such a determination.
Quantifying quantum coherence with quantum Fisher information.
Feng, X N; Wei, L F
2017-11-14
Quantum coherence is one of the old but always important concepts in quantum mechanics, and it is now regarded as a necessary resource for quantum information processing and quantum metrology. However, the question of how to quantify quantum coherence has only recently received attention (see, e.g., Baumgratz et al., Phys. Rev. Lett. 113, 140401 (2014)). In this paper we verify that the well-known quantum Fisher information (QFI) can be utilized to quantify quantum coherence, as it satisfies monotonicity under the typical incoherent operations and convexity under the mixing of quantum states. Differing from most purely axiomatic methods, quantifying quantum coherence by the QFI is experimentally testable, as the bound of the QFI is practically measurable. The validity of our proposal is specifically demonstrated with the typical phase-damping and depolarizing evolution processes of a generic single-qubit state, and also by comparing it with other previously proposed quantifying methods.
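For a state with spectral decomposition ρ = Σ_i λ_i |i⟩⟨i| and a fixed generator A, the standard closed form of the QFI being used as the coherence quantifier is

\[
F_Q(\rho, A) \;=\; 2 \sum_{\lambda_i + \lambda_j > 0} \frac{(\lambda_i - \lambda_j)^2}{\lambda_i + \lambda_j}\, \big|\langle i | A | j \rangle\big|^2 ,
\]

which vanishes when ρ is diagonal in the eigenbasis of A (an incoherent state) and, for pure states, reduces to four times the variance of A.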
Laplanche, Christophe; Elger, Arnaud; Santoul, Frédéric; Thiede, Gary P.; Budy, Phaedra
2018-01-01
Management actions aimed at eradicating exotic fish species from riverine ecosystems can be better informed by forecasting abilities of mechanistic models. We illustrate this point with an example of the Logan River, Utah, originally populated with endemic cutthroat trout (Oncorhynchus clarkii utah), which compete with exotic brown trout (Salmo trutta). The coexistence equilibrium was disrupted by a large-scale experimental removal of the exotic species in 2009–2011 (on average, 8.2% of the stock each year), followed by an increase in the density of the native species. We built a spatially-explicit, reaction-diffusion model encompassing four key processes: population growth in heterogeneous habitat, competition, dispersal, and a management action. We calibrated the model with detailed long-term monitoring data (2001–2016) collected along the 35.4-km long river main channel. Our model, although simple, did a remarkable job reproducing the system steady state prior to the management action. Insights gained from the model's independent predictions are consistent with available knowledge and indicate that the exotic species is more competitive; however, the native species still occupies more favorable habitat upstream. Dynamic runs of the model also recreated the observed increase of the native species following the management action. The model can simulate two possible distinct long-term outcomes: recovery or eradication of the exotic species. The processing of available knowledge using Bayesian methods allowed us to conclude that the chance for eradication of the invader was low at the beginning of the experimental removal (0.7% in 2009) and increased (20.5% in 2016) by using more recent monitoring data. We show that accessible mathematical and numerical tools can provide highly informative insights for managers (e.g., outcome of their conservation actions), identify knowledge gaps, and provide testable theory for researchers.
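A schematic of the model class described (one-dimensional reaction-diffusion with logistic growth, Lotka-Volterra competition, and a yearly removal term; every parameter value below is illustrative, not the paper's calibrated estimate):

```python
import numpy as np

L_km, nx, dt, years = 35.4, 120, 0.002, 20
x = np.linspace(0.0, L_km, nx)
dx = x[1] - x[0]
K_nat = 1.0 - 0.5 * x / L_km          # native habitat better upstream
K_exo = 0.5 + 0.5 * x / L_km          # exotic habitat better downstream
r, D, a_ne, a_en, removal = 1.0, 0.05, 1.2, 0.6, 0.082

def lap(u):                            # reflective-boundary Laplacian
    return (np.r_[u[1], u[:-1]] - 2 * u + np.r_[u[1:], u[-2]]) / dx**2

N, E = 0.4 * np.ones(nx), 0.4 * np.ones(nx)
for year in range(years):
    for _ in range(int(1 / dt)):       # within-year growth, competition, dispersal
        N += dt * (r * N * (1 - (N + a_ne * E) / K_nat) + D * lap(N))
        E += dt * (r * E * (1 - (E + a_en * N) / K_exo) + D * lap(E))
    E *= 1 - removal                   # 8.2% of the exotic stock removed yearly
print(f"native mean {N.mean():.2f}, exotic mean {E.mean():.2f}")
```

Whether the exotic species recovers or declines once removal stops depends on the competition coefficients and habitat profiles, mirroring the two long-term outcomes (recovery or eradication) that the paper quantifies with Bayesian methods.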
Linking short-term responses to ecologically-relevant outcomes
Opportunity to participate in the conduct of collaborative integrative lab, field and modelling efforts to characterize molecular-to-organismal level responses and make quantitative testable predictions of population level outcomes
Hartmann, Ernest
2010-01-01
There is a widespread consensus that emotion is important in dreams, deriving from both biological and psychological studies. However, the emphasis on examining emotions explicitly mentioned in dreams is misplaced. The dream is basically made of imagery. The focus of our group has been on relating the dream imagery to the dreamer's underlying emotion. What is most important is the underlying emotion, the emotion of the dreamer, not the emotion in the dream. This chapter discusses many studies relating the dream, especially the central image of the dream, to the dreamer's underlying emotion. Focusing on the underlying emotion leads to a coherent and testable view of the nature of dreaming. It also helps to clarify some important puzzling features of the literature on dreams, such as why the clinical literature is different in so many ways from the experimental literature, especially the laboratory-based experimental literature. Based on central image intensity and the associated underlying emotion, we can identify a hierarchy of dreams, from the highest-intensity "big dreams" to the lowest-intensity dreams from laboratory awakenings. Copyright © 2010 Elsevier Inc. All rights reserved.
Bairagya, Hridoy R; Bansal, Manju
2016-03-01
Human Guanine Monophosphate Synthetase (hGMPS) converts XMP to GMP, and acts as a bifunctional enzyme with an N-terminal "glutaminase" (GAT) domain and a C-terminal "synthetase" domain. The enzyme has been identified as a potential target for anti-cancer and immunosuppressive therapies. The GAT domain of the enzyme plays a central role in metabolism, and contains the conserved catalytic residues Cys104, His190, and Glu192. MD simulation studies on the GAT domain suggest that the position of the oxyanion in the unliganded conformation is occupied by one conserved water molecule (W1), which also stabilizes that pocket. This position is occupied by a negatively charged atom of the substrate or ligand in ligand-bound crystal structures. In fact, an MD simulation study of the Ser75-to-Val mutant indicates that the W1 conserved water molecule is stabilized by Ser75, while Thr152 and His190 also act as anchor residues to maintain the appropriate architecture of the oxyanion pocket through water-mediated H-bond interactions. Possibly, four conserved water molecules stabilize the oxyanion hole in the unliganded state, but they vacate these positions when the enzyme (hGMPS)-substrate complex is formed. Thus this study not only reveals the functionally important role of conserved water molecules in the GAT domain, but also highlights the essential role of other non-catalytic residues such as Ser75 and Thr152 in this enzymatic domain. The results from this computational study could be of interest to the experimental community and provide a testable hypothesis for experimental validation. Conserved sites of water molecules near and at the oxyanion hole highlight the structural importance of water molecules and suggest a rethink of the conventional definition of the chemical geometry of the inhibitor binding site. © 2016 Wiley Periodicals, Inc.
Patel, Mainak
2018-01-15
The spiking of barrel regular-spiking (RS) cells is tuned for both whisker deflection direction and velocity. Velocity tuning arises due to thalamocortical (TC) synchrony (but not spike quantity) varying with deflection velocity, coupled with feedforward inhibition, while direction selectivity is not fully understood, though it may be due partly to direction tuning of TC spiking. Data show that as deflection direction deviates from the preferred direction of an RS cell, excitatory input to the RS cell diminishes minimally, but temporally shifts to coincide with the time-lagged inhibitory input. This work constructs a realistic large-scale model of a barrel; model RS cells exhibit velocity and direction selectivity due to TC input dynamics, with the experimentally observed sharpening of direction tuning with decreasing velocity. The model puts forth the novel proposal that RS→RS synapses can naturally and simply account for the unexplained direction dependence of RS cell inputs: as deflection direction deviates from the preferred direction of an RS cell, and TC input declines, RS→RS synaptic transmission buffers the decline in total excitatory input and causes a shift in timing of the excitatory input peak from the peak in TC input to the delayed peak in RS input. The model also provides several experimentally testable predictions on the velocity dependence of RS cell inputs. This model is the first, to my knowledge, to study the interaction of direction and velocity and propose physiological mechanisms for the stimulus dependence in the timing and amplitude of RS cell inputs. Copyright © 2017 IBRO. Published by Elsevier Ltd. All rights reserved.
A four-component model of the action potential in mouse detrusor smooth muscle cell
Brain, Keith L.; Young, John S.; Manchanda, Rohit
2018-01-01
Background and hypothesis: Detrusor smooth muscle cells (DSMCs) of the urinary bladder are electrically connected to one another via gap junctions and form a three-dimensional syncytium. DSMCs exhibit spontaneous electrical activity, including passive depolarizations and action potentials. The shapes of spontaneous action potentials (sAPs) observed from a single DSM cell can vary widely. The biophysical origins of this variability, and the precise components which contribute to the complex shapes observed, are not known. To address these questions, the basic components which constitute the sAPs were investigated. We hypothesized that linear combinations of scaled versions of these basic components can produce the sAP shapes observed in the syncytium. Methods and results: The basic components were identified as spontaneous evoked junction potentials (sEJP), native AP (nAP), slow after-hyperpolarization (sAHP), and very slow after-hyperpolarization (vsAHP). The experimental recordings were grouped into two sets: a training data set and a testing data set. The training set was used to estimate the components, and the test set to evaluate the efficiency of the estimated components. We found that a linear combination of the identified components, when appropriately amplified and time shifted, replicated various AP shapes to a high degree of similarity, as quantified by the root mean square error (RMSE) measure. Conclusions: We conclude that the four basic components—sEJP, nAP, sAHP, and vsAHP—identified and isolated in this work are necessary and sufficient to replicate all varieties of the sAPs recorded experimentally in DSMCs. This model has the potential to generate testable hypotheses that can help identify the physiological processes underlying various features of the sAPs. Further, this model also provides a means to classify the sAPs into various shape classes. PMID:29351282
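A minimal sketch of the superposition-and-RMSE idea follows. The templates and shift grid are illustrative placeholders, not the authors' estimated components; amplitudes are fit by ordinary least squares for fixed shifts.

# Sketch: reconstruct a recorded sAP as a sum of amplitude-scaled,
# time-shifted component templates (sEJP, nAP, sAHP, vsAHP) and score the
# fit with RMSE.
import numpy as np

def shift(template, k):
    """Delay `template` by k samples, zero-padding at the front."""
    out = np.zeros_like(template)
    out[k:] = template[:template.size - k]
    return out

def reconstruct(recording, templates, shifts):
    """Least-squares amplitudes for fixed per-component shifts;
    returns (fitted waveform, RMSE)."""
    basis = np.stack([shift(t, k) for t, k in zip(templates, shifts)], axis=1)
    amps, *_ = np.linalg.lstsq(basis, recording, rcond=None)
    fit = basis @ amps
    rmse = np.sqrt(np.mean((recording - fit) ** 2))
    return fit, rmse

# In practice the shifts would themselves be optimized, e.g. by a grid
# search over candidate delays, keeping the combination with lowest RMSE.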
A four-component model of the action potential in mouse detrusor smooth muscle cell.
Padmakumar, Mithun; Brain, Keith L; Young, John S; Manchanda, Rohit
2018-01-01
Detrusor smooth muscle cells (DSMCs) of the urinary bladder are electrically connected to one another via gap junctions and form a three-dimensional syncytium. DSMCs exhibit spontaneous electrical activity, including passive depolarizations and action potentials. The shapes of spontaneous action potentials (sAPs) observed from a single DSM cell can vary widely. The biophysical origins of this variability, and the precise components which contribute to the complex shapes observed, are not known. To address these questions, the basic components which constitute the sAPs were investigated. We hypothesized that linear combinations of scaled versions of these basic components can produce the sAP shapes observed in the syncytium. The basic components were identified as spontaneous evoked junction potentials (sEJP), native AP (nAP), slow after-hyperpolarization (sAHP), and very slow after-hyperpolarization (vsAHP). The experimental recordings were grouped into two sets: a training data set and a testing data set. The training set was used to estimate the components, and the test set to evaluate the efficiency of the estimated components. We found that a linear combination of the identified components, when appropriately amplified and time shifted, replicated various AP shapes to a high degree of similarity, as quantified by the root mean square error (RMSE) measure. We conclude that the four basic components--sEJP, nAP, sAHP, and vsAHP--identified and isolated in this work are necessary and sufficient to replicate all varieties of the sAPs recorded experimentally in DSMCs. This model has the potential to generate testable hypotheses that can help identify the physiological processes underlying various features of the sAPs. Further, this model also provides a means to classify the sAPs into various shape classes.
NASA Astrophysics Data System (ADS)
Liu, Hao; Guo, Xiang; Han, Jingcheng; Luo, Ray; Chen, Hai-Feng
2018-06-01
Transcription factor cyclic adenosine monophosphate response-element binding protein plays a critical role in the cyclic AMP response pathway via its intrinsically disordered kinase inducible transactivation domain (KID). KID is one of the most studied intrinsically disordered proteins (IDPs), although most previous studies focus on characterizing its disordered-state structures. An interesting question that remains to be answered is how the order-disorder transition occurs under experimental conditions. Thanks to the newly developed IDP-specific force field ff14IDPSFF, the quality of conformer sampling for IDPs has been dramatically improved. In this study, molecular dynamics (MD) simulations were used to study the order-to-disorder transition kinetics of KID, building on the good agreement with experiment for its disordered-state properties. Specifically, we tested four force fields, ff99SBildn, ff99IDPs, ff14IDPSFF, and ff14IDPs, in simulations of KID and found that ff14IDPSFF generates more diversified disordered conformers and also more accurately reproduces experimental secondary chemical shifts. Kinetics analysis of the MD simulations demonstrates that the order-disorder transition of KID obeys first-order kinetics, with the transition nucleus at I127/L128/L141. The possible transition pathways from the nucleus to the last folded residues were identified as I127-R125-L138-L141-S143-A145 and L128-R125-L138-L141-S143-A145 based on a residue-level dynamical network analysis. These computational studies not only provide a testable prediction/hypothesis on the order-disorder transition of KID but also confirm that the ff14IDPSFF force field can be used to explore the correlation between the structure and function of IDPs.
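The first-order-kinetics claim translates into a simple survival analysis: if the transition is first order, the fraction of trajectories still ordered at time t decays as exp(-kt). A sketch follows with invented transition times; the data are placeholders, not the study's measurements.

# Sketch of the first-order kinetics test: log(survival) should be linear
# in t, with slope -k. t_transition holds per-trajectory transition times.
import numpy as np

t_transition = np.array([12., 25., 31., 44., 58., 70., 95., 130.])  # ns, illustrative

t_grid = np.linspace(0, t_transition.max(), 50)
survival = np.array([(t_transition > t).mean() for t in t_grid])

mask = survival > 0                    # avoid log(0) at the tail
k = -np.polyfit(t_grid[mask], np.log(survival[mask]), 1)[0]
print(f"apparent first-order rate constant: {k:.4f} per ns")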
A cerebellar learning model of vestibulo-ocular reflex adaptation in wild-type and mutant mice.
Clopath, Claudia; Badura, Aleksandra; De Zeeuw, Chris I; Brunel, Nicolas
2014-05-21
Mechanisms of cerebellar motor learning are still poorly understood. The standard Marr-Albus-Ito theory posits that learning involves plasticity at the parallel fiber to Purkinje cell synapses under control of the climbing fiber input, which provides an error signal as in classical supervised learning paradigms. However, a growing body of evidence challenges this theory, in that additional sites of plasticity appear to contribute to motor adaptation. Here, we consider phase-reversal training of the vestibulo-ocular reflex (VOR), a simple form of motor learning for which a large body of experimental data is available in wild-type and mutant mice, in which the excitability of granule cells or inhibition of Purkinje cells was affected in a cell-specific fashion. We present novel electrophysiological recordings of Purkinje cell activity measured in naive wild-type mice subjected to this VOR adaptation task. We then introduce a minimal model that consists of learning at the parallel fibers to Purkinje cells with the help of the climbing fibers. Although the minimal model reproduces the behavior of the wild-type animals and is analytically tractable, it fails at reproducing the behavior of mutant mice and the electrophysiology data. Therefore, we build a detailed model involving plasticity at the parallel fibers to Purkinje cells' synapse guided by climbing fibers, feedforward inhibition of Purkinje cells, and plasticity at the mossy fiber to vestibular nuclei neuron synapse. The detailed model reproduces both the behavioral and electrophysiological data of both the wild-type and mutant mice and allows for experimentally testable predictions. Copyright © 2014 the authors 0270-6474/14/347203-13$15.00/0.
A SEU-Hard Flip-Flop for Antifuse FPGAs
NASA Technical Reports Server (NTRS)
Katz, R.; Wang, J. J.; McCollum, J.; Cronquist, B.; Chan, R.; Yu, D.; Kleyner, I.; Day, John H. (Technical Monitor)
2001-01-01
A single event upset (SEU)-hardened flip-flop has been designed and developed for antifuse Field Programmable Gate Array (FPGA) application. Design and application issues, testability, test methods, simulation, and results are discussed.
The changing features of the body-mind problem.
Agassi, Joseph
2007-01-01
The body-mind problem invites scientific study, since mental events are repeated and repeatable and invite testable explanations. They seemed troublesome because of the classical theory of substance that failed to solve its own central problems. These are soluble with the aid of the theory of the laws of nature, particularly in its emergentist version [Bunge, M., 1980. The Body-mind Problem, Pergamon, Oxford] that invites refutable explanations [Popper, K.R., 1959. The Logic of Scientific Discovery, Hutchinson, London]. The view of mental properties as emergent is a modification of the two chief classical views, materialism and dualism. As this view invites testable explanations of events of the inner world, it is better than the quasi-behaviorist view of self-awareness as computer-style self-monitoring [Minsky, M., Laske, O., 1992. A conversation with Marvin Minsky. AI Magazine 13 (3), 31-45].
Advanced Deployable Structural Systems for Small Satellites
NASA Technical Reports Server (NTRS)
Belvin, W. Keith; Straubel, Marco; Wilkie, W. Keats; Zander, Martin E.; Fernandez, Juan M.; Hillebrandt, Martin F.
2016-01-01
One of the key challenges for small satellites is packaging and reliable deployment of structural booms and arrays used for power, communication, and scientific instruments. The lack of reliable and efficient boom and membrane deployment concepts for small satellites is addressed in this work through a collaborative project between NASA and DLR. The paper provides a state of the art overview on existing spacecraft deployable appendages, the special requirements for small satellites, and initial concepts for deployable booms and arrays needed for various small satellite applications. The goal is to enhance deployable boom predictability and ground testability, develop designs that are tolerant of manufacturing imperfections, and incorporate simple and reliable deployment systems.
Computational Psychiatry of ADHD: Neural Gain Impairments across Marrian Levels of Analysis
Hauser, Tobias U.; Fiore, Vincenzo G.; Moutoussis, Michael; Dolan, Raymond J.
2016-01-01
Attention-deficit hyperactivity disorder (ADHD), one of the most common psychiatric disorders, is characterised by unstable response patterns across multiple cognitive domains. However, the neural mechanisms that explain these characteristic features remain unclear. Using a computational multilevel approach, we propose that ADHD is caused by impaired gain modulation in systems that generate this phenotypic increased behavioural variability. Using Marr's three levels of analysis as a heuristic framework, we focus on this variable behaviour, detail how it can be explained algorithmically, and how it might be implemented at a neural level through catecholamine influences on corticostriatal loops. This computational, multilevel, approach to ADHD provides a framework for bridging gaps between descriptions of neuronal activity and behaviour, and provides testable predictions about impaired mechanisms. PMID:26787097
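At the algorithmic level, the gain idea can be illustrated with a softmax choice rule in which neural gain plays the role of the inverse temperature: lower gain flattens the choice distribution and raises trial-to-trial variability, the phenotype targeted here. The sketch below is a generic illustration with invented values, not the authors' model.

# Illustration: response selection as a softmax over action values,
# with "gain" as the inverse temperature. Low gain -> more variable behaviour.
import numpy as np

def choice_probabilities(values, gain):
    z = gain * np.asarray(values, dtype=float)
    z -= z.max()                        # numerical stability
    p = np.exp(z)
    return p / p.sum()

values = [1.0, 0.6, 0.2]
for gain in (4.0, 1.0):                 # high vs. pathologically low gain
    p = choice_probabilities(values, gain)
    entropy = -np.sum(p * np.log(p))    # higher entropy = more variable responding
    print(f"gain={gain}: p={np.round(p, 2)}, response entropy={entropy:.2f}")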
Predicting Adverse Drug Effects from Literature- and Database-Mined Assertions.
La, Mary K; Sedykh, Alexander; Fourches, Denis; Muratov, Eugene; Tropsha, Alexander
2018-06-06
Given that adverse drug effects (ADEs) have led to post-market patient harm and subsequent drug withdrawal, failure of candidate agents in the drug development process, and other negative outcomes, it is essential to attempt to forecast ADEs and other relevant drug-target-effect relationships as early as possible. Current pharmacologic data sources, providing multiple complementary perspectives on the drug-target-effect paradigm, can be integrated to facilitate the inference of relationships between these entities. This study aims to identify both existing and unknown relationships between chemicals (C), protein targets (T), and ADEs (E) based on evidence in the literature. Cheminformatics and data mining approaches were employed to integrate and analyze publicly available clinical pharmacology data and literature assertions interrelating drugs, targets, and ADEs. Based on these assertions, a C-T-E relationship knowledge base was developed. Known pairwise relationships between chemicals, targets, and ADEs were collected from several pharmacological and biomedical data sources. These relationships were curated and integrated according to Swanson's paradigm to form C-T-E triangles, and missing C-E edges were then inferred as candidate C-E relationships. These unreported associations between drugs, targets, and ADEs were prioritized as testable hypotheses. Several C-E inferences, including testosterone → myocardial infarction, were identified from literature sources published prior to confirmatory case reports. Timestamping approaches confirmed the predictive ability of this inference strategy on a larger scale. The presented workflow, based on free-access databases and an association-based inference scheme, provided novel C-E relationships that have been validated post hoc in case reports. With refinement of prioritization schemes for the generated C-E inferences, this workflow may provide an effective computational method for the early detection of potential drug candidate ADEs that can be followed by targeted experimental investigations.
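A minimal sketch of the Swanson-style closure behind such inference follows: a chemical C and effect E are linked through every shared target T, and C-E pairs lacking a reported direct edge become ranked hypotheses. The relations below are invented placeholders, not entries from the curated knowledge base.

# Sketch: infer candidate C-E edges from known C-T and T-E edges.
from collections import defaultdict

c_t = {("drugX", "targetA"), ("drugX", "targetB"), ("drugY", "targetA")}
t_e = {("targetA", "effect1"), ("targetB", "effect2")}
known_c_e = {("drugY", "effect1")}        # already reported directly

support = defaultdict(set)                # (C, E) -> supporting targets
for c, t in c_t:
    for t2, e in t_e:
        if t == t2:
            support[(c, e)].add(t)

inferred = {pair: ts for pair, ts in support.items() if pair not in known_c_e}
# Rank hypotheses by the number of independent supporting targets.
for (c, e), ts in sorted(inferred.items(), key=lambda kv: -len(kv[1])):
    print(f"{c} -> {e}  (via {sorted(ts)})")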
Scientific realism and wishful thinking in soil hydrology
NASA Astrophysics Data System (ADS)
Flühler, H.
2009-04-01
In our field we often learn - or could have learned - more from failures than from successes, provided we had postulated testable hypotheses to be accepted or rejected. In soil hydrology, hypotheses are testable if independent information quantifying the pertinent system features is at hand. This view of how to operate is an idealized concept of how we could or should have worked. In reality, the path to success is more tortuous and we usually progress differently, obeying other professional musts. Although we missed some shortcuts over the past few decades, we definitely made significant progress in understanding vadose zone processes, but we could have advanced our system understanding faster by more rigorously questioning the fundamental assumptions. I will try to illustrate the tortuous path of learning and identify some causes of the slowed-down learning curve. In the pioneering phase of vadose zone research many models have been mapped in our minds and implemented on our computers. Many of them are now well established and powerful, and represent the state of the art even when they do not work. Some of them are based on erroneous or misleading concepts. Even when based on adequate concepts they might have been applied in the wrong context, or inadequate models may have led to apparent success. I address this process of collective learning with the intention that we spend more time and effort finding the right question instead of improving tools that are of questionable suitability for solving the main problems.
A mathematical model of physiological processes and its application to the study of aging
NASA Technical Reports Server (NTRS)
Hibbs, A. R.; Walford, R. L.
1989-01-01
The behavior of a physiological system which, after displacement, returns by homeostatic mechanisms to its original condition can be described by a simple differential equation in which the "recovery time" is a parameter. Two such systems, which influence one another, can be linked mathematically by the use of "coupling" or "feedback" coefficients. These concepts are the basis for many mathematical models of physiological behavior, and we describe the general nature of such models. Next, we introduce the concept of a "fatal limit" for the displacement of a physiological system, and show how measures of such limits can be included in mathematical models. We show how the numerical values of such limits depend on the values of other system parameters, i.e., recovery times and coupling coefficients, and suggest ways of measuring all these parameters experimentally, for example by monitoring changes induced by X-irradiation. Next, we discuss age-related changes in these parameters, and show how the parameters of mortality statistics, such as the famous Gompertz parameters, can be derived from experimentally measurable changes. Concepts of onset-of-aging, critical or fatal limits, equilibrium value (homeostasis), recovery times and coupling constants are involved. Illustrations are given using published data from mouse and rat populations. We believe that this method of deriving survival patterns from a model that is experimentally testable is unique.
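The modeling ingredients above admit a very compact sketch: two homeostatic variables relaxing with recovery times tau, linked by coupling coefficients, with "death" declared when either exceeds a fatal limit. All parameter values below are illustrative, not fits to the mouse or rat data.

# Sketch: coupled homeostatic systems with recovery times and fatal limits.
import numpy as np

tau = np.array([1.0, 2.0])        # recovery times of the two systems
fatal = np.array([5.0, 4.0])      # fatal displacement limits

def simulate(x0, k, dt=0.01, t_max=50.0):
    """Euler-integrate dx/dt = -x/tau + C x with symmetric coupling k;
    return the time a fatal limit is crossed, or None on recovery."""
    C = np.array([[0.0, k], [k, 0.0]])
    x = np.array(x0, dtype=float)
    for step in range(int(t_max / dt)):
        x += dt * (-x / tau + C @ x)
        if np.any(np.abs(x) > fatal):
            return step * dt
    return None

print(simulate([1.0, 0.5], k=0.2))  # weak coupling: homeostatic recovery (None)
print(simulate([1.0, 0.5], k=0.9))  # strong coupling: runaway past a fatal limit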
Dynamical Properties of a Living Nematic
NASA Astrophysics Data System (ADS)
Genkin, Mikhail
Systems made of a large number of interacting particles, or agents, that convert energy stored in the environment into mechanical motion are called active systems, or active matter. Examples of active matter include both living and synthetic systems. The size of the agents varies significantly: bird flocks and fish schools represent macroscopic active systems, while suspensions of living organisms or artificial colloidal particles are examples of microscopic ones. In this work, I studied one of the simplest realizations of active matter, termed living (or active) nematics, which can be created by mixing swimming bacteria and nematic liquid crystal. Using modeling, numerical simulations and experiments, I studied various dynamical properties of active nematics. This work hints at new methods of control and manipulation of active matter. An active nematic exhibits complex spatiotemporal behavior manifested by the formation, proliferation, and annihilation of topological defects. A new computational 2D model coupling nematic liquid crystal and swimming bacteria dynamics has been proposed. We investigated the developed system of partial differential equations analytically and integrated it numerically using a highly efficient parallel GPU code. The integration results are in very good agreement with other theoretical and experimental studies. In addition, our model revealed a number of testable phenomena. The major model prediction (bacteria accumulation in positive and depletion in negative topological defects) was tested by a dedicated experiment. We extended our model to study active nematics in a biphasic state, where nematic and isotropic phases coexist. Typically this coexistence is manifested by the formation of tactoids - isotropic elongated regions surrounded by nematic phase, or nematic regions surrounded by isotropic phase. Using numerical integration, we revealed fundamental properties of such systems. Our main model outcome - spontaneous negative charging of isotropic-nematic interfaces - was confirmed by experiment. The provided modeling and experimental results are in very good qualitative and quantitative agreement. Finally, we studied living nematics experimentally. We worked with swimming bacteria Bacillus subtilis suspended in disodium cromoglycate (DSCG) liquid crystal. Using cylindrical confinement, we were able to observe quantization of the nematic's bending instability. Our experimental results revealed a complex interplay between bacterial self-propulsion and nematic elasticity in cylindrical confinements of different sizes.
Sources, Sinks, and Model Accuracy
Spatial demographic models are a necessary tool for understanding how to manage landscapes sustainably for animal populations. These models, therefore, must offer precise and testable predictions about animal population dynamics and how animal demographic parameters respond to ...
Buckler, Andrew J; Liu, Tiffany Ting; Savig, Erica; Suzek, Baris E; Ouellette, M; Danagoulian, J; Wernsing, G; Rubin, Daniel L; Paik, David
2013-08-01
A widening array of novel imaging biomarkers is being developed using ever more powerful clinical and preclinical imaging modalities. These biomarkers have demonstrated effectiveness in quantifying biological processes as they occur in vivo and in the early prediction of therapeutic outcomes. However, quantitative imaging biomarker data and knowledge are not standardized, representing a critical barrier to accumulating medical knowledge based on quantitative imaging data. We use an ontology to represent, integrate, and harmonize heterogeneous knowledge across the domain of imaging biomarkers. This advances the goal of developing applications to (1) improve precision and recall of storage and retrieval of quantitative imaging-related data using standardized terminology; (2) streamline the discovery and development of novel imaging biomarkers by normalizing knowledge across heterogeneous resources; (3) effectively annotate imaging experiments thus aiding comprehension, re-use, and reproducibility; and (4) provide validation frameworks through rigorous specification as a basis for testable hypotheses and compliance tests. We have developed the Quantitative Imaging Biomarker Ontology (QIBO), which currently consists of 488 terms spanning the following upper classes: experimental subject, biological intervention, imaging agent, imaging instrument, image post-processing algorithm, biological target, indicated biology, and biomarker application. We have demonstrated that QIBO can be used to annotate imaging experiments with standardized terms in the ontology and to generate hypotheses for novel imaging biomarker-disease associations. Our results established the utility of QIBO in enabling integrated analysis of quantitative imaging data.
Gait control in a soft robot by sensing interactions with the environment using self-deformation.
Umedachi, Takuya; Kano, Takeshi; Ishiguro, Akio; Trimmer, Barry A
2016-12-01
All animals use mechanosensors to help them move in complex and changing environments. With few exceptions, these sensors are embedded in soft tissues that deform in normal use such that sensory feedback results from the interaction of an animal with its environment. Useful information about the environment is expected to be embedded in the mechanical responses of the tissues during movements. To explore how such sensory information can be used to control movements, we have developed a soft-bodied crawling robot inspired by a highly tractable animal model, the tobacco hornworm Manduca sexta. This robot uses deformations of its body to detect changes in friction force on a substrate. This information is used to provide local sensory feedback for coupled oscillators that control the robot's locomotion. The validity of the control strategy is demonstrated with both simulation and a highly deformable three-dimensionally printed soft robot. The results show that very simple oscillators are able to generate propagating waves and crawling/inching locomotion through the interplay of deformation in different body parts in a fully decentralized manner. Additionally, we confirmed numerically and experimentally that the gait pattern can switch depending on the surface contact points. These results are expected to help in the design of adaptable, robust locomotion control systems for soft robots and also suggest testable hypotheses about how soft animals use sensory feedback.
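A generic sketch of this decentralized scheme follows: each body segment carries a phase oscillator, nearest-neighbour coupling propagates a wave, and a local feedback term retards a segment's phase according to the friction it senses through its own deformation. The functional forms, gains, and the ring topology are illustrative assumptions, not the robot's actual controller.

# Sketch: coupled phase oscillators with local friction feedback.
import numpy as np

n, omega, k, sigma, dt = 6, 2.0, 0.5, 0.8, 0.01
phi = np.random.uniform(0, 2 * np.pi, n)     # one phase per body segment

def sensed_friction(phi):
    """Placeholder for the force inferred from body deformation: the
    segment 'grips' during half of its cycle."""
    return np.clip(np.sin(phi), 0.0, None)

for _ in range(5000):
    coupling = np.sin(np.roll(phi, 1) - phi) + np.sin(np.roll(phi, -1) - phi)
    feedback = -sigma * sensed_friction(phi) * np.sin(phi)   # local reflex
    phi += dt * (omega + k * coupling + feedback)

print(np.round(np.mod(np.diff(phi), 2 * np.pi), 2))  # settled phase lags = a wave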
Model of the songbird nucleus HVC as a network of central pattern generators
Abarbanel, Henry D. I.
2016-01-01
We propose a functional architecture of the adult songbird nucleus HVC in which the core element is a “functional syllable unit” (FSU). In this model, HVC is organized into FSUs, each of which provides the basis for the production of one syllable in vocalization. Within each FSU, the inhibitory neuron population takes one of two operational states: 1) simultaneous firing wherein all inhibitory neurons fire simultaneously, and 2) competitive firing of the inhibitory neurons. Switching between these basic modes of activity is accomplished via changes in the synaptic strengths among the inhibitory neurons. The inhibitory neurons connect to excitatory projection neurons such that during state 1 the activity of projection neurons is suppressed, while during state 2 patterns of sequential firing of projection neurons can occur. The latter state is stabilized by feedback from the projection to the inhibitory neurons. Song composition for specific species is distinguished by the manner in which different FSUs are functionally connected to each other. Ours is a computational model built with biophysically based neurons. We illustrate that many observations of HVC activity are explained by the dynamics of the proposed population of FSUs, and we identify aspects of the model that are currently testable experimentally. In addition, and standing apart from the core features of an FSU, we propose that the transition between modes may be governed by the biophysical mechanism of neuromodulation. PMID:27535375
Gait control in a soft robot by sensing interactions with the environment using self-deformation
Ishiguro, Akio; Trimmer, Barry A.
2016-01-01
All animals use mechanosensors to help them move in complex and changing environments. With few exceptions, these sensors are embedded in soft tissues that deform in normal use such that sensory feedback results from the interaction of an animal with its environment. Useful information about the environment is expected to be embedded in the mechanical responses of the tissues during movements. To explore how such sensory information can be used to control movements, we have developed a soft-bodied crawling robot inspired by a highly tractable animal model, the tobacco hornworm Manduca sexta. This robot uses deformations of its body to detect changes in friction force on a substrate. This information is used to provide local sensory feedback for coupled oscillators that control the robot's locomotion. The validity of the control strategy is demonstrated with both simulation and a highly deformable three-dimensionally printed soft robot. The results show that very simple oscillators are able to generate propagating waves and crawling/inching locomotion through the interplay of deformation in different body parts in a fully decentralized manner. Additionally, we confirmed numerically and experimentally that the gait pattern can switch depending on the surface contact points. These results are expected to help in the design of adaptable, robust locomotion control systems for soft robots and also suggest testable hypotheses about how soft animals use sensory feedback. PMID:28083114
VirtualPlant: A Software Platform to Support Systems Biology Research
Katari, Manpreet S.; Nowicki, Steve D.; Aceituno, Felipe F.; Nero, Damion; Kelfer, Jonathan; Thompson, Lee Parnell; Cabello, Juan M.; Davidson, Rebecca S.; Goldberg, Arthur P.; Shasha, Dennis E.; Coruzzi, Gloria M.; Gutiérrez, Rodrigo A.
2010-01-01
Data generation is no longer the limiting factor in advancing biological research. In addition, data integration, analysis, and interpretation have become key bottlenecks and challenges that biologists conducting genomic research face daily. To enable biologists to derive testable hypotheses from the increasing amount of genomic data, we have developed the VirtualPlant software platform. VirtualPlant enables scientists to visualize, integrate, and analyze genomic data from a systems biology perspective. VirtualPlant integrates genome-wide data concerning the known and predicted relationships among genes, proteins, and molecules, as well as genome-scale experimental measurements. VirtualPlant also provides visualization techniques that render multivariate information in visual formats that facilitate the extraction of biological concepts. Importantly, VirtualPlant helps biologists who are not trained in computer science to mine lists of genes, microarray experiments, and gene networks to address questions in plant biology, such as: What are the molecular mechanisms by which internal or external perturbations affect processes controlling growth and development? We illustrate the use of VirtualPlant with three case studies, ranging from querying a gene of interest to the identification of gene networks and regulatory hubs that control seed development. Whereas the VirtualPlant software was developed to mine Arabidopsis (Arabidopsis thaliana) genomic data, its data structures, algorithms, and visualization tools are designed in a species-independent way. VirtualPlant is freely available at www.virtualplant.org. PMID:20007449
Hunting for Snarks in Quantum Mechanics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hestenes, David
2009-12-08
A long-standing debate over the interpretation of quantum mechanics has centered on the meaning of Schroedinger's wave function ψ for an electron. Broadly speaking, there are two major opposing schools. On the one side, the Copenhagen school (led by Bohr, Heisenberg and Pauli) holds that ψ provides a complete description of a single electron state; hence the probability interpretation of ψψ* expresses an irreducible uncertainty in electron behavior that is intrinsic in nature. On the other side, the realist school (led by Einstein, de Broglie, Bohm and Jaynes) holds that ψ represents a statistical ensemble of possible electron states; hence it is an incomplete description of a single electron state. I contend that the debaters have overlooked crucial facts about the electron revealed by Dirac theory. In particular, analysis of electron zitterbewegung (first noticed by Schroedinger) opens a window to particle substructure in quantum mechanics that explains the physical significance of the complex phase factor in ψ. This led to a testable model for particle substructure with surprising support by recent experimental evidence. If the explanation is upheld by further research, it will resolve the debate in favor of the realist school. I give details. The perils of research on the foundations of quantum mechanics have been foreseen by Lewis Carroll in The Hunting of the Snark!
Short- and Long-Term Earthquake Forecasts Based on Statistical Models
NASA Astrophysics Data System (ADS)
Console, Rodolfo; Taroni, Matteo; Murru, Maura; Falcone, Giuseppe; Marzocchi, Warner
2017-04-01
The epidemic-type aftershock sequence (ETAS) models have been experimentally used to forecast the space-time earthquake occurrence rate during the sequence that followed the 2009 L'Aquila earthquake and for the 2012 Emilia earthquake sequence. These forecasts represented the first two pioneering attempts to check the feasibility of providing operational earthquake forecasting (OEF) in Italy. After the 2009 L'Aquila earthquake the Italian Department of Civil Protection nominated an International Commission on Earthquake Forecasting (ICEF) for the development of the first official OEF in Italy, which was implemented for testing purposes by the newly established "Centro di Pericolosità Sismica" (CPS, the Seismic Hazard Center) at the Istituto Nazionale di Geofisica e Vulcanologia (INGV). According to the ICEF guidelines, the system is open, transparent, reproducible and testable. The scientific information delivered by OEF-Italy is shaped in different formats according to the interested stakeholders, such as scientists, national and regional authorities, and the general public. The communication to the public is certainly the most challenging issue, and careful pilot tests are necessary to check the effectiveness of the communication strategy before opening the information to the public. With regard to long-term time-dependent earthquake forecasting, the application of a newly developed simulation algorithm to the Calabria region reproduced typical features of the time, space and magnitude behaviour of the seismicity, which can be compared with those of the real observations. These features include long-term pseudo-periodicity and clustering of strong earthquakes, and a realistic earthquake magnitude distribution departing from the Gutenberg-Richter distribution in the moderate and higher magnitude range.
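For reference, the space-time ETAS conditional intensity commonly used in such forecasts has the generic form below; notation varies between implementations, and this is not necessarily the exact parameterization adopted by the CPS system:

\lambda(t, x, y \mid \mathcal{H}_t) = \mu(x, y) + \sum_{i \,:\, t_i < t} \frac{K \, e^{\alpha (m_i - m_0)}}{(t - t_i + c)^{p}} \, f(x - x_i, y - y_i; m_i)

where \mu is the background rate, K, \alpha, c and p govern aftershock productivity and Omori-law decay, m_0 is the magnitude threshold, and f is a normalized spatial kernel.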
Using next generation transcriptome sequencing to predict an ectomycorrhizal metabolome.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Larsen, P. E.; Sreedasyam, A.; Trivedi, G
Mycorrhizae, symbiotic interactions between soil fungi and tree roots, are ubiquitous in terrestrial ecosystems. The fungi contribute phosphorus, nitrogen and nutrients mobilized from organic matter in the soil, and in return the fungus receives photosynthetically derived carbohydrates. This union of plant and fungal metabolisms is the mycorrhizal metabolome. Understanding this symbiotic relationship at a molecular level provides important contributions to the understanding of forest ecosystems and global carbon cycling. We generated next-generation short-read transcriptomic sequencing data from fully formed ectomycorrhizae between Laccaria bicolor and aspen (Populus tremuloides) roots. The transcriptomic data were used to identify statistically significantly expressed gene models using a bootstrap-style approach, and these expressed genes were mapped to specific metabolic pathways. Integration of expressed genes that code for metabolic enzymes with the set of expressed membrane transporters generates a predictive model of the ectomycorrhizal metabolome. The generated model of the mycorrhizal metabolome predicts that the specific compounds glycine, glutamate, and allantoin are synthesized by L. bicolor and that these compounds or their metabolites may be used for the benefit of aspen in exchange for the photosynthetically derived sugars fructose and glucose. The analysis illustrates an approach to generate testable biological hypotheses to investigate the complex molecular interactions that drive ectomycorrhizal symbiosis. These models are consistent with experimental environmental data and provide insight into the molecular exchange processes for organisms in this complex ecosystem. The method used here for predicting metabolomic models of mycorrhizal systems from deep RNA sequencing data can be generalized and is broadly applicable to transcriptomic data derived from complex systems.
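The bootstrap-style expression call can be sketched very simply: resample the mapped-read counts for a gene model and declare it "expressed" when the lower bootstrap bound stays above a background threshold. Counts, threshold, and confidence level below are illustrative assumptions, not the study's pipeline.

# Sketch: bootstrap lower-bound expression call for one gene model.
import numpy as np

rng = np.random.default_rng(0)

def is_expressed(read_counts, background=5.0, n_boot=1000, alpha=0.05):
    counts = np.asarray(read_counts, dtype=float)
    means = [rng.choice(counts, size=counts.size, replace=True).mean()
             for _ in range(n_boot)]
    lower = np.quantile(means, alpha)    # one-sided lower confidence bound
    return lower > background

print(is_expressed([12, 18, 9, 22, 15]))  # True: robustly above background
print(is_expressed([2, 11, 0, 1, 3]))     # False: indistinguishable from noise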
Informed maintenance for next generation space transportation systems
NASA Astrophysics Data System (ADS)
Fox, Jack J.
2001-02-01
Perhaps the most substantial single obstacle to progress in space exploration and the utilization of space for human benefit is the safety and reliability, and the inherent cost, of launching to and returning from space. The primary influence on the high costs of current launch systems (the same is true for commercial and military aircraft and most other reusable systems) is the operations, maintenance and infrastructure portion of the program's total life cycle costs. Reusable Launch Vehicle (RLV) maintenance and design have traditionally been two separate engineering disciplines with often conflicting objectives - maximizing ease of maintenance versus optimizing performance, size and cost. Testability analysis, an element of Informed Maintenance (IM), has been an ad hoc, manual effort, in which maintenance engineers attempt to identify an efficient method of troubleshooting for the given product, with little or no control over product design. Therefore, testability deficiencies in the design cannot be rectified. It is now widely recognized that IM must be engineered into the product at the design stage itself, so that an optimal compromise is achieved between system maintainability and performance. The elements of IM include testability analysis, diagnostics/prognostics, automated maintenance scheduling, automated logistics coordination, paperless documentation and data mining. IM derives its heritage from complementary NASA science, space and aeronautics enterprises, such as the on-board autonomous Remote Agent architecture recently flown on NASA's Deep Space 1 probe, as well as commercial industries that employ quick-turnaround operations. Commercial technologies and processes supporting NASA's IM initiatives include condition-based maintenance technologies from Boeing's commercial 777 aircraft and Lockheed-Martin's F-22 fighter, automotive computer diagnostics and autonomous controllers that enable 100,000-mile maintenance-free operation, and locomotive monitoring system software. This paper summarizes NASA's long-term strategy, development, and implementation plans for Informed Maintenance for next-generation RLVs, converging the work being performed throughout NASA, industry and academia into a single IM vision. Additionally, a current status of IM development throughout NASA programs such as the Space Shuttle, X-33, X-34 and X-37 is provided, concluding with an overview of near-term work initiated in FY00 to support NASA's 2nd Generation Reusable Launch Vehicle Program.
Informed maintenance for next generation reusable launch systems
NASA Astrophysics Data System (ADS)
Fox, Jack J.; Gormley, Thomas J.
2001-03-01
Perhaps the most substantial single obstacle to progress in space exploration and the utilization of space for human benefit is the safety and reliability, and the inherent cost, of launching to and returning from space. The primary influence on the high costs of current launch systems (the same is true for commercial and military aircraft and most other reusable systems) is the operations, maintenance and infrastructure portion of the program's total life cycle costs. Reusable Launch Vehicle (RLV) maintenance and design have traditionally been two separate engineering disciplines with often conflicting objectives - maximizing ease of maintenance versus optimizing performance, size and cost. Testability analysis, an element of Informed Maintenance (IM), has been an ad hoc, manual effort, in which maintenance engineers attempt to identify an efficient method of troubleshooting for the given product, with little or no control over product design. Therefore, testability deficiencies in the design cannot be rectified. It is now widely recognized that IM must be engineered into the product at the design stage itself, so that an optimal compromise is achieved between system maintainability and performance. The elements of IM include testability analysis, diagnostics/prognostics, automated maintenance scheduling, automated logistics coordination, paperless documentation and data mining. IM derives its heritage from complementary NASA science, space and aeronautics enterprises, such as the on-board autonomous Remote Agent architecture recently flown on NASA's Deep Space 1 probe, as well as commercial industries that employ quick-turnaround operations. Commercial technologies and processes supporting NASA's IM initiatives include condition-based maintenance technologies from Boeing's commercial 777 aircraft and Lockheed-Martin's F-22 fighter, automotive computer diagnostics and autonomous controllers that enable 100,000-mile maintenance-free operation, and locomotive monitoring system software. This paper summarizes NASA's long-term strategy, development, and implementation plans for Informed Maintenance for next-generation RLVs, converging the work being performed throughout NASA, industry and academia into a single IM vision. Additionally, a current status of IM development throughout NASA programs such as the Space Shuttle, X-33, X-34 and X-37 is provided, concluding with an overview of near-term work initiated in FY00 to support NASA's 2nd Generation Reusable Launch Vehicle Program.
Work-Centered Technology Development (WTD)
2005-03-01
...theoretical, testable, inductive, and repeatable foundations of science. Theoretical foundations include notions such as statistical versus analytical... Human Factors and Ergonomics Society, 263-267. Eggleston, R. G. (2005). Coursebook: Work-Centered Design (WCD). AFRL/HECS WCD course training.
Writing testable software requirements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Knirk, D.
1997-11-01
This tutorial identifies common problems in analyzing requirements and in constructing a written specification of what the software is to do. It deals with two main problem areas: identifying and describing problem requirements, and analyzing and describing behavior specifications.
Software Tools to Support the Assessment of System Health
NASA Technical Reports Server (NTRS)
Melcher, Kevin J.
2013-01-01
This presentation provides an overview of three software tools that were developed by the NASA Glenn Research Center to support the assessment of system health: the Propulsion Diagnostic Method Evaluation Strategy (ProDIMES), the Systematic Sensor Selection Strategy (S4), and the Extended Testability Analysis (ETA) tool. Originally developed to support specific NASA projects in aeronautics and space, these software tools are currently available to U.S. citizens through the NASA Glenn Software Catalog. The ProDIMES software tool was developed to support a uniform comparison of propulsion gas path diagnostic methods. Methods published in the open literature are typically applied to dissimilar platforms with different levels of complexity. They often address different diagnostic problems and use inconsistent metrics for evaluating performance. As a result, it is difficult to perform a one-to-one comparison of the various diagnostic methods. ProDIMES solves this problem by serving as a theme problem to aid in propulsion gas path diagnostic technology development and evaluation. The overall goal is to provide a tool that will serve as an industry standard and will truly facilitate the development and evaluation of significant Engine Health Management (EHM) capabilities. ProDIMES has been developed under a collaborative project of The Technical Cooperation Program (TTCP) based on feedback provided by individuals within the aircraft engine health management community. The S4 software tool provides a framework that supports the optimal selection of sensors for health management assessments. S4 is structured to accommodate user-defined applications, diagnostic systems, search techniques, and system requirements/constraints. It identifies one or more sensor suites that maximize diagnostic performance while meeting other user-defined system requirements. S4 provides a systematic approach for evaluating combinations of sensors to determine the set or sets of sensors that optimally meet the performance goals and the constraints. It identifies optimal sensor suite solutions by utilizing a merit (i.e., cost) function with one of several available optimization approaches. As part of its analysis, S4 can expose fault conditions that are difficult to diagnose due to an incomplete diagnostic philosophy and/or a lack of sensors. S4 was originally developed and applied to liquid rocket engines. It was subsequently used to study the optimized selection of sensors for a simulation-based aircraft engine diagnostic system. The ETA Tool is a software-based analysis tool that augments the testability analysis and reporting capabilities of a commercial-off-the-shelf (COTS) package. An initial diagnostic assessment is performed by the COTS software using a user-developed, qualitative, directed-graph model of the system being analyzed. The ETA Tool accesses system design information captured within the model and the associated testability analysis output to create a series of six reports for various system engineering needs. These reports are highlighted in the presentation. The ETA Tool was developed by NASA to support the verification of fault management requirements early in the launch vehicle development process. Due to their early development during the design process, the TEAMS-based diagnostic model and the ETA Tool were able to positively influence the system design by highlighting gaps in failure detection, fault isolation, and failure recovery.
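The merit-function search at the heart of a sensor-selection framework like S4 can be sketched with a simple greedy heuristic: repeatedly add the sensor that most improves fault coverage per unit cost until the budget is exhausted. The fault-signature table and costs below are invented for illustration; S4 itself supports several optimization approaches beyond this greedy one.

# Sketch: greedy sensor selection against a merit (coverage-per-cost) function.
def select_sensors(signatures, costs, budget):
    """signatures: {sensor: set of detectable faults}; returns (chosen, covered)."""
    chosen, covered, spent = [], set(), 0.0
    while True:
        def gain(s):
            return len(signatures[s] - covered) / costs[s]
        candidates = [s for s in signatures
                      if s not in chosen and spent + costs[s] <= budget]
        best = max(candidates, key=gain, default=None)
        if best is None or gain(best) == 0:
            return chosen, covered       # no affordable, useful sensor left
        chosen.append(best)
        covered |= signatures[best]
        spent += costs[best]

sigs = {"T1": {"f1", "f2"}, "P1": {"f2", "f3"}, "V1": {"f4"}, "S1": {"f1"}}
costs = {"T1": 1.0, "P1": 1.0, "V1": 2.0, "S1": 0.5}
print(select_sensors(sigs, costs, budget=3.0))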
Pediatric Amblyopia Risk Investigation Study (PARIS).
Savage, Howard I; Lee, Hester H; Zaetta, Deneen; Olszowy, Ronald; Hamburger, Ellie; Weissman, Mark; Frick, Kevin
2005-12-01
To assess the learning curve, testability, and reliability of vision screening modalities administered by pediatric health extenders. Prospective masked clinical trial. Two hundred subjects aged 3 to 6 underwent timed screening for amblyopia by physician extenders, including LEA visual acuity (LEA), stereopsis (RDE), and noncycloplegic autorefraction (NCAR). Patients returned for a comprehensive diagnostic eye examination performed by an ophthalmologist or optometrist. Average screening time was 5.4 ± 1.6 minutes (LEA), 1.9 ± 0.9 minutes (RDE), and 1.7 ± 1.0 minutes (NCAR). Test time for NCAR and RDE fell by 40% during the study period. Overall testability was 92% (LEA), 96% (RDE), and 94% (NCAR). Testability among 3-year-olds was 73% (LEA), 96% (RDE), and 89% (NCAR). Reliability of LEA was moderate (r = .59). Reliability of NCAR was high for astigmatism (CYL) (r = .89), moderate for spherical equivalent (SE) (r = .66), and low for anisometropia (ANISO) (r = .38). Correlation of cycloplegic autorefraction (CAR) with gold-standard cycloplegic retinoscopic refraction (CRR) was very high for SE (.85) and CYL (.77), and moderate for ANISO (.48). With NCAR, physician extenders can quickly and reliably detect astigmatism and spherical refractive error in one-third the time it takes to obtain visual acuity. LEA has a lower initial cost, but is time consuming, moderately reliable, and more difficult for 3-year-olds. Shorter examination time and higher reliability may make NCAR a more efficient screening tool for refractive amblyopia in younger children. Future study is needed to determine the sensitivity and specificity of NCAR and other screening methods in detecting amblyopia and amblyopia risk factors.
Encoding dependence in Bayesian causal networks
USDA-ARS?s Scientific Manuscript database
Bayesian networks (BNs) represent complex, uncertain spatio-temporal dynamics by propagation of conditional probabilities between identifiable states with a testable causal interaction model. Typically, they assume random variables are discrete in time and space with a static network structure that ...
Understanding protein evolution: from protein physics to Darwinian selection.
Zeldovich, Konstantin B; Shakhnovich, Eugene I
2008-01-01
Efforts in whole-genome sequencing and structural proteomics start to provide a global view of the protein universe, the set of existing protein structures and sequences. However, approaches based on the selection of individual sequences have not been entirely successful at the quantitative description of the distribution of structures and sequences in the protein universe because evolutionary pressure acts on the entire organism, rather than on a particular molecule. In parallel to this line of study, studies in population genetics and phenomenological molecular evolution established a mathematical framework to describe the changes in genome sequences in populations of organisms over time. Here, we review both microscopic (physics-based) and macroscopic (organism-level) models of protein-sequence evolution and demonstrate that bridging the two scales provides the most complete description of the protein universe starting from clearly defined, testable, and physiologically relevant assumptions.
Language, music, syntax and the brain.
Patel, Aniruddh D
2003-07-01
The comparative study of music and language is drawing an increasing amount of research interest. Like language, music is a human universal involving perceptually discrete elements organized into hierarchically structured sequences. Music and language can thus serve as foils for each other in the study of brain mechanisms underlying complex sound processing, and comparative research can provide novel insights into the functional and neural architecture of both domains. This review focuses on syntax, using recent neuroimaging data and cognitive theory to propose a specific point of convergence between syntactic processing in language and music. This leads to testable predictions, including the prediction that syntactic comprehension problems in Broca's aphasia are not selective to language but influence music perception as well.
Stochastic recruitment leads to symmetry breaking in foraging populations
NASA Astrophysics Data System (ADS)
Biancalani, Tommaso; Dyson, Louise; McKane, Alan
2014-03-01
When an ant colony is faced with two identical equidistant food sources, the foraging ants are found to concentrate more on one source than the other. Analogous symmetry-breaking behaviours have been reported in various population systems (such as queueing or stock market trading), suggesting the existence of a simple universal mechanism. Past studies have neglected the effect of demographic noise and required rather complicated models to qualitatively reproduce this behaviour. I will show how including the effects of demographic noise leads to a radically different conclusion. The symmetry breaking arises solely due to the process of recruitment and ceases to occur for large population sizes. The latter fact provides a testable prediction for a real system.
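The mechanism can be illustrated with a Kirman-style recruitment model: each ant either spontaneously switches food source (probability eps per update) or copies a randomly met nestmate. Demographic noise pins small colonies near all-or-nothing states, while large colonies hover around the symmetric split. Parameters below are illustrative, not fits to any ant data.

# Sketch: recruitment plus demographic noise; symmetry breaks only for small N.
import numpy as np

rng = np.random.default_rng(1)

def fraction_at_source1(n_ants, eps=0.01, steps=200_000):
    x = n_ants // 2                          # ants currently at source 1
    for _ in range(steps):
        focal_at_1 = rng.random() < x / n_ants
        if rng.random() < eps:               # spontaneous switch
            x += -1 if focal_at_1 else 1
        else:                                # recruitment by a random nestmate
            other_at_1 = rng.random() < x / n_ants
            if other_at_1 and not focal_at_1:
                x += 1
            elif focal_at_1 and not other_at_1:
                x -= 1
        x = min(max(x, 0), n_ants)
    return x / n_ants

print(fraction_at_source1(20))    # typically near 0 or 1: broken symmetry
print(fraction_at_source1(2000))  # near 0.5: symmetry restored at large N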
Panarchy: theory and application
Allen, Craig R.; Angeler, David G.; Garmestani, Ahjond S.; Gunderson, Lance H.; Holling, Crawford S.
2014-01-01
The concept of panarchy provides a framework that characterizes complex systems of people and nature as dynamically organized and structured within and across scales of space and time. It has been more than a decade since the introduction of panarchy. Over this period, its invocation in peer-reviewed literature has been steadily increasing, but its use remains primarily descriptive and abstract. Here, we discuss the use of the concept in the literature to date, highlight where the concept may be useful, and discuss limitations to the broader applicability of panarchy theory for research in the ecological and social sciences. Finally, we forward a set of testable hypotheses to evaluate key propositions that follow from panarchy theory.
The polyadenylation code: a unified model for the regulation of mRNA alternative polyadenylation*
Davis, Ryan; Shi, Yongsheng
2014-01-01
The majority of eukaryotic genes produce multiple mRNA isoforms with distinct 3′ ends through a process called mRNA alternative polyadenylation (APA). Recent studies have demonstrated that APA is dynamically regulated during development and in response to environmental stimuli. A number of mechanisms have been described for APA regulation. In this review, we attempt to integrate all the known mechanisms into a unified model. This model not only explains most of previous results, but also provides testable predictions that will improve our understanding of the mechanistic details of APA regulation. Finally, we briefly discuss the known and putative functions of APA regulation. PMID:24793760
Modules, theories, or islands of expertise? Domain specificity in socialization.
Gelman, Susan A
2010-01-01
The domain-specific approach to socialization processes presented by J. E. Grusec and M. Davidov (this issue) provides a compelling framework for integrating and interpreting a large and disparate body of research findings, and it generates a wealth of testable new hypotheses. At the same time, it introduces core theoretical questions regarding the nature of social interactions, from the perspective of both children and their caregivers. This commentary draws on the literature regarding domain specificity in cognitive development, applauds what is innovative and exciting about applying a domain-specific approach to socialization processes, and points to questions for future research. Foremost among these is what is meant by "domain specificity."
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bray, O.H.
This paper describes a natural language based, semantic information modeling methodology and explores its use and value in clarifying and comparing political science theories and frameworks. As an example, the paper uses this methodology to clarify and compare some of the basic concepts and relationships in the realist (e.g. Waltz) and the liberal (e.g. Rosenau) paradigms for international relations. The methodology can provide three types of benefits: (1) it can clarify and make explicit exactly what is meant by a concept; (2) it can often identify unanticipated implications and consequences of concepts and relationships; and (3) it can help in identifying and operationalizing testable hypotheses.
Neutrino mass, dark matter, and Baryon asymmetry via TeV-scale physics without fine-tuning.
Aoki, Mayumi; Kanemura, Shinya; Seto, Osamu
2009-02-06
We propose an extended version of the standard model, in which neutrino oscillation, dark matter, and the baryon asymmetry of the Universe can be simultaneously explained by the TeV-scale physics without assuming a large hierarchy among the mass scales. Tiny neutrino masses are generated at the three-loop level due to the exact Z2 symmetry, by which the stability of the dark matter candidate is guaranteed. The extra Higgs doublet is required not only for the tiny neutrino masses but also for successful electroweak baryogenesis. The model provides discriminative predictions especially in Higgs phenomenology, so that it is testable at current and future collider experiments.
Evolution beyond neo-Darwinism: a new conceptual framework.
Noble, Denis
2015-01-01
Experimental results in epigenetics and related fields of biological research show that the Modern Synthesis (neo-Darwinist) theory of evolution requires either extension or replacement. This article examines the conceptual framework of neo-Darwinism, including the concepts of 'gene', 'selfish', 'code', 'program', 'blueprint', 'book of life', 'replicator' and 'vehicle'. This form of representation is a barrier to extending or replacing existing theory as it confuses conceptual and empirical matters. These need to be clearly distinguished. In the case of the central concept of 'gene', the definition has moved all the way from describing a necessary cause (defined in terms of the inheritable phenotype itself) to an empirically testable hypothesis (in terms of causation by DNA sequences). Neo-Darwinism also privileges 'genes' in causation, whereas in multi-way networks of interactions there can be no privileged cause. An alternative conceptual framework is proposed that avoids these problems, and which is more favourable to an integrated systems view of evolution. © 2015. Published by The Company of Biologists Ltd.
Certification trails and software design for testability
NASA Technical Reports Server (NTRS)
Sullivan, Gregory F.; Wilson, Dwight S.; Masson, Gerald M.
1993-01-01
Design techniques which may be applied to make program testing easier were investigated. Methods for modifying a program to generate additional data, which we refer to as a certification trail, are presented. This additional data is designed to allow the program output to be checked more quickly and effectively. Certification trails were described primarily from a theoretical perspective. A comprehensive attempt to assess experimentally the performance and overall value of the certification trail method is reported. The method was applied to nine fundamental, well-known algorithms for the following problems: convex hull, sorting, Huffman tree, shortest path, closest pair, line segment intersection, longest increasing subsequence, skyline, and Voronoi diagram. Run-time performance data for each of these problems is given, and selected problems are described in more detail. Our results indicate that there are many cases in which certification trails allow for significantly faster overall program execution time than a 2-version programming approach, and also give further evidence of the breadth of applicability of this method.
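A toy instance of the certification-trail idea for the sorting case follows: alongside its output, the sort emits the permutation it applied (the trail), and a separate checker verifies the answer in linear time rather than re-running a second full implementation. This is a generic illustration of the technique, not the paper's instrumented code.

# Sketch: sorting with a certification trail and an independent checker.
def sort_with_trail(xs):
    trail = sorted(range(len(xs)), key=lambda i: xs[i])  # permutation indices
    return [xs[i] for i in trail], trail

def check(xs, output, trail):
    n = len(xs)
    if len(output) != n or len(trail) != n:
        return False
    seen = [False] * n
    for i in trail:
        if not 0 <= i < n or seen[i]:
            return False                  # trail is not a permutation
        seen[i] = True
    if any(output[j] != xs[i] for j, i in enumerate(trail)):
        return False                      # output disagrees with the trail
    return all(a <= b for a, b in zip(output, output[1:]))  # output is ordered

data = [5, 3, 8, 1]
out, trail = sort_with_trail(data)
assert check(data, out, trail)            # O(n) verification of the result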
Inconclusive quantum measurements and decisions under uncertainty
NASA Astrophysics Data System (ADS)
Yukalov, Vyacheslav; Sornette, Didier
2016-04-01
We give a mathematical definition for the notion of inconclusive quantum measurements. In physics, such measurements occur at intermediate stages of a complex measurement procedure, with the final measurement result being operationally testable. Since the mathematical structure of Quantum Decision Theory has been developed in analogy with the theory of quantum measurements, the inconclusive quantum measurements correspond, in Quantum Decision Theory, to intermediate stages of decision making in the process of taking decisions under uncertainty. The general form of the quantum probability for a composite event is the sum of a utility factor, describing a rational evaluation of the considered prospect, and of an attraction factor, characterizing irrational, subconscious attitudes of the decision maker. Despite the irrationality involved, the probability of prospects can be evaluated. This is equivalent to the possibility of calculating quantum probabilities without specifying hidden variables. We formulate a general way of evaluation, based on the use of non-informative priors. As an example, we suggest the explanation of the decoy effect. Our quantitative predictions are in very good agreement with experimental data.
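In symbols, the decomposition described above reads (a sketch of the general form; the normalization conditions follow from the abstract, while the non-informative magnitude 1/4 is the "quarter law" used in the authors' related work):

p(\pi_j) = f(\pi_j) + q(\pi_j), \qquad \sum_j f(\pi_j) = 1, \qquad \sum_j q(\pi_j) = 0, \qquad 0 \le p(\pi_j) \le 1,

where f is the utility factor (rational evaluation of prospect \pi_j) and q the attraction factor; a non-informative prior assigns the typical magnitude |q(\pi_j)| = 1/4.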
Self-organization in the limb: a Turing mechanism for digit development.
Cooper, Kimberly L
2015-06-01
The statistician George E. P. Box stated, 'Essentially all models are wrong, but some are useful.' (Box GEP, Draper NR: Empirical Model-Building and Response Surfaces. Wiley; 1987). Modeling biological processes is challenging for many of the reasons classically trained developmental biologists often resist the idea that black and white equations can explain the grayscale subtleties of living things. Although a simplified mathematical model of development will undoubtedly fall short of precision, a good model is exceedingly useful if it raises at least as many testable questions as it answers. Self-organizing Turing models that simulate the pattern of digits in the hand replicate events that have not yet been explained by classical approaches. The union of theory and experimentation has recently identified and validated the minimal components of a Turing network for digit pattern and triggered a cascade of questions that will undoubtedly be well-served by the continued merging of disciplines. Copyright © 2015 Elsevier Ltd. All rights reserved.
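For readers who want to see self-organization emerge, the sketch below simulates a generic one-dimensional activator-inhibitor (Turing) system; it is a toy of my own, not the specific digit-patterning network validated in the work this review discusses, and the parameters are illustrative only.

```python
import numpy as np

# Minimal 1D activator-inhibitor (Gierer-Meinhardt-style) Turing sketch.
# The inhibitor must diffuse much faster than the activator for the
# uniform state to destabilize into a periodic pattern.
n, dt, steps = 200, 0.01, 20000
Da, Dh = 0.02, 1.0
a = 1.0 + 0.01 * np.random.randn(n)   # activator, near uniform with noise
h = 1.0 + 0.01 * np.random.randn(n)   # inhibitor

def lap(u):  # 1D Laplacian with periodic boundaries
    return np.roll(u, 1) + np.roll(u, -1) - 2 * u

for _ in range(steps):
    da = a * a / h - a + Da * lap(a)
    dh = a * a - h + Dh * lap(h)
    a += dt * da
    h += dt * dh

peaks = np.sum((a[1:-1] > a[:-2]) & (a[1:-1] > a[2:]))
print(f"periodic pattern with ~{peaks} activator peaks")
```

With a fast-diffusing inhibitor, near-uniform noise sharpens into a periodic array of activator peaks, a one-dimensional caricature of digit condensations.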
Autism as the Low-Fitness Extreme of a Parentally Selected Fitness Indicator.
Shaner, Andrew; Miller, Geoffrey; Mintz, Jim
2008-12-01
Siblings compete for parental care and feeding, while parents must allocate scarce resources to those offspring most likely to survive and reproduce. This could cause offspring to evolve traits that advertise health, and thereby attract parental resources. For example, experimental evidence suggests that bright orange filaments covering the heads of North American coot chicks may have evolved for this fitness-advertising purpose. Could any human mental disorders be the equivalent of dull filaments in coot chicks: low-fitness extremes of mental abilities that evolved as fitness indicators? One possibility is autism. Suppose that the ability of very young children to charm their parents evolved as a parentally selected fitness indicator. Young children would vary greatly in their ability to charm parents, that variation would correlate with underlying fitness, and autism could be the low-fitness extreme of this variation. This view explains many seemingly disparate facts about autism and leads to some surprising and testable predictions.
Yukawa unification in an SO(10) SUSY GUT: SUSY on the edge
NASA Astrophysics Data System (ADS)
Poh, Zijie; Raby, Stuart
2015-07-01
In this paper we analyze Yukawa unification in a three-family SO(10) SUSY GUT. We perform a global χ2 analysis and show that supersymmetry (SUSY) effects do not decouple even though the universal scalar mass parameter at the grand unified theory (GUT) scale, m16, is found to lie between 15 and 30 TeV, with the best fit given for m16 ≈ 25 TeV. Note that SUSY effects do not decouple, since stops and bottoms have mass of order 5 TeV due to renormalization group running from MGUT. The model has many testable predictions. Gauginos are the lightest sparticles and the light Higgs boson is very much standard model-like. The model is consistent with flavor and CP observables, with BR(μ → eγ) close to the experimental upper bound. With such a large value of m16 we clearly cannot be considered "natural" SUSY, nor are we "split" SUSY. We are thus in the region in between, or "SUSY on the edge."
Effects of flow on the dynamics of a ferromagnetic nematic liquid crystal
NASA Astrophysics Data System (ADS)
Potisk, Tilen; Pleiner, Harald; Svenšek, Daniel; Brand, Helmut R.
2018-04-01
We investigate the effects of flow on the dynamics of ferromagnetic nematic liquid crystals. As a model, we study the coupled dynamics of the magnetization M, the director field n associated with the liquid crystalline orientational order, and the velocity field v. We evaluate how simple shear flow in a ferromagnetic nematic is modified in the presence of small external magnetic fields, and we make experimentally testable predictions for the resulting effective shear viscosity: an increase by a factor of 2 in a magnetic field of about 20 mT. Flow alignment, a characteristic feature of classical uniaxial nematic liquid crystals, is analyzed for ferromagnetic nematics for the two cases of magnetization in or perpendicular to the shear plane. In the former case, we find that small in-plane magnetic fields are sufficient to suppress tumbling, and thus that the boundary between flow alignment and tumbling can be controlled easily. In the latter case, we furthermore find a possibility of flow alignment in a regime for which one obtains tumbling for the pure nematic component. We derive the analogs of the three Miesowicz viscosities well known from usual nematic liquid crystals, corresponding to nine different configurations. Combinations of these can be used to determine several dynamic coefficients experimentally.
A genome-wide longitudinal transcriptome analysis of the aging model Podospora anserina.
Philipp, Oliver; Hamann, Andrea; Servos, Jörg; Werner, Alexandra; Koch, Ina; Osiewacz, Heinz D
2013-01-01
Aging of biological systems is controlled by various processes which have a potential impact on gene expression. Here we report a genome-wide transcriptome analysis of the fungal aging model Podospora anserina. Total RNA of three individuals of defined age was pooled and analyzed by SuperSAGE (serial analysis of gene expression). A bioinformatics analysis identified different molecular pathways to be affected during aging. While the abundance of transcripts linked to ribosomes and to the proteasome quality control system was found to decrease during aging, the abundance of those associated with autophagy increases, suggesting that autophagy may act as a compensatory quality control pathway. Transcript profiles associated with energy metabolism, including mitochondrial functions, were found to fluctuate during aging. Comparison of wild-type transcripts, which are continuously down-regulated during aging, with those down-regulated in the long-lived, copper-uptake mutant grisea validated the relevance of age-related changes in cellular copper metabolism. Overall, we (i) present a unique age-related data set of a longitudinal study of the experimental aging model P. anserina, which represents a reference resource for future investigations in a variety of organisms, (ii) suggest autophagy to be a key quality control pathway that becomes active once other pathways fail, and (iii) present testable predictions for subsequent experimental investigations.
On the importance of scientific rhetoric in stuttering: a reply to Finn, Bothe, and Bramlett (2005).
Kalinowski, Joseph; Saltuklaroglu, Tim; Stuart, Andrew; Guntupalli, Vijaya K
2007-02-01
To refute the alleged practice of "pseudoscience" by P. Finn, A. K. Bothe, and R. E. Bramlett (2005) and to illustrate their experimental and systematic bias when evaluating the SpeechEasy, an altered auditory feedback device used in the management of stuttering. We challenged the experimental design that led to the seemingly predetermined outcome of pseudoscience rather than science: Limited preselected literature was submitted to a purposely sampled panel of judges (i.e., their own students). Each criterion deemed pseudoscientific was contested with published peer-reviewed data illustrating the importance of good rhetoric, testability, and logical outcomes from decades of scientific research. Stuttering is an involuntary disorder that is highly resistant to therapy. Altered auditory feedback is a derivation of choral speech (nature's most powerful stuttering "inhibitor") that can be synergistically combined with other methods for optimal stuttering inhibition. This approach is logical considering that in stuttering no single treatment is universally helpful. Also, caution is suggested when attempting to differentiate science from pseudoscience in stuttering treatments using the criteria employed by Finn et al. For example, evaluating behavioral therapy outcomes implements a post hoc or untestable system. Speech outcome (i.e., stuttered or fluent speech) determines success or failure of technique use, placing responsibility for failure on those who stutter.
Propagating Cell-Membrane Waves Driven by Curved Activators of Actin Polymerization
Peleg, Barak; Disanza, Andrea; Scita, Giorgio; Gov, Nir
2011-01-01
Cells exhibit propagating membrane waves which involve the actin cytoskeleton. One type of such membrane waves is the circular dorsal ruffle (CDR), which is related to endocytosis and receptor internalization. Experimentally, CDRs have been associated with membrane-bound activators of actin polymerization of concave shape. We present experimental evidence for the localization of convex membrane proteins in these structures, and their insensitivity to inhibition of myosin II contractility in immortalized mouse embryo fibroblast cell cultures. These observations lead us to propose a theoretical model which explains the formation of these waves due to the interplay between complexes that contain activators of actin polymerization and membrane-bound curved proteins of both types of curvature (concave and convex). Our model predicts that the activity of both types of curved proteins is essential for sustaining propagating waves, which are abolished when one type of curved activator is removed. Within this model, waves are initiated when the level of actin polymerization induced by the curved activators is higher than some threshold value, which allows the cell to control CDR formation. We demonstrate that the model can explain many features of CDRs, and give several testable predictions. This work demonstrates the importance of curved membrane proteins in organizing the actin cytoskeleton and cell shape. PMID:21533032
A simple mechanistic explanation for original antigenic sin and its alleviation by adjuvants.
Ndifon, Wilfred
2015-11-06
A large number of published studies have shown that adaptive immunity to a particular antigen, including pathogen-derived, can be boosted by another, cross-reacting antigen while inducing suboptimal immunity to the latter. Although this phenomenon, called original antigenic sin (OAS), was first reported approximately 70 years ago (Francis et al. 1947 Am. J. Public Health 37, 1013-1016 (doi:10.2105/AJPH.37.8.1013)), its underlying biological mechanisms are still inadequately understood (Kim et al. Proc. Natl Acad. Sci. USA 109, 13 751-13 756 (doi:10.1073/pnas.0912458109)). Here, focusing on the humoral aspects of adaptive immunity, I propose a simple and testable mechanism: that OAS occurs when T regulatory cells induced by the first antigen decrease the dose of the second antigen that is loaded by dendritic cells and available to activate naive lymphocytes. I use both a parsimonious mathematical model and experimental data to confirm the deductive validity of this proposal. This model also explains the puzzling experimental observation that administering certain dendritic cell-activating adjuvants during antigen exposure alleviates OAS. Specifically, the model predicts that such adjuvants will attenuate T regulatory suppression of naive lymphocyte activation. Together, these results suggest additional strategies for redeeming adaptive immunity from the destructive consequences of antigenic 'sin'. © 2015 The Author(s).
Interrogating selectivity in catalysis using molecular vibrations
NASA Astrophysics Data System (ADS)
Milo, Anat; Bess, Elizabeth N.; Sigman, Matthew S.
2014-03-01
The delineation of molecular properties that underlie reactivity and selectivity is at the core of physical organic chemistry, and this knowledge can be used to inform the design of improved synthetic methods or identify new chemical transformations. For this reason, the mathematical representation of properties affecting reactivity and selectivity trends, that is, molecular parameters, is paramount. Correlations produced by equating these molecular parameters with experimental outcomes are often defined as free-energy relationships and can be used to evaluate the origin of selectivity and to generate new, experimentally testable hypotheses. The premise behind successful correlations of this type is that a systematically perturbed molecular property affects a transition-state interaction between the catalyst, substrate and any reaction components involved in the determination of selectivity. Classic physical organic molecular descriptors, such as Hammett, Taft or Charton parameters, seek to independently probe isolated electronic or steric effects. However, these parameters cannot address simultaneous, non-additive variations to more than one molecular property, which limits their utility. Here we report a parameter system based on the vibrational response of a molecule to infrared radiation that can be used to mathematically model and predict selectivity trends for reactions with interlinked steric and electronic effects at positions of interest. The disclosed parameter system is mechanistically derived and should find broad use in the study of chemical and biological systems.
How and why does the immunological synapse form? Physical chemistry meets cell biology.
Chakraborty, Arup K
2002-03-05
During T lymphocyte (T cell) recognition of an antigen, a highly organized and specific pattern of membrane proteins forms in the junction between the T cell and the antigen-presenting cell (APC). This specialized cell-cell junction is called the immunological synapse. It is several micrometers across and forms over many minutes. A plethora of experiments are being performed to study the mechanisms that underlie synapse formation and the way in which information transfer occurs across the synapse. The wealth of experimental data that is beginning to emerge must be understood within a mechanistic framework if it is to prove useful in developing modalities to control the immune response. Quantitative models can complement experiments in the quest for such a mechanistic understanding by suggesting experimentally testable hypotheses. Here, a quantitative synapse assembly model is described. The model uses concepts developed in physical chemistry and cell biology and is able to predict the spatiotemporal evolution of cell shape and receptor protein patterns observed during synapse formation. Attention is directed to how the juxtaposition of model predictions and experimental data has led to intriguing hypotheses regarding the role of null and self peptides during synapse assembly, as well as correlations between T cell effector functions and the robustness of synapse assembly. We remark on some ways in which synergistic experiments and modeling studies can improve current models, and we take steps toward a better understanding of information transfer across the T cell-APC junction.
Automated Testability Decision Tool
1991-09-01
ERIC Educational Resources Information Center
Niaz, Mansoor
1991-01-01
Discusses differences between the epistemic and the psychological subject, the relationship between the epistemic subject and the ideal gas law, the development of general cognitive operations, and the empirical testability of Piaget's epistemic subject. (PR)
Small Town in Mass Society Revisited.
ERIC Educational Resources Information Center
Young, Frank W.
1996-01-01
A 1958 New York community study dramatized the thesis that macro forces (urbanization, industrialization, bureaucratization) have undermined all small communities' autonomy. Such "oppositional case studies" succeed when they render the dominant view immediately obsolete, have plausible origins, are testable, and generate new research.…
Smith, Leah M; Lévesque, Linda E; Kaufman, Jay S; Strumpf, Erin C
2017-06-01
The regression discontinuity design (RDD) is a quasi-experimental approach used to avoid confounding bias in the assessment of new policies and interventions. It is applied specifically in situations where individuals are assigned to a policy/intervention based on whether they are above or below a pre-specified cut-off on a continuously measured variable, such as birth date, income or weight. The strength of the design is that, provided individuals do not manipulate the value of this variable, assignment to the policy/intervention is considered as good as random for individuals close to the cut-off. Despite its popularity in fields like economics, the RDD remains relatively unknown in epidemiology where its application could be tremendously useful. In this paper, we provide a practical introduction to the RDD for health researchers, describe four empirically testable assumptions of the design and offer strategies that can be used to assess whether these assumptions are met in a given study. For illustrative purposes, we implement these strategies to assess whether the RDD is appropriate for a study of the impact of human papillomavirus vaccination on cervical dysplasia. We found that, whereas the assumptions of the RDD were generally satisfied in our study context, birth timing had the potential to confound our effect estimate in an unexpected way and therefore needed to be taken into account in the analysis. Our findings underscore the importance of assessing the validity of the assumptions of this design, testing them when possible and making adjustments as necessary to support valid causal inference. © The Author 2016. Published by Oxford University Press on behalf of the International Epidemiological Association
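For illustration, a minimal sharp-RDD estimate can be computed as below (simulated data and invented parameters, not the vaccination study's data): fit a local linear model with separate slopes on each side of the cutoff and read the treatment effect off the jump.

```python
import numpy as np

# Sharp regression discontinuity sketch on simulated data.
# Units with running variable x >= cutoff receive the intervention.
rng = np.random.default_rng(0)
cutoff, effect, bandwidth = 0.0, 2.0, 0.5

x = rng.uniform(-1, 1, 2000)            # e.g. birth date centered on the eligibility cutoff
treated = (x >= cutoff).astype(float)
y = 1.0 + 0.8 * x + effect * treated + rng.normal(0, 1, x.size)

m = np.abs(x - cutoff) <= bandwidth     # local window around the cutoff
xc = x[m] - cutoff
# Local linear model with separate slopes on each side of the cutoff:
X = np.column_stack([np.ones(m.sum()), treated[m], xc, treated[m] * xc])
beta, *_ = np.linalg.lstsq(X, y[m], rcond=None)
print(f"estimated jump at the cutoff: {beta[1]:.2f} (true {effect})")
```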
A Multilayer Network Approach for Guiding Drug Repositioning in Neglected Diseases.
Berenstein, Ariel José; Magariños, María Paula; Chernomoretz, Ariel; Agüero, Fernán
2016-01-01
Drug development for neglected diseases has been historically hampered due to lack of market incentives. The advent of public domain resources containing chemical information from high throughput screenings is changing the landscape of drug discovery for these diseases. In this work we took advantage of data from extensively studied organisms like human, mouse, E. coli and yeast, among others, to develop a novel integrative network model to prioritize and identify candidate drug targets in neglected pathogen proteomes, and bioactive drug-like molecules. We modeled genomic (proteins) and chemical (bioactive compounds) data as a multilayer weighted network graph that takes advantage of bioactivity data across 221 species, chemical similarities between 1.7 × 10⁵ compounds and several functional relations among 1.67 × 10⁵ proteins. These relations comprised orthology, sharing of protein domains, and shared participation in defined biochemical pathways. We showcase the application of this network graph to the problem of prioritization of new candidate targets, based on the information available in the graph for known compound-target associations. We validated this strategy by performing a cross validation procedure for known mouse and Trypanosoma cruzi targets and showed that our approach outperforms classic alignment-based approaches. Moreover, our model provides additional flexibility as two different network definitions could be considered, finding in both cases qualitatively different but sensible candidate targets. We also showcase the application of the network to suggest targets for orphan compounds that are active against Plasmodium falciparum in high-throughput screens. In this case our approach provided a reduced prioritization list of target proteins for the query molecules and showed the ability to propose new testable hypotheses for each compound. Moreover, we found that some predictions highlighted by our network model were supported by independent experimental validations as found post-facto in the literature.
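A toy rendering of the prioritization step may help (my sketch, drastically smaller than the paper's 221-species graph): bioactivity evidence for a query compound propagates through chemical-similarity edges, known compound-target links, and protein-protein relations to score candidate targets.

```python
import numpy as np

# Toy multilayer-style prioritization: one chemical layer, one protein layer,
# coupled by known compound-target associations. Scores spread by one step
# of weighted propagation from a query compound to candidate targets.
compounds = ["c0", "c1", "c2"]
proteins = ["p0", "p1", "p2", "p3"]

chem_sim = np.array([[1.0, 0.8, 0.1],        # compound-compound similarity
                     [0.8, 1.0, 0.2],
                     [0.1, 0.2, 1.0]])
assoc = np.array([[0, 1, 0, 0],              # known compound -> target links
                  [0, 0, 1, 0],
                  [0, 0, 0, 0]], dtype=float)
prot_sim = np.array([[1.0, 0.3, 0.0, 0.6],   # protein-protein relations
                     [0.3, 1.0, 0.4, 0.0],   # (orthology, shared domains, pathways)
                     [0.0, 0.4, 1.0, 0.2],
                     [0.6, 0.0, 0.2, 1.0]])

query = np.array([1.0, 0.0, 0.0])            # orphan active compound c0
target_scores = query @ chem_sim @ assoc @ prot_sim
for p, s in sorted(zip(proteins, target_scores), key=lambda t: -t[1]):
    print(p, round(s, 3))
```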
Cognitive Scientists Prefer Theories and Testable Principles with Teeth
ERIC Educational Resources Information Center
Graesser, Arthur C.
2009-01-01
Alexander, Schallert, and Reynolds (2009/this issue) proposed a definition and landscape of learning that included 9 principles and 4 dimensions ("what," "who," "where," "when"). This commentary reflects on the utility of this definition and 4-dimensional landscape from the standpoint of educational…
A systems framework for identifying candidate microbial assemblages for disease management
USDA-ARS?s Scientific Manuscript database
Network models of soil and plant microbiomes present new opportunities for enhancing disease management, but also challenges for interpretation. We present a framework for interpreting microbiome networks, illustrating how the observed structure of networks can be used to generate testable hypothese...
Perceptual Decision-Making as Probabilistic Inference by Neural Sampling.
Haefner, Ralf M; Berkes, Pietro; Fiser, József
2016-05-04
We address two main challenges facing systems neuroscience today: understanding the nature and function of cortical feedback between sensory areas and of correlated variability. Starting from the old idea of perception as probabilistic inference, we show how to use knowledge of the psychophysical task to make testable predictions for the influence of feedback signals on early sensory representations. Applying our framework to a two-alternative forced choice task paradigm, we can explain multiple empirical findings that have been hard to account for by the traditional feedforward model of sensory processing, including the task dependence of neural response correlations and the diverging time courses of choice probabilities and psychophysical kernels. Our model makes new predictions and characterizes a component of correlated variability that represents task-related information rather than performance-degrading noise. It demonstrates a normative way to integrate sensory and cognitive components into physiologically testable models of perceptual decision-making. Copyright © 2016 Elsevier Inc. All rights reserved.
Segmental folding of chromosomes: a basis for structural and regulatory chromosomal neighborhoods?
Nora, Elphège P; Dekker, Job; Heard, Edith
2013-09-01
We discuss here a series of testable hypotheses concerning the role of chromosome folding into topologically associating domains (TADs). Several lines of evidence suggest that segmental packaging of chromosomal neighborhoods may underlie features of chromatin that span large domains, such as heterochromatin blocks, association with the nuclear lamina and replication timing. By defining which DNA elements preferentially contact each other, the segmentation of chromosomes into TADs may also underlie many properties of long-range transcriptional regulation. Several observations suggest that TADs can indeed provide a structural basis to regulatory landscapes, by controlling enhancer sharing and allocation. We also discuss how TADs may shape the evolution of chromosomes, by causing maintenance of synteny over large chromosomal segments. Finally we suggest a series of experiments to challenge these ideas and provide concrete examples illustrating how they could be practically applied. © 2013 The Authors. Bioessays published by WILEY Periodicals, Inc.
Różycki, Bartosz; Cazade, Pierre-André; O'Mahony, Shane; Thompson, Damien; Cieplak, Marek
2017-08-16
Cellulosomes are large multi-protein catalysts produced by various anaerobic microorganisms to efficiently degrade plant cell-wall polysaccharides down into simple sugars. X-ray and physicochemical structural characterisations show that cellulosomes are composed of numerous protein domains that are connected by unstructured polypeptide segments, yet the properties and possible roles of these 'linker' peptides are largely unknown. We have performed coarse-grained and all-atom molecular dynamics computer simulations of a number of cellulosomal linkers of different lengths and compositions. Our data demonstrate that the effective stiffness of the linker peptides, as quantified by the equilibrium fluctuations in the end-to-end distances, depends primarily on the length of the linker and less so on the specific amino acid sequence. The presence of excluded volume, provided by the domains that are connected, dampens the motion of the linker residues and reduces the effective stiffness of the linkers. Simultaneously, the presence of the linkers alters the conformations of the protein domains that are connected. We demonstrate that short, stiff linkers induce significant rearrangements in the folded domains of the mini-cellulosome composed of endoglucanase Cel8A in complex with scaffoldin ScafT (Cel8A-ScafT) of Clostridium thermocellum, as well as in a two-cohesin system derived from the scaffoldin ScaB of Acetivibrio cellulolyticus. We give experimentally testable predictions on structural changes in protein domains that depend on the length of the linkers.
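The stiffness measure used here is simple to reproduce from any trajectory; the sketch below uses random-walk stand-in coordinates rather than the authors' simulation data.

```python
import numpy as np

# Effective linker stiffness from equilibrium fluctuations of the
# end-to-end distance: stiffer linkers show smaller relative fluctuations.
rng = np.random.default_rng(1)
# traj: (n_frames, n_residues, 3) linker coordinates; random-walk stand-in here.
traj = np.cumsum(rng.normal(0, 1, (5000, 20, 3)), axis=1)

r_ee = np.linalg.norm(traj[:, -1, :] - traj[:, 0, :], axis=1)
mean, std = r_ee.mean(), r_ee.std()
print(f"<R_ee> = {mean:.2f}, fluctuation sigma = {std:.2f}, "
      f"relative stiffness proxy = {mean / std:.2f}")
```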
Rajeev, Lara; Luning, Eric G; Dehal, Paramvir S; Price, Morgan N; Arkin, Adam P; Mukhopadhyay, Aindrila
2011-10-12
Two-component regulatory systems are the primary form of signal transduction in bacteria. Although genomic binding sites have been determined for several eukaryotic and bacterial transcription factors, comprehensive identification of gene targets of two-component response regulators remains challenging due to the lack of knowledge of the signals required for their activation. We focused our study on Desulfovibrio vulgaris Hildenborough, a sulfate-reducing bacterium that encodes unusually diverse and largely uncharacterized two-component signal transduction systems. We report the first systematic mapping of the genes regulated by all transcriptionally acting response regulators in a single bacterium. Our results enabled functional predictions for several response regulators and include key processes of carbon, nitrogen and energy metabolism, cell motility and biofilm formation, and responses to stresses such as nitrite, low potassium and phosphate starvation. Our study also led to the prediction of new genes and regulatory networks, which found corroboration in a compendium of transcriptome data available for D. vulgaris. For several regulators we predicted and experimentally verified the binding site motifs, most of which were discovered as part of this study. The gene targets identified for the response regulators allowed strong functional predictions to be made for the corresponding two-component systems. By tracking the D. vulgaris regulators and their motifs outside the Desulfovibrio spp. we provide testable hypotheses regarding the functions of orthologous regulators in other organisms. The in vitro array-based method optimized here is generally applicable for the study of such systems in all organisms.
Shear-induced aggregation dynamics in a polymer microrod suspension
NASA Astrophysics Data System (ADS)
Kumar, Pramukta S.
A non-Brownian suspension of micron scale rods is found to exhibit reversible shear-driven formation of disordered aggregates resulting in dramatic viscosity enhancement at low shear rates. Aggregate formation is imaged at low magnification using a combined rheometer and fluorescence microscope system. The size and structure of these aggregates are found to depend on shear rate and concentration, with larger aggregates present at lower shear rates and higher concentrations. Quantitative measurements of the early-stage aggregation process are modeled by a collision-driven growth of porous structures which shows that the aggregate density increases with shear rate. A Krieger-Dougherty type constitutive relation and steady-state viscosity measurements are used to estimate the intrinsic viscosity of complex structures developed under shear. Higher magnification images are collected and used to validate the aggregate size versus density relationship, as well as to obtain particle flow fields via PIV. The flow fields provide a tantalizing view of fluctuations involved in the aggregation process. Interaction strength is estimated via contact force measurements and JKR theory and found to be extremely strong in comparison to shear forces present in the system, estimated using hydrodynamic arguments. All of the results are then combined to produce a consistent conceptual model of aggregation in the system that features testable consequences. These results represent a direct, quantitative, experimental study of aggregation and viscosity enhancement in a rod suspension, and demonstrate a strategy for inferring inaccessible microscopic geometric properties of a dynamic system through the combination of quantitative imaging and rheology.
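The constitutive relation referred to is, in its standard Krieger-Dougherty form (the thesis may use a variant; the symbols below are the conventional ones):

```latex
% Krieger-Dougherty relation: suspension viscosity \eta versus particle
% volume fraction \phi, with solvent viscosity \eta_0, maximum packing
% fraction \phi_m and intrinsic viscosity [\eta].
\eta(\phi) = \eta_0 \left( 1 - \frac{\phi}{\phi_m} \right)^{-[\eta]\,\phi_m}
```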
Reinhold, William C
2015-12-10
There is currently a split within the cancer research community between traditional molecular biological hypothesis-driven and the more recent "omic" forms of research. While the molecular biological approach employs the tried and true single alteration-single response formulations of experimentation, the omic employs broad-based assay or sample collection approaches that generate large volumes of data. How to integrate the benefits of these two approaches in an efficient and productive fashion remains an outstanding issue. Ideally, one would merge the understandability, exactness, simplicity, and testability of the molecular biological approach with the larger amounts of data, simultaneous consideration of multiple alterations, consideration of genes of known interest along with the novel, cross-sample comparisons among cell lines and patient samples, and consideration of directed questions while simultaneously gaining exposure to the novel provided by the omic approach. While at the current time integration of the two disciplines remains problematic, attempts to do so are ongoing, and will be necessary for the understanding of the large cell line screens including the Developmental Therapeutics Program's NCI-60, the Broad Institute's Cancer Cell Line Encyclopedia, and the Wellcome Trust Sanger Institute's Cancer Genome Project, as well as The Cancer Genome Atlas clinical samples project. Going forward there is significant benefit to be had from the integration of the molecular biological and the omic forms of research, with the desired goal being improved translational understanding and application.
Kerr, Robert R.; Grayden, David B.; Thomas, Doreen A.; Gilson, Matthieu; Burkitt, Anthony N.
2014-01-01
A fundamental goal of neuroscience is to understand how cognitive processes, such as operant conditioning, are performed by the brain. Typical and well studied examples of operant conditioning, in which the firing rates of individual cortical neurons in monkeys are increased using rewards, provide an opportunity for insight into this. Studies of reward-modulated spike-timing-dependent plasticity (RSTDP), and of other models such as R-max, have reproduced this learning behavior, but they have assumed that no unsupervised learning is present (i.e., no learning occurs without, or independent of, rewards). We show that these models cannot elicit firing rate reinforcement while exhibiting both reward learning and ongoing, stable unsupervised learning. To fix this issue, we propose a new RSTDP model of synaptic plasticity based upon the observed effects that dopamine has on long-term potentiation and depression (LTP and LTD). We show, both analytically and through simulations, that our new model can exhibit unsupervised learning and lead to firing rate reinforcement. This requires that the strengthening of LTP by the reward signal is greater than the strengthening of LTD and that the reinforced neuron exhibits irregular firing. We show the robustness of our findings to spike-timing correlations, to the synaptic weight dependence that is assumed, and to changes in the mean reward. We also consider our model in the differential reinforcement of two nearby neurons. Our model aligns more strongly with experimental studies than previous models and makes testable predictions for future experiments. PMID:24475240
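The key asymmetry can be caricatured in a few lines (a schematic pair-based update of my own, not the paper's full model): reward multiplies LTP more strongly than LTD, so plasticity is balanced without reward but net-potentiating with it.

```python
import math

# Schematic reward-modulated STDP weight change for one pre/post spike pair.
# dt = t_post - t_pre; reward scales LTP more strongly than LTD, as the
# model requires for firing-rate reinforcement.
A_PLUS, A_MINUS, TAU = 1.0, 1.0, 20.0   # balanced unsupervised STDP
K_PLUS, K_MINUS = 0.5, 0.1              # reward boosts: K_PLUS > K_MINUS

def dw(dt_ms, reward):
    if dt_ms > 0:    # pre before post -> LTP, strongly boosted by reward
        return (1 + K_PLUS * reward) * A_PLUS * math.exp(-dt_ms / TAU)
    else:            # post before pre -> LTD, weakly boosted by reward
        return -(1 + K_MINUS * reward) * A_MINUS * math.exp(dt_ms / TAU)

print(dw(+10, reward=0.0) + dw(-10, reward=0.0))  # balanced without reward
print(dw(+10, reward=1.0) + dw(-10, reward=1.0))  # net potentiation with reward
```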
Fabina, Nicholas S; Putnam, Hollie M; Franklin, Erik C; Stat, Michael; Gates, Ruth D
2013-11-01
Climate change-driven stressors threaten the persistence of coral reefs worldwide. Symbiotic relationships between scleractinian corals and photosynthetic endosymbionts (genus Symbiodinium) are the foundation of reef ecosystems, and these associations are differentially impacted by stress. Here, we couple empirical data from the coral reefs of Moorea, French Polynesia, and a network theoretic modeling approach to evaluate how patterns in coral-Symbiodinium associations influence community stability under climate change. To introduce the effect of climate perturbations, we simulate local 'extinctions' that represent either the loss of coral species or the ability to engage in symbiotic interactions. Community stability is measured by determining the duration and number of species that persist through the simulated extinctions. Our results suggest that four factors greatly increase coral-Symbiodinium community stability in response to global changes: (i) the survival of generalist hosts and symbionts maximizes potential symbiotic unions; (ii) elevated symbiont diversity provides redundant or complementary symbiotic functions; (iii) compatible symbiotic assemblages create the potential for local recolonization; and (iv) the persistence of certain traits associated with symbiotic diversity and redundancy. Symbiodinium may facilitate coral persistence through novel environmental regimes, but this capacity is mediated by symbiotic specificity, association patterns, and the functional performance of the symbionts. Our model-based approach identifies general trends and testable hypotheses in coral-Symbiodinium community responses. Future studies should consider similar methods when community size and/or environmental complexity preclude experimental approaches. © 2013 John Wiley & Sons Ltd.
Towards a theory of individual differences in statistical learning
Bogaerts, Louisa; Christiansen, Morten H.; Frost, Ram
2017-01-01
In recent years, statistical learning (SL) research has seen a growing interest in tracking individual performance in SL tasks, mainly as a predictor of linguistic abilities. We review studies from this line of research and outline three presuppositions underlying the experimental approach they employ: (i) that SL is a unified theoretical construct; (ii) that current SL tasks are interchangeable, and equally valid for assessing SL ability; and (iii) that performance in the standard forced-choice test in the task is a good proxy of SL ability. We argue that these three critical presuppositions are subject to a number of theoretical and empirical issues. First, SL shows patterns of modality- and informational-specificity, suggesting that SL cannot be treated as a unified construct. Second, different SL tasks may tap into separate sub-components of SL that are not necessarily interchangeable. Third, the commonly used forced-choice tests in most SL tasks are subject to inherent limitations and confounds. As a first step, we offer a methodological approach that explicitly spells out a potential set of different SL dimensions, allowing for better transparency in choosing a specific SL task as a predictor of a given linguistic outcome. We then offer possible methodological solutions for better tracking and measuring SL ability. Taken together, these discussions provide a novel theoretical and methodological approach for assessing individual differences in SL, with clear testable predictions. This article is part of the themed issue ‘New frontiers for statistical learning in the cognitive sciences’. PMID:27872377
Electrical test prediction using hybrid metrology and machine learning
NASA Astrophysics Data System (ADS)
Breton, Mary; Chao, Robin; Muthinti, Gangadhara Raja; de la Peña, Abraham A.; Simon, Jacques; Cepler, Aron J.; Sendelbach, Matthew; Gaudiello, John; Emans, Susan; Shifrin, Michael; Etzioni, Yoav; Urenski, Ronen; Lee, Wei Ti
2017-03-01
Electrical test measurement in the back-end of line (BEOL) is crucial for wafer and die sorting as well as comparing intended process splits. Any in-line, nondestructive technique in the process flow to accurately predict these measurements can significantly improve mean-time-to-detect (MTTD) of defects and improve cycle times for yield and process learning. Measuring after BEOL metallization is commonly done for process control and learning, particularly with scatterometry (also called OCD (Optical Critical Dimension)), which can solve for multiple profile parameters such as metal line height or sidewall angle and does so within patterned regions. This gives scatterometry an advantage over inline microscopy-based techniques, which provide top-down information, since such techniques can be insensitive to sidewall variations hidden under the metal fill of the trench. But when faced with correlation to electrical test measurements that are specific to the BEOL processing, both techniques face the additional challenge of sampling. Microscopy-based techniques are sampling-limited by their small probe size, while scatterometry is traditionally limited (for microprocessors) to scribe targets that mimic device ground rules but are not necessarily designed to be electrically testable. A solution to this sampling challenge lies in a fast reference-based machine learning capability that allows for OCD measurement directly of the electrically testable structures, even when they are not OCD-compatible. By incorporating such direct OCD measurements, correlation to, and therefore prediction of, resistance of BEOL electrical test structures is significantly improved. Improvements in prediction capability for multiple types of in-die electrically testable device structures are demonstrated. To further improve the quality of the prediction of the electrical resistance measurements, hybrid metrology using the OCD measurements as well as X-ray metrology (XRF) is used. Hybrid metrology is the practice of combining information from multiple sources in order to enable or improve the measurement of one or more critical parameters. Here, the XRF measurements are used to detect subtle changes in barrier layer composition and thickness that can have second-order effects on the electrical resistance of the test structures. By accounting for such effects with the aid of the X-ray-based measurements, further improvement in the OCD correlation to electrical test measurements is achieved. Using both types of solution, namely fast reference-based machine learning on non-OCD-compatible test structures and hybrid metrology combining OCD with XRF technology, improvement in BEOL cycle time learning could be accomplished through improved prediction capability.
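In spirit, the hybridization amounts to augmenting the OCD feature vector with XRF-derived barrier features before regression; the sketch below uses synthetic numbers, and none of the feature names or magnitudes come from the paper.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

# Synthetic stand-in: resistance depends on OCD profile parameters
# (line height, sidewall angle) plus a second-order barrier-layer term
# that only the XRF features capture.
rng = np.random.default_rng(2)
n = 400
ocd = rng.normal(0, 1, (n, 2))                 # height, sidewall angle
xrf = rng.normal(0, 1, (n, 1))                 # barrier thickness/composition
r = 3.0 - 1.2 * ocd[:, 0] + 0.5 * ocd[:, 1] + 0.4 * xrf[:, 0] + rng.normal(0, 0.2, n)

for name, X in [("OCD only", ocd), ("OCD + XRF hybrid", np.hstack([ocd, xrf]))]:
    score = cross_val_score(Ridge(), X, r, cv=5, scoring="r2").mean()
    print(f"{name}: cross-validated R^2 = {score:.3f}")
```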
JACOB: an enterprise framework for computational chemistry.
Waller, Mark P; Dresselhaus, Thomas; Yang, Jack
2013-06-15
Here, we present just a collection of beans (JACOB): an integrated batch-based framework designed for the rapid development of computational chemistry applications. The framework expedites developer productivity by handling the generic infrastructure tier, and can be easily extended by user-specific scientific code. Paradigms from enterprise software engineering were rigorously applied to create a scalable, testable, secure, and robust framework. A centralized web application is used to configure and control the operation of the framework. The application-programming interface provides a set of generic tools for processing large-scale noninteractive jobs (e.g., systematic studies), or for coordinating systems integration (e.g., complex workflows). The code for the JACOB framework is open sourced and is available at: www.wallerlab.org/jacob. Copyright © 2013 Wiley Periodicals, Inc.
Initial eccentricity fluctuations and their relation to higher-order flow harmonics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lacey, R.; Wei,R.; Jia,J.
2011-06-01
Monte Carlo simulations are used to compute the centrality dependence of the participant eccentricities (ε_n) in Au+Au collisions for the two primary models currently employed for eccentricity estimates - the Glauber and the factorized Kharzeev-Levin-Nardi (fKLN) models. They suggest specific testable predictions for the magnitude and centrality dependence of the flow coefficients v_n, respectively measured relative to the event planes Ψ_n. They also indicate that the ratios of several of these coefficients may provide an additional constraint for distinguishing between the models. Such a constraint could be important for a more precise determination of the specific viscosity of the matter produced in heavy ion collisions.
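For reference, the participant eccentricity harmonics are commonly defined event by event as below (one common convention; the r^n versus r^2 weighting varies across the literature):

```latex
% Participant eccentricity harmonics from the transverse positions
% (r, \phi) of participant nucleons, event by event:
\varepsilon_n =
\frac{\sqrt{\langle r^n \cos n\phi \rangle^2
          + \langle r^n \sin n\phi \rangle^2}}
     {\langle r^n \rangle}
```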
Search for the Footprints of New Physics with Laboratory and Cosmic Neutrinos
NASA Technical Reports Server (NTRS)
Stecker, Floyd W.
2017-01-01
Observations of high energy neutrinos, both in the laboratory and from cosmic sources, can be a useful probe in searching for new physics. Such observations can provide sensitive tests of Lorentz invariance violation (LIV), which may be the result of quantum gravity physics (QG). We review some observationally testable consequences of LIV using the effective field theory (EFT) formalism. To do this, one can postulate the existence of additional small LIV terms in free particle Lagrangians, suppressed by powers of the Planck mass. The observational consequences of such terms are then examined. In particular, one can place limits on a class of non-renormalizable, mass dimension five and six Lorentz invariance violating operators that may be the result of QG.
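Schematically, the mass dimension five and six operators modify the free-particle dispersion relation as below (a generic EFT form; sign and coefficient conventions differ between papers):

```latex
% Planck-suppressed modification of the dispersion relation from
% dimension-(4+n) Lorentz-violating operators (n = 1, 2):
E^2 \simeq p^2 + m^2 \pm \xi_n \, \frac{p^{\,n+2}}{M_{\mathrm{Pl}}^{\,n}}
```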
Metabolic network flux analysis for engineering plant systems.
Shachar-Hill, Yair
2013-04-01
Metabolic network flux analysis (NFA) tools have proven themselves to be powerful aids to metabolic engineering of microbes by providing quantitative insights into the flows of material and energy through cellular systems. The development and application of NFA tools to plant systems has advanced in recent years and are yielding significant insights and testable predictions. Plants present substantial opportunities for the practical application of NFA but they also pose serious challenges related to the complexity of plant metabolic networks and to deficiencies in our knowledge of their structure and regulation. By considering the tools available and selected examples, this article attempts to assess where and how NFA is most likely to have a real impact on plant biotechnology. Copyright © 2013 Elsevier Ltd. All rights reserved.
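The core computation in constraint-based NFA is a small linear program over steady-state fluxes; here is a toy flux-balance sketch with an invented three-reaction network (stoichiometry and bounds are illustrative only).

```python
import numpy as np
from scipy.optimize import linprog

# Toy flux balance analysis: maximize the "biomass" flux v2 subject to
# steady state S @ v = 0 and capacity bounds on each reaction.
# Metabolite rows: A, B; reactions: v0 uptake ->A, v1 A->B, v2 B->biomass.
S = np.array([[ 1, -1,  0],
              [ 0,  1, -1]])
bounds = [(0, 10), (0, 8), (0, None)]        # uptake capped at 10, v1 at 8
c = np.array([0, 0, -1.0])                   # linprog minimizes, so negate v2

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
print("optimal fluxes:", res.x)              # expect [8, 8, 8]: v1 is limiting
```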
Shevtsova, Natalia A; Talpalar, Adolfo E; Markin, Sergey N; Harris-Warrick, Ronald M; Kiehn, Ole; Rybak, Ilya A
2015-01-01
Different locomotor gaits in mammals, such as walking or galloping, are produced by coordinated activity in neuronal circuits in the spinal cord. Coordination of neuronal activity between left and right sides of the cord is provided by commissural interneurons (CINs), whose axons cross the midline. In this study, we construct and analyse two computational models of spinal locomotor circuits consisting of left and right rhythm generators interacting bilaterally via several neuronal pathways mediated by different CINs. The CIN populations incorporated in the models include the genetically identified inhibitory (V0D) and excitatory (V0V) subtypes of V0 CINs and excitatory V3 CINs. The model also includes the ipsilaterally projecting excitatory V2a interneurons mediating excitatory drive to the V0V CINs. The proposed network architectures and CIN connectivity allow the models to closely reproduce and suggest mechanistic explanations for several experimental observations. These phenomena include: different speed-dependent contributions of V0D and V0V CINs and V2a interneurons to left–right alternation of neural activity, switching gaits between the left–right alternating walking-like activity and the left–right synchronous hopping-like pattern in mutants lacking specific neuron classes, and speed-dependent asymmetric changes of flexor and extensor phase durations. The models provide insights into the architecture of the spinal network and the organization of parallel inhibitory and excitatory CIN pathways and suggest explanations for how these pathways maintain alternating and synchronous gaits at different locomotor speeds. The models propose testable predictions about the neural organization and operation of mammalian locomotor circuits. Key points: Coordination of neuronal activity between left and right sides of the mammalian spinal cord is provided by several sets of commissural interneurons (CINs) whose axons cross the midline. Genetically identified inhibitory V0D and excitatory V0V CINs and ipsilaterally projecting excitatory V2a interneurons were shown to secure left–right alternation at different locomotor speeds. We have developed computational models of neuronal circuits in the spinal cord that include left and right rhythm-generating centres interacting bilaterally via three parallel pathways mediated by V0D, V2a–V0V and V3 neuron populations. The models reproduce the experimentally observed speed-dependent left–right coordination in normal mice and the changes in coordination seen in mutants lacking specific neuron classes. The models propose an explanation for several experimental results and provide insights into the organization of the spinal locomotor network and parallel CIN pathways involved in gait control at different locomotor speeds. PMID:25820677
Beabout, Kathryn; McCurry, Megan D; Mehta, Heer; Shah, Akshay A; Pulukuri, Kiran Kumar; Rigol, Stephan; Wang, Yanping; Nicolaou, K C; Shamoo, Yousif
2017-11-10
The continuing rise of multidrug resistant pathogens has made it clear that in the absence of new antibiotics we are moving toward a "postantibiotic" world, in which even routine infections will become increasingly untreatable. There is a clear need for the development of new antibiotics with truly novel mechanisms of action to combat multidrug resistant pathogens. Experimental evolution to resistance can be a useful tactic for the characterization of the biochemical mechanism of action for antibiotics of interest. Herein, we demonstrate that the use of a diverse panel of strains with well-annotated reference genomes improves the success of using experimental evolution to characterize the mechanism of action of a novel pyrrolizidinone antibiotic analog. Importantly, we used experimental evolution under conditions that favor strongly polymorphic populations to adapt a panel of three substantially different Gram-positive species (lab strain Bacillus subtilis and clinical strains methicillin-resistant Staphylococcus aureus MRSA131 and Enterococcus faecalis S613) to produce a sufficiently diverse set of evolutionary outcomes. Comparative whole genome sequencing (WGS) between the susceptible starting strain and the resistant strains was then used to identify the genetic changes within each species in response to the pyrrolizidinone. Taken together, the adaptive response across a range of organisms allowed us to develop a readily testable hypothesis for the mechanism of action of the CJ-16,264 analog. In conjunction with mitochondrial inhibition studies, we were able to elucidate that this novel pyrrolizidinone antibiotic is an electron transport chain (ETC) inhibitor. By studying evolution to resistance in a panel of different species of bacteria, we have developed an enhanced method for the characterization of new lead compounds for the discovery of new mechanisms of action.
The CRISP theory of hippocampal function in episodic memory
Cheng, Sen
2013-01-01
Over the past four decades, a "standard framework" has emerged to explain the neural mechanisms of episodic memory storage. This framework has been instrumental in driving hippocampal research forward and now dominates the design and interpretation of experimental and theoretical studies. It postulates that cortical inputs drive plasticity in the recurrent cornu ammonis 3 (CA3) synapses to rapidly imprint memories as attractor states in CA3. Here we review a range of experimental studies and argue that the evidence against the standard framework is mounting, notwithstanding the considerable evidence in its support. We propose CRISP as an alternative theory to the standard framework. CRISP is based on Context Reset by dentate gyrus (DG), Intrinsic Sequences in CA3, and Pattern completion in cornu ammonis 1 (CA1). Compared to previous models, CRISP uses a radically different mechanism for storing episodic memories in the hippocampus. Neural sequences are intrinsic to CA3, and inputs are mapped onto these intrinsic sequences through synaptic plasticity in the feedforward projections of the hippocampus. Hence, CRISP does not require plasticity in the recurrent CA3 synapses during the storage process. As in other theories, DG and CA1 play supporting roles; however, their functions in CRISP have distinct implications. For instance, CA1 performs pattern completion in the absence of CA3, and DG contributes to episodic memory retrieval, increasing the speed, precision, and robustness of retrieval. We propose the conceptual theory, discuss its implications for experimental results and suggest testable predictions. It appears that CRISP not only accounts for those experimental results that are consistent with the standard framework, but also for results that are at odds with the standard framework. We therefore suggest that CRISP is a viable, and perhaps superior, theory for the hippocampal function in episodic memory. PMID:23653597
Final Report of DOE Grant No. DE-FG02-04ER41306
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nandi, Satyanarayan; Babu, Kaladi S; Rizatdinova, Flera
2013-12-10
Project: Theoretical and Experimental Research in Weak, Electromagnetic and Strong Interactions. Investigators: S. Nandi, K.S. Babu, F. Rizatdinova. Institution: Oklahoma State University, Stillwater, OK 74078. This completed project focused on cutting-edge research in theoretical and experimental high energy physics. In theoretical high energy physics, the two investigators (Nandi and Babu) worked on a variety of topics in model-building and phenomenological aspects of elementary particle physics. This includes unification of particles and forces, neutrino physics, Higgs boson physics, proton decay, supersymmetry, and collider physics. Novel physics ideas beyond the Standard Model with testable consequences at the LHC have been proposed. These ideas have stimulated the experimental community to look for new signals. The contributions of the experimental high energy physics group have been at the D0 experiment at the Fermilab Tevatron and the ATLAS experiment at the Large Hadron Collider. At the D0 experiment, the main focus was the search for the Higgs boson in the WH channel, where improved limits were obtained. At the LHC, the OSU group has made significant contributions to top quark physics and the calibration of b-tagging algorithms. The group is also involved in the pixel detector upgrade. This DOE-supported grant has resulted in 5 PhD degrees during the past three years. Three postdoctoral fellows were supported as well. In theoretical research, over 40 refereed publications have resulted in the past three years, with several involving graduate students and postdoctoral fellows. It also resulted in over 30 conference presentations in the same time period. We are also involved in outreach activities through the QuarkNet program, where we engage Oklahoma school teachers and students in our research.
ERIC Educational Resources Information Center
Barth, Lorna
2007-01-01
By changing the venue from festival to a required academic exposition, the traditional science fair was transformed into a "Science Expo" wherein students were guided away from cookbook experiments toward developing a question about their environment into a testable and measurable experiment. The revamped "Science Expo" became a night for students…
Leveraging Rigorous Local Evaluations to Understand Contradictory Findings
ERIC Educational Resources Information Center
Boulay, Beth; Martin, Carlos; Zief, Susan; Granger, Robert
2013-01-01
Contradictory findings from "well-implemented" rigorous evaluations invite researchers to identify the differences that might explain the contradictions, helping to generate testable hypotheses for new research. This panel will examine efforts to ensure that the large number of local evaluations being conducted as part of four…
Changing Perspectives on Basic Research in Adult Learning and Memory
ERIC Educational Resources Information Center
Hultsch, David F.
1977-01-01
It is argued that whether the course of cognitive development is characterized by growth, stability, or decline is largely a matter of the metamodel on which the theories and data are based. Such metamodels are representations of reality that are not empirically testable. (Author)
Adolescent Pregnancy and Its Delay.
ERIC Educational Resources Information Center
Bell, Lloyd H.
This paper examines some probable reasons for the black adolescent male's contribution to increased pregnancy in the black community. Using a situation analysis, it presents the following testable suppositions: (1) black males' fear of retribution for impregnating a girl has diminished, leading to increased sexual intercourse and ultimately to…
The Process of Mentoring Pregnant Adolescents: An Exploratory Study.
ERIC Educational Resources Information Center
Blinn-Pike, Lynn; Kuschel, Diane; McDaniel, Annette; Mingus, Suzanne; Mutti, Megan Poole
1998-01-01
The process that occurs in relationships between volunteer adult mentors and pregnant adolescent "mentees" is described empirically; testable hypotheses based on findings concerning the mentor role are proposed. Case records from 20 mentors are analyzed; findings regarding mentors' roles are discussed. Criteria for conceptualizing quasi-parenting…
Software reliability through fault-avoidance and fault-tolerance
NASA Technical Reports Server (NTRS)
Vouk, Mladen A.; Mcallister, David F.
1992-01-01
Accomplishments in the following research areas are summarized: structure based testing, reliability growth, and design testability with risk evaluation; reliability growth models and software risk management; and evaluation of consensus voting, consensus recovery block, and acceptance voting. Four papers generated during the reporting period are included as appendices.
Mentoring: A Typology of Costs for Higher Education Faculty
ERIC Educational Resources Information Center
Lunsford, Laura G.; Baker, Vicki; Griffin, Kimberly A.; Johnson, W. Brad
2013-01-01
In this theoretical paper, we apply a social exchange framework to understand mentors' negative experiences. We propose a typology of costs, categorized according to psychosocial and career mentoring functions. Our typology generates testable research propositions. Psychosocial costs of mentoring are burnout, anger, and grief or loss. Career…
Instructional Design: Science, Technology, Both, Neither
ERIC Educational Resources Information Center
Gropper, George L.
2017-01-01
What would it take for instructional design to qualify as a bona fide applied discipline? First and foremost, a fundamental requirement is a testable and tested theoretical base. Untested rationales until verified remain in limbo. Secondly, the discipline's applied prescriptions must be demonstrably traceable to the theoretical base once it is…
ERIC Educational Resources Information Center
Tweney, Ryan D.
Drawing parallels with critical thinking and creative thinking, this document describes some ways that scientific thinking is utilized. Cognitive approaches to scientific thinking are discussed, and it is argued that all science involves an attempt to construct a testable mental model of some aspect of reality. The role of mental models is…
ERIC Educational Resources Information Center
Wallace, Robert B.
1994-01-01
Health survey research assesses the health of individuals in populations. Measures include prevalence/incidence of diseases, signs/symptoms, functional states, and health services utilization. Although assessing individual biologic robustness can be problematic, testable approaches do exist. Characteristics of health of populations/communities, not…
Equilibration: Developing the Hard Core of the Piagetian Research Program.
ERIC Educational Resources Information Center
Rowell, J.A.
1983-01-01
Argues that the status of the concept of equilibration is clarified by considering Piagetian theory as a research program in the sense elaborated in 1974 by Lakatos. A pilot study was made to examine the precision and testability of equilibration in Piaget's 1977 model. (Author/RH)
NASA Astrophysics Data System (ADS)
Sabater, Bartolomé; Marín, Dolores
2018-03-01
The minimum rate principle is applied to the chemical reaction in a steady-state open cell system where, under constant supply of the glucose precursor, reference to time or to glucose consumption does not affect the conclusions.
Tracking the "Lizardman": Writing Rotten to Write Well.
ERIC Educational Resources Information Center
Polette, Keith
1995-01-01
Suggests that students can improve their writing by being instructed on how to write badly. Applies the criteria of testability, tunnel-vision, excessive vagueness, flying in the face of established fact, and hazy authority to tabloid newspaper stories. Discusses how students can write their own "rotten" tabloid stories by taking these…
Researching the Study Abroad Experience
ERIC Educational Resources Information Center
McLeod, Mark; Wainwright, Philip
2009-01-01
The authors propose a paradigm for rigorous scientific assessment of study abroad programs, with the focus being on how study abroad experiences affect psychological constructs as opposed to looking solely at study-abroad-related outcomes. Social learning theory is used as a possible theoretical basis for making testable hypotheses and guiding…
Toward a Testable Developmental Model of Pedophilia: The Development of Erotic Age Preference.
ERIC Educational Resources Information Center
Freund, Kurt; Kuban, Michael
1993-01-01
Analysis of retrospective self-reports about childhood curiosity to see persons in the nude, with heterosexual and homosexual pedophiles, gynephiles, and androphiles, suggests that establishment of erotic sex preference preceded that of age preference, and a greater proportion of pedophiles than gynephiles or androphiles remembered childhood…
NASA Astrophysics Data System (ADS)
Colomb, Warren; Sarkar, Susanta K.
2015-06-01
We would like to thank all the commentators for their constructive comments on our paper. The commentators agree that a proper analysis of noisy single-molecule data is important for extracting meaningful and accurate information about the system. We concur with their views; indeed, motivating an accurate analysis of experimental data is precisely the point of our paper. After a model of the system of interest is constructed based on the experimental single-molecule data, it is very helpful to simulate the model to generate theoretical single-molecule data and to analyze them in exactly the same way. In our experience, such a self-consistent approach involving experiments, simulations, and analyses often forces us to revise our model and make experimentally testable predictions. In light of comments from the commentators with different expertise, we would also like to point out that a single model should be able to connect different experimental techniques, because the underlying science does not depend on the experimental techniques used. Wohland [1] has made a strong case for fluorescence correlation spectroscopy (FCS) as an important experimental technique to bridge single-molecule and ensemble experiments. FCS is a very powerful technique that can measure ensemble parameters with single-molecule sensitivity. Therefore, it is logical to simulate any proposed model, predict both single-molecule data and FCS data, and confirm against experimental data. Fitting the diffraction-limited point spread function (PSF) of an isolated fluorescent marker to localize a labeled biomolecule is a critical step in many single-molecule tracking experiments. Flyvbjerg et al. [2] have rigorously pointed out some important drawbacks of the prevalent practice of fitting the diffraction-limited PSF with a 2D Gaussian. As we try to achieve more accurate and precise localization of biomolecules, we need to consider the subtle points mentioned by Flyvbjerg et al. Shepherd [3] has mentioned specific examples of PSFs that have been used for localization and has rightly noted the importance of detector noise in single-molecule localization. Meroz [4] has pointed out more clearly that the signal itself can be noisy and that it is necessary to distinguish the noise of interest from the background noise. Krapf [5] has pointed out different origins of fluctuations in biomolecular systems and commented on their possible Gaussian and non-Gaussian nature. The importance of noise, along with the possibility that the noise itself can be the signal of interest, has been discussed in our paper [6]; however, Meroz [4] and Krapf [5] have provided specific examples that guide readers further. Sachs et al. [7] have discussed kinetic analysis in the presence of indistinguishable states and have pointed to free software for general kinetic analysis that originated from their research.
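Since the commentary turns on how a diffraction-limited PSF is fitted, a minimal sketch may help make the point concrete. The Python snippet below is illustrative only (it is not the commentators' code; the 15x15 pixel grid, amplitudes, and Poisson noise model are assumptions): it fits a symmetric 2D Gaussian to a synthetic spot, the prevalent practice whose drawbacks Flyvbjerg et al. [2] discuss.

```python
import numpy as np
from scipy.optimize import curve_fit

def gauss2d(xy, amp, x0, y0, sigma, offset):
    """Symmetric 2D Gaussian: a common, if imperfect, PSF surrogate."""
    x, y = xy
    return amp * np.exp(-((x - x0)**2 + (y - y0)**2) / (2 * sigma**2)) + offset

# Synthetic diffraction-limited spot with Poisson shot noise (pixel units).
rng = np.random.default_rng(0)
x, y = np.meshgrid(np.arange(15), np.arange(15))
truth = gauss2d((x, y), amp=200.0, x0=7.2, y0=6.8, sigma=1.3, offset=10.0)
image = rng.poisson(truth).astype(float)

# Unweighted least-squares localization; a careful pipeline would weight
# the residuals by the detector noise model, as the commentators stress.
p0 = (image.max(), 7.0, 7.0, 1.5, image.min())
popt, _ = curve_fit(gauss2d, (x.ravel(), y.ravel()), image.ravel(), p0=p0)
print(f"estimated centre: ({popt[1]:.2f}, {popt[2]:.2f})")
```

Swapping the Gaussian for a more faithful PSF, and folding detector noise into the fit, are precisely the refinements Shepherd [3] and Flyvbjerg et al. [2] advocate.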
Ficklin, Stephen P; Feltus, Frank Alex
2013-01-01
Many traits of biological and agronomic significance in plants are controlled in a complex manner where multiple genes and environmental signals affect the expression of the phenotype. In Oryza sativa (rice), thousands of quantitative genetic signals have been mapped to the rice genome. In parallel, thousands of gene expression profiles have been generated across many experimental conditions. Through the discovery of networks with real gene co-expression relationships, it is possible to identify co-localized genetic and gene expression signals that implicate complex genotype-phenotype relationships. In this work, we used a knowledge-independent, systems genetics approach to discover a high-quality set of co-expression networks, termed Gene Interaction Layers (GILs). Twenty-two GILs were constructed from 1,306 Affymetrix microarray rice expression profiles that were pre-clustered to allow for improved capture of gene co-expression relationships. Functional genomic and genetic data, including over 8,000 QTLs and 766 phenotype-tagged SNPs (p-value ≤ 0.001) from genome-wide association studies, together covering over 230 different rice traits, were integrated with the GILs. An online systems genetics data-mining resource, the GeneNet Engine, was constructed to enable dynamic discovery of gene sets (i.e. network modules) that overlap with genetic traits. GeneNet Engine does not provide the exact set of genes underlying a given complex trait, but through the evidence of gene-marker correspondence, co-expression, and functional enrichment, site visitors can identify genes with potential shared causality for a trait, which could then be used for experimental validation. A set of 2 million SNPs was incorporated into the database and serves as a potential set of testable biomarkers for genes in modules that overlap with genetic traits. Herein, we describe two modules found using GeneNet Engine, one with significant overlap with the trait amylose content and another with significant overlap with blast disease resistance.
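As an illustration of the kind of module-trait overlap statistic such a resource can support, the sketch below scores the overlap between one co-expression module and the set of marker-tagged genes for a trait with a hypergeometric test. All counts are hypothetical and this is not the GeneNet Engine implementation:

```python
from scipy.stats import hypergeom

# Hypothetical counts: genes on the array, genes tagged by QTL/GWAS markers
# for a trait, genes in one co-expression module, and tagged genes in it.
total_genes, tagged_genes, module_size, overlap = 20000, 300, 50, 6

# P(overlap >= 6) if the module were a random draw of 50 genes:
# survival function of the hypergeometric distribution at overlap - 1.
p = hypergeom.sf(overlap - 1, total_genes, tagged_genes, module_size)
print(f"module-trait overlap p-value: {p:.3g}")
```

A small p-value flags the module as a candidate source of genes with shared causality for the trait, which is the spirit of the gene-marker correspondence evidence described above.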
Deichmann, Ute
2011-09-01
Three early 20th-century attempts at unifying separate areas of biology, in particular development, genetics, physiology, and evolution, are compared in regard to their success and fruitfulness for further research: Jacques Loeb's reductionist project of unifying approaches by physico-chemical explanations; Richard Goldschmidt's anti-reductionist attempts to unify by integration; and Sewall Wright's combination of reductionist research and vision of hierarchical genetic systems. Loeb's program, demanding that all aspects of biology, including evolution, be studied by the methods of the experimental sciences, proved highly successful and indispensable for higher level investigations, even though evolutionary change and properties of biological systems cannot yet be fully explained on the molecular level alone. Goldschmidt has been appraised as a pioneer of physiological and developmental genetics and of a new evolutionary synthesis that transcended neo-Darwinism. However, this study concludes that his anti-reductionist attempts to integrate genetics, development and evolution have to be regarded as failures or dead ends. His grand speculations were based on the one hand on concepts and experimental systems that were too vague to stimulate further research, and on the other on experiments whose core parts turned out not to be reproducible. In contrast, Sewall Wright, apart from being one of the architects of the neo-Darwinian synthesis of the 1930s, opened up new paths of testable quantitative developmental genetic investigations. He placed his research within a framework of logical reasoning, which resulted in the farsighted speculation that examinations of biological systems should be related to the regulation of hierarchical genetic subsystems, possibly providing a mechanism for development and evolution. I argue that his suggestion of basing the study of systems on clearly defined properties of the components has proved superior to Goldschmidt's approach of studying systems as a whole, and that attempts to integrate different fields at too early a stage may prove futile or worse. Copyright © 2011 Elsevier Inc. All rights reserved.
Abdo, Nour; Xia, Menghang; Brown, Chad C.; Kosyk, Oksana; Huang, Ruili; Sakamuru, Srilatha; Zhou, Yi-Hui; Jack, John R.; Gallins, Paul; Xia, Kai; Li, Yun; Chiu, Weihsueh A.; Motsinger-Reif, Alison A.; Austin, Christopher P.; Tice, Raymond R.
2015-01-01
Background: Understanding of human variation in toxicity to environmental chemicals remains limited, so human health risk assessments still largely rely on a generic 10-fold factor (10^(1/2) each for toxicokinetics and toxicodynamics) to account for sensitive individuals or subpopulations. Objectives: We tested a hypothesis that population-wide in vitro cytotoxicity screening can rapidly inform both the magnitude of and molecular causes for interindividual toxicodynamic variability. Methods: We used 1,086 lymphoblastoid cell lines from the 1000 Genomes Project, representing nine populations from five continents, to assess variation in cytotoxic response to 179 chemicals. Analysis included assessments of population variation and heritability, and genome-wide association mapping, with attention to phenotypic relevance to human exposures. Results: For about half the tested compounds, cytotoxic response in the 1% most "sensitive" individual occurred at concentrations within a factor of 10^(1/2) (i.e., approximately 3) of that in the median individual; however, for some compounds, this factor was > 10. Genetic mapping suggested important roles for variation in membrane and transmembrane genes, with a number of chemicals showing association with SNP rs13120371 in the solute carrier SLC7A11, previously implicated in chemoresistance. Conclusions: This experimental approach fills critical gaps unaddressed by recent large-scale toxicity testing programs, providing quantitative, experimentally based estimates of human toxicodynamic variability, and also testable hypotheses about mechanisms contributing to interindividual variation. Citation: Abdo N, Xia M, Brown CC, Kosyk O, Huang R, Sakamuru S, Zhou YH, Jack JR, Gallins P, Xia K, Li Y, Chiu WA, Motsinger-Reif AA, Austin CP, Tice RR, Rusyn I, Wright FA. 2015. Population-based in vitro hazard and concentration–response assessment of chemicals: the 1000 Genomes high-throughput screening study. Environ Health Perspect 123:458–466; http://dx.doi.org/10.1289/ehp.1408775 PMID:25622337
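The reported comparison against the default 10^(1/2) factor amounts to a simple percentile ratio. The sketch below computes a toxicodynamic variability factor from simulated per-individual potencies; the lognormal spread is an assumption for illustration, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical per-individual potencies (e.g. EC10 values, in uM) for one
# chemical across 1,086 cell lines, assumed lognormally distributed.
ec10 = rng.lognormal(mean=np.log(10.0), sigma=0.5, size=1086)

# Variability factor: median individual relative to the 1% most sensitive.
tdvf = np.median(ec10) / np.percentile(ec10, 1)
print(f"TDVF(1%) = {tdvf:.2f}  (default assumption: 10**0.5 ~ 3.16)")
```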
Kapoor, Abhijeet; Shandilya, Manish; Kundu, Suman
2011-01-01
Human dopamine β-hydroxylase (DBH) is an important therapeutic target for complex traits. Several single nucleotide polymorphisms (SNPs) with potential adverse physiological effects have also been identified in DBH. However, difficulty in obtaining diffractable crystals and the lack of a suitable template for modeling the protein have ensured that neither a crystallographic three-dimensional structure nor a computational model of the enzyme is available to aid rational drug design, prediction of the functional significance of SNPs, or analytical protein engineering. Adequate biochemical information regarding human DBH, structural coordinates for peptidylglycine alpha-hydroxylating monooxygenase, and computational data from a partial model of rat DBH were used, along with logical manual intervention, in a novel way to build an in silico model of human DBH. The model provides structural insight into the active site, metal coordination, subunit interface, substrate recognition and inhibitor binding. It reveals that the DOMON domain potentially promotes tetramerization, while the substrate dopamine and a potential therapeutic inhibitor, nepicastat, are stabilized in the active site through multiple hydrogen bonding. The functional significance of several exonic SNPs could be described from a structural analysis of the model. The model confirms that the SNPs resulting in Ala318Ser or Leu317Pro mutations may not influence enzyme activity, while Gly482Arg might actually do so, being in the proximity of the active site. Arg549Cys may cause abnormal oligomerization through non-native disulfide bond formation. Other SNPs, like Glu181, Glu250, Lys239 and Asp290, could potentially inhibit tetramerization, thus affecting function. The first three-dimensional model of full-length human DBH protein was obtained in a novel manner, with a set of experimental data as a guideline for the consistency of the in silico prediction. Preliminary physicochemical tests validated the model. The model confirms, rationalizes, and provides a structural basis for several biochemical observations, and advances testable hypotheses regarding function. It also provides a reasonable template for drug design.
Induction and modulation of persistent activity in a layer V PFC microcircuit model
Papoutsi, Athanasia; Sidiropoulou, Kyriaki; Cutsuridis, Vassilis; Poirazi, Panayiota
2013-01-01
Working memory refers to the temporary storage of information and is strongly associated with the prefrontal cortex (PFC). Persistent activity of cortical neurons, namely activity that persists beyond the stimulus presentation, is considered the cellular correlate of working memory. Although past studies suggested that this type of activity is characteristic of large-scale networks, recent experimental evidence implies that small, tightly interconnected clusters of neurons in the cortex may support similar functionalities. However, very little is known about the biophysical mechanisms giving rise to persistent activity in small-sized microcircuits in the PFC. Here, we present a biophysically detailed, yet morphologically simplified, microcircuit model of layer V PFC neurons that incorporates connectivity constraints and is validated against a multitude of experimental data. We show that (a) a small-sized network can exhibit persistent activity under realistic stimulus conditions; (b) its emergence depends strongly on the interplay of dADP, NMDA, and GABAB currents; (c) although increases in stimulus duration increase the probability of persistent activity induction, variability in the stimulus firing frequency does not consistently influence it; and (d) modulation of ionic conductances (Ih, ID, IsAHP, IcaL, IcaN, IcaR) differentially controls persistent activity properties in a location-dependent manner. These findings suggest that modulation of the microcircuit's firing characteristics is achieved primarily through changes in its intrinsic mechanism makeup, supporting the hypothesis of multiple bi-stable units in the PFC. Overall, the model generates a number of experimentally testable predictions that may lead to a better understanding of the biophysical mechanisms of persistent activity induction and modulation in the PFC. PMID:24130519
Morris, Melody K.; Saez-Rodriguez, Julio; Clarke, David C.; Sorger, Peter K.; Lauffenburger, Douglas A.
2011-01-01
Predictive understanding of cell signaling network operation based on general prior knowledge but consistent with empirical data in a specific environmental context is a current challenge in computational biology. Recent work has demonstrated that Boolean logic can be used to create context-specific network models by training proteomic pathway maps to dedicated biochemical data; however, the Boolean formalism is restricted to characterizing protein species as either fully active or inactive. To advance beyond this limitation, we propose a novel form of fuzzy logic sufficiently flexible to model quantitative data but also sufficiently simple to efficiently construct models by training pathway maps on dedicated experimental measurements. Our new approach, termed constrained fuzzy logic (cFL), converts a prior knowledge network (obtained from literature or interactome databases) into a computable model that describes graded values of protein activation across multiple pathways. We train a cFL-converted network to experimental data describing hepatocytic protein activation by inflammatory cytokines and demonstrate the application of the resultant trained models for three important purposes: (a) generating experimentally testable biological hypotheses concerning pathway crosstalk, (b) establishing capability for quantitative prediction of protein activity, and (c) prediction and understanding of the cytokine release phenotypic response. Our methodology systematically and quantitatively trains a protein pathway map summarizing curated literature to context-specific biochemical data. This process generates a computable model yielding successful prediction of new test data and offering biological insight into complex datasets that are difficult to fully analyze by intuition alone. PMID:21408212
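To make the move from Boolean to constrained fuzzy logic concrete, the toy sketch below replaces step-function gates with graded transfer functions. The normalized Hill function and min/max gates are common fuzzy-logic choices assumed here for illustration; the pathway and parameters are hypothetical, not the trained hepatocyte models:

```python
def hill(x, k=0.5, n=3):
    """Graded activation in [0, 1]: replaces a Boolean on/off step."""
    return x**n / (k**n + x**n)

def fuzzy_and(*inputs):
    return min(inputs)  # fuzzy AND: weakest graded input dominates

def fuzzy_or(*inputs):
    return max(inputs)  # fuzzy OR: strongest graded input dominates

# Toy pathway: cytokine -> receptor -> (kinase AND NOT inhibitor) -> output
cytokine, inhibitor = 0.8, 0.3
receptor = hill(cytokine)
output = fuzzy_and(hill(receptor), 1.0 - hill(inhibitor))
print(f"graded output activation: {output:.2f}")
```

Training a cFL model then amounts to choosing gate structure and transfer-function parameters so that such graded outputs reproduce the measured protein activities.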
NASA Astrophysics Data System (ADS)
Persinger, M. A.; McKay, B. E.; O'Donovan, C. A.; Koren, S. A.
2005-03-01
To test the hypothesis that sudden unexplained death (SUD) in some epileptic patients is related to geomagnetic activity, we exposed rats in which limbic epilepsy had been induced to experimentally produced magnetic fields designed to simulate sudden storm commencements (SSCs). Prior studies with rats had shown that sudden death in groups of rats in which epilepsy had been induced months earlier was associated with the occurrence of SSCs and increased geomagnetic activity during the previous night. Schnabel et al. [(2000) Neurology 54:903-908] found no relationship between SUD in human patients and geomagnetic activity. A total of 96 rats were exposed to 500 nT, 50 nT, or 10-40 nT magnetic fields, or to sham fields (less than 10 nT), for 6 min every hour between midnight and 0800 hours (local time) for three successive nights. The shape of the complex, amplitude-modulated magnetic fields simulated the shape and structure of an average SSC. The rats were then seized with lithium and pilocarpine and the mortality was monitored. Whereas 10% of the rats that had been exposed to the sham field died within 24 h, 60% of the rats that had been exposed to the experimental magnetic fields simulating natural geomagnetic activity died during this period (P<.001). These results suggest that correlational analyses between SUD in epileptic patients and increased geomagnetic activity can be simulated experimentally in epileptic rats and that potential mechanisms might be testable directly.
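The headline mortality difference (60% of field-exposed versus 10% of sham-exposed rats within 24 h) can be checked with a 2x2 exact test. The group sizes below are assumptions chosen to be consistent with the 96-rat total, since the abstract does not give the exact split:

```python
from scipy.stats import fisher_exact

# Rows: exposed / sham; columns: died / survived within 24 h.
# Hypothetical counts matching the reported rates (72 exposed, 24 sham).
table = [[43, 29],   # exposed: ~60% died
         [2, 22]]    # sham:    ~10% died
odds_ratio, p = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.1f}, p = {p:.2g}")
```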
Ewing's Sarcoma: Development of RNA Interference-Based Therapy for Advanced Disease
Simmons, Olivia; Maples, Phillip B.; Senzer, Neil; Nemunaitis, John
2012-01-01
Ewing's sarcoma tumors are associated with a chromosomal translocation between the EWS gene and an ETS transcription factor gene. These unique target sequences provide an opportunity for RNA interference (RNAi)-based therapy. A summary of the RNAi mechanism and of therapeutically designed products, including siRNA, shRNA and bi-shRNA, is given, and comparison is made between each of these approaches. Systemic RNAi-based therapy, however, requires protected delivery to the Ewing's sarcoma tumor site for activity. Delivery systems which have been most effective in preclinical and clinical testing are reviewed, followed by preclinical assessment of various silencing strategies with demonstrated effectiveness against EWS/FLI-1 target sequences. It is concluded that RNAi-based therapeutics may have testable and achievable activity in the management of Ewing's sarcoma. PMID:22523703
Sterile neutrino searches at future e-e+, pp and e-p colliders
NASA Astrophysics Data System (ADS)
Antusch, Stefan; Cazzato, Eros; Fischer, Oliver
2017-05-01
Sterile neutrinos are among the most attractive extensions of the SM to generate the light neutrino masses observed in neutrino oscillation experiments. When the sterile neutrinos are subject to a protective symmetry, they can have masses around the electroweak scale and potentially large neutrino Yukawa couplings, which makes them testable at planned future particle colliders. We systematically discuss the production and decay channels at electron-positron, proton-proton and electron-proton colliders and provide a complete list of the leading order signatures for sterile neutrino searches. Among other things, we discuss several novel search channels, and present a first look at the possible sensitivities for the active-sterile mixings and the heavy neutrino masses. We compare the performance of the different collider types and discuss their complementarity.
Microbial endocrinology and the microbiota-gut-brain axis.
Lyte, Mark
2014-01-01
Microbial endocrinology is defined as the study of the ability of microorganisms to both produce and recognize neurochemicals that originate either within the microorganisms themselves or within the host they inhabit. As such, microbial endocrinology represents the intersection of the fields of microbiology and neurobiology. Neurochemical-based cell-to-cell signaling mechanisms in eukaryotic organisms are believed to have been acquired through late horizontal gene transfer from prokaryotic microorganisms. When considered in the context of the microbiota's ability to influence host behavior, microbial endocrinology, with its theoretical basis rooted in shared neuroendocrine signaling mechanisms, provides for testable experiments with which to understand the role of the microbiota in host behavior and, as importantly, the ability of the host to influence the microbiota through neuroendocrine-based mechanisms.
Complex Causal Process Diagrams for Analyzing the Health Impacts of Policy Interventions
Joffe, Michael; Mindell, Jennifer
2006-01-01
Causal diagrams are rigorous tools for controlling confounding. They also can be used to describe complex causal systems, which is done routinely in communicable disease epidemiology. The use of change diagrams has advantages over static diagrams, because change diagrams are more tractable, relate better to interventions, and have clearer interpretations. Causal diagrams are a useful basis for modeling. They make assumptions explicit, provide a framework for analysis, generate testable predictions, explore the effects of interventions, and identify data gaps. Causal diagrams can be used to integrate different types of information and to facilitate communication both among public health experts and between public health experts and experts in other fields. Causal diagrams allow the use of instrumental variables, which can help control confounding and reverse causation. PMID:16449586
Graduate students' teaching experiences improve their methodological research skills.
Feldon, David F; Peugh, James; Timmerman, Briana E; Maher, Michelle A; Hurst, Melissa; Strickland, Denise; Gilmore, Joanna A; Stiegelmeyer, Cindy
2011-08-19
Science, technology, engineering, and mathematics (STEM) graduate students are often encouraged to maximize their engagement with supervised research and minimize teaching obligations. However, the process of teaching students engaged in inquiry provides practice in the application of important research skills. Using a performance rubric, we compared the quality of methodological skills demonstrated in written research proposals for two groups of early career graduate students (those with both teaching and research responsibilities and those with only research responsibilities) at the beginning and end of an academic year. After statistically controlling for preexisting differences between groups, students who both taught and conducted research demonstrated significantly greater improvement in their abilities to generate testable hypotheses and design valid experiments. These results indicate that teaching experience can contribute substantially to the improvement of essential research skills.
A unifying model of the role of the infralimbic cortex in extinction and habits
Taylor, Jane R.; Chandler, L. Judson
2014-01-01
The infralimbic prefrontal cortex (IL) has been shown to be critical for the regulation of flexible behavior, but its precise function remains unclear. This region has been shown to be critical for the acquisition, consolidation, and expression of extinction learning, leading many to hypothesize that IL suppresses behavior as part of a “stop” network. However, this framework is at odds with IL function in habitual behavior in which the IL has been shown to be required for the expression and acquisition of ongoing habitual behavior. Here, we will review the current state of knowledge of IL anatomy and function in behavioral flexibility and provide a testable framework for a single IL mechanism underlying its function in both extinction and habit learning. PMID:25128534
ERIC Educational Resources Information Center
Maul, Andrew
2015-01-01
Briggs and Peck [in "Using Learning Progressions to Design Vertical Scales That Support Coherent Inferences about Student Growth"] call for greater care in the conceptualization of the target attributes of students, or "what it is that is growing from grade to grade." In particular, they argue that learning progressions can…
Performance Models of Testability.
1984-08-01
4.1.17 Cost of Isolating Component/Part (CPI). The cost of isolating components or parts at the depot is CPI = n1(HDC)(TPI)(NPI), where TPI = average … [the remainder of this passage, a fault-isolation flow diagram covering detection, failure decisions, and the expected cost of component removal and replacement, is garbled beyond recovery in the source].
There's No Such Thing as Value-Free Science.
ERIC Educational Resources Information Center
Makosky, Vivian Parker
This paper is based on the view that, although scientists rely on research values such as predictive accuracy and testability, scientific research is still subject to the unscientific values, attitudes, and emotions of the scientists. It is noted that undergraduate students are likely not to think critically about the science they encounter. A…
Phases in the Adoption of Educational Innovations in Teacher Training Institutions.
ERIC Educational Resources Information Center
Hall, Gene E.
An attempt has been made to categorize phenomena observed as 20 teacher training institutions have adopted innovations and to extrapolate from these findings key concepts and principles that could form the basis for developing empirically testable hypotheses and could be of some immediate utility to those involved in innovation adoption. The…
Twelve testable hypotheses on the geobiology of weathering
S.L. Brantley; J.P. Megonigal; F.N. Scatena; Z. Balogh-Brunstad; R.T. Barnes; M.A. Bruns; P. van Cappelen; K. Dontsova; H.E. Hartnett; A.S. Hartshorn; A. Heimsath; E. Herndon; L. Jin; C.K. Keller; J.R. Leake; W.H. McDowell; F.C. Meinzer; T.J. Mozdzer; S. Petsch; J. Pett-Ridge; K.S. Pretziger; P.A. Raymond; C.S. Riebe; K. Shumaker; A. Sutton-Grier; R. Walter; K. Yoo
2011-01-01
Critical Zone (CZ) research investigates the chemical, physical, and biological processes that modulate the Earth's surface. Here, we advance 12 hypotheses that must be tested to improve our understanding of the CZ: (1) Solar-to-chemical conversion of energy by plants regulates flows of carbon, water, and nutrients through plant-microbe soil networks, thereby...
ERIC Educational Resources Information Center
Kirch, Susan A.; Stetsenko, Anna
2012-01-01
What do people mean when they say they "know" something in science? It usually means they did an investigation and expended considerable intellectual effort to build a useful explanatory model. It means they are confident about an explanation, believe others should trust what they say, and believe that their claim is testable. It means they can…
ERIC Educational Resources Information Center
Martin-Dunlop, Catherine S.
2013-01-01
This study investigated prospective elementary teachers' understandings of the nature of science and explored associations with their guided-inquiry science learning environment. Over 500 female students completed the Nature of Scientific Knowledge Survey (NSKS), although only four scales were analyzed: Creative, Testable, Amoral, and Unified. The…
Forensic Impact of the Child Sexual Abuse Medical Examination.
ERIC Educational Resources Information Center
Myers, John E. B.
1998-01-01
This commentary on an article (EC 619 279) about research issues at the interface of medicine and law concerning medical evaluation for child sexual abuse focuses on empirically testable questions: (1) the medical history--its accuracy, interviewing issues, and elicitation and preservation of verbal evidence of abuse; and, (2) expert testimony.…
Two New Empirically Derived Reasons To Use the Assessment of Basic Learning Abilities.
ERIC Educational Resources Information Center
Richards, David F.; Williams, W. Larry; Follette, William C.
2002-01-01
Scores on the Assessment of Basic Learning Abilities (ABLA), Vineland Adaptive Behavior Scales, and the Wechsler Intelligence Scale-Revised (WAIS-R) were obtained for 30 adults with mental retardation. Correlations between the Vineland domains and ABLA were all significant. No participants performing below ABLA Level 6 were testable on the…
A Cognitive Approach to Brailling Errors
ERIC Educational Resources Information Center
Wells-Jensen, Sheri; Schwartz, Aaron; Gosche, Bradley
2007-01-01
This article analyzes a corpus of 1,600 brailling errors made by one expert braillist. It presents a testable model of braille writing and shows that the subject braillist stores standard braille contractions as part of the orthographic representation of words, rather than imposing contractions on a serially ordered string of letters. (Contains 1…
Thinking about Evolution: Combinatorial Play as a Strategy for Exercising Scientific Creativity
ERIC Educational Resources Information Center
Wingate, Richard J. T.
2011-01-01
An enduring focus in education on how scientists formulate experiments and "do science" in the laboratory has excluded a vital element of scientific practice: the creative and imaginative thinking that generates models and testable hypotheses. In this case study, final-year biomedical sciences university students were invited to create and justify…
Purposeful Instruction: Mixing up the "I," "We," and "You"
ERIC Educational Resources Information Center
Grant, Maria; Lapp, Diane; Fisher, Douglas; Johnson, Kelly; Frey, Nancy
2012-01-01
This article discusses the flexible nature of the gradual release of responsibility (GRR) as a frame for inquiry-based science instruction. Given the mandate for the use of text-supported learning (Common Core Standards), the GRR can be used to allow students to learn as scientists as they collaboratively develop testable questions and experiments…
What can we learn from a two-brain approach to verbal interaction?
Schoot, Lotte; Hagoort, Peter; Segaert, Katrien
2016-09-01
Verbal interaction is one of the most frequent social interactions humans encounter on a daily basis. In the current paper, we zoom in on what the multi-brain approach has contributed, and can contribute in the future, to our understanding of the neural mechanisms supporting verbal interaction. Indeed, since verbal interaction can only exist between individuals, it seems intuitive to focus analyses on inter-individual neural markers, i.e. between-brain neural coupling. To date, however, there is a severe lack of theoretically-driven, testable hypotheses about what between-brain neural coupling actually reflects. In this paper, we develop a testable hypothesis in which between-pair variation in between-brain neural coupling is of key importance. Based on theoretical frameworks and empirical data, we argue that the level of between-brain neural coupling reflects speaker-listener alignment at different levels of linguistic and extra-linguistic representation. We discuss the possibility that between-brain neural coupling could inform us about the highest level of inter-speaker alignment: mutual understanding. Copyright © 2016 Elsevier Ltd. All rights reserved.
Active processes make mixed lipid membranes either flat or crumpled
NASA Astrophysics Data System (ADS)
Banerjee, Tirthankar; Basu, Abhik
2018-01-01
Whether live cell membranes show miscibility phase transitions (MPTs), and if so, how they fluctuate near the transitions, remain outstanding unresolved issues in physics and biology alike. Motivated by these questions, we construct a generic hydrodynamic theory for lipid membranes that are active due, for instance, to the molecular motors in the surrounding cytoskeleton, or to active protein components in the membrane itself. We use this to uncover a direct correspondence between membrane fluctuations and MPTs. Several testable predictions are made: (i) generic active stiffening with orientational long-range order (flat membrane), or softening with crumpling of the membrane, controlled by the active tension; and (ii) for mixed lipid membranes, capturing the nature of putative MPTs by measuring the membrane conformation fluctuations. Possibilities of both first- and second-order MPTs in mixed active membranes are argued for. Near second-order MPTs, active stiffening (softening) manifests as a super-stiff (super-soft) membrane. Our predictions are testable in a variety of in vitro systems, e.g. live cytoskeletal extracts deposited on liposomes and lipid membranes containing active proteins embedded in a passive fluid.
Hallow, K M; Gebremichael, Y
2017-06-01
Renal function plays a central role in cardiovascular, kidney, and multiple other diseases, and many existing and novel therapies act through renal mechanisms. Even with decades of accumulated knowledge of renal physiology, pathophysiology, and pharmacology, the dynamics of renal function remain difficult to understand and predict, often resulting in unexpected or counterintuitive therapy responses. Quantitative systems pharmacology modeling of renal function integrates this accumulated knowledge into a quantitative framework, allowing evaluation of competing hypotheses, identification of knowledge gaps, and generation of new experimentally testable hypotheses. Here we present a model of renal physiology and control mechanisms involved in maintaining sodium and water homeostasis. This model represents the core renal physiological processes involved in many research questions in drug development. The model runs in R and the code is made available. In a companion article, we present a case study using the model to explore mechanisms and pharmacology of salt-sensitive hypertension. © 2017 The Authors CPT: Pharmacometrics & Systems Pharmacology published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.
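The published model runs in R and is far richer, but the flavor of such a mechanistic balance model can be conveyed with a toy one-compartment sodium balance. Everything below (the compartment structure, intake, and excretion constant) is an illustrative assumption, not the authors' model:

```python
from scipy.integrate import solve_ivp

# Toy sodium balance: body sodium rises with dietary intake and falls with
# excretion proportional to the current amount (a crude stand-in for
# pressure-natriuresis feedback).
intake = 100.0   # mmol/day, hypothetical dietary sodium intake
k_exc = 0.025    # 1/day, hypothetical excretion rate constant

def sodium_balance(t, na):
    return [intake - k_exc * na[0]]

sol = solve_ivp(sodium_balance, t_span=(0.0, 200.0), y0=[3000.0])
print(f"sodium after 200 days ~ {sol.y[0, -1]:.0f} mmol "
      f"(analytic steady state: {intake / k_exc:.0f} mmol)")
```

Quantitative systems pharmacology models extend exactly this pattern: many coupled balances plus regulatory feedbacks, so that competing hypotheses become alternative terms in the equations.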
NASA Astrophysics Data System (ADS)
Hansson, Johan; Francois, Stephane
The search for a theory of quantum gravity is the most fundamental problem in all of theoretical physics, but there are as yet no experimental results at all to guide this endeavor. What seems to be needed is a pragmatic way to test whether gravitation really occurs between quantum objects or not. In this paper, we suggest such a potential way out of this deadlock, utilizing macroscopic quantum systems: superfluid helium, gaseous Bose-Einstein condensates, and "macroscopic" molecules. It turns out that true quantum gravity effects, here defined as observable gravitational interactions between truly quantum objects, could and should be seen (if they occur in nature) using existing technology. A falsification of the low-energy limit in the accessible weak-field regime would also falsify the full theory of quantum gravity, making it enter the realm of testable, potentially falsifiable theories, i.e. becoming real physics after almost a century of pure theorizing. If weak-field gravity between quantum objects is shown to be absent (in the regime where the approximation should apply), we know that gravity is then a strictly classical phenomenon absent at the quantum level.
Higgs mass corrections in the SUSY B - L model with inverse seesaw
NASA Astrophysics Data System (ADS)
Elsayed, A.; Khalil, S.; Moretti, S.
2012-08-01
In the context of the Supersymmetric (SUSY) B - L (Baryon minus Lepton number) model with inverse seesaw mechanism, we calculate the one-loop radiative corrections due to right-handed (s)neutrinos to the mass of the lightest Higgs boson when the latter is Standard Model (SM)-like. We show that such effects can be as large as O(100) GeV, thereby giving an absolute upper limit on such a mass around 180 GeV. The importance of this result from a phenomenological point of view is twofold. On the one hand, this enhancement greatly reconciles theory and experiment by alleviating the so-called 'little hierarchy problem' of the minimal SUSY realization, whereby the current experimental limit on the SM-like Higgs mass is very near its absolute upper limit predicted theoretically, of 130 GeV. On the other hand, a SM-like Higgs boson with mass below 180 GeV is still well within the reach of the Large Hadron Collider (LHC), so that the SUSY realization discussed here is just as testable as the minimal version.
Friction law and hysteresis in granular materials
NASA Astrophysics Data System (ADS)
DeGiuli, E.; Wyart, M.
2017-08-01
The macroscopic friction of particulate materials often weakens as the flow rate is increased, leading to potentially disastrous intermittent phenomena including earthquakes and landslides. We theoretically and numerically study this phenomenon in simple granular materials. We show that velocity weakening, corresponding to a nonmonotonic behavior in the friction law, μ(I), is present even if the dynamic and static microscopic friction coefficients are identical, but disappears for softer particles. We argue that this instability is induced by endogenous acoustic noise, which tends to make contacts slide, leading to faster flow and increased noise. We show that soft spots, or excitable regions in the materials, correspond to rolling contacts that are about to slide, whose density is described by a nontrivial exponent θs. We build a microscopic theory for the nonmonotonicity of μ(I), which also predicts the scaling behavior of acoustic noise, the fraction of sliding contacts χ, and the sliding velocity, in terms of θs. Surprisingly, these quantities have no limit when particles become infinitely hard, as confirmed numerically. Our analysis rationalizes previously unexplained observations and makes experimentally testable predictions.
STOP using just GO: a multi-ontology hypothesis generation tool for high throughput experimentation
2013-01-01
Background: Gene Ontology (GO) enrichment analysis remains one of the most common methods for hypothesis generation from high-throughput datasets. However, we believe that researchers strive to test other hypotheses that fall outside of GO. Here, we developed and evaluated a tool for hypothesis generation from gene or protein lists using ontological concepts present in manually curated text that describes those genes and proteins. Results: As a consequence, we have developed the method Statistical Tracking of Ontological Phrases (STOP), which expands the realm of testable hypotheses in gene set enrichment analyses by integrating automated annotations of genes to terms from over 200 biomedical ontologies. While not as precise as manually curated terms, we find that the additional enriched concepts have value when coupled with traditional enrichment analyses using curated terms. Conclusion: Multiple ontologies have been developed for gene and protein annotation; by using a dataset of both manually curated GO terms and automatically recognized concepts from curated text, we can expand the realm of hypotheses that can be discovered. The web application STOP is available at http://mooneygroup.org/stop/. PMID:23409969
Online testable concept maps: benefits for learning about the pathogenesis of disease.
Ho, Veronica; Kumar, Rakesh K; Velan, Gary
2014-07-01
Concept maps have been used to promote meaningful learning and critical thinking. Although these are crucially important in all disciplines, evidence for the benefits of concept mapping for learning in medicine is limited. We performed a randomised crossover study to assess the benefits of online testable concept maps for learning in pathology by volunteer junior medical students. Participants (n = 65) were randomly allocated to either of two groups with equivalent mean prior academic performance, in which they were given access to either online maps or existing online resources for a 2-week block on renal disease. Groups then crossed over for a 2-week block on hepatic disease. Outcomes were assessed using timed online quizzes, which included questions unrelated to topics in the pathogenesis maps as an internal control. Questionnaires were administered to evaluate students' acceptance of the maps. In both blocks, the group with access to pathogenesis maps achieved significantly higher average scores than the control group on quiz questions related to topics covered by the maps (Block 1: p < 0.001, Cohen's d = 0.9; Block 2: p = 0.008, Cohen's d = 0.7). However, mean scores on unrelated questions did not differ significantly between the groups. In a third block on pancreatic disease, both groups received pathogenesis maps and collectively performed significantly better on quiz topics related to the maps than on unrelated topics (p < 0.01, Cohen's d = 0.5). Regression analysis revealed that access to pathogenesis maps was the dominant contributor to variance in performance on map-related quiz questions. Responses to questionnaire items on pathogenesis maps were overwhelmingly positive in both groups. These results indicate that online testable pathogenesis maps are well accepted and can improve learning of concepts in pathology by medical students. © 2014 John Wiley & Sons Ltd.
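For readers unfamiliar with the effect sizes quoted above, Cohen's d is the difference in group means divided by a pooled standard deviation. The sketch below uses hypothetical quiz scores, not the study's data:

```python
import numpy as np

def cohens_d(a, b):
    """Standardized mean difference with pooled standard deviation."""
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * np.var(a, ddof=1)
                  + (nb - 1) * np.var(b, ddof=1)) / (na + nb - 2)
    return (np.mean(a) - np.mean(b)) / np.sqrt(pooled_var)

# Hypothetical quiz scores (out of 20) for map and control groups.
map_group = np.array([15, 17, 14, 16, 18, 13, 16, 15])
control   = np.array([12, 14, 11, 15, 13, 12, 14, 10])
print(f"Cohen's d = {cohens_d(map_group, control):.2f}")
```

By the usual rule of thumb, the values of 0.5 to 0.9 reported in the study correspond to medium-to-large effects.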
Color Vision Deficiency in Preschool Children
Xie, John Z.; Tarczy-Hornoch, Kristina; Lin, Jesse; Cotter, Susan A.; Torres, Mina; Varma, Rohit
2016-01-01
Purpose: To determine the sex- and ethnicity-specific prevalence of color vision deficiency (CVD) in black, Asian, Hispanic, and non-Hispanic white preschool children. Design: Population-based, cross-sectional study. Participants: The Multi-Ethnic Pediatric Eye Disease Study is a population-based evaluation of the prevalence of vision disorders in children in Southern California. A total of 5960 subjects 30 to 72 months of age were recruited for the study, of whom 4177 were able to complete color vision testing (1265 black, 812 Asian, 1280 Hispanic, and 820 non-Hispanic white). Methods: Color vision testing was performed using Color Vision Testing Made Easy color plates (Home Vision Care, Gulf Breeze, FL), and diagnostic confirmatory testing was performed using the Waggoner HRR Diagnostic Test color plates (Home Vision Care). Main Outcome Measures: Testability of color vision in preschool children between 30 and 72 months of age and prevalence of CVD stratified by age, sex, and ethnicity. Results: Testability was 17% in children younger than 37 months of age, increasing to 57% in children 37 to 48 months of age, 89% in children 49 to 60 months of age, and 98% in children 61 to 72 months of age. The prevalence of CVD among boys was 1.4% for black, 3.1% for Asian, 2.6% for Hispanic, and 5.6% for non-Hispanic white children; the prevalence in girls was 0.0% to 0.5% for all ethnicities. The ethnic difference in CVD was statistically significant between black and non-Hispanic white children (P = 0.0003) and between Hispanic and non-Hispanic white children (P = 0.02). In boys, most CVD cases were either deutan (51%) or protan (34%); 32% were classified as mild, 15% as moderate, and 41% as severe. Conclusions: Testability for CVD in preschool children is high by 4 years of age. The prevalence of CVD in preschool boys varies by ethnicity, with the highest prevalence in non-Hispanic white and lowest in black children. PMID:24702753
NASA Astrophysics Data System (ADS)
Sannino, Francesco
I discuss the impact of the discovery of a Higgs-like state on composite dynamics, starting by critically examining the reasons in favour of either an elementary or composite nature of this state. Accepting the standard model interpretation, I re-address the standard model vacuum stability within a Weyl-consistent computation. I will carefully examine the fundamental reasons why what has been discovered might not be the standard model Higgs. Dynamical electroweak breaking naturally addresses a number of the fundamental issues unsolved by the standard model interpretation. However, this paradigm has been challenged by the discovery of a not-so-heavy Higgs-like state. I will therefore review the recent discovery [1] that the standard model top-induced radiative corrections naturally reduce the intrinsic non-perturbative mass of the composite Higgs state towards the desired experimental value. Not only do we have a natural and testable working framework, but we have also suggested specific gauge theories that can realise, at the fundamental level, these minimal models of dynamical electroweak symmetry breaking. These strongly coupled gauge theories are now being heavily investigated via first-principle lattice simulations, with encouraging results. The new findings show that the recent naive claims made about new strong dynamics at the electroweak scale being disfavoured by the discovery of a not-so-heavy composite Higgs are unwarranted. I will then introduce the more speculative idea of extreme compositeness, according to which not only the Higgs sector of the standard model is composite but also the quarks and leptons, and provide a toy example in the form of gauge-gauge duality.
Evolutionary Dynamics on Protein Bi-stability Landscapes can Potentially Resolve Adaptive Conflicts
Sikosek, Tobias; Bornberg-Bauer, Erich; Chan, Hue Sun
2012-01-01
Experimental studies have shown that some proteins exist in two alternative native-state conformations. It has been proposed that such bi-stable proteins can potentially function as evolutionary bridges at the interface between two neutral networks of protein sequences that fold uniquely into the two different native conformations. Under adaptive conflict scenarios, bi-stable proteins may be of particular advantage if they simultaneously provide two beneficial biological functions. However, computational models that simulate protein structure evolution do not yet recognize the importance of bi-stability. Here we use a biophysical model to analyze sequence space to identify bi-stable or multi-stable proteins with two or more equally stable native-state structures. The inclusion of such proteins enhances phenotype connectivity between neutral networks in sequence space. Consideration of the sequence space neighborhood of bridge proteins revealed that bi-stability decreases gradually with each mutation that takes the sequence further away from an exactly bi-stable protein. With relaxed selection pressures, we found that bi-stable proteins in our model are highly successful under simulated adaptive conflict. Inspired by these model predictions, we developed a method to identify real proteins in the PDB with bridge-like properties, and have verified a clear bi-stability gradient for a series of mutants studied by Alexander et al. (Proc Nat Acad Sci USA 2009, 106:21149–21154) that connect two sequences that fold uniquely into two different native structures via a bridge-like intermediate mutant sequence. Based on these findings, new testable predictions for future studies on protein bi-stability and evolution are discussed. PMID:23028272
ElSawy, Karim M
2017-02-01
A large number of single-stranded RNA viruses assemble their capsid and package their genomic material simultaneously. The RNA viral genome plays multiple roles in this process that are currently only partly understood. In this work, we investigated the thermodynamic basis of the role of viral RNA in the assembly of capsid proteins. The viral capsid of bacteriophage MS2 was considered as a case study. The MS2 virus capsid is composed of 60 AB and 30 CC protein dimers. We investigated the effect of RNA stem-loop (the translational repressor TR) binding to the capsid dimers on the dimer-dimer relative association free energies. We found that TR binding results in destabilization of AB self-association compared with AB and CC association. This indicates that the association of the AB and CC dimers is the most likely assembly pathway for the MS2 virus, which explains the experimental observation of alternating patterns of AB and CC dimers in dominant assembly intermediates of the MS2 virus. The presence of viral RNA, therefore, dramatically channels virus assembly to a limited number of pathways, thereby enhancing the efficiency of the virus self-assembly process. Interestingly, Thr59Ser and Thr45Ala mutations of the dimers, in the absence of RNA stem loops, lead to stabilization of AB self-association compared with the AB and CC associations, thereby channelling virus assembly towards a fivefold (AB)5 pentamer intermediate and providing a testable hypothesis derived from our thermodynamic arguments.
Computations underlying the visuomotor transformation for smooth pursuit eye movements
Murdison, T. Scott; Leclercq, Guillaume; Lefèvre, Philippe
2014-01-01
Smooth pursuit eye movements are driven by retinal motion and enable us to view moving targets with high acuity. Complicating the generation of these movements is the fact that different eye and head rotations can produce different retinal stimuli while giving rise to identical smooth pursuit trajectories. However, because our eyes accurately pursue targets regardless of eye and head orientation (Blohm G, Lefèvre P. J Neurophysiol 104: 2103–2115, 2010), the brain must somehow take these signals into account. To learn about the neural mechanisms potentially underlying this visual-to-motor transformation, we trained a physiologically inspired neural network model to combine two-dimensional (2D) retinal motion signals with three-dimensional (3D) eye and head orientation and velocity signals to generate a spatially correct 3D pursuit command. We then simulated conditions of 1) head roll-induced ocular counterroll, 2) oblique gaze-induced retinal rotations, 3) eccentric gazes (invoking the half-angle rule), and 4) optokinetic nystagmus to investigate how units in the intermediate layers of the network accounted for different 3D constraints. Simultaneously, we simulated electrophysiological recordings (visual and motor tunings) and microstimulation experiments to quantify the reference frames of signals at each processing stage. We found a gradual retinal-to-intermediate-to-spatial feedforward transformation through the hidden layers. Our model is the first to describe the general 3D transformation for smooth pursuit mediated by eye- and head-dependent gain modulation. Based on several testable experimental predictions, our model provides a mechanism by which the brain could perform the 3D visuomotor transformation for smooth pursuit. PMID:25475344
2011-01-01
Background: Two-component regulatory systems are the primary form of signal transduction in bacteria. Although genomic binding sites have been determined for several eukaryotic and bacterial transcription factors, comprehensive identification of gene targets of two-component response regulators remains challenging due to the lack of knowledge of the signals required for their activation. We focused our study on Desulfovibrio vulgaris Hildenborough, a sulfate-reducing bacterium that encodes unusually diverse and largely uncharacterized two-component signal transduction systems. Results: We report the first systematic mapping of the genes regulated by all transcriptionally acting response regulators in a single bacterium. Our results enabled functional predictions for several response regulators and include key processes of carbon, nitrogen and energy metabolism, cell motility and biofilm formation, and responses to stresses such as nitrite, low potassium and phosphate starvation. Our study also led to the prediction of new genes and regulatory networks, which found corroboration in a compendium of transcriptome data available for D. vulgaris. For several regulators we predicted and experimentally verified the binding site motifs, most of which were discovered as part of this study. Conclusions: The gene targets identified for the response regulators allowed strong functional predictions to be made for the corresponding two-component systems. By tracking the D. vulgaris regulators and their motifs outside the Desulfovibrio spp. we provide testable hypotheses regarding the functions of orthologous regulators in other organisms. The in vitro array-based method optimized here is generally applicable for the study of such systems in all organisms. PMID:21992415
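A binding-site search of the kind used to verify motifs can be sketched as a simple position-weight-matrix (PWM) scan; the matrix, sequence, and score threshold below are toy values, not the motifs reported in the study.

```python
import math

# Toy position weight matrix (PWM) for a hypothetical response-regulator
# binding motif; one dict of base probabilities per motif position.
pwm = [
    {"A": 0.7, "C": 0.1, "G": 0.1, "T": 0.1},
    {"A": 0.1, "C": 0.1, "G": 0.1, "T": 0.7},
    {"A": 0.1, "C": 0.1, "G": 0.7, "T": 0.1},
    {"A": 0.7, "C": 0.1, "G": 0.1, "T": 0.1},
]
background = 0.25  # uniform background base frequency

def score(window):
    """Log-odds score of a sequence window against the PWM."""
    return sum(math.log2(pwm[i][b] / background) for i, b in enumerate(window))

genome = "CCATGAATGATTGAATGA"  # toy promoter sequence
for pos in range(len(genome) - len(pwm) + 1):
    s = score(genome[pos:pos + len(pwm)])
    if s > 2.0:  # arbitrary reporting threshold
        print(f"candidate site at {pos}: {genome[pos:pos+len(pwm)]} (score {s:.2f})")
```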
Modulation of hippocampal rhythms by subthreshold electric fields and network topology
Berzhanskaya, Julia; Chernyy, Nick; Gluckman, Bruce J.; Schiff, Steven J.; Ascoli, Giorgio A.
2012-01-01
Theta (4–12 Hz) and gamma (30–80 Hz) rhythms are considered important for cortical and hippocampal function. Although several neuron types are implicated in rhythmogenesis, the exact cellular mechanisms remain unknown. Subthreshold electric fields provide a flexible, area-specific tool to modulate neural activity and directly test functional hypotheses. Here we present experimental and computational evidence of the interplay among hippocampal synaptic circuitry, neuronal morphology, external electric fields, and network activity. Electrophysiological data are used to constrain and validate an anatomically and biophysically realistic model of area CA1 containing pyramidal cells and two interneuron types: dendritic- and perisomatic-targeting. We report two lines of results: addressing the network structure capable of generating theta-modulated gamma rhythms, and demonstrating electric field effects on those rhythms. First, theta-modulated gamma rhythms require specific inhibitory connectivity. In one configuration, GABAergic axo-dendritic feedback on pyramidal cells is only effective in proximal but not distal layers. An alternative configuration requires two distinct perisomatic interneuron classes, one exclusively receiving excitatory contacts, the other additionally targeted by inhibition. These observations suggest novel roles for particular classes of oriens and basket cells. The second major finding is that subthreshold electric fields robustly alter the balance between different rhythms. Independent of network configuration, positive electric fields decrease, while negative fields increase the theta/gamma ratio. Moreover, electric fields differentially affect average theta frequency depending on specific synaptic connectivity. These results support the testable prediction that subthreshold electric fields can alter hippocampal rhythms, suggesting new approaches to explore their cognitive functions and underlying circuitry. PMID:23053863
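Computationally, the theta/gamma balance reported above reduces to a ratio of band powers. A minimal sketch, assuming a synthetic local field potential and standard Welch spectral estimation:

```python
import numpy as np
from scipy.signal import welch

fs = 1000.0                         # sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)

# Synthetic LFP: theta (8 Hz) plus theta-modulated gamma (40 Hz) plus noise.
theta = np.sin(2 * np.pi * 8 * t)
gamma = 0.3 * (1 + theta) * np.sin(2 * np.pi * 40 * t)
lfp = theta + gamma + 0.2 * rng.standard_normal(t.size)

def bandpower(x, lo, hi):
    """Integrated Welch power spectral density in the [lo, hi] Hz band."""
    f, pxx = welch(x, fs=fs, nperseg=2048)
    mask = (f >= lo) & (f <= hi)
    return np.trapz(pxx[mask], f[mask])

ratio = bandpower(lfp, 4, 12) / bandpower(lfp, 30, 80)
print(f"theta/gamma power ratio: {ratio:.2f}")
```

In this framing, the paper's prediction is simply that positive subthreshold fields push this ratio down while negative fields push it up.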
Yu, Chenggang; Boutté, Angela; Yu, Xueping; Dutta, Bhaskar; Feala, Jacob D; Schmid, Kara; Dave, Jitendra; Tawa, Gregory J; Wallqvist, Anders; Reifman, Jaques
2015-02-01
The multifactorial nature of traumatic brain injury (TBI), especially the complex secondary tissue injury involving intertwined networks of molecular pathways that mediate cellular behavior, has confounded attempts to elucidate the pathology underlying the progression of TBI. Here, systems biology strategies are exploited to identify novel molecular mechanisms and protein indicators of brain injury. To this end, we performed a meta-analysis of four distinct high-throughput gene expression studies involving different animal models of TBI. By using canonical pathways and a large human protein-interaction network as a scaffold, we separately overlaid the gene expression data from each study to identify molecular signatures that were conserved across the different studies. At 24 hr after injury, the significantly activated molecular signatures were nonspecific to TBI, whereas the significantly suppressed molecular signatures were specific to the nervous system. In particular, we identified a suppressed subnetwork consisting of 58 highly interacting, coregulated proteins associated with synaptic function. We selected three proteins from this subnetwork, postsynaptic density protein 95, nitric oxide synthase 1, and disrupted in schizophrenia 1, and hypothesized that their abundance would be significantly reduced after TBI. In a penetrating ballistic-like brain injury rat model of severe TBI, Western blot analysis confirmed our hypothesis. In addition, our analysis recovered 12 previously identified protein biomarkers of TBI. The results suggest that systems biology may provide an efficient, high-yield approach to generate testable hypotheses that can be experimentally validated to identify novel mechanisms of action and molecular indicators of TBI. © 2014 The Authors. Journal of Neuroscience Research Published by Wiley Periodicals, Inc.
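The subnetwork-extraction step (overlaying expression changes on a protein-interaction scaffold and keeping coherently suppressed, connected components) can be sketched as follows; the edges, fold-changes, and threshold are hypothetical, with only the three validated gene names (DLG4/PSD-95, NOS1, DISC1) taken from the abstract.

```python
import networkx as nx

# Toy protein-interaction network; the edge list is hypothetical.
ppi = nx.Graph()
ppi.add_edges_from([
    ("DLG4", "NOS1"), ("NOS1", "DISC1"), ("DLG4", "DISC1"),
    ("DLG4", "GRIN1"), ("TNF", "IL6"), ("IL6", "STAT3"),
])

# Hypothetical log2 fold-changes at 24 h post-injury (negative = suppressed).
fold_change = {"DLG4": -1.8, "NOS1": -1.2, "DISC1": -1.5,
               "GRIN1": -0.9, "TNF": 2.1, "IL6": 1.7, "STAT3": 0.4}

# Keep only strongly suppressed proteins, then report connected subnetworks,
# analogous to the coregulated synaptic subnetwork described above.
suppressed = [n for n in ppi if fold_change.get(n, 0.0) < -0.8]
sub = ppi.subgraph(suppressed)
for component in nx.connected_components(sub):
    print("suppressed subnetwork:", sorted(component))
```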
Curiosity at Vera Rubin Ridge: Testable Hypotheses, First Results, and Implications for Habitability
NASA Astrophysics Data System (ADS)
Fraeman, A.; Bedford, C.; Bridges, J.; Edgar, L. A.; Hardgrove, C.; Horgan, B. H. N.; Gabriel, T. S. J.; Grotzinger, J. P.; Gupta, S.; Johnson, J. R.; Rampe, E. B.; Morris, R. V.; Salvatore, M. R.; Schwenzer, S. P.; Stack, K.; Pinet, P. C.; Rubin, D. M.; Weitz, C. M.; Wellington, D. F.; Wiens, R. C.; Williams, A. J.; Vasavada, A. R.
2017-12-01
As of sol 1756, Curiosity was 250 meters from ascending Vera Rubin Ridge, a unique geomorphic feature preserved in the lower foothills of Aeolis Mons (informally known as Mt. Sharp) that is distinguishable from orbit. Vera Rubin Ridge (previously termed the Hematite Ridge) is characterized by a higher thermal inertia than the surrounding terrain, is comparatively resistant to erosion, and is capped with a hematite-bearing layer that is visible in 18 m/pixel CRISM data. A key hypothesis associated with this unit is that it represents a redox interface where ferrous iron oxidized and precipitated either as hematite or another ferric precursor. The Curiosity integrated payload is being used to determine the depositional environment(s), stratigraphic context and geochemical conditions associated with this interface, all of which will provide key insights into its past habitability potential and the relative timing of processes. Specifically, analysis of Curiosity data will address four major questions related to the history and evolution of ridge-forming strata: (1) What is the stratigraphic relationship between the units in the ridge and the Mt. Sharp group (see Grotzinger et al., 2015)? (2) What primary and secondary geologic processes deposited and modified the ridge units over time? (3) What is the nature and timing of the hematite precipitation environment, and how does it relate to similar oxidized phases in the Murray formation? (4) What are the implications for habitability and the preservation of organic molecules? Initial results of a systematic imaging campaign along the contact between the lower portion of the ridge and the Murray formation have revealed dm-scale cross bedding within the ridge stratigraphy, which provides clues about the depositional environments; these can be compared to suites of sedimentary structures within the adjacent Murray formation. Long distance ChemCam passive and Mastcam multispectral data show that hematite and likely other ferric phases are present in the upper ridge, consistent with orbital data. Curiosity will continue to take systematic observations that draw upon testable hypotheses about the ridge environments as the rover ascends Vera Rubin Ridge.
Age-Related Macular Degeneration: Genetics and Biology Coming Together
Fritsche, Lars G.; Fariss, Robert N.; Stambolian, Dwight; Abecasis, Gonçalo R.; Curcio, Christine A.
2014-01-01
Genetic and genomic studies have enhanced our understanding of complex neurodegenerative diseases that exert a devastating impact on individuals and society. One such disease, age-related macular degeneration (AMD), is a major cause of progressive and debilitating visual impairment. Since the pioneering discovery in 2005 of complement factor H (CFH) as a major AMD susceptibility gene, extensive investigations have confirmed 19 additional genetic risk loci, and more are anticipated. In addition to common variants identified by now-conventional genome-wide association studies, targeted genomic sequencing and exome-chip analyses are uncovering rare variant alleles of high impact. Here, we provide a critical review of the ongoing genetic studies and of common and rare risk variants at a total of 20 susceptibility loci, which together explain 40–60% of the disease heritability but provide limited power for diagnostic testing of disease risk. Identification of these susceptibility loci has begun to untangle the complex biological pathways underlying AMD pathophysiology, pointing to new testable paradigms for treatment. PMID:24773320
Bogenschutz, Michael P; Pommy, Jessica M
2012-01-01
Alcohol and drug addiction are major public health problems, and existing treatments are only moderately effective. Although there has been interest for over half a century in the therapeutic use of classic hallucinogens to treat addictions, clinical research with these drugs was halted at an early stage in the early 1970s, leaving many fundamental questions unanswered. In the past two decades, clinical research on classic hallucinogens has resumed, although addiction treatment trials are only now beginning. The purpose of this paper is to provide a targeted review of the research most relevant to the therapeutic potential of hallucinogens, and to integrate this information with current thinking about addiction and recovery. On the basis of this information, we present a heuristic model which organizes a number of hypotheses that may be tested in future research. We conclude that existing evidence provides a convincing rationale for further research on the effects of classic hallucinogens in the treatment of addiction. Copyright © 2012 John Wiley & Sons, Ltd.
Jackson, Chris J; Izadikah, Zahra; Oei, Tian P S
2012-06-01
Jackson's (2005, 2008a) hybrid model of learning identifies a number of learning mechanisms that lead to the emergence and maintenance of the balance between rationality and irrationality. We test the general hypothesis that Jackson's model will predict depressive symptoms, such that poor learning is related to depression. We draw comparisons between Jackson's model and Ellis' (2004) Rational Emotive Behavior Therapy and Theory (REBT) and thereby provide a set of testable learning mechanisms potentially underlying REBT. Eighty patients diagnosed with depression completed the learning styles profiler (LSP; Jackson, 2005) and two measures of depression. Results provide support for the proposed model of learning and further evidence that low rationality is a key predictor of depression. We conclude that the hybrid model of learning has the potential to explain some of the learning and cognitive processes related to the development and maintenance of irrational beliefs and depression. Copyright © 2011. Published by Elsevier B.V.
The evolution of speech: a comparative review.
Fitch
2000-07-01
The evolution of speech can be studied independently of the evolution of language, with the advantage that most aspects of speech acoustics, physiology and neural control are shared with animals, and thus open to empirical investigation. At least two changes were necessary prerequisites for modern human speech abilities: (1) modification of vocal tract morphology, and (2) development of vocal imitative ability. Despite an extensive literature, attempts to pinpoint the timing of these changes using fossil data have proven inconclusive. However, comparative data bear on both changes. First, recent data from nonhuman primates have shed light on the ancestral use of formants (a crucial cue in human speech) to identify individuals and gauge body size. Second, comparative analysis of the diverse vertebrates that have evolved vocal imitation (humans, cetaceans, seals and birds) provides several distinct, testable hypotheses about the adaptive function of vocal mimicry. These developments suggest that, for understanding the evolution of speech, comparative analysis of living species provides a viable alternative to fossil data. However, the neural basis for vocal mimicry and for mimesis in general remains unknown.
Integrating Environmental Genomics and Biogeochemical Models: a Gene-centric Approach
NASA Astrophysics Data System (ADS)
Reed, D. C.; Algar, C. K.; Huber, J. A.; Dick, G.
2013-12-01
Rapid advances in molecular microbial ecology have yielded an unprecedented amount of data about the evolutionary relationships and functional traits of microbial communities that regulate global geochemical cycles. Biogeochemical models, however, are trailing in the wake of the environmental genomics revolution and such models rarely incorporate explicit representations of bacteria and archaea, nor are they compatible with nucleic acid or protein sequence data. Here, we present a functional gene-based framework for describing microbial communities in biogeochemical models that uses genomics data and provides predictions that are readily testable using cutting-edge molecular tools. To demonstrate the approach in practice, nitrogen cycling in the Arabian Sea oxygen minimum zone (OMZ) was modelled to examine key questions about cryptic sulphur cycling and dinitrogen production pathways in OMZs. By directly linking geochemical dynamics to the genetic composition of microbial communities, the method provides mechanistic insights into patterns and biogeochemical consequences of marine microbes. Such an approach is critical for informing our understanding of the key role microbes play in modulating Earth's biogeochemistry.
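The gene-centric idea, tracking a functional gene pool as a state variable coupled to chemistry, can be sketched with a minimal Monod-type ODE; the denitrification-like reaction and all parameter values are illustrative, not those of the Arabian Sea OMZ model.

```python
from scipy.integrate import solve_ivp

# Minimal gene-centric model: a denitrification "gene" pool G grows by
# consuming nitrate N via Monod kinetics; parameter values are illustrative.
mu_max, K_N, Y, m = 1.0, 5.0, 0.1, 0.05  # 1/day, uM, gene yield per uM, 1/day

def rhs(t, y):
    N, G = y
    uptake = mu_max * N / (K_N + N) * G
    return [-uptake,             # nitrate consumed
            Y * uptake - m * G]  # gene pool grows, decays at rate m

sol = solve_ivp(rhs, (0, 30), [20.0, 0.1], dense_output=True)
for day in (0, 10, 20, 30):
    N, G = sol.sol(day)
    print(f"day {day:2d}: nitrate = {N:6.2f} uM, gene abundance = {G:6.2f}")
```

The testable quantity here is the gene abundance trajectory itself, which is exactly what amplicon or metagenomic time series measure.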
Catastrophic desert formation in Daisyworld.
Ackland, Graeme J; Clark, Michael A; Lenton, Timothy M
2003-07-07
Feedback between life and its environment is ubiquitous but the strength of coupling and its global implications remain hotly debated. Abrupt changes in the abundance of life for small changes in forcing provide one indicator of regulation, for example, when vegetation-climate feedback collapses in the formation of a desert. Here we use a two-dimensional "Daisyworld" model with curvature to show that catastrophic collapse of life under gradual forcing provides a testable indicator of environmental feedback. When solar luminosity increases to a critical value, a desert forms across a wide band of the planet. The scale of collapse depends on the strength of feedback. The efficiency of temperature regulation is limited by mutation rate in an analogous manner to the limitation of adaptive fitness in evolutionary theories. The final state of the system emerging from single-site rules can be described by two global quantities: optimization of temperature regulation and maximization of diversity, which are mathematically analogous to energy and entropy in thermodynamics.
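The abrupt collapse under gradual forcing appears even in the classic zero-dimensional Watson-Lovelock Daisyworld, of which the curved 2D model above is an extension. This sketch ramps luminosity and prints equilibrium daisy cover, using standard textbook parameter values rather than the paper's.

```python
import numpy as np

# Zero-dimensional Watson-Lovelock Daisyworld: ramp solar luminosity and
# watch total daisy cover change abruptly (desert formation).
S, sigma = 917.0, 5.67e-8            # insolation (W/m^2), Stefan-Boltzmann
A_g, A_w, A_b = 0.50, 0.75, 0.25     # albedos: bare ground, white, black
q, gamma, g = 2.06e9, 0.3, 3.265e-3  # heat transfer (K^4), death rate, growth width

def equilibrate(L, aw=0.01, ab=0.01, steps=5000, dt=0.02):
    for _ in range(steps):
        x = max(1.0 - aw - ab, 0.0)                 # bare ground fraction
        A = x * A_g + aw * A_w + ab * A_b           # planetary albedo
        Te4 = S * L * (1.0 - A) / sigma             # emission temperature^4
        Tw = (q * (A - A_w) + Te4) ** 0.25          # local daisy temperatures (K)
        Tb = (q * (A - A_b) + Te4) ** 0.25
        bw = max(1.0 - g * (295.5 - Tw) ** 2, 0.0)  # parabolic growth rates
        bb = max(1.0 - g * (295.5 - Tb) ** 2, 0.0)
        aw += dt * aw * (x * bw - gamma)            # replicator dynamics
        ab += dt * ab * (x * bb - gamma)
        aw, ab = max(aw, 0.001), max(ab, 0.001)     # reseeding floor
    return aw, ab

for L in np.arange(0.6, 1.8, 0.1):
    aw, ab = equilibrate(L)
    print(f"L = {L:4.2f}: white {aw:5.3f}, black {ab:5.3f}, total {aw + ab:5.3f}")
```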
Kwan, Patrick; Arzimanoglou, Alexis; Berg, Anne T; Brodie, Martin J; Allen Hauser, W; Mathern, Gary; Moshé, Solomon L; Perucca, Emilio; Wiebe, Samuel; French, Jacqueline
2010-06-01
To improve patient care and facilitate clinical research, the International League Against Epilepsy (ILAE) appointed a Task Force to formulate a consensus definition of drug resistant epilepsy. The overall framework of the definition has two "hierarchical" levels: Level 1 provides a general scheme to categorize response to each therapeutic intervention, including a minimum dataset of knowledge about the intervention that would be needed; Level 2 provides a core definition of drug resistant epilepsy using a set of essential criteria based on the categorization of response (from Level 1) to trials of antiepileptic drugs. It is proposed as a testable hypothesis that drug resistant epilepsy is defined as failure of adequate trials of two tolerated, appropriately chosen and used antiepileptic drug schedules (whether as monotherapies or in combination) to achieve sustained seizure freedom. This definition can be further refined when new evidence emerges. The rationale behind the definition and the principles governing its proper use are discussed, and examples to illustrate its application in clinical practice are provided.
Rethinking intractable conflict: the perspective of dynamical systems.
Vallacher, Robin R; Coleman, Peter T; Nowak, Andrzej; Bui-Wrzosinska, Lan
2010-01-01
Intractable conflicts are demoralizing. Beyond destabilizing the families, communities, or international regions in which they occur, they tend to perpetuate the very conditions of misery and hate that contributed to them in the first place. Although the common factors and processes associated with intractable conflicts have been identified through research, they represent an embarrassment of riches for theory construction. Thus, the current task in this area is integrating these diverse factors into an account that provides a coherent perspective yet allows for prediction and a basis for conflict resolution in specific conflict settings. We suggest that the perspective of dynamical systems provides such an account. This article outlines the key concepts and hypotheses associated with this approach. It is organized around a set of basic questions concerning intractable conflict for which the dynamical perspective offers fresh insight and testable propositions. The questions and answers are intended to provide readers with basic concepts and principles of complexity and dynamical systems that are useful for rethinking the nature of intractable conflict and the means by which such conflict can be transformed. Copyright 2010 APA, all rights reserved.
Pre-Service Teacher Scientific Behavior: Comparative Study of Paired Science Project Assignments
ERIC Educational Resources Information Center
Bulunuz, Mizrap; Tapan Broutin, Menekse Seden; Bulunuz, Nermin
2016-01-01
Problem Statement: University students usually lack the skills to rigorously define a multi-dimensional real-life problem and its limitations in an explicit, clear and testable way, which prevents them from forming a reliable method, obtaining relevant results and making balanced judgments to solve a problem. Purpose of the Study: The study…
ERIC Educational Resources Information Center
Nauta, Margaret M.
2010-01-01
This article celebrates the 50th anniversary of the introduction of John L. Holland's (1959) theory of vocational personalities and work environments by describing the theory's development and evolution, its instrumentation, and its current status. Hallmarks of Holland's theory are its empirical testability and its user-friendliness. By…
Steering Performance, Tactical Vehicles
2015-07-29
[Table-of-contents fragment] 4.1 General Vehicle and Test Characterization; 4.2 Weave Test. ...able to be driven in a straight line without steer input (i.e., "hands free"). If the vehicle pulls in either direction, the alignment should be... Evaluation Center (AEC) prior to using military personnel as test participants. 4. TEST PROCEDURES. 4.1 General Vehicle and Test Characterization.
Binding and Scope Dependencies with "Floating Quantifiers" in Japanese
ERIC Educational Resources Information Center
Mukai, Emi
2012-01-01
The primary concern of this thesis is how we can achieve rigorous testability when we set the properties of the Computational System (hypothesized to be at the center of the language faculty) as our object of inquiry and informant judgments as a tool to construct and/or evaluate our hypotheses concerning the properties of the Computational System.…
The Many Methods to Measure Testability: A Horror Story.
1988-04-01
it seems overly simplistic to assign only one "magic number" as a viable design goal. Different design technologies such as digital, analog, mechanical... [table fragment listing candidate testability measures: failure rate, basic test program, ATLAS test program, EDIF file, test strategy flowchart, RTOK frequency, diagnosis average cost]
The Social Basis of Math Teaching and Learning. Final Report.
ERIC Educational Resources Information Center
Orvik, James M.; Van Veldhuizen, Philip A.
This study was designed to identify a set of research questions and testable hypotheses to aid in planning long-range research. Five mathematics teachers were selected. These instructors enrolled in a special project-related seminar, video-taped sessions of their own mathematics classes, and kept field journals. The group met once a week to…
ERIC Educational Resources Information Center
Hunter, Lora Rose; Schmidt, Norman B.
2010-01-01
In this review, the extant literature concerning anxiety psychopathology in African American adults is summarized to develop a testable, explanatory framework with implications for future research. The model was designed to account for purported lower rates of anxiety disorders in African Americans compared to European Americans, along with other…
ERIC Educational Resources Information Center
Kulczynska, Agnieszka; Johnson, Reed; Frost, Tony; Margerum, Lawrence D.
2011-01-01
An advanced undergraduate laboratory project is described that integrates inorganic, analytical, physical, and biochemical techniques to reveal differences in binding between cationic metal complexes and anionic DNA (herring testes). Students were guided to formulate testable hypotheses based on the title question and a list of different metal…
ERIC Educational Resources Information Center
Duncan-Wiles, Daphne S.
2012-01-01
With the recent addition of engineering to most K-12 testable state standards, efficient and comprehensive instruments are needed to assess changes in student knowledge and perceptions of engineering. In this study, I developed the Students' Awareness and Perceptions of Learning Engineering (STAPLE) instrument to quantitatively measure fourth…
Wichita's Hispanics: Tensions, Concerns, and the Migrant Stream.
ERIC Educational Resources Information Center
Johnson, Kenneth F.; And Others
In an attempt to formulate a set of testable propositions about the dynamics of Hispanic life that will be valuable pedagogically and as a basis for public policy formation, this study assesses the impact of Hispanic Americans on Wichita, Kansas. Chapter 1 identifies the Hispanic origins of Kansas' 63,339 Hispanics who represent 2.7% of the…
Improving Health Care for Assisted Living Residents
ERIC Educational Resources Information Center
Kane, Robert L.; Mach, John R., Jr.
2007-01-01
Purpose: The purpose of this article is to explore how medical care is delivered to older people in assisted living (AL) settings and to suggest ways for improving it. Design and Methods: We present a review of the limited research available on health care for older AL residents and on building testable models of better ways to organize primary…
ERIC Educational Resources Information Center
Holden, Richard J.; Karsh, Ben-Tzion
2009-01-01
Primary objective: much research and practice related to the design and implementation of information technology in health care has been atheoretical. It is argued that using extant theory to develop testable models of health information technology (HIT) benefits both research and practice. Methods and procedures: several theories of motivation,…
Interpreting clinical trial results by deductive reasoning: In search of improved trial design.
Kurbel, Sven; Mihaljević, Slobodan
2017-10-01
Clinical trial results are often interpreted by inductive reasoning, in a trial design-limited manner, directed toward modifications of the current clinical practice. Deductive reasoning is an alternative in which results of relevant trials are combined in indisputable premises that lead to a conclusion easily testable in future trials. © 2017 WILEY Periodicals, Inc.
The part of cognitive science that is philosophy.
Dennett, Daniel C
2009-04-01
There is much good work for philosophers to do in cognitive science if they adopt the constructive attitude that prevails in science, work toward testable hypotheses, and take on the task of clarifying the relationship between the scientific concepts and the everyday concepts with which we conduct our moral lives. Copyright © 2009 Cognitive Science Society, Inc.
A Progress Report on a Thinking Laboratory for Deaf Children.
ERIC Educational Resources Information Center
Wolff, Sydney
A study was undertaken at the West Virginia School for the Deaf to test the assumption that the modes of thought of deaf children could be improved, and that improvement in concept formation would result in improvement in testable areas. Sixteen primary school children of approximately equal ability were selected and paired to form the control and…
Induction and modulation of persistent activity in a layer V PFC microcircuit model.
Papoutsi, Athanasia; Sidiropoulou, Kyriaki; Cutsuridis, Vassilis; Poirazi, Panayiota
2013-01-01
Working memory refers to the temporary storage of information and is strongly associated with the prefrontal cortex (PFC). Persistent activity of cortical neurons, namely activity that persists beyond the stimulus presentation, is considered the cellular correlate of working memory. Although past studies suggested that this type of activity is characteristic of large-scale networks, recent experimental evidence implies that small, tightly interconnected clusters of neurons in the cortex may support similar functionalities. However, very little is known about the biophysical mechanisms giving rise to persistent activity in small-sized microcircuits in the PFC. Here, we present a biophysically detailed, yet morphologically simplified, microcircuit model of layer V PFC neurons that incorporates connectivity constraints and is validated against a multitude of experimental data. We show that (a) a small-sized network can exhibit persistent activity under realistic stimulus conditions. (b) Its emergence depends strongly on the interplay of dADP, NMDA, and GABAB currents. (c) Although increases in stimulus duration increase the probability of persistent activity induction, variability in the stimulus firing frequency does not consistently influence it. (d) Modulation of ionic conductances (Ih, ID, IsAHP, ICaL, ICaN, ICaR) differentially controls persistent activity properties in a location-dependent manner. These findings suggest that modulation of the microcircuit's firing characteristics is achieved primarily through changes in its intrinsic mechanism makeup, supporting the hypothesis of multiple bi-stable units in the PFC. Overall, the model generates a number of experimentally testable predictions that may lead to a better understanding of the biophysical mechanisms of persistent activity induction and modulation in the PFC.
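The bi-stable-unit hypothesis can be illustrated with a two-variable rate model: a slow NMDA/dADP-like positive-feedback variable lets a unit switch into self-sustained firing after a transient cue. All parameters here are invented for illustration and are far simpler than the published biophysical model.

```python
import numpy as np

dt, T = 1.0, 3000.0                  # time step and duration (ms)
tau_r, tau_s = 20.0, 100.0           # firing-rate and slow-gating time constants
w_rec, theta = 2.2, 0.6              # recurrent weight, activation threshold

def f(x):
    """Sigmoidal transfer function."""
    return 1.0 / (1.0 + np.exp(-(x - theta) / 0.1))

r, s = 0.0, 0.0
for step in range(int(T / dt)):
    t = step * dt
    stim = 0.8 if 500.0 <= t < 750.0 else 0.0    # transient cue input
    s += dt * (-s + r) / tau_s                   # slow NMDA/dADP-like variable
    r += dt * (-r + f(w_rec * s + stim)) / tau_r # firing rate
    if step % 500 == 0:
        print(f"t = {t:6.0f} ms  rate = {r:5.3f}")
```

The rate stays near zero until the cue, then remains elevated long after the cue ends, which is the persistent-activity signature the abstract refers to.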
Brook, Bindi S.
2017-01-01
The chemokine receptor CCR7 drives leukocyte migration into and within lymph nodes (LNs). It is activated by chemokines CCL19 and CCL21, which are scavenged by the atypical chemokine receptor ACKR4. CCR7-dependent navigation is determined by the distribution of extracellular CCL19 and CCL21, which form concentration gradients at specific microanatomical locations. The mechanisms underpinning the establishment and regulation of these gradients are poorly understood. In this article, we have incorporated multiple biochemical processes describing the CCL19–CCL21–CCR7–ACKR4 network into our model of LN fluid flow to establish a computational model to investigate intranodal chemokine gradients. Importantly, the model recapitulates CCL21 gradients observed experimentally in B cell follicles and interfollicular regions, building confidence in its ability to accurately predict intranodal chemokine distribution. Parameter variation analysis indicates that the directionality of these gradients is robust, but their magnitude is sensitive to these key parameters: chemokine production, diffusivity, matrix binding site availability, and CCR7 abundance. The model indicates that lymph flow shapes intranodal CCL21 gradients, and that CCL19 is functionally important at the boundary between B cell follicles and the T cell area. It also predicts that ACKR4 in LNs prevents CCL19/CCL21 accumulation in efferent lymph, but does not control intranodal gradients. Instead, it attributes the disrupted interfollicular CCL21 gradients observed in Ackr4-deficient LNs to ACKR4 loss upstream. Our novel approach has therefore generated new testable hypotheses and alternative interpretations of experimental data. Moreover, it acts as a framework to investigate gradients at other locations, including those that cannot be visualized experimentally or involve other chemokines. PMID:28807994
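At its core, such a gradient prediction is a reaction-diffusion balance between chemokine production, diffusion, and first-order loss (matrix binding plus scavenging). A one-dimensional steady-state sketch, with illustrative rather than fitted parameter values:

```python
import numpy as np

# 1D steady-state sketch of an intranodal CCL21 gradient: fixed source at one
# boundary (follicle edge), diffusion, first-order loss. Values illustrative.
L, n = 200e-6, 200                 # domain length (m), grid points
D, k = 1e-10, 5e-3                 # diffusivity (m^2/s), loss rate (1/s)
dx = L / (n - 1)

# Solve D*c'' - k*c = 0 with c(0) = 1 and no-flux at x = L, by finite differences.
A = np.zeros((n, n))
b = np.zeros(n)
A[0, 0], b[0] = 1.0, 1.0                 # fixed concentration at the source
for i in range(1, n - 1):
    A[i, i - 1] = A[i, i + 1] = D / dx**2
    A[i, i] = -2 * D / dx**2 - k
A[-1, -1], A[-1, -2] = 1.0, -1.0         # no-flux boundary condition
c = np.linalg.solve(A, b)

print(f"analytic decay length ~ {np.sqrt(D / k) * 1e6:.0f} um")
for frac in (0.0, 0.25, 0.5, 1.0):
    i = int(frac * (n - 1))
    print(f"x = {frac * L * 1e6:5.0f} um: c = {c[i]:.3f}")
```

The parameter-variation finding quoted above corresponds to the fact that the gradient's magnitude, but not its direction, shifts as D, k, and the source strength are varied.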
Brain Evolution and Human Neuropsychology: The Inferential Brain Hypothesis
Koscik, Timothy R.; Tranel, Daniel
2013-01-01
Collaboration between human neuropsychology and comparative neuroscience has generated invaluable contributions to our understanding of human brain evolution and function. Further cross-talk between these disciplines has the potential to continue to revolutionize these fields. Modern neuroimaging methods could be applied in a comparative context, yielding exciting new data with the potential of providing insight into brain evolution. Conversely, incorporating an evolutionary base into the theoretical perspectives from which we approach human neuropsychology could lead to novel hypotheses and testable predictions. In the spirit of these objectives, we present here a new theoretical proposal, the Inferential Brain Hypothesis, whereby the human brain is thought to be characterized by a shift from perceptual processing to inferential computation, particularly within the social realm. This shift is believed to be a driving force for the evolution of the large human cortex. PMID:22459075
How hierarchical is language use?
Frank, Stefan L.; Bod, Rens; Christiansen, Morten H.
2012-01-01
It is generally assumed that hierarchical phrase structure plays a central role in human language. However, considerations of simplicity and evolutionary continuity suggest that hierarchical structure should not be invoked too hastily. Indeed, recent neurophysiological, behavioural and computational studies show that sequential sentence structure has considerable explanatory power and that hierarchical processing is often not involved. In this paper, we review evidence from the recent literature supporting the hypothesis that sequential structure may be fundamental to the comprehension, production and acquisition of human language. Moreover, we provide a preliminary sketch outlining a non-hierarchical model of language use and discuss its implications and testable predictions. If linguistic phenomena can be explained by sequential rather than hierarchical structure, this will have considerable impact in a wide range of fields, such as linguistics, ethology, cognitive neuroscience, psychology and computer science. PMID:22977157
Chiral primordial blue tensor spectra from the axion-gauge couplings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Obata, Ippei, E-mail: obata@tap.scphys.kyoto-u.ac.jp
We suggest a new feature of primordial gravitational waves sourced by axion-gauge couplings, whose forms are motivated by the dimensional reduction of the form field in string theory. In our inflationary model, as the inflaton we adopt two types of axion, dubbed the model-independent axion and the model-dependent axion, which couple to two gauge groups with opposite sign combinations. Due to these couplings, both polarization modes of the gauge fields are amplified and enhance both helicities of the tensor modes during inflation. We point out the possibility that primordial blue-tilted tensor power spectra with small chirality are produced by the combination of these axion-gauge couplings; intriguingly, both the amplitudes and the chirality are potentially testable by future space-based gravitational-wave interferometers such as the DECIGO and BBO projects.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vadhavkar, Nikhil; Pham, Christopher; Georgescu, Walter
In contrast to the classic view of static DNA double-strand breaks (DSBs) being repaired at the site of damage, we hypothesize that DSBs move and merge with each other over large distances (μm). As X-ray dose increases, the probability of having DSB clusters increases, as does the probability of misrepair and cell death. Experimental work characterizing the X-ray dose dependence of radiation-induced foci (RIF) in nonmalignant human mammary epithelial cells (MCF10A) is used here to validate a DSB clustering model. We then use the principles of the local effect model (LEM) to predict the yield of DSBs at the submicron level. Two mechanisms for DSB clustering, namely random coalescence of DSBs versus active movement of DSBs into repair domains, are compared and tested. Simulations that best predicted both RIF dose dependence and cell survival after X-ray irradiation favored the repair domain hypothesis, suggesting the nucleus is divided into an array of regularly spaced repair domains with ~1.55 μm sides. Applying the same approach to high-linear energy transfer (LET) ion tracks, we are able to predict experimental RIF/μm along tracks with an overall relative error of 12%, for LET ranging between 30 and 350 keV/μm and for three different ions. Finally, cell death was predicted by assuming an exponential dependence on the total number of DSBs and on all possible combinations of paired DSBs within each simulated RIF. Relative biological effectiveness (RBE) predictions for cell survival of MCF10A exposed to high-LET radiation showed an LET dependence that matches previous experimental results for similar cell types. Overall, this work suggests that microdosimetric properties of ion tracks at the submicron level are sufficient to explain both RIF data and survival curves for any LET, similarly to the LEM assumption. Conversely, the high-LET death mechanism does not have to invoke the linear-quadratic dose formalism, as is done in the LEM. In addition, the size of the repair domains derived in our model is based on experimental RIF and is three times larger than the hypothetical LEM voxel used to fit survival curves. Our model is therefore an alternative to previous approaches that provides a testable biological mechanism (i.e., RIF). In addition, we propose that DSB pairing will help develop more accurate alternatives to the linear no-threshold (LNT) cancer risk model currently used for regulating exposure to very low levels of ionizing radiation.
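The two clustering mechanisms compared above can be caricatured in a few lines: count RIF either as occupied repair domains on a regular grid or as clusters formed by random coalescence within a merge radius. The nucleus size, DSB yields, and merge radius are illustrative; only the ~1.55 μm domain size is taken from the abstract.

```python
import numpy as np

rng = np.random.default_rng(1)

nucleus = 10.0       # nuclear cross-section side (um), illustrative
domain = 1.55        # repair-domain side (um), as estimated above
merge_radius = 0.5   # coalescence distance (um), hypothetical

def rif_repair_domains(dsbs):
    """RIF = number of distinct repair domains containing >= 1 DSB."""
    cells = set(map(tuple, np.floor(dsbs / domain).astype(int)))
    return len(cells)

def rif_random_coalescence(dsbs):
    """RIF = clusters after greedily merging DSBs closer than merge_radius."""
    clusters = [d for d in dsbs]
    merged = True
    while merged:
        merged = False
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                if np.linalg.norm(clusters[i] - clusters[j]) < merge_radius:
                    clusters[i] = (clusters[i] + clusters[j]) / 2
                    clusters.pop(j)
                    merged = True
                    break
            if merged:
                break
    return len(clusters)

for n_dsbs in (10, 50, 200):     # DSB count grows with dose
    dsbs = rng.uniform(0, nucleus, size=(n_dsbs, 2))
    print(f"{n_dsbs:4d} DSBs: domains -> {rif_repair_domains(dsbs)} RIF, "
          f"coalescence -> {rif_random_coalescence(dsbs)} RIF")
```

The diagnostic difference is the dose dependence: domain counting saturates at the fixed number of domains, while coalescence keeps growing, which is what lets RIF data discriminate the two hypotheses.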
Taxes in a Labor Supply Model with Joint Wage-Hours Determination.
ERIC Educational Resources Information Center
Rosen, Harvey S.
1976-01-01
Payroll and progressive income taxes play an enormous role in the American fiscal system. The purpose of this study is to present some econometric evidence on the effects of taxes on married women, a group of growing importance in the American labor force. A testable model of labor supply is developed which permits statistical estimation of a…
ERIC Educational Resources Information Center
Nunez, Rafael
2012-01-01
"The Journal of the Learning Sciences" has devoted this special issue to the study of embodied cognition (as it applies to mathematics), a topic that for several decades has gained attention in the cognitive sciences and in mathematics education, in particular. In this commentary, the author aims to address crucial questions in embodied…
ERIC Educational Resources Information Center
Booker, Lucille M.
2012-01-01
Political discourse is an observable, measurable, and testable manifestation of political worldviews. However, when worldviews collide, notions of truth and of lies are put to the test. The challenge for researchers is how to establish confidence in their analysis. Despite the growing interest in deception research from a diversity of fields and…
Predictors of Organizational-Level Testability Attributes
1987-05-01
A. Elizabeth Gilreath; Brian A. Kelley. [report documentation page residue] ...BRU count. These counts are described in subsections 6.2.1.1 and 6.2.1.2 and are further subdivided in Figure 6-4. 6.2.1.1 Functional Cross...
Surface fire effects on conifer and hardwood crowns--applications of an integral plume model
Matthew Dickinson; Anthony Bova; Kathleen Kavanagh; Antoine Randolph; Lawrence Band
2009-01-01
An integral plume model was applied to the problems of tree death from canopy injury in dormant-season hardwoods and branch embolism in Douglas fir (Pseudotsuga menziesii) crowns. Our purpose was to generate testable hypotheses. We used the integral plume model to relate crown injury to bole injury and to explore the effects of variation in fire...
Analytical Procedures for Testability.
1983-01-01
Beat Internal Classifications", AD: A018516. "A System of Computer Aided Diagnosis with Blood Serum Chemistry Tests and Bayesian Statistics", AD: 786284...6 LIST OF TALS .. 1. Truth Table ......................................... 49 2. Covering Problem .............................. 93 3. Primary and...quential classification procedure in a coronary care ward is evaluated. In the toxicology field "A System of Computer Aided Diagnosis with Blood Serum
Analysis of optimality in natural and perturbed metabolic networks
Segrè, Daniel; Vitkup, Dennis; Church, George M.
2002-01-01
An important goal of whole-cell computational modeling is to integrate detailed biochemical information with biological intuition to produce testable predictions. Based on the premise that prokaryotes such as Escherichia coli have maximized their growth performance along evolution, flux balance analysis (FBA) predicts metabolic flux distributions at steady state by using linear programming. Corroborating earlier results, we show that recent intracellular flux data for wild-type E. coli JM101 display excellent agreement with FBA predictions. Although the assumption of optimality for a wild-type bacterium is justifiable, the same argument may not be valid for genetically engineered knockouts or other bacterial strains that were not exposed to long-term evolutionary pressure. We address this point by introducing the method of minimization of metabolic adjustment (MOMA), whereby we test the hypothesis that knockout metabolic fluxes undergo a minimal redistribution with respect to the flux configuration of the wild type. MOMA employs quadratic programming to identify a point in flux space, which is closest to the wild-type point, compatibly with the gene deletion constraint. Comparing MOMA and FBA predictions to experimental flux data for E. coli pyruvate kinase mutant PB25, we find that MOMA displays a significantly higher correlation than FBA. Our method is further supported by experimental data for E. coli knockout growth rates. It can therefore be used for predicting the behavior of perturbed metabolic networks, whose growth performance is in general suboptimal. MOMA and its possible future extensions may be useful in understanding the evolutionary optimization of metabolism. PMID:12415116
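A toy version of the FBA-then-MOMA calculation, on a hypothetical three-flux network rather than the genome-scale E. coli model: FBA is a linear program maximizing growth, and MOMA is a quadratic program finding the feasible knockout flux state closest to the wild type.

```python
import numpy as np
from scipy.optimize import linprog, minimize

# Toy network: metabolite balance uptake - v1 - v2 = 0; growth = 0.8*v1 + 0.5*v2.
# Flux vector columns: [uptake, v1, v2]. Stoichiometry is illustrative.
A_eq = np.array([[1.0, -1.0, -1.0]])
b_eq = np.array([0.0])
bounds = [(0, 10), (0, 10), (0, 10)]
growth = np.array([0.0, 0.8, 0.5])

# FBA: linear program maximizing growth (linprog minimizes, hence the sign flip).
wt = linprog(-growth, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
v_wt = wt.x
print("wild-type fluxes (FBA):", np.round(v_wt, 3))

# MOMA: knock out v1 (force it to zero) and find the feasible flux vector
# closest (Euclidean) to the wild-type solution via quadratic programming.
ko_bounds = [(0, 10), (0, 0), (0, 10)]
res = minimize(lambda v: np.sum((v - v_wt) ** 2), x0=np.zeros(3),
               constraints=[{"type": "eq", "fun": lambda v: A_eq @ v - b_eq}],
               bounds=ko_bounds)
print("knockout fluxes (MOMA):", np.round(res.x, 3))
print("predicted knockout growth:", round(float(growth @ res.x), 3))
```

As in the paper, the MOMA prediction deliberately lands at a suboptimal growth rate: the knockout inherits the wild-type flux configuration as closely as the deletion constraint allows.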
Stochastic and deterministic model of microbial heat inactivation.
Corradini, Maria G; Normand, Mark D; Peleg, Micha
2010-03-01
Microbial inactivation is described by a model based on the changing survival probabilities of individual cells or spores. It is presented in a stochastic and discrete form for small groups, and as a continuous deterministic model for larger populations. If the underlying mortality probability function remains constant throughout the treatment, the model generates first-order ("log-linear") inactivation kinetics. Otherwise, it produces survival patterns that include Weibullian ("power-law") with upward or downward concavity, tailing with a residual survival level, complete elimination, flat "shoulder" with linear or curvilinear continuation, and sigmoid curves. In both forms, the same algorithm or model equation applies to isothermal and dynamic heat treatments alike. Constructing the model does not require assuming a kinetic order or knowledge of the inactivation mechanism. The general features of its underlying mortality probability function can be deduced from the experimental survival curve's shape. Once identified, the function's coefficients, the survival parameters, can be estimated directly from the experimental survival ratios by regression. The model is testable in principle but matching the estimated mortality or inactivation probabilities with those of the actual cells or spores can be a technical challenge. The model is not intended to replace current models to calculate sterility. Its main value, apart from connecting the various inactivation patterns to underlying probabilities at the cellular level, might be in simulating the irregular survival patterns of small groups of cells and spores. In principle, it can also be used for nonthermal methods of microbial inactivation and their combination with heat.
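The stochastic-versus-deterministic correspondence can be sketched directly: give each cell a per-step survival probability derived from a Weibullian hazard and compare small groups with large populations. The shape and rate parameters below are arbitrary illustrations.

```python
import numpy as np

rng = np.random.default_rng(42)

# Each cell survives a step of length dt with probability exp(-h(t)*dt), where
# h(t) = b*n*t^(n-1) is the hazard of the Weibullian model S(t) = exp(-b*t^n).
b, n, dt, t_end = 0.05, 1.5, 0.1, 20.0
times = np.arange(dt, t_end + dt, dt)

def stochastic_survivors(N0):
    alive = N0
    for t in times:
        hazard = b * n * t ** (n - 1)             # instantaneous mortality rate
        alive = rng.binomial(alive, np.exp(-hazard * dt))
    return alive

for N0 in (10, 100, 100000):                      # small groups vs populations
    frac = stochastic_survivors(N0) / N0
    print(f"N0 = {N0:6d}: final survival ratio = {frac:.4g} "
          f"(deterministic: {np.exp(-b * t_end ** n):.4g})")
```

Large populations track the deterministic Weibull curve closely, while small groups show the irregular, all-or-nothing outcomes the abstract highlights as the model's main use.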
Migliore, Rosanna; De Simone, Giada; Leinekugel, Xavier; Migliore, Michele
2017-04-01
The possible effects on cognitive processes of external electric fields, such as those generated by power line pillars and household appliances, are of increasing public concern. They are difficult to study experimentally, and the relatively scarce and contradictory evidence makes it difficult to clearly assess these effects. In this study, we investigate how, why and to what extent external perturbations of intrinsic neuronal activity, such as those that can be caused by the generation, transmission and use of electrical energy, can affect neuronal activity during cognitive processes. For this purpose, we used a morphologically and biophysically realistic three-dimensional model of CA1 pyramidal neurons. The simulation findings suggest that an electric field oscillating at power-line frequency, and of environmentally measured strength, can significantly alter both the average firing rate and the temporal spike distribution properties of a hippocampal CA1 pyramidal neuron. This effect strongly depends on the specific and instantaneous relative spatial location of the neuron with respect to the field, and on the synaptic input properties. The model makes experimentally testable predictions on the possible functional consequences for normal hippocampal functions such as object recognition and spatial navigation. The results suggest that, although EF effects on cognitive processes may be unlikely to occur in everyday life, their functional consequences deserve some consideration, especially when they constitute a systematic presence in living environments. © 2016 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.
E-learning platform for automated testing of electronic circuits using signature analysis method
NASA Astrophysics Data System (ADS)
Gherghina, Cǎtǎlina; Bacivarov, Angelica; Bacivarov, Ioan C.; Petricǎ, Gabriel
2016-12-01
Dependability of electronic circuits can be ensured only through testing of circuit modules, by generating test vectors and applying them to the circuit. Testability should be viewed as a concerted effort to ensure maximum efficiency throughout the product life cycle, from the conception and design stage, through production, to repairs during the product's operation. In this paper, we present a platform developed by the authors for testability training in electronics in general, and in the signature analysis method in particular. The platform highlights the two approaches in the field, namely analog and digital circuit signatures. As part of this e-learning platform, a database of signatures of different electronic components was developed to put into the spotlight different fault-detection techniques and, building on these, self-repairing techniques for systems built from such components. An approach for realizing self-testing circuits based on the MATLAB environment and using the signature analysis method is proposed. The paper analyses the benefits of the signature analysis method and also simulates signature analyzer performance based on the use of pseudo-random sequences.
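The signature analysis method itself is easy to sketch: a linear-feedback shift register (LFSR) compacts a circuit's output stream into a short signature that is compared against a known-good value. The tap mask and bit streams below are hypothetical choices, not the platform's actual configuration.

```python
# Minimal serial signature analyzer: a 16-bit LFSR compresses an output
# bit-stream into a signature compared against the fault-free ("golden") value.
TAPS = 0x8016  # feedback tap mask (hypothetical choice of polynomial)

def signature(bits):
    """Shift a bit-stream through the LFSR and return the 16-bit signature."""
    reg = 0
    for bit in bits:
        feedback = ((reg >> 15) & 1) ^ bit  # output bit XOR incoming bit
        reg = (reg << 1) & 0xFFFF           # shift, keep 16 bits
        if feedback:
            reg ^= TAPS                     # fold feedback into tapped stages
    return reg

golden = signature([1, 0, 1, 1, 0, 0, 1, 0] * 8)   # fault-free response
faulty = signature([1, 0, 1, 1, 0, 1, 1, 0] * 8)   # response with a stuck bit
print(f"golden signature: {golden:04X}")
print(f"faulty signature: {faulty:04X}  -> fault detected: {golden != faulty}")
```

The appeal of the method, as the abstract notes, is exactly this compaction: one short register value stands in for an arbitrarily long test response.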
Objections to routine clinical outcomes measurement in mental health services: any evidence so far?
MacDonald, Alastair J D; Trauer, Tom
2010-12-01
Routine clinical outcomes measurement (RCOM) is gaining importance in mental health services. We examined whether criticisms published in advance of the development of RCOM have been borne out by data now available from such a programme. This was an observational study of routine ratings using HoNOS65+ at inception/admission and again at discharge in an old age psychiatry service from 1997 to 2008. Testable hypotheses were generated from each criticism amenable to empirical examination. Inter-rater reliability estimates were applied to observed differences in scores between community and ward patients using resampling. Five thousand one hundred eighty community inceptions and 862 admissions had HoNOS65+ ratings at referral/admission and discharge. We could find no evidence of gaming (artificially worse scores at inception and better at discharge), or of selection, attrition or detection bias, and ratings were consistent with diagnosis and level of service. Anticipated low levels of inter-rater reliability did not vitiate differences between levels of service. Although only hypotheses testable from within the RCOM data were examined, and only 46% of eligible episodes had complete outcomes data, no evidence of the alleged biases was found. RCOM seems valid and practical in mental health services.
Electronic design of a multichannel programmable implant for neuromuscular electrical stimulation.
Arabi, K; Sawan, M A
1999-06-01
An advanced stimulator for neuromuscular stimulation of spinal cord injured patients has been developed. The stimulator is externally controlled and powered by a single encoded radio-frequency carrier and has four independently controlled bipolar stimulation channels. It offers a wide range of reprogrammability and flexibility, and can be used in many neuromuscular electrical stimulation applications. The implant system is adaptable to the patient's needs and to future developments in stimulation algorithms by reprogramming the stimulator. The stimulator is capable of generating a wide range of stimulation waveforms and stimulation patterns and is therefore very suitable for selective nerve stimulation techniques. The reliability of the implant has been increased by using a forward error detection and correction communication protocol and by designing the chip for structural testability based on a scan-test approach. The implemented testability scheme makes it possible to verify the complete functionality of the implant before and after implantation. The stimulator's architecture is designed to be modular and therefore its different blocks can be reused as standard building blocks in the design and implementation of other neuromuscular prostheses. Design-for-low-power techniques have also been employed to reduce the power consumption of the electronic circuitry.
Kell, Douglas B.; Oliver, Stephen G.
2014-01-01
One approach to experimental science involves creating hypotheses, then testing them by varying one or more independent variables, and assessing the effects of this variation on the processes of interest. We use this strategy to compare the intellectual status and available evidence for two models or views of mechanisms of transmembrane drug transport into intact biological cells. One (BDII) asserts that lipoidal phospholipid Bilayer Diffusion Is Important, while a second (PBIN) proposes that in normal intact cells Phospholipid Bilayer diffusion Is Negligible (i.e., may be neglected quantitatively), because evolution selected against it, and with transmembrane drug transport being effected by genetically encoded proteinaceous carriers or pores, whose “natural” biological roles, and substrates are based in intermediary metabolism. Despite a recent review elsewhere, we can find no evidence able to support BDII as we can find no experiments in intact cells in which phospholipid bilayer diffusion was either varied independently or measured directly (although there are many papers where it was inferred by seeing a covariation of other dependent variables). By contrast, we find an abundance of evidence showing cases in which changes in the activities of named and genetically identified transporters led to measurable changes in the rate or extent of drug uptake. PBIN also has considerable predictive power, and accounts readily for the large differences in drug uptake between tissues, cells and species, in accounting for the metabolite-likeness of marketed drugs, in pharmacogenomics, and in providing a straightforward explanation for the late-stage appearance of toxicity and of lack of efficacy during drug discovery programmes despite macroscopically adequate pharmacokinetics. Consequently, the view that Phospholipid Bilayer diffusion Is Negligible (PBIN) provides a starting hypothesis for assessing cellular drug uptake that is much better supported by the available evidence, and is both more productive and more predictive. PMID:25400580
The attention schema theory: a mechanistic account of subjective awareness
Graziano, Michael S. A.; Webb, Taylor W.
2015-01-01
We recently proposed the attention schema theory, a novel way to explain the brain basis of subjective awareness in a mechanistic and scientifically testable manner. The theory begins with attention, the process by which signals compete for the brain’s limited computing resources. This internal signal competition is partly under a bottom–up influence and partly under top–down control. We propose that the top–down control of attention is improved when the brain has access to a simplified model of attention itself. The brain therefore constructs a schematic model of the process of attention, the ‘attention schema,’ in much the same way that it constructs a schematic model of the body, the ‘body schema.’ The content of this internal model leads a brain to conclude that it has a subjective experience. One advantage of this theory is that it explains how awareness and attention can sometimes become dissociated; the brain’s internal models are never perfect, and sometimes a model becomes dissociated from the object being modeled. A second advantage of this theory is that it explains how we can be aware of both internal and external events. The brain can apply attention to many types of information including external sensory information and internal information about emotions and cognitive states. If awareness is a model of attention, then this model should pertain to the same domains of information to which attention pertains. A third advantage of this theory is that it provides testable predictions. If awareness is the internal model of attention, used to help control attention, then without awareness, attention should still be possible but should suffer deficits in control. In this article, we review the existing literature on the relationship between attention and awareness, and suggest that at least some of the predictions of the theory are borne out by the evidence. PMID:25954242
The (virtual) conceptual necessity of quantum probabilities in cognitive psychology.
Blutner, Reinhard; beim Graben, Peter
2013-06-01
We propose a way in which Pothos & Busemeyer (P&B) could strengthen their position. Taking a dynamic stance, we consider cognitive tests as functions that transfer a given input state into the state after testing. Under very general conditions, it can be shown that testable properties in cognition form an orthomodular lattice. Gleason's theorem then yields the conceptual necessity of quantum probabilities (QP).
The Systems Test Architect: Enabling The Leap From Testable To Tested
2016-09-01
engineering process requires an interdisciplinary approach, involving both technical and managerial disciplines applied to the synthesis and integration... relationship between the technical and managerial aspects of systems engineering. TP-2003-020-01 describes measurement as having the following... It is evident that DOD makes great strides to tackle both the managerial and technical aspects of test and evaluation within the systems
Active Diagnosis of Navy Machinery Rev 2.0
2016-10-01
electrical distribution and potable water supply systems. Because of these dependencies, ship auxiliary system failures can cause combat load failure... buildup generally causes a pipe to disconnect from a junction, causing water to leak. This limits the faults that are testable, since many of the faults... pipes, junctions, pumps, flow meters, thermal loads, check valves, and water tanks. Each agent is responsible for maintaining its constraints locally
Silicon Wafer Advanced Packaging (SWAP). Multichip Module (MCM) Foundry Study. Version 2
1991-04-08
Next Layer Dielectric Spacing; Additional Metal Thickness Impact on Dielectric Uniformity/Adhesion. The first step in the experimental design would be... [acronym glossary fragment:] CAD - computer aided design; CAM - computer aided manufacturing; CAE - computer aided engineering; CALCE - computer aided life cycle engineering center; CARMA - computer aided...; ...expansion; CVD - chemical vapor deposition; DA - design automation; DEC - Digital Equipment Corporation; DFT - design for testability
Structural Genomics of Bacterial Virulence Factors
2006-05-01
positioned in the unit cell by Molecular Replacement (Protein Data Bank (PDB) ID code 1acc; ref. 6) using MOLREP, and refined with REFMAC version 5.0 (ref. 24)... increase our understanding of the molecular mechanisms of pathogenicity, putting us in a stronger position to anticipate and react to emerging... In the long term, the accumulated structural information will generate important and testable hypotheses that will increase our understanding of the molecular...
Testability/Diagnostics Design Encyclopedia
1990-09-01
weapon system that is pushing the state of the art and produced in limited numbers, with questionable historical data on their operation, one can... designs with questionable basis and justification. Unfortunately, this process has not been transformed from an art to a rigorous methodology... REQUIREMENT #2.1: on-the-job training; formal school training; O-level data acquisition/collection system (and data management); requirements to...
Rapid Communication: Quasi-gedanken experiment challenging the no-signalling theorem
NASA Astrophysics Data System (ADS)
Kalamidas, Demetrios A.
2018-01-01
Kennedy (Philos. Sci. 62, 4 (1995)) has argued that the various quantum mechanical no-signalling proofs formulated thus far share a common mathematical framework, are circular in nature, and do not preclude the construction of empirically testable schemes wherein superluminal exchange of information can occur. In light of this thesis, we present a potentially feasible quantum-optical scheme that purports to enable superluminal signalling.
Retrieval as a Fast Route to Memory Consolidation.
Antony, James W; Ferreira, Catarina S; Norman, Kenneth A; Wimber, Maria
2017-08-01
Retrieval-mediated learning is a powerful way to make memories last, but its neurocognitive mechanisms remain unclear. We propose that retrieval acts as a rapid consolidation event, supporting the creation of adaptive hippocampal-neocortical representations via the 'online' reactivation of associative information. We describe parallels between online retrieval and offline consolidation and offer testable predictions for future research. Copyright © 2017 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Maestripieri, Dario
2005-01-01
Comparative behavioral research is important for a number of reasons and can contribute to the understanding of human behavior and development in many different ways. Research with animal models of human behavior and development can be a source not only of general principles and testable hypotheses but also of empirical information that may be…
NASA Space Flight Vehicle Fault Isolation Challenges
NASA Technical Reports Server (NTRS)
Neeley, James R.; Jones, James V.; Bramon, Christopher J.; Inman, Sharon K.; Tuttle, Loraine
2016-01-01
The Space Launch System (SLS) is the new NASA heavy lift launch vehicle in development and is scheduled for its first mission in 2018. SLS has many of the same logistics challenges as any other large-scale program. However, SLS also faces unique challenges related to testability. This presentation will address the SLS challenges for diagnostics and fault isolation, along with the analyses and decisions to mitigate risk.
A Survey of Reliability, Maintainability, Supportability, and Testability Software Tools
1991-04-01
designs in terms of their contributions toward forced mission termination and vehicle or function loss. Includes the ability to treat failure modes of... ABSTRACT: Inputs: MTBFs, MTTRs, support equipment costs, equipment weights and costs, available targets, military occupational specialty skill level and... US Army CECOM NAME: SPARECOST ABSTRACT: Calculates expected number of failures and performs spares holding optimization based on cost, weight, or
Predicting the dynamics of bacterial growth inhibition by ribosome-targeting antibiotics
NASA Astrophysics Data System (ADS)
Greulich, Philip; Doležal, Jakub; Scott, Matthew; Evans, Martin R.; Allen, Rosalind J.
2017-12-01
Understanding how antibiotics inhibit bacteria can help to reduce antibiotic use and hence avoid antimicrobial resistance—yet few theoretical models exist for bacterial growth inhibition by a clinically relevant antibiotic treatment regimen. In particular, in the clinic, antibiotic treatment is time-dependent. Here, we use a theoretical model, previously applied to steady-state bacterial growth, to predict the dynamical response of a bacterial cell to a time-dependent dose of ribosome-targeting antibiotic. Our results depend strongly on whether the antibiotic shows reversible transport and/or low-affinity ribosome binding (‘low-affinity antibiotic’) or, in contrast, irreversible transport and/or high affinity ribosome binding (‘high-affinity antibiotic’). For low-affinity antibiotics, our model predicts that growth inhibition depends on the duration of the antibiotic pulse, and can show a transient period of very fast growth following removal of the antibiotic. For high-affinity antibiotics, growth inhibition depends on peak dosage rather than dose duration, and the model predicts a pronounced post-antibiotic effect, due to hysteresis, in which growth can be suppressed for long times after the antibiotic dose has ended. These predictions are experimentally testable and may be of clinical significance.
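To make the reversible/irreversible distinction above concrete, here is a minimal toy version of a transport-plus-ribosome-binding model under a square antibiotic pulse. The equations and every parameter value (the pulse, the rate constants, the growth law) are illustrative assumptions for this sketch, not the authors' published model; switching k_off between nonzero and zero mimics the low-affinity versus high-affinity cases.

```python
# Toy dynamical model of ribosome-targeting antibiotic action, loosely
# patterned on the transport-plus-binding scheme the abstract describes.
# All parameter names and values are illustrative assumptions.
import numpy as np
from scipy.integrate import solve_ivp

def pulse(t, t_on=2.0, t_off=6.0, dose=1.0):
    """External antibiotic concentration: a square pulse in time."""
    return dose if t_on <= t <= t_off else 0.0

def rhs(t, y, k_off):
    a, r_u, b = y              # internal drug, unbound ribosomes, bound ribosomes
    lam = 0.5 * max(r_u, 0.0)  # growth rate ~ unbound ribosome pool (toy law)
    k_on, p_in, p_out, synth = 2.0, 1.0, 0.5, 0.5
    da = p_in * pulse(t) - p_out * a - k_on * a * r_u + k_off * b - lam * a
    dr = synth - k_on * a * r_u + k_off * b - lam * r_u
    db = k_on * a * r_u - k_off * b - lam * b
    return [da, dr, db]

for k_off, label in [(1.0, "low-affinity (reversible)"),
                     (0.0, "high-affinity (irreversible)")]:
    sol = solve_ivp(rhs, (0, 20), [0.0, 1.0, 0.0], args=(k_off,), max_step=0.05)
    growth_end = 0.5 * sol.y[1, -1]
    print(f"{label}: growth rate long after the pulse = {growth_end:.3f}")
```

Run as written, the irreversible case recovers much more slowly after the pulse ends, which is the hysteresis-like post-antibiotic effect the abstract describes.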
Monte Carlo modeling of single-molecule cytoplasmic dynein.
Singh, Manoranjan P; Mallik, Roop; Gross, Steven P; Yu, Clare C
2005-08-23
Molecular motors are responsible for active transport and organization in the cell, underlying an enormous number of crucial biological processes. Dynein is more complicated in its structure and function than other motors. Recent experiments have found that, unlike other motors, dynein can take different size steps along microtubules depending on load and ATP concentration. We use Monte Carlo simulations to model the molecular motor function of cytoplasmic dynein at the single-molecule level. The theory relates dynein's enzymatic properties to its mechanical force production. Our simulations reproduce the main features of recent single-molecule experiments that found a discrete distribution of dynein step sizes, depending on load and ATP concentration. The model reproduces the large steps found experimentally under high ATP and no load by assuming that the ATP binding affinities at the secondary sites decrease as the number of ATP bound to these sites increases. Additionally, to capture the essential features of the step-size distribution at very low ATP concentration and no load, the ATP hydrolysis of the primary site must be dramatically reduced when none of the secondary sites have ATP bound to them. We make testable predictions that should guide future experiments related to dynein function.
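A kinetic Monte Carlo caricature of the stepping scheme described might look as follows; the occupancy rule, the step-size mapping, and all rates are invented for illustration and are not the paper's fitted parameters.

```python
# Minimal Monte Carlo caricature of dynein stepping: the step size shrinks
# as more secondary ATP sites are occupied, and occupancy rises with [ATP].
# Rates and the step-size mapping are illustrative assumptions.
import random

def simulate(atp, n_steps=10000, n_secondary_sites=3):
    sizes = []
    for _ in range(n_steps):
        occupied = 0
        for site in range(n_secondary_sites):
            # Binding affinity is assumed to drop as sites fill
            # (the paper's key assumption for large steps at high ATP).
            p_bind = (atp / (atp + 1.0)) / (1 + site)
            if random.random() < p_bind:
                occupied += 1
        step = {0: 32.0, 1: 24.0, 2: 16.0, 3: 8.0}[occupied]  # nm, toy mapping
        sizes.append(step)
    return sum(sizes) / len(sizes)

for atp in (0.1, 1.0, 10.0):
    print(f"[ATP]={atp:>4}: mean step = {simulate(atp):.1f} nm")
```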
The role of serotonin in schizophrenia.
Iqbal, N; van Praag, H M
1995-01-01
The hypothesis that the LSD psychosis, and by inference the schizophrenic psychoses, are related to dysfunctions in central serotonergic systems, formulated by Woolley and Shaw in the early 1950s, was the first testable theory of modern biological psychiatry. Initially, it did not get the scientific attention it deserved: first, because LSD fell into disrepute and was to all intents and purposes banned from human experimentation; secondly, because the antipsychotics were discovered in the same period, and it became clear that these compounds block dopaminergic transmission, so for many years thereafter the dopaminergic system occupied center stage in biological schizophrenia research. Presently, interest in the relation between serotonin and schizophrenia has been revived, due to the development of serotonin-blocking agents that appear to exert therapeutic effects in schizophrenia. In this paper the evidence for and against a link between serotonergic defects and schizophrenia psychopathology is critically discussed. The conclusion to be reached is threefold. (1) Interruption of certain serotonergic circuits represents an antipsychotic principle. (2) Tentative evidence suggests the involvement of serotonergic dysfunctions in the pathogenesis of schizophrenic psychoses. (3) It is not yet known whether serotonergic lesions contribute directly to the occurrence of schizophrenic psychopathology or via alterations in the dopaminergic system.
Potential benefits of plant diversity on vegetated roofs: a literature review.
Cook-Patton, Susan C; Bauerle, Taryn L
2012-09-15
Although vegetated green roofs can be difficult to establish and maintain, they are an increasingly popular method for mitigating the negative environmental impacts of urbanization. Most green roof development has focused on maximizing green roof performance by planting one or a few drought-tolerant species. We present an alternative approach, which recognizes green roofs as dynamic ecosystems and employs a diversity of species. We draw links between the ecological and green roof literature to generate testable predictions about how increasing plant diversity could improve short- and long-term green roof functioning. Although we found few papers that experimentally manipulated diversity on green roofs, those that did revealed ecological dynamics similar to those in more natural systems. However, there are many unresolved issues. To improve overall green roof performance, we should (1) elucidate the links among plant diversity, structural complexity, and green roof performance, (2) describe feedback mechanisms between plant and animal diversity on green roofs, (3) identify species with complementary traits, and (4) determine whether diverse green roof communities are more resilient to disturbance and environmental change than less diverse green roofs.
Domain fusion analysis by applying relational algebra to protein sequence and domain databases
Truong, Kevin; Ikura, Mitsuhiko
2003-01-01
Background Domain fusion analysis is a useful method to predict functionally linked proteins that may be involved in direct protein-protein interactions or in the same metabolic or signaling pathway. As separate domain databases like BLOCKS, PROSITE, Pfam, SMART, PRINTS-S, ProDom, TIGRFAMs, and amalgamated domain databases like InterPro continue to grow in size and quality, a computational method to perform domain fusion analysis that leverages these efforts will become increasingly powerful. Results This paper proposes a computational method employing relational algebra to find domain fusions in protein sequence databases. The feasibility of this method was illustrated on the SWISS-PROT+TrEMBL sequence database using domain predictions from the Pfam HMM (hidden Markov model) database. We identified 235 and 189 putative functionally linked protein partners in H. sapiens and S. cerevisiae, respectively. From the scientific literature, we were able to confirm many of these functional linkages, while the remainder offer testable experimental hypotheses. Conclusion As the analysis can be computed quickly on any relational database that supports standard SQL (structured query language), it can be dynamically updated along with the sequence and domain databases, thereby improving the quality of predictions over time. PMID:12734020
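The relational formulation lends itself to a few lines of SQL. The sketch below (Python with sqlite3) runs a toy "Rosetta stone" query: it reports protein pairs whose domains co-occur on some third, fused protein. The schema and rows are invented for illustration; the paper's actual tables over SWISS-PROT+TrEMBL and Pfam are far larger, but the join has the same shape.

```python
# Sketch of domain-fusion analysis with plain SQL, in the spirit of the
# relational-algebra method described.  Schema and data are invented toys.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE assignment (protein TEXT, domain TEXT);
INSERT INTO assignment VALUES
  ('fusionAB', 'PF_A'), ('fusionAB', 'PF_B'),   -- composite (fused) protein
  ('singleA',  'PF_A'), ('singleB',  'PF_B');   -- separate proteins
""")

-- = is replaced below by a self-join: f1/f2 find domain pairs on one protein,
-- p1/p2 find other proteins carrying each domain separately.
rows = con.execute("""
SELECT DISTINCT p1.protein, p2.protein
FROM assignment f1
JOIN assignment f2 ON f1.protein = f2.protein AND f1.domain < f2.domain
JOIN assignment p1 ON p1.domain = f1.domain AND p1.protein <> f1.protein
JOIN assignment p2 ON p2.domain = f2.domain AND p2.protein <> f2.protein
WHERE p1.protein <> p2.protein
""").fetchall()
print(rows)   # [('singleA', 'singleB')] -> predicted functional linkage
```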
Dark matter, proton decay and other phenomenological constraints in F-SU(5)
NASA Astrophysics Data System (ADS)
Li, Tianjun; Maxin, James A.; Nanopoulos, Dimitri V.; Walker, Joel W.
2011-07-01
We study gravity mediated supersymmetry breaking in F-SU(5) and its low-energy supersymmetric phenomenology. The gaugino masses are not unified at the traditional grand unification scale, but we nonetheless have the same one-loop gaugino mass relation at the electroweak scale as minimal supergravity (mSUGRA). We introduce parameters testable at the colliders to measure the small second loop deviation from the mSUGRA gaugino mass relation at the electroweak scale. In the minimal SU(5) model with gravity mediated supersymmetry breaking, we show that the deviations from the mSUGRA gaugino mass relations are within 5%. However, in F-SU(5), we predict the deviations from the mSUGRA gaugino mass relations to be larger due to the presence of vector-like particles, which can be tested at the colliders. We determine the viable parameter space that satisfies all the latest experimental constraints and find it is consistent with the CDMS II experiment. Further, we compute the cross-sections of neutralino annihilations into gamma-rays and compare to the first published Fermi-LAT measurement. Finally, the corresponding range of proton lifetime predictions is calculated and found to be within reach of the future Hyper-Kamiokande and DUSEL experiments.
The Long and Viscous Road: Uncovering Nuclear Diffusion Barriers in Closed Mitosis
Zavala, Eder; Marquez-Lago, Tatiana T.
2014-01-01
Diffusion barriers are effective means for constraining protein lateral exchange in cellular membranes. In Saccharomyces cerevisiae, they have been shown to sustain parental identity through asymmetric segregation of ageing factors during closed mitosis. Even though barriers have been extensively studied in the plasma membrane, their identity and organization within the nucleus remain poorly understood. Based on different lines of experimental evidence, we present a model of the composition and structural organization of a nuclear diffusion barrier during anaphase. By means of spatial stochastic simulations, we propose how specialised lipid domains, protein rings, and morphological changes of the nucleus may coordinate to restrict protein exchange between mother and daughter nuclear lobes. We explore distinct, plausible configurations of these diffusion barriers and offer testable predictions regarding their protein exclusion properties and the diffusion regimes they generate. Our model predicts that, while a specialised lipid domain and an immobile protein ring at the bud neck can compartmentalize the nucleus during early anaphase, a specialised lipid domain spanning the elongated bridge between lobes would be entirely sufficient during late anaphase. Our work shows how complex nuclear diffusion barriers in closed mitosis may arise from simple nanoscale biophysical interactions. PMID:25032937
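A toy version of such a spatial stochastic simulation, assuming a 1D domain with a slow-hopping "bud neck" region (all geometry and probabilities invented for illustration):

```python
# Toy spatial stochastic simulation: 1D random walkers crossing a "bud neck"
# region where hopping is slowed, mimicking a diffusion barrier between
# mother and daughter nuclear lobes.  Geometry and rates are assumptions.
import random

def fraction_crossed(p_neck, n_walkers=500, n_steps=2000, length=100):
    neck = range(45, 55)                  # barrier occupies the middle
    crossed = 0
    for _ in range(n_walkers):
        x = 10                            # start in the "mother" lobe
        for _ in range(n_steps):
            p_move = p_neck if x in neck else 0.5
            if random.random() < p_move:
                x += random.choice((-1, 1))
            x = max(0, min(length, x))    # reflecting walls
        crossed += x > 55                 # ended up in the "daughter" lobe
    return crossed / n_walkers

for p in (0.5, 0.05, 0.005):              # no barrier, weak, strong
    print(f"hop prob in neck = {p}: fraction reaching daughter = {fraction_crossed(p):.2f}")
```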
Vlachos, Ioannis; Herry, Cyril; Lüthi, Andreas; Aertsen, Ad; Kumar, Arvind
2011-01-01
The basal nucleus of the amygdala (BA) is involved in the formation of context-dependent conditioned fear and extinction memories. To understand the underlying neural mechanisms we developed a large-scale neuron network model of the BA, composed of excitatory and inhibitory leaky-integrate-and-fire neurons. Excitatory BA neurons received conditioned stimulus (CS)-related input from the adjacent lateral nucleus (LA) and contextual input from the hippocampus or medial prefrontal cortex (mPFC). We implemented a plasticity mechanism according to which CS and contextual synapses were potentiated if CS and contextual inputs temporally coincided on the afferents of the excitatory neurons. Our simulations revealed a differential recruitment of two distinct subpopulations of BA neurons during conditioning and extinction, mimicking the activation of experimentally observed cell populations. We propose that these two subgroups encode contextual specificity of fear and extinction memories, respectively. Mutual competition between them, mediated by feedback inhibition and driven by contextual inputs, regulates the activity in the central amygdala (CEA) thereby controlling amygdala output and fear behavior. The model makes multiple testable predictions that may advance our understanding of fear and extinction memories. PMID:21437238
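The coincidence-based plasticity rule can be sketched with a single leaky integrate-and-fire unit. The sketch below uses invented parameters and a drastically simplified trial structure, not the published network model; it only shows how coincidence of CS and context inputs grows synaptic weights and, in turn, spiking output.

```python
# Minimal leaky integrate-and-fire (LIF) sketch with a Hebbian-style
# coincidence rule: CS and context synapses are strengthened only when both
# inputs arrive together.  All parameters are illustrative assumptions.
import random

def run(coincident, trials=200, dt=1.0, tau=20.0, v_th=1.0):
    w_cs, w_ctx, lr = 0.3, 0.3, 0.02
    spikes = 0
    for _ in range(trials):
        v = 0.0
        cs_on = random.random() < 0.5
        ctx_on = cs_on if coincident else (random.random() < 0.5)
        for _ in range(50):                       # 50 ms trial, Euler steps
            i_syn = w_cs * cs_on + w_ctx * ctx_on
            v += dt * (-v / tau + 0.1 * i_syn)
            if v >= v_th:
                spikes += 1
                v = 0.0                           # reset after a spike
        if cs_on and ctx_on:                      # coincidence -> potentiation
            w_cs += lr
            w_ctx += lr
    return spikes, round(w_cs, 2)

print("coincident inputs :", run(True))           # more spikes, larger weights
print("independent inputs:", run(False))
```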
Pair production processes and flavor in gauge-invariant perturbation theory
NASA Astrophysics Data System (ADS)
Egger, Larissa; Maas, Axel; Sondenheimer, René
2017-12-01
Gauge-invariant perturbation theory is an extension of ordinary perturbation theory which describes strictly gauge-invariant states in theories with a Brout-Englert-Higgs effect. Such gauge-invariant states are composite operators which have necessarily only global quantum numbers. As a consequence, flavor is exchanged for custodial quantum numbers in the Standard Model, recreating the fermion spectrum in the process. Here, we study the implications of such a description, possibly also for the generation structure of the Standard Model. In particular, this implies that scattering processes are essentially bound-state-bound-state interactions, and require a suitable description. We analyze the implications for the pair-production process e+e-→f¯f at a linear collider to leading order. We show how ordinary perturbation theory is recovered as the leading contribution. Using a PDF-type language, we also assess the impact of sub-leading contributions. To lowest order, we find that the result is mainly influenced by how large the contribution of the Higgs at large x is. This gives an interesting, possibly experimentally testable, scenario for the formal field theory underlying the electroweak sector of the Standard Model.
Developing Tools to Test the Thermo-Mechanical Models, Examples at Crustal and Upper Mantle Scale
NASA Astrophysics Data System (ADS)
Le Pourhiet, L.; Yamato, P.; Burov, E.; Gurnis, M.
2005-12-01
Testing geodynamical models is never an easy task. Depending on the spatio-temporal scale of the model, different testable predictions are needed and no magic recipe exists. This contribution first presents different methods that have been used to test thermo-mechanical modeling results at upper crustal, lithospheric and upper mantle scale, using three geodynamical examples: the Gulf of Corinth (Greece), the Western Alps, and the Sierra Nevada. At short spatio-temporal scales (e.g. the Gulf of Corinth), the resolution of the numerical models is usually sufficient to capture the timing and kinematics of the faults precisely enough to be tested by tectono-stratigraphic arguments. In actively deforming areas, microseismicity can be compared to the effective rheology, and the P and T axes of focal mechanisms can be compared with the local orientation of the major component of the stress tensor. At lithospheric scale, the resolution of the models no longer permits constraining them by direct observations (i.e. structural data from the field or seismic reflection). Instead, synthetic P-T-t paths may be computed and compared to natural ones in terms of exhumation rates for ancient orogens. Topography may also help, but on continents it mainly depends on erosion laws that are complicated to constrain. Deeper in the mantle, the only available constraints are long-wavelength topographic data and tomographic "data". The major problem to overcome now, at lithospheric and upper mantle scale, is that these so-called "data" actually result from inverse models of the real data, and those inverse models are themselves based on synthetic models. Post-processing P and S wave velocities is not sufficient to make testable predictions at upper mantle scale. Instead, direct wave-propagation models must be computed. This allows checking whether the differences between two models constitute a testable prediction or not. On the longer term, we may be able to use those synthetic models to reduce the residual in the inversion of elastic wave arrival times.
Color vision deficiency in preschool children: the multi-ethnic pediatric eye disease study.
Xie, John Z; Tarczy-Hornoch, Kristina; Lin, Jesse; Cotter, Susan A; Torres, Mina; Varma, Rohit
2014-07-01
To determine the sex- and ethnicity-specific prevalence of color vision deficiency (CVD) in black, Asian, Hispanic, and non-Hispanic white preschool children. Population-based, cross-sectional study. The Multi-Ethnic Pediatric Eye Disease Study is a population-based evaluation of the prevalence of vision disorders in children in Southern California. A total of 5960 subjects 30 to 72 months of age were recruited for the study, of whom 4177 were able to complete color vision testing (1265 black, 812 Asian, 1280 Hispanic, and 820 non-Hispanic white). Color vision testing was performed using Color Vision Testing Made Easy color plates (Home Vision Care, Gulf Breeze, FL), and diagnostic confirmatory testing was performed using the Waggoner HRR Diagnostic Test color plates (Home Vision Care). Testability of color vision in preschool children between 30 and 72 months of age and prevalence of CVD stratified by age, sex, and ethnicity. Testability was 17% in children younger than 37 months of age, increasing to 57% in children 37 to 48 months of age, 89% in children 49 to 60 months of age, and 98% in children 61 to 72 months of age. The prevalence of CVD among boys was 1.4% for black, 3.1% for Asian, 2.6% for Hispanic, and 5.6% for non-Hispanic white children; the prevalence in girls was 0.0% to 0.5% for all ethnicities. The ethnic difference in CVD was statistically significant between black and non-Hispanic white children (P = 0.0003) and between Hispanic and non-Hispanic white children (P = 0.02). In boys, most CVD cases were either deutan (51%) or protan (34%); 32% were classified as mild, 15% as moderate, and 41% as severe. Testability for CVD in preschool children is high by 4 years of age. The prevalence of CVD in preschool boys varies by ethnicity, with the highest prevalence in non-Hispanic white and lowest in black children.
Insights into Mechanisms of Chronic Neurodegeneration
Diack, Abigail B.; Alibhai, James D.; Barron, Rona; Bradford, Barry; Piccardo, Pedro; Manson, Jean C.
2016-01-01
Chronic neurodegenerative diseases such as Alzheimer’s disease (AD), Parkinson’s disease (PD), and prion diseases are characterised by the accumulation of abnormal conformers of a host encoded protein in the central nervous system. The process leading to neurodegeneration is still poorly defined and thus development of early intervention strategies is challenging. Unique amongst these diseases are Transmissible Spongiform Encephalopathies (TSEs) or prion diseases, which have the ability to transmit between individuals. The infectious nature of these diseases has permitted in vivo and in vitro modelling of the time course of the disease process in a highly reproducible manner, thus early events can be defined. Recent evidence has demonstrated that the cell-to-cell spread of protein aggregates by a “prion-like mechanism” is common among the protein misfolding diseases. Thus, the TSE models may provide insights into disease mechanisms and testable hypotheses for disease intervention, applicable to a number of these chronic neurodegenerative diseases. PMID:26771599
Traditional fire-use, landscape transition, and the legacies of social theory past.
Coughlan, Michael R
2015-12-01
Fire-use and the scale and character of its effects on landscapes remain hotly debated in the paleo- and historical-fire literature. Since the second half of the nineteenth century, anthropology and geography have played important roles in providing theoretical propositions and testable hypotheses for advancing understandings of the ecological role of human-fire-use in landscape histories. This article reviews some of the most salient and persistent theoretical propositions and hypotheses concerning the role of humans in historical fire ecology. The review discusses this history in light of current research agendas, such as those offered by pyrogeography. The review suggests that a more theoretically cognizant historical fire ecology should strive to operationalize transdisciplinary theory capable of addressing the role of human variability in the evolutionary history of landscapes. To facilitate this process, researchers should focus attention on integrating more current human ecology theory into transdisciplinary research agendas.
Similarities and differences between the Wnt and reelin pathways in the forming brain.
Reiner, Orly; Sapir, Tamar
2005-01-01
One of the key features in development is the reutilization of successful signaling pathways. Here, we emphasize the involvement of the Wnt pathway, one of the five kinds of signal transduction pathway predominating early embryonic development of all animals, in regulating the formation of brain structure. We discuss the interrelationships between the Wnt and reelin pathways in the regulation of cortical layering. We summarize data emphasizing key molecules, which, when mutated, result in abnormal brain development. This integrated view, which is based on conservation of pathways, reveals the relative position of participants in the pathway, points to control mechanisms, and allows raising testable working hypotheses. Nevertheless, although signaling pathways are highly conserved from flies to humans, the overall morphology is not. We propose that future studies directed at understanding of diversification will provide fruitful insights on mammalian brain formation.
Strength and Vulnerability Integration (SAVI): A Model of Emotional Well-Being Across Adulthood
Charles, Susan Turk
2010-01-01
The following paper presents the theoretical model of Strength and Vulnerability Integration (SAVI) to explain factors that influence emotion regulation and emotional well-being across adulthood. The model posits that trajectories of adult development are marked by age-related enhancement in the use of strategies that serve to avoid or limit exposure to negative stimuli, but age-related vulnerabilities in situations that elicit high levels of sustained emotional arousal. When older adults avoid or reduce exposure to emotional distress, they often respond better than younger adults; when they experience high levels of sustained emotional arousal, however, age-related advantages in emotional well-being are attenuated, and older adults are hypothesized to have greater difficulties returning to homeostasis. SAVI provides a testable model to understand the literature on emotion and aging and to predict trajectories of emotional experience across the adult life span. PMID:21038939
Flight elements: Fault detection and fault management
NASA Technical Reports Server (NTRS)
Lum, H.; Patterson-Hine, A.; Edge, J. T.; Lawler, D.
1990-01-01
Fault management for an intelligent computational system must be developed using a top-down integrated engineering approach. The proposed approach includes integrating the overall environment involving sensors and their associated data; design knowledge capture; operations; fault detection, identification, and reconfiguration; testability; causal models including digraph matrix analysis; and overall performance impacts on the hardware and software architecture. Implementation of the concept to achieve a real-time intelligent fault detection and management system will be accomplished via several objectives: development of fault-tolerant/FDIR requirements and specifications at the systems level that carry through from conceptual design to implementation and mission operations; implementation of monitoring, diagnosis, and reconfiguration at all system levels, providing fault isolation and system integration; optimization of system operations to manage degraded system performance through system integration; and lowering of development and operations costs through the implementation of an intelligent real-time fault detection and fault management system and an information management system.
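As one concrete illustration of the digraph-style causal analysis mentioned above, the sketch below propagates faults through a made-up dependency digraph and groups components with identical test signatures (ambiguity groups); the graph itself is hypothetical, not an actual flight system.

```python
# Sketch of digraph-based testability analysis: propagate each component
# fault through a dependency digraph and group components whose faults are
# seen by the same set of tests (ambiguity groups).  Toy graph, invented.
edges = {            # component -> components/tests its failure propagates to
    "sensor": ["filter"], "filter": ["ctrl"], "ctrl": ["actuator", "T2"],
    "actuator": ["T3"], "power": ["ctrl", "T1"],
}
tests = {"T1", "T2", "T3"}

def reach(node, seen=None):
    """All nodes downstream of `node` in the digraph (depth-first)."""
    seen = seen if seen is not None else set()
    for nxt in edges.get(node, []):
        if nxt not in seen:
            seen.add(nxt)
            reach(nxt, seen)
    return seen

signatures = {}
for comp in edges:
    sig = frozenset(reach(comp) & tests)          # which tests see this fault
    signatures.setdefault(sig, []).append(comp)

for sig, group in signatures.items():
    print(sorted(sig), "->", group)   # groups of >1 are ambiguity groups
```

In this toy example "sensor", "filter", and "ctrl" share the signature {T2, T3}, so an extra test point would be needed to isolate among them: exactly the kind of testability trade the abstract describes.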
How did the swiss cheese plant get its holes?
Muir, Christopher D
2013-02-01
Adult leaf fenestration in "Swiss cheese" plants (Monstera Adans.) is an unusual leaf shape trait lacking a convincing evolutionary explanation. Monstera are secondary hemiepiphytes that inhabit the understory of tropical rainforests, where photosynthesis from sunflecks often makes up a large proportion of daily carbon assimilation. Here I present a simple model of leaf-level photosynthesis and whole-plant canopy dynamics in a stochastic light environment. The model demonstrates that leaf fenestration can reduce the variance in plant growth and thereby increase geometric mean fitness. This growth-variance hypothesis also suggests explanations for conspicuous ontogenetic changes in leaf morphology (heteroblasty) in Monstera, as well as the absence of leaf fenestration in co-occurring juvenile tree species. The model provides a testable hypothesis of the adaptive significance of a unique leaf shape and illustrates how variance in growth rate could be an important factor shaping plant morphology and physiology.
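The growth-variance argument is easy to check numerically: with equal arithmetic-mean growth, lower variance yields a higher geometric-mean (long-run) growth rate. The growth factors below are invented purely for illustration.

```python
# Numerical illustration of the growth-variance hypothesis: equal
# arithmetic-mean growth, different variance -> different geometric means.
import math, random

def geometric_mean_growth(rates, years=10000, seed=1):
    random.seed(seed)
    log_sum = sum(math.log(random.choice(rates)) for _ in range(years))
    return math.exp(log_sum / years)

fenestrated   = [1.1, 1.3]   # steadier growth across sunfleck conditions
unfenestrated = [0.7, 1.7]   # same arithmetic mean (1.2), higher variance

print("fenestrated  :", round(geometric_mean_growth(fenestrated), 3))   # ~1.196
print("unfenestrated:", round(geometric_mean_growth(unfenestrated), 3)) # ~1.091
```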
Quantitative Measurements of Autobiographical Memory Content
Mainetti, Matteo; Ascoli, Giorgio A.
2012-01-01
Autobiographical memory (AM), subjective recollection of past experiences, is fundamental in everyday life. Nevertheless, characterization of the spontaneous occurrence of AM, as well as of the number and types of recollected details, remains limited. The CRAM (Cue-Recalled Autobiographical Memory) test (http://cramtest.info) adapts and combines the cue-word method with an assessment that collects counts of details recalled from different life periods. The SPAM (Spontaneous Probability of Autobiographical Memories) protocol samples introspection during everyday activity, recording memory duration and frequency. These measures provide detailed, naturalistic accounts of AM content and frequency, quantifying essential dimensions of recollection. AM content (∼20 details/recollection) decreased with the age of the episode, but less drastically than the probability of reporting remote compared to recent memories. AM retrieval was frequent (∼20/hour), each memory lasting ∼30 seconds. Testable hypotheses of the specific content retrieved in a fixed time from given life periods are presented. PMID:23028629
Cai, Zuowei; Huang, Lihong; Zhang, Lingling
2015-05-01
This paper investigates the problem of exponential synchronization of time-varying delayed neural networks with discontinuous neuron activations. Under the extended Filippov differential inclusion framework, by designing a discontinuous state-feedback controller and using some analytic techniques, new testable algebraic criteria are obtained to realize two different kinds of global exponential synchronization of the drive-response system. Moreover, we give the estimated rate of exponential synchronization, which depends on the delays and system parameters. The obtained results extend some previous works on synchronization of delayed neural networks not only with continuous activations but also with discontinuous activations. Finally, numerical examples are provided to show the correctness of our analysis via computer simulations. Our method and theoretical results are of direct significance for the design of synchronized neural network circuits involving discontinuous factors and time-varying delays.
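A scalar toy version of such a drive-response scheme, with a sign-type discontinuous activation and a discontinuous state-feedback term, is sketched below; the gains are hand-picked for the demo rather than derived from the paper's algebraic criteria.

```python
# Toy drive-response synchronization with a discontinuous activation (sign)
# and a discontinuous feedback controller.  Scalar system, fixed delay,
# Euler stepping; all gains and parameters are hand-picked assumptions.
import math

dt, delay_steps = 0.001, 100                        # delay tau = 0.1
f = lambda s: math.copysign(1.0, s) if s else 0.0   # discontinuous activation

x_hist, y_hist = [0.8] * delay_steps, [-0.5] * delay_steps
k1, k2 = 4.0, 1.5                                   # feedback gains
for step in range(20000):
    x, y = x_hist[-1], y_hist[-1]
    xd, yd = x_hist[-delay_steps], y_hist[-delay_steps]
    e = y - x
    u = -k1 * e - k2 * f(e)                         # discontinuous controller
    x_hist.append(x + dt * (-x + 0.5 * f(xd)))      # drive system
    y_hist.append(y + dt * (-y + 0.5 * f(yd) + u))  # response system
    if step % 5000 == 0:
        print(f"t={step*dt:5.1f}  |error|={abs(e):.4f}")
```

The k2 term dominates the worst-case mismatch of the delayed discontinuous activations (bounded by 1.0 here), which is the intuition behind sliding the error to zero; the printed error shrinks toward numerical chatter around zero.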
Simpson, Eleanor H.; Kellendonk, Christoph
2016-01-01
The dopamine hypothesis of schizophrenia is supported by a large number of imaging studies that have identified an increase in dopamine binding at the D2 receptor selectively in the striatum. Here we review a decade of work using a regionally restricted and temporally regulated transgenic mouse model to investigate the behavioral, molecular, electrophysiological, and anatomical consequences of selective D2 receptor upregulation in the striatum. These studies have identified new and potentially important biomarkers at the circuit and molecular level that can now be explored in patients with schizophrenia. They provide an example of how animal models and their detailed level of neurobiological analysis allow a deepening of our understanding of the relationship between neuronal circuit function and symptoms of schizophrenia, and as a consequence generate new hypotheses that are testable in patients. PMID:27720388
Rice-arsenate interactions in hydroponics: a three-gene model for tolerance.
Norton, Gareth J; Nigar, Meher; Williams, Paul N; Dasgupta, Tapash; Meharg, Andrew A; Price, Adam H
2008-01-01
In this study, the genetic mapping of the tolerance of root growth to 13.3 μM arsenate [As(V)] using the Bala×Azucena population is improved, and candidate genes for further study are identified. A remarkable three-gene model of tolerance is advanced, which appears to involve epistatic interaction between three major genes, two on chromosome 6 and one on chromosome 10. Any combination of two of these genes inherited from the tolerant parent leads to the plant having tolerance. Lists of potential positional candidate genes are presented. These are then refined using whole genome transcriptomics data and bioinformatics. Physiological evidence is also provided that genes related to phosphate transport are unlikely to be behind the genetic loci conferring tolerance. These results offer testable hypotheses for genes related to As(V) tolerance that might offer strategies for mitigating arsenic (As) accumulation in consumed rice.
NASA Technical Reports Server (NTRS)
Harper, Richard E.; Elks, Carl
1995-01-01
An Army Fault Tolerant Architecture (AFTA) has been developed to meet real-time fault tolerant processing requirements of future Army applications. AFTA is the enabling technology that will allow the Army to configure existing processors and other hardware to provide high throughput and ultrahigh reliability necessary for TF/TA/NOE flight control and other advanced Army applications. A comprehensive conceptual study of AFTA has been completed that addresses a wide range of issues including requirements, architecture, hardware, software, testability, producibility, analytical models, validation and verification, common mode faults, VHDL, and a fault tolerant data bus. A Brassboard AFTA for demonstration and validation has been fabricated, and two operating systems and a flight-critical Army application have been ported to it. Detailed performance measurements have been made of fault tolerance and operating system overheads while AFTA was executing the flight application in the presence of faults.
Cultural prototypes and dimensions of honor.
Cross, Susan E; Uskul, Ayse K; Gerçek-Swing, Berna; Sunbay, Zeynep; Alözkan, Cansu; Günsoy, Ceren; Ataca, Bilge; Karakitapoglu-Aygün, Zahide
2014-02-01
Research evidence and theoretical accounts of honor point to differing definitions of the construct in differing cultural contexts. The current studies address the question "What is honor?" using a prototype approach in Turkey and the Northern United States. Studies 1a/1b revealed substantial differences in the specific features generated by members of the two groups, but Studies 2 and 3 revealed cultural similarities in the underlying dimensions of self-respect, moral behavior, and social status/respect. Ratings of the centrality and personal importance of these factors were similar across the two groups, but their association with other relevant constructs differed. The tripartite nature of honor uncovered in these studies helps observers and researchers alike understand how diverse responses to situations can be attributed to honor. Inclusion of a prototype analysis into the literature on honor cultures can provide enhanced coverage of the concept that may lead to testable hypotheses and new theoretical developments.
Leveraging ecological theory to guide natural product discovery.
Smanski, Michael J; Schlatter, Daniel C; Kinkel, Linda L
2016-03-01
Technological improvements have accelerated natural product (NP) discovery and engineering to the point that systematic genome mining for new molecules is on the horizon. NP biosynthetic potential is not equally distributed across organisms, environments, or microbial life histories, but instead is enriched in a number of prolific clades. Also, NPs are not equally abundant in nature; some are quite common and others markedly rare. Armed with this knowledge, random 'fishing expeditions' for new NPs are increasingly harder to justify. Understanding the ecological and evolutionary pressures that drive the non-uniform distribution of NP biosynthesis provides a rational framework for the targeted isolation of strains enriched in new NP potential. Additionally, ecological theory leads to testable hypotheses regarding the roles of NPs in shaping ecosystems. Here we review several recent strain prioritization practices and discuss the ecological and evolutionary underpinnings for each. Finally, we offer perspectives on leveraging microbial ecology and evolutionary biology for future NP discovery.
Functional Interdependence Theory: An Evolutionary Account of Social Situations.
Balliet, Daniel; Tybur, Joshua M; Van Lange, Paul A M
2017-11-01
Social interactions are characterized by distinct forms of interdependence, each of which has unique effects on how behavior unfolds within the interaction. Despite this, little is known about the psychological mechanisms that allow people to detect and respond to the nature of interdependence in any given interaction. We propose that interdependence theory provides clues regarding the structure of interdependence in the human ancestral past. In turn, evolutionary psychology offers a framework for understanding the types of information processing mechanisms that could have been shaped under these recurring conditions. We synthesize and extend these two perspectives to introduce a new theory: functional interdependence theory (FIT). FIT can generate testable hypotheses about the function and structure of the psychological mechanisms for inferring interdependence. This new perspective offers insight into how people initiate and maintain cooperative relationships, select social partners and allies, and identify opportunities to signal social motives.
Phenomenological vs. biophysical models of thermal stress in aquatic eggs
NASA Astrophysics Data System (ADS)
Martin, B.
2016-12-01
Predicting species responses to climate change is a central challenge in ecology, with most efforts relying on lab derived phenomenological relationships between temperature and fitness metrics. We tested one of these models using the embryonic stage of a Chinook salmon population. We parameterized the model with laboratory data, applied it to predict survival in the field, and found that it significantly underestimated field-derived estimates of thermal mortality. We used a biophysical model based on mass-transfer theory to show that the discrepancy was due to the differences in water flow velocities between the lab and the field. This mechanistic approach provides testable predictions for how the thermal tolerance of embryos depends on egg size and flow velocity of the surrounding water. We found support for these predictions across more than 180 fish species, suggesting that flow and temperature mediated oxygen limitation is a general mechanism underlying the thermal tolerance of embryos.
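The mass-transfer logic reduces to a back-of-the-envelope calculation: metabolic oxygen demand grows with temperature by a Q10 rule, while boundary-layer supply scales (roughly) with the square root of flow velocity and inversely with egg radius; the critical temperature is where demand meets supply. All constants below are placeholders chosen only to show the direction of the predicted effects, not values from this study.

```python
# Back-of-the-envelope version of the mass-transfer argument: solve
# demand_ref * q10**((T - t_ref)/10) = supply(velocity, radius) for T.
# Every constant here is an illustrative assumption.
import math

def critical_temp(velocity, radius, q10=2.5, demand_ref=1.0, t_ref=10.0):
    # Sherwood-style scaling: supply per unit volume ~ sqrt(velocity) / radius
    supply = 5.0 * math.sqrt(velocity) / radius
    return t_ref + 10.0 * math.log(supply / demand_ref) / math.log(q10)

for v in (0.01, 0.1, 1.0):                 # m/s: near-still lab water vs river flow
    for r in (1.0, 3.0):                   # mm: small vs large egg
        print(f"v={v:4} m/s, r={r} mm: T_crit ~ {critical_temp(v, r):.1f} C")
```

The sketch reproduces the qualitative predictions named in the abstract: tolerance rises with flow velocity and falls with egg size, so lab assays in still water probe a different thermal limit than eggs in a flowing river.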
Minimal model linking two great mysteries: Neutrino mass and dark matter
DOE Office of Scientific and Technical Information (OSTI.GOV)
Farzan, Yasaman
2009-10-01
We present an economic model that establishes a link between neutrino masses and properties of the dark matter candidate. The particle content of the model can be divided into two groups: light particles with masses lighter than the electroweak scale and heavy particles. The light particles, which also include the dark matter candidate, are predicted to show up in the low energy experiments such as (K → l + missing energy), making the model testable. The heavy sector can show up at the LHC and may give rise to Br(l_i → l_j γ) close to the present bounds. In principle, the new couplings of the model can independently be derived from the data from the LHC and from the information on neutrino masses and lepton flavor violating rare decays, providing the possibility of an intensive cross-check of the model.
Li, Hongfei; Jiang, Haijun; Hu, Cheng
2016-03-01
In this paper, we investigate a class of memristor-based BAM neural networks with time-varying delays. Under the framework of Filippov solutions, boundedness and ultimate boundedness of solutions of memristor-based BAM neural networks are guaranteed by the chain rule and an inequalities technique. Moreover, a new method involving a Yoshizawa-like theorem is favorably employed to establish the existence of a periodic solution. By applying the theory of set-valued maps and functional differential inclusions, an available Lyapunov functional and some new testable algebraic criteria are derived for ensuring the uniqueness and global exponential stability of the periodic solution of memristor-based BAM neural networks. The obtained results expand and complement some previous work on memristor-based BAM neural networks. Finally, a numerical example is provided to show the applicability and effectiveness of our theoretical results.
Ecological Suitability and Spatial Distribution of Five Anopheles Species in Amazonian Brazil
McKeon, Sascha N.; Schlichting, Carl D.; Povoa, Marinete M.; Conn, Jan E.
2013-01-01
Seventy-six sites characterized in Amazonian Brazil revealed distinct habitat diversification by examining the environmental factors associated with the distribution and abundance of five anopheline species (Diptera: Culicidae) in the subgenus Nyssorhynchus. These included three members of the Albitarsis Complex, Anopheles oryzalimnetes, Anopheles marajoara, Anopheles janconnae; Anopheles triannulatus, and Anopheles goeldii. Anopheles janconnae abundance had a positive correlation to water flow and a negative relationship to sun exposure. Abundance of An. oryzalimnetes was associated with water chemistry. Anopheles goeldii larvae were abundant in shaded, more saline waters. Anopheles marajoara and An. triannulatus were negatively associated with available resources, although An. marajoara also showed several local correlations. These analyses suggest An. triannulatus is a habitat generalist, An. oryzalimnetes and An. janconnae are specialists, and An. marajoara and An. goeldii could not be easily classified either way. Correlations described herein provide testable hypotheses for future research and identifying habitats for vector control. PMID:23546804
Postmarketing surveillance: perspectives of a journal editor.
Gelenberg, A J
1993-01-01
In the absence of a systematic monitoring program for drugs newly approved by the Food and Drug Administration (FDA), reports in clinical journals provide a legitimate forum for disseminating information about unexpected pharmacologic events. A journal editor bears the responsibility for publishing educated clinical observations that meet standards of scientific rigor while not giving premature credibility to chance and dubious reports of side effects of new drugs. Often this responsibility involves overcoming the fear of bad publicity and withstanding pressures from pharmaceutical companies to print only positive information about new products. Published preliminary observations may contribute to the problem of product liability, but they also generate testable hypotheses and healthy debate. If hypotheses later prove to be incorrect, they can be refuted by systematic studies and clarified in reviews and editorials. Our goal of effective education will be reached not by self-censorship but by scientific openness.
Resolving Microzooplankton Functional Groups In A Size-Structured Planktonic Model
NASA Astrophysics Data System (ADS)
Taniguchi, D.; Dutkiewicz, S.; Follows, M. J.; Jahn, O.; Menden-Deuer, S.
2016-02-01
Microzooplankton are important marine grazers, often consuming a large fraction of primary productivity. They consist of a great diversity of organisms with different behaviors, characteristics, and rates. This functional diversity, and its consequences, are not currently reflected in large-scale ocean ecological simulations. How should these organisms be represented, and what are the implications for their biogeography? We develop a size-structured, trait-based model to characterize a diversity of microzooplankton functional groups. We compile and examine size-based laboratory data on the traits, revealing some patterns with size and functional group that we interpret with mechanistic theory. Fitting the model to the data provides parameterizations of key rates and properties, which we employ in a numerical ocean model. The diversity of grazing preference, rates, and trophic strategies enables the coexistence of different functional groups of micro-grazers under various environmental conditions, and the model produces testable predictions of the biogeography.
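A typical ingredient of such size-structured, trait-based models is a grazing kernel combining an allometric clearance rate with a log-normal preference around an optimal predator:prey size ratio. The sketch below uses commonly assumed exponents and a 1000:1 volume optimum; these are generic assumptions, not the parameterization fitted in this work.

```python
# Sketch of a size-structured grazing kernel: allometric maximum clearance
# plus a log-normal preference for an optimal predator:prey size ratio.
# Exponents and the optimum are common assumptions, not fitted values.
import math

def grazing_rate(pred_vol, prey_vol, c0=1.0, alpha=-0.25,
                 opt_ratio=1000.0, sigma=1.0):        # 1000x volume ~ 10x length
    clearance = c0 * pred_vol ** alpha                # smaller grazers: faster per volume
    ratio = pred_vol / prey_vol
    preference = math.exp(-(math.log(ratio / opt_ratio)) ** 2 / (2 * sigma ** 2))
    return clearance * preference

pred = 1e4                                            # um^3, a rough ciliate size
for prey in (1e0, 1e1, 1e2, 1e3):
    print(f"prey volume {prey:6.0f}: relative grazing {grazing_rate(pred, prey):.3f}")
```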
Solving puzzles of GW150914 by primordial black holes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blinnikov, S.; Dolgov, A.; Porayko, N.K.
The black hole binary properties inferred from the LIGO gravitational wave signal GW150914 posed several serious problems. The high masses and low effective spin of black hole binary can be explained if they are primordial (PBH) rather than the products of the stellar binary evolution. Such PBH properties are postulated ad hoc but not derived from fundamental theory. We show that the necessary features of PBHs naturally follow from the slightly modified Affleck-Dine (AD) mechanism of baryogenesis. The log-normal distribution of PBHs, predicted within the AD paradigm, is adjusted to provide an abundant population of low-spin stellar mass black holes. The same distribution gives a sufficient number of quickly growing seeds of supermassive black holes observed at high redshifts and may comprise an appreciable fraction of Dark Matter which does not contradict any existing observational limits. Testable predictions of this scenario are discussed.
A discrete control model of PLANT
NASA Technical Reports Server (NTRS)
Mitchell, C. M.
1985-01-01
A model of the PLANT system using the discrete control modeling techniques developed by Miller is described. Discrete control models attempt to represent in a mathematical form how a human operator might decompose a complex system into simpler parts and how the control actions and system configuration are coordinated so that acceptable overall system performance is achieved. Basic questions include knowledge representation, information flow, and decision making in complex systems. The structure of the model is a general hierarchical/heterarchical scheme which structurally accounts for coordination and dynamic focus of attention. Mathematically, the discrete control model is defined in terms of a network of finite state systems. Specifically, the discrete control model accounts for how specific control actions are selected from information about the controlled system, the environment, and the context of the situation. The objective is to provide a plausible and empirically testable accounting and, if possible, explanation of control behavior.
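In miniature, such a model might couple a supervisor that dynamically focuses attention with per-subsystem finite state machines mapping (state, context) to actions. The states and rules below are a made-up example of the modeling style, not the PLANT model itself.

```python
# Minimal sketch of a discrete control model as a network of finite state
# systems: a supervisor attends to subsystems and selects control actions
# from (subsystem, state) pairs.  States and rules are invented examples.
ACTIONS = {
    ("pump", "overheat"): "reduce_load",
    ("pump", "nominal"):  "monitor",
    ("valve", "stuck"):   "switch_backup",
    ("valve", "nominal"): "monitor",
}

def supervisor(subsystem_states):
    # Dynamic focus of attention: attend to off-nominal subsystems first.
    order = sorted(subsystem_states, key=lambda s: subsystem_states[s] == "nominal")
    for name in order:
        action = ACTIONS[(name, subsystem_states[name])]
        if action != "monitor":
            return name, action          # act on the first off-nominal subsystem
    return None, "monitor"

print(supervisor({"pump": "overheat", "valve": "nominal"}))  # ('pump', 'reduce_load')
print(supervisor({"pump": "nominal", "valve": "nominal"}))   # (None, 'monitor')
```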
Stephenson, Chris P; Baguley, Ian J
2018-02-01
Functional Neurological Symptom Disorder (FND) is a relatively common neurological condition, accounting for approximately 3-6% of neurologist referrals. FND is considered a transient disorder of neuronal function, sometimes linked to physical trauma and psychological stress. Despite this, chronic disability is common; for example, around 40% of adults with motor FND have permanent disability. Building on current theoretical models, this paper proposes that microglial dysfunction could perpetuate functional changes within acute motor FND, thus providing a pathophysiological mechanism underlying the chronic stage of the motor FND phenotypes seen clinically. Core to our argument is microglia's dual role in modulating neuroimmunity and their control of synaptic plasticity, which places them at a pathophysiological nexus wherein coincident physical trauma and psychological stress could cause long-term change in neuronal networks without producing macroscopic structural abnormality. This model proposes a range of hypotheses that are testable with current technologies.
Mercury's magnetic field - A thermoelectric dynamo?
NASA Technical Reports Server (NTRS)
Stevenson, D. J.
1987-01-01
Permanent magnetism and conventional dynamo theory are possible but problematic explanations for the magnitude of the Mercurian magnetic field. A new model is proposed in which thermoelectric currents driven by temperature differences at a bumpy core-mantle boundary are responsible for the (unobserved) toroidal field, and the helicity of convective motions in a thin outer core (thickness of about 100 km) induces the observed poloidal field from the toroidal field. The observed field of about 3 × 10^-7 T can be reproduced provided the electrical conductivity of Mercury's semiconducting mantle approaches 1000 S/m. This model may be testable by future missions to Mercury because it predicts a more complicated field geometry than conventional dynamo theories. However, it is argued that polar wander may cause the core-mantle topography to migrate so that some aspects of the rotational symmetry may be reflected in the observed field.
Pollinator-driven ecological speciation in plants: new evidence and future perspectives
Van der Niet, Timotheüs; Peakall, Rod; Johnson, Steven D.
2014-01-01
Background The hypothesis that pollinators have been important drivers of angiosperm diversity dates back to Darwin, and remains an important research topic today. Mounting evidence indicates that pollinators have the potential to drive diversification at several different stages of the evolutionary process. Microevolutionary studies have provided evidence for pollinator-mediated floral adaptation, while macroevolutionary evidence supports a general pattern of pollinator-driven diversification of angiosperms. However, the overarching issue of whether, and how, shifts in pollination system drive plant speciation represents a critical gap in knowledge. Bridging this gap is crucial to fully understand whether pollinator-driven microevolution accounts for the observed macroevolutionary patterns. Testable predictions about pollinator-driven speciation can be derived from the theory of ecological speciation, according to which adaptation (microevolution) and speciation (macroevolution) are directly linked. This theory is a particularly suitable framework for evaluating evidence for the processes underlying shifts in pollination systems and their potential consequences for the evolution of reproductive isolation and speciation. Scope This Viewpoint paper focuses on evidence for the four components of ecological speciation in the context of plant-pollinator interactions, namely (1) the role of pollinators as selective agents, (2) floral trait divergence, including the evolution of ‘pollination ecotypes‘, (3) the geographical context of selection on floral traits, and (4) the role of pollinators in the evolution of reproductive isolation. This Viewpoint also serves as the introduction to a Special Issue on Pollinator-Driven Speciation in Plants. The 13 papers in this Special Issue range from microevolutionary studies of ecotypes to macroevolutionary studies of historical ecological shifts, and span a wide range of geographical areas and plant families. These studies further illustrate innovative experimental approaches, and they employ modern tools in genetics and floral trait quantification. Future advances to the field require better quantification of selection through male fitness and pollinator isolation, for instance by exploiting next-generation sequencing technologies. By combining these new tools with strategically chosen study systems, and smart experimental design, we predict that examples of pollinator-driven speciation will be among the most widespread and compelling of all cases of ecological speciation. PMID:24418954
The role of beta-endorphin in the pathophysiology of major depression.
Hegadoren, K M; O'Donnell, T; Lanius, R; Coupland, N J; Lacaze-Masmonteil, N
2009-10-01
A role for beta-endorphin (beta-END) in the pathophysiology of major depressive disorder (MDD) is suggested by both animal research and studies examining clinical populations. The major etiological theories of depression include brain regions and neural systems that interact with opioid systems and beta-END. Recent preclinical data have demonstrated multiple roles for beta-END in the regulation of complex homeostatic and behavioural processes that are affected during a depressive episode. Additionally, beta-END inputs to regulatory pathways involving feeding behaviours, motivation, and specific types of motor activity have important implications in defining the biological foundations for specific depressive symptoms. Early research linking beta-END to MDD did so in the context of the hypothalamic-pituitary-adrenal (HPA) axis activity, where it was suggested that HPA axis dysregulation may account for depressive symptoms in some individuals. The primary aims of this paper are to use both preclinical and clinical research (a) to critically review data that explores potential roles for beta-END in the pathophysiology of MDD and (b) to highlight gaps in the literature that limit further development of etiological theories of depression and testable hypotheses. In addition to examining methodological and theoretical challenges of past clinical studies, we summarize studies that have investigated basal beta-END levels in MDD and that have used challenge tests to examine beta-END responses to a variety of experimental paradigms. A brief description of the synthesis, location in the CNS and behavioural pharmacology of this neuropeptide is also provided to frame this discussion. Given the lack of clinical improvement observed with currently available antidepressants in a significant proportion of depressed individuals, it is imperative that novel mechanisms be investigated for antidepressant potential. We conclude that the renewed interest in elucidating the role of beta-END in the pathophysiology of MDD must be paralleled by consensus building within the research community around the heterogeneity inherent in mood disorders, standardization of experimental protocols, improved discrimination of POMC products in analytical techniques and consistent attention paid to important confounds like age and gender.
DuBois, Debra C; Piel, William H; Jusko, William J
2008-01-01
High-throughput data collection using gene microarrays has great potential as a method for addressing the pharmacogenomics of complex biological systems. Similarly, mechanism-based pharmacokinetic/pharmacodynamic modeling provides a tool for formulating quantitative testable hypotheses concerning the responses of complex biological systems. As the response of such systems to drugs generally entails cascades of molecular events in time, a time series design provides the best approach to capturing the full scope of drug effects. A major problem in using microarrays for high-throughput data collection is sorting through the massive amount of data in order to identify probe sets and genes of interest. Due to its inherent redundancy, a rich time series containing many time points and multiple samples per time point allows for the use of less stringent criteria of expression, expression change and data quality for initial filtering of unwanted probe sets. The remaining probe sets can then become the focus of more intense scrutiny by other methods, including temporal clustering, functional clustering and pharmacokinetic/pharmacodynamic modeling, which provide additional ways of identifying the probes and genes of pharmacological interest. PMID:15212590
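A first-pass filter of the kind argued for here is short to write down. The sketch below keeps probe sets that are expressed in a minimum number of samples and show a minimum fold change across the time series; the thresholds and the toy expression matrix are assumed purely for illustration.

```python
# Sketch of the redundancy-based first-pass filter described: with many time
# points and replicates, lenient criteria suffice to discard uninteresting
# probe sets.  Thresholds and the toy data matrix are assumptions.
import numpy as np

rng = np.random.default_rng(0)
expr = rng.lognormal(mean=4, sigma=1, size=(1000, 24))   # probes x (8 times x 3 reps)

present   = (expr > 50).sum(axis=1) >= 6        # "expressed" in >= 6 of 24 samples
time_mean = expr.reshape(1000, 8, 3).mean(axis=2)
fold      = time_mean.max(axis=1) / time_mean.min(axis=1)
keep      = present & (fold >= 2.0)             # lenient cutoffs, per the argument

print(f"{keep.sum()} of {len(expr)} probe sets pass the first-pass filter")
```

Probe sets passing this screen would then go on to the more intensive steps named in the abstract: temporal clustering, functional clustering, and pharmacokinetic/pharmacodynamic modeling.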
Social calls provide novel insights into the evolution of vocal learning
Sewall, Kendra B.; Young, Anna M.; Wright, Timothy F.
2016-01-01
Learned song is among the best-studied models of animal communication. In oscine songbirds, where learned song is most prevalent, it is used primarily for intrasexual selection and mate attraction. Learning of a different class of vocal signals, known as contact calls, is found in a diverse array of species, where they are used to mediate social interactions among individuals. We argue that call learning provides a taxonomically rich system for studying testable hypotheses for the evolutionary origins of vocal learning. We describe and critically evaluate four nonmutually exclusive hypotheses for the origin and current function of vocal learning of calls, which propose that call learning (1) improves auditory detection and recognition, (2) signals local knowledge, (3) signals group membership, or (4) allows for the encoding of more complex social information. We propose approaches to testing these four hypotheses but emphasize that all of them share the idea that social living, not sexual selection, is a central driver of vocal learning. Finally, we identify future areas for research on call learning that could provide new perspectives on the origins and mechanisms of vocal learning in both animals and humans. PMID:28163325
NEXUS Scalable and Distributed Next-Generation Avionics Bus for Space Missions
NASA Technical Reports Server (NTRS)
He, Yutao; Shalom, Eddy; Chau, Savio N.; Some, Raphael R.; Bolotin, Gary S.
2011-01-01
A paper discusses NEXUS, a common, next-generation avionics interconnect that is transparently compatible with wired, fiber-optic, and RF physical layers; provides a flexible, scalable, packet switched topology; is fault-tolerant with sub-microsecond detection/recovery latency; has scalable bandwidth from 1 Kbps to 10 Gbps; has guaranteed real-time determinism with sub-microsecond latency/jitter; has built-in testability; features low power consumption (< 100 mW per Gbps); is lightweight with about a 5,000-logic-gate footprint; and is implemented in a small Bus Interface Unit (BIU) with a reconfigurable back-end providing interfaces to legacy subsystems. NEXUS enhances a commercial interconnect standard, Serial RapidIO, to meet avionics interconnect requirements without breaking the standard. This unified interconnect technology can be used to meet the performance, power, size, and reliability requirements of all ranges of equipment, sensors, and actuators at chip-to-chip, board-to-board, or box-to-box boundaries. Early results from in-house modeling of Serial RapidIO using VisualSim indicate that the use of a switched, high-performance avionics network will provide a quantum leap in spacecraft onboard science and autonomy capability for science and exploration missions.
Embedded performance validity testing in neuropsychological assessment: Potential clinical tools.
Rickards, Tyler A; Cranston, Christopher C; Touradji, Pegah; Bechtold, Kathleen T
2018-01-01
The article aims to suggest clinically-useful tools in neuropsychological assessment for efficient use of embedded measures of performance validity. To accomplish this, we integrated available validity-related and statistical research from the literature, consensus statements, and survey-based data from practicing neuropsychologists. We provide recommendations for 1) cutoffs for embedded performance validity tests, including Reliable Digit Span, California Verbal Learning Test (Second Edition) Forced Choice Recognition, Rey-Osterrieth Complex Figure Test Combination Score, Wisconsin Card Sorting Test Failure to Maintain Set, and the Finger Tapping Test; 2) selecting the number of performance validity measures to administer in an assessment; and 3) hypothetical clinical decision-making models for the use of performance validity testing in a neuropsychological assessment, collectively considering behavior, patient reporting, and data indicating invalid or noncredible performance. Performance validity testing helps inform the clinician about an individual's general approach to tasks: response to failure, task engagement and persistence, and compliance with task demands. These data-driven clinical suggestions provide a resource for clinicians, are intended to instigate conversation within the field toward more uniform, testable decision-making, and point to directions for future research in this area.
Diffusion in the presence of a local attracting factor: Theory and interdisciplinary applications.
Veermäe, Hardi; Patriarca, Marco
2017-06-01
In many complex diffusion processes the drift of random walkers is not caused by an external force, as in the case of Brownian motion, but by local variations of fitness perceived by the random walkers. In this paper, a simple but general framework is presented that describes such a type of random motion and may be of relevance in different problems, such as opinion dynamics, cultural spreading, and animal movement. To this aim, we study the problem of a random walker in d dimensions moving in the presence of a local heterogeneous attracting factor expressed in terms of an assigned position-dependent "attractiveness function." At variance with standard Brownian motion, the attractiveness function introduced here regulates both the advection and diffusion of the random walker, thus providing testable predictions for a specific form of fluctuation-relations. We discuss the relation between the drift-diffusion equation based on the attractiveness function and that describing standard Brownian motion, and we provide some explicit examples illustrating its relevance in different fields, such as animal movement, chemotactic diffusion, and social dynamics.
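To make the idea concrete, a hedged one-dimensional discretization: hops are weighted by the attractiveness of the destination site, which produces drift toward local maxima of the attractiveness function. The update rule and Gaussian attractor below are illustrative assumptions; the paper's full model also modulates the diffusion term, which this toy version does not.

```python
import numpy as np

def simulate_walkers(attractiveness, n_walkers=5000, n_steps=2000, seed=1):
    """1D lattice walk with periodic boundaries: at each step a walker hops
    left or right with probability proportional to the attractiveness of
    the destination site, producing drift toward attractive regions."""
    rng = np.random.default_rng(seed)
    n = len(attractiveness)
    pos = rng.integers(0, n, size=n_walkers)
    for _ in range(n_steps):
        a_left = attractiveness[(pos - 1) % n]
        a_right = attractiveness[(pos + 1) % n]
        p_right = a_right / (a_left + a_right)   # destination-weighted choice
        step = np.where(rng.random(n_walkers) < p_right, 1, -1)
        pos = (pos + step) % n
    return pos

x = np.arange(200)
A = 1.0 + 4.0 * np.exp(-((x - 100.0) ** 2) / (2 * 15.0 ** 2))  # local attractor
final = simulate_walkers(A)
print("fraction of walkers within 30 sites of the attractor:",
      np.mean(np.abs(final - 100) < 30))
```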
The spectro-contextual encoding and retrieval theory of episodic memory.
Watrous, Andrew J; Ekstrom, Arne D
2014-01-01
The spectral fingerprint hypothesis, which posits that different frequencies of oscillations underlie different cognitive operations, provides one account for how interactions between brain regions support perceptual and attentive processes (Siegel etal., 2012). Here, we explore and extend this idea to the domain of human episodic memory encoding and retrieval. Incorporating findings from the synaptic to cognitive levels of organization, we argue that spectrally precise cross-frequency coupling and phase-synchronization promote the formation of hippocampal-neocortical cell assemblies that form the basis for episodic memory. We suggest that both cell assembly firing patterns as well as the global pattern of brain oscillatory activity within hippocampal-neocortical networks represents the contents of a particular memory. Drawing upon the ideas of context reinstatement and multiple trace theory, we argue that memory retrieval is driven by internal and/or external factors which recreate these frequency-specific oscillatory patterns which occur during episodic encoding. These ideas are synthesized into a novel model of episodic memory (the spectro-contextual encoding and retrieval theory, or "SCERT") that provides several testable predictions for future research.
FPGA Implementation of Heart Rate Monitoring System.
Panigrahy, D; Rakshit, M; Sahu, P K
2016-03-01
This paper describes a field programmable gate array (FPGA) implementation of a system that calculates the heart rate from the electrocardiogram (ECG) signal. After heart rate calculation, tachycardia, bradycardia, or a normal heart rate can easily be detected. ECG is a diagnostic tool routinely used to assess the electrical activity and muscular function of the heart. Heart rate is calculated by detecting the R peaks from the ECG signal. Providing a portable, continuous heart-rate monitoring system for patients using ECG requires dedicated hardware. An FPGA provides easy testability and allows faster implementation and verification of a new design. We have proposed a five-stage methodology using basic VHDL blocks such as addition, multiplication, and data conversion (real to fixed point and vice versa). Our proposed heart rate calculation (R-peak detection) method has been validated using the 48 first-channel ECG records of the MIT-BIH arrhythmia database. It shows an accuracy of 99.84%, a sensitivity of 99.94%, and a positive predictive value of 99.89%. Our proposed method outperforms other well-known methods on pathological ECG signals and has been successfully implemented in an FPGA.
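The heart-rate arithmetic downstream of peak detection is simple; a hedged software sketch, with a naive threshold detector standing in for the paper's five-stage VHDL pipeline:

```python
import numpy as np

def detect_r_peaks(ecg, fs, threshold=0.6):
    """Naive R-peak detector: local maxima above a fixed fraction of the
    signal's peak amplitude, with a 200 ms refractory period."""
    refractory = int(0.2 * fs)
    thr = threshold * ecg.max()
    peaks, last = [], -refractory
    for i in range(1, len(ecg) - 1):
        if ecg[i] > thr and ecg[i] >= ecg[i - 1] and ecg[i] >= ecg[i + 1]:
            if i - last >= refractory:
                peaks.append(i)
                last = i
    return np.array(peaks)

def heart_rate_bpm(peaks, fs):
    rr = np.diff(peaks) / fs          # R-R intervals in seconds
    return 60.0 / rr.mean()           # beats per minute

fs = 360                              # MIT-BIH sampling rate, Hz
t = np.arange(0, 10, 1 / fs)
ecg = np.sin(2 * np.pi * 1.2 * t) ** 63   # crude train of sharp positive peaks
bpm = heart_rate_bpm(detect_r_peaks(ecg, fs), fs)
print(f"estimated heart rate: {bpm:.0f} bpm")   # ~72 bpm for a 1.2 Hz train
```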
Combined neurostimulation and neuroimaging in cognitive neuroscience: past, present, and future.
Bestmann, Sven; Feredoes, Eva
2013-08-01
Modern neurostimulation approaches in humans provide controlled inputs into the operations of cortical regions, with highly specific behavioral consequences. This enables causal structure-function inferences, and in combination with neuroimaging, has provided novel insights into the basic mechanisms of action of neurostimulation on distributed networks. For example, more recent work has established the capacity of transcranial magnetic stimulation (TMS) to probe causal interregional influences, and their interaction with cognitive state changes. Combinations of neurostimulation and neuroimaging now face the challenge of integrating the known physiological effects of neurostimulation with theoretical and biological models of cognition, for example, when theoretical stalemates between opposing cognitive theories need to be resolved. This will be driven by novel developments, including biologically informed computational network analyses for predicting the impact of neurostimulation on brain networks, as well as novel neuroimaging and neurostimulation techniques. Such future developments may offer an expanded set of tools with which to investigate structure-function relationships, and to formulate and reconceptualize testable hypotheses about complex neural network interactions and their causal roles in cognition. © 2013 New York Academy of Sciences.
A Framework for Finding and Interpreting Stellar CMEs
NASA Astrophysics Data System (ADS)
Osten, Rachel A.; Wolk, Scott J.
2017-10-01
The astrophysical study of mass loss, both steady-state and transient, on the cool half of the HR diagram has implications both for the star itself and for the conditions created around the star that can be hospitable or inimical to supporting life. Stellar coronal mass ejections (CMEs) have not been conclusively detected, despite the ubiquity with which their radiative counterparts in an eruptive event (flares) have been observed. I will review some of the different observational methods which have been used, and could be used in the future, in the stellar case, emphasizing some of the difficulties inherent in such attempts. I will provide a framework for interpreting potential transient stellar mass loss in light of the properties of flares known to occur on magnetically active stars. This uses a physically motivated way to connect the properties of flares and coronal mass ejections and provides a testable hypothesis for observing or constraining transient stellar mass loss. Finally, I will describe recent results using observations at low radio frequencies to detect stellar coronal mass ejections, and give updates on prospects for using future facilities to make headway in this important area.
Steps in the bacterial flagellar motor.
Mora, Thierry; Yu, Howard; Sowa, Yoshiyuki; Wingreen, Ned S
2009-10-01
The bacterial flagellar motor is a highly efficient rotary machine used by many bacteria to propel themselves. It has recently been shown that at low speeds its rotation proceeds in steps. Here we propose a simple physical model, based on the storage of energy in protein springs, that accounts for this stepping behavior as a random walk in a tilted corrugated potential that combines torque and contact forces. We argue that the absolute angular position of the rotor is crucial for understanding step properties and show this hypothesis to be consistent with the available data, in particular the observation that backward steps are smaller on average than forward steps. We also predict a sublinear speed versus torque relationship for fixed load at low torque, and a peak in rotor diffusion as a function of torque. Our model provides a comprehensive framework for understanding and analyzing stepping behavior in the bacterial flagellar motor and proposes novel, testable predictions. More broadly, the storage of energy in protein springs by the flagellar motor may provide useful general insights into the design of highly efficient molecular machines.
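A hedged sketch of the kind of dynamics such a model describes: overdamped Langevin motion in a tilted, corrugated ("washboard") potential V(θ) = −τθ + A·cos(Nθ), where torque tilts the landscape and corrugation from contact forces produces discrete steps. All parameter values below are illustrative, not fitted to the paper.

```python
import numpy as np

def washboard_trajectory(torque=20.0, barrier=3.0, n_wells=26,
                         kT=1.0, dt=1e-4, n_steps=200_000, seed=2):
    """Overdamped Langevin dynamics in the tilted corrugated potential
    V(theta) = -torque*theta + barrier*cos(n_wells*theta).
    Torque tilts the washboard; thermal noise drives hops between minima,
    which appear as discrete forward (and rarer backward) steps."""
    rng = np.random.default_rng(seed)
    theta, traj = 0.0, np.empty(n_steps)
    kick = np.sqrt(2.0 * kT * dt)
    for i in range(n_steps):
        force = torque + barrier * n_wells * np.sin(n_wells * theta)  # -dV/dtheta
        theta += force * dt + kick * rng.standard_normal()
        traj[i] = theta
    return traj

traj = washboard_trajectory()
# Bin the angle into wells of width 2*pi/26 (26 steps per revolution is the
# experimentally reported value for the motor; an assumption here, not fitted).
wells = np.round(traj / (2 * np.pi / 26)).astype(int)
moves = np.diff(wells[::50])
print("forward hops:", int((moves > 0).sum()),
      "backward hops:", int((moves < 0).sum()))
```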
Matisoo-Smith, Elizabeth; Gosling, Anna L
2018-05-01
The Pacific region has had a complex human history. It has been subject to multiple major human dispersal and colonisation events, including some of the earliest Out-of-Africa migrations, the so-called Austronesian expansion of people out of Island Southeast Asia, and the more recent arrival of Europeans. Despite models of island isolation, evidence suggests significant levels of interconnectedness that vary in direction and frequency over time. The Pacific Ocean covers a vast area and its islands provide an array of different physical environments with variable pathogen loads and subsistence opportunities. These diverse environments likely caused Pacific peoples to adapt (both genetically and culturally) in unique ways. Differences in genetic background, in combination with adaptation, likely affect their susceptibility to non-communicable diseases. Here we provide an overview of some of the key issues in the natural and human history of the Pacific region which are likely to impact human health. We argue that understanding the evolutionary and cultural history of Pacific peoples is essential for the generation of testable hypotheses surrounding potential causes of elevated disease susceptibility among Pacific peoples.
Follett, Christopher L; Dutkiewicz, Stephanie; Karl, David M; Inomura, Keisuke; Follows, Michael J
2018-06-01
In the North Pacific Subtropical Gyre (NPSG), an annual pulse of sinking organic carbon is observed at 4000 m between July and August, driven by large diatoms found in association with nitrogen fixing, heterocystous, cyanobacteria: Diatom-Diazotroph Associations (DDAs). Here we ask what drives the bloom of DDAs and present a simplified trait-based model of subtropical phototroph populations driven by observed, monthly averaged, environmental characteristics. The ratio of resource supply rates favors nitrogen fixation year round. The relative fitness of DDA traits is most competitive in early summer when the mixed layer is shallow, solar irradiance is high, and phosphorus and iron are relatively abundant. Later in the season, as light intensity drops and phosphorus is depleted, the traits of small unicellular diazotrophs become more competitive. The competitive transition happens in August, at the time when the DDA export event occurs. This seasonal dynamic is maintained when embedded in a more complex, global-scale, ecological model, and provides predictions for the extent of the North Pacific DDA bloom. The model provides a parsimonious and testable hypothesis for the stimulation of DDA blooms.
A Model-based Health Monitoring and Diagnostic System for the UH-60 Helicopter. Appendix D
NASA Technical Reports Server (NTRS)
Patterson-Hine, Ann; Hindson, William; Sanderfer, Dwight; Deb, Somnath; Domagala, Chuck
2001-01-01
Model-based reasoning techniques hold much promise in providing comprehensive monitoring and diagnostics capabilities for complex systems. We are exploring the use of one of these techniques, which utilizes multi-signal modeling and the TEAMS-RT real-time diagnostic engine, on the UH-60 Rotorcraft Aircrew Systems Concepts Airborne Laboratory (RASCAL) flight research aircraft. We focus on the engine and transmission systems, and acquire sensor data across the 1553 bus as well as by direct analog-to-digital conversion from sensors to the QHuMS (Qualtech health and usage monitoring system) computer. The QHuMS computer uses commercially available components and is rack-mounted in the RASCAL facility. A multi-signal model of the transmission and engine subsystems enables studies of system testability and analysis of the degree of fault isolation available with various instrumentation suites. The model and examples of these analyses will be described and the data architectures enumerated. Flight tests of this system will validate the data architecture and provide real-time flight profiles to be further analyzed in the laboratory.
Clinical and neurocognitive aspects of hallucinations in Alzheimer's disease.
El Haj, Mohamad; Roche, Jean; Jardri, Renaud; Kapogiannis, Dimitrios; Gallouj, Karim; Antoine, Pascal
2017-12-01
Due to their prevalence, hallucinations are considered as one of the most frequent psychotic symptoms in Alzheimer's disease (AD). These psychotic manifestations reduce patients' well-being, increase the burden of caregivers, contribute to early institutionalization, and are related with the course of cognitive decline in AD. Considering their consequences, we provide a comprehensive account of the current state of knowledge about the prevalence and characteristics of hallucinations in AD. We propose a comprehensive and testable theoretical model about hallucinations in AD: the ALZHA (ALZheimer and HAllucinations) model. In this model, neurological, genetic, cognitive, affective, and iatrogenic factors associated with hallucinations in AD are highlighted. According to the ALZHA model, hallucinations in AD first involve trait markers (i.e., cognitive deficits, neurological deficits, genetic predisposition and/or sensory deficits) to which state markers that may trigger these experiences are added (e.g., psychological distress and/or iatrogenic factors). Finally, we provide recommendations for assessment and management of these psychotic manifestations in AD, with the aim to benefit patients, caregivers, and health professionals. Copyright © 2017 Elsevier Ltd. All rights reserved.
What is wrong with intelligent design?
Sober, Elliott
2007-03-01
This article reviews two standard criticisms of creationism/intelligent design (ID): that it is unfalsifiable, and that it is refuted by the many imperfect adaptations found in nature. Problems with both criticisms are discussed. A conception of testability is described that avoids the defects in Karl Popper's falsifiability criterion. Although ID comes in multiple forms, which call for different criticisms, it emerges that ID fails to constitute a serious alternative to evolutionary theory.
Integrating principles and multidisciplinary projects in design education
NASA Technical Reports Server (NTRS)
Nevill, Gale E., Jr.
1992-01-01
The critical need to improve engineering design education in the U.S. is presented and a number of actions to achieve that end are discussed. The importance of teaching undergraduates the latest methods and principles through the means of team design in multidisciplinary projects leading to a testable product is emphasized. Desirable training for design instructors is described and techniques for selecting and managing projects that teach effectively are discussed.
Report on phase 1 of the Microprocessor Seminar. [and associated large scale integration
NASA Technical Reports Server (NTRS)
1977-01-01
Proceedings of a seminar on microprocessors and associated large scale integrated (LSI) circuits are presented. The potential for commonality of device requirements, candidate processes and mechanisms for qualifying candidate LSI technologies for high reliability applications, and specifications for testing and testability were among the topics discussed. Various programs and tentative plans of the participating organizations in the development of high reliability LSI circuits are given.
Are some BL Lac objects artefacts of gravitational lensing?
NASA Technical Reports Server (NTRS)
Ostriker, J. P.; Vietri, M.
1985-01-01
It is proposed here that a significant fraction of BL Lac objects are optically violently variable quasars whose continuum emission has been greatly amplified, relative to the line emission, by pointlike gravitational lenses in intervening galaxies. Several anomalous physical and statistical properties of BL Lacs can be understood on the basis of this model, which is immediately testable on the basis of absorption line studies and by direct imaging.
Are there two processes in reasoning? The dimensionality of inductive and deductive inferences.
Stephens, Rachel G; Dunn, John C; Hayes, Brett K
2018-03-01
Single-process accounts of reasoning propose that the same cognitive mechanisms underlie inductive and deductive inferences. In contrast, dual-process accounts propose that these inferences depend upon 2 qualitatively different mechanisms. To distinguish between these accounts, we derived a set of single-process and dual-process models based on an overarching signal detection framework. We then used signed difference analysis to test each model against data from an argument evaluation task, in which induction and deduction judgments are elicited for sets of valid and invalid arguments. Three data sets were analyzed: data from Singmann and Klauer (2011), a database of argument evaluation studies, and the results of an experiment designed to test model predictions. Of the large set of testable models, we found that almost all could be rejected, including all 2-dimensional models. The only testable model able to account for all 3 data sets was a model with 1 dimension of argument strength and independent decision criteria for induction and deduction judgments. We conclude that despite the popularity of dual-process accounts, current results from the argument evaluation task are best explained by a single-process account that incorporates separate decision thresholds for inductive and deductive inferences. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
Moses Lake Fishery Restoration Project : FY 1999 Annual Report.
DOE Office of Scientific and Technical Information (OSTI.GOV)
None given
2000-12-01
The Moses Lake Project consists of 3 phases. Phase 1 is the assessment of all currently available physical and biological information, the collection of baseline biological data, the formulation of testable hypotheses, and the development of a detailed study plan to test the hypotheses. Phase 2 is dedicated to the implementation of the study plan, including data collection, hypothesis testing, and the formulation of a management plan. Phase 3 of the project is the implementation of the management plan and the monitoring and evaluation of the implemented recommendations. The project intends to restore the failed recreational fishery for panfish species (black crappie, bluegill, and yellow perch) in Moses Lake as off-site mitigation for lost recreational fishing opportunities for anadromous species in the upper Columbia River. This report summarizes the results of Phase 1 investigations and presents the study plan directed at initiating Phase 2 of the project. Phase 1 of the project culminates with the formulation of testable hypotheses directed at investigating possible limiting factors to the production of panfish in Moses Lake. The limiting factors to be investigated will include water quality, habitat quantity and quality, food limitations, competition, recruitment, predation, over-harvest, environmental requirements, and the physical and chemical limitations of the system in relation to the fishes.
Feldstein Ewing, Sarah W.; Filbey, Francesca M.; Hendershot, Christian S.; McEachern, Amber D.; Hutchison, Kent E.
2011-01-01
Objective: Despite the prevalence and profound consequences of alcohol use disorders, psychosocial alcohol interventions have widely varying outcomes. The range of behavior following psychosocial alcohol treatment indicates the need to gain a better understanding of active ingredients and how they may operate. Although this is an area of great interest, at this time there is a limited understanding of how in-session behaviors may catalyze changes in the brain and subsequent alcohol use behavior. Thus, in this review, we aim to identify the neurobiological routes through which psychosocial alcohol interventions may lead to post-session behavior change as well as offer an approach to conceptualize and evaluate these translational relationships. Method: PubMed and PsycINFO searches identified studies that successfully integrated functional magnetic resonance imaging and psychosocial interventions. Results: Based on this research, we identified potential neurobiological substrates through which behavioral alcohol interventions may initiate and sustain behavior change. In addition, we proposed a testable model linking within-session active ingredients to outside-of-session behavior change. Conclusions: Through this review, we present a testable translational model. Additionally, we illustrate how the proposed model can help facilitate empirical evaluations of psychotherapeutic factors and their underlying neural mechanisms, both in the context of motivational interviewing and in the treatment of alcohol use disorders. PMID:22051204
Testability and epistemic shifts in modern cosmology
NASA Astrophysics Data System (ADS)
Kragh, Helge
2014-05-01
During the last decade new developments in theoretical and speculative cosmology have reopened the old discussion of cosmology's scientific status and the more general question of the demarcation between science and non-science. The multiverse hypothesis, in particular, is central to this discussion and controversial because it seems to disagree with methodological and epistemic standards traditionally accepted in the physical sciences. But what are these standards and how sacrosanct are they? Does anthropic multiverse cosmology rest on evaluation criteria that conflict with and go beyond those ordinarily accepted, so that it constitutes an "epistemic shift" in fundamental physics? The paper offers a brief characterization of the modern multiverse and also refers to a few earlier attempts to introduce epistemic shifts in the science of the universe. It further discusses the several meanings of testability, addresses the question of falsifiability as a sine qua non for a theory being scientific, and briefly compares the situation in cosmology with the one in systematic biology. Multiverse theory is not generally falsifiable, which has led some physicists to propose overruling not only Popperian standards but also other evaluation criteria of a philosophical nature. However, this is hardly possible, nor is it possible to get rid of explicit philosophical considerations in some other aspects of cosmological research, however advanced it becomes.
Multiple payers, commonality and free-riding in health care: Medicare and private payers.
Glazer, Jacob; McGuire, Thomas G
2002-11-01
Managed health care plans and providers in the US and elsewhere sell their services to multiple payers. For example, the three largest groups of purchasers from health plans in the US are employers, Medicaid plans, and Medicare, with the first two accounting for over 90% of the total enrollees. In the case of hospitals, Medicare is the largest buyer, but it alone only accounts for 40% of the total payments. While payers have different objectives and use different contracting practices, the plans and providers set some elements of the quality in common for all payers. In this paper, we study the interactions between a public payer, modeled on Medicare, which sets a price and takes any willing provider, a private payer, which limits providers and pays a price on the basis of quality, and a provider/plan, in the presence of shared elements of quality. The provider compromises in response to divergent incentives from payers. The private sector dilutes Medicare payment initiatives, and may, under some circumstances, repair Medicare payment policy mistakes. If Medicare behaves strategically in the presence of private payers, it can free-ride on the private payer and set its prices too low. Our paper has many testable implications, including a new hypothesis for why Medicare has failed to gain acceptance of health plans in the US.
Educating health care trainees and professionals about suicide prevention in depressed adolescents.
Rice, Timothy R; Sher, Leo
2013-01-01
Adolescent depression is a highly prevalent disorder with significant morbidity and suicide mortality. It is simultaneously highly responsive to treatment. Adolescents wish to discuss depression with their providers, and providers routinely receive opportunities to do so. These characteristics of prevalence, morbidity, mortality, responsiveness, and accessibility make adolescent depression an excellent target of care. However, most health care trainees and professionals report low confidence in caring for adolescent depression. As a caregiver community, we fare poorly in routine matters of assessment and management of adolescent depression. All health care professionals are trained within a medical model. In this light, the conceptualization of adolescent depression and suicidality within the medical model may increase provider confidence and performance. Epidemiology and neurobiology are presented with emphasis in this review. Legal concerns also affect health care professionals. For example, providers may deviate from evidence-based medicine owing to anxieties that the identification and treatment of depression may induce suicide and consequent legal culpability. A review of the historical context and relevant outcome trials concerning the increased risk of suicidality in depressed adolescents treated with selective serotonin reuptake inhibitors may increase provider comfort. Furthermore, increased didactic and experiential training improves provider performance. In this work, proven models are discussed, and we advance the testable hypothesis that education incorporating the views of this article can produce the best care for depressed adolescents.
A plausible radiobiological model of cardiovascular disease at low or fractionated doses
NASA Astrophysics Data System (ADS)
Little, Mark; Vandoolaeghe, Wendy; Gola, Anna; Tzoulaki, Ioanna
Atherosclerosis is the main cause of coronary heart disease and stroke, the two major causes of death in developed society. There is emerging evidence of excess risk of cardiovascular disease at low radiation doses in various occupationally-exposed groups receiving small daily radiation doses. Assuming that they are causal, the mechanisms for effects of chronic fractionated radiation exposures on cardiovascular disease are unclear. We outline a spatial reaction-diffusion model for atherosclerosis, and perform stability analysis, based wherever possible on human data. We show that a predicted consequence of multiple small radiation doses is to cause mean chemo-attractant (MCP-1) concentration to increase linearly with cumulative dose. The main driver for the increase in MCP-1 is monocyte death, and consequent reduction in MCP-1 degradation. The radiation-induced risks predicted by the model are quantitatively consistent with those observed in a number of occupationally-exposed groups. The changes in equilibrium MCP-1 concentrations with low density lipoprotein cholesterol concentration are also consistent with experimental and epidemiologic data. This proposed mechanism would be experimentally testable. If true, it also has substantive implications for radiological protection, which at present does not take cardiovascular disease into account. The Japanese A-bomb survivor data implies that cardiovascular disease and cancer mortality contribute similarly to radiogenic risk. The major uncertainty in assessing the low-dose risk of cardiovascular disease is the shape of the dose response relationship, which is unclear in the Japanese data. The analysis of the present paper suggests that linear extrapolation would be appropriate for this endpoint.
Shvets, Alexey A; Kolomeisky, Anatoly B
2017-10-03
The ability to precisely edit and modify a genome opens endless opportunities to investigate fundamental properties of living systems as well as to advance various medical techniques and bioengineering applications. This possibility is now close to reality due to a recent discovery of the adaptive bacterial immune system, which is based on clustered regularly interspaced short palindromic repeats (CRISPR)-associated proteins (Cas) that utilize RNA to find and cut the double-stranded DNA molecules at specific locations. Here we develop a quantitative theoretical approach to analyze the mechanism of target search on DNA by CRISPR RNA-guided Cas9 proteins, which is followed by a selective cleavage of nucleic acids. It is based on a discrete-state stochastic model that takes into account the most relevant physical-chemical processes in the system. Using a method of first-passage processes, a full dynamic description of the target search is presented. It is found that the location of specific sites on DNA by CRISPR Cas9 proteins is governed by binding first to protospacer adjacent motif sequences on DNA, which is followed by reversible transitions into DNA interrogation states. In addition, the search dynamics is strongly influenced by the off-target cutting. Our theoretical calculations allow us to explain the experimental observations and to give experimentally testable predictions. Thus, the presented theoretical model clarifies some molecular aspects of the genome interrogation by CRISPR RNA-guided Cas9 proteins. Copyright © 2017 Biophysical Society. Published by Elsevier Inc. All rights reserved.
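The first-passage machinery in such discrete-state models reduces to solving a linear system for mean first-passage times. A toy sketch with three generic states (free, PAM-bound, interrogating) and made-up rates, not the paper's parameterization:

```python
import numpy as np

# Toy discrete-state chain for target search: 0 = free in solution,
# 1 = PAM-bound, 2 = DNA interrogation; cleavage is absorbing.
# All rates (per second) are illustrative assumptions only.
rates = {
    (0, 1): 50.0,   # PAM binding
    (1, 0): 200.0,  # PAM release
    (1, 2): 20.0,   # reversible entry into the interrogation state
    (2, 1): 5.0,    # rejection of the site, back to PAM-bound
}
k_cleave = 1.0      # interrogation -> cleavage (absorbing)

# Mean first-passage times T_i to cleavage solve the linear system
#   sum_j k_ij*(T_j - T_i) - (i == 2)*k_cleave*T_i = -1,
# with T = 0 in the absorbing (cleaved) state.
n = 3
A = np.zeros((n, n))
for (i, j), k in rates.items():
    A[i, i] -= k
    A[i, j] += k
A[2, 2] -= k_cleave
T = np.linalg.solve(A, -np.ones(n))
print(f"mean time to cleavage starting from solution: {T[0]:.2f} s")
```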
Reptile scale paradigm: Evo-Devo, pattern formation and regeneration
Chang, Cheng; Wu, Ping; Baker, Ruth E.; Maini, Philip K.; Alibardi, Lorenzo; Chuong, Cheng-Ming
2010-01-01
The purpose of this perspective is to highlight the merit of the reptile integument as an experimental model. Reptiles represent the first amniotes. From stem reptiles, extant reptiles, birds and mammals have evolved. Mammal hairs and feathers evolved from Therapsid and Sauropsid reptiles, respectively. The early reptilian integument had to adapt to the challenges of terrestrial life, developing a multi-layered stratum corneum capable of barrier function and ultraviolet protection. For better mechanical protection, diverse reptilian scale types have evolved. The evolution of endothermy has driven the convergent evolution of hair and feather follicles: both form multiple localized growth units with stem cells and transient amplifying cells protected in the proximal follicle. This topological arrangement allows them to elongate, molt and regenerate without structural constraints. Another unique feature of reptile skin is the exquisite arrangement of scales and pigment patterns, making them testable models for mechanisms of pattern formation. Since they face the constant threat of damage on land, different strategies were developed to accommodate skin homeostasis and regeneration. Temporally, they can be under continuous renewal or sloughing cycles. Spatially, they can be diffuse or form discrete localized growth units (follicles). To understand how gene regulatory networks evolved to produce increasingly complex ectodermal organs, we have to study how prototypic scale-forming pathways in reptiles are modulated to produce appendage novelties. Despite the fact that there are numerous studies of reptile scales, molecular analyses have lagged behind. Here, we underscore how further development of this novel experimental model will be valuable in filling the gaps of our understanding of the Evo-Devo of amniote integuments. PMID:19557687
Creation of a Mouse with Stress-Induced Dystonia: Control of an ATPase Chaperone
2013-04-01
was successful, and a mouse with the desired dystonic symptoms was obtained. It has two mutations, one a dominantly inherited gene with 100... the hallmark of dystonia. Subject terms: dystonia, genetically modified mice, stress, gene mutations, animal model of disease. ... there are a variety of hypotheses that should be testable if there were a realistic animal model. Mice with mutations in genes known to cause dystonia
Effectiveness of spacecraft testing programs
NASA Technical Reports Server (NTRS)
Krausz, A.
1980-01-01
The need for testing under simulated mission operational conditions is discussed and the results of such tests are reviewed from the point of view of the user. A brief overview of the usual test sequences for high reliability, long life spacecraft is presented, and the effectiveness of the testing program is analyzed in terms of the defects which are discovered by such tests. The need for automation, innovative mechanical test procedures, and design for testability is discussed.
Zee-Babu type model with U(1)Lμ-Lτ gauge symmetry
NASA Astrophysics Data System (ADS)
Nomura, Takaaki; Okada, Hiroshi
2018-05-01
We extend the Zee-Babu model, introducing local U(1)Lμ-Lτ symmetry with several singly charged bosons. We find a predictive neutrino mass texture under the simple hypothesis that mixings among the singly charged bosons are negligible. Lepton-flavor violations are also less constrained than in the original model. We then explore the testability of the model, focusing on doubly charged boson physics at the LHC and the International Linear Collider.
Xylella genomics and bacterial pathogenicity to plants.
Dow, J M; Daniels, M J
2000-12-01
Xylella fastidiosa, a pathogen of citrus, is the first plant pathogenic bacterium for which the complete genome sequence has been published. Inspection of the sequence reveals high relatedness to many genes of other pathogens, notably Xanthomonas campestris. Based on this, we suggest that Xylella possesses certain easily testable properties that contribute to pathogenicity. We also present some general considerations for deriving information on pathogenicity from bacterial genomics. Copyright 2000 John Wiley & Sons, Ltd.
An evolutionary scenario for the origin of flowers.
Frohlich, Michael W
2003-07-01
The Mostly Male theory is the first to use evidence from gene phylogenies, genetics, modern plant morphology and fossils to explain the evolutionary origin of flowers. It proposes that flower organization derives more from the male structures of ancestral gymnosperms than from female structures. The theory arose from a hypothesis-based study. Such studies are the most likely to generate testable evolutionary scenarios, which should be the ultimate goal of evo-devo.
A collider observable QCD axion
Dimopoulos, Savas; Hook, Anson; Huang, Junwu; ...
2016-11-09
Here, we present a model where the QCD axion is at the TeV scale and visible at a collider via its decays. Conformal dynamics and strong CP considerations account for the axion coupling strongly enough to the standard model to be produced as well as the coincidence between the weak scale and the axion mass. The model predicts additional pseudoscalar color octets whose properties are completely determined by the axion properties rendering the theory testable.
Soviet Economic Policy Towards Eastern Europe
1988-11-01
high. Without specifying the determinants of Soviet demand for "allegiance" in more detail, the model is not testable; we cannot predict how subsidy... trade inside (Czechoslovakia, Bulgaria). These countries are behaving as predicted by the model. If this hypothesis is true, the pattern of subsidies... also compares the sum of per capita subsidies by country between 1970 and 1982 with the sum of subsidies predicted by the model. Because of the poor
Gervasi, Stephanie S.; Stephens, Patrick R.; Hua, Jessica; Searle, Catherine L.; Xie, Gisselle Yang; Urbina, Jenny; Olson, Deanna H.; Bancroft, Betsy A.; Weis, Virginia; Hammond, John I.; Relyea, Rick A.; Blaustein, Andrew R.
2017-01-01
Variation in host responses to pathogens can have cascading effects on populations and communities when some individuals or groups of individuals display disproportionate vulnerability to infection or differ in their competence to transmit infection. The fungal pathogen, Batrachochytrium dendrobatidis (Bd) has been detected in almost 700 different amphibian species and is implicated in numerous global amphibian population declines. Identifying key hosts in the amphibian-Bd system–those who are at greatest risk or who pose the greatest risk for others–is challenging due in part to many extrinsic environmental factors driving spatiotemporal Bd distribution and context-dependent host responses to Bd in the wild. One way to improve predictive risk models and generate testable mechanistic hypotheses about vulnerability is to complement what we know about the spatial epidemiology of Bd with data collected through comparative experimental studies. We used standardized pathogen challenges to quantify amphibian survival and infection trajectories across 20 post-metamorphic North American species raised from eggs. We then incorporated trait-based models to investigate the predictive power of phylogenetic history, habitat use, and ecological and life history traits in explaining responses to Bd. True frogs (Ranidae) displayed the lowest infection intensities, whereas toads (Bufonidae) generally displayed the greatest levels of mortality after Bd exposure. Affiliation with ephemeral aquatic habitat and breadth of habitat use were strong predictors of vulnerability to and intensity of infection and several other traits including body size, lifespan, age at sexual maturity, and geographic range also appeared in top models explaining host responses to Bd. Several of the species examined are highly understudied with respect to Bd such that this study represents the first experimental susceptibility data. Combining insights gained from experimental studies with observations of landscape-level disease prevalence may help explain current and predict future pathogen dynamics in the Bd system. PMID:28095428
Cvicek, Vaclav; Goddard, William A.; Abrol, Ravinder
2016-01-01
The understanding of G-protein coupled receptors (GPCRs) is undergoing a revolution due to increased information about their signaling and the experimental determination of structures for more than 25 receptors. The availability of at least one receptor structure for each of the GPCR classes, well separated in sequence space, enables an integrated superfamily-wide analysis to identify signatures involving the role of conserved residues, conserved contacts, and downstream signaling in the context of receptor structures. In this study, we align the transmembrane (TM) domains of all experimental GPCR structures to maximize the conserved inter-helical contacts. The resulting superfamily-wide GpcR Sequence-Structure (GRoSS) alignment of the TM domains for all human GPCR sequences is sufficient to generate a phylogenetic tree that correctly distinguishes all different GPCR classes, suggesting that the class-level differences in the GPCR superfamily are encoded at least partly in the TM domains. The inter-helical contacts conserved across all GPCR classes describe the evolutionarily conserved GPCR structural fold. The corresponding structural alignment of the inactive and active conformations, available for a few GPCRs, identifies activation hot-spot residues in the TM domains that get rewired upon activation. Many GPCR mutations, known to alter receptor signaling and cause disease, are located at these conserved contact and activation hot-spot residue positions. The GRoSS alignment places the chemosensory receptor subfamilies for bitter taste (TAS2R) and pheromones (Vomeronasal, VN1R) in the rhodopsin family, known to contain the chemosensory olfactory receptor subfamily. The GRoSS alignment also enables the quantification of the structural variability in the TM regions of experimental structures, useful for homology modeling and structure prediction of receptors. Furthermore, this alignment identifies structurally and functionally important residues in all human GPCRs. These residues can be used to make testable hypotheses about the structural basis of receptor function and about the molecular basis of disease-associated single nucleotide polymorphisms. PMID:27028541
Lowet, Eric; Roberts, Mark; Hadjipapas, Avgis; Peter, Alina; van der Eerden, Jan; De Weerd, Peter
2015-01-01
Fine-scale temporal organization of cortical activity in the gamma range (∼25–80Hz) may play a significant role in information processing, for example by neural grouping (‘binding’) and phase coding. Recent experimental studies have shown that the precise frequency of gamma oscillations varies with input drive (e.g. visual contrast) and that it can differ among nearby cortical locations. This has challenged theories assuming widespread gamma synchronization at a fixed common frequency. In the present study, we investigated which principles govern gamma synchronization in the presence of input-dependent frequency modulations and whether they are detrimental for meaningful input-dependent gamma-mediated temporal organization. To this aim, we constructed a biophysically realistic excitatory-inhibitory network able to express different oscillation frequencies at nearby spatial locations. Similarly to cortical networks, the model was topographically organized with spatially local connectivity and spatially-varying input drive. We analyzed gamma synchronization with respect to phase-locking, phase-relations and frequency differences, and quantified the stimulus-related information represented by gamma phase and frequency. By stepwise simplification of our models, we found that the gamma-mediated temporal organization could be reduced to basic synchronization principles of weakly coupled oscillators, where input drive determines the intrinsic (natural) frequency of oscillators. The gamma phase-locking, the precise phase relation and the emergent (measurable) frequencies were determined by two principal factors: the detuning (intrinsic frequency difference, i.e. local input difference) and the coupling strength. In addition to frequency coding, gamma phase contained complementary stimulus information. Crucially, the phase code reflected input differences, but not the absolute input level. This property of relative input-to-phase conversion, contrasting with latency codes or slower oscillation phase codes, may resolve conflicting experimental observations on gamma phase coding. Our modeling results offer clear testable experimental predictions. We conclude that input-dependency of gamma frequencies could be essential rather than detrimental for meaningful gamma-mediated temporal organization of cortical activity. PMID:25679780
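The reduction the authors describe has a compact canonical form: two weakly coupled phase oscillators lock whenever the detuning lies within the locking range set by the coupling, with a phase lag determined by the detuning. A minimal sketch of this textbook relation (not the biophysical network itself):

```python
import numpy as np

def phase_difference(delta_omega, eps=1.0, dt=1e-3, n_steps=100_000):
    """Two weakly coupled phase oscillators,
         dphi1/dt = w1 + eps*sin(phi2 - phi1),
         dphi2/dt = w2 + eps*sin(phi1 - phi2),
    reduce to dpsi/dt = delta_omega - 2*eps*sin(psi) for psi = phi1 - phi2,
    so phase-locking requires |delta_omega| <= 2*eps."""
    psi = 0.1
    for _ in range(n_steps):
        psi += (delta_omega - 2.0 * eps * np.sin(psi)) * dt
    return psi

for dw in (0.5, 1.5, 2.5):   # detuning, e.g. from a local input-drive difference
    psi = phase_difference(dw)
    if abs(dw) <= 2.0:       # locking condition for eps = 1.0
        print(f"detuning {dw}: locked, phase lag {psi % (2 * np.pi):.3f} rad "
              f"(theory {np.arcsin(dw / 2.0):.3f})")
    else:
        print(f"detuning {dw}: outside the locking range, phase drifts")
```

The detuning-dependent phase lag is the relative input-to-phase conversion discussed above: the lag encodes the input difference, not the absolute input level.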
Domain fusion analysis by applying relational algebra to protein sequence and domain databases.
Truong, Kevin; Ikura, Mitsuhiko
2003-05-06
Domain fusion analysis is a useful method to predict functionally linked proteins that may be involved in direct protein-protein interactions or in the same metabolic or signaling pathway. As separate domain databases like BLOCKS, PROSITE, Pfam, SMART, PRINTS-S, ProDom, TIGRFAMs, and amalgamated domain databases like InterPro continue to grow in size and quality, a computational method to perform domain fusion analysis that leverages these efforts will become increasingly powerful. This paper proposes a computational method employing relational algebra to find domain fusions in protein sequence databases. The feasibility of this method was illustrated on the SWISS-PROT+TrEMBL sequence database using domain predictions from the Pfam HMM (hidden Markov model) database. We identified 235 and 189 putative functionally linked protein partners in H. sapiens and S. cerevisiae, respectively. From the scientific literature, we were able to confirm many of these functional linkages, while the remainder offer testable experimental hypotheses. Results can be viewed at http://calcium.uhnres.utoronto.ca/pi. As the analysis can be computed quickly on any relational database that supports standard SQL (structured query language), it can be dynamically updated along with the sequence and domain databases, thereby improving the quality of predictions over time.
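Since the method runs on standard SQL, the core relational query is easy to sketch. A toy in-memory version with an assumed (protein, domain, organism) schema, not the authors' actual tables:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE pd (protein TEXT, domain TEXT, organism TEXT)")
con.executemany("INSERT INTO pd VALUES (?, ?, ?)", [
    ("fusAB", "domA", "reference"),   # domains fused in the reference proteome
    ("fusAB", "domB", "reference"),
    ("p1",    "domA", "target"),      # ... but split across two proteins
    ("p2",    "domB", "target"),
])

# Rosetta-stone query: pairs of distinct target proteins whose domains
# co-occur on a single protein in the reference organism.
rows = con.execute("""
    SELECT DISTINCT t1.protein, t2.protein, r1.protein AS fusion
    FROM pd t1
    JOIN pd t2 ON t1.organism = 'target' AND t2.organism = 'target'
             AND t1.protein < t2.protein
    JOIN pd r1 ON r1.organism = 'reference' AND r1.domain = t1.domain
    JOIN pd r2 ON r2.organism = 'reference' AND r2.protein = r1.protein
             AND r2.domain = t2.domain
""").fetchall()
print(rows)   # [('p1', 'p2', 'fusAB')] -> predicted functional link p1-p2
```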
The cancer Warburg effect may be a testable example of the minimum entropy production rate principle
NASA Astrophysics Data System (ADS)
Marín, Dolores; Sabater, Bartolomé
2017-04-01
Cancer cells consume more glucose by glycolytic fermentation to lactate than by respiration, a characteristic known as the Warburg effect. In contrast with the 36 moles of ATP produced by respiration, fermentation produces two moles of ATP per mole of glucose consumed, which poses a puzzle with regard to the function of the Warburg effect. The production of free energy (ΔG), enthalpy (ΔH), and entropy (ΔS) per mole varies linearly with the fraction (x) of glucose consumed by fermentation, which is frequently estimated at around 0.9. Hence, calculation shows that, with respect to pure respiration, the predominantly fermentative metabolism decreases the production of entropy per mole of glucose consumed in cancer cells by around 10%. We hypothesize that increased fermentation could allow cancer cells to accomplish the Prigogine theorem of the trend to minimize the rate of production of entropy. According to the theorem, open cellular systems near the steady state could evolve to minimize the rates of entropy production, which may be reached by modified replicating cells producing entropy at a low rate. Remarkably, at CO2 concentrations above 930 ppm, glucose respiration produces less entropy than fermentation, which suggests experimental tests to validate the hypothesis of minimization of the rate of entropy production through the Warburg effect.
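Read as a linear mixture, the stated numbers fit a simple worked relation (the 8/9 entropy ratio below is inferred from the ~10% figure at x ≈ 0.9; it is an illustration, not a value taken from the paper):

```latex
\Delta S(x) = x\,\Delta S_{\mathrm{ferm}} + (1 - x)\,\Delta S_{\mathrm{resp}},
\qquad
\frac{\Delta S(x)}{\Delta S_{\mathrm{resp}}}
  = 1 - x\left(1 - \frac{\Delta S_{\mathrm{ferm}}}{\Delta S_{\mathrm{resp}}}\right)
```

With ΔS_ferm/ΔS_resp ≈ 8/9 and x ≈ 0.9, the mixture gives ΔS(0.9)/ΔS_resp ≈ 1 − 0.9 × (1/9) = 0.9, i.e. the roughly 10% reduction per mole of glucose described above.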
Multiscale Modeling in the Clinic: Drug Design and Development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clancy, Colleen E.; An, Gary; Cannon, William R.
A wide range of length and time scales are relevant to pharmacology, especially in drug development, drug design and drug delivery. Therefore, multi-scale computational modeling and simulation methods and paradigms that advance the linkage of phenomena occurring at these multiple scales have become increasingly important. Multi-scale approaches present in silico opportunities to advance laboratory research to bedside clinical applications in pharmaceuticals research. This is achievable through the capability of modeling to reveal phenomena occurring across multiple spatial and temporal scales, which are not otherwise readily accessible to experimentation. The resultant models, when validated, are capable of making testable predictions to guide drug design and delivery. In this review we describe the goals, methods, and opportunities of multi-scale modeling in drug design and development. We demonstrate the impact of multiple scales of modeling in this field. We indicate the common mathematical techniques employed for multi-scale modeling approaches used in pharmacology and present several examples illustrating the current state-of-the-art regarding drug development for: Excitable Systems (Heart); Cancer (Metastasis and Differentiation); Cancer (Angiogenesis and Drug Targeting); Metabolic Disorders; and Inflammation and Sepsis. We conclude with a focus on barriers to successful clinical translation of drug development, drug design and drug delivery multi-scale models.
Marín, Dolores; Sabater, Bartolomé
2017-04-28
Cancer cells consume more glucose by glycolytic fermentation to lactate than by respiration, a characteristic known as the Warburg effect. In contrast with the 36 moles of ATP produced by respiration, fermentation produces two moles of ATP per mole of glucose consumed, which poses a puzzle with regard to the function of the Warburg effect. The production of free energy (ΔG), enthalpy (ΔH), and entropy (ΔS) per mole varies linearly with the fraction (x) of glucose consumed by fermentation, which is frequently estimated to be around 0.9. Hence, calculation shows that, with respect to pure respiration, the predominantly fermentative metabolism decreases the production of entropy per mole of glucose consumed in cancer cells by around 10%. We hypothesize that increased fermentation could allow cancer cells to satisfy the Prigogine theorem on the tendency to minimize the rate of entropy production. According to the theorem, open cellular systems near the steady state evolve toward states of minimal entropy production, which may be reached by modified replicating cells producing entropy at a low rate. Remarkably, at CO2 concentrations above 930 ppm, glucose respiration produces less entropy than fermentation, which suggests experimental tests to validate the hypothesis of minimization of the rate of entropy production through the Warburg effect.
Guilbride, D Lys; Gawlinski, Pawel; Guilbride, Patrick D L
2010-05-19
Clinically protective malaria vaccines consistently fail to protect adults and children in endemic settings, and at best only partially protect infants. We identify and evaluate 1916 immunization studies conducted between 1965 and February 2010, and exclude partially or nonprotective results to find 177 completely protective immunization experiments. Detailed reexamination reveals an unexpectedly mundane basis for selective vaccine failure: live malaria parasites in the skin inhibit vaccine function. We next show that published molecular and cellular data support a testable, novel model in which parasite-host interactions in the skin induce malaria-specific regulatory T cells and subvert early antigen-specific immunity to parasite-specific immunotolerance. This ensures infection and tolerance to reinfection. Exposure to Plasmodium-infected mosquito bites therefore systematically triggers immunosuppression of endemic vaccine-elicited responses. The extensive vaccine trial data solidly substantiate this model experimentally. We conclude that skin-stage-initiated immunosuppression, unassociated with blood-stage parasites, systematically blocks vaccine function in the field. Our model exposes novel molecular and procedural strategies to significantly and quickly increase protective efficacy in both pipeline and currently ineffective malaria vaccines, and forces a fundamental reassessment of central precepts determining vaccine development. This has major implications for accelerated local elimination of malaria, and significantly increases the potential for eradication.
Dong, Junzi; Colburn, H. Steven
2016-01-01
In multisource, “cocktail party” sound environments, human and animal auditory systems can use spatial cues to effectively separate and follow one source of sound over competing sources. While mechanisms to extract spatial cues such as interaural time differences (ITDs) are well understood in precortical areas, how such information is reused and transformed in higher cortical regions to represent segregated sound sources is not clear. We present a computational model describing a hypothesized neural network that spans spatial cue detection areas and the cortex. This network is based on recent physiological findings that cortical neurons selectively encode target stimuli in the presence of competing maskers based on source locations (Maddox et al., 2012). We demonstrate that key features of cortical responses can be generated by the model network, which exploits spatial interactions between inputs via lateral inhibition, enabling the spatial separation of target and interfering sources while allowing monitoring of a broader acoustic space when there is no competition. We present the model network along with testable experimental paradigms as a starting point for understanding the transformation and organization of spatial information from midbrain to cortex. This network is then extended to suggest engineering solutions that may be useful for hearing-assistive devices in solving the cocktail party problem. PMID:26866056
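The key circuit motif described here, spatial channels that suppress one another so that the more strongly driven (target) channel dominates when a competing masker is present, can be sketched in a few lines. The weights, spatial grid, and dynamics below are illustrative assumptions, not the published network.

# Minimal sketch of cross-channel lateral inhibition among spatially tuned
# inputs (illustrative parameters, not the model's). A lone source is passed
# through; with a competing masker the stronger channel suppresses the weaker.
import numpy as np

def cortical_response(inputs, w_inh=0.8, steps=200, dt=0.1):
    """Steady-state rates of mutually inhibiting channels (rectified linear)."""
    r = np.zeros_like(inputs, dtype=float)
    for _ in range(steps):
        inhibition = w_inh * (r.sum() - r)        # drive from all other channels
        r += dt * (-r + np.maximum(inputs - inhibition, 0.0))
    return r

# Channels tuned to three azimuths, e.g. -90, 0, +90 degrees
print(cortical_response(np.array([0.0, 1.0, 0.0])))  # single source: preserved
print(cortical_response(np.array([0.6, 1.0, 0.0])))  # target beats weaker masker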
Sterile neutrino dark matter and low scale leptogenesis from a charged scalar.
Frigerio, Michele; Yaguna, Carlos E
We show that novel paths to dark matter generation and baryogenesis are open when the standard model is extended with three sterile neutrinos N1, N2, N3 and a charged scalar δ±. Specifically, we propose a new production mechanism for the dark matter particle (a multi-keV sterile neutrino, N1) that does not depend on the active-sterile mixing angle and does not rely on a large primordial lepton asymmetry. Instead, N1 is produced, via freeze-in, by the decays of δ± while the scalar is in equilibrium in the early Universe. In addition, we demonstrate that, thanks to the couplings between the heavier sterile neutrinos N2,3 and δ±, baryogenesis via leptogenesis can be realized close to the electroweak scale. The lepton asymmetry is generated either by N2,3 decays for masses at the TeV scale, or by N2,3 oscillations for masses at the GeV scale. Experimental signatures of this scenario include an X-ray line from dark matter decays, and the direct production of δ± at the LHC. This model thus describes a minimal, testable scenario for neutrino masses, the baryon asymmetry, and dark matter.
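For context, freeze-in production from decays of an equilibrium bath particle has a standard parametric estimate in the freeze-in literature (e.g. Hall et al. 2010); we quote its generic form for orientation, not as a formula from this paper. For a bath particle of mass m and partial decay width Γ into the dark matter state,

Y_{N_1} \sim \frac{135}{8\pi^3 \times 1.66\, g_*^{3/2}} \, \frac{M_{\rm Pl}\,\Gamma}{m^2},

up to order-one factors, with relic density Ω h² ∝ m_{N_1} Y_{N_1}. The yield is controlled by the coupling through Γ, consistent with the stated independence from the active-sterile mixing angle.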
Finke, Kathrin; Schwarzkopf, Wolfgang; Müller, Ulrich; Frodl, Thomas; Müller, Hermann J; Schneider, Werner X; Engel, Rolf R; Riedel, Michael; Möller, Hans-Jürgen; Hennig-Fast, Kristina
2011-11-01
Attention deficit hyperactivity disorder (ADHD) frequently persists into adulthood. The decomposition of endophenotypes by means of experimental neurocognitive assessment has the potential to improve diagnostic assessment, evaluation of treatment response, and disentanglement of genetic and environmental influences. We assessed four parameters of attentional capacity and selectivity derived from simple psychophysical tasks (verbal report of briefly presented letter displays) and based on a "theory of visual attention." These parameters are mathematically independent, quantitative measures, and previous studies have shown that they are highly sensitive to subtle attention deficits. Potential reductions of attentional capacity, that is, of perceptual processing speed and working memory storage capacity, were assessed with a whole-report paradigm. Furthermore, possible pathologies of attentional selectivity, that is, of the selection of task-relevant information and of bias in the spatial distribution of attention, were measured with a partial-report paradigm. A group of 30 unmedicated adult ADHD patients and a group of 30 demographically matched healthy controls were tested. ADHD patients showed significant reductions of working memory storage capacity, with a moderate to large effect size. Perceptual processing speed, task-based selection, and the spatial distribution of attention were unaffected. The results imply a working memory deficit as an important source of behavioral impairments. The theory-of-visual-attention parameter "working memory storage capacity" might constitute a quantifiable and testable endophenotype of ADHD.
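The two capacity parameters can be made concrete with a toy Monte Carlo of the whole-report logic: letters race to be encoded at a total processing rate C, and at most K of them can be stored. The simulation below is a deliberately simplified illustration of how C and K separately shape report scores, not the model-fitting procedure used in the study; all numbers are invented.

# Toy Monte Carlo of TVA-style whole report: n_letters race with total
# processing rate C (items/s); at most K items fit in visual working memory.
# Illustrative only, not the clinical fitting procedure.
import numpy as np

rng = np.random.default_rng(0)

def mean_report(C=20.0, K=3, n_letters=6, exposure=0.2, trials=20000):
    # each letter finishes at an exponential time with rate C / n_letters
    t = rng.exponential(n_letters / C, size=(trials, n_letters))
    encoded = (t < exposure).sum(axis=1)    # letters finished before the mask
    return np.minimum(encoded, K).mean()    # report capped by storage K

print(mean_report(K=3))   # control-like storage capacity
print(mean_report(K=2))   # reduced K lowers mean report while C is unchanged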
Within-group behavioural consequences of between-group conflict: a prospective review.
Radford, Andrew N; Majolo, Bonaventura; Aureli, Filippo
2016-11-30
Conflict is rife in group-living species and exerts a powerful selective force. Group members face a variety of threats from extra-group conspecifics, from individuals looking for reproductive opportunities to rival groups seeking resources. Theory predicts that such between-group conflict should influence within-group behaviour. However, compared with the extensive literature on the consequences of within-group conflict, relatively little research has considered the behavioural impacts of between-group conflict. We give an overview of why between-group conflict is expected to influence subsequent behaviour among group members. We then use what is known about the consequences of within-group conflict to generate testable predictions about how between-group conflict might affect within-group behaviour in the aftermath. We consider the types of behaviour that could change and how the role of different group members in the conflict can exert an influence. Furthermore, we discuss how conflict characteristics and outcome, group size, social structure and within-group relationship quality might modulate post-conflict behavioural changes. Finally, we propose the need for consistent definitions, a broader range of examined behaviours and taxa, individual-focused data collection, complementary observational and experimental approaches, and a consideration of lasting effects if we are to understand fully the significant influence of between-group conflict on social behaviour. © 2016 The Author(s).
Within-group behavioural consequences of between-group conflict: a prospective review
Aureli, Filippo
2016-01-01
Conflict is rife in group-living species and exerts a powerful selective force. Group members face a variety of threats from extra-group conspecifics, from individuals looking for reproductive opportunities to rival groups seeking resources. Theory predicts that such between-group conflict should influence within-group behaviour. However, compared with the extensive literature on the consequences of within-group conflict, relatively little research has considered the behavioural impacts of between-group conflict. We give an overview of why between-group conflict is expected to influence subsequent behaviour among group members. We then use what is known about the consequences of within-group conflict to generate testable predictions about how between-group conflict might affect within-group behaviour in the aftermath. We consider the types of behaviour that could change and how the role of different group members in the conflict can exert an influence. Furthermore, we discuss how conflict characteristics and outcome, group size, social structure and within-group relationship quality might modulate post-conflict behavioural changes. Finally, we propose the need for consistent definitions, a broader range of examined behaviours and taxa, individual-focused data collection, complementary observational and experimental approaches, and a consideration of lasting effects if we are to understand fully the significant influence of between-group conflict on social behaviour. PMID:27903869
Dong, Junzi; Colburn, H Steven; Sen, Kamal
2016-01-01
In multisource, "cocktail party" sound environments, human and animal auditory systems can use spatial cues to effectively separate and follow one source of sound over competing sources. While mechanisms to extract spatial cues such as interaural time differences (ITDs) are well understood in precortical areas, how such information is reused and transformed in higher cortical regions to represent segregated sound sources is not clear. We present a computational model describing a hypothesized neural network that spans spatial cue detection areas and the cortex. This network is based on recent physiological findings that cortical neurons selectively encode target stimuli in the presence of competing maskers based on source locations (Maddox et al., 2012). We demonstrate that key features of cortical responses can be generated by the model network, which exploits spatial interactions between inputs via lateral inhibition, enabling the spatial separation of target and interfering sources while allowing monitoring of a broader acoustic space when there is no competition. We present the model network along with testable experimental paradigms as a starting point for understanding the transformation and organization of spatial information from midbrain to cortex. This network is then extended to suggest engineering solutions that may be useful for hearing-assistive devices in solving the cocktail party problem.
Sadeghi Ghuchani, Mostafa
2018-02-08
This comment argues against the view, stated in a recent paper by Marín and Sabater, that cancer cells produce less entropy than normal cells. The basic principle behind estimating the entropy production rate in a living cell is discussed, emphasizing that entropy production depends both on the amount of heat exchanged during metabolism and on the entropy difference between products and substrates.
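The balance the comment appeals to can be written compactly (standard thermodynamics, our notation): for a metabolic conversion at constant temperature T and pressure, releasing heat -ΔH_rxn to the surroundings,

\sigma = \Delta S_{\mathrm{rxn}} + \frac{-\Delta H_{\mathrm{rxn}}}{T} = -\frac{\Delta G_{\mathrm{rxn}}}{T} \ge 0,

so the total entropy produced combines the product-substrate entropy difference (first term) with the exchanged heat (second term), exactly the two contributions the comment says must both be counted.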