Huang, Dan; Chen, Xuejuan; Gong, Qi; Yuan, Chaoqun; Ding, Hui; Bai, Jing; Zhu, Hui; Fu, Zhujun; Yu, Rongbin; Liu, Hu
2016-01-01
This survey was conducted to determine the testability, distribution and associations of ocular biometric parameters in Chinese preschool children. Ocular biometric examinations, including the axial length (AL) and corneal radius of curvature (CR), were conducted on 1,688 3-year-old subjects by using an IOLMaster in August 2015. Anthropometric parameters, including height and weight, were measured according to a standardized protocol, and body mass index (BMI) was calculated. The testability was 93.7% for the AL and 78.6% for the CR overall, and both measures improved with age. Girls performed slightly better in AL measurements (P = 0.08), and the difference in CR was statistically significant (P < 0.05). The AL distribution was normal in girls (P = 0.12), whereas it was not in boys (P < 0.05). For CR1, all subgroups presented normal distributions (P = 0.16 for boys; P = 0.20 for girls), but the distribution varied when the subgroups were combined (P < 0.05). CR2 presented a normal distribution (P = 0.11), whereas the AL/CR ratio was abnormal (P < 0.001). Boys exhibited a significantly longer AL, a greater CR and a greater AL/CR ratio than girls (all P < 0.001). PMID:27384307
The Transition to Minimal Consciousness through the Evolution of Associative Learning
Bronfman, Zohar Z.; Ginsburg, Simona; Jablonka, Eva
2016-01-01
The minimal state of consciousness is sentience. This includes any phenomenal sensory experience – exteroceptive, such as vision and olfaction; interoceptive, such as pain and hunger; or proprioceptive, such as the sense of bodily position and movement. We propose unlimited associative learning (UAL) as the marker of the evolutionary transition to minimal consciousness (or sentience), its phylogenetically earliest sustainable manifestation and the driver of its evolution. We define and describe UAL at the behavioral and functional level and argue that the structural-anatomical implementations of this mode of learning in different taxa entail subjective feelings (sentience). We end with a discussion of the implications of our proposal for the distribution of consciousness in the animal kingdom, suggesting testable predictions, and revisiting the ongoing debate about the function of minimal consciousness in light of our approach. PMID:28066282
ERIC Educational Resources Information Center
Hunter, Lora Rose; Schmidt, Norman B.
2010-01-01
In this review, the extant literature concerning anxiety psychopathology in African American adults is summarized to develop a testable, explanatory framework with implications for future research. The model was designed to account for purported lower rates of anxiety disorders in African Americans compared to European Americans, along with other…
2008-12-01
1979; Wasserman and Faust, 1994). SNA thus relies heavily on graph theory to make predictions about network structure and thus social behavior...becomes a tool for increasing the specificity of theory, thinking through the theoretical implications, and generating testable predictions. In...to summarize Construct and its roots in constructural sociological theory. We discover that the (LPM) provides a mathematical bridge between
ERIC Educational Resources Information Center
Holden, Richard J.; Karsh, Ben-Tzion
2009-01-01
Primary objective: much research and practice related to the design and implementation of information technology in health care has been atheoretical. It is argued that using extant theory to develop testable models of health information technology (HIT) benefits both research and practice. Methods and procedures: several theories of motivation,…
NASA Technical Reports Server (NTRS)
Chen, Chung-Hsing
1992-01-01
In this thesis, a behavioral-level testability analysis approach is presented. This approach is based on analyzing the circuit behavioral description (similar to a C program) to estimate its testability by identifying controllable and observable circuit nodes. This information can be used by a test generator to gain better access to internal circuit nodes and to reduce its search space. The results of the testability analyzer can also be used to select test points or partial scan flip-flops in the early design phase. Based on selection criteria, a novel Synthesis for Testability approach called Test Statement Insertion (TSI) is proposed, which modifies the circuit behavioral description directly. Test Statement Insertion can also be used to modify the circuit structural description to improve its testability. As a result, the Synthesis for Testability methodology can be combined with an existing behavioral synthesis tool to produce more testable circuits.
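The controllability half of such an analysis is often approximated, at the structural level, by SCOAP-style measures. A minimal sketch follows; the two-gate netlist and the names `and_gate`, `not_gate` are hypothetical illustrations, not the thesis's behavioral-level algorithm:

```python
# SCOAP-style controllability sketch. CC = (CC0, CC1): the effort needed
# to drive a line to logic 0 or logic 1; primary inputs cost 1.

def and_gate(cc_a, cc_b):
    cc0 = min(cc_a[0], cc_b[0]) + 1   # driving either input to 0 suffices
    cc1 = cc_a[1] + cc_b[1] + 1       # both inputs must be driven to 1
    return (cc0, cc1)

def not_gate(cc_a):
    return (cc_a[1] + 1, cc_a[0] + 1)  # output 0 needs input 1, and vice versa

PI = (1, 1)                 # a primary input is directly controllable
n1 = and_gate(PI, PI)       # (2, 3)
n2 = not_gate(n1)           # (4, 3)
out = and_gate(n2, PI)      # (2, 5): this node is harder to set high than low
print(n1, n2, out)
```

A test generator can use such numbers exactly as the abstract describes: nodes with large CC values are poorly controllable and are candidates for test-point insertion.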
On Testability of Missing Data Mechanisms in Incomplete Data Sets
ERIC Educational Resources Information Center
Raykov, Tenko
2011-01-01
This article is concerned with the question of whether the missing data mechanism routinely referred to as missing completely at random (MCAR) is statistically examinable via a test for lack of distributional differences between groups with observed and missing data, and related consequences. A discussion is initially provided, from a formal logic…
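The group-comparison logic behind such an examination can be sketched with a two-sample statistic (the data below are illustrative, and Welch's t is one common choice of statistic, not necessarily the article's):

```python
import math
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t statistic: standardized mean difference between two groups."""
    se = math.sqrt(variance(a) / len(a) + variance(b) / len(b))
    return (mean(a) - mean(b)) / se

# Illustrative data: a covariate split by whether a second survey item
# was answered or missing for the same respondent.
observed_grp = [31, 29, 34, 30, 33, 28]
missing_grp  = [30, 32, 29, 31, 33, 30]

t = welch_t(observed_grp, missing_grp)
print(round(t, 3))  # 0.0 here: no detectable group difference
```

As the article's formal-logic discussion suggests, a small |t| only fails to reject MCAR; it cannot establish it.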
Reply to ``Comment on `Quantum time-of-flight distribution for cold trapped atoms' ''
NASA Astrophysics Data System (ADS)
Ali, Md. Manirul; Home, Dipankar; Majumdar, A. S.; Pan, Alok K.
2008-02-01
In their comment, Gomes et al. [Phys. Rev. A 77, 026101 (2008)] have questioned the possibility of empirically testable differences existing between the semiclassical time of flight distribution for cold trapped atoms and a quantum distribution discussed by us recently [Ali et al., Phys. Rev. A 75, 042110 (2007)]. We argue that their criticism is based on a semiclassical treatment having restricted applicability for a particular trapping potential. Their claim does not preclude, in general, the possibility of differences between the semiclassical calculations and fully quantum results for the arrival time distribution of freely falling atoms.
Reply to "Comment on 'Quantum time-of-flight distribution for cold trapped atoms'"
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ali, Md. Manirul; Home, Dipankar; Pan, Alok K.
2008-02-15
In their comment Gomes et al. [Phys. Rev. A 77, 026101 (2008)] have questioned the possibility of empirically testable differences existing between the semiclassical time of flight distribution for cold trapped atoms and a quantum distribution discussed by us recently [Ali et al., Phys. Rev. A 75, 042110 (2007).]. We argue that their criticism is based on a semiclassical treatment having restricted applicability for a particular trapping potential. Their claim does not preclude, in general, the possibility of differences between the semiclassical calculations and fully quantum results for the arrival time distribution of freely falling atoms.
Higher-order Fourier analysis over finite fields and applications
NASA Astrophysics Data System (ADS)
Hatami, Pooya
Higher-order Fourier analysis is a powerful tool in the study of problems in additive and extremal combinatorics, for instance the study of arithmetic progressions in primes, where traditional Fourier analysis falls short. In recent years, higher-order Fourier analysis has found multiple applications in computer science in fields such as property testing and coding theory. In this thesis, we develop new tools within this theory with several new applications, such as a characterization theorem in algebraic property testing. One of our main contributions is a strong near-equidistribution result for regular collections of polynomials. The densities of small linear structures in subsets of Abelian groups can be expressed as certain analytic averages involving linear forms. Higher-order Fourier analysis examines such averages by approximating the indicator function of a subset by a function of a bounded number of polynomials. Then, to approximate the average, it suffices to know the joint distribution of the polynomials applied to the linear forms. We prove a near-equidistribution theorem that describes these distributions for the group F_p^n when p is a fixed prime. This fundamental fact was previously known only under various extra assumptions about the linear forms or the field size. We use this near-equidistribution theorem to settle a conjecture of Gowers and Wolf on the true complexity of systems of linear forms. Our next application is towards a characterization of testable algebraic properties. We prove that every locally characterized affine-invariant property of functions f : F_p^n → R, with n ∈ N, is testable. In fact, we prove that any such property P is proximity-obliviously testable. More generally, we show that any affine-invariant property that is closed under subspace restrictions and has "bounded complexity" is testable.
We also prove that any property that can be described as the property of decomposing into a known structure of low-degree polynomials is locally characterized and is, hence, testable. We discuss several notions of regularity which allow us to deduce algorithmic versions of various regularity lemmas for polynomials by Green and Tao and by Kaufman and Lovett. We show that our algorithmic regularity lemmas for polynomials imply algorithmic versions of several results relying on regularity, such as decoding Reed-Muller codes beyond the list-decoding radius (for certain structured errors) and prescribed polynomial decompositions. Finally, motivated by the definition of Gowers norms, we investigate norms defined by different systems of linear forms. We give necessary conditions on the structure of systems of linear forms that define norms. We prove that such norms can be of only two types and, assuming that the field size p is sufficiently large, are essentially equivalent to either a Gowers norm or an L^p norm.
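For reference, the Gowers uniformity norm invoked in the last paragraph has a standard definition (sketched here; C denotes complex conjugation and |ω| the number of ones in ω): for f : F_p^n → C,

```latex
\[
\|f\|_{U^k}^{2^k}
  = \mathbb{E}_{x,\,h_1,\dots,h_k \in \mathbb{F}_p^n}
    \prod_{\omega \in \{0,1\}^k}
    \mathcal{C}^{|\omega|}\, f\bigl(x + \omega_1 h_1 + \cdots + \omega_k h_k\bigr)
\]
```

Norms defined by other systems of linear forms generalize this average by replacing the parallelepiped of forms x + ω·h with an arbitrary system.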
The Demographic Transition: Causes and Consequences
Galor, Oded
2013-01-01
This paper develops the theoretical foundations and the testable implications of the various mechanisms that have been proposed as possible triggers for the demographic transition. Moreover, it examines the empirical validity of each of the theories and their significance for the understanding of the transition from stagnation to growth. The analysis suggests that the rise in the demand for human capital in the process of development was the main trigger for the decline in fertility and the transition to modern growth. PMID:25089157
Models of cooperative dynamics from biomolecules to magnets
NASA Astrophysics Data System (ADS)
Mobley, David Lowell
This work details application of computer models to several biological systems (prion diseases and Alzheimer's disease) and a magnetic system. These share some common themes, which are discussed. Here, simple lattice-based models are applied to aggregation of misfolded protein in prion diseases like Mad Cow disease. These can explain key features of the diseases. The modeling is based on aggregation being essential in establishing the time-course of infectivity. Growth of initial aggregates is assumed to dominate the experimentally observed lag phase. Subsequent fission, regrowth, and fission set apart the exponential doubling phase in disease progression. We explore several possible modes of growth for 2-D aggregates and suggest the model providing the best explanation for the experimental data. We develop testable predictions from this model. Like prion disease, Alzheimer's disease (AD) is an amyloid disease characterized by large aggregates in the brain. However, evidence increasingly points away from these as the toxic agent and towards oligomers of the Abeta peptide. We explore one possible toxicity mechanism---insertion of Abeta into cell membranes and formation of harmful ion channels. We find that mutations in this peptide which cause familial Alzheimer's disease (FAD) also affect the insertion of this peptide into membranes in a fairly consistent way, suggesting that this toxicity mechanism may be relevant biologically. We find a particular inserted configuration which may be especially harmful and develop testable predictions to verify whether or not this is the case. Nucleation is an essential feature of our models for prion disease, in that it protects normal, healthy individuals from getting prion disease. Nucleation is important in many other areas, and we modify our lattice-based nucleation model to apply to a hysteretic magnetic system where nucleation has been suggested to be important. 
From a simple model, we find qualitative agreement with experiment and make testable predictions, which have since been experimentally verified, concerning the time and temperature dependence of the major hysteresis loop and reversal curves. We argue why this model may be suitable for such systems and explain the implications for Ising-like models. We suggest implications for future modeling work and present suggestions for future work in all three areas.
Architectural Analysis of Dynamically Reconfigurable Systems
NASA Technical Reports Server (NTRS)
Lindvall, Mikael; Godfrey, Sally; Ackermann, Chris; Ray, Arnab; Yonkwa, Lyly
2010-01-01
Topics include: the problem (increased flexibility of architectural styles decreases analyzability; behavior emerges and varies depending on the configuration; does the resulting system run according to the intended design; and architectural decisions can impede or facilitate testing); a top-down approach to architecture analysis, detection of defects and deviations, and architecture and its testability; currently targeted projects GMSEC and CFS; analyzing software architectures; analyzing runtime events; actual architecture recognition; GMPUB in Dynamic SAVE; sample output from the new approach; taking message timing delays into account; CFS examples of architecture and testability; some recommendations for improved testability; CFS examples of abstract interfaces and testability; and a CFS example of opening some internal details.
Active Diagnosis of Navy Machinery Rev 2.0
2016-10-01
electrical distribution and potable water supply systems. Because of these dependencies, ship auxiliary system failures can cause combat load failure...buildup generally causes a pipe to disconnect from a junction, causing water to leak. This limits the faults that are testable, since many of the faults...pipes, junctions, pumps, flow meters, thermal loads, check valves, and water tank. Each agent is responsible for maintaining its constraints locally
Boolean Minimization and Algebraic Factorization Procedures for Fully Testable Sequential Machines
1989-09-01
Boolean Minimization and Algebraic Factorization Procedures for Fully Testable Sequential Machines. Srinivas Devadas and Kurt Keutzer. Abstract: In this...Projects Agency under contract number N00014-87-K-0825. Author Information: Devadas: Department of Electrical Engineering and Computer Science, Room 36...MA 02139; (617) 253-0292.
Advanced Diagnostic and Prognostic Testbed (ADAPT) Testability Analysis Report
NASA Technical Reports Server (NTRS)
Ossenfort, John
2008-01-01
As system designs become more complex, determining the best locations to add sensors and test points for the purpose of testing and monitoring these designs becomes more difficult. Not only must the designer take into consideration all real and potential faults of the system, he or she must also find efficient ways of detecting and isolating those faults. Because sensors and cabling take up valuable space and weight on a system, and given constraints on bandwidth and power, it is even more difficult to add sensors into these complex designs after the design has been completed. As a result, a number of software tools have been developed to assist the system designer in proper placement of these sensors during the system design phase of a project. One of the key functions provided by many of these software programs is a testability analysis of the system: essentially, an evaluation of how observable the system's behavior is using available tests. During the design phase, testability metrics can help guide the designer in improving the inherent testability of the design. This may include adding, removing, or modifying tests; breaking up feedback loops; or changing the system to reduce fault propagation. Given a set of test requirements, the analysis can also help to verify that the system will meet those requirements. Of course, a testability analysis requires that a software model of the physical system be available. For the analysis to be most effective in guiding system design, this model should ideally be constructed in parallel with those efforts. The purpose of this paper is to present the final testability results of the Advanced Diagnostic and Prognostic Testbed (ADAPT) after the system model was completed. The tool chosen to build the model and to perform the testability analysis is the Testability Engineering and Maintenance System Designer (TEAMS-Designer).
The TEAMS toolset is intended to span all phases of the system life cycle, from design and development through health management and maintenance. TEAMS-Designer is the model-building and testability analysis software in that suite.
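Testability analyses of this kind typically reduce to a fault-to-test dependency matrix (D-matrix). A minimal sketch of the two core metrics, detection coverage and ambiguity groups, follows; the four-fault system is hypothetical, not the ADAPT model:

```python
# Hypothetical D-matrix: rows are faults, columns are tests;
# a 1 means the test can observe the fault.
D = {
    "valve_stuck":   (1, 0, 1),
    "pump_degraded": (1, 1, 0),
    "sensor_bias":   (0, 1, 0),
    "relay_open":    (0, 0, 0),   # no test sees this fault
}

detected = [f for f, row in D.items() if any(row)]
detection_coverage = len(detected) / len(D)

# Two faults can be isolated from each other only if their test
# signatures (matrix rows) differ.
signatures = {}
for f, row in D.items():
    signatures.setdefault(row, []).append(f)
ambiguity_groups = [fs for fs in signatures.values() if len(fs) > 1]

print(detection_coverage)   # 0.75: one fault is undetectable
print(ambiguity_groups)     # []: all detected signatures are distinct
```

Adding a sensor corresponds to adding a column, which can both raise coverage and split ambiguity groups; that is the trade-off the sensor-placement tools evaluate.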
Fisher's geometrical model emerges as a property of complex integrated phenotypic networks.
Martin, Guillaume
2014-05-01
Models relating phenotype space to fitness (phenotype-fitness landscapes) have seen important developments recently. They can roughly be divided into mechanistic models (e.g., metabolic networks) and more heuristic models like Fisher's geometrical model. Each has its own drawbacks, but both yield testable predictions on how the context (genomic background or environment) affects the distribution of mutation effects on fitness and thus adaptation. Both have received some empirical validation. This article aims at bridging the gap between these approaches. A derivation of the Fisher model "from first principles" is proposed, where the basic assumptions emerge from a more general model, inspired by mechanistic networks. I start from a general phenotypic network relating unspecified phenotypic traits and fitness. A limited set of qualitative assumptions is then imposed, mostly corresponding to known features of phenotypic networks: a large set of traits is pleiotropically affected by mutations and determines a much smaller set of traits under optimizing selection. Otherwise, the model remains fairly general regarding the phenotypic processes involved or the distribution of mutation effects affecting the network. A statistical treatment and a local approximation close to a fitness optimum yield a landscape that is effectively the isotropic Fisher model or its extension with a single dominant phenotypic direction. The fit of the resulting alternative distributions is illustrated in an empirical data set. These results bear implications on the validity of Fisher's model's assumptions and on which features of mutation fitness effects may vary (or not) across genomic or environmental contexts.
A Unified Approach to the Synthesis of Fully Testable Sequential Machines
1989-10-01
A Unified Approach to the Synthesis of Fully Testable Sequential Machines. Srinivas Devadas and Kurt Keutzer. Abstract: In this paper we attempt to...research was supported in part by the Defense Advanced Research Projects Agency under contract N00014-87-K-0825. Author Information: Devadas: Department...
Factors That Affect Software Testability
NASA Technical Reports Server (NTRS)
Voas, Jeffrey M.
1991-01-01
Software faults that infrequently affect software's output are dangerous. When a software fault causes frequent software failures, testing is likely to reveal the fault before the software is released; when the fault remains undetected during testing, it can cause disaster after the software is installed. A technique for predicting whether a particular piece of software is likely to reveal faults within itself during testing is found in [Voas91b]. A piece of software that is likely to reveal faults within itself during testing is said to have high testability. A piece of software that is not likely to reveal faults within itself during testing is said to have low testability. It is preferable to design software with higher testability from the outset, i.e., to create software with as high a degree of testability as possible, to avoid the problems of undetected faults associated with low testability. Information loss is a phenomenon that occurs during program execution that increases the likelihood that a fault will remain undetected. In this paper, we identify two broad classes of information loss, define them, and suggest ways of predicting the potential for information loss to occur, in order to decrease the likelihood that faults will remain undetected during testing.
Hu, Xiao-Min; Chen, Jiang-Shan; Liu, Bi-Heng; Guo, Yu; Huang, Yun-Feng; Zhou, Zong-Quan; Han, Yong-Jian; Li, Chuan-Feng; Guo, Guang-Can
2016-10-21
The physical impact and the testability of the Kochen-Specker (KS) theorem are debated because perfect compatibility in a single quantum system cannot be achieved in practical experiments with finite precision. Here, we follow the proposal of A. Cabello and M. T. Cunha [Phys. Rev. Lett. 106, 190401 (2011)], and present a compatibility-loophole-free experimental violation of an inequality of noncontextual theories by two spatially separated entangled qutrits. A maximally entangled qutrit-qutrit state with a fidelity as high as 0.975±0.001 is prepared and distributed to separated spaces, and these two photons are then measured locally, providing the compatibility requirement. The results show that the inequality for noncontextual theory is violated by 31 standard deviations. Our experiments pave the way to close the debate about the testability of the KS theorem. In addition, the method to generate high-fidelity and high-dimension entangled states will provide significant advantages in high-dimension quantum encoding and quantum communication.
Extended Testability Analysis Tool
NASA Technical Reports Server (NTRS)
Melcher, Kevin; Maul, William A.; Fulton, Christopher
2012-01-01
The Extended Testability Analysis (ETA) Tool is a software application that supports fault management (FM) by performing testability analyses on the fault propagation model of a given system. Fault management includes the prevention of faults through robust design margins and quality assurance methods, or the mitigation of system failures. Fault management requires an understanding of the system design and operation, potential failure mechanisms within the system, and the propagation of those potential failures through the system. The purpose of the ETA Tool software is to process the testability analysis results from a commercial software program called TEAMS Designer in order to provide a detailed set of diagnostic assessment reports. The ETA Tool is a command-line process with several user-selectable report output options. The ETA Tool also extends the COTS testability analysis and enables variation studies of sensor sensitivity impacts on system diagnostics and component isolation using a single testability output. The ETA Tool can also provide extended analyses from a single set of testability output files. The following analysis reports are available to the user: (1) the Detectability Report provides a breakdown of how each tested failure mode was detected, (2) the Test Utilization Report identifies all the failure modes that each test detects, (3) the Failure Mode Isolation Report demonstrates the system's ability to discriminate between failure modes, (4) the Component Isolation Report demonstrates the system's ability to discriminate between failure modes relative to the components containing the failure modes, (5) the Sensor Sensitivity Analysis Report shows the diagnostic impact due to loss of sensor information, and (6) the Effect Mapping Report identifies failure modes that result in specified system-level effects.
Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool
NASA Technical Reports Server (NTRS)
Maul, William A.; Fulton, Christopher E.
2011-01-01
This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool proved to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation - also provided with the ETA Tool software release package - that were used to generate the reports presented in the manual.
Multiple transitions and HIV risk among orphaned Kenyan schoolgirls.
Mojola, Sanyu A
2011-03-01
Why are orphaned girls at particular risk of acquiring HIV infection? Using a transition-to-adulthood framework, this study employs qualitative data from Nyanza Province, Kenya, to explore pathways to HIV risk among orphaned and nonorphaned high-school girls. It shows how simultaneous processes such as leaving their parental home, negotiating financial access, and relationship transitions interact to produce disproportionate risk for orphaned girls. The role of financial provision and parental love in modifying girls' trajectories to risk are also explored. A testable theoretical model is proposed based on the qualitative findings, and policy implications are suggested.
MULTIPLE TRANSITIONS AND HIV RISK AMONG AFRICAN SCHOOL GIRLS
Mojola, Sanyu A
2012-01-01
Why are orphaned girls at particular risk of contracting HIV? Using a transition to adulthood framework, this paper uses qualitative data from Nyanza province, Kenya to explore pathways to HIV risk among orphaned and non-orphaned high school girls. I show how co-occurring processes such as residential transition out of the parental home, negotiating financial access and relationship transitions interact to produce disproportionate risk for orphan girls. I also explore the role of financial provision and parental love in modifying girls’ trajectories to risk. I propose a testable theoretical model based on the qualitative findings and suggest policy implications. PMID:21500699
LSI/VLSI design for testability analysis and general approach
NASA Technical Reports Server (NTRS)
Lam, A. Y.
1982-01-01
The incorporation of testability characteristics into large-scale digital designs is not only necessary for effective device testing but also pertinent to enhancing device reliability. There are at least three major DFT techniques, namely the self-checking, the LSSD, and the partitioning techniques, each of which can be incorporated into a logic design to achieve a specific set of testability and reliability requirements. Detailed analyses of the design theory, implementation, fault coverage, hardware requirements, application limitations, etc., of each of these techniques are presented.
Continuous variation caused by genes with graduated effects.
Matthysse, S; Lange, K; Wagener, D K
1979-01-01
The classical polygenic theory of inheritance postulates a large number of genes with small, and essentially similar, effects. We propose instead a model with genes of gradually decreasing effects. The resulting phenotypic distribution is not normal; if the gene effects are geometrically decreasing, it can be triangular. The joint distribution of parent and offspring genic value is calculated. The most readily testable difference between the two models is that, in the decreasing-effect model, the variance of the offspring distribution from given parents depends on the parents' genic values. The more the parents deviate from the mean, the smaller the variance of the offspring should be. In the equal-effect model the offspring variance is independent of the parents' genic values. PMID:288073
Module generation for self-testing integrated systems
NASA Astrophysics Data System (ADS)
Vanriessen, Ronald Pieter
Hardware used for self test in VLSI (Very Large Scale Integrated) systems is reviewed, and an architecture to control the test hardware in an integrated system is presented. Because test times have increased, the use of self-test techniques has become practically and economically viable for VLSI systems. Besides reducing test times and costs, self test also provides testing at operational speeds. Therefore, a suitable combination of scan-path and macro-specific (self) tests is required to reduce test times and costs. An expert system that can be used in a silicon compilation environment is presented. The approach requires a minimum of testability knowledge from a system designer. A user-friendly interface is described for specifying and modifying testability requirements by a testability expert. A reason-directed backtracking mechanism is used to solve selection failures. Both the hierarchical testable architecture and the design-for-testability expert system are used in a self-test compiler: a software tool that selects an appropriate test method for every macro in a design. The hardware to control a macro test is included in the design automatically. As an example, the integration of the self-test compiler in the silicon compilation system PIRAMID is described, along with the design of a demonstrator circuit by the self-test compiler. This circuit consists of two self-testable macros. Control of the self-test hardware is carried out via the test access port of the boundary-scan standard.
Refinement of Representation Theorems for Context-Free Languages
NASA Astrophysics Data System (ADS)
Fujioka, Kaoru
In this paper, we obtain some refinements of representation theorems for context-free languages by using Dyck languages, insertion systems, strictly locally testable languages, and morphisms. For instance, we improve the Chomsky-Schützenberger representation theorem and show that each context-free language L can be represented in the form L = h(D ∩ R), where D is a Dyck language, R is a strictly 3-testable language, and h is a morphism. A similar representation for context-free languages can be obtained using insertion systems of weight (3, 0) and strictly 4-testable languages.
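A strictly k-testable language is decided by a sliding window: membership depends only on the word's length-(k-1) prefix and suffix and its set of length-k factors. A minimal membership check follows; the parameter sets are illustrative, not those used in the paper:

```python
def k_factors(w, k):
    """All length-k substrings of w."""
    return {w[i:i + k] for i in range(len(w) - k + 1)}

def strictly_k_testable(w, k, prefixes, suffixes, factors):
    """Sliding-window membership test for a strictly k-testable language.
    (Words shorter than k would need a separate finite lookup, omitted here.)"""
    return (w[:k - 1] in prefixes
            and w[len(w) - (k - 1):] in suffixes
            and k_factors(w, k) <= factors)

# Example: strings over {a, b} that start and end with 'a' and never
# contain "bb" -- a strictly 2-testable language.
P, S, F = {"a"}, {"a"}, {"aa", "ab", "ba"}
print(strictly_k_testable("ababa", 2, P, S, F))  # True
print(strictly_k_testable("abba", 2, P, S, F))   # False: factor "bb" occurs
```

The locality is what makes the representation theorem useful: all the non-local (nested) structure of a context-free language is pushed into the Dyck language D, while R only constrains a bounded window.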
An empirical comparison of a dynamic software testability metric to static cyclomatic complexity
NASA Technical Reports Server (NTRS)
Voas, Jeffrey M.; Miller, Keith W.; Payne, Jeffrey E.
1993-01-01
This paper compares the dynamic testability prediction technique termed 'sensitivity analysis' to the static testability metric termed 'cyclomatic complexity'. The application that we chose in this empirical study is a CASE-generated version of a B-737 autoland system. For the B-737 system we analyzed, we isolated those functions that we predict are more prone to hide errors during system/reliability testing. We also analyzed the code with several other well-known static metrics. This paper compares and contrasts the results of sensitivity analysis with the results of the static metrics.
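For reference, the static side of this comparison is a simple graph count, M = E - N + 2P over the control-flow graph. A sketch with a hypothetical control-flow graph:

```python
def cyclomatic_complexity(edges, num_nodes, components=1):
    """McCabe's cyclomatic complexity: M = E - N + 2P for a control-flow
    graph with E edges, N nodes, and P connected components."""
    return len(edges) - num_nodes + 2 * components

# Hypothetical CFG of a function with one if/else and one loop
# (node 0 = entry, node 5 = exit).
edges = [(0, 1), (1, 2), (1, 3), (2, 4), (3, 4), (4, 1), (4, 5)]
print(cyclomatic_complexity(edges, num_nodes=6))  # 7 - 6 + 2 = 3
```

Being purely structural, this number is the same no matter how the function transforms its data, which is exactly the gap the dynamic sensitivity-analysis technique is meant to fill.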
Loss Aversion and Time-Differentiated Electricity Pricing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spurlock, C. Anna
2015-06-01
I develop a model of loss aversion over electricity expenditure, from which I derive testable predictions for household electricity consumption while on combination time-of-use (TOU) and critical peak pricing (CPP) plans. Testing these predictions results in evidence consistent with loss aversion: (1) spillover effects - positive expenditure shocks resulted in significantly more peak consumption reduction for several weeks thereafter; and (2) clustering - disproportionate probability of consuming such that expenditure would be equal between the TOU/CPP and standard flat-rate pricing structures. This behavior is inconsistent with a purely neoclassical utility model, and has important implications for application of time-differentiated electricity pricing.
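Models of this kind build on a reference-dependent value function in which losses loom larger than gains. A hedged sketch of the standard Kahneman-Tversky form, with illustrative parameters (the paper's exact specification over electricity expenditure may differ):

```python
# Hedged sketch: a prospect-theory value function over deviations x of
# electricity expenditure from a reference point. lambda_ > 1 encodes
# loss aversion; alpha and lambda_ values here are illustrative.
def value(x, alpha=0.88, lambda_=2.25):
    if x >= 0:                         # gain relative to the reference
        return x ** alpha
    return -lambda_ * ((-x) ** alpha)  # losses loom larger than gains

# A loss outweighs an equal-sized gain:
print(value(10) + value(-10) < 0)  # -> True
```

The kink at the reference point is what generates the clustering prediction: households disproportionately consume so that expenditure sits exactly at the reference (here, equality between the two pricing structures).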
Wolves in sheep's clothing: Is non-profit status used to signal quality?
Jones, Daniel B; Propper, Carol; Smith, Sarah
2017-09-01
Why do many firms in the healthcare sector adopt non-profit status? One argument is that non-profit status serves as a signal of quality when consumers are not well informed. A testable implication is that an increase in consumer information may lead to a reduction in the number of non-profits in a market. We test this idea empirically by exploiting an exogenous increase in consumer information in the US nursing home industry. We find that the information shock led to a reduction in the share of non-profit homes, driven by a combination of home closure and sector switching. The lowest quality non-profits were the most likely to exit. Our results have important implications for the effects of reforms to increase consumer provision in a number of public services. Copyright © 2017. Published by Elsevier B.V.
Testability analysis on a hydraulic system in a certain equipment based on simulation model
NASA Astrophysics Data System (ADS)
Zhang, Rui; Cong, Hua; Liu, Yuanhong; Feng, Fuzhou
2018-03-01
Aiming at the complicated structure of hydraulic systems and the shortage of fault statistics for them, a multi-valued testability analysis method based on a simulation model is proposed. Using an AMESim simulation model, the method injects simulated faults and records the variation of test parameters, such as pressure and flow rate, at each test point relative to normal conditions, establishing a multi-valued fault-test dependency matrix. The fault detection rate (FDR) and fault isolation rate (FIR) are then calculated from the dependency matrix. Finally, the testability and fault diagnosis capability of the system are analyzed and evaluated; the system reaches only 54% (FDR) and 23% (FIR). To improve the testability of the system, the number and position of the test points are optimized. Results show that the proposed test placement scheme addresses the difficulty, inefficiency and high cost of maintaining the system.
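The FDR/FIR computation from a fault-test dependency matrix can be sketched as follows. This is a simplified binary version assuming equal fault likelihoods (the paper uses a multi-valued matrix and may weight faults by failure rates); the example matrix is invented for illustration.

```python
# Sketch (assumed equal fault likelihoods): FDR = fraction of faults
# detected by at least one test; FIR = fraction of detected faults whose
# test signature (row of the dependency matrix) is unique, i.e. can be
# isolated from every other detected fault.
def fdr_fir(D):
    detected = [row for row in D if any(row)]
    fdr = len(detected) / len(D)
    sigs = [tuple(row) for row in detected]
    isolated = [s for s in sigs if sigs.count(s) == 1]
    fir = len(isolated) / len(detected) if detected else 0.0
    return fdr, fir

# Rows = faults, columns = test points (1 = test responds to the fault).
D = [[1, 0, 0],   # fault 1: unique signature -> isolatable
     [0, 1, 1],   # faults 2 and 3 share a signature: detectable
     [0, 1, 1],   #   but not isolatable from each other
     [0, 0, 0]]   # fault 4: undetected
fdr, fir = fdr_fir(D)
print(fdr)  # -> 0.75
```

Optimizing test-point placement, as the paper does, amounts to choosing columns that raise both the coverage (FDR) and the distinctness of the rows (FIR).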
Generating Testable Questions in the Science Classroom: The BDC Model
ERIC Educational Resources Information Center
Tseng, ChingMei; Chen, Shu-Bi; Chang, Wen-Hua
2015-01-01
Guiding students to generate testable scientific questions is essential in the inquiry classroom, but it is not easy. The purpose of the BDC ("Big Idea, Divergent Thinking, and Convergent Thinking") instructional model is to scaffold students' inquiry learning. We illustrate the use of this model with an example lesson, designed…
Easily Testable PLA-Based Finite State Machines
1989-03-01
PLATYPUS [20]. Then, justification paths are obtained from the STG using simple logic... faults of type 1, 4 and 5 can be guaranteed to be testable via... a vector pair on the next state lines is found, if such a vector pair exists, using PLATYPUS [20]... the pair that is generated by the first corrupted
1983-11-01
compound operations, with status. (h) Pre-programmed CRC and double-precision multiply/divide algorithms. (i) Double-length accumulator with full...
Eye Examination Testability in Children with Autism and in Typical Peers
Coulter, Rachel Anastasia; Bade, Annette; Tea, Yin; Fecho, Gregory; Amster, Deborah; Jenewein, Erin; Rodena, Jacqueline; Lyons, Kara Kelley; Mitchell, G. Lynn; Quint, Nicole; Dunbar, Sandra; Ricamato, Michele; Trocchio, Jennie; Kabat, Bonnie; Garcia, Chantel; Radik, Irina
2015-01-01
ABSTRACT Purpose To compare testability of vision and eye tests in an examination protocol of 9- to 17-year-old patients with autism spectrum disorder (ASD) to typically developing (TD) peers. Methods In a prospective pilot study, 61 children and adolescents (34 with ASD and 27 who were TD) aged 9 to 17 years completed an eye examination protocol including tests of visual acuity, refraction, convergence (eye teaming), stereoacuity (depth perception), ocular motility, and ocular health. Patients who required new refractive correction were retested after wearing their updated spectacle prescription for 1 month. The specialized protocol incorporated visual, sensory, and communication supports. A psychologist determined group status/eligibility using DSM-IV-TR (Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition, Text Revision) criteria by review of previous evaluations and parent responses on the Social Communication Questionnaire. Before the examination, parents provided information regarding patients’ sex, race, ethnicity, and, for ASD patients, verbal communication level (nonverbal, uses short words, verbal). Parents indicated whether the patient wore a refractive correction, whether the patient had ever had an eye examination, and the age at the last examination. Chi-square tests compared testability results for TD and ASD groups. Results Typically developing and ASD groups did not differ by age (p = 0.54), sex (p = 0.53), or ethnicity (p = 0.22). Testability was high on most tests (TD, 100%; ASD, 88 to 100%), except for intraocular pressure (IOP), which was reduced for both the ASD (71%) and the TD (89%) patients. Among ASD patients, IOP testability varied greatly with verbal communication level (p < 0.001). Although IOP measurements were completed on all verbal patients, only 37.5% of nonverbal and 44.4% of ASD patients who used short words were successful. 
Conclusions Patients with ASD can complete most vision and eye tests within an examination protocol. Testability of IOPs is reduced, particularly for nonverbal patients and patients who use short words to communicate. PMID:25415280
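The group comparisons reported above rely on chi-square tests of proportions. A self-contained sketch of the 2x2 case, using counts that only approximate the reported IOP testability percentages (TD 89% of 27; ASD 71% of 34) and are not the study's actual data:

```python
# Illustrative only: Pearson chi-square statistic for a 2x2 table of the
# kind used to compare IOP testability between groups. Counts below
# approximate the reported percentages and are NOT the study data.
def chi_square_2x2(a, b, c, d):
    # table: [[a, b], [c, d]], rows = groups, cols = pass/fail
    n = a + b + c + d
    expected = [(a + b) * (a + c) / n, (a + b) * (b + d) / n,
                (c + d) * (a + c) / n, (c + d) * (b + d) / n]
    observed = [a, b, c, d]
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

stat = chi_square_2x2(24, 3, 24, 10)  # TD pass/fail vs ASD pass/fail
print(stat > 2.7)  # -> True; compare against chi-square critical values
```

With 1 degree of freedom, a statistic above 3.84 would correspond to p < 0.05; in practice one would use a library routine such as scipy.stats.chi2_contingency, which also applies continuity corrections.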
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bray, O.H.
This paper describes a natural language based, semantic information modeling methodology and explores its use and value in clarifying and comparing political science theories and frameworks. As an example, the paper uses this methodology to clarify and compare some of the basic concepts and relationships in the realist (e.g. Waltz) and the liberal (e.g. Rosenau) paradigms for international relations. The methodology can provide three types of benefits: (1) it can clarify and make explicit exactly what is meant by a concept; (2) it can often identify unanticipated implications and consequences of concepts and relationships; and (3) it can help in identifying and operationalizing testable hypotheses.
Domain generality vs. modality specificity: The paradox of statistical learning
Frost, Ram; Armstrong, Blair C.; Siegelman, Noam; Christiansen, Morten H.
2015-01-01
Statistical learning is typically considered to be a domain-general mechanism by which cognitive systems discover the underlying distributional properties of the input. Recent studies examining whether there are commonalities in the learning of distributional information across different domains or modalities consistently reveal, however, modality and stimulus specificity. An important question is, therefore, how and why a hypothesized domain-general learning mechanism systematically produces such effects. We offer a theoretical framework according to which statistical learning is not a unitary mechanism, but a set of domain-general computational principles, that operate in different modalities and therefore are subject to the specific constraints characteristic of their respective brain regions. This framework offers testable predictions and we discuss its computational and neurobiological plausibility. PMID:25631249
A Framework for Evidence-Based Licensure of Adaptive Autonomous Systems
2016-03-01
insights gleaned to DoD. The autonomy community has identified significant challenges associated with test, evaluation, verification and validation of...licensure as a test, evaluation, verification, and validation (TEVV) framework that can address these challenges. IDA found that traditional...language requirements to testable (preferably machine-testable) specifications • Design of architectures that treat development and verification of
The Diffusion Model Is Not a Deterministic Growth Model: Comment on Jones and Dzhafarov (2014)
Smith, Philip L.; Ratcliff, Roger; McKoon, Gail
2015-01-01
Jones and Dzhafarov (2014) claim that several current models of speeded decision making in cognitive tasks, including the diffusion model, can be viewed as special cases of other general models or model classes. The general models can be made to match any set of response time (RT) distribution and accuracy data exactly by a suitable choice of parameters and so are unfalsifiable. The implication of their claim is that models like the diffusion model are empirically testable only by artificially restricting them to exclude unfalsifiable instances of the general model. We show that Jones and Dzhafarov’s argument depends on enlarging the class of “diffusion” models to include models in which there is little or no diffusion. The unfalsifiable models are deterministic or near-deterministic growth models, from which the effects of within-trial variability have been removed or in which they are constrained to be negligible. These models attribute most or all of the variability in RT and accuracy to across-trial variability in the rate of evidence growth, which is permitted to be distributed arbitrarily and to vary freely across experimental conditions. In contrast, in the standard diffusion model, within-trial variability in evidence is the primary determinant of variability in RT. Across-trial variability, which determines the relative speed of correct responses and errors, is theoretically and empirically constrained. Jones and Dzhafarov’s attempt to include the diffusion model in a class of models that also includes deterministic growth models misrepresents and trivializes it and conveys a misleading picture of cognitive decision-making research. PMID:25347314
Larval Transport Modeling of Deep-Sea Invertebrates Can Aid the Search for Undiscovered Populations
Yearsley, Jon M.; Sigwart, Julia D.
2011-01-01
Background Many deep-sea benthic animals occur in patchy distributions separated by thousands of kilometres, yet because deep-sea habitats are remote, little is known about their larval dispersal. Our novel method simulates dispersal by combining data from the Argo array of autonomous oceanographic probes, deep-sea ecological surveys, and comparative invertebrate physiology. The predicted particle tracks allow quantitative, testable predictions about the dispersal of benthic invertebrate larvae in the south-west Pacific. Principal Findings In a test case presented here, using non-feeding, non-swimming (lecithotrophic trochophore) larvae of polyplacophoran molluscs (chitons), we show that the likely dispersal pathways in a single generation are significantly shorter than the distances between the three known population centres in our study region. The large-scale density of chiton populations throughout our study region is potentially much greater than present survey data suggest, with intermediate ‘stepping stone’ populations yet to be discovered. Conclusions/Significance We present a new method that is broadly applicable to studies of the dispersal of deep-sea organisms. This test case demonstrates the power and potential applications of our new method, in generating quantitative, testable hypotheses at multiple levels to solve the mismatch between observed and expected distributions: probabilistic predictions of locations of intermediate populations, potential alternative dispersal mechanisms, and expected population genetic structure. The global Argo data have never previously been used to address benthic biology, and our method can be applied to any non-swimming larvae of the deep-sea, giving information upon dispersal corridors and population densities in habitats that remain intrinsically difficult to assess. PMID:21857992
Authors’ response: mirror neurons: tests and testability.
Catmur, Caroline; Press, Clare; Cook, Richard; Bird, Geoffrey; Heyes, Cecilia
2014-04-01
Commentators have tended to focus on the conceptual framework of our article, the contrast between genetic and associative accounts of mirror neurons, and to challenge it with additional possibilities rather than empirical data. This makes the empirically focused comments especially valuable. The mirror neuron debate is replete with ideas; what it needs now are system-level theories and careful experiments – tests and testability.
O'Malley, Maureen A
2018-06-01
Since the 1940s, microbiologists, biochemists and population geneticists have experimented with the genetic mechanisms of microorganisms in order to investigate evolutionary processes. These evolutionary studies of bacteria and other microorganisms gained some recognition from the standard-bearers of the modern synthesis of evolutionary biology, especially Theodosius Dobzhansky and Ledyard Stebbins. A further period of post-synthesis bacterial evolutionary research occurred between the 1950s and 1980s. These experimental analyses focused on the evolution of population and genetic structure, the adaptive gain of new functions, and the evolutionary consequences of competition dynamics. This large body of research aimed to make evolutionary theory testable and predictive, by giving it mechanistic underpinnings. Although evolutionary microbiologists promoted bacterial experiments as methodologically advantageous and a source of general insight into evolution, they also acknowledged the biological differences of bacteria. My historical overview concludes with reflections on what bacterial evolutionary research achieved in this period, and its implications for the still-developing modern synthesis.
Hunter, Lora Rose; Schmidt, Norman B
2010-03-01
In this review, the extant literature concerning anxiety psychopathology in African American adults is summarized to develop a testable, explanatory framework with implications for future research. The model was designed to account for purported lower rates of anxiety disorders in African Americans compared to European Americans, along with other ethnoracial differences reported in the literature. Three specific beliefs or attitudes related to the sociocultural experience of African Americans are identified: awareness of racism, stigma of mental illness, and salience of physical illnesses. In our model, we propose that these psychological processes influence interpretations and behaviors relevant to the expression of nonpathological anxiety as well as features of diagnosable anxiety conditions. Moreover, differences in these processes may explain the differential assessed rates of anxiety disorders in African Americans. The model is discussed in the context of existing models of anxiety etiology. Specific follow-up research is also suggested, along with implications for clinical assessment, diagnosis, and treatment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kenyon, Scott J.; Bromley, Benjamin C., E-mail: skenyon@cfa.harvard.edu, E-mail: bromley@physics.utah.edu
2012-03-15
We investigate whether coagulation models of planet formation can explain the observed size distributions of trans-Neptunian objects (TNOs). Analyzing published and new calculations, we demonstrate robust relations between the size of the largest object and the slope of the size distribution for sizes 0.1 km and larger. These relations yield clear, testable predictions for TNOs and other icy objects throughout the solar system. Applying our results to existing observations, we show that a broad range of initial disk masses, planetesimal sizes, and fragmentation parameters can explain the data. Adding dynamical constraints on the initial semimajor axis of 'hot' Kuiper Belt objects along with probable TNO formation times of 10-700 Myr restricts the viable models to those with a massive disk composed of relatively small (1-10 km) planetesimals.
Artificial Intelligence Applications to Testability.
1984-10-01
general software assistant; examining testability utilization of it should wait a few years until the software assistant is a well-defined product ...ago. It provides a single host which satisfies the needs of developers, product developers, and end users. As shown in table 5.10-2, it also provides...follows a trend towards more user-oriented design approaches to interactive computer systems. The implicit goal in this trend is the
The need for theory to guide concussion research.
Molfese, Dennis L
2015-01-01
Although research into concussion has greatly expanded over the past decade, progress in identifying the mechanisms and consequences of head injury and recovery are largely absent. Instead, data are accumulated without the guidance of a systematic theory to direct research questions or generate testable hypotheses. As part of this special issue on sports concussion, I advance a theory that emphasizes changes in spatial and temporal distributions of the brain's neural networks during normal learning and the disruptions of these networks following injury. Specific predictions are made regarding both the development of the network as well as its breakdown following injury.
Why is there a dearth of close-in planets around fast-rotating stars?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Teitler, Seth; Königl, Arieh, E-mail: satelite@gmail.com, E-mail: akonigl@uchicago.edu
2014-05-10
We propose that the reported dearth of Kepler objects of interest (KOIs) with orbital periods P_orb ≲ 2-3 days around stars with rotation periods P_rot ≲ 5-10 days can be attributed to tidal ingestion of close-in planets by their host stars. We show that the planet distribution in this region of the log P_orb-log P_rot plane is qualitatively reproduced with a model that incorporates tidal interaction and magnetic braking as well as the dependence on the stellar core-envelope coupling timescale. We demonstrate the consistency of this scenario with the inferred break in the P_orb distribution of close-in KOIs and point out a potentially testable prediction of this interpretation.
Advanced Launch System Multi-Path Redundant Avionics Architecture Analysis and Characterization
NASA Technical Reports Server (NTRS)
Baker, Robert L.
1993-01-01
The objective of the Multi-Path Redundant Avionics Suite (MPRAS) program is the development of a set of avionic architectural modules which will be applicable to the family of launch vehicles required to support the Advanced Launch System (ALS). To enable ALS cost/performance requirements to be met, the MPRAS must support autonomy, maintenance, and testability capabilities which exceed those present in conventional launch vehicles. The multi-path redundant or fault tolerance characteristics of the MPRAS are necessary to offset a reduction in avionics reliability due to the increased complexity needed to support these new cost reduction and performance capabilities and to meet avionics reliability requirements which will provide cost-effective reductions in overall ALS recurring costs. A complex, real-time distributed computing system is needed to meet the ALS avionics system requirements. General Dynamics, Boeing Aerospace, and C.S. Draper Laboratory have proposed system architectures as candidates for the ALS MPRAS. The purpose of this document is to report the results of independent performance and reliability characterization and assessment analyses of each proposed candidate architecture and qualitative assessments of testability, maintainability, and fault tolerance mechanisms. These independent analyses were conducted as part of the MPRAS Part 2 program and were carried under NASA Langley Research Contract NAS1-17964, Task Assignment 28.
Harris, Jenine K; Erwin, Paul C; Smith, Carson; Brownson, Ross C
2015-01-01
Evidence-based decision making (EBDM) is the process, in local health departments (LHDs) and other settings, of translating the best available scientific evidence into practice. Local health departments are more likely to be successful if they use evidence-based strategies. However, EBDM and use of evidence-based strategies by LHDs are not widespread. Drawing on diffusion of innovations theory, we sought to understand how LHD directors and program managers perceive the relative advantage, compatibility, simplicity, and testability of EBDM. Directors and managers of programs in chronic disease, environmental health, and infectious disease from LHDs nationwide completed a survey including demographic information and questions about diffusion attributes (advantage, compatibility, simplicity, and testability) related to EBDM. Bivariate inferential tests were used to compare responses between directors and managers and to examine associations between participant characteristics and diffusion attributes. Relative advantage and compatibility scores were high for directors and managers, whereas simplicity and testability scores were lower. Although health department directors and managers of programs in chronic disease generally had higher scores than other groups, there were few significant or large differences between directors and managers across the diffusion attributes. Larger jurisdiction population size was associated with higher relative advantage and compatibility scores for both directors and managers. Overall, directors and managers were in strong agreement on the relative advantage of an LHD using EBDM, with directors in stronger agreement than managers. Perceived relative advantage has been demonstrated to be the most important factor in the rate of innovation adoption, suggesting an opportunity for directors to speed EBDM adoption. However, lower average scores across all groups for simplicity and testability may be hindering EBDM adoption. 
Recommended strategies for increasing perceived EBDM simplicity and testability are provided.
Testability Design Rating System: Testability Handbook. Volume 1
1992-02-01
4.7.5 Summary of False BIT Alarms (FBA); 4.7.6 Smart BIT Technique... Circuit Board; PGA: Pin Grid Array; PLA: Programmable Logic Array; PLD: Programmable Logic Device; PN: Pseudo-Random Number; PREDICT: Probabilistic Estimation of... 4.7.6 Smart BIT (reference: RADC-TR-85-198). "Smart" BIT is a term given to BIT circuitry in a system LRU which includes dedicated processor/memory
Smart substrates: Making multi-chip modules smarter
NASA Astrophysics Data System (ADS)
Wunsch, T. F.; Treece, R. K.
1995-05-01
A novel multi-chip module (MCM) design and manufacturing methodology which utilizes active CMOS circuits in what is normally a passive substrate realizes the 'smart substrate' for use in highly testable, high-reliability MCMs. The active devices are used to test the bare substrate, diagnose assembly errors or integrated circuit (IC) failures that require rework, and improve the testability of the final MCM assembly. A static random access memory (SRAM) MCM has been designed and fabricated in Sandia's Microelectronics Development Laboratory in order to demonstrate the technical feasibility of this concept and to examine design and manufacturing issues which will ultimately determine the economic viability of this approach. The smart substrate memory MCM represents a first in MCM packaging. At the time the first modules were fabricated, no other company or MCM vendor had incorporated active devices in the substrate to improve manufacturability and testability, and thereby improve MCM reliability and reduce cost.
Emotional intervention strategies for dementia-related behavior: a theory synthesis.
Yao, Lan; Algase, Donna
2008-04-01
Behavioral disturbances of elders with dementia are prevalent. Yet the science guiding development and testing of effective intervention strategies is limited by rudimentary and often-conflicting theories. Using a theory-synthesis approach conducted within the perspective of the need-driven dementia-compromised behavior model, this article presents the locomoting responses to environment in elders with dementia (LRE-EWD) model. This new model, based on empirical and theoretical evidence, integrates the role of emotion with that of cognition in explicating a person-environment dynamic supporting wandering and other dementia-related disturbances. Included is evidence of the theory's testability and elaboration of its implications. The LRE-EWD model resolves conflicting views and evidence from current research on environmental interventions for behavior disturbances and opens new avenues to advance this field of study and practice.
Chromosomes, conflict, and epigenetics: chromosomal speciation revisited.
Brown, Judith D; O'Neill, Rachel J
2010-01-01
Since Darwin first noted that the process of speciation was indeed the "mystery of mysteries," scientists have tried to develop testable models for the development of reproductive incompatibilities-the first step in the formation of a new species. Early theorists proposed that chromosome rearrangements were implicated in the process of reproductive isolation; however, the chromosomal speciation model has recently been questioned. In addition, recent data from hybrid model systems indicates that simple epistatic interactions, the Dobzhansky-Muller incompatibilities, are more complex. In fact, incompatibilities are quite broad, including interactions among heterochromatin, small RNAs, and distinct, epigenetically defined genomic regions such as the centromere. In this review, we will examine both classical and current models of chromosomal speciation and describe the "evolving" theory of genetic conflict, epigenetics, and chromosomal speciation.
Thalamocortical mechanisms for integrating musical tone and rhythm
Musacchia, Gabriella; Large, Edward
2014-01-01
Studies over several decades have identified many of the neuronal substrates of music perception by pursuing pitch and rhythm perception separately. Here, we address the question of how these mechanisms interact, starting with the observation that the peripheral pathways of the so-called “Core” and “Matrix” thalamocortical system provide the anatomical bases for tone and rhythm channels. We then examine the hypothesis that these specialized inputs integrate tonal content within rhythm context in auditory cortex using classical types of “driving” and “modulatory” mechanisms. This hypothesis provides a framework for deriving testable predictions about the early stages of music processing. Furthermore, because thalamocortical circuits are shared by speech and music processing, such a model provides concrete implications for how music experience contributes to the development of robust speech encoding mechanisms. PMID:24103509
Paleolithic vs. modern diets--selected pathophysiological implications.
Eaton, S B; Eaton, S B
2000-04-01
The nutritional patterns of Paleolithic humans influenced genetic evolution during the time segment within which defining characteristics of contemporary humans were selected. Our genome can have changed little since the beginnings of agriculture, so, genetically, humans remain Stone Agers--adapted for a Paleolithic dietary regimen. Such diets were based chiefly on wild game, fish and uncultivated plant foods. They provided abundant protein; a fat profile much different from that of affluent Western nations; high fibre; carbohydrate from fruits and vegetables (and some honey) but not from cereals, refined sugars and dairy products; high levels of micronutrients and probably of phytochemicals as well. Differences between contemporary and ancestral diets have many pathophysiological implications. This review addresses phytochemicals and cancer; calcium, physical exertion, bone mineral density and bone structural geometry; dietary protein, potassium, renal acid secretion and urinary calcium loss; and finally sarcopenia, adiposity, insulin receptors and insulin resistance. While not yet a basis for formal recommendations, awareness of Paleolithic nutritional patterns should generate novel, testable hypotheses grounded in evolutionary theory and it should dispel complacency regarding currently accepted nutritional tenets.
Lee, Joy L; DeCamp, Matthew; Dredze, Mark; Chisolm, Margaret S; Berger, Zackary D
2014-10-15
Twitter is home to many health professionals who send messages about a variety of health-related topics. Amid concerns about physicians posting inappropriate content online, more in-depth knowledge about these messages is needed to understand health professionals' behavior on Twitter. Our goal was to characterize the content of Twitter messages, specifically focusing on health professionals and their tweets relating to health. We performed an in-depth content analysis of 700 tweets. Qualitative content analysis was conducted on tweets by health users on Twitter. The primary objective was to describe the general type of content (ie, health-related versus non-health related) on Twitter authored by health professionals and further to describe health-related tweets on the basis of the type of statement made. Specific attention was given to whether a tweet was personal (as opposed to professional) or made a claim that users would expect to be supported by some level of medical evidence (ie, a "testable" claim). A secondary objective was to compare content types among different users, including patients, physicians, nurses, health care organizations, and others. Health-related users are posting a wide range of content on Twitter. Among health-related tweets, 53.2% (184/346) contained a testable claim. Of health-related tweets by providers, 17.6% (61/346) were personal in nature; 61% (59/96) made testable statements. While organizations and businesses use Twitter to promote their services and products, patient advocates are using this tool to share their personal experiences with health. Twitter users in health-related fields tweet about both testable claims and personal experiences. Future work should assess the relationship between testable tweets and the actual level of evidence supporting them, including how Twitter users-especially patients-interpret the content of tweets posted by health providers.
The social architecture of capitalism
NASA Astrophysics Data System (ADS)
Wright, Ian
2005-02-01
A dynamic model of the social relations between workers and capitalists is introduced. The model self-organises into a dynamic equilibrium with statistical properties that are in close qualitative and in many cases quantitative agreement with a broad range of known empirical distributions of developed capitalism, including the power-law firm size distribution, the Laplace firm and GDP growth distribution, the lognormal firm demises distribution, the exponential recession duration distribution, the lognormal-Pareto income distribution, and the gamma-like firm rate-of-profit distribution. Normally these distributions are studied in isolation, but this model unifies and connects them within a single causal framework. The model also generates business cycle phenomena, including fluctuating wage and profit shares in national income about values consistent with empirical studies. The generation of an approximately lognormal-Pareto income distribution and an exponential-Pareto wealth distribution demonstrates that the power-law regime of the income distribution can be explained by an additive process on a power-law network that models the social relation between employers and employees organised in firms, rather than a multiplicative process that models returns to investment in financial markets. A testable consequence of the model is the conjecture that the rate-of-profit distribution is consistent with a parameter-mix of a ratio of normal variates with means and variances that depend on a firm size parameter that is distributed according to a power-law.
Broadening conceptions of learning in medical education: the message from teamworking.
Bleakley, Alan
2006-02-01
There is a mismatch between the broad range of learning theories offered in the wider education literature and a relatively narrow range of theories privileged in the medical education literature. The latter are usually described under the heading of 'adult learning theory'. This paper critically addresses the limitations of the current dominant learning theories informing medical education. An argument is made that such theories, which address how an individual learns, fail to explain how learning occurs in dynamic, complex and unstable systems such as fluid clinical teams. Models of learning that take into account distributed knowing, learning through time as well as space, and the complexity of a learning environment including relationships between persons and artefacts, are more powerful in explaining and predicting how learning occurs in clinical teams. Learning theories may be privileged for ideological reasons, such as medicine's concern with autonomy. Where an increasing amount of medical education occurs in workplace contexts, sociocultural learning theories offer a best-fit exploration and explanation of such learning. We need to continue to develop testable models of learning that inform safe work practice. One type of learning theory will not inform all practice contexts and we need to think about a range of fit-for-purpose theories that are testable in practice. Exciting current developments include dynamicist models of learning drawing on complexity theory.
NASA Astrophysics Data System (ADS)
Douglas, M. M.; Bunn, S. E.; Davies, P. M.
2005-05-01
The tropical rivers of northern Australia are internationally recognised for their high ecological and cultural values. They have largely unmodified flow regimes and are comparatively free of the impacts associated with intensive land use. However, there is growing demand for agricultural development and existing pressures, such as weeds and feral animals, threaten their ecological integrity. Using the international literature to provide a conceptual framework and drawing on limited published and unpublished data on rivers in northern Australia, we have derived five general principles about food webs and related ecosystem processes that both characterise tropical rivers of northern Australia and have important implications for their management. These are: (1) Seasonal hydrology is a strong driver of ecosystem processes and food web structure; (2) Hydrological connectivity is largely intact and underpins important terrestrial-aquatic food web subsidies; (3) River and wetland food webs are strongly dependent on algal production; (4) A few common macroconsumer species have a strong influence on benthic food webs; (5) Omnivory is widespread and food chains are short. These principles have implications for the management and protection of tropical rivers and wetlands of northern Australia and provide a framework for the formation of testable hypotheses in future research programs.
Curiosity at Vera Rubin Ridge: Testable Hypotheses, First Results, and Implications for Habitability
NASA Astrophysics Data System (ADS)
Fraeman, A.; Bedford, C.; Bridges, J.; Edgar, L. A.; Hardgrove, C.; Horgan, B. H. N.; Gabriel, T. S. J.; Grotzinger, J. P.; Gupta, S.; Johnson, J. R.; Rampe, E. B.; Morris, R. V.; Salvatore, M. R.; Schwenzer, S. P.; Stack, K.; Pinet, P. C.; Rubin, D. M.; Weitz, C. M.; Wellington, D. F.; Wiens, R. C.; Williams, A. J.; Vasavada, A. R.
2017-12-01
As of sol 1756, Curiosity was 250 meters from ascending Vera Rubin Ridge, a unique geomorphic feature preserved in the lower foothills of Aeolis Mons (informally known as Mt. Sharp) that is distinguishable from orbit. Vera Rubin Ridge (previously termed the Hematite Ridge) is characterized by a higher thermal inertia than the surrounding terrain, is comparatively resistant to erosion, and is capped with a hematite-bearing layer that is visible in 18 m/pixel CRISM data. A key hypothesis associated with this unit is that it represents a redox interface where ferrous iron oxidized and precipitated either as hematite or another ferric precursor. The Curiosity integrated payload is being used to determine the depositional environment(s), stratigraphic context and geochemical conditions associated with this interface, all of which will provide key insights into its past habitability potential and the relative timing of processes. Specifically, analysis of Curiosity data will address four major questions related to the history and evolution of ridge-forming strata: (1) What is the stratigraphic relationship between the units in the ridge and the Mt. Sharp group (see Grotzinger et al., 2015)? (2) What primary and secondary geologic processes deposited and modified the ridge units over time? (3) What is the nature and timing of the hematite precipitation environment, and how does it relate to similar oxidized phases in the Murray formation? (4) What are the implications for habitability and the preservation of organic molecules? Initial results of a systematic imaging campaign along the contact between the lower portion of the ridge and the Murray formation have revealed dm-scale cross bedding within the ridge stratigraphy, which provides clues about the depositional environments; these can be compared to suites of sedimentary structures within the adjacent Murray formation.
Long-distance ChemCam passive and Mastcam multispectral data show that hematite and likely other ferric phases are present in the upper ridge, consistent with orbital data. Curiosity will continue to take systematic observations that draw upon testable hypotheses about the ridge environments as the rover ascends Vera Rubin Ridge.
Flight control system design factors for applying automated testing techniques
NASA Technical Reports Server (NTRS)
Sitz, Joel R.; Vernon, Todd H.
1990-01-01
The principal design features and operational experiences of the X-29 forward-swept-wing aircraft and F-18 high alpha research vehicle (HARV) automated test systems are discussed. It is noted that operational experiences in developing and using these automated testing techniques have highlighted the need for incorporating target system features to improve testability. Improved target system testability can be accomplished with the addition of nonreal-time and real-time features. Online access to target system implementation details, unobtrusive real-time access to internal user-selectable variables, and proper software instrumentation are all desirable features of the target system. Also, test system and target system design issues must be addressed during the early stages of the target system development. Processing speeds of up to 20 million instructions/s and the development of high-bandwidth reflective memory systems have improved the ability to integrate the target system and test system for the application of automated testing techniques. It is concluded that new methods of designing testability into the target systems are required.
Abu Bakar, Nurul Farhana; Chen, Ai-Hong
2014-02-01
Children with learning disabilities may have difficulty communicating effectively and giving reliable responses as required in various visual function testing procedures. The purpose of this study was to compare the testability of visual acuity using the modified Early Treatment Diabetic Retinopathy Study (ETDRS) and Cambridge Crowding Cards, stereo acuity using Lang Stereo test II and Butterfly stereo tests and colour perception using Colour Vision Test Made Easy (CVTME) and Ishihara's Test for Colour Deficiency (Ishihara Test) between children in mainstream classes and children with learning disabilities in special education classes in government primary schools. A total of 100 primary school children (50 children from mainstream classes and 50 children from special education classes) matched in age were recruited in this cross-sectional comparative study. The testability was determined by the percentage of children who were able to give reliable responses as required by the respective tests. 'Unable to test' was defined as inappropriate response or uncooperative despite best efforts of the screener. The testability of the modified ETDRS, Butterfly stereo test and Ishihara test was found to be lower among children in special education classes (P < 0.001), but not that of the Cambridge Crowding Cards, Lang Stereo test II and CVTME. Non-verbal or "matching" approaches were found to be superior for testing visual functions in children with learning disabilities. Modifications of vision testing procedures are essential for children with learning disabilities.
Leveraging ecological theory to guide natural product discovery.
Smanski, Michael J; Schlatter, Daniel C; Kinkel, Linda L
2016-03-01
Technological improvements have accelerated natural product (NP) discovery and engineering to the point that systematic genome mining for new molecules is on the horizon. NP biosynthetic potential is not equally distributed across organisms, environments, or microbial life histories, but instead is enriched in a number of prolific clades. Also, NPs are not equally abundant in nature; some are quite common and others markedly rare. Armed with this knowledge, random 'fishing expeditions' for new NPs are increasingly hard to justify. Understanding the ecological and evolutionary pressures that drive the non-uniform distribution of NP biosynthesis provides a rational framework for the targeted isolation of strains enriched in new NP potential. Additionally, ecological theory leads to testable hypotheses regarding the roles of NPs in shaping ecosystems. Here we review several recent strain prioritization practices and discuss the ecological and evolutionary underpinnings for each. Finally, we offer perspectives on leveraging microbial ecology and evolutionary biology for future NP discovery.
Ecological Suitability and Spatial Distribution of Five Anopheles Species in Amazonian Brazil
McKeon, Sascha N.; Schlichting, Carl D.; Povoa, Marinete M.; Conn, Jan E.
2013-01-01
Seventy-six sites characterized in Amazonian Brazil revealed distinct habitat diversification when the environmental factors associated with the distribution and abundance of five anopheline species (Diptera: Culicidae) in the subgenus Nyssorhynchus were examined. These included three members of the Albitarsis Complex (Anopheles oryzalimnetes, Anopheles marajoara and Anopheles janconnae), as well as Anopheles triannulatus and Anopheles goeldii. Anopheles janconnae abundance had a positive correlation with water flow and a negative relationship with sun exposure. Abundance of An. oryzalimnetes was associated with water chemistry. Anopheles goeldii larvae were abundant in shaded, more saline waters. Anopheles marajoara and An. triannulatus were negatively associated with available resources, although An. marajoara also showed several local correlations. These analyses suggest that An. triannulatus is a habitat generalist, An. oryzalimnetes and An. janconnae are specialists, and An. marajoara and An. goeldii could not be easily classified either way. The correlations described herein provide testable hypotheses for future research and for identifying habitats for vector control. PMID:23546804
Solving puzzles of GW150914 by primordial black holes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blinnikov, S.; Dolgov, A.; Porayko, N.K.
The black hole binary properties inferred from the LIGO gravitational wave signal GW150914 posed several serious problems. The high masses and low effective spin of the black hole binary can be explained if the holes are primordial (PBHs) rather than the products of stellar binary evolution. Such PBH properties are usually postulated ad hoc rather than derived from fundamental theory. We show that the necessary features of PBHs follow naturally from a slightly modified Affleck-Dine (AD) mechanism of baryogenesis. The log-normal distribution of PBHs, predicted within the AD paradigm, is adjusted to provide an abundant population of low-spin stellar mass black holes. The same distribution gives a sufficient number of quickly growing seeds of supermassive black holes observed at high redshifts and may comprise an appreciable fraction of Dark Matter, which does not contradict any existing observational limits. Testable predictions of this scenario are discussed.
Mäs, Michael; Flache, Andreas
2013-01-01
Explanations of opinion bi-polarization hinge on the assumption of negative influence, individuals' striving to amplify differences to disliked others. However, empirical evidence for negative influence is inconclusive, which motivated us to search for an alternative explanation. Here, we demonstrate that bi-polarization can be explained without negative influence, drawing on theories that emphasize the communication of arguments as central mechanism of influence. Due to homophily, actors interact mainly with others whose arguments will intensify existing tendencies for or against the issue at stake. We develop an agent-based model of this theory and compare its implications to those of existing social-influence models, deriving testable hypotheses about the conditions of bi-polarization. Hypotheses were tested with a group-discussion experiment (N = 96). Results demonstrate that argument exchange can entail bi-polarization even when there is no negative influence.
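The mechanism described, homophilous partner choice plus purely positive argument exchange, can be sketched as a toy agent-based model. The parameter values, the 5-pro/5-con argument pool and the similarity weighting below are invented for illustration and are not the authors' specification:

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy persuasive-arguments model: each agent holds a subset of 5 "pro" and
# 5 "con" arguments; opinion = #pro - #con. Agents choose interaction
# partners homophilously and copy one argument from them, so all influence
# is positive -- no negative influence is modelled.
n_agents, n_args = 50, 10
holds = rng.random((n_agents, n_args)) < 0.5      # who holds which argument

def opinions(h):
    return h[:, :5].sum(axis=1) - h[:, 5:].sum(axis=1)

spread_before = opinions(holds).std()
for _ in range(3000):
    i = rng.integers(n_agents)
    ops = opinions(holds)
    weight = 1.0 / (1.0 + np.abs(ops - ops[i]))   # homophily: similar = likely partner
    weight[i] = 0.0
    j = rng.choice(n_agents, p=weight / weight.sum())
    k = rng.integers(n_args)
    holds[i, k] = holds[j, k]                     # adopt one of the partner's arguments

spread_after = opinions(holds).std()
print(spread_before, spread_after)  # in many runs the spread of opinions grows
```

Because like-minded agents exchange mostly reinforcing arguments, runs of this kind can drive subgroups toward opposite extremes even though no agent ever distances itself from another, which is the paper's central point.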
How hierarchical is language use?
Frank, Stefan L.; Bod, Rens; Christiansen, Morten H.
2012-01-01
It is generally assumed that hierarchical phrase structure plays a central role in human language. However, considerations of simplicity and evolutionary continuity suggest that hierarchical structure should not be invoked too hastily. Indeed, recent neurophysiological, behavioural and computational studies show that sequential sentence structure has considerable explanatory power and that hierarchical processing is often not involved. In this paper, we review evidence from the recent literature supporting the hypothesis that sequential structure may be fundamental to the comprehension, production and acquisition of human language. Moreover, we provide a preliminary sketch outlining a non-hierarchical model of language use and discuss its implications and testable predictions. If linguistic phenomena can be explained by sequential rather than hierarchical structure, this will have considerable impact in a wide range of fields, such as linguistics, ethology, cognitive neuroscience, psychology and computer science. PMID:22977157
Towards Understanding The Origin And Evolution Of Ultra-Diffuse Galaxies
NASA Astrophysics Data System (ADS)
van der Burg, Remco F. J.; Sifón, Cristóbal; Muzzin, Adam; Hoekstra, Henk; KiDS Collaboration; GAMA Collaboration
2017-06-01
Recent observations have shown that Ultra-Diffuse Galaxies (UDGs, which have the luminosities of dwarfs but sizes of giant galaxies) are surprisingly abundant in clusters of galaxies. The origin of these galaxies remains unclear, since one would naively expect them to be easily disrupted by tidal interactions in the cluster environment. Several formation scenarios have been proposed for UDGs, but these make a wide range of different testable observational predictions. I'll summarise recent results on two key observables that have the potential to differentiate between the proposed models, namely 1) a measurement of their (sub)halo masses using weak gravitational lensing, and 2) their abundance in lower-mass haloes using data from the GAMA and KiDS surveys. I'll discuss implications and future prospects to learn more about the properties and formation histories of these elusive galaxies.
The phantom leaf effect: a replication, part 1.
Hubacher, John
2015-02-01
To replicate the phantom leaf effect and demonstrate a possible means to directly observe properties of the biological field. Thirty percent to 60% of plant leaves were amputated, and the remaining leaf sections were photographed with corona discharge imaging. All leaves were cut before placement on film. A total of 137 leaves were used. Plant leaves of 14 different species. Ninety-six phantom leaf specimens were successfully obtained; 41 specimens did not yield the phantom leaf effect. A normally undetected phantom "structure," possibly evidence of the biological field, can persist in the area of an amputated leaf section, and corona discharge can occur from this invisible structure. This protocol may suggest a testable method to study properties of conductivity and other parameters through direct observation of the complete biological field in plant leaves, with broad implications for biology and physics.
Testability, Test Automation and Test Driven Development for the Trick Simulation Toolkit
NASA Technical Reports Server (NTRS)
Penn, John
2014-01-01
This paper describes the adoption of a Test Driven Development approach and a Continuous Integration System in the development of the Trick Simulation Toolkit, a generic simulation development environment for creating high fidelity training and engineering simulations at the NASA Johnson Space Center and many other NASA facilities. It describes the approach, and the significant benefits seen, such as fast, thorough and clear test feedback every time code is checked into the code repository. It also describes an approach that encourages development of code that is testable and adaptable.
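The red/green workflow the paper describes can be illustrated with a minimal example. This uses Python's unittest rather than Trick's actual C/C++ test harness, and the clamp function is invented for the illustration:

```python
import unittest

# Hypothetical TDD example: the test class is written first and fails
# ("red"); clamp() is then implemented until every assertion passes
# ("green"). This mirrors the workflow described, not Trick's own code.

def clamp(value, lo, hi):
    """Limit value to the closed interval [lo, hi]."""
    return max(lo, min(hi, value))

class ClampTest(unittest.TestCase):
    def test_value_within_range_is_unchanged(self):
        self.assertEqual(clamp(5, 0, 10), 5)

    def test_out_of_range_values_are_clipped(self):
        self.assertEqual(clamp(-3, 0, 10), 0)
        self.assertEqual(clamp(99, 0, 10), 10)

if __name__ == "__main__":
    unittest.main(argv=["clamp_test"], exit=False, verbosity=0)
```

In a continuous integration setup of the kind the paper credits, a runner executes such suites on every check-in, giving the fast and clear feedback described.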
NASA Astrophysics Data System (ADS)
Päßler, Jan-Filip; Jarochowska, Emilia; Bestmann, Michel; Munnecke, Axel
2018-02-01
Although carbonate-precipitating cyanobacteria are ubiquitous in aquatic ecosystems today, the criteria used to identify them in the geological record are subjective and rarely testable. Differences in the mode of biomineralization between cyanobacteria and eukaryotes, i.e. biologically induced mineralization (BIM) vs. biologically controlled mineralization (BCM), result in different crystallographic structures which might be used as a criterion to test cyanobacterial affinities. Cyanobacteria are often used as a ‘wastebasket taxon’, to which various microfossils are assigned. The lack of a testable criterion for the identification of cyanobacteria may bias their fossil record severely. We employed electron backscatter diffraction (EBSD) to investigate the structure of calcareous skeletons in two microproblematica widespread in Palaeozoic marine ecosystems: Rothpletzella, hypothesized to be a cyanobacterium, and an incertae sedis microorganism Allonema. We used a calcareous trilobite shell as a BCM reference. The mineralized skeleton of Allonema consists of a simple, single-layered structure of acicular crystals perpendicular to the surface of the organism. The c-axes of these crystals are parallel to the elongation and thereby normal to the surface of the organism. EBSD pole figures and misorientation axes distribution reveal a fibre texture around the c-axis with a small degree of variation (up to 30°), indicating a highly ordered structure. A comparable pattern was found in the trilobite shell. This structure allows biologically induced mineralization to be excluded as the mechanism of shell formation in Allonema. In Rothpletzella, the c-axes of the microcrystalline sheath show a broader clustering compared to Allonema, but still reveal crystals tending to be perpendicular to the surface of the organism. The misorientation axes of adjacent crystals show an approximately random distribution. Rothpletzella also shares morphological similarities with extant cyanobacteria.
We propose that the occurrence of a strong misorientation relationship between adjacent crystals with misorientation axes clustering around the c-axis can be used as a proxy for the degree of control exerted by an organism on its mineralized structures. Therefore, precisely constrained distributions of misorientations (misorientation angle and misorientation axis) may be used to identify BCM in otherwise problematic fossils and can be used to ground-truth the cyanobacterial affinities commonly proposed for problematic extinct organisms.
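The contrast underlying the proposed proxy, tightly clustered c-axes in a controlled (BCM) fabric versus near-random orientations in an induced (BIM) fabric, can be sketched numerically. This is a deliberate simplification that compares c-axis directions only and ignores the crystal-symmetry operators a real EBSD misorientation calculation must apply:

```python
import numpy as np

rng = np.random.default_rng(1)

def axis_angles(axes):
    """Angles in degrees between consecutive unit c-axis vectors."""
    cosines = np.clip(np.sum(axes[:-1] * axes[1:], axis=1), -1.0, 1.0)
    return np.degrees(np.arccos(cosines))

def random_unit_vectors(n):
    v = rng.normal(size=(n, 3))
    return v / np.linalg.norm(v, axis=1, keepdims=True)

# BCM-like fabric: c-axes tightly clustered around the surface normal.
normal = np.array([0.0, 0.0, 1.0])
bcm = random_unit_vectors(200) * 0.1 + normal
bcm /= np.linalg.norm(bcm, axis=1, keepdims=True)

# BIM-like fabric: essentially random crystal orientations.
bim = random_unit_vectors(200)

# Adjacent-crystal misorientation is far smaller in the controlled fabric:
print(axis_angles(bcm).mean() < axis_angles(bim).mean())
```

In this caricature, the mean adjacent-axis angle separates the two fabrics by a wide margin, which is the intuition behind using constrained misorientation distributions to ground-truth claimed cyanobacterial affinities.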
Laser-initiated ordnance for air-to-air missiles
NASA Technical Reports Server (NTRS)
Sumpter, David R.
1993-01-01
McDonnell Douglas Missile Systems Company (MDMSC) has developed a laser ignition subsystem (LIS) for air-to-air missile applications. The MDMSC subsystem is designed to activate batteries, unlock fins, and sequence propulsion system events. The subsystem includes Pyro Zirconium Pump (PZP) lasers, mechanical Safe & Arm, fiber-optic distribution system, and optically activated pyrotechnic devices (initiators, detonators, and thermal batteries). The LIS design has incorporated testability features for the laser modules, drive electronics, fiber-optics, and pyrotechnics. Several of the LIS have been fabricated and have supported thermal battery testing, integral rocket ramjet testing, and have been integrated into integral rocket ramjet flight test vehicles as part of the flight control subsystem.
Oxytocin, vasopressin and pair bonding: implications for autism.
Hammock, Elizabeth A D; Young, Larry J
2006-12-29
Understanding the neurobiological substrates regulating normal social behaviours may provide valuable insights in human behaviour, including developmental disorders such as autism that are characterized by pervasive deficits in social behaviour. Here, we review the literature which suggests that the neuropeptides oxytocin and vasopressin play critical roles in modulating social behaviours, with a focus on their role in the regulation of social bonding in monogamous rodents. Oxytocin and vasopressin contribute to a wide variety of social behaviours, including social recognition, communication, parental care, territorial aggression and social bonding. The effects of these two neuropeptides are species-specific and depend on species-specific receptor distributions in the brain. Comparative studies in voles with divergent social structures have revealed some of the neural and genetic mechanisms of social-bonding behaviour. Prairie voles are socially monogamous; males and females form long-term pair bonds, establish a nest site and rear their offspring together. In contrast, montane and meadow voles do not form a bond with a mate and only the females take part in rearing the young. Species differences in the density of receptors for oxytocin and vasopressin in ventral forebrain reward circuitry differentially reinforce social-bonding behaviour in the two species. High levels of oxytocin receptor (OTR) in the nucleus accumbens and high levels of vasopressin 1a receptor (V1aR) in the ventral pallidum contribute to monogamous social structure in the prairie vole. While little is known about the genetic factors contributing to species-differences in OTR distribution, the species-specific distribution pattern of the V1aR is determined in part by a species-specific repetitive element, or 'microsatellite', in the 5' regulatory region of the gene encoding V1aR (avpr1a). 
This microsatellite is highly expanded in the prairie vole (as well as the monogamous pine vole) compared to a very short version in the promiscuous montane and meadow voles. These species differences in microsatellite sequence are sufficient to change gene expression in cell culture. Within the prairie vole species, intraspecific variation in the microsatellite also modulates gene expression in vitro as well as receptor distribution patterns in vivo and influences the probability of social approach and bonding behaviour. Similar genetic variation in the human AVPR1A may contribute to variations in human social behaviour, including extremes outside the normal range of behaviour and those found in autism spectrum disorders. In sum, comparative studies in pair-bonding rodents have revealed neural and genetic mechanisms contributing to social-bonding behaviour. These studies have generated testable hypotheses regarding the motivational systems and underlying molecular neurobiology involved in social engagement and social bond formation that may have important implications for the core social deficits characterizing autism spectrum disorders.
Pair production processes and flavor in gauge-invariant perturbation theory
NASA Astrophysics Data System (ADS)
Egger, Larissa; Maas, Axel; Sondenheimer, René
2017-12-01
Gauge-invariant perturbation theory is an extension of ordinary perturbation theory which describes strictly gauge-invariant states in theories with a Brout-Englert-Higgs effect. Such gauge-invariant states are composite operators which have necessarily only global quantum numbers. As a consequence, flavor is exchanged for custodial quantum numbers in the Standard Model, recreating the fermion spectrum in the process. Here, we study the implications of such a description, possibly also for the generation structure of the Standard Model. In particular, this implies that scattering processes are essentially bound-state-bound-state interactions, and require a suitable description. We analyze the implications for the pair-production process e+e- → f̄f at a linear collider to leading order. We show how ordinary perturbation theory is recovered as the leading contribution. Using a PDF-type language, we also assess the impact of sub-leading contributions. To lowest order, we find that the result is mainly influenced by the size of the Higgs contribution at large x. This gives an interesting, possibly experimentally testable, scenario for the formal field theory underlying the electroweak sector of the Standard Model.
A conceptual model for generating and validating in-session clinical judgments
Jacinto, Sofia B.; Lewis, Cara C.; Braga, João N.; Scott, Kelli
2016-01-01
Objective Little attention has been paid to the nuanced and complex decisions made in the clinical session context and how these decisions influence therapy effectiveness. Despite decades of research on the dual-processing systems, it remains unclear when and how intuitive and analytical reasoning influence the direction of the clinical session. Method This paper puts forth a testable conceptual model, guided by an interdisciplinary integration of the literature, that posits that the clinical session context moderates the use of intuitive versus analytical reasoning. Results A synthesis of studies examining professional best practices in clinical decision-making, empirical evidence from clinical judgment research, and the application of decision science theories indicate that intuitive and analytical reasoning may have profoundly different impacts on clinical practice and outcomes. Conclusions The proposed model is discussed with respect to its implications for clinical practice and future research. PMID:27088962
Ethnic Enclaves and the Earnings of Immigrants
Xie, Yu; Gough, Margaret
2011-01-01
A large literature in sociology concerns the implications of immigrants’ participation in ethnic enclaves for their economic and social well-being. The “enclave thesis” speculates that immigrants benefit from working in ethnic enclaves. Previous research concerning the effects of enclave participation on immigrants’ economic outcomes has come to mixed conclusions as to whether enclave effects are positive or negative. In this article, we seek to extend and improve upon past work by formulating testable hypotheses based on the enclave thesis and testing them with data from the 2003 New Immigrant Survey (NIS), employing both residence-based and workplace-based measures of the ethnic enclave. We compare the economic outcomes of immigrants working in ethnic enclaves with those of immigrants working in the mainstream economy. Our research yields minimal support for the enclave thesis. Our results further indicate that for some immigrant groups, ethnic enclave participation actually has a negative effect on economic outcomes. PMID:21863367
The Psychology of Working Theory.
Duffy, Ryan D; Blustein, David L; Diemer, Matthew A; Autin, Kelsey L
2016-03-01
In the current article, we build on research from vocational psychology, multicultural psychology, intersectionality, and the sociology of work to construct an empirically testable Psychology of Working Theory (PWT). Our central aim is to explain the work experiences of all individuals, but particularly people near or in poverty, people who face discrimination and marginalization in their lives, and people facing challenging work-based transitions for which contextual factors are often the primary drivers of the ability to secure decent work. The concept of decent work is defined and positioned as the central variable within the theory. A series of propositions is offered concerning (a) contextual predictors of securing decent work, (b) psychological and economic mediators and moderators of these relations, and (c) outcomes of securing decent work. Recommendations are suggested for researchers seeking to use the theory, and practical implications are offered concerning counseling, advocacy, and public policy. (c) 2016 APA, all rights reserved.
John S. Bell's concept of local causality
NASA Astrophysics Data System (ADS)
Norsen, Travis
2011-12-01
John Stewart Bell's famous theorem is widely regarded as one of the most important developments in the foundations of physics. Yet even as we approach the 50th anniversary of Bell's discovery, its meaning and implications remain controversial. Many workers assert that Bell's theorem refutes the possibility suggested by Einstein, Podolsky, and Rosen (EPR) of supplementing ordinary quantum theory with ``hidden'' variables that might restore determinism and/or some notion of an observer-independent reality. But Bell himself interpreted the theorem very differently--as establishing an ``essential conflict'' between the well-tested empirical predictions of quantum theory and relativistic local causality. Our goal is to make Bell's own views more widely known and to explain Bell's little-known formulation of the concept of relativistic local causality on which his theorem rests. We also show precisely how Bell's formulation of local causality can be used to derive an empirically testable Bell-type inequality and to recapitulate the EPR argument.
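Bell's route from local causality to an empirically testable inequality can be sketched in a few lines. The following is the standard CHSH presentation, with notation chosen here for illustration rather than taken from Bell's own papers:

```latex
% Local causality: given the shared variables \lambda in the common past,
% joint outcome probabilities factorize,
P(A, B \mid a, b, \lambda) = P(A \mid a, \lambda)\, P(B \mid b, \lambda) .
% Correlations for settings (a, b) and outcomes A, B \in \{-1, +1\} are then
E(a, b) = \int \mathrm{d}\lambda\, \rho(\lambda)\, \bar{A}(a, \lambda)\, \bar{B}(b, \lambda) ,
% and any such locally causal model obeys the CHSH inequality
\left| E(a, b) + E(a, b') + E(a', b) - E(a', b') \right| \le 2 ,
% whereas quantum theory predicts values up to 2\sqrt{2} for suitable settings,
% which is what makes the conflict empirically testable.
```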
Mäs, Michael; Flache, Andreas
2013-01-01
Explanations of opinion bi-polarization hinge on the assumption of negative influence, individuals’ striving to amplify differences to disliked others. However, empirical evidence for negative influence is inconclusive, which motivated us to search for an alternative explanation. Here, we demonstrate that bi-polarization can be explained without negative influence, drawing on theories that emphasize the communication of arguments as central mechanism of influence. Due to homophily, actors interact mainly with others whose arguments will intensify existing tendencies for or against the issue at stake. We develop an agent-based model of this theory and compare its implications to those of existing social-influence models, deriving testable hypotheses about the conditions of bi-polarization. Hypotheses were tested with a group-discussion experiment (N = 96). Results demonstrate that argument exchange can entail bi-polarization even when there is no negative influence. PMID:24312164
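The argument-communication mechanism described above can be sketched as a toy agent-based model: agents hold a fixed-size memory of pro and con arguments, opinion is the balance of the two, and homophily makes exchange likelier between like-minded agents. The memory size, homophily function, and population below are illustrative assumptions, not the authors' model specification:

```python
import random

def opinion(agent):
    # Opinion = balance of pro (+1) and con (-1) arguments currently held.
    return sum(agent)

def interact(speaker, listener):
    # Homophily: the closer two agents' opinions, the likelier communication.
    similarity = 1.0 / (1.0 + abs(opinion(speaker) - opinion(listener)))
    if random.random() < similarity:
        listener.append(random.choice(speaker))  # one argument is transmitted
        del listener[0]                          # finite memory: forget the oldest

random.seed(1)
# Two mildly opposed camps; note there is no negative (repulsive) influence:
# interaction only ever copies arguments, never amplifies differences.
agents = [[+1] * 4 + [-1] * 2 for _ in range(10)] + \
         [[-1] * 4 + [+1] * 2 for _ in range(10)]
for _ in range(2000):
    a, b = random.sample(agents, 2)
    interact(a, b)
```

Because within-camp exchange is more likely and delivers mostly reinforcing arguments, the two camps tend to drift toward the extremes, illustrating how bi-polarization can arise from argument exchange alone.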
Resolving Microzooplankton Functional Groups In A Size-Structured Planktonic Model
NASA Astrophysics Data System (ADS)
Taniguchi, D.; Dutkiewicz, S.; Follows, M. J.; Jahn, O.; Menden-Deuer, S.
2016-02-01
Microzooplankton are important marine grazers, often consuming a large fraction of primary productivity. They consist of a great diversity of organisms with different behaviors, characteristics, and rates. This functional diversity, and its consequences, are not currently reflected in large-scale ocean ecological simulations. How should these organisms be represented, and what are the implications for their biogeography? We develop a size-structured, trait-based model to characterize a diversity of microzooplankton functional groups. We compile and examine size-based laboratory data on the traits, revealing some patterns with size and functional group that we interpret with mechanistic theory. Fitting the model to the data provides parameterizations of key rates and properties, which we employ in a numerical ocean model. The diversity of grazing preference, rates, and trophic strategies enables the coexistence of different functional groups of micro-grazers under various environmental conditions, and the model produces testable predictions of the biogeography.
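Size-structured, trait-based models of this kind typically parameterize rates as power-law (allometric) functions of cell size. A minimal sketch with hypothetical coefficients (not the fitted values from this work):

```python
def allometric_rate(volume, a, b):
    """Size-scaled physiological rate: rate = a * volume**b (power-law allometry)."""
    return a * volume ** b

# Hypothetical illustrative coefficients, chosen only to show the shape of the
# parameterization: with a negative exponent, smaller grazers have higher
# mass-specific maximum rates, one mechanism letting size classes coexist
# by partitioning prey along the size axis.
small = allometric_rate(1.0e2, a=20.0, b=-0.16)  # ~100 um^3 cell
large = allometric_rate(1.0e5, a=20.0, b=-0.16)  # ~100,000 um^3 cell
```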
Thomas, Jennifer J; Lawson, Elizabeth A; Micali, Nadia; Misra, Madhusmita; Deckersbach, Thilo; Eddy, Kamryn T
2017-08-01
DSM-5 defined avoidant/restrictive food intake disorder (ARFID) as a failure to meet nutritional needs leading to low weight, nutritional deficiency, dependence on supplemental feedings, and/or psychosocial impairment. We summarize what is known about ARFID and introduce a three-dimensional model to inform research. Because ARFID prevalence, risk factors, and maintaining mechanisms are not known, prevailing treatment approaches are based on clinical experience rather than data. Furthermore, most ARFID research has focused on children, rather than adolescents or adults. We hypothesize a three-dimensional model wherein neurobiological abnormalities in sensory perception, homeostatic appetite, and negative valence systems underlie the three primary ARFID presentations of sensory sensitivity, lack of interest in eating, and fear of aversive consequences, respectively. Now that ARFID has been defined, studies investigating risk factors, prevalence, and pathophysiology are needed. Our model suggests testable hypotheses about etiology and highlights cognitive-behavioral therapy as one possible treatment.
The hazards of hazard identification in environmental epidemiology.
Saracci, Rodolfo
2017-08-09
Hazard identification is a major scientific challenge, notably for environmental epidemiology, and is often surrounded, as the recent case of glyphosate shows, by debate arising in the first place from the inherently problematic nature of many components of the identification process. Particularly relevant in this respect are components less amenable to logical or mathematical formalization and essentially dependent on scientists' judgment. Four such potentially hazardous components that are capable of distorting the correct process of hazard identification are reviewed and discussed from an epidemiologist's perspective: (1) lexical mix-up of hazard and risk; (2) scientific questions as distinct from testable hypotheses, and implications for the hierarchy of strength of evidence obtainable from different types of study designs; (3) assumptions in prior beliefs and model choices; and (4) conflicts of interest. Four suggestions are put forward to strengthen a process that remains in several aspects judgmental, but not arbitrary, in nature.
NASA Astrophysics Data System (ADS)
Hull, Anthony B.; Westerhoff, Thomas
2015-01-01
Management of cost and risk has become the key enabling element for compelling science to be done within Explorer or M-Class missions. We trace how optimal primary mirror selection may be co-optimized with orbit selection, and then trace the cost and risk implications of selecting a low-diffusivity, low-thermal-expansion material for low and medium Earth orbits versus high-diffusivity, high-thermal-expansion materials for the same orbits. We discuss ZERODUR®, a material that has been in space for over 30 years and is now available as highly lightweighted open-back mirrors, and the attributes of these mirrors in spaceborne optical telescope assemblies. Lightweight ZERODUR® solutions are practical for mirrors from < 0.3 m to > 4 m in diameter. An example of a 1.2 m lightweight ZERODUR® mirror will be discussed.
Tailor, Vijay; Glaze, Selina; Unwin, Hilary; Bowman, Richard; Thompson, Graham; Dahlmann-Noor, Annegret
2016-10-01
Children and adults with neurological impairments are often not able to access conventional perimetry; however, information about the visual field is valuable. A new technology, saccadic vector optokinetic perimetry (SVOP), may have improved accessibility, but its accuracy has not been evaluated. We aimed to explore accessibility, testability and accuracy of SVOP in children with neurodisability or isolated visual pathway deficits. Cohort study; recruitment October 2013-May 2014, at children's eye clinics at a tertiary referral centre and a regional Child Development Centre; full orthoptic assessment, SVOP (central 30° of the visual field) and confrontation visual fields (CVF). Group 1: age 1-16 years, neurodisability (n=16), group 2: age 10-16 years, confirmed or suspected visual field defect (n=21); group 2 also completed Goldmann visual field testing (GVFT). Group 1: testability with a full 40-point test protocol is 12.5%; with reduced test protocols, testability is 100%, but plots may be clinically meaningless. Children (44%) and parents/carers (62.5%) find the test easy. SVOP and CVF agree in 50%. Group 2: testability is 62% for the 40-point protocol, and 90.5% for reduced protocols. Corneal changes in childhood glaucoma interfere with SVOP testing. All children and parents/carers find SVOP easy. Overall agreement with GVFT is 64.7%. While SVOP is highly accessible to children, many cannot complete a full 40-point test. Agreement with current standard tests is moderate to poor. Abnormal saccades cause an apparent non-specific visual field defect. In children with glaucoma or nystagmus SVOP calibration often fails. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
Supermassive Black Holes and Galaxy Evolution
NASA Technical Reports Server (NTRS)
Merritt, D.
2004-01-01
Supermassive black holes appear to be generic components of galactic nuclei. The formation and growth of black holes is intimately connected with the evolution of galaxies on a wide range of scales. For instance, mergers between galaxies containing nuclear black holes would produce supermassive binaries which eventually coalesce via the emission of gravitational radiation. The formation and decay of these binaries is expected to produce a number of observable signatures in the stellar distribution. Black holes can also affect the large-scale structure of galaxies by perturbing the orbits of stars that pass through the nucleus. Large-scale N-body simulations are beginning to generate testable predictions about these processes which will allow us to draw inferences about the formation history of supermassive black holes.
Multidisciplinary analysis and design of printed wiring boards
NASA Astrophysics Data System (ADS)
Fulton, Robert E.; Hughes, Joseph L.; Scott, Waymond R., Jr.; Umeagukwu, Charles; Yeh, Chao-Pin
1991-04-01
Modern printed wiring board design depends on electronic prototyping using computer-based simulation and design tools. Existing electrical computer-aided design (ECAD) tools emphasize circuit connectivity with only rudimentary analysis capabilities. This paper describes a prototype integrated PWB design environment denoted Thermal Structural Electromagnetic Testability (TSET) being developed at Georgia Tech in collaboration with companies in the electronics industry. TSET provides design guidance based on enhanced electrical and mechanical CAD capabilities, including electromagnetic modeling, testability analysis, thermal management, and solid mechanics analysis. TSET development is based on a strong analytical and theoretical science base and incorporates an integrated information framework and a common database design based on a systematic, structured methodology.
Soy-Based Therapeutic Baby Formulas: Testable Hypotheses Regarding the Pros and Cons.
Westmark, Cara J
2016-01-01
Soy-based infant formulas have been consumed in the United States since 1909, and currently constitute a significant portion of the infant formula market. There are efforts underway to generate genetically modified soybeans that produce therapeutic agents of interest with the intent to deliver those agents in a soy-based infant formula platform. The threefold purpose of this review article is to first discuss the pros and cons of soy-based infant formulas, then present testable hypotheses to discern the suitability of a soy platform for drug delivery in babies, and finally start a discussion to inform public policy on this important area of infant nutrition.
Bayesian naturalness, simplicity, and testability applied to the B ‑ L MSSM GUT
NASA Astrophysics Data System (ADS)
Fundira, Panashe; Purves, Austin
2018-04-01
Recent years have seen increased use of Bayesian model comparison to quantify notions such as naturalness, simplicity, and testability, especially in the area of supersymmetric model building. After demonstrating that Bayesian model comparison can resolve a paradox that has been raised in the literature concerning the naturalness of the proton mass, we apply Bayesian model comparison to GUTs, an area to which it has not been applied before. We find that the GUTs are substantially favored over the nonunifying puzzle model. Of the GUTs we consider, the B ‑ L MSSM GUT is the most favored, but the MSSM GUT is almost equally favored.
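Bayesian model comparison of the kind invoked here rests on the ratio of model evidences, which automatically penalizes a model for prior parameter volume it does not use (the Occam factor). A minimal numerical sketch with toy models and priors of our own choosing, not those of the paper:

```python
import math

def normal_pdf(x, mu=0.0, sigma=1.0):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def evidence_simple(x):
    # Model A: no free parameters, x ~ N(0, 1); evidence is just the likelihood.
    return normal_pdf(x, 0.0, 1.0)

def evidence_flexible(x, lo=-10.0, hi=10.0, n=2000):
    # Model B: x ~ N(mu, 1) with a flat prior mu ~ U(lo, hi).
    # Evidence = likelihood averaged over the prior (trapezoid rule).
    h = (hi - lo) / n
    total = 0.0
    for i in range(n + 1):
        mu = lo + i * h
        w = 0.5 if i in (0, n) else 1.0
        total += w * normal_pdf(x, mu, 1.0)
    return total * h / (hi - lo)

def bayes_factor(x):
    return evidence_simple(x) / evidence_flexible(x)

print(bayes_factor(0.0))  # ≈ 8: data near 0 favor the simpler, sharper model
```

When the observation lands where the simple model predicts, the Bayes factor favors it despite the flexible model also fitting well, because the flexible model spread its prior over values of mu it never needed.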
Two fundamental questions about protein evolution.
Penny, David; Zhong, Bojian
2015-12-01
Two basic questions are considered that approach protein evolution from different directions; the problems arising from using Markov models for the deeper divergences, and then the origin of proteins themselves. The real problem for the first question (going backwards in time) is that at deeper phylogenies the Markov models of sequence evolution must lose information exponentially at deeper divergences, and several testable methods are suggested that should help resolve these deeper divergences. For the second question (coming forwards in time) a problem is that most models for the origin of protein synthesis do not give a role for the very earliest stages of the process. From our knowledge of the importance of replication accuracy in limiting the length of a coding molecule, a testable hypothesis is proposed. The length of the code, the code itself, and tRNAs would all have prior roles in increasing the accuracy of RNA replication; thus proteins would have been formed only after the tRNAs and the length of the triplet code are already formed. Both questions lead to testable predictions. Copyright © 2014 Elsevier B.V. and Société Française de Biochimie et Biologie Moléculaire (SFBBM). All rights reserved.
Guimaraes, Sandra; Fernandes, Tiago; Costa, Patrício; Silva, Eduardo
2018-06-01
To determine a normative of tumbling E optotype and its feasibility for visual acuity (VA) assessment in children aged 3-4 years. A cross-sectional study of 1756 children who were invited to participate in a comprehensive non-invasive eye exam. Uncorrected monocular VA with crowded tumbling E with a comprehensive ophthalmological examination were assessed. Testability rates of the whole population and VA of the healthy children for different age subgroups, gender, school type and the order of testing in which the ophthalmological examination was performed were evaluated. The overall testability rate was 95% (92% and 98% for children aged 3 and 4 years, respectively). The mean VA of the first-day assessment (first-VA) and best-VA over 2 days' assessments was 0.14 logMAR (95% CI 0.14 to 0.15) (decimal=0.72, 95% CI 0.71 to 0.73) and 0.13 logMAR (95% CI 0.13 to 0.14) (decimal=0.74, 95% CI 0.73 to 0.74). Analysis with age showed differences between groups in first-VA (F(3,1146)=10.0; p<0.001; η2=0.026) and best-VA (F(3,1155)=8.8; p<0.001; η2=0.022). Our normative was very highly correlated with previous reported HOTV-Amblyopia-Treatment-Study (HOTV-ATS) (first-VA, r=0.97; best-VA, r=0.99), with 0.8 to 0.7 lines consistent overestimation for HOTV-ATS as described in literature. Overall false-positive referral was 1.3%, being specially low regarding anisometropias of ≥2 logMAR lines (0.17%). Interocular difference ≥1 line VA logMAR was not associated with age (p=0.195). This is the first normative for European Caucasian children with single crowded tumbling E in healthy eyes and the largest study comparing 3 and 4 years old testability. Testability rates are higher than found in literature with other optotypes, especially in children aged 3 years, where we found 5%-11% better testability rates. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. 
No commercial use is permitted unless otherwise expressly granted.
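The logMAR and decimal acuity figures quoted above are related by the standard conversion decimal = 10^(-logMAR), which reproduces the reported pairs:

```python
import math

def logmar_to_decimal(logmar):
    """Convert visual acuity from logMAR to decimal notation: decimal = 10**(-logMAR)."""
    return 10 ** (-logmar)

def decimal_to_logmar(decimal):
    """Inverse conversion: logMAR = -log10(decimal)."""
    return -math.log10(decimal)

# The abstract's value pairs are mutually consistent under this conversion:
print(round(logmar_to_decimal(0.14), 2))  # 0.72
print(round(logmar_to_decimal(0.13), 2))  # 0.74
```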
The structure of high-temperature solar flare plasma in non-thermal flare models
NASA Technical Reports Server (NTRS)
Emslie, A. G.
1985-01-01
Analytic differential emission measure distributions have been derived for coronal plasma in flare loops heated both by collisions of high-energy suprathermal electrons with background plasma, and by ohmic heating by the beam-normalizing return current. For low densities, reverse current heating predominates, while for higher densities collisional heating predominates. There is thus a minimum peak temperature in an electron-heated loop. In contrast to previous approximate analyses, it is found that a stable reverse current can dominate the heating rate in a flare loop, especially in the low corona. Two 'scaling laws' are found which relate the peak temperature in the loop to the suprathermal electron flux. These laws are testable observationally and constitute a new diagnostic procedure for examining modes of energy transport in flaring loops.
NASA Astrophysics Data System (ADS)
Eisenhart, Margaret
This article proposes that the organization of some college curriculum programs as well as some workplaces presents special and perhaps unnecessary obstacles to women who might pursue science or engineering. The article begins with a framework for thinking about connections between school and work in various fields. This section reveals important differences in the way college degree programs are organized and in their implications for the transition to work. Some programs, such as in physics, construct a "tight" link between school and work; others, such as in sociology, construct much looser links. The article proceeds by reviewing results of previous ethnographic research about women's actual experiences in college and work. This section suggests that during the period of transition from college to work, women face special cultural demands that interfere with their pursuit of degrees in tight programs. Joining the lessons from the two preceding sections, the argument is made that the tight organization of some college and workplace environments asks more of women than they can give and helps explain why women continue to be underrepresented in some fields. The argument has testable implications for the design of curricular programs and workplace environments that might attract more women (and perhaps more minorities and men) to science and engineering.
Reliability/maintainability/testability design for dormancy
NASA Astrophysics Data System (ADS)
Seman, Robert M.; Etzl, Julius M.; Purnell, Arthur W.
1988-05-01
This document has been prepared as a tool for designers of dormant military equipment and systems. The purpose of this handbook is to provide design engineers with Reliability/Maintainability/Testability design guidelines for systems which spend significant portions of their life cycle in a dormant state. The dormant state is defined as a nonoperating mode where a system experiences very little or no electrical stress. The guidelines in this report present design criteria in the following categories: (1) Part Selection and Control; (2) Derating Practices; (3) Equipment/System Packaging; (4) Transportation and Handling; (5) Maintainability Design; (6) Testability Design; (7) Evaluation Methods for In-Plant and Field Evaluation; and (8) Product Performance Agreements. Wherever applicable, design guidelines for operating systems were included with the dormant design guidelines. This was done in an effort to produce design guidelines for a more complete life cycle. Although dormant systems spend significant portions of their life cycle in a nonoperating mode, the designer must design the system for the complete life cycle, including nonoperating as well as operating modes. The guidelines are primarily intended for use in the design of equipment composed of electronic parts and components. However, they can also be used for the design of systems which encompass both electronic and nonelectronic parts, as well as for the modification of existing systems.
Delay test generation for synchronous sequential circuits
NASA Astrophysics Data System (ADS)
Devadas, Srinivas
1989-05-01
We address the problem of generating tests for delay faults in non-scan synchronous sequential circuits. Delay test generation for sequential circuits is a considerably more difficult problem than delay testing of combinational circuits and has received much less attention. In this paper, we present a method for generating test sequences to detect delay faults in sequential circuits using the stuck-at fault sequential test generator STALLION. The method is complete in that it will generate a delay test sequence for a targeted fault given sufficient CPU time, if such a sequence exists. We term faults for which no delay test sequence exists, under our test methodology, sequentially delay redundant. We describe means of eliminating sequential delay redundancies in logic circuits. We present a partial-scan methodology for enhancing the testability of difficult-to-test or untestable sequential circuits, wherein a small number of flip-flops are selected and made controllable/observable. The selection process guarantees the elimination of all sequential delay redundancies. We show that an intimate relationship exists between state assignment and delay testability of a sequential machine. We describe a state assignment algorithm for the synthesis of sequential machines with maximal delay fault testability. Preliminary experimental results using the test generation, partial-scan, and synthesis algorithms are presented.
Stereoacuity of preschool children with and without vision disorders.
Ciner, Elise B; Ying, Gui-Shuang; Kulp, Marjean Taylor; Maguire, Maureen G; Quinn, Graham E; Orel-Bixler, Deborah; Cyert, Lynn A; Moore, Bruce; Huang, Jiayan
2014-03-01
To evaluate associations between stereoacuity and presence, type, and severity of vision disorders in Head Start preschool children and determine testability and levels of stereoacuity by age in children without vision disorders. Stereoacuity of children aged 3 to 5 years (n = 2898) participating in the Vision in Preschoolers (VIP) Study was evaluated using the Stereo Smile II test during a comprehensive vision examination. This test uses a two-alternative forced-choice paradigm with four stereoacuity levels (480 to 60 seconds of arc). Children were classified by the presence (n = 871) or absence (n = 2027) of VIP Study-targeted vision disorders (amblyopia, strabismus, significant refractive error, or unexplained reduced visual acuity), including type and severity. Median stereoacuity between groups and among severity levels of vision disorders was compared using Wilcoxon rank sum and Kruskal-Wallis tests. Testability and stereoacuity levels were determined for children without VIP Study-targeted disorders overall and by age. Children with VIP Study-targeted vision disorders had significantly worse median stereoacuity than that of children without vision disorders (120 vs. 60 seconds of arc, p < 0.001). Children with the most severe vision disorders had worse stereoacuity than that of children with milder disorders (median 480 vs. 120 seconds of arc, p < 0.001). Among children without vision disorders, testability was 99.6% overall, increasing with age to 100% for 5-year-olds (p = 0.002). Most of the children without vision disorders (88%) had stereoacuity at the two best disparities (60 or 120 seconds of arc); the percentage increasing with age (82% for 3-, 89% for 4-, and 92% for 5-year-olds; p < 0.001). The presence of any VIP Study-targeted vision disorder was associated with significantly worse stereoacuity in preschool children. Severe vision disorders were more likely associated with poorer stereopsis than milder or no vision disorders. 
Testability was excellent at all ages. These results support the validity of the Stereo Smile II for assessing random-dot stereoacuity in preschool children.
DeCamp, Matthew; Dredze, Mark; Chisolm, Margaret S; Berger, Zackary D
2014-01-01
Background Twitter is home to many health professionals who send messages about a variety of health-related topics. Amid concerns about physicians posting inappropriate content online, more in-depth knowledge about these messages is needed to understand health professionals’ behavior on Twitter. Objective Our goal was to characterize the content of Twitter messages, specifically focusing on health professionals and their tweets relating to health. Methods We performed an in-depth content analysis of 700 tweets. Qualitative content analysis was conducted on tweets by health users on Twitter. The primary objective was to describe the general type of content (ie, health-related versus non-health related) on Twitter authored by health professionals and further to describe health-related tweets on the basis of the type of statement made. Specific attention was given to whether a tweet was personal (as opposed to professional) or made a claim that users would expect to be supported by some level of medical evidence (ie, a “testable” claim). A secondary objective was to compare content types among different users, including patients, physicians, nurses, health care organizations, and others. Results Health-related users are posting a wide range of content on Twitter. Among health-related tweets, 53.2% (184/346) contained a testable claim. Of health-related tweets by providers, 17.6% (61/346) were personal in nature; 61% (59/96) made testable statements. While organizations and businesses use Twitter to promote their services and products, patient advocates are using this tool to share their personal experiences with health. Conclusions Twitter users in health-related fields tweet about both testable claims and personal experiences. Future work should assess the relationship between testable tweets and the actual level of evidence supporting them, including how Twitter users—especially patients—interpret the content of tweets posted by health providers. PMID:25591063
Topographic variations in chaos on Europa: Implications for diapiric formation
NASA Technical Reports Server (NTRS)
Schenk, Paul M.; Pappalardo, Robert T.
2004-01-01
Disrupted terrain, or chaos, on Europa, might have formed through melting of a floating ice shell from a subsurface ocean [Carr et al., 1998; Greenberg et al., 1999], or breakup by diapirs rising from the warm lower portion of the ice shell [Head and Pappalardo, 1999; Collins et al., 2000]. Each model makes specific and testable predictions for topographic expression within chaos and relative to surrounding terrains on local and regional scales. High-resolution stereo-controlled photoclinometric topography indicates that chaos topography, including the archetypal Conamara Chaos region, is uneven and commonly higher than surrounding plains by up to 250 m. Elevated and undulating topography is more consistent with diapiric uplift of deep material in a relatively thick ice shell, rather than melt-through and refreezing of regionally or globally thin ice by a subsurface ocean. Vertical and horizontal scales of topographic doming in Conamara Chaos are consistent with a total ice shell thickness >15 km. Contact between Europa's ocean and surface may most likely be indirectly via diapirism or convection.
Hook, Paul W; McClymont, Sarah A; Cannon, Gabrielle H; Law, William D; Morton, A Jennifer; Goff, Loyal A; McCallion, Andrew S
2018-03-01
Genetic variation modulating risk of sporadic Parkinson disease (PD) has been primarily explored through genome-wide association studies (GWASs). However, like many other common genetic diseases, the impacted genes remain largely unknown. Here, we used single-cell RNA-seq to characterize dopaminergic (DA) neuron populations in the mouse brain at embryonic and early postnatal time points. These data facilitated unbiased identification of DA neuron subpopulations through their unique transcriptional profiles, including a postnatal neuroblast population and substantia nigra (SN) DA neurons. We use these population-specific data to develop a scoring system to prioritize candidate genes in all 49 GWAS intervals implicated in PD risk, including genes with known PD associations and many with extensive supporting literature. As proof of principle, we confirm that the nigrostriatal pathway is compromised in Cplx1-null mice. Ultimately, this systematic approach establishes biologically pertinent candidates and testable hypotheses for sporadic PD, informing a new era of PD genetic research. Copyright © 2018 American Society of Human Genetics. All rights reserved.
Fischer, Barbara; Telser, Harry; Zweifel, Peter
2018-06-07
Healthcare expenditure (HCE) spent during an individual's last year of life accounts for a high share of lifetime HCE. This finding is puzzling because an investment in health is unlikely to have a sufficiently long payback period. However, Becker et al. (2007) and Philipson et al. (2010) have advanced a theory designed to explain high willingness to pay (WTP) for an extension of life close to its end. Their testable implications are complemented by the concept of 'pain of risk bearing' introduced by Eeckhoudt and Schlesinger (2006). They are tested using a discrete choice experiment performed in 2014, involving 1,529 Swiss adults. An individual setting where the price attribute is substantial out-of-pocket payment for a novel drug for treatment of terminal cancer is distinguished from a societal one, where it is an increase in contributions to social health insurance. Most of the economic predictions receive empirical support. Copyright © 2018. Published by Elsevier B.V.
Lightning Scaling Laws Revisited
NASA Technical Reports Server (NTRS)
Boccippio, D. J.; Arnold, James E. (Technical Monitor)
2000-01-01
Scaling laws relating storm electrical generator power (and hence lightning flash rate) to charge transport velocity and storm geometry were originally posed by Vonnegut (1963). These laws were later simplified to yield simple parameterizations for lightning based upon cloud top height, with separate parameterizations derived over land and ocean. It is demonstrated that the most recent ocean parameterization: (1) yields predictions of storm updraft velocity which appear inconsistent with observation, and (2) is formally inconsistent with the theory from which it purports to derive. Revised formulations consistent with Vonnegut's original framework are presented. These demonstrate that Vonnegut's theory is, to first order, consistent with observation. The implications of assuming that flash rate is set by the electrical generator power, rather than the electrical generator current, are examined. The two approaches yield significantly different predictions about the dependence of charge transfer per flash on storm dimensions, which should be empirically testable. The two approaches also differ significantly in their explanation of regional variability in lightning observations.
Alarcón, Tomás; Marches, Radu; Page, Karen M
2006-05-07
We formulate models of the mechanism(s) by which B cell lymphoma cells stimulated with an antibody specific to the B cell receptor (IgM) become quiescent or apoptotic. In particular, we aim to reproduce experimental results by Marches et al. according to which the fate of the targeted cells (Daudi) depends on the levels of expression of p21(Waf1) (p21) cell-cycle inhibitor. A simple model is formulated in which the basic ingredients are p21 and caspase activity, and their mutual inhibition. We show that this model does not reproduce the experimental results and that further refinement is needed. A second model successfully reproduces the experimental observations, for a given set of parameter values, indicating a critical role for Myc in the fate decision process. We use bifurcation analysis and objective sensitivity analysis to assess the robustness of our results. Importantly, this analysis yields experimentally testable predictions on the role of Myc, which could have therapeutic implications.
The effect of analytic and experiential modes of thought on moral judgment.
Kvaran, Trevor; Nichols, Shaun; Sanfey, Alan
2013-01-01
According to dual-process theories, moral judgments are the result of two competing processes: a fast, automatic, affect-driven process and a slow, deliberative, reason-based process. Accordingly, these models make clear and testable predictions about the influence of each system. Although a small number of studies have attempted to examine each process independently in the context of moral judgment, no study has yet tried to experimentally manipulate both processes within a single study. In this chapter, a well-established "mode-of-thought" priming technique was used to place participants in either an experiential/emotional or analytic mode while completing a task in which participants provide judgments about a series of moral dilemmas. We predicted that individuals primed analytically would make more utilitarian responses than control participants, while emotional priming would lead to less utilitarian responses. Support was found for both of these predictions. Implications of these findings for dual-process theories of moral judgment will be discussed. Copyright © 2013 Elsevier B.V. All rights reserved.
Catastrophic desert formation in Daisyworld.
Ackland, Graeme J; Clark, Michael A; Lenton, Timothy M
2003-07-07
Feedback between life and its environment is ubiquitous but the strength of coupling and its global implications remain hotly debated. Abrupt changes in the abundance of life for small changes in forcing provide one indicator of regulation, for example, when vegetation-climate feedback collapses in the formation of a desert. Here we use a two-dimensional "Daisyworld" model with curvature to show that catastrophic collapse of life under gradual forcing provides a testable indicator of environmental feedback. When solar luminosity increases to a critical value, a desert forms across a wide band of the planet. The scale of collapse depends on the strength of feedback. The efficiency of temperature regulation is limited by mutation rate in an analogous manner to the limitation of adaptive fitness in evolutionary theories. The final state of the system emerging from single-site rules can be described by two global quantities: optimization of temperature regulation and maximization of diversity, which are mathematically analogous to energy and entropy in thermodynamics.
Conway's "Game of Life" and the Epigenetic Principle.
Caballero, Lorena; Hodge, Bob; Hernandez, Sergio
2016-01-01
Cellular automatons and computer simulation games are widely used as heuristic devices in biology, to explore implications and consequences of specific theories. Conway's Game of Life has been widely used for this purpose. This game was designed to explore the evolution of ecological communities. We apply it to other biological processes, including symbiopoiesis. We show that Conway's organization of rules reflects the epigenetic principle, that genetic action and developmental processes are inseparable dimensions of a single biological system, analogous to the integration processes in symbiopoiesis. We look for similarities and differences between two epigenetic models, by Turing and Edelman, as they are realized in Game of Life objects. We show the value of computer simulations to experiment with and propose generalizations of broader scope with novel testable predictions. We use the game to explore issues in symbiopoiesis and evo-devo, where we explore a fractal hypothesis: that self-similarity exists at different levels (cells, organisms, ecological communities) as a result of homologous interactions of two such processes as modeled in the Game of Life.
NASA Astrophysics Data System (ADS)
Pournelle, J. R.; Smith, J. R.; Hritz, C.; Nsf Hrrpaa 1045974
2011-12-01
The development and flourishing of pre-urban and urban complex societies of southern Mesopotamia (Iraq) during the mid-Holocene took place in the context of Tigris-Euphrates and Karun-Karkheh deltaic progradation on one hand, and marine transgression at the head of the Gulf on the other. Understanding these processes has profound implications for assessing likely resource bioavailability, resource extraction and transport options, population distribution and density, and labour requirements for intensification/extensification of extraction and production activities during this critical formative period. Multiple attempts have been made to reconstruct the Gulf "shoreline" at various pre-historic and historical periods. Because no systematic coring operations have been undertaken in the region, these attempts have been hampered by the paucity of direct geologic evidence. Conflicting hypotheses based on models of deltaic subsidence, tectonic uplift, and/or eustatic change were barely testable against scant available cores and archaeologically-derived proxies from a few sites on the western "shore," such as H3, Eridu, Ur, Uruk, and Tell al Oueli. Recently published coring operations in the Iranian Karun-Karkheh delta add considerably to the available corpus of archaeological and geomorphologic data useful for reconstructing the timeline and extent of these processes, especially on the eastern "shore," but these are also bounded in spatial and temporal extent. Multi-scale, multi-sensor processing of remote sensing data and imagery make possible a fuller interpretation of geomorphologic and artifactual evidence bearing on overall shoreline reconstruction from approximately 6,000-3,000 BCE.
This paper reports the results of combining interpreted LANDSAT, ASTER, SPOT, CORONA, Digital Globe, and other imagery with multiple derived Digital Elevation Models, thus providing stochastic boundaries for re-interpreting geological and archaeological point data, as well as new pilot data collected in 2010-2011. The result is a better understanding of the likely location, extent, and impact of maximum mid-Holocene marine incursion into lower Mesopotamia and Khuzistan associated with deltaic geomorphological and ecological evolution, with implications for assessing site locations, agricultural potential, and water transport routes available to the world's oldest-known cities.
On Geomagnetism and Paleomagnetism I
NASA Technical Reports Server (NTRS)
Voorhies, Coerte V.
2000-01-01
A partial description of Earth's broad scale, core-source magnetic field has been developed and tested three ways. The description features an expected, or mean, spatial magnetic power spectrum that is approximately inversely proportional to horizontal wavenumber atop Earth's core. This multipole spectrum describes a magnetic energy range; it is not steep enough for Gubbins' magnetic dissipation range. Temporal variations of core multipole powers about mean values are to be expected and are described statistically, via trial probability distribution functions, instead of deterministically, via trial solution of closed transport equations. The distributions considered here are closed and neither require nor prohibit magnetic isotropy. The description is therefore applicable to, and tested against, both dipole and low degree non-dipole fields. In Part 1, a physical basis for an expectation spectrum is developed and checked. The description is then combined with main field models of twentieth century satellite and surface geomagnetic field measurements to make testable predictions of the radius of Earth's core. The predicted core radius is 0.7% above the 3480 km seismological value. Partial descriptions of other planetary dipole fields are noted.
Multicomponent Gas Diffusion and an Appropriate Momentum Boundary Condition
NASA Technical Reports Server (NTRS)
Noever, David A.
1994-01-01
Multicomponent gas diffusion is reviewed with particular emphasis on gas flows near solid boundaries-the so-called Kramers-Kistemaker effect. The aim is to derive an appropriate momentum boundary condition which governs many gaseous species diffusing together. The many species' generalization of the traditional single gas condition, either as slip or stick (no-slip), is not obvious, particularly for technologically important cases of lower gas pressures and very dissimilar molecular weight gases. No convincing theoretical case exists for why two gases should interact with solid boundaries equally but in opposite flow directions, such that the total gas flow exactly vanishes. In this way, the multicomponent no-slip boundary requires careful treatment. The approaches discussed here generally adopt a microscopic model for gas-solid contact. The method has the advantage that the mathematics remain tractable and hence experimentally testable. Two new proposals are put forward, the first building in some molecular collision physics, the second drawing on a detailed view of surface diffusion which does not unphysically extrapolate bulk gas properties to govern the adsorbed molecules. The outcome is a better accounting of previously anomalous experiments. Models predict novel slip conditions appearing even for the case of equal molecular weight components. These approaches become particularly significant in view of a conceptual contradiction found to arise in previous derivations of the appropriate boundary conditions. The analogous case of three gases, one of which is uniformly distributed and hence non-diffusing, presents a further refinement which gives unexpected flow reversals near solid boundaries. This case is investigated alone and for aggregating gas species near their condensation point. In addition to predicting new physics, this investigation carries practical implications for controlling vapor diffusion in the growth of crystals used in medical diagnosis (e.g., mercuric iodide) and semiconductors.
Understanding protein evolution: from protein physics to Darwinian selection.
Zeldovich, Konstantin B; Shakhnovich, Eugene I
2008-01-01
Efforts in whole-genome sequencing and structural proteomics start to provide a global view of the protein universe, the set of existing protein structures and sequences. However, approaches based on the selection of individual sequences have not been entirely successful at the quantitative description of the distribution of structures and sequences in the protein universe because evolutionary pressure acts on the entire organism, rather than on a particular molecule. In parallel to this line of study, studies in population genetics and phenomenological molecular evolution established a mathematical framework to describe the changes in genome sequences in populations of organisms over time. Here, we review both microscopic (physics-based) and macroscopic (organism-level) models of protein-sequence evolution and demonstrate that bridging the two scales provides the most complete description of the protein universe starting from clearly defined, testable, and physiologically relevant assumptions.
Percolation mechanism drives actin gels to the critically connected state
NASA Astrophysics Data System (ADS)
Lee, Chiu Fan; Pruessner, Gunnar
2016-05-01
Cell motility and tissue morphogenesis depend crucially on the dynamic remodeling of actomyosin networks. An actomyosin network consists of an actin polymer network connected by cross-linker proteins and motor protein myosins that generate internal stresses on the network. A recent discovery shows that for a range of experimental parameters, actomyosin networks contract to clusters with a power-law size distribution [J. Alvarado, Nat. Phys. 9, 591 (2013), 10.1038/nphys2715]. Here, we argue that actomyosin networks can exhibit a robust critical signature without fine-tuning because the dynamics of the system can be mapped onto a modified version of percolation with trapping (PT), which is known to show critical behavior belonging to the static percolation universality class without the need for fine-tuning of a control parameter. We further employ our PT model to generate experimentally testable predictions.
The diffusion decision model: theory and data for two-choice decision tasks.
Ratcliff, Roger; McKoon, Gail
2008-04-01
The diffusion decision model allows detailed explanations of behavior in two-choice discrimination tasks. In this article, the model is reviewed to show how it translates behavioral data-accuracy, mean response times, and response time distributions-into components of cognitive processing. Three experiments are used to illustrate experimental manipulations of three components: stimulus difficulty affects the quality of information on which a decision is based; instructions emphasizing either speed or accuracy affect the criterial amounts of information that a subject requires before initiating a response; and the relative proportions of the two stimuli affect biases in drift rate and starting point. The experiments also illustrate the strong constraints that ensure the model is empirically testable and potentially falsifiable. The broad range of applications of the model is also reviewed, including research in the domains of aging and neurophysiology.
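The two-choice diffusion decision model summarized above can be illustrated with a minimal simulation: noisy evidence accumulates at a drift rate until it reaches one of two symmetric boundaries, jointly producing a choice and a response time. This is a generic sketch of the model class, not the authors' implementation; the parameter values below (drift, boundary separation, noise) are arbitrary assumptions for illustration.

```python
# Minimal drift-diffusion trial: Euler-Maruyama integration of dx = v*dt + s*dW
# until evidence |x| crosses a boundary. Parameters are illustrative assumptions.
import random

def simulate_ddm(drift, boundary, dt=0.001, noise=1.0, max_t=5.0):
    """Accumulate noisy evidence until one of two symmetric boundaries is hit."""
    x, t = 0.0, 0.0
    while abs(x) < boundary and t < max_t:
        x += drift * dt + noise * (dt ** 0.5) * random.gauss(0.0, 1.0)
        t += dt
    # Upper-boundary crossings count as "correct" when drift is positive.
    return (1 if x >= boundary else 0), t

random.seed(0)
trials = [simulate_ddm(drift=1.0, boundary=1.0) for _ in range(2000)]
accuracy = sum(choice for choice, _ in trials) / len(trials)
mean_rt = sum(rt for _, rt in trials) / len(trials)
```

Raising the boundary in this sketch trades speed for accuracy, mirroring the speed-accuracy instruction manipulation the abstract describes; changing drift mimics stimulus difficulty.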
On testing VLSI chips for the big Viterbi decoder
NASA Technical Reports Server (NTRS)
Hsu, I. S.
1989-01-01
A general technique that can be used in testing very large scale integrated (VLSI) chips for the Big Viterbi Decoder (BVD) system is described. The test technique is divided into functional testing and fault-coverage testing. The purpose of functional testing is to verify that the design works functionally. Functional test vectors are converted from outputs of software simulations which simulate the BVD functionally. Fault-coverage testing is used to detect and, in some cases, to locate faulty components caused by bad fabrication. This type of testing is useful in screening out bad chips. Finally, design for testability, which is included in the BVD VLSI chip design, is described in considerable detail. Both the observability and controllability of a VLSI chip are greatly enhanced by including the design for the testability feature.
Model-Based Testability Assessment and Directed Troubleshooting of Shuttle Wiring Systems
NASA Technical Reports Server (NTRS)
Deb, Somnath; Domagala, Chuck; Shrestha, Roshan; Malepati, Venkatesh; Cavanaugh, Kevin; Patterson-Hine, Ann; Sanderfer, Dwight; Cockrell, Jim; Norvig, Peter (Technical Monitor)
2000-01-01
We have recently completed a pilot study on the Space Shuttle wiring system commissioned by the Wiring Integrity Research (WIRe) team at NASA Ames Research Center. As the Space Shuttle ages, it is experiencing wiring degradation problems including arcing, chafing, insulation breakdown, and broken conductors. A systematic and comprehensive test process is required to thoroughly test and quality assure (QA) the wiring systems. The NASA WIRe team recognized the value of a formal model-based analysis for risk assessment and fault-coverage analysis. However, wiring systems are complex and involve over 50,000 wire segments. Therefore, NASA commissioned this pilot study with Qualtech Systems, Inc. (QSI) to explore means of automatically extracting high-fidelity multi-signal models from a wiring information database for use with QSI's Testability Engineering and Maintenance System (TEAMS) tool.
QUANTITATIVE TESTS OF ELMS AS INTERMEDIATE N PEELING-BALLOONING MODES
DOE Office of Scientific and Technical Information (OSTI.GOV)
LAO,LL; SNYDER,PB; LEONARD,AW
2003-03-01
Several testable features of the working model of edge localized modes (ELMs) as intermediate toroidal mode number peeling-ballooning modes are evaluated quantitatively using DIII-D and JT-60U experimental data and the ELITE MHD stability code. These include the hypotheses that ELM sizes are related to the radial widths of the unstable MHD modes, that the unstable modes have a strong ballooning character localized in the outboard bad-curvature region, and that ELM size generally becomes smaller at high edge collisionality. ELMs are triggered when the growth rates of the unstable MHD modes become significantly large. These testable features are consistent with many ELM observations in DIII-D and JT-60U discharges.
A testable model of earthquake probability based on changes in mean event size
NASA Astrophysics Data System (ADS)
Imoto, Masajiro
2003-02-01
We studied changes in mean event size using data on microearthquakes obtained from a local network in Kanto, central Japan, from the viewpoint that mean event size tends to increase as the critical point is approached. A parameter describing these changes was defined using a simple weighted-average procedure. To obtain the distribution of the parameter in the background, we surveyed values of the parameter from 1982 to 1999 in a 160 × 160 × 80 km volume. The 16 events of M5.5 or larger in this volume were selected as target events. The conditional distribution of the parameter was estimated from the 16 values, each of which referred to the value immediately prior to a target event. The background distribution is symmetric, with a center corresponding to no change in b value. In contrast, the conditional distribution is asymmetric, tending toward a decreased b value. The difference between the two distributions was significant and provided us with a hazard function for estimating earthquake probabilities. Comparing the hazard function with a Poisson process, we obtained an Akaike Information Criterion (AIC) reduction of 24. This reduction agreed closely with the probability gains of a retrospective study, in the range of 2-4. A successful example of the proposed model can be seen in the earthquake of 3 June 2000, the only event during the period of prospective testing.
Vázquez-Lobo, Alejandra; Carlsbecker, Annelie; Vergara-Silva, Francisco; Alvarez-Buylla, Elena R; Piñero, Daniel; Engström, Peter
2007-01-01
The identity of genes causally implicated in the development and evolutionary origin of reproductive characters in gymnosperms is largely unknown. Working within the framework of plant evolutionary developmental biology, here we have cloned, sequenced, performed phylogenetic analyses upon and tested the expression patterns of LEAFY/FLORICAULA and NEEDLY orthologs in reproductive structures from selected species of the conifer genera Picea, Podocarpus, and Taxus. Contrary to expectations based on previous assessments, expression of LFY/FLO and NLY in cones of these taxa was found to occur simultaneously in a single reproductive axis, initially overlapping but later in mutually exclusive primordia and/or groups of developing cells in both female and male structures. These observations directly affect the status of the "mostly male theory" for the origin of the angiosperm flower. On the other hand, comparative spatiotemporal patterns of the expression of these genes suggest a complex genetic regulatory network of cone development, as well as a scheme of functional divergence for LFY/FLO with respect to NLY homologs in gymnosperms, both with clear heterochronic aspects. Results presented in this study contribute to the understanding of the molecular-genetic basis of morphological evolution in conifer cones, and may aid in establishing a foundation for gymnosperm-specific, testable evo-devo hypotheses.
Linking short-term responses to ecologically-relevant outcomes
Opportunity to participate in collaborative, integrative lab, field, and modelling efforts to characterize molecular-to-organismal level responses and make quantitative, testable predictions of population-level outcomes
Electromechanical actuation for thrust vector control applications
NASA Technical Reports Server (NTRS)
Roth, Mary Ellen
1990-01-01
The advanced launch system (ALS) is a launch vehicle that is designed to be cost-effective, highly reliable, and operationally efficient with a goal of reducing the cost per pound to orbit. An electromechanical actuation (EMA) system is being developed as an attractive alternative to the hydraulic systems. The controller will integrate 20 kHz resonant link power management and distribution (PMAD) technology and pulse population modulation (PPM) techniques to implement field-oriented vector control (FOVC) of a new advanced induction motor. The driver and the FOVC will be microprocessor controlled. For increased system reliability, a built-in test (BITE) capability will be included. This involves introducing testability into the design of a system such that testing is calibrated and exercised during the design, manufacturing, maintenance, and prelaunch activities. An actuator will be integrated with the motor controller for performance testing of the EMA thrust vector control (TVC) system. The EMA system and work proposed for the future are discussed.
Woolley, Thomas E; Gaffney, Eamonn A; Goriely, Alain
2017-07-01
If the plasma membrane of a cell is able to delaminate locally from its actin cortex, a cellular bleb can be produced. Blebs are pressure-driven protrusions, which are noteworthy for their ability to produce cellular motion. Starting from a general continuum mechanics description, we restrict ourselves to considering cell and bleb shapes that maintain approximately spherical forms. From this assumption, we obtain a tractable algebraic system for bleb formation. By including cell-substrate adhesions, we can model blebbing cell motility. Further, by considering mechanically isolated blebbing events, which are randomly distributed over the cell, we can derive equations linking the macroscopic migration characteristics to the microscopic structural parameters of the cell. This multiscale modeling framework is then used to provide parameter estimates, which are in agreement with current experimental data. In summary, the construction of the mathematical model provides testable relationships between the bleb size and cell motility.
The spatial scaling of species interaction networks.
Galiana, Nuria; Lurgi, Miguel; Claramunt-López, Bernat; Fortin, Marie-Josée; Leroux, Shawn; Cazelles, Kevin; Gravel, Dominique; Montoya, José M
2018-05-01
Species-area relationships (SARs) are pivotal to understand the distribution of biodiversity across spatial scales. We know little, however, about how the network of biotic interactions in which biodiversity is embedded changes with spatial extent. Here we develop a new theoretical framework that enables us to explore how different assembly mechanisms and theoretical models affect multiple properties of ecological networks across space. We present a number of testable predictions on network-area relationships (NARs) for multi-trophic communities. Network structure changes as area increases because of the existence of different SARs across trophic levels, the preferential selection of generalist species at small spatial extents and the effect of dispersal limitation promoting beta-diversity. Developing an understanding of NARs will complement the growing body of knowledge on SARs with potential applications in conservation ecology. Specifically, combined with further empirical evidence, NARs can generate predictions of potential effects on ecological communities of habitat loss and fragmentation in a changing world.
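The network-area framework above extends the classical species-area relationship, commonly modeled as the power law S = c·A^z and fitted by linear regression in log-log space. The following is a generic, self-contained sketch of that standard SAR fit on synthetic data; it is not the authors' method, and the coefficients used to generate the data are arbitrary assumptions.

```python
# Fit the power-law SAR S = c * A**z via ordinary least squares on
# log-transformed data. Synthetic data are generated exactly from the model,
# so the fit should recover c = 10 and z = 0.25 (illustrative values).
import math

def fit_sar(areas, richness):
    la = [math.log(a) for a in areas]
    ls = [math.log(s) for s in richness]
    n = len(areas)
    ma, ms = sum(la) / n, sum(ls) / n
    # Slope of log S on log A is the SAR exponent z; intercept gives log c.
    z = sum((a - ma) * (s - ms) for a, s in zip(la, ls)) / \
        sum((a - ma) ** 2 for a in la)
    c = math.exp(ms - z * ma)
    return c, z

areas = [1.0, 10.0, 100.0, 1000.0]
richness = [10.0 * a ** 0.25 for a in areas]  # exact power law, z = 0.25
c, z = fit_sar(areas, richness)
```

A network-area relationship analysis would fit analogous curves with network properties (e.g. number of links or mean trophic level) in place of species richness S.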
A SEU-Hard Flip-Flop for Antifuse FPGAs
NASA Technical Reports Server (NTRS)
Katz, R.; Wang, J. J.; McCollum, J.; Cronquist, B.; Chan, R.; Yu, D.; Kleyner, I.; Day, John H. (Technical Monitor)
2001-01-01
A single event upset (SEU)-hardened flip-flop has been designed and developed for antifuse Field Programmable Gate Array (FPGA) application. Design and application issues, testability, test methods, simulation, and results are discussed.
The changing features of the body-mind problem.
Agassi, Joseph
2007-01-01
The body-mind problem invites scientific study, since mental events are repeated and repeatable and invite testable explanations. They seemed troublesome because of the classical theory of substance that failed to solve its own central problems. These are soluble with the aid of the theory of the laws of nature, particularly in its emergentist version [Bunge, M., 1980. The Body-mind Problem, Pergamon, Oxford] that invites refutable explanations [Popper, K.R., 1959. The Logic of Scientific Discovery, Hutchinson, London]. The view of mental properties as emergent is a modification of the two chief classical views, materialism and dualism. As this view invites testable explanations of events of the inner world, it is better than the quasi-behaviorist view of self-awareness as computer-style self-monitoring [Minsky, M., Laske, O., 1992. A conversation with Marvin Minsky. AI Magazine 13 (3), 31-45].
Testability of evolutionary game dynamics based on experimental economics data
NASA Astrophysics Data System (ADS)
Wang, Yijia; Chen, Xiaojie; Wang, Zhijian
2017-11-01
Understanding the dynamic processes of a real game system requires an appropriate dynamics model, and rigorously testing a dynamics model is nontrivial. In our methodological research, we develop an approach to testing the validity of game dynamics models that considers the dynamic patterns of angular momentum and speed as measurement variables. Using Rock-Paper-Scissors (RPS) games as an example, we illustrate the geometric patterns in the experiment data. We then derive the related theoretical patterns from a series of typical dynamics models. By testing the goodness-of-fit between the experimental and theoretical patterns, we show that the validity of these models can be evaluated quantitatively. Our approach establishes a link between dynamics models and experimental systems, which is, to the best of our knowledge, the most effective and rigorous strategy for ascertaining the testability of evolutionary game dynamics models.
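A common baseline among the dynamics models the abstract evaluates is replicator dynamics on the Rock-Paper-Scissors strategy simplex. The sketch below is a generic illustration of that model class, not the authors' code; the payoff matrix (win = 1, loss = -1, tie = 0) and step size are assumptions.

```python
# Euler integration of replicator dynamics dx_i/dt = x_i * (f_i(x) - <f>)
# for zero-sum Rock-Paper-Scissors. Trajectories orbit the interior fixed
# point (1/3, 1/3, 1/3), the cycling behavior measured in RPS experiments.
PAYOFF = [[0, -1, 1],   # Rock vs (Rock, Paper, Scissors)
          [1, 0, -1],   # Paper
          [-1, 1, 0]]   # Scissors

def replicator_step(x, dt=0.01):
    """One Euler step; the mean-fitness term keeps the frequencies summing to 1."""
    f = [sum(PAYOFF[i][j] * x[j] for j in range(3)) for i in range(3)]
    avg = sum(x[i] * f[i] for i in range(3))
    return [x[i] + dt * x[i] * (f[i] - avg) for i in range(3)]

x = [0.5, 0.3, 0.2]
for _ in range(1000):
    x = replicator_step(x)
```

Angular momentum and speed of such simulated trajectories around the centroid are the kinds of measurement variables the abstract proposes for comparing model predictions against experimental data.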
Design for testability and diagnosis at the system-level
NASA Technical Reports Server (NTRS)
Simpson, William R.; Sheppard, John W.
1993-01-01
The growing complexity of full-scale systems has surpassed the capabilities of most simulation software to provide detailed models or gate-level failure analyses. The process of system-level diagnosis approaches the fault-isolation problem in a manner that differs significantly from the traditional and exhaustive failure mode search. System-level diagnosis is based on a functional representation of the system. For example, one can exercise one portion of a radar algorithm (the Fast Fourier Transform (FFT) function) by injecting several standard input patterns and comparing the results to standardized output results. An anomalous output would point to one of several items (including the FFT circuit) without specifying the gate or failure mode. For system-level repair, identifying an anomalous chip is sufficient. We describe here an information theoretic and dependency modeling approach that discards much of the detailed physical knowledge about the system and analyzes its information flow and functional interrelationships. The approach relies on group and flow associations and, as such, is hierarchical. Its hierarchical nature allows the approach to be applicable to any level of complexity and to any repair level. This approach has been incorporated in a product called STAMP (System Testability and Maintenance Program) which was developed and refined through more than 10 years of field-level applications to complex system diagnosis. The results have been outstanding, even spectacular in some cases. In this paper we describe system-level testability, system-level diagnoses, and the STAMP analysis approach, as well as a few STAMP applications.
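The dependency-modeling idea can be sketched in a few lines (the component and test names below are hypothetical, not drawn from STAMP): each functional test is mapped to the set of components whose failure it can detect, and diagnosis intersects the suspect sets of failed tests while removing anything a passing test exonerates.

```python
# Hypothetical first-order dependency model: test name -> detectable components.
DEPENDENCIES = {
    "fft_pattern_test": {"fft_circuit", "input_buffer", "clock"},
    "buffer_loopback": {"input_buffer", "clock"},
    "clock_check": {"clock"},
}

def isolate(outcomes):
    """outcomes: dict test_name -> bool (True = passed).
    Returns the set of components consistent with all outcomes."""
    suspects = set().union(*DEPENDENCIES.values())
    for test, passed in outcomes.items():
        if passed:
            suspects -= DEPENDENCIES[test]   # exonerated by a passing test
        else:
            suspects &= DEPENDENCIES[test]   # must explain the failure
    return suspects

# Failed FFT pattern test, but buffer and clock pass: the FFT circuit is implicated
# without any gate-level failure analysis, which suffices for system-level repair.
result = isolate({"fft_pattern_test": False,
                  "buffer_loopback": True,
                  "clock_check": True})
```

The information-theoretic part of such an approach (not shown) would order the tests so each outcome splits the remaining suspect set as evenly as possible.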
Hayes, Mark A.; Cryan, Paul M.; Wunder, Michael B.
2015-01-01
Understanding seasonal distribution and movement patterns of animals that migrate long distances is an essential part of monitoring and conserving their populations. Compared to migratory birds and other more conspicuous migrants, we know very little about the movement patterns of many migratory bats. Hoary bats (Lasiurus cinereus), a cryptic, wide-ranging, long-distance migrant, comprise a substantial proportion of the tens to hundreds of thousands of bat fatalities estimated to occur each year at wind turbines in North America. We created seasonally-dynamic species distribution models (SDMs) from 2,753 museum occurrence records collected over five decades in North America to better understand the seasonal geographic distributions of hoary bats. We used five SDM approaches (logistic regression, multivariate adaptive regression splines, boosted regression trees, random forest, and maximum entropy) and consolidated their outputs to generate ensemble maps. These maps represent the first formal hypotheses for sex- and season-specific hoary bat distributions. Our results suggest that North American hoary bats winter in regions with relatively long growing seasons where temperatures are moderated by proximity to oceans, and then move to the continental interior for the summer. SDMs suggested that hoary bats are most broadly distributed in autumn—the season when they are most susceptible to mortality from wind turbines; this season contains the greatest overlap between potentially suitable habitat and wind energy facilities. Comparing wind-turbine fatality data to model outputs could test many predictions, such as ‘risk from turbines is highest in habitats between hoary bat summering and wintering grounds’. Although future field studies are needed to validate the SDMs, this study generated well-justified and testable hypotheses of hoary bat migration patterns and seasonal distribution. PMID:26208098
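The consolidation step can be sketched minimally (the model names follow the abstract, but the function and the toy suitability grids are invented for illustration): each SDM yields a habitat-suitability surface in [0, 1], and the ensemble map averages them cell by cell so that no single algorithm dominates the seasonal hypothesis.

```python
def ensemble_map(model_outputs):
    """model_outputs: dict model_name -> 2D list of suitability scores in [0, 1].
    Returns the cell-wise mean across models (a simple unweighted ensemble)."""
    names = list(model_outputs)
    rows = len(model_outputs[names[0]])
    cols = len(model_outputs[names[0]][0])
    return [[sum(model_outputs[m][r][c] for m in names) / len(names)
             for c in range(cols)] for r in range(rows)]

# Toy 2x2 suitability grids from three of the five approaches.
predictions = {
    "logistic_regression": [[0.2, 0.8], [0.4, 0.6]],
    "boosted_trees":       [[0.4, 0.6], [0.2, 0.8]],
    "random_forest":       [[0.3, 0.7], [0.3, 0.7]],
}
consensus = ensemble_map(predictions)
```

Weighted ensembles (e.g. by each model's cross-validated skill) are a common refinement of this unweighted average.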
Sources, Sinks, and Model Accuracy
Spatial demographic models are a necessary tool for understanding how to manage landscapes sustainably for animal populations. These models, therefore, must offer precise and testable predictions about animal population dynamics and how animal demographic parameters respond to ...
The role of prediction in social neuroscience
Brown, Elliot C.; Brüne, Martin
2012-01-01
Research has shown that the brain is constantly making predictions about future events. Theories of prediction in perception, action and learning suggest that the brain serves to reduce the discrepancies between expectation and actual experience, i.e., by reducing the prediction error. Forward models of action and perception propose the generation of a predictive internal representation of the expected sensory outcome, which is matched to the actual sensory feedback. Shared neural representations have been found when experiencing one's own and observing other's actions, rewards, errors, and emotions such as fear and pain. These general principles of the “predictive brain” are well established and have already begun to be applied to social aspects of cognition. The application and relevance of these predictive principles to social cognition are discussed in this article. Evidence is presented to argue that simple non-social cognitive processes can be extended to explain complex cognitive processes required for social interaction, with common neural activity seen for both social and non-social cognitions. A number of studies are included which demonstrate that bottom-up sensory input and top-down expectancies can be modulated by social information. The concept of competing social forward models and a partially distinct category of social prediction errors are introduced. The evolutionary implications of a “social predictive brain” are also mentioned, along with the implications on psychopathology. The review presents a number of testable hypotheses and novel comparisons that aim to stimulate further discussion and integration between currently disparate fields of research, with regard to computational models, behavioral and neurophysiological data. 
This promotes a relatively new platform for inquiry in social neuroscience with implications in social learning, theory of mind, empathy, the evolution of the social brain, and potential strategies for treating social cognitive deficits. PMID:22654749
A prospective earthquake forecast experiment for Japan
NASA Astrophysics Data System (ADS)
Yokoi, Sayoko; Nanjo, Kazuyoshi; Tsuruoka, Hiroshi; Hirata, Naoshi
2013-04-01
One major focus of the current Japanese earthquake prediction research program (2009-2013) is to move toward creating testable earthquake forecast models. For this purpose we started an experiment of forecasting earthquake activity in Japan under the framework of the Collaboratory for the Study of Earthquake Predictability (CSEP) through an international collaboration. We established the CSEP Testing Centre, an infrastructure to encourage researchers to develop testable models for Japan and to conduct verifiable prospective tests of their model performance. On 1 November 2009, we started the first earthquake forecast testing experiment for the Japan area. We use the unified catalogue compiled by the Japan Meteorological Agency (JMA) as the authorized catalogue. The experiment consists of 12 categories: 4 testing classes with different time spans (1 day, 3 months, 1 year, and 3 years) and 3 testing regions called All Japan, Mainland, and Kanto. A total of 91 models were submitted to CSEP-Japan and are evaluated with the official CSEP suite of tests of forecast performance. In this presentation, we show the results of the 3-month testing class for 5 rounds. The HIST-ETAS7pa, MARFS, and RI10K models showed the best scores based on total log-likelihood in the All Japan, Mainland, and Kanto regions, respectively. It also became clear that time dependency of model parameters is not an effective factor in passing the CSEP consistency tests for the 3-month testing class in any region. In particular, the spatial-distribution consistency test was difficult to pass in the All Japan region owing to multiple events falling in a single bin. The number of target events per round in the Mainland region tended to be smaller than the models' expectations during all rounds, which resulted in consistency-test rejections due to overestimation.
In the Kanto region, the pass ratio of the consistency tests exceeded 80% for each model, reflecting well-balanced forecasting of both event numbers and spatial distribution. Through the multiple rounds of the experiment, we are beginning to understand the stability of the models, the robustness of model selection, and earthquake predictability in each region beyond stochastic fluctuations of seismicity. We plan to use the results in designing a three-dimensional earthquake forecasting model for the Kanto region, supported by the Special Project for Reducing Vulnerability for Urban Mega Earthquake Disasters of the Ministry of Education, Culture, Sports, Science and Technology of Japan.
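The scoring logic behind these results can be sketched as follows (a hedged illustration, not the official CSEP test suite; the bin rates are invented): a gridded forecast gives an expected event count per bin, the joint log-likelihood under independent Poisson bins scores how well the forecast matches the observed catalogue, and comparing total expected versus observed counts is the essence of the number ("N") consistency test that the Mainland-region models tended to fail by overestimation.

```python
import math

def poisson_log_likelihood(forecast, observed):
    """forecast: per-bin expected event rates; observed: per-bin event counts.
    Returns sum over bins of log Poisson(n_i | f_i)."""
    return sum(-f + n * math.log(f) - math.lgamma(n + 1)
               for f, n in zip(forecast, observed))

forecast = [0.5, 1.5, 2.0]   # toy expected events per bin over one round
observed = [0, 1, 2]         # toy catalogue counts in the same bins

ll = poisson_log_likelihood(forecast, observed)
# N-test flavour: total expectation exceeding total observation, round after
# round, is the overestimation pattern described for the Mainland region.
overprediction = sum(forecast) > sum(observed)
```

Ranking models by such total log-likelihoods (as with HIST-ETAS7pa, MARFS, and RI10K above) and checking per-round count consistency are complementary: a model can rank well yet still fail consistency tests.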
Prediction and typicality in multiverse cosmology
NASA Astrophysics Data System (ADS)
Azhar, Feraz
2014-02-01
In the absence of a fundamental theory that precisely predicts values for observable parameters, anthropic reasoning attempts to constrain probability distributions over those parameters in order to facilitate the extraction of testable predictions. The utility of this approach has been vigorously debated of late, particularly in light of theories that claim we live in a multiverse, where parameters may take differing values in regions lying outside our observable horizon. Within this cosmological framework, we investigate the efficacy of top-down anthropic reasoning based on the weak anthropic principle. We argue, contrary to recent claims, that it is not clear one can either dispense with notions of typicality altogether or presume typicality when comparing resulting probability distributions with observations. We show, in a concrete top-down setting related to dark matter, that assumptions about typicality can dramatically affect predictions, thereby providing a guide to how errors in reasoning regarding typicality translate to errors in the assessment of predictive power. We conjecture that this dependence on typicality is an integral feature of anthropic reasoning in broader cosmological contexts, and argue in favour of the explicit inclusion of measures of typicality in schemes invoking anthropic reasoning, with a view to extracting predictions from multiverse scenarios.
Experiments in structural dynamics and control using a grid
NASA Technical Reports Server (NTRS)
Montgomery, R. C.
1985-01-01
Future spacecraft are being conceived that are highly flexible and of extreme size. The two features of flexibility and size pose new problems in control system design. Since large-scale structures are not testable in ground-based facilities, decisions on component placement must be made prior to full-scale tests on the spacecraft. Control law research is directed at a key problem: the modelling knowledge available prior to operation is inadequate to achieve peak performance. Another crucial problem addressed is accommodating failures in systems with smart components that are physically distributed on highly flexible structures. Parameter adaptive control is a promising method that provides on-orbit tuning of the control system, improving performance by upgrading the mathematical model of the spacecraft during operation. Two specific questions are answered in this work: What limits does on-line parameter identification with realistic sensors and actuators place on the ultimate achievable performance of a system in the highly flexible environment? And how well must the mathematical model used in on-board analytic redundancy be known, and what are reasonable expectations for advanced redundancy-management schemes in the highly flexible and distributed-component environment?
Monte Carlo modeling of single-molecule cytoplasmic dynein.
Singh, Manoranjan P; Mallik, Roop; Gross, Steven P; Yu, Clare C
2005-08-23
Molecular motors are responsible for active transport and organization in the cell, underlying an enormous number of crucial biological processes. Dynein is more complicated in its structure and function than other motors. Recent experiments have found that, unlike other motors, dynein can take different size steps along microtubules depending on load and ATP concentration. We use Monte Carlo simulations to model the molecular motor function of cytoplasmic dynein at the single-molecule level. The theory relates dynein's enzymatic properties to its mechanical force production. Our simulations reproduce the main features of recent single-molecule experiments that found a discrete distribution of dynein step sizes, depending on load and ATP concentration. The model reproduces the large steps found experimentally under high ATP and no load by assuming that the ATP binding affinities at the secondary sites decrease as the number of ATP bound to these sites increases. Additionally, to capture the essential features of the step-size distribution at very low ATP concentration and no load, the ATP hydrolysis of the primary site must be dramatically reduced when none of the secondary sites have ATP bound to them. We make testable predictions that should guide future experiments related to dynein function.
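A toy Monte Carlo in the spirit of this model can make the mechanism concrete (the rate constants, the decaying-affinity rule, and the occupancy-to-step-size mapping below are all invented for illustration, not taken from the paper): secondary ATP sites fill with an affinity that drops as occupancy grows, and the sampled occupancy selects a discrete step size, so the step-size distribution shifts with ATP concentration.

```python
import random

def sample_step(atp, base_affinity=1.0, decay=0.25, sites=3, rng=random):
    """Return one step size (nm) from a hypothetical occupancy rule:
    each successive secondary site binds ATP with reduced affinity."""
    occupied = 0
    for k in range(sites):
        affinity = base_affinity * (decay ** k)   # weaker with each bound ATP
        p_bind = (atp * affinity) / (1.0 + atp * affinity)
        if rng.random() < p_bind:
            occupied += 1
        else:
            break
    # Illustrative mapping: fewer occupied secondary sites -> larger steps.
    return {0: 32, 1: 24, 2: 16, 3: 8}[occupied]

random.seed(1)
steps_high_atp = [sample_step(atp=10.0) for _ in range(1000)]
steps_low_atp = [sample_step(atp=0.05) for _ in range(1000)]
```

Sampling many steps at different ATP concentrations yields a discrete, concentration-dependent histogram, which is the qualitative feature the full model fits against single-molecule data.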
Asymmetric patch size distribution leads to disruptive selection on dispersal.
Massol, François; Duputié, Anne; David, Patrice; Jarne, Philippe
2011-02-01
Numerous models have been designed to understand how dispersal ability evolves when organisms live in a fragmented landscape. Most of them predict a single dispersal rate at evolutionary equilibrium, and when diversification of dispersal rates has been predicted, it occurs as a response to perturbation or environmental fluctuation regimes. Yet abundant variation in dispersal ability is observed in natural populations and communities, even in relatively stable environments. We show that this diversification can operate in a simple island model without temporal variability: disruptive selection on dispersal occurs when the environment consists of many small and few large patches, a common feature in natural spatial systems. This heterogeneity in patch size results in a high variability in the number of related patch mates by individual, which, in turn, triggers disruptive selection through a high per capita variance of inclusive fitness. Our study provides a likely, parsimonious and testable explanation for the diversity of dispersal rates encountered in nature. It also suggests that biological conservation policies aiming at preserving ecological communities should strive to keep the distribution of patch size sufficiently asymmetric and variable. © 2010 The Author(s). Evolution© 2010 The Society for the Study of Evolution.
Severe infectious diseases of childhood as monogenic inborn errors of immunity
Casanova, Jean-Laurent
2015-01-01
This paper reviews the developments that have occurred in the field of human genetics of infectious diseases from the second half of the 20th century onward. In particular, it stresses and explains the importance of the recently described monogenic inborn errors of immunity underlying resistance or susceptibility to specific infections. The monogenic component of the genetic theory provides a plausible explanation for the occurrence of severe infectious diseases during primary infection. Over the last 20 y, increasing numbers of life-threatening infectious diseases striking otherwise healthy children, adolescents, and even young adults have been attributed to single-gene inborn errors of immunity. These studies were inspired by seminal but neglected findings in plant and animal infections. Infectious diseases typically manifest as sporadic traits because human genotypes often display incomplete penetrance (most genetically predisposed individuals remain healthy) and variable expressivity (different infections can be allelic at the same locus). Infectious diseases of childhood, once thought to be archetypal environmental diseases, actually may be among the most genetically determined conditions of mankind. This nascent and testable notion has interesting medical and biological implications. PMID:26621750
Moral injury: a mechanism for war-related psychological trauma in military family members.
Nash, William P; Litz, Brett T
2013-12-01
Recent research has provided compelling evidence of mental health problems in military spouses and children, including post-traumatic stress disorder (PTSD), related to the war-zone deployments, combat exposures, and post-deployment mental health symptoms experienced by military service members in the family. One obstacle to further research and federal programs targeting the psychological health of military family members has been the lack of a clear, compelling, and testable model to explain how war-zone events can result in psychological trauma in military spouses and children. In this article, we propose a possible mechanism for deployment-related psychological trauma in military spouses and children based on the concept of moral injury, a model that has been developed to better understand how service members and veterans may develop PTSD and other serious mental and behavioral problems in the wake of war-zone events that damage moral belief systems rather than threaten personal life and safety. After describing means of adapting the moral injury model to family systems, we discuss the clinical implications of moral injury and describe a model for its psychological treatment.
A Framework for Finding and Interpreting Stellar CMEs
NASA Astrophysics Data System (ADS)
Osten, Rachel A.; Wolk, Scott J.
2017-10-01
The astrophysical study of mass loss, both steady-state and transient, on the cool half of the HR diagram has implications both for the star itself and the conditions created around the star that can be hospitable or inimical to supporting life. Stellar coronal mass ejections (CMEs) have not been conclusively detected, despite the ubiquity with which their radiative counterparts in an eruptive event (flares) have been. I will review some of the different observational methods which have been used and possibly could be used in the future in the stellar case, emphasizing some of the difficulties inherent in such attempts. I will provide a framework for interpreting potential transient stellar mass loss in light of the properties of flares known to occur on magnetically active stars. This uses a physically motivated way to connect the properties of flares and coronal mass ejections and provides a testable hypothesis for observing or constraining transient stellar mass loss. Finally I will describe recent results using observations at low radio frequencies to detect stellar coronal mass ejections, and give updates on prospects using future facilities to make headway in this important area.
Lepton flavor violating Z' explanation of the muon anomalous magnetic moment
Altmannshofer, Wolfgang; Chen, Chien-Yi; Dev, P. S. Bhupal; ...
2016-09-28
Here, we discuss a minimal solution to the long-standing (g-2)μ anomaly in a simple extension of the Standard Model with an extra Z' vector boson that has only flavor off-diagonal couplings to the second and third generations of leptons, i.e. μ, τ, νμ, ντ, and their antiparticles. A simplified model realization, as well as various collider and low-energy constraints on this model, are discussed. We find that the (g-2)μ-favored region for a Z' lighter than the tau lepton is totally excluded, while a heavier Z' solution is still allowed. Some testable implications of this scenario in future experiments, such as lepton-flavor universality-violating tau decays at Belle II, and a new four-lepton signature involving same-sign di-muons and di-taus at HL-LHC and FCC-ee, are pointed out. A characteristic resonant absorption feature in the high-energy neutrino spectrum might also be observed by neutrino telescopes like IceCube and KM3NeT.
Developmental Perspectives on Oxytocin and Vasopressin
Hammock, Elizabeth A D
2015-01-01
The related neuropeptides oxytocin and vasopressin are involved in species-typical behavior, including social recognition behavior, maternal behavior, social bonding, communication, and aggression. A wealth of evidence from animal models demonstrates significant modulation of adult social behavior by both of these neuropeptides and their receptors. Over the last decade, there has been a flood of studies in humans also implicating a role for these neuropeptides in human social behavior. Despite popular assumptions that oxytocin is a molecule of social bonding in the infant brain, less mechanistic research emphasis has been placed on the potential role of these neuropeptides in the developmental emergence of the neural substrates of behavior. This review summarizes what is known and assumed about the developmental influence of these neuropeptides and outlines the important unanswered questions and testable hypotheses. There is tremendous translational need to understand the functions of these neuropeptides in mammalian experience-dependent development of the social brain. The activity of oxytocin and vasopressin during development should inform our understanding of individual, sex, and species differences in social behavior later in life. PMID:24863032
Modal Interpretation of Quantum Mechanics and Classical Physical Theories
NASA Astrophysics Data System (ADS)
Ingarden, R. S.
In 1990, Bas C. van Fraassen defined the modal interpretation of quantum mechanics as the consideration of it as ``a pure theory of the possible, with testable, empirical implications for what actually happens". This is a narrow, traditional understanding of modality, only in the sense of the concept of possibility (usually denoted in logic by C. I. Lewis's symbol ◊) and the concept of necessity □ defined by means of ◊. In modern logic, however, modality is understood in a much wider sense as any intensional functor (i.e., one that is non-extensional, or determined not only by the truth value of a sentence). In recent publications by the author (1997, independent of van Fraassen), an attempt was made to apply this wider understanding of modality to the interpretation of classical and quantum physics. In the present lecture, these problems are discussed against the background of a brief review of the logical approach to quantum mechanics over the last seven decades. In this discussion, the new concepts of sub-modality and super-modality of many orders are used.
Conway's “Game of Life” and the Epigenetic Principle
Caballero, Lorena; Hodge, Bob; Hernandez, Sergio
2016-01-01
Cellular automatons and computer simulation games are widely used as heuristic devices in biology to explore the implications and consequences of specific theories. Conway's Game of Life has been widely used for this purpose. The game was designed to explore the evolution of ecological communities. We apply it to other biological processes, including symbiopoiesis. We show that Conway's organization of rules reflects the epigenetic principle, that genetic action and developmental processes are inseparable dimensions of a single biological system, analogous to the integration processes in symbiopoiesis. We look for similarities and differences between two epigenetic models, by Turing and Edelman, as they are realized in Game of Life objects. We show the value of computer simulations for experimenting with and proposing generalizations of broader scope with novel testable predictions. We use the game to examine issues in symbiopoiesis and evo-devo, where we explore a fractal hypothesis: that self-similarity exists at different levels (cells, organisms, ecological communities) as a result of homologous interactions of the kind modeled in the Game of Life. PMID:27379213
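Since the abstract leans on the organization of Conway's rules, a minimal update step is worth showing: a cell's next state depends jointly on its own state ("genetic action") and its neighborhood ("developmental context"), which is the inseparability the authors read epigenetically.

```python
from collections import Counter

def life_step(live_cells):
    """One Game of Life generation. live_cells: set of (x, y) coordinates.
    A cell is alive next step if it has exactly 3 live neighbours, or has
    2 live neighbours and is currently alive."""
    neighbour_counts = Counter(
        (x + dx, y + dy)
        for x, y in live_cells
        for dx in (-1, 0, 1) for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    return {cell for cell, n in neighbour_counts.items()
            if n == 3 or (n == 2 and cell in live_cells)}

# The "blinker" oscillates with period 2: a simple persistent Life object
# of the kind the authors map onto epigenetic models.
blinker = {(0, 0), (1, 0), (2, 0)}
```

Iterating `life_step` from chosen seed patterns is exactly the kind of experiment the abstract advocates for probing interactions across levels.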
The Buffering Effect of Hope on Clinicians’ Behavior: A Test in Pediatric Primary Care
Tennen, Howard; Cloutier, Michelle M.; Wakefield, Dorothy B.; Hall, Charles B.; Brazil, Kevin
2009-01-01
Although trait hope is thought to motivate goal directed actions in the face of impediments, few studies have examined directly hope’s role in overcoming obstacles, and none have done so while accounting for related goal constructs. We describe a study of 127 pediatric primary care providers who over the course of a year were asked to identify new cases of asthma and confirm previously diagnosed active disease by completing for each of their patients a brief survey validated for this purpose. These clinicians also completed measures of hope, self-efficacy, conscientiousness, and perceived obstacles to implementing a pediatric asthma management program. As predicted by hope theory, the agency component of hope buffered clinicians from perceived obstacles by facilitating the identification of asthma cases among high hope clinicians in the face of obstacles. This buffering effect remained after controlling for self-efficacy and conscientiousness. We discuss the study findings in terms of current theories of goal directed behavior and implications for delivering hope-related interventions, and we offer a testable hypothesis regarding when agency and pathways thinking facilitate goal-related behavior. PMID:20161067
Neutrino mass as the probe of intermediate mass scales
DOE Office of Scientific and Technical Information (OSTI.GOV)
Senjanovic, G.
1980-01-01
A discussion of the calculability of neutrino mass is presented. The possibility of neutrinos being either Dirac or Majorana particles is analyzed in detail. Arguments are offered in favor of the Majorana case: the smallness of neutrino mass is linked to the maximality of parity violation in weak interactions. It is shown how the measured value of neutrino mass would probe the existence of an intermediate mass scale, presumably in the TeV region, at which parity is supposed to become a good symmetry. Experimental consequences of the proposed scheme are discussed, in particular neutrinoless double β decay, whose observation would provide a crucial test of the model, and rare muon decays such as μ → eγ and μ → eeē. Finally, the embedding of this model in an O(10) grand unified theory is analyzed, with emphasis on the implications for intermediate mass scales that it offers. It is concluded that the proposed scheme provides a distinct and testable alternative for understanding the smallness of neutrino mass. 4 figures.
Attempts to Localize and Identify the Gravity-sensing Device of Plant Seedlings
NASA Technical Reports Server (NTRS)
Bandurski, R. S.; Schulze, A.; Momonoki, Y.; Desrosiers, M.; Fearn-Desrosiers, D.
1985-01-01
The growth hormone asymmetry develops within three minutes following the initiation of the gravitational asymmetry, and radio-labeled compounds being transported from the seed to the shoot also show an asymmetric distribution. It is found that the target of the gravity stimulus resides primarily in the permeability of the vascular tissue that regulates the supply of hormone to the surrounding tissues. It is hypothesized that the gravitational stimulus induces an asymmetric change in the rate of secretion of the growth hormone, IAA, from the vascular tissue into the surrounding cortical cells. More hormone would be secreted from the vascular stele proximal to the lower side of a horizontally placed plant shoot than from the upper side. This results in more growth hormone in the lower cortical (plus epidermal) cells and ultimately more growth, such that the plant grows asymmetrically and ultimately attains its normal vertical orientation. A theory was developed of how plants respond to the gravitational stimulus. The theory is based upon analytical results concerning the effects of gravity on the distribution of the plant growth hormone, IAA, in both its free and conjugated forms, and upon the effect of the gravity stimulus on the distribution of externally applied radio-labeled compounds. Its advantage is that it is testable and that it is built upon solid knowledge of the effects of the gravitational stimulus upon the endogenous growth hormone, IAA, and upon the distribution of externally applied radio-labeled compounds.
The cosmological principle is not in the sky
NASA Astrophysics Data System (ADS)
Park, Chan-Gyung; Hyun, Hwasu; Noh, Hyerim; Hwang, Jai-chan
2017-08-01
The homogeneity of the matter distribution at large scales, known as the cosmological principle, is a central assumption of the standard cosmological model. The assumption is testable, however, and thus need no longer be taken as a principle. Here we perform a test of spatial homogeneity using the Sloan Digital Sky Survey Luminous Red Galaxies (LRG) sample by counting galaxies within spheres whose radius varies up to 300 h-1 Mpc. We directly confront the large-scale structure data with the definition of spatial homogeneity by comparing the averages and dispersions of galaxy number counts with the ranges allowed for a homogeneous random distribution. The LRG sample shows significantly larger dispersions of number counts than the random catalogues up to the 300 h-1 Mpc scale, and even the average lies far outside the range allowed in the random distribution; the deviations are statistically impossible to realize in a random distribution. This implies that the cosmological principle does not hold even at such large scales. The same analysis of mock galaxies derived from an N-body simulation, however, suggests that the LRG sample is consistent with the current paradigm of cosmology; thus the simulation is likewise not homogeneous at that scale. We conclude that the cosmological principle is neither found in the observed sky nor demanded by the standard cosmological world model. This reveals the nature of the cosmological principle adopted in the modern cosmology paradigm, and opens a new field of research in theoretical cosmology.
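The counts-in-spheres comparison this abstract describes can be sketched with synthetic data. This is a toy illustration with made-up catalogues in a unit box, not the SDSS LRG analysis: a clustered point set shows a larger dispersion of counts than a homogeneous random one.

```python
import numpy as np

rng = np.random.default_rng(0)

def sphere_counts(points, centers, radius):
    """Number of points within `radius` of each center (brute force)."""
    diffs = points[None, :, :] - centers[:, None, :]
    dists = np.sqrt((diffs ** 2).sum(axis=2))
    return (dists < radius).sum(axis=1)

# Homogeneous (random) catalogue: uniform points in a unit box.
random_cat = rng.random((5000, 3))

# "Clustered" catalogue: points scattered around a few cluster seeds.
seeds = rng.random((20, 3))
clustered_cat = (seeds[rng.integers(0, 20, 5000)]
                 + 0.03 * rng.standard_normal((5000, 3))) % 1.0

centers = rng.random((200, 3)) * 0.6 + 0.2   # keep spheres inside the box
radius = 0.1

n_rand = sphere_counts(random_cat, centers, radius)
n_clus = sphere_counts(clustered_cat, centers, radius)

# Clustering inflates the dispersion of counts relative to the random case,
# which is the signature the survey test looks for.
print(n_rand.std(), n_clus.std())
```

In the real analysis the counts are computed from galaxy positions and compared against many random catalogues at each radius scale; the toy version only demonstrates why the dispersion of counts discriminates clustered from homogeneous distributions.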
Work-Centered Technology Development (WTD)
2005-03-01
...theoretical, testable, inductive, and repeatable foundations of science. Theoretical foundations include notions such as statistical versus analytical...
Writing testable software requirements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Knirk, D.
1997-11-01
This tutorial identifies common problems in analyzing problem requirements and in constructing a written specification of what the software is to do. It deals with two main problem areas: identifying and describing problem requirements, and analyzing and describing behavior specifications.
All pure bipartite entangled states can be self-tested
Coladangelo, Andrea; Goh, Koon Tong; Scarani, Valerio
2017-01-01
Quantum technologies promise advantages over their classical counterparts in the fields of computation, security and sensing. It is thus desirable that classical users are able to obtain guarantees on quantum devices, even without any knowledge of their inner workings. That such classical certification is possible at all is remarkable: it is a consequence of the violation of Bell inequalities by entangled quantum systems. Device-independent self-testing refers to the most complete such certification: it enables a classical user to uniquely identify the quantum state shared by uncharacterized devices by simply inspecting the correlations of measurement outcomes. Self-testing was first demonstrated for the singlet state and a few other examples of self-testable states were reported in recent years. Here, we address the long-standing open question of whether every pure bipartite entangled state is self-testable. We answer it affirmatively by providing explicit self-testing correlations for all such states. PMID:28548093
Phase 1 Space Fission Propulsion Energy Source Design
NASA Technical Reports Server (NTRS)
Houts, Mike; VanDyke, Melissa; Godfroy, Tom; Pedersen, Kevin; Martin, James; Dickens, Ricky; Salvail, Pat; Hrbud, Ivana; Carter, Robert; Rodgers, Stephen L. (Technical Monitor)
2002-01-01
Fission technology can enable rapid, affordable access to any point in the solar system. If fission propulsion systems are to be developed to their full potential, however, near-term customers must be identified and initial fission systems successfully developed, launched, and operated. Studies conducted in fiscal year 2001 (IISTP, 2001) show that fission electric propulsion (FEP) systems with a specific mass at or below 50 kg/kW-jet could enhance or enable numerous robotic outer solar system missions of interest. At the required specific mass, it is possible to develop safe, affordable systems that meet mission requirements. To help select the system design to pursue, eight evaluation criteria were identified: system integration, safety, reliability, testability, specific mass, cost, schedule, and programmatic risk. A top-level comparison of four potential concepts was performed: a Testable, Passive, Redundant Reactor (TPRR), a Testable Multi-Cell In-Core Thermionic Reactor (TMCT), a Direct Gas Cooled Reactor (DGCR), and a Pumped Liquid Metal Reactor (PLMR). Development of any of the four systems appears feasible. However, for power levels up to at least 500 kWt (enabling electric power levels of 125-175 kWe, given 25-35% power conversion efficiency) the TPRR has advantages related to several criteria and is competitive with respect to all. Hardware-based research and development has further increased confidence in the TPRR approach. Successful development and utilization of a Phase 1 fission electric propulsion system will enable advanced Phase 2 and Phase 3 systems capable of providing rapid, affordable access to any point in the solar system.
Pediatric Amblyopia Risk Investigation Study (PARIS).
Savage, Howard I; Lee, Hester H; Zaetta, Deneen; Olszowy, Ronald; Hamburger, Ellie; Weissman, Mark; Frick, Kevin
2005-12-01
To assess the learning curve, testability, and reliability of vision screening modalities administered by pediatric health extenders. Prospective masked clinical trial. Two hundred subjects aged 3 to 6 underwent timed screening for amblyopia by physician extenders, including LEA visual acuity (LEA), stereopsis (RDE), and noncycloplegic autorefraction (NCAR). Patients returned for a comprehensive diagnostic eye examination performed by an ophthalmologist or optometrist. Average screening time was 5.4 +/- 1.6 minutes (LEA), 1.9 +/- 0.9 minutes (RDE), and 1.7 +/- 1.0 minutes (NCAR). Test time for NCAR and RDE fell by 40% during the study period. Overall testability was 92% (LEA), 96% (RDE), and 94% (NCAR). Testability among 3-year-olds was 73% (LEA), 96% (RDE), and 89% (NCAR). Reliability of LEA was moderate (r = .59). Reliability of NCAR was high for astigmatism (Cyl) (r = .89), moderate for spherical equivalent (SE) (r = .66), and low for anisometropia (ANISO) (r = .38). Correlation of cycloplegic autorefraction (CAR) with gold standard cycloplegic retinoscopic refraction (CRR) was very high for SE (.85), CYL (.77), and moderate for ANISO (.48). With NCAR, physician extenders can quickly and reliably detect astigmatism and spherical refractive error in one-third the time it takes to obtain visual acuity. LEA has a lower initial cost, but is time consuming, moderately reliable, and more difficult for 3-year-olds. Shorter examination time and higher reliability may make NCAR a more efficient screening tool for refractive amblyopia in younger children. Future study is needed to determine the sensitivity and specificity of NCAR and other screening methods in detecting amblyopia and amblyopia risk factors.
Encoding dependence in Bayesian causal networks
USDA-ARS?s Scientific Manuscript database
Bayesian networks (BNs) represent complex, uncertain spatio-temporal dynamics by propagation of conditional probabilities between identifiable states with a testable causal interaction model. Typically, they assume random variables are discrete in time and space with a static network structure that ...
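The propagation of conditional probabilities that BNs perform can be illustrated with a minimal two-node network. The rain/wet-grass variables and their probabilities below are hypothetical, not taken from the manuscript:

```python
# Toy two-node discrete Bayesian network: Rain -> Wet.
# Marginalize over the parent to propagate belief to the child:
# P(Wet) = sum_r P(Wet | Rain=r) * P(Rain=r)

p_rain = {True: 0.3, False: 0.7}           # prior on the parent node
p_wet_given_rain = {True: 0.9, False: 0.2} # conditional table P(Wet=True | Rain)

p_wet = sum(p_wet_given_rain[r] * p_rain[r] for r in (True, False))
# 0.9 * 0.3 + 0.2 * 0.7 = 0.41
```

Real BN tools chain this marginalization across many nodes and handle evidence updates; the two-node case shows the single propagation step everything else is built from.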
Automated Testability Decision Tool
1991-09-01
ERIC Educational Resources Information Center
Niaz, Mansoor
1991-01-01
Discusses differences between the epistemic and the psychological subject, the relationship between the epistemic subject and the ideal gas law, the development of general cognitive operations, and the empirical testability of Piaget's epistemic subject. (PR)
Small Town in Mass Society Revisited.
ERIC Educational Resources Information Center
Young, Frank W.
1996-01-01
A 1958 New York community study dramatized the thesis that macro forces (urbanization, industrialization, bureaucratization) have undermined all small communities' autonomy. Such "oppositional case studies" succeed when they render the dominant view immediately obsolete, have plausible origins, are testable, and generate new research.…
Stephens, Patrick R.; Hua, Jessica; Searle, Catherine L.; Xie, Gisselle Yang; Urbina, Jenny; Olson, Deanna H.; Bancroft, Betsy A.; Weis, Virginia; Hammond, John I.; Relyea, Rick A.; Blaustein, Andrew R.
2017-01-01
Variation in host responses to pathogens can have cascading effects on populations and communities when some individuals or groups of individuals display disproportionate vulnerability to infection or differ in their competence to transmit infection. The fungal pathogen, Batrachochytrium dendrobatidis (Bd) has been detected in almost 700 different amphibian species and is implicated in numerous global amphibian population declines. Identifying key hosts in the amphibian-Bd system–those who are at greatest risk or who pose the greatest risk for others–is challenging due in part to many extrinsic environmental factors driving spatiotemporal Bd distribution and context-dependent host responses to Bd in the wild. One way to improve predictive risk models and generate testable mechanistic hypotheses about vulnerability is to complement what we know about the spatial epidemiology of Bd with data collected through comparative experimental studies. We used standardized pathogen challenges to quantify amphibian survival and infection trajectories across 20 post-metamorphic North American species raised from eggs. We then incorporated trait-based models to investigate the predictive power of phylogenetic history, habitat use, and ecological and life history traits in explaining responses to Bd. True frogs (Ranidae) displayed the lowest infection intensities, whereas toads (Bufonidae) generally displayed the greatest levels of mortality after Bd exposure. Affiliation with ephemeral aquatic habitat and breadth of habitat use were strong predictors of vulnerability to and intensity of infection and several other traits including body size, lifespan, age at sexual maturity, and geographic range also appeared in top models explaining host responses to Bd. Several of the species examined are highly understudied with respect to Bd such that this study represents the first experimental susceptibility data. 
Combining insights gained from experimental studies with observations of landscape-level disease prevalence may help explain current and predict future pathogen dynamics in the Bd system. PMID:28095428
Gervasi, Stephanie S; Stephens, Patrick R; Hua, Jessica; Searle, Catherine L; Xie, Gisselle Yang; Urbina, Jenny; Olson, Deanna H; Bancroft, Betsy A; Weis, Virginia; Hammond, John I; Relyea, Rick A; Blaustein, Andrew R
2017-01-01
Variation in host responses to pathogens can have cascading effects on populations and communities when some individuals or groups of individuals display disproportionate vulnerability to infection or differ in their competence to transmit infection. The fungal pathogen, Batrachochytrium dendrobatidis (Bd) has been detected in almost 700 different amphibian species and is implicated in numerous global amphibian population declines. Identifying key hosts in the amphibian-Bd system-those who are at greatest risk or who pose the greatest risk for others-is challenging due in part to many extrinsic environmental factors driving spatiotemporal Bd distribution and context-dependent host responses to Bd in the wild. One way to improve predictive risk models and generate testable mechanistic hypotheses about vulnerability is to complement what we know about the spatial epidemiology of Bd with data collected through comparative experimental studies. We used standardized pathogen challenges to quantify amphibian survival and infection trajectories across 20 post-metamorphic North American species raised from eggs. We then incorporated trait-based models to investigate the predictive power of phylogenetic history, habitat use, and ecological and life history traits in explaining responses to Bd. True frogs (Ranidae) displayed the lowest infection intensities, whereas toads (Bufonidae) generally displayed the greatest levels of mortality after Bd exposure. Affiliation with ephemeral aquatic habitat and breadth of habitat use were strong predictors of vulnerability to and intensity of infection and several other traits including body size, lifespan, age at sexual maturity, and geographic range also appeared in top models explaining host responses to Bd. Several of the species examined are highly understudied with respect to Bd such that this study represents the first experimental susceptibility data. 
Combining insights gained from experimental studies with observations of landscape-level disease prevalence may help explain current and predict future pathogen dynamics in the Bd system.
McCluney, Kevin E.; Belnap, Jayne; Collins, Scott L.; González, Angélica L.; Hagen, Elizabeth M.; Holland, J. Nathaniel; Kotler, Burt P.; Maestre, Fernando T.; Smith, Stanley D.; Wolf, Blair O.
2012-01-01
Species interactions play key roles in linking the responses of populations, communities, and ecosystems to environmental change. For instance, species interactions are an important determinant of the complexity of changes in trophic biomass with variation in resources. Water resources are a major driver of terrestrial ecology and climate change is expected to greatly alter the distribution of this critical resource. While previous studies have documented strong effects of global environmental change on species interactions in general, responses can vary from region to region. Dryland ecosystems occupy more than one-third of the Earth's land mass, are greatly affected by changes in water availability, and are predicted to be hotspots of climate change. Thus, it is imperative to understand the effects of environmental change on these globally significant ecosystems. Here, we review studies of the responses of population-level plant-plant, plant-herbivore, and predator-prey interactions to changes in water availability in dryland environments in order to develop new hypotheses and predictions to guide future research. To help explain patterns of interaction outcomes, we developed a conceptual model that views interaction outcomes as shifting between (1) competition and facilitation (plant-plant), (2) herbivory, neutralism, or mutualism (plant-herbivore), or (3) neutralism and predation (predator-prey), as water availability crosses physiological, behavioural, or population-density thresholds. We link our conceptual model to hypothetical scenarios of current and future water availability to make testable predictions about the influence of changes in water availability on species interactions. We also examine potential implications of our conceptual model for the relative importance of top-down effects and the linearity of patterns of change in trophic biomass with changes in water availability. 
Finally, we highlight key research needs and some possible broader impacts of our findings. Overall, we hope to stimulate and guide future research that links changes in water availability to patterns of species interactions and the dynamics of populations and communities in dryland ecosystems.
Is ``the Theory of Everything'' Merely the Ultimate Ensemble Theory?
NASA Astrophysics Data System (ADS)
Tegmark, Max
1998-11-01
We discuss some physical consequences of what might be called "the ultimate ensemble theory", where not only worlds corresponding to, say, different sets of initial data or different physical constants are considered equally real, but also worlds ruled by altogether different equations. The only postulate in this theory is that all structures that exist mathematically exist also physically, by which we mean that in those complex enough to contain self-aware substructures (SASs), these SASs will subjectively perceive themselves as existing in a physically "real" world. We find that it is far from clear that this simple theory, which has no free parameters whatsoever, is observationally ruled out. The predictions of the theory take the form of probability distributions for the outcome of experiments, which makes it testable. In addition, it may be possible to rule it out by comparing its a priori predictions for the observable attributes of nature (the particle masses, the dimensionality of spacetime, etc.) with what is observed.
Five potential consequences of climate change for invasive species.
Hellmann, Jessica J; Byers, James E; Bierwagen, Britta G; Dukes, Jeffrey S
2008-06-01
Scientific and societal unknowns make it difficult to predict how global environmental changes such as climate change and biological invasions will affect ecological systems. In the long term, these changes may have interacting effects and compound the uncertainty associated with each individual driver. Nonetheless, invasive species are likely to respond in ways that should be qualitatively predictable, and some of these responses will be distinct from those of native counterparts. We used the stages of invasion known as the "invasion pathway" to identify 5 nonexclusive consequences of climate change for invasive species: (1) altered transport and introduction mechanisms, (2) establishment of new invasive species, (3) altered impact of existing invasive species, (4) altered distribution of existing invasive species, and (5) altered effectiveness of control strategies. We then used these consequences to identify testable hypotheses about the responses of invasive species to climate change and provide suggestions for invasive-species management plans. The 5 consequences also emphasize the need for enhanced environmental monitoring and expanded coordination among entities involved in invasive-species management.
Cognitive Scientists Prefer Theories and Testable Principles with Teeth
ERIC Educational Resources Information Center
Graesser, Arthur C.
2009-01-01
Alexander, Schallert, and Reynolds (2009/this issue) proposed a definition and landscape of learning that included 9 principles and 4 dimensions ("what," "who," "where," "when"). This commentary reflects on the utility of this definition and 4-dimensional landscape from the standpoint of educational…
A systems framework for identifying candidate microbial assemblages for disease management
USDA-ARS?s Scientific Manuscript database
Network models of soil and plant microbiomes present new opportunities for enhancing disease management, but also challenges for interpretation. We present a framework for interpreting microbiome networks, illustrating how the observed structure of networks can be used to generate testable hypothese...
Vesselinova, Neda; Alexandrov, Boian; Wall, Michael E.
2016-11-08
We present a dynamical model of drug accumulation in bacteria. The model captures key features in experimental time courses on ofloxacin accumulation: initial uptake; two-phase response; and long-term acclimation. In combination with experimental data, the model provides estimates of import and export rates in each phase, the time of entry into the second phase, and the decrease of internal drug during acclimation. Global sensitivity analysis, local sensitivity analysis, and Bayesian sensitivity analysis of the model provide information about the robustness of these estimates, and about the relative importance of different parameters in determining the features of the accumulation time courses in three different bacterial species: Escherichia coli, Staphylococcus aureus, and Pseudomonas aeruginosa. The results lead to experimentally testable predictions of the effects of membrane permeability, drug efflux and trapping (e.g., by DNA binding) on drug accumulation. A key prediction is that a sudden increase in ofloxacin accumulation in both E. coli and S. aureus is accompanied by a decrease in membrane permeability.
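A toy version of an uptake model with acclimation can be written as a single ODE integrated by Euler steps. The rate constants and the acclimation rule below are invented for illustration; this is not the authors' fitted model:

```python
import numpy as np

def simulate_accumulation(t_end=60.0, dt=0.01, k_in=1.0, k_out=0.2,
                          t_acclimate=30.0, acclimation_factor=0.3):
    """Euler integration of a toy uptake model dC/dt = k_in - k_out * C,
    with the import rate reduced after an acclimation time (illustrative
    parameters only)."""
    steps = int(t_end / dt)
    t = np.arange(steps) * dt
    c = np.zeros(steps)
    for i in range(1, steps):
        k = k_in * (acclimation_factor if t[i] > t_acclimate else 1.0)
        c[i] = c[i - 1] + dt * (k - k_out * c[i - 1])
    return t, c

t, c = simulate_accumulation()
# Internal drug rises toward k_in / k_out = 5, then relaxes toward the
# lower post-acclimation steady state 0.3 * 5 = 1.5.
```

The shape (fast uptake, plateau, decline during acclimation) is the qualitative two-phase behavior the abstract describes; fitting such a model to measured time courses is what yields the phase-specific rate estimates.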
Perceptual Decision-Making as Probabilistic Inference by Neural Sampling.
Haefner, Ralf M; Berkes, Pietro; Fiser, József
2016-05-04
We address two main challenges facing systems neuroscience today: understanding the nature and function of cortical feedback between sensory areas and of correlated variability. Starting from the old idea of perception as probabilistic inference, we show how to use knowledge of the psychophysical task to make testable predictions for the influence of feedback signals on early sensory representations. Applying our framework to a two-alternative forced choice task paradigm, we can explain multiple empirical findings that have been hard to account for by the traditional feedforward model of sensory processing, including the task dependence of neural response correlations and the diverging time courses of choice probabilities and psychophysical kernels. Our model makes new predictions and characterizes a component of correlated variability that represents task-related information rather than performance-degrading noise. It demonstrates a normative way to integrate sensory and cognitive components into physiologically testable models of perceptual decision-making. Copyright © 2016 Elsevier Inc. All rights reserved.
What is a delusion? Epistemological dimensions.
Leeser, J; O'Donohue, W
1999-11-01
Although the Diagnostic and Statistical Manual of Mental Disorders (American Psychiatric Association, 1994) clearly indicates delusions have an epistemic dimension, it fails to accurately identify the epistemic properties of delusions. The authors explicate the regulative causes of belief revision for rational agents and argue that delusions are unresponsive to these. They argue that delusions are (a) protected beliefs made unfalsifiable either in principle or because the agent refuses to admit anything as a potential falsifier; (b) the protected belief is not typically considered a "properly basic" belief; (c) the belief is not of the variety of protected scientific beliefs; (d) in response to an apparent falsification, the subject posits not a simple, testable explanation for the inconsistency but one that is more complicated, less testable, and provides no new corroborations; (e) the subject has a strong emotional attachment to the belief; and (f) the belief is typically supported by (or originates from) trivial occurrences that are interpreted by the subject as highly unusual, significant, having personal reference, or some combination of these.
Estapé-Garrastazu, Estela S; Noboa-Ramos, Carlamarie; De Jesús-Ojeda, Lizbelle; De Pedro-Serbiá, Zulmarie; Acosta-Pérez, Edna; Camacho-Feliciano, Delia M
2014-10-01
A preliminary needs assessment was conducted among faculty and students of three minority medical and health science institutions comprising the Puerto Rico Clinical and Translational Research Consortium (PRCTRC). The Web-based survey focused on evaluating training interests in the clinical and translational research core areas and competencies developed by the National Institutes of Health Clinical and Translational Science Award. The survey was a joint effort of the leadership of three PRCTRC key functions: Multidisciplinary Training and Career Development, Tracking and Evaluation, and Community Research and Engagement. The questionnaire included 45 items distributed across five content areas, including demographics, research training needs, training activities coordination, and knowledge of the services offered by the PRCTRC. The analysis of research needs included a sample distribution across professors, assistant/associate professors, and graduate students. The thematic area with the highest response rate among the three groups was "Identify major clinical/public health problems and relevant translational research questions," with the competency "Identify basic and preclinical studies that are potential testable clinical research hypothesis." These preliminary results will guide the training and professional development of the new generation of clinical and translational researchers needed to eliminate health disparities. © 2014 The Authors. Clinical and Translational Science Published by Wiley Periodicals, Inc.
A one-dimensional statistical mechanics model for nucleosome positioning on genomic DNA.
Tesoro, S; Ali, I; Morozov, A N; Sulaiman, N; Marenduzzo, D
2016-02-12
The first level of folding of DNA in eukaryotes is provided by the so-called '10 nm chromatin fibre', where DNA wraps around histone proteins (∼10 nm in size) to form nucleosomes, which go on to create a zig-zagging bead-on-a-string structure. In this work we present a one-dimensional statistical mechanics model to study nucleosome positioning within one such 10 nm fibre. We focus on the case of genomic sheep DNA, and we start from effective potentials valid at infinite dilution and determined from high-resolution in vitro salt dialysis experiments. We study positioning within a polynucleosome chain, and compare the results for genomic DNA to that obtained in the simplest case of homogeneous DNA, where the problem can be mapped to a Tonks gas. First, we consider the simple, analytically solvable, case where nucleosomes are assumed to be point-like. Then, we perform numerical simulations to gauge the effect of their finite size on the nucleosomal distribution probabilities. Finally we compare nucleosome distributions and simulated nuclease digestion patterns for the two cases (homogeneous and sheep DNA), thereby providing testable predictions of the effect of sequence on experimentally observable quantities in experiments on polynucleosome chromatin fibres reconstituted in vitro.
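The mapping to a Tonks gas mentioned in this abstract can be sketched as follows: a uniform configuration of 1D hard rods is sampled exactly by shrinking each rod to a point, sampling an ideal gas in the correspondingly reduced box, and re-inflating. The rod and box lengths below are illustrative, not the fitted nucleosome potentials:

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_tonks(n_rods, rod_length, box_length, rng):
    """Sample hard-rod (Tonks gas) left-edge positions uniformly via the
    standard mapping to point particles in a box of reduced length."""
    free = box_length - n_rods * rod_length
    if free < 0:
        raise ValueError("rods do not fit in the box")
    points = np.sort(rng.random(n_rods) * free)
    # Re-inflate: left edge of rod i = point_i + i * rod_length.
    return points + np.arange(n_rods) * rod_length

pos = sample_tonks(n_rods=10, rod_length=15.0, box_length=300.0, rng=rng)
# Neighbouring left edges are always at least one rod length apart,
# so no two rods overlap.
gaps = np.diff(pos)
```

For point-like "nucleosomes" this sampling is exact; the paper's numerical simulations are needed precisely where finite size and a sequence-dependent external potential break this simple mapping.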
A new free and open source tool for space plasma modeling.
NASA Astrophysics Data System (ADS)
Honkonen, I. J.
2014-12-01
I will present a new distributed-memory parallel, free and open source computational model for studying space plasma. The model is written in C++ with emphasis on good software development practices and code readability, without sacrificing serial or parallel performance. As such, the model could be especially useful for education, for learning both (magneto)hydrodynamics (MHD) and computational model development. By using the latest features of the C++ standard (C++11) it has been possible to develop a very modular program, which improves not only the readability of the code but also the testability of the model, and decreases the effort required to make changes to various parts of the program. Major parts of the model, i.e. functionality not directly related to (M)HD, have been outsourced to other freely available libraries, which has reduced the development time of the model significantly. I will present an overview of the code architecture as well as details of different parts of the model, and will show examples of using the model, including preparing input files and plotting results. A multitude of 1-, 2- and 3-dimensional test cases are included in the software distribution, and the results of, for example, Kelvin-Helmholtz, bow shock, blast wave and reconnection tests will be presented.
NASA Astrophysics Data System (ADS)
Boger, R. A.; Low, R.; Paull, S.; Anyamba, A.; Soebiyanto, R. P.
2017-12-01
Temperature and precipitation are important drivers of mosquito population dynamics, and a growing set of models have been proposed to characterize these relationships. Validation of these models, and development of broader theories across mosquito species and regions could nonetheless be improved by comparing observations from a global dataset of mosquito larvae with satellite-based measurements of meteorological variables. Citizen science data can be particularly useful for two such aspects of research into the meteorological drivers of mosquito populations: i) Broad-scale validation of mosquito distribution models and ii) Generation of quantitative hypotheses regarding changes to mosquito abundance and phenology across scales. The recently released GLOBE Observer Mosquito Habitat Mapper (GO-MHM) app engages citizen scientists in identifying vector taxa, mapping breeding sites and decommissioning non-natural habitats, and provides a potentially useful new tool for validating mosquito ubiquity projections based on the analysis of remotely sensed environmental data. Our early work with GO-MHM data focuses on two objectives: validating citizen science reports of Aedes aegypti distribution through comparison with accepted scientific data sources, and exploring the relationship between extreme temperature and precipitation events and subsequent observations of mosquito larvae. Ultimately the goal is to develop testable hypotheses regarding the shape and character of this relationship between mosquito species and regions.
ERIC Educational Resources Information Center
Barth, Lorna
2007-01-01
By changing the venue from festival to a required academic exposition, the traditional science fair was transformed into a "Science Expo" wherein students were guided away from cookbook experiments toward developing a question about their environment into a testable and measurable experiment. The revamped "Science Expo" became a night for students…
Leveraging Rigorous Local Evaluations to Understand Contradictory Findings
ERIC Educational Resources Information Center
Boulay, Beth; Martin, Carlos; Zief, Susan; Granger, Robert
2013-01-01
Contradictory findings from "well-implemented" rigorous evaluations invite researchers to identify the differences that might explain the contradictions, helping to generate testable hypotheses for new research. This panel will examine efforts to ensure that the large number of local evaluations being conducted as part of four…
Changing Perspectives on Basic Research in Adult Learning and Memory
ERIC Educational Resources Information Center
Hultsch, David F.
1977-01-01
It is argued that whether the course of cognitive development is characterized by growth, stability, or decline is less a matter of data than of the metamodel on which the theories and data are based. Such metamodels are representations of reality that are not empirically testable. (Author)
Adolescent Pregnancy and Its Delay.
ERIC Educational Resources Information Center
Bell, Lloyd H.
This paper examines some probable reasons for the black adolescent male's contribution to increased pregnancy in the black community. Using a situation analysis, it presents the following testable suppositions: (1) black males' fear of retribution for impregnating a girl has diminished, leading to increased sexual intercourse and ultimately to…
The Process of Mentoring Pregnant Adolescents: An Exploratory Study.
ERIC Educational Resources Information Center
Blinn-Pike, Lynn; Kuschel, Diane; McDaniel, Annette; Mingus, Suzanne; Mutti, Megan Poole
1998-01-01
The process that occurs in relationships between volunteer adult mentors and pregnant adolescent "mentees" is described empirically; testable hypotheses based on findings concerning the mentor role are proposed. Case records from 20 mentors are analyzed; findings regarding mentors' roles are discussed. Criteria for conceptualizing quasi-parenting…
Spiegel, Brennan M.R.; Chey, William D.; Chang, Lin
2010-01-01
Some studies indicate that small intestinal bacterial overgrowth (SIBO), as measured by hydrogen breath tests (HBT), is more prevalent in patients with irritable bowel syndrome (IBS) vs. matched controls without IBS. Although the data are conflicting, this observation has led to the hypothesis that SIBO may be a primary cause of IBS. Yet, it remains unclear whether SIBO is truly fundamental to the pathophysiology of IBS, or is instead a mere epiphenomenon or bystander of something else altogether. We hypothesize that SIBO might be a byproduct of the disproportionate use of proton pump inhibitors (PPIs) in IBS, as follows: (1) IBS patients are more likely than controls to receive PPI therapy; (2) PPI therapy may promote varying forms of SIBO by eliminating gastric acid; and (3) existing studies linking SIBO to IBS have not adjusted for or excluded the use of PPI therapy. When linked together, these premises form the basis for a simple and testable hypothesis: the relationship between SIBO and IBS may be confounded by PPIs. Our article explores these premises, lays out the argument supporting this “PPI hypothesis,” discusses potential implications, and outlines next steps to further investigate this possibility. PMID:19086951
CP violation in heavy MSSM Higgs scenarios
Carena, M.; Ellis, J.; Lee, J. S.; ...
2016-02-18
We introduce and explore new heavy Higgs scenarios in the Minimal Supersymmetric Standard Model (MSSM) with explicit CP violation, which have important phenomenological implications that may be testable at the LHC. For soft supersymmetry-breaking scales M_S above a few TeV and a charged Higgs boson mass M_H+ above a few hundred GeV, new physics effects including those from explicit CP violation decouple from the light Higgs boson sector. However, such effects can significantly alter the phenomenology of the heavy Higgs bosons while still being consistent with constraints from low-energy observables, for instance electric dipole moments. To consider scenarios with a charged Higgs boson much heavier than the Standard Model (SM) particles but much lighter than the supersymmetric particles, we revisit previous calculations of the MSSM Higgs sector. We compute the Higgs boson masses in the presence of CP-violating phases, implementing improved matching and renormalization-group (RG) effects, as well as two-loop RG effects from the effective two-Higgs-doublet model (2HDM) scale M_H± to the scale M_S. Here, we illustrate the possibility of non-decoupling CP-violating effects in the heavy Higgs sector using new benchmark scenarios named.
NASA Astrophysics Data System (ADS)
Cavalcanti, Eric G.; Wiseman, Howard M.
2012-10-01
The 1964 theorem of John Bell shows that no model that reproduces the predictions of quantum mechanics can simultaneously satisfy the assumptions of locality and determinism. On the other hand, the assumptions of signal locality plus predictability are also sufficient to derive Bell inequalities. This simple theorem, previously noted but published only relatively recently by Masanes, Acin and Gisin, has fundamental implications that are not entirely appreciated. Firstly, nothing can be concluded about the ontological assumptions of locality or determinism independently of each other—it is possible to reproduce quantum mechanics with deterministic models that violate locality as well as indeterministic models that satisfy locality. On the other hand, the operational assumption of signal locality is an empirically testable (and well-tested) consequence of relativity. Thus Bell inequality violations imply that we can trust that some events are fundamentally unpredictable, even if we cannot trust that they are indeterministic. This result grounds the quantum-mechanical prohibition of arbitrarily accurate predictions on the assumption of no superluminal signalling, regardless of any postulates of quantum mechanics. It also sheds new light on an early stage of the historical debate between Einstein and Bohr.
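The empirical content of a Bell inequality violation can be checked numerically. A minimal sketch of the standard CHSH setup (not taken from the paper above): with the singlet-state correlation E(a,b) = -cos(a-b) predicted by quantum mechanics, the CHSH combination reaches 2√2 in magnitude, beyond the bound of 2 obeyed by any signal-local, predictable model.

```python
import math

def E(a, b):
    # Quantum-mechanical correlation for spin measurements on a
    # singlet pair along directions separated by angle (a - b).
    return -math.cos(a - b)

# Standard CHSH measurement angles (radians) that maximize |S|
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4
S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(abs(S))   # 2*sqrt(2) ~ 2.828, exceeding the local bound of 2
```

Any model satisfying signal locality plus predictability must keep |S| ≤ 2, which is what makes the measured violation informative.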
RABIES SURVEILLANCE AMONG BATS IN TENNESSEE, USA, 1996-2010.
Gilbert, Amy T; McCracken, Gary F; Sheeler, Lorinda L; Muller, Lisa I; O'Rourke, Dorcas; Kelch, William J; New, John C
2015-10-01
Rabies virus (RABV) infects multiple bat species in the Americas, and enzootic foci perpetuate in bats principally via intraspecific transmission. In recent years, bats have been implicated in over 90% of human rabies cases in the US. In Tennessee, two human cases of rabies have occurred since 1960: one case in 1994 associated with a tricolored bat (Perimyotis subflavus) RABV variant and another in 2002 associated with the tricolored/silver-haired bat (P. subflavus/Lasionycteris noctivagans) RABV variant. From 1996 to 2010, 2,039 bats were submitted for rabies testing in Tennessee. Among 1,943 bats in satisfactory condition for testing and with a reported diagnostic result, 96% (1,870 of 1,943) were identified to species and 10% (196 of 1,943) were rabid. Big brown (Eptesicus fuscus), tricolored, and eastern red (Lasiurus borealis) bats comprised 77% of testable bat submissions and 84% of rabid bats. For species with five or more submissions during 1996-2010, the highest proportion of rabid bats occurred in hoary (Lasiurus cinereus; 46%), unspecified Myotis spp. (22%), and eastern red (17%) bats. The best model to predict rabid bats included month of submission, exposure history of submission, species, and sex of bat.
NASA Technical Reports Server (NTRS)
Young, K. E.; Bleacher, J. E.; Evans, C. A.; Rogers, A. D.; Ito, G.; Arzoumanian, Z.; Gendreau, K.
2015-01-01
Regardless of the target destination for the next manned planetary mission, the crew will require technology with which to select samples for return to Earth. The six Apollo lunar surface mission crews had only the tools to enable them to physically pick samples up off the surface or from a boulder and store those samples for return to the Lunar Module and eventually to Earth. Sample characterization was dependent upon visual inspection and relied upon their extensive geology training. In the four decades since Apollo, however, great advances have been made in traditionally laboratory-based instrument technologies that enable miniaturization to a field-portable configuration. The implications of these advancements extend past traditional terrestrial field geology and into planetary surface exploration. With tools that will allow for real-time geochemical analysis, an astronaut can better develop a series of working hypotheses that are testable during surface science operations. One such technology is x-ray fluorescence (XRF). Traditionally used in a laboratory configuration, these instruments have now been developed and marketed commercially in a field-portable mode. We examine this technology in the context of geologic sample analysis and discuss current and future plans for instrument deployment. We also discuss the development of the Chromatic Mineral Identification and Surface Texture (CMIST) instrument at the NASA Goddard Space Flight Center (GSFC). Testing is taking place in conjunction with the RIS4E (Remote, In Situ, and Synchrotron Studies for Science and Exploration) SSERVI (Solar System Exploration and Research Virtual Institute) team activities, including field testing at Kilauea Volcano, HI.
Mercury (Hg) species distribution patterns among ecosystem compartments in the Everglades were analyzed at the landscape level in order to explore the implications of Hg distribution for Hg bioaccumulation, and to investigate major biogeochemical processes that are pertinent to t...
Software reliability through fault-avoidance and fault-tolerance
NASA Technical Reports Server (NTRS)
Vouk, Mladen A.; Mcallister, David F.
1992-01-01
Accomplishments in the following research areas are summarized: structure based testing, reliability growth, and design testability with risk evaluation; reliability growth models and software risk management; and evaluation of consensus voting, consensus recovery block, and acceptance voting. Four papers generated during the reporting period are included as appendices.
From Cookbook to Experimental Design
ERIC Educational Resources Information Center
Flannagan, Jenny Sue; McMillan, Rachel
2009-01-01
Developing expertise, whether from cook to chef or from student to scientist, occurs over time and requires encouragement, guidance, and support. One key goal of an elementary science program should be to move students toward expertise in their ability to design investigative questions. The ability to design a testable question is difficult for…
Mentoring: A Typology of Costs for Higher Education Faculty
ERIC Educational Resources Information Center
Lunsford, Laura G.; Baker, Vicki; Griffin, Kimberly A.; Johnson, W. Brad
2013-01-01
In this theoretical paper, we apply a social exchange framework to understand mentors' negative experiences. We propose a typology of costs, categorized according to psychosocial and career mentoring functions. Our typology generates testable research propositions. Psychosocial costs of mentoring are burnout, anger, and grief or loss. Career…
Instructional Design: Science, Technology, Both, Neither
ERIC Educational Resources Information Center
Gropper, George L.
2017-01-01
What would it take for instructional design to qualify as a bona fide applied discipline? First and foremost, a fundamental requirement is a testable and tested theoretical base; untested rationales remain in limbo until verified. Secondly, the discipline's applied prescriptions must be demonstrably traceable to the theoretical base once it is…
ERIC Educational Resources Information Center
Tweney, Ryan D.
Drawing parallels with critical thinking and creative thinking, this document describes some ways that scientific thinking is utilized. Cognitive approaches to scientific thinking are discussed, and it is argued that all science involves an attempt to construct a testable mental model of some aspect of reality. The role of mental models is…
ERIC Educational Resources Information Center
Wallace, Robert B.
1994-01-01
Health survey research assesses the health of individuals in a population. Measures include prevalence/incidence of diseases, signs/symptoms, functional states, and health services utilization. Although assessing individual biologic robustness can be problematic, testable approaches do exist. Characteristics of health of populations/communities, not…
Equilibration: Developing the Hard Core of the Piagetian Research Program.
ERIC Educational Resources Information Center
Rowell, J.A.
1983-01-01
Argues that the status of the concept of equilibration is clarified by considering Piagetian theory as a research program in the sense elaborated in 1974 by Lakatos. A pilot study was made to examine the precision and testability of equilibration in Piaget's 1977 model. (Author/RH)
Links between Parents' Epistemological Stance and Children's Evidence Talk
ERIC Educational Resources Information Center
Luce, Megan R.; Callanan, Maureen A.; Smilovic, Sarah
2013-01-01
Recent experimental research highlights young children's selectivity in learning from others. Little is known, however, about the patterns of information that children actually encounter in conversations with adults. This study investigated variation in parents' tendency to focus on testable evidence as a way to answer science-related questions…
The Simple Theory of Public Library Services.
ERIC Educational Resources Information Center
Newhouse, Joseph P.
A simple normative theory applicable to public library services was developed as a tool to aid libraries in answering the question: which books should be bought by the library? Although developed for normative purposes, the theory generates testable predictions. It is relevant to measuring benefits from services which are provided publicly because…
NASA Astrophysics Data System (ADS)
Sabater, Bartolomé; Marín, Dolores
2018-03-01
The minimum rate principle is applied to the chemical reaction in a steady-state open cell system where, under constant supply of the glucose precursor, reference to time or to glucose consumption does not affect the conclusions.
Tracking the "Lizardman": Writing Rotten to Write Well.
ERIC Educational Resources Information Center
Polette, Keith
1995-01-01
Suggests that students can improve their writing by being instructed on how to write badly. Applies the criteria of testability, tunnel-vision, excessive vagueness, flying in the face of established fact, and hazy authority to tabloid newspaper stories. Discusses how students can write their own "rotten" tabloid stories by taking these…
Researching the Study Abroad Experience
ERIC Educational Resources Information Center
McLeod, Mark; Wainwright, Philip
2009-01-01
The authors propose a paradigm for rigorous scientific assessment of study abroad programs, with the focus being on how study abroad experiences affect psychological constructs as opposed to looking solely at study-abroad-related outcomes. Social learning theory is used as a possible theoretical basis for making testable hypotheses and guiding…
Toward a Testable Developmental Model of Pedophilia: The Development of Erotic Age Preference.
ERIC Educational Resources Information Center
Freund, Kurt; Kuban, Michael
1993-01-01
Analysis of retrospective self-reports about childhood curiosity to see persons in the nude, with heterosexual and homosexual pedophiles, gynephiles, and androphiles, suggests that establishment of erotic sex preference preceded that of age preference, and that a greater proportion of pedophiles than gynephiles or androphiles remembered childhood…
Latitudinal shifts of introduced species: possible causes and implications
Qinfeng Guo; Dov F. Sax; Hong Qian; Regan Early
2012-01-01
This study aims to document shifts in the latitudinal distributions of non-native species relative to their own native distributions and to discuss possible causes and implications of these shifts. We used published and newly compiled data on intercontinentally introduced birds, mammals and plants. We found strong correlations between the latitudinal distributions...
Implications of long tails in the distribution of mutant effects
NASA Astrophysics Data System (ADS)
Waxman, D.; Feng, J.
2005-07-01
Long-tailed distributions possess an infinite variance, yet a finite sample that is drawn from such a distribution has a finite variance. In this work we consider a model of a population subject to mutation, selection and drift. We investigate the implications of a long-tailed distribution of mutant allelic effects on the distribution of genotypic effects in a model with a continuum of allelic effects. While the analysis is confined to asexual populations, it does also have implications for sexual populations. We obtain analytical results for a selectively neutral population as well as one subject to selection. We supplement these analytical results with numerical simulations, to take into account genetic drift. We find that a long-tailed distribution of mutant effects may affect both the equilibrium and the evolutionary adaptive behaviour of a population.
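The opening observation (an infinite-variance distribution still yields a finite variance in any finite sample) is easy to demonstrate. A sketch using the standard Cauchy distribution as a stand-in for a long-tailed distribution of mutant effects; the paper's actual distribution is not reproduced here.

```python
import math
import random
import statistics

random.seed(1)

def cauchy():
    # Inverse-CDF sampling: tan(pi * (U - 1/2)) is standard Cauchy,
    # a distribution whose variance is undefined (infinite).
    return math.tan(math.pi * (random.random() - 0.5))

sample = [cauchy() for _ in range(10_000)]
v = statistics.variance(sample)   # sample variance of a finite draw
print(math.isfinite(v) and v > 0) # finite, despite the infinite population variance
```

Repeating the draw shows the flip side: the sample variance is finite each time but fluctuates wildly between runs, which is exactly what makes long-tailed mutational effects awkward for moment-based population-genetic arguments.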
A normative inference approach for optimal sample sizes in decisions from experience
Ostwald, Dirk; Starke, Ludger; Hertwig, Ralph
2015-01-01
“Decisions from experience” (DFE) refers to a body of work that emerged in research on behavioral decision making over the last decade. One of the major experimental paradigms employed to study experience-based choice is the “sampling paradigm,” which serves as a model of decision making under limited knowledge about the statistical structure of the world. In this paradigm respondents are presented with two payoff distributions, which, in contrast to standard approaches in behavioral economics, are specified not in terms of explicit outcome-probability information, but by the opportunity to sample outcomes from each distribution without economic consequences. Participants are encouraged to explore the distributions until they feel confident enough to decide which distribution they would prefer to draw from in a final trial involving real monetary payoffs. One commonly employed measure to characterize the behavior of participants in the sampling paradigm is the sample size, that is, the number of outcome draws which participants choose to obtain from each distribution prior to terminating sampling. A natural question that arises in this context concerns the “optimal” sample size, which could be used as a normative benchmark to evaluate human sampling behavior in DFE. In this theoretical study, we relate the DFE sampling paradigm to the classical statistical decision-theoretic literature and, under a probabilistic inference assumption, evaluate optimal sample sizes for DFE. In our treatment we go beyond analytically established results by showing how the classical statistical decision-theoretic framework can be used to derive optimal sample sizes under arbitrary, but numerically evaluable, constraints. Finally, we critically evaluate the value of deriving optimal sample sizes under this framework as testable predictions for the experimental study of sampling behavior in DFE. PMID:26441720
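The quantity an optimal-sample-size analysis trades off against sampling costs can be simulated directly. A hypothetical sketch (Bernoulli payoffs and a pick-the-higher-sample-mean rule, not the study's model): estimate the probability of identifying the truly better payoff distribution as a function of the number of free draws per option.

```python
import random

random.seed(0)

def p_correct(p_good, p_bad, n, trials=2000):
    """Monte Carlo estimate of the probability that the option with the
    higher sample mean after n draws each is the truly better one
    (ties credited as half a success)."""
    wins = 0.0
    for _ in range(trials):
        good = sum(random.random() < p_good for _ in range(n))
        bad = sum(random.random() < p_bad for _ in range(n))
        if good > bad:
            wins += 1.0
        elif good == bad:
            wins += 0.5
    return wins / trials

small = p_correct(0.6, 0.4, n=2)
large = p_correct(0.6, 0.4, n=20)
print(small < large)  # more sampling improves discrimination
```

An "optimal" n in the paper's sense would weigh curves like this against the (opportunity or effort) cost of each additional draw, so the optimum is finite rather than "sample forever".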
The Implications of Lowering the Cost to Access Space on Airpower
2017-06-01
The Implications of Lowering the Cost to Access Space on Airpower. Major Gabe Arrington, School of Advanced Air and Space Studies…2017. DISTRIBUTION A: Approved for public release: distribution unlimited. Disclaimer: Opinions, conclusions, and recommendations…implications that lowering the cost to access space will have on Airpower. The research conducted used predominantly qualitative research techniques to
A THEORY OF WORK ADJUSTMENT. MINNESOTA STUDIES IN VOCATIONAL REHABILITATION, 15.
ERIC Educational Resources Information Center
Dawis, Rene V.; And Others
A theory of work adjustment which may contribute to the development of a science of the psychology of occupational behavior is proposed. It builds on the basic psychological concepts of stimulus, response, and reinforcement, and provides a research paradigm for generating testable hypotheses. It was derived from early research efforts of the…
ERIC Educational Resources Information Center
Maul, Andrew
2015-01-01
Briggs and Peck [in "Using Learning Progressions to Design Vertical Scales That Support Coherent Inferences about Student Growth"] call for greater care in the conceptualization of the target attributes of students, or "what it is that is growing from grade to grade." In particular, they argue that learning progressions can…
Performance Models of Testability.
1984-08-01
4.1.17 Cost of Isolating Component/Part (CPI). The cost of isolating components or parts at the depot is CPI = n1(HDC)(TPI)(NPI), where TPI = average… [remainder is garbled flowchart residue covering testing a component, detecting a failure, the cost of isolating a component, the cost of component removal and replacement, and the expected cost of component removal and…]
There's No Such Thing as Value-Free Science.
ERIC Educational Resources Information Center
Makosky, Vivian Parker
This paper is based on the view that, although scientists rely on research values such as predictive accuracy and testability, scientific research is still subject to the unscientific values, attitudes, and emotions of the scientists. It is noted that undergraduate students are likely not to think critically about the science they encounter. A…
Modules, Theories, or Islands of Expertise? Domain Specificity in Socialization
ERIC Educational Resources Information Center
Gelman, Susan A.
2010-01-01
The domain-specific approach to socialization processes presented by J. E. Grusec and M. Davidov (this issue) provides a compelling framework for integrating and interpreting a large and disparate body of research findings, and it generates a wealth of testable new hypotheses. At the same time, it introduces core theoretical questions regarding…
Phases in the Adoption of Educational Innovations in Teacher Training Institutions.
ERIC Educational Resources Information Center
Hall, Gene E.
An attempt has been made to categorize phenomena observed as 20 teacher training institutions have adopted innovations and to extrapolate from these findings key concepts and principles that could form the basis for developing empirically testable hypotheses and could be of some immediate utility to those involved in innovation adoption. The…
Twelve testable hypotheses on the geobiology of weathering
S.L. Brantley; J.P. Megonigal; F.N. Scatena; Z. Balogh-Brunstad; R.T. Barnes; M.A. Bruns; P. van Cappelen; K. Dontsova; H.E. Hartnett; A.S. Hartshorn; A. Heimsath; E. Herndon; L. Jin; C.K. Keller; J.R. Leake; W.H. McDowell; F.C. Meinzer; T.J. Mozdzer; S. Petsch; J. Pett-Ridge; K.S. Pretziger; P.A. Raymond; C.S. Riebe; K. Shumaker; A. Sutton-Grier; R. Walter; K. Yoo
2011-01-01
Critical Zone (CZ) research investigates the chemical, physical, and biological processes that modulate the Earth's surface. Here, we advance 12 hypotheses that must be tested to improve our understanding of the CZ: (1) Solar-to-chemical conversion of energy by plants regulates flows of carbon, water, and nutrients through plant-microbe soil networks, thereby...
ERIC Educational Resources Information Center
Kirch, Susan A.; Stetsenko, Anna
2012-01-01
What do people mean when they say they "know" something in science? It usually means they did an investigation and expended considerable intellectual effort to build a useful explanatory model. It means they are confident about an explanation, believe others should trust what they say, and believe that their claim is testable. It means they can…
ERIC Educational Resources Information Center
Martin-Dunlop, Catherine S.
2013-01-01
This study investigated prospective elementary teachers' understandings of the nature of science and explored associations with their guided-inquiry science learning environment. Over 500 female students completed the Nature of Scientific Knowledge Survey (NSKS), although only four scales were analyzed: Creative, Testable, Amoral, and Unified. The…
Forensic Impact of the Child Sexual Abuse Medical Examination.
ERIC Educational Resources Information Center
Myers, John E. B.
1998-01-01
This commentary on an article (EC 619 279) about research issues at the interface of medicine and law concerning medical evaluation for child sexual abuse focuses on empirically testable questions: (1) the medical history--its accuracy, interviewing issues, and elicitation and preservation of verbal evidence of abuse; and, (2) expert testimony.…
Two New Empirically Derived Reasons To Use the Assessment of Basic Learning Abilities.
ERIC Educational Resources Information Center
Richards, David F.; Williams, W. Larry; Follette, William C.
2002-01-01
Scores on the Assessment of Basic Learning Abilities (ABLA), Vineland Adaptive Behavior Scales, and the Wechsler Adult Intelligence Scale-Revised (WAIS-R) were obtained for 30 adults with mental retardation. Correlations between the Vineland domains and ABLA were all significant. No participants performing below ABLA Level 6 were testable on the…
A Cognitive Approach to Brailling Errors
ERIC Educational Resources Information Center
Wells-Jensen, Sheri; Schwartz, Aaron; Gosche, Bradley
2007-01-01
This article analyzes a corpus of 1,600 brailling errors made by one expert braillist. It presents a testable model of braille writing and shows that the subject braillist stores standard braille contractions as part of the orthographic representation of words, rather than imposing contractions on a serially ordered string of letters. (Contains 1…
USDA-ARS?s Scientific Manuscript database
Progress in studying the biology of Trichinella spp. was greatly advanced with the publication and analysis of the draft genome sequence of T. spiralis. Those data provide a basis for constructing testable hypotheses concerning parasite physiology, immunology, and genetics. They also provide tools...
Thinking about Evolution: Combinatorial Play as a Strategy for Exercising Scientific Creativity
ERIC Educational Resources Information Center
Wingate, Richard J. T.
2011-01-01
An enduring focus in education on how scientists formulate experiments and "do science" in the laboratory has excluded a vital element of scientific practice: the creative and imaginative thinking that generates models and testable hypotheses. In this case study, final-year biomedical sciences university students were invited to create and justify…
Purposeful Instruction: Mixing up the "I," "We," and "You"
ERIC Educational Resources Information Center
Grant, Maria; Lapp, Diane; Fisher, Douglas; Johnson, Kelly; Frey, Nancy
2012-01-01
This article discusses the flexible nature of the gradual release of responsibility (GRR) as a frame for inquiry-based science instruction. Given the mandate for the use of text-supported learning (Common Core Standards), the GRR can be used to allow students to learn as scientists as they collaboratively develop testable questions and experiments…
The use of models to predict potential contamination aboard orbital vehicles
NASA Technical Reports Server (NTRS)
Boraas, Martin E.; Seale, Dianne B.
1989-01-01
A model of fungal growth on air-exposed, nonnutritive solid surfaces, developed for use aboard orbital vehicles, is presented. A unique feature of this testable model is that the development of a fungal mycelium can facilitate its own growth by condensation of water vapor from its environment directly onto fungal hyphae. The fungal growth rate is limited by the rate of supply of volatile nutrients, and fungal biomass is limited by either the supply of nonvolatile nutrients or by metabolic loss processes. The model discussed is structurally simple, but its dynamics can be quite complex. Biofilm accumulation can vary from a simple linear increase to sustained exponential growth, depending on the values of the environmental variables and model parameters. The results of the model are consistent with data from aquatic biofilm studies, insofar as the two types of systems are comparable. It is shown that the model presented is experimentally testable and provides a platform for the interpretation of observational data that may be directly relevant to the question of growth of organisms aboard the proposed Space Station.
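The claim that accumulation can range from linear to exponential can be illustrated with a toy growth law (a hypothetical sketch, not the authors' equations): biomass grows at an intrinsic rate r until the nutrient supply rate s becomes limiting, minus a metabolic loss term.

```python
def biomass_series(r, s, m, b0=1e-3, dt=0.01, steps=2000):
    """Euler integration of a toy accumulation law:
    dB/dt = min(r*B, s) - m*B
    i.e. growth limited by either the intrinsic rate r or the
    nutrient supply rate s, with metabolic loss m*B."""
    b = b0
    out = [b]
    for _ in range(steps):
        b += dt * (min(r * b, s) - m * b)
        out.append(b)
    return out

# Supply effectively unlimited -> sustained exponential growth
exp_like = biomass_series(r=1.0, s=1e9, m=0.0)
# Supply-limited almost from the start -> near-linear accumulation
lin_like = biomass_series(r=1.0, s=0.002, m=0.0)
print(exp_like[-1] > 100 * lin_like[-1])
```

The two regimes fall out of a single parameter change, mirroring the abstract's point that linear versus exponential biofilm accumulation depends on environmental variables and model parameters rather than on different models.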
What can we learn from a two-brain approach to verbal interaction?
Schoot, Lotte; Hagoort, Peter; Segaert, Katrien
2016-09-01
Verbal interaction is one of the most frequent social interactions humans encounter on a daily basis. In the current paper, we zoom in on what the multi-brain approach has contributed, and can contribute in the future, to our understanding of the neural mechanisms supporting verbal interaction. Indeed, since verbal interaction can only exist between individuals, it seems intuitive to focus analyses on inter-individual neural markers, i.e. between-brain neural coupling. To date, however, there is a severe lack of theoretically-driven, testable hypotheses about what between-brain neural coupling actually reflects. In this paper, we develop a testable hypothesis in which between-pair variation in between-brain neural coupling is of key importance. Based on theoretical frameworks and empirical data, we argue that the level of between-brain neural coupling reflects speaker-listener alignment at different levels of linguistic and extra-linguistic representation. We discuss the possibility that between-brain neural coupling could inform us about the highest level of inter-speaker alignment: mutual understanding. Copyright © 2016 Elsevier Ltd. All rights reserved.
Active processes make mixed lipid membranes either flat or crumpled
NASA Astrophysics Data System (ADS)
Banerjee, Tirthankar; Basu, Abhik
2018-01-01
Whether live cell membranes show miscibility phase transitions (MPTs), and if so, how they fluctuate near the transitions, remain outstanding unresolved issues in physics and biology alike. Motivated by these questions, we construct a generic hydrodynamic theory for lipid membranes that are active due, for instance, to the molecular motors in the surrounding cytoskeleton, or to active protein components in the membrane itself. We use this to uncover a direct correspondence between membrane fluctuations and MPTs. Several testable predictions are made: (i) generic active stiffening with orientational long range order (flat membrane) or softening with crumpling of the membrane, controlled by the active tension and (ii) for mixed lipid membranes, capturing the nature of putative MPTs by measuring the membrane conformation fluctuations. Possibilities of both first and second order MPTs in mixed active membranes are argued for. Near second order MPTs, active stiffening (softening) manifests as a super-stiff (super-soft) membrane. Our predictions are testable in a variety of in vitro systems, e.g. live cytoskeletal extracts deposited on liposomes and lipid membranes containing active proteins embedded in a passive fluid.
Online testable concept maps: benefits for learning about the pathogenesis of disease.
Ho, Veronica; Kumar, Rakesh K; Velan, Gary
2014-07-01
Concept maps have been used to promote meaningful learning and critical thinking. Although these are crucially important in all disciplines, evidence for the benefits of concept mapping for learning in medicine is limited. We performed a randomised crossover study to assess the benefits of online testable concept maps for learning in pathology by volunteer junior medical students. Participants (n = 65) were randomly allocated to either of two groups with equivalent mean prior academic performance, in which they were given access to either online maps or existing online resources for a 2-week block on renal disease. Groups then crossed over for a 2-week block on hepatic disease. Outcomes were assessed using timed online quizzes, which included questions unrelated to topics in the pathogenesis maps as an internal control. Questionnaires were administered to evaluate students' acceptance of the maps. In both blocks, the group with access to pathogenesis maps achieved significantly higher average scores than the control group on quiz questions related to topics covered by the maps (Block 1: p < 0.001, Cohen's d = 0.9; Block 2: p = 0.008, Cohen's d = 0.7). However, mean scores on unrelated questions did not differ significantly between the groups. In a third block on pancreatic disease, both groups received pathogenesis maps and collectively performed significantly better on quiz topics related to the maps than on unrelated topics (p < 0.01, Cohen's d = 0.5). Regression analysis revealed that access to pathogenesis maps was the dominant contributor to variance in performance on map-related quiz questions. Responses to questionnaire items on pathogenesis maps were overwhelmingly positive in both groups. These results indicate that online testable pathogenesis maps are well accepted and can improve learning of concepts in pathology by medical students. © 2014 John Wiley & Sons Ltd.
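The effect sizes reported above (Cohen's d) are standardized mean differences computed with a pooled standard deviation. A minimal sketch, using invented quiz scores rather than the study's data:

```python
from statistics import mean, stdev

def cohens_d(group_a, group_b):
    """Cohen's d: standardized mean difference using the pooled SD."""
    na, nb = len(group_a), len(group_b)
    # Pooled variance from the two Bessel-corrected sample variances
    pooled_var = ((na - 1) * stdev(group_a) ** 2 +
                  (nb - 1) * stdev(group_b) ** 2) / (na + nb - 2)
    return (mean(group_a) - mean(group_b)) / pooled_var ** 0.5

# Hypothetical quiz scores for a concept-map group and a control group
maps = [8, 9, 7, 9, 8, 10]
control = [7, 6, 8, 7, 6, 7]
d = cohens_d(maps, control)
```

By the usual rule of thumb, d around 0.5 is a medium effect and d around 0.8 or above a large one, which is why the reported values of 0.7-0.9 are notable.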
Color Vision Deficiency in Preschool Children
Xie, John Z.; Tarczy-Hornoch, Kristina; Lin, Jesse; Cotter, Susan A.; Torres, Mina; Varma, Rohit
2016-01-01
Purpose To determine the sex- and ethnicity-specific prevalence of color vision deficiency (CVD) in black, Asian, Hispanic, and non-Hispanic white preschool children. Design Population-based, cross-sectional study. Participants The Multi-Ethnic Pediatric Eye Disease Study is a population-based evaluation of the prevalence of vision disorders in children in Southern California. A total of 5960 subjects 30 to 72 months of age were recruited for the study, of whom 4177 were able to complete color vision testing (1265 black, 812 Asian, 1280 Hispanic, and 820 non-Hispanic white). Methods Color vision testing was performed using Color Vision Testing Made Easy color plates (Home Vision Care, Gulf Breeze, FL), and diagnostic confirmatory testing was performed using the Waggoner HRR Diagnostic Test color plates (Home Vision Care). Main Outcome Measures Testability of color vision in preschool children between 30 and 72 months of age and prevalence of CVD stratified by age, sex, and ethnicity. Results Testability was 17% in children younger than 37 months of age, increasing to 57% in children 37 to 48 months of age, 89% in children 49 to 60 months of age, and 98% in children 61 to 72 months of age. The prevalence of CVD among boys was 1.4% for black, 3.1% for Asian, 2.6% for Hispanic, and 5.6% for non-Hispanic white children; the prevalence in girls was 0.0% to 0.5% for all ethnicities. The ethnic difference in CVD was statistically significant between black and non-Hispanic white children (P = 0.0003) and between Hispanic and non-Hispanic white children (P = 0.02). In boys, most CVD cases were either deutan (51%) or protan (34%); 32% were classified as mild, 15% as moderate, and 41% as severe. Conclusions Testability for CVD in preschool children is high by 4 years of age. The prevalence of CVD in preschool boys varies by ethnicity, with the highest prevalence in non-Hispanic white and lowest in black children. PMID:24702753
Exploring Explanations of Subglacial Bedform Sizes Using Statistical Models.
Hillier, John K; Kougioumtzoglou, Ioannis A; Stokes, Chris R; Smith, Michael J; Clark, Chris D; Spagnolo, Matteo S
2016-01-01
Sediments beneath modern ice sheets exert a key control on their flow, but are largely inaccessible except through geophysics or boreholes. In contrast, palaeo-ice sheet beds are accessible, and typically characterised by numerous bedforms. However, the interaction between bedforms and ice flow is poorly constrained and it is not clear how bedform sizes might reflect ice flow conditions. To better understand this link we present a first exploration of a variety of statistical models to explain the size distribution of some common subglacial bedforms (i.e., drumlins, ribbed moraine, MSGL). By considering a range of models, constructed to reflect key aspects of the physical processes, it is possible to infer that the size distributions are most effectively explained when the dynamics of ice-water-sediment interaction associated with bedform growth is fundamentally random. A 'stochastic instability' (SI) model, which integrates random bedform growth and shrinking through time with exponential growth, is preferred and is consistent with other observations of palaeo-bedforms and geophysical surveys of active ice sheets. Furthermore, we give a proof-of-concept demonstration that our statistical approach can bridge the gap between geomorphological observations and physical models, directly linking measurable size-frequency parameters to properties of ice sheet flow (e.g., ice velocity). Moreover, statistically developing existing models as proposed allows quantitative predictions to be made about sizes, making the models testable; a first illustration of this is given for a hypothesised repeat geophysical survey of bedforms under active ice. Thus, we further demonstrate the potential of size-frequency distributions of subglacial bedforms to assist the elucidation of subglacial processes and better constrain ice sheet models.
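The core idea of a stochastic growth model of this kind, that sizes emerge from random growth and shrinkage rather than deterministic dynamics, can be illustrated with a toy simulation. This is not the authors' SI model; the rates, step counts, and floor value below are invented for illustration:

```python
import random

def simulate_sizes(n_bedforms=2000, n_steps=500, growth=0.02, noise=0.1, seed=1):
    """Toy model: each bedform's height undergoes multiplicative random
    growth/shrinkage around a weak deterministic growth trend."""
    rng = random.Random(seed)
    sizes = [1.0] * n_bedforms
    for _ in range(n_steps):
        # Each step multiplies height by (1 + trend + noise), floored above zero
        sizes = [max(0.1, h * (1 + growth + rng.gauss(0, noise))) for h in sizes]
    return sizes

sizes = simulate_sizes()
# Multiplicative randomness yields a strongly right-skewed size distribution,
# so the mean sits well above the median
mean_h = sum(sizes) / len(sizes)
median_h = sorted(sizes)[len(sizes) // 2]
```

The right-skewed (approximately log-normal) outcome of such multiplicative-random models is the kind of size-frequency signature that can be compared against mapped bedform populations.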
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lopez, Eric D.; Fortney, Jonathan J.
2013-10-10
We use models of coupled thermal evolution and photo-evaporative mass loss to understand the formation and evolution of the Kepler-36 system. We show that the large contrast in mean planetary density observed by Carter et al. can be explained as a natural consequence of photo-evaporation from planets that formed with similar initial compositions. However, rather than being due to differences in XUV irradiation between the planets, we find that this contrast is due to the difference in the masses of the planets' rock/iron cores and the impact that this has on mass-loss evolution. We explore in detail how our coupled models depend on irradiation, mass, age, composition, and the efficiency of mass loss. Based on fits to large numbers of coupled evolution and mass-loss runs, we provide analytic fits to understand threshold XUV fluxes for significant atmospheric loss, as a function of core mass and mass-loss efficiency. Finally we discuss these results in the context of recent studies of the radius distribution of Kepler candidates. Using our parameter study, we make testable predictions for the frequency of sub-Neptune-sized planets. We show that 1.8-4.0 R{sub ⊕} planets should become significantly less common on orbits within 10 days and discuss the possibility of a narrow 'occurrence valley' in the radius-flux distribution. Moreover, we describe how photo-evaporation provides a natural explanation for the recent observations of Ciardi et al. that inner planets are preferentially smaller within the systems.
Dynamical consequences of mantle heterogeneity in two-phase models of mid-ocean ridges
NASA Astrophysics Data System (ADS)
Katz, R. F.
2010-12-01
The mid-ocean ridge system, over 50,000 km in length, samples the magmatic products of a large swath of the asthenosphere. It provides our best means to assess the heterogeneity structure of the upper mantle. Interpretation of the diverse array of observations of MOR petrology, geochemistry, tomography, etc requires models that can map heterogeneity structure onto predictions testable by comparison with these observations. I report on progress to this end; in particular, I describe numerical models of coupled magma/mantle dynamics at mid-ocean ridges [1,2]. These models incorporate heterogeneity in terms of a simple, two-component thermochemical system with specified amplitude and spatial distribution. They indicate that mantle heterogeneity has significant fluid-dynamical consequences for both mantle and magmatic flow. Models show that the distribution of enrichment can lead to asymmetry in the strength of upwelling across the ridge-axis and channelised magmatic transport to the axis. Furthermore, heterogeneity can cause off-axis upwelling of partially molten diapirs, trapping of enriched melts off-axis, and re-fertilization of the mantle by pooled and refrozen melts. Predicted consequences of geochemical heterogeneity may also be considered. References: [1] Katz, RF, (2008); Magma dynamics with the Enthalpy Method: Benchmark Solutions and Magmatic Focusing at Mid-ocean Ridges. Journal of Petrology, doi: 10.1093/petrology/egn058. [2] Katz RF, (2010); Porosity-driven convection and asymmetry beneath mid-ocean ridges. Submitted to G3.
The epipelagic fish community of Beaufort Sea coastal waters, Alaska
Jarvela, L.E.; Thorsteinson, L.K.
1999-01-01
A three-year study of epipelagic fishes inhabiting Beaufort Sea coastal waters in Alaska documented spatial and temporal patterns in fish distribution and abundance and examined their relationships to thermohaline features during summer. Significant interannual, seasonal, and geographical differences in surface water temperatures and salinities were observed. In 1990, sea ice was absent and marine conditions prevailed, whereas in 1988 and 1991, heavy pack ice was present and the dissolution of the brackish water mass along the coast proceeded more slowly. Arctic cod, capelin, and liparids were the most abundant marine fishes in the catches, while arctic cisco was the only abundant diadromous freshwater species. Age-0 arctic cod were exceptionally abundant and large in 1990, while age-0 capelin dominated in the other years. The alternating numerical dominances of arctic cod and age-0 capelin may represent differing species' responses to wind-driven oceanographic processes affecting growth and survival. The only captures of age-0 arctic cisco occurred during 1990. Catch patterns indicate they use a broad coastal migratory corridor and tolerate high salinities. As in the oceanographic data, geographical and temporal patterns were apparent in the fish catch data, but in most cases these patterns were not statistically testable because of excessive zero catches. The negative binomial distribution appeared to be a suitable statistical descriptor of the aggregated catch patterns for the more common species.
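The negative binomial is the standard descriptor for aggregated, zero-heavy count data like these catches, because its variance exceeds its mean. A method-of-moments fit can be sketched as follows; the per-haul catch numbers are invented, not the study's data:

```python
from statistics import mean, pvariance

def neg_binomial_moments(counts):
    """Method-of-moments estimates for the negative binomial:
    with sample mean m and variance v > m, the dispersion (size)
    parameter is k = m^2 / (v - m) and the success probability p = m / v.
    Small k indicates strong aggregation (patchiness)."""
    m, v = mean(counts), pvariance(counts)
    if v <= m:
        raise ValueError("no overdispersion: a Poisson model may suffice")
    k = m * m / (v - m)
    p = m / v
    return k, p

# Hypothetical per-haul catches of a patchily distributed species:
# mostly zeros with a few large hauls
catches = [0, 0, 0, 1, 0, 0, 12, 0, 3, 0, 0, 27, 0, 1, 0, 5]
k, p = neg_binomial_moments(catches)
```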
Evolution of the cerebellum as a neuronal machine for Bayesian state estimation
NASA Astrophysics Data System (ADS)
Paulin, M. G.
2005-09-01
The cerebellum evolved in association with the electric sense and vestibular sense of the earliest vertebrates. Accurate information provided by these sensory systems would have been essential for precise control of orienting behavior in predation. A simple model shows that individual spikes in electrosensory primary afferent neurons can be interpreted as measurements of prey location. Using this result, I construct a computational neural model in which the spatial distribution of spikes in a secondary electrosensory map forms a Monte Carlo approximation to the Bayesian posterior distribution of prey locations given the sense data. The neural circuit that emerges naturally to perform this task resembles the cerebellar-like hindbrain electrosensory filtering circuitry of sharks and other electrosensory vertebrates. The optimal filtering mechanism can be extended to handle dynamical targets observed from a dynamical platform; that is, to construct an optimal dynamical state estimator using spiking neurons. This may provide a generic model of cerebellar computation. Vertebrate motion-sensing neurons have specific fractional-order dynamical characteristics that allow Bayesian state estimators to be implemented elegantly and efficiently, using simple operations with asynchronous pulses, i.e. spikes. The computational neural models described in this paper represent a novel kind of particle filter, using spikes as particles. The models are specific and make testable predictions about computational mechanisms in cerebellar circuitry, while providing a plausible explanation of cerebellar contributions to aspects of motor control, perception and cognition.
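The "spikes as particles" idea corresponds computationally to a bootstrap particle filter: a population of samples approximates the Bayesian posterior over target states, diffused by a process model and reweighted by each observation. The sketch below is a generic 1-D filter, not Paulin's neural model; the dynamics and noise levels are invented:

```python
import math
import random

def particle_filter(observations, n_particles=1000, process_sd=0.3, obs_sd=0.5, seed=7):
    """Bootstrap particle filter for a 1-D random-walk target.
    Each particle is a hypothesized target position (one 'spike as a sample')."""
    rng = random.Random(seed)
    particles = [rng.gauss(0.0, 1.0) for _ in range(n_particles)]
    estimates = []
    for z in observations:
        # Predict: diffuse particles under the random-walk process model
        particles = [x + rng.gauss(0.0, process_sd) for x in particles]
        # Weight: Gaussian likelihood of the observation given each particle
        weights = [math.exp(-0.5 * ((z - x) / obs_sd) ** 2) for x in particles]
        total = sum(weights)
        weights = [w / total for w in weights]
        # Estimate: posterior mean (weighted average of particles)
        estimates.append(sum(w * x for w, x in zip(weights, particles)))
        # Resample: draw a new population proportional to weight
        particles = rng.choices(particles, weights=weights, k=n_particles)
    return estimates

# Noisy observations of a target sitting near position 2.0 (invented)
obs = [2.1, 1.9, 2.2, 2.0, 1.8, 2.05]
est = particle_filter(obs)
```

After a few updates the posterior mean settles near the true position, which is the behavior the paper attributes to the cerebellar-like filtering circuitry.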
Antenna Mechanism of Length Control of Actin Cables
Mohapatra, Lishibanya; Goode, Bruce L.; Kondev, Jane
2015-01-01
Actin cables are linear cytoskeletal structures that serve as tracks for myosin-based intracellular transport of vesicles and organelles in both yeast and mammalian cells. In a yeast cell undergoing budding, cables are in constant dynamic turnover yet some cables grow from the bud neck toward the back of the mother cell until their length roughly equals the diameter of the mother cell. This raises the question: how is the length of these cables controlled? Here we describe a novel molecular mechanism for cable length control inspired by recent experimental observations in cells. This “antenna mechanism” involves three key proteins: formins, which polymerize actin, Smy1 proteins, which bind formins and inhibit actin polymerization, and myosin motors, which deliver Smy1 to formins, leading to a length-dependent actin polymerization rate. We compute the probability distribution of cable lengths as a function of several experimentally tuneable parameters such as the formin-binding affinity of Smy1 and the concentration of myosin motors delivering Smy1. These results provide testable predictions of the antenna mechanism of actin-cable length control. PMID:26107518
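The heart of the antenna mechanism is a polymerization rate that falls with cable length (more motor-delivered Smy1 reaching the formin) balanced against disassembly, which produces a peaked length distribution rather than unbounded growth. A toy birth-death simulation with invented rates, not the paper's fitted parameters, illustrates this:

```python
import random

def cable_length_samples(n_samples=500, n_steps=2000, k_on=10.0, inhib=0.5, k_off=2.0, seed=3):
    """Toy antenna mechanism: the growth rate k_on / (1 + inhib * L)
    decreases with length L (standing in for Smy1 inhibition of formin),
    while the disassembly rate k_off is constant. Lengths therefore
    fluctuate around the steady state where the two rates balance."""
    rng = random.Random(seed)
    lengths = []
    for _ in range(n_samples):
        L = 0
        for _ in range(n_steps):
            grow = k_on / (1.0 + inhib * L)
            # Choose the next event in proportion to the two rates
            if rng.random() < grow / (grow + k_off):
                L += 1
            elif L > 0:
                L -= 1
        lengths.append(L)
    return lengths

lengths = cable_length_samples()
mean_length = sum(lengths) / len(lengths)
```

With these rates the balance point is L* = (k_on / k_off - 1) / inhib = 8 subunits, and the sampled distribution peaks around that value; in the paper's framework the tuneable parameters (Smy1 affinity, myosin concentration) shift this peak.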
Coexistence trend contingent to Mediterranean oaks with different leaf habits.
Di Paola, Arianna; Paquette, Alain; Trabucco, Antonio; Mereu, Simone; Valentini, Riccardo; Paparella, Francesco
2017-05-01
In a previous work we developed a mathematical model to explain the co-occurrence of evergreen and deciduous oak groups in the Mediterranean region, regarded as one of the distinctive features of Mediterranean biodiversity. The mathematical analysis showed that a stabilizing mechanism resulting from niche difference (i.e. different water use and water stress tolerance) between groups allows their coexistence at intermediate values of suitable soil water content. A simple formal derivation of the model expresses this hypothesis in a testable form linked uniquely to the actual evapotranspiration of the forest community. In the present work we ascertain whether this simplified conclusion possesses some degree of explanatory power by comparing available data on oak distributions and remotely sensed evapotranspiration (MODIS product) in a large-scale survey embracing the western Mediterranean area. Our findings confirmed the basic assumptions of the model at the large scale, but also revealed asymmetric responses to water use and water stress tolerance between evergreen and deciduous oaks that should be taken into account to increase the understanding of species interactions and, ultimately, to improve the modeling capacity to explain co-occurrence.
Combined neurostimulation and neuroimaging in cognitive neuroscience: past, present, and future.
Bestmann, Sven; Feredoes, Eva
2013-08-01
Modern neurostimulation approaches in humans provide controlled inputs into the operations of cortical regions, with highly specific behavioral consequences. This enables causal structure-function inferences, and in combination with neuroimaging, has provided novel insights into the basic mechanisms of action of neurostimulation on distributed networks. For example, more recent work has established the capacity of transcranial magnetic stimulation (TMS) to probe causal interregional influences, and their interaction with cognitive state changes. Combinations of neurostimulation and neuroimaging now face the challenge of integrating the known physiological effects of neurostimulation with theoretical and biological models of cognition, for example, when theoretical stalemates between opposing cognitive theories need to be resolved. This will be driven by novel developments, including biologically informed computational network analyses for predicting the impact of neurostimulation on brain networks, as well as novel neuroimaging and neurostimulation techniques. Such future developments may offer an expanded set of tools with which to investigate structure-function relationships, and to formulate and reconceptualize testable hypotheses about complex neural network interactions and their causal roles in cognition. © 2013 New York Academy of Sciences.
Hypothesis testing and earthquake prediction.
Jackson, D D
1996-04-30
Requirements for testing include advance specification of the conditional rate density (probability per unit time, area, and magnitude) or, alternatively, probabilities for specified intervals of time, space, and magnitude. Here I consider testing fully specified hypotheses, with no parameter adjustments or arbitrary decisions allowed during the test period. Because it may take decades to validate prediction methods, it is worthwhile to formulate testable hypotheses carefully in advance. Earthquake prediction generally implies that the probability will be temporarily higher than normal. Such a statement requires knowledge of "normal behavior"--that is, it requires a null hypothesis. Hypotheses can be tested in three ways: (i) by comparing the number of actual earth-quakes to the number predicted, (ii) by comparing the likelihood score of actual earthquakes to the predicted distribution, and (iii) by comparing the likelihood ratio to that of a null hypothesis. The first two tests are purely self-consistency tests, while the third is a direct comparison of two hypotheses. Predictions made without a statement of probability are very difficult to test, and any test must be based on the ratio of earthquakes in and out of the forecast regions.
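The likelihood-based tests described here reduce, for a gridded Poisson forecast, to comparing log-likelihoods of the observed counts under the forecast rates and under a null hypothesis. A minimal sketch, with invented cell rates and counts:

```python
import math

def poisson_log_likelihood(counts, rates):
    """Log-likelihood of observed earthquake counts per space-magnitude cell
    under forecast Poisson rates. The log(n!) term is constant across
    hypotheses for fixed counts, so it is dropped from the ratio."""
    return sum(n * math.log(lam) - lam for n, lam in zip(counts, rates))

# Hypothetical 4-cell forecast vs. a spatially uniform null with the same
# total expected rate (so the test compares spatial skill, not overall rate)
forecast = [2.0, 0.5, 1.0, 0.5]
null = [1.0, 1.0, 1.0, 1.0]
observed = [3, 0, 1, 0]

log_ratio = (poisson_log_likelihood(observed, forecast)
             - poisson_log_likelihood(observed, null))
# log_ratio > 0: the data favor the forecast over the null hypothesis
```

This is test (iii) from the abstract; tests (i) and (ii) would instead compare the observed count and likelihood score against their distributions under the forecast alone.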
The CRISP theory of hippocampal function in episodic memory
Cheng, Sen
2013-01-01
Over the past four decades, a “standard framework” has emerged to explain the neural mechanisms of episodic memory storage. This framework has been instrumental in driving hippocampal research forward and now dominates the design and interpretation of experimental and theoretical studies. It postulates that cortical inputs drive plasticity in the recurrent cornu ammonis 3 (CA3) synapses to rapidly imprint memories as attractor states in CA3. Here we review a range of experimental studies and argue that the evidence against the standard framework is mounting, notwithstanding the considerable evidence in its support. We propose CRISP as an alternative theory to the standard framework. CRISP is based on Context Reset by dentate gyrus (DG), Intrinsic Sequences in CA3, and Pattern completion in cornu ammonis 1 (CA1). Compared to previous models, CRISP uses a radically different mechanism for storing episodic memories in the hippocampus. Neural sequences are intrinsic to CA3, and inputs are mapped onto these intrinsic sequences through synaptic plasticity in the feedforward projections of the hippocampus. Hence, CRISP does not require plasticity in the recurrent CA3 synapses during the storage process. As in other theories, DG and CA1 play supporting roles; however, their functions in CRISP have distinct implications. For instance, CA1 performs pattern completion in the absence of CA3, and DG contributes to episodic memory retrieval, increasing the speed, precision, and robustness of retrieval. We propose the conceptual theory, discuss its implications for experimental results, and suggest testable predictions. It appears that CRISP not only accounts for those experimental results that are consistent with the standard framework, but also for results that are at odds with the standard framework. We therefore suggest that CRISP is a viable, and perhaps superior, theory of hippocampal function in episodic memory. PMID:23653597
NASA Astrophysics Data System (ADS)
Derakhshani, Maaneli
In this thesis, we consider the implications of solving the quantum measurement problem for the Newtonian description of semiclassical gravity. First we review the formalism of the Newtonian description of semiclassical gravity based on standard quantum mechanics---the Schroedinger-Newton theory---and two well-established predictions that come out of it, namely, gravitational 'cat states' and gravitationally-induced wavepacket collapse. Then we review three quantum theories with 'primitive ontologies' that are well known to solve the measurement problem---Schroedinger's many worlds theory, the GRW collapse theory with matter density ontology, and Nelson's stochastic mechanics. We extend the formalisms of these three quantum theories to Newtonian models of semiclassical gravity and evaluate their implications for gravitational cat states and gravitational wavepacket collapse. We find that (1) Newtonian semiclassical gravity based on Schroedinger's many worlds theory is mathematically equivalent to the Schroedinger-Newton theory and makes the same predictions; (2) Newtonian semiclassical gravity based on the GRW theory differs from Schroedinger-Newton only in the use of a stochastic collapse law, but this law allows it to suppress gravitational cat states so as not to be in contradiction with experiment, while allowing for gravitational wavepacket collapse to happen as well; (3) Newtonian semiclassical gravity based on Nelson's stochastic mechanics differs significantly from Schroedinger-Newton, and predicts neither gravitational cat states nor gravitational wavepacket collapse.
Given that gravitational cat states are experimentally ruled out while gravitational wavepacket collapse is testable in the near future, only the latter two are viable theories of Newtonian semiclassical gravity, and they can be tested against each other in future molecular interferometry experiments that are anticipated to be capable of probing the gravitational wavepacket collapse prediction.
Pre-Service Teacher Scientific Behavior: Comparative Study of Paired Science Project Assignments
ERIC Educational Resources Information Center
Bulunuz, Mizrap; Tapan Broutin, Menekse Seden; Bulunuz, Nermin
2016-01-01
Problem Statement: University students usually lack the skills to rigorously define a multi-dimensional real-life problem and its limitations in an explicit, clear and testable way, which prevents them from forming a reliable method, obtaining relevant results and making balanced judgments to solve a problem. Purpose of the Study: The study…
1981-03-31
logic testing element and a concomitant testability criterion ideally suited to dynamic circuit applications and appropriate for automatic computer... making connections automatically. PF is an experimental feature which provides users with only four different chip sizes (full, half, quarter, and eighth)... initial solution is found constructively which is improved by pair-wise swapping. Results show, however, that the constructive initial sorter, which
ERIC Educational Resources Information Center
Nauta, Margaret M.
2010-01-01
This article celebrates the 50th anniversary of the introduction of John L. Holland's (1959) theory of vocational personalities and work environments by describing the theory's development and evolution, its instrumentation, and its current status. Hallmarks of Holland's theory are its empirical testability and its user-friendliness. By…
Steering Performance, Tactical Vehicles
2015-07-29
4.1 General Vehicle and Test Characterization... 4.2 Weave Test... able to be driven in a straight line without steer input (i.e., “hands free”). If the vehicle pulls in either direction, the alignment should be... Evaluation Center (AEC) prior to using military personnel as test participants. 4. TEST PROCEDURES. 4.1 General Vehicle and Test
Binding and Scope Dependencies with "Floating Quantifiers" in Japanese
ERIC Educational Resources Information Center
Mukai, Emi
2012-01-01
The primary concern of this thesis is how we can achieve rigorous testability when we set the properties of the Computational System (hypothesized to be at the center of the language faculty) as our object of inquiry and informant judgments as a tool to construct and/or evaluate our hypotheses concerning the properties of the Computational System.…
The Many Methods to Measure Testability: A Horror Story.
1988-04-01
It seems overly simplistic to assign only one "magic number" as a viable design goal. Different design technologies such as digital, analog, mechanical... [table residue: candidate testability measures including failure rate, basic test program, ATLAS test program, EDIF file, test strategy flowchart, RTOK frequency, and diagnosis average cost]
The Social Basis of Math Teaching and Learning. Final Report.
ERIC Educational Resources Information Center
Orvik, James M.; Van Veldhuizen, Philip A.
This study was designed to identify a set of research questions and testable hypotheses to aid in planning long-range research. Five mathematics teachers were selected. These instructors enrolled in a special project-related seminar, video-taped sessions of their own mathematics classes, and kept field journals. The group met once a week to…
ERIC Educational Resources Information Center
Kulczynska, Agnieszka; Johnson, Reed; Frost, Tony; Margerum, Lawrence D.
2011-01-01
An advanced undergraduate laboratory project is described that integrates inorganic, analytical, physical, and biochemical techniques to reveal differences in binding between cationic metal complexes and anionic DNA (herring testes). Students were guided to formulate testable hypotheses based on the title question and a list of different metal…
ERIC Educational Resources Information Center
Duncan-Wiles, Daphne S.
2012-01-01
With the recent addition of engineering to most K-12 testable state standards, efficient and comprehensive instruments are needed to assess changes in student knowledge and perceptions of engineering. In this study, I developed the Students' Awareness and Perceptions of Learning Engineering (STAPLE) instrument to quantitatively measure fourth…
Wichita's Hispanics: Tensions, Concerns, and the Migrant Stream.
ERIC Educational Resources Information Center
Johnson, Kenneth F.; And Others
In an attempt to formulate a set of testable propositions about the dynamics of Hispanic life that will be valuable pedagogically and as a basis for public policy formation, this study assesses the impact of Hispanic Americans on Wichita, Kansas. Chapter 1 identifies the Hispanic origins of Kansas' 63,339 Hispanics who represent 2.7% of the…
Improving Health Care for Assisted Living Residents
ERIC Educational Resources Information Center
Kane, Robert L.; Mach, John R., Jr.
2007-01-01
Purpose: The purpose of this article is to explore how medical care is delivered to older people in assisted living (AL) settings and to suggest ways for improving it. Design and Methods: We present a review of the limited research available on health care for older AL residents and on building testable models of better ways to organize primary…
Interpreting clinical trial results by deductive reasoning: In search of improved trial design.
Kurbel, Sven; Mihaljević, Slobodan
2017-10-01
Clinical trial results are often interpreted by inductive reasoning, in a trial design-limited manner, directed toward modifications of the current clinical practice. Deductive reasoning is an alternative in which results of relevant trials are combined in indisputable premises that lead to a conclusion easily testable in future trials. © 2017 WILEY Periodicals, Inc.
The part of cognitive science that is philosophy.
Dennett, Daniel C
2009-04-01
There is much good work for philosophers to do in cognitive science if they adopt the constructive attitude that prevails in science, work toward testable hypotheses, and take on the task of clarifying the relationship between the scientific concepts and the everyday concepts with which we conduct our moral lives. Copyright © 2009 Cognitive Science Society, Inc.
A Progress Report on a Thinking Laboratory for Deaf Children.
ERIC Educational Resources Information Center
Wolff, Sydney
A study was undertaken at the West Virginia School for the Deaf to test the assumption that the modes of thought of deaf children could be improved, and that improvement in concept formation would result in improvement in testable areas. Sixteen primary school children of approximately equal ability were selected and paired to form the control and…
Daly, Kevin C.; Galán, Roberto F.; Peters, Oakland J.; Staudacher, Erich M.
2011-01-01
The transient oscillatory model of odor identity encoding seeks to explain how odorants with spatially overlapped patterns of input into primary olfactory networks can be discriminated. This model provides several testable predictions about the distributed nature of network oscillations and how they control spike timing. To test these predictions, 16-channel electrode arrays were placed within the antennal lobe (AL) of the moth Manduca sexta. Unitary spiking and multi-site local field potential (LFP) recordings were made during spontaneous activity and in response to repeated presentations of an odor panel. We quantified oscillatory frequency, cross-correlations between LFP recording sites, and spike–LFP phase relationships. We show that odor-driven AL oscillations in Manduca are frequency modulating (FM) from ∼100 to 30 Hz; this was odorant- and stimulus-duration-dependent. FM oscillatory responses were localized to one or two recording sites, suggesting a localized (perhaps glomerular), not distributed, source. LFP cross-correlations further demonstrated that only a small (r < 0.05) distributed and oscillatory component was present. Cross-spectral density analysis demonstrated that the frequency of these weakly distributed oscillations was state dependent (spontaneous activity = 25–55 Hz; odor-driven = 55–85 Hz). Surprisingly, vector strength analysis indicated that unitary phase locking of spikes to the LFP was strongest during spontaneous activity and dropped significantly during responses. Application of bicuculline, a GABAA receptor antagonist, significantly lowered the frequency content of odor-driven distributed oscillatory activity. Bicuculline significantly reduced spike phase locking generally, but the ubiquitous pattern of increased phase locking during spontaneous activity persisted.
Collectively, these results indicate that oscillations perform poorly as a stimulus-mediated spike synchronizing mechanism for Manduca and hence are incongruent with the transient oscillatory model. PMID:22046161
The challenges to inferring the regulators of biodiversity in deep time.
Ezard, Thomas H G; Quental, Tiago B; Benton, Michael J
2016-04-05
Attempts to infer the ecological drivers of macroevolution in deep time have long drawn inspiration from work on extant systems, but long-term evolutionary and geological changes complicate the simple extrapolation of such theory. Recent efforts to incorporate a more informed ecology into macroevolution have moved beyond the descriptive, seeking to isolate generating mechanisms and produce testable hypotheses of how groups of organisms usurp each other or coexist over vast timespans. This theme issue aims to exemplify this progress, providing a series of case studies of how novel modelling approaches are helping infer the regulators of biodiversity in deep time. In this Introduction, we explore the challenges of these new approaches. First, we discuss how our choices of taxonomic units have implications for the conclusions drawn. Second, we emphasize the need to embrace the interdependence of biotic and abiotic changes, because no living organism ignores its environment. Third, in the light of parts 1 and 2, we discuss the set of dynamic signatures that we might expect to observe in the fossil record. Finally, we ask whether these dynamics represent the most ecologically informative foci for research efforts aimed at inferring the regulators of biodiversity in deep time. The papers in this theme issue contribute in each of these areas. © 2016 The Author(s).
Defining the Construct of Synthetic Androgen Intoxication: An Application of General Brain Arousal.
Hildebrandt, Tom; Heywood, Ashley; Wesley, Daniel; Schulz, Kurt
2018-01-01
Synthetic androgens (i.e., anabolic-androgenic steroids) are the primary component of the majority of problematic appearance- and performance-enhancing drug (APED) use. Despite evidence that these substances are associated with increased risk for aggression, violence, body image disturbances, and polypharmacy, and that their use can develop into a pattern of chronic use consistent with drug dependence, there are no formal definitions of androgen intoxication. Consequently, the purpose of this paper is to establish a testable theory of androgen intoxication. We present evidence and theorize that synthetic androgen intoxication can be defined by a pattern of poor self-regulation characterized by an increased propensity for a range of behaviors (e.g., aggression, sex, drug seeking, exercise, etc.) via androgen-mediated effects on general brain arousal. This theory posits that androgens reduce the threshold for emotional reactivity, motor response, and alertness to sensory stimuli and disrupt inhibitory control over the behaviors associated with synthetic androgen use. These changes result from alterations to basic neurocircuitry that amplify limbic activation and reduce top-down cortical control. The implications of this definition are to inform APED-specific hypotheses about the behavioral and psychological effects of APED use and to provide a basis for establishing clinical, legal, and public health guidelines to address the use and misuse of these substances. PMID:29651261
Model averaging, optimal inference, and habit formation
FitzGerald, Thomas H. B.; Dolan, Raymond J.; Friston, Karl J.
2014-01-01
Postulating that the brain performs approximate Bayesian inference generates principled and empirically testable models of neuronal function—the subject of much current interest in neuroscience and related disciplines. Current formulations address inference and learning under some assumed and particular model. In reality, organisms are often faced with an additional challenge—that of determining which model or models of their environment are the best for guiding behavior. Bayesian model averaging—which says that an agent should weight the predictions of different models according to their evidence—provides a principled way to solve this problem. Importantly, because model evidence is determined by both the accuracy and complexity of the model, optimal inference requires that these be traded off against one another. This means an agent's behavior should show an equivalent balance. We hypothesize that Bayesian model averaging plays an important role in cognition, given that it is both optimal and realizable within a plausible neuronal architecture. We outline model averaging and how it might be implemented, and then explore a number of implications for brain and behavior. In particular, we propose that model averaging can explain a number of apparently suboptimal phenomena within the framework of approximate (bounded) Bayesian inference, focusing particularly upon the relationship between goal-directed and habitual behavior. PMID:25018724
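The weighting rule at the heart of Bayesian model averaging can be sketched in a few lines: each model's prediction is weighted by its posterior probability, computed from its (log) evidence and a prior over models. This is an illustrative sketch, not the authors' proposed neuronal implementation; the log-evidence values and predictions are invented for the example.

```python
import math

def model_average(log_evidence, predictions, prior=None):
    """Bayesian model averaging: weight each model's prediction by its
    posterior probability p(m|y) ∝ p(y|m) p(m)."""
    n = len(log_evidence)
    prior = prior or [1.0 / n] * n
    # Work in log space for numerical stability.
    log_post = [le + math.log(p) for le, p in zip(log_evidence, prior)]
    m = max(log_post)
    unnorm = [math.exp(lp - m) for lp in log_post]
    z = sum(unnorm)
    weights = [u / z for u in unnorm]
    avg = sum(w * p for w, p in zip(weights, predictions))
    return weights, avg

# Two candidate models of the environment; model 0 has higher evidence,
# so it dominates the averaged prediction without fully excluding model 1.
weights, avg = model_average([-10.0, -12.0], [1.0, 3.0])
```

Because evidence penalizes complexity as well as rewarding accuracy, the weights implement exactly the accuracy-complexity trade-off the abstract describes.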
Guilbride, D Lys; Gawlinski, Pawel; Guilbride, Patrick D L
2010-05-19
Clinically protective malaria vaccines consistently fail to protect adults and children in endemic settings, and at best only partially protect infants. We identify and evaluate 1916 immunization studies between 1965-February 2010, and exclude partially or nonprotective results to find 177 completely protective immunization experiments. Detailed reexamination reveals an unexpectedly mundane basis for selective vaccine failure: live malaria parasites in the skin inhibit vaccine function. We next show published molecular and cellular data support a testable, novel model where parasite-host interactions in the skin induce malaria-specific regulatory T cells, and subvert early antigen-specific immunity to parasite-specific immunotolerance. This ensures infection and tolerance to reinfection. Exposure to Plasmodium-infected mosquito bites therefore systematically triggers immunosuppression of endemic vaccine-elicited responses. The extensive vaccine trial data solidly substantiate this model experimentally. We conclude skinstage-initiated immunosuppression, unassociated with bloodstage parasites, systematically blocks vaccine function in the field. Our model exposes novel molecular and procedural strategies to significantly and quickly increase protective efficacy in both pipeline and currently ineffective malaria vaccines, and forces fundamental reassessment of central precepts determining vaccine development. This has major implications for accelerated local eliminations of malaria, and significantly increases potential for eradication.
Foster, Morris W
2009-09-01
The ongoing debate about the relationship between race and genetics is more than a century old and has yet to be resolved. Recent emphasis on population-based patterns in human genetic variation and the implications of those for disease susceptibility and drug response have revitalized that long-standing debate. Both sides in the debate use the same rhetorical device of treating geographic, ancestral, population-specific, and other categories as surrogates for race, but otherwise share no evidentiary standards, analytic frameworks, or scientific goals that might resolve the debate and result in some productive outcome. Setting a common goal of weighing the scientific benefits of using racial and other social heuristics with testable estimates of the potential social harms of racialization can reduce both the unreflexive use of race and other social identities in biological analyses as well as the unreflexive use of racialization in social critiques of genetics. Treating social identities used in genetic studies as objects for investigation rather than artifacts of participant self-report or researcher attribution also will reduce the extent to which genetic studies that report social identities imply that membership in social categories can be defined or predicted using genetic features.
Electronically Distributed Work Communities: Implications for Research on Telework.
ERIC Educational Resources Information Center
Hesse, Bradford W.; Grantham, Charles E.
1991-01-01
Describes the concept of telework, or telecommuting, and its influence on the electronic community and organizational structures. The electronically distributed organization is discussed, and implications for research on telework are suggested in the areas of privacy regulation, self-efficacy, temporal aspects of employee behavior, communication…
Implications of the Advanced Distributed Learning Initiative for Education. Urban Diversity Series.
ERIC Educational Resources Information Center
Fletcher, J. D.; Tobias, Sigmund
This monograph in the Urban Diversity Series describes the Advanced Distributed Learning (ADL) initiative, relates it to research dealing with instruction generally and computer-mediated instruction specifically, and discusses its implications for education. ADL was undertaken to make instructional material universally accessible primarily, but…
Taxes in a Labor Supply Model with Joint Wage-Hours Determination.
ERIC Educational Resources Information Center
Rosen, Harvey S.
1976-01-01
Payroll and progressive income taxes play an enormous role in the American fiscal system. The purpose of this study is to present some econometric evidence on the effects of taxes on married women, a group of growing importance in the American labor force. A testable model of labor supply is developed which permits statistical estimation of a…
ERIC Educational Resources Information Center
Nunez, Rafael
2012-01-01
"The Journal of the Learning Sciences" has devoted this special issue to the study of embodied cognition (as it applies to mathematics), a topic that for several decades has gained attention in the cognitive sciences and in mathematics education, in particular. In this commentary, the author aims to address crucial questions in embodied…
ERIC Educational Resources Information Center
Booker, Lucille M.
2012-01-01
Political discourse is an observable, measurable, and testable manifestation of political worldviews. However, when worldviews collide, notions of truth and of lies are put to the test. The challenge for researchers is how to establish confidence in their analysis. Despite the growing interest in deception research from a diversity of fields and…
Predictors of Organizational-Level Testability Attributes
1987-05-01
Gilreath, A. Elizabeth; Kelley, Brian A.
[Final report; only fragments of the abstract survive: BRU counts are described in subsections 6.2.1.1 and 6.2.1.2 and further subdivided in Figure 6-4.]
Surface fire effects on conifer and hardwood crowns--applications of an integral plume model
Matthew Dickinson; Anthony Bova; Kathleen Kavanagh; Antoine Randolph; Lawrence Band
2009-01-01
An integral plume model was applied to the problems of tree death from canopy injury in dormant-season hardwoods and branch embolism in Douglas fir (Pseudotsuga menziesii) crowns. Our purpose was to generate testable hypotheses. We used the integral plume models to relate crown injury to bole injury and to explore the effects of variation in fire...
Analytical Procedures for Testability.
1983-01-01
[Only fragments of the abstract survive: the report cites "Beat Internal Classifications" (AD A018516) and "A System of Computer Aided Diagnosis with Blood Serum Chemistry Tests and Bayesian Statistics" (AD 786284); its list of tables includes a truth table and a covering problem; a sequential classification procedure in a coronary care ward is evaluated, and the computer-aided diagnosis system is applied in the toxicology field.]
E-learning platform for automated testing of electronic circuits using signature analysis method
NASA Astrophysics Data System (ADS)
Gherghina, Cǎtǎlina; Bacivarov, Angelica; Bacivarov, Ioan C.; Petricǎ, Gabriel
2016-12-01
Dependability of electronic circuits can be ensured only through testing of circuit modules, done by generating test vectors and applying them to the circuit. Testability should be viewed as a concerted effort to ensure maximum efficiency throughout the product life cycle, from the conception and design stages, through production, to repairs during operation. This paper presents the platform developed by the authors for training in testability in electronics generally, and in the signature analysis method in particular. The platform highlights the two approaches in the field, namely the analog and digital signatures of circuits. As part of this e-learning platform, a database of signatures of different electronic components has been developed to showcase fault-detection techniques and, building on these, self-repairing techniques for systems built from such components. An approach for realizing self-testing circuits based on the MATLAB environment and the signature analysis method is proposed. The paper also analyses the benefits of the signature analysis method and simulates signature analyzer performance based on pseudo-random sequences.
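Signature analysis compresses a long test-response bitstream into a short signature using a linear feedback shift register; a circuit is flagged as faulty when its signature differs from the fault-free reference. A minimal serial-input signature register (SISR) sketch, assuming the well-known maximal-length 16-bit tap set (16, 14, 13, 11), i.e. x^16 + x^14 + x^13 + x^11 + 1; the platform's actual polynomial and register width are not specified in the abstract, and the bitstreams are invented.

```python
def sisr_signature(bits, taps=(16, 14, 13, 11), width=16):
    """Serial-input signature register: shift each response bit into an
    LFSR, XOR-ing it with the feedback taps; the final register state is
    the signature."""
    reg = 0
    for b in bits:
        fb = b
        for t in taps:
            fb ^= (reg >> (t - 1)) & 1  # XOR in each tapped register bit
        reg = ((reg << 1) | fb) & ((1 << width) - 1)
    return reg

good = [1, 0, 1, 1, 0, 0, 1, 0] * 8   # fault-free response stream
faulty = good.copy()
faulty[5] ^= 1                         # single-bit fault in the response
sig_good = sisr_signature(good)
sig_bad = sisr_signature(faulty)
# Because the compactor is linear, any single-bit error is guaranteed
# to change the signature (aliasing needs multi-bit error patterns).
assert sig_good != sig_bad
```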
Objections to routine clinical outcomes measurement in mental health services: any evidence so far?
MacDonald, Alastair J D; Trauer, Tom
2010-12-01
Routine clinical outcomes measurement (RCOM) is gaining importance in mental health services. To examine whether criticisms published in advance of the development of RCOM have been borne out by data now available from such a programme. This was an observational study of routine ratings using HoNOS65+ at inception/admission and again at discharge in an old age psychiatry service from 1997 to 2008. Testable hypotheses were generated from each criticism amenable to empirical examination. Inter-rater reliability estimates were applied to observed differences between scores between community and ward patients using resampling. Five thousand one hundred eighty community inceptions and 862 admissions had HoNOS65+ ratings at referral/admission and discharge. We could find no evidence of gaming (artificially worse scores at inception and better at discharge), selection, attrition or detection bias, and ratings were consistent with diagnosis and level of service. Anticipated low levels of inter-rater reliability did not vitiate differences between levels of service. Although only hypotheses testable from within RCOM data were examined, and only 46% of eligible episodes had complete outcomes data, no evidence of the alleged biases were found. RCOM seems valid and practical in mental health services.
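The resampling comparison of community and ward scores can be illustrated with a simple two-sample permutation test: repeatedly relabel the pooled scores and ask how often chance alone reproduces the observed mean difference. The HoNOS65+ totals below are hypothetical, and this is a generic sketch rather than the authors' exact procedure.

```python
import random

def permutation_test(a, b, n_iter=10_000, seed=1):
    """Two-sample permutation test: fraction of random relabellings of the
    pooled scores whose mean difference is at least as large as observed."""
    rng = random.Random(seed)
    observed = abs(sum(a) / len(a) - sum(b) / len(b))
    pooled = a + b
    hits = 0
    for _ in range(n_iter):
        rng.shuffle(pooled)
        pa, pb = pooled[:len(a)], pooled[len(a):]
        if abs(sum(pa) / len(pa) - sum(pb) / len(pb)) >= observed:
            hits += 1
    return hits / n_iter

# Hypothetical HoNOS65+ total scores for community vs ward patients.
community = [8, 10, 9, 11, 7, 12, 10, 9]
ward = [14, 13, 15, 12, 16, 14, 13, 15]
p = permutation_test(community, ward)  # small p: difference unlikely by chance
```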
Electronic design of a multichannel programmable implant for neuromuscular electrical stimulation.
Arabi, K; Sawan, M A
1999-06-01
An advanced stimulator for neuromuscular stimulation of spinal cord injured patients has been developed. The stimulator is externally controlled and powered by a single encoded radio frequency carrier and has four independently controlled bipolar stimulation channels. It offers a wide range of reprogrammability and flexibility, and can be used in many neuromuscular electrical stimulation applications. The implant system is adaptable to the patient's needs and to future developments in stimulation algorithms by reprogramming the stimulator. The stimulator is capable of generating a wide range of stimulation waveforms and stimulation patterns and is therefore very suitable for selective nerve stimulation techniques. The reliability of the implant has been increased by using a forward error detection and correction communication protocol and by designing the chip for structural testability based on a scan test approach. The implemented testability scheme makes it possible to verify the complete functionality of the implant before and after implantation. The stimulator's architecture is designed to be modular, so its different blocks can be reused as standard building blocks in the design and implementation of other neuromuscular prostheses. Design-for-low-power techniques have also been employed to reduce power consumption of the electronic circuitry.
Lift and drag in three-dimensional steady viscous and compressible flow
NASA Astrophysics Data System (ADS)
Liu, L. Q.; Wu, J. Z.; Su, W. D.; Kang, L. L.
2017-11-01
In a recent paper, Liu, Zhu, and Wu ["Lift and drag in two-dimensional steady viscous and compressible flow," J. Fluid Mech. 784, 304-341 (2015)] present a force theory for a body in a two-dimensional, viscous, compressible, and steady flow. In this companion paper, we do the same for three-dimensional flows. Using the fundamental solution of the linearized Navier-Stokes equations, we improve the force formula for incompressible flows originally derived by Goldstein in 1931 and summarized by Milne-Thomson in 1968, both being far from complete, to its perfect final form, which is further proved to be universally true from subsonic to supersonic flows. We call this result the unified force theorem, which states that the forces are always determined by the vector circulation Γϕ of longitudinal velocity and the scalar inflow Qψ of transverse velocity. Since this theorem is not directly observable either experimentally or computationally, a testable version is also derived, which, however, holds only in the linear far field. We name this version the testable unified force formula. After that, a general principle to increase the lift-drag ratio is proposed.
Reilly, John J; Wells, Jonathan C K
2005-12-01
The WHO recommends exclusive breast-feeding for the first 6 months of life. At present, <2 % of mothers who breast-feed in the UK do so exclusively for 6 months. We propose the testable hypothesis that this is because many mothers do not provide sufficient breast milk to feed a 6-month-old baby adequately. We review recent evidence on energy requirements during infancy, and energy transfer from mother to baby, and consider the adequacy of exclusive breast-feeding to age 6 months for mothers and babies in the developed world. Evidence from our recent systematic review suggests that mean metabolisable energy intake in exclusively breast-fed infants at 6 months is 2.2-2.4 MJ/d (525-574 kcal/d), and mean energy requirement approximately 2.6-2.7 MJ/d (632-649 kcal/d), leading to a gap between the energy provided by milk and energy needs by 6 months for many babies. Our hypothesis is consistent with other evidence, and with evolutionary considerations, and we briefly review this other evidence. The hypothesis would be testable in a longitudinal study of infant energy balance using stable-isotope techniques, which are both practical and valid.
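The energy gap that motivates the hypothesis follows directly from the quoted figures; a quick arithmetic check (the conversion factor 1 MJ ≈ 239 kcal is standard, and the ranges are taken from the abstract):

```python
# Figures from the review (MJ/d): metabolisable energy intake from
# exclusive breast-feeding at 6 months vs estimated energy requirement.
intake_mj = (2.2, 2.4)
requirement_mj = (2.6, 2.7)

KCAL_PER_MJ = 239.006  # 1 MJ ≈ 239 kcal

# Implied daily shortfall, best case (high intake, low requirement)
# vs worst case (low intake, high requirement):
gap_min = requirement_mj[0] - intake_mj[1]   # 0.2 MJ/d
gap_max = requirement_mj[1] - intake_mj[0]   # 0.5 MJ/d
gap_min_kcal = gap_min * KCAL_PER_MJ         # ≈ 48 kcal/d
gap_max_kcal = gap_max * KCAL_PER_MJ         # ≈ 120 kcal/d
```

The same conversion confirms the abstract's kcal equivalents: 2.2-2.4 MJ/d is 525-574 kcal/d and 2.6-2.7 MJ/d is 632-649 kcal/d.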
McCluney, Kevin E; Belnap, Jayne; Collins, Scott L; González, Angélica L; Hagen, Elizabeth M; Nathaniel Holland, J; Kotler, Burt P; Maestre, Fernando T; Smith, Stanley D; Wolf, Blair O
2012-08-01
Species interactions play key roles in linking the responses of populations, communities, and ecosystems to environmental change. For instance, species interactions are an important determinant of the complexity of changes in trophic biomass with variation in resources. Water resources are a major driver of terrestrial ecology and climate change is expected to greatly alter the distribution of this critical resource. While previous studies have documented strong effects of global environmental change on species interactions in general, responses can vary from region to region. Dryland ecosystems occupy more than one-third of the Earth's land mass, are greatly affected by changes in water availability, and are predicted to be hotspots of climate change. Thus, it is imperative to understand the effects of environmental change on these globally significant ecosystems. Here, we review studies of the responses of population-level plant-plant, plant-herbivore, and predator-prey interactions to changes in water availability in dryland environments in order to develop new hypotheses and predictions to guide future research. To help explain patterns of interaction outcomes, we developed a conceptual model that views interaction outcomes as shifting between (1) competition and facilitation (plant-plant), (2) herbivory, neutralism, or mutualism (plant-herbivore), or (3) neutralism and predation (predator-prey), as water availability crosses physiological, behavioural, or population-density thresholds. We link our conceptual model to hypothetical scenarios of current and future water availability to make testable predictions about the influence of changes in water availability on species interactions. We also examine potential implications of our conceptual model for the relative importance of top-down effects and the linearity of patterns of change in trophic biomass with changes in water availability. 
Finally, we highlight key research needs and some possible broader impacts of our findings. Overall, we hope to stimulate and guide future research that links changes in water availability to patterns of species interactions and the dynamics of populations and communities in dryland ecosystems. © 2011 The Authors. Biological Reviews © 2011 Cambridge Philosophical Society.
Exploring Explanations of Subglacial Bedform Sizes Using Statistical Models
Kougioumtzoglou, Ioannis A.; Stokes, Chris R.; Smith, Michael J.; Clark, Chris D.; Spagnolo, Matteo S.
2016-01-01
Sediments beneath modern ice sheets exert a key control on their flow, but are largely inaccessible except through geophysics or boreholes. In contrast, palaeo-ice sheet beds are accessible, and typically characterised by numerous bedforms. However, the interaction between bedforms and ice flow is poorly constrained and it is not clear how bedform sizes might reflect ice flow conditions. To better understand this link we present a first exploration of a variety of statistical models to explain the size distribution of some common subglacial bedforms (i.e., drumlins, ribbed moraine, MSGL). By considering a range of models, constructed to reflect key aspects of the physical processes, it is possible to infer that the size distributions are most effectively explained when the dynamics of ice-water-sediment interaction associated with bedform growth is fundamentally random. A ‘stochastic instability’ (SI) model, which integrates random bedform growth and shrinking through time with exponential growth, is preferred and is consistent with other observations of palaeo-bedforms and geophysical surveys of active ice sheets. Furthermore, we give a proof-of-concept demonstration that our statistical approach can bridge the gap between geomorphological observations and physical models, directly linking measurable size-frequency parameters to properties of ice sheet flow (e.g., ice velocity). Moreover, statistically developing existing models as proposed allows quantitative predictions to be made about sizes, making the models testable; a first illustration of this is given for a hypothesised repeat geophysical survey of bedforms under active ice. Thus, we further demonstrate the potential of size-frequency distributions of subglacial bedforms to assist the elucidation of subglacial processes and better constrain ice sheet models. PMID:27458921
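The flavour of the preferred "stochastic instability" model, random growth and shrinking superimposed on exponential growth, can be conveyed with a toy simulation. All parameters here are invented and this is not the authors' calibrated SI model; the point is only that multiplicative noise drives log-size toward a normal distribution, so raw sizes come out right-skewed, as observed for real bedforms.

```python
import math
import random

def simulate_bedform_sizes(n=2000, steps=400, growth=0.002, noise=0.05, seed=7):
    """Toy stochastic-instability sketch: each bedform's size grows
    exponentially on average but is perturbed by random multiplicative
    growth/shrink events at every step."""
    rng = random.Random(seed)
    sizes = []
    for _ in range(n):
        h = 1.0
        for _ in range(steps):
            h *= math.exp(growth + noise * rng.gauss(0.0, 1.0))
        sizes.append(h)
    return sizes

sizes = simulate_bedform_sizes()
logs = [math.log(s) for s in sizes]
mean_log = sum(logs) / len(logs)     # ≈ steps * growth = 0.8
sizes.sort()
median = sizes[len(sizes) // 2]
mean = sum(sizes) / len(sizes)       # right-skew: mean exceeds median
```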
Radiative transfer within seagrass canopies: impact on carbon budgets and light requirements
NASA Astrophysics Data System (ADS)
Zimmerman, Richard C.; Mobley, Curtis D.
1997-02-01
Seagrasses are ecologically important but extremely vulnerable to anthropogenic modifications of the coastal zone that affect light availability within these unique ecosystems. Strongly pigmented seagrass leaves can extend for more than 1 m above the substrate and biomass is distributed unevenly throughout the canopy. In this study, light attenuation in a 7 m water column that contained a seagrass canopy extending 1.5 m above the bottom was calculated by the radiative transfer model Hydrolight using the spectral absorbance of eelgrass leaves and a non-uniform vertical distribution of biomass. Runs were performed in clear and turbid water columns, over sand and mud substrates, and with shoot densities ranging from 25 to 200 m-2 using solar angles for both winter and summer solstices. The flux of photosynthetically active irradiance (EPAR) reaching the top of the seagrass canopy was twice as high in summer compared to winter, and in clear water compared to turbid water. Sediment type had a measurable effect on EPAR only within the bottom third of the canopy. Light penetration within the canopy was inversely proportional to shoot density. Introduction of daylength and a sinusoidal distribution of EPAR throughout the day greatly increased the importance of solar elevation on daily integrated production relative to water column turbidity and sediment type. Shoot-specific productivity decreased and the position of maximum shoot productivity within the canopy shallowed as shoot density increased. Positive net photosynthesis for entire shoots was possible only when plant density was lower than 100 shoots m-2 in winter, values consistent with field observations. Although very simplistic with regard to inherent optical properties of real seagrass leaves, this model was able to generate estimates of maximum sustainable shoot density that were fully testable by, and wholly consistent with, field observations.
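The core effect the Hydrolight runs capture, stronger attenuation once light enters the leafy canopy, can be sketched with a layered Beer-Lambert toy model. The coefficients, depths, and biomass profile below are invented; Hydrolight solves the full radiative transfer equation and is not this simplification.

```python
import math

def par_profile(e0, depths, biomass, kd_water=0.3, k_leaf=0.8):
    """Downwelling PAR through a water column: exponential decay with a
    water attenuation coefficient, plus extra attenuation proportional to
    leaf biomass density inside the canopy (Beer-Lambert in both cases).
    e0: PAR just below the surface; depths: increasing depths (m);
    biomass: leaf area density per layer (0 above the canopy)."""
    out, e, prev = [], e0, 0.0
    for z, b in zip(depths, biomass):
        dz = z - prev
        e *= math.exp(-(kd_water + k_leaf * b) * dz)
        out.append(e)
        prev = z
    return out

# 7 m water column; the canopy occupies the bottom ~1.5 m, with biomass
# increasing toward the bed as in a non-uniform vertical distribution.
depths = [1, 2, 3, 4, 5, 5.5, 6, 6.5, 7]
biomass = [0, 0, 0, 0, 0, 0.5, 1.0, 1.5, 2.0]
profile = par_profile(1000.0, depths, biomass)
# PAR decreases monotonically, and much faster inside the canopy.
```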
Basketball Teams as Strategic Networks
Fewell, Jennifer H.; Armbruster, Dieter; Ingraham, John; Petersen, Alexander; Waters, James S.
2012-01-01
We asked how team dynamics can be captured in relation to function by considering games in the first round of the NBA 2010 play-offs as networks. Defining players as nodes and ball movements as links, we analyzed the network properties of degree centrality, clustering, entropy and flow centrality across teams and positions, to characterize the game from a network perspective and to determine whether we can assess differences in team offensive strategy by their network properties. The compiled network structure across teams reflected a fundamental attribute of basketball strategy. They primarily showed a centralized ball distribution pattern with the point guard in a leadership role. However, individual play-off teams showed variation in their relative involvement of other players/positions in ball distribution, reflected quantitatively by differences in clustering and degree centrality. We also characterized two potential alternate offensive strategies by associated variation in network structure: (1) whether teams consistently moved the ball towards their shooting specialists, measured as “uphill/downhill” flux, and (2) whether they distributed the ball in a way that reduced predictability, measured as team entropy. These network metrics quantified different aspects of team strategy, with no single metric wholly predictive of success. However, in the context of the 2010 play-offs, the values of clustering (connectedness across players) and network entropy (unpredictability of ball movement) had the most consistent association with team advancement. Our analyses demonstrate the utility of network approaches in quantifying team strategy and show that testable hypotheses can be evaluated using this approach. These analyses also highlight the richness of basketball networks as a dataset for exploring the relationships between network structure and dynamics with team organization and effectiveness. PMID:23139744
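The network metrics named above are straightforward to compute from a ball-movement log. A sketch with an invented pass list (not play-off data), showing degree centrality concentrated on the point guard and the Shannon entropy of pass types; higher entropy means less predictable ball movement.

```python
import math
from collections import defaultdict

# Hypothetical ball-movement log: (passer, receiver) events for one team.
passes = [
    ("PG", "SG"), ("PG", "SF"), ("PG", "PF"), ("PG", "C"), ("PG", "SG"),
    ("SG", "PG"), ("SF", "PG"), ("PF", "PG"), ("C", "PG"), ("SG", "SF"),
]

# Degree centrality: each player's share of passing-event endpoints.
degree = defaultdict(int)
for a, b in passes:
    degree[a] += 1
    degree[b] += 1
total = sum(degree.values())
centrality = {p: d / total for p, d in degree.items()}

# Team entropy: Shannon entropy of the distribution over pass types.
counts = defaultdict(int)
for e in passes:
    counts[e] += 1
n = len(passes)
entropy = -sum((c / n) * math.log2(c / n) for c in counts.values())
```

With this log the point guard dominates centrality (0.45 of all endpoints), mirroring the centralized distribution pattern the study reports.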
Applications of species distribution modeling to paleobiology
NASA Astrophysics Data System (ADS)
Svenning, Jens-Christian; Fløjgaard, Camilla; Marske, Katharine A.; Nógues-Bravo, David; Normand, Signe
2011-10-01
Species distribution modeling (SDM: statistical and/or mechanistic approaches to the assessment of range determinants and prediction of species occurrence) offers new possibilities for estimating and studying past organism distributions. SDM complements fossil and genetic evidence by providing (i) quantitative and potentially high-resolution predictions of past organism distributions, (ii) statistically formulated, testable ecological hypotheses regarding past distributions and communities, and (iii) statistical assessment of range determinants. In this article, we provide an overview of applications of SDM to paleobiology, outlining the methodology, reviewing SDM-based studies in paleobiology or at the interface of paleo- and neobiology, discussing assumptions and uncertainties as well as how to handle them, and providing a synthesis and outlook. Key methodological issues for SDM applications to paleobiology include predictor variables (types and properties; special emphasis is given to paleoclimate), model validation (particularly important given the emphasis on cross-temporal predictions in paleobiological applications), and the integration of SDM and genetics approaches. Over the last few years the number of studies using SDM to address paleobiology-related questions has increased considerably. While some of these studies only use SDM (23%), most combine them with genetically inferred patterns (49%), paleoecological records (22%), or both (6%). A large number of SDM-based studies have addressed the role of Pleistocene glacial refugia in biogeography and evolution, especially in Europe, but also in many other regions.
SDM-based approaches are also beginning to contribute to a suite of other research questions, such as historical constraints on current distributions and diversity patterns, the end-Pleistocene megafaunal extinctions, past community assembly, human paleobiogeography, Holocene paleoecology, and even deep-time biogeography (notably, providing insights into biogeographic dynamics >400 million years ago). We discuss important assumptions and uncertainties that affect the SDM approach to paleobiology - the equilibrium postulate, niche stability, changing atmospheric CO2 concentrations - as well as ways to address these (ensemble, functional SDM, and non-SDM ecoinformatics approaches). We conclude that the SDM approach offers important opportunities for advances in paleobiology by providing a quantitative ecological perspective, and thereby also offers the potential for an enhanced contribution of paleobiology to ecology and conservation biology, e.g., for estimating climate change impacts and for informing ecological restoration.
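The core "prediction of species occurrence" idea can be illustrated with the simplest family of SDMs, a rectangular climate envelope (BIOCLIM-style). This is a toy sketch with invented predictor names and values; real paleobiological applications use richer statistical models, careful cross-temporal validation, and paleoclimate layers as predictors.

```python
def fit_envelope(presences):
    """Fit a rectangular climate envelope: the min/max of each predictor
    observed across known presence sites (a BIOCLIM-style model)."""
    keys = presences[0].keys()
    return {k: (min(p[k] for p in presences), max(p[k] for p in presences))
            for k in keys}

def predict_presence(envelope, site):
    """Predict presence where every predictor falls inside the envelope.
    Hindcasting amounts to evaluating this on paleoclimate values."""
    return all(lo <= site[k] <= hi for k, (lo, hi) in envelope.items())

# Invented presence sites with two bioclimatic predictors.
presences = [
    {"mean_temp": 8.0, "annual_precip": 600},
    {"mean_temp": 11.5, "annual_precip": 900},
    {"mean_temp": 9.2, "annual_precip": 750},
]
envelope = fit_envelope(presences)
```

Projecting the fitted envelope onto gridded paleoclimate reconstructions is what yields the quantitative, testable predictions of past ranges that the review emphasizes.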
The (virtual) conceptual necessity of quantum probabilities in cognitive psychology.
Blutner, Reinhard; beim Graben, Peter
2013-06-01
We propose a way in which Pothos & Busemeyer (P&B) could strengthen their position. Taking a dynamic stance, we consider cognitive tests as functions that transfer a given input state into the state after testing. Under very general conditions, it can be shown that testable properties in cognition form an orthomodular lattice. Gleason's theorem then yields the conceptual necessity of quantum probabilities (QP).
The Systems Test Architect: Enabling The Leap From Testable To Tested
2016-09-01
engineering process requires an interdisciplinary approach, involving both technical and managerial disciplines applied to the synthesis and integration... relationship between the technical and managerial aspects of systems engineering. TP-2003-020-01 describes measurement as having the following... it is evident that DOD makes great strides to tackle both the managerial and technical aspects of test and evaluation within the systems
Silicon Wafer Advanced Packaging (SWAP). Multichip Module (MCM) Foundry Study. Version 2
1991-04-08
Next Layer Dielectric Spacing - Additional Metal Thickness Impact on Dielectric Uniformity/Adhesion. The first step in the experimental design would be... design CAM - computer aided manufacturing CAE - computer aided engineering CALCE - computer aided life cycle engineering center CARMA - computer aided... expansion CVD - chemical vapor deposition DA - design automation DEC - Digital Equipment Corporation DFT - design for testability
Structural Genomics of Bacterial Virulence Factors
2006-05-01
positioned in the unit cell by Molecular Replacement (Protein Data Bank (PDB) ID code 1acc) using MOLREP, and refined with REFMAC version 5.0 (ref. 24)... increase our understanding of the molecular mechanisms of pathogenicity, putting us in a stronger position to anticipate and react to emerging... term, the accumulated structural information will generate important and testable hypotheses that will increase our understanding of the molecular
Testability/Diagnostics Design Encyclopedia
1990-09-01
weapon system that is pushing the state of the art and produced in limited numbers, with questionable historical data on their operation, one can... designs with questionable basis and justification. Unfortunately, this process has not been transformed from an art to a rigorous methodology... REQUIREMENT #2.1 - On-the-job training - Formal school training o O-Level data acquisition/collection system (and data management) o Requirements to
Rapid Communication: Quasi-gedanken experiment challenging the no-signalling theorem
NASA Astrophysics Data System (ADS)
Kalamidas, Demetrios A.
2018-01-01
Kennedy (Philos. Sci. 62, 4 (1995)) has argued that the various quantum mechanical no-signalling proofs formulated thus far share a common mathematical framework, are circular in nature, and do not preclude the construction of empirically testable schemes wherein superluminal exchange of information can occur. In light of this thesis, we present a potentially feasible quantum-optical scheme that purports to enable superluminal signalling.
Retrieval as a Fast Route to Memory Consolidation.
Antony, James W; Ferreira, Catarina S; Norman, Kenneth A; Wimber, Maria
2017-08-01
Retrieval-mediated learning is a powerful way to make memories last, but its neurocognitive mechanisms remain unclear. We propose that retrieval acts as a rapid consolidation event, supporting the creation of adaptive hippocampal-neocortical representations via the 'online' reactivation of associative information. We describe parallels between online retrieval and offline consolidation and offer testable predictions for future research.
ERIC Educational Resources Information Center
Maestripieri, Dario
2005-01-01
Comparative behavioral research is important for a number of reasons and can contribute to the understanding of human behavior and development in many different ways. Research with animal models of human behavior and development can be a source not only of general principles and testable hypotheses but also of empirical information that may be…
1982-10-01
e.g., providing voters in TMR systems and detection-switching requirements in standby-sparing systems. The application of mathematical theory of... and time redundancy required for error detection and correction, are interrelated. Mathematical modeling, when applied to fault tolerant systems, can... 1.1 Some Fundamental Principles; 1.2 Mathematical Theory of
NASA Space Flight Vehicle Fault Isolation Challenges
NASA Technical Reports Server (NTRS)
Neeley, James R.; Jones, James V.; Bramon, Christopher J.; Inman, Sharon K.; Tuttle, Loraine
2016-01-01
The Space Launch System (SLS) is the new NASA heavy lift launch vehicle in development and is scheduled for its first mission in 2018. SLS has many of the same logistics challenges as any other large-scale program. However, SLS also faces unique challenges related to testability. This presentation will address the SLS challenges for diagnostics and fault isolation, along with the analyses and decisions to mitigate risk.
A Survey of Reliability, Maintainability, Supportability, and Testability Software Tools
1991-04-01
designs in terms of their contributions toward forced mission termination and vehicle or function loss. Includes the ability to treat failure modes of... ABSTRACT: Inputs: MTBFs, MTTRs, support equipment costs, equipment weights and costs, available targets, military occupational specialty skill level and... US Army CECOM NAME: SPARECOST ABSTRACT: Calculates expected number of failures and performs spares holding optimization based on cost, weight, or
Colquhoun, Heather L; Carroll, Kelly; Eva, Kevin W; Grimshaw, Jeremy M; Ivers, Noah; Michie, Susan; Sales, Anne; Brehaut, Jamie C
2017-09-29
Audit and feedback (A&F) is a common strategy for helping health providers to implement evidence into practice. Despite being extensively studied, health care A&F interventions remain variably effective, with overall effect sizes that have not improved since 2003. Contributing to this stagnation is the fact that most health care A&F interventions have largely been designed without being informed by theoretical understanding from the behavioral and social sciences. To determine if the trend can be improved, the objective of this study was to develop a list of testable, theory-informed hypotheses about how to design more effective A&F interventions. Using purposive sampling, semi-structured 60-90-min telephone interviews were conducted with experts in theories related to A&F from a range of fields (e.g., cognitive, health and organizational psychology, medical decision-making, economics). Guided by detailed descriptions of A&F interventions from the health care literature, interviewees described how they would approach the problem of designing improved A&F interventions. Specific, theory-informed hypotheses about the conditions for effective design and delivery of A&F interventions were elicited from the interviews. The resulting hypotheses were assigned by three coders working independently into themes, and categories of themes, in an iterative process. We conducted 28 interviews and identified 313 theory-informed hypotheses, which were placed into 30 themes. The 30 themes included hypotheses related to the following five categories: A&F recipient (seven themes), content of the A&F (ten themes), process of delivery of the A&F (six themes), behavior that was the focus of the A&F (three themes), and other (four themes). We have identified a set of testable, theory-informed hypotheses from a broad range of behavioral and social science that suggest conditions for more effective A&F interventions. 
This work demonstrates the breadth of perspectives about A&F from non-healthcare-specific disciplines in a way that yields testable hypotheses for healthcare A&F interventions. These results will serve as the foundation for further work seeking to set research priorities among the A&F research community.
Developing Tools to Test the Thermo-Mechanical Models, Examples at Crustal and Upper Mantle Scale
NASA Astrophysics Data System (ADS)
Le Pourhiet, L.; Yamato, P.; Burov, E.; Gurnis, M.
2005-12-01
Testing geodynamical models is never an easy task. Depending on the spatio-temporal scale of the model, different testable predictions are needed and no magic recipe exists. This contribution first presents different methods that have been used to test thermo-mechanical modeling results at upper crustal, lithospheric and upper mantle scales, using three geodynamical examples: the Gulf of Corinth (Greece), the Western Alps, and the Sierra Nevada. At short spatio-temporal scales (e.g., the Gulf of Corinth), the resolution of the numerical models is usually sufficient to capture the timing and kinematics of the faults precisely enough to be tested by tectono-stratigraphic arguments. In actively deforming areas, microseismicity can be compared to the effective rheology, and the P and T axes of focal mechanisms can be compared with the local orientation of the major component of the stress tensor. At the lithospheric scale, the resolution of the models no longer permits constraining them by direct observations (i.e., structural data from the field or seismic reflection). Instead, synthetic P-T-t paths may be computed and compared to natural ones in terms of exhumation rates for ancient orogens. Topography may also help, but on continents it mainly depends on erosion laws that are difficult to constrain. Deeper in the mantle, the only available constraints are long-wavelength topographic data and tomographic "data". The major problem to overcome at the lithospheric and upper mantle scales is that the so-called "data" actually result from inverse models of the real data, and those inverse models are based on synthetic models. Post-processing P and S wave velocities is not sufficient to make testable predictions at the upper mantle scale. Instead, direct wave propagation models must be computed. This allows checking whether the differences between two models constitute a testable prediction or not.
In the longer term, we may be able to use those synthetic models to reduce the residual in the inversion of elastic wave arrival times.
Color vision deficiency in preschool children: the multi-ethnic pediatric eye disease study.
Xie, John Z; Tarczy-Hornoch, Kristina; Lin, Jesse; Cotter, Susan A; Torres, Mina; Varma, Rohit
2014-07-01
To determine the sex- and ethnicity-specific prevalence of color vision deficiency (CVD) in black, Asian, Hispanic, and non-Hispanic white preschool children. Population-based, cross-sectional study. The Multi-Ethnic Pediatric Eye Disease Study is a population-based evaluation of the prevalence of vision disorders in children in Southern California. A total of 5960 subjects 30 to 72 months of age were recruited for the study, of whom 4177 were able to complete color vision testing (1265 black, 812 Asian, 1280 Hispanic, and 820 non-Hispanic white). Color vision testing was performed using Color Vision Testing Made Easy color plates (Home Vision Care, Gulf Breeze, FL), and diagnostic confirmatory testing was performed using the Waggoner HRR Diagnostic Test color plates (Home Vision Care). Testability of color vision in preschool children between 30 and 72 months of age and prevalence of CVD stratified by age, sex, and ethnicity. Testability was 17% in children younger than 37 months of age, increasing to 57% in children 37 to 48 months of age, 89% in children 49 to 60 months of age, and 98% in children 61 to 72 months of age. The prevalence of CVD among boys was 1.4% for black, 3.1% for Asian, 2.6% for Hispanic, and 5.6% for non-Hispanic white children; the prevalence in girls was 0.0% to 0.5% for all ethnicities. The ethnic difference in CVD was statistically significant between black and non-Hispanic white children (P = 0.0003) and between Hispanic and non-Hispanic white children (P = 0.02). In boys, most CVD cases were either deutan (51%) or protan (34%); 32% were classified as mild, 15% as moderate, and 41% as severe. Testability for CVD in preschool children is high by 4 years of age. The prevalence of CVD in preschool boys varies by ethnicity, with the highest prevalence in non-Hispanic white and lowest in black children.
Exploring Operational Test and Evaluation of Unmanned Aircraft Systems: A Qualitative Case Study
NASA Astrophysics Data System (ADS)
Saliceti, Jose A.
The purpose of this qualitative case study was to explore and identify strategies that may potentially remedy operational test and evaluation procedures used to evaluate Unmanned Aircraft Systems (UAS) technology. The sample for analysis consisted of organizations testing and evaluating UASs (e.g., U.S. Air Force, U.S. Navy, U.S. Army, U.S. Marine Corps, U.S. Coast Guard, and Customs and Border Protection). A purposeful sampling technique was used to select 15 subject matter experts in the field of operational test and evaluation of UASs. A questionnaire was provided to participants to support a descriptive and robust analysis. Analysis of responses revealed themes related to each research question. Findings revealed that operational testers utilized requirements documents to extrapolate measures for testing UAS technology and develop critical operational issues. The requirements documents were (a) developed without the contribution of stakeholders and operational testers, (b) developed with vague or unrealistic measures, and (c) developed without a systematic method to derive requirements from mission tasks. Four approaches are recommended to develop testable operational requirements and assist operational testers: (a) use a mission task analysis tool to derive requirements for mission-essential tasks for the system, (b) exercise collaboration among stakeholders and testers to ensure testable operational requirements based on mission tasks, (c) ensure testable measures are used in requirements documents, and (d) create a repository list of critical operational issues by mission area. The preparation of operational test and evaluation processes for UAS technology is not uniform across testers. The processes in place are not standardized, thus test plan preparation and reporting differ among participants. A standard method should be used when preparing and reporting on UAS technology tests.
Using a systematic process, such as mission-based test design, resonated among participants as an analytical method to link UAS mission tasks and measures of performance to the capabilities of the system under test when developing operational test plans. Further research should examine system engineering designs for system requirements traceability matrix of mission tasks and subtasks while using an analysis tool that adequately evaluates UASs with an acceptable level of confidence in the results.
Learning Multisensory Integration and Coordinate Transformation via Density Estimation
Sabes, Philip N.
2013-01-01
Sensory processing in the brain includes three key operations: multisensory integration—the task of combining cues into a single estimate of a common underlying stimulus; coordinate transformations—the change of reference frame for a stimulus (e.g., retinotopic to body-centered) effected through knowledge about an intervening variable (e.g., gaze position); and the incorporation of prior information. Statistically optimal sensory processing requires that each of these operations maintains the correct posterior distribution over the stimulus. Elements of this optimality have been demonstrated in many behavioral contexts in humans and other animals, suggesting that the neural computations are indeed optimal. That the relationships between sensory modalities are complex and plastic further suggests that these computations are learned—but how? We provide a principled answer, by treating the acquisition of these mappings as a case of density estimation, a well-studied problem in machine learning and statistics, in which the distribution of observed data is modeled in terms of a set of fixed parameters and a set of latent variables. In our case, the observed data are unisensory-population activities, the fixed parameters are synaptic connections, and the latent variables are multisensory-population activities. In particular, we train a restricted Boltzmann machine with the biologically plausible contrastive-divergence rule to learn a range of neural computations not previously demonstrated under a single approach: optimal integration; encoding of priors; hierarchical integration of cues; learning when not to integrate; and coordinate transformation. The model makes testable predictions about the nature of multisensory representations. PMID:23637588
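The learning rule this abstract invokes, contrastive divergence, can be sketched for a tiny binary restricted Boltzmann machine. This is a toy illustration: the dimensions, learning rate, and data are arbitrary, and the paper's actual model operates on unisensory/multisensory population codes rather than 4-bit vectors.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class TinyRBM:
    """Minimal binary RBM trained with one-step contrastive divergence
    (CD-1). A sketch of the rule, not the paper's model."""

    def __init__(self, n_vis, n_hid, seed=0):
        self.rng = random.Random(seed)
        self.W = [[self.rng.gauss(0.0, 0.1) for _ in range(n_hid)]
                  for _ in range(n_vis)]
        self.b = [0.0] * n_vis  # visible biases
        self.c = [0.0] * n_hid  # hidden biases

    def hid_probs(self, v):
        return [sigmoid(self.c[j] + sum(v[i] * self.W[i][j]
                                        for i in range(len(v))))
                for j in range(len(self.c))]

    def vis_probs(self, h):
        return [sigmoid(self.b[i] + sum(h[j] * self.W[i][j]
                                        for j in range(len(h))))
                for i in range(len(self.b))]

    def cd1_update(self, v0, lr=0.1):
        h0 = self.hid_probs(v0)                               # positive phase
        hs = [1.0 if self.rng.random() < p else 0.0 for p in h0]
        v1 = self.vis_probs(hs)                               # one Gibbs step
        h1 = self.hid_probs(v1)                               # negative phase
        for i in range(len(v0)):
            for j in range(len(h0)):
                self.W[i][j] += lr * (v0[i] * h0[j] - v1[i] * h1[j])
            self.b[i] += lr * (v0[i] - v1[i])
        for j in range(len(h0)):
            self.c[j] += lr * (h0[j] - h1[j])

    def reconstruction_error(self, v):
        v1 = self.vis_probs(self.hid_probs(v))   # mean-field reconstruction
        return sum((a - b) ** 2 for a, b in zip(v, v1))
```

Repeated CD-1 updates on observed activity drive down the reconstruction error, which is the density-estimation sense in which the weights (synaptic connections) come to model the data; the hidden units play the role of the multisensory population in the paper's framing.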
NASA Tech Briefs, December 2012
NASA Technical Reports Server (NTRS)
2012-01-01
The topics include: Pattern Generator for Bench Test of Digital Boards; 670-GHz Down- and Up-Converting HEMT-Based Mixers; Lidar Electro-Optic Beam Switch with a Liquid Crystal Variable Retarder; Feedback Augmented Sub-Ranging (FASR) Quantizer; Real-Time Distributed Embedded Oscillator Operating Frequency Monitoring; Software Modules for the Proximity-1 Space Link Interleaved Time Synchronization (PITS) Protocol; Description and User Instructions for the Quaternion to Orbit v3 Software; AdapChem; Mars Relay Lander and Orbiter Overflight Profile Estimation; Extended Testability Analysis Tool; Interactive 3D Mars Visualization; Rapid Diagnostics of Onboard Sequences; MER Telemetry Processor; pyam: Python Implementation of YaM; Process for Patterning Indium for Bump Bonding; Archway for Radiation and Micrometeorite Occurrence Resistance; 4D Light Field Imaging System Using Programmable Aperture; Device and Container for Reheating and Sterilization; Radio Frequency Plasma Discharge Lamps for Use as Stable Calibration Light Sources; Membrane Shell Reflector Segment Antenna; High-Speed Transport of Fluid Drops and Solid Particles via Surface Acoustic Waves; Compact Autonomous Hemispheric Vision System; A Distributive, Non-Destructive, Real-Time Approach to Snowpack Monitoring; Wideband Single-Crystal Transducer for Bone Characterization; Numerical Simulation of Rocket Exhaust Interaction With Lunar Soil; Motion Imagery and Robotics Application (MIRA): Standards-Based Robotics; Particle Filtering for Model-Based Anomaly Detection in Sensor Networks; Ka-band Digitally Beamformed Airborne Radar Using SweepSAR Technique; Composite With In Situ Plenums; Multi-Beam Approach for Accelerating Alignment and Calibration of HyspIRI-Like Imaging Spectrometers; JWST Lifting System; Next-Generation Tumbleweed Rover; Pneumatic System for Concentration of Micrometer-Size Lunar Soil.
Patterns of Drug Distribution: Implications and Issues#
Johnson, Bruce D.
2007-01-01
This article delineates various patterns of illicit sales of drugs, especially at the retail (and near-retail) level, addressing a variety of central issues about drug sales and distribution documented during the past 30 years, including: a) the links between drug consumption and drug distribution activities; b) the various distribution roles; c) various levels of the distribution hierarchy; d) types of retail and wholesale markets; e) the association of drug distribution with nondrug associated criminality and violence. The article also will address the implications of drug distribution: whether various public policies such as supply reduction and source interdiction affect illicit drug markets, and how policing strategies and various law enforcement strategies can influence the involvement of individual participation in drug distribution activities. The overlooked contribution of treatment for “drug abuse” to reducing drug sales and distribution activities also will be considered as will other critical unresolved issues. PMID:14582578
Patterns of drug distribution: implications and issues.
Johnson, Bruce D
2003-01-01
This article delineates various patterns of illicit sales of drugs, especially at the retail (and near-retail) level, addressing a variety of central issues about drug sales and distribution documented during the past 30 years, including: a) the links between drug consumption and drug distribution activities; b) the various distribution roles; c) various levels of the distribution hierarchy; d) types of retail and wholesale markets; e) the association of drug distribution with nondrug associated criminality and violence. The article also will address the implications of drug distribution: whether various public policies such as supply reduction and source interdiction affect illicit drug markets, and how policing strategies and various law enforcement strategies can influence the involvement of individual participation in drug distribution activities. The overlooked contribution of treatment for "drug abuse" to reducing drug sales and distribution activities also will be considered as will other critical unresolved issues.
NASA Astrophysics Data System (ADS)
Hirata, N.; Yokoi, S.; Nanjo, K. Z.; Tsuruoka, H.
2012-04-01
One major focus of the current Japanese earthquake prediction research program (2009-2013), which is now integrated with the research program for prediction of volcanic eruptions, is to move toward creating testable earthquake forecast models. For this purpose we started an experiment in forecasting earthquake activity in Japan under the framework of the Collaboratory for the Study of Earthquake Predictability (CSEP) through an international collaboration. We established the CSEP Testing Center, an infrastructure to encourage researchers to develop testable models for Japan and to conduct verifiable prospective tests of their model performance, and began the first earthquake forecast testing experiment in Japan within the CSEP framework. We use the earthquake catalogue maintained and provided by the Japan Meteorological Agency (JMA). The experiment consists of 12 categories: 4 testing classes with different time spans (1 day, 3 months, 1 year, and 3 years) and 3 testing regions called "All Japan," "Mainland," and "Kanto." A total of 105 models were submitted and are currently under the CSEP official suite of tests for evaluating the performance of forecasts. The experiments have been completed for 92 rounds of the 1-day class, 6 rounds of the 3-month class, and 3 rounds of the 1-year class. For the 1-day testing class, all models passed all of CSEP's evaluation tests in more than 90% of rounds. The results of the 3-month testing class also gave us new knowledge concerning statistical forecasting models. All models showed good performance for magnitude forecasting. On the other hand, the observed spatial distribution is hardly consistent with most models when many earthquakes occur at a single spot. We are now preparing a 3-D forecasting experiment with a depth range of 0 to 100 km in the Kanto region. The testing center is improving the evaluation system for the 1-day class experiment so that forecasting and testing results are finished within one day.
The first part of the special issue titled "Earthquake Forecast Testing Experiment in Japan" was published in Earth, Planets and Space, Vol. 63, No. 3, in March 2011. The second part of this issue, which is now online, will be published soon. An outline of the experiment and the activities of the Japanese Testing Center are published on our website: http://wwweic.eri.u-tokyo.ac.jp/ZISINyosoku/wiki.en/wiki.cgi
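One of the standard CSEP consistency checks is the number (N) test, which asks whether the observed earthquake count is plausible under the forecast's expected rate. The sketch below is a simplified two-sided version with invented numbers, not the testing center's implementation.

```python
import math

def poisson_cdf(n, lam):
    """P(N <= n) for a Poisson(lam) count."""
    return sum(math.exp(-lam) * lam ** k / math.factorial(k)
               for k in range(n + 1))

def n_test(observed, forecast_rate, alpha=0.05):
    """Simplified CSEP-style number test: the forecast is judged consistent
    with the observation unless the observed count lies in either tail of
    the forecast's Poisson distribution."""
    p_low = poisson_cdf(observed, forecast_rate)            # P(N <= obs)
    p_high = (1.0 - poisson_cdf(observed - 1, forecast_rate)
              if observed > 0 else 1.0)                     # P(N >= obs)
    return p_low >= alpha / 2 and p_high >= alpha / 2
```

For example, a model forecasting 10 events in a testing region is consistent with observing 10, but is rejected if 30 occur; the official suite applies this alongside spatial, magnitude, and likelihood tests.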
NASA Astrophysics Data System (ADS)
Archibong, Belinda
While previous literature has emphasized the importance of energy and public infrastructure services for economic development, questions surrounding the implications of unequal spatial distribution in access to these resources remain, particularly in the developing-country context. This dissertation provides evidence on the nature, origins and implications of this distribution, uniting three strands of research from the development and political economy, regional science and energy economics fields. It comprises three papers: on the nature of spatial inequality of access to energy and infrastructure, with further implications for conflict risk; on the historical institutional and biogeographical determinants of the current distribution of access to energy and public infrastructure services; and on the response of households to fuel price changes over time. Chapter 2 uses a novel survey dataset to provide evidence for spatial clustering of public infrastructure non-functionality at schools by geopolitical zone in Nigeria, with further implications for armed conflict risk in the region. Chapter 3 investigates the drivers of the results in Chapter 2, exploiting variation in the spatial distribution of precolonial institutions and geography in the region to provide evidence for the long-term impacts of these factors on the current heterogeneity of access to public services. Chapter 4 addresses the policy implications of energy access, providing the first multi-year evidence on firewood demand elasticities in India, using the spatial variation in prices for estimation.
ERIC Educational Resources Information Center
Kang, Che Chang
2014-01-01
The study aimed at investigating TOEIC score distribution patterns and learner satisfaction in an intensive TOEIC course and drew implications for pedagogical practice. A one-group pre-test post-test experiment and a survey on learner satisfaction were conducted on Taiwanese college EFL students (n = 50) in a case study. Results showed that the…
Neutrino CP phases from sneutrino chaotic inflation
NASA Astrophysics Data System (ADS)
Nakayama, Kazunori; Takahashi, Fuminobu; Yanagida, Tsutomu T.
2017-10-01
We study whether the minimal sneutrino chaotic inflation is consistent with a flavor symmetry of the Froggatt-Nielsen type, to derive testable predictions on the Dirac and Majorana CP-violating phases, δ and α. For successful inflation, the two right-handed neutrinos, i.e., the inflaton and stabilizer fields, must be degenerate in mass. First, we find that the lepton flavor symmetry structure becomes less manifest in the light neutrino masses in the seesaw mechanism, and this tendency becomes most prominent when the right-handed neutrinos are degenerate. Secondly, the Dirac CP phase turns out to be sensitive to whether the shift symmetry breaking depends on the lepton flavor symmetry. When the flavor symmetry is imposed only on the stabilizer Yukawa couplings, distributions of the CP phases are peaked at δ ≃ ±π/4, ±3π/4 and α = 0, while vanishing and maximal Dirac CP phases are disfavored. On the other hand, when the flavor symmetry is imposed on both the inflaton and stabilizer Yukawa couplings, it is rather difficult to explain the observed neutrino data, and those parameters consistent with observation prefer the vanishing CP phases δ = 0, π and α = 0.
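For context, the seesaw mechanism referred to here is the standard type-I relation (generic notation, not the paper's):

```latex
m_\nu \simeq - \, m_D \, M_R^{-1} \, m_D^{T}
```

where $m_D$ is the Dirac mass matrix generated by the neutrino Yukawa couplings and $M_R$ is the Majorana mass matrix of the heavy right-handed neutrinos. In the degenerate case required for successful sneutrino inflation, $M_R$ is proportional to the identity in the inflaton-stabilizer sector, which is why the flavor structure of the light masses is set almost entirely by $m_D$ and becomes less manifest, as the abstract notes.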
NASA Technical Reports Server (NTRS)
Hansman, Robert John, Jr.
1999-01-01
MIT has investigated Situational Awareness issues relating to the implementation of Datalink in the Air Traffic Control environment for a number of years under this grant activity. This work has investigated: 1) The Effect of "Party Line" Information. 2) The Effect of Datalink-Enabled Automated Flight Management Systems (FMS) on Flight Crew Situational Awareness. 3) The Effect of Cockpit Display of Traffic Information (CDTI) on Situational Awareness During Close Parallel Approaches. 4) Analysis of Flight Path Management Functions in Current and Future ATM Environments. 5) Human Performance Models in Advanced ATC Automation: Flight Crew and Air Traffic Controllers. 6) CDTI of Datalink-Based Intent Information in Advanced ATC Environments. 7) Shared Situational Awareness between the Flight Deck and ATC in Datalink-Enabled Environments. 8) Analysis of Pilot and Controller Shared SA Requirements & Issues. 9) Development of Robust Scenario Generation and Distributed Simulation Techniques for Flight Deck ATC Simulation. 10) Methods of Testing Situation Awareness Using Testable Response Techniques. The work is detailed in specific technical reports that are listed in the following bibliography, and are attached as an appendix to the master final technical report.
Global Change And Water Availability And Quality: Challenges Ahead
NASA Astrophysics Data System (ADS)
Larsen, M. C.; Ryker, S. J.
2012-12-01
The United States is in the midst of a continental-scale, multi-year water-resources experiment, in which society has not defined testable hypotheses or set the duration and scope of the experiment. What are we doing? We are expanding population at two to three times the national growth rate in our most water-scarce states, in the southwest, where water stress is already great and modeling predicts decreased streamflow by the middle of this century. We are expanding irrigated agriculture from the west into the east, particularly to the southeastern states, where increased competition for ground and surface water has urban, agricultural, and environmental interests at odds, and increasingly, in court. We are expanding our consumption of pharmaceutical and personal care products to historic high levels and disposing of them in surface and groundwater, through sewage treatment plants and individual septic systems that were not designed to treat them. These and other examples of our national-scale experiment are likely to continue well into the 21st century. This experiment and related challenges will continue and likely intensify as non-climatic and climatic factors, such as predicted rising temperature and changes in the distribution of precipitation in time and space, continue to develop.
A parallel implementation of an off-lattice individual-based model of multicellular populations
NASA Astrophysics Data System (ADS)
Harvey, Daniel G.; Fletcher, Alexander G.; Osborne, James M.; Pitt-Francis, Joe
2015-07-01
As computational models of multicellular populations include ever more detailed descriptions of biophysical and biochemical processes, the computational cost of simulating such models limits their ability to generate novel scientific hypotheses and testable predictions. While developments in microchip technology continue to increase the power of individual processors, parallel computing offers an immediate increase in available processing power. To make full use of parallel computing technology, it is necessary to develop specialised algorithms. To this end, we present a parallel algorithm for a class of off-lattice individual-based models of multicellular populations. The algorithm divides the spatial domain between computing processes and comprises communication routines that ensure the model is correctly simulated on multiple processors. The parallel algorithm is shown to accurately reproduce the results of a deterministic simulation performed using a pre-existing serial implementation. We test the scaling of computation time, memory use and load balancing as more processes are used to simulate a cell population of fixed size. We find approximate linear scaling of both speed-up and memory consumption on up to 32 processor cores. Dynamic load balancing is shown to provide speed-up for non-regular spatial distributions of cells in the case of a growing population.
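The domain-splitting idea in this abstract can be sketched in a few lines. What follows is a deliberately simplified, serial toy (1D positions, equal-count strips), not the authors' actual parallel implementation; the function names are invented for illustration.

```python
def partition_domain(positions, n_procs):
    """Split cell positions into n_procs contiguous spatial strips,
    each holding a (nearly) equal number of cells."""
    ordered = sorted(positions)
    base, extra = divmod(len(ordered), n_procs)
    strips, start = [], 0
    for p in range(n_procs):
        size = base + (1 if p < extra else 0)
        strips.append(ordered[start:start + size])
        start += size
    return strips

def load_imbalance(strips):
    """Max-to-mean ratio of cells per process; 1.0 means perfectly balanced."""
    sizes = [len(s) for s in strips]
    return max(sizes) / (sum(sizes) / len(sizes))
```

Balancing by cell count rather than by fixed strip width is what yields a speed-up for the non-regular (e.g. clustered or growing) populations the abstract describes.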
Covariations in ecological scaling laws fostered by community dynamics.
Zaoli, Silvia; Giometto, Andrea; Maritan, Amos; Rinaldo, Andrea
2017-10-03
Scaling laws in ecology, intended both as functional relationships among ecologically relevant quantities and the probability distributions that characterize their occurrence, have long attracted the interest of empiricists and theoreticians. Empirical evidence exists of power laws associated with the number of species inhabiting an ecosystem, their abundances, and traits. Although their functional form appears to be ubiquitous, empirical scaling exponents vary with ecosystem type and resource supply rate. The idea that ecological scaling laws are linked has been entertained before, but the full extent of macroecological pattern covariations, the role of the constraints imposed by finite resource supply, and a comprehensive empirical verification are still unexplored. Here, we propose a theoretical scaling framework that predicts the linkages of several macroecological patterns related to species' abundances and body sizes. We show that such a framework is consistent with the stationary-state statistics of a broad class of resource-limited community dynamics models, regardless of parameterization and model assumptions. We verify predicted theoretical covariations by contrasting empirical data and provide testable hypotheses for yet unexplored patterns. We thus place the observed variability of ecological scaling exponents into a coherent statistical framework where patterns in ecology embed constrained fluctuations.
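The empirical scaling exponents discussed in this abstract are typically estimated by least-squares regression on log-log axes. A minimal, generic sketch of that step (not the paper's covariation framework, which links several such exponents):

```python
import math

def fit_power_law(xs, ys):
    """Fit y ~ c * x**b by ordinary least squares in log-log space;
    returns (exponent b, prefactor c)."""
    lx = [math.log(v) for v in xs]
    ly = [math.log(v) for v in ys]
    n = len(lx)
    mx, my = sum(lx) / n, sum(ly) / n
    b = sum((u - mx) * (v - my) for u, v in zip(lx, ly)) / sum((u - mx) ** 2 for u in lx)
    c = math.exp(my - b * mx)
    return b, c
```

Data generated exactly as y = 3·x^0.75 recovers b = 0.75 and c = 3; real ecological data scatter around the line, which is why the observed exponents vary with ecosystem type and resource supply rate.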
A systematic survey of lipids across mouse tissues
Jain, Mohit; Ngoy, Soeun; Sheth, Sunil A.; Swanson, Raymond A.; Rhee, Eugene P.; Liao, Ronglih; Clish, Clary B.; Mootha, Vamsi K.
2014-01-01
Lipids are a diverse collection of macromolecules essential for normal physiology, but the tissue distribution and function for many individual lipid species remain unclear. Here, we report a mass spectrometry survey of lipid abundance across 18 mouse tissues, detecting ∼1,000 mass spectrometry features, of which we identify 179 lipids from the glycerolipids, glycerophospholipids, lysophospholipids, acylcarnitines, sphingolipids, and cholesteryl ester classes. Our data reveal tissue-specific organization of lipids and can be used to generate testable hypotheses. For example, our data indicate that circulating triglycerides positively and negatively associated with future diabetes in humans are enriched in mouse adipose tissue and liver, respectively, raising hypotheses regarding the tissue origins of these diabetes-associated lipids. We also integrate our tissue lipid data with gene expression profiles to predict a number of substrates of lipid-metabolizing enzymes, highlighting choline phosphotransferases and sterol O-acyltransferases. Finally, we identify several tissue-specific lipids not present in plasma under normal conditions that may be of interest as biomarkers of tissue injury, and we show that two of these lipids are released into blood following ischemic brain injury in mice. This resource complements existing compendia of tissue gene expression and may be useful for integrative physiology and lipid biology. PMID:24518676
Boosting the Performance of Ionic-Liquid-Based Supercapacitors with Polar Additives
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Kun; Wu, Jianzhong
Recent years have witnessed growing interest in both the fundamentals and applications of electric double layer capacitors (EDLCs), also known as supercapacitors. A number of strategies have been explored to optimize the device performance in terms of both the energy and power densities. Because the properties of electric double layers (EDL) are sensitive to ion distributions in the close vicinity of the electrode surfaces, the supercapacitor performance is sensitive to both the electrode pore structure and the electrolyte composition. In this paper, we study the effects of polar additives on EDLC capacitance using the classical density functional theory within the framework of a coarse-grained model for the microscopic structure of the porous electrodes and room-temperature ionic liquids. The theoretical results indicate that a highly polar, low-molecular-weight additive is able to drastically increase the EDLC capacitance at low bulk concentration. Additionally, the additive is able to dampen the oscillatory dependence of the capacitance on the pore size, thereby boosting the performance of amorphous electrode materials. Finally, the theoretical predictions are directly testable with experiments and provide new insights into the additive effects on EDL properties.
NEXUS Scalable and Distributed Next-Generation Avionics Bus for Space Missions
NASA Technical Reports Server (NTRS)
He, Yutao; Shalom, Eddy; Chau, Savio N.; Some, Raphael R.; Bolotin, Gary S.
2011-01-01
A paper discusses NEXUS, a common, next-generation avionics interconnect that is transparently compatible with wired, fiber-optic, and RF physical layers; provides a flexible, scalable, packet-switched topology; is fault-tolerant with sub-microsecond detection/recovery latency; has scalable bandwidth from 1 Kbps to 10 Gbps; has guaranteed real-time determinism with sub-microsecond latency/jitter; has built-in testability; features low power consumption (< 100 mW per Gbps); is lightweight with about a 5,000-logic-gate footprint; and is implemented in a small Bus Interface Unit (BIU) with reconfigurable back-end providing interface to legacy subsystems. NEXUS enhances a commercial interconnect standard, Serial RapidIO, to meet avionics interconnect requirements without breaking the standard. This unified interconnect technology can be used to meet performance, power, size, and reliability requirements of all ranges of equipment, sensors, and actuators at chip-to-chip, board-to-board, or box-to-box boundaries. Early results from in-house modeling activity of Serial RapidIO using VisualSim indicate that the use of a switched, high-performance avionics network will provide a quantum leap in spacecraft onboard science and autonomy capability for science and exploration missions.
What is wrong with intelligent design?
Sober, Elliott
2007-03-01
This article reviews two standard criticisms of creationism/intelligent design (ID): it is unfalsifiable, and it is refuted by the many imperfect adaptations found in nature. Problems with both criticisms are discussed. A conception of testability is described that avoids the defects in Karl Popper's falsifiability criterion. Although ID comes in multiple forms, which call for different criticisms, it emerges that ID fails to constitute a serious alternative to evolutionary theory.
Integrating principles and multidisciplinary projects in design education
NASA Technical Reports Server (NTRS)
Nevill, Gale E., Jr.
1992-01-01
The critical need to improve engineering design education in the U.S. is presented and a number of actions to achieve that end are discussed. The importance of teaching undergraduates the latest methods and principles through the means of team design in multidisciplinary projects leading to a testable product is emphasized. Desirable training for design instructors is described and techniques for selecting and managing projects that teach effectively are discussed.
Report on phase 1 of the Microprocessor Seminar. [and associated large scale integration]
NASA Technical Reports Server (NTRS)
1977-01-01
Proceedings of a seminar on microprocessors and associated large scale integrated (LSI) circuits are presented. The potential for commonality of device requirements, candidate processes and mechanisms for qualifying candidate LSI technologies for high reliability applications, and specifications for testing and testability were among the topics discussed. Various programs and tentative plans of the participating organizations in the development of high reliability LSI circuits are given.
It takes two to talk: a second-person neuroscience approach to language learning.
Syal, Supriya; Anderson, Adam K
2013-08-01
Language is a social act. We have previously argued that language remains embedded in sociality because the motivation to communicate exists only within a social context. Schilbach et al. underscore the importance of studying linguistic behavior from within the motivated, socially interactive frame in which it is learnt and used, as well as provide testable hypotheses for a participatory, second-person neuroscience approach to language learning.
Are some BL Lac objects artefacts of gravitational lensing?
NASA Technical Reports Server (NTRS)
Ostriker, J. P.; Vietri, M.
1985-01-01
It is proposed here that a significant fraction of BL Lac objects are optically violently variable quasars whose continuum emission has been greatly amplified, relative to the line emission, by pointlike gravitational lenses in intervening galaxies. Several anomalous physical and statistical properties of BL Lacs can be understood on the basis of this model, which is immediately testable on the basis of absorption line studies and by direct imaging.
Empirical approaches to the study of language evolution.
Fitch, W Tecumseh
2017-02-01
The study of language evolution, and human cognitive evolution more generally, has often been ridiculed as unscientific, but in fact it differs little from many other disciplines that investigate past events, such as geology or cosmology. Well-crafted models of language evolution make numerous testable hypotheses, and if the principles of strong inference (simultaneous testing of multiple plausible hypotheses) are adopted, there is an increasing amount of relevant data allowing empirical evaluation of such models. The articles in this special issue provide a concise overview of current models of language evolution, emphasizing the testable predictions that they make, along with overviews of the many sources of data available to test them (emphasizing comparative, neural, and genetic data). The key challenge facing the study of language evolution is not a lack of data, but rather a weak commitment to hypothesis-testing approaches and strong inference, exacerbated by the broad and highly interdisciplinary nature of the relevant data. This introduction offers an overview of the field, and a summary of what needed to evolve to provide our species with language-ready brains. It then briefly discusses different contemporary models of language evolution, followed by an overview of different sources of data to test these models. I conclude with my own multistage model of how different components of language could have evolved.
Are there two processes in reasoning? The dimensionality of inductive and deductive inferences.
Stephens, Rachel G; Dunn, John C; Hayes, Brett K
2018-03-01
Single-process accounts of reasoning propose that the same cognitive mechanisms underlie inductive and deductive inferences. In contrast, dual-process accounts propose that these inferences depend upon 2 qualitatively different mechanisms. To distinguish between these accounts, we derived a set of single-process and dual-process models based on an overarching signal detection framework. We then used signed difference analysis to test each model against data from an argument evaluation task, in which induction and deduction judgments are elicited for sets of valid and invalid arguments. Three data sets were analyzed: data from Singmann and Klauer (2011), a database of argument evaluation studies, and the results of an experiment designed to test model predictions. Of the large set of testable models, we found that almost all could be rejected, including all 2-dimensional models. The only testable model able to account for all 3 data sets was a model with 1 dimension of argument strength and independent decision criteria for induction and deduction judgments. We conclude that despite the popularity of dual-process accounts, current results from the argument evaluation task are best explained by a single-process account that incorporates separate decision thresholds for inductive and deductive inferences. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
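The winning model can be caricatured in a few lines: a single latent strength per argument drives both judgment types, and only the decision criteria differ. The criterion values below are illustrative placeholders, not fitted parameters from the paper.

```python
# One-dimensional signal detection sketch: one argument-strength value,
# two independent decision criteria.
DEDUCTION_CRITERION = 0.8   # strict criterion for "is it valid?"
INDUCTION_CRITERION = 0.5   # lenient criterion for "is it strong?"

def judge(strength):
    """Return induction and deduction judgments for one argument."""
    return {
        "deduction_valid": strength > DEDUCTION_CRITERION,
        "induction_strong": strength > INDUCTION_CRITERION,
    }
```

One testable signature of this single-process account: because both criteria sit on the same dimension, any argument judged deductively valid must also be judged inductively strong, an ordering a two-dimensional dual-process model need not respect.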
Moses Lake Fishery Restoration Project : FY 1999 Annual Report.
DOE Office of Scientific and Technical Information (OSTI.GOV)
None given
2000-12-01
The Moses Lake Project consists of 3 phases. Phase 1 is the assessment of all currently available physical and biological information, the collection of baseline biological data, the formulation of testable hypotheses, and the development of a detailed study plan to test the hypotheses. Phase 2 is dedicated to the implementation of the study plan, including data collection, hypotheses testing, and the formulation of a management plan. Phase 3 of the project is the implementation of the management plan and the monitoring and evaluation of the implemented recommendations. The project intends to restore the failed recreational fishery for panfish species (black crappie, bluegill and yellow perch) in Moses Lake as off-site mitigation for lost recreational fishing opportunities for anadromous species in the upper Columbia River. This report summarizes the results of Phase 1 investigations and presents the study plan directed at initiating Phase 2 of the project. Phase 1 of the project culminates with the formulation of testable hypotheses directed at investigating possible limiting factors to the production of panfish in Moses Lake. The limiting factors to be investigated will include water quality, habitat quantity and quality, food limitations, competition, recruitment, predation, over-harvest, environmental requirements, and the physical and chemical limitations of the system in relation to the fishes.
Phenoscape: Identifying Candidate Genes for Evolutionary Phenotypes
Edmunds, Richard C.; Su, Baofeng; Balhoff, James P.; Eames, B. Frank; Dahdul, Wasila M.; Lapp, Hilmar; Lundberg, John G.; Vision, Todd J.; Dunham, Rex A.; Mabee, Paula M.; Westerfield, Monte
2016-01-01
Phenotypes resulting from mutations in genetic model organisms can help reveal candidate genes for evolutionarily important phenotypic changes in related taxa. Although testing candidate gene hypotheses experimentally in nonmodel organisms is typically difficult, ontology-driven information systems can help generate testable hypotheses about developmental processes in experimentally tractable organisms. Here, we tested candidate gene hypotheses suggested by expert use of the Phenoscape Knowledgebase, specifically looking for genes that are candidates responsible for evolutionarily interesting phenotypes in the ostariophysan fishes that bear resemblance to mutant phenotypes in zebrafish. For this, we searched ZFIN for genetic perturbations that result in either loss of basihyal element or loss of scales phenotypes, because these are the ancestral phenotypes observed in catfishes (Siluriformes). We tested the identified candidate genes by examining their endogenous expression patterns in the channel catfish, Ictalurus punctatus. The experimental results were consistent with the hypotheses that these features evolved through disruption in developmental pathways at, or upstream of, brpf1 and eda/edar for the ancestral losses of basihyal element and scales, respectively. These results demonstrate that ontological annotations of the phenotypic effects of genetic alterations in model organisms, when aggregated within a knowledgebase, can be used effectively to generate testable, and useful, hypotheses about evolutionary changes in morphology. PMID:26500251
Feldstein Ewing, Sarah W.; Filbey, Francesca M.; Hendershot, Christian S.; McEachern, Amber D.; Hutchison, Kent E.
2011-01-01
Objective: Despite the prevalence and profound consequences of alcohol use disorders, psychosocial alcohol interventions have widely varying outcomes. The range of behavior following psychosocial alcohol treatment indicates the need to gain a better understanding of active ingredients and how they may operate. Although this is an area of great interest, at this time there is a limited understanding of how in-session behaviors may catalyze changes in the brain and subsequent alcohol use behavior. Thus, in this review, we aim to identify the neurobiological routes through which psychosocial alcohol interventions may lead to post-session behavior change as well as offer an approach to conceptualize and evaluate these translational relationships. Method: PubMed and PsycINFO searches identified studies that successfully integrated functional magnetic resonance imaging and psychosocial interventions. Results: Based on this research, we identified potential neurobiological substrates through which behavioral alcohol interventions may initiate and sustain behavior change. In addition, we proposed a testable model linking within-session active ingredients to outside-of-session behavior change. Conclusions: Through this review, we present a testable translational model. Additionally, we illustrate how the proposed model can help facilitate empirical evaluations of psychotherapeutic factors and their underlying neural mechanisms, both in the context of motivational interviewing and in the treatment of alcohol use disorders. PMID:22051204
A bioinformatics expert system linking functional data to anatomical outcomes in limb regeneration
Lobo, Daniel; Feldman, Erica B.; Shah, Michelle; Malone, Taylor J.
2014-01-01
Amphibians and molting arthropods have the remarkable capacity to regenerate amputated limbs, as described by an extensive literature of experimental cuts, amputations, grafts, and molecular techniques. Despite a rich history of experimental effort, no comprehensive mechanistic model exists that can account for the pattern regulation observed in these experiments. While bioinformatics algorithms have revolutionized the study of signaling pathways, no such tools have heretofore been available to assist scientists in formulating testable models of large‐scale morphogenesis that match published data in the limb regeneration field. Major barriers to an algorithmic approach are the lack of formal descriptions for experimental regenerative information and of a repository to centralize storage and mining of functional data on limb regeneration. Establishing a new bioinformatics of shape would significantly accelerate the discovery of key insights into the mechanisms that implement complex regeneration. Here, we describe a novel mathematical ontology for limb regeneration to unambiguously encode phenotype, manipulation, and experiment data. Based on this formalism, we present the first centralized formal database of published limb regeneration experiments together with a user‐friendly expert system tool to facilitate its access and mining. These resources are freely available for the community and will assist both human biologists and artificial intelligence systems to discover testable, mechanistic models of limb regeneration. PMID:25729585
Zhou, Shaona; Han, Jing; Koenig, Kathleen; Raplinger, Amy; Pi, Yuan; Li, Dan; Xiao, Hua; Fu, Zhao; Bao, Lei
2016-03-01
Scientific reasoning is an important component under the cognitive strand of the 21st century skills and is highly emphasized in the new science education standards. This study focuses on the assessment of student reasoning in control of variables (COV), which is a core sub-skill of scientific reasoning. The main research question is to investigate the extent to which the existence of experimental data in questions impacts student reasoning and performance. This study also explores the effects of task contexts on student reasoning as well as students' abilities to distinguish between testability and causal influences of variables in COV experiments. Data were collected with students from both USA and China. Students received randomly one of two test versions, one with experimental data and one without. The results show that students from both populations (1) perform better when experimental data are not provided, (2) perform better in physics contexts than in real-life contexts, and (3) students have a tendency to equate non-influential variables to non-testable variables. In addition, based on the analysis of both quantitative and qualitative data, a possible progression of developmental levels of student reasoning in control of variables is proposed, which can be used to inform future development of assessment and instruction.
NASA Space Flight Vehicle Fault Isolation Challenges
NASA Technical Reports Server (NTRS)
Bramon, Christopher; Inman, Sharon K.; Neeley, James R.; Jones, James V.; Tuttle, Loraine
2016-01-01
The Space Launch System (SLS) is the new NASA heavy lift launch vehicle and is scheduled for its first mission in 2017. The goal of the first mission, which will be uncrewed, is to demonstrate the integrated system performance of the SLS rocket and spacecraft before a crewed flight in 2021. SLS has many of the same logistics challenges as any other large scale program. Common logistics concerns for SLS include integration of discrete programs geographically separated, multiple prime contractors with distinct and different goals, schedule pressures and funding constraints. However, SLS also faces unique challenges. The new program is a confluence of new hardware and heritage, with heritage hardware constituting seventy-five percent of the program. This unique approach to design makes logistics concerns such as testability of the integrated flight vehicle especially problematic. The cost of fully automated diagnostics can be completely justified for a large fleet, but not so for a single flight vehicle. Fault detection is mandatory to assure the vehicle is capable of a safe launch, but fault isolation is another issue. SLS has considered various methods for fault isolation which can provide a reasonable balance between adequacy, timeliness and cost. This paper will address the analyses and decisions the NASA Logistics engineers are making to mitigate risk while providing a reasonable testability solution for fault isolation.
Testability and epistemic shifts in modern cosmology
NASA Astrophysics Data System (ADS)
Kragh, Helge
2014-05-01
During the last decade new developments in theoretical and speculative cosmology have reopened the old discussion of cosmology's scientific status and the more general question of the demarcation between science and non-science. The multiverse hypothesis, in particular, is central to this discussion and controversial because it seems to disagree with methodological and epistemic standards traditionally accepted in the physical sciences. But what are these standards and how sacrosanct are they? Does anthropic multiverse cosmology rest on evaluation criteria that conflict with and go beyond those ordinarily accepted, so that it constitutes an "epistemic shift" in fundamental physics? The paper offers a brief characterization of the modern multiverse and also refers to a few earlier attempts to introduce epistemic shifts in the science of the universe. It further discusses the several meanings of testability, addresses the question of falsifiability as a sine qua non for a theory being scientific, and briefly compares the situation in cosmology with the one in systematic biology. Multiverse theory is not generally falsifiable, which has led to proposals from some physicists to overrule not only Popperian standards but also other evaluation criteria of a philosophical nature. However, this is hardly possible, nor is it possible to get rid of explicit philosophical considerations in some other aspects of cosmological research, however advanced it becomes.
A behavior change model for internet interventions.
Ritterband, Lee M; Thorndike, Frances P; Cox, Daniel J; Kovatchev, Boris P; Gonder-Frederick, Linda A
2009-08-01
The Internet has become a major component to health care and has important implications for the future of the health care system. One of the most notable aspects of the Web is its ability to provide efficient, interactive, and tailored content to the user. Given the wide reach and extensive capabilities of the Internet, researchers in behavioral medicine have been using it to develop and deliver interactive and comprehensive treatment programs with the ultimate goal of impacting patient behavior and reducing unwanted symptoms. To date, however, many of these interventions have not been grounded in theory or developed from behavior change models, and no overarching model to explain behavior change in Internet interventions has yet been published. The purpose of this article is to propose a model to help guide future Internet intervention development and predict and explain behavior changes and symptom improvement produced by Internet interventions. The model purports that effective Internet interventions produce (and maintain) behavior change and symptom improvement via nine nonlinear steps: the user, influenced by environmental factors, affects website use and adherence, which is influenced by support and website characteristics. Website use leads to behavior change and symptom improvement through various mechanisms of change. The improvements are sustained via treatment maintenance. By grounding Internet intervention research within a scientific framework, developers can plan feasible, informed, and testable Internet interventions, and this form of treatment will become more firmly established.
Multiple payers, commonality and free-riding in health care: Medicare and private payers.
Glazer, Jacob; McGuire, Thomas G
2002-11-01
Managed health care plans and providers in the US and elsewhere sell their services to multiple payers. For example, the three largest groups of purchasers from health plans in the US are employers, Medicaid plans, and Medicare, with the first two accounting for over 90% of the total enrollees. In the case of hospitals, Medicare is the largest buyer, but it alone only accounts for 40% of the total payments. While payers have different objectives and use different contracting practices, the plans and providers set some elements of the quality in common for all payers. In this paper, we study the interactions between a public payer, modeled on Medicare, which sets a price and takes any willing provider, a private payer, which limits providers and pays a price on the basis of quality, and a provider/plan, in the presence of shared elements of quality. The provider compromises in response to divergent incentives from payers. The private sector dilutes Medicare payment initiatives, and may, under some circumstances, repair Medicare payment policy mistakes. If Medicare behaves strategically in the presence of private payers, it can free-ride on the private payer and set its prices too low. Our paper has many testable implications, including a new hypothesis for why Medicare has failed to gain acceptance of health plans in the US.
Decaro, Daniel; Stokes, Michael
2008-12-01
Community-based natural resource conservation programs in developing nations face many implementation challenges underpinned by social-psychological mechanisms. One challenge is garnering local support in an economically and socially sustainable fashion despite economic hardship and historical alienation from local resources. Unfortunately, conservationists' limited understanding of the social-psychological mechanisms underlying participatory conservation impedes the search for appropriate solutions. We address this issue by revealing key underlying social-psychological mechanisms of participatory conservation. Different administrative designs create social atmospheres that differentially affect endorsement of conservation goals. Certain forms of endorsement may be less effective motivators and less economically and socially sustainable than others. From a literature review we found that conservation initiatives endorsed primarily for nonautonomous instrumental reasons, such as to avoid economic fines or to secure economic rewards, are less motivating than those endorsed for autonomous reasons, such as for the opportunity for personal expression and growth. We suggest that successful participatory programs promote autonomous endorsement of conservation through an administrative framework of autonomy support-free and open democratic participation in management, substantive recognition and inclusion of local stakeholder identity, and respectful, noncoercive social interaction. This framework of the autonomy-supportive environment (self-determination theory) has important implications for future research into program design and incentive-based conservation and identifies a testable social-psychological theory of conservancy motivation.
Does salt have a permissive role in the induction of puberty?
Pitynski, Dori; Flynn, Francis W; Skinner, Donal C
2015-10-01
Puberty is starting earlier than ever before and there are serious physiological and sociological implications as a result of this development. Current research has focused on the potential role of high caloric, and commensurate high adiposity, contributions to early puberty. However, girls with normal BMI also appear to be initiating puberty earlier. Westernized diets, in addition to being high in fat and sugar, are also high in salt. To date, no research has investigated a link between elevated salt and the reproductive axis. We hypothesize that a high salt diet can result in an earlier onset of puberty through three mechanisms that are not mutually exclusive. (1) High salt activates neurokinin B, a hormone that is involved in both the reproductive axis and salt regulation, and this induces kisspeptin release and ultimate activation of the reproductive axis. (2) Vasopressin released in response to high salt acts on vasopressin receptors expressed on kisspeptin neurons in the anteroventral periventricular nucleus, thereby stimulating gonadotropin releasing hormone and subsequently luteinizing hormone secretion. (3) Salt induces metabolic changes that affect the reproductive axis. Specifically, salt acts indirectly to modulate adiposity, ties in with the obesity epidemic, and further compounds the pathologic effects of obesity. Our overall hypothesis offers an additional cause behind the induction of puberty and provides testable postulates to determine the mechanism of potential salt-mediated effects on puberty. Copyright © 2015. Published by Elsevier Ltd.
Gouhier, Tarik C; Guichard, Frédéric
2007-03-01
In marine systems, the occurrence and implications of disturbance-recovery cycles have been revealed at the landscape level, but only in demographically open or closed systems where landscape-level dynamics are assumed to have no feedback effect on regional dynamics. We present a mussel metapopulation model to elucidate the role of landscape-level disturbance cycles for regional response of mussel populations to onshore productivity and larval transport. Landscape dynamics are generated through spatially explicit rules, and each landscape is connected to its neighbor through unidirectional larval dispersal. The role of landscape disturbance cycles in the regional system behavior is elucidated (1) in demographically open vs. demographically coupled systems, in relation to (2) onshore reproductive output and (3) the temporal scale of landscape disturbance dynamics. By controlling for spatial structure at the landscape and metapopulation levels, we first demonstrate the interaction between landscape and oceanographic connectivity. The temporal scale of disturbance cycles, as controlled by mussel colonization rate, plays a critical role in the regional behavior of the system. Indeed, fast disturbance cycles are responsible for regional synchrony in relation to onshore reproductive output. Slow disturbance cycles, however, lead to increased robustness to changes in productivity and to demographic coupling. These testable predictions indicate that the occurrence and temporal scale of local disturbance-recovery dynamics can drive large-scale variability in demographically open systems, and the response of metapopulations to changes in nearshore productivity.
Cyclin A2 promotes DNA repair in the brain during both development and aging.
Gygli, Patrick E; Chang, Joshua C; Gokozan, Hamza N; Catacutan, Fay P; Schmidt, Theresa A; Kaya, Behiye; Goksel, Mustafa; Baig, Faisal S; Chen, Shannon; Griveau, Amelie; Michowski, Wojciech; Wong, Michael; Palanichamy, Kamalakannan; Sicinski, Piotr; Nelson, Randy J; Czeisler, Catherine; Otero, José J
2016-07-01
Various stem cell niches of the brain have differential requirements for Cyclin A2. Cyclin A2 loss results in marked cerebellar dysmorphia, whereas forebrain growth is retarded during early embryonic development yet achieves normal size at birth. To understand the differential requirements of distinct brain regions for Cyclin A2, we utilized neuroanatomical, transgenic mouse, and mathematical modeling techniques to generate testable hypotheses that provide insight into how Cyclin A2 loss results in compensatory forebrain growth during late embryonic development. Using unbiased measurements of the forebrain stem cell niche, we parameterized a mathematical model whereby logistic growth instructs progenitor cells as to the cell-types of their progeny. Our data were consistent with prior findings that progenitors proliferate along an auto-inhibitory growth curve. The growth retardation in CCNA2-null brains corresponded to cell cycle lengthening, imposing a developmental delay. We hypothesized that Cyclin A2 regulates DNA repair and that CCNA2-null progenitors thus experienced a lengthened cell cycle. We demonstrate that CCNA2-null progenitors suffer abnormal DNA repair, and implicate Cyclin A2 in double-strand break repair. Cyclin A2's DNA repair functions are conserved among cell lines, neural progenitors, and hippocampal neurons. We further demonstrate that neuronal CCNA2 ablation results in learning and memory deficits in aged mice.
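The "auto-inhibitory growth curve" this abstract refers to is standard logistic growth, in which proliferation slows as the niche fills. A minimal sketch of that dynamic, with invented placeholder parameters (growth rate, carrying capacity, and step size are not values from the paper):

```python
def logistic_step(n, r=0.5, k=1000.0, dt=0.1):
    """One Euler step of dN/dt = r*N*(1 - N/K): growth is fast when the
    population N is small and self-inhibits as N approaches the carrying
    capacity K (the auto-inhibitory growth curve)."""
    return n + dt * r * n * (1.0 - n / k)

def simulate(n0=10.0, steps=400):
    """Integrate the logistic curve from an initial population n0."""
    n = n0
    history = [n]
    for _ in range(steps):
        n = logistic_step(n)
        history.append(n)
    return history

# The trajectory rises monotonically and saturates at the carrying capacity.
traj = simulate()
```

A lengthened cell cycle in this picture corresponds to a smaller effective r, which delays (but does not prevent) saturation, consistent with the compensatory late-embryonic growth the authors describe.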
2013-01-01
Background: There is extensive evidence for the interaction of metabolic enzymes with the eukaryotic cytoskeleton. The significance of these interactions is far from clear. Presentation of the hypothesis: In the cytoskeletal integrative sensor hypothesis presented here, the cytoskeleton senses and integrates the general metabolic activity of the cell. This activity depends on the binding to the cytoskeleton of enzymes and, depending on the nature of the enzyme, this binding may occur if the enzyme is either active or inactive but not both. This enzyme-binding is further proposed to stabilize microtubules and microfilaments and to alter rates of GTP and ATP hydrolysis and their levels. Testing the hypothesis: Evidence consistent with the cytoskeletal integrative sensor hypothesis is presented in the case of glycolysis. Several testable predictions are made. There should be a relationship between post-translational modifications of tubulin and of actin and their interaction with metabolic enzymes. Different conditions of cytoskeletal dynamics and enzyme-cytoskeleton binding should reveal significant differences in local and perhaps global levels and ratios of ATP and GTP. The different functions of moonlighting enzymes should depend on cytoskeletal binding. Implications of the hypothesis: The physical and chemical effects arising from metabolic sensing by the cytoskeleton would have major consequences on cell shape, dynamics and cell cycle progression. The hypothesis provides a framework that helps determine the significance of the enzyme-decorated cytoskeleton. PMID:23398642
Cunning, Ross; Muller, Erik B; Gates, Ruth D; Nisbet, Roger M
2017-10-27
Coral reef ecosystems owe their ecological success - and vulnerability to climate change - to the symbiotic metabolism of corals and Symbiodinium spp. The urgency to understand and predict the stability and breakdown of these symbioses (i.e., coral 'bleaching') demands the development and application of theoretical tools. Here, we develop a dynamic bioenergetic model of coral-Symbiodinium symbioses that demonstrates realistic steady-state patterns in coral growth and symbiont abundance across gradients of light, nutrients, and feeding. Furthermore, by including a mechanistic treatment of photo-oxidative stress, the model displays dynamics of bleaching and recovery that can be explained as transitions between alternate stable states. These dynamics reveal that "healthy" and "bleached" states correspond broadly to nitrogen- and carbon-limitation in the system, with transitions between them occurring as integrated responses to multiple environmental factors. Indeed, a suite of complex emergent behaviors reproduced by the model (e.g., bleaching is exacerbated by nutrients and attenuated by feeding) suggests it captures many important attributes of the system; meanwhile, its modular framework and open source R code are designed to facilitate further problem-specific development. We see significant potential for this modeling framework to generate testable hypotheses and predict integrated, mechanistic responses of corals to environmental change, with important implications for understanding the performance and maintenance of symbiotic systems. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
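The "alternate stable states" interpretation of bleaching and recovery can be illustrated with a deliberately simple toy system. This is not the authors' bioenergetic model (their model is published as open-source R code); the cubic dynamics below are a generic bistable example showing how one diagnoses such states: locate the equilibria of a one-dimensional system and classify their stability by the sign of the derivative.

```python
def equilibria(f, df, lo=-3.0, hi=3.0, n=6000):
    """Scan [lo, hi] for sign changes of f, refine each root by
    bisection, and classify stability: a fixed point is stable
    when df < 0 there."""
    roots = []
    step = (hi - lo) / n
    x = lo
    for _ in range(n):
        a, b = x, x + step
        if f(a) == 0.0 or f(a) * f(b) < 0.0:
            for _ in range(60):  # bisection refinement
                m = 0.5 * (a + b)
                if f(a) * f(m) <= 0.0:
                    b = m
                else:
                    a = m
            r = 0.5 * (a + b)
            roots.append((r, df(r) < 0.0))  # (root, is_stable)
        x += step
    return roots

# Toy bistable system dx/dt = x - x**3: two stable attractors (x = -1, +1)
# separated by an unstable threshold (x = 0), analogous to "healthy" and
# "bleached" states separated by a tipping point.
f = lambda x: x - x**3
df = lambda x: 1.0 - 3.0 * x * x
states = equilibria(f, df)
```

In the real model, environmental drivers (light, nutrients, feeding) shift the landscape so that one basin of attraction disappears, forcing the transition the abstract describes.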
Wynne-Edwards, K E
2001-01-01
Hormone disruption is a major, underappreciated component of the plant chemical arsenal, and the historical coevolution between hormone-disrupting plants and herbivores will have both increased the susceptibility of carnivores and diversified the sensitivities of herbivores to man-made endocrine disruptors. Here I review diverse evidence of the influence of plant secondary compounds on vertebrate reproduction, including human reproduction. Three of the testable hypotheses about the evolutionary responses of vertebrate herbivores to hormone-disrupting challenges from their diet are developed. Specifically, the hypotheses are that a) vertebrate herbivores will express steroid hormone receptors in the buccal cavity and/or the vomeronasal organ; b) absolute sex steroid concentrations will be lower in carnivores than in herbivores; and c) herbivore steroid receptors should be more diverse in their binding affinities than carnivore lineages. The argument developed in this review, if empirically validated by support for the specific hypotheses, suggests that a) carnivores will be more susceptible than herbivores to endocrine-disrupting compounds of anthropogenic origin entering their bodies, and b) diverse herbivore lineages will be variably susceptible to any given natural or synthetic contaminant. As screening methods for hormone-disrupting potential are compared and adopted, comparative endocrine physiology research is urgently needed to develop models that predict the broad applicability of those screening results in diverse vertebrate species. PMID:11401754
Sadeu, J C; Hughes, Claude L; Agarwal, Sanjay; Foster, Warren G
2010-08-01
Reproductive function and fertility are thought to be compromised by behaviors such as cigarette smoking, substance abuse, and alcohol consumption; however, the strength of these associations is uncertain. Furthermore, the reproductive system is thought to be under attack from exposure to environmental contaminants, particularly those chemicals shown to affect endocrine homeostasis. The relationship between exposure to environmental contaminants and adverse effects on human reproductive health are frequently debated in the scientific literature and these controversies have spread into the lay press drawing increased public and regulatory attention. Therefore, the objective of the present review was to critically evaluate the literature concerning the relationship between lifestyle exposures and adverse effects on fertility as well as examining the evidence for a role of environmental contaminants in the purported decline of semen quality and the pathophysiology of subfertility, polycystic ovarian syndrome, and endometriosis. The authors conclude that whereas cigarette smoking is strongly associated with adverse reproductive outcomes, high-level exposures to other lifestyle factors are only weakly linked with negative fertility impacts. Finally, there is no compelling evidence that environmental contaminants, at concentrations representative of the levels measured in contemporary biomonitoring studies, have any effect, positive or negative, on reproductive health in the general population. Further research using prospective study designs with robust sample sizes is needed to evaluate testable hypotheses that address the relationship between exposure and adverse reproductive health effects.
Toward a formalized account of attitudes: The Causal Attitude Network (CAN) model.
Dalege, Jonas; Borsboom, Denny; van Harreveld, Frenk; van den Berg, Helma; Conner, Mark; van der Maas, Han L J
2016-01-01
This article introduces the Causal Attitude Network (CAN) model, which conceptualizes attitudes as networks consisting of evaluative reactions and interactions between these reactions. Relevant evaluative reactions include beliefs, feelings, and behaviors toward the attitude object. Interactions between these reactions arise through direct causal influences (e.g., the belief that snakes are dangerous causes fear of snakes) and mechanisms that support evaluative consistency between related contents of evaluative reactions (e.g., people tend to align their belief that snakes are useful with their belief that snakes help maintain ecological balance). In the CAN model, the structure of attitude networks conforms to a small-world structure: evaluative reactions that are similar to each other form tight clusters, which are connected by a sparser set of "shortcuts" between them. We argue that the CAN model provides a realistic formalized measurement model of attitudes and therefore fills a crucial gap in the attitude literature. Furthermore, the CAN model provides testable predictions for the structure of attitudes and how they develop, remain stable, and change over time. Attitude strength is conceptualized in terms of the connectivity of attitude networks and we show that this provides a parsimonious account of the differences between strong and weak attitudes. We discuss the CAN model in relation to possible extensions, implications for the assessment of attitudes, and possibilities for further study. (c) 2015 APA, all rights reserved.
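The small-world structure the CAN model posits, tight local clusters plus a few long-range shortcuts, is conventionally built with a Watts-Strogatz-style construction: start from a ring lattice and rewire a small fraction of edges at random. A minimal sketch, where the network size, neighborhood width, and rewiring probability are arbitrary illustrative choices rather than values from the article:

```python
import random

def small_world(n=20, k=4, beta=0.1, seed=1):
    """Watts-Strogatz-style network: each of n nodes (evaluative
    reactions) links to its k nearest ring neighbors (tight clusters);
    each edge is rewired with probability beta to a random target,
    creating the sparse long-range 'shortcuts'."""
    rng = random.Random(seed)
    edges = set()
    for i in range(n):
        for j in range(1, k // 2 + 1):
            edges.add((i, (i + j) % n))  # ring-lattice edge
    rewired = set()
    for (a, b) in edges:
        if rng.random() < beta:
            c = rng.randrange(n)
            while c == a or (a, c) in rewired or (c, a) in rewired:
                c = rng.randrange(n)
            rewired.add((a, c))  # shortcut edge
        else:
            rewired.add((a, b))  # keep the local edge
    return rewired

net = small_world()
```

On such a network, attitude strength in the CAN sense would correspond to overall connectivity: densely connected networks settle into globally consistent (strong) evaluations, sparse ones do not.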
Inter-Universal Quantum Entanglement
NASA Astrophysics Data System (ADS)
Robles-Pérez, S. J.; González-Díaz, P. F.
2015-01-01
The boundary conditions to be imposed on the quantum state of the whole multiverse could be such that the universes would be created in entangled pairs. Then, interuniversal entanglement would provide us with a vacuum energy for each single universe that might be fitted with observational data, making testable not only the multiverse proposal but also the boundary conditions of the multiverse. Furthermore, the second law of the entanglement thermodynamics would enhance the expansion of the single universes.
Creation of a Mouse with Stress-Induced Dystonia: Control of an ATPase Chaperone
2013-04-01
…was successful, and a mouse with the desired dystonic symptoms was obtained. It has two mutations, one a dominantly inherited gene with 100… the hallmark of dystonia. Subject terms: dystonia, genetically modified mice, stress, gene mutations, animal model of disease. …there are a variety of hypotheses that should be testable if there were a realistic animal model. Mice with mutations in genes known to cause dystonia
Effectiveness of spacecraft testing programs
NASA Technical Reports Server (NTRS)
Krausz, A.
1980-01-01
The need for testing under simulated mission operational conditions is discussed and the results of such tests are reviewed from the point of view of the user. A brief overview of the usual test sequences for high reliability long life spacecraft is presented and the effectiveness of the testing program is analyzed in terms of the defects which are discovered by such tests. The need for automation, innovative mechanical test procedures, and design for testability is discussed.
Zee-Babu type model with U (1 )Lμ-Lτ gauge symmetry
NASA Astrophysics Data System (ADS)
Nomura, Takaaki; Okada, Hiroshi
2018-05-01
We extend the Zee-Babu model, introducing local U (1 )Lμ-Lτ symmetry with several singly charged bosons. We find a predictive neutrino mass texture in a simple hypothesis in which mixings among singly charged bosons are negligible. Also, lepton-flavor violations are less constrained compared with the original model. Then, we explore the testability of the model, focusing on doubly charged boson physics at the LHC and the International Linear Collider.
Xylella genomics and bacterial pathogenicity to plants.
Dow, J M; Daniels, M J
2000-12-01
Xylella fastidiosa, a pathogen of citrus, is the first plant pathogenic bacterium for which the complete genome sequence has been published. Inspection of the sequence reveals high relatedness to many genes of other pathogens, notably Xanthomonas campestris. Based on this, we suggest that Xylella possesses certain easily testable properties that contribute to pathogenicity. We also present some general considerations for deriving information on pathogenicity from bacterial genomics. Copyright 2000 John Wiley & Sons, Ltd.
An evolutionary scenario for the origin of flowers.
Frohlich, Michael W
2003-07-01
The Mostly Male theory is the first to use evidence from gene phylogenies, genetics, modern plant morphology and fossils to explain the evolutionary origin of flowers. It proposes that flower organization derives more from the male structures of ancestral gymnosperms than from female structures. The theory arose from a hypothesis-based study. Such studies are the most likely to generate testable evolutionary scenarios, which should be the ultimate goal of evo-devo.
A collider observable QCD axion
Dimopoulos, Savas; Hook, Anson; Huang, Junwu; ...
2016-11-09
Here, we present a model where the QCD axion is at the TeV scale and visible at a collider via its decays. Conformal dynamics and strong CP considerations account for the axion coupling strongly enough to the standard model to be produced as well as the coincidence between the weak scale and the axion mass. The model predicts additional pseudoscalar color octets whose properties are completely determined by the axion properties rendering the theory testable.
Soviet Economic Policy Towards Eastern Europe
1988-11-01
…high. Without specifying the determinants of Soviet demand for "allegiance" in more detail, the model is not testable; we cannot predict how subsidy… trade inside (Czechoslovakia, Bulgaria). These countries are behaving as predicted by the model. If this hypothesis is true, the pattern of subsidies… also compares the sum of per capita subsidies by country between 1970 and 1982 with the sum of subsidies predicted by the model. Because of the poor
All biology is computational biology.
Markowetz, Florian
2017-03-01
Here, I argue that computational thinking and techniques are so central to the quest of understanding life that today all biology is computational biology. Computational biology brings order into our understanding of life, it makes biological concepts rigorous and testable, and it provides a reference map that holds together individual insights. The next modern synthesis in biology will be driven by mathematical, statistical, and computational methods being absorbed into mainstream biological training, turning biology into a quantitative science.
2009-12-01
Direct and Indirect Approaches: Implications for Ending the Violence in Southern Thailand, by Chaiyo Rodthong, December 2009 thesis, Naval Postgraduate School, Monterey, CA. Approved for public release; distribution is unlimited. Abstract: The instability in the Southern Border Provinces of Thailand resurged on
Implications of the Budgeting Process on State-of-the-Art (SOA) Extensions.
1987-12-01
The SANE Research Project: Its Implications for Higher Education.
ERIC Educational Resources Information Center
Harrison, Andrew; Dugdale, Shirley
2003-01-01
Describes a 2-year research program called Sustainable Accommodation for the New Economy (SANE), which is exploring the implications of the distributed workplace. Its focus is on the creation of sustainable, collaborative workplaces for knowledge workers across Europe, encompassing both virtual and physical spaces. Discusses its implications for…
[The right to have health protection].
Mayer-Serra, Carlos Elizondo
2007-01-01
This article intends to analyze how the health good or the right to have health protection should be distributed, following the definition established by the Mexican Constitution. This legal definition will be contrasted with the way in which the expenditure on health is actually distributed and financed, and the implications of this. The distribution of this expenditure, in turn, will be compared to another social good of great relevance: education, which consumes an even larger proportion of the national fiscal resources. Finally, the article will suggest a possible explanation for this fact and provide some ideas regarding its implications.
Families of transposable elements, population structure and the origin of species.
Jurka, Jerzy; Bao, Weidong; Kojima, Kenji K
2011-09-19
Eukaryotic genomes harbor diverse families of repetitive DNA derived from transposable elements (TEs) that are able to replicate and insert into genomic DNA. The biological role of TEs remains unclear, although they have profound mutagenic impact on eukaryotic genomes and the origin of repetitive families often correlates with speciation events. We present a new hypothesis to explain the observed correlations based on classical concepts of population genetics. The main thesis presented in this paper is that the TE-derived repetitive families originate primarily by genetic drift in small populations derived mostly by subdivisions of large populations into subpopulations. We outline the potential impact of the emerging repetitive families on genetic diversification of different subpopulations, and discuss implications of such diversification for the origin of new species. Several testable predictions of the hypothesis are examined. First, we focus on the prediction that the number of diverse families of TEs fixed in a representative genome of a particular species positively correlates with the cumulative number of subpopulations (demes) in the historical metapopulation from which the species has emerged. Furthermore, we present evidence indicating that human AluYa5 and AluYb8 families might have originated in separate proto-human subpopulations. We also revisit prior evidence linking the origin of repetitive families to mammalian phylogeny and present additional evidence linking repetitive families to speciation based on mammalian taxonomy. Finally, we discuss evidence that mammalian orders represented by the largest numbers of species may be subject to relatively recent population subdivisions and speciation events. The hypothesis implies that subdivision of a population into small subpopulations is the major step in the origin of new families of TEs as well as of new species. 
The origin of new subpopulations is likely to be driven by the availability of new biological niches, consistent with the hypothesis of punctuated equilibria. The hypothesis also has implications for the ongoing debate on the role of genetic drift in genome evolution.
Short- and Long-Term Earthquake Forecasts Based on Statistical Models
NASA Astrophysics Data System (ADS)
Console, Rodolfo; Taroni, Matteo; Murru, Maura; Falcone, Giuseppe; Marzocchi, Warner
2017-04-01
The epidemic-type aftershock sequences (ETAS) models have been experimentally used to forecast the space-time earthquake occurrence rate during the sequence that followed the 2009 L'Aquila earthquake and for the 2012 Emilia earthquake sequence. These forecasts represented the first two pioneering attempts to check the feasibility of providing operational earthquake forecasting (OEF) in Italy. After the 2009 L'Aquila earthquake the Italian Department of Civil Protection nominated an International Commission on Earthquake Forecasting (ICEF) for the development of the first official OEF in Italy that was implemented for testing purposes by the newly established "Centro di Pericolosità Sismica" (CPS, the seismic Hazard Center) at the Istituto Nazionale di Geofisica e Vulcanologia (INGV). According to the ICEF guidelines, the system is open, transparent, reproducible and testable. The scientific information delivered by OEF-Italy is shaped in different formats according to the interested stakeholders, such as scientists, national and regional authorities, and the general public. The communication to people is certainly the most challenging issue, and careful pilot tests are necessary to check the effectiveness of the communication strategy, before opening the information to the public. With regard to long-term time-dependent earthquake forecasts, the application of a newly developed simulation algorithm to Calabria region provided typical features in time, space and magnitude behaviour of the seismicity, which can be compared with those of the real observations. These features include long-term pseudo-periodicity and clustering of strong earthquakes, and a realistic earthquake magnitude distribution departing from the Gutenberg-Richter distribution in the moderate and higher magnitude range.
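The ETAS models referenced here forecast a conditional earthquake rate as a constant background term plus an Omori-law aftershock contribution from every past event, scaled by its magnitude. A minimal sketch of that intensity function, with all parameter values as illustrative placeholders rather than calibrated Italian values:

```python
import math

def etas_rate(t, events, mu=0.2, K=0.05, alpha=1.5, c=0.01, p=1.1, m0=3.0):
    """ETAS conditional intensity lambda(t): background rate mu plus,
    for each past event (t_i, m_i), a productivity term that grows
    exponentially with magnitude and decays with elapsed time
    following the modified Omori law 1/(t - t_i + c)**p."""
    rate = mu
    for t_i, m_i in events:
        if t_i < t:
            rate += K * math.exp(alpha * (m_i - m0)) / (t - t_i + c) ** p
    return rate

# A magnitude-5 mainshock at t=0 sharply raises the forecast rate shortly
# afterwards; the excess over the background decays with elapsed time.
events = [(0.0, 5.0)]
early = etas_rate(0.1, events)
late = etas_rate(10.0, events)
```

Operational forecasting systems such as OEF-Italy fit mu, K, alpha, c, and p to regional catalogs and update the event list in near real time, so the forecast rate evolves as a sequence unfolds.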
Hydrologic characteristics of freshwater mussel habitat: novel insights from modeled flows
Drew, C. Ashton; Eddy, Michele; Kwak, Thomas J.; Cope, W. Gregory; Augspurger, Tom
2018-01-01
The ability to model freshwater stream habitat and species distributions is limited by the spatially sparse flow data available from long-term gauging stations. Flow data beyond the immediate vicinity of gauging stations would enhance our ability to explore and characterize hydrologic habitat suitability. The southeastern USA supports high aquatic biodiversity, but threats, such as landuse alteration, climate change, conflicting water-resource demands, and pollution, have led to the imperilment and legal protection of many species. The ability to distinguish suitable from unsuitable habitat conditions, including hydrologic suitability, is a key criterion for successful conservation and restoration of aquatic species. We used the example of the critically endangered Tar River Spinymussel (Parvaspina steinstansana) and associated species to demonstrate the value of modeled flow data (WaterFALL™) to generate novel insights into population structure and testable hypotheses regarding hydrologic suitability. With ordination models, we: 1) identified all catchments with potentially suitable hydrology, 2) identified 2 distinct hydrologic environments occupied by the Tar River Spinymussel, and 3) estimated greater hydrological habitat niche breadth of assumed surrogate species associates at the catchment scale. Our findings provide the first demonstrated application of complete, continuous, regional modeled hydrologic data to freshwater mussel distribution and management. This research highlights the utility of modeling and data-mining methods to facilitate further exploration and application of such modeled environmental conditions to inform aquatic species management. We conclude that such an approach can support landscape-scale management decisions that require spatial information at fine resolution (e.g., enhanced National Hydrology Dataset catchments) and broad extent (e.g., multiple river basins).
Origin of the Local Group satellite planes
NASA Astrophysics Data System (ADS)
Banik, Indranil; O'Ryan, David; Zhao, Hongsheng
2018-04-01
We attempt to understand the planes of satellite galaxies orbiting the Milky Way (MW) and M31 in the context of Modified Newtonian Dynamics (MOND), which implies a close MW-M31 flyby occurred ≈8 Gyr ago. Using the timing argument, we obtain MW-M31 trajectories consistent with cosmological initial conditions and present observations. We adjust the present M31 proper motion within its uncertainty in order to simulate a range of orbital geometries and closest approach distances. Treating the MW and M31 as point masses, we follow the trajectories of surrounding test particle disks, thereby mapping out the tidal debris distribution. Around each galaxy, the resulting tidal debris tends to cluster around a particular orbital pole. We find some models in which these preferred spin vectors align fairly well with those of the corresponding observed satellite planes. The radial distributions of material in the simulated satellite planes are similar to what we observe. Around the MW, our best-fitting model yields a significant fraction (0.22) of counter-rotating material, perhaps explaining why Sculptor counter-rotates within the MW satellite plane. In contrast, our model yields no counter-rotating material around M31. This is testable with proper motions of M31 satellites. In our best model, the MW disk is thickened by the flyby 7.65 Gyr ago to a root mean square height of 0.75 kpc. This is similar to the observed age and thickness of the Galactic thick disk. Thus, the MW thick disk may have formed together with the MW and M31 satellite planes during a past MW-M31 flyby.
Magnetic properties of Proxima Centauri b analogues
NASA Astrophysics Data System (ADS)
Zuluaga, Jorge I.; Bustamante, Sebastian
2018-03-01
The discovery of a planet around the closest star to our Sun, Proxima Centauri, represents a quantum leap in the testability of exoplanetary models. Unlike any other discovered exoplanet, models of Proxima b could be contrasted against near future telescopic observations and far future in-situ measurements. In this paper we aim at predicting the planetary radius and the magnetic properties (dynamo lifetime and magnetic dipole moment) of Proxima b analogues (solid planets with masses of ~1-3 M⊕, rotation periods of several days and habitable conditions). For this purpose we build a grid of planetary models with a wide range of compositions and masses. For each point in the grid we run the planetary evolution model developed in Zuluaga et al. (2013). Our model assumes small orbital eccentricity, negligible tidal heating and Earth-like radiogenic mantle element abundances. We devise a statistical methodology to estimate the posterior distribution of the desired planetary properties assuming simple prior distributions for the orbital inclination and bulk composition. Our model predicts that Proxima b would have a mass 1.3 ≤ Mp ≤ 2.3 M⊕ and a radius Rp = 1.4 (+0.3/-0.2) R⊕. In our simulations, most Proxima b analogues develop intrinsic dynamos that last for ≥4 Gyr (the estimated age of the host star). If alive, the dynamo of Proxima b has a dipole moment ℳdip in the range 0.32-2.9 ℳdip,⊕. These results are not restricted to Proxima b but they also apply to Earth-like planets having similar observed properties.
Origin of the Local Group satellite planes
NASA Astrophysics Data System (ADS)
Banik, Indranil; O'Ryan, David; Zhao, Hongsheng
2018-07-01
We attempt to understand the planes of satellite galaxies orbiting the Milky Way (MW) and M31 in the context of Modified Newtonian Dynamics, which implies a close MW-M31 flyby occurred ≈8 Gyr ago. Using the timing argument, we obtain MW-M31 trajectories consistent with cosmological initial conditions and present observations. We adjust the present M31 proper motion within its uncertainty in order to simulate a range of orbital geometries and closest approach distances. Treating the MW and M31 as point masses, we follow the trajectories of surrounding test particle discs, thereby mapping out the tidal debris distribution. Around each galaxy, the resulting tidal debris tends to cluster around a particular orbital pole. We find some models in which these preferred spin vectors align fairly well with those of the corresponding observed satellite planes. The radial distributions of material in the simulated satellite planes are similar to what we observe. Around the MW, our best-fitting model yields a significant fraction (0.22) of counter-rotating material, perhaps explaining why Sculptor counter-rotates within the MW satellite plane. In contrast, our model yields no counter-rotating material around M31. This is testable with proper motions of M31 satellites. In our best model, the MW disc is thickened by the flyby 7.65 Gyr ago to a root mean square height of 0.75 kpc. This is similar to the observed age and thickness of the Galactic thick disc. Thus, the MW thick disc may have formed together with the MW and M31 satellite planes during a past MW-M31 flyby.
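The timing argument invoked above can be sketched in its classic Newtonian form (the paper's actual calculation is MONDian; the inputs r ≈ 770 kpc, v_r ≈ −110 km/s and t0 = 13.8 Gyr are illustrative assumptions):

```python
import math

# Units: kpc, km/s, Msun; G in kpc (km/s)^2 / Msun; 1 kpc/(km/s) = 0.9778 Gyr.
G = 4.30091e-6
GYR_PER_KPC_KMS = 0.9778

def timing_argument_mass(r=770.0, v_r=-110.0, t0=13.8):
    """Classic timing argument: MW and M31 separate at the Big Bang and fall
    back on a radial Kepler orbit r = a(1 - cos x), t = sqrt(a^3/GM)(x - sin x).
    Given today's separation r (kpc), radial velocity v_r (km/s) and age t0
    (Gyr), solve for the total pair mass."""
    t = t0 / GYR_PER_KPC_KMS                 # age in kpc/(km/s)
    target = v_r * t / r                     # dimensionless; fixes the phase x
    f = lambda x: math.sin(x) * (x - math.sin(x)) / (1.0 - math.cos(x)) ** 2
    lo, hi = math.pi + 1e-6, 2.0 * math.pi - 1e-6  # approaching branch (v_r < 0)
    for _ in range(200):                     # bisection; f falls from ~0 to -inf
        mid = 0.5 * (lo + hi)
        if f(mid) > target:
            lo = mid
        else:
            hi = mid
    x = 0.5 * (lo + hi)
    a = r / (1.0 - math.cos(x))              # semi-major axis of the radial orbit
    gm = a ** 3 * (x - math.sin(x)) ** 2 / t ** 2
    return gm / G

M_tot = timing_argument_mass()
print(f"timing-argument MW+M31 mass: {M_tot:.2e} Msun")
```

With these inputs the solver recovers the familiar ~4 × 10¹² M☉ total mass for the pair.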
Informed maintenance for next generation space transportation systems
NASA Astrophysics Data System (ADS)
Fox, Jack J.
2001-02-01
Perhaps the most substantial single obstacle to the progress of space exploration and the utilization of space for human benefit is the safety and reliability, and the inherent cost, of launching to, and returning from, space. The primary driver of the high costs of current launch systems (the same is true for commercial and military aircraft and most other reusable systems) is the operations, maintenance and infrastructure portion of a program's total life cycle costs. Reusable Launch Vehicle (RLV) maintenance and design have traditionally been two separate engineering disciplines with often conflicting objectives: maximizing ease of maintenance versus optimizing performance, size and cost. Testability analysis, an element of Informed Maintenance (IM), has been an ad hoc, manual effort in which maintenance engineers attempt to identify an efficient method of troubleshooting for the given product, with little or no control over product design. Therefore, testability deficiencies in the design cannot be rectified. It is now widely recognized that IM must be engineered into the product at the design stage itself, so that an optimal compromise is achieved between system maintainability and performance. The elements of IM include testability analysis, diagnostics/prognostics, automated maintenance scheduling, automated logistics coordination, paperless documentation and data mining. IM derives its heritage from complementary NASA science, space and aeronautics enterprises, such as the on-board autonomous Remote Agent architecture recently flown on NASA's Deep Space 1 probe, as well as from commercial industries that employ quick-turnaround operations.
Commercial technologies and processes supporting NASA's IM initiatives include condition-based maintenance technologies from Boeing's commercial 777 aircraft and Lockheed Martin's F-22 fighter, automotive computer diagnostics and autonomous controllers that enable 100,000-mile maintenance-free operation, and locomotive monitoring system software. This paper summarizes NASA's long-term strategy, development and implementation plans for Informed Maintenance for next-generation RLVs, converging the work being performed throughout NASA, industry and academia into a single IM vision. Additionally, a current status of IM development throughout NASA programs such as the Space Shuttle, X-33, X-34 and X-37 is provided, concluding with an overview of near-term work being initiated in FY00 to support NASA's 2nd Generation Reusable Launch Vehicle Program.
Informed maintenance for next generation reusable launch systems
NASA Astrophysics Data System (ADS)
Fox, Jack J.; Gormley, Thomas J.
2001-03-01
Perhaps the most substantial single obstacle to the progress of space exploration and the utilization of space for human benefit is the safety and reliability, and the inherent cost, of launching to, and returning from, space. The primary driver of the high costs of current launch systems (the same is true for commercial and military aircraft and most other reusable systems) is the operations, maintenance and infrastructure portion of a program's total life cycle costs. Reusable Launch Vehicle (RLV) maintenance and design have traditionally been two separate engineering disciplines with often conflicting objectives: maximizing ease of maintenance versus optimizing performance, size and cost. Testability analysis, an element of Informed Maintenance (IM), has been an ad hoc, manual effort in which maintenance engineers attempt to identify an efficient method of troubleshooting for the given product, with little or no control over product design. Therefore, testability deficiencies in the design cannot be rectified. It is now widely recognized that IM must be engineered into the product at the design stage itself, so that an optimal compromise is achieved between system maintainability and performance. The elements of IM include testability analysis, diagnostics/prognostics, automated maintenance scheduling, automated logistics coordination, paperless documentation and data mining. IM derives its heritage from complementary NASA science, space and aeronautics enterprises, such as the on-board autonomous Remote Agent architecture recently flown on NASA's Deep Space 1 probe, as well as from commercial industries that employ quick-turnaround operations.
Commercial technologies and processes supporting NASA's IM initiatives include condition-based maintenance technologies from Boeing's commercial 777 aircraft and Lockheed Martin's F-22 fighter, automotive computer diagnostics and autonomous controllers that enable 100,000-mile maintenance-free operation, and locomotive monitoring system software. This paper summarizes NASA's long-term strategy, development and implementation plans for Informed Maintenance for next-generation RLVs, converging the work being performed throughout NASA, industry and academia into a single IM vision. Additionally, a current status of IM development throughout NASA programs such as the Space Shuttle, X-33, X-34 and X-37 is provided, concluding with an overview of near-term work being initiated in FY00 to support NASA's 2nd Generation Reusable Launch Vehicle Program.
NASA Astrophysics Data System (ADS)
Chakdar, Shreyashi
The Standard Model of particle physics is assumed to be a low-energy effective theory, with new physics theoretically motivated to lie around the TeV scale. This thesis presents theories of physics beyond the Standard Model at the TeV scale that are testable at colliders. Chapters 2, 3 and 5 present models incorporating different approaches to enlarging the Standard Model gauge group to a grand unified symmetry, each with its own unique collider signatures. The study of leptoquark gauge bosons in the TopSU(5) model in chapter 2 showed that their discovery mass range extends up to 1.5 TeV at the 14 TeV LHC with a luminosity of 100 fb⁻¹. In chapter 3 we studied the collider phenomenology of TeV-scale mirror fermions in the Left-Right Mirror model, finding that the reach for mirror quarks extends up to 750 GeV at the 14 TeV LHC with 300 fb⁻¹ of luminosity. In chapter 4 we enlarged the bosonic symmetry to a Fermi-Bose symmetry, i.e. supersymmetry, and showed that SUSY with non-universalities in gaugino or scalar masses within a high-scale SUGRA setup can still be accessible at the 14 TeV LHC. In chapter 5, we performed a study relevant to e⁺e⁻ colliders and found that precise measurements of Higgs boson mass splittings at the level of ∼100 MeV may be possible at high luminosity at the International Linear Collider (ILC). In chapter 6 we showed that the experimental data on neutrino masses and mixings are consistent with the proposed 4/5-parameter Dirac neutrino models, yielding a solution with an inverted mass hierarchy and a large CP-violating phase δ that can be tested experimentally. Chapter 7 incorporates a warm dark matter candidate in the context of a two-Higgs-doublet model; the model has several testable consequences at colliders, with the charged scalar and pseudoscalar lying in the few-hundred-GeV mass range.
In sum, this thesis is an endeavor to study beyond-Standard-Model physics at the TeV scale with testable signals at colliders.
Wagner, Peter J
2012-02-23
Rate distributions are important considerations when testing hypotheses about morphological evolution or phylogeny. They also have implications about general processes underlying character evolution. Molecular systematists often assume that rates are Poisson processes with gamma distributions. However, morphological change is the product of multiple probabilistic processes and should theoretically be affected by hierarchical integration of characters. Both factors predict lognormal rate distributions. Here, a simple inverse modelling approach assesses the best single-rate, gamma and lognormal models given observed character compatibility for 115 invertebrate groups. Tests reject the single-rate model for nearly all cases. Moreover, the lognormal outperforms the gamma for character change rates and (especially) state derivation rates. The latter in particular is consistent with integration affecting morphological character evolution.
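The model-selection step can be illustrated with a hedged toy version (not Wagner's compatibility-based inverse modelling): fit two closed-form-MLE models to synthetic per-character rates and compare AIC. The gamma model is omitted here because its MLE is iterative; the single-rate model is represented by an exponential.

```python
import numpy as np

rng = np.random.default_rng(0)
rates = rng.lognormal(mean=0.0, sigma=1.0, size=1000)  # synthetic per-branch rates

n = rates.size
logs = np.log(rates)

# Single-rate (exponential) model; closed-form MLE: lambda = 1 / mean.
lam = 1.0 / rates.mean()
ll_exp = n * np.log(lam) - lam * rates.sum()

# Lognormal model; closed-form MLE on the log-rates (ddof=0 std is the MLE).
mu, sigma = logs.mean(), logs.std()
ll_logn = (-logs.sum() - n * np.log(sigma)
           - 0.5 * n * np.log(2 * np.pi) - 0.5 * n)

# AIC = 2k - 2 ln L; exponential has k=1, lognormal k=2.
aic_exp, aic_logn = 2 * 1 - 2 * ll_exp, 2 * 2 - 2 * ll_logn
print(f"AIC exponential: {aic_exp:.1f}, AIC lognormal: {aic_logn:.1f}")
```

On lognormal-generated data the lognormal model wins by a wide AIC margin, mirroring the abstract's finding in favor of lognormal rate distributions.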
Resource depletion promotes automatic processing: implications for distribution of practice.
Scheel, Matthew H
2010-12-01
Recent models of cognition include two processing systems: an automatic system that relies on associative learning, intuition, and heuristics, and a controlled system that relies on deliberate consideration. Automatic processing requires fewer resources and is more likely when resources are depleted. This study showed that prolonged practice on a resource-depleting mental arithmetic task promoted automatic processing on a subsequent problem-solving task, as evidenced by faster responding and more errors. Distribution of practice effects (0, 60, 120, or 180 sec. between problems) on rigidity also disappeared when groups had equal time on resource-depleting tasks. These results suggest that distribution-of-practice effects are reducible to resource availability. The discussion includes implications for interpreting discrepancies in the traditional distribution of practice effect.
Sadeghi Ghuchani, Mostafa
2018-02-08
This comment argues against the view that cancer cells produce less entropy than normal cells as stated in a recent paper by Marín and Sabater. The basic principle of estimation of entropy production rate in a living cell is discussed, emphasizing the fact that entropy production depends on both the amount of heat exchange during the metabolism and the entropy difference between products and substrates.
Testing Nonassociative Quantum Mechanics.
Bojowald, Martin; Brahma, Suddhasattwa; Büyükçam, Umut
2015-11-27
The familiar concepts of state vectors and operators in quantum mechanics rely on associative products of observables. However, these notions do not apply to some exotic systems such as magnetic monopoles, which have long been known to lead to nonassociative algebras. Their quantum physics has remained obscure. This Letter presents the first derivation of potentially testable physical results in nonassociative quantum mechanics, based on effective potentials. They imply new effects which cannot be mimicked in usual quantum mechanics with standard magnetic fields.
Almost periodic cellular neural networks with neutral-type proportional delays
NASA Astrophysics Data System (ADS)
Xiao, Songlin
2018-03-01
This paper presents a new result on the existence, uniqueness and generalised exponential stability of almost periodic solutions for cellular neural networks with neutral-type proportional delays and D operator. Based on some novel differential inequality techniques, a testable condition is derived to ensure that all the state trajectories of the system converge to an almost periodic solution with a positive exponential convergence rate. The effectiveness of the obtained result is illustrated by a numerical example.
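A hedged numerical sketch of the kind of system studied above: a scalar pantograph-type unit with proportional delay, x'(t) = −a x(t) + b tanh(x(qt)) + I(t). The coefficients are illustrative, with a > |b| standing in for the paper's testable condition; two trajectories from different initial states should converge.

```python
import math

# Almost periodic input (two incommensurate frequencies); illustrative values.
a, b, q = 2.0, 0.5, 0.5
I = lambda t: 0.1 * math.sin(t) + 0.05 * math.sin(math.sqrt(2) * t)
dt, steps = 0.001, 20_000   # Euler integration to t = 20

def simulate(x0):
    xs = [x0]
    for n in range(steps):
        t = n * dt
        xd = xs[int(q * t / dt)]   # x(q t): q t <= t, so the grid value exists
        xs.append(xs[-1] + dt * (-a * xs[-1] + b * math.tanh(xd) + I(t)))
    return xs

xa, xb = simulate(3.0), simulate(-3.0)
gap = abs(xa[-1] - xb[-1])
print(f"trajectory gap at t=20: {gap:.2e}")
```

The identical input cancels in the difference, so the shrinking gap reflects the contraction induced by a > |b|, consistent with convergence of all trajectories to one almost periodic solution.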
Using Dynamic Sensitivity Analysis to Assess Testability
NASA Technical Reports Server (NTRS)
Voas, Jeffrey; Morell, Larry; Miller, Keith
1990-01-01
This paper discusses sensitivity analysis and its relationship to random black box testing. Sensitivity analysis estimates the impact that a programming fault at a particular location would have on the program's input/output behavior. Locations that are relatively "insensitive" to faults can render random black box testing unlikely to uncover programming faults. Therefore, sensitivity analysis gives new insight when interpreting random black box testing results. Although sensitivity analysis is computationally intensive, it requires no oracle and no human intervention.
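The core idea, estimating how visible a fault at a given location would be under random black-box testing, can be sketched by mutation: inject a fault at a location and measure the fraction of random inputs whose output changes. A toy illustration (not the paper's actual method or tooling):

```python
import random

def program(x, y):
    """Toy program under test."""
    s = x + y                  # location A: result flows directly to the output
    flag = (x % 97 == 0)       # location B: rarely influences the output
    return s + (1 if flag else 0)

def mutant_a(x, y):
    s = x - y                  # fault injected at location A
    flag = (x % 97 == 0)
    return s + (1 if flag else 0)

def mutant_b(x, y):
    s = x + y
    flag = (x % 97 == 1)       # fault injected at location B
    return s + (1 if flag else 0)

def sensitivity(mutant, trials=10_000, seed=1):
    """Fraction of random inputs on which the mutant's output differs --
    a Monte Carlo estimate of how visible a fault at that location is."""
    rng = random.Random(seed)
    diff = 0
    for _ in range(trials):
        x, y = rng.randrange(1000), rng.randrange(1000)
        diff += program(x, y) != mutant(x, y)
    return diff / trials

sa, sb = sensitivity(mutant_a), sensitivity(mutant_b)
print(f"sensitivity at A: {sa:.3f}, at B: {sb:.3f}")
```

Location B's low score shows why random black-box testing can miss faults there even after many trials, which is the interpretive point the abstract makes.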
Earthquake Forecasting System in Italy
NASA Astrophysics Data System (ADS)
Falcone, G.; Marzocchi, W.; Murru, M.; Taroni, M.; Faenza, L.
2017-12-01
In Italy, after the 2009 L'Aquila earthquake, a procedure was developed for gathering and disseminating authoritative information about the time dependence of seismic hazard to help communities prepare for a potentially destructive earthquake. The most striking time dependency of the earthquake occurrence process is time clustering, which is particularly pronounced in time windows of days and weeks. The Operational Earthquake Forecasting (OEF) system developed at the Seismic Hazard Center (Centro di Pericolosità Sismica, CPS) of the Istituto Nazionale di Geofisica e Vulcanologia (INGV) is the authoritative source of seismic hazard information for Italian Civil Protection. The philosophy of the system rests on a few basic concepts: transparency, reproducibility, and testability. In particular, the transparent, reproducible, and testable earthquake forecasting system developed at CPS is based on ensemble modeling and on a rigorous testing phase. This phase is carried out according to the guidance proposed by the Collaboratory for the Study of Earthquake Predictability (CSEP, an international infrastructure aimed at quantitatively evaluating earthquake prediction and forecast models through purely prospective and reproducible experiments). The OEF system uses the two most popular short-term models: the Epidemic-Type Aftershock Sequences (ETAS) and the Short-Term Earthquake Probabilities (STEP). Here, we report results from the OEF system's 24-hour earthquake forecasts during the main phases of the 2016-2017 sequence in the Central Apennines (Italy).
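Both ETAS and STEP build on the modified Omori law for aftershock rate decay. A hedged sketch of a 24-hour expected-count forecast under that law (the parameter values K, c, p are illustrative, not INGV's):

```python
def omori_expected_count(t_start, t_end, K=100.0, c=0.05, p=1.1):
    """Expected number of aftershocks in [t_start, t_end] (days after the
    mainshock) under the modified Omori law lambda(t) = K / (t + c)**p.
    Closed-form integral, valid for p != 1. Parameter values are illustrative."""
    F = lambda t: (t + c) ** (1.0 - p) / (1.0 - p)   # antiderivative of the rate
    return K * (F(t_end) - F(t_start))

# 24-hour forecast window starting one day after the mainshock, compared
# with the window starting ten days after: clustering decays quickly.
day1 = omori_expected_count(1.0, 2.0)
day10 = omori_expected_count(10.0, 11.0)
print(f"expected events, day 1-2: {day1:.1f}; day 10-11: {day10:.1f}")
```

The order-of-magnitude drop between the two windows is the time clustering that makes daily and weekly forecast windows informative.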
Gillison, Andrew N; Asner, Gregory P; Fernandes, Erick C M; Mafalacusser, Jacinto; Banze, Aurélio; Izidine, Samira; da Fonseca, Ambrósio R; Pacate, Hermenegildo
2016-07-15
Sustainable biodiversity and land management require a cost-effective means of forecasting landscape response to environmental change. Conventional species-based, regional biodiversity assessments are rarely adequate for policy planning and decision making. We show how new ground and remotely-sensed survey methods can be coordinated to help elucidate and predict relationships between biodiversity, land use and soil properties along complex biophysical gradients that typify many similar landscapes worldwide. In the lower Zambezi valley, Mozambique we used environmental, gradient-directed transects (gradsects) to sample vascular plant species, plant functional types, vegetation structure, soil properties and land-use characteristics. Soil fertility indices were derived using novel multidimensional scaling of soil properties. To facilitate spatial analysis, we applied a probabilistic remote sensing approach, analyzing Landsat 7 satellite imagery to map photosynthetically active and inactive vegetation and bare soil along each gradsect. Despite the relatively low sample number, we found highly significant correlations between single and combined sets of specific plant, soil and remotely sensed variables that permitted testable spatial projections of biodiversity and soil fertility across the regional land-use mosaic. This integrative and rapid approach provides a low-cost, high-return and readily transferable methodology that permits the ready identification of testable biodiversity indicators for adaptive management of biodiversity and potential agricultural productivity. Copyright © 2016 Elsevier Ltd. All rights reserved.
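The "multidimensional scaling of soil properties" step can be sketched with classical (Torgerson) MDS, a minimal illustration on hypothetical soil data rather than the authors' novel variant:

```python
import numpy as np

def classical_mds(D, k=2):
    """Classical (Torgerson) multidimensional scaling: embed points in k
    dimensions from a matrix of pairwise distances D."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    B = -0.5 * J @ (D ** 2) @ J                  # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(B)
    idx = np.argsort(vals)[::-1][:k]             # top-k eigenpairs
    return vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0.0))

# Hypothetical soil-property profiles (rows: sites; cols: pH, organic C, P, ...)
rng = np.random.default_rng(3)
soil = rng.normal(size=(30, 6))
D = np.linalg.norm(soil[:, None, :] - soil[None, :, :], axis=-1)

coords = classical_mds(D, k=2)
D2 = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
corr = np.corrcoef(D.ravel(), D2.ravel())[0, 1]
print(f"distance correlation between 6-D data and 2-D ordination: {corr:.2f}")
```

The 2-D ordination preserves much of the pairwise distance structure, which is what makes a low-dimensional "fertility index" of many correlated soil variables usable.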
A Handheld Open-Field Infant Keratometer (An American Ophthalmological Society Thesis)
Miller, Joseph M.
2010-01-01
Purpose: To design and evaluate a new infant keratometer that incorporates an unobstructed view of the infant with both eyes (open-field design). Methods: The design of the open-field infant keratometer is presented, and details of its construction are given. The design incorporates a single-ring keratoscope for measurement of corneal astigmatism over a 4-mm region of the cornea and includes a rectangular grid target concentric within the ring to allow for the study of higher-order aberrations of the eye. In order to calibrate the lens and imaging system, a novel telecentric test object was constructed and used. The system was bench calibrated against steel ball bearings of known dimensions and evaluated for accuracy while being used in handheld mode in a group of 16 adult cooperative subjects. It was then evaluated for testability in a group of 10 infants and toddlers. Results: Results indicate that while the device achieved the goal of creating an open-field instrument containing a single-ring keratoscope with a concentric grid array for the study of higher-order aberrations, additional work is required to establish better control of the vertex distance. Conclusion: The handheld open-field infant keratometer demonstrates testability suitable for the study of infant corneal astigmatism. Use of collimated light sources in future iterations of the design must be incorporated in order to achieve the accuracy required for clinical investigation. PMID:21212850
Cowden, Tracy L; Cummings, Greta G
2012-07-01
We describe a theoretical model of staff nurses' intentions to stay in their current positions. The global nursing shortage and high nursing turnover rate demand evidence-based retention strategies. Inconsistent study outcomes indicate a need for testable theoretical models of intent to stay that build on previously published models, are reflective of current empirical research and identify causal relationships between model concepts. Two systematic reviews of electronic databases of English language published articles between 1985-2011. This complex, testable model expands on previous models and includes nurses' affective and cognitive responses to work and their effects on nurses' intent to stay. The concepts of desire to stay, job satisfaction, joy at work, and moral distress are included in the model to capture the emotional response of nurses to their work environments. The influence of leadership is integrated within the model. A causal understanding of clinical nurses' intent to stay and the effects of leadership on the development of that intention will facilitate the development of effective retention strategies internationally. Testing theoretical models is necessary to confirm previous research outcomes and to identify plausible sequences of the development of behavioral intentions. Increased understanding of the causal influences on nurses' intent to stay should lead to strategies that may result in higher retention rates and numbers of nurses willing to work in the health sector. © 2012 Blackwell Publishing Ltd.
Determinants of Dentists' Geographic Distribution.
ERIC Educational Resources Information Center
Beazoglou, Tryfon J.; And Others
1992-01-01
A model for explaining the geographic distribution of dentists' practice locations is presented and applied to particular market areas in Connecticut. Results show geographic distribution is significantly related to a few key variables, including demography, disposable income, and housing prices. Implications for helping students make practice…
Duff, Melissa C.; Mutlu, Bilge; Byom, Lindsey; Turkstra, Lyn S.
2014-01-01
Considerable effort has been directed at understanding the nature of the communicative deficits observed in individuals with acquired brain injuries. Yet several theoretical, methodological, and clinical challenges remain. In this article, we examine distributed cognition as a framework for understanding interaction among communication partners, interaction of communication and cognition, and interaction with the environments and contexts of everyday language use. We review the basic principles of distributed cognition and the implications for applying this approach to the study of discourse in individuals with cognitive-communication disorders. We also review a range of protocols and findings from our research that highlight how the distributed cognition approach might offer a deeper understanding of communicative mechanisms and deficits in individuals with cognitive communication impairments. The advantages and implications of distributed cognition as a framework for studying discourse in adults with acquired brain injury are discussed. PMID:22362323
The Transition to Ultra-Low-Sulfur Diesel Fuel: Effects on Prices and Supply
2001-01-01
This report discusses the implications of the new regulations for vehicle fuel efficiency and examines the technology, production, distribution, and cost implications of supplying diesel fuel to meet the new standards.
Escobar, Luis E.; Qiao, Huijie; Phelps, Nicholas B. D.; Wagner, Carli K.; Larkin, Daniel J.
2016-01-01
Nitellopsis obtusa (starry stonewort) is a dioecious green alga native to Europe and Asia that has emerged as an aquatic invasive species in North America. Nitellopsis obtusa is rare across large portions of its native range, but has spread rapidly in northern-tier lakes in the United States, where it can interfere with recreation and may displace native species. Little is known about the invasion ecology of N. obtusa, making it difficult to forecast future expansion. Using ecological niche modeling we investigated environmental variables associated with invasion risk. We used species records, climate data, and remotely sensed environmental variables to characterize the species’ multidimensional distribution. We found that N. obtusa is exploiting novel ecological niche space in its introduced range, which may help explain its invasiveness. While the fundamental niche of N. obtusa may be stable, there appears to have been a shift in its realized niche associated with invasion in North America. Large portions of the United States are predicted to constitute highly suitable habitat for N. obtusa. Our results can inform early detection and rapid response efforts targeting N. obtusa and provide testable estimates of the physiological tolerances of this species as a baseline for future empirical research. PMID:27363541
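One simple form of environmental niche scoring in the spirit of the abstract's approach is a Mahalanobis-distance suitability measure over environmental variables. A hedged sketch (the occurrence data are synthetic and the variables and values illustrative, not the study's):

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical occurrence records in a 3-D environmental space
# (e.g. mean temperature, conductivity, depth) -- synthetic, for illustration.
occurrences = rng.normal(loc=[12.0, 300.0, 4.0],
                         scale=[2.0, 40.0, 1.0], size=(200, 3))

mu = occurrences.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(occurrences, rowvar=False))

def niche_distance(site):
    """Mahalanobis distance from the centroid of the observed environmental
    niche: lower distance = environmentally more similar to known occurrences."""
    d = site - mu
    return float(np.sqrt(d @ cov_inv @ d))

core = niche_distance(np.array([12.0, 300.0, 4.0]))    # near the niche centroid
far = niche_distance(np.array([25.0, 600.0, 10.0]))    # well outside it
print(f"core-site distance: {core:.2f}, distant-site distance: {far:.2f}")
```

Ranking candidate sites by such a score is one way such models yield testable, mappable estimates of habitat suitability for early detection efforts.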
Pinter-Wollman, Noa; Keiser, Carl N.; Wollman, Roy; Pruitt, Jonathan N.
2017-01-01
Collective behavior emerges from interactions among group members who often vary in their behavior. The presence of just one or a few keystone individuals, such as leaders or tutors, may have a large effect on collective outcomes. These individuals can catalyze behavioral changes in other group members, thus altering group composition and collective behavior. The influence of keystone individuals on group function may lead to trade-offs between ecological situations, because the behavioral composition they facilitate may be suitable in one situation but not another. We use computer simulations to examine various mechanisms that allow keystone individuals to exert their influence on group members. We further discuss a trade-off between two potentially conflicting collective outcomes, cooperative prey attack and disease dynamics. Our simulations match empirical data from a social spider system and produce testable predictions for the causes and consequences of the influence of keystone individuals on group composition and collective outcomes. We find that a group’s behavioral composition can be impacted by the keystone individual through changes to interaction patterns or behavioral persistence over time. Group behavioral composition and the mechanisms that drive the distribution of phenotypes influence collective outcomes and lead to trade-offs between disease dynamics and cooperative prey attack. PMID:27420788
Finke, Kathrin; Schwarzkopf, Wolfgang; Müller, Ulrich; Frodl, Thomas; Müller, Hermann J; Schneider, Werner X; Engel, Rolf R; Riedel, Michael; Möller, Hans-Jürgen; Hennig-Fast, Kristina
2011-11-01
Attention deficit hyperactivity disorder (ADHD) persists frequently into adulthood. The decomposition of endophenotypes by means of experimental neuro-cognitive assessment has the potential to improve diagnostic assessment, evaluation of treatment response, and disentanglement of genetic and environmental influences. We assessed four parameters of attentional capacity and selectivity derived from simple psychophysical tasks (verbal report of briefly presented letter displays) and based on a "theory of visual attention." These parameters are mathematically independent, quantitative measures, and previous studies have shown that they are highly sensitive for subtle attention deficits. Potential reductions of attentional capacity, that is, of perceptual processing speed and working memory storage capacity, were assessed with a whole report paradigm. Furthermore, possible pathologies of attentional selectivity, that is, selection of task-relevant information and bias in the spatial distribution of attention, were measured with a partial report paradigm. A group of 30 unmedicated adult ADHD patients and a group of 30 demographically matched healthy controls were tested. ADHD patients showed significant reductions of working memory storage capacity of a moderate to large effect size. Perceptual processing speed, task-based, and spatial selection were unaffected. The results imply a working memory deficit as an important source of behavioral impairments. The theory of visual attention parameter working memory storage capacity might constitute a quantifiable and testable endophenotype of ADHD.
NASA Astrophysics Data System (ADS)
Bunn, Henry T.; Pickering, Travis Rayne
2010-11-01
The world's first archaeological traces from 2.6 million years ago (Ma) at Gona, in Ethiopia, include sharp-edged cutting tools and cut-marked animal bones, which indicate consumption of skeletal muscle by early hominin butchers. From that point, evidence of hominin meat-eating becomes increasingly more common throughout the Pleistocene archaeological record. Thus, the substantive debate about hominin meat-eating now centers on mode(s) of carcass resource acquisition. Two prominent hypotheses suggest, alternatively, (1) that early Homo hunted ungulate prey by running them to physiological failure and then dispatching them, or (2) that early Homo was relegated to passively scavenging carcass residues abandoned by carnivore predators. Various paleontologically testable predictions can be formulated for both hypotheses. Here we test four predictions concerning age-frequency distributions for bovids that contributed carcass remains to the 1.8-Ma-old FLK 22 Zinjanthropus (FLK Zinj, Olduvai Gorge, Tanzania) fauna, which zooarchaeological and taphonomic data indicate was formed predominantly by early Homo. In all but one case, the bovid mortality data from FLK Zinj violate test predictions of the endurance running-hunting and passive scavenging hypotheses. When combined with other taphonomic data, these results falsify both hypotheses, and lead to the hypothesis that early Homo operated successfully as an ambush predator.
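Testing predicted age-frequency distributions against observed bovid age classes is, in essence, a goodness-of-fit exercise. A hedged sketch with hypothetical counts and profiles (the numbers are illustrative, not the FLK Zinj data):

```python
import numpy as np

def chi_square_gof(observed, expected_probs):
    """Pearson chi-square goodness-of-fit statistic for age-class counts
    against a predicted age-frequency profile."""
    observed = np.asarray(observed, dtype=float)
    expected = observed.sum() * np.asarray(expected_probs, dtype=float)
    return float(((observed - expected) ** 2 / expected).sum())

# Hypothetical bovid age-class counts (juvenile, prime adult, old adult)
# with the prime-dominated structure ambush hunting would tend to produce.
observed = [8, 30, 6]

ambush = [0.20, 0.65, 0.15]        # prime-dominated ("living structure") profile
attritional = [0.45, 0.20, 0.35]   # juvenile/old-dominated scavenging profile

chi_ambush = chi_square_gof(observed, ambush)
chi_attr = chi_square_gof(observed, attritional)
print(f"chi-square vs ambush profile: {chi_ambush:.1f}; "
      f"vs attritional profile: {chi_attr:.1f}")
```

A prime-dominated sample fits the ambush-predation profile and strongly rejects the attritional one, which is the form the abstract's mortality-profile tests take.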
On the IceCube spectral anomaly
NASA Astrophysics Data System (ADS)
Palladino, Andrea; Spurio, Maurizio; Vissani, Francesco
2016-12-01
Recently it was noted that different IceCube datasets are not consistent with the same power-law spectrum of the cosmic neutrinos: this is the IceCube spectral anomaly, which suggests that IceCube observes a multicomponent spectrum. In this work, the main possibilities to enhance the description in terms of a single extragalactic neutrino component are examined. The hypothesis of a sizable contribution of Galactic high-energy neutrino events distributed as E^-2.7 [Astrophys. J. 826 (2016) 185] is critically analyzed and its natural generalization is considered. The stability of the expectations is studied by introducing free parameters, motivated by theoretical considerations and observational facts. The upgraded model examined here has 1) a Galactic component with a different normalization and shape, E^-2.4; 2) an extragalactic neutrino spectrum based on new data; 3) a non-zero prompt component of atmospheric neutrinos. The two key predictions of the model concern the `high-energy starting events' collected from the Southern sky. The Galactic component produces a softer spectrum and a testable angular anisotropy. A second, radically different class of models, where the second component is instead isotropic, plausibly extragalactic and with a relatively soft spectrum, is instead disfavored by existing observations of muon neutrinos from the Northern sky and below a few 100 TeV.
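A multicomponent power-law spectrum of the kind described above can be sketched in a few lines. The normalizations here are arbitrary placeholders, not the paper's fitted values; only the spectral indices (2.4 Galactic, 2.7 prompt atmospheric, a harder extragalactic index) follow the text.

```python
# Sketch of a multicomponent neutrino flux: Galactic E^-2.4, a harder
# extragalactic power law, and a soft prompt atmospheric component.
# Normalizations are hypothetical, chosen only to illustrate the crossover.

def power_law(E, norm, gamma):
    """Flux (arbitrary units) of one power-law component at energy E (TeV)."""
    return norm * E ** (-gamma)

def total_flux(E):
    galactic = power_law(E, norm=1.0, gamma=2.4)       # Galactic, E^-2.4
    extragalactic = power_law(E, norm=0.5, gamma=2.0)  # harder extragalactic
    prompt = power_law(E, norm=0.2, gamma=2.7)         # prompt atmospheric
    return galactic + extragalactic + prompt

# The softer Galactic component loses ground to the harder extragalactic one
# as energy rises, which is why it mainly shapes the low-energy data.
low = power_law(10.0, 1.0, 2.4) / power_law(10.0, 0.5, 2.0)
high = power_law(1e4, 1.0, 2.4) / power_law(1e4, 0.5, 2.0)
assert low > high
```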
Jourdan-da Silva, Nathalie; Fabre, Laetitia; Robinson, Eve; Fournet, Nelly; Nisavanh, Athinna; Bruyand, Mathias; Mailles, Alexandra; Serre, Estelle; Ravel, Magali; Guibert, Véronique; Issenhuth-Jeanjean, Sylvie; Renaudat, Charlotte; Tourdjman, Mathieu; Septfons, Alexandra; de Valk, Henriette; Le Hello, Simon
2018-01-01
On 1 December 2017, an outbreak of Salmonella Agona infections among infants was identified in France. To date, 37 cases (median age: 4 months) and two further international cases have been confirmed. Five different infant milk products manufactured at one facility were implicated. On 2 and 10 December, the company recalled the implicated products; on 22 December, all products processed at the facility since February 2017. Trace-forward investigations indicated product distribution to 66 countries.
NASA Astrophysics Data System (ADS)
Totani, Tomonori
2017-10-01
In standard general relativity the universe cannot be started with arbitrary initial conditions, because four of the ten components of Einstein's field equations (EFE) are constraints on initial conditions. In previous work it was proposed to extend the gravity theory to allow free initial conditions, with a motivation to solve the cosmological constant problem. This was done by setting four constraints on metric variations in the action principle, which is reasonable because gravity's physical degrees of freedom are at most six. However, there are two problems with this theory: the three constraints in addition to the unimodular condition were introduced without clear physical meanings, and the flat Minkowski spacetime is unstable against perturbations. Here a new set of gravitational field equations is derived by replacing the three constraints with new ones requiring that geodesic paths remain geodesic against metric variations. The instability problem is then naturally solved. Implications for the cosmological constant Λ are unchanged; the theory converges to the EFE with nonzero Λ through inflation, but Λ varies on scales much larger than the present Hubble horizon. Galaxies then form only in small-Λ regions, and the cosmological constant problem is solved by the anthropic argument. Because of the increased degrees of freedom in metric dynamics, the theory predicts new non-oscillatory modes of metric anisotropy generated by quantum fluctuations during inflation, and CMB B-mode polarization would be observed to differ from the standard predictions of general relativity.
Contributions of paleorheumatology to understanding contemporary disease.
Rothschild, B
2002-01-01
As paleopathology has evolved from observational speculation to analysis of testable hypotheses, so too has recognition of its contribution to vertebrate paleontology. In the presence of significant structural and density variation (between matrix and osseous structures), x-rays provide an additional perspective on osseous response to stress and disease. As film techniques are expensive in time and cost, fluoroscopy has proven a valuable alternative. Radiologic techniques also allow non-invasive "sectioning" of specimens, illustrating significant internal detail. The object can be "split" on a plane and the two portions rotated to "open" the image. This three-dimensional approach can now be applied to other forms of sequential data to facilitate their three-dimensional representation, graphically or with solid representations. Antigen and microstructure may be well preserved in fossils. Molecular preservation, with retention of helical structure and sensitivity to collagenase, has been demonstrated in 10,000-year-old collagen. Antigen has been extracted from 100-million-year-old bone and documented, in situ, in 11,000-year-old bone. If the appropriate site in the tissue is assessed, if antigen is still present, and if the appropriate antiserum is utilized, fixation of the antibody to the specimen can be detected. Minute amounts of DNA can be amplified and analyzed. Recovery of DNA from a 40,000-year-old mammoth, a 17,000-year-old bison and 25-million-year-old insects provides the opportunity for cloning and independent assessment of relationships. The implications of available technology focus direction for the development of collaborative approaches.
Meiotic drive influences the outcome of sexually antagonistic selection at a linked locus.
Patten, M M
2014-11-01
Most meiotic drivers, such as the t-haplotype in Mus and the segregation distorter (SD) in Drosophila, act in a sex-specific manner, gaining a transmission advantage through one sex while suffering only the fitness costs associated with the driver in the other. Their inheritance is thus more likely through one of the two sexes, a property they share with sexually antagonistic alleles. Previous theory has shown that pairs of linked loci segregating for sexually antagonistic alleles are more likely to remain polymorphic and that linkage disequilibrium accrues between them. I probe this similarity between drive and sexual antagonism and examine the evolution of chromosomes experiencing these selection pressures simultaneously. Reminiscent of previous theory, I find that: the opportunity for polymorphism increases for a sexually antagonistic locus that is physically linked to a driving locus; the opportunity for polymorphism at a driving locus also increases when linked to a sexually antagonistic locus; and stable linkage disequilibrium accompanies any polymorphic equilibrium. Additionally, I find that drive at a linked locus favours the fixation of sexually antagonistic alleles that benefit the sex in which drive occurs. Further, I show that under certain conditions reduced recombination between these two loci is selectively favoured. These theoretical results provide clear, testable predictions about the nature of sexually antagonistic variation on driving chromosomes and have implications for the evolution of genomic architecture. © 2014 European Society for Evolutionary Biology.
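The basic drive-with-cost dynamic underlying this kind of model can be illustrated with a deterministic one-locus recursion: a driver transmitted at rate k > 0.5 through heterozygotes of one sex, paying a recessive fitness cost s. This is a deliberately simplified caricature, not the paper's two-locus model, and the parameter values are hypothetical (a t-haplotype-like case with lethal homozygotes).

```python
# One-locus caricature of sex-specific meiotic drive with a recessive cost.
# Drive at rate k acts in males only; females have fair meiosis; driver
# homozygotes have fitness 1 - s. Parameters are hypothetical.

def next_freq(p, k=0.9, s=1.0):
    """One generation of the deterministic allele-frequency recursion."""
    w = 1.0 - s * p * p                                   # mean fitness
    female = (p * p * (1 - s) + p * (1 - p)) / w          # fair meiosis
    male = (p * p * (1 - s) + 2 * p * (1 - p) * k) / w    # drive at rate k
    return 0.5 * (female + male)                          # pooled gametes

p = 0.01
for _ in range(2000):
    p = next_freq(p)

# Drive plus a recessive cost maintains a stable polymorphism; with s = 1
# (lethal homozygotes) the equilibrium frequency works out to (2k - 1)/2.
assert abs(p - 0.4) < 1e-6
```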
A transcriptional serenAID: the role of noncoding RNAs in class switch recombination
Yewdell, William T.; Chaudhuri, Jayanta
2017-01-01
Abstract During an immune response, activated B cells may undergo class switch recombination (CSR), a molecular rearrangement that allows B cells to switch from expressing IgM and IgD to a secondary antibody heavy chain isotype such as IgG, IgA or IgE. Secondary antibody isotypes provide the adaptive immune system with distinct effector functions to optimally combat various pathogens. CSR occurs between repetitive DNA elements within the immunoglobulin heavy chain (Igh) locus, termed switch (S) regions and requires the DNA-modifying enzyme activation-induced cytidine deaminase (AID). AID-mediated DNA deamination within S regions initiates the formation of DNA double-strand breaks, which serve as biochemical beacons for downstream DNA repair pathways that coordinate the ligation of DNA breaks. Myriad factors contribute to optimal AID targeting; however, many of these factors also localize to genomic regions outside of the Igh locus. Thus, a current challenge is to explain the specific targeting of AID to the Igh locus. Recent studies have implicated noncoding RNAs in CSR, suggesting a provocative mechanism that incorporates Igh-specific factors to enable precise AID targeting. Here, we chronologically recount the rich history of noncoding RNAs functioning in CSR to provide a comprehensive context for recent and future discoveries. We present a model for the RNA-guided targeting of AID that attempts to integrate historical and recent findings, and highlight potential caveats. Lastly, we discuss testable hypotheses ripe for current experimentation, and explore promising ideas for future investigations. PMID:28535205
The affirmation of self: a new perspective on the immune system.
Stewart, John; Coutinho, Antonio
2004-01-01
The fundamental concepts of autopoiesis, which emphasize the circular organization underlying both living organisms and cognition, have been criticized on the grounds that, since they are conceived as a tight logical chain of definitions and implications, it is often not clear whether they are indeed a scientific theory or rather just a potential scientific vocabulary of doubtful utility to working scientists. This article presents the deployment of the concepts of autopoiesis in the field of immunology, a discipline where working biologists themselves have long had spontaneous recourse to "cognitive" metaphors: "recognition"; a "repertoire" of recognized molecular shapes; "learning" and "memory"; and, most striking of all, a "self versus non-self" distinction. It is shown that in immunology, the concepts of autopoiesis can be employed to generate clear novel hypotheses, models demonstrating these ideas, testable predictions, and novel therapeutic procedures. Epistemologically, it is shown that the self-non-self distinction, while quite real, is misleadingly named. When a real mechanism for generating this distinction is identified, it appears that the actual operational distinction is between (a) a sufficiently numerous set of initial antigens, present from the start of ontogeny, in conditions that allow for their participation in the construction of the system's organization and operation, and (b) single antigens that are first presented to the system after two successive phases of maturation. To call this a self-non-self distinction obscures the issue by presupposing what ought to be the job of scientific investigation to explain.
Lewis, Cara C; Klasnja, Predrag; Powell, Byron J; Lyon, Aaron R; Tuzzio, Leah; Jones, Salene; Walsh-Bailey, Callie; Weiner, Bryan
2018-01-01
The science of implementation has offered little toward understanding how different implementation strategies work. To improve outcomes of implementation efforts, the field needs precise, testable theories that describe the causal pathways through which implementation strategies function. In this perspective piece, we describe a four-step approach to developing causal pathway models for implementation strategies. First, it is important to ensure that implementation strategies are appropriately specified. Some strategies in published compilations are well defined but may not be specified in terms of their core components that can have a reliable and measurable impact. Second, linkages between strategies and mechanisms need to be generated. Existing compilations do not offer mechanisms by which strategies act, or the processes or events through which an implementation strategy operates to affect desired implementation outcomes. Third, it is critical to identify the proximal and distal outcomes the strategy is theorized to impact, with the former being direct, measurable products of the strategy and the latter being one of eight implementation outcomes (1). Finally, articulating effect modifiers, like preconditions and moderators, allows for an understanding of where, when, and why strategies have an effect on outcomes of interest. We argue for greater precision in the use of terms for factors implicated in implementation processes; development of guidelines for selecting research designs and study plans that account for practical constructs and allow for the study of mechanisms; psychometrically strong and pragmatic measures of mechanisms; and more robust curation of evidence for knowledge transfer and use.
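The four steps above amount to a small schema: a strategy, the mechanism it acts through, its proximal and distal outcomes, and the effect modifiers around them. A minimal sketch of that schema, with field names and the example values invented here for illustration (they are not a published taxonomy):

```python
# Illustrative data structure for a four-step causal pathway model.
# Field names and the example entry are this sketch's assumptions.
from dataclasses import dataclass, field
from typing import List

@dataclass
class CausalPathway:
    strategy: str            # step 1: the specified implementation strategy
    mechanism: str           # step 2: process through which the strategy acts
    proximal_outcome: str    # step 3: direct, measurable product of the strategy
    distal_outcome: str      # step 3: downstream implementation outcome
    preconditions: List[str] = field(default_factory=list)  # step 4: modifiers
    moderators: List[str] = field(default_factory=list)

pathway = CausalPathway(
    strategy="audit and feedback",
    mechanism="increased awareness of performance gaps",
    proximal_outcome="clinician knowledge of own adherence rate",
    distal_outcome="fidelity",
    preconditions=["credible data source"],
    moderators=["leadership support"],
)
assert pathway.distal_outcome == "fidelity"
```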
van Anders, Sari M
2015-07-01
Sexual orientation typically describes people's sexual attractions or desires based on their sex relative to that of a target. Despite its utility, it has been critiqued in part because it fails to account for non-biological gender-related factors, partnered sexualities unrelated to gender or sex, or potential divergences between love and lust. In this article, I propose Sexual Configurations Theory (SCT) as a testable, empirically grounded framework for understanding diverse partnered sexualities, separate from solitary sexualities. I focus on and provide models of two parameters of partnered sexuality--gender/sex and partner number. SCT also delineates individual gender/sex. I discuss a sexual diversity lens as a way to study the particularities and generalities of diverse sexualities without privileging either. I also discuss how sexual identities, orientations, and statuses that are typically seen as misaligned or aligned are more meaningfully conceptualized as branched or co-incident. I map out some existing identities using SCT and detail its applied implications for health and counseling work. I highlight its importance for sexuality in terms of measurement and social neuroendocrinology, and the ways it may be useful for self-knowledge and feminist and queer empowerment and alliance building. I also make a case that SCT changes existing understandings and conceptualizations of sexuality in constructive and generative ways informed by both biology and culture, and that it is a potential starting point for sexual diversity studies and research.
Peripheral Chemoreceptors: Function and Plasticity of the Carotid Body
Kumar, Prem; Prabhakar, Nanduri R.
2014-01-01
The discovery of the sensory nature of the carotid body dates back to the beginning of the 20th century. Following these seminal discoveries, research into carotid body mechanisms moved forward progressively through the 20th century, with many descriptions of the ultrastructure of the organ and stimulus-response measurements at the level of the whole organ. The later part of the 20th century witnessed the first descriptions of the cellular responses and electrophysiology of isolated and cultured type I and type II cells, and there now exist a number of testable hypotheses of chemotransduction. The goal of this article is to provide a comprehensive review of current concepts on sensory transduction and transmission of the hypoxic stimulus at the carotid body, with an emphasis on integrating cellular mechanisms with the whole-organ responses and highlighting the gaps or discrepancies in our knowledge. It is increasingly evident that in addition to hypoxia, the carotid body responds to a wide variety of blood-borne stimuli, including reduced glucose and immune-related cytokines, and we therefore also consider the evidence for a polymodal function of the carotid body and its implications. It is clear that the sensory function of the carotid body exhibits considerable plasticity in response to the chronic perturbations in environmental O2 that are associated with many physiological and pathological conditions. The mechanisms and consequences of carotid body plasticity in health and disease are discussed in the final sections of this article. PMID:23728973
The hyper-sentient addict: an exteroception model of addiction
DeWitt, Samuel J.; Ketcherside, Ariel; McQueeny, Tim M.; Dunlop, Joseph P.; Filbey, Francesca M.
2016-01-01
Background Exteroception involves processes related to the perception of environmental stimuli important for an organism's ability to adapt to its environment. As such, exteroception plays a critical role in conditioned response. In addiction, behavioral and neuroimaging studies show that the conditioned response to drug-related cues is often associated with alterations in brain regions including the precuneus/posterior cingulate cortex, an important node within the default mode network dedicated to processes such as self-monitoring. Objective This review aimed to summarize the growing, but largely fragmented, literature that supports a central role of exteroceptive processes in addiction. Methods We performed a systematic review of empirical research via PubMed and Google Scholar with keywords including ‘addiction’, ‘exteroception’, ‘precuneus’, and ‘self-awareness’, to identify human behavioral and neuroimaging studies that report mechanisms of self-awareness in healthy populations, and altered self-awareness processes, specifically exteroception, in addicted populations. Results Results demonstrate that exteroceptive processes play a critical role in conditioned cue response in addiction and serve as targets for interventions such as mindfulness training. Further, a hub of the default mode network, namely, the precuneus, is (i) consistently implicated in exteroceptive processes, and (ii) widely demonstrated to have increased activation and connectivity in addicted populations. Conclusion Heightened exteroceptive processes may underlie cue-elicited craving, which in turn may lead to the maintenance and worsening of substance use disorders. An exteroception model of addiction provides a testable framework from which novel targets for interventions can be identified. PMID:26154169
The hyper-sentient addict: an exteroception model of addiction.
DeWitt, Samuel J; Ketcherside, Ariel; McQueeny, Tim M; Dunlop, Joseph P; Filbey, Francesca M
2015-01-01
Exteroception involves processes related to the perception of environmental stimuli important for an organism's ability to adapt to its environment. As such, exteroception plays a critical role in conditioned response. In addiction, behavioral and neuroimaging studies show that the conditioned response to drug-related cues is often associated with alterations in brain regions including the precuneus/posterior cingulate cortex, an important node within the default mode network dedicated to processes such as self-monitoring. This review aimed to summarize the growing, but largely fragmented, literature that supports a central role of exteroceptive processes in addiction. We performed a systematic review of empirical research via PubMed and Google Scholar with keywords including 'addiction', 'exteroception', 'precuneus', and 'self-awareness', to identify human behavioral and neuroimaging studies that report mechanisms of self-awareness in healthy populations, and altered self-awareness processes, specifically exteroception, in addicted populations. Results demonstrate that exteroceptive processes play a critical role in conditioned cue response in addiction and serve as targets for interventions such as mindfulness training. Further, a hub of the default mode network, namely, the precuneus, is (i) consistently implicated in exteroceptive processes, and (ii) widely demonstrated to have increased activation and connectivity in addicted populations. Heightened exteroceptive processes may underlie cue-elicited craving, which in turn may lead to the maintenance and worsening of substance use disorders. An exteroception model of addiction provides a testable framework from which novel targets for interventions can be identified.
Oberman, Lindsay M; Ramachandran, Vilayanur S
2008-01-01
Autism is a complex disorder, characterized by social, cognitive, communicative, and motor symptoms. One suggestion to explain this spectrum of symptoms, proposed in the current study, is an underlying impairment in multisensory integration (MSI) systems such as a mirror neuron-like system. The mirror neuron system, thought to play a critical role in skills such as imitation, empathy, and language, can be regarded as a multisensory system, converting sensory stimuli into motor representations. Consistent with this, we report preliminary evidence for deficits in a task thought to tap into MSI--the "bouba-kiki task"--in children with ASD. The bouba-kiki effect is produced when subjects are asked to pair nonsense shapes with nonsense "words". We found that neurotypical children chose the nonsense "word" whose phonemic structure corresponded with the visual shape of the stimuli 88% of the time. This is presumably because of mirror neuron-like multisensory systems that integrate the visual shape with the corresponding motor gestures used to pronounce the nonsense word. Surprisingly, individuals with ASD chose the corresponding name only 56% of the time. The poor performance by the ASD group on this task suggests a deficit in MSI, perhaps related to impaired MSI brain systems. Though this is a behavioral study, it provides a testable hypothesis for the communication impairments in children with ASD that implicates a specific neural system and fits well with current findings suggesting an impairment in the mirror systems of individuals with ASD.
Social Identities as Pathways into and out of Addiction.
Dingle, Genevieve A; Cruwys, Tegan; Frings, Daniel
2015-01-01
There exists a predominant identity loss and "redemption" narrative in the addiction literature describing how individuals move from a "substance user" identity to a "recovery" identity. However, other identity related pathways influencing onset, treatment seeking and recovery may exist, and the process through which social identities unrelated to substance use change over time is not well understood. This study was designed to provide a richer understanding of such social identities processes. Semi-structured interviews were conducted with 21 adults residing in a drug and alcohol therapeutic community (TC) and thematic analysis revealed two distinct identity-related pathways leading into and out of addiction. Some individuals experienced a loss of valued identities during addiction onset that were later renewed during recovery (consistent with the existing redemption narrative). However, a distinct identity gain pathway emerged for socially isolated individuals, who described the onset of their addiction in terms of a new valued social identity. Almost all participants described their TC experience in terms of belonging to a recovery community. Participants on the identity loss pathway aimed to renew their pre-addiction identities after treatment while those on the identity gain pathway aimed to build aspirational new identities involving study, work, or family roles. These findings help to explain how social factors are implicated in the course of addiction, and may act as either motivations for or barriers to recovery. The qualitative analysis yielded a testable model for future research in other samples and settings.
Gene-centric approach to integrating environmental genomics and biogeochemical models.
Reed, Daniel C; Algar, Christopher K; Huber, Julie A; Dick, Gregory J
2014-02-04
Rapid advances in molecular microbial ecology have yielded an unprecedented amount of data about the evolutionary relationships and functional traits of microbial communities that regulate global geochemical cycles. Biogeochemical models, however, are trailing in the wake of the environmental genomics revolution, and such models rarely incorporate explicit representations of bacteria and archaea, nor are they compatible with nucleic acid or protein sequence data. Here, we present a functional gene-based framework for describing microbial communities in biogeochemical models by incorporating genomics data to provide predictions that are readily testable. To demonstrate the approach in practice, nitrogen cycling in the Arabian Sea oxygen minimum zone (OMZ) was modeled to examine key questions about cryptic sulfur cycling and dinitrogen production pathways in OMZs. Simulations support previous assertions that denitrification dominates over anammox in the central Arabian Sea, which has important implications for the loss of fixed nitrogen from the oceans. Furthermore, cryptic sulfur cycling was shown to attenuate the secondary nitrite maximum often observed in OMZs owing to changes in the composition of the chemolithoautotrophic community and dominant metabolic pathways. Results underscore the need to explicitly integrate microbes into biogeochemical models rather than just the metabolisms they mediate. By directly linking geochemical dynamics to the genetic composition of microbial communities, the method provides a framework for achieving mechanistic insights into patterns and biogeochemical consequences of marine microbes. Such an approach is critical for informing our understanding of the key role microbes play in modulating Earth's biogeochemistry.
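A functional gene-based rate law of the general kind described above can be sketched simply: a pathway's volumetric rate scales with the abundance of its marker gene, modulated by Michaelis-Menten substrate limitation. All parameter values below are hypothetical placeholders, not the Arabian Sea model's calibration.

```python
# Toy gene-centric rate law: volumetric pathway rate proportional to
# functional-gene abundance times a Michaelis-Menten substrate term.
# Parameter values are hypothetical, for illustration only.

def pathway_rate(genes_per_l, v_max, substrate, k_m):
    """Rate = per-gene maximum rate * gene abundance * substrate limitation."""
    return v_max * genes_per_l * substrate / (k_m + substrate)

# e.g. a denitrification step limited by nitrate (arbitrary units)
r_high = pathway_rate(genes_per_l=1e9, v_max=1e-12, substrate=20.0, k_m=4.0)
r_low = pathway_rate(genes_per_l=1e9, v_max=1e-12, substrate=1.0, k_m=4.0)
assert r_high > r_low  # the rate falls as the substrate is drawn down
```

Coupling several such rate laws to substrate and gene-abundance ODEs is what lets the geochemical dynamics feed back on community composition, which is the crux of the gene-centric approach.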
Savin, Cristina; Dayan, Peter; Lengyel, Máté
2014-01-01
A venerable history of classical work on autoassociative memory has significantly shaped our understanding of several features of the hippocampus, and most prominently of its CA3 area, in relation to memory storage and retrieval. However, existing theories of hippocampal memory processing ignore a key biological constraint affecting memory storage in neural circuits: the bounded dynamical range of synapses. Recent treatments based on the notion of metaplasticity provide a powerful model for individual bounded synapses; however, their implications for the ability of the hippocampus to retrieve memories well and the dynamics of neurons associated with that retrieval are both unknown. Here, we develop a theoretical framework for memory storage and recall with bounded synapses. We formulate the recall of a previously stored pattern from a noisy recall cue and limited-capacity (and therefore lossy) synapses as a probabilistic inference problem, and derive neural dynamics that implement approximate inference algorithms to solve this problem efficiently. In particular, for binary synapses with metaplastic states, we demonstrate for the first time that memories can be efficiently read out with biologically plausible network dynamics that are completely constrained by the synaptic plasticity rule, and the statistics of the stored patterns and of the recall cue. Our theory organises into a coherent framework a wide range of existing data about the regulation of excitability, feedback inhibition, and network oscillations in area CA3, and makes novel and directly testable predictions that can guide future experiments. PMID:24586137
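Storage and recall with bounded synapses can be illustrated with a clipped-Hebbian network: weights saturate at ±1 (binary synapses) and a noisy cue is cleaned up by iterated dynamics. This is only a caricature of the bounded-synapse setting, not the paper's probabilistic-inference readout, and all sizes and noise levels are arbitrary choices.

```python
import numpy as np

# Caricature of recall with bounded synapses: Hebbian weights clipped to +/-1,
# recall by iterated sign dynamics from a corrupted cue. Illustrative only;
# the paper derives inference-based dynamics rather than these.

rng = np.random.default_rng(0)
N, P = 200, 5                                   # neurons, stored patterns
patterns = rng.choice([-1, 1], size=(P, N))

W = np.sign(patterns.T @ patterns).astype(float)  # clip the Hebbian sum to +/-1
np.fill_diagonal(W, 0)

cue = patterns[0].copy()
flip = rng.random(N) < 0.1                      # corrupt 10% of the cue bits
cue[flip] *= -1

x = cue
for _ in range(10):                             # simple synchronous updates
    x = np.sign(W @ x)

overlap = (x * patterns[0]).mean()              # 1.0 means perfect recall
assert overlap > 0.9
```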
Exoplanet Curriculum at the International Space University
NASA Astrophysics Data System (ADS)
Burke, J. D.; Hill, H. G. M.
2012-04-01
Rapidly-expanding knowledge of exoplanets is providing a huge opportunity for education at all levels. In addition to the intrinsic scientific interest of finding other planetary systems and developing testable hypotheses about stellar evolution, based for the first time in history on more than one example, there is the prospect of finding habitats for other life. Even if actual life signatures cannot yet be unambiguously detected, just a credible possibility is enough to catalyze new discussions and stimulate new ideas emerging from the rich background of science fiction and the ancient concept of a plurality of inhabited worlds. At the International Space University, a graduate-level institution devoted to identifying, informing and encouraging young professionals from throughout the world, this exploding new field of science provides a grand opportunity for seminars and other activities engaging students in creative thinking about the vast human implications of a populated cosmos. Once a planet's existence and orbit are confirmed by long-continued observations, it may be a suitable object for spectrometry and other techniques to begin finding characteristics of its interior, atmosphere, magnetosphere, possibly even oceans. These observations require not only very advanced instrumentation and data methods but also patience and skill in operations both on Earth and in space. They can serve as an organizing principle for education across all of the specialties represented at ISU. In this paper we discuss the ISU curriculum, focusing on those parts of it that can benefit from the interdisciplinary expansion enabled by exoplanet discoveries.
The Rome Laboratory Reliability Engineer’s Toolkit
1993-04-01
"Testability Programs for Electronic Systems and Equipment"; DODD 5000.1, "Defense Acquisition"; DODI 5000.2, "Defense Acquisition Management Policies and ... these paths have an equivalent failure rate of zero so that only the remaining serial elements need to be translated. 5. The requirement process ... One standby off-line unit with n active on-line units required for success. Off-line spare assumed to have a failure rate of zero.
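The standby case quoted in the snippet (n active units required for success, one off-line spare with zero off-line failure rate) has a simple closed form if we also assume a perfect, instantaneous switch-over, which is this sketch's added assumption: the system survives two failure stages, each at rate n*lambda, so the MTBF doubles.

```python
# Sketch of one cold-standby spare behind n required active units, each with
# constant failure rate lam. Assumes a perfect instantaneous switch and zero
# off-line spare failure rate (the latter as stated in the snippet above).

def no_spare_mtbf(n, lam):
    """Series system of n active units: the first failure ends the mission."""
    return 1.0 / (n * lam)

def standby_mtbf(n, lam):
    """Survive the first failure (rate n*lam), swap in the cold spare,
    then fail on the next failure (rate n*lam again): two exponential stages."""
    return 2.0 / (n * lam)

lam = 100e-6   # per-unit failure rate: 100 failures per million hours
assert abs(standby_mtbf(4, lam) - 2 * no_spare_mtbf(4, lam)) < 1e-6
```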
The dynamics of hurricane balls
NASA Astrophysics Data System (ADS)
Andersen, W. L.; Werner, Steven
2015-09-01
We examine the theory of the hurricane balls toy. This toy consists of two steel balls welded together, which are sent spinning on a horizontal surface somewhat like a top. Unlike a top, at high frequency the symmetry axis approaches a limiting inclination that is not perpendicular to the surface. We calculate (and experimentally verify) the limiting inclinations for three toy geometries. We find that at high frequencies, hurricane balls provide an easily realized and testable example of the Poinsot theory of freely rotating symmetrical bodies.
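The high-frequency limit invoked above is the torque-free symmetric body of Poinsot's description. For such a body, with transverse moments of inertia $I_1 = I_2$ and symmetry-axis moment $I_3$, the standard Euler equations read:

```latex
I_1\,\dot{\omega}_1 = (I_1 - I_3)\,\omega_2\,\omega_3, \qquad
I_1\,\dot{\omega}_2 = (I_3 - I_1)\,\omega_3\,\omega_1, \qquad
I_3\,\dot{\omega}_3 = 0,
```

so $\omega_3$ is constant and the transverse angular velocity precesses in the body frame at the rate $\Omega_b = \omega_3\,(I_3 - I_1)/I_1$, the textbook result the toy's high-spin behavior approximates.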
The Mars Science Laboratory Entry, Descent, and Landing Flight Software
NASA Technical Reports Server (NTRS)
Gostelow, Kim P.
2013-01-01
This paper describes the design, development, and testing of the EDL program from the perspective of the software engineer. We briefly cover the overall MSL flight software organization, and then the organization of EDL itself. We discuss the timeline, the structure of the GNC code (but not the algorithms as they are covered elsewhere in this conference) and the command and telemetry interfaces. Finally, we cover testing and the influence that testability had on the EDL flight software design.
Brain Organization and Psychodynamics
Peled, Avi; Geva, Amir B.
1999-01-01
Any attempt to link brain neural activity and psychodynamic concepts requires a tremendous conceptual leap. Such a leap may be facilitated if a common language between brain and mind can be devised. System theory proposes formulations that may aid in reconceptualizing psychodynamic descriptions in terms of neural organizations in the brain. Once adopted, these formulations can help to generate testable predictions about brain–psychodynamic relations and thus significantly affect the future of psychotherapy. (The Journal of Psychotherapy Practice and Research 1999; 8:24–39) PMID:9888105
A Quantitative Geochemical Target for Modeling the Formation of the Earth and Moon
NASA Technical Reports Server (NTRS)
Boyce, Jeremy W.; Barnes, Jessica J.; McCubbin, Francis M.
2017-01-01
The past decade has been one of geochemical, isotopic, and computational advances that are bringing the laboratory measurement and computational modeling neighborhoods of the Earth-Moon community into ever closer proximity. We are now, however, in a position to become even better neighbors: modelers can generate testable hypotheses for geochemists, and geochemists can provide quantitative targets for modelers. Here we present a robust example of the latter based on Cl isotope measurements of mare basalts.
Gardner, Andy
2017-10-06
A central feature of Darwin's theory of natural selection is that it explains the purpose of biological adaptation. Here, I: emphasize the scientific importance of understanding what adaptations are for, in terms of facilitating the derivation of empirically testable predictions; discuss the population genetical basis for Darwin's theory of the purpose of adaptation, with reference to Fisher's 'fundamental theorem of natural selection'; and show that a deeper understanding of the purpose of adaptation is achieved in the context of social evolution, with reference to inclusive fitness and superorganisms. PMID:28839927
The Role of Metaphysical Naturalism in Science
NASA Astrophysics Data System (ADS)
Mahner, Martin
2012-10-01
This paper defends the view that metaphysical naturalism is a constitutive ontological principle of science in that the general empirical methods of science, such as observation, measurement and experiment, and thus the very production of empirical evidence, presuppose a no-supernature principle. It examines the consequences of metaphysical naturalism for the testability of supernatural claims, and it argues that explanations involving supernatural entities are pseudo-explanatory due to the many semantic and ontological problems of supernatural concepts. The paper also addresses the controversy about metaphysical versus methodological naturalism.
Dayside auroral arcs and convection
NASA Technical Reports Server (NTRS)
Reiff, P. H.; Burch, J. L.; Heelis, R. A.
1978-01-01
Recent Defense Meteorological Satellite Program and International Satellite for Ionospheric Studies dayside auroral observations show two striking features: a lack of visible auroral arcs near noon and occasional fan shaped arcs radiating away from noon on both the morning and afternoon sides of the auroral oval. A simple model which includes these two features is developed by reference to the dayside convection pattern of Heelis et al. (1976). The model may be testable in the near future with simultaneous convection, current and auroral light data.
Causal Reasoning on Biological Networks: Interpreting Transcriptional Changes
NASA Astrophysics Data System (ADS)
Chindelevitch, Leonid; Ziemek, Daniel; Enayetallah, Ahmed; Randhawa, Ranjit; Sidders, Ben; Brockel, Christoph; Huang, Enoch
Over the past decade, gene expression data sets have been generated at an increasing pace. In addition to ever-increasing data generation, the biomedical literature is growing exponentially. The PubMed database (Sayers et al., 2010) comprises more than 20 million citations as of October 2010. The goal of our method is the prediction of putative upstream regulators of observed expression changes based on a set of over 400,000 causal relationships. The resulting putative regulators constitute directly testable hypotheses for follow-up.
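The scoring idea behind such causal reasoning can be sketched with a toy signed network. The regulator names, edge signs, and observed changes below are invented for illustration; the method's actual knowledge base contains over 400,000 curated relationships and a more sophisticated statistic:

```python
import math

# Toy causal graph: regulator -> {downstream gene: signed effect}.
# Signs and gene names are hypothetical, chosen only to illustrate scoring.
causal_edges = {
    "TP53": {"CDKN1A": +1, "MDM2": +1, "BCL2": -1},
    "MYC":  {"CDKN1A": -1, "ODC1": +1, "BCL2": -1},
}
observed = {"CDKN1A": +1, "MDM2": +1, "BCL2": -1, "ODC1": -1}  # measured changes

def score(regulator, direction):
    """Count correct minus incorrect sign predictions for the hypothesis
    'regulator is up (+1) or down (-1)' against observed expression changes."""
    s = 0
    for gene, effect in causal_edges[regulator].items():
        if gene in observed:
            s += 1 if direction * effect == observed[gene] else -1
    return s

# Enumerate all regulator/direction hypotheses and keep the best-supported one.
best = max(((r, d) for r in causal_edges for d in (+1, -1)),
           key=lambda h: score(*h))
print(best, score(*best))
```

Here the hypothesis "TP53 up" explains all three of its observed downstream changes, so it emerges as the putative upstream regulator.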
Technology advances and market forces: Their impact on high performance architectures
NASA Technical Reports Server (NTRS)
Best, D. R.
1978-01-01
Reasonable projections into future supercomputer architectures and technology require an analysis of the computer industry market environment, the current capabilities and trends within the component industry, and the research activities on computer architecture in the industrial and academic communities. Management, programmer, architect, and user must cooperate to increase the efficiency of supercomputer development efforts. Care must be taken to match the funding, compiler, architecture and application with greater attention to testability, maintainability, reliability, and usability than supercomputer development programs of the past.
Harrison, Luke; Loui, Psyche
2014-01-01
Music has a unique power to elicit moments of intense emotional and psychophysiological response. These moments – termed “chills,” “thrills”, “frissons,” etc. – are subjects of introspection and philosophical debate, as well as scientific study in music perception and cognition. The present article integrates the existing multidisciplinary literature in an attempt to define a comprehensive, testable, and ecologically valid model of transcendent psychophysiological moments in music. PMID:25101043
The Labor Market and the Second Economy in the Soviet Union
1991-01-01
model. WHO WORKS "ON THE LEFT"? (The non-second economy income (V) is in turn composed of official first economy income, pilferage from the first...demands. In other words, the model assumes that the family "pools" all unearned income regardless of source. This is one of the few testable assumptions...of the neoclassical model. In the labor supply model in this paper, we have assumed that all first economy income, for both husband and wife, is
Scaling properties of multitension domain wall networks
NASA Astrophysics Data System (ADS)
Oliveira, M. F.; Martins, C. J. A. P.
2015-02-01
We study the asymptotic scaling properties of domain wall networks with three different tensions in various cosmological epochs. We discuss the conditions under which a scale-invariant evolution of the network (which is well established for simpler walls) still applies and also consider the limiting case where defects are locally planar and the curvature is concentrated in the junctions. We present detailed quantitative predictions for scaling densities in various contexts, which should be testable by means of future high-resolution numerical simulations.
White-nose syndrome initiates a cascade of physiologic disturbances in the hibernating bat host
Verant, Michelle L.; Meteyer, Carol U.; Speakman, John R.; Cryan, Paul M.; Lorch, Jeffrey M.; Blehert, David S.
2014-01-01
Integrating these novel findings on the physiological changes that occur in early-stage WNS with those previously documented in late-stage infections, we propose a multi-stage disease progression model that mechanistically describes the pathologic and physiologic effects underlying mortality of WNS in hibernating bats. This model identifies testable hypotheses for better understanding this disease, knowledge that will be critical for defining effective disease mitigation strategies aimed at reducing morbidity and mortality that results from WNS.
Beyond Critical Exponents in Neuronal Avalanches
NASA Astrophysics Data System (ADS)
Friedman, Nir; Butler, Tom; Deville, Robert; Beggs, John; Dahmen, Karin
2011-03-01
Neurons form a complex network in the brain, where they interact with one another by firing electrical signals. Neurons firing can trigger other neurons to fire, potentially causing avalanches of activity in the network. In many cases these avalanches have been found to be scale independent, similar to critical phenomena in diverse systems such as magnets and earthquakes. We discuss models for neuronal activity that allow for the extraction of testable, statistical predictions. We compare these models to experimental results, and go beyond critical exponents.
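One standard, testable statistical prediction of such models is a power-law avalanche-size distribution. A minimal sketch of exponent estimation by maximum likelihood on synthetic data; the exponent 1.5 is the mean-field crackling-noise value, used here purely as an example, and this estimator is the generic continuous-power-law MLE, not necessarily the authors' procedure:

```python
import math
import random

def powerlaw_mle(sizes, xmin=1.0):
    """Continuous power-law exponent estimate (Hill/Clauset-style MLE):
    alpha = 1 + n / sum(ln(x / xmin)) over all x >= xmin."""
    tail = [x for x in sizes if x >= xmin]
    return 1.0 + len(tail) / sum(math.log(x / xmin) for x in tail)

# Synthetic avalanche sizes drawn from P(x) ~ x^(-1.5), x >= 1, via inverse
# transform sampling: x = (1 - u)^(-1 / (alpha - 1)).
random.seed(0)
alpha_true = 1.5
sizes = [(1 - random.random()) ** (-1.0 / (alpha_true - 1.0)) for _ in range(20000)]

print(round(powerlaw_mle(sizes), 2))  # typically close to 1.5 at this sample size
```

Comparing such fitted exponents (and their error bars) between model and experiment is one way the "testable, statistical predictions" mentioned above are checked in practice.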
A not-so-big crisis: re-reading Silurian conodont diversity in a sequence-stratigraphic framework
NASA Astrophysics Data System (ADS)
Jarochowska, Emilia; Munnecke, Axel
2016-04-01
Conodonts are extensively used in Ordovician through Triassic biostratigraphy and fossil-based geochemistry. However, their distribution in rock successions is commonly taken at face value, without taking into account their diverse and poorly understood ecology. Multielement taxonomy, ontogenetic and environmental variability, difficulties in extraction, and relative rarity all contribute to the general lack of quantitative studies on conodont stratigraphic distribution and temporal turnover. With respect to Silurian conodonts, the concept of recurrent conodont extinction events - the so-called Ireviken, Mulde and Lau events - has become a standard in the stratigraphic literature. The concept has been proposed based on qualitative observations of local extirpations of open-marine pelagic or nekto-benthic taxa and temporary dominance of shallow-water species in the Silurian succession of the Swedish island of Gotland. These changes coincided with positive carbon isotope excursions, abrupt facies shifts, "blooms" of benthic fauna, and changes in reef communities, which have all been combined into a general view of Silurian bio-geochemical events. This view posits a deterministic, reproducible pattern in Silurian conodont diversity, attributed to recurrent ecological or geochemical conditions. The growing body of sequence-stratigraphic interpretations across these events in Gotland and other sections worldwide indicates that in all cases the Silurian "events" are associated with rapid global regressions. This suggests that faunal changes such as the dominance of shallow-water, low-diversity conodont fauna and the increase of benthic invertebrate diversity and abundance represent predictable consequences of the variation in the completeness of the rock record and preservation potential of different environments.
Our studies in Poland and Ukraine indicate that the magnitude of change in the taxonomic composition of conodont assemblages across the middle Silurian global regression and the hypothesized Mulde Event is proportional to the associated facies shift. Quantitative data on the facies distribution of individual conodont species, combined with sequence-stratigraphic architecture, provide a testable model for the impact of sea-level changes on perceived conodont diversity in a section or basin. This approach highlights the need for quantitative data on conodont distribution in their environmental context, for their integration into conodont-based stratigraphy and geochemistry, and for the regular application of Occam's razor to interpretations of paleobiodiversity.
Hidden in the Neutrons: Physical Evidence for Lunar True Polar Wander
NASA Astrophysics Data System (ADS)
Keane, J. T.; Siegler, M. A.; Miller, R. S.; Laneuville, M.; Paige, D. A.; Matsuyama, I.; Lawrence, D. J.; Crotts, A.; Poston, M.
2015-12-01
Airless bodies like the Moon are time capsules of planetary and solar system evolution. Lunar polar ices, in particular, record a history of volatile delivery, orbital dynamics, and solar system chemistry. However, despite two decades of orbital geochemistry measurements, the observed abundances and spatial distribution of lunar polar volatiles (likely water ice, as inferred by epithermal neutron deficits) remain unexplained. The observed deposits do not correlate with measured surface temperatures or thermal models of ice stability and are notably asymmetric about the lunar poles, with the peak abundance offset from the present-day pole by 5°. Here we show, for the first time, that polar volatile deposits at the North and South pole are antipodal, displaced equally from each pole along opposite longitudes. These off-polar volatiles likely represent fossilized cold traps, formed when the Moon had a different spin pole. Reorientation of the Moon from this paleopole to the present pole (i.e. true polar wander) altered the locations of cold traps and resulted in the asymmetric, but antipodal, polar hydrogen distribution. Since true polar wander results from changes in the distribution of mass within a planet, the direction and magnitude of this wander can be used to constrain the evolution of the lunar interior. We find a causal link between this paleopole and the unique thermal evolution of the nearside Procellarum KREEP Terrane (PKT). Radiogenic heating within this province not only resulted in major mare volcanism, but also altered the Moon's moments of inertia. We use a combination of analytical and numerical 3-D thermochemical convection models to show that the evolution of the PKT naturally produces the correct direction and magnitude of polar wander (albeit early in lunar history, when the PKT was most active).
This work provides a self-consistent explanation for the spatial distribution of lunar polar volatiles and opens a deeper connection to the evolution of the lunar interior. Our hypothesis will be readily testable with forthcoming lunar missions, including high-resolution orbital geochemistry instruments, in-situ and returned sample analysis, and geophysical networks.
Robotics with Natural Language Comprehension and Learning Abilities.
1985-01-01
Artificial intelligence implications for knowledge retrieval..."through understanding and generalizing plans", "An approach to learning from observation", and "Artificial intelligence implications for knowledge
Henderson, Peter A; Magurran, Anne E
2010-05-22
Species abundance distributions (SADs) are widely used as a tool for summarizing ecological communities but may have different shapes, depending on the currency used to measure species importance. We develop a simple plotting method that links SADs in the alternative currencies of numerical abundance and biomass and is underpinned by testable predictions about how organisms occupy physical space. When log numerical abundance is plotted against log biomass, the species lie within an approximately triangular region. Simple energetic and sampling constraints explain the triangular form. The dispersion of species within this triangle is the key to understanding why SADs of numerical abundance and biomass can differ. Given regular or random species dispersion, we can predict the shape of the SAD for both currencies under a variety of sampling regimes. We argue that this dispersion pattern will lie between regular and random for the following reasons. First, regular dispersion patterns will result if communities are composed of groups of organisms that use different components of the physical space (e.g. open water, the sea bed surface or rock crevices in a marine fish assemblage), and if the abundance of species in each of these spatial guilds is linked to the way individuals of varying size use the habitat. Second, temporal variation in abundance and sampling error will tend to randomize this regular pattern. Data from two intensively studied marine ecosystems offer empirical support for these predictions. Our approach also has application in environmental monitoring and the recognition of anthropogenic disturbance, which may change the shape of the triangular region by, for example, the loss of large body size top predators that occur at low abundance.
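The link between the two currencies can be sketched directly: approximating a species' total biomass as numerical abundance times mean individual mass, the two SAD currencies differ by the species' log body mass, which is why plotting one against the other spreads species into a region rather than a line. A toy illustration (species names and values are hypothetical, not the authors' marine data):

```python
import math

# Hypothetical species records: (name, numerical abundance N, mean mass in g).
species = [
    ("sprat",      5000, 10.0),
    ("sand goby",  1200, 2.0),
    ("whiting",     300, 150.0),
    ("bass",         40, 800.0),
    ("conger eel",    5, 3000.0),
]

def currency_pairs(spp):
    """Return (log10 N, log10 total biomass) for each species.

    Total biomass is approximated as N * mean individual mass, so the two
    currencies are linked by log10(B) = log10(N) + log10(m)."""
    return [(math.log10(n), math.log10(n * m)) for _, n, m in spp]

pairs = currency_pairs(species)
for (name, n, m), (log_n, log_b) in zip(species, pairs):
    print(f"{name:>10}: log10 N = {log_n:5.2f}, log10 biomass = {log_b:5.2f}")
```

Plotting these pairs for a real assemblage is the construction whose triangular envelope the paper explains via energetic and sampling constraints.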
A New Tool for Classifying Small Solar System Objects
NASA Astrophysics Data System (ADS)
Desfosses, Ryan; Arel, D.; Walker, M. E.; Ziffer, J.; Harvell, T.; Campins, H.; Fernandez, Y. R.
2011-05-01
An artificial intelligence program, AutoClass, which was developed by NASA's Artificial Intelligence Branch, uses Bayesian classification theory to automatically choose the most probable classification distribution to describe a dataset. To investigate its usefulness to the Planetary Science community, we tested its ability to reproduce the taxonomic classes as defined by Tholen and Barucci (1989). Of the 406 asteroids from the Eight Color Asteroid Survey (ECAS) we chose for our test, 346 were firmly classified and all but 3 (<1%) were classified by Autoclass as they had been in the previous classification system (Walker et al., 2011). We are now applying it to larger datasets to improve the taxonomy of currently unclassified objects. Having demonstrated AutoClass's ability to recreate existing classification effectively, we extended this work to investigations of albedo-based classification systems. To determine how predictive albedo can be, we used data from the Infrared Astronomical Satellite (IRAS) database in conjunction with the large Sloan Digital Sky Survey (SDSS), which contains color and position data for over 200,000 classified and unclassified asteroids (Ivesic et al., 2001). To judge our success we compared our results with a similar approach to classifying objects using IRAS albedo and asteroid color by Tedesco et al. (1989). Understanding the distribution of the taxonomic classes is important to understanding the history and evolution of our Solar System. AutoClass's success in categorizing ECAS, IRAS and SDSS asteroidal data highlights its potential to scan large domains for natural classes in small solar system objects. Based upon our AutoClass results, we intend to make testable predictions about asteroids observed with the Wide-field Infrared Survey Explorer (WISE).
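The flavor of Bayesian class assignment can be sketched on toy two-color data. Note that AutoClass itself fits an unsupervised Bayesian mixture model and chooses the number of classes automatically; the supervised naive-Bayes toy below, with invented color values, illustrates only the probabilistic assignment step:

```python
import math

# Hypothetical two-color observations for two known taxonomic classes.
# Values are invented for illustration, not ECAS measurements.
training = {                     # class -> list of (color1, color2) samples
    "S": [(0.9, 0.4), (0.8, 0.5), (1.0, 0.45)],
    "C": [(0.4, 0.1), (0.35, 0.15), (0.5, 0.05)],
}

def gaussian_params(samples):
    """Mean and (regularized) variance of a 1-D sample."""
    mean = sum(samples) / len(samples)
    var = sum((x - mean) ** 2 for x in samples) / len(samples) + 1e-6
    return mean, var

def log_likelihood(x, mean, var):
    return -0.5 * (math.log(2 * math.pi * var) + (x - mean) ** 2 / var)

def classify(obs):
    """Assign obs to the class maximizing the (equal-prior) posterior."""
    best, best_lp = None, -math.inf
    for cls, pts in training.items():
        lp = 0.0
        for dim in range(len(obs)):
            mean, var = gaussian_params([p[dim] for p in pts])
            lp += log_likelihood(obs[dim], mean, var)
        if lp > best_lp:
            best, best_lp = cls, lp
    return best

print(classify((0.85, 0.42)), classify((0.42, 0.12)))
```

An unclassified object is assigned to whichever class makes its colors most probable; AutoClass generalizes this by also inferring the classes themselves from unlabeled data.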
Brook, Bindi S.
2017-01-01
The chemokine receptor CCR7 drives leukocyte migration into and within lymph nodes (LNs). It is activated by chemokines CCL19 and CCL21, which are scavenged by the atypical chemokine receptor ACKR4. CCR7-dependent navigation is determined by the distribution of extracellular CCL19 and CCL21, which form concentration gradients at specific microanatomical locations. The mechanisms underpinning the establishment and regulation of these gradients are poorly understood. In this article, we have incorporated multiple biochemical processes describing the CCL19–CCL21–CCR7–ACKR4 network into our model of LN fluid flow to establish a computational model to investigate intranodal chemokine gradients. Importantly, the model recapitulates CCL21 gradients observed experimentally in B cell follicles and interfollicular regions, building confidence in its ability to accurately predict intranodal chemokine distribution. Parameter variation analysis indicates that the directionality of these gradients is robust, but their magnitude is sensitive to these key parameters: chemokine production, diffusivity, matrix binding site availability, and CCR7 abundance. The model indicates that lymph flow shapes intranodal CCL21 gradients, and that CCL19 is functionally important at the boundary between B cell follicles and the T cell area. It also predicts that ACKR4 in LNs prevents CCL19/CCL21 accumulation in efferent lymph, but does not control intranodal gradients. Instead, it attributes the disrupted interfollicular CCL21 gradients observed in Ackr4-deficient LNs to ACKR4 loss upstream. Our novel approach has therefore generated new testable hypotheses and alternative interpretations of experimental data. Moreover, it acts as a framework to investigate gradients at other locations, including those that cannot be visualized experimentally or involve other chemokines. PMID:28807994
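The core ingredient, a chemokine gradient shaped by secretion, diffusion, and clearance, can be caricatured in one dimension. This sketch uses invented parameters and omits the lymph flow, matrix binding, and receptor dynamics of the authors' model; it only shows how a robust monotonic gradient emerges from a source balanced against first-order removal (e.g. scavenging):

```python
# 1-D diffusion-decay caricature of an intranodal chemokine gradient.
# All parameters are illustrative (grid units), not fitted to lymph nodes.
n, dx, dt = 50, 1.0, 0.1
D, k, source = 1.0, 0.05, 1.0     # diffusivity, clearance rate, source level
c = [0.0] * n
c[0] = source

for _ in range(5000):             # explicit time stepping toward steady state
    new = c[:]
    for i in range(1, n - 1):
        lap = (c[i - 1] - 2 * c[i] + c[i + 1]) / dx ** 2
        new[i] = c[i] + dt * (D * lap - k * c[i])
    new[0] = source               # constant secretion at the left boundary
    new[-1] = new[-2]             # no-flux far boundary
    c = new

# The steady profile decays over a length scale sqrt(D/k), about 4.5 cells:
# a monotonic gradient that migrating cells could read directionally.
print(round(c[0], 3), round(c[10], 3), round(c[20], 3))
```

Parameter variation in this caricature mirrors the paper's sensitivity analysis: changing the production, diffusivity, or clearance rate rescales the gradient's magnitude while its direction stays robust.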
Low-latitude boundary layer near noon: An open field line model
NASA Technical Reports Server (NTRS)
Lyons, L. R.; Schulz, M.; Pridmore-Brown, D. C.; Roeder, J. L.
1994-01-01
We propose that many features of the cusp and low-latitude boundary layer (LLBL) observed near noon MLT can be explained by interpreting the LLBL as being on open lines with an inner boundary at the separatrix between open and closed magnetic field lines. This interpretation places the poleward boundary of the LLBL and equatorward boundary of the cusp along the field line that bifurcates at the cusp neutral point. The interpretation accounts for the abrupt boundary of magnetosheath particles at the inner edge of the LLBL, a feature that is inconsistent with LLBL formation by diffusion onto closed field lines, and for the distribution of magnetosheath particles appearing more as one continuous region than as two distinct regions across the noon cusp/LLBL boundary. Furthermore, we can explain the existence of energetic radiation belt electrons and protons with differing pitch angle distributions within the LLBL and their abrupt cutoff at the poleward boundary of the LLBL. By modeling the LLBL and cusp region quantitatively, we can account for a hemispherical difference in the location of the equatorial boundary of the cusp that is observed to be dependent on the dipole tilt angle but not on the interplanetary magnetic field (IMF) x component. We also find important variations and hemispherical differences in the size of the LLBL, which should depend strongly upon the x component of the IMF. This prediction is observationally testable. Finally, we find that when the IMF is strongly northward, the LLBL may include a narrow region adjacent to the magnetopause where field lines are detached (i.e., have both ends connected to the IMF).
ERIC Educational Resources Information Center
Frankema, Ewout
2008-01-01
The present paper introduces a new indicator of educational inequality, the grade distribution ratio (GDR), focusing on levels of grade repetition and drop out rates in primary and secondary education. The indicator is specifically suitable to evaluate the distributive implications of expanding educational systems in developing countries. A…
Friesen, Justin P; Campbell, Troy H; Kay, Aaron C
2015-03-01
We propose that people may gain certain "offensive" and "defensive" advantages for their cherished belief systems (e.g., religious and political views) by including aspects of unfalsifiability in those belief systems, such that some aspects of the beliefs cannot be tested empirically and conclusively refuted. This may seem peculiar, irrational, or at least undesirable to many people because it is assumed that the primary purpose of a belief is to know objective truth. However, past research suggests that accuracy is only one psychological motivation among many, and falsifiability or testability may be less important when the purpose of a belief serves other psychological motives (e.g., to maintain one's worldviews, serve an identity). In Experiments 1 and 2 we demonstrate the "offensive" function of unfalsifiability: that it allows religious adherents to hold their beliefs with more conviction and political partisans to polarize and criticize their opponents more extremely. Next we demonstrate unfalsifiability's "defensive" function: When facts threaten their worldviews, religious participants frame specific reasons for their beliefs in more unfalsifiable terms (Experiment 3) and political partisans construe political issues as more unfalsifiable ("moral opinion") instead of falsifiable ("a matter of facts"; Experiment 4). We conclude by discussing how in a world where beliefs and ideas are becoming more easily testable by data, unfalsifiability might be an attractive aspect to include in one's belief systems, and how unfalsifiability may contribute to polarization, intractability, and the marginalization of science in public discourse.
Muscle MRI findings in facioscapulohumeral muscular dystrophy.
Gerevini, Simonetta; Scarlato, Marina; Maggi, Lorenzo; Cava, Mariangela; Caliendo, Giandomenico; Pasanisi, Barbara; Falini, Andrea; Previtali, Stefano Carlo; Morandi, Lucia
2016-03-01
Facioscapulohumeral muscular dystrophy (FSHD) is characterized by extremely variable degrees of facial, scapular and lower limb muscle involvement. Clinical and genetic determination can be difficult, as molecular analysis is not always definitive, and other similar muscle disorders may have overlapping clinical manifestations. Whole-body muscle MRI examination for fat infiltration, atrophy and oedema was performed to identify specific patterns of muscle involvement in FSHD patients (30 subjects), and compared to a group of control patients (23) affected by other myopathies (NFSHD). In FSHD patients, we detected a specific pattern of muscle fatty replacement and atrophy, particularly in upper girdle muscles. The most frequently affected muscles, including paucisymptomatic and severely affected FSHD patients, were trapezius, teres major and serratus anterior. Moreover, asymmetric muscle involvement was significantly higher in FSHD as compared to NFSHD patients. In conclusion, muscle MRI is very sensitive for identifying a specific pattern of involvement in FSHD patients and in detecting selective muscle involvement of non-clinically testable muscles. Muscle MRI constitutes a reliable tool for differentiating FSHD from other muscular dystrophies to direct diagnostic molecular analysis, as well as to investigate FSHD natural history and follow-up of the disease. Muscle MRI identifies a specific pattern of muscle involvement in FSHD patients. Muscle MRI may predict FSHD in asymptomatic and severely affected patients. Muscle MRI of upper girdle better predicts FSHD. Muscle MRI may differentiate FSHD from other forms of muscular dystrophy. Muscle MRI may show the involvement of non-clinically testable muscles.
Scientific realism and wishful thinking in soil hydrology
NASA Astrophysics Data System (ADS)
Flühler, H.
2009-04-01
In our field we often learn - or could have learned - more from failures than from successes, provided we had postulated testable hypotheses to be accepted or rejected. In soil hydrology, hypotheses are testable if independent information quantifying the pertinent system features is at hand. This view of how to operate is an idealized concept of how we could or should have worked. In reality, the path to success is more tortuous, and we usually progress differently, obeying other professional musts. Although we missed some shortcuts over the past few decades, we definitely made significant progress in understanding vadose zone processes, but we could have advanced our system understanding faster by more rigorously questioning the fundamental assumptions. I will try to illustrate the tortuous path of learning and identify some causes of the slowed-down learning curve. In the pioneering phase of vadose zone research many models have been mapped in our minds and implemented on our computers. Many of them are now well established, powerful and represent the state-of-the-art even when they do not work. Some of them are based on erroneous or misleading concepts. Even when based on adequate concepts, they might have been applied in the wrong context, or inadequate models may have led to apparent success. I address this process of collective learning with the intention that we spend more time and effort finding the right question instead of improving tools that are of questionable suitability for solving the main problems.
Dairy farm strategy distribution in Pennsylvania and implications to nutrient management
USDA-ARS?s Scientific Manuscript database
The emergence of alternative dairy farm production strategies may have implications on the environmental impacts as well as profitability of dairy farms. This study characterized Pennsylvania dairy farm production of homegrown forages, herd management, purchased feeds, manure management and economic...
The Allocation of Federal Expenditures Among States
NASA Technical Reports Server (NTRS)
Lee, Maw Lin
1967-01-01
This study explores factors associated with the allocation of federal expenditures by states and examines the implications of these expenditures on the state-by-state distribution of incomes. The allocation of federal expenditures is functionally oriented toward the objectives for which various government programs are set up. The geographical distribution of federal expenditures, therefore, was historically considered to be a problem incidental to government activity. Because of this, relatively little attention was given to the question of why some states receive more federal allocation than others. In addition, the implications of this pattern of allocation among the several states have not been intensively investigated.
Electrical test prediction using hybrid metrology and machine learning
NASA Astrophysics Data System (ADS)
Breton, Mary; Chao, Robin; Muthinti, Gangadhara Raja; de la Peña, Abraham A.; Simon, Jacques; Cepler, Aron J.; Sendelbach, Matthew; Gaudiello, John; Emans, Susan; Shifrin, Michael; Etzioni, Yoav; Urenski, Ronen; Lee, Wei Ti
2017-03-01
Electrical test measurement in the back-end of line (BEOL) is crucial for wafer and die sorting as well as comparing intended process splits. Any in-line, nondestructive technique in the process flow to accurately predict these measurements can significantly improve mean-time-to-detect (MTTD) of defects and improve cycle times for yield and process learning. Measuring after BEOL metallization is commonly done for process control and learning, particularly with scatterometry (also called OCD (Optical Critical Dimension)), which can solve for multiple profile parameters such as metal line height or sidewall angle and does so within patterned regions. This gives scatterometry an advantage over inline microscopy-based techniques, which provide top-down information, since such techniques can be insensitive to sidewall variations hidden under the metal fill of the trench. But when faced with correlation to electrical test measurements that are specific to the BEOL processing, both techniques face the additional challenge of sampling. Microscopy-based techniques are sampling-limited by their small probe size, while scatterometry is traditionally limited (for microprocessors) to scribe targets that mimic device ground rules but are not necessarily designed to be electrically testable. A solution to this sampling challenge lies in a fast reference-based machine learning capability that allows for OCD measurement directly of the electrically testable structures, even when they are not OCD-compatible. By incorporating such direct OCD measurements, correlation to, and therefore prediction of, resistance of BEOL electrical test structures is significantly improved. Improvements in prediction capability for multiple types of in-die electrically testable device structures are demonstrated. To further improve the quality of the prediction of the electrical resistance measurements, hybrid metrology using the OCD measurements as well as X-ray metrology (XRF) is used.
Hybrid metrology is the practice of combining information from multiple sources in order to enable or improve the measurement of one or more critical parameters. Here, the XRF measurements are used to detect subtle changes in barrier layer composition and thickness that can have second-order effects on the electrical resistance of the test structures. By accounting for such effects with the aid of the X-ray-based measurements, further improvement in the OCD correlation to electrical test measurements is achieved. Using both types of solution, incorporation of fast reference-based machine learning on non-OCD-compatible test structures and hybrid metrology combining OCD with XRF technology, improvement in BEOL cycle-time learning could be accomplished through improved prediction capability.
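As an illustrative sketch of why hybridizing OCD with XRF can improve electrical prediction (synthetic data and invented coefficients, not the paper's measurements or method), one can compare the variance in resistance explained by OCD profile parameters alone versus OCD plus an XRF barrier-thickness signal:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 300
# Hypothetical in-line measurements: OCD profile parameters and an XRF barrier signal.
line_height = rng.normal(60, 3, n)      # nm (assumed)
sidewall_angle = rng.normal(88, 1, n)   # degrees (assumed)
barrier_thk = rng.normal(3, 0.3, n)     # nm, from XRF (assumed)

# Simulated electrical resistance with a second-order barrier effect plus noise.
resistance = (1000 - 8 * line_height - 2 * sidewall_angle
              + 15 * barrier_thk + rng.normal(0, 3, n))

def r2(X, y):
    """R^2 of an ordinary least-squares fit of y on X (with intercept)."""
    A = np.column_stack([np.ones(len(y)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ coef
    return 1 - resid.var() / y.var()

r2_ocd = r2(np.column_stack([line_height, sidewall_angle]), resistance)
r2_hybrid = r2(np.column_stack([line_height, sidewall_angle, barrier_thk]), resistance)
print(r2_ocd, r2_hybrid)  # the hybrid fit should explain more variance
```

The gap between the two R^2 values mimics the "second-order effects" the abstract attributes to barrier composition and thickness.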
ERIC Educational Resources Information Center
Dillenbourg, Pierre
1996-01-01
Maintains that diagnosis, explanation, and tutoring, the functions of an interactive learning environment, are collaborative processes. Examines how human-computer interaction can be improved using a distributed cognition framework. Discusses situational and distributed knowledge theories and provides a model on how they can be used to redesign…
ERIC Educational Resources Information Center
Maxcy, Brendan D.; Nguyen, Thu Su'o'ng Thi.
2006-01-01
Recent work on distributed leadership extends an ongoing critique of conventional "heroic" leader portrayals. This article examines work in this area seeking implications for democratic school governance. With material from case studies of two Texas schools, it considers frameworks presented by Spillane, Halverson, and Diamond and by…
Many US water utilities using chloramine as their secondary disinfectant have experienced nitrification episodes that detrimentally impact water quality in their distribution systems. A semi-closed pipe-loop chloraminated drinking water distribution system (DWDS) simulator was u...
2012-07-01
Evaluation of Immune Responses Mediated by Listeria-Stimulated Human Dendritic Cells: Implications for Cancer Vaccine Therapy. PRINCIPAL INVESTIGATOR: David J. Chung, M.D., Ph.D. Distribution Unlimited. ABSTRACT: The purpose of this project is to study the immunomodulatory effect of Listeria on
The spatial coherence function in scanning transmission electron microscopy and spectroscopy.
Nguyen, D T; Findlay, S D; Etheridge, J
2014-11-01
We investigate the implications of the form of the spatial coherence function, also referred to as the effective source distribution, for quantitative analysis in scanning transmission electron microscopy, and in particular for interpreting the spatial origin of imaging and spectroscopy signals. These questions are explored using three different source distribution models applied to a GaAs crystal case study. The shape of the effective source distribution was found to have a strong influence not only on the scanning transmission electron microscopy (STEM) image contrast, but also on the distribution of the scattered electron wavefield and hence on the spatial origin of the detected electron intensities. The implications this has for measuring structure, composition and bonding at atomic resolution via annular dark field, X-ray and electron energy loss STEM imaging are discussed. Copyright © 2014 Elsevier B.V. All rights reserved.
Steve P. Verrill; Frank C. Owens; David E. Kretschmann; Rubin Shmulsky
2017-01-01
It is common practice to assume that a two-parameter Weibull probability distribution is suitable for modeling lumber properties. Verrill and co-workers demonstrated theoretically and empirically that the modulus of rupture (MOR) distribution of visually graded or machine stress rated (MSR) lumber is not distributed as a Weibull. Instead, the tails of the MOR...
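A minimal sketch of the kind of check implied by this abstract, comparing a fitted two-parameter Weibull's lower tail with the empirical tail. The data here are synthetic (not lumber MOR values), and the 5th-percentile comparison is an assumption chosen because lower-tail quantiles drive lumber design values:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical MOR-like sample; real graded-lumber data would be used in practice.
sample = rng.normal(50, 8, 2000).clip(min=1)

# Fit a two-parameter Weibull (location fixed at 0), as commonly assumed.
shape, loc, scale = stats.weibull_min.fit(sample, floc=0)

# Compare the fitted 5th percentile with the empirical 5th percentile;
# a large gap signals that the assumed distribution misfits the tail.
fitted_p05 = stats.weibull_min.ppf(0.05, shape, loc, scale)
empirical_p05 = np.quantile(sample, 0.05)
print(fitted_p05, empirical_p05)
```

If the fitted and empirical percentiles diverge, the Weibull assumption is suspect in exactly the region that matters for allowable-property estimation.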
NASA Technical Reports Server (NTRS)
Stevenson, David J.
1991-01-01
The following subject areas are covered: (1) the mass distribution; (2) interior models; (3) atmospheric compositions and their implications; (4) heat flows and their implications; (5) satellite systems; (6) temperatures in the solar nebula; and (7) giant planet formation.
Reliability/Maintainability/Testability Design for Dormancy
1988-05-01
compositions was developed thousands of years ago. It has proven to be one of the most durable and strongest substances known. It has been stated that glass can... (potting or casting). Certain foamed resins and low-density, hollow-bead compounds can be used to reduce weight. Adverse Dielectric Properties... 4.1.1.1 General Characteristics of Fixed Resistors 4.1-13; 4.1.1.1.1 Fixed Composition Resistors 4.1-13; 4.1.1.1.2 Fixed Film Resistors 4.1-20; 4.1.1.1.3
Repairable chip bonding/interconnect process
Bernhardt, Anthony F.; Contolini, Robert J.; Malba, Vincent; Riddle, Robert A.
1997-01-01
A repairable, chip-to-board interconnect process which addresses cost and testability issues in multi-chip modules. This process can be carried out using a chip-on-sacrificial-substrate technique involving laser processing. This process avoids the curing/solvent evolution problems encountered in prior approaches, as well as resolving prior plating problems and the requirements for fillets. For repairable high speed chip-to-board connection, transmission lines can be formed on the sides of the chip from chip bond pads, ending in a gull wing at the bottom of the chip for subsequent soldering.
The Hubble Web: The Dark Matter Problem and Cosmic Strings
NASA Astrophysics Data System (ADS)
Alexander, Stephon
2009-07-01
I propose a reinterpretation of cosmic dark matter in which a rigid network of cosmic strings formed at the end of inflation. The cosmic strings fulfill three functions: At recombination they provide an accretion mechanism for virializing baryonic and warm dark matter into disks. These cosmic strings survive as configurations which thread spiral and elliptical galaxies leading to the observed flatness of rotation curves and the Tully-Fisher relation. We find a relationship between the rotational velocity of the galaxy and the string tension and discuss the testability of this model.
1983-09-01
has been reviewed and is approved for publication. APPROVED: Mark W. Levi, Project Engineer. APPROVED: W. S. Tuthill, Colonel, USAF, Chief. SUPPLEMENTARY NOTES: RADC Project Engineer: Mark W. Levi (RBRP). This effort was funded... masking the presence of another fault which was a functional or reliability hazard.
Lattice of quantum predictions
NASA Astrophysics Data System (ADS)
Drieschner, Michael
1993-10-01
What is the structure of reality? Physics is supposed to answer this question, but a purely empiristic view is not sufficient to explain its ability to do so. Quantum mechanics has forced us to think more deeply about what a physical theory is. There are preconditions every physical theory must fulfill. It has to contain, e.g., rules for empirically testable predictions. Those preconditions give physics a structure that is “a priori” in the Kantian sense. An example is given how the lattice structure of quantum mechanics can be understood along these lines.
Chowdhry, Bhagwan
2011-01-01
I formulate a simple and parsimonious evolutionary model showing that because most species face a possibility of dying from external factors, called extrinsic mortality in the biology literature, it can simultaneously explain (a) why we discount the future, (b) why we get weaker with age, and (c) why we display risk-aversion. The paper suggests that testable restrictions (across species, across time, or across genders) among time preference, aging, and risk-aversion could be analyzed in a simple framework.
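The link between extrinsic mortality and discounting can be illustrated with a standard survival-based discount weight (a textbook construction offered for illustration, not necessarily this paper's exact formulation; the constant-hazard assumption is mine):

```python
import math

def survival_discount(hazard, t):
    """Probability of surviving t periods under a constant extrinsic
    mortality hazard -- a natural discount weight, since payoffs t
    periods away are only enjoyed if the organism is still alive."""
    return math.exp(-hazard * t)

# A higher hazard implies steeper discounting of distant payoffs.
print(survival_discount(0.05, 10), survival_discount(0.20, 10))
```

Under this reading, species facing higher external mortality should discount the future more heavily, one of the cross-species restrictions the abstract calls testable.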
Complexity, Testability, and Fault Analysis of Digital, Analog, and Hybrid Systems.
1984-09-30
E. Moret. Table 2. Decision Table, Example 3: the second and third rules are inconsistent (Raining? Yes No No), since both could apparently apply when it is... misclassification; a similar approach based on game theory was described in SLAG71 and a third in KULK76. When the class assignments are... corresponds to a change from (X, Y)=(1,1) to (X, Y)=(0,0); the second corresponds to a change in the value of the function g=X+Y; and the third corresponds to a
NASA Technical Reports Server (NTRS)
VanDyke, Melissa; Godfroy, Tom; Houts, Mike; Dickens, Ricky; Dobson, Chris; Pederson, Kevin; Reid, Bob
1999-01-01
The use of resistance heaters to simulate heat from fission allows extensive development of fission systems to be performed in non-nuclear test facilities, saving time and money. Resistance heated tests on the Module Unfueled Thermal-hydraulic Test (MUTT) article have been performed at the Marshall Space Flight Center. This paper discusses the results of these experiments to date, and describes the additional testing that will be performed. Recommendations related to the design of testable space fission power and propulsion systems are made.
NASA Astrophysics Data System (ADS)
van Dyke, Melissa; Godfroy, Tom; Houts, Mike; Dickens, Ricky; Dobson, Chris; Pederson, Kevin; Reid, Bob; Sena, J. Tom
2000-01-01
The use of resistance heaters to simulate heat from fission allows extensive development of fission systems to be performed in non-nuclear test facilities, saving time and money. Resistance heated tests on the Module Unfueled Thermal-hydraulic Test (MUTT) article have been performed at the Marshall Space Flight Center. This paper discusses the results of these experiments to date, and describes the additional testing that will be performed. Recommendations related to the design of testable space fission power and propulsion systems are made.
Results of 30 kWt Safe Affordable Fission Engine (SAFE-30) primary heat transport testing
NASA Astrophysics Data System (ADS)
Pedersen, Kevin; van Dyke, Melissa; Houts, Mike; Godfroy, Tom; Martin, James; Dickens, Ricky; Williams, Eric; Harper, Roger; Salvil, Pat; Reid, Bob
2001-02-01
The use of resistance heaters to simulate heat from fission allows extensive development of fission systems to be performed in non-nuclear test facilities, saving time and money. Resistance heated tests on the Safe Affordable Fission Engine-30 kilowatt (SAFE-30) test article are being performed at the Marshall Space Flight Center. This paper discusses the results of these experiments to date, and describes the additional testing that will be performed. Recommendations related to the design of testable space fission power and propulsion systems are made.
NASA Technical Reports Server (NTRS)
1990-01-01
The present conference on digital avionics discusses vehicle-management systems, spacecraft avionics, special vehicle avionics, communication/navigation/identification systems, software qualification and quality assurance, launch-vehicle avionics, Ada applications, sensor and signal processing, general aviation avionics, automated software development, design-for-testability techniques, and avionics-software engineering. Also discussed are optical technology and systems, modular avionics, fault-tolerant avionics, commercial avionics, space systems, data buses, crew-station technology, embedded processors and operating systems, AI and expert systems, data links, and pilot/vehicle interfaces.
Gravity: one of the driving forces for evolution.
Volkmann, D; Baluska, F
2006-12-01
Mechanical load is 10^3 times larger for land-living than for water-living organisms. As a consequence, antigravitational material in the form of compound materials, like lignified cell walls in plants and mineralised bones in animals, occurs preferentially in land-living organisms. Besides cellulose, pectic substances of plant cell walls seem to function as antigravitational material in early phases of plant evolution and development. A testable hypothesis incorporating vesicular recycling processes into the tensegrity concept is proposed for both the sensing of gravitational force and the response of producing antigravitational material at the cellular level.
Heather Griscom; Helmut Kraenzle; Zachary. Bortolot
2010-01-01
The objective of our project is to create a habitat suitability model to predict potential and future red spruce forest distributions. This model will be used to better understand the influence of climate change on red spruce distribution and to help guide forest restoration efforts.
Distribution of Gull Specific Molecular Marker in Coastal Areas of Lake Ontario
Gulls have been implicated as primary sources of fecal contamination in the Great Lakes, a fact that may have health implications due to the potential spread of microbial pathogens by waterfowl. To better understand the spatial variability of gull fecal contamination, a gull-spe...
Cultural Implications of Human Resource Development.
ERIC Educational Resources Information Center
Hiranpruk, Chaiskran
A discussion of the cultural effects of economic and, by extension, human resource development in Southeast Asia looks at short- and long-term implications. It is suggested that in the short term, increased competition will affect distribution of wealth, which can promote materialism and corruption. The introduction of labor-saving technology may…
The Inequality Implications of Highly Selective Promotion Practices
ERIC Educational Resources Information Center
Mete, Cem
2004-01-01
Faced with the evident impossibility of providing free or significantly subsidized secondary and higher education to all, many poor and middle income countries choose to educate only those students who are most promising, using public examinations as means of distributing scarce resources. This paper investigates the inequality implications of…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walker, Gordon, E-mail: g.p.walker@lancaster.ac.u
2010-09-15
Claims of environmental injustice have increasingly become part of environmental conflicts, both explicitly through the work of environmental justice campaigning groups and implicitly through the arguments deployed about the rights and wrongs of a given situation. Such claims can centre on different notions of justice, including those concerned with questions of distribution and procedure. This paper focuses on distributional or outcome justice and explores what implications follow when the distributional concerns of environmental justice are included in the practice of impact assessment processes, including through social impact assessment (SIA). The current use of impact assessment methods in the UK is reviewed, showing that although practices are evolving there is little routine assessment of distributional inequalities. It is argued that whilst this should become part of established practice to ensure that inequalities are revealed and matters of justice are given a higher profile, the implications for conflict within decision making processes are not straightforward. On the one hand, there could be scope for conflict to be ameliorated by analysis of inequalities informing the debate between stakeholders, and facilitating the implementation of mitigation and compensation measures for disadvantaged groups. On the other hand, contestation over how evidence is produced and therefore what it shows, and disagreement as to the basis on which justice and injustice are to be determined, means that conflict may also be generated and sustained within what are essentially political and strategic settings.
Implication of global climate change on the distribution and activity of Phytophthora ramorum
Robert C. Venette
2009-01-01
Global climate change is predicted to alter the distribution and activity of several forest pathogens. Boland et al. (2004) suggested that climate change might affect pathogen establishment, rate of disease progress, and the duration of...
Relations between distributional and Devaney chaos.
Oprocha, Piotr
2006-09-01
Recently, it was proven that chaos in the sense of Devaney and weak mixing both imply chaos in the sense of Li and Yorke. In this article we give explicit examples showing that neither of these two implications holds for distributional chaos.
Burke, Mary A; Carman, Katherine G
2017-11-01
Previous studies of survey data from the U.S. and other countries find that women tend to understate their body weight on average, while both men and women overstate their height on average. Social norms have been posited as one potential explanation for misreporting of weight and height, but lack of awareness of body weight has been suggested as an alternative explanation, and the evidence presented to date is inconclusive. This paper is the first to offer a theoretical model of self-reporting behavior for weight and height, in which individuals face a tradeoff between reporting an accurate weight (or height) and reporting a socially desirable weight (or height). The model generates testable implications that help us to determine whether self-reporting errors arise because of social desirability bias or instead reflect lack of awareness of body weight and/or other factors. Using data from the National Health and Nutrition Examination Survey (NHANES) from 1999 to 2010, we find that self-reports of weight offer robust evidence of social desirability bias. However, lack of awareness of weight may also contribute to self-reporting biases, and this factor appears to be more important within some demographic groups than others. Among both women and men, self-reports of height exhibit significant social desirability bias only among those of below-average height, and very few individuals underreport their height. Implied self-reports of BMI exhibit gender-specific patterns similar to those observed for self-reporting of weight, and the inferred social norms for BMI (20.8 for women and 24.8 for men) are within the "normal" range established by public health institutions. Determining why individuals misreport their weight has important implications for survey design as well as for clinical practice. For example, our findings suggest that health care providers might take additional steps to increase self-awareness of body weight. 
The framework also helps to explain previous findings that the degree of self-reporting bias in weight is stronger in telephone surveys than it is in in-person surveys. Copyright © 2017 Elsevier B.V. All rights reserved.
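The accuracy-versus-desirability tradeoff described in this record can be sketched with a simple quadratic-loss model (an illustrative construction; the paper's actual model and parameter names may differ, and `lam` and the norm value here are assumptions). Minimizing (r - true)^2 + lam*(r - norm)^2 over the report r gives a convex combination of the true value and the social norm:

```python
def reported_weight(true_w, norm_w, lam):
    """Report minimizing (r - true_w)**2 + lam * (r - norm_w)**2.

    lam >= 0 weighs social desirability against accuracy; setting the
    derivative to zero gives r = (true_w + lam * norm_w) / (1 + lam),
    i.e. a report pulled toward the perceived social norm.
    """
    return (true_w + lam * norm_w) / (1.0 + lam)

# Someone weighing 80 kg, facing a 70 kg norm, with lam = 0.25,
# reports (80 + 0.25 * 70) / 1.25 = 78 kg: understatement toward the norm.
print(reported_weight(80.0, 70.0, 0.25))
```

This reproduces the qualitative pattern in the abstract: people above the norm understate, people below it overstate, and the reporting error grows with distance from the norm.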
Paddock, Susan M.; Ebener, Patricia
2010-01-01
Substance abuse treatment research is complicated by the pervasive problem of non-ignorable missing data – i.e., the occurrence of the missing data is related to the unobserved outcomes. Missing data frequently arise due to early client departure from treatment. Pattern-mixture models (PMMs) are often employed in such situations to jointly model the outcome and the missing data mechanism. PMMs require non-testable assumptions to identify model parameters. Several approaches to parameter identification have therefore been explored for longitudinal modeling of continuous outcomes, and informative priors have been developed in other contexts. In this paper, we describe an expert interview conducted with five substance abuse treatment clinical experts who have familiarity with the Therapeutic Community modality of substance abuse treatment and with treatment process scores collected using the Dimensions of Change Instrument. The goal of the interviews was to obtain expert opinion about the rate of change in continuous client-level treatment process scores for clients who leave before completing two assessments and whose rate of change (slope) in treatment process scores is unidentified by the data. We find that the experts’ opinions differed dramatically from widely-utilized assumptions used to identify parameters in the PMM. Further, subjective prior assessment allows one to properly address the uncertainty inherent in the subjective decisions required to identify parameters in the PMM and to measure their effect on conclusions drawn from the analysis. PMID:19012279
Rothwell, Gar W; Wyatt, Sarah E; Tomescu, Alexandru M F
2014-06-01
Paleontology yields essential evidence for inferring not only the pattern of evolution, but also the genetic basis of evolution within an ontogenetic framework. Plant fossils provide evidence for the pattern of plant evolution in the form of transformational series of structure through time. Developmentally diagnostic structural features that serve as "fingerprints" of regulatory genetic pathways also are preserved by plant fossils, and here we provide examples of how those fingerprints can be used to infer the mechanisms by which plant form and development have evolved. When coupled with an understanding of variations and systematic distributions of specific regulatory genetic pathways, this approach provides an avenue for testing evolutionary hypotheses at the organismal level that is analogous to employing bioinformatics to explore genetics at the genomic level. The positions where specific genes, gene families, and developmental regulatory mechanisms first appear in phylogenies are correlated with the positions where fossils with the corresponding structures occur on the tree, thereby yielding testable hypotheses that extend our understanding of the role of developmental changes in the evolution of the body plans of vascular plant sporophytes. As a result, we now have new and powerful methodologies for characterizing major evolutionary changes in morphology, anatomy, and physiology that have resulted from combinations of genetic regulatory changes and that have produced the synapomorphies by which we recognize major clades of plants. © 2014 Botanical Society of America, Inc.
Robles, Estuardo
2017-09-01
In no vertebrate species do we possess an accurate, comprehensive tally of neuron types in the brain. This is in no small part due to the vast diversity of neuronal types that comprise complex vertebrate nervous systems. A fundamental goal of neuroscience is to construct comprehensive catalogs of cell types defined by structure, connectivity, and physiological response properties. This type of information will be invaluable for generating models of how assemblies of neurons encode and distribute sensory information and correspondingly alter behavior. This review summarizes recent efforts in the larval zebrafish to construct sensory projectomes, comprehensive analyses of axonal morphologies in sensory axon tracts. Focusing on the olfactory and optic tract, these studies revealed principles of sensory information processing in the olfactory and visual systems that could not have been directly quantified by other methods. In essence, these studies reconstructed the optic and olfactory tract in a virtual manner, providing insights into patterns of neuronal growth that underlie the formation of sensory axon tracts. Quantitative analysis of neuronal diversity revealed organizing principles that determine information flow through sensory systems in the zebrafish that are likely to be conserved across vertebrate species. The generation of comprehensive cell type classifications based on structural, physiological, and molecular features will lead to testable hypotheses on the functional role of individual sensory neuron subtypes in controlling specific sensory-evoked behaviors.
A plausible radiobiological model of cardiovascular disease at low or fractionated doses
NASA Astrophysics Data System (ADS)
Little, Mark; Vandoolaeghe, Wendy; Gola, Anna; Tzoulaki, Ioanna
Atherosclerosis is the main cause of coronary heart disease and stroke, the two major causes of death in developed society. There is emerging evidence of excess risk of cardiovascular disease at low radiation doses in various occupationally-exposed groups receiving small daily radiation doses. Assuming that they are causal, the mechanisms for effects of chronic fractionated radiation exposures on cardiovascular disease are unclear. We outline a spatial reaction-diffusion model for atherosclerosis, and perform stability analysis, based wherever possible on human data. We show that a predicted consequence of multiple small radiation doses is to cause mean chemo-attractant (MCP-1) concentration to increase linearly with cumulative dose. The main driver for the increase in MCP-1 is monocyte death, and consequent reduction in MCP-1 degradation. The radiation-induced risks predicted by the model are quantitatively consistent with those observed in a number of occupationally-exposed groups. The changes in equilibrium MCP-1 concentrations with low density lipoprotein cholesterol concentration are also consistent with experimental and epidemiologic data. This proposed mechanism would be experimentally testable. If true, it also has substantive implications for radiological protection, which at present does not take cardiovascular disease into account. The Japanese A-bomb survivor data imply that cardiovascular disease and cancer mortality contribute similarly to radiogenic risk. The major uncertainty in assessing the low-dose risk of cardiovascular disease is the shape of the dose response relationship, which is unclear in the Japanese data. The analysis of the present paper suggests that linear extrapolation would be appropriate for this endpoint.
Benameur, Laila; Charif, Naceur; Li, Yueying; Stoltz, Jean-François; de Isla, Natalia
2015-01-01
Under physiological conditions, there is production of a limited range of free radicals. However, when the cellular antioxidant defence systems are overwhelmed and fail to restore free radicals to their normal basal levels, a condition of redox disequilibrium termed "oxidative stress" is created, which is implicated in a very wide spectrum of genetic, metabolic, and cellular responses. The excess of free radicals can cause unfavourable molecular alterations to biomolecules through oxidation of lipids, proteins, RNA and DNA, which can in turn lead to mutagenesis, carcinogenesis, and aging. Mesenchymal stem cells (MSCs) have been proven to be a promising source of cells for regenerative medicine, and to be useful in the treatment of pathologies in which tissue damage is linked to oxidative stress. Moreover, MSCs appear to manage oxidative stress efficiently and to be more resistant to oxidative insult than normal somatic cells, making them an interesting and testable model for the role of oxidative stress in the aging process. In addition, aging is accompanied by a progressive decline in stem cell function, resulting in less effective tissue homeostasis and repair. Also, there is an obvious link between intracellular reactive oxygen species levels and cellular senescence. To date, few studies have investigated the promotion of aging by oxidative stress in human MSCs, and the mechanism by which oxidative stress induces stem cell aging is poorly understood. In this context, the aim of this review is to gain insight into the current knowledge about the molecular mechanisms of aging-induced oxidative stress in human MSCs.
Bioinformatics Knowledge Map for Analysis of Beta-Catenin Function in Cancer
Arighi, Cecilia N.; Wu, Cathy H.
2015-01-01
Given the wealth of bioinformatics resources and the growing complexity of biological information, it is valuable to integrate data from disparate sources to gain insight into the role of genes/proteins in health and disease. We have developed a bioinformatics framework that combines literature mining with information from biomedical ontologies and curated databases to create knowledge “maps” of genes/proteins of interest. We applied this approach to the study of beta-catenin, a cell adhesion molecule and transcriptional regulator implicated in cancer. The knowledge map includes post-translational modifications (PTMs), protein-protein interactions, disease-associated mutations, and transcription factors co-activated by beta-catenin and their targets and captures the major processes in which beta-catenin is known to participate. Using the map, we generated testable hypotheses about beta-catenin biology in normal and cancer cells. By focusing on proteins participating in multiple relation types, we identified proteins that may participate in feedback loops regulating beta-catenin transcriptional activity. By combining multiple network relations with PTM proteoform-specific functional information, we proposed a mechanism to explain the observation that the cyclin dependent kinase CDK5 positively regulates beta-catenin co-activator activity. Finally, by overlaying cancer-associated mutation data with sequence features, we observed mutation patterns in several beta-catenin PTM sites and PTM enzyme binding sites that varied by tissue type, suggesting multiple mechanisms by which beta-catenin mutations can contribute to cancer. The approach described, which captures rich information for molecular species from genes and proteins to PTM proteoforms, is extensible to other proteins and their involvement in disease. PMID:26509276
A neuro-computational model of economic decisions.
Rustichini, Aldo; Padoa-Schioppa, Camillo
2015-09-01
Neuronal recordings and lesion studies indicate that key aspects of economic decisions take place in the orbitofrontal cortex (OFC). Previous work identified in this area three groups of neurons encoding the offer value, the chosen value, and the identity of the chosen good. An important and open question is whether and how decisions could emerge from a neural circuit formed by these three populations. Here we adapted a biophysically realistic neural network previously proposed for perceptual decisions (Wang XJ. Neuron 36: 955-968, 2002; Wong KF, Wang XJ. J Neurosci 26: 1314-1328, 2006). The domain of economic decisions is significantly broader than that for which the model was originally designed, yet the model performed remarkably well. The input and output nodes of the network were naturally mapped onto two groups of cells in OFC. Surprisingly, the activity of interneurons in the network closely resembled that of the third group of cells, namely, chosen value cells. The model reproduced several phenomena related to the neuronal origins of choice variability. It also generated testable predictions on the excitatory/inhibitory nature of different neuronal populations and on their connectivity. Some aspects of the empirical data were not reproduced, but simple extensions of the model could overcome these limitations. These results render a biologically credible model for the neuronal mechanisms of economic decisions. They demonstrate that choices could emerge from the activity of cells in the OFC, suggesting that chosen value cells directly participate in the decision process. Importantly, Wang's model provides a platform to investigate the implications of neuroscience results for economic theory. Copyright © 2015 the American Physiological Society.
Parent-offspring conflict over family size in current China.
Liu, Jianghua; Duan, Chongli; Lummaa, Virpi
2017-05-06
In China, the recent replacement of the one-child policy with a two-child policy could potentially change family ecology-parents may switch investment from exclusively one child to two. The parent-offspring conflict theory provides testable hypotheses concerning possible firstborn opposition toward further reproduction of their mother, and who wins the conflict. We tested the hypotheses that if there is any opposition, it will differ between sexes, weaken with offspring age and family resource availability, and affect maternal reproductive decision-making. Using survey data of 531 non-pregnant mothers of only one child from Xi'an (China), logistic regression was used to examine effects of age, family income, and sex on the attitudes of firstborn children toward having a sibling; ordinal regression was used to investigate how such attitudes affect maternal intention to reproduce again. Firstborns' unsupportive attitude toward their mothers' further reproduction weakened with age and was overall more frequent in low-income families. Sons' unsupportive tendency displayed a somewhat U-shaped relationship with family income, whereas daughters' weakened with income; consequently, sons were more likely than daughters to be unsupportive in high-income families, suggesting a tendency to be more demanding. Forty-nine percent of mothers supported by their firstborns intended to reproduce again, whilst only 9% of mothers not supported by firstborns had such an intention. Our study contributes to evolutionary literature on parent-offspring conflict and its influence on female reproductive strategy in modern human societies, and also has important implications for understanding fertility patterns and conducting interventions in family conflict in China. © 2016 Wiley Periodicals, Inc.
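The logistic-regression step described above can be sketched on toy data. This plain gradient-descent fit, and the invented age/attitude values, are illustrative only — not the survey's actual model, covariates, or data:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(xs, ys, lr=0.1, steps=5000):
    """Gradient-descent fit of P(y=1|x) = sigmoid(b0 + b1*x).
    A stand-in for a single-predictor logistic regression of firstborn
    attitude (y) on child age (x); hypothetical, not the paper's model."""
    b0 = b1 = 0.0
    for _ in range(steps):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            err = sigmoid(b0 + b1 * x) - y  # prediction error for this point
            g0 += err
            g1 += err * x
        b0 -= lr * g0 / len(xs)
        b1 -= lr * g1 / len(xs)
    return b0, b1

# Toy data: unsupportive attitude (y=1) becomes rarer as child age (x) rises.
ages = [3, 4, 5, 6, 7, 8, 9, 10]
unsupportive = [1, 1, 1, 0, 1, 0, 0, 0]
b0, b1 = fit_logistic(ages, unsupportive)
assert b1 < 0   # negative slope: opposition weakens with age
```

A full replication would add family income and sex as predictors and use an ordinal model for the maternal-intention outcome, as the abstract describes.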
Testing two principles of the Health Action Process Approach in individuals with type 2 diabetes.
Lippke, Sonia; Plotnikoff, Ronald C
2014-01-01
The Health Action Process Approach (HAPA) proposes principles that can be translated into testable hypotheses. This is one of the first studies to have explicitly tested HAPA's first 2 principles, which are (1) the health behavior change process can be subdivided into motivation and volition, and (2) volition can be grouped into intentional and action stages. The 3 stage groups are labeled preintenders, intenders, and actors. The hypotheses of the HAPA model were investigated in a sample of 1,193 individuals with Type 2 diabetes. Study participants completed a questionnaire assessing the HAPA variables. The hypotheses were evaluated by examining mean differences of test variables and by the use of multigroup structural equation modeling (MSEM). Findings support the HAPA's 2 principles and 3 distinct stages. The 3 HAPA stages were significantly different in several stage-specific variables, and discontinuity patterns were found in terms of nonlinear trends across means. In terms of predicting goals, action planning, and behavior, differences transpired between the 2 motivational stages (preintenders and intenders), and between the 2 volitional stages (intenders and actors). Results indicate implications for supporting behavior change processes, depending on which stage a person is in: All individuals should be helped to increase self-efficacy. Preintenders and intenders require interventions targeting outcome expectancies. Actors benefit from an improvement in action planning to maintain and increase their previous behavior. Overall, the first 2 principles of the HAPA were supported and some evidence for the other principles was found. Future research should experimentally test these conclusions. 2014 APA, all rights reserved
Maartens, Roy; Koyama, Kazuya
2010-01-01
The observable universe could be a 1+3-surface (the "brane") embedded in a (1+3+d)-dimensional spacetime (the "bulk"), with Standard Model particles and fields trapped on the brane while gravity is free to access the bulk. At least one of the d extra spatial dimensions could be very large relative to the Planck scale, which lowers the fundamental gravity scale, possibly even down to the electroweak (∼TeV) level. This revolutionary picture arises in the framework of recent developments in M theory. The 1+10-dimensional M theory encompasses the known 1+9-dimensional superstring theories, and is widely considered to be a promising potential route to quantum gravity. At low energies, gravity is localized at the brane and general relativity is recovered, but at high energies gravity "leaks" into the bulk, behaving in a truly higher-dimensional way. This introduces significant changes to gravitational dynamics and perturbations, with interesting and potentially testable implications for high-energy astrophysics, black holes, and cosmology. Brane-world models offer a phenomenological way to test some of the novel predictions and corrections to general relativity that are implied by M theory. This review analyzes the geometry, dynamics and perturbations of simple brane-world models for cosmology and astrophysics, mainly focusing on warped 5-dimensional brane-worlds based on the Randall-Sundrum models. We also cover the simplest brane-world models in which 4-dimensional gravity on the brane is modified at low energies - the 5-dimensional Dvali-Gabadadze-Porrati models. Then we discuss co-dimension two branes in 6-dimensional models.
Towards a theory of PACS deployment: an integrative PACS maturity framework.
van de Wetering, Rogier; Batenburg, Ronald
2014-06-01
Owing to the large financial investments that go along with picture archiving and communication system (PACS) deployments, and to inconsistent PACS performance evaluations, there is a pressing need for a better understanding of the implications of PACS deployment in hospitals. We claim that there is a gap in the research field, both theoretically and empirically, in explaining the success of PACS deployment and maturity in hospitals. Theoretical principles relevant to PACS performance, maturity and alignment are reviewed from a system and complexity perspective. A conceptual model to explain PACS performance and a set of testable hypotheses are then developed. Structural equation modeling (SEM), i.e. causal modeling, is applied to validate the model and hypotheses based on a research sample of 64 hospitals that use PACS, i.e. 70% of all hospitals in the Netherlands. Outcomes of the SEM analyses substantiate that the measurements of all constructs are reliable and valid. PACS alignment, modeled as a higher-order construct of five complementary organizational dimensions and maturity levels, has a significant positive impact on PACS performance. This result is robust and stable for various sub-samples and segments. This paper presents a conceptual model that explains how alignment in deploying PACS in hospitals is positively related to the perceived performance of PACS. The conceptual model is extended with tools such as checklists to systematically identify the improvement areas for hospitals in the PACS domain. The holistic approach towards PACS alignment and maturity provides a framework for clinical practice.
Fundamental insights into ontogenetic growth from theory and fish.
Sibly, Richard M; Baker, Joanna; Grady, John M; Luna, Susan M; Kodric-Brown, Astrid; Venditti, Chris; Brown, James H
2015-11-10
The fundamental features of growth may be universal, because growth trajectories of most animals are very similar, but a unified mechanistic theory of growth remains elusive. Still needed is a synthetic explanation for how and why growth rates vary as body size changes, both within individuals over their ontogeny and between populations and species over their evolution. Here, we use Bertalanffy growth equations to characterize growth of ray-finned fishes in terms of two parameters, the growth rate coefficient, K, and final body mass, m∞. We derive two alternative empirically testable hypotheses and test them by analyzing data from FishBase. Across 576 species, which vary in size at maturity by almost nine orders of magnitude, K scaled as [Formula: see text]. This supports our first hypothesis that growth rate scales as [Formula: see text] as predicted by metabolic scaling theory; it implies that species that grow to larger mature sizes grow faster as juveniles. Within fish species, however, K scaled as [Formula: see text]. This supports our second hypothesis, which predicts that growth rate scales as [Formula: see text] when all juveniles grow at the same rate. The unexpected disparity between across- and within-species scaling challenges existing theoretical interpretations. We suggest that the similar ontogenetic programs of closely related populations constrain growth to [Formula: see text] scaling, but as species diverge over evolutionary time they evolve the near-optimal [Formula: see text] scaling predicted by metabolic scaling theory. Our findings have important practical implications because fish supply essential protein in human diets, and sustainable yields from wild harvests and aquaculture depend on growth rates.
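The Bertalanffy characterization above can be sketched numerically. A minimal illustration, using the mass-based von Bertalanffy form m(t) = m∞(1 − e^(−Kt))³ (one common parameterization, not necessarily the exact equations the authors fit) and made-up parameter values:

```python
import math

def vb_mass(t, m_inf, K, t0=0.0):
    """Mass-based von Bertalanffy growth curve:
    m(t) = m_inf * (1 - exp(-K*(t - t0)))**3  (cube of the length form)."""
    return m_inf * (1.0 - math.exp(-K * (t - t0))) ** 3

# Growth approaches the asymptotic mass m_inf monotonically.
m_inf, K = 1000.0, 0.4   # illustrative values, not from the paper
masses = [vb_mass(t, m_inf, K) for t in (1, 5, 20)]
assert masses[0] < masses[1] < masses[2] < m_inf

def scaling_exponent(pairs):
    """Estimate b in K ~ c * m_inf**b from (m_inf, K) pairs:
    a straight-line (least-squares) slope on log-log axes."""
    xs = [math.log(m) for m, _ in pairs]
    ys = [math.log(k) for _, k in pairs]
    n = len(pairs)
    mx, my = sum(xs) / n, sum(ys) / n
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)

# Synthetic species obeying K = m_inf**(-1/4) recover the exponent -1/4:
pairs = [(m, m ** -0.25) for m in (10.0, 100.0, 1000.0)]
assert abs(scaling_exponent(pairs) + 0.25) < 1e-9
```

The same log-log fit applied across species versus within species is what exposes the disparity in scaling exponents that the abstract reports.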
Modulation of hippocampal rhythms by subthreshold electric fields and network topology
Berzhanskaya, Julia; Chernyy, Nick; Gluckman, Bruce J.; Schiff, Steven J.; Ascoli, Giorgio A.
2012-01-01
Theta (4–12 Hz) and gamma (30–80 Hz) rhythms are considered important for cortical and hippocampal function. Although several neuron types are implicated in rhythmogenesis, the exact cellular mechanisms remain unknown. Subthreshold electric fields provide a flexible, area-specific tool to modulate neural activity and directly test functional hypotheses. Here we present experimental and computational evidence of the interplay among hippocampal synaptic circuitry, neuronal morphology, external electric fields, and network activity. Electrophysiological data are used to constrain and validate an anatomically and biophysically realistic model of area CA1 containing pyramidal cells and two interneuron types: dendritic- and perisomatic-targeting. We report two lines of results: addressing the network structure capable of generating theta-modulated gamma rhythms, and demonstrating electric field effects on those rhythms. First, theta-modulated gamma rhythms require specific inhibitory connectivity. In one configuration, GABAergic axo-dendritic feedback on pyramidal cells is only effective in proximal but not distal layers. An alternative configuration requires two distinct perisomatic interneuron classes, one exclusively receiving excitatory contacts, the other additionally targeted by inhibition. These observations suggest novel roles for particular classes of oriens and basket cells. The second major finding is that subthreshold electric fields robustly alter the balance between different rhythms. Independent of network configuration, positive electric fields decrease, while negative fields increase the theta/gamma ratio. Moreover, electric fields differentially affect average theta frequency depending on specific synaptic connectivity. These results support the testable prediction that subthreshold electric fields can alter hippocampal rhythms, suggesting new approaches to explore their cognitive functions and underlying circuitry. PMID:23053863
A genetic programming approach for Burkholderia pseudomallei diagnostic pattern discovery
Yang, Zheng Rong; Lertmemongkolchai, Ganjana; Tan, Gladys; Felgner, Philip L.; Titball, Richard
2009-01-01
Motivation: Finding diagnostic patterns for fighting diseases like Burkholderia pseudomallei infection using biomarkers involves two key issues. First, exhausting all subsets of testable biomarkers (antigens in this context) to find the best one is computationally infeasible. Therefore, a proper optimization approach like evolutionary computation should be investigated. Second, a properly selected function of the antigens serving as the diagnostic pattern, which is commonly unknown, is key to the diagnostic accuracy and the diagnostic effectiveness in clinical use. Results: A conversion function is proposed to convert serum tests of antigens on patients to binary values, based on which Boolean functions serving as the diagnostic patterns are developed. A genetic programming approach is designed for optimizing the diagnostic patterns in terms of their accuracy and effectiveness. During optimization, the aim is to maximize the coverage (the rate of positive response to antigens) in the infected patients and minimize the coverage in the non-infected patients, while using as few testable antigens in the Boolean functions as possible. The final coverage in the infected patients is 96.55% using 17 of 215 (7.4%) antigens, with zero coverage in the non-infected patients. Among these 17 antigens, BPSL2697 is the most frequently selected one for the diagnosis of Burkholderia pseudomallei. The approach has been evaluated using both the cross-validation and the jackknife simulation methods, with prediction accuracies of 93% and 92%, respectively. A novel approach is also proposed in this study to evaluate a model with binary data using ROC analysis. Contact: z.r.yang@ex.ac.uk PMID:19561021
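The binarize-then-Boolean-pattern idea can be sketched as follows. The simple cutoff conversion, the OR pattern, and the antigen names other than BPSL2697 are hypothetical placeholders, not the study's actual conversion function or selected panel:

```python
def binarize(value, cutoff):
    """Convert a continuous serum reactivity value to 0/1 by a cutoff.
    (The paper's actual conversion function is more involved.)"""
    return 1 if value >= cutoff else 0

def pattern_positive(profile):
    """A diagnostic pattern as a Boolean function over binarized antigens:
    here, an OR over a selected antigen subset (hypothetical panel)."""
    panel = ("BPSL2697", "antigen_B", "antigen_C")   # illustrative names
    return any(profile.get(a, 0) for a in panel)

def coverage(profiles):
    """Fraction of subjects for whom the pattern fires."""
    return sum(1 for p in profiles if pattern_positive(p)) / len(profiles)

infected = [
    {"BPSL2697": binarize(2.3, 1.0)},   # strong reactivity -> 1
    {"antigen_B": 1},
    {"antigen_C": 1},
]
controls = [{"BPSL2697": 0}, {"unrelated": 1}]
assert coverage(infected) == 1.0   # maximize coverage in infected patients
assert coverage(controls) == 0.0   # minimize coverage in controls
```

The genetic programming search then evolves which antigens enter the Boolean function, scoring candidates by exactly this kind of coverage difference while penalizing panel size.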
Testable solution of the cosmological constant and coincidence problems
NASA Astrophysics Data System (ADS)
Shaw, Douglas J.; Barrow, John D.
2011-02-01
We present a new solution to the cosmological constant (CC) and coincidence problems in which the observed value of the CC, Λ, is linked to other observable properties of the Universe. This is achieved by promoting the CC from a parameter that must be specified to a field that can take many possible values. The observed value of Λ ≈ (9.3 Gyr)^-2 [≈ 10^-120 in Planck units] is determined by a new constraint equation which follows from the application of a causally restricted variation principle. When applied to our visible Universe, the model makes a testable prediction for the dimensionless spatial curvature of Ω_k0 = -0.0056(ζ_b/0.5), where ζ_b ∼ 1/2 is a QCD parameter. Requiring that a classical history exist, our model determines the probability of observing a given Λ. The observed CC value, which we successfully predict, is typical within our model even before the effects of anthropic selection are included. When anthropic selection effects are accounted for, we find that the observed coincidence between t_Λ = Λ^-1/2 and the age of the Universe, t_U, is a typical occurrence in our model. In contrast to multiverse explanations of the CC problems, our solution is independent of the choice of a prior weighting of different Λ values and does not rely on anthropic selection effects. Our model includes no unnatural small parameters and does not require the introduction of new dynamical scalar fields or modifications to general relativity, and it can be tested by astronomical observations in the near future.
Evolutionary Perspectives on Genetic and Environmental Risk Factors for Psychiatric Disorders.
Keller, Matthew C
2018-05-07
Evolutionary medicine uses evolutionary theory to help elucidate why humans are vulnerable to disease and disorders. I discuss two different types of evolutionary explanations that have been used to help understand human psychiatric disorders. First, a consistent finding is that psychiatric disorders are moderately to highly heritable, and many, such as schizophrenia, are also highly disabling and appear to decrease Darwinian fitness. Models used in evolutionary genetics to understand why genetic variation exists in fitness-related traits can be used to understand why risk alleles for psychiatric disorders persist in the population. The usual explanation for species-typical adaptations-natural selection-is less useful for understanding individual differences in genetic risk to disorders. Rather, two other types of models, mutation-selection-drift and balancing selection, offer frameworks for understanding why genetic variation in risk to psychiatric (and other) disorders exists, and each makes predictions that are now testable using whole-genome data. Second, species-typical capacities to mount reactions to negative events are likely to have been crafted by natural selection to minimize fitness loss. The pain reaction to tissue damage is almost certainly such an example, but it has been argued that the capacity to experience depressive symptoms such as sadness, anhedonia, crying, and fatigue in the face of adverse life situations may have been crafted by natural selection as well. I review the rationale and strength of evidence for this hypothesis. Evolutionary hypotheses of psychiatric disorders are important not only for offering explanations for why psychiatric disorders exist, but also for generating new, testable hypotheses and understanding how best to design studies and analyze data.
The Assurance Challenges of Advanced Packaging Technologies for Electronics
NASA Technical Reports Server (NTRS)
Sampson, Michael J.
2010-01-01
Advances in microelectronic part performance are driving toward finer feature sizes, three-dimensional geometries, and ever-increasing numbers of transistor equivalents, resulting in larger die sizes and interconnection (I/O) counts. The packaging necessary to provide assemble-ability, environmental protection, testability, and interconnection to the circuit board for the active die creates major challenges, particularly for space applications. Traditionally, NASA has used hermetically packaged microcircuits whenever available, but the new demands make hermetic packaging less and less practical and more and more expensive. Some part types of great interest to NASA designers are currently only available in non-hermetic packaging. It is a far more complex quality and reliability assurance challenge to gain confidence in the long-term survivability and effectiveness of non-hermetic packages than of hermetic ones. Although they may provide more rugged environmental protection than the familiar Plastic Encapsulated Microcircuits (PEMs), the non-hermetic Ceramic Column Grid Array (CCGA) packages that are the focus of this presentation present a unique combination of challenges to assessing their suitability for spaceflight use. The presentation will discuss the bases for these challenges, some examples of the techniques proposed to mitigate them, and a proposed approach to a US MIL specification class for non-hermetic microcircuits suitable for space application, Class Y, to be incorporated into MIL-PRF-38535. It has recently emerged that some major packaging suppliers are offering hermetic area-array packages that may offer alternatives to the non-hermetic CCGA styles, but these have their own inspectability and testability issues, which will also be briefly discussed in the presentation.
NASA Astrophysics Data System (ADS)
Hirata, N.; Tsuruoka, H.; Yokoi, S.
2011-12-01
The current Japanese national earthquake prediction program emphasizes the importance of modeling as well as monitoring for a sound scientific development of earthquake prediction research. One major focus of the current program is to move toward creating testable earthquake forecast models. For this purpose, in 2009 we joined the Collaboratory for the Study of Earthquake Predictability (CSEP) and installed, through an international collaboration, the CSEP Testing Centre, an infrastructure to encourage researchers to develop testable models for Japan. We started the Japanese earthquake predictability experiment on November 1, 2009. The experiment consists of 12 categories: 4 testing classes with different time spans (1 day, 3 months, 1 year and 3 years) and 3 testing regions called 'All Japan,' 'Mainland,' and 'Kanto.' A total of 160 models, as of August 2013, were submitted and are currently under the CSEP official suite of tests for evaluating the performance of forecasts. We will present results of prospective forecasting and testing for periods before and after the 2011 Tohoku-oki earthquake. Because seismic activity changed dramatically after the 2011 event, model performance was strongly affected. In addition, because of problems with the authorized catalogue related to the magnitude of completeness, most models did not pass the CSEP consistency tests. We will also discuss retrospective earthquake forecast experiments for aftershocks of the 2011 Tohoku-oki earthquake. Our aim is to describe what has turned out to be the first occasion for setting up a research environment for rigorous earthquake forecasting in Japan.
Global Distribution of Active Volcanism on Io as Known at the End of the Galileo Mission
NASA Technical Reports Server (NTRS)
Lopes, Rosaly M. C.; Kamp, Lucas W.; Smythe, W. D.; Radebaugh, J.; Turtle, E.; Perry, J.; Bruno, B.
2004-01-01
Hot spots are manifestations of Io's mechanism of internal heating and heat transfer. Therefore, the global distribution of hot spots and their power output has important implications for how Io is losing heat. The end of the Galileo mission is an opportune time to revisit studies of the distribution of hot spots on Io, and to investigate the distribution of their power output.
NASA Astrophysics Data System (ADS)
Milliner, C. W. D.; Dolan, J. F.; Hollingsworth, J.; Leprince, S.; Ayoub, F.
2016-10-01
Subpixel correlation of preevent and postevent air photos reveals the complete near-field, horizontal surface deformation patterns of the 1992 Mw 7.3 Landers and 1999 Mw 7.1 Hector Mine ruptures. Total surface displacement values for both earthquakes are systematically larger than "on-fault" displacements from geologic field surveys, indicating that significant distributed, inelastic deformation occurred along these ruptures. Comparison of these two data sets shows that 46 ± 10% and 39 ± 22% of the total surface deformation were distributed over fault zones averaging 154 m and 121 m in width for the Landers and Hector Mine events, respectively. Spatial variations of distributed deformation along both ruptures show correlations with the type of near-surface lithology and degree of fault complexity; larger amounts of distributed shear occur where the rupture propagated through loose unconsolidated sediments and areas of more complex fault structure. These results have basic implications for geologic-geodetic rate comparisons and probabilistic seismic hazard analysis.
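The "subpixel" part of subpixel correlation can be illustrated with the standard parabolic peak-interpolation trick: fit a parabola through the correlation maximum and its two neighbours to locate the displacement to a fraction of a pixel. This is a generic sketch, not the correlation software actually used on the air photos:

```python
def subpixel_peak(corr):
    """Locate a correlation peak with sub-sample precision by fitting a
    parabola through the maximum and its two neighbours (assumes the
    maximum is not at either end of the list)."""
    i = max(range(len(corr)), key=corr.__getitem__)
    y0, y1, y2 = corr[i - 1], corr[i], corr[i + 1]
    # Vertex of the parabola through (i-1, y0), (i, y1), (i+1, y2):
    offset = 0.5 * (y0 - y2) / (y0 - 2 * y1 + y2)
    return i + offset

# A peak truly centred at lag 2.25, sampled only at integer lags:
curve = [1 - 0.1 * (x - 2.25) ** 2 for x in range(5)]
est = subpixel_peak(curve)
assert abs(est - 2.25) < 1e-9   # the fit recovers the exact centre
```

In the real workflow this interpolation is applied per image patch (in 2-D, and typically in the frequency domain), and the resulting displacement field is what gets compared against the field-surveyed on-fault offsets.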
Survey of Large Methane Emitters in North America
NASA Astrophysics Data System (ADS)
Deiker, S.
2017-12-01
It has been theorized that methane emissions in the oil and gas industry follow log-normal or "fat-tail" distributions, with large numbers of small sources for every very large source. Such distributions would have significant policy and operational implications. Unfortunately, by their very nature such distributions require large sample sizes to verify. Until recently, such large-scale studies would have been prohibitively expensive. The largest public study to date sampled 450 wells, an order of magnitude too few to effectively constrain these models. During 2016 and 2017, Kairos Aerospace conducted a series of surveys using the LeakSurveyor imaging spectrometer mounted on light aircraft. This small, lightweight instrument was designed to rapidly locate large emission sources. The resulting survey covers over three million acres of oil and gas production, including over 100,000 wells, thousands of storage tanks, and over 7,500 miles of gathering lines. This data set allows us to probe the distribution of large methane emitters. Results of this survey, and implications for methane emission distributions, methane policy, and LDAR, will be discussed.
GlobAl Distribution of GEnetic Traits (GADGET) web server: polygenic trait scores worldwide.
Chande, Aroon T; Wang, Lu; Rishishwar, Lavanya; Conley, Andrew B; Norris, Emily T; Valderrama-Aguirre, Augusto; Jordan, I King
2018-05-18
Human populations from around the world show striking phenotypic variation across a wide variety of traits. Genome-wide association studies (GWAS) are used to uncover genetic variants that influence the expression of heritable human traits; accordingly, population-specific distributions of GWAS-implicated variants may shed light on the genetic basis of human phenotypic diversity. With this in mind, we developed the GlobAl Distribution of GEnetic Traits web server (GADGET http://gadget.biosci.gatech.edu). The GADGET web server provides users with a dynamic visual platform for exploring the relationship between worldwide genetic diversity and the genetic architecture underlying numerous human phenotypes. GADGET integrates trait-implicated single nucleotide polymorphisms (SNPs) from GWAS, with population genetic data from the 1000 Genomes Project, to calculate genome-wide polygenic trait scores (PTS) for 818 phenotypes in 2504 individual genomes. Population-specific distributions of PTS are shown for 26 human populations across 5 continental population groups, with traits ordered based on the extent of variation observed among populations. Users of GADGET can also upload custom trait SNP sets to visualize global PTS distributions for their own traits of interest.
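The kind of score GADGET computes can be sketched as a weighted sum of risk-allele counts over trait-implicated SNPs. The SNP identifiers and effect sizes below are invented for illustration, not GADGET's actual data:

```python
def polygenic_score(genotype, effects):
    """Polygenic trait score: sum over trait-implicated SNPs of
    (risk-allele count in this genome, 0/1/2) * (GWAS effect size).
    A simplified sketch of a weighted PTS."""
    return sum(genotype.get(snp, 0) * beta for snp, beta in effects.items())

# Hypothetical SNP identifiers and effect sizes (not from GWAS Catalog).
effects = {"rs0001": 0.30, "rs0002": 0.12, "rs0003": -0.05}
genome = {"rs0001": 2, "rs0002": 1, "rs0003": 0}   # diploid allele counts
score = polygenic_score(genome, effects)
assert abs(score - 0.72) < 1e-12   # 2*0.30 + 1*0.12 + 0*(-0.05)
```

Computing such a score for each of the 2504 genomes and then grouping by population is what yields the population-specific PTS distributions the server visualizes.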
Accommodating Uncertainty in Prior Distributions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Picard, Richard Roy; Vander Wiel, Scott Alan
2017-01-19
A fundamental premise of Bayesian methodology is that a priori information is accurately summarized by a single, precisely defined prior distribution. In many cases, especially those involving informative priors, this premise is false, and the (mis)application of Bayes methods produces posterior quantities whose apparent precisions are highly misleading. We examine the implications of uncertainty in prior distributions, and present graphical methods for dealing with them.
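One way to see the point is to propagate a whole class of plausible priors rather than a single one and inspect the spread of the posterior answer. A minimal conjugate beta-binomial sketch with illustrative numbers (not from the report):

```python
def beta_posterior_mean(a, b, successes, failures):
    """Beta(a, b) prior plus binomial data gives a Beta posterior;
    its mean is (a + successes) / (a + b + successes + failures)."""
    return (a + successes) / (a + b + successes + failures)

# Instead of one "precisely defined" prior, carry a class of plausible
# Beta priors through the same update and report the resulting range.
prior_class = [(1, 1), (2, 2), (5, 1), (1, 5)]   # illustrative priors
data = (7, 3)                                    # 7 successes, 3 failures
means = [beta_posterior_mean(a, b, *data) for a, b in prior_class]
lo, hi = min(means), max(means)
uniform_mean = means[0]          # Beta(1, 1): 8/12, about 0.667
assert lo <= uniform_mean <= hi  # the single-prior answer sits inside
assert hi - lo > 0.2             # with n=10, prior choice still matters a lot
```

The gap between `lo` and `hi` is the honest statement of posterior precision under prior uncertainty; reporting only `uniform_mean` and its nominal interval overstates what the data support.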
ERIC Educational Resources Information Center
Solomon, David J.
2013-01-01
It has been approximately 20 years since distributing scholarly journals digitally became feasible. This article discusses the broad implications of the transition to digital distributed scholarship from a historical perspective and focuses on the development of open access (OA) and the various models for funding OA in the context of the roles…
K.M. Burnett; G.H. Reeves; D.J. Miller; S. Clarke; K. Vance-Borland; K. Christiansen
2007-01-01
The geographic distribution of stream reaches with potential to support high-quality habitat for salmonids has bearing on the actual status of habitats and populations over broad spatial extents. As part of the Coastal Landscape Analysis and Modeling Study, we examined how salmon-habitat potential was distributed relative to current and future (+100 years) landscape...
ERIC Educational Resources Information Center
Yuan, Ke-Hai
2008-01-01
In the literature of mean and covariance structure analysis, noncentral chi-square distribution is commonly used to describe the behavior of the likelihood ratio (LR) statistic under alternative hypothesis. Due to the inaccessibility of the rather technical literature for the distribution of the LR statistic, it is widely believed that the…
Democratic Schooling in Norway: Implications for Leadership in Practice
ERIC Educational Resources Information Center
Moller, Jorunn
2006-01-01
This article explores the meaning of an education based on democratic values and the implications for school leadership in practice. Based on findings from a case study in a Norwegian upper secondary school, the study describes democratic school leadership in practice, with particular attention to the distribution of power and leadership in the…
Capitalism's New Handmaiden: The Biotechnical World Negotiated through Children's Fiction
ERIC Educational Resources Information Center
Sawers, Naarah
2009-01-01
In an era when the merger between capitalism and science becomes an accepted norm, new questions need to be asked about the ethical implications of scientific practices. One such practice is organ transplantation. However, potent debates surround the just distribution and ethical implications of organ transplantation. This paper examines the ways…
ERIC Educational Resources Information Center
Jones, Adrian
2011-01-01
This paper considers the implications for higher education of recent work on narrative theory, distributed cognition and artificial intelligence. These perspectives are contrasted with the educational implications of Heidegger's ontological phenomenology [being-there and being-aware (Da-sein)] and with the classic and classical foundations of…
NASA Astrophysics Data System (ADS)
Soulsby, Chris; Birkel, Christian; Geris, Josie; Tetzlaff, Doerthe
2016-04-01
Advances in the use of hydrological tracers and their integration into rainfall-runoff models are facilitating improved quantification of stream water age distributions. This is of fundamental importance to understanding water quality dynamics over both short and long time scales, particularly as water quality parameters are often associated with water sources of markedly different ages. For example, legacy nitrate pollution may reflect deeper waters that have resided in catchments for decades, whilst more dynamic parameters from anthropogenic sources (e.g. P, pathogens) are mobilised by very young (<1 day) near-surface water sources. It is increasingly recognised that the age distribution of stream water is non-stationary over both the short term (i.e. event dynamics) and the longer term (i.e. in relation to hydroclimatic variability). This provides a crucial context for interpreting water quality time series. Here, we will use longer-term (>5 year), high-resolution (daily) isotope time series in modelling studies for different catchments to show how variable stream water age distributions can result from hydroclimatic variability, and the implications for understanding water quality. We will also use examples from catchments undergoing rapid urbanisation to show how the resulting age distributions of stream water change in a predictable way as a result of modified flow paths. The implications for the management of water quality in urban catchments will be discussed.
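The link between age distributions and stream water quality can be sketched as a convolution of past inputs with a transit-time distribution (TTD). This stationary toy example is illustrative only; the abstract's point is precisely that real TTDs are non-stationary:

```python
def convolve_input(inputs, ttd):
    """Stream signal at the latest time step as a TTD-weighted sum of
    past inputs: c_out(t) = sum_a ttd[a] * inputs[t - a], where ttd[a]
    is the fraction of streamflow with age a (weights sum to 1)."""
    t = len(inputs) - 1
    return sum(w * inputs[t - a] for a, w in enumerate(ttd))

# A young-water-dominated TTD vs an old-water-dominated one (made up).
young = [0.7, 0.2, 0.1]
old   = [0.1, 0.2, 0.7]
tracer_inputs = [0.0, 0.0, 1.0]   # a pulse arriving only on the latest day

# A young-water catchment transmits most of a fresh pulse immediately,
# an old-water catchment very little - the mechanism behind pathogen
# flashiness vs decades-long legacy nitrate.
assert convolve_input(tracer_inputs, young) == 0.7
assert convolve_input(tracer_inputs, old) == 0.1
```

Replacing the fixed `ttd` with one that shifts between events and seasons is the non-stationary behaviour the modelling studies quantify.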
Dimensions of Employee Compensation: Practical and Theoretical Implications for Superintendents.
ERIC Educational Resources Information Center
Young, I. Phillip
1997-01-01
Explores compensation practices fundamental to the school board/employee exchange relationship, using a sample of 615 midwestern superintendents. Employs an organizational justice model, focusing on its procedural and distributive dimensions. Explores procedural justice via market-rate earnings equations and distributive justice by examining…
The objectives of this presentation are to: review the history of distribution system chlorination regulations, raise awareness of the meaning of detectable residual as it relates to chloramines, and perhaps renew dialogue on minimum disinfectant residuals.
Extrusomes in ciliates: diversification, distribution, and phylogenetic implications.
Rosati, Giovanna; Modeo, Letizia
2003-01-01
Exocytosis is, in all likelihood, an important communication method among microbes. Ciliates are highly differentiated and specialized micro-organisms for which versatile and/or sophisticated exocytotic organelles may represent important adaptive tools. Thus, in ciliates, we find a broad range of different extrusomes, i.e., ejectable membrane-bound organelles. Structurally simple extrusomes, such as mucocysts and cortical granules, are widespread across taxa within the phylum and in each case play the roles required by the ecological needs of the organisms. We also find a number of more elaborate extrusomes, whose distribution within the phylum is more limited and in some ways related to phylogenetic affinities. Herein we provide a survey of the literature and of our own data on selected extrusomes in ciliates. Their morphology, distribution, and possible functions are discussed, and the possible phylogenetic implications of their diversity are considered.
Natural gas in the energy industry of the 21st century
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cuttica, J.
1995-12-31
This paper provides a gas industry perspective on the impacts of restructuring the natural gas and electric industries. The four main implications discussed are: (1) market trends, (2) strategic positioning, (3) significant market implications, and (4) issues for the future. Market trends discussed include the transition from rate-of-return regulation to market competition, and regulatory impacts. Significant market implications identified for gas-fired generation include limited new generation investment, extension of existing plants, and an opportunity for distributed power generation. 12 tabs.
Online Reports of Foodborne Illness Capture Foods Implicated in Official Foodborne Outbreak Reports
Nsoesie, Elaine O.; Gordon, Sheryl A.; Brownstein, John S.
2014-01-01
Objective Traditional surveillance systems only capture a fraction of the estimated 48 million yearly cases of foodborne illness in the United States. We assessed whether foodservice reviews on Yelp.com (a business review site) can be used to support foodborne illness surveillance efforts. Methods We obtained reviews from 2005–2012 of 5824 foodservice businesses closest to 29 colleges. After extracting recent reviews describing episodes of foodborne illness, we compared implicated foods to foods in outbreak reports from the U.S. Centers for Disease Control and Prevention (CDC). Results Broadly, the distribution of implicated foods across five categories was as follows: aquatic (16% Yelp, 12% CDC), dairy-eggs (23% Yelp, 23% CDC), fruits-nuts (7% Yelp, 7% CDC), meat-poultry (32% Yelp, 33% CDC), and vegetables (22% Yelp, 25% CDC). The distribution of foods across 19 more specific food categories was also similar, with Spearman correlations ranging from 0.60 to 0.85 for 2006–2011. The most implicated food categories in both Yelp and CDC were beef, dairy, grains-beans, poultry and vine-stalk. Conclusions Based on observations in this study and the increased usage of social media, we posit that online illness reports could complement traditional surveillance systems by providing near real-time information on foodborne illnesses, implicated foods and locations. PMID:25124281
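The category-level comparison reported above can be reproduced in miniature from the five broad-category percentages quoted in the abstract. A plain-Python rank correlation sketch (not the authors' code; they report Spearman correlations of 0.60–0.85 over 19 finer categories):

```python
def ranks(values):
    """Average ranks (1-based); ties share the mean of their positions."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of 1-based positions i..j
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman rho = Pearson correlation of the rank vectors."""
    rx, ry = ranks(x), ranks(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

# Percentages for the five broad categories, in the order
# aquatic, dairy-eggs, fruits-nuts, meat-poultry, vegetables:
yelp = [16, 23, 7, 32, 22]
cdc = [12, 23, 7, 33, 25]
rho = spearman(yelp, cdc)  # 0.9 for these five categories
```

Even at this coarse five-category level the two sources rank foods almost identically, which is the intuition behind the paper's finer-grained comparison.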
NASA Technical Reports Server (NTRS)
Cockell, C.; Catling, D.; Waites, H.
1999-01-01
Insects have a number of potential roles in closed-loop life support systems. In this study we examined the tolerance of a range of insect orders and life stages to drops in atmospheric pressure using a terrestrial atmosphere. We found that all insects studied could tolerate pressures down to 100 mb. No effects on insect respiration were noted down to 500 mb. Pressure toleration was not dependent on body volume. Our studies demonstrate that insects are compatible with plants in low-pressure artificial and closed-loop ecosystems. The results also have implications for arthropod colonization and global distribution on Earth.
Geographic Population Structure in Epstein-Barr Virus Revealed by Comparative Genomics
Chiara, Matteo; Manzari, Caterina; Lionetti, Claudia; Mechelli, Rosella; Anastasiadou, Eleni; Chiara Buscarinu, Maria; Ristori, Giovanni; Salvetti, Marco; Picardi, Ernesto; D’Erchia, Anna Maria; Pesole, Graziano; Horner, David S.
2016-01-01
Epstein-Barr virus (EBV) latently infects the majority of the human population and is implicated as a causal or contributory factor in numerous diseases. We sequenced 27 complete EBV genomes from a cohort of Multiple Sclerosis (MS) patients and healthy controls from Italy, although no variants showed a statistically significant association with MS. Taking advantage of the availability of ∼130 EBV genomes with known geographical origins, we reveal a striking geographic distribution of EBV sub-populations with distinct allele frequency distributions. We discuss mechanisms that potentially explain these observations, and their implications for understanding the association of EBV with human disease. PMID:27635051
NASA Technical Reports Server (NTRS)
Richmond, R. G.; Kelso, R. M.
1980-01-01
A concern has arisen regarding the emissive distribution of water molecules from the shuttle orbiter flash evaporator system (FES). The role of the orbiter fuselage and elevon in affecting molecular scattering distributions was unclear, so the effects of these components were evaluated. Molecular distributions of the water vapor effluents from the FES were measured, compared with analytically predicted values, and the resulting implications were assessed.
Distribution of transuranic elements in bone.
Durbin, P W
1992-01-01
The transport, retention, and excretion of transuranic elements from the body have been widely studied for many years. A summary of the results is given with an emphasis on the distribution of these elements in bone. Implications of these studies for understanding the relationships between lead in blood and lead in bone are presented. The expected distribution of lead at various bone sites is also considered.
Modelling protein functional domains in signal transduction using Maude
NASA Technical Reports Server (NTRS)
Sriram, M. G.
2003-01-01
Modelling of protein-protein interactions in signal transduction is receiving increased attention in computational biology. This paper describes recent research in the application of Maude, a symbolic language founded on rewriting logic, to the modelling of functional domains within signalling proteins. Protein functional domains (PFDs) are a critical focus of modern signal transduction research. In general, Maude models can simulate biological signalling networks and produce specific testable hypotheses at various levels of abstraction. Developing symbolic models of signalling proteins containing functional domains is important because of the potential to generate analyses of complex signalling networks based on structure-function relationships.
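Maude expresses such models as rewrite rules over terms. A loose analogy in Python, treating the cell state as a set of species and signalling steps as set-rewrite rules; the receptor, kinase and SH2-style domain interaction below are hypothetical illustrations, not rules from the paper:

```python
# Rewrite rules as (lhs, rhs) pairs over sets of species tokens.
# Hypothetical two-step cascade: a kinase docks on an activated
# receptor via a functional domain, then phosphorylates a substrate.
RULES = [
    ({"Receptor*", "Kinase"}, {"Receptor*:Kinase"}),  # domain-mediated binding
    ({"Receptor*:Kinase", "Substrate"},
     {"Receptor*:Kinase", "Substrate~P"}),            # phosphorylation
]

def rewrite(state):
    """Apply the first applicable rule once; None if no rule matches."""
    for lhs, rhs in RULES:
        if lhs <= state:
            return (state - lhs) | rhs
    return None

def run(state):
    """Rewrite repeatedly until a normal form (no rule applies)."""
    while (nxt := rewrite(state)) is not None:
        state = nxt
    return state

final = run(frozenset({"Receptor*", "Kinase", "Substrate"}))
```

Maude's rewriting logic adds term structure, equational matching and model checking on top of this idea, which is what makes it suitable for generating testable hypotheses about signalling networks.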
Repairable chip bonding/interconnect process
Bernhardt, A.F.; Contolini, R.J.; Malba, V.; Riddle, R.A.
1997-08-05
A repairable, chip-to-board interconnect process which addresses cost and testability issues in multi-chip modules is disclosed. This process can be carried out using a chip-on-sacrificial-substrate technique involving laser processing. It avoids the curing/solvent-evolution problems encountered in prior approaches, as well as resolving prior plating problems and the requirement for fillets. For a repairable high-speed chip-to-board connection, transmission lines can be formed on the sides of the chip from the chip bond pads, ending in a gull wing at the bottom of the chip for subsequent soldering. 10 figs.
The evolutionary psychology of hunger.
Al-Shawaf, Laith
2016-10-01
An evolutionary psychological perspective suggests that emotions can be understood as coordinating mechanisms whose job is to regulate various psychological and physiological programs in the service of solving an adaptive problem. This paper suggests that it may also be fruitful to approach hunger from this coordinating mechanism perspective. To this end, I put forward an evolutionary task analysis of hunger, generating novel a priori hypotheses about the coordinating effects of hunger on psychological processes such as perception, attention, categorization, and memory. This approach appears empirically fruitful in that it yields a bounty of testable new hypotheses. Copyright © 2016 Elsevier Ltd. All rights reserved.
Crystal study and econometric model
NASA Technical Reports Server (NTRS)
1975-01-01
An econometric model was developed that can be used to predict demand and supply figures for crystals over a time horizon roughly concurrent with that of NASA's Space Shuttle Program - that is, 1975 through 1990. The model includes an equation to predict the impact on investment in the crystal-growing industry. Two models are presented: the first is a theoretical model that follows rather strictly the standard economic concepts involved in supply and demand analysis; the second is a modified version which, though not quite as theoretically sound, was testable using existing data sources.
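The supply-and-demand core of such an econometric model can be sketched in a few lines. A minimal linear-equilibrium example with made-up coefficients; the study's actual equations and crystal-market data are not reproduced here:

```python
def equilibrium(a, b, c, d):
    """Equilibrium price and quantity for linear demand Qd = a - b*P
    and linear supply Qs = c + d*P (illustrative coefficients only).
    Setting Qd = Qs gives P* = (a - c) / (b + d)."""
    price = (a - c) / (b + d)
    quantity = a - b * price
    return price, quantity

# Hypothetical crystal-market coefficients (not from the NASA study):
p, q = equilibrium(a=100.0, b=2.0, c=10.0, d=1.0)  # p = 30.0, q = 40.0
```

An estimated version of such a model would fit `a`, `b`, `c`, `d` to historical price and quantity series, which is where the "testable utilizing existing data sources" distinction in the abstract comes in.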