Extended Testability Analysis Tool
NASA Technical Reports Server (NTRS)
Melcher, Kevin; Maul, William A.; Fulton, Christopher
2012-01-01
The Extended Testability Analysis (ETA) Tool is a software application that supports fault management (FM) by performing testability analyses on the fault propagation model of a given system. Fault management includes the prevention of faults through robust design margins and quality assurance methods, as well as the mitigation of system failures. Fault management requires an understanding of the system design and operation, potential failure mechanisms within the system, and the propagation of those potential failures through the system. The purpose of the ETA Tool software is to process the testability analysis results from a commercial software program called TEAMS Designer in order to provide a detailed set of diagnostic assessment reports. The ETA Tool is a command-line process with several user-selectable report output options. The ETA Tool also extends the COTS testability analysis, enabling variation studies, such as the impact of sensor sensitivity on system diagnostics and component isolation, from a single set of testability output files. The following analysis reports are available to the user: (1) the Detectability Report provides a breakdown of how each tested failure mode was detected, (2) the Test Utilization Report identifies all the failure modes that each test detects, (3) the Failure Mode Isolation Report demonstrates the system's ability to discriminate between failure modes, (4) the Component Isolation Report demonstrates the system's ability to discriminate between failure modes relative to the components containing the failure modes, (5) the Sensor Sensitivity Analysis Report shows the diagnostic impact due to loss of sensor information, and (6) the Effect Mapping Report identifies failure modes that result in specified system-level effects.
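All six reports are derived from the fault-to-test dependency relation captured in the diagnostic model. As a rough, hypothetical sketch of the bookkeeping behind a detectability or isolation report (not ETA's actual data format or code), failure modes sharing an identical test signature form an ambiguity group that no test outcome can discriminate:

```python
from collections import defaultdict

# Hypothetical dependency data: for each failure mode, a tuple giving
# which tests respond to it (1) or not (0). Illustrative names only.
D = {
    "valve_stuck":   (1, 0, 1),
    "seal_leak":     (1, 0, 1),   # same test signature as valve_stuck
    "sensor_drift":  (0, 1, 0),
    "fitting_crack": (0, 0, 0),   # no test responds: undetectable
}

detected = [mode for mode, sig in D.items() if any(sig)]
print(f"detected {len(detected)} of {len(D)} failure modes")

# Modes with identical signatures cannot be discriminated by any test
# outcome, so they land in the same ambiguity group of an isolation report.
groups = defaultdict(list)
for mode, sig in D.items():
    groups[sig].append(mode)
for sig, modes in groups.items():
    if any(sig) and len(modes) > 1:
        print("ambiguity group:", modes)
```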
Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool
NASA Technical Reports Server (NTRS)
Maul, William A.; Fulton, Christopher E.
2011-01-01
This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool proved to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation, also provided with the ETA Tool software release package, that were used to generate the reports presented in the manual.
NASA Technical Reports Server (NTRS)
Chen, Chung-Hsing
1992-01-01
In this thesis, a behavioral-level testability analysis approach is presented. This approach is based on analyzing the circuit behavioral description (similar to a C program) to estimate its testability by identifying controllable and observable circuit nodes. This information can be used by a test generator to gain better access to internal circuit nodes and to reduce its search space. The results of the testability analyzer can also be used to select test points or partial scan flip-flops in the early design phase. Based on selection criteria, a novel Synthesis for Testability approach called Test Statement Insertion (TSI) is proposed, which modifies the circuit behavioral description directly. Test Statement Insertion can also be used to modify the circuit structural description to improve its testability. As a result, the Synthesis for Testability methodology can be combined with an existing behavioral synthesis tool to produce more testable circuits.
Advanced Diagnostic and Prognostic Testbed (ADAPT) Testability Analysis Report
NASA Technical Reports Server (NTRS)
Ossenfort, John
2008-01-01
As system designs become more complex, determining the best locations to add sensors and test points for the purpose of testing and monitoring these designs becomes more difficult. Not only must the designer take into consideration all real and potential faults of the system, he or she must also find efficient ways of detecting and isolating those faults. Because sensors and cabling take up valuable space and weight on a system, and given constraints on bandwidth and power, it is even more difficult to add sensors into these complex designs after the design has been completed. As a result, a number of software tools have been developed to assist the system designer in proper placement of these sensors during the system design phase of a project. One of the key functions provided by many of these software programs is a testability analysis of the system: essentially, an evaluation of how observable the system behavior is using available tests. During the design phase, testability metrics can help guide the designer in improving the inherent testability of the design. This may include adding, removing, or modifying tests; breaking up feedback loops; or changing the system to reduce fault propagation. Given a set of test requirements, the analysis can also help to verify that the system will meet those requirements. Of course, a testability analysis requires that a software model of the physical system be available. For the analysis to be most effective in guiding system design, this model should ideally be constructed in parallel with these efforts. The purpose of this paper is to present the final testability results of the Advanced Diagnostic and Prognostic Testbed (ADAPT) after the system model was completed. The tool chosen to build the model and to perform the testability analysis is the Testability Engineering and Maintenance System Designer (TEAMS-Designer). The TEAMS toolset is intended to be a solution that spans all phases of the system life cycle, from design and development through health management and maintenance. TEAMS-Designer is the model-building and testability analysis software in that suite.
Architectural Analysis of Dynamically Reconfigurable Systems
NASA Technical Reports Server (NTRS)
Lindvall, Mikael; Godfrey, Sally; Ackermann, Chris; Ray, Arnab; Yonkwa, Lyly
2010-01-01
Topics include: the problem (increased flexibility of architectural styles decreases analyzability, behavior emerges and varies depending on the configuration, does the resulting system run according to the intended design, and architectural decisions can impede or facilitate testing); a top-down approach to architecture analysis, detection of defects and deviations, and architecture and its testability; currently targeted projects (GMSEC and CFS); analyzing software architectures; analyzing runtime events; actual architecture recognition; GMPUB in Dynamic SAVE; sample output from the new approach; taking message timing delays into account; CFS examples of architecture and testability; some recommendations for improved testability; CFS examples of abstract interfaces and testability; and a CFS example of opening some internal details.
LSI/VLSI design for testability analysis and general approach
NASA Technical Reports Server (NTRS)
Lam, A. Y.
1982-01-01
The incorporation of testability characteristics into large-scale digital designs is not only necessary for, but also pertinent to, effective device testing and the enhancement of device reliability. There are at least three major design-for-testability (DFT) techniques, namely self-checking, level-sensitive scan design (LSSD), and partitioning, each of which can be incorporated into a logic design to achieve a specific set of testability and reliability requirements. A detailed analysis of the design theory, implementation, fault coverage, hardware requirements, and application limitations of each of these techniques is also presented.
Pediatric Amblyopia Risk Investigation Study (PARIS).
Savage, Howard I; Lee, Hester H; Zaetta, Deneen; Olszowy, Ronald; Hamburger, Ellie; Weissman, Mark; Frick, Kevin
2005-12-01
To assess the learning curve, testability, and reliability of vision screening modalities administered by pediatric health extenders. Prospective masked clinical trial. Two hundred subjects aged 3 to 6 years underwent timed screening for amblyopia by physician extenders, including LEA visual acuity (LEA), stereopsis (RDE), and noncycloplegic autorefraction (NCAR). Patients returned for a comprehensive diagnostic eye examination performed by an ophthalmologist or optometrist. Average screening time was 5.4 +/- 1.6 minutes (LEA), 1.9 +/- 0.9 minutes (RDE), and 1.7 +/- 1.0 minutes (NCAR). Test time for NCAR and RDE fell by 40% during the study period. Overall testability was 92% (LEA), 96% (RDE), and 94% (NCAR). Testability among 3-year-olds was 73% (LEA), 96% (RDE), and 89% (NCAR). Reliability of LEA was moderate (r = .59). Reliability of NCAR was high for astigmatism (CYL) (r = .89), moderate for spherical equivalent (SE) (r = .66), and low for anisometropia (ANISO) (r = .38). Correlation of cycloplegic autorefraction (CAR) with gold-standard cycloplegic retinoscopic refraction (CRR) was very high for SE (.85) and CYL (.77), and moderate for ANISO (.48). With NCAR, physician extenders can quickly and reliably detect astigmatism and spherical refractive error in one-third the time it takes to obtain visual acuity. LEA has a lower initial cost, but is time consuming, moderately reliable, and more difficult for 3-year-olds. Shorter examination time and higher reliability may make NCAR a more efficient screening tool for refractive amblyopia in younger children. Future study is needed to determine the sensitivity and specificity of NCAR and other screening methods in detecting amblyopia and amblyopia risk factors.
An empirical comparison of a dynamic software testability metric to static cyclomatic complexity
NASA Technical Reports Server (NTRS)
Voas, Jeffrey M.; Miller, Keith W.; Payne, Jeffrey E.
1993-01-01
This paper compares the dynamic testability prediction technique termed 'sensitivity analysis' to the static testability technique termed cyclomatic complexity. The application that we chose in this empirical study is a CASE-generated version of a B-737 autoland system. For the B-737 system we analyzed, we isolated those functions that we predict are more prone to hide errors during system/reliability testing. We also analyzed the code with several other well-known static metrics. This paper compares and contrasts the results of sensitivity analysis to the results of the static metrics.
Testability analysis on a hydraulic system in a certain equipment based on simulation model
NASA Astrophysics Data System (ADS)
Zhang, Rui; Cong, Hua; Liu, Yuanhong; Feng, Fuzhou
2018-03-01
To address the complicated structure of hydraulic systems and the shortage of fault statistics information, a multi-valued testability analysis method based on a simulation model is proposed. Based on an AMESim simulation model, this method injects simulated faults and records the variation of test parameters, such as pressure and flow rate, at each test point compared with those under normal conditions. Thus a multi-valued fault-test dependency matrix is established. Then the fault detection rate (FDR) and fault isolation rate (FIR) are calculated based on the dependency matrix. Finally, the testability and fault diagnosis capability of the system are analyzed and evaluated; they reach only 54% (FDR) and 23% (FIR). To improve the testability performance of the system, the number and position of the test points are then optimized. Results show the proposed test placement scheme can address the difficulty, inefficiency, and high cost of system maintenance.
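As a minimal sketch of how FDR and FIR fall out of a dependency matrix (the paper's multi-valued matrix and any failure-rate weighting are richer; the binary matrix and numbers below are illustrative only):

```python
import numpy as np

# Illustrative fault-test dependency matrix: rows are faults, columns
# are test points; a 1 means the test responds to the fault.
D = np.array([
    [1, 0, 0],
    [1, 0, 0],   # indistinguishable from fault 0
    [0, 1, 0],
    [0, 0, 0],   # never detected
    [0, 0, 0],
])

detected = D.any(axis=1)
fdr = detected.mean()                         # fault detection rate

# A detected fault is isolable if no other fault shares its signature.
sigs = [tuple(row) for row in D]
isolable = [d and sigs.count(s) == 1 for d, s in zip(detected, sigs)]
fir = sum(isolable) / max(detected.sum(), 1)  # fault isolation rate
print(f"FDR = {fdr:.0%}, FIR = {fir:.0%}")
```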
Cai, Zuowei; Huang, Lihong; Zhang, Lingling
2015-05-01
This paper investigates the problem of exponential synchronization of time-varying delayed neural networks with discontinuous neuron activations. Under the extended Filippov differential inclusion framework, by designing a discontinuous state-feedback controller and using some analytic techniques, new testable algebraic criteria are obtained to realize two different kinds of global exponential synchronization of the drive-response system. Moreover, we give the estimated rate of exponential synchronization, which depends on the delays and system parameters. The obtained results extend some previous works on synchronization of delayed neural networks with both continuous and discontinuous activations. Finally, numerical examples are provided to show the correctness of our analysis via computer simulations. Our method and theoretical results are of leading significance for the design of synchronized neural network circuits involving discontinuous factors and time-varying delays.
Multidisciplinary analysis and design of printed wiring boards
NASA Astrophysics Data System (ADS)
Fulton, Robert E.; Hughes, Joseph L.; Scott, Waymond R., Jr.; Umeagukwu, Charles; Yeh, Chao-Pin
1991-04-01
Modern printed wiring board design depends on electronic prototyping using computer-based simulation and design tools. Existing electrical computer-aided design (ECAD) tools emphasize circuit connectivity with only rudimentary analysis capabilities. This paper describes a prototype integrated PWB design environment denoted Thermal Structural Electromagnetic Testability (TSET) being developed at Georgia Tech in collaboration with companies in the electronics industry. TSET provides design guidance based on enhanced electrical and mechanical CAD capabilities, including electromagnetic modeling, testability analysis, thermal management, and solid mechanics analysis. TSET development is based on a strong analytical and theoretical science base and incorporates an integrated information framework and a common database design based on a systematic, structured methodology.
Lee, Joy L; DeCamp, Matthew; Dredze, Mark; Chisolm, Margaret S; Berger, Zackary D
2014-10-15
Twitter is home to many health professionals who send messages about a variety of health-related topics. Amid concerns about physicians posting inappropriate content online, more in-depth knowledge about these messages is needed to understand health professionals' behavior on Twitter. Our goal was to characterize the content of Twitter messages, specifically focusing on health professionals and their tweets relating to health. We performed an in-depth content analysis of 700 tweets. Qualitative content analysis was conducted on tweets by health users on Twitter. The primary objective was to describe the general type of content (ie, health-related versus non-health related) on Twitter authored by health professionals and further to describe health-related tweets on the basis of the type of statement made. Specific attention was given to whether a tweet was personal (as opposed to professional) or made a claim that users would expect to be supported by some level of medical evidence (ie, a "testable" claim). A secondary objective was to compare content types among different users, including patients, physicians, nurses, health care organizations, and others. Health-related users are posting a wide range of content on Twitter. Among health-related tweets, 53.2% (184/346) contained a testable claim. Of health-related tweets by providers, 17.6% (61/346) were personal in nature; 61% (59/96) made testable statements. While organizations and businesses use Twitter to promote their services and products, patient advocates are using this tool to share their personal experiences with health. Twitter users in health-related fields tweet about both testable claims and personal experiences. Future work should assess the relationship between testable tweets and the actual level of evidence supporting them, including how Twitter users-especially patients-interpret the content of tweets posted by health providers.
Zee-Babu type model with U(1)_{Lμ-Lτ} gauge symmetry
NASA Astrophysics Data System (ADS)
Nomura, Takaaki; Okada, Hiroshi
2018-05-01
We extend the Zee-Babu model, introducing a local U(1)_{Lμ-Lτ} symmetry with several singly charged bosons. We find a predictive neutrino mass texture under a simple hypothesis in which mixings among the singly charged bosons are negligible. Also, lepton-flavor violations are less constrained compared with the original model. We then explore the testability of the model, focusing on doubly charged boson physics at the LHC and the International Linear Collider.
Model-Based Testability Assessment and Directed Troubleshooting of Shuttle Wiring Systems
NASA Technical Reports Server (NTRS)
Deb, Somnath; Domagala, Chuck; Shrestha, Roshan; Malepati, Venkatesh; Cavanaugh, Kevin; Patterson-Hine, Ann; Sanderfer, Dwight; Cockrell, Jim; Norvig, Peter (Technical Monitor)
2000-01-01
We have recently completed a pilot study on the Space Shuttle wiring system commissioned by the Wiring Integrity Research (WIRe) team at NASA Ames Research Center. As the Space Shuttle ages, it is experiencing wiring degradation problems including arcing, chafing, insulation breakdown, and broken conductors. A systematic and comprehensive test process is required to thoroughly test and quality assure (QA) the wiring systems. The NASA WIRe team recognized the value of a formal model-based analysis for risk assessment and fault coverage analysis. However, wiring systems are complex and involve over 50,000 wire segments. Therefore, NASA commissioned this pilot study with Qualtech Systems, Inc. (QSI) to explore means of automatically extracting high-fidelity multi-signal models from a wiring information database for use with QSI's Testability Engineering and Maintenance System (TEAMS) tool.
Higher-order Fourier analysis over finite fields and applications
NASA Astrophysics Data System (ADS)
Hatami, Pooya
Higher-order Fourier analysis is a powerful tool in the study of problems in additive and extremal combinatorics, for instance the study of arithmetic progressions in primes, where traditional Fourier analysis falls short. In recent years, higher-order Fourier analysis has found multiple applications in computer science in fields such as property testing and coding theory. In this thesis, we develop new tools within this theory with several new applications, such as a characterization theorem in algebraic property testing. One of our main contributions is a strong near-equidistribution result for regular collections of polynomials. The densities of small linear structures in subsets of Abelian groups can be expressed as certain analytic averages involving linear forms. Higher-order Fourier analysis examines such averages by approximating the indicator function of a subset by a function of a bounded number of polynomials. Then, to approximate the average, it suffices to know the joint distribution of the polynomials applied to the linear forms. We prove a near-equidistribution theorem that describes these distributions for the group F_p^n when p is a fixed prime. This fundamental fact was previously known only under various extra assumptions about the linear forms or the field size. We use this near-equidistribution theorem to settle a conjecture of Gowers and Wolf on the true complexity of systems of linear forms. Our next application is towards a characterization of testable algebraic properties. We prove that every locally characterized affine-invariant property of functions f : F_p^n → R with n ∈ N is testable. In fact, we prove that any such property P is proximity-obliviously testable. More generally, we show that any affine-invariant property that is closed under subspace restrictions and has "bounded complexity" is testable. We also prove that any property that can be described as the property of decomposing into a known structure of low-degree polynomials is locally characterized and is, hence, testable. We discuss several notions of regularity which allow us to deduce algorithmic versions of various regularity lemmas for polynomials by Green and Tao and by Kaufman and Lovett. We show that our algorithmic regularity lemmas for polynomials imply algorithmic versions of several results relying on regularity, such as decoding Reed-Muller codes beyond the list decoding radius (for certain structured errors), and prescribed polynomial decompositions. Finally, motivated by the definition of Gowers norms, we investigate norms defined by different systems of linear forms. We give necessary conditions on the structure of systems of linear forms that define norms. We prove that such norms can be one of only two types and, assuming that the field size is sufficiently large, they are essentially equivalent to either a Gowers norm or an L_p norm.
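For orientation, the Gowers uniformity norm mentioned in the closing paragraph has the following standard definition (a textbook formula, not a result specific to this thesis):

```latex
\[
\|f\|_{U^k}^{2^k}
  = \mathbb{E}_{x,\,h_1,\dots,h_k \in \mathbb{F}_p^n}
    \prod_{\omega \in \{0,1\}^k}
    \mathcal{C}^{|\omega|} f\bigl(x + \omega_1 h_1 + \cdots + \omega_k h_k\bigr),
\]
```

where f : F_p^n → C is bounded, C denotes complex conjugation, and |ω| = ω_1 + ... + ω_k.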
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vesselinova, Neda; Alexandrov, Boian; Wall, Michael E.
2016-11-08
We present a dynamical model of drug accumulation in bacteria. The model captures key features in experimental time courses on ofloxacin accumulation: initial uptake; two-phase response; and long-term acclimation. In combination with experimental data, the model provides estimates of import and export rates in each phase, the time of entry into the second phase, and the decrease of internal drug during acclimation. Global sensitivity analysis, local sensitivity analysis, and Bayesian sensitivity analysis of the model provide information about the robustness of these estimates, and about the relative importance of different parameters in determining the features of the accumulation time courses in three different bacterial species: Escherichia coli, Staphylococcus aureus, and Pseudomonas aeruginosa. The results lead to experimentally testable predictions of the effects of membrane permeability, drug efflux and trapping (e.g., by DNA binding) on drug accumulation. A key prediction is that a sudden increase in ofloxacin accumulation in both E. coli and S. aureus is accompanied by a decrease in membrane permeability.
Boolean Minimization and Algebraic Factorization Procedures for Fully Testable Sequential Machines
Devadas, Srinivas; Keutzer, Kurt
1989-09-01
A Unified Approach to the Synthesis of Fully Testable Sequential Machines
Devadas, Srinivas; Keutzer, Kurt
1989-10-01
Factors That Affect Software Testability
NASA Technical Reports Server (NTRS)
Voas, Jeffrey M.
1991-01-01
Software faults that infrequently affect software's output are dangerous. When a software fault causes frequent software failures, testing is likely to reveal the fault before the software is released; when the fault remains undetected during testing, it can cause disaster after the software is installed. A technique for predicting whether a particular piece of software is likely to reveal faults within itself during testing is found in [Voas91b]. A piece of software that is likely to reveal faults within itself during testing is said to have high testability. A piece of software that is not likely to reveal faults within itself during testing is said to have low testability. It is preferable to design software with higher testability from the outset, i.e., to create software with as high a degree of testability as possible, to avoid the problems of having undetected faults that are associated with low testability. Information loss is a phenomenon that occurs during program execution that increases the likelihood that a fault will remain undetected. In this paper, I identify two broad classes of information loss, define them, and suggest ways of predicting the potential for information loss to occur. I do this in order to decrease the likelihood that faults will remain undetected during testing.
Broken SU(3) antidecuplet for Θ^+ and Ξ_{3/2}
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pakvasa, Sandip; Suzuki, Mahiko
2004-05-05
If the narrow exotic baryon resonances Θ^+(1540) and Ξ_{3/2} are members of the J^P = 1/2^+ antidecuplet with N*(1710), the octet-antidecuplet mixing is required not only by the mass spectrum but also by the decay pattern of N*(1710). This casts doubt on the validity of the Θ^+ mass prediction by the chiral soliton model. While all pieces of the existing experimental information point to a small octet-antidecuplet mixing, the magnitude of mixing required by the mass spectrum is not consistent with the value needed to account for the hadronic decay rates. The discrepancy is not resolved even after the large experimental uncertainty is taken into consideration. We fail to find an alternative SU(3) assignment even with a different spin-parity assignment. When we extend the analysis to mixing with a higher SU(3) multiplet, we find one experimentally testable scenario in the case of mixing with a 27-plet.
Minimal realization of right-handed gauge symmetry
NASA Astrophysics Data System (ADS)
Nomura, Takaaki; Okada, Hiroshi
2018-01-01
We propose a minimally extended gauge symmetry model with U(1)_R, where only the right-handed fermions have nonzero charges in the fermion sector. To achieve both anomaly cancellation and minimality, three right-handed neutrinos are naturally required, and the standard model Higgs has to have a nonzero charge under this symmetry. We then find that its breaking scale (Λ) is restricted by precision measurements of the neutral gauge boson in the standard model; therefore, O(10) TeV ≲ Λ. We also discuss the testability of the new gauge boson and the discrimination of the U(1)_R model from the U(1)_{B-L} one at colliders such as the LHC and ILC.
Module generation for self-testing integrated systems
NASA Astrophysics Data System (ADS)
Vanriessen, Ronald Pieter
Hardware used for self-test in VLSI (Very Large Scale Integrated) systems is reviewed, and an architecture to control the test hardware in an integrated system is presented. Because of the increase in test times, the use of self-test techniques has become practically and economically viable for VLSI systems. Besides the reduction in test times and costs, self-test also provides testing at operational speeds. Therefore, a suitable combination of scan path and macro-specific (self) tests is required to reduce test times and costs. An expert system that can be used in a silicon compilation environment is presented. The approach requires a minimum of testability knowledge from a system designer. A user-friendly interface is described for specifying and modifying testability requirements by a testability expert. A reason-directed backtracking mechanism is used to solve selection failures. Both the hierarchical testable architecture and the design-for-testability expert system are used in a self-test compiler. A self-test compiler is a software tool that selects an appropriate test method for every macro in a design; the hardware to control a macro test is included in the design automatically. As an example, the integration of the self-test compiler in the silicon compilation system PIRAMID is described, along with the design of a demonstrator circuit by the self-test compiler. This circuit consists of two self-testable macros. Control of the self-test hardware is carried out via the test access port of the boundary scan standard.
E-learning platform for automated testing of electronic circuits using signature analysis method
NASA Astrophysics Data System (ADS)
Gherghina, Cǎtǎlina; Bacivarov, Angelica; Bacivarov, Ioan C.; Petricǎ, Gabriel
2016-12-01
Dependability of electronic circuits can be ensured only through testing of circuit modules. This is done by generating test vectors and applying them to the circuit. Testability should be viewed as a concerted effort to ensure maximum efficiency throughout the product life cycle, from the conception and design stage, through production, to repairs during the product's operation. This paper presents the platform developed by the authors for testability training in electronics in general, and in using the signature analysis method in particular. The platform highlights the two approaches in the field, namely analog and digital circuit signatures. As part of this e-learning platform, a database of signatures of different electronic components has been developed, meant to put into the spotlight different fault-detection techniques and, building on these, self-repairing techniques for systems built from such components. An approach to realizing self-testing circuits based on the MATLAB environment and using the signature analysis method is proposed. The paper also analyses the benefits of the signature analysis method and simulates signature analyzer performance based on the use of pseudo-random sequences.
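In signature analysis, a long test-response stream is compacted by a linear-feedback shift register into a short signature that is compared against a known-good reference. A minimal sketch of that idea follows; the CRC-CCITT polynomial is an illustrative choice, not the polynomial used by the platform described here.

```python
def signature(bits, poly=0x1021, width=16):
    # Serial-input signature register: shift each response bit in and
    # apply the feedback polynomial, compacting the whole stream into
    # a 'width'-bit signature (CRC-CCITT polynomial as an example).
    mask = (1 << width) - 1
    state = 0
    for b in bits:
        feedback = (state >> (width - 1)) ^ b
        state = (state << 1) & mask
        if feedback:
            state ^= poly
    return state

good = [1, 0, 1, 1, 0, 0, 1, 0] * 8        # response of a known-good circuit
faulty = list(good)
faulty[13] ^= 1                             # single stuck bit in the response
print(hex(signature(good)), hex(signature(faulty)))  # signatures differ
```

Any single-bit error in the response stream is guaranteed to change the signature; longer error patterns are missed only with probability about 2^-width.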
Using Dynamic Sensitivity Analysis to Assess Testability
NASA Technical Reports Server (NTRS)
Voas, Jeffrey; Morell, Larry; Miller, Keith
1990-01-01
This paper discusses sensitivity analysis and its relationship to random black box testing. Sensitivity analysis estimates the impact that a programming fault at a particular location would have on the program's input/output behavior. Locations that are relatively "insensitive" to faults can render random black box testing unlikely to uncover programming faults. Therefore, sensitivity analysis gives new insight when interpreting random black box testing results. Although sensitivity analysis is computationally intensive, it requires no oracle and no human intervention.
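The flavor of the estimate can be shown in a few lines: perturb the program state at one location over many random inputs and count how often the output changes. This toy sketch (all names hypothetical, not the authors' tool) also shows how an output that discards information, such as integer division, masks most perturbations and yields low sensitivity:

```python
import random

def program(x, perturb=False):
    y = x * x
    if perturb:
        y += 1              # simulated fault: perturb the state here
    return y // 50          # coarse output: division masks small changes

# Propagation estimate under random black-box inputs: the fraction of
# executions in which the perturbation reaches the output.
trials = 10_000
changed = 0
for _ in range(trials):
    x = random.randint(-20, 20)
    changed += program(x) != program(x, perturb=True)
print(f"estimated sensitivity at the location: {changed / trials:.3f}")
```

A location with sensitivity near zero can hide a real fault from random testing indefinitely, which is exactly why such estimates reframe what a clean test run means.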
Adolescent Pregnancy and Its Delay.
ERIC Educational Resources Information Center
Bell, Lloyd H.
This paper examines some probable reasons for the black adolescent male's contribution to increased pregnancy in the black community. Using a situation analysis, it presents the following testable suppositions: (1) black males' fear of retribution for impregnating a girl has diminished, leading to increased sexual intercourse and ultimately to…
Refinement of Representation Theorems for Context-Free Languages
NASA Astrophysics Data System (ADS)
Fujioka, Kaoru
In this paper, we obtain refinements of representation theorems for context-free languages by using Dyck languages, insertion systems, strictly locally testable languages, and morphisms. For instance, we improve the Chomsky-Schützenberger representation theorem and show that each context-free language L can be represented in the form L = h(D ∩ R), where D is a Dyck language, R is a strictly 3-testable language, and h is a morphism. A similar representation for context-free languages can be obtained using insertion systems of weight (3, 0) and strictly 4-testable languages.
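A strictly k-testable language is specified entirely by its allowed prefixes, suffixes, and length-k factors, so membership is a sliding-window check. The sketch below illustrates that definition (conventions for words shorter than k vary across the literature; the sets here are made up for the example):

```python
def in_strictly_k_testable(w, k, prefixes, suffixes, interiors):
    """Membership in a strictly k-testable language, given the allowed
    (k-1)-length prefixes/suffixes and k-length interior factors.
    Short words are handled by direct listing (an assumption; the
    literature uses slightly different conventions)."""
    if len(w) < k:
        return w in prefixes or w in suffixes
    return (w[:k - 1] in prefixes
            and w[-(k - 1):] in suffixes
            and all(w[i:i + k] in interiors for i in range(len(w) - k + 1)))

# Example: strings over {a, b} that start with 'aa', end with 'bb',
# and never contain the factor 'bab' (k = 3; sets are illustrative).
prefixes = {"aa"}
suffixes = {"bb"}
interiors = {"aaa", "aab", "aba", "abb", "bba", "bbb"}  # no "bab"
print(in_strictly_k_testable("aabbb", 3, prefixes, suffixes, interiors))
```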
Toward a Testable Developmental Model of Pedophilia: The Development of Erotic Age Preference.
ERIC Educational Resources Information Center
Freund, Kurt; Kuban, Michael
1993-01-01
Analysis of retrospective self-reports about childhood curiosity to see persons in the nude, with heterosexual and homosexual pedophiles, gynephiles, and androphiles, suggests that establishment of erotic sex preference preceded that of age preference, and that a greater proportion of pedophiles than gynephiles or androphiles remembered childhood…
Neutrino mass, dark matter, and baryon asymmetry via TeV-scale physics without fine-tuning.
Aoki, Mayumi; Kanemura, Shinya; Seto, Osamu
2009-02-06
We propose an extended version of the standard model, in which neutrino oscillation, dark matter, and the baryon asymmetry of the Universe can be simultaneously explained by the TeV-scale physics without assuming a large hierarchy among the mass scales. Tiny neutrino masses are generated at the three-loop level due to the exact Z2 symmetry, by which the stability of the dark matter candidate is guaranteed. The extra Higgs doublet is required not only for the tiny neutrino masses but also for successful electroweak baryogenesis. The model provides discriminative predictions especially in Higgs phenomenology, so that it is testable at current and future collider experiments.
Software Tools to Support the Assessment of System Health
NASA Technical Reports Server (NTRS)
Melcher, Kevin J.
2013-01-01
This presentation provides an overview of three software tools that were developed by the NASA Glenn Research Center to support the assessment of system health: the Propulsion Diagnostic Method Evaluation Strategy (ProDiMES), the Systematic Sensor Selection Strategy (S4), and the Extended Testability Analysis (ETA) Tool. Originally developed to support specific NASA projects in aeronautics and space, these software tools are currently available to U.S. citizens through the NASA Glenn Software Catalog. The ProDiMES software tool was developed to support a uniform comparison of propulsion gas path diagnostic methods. Methods published in the open literature are typically applied to dissimilar platforms with different levels of complexity. They often address different diagnostic problems and use inconsistent metrics for evaluating performance. As a result, it is difficult to perform a one-to-one comparison of the various diagnostic methods. ProDiMES solves this problem by serving as a theme problem to aid in propulsion gas path diagnostic technology development and evaluation. The overall goal is to provide a tool that will serve as an industry standard, and will truly facilitate the development and evaluation of significant Engine Health Management (EHM) capabilities. ProDiMES has been developed under a collaborative project of The Technical Cooperation Program (TTCP) based on feedback provided by individuals within the aircraft engine health management community. The S4 software tool provides a framework that supports the optimal selection of sensors for health management assessments. S4 is structured to accommodate user-defined applications, diagnostic systems, search techniques, and system requirements/constraints. It identifies one or more sensor suites that maximize diagnostic performance while meeting other user-defined system requirements. S4 provides a systematic approach for evaluating combinations of sensors to determine the set or sets of sensors that optimally meet the performance goals and the constraints. It identifies optimal sensor suite solutions by utilizing a merit (i.e., cost) function with one of several available optimization approaches. As part of its analysis, S4 can expose fault conditions that are difficult to diagnose due to an incomplete diagnostic philosophy and/or a lack of sensors. S4 was originally developed and applied to liquid rocket engines. It was subsequently used to study the optimized selection of sensors for a simulation-based aircraft engine diagnostic system. The ETA Tool is a software-based analysis tool that augments the testability analysis and reporting capabilities of a commercial-off-the-shelf (COTS) package. An initial diagnostic assessment is performed by the COTS software using a user-developed, qualitative, directed-graph model of the system being analyzed. The ETA Tool accesses system design information captured within the model and the associated testability analysis output to create a series of six reports for various system engineering needs. These reports are highlighted in the presentation. The ETA Tool was developed by NASA to support the verification of fault management requirements early in the launch vehicle design process. Due to their early development during the design process, the TEAMS-based diagnostic model and the ETA Tool were able to positively influence the system design by highlighting gaps in failure detection, fault isolation, and failure recovery.
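S4 pairs a merit (cost) function with a search strategy over candidate sensor suites. As a toy stand-in for that idea (S4's actual merit functions and search techniques are more general, and every name below is hypothetical), a greedy pass that maximizes new fault coverage per added sensor looks like this:

```python
def greedy_sensor_selection(coverage, budget):
    # coverage: dict mapping sensor name -> set of detectable fault modes.
    # Greedy maximum-coverage selection under a sensor-count budget.
    chosen, covered = [], set()
    while len(chosen) < budget:
        best = max((s for s in coverage if s not in chosen),
                   key=lambda s: len(coverage[s] - covered), default=None)
        if best is None or not coverage[best] - covered:
            break                       # no remaining sensor adds detection
        chosen.append(best)
        covered |= coverage[best]
    return chosen, covered

coverage = {"p1": {"f1", "f2"}, "t1": {"f2", "f3"}, "v1": {"f4"}}
print(greedy_sensor_selection(coverage, budget=2))
```

A real merit function would also weight failure rates, sensor mass, power, and isolation requirements, which is why S4 exposes the search technique as a user-selectable component.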
Guimaraes, Sandra; Fernandes, Tiago; Costa, Patrício; Silva, Eduardo
2018-06-01
To determine normative values for the tumbling E optotype and its feasibility for visual acuity (VA) assessment in children aged 3-4 years. A cross-sectional study of 1756 children who were invited to participate in a comprehensive non-invasive eye exam. Uncorrected monocular VA with the crowded tumbling E was assessed, together with a comprehensive ophthalmological examination. Testability rates of the whole population, and VA of the healthy children for different age subgroups, gender, school type, and the order of testing in which the ophthalmological examination was performed, were evaluated. The overall testability rate was 95% (92% and 98% for children aged 3 and 4 years, respectively). The mean VA of the first-day assessment (first-VA) and the best VA over 2 days' assessments (best-VA) were 0.14 logMAR (95% CI 0.14 to 0.15) (decimal=0.72, 95% CI 0.71 to 0.73) and 0.13 logMAR (95% CI 0.13 to 0.14) (decimal=0.74, 95% CI 0.73 to 0.74). Analysis with age showed differences between groups in first-VA (F(3,1146)=10.0; p<0.001; η2=0.026) and best-VA (F(3,1155)=8.8; p<0.001; η2=0.022). Our normative values were very highly correlated with the previously reported HOTV Amblyopia Treatment Study (HOTV-ATS) values (first-VA, r=0.97; best-VA, r=0.99), with a consistent 0.7 to 0.8 line overestimation for HOTV-ATS, as described in the literature. Overall false-positive referral was 1.3%, and was especially low for anisometropia of ≥2 logMAR lines (0.17%). Interocular difference of ≥1 logMAR line of VA was not associated with age (p=0.195). This is the first normative study for European Caucasian children with the single crowded tumbling E in healthy eyes and the largest study comparing testability at 3 and 4 years of age. Testability rates are higher than those found in the literature with other optotypes, especially in children aged 3 years, where we found 5%-11% better testability rates.
USDA-ARS?s Scientific Manuscript database
Progress in studying the biology of Trichinella spp. was greatly advanced with the publication and analysis of the draft genome sequence of T. spiralis. Those data provide a basis for constructing testable hypotheses concerning parasite physiology, immunology, and genetics. They also provide tools...
Generating Testable Questions in the Science Classroom: The BDC Model
ERIC Educational Resources Information Center
Tseng, ChingMei; Chen, Shu-Bi; Chang, Wen-Hua
2015-01-01
Guiding students to generate testable scientific questions is essential in the inquiry classroom, but it is not easy. The purpose of the BDC ("Big Idea, Divergent Thinking, and Convergent Thinking") instructional model is to scaffold students' inquiry learning. We illustrate the use of this model with an example lesson, designed…
Easily Testable PLA-Based Finite State Machines
1989-03-01
Eye Examination Testability in Children with Autism and in Typical Peers
Coulter, Rachel Anastasia; Bade, Annette; Tea, Yin; Fecho, Gregory; Amster, Deborah; Jenewein, Erin; Rodena, Jacqueline; Lyons, Kara Kelley; Mitchell, G. Lynn; Quint, Nicole; Dunbar, Sandra; Ricamato, Michele; Trocchio, Jennie; Kabat, Bonnie; Garcia, Chantel; Radik, Irina
2015-01-01
Purpose: To compare testability of vision and eye tests in an examination protocol of 9- to 17-year-old patients with autism spectrum disorder (ASD) to typically developing (TD) peers. Methods: In a prospective pilot study, 61 children and adolescents (34 with ASD and 27 who were TD) aged 9 to 17 years completed an eye examination protocol including tests of visual acuity, refraction, convergence (eye teaming), stereoacuity (depth perception), ocular motility, and ocular health. Patients who required new refractive correction were retested after wearing their updated spectacle prescription for 1 month. The specialized protocol incorporated visual, sensory, and communication supports. A psychologist determined group status/eligibility using DSM-IV-TR (Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition, Text Revision) criteria by review of previous evaluations and parent responses on the Social Communication Questionnaire. Before the examination, parents provided information regarding patients' sex, race, ethnicity, and, for ASD patients, verbal communication level (nonverbal, uses short words, verbal). Parents indicated whether the patient wore a refractive correction, whether the patient had ever had an eye examination, and the age at the last examination. Chi-square tests compared testability results for TD and ASD groups. Results: Typically developing and ASD groups did not differ by age (p = 0.54), sex (p = 0.53), or ethnicity (p = 0.22). Testability was high on most tests (TD, 100%; ASD, 88 to 100%), except for intraocular pressure (IOP), which was reduced for both the ASD (71%) and the TD (89%) patients. Among ASD patients, IOP testability varied greatly with verbal communication level (p < 0.001). Although IOP measurements were completed on all verbal patients, only 37.5% of nonverbal and 44.4% of ASD patients who used short words were successful. Conclusions: Patients with ASD can complete most vision and eye tests within an examination protocol. Testability of IOPs is reduced, particularly for nonverbal patients and patients who use short words to communicate. PMID:25415280
JACOB: an enterprise framework for computational chemistry.
Waller, Mark P; Dresselhaus, Thomas; Yang, Jack
2013-06-15
Here, we present just a collection of beans (JACOB): an integrated batch-based framework designed for the rapid development of computational chemistry applications. The framework expedites developer productivity by handling the generic infrastructure tier, and can be easily extended by user-specific scientific code. Paradigms from enterprise software engineering were rigorously applied to create a scalable, testable, secure, and robust framework. A centralized web application is used to configure and control the operation of the framework. The application-programming interface provides a set of generic tools for processing large-scale noninteractive jobs (e.g., systematic studies), or for coordinating systems integration (e.g., complex workflows). The code for the JACOB framework is open sourced and is available at: www.wallerlab.org/jacob.
Majorana dark matter with B+L gauge symmetry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chao, Wei; Guo, Huai-Ke; Zhang, Yongchao
2017-04-07
Here, we present a new model that extends the Standard Model (SM) with the local B + L symmetry, and point out that the lightest new fermion, introduced to cancel anomalies and stabilized automatically by the B + L symmetry, can serve as the cold dark matter candidate. We also study constraints on the model from Higgs measurements, electroweak precision measurements, as well as the relic density and direct detections of the dark matter. Our numerical results reveal that the pseudo-vector coupling of the dark matter with Z and the Yukawa coupling with the SM Higgs are highly constrained by the latest results of LUX, while there is viable parameter space that could satisfy all the constraints and give testable predictions.
A Framework for Evidence-Based Licensure of Adaptive Autonomous Systems
2016-03-01
insights gleaned to DoD. The autonomy community has identified significant challenges associated with test, evaluation, verification, and validation of...licensure as a test, evaluation, verification, and validation (TEVV) framework that can address these challenges. IDA found that traditional...language requirements to testable (preferably machine-testable) specifications • Design of architectures that treat development and verification of
Exploring Operational Test and Evaluation of Unmanned Aircraft Systems: A Qualitative Case Study
NASA Astrophysics Data System (ADS)
Saliceti, Jose A.
The purpose of this qualitative case study was to explore and identify strategies that may potentially remedy operational test and evaluation procedures used to evaluate Unmanned Aircraft Systems (UAS) technology. The sample for analysis consisted of organizations testing and evaluating UASs (e.g., U.S. Air Force, U.S. Navy, U.S. Army, U.S. Marine Corps, U.S. Coast Guard, and Customs Border Protection). A purposeful sampling technique was used to select 15 subject matter experts in the field of operational test and evaluation of UASs. A questionnaire was provided to participants to construct a descriptive and robust study. Analysis of responses revealed themes related to each research question. Findings revealed that operational testers utilized requirements documents to extrapolate measures for testing UAS technology and develop critical operational issues. The requirements documents were (a) developed without the contribution of stakeholders and operational testers, (b) developed with vague or unrealistic measures, and (c) developed without a systematic method to derive requirements from mission tasks. Four approaches are recommended to develop testable operational requirements and assist operational testers: (a) use a mission task analysis tool to derive requirements for mission essential tasks for the system, (b) exercise collaboration among stakeholders and testers to ensure testable operational requirements based on mission tasks, (c) ensure testable measures are used in requirements documents, and (d) create a repository list of critical operational issues by mission areas. The preparation of operational test and evaluation processes for UAS technology is not uniform across testers. The processes in place are not standardized; thus, test plan preparation and reporting differ among participants. A standard method should be used when preparing and reporting on UAS technology tests. Using a systematic process, such as mission-based test design, resonated among participants as an analytical method to link UAS mission tasks and measures of performance to the capabilities of the system under test when developing operational test plans. Further research should examine system engineering designs for a system requirements traceability matrix of mission tasks and subtasks, using an analysis tool that adequately evaluates UASs with an acceptable level of confidence in the results.
Design for testability and diagnosis at the system-level
NASA Technical Reports Server (NTRS)
Simpson, William R.; Sheppard, John W.
1993-01-01
The growing complexity of full-scale systems has surpassed the capabilities of most simulation software to provide detailed models or gate-level failure analyses. The process of system-level diagnosis approaches the fault-isolation problem in a manner that differs significantly from the traditional and exhaustive failure mode search. System-level diagnosis is based on a functional representation of the system. For example, one can exercise one portion of a radar algorithm (the Fast Fourier Transform (FFT) function) by injecting several standard input patterns and comparing the results to standardized output results. An anomalous output would point to one of several items (including the FFT circuit) without specifying the gate or failure mode. For system-level repair, identifying an anomalous chip is sufficient. We describe here an information-theoretic and dependency-modeling approach that discards much of the detailed physical knowledge about the system and analyzes its information flow and functional interrelationships. The approach relies on group and flow associations and, as such, is hierarchical. Its hierarchical nature allows the approach to be applicable to any level of complexity and to any repair level. This approach has been incorporated in a product called STAMP (System Testability and Maintenance Program), which was developed and refined through more than 10 years of field-level applications to complex system diagnosis. The results have been outstanding, even spectacular in some cases. In this paper we describe system-level testability, system-level diagnosis, and the STAMP analysis approach, as well as a few STAMP applications.
Authors’ response: mirror neurons: tests and testability.
Catmur, Caroline; Press, Clare; Cook, Richard; Bird, Geoffrey; Heyes, Cecilia
2014-04-01
Commentators have tended to focus on the conceptual framework of our article, the contrast between genetic and associative accounts of mirror neurons, and to challenge it with additional possibilities rather than empirical data. This makes the empirically focused comments especially valuable. The mirror neuron debate is replete with ideas; what it needs now are system-level theories and careful experiments – tests and testability.
ERIC Educational Resources Information Center
Booker, Lucille M.
2012-01-01
Political discourse is an observable, measurable, and testable manifestation of political worldviews. However, when worldviews collide, notions of truth and of lies are put to the test. The challenge for researchers is how to establish confidence in their analysis. Despite the growing interest in deception research from a diversity of fields and…
Artificial Intelligence Applications to Testability.
1984-10-01
Perfetto, Ralph; Woodside, Arch G
2009-09-01
The present study informs understanding of customer segmentation strategies by extending Twedt's heavy-half propositions to include a segment of users that represents less than 2% of all households: consumers demonstrating extremely frequent behavior (EFB). Extremely frequent behavior (EFB) theory provides testable propositions relating to the observation that a few (2%) consumers in many product and service categories constitute more than 25% of the frequency of product or service use. Using casino gambling as an example for testing EFB theory, an analysis of national survey data shows that extremely frequent casino gamblers do exist and that less than 2% of all casino gamblers are responsible for nearly 25% of all casino gambling usage. Approximately 14% of extremely frequent casino users have very low household income, suggesting somewhat paradoxical consumption patterns (where do very low-income users find the money to gamble so frequently?). Understanding the differences among light, heavy, and extreme users and non-users can help marketers and policymakers identify and exploit "blue ocean" opportunities (Kim and Mauborgne, Blue ocean strategy, Harvard Business School Press, Boston, 2005), for example, creating effective strategies to convert extreme users into non-users or non-users into new users.
Harris, Jenine K; Erwin, Paul C; Smith, Carson; Brownson, Ross C
2015-01-01
Evidence-based decision making (EBDM) is the process, in local health departments (LHDs) and other settings, of translating the best available scientific evidence into practice. Local health departments are more likely to be successful if they use evidence-based strategies. However, EBDM and use of evidence-based strategies by LHDs are not widespread. Drawing on diffusion of innovations theory, we sought to understand how LHD directors and program managers perceive the relative advantage, compatibility, simplicity, and testability of EBDM. Directors and managers of programs in chronic disease, environmental health, and infectious disease from LHDs nationwide completed a survey including demographic information and questions about diffusion attributes (advantage, compatibility, simplicity, and testability) related to EBDM. Bivariate inferential tests were used to compare responses between directors and managers and to examine associations between participant characteristics and diffusion attributes. Relative advantage and compatibility scores were high for directors and managers, whereas simplicity and testability scores were lower. Although health department directors and managers of programs in chronic disease generally had higher scores than other groups, there were few significant or large differences between directors and managers across the diffusion attributes. Larger jurisdiction population size was associated with higher relative advantage and compatibility scores for both directors and managers. Overall, directors and managers were in strong agreement on the relative advantage of an LHD using EBDM, with directors in stronger agreement than managers. Perceived relative advantage has been demonstrated to be the most important factor in the rate of innovation adoption, suggesting an opportunity for directors to speed EBDM adoption. However, lower average scores across all groups for simplicity and testability may be hindering EBDM adoption. Recommended strategies for increasing perceived EBDM simplicity and testability are provided.
NASA Astrophysics Data System (ADS)
Chun, E. J.; Cvetič, G.; Dev, P. S. B.; Drewes, M.; Fong, C. S.; Garbrecht, B.; Hambye, T.; Harz, J.; Hernández, P.; Kim, C. S.; Molinaro, E.; Nardi, E.; Racker, J.; Rius, N.; Zamora-Saa, J.
2018-02-01
The focus of this paper lies on the possible experimental tests of leptogenesis scenarios. We consider both leptogenesis generated from oscillations, as well as leptogenesis from out-of-equilibrium decays. As the Akhmedov-Rubakov-Smirnov (ARS) mechanism allows for heavy neutrinos in the GeV range, this opens up a plethora of possible experimental tests, e.g. at neutrino oscillation experiments, neutrinoless double beta decay, and direct searches for neutral heavy leptons at future facilities. In contrast, testing leptogenesis from out-of-equilibrium decays is a quite difficult task. We comment on the necessary conditions for having successful leptogenesis at the TeV-scale. We further discuss possible realizations and their model specific testability in extended seesaw models, models with extended gauge sectors, and supersymmetric leptogenesis. Not being able to test high-scale leptogenesis directly, we present a way to falsify such scenarios by focusing on their washout processes. This is discussed specifically for the left-right symmetric model and the observation of a heavy WR, as well as model independently when measuring ΔL = 2 washout processes at the LHC or neutrinoless double beta decay.
Simple neural substrate predicts complex rhythmic structure in duetting birds
NASA Astrophysics Data System (ADS)
Amador, Ana; Trevisan, M. A.; Mindlin, G. B.
2005-09-01
Horneros (Furnarius rufus) are South American birds well known for their oven-shaped nests and their ability to sing in duets. Previous work has analyzed the rhythmic organization of the duets, unveiling a mathematical structure behind the songs. In this work we analyze in detail an extended database of duets. The rhythms of the songs are compatible with the dynamics presented by a wide class of dynamical systems: forced excitable systems. Compatible with this nonlinear rule, we build a biologically inspired model for how the neural and the anatomical elements may interact to produce the observed rhythmic patterns. This model allows us to synthesize songs presenting the acoustic and rhythmic features observed in real songs. We also make testable predictions in order to support our hypothesis.
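As an illustration of the model class invoked here, the sketch below integrates a periodically forced FitzHugh-Nagumo system, a standard forced excitable system. It is not the authors' model; all parameter values are illustrative.

```python
# A generic forced excitable system (FitzHugh-Nagumo with periodic forcing),
# illustrating the class of models invoked above; parameters are illustrative.
import numpy as np
from scipy.integrate import solve_ivp

def fhn(t, y, a=0.7, b=0.8, tau=12.5, amp=0.8, freq=0.01):
    v, w = y
    drive = amp * np.sin(2 * np.pi * freq * t)   # slow periodic forcing
    dv = v - v**3 / 3 - w + drive                # fast (excitable) variable
    dw = (v + a - b * w) / tau                   # slow recovery variable
    return [dv, dw]

sol = solve_ivp(fhn, (0, 300), [-1.2, -0.6], max_step=0.1)

# Spikes occur when v crosses a threshold; their timing relative to the
# forcing phase yields the locked rhythmic patterns discussed above.
spikes = np.flatnonzero((sol.y[0][:-1] < 1.0) & (sol.y[0][1:] >= 1.0))
print(f"{len(spikes)} spikes at t = {np.round(sol.t[spikes], 1)}")
```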
NASA Astrophysics Data System (ADS)
Amoroso, Richard L.
A brief introductory survey of Unified Field Mechanics (UFM) is given from the perspective of a Holographic Anthropic Multiverse cosmology in 12 `continuous-state' dimensions. The paradigm with many new parameters is cast in a scale-invariant conformal covariant Dirac polarized vacuum utilizing extended HD forms of the de Broglie-Bohm and Cramer interpretations of quantum theory. The model utilizes a unique form of M-Theory based in part on the original hadronic form of string theory that had a variable string tension, T_S, and included a tachyon. The model is experimentally testable, thus putatively able to demonstrate the existence of large-scale additional dimensionality (LSXD), test for QED violating tight-bound state spectral lines in hydrogen `below' the lowest Bohr orbit, and surmount the quantum uncertainty principle utilizing a hyperincursive Sagnac Effect resonance hierarchy.
Sneutrino dark matter in gauged inverse seesaw models for neutrinos.
An, Haipeng; Dev, P S Bhupal; Cai, Yi; Mohapatra, R N
2012-02-24
Extending the minimal supersymmetric standard model to explain small neutrino masses via the inverse seesaw mechanism can lead to a new light supersymmetric scalar partner which can play the role of inelastic dark matter (IDM). It is a linear combination of the superpartners of the neutral fermions in the theory (the light left-handed neutrino and two heavy standard model singlet neutrinos) which can be very light with mass in ~5-20 GeV range, as suggested by some current direct detection experiments. The IDM in this class of models has keV-scale mass splitting, which is intimately connected to the small Majorana masses of neutrinos. We predict the differential scattering rate and annual modulation of the IDM signal which can be testable at future germanium- and xenon-based detectors.
Testability Design Rating System: Testability Handbook. Volume 1
1992-02-01
4.7.5 Summary of False BIT Alarms (FBA) ... 4.7.6 Smart BIT Technique ...Circuit Board; PGA Pin Grid Array; PLA Programmable Logic Array; PLD Programmable Logic Device; PN Pseudo-Random Number; PREDICT Probabilistic Estimation of... 4.7.6 Smart BIT (reference: RADC-TR-85-198). "Smart" BIT is a term given to BIT circuitry in a system LRU which includes dedicated processor/memory
Smart substrates: Making multi-chip modules smarter
NASA Astrophysics Data System (ADS)
Wunsch, T. F.; Treece, R. K.
1995-05-01
A novel multi-chip module (MCM) design and manufacturing methodology which utilizes active CMOS circuits in what is normally a passive substrate realizes the 'smart substrate' for use in highly testable, high reliability MCMs. The active devices are used to test the bare substrate, diagnose assembly errors or integrated circuit (IC) failures that require rework, and improve the testability of the final MCM assembly. A static random access memory (SRAM) MCM has been designed and fabricated in Sandia's Microelectronics Development Laboratory in order to demonstrate the technical feasibility of this concept and to examine design and manufacturing issues which will ultimately determine the economic viability of this approach. The smart substrate memory MCM represents a first in MCM packaging. At the time the first modules were fabricated, no other company or MCM vendor had incorporated active devices in the substrate to improve manufacturability and testability, and thereby improve MCM reliability and reduce cost.
NASA Astrophysics Data System (ADS)
Chakdar, Shreyashi
The Standard Model of particle physics is assumed to be a low-energy effective theory, with new physics theoretically motivated to be around the TeV scale. This thesis presents theories with new physics beyond the Standard Model at the TeV scale that are testable in colliders. Work done in chapters 2, 3 and 5 of this thesis presents models incorporating different approaches to enlarging the Standard Model gauge group to a grand unified symmetry, with each model presenting its unique signatures in the colliders. The study of leptoquark gauge bosons in the TopSU(5) model in chapter 2 showed that their discovery mass range extends up to 1.5 TeV at the 14 TeV LHC with a luminosity of 100 fb⁻¹. In chapter 3 we studied the collider phenomenology of TeV-scale mirror fermions in the Left-Right Mirror model, finding that the reach for the mirror quarks extends up to 750 GeV at the 14 TeV LHC with 300 fb⁻¹ luminosity. In chapter 4 we have enlarged the bosonic symmetry to a fermi-bose symmetry, i.e. supersymmetry, and have shown that SUSY with non-universalities in gaugino or scalar masses within a high-scale SUGRA setup can still be accessible at the LHC at 14 TeV. In chapter 5, we performed a study for the e⁺e⁻ collider and find that precise measurements of the Higgs boson mass splittings to within ~100 MeV may be possible with high luminosity at the International Linear Collider (ILC). In chapter 6 we have shown that the experimental data on neutrino masses and mixings are consistent with the proposed 4/5-parameter Dirac neutrino models, yielding a solution for the neutrino masses with inverted mass hierarchy and a large CP-violating phase δ, and thus can be tested experimentally. Chapter 7 of the thesis incorporates a warm dark matter candidate in the context of a two Higgs doublet model. The model has several testable consequences at colliders, with the charged scalar and pseudoscalar being in the few-hundred-GeV mass range. This thesis presents an endeavor to study beyond-Standard-Model physics at the TeV scale with testable signals at colliders.
Flight control system design factors for applying automated testing techniques
NASA Technical Reports Server (NTRS)
Sitz, Joel R.; Vernon, Todd H.
1990-01-01
The principal design features and operational experiences of the X-29 forward-swept-wing aircraft and F-18 high alpha research vehicle (HARV) automated test systems are discussed. It is noted that operational experiences in developing and using these automated testing techniques have highlighted the need for incorporating target system features to improve testability. Improved target system testability can be accomplished with the addition of nonreal-time and real-time features. Online access to target system implementation details, unobtrusive real-time access to internal user-selectable variables, and proper software instrumentation are all desirable features of the target system. Also, test system and target system design issues must be addressed during the early stages of the target system development. Processing speeds of up to 20 million instructions/s and the development of high-bandwidth reflective memory systems have improved the ability to integrate the target system and test system for the application of automated testing techniques. It is concluded that new methods of designing testability into the target systems are required.
Abu Bakar, Nurul Farhana; Chen, Ai-Hong
2014-02-01
Children with learning disabilities might have difficulty communicating effectively and giving reliable responses as required in various visual function testing procedures. The purpose of this study was to compare the testability of visual acuity using the modified Early Treatment Diabetic Retinopathy Study (ETDRS) and Cambridge Crowding Cards, stereo acuity using the Lang Stereo test II and Butterfly stereo tests, and colour perception using the Colour Vision Test Made Easy (CVTME) and Ishihara's Test for Colour Deficiency (Ishihara Test) between children in mainstream classes and children with learning disabilities in special education classes in government primary schools. A total of 100 primary school children (50 children from mainstream classes and 50 children from special education classes) matched in age were recruited in this cross-sectional comparative study. Testability was determined by the percentage of children who were able to give reliable responses as required by the respective tests. 'Unable to test' was defined as an inappropriate response or uncooperativeness despite the best efforts of the screener. The testability of the modified ETDRS, Butterfly stereo test and Ishihara test was found to be lower among children in special education classes (P < 0.001), but not that of the Cambridge Crowding Cards, Lang Stereo test II and CVTME. Non-verbal or "matching" approaches were found to be superior for testing visual functions in children with learning disabilities. Modifications of vision testing procedures are essential for children with learning disabilities.
Testability, Test Automation and Test Driven Development for the Trick Simulation Toolkit
NASA Technical Reports Server (NTRS)
Penn, John
2014-01-01
This paper describes the adoption of a Test Driven Development approach and a Continuous Integration System in the development of the Trick Simulation Toolkit, a generic simulation development environment for creating high fidelity training and engineering simulations at the NASA Johnson Space Center and many other NASA facilities. It describes the approach, and the significant benefits seen, such as fast, thorough and clear test feedback every time code is checked into the code repository. It also describes an approach that encourages development of code that is testable and adaptable.
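Trick itself is a C/C++ toolkit; the minimal Python sketch below (all names hypothetical) only illustrates the test-driven rhythm the paper describes: write a failing test first, add just enough code to pass it, and let the continuous integration system run the suite on every check-in.

```python
# Minimal illustration of the test-driven rhythm described above
# (Trick itself is C/C++; this Python sketch is only illustrative).
import unittest

def integrate_euler(state, deriv, dt):
    """One explicit-Euler step; written only after the test below existed."""
    return state + deriv * dt

class EulerStepTest(unittest.TestCase):
    def test_constant_velocity_advances_position(self):
        # Written first, watched fail, then the function above was added.
        self.assertAlmostEqual(integrate_euler(1.0, 2.0, 0.5), 2.0)

if __name__ == "__main__":
    unittest.main()  # a CI system would run this suite on every check-in
```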
Ethnic Enclaves and the Earnings of Immigrants
Xie, Yu; Gough, Margaret
2011-01-01
A large literature in sociology concerns the implications of immigrants’ participation in ethnic enclaves for their economic and social well-being. The “enclave thesis” speculates that immigrants benefit from working in ethnic enclaves. Previous research concerning the effects of enclave participation on immigrants’ economic outcomes has come to mixed conclusions as to whether enclave effects are positive or negative. In this article, we seek to extend and improve upon past work by formulating testable hypotheses based on the enclave thesis and testing them with data from the 2003 New Immigrant Survey (NIS), employing both residence-based and workplace-based measures of the ethnic enclave. We compare the economic outcomes of immigrants working in ethnic enclaves with those of immigrants working in the mainstream economy. Our research yields minimal support for the enclave thesis. Our results further indicate that for some immigrant groups, ethnic enclave participation actually has a negative effect on economic outcomes. PMID:21863367
Democratic (s)fermions and lepton flavor violation
NASA Astrophysics Data System (ADS)
Hamaguchi, K.; Kakizaki, Mitsuru; Yamaguchi, Masahiro
2003-09-01
The democratic approach to account for fermion masses and mixing is known to be successful not only in the quark sector but also in the lepton sector. Here we extend this ansatz to supersymmetric standard models, in which the Kähler potential obeys the underlying S3 flavor symmetries. The requirement of neutrino bi-large mixing angles constrains the form of the Kähler potential for left-handed lepton multiplets. We find that right-handed sleptons can have nondegenerate masses and flavor mixing, while left-handed sleptons are argued to have universal and hence flavor-blind masses. This mass pattern is testable in future collider experiments when superparticle masses will be measured precisely. Lepton flavor violation arises in this scenario. In particular, μ→eγ is expected to be observed in a planned future experiment if supersymmetry breaking scale is close to the weak scale.
Functional Interdependence Theory: An Evolutionary Account of Social Situations.
Balliet, Daniel; Tybur, Joshua M; Van Lange, Paul A M
2017-11-01
Social interactions are characterized by distinct forms of interdependence, each of which has unique effects on how behavior unfolds within the interaction. Despite this, little is known about the psychological mechanisms that allow people to detect and respond to the nature of interdependence in any given interaction. We propose that interdependence theory provides clues regarding the structure of interdependence in the human ancestral past. In turn, evolutionary psychology offers a framework for understanding the types of information processing mechanisms that could have been shaped under these recurring conditions. We synthesize and extend these two perspectives to introduce a new theory: functional interdependence theory (FIT). FIT can generate testable hypotheses about the function and structure of the psychological mechanisms for inferring interdependence. This new perspective offers insight into how people initiate and maintain cooperative relationships, select social partners and allies, and identify opportunities to signal social motives.
Beyond Λ CDM: Problems, solutions, and the road ahead
NASA Astrophysics Data System (ADS)
Bull, Philip; Akrami, Yashar; Adamek, Julian; Baker, Tessa; Bellini, Emilio; Beltrán Jiménez, Jose; Bentivegna, Eloisa; Camera, Stefano; Clesse, Sébastien; Davis, Jonathan H.; Di Dio, Enea; Enander, Jonas; Heavens, Alan; Heisenberg, Lavinia; Hu, Bin; Llinares, Claudio; Maartens, Roy; Mörtsell, Edvard; Nadathur, Seshadri; Noller, Johannes; Pasechnik, Roman; Pawlowski, Marcel S.; Pereira, Thiago S.; Quartin, Miguel; Ricciardone, Angelo; Riemer-Sørensen, Signe; Rinaldi, Massimiliano; Sakstein, Jeremy; Saltas, Ippocratis D.; Salzano, Vincenzo; Sawicki, Ignacy; Solomon, Adam R.; Spolyar, Douglas; Starkman, Glenn D.; Steer, Danièle; Tereno, Ismael; Verde, Licia; Villaescusa-Navarro, Francisco; von Strauss, Mikael; Winther, Hans A.
2016-06-01
Despite its continued observational successes, there is a persistent (and growing) interest in extending cosmology beyond the standard model, Λ CDM. This is motivated by a range of apparently serious theoretical issues, involving such questions as the cosmological constant problem, the particle nature of dark matter, the validity of general relativity on large scales, the existence of anomalies in the CMB and on small scales, and the predictivity and testability of the inflationary paradigm. In this paper, we summarize the current status of Λ CDM as a physical theory, and review investigations into possible alternatives along a number of different lines, with a particular focus on highlighting the most promising directions. While the fundamental problems are proving reluctant to yield, the study of alternative cosmologies has led to considerable progress, with much more to come if hopes about forthcoming high-precision observations and new theoretical ideas are fulfilled.
The evolution of dispersal in a Levins' type metapopulation model.
Jansen, Vincent A A; Vitalis, Renaud
2007-10-01
We study the evolution of the dispersal rate in a metapopulation model with extinction and colonization dynamics, akin to the model as originally described by Levins. To do so we extend the metapopulation model with a description of the within-patch dynamics. By means of a separation of time scales we analytically derive a fitness expression from first principles for this model. The fitness function can be written as an inclusive fitness equation (Hamilton's rule). By recasting this equation in a form that emphasizes the effects of competition we show the effect of the local competition and the local population size on the evolution of dispersal. We find that the evolution of dispersal cannot be easily interpreted in terms of avoidance of kin competition, but rather that increased dispersal reduces the competitive ability. Our model also yields a testable prediction in terms of relatedness and life-history parameters.
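For reference, the inclusive-fitness condition invoked above is Hamilton's rule: with r the relatedness between actor and recipient, b the fitness benefit to the recipient, and c the cost to the actor, a trait such as increased dispersal is favoured when

```latex
% Hamilton's rule: a trait spreads when relatedness-weighted benefits
% exceed costs to the actor.
r\,b - c > 0
```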
Singlet-triplet fermionic dark matter and LHC phenomenology
NASA Astrophysics Data System (ADS)
Choubey, Sandhya; Khan, Sarif; Mitra, Manimala; Mondal, Subhadeep
2018-04-01
It is well known that for the pure standard model triplet fermionic WIMP-type dark matter (DM), the relic density is satisfied around 2 TeV. For such a heavy particle, the production cross-section at the 13 TeV run of the LHC will be very small. Extending the model further with a singlet fermion and a triplet scalar, the DM relic density can be satisfied for much lower masses. The lower-mass DM can be copiously produced at the LHC and hence the model can be tested at colliders. For the present model we have studied the multi-jet (≥ 2j) + missing transverse energy signal and show that this can be detected in the near future of the LHC 13 TeV run. We also predict that the present model is testable by earth-based DM direct detection experiments like Xenon-1T and in future by Darwin.
Abu Bakar, Nurul Farhana; Chen, Ai-Hong
2014-01-01
Context: Children with learning disabilities might have difficulty communicating effectively and giving reliable responses as required in various visual function testing procedures. Aims: The purpose of this study was to compare the testability of visual acuity using the modified Early Treatment Diabetic Retinopathy Study (ETDRS) and Cambridge Crowding Cards, stereo acuity using the Lang Stereo test II and Butterfly stereo tests, and colour perception using the Colour Vision Test Made Easy (CVTME) and Ishihara's Test for Colour Deficiency (Ishihara Test) between children in mainstream classes and children with learning disabilities in special education classes in government primary schools. Materials and Methods: A total of 100 primary school children (50 children from mainstream classes and 50 children from special education classes) matched in age were recruited in this cross-sectional comparative study. Testability was determined by the percentage of children who were able to give reliable responses as required by the respective tests. ‘Unable to test’ was defined as an inappropriate response or uncooperativeness despite the best efforts of the screener. Results: The testability of the modified ETDRS, Butterfly stereo test and Ishihara test was found to be lower among children in special education classes (P < 0.001), but not that of the Cambridge Crowding Cards, Lang Stereo test II and CVTME. Conclusion: Non-verbal or “matching” approaches were found to be superior for testing visual functions in children with learning disabilities. Modifications of vision testing procedures are essential for children with learning disabilities. PMID:24008790
The Demographic Transition: Causes and Consequences
Galor, Oded
2013-01-01
This paper develops the theoretical foundations and the testable implications of the various mechanisms that have been proposed as possible triggers for the demographic transition. Moreover, it examines the empirical validity of each of the theories and their significance for the understanding of the transition from stagnation to growth. The analysis suggests that the rise in the demand for human capital in the process of development was the main trigger for the decline in fertility and the transition to modern growth. PMID:25089157
Technology advances and market forces: Their impact on high performance architectures
NASA Technical Reports Server (NTRS)
Best, D. R.
1978-01-01
Reasonable projections into future supercomputer architectures and technology require an analysis of the computer industry market environment, the current capabilities and trends within the component industry, and the research activities on computer architecture in the industrial and academic communities. Management, programmer, architect, and user must cooperate to increase the efficiency of supercomputer development efforts. Care must be taken to match the funding, compiler, architecture and application with greater attention to testability, maintainability, reliability, and usability than supercomputer development programs of the past.
Online testable concept maps: benefits for learning about the pathogenesis of disease.
Ho, Veronica; Kumar, Rakesh K; Velan, Gary
2014-07-01
Concept maps have been used to promote meaningful learning and critical thinking. Although these are crucially important in all disciplines, evidence for the benefits of concept mapping for learning in medicine is limited. We performed a randomised crossover study to assess the benefits of online testable concept maps for learning in pathology by volunteer junior medical students. Participants (n = 65) were randomly allocated to either of two groups with equivalent mean prior academic performance, in which they were given access to either online maps or existing online resources for a 2-week block on renal disease. Groups then crossed over for a 2-week block on hepatic disease. Outcomes were assessed using timed online quizzes, which included questions unrelated to topics in the pathogenesis maps as an internal control. Questionnaires were administered to evaluate students' acceptance of the maps. In both blocks, the group with access to pathogenesis maps achieved significantly higher average scores than the control group on quiz questions related to topics covered by the maps (Block 1: p < 0.001, Cohen's d = 0.9; Block 2: p = 0.008, Cohen's d = 0.7). However, mean scores on unrelated questions did not differ significantly between the groups. In a third block on pancreatic disease, both groups received pathogenesis maps and collectively performed significantly better on quiz topics related to the maps than on unrelated topics (p < 0.01, Cohen's d = 0.5). Regression analysis revealed that access to pathogenesis maps was the dominant contributor to variance in performance on map-related quiz questions. Responses to questionnaire items on pathogenesis maps were overwhelmingly positive in both groups. These results indicate that online testable pathogenesis maps are well accepted and can improve learning of concepts in pathology by medical students. © 2014 John Wiley & Sons Ltd.
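For readers unfamiliar with the effect sizes quoted, Cohen's d is the standardized difference between two group means, pooled over both groups' variances; values around 0.5 are conventionally read as medium effects and around 0.8 as large, matching the 0.5-0.9 range reported here.

```latex
% Cohen's d: difference between group means divided by the pooled
% standard deviation of the two groups.
d = \frac{\bar{x}_1 - \bar{x}_2}
         {\sqrt{\dfrac{(n_1 - 1)s_1^2 + (n_2 - 1)s_2^2}{n_1 + n_2 - 2}}}
```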
Tailor, Vijay; Glaze, Selina; Unwin, Hilary; Bowman, Richard; Thompson, Graham; Dahlmann-Noor, Annegret
2016-10-01
Children and adults with neurological impairments are often not able to access conventional perimetry; however, information about the visual field is valuable. A new technology, saccadic vector optokinetic perimetry (SVOP), may have improved accessibility, but its accuracy has not been evaluated. We aimed to explore accessibility, testability and accuracy of SVOP in children with neurodisability or isolated visual pathway deficits. Cohort study; recruitment October 2013-May 2014, at children's eye clinics at a tertiary referral centre and a regional Child Development Centre; full orthoptic assessment, SVOP (central 30° of the visual field) and confrontation visual fields (CVF). Group 1: age 1-16 years, neurodisability (n=16), group 2: age 10-16 years, confirmed or suspected visual field defect (n=21); group 2 also completed Goldmann visual field testing (GVFT). Group 1: testability with a full 40-point test protocol is 12.5%; with reduced test protocols, testability is 100%, but plots may be clinically meaningless. Children (44%) and parents/carers (62.5%) find the test easy. SVOP and CVF agree in 50%. Group 2: testability is 62% for the 40-point protocol, and 90.5% for reduced protocols. Corneal changes in childhood glaucoma interfere with SVOP testing. All children and parents/carers find SVOP easy. Overall agreement with GVFT is 64.7%. While SVOP is highly accessible to children, many cannot complete a full 40-point test. Agreement with current standard tests is moderate to poor. Abnormal saccades cause an apparent non-specific visual field defect. In children with glaucoma or nystagmus SVOP calibration often fails. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
A Programmable Cellular-Automata Polarized Dirac Vacuum
NASA Astrophysics Data System (ADS)
Osoroma, Drahcir S.
2013-09-01
We explore properties of a `Least Cosmological Unit' (LCU) as an inherent spacetime raster tiling or tessellating the unique backcloth of Holographic Anthropic Multiverse (HAM) cosmology as an array of programmable cellular automata. The HAM vacuum is a scale-invariant HD extension of a covariant polarized Dirac vacuum with `bumps' and `holes' typically described by extended electromagnetic theory corresponding to an Einstein energy-dependent spacetime metric admitting a periodic photon mass. The new cosmology incorporates a unique form of M-Theoretic Calabi-Yau-Poincaré Dodecahedral-AdS5-DS5 space (PDS) with mirror symmetry best described by an HD extension of Cramer's Transactional Interpretation when integrated also with an HD extension of the de Broglie-Bohm-Vigier causal interpretation of quantum theory. We incorporate a unique form of large-scale additional dimensionality (LSXD) bearing some similarity to that conceived by Randall and Sundrum; and extend the fundamental basis of our model to the Unified Field, UF. A Sagnac Effect rf-pulsed incursive resonance hierarchy is utilized to manipulate and ballistically program the geometric-topological properties of this putative LSXD space-spacetime network. The model is empirically testable; and it is proposed that a variety of new technologies will arise from ballistic programming of tessellated LCU vacuum cellular automata.
NASA Technical Reports Server (NTRS)
Leonard, J. I.
1985-01-01
Theoretical responses to weightlessness are summarized. The studies include development and validation of a model of erythropoiesis regulation, analysis of the behavior of erythropoiesis under a variety of conditions, simulations of bed rest and space flight, and an evaluation of ground-based animal studies which were conducted as analogs of zero-g. A review of all relevant space flight findings and a set of testable hypotheses which attempt to explain how red cell mass decreases in space flight are presented. An additional document describes details of the mathematical model used in these studies.
Muscle MRI findings in facioscapulohumeral muscular dystrophy.
Gerevini, Simonetta; Scarlato, Marina; Maggi, Lorenzo; Cava, Mariangela; Caliendo, Giandomenico; Pasanisi, Barbara; Falini, Andrea; Previtali, Stefano Carlo; Morandi, Lucia
2016-03-01
Facioscapulohumeral muscular dystrophy (FSHD) is characterized by extremely variable degrees of facial, scapular and lower limb muscle involvement. Clinical and genetic determination can be difficult, as molecular analysis is not always definitive, and other similar muscle disorders may have overlapping clinical manifestations. Whole-body muscle MRI examination for fat infiltration, atrophy and oedema was performed to identify specific patterns of muscle involvement in FSHD patients (30 subjects), and compared to a group of control patients (23) affected by other myopathies (NFSHD). In FSHD patients, we detected a specific pattern of muscle fatty replacement and atrophy, particularly in upper girdle muscles. The most frequently affected muscles, including paucisymptomatic and severely affected FSHD patients, were trapezius, teres major and serratus anterior. Moreover, asymmetric muscle involvement was significantly higher in FSHD as compared to NFSHD patients. In conclusion, muscle MRI is very sensitive for identifying a specific pattern of involvement in FSHD patients and in detecting selective muscle involvement of non-clinically testable muscles. Muscle MRI constitutes a reliable tool for differentiating FSHD from other muscular dystrophies to direct diagnostic molecular analysis, as well as to investigate FSHD natural history and follow-up of the disease. Muscle MRI identifies a specific pattern of muscle involvement in FSHD patients. Muscle MRI may predict FSHD in asymptomatic and severely affected patients. Muscle MRI of upper girdle better predicts FSHD. Muscle MRI may differentiate FSHD from other forms of muscular dystrophy. Muscle MRI may show the involvement of non-clinical testable muscles.
Identifying Synergies in Multilevel Interventions.
Lewis, Megan A; Fitzgerald, Tania M; Zulkiewicz, Brittany; Peinado, Susana; Williams, Pamela A
2017-04-01
Social ecological models of health often describe multiple levels of influence that interact to influence health. However, it is still common for interventions to target only one or two of these levels, perhaps owing in part to a lack of guidance on how to design multilevel interventions to achieve optimal impact. The convergence strategy emphasizes that interventions at different levels mutually reinforce each other by changing patterns of interaction among two or more intervention audiences; this strategy is one approach for combining interventions at different levels to produce synergistic effects. We used semistructured interviews with 65 representatives in a cross-site national initiative that enhanced health and outcomes for patients with diabetes to examine whether the convergence strategy was a useful conceptual model for multilevel interventions. Using a framework analysis approach to analyze qualitative interview data, we found three synergistic themes that match the convergence strategy and support how multilevel interventions can be successful. These three themes were (1) enhancing engagement between patient and provider and access to quality care; (2) supporting communication, information sharing, and coordination among providers, community stakeholders, and systems; and (3) building relationships and fostering alignment among providers, community stakeholders, and systems. These results support the convergence strategy as a testable conceptual model and provide examples of successful intervention strategies for combining multilevel interventions to produce synergies across levels and promote diabetes self-management and that may extend to management of other chronic illnesses as well.
Stoltenberg, Scott F.; Nag, Parthasarathi
2010-01-01
Despite more than a decade of empirical work on the role of genetic polymorphisms in the serotonin system on behavior, the details across levels of analysis are not well understood. We describe a mathematical model of the genetic control of presynaptic serotonergic function that is based on control theory, implemented using systems of differential equations, and focused on better characterizing pathways from genes to behavior. We present the results of model validation tests that include the comparison of simulation outcomes with empirical data on genetic effects on brain response to affective stimuli and on impulsivity. Patterns of simulated neural firing were consistent with recent findings of additive effects of serotonin transporter and tryptophan hydroxylase-2 polymorphisms on brain activation. In addition, simulated levels of cerebral spinal fluid 5-hydroxyindoleacetic acid (CSF 5-HIAA) were negatively correlated with Barratt Impulsiveness Scale (Version 11) Total scores in college students (r = −.22, p = .002, N = 187), which is consistent with the well-established negative correlation between CSF 5-HIAA and impulsivity. The results of the validation tests suggest that the model captures important aspects of the genetic control of presynaptic serotonergic function and behavior via brain activation. The proposed model can be: (1) extended to include other system components, neurotransmitter systems, behaviors and environmental influences; (2) used to generate testable hypotheses. PMID:20111992
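The abstract does not reproduce the model's equations; the toy system below is only a sketch in the same control-theoretic spirit, with hypothetical variable names and rate constants: synthesis under autoreceptor negative feedback balanced against transporter-mediated reuptake.

```python
# Toy control-style model of presynaptic serotonergic function, in the
# spirit described above; names and rate constants are hypothetical.
from scipy.integrate import solve_ivp

def serotonin(t, y, k_syn=1.0, k_reuptake=0.8, k_auto=0.5):
    s = y[0]                              # extracellular 5-HT level
    feedback = 1.0 / (1.0 + k_auto * s)   # autoreceptor negative feedback
    ds = k_syn * feedback - k_reuptake * s
    return [ds]

# Lowering k_reuptake mimics a low-expressing transporter variant:
# the steady-state extracellular level rises.
for k in (0.8, 0.4):
    sol = solve_ivp(serotonin, (0, 50), [0.0], args=(1.0, k))
    print(f"k_reuptake={k}: steady-state 5-HT ~ {sol.y[0][-1]:.2f}")
```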
Soy-Based Therapeutic Baby Formulas: Testable Hypotheses Regarding the Pros and Cons.
Westmark, Cara J
2016-01-01
Soy-based infant formulas have been consumed in the United States since 1909, and currently constitute a significant portion of the infant formula market. There are efforts underway to generate genetically modified soybeans that produce therapeutic agents of interest with the intent to deliver those agents in a soy-based infant formula platform. The threefold purpose of this review article is to first discuss the pros and cons of soy-based infant formulas, then present testable hypotheses to discern the suitability of a soy platform for drug delivery in babies, and finally start a discussion to inform public policy on this important area of infant nutrition.
Bayesian naturalness, simplicity, and testability applied to the B ‑ L MSSM GUT
NASA Astrophysics Data System (ADS)
Fundira, Panashe; Purves, Austin
2018-04-01
Recent years have seen increased use of Bayesian model comparison to quantify notions such as naturalness, simplicity, and testability, especially in the area of supersymmetric model building. After demonstrating that Bayesian model comparison can resolve a paradox that has been raised in the literature concerning the naturalness of the proton mass, we apply Bayesian model comparison to GUTs, an area to which it has not been applied before. We find that the GUTs are substantially favored over the nonunifying puzzle model. Of the GUTs we consider, the B ‑ L MSSM GUT is the most favored, but the MSSM GUT is almost equally favored.
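The machinery behind such comparisons is the standard Bayes factor; for models M1 and M2 and data D,

```latex
% Bayes factor between models M_1 and M_2 given data D. The evidence
% integrals marginalize over each model's parameters, so models that fit
% only in finely tuned parameter regions are automatically penalized.
B_{12} = \frac{p(D \mid M_1)}{p(D \mid M_2)}
       = \frac{\int p(D \mid \theta_1, M_1)\, p(\theta_1 \mid M_1)\, \mathrm{d}\theta_1}
              {\int p(D \mid \theta_2, M_2)\, p(\theta_2 \mid M_2)\, \mathrm{d}\theta_2}
```

The built-in Occam penalty is what lets a single number speak to naturalness, simplicity, and testability at once.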
Two fundamental questions about protein evolution.
Penny, David; Zhong, Bojian
2015-12-01
Two basic questions are considered that approach protein evolution from different directions: the problems arising from using Markov models for the deeper divergences, and the origin of proteins themselves. The real problem for the first question (going backwards in time) is that Markov models of sequence evolution must lose information exponentially at deeper divergences, and several testable methods are suggested that should help resolve these deeper divergences. For the second question (coming forwards in time) a problem is that most models for the origin of protein synthesis do not give a role for the very earliest stages of the process. From our knowledge of the importance of replication accuracy in limiting the length of a coding molecule, a testable hypothesis is proposed. The length of the code, the code itself, and tRNAs would all have prior roles in increasing the accuracy of RNA replication; thus proteins would have been formed only after the tRNAs and the length of the triplet code are already formed. Both questions lead to testable predictions. Copyright © 2014 Elsevier B.V. and Société Française de Biochimie et Biologie Moléculaire (SFBBM). All rights reserved.
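The exponential loss is easiest to see in the simplest case: for a symmetric two-state Markov model with substitution rate α per site, the probability that a site shows the ancestral state after time t is

```latex
% Two-state symmetric Markov model: the historical signal decays as
% e^{-2 alpha t}, so at deep divergences the chain approaches its
% stationary distribution and retains essentially no phylogenetic signal.
P(\text{same} \mid t) = \tfrac{1}{2}\left(1 + e^{-2\alpha t}\right)
  \;\xrightarrow{\;t \to \infty\;}\; \tfrac{1}{2}
```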
NASA Technical Reports Server (NTRS)
Young, K. E.; Bleacher, J. E.; Evans, C. A.; Rogers, A. D.; Ito, G.; Arzoumanian, Z.; Gendreau, K.
2015-01-01
Regardless of the target destination for the next manned planetary mission, the crew will require technology with which to select samples for return to Earth. The crews of the six Apollo lunar surface missions had only the tools to enable them to physically pick samples up off the surface or from a boulder and store those samples for return to the Lunar Module and eventually to Earth. Sample characterization was dependent upon visual inspection and relied upon their extensive geology training. In the four decades since Apollo, however, great advances have been made in traditionally laboratory-based instrument technologies that enable miniaturization to a field-portable configuration. The implications of these advancements extend past traditional terrestrial field geology and into planetary surface exploration. With tools that allow for real-time geochemical analysis, an astronaut can better develop a series of working hypotheses that are testable during surface science operations. One such technology is x-ray fluorescence (XRF). Traditionally used in a laboratory configuration, these instruments have now been developed and marketed commercially in a field-portable mode. We examine this technology in the context of geologic sample analysis and discuss current and future plans for instrument deployment. We also discuss the development of the Chromatic Mineral Identification and Surface Texture (CMIST) instrument at the NASA Goddard Space Flight Center (GSFC). Testing is taking place in conjunction with the RIS4E (Remote, In Situ, and Synchrotron Studies for Science and Exploration) SSERVI (Solar System Exploration and Research Virtual Institute) team activities, including field testing at Kilauea Volcano, HI.
Informed maintenance for next generation space transportation systems
NASA Astrophysics Data System (ADS)
Fox, Jack J.
2001-02-01
Perhaps the most substantial single obstacle to progress of space exploration and utilization of space for human benefit is the safety & reliability and the inherent cost of launching to, and returning from, space. The primary influence in the high costs of current launch systems (the same is true for commercial and military aircraft and most other reusable systems) is the operations, maintenance and infrastructure portion of the program's total life cycle costs. Reusable Launch Vehicle (RLV) maintenance and design have traditionally been two separate engineering disciplines with often conflicting objectives: maximizing ease of maintenance versus optimizing performance, size and cost. Testability analysis, an element of Informed Maintenance (IM), has been an ad hoc, manual effort, in which maintenance engineers attempt to identify an efficient method of troubleshooting for the given product, with little or no control over product design. Therefore, testability deficiencies in the design cannot be rectified. It is now widely recognized that IM must be engineered into the product at the design stage itself, so that an optimal compromise is achieved between system maintainability and performance. The elements of IM include testability analysis, diagnostics/prognostics, automated maintenance scheduling, automated logistics coordination, paperless documentation and data mining. IM derives its heritage from complementary NASA science, space and aeronautic enterprises such as the on-board autonomous Remote Agent Architecture recently flown on NASA's Deep Space 1 Probe as well as commercial industries that employ quick turnaround operations. Commercial technologies and processes supporting NASA's IM initiatives include condition based maintenance technologies from Boeing's Commercial 777 Aircraft and Lockheed-Martin's F-22 Fighter, automotive computer diagnostics and autonomous controllers that enable 100,000 mile maintenance free operations, and locomotive monitoring system software. This paper will summarize NASA's long-term strategy, development, and implementation plans for Informed Maintenance for next generation RLVs. This will be done by converging the work being performed throughout NASA, industry and academia into a single IM vision. Additionally, a current status of IM development throughout NASA programs such as the Space Shuttle, X-33, X-34 and X-37 will be provided and will conclude with an overview of near-term work that is being initiated in FY00 to support NASA's 2nd Generation Reusable Launch Vehicle Program.
Informed maintenance for next generation reusable launch systems
NASA Astrophysics Data System (ADS)
Fox, Jack J.; Gormley, Thomas J.
2001-03-01
Perhaps the most substantial single obstacle to progress of space exploration and utilization of space for human benefit is the safety & reliability and the inherent cost of launching to, and returning from, space. The primary influence in the high costs of current launch systems (the same is true for commercial and military aircraft and most other reusable systems) is the operations, maintenance and infrastructure portion of the program's total life cycle costs. Reusable Launch Vehicle (RLV) maintenance and design have traditionally been two separate engineering disciplines with often conflicting objectives: maximizing ease of maintenance versus optimizing performance, size and cost. Testability analysis, an element of Informed Maintenance (IM), has been an ad hoc, manual effort, in which maintenance engineers attempt to identify an efficient method of troubleshooting for the given product, with little or no control over product design. Therefore, testability deficiencies in the design cannot be rectified. It is now widely recognized that IM must be engineered into the product at the design stage itself, so that an optimal compromise is achieved between system maintainability and performance. The elements of IM include testability analysis, diagnostics/prognostics, automated maintenance scheduling, automated logistics coordination, paperless documentation and data mining. IM derives its heritage from complementary NASA science, space and aeronautic enterprises such as the on-board autonomous Remote Agent Architecture recently flown on NASA's Deep Space 1 Probe as well as commercial industries that employ quick turnaround operations. Commercial technologies and processes supporting NASA's IM initiatives include condition based maintenance technologies from Boeing's Commercial 777 Aircraft and Lockheed-Martin's F-22 Fighter, automotive computer diagnostics and autonomous controllers that enable 100,000 mile maintenance free operations, and locomotive monitoring system software. This paper will summarize NASA's long-term strategy, development, and implementation plans for Informed Maintenance for next generation RLVs. This will be done by converging the work being performed throughout NASA, industry and academia into a single IM vision. Additionally, a current status of IM development throughout NASA programs such as the Space Shuttle, X-33, X-34 and X-37 will be provided and will conclude with an overview of near-term work that is being initiated in FY00 to support NASA's 2nd Generation Reusable Launch Vehicle Program.
Advanced Launch System Multi-Path Redundant Avionics Architecture Analysis and Characterization
NASA Technical Reports Server (NTRS)
Baker, Robert L.
1993-01-01
The objective of the Multi-Path Redundant Avionics Suite (MPRAS) program is the development of a set of avionic architectural modules which will be applicable to the family of launch vehicles required to support the Advanced Launch System (ALS). To enable ALS cost/performance requirements to be met, the MPRAS must support autonomy, maintenance, and testability capabilities which exceed those present in conventional launch vehicles. The multi-path redundant or fault tolerance characteristics of the MPRAS are necessary to offset a reduction in avionics reliability due to the increased complexity needed to support these new cost reduction and performance capabilities and to meet avionics reliability requirements which will provide cost-effective reductions in overall ALS recurring costs. A complex, real-time distributed computing system is needed to meet the ALS avionics system requirements. General Dynamics, Boeing Aerospace, and C.S. Draper Laboratory have proposed system architectures as candidates for the ALS MPRAS. The purpose of this document is to report the results of independent performance and reliability characterization and assessment analyses of each proposed candidate architecture and qualitative assessments of testability, maintainability, and fault tolerance mechanisms. These independent analyses were conducted as part of the MPRAS Part 2 program and were carried out under NASA Langley Research Contract NAS1-17964, Task Assignment 28.
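The reliability trade described here, added redundancy against added complexity, is the classical one. As a textbook point of reference (not a figure from the MPRAS analyses), a triple-modular-redundant channel with an ideal voter survives whenever at least two of its three modules, each of reliability R, survive:

```latex
% Reliability of a triple-modular-redundant (TMR) channel with an ideal
% voter: all three modules work, or exactly two of three do.
R_{\mathrm{TMR}} = R^3 + 3R^2(1 - R) = 3R^2 - 2R^3
```

Since R_TMR exceeds R only when R > 1/2, the extra circuitry supporting autonomy and testability must itself be highly reliable if it is not to erode overall avionics reliability, which is the trade such characterization analyses quantify.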
Are there two processes in reasoning? The dimensionality of inductive and deductive inferences.
Stephens, Rachel G; Dunn, John C; Hayes, Brett K
2018-03-01
Single-process accounts of reasoning propose that the same cognitive mechanisms underlie inductive and deductive inferences. In contrast, dual-process accounts propose that these inferences depend upon 2 qualitatively different mechanisms. To distinguish between these accounts, we derived a set of single-process and dual-process models based on an overarching signal detection framework. We then used signed difference analysis to test each model against data from an argument evaluation task, in which induction and deduction judgments are elicited for sets of valid and invalid arguments. Three data sets were analyzed: data from Singmann and Klauer (2011), a database of argument evaluation studies, and the results of an experiment designed to test model predictions. Of the large set of testable models, we found that almost all could be rejected, including all 2-dimensional models. The only testable model able to account for all 3 data sets was a model with 1 dimension of argument strength and independent decision criteria for induction and deduction judgments. We conclude that despite the popularity of dual-process accounts, current results from the argument evaluation task are best explained by a single-process account that incorporates separate decision thresholds for inductive and deductive inferences. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
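In schematic form (notation ours, not the authors'), the favoured model posits a single argument-strength variable s and separate response criteria for the two judgment types; with Φ the standard normal distribution function,

```latex
% One-dimensional signal detection account with independent criteria:
% a single strength s underlies both judgments (notation illustrative).
P(\text{``valid''} \mid \text{induction}) = \Phi(s - c_{\mathrm{ind}}), \qquad
P(\text{``valid''} \mid \text{deduction}) = \Phi(s - c_{\mathrm{ded}})
```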
Zhou, Shaona; Han, Jing; Koenig, Kathleen; Raplinger, Amy; Pi, Yuan; Li, Dan; Xiao, Hua; Fu, Zhao; Bao, Lei
2016-03-01
Scientific reasoning is an important component under the cognitive strand of the 21st century skills and is highly emphasized in the new science education standards. This study focuses on the assessment of student reasoning in control of variables (COV), which is a core sub-skill of scientific reasoning. The main research question is to investigate the extent to which the existence of experimental data in questions impacts student reasoning and performance. This study also explores the effects of task contexts on student reasoning as well as students' abilities to distinguish between testability and causal influences of variables in COV experiments. Data were collected with students from both USA and China. Students received randomly one of two test versions, one with experimental data and one without. The results show that students from both populations (1) perform better when experimental data are not provided, (2) perform better in physics contexts than in real-life contexts, and (3) students have a tendency to equate non-influential variables to non-testable variables. In addition, based on the analysis of both quantitative and qualitative data, a possible progression of developmental levels of student reasoning in control of variables is proposed, which can be used to inform future development of assessment and instruction.
Zhou, Shaona; Han, Jing; Koenig, Kathleen; Raplinger, Amy; Pi, Yuan; Li, Dan; Xiao, Hua; Fu, Zhao
2015-01-01
Scientific reasoning is an important component under the cognitive strand of the 21st century skills and is highly emphasized in the new science education standards. This study focuses on the assessment of student reasoning in control of variables (COV), which is a core sub-skill of scientific reasoning. The main research question is to investigate the extent to which the existence of experimental data in questions impacts student reasoning and performance. This study also explores the effects of task contexts on student reasoning as well as students’ abilities to distinguish between testability and causal influences of variables in COV experiments. Data were collected with students from both USA and China. Students received randomly one of two test versions, one with experimental data and one without. The results show that students from both populations (1) perform better when experimental data are not provided, (2) perform better in physics contexts than in real-life contexts, and (3) students have a tendency to equate non-influential variables to non-testable variables. In addition, based on the analysis of both quantitative and qualitative data, a possible progression of developmental levels of student reasoning in control of variables is proposed, which can be used to inform future development of assessment and instruction. PMID:26949425
Complexity, Testability, and Fault Analysis of Digital, Analog, and Hybrid Systems.
1984-09-30
E. Moret. Table 2. Decision Table, Example 3: ...[sec]ond and third rules are inconsistent, since both could apparently apply when it is (decision-table row: Raining? Yes / No / No) ...misclassification; a similar approach based on game theory was described in SLAG71 and a third in KULK76. When the class assignments are ...to a change from (X,Y)=(1,1) to (X,Y)=(0,0); the second corresponds to a change in the value of the function g=X+Y; and the third corresponds to a
Evolution beyond neo-Darwinism: a new conceptual framework.
Noble, Denis
2015-01-01
Experimental results in epigenetics and related fields of biological research show that the Modern Synthesis (neo-Darwinist) theory of evolution requires either extension or replacement. This article examines the conceptual framework of neo-Darwinism, including the concepts of 'gene', 'selfish', 'code', 'program', 'blueprint', 'book of life', 'replicator' and 'vehicle'. This form of representation is a barrier to extending or replacing existing theory as it confuses conceptual and empirical matters. These need to be clearly distinguished. In the case of the central concept of 'gene', the definition has moved all the way from describing a necessary cause (defined in terms of the inheritable phenotype itself) to an empirically testable hypothesis (in terms of causation by DNA sequences). Neo-Darwinism also privileges 'genes' in causation, whereas in multi-way networks of interactions there can be no privileged cause. An alternative conceptual framework is proposed that avoids these problems, and which is more favourable to an integrated systems view of evolution. © 2015. Published by The Company of Biologists Ltd.
Visual attention and flexible normalization pools
Schwartz, Odelia; Coen-Cagli, Ruben
2013-01-01
Attention to a spatial location or feature in a visual scene can modulate the responses of cortical neurons and affect perceptual biases in illusions. We add attention to a cortical model of spatial context based on a well-founded account of natural scene statistics. The cortical model amounts to a generalized form of divisive normalization, in which the surround is in the normalization pool of the center target only if they are considered statistically dependent. Here we propose that attention influences this computation by accentuating the neural unit activations at the attended location, and that the amount of attentional influence of the surround on the center thus depends on whether center and surround are deemed in the same normalization pool. The resulting form of model extends a recent divisive normalization model of attention (Reynolds & Heeger, 2009). We simulate cortical surround orientation experiments with attention and show that the flexible model is suitable for capturing additional data and makes nontrivial testable predictions. PMID:23345413
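Schematically, the divisive normalization computation being extended multiplies the excitatory drive E by an attentional gain field A before normalization (a simplified rendering of the Reynolds & Heeger, 2009, form):

```latex
% Divisive normalization with attentional gain A(x, theta) applied to the
% excitatory drive E before it enters the suppressive normalization pool.
R(x, \theta) = \frac{A(x, \theta)\, E(x, \theta)}
                    {\sigma + \sum_{x', \theta'} A(x', \theta')\, E(x', \theta')}
```

In the flexible-pool extension described above, surround units contribute to the denominator only when center and surround are inferred to be statistically dependent.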
Finite Cosmology and a CMB Cold Spot
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adler, R. J. (Stanford U., HEPL); Bjorken, J. D.
2006-03-20
The standard cosmological model posits a spatially flat universe of infinite extent. However, no observation, even in principle, could verify that the matter extends to infinity. In this work we model the universe as a finite spherical ball of dust and dark energy, and obtain a lower limit estimate of its mass and present size: the mass is at least 5 × 10^23 M_⊙ and the present radius is at least 50 Gly. If we are not too far from the dust-ball edge we might expect to see a cold spot in the cosmic microwave background, and there might be suppression of the low multipoles in the angular power spectrum. Thus the model may be testable, at least in principle. We also obtain and discuss the geometry exterior to the dust ball; it is Schwarzschild-de Sitter with a naked singularity, and provides an interesting picture of cosmogenesis. Finally we briefly sketch how radiation and inflation eras may be incorporated into the model.
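The two quoted lower limits are mutually consistent under a rough back-of-the-envelope check, assuming (our assumption, for illustration) that the dust ball carries a matter fraction Ω_m ≈ 0.3 of the critical density:

```latex
% Rough consistency check of the quoted mass and radius (assumptions:
% critical density rho_c ~ 9e-27 kg/m^3, matter fraction Omega_m ~ 0.3,
% and R = 50 Gly ~ 4.7e26 m).
M \gtrsim \Omega_m\, \rho_c \cdot \tfrac{4}{3}\pi R^3
  \approx 0.3 \times (9 \times 10^{-27}\,\mathrm{kg\,m^{-3}})
  \times \tfrac{4}{3}\pi\, (4.7 \times 10^{26}\,\mathrm{m})^3
  \approx 1 \times 10^{54}\,\mathrm{kg}
  \approx 5 \times 10^{23}\, M_\odot
```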
Bicultural identity conflict in second-generation Asian Canadians.
Stroink, Mirella L; Lalonde, Richard N
2009-02-01
Researchers have shown that bicultural individuals, including 2nd-generation immigrants, face a potential conflict between 2 cultural identities. The present authors extended this primarily qualitative research on the bicultural experience by adopting the social identity perspective (H. Tajfel & J. C. Turner, 1986). They developed and tested an empirically testable model of the role of cultural construals, in-group prototypicality, and identity in bicultural conflict in 2 studies with 2nd-generation Asian Canadians. In both studies, the authors expected and found that participants' construals of their 2 cultures as different predicted lower levels of simultaneous identification with both cultures. Furthermore, the authors found this relation was mediated by participants' feelings of prototypicality as members of both groups. Although the perception of cultural difference did not predict well-being as consistently and directly as the authors expected, levels of simultaneous identification did show these relations. The authors discuss results in the context of social identity theory (H. Tajfel & J. C. Turner) as a framework for understanding bicultural conflict.
Testing sterile neutrino extensions of the Standard Model at future lepton colliders
NASA Astrophysics Data System (ADS)
Antusch, Stefan; Fischer, Oliver
2015-05-01
Extending the Standard Model (SM) with sterile ("right-handed") neutrinos is one of the best motivated ways to account for the observed neutrino masses. We discuss the expected sensitivity of future lepton collider experiments for probing such extensions. An interesting testable scenario is given by "symmetry protected seesaw models", which theoretically allow for sterile neutrino masses around the electroweak scale with up to order one mixings with the light (SM) neutrinos. In addition to indirect tests, e.g. via electroweak precision observables, sterile neutrinos with masses around the electroweak scale can also be probed by direct searches, e.g. via sterile neutrino decays at the Z pole, deviations from the SM cross section for four lepton final states at and beyond the WW threshold and via Higgs boson decays. We study the present bounds on sterile neutrino properties from LEP and LHC as well as the expected sensitivities of possible future lepton colliders such as ILC, CEPC and FCC-ee (TLEP).
Soy-Based Therapeutic Baby Formulas: Testable Hypotheses Regarding the Pros and Cons
Westmark, Cara J.
2017-01-01
Soy-based infant formulas have been consumed in the United States since 1909, and currently constitute a significant portion of the infant formula market. There are efforts underway to generate genetically modified soybeans that produce therapeutic agents of interest with the intent to deliver those agents in a soy-based infant formula platform. The threefold purpose of this review article is to first discuss the pros and cons of soy-based infant formulas, then present testable hypotheses to discern the suitability of a soy platform for drug delivery in babies, and finally start a discussion to inform public policy on this important area of infant nutrition. PMID:28149839
Reliability/maintainability/testability design for dormancy
NASA Astrophysics Data System (ADS)
Seman, Robert M.; Etzl, Julius M.; Purnell, Arthur W.
1988-05-01
This document has been prepared as a tool for designers of dormant military equipment and systems. The purpose of this handbook is to provide design engineers with Reliability/Maintainability/Testability design guidelines for systems which spend significant portions of their life cycle in a dormant state. The dormant state is defined as a nonoperating mode where a system experiences very little or no electrical stress. The guidelines in this report present design criteria in the following categories: (1) Part Selection and Control; (2) Derating Practices; (3) Equipment/System Packaging; (4) Transportation and Handling; (5) Maintainability Design; (6) Testability Design; (7) Evaluation Methods for In-Plant and Field Evaluation; and (8) Product Performance Agreements. Wherever applicable, design guidelines for operating systems were included with the dormant design guidelines. This was done in an effort to produce design guidelines for a more complete life cycle. Although dormant systems spend significant portions of their life cycle in a nonoperating mode, the designer must design the system for the complete life cycle, including nonoperating as well as operating modes. The guidelines are primarily intended for use in the design of equipment composed of electronic parts and components. However, they can also be used for the design of systems which encompass both electronic and nonelectronic parts, as well as for the modification of existing systems.
Delay test generation for synchronous sequential circuits
NASA Astrophysics Data System (ADS)
Devadas, Srinivas
1989-05-01
We address the problem of generating tests for delay faults in non-scan synchronous sequential circuits. Delay test generation for sequential circuits is a considerably more difficult problem than delay testing of combinational circuits and has received much less attention. In this paper, we present a method for generating test sequences to detect delay faults in sequential circuits using the stuck-at fault sequential test generator STALLION. The method is complete in that it will generate a delay test sequence for a targeted fault given sufficient CPU time, if such a sequence exists. We term faults for which no delay test sequence exists, under our test methodology, sequentially delay redundant. We describe means of eliminating sequential delay redundancies in logic circuits. We present a partial-scan methodology for enhancing the testability of difficult-to-test or untestable sequential circuits, wherein a small number of flip-flops are selected and made controllable/observable. The selection process guarantees the elimination of all sequential delay redundancies. We show that an intimate relationship exists between state assignment and delay testability of a sequential machine. We describe a state assignment algorithm for the synthesis of sequential machines with maximal delay fault testability. Preliminary experimental results using the test generation, partial-scan and synthesis algorithms are presented.
Huang, Dan; Chen, Xuejuan; Gong, Qi; Yuan, Chaoqun; Ding, Hui; Bai, Jing; Zhu, Hui; Fu, Zhujun; Yu, Rongbin; Liu, Hu
2016-01-01
This survey was conducted to determine the testability, distribution and associations of ocular biometric parameters in Chinese preschool children. Ocular biometric examinations, including the axial length (AL) and corneal radius of curvature (CR), were conducted on 1,688 3-year-old subjects by using an IOLMaster in August 2015. Anthropometric parameters, including height and weight, were measured according to a standardized protocol, and body mass index (BMI) was calculated. The testability was 93.7% for the AL and 78.6% for the CR overall, and both measures improved with age. Girls performed slightly better in AL measurements (P = 0.08), and the difference in CR was statistically significant (P < 0.05). The AL distribution was normal in girls (P = 0.12), whereas it was not in boys (P < 0.05). For CR1, all subgroups presented normal distributions (P = 0.16 for boys; P = 0.20 for girls), but the distribution varied when the subgroups were combined (P < 0.05). CR2 presented a normal distribution (P = 0.11), whereas the AL/CR ratio was abnormal (P < 0.001). Boys exhibited a significantly longer AL, a greater CR and a greater AL/CR ratio than girls (all P < 0.001). PMID:27384307
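A sketch of the kind of analysis reported, using hypothetical simulated measurements and standard SciPy routines (the study's actual data and normality-test choice are not reproduced here):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
al = rng.normal(22.3, 0.7, size=500)    # axial length, mm (hypothetical)
cr = rng.normal(7.8, 0.25, size=500)    # mean corneal radius, mm (hypothetical)

al_cr = al / cr                          # the AL/CR ratio used in the survey
w, p = stats.shapiro(al_cr)              # one standard test of normality
print(f"Shapiro-Wilk W = {w:.3f}, p = {p:.3f}")
```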
Stereoacuity of preschool children with and without vision disorders.
Ciner, Elise B; Ying, Gui-Shuang; Kulp, Marjean Taylor; Maguire, Maureen G; Quinn, Graham E; Orel-Bixler, Deborah; Cyert, Lynn A; Moore, Bruce; Huang, Jiayan
2014-03-01
To evaluate associations between stereoacuity and presence, type, and severity of vision disorders in Head Start preschool children and determine testability and levels of stereoacuity by age in children without vision disorders. Stereoacuity of children aged 3 to 5 years (n = 2898) participating in the Vision in Preschoolers (VIP) Study was evaluated using the Stereo Smile II test during a comprehensive vision examination. This test uses a two-alternative forced-choice paradigm with four stereoacuity levels (480 to 60 seconds of arc). Children were classified by the presence (n = 871) or absence (n = 2027) of VIP Study-targeted vision disorders (amblyopia, strabismus, significant refractive error, or unexplained reduced visual acuity), including type and severity. Median stereoacuity between groups and among severity levels of vision disorders was compared using Wilcoxon rank sum and Kruskal-Wallis tests. Testability and stereoacuity levels were determined for children without VIP Study-targeted disorders overall and by age. Children with VIP Study-targeted vision disorders had significantly worse median stereoacuity than that of children without vision disorders (120 vs. 60 seconds of arc, p < 0.001). Children with the most severe vision disorders had worse stereoacuity than that of children with milder disorders (median 480 vs. 120 seconds of arc, p < 0.001). Among children without vision disorders, testability was 99.6% overall, increasing with age to 100% for 5-year-olds (p = 0.002). Most of the children without vision disorders (88%) had stereoacuity at the two best disparities (60 or 120 seconds of arc); the percentage increasing with age (82% for 3-, 89% for 4-, and 92% for 5-year-olds; p < 0.001). The presence of any VIP Study-targeted vision disorder was associated with significantly worse stereoacuity in preschool children. Severe vision disorders were more likely associated with poorer stereopsis than milder or no vision disorders. Testability was excellent at all ages. These results support the validity of the Stereo Smile II for assessing random-dot stereoacuity in preschool children.
Hassmiller Lich, Kristen; Urban, Jennifer Brown; Frerichs, Leah; Dave, Gaurav
2017-02-01
Group concept mapping (GCM) has been successfully employed in program planning and evaluation for over 25 years. The broader set of systems thinking methodologies (of which GCM is one) has only recently found its way into the field. We present an overview of systems thinking emerging from a system dynamics (SD) perspective, and illustrate the potential synergy between GCM and SD. As with GCM, participatory processes are frequently employed when building SD models; however, it can be challenging to engage a large and diverse group of stakeholders in the iterative cycles of divergent thinking and consensus building required, while maintaining a broad perspective on the issue being studied. GCM provides a compelling resource for overcoming this challenge, by richly engaging a diverse set of stakeholders in broad exploration, structuring, and prioritization. SD provides an opportunity to extend GCM findings by embedding constructs in a testable hypothesis (SD model) describing how system structure and changes in constructs affect outcomes over time. SD can be used to simulate the hypothesized dynamics inherent in GCM concept maps. We illustrate the potential of the marriage of these methodologies in a case study of BECOMING, a federally-funded program aimed at strengthening the cross-sector system of care for youth with severe emotional disturbances. Copyright © 2016 Elsevier Ltd. All rights reserved.
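To illustrate how an SD model can animate GCM constructs, here is a minimal stock-and-flow sketch with Euler integration; the constructs, rates and feedback loop are hypothetical placeholders, not the BECOMING model:

```python
# Minimal system-dynamics sketch (Euler integration) linking two
# hypothetical GCM constructs: "cross-sector collaboration" drives
# growth in "service capacity", which decays without reinforcement.
dt, T = 0.1, 100
collaboration, capacity = 0.2, 0.1
history = []
for step in range(int(T / dt)):
    inflow = 0.05 * collaboration * (1 - capacity)   # saturating growth
    outflow = 0.02 * capacity                        # attrition
    capacity += dt * (inflow - outflow)
    collaboration += dt * 0.01 * capacity            # reinforcing loop
    history.append(capacity)
print(f"capacity after {T} time units: {history[-1]:.2f}")
```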
The evolutionary psychology of hunger.
Al-Shawaf, Laith
2016-10-01
An evolutionary psychological perspective suggests that emotions can be understood as coordinating mechanisms whose job is to regulate various psychological and physiological programs in the service of solving an adaptive problem. This paper suggests that it may also be fruitful to approach hunger from this coordinating mechanism perspective. To this end, I put forward an evolutionary task analysis of hunger, generating novel a priori hypotheses about the coordinating effects of hunger on psychological processes such as perception, attention, categorization, and memory. This approach appears empirically fruitful in that it yields a bounty of testable new hypotheses. Copyright © 2016 Elsevier Ltd. All rights reserved.
Crystal study and econometric model
NASA Technical Reports Server (NTRS)
1975-01-01
An econometric model was developed that can be used to predict demand and supply figures for crystals over a time horizon roughly concurrent with that of NASA's Space Shuttle Program - that is, 1975 through 1990. The model includes an equation to predict the impact on investment in the crystal-growing industry. Actually, two models are presented. The first is a theoretical model which follows rather strictly the standard theoretical economic concepts involved in supply and demand analysis, and a modified version of the model was developed which, though not quite as theoretically sound, was testable utilizing existing data sources.
1981-05-01
[Garbled scan excerpt. Recoverable content: improved results are obtained if one first generates a data base in terms of the nominal circuit parameters and then extracts the appropriate symbolic transfer function; V0, IC1, VRA, and IE are initially allowed as test outputs, and the testability measure δmin is used to select among them. The excerpt includes Fig. 4, a direct-coupled two-stage amplifier, and its component-connection equations for Mode 2 analysis.]
NASA Tech Briefs, December 2012
NASA Technical Reports Server (NTRS)
2012-01-01
The topics include: Pattern Generator for Bench Test of Digital Boards; 670-GHz Down- and Up-Converting HEMT-Based Mixers; Lidar Electro-Optic Beam Switch with a Liquid Crystal Variable Retarder; Feedback Augmented Sub-Ranging (FASR) Quantizer; Real-Time Distributed Embedded Oscillator Operating Frequency Monitoring; Software Modules for the Proximity-1 Space Link Interleaved Time Synchronization (PITS) Protocol; Description and User Instructions for the Quaternion to Orbit v3 Software; AdapChem; Mars Relay Lander and Orbiter Overflight Profile Estimation; Extended Testability Analysis Tool; Interactive 3D Mars Visualization; Rapid Diagnostics of Onboard Sequences; MER Telemetry Processor; pyam: Python Implementation of YaM; Process for Patterning Indium for Bump Bonding; Archway for Radiation and Micrometeorite Occurrence Resistance; 4D Light Field Imaging System Using Programmable Aperture; Device and Container for Reheating and Sterilization; Radio Frequency Plasma Discharge Lamps for Use as Stable Calibration Light Sources; Membrane Shell Reflector Segment Antenna; High-Speed Transport of Fluid Drops and Solid Particles via Surface Acoustic Waves; Compact Autonomous Hemispheric Vision System; A Distributive, Non-Destructive, Real-Time Approach to Snowpack Monitoring; Wideband Single-Crystal Transducer for Bone Characterization; Numerical Simulation of Rocket Exhaust Interaction With Lunar Soil; Motion Imagery and Robotics Application (MIRA): Standards-Based Robotics; Particle Filtering for Model-Based Anomaly Detection in Sensor Networks; Ka-band Digitally Beamformed Airborne Radar Using SweepSAR Technique; Composite With In Situ Plenums; Multi-Beam Approach for Accelerating Alignment and Calibration of HyspIRI-Like Imaging Spectrometers; JWST Lifting System; Next-Generation Tumbleweed Rover; Pneumatic System for Concentration of Micrometer-Size Lunar Soil.
Légaré, France; Stacey, Dawn; Gagnon, Susie; Dunn, Sandy; Pluye, Pierre; Frosch, Dominick; Kryworuchko, Jennifer; Elwyn, Glyn; Gagnon, Marie-Pierre; Graham, Ian D
2011-01-01
Rationale, aims and objectives: Following increased interest in having inter-professional (IP) health care teams engage patients in decision making, we developed a conceptual model for an IP approach to shared decision making (SDM) in primary care. We assessed the validity of the model with stakeholders in Canada. Methods: In 15 individual interviews and 7 group interviews with 79 stakeholders, we asked them to: (1) propose changes to the IP-SDM model; (2) identify barriers and facilitators to the model's implementation in clinical practice; and (3) assess the model using a theory appraisal questionnaire. We performed a thematic analysis of the transcripts and a descriptive analysis of the questionnaires. Results: Stakeholders suggested placing the patient at its centre; extending the concept of family to include significant others; clarifying outcomes; highlighting the concept of time; merging the micro, meso and macro levels in one figure; and recognizing the influence of the environment and emotions. The most common barriers identified were time constraints, insufficient resources and an imbalance of power among health professionals. The most common facilitators were education and training in inter-professionalism and SDM, motivation to achieve an IP approach to SDM, and mutual knowledge and understanding of disciplinary roles. Most stakeholders considered that the concepts and relationships between the concepts were clear and rated the model as logical, testable, having clear schematic representation, and being relevant to inter-professional collaboration, SDM and primary care. Conclusions: Stakeholders validated the new IP-SDM model for primary care settings and proposed few modifications. Future research should assess if the model helps implement SDM in IP clinical practice. PMID:20695950
MetNet: Software to Build and Model the Biogenetic Lattice of Arabidopsis
Wurtele, Eve Syrkin; Li, Jie; Diao, Lixia; ...
2003-01-01
MetNet (http://www.botany.iastate.edu/∼mash/metnetex/metabolicnetex.html) is publicly available software in development for analysis of genome-wide RNA, protein and metabolite profiling data. The software is designed to enable the biologist to visualize, statistically analyse and model a metabolic and regulatory network map of Arabidopsis, combined with gene expression profiling data. It contains a JAVA interface to an interactions database (MetNetDB) containing information on regulatory and metabolic interactions derived from a combination of web databases (TAIR, KEGG, BRENDA) and input from biologists in their area of expertise. FCModeler captures input from MetNetDB in a graphical form. Sub-networks can be identified and interpreted using simple fuzzy cognitive maps. FCModeler is intended to develop and evaluate hypotheses, and provide a modelling framework for assessing the large amounts of data captured by high-throughput gene expression experiments. FCModeler and MetNetDB are currently being extended to three-dimensional virtual reality display. The MetNet map, together with gene expression data, can be viewed using multivariate graphics tools in GGobi linked with the data analytic tools in R. Users can highlight different parts of the metabolic network and see the relevant expression data highlighted in other data plots. Multi-dimensional expression data can be rotated through different dimensions. Statistical analysis can be computed alongside the visual. MetNet is designed to provide a framework for the formulation of testable hypotheses regarding the function of specific genes, and in the long term provide the basis for identification of metabolic and regulatory networks that control plant composition and development.
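A minimal sketch of the fuzzy-cognitive-map iteration that FCModeler-style tools build on; the three-node map and weights are hypothetical, and this is not FCModeler's actual implementation:

```python
import numpy as np

def fcm_step(state, W, squash=np.tanh):
    """One synchronous update of a fuzzy cognitive map:
    new activation = squash(weighted sum of causal inputs)."""
    return squash(W @ state)

# Hypothetical 3-node map: gene -> enzyme -> metabolite, with
# feedback from the metabolite back onto gene expression.
W = np.array([[0.0, 0.0, -0.5],   # metabolite represses gene
              [0.8, 0.0,  0.0],   # gene activates enzyme
              [0.0, 0.9,  0.0]])  # enzyme produces metabolite
state = np.array([1.0, 0.0, 0.0])
for _ in range(20):
    state = fcm_step(state, W)
print(state)   # settles toward a steady activation pattern
```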
Gillison, Andrew N; Asner, Gregory P; Fernandes, Erick C M; Mafalacusser, Jacinto; Banze, Aurélio; Izidine, Samira; da Fonseca, Ambrósio R; Pacate, Hermenegildo
2016-07-15
Sustainable biodiversity and land management require a cost-effective means of forecasting landscape response to environmental change. Conventional species-based, regional biodiversity assessments are rarely adequate for policy planning and decision making. We show how new ground and remotely-sensed survey methods can be coordinated to help elucidate and predict relationships between biodiversity, land use and soil properties along complex biophysical gradients that typify many similar landscapes worldwide. In the lower Zambezi valley, Mozambique we used environmental, gradient-directed transects (gradsects) to sample vascular plant species, plant functional types, vegetation structure, soil properties and land-use characteristics. Soil fertility indices were derived using novel multidimensional scaling of soil properties. To facilitate spatial analysis, we applied a probabilistic remote sensing approach, analyzing Landsat 7 satellite imagery to map photosynthetically active and inactive vegetation and bare soil along each gradsect. Despite the relatively low sample number, we found highly significant correlations between single and combined sets of specific plant, soil and remotely sensed variables that permitted testable spatial projections of biodiversity and soil fertility across the regional land-use mosaic. This integrative and rapid approach provides a low-cost, high-return and readily transferable methodology that permits the ready identification of testable biodiversity indicators for adaptive management of biodiversity and potential agricultural productivity. Copyright © 2016 Elsevier Ltd. All rights reserved.
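The soil-fertility scaling step might look roughly like the following sketch, which embeds standardized soil properties into one MDS dimension and rescales it to [0, 1]; the data and the exact MDS variant are assumptions, not the paper's procedure:

```python
import numpy as np
from sklearn.manifold import MDS

rng = np.random.default_rng(0)
soil = rng.normal(size=(45, 6))   # rows: samples; cols: standardized soil properties

embedding = MDS(n_components=1, random_state=0).fit_transform(soil)
fertility = (embedding - embedding.min()) / np.ptp(embedding)  # rescale to [0, 1]
print(fertility[:5].ravel())      # a relative fertility index per sample
```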
Metabolic network flux analysis for engineering plant systems.
Shachar-Hill, Yair
2013-04-01
Metabolic network flux analysis (NFA) tools have proven themselves to be powerful aids to metabolic engineering of microbes by providing quantitative insights into the flows of material and energy through cellular systems. The development and application of NFA tools for plant systems have advanced in recent years and are yielding significant insights and testable predictions. Plants present substantial opportunities for the practical application of NFA but they also pose serious challenges related to the complexity of plant metabolic networks and to deficiencies in our knowledge of their structure and regulation. By considering the tools available and selected examples, this article attempts to assess where and how NFA is most likely to have a real impact on plant biotechnology. Copyright © 2013 Elsevier Ltd. All rights reserved.
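For readers new to NFA, a minimal flux balance analysis on a toy three-reaction network, posed as a linear program with SciPy (illustrative only; real plant networks have thousands of reactions):

```python
import numpy as np
from scipy.optimize import linprog

# Toy network: uptake (v1) -> A, conversion A -> B (v2), biomass drain (v3).
# Stoichiometric matrix S: rows are metabolites A, B; columns are v1..v3.
S = np.array([[ 1, -1,  0],
              [ 0,  1, -1]])
bounds = [(0, 10), (0, 10), (0, 10)]   # flux capacity limits
c = np.array([0, 0, -1.0])             # maximize v3 (linprog minimizes)

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
print(res.x)   # steady-state flux distribution maximizing the biomass flux
```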
Use of direct gradient analysis to uncover biological hypotheses in 16s survey data and beyond.
Erb-Downward, John R; Sadighi Akha, Amir A; Wang, Juan; Shen, Ning; He, Bei; Martinez, Fernando J; Gyetko, Margaret R; Curtis, Jeffrey L; Huffnagle, Gary B
2012-01-01
This study investigated the use of direct gradient analysis of bacterial 16S pyrosequencing surveys to identify relevant bacterial community signals in the midst of a "noisy" background, and to facilitate hypothesis-testing both within and beyond the realm of ecological surveys. The results, utilizing 3 different real world data sets, demonstrate the utility of adding direct gradient analysis to any analysis that draws conclusions from indirect methods such as Principal Component Analysis (PCA) and Principal Coordinates Analysis (PCoA). Direct gradient analysis produces testable models, and can identify significant patterns in the midst of noisy data. Additionally, we demonstrate that direct gradient analysis can be used with other kinds of multivariate data sets, such as flow cytometric data, to identify differentially expressed populations. The results of this study demonstrate the utility of direct gradient analysis in microbial ecology and in other areas of research where large multivariate data sets are involved.
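A compact sketch of one direct gradient method, redundancy analysis (RDA): regress the centered community matrix on explanatory variables, then take principal axes of the fitted values. The simulated OTU table and gradients are hypothetical:

```python
import numpy as np

def rda(Y, X):
    """Minimal redundancy analysis (a direct gradient method):
    regress the centered community matrix Y on explanatory variables X,
    then extract principal axes of the fitted (constrained) values."""
    Yc = Y - Y.mean(axis=0)
    Xc = np.column_stack([np.ones(len(X)), X])
    B, *_ = np.linalg.lstsq(Xc, Yc, rcond=None)
    fitted = Xc @ B
    _, s, Vt = np.linalg.svd(fitted, full_matrices=False)
    return fitted @ Vt.T[:, :2], s**2 / (s**2).sum()

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 2))                                   # e.g. pH, depth
Y = X @ rng.normal(size=(2, 50)) + rng.normal(size=(30, 50))   # noisy OTU table
scores, explained = rda(Y, X)
print(explained[:2])   # variance captured by constrained axes 1 and 2
```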
Condition-dependent functional connectivity: syntax networks in bilinguals
Dodel, Silke; Golestani, Narly; Pallier, Christophe; ElKouby, Vincent; Le Bihan, Denis; Poline, Jean-Baptiste
2005-01-01
This paper introduces a method to study the variation of brain functional connectivity networks with respect to experimental conditions in fMRI data. It is related to the psychophysiological interaction technique introduced by Friston et al. and extends to networks of correlation modulation (CM networks). Extended networks containing several dozens of nodes are determined in which the links correspond to consistent correlation modulation across subjects. In addition, we assess inter-subject variability and determine networks in which the condition-dependent functional interactions can be explained by a subject-dependent variable. We applied the technique to data from a study on syntactical production in bilinguals and analysed functional interactions differentially across tasks (word reading or sentence production) and across languages. We find an extended network of consistent functional interaction modulation across tasks, whereas the network comparing languages shows fewer links. Interestingly, there is evidence for a specific network in which the differences in functional interaction across subjects can be explained by differences in the subjects' syntactical proficiency. Specifically, we find that regions, including ones that have previously been shown to be involved in syntax and in language production, such as the left inferior frontal gyrus, putamen, insula, precentral gyrus, as well as the supplementary motor area, are more functionally linked during sentence production in the second, compared with the first, language in syntactically more proficient bilinguals than in syntactically less proficient ones. Our approach extends conventional activation analyses to the notion of networks, emphasizing functional interactions between regions independently of whether or not they are activated. On the one hand, it gives rise to testable hypotheses and allows an interpretation of the results in terms of the previous literature, and on the other hand, it provides a basis for studying the structure of functional interactions as a whole, and hence represents a further step towards the notion of large-scale networks in functional imaging. PMID:16087437
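The core correlation-modulation computation for a single region pair might be sketched as follows (a schematic reading of the method, with simulated time series; the paper's full pipeline, including subject-level consistency testing, is not reproduced):

```python
import numpy as np

def cm_link(ts_a, ts_b, cond):
    """Correlation modulation for one region pair: the difference between
    condition-specific correlations (cond is a boolean mask over time)."""
    r1 = np.corrcoef(ts_a[cond], ts_b[cond])[0, 1]
    r2 = np.corrcoef(ts_a[~cond], ts_b[~cond])[0, 1]
    return r1 - r2

# Simulated example: two regions whose coupling is stronger in condition A.
rng = np.random.default_rng(0)
n = 200
cond = np.arange(n) < n // 2
shared = rng.normal(size=n)
a = shared + 0.5 * rng.normal(size=n)
b = np.where(cond, shared, rng.normal(size=n)) + 0.5 * rng.normal(size=n)
print(f"CM value: {cm_link(a, b, cond):.2f}")  # positive: coupling higher in A
```

Consistency across subjects would then be assessed by, for example, testing per-subject CM values against zero for each link of the network.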
On testing VLSI chips for the big Viterbi decoder
NASA Technical Reports Server (NTRS)
Hsu, I. S.
1989-01-01
A general technique that can be used in testing very large scale integrated (VLSI) chips for the Big Viterbi Decoder (BVD) system is described. The test technique is divided into functional testing and fault-coverage testing. The purpose of functional testing is to verify that the design works functionally. Functional test vectors are converted from outputs of software simulations which simulate the BVD functionally. Fault-coverage testing is used to detect and, in some cases, to locate faulty components caused by bad fabrication. This type of testing is useful in screening out bad chips. Finally, design for testability, which is included in the BVD VLSI chip design, is described in considerable detail. Both the observability and controllability of a VLSI chip are greatly enhanced by including the design-for-testability features.
QUANTITATIVE TESTS OF ELMS AS INTERMEDIATE N PEELING-BALLOONING MODES
DOE Office of Scientific and Technical Information (OSTI.GOV)
LAO,LL; SNYDER,PB; LEONARD,AW
2003-03-01
Several testable features of the working model of edge localized modes (ELMs) as intermediate toroidal mode number peeling-ballooning modes are evaluated quantitatively using DIII-D and JT-60U experimental data and the ELITE MHD stability code. These include the hypotheses that ELM sizes are related to the radial widths of the unstable MHD modes, that the unstable modes have a strong ballooning character localized in the outboard bad curvature region, and that ELM size generally becomes smaller at high edge collisionality. ELMs are triggered when the growth rates of the unstable MHD modes become significantly large. These testable features are consistent with many ELM observations in DIII-D and JT-60U discharges.
Linking short-term responses to ecologically-relevant outcomes
An opportunity to participate in collaborative, integrative laboratory, field and modelling efforts to characterize molecular-to-organismal level responses and to make quantitative, testable predictions of population-level outcomes.
Crowell, Sheila E.; Beauchaine, Theodore P.; Linehan, Marsha M.
2009-01-01
Over the past several decades, research has focused increasingly on developmental precursors to psychological disorders that were previously assumed to emerge only in adulthood. This change in focus follows from the recognition that complex transactions between biological vulnerabilities and psychosocial risk factors shape emotional and behavioral development beginning at conception. To date, however, empirical research on the development of borderline personality is extremely limited. Indeed, in the decade since M. M. Linehan initially proposed a biosocial model of the development of borderline personality disorder, there have been few attempts to test the model among at-risk youth. In this review, diverse literatures are reviewed that can inform understanding of the ontogenesis of borderline pathology, and testable hypotheses are proposed to guide future research with at-risk children and adolescents. One probable pathway is identified that leads to borderline personality disorder; it begins with early vulnerability, expressed initially as impulsivity and followed by heightened emotional sensitivity. These vulnerabilities are potentiated across development by environmental risk factors that give rise to more extreme emotional, behavioral, and cognitive dysregulation. PMID:19379027
The spectro-contextual encoding and retrieval theory of episodic memory.
Watrous, Andrew J; Ekstrom, Arne D
2014-01-01
The spectral fingerprint hypothesis, which posits that different frequencies of oscillations underlie different cognitive operations, provides one account for how interactions between brain regions support perceptual and attentive processes (Siegel et al., 2012). Here, we explore and extend this idea to the domain of human episodic memory encoding and retrieval. Incorporating findings from the synaptic to cognitive levels of organization, we argue that spectrally precise cross-frequency coupling and phase-synchronization promote the formation of hippocampal-neocortical cell assemblies that form the basis for episodic memory. We suggest that both cell assembly firing patterns as well as the global pattern of brain oscillatory activity within hippocampal-neocortical networks represents the contents of a particular memory. Drawing upon the ideas of context reinstatement and multiple trace theory, we argue that memory retrieval is driven by internal and/or external factors which recreate these frequency-specific oscillatory patterns which occur during episodic encoding. These ideas are synthesized into a novel model of episodic memory (the spectro-contextual encoding and retrieval theory, or "SCERT") that provides several testable predictions for future research.
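One common way to quantify the spectrally precise cross-frequency coupling SCERT appeals to is a mean-vector-length phase-amplitude coupling estimate; the following sketch (band edges and filter order are arbitrary choices) synthesizes a theta-modulated gamma signal and recovers the coupling:

```python
import numpy as np
from scipy.signal import hilbert, butter, filtfilt

def pac(signal, fs, phase_band, amp_band):
    """Mean-vector-length phase-amplitude coupling (Canolty-style),
    one standard estimator of cross-frequency coupling. Sketch only."""
    def bandpass(x, lo, hi):
        b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
        return filtfilt(b, a, x)
    phase = np.angle(hilbert(bandpass(signal, *phase_band)))
    amp = np.abs(hilbert(bandpass(signal, *amp_band)))
    return np.abs(np.mean(amp * np.exp(1j * phase)) / amp.mean())

fs = 500
t = np.arange(0, 10, 1 / fs)
theta = np.sin(2 * np.pi * 6 * t)
gamma = (1 + theta) * np.sin(2 * np.pi * 60 * t)   # gamma amplitude locked to theta
print(f"PAC = {pac(theta + gamma, fs, (4, 8), (40, 80)):.2f}")
```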
Decoupling electron and ion storage and the path from interfacial storage to artificial electrodes
NASA Astrophysics Data System (ADS)
Chen, Chia-Chin; Maier, Joachim
2018-02-01
The requirements for rechargeable batteries place high demands on the electrodes. Efficient storage means accommodating both ions and electrons, not only in substantial amounts, but also with substantial velocities. The materials' space could be largely extended by decoupling the roles of ions and electrons such that transport and accommodation of ions take place in one phase of a composite, and transport and accommodation of electrons in the other phase. Here we discuss this synergistic concept being equally applicable for positive and negative electrodes along with examples from the literature for Li-based and Ag-based cells. Not only does the concept have the potential to mitigate the trade-off between power density and energy density, it also enables a generalized view of bulk and interfacial storage as necessary for nanocrystals. It furthermore allows for testable predictions of heterogeneous storage in passivation layers, dependence of transfer resistance on the state of charge, or heterogeneous storage of hydrogen at appropriate contacts. We also present an outlook on constructing artificial mixed-conductor electrodes that have the potential to achieve both high energy density and high power density.
Kim, Betty E; Seligman, Darryl; Kable, Joseph W
2012-01-01
Recent work has shown that visual fixations reflect and influence trial-to-trial variability in people's preferences between goods. Here we extend this principle to attribute weights during decision making under risk. We measured eye movements while people chose between two risky gambles or bid on a single gamble. Consistent with previous work, we found that people exhibited systematic preference reversals between choices and bids. For two gambles matched in expected value, people systematically chose the higher probability option but provided a higher bid for the option that offered the greater amount to win. This effect was accompanied by a shift in fixations of the two attributes, with people fixating on probabilities more during choices and on amounts more during bids. Our results suggest that the construction of value during decision making under risk depends on task context partly because the task differentially directs attention at probabilities vs. amounts. Since recent work demonstrates that neural correlates of value vary with visual fixations, our results also suggest testable hypotheses regarding how task context modulates the neural computation of value to generate preference reversals.
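A toy illustration of how fixation-dependent attribute weights can produce choice/bid preference reversals; the functional form and parameters are invented for illustration, not estimated from the study's data:

```python
def gamble_value(p, amount, frac_fix_prob, alpha=0.6):
    """Toy fixation-weighted valuation: the attribute fixated more gets
    a larger exponent (weight); alpha sets fixation's leverage.
    Purely illustrative; all parameters are hypothetical."""
    w_p = (1 - alpha) + alpha * frac_fix_prob
    w_a = (1 - alpha) + alpha * (1 - frac_fix_prob)
    return (p ** w_p) * (amount ** w_a)

g1 = (0.9, 10.0)   # high probability, low amount
g2 = (0.3, 40.0)   # low probability, high amount
for frac, context in [(0.7, "choice (fixate probabilities)"),
                      (0.3, "bid (fixate amounts)")]:
    v1, v2 = gamble_value(*g1, frac), gamble_value(*g2, frac)
    print(f"{context}: prefers {'g1' if v1 > v2 else 'g2'}")
```

Run as written, the choice context favors the high-probability gamble while the bid context favors the high-amount gamble, mirroring the reversal pattern described.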
Oyster reefs can outpace sea-level rise
NASA Astrophysics Data System (ADS)
Rodriguez, Antonio B.; Fodrie, F. Joel; Ridge, Justin T.; Lindquist, Niels L.; Theuerkauf, Ethan J.; Coleman, Sara E.; Grabowski, Jonathan H.; Brodeur, Michelle C.; Gittman, Rachel K.; Keller, Danielle A.; Kenworthy, Matthew D.
2014-06-01
In the high-salinity seaward portions of estuaries, oysters seek refuge from predation, competition and disease in intertidal areas, but this sanctuary will be lost if vertical reef accretion cannot keep pace with sea-level rise (SLR). Oyster-reef abundance has already declined ~85% globally over the past 100 years, mainly from overharvesting, making any additional losses due to SLR cause for concern. Before any assessment of reef response to accelerated SLR can be made, direct measures of reef growth are necessary. Here, we present direct measurements of intertidal oyster-reef growth from cores and terrestrial lidar-derived digital elevation models. On the basis of our measurements collected within a mid-Atlantic estuary over a 15-year period, we developed a globally testable empirical model of intertidal oyster-reef accretion. We show that previous estimates of vertical reef growth, based on radiocarbon dates and bathymetric maps, may be greater than one order of magnitude too slow. The intertidal reefs we studied should be able to keep up with any future accelerated rate of SLR and may even benefit from the additional subaqueous space allowing extended vertical accretion.
Mapping the landscape of metabolic goals of a cell
Zhao, Qi; Stettner, Arion I.; Reznik, Ed; ...
2016-05-23
Here, genome-scale flux balance models of metabolism provide testable predictions of all metabolic rates in an organism, by assuming that the cell is optimizing a metabolic goal known as the objective function. We introduce an efficient inverse flux balance analysis (invFBA) approach, based on linear programming duality, to characterize the space of possible objective functions compatible with measured fluxes. After testing our algorithm on simulated E. coli data and time-dependent S. oneidensis fluxes inferred from gene expression, we apply our inverse approach to flux measurements in long-term evolved E. coli strains, revealing objective functions that provide insight into metabolic adaptation trajectories.
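Schematically, invFBA rests on the LP optimality (duality) conditions of the FBA program; in our notation (which may differ from the paper's), a measured flux vector v* is optimal for some objective c exactly when dual certificates exist:

```latex
% Schematic LP-duality condition underlying inverse FBA (notation ours):
\begin{align*}
\text{(FBA)}\quad & \max_{v}\; c^{\top} v
  \quad \text{s.t.}\quad S v = 0,\;\; l \le v \le u,\\
\text{(invFBA)}\quad & \text{find } c \text{ such that } \exists\,
  \lambda,\; \mu^{+}\ge 0,\; \mu^{-}\ge 0:\;
  c = S^{\top}\lambda + \mu^{+} - \mu^{-},\\
& \mu^{+}_{i} > 0 \Rightarrow v^{*}_{i} = u_{i}, \qquad
  \mu^{-}_{i} > 0 \Rightarrow v^{*}_{i} = l_{i}.
\end{align*}
% The set of admissible c is a polyhedral cone, which is why linear
% programming can characterize the space of compatible objectives.
```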
Off-line, built-in test techniques for VLSI circuits
NASA Technical Reports Server (NTRS)
Buehler, M. G.; Sievers, M. W.
1982-01-01
It is shown that the use of redundant on-chip circuitry improves the testability of an entire VLSI circuit. In the study described here, five techniques applied to a two-bit ripple carry adder are compared. The techniques considered are self-oscillation, self-comparison, partition, scan path, and built-in logic block observer. It is noted that both classical stuck-at faults and nonclassical faults, such as bridging faults (shorts), stuck-on x faults where x may be 0, 1, or vary between the two, and parasitic flip-flop faults occur in IC structures. To simplify the analysis of the testing techniques, however, a stuck-at fault model is assumed.
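A small worked example of the stuck-at fault model on exactly this circuit: inject a stuck-at-0 fault on the internal carry of a two-bit ripple-carry adder and count which input vectors detect it (the net name and fault site are chosen arbitrarily):

```python
from itertools import product

def full_adder(a, b, cin):
    s = a ^ b ^ cin
    cout = (a & b) | (cin & (a ^ b))
    return s, cout

def two_bit_adder(a0, a1, b0, b1, stuck=None):
    """2-bit ripple-carry adder; `stuck` optionally forces one internal
    net to a constant, e.g. ("c1", 0) models the carry stuck-at-0."""
    s0, c1 = full_adder(a0, b0, 0)
    if stuck and stuck[0] == "c1":
        c1 = stuck[1]
    s1, c2 = full_adder(a1, b1, c1)
    return s0, s1, c2

# A test vector detects the fault iff good and faulty outputs differ.
detecting = [v for v in product((0, 1), repeat=4)
             if two_bit_adder(*v) != two_bit_adder(*v, stuck=("c1", 0))]
print(len(detecting), "of 16 input vectors detect carry stuck-at-0")
```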
A Preliminary Testability Analysis of the Mil-STD-1862 Architecture.
1981-08-01
[Garbled scan excerpt of register-transfer-level pseudocode. Recoverable content: sequences of read(tmp..., word, M.w) operations that fetch the addresses of the successor and predecessor of a queue element P and check read/write access rights, with branch conditions of the form IF tmp<1:0> OR tmp1<1:0> OR tmp2<1:0>.]
Left-handed and right-handed U(1) gauge symmetry
NASA Astrophysics Data System (ADS)
Nomura, Takaaki; Okada, Hiroshi
2018-01-01
We propose a model with left-handed and right-handed continuous Abelian gauge symmetries, U(1)_L × U(1)_R. Three right-handed neutrinos are then naturally required to achieve U(1)_R anomaly cancellation, while several mirror fermions are also needed for U(1)_L anomaly cancellation. We formulate the model and discuss the testability of the new gauge interactions at colliders such as the Large Hadron Collider (LHC) and the International Linear Collider (ILC). In particular, the chiral structure of the interactions can be investigated through the analysis of forward-backward asymmetry with polarized beams at the ILC.
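The anomaly-cancellation logic can be made explicit. The conditions below are the generic ones for an extra U(1) under which only right-handed fields are charged; the sample charge assignment is a common illustrative choice, not necessarily the paper's:

```latex
% Generic anomaly-cancellation conditions for an extra U(1)_R coupling
% only to right-handed fermions (charges q_i; count 3 colors per quark):
\begin{align*}
[U(1)_R]^3 &: \sum_i q_i^3 = 0, &
[\mathrm{grav}]^2\,U(1)_R &: \sum_i q_i = 0, &
[SU(3)_c]^2\,U(1)_R &: \sum_{\mathrm{RH\ quarks}} q_i = 0.
\end{align*}
% Illustrative universal charges per generation (an assumption, not the
% paper's table): q(u_R)=+1, q(d_R)=-1, q(e_R)=-1, q(\nu_R)=+1. Then
% 3(+1) + 3(-1) + (-1) + (+1) = 0, and likewise for the cubic sum, but
% only if \nu_R is included: hence three right-handed neutrinos for
% three generations.
```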
Computational Psychiatry of ADHD: Neural Gain Impairments across Marrian Levels of Analysis
Hauser, Tobias U.; Fiore, Vincenzo G.; Moutoussis, Michael; Dolan, Raymond J.
2016-01-01
Attention-deficit hyperactivity disorder (ADHD), one of the most common psychiatric disorders, is characterised by unstable response patterns across multiple cognitive domains. However, the neural mechanisms that explain these characteristic features remain unclear. Using a computational multilevel approach, we propose that ADHD is caused by impaired gain modulation in the systems that generate this phenotype of increased behavioural variability. Using Marr's three levels of analysis as a heuristic framework, we focus on this variable behaviour, detail how it can be explained algorithmically, and how it might be implemented at a neural level through catecholamine influences on corticostriatal loops. This computational, multilevel, approach to ADHD provides a framework for bridging gaps between descriptions of neuronal activity and behaviour, and provides testable predictions about impaired mechanisms. PMID:26787097
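A minimal sketch of the gain account: treating neural gain as a softmax inverse temperature, lowering it flattens choice probabilities and produces the more variable responding associated with ADHD (toy values, not the paper's model):

```python
import numpy as np

def choose(values, gain, rng):
    """Softmax action selection; `gain` plays the role of neural gain
    (inverse temperature). Lower gain -> flatter probabilities ->
    more variable responding."""
    p = np.exp(gain * np.asarray(values))
    p /= p.sum()
    return rng.choice(len(values), p=p)

rng = np.random.default_rng(1)
values = [1.0, 0.5, 0.2]
for gain in (4.0, 0.5):                      # high vs. low gain
    picks = [choose(values, gain, rng) for _ in range(1000)]
    print(f"gain={gain}: best option chosen {picks.count(0) / 10:.0f}% of trials")
```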
A SEU-Hard Flip-Flop for Antifuse FPGAs
NASA Technical Reports Server (NTRS)
Katz, R.; Wang, J. J.; McCollum, J.; Cronquist, B.; Chan, R.; Yu, D.; Kleyner, I.; Day, John H. (Technical Monitor)
2001-01-01
A single event upset (SEU)-hardened flip-flop has been designed and developed for antifuse Field Programmable Gate Array (FPGA) application. Design and application issues, testability, test methods, simulation, and results are discussed.
The changing features of the body-mind problem.
Agassi, Joseph
2007-01-01
The body-mind problem invites scientific study, since mental events are repeated and repeatable and invite testable explanations. They seemed troublesome because of the classical theory of substance that failed to solve its own central problems. These are soluble with the aid of the theory of the laws of nature, particularly in its emergentist version [Bunge, M., 1980. The Body-mind Problem, Pergamon, Oxford] that invites refutable explanations [Popper, K.R., 1959. The Logic of Scientific Discovery, Hutchinson, London]. The view of mental properties as emergent is a modification of the two chief classical views, materialism and dualism. As this view invites testable explanations of events of the inner world, it is better than the quasi-behaviorist view of self-awareness as computer-style self-monitoring [Minsky, M., Laske, O., 1992. A conversation with Marvin Minsky. AI Magazine 13 (3), 31-45].
Testability of evolutionary game dynamics based on experimental economics data
NASA Astrophysics Data System (ADS)
Wang, Yijia; Chen, Xiaojie; Wang, Zhijian
2017-11-01
Understanding the dynamic processes of a real game system requires an appropriate dynamics model, and rigorously testing a dynamics model is nontrivial. In our methodological research, we develop an approach to testing the validity of game dynamics models that considers the dynamic patterns of angular momentum and speed as measurement variables. Using Rock-Paper-Scissors (RPS) games as an example, we illustrate the geometric patterns in the experiment data. We then derive the related theoretical patterns from a series of typical dynamics models. By testing the goodness-of-fit between the experimental and theoretical patterns, we show that the validity of these models can be evaluated quantitatively. Our approach establishes a link between dynamics models and experimental systems, which is, to the best of our knowledge, the most effective and rigorous strategy for ascertaining the testability of evolutionary game dynamics models.
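One plausible implementation of the angular-momentum measurement variable for an observed social-state trajectory (the planar projection and the toy cycling data are our assumptions, not the paper's code):

```python
import numpy as np

def mean_angular_momentum(traj):
    """Mean angular momentum of a social-state trajectory about its
    time average: L_t = (x_t - xbar) x (x_{t+1} - x_t), using two
    independent simplex coordinates (z-component of the cross product).
    A persistent sign indicates cycling, as reported in RPS data."""
    x = np.asarray(traj)[:, :2] - np.mean(traj, axis=0)[:2]
    v = np.diff(x, axis=0)
    L = x[:-1, 0] * v[:, 1] - x[:-1, 1] * v[:, 0]
    return L.mean()

# Toy cycling trajectory around the RPS centroid (hypothetical data).
t = np.linspace(0, 6 * np.pi, 200)
traj = np.stack([1/3 + 0.1 * np.cos(t), 1/3 + 0.1 * np.sin(t),
                 1/3 - 0.1 * np.cos(t) - 0.1 * np.sin(t)], axis=1)
print(f"mean L = {mean_angular_momentum(traj):.4f}")  # nonzero -> cycling
```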
Sources, Sinks, and Model Accuracy
Spatial demographic models are a necessary tool for understanding how to manage landscapes sustainably for animal populations. These models, therefore, must offer precise and testable predictions about animal population dynamics and how animal demographic parameters respond to ...
Lanzén, Anders; Lekang, Katrine; Jonassen, Inge; Thompson, Eric M; Troedsson, Christofer
2016-09-01
As global exploitation of available resources increases, operations extend towards sensitive and previously protected ecosystems. It is important to monitor such areas in order to detect, understand and remediate environmental responses to stressors. The natural heterogeneity and complexity of communities means that accurate monitoring requires high resolution, both temporally and spatially, as well as more complete assessments of taxa. Increased resolution and taxonomic coverage is economically challenging using current microscopy-based monitoring practices. Alternatively, DNA sequencing-based methods have been suggested for cost-efficient monitoring, offering additional insights into ecosystem function and disturbance. Here, we applied DNA metabarcoding of eukaryotic communities in marine sediments, in areas of offshore drilling on the Norwegian continental shelf. Forty-five samples, collected from seven drilling sites in the Troll/Oseberg region, were assessed, using the small subunit ribosomal RNA gene as a taxonomic marker. In agreement with results based on classical morphology-based monitoring, we were able to identify changes in sediment communities surrounding oil platforms. In addition to overall changes in community structure, we identified several potential indicator taxa, responding to pollutants associated with drilling fluids. These included the metazoan orders Macrodasyida, Macrostomida and Ceriantharia, as well as several ciliates and other protist taxa, typically not targeted by environmental monitoring programmes. Analysis of a co-occurrence network to study the distribution of taxa across samples provided a framework for better understanding the impact of anthropogenic activities on the benthic food web, generating novel, testable hypotheses of trophic interactions structuring benthic communities. © 2016 The Authors. Molecular Ecology Published by John Wiley & Sons Ltd.
Comparative morphology of stingray lateral line canal and electrosensory systems.
Jordan, Laura K
2008-11-01
Elasmobranchs (sharks, skates, and rays) possess a variety of sensory systems including the mechanosensory lateral line and electrosensory systems, which are particularly complex with high levels of interspecific variation in batoids (skates and rays). Rays have dorsoventrally compressed, laterally expanded bodies that prevent them from seeing their mouths and more often than not, their prey. This study uses quantitative image analysis techniques to identify, quantify, and compare structural differences that may have functional consequences in the detection capabilities of three Eastern Pacific stingray species. The benthic round stingray, Urobatis halleri, pelagic stingray, Pteroplatytrygon (Dasyatis) violacea, and benthopelagic bat ray, Myliobatis californica, show significant differences in sensory morphology. Ventral lateral line canals correlate with feeding ecology and differ primarily in the proportion of pored and nonpored canals and the degree of branching complexity. Urobatis halleri shows a high proportion of nonpored canals, while P. violacea has an intermediate proportion of pored and nonpored canals with almost no secondary branching of pored canals. In contrast, M. californica has extensive and highly branched pored ventral lateral line canals that extended laterally toward the wing tips on the anterior edge of the pectoral fins. Electrosensory morphology correlates with feeding habitat and prey mobility; benthic feeders U. halleri and M. californica, have greater electrosensory pore numbers and densities than P. violacea. The percentage of the wing surface covered by these sensory systems appears to be inversely related to swimming style. These methods can be applied to a broader range of species to enable further discussion of the relationship of phylogeny, ecology, and morphology, while the results provide testable predictions of detection capabilities.
Janky, Rekin's; van Helden, Jacques
2008-01-23
The detection of conserved motifs in promoters of orthologous genes (phylogenetic footprints) has become a common strategy to predict cis-acting regulatory elements. Several software tools are routinely used to raise hypotheses about regulation. However, these tools are generally used as black boxes, with default parameters. A systematic evaluation of optimal parameters for a footprint discovery strategy can bring a sizeable improvement to the predictions. We evaluate the performances of a footprint discovery approach based on the detection of over-represented spaced motifs. This method is particularly suitable for (but not restricted to) Bacteria, since such motifs are typically bound by factors containing a Helix-Turn-Helix domain. We evaluated footprint discovery in 368 Escherichia coli K12 genes with annotated sites, under 40 different combinations of parameters (taxonomical level, background model, organism-specific filtering, operon inference). Motifs are assessed both at the levels of correctness and significance. We further report a detailed analysis of 181 bacterial orthologs of the LexA repressor. Distinct motifs are detected at various taxonomical levels, including the 7 previously characterized taxon-specific motifs. In addition, we highlight a significantly stronger conservation of half-motifs in Actinobacteria, relative to Firmicutes, suggesting an intermediate state in specificity switching between the two Gram-positive phyla, and thereby revealing the on-going evolution of LexA auto-regulation. The footprint discovery method proposed here shows excellent results with E. coli and can readily be extended to predict cis-acting regulatory signals and propose testable hypotheses in bacterial genomes for which nothing is known about regulation.
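A toy version of the spaced-dyad counting that underlies this kind of footprint discovery; in real use the observed counts would be compared against a background model across orthologous promoters to assess over-representation:

```python
def count_dyads(promoters, k=3, max_spacing=20):
    """Count spaced dyads (two k-mers separated by a fixed spacer) across
    promoter sequences -- the motif shape bound by Helix-Turn-Helix dimers.
    Toy counting only; significance requires a background model."""
    counts = {}
    for seq in promoters:
        for i in range(len(seq) - k + 1):
            left = seq[i:i + k]
            for spacer in range(max_spacing + 1):
                j = i + k + spacer
                if j + k > len(seq):
                    break
                key = (left, spacer, seq[j:j + k])
                counts[key] = counts.get(key, 0) + 1
    return counts

promoters = ["ATTGACAGGCTATCGGATTGACAGCCTATCGG"]   # hypothetical promoter
dyads = count_dyads(promoters)
print(max(dyads.items(), key=lambda kv: kv[1]))    # most frequent spaced dyad
```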
Work-Centered Technology Development (WTD)
2005-03-01
[Fragment recovered from a garbled scan: the approach rests on the theoretical, testable, inductive, and repeatable foundations of science; theoretical foundations include notions such as statistical versus analytical...]
Writing testable software requirements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Knirk, D.
1997-11-01
This tutorial identifies common problems in analyzing requirements in the problem domain and in constructing a written specification of what the software is to do. It deals with two main problem areas: identifying and describing problem requirements, and analyzing and describing behavior specifications.
All pure bipartite entangled states can be self-tested
Coladangelo, Andrea; Goh, Koon Tong; Scarani, Valerio
2017-01-01
Quantum technologies promise advantages over their classical counterparts in the fields of computation, security and sensing. It is thus desirable that classical users are able to obtain guarantees on quantum devices, even without any knowledge of their inner workings. That such classical certification is possible at all is remarkable: it is a consequence of the violation of Bell inequalities by entangled quantum systems. Device-independent self-testing refers to the most complete such certification: it enables a classical user to uniquely identify the quantum state shared by uncharacterized devices by simply inspecting the correlations of measurement outcomes. Self-testing was first demonstrated for the singlet state and a few other examples of self-testable states were reported in recent years. Here, we address the long-standing open question of whether every pure bipartite entangled state is self-testable. We answer it affirmatively by providing explicit self-testing correlations for all such states. PMID:28548093
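The singlet case can be stated as a worked equation using standard CHSH facts (these are textbook values, not a result specific to this paper):

```latex
% Singlet correlations are E(a,b) = -cos(a-b). With measurement angles
% a_0 = 0, a_1 = \pi/2, b_0 = \pi/4, b_1 = -\pi/4:
\begin{align*}
S &= E(a_0,b_0) + E(a_0,b_1) + E(a_1,b_0) - E(a_1,b_1)
   = -\tfrac{\sqrt{2}}{2} - \tfrac{\sqrt{2}}{2} - \tfrac{\sqrt{2}}{2}
     - \tfrac{\sqrt{2}}{2} = -2\sqrt{2},
\end{align*}
% so |S| = 2\sqrt{2} exceeds the classical bound of 2; observing this
% maximal violation certifies the singlet up to local isometries.
```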
Phase 1 Space Fission Propulsion Energy Source Design
NASA Technical Reports Server (NTRS)
Houts, Mike; VanDyke, Melissa; Godfroy, Tom; Pedersen, Kevin; Martin, James; Dickens, Ricky; Salvail, Pat; Hrbud, Ivana; Carter, Robert; Rodgers, Stephen L. (Technical Monitor)
2002-01-01
Fission technology can enable rapid, affordable access to any point in the solar system. If fission propulsion systems are to be developed to their full potential, however, near-term customers must be identified and initial fission systems successfully developed, launched, and operated. Studies conducted in fiscal year 2001 (IISTP, 2001) show that fission electric propulsion (FEP) systems with a specific mass at or below 50 kg/kWjet could enhance or enable numerous robotic outer solar system missions of interest. At the required specific mass, it is possible to develop safe, affordable systems that meet mission requirements. To help select the system design to pursue, eight evaluation criteria were identified: system integration, safety, reliability, testability, specific mass, cost, schedule, and programmatic risk. A top-level comparison of four potential concepts was performed: a Testable, Passive, Redundant Reactor (TPRR), a Testable Multi-Cell In-Core Thermionic Reactor (TMCT), a Direct Gas Cooled Reactor (DGCR), and a Pumped Liquid Metal Reactor (PLMR). Development of any of the four systems appears feasible. However, for power levels up to at least 500 kWt (enabling electric power levels of 125-175 kWe, given 25-35% power conversion efficiency) the TPRR has advantages related to several criteria and is competitive with respect to all. Hardware-based research and development has further increased confidence in the TPRR approach. Successful development and utilization of a Phase 1 fission electric propulsion system will enable advanced Phase 2 and Phase 3 systems capable of providing rapid, affordable access to any point in the solar system.
Daniels, Marcus G; Farmer, J Doyne; Gillemot, László; Iori, Giulia; Smith, Eric
2003-03-14
We model trading and price formation in a market under the assumption that order arrival and cancellations are Poisson random processes. This model makes testable predictions for the most basic properties of markets, such as the diffusion rate of prices (which is the standard measure of financial risk) and the spread and price impact functions (which are the main determinants of transaction cost). Guided by dimensional analysis, simulation, and mean-field theory, we find scaling relations in terms of order flow rates. We show that even under completely random order flow the need to store supply and demand to facilitate trading induces anomalous diffusion and temporal structure in prices.
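A minimal zero-intelligence order-book simulation in the spirit of this model class (order cancellations and the paper's calibration are omitted; all rates and prices are arbitrary):

```python
import random

def simulate(steps=20000, lam_limit=0.7, tick=1, seed=0):
    """Toy zero-intelligence order book: each event is a limit order
    (priced at a random offset inside/behind the opposite quote) or a
    market order that removes the best quote; event types are i.i.d.,
    mimicking Poisson flow. Tracks the bid-ask spread."""
    rng = random.Random(seed)
    bids, asks = [100], [101]          # resting limit-order prices
    spreads = []
    for _ in range(steps):
        best_bid, best_ask = max(bids), min(asks)
        if rng.random() < lam_limit:   # limit order
            if rng.random() < 0.5:
                bids.append(best_ask - tick * rng.randint(1, 10))
            else:
                asks.append(best_bid + tick * rng.randint(1, 10))
        else:                          # market order eats the best quote
            if rng.random() < 0.5 and len(bids) > 1:
                bids.remove(best_bid)
            elif len(asks) > 1:
                asks.remove(best_ask)
        spreads.append(min(asks) - max(bids))
    return sum(spreads) / len(spreads)

print(f"mean spread: {simulate():.2f} ticks")
```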
NASA Astrophysics Data System (ADS)
Daniels, Marcus G.; Farmer, J. Doyne; Gillemot, László; Iori, Giulia; Smith, Eric
2003-03-01
We model trading and price formation in a market under the assumption that order arrival and cancellations are Poisson random processes. This model makes testable predictions for the most basic properties of markets, such as the diffusion rate of prices (which is the standard measure of financial risk) and the spread and price impact functions (which are the main determinants of transaction cost). Guided by dimensional analysis, simulation, and mean-field theory, we find scaling relations in terms of order flow rates. We show that even under completely random order flow the need to store supply and demand to facilitate trading induces anomalous diffusion and temporal structure in prices.
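A toy "zero-intelligence" simulation in the spirit of this model (our sketch; the rates, tick band, and bookkeeping are illustrative, not the paper's parameterization): limit orders, market orders, and cancellations all arrive as Poisson processes, and a spread emerges from order flow alone.

    import random

    random.seed(0)
    alpha, mu, delta = 1.0, 0.2, 0.05   # limit-order, market-order, per-order cancel rates
    L = 20                              # limit orders land within L ticks of the opposite quote
    bids, asks = [100], [101]           # resting order prices (one share each)

    spreads = []
    for _ in range(100_000):
        total = alpha + mu + delta * (len(bids) + len(asks))
        r = random.random() * total
        side = random.choice(("bid", "ask"))
        if r < alpha:                                   # new limit order
            if side == "bid":
                bids.append(min(asks) - random.randint(1, L))
            else:
                asks.append(max(bids) + random.randint(1, L))
        elif r < alpha + mu:                            # market order removes the best quote
            if side == "bid" and len(bids) > 1:
                bids.remove(max(bids))
            elif side == "ask" and len(asks) > 1:
                asks.remove(min(asks))
        else:                                           # random cancellation
            book = bids if side == "bid" else asks
            if len(book) > 1:
                book.remove(random.choice(book))
        spreads.append(min(asks) - max(bids))

    print("mean spread (ticks):", sum(spreads) / len(spreads))

Varying the rate parameters and re-measuring the spread is a cheap way to probe the kind of scaling relations in order-flow rates that the paper derives.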
Encoding dependence in Bayesian causal networks
USDA-ARS?s Scientific Manuscript database
Bayesian networks (BNs) represent complex, uncertain spatio-temporal dynamics by propagation of conditional probabilities between identifiable states with a testable causal interaction model. Typically, they assume random variables are discrete in time and space with a static network structure that ...
A genetic programming approach for Burkholderia pseudomallei diagnostic pattern discovery
Yang, Zheng Rong; Lertmemongkolchai, Ganjana; Tan, Gladys; Felgner, Philip L.; Titball, Richard
2009-01-01
Motivation: Finding diagnostic patterns for fighting diseases like Burkholderia pseudomallei infection using biomarkers involves two key issues. First, exhaustively searching all subsets of testable biomarkers (antigens in this context) to find the best one is computationally infeasible, so a proper optimization approach such as evolutionary computation should be investigated. Second, a properly selected function of the antigens serving as the diagnostic pattern, which is commonly unknown, is key to the diagnostic accuracy and the diagnostic effectiveness in clinical use. Results: A conversion function is proposed to convert serum tests of antigens on patients to binary values, based on which Boolean functions serving as the diagnostic patterns are developed. A genetic programming approach is designed for optimizing the diagnostic patterns in terms of their accuracy and effectiveness. The optimization aims to maximize the coverage (the rate of positive response to antigens) in the infected patients and minimize the coverage in the non-infected patients, while using as few testable antigens in the Boolean functions as possible. The final coverage in the infected patients is 96.55% using 17 of 215 (7.4%) antigens, with zero coverage in the non-infected patients. Among these 17 antigens, BPSL2697 is the most frequently selected for the diagnosis of Burkholderia pseudomallei infection. The approach has been evaluated using both cross-validation and jackknife simulation, with prediction accuracies of 93% and 92%, respectively. A novel approach is also proposed in this study to evaluate a model with binary data using ROC analysis. Contact: z.r.yang@ex.ac.uk PMID:19561021
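The optimization loop can be caricatured in a few lines (our sketch with synthetic data, not the paper's genetic programming system): an OR-rule over a subset of binarized antigens is scored by coverage in infected patients minus coverage in non-infected patients, with a penalty on the number of antigens used.

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.integers(0, 2, size=(100, 50)).astype(bool)  # toy binarized serum tests
    y = rng.integers(0, 2, size=100)                     # toy labels, 1 = infected

    def fitness(mask, lam=0.01):
        # OR-rule diagnostic pattern over the selected antigens
        pred = X[:, mask].any(axis=1)
        pos = pred[y == 1].mean()          # coverage among infected (maximize)
        neg = pred[y == 0].mean()          # coverage among non-infected (minimize)
        return pos - neg - lam * mask.sum()

    # Crude mutation-only hill climb standing in for full genetic programming.
    best = rng.random(50) < 0.1
    for _ in range(2000):
        child = best.copy()
        i = rng.integers(0, 50)
        child[i] = ~child[i]
        if fitness(child) >= fitness(best):
            best = child
    print("antigens used:", int(best.sum()), " fitness:", round(float(fitness(best)), 3))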
NASA Astrophysics Data System (ADS)
Anderton, Rupert N.; Cameron, Colin D.; Burnett, James G.; Güell, Jeff J.; Sanders-Reed, John N.
2014-06-01
This paper discusses the design of an improved passive millimeter wave imaging system intended to be used for base security in degraded visual environments. The discussion starts with the selection of the optimum frequency band. The trade-offs between requirements on detection, recognition and identification ranges and optical aperture are discussed with reference to the Johnson Criteria. It is shown that these requirements also affect image sampling, receiver numbers and noise temperature, frame rate, field of view, focusing requirements and mechanisms, and tolerance budgets. The effect of image quality degradation is evaluated and a single testable metric is derived that best describes the effects of degradation on meeting the requirements. The discussion is extended to tolerance budgeting constraints if significant degradation is to be avoided, including surface roughness, receiver position errors and scan conversion errors. Although the reflective twist-polarization imager design proposed is potentially relatively low cost and high performance, there is a significant problem with obscuration of the beam by the receiver array. Methods of modeling this accurately and thus designing for best performance are given.
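The aperture trade-off driven by the Johnson Criteria can be illustrated with a diffraction-limit estimate (our numbers are assumed examples, not the paper's design values; the commonly quoted criteria of roughly 1, 4, and 6.4 resolvable cycles across the target's critical dimension are used for detection, recognition, and identification).

    wavelength = 3.2e-3   # m, ~94 GHz passive mmW band (assumed)
    target = 0.5          # m, critical dimension of a person (assumed)
    range_m = 200.0       # m, required task range (assumed)

    for task, n_cycles in (("detection", 1.0), ("recognition", 4.0), ("identification", 6.4)):
        cycle_subtense = (target / n_cycles) / range_m   # rad per resolvable cycle
        D = 1.22 * wavelength / cycle_subtense           # Rayleigh-limited aperture
        print(f"{task:>14s}: aperture >= {D:5.2f} m")

The rapid growth of the required aperture from detection to identification is exactly why the paper treats aperture, receiver count, and tolerance budgets as coupled requirements.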
Quark seesaw mechanism, dark U(1) symmetry, and the baryon-dark matter coincidence
NASA Astrophysics Data System (ADS)
Gu, Pei-Hong; Mohapatra, Rabindra N.
2017-09-01
We attempt to understand the baryon-dark matter coincidence problem within the quark seesaw extension of the standard model, where parity invariance is used to solve the strong CP problem. The SU(2)_L × SU(2)_R × U(1)_{B-L} gauge symmetry of this model is extended by a dark U(1)_X group, plus the inclusion of a heavy neutral vector-like fermion χ_{L,R} charged under the dark group, which plays the role of dark matter. All fermions are Dirac type in this model. Decay of heavy scalars charged under U(1)_X leads to simultaneous asymmetry generation of the dark matter and baryons after sphaleron effects are included. The U(1)_X group not only helps to stabilize the dark matter but also helps in the elimination of the symmetric part of the dark matter via χ-χ̄ annihilation. For dark matter mass near the proton mass, it explains why the baryon and dark matter abundances are of similar magnitude (the baryon-dark matter coincidence problem). This model is testable in low threshold (sub-keV) direct dark matter search experiments.
Barker, Jessica L.; Bronstein, Judith L.
2016-01-01
Exploitation in cooperative interactions both within and between species is widespread. Although it is assumed to be costly to be exploited, mechanisms to control exploitation are surprisingly rare, making the persistence of cooperation a fundamental paradox in evolutionary biology and ecology. Focusing on between-species cooperation (mutualism), we hypothesize that the temporal sequence in which exploitation occurs relative to cooperation affects its net costs and argue that this can help explain when and where control mechanisms are observed in nature. Our principal prediction is that when exploitation occurs late relative to cooperation, there should be little selection to limit its effects (analogous to “tolerated theft” in human cooperative groups). Although we focus on cases in which mutualists and exploiters are different individuals (of the same or different species), our inferences can readily be extended to cases in which individuals exhibit mixed cooperative-exploitative strategies. We demonstrate that temporal structure should be considered alongside spatial structure as an important process affecting the evolution of cooperation. We also provide testable predictions to guide future empirical research on interspecific as well as intraspecific cooperation. PMID:26841169
Theory of Aging, Rejuvenation, and the Nonequilibrium Steady State in Deformed Polymer Glasses
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Kang
The nonlinear Langevin equation theory of segmental relaxation, elasticity, and mechanical response of polymer glasses is extended to describe the coupled effects of physical aging, mechanical rejuvenation, and thermal history. The key structural variable is the amplitude of density fluctuations, and segmental dynamics proceeds via stress-modified activated barrier hopping on a dynamic free-energy profile. Mechanically generated disorder rejuvenation is quantified by a dissipative work argument and increases the amplitude of density fluctuations, thereby speeding up relaxation beyond that induced by the landscape tilting mechanism. The theory makes testable predictions for the time evolution and nonequilibrium steady state of the alpha relaxation time, density fluctuation amplitude, elastic modulus, and other properties. Model calculations reveal a rich dependence of these quantities on preaging time, applied stress, and temperature that reflects the highly nonlinear competition between physical aging and mechanical disordering. Thermal history is erased in the long-time limit, although the nonequilibrium steady state is not the literal fully rejuvenated freshly quenched glass. The present work provides the conceptual foundation for a quantitative treatment of the nonlinear mechanical response of polymer glasses under a variety of deformation protocols.
Automated Testability Decision Tool
1991-09-01
Vol. 16, 1968, pp. 538-558. Bertsekas, D. P., "Constrained Optimization and Lagrange Multiplier Methods," Academic Press, New York. McLeavey, D.W. and McLeavey, J.A., "Parallel Optimization Methods in Standby Reliability," University of Connecticut, School of Business Administration, Bureau of Business
ERIC Educational Resources Information Center
Niaz, Mansoor
1991-01-01
Discusses differences between the epistemic and the psychological subject, the relationship between the epistemic subject and the ideal gas law, the development of general cognitive operations, and the empirical testability of Piaget's epistemic subject. (PR)
Small Town in Mass Society Revisited.
ERIC Educational Resources Information Center
Young, Frank W.
1996-01-01
A 1958 New York community study dramatized the thesis that macro forces (urbanization, industrialization, bureaucratization) have undermined all small communities' autonomy. Such "oppositional case studies" succeed when they render the dominant view immediately obsolete, have plausible origins, are testable, and generate new research.…
Systems Engineering and Integration (SE and I)
NASA Technical Reports Server (NTRS)
Chevers, ED; Haley, Sam
1990-01-01
The issue of technology advancement and future space transportation vehicles is addressed. The challenge is to develop systems which can be evolved and improved in small incremental steps, where each increment reduces present cost, improves reliability, or does neither but sets the stage for a second incremental upgrade that does. Future requirements are: interface standards for commercial off-the-shelf products to aid in the development of integrated facilities; an enhanced automated code generation system tightly coupled to specification and design documentation; modeling tools that support data flow analysis; and shared project data bases consisting of technical characteristics, cost information, measurement parameters, and reusable software programs. Topics addressed include: advanced avionics development strategy; risk analysis and management; tool quality management; low cost avionics; cost estimation and benefits; computer aided software engineering; computer systems and software safety; system testability; and advanced avionics laboratories and rapid prototyping. This presentation is represented by viewgraphs only.
Alarcón, Tomás; Marches, Radu; Page, Karen M
2006-05-07
We formulate models of the mechanism(s) by which B cell lymphoma cells stimulated with an antibody specific to the B cell receptor (IgM) become quiescent or apoptotic. In particular, we aim to reproduce experimental results by Marches et al. according to which the fate of the targeted cells (Daudi) depends on the levels of expression of p21(Waf1) (p21) cell-cycle inhibitor. A simple model is formulated in which the basic ingredients are p21 and caspase activity, and their mutual inhibition. We show that this model does not reproduce the experimental results and that further refinement is needed. A second model successfully reproduces the experimental observations, for a given set of parameter values, indicating a critical role for Myc in the fate decision process. We use bifurcation analysis and objective sensitivity analysis to assess the robustness of our results. Importantly, this analysis yields experimentally testable predictions on the role of Myc, which could have therapeutic implications.
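The first model's mutual inhibition can be written as a pair of ODEs; the sketch below (our illustrative rates and Hill exponents, not the paper's fitted values) shows the bistability that lets initial conditions pick quiescence or apoptosis.

    import numpy as np
    from scipy.integrate import solve_ivp

    def rhs(t, x, k=1.0, K=0.5, n=4, d=1.0):
        p21, casp = x
        dp21 = k / (1 + (casp / K) ** n) - d * p21    # caspase represses p21
        dcasp = k / (1 + (p21 / K) ** n) - d * casp   # p21 represses caspase
        return [dp21, dcasp]

    # Two initial conditions relax to different steady states.
    for x0 in ([1.0, 0.1], [0.1, 1.0]):
        sol = solve_ivp(rhs, (0, 50), x0, rtol=1e-8)
        print(x0, "->", np.round(sol.y[:, -1], 3))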
Cognitive Scientists Prefer Theories and Testable Principles with Teeth
ERIC Educational Resources Information Center
Graesser, Arthur C.
2009-01-01
Alexander, Schallert, and Reynolds (2009/this issue) proposed a definition and landscape of learning that included 9 principles and 4 dimensions ("what," "who," "where," "when"). This commentary reflects on the utility of this definition and 4-dimensional landscape from the standpoint of educational…
A systems framework for identifying candidate microbial assemblages for disease management
USDA-ARS?s Scientific Manuscript database
Network models of soil and plant microbiomes present new opportunities for enhancing disease management, but also challenges for interpretation. We present a framework for interpreting microbiome networks, illustrating how the observed structure of networks can be used to generate testable hypothese...
A Time-domain Analysis of Nitrogen-rich Quasars.
NASA Astrophysics Data System (ADS)
Dittmann, Alexander; Liu, Xin; Shen, Yue; Jiang, Linhua
2018-01-01
A small population of quasars exhibit anomalously high nitrogen-to-carbon ratios (N/C) in their emission lines. These "nitrogen-rich" (N-rich) quasars have been difficult to explain. Few of the proposed mechanisms are natural, since stellar populations with abnormally high metallicities are required to produce an N-rich interstellar medium. N-rich quasars are also more likely to be "radio-loud" than average quasars, which is difficult to explain by invoking higher metallicity alone. Recently, tidal disruption events (TDEs) have been proposed as a mechanism for N-rich quasars. Such a TDE would occur between a supersolar mass star and a supermassive black hole. The CNO cycle creates a surplus of nitrogen-rich and carbon-deficient material that could naturally explain the N/C observed in N-rich quasars. The TDE hypothesis explains N-rich quasars without requiring extremely exotic stellar populations. A testable difference between the TDE explanation and the exotic stellar population scenarios is that TDEs do not produce enough N-rich material to pollute the quasar environment for extended periods of time, in which case N-rich phenomena in quasars would be transient. By analyzing changes in nitrogen and carbon line widths in time-separated spectra of N-rich quasars, we have studied nitrogen abundance in quasars which had previously been identified as nitrogen rich. We have found that over time frames of greater than one year in the quasar rest frame, nitrogen abundance tends to systematically decrease. The observed decrease is larger than our estimate of the effects of noise based on spectra separated by smaller time frames. Additionally, X-ray observations of one N-rich quasar have demonstrated that its X-ray emission is an outlier among the quasar population, but similar to that of confirmed TDEs.
Perceptual Decision-Making as Probabilistic Inference by Neural Sampling.
Haefner, Ralf M; Berkes, Pietro; Fiser, József
2016-05-04
We address two main challenges facing systems neuroscience today: understanding the nature and function of cortical feedback between sensory areas and of correlated variability. Starting from the old idea of perception as probabilistic inference, we show how to use knowledge of the psychophysical task to make testable predictions for the influence of feedback signals on early sensory representations. Applying our framework to a two-alternative forced choice task paradigm, we can explain multiple empirical findings that have been hard to account for by the traditional feedforward model of sensory processing, including the task dependence of neural response correlations and the diverging time courses of choice probabilities and psychophysical kernels. Our model makes new predictions and characterizes a component of correlated variability that represents task-related information rather than performance-degrading noise. It demonstrates a normative way to integrate sensory and cognitive components into physiologically testable models of perceptual decision-making.
Hu, Xiao-Min; Chen, Jiang-Shan; Liu, Bi-Heng; Guo, Yu; Huang, Yun-Feng; Zhou, Zong-Quan; Han, Yong-Jian; Li, Chuan-Feng; Guo, Guang-Can
2016-10-21
The physical impact and the testability of the Kochen-Specker (KS) theorem are debated because perfect compatibility in a single quantum system cannot be achieved in practical experiments with finite precision. Here, we follow the proposal of A. Cabello and M. T. Cunha [Phys. Rev. Lett. 106, 190401 (2011)] and present a compatibility-loophole-free experimental violation of an inequality of noncontextual theories by two spatially separated entangled qutrits. A maximally entangled qutrit-qutrit state with a fidelity as high as 0.975±0.001 is prepared and distributed to separated locations, and the two photons are then measured locally, satisfying the compatibility requirement. The results show that the inequality for noncontextual theories is violated by 31 standard deviations. Our experiments pave the way to closing the debate about the testability of the KS theorem. In addition, the method used to generate high-fidelity, high-dimension entangled states will provide significant advantages in high-dimension quantum encoding and quantum communication.
What is a delusion? Epistemological dimensions.
Leeser, J; O'Donohue, W
1999-11-01
Although the Diagnostic and Statistical Manual of Mental Disorders (American Psychiatric Association, 1994) clearly indicates delusions have an epistemic dimension, it fails to accurately identify the epistemic properties of delusions. The authors explicate the regulative causes of belief revision for rational agents and argue that delusions are unresponsive to these. They argue that delusions are (a) protected beliefs made unfalsifiable either in principle or because the agent refuses to admit anything as a potential falsifier; (b) the protected belief is not typically considered a "properly basic" belief; (c) the belief is not of the variety of protected scientific beliefs; (d) in response to an apparent falsification, the subject posits not a simple, testable explanation for the inconsistency but one that is more complicated, less testable, and provides no new corroborations; (e) the subject has a strong emotional attachment to the belief; and (f) the belief is typically supported by (or originates from) trivial occurrences that are interpreted by the subject as highly unusual, significant, having personal reference, or some combination of these.
ERIC Educational Resources Information Center
Barth, Lorna
2007-01-01
By changing the venue from festival to a required academic exposition, the traditional science fair was transformed into a "Science Expo" wherein students were guided away from cookbook experiments toward developing a question about their environment into a testable and measurable experiment. The revamped "Science Expo" became a night for students…
Leveraging Rigorous Local Evaluations to Understand Contradictory Findings
ERIC Educational Resources Information Center
Boulay, Beth; Martin, Carlos; Zief, Susan; Granger, Robert
2013-01-01
Contradictory findings from "well-implemented" rigorous evaluations invite researchers to identify the differences that might explain the contradictions, helping to generate testable hypotheses for new research. This panel will examine efforts to ensure that the large number of local evaluations being conducted as part of four…
Changing Perspectives on Basic Research in Adult Learning and Memory
ERIC Educational Resources Information Center
Hultsch, David F.
1977-01-01
It is argued that whether the course of cognitive development is characterized by growth, stability, or decline is less a matter of data than of the metamodel on which the theories and data are based. Such metamodels are representations of reality that are not empirically testable. (Author)
The Process of Mentoring Pregnant Adolescents: An Exploratory Study.
ERIC Educational Resources Information Center
Blinn-Pike, Lynn; Kuschel, Diane; McDaniel, Annette; Mingus, Suzanne; Mutti, Megan Poole
1998-01-01
The process that occurs in relationships between volunteer adult mentors and pregnant adolescent "mentees" is described empirically; testable hypotheses based on findings concerning the mentor role are proposed. Case records from 20 mentors are analyzed; findings regarding mentors' roles are discussed. Criteria for conceptualizing quasi-parenting…
MacLean, Adam L; Harrington, Heather A; Stumpf, Michael P H; Byrne, Helen M
2016-01-01
The last decade has seen an explosion in models that describe phenomena in systems medicine. Such models are especially useful for studying signaling pathways, such as the Wnt pathway. In this chapter we use the Wnt pathway to showcase current mathematical and statistical techniques that enable modelers to gain insight into (models of) gene regulation and generate testable predictions. We introduce a range of modeling frameworks, but focus on ordinary differential equation (ODE) models since they remain the most widely used approach in systems biology and medicine and continue to offer great potential. We present methods for the analysis of a single model, comprising applications of standard dynamical systems approaches such as nondimensionalization, steady state, asymptotic and sensitivity analysis, and more recent statistical and algebraic approaches to compare models with data. We present parameter estimation and model comparison techniques, focusing on Bayesian analysis and coplanarity via algebraic geometry. Our intention is that this (non-exhaustive) review may serve as a useful starting point for the analysis of models in systems medicine.
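As a concrete instance of the single-model toolbox mentioned here, a finite-difference local sensitivity analysis of a steady state takes only a few lines (the two-variable model below is a generic stand-in, not the chapter's Wnt equations).

    import numpy as np
    from scipy.integrate import solve_ivp

    def rhs(t, x, p):
        a, b = x
        k_prod, k_act, k_deg = p
        return [k_prod - k_act * a, k_act * a - k_deg * b]

    def steady_state(p):
        return solve_ivp(rhs, (0, 200), [0.0, 0.0], args=(p,), rtol=1e-9).y[:, -1]

    p0 = np.array([1.0, 0.5, 0.2])
    base = steady_state(p0)
    for i, name in enumerate(["k_prod", "k_act", "k_deg"]):
        p = p0.copy()
        p[i] *= 1.01                                      # 1% parameter perturbation
        sens = (steady_state(p) - base) / (0.01 * base)   # normalized local sensitivity
        print(name, np.round(sens, 2))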
Software reliability through fault-avoidance and fault-tolerance
NASA Technical Reports Server (NTRS)
Vouk, Mladen A.; Mcallister, David F.
1992-01-01
Accomplishments in the following research areas are summarized: structure based testing, reliability growth, and design testability with risk evaluation; reliability growth models and software risk management; and evaluation of consensus voting, consensus recovery block, and acceptance voting. Four papers generated during the reporting period are included as appendices.
From Cookbook to Experimental Design
ERIC Educational Resources Information Center
Flannagan, Jenny Sue; McMillan, Rachel
2009-01-01
Developing expertise, whether from cook to chef or from student to scientist, occurs over time and requires encouragement, guidance, and support. One key goal of an elementary science program should be to move students toward expertise in their ability to design investigative questions. The ability to design a testable question is difficult for…
Mentoring: A Typology of Costs for Higher Education Faculty
ERIC Educational Resources Information Center
Lunsford, Laura G.; Baker, Vicki; Griffin, Kimberly A.; Johnson, W. Brad
2013-01-01
In this theoretical paper, we apply a social exchange framework to understand mentors' negative experiences. We propose a typology of costs, categorized according to psychosocial and career mentoring functions. Our typology generates testable research propositions. Psychosocial costs of mentoring are burnout, anger, and grief or loss. Career…
Instructional Design: Science, Technology, Both, Neither
ERIC Educational Resources Information Center
Gropper, George L.
2017-01-01
What would it take for instructional design to qualify as a bona fide applied discipline? First and foremost, a fundamental requirement is a testable and tested theoretical base; until verified, untested rationales remain in limbo. Secondly, the discipline's applied prescriptions must be demonstrably traceable to the theoretical base once it is…
ERIC Educational Resources Information Center
Tweney, Ryan D.
Drawing parallels with critical thinking and creative thinking, this document describes some ways that scientific thinking is utilized. Cognitive approaches to scientific thinking are discussed, and it is argued that all science involves an attempt to construct a testable mental model of some aspect of reality. The role of mental models is…
ERIC Educational Resources Information Center
Wallace, Robert B.
1994-01-01
Health survey research assesses health of individuals in population. Measures include prevalence/incidence of diseases, signs/symptoms, functional states, and health services utilization. Although assessing individual biologic robustness can be problematic, testable approaches do exist. Characteristics of health of populations/communities, not…
Equilibration: Developing the Hard Core of the Piagetian Research Program.
ERIC Educational Resources Information Center
Rowell, J.A.
1983-01-01
Argues that the status of the concept of equilibration is clarified by considering Piagetian theory as a research program in the sense elaborated in 1974 by Lakatos. A pilot study was made to examine the precision and testability of equilibration in Piaget's 1977 model. (Author/RH)
Links between Parents' Epistemological Stance and Children's Evidence Talk
ERIC Educational Resources Information Center
Luce, Megan R.; Callanan, Maureen A.; Smilovic, Sarah
2013-01-01
Recent experimental research highlights young children's selectivity in learning from others. Little is known, however, about the patterns of information that children actually encounter in conversations with adults. This study investigated variation in parents' tendency to focus on testable evidence as a way to answer science-related questions…
The Simple Theory of Public Library Services.
ERIC Educational Resources Information Center
Newhouse, Joseph P.
A simple normative theory applicable to public library services was developed as a tool to aid libraries in answering the question: which books should be bought by the library? Although developed for normative purposes, the theory generates testable predictions. It is relevant to measuring benefits from services which are provided publicly because…
NASA Astrophysics Data System (ADS)
Sabater, Bartolomé; Marín, Dolores
2018-03-01
The minimum rate principle is applied to the chemical reaction in a steady-state open cell system where, under constant supply of the glucose precursor, reference to time or to glucose consumption does not affect the conclusions.
Tracking the "Lizardman": Writing Rotten to Write Well.
ERIC Educational Resources Information Center
Polette, Keith
1995-01-01
Suggests that students can improve their writing by being instructed on how to write badly. Applies the criteria of testability, tunnel-vision, excessive vagueness, flying in the face of established fact, and hazy authority to tabloid newspaper stories. Discusses how students can write their own "rotten" tabloid stories by taking these…
Researching the Study Abroad Experience
ERIC Educational Resources Information Center
McLeod, Mark; Wainwright, Philip
2009-01-01
The authors propose a paradigm for rigorous scientific assessment of study abroad programs, with the focus being on how study abroad experiences affect psychological constructs as opposed to looking solely at study-abroad-related outcomes. Social learning theory is used as a possible theoretical basis for making testable hypotheses and guiding…
Radiative transfer within seagrass canopies: impact on carbon budgets and light requirements
NASA Astrophysics Data System (ADS)
Zimmerman, Richard C.; Mobley, Curtis D.
1997-02-01
Seagrasses are ecologically important but extremely vulnerable to anthropogenic modifications of the coastal zone that affect light availability within these unique ecosystems. Strongly pigmented seagrass leaves can extend for more than 1 m above the substrate, and biomass is distributed unevenly throughout the canopy. In this study, light attenuation in a 7 m water column that contained a seagrass canopy extending 1.5 m above the bottom was calculated by the radiative transfer model Hydrolight, using the spectral absorbance of eelgrass leaves and a non-uniform vertical distribution of biomass. Runs were performed in clear and turbid water columns, over sand and mud substrates, and with shoot densities ranging from 25 to 200 m-2, using solar angles for both winter and summer solstices. The flux of photosynthetically active irradiance (EPAR) reaching the top of the seagrass canopy was twice as high in summer compared to winter, and in clear water compared to turbid water. Sediment type had a measurable effect on EPAR only within the bottom third of the canopy. Light penetration within the canopy was inversely proportional to shoot density. Introduction of daylength and a sinusoidal distribution of EPAR throughout the day greatly increased the importance of solar elevation on daily integrated production relative to water column turbidity and sediment type. Shoot-specific productivity decreased, and the position of maximum shoot productivity within the canopy shallowed, as shoot density increased. Positive net photosynthesis for entire shoots was possible only when plant density was lower than 100 shoots m-2 in winter, values consistent with field observations. Although very simplistic with regard to inherent optical properties of real seagrass leaves, this model was able to generate estimates of maximum sustainable shoot density that were fully testable by, and wholly consistent with, field observations.
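A one-dimensional Beer-Lambert caricature of within-canopy attenuation (our stand-in for the full Hydrolight radiative transfer runs; all coefficients and the biomass profile are assumed for illustration):

    import numpy as np

    E0 = 500.0          # PAR at the canopy top, umol photons m-2 s-1 (assumed)
    Kw = 0.3            # water attenuation coefficient, m-1 (assumed)
    k_leaf = 0.02       # attenuation per unit leaf biomass, m2 g-1 (assumed)

    z = np.linspace(0.0, 1.5, 16)             # depth below canopy top, m
    biomass = 100.0 * np.exp(-2.0 * z / 1.5)  # g m-3, denser near the top (assumed)

    dz = z[1] - z[0]
    tau = np.cumsum((Kw + k_leaf * biomass) * dz)  # optical depth from above
    E = E0 * np.exp(-tau)
    for depth, e in zip(z[::5], E[::5]):
        print(f"z = {depth:4.1f} m  E_PAR = {e:6.1f}")

The inverse relation between shoot density and light penetration reported above falls out of exactly this kind of biomass-weighted attenuation.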
Radial Mixing and Ru-Mo Isotope Systematics Under Different Accretion Scenarios
NASA Astrophysics Data System (ADS)
Fischer, R. A.; Nimmo, F.; O'Brien, D. P.
2017-12-01
The Ru-Mo isotopic compositions of inner Solar System bodies may reflect the provenance of accreted material and how it evolved with time, both of which are controlled by the accretion scenario these bodies experienced. Here we use a total of 116 N-body simulations of terrestrial planet accretion, run in the Eccentric Jupiter and Saturn (EJS), Circular Jupiter and Saturn (CJS), and Grand Tack scenarios, to model the Ru-Mo anomalies of Earth, Mars, and Theia analogues. This model starts by applying an initial step function in Ru-Mo isotopic composition, with compositions reflecting those in meteorites, and traces compositional evolution as planets accrete. The mass-weighted provenance of the resulting planets reveals more radial mixing in Grand Tack simulations than in EJS/CJS simulations, and more efficient mixing among late-accreted material than during the main phase of accretion in EJS/CJS simulations. We find that an extensive homogenous inner disk region is required to reproduce Earth's observed Ru-Mo composition. EJS/CJS simulations require a homogeneous reservoir in the inner disk extending to ≥3-4 AU (≥74-98% of initial mass) to reproduce Earth's composition, while Grand Tack simulations require a homogeneous reservoir extending to ≥3-10 AU (≥97-99% of initial mass), and likely to ≥7-10 AU. In the Grand Tack model, Jupiter's initial location (the most likely location for a discontinuity in isotopic composition) is 3.5 AU; however, this step location has only a 33% likelihood of producing an Earth with the correct Ru-Mo isotopic signature for the most plausible model conditions. Our results give the testable predictions that Mars has zero Ru anomaly and small or zero Mo anomaly, and the Moon has zero Mo anomaly. These predictions are insensitive to wide variations in parameter choices.
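The provenance bookkeeping can be sketched in a few lines (our toy body list and step location; the paper draws both from N-body simulations): each accreted body gets an anomaly from a step function in initial heliocentric distance, and the planet's value is the mass-weighted mean.

    import numpy as np

    rng = np.random.default_rng(1)
    step_at = 3.5                               # AU, candidate step location (illustrative)
    a0 = rng.uniform(0.5, 8.0, size=200)        # hypothetical initial semimajor axes, AU
    m = rng.uniform(0.001, 0.01, size=200)      # hypothetical accreted masses, Earth masses
    anom = np.where(a0 < step_at, 0.0, 1.0)     # 0 = inner reservoir, 1 = outer reservoir

    planet_anom = np.sum(m * anom) / np.sum(m)  # mass-weighted anomaly of the final planet
    print(f"mass-weighted anomaly: {planet_anom:.3f}")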
The Synchrotron Shock Model Confronts a "Line of Death" in the BATSE Gamma-Ray Burst Data
NASA Technical Reports Server (NTRS)
Preece, Robert D.; Briggs, Michael S.; Mallozzi, Robert S.; Pendleton, Geoffrey N.; Paciesas, W. S.; Band, David L.
1998-01-01
The synchrotron shock model (SSM) for gamma-ray burst emission makes a testable prediction: the observed low-energy power-law photon number spectral index cannot exceed -2/3 (where the photon model is defined with a positive index, dN/dE ∝ E^alpha). We have collected time-resolved spectral fit parameters for over 100 bright bursts observed by the Burst And Transient Source Experiment on board the Compton Gamma Ray Observatory. Using this database, we find 23 bursts in which the spectral index limit of the SSM is violated. We discuss elements of the analysis methodology that affect the robustness of this result, as well as some of the escape hatches left for the SSM by theory.
Causes and consequences of reduced blood volume in space flight - A multi-discipline modeling study
NASA Technical Reports Server (NTRS)
Leonard, J. I.
1983-01-01
A group of mathematical models of various physiological systems have been developed and applied to studying problems associated with adaptation to weightlessness. One biomedical issue which could be addressed by at least three of these models from varying perspectives was the reduction in blood volume that universally occurs in astronauts. Accordingly, models of fluid-electrolyte, erythropoiesis, and cardiovascular regulation were employed to study the causes and consequences of blood volume loss during space flight. This analysis confirms the notion that alterations of blood volume are central to an understanding of adaptation to prolonged space flight. More importantly, the modeling studies resulted in specific hypotheses accounting for plasma volume and red cell mass losses and testable predictions concerning the behavior of the circulatory system.
Complex Causal Process Diagrams for Analyzing the Health Impacts of Policy Interventions
Joffe, Michael; Mindell, Jennifer
2006-01-01
Causal diagrams are rigorous tools for controlling confounding. They also can be used to describe complex causal systems, which is done routinely in communicable disease epidemiology. The use of change diagrams has advantages over static diagrams, because change diagrams are more tractable, relate better to interventions, and have clearer interpretations. Causal diagrams are a useful basis for modeling. They make assumptions explicit, provide a framework for analysis, generate testable predictions, explore the effects of interventions, and identify data gaps. Causal diagrams can be used to integrate different types of information and to facilitate communication both among public health experts and between public health experts and experts in other fields. Causal diagrams allow the use of instrumental variables, which can help control confounding and reverse causation. PMID:16449586
A Theory of Work Adjustment. Minnesota Studies in Vocational Rehabilitation, 15.
ERIC Educational Resources Information Center
Dawis, Rene V.; And Others
A theory of work adjustment which may contribute to the development of a science of the psychology of occupational behavior is proposed. It builds on the basic psychological concepts of stimulus, response, and reinforcement, and provides a research paradigm for generating testable hypotheses. It was derived from early research efforts of the…
ERIC Educational Resources Information Center
Maul, Andrew
2015-01-01
Briggs and Peck [in "Using Learning Progressions to Design Vertical Scales That Support Coherent Inferences about Student Growth"] call for greater care in the conceptualization of the target attributes of students, or "what it is that is growing from grade to grade." In particular, they argue that learning progressions can…
Performance Models of Testability.
1984-08-01
4.1.17 Cost of Isolating Component/Part (CPI). The cost of isolating components or parts at the depot is CPI = n1(HDC)(TPI)(NPI), where TPI = average... [remainder garbled; the recoverable fragments concern a fault-detection flow chart: detecting a failure (PFD), the cost of isolating a component, and the expected cost of component removal and replacement]
There's No Such Thing as Value-Free Science.
ERIC Educational Resources Information Center
Makosky, Vivian Parker
This paper is based on the view that, although scientists rely on research values such as predictive accuracy and testability, scientific research is still subject to the unscientific values, attitudes, and emotions of the scientists. It is noted that undergraduate students are likely not to think critically about the science they encounter. A…
Modules, Theories, or Islands of Expertise? Domain Specificity in Socialization
ERIC Educational Resources Information Center
Gelman, Susan A.
2010-01-01
The domain-specific approach to socialization processes presented by J. E. Grusec and M. Davidov (this issue) provides a compelling framework for integrating and interpreting a large and disparate body of research findings, and it generates a wealth of testable new hypotheses. At the same time, it introduces core theoretical questions regarding…
Phases in the Adoption of Educational Innovations in Teacher Training Institutions.
ERIC Educational Resources Information Center
Hall, Gene E.
An attempt has been made to categorize phenomena observed as 20 teacher training institutions have adopted innovations and to extrapolate from these findings key concepts and principles that could form the basis for developing empirically testable hypotheses and could be of some immediate utility to those involved in innovation adoption. The…
Twelve testable hypotheses on the geobiology of weathering
S.L. Brantley; J.P. Megonigal; F.N. Scatena; Z. Balogh-Brunstad; R.T. Barnes; M.A. Bruns; P. van Cappelen; K. Dontsova; H.E. Hartnett; A.S. Hartshorn; A. Heimsath; E. Herndon; L. Jin; C.K. Keller; J.R. Leake; W.H. McDowell; F.C. Meinzer; T.J. Mozdzer; S. Petsch; J. Pett-Ridge; K.S. Pretziger; P.A. Raymond; C.S. Riebe; K. Shumaker; A. Sutton-Grier; R. Walter; K. Yoo
2011-01-01
Critical Zone (CZ) research investigates the chemical, physical, and biological processes that modulate the Earth's surface. Here, we advance 12 hypotheses that must be tested to improve our understanding of the CZ: (1) Solar-to-chemical conversion of energy by plants regulates flows of carbon, water, and nutrients through plant-microbe soil networks, thereby...
ERIC Educational Resources Information Center
Kirch, Susan A.; Stetsenko, Anna
2012-01-01
What do people mean when they say they "know" something in science? It usually means they did an investigation and expended considerable intellectual effort to build a useful explanatory model. It means they are confident about an explanation, believe others should trust what they say, and believe that their claim is testable. It means they can…
On Testability of Missing Data Mechanisms in Incomplete Data Sets
ERIC Educational Resources Information Center
Raykov, Tenko
2011-01-01
This article is concerned with the question of whether the missing data mechanism routinely referred to as missing completely at random (MCAR) is statistically examinable via a test for lack of distributional differences between groups with observed and missing data, and related consequences. A discussion is initially provided, from a formal logic…
ERIC Educational Resources Information Center
Martin-Dunlop, Catherine S.
2013-01-01
This study investigated prospective elementary teachers' understandings of the nature of science and explored associations with their guided-inquiry science learning environment. Over 500 female students completed the Nature of Scientific Knowledge Survey (NSKS), although only four scales were analyzed: Creative, Testable, Amoral, and Unified. The…
Forensic Impact of the Child Sexual Abuse Medical Examination.
ERIC Educational Resources Information Center
Myers, John E. B.
1998-01-01
This commentary on an article (EC 619 279) about research issues at the interface of medicine and law concerning medical evaluation for child sexual abuse focuses on empirically testable questions: (1) the medical history--its accuracy, interviewing issues, and elicitation and preservation of verbal evidence of abuse; and, (2) expert testimony.…
Two New Empirically Derived Reasons To Use the Assessment of Basic Learning Abilities.
ERIC Educational Resources Information Center
Richards, David F.; Williams, W. Larry; Follette, William C.
2002-01-01
Scores on the Assessment of Basic Learning Abilities (ABLA), Vineland Adaptive Behavior Scales, and the Wechsler Intelligence Scale-Revised (WAIS-R) were obtained for 30 adults with mental retardation. Correlations between the Vineland domains and ABLA were all significant. No participants performing below ABLA Level 6 were testable on the…
A Cognitive Approach to Brailling Errors
ERIC Educational Resources Information Center
Wells-Jensen, Sheri; Schwartz, Aaron; Gosche, Bradley
2007-01-01
This article analyzes a corpus of 1,600 brailling errors made by one expert braillist. It presents a testable model of braille writing and shows that the subject braillist stores standard braille contractions as part of the orthographic representation of words, rather than imposing contractions on a serially ordered string of letters. (Contains 1…
Thinking about Evolution: Combinatorial Play as a Strategy for Exercising Scientific Creativity
ERIC Educational Resources Information Center
Wingate, Richard J. T.
2011-01-01
An enduring focus in education on how scientists formulate experiments and "do science" in the laboratory has excluded a vital element of scientific practice: the creative and imaginative thinking that generates models and testable hypotheses. In this case study, final-year biomedical sciences university students were invited to create and justify…
Purposeful Instruction: Mixing up the "I," "We," and "You"
ERIC Educational Resources Information Center
Grant, Maria; Lapp, Diane; Fisher, Douglas; Johnson, Kelly; Frey, Nancy
2012-01-01
This article discusses the flexible nature of the gradual release of responsibility (GRR) as a frame for inquiry-based science instruction. Given the mandate for the use of text-supported learning (Common Core Standards), the GRR can be used to allow students to learn as scientists as they collaboratively develop testable questions and experiments…
The use of models to predict potential contamination aboard orbital vehicles
NASA Technical Reports Server (NTRS)
Boraas, Martin E.; Seale, Dianne B.
1989-01-01
A model of fungal growth on air-exposed, nonnutritive solid surfaces, developed for use aboard orbital vehicles, is presented. A unique feature of this testable model is that the development of a fungal mycelium can facilitate its own growth by condensation of water vapor from its environment directly onto fungal hyphae. The fungal growth rate is limited by the rate of supply of volatile nutrients, and fungal biomass is limited either by the supply of nonvolatile nutrients or by metabolic loss processes. The model discussed is structurally simple, but its dynamics can be quite complex. Biofilm accumulation can vary from a simple linear increase to sustained exponential growth, depending on the values of the environmental variables and model parameters. The results of the model are consistent with data from aquatic biofilm studies, insofar as the two types of systems are comparable. It is shown that the model presented is experimentally testable and provides a platform for the interpretation of observational data that may be directly relevant to the question of growth of organisms aboard the proposed Space Station.
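The qualitative regimes described (linear accumulation versus sustained exponential growth, capped by losses) follow from even a one-equation caricature with saturating, supply-limited uptake; all rates below are illustrative, not the model's.

    import numpy as np
    from scipy.integrate import solve_ivp

    def rhs(t, y, S=1.0, yld=0.5, loss=0.05):
        B = y[0]
        uptake = yld * S * B / (1.0 + B)   # assumed saturating, supply-limited uptake
        return [uptake - loss * B]

    sol = solve_ivp(rhs, (0, 300), [0.01], t_eval=np.linspace(0, 300, 7))
    for t, B in zip(sol.t, sol.y[0]):
        print(f"t = {t:5.0f}  biomass = {B:6.3f}")

Early on, growth is near-exponential (uptake is proportional to B); once uptake saturates, accumulation is linear until the loss term balances it.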
What can we learn from a two-brain approach to verbal interaction?
Schoot, Lotte; Hagoort, Peter; Segaert, Katrien
2016-09-01
Verbal interaction is one of the most frequent social interactions humans encounter on a daily basis. In the current paper, we zoom in on what the multi-brain approach has contributed, and can contribute in the future, to our understanding of the neural mechanisms supporting verbal interaction. Indeed, since verbal interaction can only exist between individuals, it seems intuitive to focus analyses on inter-individual neural markers, i.e. between-brain neural coupling. To date, however, there is a severe lack of theoretically-driven, testable hypotheses about what between-brain neural coupling actually reflects. In this paper, we develop a testable hypothesis in which between-pair variation in between-brain neural coupling is of key importance. Based on theoretical frameworks and empirical data, we argue that the level of between-brain neural coupling reflects speaker-listener alignment at different levels of linguistic and extra-linguistic representation. We discuss the possibility that between-brain neural coupling could inform us about the highest level of inter-speaker alignment: mutual understanding.
Active processes make mixed lipid membranes either flat or crumpled
NASA Astrophysics Data System (ADS)
Banerjee, Tirthankar; Basu, Abhik
2018-01-01
Whether live cell membranes show miscibility phase transitions (MPTs), and if so, how they fluctuate near the transitions, remain outstanding unresolved issues in physics and biology alike. Motivated by these questions, we construct a generic hydrodynamic theory for lipid membranes that are active due, for instance, to the molecular motors in the surrounding cytoskeleton, or to active protein components in the membrane itself. We use this to uncover a direct correspondence between membrane fluctuations and MPTs. Several testable predictions are made: (i) generic active stiffening with orientational long-range order (flat membrane), or softening with crumpling of the membrane, controlled by the active tension, and (ii) for mixed lipid membranes, capturing the nature of putative MPTs by measuring the membrane conformation fluctuations. Possibilities of both first- and second-order MPTs in mixed active membranes are argued for. Near second-order MPTs, active stiffening (softening) manifests as a super-stiff (super-soft) membrane. Our predictions are testable in a variety of in vitro systems, e.g. live cytoskeletal extracts deposited on liposomes and lipid membranes containing active proteins embedded in a passive fluid.
Collective thermoregulation in bee clusters
Ocko, Samuel A.; Mahadevan, L.
2014-01-01
Swarming is an essential part of honeybee behaviour, wherein thousands of bees cling onto each other to form a dense cluster that may be exposed to the environment for several days. This cluster has the ability to maintain its core temperature actively without a central controller. We suggest that the swarm cluster is akin to an active porous structure whose functional requirement is to adjust to outside conditions by varying its porosity to control its core temperature. Using a continuum model that takes the form of a set of advection–diffusion equations for heat transfer in a mobile porous medium, we show that the equalization of an effective ‘behavioural pressure’, which propagates information about the ambient temperature through variations in density, leads to effective thermoregulation. Our model extends and generalizes previous models by focusing the question of mechanism on the form and role of the behavioural pressure, and allows us to explain the vertical asymmetry of the cluster (as a consequence of buoyancy-driven flows), the ability of the cluster to overpack at low ambient temperatures without breaking up at high ambient temperatures, and the relative insensitivity to large variations in the ambient temperature. Our theory also makes testable hypotheses for the response of the cluster to external temperature inhomogeneities and suggests strategies for biomimetic thermoregulation. PMID:24335563
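A one-dimensional caricature of the heat-transfer side of this model (our toy discretization; the density profile, conductivity law, and heating rates are assumed, and the behavioural-pressure dynamics are omitted):

    import numpy as np

    n, L = 51, 0.3                       # grid points, cluster extent in m
    x = np.linspace(-L / 2, L / 2, n)
    dx = x[1] - x[0]
    rho = np.exp(-(x / 0.08) ** 2)       # assumed bee-density profile, densest at the core
    D = 1e-5 * (1.5 - rho)               # conductivity falls as packing rises (assumed)
    q = 2e-3 * rho                       # metabolic heating tracks density (assumed)

    T = np.zeros(n)                      # temperature excess over ambient
    dt = 0.2 * dx ** 2 / D.max()         # stable explicit time step
    for _ in range(60_000):              # relax to a steady state
        lap = np.zeros(n)
        lap[1:-1] = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx ** 2
        T[1:-1] += dt * (D[1:-1] * lap[1:-1] + q[1:-1])
        T[0] = T[-1] = 0.0               # ambient boundary
    print(f"core temperature excess: {T[n // 2]:.2f} K")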
Spillover modes in multiplex games: double-edged effects on cooperation and their coevolution.
Khoo, Tommy; Fu, Feng; Pauls, Scott
2018-05-02
In recent years, there has been growing interest in studying games on multiplex networks that account for interactions across linked social contexts. However, little is known about how potential cross-context interference, or spillover, of individual behavioural strategy impacts overall cooperation. We consider three plausible spillover modes, quantifying and comparing their effects on the evolution of cooperation. In our model, social interactions take place on two network layers: repeated interactions with close neighbours in a lattice, and one-shot interactions with random individuals. Spillover can occur during the learning process with accidental cross-layer strategy transfer, or during social interactions with errors in implementation. Our analytical results, using extended pair approximation, are in good agreement with extensive simulations. We find double-edged effects of spillover: increasing the intensity of spillover can promote cooperation provided cooperation is favoured in one layer, but too much spillover is detrimental. We also discover a bistability phenomenon: spillover hinders or promotes cooperation depending on initial frequencies of cooperation in each layer. Furthermore, comparing strategy combinations emerging in each spillover mode provides good indication of their co-evolutionary dynamics with cooperation. Our results make testable predictions that inspire future research, and shed light on human cooperation across social domains.
Dong, Junzi; Colburn, H. Steven
2016-01-01
In multisource, “cocktail party” sound environments, human and animal auditory systems can use spatial cues to effectively separate and follow one source of sound over competing sources. While mechanisms to extract spatial cues such as interaural time differences (ITDs) are well understood in precortical areas, how such information is reused and transformed in higher cortical regions to represent segregated sound sources is not clear. We present a computational model describing a hypothesized neural network that spans spatial cue detection areas and the cortex. This network is based on recent physiological findings that cortical neurons selectively encode target stimuli in the presence of competing maskers based on source locations (Maddox et al., 2012). We demonstrate that key features of cortical responses can be generated by the model network, which exploits spatial interactions between inputs via lateral inhibition, enabling the spatial separation of target and interfering sources while allowing monitoring of a broader acoustic space when there is no competition. We present the model network along with testable experimental paradigms as a starting point for understanding the transformation and organization of spatial information from midbrain to cortex. This network is then extended to suggest engineering solutions that may be useful for hearing-assistive devices in solving the cocktail party problem. PMID:26866056
Sterile neutrino dark matter and low scale leptogenesis from a charged scalar.
Frigerio, Michele; Yaguna, Carlos E
We show that novel paths to dark matter generation and baryogenesis are open when the standard model is extended with three sterile neutrinos N1,2,3 and a charged scalar δ. Specifically, we propose a new production mechanism for the dark matter particle, a multi-keV sterile neutrino N1, that does not depend on the active-sterile mixing angle and does not rely on a large primordial lepton asymmetry. Instead, N1 is produced, via freeze-in, by the decays of δ while it is in equilibrium in the early Universe. In addition, we demonstrate that, thanks to the couplings between the heavier sterile neutrinos N2,3 and δ, baryogenesis via leptogenesis can be realized close to the electroweak scale. The lepton asymmetry is generated either by N2,3 decays, for masses in the TeV range, or by N2,3 oscillations, for masses in the GeV range. Experimental signatures of this scenario include an X-ray line from dark matter decays, and the direct production of δ at the LHC. This model thus describes a minimal, testable scenario for neutrino masses, the baryon asymmetry, and dark matter.
Topological Defects in a Living Nematic Ensnare Swimming Bacteria
NASA Astrophysics Data System (ADS)
Genkin, Mikhail M.; Sokolov, Andrey; Lavrentovich, Oleg D.; Aranson, Igor S.
2017-01-01
Active matter exemplified by suspensions of motile bacteria or synthetic self-propelled particles exhibits a remarkable propensity to self-organization and collective motion. The local input of energy and simple particle interactions often lead to complex emergent behavior manifested by the formation of macroscopic vortices and coherent structures with long-range order. A realization of an active system has been conceived by combining swimming bacteria and a lyotropic liquid crystal. Here, by coupling the well-established and validated model of nematic liquid crystals with the bacterial dynamics, we develop a computational model describing intricate properties of such a living nematic. In faithful agreement with the experiment, the model reproduces the onset of periodic undulation of the director and consequent proliferation of topological defects with the increase in bacterial concentration. It yields a testable prediction on the accumulation of bacteria in the cores of +1/2 topological defects and depletion of bacteria in the cores of -1/2 defects. Our dedicated experiment on motile bacteria suspended in a freestanding liquid crystalline film fully confirms this prediction. Our findings suggest novel approaches for trapping and transport of bacteria and synthetic swimmers in anisotropic liquids and extend a scope of tools to control and manipulate microscopic objects in active matter.
Poudel, R; Jumpponen, A; Schlatter, D C; Paulitz, T C; Gardener, B B McSpadden; Kinkel, L L; Garrett, K A
2016-10-01
Network models of soil and plant microbiomes provide new opportunities for enhancing disease management, but also challenges for interpretation. We present a framework for interpreting microbiome networks, illustrating how observed network structures can be used to generate testable hypotheses about candidate microbes affecting plant health. The framework includes four types of network analyses. "General network analysis" identifies candidate taxa for maintaining an existing microbial community. "Host-focused analysis" includes a node representing a plant response such as yield, identifying taxa with direct or indirect associations with that node. "Pathogen-focused analysis" identifies taxa with direct or indirect associations with taxa known a priori as pathogens. "Disease-focused analysis" identifies taxa associated with disease. Positive direct or indirect associations with desirable outcomes, or negative associations with undesirable outcomes, indicate candidate taxa. Network analysis provides characterization not only of taxa with direct associations with important outcomes such as disease suppression, biofertilization, or expression of plant host resistance, but also taxa with indirect associations via their association with other key taxa. We illustrate the interpretation of network structure with analyses of microbiomes in the oak phyllosphere, and in wheat rhizosphere and bulk soil associated with the presence or absence of infection by Rhizoctonia solani.
Color Vision Deficiency in Preschool Children
Xie, John Z.; Tarczy-Hornoch, Kristina; Lin, Jesse; Cotter, Susan A.; Torres, Mina; Varma, Rohit
2016-01-01
Purpose To determine the sex- and ethnicity-specific prevalence of color vision deficiency (CVD) in black, Asian, Hispanic, and non-Hispanic white preschool children. Design Population-based, cross-sectional study. Participants The Multi-Ethnic Pediatric Eye Disease Study is a population-based evaluation of the prevalence of vision disorders in children in Southern California. A total of 5960 subjects 30 to 72 months of age were recruited for the study, of whom 4177 were able to complete color vision testing (1265 black, 812 Asian, 1280 Hispanic, and 820 non-Hispanic white). Methods Color vision testing was performed using Color Vision Testing Made Easy color plates (Home Vision Care, Gulf Breeze, FL), and diagnostic confirmatory testing was performed using the Waggoner HRR Diagnostic Test color plates (Home Vision Care). Main Outcome Measures Testability of color vision in preschool children between 30 and 72 months of age and prevalence of CVD stratified by age, sex, and ethnicity. Results Testability was 17% in children younger than 37 months of age, increasing to 57% in children 37 to 48 months of age, 89% in children 49 to 60 months of age, and 98% in children 61 to 72 months of age. The prevalence of CVD among boys was 1.4% for black, 3.1% for Asian, 2.6% for Hispanic, and 5.6% for non-Hispanic white children; the prevalence in girls was 0.0% to 0.5% for all ethnicities. The ethnic difference in CVD was statistically significant between black and non-Hispanic white children (P = 0.0003) and between Hispanic and non-Hispanic white children (P = 0.02). In boys, most CVD cases were either deutan (51%) or protan (34%); 32% were classified as mild, 15% as moderate, and 41% as severe. Conclusions Testability for CVD in preschool children is high by 4 years of age. The prevalence of CVD in preschool boys varies by ethnicity, with the highest prevalence in non-Hispanic white and lowest in black children. PMID:24702753
The evolution of speech: a comparative review.
Fitch
2000-07-01
The evolution of speech can be studied independently of the evolution of language, with the advantage that most aspects of speech acoustics, physiology and neural control are shared with animals, and thus open to empirical investigation. At least two changes were necessary prerequisites for modern human speech abilities: (1) modification of vocal tract morphology, and (2) development of vocal imitative ability. Despite an extensive literature, attempts to pinpoint the timing of these changes using fossil data have proven inconclusive. However, recent comparative data from nonhuman primates have shed light on the ancestral use of formants (a crucial cue in human speech) to identify individuals and gauge body size. In addition, comparative analysis of the diverse vertebrates that have evolved vocal imitation (humans, cetaceans, seals and birds) provides several distinct, testable hypotheses about the adaptive function of vocal mimicry. These developments suggest that, for understanding the evolution of speech, comparative analysis of living species provides a viable alternative to fossil data. However, the neural basis for vocal mimicry and for mimesis in general remains unknown.
Fast gene ontology based clustering for microarray experiments.
Ovaska, Kristian; Laakso, Marko; Hautaniemi, Sampsa
2008-11-21
Analysis of a microarray experiment often results in a list of hundreds of disease-associated genes. In order to suggest common biological processes and functions for these genes, Gene Ontology annotations with statistical testing are widely used. However, these analyses can produce a very large number of significantly altered biological processes. Thus, it is often challenging to interpret GO results and identify novel testable biological hypotheses. We present fast software for advanced gene annotation using semantic similarity for Gene Ontology terms combined with clustering and heat map visualisation. The methodology allows rapid identification of genes sharing the same Gene Ontology cluster. Our R-based semantic similarity open-source package has a speed advantage of over 2000-fold compared to existing implementations. From the resulting hierarchical clustering dendrogram, genes sharing a GO term can be identified, and their differences in gene expression patterns can be seen from the heat map. These methods facilitate advanced annotation of genes resulting from data analysis.
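The clustering step can be sketched as follows, assuming a pairwise GO-term semantic-similarity matrix has already been computed; the published package is R-based, so this Python version with SciPy is purely illustrative, and the terms and similarity values are invented.

import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

terms = ["GO:0006915", "GO:0008219", "GO:0006954", "GO:0002376"]
sim = np.array([[1.0, 0.8, 0.2, 0.3],
                [0.8, 1.0, 0.1, 0.2],
                [0.2, 0.1, 1.0, 0.7],
                [0.3, 0.2, 0.7, 1.0]])       # hypothetical similarities

dist = squareform(1.0 - sim, checks=False)   # condensed distance matrix
Z = linkage(dist, method="average")          # UPGMA hierarchical clustering
labels = fcluster(Z, t=0.5, criterion="distance")  # cut the dendrogram
for term, lab in zip(terms, labels):
    print(term, "-> cluster", lab)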
Gender and Physics: a Theoretical Analysis
NASA Astrophysics Data System (ADS)
Rolin, Kristina
This article argues that the objections raised by Koertge (1998), Gross and Levitt (1994), and Weinberg (1996) against feminist scholarship on gender and physics are unwarranted. The objections are that feminist science studies perpetuate gender stereotypes, are irrelevant to the content of physics, or promote epistemic relativism. In the first part of this article I argue that the concept of gender, as it has been developed in feminist theory, is a key to understanding why the first objection is misguided. Instead of reinforcing gender stereotypes, feminist science studies scholars can formulate empirically testable hypotheses regarding local and contested beliefs about gender. In the second part of this article I argue that a social analysis of scientific knowledge is a key to understanding why the second and the third objections are misguided. The concept of gender is relevant for understanding the social practice of physics, and the social practice of physics can be of epistemic importance. Instead of advancing epistemic relativism, feminist science studies scholars can make important contributions to a subfield of philosophy called social epistemology.
Pre-Service Teacher Scientific Behavior: Comparative Study of Paired Science Project Assignments
ERIC Educational Resources Information Center
Bulunuz, Mizrap; Tapan Broutin, Menekse Seden; Bulunuz, Nermin
2016-01-01
Problem Statement: University students usually lack the skills to rigorously define a multi-dimensional real-life problem and its limitations in an explicit, clear and testable way, which prevents them from forming a reliable method, obtaining relevant results and making balanced judgments to solve a problem. Purpose of the Study: The study…
1981-03-31
logic testing element and a concomitant testability criterion ideally suited to dynamic circuit applications and appropriate for automatic computer...making connections automatically. PF is an experimental feature which provides users with only four different chip sizes (full, half, quarter, and eighth)...an initial solution is found constructively which is improved by pair-wise swapping. Results show, however, that the constructive initial sorter, which
ERIC Educational Resources Information Center
Nauta, Margaret M.
2010-01-01
This article celebrates the 50th anniversary of the introduction of John L. Holland's (1959) theory of vocational personalities and work environments by describing the theory's development and evolution, its instrumentation, and its current status. Hallmarks of Holland's theory are its empirical testability and its user-friendliness. By…
Steering Performance, Tactical Vehicles
2015-07-29
4.1 General Vehicle and Test Characterization. 4.2 Weave Test...able to be driven in a straight line without steer input (i.e., "hands free"). If the vehicle pulls in either direction, the alignment should be...Evaluation Center (AEC) prior to using military personnel as test participants. 4. TEST PROCEDURES. 4.1 General Vehicle and Test
Binding and Scope Dependencies with "Floating Quantifiers" in Japanese
ERIC Educational Resources Information Center
Mukai, Emi
2012-01-01
The primary concern of this thesis is how we can achieve rigorous testability when we set the properties of the Computational System (hypothesized to be at the center of the language faculty) as our object of inquiry and informant judgments as a tool to construct and/or evaluate our hypotheses concerning the properties of the Computational System.…
The Many Methods to Measure Testability: A Horror Story.
1988-04-01
it seems overly simplistic to assign only one "magic number" as a viable design goal. Different design technologies such as digital, analog, mechanical...[table residue; the listed report elements are: FAILURE RATE, BASIC TEST PROGRAM, ATLAS TEST PROGRAM, EDIF FILE, TEST STRATEGY FLOWCHART, RTOK FREQUENCY, DIAGNOSIS AVERAGE COST]
The Social Basis of Math Teaching and Learning. Final Report.
ERIC Educational Resources Information Center
Orvik, James M.; Van Veldhuizen, Philip A.
This study was designed to identify a set of research questions and testable hypotheses to aid in planning long-range research. Five mathematics teachers were selected. These instructors enrolled in a special project-related seminar, video-taped sessions of their own mathematics classes, and kept field journals. The group met once a week to…
ERIC Educational Resources Information Center
Hunter, Lora Rose; Schmidt, Norman B.
2010-01-01
In this review, the extant literature concerning anxiety psychopathology in African American adults is summarized to develop a testable, explanatory framework with implications for future research. The model was designed to account for purported lower rates of anxiety disorders in African Americans compared to European Americans, along with other…
ERIC Educational Resources Information Center
Kulczynska, Agnieszka; Johnson, Reed; Frost, Tony; Margerum, Lawrence D.
2011-01-01
An advanced undergraduate laboratory project is described that integrates inorganic, analytical, physical, and biochemical techniques to reveal differences in binding between cationic metal complexes and anionic DNA (herring testes). Students were guided to formulate testable hypotheses based on the title question and a list of different metal…
ERIC Educational Resources Information Center
Duncan-Wiles, Daphne S.
2012-01-01
With the recent addition of engineering to most K-12 testable state standards, efficient and comprehensive instruments are needed to assess changes in student knowledge and perceptions of engineering. In this study, I developed the Students' Awareness and Perceptions of Learning Engineering (STAPLE) instrument to quantitatively measure fourth…
Wichita's Hispanics: Tensions, Concerns, and the Migrant Stream.
ERIC Educational Resources Information Center
Johnson, Kenneth F.; And Others
In an attempt to formulate a set of testable propositions about the dynamics of Hispanic life that will be valuable pedagogically and as a basis for public policy formation, this study assesses the impact of Hispanic Americans on Wichita, Kansas. Chapter 1 identifies the Hispanic origins of Kansas' 63,339 Hispanics who represent 2.7% of the…
Improving Health Care for Assisted Living Residents
ERIC Educational Resources Information Center
Kane, Robert L.; Mach, John R., Jr.
2007-01-01
Purpose: The purpose of this article is to explore how medical care is delivered to older people in assisted living (AL) settings and to suggest ways for improving it. Design and Methods: We present a review of the limited research available on health care for older AL residents and on building testable models of better ways to organize primary…
2008-12-01
1979; Wasserman and Faust, 1994). SNA thus relies heavily on graph theory to make predictions about network structure and thus social behavior...becomes a tool for increasing the specificity of theory, thinking through the theoretical implications, and generating testable predictions. In...to summarize Construct and its roots in constructural sociological theory. We discover that the (LPM) provides a mathematical bridge between
ERIC Educational Resources Information Center
Holden, Richard J.; Karsh, Ben-Tzion
2009-01-01
Primary objective: much research and practice related to the design and implementation of information technology in health care has been atheoretical. It is argued that using extant theory to develop testable models of health information technology (HIT) benefits both research and practice. Methods and procedures: several theories of motivation,…
Interpreting clinical trial results by deductive reasoning: In search of improved trial design.
Kurbel, Sven; Mihaljević, Slobodan
2017-10-01
Clinical trial results are often interpreted by inductive reasoning, in a trial design-limited manner, directed toward modifications of the current clinical practice. Deductive reasoning is an alternative in which results of relevant trials are combined in indisputable premises that lead to a conclusion easily testable in future trials. © 2017 WILEY Periodicals, Inc.
The part of cognitive science that is philosophy.
Dennett, Daniel C
2009-04-01
There is much good work for philosophers to do in cognitive science if they adopt the constructive attitude that prevails in science, work toward testable hypotheses, and take on the task of clarifying the relationship between the scientific concepts and the everyday concepts with which we conduct our moral lives. Copyright © 2009 Cognitive Science Society, Inc.
A Progress Report on a Thinking Laboratory for Deaf Children.
ERIC Educational Resources Information Center
Wolff, Sydney
A study was undertaken at the West Virginia School for the Deaf to test the assumption that the modes of thought of deaf children could be improved, and that improvement in concept formation would result in improvement in testable areas. Sixteen primary school children of approximately equal ability were selected and paired to form the control and…
Econometrics of exhaustible resource supply: a theory and an application
DOE Office of Scientific and Technical Information (OSTI.GOV)
Epple, D.
1983-01-01
This report takes a major step toward developing a fruitful approach to empirical analysis of resource supply. It is the first empirical application of resource theory that has successfully integrated the effects of depletion of nonrenewable resources with the effects of uncertainty about future costs and prices on supply behavior. Thus, the model is a major improvement over traditional engineering-optimization models that assume complete certainty, and over traditional econometric models that are only implicitly related to the theory of resource supply. The model is used to test hypotheses about interdependence of oil and natural gas discoveries, depletion, ultimate recovery, and the role of price expectations. This paper demonstrates the feasibility of using exhaustible resource theory in the development of empirically testable models. 19 refs., 1 fig., 5 tabs.
Canales, Javier; Moyano, Tomás C.; Villarroel, Eva; Gutiérrez, Rodrigo A.
2014-01-01
Nitrogen (N) is an essential macronutrient for plant growth and development. Plants adapt to changes in N availability partly by changes in global gene expression. We integrated publicly available root microarray data under contrasting nitrate conditions to identify new genes and functions important for adaptive nitrate responses in Arabidopsis thaliana roots. Overall, more than 2000 genes exhibited changes in expression in response to nitrate treatments in Arabidopsis thaliana root organs. Global regulation of gene expression by nitrate depends largely on the experimental context. However, despite significant differences from experiment to experiment in the identity of regulated genes, there is a robust nitrate response of specific biological functions. Integrative gene network analysis uncovered relationships between nitrate-responsive genes and 11 highly co-expressed gene clusters (modules). Four of these gene network modules have robust nitrate-responsive functions such as transport, signaling, and metabolism. Network analysis suggested that G2-like transcription factors are key regulatory factors controlling transport and signaling functions. Our meta-analysis highlights the role of biological processes not studied before in the context of the nitrate response, such as root hair development, and provides testable hypotheses to advance our understanding of nitrate responses in plants. PMID:24570678
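As an illustration of the module-detection idea, the hedged sketch below builds a toy co-expression network and extracts modules by modularity maximization; gene identifiers, data, and thresholds are invented stand-ins, not the paper's pipeline.

import itertools
import networkx as nx
import numpy as np
from networkx.algorithms.community import greedy_modularity_communities

rng = np.random.default_rng(1)
genes = ["AT1G10000", "AT1G20000", "AT2G30000",
         "AT3G40000", "AT4G50000", "AT5G60000"]
base1, base2 = rng.normal(size=30), rng.normal(size=30)   # two latent programs
expr = np.column_stack(
    [base1 + 0.1 * rng.normal(size=30) for _ in range(3)]
    + [base2 + 0.1 * rng.normal(size=30) for _ in range(3)])  # 30 arrays x 6 genes

G = nx.Graph()
for a, b in itertools.combinations(range(len(genes)), 2):
    r = np.corrcoef(expr[:, a], expr[:, b])[0, 1]
    if abs(r) > 0.5:                                      # co-expression threshold
        G.add_edge(genes[a], genes[b], weight=abs(r))

# Modules = communities of tightly co-expressed genes (two expected here).
for k, module in enumerate(greedy_modularity_communities(G)):
    print(f"module {k}:", sorted(module))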
Domain fusion analysis by applying relational algebra to protein sequence and domain databases
Truong, Kevin; Ikura, Mitsuhiko
2003-01-01
Background: Domain fusion analysis is a useful method to predict functionally linked proteins that may be involved in direct protein-protein interactions or in the same metabolic or signaling pathway. As separate domain databases like BLOCKS, PROSITE, Pfam, SMART, PRINTS-S, ProDom, TIGRFAMs, and amalgamated domain databases like InterPro continue to grow in size and quality, a computational method to perform domain fusion analysis that leverages on these efforts will become increasingly powerful. Results: This paper proposes a computational method employing relational algebra to find domain fusions in protein sequence databases. The feasibility of this method was illustrated on the SWISS-PROT+TrEMBL sequence database using domain predictions from the Pfam HMM (hidden Markov model) database. We identified 235 and 189 putative functionally linked protein partners in H. sapiens and S. cerevisiae, respectively. From scientific literature, we were able to confirm many of these functional linkages, while the remainder offer testable experimental hypotheses. Results can be viewed at . Conclusion: As the analysis can be computed quickly on any relational database that supports standard SQL (structured query language), it can be dynamically updated along with the sequence and domain databases, thereby improving the quality of predictions over time. PMID:12734020
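The relational-algebra idea can be sketched with a few SQL joins; the toy example below (run through Python's sqlite3, with invented protein and Pfam identifiers) finds a "Rosetta stone" protein whose fused domains occur on separate proteins in another organism.

import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE assign (organism TEXT, protein TEXT, domain TEXT)")
con.executemany("INSERT INTO assign VALUES (?, ?, ?)", [
    ("E. coli", "P_fused", "PF00001"),      # fusion protein carrying two domains
    ("E. coli", "P_fused", "PF00002"),
    ("S. cerevisiae", "P_a", "PF00001"),    # the same domains on separate proteins
    ("S. cerevisiae", "P_b", "PF00002"),
])

# Join 1 (f1, f2): domain pairs fused on a single protein.
# Join 2 (x, y): proteins of one organism each carrying half of the fused pair,
# which are therefore predicted to be functionally linked.
query = """
SELECT f1.protein AS fused, x.organism, x.protein, y.protein
FROM assign f1
JOIN assign f2 ON f1.protein = f2.protein AND f1.domain < f2.domain
JOIN assign x  ON x.domain = f1.domain AND x.protein <> f1.protein
JOIN assign y  ON y.domain = f2.domain AND y.protein <> f2.protein
               AND y.organism = x.organism AND y.protein <> x.protein
"""
for row in con.execute(query):
    print(row)   # ('P_fused', 'S. cerevisiae', 'P_a', 'P_b')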
Taxes in a Labor Supply Model with Joint Wage-Hours Determination.
ERIC Educational Resources Information Center
Rosen, Harvey S.
1976-01-01
Payroll and progressive income taxes play an enormous role in the American fiscal system. The purpose of this study is to present some econometric evidence on the effects of taxes on married women, a group of growing importance in the American labor force. A testable model of labor supply is developed which permits statistical estimation of a…
ERIC Educational Resources Information Center
Nunez, Rafael
2012-01-01
"The Journal of the Learning Sciences" has devoted this special issue to the study of embodied cognition (as it applies to mathematics), a topic that for several decades has gained attention in the cognitive sciences and in mathematics education, in particular. In this commentary, the author aims to address crucial questions in embodied…
Predictors of Organizational-Level Testability Attributes
1987-05-01
A. Elizabeth Gilreath; Brian A. Kelley. ...BRU count. These counts are described in subsections 6.2.1.1 and 6.2.1.2 and are further subdivided in Figure 6-4. 6.2.1.1 Functional Cross
Surface fire effects on conifer and hardwood crowns--applications of an integral plume model
Matthew Dickinson; Anthony Bova; Kathleen Kavanagh; Antoine Randolph; Lawrence Band
2009-01-01
An integral plume model was applied to the problems of tree death from canopy injury in dormant-season hardwoods and branch embolism in Douglas fir (Pseudotsuga menziesii) crowns. Our purpose was to generate testable hypotheses. We used the integral plume models to relate crown injury to bole injury and to explore the effects of variation in fire...
Analytical Procedures for Testability.
1983-01-01
Beat Internal Classifications", AD: A018516. "A System of Computer Aided Diagnosis with Blood Serum Chemistry Tests and Bayesian Statistics", AD: 786284...6 LIST OF TALS .. 1. Truth Table ......................................... 49 2. Covering Problem .............................. 93 3. Primary and...quential classification procedure in a coronary care ward is evaluated. In the toxicology field "A System of Computer Aided Diagnosis with Blood Serum
Objections to routine clinical outcomes measurement in mental health services: any evidence so far?
MacDonald, Alastair J D; Trauer, Tom
2010-12-01
Routine clinical outcomes measurement (RCOM) is gaining importance in mental health services. The aim was to examine whether criticisms published in advance of the development of RCOM have been borne out by data now available from such a programme. This was an observational study of routine ratings using HoNOS65+ at inception/admission and again at discharge in an old age psychiatry service from 1997 to 2008. Testable hypotheses were generated from each criticism amenable to empirical examination. Inter-rater reliability estimates were applied to observed differences in scores between community and ward patients using resampling. Five thousand one hundred eighty community inceptions and 862 admissions had HoNOS65+ ratings at referral/admission and discharge. We could find no evidence of gaming (artificially worse scores at inception and better at discharge), selection, attrition or detection bias, and ratings were consistent with diagnosis and level of service. Anticipated low levels of inter-rater reliability did not vitiate differences between levels of service. Although only hypotheses testable from within RCOM data were examined, and only 46% of eligible episodes had complete outcomes data, no evidence of the alleged biases was found. RCOM seems valid and practical in mental health services.
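A schematic of the resampling idea, with invented change scores and an assumed rater-noise level standing in for the published reliability estimates:

import numpy as np

rng = np.random.default_rng(42)
community = rng.normal(loc=3.0, scale=4.0, size=500)  # HoNOS65+ change scores
ward      = rng.normal(loc=5.0, scale=4.0, size=120)
observed = ward.mean() - community.mean()

rater_sd = 2.0      # noise scale implied by the assumed inter-rater reliability
n_boot, reversed_sign = 5000, 0
for _ in range(n_boot):
    c = community + rng.normal(0, rater_sd, community.size)
    w = ward + rng.normal(0, rater_sd, ward.size)
    # Does the community-versus-ward difference survive plausible rater noise?
    if w.mean() - c.mean() <= 0:
        reversed_sign += 1
print(f"observed difference {observed:.2f}; sign reversed under rater noise "
      f"in {reversed_sign / n_boot:.3%} of resamples")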
Electronic design of a multichannel programmable implant for neuromuscular electrical stimulation.
Arabi, K; Sawan, M A
1999-06-01
An advanced stimulator for neuromuscular stimulation of spinal cord injured patients has been developed. The stimulator is externally controlled and powered by a single encoded radio frequency carrier and has four independently controlled bipolar stimulation channels. It offers a wide range of reprogrammability and flexibility, and can be used in many neuromuscular electrical stimulation applications. The implant system is adaptable to the patient's needs and to future developments in stimulation algorithms by reprogramming the stimulator. The stimulator is capable of generating a wide range of stimulation waveforms and stimulation patterns and therefore is very suitable for selective nerve stimulation techniques. The reliability of the implant has been increased by using a forward error detection and correction communication protocol and by designing the chip for structural testability based on a scan-test approach. The implemented testability scheme makes it possible to verify the complete functionality of the implant before and after implantation. The stimulator's architecture is designed to be modular, and therefore its different blocks can be reused as standard building blocks in the design and implementation of other neuromuscular prostheses. Design-for-low-power techniques have also been employed to reduce the power consumption of the electronic circuitry.
Lift and drag in three-dimensional steady viscous and compressible flow
NASA Astrophysics Data System (ADS)
Liu, L. Q.; Wu, J. Z.; Su, W. D.; Kang, L. L.
2017-11-01
In a recent paper, Liu, Zhu, and Wu ["Lift and drag in two-dimensional steady viscous and compressible flow," J. Fluid Mech. 784, 304-341 (2015)] present a force theory for a body in a two-dimensional, viscous, compressible, and steady flow. In this companion paper, we do the same for three-dimensional flows. Using the fundamental solution of the linearized Navier-Stokes equations, we improve the force formula for incompressible flows originally derived by Goldstein in 1931 and summarized by Milne-Thomson in 1968, both being far from complete, to its perfect final form, which is further proved to be universally true from subsonic to supersonic flows. We call this result the unified force theorem, which states that the forces are always determined by the vector circulation Γϕ of longitudinal velocity and the scalar inflow Qψ of transverse velocity. Since this theorem is not directly observable either experimentally or computationally, a testable version is also derived, which, however, holds only in the linear far field. We name this version the testable unified force formula. After that, a general principle to increase the lift-drag ratio is proposed.
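The structure of the theorem can be written schematically as follows; the exact prefactors and sign conventions are those of the cited paper and are not reproduced here, with the classical two-dimensional Kutta-Joukowski lift L = ρUΓ as the familiar incompressible special case of the circulation part.

% Schematic only: the forces depend solely on the longitudinal circulation
% and the transverse inflow; prefactors and signs follow the cited paper.
\begin{align}
  \mathbf{F}_{\mathrm{lift}} &\propto \rho\,\mathbf{U}\times\boldsymbol{\Gamma}_{\phi},
    & \boldsymbol{\Gamma}_{\phi} &= \oint \mathbf{u}_{\phi}\cdot\mathrm{d}\mathbf{x},\\
  F_{\mathrm{drag}} &\propto \rho\,U\,Q_{\psi},
    & Q_{\psi} &= \int_{S} \mathbf{u}_{\psi}\cdot\mathbf{n}\,\mathrm{d}S,
\end{align}
where $\mathbf{u}_{\phi}$ and $\mathbf{u}_{\psi}$ denote the longitudinal and transverse parts of the velocity in the linear far field.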
Reilly, John J; Wells, Jonathan C K
2005-12-01
The WHO recommends exclusive breast-feeding for the first 6 months of life. At present, <2 % of mothers who breast-feed in the UK do so exclusively for 6 months. We propose the testable hypothesis that this is because many mothers do not provide sufficient breast milk to feed a 6-month-old baby adequately. We review recent evidence on energy requirements during infancy, and energy transfer from mother to baby, and consider the adequacy of exclusive breast-feeding to age 6 months for mothers and babies in the developed world. Evidence from our recent systematic review suggests that mean metabolisable energy intake in exclusively breast-fed infants at 6 months is 2.2-2.4 MJ/d (525-574 kcal/d), and mean energy requirement approximately 2.6-2.7 MJ/d (632-649 kcal/d), leading to a gap between the energy provided by milk and energy needs by 6 months for many babies. Our hypothesis is consistent with other evidence, and with evolutionary considerations, and we briefly review this other evidence. The hypothesis would be testable in a longitudinal study of infant energy balance using stable-isotope techniques, which are both practical and valid.
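A worked midpoint calculation makes the quoted energy gap concrete:
\[
  \underbrace{2.65\ \mathrm{MJ/d}}_{\text{mean requirement}}
  - \underbrace{2.3\ \mathrm{MJ/d}}_{\text{mean intake from milk}}
  \approx 0.35\ \mathrm{MJ/d} \approx 84\ \mathrm{kcal/d},
\]
i.e. roughly 13% of the estimated requirement at 6 months.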
Evolution of human brain functions: the functional structure of human consciousness.
Cloninger, C Robert
2009-11-01
The functional structure of self-aware consciousness in human beings is described based on the evolution of human brain functions. Prior work on heritable temperament and character traits is extended to account for the quantum-like and holographic properties (i.e. parts elicit wholes) of self-aware consciousness. Cladistic analysis is used to identify the succession of ancestors leading to human beings. The functional capacities that emerge along this lineage of ancestors are described. The ecological context in which each cladogenesis occurred is described to illustrate the shifting balance of evolution as a complex adaptive system. Comparative neuroanatomy is reviewed to identify the brain structures and networks that emerged coincident with the emergent brain functions. Individual differences in human temperament traits were well developed in the common ancestor shared by reptiles and humans. Neocortical development in mammals proceeded in five major transitions: from early reptiles to early mammals, early primates, simians, early Homo, and modern Homo sapiens. These transitions provide the foundation for human self-awareness related to sexuality, materiality, emotionality, intellectuality, and spirituality, respectively. The functional structure of human self-aware consciousness is concerned with the regulation of five planes of being: sexuality, materiality, emotionality, intellectuality, and spirituality. Each plane elaborates neocortical functions organized around one of the five special senses. The interactions among these five planes give rise to a 5 x 5 matrix of subplanes, which are functions that coarsely describe the focus of neocortical regulation. Each of these 25 neocortical functions regulates each of five basic motives or drives that can be measured as temperaments or basic emotions related to fear, anger, disgust, surprise, and happiness/sadness. The resulting 5 x 5 x 5 matrix of human characteristics provides a general and testable model of the functional structure of human consciousness that includes personality, physicality, emotionality, cognition, and spirituality in a unified developmental framework.
Neuromechanical simulation of the locust jump
Cofer, D.; Cymbalyuk, G.; Heitler, W. J.; Edwards, D. H.
2010-01-01
The neural circuitry and biomechanics of kicking in locusts have been studied to understand their roles in the control of both kicking and jumping. It has been hypothesized that the same neural circuit and biomechanics governed both behaviors but this hypothesis was not testable with current technology. We built a neuromechanical model to test this and to gain a better understanding of the role of the semi-lunar process (SLP) in jump dynamics. The jumping and kicking behaviors of the model were tested by comparing them with a variety of published data, and were found to reproduce the results from live animals. This confirmed that the kick neural circuitry can produce the jump behavior. The SLP is a set of highly sclerotized bands of cuticle that can be bent to store energy for use during kicking and jumping. It has not been possible to directly test the effects of the SLP on jump performance because it is an integral part of the joint, and attempts to remove its influence prevent the locust from being able to jump. Simulations demonstrated that the SLP can significantly increase jump distance, power, total energy and duration of the jump impulse. In addition, the geometry of the joint enables the SLP force to assist leg flexion when the leg is flexed, and to assist extension once the leg has begun to extend. PMID:20228342
Rothwell, Gar W; Wyatt, Sarah E; Tomescu, Alexandru M F
2014-06-01
Paleontology yields essential evidence for inferring not only the pattern of evolution, but also the genetic basis of evolution within an ontogenetic framework. Plant fossils provide evidence for the pattern of plant evolution in the form of transformational series of structure through time. Developmentally diagnostic structural features that serve as "fingerprints" of regulatory genetic pathways also are preserved by plant fossils, and here we provide examples of how those fingerprints can be used to infer the mechanisms by which plant form and development have evolved. When coupled with an understanding of variations and systematic distributions of specific regulatory genetic pathways, this approach provides an avenue for testing evolutionary hypotheses at the organismal level that is analogous to employing bioinformatics to explore genetics at the genomic level. The positions where specific genes, gene families, and developmental regulatory mechanisms first appear in phylogenies are correlated with the positions where fossils with the corresponding structures occur on the tree, thereby yielding testable hypotheses that extend our understanding of the role of developmental changes in the evolution of the body plans of vascular plant sporophytes. As a result, we now have new and powerful methodologies for characterizing major evolutionary changes in morphology, anatomy, and physiology that have resulted from combinations of genetic regulatory changes and that have produced the synapomorphies by which we recognize major clades of plants. © 2014 Botanical Society of America, Inc.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Genkin, Mikhail Mikhailovich; Sokolov, Andrey; Lavrentovich, Oleg D.; ...
2017-03-08
Active matter exemplified by suspensions of motile bacteria or synthetic self-propelled particles exhibits a remarkable propensity to self-organization and collective motion. The local input of energy and simple particle interactions often lead to complex emergent behavior manifested by the formation of macroscopic vortices and coherent structures with long-range order. A realization of an active system has been conceived by combining swimming bacteria and a lyotropic liquid crystal. Here, by coupling the well-established and validated model of nematic liquid crystals with the bacterial dynamics, we develop a computational model describing intricate properties of such a living nematic. In faithful agreement with the experiment, the model reproduces the onset of periodic undulation of the director and consequent proliferation of topological defects with the increase in bacterial concentration. It yields a testable prediction on the accumulation of bacteria in the cores of +1/2 topological defects and depletion of bacteria in the cores of -1/2 defects. Our dedicated experiment on motile bacteria suspended in a freestanding liquid crystalline film fully confirms this prediction. Lastly, our findings suggest novel approaches for trapping and transport of bacteria and synthetic swimmers in anisotropic liquids and extend a scope of tools to control and manipulate microscopic objects in active matter.
Evolution of the cerebellum as a neuronal machine for Bayesian state estimation
NASA Astrophysics Data System (ADS)
Paulin, M. G.
2005-09-01
The cerebellum evolved in association with the electric sense and vestibular sense of the earliest vertebrates. Accurate information provided by these sensory systems would have been essential for precise control of orienting behavior in predation. A simple model shows that individual spikes in electrosensory primary afferent neurons can be interpreted as measurements of prey location. Using this result, I construct a computational neural model in which the spatial distribution of spikes in a secondary electrosensory map forms a Monte Carlo approximation to the Bayesian posterior distribution of prey locations given the sense data. The neural circuit that emerges naturally to perform this task resembles the cerebellar-like hindbrain electrosensory filtering circuitry of sharks and other electrosensory vertebrates. The optimal filtering mechanism can be extended to handle dynamical targets observed from a dynamical platform; that is, to construct an optimal dynamical state estimator using spiking neurons. This may provide a generic model of cerebellar computation. Vertebrate motion-sensing neurons have specific fractional-order dynamical characteristics that allow Bayesian state estimators to be implemented elegantly and efficiently, using simple operations with asynchronous pulses, i.e. spikes. The computational neural models described in this paper represent a novel kind of particle filter, using spikes as particles. The models are specific and make testable predictions about computational mechanisms in cerebellar circuitry, while providing a plausible explanation of cerebellar contributions to aspects of motor control, perception and cognition.
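To make the "spikes as particles" idea concrete, here is a minimal conventional bootstrap particle filter for a one-dimensional drifting target; each particle plays the role the paper assigns to a single spike, namely a sample from the posterior over target state. This is a standard filter, not the paper's neural implementation, and all noise scales are illustrative.

import numpy as np

rng = np.random.default_rng(3)
n_steps, n_particles = 50, 500
true_x = 0.0
particles = rng.normal(0, 1, n_particles)        # initial belief over target state

for t in range(n_steps):
    true_x += rng.normal(0, 0.1)                 # target dynamics (random drift)
    z = true_x + rng.normal(0, 0.5)              # noisy sensory measurement
    particles += rng.normal(0, 0.1, n_particles) # predict: propagate each particle
    w = np.exp(-0.5 * ((z - particles) / 0.5) ** 2)  # weight by measurement likelihood
    w /= w.sum()
    idx = rng.choice(n_particles, n_particles, p=w)  # resample in proportion to weight
    particles = particles[idx]
    if t % 10 == 0:
        print(f"t={t:2d} true={true_x:+.2f} estimate={particles.mean():+.2f}")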
Vacuum stability and naturalness in type-II seesaw
Haba, Naoyuki; Ishida, Hiroyuki; Okada, Nobuchika; ...
2016-06-16
Here, we study the vacuum stability and perturbativity conditions in the minimal type-II seesaw model. These conditions give characteristic constraints to the model parameters. In the model, there is an SU(2)_L triplet scalar field, which could cause a large Higgs mass correction. From the naturalness point of view, heavy Higgs masses should be lower than 350 GeV, which may be testable by the LHC Run-II results. Due to the effects of the triplet scalar field, the branching ratios of the Higgs decay (h → γγ, Zγ) deviate from the standard model, and a large parameter region is excluded by the recent ATLAS and CMS combined analysis of h → γγ. Our result of the signal strength for h → γγ is R_γγ ≲ 1.1, but its deviation is too small to observe at the LHC experiment.
A test of the hypothesis that correlational selection generates genetic correlations.
Roff, Derek A; Fairbairn, Daphne J
2012-09-01
Theory predicts that correlational selection on two traits will cause the major axis of the bivariate G matrix to orient itself in the same direction as the correlational selection gradient. Two testable predictions follow from this: for a given pair of traits, (1) the sign of correlational selection gradient should be the same as that of the genetic correlation, and (2) the correlational selection gradient should be positively correlated with the value of the genetic correlation. We test this hypothesis with a meta-analysis utilizing empirical estimates of correlational selection gradients and measures of the correlation between the two focal traits. Our results are consistent with both predictions and hence support the underlying hypothesis that correlational selection generates a genetic correlation between the two traits and hence orients the bivariate G matrix. © 2012 The Author(s). Evolution© 2012 The Society for the Study of Evolution.
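The two predictions translate directly into simple statistical tests; the sketch below runs them on invented study-level estimates rather than the published ones.

import numpy as np
from scipy import stats

# (correlational selection gradient, genetic correlation) per trait pair
data = np.array([( 0.12,  0.35), (-0.20, -0.15), ( 0.05,  0.10),
                 (-0.08, -0.30), ( 0.25,  0.40), ( 0.02, -0.05)])
grad, r_g = data[:, 0], data[:, 1]

# Prediction 1: signs agree more often than chance (binomial test).
agree = int(np.sum(np.sign(grad) == np.sign(r_g)))
p_sign = stats.binomtest(agree, n=len(data), p=0.5).pvalue

# Prediction 2: gradients and genetic correlations are positively correlated.
rho, p_rho = stats.spearmanr(grad, r_g)
print(f"sign agreement {agree}/{len(data)} (p={p_sign:.3f}); "
      f"Spearman rho={rho:.2f} (p={p_rho:.3f})")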
Flight elements: Fault detection and fault management
NASA Technical Reports Server (NTRS)
Lum, H.; Patterson-Hine, A.; Edge, J. T.; Lawler, D.
1990-01-01
Fault management for an intelligent computational system must be developed using a top-down integrated engineering approach. The proposed approach integrates the overall environment involving sensors and their associated data; design knowledge capture; operations; fault detection, identification, and reconfiguration; testability; causal models, including digraph matrix analysis; and overall performance impacts on the hardware and software architecture. Implementation of the concept to achieve a real-time intelligent fault detection and management system will be accomplished via several objectives: development of fault-tolerant/FDIR requirements and specifications at the systems level, carried through from conceptual design to implementation and mission operations; implementation of monitoring, diagnosis, and reconfiguration at all system levels, providing fault isolation and system integration; optimization of system operations to manage degraded system performance through system integration; and reduction of development and operations costs through the implementation of an intelligent real-time fault detection and fault management system and an information management system.
An application of statistics to comparative metagenomics
Rodriguez-Brito, Beltran; Rohwer, Forest; Edwards, Robert A
2006-01-01
Background: Metagenomics, sequence analyses of genomic DNA isolated directly from the environments, can be used to identify organisms and model community dynamics of a particular ecosystem. Metagenomics also has the potential to identify significantly different metabolic potential in different environments. Results: Here we use a statistical method to compare curated subsystems, to predict the physiology, metabolism, and ecology from metagenomes. This approach can be used to identify those subsystems that are significantly different between metagenome sequences. Subsystems that were overrepresented in the Sargasso Sea and Acid Mine Drainage metagenome when compared to non-redundant databases were identified. Conclusion: The methodology described herein applies statistics to the comparisons of metabolic potential in metagenomes. This analysis reveals those subsystems that are more, or less, represented in the different environments that are compared. These differences in metabolic potential lead to several testable hypotheses about physiology and metabolism of microbes from these ecosystems. PMID:16549025
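For a single subsystem, the comparison reduces to a contingency-table test on assignment counts; below is a minimal sketch with invented counts (the paper's exact statistic may differ).

from scipy import stats

# reads assigned to one subsystem vs. all other reads, per metagenome
sargasso   = (120, 9880)     # (subsystem hits, other hits)
mine_drain = (40, 9960)
table = [list(sargasso), list(mine_drain)]
odds, p = stats.fisher_exact(table)
print(f"odds ratio {odds:.2f}, p = {p:.2e}")
# In the full analysis this is repeated per subsystem, with a multiple-testing
# correction, to flag subsystems that are over- or under-represented.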
pp → A → Zh and the wrong-sign limit of the two-Higgs-doublet model
NASA Astrophysics Data System (ADS)
Ferreira, Pedro M.; Liebler, Stefan; Wittbrodt, Jonas
2018-03-01
We point out the importance of the decay channels A → Zh and H → VV in the wrong-sign limit of the two-Higgs-doublet model (2HDM) of type II. They can be the dominant decay modes at moderate values of tan β, even if the (pseudo)scalar mass is above the threshold where the decay into a pair of top quarks is kinematically open. Accordingly, large cross sections pp → A → Zh and pp → H → VV are obtained and currently probed by the LHC experiments, yielding conclusive statements about the remaining parameter space of the wrong-sign limit. In addition, mild excesses, as recently found in the ATLAS analysis of bb̄ → A → Zh, could be explained. The wrong-sign limit makes other important testable predictions for the light Higgs boson couplings.
Bosdriesz, Evert; Magnúsdóttir, Stefanía; Bruggeman, Frank J; Teusink, Bas; Molenaar, Douwe
2015-06-01
Microorganisms rely on binding-protein-assisted, active transport systems to scavenge for scarce nutrients. Several advantages of using binding proteins in such uptake systems have been proposed. However, a systematic, rigorous and quantitative analysis of the function of binding proteins is lacking. By combining knowledge of selection pressure and physicochemical constraints, we derive kinetic, thermodynamic, and stoichiometric properties of binding-protein-dependent transport systems that enable a maximal import activity per amount of transporter. Under the hypothesis that this maximal specific activity of the transport complex is the selection objective, binding protein concentrations should exceed the concentration of both the scarce nutrient and the transporter. This increases the encounter rate of transporter with loaded binding protein at low substrate concentrations, thereby enhancing the affinity and specific uptake rate. These predictions are experimentally testable, and a number of observations confirm them. © 2015 FEBS.
Pricing for scarcity? An efficiency analysis of increasing block tariffs
NASA Astrophysics Data System (ADS)
Monteiro, Henrique; Roseta-Palma, Catarina
2011-06-01
Water pricing schedules often contain significant nonlinearities, such as the increasing block tariff (IBT) structure that is abundantly applied for residential users. The IBT is frequently supported as a good tool for achieving the goals of equity, water conservation, and revenue neutrality but seldom has been grounded on efficiency justifications. In particular, existing literature on water pricing establishes that although efficient schedules will depend on demand and supply characteristics, IBT cannot usually be recommended. In this paper, we consider whether the explicit inclusion of scarcity considerations can strengthen the appeal of IBT. Results show that when both demand and costs react to climate factors, increasing marginal prices may come about as a response to a combination of water scarcity and customer heterogeneity. We derive testable conditions and then illustrate their application through an estimation of Portuguese residential water demand. We show that the recommended tariff schedule hinges crucially on the choice of functional form for demand.
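A minimal increasing-block-tariff (IBT) bill calculation shows the nonlinearity under discussion; the block limits and prices below are invented, not the Portuguese tariffs analyzed in the paper.

def ibt_bill(m3, blocks=((5, 0.5), (10, 1.0), (float("inf"), 2.0))):
    """Charge each successive block of consumption at a higher marginal price."""
    bill, lower = 0.0, 0.0
    for upper, price in blocks:
        if m3 > lower:
            bill += (min(m3, upper) - lower) * price
        lower = upper
    return bill

for use in (4, 8, 15):
    print(f"{use:>2} m3 -> EUR {ibt_bill(use):.2f}")
# 4 m3 -> 2.00; 8 m3 -> 5.50; 15 m3 -> 17.50: the marginal price rises with use.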
Simpson, Eleanor H.; Kellendonk, Christoph
2016-01-01
The dopamine hypothesis of schizophrenia is supported by a large number of imaging studies that have identified an increase in dopamine binding at the D2 receptor selectively in the striatum. Here we review a decade of work using a regionally restricted and temporally regulated transgenic mouse model to investigate the behavioral, molecular, electrophysiological, and anatomical consequences of selective D2 receptor upregulation in the striatum. These studies have identified new and potentially important biomarkers at the circuit and molecular level that can now be explored in patients with schizophrenia. They provide an example of how animal models and their detailed level of neurobiological analysis allow a deepening of our understanding of the relationship between neuronal circuit function and symptoms of schizophrenia, and as a consequence generate new hypotheses that are testable in patients. PMID:27720388
Cultural prototypes and dimensions of honor.
Cross, Susan E; Uskul, Ayse K; Gerçek-Swing, Berna; Sunbay, Zeynep; Alözkan, Cansu; Günsoy, Ceren; Ataca, Bilge; Karakitapoglu-Aygün, Zahide
2014-02-01
Research evidence and theoretical accounts of honor point to differing definitions of the construct in differing cultural contexts. The current studies address the question "What is honor?" using a prototype approach in Turkey and the Northern United States. Studies 1a/1b revealed substantial differences in the specific features generated by members of the two groups, but Studies 2 and 3 revealed cultural similarities in the underlying dimensions of self-respect, moral behavior, and social status/respect. Ratings of the centrality and personal importance of these factors were similar across the two groups, but their association with other relevant constructs differed. The tripartite nature of honor uncovered in these studies helps observers and researchers alike understand how diverse responses to situations can be attributed to honor. Inclusion of a prototype analysis into the literature on honor cultures can provide enhanced coverage of the concept that may lead to testable hypotheses and new theoretical developments.
The Fate of the Method of 'Paradigms' in Paleobiology.
Rudwick, Martin J S
2017-11-02
An earlier article described the mid-twentieth century origins of the method of "paradigms" in paleobiology, as a way of making testable hypotheses about the functional morphology of extinct organisms. The present article describes the use of "paradigms" through the 1970s and, briefly, to the end of the century. After I had proposed the paradigm method to help interpret the ecological history of brachiopods, my students developed it in relation to that and other invertebrate phyla, notably in Euan Clarkson's analysis of vision in trilobites. David Raup's computer-aided "theoretical morphology" was then combined with my functional or adaptive emphasis, in Adolf Seilacher's tripartite "constructional morphology." Stephen Jay Gould, who had strongly endorsed the method, later switched to criticizing the "adaptationist program" he claimed it embodied. Although the explicit use of paradigms in paleobiology had declined by the end of the century, the method was tacitly subsumed into functional morphology as "biomechanics."
Examining the nature of retrocausal effects in biology and psychology
NASA Astrophysics Data System (ADS)
Mossbridge, Julia
2017-05-01
Multiple laboratories have reported physiological and psychological changes associated with future events that are designed to be unpredictable by normal sensory means. Such phenomena seem to be examples of retrocausality at the macroscopic level. Here I will discuss the characteristics of seemingly retrocausal effects in biology and psychology, specifically examining a biological and a psychological form of precognition, predictive anticipatory activity (PAA) and implicit precognition. The aim of this examination is to offer an analysis of the constraints posed by the characteristics of macroscopic retrocausal effects. Such constraints are critical to assessing any physical theory that purports to explain these effects. Following a brief introduction to recent research on PAA and implicit precognition, I will describe what I believe we have learned so far about the nature of these effects, and conclude with a testable, yet embryonic, model of macroscopic retrocausal phenomena.
The (virtual) conceptual necessity of quantum probabilities in cognitive psychology.
Blutner, Reinhard; beim Graben, Peter
2013-06-01
We propose a way in which Pothos & Busemeyer (P&B) could strengthen their position. Taking a dynamic stance, we consider cognitive tests as functions that transfer a given input state into the state after testing. Under very general conditions, it can be shown that testable properties in cognition form an orthomodular lattice. Gleason's theorem then yields the conceptual necessity of quantum probabilities (QP).
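The probability rule that Gleason's theorem singles out can be illustrated numerically: probabilities of testable properties (projectors) are given by p = tr(ρP). The state and projectors below are arbitrary examples, not drawn from P&B.

import numpy as np

# A mixed cognitive "state" over a 2-dimensional feature space.
rho = np.array([[0.7, 0.2],
                [0.2, 0.3]])

# Two incompatible yes/no tests (non-commuting projectors).
P1 = np.array([[1.0, 0.0], [0.0, 0.0]])   # "yes" in the first basis
v = np.array([1.0, 1.0]) / np.sqrt(2)
P2 = np.outer(v, v)                       # "yes" in a rotated basis

for name, P in [("P1", P1), ("P2", P2)]:
    print(name, "->", float(np.trace(rho @ P)))   # Gleason/Born probabilities
# Because P1 and P2 do not commute, the order of testing matters, which is the
# structural feature the quantum-probability framework exploits.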
The Systems Test Architect: Enabling The Leap From Testable To Tested
2016-09-01
engineering process requires an interdisciplinary approach, involving both technical and managerial disciplines applied to the synthesis and integration...relationship between the technical and managerial aspects of systems engineering. TP-2003-020-01 describes measurement as having the following...it is evident that DOD makes great strides to tackle both the managerial and technical aspects of test and evaluation within the systems
Active Diagnosis of Navy Machinery Rev 2.0
2016-10-01
electrical distribution and potable water supply systems. Because of these dependencies, ship auxiliary system failures can cause combat load failure...buildup generally causes a pipe to disconnect from a junction, causing water to leak. This limits the faults that are testable, since many of the faults...pipes, junctions, pumps, flow meters, thermal loads, check valves, and water tanks. Each agent is responsible for maintaining its constraints locally
Silicon Wafer Advanced Packaging (SWAP). Multichip Module (MCM) Foundry Study. Version 2
1991-04-08
Next Layer Dielectric Spacing - Additional Metal Thickness Impact on Dielectric Uniformity/Adhesion. The first step in the experimental design would be...[glossary residue] CAM - computer aided manufacturing; CAE - computer aided engineering; CALCE - computer aided life cycle engineering center; CARMA - computer aided...; CVD - chemical vapor deposition; DA - design automation; DEC - Digital Equipment Corporation; DFT - design for testability
Structural Genomics of Bacterial Virulence Factors
2006-05-01
positioned in the unit cell by Molecular Replacement (Protein Data Bank (PDB) ID code 1acc) (ref. 6) using MOLREP, and refined with REFMAC version 5.0 (ref. 24)...increase our understanding of the molecular mechanisms of pathogenicity, putting us in a stronger position to anticipate and react to emerging...term, the accumulated structural information will generate important and testable hypotheses that will increase our understanding of the molecular
Testability/Diagnostics Design Encyclopedia
1990-09-01
weapon system that is pushing the state of the art and produced in limited numbers, with questionable historical data on their operation, one can...designs with questionable basis and justification. Unfortunately, this process has not been transformed from an art to a rigorous methodology...REQUIREMENT #2.1 - On-the-job training - Formal school training - O-Level data acquisition/collection system (and data management) - Requirements to
Rapid Communication: Quasi-gedanken experiment challenging the no-signalling theorem
NASA Astrophysics Data System (ADS)
Kalamidas, Demetrios A.
2018-01-01
Kennedy (Philos. Sci. 62, 4 (1995)) has argued that the various quantum mechanical no-signalling proofs formulated thus far share a common mathematical framework, are circular in nature, and do not preclude the construction of empirically testable schemes wherein superluminal exchange of information can occur. In light of this thesis, we present a potentially feasible quantum-optical scheme that purports to enable superluminal signalling.
Retrieval as a Fast Route to Memory Consolidation.
Antony, James W; Ferreira, Catarina S; Norman, Kenneth A; Wimber, Maria
2017-08-01
Retrieval-mediated learning is a powerful way to make memories last, but its neurocognitive mechanisms remain unclear. We propose that retrieval acts as a rapid consolidation event, supporting the creation of adaptive hippocampal-neocortical representations via the 'online' reactivation of associative information. We describe parallels between online retrieval and offline consolidation and offer testable predictions for future research. Copyright © 2017 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Maestripieri, Dario
2005-01-01
Comparative behavioral research is important for a number of reasons and can contribute to the understanding of human behavior and development in many different ways. Research with animal models of human behavior and development can be a source not only of general principles and testable hypotheses but also of empirical information that may be…
1982-10-01
e.g., providing voters in TMR systems and detection-switching requirements in standby-sparing systems. The application of mathematical theory of...and time redundancy required for error detection and correction, are interrelated. Mathematical modeling, when applied to fault tolerant systems, can...1.1 Some Fundamental Principles; 1.2 Mathematical Theory of
NASA Space Flight Vehicle Fault Isolation Challenges
NASA Technical Reports Server (NTRS)
Neeley, James R.; Jones, James V.; Bramon, Christopher J.; Inman, Sharon K.; Tuttle, Loraine
2016-01-01
The Space Launch System (SLS) is the new NASA heavy lift launch vehicle in development and is scheduled for its first mission in 2018. SLS has many of the same logistics challenges as any other large-scale program. However, SLS also faces unique challenges related to testability. This presentation will address the SLS challenges for diagnostics and fault isolation, along with the analyses and decisions to mitigate risk.
A Survey of Reliability, Maintainability, Supportability, and Testability Software Tools
1991-04-01
designs in terms of their contributions toward forced mission termination and vehicle or function loss. Includes the ability to treat failure modes of...ABSTRACT: Inputs: MTBFs, MTTRs, support equipment costs, equipment weights and costs, available targets, military occupational specialty skill level and...US Army CECOM NAME: SPARECOST ABSTRACT: Calculates expected number of failures and performs spares holding optimization based on cost, weight, or
Models of cooperative dynamics from biomolecules to magnets
NASA Astrophysics Data System (ADS)
Mobley, David Lowell
This work details application of computer models to several biological systems (prion diseases and Alzheimer's disease) and a magnetic system. These share some common themes, which are discussed. Here, simple lattice-based models are applied to aggregation of misfolded protein in prion diseases like Mad Cow disease. These can explain key features of the diseases. The modeling is based on aggregation being essential in establishing the time-course of infectivity. Growth of initial aggregates is assumed to dominate the experimentally observed lag phase. Subsequent fission, regrowth, and fission set apart the exponential doubling phase in disease progression. We explore several possible modes of growth for 2-D aggregates and suggest the model providing the best explanation for the experimental data. We develop testable predictions from this model. Like prion disease, Alzheimer's disease (AD) is an amyloid disease characterized by large aggregates in the brain. However, evidence increasingly points away from these as the toxic agent and towards oligomers of the Abeta peptide. We explore one possible toxicity mechanism---insertion of Abeta into cell membranes and formation of harmful ion channels. We find that mutations in this peptide which cause familial Alzheimer's disease (FAD) also affect the insertion of this peptide into membranes in a fairly consistent way, suggesting that this toxicity mechanism may be relevant biologically. We find a particular inserted configuration which may be especially harmful and develop testable predictions to verify whether or not this is the case. Nucleation is an essential feature of our models for prion disease, in that it protects normal, healthy individuals from getting prion disease. Nucleation is important in many other areas, and we modify our lattice-based nucleation model to apply to a hysteretic magnetic system where nucleation has been suggested to be important. From a simple model, we find qualitative agreement with experiment, and make testable experimental predictions concerning time-dependence and temperature-dependence of the major hysteresis loop and reversal curves which have been experimentally verified. We argue why this model may be suitable for systems like these and explain implications for Ising-like models. We suggest implications for future modeling work. Finally, we present suggestions for future work in all three areas.
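The lag-then-doubling dynamics described above can be reproduced by a toy growth-and-fission simulation; the rates and fission size below are illustrative only, not fitted values from the dissertation.

import numpy as np

rng = np.random.default_rng(7)
aggregates = [1]                        # aggregate sizes; a single initial seed
growth_p, fission_size = 0.9, 10

for step in range(60):
    new = []
    for size in aggregates:
        if rng.random() < growth_p:
            size += 1                   # monomer addition to the aggregate
        if size >= fission_size:        # fission: one aggregate becomes two
            new.extend([size // 2, size - size // 2])
        else:
            new.append(size)
    aggregates = new
    if step % 10 == 0:
        print(f"step {step:2d}: {len(aggregates):4d} aggregates, "
              f"{sum(aggregates):5d} total units")
# The aggregate count stays at 1 during the lag phase, then doubles roughly
# once per regrowth-fission cycle, mirroring the lag and exponential phases
# of infectivity in the prion model.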
Colquhoun, Heather L; Carroll, Kelly; Eva, Kevin W; Grimshaw, Jeremy M; Ivers, Noah; Michie, Susan; Sales, Anne; Brehaut, Jamie C
2017-09-29
Audit and feedback (A&F) is a common strategy for helping health providers to implement evidence into practice. Despite being extensively studied, health care A&F interventions remain variably effective, with overall effect sizes that have not improved since 2003. Contributing to this stagnation is the fact that most health care A&F interventions have largely been designed without being informed by theoretical understanding from the behavioral and social sciences. To determine whether this trend can be improved, the objective of this study was to develop a list of testable, theory-informed hypotheses about how to design more effective A&F interventions. Using purposive sampling, semi-structured 60-90-min telephone interviews were conducted with experts in theories related to A&F from a range of fields (e.g., cognitive, health, and organizational psychology, medical decision-making, economics). Guided by detailed descriptions of A&F interventions from the health care literature, interviewees described how they would approach the problem of designing improved A&F interventions. Specific, theory-informed hypotheses about the conditions for effective design and delivery of A&F interventions were elicited from the interviews. Three coders working independently assigned the resulting hypotheses to themes, and themes to categories, in an iterative process. We conducted 28 interviews and identified 313 theory-informed hypotheses, which were placed into 30 themes. The 30 themes included hypotheses related to the following five categories: the A&F recipient (seven themes), the content of the A&F (ten themes), the process of delivery of the A&F (six themes), the behavior that was the focus of the A&F (three themes), and other (four themes). We have identified a set of testable, theory-informed hypotheses from a broad range of behavioral and social sciences that suggest conditions for more effective A&F interventions. This work demonstrates the breadth of perspectives about A&F from non-healthcare-specific disciplines in a way that yields testable hypotheses for healthcare A&F interventions. These results will serve as the foundation for further work seeking to set research priorities among the A&F research community.
Developing Tools to Test the Thermo-Mechanical Models, Examples at Crustal and Upper Mantle Scale
NASA Astrophysics Data System (ADS)
Le Pourhiet, L.; Yamato, P.; Burov, E.; Gurnis, M.
2005-12-01
Testing geodynamical models is never an easy task. Depending on the spatio-temporal scale of the model, different testable predictions are needed, and no magic recipe exists. This contribution first presents different methods that have been used to test thermo-mechanical modeling results at upper crustal, lithospheric, and upper mantle scales, using three geodynamical examples: the Gulf of Corinth (Greece), the Western Alps, and the Sierra Nevada. At short spatio-temporal scales (e.g., the Gulf of Corinth), the resolution of the numerical models is usually sufficient to capture the timing and kinematics of the faults precisely enough to be tested by tectono-stratigraphic arguments. In actively deforming areas, microseismicity can be compared to the effective rheology, and the P and T axes of focal mechanisms can be compared with the local orientation of the major component of the stress tensor. At lithospheric scale, the resolution of the models no longer permits constraining them by direct observations (i.e., structural data from the field or seismic reflection). Instead, synthetic P-T-t paths may be computed and compared to natural ones in terms of exhumation rates for ancient orogens. Topography may also help, but on continents it depends mainly on erosion laws that are difficult to constrain. Deeper in the mantle, the only available constraints are long-wavelength topographic data and tomographic "data". The major problem to overcome at lithospheric and upper mantle scales is that these so-called "data" actually result from inverse models of the real data, and those inverse models are based on synthetic models. Post-processing P and S wave velocities is not sufficient to make testable predictions at upper mantle scale. Instead, direct wave-propagation models must be computed, which allows checking whether the differences between two models constitute a testable prediction. In the longer term, we may be able to use such synthetic models to reduce the residual in the inversion of elastic wave arrival times.
Color vision deficiency in preschool children: the multi-ethnic pediatric eye disease study.
Xie, John Z; Tarczy-Hornoch, Kristina; Lin, Jesse; Cotter, Susan A; Torres, Mina; Varma, Rohit
2014-07-01
To determine the sex- and ethnicity-specific prevalence of color vision deficiency (CVD) in black, Asian, Hispanic, and non-Hispanic white preschool children. Population-based, cross-sectional study. The Multi-Ethnic Pediatric Eye Disease Study is a population-based evaluation of the prevalence of vision disorders in children in Southern California. A total of 5960 subjects 30 to 72 months of age were recruited for the study, of whom 4177 were able to complete color vision testing (1265 black, 812 Asian, 1280 Hispanic, and 820 non-Hispanic white). Color vision testing was performed using Color Vision Testing Made Easy color plates (Home Vision Care, Gulf Breeze, FL), and diagnostic confirmatory testing was performed using the Waggoner HRR Diagnostic Test color plates (Home Vision Care). Testability of color vision in preschool children between 30 and 72 months of age and prevalence of CVD stratified by age, sex, and ethnicity. Testability was 17% in children younger than 37 months of age, increasing to 57% in children 37 to 48 months of age, 89% in children 49 to 60 months of age, and 98% in children 61 to 72 months of age. The prevalence of CVD among boys was 1.4% for black, 3.1% for Asian, 2.6% for Hispanic, and 5.6% for non-Hispanic white children; the prevalence in girls was 0.0% to 0.5% for all ethnicities. The ethnic difference in CVD was statistically significant between black and non-Hispanic white children (P = 0.0003) and between Hispanic and non-Hispanic white children (P = 0.02). In boys, most CVD cases were either deutan (51%) or protan (34%); 32% were classified as mild, 15% as moderate, and 41% as severe. Testability for CVD in preschool children is high by 4 years of age. The prevalence of CVD in preschool boys varies by ethnicity, with the highest prevalence in non-Hispanic white and lowest in black children. Copyright © 2014 American Academy of Ophthalmology. Published by Elsevier Inc. All rights reserved.
Taking Bioinformatics to Systems Medicine.
van Kampen, Antoine H C; Moerland, Perry D
2016-01-01
Systems medicine promotes a range of approaches and strategies to study human health and disease at a systems level with the aim of improving the overall well-being of (healthy) individuals, and preventing, diagnosing, or curing disease. In this chapter we discuss how bioinformatics critically contributes to systems medicine. First, we explain the role of bioinformatics in the management and analysis of data. In particular we show the importance of publicly available biological and clinical repositories to support systems medicine studies. Second, we discuss how the integration and analysis of multiple types of omics data through integrative bioinformatics may facilitate the determination of more predictive and robust disease signatures, lead to a better understanding of (patho)physiological molecular mechanisms, and facilitate personalized medicine. Third, we focus on network analysis and discuss how gene networks can be constructed from omics data and how these networks can be decomposed into smaller modules. We discuss how the resulting modules can be used to generate experimentally testable hypotheses, provide insight into disease mechanisms, and lead to predictive models. Throughout, we provide several examples demonstrating how bioinformatics contributes to systems medicine and discuss future challenges in bioinformatics that need to be addressed to enable the advancement of systems medicine.
Theoretical prediction and impact of fundamental electric dipole moments
Ellis, Sebastian A. R.; Kane, Gordon L.
2016-01-13
The predicted Standard Model (SM) electric dipole moments (EDMs) of electrons and quarks are tiny, providing an important window to observe new physics. Theories beyond the SM typically allow relatively large EDMs. The EDMs depend on the relative phases of terms in the effective Lagrangian of the extended theory, which are generally unknown. Underlying theories, such as string/M-theories compactified to four dimensions, could predict the phases and thus EDMs in the resulting supersymmetric (SUSY) theory. Earlier one of us, with collaborators, made such a prediction and found, unexpectedly, that the phases were predicted to be zero at tree level in the theory at the unification or string scale ~O(10^16 GeV). Electroweak (EW) scale EDMs still arise via running from the high scale, and depend only on the SM Yukawa couplings that also give the CKM phase. Here we extend the earlier work by studying the dependence of the low scale EDMs on the constrained but not fully known fundamental Yukawa couplings. The dominant contribution is from two-loop diagrams and is not sensitive to the choice of Yukawa texture. The electron EDM should not be found to be larger than about 5 × 10^-30 e cm, and the neutron EDM should not be larger than about 5 × 10^-29 e cm. These values are quite a bit smaller than the reported predictions from Split SUSY and typical effective theories, but much larger than the Standard Model prediction. Also, since models with random phases typically give much larger EDMs, it is a significant testable prediction of compactified M-theory that the EDMs should not be above these upper limits. The actual EDMs can be below the limits, so once they are measured they could provide new insight into the fundamental Yukawa couplings of leptons and quarks. As a result, we comment also on the role of strong CP violation. EDMs probe fundamental physics near the Planck scale.
Radial mixing and Ru-Mo isotope systematics under different accretion scenarios
NASA Astrophysics Data System (ADS)
Fischer, Rebecca A.; Nimmo, Francis; O'Brien, David P.
2018-01-01
The Ru-Mo isotopic compositions of inner Solar System bodies may reflect the provenance of accreted material and how it evolved with time, both of which are controlled by the accretion scenario these bodies experienced. Here we use a total of 116 N-body simulations of terrestrial planet accretion, run in the Eccentric Jupiter and Saturn (EJS), Circular Jupiter and Saturn (CJS), and Grand Tack scenarios, to model the Ru-Mo anomalies of Earth, Mars, and Theia analogues. This model starts by applying an initial step function in Ru-Mo isotopic composition, with compositions reflecting those in meteorites, and traces compositional evolution as planets accrete. The mass-weighted provenance of the resulting planets reveals more radial mixing in Grand Tack simulations than in EJS/CJS simulations, and more efficient mixing among late-accreted material than during the main phase of accretion in EJS/CJS simulations. We find that an extensive homogeneous inner disk region is required to reproduce Earth's observed Ru-Mo composition. EJS/CJS simulations require a homogeneous reservoir in the inner disk extending to ≥3-4 AU (≥74-98% of initial mass) to reproduce Earth's composition, while Grand Tack simulations require a homogeneous reservoir extending to ≥3-10 AU (≥97-99% of initial mass), and likely to ≥6-10 AU. In the Grand Tack model, Jupiter's initial location (the most likely location for a discontinuity in isotopic composition) is ∼3.5 AU; however, this step location has only a 33% likelihood of producing an Earth with the correct Ru-Mo isotopic signature for the most plausible model conditions. Our results give the testable predictions that Mars has zero Ru anomaly and small or zero Mo anomaly, and the Moon has zero Mo anomaly. These predictions are insensitive to wide variations in parameter choices.
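The mass-weighted provenance calculation described above reduces to simple arithmetic: each accreted parcel carries the anomaly of its formation distance, fixed by an initial step function, and the planet's final anomaly is the mass-weighted average over everything it accretes. A hedged sketch follows; the step location and anomaly values are illustrative placeholders, not the paper's calibrated inputs.

```python
# Minimal sketch of the mass-weighted isotope-mixing idea described above.
# All numbers are illustrative stand-ins, not the paper's inputs.

def initial_anomaly(r_au, r_step=3.5, inner=0.0, outer=1.0):
    """Step-function isotopic anomaly (arbitrary units) vs. formation distance."""
    return inner if r_au < r_step else outer

def planet_anomaly(parcels):
    """Mass-weighted anomaly of a planet built from (mass, origin_au) parcels."""
    total = sum(mass for mass, _ in parcels)
    return sum(mass * initial_anomaly(r) for mass, r in parcels) / total

# An Earth analogue accreting mostly inner-disk material:
earth = [(0.5, 1.0), (0.3, 2.0), (0.15, 3.0), (0.05, 6.0)]
print(planet_anomaly(earth))  # ~0.05: small anomaly, dominated by inner material
```

Under this picture, pushing the step location outward enlarges the homogeneous inner reservoir and drives the predicted anomaly of an inner planet toward zero, which is the logic behind the paper's bounds on the reservoir extent.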
Archaeogeophysical investigations of early Caddo settlement patterning at the Crenshaw site (3MI6)
NASA Astrophysics Data System (ADS)
Samuelsen, John R.
The Teran map, made during Don Domingo Teran de los Rios' expedition for New Spain, shows a Caddo settlement in 1691 with a vacant mound center and many small farmsteads dispersed across the countryside along both banks of the Red River. This map, combined with the 19th-century photographs taken by William Soule, provides a testable model for the settlement pattern of the Caddo people, called the Teran-Soule model. This model holds that the mound centers were not inhabited by large numbers of people beyond a small caretaker population, supporting a vacant mound center hypothesis. Recent studies have begun to challenge this hypothesis, using archaeogeophysical techniques to find structures near Middle to Late Caddo mounds. An archaeogeophysical survey of the Crenshaw site along the Great Bend of the Red River was conducted to determine if structures could be found there. Is the settlement pattern at this early Caddo site, occupied between A.D. 700 and 1400, consistent with the late historic model of a vacant mound center? Is there evidence that both Caddo and Fourche Maline occupations existed in horizontally distinct components? The 3.2-hectare survey identified more than 100 possible structures, of which more than 50 are probably associated with the Fourche Maline or early Caddo occupations of the site. Several structures were found in linear patterns, including an oval arrangement of possible structures measuring 90 × 85 m. While cultural affiliation was not determined for most of these features, some can be attributed to Caddo origin based on architectural attributes, such as extended entranceways. This suggests that Crenshaw was not literally vacant, but the presence of extended entranceways suggests that some of the identified features were special-use structures, which does not conflict with the vacant mound center hypothesis. However, the large number of possible structures with unknown cultural affiliations provides ample opportunities for testing the model.
Domain fusion analysis by applying relational algebra to protein sequence and domain databases.
Truong, Kevin; Ikura, Mitsuhiko
2003-05-06
Domain fusion analysis is a useful method to predict functionally linked proteins that may be involved in direct protein-protein interactions or in the same metabolic or signaling pathway. As separate domain databases like BLOCKS, PROSITE, Pfam, SMART, PRINTS-S, ProDom, TIGRFAMs, and amalgamated domain databases like InterPro continue to grow in size and quality, a computational method to perform domain fusion analysis that leverages these efforts will become increasingly powerful. This paper proposes a computational method employing relational algebra to find domain fusions in protein sequence databases. The feasibility of this method was illustrated on the SWISS-PROT+TrEMBL sequence database using domain predictions from the Pfam HMM (hidden Markov model) database. We identified 235 and 189 putative functionally linked protein partners in H. sapiens and S. cerevisiae, respectively. From the scientific literature, we were able to confirm many of these functional linkages, while the remainder offer testable experimental hypotheses. Results can be viewed at http://calcium.uhnres.utoronto.ca/pi. As the analysis can be computed quickly on any relational database that supports standard SQL (structured query language), it can be dynamically updated along with the sequence and domain databases, thereby improving the quality of predictions over time.
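The relational query at the heart of the method can be paraphrased in a few lines: a protein pair is predicted to be functionally linked when their distinct domains co-occur ("fuse") in some third protein. The Python sketch below mimics that join on a toy protein-to-domain table; the table contents and domain names are hypothetical, and the actual analysis runs as SQL over SWISS-PROT+TrEMBL and Pfam.

```python
from itertools import combinations

# Toy protein -> Pfam-domain table (hypothetical contents).
protein_domains = {
    "fusionAB": {"PF_X", "PF_Y"},  # multi-domain "Rosetta stone" protein
    "protA": {"PF_X"},
    "protB": {"PF_Y"},
    "protC": {"PF_Z"},
}

def domain_fusion_links(table):
    """Return protein pairs whose distinct domains all co-occur in a third protein."""
    links = set()
    for fusion, fused_domains in table.items():
        for p, q in combinations(table, 2):
            if fusion in (p, q):
                continue
            dp, dq = table[p], table[q]
            # p and q must carry different, nonempty domain sets whose union
            # is entirely contained in the candidate fusion protein
            if dp and dq and dp != dq and (dp | dq) <= fused_domains:
                links.add(tuple(sorted((p, q))))
    return links

print(domain_fusion_links(protein_domains))  # {('protA', 'protB')}
```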
A genome-wide longitudinal transcriptome analysis of the aging model Podospora anserina.
Philipp, Oliver; Hamann, Andrea; Servos, Jörg; Werner, Alexandra; Koch, Ina; Osiewacz, Heinz D
2013-01-01
Aging of biological systems is controlled by various processes which have a potential impact on gene expression. Here we report a genome-wide transcriptome analysis of the fungal aging model Podospora anserina. Total RNA from three individuals of defined age was pooled and analyzed by SuperSAGE (serial analysis of gene expression). A bioinformatics analysis identified different molecular pathways affected during aging. While the abundance of transcripts linked to ribosomes and to the proteasome quality-control system was found to decrease during aging, the abundance of those associated with autophagy increased, suggesting that autophagy may act as a compensatory quality-control pathway. Transcript profiles associated with the energy metabolism, including mitochondrial functions, were found to fluctuate during aging. Comparison of wild-type transcripts that are continuously down-regulated during aging with those down-regulated in the long-lived, copper-uptake mutant grisea validated the relevance of age-related changes in cellular copper metabolism. Overall, we (i) present a unique age-related data set from a longitudinal study of the experimental aging model P. anserina, which represents a reference resource for future investigations in a variety of organisms, (ii) suggest autophagy to be a key quality-control pathway that becomes active once other pathways fail, and (iii) present testable predictions for subsequent experimental investigations.
What is wrong with intelligent design?
Sober, Elliott
2007-03-01
This article reviews two standard criticisms of creationism/intelligent design (ID): it is unfalsifiable, and it is refuted by the many imperfect adaptations found in nature. Problems with both criticisms are discussed. A conception of testability is described that avoids the defects in Karl Popper's falsifiability criterion. Although ID comes in multiple forms, which call for different criticisms, it emerges that ID fails to constitute a serious alternative to evolutionary theory.
Integrating principles and multidisciplinary projects in design education
NASA Technical Reports Server (NTRS)
Nevill, Gale E., Jr.
1992-01-01
The critical need to improve engineering design education in the U.S. is presented and a number of actions to achieve that end are discussed. The importance of teaching undergraduates the latest methods and principles through the means of team design in multidisciplinary projects leading to a testable product is emphasized. Desirable training for design instructors is described and techniques for selecting and managing projects that teach effectively are discussed.
Report on phase 1 of the Microprocessor Seminar [and associated large scale integration]
NASA Technical Reports Server (NTRS)
1977-01-01
Proceedings of a seminar on microprocessors and associated large scale integrated (LSI) circuits are presented. The potential for commonality of device requirements, candidate processes and mechanisms for qualifying candidate LSI technologies for high reliability applications, and specifications for testing and testability were among the topics discussed. Various programs and tentative plans of the participating organizations in the development of high reliability LSI circuits are given.
It takes two to talk: a second-person neuroscience approach to language learning.
Syal, Supriya; Anderson, Adam K
2013-08-01
Language is a social act. We have previously argued that language remains embedded in sociality because the motivation to communicate exists only within a social context. Schilbach et al. underscore the importance of studying linguistic behavior from within the motivated, socially interactive frame in which it is learnt and used, as well as provide testable hypotheses for a participatory, second-person neuroscience approach to language learning.
Are some BL Lac objects artefacts of gravitational lensing?
NASA Technical Reports Server (NTRS)
Ostriker, J. P.; Vietri, M.
1985-01-01
It is proposed here that a significant fraction of BL Lac objects are optically violently variable quasars whose continuum emission has been greatly amplified, relative to the line emission, by pointlike gravitational lenses in intervening galaxies. Several anomalous physical and statistical properties of BL Lacs can be understood on the basis of this model, which is immediately testable on the basis of absorption line studies and by direct imaging.
Empirical approaches to the study of language evolution.
Fitch, W Tecumseh
2017-02-01
The study of language evolution, and human cognitive evolution more generally, has often been ridiculed as unscientific, but in fact it differs little from many other disciplines that investigate past events, such as geology or cosmology. Well-crafted models of language evolution make numerous testable hypotheses, and if the principles of strong inference (simultaneous testing of multiple plausible hypotheses) are adopted, there is an increasing amount of relevant data allowing empirical evaluation of such models. The articles in this special issue provide a concise overview of current models of language evolution, emphasizing the testable predictions that they make, along with overviews of the many sources of data available to test them (emphasizing comparative, neural, and genetic data). The key challenge facing the study of language evolution is not a lack of data, but rather a weak commitment to hypothesis-testing approaches and strong inference, exacerbated by the broad and highly interdisciplinary nature of the relevant data. This introduction offers an overview of the field, and a summary of what needed to evolve to provide our species with language-ready brains. It then briefly discusses different contemporary models of language evolution, followed by an overview of different sources of data to test these models. I conclude with my own multistage model of how different components of language could have evolved.
Moses Lake Fishery Restoration Project : FY 1999 Annual Report.
DOE Office of Scientific and Technical Information (OSTI.GOV)
None given
2000-12-01
The Moses Lake Project consists of 3 phases. Phase 1 is the assessment of all currently available physical and biological information, the collection of baseline biological data, the formulation of testable hypotheses, and the development of a detailed study plan to test the hypotheses. Phase 2 is dedicated to the implementation of the study plan, including data collection, hypothesis testing, and the formulation of a management plan. Phase 3 of the project is the implementation of the management plan and the monitoring and evaluation of the implemented recommendations. The project intends to restore the failed recreational fishery for panfish species (black crappie, bluegill and yellow perch) in Moses Lake as off-site mitigation for lost recreational fishing opportunities for anadromous species in the upper Columbia River. This report summarizes the results of Phase 1 investigations and presents the study plan directed at initiating Phase 2 of the project. Phase 1 of the project culminates with the formulation of testable hypotheses directed at investigating possible limiting factors to the production of panfish in Moses Lake. The limiting factors to be investigated will include water quality, habitat quantity and quality, food limitations, competition, recruitment, predation, over-harvest, environmental requirements, and the physical and chemical limitations of the system in relation to the fishes.
Phenoscape: Identifying Candidate Genes for Evolutionary Phenotypes
Edmunds, Richard C.; Su, Baofeng; Balhoff, James P.; Eames, B. Frank; Dahdul, Wasila M.; Lapp, Hilmar; Lundberg, John G.; Vision, Todd J.; Dunham, Rex A.; Mabee, Paula M.; Westerfield, Monte
2016-01-01
Phenotypes resulting from mutations in genetic model organisms can help reveal candidate genes for evolutionarily important phenotypic changes in related taxa. Although testing candidate gene hypotheses experimentally in nonmodel organisms is typically difficult, ontology-driven information systems can help generate testable hypotheses about developmental processes in experimentally tractable organisms. Here, we tested candidate gene hypotheses suggested by expert use of the Phenoscape Knowledgebase, specifically looking for genes that are candidates responsible for evolutionarily interesting phenotypes in the ostariophysan fishes that bear resemblance to mutant phenotypes in zebrafish. For this, we searched ZFIN for genetic perturbations that result in either loss of basihyal element or loss of scales phenotypes, because these are the ancestral phenotypes observed in catfishes (Siluriformes). We tested the identified candidate genes by examining their endogenous expression patterns in the channel catfish, Ictalurus punctatus. The experimental results were consistent with the hypotheses that these features evolved through disruption in developmental pathways at, or upstream of, brpf1 and eda/edar for the ancestral losses of basihyal element and scales, respectively. These results demonstrate that ontological annotations of the phenotypic effects of genetic alterations in model organisms, when aggregated within a knowledgebase, can be used effectively to generate testable, and useful, hypotheses about evolutionary changes in morphology. PMID:26500251
Feldstein Ewing, Sarah W.; Filbey, Francesca M.; Hendershot, Christian S.; McEachern, Amber D.; Hutchison, Kent E.
2011-01-01
Objective: Despite the prevalence and profound consequences of alcohol use disorders, psychosocial alcohol interventions have widely varying outcomes. The range of behavior following psychosocial alcohol treatment indicates the need to gain a better understanding of active ingredients and how they may operate. Although this is an area of great interest, at this time there is a limited understanding of how in-session behaviors may catalyze changes in the brain and subsequent alcohol use behavior. Thus, in this review, we aim to identify the neurobiological routes through which psychosocial alcohol interventions may lead to post-session behavior change as well as offer an approach to conceptualize and evaluate these translational relationships. Method: PubMed and PsycINFO searches identified studies that successfully integrated functional magnetic resonance imaging and psychosocial interventions. Results: Based on this research, we identified potential neurobiological substrates through which behavioral alcohol interventions may initiate and sustain behavior change. In addition, we proposed a testable model linking within-session active ingredients to outside-of-session behavior change. Conclusions: Through this review, we present a testable translational model. Additionally, we illustrate how the proposed model can help facilitate empirical evaluations of psychotherapeutic factors and their underlying neural mechanisms, both in the context of motivational interviewing and in the treatment of alcohol use disorders. PMID:22051204
A bioinformatics expert system linking functional data to anatomical outcomes in limb regeneration
Lobo, Daniel; Feldman, Erica B.; Shah, Michelle; Malone, Taylor J.
2014-01-01
Amphibians and molting arthropods have the remarkable capacity to regenerate amputated limbs, as described by an extensive literature of experimental cuts, amputations, grafts, and molecular techniques. Despite a rich history of experimental effort, no comprehensive mechanistic model exists that can account for the pattern regulation observed in these experiments. While bioinformatics algorithms have revolutionized the study of signaling pathways, no such tools have heretofore been available to assist scientists in formulating testable models of large‐scale morphogenesis that match published data in the limb regeneration field. Major barriers preventing an algorithmic approach are the lack of formal descriptions for experimental regenerative information and of a repository to centralize storage and mining of functional data on limb regeneration. Establishing a new bioinformatics of shape would significantly accelerate the discovery of key insights into the mechanisms that implement complex regeneration. Here, we describe a novel mathematical ontology for limb regeneration to unambiguously encode phenotype, manipulation, and experiment data. Based on this formalism, we present the first centralized formal database of published limb regeneration experiments together with a user‐friendly expert system tool to facilitate its access and mining. These resources are freely available for the community and will assist both human biologists and artificial intelligence systems to discover testable, mechanistic models of limb regeneration. PMID:25729585
NASA Space Flight Vehicle Fault Isolation Challenges
NASA Technical Reports Server (NTRS)
Bramon, Christopher; Inman, Sharon K.; Neeley, James R.; Jones, James V.; Tuttle, Loraine
2016-01-01
The Space Launch System (SLS) is the new NASA heavy lift launch vehicle and is scheduled for its first mission in 2017. The goal of the first mission, which will be uncrewed, is to demonstrate the integrated system performance of the SLS rocket and spacecraft before a crewed flight in 2021. SLS has many of the same logistics challenges as any other large scale program. Common logistics concerns for SLS include integration of discrete programs geographically separated, multiple prime contractors with distinct and different goals, schedule pressures and funding constraints. However, SLS also faces unique challenges. The new program is a confluence of new hardware and heritage, with heritage hardware constituting seventy-five percent of the program. This unique approach to design makes logistics concerns such as testability of the integrated flight vehicle especially problematic. The cost of fully automated diagnostics can be completely justified for a large fleet, but not so for a single flight vehicle. Fault detection is mandatory to assure the vehicle is capable of a safe launch, but fault isolation is another issue. SLS has considered various methods for fault isolation which can provide a reasonable balance between adequacy, timeliness and cost. This paper will address the analyses and decisions the NASA Logistics engineers are making to mitigate risk while providing a reasonable testability solution for fault isolation.
Testability and epistemic shifts in modern cosmology
NASA Astrophysics Data System (ADS)
Kragh, Helge
2014-05-01
During the last decade new developments in theoretical and speculative cosmology have reopened the old discussion of cosmology's scientific status and the more general question of the demarcation between science and non-science. The multiverse hypothesis, in particular, is central to this discussion and controversial because it seems to disagree with methodological and epistemic standards traditionally accepted in the physical sciences. But what are these standards and how sacrosanct are they? Does anthropic multiverse cosmology rest on evaluation criteria that conflict with and go beyond those ordinarily accepted, so that it constitutes an "epistemic shift" in fundamental physics? The paper offers a brief characterization of the modern multiverse and also refers to a few earlier attempts to introduce epistemic shifts in the science of the universe. It further discusses the several meanings of testability, addresses the question of falsifiability as a sine qua non for a theory being scientific, and briefly compares the situation in cosmology with the one in systematic biology. Multiverse theory is not generally falsifiable, which has led some physicists to propose overruling not only Popperian standards but also other evaluation criteria of a philosophical nature. However, this is hardly possible, nor is it possible to get rid of explicit philosophical considerations in some other aspects of cosmological research, however advanced it becomes.
Inter-Universal Quantum Entanglement
NASA Astrophysics Data System (ADS)
Robles-Pérez, S. J.; González-Díaz, P. F.
2015-01-01
The boundary conditions to be imposed on the quantum state of the whole multiverse could be such that the universes would be created in entangled pairs. Then, interuniversal entanglement would provide us with a vacuum energy for each single universe that might be fitted with observational data, making testable not only the multiverse proposal but also the boundary conditions of the multiverse. Furthermore, the second law of the entanglement thermodynamics would enhance the expansion of the single universes.
Creation of a Mouse with Stress-Induced Dystonia: Control of an ATPase Chaperone
2013-04-01
was successful, and a mouse with the desired dystonic symptoms was obtained. It has two mutations, one a dominantly inherited gene with 100... the hallmark of dystonia. Subject terms: dystonia, genetically modified mice, stress, gene mutations, animal model of disease. ...there are a variety of hypotheses that should be testable if there were a realistic animal model. Mice with mutations in genes known to cause dystonia
Effectiveness of spacecraft testing programs
NASA Technical Reports Server (NTRS)
Krausz, A.
1980-01-01
The need for testing under simulated mission operational conditions is discussed and the results of such tests are reviewed from the point of view of the user. A brief overview of the usual test sequences for high-reliability, long-life spacecraft is presented and the effectiveness of the testing program is analyzed in terms of the defects which are discovered by such tests. The need for automation, innovative mechanical test procedures, and design for testability is discussed.
Xylella genomics and bacterial pathogenicity to plants.
Dow, J M; Daniels, M J
2000-12-01
Xylella fastidiosa, a pathogen of citrus, is the first plant pathogenic bacterium for which the complete genome sequence has been published. Inspection of the sequence reveals high relatedness to many genes of other pathogens, notably Xanthomonas campestris. Based on this, we suggest that Xylella possesses certain easily testable properties that contribute to pathogenicity. We also present some general considerations for deriving information on pathogenicity from bacterial genomics. Copyright 2000 John Wiley & Sons, Ltd.
An evolutionary scenario for the origin of flowers.
Frohlich, Michael W
2003-07-01
The Mostly Male theory is the first to use evidence from gene phylogenies, genetics, modern plant morphology and fossils to explain the evolutionary origin of flowers. It proposes that flower organization derives more from the male structures of ancestral gymnosperms than from female structures. The theory arose from a hypothesis-based study. Such studies are the most likely to generate testable evolutionary scenarios, which should be the ultimate goal of evo-devo.
A collider observable QCD axion
Dimopoulos, Savas; Hook, Anson; Huang, Junwu; ...
2016-11-09
Here, we present a model where the QCD axion is at the TeV scale and visible at a collider via its decays. Conformal dynamics and strong CP considerations account for the axion coupling strongly enough to the standard model to be produced as well as the coincidence between the weak scale and the axion mass. The model predicts additional pseudoscalar color octets whose properties are completely determined by the axion properties rendering the theory testable.
Soviet Economic Policy Towards Eastern Europe
1988-11-01
high. Without specifying the determinants of Soviet demand for "allegiance" in more detail, the model is not testable; we cannot predict how subsidy... trade inside (Czechoslovakia, Bulgaria). These countries are behaving as predicted by the model. If this hypothesis is true, the pattern of subsidies... also compares the sum of per capita subsidies by country between 1970 and 1982 with the sum of subsidies predicted by the model. Because of the poor
All biology is computational biology.
Markowetz, Florian
2017-03-01
Here, I argue that computational thinking and techniques are so central to the quest of understanding life that today all biology is computational biology. Computational biology brings order into our understanding of life, it makes biological concepts rigorous and testable, and it provides a reference map that holds together individual insights. The next modern synthesis in biology will be driven by mathematical, statistical, and computational methods being absorbed into mainstream biological training, turning biology into a quantitative science.
Spatio-Chromatic Adaptation via Higher-Order Canonical Correlation Analysis of Natural Images
Gutmann, Michael U.; Laparra, Valero; Hyvärinen, Aapo; Malo, Jesús
2014-01-01
Independent component and canonical correlation analysis are two general-purpose statistical methods with wide applicability. In neuroscience, independent component analysis of chromatic natural images explains the spatio-chromatic structure of primary cortical receptive fields in terms of properties of the visual environment. Canonical correlation analysis explains similarly chromatic adaptation to different illuminations. But, as we show in this paper, neither of the two methods generalizes well to explain both spatio-chromatic processing and adaptation at the same time. We propose a statistical method which combines the desirable properties of independent component and canonical correlation analysis: It finds independent components in each data set which, across the two data sets, are related to each other via linear or higher-order correlations. The new method is as widely applicable as canonical correlation analysis, and also to more than two data sets. We call it higher-order canonical correlation analysis. When applied to chromatic natural images, we found that it provides a single (unified) statistical framework which accounts for both spatio-chromatic processing and adaptation. Filters with spatio-chromatic tuning properties as in the primary visual cortex emerged and corresponding-colors psychophysics was reproduced reasonably well. We used the new method to make a theory-driven testable prediction on how the neural response to colored patterns should change when the illumination changes. We predict shifts in the responses which are comparable to the shifts reported for chromatic contrast habituation. PMID:24533049
NASA Astrophysics Data System (ADS)
Totani, Tomonori
2017-10-01
In standard general relativity the universe cannot be started with arbitrary initial conditions, because four of the ten components of Einstein's field equations (EFE) are constraints on initial conditions. In previous work it was proposed to extend the gravity theory to allow free initial conditions, with a motivation to solve the cosmological constant problem. This was done by setting four constraints on metric variations in the action principle, which is reasonable because gravity's physical degrees of freedom are at most six. However, there are two problems with this theory: the three constraints in addition to the unimodular condition were introduced without clear physical meanings, and the flat Minkowski spacetime is unstable against perturbations. Here a new set of gravitational field equations is derived by replacing the three constraints with new ones requiring that geodesic paths remain geodesic against metric variations. The instability problem is then naturally solved. Implications for the cosmological constant Λ are unchanged; the theory converges to EFE with nonzero Λ through inflation, but Λ varies on scales much larger than the present Hubble horizon. Galaxies then form only in small-Λ regions, and the cosmological constant problem is solved by the anthropic argument. Because of the increased degrees of freedom in metric dynamics, the theory predicts new non-oscillatory modes of metric anisotropy generated by quantum fluctuations during inflation, and CMB B-mode polarization would be observed to differ from the standard predictions of general relativity.
An, Bo; Abbonante, Vittorio; Xu, Huifang; Gavriilidou, Despoina; Yoshizumi, Ayumi; Bihan, Dominique; Farndale, Richard W.; Kaplan, David L.; Balduini, Alessandra; Leitinger, Birgit; Brodsky, Barbara
2016-01-01
A bacterial collagen-like protein Scl2 has been developed as a recombinant collagen model system to host human collagen ligand-binding sequences, with the goal of generating biomaterials with selective collagen bioactivities. Defined binding sites in human collagen for integrins, fibronectin, heparin, and MMP-1 have been introduced into the triple-helical domain of the bacterial collagen and led to the expected biological activities. The modular insertion of activities is extended here to the discoidin domain receptors (DDRs), which are collagen-activated receptor tyrosine kinases. Insertion of the DDR-binding sequence from human collagen III into bacterial collagen led to specific receptor binding. However, even at the highest testable concentrations, the construct was unable to stimulate DDR autophosphorylation. The recombinant collagen expressed in Escherichia coli does not contain hydroxyproline (Hyp), and complementary synthetic peptide studies showed that replacement of Hyp by Pro at the critical Gly-Val-Met-Gly-Phe-Hyp position decreased the DDR-binding affinity and consequently required a higher concentration for the induction of receptor activation. The ability of the recombinant bacterial collagen to bind the DDRs without inducing kinase activation suggested it could interfere with the interactions between animal collagen and the DDRs, and such an inhibitory role was confirmed in vitro and with a cell migration assay. This study illustrates that recombinant collagen can complement synthetic peptides in investigating structure-activity relationships, and this system has the potential for the introduction or inhibition of specific biological activities. PMID:26702058
Robertson, Danielle M.
2012-01-01
Previous studies using animal models and human clinical trials have demonstrated that the use of low oxygen transmissible contact lens materials produce corneal epithelial surface damage resulting in increased Pseudomonas aeruginosa (PA) adhesion and raft-mediated internalization into surface corneal epithelial cells. These findings led to the testable clinical predictions that: (1) microbial keratitis (MK) risk is expected to be greatest during the first 6 months of wear; (2) there is no difference between 6 and 30 night extended wear; and (3) that wear of hyper-oxygen transmissible lenses would reduce the reported incidence of infection. Subsequent epidemiological studies have confirmed the first two predictions; however, increased oxygen transmissibility with silicone hydrogel (SiHy) lens wear has not altered the overall incidence of MK. In this review, more recent clinical and basic studies that investigate epithelial alterations and bacterial adhesion to corneal epithelial cells following wear of SiHy lenses with and without concomitant exposure to chemically preserved multipurpose solutions (MPS) will be examined. The collective results of these studies demonstrate that even in the absence of lens-related hypoxia, MPS induce ocular surface changes during SiHy lens wear which are associated with a pathophysiological increase in PA adherence and internalization in the corneal epithelium, and therefore, predict an increased risk for PA-MK. In addition, new data supporting an interactive role for inflammation in facilitating PA adherence and internalization in the corneal epithelium will also be discussed. PMID:23266590
Extending the Lincoln-Petersen estimator for multiple identifications in one source.
Köse, T; Orman, M; Ikiz, F; Baksh, M F; Gallagher, J; Böhning, D
2014-10-30
The Lincoln-Petersen estimator is one of the most popular estimators used in capture-recapture studies. It was developed for a sampling situation in which two sources independently identify members of a target population. For each of the two sources, it is determined if a unit of the target population is identified or not. This leads to a 2 × 2 table with frequencies f11, f10, f01, f00 indicating the number of units identified by both sources, by the first but not the second source, by the second but not the first source, and by neither of the two sources, respectively. However, f00 is unobserved, so the 2 × 2 table is incomplete and the Lincoln-Petersen estimator provides an estimate for f00. In this paper, we consider a generalization of this situation in which one source provides not only a binary identification outcome but also a count outcome of how many times a unit has been identified. Using a truncated Poisson count model, truncating multiple identifications larger than two, we propose a maximum likelihood estimator of the Poisson parameter and, ultimately, of the population size. This estimator shows benefits, in comparison with Lincoln-Petersen's, in terms of bias and efficiency. It is also possible to test the homogeneity assumption that is not testable in the Lincoln-Petersen framework. The approach is applied to surveillance data on syphilis from Izmir, Turkey. Copyright © 2014 John Wiley & Sons, Ltd.
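For orientation, the classical estimator the paper generalizes can be written in a few lines: under independence of the two sources, the unobserved cell is estimated as f00 = f10 × f01 / f11, giving the population size estimate N = f11 + f10 + f01 + f00. The sketch below implements only this classical case with made-up counts; the paper's truncated-Poisson extension is not reproduced.

```python
# Classical Lincoln-Petersen estimate from the observed cells of the
# incomplete 2x2 identification table. Counts below are made up.

def lincoln_petersen(f11, f10, f01):
    """Estimate total population size, filling in f00 via independence."""
    if f11 == 0:
        raise ValueError("f11 must be positive for the classical estimator")
    f00_hat = f10 * f01 / f11
    return f11 + f10 + f01 + f00_hat

print(lincoln_petersen(f11=40, f10=25, f01=35))  # 121.875
```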
CrossCheck: an open-source web tool for high-throughput screen data analysis.
Najafov, Jamil; Najafov, Ayaz
2017-07-19
Modern high-throughput screening methods allow researchers to generate large datasets that potentially contain important biological information. However, oftentimes, picking relevant hits from such screens and generating testable hypotheses requires training in bioinformatics and the skills to efficiently perform database mining. There are currently no tools available to general public that allow users to cross-reference their screen datasets with published screen datasets. To this end, we developed CrossCheck, an online platform for high-throughput screen data analysis. CrossCheck is a centralized database that allows effortless comparison of the user-entered list of gene symbols with 16,231 published datasets. These datasets include published data from genome-wide RNAi and CRISPR screens, interactome proteomics and phosphoproteomics screens, cancer mutation databases, low-throughput studies of major cell signaling mediators, such as kinases, E3 ubiquitin ligases and phosphatases, and gene ontological information. Moreover, CrossCheck includes a novel database of predicted protein kinase substrates, which was developed using proteome-wide consensus motif searches. CrossCheck dramatically simplifies high-throughput screen data analysis and enables researchers to dig deep into the published literature and streamline data-driven hypothesis generation. CrossCheck is freely accessible as a web-based application at http://proteinguru.com/crosscheck.
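At its core, the cross-referencing step amounts to intersecting the user's gene list with each stored dataset and ranking datasets by overlap. The sketch below illustrates that operation with hypothetical dataset names and gene symbols; it is not CrossCheck's actual code or curated data.

```python
# Sketch of the core cross-referencing operation: intersect a user's hit
# list with each published dataset and rank by overlap size. Dataset names
# and gene symbols are hypothetical stand-ins.

published = {
    "RNAi_screen_2015": {"TP53", "RIPK1", "CASP8"},
    "CRISPR_screen_2016": {"RIPK1", "MLKL", "TNFRSF1A"},
    "kinase_substrates": {"AKT1", "RIPK1", "GSK3B"},
}

def cross_check(user_genes, datasets):
    """Return (dataset, shared_genes) pairs sorted by overlap size, largest first."""
    overlaps = [(name, user_genes & genes) for name, genes in datasets.items()]
    return sorted([(n, g) for n, g in overlaps if g], key=lambda item: -len(item[1]))

print(cross_check({"RIPK1", "MLKL", "BRCA1"}, published))
# [('CRISPR_screen_2016', {'RIPK1', 'MLKL'}), ...]
```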
Technical Note: The Initial Stages of Statistical Data Analysis
Tandy, Richard D.
1998-01-01
Objective: To provide an overview of several important data-related considerations in the design stage of a research project and to review the levels of measurement and their relationship to the statistical technique chosen for the data analysis. Background: When planning a study, the researcher must clearly define the research problem and narrow it down to specific, testable questions. The next steps are to identify the variables in the study, decide how to group and treat subjects, and determine how to measure, and the underlying level of measurement of, the dependent variables. Then the appropriate statistical technique can be selected for data analysis. Description: The four levels of measurement in increasing complexity are nominal, ordinal, interval, and ratio. Nominal data are categorical or “count” data, and the numbers are treated as labels. Ordinal data can be ranked in a meaningful order by magnitude. Interval data possess the characteristics of ordinal data and also have equal distances between levels. Ratio data have a natural zero point. Nominal and ordinal data are analyzed with nonparametric statistical techniques and interval and ratio data with parametric statistical techniques. Advantages: Understanding the four levels of measurement and when it is appropriate to use each is important in determining which statistical technique to use when analyzing data. PMID:16558489
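As a compact illustration of the decision rule in this note, the snippet below maps a dependent variable's measurement level to the corresponding family of techniques. It is a deliberately simplified sketch: real test selection also depends on study design and distributional assumptions, which this lookup ignores.

```python
# Toy lookup encoding the note's rule of thumb: nominal and ordinal data
# call for nonparametric techniques; interval and ratio data for parametric
# ones. Design and assumption checks are out of scope here.

TEST_FAMILY = {
    "nominal": "nonparametric (e.g., chi-square)",
    "ordinal": "nonparametric (e.g., Mann-Whitney U)",
    "interval": "parametric (e.g., t-test, ANOVA)",
    "ratio": "parametric (e.g., t-test, ANOVA)",
}

def suggest_family(level):
    """Map a dependent variable's measurement level to a statistical family."""
    return TEST_FAMILY[level.lower()]

print(suggest_family("ordinal"))  # nonparametric (e.g., Mann-Whitney U)
print(suggest_family("ratio"))    # parametric (e.g., t-test, ANOVA)
```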
Modular modelling with Physiome standards
Nickerson, David P.; Nielsen, Poul M. F.; Hunter, Peter J.
2016-01-01
Key points: The complexity of computational models is increasing, supported by research in modelling tools and frameworks. But relatively little thought has gone into design principles for complex models. We propose a set of design principles for complex model construction with the Physiome standard modelling protocol CellML. By following the principles, models are generated that are extensible and are themselves suitable for reuse in larger models of increasing complexity. We illustrate these principles with examples including an architectural prototype linking, for the first time, electrophysiology, thermodynamically compliant metabolism, signal transduction, gene regulation and synthetic biology. The design principles complement other Physiome research projects, facilitating the application of virtual experiment protocols and model analysis techniques to assist the modelling community in creating libraries of composable, characterised and simulatable quantitative descriptions of physiology. Abstract: The ability to produce and customise complex computational models has great potential to have a positive impact on human health. As the field develops towards whole‐cell models and linking such models in multi‐scale frameworks to encompass tissue, organ, or organism levels, reuse of previous modelling efforts will become increasingly necessary. Any modelling group wishing to reuse existing computational models as modules for their own work faces many challenges in the context of construction, storage, retrieval, documentation and analysis of such modules. Physiome standards, frameworks and tools seek to address several of these challenges, especially for models expressed in the modular protocol CellML. Aside from providing a general ability to produce modules, there has been relatively little research work on architectural principles of CellML models that will enable reuse at larger scales. To complement and support the existing tools and frameworks, we develop a set of principles to address this consideration. The principles are illustrated with examples that couple electrophysiology, signalling, metabolism, gene regulation and synthetic biology, together forming an architectural prototype for whole‐cell modelling (including human intervention) in CellML. Such models illustrate how testable units of quantitative biophysical simulation can be constructed. Finally, future relationships between modular models so constructed and Physiome frameworks and tools are discussed, with particular reference to how such frameworks and tools can in turn be extended to complement and gain more benefit from the results of applying the principles. PMID:27353233
Sadeghi Ghuchani, Mostafa
2018-02-08
This comment argues against the view that cancer cells produce less entropy than normal cells as stated in a recent paper by Marín and Sabater. The basic principle of estimation of entropy production rate in a living cell is discussed, emphasizing the fact that entropy production depends on both the amount of heat exchange during the metabolism and the entropy difference between products and substrates.
Testing Nonassociative Quantum Mechanics.
Bojowald, Martin; Brahma, Suddhasattwa; Büyükçam, Umut
2015-11-27
The familiar concepts of state vectors and operators in quantum mechanics rely on associative products of observables. However, these notions do not apply to some exotic systems such as magnetic monopoles, which have long been known to lead to nonassociative algebras. Their quantum physics has remained obscure. This Letter presents the first derivation of potentially testable physical results in nonassociative quantum mechanics, based on effective potentials. They imply new effects which cannot be mimicked in usual quantum mechanics with standard magnetic fields.
Almost periodic cellular neural networks with neutral-type proportional delays
NASA Astrophysics Data System (ADS)
Xiao, Songlin
2018-03-01
This paper presents a new result on the existence, uniqueness and generalised exponential stability of almost periodic solutions for cellular neural networks with neutral-type proportional delays and D operator. Based on some novel differential inequality techniques, a testable condition is derived to ensure that all the state trajectories of the system converge to an almost periodic solution with a positive exponential convergence rate. The effectiveness of the obtained result is illustrated by a numerical example.
Earthquake Forecasting System in Italy
NASA Astrophysics Data System (ADS)
Falcone, G.; Marzocchi, W.; Murru, M.; Taroni, M.; Faenza, L.
2017-12-01
In Italy, after the 2009 L'Aquila earthquake, a procedure was developed for gathering and disseminating authoritative information about the time dependence of seismic hazard to help communities prepare for a potentially destructive earthquake. The most striking time dependency of the earthquake occurrence process is time clustering, which is particularly pronounced in time windows of days and weeks. The Operational Earthquake Forecasting (OEF) system developed at the Seismic Hazard Center (Centro di Pericolosità Sismica, CPS) of the Istituto Nazionale di Geofisica e Vulcanologia (INGV) is the authoritative source of seismic hazard information for Italian Civil Protection. The philosophy of the system rests on a few basic concepts: transparency, reproducibility, and testability. In particular, the transparent, reproducible, and testable earthquake forecasting system developed at CPS is based on ensemble modeling and on a rigorous testing phase. This phase is carried out according to the guidance proposed by the Collaboratory for the Study of Earthquake Predictability (CSEP), an international infrastructure aimed at quantitatively evaluating earthquake prediction and forecast models through purely prospective and reproducible experiments. In the OEF system, the two most popular short-term models were used: the Epidemic-Type Aftershock Sequences (ETAS) model and the Short-Term Earthquake Probabilities (STEP) model. Here, we report results from OEF's 24-hour earthquake forecasting during the main phases of the 2016-2017 sequence that occurred in the Central Apennines (Italy).
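For context, the ETAS model named above expresses the expected earthquake rate as a background term plus Omori-Utsu aftershock contributions from all past events: lambda(t) = mu + sum over t_i < t of K exp(alpha (M_i - M_c)) / (t - t_i + c)^p. The sketch below computes this standard conditional intensity with illustrative parameter values and a toy catalogue; it is not the calibrated INGV implementation.

```python
import math

# Minimal sketch of the ETAS conditional intensity: a background rate mu
# plus Omori-Utsu aftershock contributions from every past event. Parameter
# values and the catalogue are illustrative only.

def etas_intensity(t, catalogue, mu=0.2, K=0.05, alpha=1.5, c=0.01, p=1.1, m_cut=3.0):
    """Expected events per day at time t (days), given past (time, magnitude) pairs."""
    rate = mu
    for t_i, m_i in catalogue:
        if t_i < t:
            rate += K * math.exp(alpha * (m_i - m_cut)) / (t - t_i + c) ** p
    return rate

catalogue = [(0.0, 6.0), (0.3, 4.2), (1.5, 5.1)]  # (days, magnitude)
for day in (0.5, 2.0, 10.0):
    print(day, round(etas_intensity(day, catalogue), 3))
```

Integrating this intensity over a 24-hour window is what turns the model into the kind of daily forecast the OEF system disseminates.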
Broadening conceptions of learning in medical education: the message from teamworking.
Bleakley, Alan
2006-02-01
There is a mismatch between the broad range of learning theories offered in the wider education literature and a relatively narrow range of theories privileged in the medical education literature. The latter are usually described under the heading of 'adult learning theory'. This paper critically addresses the limitations of the current dominant learning theories informing medical education. An argument is made that such theories, which address how an individual learns, fail to explain how learning occurs in dynamic, complex and unstable systems such as fluid clinical teams. Models of learning that take into account distributed knowing, learning through time as well as space, and the complexity of a learning environment including relationships between persons and artefacts, are more powerful in explaining and predicting how learning occurs in clinical teams. Learning theories may be privileged for ideological reasons, such as medicine's concern with autonomy. Where an increasing amount of medical education occurs in workplace contexts, sociocultural learning theories offer a best-fit exploration and explanation of such learning. We need to continue to develop testable models of learning that inform safe work practice. One type of learning theory will not inform all practice contexts and we need to think about a range of fit-for-purpose theories that are testable in practice. Exciting current developments include dynamicist models of learning drawing on complexity theory.
A Handheld Open-Field Infant Keratometer (An American Ophthalmological Society Thesis)
Miller, Joseph M.
2010-01-01
Purpose: To design and evaluate a new infant keratometer that incorporates an unobstructed view of the infant with both eyes (open-field design). Methods: The design of the open-field infant keratometer is presented, and details of its construction are given. The design incorporates a single-ring keratoscope for measurement of corneal astigmatism over a 4-mm region of the cornea and includes a rectangular grid target concentric within the ring to allow for the study of higher-order aberrations of the eye. In order to calibrate the lens and imaging system, a novel telecentric test object was constructed and used. The system was bench calibrated against steel ball bearings of known dimensions and evaluated for accuracy while being used in handheld mode in a group of 16 adult cooperative subjects. It was then evaluated for testability in a group of 10 infants and toddlers. Results: Results indicate that while the device achieved the goal of creating an open-field instrument containing a single-ring keratoscope with a concentric grid array for the study of higher-order aberrations, additional work is required to establish better control of the vertex distance. Conclusion: The handheld open-field infant keratometer demonstrates testability suitable for the study of infant corneal astigmatism. Use of collimated light sources in future iterations of the design must be incorporated in order to achieve the accuracy required for clinical investigation. PMID:21212850
Cowden, Tracy L; Cummings, Greta G
2012-07-01
We describe a theoretical model of staff nurses' intentions to stay in their current positions. The global nursing shortage and high nursing turnover rate demand evidence-based retention strategies. Inconsistent study outcomes indicate a need for testable theoretical models of intent to stay that build on previously published models, are reflective of current empirical research and identify causal relationships between model concepts. Two systematic reviews of electronic databases of English language published articles between 1985-2011. This complex, testable model expands on previous models and includes nurses' affective and cognitive responses to work and their effects on nurses' intent to stay. The concepts of desire to stay, job satisfaction, joy at work, and moral distress are included in the model to capture the emotional response of nurses to their work environments. The influence of leadership is integrated within the model. A causal understanding of clinical nurses' intent to stay and the effects of leadership on the development of that intention will facilitate the development of effective retention strategies internationally. Testing theoretical models is necessary to confirm previous research outcomes and to identify plausible sequences of the development of behavioral intentions. Increased understanding of the causal influences on nurses' intent to stay should lead to strategies that may result in higher retention rates and numbers of nurses willing to work in the health sector. © 2012 Blackwell Publishing Ltd.
Curiosity at Vera Rubin Ridge: Testable Hypotheses, First Results, and Implications for Habitability
NASA Astrophysics Data System (ADS)
Fraeman, A.; Bedford, C.; Bridges, J.; Edgar, L. A.; Hardgrove, C.; Horgan, B. H. N.; Gabriel, T. S. J.; Grotzinger, J. P.; Gupta, S.; Johnson, J. R.; Rampe, E. B.; Morris, R. V.; Salvatore, M. R.; Schwenzer, S. P.; Stack, K.; Pinet, P. C.; Rubin, D. M.; Weitz, C. M.; Wellington, D. F.; Wiens, R. C.; Williams, A. J.; Vasavada, A. R.
2017-12-01
As of sol 1756, Curiosity was 250 meters from ascending Vera Rubin Ridge, a unique geomorphic feature preserved in the lower foothills of Aeolis Mons (informally known as Mt. Sharp) that is distinguishable from orbit. Vera Rubin Ridge (previously termed the Hematite Ridge) is characterized by a higher thermal inertia than the surrounding terrain, is comparatively resistant to erosion, and is capped with a hematite-bearing layer that is visible in 18 m/pixel CRISM data. A key hypothesis associated with this unit is that it represents a redox interface where ferrous iron oxidized and precipitated either as hematite or another ferric precursor. The Curiosity integrated payload is being used to determine the depositional environment(s), stratigraphic context and geochemical conditions associated with this interface, all of which will provide key insights into its past habitability potential and the relative timing of processes. Specifically, analysis of Curiosity data will address four major questions related to the history and evolution of ridge-forming strata: (1) What is the stratigraphic relationship between the units in the ridge and the Mt. Sharp group (see Grotzinger et al., 2015)? (2) What primary and secondary geologic processes deposited and modified the ridge units over time? (3) What is the nature and timing of the hematite precipitation environment, and how does it relate to similar oxidized phases in the Murray formation? (4) What are the implications for habitability and the preservation of organic molecules? Initial results of a systematic imaging campaign along the contact between the lower portion of the ridge and the Murray formation have revealed dm-scale cross bedding within the ridge stratigraphy, which provides clues about the depositional environments; these can be compared to suites of sedimentary structures within the adjacent Murray formation. Long-distance ChemCam passive and Mastcam multispectral data show that hematite and likely other ferric phases are present in the upper ridge, consistent with orbital data. Curiosity will continue to take systematic observations that draw upon testable hypotheses about the ridge environments as the rover ascends Vera Rubin Ridge.
External Dependencies-Driven Architecture Discovery and Analysis of Implemented Systems
NASA Technical Reports Server (NTRS)
Ganesan, Dharmalingam; Lindvall, Mikael; Ron, Monica
2014-01-01
A method for architecture discovery and analysis of implemented systems (AIS) is disclosed. The premise of the method is that architecture decisions are inspired and influenced by the external entities that the software system makes use of. Examples of such external entities are COTS components, frameworks, and ultimately even the programming language itself and its libraries. Traces of these architecture decisions can thus be found in the implemented software and are manifested in the way software systems use such external entities. While this fact is often ignored in contemporary reverse engineering methods, the AIS method actively leverages and makes use of the dependencies to external entities as a starting point for the architecture discovery. The AIS method is demonstrated using NASA's Space Network Access System (SNAS). The results show, with abundant evidence, that the method offers reusable and repeatable guidelines for discovering the architecture and locating potential risks (e.g. low testability, decreased performance) that are hidden deep in the implementation. The analysis is conducted by using external dependencies to identify, classify and review a minimal set of key source code files. Given the benefits of analyzing external dependencies as a way to discover architectures, it is argued that external dependencies deserve to be treated as first-class citizens during reverse engineering. The current structure of a knowledge base of external entities and analysis questions with strategies for getting answers is also discussed.
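A minimal sketch of the AIS starting point, assuming a Python codebase (function names here are hypothetical, and the published method is language-agnostic): harvest each file's external imports and invert the map, so an analyst can rank which files to review first for each external entity.

```python
import ast
import sys
from collections import defaultdict
from pathlib import Path

def external_imports(py_file: Path) -> set[str]:
    """Top-level package names imported by one Python source file.
    (Filtering out stdlib/internal packages is left to the analyst.)"""
    tree = ast.parse(py_file.read_text(encoding="utf-8"), filename=str(py_file))
    pkgs = set()
    for node in ast.walk(tree):
        if isinstance(node, ast.Import):
            pkgs.update(alias.name.split(".")[0] for alias in node.names)
        elif isinstance(node, ast.ImportFrom) and node.module and node.level == 0:
            pkgs.add(node.module.split(".")[0])
    return pkgs

def dependency_map(root: Path) -> dict[str, list[Path]]:
    """Invert file->imports into external-entity->files: the AIS entry point."""
    usage = defaultdict(list)
    for f in sorted(root.rglob("*.py")):
        for pkg in external_imports(f):
            usage[pkg].append(f)
    return dict(usage)

if __name__ == "__main__":
    # Usage: python ais_scan.py <source_root>
    for pkg, files in sorted(dependency_map(Path(sys.argv[1])).items(),
                             key=lambda kv: -len(kv[1])):
        print(f"{pkg:20s} used by {len(files)} file(s)")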
Lockheed Martin Skunk Works Single Stage to Orbit/Reusable Launch Vehicle
NASA Technical Reports Server (NTRS)
1999-01-01
Lockheed Martin Skunk Works has compiled an Annual Performance Report of the X-33/RLV Program. This report consists of individual reports from all industry team members, as well as NASA team centers. This portion of the report is comprised of a status report of Lockheed Martin's contribution to the program. The following is a summary of the Lockheed Martin Centers involved and work reviewed under their portion of the agreement: (1) Lockheed Martin Skunk Works - Vehicle Development, Operations Development, X-33 and RLV Systems Engineering, Manufacturing, Ground Operations, Reliability, Maintainability/Testability, Supportability, & Special Analysis Team, and X-33 Flight Assurance; (2) Lockheed Martin Technical Operations - Launch Support Systems, Ground Support Equipment, Flight Test Operations, and RLV Operations Development Support; (3) Lockheed Martin Space Operations - TAEM and A/L Guidance and Flight Control Design, Evaluation of Vehicle Configuration, TAEM and A/L Dispersion Analysis, Modeling and Simulations, Frequency Domain Analysis, Verification and Validation Activities, and Ancillary Support; (4) Lockheed Martin Astronautics-Denver - Systems Engineering, X-33 Development; (5) Sanders - A Lockheed Martin Company - Vehicle Health Management Subsystem Progress, GSS Progress; and (6) Lockheed Martin Michoud Space Systems - X-33 Liquid Oxygen (LOX) Tank, Key Challenges, Lessons Learned, X-33/RLV Composite Technology, Reusable Cryogenic Insulation (RCI) and Vehicle Health Monitoring, Main Propulsion Systems (MPS), Structural Testing, X-33 System Integration and Analysis, and Cryogenic Systems Operations.
"Don׳t" versus "won׳t": principles, mechanisms, and intention in action inhibition.
Ridderinkhof, K Richard; van den Wildenberg, Wery P M; Brass, Marcel
2014-12-01
The aim of the present review is to provide a theoretical analysis of the role of intentions in inhibition. We will first outline four dimensions along which inhibition can be categorized: intentionality, timing, specificity, and the nature of the to-be-inhibited action. Next, we relate the concept of inhibition to theories of intentional action. In particular, we integrate ideomotor theory with motor control theories that involve predictive forward modeling of the consequences of one's action, and evaluate how the dimensional classification of inhibition fits into such an integrative approach. Furthermore, we will outline testable predictions that derive from this novel hypothesis of ideomotor inhibition. We then discuss the viability of the ideomotor inhibition hypothesis and our classification in view of the available evidence on the neural mechanisms of action inhibition, indicating that sensorimotor and ideomotor inhibition engage largely overlapping networks with additional recruitment of dFMC for ideomotor inhibition. Copyright © 2014 Elsevier Ltd. All rights reserved.
Cremer, Jonas; Arnoldini, Markus; Hwa, Terence
2017-06-20
The human gut harbors a dynamic microbial community whose composition bears great importance for the health of the host. Here, we investigate how colonic physiology impacts bacterial growth, which ultimately dictates microbiota composition. Combining measurements of bacterial physiology with analysis of published data on human physiology into a quantitative, comprehensive modeling framework, we show how water flow in the colon, in concert with other physiological factors, determines the abundances of the major bacterial phyla. Mechanistically, our model shows that local pH values in the lumen, which differentially affect the growth of different bacteria, drive changes in microbiota composition. It identifies key factors influencing the delicate regulation of colonic pH, including epithelial water absorption, nutrient inflow, and luminal buffering capacity, and generates testable predictions on their effects. Our findings show that a predictive and mechanistic understanding of microbial ecology in the gut is possible. Such predictive understanding is needed for the rational design of intervention strategies to actively control the microbiota.
Gao, Yu; Fangel, Jonatan U; Willats, William G T; Vivier, Melané A; Moore, John P
2016-11-05
The effectiveness of enzyme-mediated maceration in red winemaking relies on the use of an optimum combination of specific enzymes. A lack of information on the relevant enzyme activities and the corresponding polysaccharide-rich berry cell wall structure is a major limitation. This study used different combinations of purified recombinant pectinases with cell wall profiling tools to follow the deconstruction process during winemaking. Multivariate data analysis of the glycan microarray (CoMPP) and gas chromatography (GC) results revealed that pectin lyase performed almost as effectively in de-pectination as certain commercial enzyme mixtures. Surprisingly, the combination of endo-polygalacturonase and pectin-methyl-esterase only unraveled the cell walls without de-pectination. Datasets from the various combinations used confirmed pectin-rich and xyloglucan-rich layers within the grape pomace. These data support a proposed grape cell wall model which can serve as a foundation to evaluate testable hypotheses in future studies aimed at developing tailor-made enzymes for winemaking scenarios. Copyright © 2016 Elsevier Ltd. All rights reserved.
Yukawa unification in an SO(10) SUSY GUT: SUSY on the edge
NASA Astrophysics Data System (ADS)
Poh, Zijie; Raby, Stuart
2015-07-01
In this paper we analyze Yukawa unification in a three-family SO(10) SUSY GUT. We perform a global χ2 analysis and show that supersymmetry (SUSY) effects do not decouple even though the universal scalar mass parameter at the grand unified theory (GUT) scale, m16, is found to lie between 15 and 30 TeV, with the best fit given for m16 ≈ 25 TeV. Note that SUSY effects do not decouple because stops and sbottoms have masses of order 5 TeV, due to renormalization group running from MGUT. The model has many testable predictions. Gauginos are the lightest sparticles, and the light Higgs boson is very much standard-model-like. The model is consistent with flavor and CP observables, with BR(μ → e γ) close to the experimental upper bound. With such a large value of m16, the model is clearly neither "natural" SUSY nor "split" SUSY. It sits in the region in between, or "SUSY on the edge."
Towards a natural disaster intervention and recovery framework.
Lawther, Peter M
2016-07-01
Contemporary responses to facilitate long-term recovery from large-scale natural disasters juxtapose between those of humanitarian agencies and governments and those of the affected community. The extent to which these mechanisms articulate is crucial to the recovery propensity of the affected communities. This research examines such action by exploring the relationship between the scale of post-disaster response interventions, the extent of community participation in them, and their impact on community recovery, using a community wealth capital framework. The investigation was applied to a study of the longer-term community recovery of the island of Vilufushi, Republic of Maldives, which was almost completely destroyed by the Indian Ocean tsunami of 26 December 2004. Data were analysed through the employment of a pattern match technique and a holistic recovery network analysis. The research framework, informed by the case-study results, other long-term recovery evaluations, and existing resilience theory, is reconfigured as a testable roadmap for future post-disaster interventions. © 2016 The Author(s). Disasters © Overseas Development Institute, 2016.
Cheng, Ryan R; Hawk, Alexander T; Makarov, Dmitrii E
2013-02-21
Recent experiments showed that the reconfiguration dynamics of unfolded proteins are often adequately described by simple polymer models. In particular, the Rouse model with internal friction (RIF) captures internal friction effects as observed in single-molecule fluorescence correlation spectroscopy (FCS) studies of a number of proteins. Here we use RIF, and its non-free-draining analog, the Zimm model with internal friction, to explore the effect of internal friction on the rate at which intramolecular contacts can be formed within the unfolded chain. Unlike the reconfiguration times inferred from FCS experiments, which depend linearly on the solvent viscosity, the first passage times to form intramolecular contacts are shown to display a more complex viscosity dependence. We further describe scaling relationships obeyed by contact formation times in the limits of high and low internal friction. Our findings provide experimentally testable predictions that can serve as a framework for the analysis of future studies of contact formation in proteins.
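The empirical signature at stake can be written compactly (a hedged sketch): FCS-inferred reconfiguration times in RIF-type models separate additively into a solvent term linear in viscosity η and an internal-friction offset,

```latex
\tau_{\mathrm{rec}}(\eta) \;\approx\; A\,\eta + \tau_{\mathrm{int}} ,
```

so extrapolating measurements to η → 0 isolates the internal-friction time τ_int; the contact-formation first-passage times discussed in the abstract do not obey this simple linear form.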
Attack of the Killer Fungus: A Hypothesis-Driven Lab Module †
Sato, Brian K.
2013-01-01
Discovery-driven experiments in undergraduate laboratory courses have been shown to increase student learning and critical thinking abilities. To this end, a lab module involving worm capture by a nematophagous fungus was developed. The goals of this module are to enhance scientific understanding of the regulation of worm capture by soil-dwelling fungi and for students to attain a set of established learning goals, including the ability to develop a testable hypothesis and search for primary literature for data analysis, among others. Students in a ten-week majors lab course completed the lab module and generated novel data as well as data that agrees with the published literature. In addition, learning gains were achieved as seen through a pre-module and post-module test, student self-assessment, class exam, and lab report. Overall, this lab module enables students to become active participants in the scientific method while contributing to the understanding of an ecologically relevant model organism. PMID:24358387
MicroTCA-based Global Trigger Upgrade project for the CMS experiment at LHC
NASA Astrophysics Data System (ADS)
Rahbaran, B.; Arnold, B.; Bergauer, H.; Eichberger, M.; Rabady, D.
2011-12-01
The Global Trigger (GT) electronics is the final stage of the CMS Level-1 trigger system [1]. At the LHC, up to 40 million proton-bunch crossings occur every second, resulting in about 800 million proton-proton collisions. The CMS Level-1 Global Trigger [1], a custom-designed electronics system based on FPGA technology and the VMEbus system, performs a quick on-line analysis of each collision every 25 ns and decides whether to reject it or to accept it for further analysis. The CMS trigger group of the Institute of High Energy Physics in Vienna (HEPHY) is involved in the Level-1 trigger of the CMS experiment at CERN. As part of the Trigger Upgrade, the Level-1 Global Trigger will be redesigned and implemented in MicroTCA-based technology, which allows engineers to detect all possible faults on plug-in boards, in the power supply, and in the cooling system. The upgraded Global Trigger will be designed to have the same basic categories of functions as the present GT, but will have more algorithms and more possibilities for combining trigger candidates. Additionally, reconfigurability and testability will be supported based on the next system generation.
NASA Technical Reports Server (NTRS)
Hansman, Robert John, Jr.
1999-01-01
MIT has investigated Situational Awareness issues relating to the implementation of Datalink in the Air Traffic Control environment for a number of years under this grant activity. This work has investigated: 1) The Effect of "Party Line" Information. 2) The Effect of Datalink-Enabled Automated Flight Management Systems (FMS) on Flight Crew Situational Awareness. 3) The Effect of Cockpit Display of Traffic Information (CDTI) on Situational Awareness During Close Parallel Approaches. 4) Analysis of Flight Path Management Functions in Current and Future ATM Environments. 5) Human Performance Models in Advanced ATC Automation: Flight Crew and Air Traffic Controllers. 6) CDTI of Datalink-Based Intent Information in Advanced ATC Environments. 7) Shared Situational Awareness between the Flight Deck and ATC in Datalink-Enabled Environments. 8) Analysis of Pilot and Controller Shared SA Requirements & Issues. 9) Development of Robust Scenario Generation and Distributed Simulation Techniques for Flight Deck ATC Simulation. 10) Methods of Testing Situation Awareness Using Testable Response Techniques. The work is detailed in specific technical reports that are listed in the following bibliography, and are attached as an appendix to the master final technical report.
Mand, Cara; Gillam, Lynn; Delatycki, Martin B; Duncan, Rony E
2012-09-01
Predictive genetic testing is now routinely offered to asymptomatic adults at risk for genetic disease. However, testing of minors at risk for adult-onset conditions, where no treatment or preventive intervention exists, has evoked greater controversy and inspired a debate spanning two decades. This review aims to provide a detailed longitudinal analysis and concludes by examining the debate's current status and prospects for the future. Fifty-three relevant theoretical papers published between 1990 and December 2010 were identified, and interpretative content analysis was employed to catalogue discrete arguments within these papers. Novel conclusions were drawn from this review. While the debate's first voices were raised in opposition to testing and their arguments have retained currency over many years, arguments in favour of testing, which appeared sporadically at first, have gained momentum more recently. Most arguments on both sides are testable empirical claims, so far untested, rather than abstract ethical or philosophical positions. The dispute thus lies not so much in whether minors should be permitted to access predictive genetic testing as in whether these empirical claims about the relative benefits or harms of testing should be assessed.
Functional Extended Redundancy Analysis
ERIC Educational Resources Information Center
Hwang, Heungsun; Suk, Hye Won; Lee, Jang-Han; Moskowitz, D. S.; Lim, Jooseop
2012-01-01
We propose a functional version of extended redundancy analysis that examines directional relationships among several sets of multivariate variables. As in extended redundancy analysis, the proposed method posits that a weighted composite of each set of exogenous variables influences a set of endogenous variables. It further considers endogenous…
Reliability Analysis of a Green Roof Under Different Storm Scenarios
NASA Astrophysics Data System (ADS)
William, R. K.; Stillwell, A. S.
2015-12-01
Urban environments continue to face the challenges of localized flooding and decreased water quality brought on by the increasing amount of impervious area in the built environment. Green infrastructure provides an alternative to conventional storm sewer design by using natural processes to filter and store stormwater at its source. However, there are currently few consistent standards available in North America to ensure that installed green infrastructure is performing as expected. This analysis offers a method for characterizing green roof failure using a visual aid commonly used in earthquake engineering: fragility curves. We adapted the concept of the fragility curve based on the runoff-reduction efficiency provided by a green roof compared to a conventional roof under different storm scenarios. We used the 2D distributed, coupled surface water-groundwater model MIKE SHE to simulate the impact that a real green roof might have on runoff in different storm events, and then employed multiple regression analysis to generate an algebraic demand model that was input into the MATLAB-based reliability analysis tool FERUM to calculate the probability of failure. The use of reliability analysis as part of green infrastructure design codes can provide insight into green roof weaknesses and areas for improvement. It also supports design codes that are more resilient than current standards and easily testable for failure. Finally, understanding the reliability of a single green roof module under different scenarios can support holistic testing of system reliability.
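A minimal sketch of the adapted fragility-curve idea (hypothetical parameter values, not the authors' calibration): model the probability that the roof misses its runoff-reduction target as a lognormal CDF of storm depth, the standard form borrowed from earthquake engineering.

```python
import numpy as np
from scipy.stats import norm

def fragility(storm_depth_mm, median_capacity_mm=40.0, beta=0.5):
    """P(green roof fails runoff-reduction target | storm depth).

    Lognormal fragility curve Phi(ln(d/m)/beta). The median capacity m
    and dispersion beta here are placeholders for values that would come
    from the MIKE SHE runs and the regression demand model.
    """
    d = np.asarray(storm_depth_mm, dtype=float)
    return norm.cdf(np.log(d / median_capacity_mm) / beta)

# e.g., failure probabilities for 10, 25, and 60 mm design storms
print(fragility([10.0, 25.0, 60.0]))
```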
The Rome Laboratory Reliability Engineer’s Toolkit
1993-04-01
34Testability Programs for Electronic Systems and Equipment" DODD 5000.1 "Defense Acquistion " DODI 5000.2 "Defense Acquisition Management Policies and...these paths have an equivalent failure rate of zero so that only the remaining serial elements need to be translated. 5. The requirement process...X6) X A2+B2+XAXB One standby off-line unit with n active on- line units required for success. Off-line spare assumed to have a failure rate of zero
The dynamics of hurricane balls
NASA Astrophysics Data System (ADS)
Andersen, W. L.; Werner, Steven
2015-09-01
We examine the theory of the hurricane balls toy. This toy consists of two steel balls welded together, which are sent spinning on a horizontal surface somewhat like a top. Unlike a top, at high frequency the symmetry axis approaches a limiting inclination that is not perpendicular to the surface. We calculate (and experimentally verify) the limiting inclinations for three toy geometries. We find that at high frequencies, hurricane balls provide an easily realized and testable example of the Poinsot theory of freely rotating symmetrical bodies.
The Mars Science Laboratory Entry, Descent, and Landing Flight Software
NASA Technical Reports Server (NTRS)
Gostelow, Kim P.
2013-01-01
This paper describes the design, development, and testing of the EDL program from the perspective of the software engineer. We briefly cover the overall MSL flight software organization, and then the organization of EDL itself. We discuss the timeline, the structure of the GNC code (but not the algorithms as they are covered elsewhere in this conference) and the command and telemetry interfaces. Finally, we cover testing and the influence that testability had on the EDL flight software design.
Brain Organization and Psychodynamics
Peled, Avi; Geva, Amir B.
1999-01-01
Any attempt to link brain neural activity and psychodynamic concepts requires a tremendous conceptual leap. Such a leap may be facilitated if a common language between brain and mind can be devised. System theory proposes formulations that may aid in reconceptualizing psychodynamic descriptions in terms of neural organizations in the brain. Once adopted, these formulations can help to generate testable predictions about brain–psychodynamic relations and thus significantly affect the future of psychotherapy. (The Journal of Psychotherapy Practice and Research 1999; 8:24–39) PMID:9888105
A Quantitative Geochemical Target for Modeling the Formation of the Earth and Moon
NASA Technical Reports Server (NTRS)
Boyce, Jeremy W.; Barnes, Jessica J.; McCubbin, Francis M.
2017-01-01
The past decade has been one of geochemical, isotopic, and computational advances that are bringing the laboratory measurement and computational modeling neighborhoods of the Earth-Moon community into ever closer proximity. We are now, however, in a position to become even better neighbors: modelers can generate testable hypotheses for geochemists, and geochemists can provide quantitative targets for modelers. Here we present a robust example of the latter, based on Cl isotope measurements of mare basalts.
Gardner, Andy
2017-10-06
A central feature of Darwin's theory of natural selection is that it explains the purpose of biological adaptation. Here, I: emphasize the scientific importance of understanding what adaptations are for, in terms of facilitating the derivation of empirically testable predictions; discuss the population genetical basis for Darwin's theory of the purpose of adaptation, with reference to Fisher's 'fundamental theorem of natural selection'; and show that a deeper understanding of the purpose of adaptation is achieved in the context of social evolution, with reference to inclusive fitness and superorganisms.
The Role of Metaphysical Naturalism in Science
NASA Astrophysics Data System (ADS)
Mahner, Martin
2012-10-01
This paper defends the view that metaphysical naturalism is a constitutive ontological principle of science in that the general empirical methods of science, such as observation, measurement and experiment, and thus the very production of empirical evidence, presuppose a no-supernature principle. It examines the consequences of metaphysical naturalism for the testability of supernatural claims, and it argues that explanations involving supernatural entities are pseudo-explanatory due to the many semantic and ontological problems of supernatural concepts. The paper also addresses the controversy about metaphysical versus methodological naturalism.
Dayside auroral arcs and convection
NASA Technical Reports Server (NTRS)
Reiff, P. H.; Burch, J. L.; Heelis, R. A.
1978-01-01
Recent Defense Meteorological Satellite Program and International Satellite for Ionospheric Studies dayside auroral observations show two striking features: a lack of visible auroral arcs near noon and occasional fan-shaped arcs radiating away from noon on both the morning and afternoon sides of the auroral oval. A simple model which includes these two features is developed by reference to the dayside convection pattern of Heelis et al. (1976). The model may be testable in the near future with simultaneous convection, current and auroral light data.
Causal Reasoning on Biological Networks: Interpreting Transcriptional Changes
NASA Astrophysics Data System (ADS)
Chindelevitch, Leonid; Ziemek, Daniel; Enayetallah, Ahmed; Randhawa, Ranjit; Sidders, Ben; Brockel, Christoph; Huang, Enoch
Over the past decade gene expression data sets have been generated at an increasing pace. In addition to ever increasing data generation, the biomedical literature is growing exponentially. The PubMed database (Sayers et al., 2010) comprises more than 20 million citations as of October 2010. The goal of our method is the prediction of putative upstream regulators of observed expression changes based on a set of over 400,000 causal relationships. The resulting putative regulators constitute directly testable hypotheses for follow-up.
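A hedged sketch of the scoring idea (illustrative, not the authors' algorithm): each candidate regulator, with signed causal edges to downstream transcripts, is scored by how many observed up/down changes it explains minus how many it contradicts.

```python
from collections import Counter

# Toy signed causal graph: regulator -> {gene: +1 (activates) or -1 (represses)}.
# Edges are illustrative stand-ins for a curated knowledge base.
CAUSAL_EDGES = {
    "TP53": {"CDKN1A": +1, "MDM2": +1, "BCL2": -1},
    "MYC":  {"CDKN1A": -1, "LDHA": +1},
}

def score_regulators(observed: dict[str, int]) -> Counter:
    """observed maps gene -> +1 (up) or -1 (down). Score each regulator,
    assumed active, as (# correct sign predictions) - (# incorrect)."""
    scores = Counter()
    for reg, targets in CAUSAL_EDGES.items():
        for gene, sign in targets.items():
            if gene in observed:
                scores[reg] += 1 if sign == observed[gene] else -1
    return scores

# Regulators ranked by consistency with an observed expression change set
print(score_regulators({"CDKN1A": +1, "MDM2": +1, "LDHA": -1}).most_common())
```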
Harrison, Luke; Loui, Psyche
2014-01-01
Music has a unique power to elicit moments of intense emotional and psychophysiological response. These moments – termed "chills," "thrills," "frissons," etc. – are subjects of introspection and philosophical debate, as well as scientific study in music perception and cognition. The present article integrates the existing multidisciplinary literature in an attempt to define a comprehensive, testable, and ecologically valid model of transcendent psychophysiological moments in music. PMID:25101043
The Labor Market and the Second Economy in the Soviet Union
1991-01-01
model. WHO WORKS "ON THE LEFT"? (The non-second-economy income (V) is in turn composed of official first-economy income, pilferage from the first...demands. In other words, the model assumes that the family "pools" all unearned income regardless of source. This is one of the few testable assumptions...of the neoclassical model. In the labor supply model in this paper, we have assumed that all first-economy income, for both husband and wife, is...
Scaling properties of multitension domain wall networks
NASA Astrophysics Data System (ADS)
Oliveira, M. F.; Martins, C. J. A. P.
2015-02-01
We study the asymptotic scaling properties of domain wall networks with three different tensions in various cosmological epochs. We discuss the conditions under which a scale-invariant evolution of the network (which is well established for simpler walls) still applies and also consider the limiting case where defects are locally planar and the curvature is concentrated in the junctions. We present detailed quantitative predictions for scaling densities in various contexts, which should be testable by means of future high-resolution numerical simulations.
White-nose syndrome initiates a cascade of physiologic disturbances in the hibernating bat host
Verant, Michelle L.; Meteyer, Carol U.; Speakman, John R.; Cryan, Paul M.; Lorch, Jeffrey M.; Blehert, David S.
2014-01-01
Integrating these novel findings on the physiological changes that occur in early-stage WNS with those previously documented in late-stage infections, we propose a multi-stage disease progression model that mechanistically describes the pathologic and physiologic effects underlying mortality of WNS in hibernating bats. This model identifies testable hypotheses for better understanding this disease, knowledge that will be critical for defining effective disease mitigation strategies aimed at reducing morbidity and mortality that results from WNS.
Beyond Critical Exponents in Neuronal Avalanches
NASA Astrophysics Data System (ADS)
Friedman, Nir; Butler, Tom; Deville, Robert; Beggs, John; Dahmen, Karin
2011-03-01
Neurons form a complex network in the brain, where they interact with one another by firing electrical signals. Neurons firing can trigger other neurons to fire, potentially causing avalanches of activity in the network. In many cases these avalanches have been found to be scale independent, similar to critical phenomena in diverse systems such as magnets and earthquakes. We discuss models for neuronal activity that allow for the extraction of testable, statistical predictions. We compare these models to experimental results, and go beyond critical exponents.
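For reference, the flavor of testable prediction meant here (hedged; the exponent values quoted are the mean-field ones for a critical branching process): avalanche sizes S and durations T should follow power laws whose exponents are tied together by a scaling relation,

```latex
P(S) \sim S^{-\tau}, \qquad P(T) \sim T^{-\alpha}, \qquad
\frac{\alpha - 1}{\tau - 1} = \frac{1}{\sigma \nu z} ,
```

with mean-field values τ = 3/2, α = 2, and 1/σνz = 2, so the exponent relation can be checked against data independently of the individual power-law fits.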
The terrestrial evolution of metabolism and life – by the numbers
O'Kelly, Gregory C
2009-01-01
Background Allometric scaling relating body mass to metabolic rate by an exponent of the former (Kleiber's Law), commonly known as quarter-power scaling (QPS), is controversial for claims made on its behalf, especially that of its universality for all life. As originally formulated, Kleiber's law was based upon the study of heat; metabolic rate is quantified in watts (or calories per unit time). Techniques and technology for metabolic energy measurement have been refined but the math has not. QPS is susceptible to increasing deviations of theoretical predictions from data, suggesting that there is no single, universal exponent relevant to all of life. QPS's major proponents continue to fail to make good on hints of the power of the equation for understanding aging. Essentialist-deductivist view If the equation includes a term for efficiency in the exponent, thereby ruling out thermogenesis as part of metabolism, its heuristic power is greatly amplified, and testable deductive inferences are generated. If metabolic rate is measured in watts and metabolic efficiency is a redox-coupling ratio, then the equation is essentially about the energy storage capacity of organic molecules. The equation is entirely about the essentials of all life: water, salt, organic molecules, and energy. The water and salt provide an electrochemical salt bridge for the transmission of energy into and through the organic components. The equation, when graphed, treats the organic structure as battery-like, and relates its recharge rate and electrical properties to its longevity. Conclusion The equation models the longevity-extending effects of caloric restriction, and shows where those effects wane. It models the immortality of some types of cells, and supports the argument for the origin of life being at submarine volcanic vents and black smokers. It clarifies how early life had to change to survive drifting to the surface, and what drove mutations in its ascent. It does not deal with cause and effect; it deals with variables in the essentials of all life, and treats life as an epiphenomenon of those variables. The equation describes how battery discharge into the body can increase muscle mass, promote fitness, and extend life span, among other issues. PMID:19712477
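For reference, Kleiber's law in its conventional form, alongside a schematic of the efficiency-modified exponent the author argues for (the second expression is illustrative, not the paper's exact equation):

```latex
B = B_0\, M^{3/4} \qquad \text{vs.} \qquad B = B_0\, M^{(3/4)\,\varepsilon} ,
```

where B is metabolic rate in watts, M is body mass, and ε stands for a redox-coupling efficiency.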
Friesen, Justin P; Campbell, Troy H; Kay, Aaron C
2015-03-01
We propose that people may gain certain "offensive" and "defensive" advantages for their cherished belief systems (e.g., religious and political views) by including aspects of unfalsifiability in those belief systems, such that some aspects of the beliefs cannot be tested empirically and conclusively refuted. This may seem peculiar, irrational, or at least undesirable to many people because it is assumed that the primary purpose of a belief is to know objective truth. However, past research suggests that accuracy is only one psychological motivation among many, and falsifiability or testability may be less important when the purpose of a belief serves other psychological motives (e.g., to maintain one's worldviews, serve an identity). In Experiments 1 and 2 we demonstrate the "offensive" function of unfalsifiability: that it allows religious adherents to hold their beliefs with more conviction and political partisans to polarize and criticize their opponents more extremely. Next we demonstrate unfalsifiability's "defensive" function: When facts threaten their worldviews, religious participants frame specific reasons for their beliefs in more unfalsifiable terms (Experiment 3) and political partisans construe political issues as more unfalsifiable ("moral opinion") instead of falsifiable ("a matter of facts"; Experiment 4). We conclude by discussing how in a world where beliefs and ideas are becoming more easily testable by data, unfalsifiability might be an attractive aspect to include in one's belief systems, and how unfalsifiability may contribute to polarization, intractability, and the marginalization of science in public discourse. PsycINFO Database Record (c) 2015 APA, all rights reserved.
Scientific realism and wishful thinking in soil hydrology
NASA Astrophysics Data System (ADS)
Flühler, H.
2009-04-01
In our field we often learn - or could have learned - more from failures than from successes, provided we had postulated testable hypotheses to be accepted or rejected. In soil hydrology, hypotheses are testable if independent information quantifying the pertinent system features is at hand. This view of how to operate is an idealized concept of how we could or should have worked. In reality, the path to success is more tortuous, and we usually progress differently, obeying other professional musts. Although we missed some shortcuts over the past few decades, we definitely made significant progress in understanding vadose zone processes, but we could have advanced our system understanding faster by more rigorously questioning the fundamental assumptions. I will try to illustrate the tortuous path of learning and identify some causes of the slowed-down learning curve. In the pioneering phase of vadose zone research many models were mapped in our minds and implemented on our computers. Many of them are now well established and powerful and represent the state of the art even when they do not work. Some of them are based on erroneous or misleading concepts. Even when based on adequate concepts, they might have been applied in the wrong context, or inadequate models may have led to apparent success. I address this process of collective learning with the intention that we spend more time and effort finding the right question instead of improving tools that are of questionable suitability for solving the main problems.
Choice Experiments to Quantify Preferences for Health and Healthcare: State of the Practice.
Mühlbacher, Axel; Johnson, F Reed
2016-06-01
Stated-preference methods increasingly are used to quantify preferences in health economics, health technology assessment, benefit-risk analysis and health services research. The objective of stated-preference studies is to acquire information about trade-off preferences among treatment outcomes, prioritization of clinical decision criteria, likely uptake or adherence to healthcare products and acceptability of healthcare services or policies. A widely accepted approach to eliciting preferences is discrete-choice experiments. Patient, physician, insurant or general-public respondents choose among constructed, experimentally controlled alternatives described by decision-relevant features or attributes. Attributes can represent complete health states, sets of treatment outcomes or characteristics of a healthcare system. The observed pattern of choice reveals how different respondents or groups of respondents implicitly weigh, value and assess different characteristics of treatments, products or services. An important advantage of choice experiments is their foundation in microeconomic utility theory. This conceptual framework provides tests of internal validity, guidance for statistical analysis of latent preference structures, and testable behavioural hypotheses. Choice experiments require expertise in survey-research methods, random-utility theory, experimental design and advanced statistical analysis. This paper should be understood as an introduction to setting up a basic experiment rather than an exhaustive critique of the latest findings and procedures. Where appropriate, we have identified topics of active research where a broad consensus has not yet been established.
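The random-utility foundation mentioned above can be made concrete with the standard conditional-logit sketch: respondent n derives utility from alternative j as a linear index of its attributes plus an i.i.d. extreme-value error, which yields closed-form choice probabilities,

```latex
U_{nj} = \mathbf{x}_{nj}'\boldsymbol{\beta} + \varepsilon_{nj}, \qquad
P(n \text{ chooses } j) = \frac{\exp(\mathbf{x}_{nj}'\boldsymbol{\beta})}{\sum_{k} \exp(\mathbf{x}_{nk}'\boldsymbol{\beta})} ,
```

with the attribute weights β recovered by maximum likelihood. Ratios of estimated coefficients give marginal rates of substitution between attributes, e.g., willingness to pay when one attribute is cost.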
Ali, Ashehad A.; Medlyn, Belinda E.; Aubier, Thomas G.; ...
2015-10-06
Differential species responses to atmospheric CO2 concentration (Ca) could lead to quantitative changes in competition among species and community composition, with flow-on effects for ecosystem function. However, there has been little theoretical analysis of how elevated Ca (eCa) will affect plant competition, or how the composition of plant communities might change. Such theoretical analysis is needed for developing testable hypotheses to frame experimental research. Here, we investigated theoretically how plant competition might change under eCa by implementing two alternative competition theories, resource use theory and resource capture theory, in a plant carbon and nitrogen cycling model. The model makes several novel predictions for the impact of eCa on plant community composition. Using resource use theory, the model predicts that eCa is unlikely to change species dominance in competition, but is likely to increase coexistence among species. Using resource capture theory, the model predicts that eCa may increase community evenness. Collectively, both theories suggest that eCa will favor coexistence and hence that species diversity should increase with eCa. Our theoretical analysis leads to a novel hypothesis for the impact of eCa on plant community composition. In this study, the hypothesis has potential to help guide the design and interpretation of eCa experiments.
Improving accuracy and power with transfer learning using a meta-analytic database.
Schwartz, Yannick; Varoquaux, Gaël; Pallier, Christophe; Pinel, Philippe; Poline, Jean-Baptiste; Thirion, Bertrand
2012-01-01
Typical cohorts in brain imaging studies are not large enough for systematic testing of all the information contained in the images. To build testable working hypotheses, investigators thus rely on analysis of previous work, sometimes formalized in a so-called meta-analysis. In brain imaging, this approach underlies the specification of regions of interest (ROIs) that are usually selected on the basis of the coordinates of previously detected effects. In this paper, we propose to use a database of images, rather than coordinates, and frame the problem as transfer learning: learning a discriminant model on a reference task to apply it to a different but related new task. To facilitate statistical analysis of small cohorts, we use a sparse discriminant model that selects predictive voxels on the reference task and thus provides a principled procedure to define ROIs. The benefits of our approach are twofold. First it uses the reference database for prediction, i.e., to provide potential biomarkers in a clinical setting. Second it increases statistical power on the new task. We demonstrate on a set of 18 pairs of functional MRI experimental conditions that our approach gives good prediction. In addition, on a specific transfer situation involving different scanners at different locations, we show that voxel selection based on transfer learning leads to higher detection power on small cohorts.
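A minimal sketch of the recipe described above, using scikit-learn stand-ins and synthetic data (not the authors' code or dataset): fit an l1-penalized discriminant model on the reference task, keep the voxels with nonzero weights as the data-driven ROI, and restrict analysis of the small new cohort to those voxels.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X_ref, y_ref = rng.normal(size=(200, 5000)), rng.integers(0, 2, 200)  # reference task
X_new, y_new = rng.normal(size=(20, 5000)), rng.integers(0, 2, 20)    # small new cohort

# Sparse (l1) discriminant model on the reference task selects predictive voxels.
clf = LogisticRegression(penalty="l1", solver="liblinear", C=1.0)
clf.fit(X_ref, y_ref)
roi = np.flatnonzero(clf.coef_[0])  # transfer-learned ROI definition

# Statistical testing on the new task is then restricted to the ROI voxels,
# shrinking the multiple-comparison burden for the small cohort.
print(f"{roi.size} voxels selected out of {X_ref.shape[1]}")
X_new_roi = X_new[:, roi]
```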
NASA Astrophysics Data System (ADS)
Wieder, William R.; Knowles, John F.; Blanken, Peter D.; Swenson, Sean C.; Suding, Katharine N.
2017-04-01
Abiotic factors structure plant community composition and ecosystem function across many different spatial scales. Often, such variation is considered at regional or global scales, but here we ask whether ecosystem-scale simulations can be used to better understand landscape-level variation that might be particularly important in complex terrain, such as high-elevation mountains. We performed ecosystem-scale simulations by using the Community Land Model (CLM) version 4.5 to better understand how the increased length of growing seasons may impact carbon, water, and energy fluxes in an alpine tundra landscape. The model was forced with meteorological data and validated with observations from the Niwot Ridge Long Term Ecological Research Program site. Our results demonstrate that CLM is capable of reproducing the observed carbon, water, and energy fluxes for discrete vegetation patches across this heterogeneous ecosystem. We subsequently accelerated snowmelt and increased spring and summer air temperatures in order to simulate potential effects of climate change in this region. We found that vegetation communities that were characterized by different snow accumulation dynamics showed divergent biogeochemical responses to a longer growing season. Contrary to expectations, wet meadow ecosystems showed the strongest decreases in plant productivity under extended summer scenarios because of disruptions in hydrologic connectivity. These findings illustrate how Earth system models such as CLM can be used to generate testable hypotheses about the shifting nature of energy, water, and nutrient limitations across space and through time in heterogeneous landscapes; these hypotheses may ultimately guide further experimental work and model development.
Tuberculosis as a three-act play: A new paradigm for the pathogenesis of pulmonary tuberculosis.
Hunter, Robert L
2016-03-01
Lack of access to human tissues with untreated tuberculosis (TB) has forced generations of researchers to use animal models and to adopt a paradigm that granulomas are the characteristic lesion of both primary and post-primary TB. An extended search of studies of human lung tissues failed to find any reports that support this paradigm. We found scores of publications, from gross pathology in 1804 through high-resolution CT scans in 2015, that identify obstructive lobular pneumonia, not granulomas, as the characteristic lesion of developing post-primary TB. This paper reviews this literature together with other relevant observations to formulate a new paradigm of TB with three distinct stages: a three-act play. First, primary TB, a war of attrition, begins with infection that spreads via lymphatics and the blood stream before inducing the systemic immunity that contains and controls the organisms within granulomas. Second, post-primary TB, a sneak attack, develops during latent TB as an asymptomatic obstructive lobular pneumonia in persons with effective systemic immunity. It is a paucibacillary process with no granulomas that spreads via bronchi and accumulates mycobacterial antigens and host lipids for 1-2 years before suddenly undergoing caseous necrosis. Third, the fallout is responsible for nearly all clinical post-primary disease. It begins with caseous necrotic pneumonia that is either retained to become the focus of fibrocaseous disease or is coughed out to leave a cavity. This three-stage paradigm suggests testable hypotheses and plausible answers to long-standing questions of immunity to TB. Copyright © 2015 The Author. Published by Elsevier Ltd. All rights reserved.
Dini-Andreote, Francisco; Stegen, James C; van Elsas, Jan Dirk; Salles, Joana Falcão
2015-03-17
Ecological succession and the balance between stochastic and deterministic processes are two major themes within microbial ecology, but these conceptual domains have mostly developed independent of each other. Here we provide a framework that integrates shifts in community assembly processes with microbial primary succession to better understand mechanisms governing the stochastic/deterministic balance. Synthesizing previous work, we devised a conceptual model that links ecosystem development to alternative hypotheses related to shifts in ecological assembly processes. Conceptual model hypotheses were tested by coupling spatiotemporal data on soil bacterial communities with environmental conditions in a salt marsh chronosequence spanning 105 years of succession. Analyses within successional stages showed community composition to be initially governed by stochasticity, but as succession proceeded, there was a progressive increase in deterministic selection correlated with increasing sodium concentration. Analyses of community turnover among successional stages--which provide a larger spatiotemporal scale relative to within stage analyses--revealed that changes in the concentration of soil organic matter were the main predictor of the type and relative influence of determinism. Taken together, these results suggest scale-dependency in the mechanisms underlying selection. To better understand mechanisms governing these patterns, we developed an ecological simulation model that revealed how changes in selective environments cause shifts in the stochastic/deterministic balance. Finally, we propose an extended--and experimentally testable--conceptual model integrating ecological assembly processes with primary and secondary succession. This framework provides a priori hypotheses for future experiments, thereby facilitating a systematic approach to understand assembly and succession in microbial communities across ecosystems.
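One common way to put a number on the stochastic/deterministic balance (a sketch of the general null-model approach; the authors' pipeline may differ in the turnover metric and null model used): standardize each observed between-community distance against a null distribution from randomized communities, reading |z| below about 2 as consistent with stochastic assembly and larger deviations as deterministic selection.

```python
import numpy as np

def assembly_z(observed: float, null_draws) -> float:
    """Standardized deviation of an observed beta-diversity value from
    a null distribution built by randomizing community composition."""
    null = np.asarray(null_draws, dtype=float)
    return (observed - null.mean()) / null.std(ddof=1)

# e.g., 999 null randomizations of a phylogenetic turnover metric (synthetic here)
null = np.random.default_rng(1).normal(0.8, 0.05, 999)
print(assembly_z(0.95, null))  # |z| > 2 would suggest deterministic selection
```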
The luminosities of the coldest brown dwarfs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tinney, C. G.; Faherty, Jacqueline K.; Kirkpatrick, J. Davy
2014-11-20
In recent years, brown dwarfs have been extended to a new Y-dwarf class with effective temperatures colder than 500 K and masses in the range of 5-30 Jupiter masses. They fill a crucial gap in observable atmospheric properties between the much colder gas-giant planets of our own solar system (at around 130 K) and both hotter T-type brown dwarfs and the hotter planets that can be imaged orbiting young nearby stars (both with effective temperatures in the range of 1000-1500 K). Distance measurements for these objects deliver absolute magnitudes that make critical tests of our understanding of very cool atmospheres. Here we report new distances for nine Y dwarfs and seven very late T dwarfs. These reveal that Y dwarfs do indeed represent a continuation of the T-dwarf sequence to both fainter luminosities and cooler temperatures. They also show that the coolest objects display a large range in absolute magnitude for a given photometric color. The latest atmospheric models show good agreement with the majority of these Y-dwarf absolute magnitudes. This is also the case for WISE0855-0714, the coldest and closest brown dwarf to the Sun, which shows evidence for water ice clouds. However, there are also some outstanding exceptions, which suggest either binarity or the presence of condensate clouds. The former is readily testable with current adaptive optics facilities. The latter would mean that the range of cloudiness in Y dwarfs is substantial, with most hosting almost no clouds while others have dense clouds, making them prime targets for future variability observations to study cloud dynamics.
The cognitive niche: Coevolution of intelligence, sociality, and language
Pinker, Steven
2010-01-01
Although Darwin insisted that human intelligence could be fully explained by the theory of evolution, the codiscoverer of natural selection, Alfred Russel Wallace, claimed that abstract intelligence was of no use to ancestral humans and could only be explained by intelligent design. Wallace's apparent paradox can be dissolved with two hypotheses about human cognition. One is that intelligence is an adaptation to a knowledge-using, socially interdependent lifestyle, the “cognitive niche.” This embraces the ability to overcome the evolutionary fixed defenses of plants and animals by applications of reasoning, including weapons, traps, coordinated driving of game, and detoxification of plants. Such reasoning exploits intuitive theories about different aspects of the world, such as objects, forces, paths, places, states, substances, and other people's beliefs and desires. The theory explains many zoologically unusual traits in Homo sapiens, including our complex toolkit, wide range of habitats and diets, extended childhoods and long lives, hypersociality, complex mating, division into cultures, and language (which multiplies the benefit of knowledge because know-how is useful not only for its practical benefits but as a trade good with others, enhancing the evolution of cooperation). The second hypothesis is that humans possess an ability of metaphorical abstraction, which allows them to coopt faculties that originally evolved for physical problem-solving and social coordination, apply them to abstract subject matter, and combine them productively. These abilities can help explain the emergence of abstract cognition without supernatural or exotic evolutionary forces and are in principle testable by analyses of statistical signs of selection in the human genome. PMID:20445094
Towards a theory of PACS deployment: an integrative PACS maturity framework.
van de Wetering, Rogier; Batenburg, Ronald
2014-06-01
Owing to the large financial investments that go along with picture archiving and communication system (PACS) deployments and inconsistent PACS performance evaluations, there is a pressing need for a better understanding of the implications of PACS deployment in hospitals. We claim that there is a gap in the research field, both theoretically and empirically, in explaining the success of PACS deployment and maturity in hospitals. Theoretical principles relevant to PACS performance, maturity and alignment are reviewed from a system and complexity perspective. A conceptual model to explain PACS performance and a set of testable hypotheses are then developed. Then, structural equation modeling (SEM), i.e. causal modeling, is applied to validate the model and hypotheses based on a research sample of 64 hospitals that use PACS, i.e. 70% of all hospitals in the Netherlands. Outcomes of the SEM analyses substantiate that the measurements of all constructs are reliable and valid. PACS alignment, modeled as a higher-order construct of five complementary organizational dimensions and maturity levels, has a significant positive impact on PACS performance. This result is robust and stable for various sub-samples and segments. This paper presents a conceptual model that explains how alignment in deploying PACS in hospitals is positively related to the perceived performance of PACS. The conceptual model is extended with tools such as checklists to systematically identify the improvement areas for hospitals in the PACS domain. The holistic approach towards PACS alignment and maturity provides a framework for clinical practice.
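Full SEM requires dedicated tooling; as a far simpler stand-in (explicitly not the authors' method), the sketch below forms a composite alignment score from five hypothetical dimension scores and correlates it with a simulated performance score. All numbers are synthetic:

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
n_hospitals = 64  # sample size reported in the abstract

# Hypothetical 1-5 maturity scores on five alignment dimensions per hospital.
dimensions = rng.integers(1, 6, size=(n_hospitals, 5)).astype(float)

# Simplistic higher-order construct: the mean of the five dimension scores.
alignment = dimensions.mean(axis=1)

# Hypothetical perceived-performance scores, loosely coupled to alignment.
performance = alignment + rng.normal(0.0, 0.5, size=n_hospitals)

r, p = pearsonr(alignment, performance)
print(f"r = {r:.2f}, p = {p:.3g}")
```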
A system architecture for a planetary rover
NASA Technical Reports Server (NTRS)
Smith, D. B.; Matijevic, J. R.
1989-01-01
Each planetary mission requires a complex space vehicle which integrates several functions to accomplish the mission and science objectives. A Mars Rover is one of these vehicles, and extends the normal spacecraft functionality with two additional functions: surface mobility and sample acquisition. All functions are assembled into a hierarchical and structured format to understand the complexities of interactions between functions during different mission times. This format can graphically show data flow between functions and, most importantly, the necessary control flow to avoid ambiguous results. Diagrams are presented organizing the functions into a structured, block format where each block represents a major function at the system level. As such, there are six blocks representing telecomm, power, thermal, science, mobility and sampling under a supervisory block called Data Management/Executive. Each block is a simple collection of state machines arranged into a hierarchical order very close to the NASREM model for Telerobotics. Each layer within a block represents a level of control for a set of state machines that perform the three primary interface functions: command, telemetry, and fault protection. This latter function is expanded to include automatic reactions to the environment as well as internal faults. Lastly, diagrams are presented that trace the system operations involved in moving from site to site after site selection. The diagrams clearly illustrate both the data and control flows. They also illustrate inter-block data transfers and a hierarchical approach to fault protection. This systems architecture can be used to determine functional requirements and interface specifications, and as a mechanism for grouping subsystems (i.e., collecting groups of state machines into blocks consistent with good, testable implementations).
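A toy sketch of the block-and-executive arrangement described above; the block names come from the abstract, but the state machines and interfaces are simplified assumptions, not the NASREM-derived design:

```python
from enum import Enum, auto

class State(Enum):
    IDLE = auto(); ACTIVE = auto(); SAFE = auto()

class SubsystemBlock:
    """One block (e.g. mobility) exposing the three interface functions
    named above: command, telemetry, and fault protection."""
    def __init__(self, name):
        self.name, self.state = name, State.IDLE
    def command(self, cmd):
        if self.state is not State.SAFE:
            self.state = State.ACTIVE if cmd == "start" else State.IDLE
    def telemetry(self):
        return {"block": self.name, "state": self.state.name}
    def fault_protection(self, fault_detected):
        if fault_detected:
            self.state = State.SAFE  # autonomous reaction overrides commands

class Executive:
    """Supervisory Data Management/Executive dispatching to blocks."""
    def __init__(self, blocks):
        self.blocks = {b.name: b for b in blocks}
    def dispatch(self, name, cmd):
        self.blocks[name].command(cmd)
    def health(self):
        return [b.telemetry() for b in self.blocks.values()]

rover = Executive([SubsystemBlock(n) for n in
                   ("telecomm", "power", "thermal", "science", "mobility", "sampling")])
rover.dispatch("mobility", "start")
rover.blocks["power"].fault_protection(fault_detected=True)
print(rover.health())
```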
Colloquium paper: the cognitive niche: coevolution of intelligence, sociality, and language.
Pinker, Steven
2010-05-11
Although Darwin insisted that human intelligence could be fully explained by the theory of evolution, the codiscoverer of natural selection, Alfred Russel Wallace, claimed that abstract intelligence was of no use to ancestral humans and could only be explained by intelligent design. Wallace's apparent paradox can be dissolved with two hypotheses about human cognition. One is that intelligence is an adaptation to a knowledge-using, socially interdependent lifestyle, the "cognitive niche." This embraces the ability to overcome the evolutionarily fixed defenses of plants and animals by applications of reasoning, including weapons, traps, coordinated driving of game, and detoxification of plants. Such reasoning exploits intuitive theories about different aspects of the world, such as objects, forces, paths, places, states, substances, and other people's beliefs and desires. The theory explains many zoologically unusual traits in Homo sapiens, including our complex toolkit, wide range of habitats and diets, extended childhoods and long lives, hypersociality, complex mating, division into cultures, and language (which multiplies the benefit of knowledge because know-how is useful not only for its practical benefits but as a trade good with others, enhancing the evolution of cooperation). The second hypothesis is that humans possess an ability of metaphorical abstraction, which allows them to coopt faculties that originally evolved for physical problem-solving and social coordination, apply them to abstract subject matter, and combine them productively. These abilities can help explain the emergence of abstract cognition without supernatural or exotic evolutionary forces and are in principle testable by analyses of statistical signs of selection in the human genome.
A semantic web framework to integrate cancer omics data with biological knowledge.
Holford, Matthew E; McCusker, James P; Cheung, Kei-Hoi; Krauthammer, Michael
2012-01-25
The RDF triple provides a simple linguistic means of describing limitless types of information. Triples can be flexibly combined into a unified data source we call a semantic model. Semantic models open new possibilities for the integration of variegated biological data. We use Semantic Web technology to explicate high throughput clinical data in the context of fundamental biological knowledge. We have extended Corvus, a data warehouse which provides a uniform interface to various forms of Omics data, by providing a SPARQL endpoint. With the querying and reasoning tools made possible by the Semantic Web, we were able to explore quantitative semantic models retrieved from Corvus in the light of systematic biological knowledge. For this paper, we merged semantic models containing genomic, transcriptomic and epigenomic data from melanoma samples with two semantic models of functional data - one containing Gene Ontology (GO) data, the other, regulatory networks constructed from transcription factor binding information. These two semantic models were created in an ad hoc manner but support a common interface for integration with the quantitative semantic models. Such combined semantic models allow us to pose significant translational medicine questions. Here, we study the interplay between a cell's molecular state and its response to anti-cancer therapy by exploring the resistance of cancer cells to Decitabine, a demethylating agent. We were able to generate a testable hypothesis to explain how Decitabine fights cancer - namely, that it targets apoptosis-related gene promoters predominantly in Decitabine-sensitive cell lines, thus conveying its cytotoxic effect by activating the apoptosis pathway. Our research provides a framework whereby similar hypotheses can be developed easily.
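The querying step described above can be reproduced in miniature with rdflib. The file name, namespace, and predicates below are hypothetical placeholders, not the Corvus schema; only the GO identifier for apoptotic process is real:

```python
from rdflib import Graph

g = Graph()
# Hypothetical Turtle file of triples linking genes, promoter methylation
# calls, and GO annotations; not part of any actual Corvus release.
g.parse("melanoma_omics.ttl", format="turtle")

# Find genes annotated to apoptosis whose promoters are methylated.
query = """
PREFIX ex: <http://example.org/omics#>
PREFIX go: <http://purl.obolibrary.org/obo/>
SELECT ?gene WHERE {
    ?gene ex:annotatedTo go:GO_0006915 .   # apoptotic process
    ?gene ex:promoterState ex:Methylated .
}
"""
for row in g.query(query):
    print(row.gene)
```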
Electrical test prediction using hybrid metrology and machine learning
NASA Astrophysics Data System (ADS)
Breton, Mary; Chao, Robin; Muthinti, Gangadhara Raja; de la Peña, Abraham A.; Simon, Jacques; Cepler, Aron J.; Sendelbach, Matthew; Gaudiello, John; Emans, Susan; Shifrin, Michael; Etzioni, Yoav; Urenski, Ronen; Lee, Wei Ti
2017-03-01
Electrical test measurement in the back-end of line (BEOL) is crucial for wafer and die sorting as well as comparing intended process splits. Any in-line, nondestructive technique in the process flow to accurately predict these measurements can significantly improve mean-time-to-detect (MTTD) of defects and improve cycle times for yield and process learning. Measuring after BEOL metallization is commonly done for process control and learning, particularly with scatterometry (also called OCD (Optical Critical Dimension)), which can solve for multiple profile parameters such as metal line height or sidewall angle and does so within patterned regions. This gives scatterometry an advantage over inline microscopy-based techniques, which provide top-down information, since such techniques can be insensitive to sidewall variations hidden under the metal fill of the trench. But when faced with correlation to electrical test measurements that are specific to the BEOL processing, both techniques face the additional challenge of sampling. Microscopy-based techniques are sampling-limited by their small probe size, while scatterometry is traditionally limited (for microprocessors) to scribe targets that mimic device ground rules but are not necessarily designed to be electrically testable. A solution to this sampling challenge lies in a fast reference-based machine learning capability that allows for OCD measurement directly of the electrically testable structures, even when they are not OCD-compatible. By incorporating such direct OCD measurements, correlation to, and therefore prediction of, resistance of BEOL electrical test structures is significantly improved. Improvements in prediction capability for multiple types of in-die electrically testable device structures are demonstrated. To further improve the quality of the prediction of the electrical resistance measurements, hybrid metrology using the OCD measurements as well as X-ray metrology (XRF) is used. Hybrid metrology is the practice of combining information from multiple sources in order to enable or improve the measurement of one or more critical parameters. Here, the XRF measurements are used to detect subtle changes in barrier layer composition and thickness that can have second-order effects on the electrical resistance of the test structures. By accounting for such effects with the aid of the X-ray-based measurements, further improvement in the OCD correlation to electrical test measurements is achieved. Using both types of solution (incorporation of fast reference-based machine learning on non-OCD-compatible test structures, and hybrid metrology combining OCD with XRF technology), improvement in BEOL cycle time learning could be accomplished through improved prediction capability.
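A minimal sketch of the prediction step, with entirely synthetic numbers: OCD profile parameters and XRF barrier readings are combined as features in a regression predicting test-structure resistance. The feature names, values, toy resistance model, and choice of gradient boosting are illustrative assumptions, not the paper's recipe:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 200

# Hypothetical in-line measurements per site: OCD profile parameters
# (line height, sidewall angle, trench CD in nm/degrees) and XRF barrier
# readings (thickness, composition fraction).
ocd = rng.normal([60.0, 88.0, 25.0], [2.0, 1.0, 1.5], size=(n, 3))
xrf = rng.normal([2.5, 0.12], [0.2, 0.01], size=(n, 2))
X = np.hstack([ocd, xrf])

# Toy resistance: dominated by conductor cross-section (height * CD),
# with a second-order barrier-thickness term, plus noise.
resistance = 1e4 / (ocd[:, 0] * ocd[:, 2]) + 0.3 * xrf[:, 0] + rng.normal(0, 0.05, n)

model = GradientBoostingRegressor(random_state=0)
scores = cross_val_score(model, X, resistance, cv=5, scoring="r2")
print("cross-validated R^2:", scores.mean())
```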
Patterns of Dysmorphic Features in Schizophrenia
Scutt, L.E.; Chow, E.W.C.; Weksberg, R.; Honer, W.G.; Bassett, Anne S.
2011-01-01
Congenital dysmorphic features are prevalent in schizophrenia and may reflect underlying neurodevelopmental abnormalities. A cluster analysis approach delineating patterns of dysmorphic features has been used in genetics to classify individuals into more etiologically homogeneous subgroups. In the present study, this approach was applied to schizophrenia, using a sample with a suspected genetic syndrome as a testable model. Subjects (n = 159) with schizophrenia or schizoaffective disorder were ascertained from chronic patient populations (random, n = 123) or referred with possible 22q11 deletion syndrome (referred, n = 36). All subjects were evaluated for presence or absence of 70 reliably assessed dysmorphic features, which were used in a three-step cluster analysis. The analysis produced four major clusters with different patterns of dysmorphic features. Significant between-cluster differences were found for rates of 37 dysmorphic features (P < 0.05), median number of dysmorphic features (P = 0.0001), and validating features not used in the cluster analysis: mild mental retardation (P = 0.001) and congenital heart defects (P = 0.002). Two clusters (1 and 4) appeared to represent more developmental subgroups of schizophrenia with elevated rates of dysmorphic features and validating features. Cluster 1 (n = 27) comprised mostly referred subjects. Cluster 4 (n = 18) had a different pattern of dysmorphic features; one subject had a mosaic Turner syndrome variant. Two other clusters had lower rates and patterns of features consistent with those found in previous studies of schizophrenia. Delineating patterns of dysmorphic features may help identify subgroups that could represent neurodevelopmental forms of schizophrenia with more homogeneous origins. PMID:11803519
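Clustering subjects on binary presence/absence traits of this kind is commonly done with a Jaccard distance and hierarchical linkage. A minimal sketch with a simulated feature matrix (the study's actual three-step procedure is not specified here and is not reproduced):

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

rng = np.random.default_rng(2)

# Hypothetical presence/absence matrix: 159 subjects x 70 dysmorphic features.
features = (rng.random((159, 70)) < 0.15).astype(int)

# Jaccard distance suits sparse binary traits; average linkage builds the tree.
dist = pdist(features, metric="jaccard")
tree = linkage(dist, method="average")

# Cut the tree into four clusters, matching the number reported in the study.
labels = fcluster(tree, t=4, criterion="maxclust")
print(np.bincount(labels)[1:])  # subjects per cluster
```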
Liou, Shwu-Ru
2009-01-01
To systematically analyse the Organizational Commitment model and Theory of Reasoned Action and determine concepts that can better explain nurses' intention to leave their job. The Organizational Commitment model and Theory of Reasoned Action have been proposed and applied to understand intention to leave and turnover behaviour, which are major contributors to nursing shortage. However, the appropriateness of applying these two models in nursing was not analysed. Three main criteria of a useful model were used for the analysis: consistency in the use of concepts, testability and predictability. Both theories use concepts consistently. Concepts in the Theory of Reasoned Action are defined broadly whereas they are operationally defined in the Organizational Commitment model. Predictability of the Theory of Reasoned Action is questionable whereas the Organizational Commitment model can be applied to predict intention to leave. A model was proposed based on this analysis. Organizational commitment, intention to leave, work experiences, job characteristics and personal characteristics can be concepts for predicting nurses' intention to leave. Nursing managers may consider nurses' personal characteristics and experiences to increase their organizational commitment and enhance their intention to stay. Empirical studies are needed to test and cross-validate the re-synthesized model for nurses' intention to leave their job.
NASA Astrophysics Data System (ADS)
Mitchell, Michael R.; Leibler, Stanislas
2018-05-01
The abundance of available static protein structural data makes the more effective analysis and interpretation of this data a valuable tool to supplement the experimental study of protein mechanics. Structural displacements can be difficult to analyze and interpret. Previously, we showed that strains provide a more natural and interpretable representation of protein deformations, revealing mechanical coupling between spatially distinct sites of allosteric proteins. Here, we demonstrate that other transformations of displacements yield additional insights. We calculate the divergence and curl of deformations of the transmembrane channel KcsA. Additionally, we introduce quantities analogous to bend, splay, and twist deformation energies of nematic liquid crystals. These transformations enable the decomposition of displacements into different modes of deformation, helping to characterize the type of deformation a protein undergoes. We apply these calculations to study the filter and gating regions of KcsA. We observe a continuous path of rotational deformations physically coupling these two regions, and, we propose, underlying the allosteric interaction between these regions. Bend, splay, and twist distinguish KcsA gate opening, filter opening, and filter-gate coupling, respectively. In general, physically meaningful representations of deformations (like strain, curl, bend, splay, and twist) can make testable predictions and yield insights into protein mechanics, augmenting experimental methods and more fully exploiting available structural data.
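The divergence and curl transformations mentioned above are straightforward to compute once displacements are sampled on a grid. A minimal numpy sketch, with a synthetic displacement field standing in for interpolated protein data:

```python
import numpy as np

# Hypothetical displacement field u(x, y, z) on a regular grid.
h = 0.5  # grid spacing (arbitrary units)
x, y, z = np.meshgrid(*(np.arange(0, 10, h),) * 3, indexing="ij")
u = np.stack([np.sin(y), np.zeros_like(x), np.cos(x)])  # shape (3, nx, ny, nz)

# Partial derivatives: grad[i][j] = d(u_i)/d(x_j).
grad = [np.gradient(u[i], h) for i in range(3)]

divergence = grad[0][0] + grad[1][1] + grad[2][2]  # local volumetric expansion
curl = np.stack([
    grad[2][1] - grad[1][2],   # du_z/dy - du_y/dz
    grad[0][2] - grad[2][0],   # du_x/dz - du_z/dx
    grad[1][0] - grad[0][1],   # du_y/dx - du_x/dy
])                              # local rotation of the deformation

print(divergence.mean(), np.abs(curl).max())
```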
A testable theory of problem solving courts: Avoiding past empirical and legal failures.
Wiener, Richard L; Winick, Bruce J; Georges, Leah Skovran; Castro, Anthony
2010-01-01
Recent years have seen a proliferation of problem solving courts designed to rehabilitate certain classes of offenders and thereby resolve the underlying problems that led to their court involvement in the first place. Some commentators have reacted positively to these courts, considering them an extension of the philosophy and logic of Therapeutic Jurisprudence, but others show concern that the discourse surrounding these specialty courts has not examined their process or outcomes critically enough. This paper examines that criticism from historical and social scientific perspectives. The analysis culminates in a model that describes how offenders are likely to respond to the process as they engage in problem solving court programs and the ways in which those courts might impact subsequent offender conduct. This Therapeutic Jurisprudence model of problem solving courts draws heavily on social cognitive psychology and more specifically on theories of procedural justice, motivation, and anticipated emotion to offer an explanation of how offenders respond to these programs. We offer this model as a lens through which social scientists can begin to address the concern that there is not enough critical analysis of the process and outcome of these courts. Applying this model to specialty courts constitutes an important step in critically examining the contribution of problem solving courts. Copyright © 2010 Elsevier Ltd. All rights reserved.
Cruz-Morales, Pablo; Ramos-Aboites, Hilda E; Licona-Cassani, Cuauhtémoc; Selem-Mójica, Nelly; Mejía-Ponce, Paulina M; Souza-Saldívar, Valeria; Barona-Gómez, Francisco
2017-09-01
Desferrioxamines are hydroxamate siderophores widely conserved in both aquatic and soil-dwelling Actinobacteria. While the genetic and enzymatic bases of siderophore biosynthesis and their transport in model families of this phylum are well understood, evolutionary studies are lacking. Here, we perform a comprehensive desferrioxamine-centric (des genes) phylogenomic analysis, which includes the genomes of six novel strains isolated from an iron- and phosphorus-depleted oasis in the Chihuahuan desert of Mexico. Our analyses reveal previously unnoticed desferrioxamine evolutionary patterns, involving both biosynthetic and transport genes, likely to be related to the chemical diversity of desferrioxamines. The identified patterns were used to postulate experimentally testable hypotheses after phenotypic characterization, including profiling of siderophore production and growth stimulation of co-cultures under iron deficiency. Based on our results, we propose a novel des gene, which we term desG, as responsible for incorporation of phenylacetyl moieties during biosynthesis of previously reported arylated desferrioxamines. Moreover, a genomics-based classification of the siderophore-binding proteins responsible for specific and generalist siderophore assimilation is postulated. This report provides a much-needed evolutionary framework, with specific insights supported by experimental data, to direct the future ecological and functional analysis of desferrioxamines in the environment. © FEMS 2017.
Reliability/Maintainability/Testability Design for Dormancy
1988-05-01
Repairable chip bonding/interconnect process
Bernhardt, Anthony F.; Contolini, Robert J.; Malba, Vincent; Riddle, Robert A.
1997-01-01
A repairable, chip-to-board interconnect process which addresses cost and testability issues in multi-chip modules. This process can be carried out using a chip-on-sacrificial-substrate technique involving laser processing. It avoids the curing/solvent evolution problems encountered in prior approaches, as well as resolving prior plating problems and the requirements for fillets. For repairable high speed chip-to-board connection, transmission lines can be formed on the sides of the chip from chip bond pads, ending in a gull wing at the bottom of the chip for subsequent solder.
The Hubble Web: The Dark Matter Problem and Cosmic Strings
NASA Astrophysics Data System (ADS)
Alexander, Stephon
2009-07-01
I propose a reinterpretation of cosmic dark matter in which a rigid network of cosmic strings formed at the end of inflation. The cosmic strings fulfill three functions: At recombination they provide an accretion mechanism for virializing baryonic and warm dark matter into disks. These cosmic strings survive as configurations which thread spiral and elliptical galaxies leading to the observed flatness of rotation curves and the Tully-Fisher relation. We find a relationship between the rotational velocity of the galaxy and the string tension and discuss the testability of this model.
Lattice of quantum predictions
NASA Astrophysics Data System (ADS)
Drieschner, Michael
1993-10-01
What is the structure of reality? Physics is supposed to answer this question, but a purely empiristic view is not sufficient to explain its ability to do so. Quantum mechanics has forced us to think more deeply about what a physical theory is. There are preconditions every physical theory must fulfill. It has to contain, e.g., rules for empirically testable predictions. Those preconditions give physics a structure that is “a priori” in the Kantian sense. An example is given how the lattice structure of quantum mechanics can be understood along these lines.
Chowdhry, Bhagwan
2011-01-01
I formulate a simple and parsimonious evolutionary model that shows that because most species face a possibility of dying from external factors, called extrinsic mortality in the biology literature, it can simultaneously explain (a) why we discount the future, (b) why we get weaker with age, and (c) why we display risk-aversion. The paper suggests that testable restrictions (across species, across time, or across genders) among time preference, aging, and risk-aversion could be analyzed in a simple framework.
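The link between extrinsic mortality and discounting can be made concrete with a standard survival argument, a sketch of the general idea rather than the paper's full model: under a constant hazard, the probability of being alive to collect a future reward decays exponentially and so acts as a discount factor. The hazard rate below is hypothetical:

```python
import numpy as np

mu = 0.05                      # hypothetical extrinsic mortality hazard per year
t = np.arange(0, 50, 10.0)     # delays in years
discount = np.exp(-mu * t)     # survival probability = effective discount factor
for ti, di in zip(t, discount):
    print(f"t = {ti:4.0f} years -> discount factor {di:.3f}")
```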
NASA Technical Reports Server (NTRS)
VanDyke, Melissa; Godfroy, Tom; Houts, Mike; Dickens, Ricky; Dobson, Chris; Pederson, Kevin; Reid, Bob
1999-01-01
The use of resistance heaters to simulate heat from fission allows extensive development of fission systems to be performed in non-nuclear test facilities, saving time and money. Resistance heated tests on the Module Unfueled Thermal-hydraulic Test (MUTT) article have been performed at the Marshall Space Flight Center. This paper discusses the results of these experiments to date, and describes the additional testing that will be performed. Recommendations related to the design of testable space fission power and propulsion systems are made.
NASA Astrophysics Data System (ADS)
van Dyke, Melissa; Godfroy, Tom; Houts, Mike; Dickens, Ricky; Dobson, Chris; Pederson, Kevin; Reid, Bob; Sena, J. Tom
2000-01-01
The use of resistance heaters to simulate heat from fission allows extensive development of fission systems to be performed in non-nuclear test facilities, saving time and money. Resistance heated tests on the Module Unfueled Thermal-hydraulic Test (MUTT) article have been performed at the Marshall Space Flight Center. This paper discusses the results of these experiments to date, and describes the additional testing that will be performed. Recommendations related to the design of testable space fission power and propulsion systems are made.
Results of 30 kWt Safe Affordable Fission Engine (SAFE-30) primary heat transport testing
NASA Astrophysics Data System (ADS)
Pedersen, Kevin; van Dyke, Melissa; Houts, Mike; Godfroy, Tom; Martin, James; Dickens, Ricky; Williams, Eric; Harper, Roger; Salvil, Pat; Reid, Bob
2001-02-01
The use of resistance heaters to simulate heat from fission allows extensive development of fission systems to be performed in non-nuclear test facilities, saving time and money. Resistance heated tests on the Safe Affordable Fission Engine-30 kilowatt (SAFE-30) test article are being performed at the Marshall Space Flight Center. This paper discusses the results of these experiments to date, and describes the additional testing that will be performed. Recommendations related to the design of testable space fission power and propulsion systems are made.
NASA Technical Reports Server (NTRS)
1990-01-01
The present conference on digital avionics discusses vehicle-management systems, spacecraft avionics, special vehicle avionics, communication/navigation/identification systems, software qualification and quality assurance, launch-vehicle avionics, Ada applications, sensor and signal processing, general aviation avionics, automated software development, design-for-testability techniques, and avionics-software engineering. Also discussed are optical technology and systems, modular avionics, fault-tolerant avionics, commercial avionics, space systems, data buses, crew-station technology, embedded processors and operating systems, AI and expert systems, data links, and pilot/vehicle interfaces.
Gravity: one of the driving forces for evolution.
Volkmann, D; Baluska, F
2006-12-01
Mechanical load is 10^3 times larger for land-living than for water-living organisms. As a consequence, antigravitational material in the form of compound materials, such as lignified cell walls in plants and mineralised bones in animals, occurs preferentially in land-living organisms. Besides cellulose, pectic substances of plant cell walls seem to function as antigravitational material in early phases of plant evolution and development. A testable hypothesis incorporating vesicular recycling processes into the tensegrity concept is proposed for both sensing of gravitational force and responding by production of antigravitational material at the cellular level.
Effect of ram semen extenders and supplements on computer assisted sperm analysis parameters
USDA-ARS?s Scientific Manuscript database
A study evaluated the effects of ram semen extender and extender supplementation on computer assisted sperm analysis (CASA) parameters positively correlated with progressive motility. Semen collected from 5 rams was distributed across treatment combinations consisting of either TRIS citrate (T) or ...
Review of extended producer responsibility: A case study approach.
Gupt, Yamini; Sahay, Samraj
2015-07-01
Principles of extended producer responsibility have been the core of most of the recent policies and legislation dealing with the end-of-life management of recyclable goods. This article presents an exploratory review of 27 cases of extended producer responsibility from developed and developing economies, with and without informal recycling, to ascertain the most important aspects of extended producer responsibility. A comparative analysis of the cases with respect to the role of stakeholders in the upstream and downstream stages of extended producer responsibility has been carried out. Further, the study uses exploratory factor analysis to determine the important aspects of extended producer responsibility in practice, using 13 variables identified from the review. Findings of the comparative analysis reveal that financial responsibility of the producers and separate collecting and recycling agencies contributed significantly to the success of extended producer responsibility-based environmental policies. Regulatory provisions, take-back responsibility and financial flow come out to be the three most important aspects of extended producer responsibility. The presence of an informal sector had a negative impact on the regulatory provisions. The outcomes of this study could serve as a guideline for the design of effective extended producer responsibility-based policies. © The Author(s) 2015.
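Exploratory factor analysis of a case-by-variable matrix like the one described can be sketched as follows; the scores are simulated, and the choice of three factors simply mirrors the three important aspects the study reports:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(7)
# Hypothetical matrix: 27 EPR cases scored on 13 review variables.
X = rng.normal(size=(27, 13))

fa = FactorAnalysis(n_components=3, random_state=0)
fa.fit(X)
print(np.round(fa.components_, 2))  # loadings of the 13 variables on each factor
```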
Conformal standard model, leptogenesis, and dark matter
NASA Astrophysics Data System (ADS)
Lewandowski, Adrian; Meissner, Krzysztof A.; Nicolai, Hermann
2018-02-01
The conformal standard model is a minimal extension of the Standard Model (SM) of particle physics based on the assumed absence of large intermediate scales between the TeV scale and the Planck scale, which incorporates only right-chiral neutrinos and a new complex scalar in addition to the usual SM degrees of freedom, but no other features such as supersymmetric partners. In this paper, we present a comprehensive quantitative analysis of this model, and show that all outstanding issues of particle physics proper can in principle be solved "in one go" within this framework. This includes in particular the stabilization of the electroweak scale, "minimal" leptogenesis and the explanation of dark matter, with a small mass and very weakly interacting Majoron as the dark matter candidate (for which we propose to use the name "minoron"). The main testable prediction of the model is a new and almost sterile scalar boson that would manifest itself as a narrow resonance in the TeV region. We give a representative range of parameter values consistent with our assumptions and with observation.
Neo-Darwinism, the Modern Synthesis and selfish genes: are they of use in physiology?
Noble, Denis
2011-01-01
This article argues that the gene-centric interpretations of evolution, and more particularly the selfish gene expression of those interpretations, form barriers to the integration of physiological science with evolutionary theory. A gene-centred approach analyses the relationships between genotypes and phenotypes in terms of differences (change the genotype and observe changes in phenotype). We now know that, most frequently, this does not correctly reveal the relationships because of extensive buffering by robust networks of interactions. By contrast, understanding biological function through physiological analysis requires an integrative approach in which the activity of the proteins and RNAs formed from each DNA template is analysed in networks of interactions. These networks also include components that are not specified by nuclear DNA. Inheritance is not through DNA sequences alone. The selfish gene idea is not useful in the physiological sciences, since selfishness cannot be defined as an intrinsic property of nucleotide sequences independently of gene frequency, i.e. the ‘success’ in the gene pool that is supposed to be attributable to the ‘selfish’ property. It is not a physiologically testable hypothesis. PMID:21135048
Neo-Darwinism, the modern synthesis and selfish genes: are they of use in physiology?
Noble, Denis
2011-03-01
This article argues that the gene-centric interpretations of evolution, and more particularly the selfish gene expression of those interpretations, form barriers to the integration of physiological science with evolutionary theory. A gene-centred approach analyses the relationships between genotypes and phenotypes in terms of differences (change the genotype and observe changes in phenotype). We now know that, most frequently, this does not correctly reveal the relationships because of extensive buffering by robust networks of interactions. By contrast, understanding biological function through physiological analysis requires an integrative approach in which the activity of the proteins and RNAs formed from each DNA template is analysed in networks of interactions. These networks also include components that are not specified by nuclear DNA. Inheritance is not through DNA sequences alone. The selfish gene idea is not useful in the physiological sciences, since selfishness cannot be defined as an intrinsic property of nucleotide sequences independently of gene frequency, i.e. the 'success' in the gene pool that is supposed to be attributable to the 'selfish' property. It is not a physiologically testable hypothesis.
NASA Astrophysics Data System (ADS)
Park, Hyeran; Nielsen, Wendy; Woodruff, Earl
2014-05-01
This study examined and compared students' understanding of nature of science (NOS) with 521 Grade 8 Canadian and Korean students using a mixed methods approach. The concepts of NOS were measured using a survey that had both quantitative and qualitative elements. Descriptive statistics and a one-way multivariate analysis of variance examined the quantitative data, while a conceptually clustered matrix classified the open-ended responses. The country effect could explain 3-12% of the variance in subjectivity, empirical testability and diverse methods, but it was not significant for the concepts of tentativeness and socio-cultural embeddedness of science. The open-ended responses showed that students believed scientific theories change due to errors or discoveries. Students regarded empirical evidence as undeniable and objective although they acknowledged experiments depend on theories or scientists' knowledge. The open responses revealed that national situations and curriculum content affected their views. For our future democratic citizens to gain scientific literacy, science curricula should include currently acknowledged NOS concepts and should be situated within societal and cultural perspectives.
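The analysis named above, a one-way MANOVA, can be run with statsmodels as follows; the construct scores here are simulated stand-ins, not the study's data:

```python
import numpy as np
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

rng = np.random.default_rng(3)
n = 521  # total sample size reported in the study

# Hypothetical scale scores on three NOS constructs, plus country.
df = pd.DataFrame({
    "subjectivity": rng.normal(3.2, 0.8, n),
    "testability": rng.normal(3.5, 0.7, n),
    "methods": rng.normal(3.4, 0.9, n),
    "country": rng.choice(["Canada", "Korea"], n),
})

# One-way MANOVA: do the construct means differ by country?
fit = MANOVA.from_formula("subjectivity + testability + methods ~ country", data=df)
print(fit.mv_test())
```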
NASA Technical Reports Server (NTRS)
Harper, R. E.; Alger, L. S.; Babikyan, C. A.; Butler, B. P.; Friend, S. A.; Ganska, R. J.; Lala, J. H.; Masotto, T. K.; Meyer, A. J.; Morton, D. P.
1992-01-01
Digital computing systems needed for Army programs such as the Computer-Aided Low Altitude Helicopter Flight Program and the Armored Systems Modernization (ASM) vehicles may be characterized by high computational throughput and input/output bandwidth, hard real-time response, high reliability and availability, and maintainability, testability, and producibility requirements. In addition, such a system should be affordable to produce, procure, maintain, and upgrade. To address these needs, the Army Fault Tolerant Architecture (AFTA) is being designed and constructed under a three-year program comprised of a conceptual study, detailed design and fabrication, and demonstration and validation phases. Described here are the results of the conceptual study phase of the AFTA development. Given here is an introduction to the AFTA program, its objectives, and key elements of its technical approach. A format is designed for representing mission requirements in a manner suitable for first order AFTA sizing and analysis, followed by a discussion of the current state of mission requirements acquisition for the targeted Army missions. An overview is given of AFTA's architectural theory of operation.
Akeredolu, Oore-Ofe; Soma, Prashilla; Kell, Douglas B
2016-01-01
We review the evidence that infectious agents, including those that become dormant within the host, have a major role to play in much of the etiology of rheumatoid arthritis and the inflammation that is its hallmark. This occurs in particular because they can produce cross-reactive (auto-)antigens, as well as potent inflammagens such as lipopolysaccharide that can themselves catalyze further inflammagenesis, including via β-amyloid formation. A series of observables coexist in many chronic, inflammatory diseases as well as rheumatoid arthritis. They include iron dysregulation, hypercoagulability, anomalous morphologies of host erythrocytes, and microparticle formation. Iron dysregulation may be responsible for the periodic regrowth and resuscitation of the dormant bacteria, with concomitant inflammagen production. The present systems biology analysis benefits from the philosophical idea of “coherence,” that reflects the principle that if a series of ostensibly unrelated findings are brought together into a self-consistent narrative, that narrative is thereby strengthened. As such, we provide a coherent and testable narrative for the major involvement of (often dormant) bacteria in rheumatoid arthritis. PMID:27889698
Symmetry in locomotor central pattern generators and animal gaits
NASA Astrophysics Data System (ADS)
Golubitsky, Martin; Stewart, Ian; Buono, Pietro-Luciano; Collins, J. J.
1999-10-01
Animal locomotion is controlled, in part, by a central pattern generator (CPG), which is an intraspinal network of neurons capable of generating a rhythmic output. The spatio-temporal symmetries of the quadrupedal gaits walk, trot and pace lead to plausible assumptions about the symmetries of locomotor CPGs. These assumptions imply that the CPG of a quadruped should consist of eight nominally identical subcircuits, arranged in an essentially unique manner. Here we apply analogous arguments to myriapod CPGs. Analyses based on symmetry applied to these networks lead to testable predictions, including a distinction between primary and secondary gaits, the existence of a new primary gait called `jump', and the occurrence of half-integer wave numbers in myriapod gaits. For bipeds, our analysis also predicts two gaits with the out-of-phase symmetry of the walk and two gaits with the in-phase symmetry of the hop. We present data that support each of these predictions. This work suggests that symmetry can be used to infer a plausible class of CPG network architectures from observed patterns of animal gaits.
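A minimal sketch of the symmetry idea: a ring of identical phase oscillators whose stable neighbour phase lag sets the gait pattern. The ring topology, coupling form, and parameters are illustrative assumptions, not the specific eight-cell architecture derived in the paper:

```python
import numpy as np

rng = np.random.default_rng(4)
n, omega, k = 8, 2 * np.pi, 2.0   # cells, intrinsic frequency, coupling strength
lag = np.pi                        # preferred neighbour phase lag (alternation)
theta = np.arange(n) * lag + rng.normal(0.0, 0.3, n)  # start near the wave

dt = 0.001
for _ in range(20_000):
    left, right = np.roll(theta, 1), np.roll(theta, -1)
    theta += dt * (omega
                   + k * np.sin(left - theta - lag)
                   + k * np.sin(right - theta + lag))

# Steady neighbour phase differences in units of pi: ~1.0 means the
# alternating (trot-like) pattern is maintained around the ring.
print(np.round(np.mod(np.diff(theta), 2 * np.pi) / np.pi, 2))
```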
Friction law and hysteresis in granular materials
Wyart, M.
2017-01-01
The macroscopic friction of particulate materials often weakens as the flow rate is increased, leading to potentially disastrous intermittent phenomena including earthquakes and landslides. We theoretically and numerically study this phenomenon in simple granular materials. We show that velocity weakening, corresponding to a nonmonotonic behavior in the friction law, μ(I), is present even if the dynamic and static microscopic friction coefficients are identical, but disappears for softer particles. We argue that this instability is induced by endogenous acoustic noise, which tends to make contacts slide, leading to faster flow and increased noise. We show that soft spots, or excitable regions in the materials, correspond to rolling contacts that are about to slide, whose density is described by a nontrivial exponent θs. We build a microscopic theory for the nonmonotonicity of μ(I), which also predicts the scaling behavior of acoustic noise, the fraction of sliding contacts χ, and the sliding velocity, in terms of θs. Surprisingly, these quantities have no limit when particles become infinitely hard, as confirmed numerically. Our analysis rationalizes previously unexplained observations and makes experimentally testable predictions. PMID:28811373
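The nonmonotonic friction law can be visualized with a toy curve: the widely used monotonic mu(I) form plus an ad hoc weakening term at small I. This is purely illustrative; the paper derives the nonmonotonicity from endogenous acoustic noise and the exponent theta_s, not from this functional form, and the weakening parameters here are invented:

```python
import numpy as np

mu_s, mu_2, I0 = 0.38, 0.64, 0.28   # typical literature values for mu(I)
a, I1 = 0.04, 1e-3                  # hypothetical weakening amplitude and scale

def mu(I):
    base = mu_s + (mu_2 - mu_s) / (1.0 + I0 / I)   # monotonic mu(I) rheology
    weakening = a / (1.0 + I / I1)                 # ad hoc small-I weakening
    return base + weakening

for I in np.logspace(-5, -1, 9):
    print(f"I = {I:.1e}  mu = {mu(I):.4f}")   # decreases, then rises: nonmonotonic
```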
Coexistence trend contingent to Mediterranean oaks with different leaf habits.
Di Paola, Arianna; Paquette, Alain; Trabucco, Antonio; Mereu, Simone; Valentini, Riccardo; Paparella, Francesco
2017-05-01
In a previous work we developed a mathematical model to explain the co-occurrence of evergreen and deciduous oak groups in the Mediterranean region, regarded as one of the distinctive features of Mediterranean biodiversity. The mathematical analysis showed that a stabilizing mechanism resulting from niche difference (i.e. different water use and water stress tolerance) between groups allows their coexistence at intermediate values of suitable soil water content. A simple formal derivation of the model expresses this hypothesis in a testable form linked uniquely to the actual evapotranspiration of the forest community. In the present work we ascertain whether this simplified conclusion possesses some degree of explanatory power by comparing available data on oak distributions and remotely sensed evapotranspiration (MODIS product) in a large-scale survey embracing the western Mediterranean area. Our findings confirmed the basic assumptions of the model at the large scale addressed, but also revealed asymmetric responses in water use and water stress tolerance between evergreen and deciduous oaks that should be taken into account to increase the understanding of species interactions and, ultimately, improve the capacity of models to explain co-occurrence.
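The testable form asserted above (coexistence tied to evapotranspiration) invites a simple two-sample comparison; a sketch with synthetic values standing in for the MODIS product, chosen only to illustrate the test, not the study's actual numbers:

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(8)
# Hypothetical mean annual evapotranspiration (mm/yr) for grid cells where
# both oak groups co-occur versus cells occupied by a single group.
et_cooccur = rng.normal(650.0, 60.0, 120)
et_single = rng.normal(520.0, 80.0, 200)

t, p = ttest_ind(et_cooccur, et_single, equal_var=False)  # Welch's t-test
print(f"Welch t = {t:.2f}, p = {p:.3g}")
```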
Friction law and hysteresis in granular materials
NASA Astrophysics Data System (ADS)
DeGiuli, E.; Wyart, M.
2017-08-01
The macroscopic friction of particulate materials often weakens as the flow rate is increased, leading to potentially disastrous intermittent phenomena including earthquakes and landslides. We theoretically and numerically study this phenomenon in simple granular materials. We show that velocity weakening, corresponding to a nonmonotonic behavior in the friction law, μ(I), is present even if the dynamic and static microscopic friction coefficients are identical, but disappears for softer particles. We argue that this instability is induced by endogenous acoustic noise, which tends to make contacts slide, leading to faster flow and increased noise. We show that soft spots, or excitable regions in the materials, correspond to rolling contacts that are about to slide, whose density is described by a nontrivial exponent θs. We build a microscopic theory for the nonmonotonicity of μ(I), which also predicts the scaling behavior of acoustic noise, the fraction of sliding contacts χ, and the sliding velocity, in terms of θs. Surprisingly, these quantities have no limit when particles become infinitely hard, as confirmed numerically. Our analysis rationalizes previously unexplained observations and makes experimentally testable predictions.
STOP using just GO: a multi-ontology hypothesis generation tool for high throughput experimentation
2013-01-01
Background: Gene Ontology (GO) enrichment analysis remains one of the most common methods for hypothesis generation from high throughput datasets. However, we believe that researchers strive to test other hypotheses that fall outside of GO. Here, we developed and evaluated a tool for hypothesis generation from gene or protein lists using ontological concepts present in manually curated text that describes those genes and proteins. Results: As a consequence we have developed the method Statistical Tracking of Ontological Phrases (STOP) that expands the realm of testable hypotheses in gene set enrichment analyses by integrating automated annotations of genes to terms from over 200 biomedical ontologies. While not as precise as manually curated terms, we find that the additional enriched concepts have value when coupled with traditional enrichment analyses using curated terms. Conclusion: Multiple ontologies have been developed for gene and protein annotation; by using a dataset of both manually curated GO terms and automatically recognized concepts from curated text we can expand the realm of hypotheses that can be discovered. The web application STOP is available at http://mooneygroup.org/stop/. PMID:23409969
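Enrichment tools of this kind typically score over-representation of a term with a hypergeometric tail probability. A minimal sketch with hypothetical counts (not STOP's specific statistic, which the abstract does not detail):

```python
from scipy.stats import hypergeom

# Given a genome of N genes, K annotated to a term, and a study list of n
# genes of which k carry the term, the over-representation p-value is the
# hypergeometric upper tail P(X >= k).
N, K, n, k = 20000, 150, 300, 12   # hypothetical counts
p_value = hypergeom.sf(k - 1, N, K, n)
print(f"enrichment p = {p_value:.3g}")
```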
Heavy use of equations impedes communication among biologists.
Fawcett, Tim W; Higginson, Andrew D
2012-07-17
Most research in biology is empirical, yet empirical studies rely fundamentally on theoretical work for generating testable predictions and interpreting observations. Despite this interdependence, many empirical studies build largely on other empirical studies with little direct reference to relevant theory, suggesting a failure of communication that may hinder scientific progress. To investigate the extent of this problem, we analyzed how the use of mathematical equations affects the scientific impact of studies in ecology and evolution. The density of equations in an article has a significant negative impact on citation rates, with papers receiving 28% fewer citations overall for each additional equation per page in the main text. Long, equation-dense papers tend to be more frequently cited by other theoretical papers, but this increase is outweighed by a sharp drop in citations from nontheoretical papers (35% fewer citations for each additional equation per page in the main text). In contrast, equations presented in an accompanying appendix do not lessen a paper's impact. Our analysis suggests possible strategies for enhancing the presentation of mathematical models to facilitate progress in disciplines that rely on the tight integration of theoretical and empirical work.
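The reported effect is multiplicative, so the implied citation penalty compounds with equation density; a quick arithmetic sketch of that reading of the 28% figure:

```python
# Roughly 28% fewer citations for each additional equation per page of
# main text, applied multiplicatively across densities.
drop_per_eq_per_page = 0.28
for density in [0, 1, 2, 3]:  # equations per page
    multiplier = (1 - drop_per_eq_per_page) ** density
    print(f"{density} eq/page -> {multiplier:.2f}x expected citations")
```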
A Model-based Health Monitoring and Diagnostic System for the UH-60 Helicopter. Appendix D
NASA Technical Reports Server (NTRS)
Patterson-Hine, Ann; Hindson, William; Sanderfer, Dwight; Deb, Somnath; Domagala, Chuck
2001-01-01
Model-based reasoning techniques hold much promise in providing comprehensive monitoring and diagnostics capabilities for complex systems. We are exploring the use of one of these techniques, which utilizes multi-signal modeling and the TEAMS-RT real-time diagnostic engine, on the UH-60 Rotorcraft Aircrew Systems Concepts Airborne Laboratory (RASCAL) flight research aircraft. We focus on the engine and transmission systems, and acquire sensor data across the 1553 bus as well as by direct analog-to-digital conversion from sensors to the QHuMS (Qualtech health and usage monitoring system) computer. The QHuMS computer uses commercially available components and is rack-mounted in the RASCAL facility. A multi-signal model of the transmission and engine subsystems enables studies of system testability and analysis of the degree of fault isolation available with various instrumentation suites. The model and examples of these analyses will be described and the data architectures enumerated. Flight tests of this system will validate the data architecture and provide real-time flight profiles to be further analyzed in the laboratory.
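Multi-signal models reduce, at their core, to a dependency matrix between failure modes and tests. A minimal sketch of single-fault isolation against such a matrix; the component and test names are hypothetical, and real engines such as TEAMS-RT handle imperfect tests and fault masking that this toy ignores:

```python
import numpy as np

# D[i, j] = 1 if failure mode i is detected by test j (hypothetical matrix).
failure_modes = ["engine_overtemp", "oil_loss", "gearbox_chip", "sensor_bias"]
tests = ["T_oil_press", "T_egt", "T_chip_det", "T_vib"]
D = np.array([[0, 1, 0, 1],
              [1, 0, 0, 1],
              [0, 0, 1, 1],
              [0, 1, 0, 0]])

def isolate(results: dict) -> list:
    """Return failure modes consistent with pass/fail outcomes, assuming
    perfect tests and a single fault."""
    suspects = []
    for i, mode in enumerate(failure_modes):
        fails_explained = all(D[i, tests.index(t)] for t, r in results.items() if r == "fail")
        passes_ok = all(not D[i, tests.index(t)] for t, r in results.items() if r == "pass")
        if fails_explained and passes_ok:
            suspects.append(mode)
    return suspects

print(isolate({"T_egt": "fail", "T_vib": "fail",
               "T_oil_press": "pass", "T_chip_det": "pass"}))
# -> ['engine_overtemp']: the only mode explaining both failed tests.
```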
Pretorius, Etheresia; Akeredolu, Oore-Ofe; Soma, Prashilla; Kell, Douglas B
2017-02-01
We review the evidence that infectious agents, including those that become dormant within the host, have a major role to play in much of the etiology of rheumatoid arthritis and the inflammation that is its hallmark. This occurs in particular because they can produce cross-reactive (auto-)antigens, as well as potent inflammagens such as lipopolysaccharide that can themselves catalyze further inflammagenesis, including via β-amyloid formation. A series of observables coexist in many chronic, inflammatory diseases as well as rheumatoid arthritis. They include iron dysregulation, hypercoagulability, anomalous morphologies of host erythrocytes, and microparticle formation. Iron dysregulation may be responsible for the periodic regrowth and resuscitation of the dormant bacteria, with concomitant inflammagen production. The present systems biology analysis benefits from the philosophical idea of "coherence," that reflects the principle that if a series of ostensibly unrelated findings are brought together into a self-consistent narrative, that narrative is thereby strengthened. As such, we provide a coherent and testable narrative for the major involvement of (often dormant) bacteria in rheumatoid arthritis.
NASA Technical Reports Server (NTRS)
2001-01-01
Qualtech Systems, Inc. developed a complete software system with capabilities of multisignal modeling, diagnostic analysis, run-time diagnostic operations, and intelligent interactive reasoners. Commercially available as the TEAMS (Testability Engineering and Maintenance System) tool set, the software can be used to reveal unanticipated system failures. The TEAMS software package is broken down into four companion tools: TEAMS-RT, TEAMATE, TEAMS-KB, and TEAMS-RDS. TEAMS-RT identifies good, bad, and suspect components in the system in real-time. It reports system health results from onboard tests, and detects and isolates failures within the system, allowing for rapid fault isolation. TEAMATE takes over from where TEAMS-RT left off by intelligently guiding the maintenance technician through the troubleshooting procedure, repair actions, and operational checkout. TEAMS-KB serves as a model management and collection tool. TEAMS-RDS (TEAMS-Remote Diagnostic Server) has the ability to continuously assess a system and isolate any failure in that system or its components, in real time. RDS incorporates TEAMS-RT, TEAMATE, and TEAMS-KB in a large-scale server architecture capable of providing advanced diagnostic and maintenance functions over a network, such as the Internet, with a web browser user interface.
Testable solution of the cosmological constant and coincidence problems
NASA Astrophysics Data System (ADS)
Shaw, Douglas J.; Barrow, John D.
2011-02-01
We present a new solution to the cosmological constant (CC) and coincidence problems in which the observed value of the CC, Λ, is linked to other observable properties of the Universe. This is achieved by promoting the CC from a parameter that must be specified, to a field that can take many possible values. The observed value of Λ ≈ (9.3 Gyr)^-2 [≈10^-120 in Planck units] is determined by a new constraint equation which follows from the application of a causally restricted variation principle. When applied to our visible Universe, the model makes a testable prediction for the dimensionless spatial curvature of Ω_k0 = -0.0056(ζ_b/0.5), where ζ_b ~ 1/2 is a QCD parameter. Requiring that a classical history exist, our model determines the probability of observing a given Λ. The observed CC value, which we successfully predict, is typical within our model even before the effects of anthropic selection are included. When anthropic selection effects are accounted for, we find that the observed coincidence between t_Λ = Λ^-1/2 and the age of the Universe, t_U, is a typical occurrence in our model. In contrast to multiverse explanations of the CC problems, our solution is independent of the choice of a prior weighting of different Λ values and does not rely on anthropic selection effects. Our model includes no unnatural small parameters and does not require the introduction of new dynamical scalar fields or modifications to general relativity, and it can be tested by astronomical observations in the near future.
Larval transport modeling of deep-sea invertebrates can aid the search for undiscovered populations.
Yearsley, Jon M; Sigwart, Julia D
2011-01-01
Many deep-sea benthic animals occur in patchy distributions separated by thousands of kilometres, yet because deep-sea habitats are remote, little is known about their larval dispersal. Our novel method simulates dispersal by combining data from the Argo array of autonomous oceanographic probes, deep-sea ecological surveys, and comparative invertebrate physiology. The predicted particle tracks allow quantitative, testable predictions about the dispersal of benthic invertebrate larvae in the south-west Pacific. In a test case presented here, using non-feeding, non-swimming (lecithotrophic trochophore) larvae of polyplacophoran molluscs (chitons), we show that the likely dispersal pathways in a single generation are significantly shorter than the distances between the three known population centres in our study region. The large-scale density of chiton populations throughout our study region is potentially much greater than present survey data suggest, with intermediate 'stepping stone' populations yet to be discovered. We present a new method that is broadly applicable to studies of the dispersal of deep-sea organisms. This test case demonstrates the power and potential applications of our new method, in generating quantitative, testable hypotheses at multiple levels to solve the mismatch between observed and expected distributions: probabilistic predictions of locations of intermediate populations, potential alternative dispersal mechanisms, and expected population genetic structure. The global Argo data have never previously been used to address benthic biology, and our method can be applied to any non-swimming larvae of the deep-sea, giving information upon dispersal corridors and population densities in habitats that remain intrinsically difficult to assess.
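The dispersal simulation rests on advecting passive particles through a current field. A minimal sketch with a synthetic eddying flow standing in for Argo-derived velocities; the release patch, flow, diffusivity, and durations are all hypothetical:

```python
import numpy as np

rng = np.random.default_rng(5)
n_larvae, days, dt = 500, 30.0, 0.1          # larvae, duration and step (days)

def current(x, y):
    """Synthetic eddying flow (km/day) standing in for real ocean data."""
    u = 10.0 + 5.0 * np.sin(0.05 * y)
    v = 5.0 * np.cos(0.05 * x)
    return u, v

pos = rng.normal(0.0, 5.0, size=(n_larvae, 2))   # release patch (km)
for _ in range(int(days / dt)):
    u, v = current(pos[:, 0], pos[:, 1])
    pos[:, 0] += dt * u                          # forward-Euler advection
    pos[:, 1] += dt * v
    pos += rng.normal(0.0, np.sqrt(dt), size=pos.shape)  # crude eddy diffusivity

print("median dispersal distance (km):", np.median(np.hypot(pos[:, 0], pos[:, 1])))
```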
Evolutionary Perspectives on Genetic and Environmental Risk Factors for Psychiatric Disorders.
Keller, Matthew C
2018-05-07
Evolutionary medicine uses evolutionary theory to help elucidate why humans are vulnerable to disease and disorders. I discuss two different types of evolutionary explanations that have been used to help understand human psychiatric disorders. First, a consistent finding is that psychiatric disorders are moderately to highly heritable, and many, such as schizophrenia, are also highly disabling and appear to decrease Darwinian fitness. Models used in evolutionary genetics to understand why genetic variation exists in fitness-related traits can be used to understand why risk alleles for psychiatric disorders persist in the population. The usual explanation for species-typical adaptations, natural selection, is less useful for understanding individual differences in genetic risk to disorders. Rather, two other types of models, mutation-selection-drift and balancing selection, offer frameworks for understanding why genetic variation in risk to psychiatric (and other) disorders exists, and each makes predictions that are now testable using whole-genome data. Second, species-typical capacities to mount reactions to negative events are likely to have been crafted by natural selection to minimize fitness loss. The pain reaction to tissue damage is almost certainly such an example, but it has been argued that the capacity to experience depressive symptoms such as sadness, anhedonia, crying, and fatigue in the face of adverse life situations may have been crafted by natural selection as well. I review the rationale and strength of evidence for this hypothesis. Evolutionary hypotheses of psychiatric disorders are important not only for offering explanations for why psychiatric disorders exist, but also for generating new, testable hypotheses and understanding how best to design studies and analyze data.
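Mutation-selection balance, one of the models invoked above, has a classic closed form: deleterious alleles recur by mutation at rate mu and are removed by selection of strength s, equilibrating near q = mu/s. A numerical sketch with hypothetical rates:

```python
mu = 1e-5   # per-locus deleterious mutation rate (hypothetical)
s = 0.01    # selection coefficient against carriers (hypothetical)

q = 0.0
for _ in range(50_000):                 # iterate the recursion to equilibrium
    q = q * (1 - s) + mu * (1 - q)      # selection removes, mutation replenishes
print(f"simulated q = {q:.2e}, analytic mu/s = {mu / s:.2e}")
```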
Larval Transport Modeling of Deep-Sea Invertebrates Can Aid the Search for Undiscovered Populations
Yearsley, Jon M.; Sigwart, Julia D.
2011-01-01
Background: Many deep-sea benthic animals occur in patchy distributions separated by thousands of kilometres, yet because deep-sea habitats are remote, little is known about their larval dispersal. Our novel method simulates dispersal by combining data from the Argo array of autonomous oceanographic probes, deep-sea ecological surveys, and comparative invertebrate physiology. The predicted particle tracks allow quantitative, testable predictions about the dispersal of benthic invertebrate larvae in the south-west Pacific. Principal Findings: In a test case presented here, using non-feeding, non-swimming (lecithotrophic trochophore) larvae of polyplacophoran molluscs (chitons), we show that the likely dispersal pathways in a single generation are significantly shorter than the distances between the three known population centres in our study region. The large-scale density of chiton populations throughout our study region is potentially much greater than present survey data suggest, with intermediate ‘stepping stone’ populations yet to be discovered. Conclusions/Significance: We present a new method that is broadly applicable to studies of the dispersal of deep-sea organisms. This test case demonstrates the power and potential applications of our new method, in generating quantitative, testable hypotheses at multiple levels to solve the mismatch between observed and expected distributions: probabilistic predictions of locations of intermediate populations, potential alternative dispersal mechanisms, and expected population genetic structure. The global Argo data have never previously been used to address benthic biology, and our method can be applied to any non-swimming larvae of the deep-sea, giving information upon dispersal corridors and population densities in habitats that remain intrinsically difficult to assess. PMID:21857992
The Assurance Challenges of Advanced Packaging Technologies for Electronics
NASA Technical Reports Server (NTRS)
Sampson, Michael J.
2010-01-01
Advances in microelectronic parts performance are driving towards finer feature sizes, three-dimensional geometries and an ever-increasing number of transistor equivalents that are resulting in increased die sizes and interconnection (I/O) counts. The resultant packaging necessary to provide assemble-ability, environmental protection, testability and interconnection to the circuit board for the active die creates major challenges, particularly for space applications. Traditionally, NASA has used hermetically packaged microcircuits whenever available, but the new demands make hermetic packaging less and less practical at the same time as more and more expensive. Some part types of great interest to NASA designers are currently only available in non-hermetic packaging. It is a far more complex quality and reliability assurance challenge to gain confidence in the long-term survivability and effectiveness of non-hermetic packages than for hermetic ones. Although they may provide more rugged environmental protection than the familiar Plastic Encapsulated Microcircuits (PEMs), the non-hermetic Ceramic Column Grid Array (CCGA) packages that are the focus of this presentation present a unique combination of challenges to assessing their suitability for spaceflight use. The presentation will discuss the bases for these challenges, some examples of the techniques proposed to mitigate them, and a proposed approach to a US MIL specification class for non-hermetic microcircuits suitable for space application, Class Y, to be incorporated into MIL-PRF-38535. It has recently emerged that some major packaging suppliers are offering hermetic area array packages that may offer alternatives to the non-hermetic CCGA styles but have also got their own inspectability and testability issues, which will be briefly discussed in the presentation.
NASA Astrophysics Data System (ADS)
Hirata, N.; Tsuruoka, H.; Yokoi, S.
2013-12-01
The current Japanese national earthquake prediction program emphasizes the importance of modeling as well as monitoring for a sound scientific development of earthquake prediction research. One major focus of the current program is to move toward creating testable earthquake forecast models. For this purpose, in 2009 we joined the Collaboratory for the Study of Earthquake Predictability (CSEP) and installed, through an international collaboration, the CSEP Testing Centre, an infrastructure to encourage researchers to develop testable models for Japan. We started the Japanese earthquake predictability experiment on November 1, 2009. The experiment consists of 12 categories, with 4 testing classes with different time spans (1 day, 3 months, 1 year, and 3 years) and 3 testing regions called 'All Japan,' 'Mainland,' and 'Kanto.' A total of 160 models, as of August 2013, have been submitted and are currently under the CSEP official suite of tests for evaluating forecast performance. We will present results of prospective forecasting and testing for periods before and after the 2011 Tohoku-oki earthquake. Because seismic activity has changed dramatically since the 2011 event, model performance has been strongly affected. In addition, because of problems with the completeness magnitude of the authorized catalogue, most models did not pass the CSEP consistency tests. We will also discuss retrospective earthquake forecast experiments for aftershocks of the 2011 Tohoku-oki earthquake. Our aim is to describe what has turned out to be the first occasion for setting up a research environment for rigorous earthquake forecasting in Japan.
Kanyi, John; Karwa, Rakhi; Pastakia, Sonak Dinesh; Manji, Imran; Manyara, Simon; Saina, Collins
2017-05-01
HIV-infected patients are at an increased risk of developing venous thromboembolism (VTE), and minimal data are available to describe the need for extended treatment. To evaluate the frequency of and determine predictive risk factors for extended anticoagulation of VTE in HIV-infected patients in rural, western Kenya. A retrospective chart review was conducted at the Anticoagulation Monitoring Service affiliated with Moi Teaching and Referral Hospital and the Academic Model Providing Access to Healthcare. Data were collected on patients who were HIV-infected and receiving anticoagulation for lower-limb deep vein thrombosis. The need for extended anticoagulation, defined as receiving ≥7 months of warfarin therapy, was established based on patient symptoms or Doppler ultrasound-confirmed diagnosis. Evaluation of the secondary outcomes utilized a univariate analysis to identify risk factors associated with extended anticoagulation. A total of 71 patients were included in the analysis; 27 patients (38%) required extended anticoagulation. The univariate analysis showed a statistically significant association between the need for extended anticoagulation and achieving a therapeutic international normalized ratio within 21 days in both the unadjusted and adjusted analysis. Patients with a history of opportunistic infections required an extended duration of anticoagulation in the adjusted analysis: odds ratio = 3.42; 95% CI = 1.04-11.32; P = 0.04. This study shows that there may be a need for increased duration of anticoagulation in HIV-infected patients, with a need to address the issue of long-term management. Guideline recommendations are needed to address the complexity of treatment issues in this population.
Zhang, Yuji
2015-01-01
Molecular networks act as the backbone of molecular activities within cells, offering a unique opportunity to better understand the mechanisms of disease. While network data usually constitute only static network maps, integrating them with time course gene expression information can provide clues to the dynamic features of these networks and unravel the mechanistic driver genes characterizing cellular responses. Time course gene expression data allow us to broadly "watch" the dynamics of the system. However, one challenge in the analysis of such data is to establish and characterize the interplay among genes that are altered at different time points in the context of a biological process or functional category. Integrative analysis of these data sources will lead us to a more complete understanding of how biological entities (e.g., genes and proteins) coordinately perform their biological functions in biological systems. In this paper, we introduce a novel network-based approach to extract functional knowledge from time-dependent biological processes at a system level using time course mRNA sequencing data in zebrafish embryo development. The proposed method was applied to investigate 1α,25(OH)2D3-altered mechanisms in zebrafish embryo development, using a public zebrafish time course mRNA-Seq dataset containing two different treatments along four time points. We constructed networks between gene ontology biological process categories that were enriched in differentially expressed genes between consecutive time points and different conditions. The temporal propagation of 1α,25-dihydroxyvitamin D3-altered transcriptional changes started from a few genes that were initially altered at earlier stages and spread to large groups of biologically coherent genes at later stages. The most notable biological processes included neuronal and retinal development and generalized stress response. In addition, we investigated the relationship among biological processes enriched in co-expressed genes under different conditions; these included translation elongation, nucleosome assembly, and retina development. These network dynamics provide new insights into the impact of 1α,25-dihydroxyvitamin D3 treatment on bone and cartilage development. We developed a network-based approach to analyzing the DEGs at different time points by integrating molecular interactions and gene ontology information. These results demonstrate that the proposed approach can provide insight into the molecular mechanisms taking place in vertebrate embryo development upon treatment with 1α,25(OH)2D3. Our approach enables the monitoring of biological processes that can serve as a basis for generating new testable hypotheses, and such a network-based integration approach can be easily extended to any temporal- or condition-dependent genomic data analyses.
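A minimal sketch of the core idea: connect GO biological-process categories through the differentially expressed genes they share at each interval. The gene sets, GO terms, and the simple overlap weighting are illustrative stand-ins for the paper's enrichment-based construction.

```python
import networkx as nx

# Hypothetical inputs: DEGs between consecutive time points and a GO
# biological-process annotation map (in the real workflow these come from
# the mRNA-Seq analysis and an annotation database).
degs_by_interval = {
    ("t0", "t1"): {"gene_a", "gene_b"},
    ("t1", "t2"): {"gene_b", "gene_c", "gene_d"},
}
go_annotations = {
    "GO:retina development": {"gene_a", "gene_c"},
    "GO:stress response": {"gene_b", "gene_d"},
}

def go_network(degs_by_interval, go_annotations, min_shared=1):
    """Connect GO categories active in the same interval, weighting each
    edge by the number of interval DEGs annotated to either category --
    a simplified stand-in for proper enrichment testing."""
    g = nx.Graph()
    for interval, degs in degs_by_interval.items():
        hits = {term: genes & degs for term, genes in go_annotations.items()}
        active = [t for t, h in hits.items() if len(h) >= min_shared]
        for i, a in enumerate(active):
            for b in active[i + 1:]:
                g.add_edge(a, b, weight=len(hits[a] | hits[b]), interval=interval)
    return g

net = go_network(degs_by_interval, go_annotations)
print(sorted(net.edges(data=True)))
```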
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mata, Pedro; Fuente, Rafael de la; Iglesias, Javier
Iberdrola (Spanish utility) and Iberdrola Ingenieria (its engineering branch) have been developing, during the last two years, the 110% Extended Power Up-rate Project (EPU 110%) for Cofrentes BWR-6. IBERDROLA has an in-house design and licensing reload methodology that has been approved by the Spanish Nuclear Regulatory Authority. This methodology has already been used to perform the nuclear design and the reload licensing analysis for Cofrentes cycles 12 to 14. The methodology has also been applied to develop a significant number of safety analyses of the Cofrentes Extended Power Up-rate, including: Reactor Heat Balance, Core and Fuel Performance, Thermal Hydraulic Stability, ECCS LOCA Evaluation, Transient Analysis, Anticipated Transient Without Scram (ATWS), and Station Blackout (SBO). Since the scope of the licensing process of the Cofrentes Extended Power Up-rate exceeds the range of analysis included in the Cofrentes generic reload licensing process, it has been necessary to extend the applicability of the Cofrentes licensing methodology to the analysis of new transients. This is the case of the TLFW transient. This paper shows the benefits of having an in-house design and licensing methodology and describes the process of extending the applicability of the methodology to the analysis of new transients. The analysis of Total Loss of Feedwater with the Cofrentes RETRAN model is included as an example of this process. (authors)
The Extended Contact Hypothesis: A Meta-Analysis on 20 Years of Research.
Zhou, Shelly; Page-Gould, Elizabeth; Aron, Arthur; Moyer, Anne; Hewstone, Miles
2018-04-01
According to the extended contact hypothesis, knowing that in-group members have cross-group friends improves attitudes toward this out-group. This meta-analysis covers the 20 years of research that currently exists on the extended contact hypothesis, and consists of 248 effect sizes from 115 studies. The aggregate relationship between extended contact and intergroup attitudes was r = .25, 95% confidence interval (CI) = [.22, .27], which reduced to r = .17, 95% CI = [.14, .19] after removing direct friendship's contribution; these results suggest that extended contact's hypothesized relationship to intergroup attitudes is small-to-medium and exists independently of direct friendship. This relationship was larger when extended contact was perceived versus actual, highlighting the importance of perception in extended contact. Current results on extended contact mostly resembled their direct friendship counterparts, suggesting similarity between these contact types. These unique insights about extended contact and its relationship with direct friendship should enrich and spur growth within this literature.
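For readers wanting the arithmetic behind aggregating correlations, here is a minimal fixed-effect pooling via Fisher's z transform. The study-level (r, N) pairs are hypothetical, and the meta-analysis itself used more elaborate procedures than this sketch.

```python
import math

def pooled_correlation(effects):
    """Fixed-effect pooling of correlations via Fisher's z transform:
    each study is weighted by n - 3, the inverse variance of z.
    A simplified sketch, not the paper's full multilevel model."""
    num = den = 0.0
    for r, n in effects:
        z, w = math.atanh(r), n - 3
        num += w * z
        den += w
    z_bar = num / den
    se = math.sqrt(1.0 / den)
    ci = (math.tanh(z_bar - 1.96 * se), math.tanh(z_bar + 1.96 * se))
    return math.tanh(z_bar), ci

# Hypothetical study-level (r, N) pairs.
r_pooled, ci = pooled_correlation([(0.30, 120), (0.22, 340), (0.18, 95)])
print(f"pooled r = {r_pooled:.3f}, 95% CI = [{ci[0]:.3f}, {ci[1]:.3f}]")
```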
Stirling Convertor Extended Operation Testing and Data Analysis at GRC
NASA Technical Reports Server (NTRS)
Cornell, Peggy A.; Lewandowski, Edward J.; Oriti, Salvatore M.; Wilson, Scott D.
2009-01-01
This paper focuses on extended operation testing and data analysis of free-piston Stirling convertors at the NASA Glenn Research Center (GRC). Extended operation testing is essential to the development of radioisotope power systems and their potential use for long duration missions. To document the reliability of the convertors, regular monitoring and analysis of the extended operation data is particularly valuable, allowing us to better understand and quantify the long-life characteristics of the convertors. Further, investigation and comparison of the extended operation data against baseline performance data provides an opportunity for understanding system behavior should any off-nominal performance occur. GRC currently has 14 Stirling convertors under 24-hour unattended extended operation testing, including two operating in the Advanced Stirling Radioisotope Generator Engineering Unit (ASRG-EU). Ten of the 14 Stirling convertors at GRC are Advanced Stirling Convertors (ASC) developed by Sunpower, Incorporated. These are highly efficient (up to >33.5% conversion efficiency), low-mass convertors that have evolved through technologically progressive convertor builds. The remaining four convertors at GRC are Technology Demonstration Convertors (TDC) from Infinia Corporation. They have achieved >27% conversion efficiency and have accumulated over 178,000 of the total 250,622 hours of extended operation currently at GRC. A synopsis of the Stirling convertor extended operation testing and data analysis at NASA GRC is presented in this paper, as well as how this testing has contributed to the Stirling convertor's progression toward flight.
ERIC Educational Resources Information Center
McCurdy, Merilee; Clure, Lynne F.; Bleck, Amanda A.; Schmitz, Stephanie L.
2016-01-01
Spelling is an important skill that is crucial to effective written communication. In this study, brief experimental analysis procedures were used to examine spelling instruction strategies (e.g., whole word correction; word study strategy; positive practice; and cover, copy, and compare) for four students. In addition, an extended analysis was…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fuente, Rafael de la; Iglesias, Javier; Sedano, Pablo G.
IBERDROLA (Spanish utility) and IBERDROLA INGENIERIA (engineering branch) have been developing during the last 2 yr the 110% Extended Power Uprate Project for Cofrentes BWR-6. IBERDROLA has an in-house design and licensing reload methodology that has been approved in advance by the Spanish Nuclear Regulatory Authority. This methodology has been applied to perform the nuclear design and the reload licensing analysis for Cofrentes cycles 12 and 13 and to develop a significant number of safety analyses of the Cofrentes Extended Power Uprate. Because the scope of the licensing process of the Cofrentes Extended Power Uprate exceeds the range of analysis included in the Cofrentes generic reload licensing process, it has been required to extend the applicability of the Cofrentes RETRAN model to the analysis of new transients. This is the case of the total loss of feedwater (TLFW) transient. The content of this paper shows the benefits of having an in-house design and licensing methodology and describes the process to extend the applicability of the Cofrentes RETRAN model to the analysis of new transients, particularly the TLFW transient.
Blood pressure and the contractility of a human leg muscle
Luu, Billy L; Fitzpatrick, Richard C
2013-01-01
These studies investigate the relationships between perfusion pressure, force output and pressor responses for the contracting human tibialis anterior muscle. Eight healthy adults were studied. Changing the height of tibialis anterior relative to the heart was used to control local perfusion pressure. Electrically stimulated tetanic force output was highly sensitive to physiological variations in perfusion pressure showing a proportionate change in force output of 6.5% per 10 mmHg. This perfusion-dependent change in contractility begins within seconds and is reversible with a 53 s time constant, demonstrating a steady-state equilibrium between contractility and perfusion pressure. These stimulated contractions did not produce significant cardiovascular responses, indicating that the muscle pressor response does not play a major role in cardiovascular regulation at these workloads. Voluntary contractions at forces that would require constant motor drive if perfusion pressure had remained constant generated a central pressor response when perfusion pressure was lowered. This is consistent with a larger cortical drive being required to compensate for the lost contractility with lower perfusion pressure. The relationship between contractility and perfusion for this large postural muscle was not different from that of a small hand muscle (adductor pollicis) and it responded similarly to passive peripheral and active central changes in arterial pressure, but extended over a wider operating range of pressures. If we consider that, in a goal-oriented motor task, muscle contractility determines central motor output and the central pressor response, these results indicate that muscle would fatigue twice as fast without a pressor response. From its extent, timing and reversibility we propose a testable hypothesis that this change in contractility arises through contraction- and perfusion-dependent changes in interstitial K+ concentration. PMID:24018946
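The reported numbers (a 6.5% force change per 10 mmHg and a 53 s time constant) suggest a simple first-order model. The sketch below is one toy reading of those figures, not the authors' model; the pressure step and baseline are invented.

```python
import numpy as np

TAU = 53.0            # s, reported recovery time constant
GAIN = 0.065 / 10.0   # fractional force change per mmHg (6.5% per 10 mmHg)

def force_response(p_mmhg, dt=1.0, f0=1.0, p_ref=0.0):
    """First-order model of tetanic force tracking perfusion pressure:
    force relaxes toward a steady state proportional to pressure with a
    53 s time constant."""
    f = np.empty_like(p_mmhg, dtype=float)
    f[0] = f0
    for k in range(1, len(p_mmhg)):
        f_ss = f0 * (1.0 + GAIN * (p_mmhg[k] - p_ref))
        f[k] = f[k - 1] + dt * (f_ss - f[k - 1]) / TAU
    return f

# Step: perfusion pressure drops 20 mmHg at t = 60 s and stays low.
t = np.arange(0, 600, 1.0)
p = np.where(t < 60, 0.0, -20.0)
f = force_response(p)
print(f"force at t=300 s: {f[300]:.3f} (steady state {1 - 0.13:.3f})")
```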
The consciousness state space (CSS)—a unifying model for consciousness and self
Berkovich-Ohana, Aviva; Glicksohn, Joseph
2014-01-01
Every experience, those we are aware of and those we are not, is embedded in a subjective timeline, is tinged with emotion, and inevitably evokes a certain sense of self. Here, we present a phenomenological model for consciousness and selfhood which relates time, awareness, and emotion within one framework. The consciousness state space (CSS) model is a theoretical one. It relies on a broad range of literature, hence has high explanatory and integrative strength, and helps in visualizing the relationship between different aspects of experience. Briefly, it is suggested that all phenomenological states fall into two categories of consciousness, core and extended (CC and EC, respectively). CC supports minimal selfhood that is short of temporal extension, its scope being the here and now. EC supports narrative selfhood, which involves personal identity and continuity across time, as well as memory, imagination and conceptual thought. The CSS is a phenomenological space, created by three dimensions: time, awareness and emotion. Each of the three dimensions is shown to have a dual phenomenological composition, falling within CC and EC. The neural spaces supporting each of these dimensions, as well as CC and EC, are laid out based on the neuroscientific literature. The CSS dynamics include two simultaneous trajectories, one in CC and one in EC, typically antagonistic in normal experiences. However, this characteristic behavior is altered in states in which a person experiences an altered sense of self. Two examples are laid out, flow and meditation. The CSS model creates a broad theoretical framework with explanatory and unificatory power. It constructs a detailed map of the consciousness and selfhood phenomenology, which offers constraints for the science of consciousness. We conclude by outlining several testable predictions raised by the CSS model. PMID:24808870
A semantic web framework to integrate cancer omics data with biological knowledge
2012-01-01
Background The RDF triple provides a simple linguistic means of describing limitless types of information. Triples can be flexibly combined into a unified data source we call a semantic model. Semantic models open new possibilities for the integration of variegated biological data. We use Semantic Web technology to explicate high throughput clinical data in the context of fundamental biological knowledge. We have extended Corvus, a data warehouse which provides a uniform interface to various forms of Omics data, by providing a SPARQL endpoint. With the querying and reasoning tools made possible by the Semantic Web, we were able to explore quantitative semantic models retrieved from Corvus in the light of systematic biological knowledge. Results For this paper, we merged semantic models containing genomic, transcriptomic and epigenomic data from melanoma samples with two semantic models of functional data - one containing Gene Ontology (GO) data, the other, regulatory networks constructed from transcription factor binding information. These two semantic models were created in an ad hoc manner but support a common interface for integration with the quantitative semantic models. Such combined semantic models allow us to pose significant translational medicine questions. Here, we study the interplay between a cell's molecular state and its response to anti-cancer therapy by exploring the resistance of cancer cells to Decitabine, a demethylating agent. Conclusions We were able to generate a testable hypothesis to explain how Decitabine fights cancer - namely, that it targets apoptosis-related gene promoters predominantly in Decitabine-sensitive cell lines, thus conveying its cytotoxic effect by activating the apoptosis pathway. Our research provides a framework whereby similar hypotheses can be developed easily. PMID:22373303
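A minimal sketch of the described pattern: toy RDF triples queried through SPARQL, here using the rdflib library. The namespace, predicates, and methylation values are invented for illustration and are not Corvus's actual schema.

```python
from rdflib import Graph, Literal, Namespace

EX = Namespace("http://example.org/omics/")   # hypothetical namespace

g = Graph()
# Toy quantitative triples of the kind a SPARQL endpoint over Corvus
# might serve (invented genes and values).
g.add((EX.BCL2, EX.promoterMethylation, Literal(0.82)))
g.add((EX.BCL2, EX.inPathway, EX.Apoptosis))
g.add((EX.MITF, EX.promoterMethylation, Literal(0.11)))

# Ask for apoptosis-pathway genes with heavily methylated promoters --
# the flavor of query behind the Decitabine hypothesis described above.
q = """
PREFIX ex: <http://example.org/omics/>
SELECT ?gene ?m WHERE {
  ?gene ex:inPathway ex:Apoptosis ;
        ex:promoterMethylation ?m .
  FILTER (?m > 0.5)
}
"""
for gene, m in g.query(q):
    print(gene, float(m))
```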
McKinstry, Jeffrey L.; Fleischer, Jason G.; Chen, Yanqing; Gall, W. Einar; Edelman, Gerald M.
2016-01-01
Mental imagery occurs “when a representation of the type created during the initial phases of perception is present but the stimulus is not actually being perceived.” How does the capability to perform mental imagery arise? Extending the idea that imagery arises from learned associations, we propose that mental rotation, a specific form of imagery, could arise through the mechanism of sequence learning–that is, by learning to regenerate the sequence of mental images perceived while passively observing a rotating object. To demonstrate the feasibility of this proposal, we constructed a simulated nervous system and embedded it within a behaving humanoid robot. By observing a rotating object, the system learns the sequence of neural activity patterns generated by the visual system in response to the object. After learning, it can internally regenerate a similar sequence of neural activations upon briefly viewing the static object. This system learns to perform a mental rotation task in which the subject must determine whether two objects are identical despite differences in orientation. As with human subjects, the time taken to respond is proportional to the angular difference between the two stimuli. Moreover, as reported in humans, the system fills in intermediate angles during the task, and this putative mental rotation activates the same pathways that are activated when the system views physical rotation. This work supports the proposal that mental rotation arises through sequence learning and the idea that mental imagery aids perception through learned associations, and suggests testable predictions for biological experiments. PMID:27653977
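A toy version of the proposal, which learns a one-step transition from observed frames of a rotating shape and then counts internal regeneration steps until a target view is matched, reproduces the response-time-grows-with-angle signature. The linear-regression learner below is a stand-in for the paper's simulated nervous system.

```python
import numpy as np

STEP = np.deg2rad(10)          # rotation per observed frame

def rotation(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

# "Observe" a rotating object: pairs of successive contour frames.
shape = np.array([[1.0, 0.0], [0.0, 0.5], [-1.0, 0.2]]).T     # 2 x 3 points
X = np.hstack([rotation(k * STEP) @ shape for k in range(36)])
Y = np.hstack([rotation((k + 1) * STEP) @ shape for k in range(36)])
W = Y @ np.linalg.pinv(X)       # learned one-step "mental" transition

def mental_rotation_steps(a, b, tol=1e-3, max_steps=720):
    """Regenerate the learned sequence from view a until it matches view b;
    the step count plays the role of response time, which grows with the
    angular difference as in the human data."""
    cur = a.copy()
    for step in range(max_steps):
        if np.linalg.norm(cur - b) < tol:
            return step
        cur = W @ cur
    return None

for angle in (30, 60, 120):
    target = rotation(np.deg2rad(angle)) @ shape
    print(angle, "deg ->", mental_rotation_steps(shape, target), "internal steps")
```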
NASA Astrophysics Data System (ADS)
Derakhshani, Maaneli
In this thesis, we consider the implications of solving the quantum measurement problem for the Newtonian description of semiclassical gravity. First we review the formalism of the Newtonian description of semiclassical gravity based on standard quantum mechanics---the Schroedinger-Newton theory---and two well-established predictions that come out of it, namely, gravitational 'cat states' and gravitationally-induced wavepacket collapse. Then we review three quantum theories with 'primitive ontologies' that are well known to solve the measurement problem---Schroedinger's many worlds theory, the GRW collapse theory with matter density ontology, and Nelson's stochastic mechanics. We extend the formalisms of these three quantum theories to Newtonian models of semiclassical gravity and evaluate their implications for gravitational cat states and gravitational wavepacket collapse. We find that (1) Newtonian semiclassical gravity based on Schroedinger's many worlds theory is mathematically equivalent to the Schroedinger-Newton theory and makes the same predictions; (2) Newtonian semiclassical gravity based on the GRW theory differs from Schroedinger-Newton only in the use of a stochastic collapse law, but this law allows it to suppress gravitational cat states so as not to be in contradiction with experiment, while allowing for gravitational wavepacket collapse to happen as well; (3) Newtonian semiclassical gravity based on Nelson's stochastic mechanics differs significantly from Schroedinger-Newton, and predicts neither gravitational cat states nor gravitational wavepacket collapse. Considering that gravitational cat states are experimentally ruled out, but gravitational wavepacket collapse is testable in the near future, this implies that only the latter two are viable theories of Newtonian semiclassical gravity and that they can be experimentally tested against each other in future molecular interferometry experiments that are anticipated to be capable of testing the gravitational wavepacket collapse prediction.
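For reference, the single-particle Schroedinger-Newton system the thesis builds on is standardly written as the coupled pair below (this is the textbook form, not notation from the thesis itself):

```latex
i\hbar \frac{\partial \psi}{\partial t}
  = -\frac{\hbar^{2}}{2m}\nabla^{2}\psi + m\,\Phi\,\psi,
\qquad
\nabla^{2}\Phi = 4\pi G\, m\,\lvert\psi\rvert^{2}
```

The second equation sources the Newtonian potential from the wavefunction's own mass density, which is what produces the nonlinear self-gravitation responsible for the predicted wavepacket collapse.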
The role of prediction in social neuroscience
Brown, Elliot C.; Brüne, Martin
2012-01-01
Research has shown that the brain is constantly making predictions about future events. Theories of prediction in perception, action and learning suggest that the brain serves to reduce the discrepancies between expectation and actual experience, i.e., by reducing the prediction error. Forward models of action and perception propose the generation of a predictive internal representation of the expected sensory outcome, which is matched to the actual sensory feedback. Shared neural representations have been found when experiencing one's own and observing other's actions, rewards, errors, and emotions such as fear and pain. These general principles of the “predictive brain” are well established and have already begun to be applied to social aspects of cognition. The application and relevance of these predictive principles to social cognition are discussed in this article. Evidence is presented to argue that simple non-social cognitive processes can be extended to explain complex cognitive processes required for social interaction, with common neural activity seen for both social and non-social cognitions. A number of studies are included which demonstrate that bottom-up sensory input and top-down expectancies can be modulated by social information. The concept of competing social forward models and a partially distinct category of social prediction errors are introduced. The evolutionary implications of a “social predictive brain” are also mentioned, along with the implications on psychopathology. The review presents a number of testable hypotheses and novel comparisons that aim to stimulate further discussion and integration between currently disparate fields of research, with regard to computational models, behavioral and neurophysiological data. This promotes a relatively new platform for inquiry in social neuroscience with implications in social learning, theory of mind, empathy, the evolution of the social brain, and potential strategies for treating social cognitive deficits. PMID:22654749
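A generic prediction-error minimization loop of the kind such accounts invoke (not a model from this paper) can be sketched in a few lines; the observation sequence and learning rate are invented.

```python
def predictive_update(prior, observations, learning_rate=0.2):
    """Iteratively reduce prediction error: the internal estimate moves a
    fraction of the discrepancy between expectation and input each step,
    the core loop behind 'predictive brain' accounts."""
    estimate, trace = prior, []
    for obs in observations:
        error = obs - estimate          # prediction error
        estimate += learning_rate * error
        trace.append((obs, round(error, 3), round(estimate, 3)))
    return trace

# A surprising cue (e.g., an unexpected facial expression) arrives
# mid-sequence; the prediction error spikes, then is explained away.
for row in predictive_update(0.0, [0.0, 0.0, 1.0, 1.0, 1.0]):
    print(row)
```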
Datta, Deepshikha; Vaidehi, Nagarajan; Floriano, Wely B; Kim, Kwang S; Prasadarao, Nemani V; Goddard, William A
2003-02-01
Escherichia coli, the most common gram-negative bacterium, can penetrate the brain microvascular endothelial cells (BMECs) during the neonatal period to cause meningitis with significant morbidity and mortality. Experimental studies have shown that outer-membrane protein A (OmpA) of E. coli plays a key role in the initial steps of the invasion process by binding to specific sugar moieties present on the glycoproteins of BMEC. These experiments also show that polymers of chitobiose (GlcNAcbeta1-4GlcNAc) block the invasion, while epitopes substituted with the L-fucosyl group do not. We used the HierDock computational technique, which consists of a hierarchy of coarse-grain docking methods with molecular dynamics (MD), to predict the binding sites and energies of interactions of GlcNAcbeta1-4GlcNAc and other sugars with OmpA. The results suggest two important binding sites for the interaction of carbohydrate epitopes of BMEC glycoproteins with OmpA. We identify one site as the binding pocket for chitobiose (GlcNAcbeta1-4GlcNAc) in OmpA, while the second region (including loops 1 and 2) may be important for recognition of specific sugars. We find that the site involving loops 1 and 2 has relative binding energies that correlate well with experimental observations. This theoretical study elucidates the interaction sites of chitobiose with OmpA, and the binding site predictions made in this article are testable either by mutation studies or invasion assays. These results can be further extended toward suggesting possible peptide antagonists and drug designs for therapeutic strategies. Copyright 2002 Wiley-Liss, Inc.
Expanding the role of reactive transport models in critical zone processes
Li, Li; Maher, Kate; Navarre-Sitchler, Alexis; Druhan, Jennifer; Meile, Christof; Lawrence, Corey; Moore, Joel; Perdrial, Julia; Sullivan, Pamela; Thompson, Aaron; Jin, Lixin; Bolton, Edward W.; Brantley, Susan L.; Dietrich, William E.; Mayer, K. Ulrich; Steefel, Carl; Valocchi, Albert J.; Zachara, John M.; Kocar, Benjamin D.; McIntosh, Jennifer; Tutolo, Benjamin M.; Kumar, Mukesh; Sonnenthal, Eric; Bao, Chen; Beisman, Joe
2017-01-01
Models test our understanding of processes and can reach beyond the spatial and temporal scales of measurements. Multi-component Reactive Transport Models (RTMs), initially developed more than three decades ago, have been used extensively to explore the interactions of geothermal, hydrologic, geochemical, and geobiological processes in subsurface systems. Driven by extensive data sets now available from intensive measurement efforts, there is a pressing need to couple RTMs with other community models to explore non-linear interactions among the atmosphere, hydrosphere, biosphere, and geosphere. Here we briefly review the history of RTM development, summarize the current state of RTM approaches, and identify new research directions, opportunities, and infrastructure needs to broaden the use of RTMs. In particular, we envision the expanded use of RTMs in advancing process understanding in the Critical Zone, the veneer of the Earth that extends from the top of vegetation to the bottom of groundwater. We argue that, although parsimonious models are essential at larger scales, process-based models offer tools to explore the highly nonlinear coupling that characterizes natural systems. We present seven testable hypotheses that emphasize the unique capabilities of process-based RTMs for (1) elucidating chemical weathering and its physical and biogeochemical drivers; (2) understanding the interactions among roots, micro-organisms, carbon, water, and minerals in the rhizosphere; (3) assessing the effects of heterogeneity across spatial and temporal scales; and (4) integrating the vast quantity of novel data, including “omics” data (genomics, transcriptomics, proteomics, metabolomics), elemental concentration and speciation data, and isotope data into our understanding of complex earth surface systems. With strong support from data-driven sciences, we are now in an exciting era where integration of RTM framework into other community models will facilitate process understanding across disciplines and across scales.
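As a concrete anchor for what an RTM solves, here is a minimal one-species, one-dimensional advection-dispersion-reaction solver. Real multi-component RTM codes couple many such equations with geochemical speciation; all coefficients here are illustrative.

```python
import numpy as np

def reactive_transport(c0, v=1.0, d=0.05, k=0.3, dx=0.1, dt=0.01, steps=500):
    """Explicit finite-difference solution of
        dc/dt = -v dc/dx + D d2c/dx2 - k c
    i.e. advection, dispersion, and a first-order reaction -- the minimal
    core that multi-component RTM codes generalize to coupled species.
    Stability requires dt <= dx**2 / (2 D) and dt <= dx / v."""
    c = c0.copy()
    for _ in range(steps):
        adv = -v * (c - np.roll(c, 1)) / dx                       # upwind advection
        disp = d * (np.roll(c, -1) - 2 * c + np.roll(c, 1)) / dx**2
        c += dt * (adv + disp - k * c)
        c[0] = 1.0                                                # fixed inlet boundary
    return c

profile = reactive_transport(np.zeros(200))
print("concentration 5 grid lengths downstream:", profile[50].round(4))
```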
A New Tool for Classifying Small Solar System Objects
NASA Astrophysics Data System (ADS)
Desfosses, Ryan; Arel, D.; Walker, M. E.; Ziffer, J.; Harvell, T.; Campins, H.; Fernandez, Y. R.
2011-05-01
An artificial intelligence program, AutoClass, which was developed by NASA's Artificial Intelligence Branch, uses Bayesian classification theory to automatically choose the most probable classification distribution to describe a dataset. To investigate its usefulness to the Planetary Science community, we tested its ability to reproduce the taxonomic classes as defined by Tholen and Barucci (1989). Of the 406 asteroids from the Eight Color Asteroid Survey (ECAS) we chose for our test, 346 were firmly classified and all but 3 (<1%) were classified by Autoclass as they had been in the previous classification system (Walker et al., 2011). We are now applying it to larger datasets to improve the taxonomy of currently unclassified objects. Having demonstrated AutoClass's ability to recreate existing classification effectively, we extended this work to investigations of albedo-based classification systems. To determine how predictive albedo can be, we used data from the Infrared Astronomical Satellite (IRAS) database in conjunction with the large Sloan Digital Sky Survey (SDSS), which contains color and position data for over 200,000 classified and unclassified asteroids (Ivesic et al., 2001). To judge our success we compared our results with a similar approach to classifying objects using IRAS albedo and asteroid color by Tedesco et al. (1989). Understanding the distribution of the taxonomic classes is important to understanding the history and evolution of our Solar System. AutoClass's success in categorizing ECAS, IRAS and SDSS asteroidal data highlights its potential to scan large domains for natural classes in small solar system objects. Based upon our AutoClass results, we intend to make testable predictions about asteroids observed with the Wide-field Infrared Survey Explorer (WISE).
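AutoClass itself is a Bayesian mixture-model classifier; a rough modern analogue (not the AutoClass code) is to fit Gaussian mixtures to color and albedo features and select the number of classes by an information criterion. The features and class parameters below are synthetic.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)

# Toy stand-in for asteroid features: two colour indices plus an albedo,
# drawn from three synthetic "taxonomic classes".
data = np.vstack([
    rng.normal([0.4, 0.1, 0.05], 0.02, (100, 3)),   # dark, C-like
    rng.normal([0.9, 0.4, 0.20], 0.03, (100, 3)),   # S-like
    rng.normal([0.7, 0.2, 0.45], 0.03, (50, 3)),    # high albedo
])

# AutoClass searches for the most probable number of classes; minimizing
# BIC over a range of mixture sizes is a crude stand-in for that search.
models = [GaussianMixture(k, random_state=0).fit(data) for k in range(1, 7)]
best = min(models, key=lambda m: m.bic(data))
print("classes found:", best.n_components)
print("membership counts:", np.bincount(best.predict(data)))
```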
Lehrer, Douglas S; Pato, Michele T; Nahhas, Ramzi W; Miller, Brian R; Malaspina, Dolores; Buckley, Peter F; Sobell, Janet L; Walsh-Messinger, Julie; Genomic Psychiatry Cohort Consortium; Pato, Carlos N
2016-06-01
Advanced paternal age (APA) is a risk factor for schizophrenia (Sz) and bipolar disorder (BP). Putative mechanisms include heritable genetic factors, de novo mutations, and epigenetic mechanisms. Few studies have explored phenotypic features associated with APA. The Genomic Psychiatry Cohort established a clinically characterized repository of genomic samples from subjects with a Sz-BP diagnosis or unaffected controls, 12,975 with parental age information. We estimated relative risk ratios for Sz, schizoaffective depressed and bipolar types (SA-D, SA-B), and BP with and without history of psychotic features (PF) relative to the control group, comparing each paternal age group to the reference group 20-24 years. All tests were two-sided with adjustment for multiple comparisons. Subjects with fathers age 45+ had significantly higher risk for all diagnoses except for BP w/o PF. APA also bore no significant relation to family psychiatric history. In conclusion, we replicated APA as a risk factor for Sz. To our knowledge, this is the first published report of APA in a BP sample stratified by psychosis history, extending this association only in BP w/PF. This suggests that phenotypic expression of the APA effect in Sz-BP spectrum is psychosis, per se, rather than other aspects of these complex disorders. The lack of a significant relationship between paternal age and familial disease patterns suggests that underlying mechanisms of the paternal age effect may involve a complex interaction of heritable and non-heritable factors. The authors discuss implications and testable hypotheses, starting with a focus on genetic mechanisms and endophenotypic expressions of dopaminergic function. © 2015 Wiley Periodicals, Inc.
Tests and consequences of disk plus halo models of gamma-ray burst sources
NASA Technical Reports Server (NTRS)
Smith, I. A.
1995-01-01
The gamma-ray burst observations made by the Burst and Transient Source Experiment (BATSE) and by previous experiments are still consistent with a combined Galactic disk (or Galactic spiral arm) plus extended Galactic halo model. Testable predictions and consequences of the disk plus halo model are discussed here; tests performed on the expanded BATSE database in the future will constrain the allowed model parameters and may eventually rule out the disk plus halo model. Using examples, it is shown that if the halo has an appropriate edge, BATSE will never detect an anisotropic signal from the halo of the Andromeda galaxy. A prediction of the disk plus halo model is that the fraction of the bursts observed to be in the 'disk' population rises as the detector sensitivity improves. A careful reexamination of the numbers of bursts in the two populations for the pre-BATSE databases could rule out this class of models. Similarly, it is predicted that different satellites will observe different relative numbers of bursts in the two classes for any model in which there are two different spatial distributions of the sources, or for models in which there is one spatial distribution of the sources that is sampled to different depths for the two classes. An important consequence of the disk plus halo model is that for the birthrate of the halo sources to be small compared to the birthrate of the disk sources, it is necessary for the halo sources to release many orders of magnitude more energy over their bursting lifetime than the disk sources. The halo bursts must also be much more luminous than the disk bursts; if this disk-halo model is correct, it is necessary to explain why the disk sources do not produce halo-type bursts.
Prediction of gene-phenotype associations in humans, mice, and plants using phenologs.
Woods, John O; Singh-Blom, Ulf Martin; Laurent, Jon M; McGary, Kriston L; Marcotte, Edward M
2013-06-21
Phenotypes and diseases may be related to seemingly dissimilar phenotypes in other species by means of the orthology of underlying genes. Such "orthologous phenotypes," or "phenologs," are examples of deep homology, and may be used to predict additional candidate disease genes. In this work, we develop an unsupervised algorithm for ranking phenolog-based candidate disease genes through the integration of predictions from the k nearest neighbor phenologs, comparing classifiers and weighting functions by cross-validation. We also improve upon the original method by extending the theory to paralogous phenotypes. Our algorithm makes use of additional phenotype data--from chicken, zebrafish, and E. coli, as well as new datasets for C. elegans--establishing that several types of annotations may be treated as phenotypes. We demonstrate the use of our algorithm to predict novel candidate genes for human atrial fibrillation (such as HRH2, ATP4A, ATP4B, and HOPX) and epilepsy (e.g., PAX6 and NKX2-1). We suggest gene candidates for pharmacologically-induced seizures in mouse, solely based on orthologous phenotypes from E. coli. We also explore the prediction of plant gene-phenotype associations, as for the Arabidopsis response to vernalization phenotype. We are able to rank gene predictions for a significant portion of the diseases in the Online Mendelian Inheritance in Man database. Additionally, our method suggests candidate genes for mammalian seizures based only on bacterial phenotypes and gene orthology. We demonstrate that phenotype information may come from diverse sources, including drug sensitivities, gene ontology biological processes, and in situ hybridization annotations. Finally, we offer testable candidates for a variety of human diseases, plant traits, and other classes of phenotypes across a wide array of species.
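The overlap statistic at the core of phenolog detection is a hypergeometric test; a minimal sketch with invented counts is:

```python
from scipy.stats import hypergeom

def phenolog_p(total_orthologs, genes_a, genes_b, overlap):
    """Probability of seeing at least `overlap` shared orthologous genes
    between a phenotype in species A (genes_a genes) and one in species B
    (genes_b genes) by chance -- the hypergeometric test underlying
    phenolog detection."""
    return hypergeom.sf(overlap - 1, total_orthologs, genes_a, genes_b)

# Hypothetical numbers: 3000 shared orthologs, phenotypes with 40 and 60
# associated genes, 9 of them in common.
p = phenolog_p(3000, 40, 60, 9)
print(f"P(overlap >= 9 by chance) = {p:.2e}")
```

Ranking candidate genes then amounts to aggregating such significances across the k nearest neighbor phenologs, as the abstract describes.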
Towards a Liquid Self: How Time, Geography, and Life Experiences Reshape the Biological Identity
Grignolio, Andrea; Mishto, Michele; Faria, Ana Maria Caetano; Garagnani, Paolo; Franceschi, Claudio; Tieri, Paolo
2014-01-01
The conceptualization of immunological self is amongst the most important theories of modern biology, representing a sort of theoretical guideline for experimental immunologists, in order to understand how host constituents are ignored by the immune system (IS). A consistent advancement in this field has been represented by the danger/damage theory and its subsequent refinements, which at present represents the most comprehensive conceptualization of immunological self. Here, we present the new hypothesis of “liquid self,” which integrates and extends the danger/damage theory. The main novelty of the liquid self hypothesis lies in the full integration of the immune response mechanisms into the host body’s ecosystems, i.e., in adding the temporal, as well as the geographical/evolutionary and environmental, dimensions, which we suggested to call “immunological biography.” Our hypothesis takes into account the important biological changes occurring with time (age) in the IS (including immunosenescence and inflammaging), as well as changes in the organismal context related to nutrition, lifestyle, and geography (populations). We argue that such temporal and geographical dimensions impinge upon, and continuously reshape, the antigenicity of physical entities (molecules, cells, bacteria, viruses), making them switching between “self” and “non-self” states in a dynamical, “liquid” fashion. Particular attention is devoted to oral tolerance and gut microbiota, as well as to a new potential source of unexpected self epitopes produced by proteasome splicing. Finally, our framework allows the set up of a variety of testable predictions, the most straightforward suggesting that the immune responses to defined molecules representing potentials antigens will be quantitatively and qualitatively quite different according to the immuno-biographical background of the host. PMID:24782860
Modelling protein functional domains in signal transduction using Maude
NASA Technical Reports Server (NTRS)
Sriram, M. G.
2003-01-01
Modelling of protein-protein interactions in signal transduction is receiving increased attention in computational biology. This paper describes recent research in the application of Maude, a symbolic language founded on rewriting logic, to the modelling of functional domains within signalling proteins. Protein functional domains (PFDs) are a critical focus of modern signal transduction research. In general, Maude models can simulate biological signalling networks and produce specific testable hypotheses at various levels of abstraction. Developing symbolic models of signalling proteins containing functional domains is important because of the potential to generate analyses of complex signalling networks based on structure-function relationships.
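Maude expresses such models as rewrite rules over terms. As a flavor of the idea, here is a toy rewriting system in Python with invented domain and phospho-site names; the real models are written in Maude itself.

```python
# Each rule rewrites a multiset of molecular states, mimicking how a Maude
# specification rewrites terms; the names (SH2 domain, phospho-sites) are
# illustrative, not taken from the paper's models.
RULES = [
    # kinase + receptor with free phospho-site -> phosphorylated receptor
    (frozenset({"kinase", "receptor:Y-free"}),
     frozenset({"kinase", "receptor:Y-P"})),
    # an SH2-domain adaptor docks on the phosphorylated site
    (frozenset({"adaptor:SH2", "receptor:Y-P"}),
     frozenset({"receptor:Y-P:adaptor-bound"})),
]

def rewrite(state, rules, max_steps=10):
    """Apply the first enabled rule until none matches -- a single-path
    simulation; Maude would also support exhaustive search over paths."""
    state = frozenset(state)
    for _ in range(max_steps):
        for lhs, rhs in rules:
            if lhs <= state:
                state = (state - lhs) | rhs
                break
        else:
            return state
    return state

print(sorted(rewrite({"kinase", "receptor:Y-free", "adaptor:SH2"}, RULES)))
```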
Reply to "Comment on 'Quantum time-of-flight distribution for cold trapped atoms'"
NASA Astrophysics Data System (ADS)
Ali, Md. Manirul; Home, Dipankar; Majumdar, A. S.; Pan, Alok K.
2008-02-01
In their comment, Gomes et al. [Phys. Rev. A 77, 026101 (2008)] have questioned the possibility of empirically testable differences existing between the semiclassical time-of-flight distribution for cold trapped atoms and a quantum distribution discussed by us recently [Ali et al., Phys. Rev. A 75, 042110 (2007)]. We argue that their criticism is based on a semiclassical treatment having restricted applicability for a particular trapping potential. Their claim does not preclude, in general, the possibility of differences between the semiclassical calculations and fully quantum results for the arrival time distribution of freely falling atoms.
Repairable chip bonding/interconnect process
Bernhardt, A.F.; Contolini, R.J.; Malba, V.; Riddle, R.A.
1997-08-05
A repairable, chip-to-board interconnect process which addresses cost and testability issues in multi-chip modules is disclosed. This process can be carried out using a chip-on-sacrificial-substrate technique involving laser processing. It avoids the curing/solvent-evolution problems encountered in prior approaches, as well as resolving prior plating problems and the requirement for fillets. For repairable high-speed chip-to-board connection, transmission lines can be formed on the sides of the chip from the chip bond pads, ending in a gull wing at the bottom of the chip for subsequent soldering. 10 figs.
NASA Technical Reports Server (NTRS)
Taylor, S. R.
1984-01-01
The concept that the Moon was fissioned from the Earth after core separation is the most readily testable hypothesis of lunar origin, since direct comparisons of lunar and terrestrial compositions can be made. Differences found in such comparisons introduce so many ad hoc adjustments to the fission hypothesis that it becomes untestable. Further constraints may be obtained from attempting to date the volatile-refractory element fractionation. The combination of chemical and isotopic problems suggests that the fission hypothesis is no longer viable, and separate terrestrial and lunar accretion from a population of fractionated precursor planetesimals provides a more reasonable explanation.
Toolsets Maintain Health of Complex Systems
NASA Technical Reports Server (NTRS)
2010-01-01
First featured in Spinoff 2001, Qualtech Systems Inc. (QSI), of Wethersfield, Connecticut, adapted its Testability, Engineering, and Maintenance System (TEAMS) toolset under Small Business Innovation Research (SBIR) contracts from Ames Research Center to strengthen NASA's systems health management approach for its large, complex, and interconnected systems. Today, six NASA field centers utilize the TEAMS toolset, including TEAMS-Designer, TEAMS-RT, TEAMATE, and TEAMS-RDS. TEAMS is also being used on industrial systems that generate power, carry data, refine chemicals, perform medical functions, and produce semiconductor wafers. QSI finds TEAMS can lower costs by decreasing problems requiring service by 30 to 50 percent.
Quantum probability and quantum decision-making.
Yukalov, V I; Sornette, D
2016-01-13
A rigorous general definition of quantum probability is given, which is valid not only for elementary events but also for composite events, for operationally testable measurements as well as for inconclusive measurements, and also for non-commuting observables in addition to commutative observables. Our proposed definition of quantum probability makes it possible to describe quantum measurements and quantum decision-making on the same common mathematical footing. Conditions are formulated for the case when quantum decision theory reduces to its classical counterpart and for the situation where the use of quantum decision theory is necessary. © 2015 The Author(s).
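For orientation, the standard projection-valued probability that this definition generalizes is the Born rule below; the paper's contribution is the extension to composite, inconclusive, and non-commuting cases.

```latex
p(A_n) = \operatorname{Tr}\!\left(\hat{\rho}\,\hat{P}_n\right),
\qquad \sum_n p(A_n) = 1
```

Here ρ̂ is the system's statistical operator and P̂_n the projector associated with the elementary event A_n.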
PhyloDet: a scalable visualization tool for mapping multiple traits to large evolutionary trees
Lee, Bongshin; Nachmanson, Lev; Robertson, George; Carlson, Jonathan M.; Heckerman, David
2009-01-01
Summary: Evolutionary biologists are often interested in finding correlations among biological traits across a number of species, as such correlations may lead to testable hypotheses about the underlying function. Because some species are more closely related than others, computing and visualizing these correlations must be done in the context of the evolutionary tree that relates species. In this note, we introduce PhyloDet (short for PhyloDetective), an evolutionary tree visualization tool that enables biologists to visualize multiple traits mapped to the tree. Availability: http://research.microsoft.com/cue/phylodet/ Contact: bongshin@microsoft.com. PMID:19633096
NASA Technical Reports Server (NTRS)
Briggs, Hugh C.
2008-01-01
An error budget is a commonly used tool in the design of complex aerospace systems. It represents system performance requirements in terms of allowable errors and flows these down through a hierarchical structure to lower assemblies and components. The requirements may simply be 'allocated' based upon heuristics or experience, or they may be designed through the use of physics-based models. This paper presents a basis for developing an error budget for models of the system, as opposed to the system itself. The need for model error budgets arises when system models are a principal design agent, as is increasingly common for poorly testable, high-performance space systems.
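A sketch of the flow-down mechanics, assuming the common simplification that error sources are independent so that branches combine by root-sum-square; the budget structure and numbers below are invented.

```python
import math

# Hypothetical flow-down for a pointing-error model budget: leaf terms are
# allocated model errors (arcsec); branches combine children by
# root-sum-square, the usual independence assumption.
budget = {
    "pointing_model": {
        "structure_model": {"thermal_distortion": 1.2, "jitter": 0.8},
        "optics_model": {"wavefront": 0.5, "alignment": 0.9},
        "sensor_model": 0.6,
    }
}

def rss(node):
    """Roll leaf allocations up the hierarchy via root-sum-square."""
    if isinstance(node, dict):
        return math.sqrt(sum(rss(child) ** 2 for child in node.values()))
    return node

total = rss(budget["pointing_model"])
print(f"modelled system error: {total:.2f} arcsec")   # compare to requirement
```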
ADMiER-ing thin but complex fluids
NASA Astrophysics Data System (ADS)
McDonnell, Amarin G.; Bhattacharjee, Pradipto K.; Pan, Sharadwata; Hill, David; Danquah, Michael K.; Friend, James R.; Yeo, Leslie Y.; Prabhakar, Ranganathan
2011-12-01
The Acoustics Driven Microfluidic Extensional Rheometer (ADMiER) utilises microlitre volumes of liquid, with viscosities as low as that of water, to create valid and observable extensional flows, in this case liquid bridges that pinch off due to capillary forces. ADMiER allows the study of fluids that have been beyond the reach of conventional methods, as well as of more subtle fluid properties. We can observe polymeric fluids with solvent viscosities far below those previously testable, accentuating elastic effects. It has also enabled the testing of aqueous solutions of living motile particles, which significantly change fluid properties, opening up the potential for diagnostic applications.
The need for theory to guide concussion research.
Molfese, Dennis L
2015-01-01
Although research into concussion has greatly expanded over the past decade, progress in identifying the mechanisms and consequences of head injury and recovery are largely absent. Instead, data are accumulated without the guidance of a systematic theory to direct research questions or generate testable hypotheses. As part of this special issue on sports concussion, I advance a theory that emphasizes changes in spatial and temporal distributions of the brain's neural networks during normal learning and the disruptions of these networks following injury. Specific predictions are made regarding both the development of the network as well as its breakdown following injury.
Multiple transitions and HIV risk among orphaned Kenyan schoolgirls.
Mojola, Sanyu A
2011-03-01
Why are orphaned girls at particular risk of acquiring HIV infection? Using a transition-to-adulthood framework, this study employs qualitative data from Nyanza Province, Kenya, to explore pathways to HIV risk among orphaned and nonorphaned high-school girls. It shows how simultaneous processes such as leaving their parental home, negotiating financial access, and relationship transitions interact to produce disproportionate risk for orphaned girls. The role of financial provision and parental love in modifying girls' trajectories to risk are also explored. A testable theoretical model is proposed based on the qualitative findings, and policy implications are suggested.
Multiple Transitions and HIV Risk among African School Girls
Mojola, Sanyu A
2012-01-01
Why are orphaned girls at particular risk of contracting HIV? Using a transition to adulthood framework, this paper uses qualitative data from Nyanza province, Kenya to explore pathways to HIV risk among orphaned and non-orphaned high school girls. I show how co-occurring processes such as residential transition out of the parental home, negotiating financial access and relationship transitions interact to produce disproportionate risk for orphan girls. I also explore the role of financial provision and parental love in modifying girls’ trajectories to risk. I propose a testable theoretical model based on the qualitative findings and suggest policy implications. PMID:21500699
Superstitiousness in obsessive-compulsive disorder
Brugger, Peter; Viaud-Delmon, Isabelle
2010-01-01
It has been speculated that superstitiousness and obsessive-compulsive disorder (OCD) exist along a continuum. The distinction between superstitious behavior and superstitious belief, however, is crucial for any theoretical account of claimed associations between superstitiousness and OCD. By demonstrating that there is a dichotomy between behavior and belief, which is experimentally testable, we can differentiate superstitious behavior from superstitious belief, or magical ideation. Different brain circuits are responsible for these two forms of superstitiousness; thus, determining which type of superstition is prominent in the symptomatology of an individual patient may inform us about the primarily affected neurocognitive systems. PMID:20623929
A Review of Diagnostic Techniques for ISHM Applications
NASA Technical Reports Server (NTRS)
Patterson-Hine, Ann; Biswas, Gautam; Aaseng, Gordon; Narasimhan, Sriam; Pattipati, Krishna
2005-01-01
System diagnosis is an integral part of any Integrated System Health Management application. Diagnostic applications make use of system information from the design phase, such as safety and mission assurance analysis, failure modes and effects analysis, hazards analysis, functional models, fault propagation models, and testability analysis. In modern process control and equipment monitoring systems, topological and analytic models of the nominal system, derived from design documents, are also employed for fault isolation and identification. Depending on the complexity of the monitored signals from the physical system, diagnostic applications may involve straightforward trending and feature extraction techniques to retrieve the parameters of importance from the sensor streams. They may also involve very complex analysis routines, such as signal processing, learning, or classification methods, to derive the parameters of importance to diagnosis. The process that is used to diagnose anomalous conditions from monitored system signals varies widely across the different approaches to system diagnosis. Rule-based expert systems, case-based reasoning systems, model-based reasoning systems, learning systems, and probabilistic reasoning systems are examples of the many diverse approaches to diagnostic reasoning. Many engineering disciplines have specific approaches to modeling, monitoring, and diagnosing anomalous conditions. Therefore, there is no "one-size-fits-all" approach to building diagnostic and health monitoring capabilities for a system. For instance, the conventional approaches to diagnosing failures in rotorcraft applications are very different from those used in communications systems. Further, online and offline automated diagnostic applications are integrated into an operations framework with flight crews, flight controllers and maintenance teams. While the emphasis of this paper is automation of health management functions, striking the correct balance between automated and human-performed tasks is a vital concern.
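The dependency-model style of diagnosis described above can be reduced to a small, concrete sketch. In the snippet below, a matrix records which tests can detect which failure modes, and observed test outcomes prune the candidate set; all names and matrix entries are invented for illustration, and real ISHM tools derive such matrices from a fault propagation model rather than writing them by hand:

    # Hypothetical sketch of dependency-matrix (D-matrix) fault isolation.
    # Rows are failure modes, columns are tests; D[i][j] = 1 means test j
    # can detect failure mode i. Assumes a single fault and perfect tests.
    FAILURE_MODES = ["pump_leak", "valve_stuck", "sensor_bias"]
    TESTS = ["pressure_check", "flow_check", "temp_check"]
    D = [
        [1, 1, 0],  # pump_leak
        [0, 1, 0],  # valve_stuck
        [0, 0, 1],  # sensor_bias
    ]

    def isolate(observed):
        """observed maps test name -> True (failed) / False (passed)."""
        candidates = set(FAILURE_MODES)
        for j, test in enumerate(TESTS):
            detecting = {FAILURE_MODES[i] for i in range(len(D)) if D[i][j]}
            if observed[test]:
                candidates &= detecting   # a failed test implicates its modes
            else:
                candidates -= detecting   # a passed test exonerates them
        return candidates

    print(isolate({"pressure_check": True, "flow_check": True, "temp_check": False}))
    # -> {'pump_leak'}, the only mode consistent with the observed outcomes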
Analysis of electric vehicle extended range misalignment based on rigid-flexible dynamics
NASA Astrophysics Data System (ADS)
Xu, Xiaowei; Lv, Mingliang; Chen, Zibo; Ji, Wei; Gao, Ruiceng
2017-04-01
The safety of the extended-range electric vehicle is seriously affected by misalignment faults. This paper therefore analyzes extended-range electric vehicle misalignment based on rigid-flexible dynamics. By comprehensively applying rigid-flexible hybrid modeling together with machinery fault-diagnosis methods, a rigid-flexible hybrid mechanical model of the range extender was established using the software ADAMS and ANSYS. Shafting misalignment was simulated by setting the relevant parameters, and the failure phenomenon, spectrum, and evolution rules were analyzed. It is concluded that the 0.5x and 1x harmonics can be considered the characteristic parameters for misalignment diagnostics of the extended-range electric vehicle.
A systematic survey of the integration of animal behavior into conservation
Berger-Tal, Oded; Blumstein, Daniel T.; Carroll, Scott; Fisher, Robert N.; Mesnick, Sarah L.; Owen, Megan A.; Saltz, David; St. Claire, Colleen Cassady; Swaisgood, Ronald R.
2016-01-01
The role of behavioral ecology in improving wildlife conservation and management has been the subject of much recent debate. We aim to answer two foundational questions about the current use of behavioral knowledge in conservation: 1. To what extent is behavioral knowledge used in wildlife conservation and management? 2. How does the use of behavior differ among conservation fields in both frequency and types of use? To answer these questions, we searched the literature for intersections between key fields of animal behavior and conservation biology and created a systematic ‘heat’ map to visualize relative efforts. Our analysis challenges previous suggestions that there is little association between the fields of behavioral ecology and conservation and reveals tremendous variation in the use of different behaviors in conservation. For instance, some behaviors, such as foraging and dispersal, are commonly considered, whereas others, such as learning, social, or anti-predatory behaviors, are hardly considered. Our analysis suggests that in many cases awareness of the importance of behavior does not translate into applicable management tools. We recommend that researchers focus on developing research in underutilized intersections of behavior and conservation themes for which preliminary work shows a potential for improving conservation and management, on translating behavioral theory into applicable and testable predictions, and on creating systematic reviews to summarize the behavioral evidence within the behavior-conservation intersections for which many studies exist.
Argasinski, K; Broom, M
2013-10-01
In the standard approach to evolutionary games and replicator dynamics, differences in fitness can be interpreted as an excess from the mean Malthusian growth rate in the population. In the underlying reasoning, related to an analysis of "costs" and "benefits", there is a silent assumption that fitness can be described in some type of units. However, in most cases these units of measure are not explicitly specified. Then the question arises: are these theories testable? How can we measure "benefit" or "cost"? A natural language, useful for describing and justifying comparisons of strategic "cost" versus "benefits", is the terminology of demography, because the basic events that shape the outcome of natural selection are births and deaths. In this paper, we present the consequences of an explicit analysis of births and deaths in an evolutionary game theoretic framework. We will investigate different types of mortality pressures, their combinations and the possibility of trade-offs between mortality and fertility. We will show that within this new approach it is possible to model how strictly ecological factors such as density dependence and additive background fitness, which seem neutral in classical theory, can affect the outcomes of the game. We consider the example of the Hawk-Dove game, and show that when reformulated in terms of our new approach new details and new biological predictions are produced.
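For context, the classical baseline the authors reformulate looks like the sketch below: Hawk-Dove payoffs fed into the replicator equation, with fitness treated as an abstract payoff rather than explicit births and deaths. Parameter values are illustrative, and the paper's demographic decomposition into mortality and fertility is not reproduced here:

    import numpy as np

    # Classical Hawk-Dove replicator dynamics. V = resource value,
    # C = cost of escalated fights (V < C, so the mixed equilibrium
    # sits at a Hawk frequency of V/C). Values are illustrative.
    V, C = 2.0, 4.0
    payoff = np.array([[(V - C) / 2, V],       # Hawk vs (Hawk, Dove)
                       [0.0,         V / 2]])  # Dove vs (Hawk, Dove)

    x = 0.1                    # initial fraction of Hawks
    dt, steps = 0.01, 20000
    for _ in range(steps):
        p = np.array([x, 1.0 - x])
        fitness = payoff @ p                # expected payoffs of Hawk, Dove
        mean_fitness = p @ fitness
        x += dt * x * (fitness[0] - mean_fitness)   # replicator equation

    print(f"Hawk frequency: {x:.3f} (analytic ESS at V/C = {V/C:.3f})")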
Schneider, Sebastian; Provasi, Davide; Filizola, Marta
2016-11-22
Substantial attention has recently been devoted to G protein-biased agonism of the μ-opioid receptor (MOR) as an ideal new mechanism for the design of analgesics devoid of serious side effects. However, designing opioids with appropriate efficacy and bias is challenging because it requires an understanding of the ligand binding process and of the allosteric modulation of the receptor. Here, we investigated these phenomena for TRV-130, a G protein-biased MOR small-molecule agonist that has been shown to exert analgesia with less respiratory depression and constipation than morphine and that is currently being evaluated in human clinical trials for acute pain management. Specifically, we carried out multimicrosecond, all-atom molecular dynamics (MD) simulations of the binding of this ligand to the activated MOR crystal structure. Analysis of >50 μs of these MD simulations provides insights into the energetically preferred binding pathway of TRV-130 and its stable pose at the orthosteric binding site of MOR. Information transfer from the TRV-130 binding pocket to the intracellular region of the receptor was also analyzed, and was compared to a similar analysis carried out on the receptor bound to the classical unbiased agonist morphine. Taken together, these studies lead to a series of testable hypotheses of ligand-receptor interactions that are expected to inform the structure-based design of improved opioid analgesics.
Sticky tunes: how do people react to involuntary musical imagery?
Williamson, Victoria J; Liikkanen, Lassi A; Jakubowski, Kelly; Stewart, Lauren
2014-01-01
The vast majority of people experience involuntary musical imagery (INMI) or 'earworms'; perceptions of spontaneous, repetitive musical sound in the absence of an external source. The majority of INMI episodes are not bothersome, while some cause disruption ranging from distraction to anxiety and distress. To date, little is known about how the majority of people react to INMI, in particular whether evaluation of the experience impacts on chosen response behaviours or if attempts at controlling INMI are successful or not. The present study classified 1046 reports of how people react to INMI episodes. Two laboratories in Finland and the UK conducted an identical qualitative analysis protocol on reports of INMI reactions and derived visual descriptive models of the outcomes using grounded theory techniques. Combined analysis carried out across the two studies confirmed that many INMI episodes were considered neutral or pleasant, with passive acceptance and enjoyment being among the most popular response behaviours. A significant number of people, however, reported on attempts to cope with unwanted INMI. The most popular and effective behaviours in response to INMI were seeking out the tune in question, and musical or verbal distraction. The outcomes of this study contribute to our understanding of the aetiology of INMI, in particular within the framework of memory theory, and present testable hypotheses for future research on successful INMI coping strategies.
Sticky Tunes: How Do People React to Involuntary Musical Imagery?
Williamson, Victoria J.; Liikkanen, Lassi A.; Jakubowski, Kelly; Stewart, Lauren
2014-01-01
The vast majority of people experience involuntary musical imagery (INMI) or ‘earworms’; perceptions of spontaneous, repetitive musical sound in the absence of an external source. The majority of INMI episodes are not bothersome, while some cause disruption ranging from distraction to anxiety and distress. To date, little is known about how the majority of people react to INMI, in particular whether evaluation of the experience impacts on chosen response behaviours or if attempts at controlling INMI are successful or not. The present study classified 1046 reports of how people react to INMI episodes. Two laboratories in Finland and the UK conducted an identical qualitative analysis protocol on reports of INMI reactions and derived visual descriptive models of the outcomes using grounded theory techniques. Combined analysis carried out across the two studies confirmed that many INMI episodes were considered neutral or pleasant, with passive acceptance and enjoyment being among the most popular response behaviours. A significant number of people, however, reported on attempts to cope with unwanted INMI. The most popular and effective behaviours in response to INMI were seeking out the tune in question, and musical or verbal distraction. The outcomes of this study contribute to our understanding of the aetiology of INMI, in particular within the framework of memory theory, and present testable hypotheses for future research on successful INMI coping strategies. PMID:24497938
Kwon, Min-Seok; Nam, Seungyoon; Lee, Sungyoung; Ahn, Young Zoo; Chang, Hae Ryung; Kim, Yon Hui; Park, Taesung
2017-01-01
The recent creation of enormous, cancer-related “Big Data” public depositories represents a powerful means for understanding tumorigenesis. However, a consistently accurate system for clinically evaluating single/multi-biomarkers remains lacking, and it has been asserted that oft-failed clinical advancement of biomarkers occurs within the very early stages of biomarker assessment. To address these challenges, we developed a clinically testable, web-based tool, CANcer-specific single/multi-biomarker Evaluation System (CANES), to evaluate biomarker effectiveness, across 2,134 whole transcriptome datasets, from 94,147 biological samples (from 18 tumor types). For user-provided single/multi-biomarkers, CANES evaluates the performance of single/multi-biomarker candidates, based on four classification methods: support vector machine, random forest, neural networks, and classification and regression trees. In addition, CANES offers several advantages over earlier analysis tools, including: 1) survival analysis; 2) evaluation of mature miRNAs as markers for user-defined diagnostic or prognostic purposes; and 3) provision of a “pan-cancer” summary view, based on each single marker. We believe that such “landscape” evaluation of single/multi-biomarkers, for diagnostic/therapeutic/prognostic decision-making, will be highly valuable for the discovery and “repurposing” of existing biomarkers (and their specific targeted therapies), leading to improved patient therapeutic stratification, a key component of targeted therapy success for the avoidance of therapy resistance. PMID:29050243
Using statistical process control to make data-based clinical decisions.
Pfadt, A; Wheeler, D J
1995-01-01
Applied behavior analysis is based on an investigation of variability due to interrelationships among antecedents, behavior, and consequences. This permits testable hypotheses about the causes of behavior to be formulated and the course of treatment to be evaluated empirically. Such information provides corrective feedback for making data-based clinical decisions. This paper considers how a different approach to the analysis of variability, based on the writings of Walter Shewhart and W. Edwards Deming in the area of industrial quality control, helps to achieve similar objectives. Statistical process control (SPC) was developed to implement a process of continual product improvement while achieving compliance with production standards and other requirements for promoting customer satisfaction. SPC involves the use of simple statistical tools, such as histograms and control charts, as well as problem-solving techniques, such as flow charts, cause-and-effect diagrams, and Pareto charts, to implement Deming's management philosophy. These data-analytic procedures can be incorporated into a human service organization to help to achieve its stated objectives in a manner that leads to continuous improvement in the functioning of the clients who are its customers. Examples are provided to illustrate how SPC procedures can be used to analyze behavioral data. Issues related to the application of these tools for making data-based clinical decisions and for creating an organizational climate that promotes their routine use in applied settings are also considered.
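As a concrete example of the simple statistical tools mentioned, the sketch below computes the limits of an individuals (XmR) control chart, one common SPC chart for behavioral count data; the data values are invented (e.g. weekly counts of a target behavior):

    import numpy as np

    # Individuals (XmR) control chart limits. The 2.66 factor converts
    # the average moving range into approximate 3-sigma limits for
    # individual values.
    data = np.array([12, 14, 11, 13, 15, 12, 16, 13, 14, 12], dtype=float)

    center = data.mean()
    mr_bar = np.abs(np.diff(data)).mean()   # average moving range
    ucl = center + 2.66 * mr_bar            # upper control limit
    lcl = center - 2.66 * mr_bar            # lower control limit

    print(f"center={center:.2f}  UCL={ucl:.2f}  LCL={lcl:.2f}")
    for i, x in enumerate(data):
        if x > ucl or x < lcl:
            print(f"point {i} ({x}) signals special-cause variation")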
Zhao, Qi; Liu, Yuanning; Zhang, Ning; Hu, Menghan; Zhang, Hao; Joshi, Trupti; Xu, Dong
2018-01-01
In recent years, an increasing number of studies have reported the presence of plant miRNAs in human samples, which resulted in a hypothesis asserting the existence of plant-derived exogenous microRNA (xenomiR). However, this hypothesis is not widely accepted in the scientific community due to possible sample contamination and small sample sizes lacking rigorous statistical analysis. This study provides a systematic statistical test that can validate (or invalidate) the plant-derived xenomiR hypothesis by analyzing 388 small RNA sequencing datasets from human samples in 11 types of body fluids/tissues. A total of 166 types of plant miRNAs were found in at least one human sample, of which 14 plant miRNAs represented more than 80% of the total plant miRNA abundance in human samples. Plant miRNA profiles were characterized to be tissue-specific in different human samples. Meanwhile, the plant miRNAs identified from the microbiome have an insignificant abundance compared to those from humans, while plant miRNA profiles in human samples were significantly different from those in plants, suggesting that sample contamination is an unlikely reason for all the plant miRNAs detected in human samples. This study also provides a set of testable synthetic miRNAs with isotopes that can be detected in situ after being fed to animals.
Wet Tectonics: A New Planetary Synthesis
NASA Astrophysics Data System (ADS)
Grimm, K. A.
2005-12-01
Most geoscientists (and geoscience textbooks) describe plate tectonics as a `solid-Earth' phenomenon, with fluids playing an important role in discrete geodynamic processes. Across a community of diverse research specialists, the critical role of water is being widely elucidated; however, these diverse studies do not address the fundamental origin and operation of the global plate tectonic phenomenon, and its expressions in planetary geodynamics and geomorphology. The Wet Tectonics hypothesis extends well beyond the plate tectonics paradigm, to constitute a new synthesis of diverse geoscience specializations and self-organizing complexity into a simple, internally consistent and explicitly testable model. The Wet Tectonics hypothesis asserts that Earth's plate tectonic system arose from and is the explicit and dynamic result of water interacting with the hot silicate mantle. The tectosphere is defined as an interactive functional (rather than structural, compositional or rheological) entity, a planetary-scale dynamic system of plate formation, plate motion, and rock/volatile recycling. Earth's tectosphere extends from the base of the asthenosphere to the top of the crust, arising and evolving as a dynamic pattern of organization that creates, orders and perpetuates itself. Earth's tectosphere is energetically-open, materially ajar (steady-state operation may not require sub-asthenospheric inputs; shifts between distinct tectonic modes may result from changes in coupling between the tectosphere and subasthenospheric reservoirs) and chemically-closed (i.e. the tectosphere recycles its own wastes). Water is a fundamental requirement in all of the constituent processes of Earth's tectosphere, including seafloor spreading, slab cooling/subsidence, plate motion, asthenosphere rheology, and subduction (where crustal and volatile recycling occur). As a working hypothesis, we suggest that the dynamic and persistent hydrosphere and tectosphere on planet Earth are fully interdependent and co-evolving phenomena. The concept of autocatalytic hypercycles has been adapted from molecular biology to resolve the apparent paradox of circular causality amongst the coupled phenomena of liquid water oceans and `plate tectonics'. This new planetary synthesis presents fundamental implications for geological, geophysical, Earth system and planetary sciences, as well as novel hypotheses concerning plate drive (gravity sliding ± slab pull), origin of plate tectonics (Hadean, >=4.4 Ga), biogeochemical cycling (balanced global fluxes of water into and out of the tectosphere; is the asthenosphere continuously rehydrated via lateral advection?) and planetary geomorphology (simple contrasts between Mars, Earth and Venus).
Astrobiological Phase Transition: Towards Resolution of Fermi's Paradox
NASA Astrophysics Data System (ADS)
Ćirković, Milan M.; Vukotić, Branislav
2008-12-01
Can astrophysics explain Fermi’s paradox or the “Great Silence” problem? If available, such an explanation would be advantageous over most of those suggested in the literature, which rely on unverifiable cultural and/or sociological assumptions. We suggest, instead, a general astrobiological paradigm which might offer a physical and empirically testable paradox resolution. Based on the idea of James Annis, we develop a model of an astrobiological phase transition of the Milky Way, based on the concept of the global regulation mechanism(s). The dominant regulation mechanisms, arguably, are γ-ray bursts, whose properties and cosmological evolution are becoming well-understood. Secular evolution of regulation mechanisms leads to the brief epoch of phase transition: from an essentially dead place, with pockets of low-complexity life restricted to planetary surfaces, the Galaxy will, on a short (Fermi-Hart) timescale, become filled with high-complexity life. An observation selection effect explains why we are not, in spite of the very small prior probability, to be surprised at being located in that brief phase of disequilibrium. In addition, we show that, although the phase-transition model may explain the “Great Silence”, it is not supportive of the “contact pessimist” position. To the contrary, the phase-transition model offers a rational motivation for continuation and extension of our present-day Search for ExtraTerrestrial Intelligence (SETI) endeavours. Some of the unequivocal and testable predictions of our model include the decrease of extinction risk in the history of terrestrial life, the absence of any traces of Galactic societies significantly older than human society, complete lack of any extragalactic intelligent signals or phenomena, and the presence of ubiquitous low-complexity life in the Milky Way.
Testable solution of the cosmological constant and coincidence problems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shaw, Douglas J.; Barrow, John D.
2011-02-15
We present a new solution to the cosmological constant (CC) and coincidence problems in which the observed value of the CC, Λ, is linked to other observable properties of the Universe. This is achieved by promoting the CC from a parameter that must be specified, to a field that can take many possible values. The observed value of Λ ≈ (9.3 Gyr)^-2 [≈ 10^-120 in Planck units] is determined by a new constraint equation which follows from the application of a causally restricted variation principle. When applied to our visible Universe, the model makes a testable prediction for the dimensionless spatial curvature of Ω_k0 = -0.0056(ζ_b/0.5), where ζ_b ≈ 1/2 is a QCD parameter. Requiring that a classical history exist, our model determines the probability of observing a given Λ. The observed CC value, which we successfully predict, is typical within our model even before the effects of anthropic selection are included. When anthropic selection effects are accounted for, we find that the observed coincidence between t_Λ = Λ^-1/2 and the age of the Universe, t_U, is a typical occurrence in our model. In contrast to multiverse explanations of the CC problems, our solution is independent of the choice of a prior weighting of different Λ values and does not rely on anthropic selection effects. Our model includes no unnatural small parameters and does not require the introduction of new dynamical scalar fields or modifications to general relativity, and it can be tested by astronomical observations in the near future.
Trevors, J T
2010-06-01
Methods to research the origin of microbial life are limited. However, microorganisms were the first organisms on the Earth capable of cell growth and division, and interactions with their environment, other microbial cells, and eventually with diverse eukaryotic organisms. The origin of microbial life and the supporting scientific evidence are both an enigma and a scientific priority. Numerous hypotheses have been proposed, scenarios imagined, speculations presented in papers, insights shared, and assumptions made without supporting experimentation, which have led to limited progress in understanding the origin of microbial life. The use of the human imagination to envision origin of life events, without the supporting experimentation, observation and independently replicated experiments required for science, is a significant constraint. The challenge remains how to better understand the origin of microbial life using observations and experimental methods as opposed to speculation, assumptions, scenarios, envisioned events and untestable hypotheses. This is not an easy challenge, as experimental design and plausible hypothesis testing are difficult. Since past approaches have been inconclusive in providing evidence for the mechanisms of the origin of microbial life and the manner in which genetic instructions were encoded into DNA/RNA, it is reasonable and logical to propose that progress will be made when testable, plausible hypotheses and methods are used in origin of microbial life research, and the experimental observations are, or are not, reproduced in independent laboratories. These perspectives will be discussed in this article, as well as the possibility that a pre-biotic film preceded a microbial biofilm as a possible micro-location for the origin of microbial cells capable of growth and division. 2010 Elsevier B.V. All rights reserved.
Astrobiological phase transition: towards resolution of Fermi's paradox.
Cirković, Milan M; Vukotić, Branislav
2008-12-01
Can astrophysics explain Fermi's paradox or the "Great Silence" problem? If available, such an explanation would be advantageous over most of those suggested in the literature, which rely on unverifiable cultural and/or sociological assumptions. We suggest, instead, a general astrobiological paradigm which might offer a physical and empirically testable paradox resolution. Based on the idea of James Annis, we develop a model of an astrobiological phase transition of the Milky Way, based on the concept of the global regulation mechanism(s). The dominant regulation mechanisms, arguably, are gamma-ray bursts, whose properties and cosmological evolution are becoming well-understood. Secular evolution of regulation mechanisms leads to the brief epoch of phase transition: from an essentially dead place, with pockets of low-complexity life restricted to planetary surfaces, the Galaxy will, on a short (Fermi-Hart) timescale, become filled with high-complexity life. An observation selection effect explains why we are not, in spite of the very small prior probability, to be surprised at being located in that brief phase of disequilibrium. In addition, we show that, although the phase-transition model may explain the "Great Silence", it is not supportive of the "contact pessimist" position. To the contrary, the phase-transition model offers a rational motivation for continuation and extension of our present-day Search for ExtraTerrestrial Intelligence (SETI) endeavours. Some of the unequivocal and testable predictions of our model include the decrease of extinction risk in the history of terrestrial life, the absence of any traces of Galactic societies significantly older than human society, complete lack of any extragalactic intelligent signals or phenomena, and the presence of ubiquitous low-complexity life in the Milky Way.
Al Yami, Majed S; Kurdi, Sawsan; Abraham, Ivo
2018-01-01
Standard-duration (7-10 days) thromboprophylaxis with low molecular weight heparin, low dose unfractionated heparin, or fondaparinux in hospitalized medically ill patients is associated with ~50% reduction in venous thromboembolism (VTE) risk. However, these patients remain at high risk for VTE post-discharge. The direct oral anticoagulants (DOACs) apixaban, rivaroxaban and betrixaban have been evaluated for extended-duration (30-42 days) thromboprophylaxis in this population. We review the efficacy and safety results from the 3 pivotal trials of extended-duration DOAC thromboprophylaxis in medically ill patients. We performed a meta-analysis of these pivotal trials focusing on six VTE (efficacy) outcomes and three bleeding (safety) outcomes. These results were integrated into a quantitative risk/benefit assessment. The trials evaluating extended-duration DOAC thromboprophylaxis in medically ill patients failed to establish clear efficacy and/or safety signals for each agent. Our meta-analysis shows that, as a class, DOACs have selective and partial extended-duration prophylactic activity in preventing VTE events. However, this is associated with a marked increase in the risk of various bleeding events. The risk/benefit analyses fail to show a consistent net clinical benefit of extended-duration DOAC prophylaxis in medically ill patients. At this time, the evidence of safe and effective extended-duration thromboprophylaxis with DOACs in this population is inconclusive.
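The pooling machinery behind such a meta-analysis can be illustrated with the standard inverse-variance fixed-effect estimator. The three effect sizes and standard errors below are invented placeholders, not the actual results of the trials reviewed:

    import numpy as np

    # Inverse-variance fixed-effect pooling of log risk ratios.
    log_rr = np.array([-0.35, -0.20, -0.28])   # log risk ratio per trial (invented)
    se = np.array([0.15, 0.12, 0.10])          # their standard errors (invented)

    w = 1.0 / se**2                            # inverse-variance weights
    pooled = np.sum(w * log_rr) / np.sum(w)
    pooled_se = 1.0 / np.sqrt(np.sum(w))
    lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se

    print(f"pooled RR = {np.exp(pooled):.2f} "
          f"(95% CI {np.exp(lo):.2f} to {np.exp(hi):.2f})")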
NASA Astrophysics Data System (ADS)
Choirunnisak; Ibrahim, M.; Yuliani
2018-01-01
The purpose of this research was to develop guided inquiry-based learning devices on photosynthesis and respiration matter that are feasible (valid, practical, and effective) for training students' science literacy. This research used the 4D development model and was tested on 15 biology education students (2016 cohort) at the State University of Surabaya using a one-group pretest-posttest design. The learning devices developed include (a) Semester Lesson Plan, (b) Lecture Schedule, (c) Student Activity Sheet, (d) Student Textbook, and (e) a science literacy test instrument. Research data were obtained through validation, observation, test, and questionnaire methods. The results were analyzed descriptively, both quantitatively and qualitatively. Science literacy ability was analyzed by n-gain. The results showed that (a) the learning devices developed were categorized as very valid, (b) learning activities were performed very well, (c) students' science literacy skills improved, with gains in the moderate category, and (d) students responded very positively to the learning. Based on the results of the analysis and discussion, it is concluded that the developed guided inquiry-based learning devices on photosynthesis and respiration matter are feasible for training students' science literacy skills.
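The n-gain referred to here is the normalized gain statistic, with the conventional 0.3/0.7 cut-offs separating low, moderate, and high gains; a minimal sketch with invented pre/post scores:

    # Normalized gain (n-gain) with the conventional category cut-offs.
    def n_gain(pre, post, max_score=100.0):
        return (post - pre) / (max_score - pre)

    def category(g):
        if g >= 0.7:
            return "high"
        if g >= 0.3:
            return "moderate"
        return "low"

    g = n_gain(pre=45.0, post=75.0)               # invented scores
    print(f"n-gain = {g:.2f} ({category(g)})")    # n-gain = 0.55 (moderate)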
Wertz, Annie E; Moya, Cristina
2018-05-30
Despite a shared recognition that the design of the human mind and the design of human culture are tightly linked, researchers in the evolutionary social sciences tend to specialize in understanding one at the expense of the other. The disciplinary boundaries roughly correspond to research traditions that focus more on natural selection and those that focus more on cultural evolution. In this paper, we articulate how two research traditions within the evolutionary social sciences - evolutionary psychology and cultural evolution - approach the study of design. We focus our analysis on the design of cognitive mechanisms that are the result of the interplay of genetic and cultural evolution. We aim to show how the approaches of these two research traditions can complement each other, and provide a framework for developing a wider range of testable hypotheses about cognitive design. To do so, we provide concrete illustrations of how this integrated approach can be used to interrogate cognitive design using examples from our own work on plant and symbolic group boundary cognition. We hope this recognition of different pathways to design will broaden the hypothesis space in the evolutionary social sciences and encourage methodological pluralism in the investigation of the mind. Copyright © 2018 Elsevier B.V. All rights reserved.
From chemical neuroanatomy to an understanding of the olfactory system
Oboti, L.; Peretto, P.; De Marchis, S.; Fasolo, A.
2011-01-01
The olfactory system of mammals is an appropriate model for studying several aspects of neuronal physiology, spanning from the developmental stage to neural network remodelling in the adult brain. Both the morphological and physiological understanding of this system were strongly supported by classical histochemistry. The case of Olfactory Marker Protein (OMP) staining is emblematic: OMP was the first powerful marker for fully differentiated olfactory receptor neurons and a key tool to investigate the dynamic relations between peripheral sensory epithelia and central relay regions, given its presence within olfactory fibers reaching the olfactory bulb (OB). Similarly, the use of thymidine analogues was able to show neurogenesis in an adult mammalian brain far before modern virus-labelling and lipophilic-tracer-based methods. Nowadays, a wealth of new histochemical techniques combining cell and molecular biology approaches is available, making it possible to move from the analysis of chemically identified circuitries to functional research. The study of adult neurogenesis is indeed one of the best examples of this statement. After defining the cell types involved and the basic physiology of this phenomenon in OB plasticity, we can now analyze the role of neurogenesis in well-testable behaviours related to socio-chemical communication in rodents. PMID:22297441
Trade and Transport in Late Roman Syria
NASA Astrophysics Data System (ADS)
Fletcher, Christopher
Despite the relative notoriety and miraculous level of preservation of the Dead Cities of Syria, fundamental questions of economic and subsistence viability remain unanswered. In the 1950s Georges Tchalenko theorized that these sites relied on intensive olive monoculture to mass export olive oil to urban centers. Later excavations discovered widespread cultivation of grains, fruit, and beans which directly contradicted Tchalenko's assertion of sole reliance on oleoculture. However, innumerable olive presses in and around the Dead Cities still speak to a strong tradition of olive production. This thesis tests the logistical viability of olive oil transportation from the Dead Cities to the distant urban centers of Antioch and Apamea. Utilization of Raster GIS and remote sensing data allows for the reconstruction of the physical and social landscapes of Late Roman Syria. Least Cost Analysis techniques produce a quantitative and testable model with which to simulate and evaluate the viability of long distance olive oil trade. This model not only provides a clearer understanding of the nature of long distance trade relationships in Syria, but also provides a model for investigating ancient economic systems elsewhere in the world. Furthermore, this project allows for the generation of new information regarding sites that are currently inaccessible to researchers.
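At its computational core, Least Cost Analysis is a cheapest-path search over a cost raster. The sketch below runs Dijkstra's algorithm over a toy grid; the grid and cell costs are invented, whereas a real analysis would derive them from slope, terrain, and social constraints:

    import heapq

    # Dijkstra's algorithm over a toy cost raster: each cell holds a
    # traversal cost, and the result is the cheapest accumulated cost
    # (including both endpoint cells).
    grid = [
        [1, 1, 2, 9, 1],
        [2, 9, 2, 9, 1],
        [2, 9, 1, 1, 1],
        [1, 1, 1, 9, 1],
    ]

    def least_cost(grid, start, goal):
        rows, cols = len(grid), len(grid[0])
        dist = {start: grid[start[0]][start[1]]}
        heap = [(dist[start], start)]
        while heap:
            d, (r, c) = heapq.heappop(heap)
            if (r, c) == goal:
                return d
            if d > dist.get((r, c), float("inf")):
                continue   # stale queue entry
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols:
                    nd = d + grid[nr][nc]
                    if nd < dist.get((nr, nc), float("inf")):
                        dist[(nr, nc)] = nd
                        heapq.heappush(heap, (nd, (nr, nc)))
        return float("inf")

    print(least_cost(grid, (0, 0), (0, 4)))   # cheapest route around the 9s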
Bridging the gap between system and cell: The role of ultra-high field MRI in human neuroscience.
Turner, Robert; De Haan, Daniel
2017-01-01
The volume of published research at the levels of systems and cellular neuroscience continues to increase at an accelerating rate. At the same time, progress in psychiatric medicine has stagnated and scientific confidence in cognitive psychology research is under threat due to careless analysis methods and underpowered experiments. With the advent of ultra-high field MRI, with submillimeter image voxels, imaging neuroscience holds the potential to bridge the cellular and systems levels. Use of these accurate and precisely localized quantitative measures of brain activity may go far in providing more secure foundations for psychology, and hence for more appropriate treatment and management of psychiatric illness. However, fundamental issues regarding the construction of testable mechanistic models using imaging data require careful consideration. This chapter summarizes the characteristics of acceptable models of brain function and provides concise descriptions of the relevant types of neuroimaging data that have recently become available. Approaches to data-driven experiments and analyses are described that may lead to more realistic conceptions of the competences of neural assemblages, as they vary across the brain's complex neuroanatomy. © 2017 Elsevier B.V. All rights reserved.
cit: hypothesis testing software for mediation analysis in genomic applications.
Millstein, Joshua; Chen, Gary K; Breton, Carrie V
2016-08-01
The challenges of successfully applying causal inference methods include: (i) satisfying underlying assumptions, (ii) limitations in data/models accommodated by the software and (iii) low power of common multiple testing approaches. The causal inference test (CIT) is based on hypothesis testing rather than estimation, allowing the testable assumptions to be evaluated in the determination of statistical significance. A user-friendly software package provides P-values and optionally permutation-based FDR estimates (q-values) for potential mediators. It can handle single and multiple binary and continuous instrumental variables, binary or continuous outcome variables and adjustment covariates. Also, the permutation-based FDR option provides a non-parametric implementation. Simulation studies demonstrate the validity of the cit package and show a substantial advantage of permutation-based FDR over other common multiple testing strategies. The cit open-source R package is freely available from the CRAN website (https://cran.r-project.org/web/packages/cit/index.html) with embedded C++ code that utilizes the GNU Scientific Library, also freely available (http://www.gnu.org/software/gsl/). joshua.millstein@usc.edu Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
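The permutation-based FDR idea can be sketched generically: compare how many p-values fall below a threshold in observed versus permuted data, then convert to q-values. This is a common empirical recipe and is not claimed to be the cit package's exact algorithm; all p-values below are simulated:

    import numpy as np

    rng = np.random.default_rng(0)
    p_obs = np.sort(rng.uniform(size=100) ** 2)   # skewed small = "signal"
    p_perm = rng.uniform(size=(50, 100))          # 50 permutation rounds

    def fdr_at(t):
        null_hits = (p_perm <= t).sum(axis=1).mean()  # expected false hits
        obs_hits = max((p_obs <= t).sum(), 1)         # observed discoveries
        return min(null_hits / obs_hits, 1.0)

    # q-value: smallest estimated FDR over all thresholds >= each p-value
    fdrs = [fdr_at(t) for t in p_obs]
    q = np.minimum.accumulate(fdrs[::-1])[::-1]
    print(np.round(q[:5], 3))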
Language Networks as Models of Cognition: Understanding Cognition through Language
NASA Astrophysics Data System (ADS)
Beckage, Nicole M.; Colunga, Eliana
Language is inherently cognitive and distinctly human. Separating the object of language from the human mind that processes and creates language fails to capture the full language system. Linguistics traditionally has focused on the study of language as a static representation, removed from the human mind. Network analysis has traditionally been focused on the properties and structure that emerge from network representations. Both disciplines could gain from looking at language as a cognitive process. In contrast, psycholinguistic research has focused on the process of language without committing to a representation. However, by considering language networks as approximations of the cognitive system we can take the strength of each of these approaches to study human performance and cognition as related to language. This paper reviews research showcasing the contributions of network science to the study of language. Specifically, we focus on the interplay of cognition and language as captured by a network representation. To this end, we review different types of language network representations before considering the influence of global level network features. We continue by considering human performance in relation to network structure and conclude with theoretical network models that offer potential and testable explanations of cognitive and linguistic phenomena.
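A minimal example of the kind of network representation reviewed here: nodes are words, edges are association links (invented below), and the printed global features, such as clustering and average path length, are the structural properties the review relates to cognitive processing:

    import networkx as nx

    # Toy word-association network with invented links.
    edges = [("dog", "cat"), ("dog", "bone"), ("cat", "mouse"),
             ("mouse", "cheese"), ("bone", "skeleton"), ("cat", "whiskers"),
             ("cheese", "milk"), ("milk", "cow"), ("dog", "leash")]
    G = nx.Graph(edges)

    print("nodes:", G.number_of_nodes(), "edges:", G.number_of_edges())
    print("average clustering:", nx.average_clustering(G))
    print("average path length:", nx.average_shortest_path_length(G))
    print("highest-degree word:", max(G.degree, key=lambda kv: kv[1]))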
Constrained Stochastic Extended Redundancy Analysis.
DeSarbo, Wayne S; Hwang, Heungsun; Stadler Blank, Ashley; Kappe, Eelco
2015-06-01
We devise a new statistical methodology called constrained stochastic extended redundancy analysis (CSERA) to examine the comparative impact of various conceptual factors, or drivers, as well as the specific predictor variables that contribute to each driver, on designated dependent variable(s). The technical details of the proposed methodology, the maximum likelihood estimation algorithm, and model selection heuristics are discussed. A sports marketing consumer psychology application is provided in a Major League Baseball (MLB) context where the effects of six conceptual drivers of game attendance and their defining predictor variables are estimated. Results compare favorably to those obtained using traditional extended redundancy analysis (ERA).
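To make the extended redundancy analysis skeleton concrete, the sketch below fits the unconstrained ERA idea - each driver is a weighted composite of its own predictors, and the outcome is regressed on the composites - by alternating least squares on synthetic data. The constraints and stochastic maximum-likelihood machinery that distinguish CSERA are not reproduced here:

    import numpy as np

    rng = np.random.default_rng(1)
    n = 200
    X1 = rng.normal(size=(n, 3))    # predictors defining driver 1 (synthetic)
    X2 = rng.normal(size=(n, 2))    # predictors defining driver 2 (synthetic)
    y = X1 @ [0.5, 0.3, 0.0] + X2 @ [0.0, 0.8] + rng.normal(scale=0.5, size=n)

    w1, w2 = np.ones(3), np.ones(2)             # component weights
    for _ in range(200):
        F = np.column_stack([X1 @ w1, X2 @ w2])
        a, *_ = np.linalg.lstsq(F, y, rcond=None)   # path coefficients
        # update each driver's weights against its partial residual
        w1, *_ = np.linalg.lstsq(a[0] * X1, y - a[1] * (X2 @ w2), rcond=None)
        w1 /= np.linalg.norm(X1 @ w1) / np.sqrt(n)  # fix composite scale
        w2, *_ = np.linalg.lstsq(a[1] * X2, y - a[0] * (X1 @ w1), rcond=None)
        w2 /= np.linalg.norm(X2 @ w2) / np.sqrt(n)

    F = np.column_stack([X1 @ w1, X2 @ w2])
    a, *_ = np.linalg.lstsq(F, y, rcond=None)
    r2 = 1 - np.sum((y - F @ a) ** 2) / np.sum((y - y.mean()) ** 2)
    print("path coefficients:", np.round(a, 2), " R^2:", round(r2, 3))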
Massive Open Online Course Completion Rates Revisited: Assessment, Length and Attrition
ERIC Educational Resources Information Center
Jordan, Katy
2015-01-01
This analysis is based upon enrolment and completion data collected for a total of 221 Massive Open Online Courses (MOOCs). It extends previously reported work (Jordan, 2014) with an expanded dataset, adding a multiple regression analysis of factors that affect completion rates and an analysis of attrition rates…
Current challenges in fundamental physics
NASA Astrophysics Data System (ADS)
Egana Ugrinovic, Daniel
The discovery of the Higgs boson at the Large Hadron Collider completed the Standard Model of particle physics. The Standard Model is a remarkably successful theory of fundamental physics, but it suffers from severe problems. It does not provide an explanation for the origin or stability of the electroweak scale nor for the origin and structure of flavor and CP violation. It predicts vanishing neutrino masses, in disagreement with experimental observations. It also fails to explain the matter-antimatter asymmetry of the universe, and it does not provide a particle candidate for dark matter. In this thesis we provide experimentally testable solutions for most of these problems and we study their phenomenology.
Functional test generation for digital circuits described with a declarative language: LUSTRE
NASA Astrophysics Data System (ADS)
Almahrous, Mazen
1990-08-01
A functional approach to the test generation problem starting from a high-level description is proposed. The circuit under test is modeled using the LUSTRE high-level dataflow description language. The different LUSTRE primitives are translated into a SATAN-format graph in order to evaluate the testability of the circuit and to generate test sequences. Another method, for testing complex circuits comprising an operative part and a control part, is also defined. It consists of checking experiments for the control part, observed through the operative part. It was applied to the automata generated from a LUSTRE description of the circuit.
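The flavor of test generation involved can be conveyed by a toy at a much lower level: finding input vectors that distinguish a correct combinational circuit from a faulty copy. The circuit and fault below are invented, and real tools such as the one described work symbolically on the high-level dataflow graph rather than by exhaustive simulation:

    from itertools import product

    # Toy testability exercise: two ANDs feeding an OR, with a
    # stuck-at-0 fault on the output of the first AND.
    def good(a, b, c, d):
        return (a and b) or (c and d)

    def faulty(a, b, c, d):
        return 0 or (c and d)   # the (a and b) term is forced to 0

    tests = [v for v in product((0, 1), repeat=4) if good(*v) != faulty(*v)]
    print("vectors detecting the fault:", tests)
    # Any vector with a=b=1 and (c,d) != (1,1) exposes the fault.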
Shaping Gene Expression by Landscaping Chromatin Architecture: Lessons from a Master.
Sartorelli, Vittorio; Puri, Pier Lorenzo
2018-05-19
Since its discovery as a skeletal muscle-specific transcription factor able to reprogram somatic cells into differentiated myofibers, MyoD has provided an instructive model to understand how transcription factors regulate gene expression. Reciprocally, studies of other transcriptional regulators have provided testable hypotheses to further understand how MyoD activates transcription. Using MyoD as a reference, in this review, we discuss the similarities and differences in the regulatory mechanisms employed by tissue-specific transcription factors to access DNA and regulate gene expression by cooperatively shaping the chromatin landscape within the context of cellular differentiation. Copyright © 2018 Elsevier Inc. All rights reserved.
d’Uva, Teresa Bago; Lindeboom, Maarten; O’Donnell, Owen; van Doorslaer, Eddy
2011-01-01
We propose tests of the two assumptions under which anchoring vignettes identify heterogeneity in reporting of categorical evaluations. Systematic variation in the perceived difference between any two vignette states is sufficient to reject vignette equivalence. Response consistency - the respondent uses the same response scale to evaluate the vignette and herself - is testable given sufficiently comprehensive objective indicators that independently identify response scales. Both assumptions are rejected for reporting of cognitive and physical functioning in a sample of older English individuals, although a weaker test resting on less stringent assumptions does not reject response consistency for cognition. PMID:22184479
Advanced Deployable Structural Systems for Small Satellites
NASA Technical Reports Server (NTRS)
Belvin, W. Keith; Straubel, Marco; Wilkie, W. Keats; Zander, Martin E.; Fernandez, Juan M.; Hillebrandt, Martin F.
2016-01-01
One of the key challenges for small satellites is packaging and reliable deployment of structural booms and arrays used for power, communication, and scientific instruments. The lack of reliable and efficient boom and membrane deployment concepts for small satellites is addressed in this work through a collaborative project between NASA and DLR. The paper provides a state of the art overview on existing spacecraft deployable appendages, the special requirements for small satellites, and initial concepts for deployable booms and arrays needed for various small satellite applications. The goal is to enhance deployable boom predictability and ground testability, develop designs that are tolerant of manufacturing imperfections, and incorporate simple and reliable deployment systems.
Robertson, Scott; Leonhardt, Ulf
2014-11-01
Hawking radiation has become experimentally testable thanks to the many analog systems which mimic the effects of the event horizon on wave propagation. These systems are typically dominated by dispersion and give rise to a numerically soluble and stable ordinary differential equation only if the rest-frame dispersion relation Ω^2(k) is a polynomial of relatively low degree. Here we present a new method for the calculation of wave scattering in a one-dimensional medium of arbitrary dispersion. It views the wave equation as an integral equation in Fourier space, which can be solved using standard and efficient numerical techniques.
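For orientation on what "polynomial of relatively low degree" means in practice, a standard illustrative choice in the analogue-gravity literature (an assumption here, not necessarily the dispersion treated in this paper) is a Corley-Jacobson-type quartic law,

    \[
      \Omega^{2}(k) = c^{2}k^{2}\left(1 \pm \frac{k^{2}}{k_{0}^{2}}\right),
    \]

where k_0 sets the dispersive scale. Each factor of k^2 corresponds to a second spatial derivative, so a degree-n polynomial in k^2 yields an ordinary differential equation of order 2n; once the dispersion is of higher degree or non-polynomial, that route becomes impractical, which is what motivates recasting the problem as an integral equation in Fourier space.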
Proposed experiment to test fundamentally binary theories
NASA Astrophysics Data System (ADS)
Kleinmann, Matthias; Vértesi, Tamás; Cabello, Adán
2017-09-01
Fundamentally binary theories are nonsignaling theories in which measurements of many outcomes are constructed by selecting from binary measurements. They constitute a sensible alternative to quantum theory and have never been directly falsified by any experiment. Here we show that fundamentally binary theories are experimentally testable with current technology. For that, we identify a feasible Bell-type experiment on pairs of entangled qutrits. In addition, we prove that, for any n, quantum n-ary correlations are not fundamentally (n-1)-ary. For that, we introduce a family of inequalities that hold for fundamentally (n-1)-ary theories but are violated by quantum n-ary correlations.
Psychological stress and fibromyalgia: a review of the evidence suggesting a neuroendocrine link
Gupta, Anindya; Silman, Alan J
2004-01-01
The present review attempts to reconcile the dichotomy that exists in the literature in relation to fibromyalgia, in that it is considered either a somatic response to psychological stress or a distinct organically based syndrome. Specifically, the hypothesis explored is that the link between chronic stress and the subsequent development of fibromyalgia can be explained by one or more abnormalities in neuroendocrine function. There are several such abnormalities recognised that both occur as a result of chronic stress and are observed in fibromyalgia. Whether such abnormalities have an aetiologic role remains uncertain but should be testable by well-designed prospective studies. PMID:15142258
The Central Role of Tether-Cutting Reconnection in the Production of CMEs
NASA Technical Reports Server (NTRS)
Moore, Ron; Sterling, Alphonse; Suess, Steve
2007-01-01
This viewgraph presentation describes tether-cutting reconnection in the production of Coronal Mass Ejections (CMEs). The topics include: 1) Birth and Release of the CME Plasmoid; 2) Resulting CME in Outer Corona; 3) Governing Role of Surrounding Field; 4) Testable Prediction of the Standard Scenario Magnetic Bubble CME Model; 5) Lateral Pressure in Outer Corona; 6) Measured Angular Widths of 3 CMEs; 7) LASCO Image of each CME at Final Width; 8) Source of the CME of 2002 May 20; 9) Source of the CME of 1999 Feb 9; 10) Source of the CME of 2003 Nov 4; and 11) Test Results.
NDR proteins: lessons learned from Arabidopsis and animal cells prompt a testable hypothesis.
Mudgil, Yashwanti; Jones, Alan M
2010-08-01
N-myc Down Regulated (NDR) genes were discovered more than fifteen years ago. Indirect evidence supports a role in tumor progression and cellular differentiation, but their biochemical function is still unknown. Our detailed analyses of Arabidopsis NDL proteins show their involvement in altering auxin transport, local auxin gradients, and the expression level of auxin transport proteins. Animal NDL proteins may be involved in membrane recycling of E-cadherin and may act as effectors for small GTPases. In light of these findings, we hypothesize that NDL proteins regulate vesicular trafficking of the auxin transport facilitator PIN proteins by biochemically altering the local lipid environment of PIN proteins.
Developing Cognitive Models for Social Simulation from Survey Data
NASA Astrophysics Data System (ADS)
Alt, Jonathan K.; Lieberman, Stephen
The representation of human behavior and cognition continues to challenge the modeling and simulation community. The use of survey and polling instruments to inform belief states, issue stances, and action choice models provides a compelling means of developing models and simulations with empirical data. Using these types of data to populate social simulations can greatly enhance the feasibility of validation efforts, the reusability of social and behavioral modeling frameworks, and the testable reliability of simulations. We provide a case study demonstrating these effects, document the use of survey data to develop cognitive models, and suggest future paths forward for social and behavioral modeling.
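The pipeline from polling instruments to agent cognition can be caricatured in a few lines. The sketch below rescales a 5-point Likert response into a [0,1] issue stance and turns stance-weighted payoffs into action probabilities via a softmax; every name, payoff, and parameter here is hypothetical, not the authors' actual model:

    import math

    def likert_to_stance(response, scale=5):
        return (response - 1) / (scale - 1)       # 1..5 -> 0..1

    def action_probs(stance, payoffs, temperature=0.5):
        # payoffs: action -> (payoff if supportive, payoff if opposed)
        utils = {a: stance * s + (1 - stance) * o
                 for a, (s, o) in payoffs.items()}
        z = sum(math.exp(u / temperature) for u in utils.values())
        return {a: math.exp(u / temperature) / z for a, u in utils.items()}

    stance = likert_to_stance(4)   # "agree" on a 5-point scale
    print(action_probs(stance, {"support": (1.0, -0.5),
                                "oppose": (-0.5, 1.0),
                                "abstain": (0.1, 0.1)}))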
EELV Booster Assist Options for CEV
NASA Technical Reports Server (NTRS)
McNeal, Curtis, Jr.
2005-01-01
Medium-lift EELVs may still play a role in manned space flight. To be considered for manned flight, medium-lift EELVs must address the shortcomings of their current boost-assist motors. Two options exist: redesign and requalify the solid rocket motors (SRMs), or replace the SRMs with hybrid rocket motors. Hybrid rocket motors are an attractive alternative. They are safer than SRMs. The technology readiness gained through Lockheed Martin's Small Launch Vehicle booster development substantially lowers the development, cost, and schedule risks of developing hybrid boost assist for EELVs. Hybrid boosters' testability offsets SRMs' higher inherent reliability. Hybrid booster development and recurring costs are lower than those of SRMs. Performance gains are readily achieved.
Field-aligned currents and ion convection at high altitudes
NASA Technical Reports Server (NTRS)
Burch, J. L.; Reiff, P. H.
1985-01-01
Hot plasma observations from Dynamics Explorer 1 have been used to investigate solar-wind ion injection, Birkeland currents, and plasma convection at altitudes above 2 earth-radii in the morning sector. The results of the study, along with the antiparallel merging hypothesis, have been used to construct a By-dependent global convection model. A significant element of the model is the coexistence of three types of convection cells (merging cells, viscous cells, and lobe cells). As the IMF direction varies, the model accounts for the changing roles of viscous and merging processes and makes testable predictions about several magnetospheric phenomena, including the newly-observed theta aurora in the polar cap.
Cancer stem cells: impact, heterogeneity, and uncertainty
Magee, Jeffrey A.; Piskounova, Elena; Morrison, Sean J.
2015-01-01
The differentiation of tumorigenic cancer stem cells into non-tumorigenic cancer cells confers heterogeneity to some cancers beyond that explained by clonal evolution or environmental differences. In such cancers, functional differences between tumorigenic and non-tumorigenic cells influence response to therapy and prognosis. However, it remains uncertain whether the model applies to many, or few, cancers due to questions about the robustness of cancer stem cell markers and the extent to which existing assays underestimate the frequency of tumorigenic cells. In cancers with rapid genetic change, reversible changes in cell states, or biological variability among patients, the stem cell model may not be readily testable. PMID:22439924
Emergent quantum mechanics without wavefunctions
NASA Astrophysics Data System (ADS)
Mesa Pascasio, J.; Fussy, S.; Schwabl, H.; Grössing, G.
2016-03-01
We present our model of an Emergent Quantum Mechanics which can be characterized by “realism without pre-determination”. This is illustrated by our analytic description and corresponding computer simulations of Bohmian-like “surreal” trajectories, which are obtained classically, i.e. without the use of any quantum mechanical tool such as wavefunctions. However, these trajectories do not necessarily represent ontological paths of particles but rather mappings of the probability density flux in a hydrodynamical sense. Modelling emergent quantum mechanics in a high-low intensity double slit scenario gives rise to the “quantum sweeper effect” with a characteristic intensity pattern. This phenomenon should be experimentally testable via weak measurement techniques.
Subluxation: dogma or science?
Keating, Joseph C; Charlton, Keith H; Grod, Jaroslaw P; Perle, Stephen M; Sikorski, David; Winterstein, James F
2005-01-01
Subluxation syndrome is a legitimate, potentially testable, theoretical construct for which there is little experimental evidence. Acceptable as hypothesis, the widespread assertion of the clinical meaningfulness of this notion brings ridicule from the scientific and health care communities and confusion within the chiropractic profession. We believe that an evidence-orientation among chiropractors requires that we distinguish between subluxation dogma vs. subluxation as the potential focus of clinical research. We lament efforts to generate unity within the profession through consensus statements concerning subluxation dogma, and believe that cultural authority will continue to elude us so long as we assert dogma as though it were validated clinical theory. PMID:16092955
Cerebrovascular Hemodynamics in Women.
Duque, Cristina; Feske, Steven K; Sorond, Farzaneh A
2017-12-01
Sex and gender, as biological and social factors, significantly influence health outcomes. Among the biological factors, sex differences in vascular physiology may be one specific mechanism contributing to the observed differences in clinical presentation, response to treatment, and clinical outcomes in several vascular disorders. This review focuses on the cerebrovascular bed and summarizes the existing literature on sex differences in cerebrovascular hemodynamics to highlight the knowledge deficit that exists in this domain. The available evidence is used to generate mechanistically plausible and testable hypotheses to underscore the unmet need in understanding sex-specific mechanisms as targets for more effective therapeutic and preventive strategies.
The four hundred years of planetary science since Galileo and Kepler.
Burns, Joseph A
2010-07-29
For 350 years after Galileo's discoveries, ground-based telescopes and theoretical modelling furnished everything we knew about the Sun's planetary retinue. Over the past five decades, however, spacecraft visits to many targets transformed these early notions, revealing the diversity of Solar System bodies and displaying active planetary processes at work. Violent events have punctuated the histories of many planets and satellites, changing them substantially since their birth. Contemporary knowledge has finally allowed testable models of the Solar System's origin to be developed and potential abodes for extraterrestrial life to be explored. Future planetary research should involve focused studies of selected targets, including exoplanets.
NASA Astrophysics Data System (ADS)
Tu, K. M.; Matubayasi, N.; Liang, K. K.; Todorov, I. T.; Chan, S. L.; Chau, P.-L.
2012-08-01
We placed halothane, a general anaesthetic, inside palmitoyloleoylphosphatidylcholine (POPC) bilayers and performed molecular dynamics simulations at atmospheric and raised pressures. We demonstrated that halothane aggregated inside POPC membranes at 20 MPa but not at 40 MPa. The pressure range of aggregation matches that of pressure reversal in whole animals, and strongly suggests that this could be the mechanism for this effect. Combining these results with previous experimental data, we describe a testable hypothesis of how aggregation of general anaesthetics at high pressure can lead to pressure reversal, the effect whereby these drugs lose their efficacy at high pressure.
Loss Aversion and Time-Differentiated Electricity Pricing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spurlock, C. Anna
2015-06-01
I develop a model of loss aversion over electricity expenditure, from which I derive testable predictions for household electricity consumption while on combination time-of-use (TOU) and critical peak pricing (CPP) plans. Testing these predictions results in evidence consistent with loss aversion: (1) spillover effects - positive expenditure shocks resulted in significantly more peak consumption reduction for several weeks thereafter; and (2) clustering - disproportionate probability of consuming such that expenditure would be equal between the TOU-CPP and standard flat-rate pricing structures. This behavior is inconsistent with a purely neoclassical utility model, and has important implications for the application of time-differentiated electricity pricing.
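The kind of reference-dependent utility assumed in such a model can be written down directly. In the sketch below, expenditure above a reference bill is felt LAMBDA times more strongly than an equal saving; the kink at the reference point is what generates clustering at break-even expenditure. The coefficient and bill values are illustrative (2.25 is the Tversky-Kahneman loss-aversion magnitude), not estimates from the paper:

    LAMBDA = 2.25   # loss-aversion coefficient

    def gain_loss_utility(bill, reference_bill):
        diff = reference_bill - bill        # positive = saving (a gain)
        return diff if diff >= 0 else LAMBDA * diff

    for bill in (80, 95, 100, 105, 120):    # reference bill = 100
        print(bill, round(gain_loss_utility(bill, 100), 2))
    # A $5 overrun (-11.25) hurts more than a $5 saving (+5.0) helps,
    # so marginal utility jumps at the reference point.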
Theory analysis of the Dental Hygiene Human Needs Conceptual Model.
MacDonald, L; Bowen, D M
2017-11-01
Theories provide a structural knowing about concept relationships, practice intricacies, and intuitions, and thus shape the distinct body of knowledge of a profession. Capturing ways of knowing and being is essential to any profession's practice, education and research. This process defines the phenomenon of the profession - its existence or experience. Theory evaluation is a systematic, criterion-based assessment of a specific theory. This study presents a theory analysis of the Dental Hygiene Human Needs Conceptual Model (DH HNCM). Using the Walker and Avant Theory Analysis, a seven-step process, the DH HNCM was analysed and evaluated for its meaningfulness and contribution to dental hygiene. The steps include the following: (i) investigate the origins; (ii) examine relationships of the theory's concepts; (iii) assess the logic of the theory's structure; (iv) consider the usefulness to practice; (v) judge the generalizability; (vi) evaluate the parsimony; and (vii) appraise the testability of the theory. Human needs theory in nursing and Maslow's Hierarchy of Needs theory prompted this theory's development. The DH HNCM depicts four concepts based on the paradigm concepts of the profession: client, health/oral health, environment and dental hygiene actions, and includes eleven validated human needs that evolved over time to eight. It is logical and parsimonious, allows scientific prediction and testing, and provides a unique lens for the dental hygiene practitioner. With this model, dental hygienists have entered practice knowing they enable clients to meet their human needs. For the DH HNCM, theory analysis affirmed that the model is reasonable and insightful and adds to the dental hygiene profession's epistemology and ontology. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Hospital electronic medical record enterprise application strategies: do they matter?
Fareed, Naleef; Ozcan, Yasar A; DeShazo, Jonathan P
2012-01-01
Successful implementations and the ability to reap the benefits of electronic medical record (EMR) systems may be correlated with the type of enterprise application strategy that an administrator chooses when acquiring an EMR system. Moreover, identifying the most optimal enterprise application strategy is a task that may have important linkages with hospital performance. This study explored whether hospitals that have adopted differential EMR enterprise application strategies concomitantly differ in their overall efficiency. Specifically, the study examined whether hospitals with a single-vendor strategy had a higher likelihood of being efficient than those with a best-of-breed strategy and whether hospitals with a best-of-suite strategy had a higher probability of being efficient than those with best-of-breed or single-vendor strategies. A conceptual framework was used to formulate testable hypotheses. A retrospective cross-sectional approach using data envelopment analysis was used to obtain efficiency scores of hospitals by EMR enterprise application strategy. A Tobit regression analysis was then used to determine the probability of a hospital being inefficient as related to its EMR enterprise application strategy, while moderating for the hospital's EMR "implementation status" and controlling for hospital and market characteristics. The data envelopment analysis of hospitals suggested that only 32 hospitals were efficient in the study's sample of 2,171 hospitals. The results from the post hoc analysis showed partial support for the hypothesis that hospitals with a best-of-suite strategy were more likely to be efficient than those with a single-vendor strategy. This study underscores the importance of understanding the differences between the three strategies discussed in this article. On the basis of the findings, hospital administrators should consider the efficiency associations that a specific strategy may have compared with another prior to moving toward an enterprise application strategy.
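The study's data envelopment analysis is not reproduced in the abstract; the sketch below shows the general form of an input-oriented CCR DEA efficiency score as a linear program in SciPy, with invented toy data (two inputs, one output, four hospitals). The second-stage Tobit regression is omitted.

```python
# A minimal sketch of the efficiency-scoring step: input-oriented CCR DEA
# solved as a linear program. Toy data only; the study's actual hospital
# inputs and outputs are not reproduced here.
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y, k):
    """Efficiency of unit k. X: (n, m) inputs, Y: (n, s) outputs.
    Decision variables [theta, lambda_1..lambda_n]; minimize theta."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.r_[1.0, np.zeros(n)]
    # Inputs:  sum_j lambda_j x_ij - theta x_ik <= 0
    A_in = np.c_[-X[k].reshape(m, 1), X.T]
    # Outputs: -sum_j lambda_j y_rj <= -y_rk
    A_out = np.c_[np.zeros((s, 1)), -Y.T]
    res = linprog(c, A_ub=np.r_[A_in, A_out],
                  b_ub=np.r_[np.zeros(m), -Y[k]],
                  bounds=[(0, None)] * (n + 1))
    return res.fun  # theta = 1 means the unit sits on the frontier

X = np.array([[4., 3.], [7., 3.], [8., 1.], [4., 2.]])  # toy inputs
Y = np.array([[1.], [1.], [1.], [1.]])                  # toy output
print([round(dea_ccr_input(X, Y, k), 3) for k in range(4)])
```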
Spaide, Richard F; Curcio, Christine A
2011-09-01
To evaluate the validity of commonly used anatomical designations for the four hyperreflective outer retinal bands seen in current-generation optical coherence tomography, a scale model of outer retinal morphology was created using published information for direct comparison with optical coherence tomography scans. Articles and books concerning histology of the outer retina from 1900 until 2009 were evaluated, and data were used to create a scale model drawing. Boundaries between outer retinal tissue compartments described by the model were compared with intensity variations of representative spectral-domain optical coherence tomography scans using longitudinal reflectance profiles to determine the region of origin of the hyperreflective outer retinal bands. This analysis showed a high likelihood that the spectral-domain optical coherence tomography bands attributed to the external limiting membrane (the first, innermost band) and to the retinal pigment epithelium (the fourth, outermost band) are correctly attributed. Comparative analysis showed that the second band, often attributed to the boundary between inner and outer segments of the photoreceptors, actually aligns with the ellipsoid portion of the inner segments. The third band corresponded to an ensheathment of the cone outer segments by apical processes of the retinal pigment epithelium in a structure known as the contact cylinder. Anatomical attributions and subsequent pathophysiologic assessments pertaining to the second and third outer retinal hyperreflective bands may not be correct. This analysis has identified testable hypotheses for the actual correlates of the second and third bands. Nonretinal pigment epithelium contributions to the fourth band (e.g., Bruch membrane) remain to be determined.
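A longitudinal reflectance profile is simply A-scan intensity as a function of depth; as a hedged illustration, the sketch below builds a synthetic four-band profile and locates the hyperreflective bands as peaks. Band positions, widths, and amplitudes are invented, and real analyses operate on registered spectral-domain OCT B-scans.

```python
# Minimal sketch of a longitudinal reflectance profile with four synthetic
# hyperreflective bands; all band parameters are invented for illustration.
import numpy as np
from scipy.signal import find_peaks

depth = np.arange(300)                        # depth in pixels
profile = sum(a * np.exp(-((depth - m) / w) ** 2)
              for a, m, w in [(1.0, 80, 4),   # band 1: ELM
                              (1.4, 120, 4),  # band 2: ellipsoid zone
                              (0.9, 150, 4),  # band 3: contact cylinder
                              (1.6, 190, 6)]) # band 4: RPE
peaks, _ = find_peaks(profile, height=0.5)
print("band depths (px):", peaks)
# Comparing peak depths with a scale model of outer-retinal anatomy is
# what localises each band to a tissue compartment.
```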
Opatovsky, Itai; Santos-Garcia, Diego; Ruan, Zhepu; Lahav, Tamar; Ofaim, Shany; Mouton, Laurence; Barbe, Valérie; Jiang, Jiandong; Zchori-Fein, Einat; Freilich, Shiri
2018-05-25
Individual organisms are linked to their communities and ecosystems via metabolic activities. Metabolic exchanges and co-dependencies have long been suggested to have a pivotal role in determining community structure. In phloem-feeding insects, such metabolic interactions with bacteria enable complementation of their deprived nutrition. The phloem-feeding whitefly Bemisia tabaci (Hemiptera: Aleyrodidae) harbors an obligatory symbiotic bacterium, as well as varying combinations of facultative symbionts. This well-defined bacterial community in B. tabaci serves here as a case study for a comprehensive and systematic survey of metabolic interactions within the bacterial community and their associations with documented occurrences of bacterial combinations. We first reconstructed the metabolic networks of five common B. tabaci symbiont genera (Portiera, Rickettsia, Hamiltonella, Cardinium and Wolbachia), and then used network analysis approaches to predict: (1) species-specific metabolic capacities in a simulated bacteriocyte-like environment; (2) metabolic capacities of the corresponding species' combinations, and (3) dependencies of each species on different media components. The predictions for metabolic capacities of the symbionts in the host environment were in general agreement with previously reported genome analyses, each focused on the single-species level. The analysis suggests several previously unreported routes for complementary interactions and estimates the dependency of each symbiont on specific host metabolites. No clear association was detected between metabolic co-dependencies and co-occurrence patterns. The analysis generated predictions for testable hypotheses of metabolic exchanges and co-dependencies in bacterial communities and, by crossing them with co-occurrence profiles, contextualized interaction patterns into a wider ecological perspective.
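The abstract does not specify the algorithms; one common way to predict metabolic capacities in a simulated environment is a network-expansion ("scope") computation: starting from seed metabolites the environment supplies, iteratively fire any reaction whose substrates are all available. The sketch below uses invented placeholder reactions, not the reconstructed symbiont networks.

```python
# Minimal sketch of a network-expansion (scope) computation. Reactions and
# seeds are invented placeholders for illustration.
def metabolic_scope(reactions, seeds):
    """reactions: list of (substrates, products) frozen/plain sets;
    returns every metabolite producible from the seed set."""
    available = set(seeds)
    changed = True
    while changed:
        changed = False
        for substrates, products in reactions:
            if substrates <= available and not products <= available:
                available |= products      # reaction fires, products added
                changed = True
    return available

toy_reactions = [({"glucose"}, {"pyruvate"}),
                 ({"pyruvate", "NH3"}, {"alanine"}),
                 ({"chorismate"}, {"phenylalanine"})]  # blocked: no seed
print(metabolic_scope(toy_reactions, {"glucose", "NH3"}))
# Complementation shows up when the union of two species' reaction sets
# produces metabolites neither species can make alone.
```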
Yu, Chenggang; Boutté, Angela; Yu, Xueping; Dutta, Bhaskar; Feala, Jacob D; Schmid, Kara; Dave, Jitendra; Tawa, Gregory J; Wallqvist, Anders; Reifman, Jaques
2015-02-01
The multifactorial nature of traumatic brain injury (TBI), especially the complex secondary tissue injury involving intertwined networks of molecular pathways that mediate cellular behavior, has confounded attempts to elucidate the pathology underlying the progression of TBI. Here, systems biology strategies are exploited to identify novel molecular mechanisms and protein indicators of brain injury. To this end, we performed a meta-analysis of four distinct high-throughput gene expression studies involving different animal models of TBI. By using canonical pathways and a large human protein-interaction network as a scaffold, we separately overlaid the gene expression data from each study to identify molecular signatures that were conserved across the different studies. At 24 hr after injury, the significantly activated molecular signatures were nonspecific to TBI, whereas the significantly suppressed molecular signatures were specific to the nervous system. In particular, we identified a suppressed subnetwork consisting of 58 highly interacting, coregulated proteins associated with synaptic function. We selected three proteins from this subnetwork, postsynaptic density protein 95, nitric oxide synthase 1, and disrupted in schizophrenia 1, and hypothesized that their abundance would be significantly reduced after TBI. In a penetrating ballistic-like brain injury rat model of severe TBI, Western blot analysis confirmed our hypothesis. In addition, our analysis recovered 12 previously identified protein biomarkers of TBI. The results suggest that systems biology may provide an efficient, high-yield approach to generate testable hypotheses that can be experimentally validated to identify novel mechanisms of action and molecular indicators of TBI. © 2014 The Authors. Journal of Neuroscience Research Published by Wiley Periodicals, Inc.
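As an illustration of the subnetwork step only: overlaying expression calls on a protein-interaction network and extracting the largest connected component among suppressed genes can be expressed in a few lines of networkx. The edge list below is invented; only the three highlighted gene symbols (DLG4 for postsynaptic density protein 95, NOS1, DISC1) come from the abstract.

```python
# Minimal sketch: extract the largest connected module of suppressed genes
# from a protein-interaction network. Edges and the suppressed set are
# invented; the study used a large curated human interactome.
import networkx as nx

interactome = nx.Graph([("DLG4", "NOS1"), ("NOS1", "DISC1"),
                        ("DISC1", "SYN1"), ("TP53", "MDM2")])
suppressed = {"DLG4", "NOS1", "DISC1", "SYN1"}  # genes down after injury

sub = interactome.subgraph(suppressed)
core = max(nx.connected_components(sub), key=len)
print(sorted(core))  # candidate coregulated module, e.g. synaptic genes
```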
ERIC Educational Resources Information Center
White, Pamela; O'Reilly, Mark; Fragale, Christina; Kang, Soyeon; Muhich, Kimberly; Falcomata, Terry; Lang, Russell; Sigafoos, Jeff; Lancioni, Giulio
2011-01-01
Two children with autism who engaged in aggression and stereotypy were assessed using common analogue functional analysis procedures. Aggression was maintained by access to specific preferred items. Data on the rates of stereotypy and appropriate play were collected during an extended functional analysis tangible condition. These data reveal that…
Evaluation of SAMe-TT2R2 Score on Predicting Success With Extended-Interval Warfarin Monitoring.
Hwang, Andrew Y; Carris, Nicholas W; Dietrich, Eric A; Gums, John G; Smith, Steven M
2018-06-01
In patients with stable international normalized ratios, 12-week extended-interval warfarin monitoring can be considered; however, predictors of success with this strategy are unknown. The previously validated SAMe-TT2R2 score (considering sex, age, medical history, treatment, tobacco, and race) predicts anticoagulation control during standard follow-up (every 4 weeks), with lower scores associated with greater time in therapeutic range. To evaluate the ability of the SAMe-TT2R2 score to predict success with extended-interval warfarin follow-up in patients with previously stable warfarin doses, this post hoc analysis of a single-arm feasibility study calculated baseline SAMe-TT2R2 scores for patients with ≥1 extended-interval follow-up visit. The primary analysis assessed achieved weeks of extended-interval follow-up according to baseline SAMe-TT2R2 scores. A total of 47 patients receiving chronic anticoagulation completed a median of 36 weeks of extended-interval follow-up. The median baseline SAMe-TT2R2 score was 1 (range 0-5). Lower SAMe-TT2R2 scores appeared to be associated with greater duration of extended-interval follow-up achieved, though the differences between scores were not statistically significant. No individual variable of the SAMe-TT2R2 score was associated with achieved weeks of extended-interval follow-up. Analysis of additional patient factors found that longer duration (≥24 weeks) of prior stable treatment was significantly associated with greater weeks of extended-interval follow-up completed (P = 0.04). Conclusion and Relevance: This pilot study provides limited evidence that the SAMe-TT2R2 score predicts success with extended-interval warfarin follow-up but requires confirmation in a larger study. Further research is also necessary to establish additional predictors of successful extended-interval warfarin follow-up.
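For readers unfamiliar with the score, a minimal sketch of its computation follows. The point values reflect the commonly published definition (one point each for female sex, age under 60, two or more comorbidities, and an interacting drug; two points each for tobacco use and non-white race); this is an assumption to be checked against the original validation paper, not a statement of this study's methods.

```python
# Hedged sketch of the SAMe-TT2R2 score. Point values follow the commonly
# published definition and should be verified before any clinical use.
def same_tt2r2(female, age_lt_60, comorbidities_ge_2,
               interacting_drug, tobacco_within_2y, non_white):
    score = 0
    score += 1 if female else 0              # S: sex (female)
    score += 1 if age_lt_60 else 0           # A: age < 60
    score += 1 if comorbidities_ge_2 else 0  # Me: medical history
    score += 1 if interacting_drug else 0    # T: treatment (e.g. amiodarone)
    score += 2 if tobacco_within_2y else 0   # T2: tobacco use, doubled
    score += 2 if non_white else 0           # R2: race, doubled
    return score  # lower scores predict better INR control on warfarin

print(same_tt2r2(female=True, age_lt_60=False, comorbidities_ge_2=True,
                 interacting_drug=False, tobacco_within_2y=False,
                 non_white=False))  # -> 2
```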
Stirling Convertor Extended Operation Testing and Data Analysis at Glenn Research Center
NASA Technical Reports Server (NTRS)
Cornell, Peggy A.; Lewandowski, Edward J.; Oriti, Salvatore M.; Wilson, Scott D.
2010-01-01
Extended operation of Stirling convertors is essential to the development of radioisotope power systems and their potential use for long-duration missions. To document the reliability of the convertors, regular monitoring and analysis of the extended operation data are particularly valuable, allowing us to better understand and quantify long-life characteristics of the convertors. Furthermore, investigation and comparison of the extended operation data against baseline performance data provide an opportunity to understand system behavior should any off-nominal performance occur. Glenn Research Center (GRC) has tested 16 Stirling convertors under 24-hr unattended extended operation, including four that have operated in a thermal vacuum environment and two that are operating in the Advanced Stirling Radioisotope Generator Engineering Unit. Ten of the sixteen convertors are the Advanced Stirling Convertors (ASC) developed by Sunpower, Inc. with GRC. These are highly efficient (conversion efficiency of up to 38 percent for the ASC-1), low-mass convertors that have evolved through technologically progressive convertor builds. Six convertors at GRC are Technology Demonstration Convertors from Infinia Corporation. They have achieved greater than 27 percent conversion efficiency and have accumulated over 185,000 of the total 265,000 hr of extended operation at GRC. This paper presents the extended operation testing and data analysis of free-piston Stirling convertors at NASA GRC as well as how these tests have contributed to the Stirling convertor's progression toward flight.
ERIC Educational Resources Information Center
Juricic, Davor; Barr, Ronald E.
1996-01-01
Reports on a project that extended the Engineering Design Graphics curriculum to include instruction and laboratory experience in computer-aided design, analysis, and manufacturing (CAD/CAM). Discusses issues in project implementation, including introduction of finite element analysis to lower-division students, feasibility of classroom prototype…
Three-dimensional Stress Analysis Using the Boundary Element Method
NASA Technical Reports Server (NTRS)
Wilson, R. B.; Banerjee, P. K.
1984-01-01
The boundary element method is to be extended (as part of the NASA Inelastic Analysis Methods program) to the three-dimensional stress analysis of gas turbine engine hot section components. The analytical basis of the method (as developed in elasticity) is outlined, its numerical implementation is summarized, and the approaches to be followed in extending the method to include inelastic material response are indicated.
Perger, Ludwig; Rentsch, Katharina M.; Kullak-Ublick, Gerd A.; Verotta, Davide; Fattinger, Karin
2009-01-01
In diacetylmorphine prescription programs for heavily dependent addicts, diacetylmorphine is usually administered intravenously, but this may not be possible due to venosclerosis or when heroin abuse had occurred via non-intravenous routes. Since up to 25% of patients administer diacetylmorphine orally, we characterised morphine absorption after single oral doses of immediate and extended release diacetylmorphine in 8 opioid addicts. Plasma concentrations were determined by liquid chromatography-mass spectrometry. Non-compartmental methods and deconvolution were applied for data analysis. Mean (±SD) immediate and extended release doses were 719 ± 297 mg and 956 ± 404 mg, with high absolute morphine bioavailabilities of 56% to 61%, respectively. Immediate release diacetylmorphine caused rapid morphine absorption, peaking at 10 to 15 min. Morphine absorption was considerably slower and more sustained for extended release diacetylmorphine, with only ~30% of maximal immediate release absorption being reached after 10 min and maintained for 3 to 4 h, with no relevant food interaction. The relative extended to immediate release bioavailability was calculated to be 86% by non-compartmental analysis and 93% by deconvolution analysis. Thus, immediate and extended release diacetylmorphine produce the intended morphine exposures. Both are suitable for substitution treatments. Similar doses can be applied if used in combination or sequentially. PMID:19084595
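A minimal sketch of the non-compartmental step follows: a linear-trapezoidal AUC and the dose-normalised relative bioavailability of the extended-release versus immediate-release formulation. The doses are the study's reported means, but the concentration-time points are invented for illustration.

```python
# Minimal sketch of non-compartmental relative bioavailability. Times and
# concentrations are invented; only the mean doses come from the abstract.
import numpy as np

def auc_trapezoid(t, c):
    """Linear-trapezoidal AUC(0-tlast) for times t (h), conc c (ng/mL)."""
    return float(np.sum(0.5 * (c[1:] + c[:-1]) * np.diff(t)))

t = np.array([0, 0.25, 0.5, 1, 2, 4, 6, 8])       # h
c_ir = np.array([0, 90, 70, 50, 28, 10, 4, 1])    # immediate release (toy)
c_er = np.array([0, 25, 30, 32, 30, 22, 12, 5])   # extended release (toy)
dose_ir, dose_er = 719.0, 956.0                   # mg, study mean doses

f_rel = ((auc_trapezoid(t, c_er) / dose_er) /
         (auc_trapezoid(t, c_ir) / dose_ir))
print(f"relative ER/IR bioavailability: {f_rel:.0%}")
```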
Analysis of optimality in natural and perturbed metabolic networks
Segrè, Daniel; Vitkup, Dennis; Church, George M.
2002-01-01
An important goal of whole-cell computational modeling is to integrate detailed biochemical information with biological intuition to produce testable predictions. Based on the premise that prokaryotes such as Escherichia coli have maximized their growth performance through evolution, flux balance analysis (FBA) predicts metabolic flux distributions at steady state by using linear programming. Corroborating earlier results, we show that recent intracellular flux data for wild-type E. coli JM101 display excellent agreement with FBA predictions. Although the assumption of optimality for a wild-type bacterium is justifiable, the same argument may not be valid for genetically engineered knockouts or other bacterial strains that were not exposed to long-term evolutionary pressure. We address this point by introducing the method of minimization of metabolic adjustment (MOMA), whereby we test the hypothesis that knockout metabolic fluxes undergo a minimal redistribution with respect to the flux configuration of the wild type. MOMA employs quadratic programming to identify a point in flux space that is closest to the wild-type point, subject to the gene deletion constraint. Comparing MOMA and FBA predictions to experimental flux data for the E. coli pyruvate kinase mutant PB25, we find that MOMA displays a significantly higher correlation than FBA. Our method is further supported by experimental data for E. coli knockout growth rates. It can therefore be used for predicting the behavior of perturbed metabolic networks, whose growth performance is in general suboptimal. MOMA and its possible future extensions may be useful in understanding the evolutionary optimization of metabolism. PMID:12415116
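The two optimisation problems the abstract contrasts can be stated compactly. The sketch below solves FBA as a linear program and MOMA as a quadratic minimisation on an invented three-reaction toy network; genome-scale models require dedicated solvers, and the stoichiometry here is purely illustrative.

```python
# Minimal sketch of FBA (LP) and MOMA (QP) on a toy network with one
# metabolite and three reactions: uptake v1 splits into v2 and v3.
import numpy as np
from scipy.optimize import linprog, minimize

S = np.array([[1, -1, -1]])        # stoichiometric matrix, S v = 0
bounds = [(0, 10), (0, 10), (0, 10)]
growth = np.array([0, 1, 0.5])     # toy biomass coefficients

# FBA: maximize growth'v subject to S v = 0 and flux bounds.
fba = linprog(-growth, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=bounds)
v_wt = fba.x

# MOMA for a knockout of reaction 2: minimize ||v - v_wt||^2 subject to
# S v = 0, v2 fixed at 0, and the same bounds (solved here via SLSQP).
ko_bounds = [bounds[0], (0, 0), bounds[2]]
moma = minimize(lambda v: np.sum((v - v_wt) ** 2), x0=np.zeros(3),
                bounds=ko_bounds,
                constraints={"type": "eq", "fun": lambda v: S @ v})
print("wild type:", np.round(v_wt, 2), "knockout:", np.round(moma.x, 2))
```

On this toy network FBA routes all flux through the most valuable reaction, while MOMA redistributes the knockout flux as little as possible rather than re-optimising growth, which is exactly the distinction the abstract draws.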
A Mathematical Framework for the Analysis of Cyber-Resilient Control Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Melin, Alexander M; Ferragut, Erik M; Laska, Jason A
2013-01-01
The increasingly recognized vulnerability of industrial control systems to cyber-attacks has inspired a considerable amount of research into techniques for cyber-resilient control systems. The majority of this effort involves the application of well-known information technology (IT) security techniques to control system networks. While these efforts are important to protect the control systems that operate critical infrastructure, they are never perfectly effective. Little research has focused on the design of closed-loop dynamics that are resilient to cyber-attack. The majority of control system protection measures are concerned with how to prevent unauthorized access and protect data integrity. We believe that the ability to analyze how an attacker can affect the closed-loop dynamics of a control system configuration once they have access is just as important to the overall security of a control system. To begin to analyze this problem, consistent mathematical definitions of concepts within resilient control need to be established so that a mathematical analysis of the vulnerabilities and resiliencies of a particular control system design methodology and configuration can be made. In this paper, we propose rigorous definitions for state awareness, operational normalcy, and resiliency as they relate to control systems. We will also discuss some mathematical consequences that arise from the proposed definitions. The goal is to begin to develop a mathematical framework and testable conditions for resiliency that can be used to build a sound theoretical foundation for resilient control research.
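The paper's formal definitions are not reproduced in the abstract; as a generic illustration of why closed-loop analysis matters, the sketch below simulates a state-feedback loop in which an attacker biases a sensor reading. The plant, gain, and bias values are invented.

```python
# Generic illustration (not the paper's formalism): a discrete-time plant
# x+ = A x + B u stabilised by u = -K y, where an attacker adds a constant
# bias to the measurement y. All matrices and values are invented.
import numpy as np

A = np.array([[1.1, 0.2], [0.0, 0.9]])   # open-loop unstable plant
B = np.array([[0.0], [1.0]])
K = np.array([[2.1, 1.1]])               # gain placing eigenvalues at 0.5, 0.4

def simulate(steps=50, sensor_bias=0.0):
    x = np.array([1.0, 0.0])
    for _ in range(steps):
        y = x + np.array([sensor_bias, 0.0])   # compromised measurement
        u = -(K @ y)
        x = A @ x + B.flatten() * u.item()
    return x

print("nominal: ", np.round(simulate(), 3))
print("attacked:", np.round(simulate(sensor_bias=0.5), 3))
# Even with access only to one sensor channel, the attacker shifts the
# closed-loop equilibrium away from the origin without destabilising it,
# the kind of effect a purely network-security view would miss.
```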
Franchini, Paolo; Irisarri, Iker; Fudickar, Adam; Schmidt, Andreas; Meyer, Axel; Wikelski, Martin; Partecke, Jesko
2017-06-01
Seasonal migration is a widespread phenomenon, which is found in many different lineages of animals. This spectacular behaviour allows animals to avoid seasonally adverse environmental conditions to exploit more favourable habitats. Migration has been intensively studied in birds, which display astonishing variation in migration strategies, thus providing a powerful system for studying the ecological and evolutionary processes that shape migratory behaviour. Despite intensive research, the genetic basis of migration remains largely unknown. Here, we used state-of-the-art radio-tracking technology to characterize the migratory behaviour of a partially migratory population of European blackbirds (Turdus merula) in southern Germany. We compared gene expression of resident and migrant individuals using high-throughput transcriptomics in blood samples. Analyses of sequence variation revealed a nonsignificant genetic structure between blackbirds differing by their migratory phenotype. We detected only four differentially expressed genes between migrants and residents, which might be associated with hyperphagia, moulting and enhanced DNA replication and transcription. The most pronounced changes in gene expression occurred between migratory birds depending on when, in relation to their date of departure, blood was collected. Overall, the differentially expressed genes detected in this analysis may play crucial roles in determining the decision to migrate, or in controlling the physiological processes required for the onset of migration. These results provide new insights into, and testable hypotheses for, the molecular mechanisms controlling the migratory phenotype and its underlying physiological mechanisms in blackbirds and other migratory bird species. © 2017 John Wiley & Sons Ltd.
Exploring universal patterns in human home-work commuting from mobile phone data.
Kung, Kevin S; Greco, Kael; Sobolevsky, Stanislav; Ratti, Carlo
2014-01-01
Home-work commuting has always attracted significant research attention because of its impact on human mobility. One of the key assumptions in this domain of study is the universal uniformity of commute times. However, a true comparison of commute patterns has often been hindered by the intrinsic differences in data collection methods, which make observations from different countries potentially biased and unreliable. In the present work, we approach this problem through the use of mobile phone call detail records (CDRs), which offer a consistent method for investigating mobility patterns in wholly different parts of the world. We apply our analysis to a broad range of datasets, at both the country (Portugal, Ivory Coast, and Saudi Arabia) and city (Boston) scales. Additionally, we compare these results with those obtained from vehicle GPS traces in Milan. While different regions have some unique commute time characteristics, we show that the home-work time distributions and average values within a single region are indeed largely independent of commute distance or country (Portugal, Ivory Coast, and Boston), despite substantial spatial and infrastructural differences. Furthermore, our comparative analysis demonstrates that such distance-independence holds true only if we consider multimodal commute behaviors, consistent with previous studies. In car-only (Milan GPS traces) and car-heavy (Saudi Arabia) commute datasets, we see that commute time is indeed influenced by commute distance. Finally, we put forth a testable hypothesis and suggest ways for future work to make more accurate and generalizable statements about human commute behaviors.
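As a hedged sketch of the kind of CDR processing involved: infer a user's home and work towers as the most frequent night-time and daytime locations, then measure home-to-work transition gaps. The records below are invented tuples, and real CDR pipelines require far more careful cleaning and anonymisation.

```python
# Minimal sketch of home/work inference from CDR-like records, given as
# (user, timestamp_in_hours, tower) tuples. All data are invented.
from collections import Counter

def infer_anchor(records, hours_of_day):
    """Most frequent tower observed during the given hours of the day."""
    towers = [tower for _, ts, tower in records if ts % 24 in hours_of_day]
    return Counter(towers).most_common(1)[0][0]

night = set(range(0, 6)) | {22, 23}
day = set(range(9, 17))
records = ([("u1", h, "tower_home") for h in (0, 1, 2, 23, 24, 25)] +
           [("u1", h, "tower_work") for h in (10, 11, 12, 34, 35)])

home = infer_anchor(records, night)   # -> "tower_home"
work = infer_anchor(records, day)     # -> "tower_work"
print(home, work)
# Commute time ~ gap between the last home sighting and the first work
# sighting on the same day; per-user averages of these gaps give the
# distributions compared across countries in the study.
```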
Evaluation of cable tension sensors of FAST reflector from the perspective of EMI
NASA Astrophysics Data System (ADS)
Zhu, Ming; Wang, Qiming; Egan, Dennis; Wu, Mingchang; Sun, Xiao
2016-06-01
The active reflector of FAST (five-hundred-meter aperture spherical radio telescope) is supported by a ring beam and a cable-net structure, whose nodes are actively controlled to form a series of real-time paraboloids. To ensure the security and stability of the supporting structure, tension must be monitored for some typical cables. Considering the stringent requirements in accuracy and long-term stability, magnetic flux sensors, vibrating wire strain gauges and fiber Bragg grating strain gauges were screened for the cable tension monitoring of the supporting cable-net. Because receivers of radio telescopes place strict restrictions on electromagnetic interference (EMI) and radio frequency interference (RFI), these three types of sensors are evaluated from the perspective of EMI/RFI. First, their operating principles are analyzed theoretically. Second, typical sensor signals are collected in the time domain and analyzed in the frequency domain to reveal their spectral characteristics. Finally, typical sensors are tested in an anechoic chamber to determine their EMI levels. Theoretical analysis shows that the fiber Bragg grating strain gauge itself will not lead to EMI/RFI. According to GJB151A, frequency-domain analysis and test results show that for the vibrating wire strain gauge and the magnetic flux sensor themselves, testable EMI/RFI levels are typically below the background noise of the anechoic chamber. FAST ultimately chose these three sensor types for monitoring its cable tension. The proposed study is also a reference for the selection of monitoring equipment for other radio telescopes and large structures.
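A minimal sketch of the frequency-domain step described above: estimate a sensor signal's spectrum with an FFT and identify its dominant emission line. The 400 Hz tone, sampling rate, and noise level are invented stand-ins for a vibrating wire gauge record, not FAST test data.

```python
# Minimal sketch: spectral analysis of a time-domain sensor record. The
# excitation tone and sampling rate are invented for illustration.
import numpy as np

fs = 10_000.0                            # sampling rate, Hz
t = np.arange(0, 1.0, 1.0 / fs)
signal = np.sin(2 * np.pi * 400 * t) + 0.01 * np.random.randn(t.size)

spectrum = np.abs(np.fft.rfft(signal)) / t.size
freqs = np.fft.rfftfreq(t.size, 1.0 / fs)
peak = freqs[np.argmax(spectrum[1:]) + 1]   # skip the DC bin
print(f"dominant emission at {peak:.0f} Hz")
# Emission levels below the anechoic chamber's noise floor across the
# telescope's observing bands indicate the sensor is EMI/RFI-benign.
```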
Restoration of contaminated ecosystems: adaptive management in a changing climate
Farag, Aida; Larson, Diane L.; Stauber, Jenny; Stahl, Ralph; Isanhart, John; McAbee, Kevin T.; Walsh, Christopher J.
2017-01-01
Three case studies illustrate how adaptive management (AM) has been used in ecological restorations that involve contaminants. Contaminants addressed include mercury, selenium, and contaminants and physical disturbances delivered to streams by urban stormwater runoff. All three cases emphasize the importance of broad stakeholder input early and consistently throughout decision analysis for AM. Risk of contaminant exposure provided input to the decision analyses (e.g. selenium exposure to endangered razorback suckers, Stewart Lake; multiple contaminants in urban stormwater runoff, Melbourne) and was balanced against the protection of resources critical for a desired future state (e.g. preservation of old-growth trees, South River). Monitoring also played a critical role in the ability to conduct the decision analyses necessary for AM plans. For example, newer technologies in the Melbourne case provided a testable situation in which contaminant concentrations and flow disturbance were reduced to support a return to good ecological condition. In at least one case (Stewart Lake), long-term monitoring data are being used to document the potential effects of climate change on a restoration trajectory. Decision analysis formalized the process by which stakeholders arrived at the priorities for the sites, which together constituted the desired future condition towards which each restoration is aimed. Alternative models were developed that described in mechanistic terms how restoration can influence the system towards the desired future condition. Including known and anticipated effects of future climate scenarios in these models will make them robust to the long-term exposure and effects of contaminants in restored ecosystems.
Social Identities as Pathways into and out of Addiction
Dingle, Genevieve A.; Cruwys, Tegan; Frings, Daniel
2015-01-01
There exists a predominant identity loss and “redemption” narrative in the addiction literature describing how individuals move from a “substance user” identity to a “recovery” identity. However, other identity related pathways influencing onset, treatment seeking and recovery may exist, and the process through which social identities unrelated to substance use change over time is not well understood. This study was designed to provide a richer understanding of such social identities processes. Semi-structured interviews were conducted with 21 adults residing in a drug and alcohol therapeutic community (TC) and thematic analysis revealed two distinct identity-related pathways leading into and out of addiction. Some individuals experienced a loss of valued identities during addiction onset that were later renewed during recovery (consistent with the existing redemption narrative). However, a distinct identity gain pathway emerged for socially isolated individuals, who described the onset of their addiction in terms of a new valued social identity. Almost all participants described their TC experience in terms of belonging to a recovery community. Participants on the identity loss pathway aimed to renew their pre-addiction identities after treatment while those on the identity gain pathway aimed to build aspirational new identities involving study, work, or family roles. These findings help to explain how social factors are implicated in the course of addiction, and may act as either motivations for or barriers to recovery. The qualitative analysis yielded a testable model for future research in other samples and settings. PMID:26648882